Comparative Analysis Between Computed and Conventional Inferior Alveolar Nerve Block Techniques.
Araújo, Gabriela Madeira; Barbalho, Jimmy Charles Melo; Dias, Tasiana Guedes de Souza; Santos, Thiago de Santana; Vasconcellos, Ricardo José de Holanda; de Morais, Hécio Henrique Araújo
2015-11-01
The aim of this randomized, double-blind, controlled trial was to compare the computed and conventional inferior alveolar nerve block techniques in symmetrically positioned inferior third molars. Both computed and conventional anesthetic techniques were performed in 29 healthy patients (58 surgeries) aged between 18 and 40 years. The anesthetic of choice was 2% lidocaine with 1:200,000 epinephrine. Pain after anesthetic infiltration was assessed with a Visual Analogue Scale, and patient satisfaction with a Likert scale. Heart and respiratory rates, mean time to perform the technique, and the need for additional anesthesia were also evaluated. Mean pain scores were higher for the conventional technique than for the computed technique (3.45 ± 2.73 versus 2.86 ± 1.96), but the difference was not statistically significant (P > 0.05). Patient satisfaction showed no statistically significant differences. The mean times to perform the computed and conventional techniques were 3.85 and 1.61 minutes, respectively, a statistically significant difference (P < 0.001). The computed anesthetic technique yielded lower mean pain perception, but the difference from the conventional technique was not statistically significant.
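For illustration, the pain comparison above can be roughly reproduced from the published summary statistics with SciPy. Note this treats the two groups as independent (the trial itself was a paired, split-mouth design), so it is only an approximate sketch:

```python
from scipy.stats import ttest_ind_from_stats

# Summary statistics reported in the abstract (VAS pain scores).
t, p = ttest_ind_from_stats(
    mean1=3.45, std1=2.73, nobs1=29,   # conventional technique
    mean2=2.86, std2=1.96, nobs2=29,   # computed technique
    equal_var=False,                   # Welch's t-test
)
print(f"t = {t:.2f}, p = {p:.3f}")     # p > 0.05, consistent with the abstract
```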
Zhou, Zheng; Dai, Cong; Liu, Wei-Xin
2015-01-01
TNF-α has an important role in the pathogenesis of ulcerative colitis (UC), and anti-TNF-α therapy appears beneficial in its treatment. The aim was to assess the effectiveness of infliximab and adalimumab in UC compared with conventional therapy. The PubMed and Embase databases were searched for studies investigating the efficacy of infliximab and adalimumab in UC. Infliximab had a statistically significant effect on induction of clinical response (RR = 1.67; 95% CI 1.12 to 2.50) in UC compared with conventional therapy, but not on clinical remission (RR = 1.63; 95% CI 0.84 to 3.18) or reduction of the colectomy rate (RR = 0.54; 95% CI 0.26 to 1.12). Adalimumab had a statistically significant effect on induction of clinical remission (RR = 1.82; 95% CI 1.24 to 2.67) and clinical response (RR = 1.36; 95% CI 1.13 to 1.64) in UC compared with conventional therapy. Our meta-analyses suggested that infliximab significantly improves induction of clinical response, and adalimumab significantly improves induction of both clinical remission and clinical response, compared with conventional therapy.
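For context, pooled RRs like those above typically come from inverse-variance weighting on the log scale. The sketch below reconstructs per-study standard errors from CI widths; the study-level numbers are hypothetical, since the abstract reports only the pooled results:

```python
import numpy as np

def pool_log_rr(rr, lo, hi):
    """Fixed-effect inverse-variance pooling of risk ratios.

    rr, lo, hi: per-study risk ratios with 95% CI bounds.
    SE of log(RR) is recovered from the CI width: (log(hi) - log(lo)) / (2 * 1.96).
    """
    log_rr = np.log(rr)
    se = (np.log(hi) - np.log(lo)) / (2 * 1.96)
    w = 1.0 / se**2
    pooled = np.sum(w * log_rr) / np.sum(w)
    pooled_se = np.sqrt(1.0 / np.sum(w))
    ci = np.exp([pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se])
    return np.exp(pooled), ci

# Hypothetical per-study estimates (illustration only).
rr = np.array([1.5, 1.8, 1.6])
lo = np.array([1.0, 1.2, 1.1])
hi = np.array([2.3, 2.7, 2.4])
print(pool_log_rr(rr, lo, hi))
```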
Vaidya, Sharad; Parkash, Hari; Bhargava, Akshay; Gupta, Sharad
2014-01-01
Abundant resources and techniques have been used for complete-coverage crown fabrication. Conventional investing and casting procedures for phosphate-bonded investments require a 2- to 4-h procedure before completion. Accelerated casting techniques have been used, but may not result in castings with matching marginal accuracy. This study measured the marginal gap and determined the clinical acceptability of single cast copings invested in a phosphate-bonded investment using conventional and accelerated methods. One hundred and twenty cast coping samples were fabricated using conventional and accelerated methods, with three finish lines: chamfer, shoulder, and shoulder with bevel. Sixty copings were prepared with each technique. Each coping was examined with a stereomicroscope at four predetermined sites, and the marginal gap was documented at each. A master chart was prepared for all the data, which were analyzed using the Statistical Package for the Social Sciences (SPSS). Evidence of a marginal gap was evaluated by t-test; analysis of variance and post hoc analysis were used to compare the two groups as well as the three subgroups. The measurements showed no statistically significant difference between the conventional and accelerated groups. Among the three marginal designs studied, shoulder with bevel showed the best marginal fit with both conventional and accelerated casting techniques. The accelerated casting technique could be a viable alternative to the time-consuming conventional technique. The marginal fit between the two casting techniques showed no statistical difference.
NASA Astrophysics Data System (ADS)
ten Veldhuis, Marie-Claire; Schleiss, Marc
2017-04-01
In this study, we introduced an alternative approach for the analysis of hydrological flow time series, using an adaptive sampling framework based on inter-amount times (IATs). The main difference from conventional flow time series is the rate at which low and high flows are sampled: the unit of analysis for IATs is a fixed flow amount, instead of a fixed time window. We analysed statistical distributions of flows and IATs across a wide range of sampling scales to investigate the sensitivity of statistical properties such as quantiles, variance, skewness, scaling parameters and flashiness indicators to the sampling scale. We did this for streamflow time series from 17 (semi)urbanised basins in North Carolina, US, ranging from 13 km² to 238 km² in size. Results showed that adaptive sampling of flow time series based on inter-amounts leads to a more balanced representation of low flow and peak flow values in the statistical distribution. While conventional sampling gives a lot of weight to low flows, as these are most ubiquitous in flow time series, IAT sampling gives relatively more weight to high flow values, which accumulate a given flow amount in a shorter time. As a consequence, IAT sampling gives more information about the tail of the distribution associated with high flows, while conventional sampling gives relatively more information about low flow periods. We will present results of statistical analyses across a range of subdaily to seasonal scales and will highlight some interesting insights that can be derived from IAT statistics with respect to basin flashiness and the impact of urbanisation on hydrological response.
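A minimal sketch of the IAT construction follows: cumulate the flow volume and record the times at which each successive fixed amount is reached. Function names and the synthetic hourly series are assumptions for illustration:

```python
import numpy as np

def inter_amount_times(t, q, amount):
    """Sample a flow series at fixed flow *amounts* instead of fixed times.

    t      : time stamps (e.g. hours), regularly spaced
    q      : flow rates at those times (volume per unit time)
    amount : fixed flow amount defining the sampling unit

    Returns the inter-amount times (IATs): the time needed to accumulate
    each successive `amount` of flow. Short IATs correspond to high flows.
    """
    dt = np.diff(t, prepend=t[0])
    cum = np.cumsum(q * dt)                      # cumulative flow volume
    levels = np.arange(amount, cum[-1], amount)
    # Linear interpolation of the (monotone) cumulative curve.
    crossing_times = np.interp(levels, cum, t)
    return np.diff(crossing_times, prepend=t[0])

# Example: one week of synthetic hourly streamflow.
rng = np.random.default_rng(0)
t = np.arange(0, 168.0)                          # hours
q = np.exp(rng.normal(0.0, 1.0, t.size))         # lognormal-ish flows
print(inter_amount_times(t, q, amount=50.0).round(2))
```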
Goodwin, Cody R; Sherrod, Stacy D; Marasco, Christina C; Bachmann, Brian O; Schramm-Sapyta, Nicole; Wikswo, John P; McLean, John A
2014-07-01
A metabolic system is composed of inherently interconnected metabolic precursors, intermediates, and products. The analysis of untargeted metabolomics data has conventionally been performed with comparative statistics or multivariate statistical approaches; however, each falls short of representing the related nature of metabolic perturbations. Herein, we describe a complementary method for the analysis of large metabolite inventories using a data-driven approach based upon a self-organizing map algorithm. This workflow allows the unsupervised clustering and subsequent prioritization of correlated features through Gestalt comparisons of metabolic heat maps. We describe this methodology in detail, including a comparison to conventional metabolomics approaches, and demonstrate its application to the analysis of the metabolic repercussions of prolonged cocaine exposure in rat serum profiles.
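A self-organizing map can be written in a few lines of NumPy. The sketch below is generic (the feature matrix is a random stand-in), not the authors' workflow:

```python
import numpy as np

def train_som(X, grid=(10, 10), epochs=20, lr0=0.5, sigma0=3.0, seed=0):
    """Train a tiny self-organizing map on feature vectors X (n_samples, n_dims).

    Returns map weights of shape (rows, cols, n_dims). Each metabolite
    feature can later be assigned to its best-matching unit, so correlated
    features cluster in neighbouring map nodes.
    """
    rng = np.random.default_rng(seed)
    rows, cols = grid
    W = rng.normal(size=(rows, cols, X.shape[1]))
    ii, jj = np.mgrid[0:rows, 0:cols]
    n_steps = epochs * len(X)
    for step in range(n_steps):
        x = X[rng.integers(len(X))]
        # Best-matching unit (BMU): node whose weights are closest to x.
        d = np.linalg.norm(W - x, axis=2)
        bi, bj = np.unravel_index(np.argmin(d), d.shape)
        # Learning rate and neighbourhood radius decay over time.
        frac = step / n_steps
        lr = lr0 * (1 - frac)
        sigma = sigma0 * (1 - frac) + 0.5
        # Gaussian neighbourhood around the BMU pulls nearby nodes toward x.
        h = np.exp(-((ii - bi) ** 2 + (jj - bj) ** 2) / (2 * sigma**2))
        W += lr * h[..., None] * (x - W)
    return W

# Hypothetical metabolite intensity profiles: 200 features x 12 conditions.
X = np.random.default_rng(1).normal(size=(200, 12))
print(train_som(X).shape)
```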
Identification of the isomers using principal component analysis (PCA) method
NASA Astrophysics Data System (ADS)
Kepceoǧlu, Abdullah; Gündoǧdu, Yasemin; Ledingham, Kenneth William David; Kilic, Hamdi Sukur
2016-03-01
In this work, we carried out a detailed statistical analysis of experimental mass spectra of xylene isomers. Principal Component Analysis (PCA) was used to identify isomers that cannot be distinguished using conventional statistical methods for the interpretation of their mass spectra. Experiments were carried out using a linear TOF-MS coupled to a femtosecond laser system as an energy source for the ionisation processes. The collected data were analysed and interpreted using PCA as a multivariate analysis of these spectra. This demonstrates the strength of the method in distinguishing isomers that cannot be identified through conventional mass analysis of the dissociative ionisation of these molecules. PCA results depending on the laser pulse energy and the background pressure in the spectrometer are presented in this work.
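A PCA of binned spectra of this kind can be sketched with scikit-learn; the spectra below are random stand-ins for the measured TOF mass spectra:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Hypothetical data: one binned TOF mass spectrum per sample, rows
# labelled by xylene isomer (o-, m-, p-) in a real experiment.
rng = np.random.default_rng(0)
spectra = rng.poisson(50, size=(90, 500)).astype(float)  # 90 spectra x 500 m/z bins

X = StandardScaler().fit_transform(spectra)   # centre/scale each m/z bin
scores = PCA(n_components=2).fit_transform(X)

# Plotting PC1 vs PC2 coloured by isomer label would reveal whether the
# fragmentation patterns separate even when raw spectra look alike.
print(scores.shape)   # (90, 2)
```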
Research Design and Statistics for Applied Linguistics.
ERIC Educational Resources Information Center
Hatch, Evelyn; Farhady, Hossein
An introduction to the conventions of research design and statistical analysis is presented for graduate students of applied linguistics. The chapters cover such concepts as the definition of research, variables, research designs, research report formats, sorting and displaying data, probability and hypothesis testing, comparing means,…
Comparison of Accuracy Between a Conventional and Two Digital Intraoral Impression Techniques.
Malik, Junaid; Rodriguez, Jose; Weisbloom, Michael; Petridis, Haralampos
To compare the accuracy (ie, precision and trueness) of full-arch impressions fabricated using either a conventional polyvinyl siloxane (PVS) material or one of two intraoral optical scanners. Full-arch impressions of a reference model were obtained using addition silicone impression material (Aquasil Ultra; Dentsply Caulk) and two optical scanners (Trios, 3Shape, and CEREC Omnicam, Sirona). Surface matching software (Geomagic Control, 3D Systems) was used to superimpose the scans within groups to determine the mean deviations in precision and trueness (μm) between the scans, which were calculated for each group and compared statistically using one-way analysis of variance with post hoc Bonferroni (trueness) and Games-Howell (precision) tests (IBM SPSS ver 24, IBM UK). Qualitative analysis was also carried out from three-dimensional maps of differences between scans. Means and standard deviations (SD) of deviations in precision for conventional, Trios, and Omnicam groups were 21.7 (± 5.4), 49.9 (± 18.3), and 36.5 (± 11.12) μm, respectively. Means and SDs for deviations in trueness were 24.3 (± 5.7), 87.1 (± 7.9), and 80.3 (± 12.1) μm, respectively. The conventional impression showed statistically significantly improved mean precision (P < .006) and mean trueness (P < .001) compared to both digital impression procedures. There were no statistically significant differences in precision (P = .153) or trueness (P = .757) between the digital impressions. The qualitative analysis revealed local deviations along the palatal surfaces of the molars and incisal edges of the anterior teeth of < 100 μm. Conventional full-arch PVS impressions exhibited improved mean accuracy compared to two direct optical scanners. No significant differences were found between the two digital impression methods.
Shivasakthy, M; Asharaf Ali, Syed
2013-10-01
A new material has been proposed in dentistry in the form of strips for producing gingival retraction, but its clinical efficacy remains untested. This study aimed to determine whether polyvinyl acetate strips can effectively displace the gingival tissues in comparison with the conventional retraction cord. Complete metal-ceramic preparation with a supragingival margin was performed on fourteen maxillary incisors, and gingival retraction was performed using Merocel strips and conventional retraction cords alternately, at a 2-week interval. The amount of displacement was compared using a digital vernier caliper of 0.01 mm accuracy. Results were analyzed statistically using the paired Student's t-test. The statistical analysis revealed that both the conventional retraction cord and the Merocel strip produce significant retraction; of the two materials, Merocel proved significantly more effective. The Merocel strip produces more gingival displacement than the conventional retraction cord.
Tambe, Varsha H; Nagmode, Pradnya S; Vishwas, Jayshree R; P, Saujanya K; Angadi, Prabakar; Ali, Fareedi Mukram
2013-01-01
Background: To compare the amount of debris extruded apically when using conventional syringe, Endovac, and ultrasonic irrigation. Materials & Methods: Thirty freshly extracted mandibular premolars were selected, working lengths were determined, and the teeth were mounted in a debris-collection apparatus. The canals were prepared. After each instrument change, 1 ml of 3% sodium hypochlorite was used for irrigation. Debris extruded apically with the conventional syringe, Endovac, and ultrasonic irrigation techniques was weighed using an electronic balance, and statistical analysis was performed. The mean difference between the groups was determined using statistical analysis within and between the groups for equal variances. Results: Among all the groups, significantly less debris was found apically in the Endovac group (0.96) than in the conventional syringe and ultrasonic groups (1.23). Conclusion: The present study showed that the Endovac system extrudes less debris apically than the ultrasonic and conventional techniques, so the incidence of flare-ups can be reduced by using the Endovac irrigation system. How to cite this article: Tambe V H, Nagmode P S, Vishwas J R, Saujanya K P, Angadi P, Ali F M. Evaluation of the Amount of Debris extruded apically by using Conventional Syringe, Endovac and Ultrasonic Irrigation Technique: An In Vitro Study. J Int Oral Health 2013; 5(3):63-66. PMID:24155604
Tupinambá, Rogerio Amaral; Claro, Cristiane Aparecida de Assis; Pereira, Cristiane Aparecida; Nobrega, Celestino José Prudente; Claro, Ana Paula Rosifini Alves
2017-01-01
Plasma-polymerized film deposition was developed to modify the surface properties of metallic orthodontic brackets in order to inhibit bacterial adhesion. Hexamethyldisiloxane (HMDSO) polymer films were deposited on conventional (n = 10) and self-ligating (n = 10) stainless steel orthodontic brackets using the Plasma-Enhanced Chemical Vapor Deposition (PECVD) radio-frequency technique. The samples were divided into two groups according to the kind of bracket and two subgroups after surface treatment. Scanning Electron Microscopy (SEM) analysis was performed to assess bacterial adhesion over the sample surfaces (slot and wing regions) and film layer integrity. Surface roughness was assessed by confocal interferometry (CI) and surface wettability by goniometry. For bacterial adhesion analysis, samples were exposed for 72 hours to a Streptococcus mutans solution for biofilm formation. Surface roughness values were analyzed using the Mann-Whitney test, while biofilm adhesion was assessed by Kruskal-Wallis and SNK tests. Statistically significant differences (p < 0.05) in surface roughness and bacterial adhesion reduction were observed on conventional brackets after surface treatment and between conventional and self-ligating brackets; no statistically significant differences were observed between the self-ligating groups (p > 0.05). Plasma-polymerized film deposition was effective in reducing surface roughness and bacterial adhesion only on conventional brackets. It was also noted that conventional brackets showed lower biofilm adhesion than self-ligating brackets even in the absence of the film.
Yi, Jianru; Li, Meile; Li, Yu; Li, Xiaobing; Zhao, Zhihe
2016-11-21
The aim of this study was to compare external apical root resorption (EARR) in patients receiving fixed orthodontic treatment with self-ligating or conventional brackets. Studies comparing EARR between orthodontic patients using self-ligating or conventional brackets were identified through electronic searches of CENTRAL, PubMed, EMBASE, the China National Knowledge Infrastructure (CNKI) and SIGLE, and manual searches of relevant journals and the reference lists of the included studies, up to April 2016. Data extraction and risk-of-bias evaluation were conducted by two investigators independently. The original outcomes were statistically pooled using Review Manager 5. Seven studies were included in the systematic review, of which five were pooled in the meta-analysis. The EARR of maxillary central incisors in the self-ligating bracket group was significantly lower than that in the conventional bracket group (SMD -0.31; 95% CI: -0.60 to -0.01). No significant differences for the other incisors were observed between self-ligating and conventional brackets. Current evidence suggests that self-ligating brackets do not outperform conventional brackets in reducing EARR in maxillary lateral incisors, mandibular central incisors and mandibular lateral incisors. However, self-ligating brackets appear to have an advantage in protecting the maxillary central incisors from EARR, which still needs to be confirmed by more high-quality studies.
Improved analyses using function datasets and statistical modeling
John S. Hogland; Nathaniel M. Anderson
2014-01-01
Raster modeling is an integral component of spatial analysis. However, conventional raster modeling techniques can require a substantial amount of processing time and storage space and have limited statistical functionality and machine learning algorithms. To address this issue, we developed a new modeling framework using C# and ArcObjects and integrated that framework...
Statistical description of tectonic motions
NASA Technical Reports Server (NTRS)
Agnew, Duncan Carr
1993-01-01
This report summarizes investigations regarding tectonic motions. The topics discussed include the statistics of crustal deformation; Earth rotation studies using multitaper spectrum analysis techniques applied to both space-geodetic data and conventional astrometric estimates of the Earth's polar motion; and the development, design, and installation of high-stability geodetic monuments for use with the Global Positioning System.
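A multitaper spectrum estimate of the kind mentioned above can be sketched with SciPy's DPSS (Slepian) windows. This is a generic illustration, not the report's actual processing chain:

```python
import numpy as np
from scipy.signal.windows import dpss

def multitaper_psd(x, fs=1.0, NW=4, K=7):
    """Multitaper power spectrum estimate.

    Averages periodograms computed with K orthogonal DPSS tapers, trading a
    little resolution for much lower variance than a single-taper estimate,
    which is what makes the method attractive for short geodetic series.
    """
    x = np.asarray(x, float) - np.mean(x)
    tapers = dpss(len(x), NW, K)             # shape (K, N)
    spectra = np.abs(np.fft.rfft(tapers * x, axis=1)) ** 2
    psd = spectra.mean(axis=0) / fs
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    return freqs, psd

# Example: noisy sinusoid standing in for a polar-motion component.
t = np.arange(2048)
x = np.sin(2 * np.pi * 0.01 * t) + np.random.default_rng(0).normal(0, 1, t.size)
f, p = multitaper_psd(x)
print(f[np.argmax(p)])   # ~0.01 cycles/sample
```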
The Importance of Proving the Null
ERIC Educational Resources Information Center
Gallistel, C. R.
2009-01-01
Null hypotheses are simple, precise, and theoretically important. Conventional statistical analysis cannot support them; Bayesian analysis can. The challenge in a Bayesian analysis is to formulate a suitably vague alternative, because the vaguer the alternative is (the more it spreads out the unit mass of prior probability), the more the null is…
Bayesian Propensity Score Analysis: Simulation and Case Study
ERIC Educational Resources Information Center
Kaplan, David; Chen, Cassie J. S.
2011-01-01
Propensity score analysis (PSA) has been used in a variety of settings, such as education, epidemiology, and sociology. Most typically, propensity score analysis has been implemented within the conventional frequentist perspective of statistics. This perspective, as is well known, does not account for uncertainty in either the parameters of the…
Antoszewska-Smith, Joanna; Sarul, Michał; Łyczek, Jan; Konopka, Tomasz; Kawala, Beata
2017-03-01
The aim of this systematic review was to compare the effectiveness of orthodontic miniscrew implants (temporary intraoral skeletal anchorage devices, TISADs) for anchorage reinforcement during en-masse retraction with that of conventional methods of anchorage. A search of PubMed, Embase, the Cochrane Central Register of Controlled Trials, and Web of Science was performed. The keywords were orthodontic, mini-implants, miniscrews, miniplates, and temporary anchorage device. Relevant articles were assessed for quality according to Cochrane guidelines, and the data were extracted for statistical analysis. A meta-analysis of raw mean differences in anchorage loss, tipping of molars, retraction of incisors, tipping of incisors, and treatment duration was carried out. Initially, we retrieved 10,038 articles. The selection process finally yielded 14 articles including 616 patients (451 female, 165 male) for detailed analysis. The quality of the included studies was assessed as moderate. The meta-analysis showed that TISADs are more effective than conventional methods of anchorage reinforcement: on average, TISADs enabled 1.86 mm more anchorage preservation (P < 0.001). This difference of roughly 2 mm seems not only statistically but also clinically significant. However, the results should be interpreted with caution because of the moderate quality of the included studies, and more high-quality studies on this issue are necessary to enable more reliable conclusions.
Chen, Pei; Harnly, James M.; Lester, Gene E.
2013-01-01
Spectral fingerprints were acquired for Rio Red grapefruit using flow injection electrospray ionization with ion trap and time-of-flight mass spectrometry (FI-ESI-IT-MS and FI-ESI-TOF-MS). Rio Red grapefruits were harvested 3 times a year (early, mid, and late harvests) in 2005 and 2006 from conventionally and organically grown trees. Data analysis using analysis of variance principal component analysis (ANOVA-PCA) demonstrated that, for both MS systems, the chemical patterns were different as a function of farming mode (conventional vs organic), as well as growing year and time of harvest. This was visually obvious with PCA and was shown to be statistically significant using ANOVA. The spectral fingerprints provided a more inclusive view of the chemical composition of the grapefruit and extended previous conclusions regarding the chemical differences between conventionally and organically grown Rio Red grapefruit. PMID:20337420
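ANOVA-PCA, as used above, partitions the centred data matrix into factor-level means plus residuals and runs PCA on each factor matrix with the residuals added back. The following is a sketch assuming a balanced two-factor design; details of the authors' implementation may differ:

```python
import numpy as np
from sklearn.decomposition import PCA

def level_means(M, labels):
    """Replace each row by the mean of the rows sharing its factor level."""
    out = np.zeros_like(M)
    for lev in np.unique(labels):
        idx = labels == lev
        out[idx] = M[idx].mean(axis=0)
    return out

def anova_pca(X, factor_a, factor_b):
    """Two-factor ANOVA-PCA sketch: PCA on each factor effect + residuals."""
    Xc = X - X.mean(axis=0)                     # remove grand mean
    Ma = level_means(Xc, factor_a)              # e.g. farming-mode effect
    Mb = level_means(Xc - Ma, factor_b)         # e.g. harvest-time effect
    resid = Xc - Ma - Mb
    scores_a = PCA(n_components=2).fit_transform(Ma + resid)
    scores_b = PCA(n_components=2).fit_transform(Mb + resid)
    return scores_a, scores_b

# Hypothetical fingerprints: 24 samples x 300 m/z bins, two factors.
rng = np.random.default_rng(0)
X = rng.normal(size=(24, 300))
farming = np.array(['organic'] * 12 + ['conventional'] * 12)
harvest = np.array(['early', 'mid', 'late'] * 8)
a, b = anova_pca(X, farming, harvest)
print(a.shape, b.shape)
```

If a factor's effect is real, its PCA scores separate by level beyond the residual scatter, which is the visual/statistical test the abstract describes.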
Paranjpe, Madhav G; Denton, Melissa D; Vidmar, Tom; Elbekai, Reem H
2014-01-01
Carcinogenicity studies have been performed as conventional 2-year rodent studies for at least 3 decades, whereas short-term carcinogenicity studies in transgenic mice, such as Tg.rasH2, have only been performed over the last decade. In 2-year conventional rodent studies, interlinked problems that complicate the interpretation of findings, such as increasing trends in initial body weights, increased body weight gains, a high incidence of spontaneous tumors, and low survival, are well established. However, these end points have not been evaluated in short-term carcinogenicity studies involving Tg.rasH2 mice. In this article, we present a retrospective analysis of data obtained from control groups in 26-week carcinogenicity studies conducted in Tg.rasH2 mice since 2004. Our analysis showed statistically significant decreasing trends in the initial body weights of both sexes. Although the terminal body weights showed no significant trends, there was a statistically significant increasing trend in body weight gains, more so in males than in females, which correlated with increasing trends in food consumption. There were no statistically significant alterations in mortality trends. In addition, the incidence of all common spontaneous tumors remained fairly constant, with no statistically significant differences in trends.
Statistical Analysis of Time-Series from Monitoring of Active Volcanic Vents
NASA Astrophysics Data System (ADS)
Lachowycz, S.; Cosma, I.; Pyle, D. M.; Mather, T. A.; Rodgers, M.; Varley, N. R.
2016-12-01
Despite recent advances in the collection and analysis of time-series from volcano monitoring, and the resulting insights into volcanic processes, challenges remain in forecasting and interpreting activity from near real-time analysis of monitoring data. Statistical methods have potential to characterise the underlying structure and facilitate intercomparison of these time-series, and so inform interpretation of volcanic activity. We explore the utility of multiple statistical techniques that could be widely applicable to monitoring data, including Shannon entropy and detrended fluctuation analysis, by their application to various data streams from volcanic vents during periods of temporally variable activity. Each technique reveals changes through time in the structure of some of the data that were not apparent from conventional analysis. For example, we calculate the Shannon entropy (a measure of the randomness of a signal) of time-series from the recent dome-forming eruptions of Volcán de Colima (Mexico) and Soufrière Hills (Montserrat). The entropy of real-time seismic measurements and the count rate of certain volcano-seismic event types from both volcanoes is found to be temporally variable, with these data generally having higher entropy during periods of lava effusion and/or larger explosions. In some instances, the entropy shifts prior to or coincident with changes in seismic or eruptive activity, some of which were not clearly recognised by real-time monitoring. Comparison with other statistics demonstrates the sensitivity of the entropy to the data distribution, but that it is distinct from conventional statistical measures such as coefficient of variation. We conclude that each analysis technique examined could provide valuable insights for interpretation of diverse monitoring time-series.
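A windowed Shannon entropy of the sort described can be sketched directly. The series below is synthetic, with a heavy-tailed episode standing in for a change in volcanic seismicity; names and parameters are assumptions for illustration:

```python
import numpy as np

def shannon_entropy(x, bins=32):
    """Shannon entropy (bits) of the binned amplitude distribution of a signal."""
    counts, _ = np.histogram(x, bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]                        # 0 log 0 := 0
    return -np.sum(p * np.log2(p))

def sliding_entropy(x, window, step):
    """Entropy in sliding windows: higher values = more disordered signal."""
    starts = range(0, len(x) - window + 1, step)
    return np.array([shannon_entropy(x[s:s + window]) for s in starts])

# Synthetic monitoring series: Gaussian background, then a spiky,
# heavy-tailed episode whose concentrated histogram lowers the entropy.
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0, 1, 5000), rng.standard_cauchy(5000)])
print(sliding_entropy(x, window=1000, step=500).round(2))
```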
Functional Relationships and Regression Analysis.
ERIC Educational Resources Information Center
Preece, Peter F. W.
1978-01-01
Using a degenerate multivariate normal model for the distribution of organismic variables, the form of least-squares regression analysis required to estimate a linear functional relationship between variables is derived. It is suggested that the two conventional regression lines may be considered to describe functional, not merely statistical,…
Photon counting statistics analysis of biophotons from hands.
Jung, Hyun-Hee; Woo, Won-Myung; Yang, Joon-Mo; Choi, Chunho; Lee, Jonghan; Yoon, Gilwon; Yang, Jong S; Soh, Kwang-Sup
2003-05-01
The photon counting statistics of biophotons emitted from hands is studied with a view to testing its agreement with the Poisson distribution. The moments of the observed probability distribution up to seventh order were evaluated. The moments of biophoton emission from hands are in good agreement with the theoretical values of the Poisson distribution, while those of the dark counts of the photomultiplier tube show large deviations. The present results are consistent with the conventional delta-value analysis of the second moment of probability.
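The moment-based Poissonity check can be sketched as follows; the count data here are simulated stand-ins for measured photon counts:

```python
import numpy as np
from scipy.stats import poisson

# Hypothetical photon counts per time bin (replace with measured data).
rng = np.random.default_rng(0)
counts = rng.poisson(3.2, size=20000)

lam = counts.mean()
for k in range(2, 8):
    observed = np.mean(counts.astype(float) ** k)   # k-th raw moment of data
    expected = poisson.moment(k, lam)               # Poisson raw moment, same mean
    print(k, round(observed / expected, 3))         # ~1.0 if Poisson-like
```

For Poisson-distributed counts the ratios stay near 1 up to high order; for photomultiplier dark counts, as the abstract notes, they would drift away.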
NASA Technical Reports Server (NTRS)
Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.
1992-01-01
An improved methodology for quantitatively evaluating the failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.
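In the spirit of (but far simpler than) the PFA methodology, the following sketch propagates parameter uncertainty through a hypothetical fatigue failure criterion by Monte Carlo. All distributions and parameters are invented for illustration and are not from the documented methodology:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Hypothetical fatigue model: failure occurs when the demanded number of
# load cycles exceeds the (uncertain) cycles-to-failure capacity.
load_cycles = rng.lognormal(mean=10.0, sigma=0.2, size=n)        # driver uncertainty
cycles_to_failure = rng.lognormal(mean=11.0, sigma=0.4, size=n)  # material/model uncertainty

p_fail = np.mean(load_cycles > cycles_to_failure)
print(f"estimated failure probability: {p_fail:.2e}")
```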
Baek, Hye Jin; Kim, Dong Wook; Ryu, Ji Hwa; Lee, Yoo Jin
2013-09-01
No previous study has compared the diagnostic accuracy of an experienced radiologist with that of a trainee for nasal bone fracture. The aims were to compare the diagnostic accuracy of conventional radiography and computed tomography (CT) for the identification of nasal bone fractures and to evaluate the interobserver reliability between a staff radiologist and a trainee. A total of 108 patients who underwent conventional radiography and CT after acute nasal trauma were included in this retrospective study. Two readers, a staff radiologist and a second-year resident, independently assessed the imaging studies. Of the 108 patients, the presence of a nasal bone fracture was confirmed in 88 (81.5%). Non-depressed fractures outnumbered depressed fractures. In nine (10.2%) patients, nasal bone fractures were identified only on conventional radiography, including three depressed and six non-depressed fractures. CT was more accurate than conventional radiography for the identification of nasal bone fractures for both readers (P < 0.05). All diagnostic indices of the experienced radiologist were similar to or higher than those of the trainee, and κ statistics showed moderate agreement between the two diagnostic tools for both readers. There was no statistical difference in interobserver reliability between the two imaging modalities for the identification of nasal bone fractures. For the identification of nasal bone fractures, CT was significantly superior to conventional radiography. Although the staff radiologist achieved better values than the trainee in identifying nasal bone fractures and differentiating depressed from non-depressed fractures, there was no statistically significant difference between the radiologist and the trainee in the interpretation of conventional radiography and CT.
Comparisons of non-Gaussian statistical models in DNA methylation analysis.
Ma, Zhanyu; Teschendorff, Andrew E; Yu, Hong; Taghia, Jalil; Guo, Jun
2014-06-16
As a key regulatory mechanism of gene expression, DNA methylation patterns are widely altered in many complex genetic diseases, including cancer. DNA methylation is naturally quantified by bounded-support data and is therefore non-Gaussian distributed. In order to capture such properties, we introduce non-Gaussian statistical models to perform dimension reduction on DNA methylation data. Afterwards, non-Gaussian statistical model-based unsupervised clustering strategies are applied to cluster the data. Comparisons and analysis of different dimension-reduction strategies and unsupervised clustering methods are presented. Experimental results show that the non-Gaussian statistical model-based methods are superior to the conventional Gaussian distribution-based method and are meaningful tools for DNA methylation analysis. Moreover, among the non-Gaussian methods, the one that captures the bounded nature of DNA methylation data shows the best clustering performance.
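The bounded-support point can be illustrated by comparing a Gaussian fit with a beta fit on synthetic methylation beta-values. This is a toy comparison, not the authors' mixture models:

```python
import numpy as np
from scipy import stats

# Hypothetical methylation beta-values: bounded in (0, 1), bimodal-ish.
rng = np.random.default_rng(0)
x = np.concatenate([rng.beta(2, 8, 3000), rng.beta(8, 2, 1000)])

# Fit a Gaussian and a beta distribution; compare log-likelihoods.
mu, sd = stats.norm.fit(x)
a, b, loc, scale = stats.beta.fit(x, floc=0, fscale=1)  # keep support = (0, 1)

ll_norm = stats.norm.logpdf(x, mu, sd).sum()
ll_beta = stats.beta.logpdf(x, a, b).sum()
print(f"Gaussian: {ll_norm:.0f}   beta: {ll_beta:.0f}")  # beta fits better
```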
Novotná, H; Kmiecik, O; Gałązka, M; Krtková, V; Hurajová, A; Schulzová, V; Hallmann, E; Rembiałkowska, E; Hajšlová, J
2012-01-01
The rapidly growing demand for organic food requires analytical tools enabling its authentication. Recently, metabolomic fingerprinting/profiling has emerged as a promising option for the comprehensive characterisation of the small molecules occurring in plants, since their pattern may reflect the impact of various external factors. In a two-year pilot study concerned with the classification of organic versus conventional crops, ambient mass spectrometry consisting of a direct analysis in real time (DART) ion source and a time-of-flight mass spectrometer (TOFMS) was employed. This methodology was tested on 40 tomato and 24 pepper samples grown under specified conditions. To calculate statistical models, the obtained mass spectra were processed by principal component analysis (PCA) followed by linear discriminant analysis (LDA). Results from the positive ionisation mode enabled better differentiation between organic and conventional samples than those from the negative mode; in this case, the recognition ability obtained by LDA was 97.5% for tomato and 100% for pepper samples, and the prediction abilities were above 80% for both sample sets. The results suggest that the year of production had a stronger influence on the metabolomic fingerprints than the type of farming (organic versus conventional). In any case, DART-TOFMS is a promising tool for rapid screening of samples.
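A minimal PCA-plus-LDA classification pipeline of the kind described can be sketched with scikit-learn. The fingerprint matrix below is random stand-in data, so real spectra would be needed to reproduce the reported recognition abilities:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

# Hypothetical DART-TOFMS fingerprints: 40 samples x 1000 m/z bins,
# labelled conventional (0) or organic (1).
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 1000))
y = np.array([0, 1] * 20)

model = make_pipeline(PCA(n_components=10), LinearDiscriminantAnalysis())
# Cross-validated recognition ability (with real data, ~80-100% per the study).
print(cross_val_score(model, X, y, cv=5).mean())
```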
DOE Office of Scientific and Technical Information (OSTI.GOV)
Merrill, D.W.; Selvin, S.; Close, E.R.
In studying geographic disease distributions, one normally compares rates of arbitrarily defined geographic subareas (e.g., census tracts), thereby sacrificing the geographic detail of the original data. The sparser the data, the larger the subareas must be in order to calculate stable rates. This dilemma is avoided with the technique of Density Equalizing Map Projections (DEMP). Boundaries of geographic subregions are adjusted to equalize population density over the entire study area. Case locations plotted on the transformed map should have a uniform distribution if the underlying disease rates are constant. On the transformed map, the statistical analysis of the observed distribution is greatly simplified. Even for sparse distributions, the statistical significance of a supposed disease cluster can be reliably calculated. The present report describes the first successful application of the DEMP technique to a sizeable "real-world" data set of epidemiologic interest. An improved DEMP algorithm [GUSE93, CLOS94] was applied to a data set previously analyzed with conventional techniques [SATA90, REYN91]. The results from the DEMP analysis and a conventional analysis are compared.
Marolf, Angela; Blaik, Margaret; Ackerman, Norman; Watson, Elizabeth; Gibson, Nicole; Thompson, Margret
2008-01-01
The role of digital imaging is increasing as these systems become more affordable and accessible. Advantages of computed radiography compared with conventional film/screen combinations include improved contrast resolution and postprocessing capabilities. The spatial resolution of computed radiography is inferior to that of conventional radiography; however, this limitation is considered clinically insignificant. This study prospectively compared digital imaging and conventional radiography in detecting small-volume pneumoperitoneum. Twenty cadaver dogs (15-30 kg) were injected intra-abdominally with sequential volumes of 0.25, 0.25, and 0.5 ml of air (1 ml in total) and radiographed sequentially using computed and conventional radiographic technologies. Three radiologists independently evaluated the images, and receiver operating characteristic (ROC) analysis was used to compare the two imaging modalities. There was no statistical difference between computed and conventional radiography in detecting free abdominal air, but overall, computed radiography was relatively more sensitive based on ROC analysis. Computed radiographic images consistently and significantly demonstrated a minimal amount of 0.5 ml of free air based on ROC analysis, whereas no minimal air amount was consistently or significantly detected with conventional film. Readers were more likely to detect free air on lateral computed images than on the other projections, with no significant increase in sensitivity between film/screen projections. Further studies are indicated to determine the differences, or lack thereof, between various digital imaging systems and conventional film/screen systems.
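For a single reader, the ROC analysis referred to above reduces to the area under the curve, which can be computed directly from confidence ratings via its Mann-Whitney interpretation. The ratings below are hypothetical:

```python
import numpy as np

def auc_mann_whitney(scores_pos, scores_neg):
    """Area under the ROC curve via the Mann-Whitney interpretation:
    the probability that a randomly chosen positive case (free air present)
    receives a higher confidence score than a randomly chosen negative case.
    """
    pos = np.asarray(scores_pos, float)[:, None]
    neg = np.asarray(scores_neg, float)[None, :]
    return np.mean(pos > neg) + 0.5 * np.mean(pos == neg)

# Hypothetical 5-point confidence ratings from one reader.
ratings_air    = [5, 4, 4, 3, 5, 2, 4, 3]   # images with free air
ratings_no_air = [1, 2, 3, 1, 2, 2, 1, 3]   # images without
print(auc_mann_whitney(ratings_air, ratings_no_air))
```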
[Evaluation of Wits appraisal with superimposition method].
Xu, T; Ahn, J; Baumrind, S
1999-07-01
To compare the conventional Wits appraisal with a superimposition-based Wits appraisal in evaluating the change in sagittal jaw relationship between pre- and post-orthodontic treatment. The sample consisted of pre- and post-treatment lateral head films from 48 cases. Computerized digitizing was used to locate the cephalometric landmarks and to measure the conventional Wits value, the superimposed Wits value, and the ANB angle. Correlation analysis among these three measures was performed with the SAS statistical package. The change in ANB angle correlated more strongly with the change in the superimposed Wits value than with the change in the conventional Wits value, with r as high as 0.849 (P < 0.001). The superimposed Wits appraisal reflects the change in sagittal jaw relationship more objectively than the conventional one.
Neither fixed nor random: weighted least squares meta-analysis.
Stanley, T D; Doucouliagos, Hristos
2015-06-15
This study challenges two core conventional meta-analysis methods: fixed effect and random effects. We show how and explain why an unrestricted weighted least squares estimator is superior to conventional random-effects meta-analysis when there is publication (or small-sample) bias and better than a fixed-effect weighted average if there is heterogeneity. Statistical theory and simulations of effect sizes, log odds ratios and regression coefficients demonstrate that this unrestricted weighted least squares estimator provides satisfactory estimates and confidence intervals that are comparable to random effects when there is no publication (or small-sample) bias and identical to fixed-effect meta-analysis when there is no heterogeneity. When there is publication selection bias, the unrestricted weighted least squares approach dominates random effects; when there is excess heterogeneity, it is clearly superior to fixed-effect meta-analysis. In practical applications, an unrestricted weighted least squares weighted average will often provide superior estimates to both conventional fixed and random effects.
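A sketch of the unrestricted WLS estimator follows, under the standard formulation: the point estimate equals the fixed-effect average, and its variance is the fixed-effect variance multiplied by the regression's mean squared error, with the multiplier left unrestricted. The study-level inputs are hypothetical:

```python
import numpy as np

def wls_meta(y, se):
    """Unrestricted weighted least squares (WLS) meta-analytic average:
    regress y_i/se_i on 1/se_i with no intercept. The slope equals the
    fixed-effect average; the regression MSE (= Q/(k-1)) rescales its
    variance multiplicatively, above or below 1 as the data demand.
    """
    y, se = np.asarray(y, float), np.asarray(se, float)
    w = 1.0 / se**2
    b = np.sum(w * y) / np.sum(w)                # fixed-effect point estimate
    k = len(y)
    mse = np.sum(w * (y - b) ** 2) / (k - 1)     # Q / (k - 1)
    se_wls = np.sqrt(mse / np.sum(w))            # unrestricted WLS standard error
    return b, se_wls, (b - 1.96 * se_wls, b + 1.96 * se_wls)

# Hypothetical effect sizes and standard errors from 6 studies.
y  = [0.30, 0.15, 0.45, 0.20, 0.60, 0.10]
se = [0.10, 0.12, 0.20, 0.08, 0.25, 0.15]
print(wls_meta(y, se))
```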
Statistical analysis of modeling error in structural dynamic systems
NASA Technical Reports Server (NTRS)
Hasselman, T. K.; Chrostowski, J. D.
1990-01-01
The paper presents a generic statistical model of the (total) modeling error for conventional space structures in their launch configuration. Modeling error is defined as the difference between analytical prediction and experimental measurement. It is represented by the differences between predicted and measured real eigenvalues and eigenvectors. Comparisons are made between pre-test and post-test models. Total modeling error is then subdivided into measurement error, experimental error and 'pure' modeling error, and comparisons made between measurement error and total modeling error. The generic statistical model presented in this paper is based on the first four global (primary structure) modes of four different structures belonging to the generic category of Conventional Space Structures (specifically excluding large truss-type space structures). As such, it may be used to evaluate the uncertainty of predicted mode shapes and frequencies, sinusoidal response, or the transient response of other structures belonging to the same generic category.
Engineering analysis of shortfall for new technologies. Analysis memorandum
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1981-03-11
The engineering principles that govern the mpg performance of alternative technologies on the EPA test procedure and under in-use conditions are examined. The results can be used to interpret the shortfall of alternative technologies derived from statistical analyses. The analysis examines each of the four technologies in comparison to its conventional counterpart: manual transmissions are compared to automatics, fuel-injected S.I. engines to carburetted S.I. engines, front-wheel-drive vehicles to rear-wheel-drive vehicles, and diesel engines to carburetted S.I. engines. The changes in shortfall of the four technologies in comparison to conventional technologies are explained through differences in responses to the factors.
Ramadan, Wijdan H; Khreis, Noura A; Kabbara, Wissam K
2015-01-01
Background: The aim of the study was to evaluate the simplicity, safety, patients' preference, and convenience of insulin administration using the pen device versus the conventional vial/syringe in patients with diabetes. Methods: This observational study was conducted in multiple community pharmacies in Lebanon. The investigators interviewed patients with diabetes using an insulin pen or the conventional vial/syringe. A total of 74 questionnaires were completed over a period of 6 months. Answers were entered into the Statistical Package for the Social Sciences (SPSS) software and an Excel spreadsheet. t-tests, logistic regression analysis, and correlation analysis were used to analyze the results. Results: A higher percentage of patients in the insulin pen group (95.2%) found the method easy to use, compared with only 46.7% of the conventional vial/syringe group (P = 0.001, relative risk [RR]: 2.041, 95% confidence interval [CI]: 1.178–3.535). Moreover, 61.9% of pen users versus 26.7% of conventional users could read the scale easily (P = 0.037, RR 2.321, 95% CI: 0.940–5.731), 85.7% of pen users found shifting to the pen more convenient, and 86.7% of conventional users would want to shift to the pen if it had the same cost. Pain perception differed significantly between the groups: a much higher percentage of pen users (76.2%) reported no pain during injection, compared with only 26.7% of conventional users (P = 0.003, RR 2.857, 95% CI: 1.194–6.838). Conclusion: The insulin pen was significantly easier to use and less painful than the conventional vial/syringe. Proper education on the methods of administration/storage and disposal of needles/syringes is needed in both groups. PMID:25848231
Mason, H. E.; Uribe, E. C.; Shusterman, J. A.
2018-01-01
Tensor-rank decomposition methods have been applied to variable-contact-time ²⁹Si{¹H} CP/CPMG NMR data sets to extract NMR dynamics information and dramatically decrease conventional NMR acquisition times.
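Tensor-rank (CP) decomposition itself can be sketched with a small alternating-least-squares routine in NumPy. This generic implementation and the synthetic data cube are stand-ins for the authors' NMR processing:

```python
import numpy as np

def kr(U, V):
    """Column-wise Khatri-Rao product."""
    return (U[:, None, :] * V[None, :, :]).reshape(-1, U.shape[1])

def unfold(X, mode):
    return np.reshape(np.moveaxis(X, mode, 0), (X.shape[mode], -1))

def cp_als(X, rank, n_iter=200, seed=0):
    """Rank-`rank` CP (tensor-rank) decomposition of a 3-way array by
    alternating least squares: X[i,j,k] ~ sum_r A[i,r] B[j,r] C[k,r]."""
    rng = np.random.default_rng(seed)
    A, B, C = (rng.normal(size=(s, rank)) for s in X.shape)
    X1, X2, X3 = unfold(X, 0), unfold(X, 1), unfold(X, 2)
    for _ in range(n_iter):
        A = X1 @ kr(B, C) @ np.linalg.pinv((B.T @ B) * (C.T @ C))
        B = X2 @ kr(A, C) @ np.linalg.pinv((A.T @ A) * (C.T @ C))
        C = X3 @ kr(A, B) @ np.linalg.pinv((A.T @ A) * (B.T @ B))
    return A, B, C

# Hypothetical stand-in for a contact-time x frequency x echo data cube.
rng = np.random.default_rng(1)
A0, B0, C0 = rng.normal(size=(20, 2)), rng.normal(size=(64, 2)), rng.normal(size=(16, 2))
X = np.einsum('ir,jr,kr->ijk', A0, B0, C0) + 0.01 * rng.normal(size=(20, 64, 16))
A, B, C = cp_als(X, rank=2)
Xhat = np.einsum('ir,jr,kr->ijk', A, B, C)
print(np.linalg.norm(X - Xhat) / np.linalg.norm(X))  # relative error, typically small
```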
NASA Astrophysics Data System (ADS)
Kurnia, H.; Noerhadi, N. A. I.
2017-08-01
Three-dimensional digital study models were introduced following advances in digital technology. This study was carried out to assess the reliability of digital study models produced by a newly assembled laser scanning device, by comparing them with conventional models. Twelve sets of dental impressions were taken from patients with mild-to-moderate crowding. The impressions were taken twice, once with alginate and once with polyvinylsiloxane. The alginate impressions were made into conventional models, and the polyvinylsiloxane impressions were scanned to produce digital models. The mesiodistal tooth width and Little's irregularity index (LII) were measured manually with digital calipers on the conventional models and digitally on the digital study models, and Bolton analysis was performed on each set of study models. Each method was carried out twice to check intra-observer variability. Reproducibility (comparison of the methods) was assessed using independent-sample t-tests. The mesiodistal tooth widths of the conventional and digital models did not differ significantly (p > 0.05), and independent-sample t-tests identified no statistically significant differences for the Bolton analysis or LII (p = 0.603 for Bolton and p = 0.894 for LII). The measurements of the digital study models are as accurate as those of the conventional models.
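Bolton analysis, mentioned above, is a simple ratio of summed mesiodistal widths. A sketch follows, with hypothetical measurements; the conventional targets of about 91.3% (overall) and 77.2% (anterior) are standard reference values:

```python
def bolton_ratios(maxillary, mandibular):
    """Bolton tooth-size analysis from mesiodistal widths (mm).

    Each list holds the 12 measured widths per arch (first molar to first
    molar); by the ordering assumed here, the first 6 entries are the
    anterior teeth (canine to canine).
    Standard targets: overall ~91.3%, anterior ~77.2%.
    """
    overall = 100 * sum(mandibular) / sum(maxillary)
    anterior = 100 * sum(mandibular[:6]) / sum(maxillary[:6])
    return overall, anterior

# Hypothetical measurements (mm), anterior teeth first in each list.
maxilla  = [8.5, 6.8, 7.9, 8.6, 6.9, 7.8, 7.0, 6.8, 10.1, 10.0, 7.1, 6.9]
mandible = [5.4, 5.9, 6.9, 5.5, 6.0, 7.0, 7.2, 7.1, 11.0, 11.1, 7.3, 7.2]
print(bolton_ratios(maxilla, mandible))
```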
Kawashima, Hiroki; Hayashi, Norio; Ohno, Naoki; Matsuura, Yukihiro; Sanada, Shigeru
2015-08-01
To evaluate the patient-identification ability of radiographers, previous and current chest radiographs were assessed in an observer study using receiver operating characteristic (ROC) analysis. This study included portable and conventional chest radiographs forming 43 same-patient and 43 different-patient pairs. The dataset was divided into the following three groups: (1) a pair of portable radiographs, (2) a pair of conventional radiographs, and (3) a combination of the two types. Seven observers participated in this ROC study, which aimed to identify same or different patients using these datasets. ROC analysis was conducted to calculate the average area under the ROC curve for each observer (AUCave), and statistical testing was performed using the multi-reader multi-case method. Comparable results were obtained with pairs of portable (AUCave: 0.949) and conventional radiographs (AUCave: 0.951); in comparisons within the same modality, there were no significant differences. In contrast, the ability to identify patients by comparing a portable and a conventional radiograph (AUCave: 0.873) was lower than with the matched datasets (p = 0.002 and p = 0.004, respectively). In conclusion, the use of different imaging modalities reduces radiographers' ability to identify their patients.
Enrichment analysis in high-throughput genomics - accounting for dependency in the NULL.
Gold, David L; Coombes, Kevin R; Wang, Jing; Mallick, Bani
2007-03-01
Translating the overwhelming amount of data generated in high-throughput genomics experiments into biologically meaningful evidence, which may for example point to a series of biomarkers or hint at a relevant pathway, is a matter of great interest in bioinformatics these days. Genes showing similar experimental profiles, it is hypothesized, share biological mechanisms that, if understood, could provide clues to the molecular processes leading to pathological events. It is the topic of further study to learn if or how a priori information about the known genes may serve to explain coexpression. One popular method of knowledge discovery in high-throughput genomics experiments, enrichment analysis (EA), seeks to infer whether an interesting collection of genes is 'enriched' for a particular set of a priori Gene Ontology Consortium (GO) classes. For the purposes of statistical testing, the conventional methods offered in EA software implicitly assume independence between the GO classes. Genes may be annotated for more than one biological classification, and therefore the resulting test statistics of enrichment between GO classes can be highly dependent if the overlapping gene sets are relatively large. There is a need to formally determine whether conventional EA results are robust to the independence assumption. We derive the exact null distribution for testing enrichment of GO classes by relaxing the independence assumption using well-known statistical theory. In applications with publicly available data sets, our test results are similar to those of the conventional approach that assumes independence. We argue that the independence assumption is not detrimental.
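The conventional, independence-assuming EA test referred to above is typically a per-class hypergeometric (one-sided Fisher) test; a sketch, with hypothetical counts:

```python
from scipy.stats import hypergeom

def go_enrichment_p(n_genome, n_in_class, n_selected, n_overlap):
    """Conventional enrichment P-value for one GO class: the probability of
    drawing at least `n_overlap` class members when `n_selected` genes are
    sampled from a genome of `n_genome` genes, `n_in_class` of which are
    annotated to the class. Each class is treated independently, which is
    exactly the assumption the paper examines.
    """
    return hypergeom.sf(n_overlap - 1, n_genome, n_in_class, n_selected)

# Example: 40 of 500 interesting genes fall in a class of 300 (genome: 20000).
print(go_enrichment_p(20000, 300, 500, 40))   # tiny P => "enriched"
```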
Evaluation of bearing capacity of piles from cone penetration test data.
DOT National Transportation Integrated Search
2007-12-01
A statistical analysis and ranking criteria were used to compare the CPT methods and the conventional alpha design method. Based on the results, the de Ruiter/Beringen and LCPC methods showed the best capability in predicting the measured load carryi...
Testing Some Stereotypes About the Sexes in Organizations: Differential Centrality of Work?
ERIC Educational Resources Information Center
Golembiewski, Robert T.
1977-01-01
Analyzes 2,250 responses to employee questionnaires that measured seven variables related to employees' perceived centrality of work. Statistical analysis of the data generally supports the conventional wisdom that males consider work more central than females. (JG)
A Non-Intrusive Algorithm for Sensitivity Analysis of Chaotic Flow Simulations
NASA Technical Reports Server (NTRS)
Blonigan, Patrick J.; Wang, Qiqi; Nielsen, Eric J.; Diskin, Boris
2017-01-01
We demonstrate a novel algorithm for computing the sensitivity of statistics in chaotic flow simulations to parameter perturbations. The algorithm is non-intrusive but requires exposing an interface. Based on the principle of shadowing in dynamical systems, this algorithm is designed to reduce the effect of the sampling error in computing sensitivity of statistics in chaotic simulations. We compare the effectiveness of this method to that of the conventional finite difference method.
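To illustrate why sampling error troubles the conventional finite-difference baseline, the sketch below estimates the sensitivity of a long-time-averaged statistic of the chaotic Lorenz system using central differences; all parameters, run lengths and the integrator are illustrative choices, not the paper's setup.

    import numpy as np

    def lorenz_average_z(rho, dt=0.01, steps=100_000):
        """Time-average of z for the Lorenz system, integrated with forward Euler."""
        sigma, beta = 10.0, 8.0 / 3.0
        x, y, z = 1.0, 1.0, 1.0
        total = 0.0
        for _ in range(steps):
            dx = sigma * (y - x)
            dy = x * (rho - z) - y
            dz = x * y - beta * z
            x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
            total += z
        return total / steps

    # Conventional central finite difference of the statistic w.r.t. rho;
    # the estimate is noisy and depends strongly on run length and step size.
    rho, eps = 28.0, 0.5
    sens = (lorenz_average_z(rho + eps) - lorenz_average_z(rho - eps)) / (2 * eps)
    print(f"d<z>/d(rho) ~ {sens:.3f}")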
Extending local canonical correlation analysis to handle general linear contrasts for FMRI data.
Jin, Mingwu; Nandy, Rajesh; Curran, Tim; Cordes, Dietmar
2012-01-01
Local canonical correlation analysis (CCA) is a multivariate method that has been proposed to more accurately determine activation patterns in fMRI data. In its conventional formulation, CCA has several drawbacks that limit its usefulness in fMRI. A major drawback is that, unlike the general linear model (GLM), a test of general linear contrasts of the temporal regressors has not been incorporated into the CCA formalism. To overcome this drawback, a novel directional test statistic was derived using the equivalence of multivariate multiple regression (MVMR) and CCA. This extension will allow CCA to be used for inference of general linear contrasts in more complicated fMRI designs without reparameterization of the design matrix and without reestimating the CCA solutions for each particular contrast of interest. With the proper constraints on the spatial coefficients of CCA, this test statistic can yield a more powerful test on the inference of evoked brain regional activations from noisy fMRI data than the conventional t-test in the GLM. The quantitative results from simulated and pseudoreal data and activation maps from fMRI data were used to demonstrate the advantage of this novel test statistic.
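For comparison, the conventional GLM contrast t-test that this CCA extension generalizes can be written in a few lines; the design matrix, contrast vector and data below are simulated for illustration.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    n, p = 120, 3
    X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])  # design matrix
    y = X @ np.array([1.0, 0.5, 0.0]) + rng.normal(size=n)          # simulated voxel time series

    beta = np.linalg.lstsq(X, y, rcond=None)[0]                     # GLM estimates
    resid = y - X @ beta
    sigma2 = resid @ resid / (n - p)                                # residual variance
    c = np.array([0.0, 1.0, -1.0])                                  # contrast of interest
    t = (c @ beta) / np.sqrt(sigma2 * c @ np.linalg.inv(X.T @ X) @ c)
    p_value = 2 * stats.t.sf(abs(t), df=n - p)
    print(t, p_value)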
Kruppa, B; Rüden, H
1993-05-01
The question addressed was whether a reduction of airborne particles and bacteria occurs in conventionally (turbulently) ventilated operating theatres, in comparison with laminar-airflow (LAF) operating theatres, at high air-exchange rates. Within the framework of energy-saving measures, the influence of the air-exchange rate on airborne particle and bacteria concentrations was determined in two identical operating theatres with conventional ventilation (wall diffusor panel) at air-exchange rates of 7.5, 10, 15 and 20/h, without surgical activity. This was assessed by means of analysis of variance. For the comparison of the air-exchange rates of 7.5 and 15/h in particular, statistically significant differences in airborne particle concentrations were found in supply and ambient air. Concerning airborne bacteria concentrations, no differences were found among the various air-exchange rates. The explained variance is quite high for non-viable particles (supply air: 37%; ambient air: 81%) but negligible, at values below 15%, for viable particles (bacteria).
Tani, Kazuki; Mio, Motohira; Toyofuku, Tatsuo; Kato, Shinichi; Masumoto, Tomoya; Ijichi, Tetsuya; Matsushima, Masatoshi; Morimoto, Shoichi; Hirata, Takumi
2017-01-01
Spatial normalization is a significant image pre-processing operation in statistical parametric mapping (SPM) analysis. The purpose of this study was to clarify the optimal method of spatial normalization for improving diagnostic accuracy in SPM analysis of arterial spin-labeling (ASL) perfusion images. We evaluated the SPM results of five spatial normalization methods obtained by comparing patients with Alzheimer's disease or normal pressure hydrocephalus complicated with dementia against cognitively healthy subjects. We used the following methods: 3DT1-conventional, based on spatial normalization using anatomical images; 3DT1-DARTEL, based on spatial normalization with DARTEL using anatomical images; 3DT1-conventional template and 3DT1-DARTEL template, created by averaging cognitively healthy subjects spatially normalized using the above methods; and ASL-DARTEL template, created by averaging cognitively healthy subjects spatially normalized with DARTEL using ASL images only. Our results showed that the ASL-DARTEL template was smaller than the other two templates, and the SPM results obtained with the ASL-DARTEL template method were inaccurate. There were no significant differences between the 3DT1-conventional and 3DT1-DARTEL template methods. In contrast, the 3DT1-DARTEL method showed higher detection sensitivity and more precise anatomical localization. Our SPM results suggest that spatial normalization should be performed with DARTEL using anatomical images.
Dimensional changes of acrylic resin denture bases: conventional versus injection-molding technique.
Gharechahi, Jafar; Asadzadeh, Nafiseh; Shahabian, Foad; Gharechahi, Maryam
2014-07-01
Acrylic resin denture bases undergo dimensional changes during polymerization. Injection-molding techniques are reported to reduce these changes and thereby improve the physical properties of denture bases. The aim of this study was to compare dimensional changes of specimens processed by conventional and injection-molding techniques. SR-Ivocap Triplex Hot resin was used for the conventional pressure-packed technique and SR-Ivocap High Impact for the injection-molding technique. After processing, all the specimens were stored in distilled water at room temperature until measured. For dimensional accuracy evaluation, measurements were recorded at 24-hour, 48-hour and 12-day intervals using a digital caliper with an accuracy of 0.01 mm. Statistical analysis was carried out by SPSS (SPSS Inc., Chicago, IL, USA) using the t-test and repeated-measures ANOVA. Statistical significance was defined at P<0.05. After each water storage period, the acrylic specimens produced by injection exhibited smaller dimensional changes than those produced by the conventional technique. Curing shrinkage was compensated by water sorption; increasing water storage time decreased the dimensional changes. Within the limitations of this study, dimensional changes of acrylic resin specimens were influenced by the molding technique used, and the SR-Ivocap injection procedure exhibited higher dimensional accuracy than conventional molding.
AlBarakati, SF; Kula, KS; Ghoneima, AA
2012-01-01
Objective: The aim of this study was to assess the reliability and reproducibility of angular and linear measurements of conventional and digital cephalometric methods. Methods: A total of 13 landmarks and 16 skeletal and dental parameters were defined and measured on pre-treatment cephalometric radiographs of 30 patients. The conventional and digital tracings and measurements were performed twice by the same examiner with a 6-week interval between measurements. The reliability within each method was determined using Pearson's correlation coefficient (r²). The reproducibility between methods was calculated by paired t-test. The level of statistical significance was set at p < 0.05. Results: All measurements for each method had an r² above 0.90 (strong correlation) except maxillary length, which had a correlation of 0.82 for conventional tracing. Significant differences between the two methods were observed in most angular and linear measurements, except for the ANB angle (p = 0.5), angle of convexity (p = 0.09), anterior cranial base (p = 0.3) and the lower anterior facial height (p = 0.6). Conclusion: In general, both conventional and digital cephalometric analysis methods are highly reliable. Although the reproducibility of the two methods showed some statistically significant differences, most differences were not clinically significant. PMID:22184624
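The within-method reliability and between-method reproducibility assessments described above map onto two standard tests; a minimal sketch with simulated repeat measurements (values invented for illustration):

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)
    first = rng.normal(80, 5, size=30)              # e.g., an angle from the first tracing
    second = first + rng.normal(0, 1.0, size=30)    # repeat tracing six weeks later

    r, _ = stats.pearsonr(first, second)            # within-method reliability
    t, p = stats.ttest_rel(first, second)           # between-method reproducibility
    print(f"r^2 = {r**2:.3f}, paired t = {t:.2f}, p = {p:.3f}")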
Mali, Matilda; Dell'Anna, Maria Michela; Mastrorilli, Piero; Damiani, Leonardo; Ungaro, Nicola; Belviso, Claudia; Fiore, Saverio
2015-11-01
Sediment contamination by metals poses significant risks to coastal ecosystems and is considered to be problematic for dredging operations. The determination of the background values of metal and metalloid distribution based on site-specific variability is fundamental in assessing pollution levels in harbour sediments. The novelty of the present work consists of addressing the scope and limitations of analysing port sediments through conventional statistical techniques (linear regression analysis, construction of cumulative frequency curves, and the iterative 2σ technique) that are commonly employed for assessing Regional Geochemical Background (RGB) values in coastal sediments. This study ascertained that although the outright use of such techniques in determining the RGB values in harbour sediments seems appropriate (the chemical-physical parameters of port sediments fit the statistical equations well), it should nevertheless be avoided, because it may be misleading and can mask key aspects of the study area that can only be revealed by further investigations, such as mineralogical and multivariate statistical analyses. Copyright © 2015 Elsevier Ltd. All rights reserved.
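Of the conventional techniques named above, the iterative 2σ method is the most algorithmic: the mean and standard deviation are recomputed on the values falling within mean ± 2σ until nothing further is trimmed, and the surviving values define the background range. A sketch on synthetic concentration data (all values invented):

    import numpy as np

    def iterative_2sigma(values, max_iter=50):
        """Estimate a geochemical background range by iterative 2-sigma trimming."""
        vals = np.asarray(values, dtype=float)
        for _ in range(max_iter):
            mean, sd = vals.mean(), vals.std(ddof=1)
            kept = vals[(vals >= mean - 2 * sd) & (vals <= mean + 2 * sd)]
            if len(kept) == len(vals):   # converged: nothing was trimmed
                break
            vals = kept
        return vals.mean(), vals.std(ddof=1)

    rng = np.random.default_rng(3)
    background = rng.normal(30.0, 5.0, size=180)   # e.g., ppm of a metal
    polluted = rng.normal(90.0, 20.0, size=20)     # contaminated outliers
    mean, sd = iterative_2sigma(np.concatenate([background, polluted]))
    print(f"background ~ {mean:.1f} +/- {2 * sd:.1f} ppm")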
Life-cycle analysis of shale gas and natural gas.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clark, C.E.; Han, J.; Burnham, A.
2012-01-27
The technologies and practices that have enabled the recent boom in shale gas production have also brought attention to the environmental impacts of its use. Using the current state of knowledge of the recovery, processing, and distribution of shale gas and conventional natural gas, we have estimated up-to-date, life-cycle greenhouse gas emissions. In addition, we have developed distribution functions for key parameters in each pathway to examine uncertainty and identify data gaps - such as methane emissions from shale gas well completions and conventional natural gas liquid unloadings - that need to be addressed further. Our base case results show that shale gas life-cycle emissions are 6% lower than those of conventional natural gas. However, the range in values for shale and conventional gas overlap, so there is a statistical uncertainty regarding whether shale gas emissions are indeed lower than conventional gas emissions. This life-cycle analysis provides insight into the critical stages in the natural gas industry where emissions occur and where opportunities exist to reduce the greenhouse gas footprint of natural gas.
Precipitate statistics in an Al-Mg-Si-Cu alloy from scanning precession electron diffraction data
NASA Astrophysics Data System (ADS)
Sunde, J. K.; Paulsen, Ø.; Wenner, S.; Holmestad, R.
2017-09-01
The key microstructural feature providing strength to age-hardenable Al alloys is nanoscale precipitates. Alloy development requires a reliable statistical assessment of these precipitates in order to link the microstructure with material properties. Here, it is demonstrated that scanning precession electron diffraction combined with computational analysis enables the semi-automated extraction of precipitate statistics in an Al-Mg-Si-Cu alloy. Among the main findings is the precipitate number density, which agrees well with a conventional method based on manual counting and measurements. By virtue of its data analysis objectivity, our methodology is therefore seen as an advantageous alternative to existing routines, offering reproducibility and efficiency in alloy statistics. Additional results include improved qualitative information on phase distributions. The developed procedure is generic and applicable to any material containing nanoscale precipitates.
78 FR 8682 - Shipping Coordinating Committee; Notice of Committee Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-06
... the Protocol of 1978 (MARPOL 73/78); Casualty statistics and investigations; Harmonization of port State control activities; Port State Control (PSC) Guidelines on seafarers' hours of rest and PSC... control under the 2004 Ballast Water Management (BWM) Convention; Comprehensive analysis of difficulties...
Testing of Hypothesis in Equivalence and Non Inferiority Trials-A Concept.
Juneja, Atul; Aggarwal, Abha R; Adhikari, Tulsi; Pandey, Arvind
2016-04-01
Establishing the appropriate hypothesis is one of the important steps in carrying out statistical tests and analyses, and understanding it is important for interpreting the results of statistical analysis. The current communication attempts to convey the concept of testing of hypothesis in non-inferiority and equivalence trials, where the null hypothesis is the reverse of that set up for conventional superiority trials. As in superiority trials, the null hypothesis is the one sought to be rejected, in order to establish the fact the researcher intends to prove. It is important to mention that equivalence or non-inferiority cannot be proved by accepting the null hypothesis of no difference. Hence, establishing the appropriate statistical hypothesis is extremely important to arrive at a meaningful conclusion for the set objectives in research.
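As a concrete illustration of the reversed null, equivalence is often assessed with two one-sided tests (TOST) against a pre-specified margin; the sketch below uses simulated arms and an assumed margin of ±2 units, and is not drawn from the article.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(4)
    a = rng.normal(10.0, 3.0, size=60)   # new treatment
    b = rng.normal(10.3, 3.0, size=60)   # reference treatment
    margin = 2.0                         # pre-specified equivalence margin

    diff = a.mean() - b.mean()
    se = np.sqrt(a.var(ddof=1) / len(a) + b.var(ddof=1) / len(b))
    df = len(a) + len(b) - 2             # pooled df; Welch df is also reasonable

    # Null hypothesis is non-equivalence: |difference| >= margin.
    p_lower = stats.t.sf((diff + margin) / se, df)    # H0: diff <= -margin
    p_upper = stats.t.cdf((diff - margin) / se, df)   # H0: diff >= +margin
    p_tost = max(p_lower, p_upper)                    # reject both to claim equivalence
    print(f"difference = {diff:.2f}, TOST p = {p_tost:.4f}")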
Baek, Hye Jin; Kim, Dong Wook; Ryu, Ji Hwa; Lee, Yoo Jin
2013-01-01
Background: There has been no study comparing the diagnostic accuracy of an experienced radiologist with that of a trainee in nasal bone fracture. Objectives: To compare the diagnostic accuracy of conventional radiography and computed tomography (CT) for the identification of nasal bone fractures and to evaluate the interobserver reliability between a staff radiologist and a trainee. Patients and Methods: A total of 108 patients who underwent conventional radiography and CT after acute nasal trauma were included in this retrospective study. Two readers, a staff radiologist and a second-year resident, independently assessed the results of the imaging studies. Results: Of the 108 patients, the presence of a nasal bone fracture was confirmed in 88 (81.5%) patients. The number of non-depressed fractures was higher than the number of depressed fractures. In nine (10.2%) patients, nasal bone fractures were identified only on CT, including three depressed and six non-depressed fractures. CT was more accurate than conventional radiography for the identification of nasal bone fractures as determined by both readers (P < 0.05). All diagnostic indices of the experienced radiologist were similar to or higher than those of the trainee, and κ statistics showed moderate agreement between the two diagnostic tools for both readers. There was no statistical difference in interobserver reliability for either imaging modality in the identification of nasal bone fractures. Conclusion: For the identification of nasal bone fractures, CT was significantly superior to conventional radiography. Although the staff radiologist showed better values in the identification of nasal bone fractures and differentiation between depressed and non-depressed fractures than the trainee, there was no statistically significant difference in the interpretation of conventional radiography and CT between the radiologist and the trainee. PMID:24348599
Ahmad, Aqeel; Negri, Ignacio; Oliveira, Wladecir; Brown, Christopher; Asiimwe, Peter; Sammons, Bernard; Horak, Michael; Jiang, Changjian; Carson, David
2016-02-01
As part of an environmental risk assessment, the potential impact of genetically modified (GM) maize MON 87411 on non-target arthropods (NTAs) was evaluated in the field. MON 87411 confers resistance to corn rootworm (CRW; Diabrotica spp.) by expressing an insecticidal double-stranded RNA (dsRNA) transcript and the Cry3Bb1 protein, and tolerance to the herbicide glyphosate by producing the CP4 EPSPS protein. Field trials were conducted at 14 sites providing high geographic and environmental diversity within maize production areas from three regions: the U.S., Argentina, and Brazil. MON 87411, the conventional control, and four commercial conventional reference hybrids were evaluated for NTA abundance and damage. Twenty arthropod taxa met minimum abundance criteria for valid statistical analysis. Nine of these taxa occurred in at least two of the three regions and in at least four sites across regions. These nine taxa included: aphid, predatory earwig, lacewing, ladybird beetle, leafhopper, minute pirate bug, parasitic wasp, sap beetle, and spider. In addition to wide regional distribution, these taxa encompass the ecological functions of herbivores, predators and parasitoids in maize agro-ecosystems. Thus, the nine arthropods may serve as representative taxa of maize agro-ecosystems, supporting the premise that relevant data generated in one region can be transportable for the risk assessment of the same or similar GM crop products in another region. Across the 20 taxa analyzed, no statistically significant differences in abundance were detected between MON 87411 and the conventional control for 123 of the 128 individual-site comparisons (96.1%). For the nine widely distributed taxa, no statistically significant differences in abundance were detected between MON 87411 and the conventional control. Furthermore, no statistically significant differences were detected between MON 87411 and the conventional control for 53 out of 56 individual-site comparisons (94.6%) of NTA pest damage to the crop. In each case where a significant difference was observed in arthropod abundance or damage, the mean value for MON 87411 was within the reference range and/or the difference was not consistently observed across collection methods and/or sites. Thus, the differences were not representative of an adverse effect unfamiliar to maize and/or were not indicative of a consistent plant response associated with the GM traits. Results from this study support a conclusion of no adverse environmental impact of MON 87411 on NTAs compared to conventional maize and demonstrate the utility of relevant transportable data across regions for the ERA of GM crops.
Kandiah, P; Tahmassebi, J F
2012-11-01
This prospective, randomised, parallel, controlled study was conducted firstly to compare the onset of local anaesthesia (LA) using the conventional technique versus the Wand computer-controlled LA system, and secondly to assess the pain experience in children. Thirty children were randomly allocated to the treatment group (Wand) or the control group (conventional). Lidocaine 2% with adrenaline (1:80,000) was given as a buccal infiltration. The onset of pulpal anaesthesia was tested using an analytic electric pulp tester (EPT). The pain experience during the LA was recorded using a modified visual analogue scale (VAS). The median time for the onset of LA was 6.30 minutes for the control group and 7.25 minutes for the Wand group. The mean pain experience score for the control group was 9.78%, as opposed to 8.46% in the Wand group. Statistical analysis showed no statistically significant difference in the onset of LA (p = 0.486) or the pain experience (p = 0.713) between the two groups. When placing a buccal infiltration on upper first permanent molars, neither the onset of LA nor the pain experience differed between the Wand and the conventional technique.
Digital vs. conventional implant prosthetic workflows: a cost/time analysis.
Joda, Tim; Brägger, Urs
2015-12-01
The aim of this prospective cohort trial was to perform a cost/time analysis for implant-supported single-unit reconstructions in the digital workflow compared to the conventional pathway. A total of 20 patients were included for rehabilitation with 2 × 20 implant crowns in a crossover study design, each treated consecutively with customized titanium abutments plus CAD/CAM zirconia suprastructures (test: digital) and with standardized titanium abutments plus PFM crowns (control: conventional). Starting with prosthetic treatment, the analysis covered clinical and laboratory work steps, including measurement of costs in Swiss Francs (CHF), productivity rates, and cost minimization for first-line therapy. Statistical calculations were performed with the Wilcoxon signed-rank test. Both protocols worked successfully for all test and control reconstructions. Direct treatment costs were significantly lower for the digital workflow (1815.35 CHF) compared to the conventional pathway (2119.65 CHF) [P = 0.0004]. For subprocess evaluation, total laboratory costs were calculated as 941.95 CHF for the test group and 1245.65 CHF for the control group, respectively [P = 0.003]. The clinical dental productivity rate amounted to 29.64 CHF/min (digital) and 24.37 CHF/min (conventional) [P = 0.002]. Overall, cost-minimization analysis exhibited an 18% cost reduction within the digital process. The digital workflow was more efficient than the established conventional pathway for implant-supported crowns in this investigation. © 2014 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
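The paired comparison above rests on the Wilcoxon signed-rank test; a minimal sketch with hypothetical per-patient cost pairs (the numbers are invented, not the trial's data):

    import numpy as np
    from scipy import stats

    # Hypothetical per-patient treatment costs (CHF) under both workflows.
    digital = np.array([1790, 1820, 1805, 1840, 1815, 1800, 1825, 1810, 1830, 1795])
    conventional = np.array([2100, 2150, 2090, 2130, 2125, 2110, 2140, 2105, 2135, 2115])

    stat, p = stats.wilcoxon(digital, conventional)   # paired, non-parametric
    print(f"W = {stat}, p = {p:.4f}")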
Kim, Hyunji; Cha, Joo Hee; Oh, Ha-Yeun; Kim, Hak Hee; Shin, Hee Jung; Chae, Eun Young
2014-07-01
To compare the performance of radiologists in the use of conventional ultrasound (US) and automated breast volume ultrasound (ABVU) for the characterization of benign and malignant solid breast masses based on breast imaging and reporting data system (BI-RADS) criteria. Conventional US and ABVU images were obtained in 87 patients with 106 solid breast masses (52 cancers, 54 benign lesions). Three experienced radiologists who were blinded to all examination results independently characterized the lesions and reported a BI-RADS assessment category and a level of suspicion of malignancy. The results were analyzed by calculation of Cohen's κ coefficient and by receiver operating characteristic (ROC) analysis. Assessment of the agreement of conventional US and ABVU indicated that the posterior echo feature was the most discordant of the seven features (κ = 0.371 ± 0.225) and that orientation had the greatest agreement (κ = 0.608 ± 0.210). The final assessment showed substantial agreement (κ = 0.773 ± 0.104). The areas under the ROC curves (Az) for conventional US and ABVU were not statistically significantly different for each reader, but the mean Az values of conventional US and ABVU by multi-reader multi-case analysis were significantly different (conventional US 0.991, ABVU 0.963; 95% CI -0.0471 to -0.0097). The means for sensitivity, specificity, positive predictive value, and negative predictive value of conventional US and ABVU did not differ significantly. There was substantial inter-observer agreement in the final assessment of solid breast masses by conventional US and ABVU. ROC analysis comparing the performance of conventional US and ABVU indicated a marginally significant difference in mean Az, but not in mean sensitivity, specificity, positive predictive value, or negative predictive value.
NASA Technical Reports Server (NTRS)
Parrish, R. V.; Mckissick, B. T.; Steinmetz, G. G.
1979-01-01
A recent modification of the methodology of profile analysis, which allows testing for differences between two functions as a whole with a single test, rather than point by point with multiple tests, is discussed. The modification is applied to the issue of motion versus no-motion conditions, as reflected in the lateral deviation curve as a function of engine-cut speed in a piloted 737-100 simulator. The results of this application are presented along with those of more conventional statistical test procedures on the same simulator data.
NASA Technical Reports Server (NTRS)
Sprowls, D. O.; Bucci, R. J.; Ponchel, B. M.; Brazill, R. L.; Bretz, P. E.
1984-01-01
A technique is demonstrated for accelerated stress corrosion testing of high strength aluminum alloys. The method offers better precision and shorter exposure times than traditional pass/fail procedures. The approach uses data from tension tests performed on replicate groups of smooth specimens after various lengths of exposure to static stress. The breaking strength measures degradation in the test specimen's load-carrying ability due to the environmental attack. Analysis of breaking load data by extreme value statistics enables the calculation of survival probabilities and a statistically defined threshold stress applicable to the specific test conditions. A fracture mechanics model is given which quantifies depth of attack in the stress-corroded specimen by an effective flaw size calculated from the breaking stress and the material strength and fracture toughness properties. Comparisons are made with experimental results from three tempers of 7075 alloy plate tested by the breaking load method and by traditional tests of statically loaded smooth tension bars and conventional precracked specimens.
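As a sketch of the breaking-load analysis, assume for illustration a Weibull model for breaking strength (a common extreme-value choice; the values below are invented, not 7075 data): replicate breaking loads are fitted and a survival probability at a given applied stress is read off.

    import numpy as np
    from scipy import stats

    # Hypothetical breaking loads (MPa) of replicate specimens after exposure.
    loads = stats.weibull_min.rvs(c=8.0, scale=420.0, size=30, random_state=42)

    # Fit a two-parameter Weibull (location fixed at zero) to the breaking loads.
    c_hat, _, scale_hat = stats.weibull_min.fit(loads, floc=0)

    stress = 350.0   # applied stress of interest (MPa)
    survival = stats.weibull_min.sf(stress, c_hat, loc=0, scale=scale_hat)
    print(f"shape = {c_hat:.2f}, scale = {scale_hat:.1f}, "
          f"P(strength > {stress}) = {survival:.3f}")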
Santamaría-Arrieta, Gorka; Brizuela-Velasco, Aritza; Fernández-González, Felipe J.; Chávarri-Prado, David; Chento-Valiente, Yelko; Solaberrieta, Eneko; Diéguez-Pereira, Markel; Yurrebaso-Asúa, Jaime
2016-01-01
Background: This study evaluated the influence of implant site preparation depth on primary stability measured by insertion torque and resonance frequency analysis (RFA). Material and Methods: Thirty-two implant sites were prepared in eight veal rib blocks. Sixteen sites were prepared using the conventional drilling sequence recommended by the manufacturer to a working depth of 10 mm. The remaining 16 sites were prepared using an oversize drilling technique (overpreparation) to a working depth of 12 mm. Bone density was determined using cone beam computerized tomography (CBCT). The implants were placed and primary stability was measured by two methods: insertion torque (Ncm) and RFA (implant stability quotient [ISQ]). Results: The highest torque values were achieved by the conventional drilling technique (10 mm). The ANOVA test confirmed that there was a significant correlation between torque and drilling depth (p < 0.05). However, no statistically significant differences were obtained between ISQ values at 10 or 12 mm drilling depths (p > 0.05) at either measurement direction (cortical and medullar). No statistical relation between torque and ISQ values was identified, or between bone density and primary stability (p > 0.05). Conclusions: Vertical overpreparation of the implant bed will obtain lower insertion torque values, but does not produce statistically significant differences in ISQ values. Key words: Implant stability quotient, overdrilling, primary stability, resonance frequency analysis, torque. PMID:27398182
Lee, Ki Song; Choe, Young Chan; Park, Sung Hee
2015-10-01
This study examined the structural variables affecting the environmental effects of organic farming compared to those of conventional farming. A meta-analysis based on 107 studies and 360 observations published from 1977 to 2012 compared energy efficiency (EE) and greenhouse gas emissions (GHGE) for organic and conventional farming. The meta-analysis systematically analyzed the results of earlier comparative studies and used logistic regression to identify the structural variables that contributed to differences in the effects of organic and conventional farming on the environment. The statistical evidence identified characteristics that differentiated the environmental effects of organic and conventional farming, which is controversial. The results indicated that data sources, sample size and product type significantly affected EE, whereas product type, cropping pattern and measurement unit significantly affected the GHGE of organic farming compared to conventional farming. Superior effects of organic farming on the environment were more likely to appear for larger samples, primary data rather than secondary data, monocropping rather than multicropping, and crops other than fruits and vegetables. The environmental effects of organic farming were not affected by the study period, geographic location, farm size, cropping pattern, or measurement method. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.
Al-Moraissi, E A; Elmansi, Y A; Al-Sharaee, Y A; Alrmali, A E; Alkhutari, A S
2016-03-01
A systematic review and meta-analysis was conducted to answer the clinical question "Does the piezoelectric surgical technique produce fewer postoperative sequelae after lower third molar surgery than conventional rotary instruments?" A systematic and electronic search of several databases with specific key words, a reference search, and a manual search were performed from respective dates of inception through November 2014. The inclusion criteria were clinical human studies, including randomized controlled trials (RCTs), controlled clinical trials (CCTs), and retrospective studies, with the aim of comparing the piezoelectric surgical osteotomy technique to the standard rotary instrument technique in lower third molar surgery. Postoperative sequelae (oedema, trismus, and pain), the total number of analgesics taken, and the duration of surgery were analyzed. A total of nine articles were included, six RCTs, two CCTs, and one retrospective study. Six studies had a low risk of bias and three had a moderate risk of bias. A statistically significant difference was found between piezoelectric surgery and conventional rotary instrument surgery for lower third molar extraction with regard to postoperative sequelae (oedema, trismus, and pain) and the total number of analgesics taken (P=0.0001, P=0.0001, P<0.00001, and P<0.0001, respectively). However, a statistically significant increased surgery time was required in the piezoelectric osteotomy group (P<0.00001). The results of the meta-analysis showed that piezoelectric surgery significantly reduced the occurrence of postoperative sequelae (oedema, trismus, and pain) and the total number of analgesics taken compared to the conventional rotary instrument technique in lower third molar surgery, but required a longer surgery time. Copyright © 2015 International Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.
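For context, pooled estimates such as those reported in meta-analyses like this one are conventionally obtained by inverse-variance weighting of per-study effects; a minimal fixed-effect sketch with invented study estimates and standard errors:

    import numpy as np
    from scipy import stats

    # Hypothetical per-study effect estimates and their standard errors.
    effects = np.array([-0.45, -0.30, -0.60, -0.25, -0.50])
    se = np.array([0.20, 0.25, 0.30, 0.22, 0.28])

    w = 1.0 / se**2                                 # inverse-variance weights
    pooled = np.sum(w * effects) / np.sum(w)
    pooled_se = np.sqrt(1.0 / np.sum(w))
    z = pooled / pooled_se
    p = 2 * stats.norm.sf(abs(z))
    lo, hi = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
    print(f"pooled effect = {pooled:.3f} (95% CI {lo:.3f} to {hi:.3f}), p = {p:.4f}")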
Split-mouth comparison of the accuracy of computer-generated and conventional surgical guides.
Farley, Nathaniel E; Kennedy, Kelly; McGlumphy, Edwin A; Clelland, Nancy L
2013-01-01
Recent clinical studies have shown that implant placement is highly predictable with computer-generated surgical guides; however, the reliability of these guides has not been compared to that of conventional guides clinically. This study aimed to compare the accuracy of reproducing planned implant positions with computer-generated and conventional surgical guides using a split-mouth design. Ten patients received two implants each in symmetric locations. All implants were planned virtually using a software program and information from cone beam computed tomographic scans taken with scan appliances in place. Patients were randomly selected for computer-aided design/computer-assisted manufacture (CAD/CAM)-guided implant placement on their right or left side. Conventional guides were used on the contralateral side. Patients underwent cone beam computed tomography postoperatively. Planned and actual implant positions were compared using three-dimensional analyses capable of measuring volume overlap as well as differences in angles and coronal and apical positions. Results were compared using a mixed-model repeated-measures analysis of variance and were further analyzed using a Bartlett test for unequal variance (α = .05). Implants placed with CAD/CAM guides were closer to the planned positions in all eight categories examined. However, statistically significant differences were shown only for coronal horizontal distances. It was also shown that CAD/CAM guides had less variability than conventional guides, which was statistically significant for apical distance. Implants placed using CAD/CAM surgical guides provided greater accuracy in a lateral direction than conventional guides. In addition, CAD/CAM guides were more consistent in their deviation from the planned locations than conventional guides.
NIRS-SPM: statistical parametric mapping for near infrared spectroscopy
NASA Astrophysics Data System (ADS)
Tak, Sungho; Jang, Kwang Eun; Jung, Jinwook; Jang, Jaeduck; Jeong, Yong; Ye, Jong Chul
2008-02-01
Even though there exists a powerful statistical parametric mapping (SPM) tool for fMRI, similar public domain tools are not available for near infrared spectroscopy (NIRS). In this paper, we describe a new public domain statistical toolbox called NIRS-SPM for quantitative analysis of NIRS signals. Specifically, NIRS-SPM statistically analyzes the NIRS data using the GLM and makes inference based on the excursion probability of random fields interpolated from the sparse measurements. In order to obtain correct inference, NIRS-SPM offers the pre-coloring and pre-whitening methods for temporal correlation estimation. For simultaneous recording of NIRS signals with fMRI, the spatial mapping between the fMRI image and real coordinates from a 3-D digitizer is estimated using Horn's algorithm. These powerful tools allow super-resolution localization of brain activation, which is not possible using conventional NIRS analysis tools.
Geminiani, Alessandro; Weitz, Daniel S; Ercoli, Carlo; Feng, Changyong; Caton, Jack G; Papadimitriou, Dimitrios E V
2015-04-01
Sonic instruments may reduce perforation rates of the Schneiderian membrane during lateral window sinus augmentation procedures. This study compares the incidence of membrane perforations using a sonic handpiece with an oscillating diamond insert versus a turbine handpiece with a conventional rotary diamond stone during lateral window sinus augmentation procedures. A retrospective chart analysis identified all lateral window sinus augmentation procedures done during a defined period. Among these procedures, those performed with a sonic handpiece and an oscillating diamond insert (experimental) and those performed with a conventional turbine and rotary diamond stone (conventional) were selected for this study. Reported occurrences of sinus membrane perforations during preparation of the osteotomy and elevation of the sinus membrane, as well as postoperative complications, were recorded and compared between treatment groups. Ninety-three consecutive patients were identified for a total of 130 sinus augmentation procedures (51 conventional, 79 experimental). Schneiderian membrane perforations were noted during preparation of the lateral window osteotomy in 27.5% of the sinuses in the conventional group and 12.7% of sinuses in the experimental group. During membrane elevation, perforations were noted in 43.1% of the sinuses in the conventional group and 25.3% of sinuses in the experimental group. Both differences in perforation rates were statistically significant (p < .05). There was no statistically significant difference in postoperative complications. In this study, the use of a sonic instrument to prepare the lateral window osteotomy during sinus elevation procedures resulted in a reduced perforation rate of the Schneiderian membrane compared with the conventional turbine instrument. © 2013 Wiley Periodicals, Inc.
Ghoveizi, Rahab; Alikhasi, Marzieh; Siadat, Mohammad-Reza; Siadat, Hakimeh; Sorouri, Majid
2013-01-01
Objective: Crestal bone loss is a biological complication in implant dentistry. The aim of this study was to compare the effect of progressive and conventional loading on crestal bone height and bone density around single osseointegrated implants in the posterior maxilla by a longitudinal radiographic assessment technique. Materials and Methods: Twenty microthread implants were placed in 10 patients (two implants per patient). One of the two implants in each patient was assigned to the progressive and the other to the conventional loading group. Eight weeks after surgery, conventional implants were restored with a metal ceramic crown and the progressive group underwent a progressive loading protocol. The progressive loading group received different temporary acrylic crowns at 2, 4 and 6 months. After eight months, the acrylic crowns were replaced with a metal ceramic crown. Computed radiography of both progressive and conventional implants was taken at 2, 4, 6, and 12 months. Image analysis was performed to measure the height of crestal bone loss and bone density. Results: The mean values of crestal bone loss at month 12 were 0.11 (0.19) mm for progressively and 0.36 (0.36) mm for conventionally loaded implants, a statistically significant difference (P < 0.05) by the Wilcoxon signed rank test. The progressively loaded group showed a trend for higher bone density gain compared to the conventionally loaded group, but when tested with repeated-measures ANOVA, the differences were not statistically significant (P > 0.05). Conclusion: The progressive group showed less crestal bone loss around single osseointegrated implants than the conventional group. Bone density around progressively loaded implants increased in crestal, middle and apical areas. PMID:23724215
Griffiths, Andrea M; Cook, David M; Eggett, Dennis L; Christensen, Merrill J
2012-06-01
Whether or not all foods marketed to consumers as organic meet specified standards for use of that descriptor, or are nutritionally different from conventional foods, is uncertain. In a retail market study in a Western US metropolitan area, differences in mineral composition between conventional potatoes and those marketed as organic were analysed. Potatoes marketed as organic had more copper and magnesium (p < 0.0001), less iron (p < 0.0001) and sodium (p < 0.02), and the same concentration of calcium, potassium and zinc as conventional potatoes. Comparison of individual mineral concentrations between foodstuffs sold as organic or conventional is unlikely to establish a chemical fingerprint to objectively distinguish between organic and conventional produce, but more sophisticated chemometric analysis of multi-element fingerprints holds promise of doing so. Although statistically significant, these differences would only minimally affect total dietary intake of these minerals and be unlikely to result in measurable health benefits.
Nedelcu, R; Olsson, P; Nyström, I; Rydén, J; Thor, A
2018-02-01
To evaluate a novel methodology using industrial scanners as a reference, and assess in vivo accuracy of 3 intraoral scanners (IOS) and conventional impressions. Further, to evaluate IOS precision in vivo. Four reference-bodies were bonded to the buccal surfaces of upper premolars and incisors in five subjects. After three reference-scans, ATOS Core 80 (ATOS), subjects were scanned three times with three IOS systems: 3M True Definition (3M), CEREC Omnicam (OMNI) and Trios 3 (TRIOS). One conventional impression (IMPR) was taken, 3M Impregum Penta Soft, and poured models were digitized with laboratory scanner 3shape D1000 (D1000). Best-fit alignment of reference-bodies and 3D Compare Analysis was performed. Precision of ATOS and D1000 was assessed for quantitative evaluation and comparison. Accuracy of IOS and IMPR were analyzed using ATOS as reference. Precision of IOS was evaluated through intra-system comparison. Precision of the ATOS reference scanner (mean 0.6 μm) and D1000 (mean 0.5 μm) was high. Pairwise multiple comparisons of reference-bodies located in different tooth positions displayed a statistically significant difference of accuracy between two scanner-groups: 3M and TRIOS, over OMNI (p value range 0.0001 to 0.0006). IMPR did not show any statistically significant difference to IOS. However, deviations of IOS and IMPR were within a similar magnitude. No statistical difference was found for IOS precision. The methodology can be used for assessing accuracy of IOS and IMPR in vivo in up to five units bilaterally from midline. 3M and TRIOS had a higher accuracy than OMNI. IMPR overlapped both groups. Intraoral scanners can be used as a replacement for conventional impressions when restoring up to ten units without extended edentulous spans. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
Fetit, Ahmed E; Novak, Jan; Peet, Andrew C; Arvanitits, Theodoros N
2015-09-01
The aim of this study was to assess the efficacy of three-dimensional texture analysis (3D TA) of conventional MR images for the classification of childhood brain tumours in a quantitative manner. The dataset comprised pre-contrast T1- and T2-weighted MRI series obtained from 48 children diagnosed with brain tumours (medulloblastoma, pilocytic astrocytoma and ependymoma). 3D and 2D TA were carried out on the images using first-, second- and higher-order statistical methods. Six supervised classification algorithms were trained with the most influential 3D and 2D textural features, and their performances in the classification of tumour types, using the two feature sets, were compared. Model validation was carried out using the leave-one-out cross-validation (LOOCV) approach, as well as stratified 10-fold cross-validation, in order to provide additional reassurance. McNemar's test was used to test the statistical significance of any improvements demonstrated by 3D-trained classifiers. Supervised learning models trained with 3D textural features showed improved classification performances over those trained with conventional 2D features. For instance, a neural network classifier showed a 12% improvement in the area under the receiver operator characteristic curve (AUC) and 19% in overall classification accuracy. These improvements were statistically significant for four of the tested classifiers, as per McNemar's tests. This study shows that 3D textural features extracted from conventional T1- and T2-weighted images can improve the diagnostic classification of childhood brain tumours. Long-term benefits of accurate, yet non-invasive, diagnostic aids include a reduction in surgical procedures, improvement in surgical and therapy planning, and support of discussions with patients' families. It remains necessary, however, to extend the analysis to a multicentre cohort in order to assess the scalability of the techniques used. Copyright © 2015 John Wiley & Sons, Ltd.
Visual Data Analysis for Satellites
NASA Technical Reports Server (NTRS)
Lau, Yee; Bhate, Sachin; Fitzpatrick, Patrick
2008-01-01
The Visual Data Analysis Package is a collection of programs and scripts that facilitate visual analysis of data available from NASA and NOAA satellites, as well as dropsonde, buoy, and conventional in-situ observations. The package features utilities for data extraction, data quality control, statistical analysis, and data visualization. The Hierarchical Data Format (HDF) satellite data extraction routines from NASA's Jet Propulsion Laboratory were customized for specific spatial coverage and file input/output. Statistical analysis includes the calculation of the relative error, the absolute error, and the root mean square error. Other capabilities include curve fitting through the data points to fill in missing data points between satellite passes or where clouds obscure satellite data. For data visualization, the software provides customizable Generic Mapping Tool (GMT) scripts to generate difference maps, scatter plots, line plots, vector plots, histograms, timeseries, and color fill images.
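A minimal sketch of the validation statistics such a package computes when matching satellite retrievals against in-situ observations (the function name and data are invented for illustration):

    import numpy as np

    def error_stats(satellite, in_situ):
        """Mean relative error, mean absolute error, and RMSE of paired observations."""
        satellite, in_situ = np.asarray(satellite), np.asarray(in_situ)
        diff = satellite - in_situ
        relative = np.mean(diff / in_situ)
        mae = np.mean(np.abs(diff))
        rmse = np.sqrt(np.mean(diff**2))
        return relative, mae, rmse

    sst_satellite = [28.1, 27.6, 29.0, 28.4]   # e.g., satellite SST (deg C)
    sst_buoy = [28.0, 27.9, 28.7, 28.5]        # matching buoy observations
    print(error_stats(sst_satellite, sst_buoy))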
Cınar, Yasin; Cingü, Abdullah Kürşat; Türkcü, Fatih Mehmet; Çınar, Tuba; Yüksel, Harun; Özkurt, Zeynep Gürsel; Çaça, Ihsan
2014-09-01
To compare outcomes of accelerated and conventional corneal cross-linking (CXL) for progressive keratoconus (KC), patients were divided into two groups: the accelerated CXL group and the conventional CXL group. The uncorrected distance visual acuity (UDVA), corrected distance visual acuity (CDVA), refraction and keratometric values were measured preoperatively and postoperatively. The data of the two groups were compared statistically. The mean UDVA and CDVA at six months postoperatively were better than the preoperative values in both groups. While the change in UDVA and CDVA was statistically significant in the accelerated CXL group (p = 0.035 and p = 0.047, respectively), it did not reach statistical significance in the conventional CXL group (p = 0.184 and p = 0.113, respectively). The decreases in the mean corneal power (Km) and maximum keratometric value (Kmax) were statistically significant in both groups (p = 0.012 and 0.046, respectively, in the accelerated CXL group; p = 0.012 and 0.041, respectively, in the conventional CXL group). There was no statistically significant difference in visual and refractive results between the two groups (p > 0.05). Refractive and visual results of the accelerated and conventional CXL methods for the treatment of KC were similar over this short follow-up period; the accelerated CXL method is faster and provides higher patient throughput.
PROBABILITY SAMPLING AND POPULATION INFERENCE IN MONITORING PROGRAMS
A fundamental difference between probability sampling and conventional statistics is that "sampling" deals with real, tangible populations, whereas "conventional statistics" usually deals with hypothetical populations that have no real-world realization. The focus here is on real ...
Statistical science: a grammar for research.
Cox, David R
2017-06-01
I greatly appreciate the invitation to give this lecture with its century-long history. The title is a warning that the lecture is rather discursive and not highly focused and technical. The theme is simple: that statistical thinking provides a unifying set of general ideas and specific methods relevant whenever appreciable natural variation is present. To be most fruitful these ideas should merge seamlessly with subject-matter considerations. By contrast, there is sometimes a temptation to regard formal statistical analysis as a ritual to be added after the serious work has been done, a ritual to satisfy convention, referees, and regulatory agencies. I want implicitly to refute that idea.
Conventional drilling versus piezosurgery for implant site preparation: a meta-analysis.
Sendyk, Daniel Isaac; Oliveira, Natacha Kalline; Pannuti, Claudio Mendes; Naclério-Homem, Maria da Graça; Wennerberg, Ann; Zindel Deboni, Maria Cristina
2018-03-27
The aim of this study was to evaluate the evidence of a correlation between the stability of dental implants placed by piezosurgery and that of implants placed by conventional drilling. An electronic search in MEDLINE, SCOPUS and the Cochrane Library was undertaken up to August 2016 and was supplemented by manual searches and by unpublished studies at OpenGray. Only randomized controlled clinical trials that reported implant site preparation with piezosurgery and with conventional drilling were considered eligible for inclusion in this review. Meta-analyses were performed to evaluate the impact of piezosurgery on implant stability. Of 456 references electronically retrieved, 3 were included in the qualitative analysis and quantitative synthesis. The pooled estimates suggest that there is no significant difference between piezosurgery and conventional drilling at baseline (WMD: 2.20; 95% CI: -5.09, 9.49; p = 0.55). At 90 days, the pooled estimates revealed a statistically significant difference (WMD: 3.63; 95% CI: 0.58, 6.67; p = 0.02) favouring piezosurgery. Implant stability is slightly improved when the osteotomy is performed by a piezoelectric device. More randomized controlled clinical trials are needed to verify these findings.
Song, Y; Yoon, Y C; Chong, Y; Seo, S W; Choi, Y-L; Sohn, I; Kim, M-J
2017-08-01
To compare the abilities of conventional magnetic resonance imaging (MRI) and the apparent diffusion coefficient (ADC) in differentiating between benign and malignant soft-tissue tumours (STT), a total of 123 patients with STT who underwent 3 T MRI, including diffusion-weighted imaging (DWI), were retrospectively analysed using various conventional MRI parameters, ADCmean and ADCmin. For the all-STT group, the differences in conventional MRI parameters, except deep compartment involvement, between malignant and benign STT were statistically significant on univariate analysis. Maximum diameter of the tumour (p=0.001; odds ratio [OR], 8.97) and ADCmean (p=0.020; OR, 4.30) were independent factors on multivariate analysis. For the non-myxoid, non-haemosiderin STT group, signal heterogeneity on axial T1-weighted imaging (T1WI; p=0.017), ADCmean, and ADCmin (p=0.001, p=0.001) showed significant differences between malignancy and benignity on univariate analysis. Signal heterogeneity on axial T1WI (p=0.025; OR, 12.64) and ADCmean (p=0.004; OR, 33.15) were independent factors on multivariate analysis. ADC values as well as conventional MRI parameters were useful in differentiating between benign and malignant STT. The ADCmean was the most powerful diagnostic parameter in non-myxoid, non-haemosiderin STT. Copyright © 2017 The Royal College of Radiologists. Published by Elsevier Ltd. All rights reserved.
Computer-Based Instruction and Health Professions Education: A Meta-Analysis of Outcomes.
ERIC Educational Resources Information Center
Cohen, Peter A.; Dacanay, Lakshmi S.
1992-01-01
The meta-analytic techniques of G. V. Glass were used to statistically integrate findings from 47 comparative studies on computer-based instruction (CBI) in health professions education. A clear majority of the studies favored CBI over conventional methods of instruction. Results show higher-order applications of computers to be especially…
Harrison, Jay M; Breeze, Matthew L; Harrigan, George G
2011-08-01
Statistical comparisons of compositional data generated on genetically modified (GM) crops and their near-isogenic conventional (non-GM) counterparts typically rely on classical significance testing. This manuscript presents an introduction to Bayesian methods for compositional analysis along with recommendations for model validation. The approach is illustrated using protein and fat data from two herbicide tolerant GM soybeans (MON87708 and MON87708×MON89788) and a conventional comparator grown in the US in 2008 and 2009. Guidelines recommended by the US Food and Drug Administration (FDA) in conducting Bayesian analyses of clinical studies on medical devices were followed. This study is the first Bayesian approach to GM and non-GM compositional comparisons. The evaluation presented here supports a conclusion that a Bayesian approach to analyzing compositional data can provide meaningful and interpretable results. We further describe the importance of method validation and approaches to model checking if Bayesian approaches to compositional data analysis are to be considered viable by scientists involved in GM research and regulation. Copyright © 2011 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Simatos, N.; Perivolaropoulos, L.
2001-01-01
We use the publicly available code CMBFAST, as modified by Pogosian and Vachaspati, to simulate the effects of wiggly cosmic strings on the cosmic microwave background (CMB). Using the modified CMBFAST code, which takes into account vector modes and models wiggly cosmic strings by the one-scale model, we go beyond the angular power spectrum to construct CMB temperature maps with a resolution of a few degrees. The statistics of these maps are then studied using conventional and recently proposed statistical tests optimized for the detection of hidden temperature discontinuities induced by the Gott-Kaiser-Stebbins effect. We show, however, that these realistic maps cannot be distinguished in a statistically significant way from purely Gaussian maps with an identical power spectrum.
Analyzing the Influence of a New Dental Implant Design on Primary Stability.
Valente, Mariana Lima da Costa; de Castro, Denise Tornavoi; Shimano, Antonio Carlos; Lepri, César Penazzo; dos Reis, Andréa Cândido
2016-02-01
The macrogeometry of dental implants strongly influences primary stability and hence the osseointegration process. The aim was to compare the performance of conventional and modified implant models in terms of primary stability. A total of 36 implants (Neodent®) in two different formats (n = 18 each), Alvim CM (Conical CM, Ø 4.3 mm × 10 mm in length) and Titamax Ti (Cylindrical HE, Ø 4.0 mm × 11 mm in length), were inserted into artificial bone blocks. Nine implants from each set were selected to undergo external geometry changes. Primary stability was quantified by insertion torque, resonance frequency using an Osstell device, and the pullout test. One-way analysis of variance and Tukey's test were used for statistical evaluation. The comparative analysis showed a significant increase in insertion torque for the modified Conical CM (p = 0.000) and Cylindrical HE (p = 0.043) implants; for resonance frequency, the modified Cylindrical HE showed a lower mean (p = 0.002) than the conventional model, and in the pullout test both modified implants showed a significant reduction (p = 0.000). Within the limitations of this study, the proposed modification showed good stability levels and advantages when compared to the conventional implants. © 2015 Wiley Periodicals, Inc.
A dose-response model for the conventional phototherapy of the newborn.
Osaku, Nelson Ossamu; Lopes, Heitor Silvério
2006-06-01
Jaundice of the newborn is a common problem, a consequence of the rapid increment of blood bilirubin in the first days of life. In most cases it is considered a transient physiological situation, but unmanaged hyperbilirubinemia can lead to death or serious injuries for the survivors. For decades, phototherapy has been used as the main method for prevention and treatment of hyperbilirubinemia of the newborn. This work aims at finding a predictive model for the decrement of blood bilirubin for patients submitted to conventional phototherapy. Data from the phototherapy of 90 term newborns were collected and used in a multiple regression method. A rigorous statistical analysis was done in order to guarantee a correct and valid model. The obtained model was able to explain 78% of the variation of the dependent variable. We show that it is possible to predict the total serum bilirubin of a patient under conventional phototherapy by knowing the birth weight, the bilirubin level at the beginning of treatment, and the radiant energy density (dose). Moreover, it is possible to infer the time necessary for a given decrement of bilirubin, under approximately constant irradiance. Statistical analysis of the obtained model shows that it is valid for several ranges of birth weight, initial bilirubin level, and radiant energy density. It is expected that the proposed model can be useful in the clinical management of hyperbilirubinemia of the newborn.
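A sketch of the general form of such a dose-response regression; the coefficients and data below are simulated for illustration and are not the paper's fitted model (which explained 78% of the variance).

    import numpy as np

    rng = np.random.default_rng(6)
    n = 90
    weight = rng.normal(3200, 400, n)     # birth weight (g)
    bili0 = rng.normal(15, 3, n)          # bilirubin at start of treatment (mg/dL)
    dose = rng.normal(30, 8, n)           # radiant energy density (dose)
    decrement = 0.001 * weight + 0.25 * bili0 + 0.08 * dose + rng.normal(0, 1, n)

    X = np.column_stack([np.ones(n), weight, bili0, dose])
    beta, *_ = np.linalg.lstsq(X, decrement, rcond=None)   # multiple regression fit
    pred = X @ beta
    r2 = 1 - np.sum((decrement - pred) ** 2) / np.sum((decrement - decrement.mean()) ** 2)
    print(f"coefficients = {np.round(beta, 4)}, R^2 = {r2:.2f}")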
The multiple decrement life table: a unifying framework for cause-of-death analysis in ecology.
Carey, James R
1989-01-01
The multiple decrement life table is used widely in the human actuarial literature and provides statistical expressions for mortality in three different forms: i) the life table from all causes-of-death combined; ii) the life table disaggregated into selected cause-of-death categories; and iii) the life table with particular causes and combinations of causes eliminated. The purpose of this paper is to introduce the multiple decrement life table to the ecological literature by applying the methods to published death-by-cause information on Rhagoletis pomonella. Interrelations between the current approach and conventional tools used in basic and applied ecology are discussed including the conventional life table, Key Factor Analysis and Abbott's Correction used in toxicological bioassay.
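A minimal numeric sketch of the multiple decrement bookkeeping may help: given interval deaths split by cause for a hypothetical cohort, the code below computes all-cause and cause-specific probabilities of death and an associated single-decrement (cause-eliminated) survivorship under the standard constant-forces approximation. It is an illustration only, not the paper's Rhagoletis analysis.

```python
import numpy as np

# Hypothetical cohort of 1000, deaths per age interval split by two causes.
l0 = 1000
deaths_cause1 = np.array([50, 80, 120, 90])    # e.g., predation
deaths_cause2 = np.array([30, 60, 100, 70])    # e.g., disease

deaths_all = deaths_cause1 + deaths_cause2
# Survivors entering each interval.
lx = l0 - np.concatenate([[0], np.cumsum(deaths_all)])[:-1]
qx_all = deaths_all / lx      # probability of dying in interval, all causes
qx_c1 = deaths_cause1 / lx    # cause-specific probabilities of death
qx_c2 = deaths_cause2 / lx

# Associated single-decrement table with cause 1 "eliminated",
# using the standard constant-forces (proportional) approximation.
px_all = 1 - qx_all
px_without_c1 = px_all ** (deaths_cause2 / deaths_all)
print(np.cumprod(px_without_c1))   # survivorship with cause 1 removed
```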
Soto-Gordoa, Myriam; Arrospide, Arantzazu; Merino Hernández, Marisa; Mora Amengual, Joana; Fullaondo Zabala, Ane; Larrañaga, Igor; de Manuel, Esteban; Mar, Javier
2017-01-01
To develop a framework for the management of complex health care interventions within the Deming continuous improvement cycle and to test the framework in the case of an integrated intervention for multimorbid patients in the Basque Country within the CareWell project. Statistical analysis alone, although necessary, may not always represent the practical significance of the intervention. Thus, to ascertain the true economic impact of the intervention, the statistical results can be integrated into the budget impact analysis. The intervention of the case study consisted of a comprehensive approach that integrated new provider roles and new technological infrastructure for multimorbid patients, with the aim of reducing patient decompensations by 10% over 5 years. The study period was 2012 to 2020. Given the aging of the general population, the conventional scenario predicts an increase of 21% in the health care budget for care of multimorbid patients during the study period. With a successful intervention, this figure should drop to 18%. The statistical analysis, however, showed no significant differences in costs either in primary care or in hospital care between 2012 and 2014. The real costs in 2014 were by far closer to those in the conventional scenario than to the reductions expected in the objective scenario. The present implementation should be reappraised, because the present expenditure did not move closer to the objective budget. This work demonstrates the capacity of budget impact analysis to enhance the implementation of complex interventions. Its integration in the context of the continuous improvement cycle is transferable to other contexts in which implementation depth and time are important. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
Time Series Expression Analyses Using RNA-seq: A Statistical Approach
Oh, Sunghee; Song, Seongho; Grabowski, Gregory; Zhao, Hongyu; Noonan, James P.
2013-01-01
RNA-seq is becoming the de facto standard approach for transcriptome analysis with ever-reducing cost. It has considerable advantages over conventional technologies (microarrays) because it allows for direct identification and quantification of transcripts. Many time series RNA-seq datasets have been collected to study the dynamic regulations of transcripts. However, statistically rigorous and computationally efficient methods are needed to explore the time-dependent changes of gene expression in biological systems. These methods should explicitly account for the dependencies of expression patterns across time points. Here, we discuss several methods that can be applied to model timecourse RNA-seq data, including statistical evolutionary trajectory index (SETI), autoregressive time-lagged regression (AR(1)), and hidden Markov model (HMM) approaches. We use three real datasets and simulation studies to demonstrate the utility of these dynamic methods in temporal analysis. PMID:23586021
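Of the approaches listed, the AR(1) time-lagged regression is the simplest to sketch: regress a gene's expression at time t on its expression at t-1. The series below is simulated, and the code is only a schematic of the idea, not the authors' implementation.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
t = 8                                  # number of time points
expr = np.empty(t)
expr[0] = 5.0
for i in range(1, t):                  # simulate an AR(1) expression series
    expr[i] = 0.7 * expr[i - 1] + rng.normal(1.5, 0.3)

# Time-lagged regression: expression at t on expression at t-1.
y, x = expr[1:], sm.add_constant(expr[:-1])
ar1 = sm.OLS(y, x).fit()
print(ar1.params)                      # intercept and lag-1 coefficient
```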
Zhu, Zheng; Zhao, Xin-ming; Zhao, Yan-feng; Wang, Xiao-yi; Zhou, Chun-wu
2015-01-01
To prospectively investigate the use of Gemstone Spectral Imaging (GSI) and adaptive statistical iterative reconstruction (ASIR) to reduce radiation and iodine contrast doses in abdominal CT for patients with high BMI values. 26 patients (weight > 65 kg and BMI ≥ 22) underwent abdominal CT using GSI mode with 300 mgI/kg contrast material as the study group (group A). Another 21 patients (weight ≤ 65 kg and BMI ≥ 22) were scanned with a conventional 120 kVp tube voltage at a noise index (NI) of 11 with 450 mgI/kg contrast material as the control group (group B). GSI images were reconstructed at 60 keV with 50% ASIR, and the conventional 120 kVp images were reconstructed with FBP. The CT values, standard deviation (SD), signal-to-noise ratio (SNR), and contrast-to-noise ratio (CNR) of 26 landmarks were quantitatively measured, and image quality was qualitatively assessed using statistical analysis. In the quantitative analysis, the difference in CNR between groups A and B was significant for all landmarks except the mesenteric vein, and the SNR in group A was higher than in group B except for the mesenteric and splenic arteries. In the qualitative analysis, all images had diagnostic quality, and inter-reviewer agreement for image quality was substantial (kappa = 0.684). CT dose index (CTDI) values for the non-enhanced, arterial, and portal phases in group A were decreased by 49.04%, 40.51%, and 40.54%, respectively, compared with group B (P < 0.001). The total dose and injection rate of contrast material were reduced by 14.40% and 14.95% in group A compared with group B. The use of GSI and ASIR provides similar vessel enhancement and image quality with reduced radiation and contrast doses compared with the conventional scan protocol.
Allen, R W; Harnsberger, H R; Shelton, C; King, B; Bell, D A; Miller, R; Parkin, J L; Apfelbaum, R I; Parker, D
1996-08-01
To determine whether unenhanced high-resolution T2-weighted fast spin-echo MR imaging provides an acceptable and less expensive alternative to contrast-enhanced conventional T1-weighted spin-echo MR techniques in the diagnosis of acoustic schwannoma. We reviewed in a blinded fashion the records of 25 patients with pathologically documented acoustic schwannoma and of 25 control subjects, all of whom had undergone both enhanced conventional spin-echo MR imaging and unenhanced fast spin-echo MR imaging of the cerebellopontine angle/internal auditory canal region. The patients were imaged with a quadrature head receiver coil for the conventional spin-echo sequences and dual 3-inch phased-array receiver coils for the fast spin-echo sequences. The acoustic schwannomas ranged from 2 to 40 mm in maximum dimension; the mean maximum diameter was 12 mm, and 12 neoplasms were less than 10 mm in diameter. Acoustic schwannoma was correctly diagnosed on 98% of the fast spin-echo images and on 100% of the enhanced conventional spin-echo images. Statistical analysis of the data using the kappa coefficient demonstrated agreement beyond chance between these two imaging techniques for the diagnosis of acoustic schwannoma. There is no statistically significant difference in the sensitivity and specificity of unenhanced high-resolution fast spin-echo imaging and enhanced T1-weighted conventional spin-echo imaging in the detection of acoustic schwannoma. We believe that the unenhanced high-resolution fast spin-echo technique provides a cost-effective method for the diagnosis of acoustic schwannoma.
Automated lithology prediction from PGNAA and other geophysical logs.
Borsaru, M; Zhou, B; Aizawa, T; Karashima, H; Hashimoto, T
2006-02-01
Different methods of lithology prediction from geophysical data have been developed over the last 15 years. The geophysical logs used for predicting lithology are the conventional logs: sonic, neutron-neutron, gamma (total natural gamma), and density (backscattered gamma-gamma). Prompt gamma neutron activation analysis (PGNAA) is another established geophysical logging technique for in situ element analysis of rocks in boreholes. The work described in this paper investigates the application of PGNAA to lithology interpretation. Data interpretation was conducted using the automatic interpretation program LogTrans, based on statistical analysis. Limited testing suggests that PGNAA logging data can be used to predict lithology: a success rate of 73% was achieved from PGNAA logging data alone. PGNAA can also be used in conjunction with the conventional geophysical logs to enhance the lithology prediction.
Jin, Hong-Ying; Li, Da-Wei; Zhang, Na; Gu, Zhen; Long, Yi-Tao
2015-06-10
We demonstrated a practical method to analyze carbohydrate-protein interactions based on single plasmonic nanoparticles imaged by conventional dark field microscopy (DFM). The protein concanavalin A (ConA) was modified onto large gold nanoparticles (AuNPs), and dextran was conjugated onto small AuNPs. The interaction between ConA and dextran coupled the two kinds of gold nanoparticles and thereby their plasmonic oscillations, so apparent color changes (from green to yellow) of single AuNPs were observed through DFM. The color information was then instantly transformed into a statistical peak-wavelength distribution in less than 1 min by a self-developed statistical program (nanoparticleAnalysis). In addition, the interaction between ConA and dextran was shown to be biospecific. This approach is high-throughput and real-time, and provides a convenient method to analyze carbohydrate-protein interactions efficiently at the single-nanoparticle level.
NASA Technical Reports Server (NTRS)
Talpe, Matthieu J.; Nerem, R. Steven; Forootan, Ehsan; Schmidt, Michael; Lemoine, Frank G.; Enderlin, Ellyn M.; Landerer, Felix W.
2017-01-01
We construct long-term time series of Greenland and Antarctic ice sheet mass change from satellite gravity measurements. A statistical reconstruction approach is developed based on a principal component analysis (PCA) to combine high-resolution spatial modes from the Gravity Recovery and Climate Experiment (GRACE) mission with the gravity information from conventional satellite tracking data. Uncertainties of this reconstruction are rigorously assessed; they include temporal limitations for short GRACE measurements, spatial limitations for the low-resolution conventional tracking data measurements, and limitations of the estimated statistical relationships between low- and high-degree potential coefficients reflected in the PCA modes. Trends of mass variations in Greenland and Antarctica are assessed against a number of previous studies. The resulting time series for Greenland show a higher rate of mass loss than other methods before 2000, while the Antarctic ice sheet appears heavily influenced by interannual variations.
Bautista, Ami C; Zhou, Lei; Jawa, Vibha
2013-10-01
Immunogenicity support during nonclinical biotherapeutic development can be resource intensive if supported by conventional methodologies. A universal indirect species-specific immunoassay can eliminate the need for biotherapeutic-specific anti-drug antibody immunoassays without compromising quality. By implementing the R's of sustainability (reduce, reuse, rethink), conservation of resources and greener laboratory practices were achieved in this study. Statistical analysis across four biotherapeutics supported identification of consistent product performance standards (cut points, sensitivity and reference limits) and a streamlined universal anti-drug antibody immunoassay method implementation strategy. We propose an efficient, fit-for-purpose, scientifically and statistically supported nonclinical immunogenicity assessment strategy. Utilization of a universal method and streamlined validation, while retaining comparability to conventional immunoassays and meeting the industry recommended standards, provides environmental credits in the scientific laboratory. Collectively, individual reductions in critical material consumption, energy usage, waste and non-environment friendly consumables, such as plastic and paper, support a greener laboratory environment.
Correlative weighted stacking for seismic data in the wavelet domain
Zhang, S.; Xu, Y.; Xia, J.; ,
2004-01-01
Horizontal stacking plays a crucial role in modern seismic data processing: it not only suppresses random noise and multiple reflections, but also provides foundational data for subsequent migration and inversion. However, a number of examples have shown that random noise in adjacent traces exhibits correlation and coherence. Average stacking, and weighted stacking based on the conventional correlation function, both produce false events caused by such noise. Wavelet transforms and higher-order statistics are useful tools in modern signal processing: the multiresolution analysis of wavelet theory can decompose a signal over different scales, and higher-order correlation functions can suppress correlated noise against which the conventional correlation function is powerless. Based on wavelet transform theory and higher-order statistics, the high-order correlative weighted stacking (HOCWS) technique is presented in this paper. Its essence is to stack common-midpoint gathers after normal moveout correction using weights calculated from higher-order correlation statistics in the wavelet domain. Synthetic examples demonstrate its advantages in improving the signal-to-noise (S/N) ratio and suppressing correlated random noise.
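The weighting idea can be sketched in a few lines of Python. The fourth-order statistic below is a simple moment-based proxy and the wavelet-domain decomposition of the paper is omitted, so this is only a schematic of correlation-weighted stacking on synthetic traces, not the HOCWS algorithm itself.

```python
import numpy as np

rng = np.random.default_rng(2)
n_traces, n_samples = 12, 256
signal = np.sin(2 * np.pi * 8 * np.linspace(0, 1, n_samples))
gather = signal + 0.5 * rng.standard_normal((n_traces, n_samples))

pilot = gather.mean(axis=0)          # conventional average stack as pilot

def fourth_order_corr(a, b):
    """A simple zero-lag fourth-order moment between two traces."""
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    return np.mean((a ** 2) * (b ** 2))

# Weight each trace by its fourth-order agreement with the pilot stack.
weights = np.array([fourth_order_corr(tr, pilot) for tr in gather])
weights /= weights.sum()
hoc_stack = weights @ gather         # weighted stack

print(np.corrcoef(hoc_stack, signal)[0, 1])   # similarity to the clean signal
```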
Statistical models for the analysis and design of digital polymerase chain (dPCR) experiments
Dorazio, Robert; Hunter, Margaret
2015-01-01
Statistical methods for the analysis and design of experiments using digital PCR (dPCR) have received only limited attention and have been misused in many instances. To address this issue and to provide a more general approach to the analysis of dPCR data, we describe a class of statistical models for the analysis and design of experiments that require quantification of nucleic acids. These models are mathematically equivalent to generalized linear models of binomial responses that include a complementary log-log link function and an offset that is dependent on the dPCR partition volume. These models are both versatile and easy to fit using conventional statistical software. Covariates can be used to specify different sources of variation in nucleic acid concentration, and a model's parameters can be used to quantify the effects of these covariates. For purposes of illustration, we analyzed dPCR data from different types of experiments, including serial dilution, evaluation of copy number variation, and quantification of gene expression. We also showed how these models can be used to help design dPCR experiments, as in selection of sample sizes needed to achieve desired levels of precision in estimates of nucleic acid concentration or to detect differences in concentration among treatments with prescribed levels of statistical power.
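A minimal sketch of this model class, assuming a recent statsmodels release (where the complementary log-log link class is named CLogLog): positive-partition counts are binomial, the offset is the log partition volume, and the exponential of the linear predictor is the concentration. The partition counts, dilution covariate, and partition volume below are invented.

```python
import numpy as np
import statsmodels.api as sm

n_partitions = np.array([15000, 15000, 15000])   # partitions per reaction
positives    = np.array([9000, 3500, 600])       # positive partitions
dilution     = np.array([1.0, 0.25, 0.03125])    # covariate: dilution factor
volume_ul    = 0.00085                           # partition volume (µL)

# Binomial GLM: successes/failures, cloglog link, log-volume offset.
y = np.column_stack([positives, n_partitions - positives])
X = sm.add_constant(np.log(dilution))
offset = np.full(len(dilution), np.log(volume_ul))

fit = sm.GLM(y, X,
             family=sm.families.Binomial(link=sm.families.links.CLogLog()),
             offset=offset).fit()
# exp(intercept) estimates the concentration (copies/µL) at dilution 1.
print(np.exp(fit.params[0]))
```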
Statistical Analysis of Hit/Miss Data (Preprint)
2012-07-01
Fragmentary record: builds on the guidance of MIL-HDBK-1823A (2009), which other agencies and industries have also used (Gandossi et al., 2010; Drury et al., 2006). Addresses better accounting of false-call rates so that the POD curve does not converge to 0 for small flaw sizes, and the difficulty with conventional methods. Cites Drury, Ghylin, and Holness, Error Analysis and Threat Magnitude for Carry-on Bag Inspection, Proceedings of the Human Factors and Ergonomics Society.
Gil, Luiz Fernando; Sarendranath, Alvin; Neiva, Rodrigo; Marão, Heloisa F; Tovar, Nick; Bonfante, Estevam A; Janal, Malvin N; Castellano, Arthur; Coelho, Paulo G
This study evaluated whether simplified drilling protocols would provide histologic and histomorphometric results comparable to conventional drilling protocols at a low rotational speed. A total of 48 alumina-blasted and acid-etched Ti-6Al-4V implants with two diameters (3.75 and 4.2 mm, n = 24 per group) were bilaterally placed in the tibiae of 12 dogs under a low-speed protocol (400 rpm). Within each diameter group, half of the implants were inserted after a simplified drilling procedure (pilot drill + final diameter drill), and the other half were placed using the conventional drilling procedure. After 3 and 5 weeks, the animals were euthanized, and the retrieved bone-implant samples were subjected to nondecalcified histologic sectioning. Histomorphology, bone-to-implant contact (BIC), and bone area fraction occupancy (BAFO) analyses were performed. Histology showed that new bone formed around the implants, with no evident inflammation or bone resorption in either group. Histomorphometrically, when all independent variables were collapsed over drilling technique, no differences were detected for BIC and BAFO; when drilling technique was analyzed as a function of time, the conventional groups reached statistically higher BIC and BAFO at 3 weeks, but values were comparable between techniques at 5 weeks; 4.2-mm implants obtained statistically higher BAFO relative to 3.75-mm implants. Based on the present methodology, the conventional technique improved bone formation at 3 weeks, and narrower implants were associated with less bone formation.
Jadhav, Vivek Dattatray; Motwani, Bhagwan K.; Shinde, Jitendra; Adhapure, Prasad
2017-01-01
Aims: The aim of this study was to evaluate the marginal fit and surface roughness of complete cast crowns made by a conventional and an accelerated casting technique. Settings and Design: This study was divided into three parts. In Part I, the marginal fit of full metal crowns made by both casting techniques was checked in the vertical direction; in Part II, the fit of sectional metal crowns made by both casting techniques was checked in the horizontal direction; and in Part III, the surface roughness of disc-shaped metal plate specimens made by both casting techniques was checked. Materials and Methods: A conventional technique was compared with an accelerated technique. In Part I, the marginal fit of the full metal crowns was determined; in Part II, the horizontal fit of the sectional metal crowns; and in Part III, the surface roughness of castings made with the same techniques. Statistical Analysis Used: Student's t-test and independent-samples tests were used to compare the two casting techniques. Results: For both marginal discrepancy and surface roughness, crowns fabricated with the accelerated technique differed significantly from those fabricated with the conventional technique. Conclusions: The accelerated casting technique showed quite satisfactory results, but the conventional technique was superior in terms of marginal fit and surface roughness. PMID:29042726
Modeling and replicating statistical topology and evidence for CMB nonhomogeneity
Agami, Sarit
2017-01-01
Under the banner of “big data,” the detection and classification of structure in extremely large, high-dimensional, data sets are two of the central statistical challenges of our times. Among the most intriguing new approaches to this challenge is “TDA,” or “topological data analysis,” one of the primary aims of which is providing nonmetric, but topologically informative, preanalyses of data which make later, more quantitative, analyses feasible. While TDA rests on strong mathematical foundations from topology, in applications, it has faced challenges due to difficulties in handling issues of statistical reliability and robustness, often leading to an inability to make scientific claims with verifiable levels of statistical confidence. We propose a methodology for the parametric representation, estimation, and replication of persistence diagrams, the main diagnostic tool of TDA. The power of the methodology lies in the fact that even if only one persistence diagram is available for analysis—the typical case for big data applications—the replications permit conventional statistical hypothesis testing. The methodology is conceptually simple and computationally practical, and provides a broadly effective statistical framework for persistence diagram TDA analysis. We demonstrate the basic ideas on a toy example, and the power of the parametric approach to TDA modeling in an analysis of cosmic microwave background (CMB) nonhomogeneity. PMID:29078301
NASA Astrophysics Data System (ADS)
Decraene, Carolina; Dijckmans, Arne; Reynders, Edwin P. B.
2018-05-01
A method is developed for computing the mean and variance of the diffuse field sound transmission loss of finite-sized layered wall and floor systems that consist of solid, fluid and/or poroelastic layers. This is achieved by coupling a transfer matrix model of the wall or floor to statistical energy analysis subsystem models of the adjacent room volumes. The modal behavior of the wall is approximately accounted for by projecting the wall displacement onto a set of sinusoidal lateral basis functions. This hybrid modal transfer matrix-statistical energy analysis method is validated on multiple wall systems: a thin steel plate, a polymethyl methacrylate panel, a thick brick wall, a sandwich panel, a double-leaf wall with poro-elastic material in the cavity, and a double glazing. The predictions are compared with experimental data and with results obtained using alternative prediction methods such as the transfer matrix method with spatial windowing, the hybrid wave based-transfer matrix method, and the hybrid finite element-statistical energy analysis method. These comparisons confirm the prediction accuracy of the proposed method and the computational efficiency against the conventional hybrid finite element-statistical energy analysis method.
NASA Technical Reports Server (NTRS)
Lin, Shian-Jiann; DaSilva, Arlindo; Atlas, Robert (Technical Monitor)
2001-01-01
Toward the development of a finite-volume Data Assimilation System (fvDAS), a consistent finite-volume methodology is developed for interfacing the NASA/DAO's Physical Space Statistical Analysis System (PSAS) to the joint NASA/NCAR finite volume CCM3 (fvCCM3). To take advantage of the Lagrangian control-volume vertical coordinate of the fvCCM3, a novel "shaving" method is applied to the lowest few model layers to reflect the surface pressure changes as implied by the final analysis. Analysis increments (from PSAS) to the upper air variables are then consistently put onto the Lagrangian layers as adjustments to the volume-mean quantities during the analysis cycle. This approach is demonstrated to be superior to the conventional method of using independently computed "tendency terms" for surface pressure and upper air prognostic variables.
Miyazawa, Arata; Hong, Young-Joo; Makita, Shuichi; Kasaragod, Deepa; Yasuno, Yoshiaki
2017-01-01
Jones matrix-based polarization sensitive optical coherence tomography (JM-OCT) simultaneously measures optical intensity, birefringence, degree of polarization uniformity, and OCT angiography. The statistics of the optical features in a local region, such as the local mean of the OCT intensity, are frequently used for image processing and the quantitative analysis of JM-OCT. Conventionally, local statistics have been computed with fixed-size rectangular kernels, but this results in a trade-off between image sharpness and statistical accuracy. We introduce a superpixel method to JM-OCT for generating flexible kernels for local statistics. A superpixel is a cluster of image pixels formed by the pixels' spatial and signal-value proximities. An algorithm for superpixel generation specialized for JM-OCT, together with its optimization methods, is presented in this paper. The spatial proximity is in two-dimensional cross-sectional space and the signal values are the four optical features; hence, the superpixel method is a six-dimensional clustering technique for JM-OCT pixels. The performance of the JM-OCT superpixels and the optimization methods is evaluated in detail using JM-OCT datasets of posterior eyes. The superpixels were found to preserve tissue structures well, such as layer structures, sclera, vessels, and retinal pigment epithelium, and hence are more suitable as local-statistics kernels than conventional uniform rectangular kernels. PMID:29082073
Effect of different mixing methods on the bacterial microleakage of calcium-enriched mixture cement.
Shahi, Shahriar; Jeddi Khajeh, Soniya; Rahimi, Saeed; Yavari, Hamid R; Jafari, Farnaz; Samiei, Mohammad; Ghasemi, Negin; Milani, Amin S
2016-10-01
Calcium-enriched mixture (CEM) cement is used in the field of endodontics and is similar to mineral trioxide aggregate in its main ingredients. The present study investigated the effect of different mixing methods on the bacterial microleakage of CEM cement. A total of 55 single-rooted human permanent teeth were decoronated to obtain 14-mm-long samples, which were obturated with AH26 sealer and gutta-percha using the lateral condensation technique. Three millimeters of the root end were cut off, and the samples were randomly divided into 3 groups of 15 each (mixing by amalgamator, ultrasonic, or conventional methods) and 2 negative and positive control groups (5 samples each). A BHI (brain-heart infusion agar) suspension containing Enterococcus faecalis was used for bacterial leakage assessment. Statistical analysis was carried out using descriptive statistics, Kaplan-Meier survival analysis with censored data, and the log rank test, with statistical significance set at P < 0.05. The survival means for the conventional, amalgamator, and ultrasonic methods were 62.13 ± 12.44, 68.87 ± 12.79, and 77.53 ± 12.52 days, respectively. The log rank test showed no significant differences between the groups. Based on these results, it can be concluded that the different mixing methods had no significant effect on the bacterial microleakage of CEM cement.
A refined method for multivariate meta-analysis and meta-regression
Jackson, Daniel; Riley, Richard D
2014-01-01
Making inferences about the average treatment effect using the random effects model for meta-analysis is problematic in the common situation where there is a small number of studies. This is because estimates of the between-study variance are not precise enough to accurately apply the conventional methods for testing and deriving a confidence interval for the average effect. We have found that a refined method for univariate meta-analysis, which applies a scaling factor to the estimated effects’ standard error, provides more accurate inference. We explain how to extend this method to the multivariate scenario and show that our proposal for refined multivariate meta-analysis and meta-regression can provide more accurate inferences than the more conventional approach. We explain how our proposed approach can be implemented using standard output from multivariate meta-analysis software packages and apply our methodology to two real examples. © 2013 The Authors. Statistics in Medicine published by John Wiley & Sons, Ltd. PMID:23996351
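The univariate refinement the authors build on can be sketched as a Hartung-Knapp-type adjustment: estimate the between-study variance, then scale the pooled estimate's standard error and use t-based inference. The effect sizes and variances below are made up, and this is a sketch of the univariate idea only, not the multivariate method itself.

```python
import numpy as np
from scipy import stats

y = np.array([0.30, 0.12, 0.45, 0.26, 0.05])   # study effect estimates
v = np.array([0.04, 0.02, 0.06, 0.03, 0.05])   # within-study variances
k = len(y)

# DerSimonian-Laird estimate of the between-study variance.
w = 1 / v
mu_fe = np.sum(w * y) / np.sum(w)
Q = np.sum(w * (y - mu_fe) ** 2)
tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))

# Random-effects pooled estimate.
w_re = 1 / (v + tau2)
mu = np.sum(w_re * y) / np.sum(w_re)

# Scaling factor applied to the conventional SE, with t-based inference.
q = np.sum(w_re * (y - mu) ** 2) / (k - 1)
se_refined = np.sqrt(q / np.sum(w_re))
ci = mu + np.array([-1, 1]) * stats.t.ppf(0.975, k - 1) * se_refined
print(mu, ci)
```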
High-order fuzzy time-series based on multi-period adaptation model for forecasting stock markets
NASA Astrophysics Data System (ADS)
Chen, Tai-Liang; Cheng, Ching-Hsue; Teoh, Hia-Jong
2008-02-01
Stock investors usually make their short-term investment decisions according to recent stock information such as late market news, technical analysis reports, and price fluctuations. To reflect these short-term factors that impact stock price, this paper proposes a comprehensive fuzzy time-series model, which factors into the forecasting process both linear relationships between recent periods of stock prices and fuzzy logical relationships (nonlinear relationships) mined from the time series. In the empirical analysis, the TAIEX (Taiwan Stock Exchange Capitalization Weighted Stock Index) and HSI (Hang Seng Index) are employed as experimental datasets, and four recent fuzzy time-series models, Chen's (1996), Yu's (2005), Cheng's (2006) and Chen's (2007), are used as comparison models. In addition, to compare against a conventional statistical method, least squares is used to estimate auto-regressive models for the testing periods within the datasets. The performance comparisons indicate that the multi-period adaptation model proposed in this paper can effectively improve the forecasting performance of conventional fuzzy time-series models, which factor only fuzzy logical relationships into the forecasting process. Both the traditional statistical method and the proposed model reveal that stock price patterns in the Taiwan and Hong Kong stock markets are short-term.
Histological analysis of effects of 24% EDTA gel for nonsurgical treatment of periodontal tissues.
de Vasconcellos, Luana Marotta Reis; Ricardo, Lucilene Hernandes; Balducci, Ivan; de Vasconcellos, Luis Gustavo Oliveira; Carvalho, Yasmin Rodarte
2006-12-01
The aim of this study was to investigate, by means of histological and histomorphometric analysis, the effects of 24% ethylenediaminetetraacetic acid (EDTA) gel on periodontal tissue when used in combination with conventional periodontal treatment. Periodontitis was induced in the 2nd upper left permanent molars of 45 male Wistar rats by means of ligature. After 5 weeks, the ligature was removed and debridement was performed. The animals were then randomly divided into 3 groups; group 1: mechanical treatment, group 2: mechanical treatment and EDTA gel application for 2 min, and group 3: mechanical treatment and placebo gel application for 2 min. After treatment, rinsing was done with 0.9% saline solution for 1 min in all cases, followed by root notching in the deepest part of the pocket. After 4, 10, and 28 days the animals were sacrificed. The averages obtained were evaluated by two-way analysis of variance (ANOVA) and Tukey tests (P < 0.05). The results showed that, with respect to the type of treatment employed, there were no statistically significant differences in the vitality of the periodontal tissue. It was concluded that 24% EDTA gel did not interfere with periodontal tissue repair when used in combination with conventional periodontal treatment.
ROC Analysis of Chest Radiographs Using Computed Radiography and Conventional Analog Films
NASA Astrophysics Data System (ADS)
Morioka, Craig A.; Brown, Kathy; Hayrapetian, Alek S.; Kangarloo, Hooshang; Balter, Stephen; Huang, H. K.
1989-05-01
Receiver operating characteristic (ROC) analysis was used to compare the image quality of films obtained digitally using computed radiography (CR) and conventionally using analog film following fluoroscopic examination. Similar radiological views were obtained with both modalities. Twenty-four cases, some with a solitary noncalcified nodule and/or pneumothorax, were collected. Ten radiologists were tested, viewing analog and CR digital films separately. The final results indicate that there is no statistically significant difference in the ability to detect either a pneumothorax or a solitary noncalcified nodule when comparing CR digital film with conventional analog film. However, there was a trend indicating that the areas under the ROC curves for detection of either a pneumothorax or a solitary noncalcified nodule were greater for the analog film than for the digital film.
Lee, Hyoung Shin; Lee, Dongwon; Koo, Yong Cheol; Shin, Hyang Ae; Koh, Yoon Woo; Choi, Eun Chang
2013-03-01
In this study, the authors introduce and evaluate the feasibility of endoscopic resection using the retroauricular approach for various benign lesions of the upper neck. A retrospective comparative analysis was performed on the clinical outcomes of patients who underwent surgery for upper neck masses as endoscopic resection using the retroauricular approach or conventional transcervical resection at the authors' center from January 2010 through August 2011. The primary outcome was the cosmetic satisfaction of the patients in each group. In addition, the feasibility of the procedure was evaluated by comparing the operation time; hospital stay; amount and duration of drainage; complications such as marginal mandibular nerve, lingual, or hypoglossal nerve palsy; paresthesia of the ear lobe; and wound problems such as hematoma and skin necrosis. Statistical analysis was performed by independent-samples t test and the Fisher exact test, and a P value less than .05 was considered statistically significant. Thirty-six patients underwent endoscopic resection (endo group; 15 men, 21 women; mean age, 38.8 ± 15.0 years) and 40 patients underwent conventional transcervical resection (conventional group; 18 men, 22 women; mean age, 45.1 ± 14.1 years). The operating time in the endo group was longer than in the conventional group (P = .003). No significant difference was observed in the overall perioperative complications between the 2 groups. Cosmetic satisfaction evaluated with a graded scale showed much better results in the endo group (P < .001). Endoscopic resection using the retroauricular approach is feasible for various benign upper neck masses when conducted by an experienced endoscopic surgeon, with excellent cosmetic results. Copyright © 2013 American Association of Oral and Maxillofacial Surgeons. Published by Elsevier Inc. All rights reserved.
Baker, Arthur W; Haridy, Salah; Salem, Joseph; Ilieş, Iulian; Ergai, Awatef O; Samareh, Aven; Andrianas, Nicholas; Benneyan, James C; Sexton, Daniel J; Anderson, Deverick J
2017-11-24
Traditional strategies for surveillance of surgical site infections (SSI) have multiple limitations, including delayed and incomplete outbreak detection. Statistical process control (SPC) methods address these deficiencies by combining longitudinal analysis with graphical presentation of data. We performed a pilot study within a large network of community hospitals to evaluate the performance of SPC methods for detecting SSI outbreaks. We applied conventional Shewhart and exponentially weighted moving average (EWMA) SPC charts to 10 previously investigated SSI outbreaks that occurred from 2003 to 2013, and compared the results of SPC surveillance to those of traditional SSI surveillance methods. We then analysed the performance of modified SPC charts constructed with different outbreak detection rules, EWMA smoothing factors, and baseline SSI rate calculations. Conventional Shewhart and EWMA SPC charts each detected 8 of the 10 SSI outbreaks analysed, in every case prior to the date of traditional detection. Among detected outbreaks, conventional Shewhart chart detection occurred a median of 12 months prior to outbreak onset and 22 months prior to traditional detection; conventional EWMA chart detection occurred a median of 7 months prior to outbreak onset and 14 months prior to traditional detection. Modified Shewhart and EWMA charts detected several outbreaks even earlier than the conventional SPC charts. Both chart types had low false-positive rates when used to analyse SSI data from separate control hospitals. Our findings illustrate the potential usefulness and feasibility of real-time SPC surveillance of SSI to rapidly identify outbreaks and improve patient safety. Further study is needed to optimise SPC chart selection and calculation, statistical outbreak detection rules, and the process for reacting to signals of potential outbreaks. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
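To make the EWMA chart concrete, here is a bare-bones sketch on hypothetical monthly SSI rates; the smoothing factor and 3-sigma limits are common textbook choices rather than the study's tuned values.

```python
import numpy as np

rates = np.array([1.1, 0.9, 1.3, 1.0, 1.2, 1.1, 1.8, 2.2, 2.6, 2.4])  # % SSI
lam = 0.2                                            # EWMA smoothing factor
mu0 = rates[:6].mean()                               # baseline-period mean
sigma = rates[:6].std(ddof=1)                        # baseline-period SD

# EWMA statistic, seeded at the baseline mean.
z = np.empty_like(rates)
prev = mu0
for i, x in enumerate(rates):
    prev = lam * x + (1 - lam) * prev
    z[i] = prev

# Time-varying 3-sigma EWMA upper control limit.
t = np.arange(1, len(rates) + 1)
ucl = mu0 + 3 * sigma * np.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * t)))
print(np.where(z > ucl)[0])    # months signalling a potential outbreak
```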
Shrout, Patrick E; Rodgers, Joseph L
2018-01-04
Psychology advances knowledge by testing statistical hypotheses using empirical observations and data. The expectation is that most statistically significant findings can be replicated in new data and in new laboratories, but in practice many findings have replicated less often than expected, leading to claims of a replication crisis. We review recent methodological literature on questionable research practices, meta-analysis, and power analysis to explain the apparently high rates of failure to replicate. Psychologists can improve research practices to advance knowledge in ways that improve replicability. We recommend that researchers adopt open science conventions of preregistration and full disclosure, and that replication efforts be based on multiple studies rather than on a single replication attempt. We call for more sophisticated power analyses, careful consideration of the various influences on effect sizes, and more complete disclosure of nonsignificant as well as statistically significant findings.
Accuracy of Digital vs. Conventional Implant Impressions
Lee, Sang J.; Betensky, Rebecca A.; Gianneschi, Grace E.; Gallucci, German O.
2015-01-01
The accuracy of digital impressions greatly influences the clinical viability of implant restorations. The aim of this study was to compare, by three-dimensional analysis, the accuracy of gypsum models acquired from conventional implant impressions to digitally milled models created by direct digitalization. Thirty gypsum and 30 digitally milled models impressed directly from a reference model were prepared. The models were scanned with a laboratory scanner, and 30 STL datasets from each group were imported into inspection software. The datasets were aligned to the reference dataset by a repeated best-fit algorithm, and 10 specified contact locations of interest were measured as mean volumetric deviations. The areas were pooled by cusps, fossae, interproximal contacts, and the horizontal and vertical axes of implant position and angulation. The pooled areas were statistically analysed against the reference model to investigate the mean volumetric deviations, accounting for accuracy, and the standard deviations, for precision. Milled models from digital impressions had accuracy comparable to gypsum models from conventional impressions. However, the differences in fossae and in vertical displacement of the implant position between the gypsum and digitally milled models, relative to the reference model, were statistically significant (p < 0.001 and p = 0.020, respectively). PMID:24720423
Li, Haoyan; Liang, Yongqiang; Zheng, Qiang
2015-01-01
To evaluate correlations between marginal bone resorption and high insertion torque values (> 50 Ncm) of dental implants, and to assess the significance of immediate versus early/conventional loading of implants within a certain range of torque values. Specific inclusion and exclusion criteria were used to retrieve eligible articles from Ovid, PubMed, and EBSCO up to December 2013. Screening of eligible studies, quality assessment, and data extraction were conducted in duplicate. The results were expressed as random/fixed-effects models using weighted mean differences for continuous outcomes with 95% confidence intervals. Initially, 154 articles were selected (11 from Ovid, 112 from PubMed, and 31 from EBSCO). After exclusion of duplicate articles and articles that did not meet the inclusion criteria, six clinical studies were selected. Assessment of P values revealed that correlations between marginal bone resorption and high insertion torque were not statistically significant, and that there was no difference between immediately versus early/conventionally loaded implants within a certain range of torque. None of the meta-analyses revealed any statistically significant differences between high insertion torque and conventional insertion torque in terms of effects on marginal bone resorption.
Engberg, Lovisa; Forsgren, Anders; Eriksson, Kjell; Hårdemark, Björn
2017-06-01
To formulate convex planning objectives for treatment plan multicriteria optimization with explicit relationships to the dose-volume histogram (DVH) statistics used in plan quality evaluation. Conventional planning objectives are designed to minimize the violation of DVH statistics thresholds using penalty functions. Although successful in guiding the DVH curve towards these thresholds, conventional planning objectives offer limited control of the individual points on the DVH curve (doses-at-volume) used to evaluate plan quality. In this study, we abandon the usual penalty-function framework and propose planning objectives that relate more closely to DVH statistics. The proposed planning objectives are based on mean-tail-dose, resulting in convex optimization. We also demonstrate how to adapt a standard optimization method to the proposed formulation in order to obtain a substantial reduction in computational cost. We investigated the potential of the proposed planning objectives as tools for optimizing DVH statistics through juxtaposition with the conventional planning objectives on two patient cases. Sets of treatment plans with differently balanced planning objectives were generated using either the proposed or the conventional approach. Dominance, in the sense of better distributed doses-at-volume, was observed in plans optimized within the proposed framework. This initial computational study indicates that DVH statistics are better optimized and more efficiently balanced using the proposed planning objectives than using the conventional approach. © 2017 American Association of Physicists in Medicine.
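The quantity underlying the proposed objectives can be sketched directly: mean-tail-dose is the average over the hottest (or coldest) fraction of a structure's voxel doses, a CVaR-like and convex counterpart to dose-at-volume. The dose array below is synthetic.

```python
import numpy as np

# Synthetic voxel doses (Gy) for a single structure.
dose = np.random.default_rng(3).normal(60.0, 2.0, 10000)

def upper_mean_tail_dose(d, volume_fraction):
    """Mean dose of the hottest `volume_fraction` of voxels."""
    n_tail = max(1, int(round(volume_fraction * d.size)))
    return np.sort(d)[-n_tail:].mean()

# Penalising the mean of the hottest 5% bounds D5% from above, since a
# dose-at-volume value never exceeds the corresponding mean-tail-dose.
print(upper_mean_tail_dose(dose, 0.05))
```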
Rodríguez-Arias, Miquel Angel; Rodó, Xavier
2004-03-01
Here we describe a practical, step-by-step primer to scale-dependent correlation (SDC) analysis. The analysis of transitory processes is an important but often neglected topic in ecological studies because only a few statistical techniques appear to detect temporary features accurately enough. We introduce here the SDC analysis, a statistical and graphical method to study transitory processes at any temporal or spatial scale. SDC analysis, thanks to the combination of conventional procedures and simple well-known statistical techniques, becomes an improved time-domain analogue of wavelet analysis. We use several simple synthetic series to describe the method, a more complex example, full of transitory features, to compare SDC and wavelet analysis, and finally we analyze some selected ecological series to illustrate the methodology. The SDC analysis of time series of copepod abundances in the North Sea indicates that ENSO primarily is the main climatic driver of short-term changes in population dynamics. SDC also uncovers some long-term, unexpected features in the population. Similarly, the SDC analysis of Nicholson's blowflies data locates where the proposed models fail and provides new insights about the mechanism that drives the apparent vanishing of the population cycle during the second half of the series.
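The core SDC computation can be sketched as correlations between paired windows of a chosen size (the scale); the fragment below evaluates lag-0 windows only, on synthetic series, and omits the significance screening of the full method.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
n, s = 120, 12                         # series length and window scale
x = np.sin(np.linspace(0, 12, n)) + 0.3 * rng.standard_normal(n)
y = np.roll(x, 3) + 0.3 * rng.standard_normal(n)

# Pearson correlation of each pair of aligned windows of size s.
sdc = [stats.pearsonr(x[i:i + s], y[i:i + s])[0] for i in range(n - s + 1)]
print(np.argmax(np.abs(sdc)))          # window where coupling is strongest
```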
Lancaster, Cady; Espinoza, Edgard
2012-05-15
International trade of several Dalbergia wood species is regulated by The Convention on International Trade in Endangered Species of Wild Fauna and Flora (CITES). In order to supplement morphological identification of these species, a rapid chemical method of analysis was developed. Using Direct Analysis in Real Time (DART) ionization coupled with Time-of-Flight (TOF) Mass Spectrometry (MS), selected Dalbergia and common trade species were analyzed. Each of the 13 wood species was classified using principal component analysis and linear discriminant analysis (LDA). These statistical data clusters served as reliable anchors for species identification of unknowns. Analysis of 20 or more samples from the 13 species studied in this research indicates that the DART-TOFMS results are reproducible. Statistical analysis of the most abundant ions gave good classifications that were useful for identifying unknown wood samples. DART-TOFMS and LDA analysis of 13 species of selected timber samples and the statistical classification allowed for the correct assignment of unknown wood samples. This method is rapid and can be useful when anatomical identification is difficult but needed in order to support CITES enforcement. Published 2012. This article is a US Government work and is in the public domain in the USA.
1994-02-01
[Fragmentary record from the Geophysics Lab Meteor Scatter Program: a note accompanying plots 1-30 on the default convention of deleting from analysis data acquired in a given bi-hourly acquisition window, and plot captions for observed noise measurements at a frequency of 35 MHz (vertical polarization); the remainder of the text is not recoverable.]
Statistical, Graphical, and Learning Methods for Sensing, Surveillance, and Navigation Systems
2016-06-28
harsh propagation environments. Conventional filtering techniques fail to provide satisfactory performance in many important nonlinear or non-Gaussian scenarios. In addition, there is a lack of a unified methodology for the design and analysis of different filtering techniques. To address these problems, we have proposed a new filtering methodology called belief condensation (BC).
Strength and life criteria for corrugated fiberboard by three methods
Thomas J. Urbanik
1997-01-01
The conventional test method for determining the stacking life of corrugated containers at a fixed load level does not adequately predict a safe load when storage time is fixed. This study introduced multiple load levels and related the probability of time at failure to load. A statistical analysis of logarithm-of-time failure data varying with load level predicts the...
FMRI group analysis combining effect estimates and their variances
Chen, Gang; Saad, Ziad S.; Nath, Audrey R.; Beauchamp, Michael S.; Cox, Robert W.
2012-01-01
Conventional functional magnetic resonance imaging (FMRI) group analysis makes two key assumptions that are not always justified. First, the data from each subject is condensed into a single number per voxel, under the assumption that within-subject variance for the effect of interest is the same across all subjects or is negligible relative to the cross-subject variance. Second, it is assumed that all data values are drawn from the same Gaussian distribution with no outliers. We propose an approach that does not make such strong assumptions, and present a computationally efficient frequentist approach to FMRI group analysis, which we term mixed-effects multilevel analysis (MEMA), that incorporates both the variability across subjects and the precision estimate of each effect of interest from individual subject analyses. On average, the more accurate tests result in higher statistical power, especially when conventional variance assumptions do not hold, or in the presence of outliers. In addition, various heterogeneity measures are available with MEMA that may assist the investigator in further improving the modeling. Our method allows group effect t-tests and comparisons among conditions and among groups. In addition, it has the capability to incorporate subject-specific covariates such as age, IQ, or behavioral data. Simulations were performed to illustrate power comparisons and the capability of controlling type I errors among various significance testing methods, and the results indicated that the testing statistic we adopted struck a good balance between power gain and type I error control. Our approach is instantiated in an open-source, freely distributed program that may be used on any dataset stored in the universal neuroimaging file transfer (NIfTI) format. To date, the main impediment for more accurate testing that incorporates both within- and cross-subject variability has been the high computational cost. Our efficient implementation makes this approach practical. We recommend its use in lieu of the less accurate approach in the conventional group analysis. PMID:22245637
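At a single voxel, the central MEMA idea can be sketched as precision weighting of subject-level estimates by their within-subject variances plus a cross-subject variance term. In the sketch below that term is held fixed rather than estimated, and all numbers are simulated; the released program handles whole images, covariates, and outlier modelling.

```python
import numpy as np
from scipy import stats

beta = np.array([0.8, 1.1, 0.5, 1.4, 0.9, 1.0])        # subject effect estimates
var  = np.array([0.10, 0.30, 0.05, 0.50, 0.08, 0.12])  # their variances

tau2 = 0.05   # cross-subject variance; in MEMA this is estimated, e.g. by REML
w = 1.0 / (var + tau2)
g = np.sum(w * beta) / np.sum(w)       # precision-weighted group effect
se = np.sqrt(1.0 / np.sum(w))
z = g / se
print(z, 2 * stats.norm.sf(abs(z)))    # test statistic and two-sided p-value
```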
Meta-analysis inside and outside particle physics: two traditions that should converge?
Baker, Rose D; Jackson, Dan
2013-06-01
The use of meta-analysis in medicine and epidemiology really took off in the 1970s. However, in high-energy physics, the Particle Data Group has been carrying out meta-analyses of measurements of particle masses and other properties since 1957. Curiously, there has been virtually no interaction between those working inside and outside particle physics. In this paper, we use statistical models to study two major differences in practice. The first is the usefulness of systematic errors, which physicists are now beginning to quote in addition to statistical errors. The second is whether it is better to treat heterogeneity by scaling up errors, as the Particle Data Group does, or by adding a random effect, as the rest of the community does. Besides fitting models, we derive and use an exact test of the error-scaling hypothesis. We also discuss other methodological differences between the two streams of meta-analysis. Our conclusion is that systematic errors are not currently very useful and that the conventional random effects model, as routinely used in meta-analysis, has a useful role to play in particle physics. The moral we draw for statisticians is that we should be more willing to explore 'grassroots' areas of statistical application, so that good statistical practice can flow both from and back to the statistical mainstream. Copyright © 2012 John Wiley & Sons, Ltd.
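The contrast between the two traditions can be sketched numerically: the PDG-style treatment inflates quoted errors by a scale factor sqrt(chi2/dof), whereas the random-effects treatment would instead add a between-measurement variance. The measurements below are invented.

```python
import numpy as np

y = np.array([1.115, 1.121, 1.109, 1.118])   # measurements (arbitrary units)
e = np.array([0.004, 0.003, 0.005, 0.002])   # quoted errors

# Inverse-variance weighted mean and its chi-square.
w = 1 / e**2
mean = np.sum(w * y) / np.sum(w)
chi2 = np.sum(w * (y - mean) ** 2)

# PDG-style scale factor: inflate the error when chi2/dof exceeds 1.
S = max(1.0, np.sqrt(chi2 / (len(y) - 1)))
print(mean, S / np.sqrt(np.sum(w)))          # mean and scaled-up error
```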
Precision of guided scanning procedures for full-arch digital impressions in vivo.
Zimmermann, Moritz; Koller, Christina; Rumetsch, Moritz; Ender, Andreas; Mehl, Albert
2017-11-01
System-specific scanning strategies have been shown to influence the accuracy of full-arch digital impressions. Special guided scanning procedures have been implemented for specific intraoral scanning systems with special regard to the digital orthodontic workflow. The aim of this study was to evaluate the precision of guided scanning procedures compared to conventional impression techniques in vivo. Two intraoral scanning systems with implemented full-arch guided scanning procedures (Cerec Omnicam Ortho; Ormco Lythos) were included along with one conventional impression technique with irreversible hydrocolloid material (alginate). Full-arch impressions were taken three times each from 5 participants (n = 15). Impressions were then compared within the test groups using a point-to-surface distance method after best-fit model matching (OraCheck). Precision was calculated using the (90-10%)/2 quantile and statistical analysis with one-way repeated measures ANOVA and post hoc Bonferroni test was performed. The conventional impression technique with alginate showed the lowest precision for full-arch impressions with 162.2 ± 71.3 µm. Both guided scanning procedures performed statistically significantly better than the conventional impression technique (p < 0.05). Mean values for group Cerec Omnicam Ortho were 74.5 ± 39.2 µm and for group Ormco Lythos 91.4 ± 48.8 µm. The in vivo precision of guided scanning procedures exceeds conventional impression techniques with the irreversible hydrocolloid material alginate. Guided scanning procedures may be highly promising for clinical applications, especially for digital orthodontic workflows.
Falgreen, Steffen; Laursen, Maria Bach; Bødker, Julie Støve; Kjeldsen, Malene Krag; Schmitz, Alexander; Nyegaard, Mette; Johnsen, Hans Erik; Dybkær, Karen; Bøgsted, Martin
2014-06-05
In vitro generated dose-response curves of human cancer cell lines are widely used to develop new therapeutics. The curves are summarised by simplified statistics that ignore the conventionally used dose-response curves' dependency on drug exposure time and growth kinetics. This may lead to suboptimal exploitation of data and biased conclusions on the potential of the drug in question. Therefore we set out to improve the dose-response assessments by eliminating the impact of time dependency. First, a mathematical model for drug induced cell growth inhibition was formulated and used to derive novel dose-response curves and improved summary statistics that are independent of time under the proposed model. Next, a statistical analysis workflow for estimating the improved statistics was suggested consisting of 1) nonlinear regression models for estimation of cell counts and doubling times, 2) isotonic regression for modelling the suggested dose-response curves, and 3) resampling based method for assessing variation of the novel summary statistics. We document that conventionally used summary statistics for dose-response experiments depend on time so that fast growing cell lines compared to slowly growing ones are considered overly sensitive. The adequacy of the mathematical model is tested for doxorubicin and found to fit real data to an acceptable degree. Dose-response data from the NCI60 drug screen were used to illustrate the time dependency and demonstrate an adjustment correcting for it. The applicability of the workflow was illustrated by simulation and application on a doxorubicin growth inhibition screen. The simulations show that under the proposed mathematical model the suggested statistical workflow results in unbiased estimates of the time independent summary statistics. Variance estimates of the novel summary statistics are used to conclude that the doxorubicin screen covers a significant diverse range of responses ensuring it is useful for biological interpretations. Time independent summary statistics may aid the understanding of drugs' action mechanism on tumour cells and potentially renew previous drug sensitivity evaluation studies.
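Step 2 of the workflow (monotone modelling of the dose-response curve) can be sketched with scikit-learn's isotonic regression; the dose grid and response values below are fabricated.

```python
import numpy as np
from sklearn.isotonic import IsotonicRegression

dose = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0])   # hypothetical µM
inhibition = np.array([0.05, 0.02, 0.18, 0.35, 0.50, 0.71, 0.68])

# Fit a nondecreasing curve of inhibition against log-dose.
iso = IsotonicRegression(increasing=True, out_of_bounds="clip")
fitted = iso.fit_transform(np.log(dose), inhibition)
print(fitted)   # monotone dose-response curve despite the noisy last point
```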
Gowda, Dhananjaya; Airaksinen, Manu; Alku, Paavo
2017-09-01
Recently, a quasi-closed phase (QCP) analysis of speech signals for accurate glottal inverse filtering was proposed. However, the QCP analysis which belongs to the family of temporally weighted linear prediction (WLP) methods uses the conventional forward type of sample prediction. This may not be the best choice especially in computing WLP models with a hard-limiting weighting function. A sample selective minimization of the prediction error in WLP reduces the effective number of samples available within a given window frame. To counter this problem, a modified quasi-closed phase forward-backward (QCP-FB) analysis is proposed, wherein each sample is predicted based on its past as well as future samples thereby utilizing the available number of samples more effectively. Formant detection and estimation experiments on synthetic vowels generated using a physical modeling approach as well as natural speech utterances show that the proposed QCP-FB method yields statistically significant improvements over the conventional linear prediction and QCP methods.
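As an illustration of the prediction scheme itself, the sketch below implements generic forward-backward linear prediction with a least-squares (covariance-method) fit. It deliberately omits the quasi-closed-phase temporal weighting that defines the authors' QCP-FB method, so it only demonstrates the shared forward-backward idea; the test signal and model order are arbitrary assumptions.

```python
# Sketch of generic forward-backward linear prediction: each sample is
# predicted from its p past samples (forward) and, with the same
# coefficients, from its p future samples (backward), roughly doubling
# the number of prediction equations per frame. (QCP weighting omitted.)
import numpy as np

def fb_lp(x, p):
    """Order-p LP coefficients minimizing forward + backward squared error."""
    n = len(x)
    rows, targets = [], []
    for i in range(p, n):              # forward equations
        rows.append(x[i - p:i][::-1])  # [x[i-1], ..., x[i-p]]
        targets.append(x[i])
    for i in range(n - p):             # backward equations
        rows.append(x[i + 1:i + p + 1])
        targets.append(x[i])
    a, *_ = np.linalg.lstsq(np.asarray(rows), np.asarray(targets), rcond=None)
    return a

# Toy usage: recover a 700 Hz damped resonance ("formant") from the roots
# of the prediction-error polynomial A(z) = 1 - sum_k a_k z^{-k}.
fs = 8000.0
t = np.arange(400) / fs
x = np.exp(-80 * t) * np.sin(2 * np.pi * 700 * t)
a = fb_lp(x, 4)
roots = np.roots(np.concatenate(([1.0], -a)))
cand = roots[np.imag(roots) > 1e-8]        # one root per conjugate pair
peak = cand[np.argmax(np.abs(cand))]       # strongest resonance
print("estimated formant (Hz):",
      round(float(np.angle(peak) * fs / (2 * np.pi)), 1))
```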
Saha, Sonali; Jaiswal, JN; Samadi, Firoza
2014-01-01
Aim: The present study was taken up to clinically evaluate and compare the effectiveness of the transcutaneous electrical nerve stimulator (TENS) and the comfort control syringe (CCS) in various pediatric dental procedures as an alternative to the conventional method of local anesthesia (LA) administration. Materials and methods: Ninety healthy children, aged 6 to 10 years, having at least one deciduous molar tooth indicated for extraction in either the maxillary right or left quadrant were randomly divided into three equal groups of 30 subjects each. Group I: LA administration using a conventional syringe; group II: LA administration using TENS along with the conventional syringe; group III: LA administration using the CCS. After LA by the three techniques, pain, anxiety and heart rate were measured. Statistical analysis: The observations thus obtained were subjected to statistical analysis using analysis of variance (ANOVA), Student's t-test and the paired t-test. Results: The mean pain score was maximum in group I followed by group II, while group III, where LA was administered using the CCS, revealed the minimum pain. The mean anxiety score was maximum in group I followed by group II, while group III revealed the minimum score. The mean heart rate was maximum in group I, followed in descending order by groups II and III. Conclusion: The study supports the belief that the CCS could be a viable alternative to the other two methods of LA delivery in children. How to cite this article: Bansal N, Saha S, Jaiswal JN, Samadi F. Pain Elimination during Injection with Newer Electronic Devices: A Comparative Evaluation in Children. Int J Clin Pediatr Dent 2014;7(2):71-76. PMID:25356003
Digital versus conventional techniques for pattern fabrication of implant-supported frameworks
Alikhasi, Marzieh; Rohanian, Ahmad; Ghodsi, Safoura; Kolde, Amin Mohammadpour
2018-01-01
Objective: The aim of this experimental study was to compare the retention of frameworks cast from wax patterns fabricated by three different methods. Materials and Methods: Thirty-six implant analogs connected to one-piece abutments were divided randomly into three groups according to the wax pattern fabrication method (n = 12). A computer-aided design/computer-aided manufacturing (CAD/CAM) milling machine, a three-dimensional printer, and the conventional technique were used for fabrication of the wax patterns. All laboratory procedures were performed by an expert technician to eliminate intra-operator bias. The wax patterns were cast, finished, and seated on the related abutment analogs. The number of adjustments was recorded and analyzed by the Kruskal–Wallis test. Frameworks were cemented on the corresponding analogs with zinc phosphate cement, and a tensile resistance test was used to measure retention. Statistical Analysis Used: One-way analysis of variance (ANOVA) and post hoc Tukey tests were used for statistical analysis. The level of significance was set at P < 0.05. Results: Mean retentive values of 680.36 ± 21.93 N, 440.48 ± 85.98 N, and 407.23 ± 67.48 N were recorded for the CAD/CAM, rapid prototyping, and conventional groups, respectively. The one-way ANOVA revealed significant differences among the three groups (P < 0.001). The post hoc Tukey test showed significantly higher retention for the CAD/CAM group (P < 0.001), while there was no significant difference between the two other groups (P = 0.54). The CAD/CAM group required significantly more adjustments (P < 0.001). Conclusions: CAD/CAM-fabricated wax patterns showed significantly higher retention for implant-supported cement-retained frameworks; this could be a valuable help when there are limitations in the retention of single-unit implant restorations. PMID:29657528
Indoor Location Sensing with Invariant Wi-Fi Received Signal Strength Fingerprinting
Husen, Mohd Nizam; Lee, Sukhan
2016-01-01
A method of location fingerprinting based on the Wi-Fi received signal strength (RSS) in an indoor environment is presented. The method aims to overcome the RSS instability due to varying channel disturbances in time by introducing the concept of invariant RSS statistics. The invariant RSS statistics represent here the RSS distributions collected at individual calibration locations under minimal random spatiotemporal disturbances in time. The invariant RSS statistics thus collected serve as the reference pattern classes for fingerprinting. Fingerprinting is carried out at an unknown location by identifying the reference pattern class that maximally supports the spontaneous RSS sensed from individual Wi-Fi sources. A design guideline is also presented as a rule of thumb for estimating the number of Wi-Fi signal sources required to be available for any given number of calibration locations under a certain level of random spatiotemporal disturbances. Experimental results show that the proposed method not only provides 17% higher success rate than conventional ones but also removes the need for recalibration. Furthermore, the resolution is shown finer by 40% with the execution time more than an order of magnitude faster than the conventional methods. These results are also backed up by theoretical analysis. PMID:27845711
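The classification step can be pictured with a minimal sketch. The code below is an assumption-laden toy, not the paper's method: it stores per-location Gaussian RSS statistics as reference pattern classes and assigns a new scan to the class with maximal log-likelihood; the paper's invariant statistics and design guideline go well beyond this.

```python
# Toy sketch of RSS fingerprinting with per-location reference statistics:
# each calibration location stores an RSS mean/SD per Wi-Fi source, and an
# unknown scan is assigned to the location with maximal Gaussian
# log-likelihood. (The paper's invariant statistics are richer than this.)
import numpy as np

rng = np.random.default_rng(1)
n_aps, n_locs, n_cal = 6, 4, 50

# Hypothetical calibration scans (dBm): n_cal per location.
true_means = rng.uniform(-90, -40, size=(n_locs, n_aps))
cal = true_means[:, None, :] + rng.normal(0, 3, (n_locs, n_cal, n_aps))

mu = cal.mean(axis=1)                   # reference pattern classes
sd = cal.std(axis=1) + 1e-6

def locate(scan):
    # Log-likelihood of the scan under each location's RSS statistics.
    ll = -0.5 * (((scan - mu) / sd) ** 2 + 2 * np.log(sd)).sum(axis=1)
    return int(np.argmax(ll))

test_scan = true_means[2] + rng.normal(0, 3, n_aps)
print("estimated location:", locate(test_scan))   # expected: 2
```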
On the establishment and maintenance of a modern conventional terrestrial reference system
NASA Technical Reports Server (NTRS)
Bock, Y.; Zhu, S. Y.
1982-01-01
The frame of the Conventional Terrestrial Reference System (CTS) is defined by an adopted set of coordinates, at a fundamental epoch, of a global network of stations which constitute the vertices of a fundamental polyhedron. A method to estimate this set of coordinates using a combination of modern three-dimensional geodetic systems is presented. Once established, the function of the CTS is twofold. The first is to monitor the external (or global) motions of the polyhedron with respect to the frame of a Conventional Inertial Reference System, i.e., those motions common to all stations. The second is to monitor the internal motions (or deformations) of the polyhedron, i.e., those motions that are not common to all stations. Two possible estimators for use in earth deformation analysis are given and their statistical and physical properties are described.
Research on ionospheric tomography based on variable pixel height
NASA Astrophysics Data System (ADS)
Zheng, Dunyong; Li, Peiqing; He, Jie; Hu, Wusheng; Li, Chaokui
2016-05-01
A novel ionospheric tomography technique based on variable pixel height was developed for the tomographic reconstruction of the ionospheric electron density distribution. The method considers the height of each pixel as an unknown variable, which is retrieved during the inversion process together with the electron density values. In contrast to conventional computerized ionospheric tomography (CIT), which parameterizes the model with a fixed pixel height, the variable-pixel-height computerized ionospheric tomography (VHCIT) model applies a disturbance to the height of each pixel. In comparison with conventional CIT models, the VHCIT technique achieved superior results in a numerical simulation. A careful validation of the reliability and superiority of VHCIT was performed. According to the results of the statistical analysis of the average root mean square errors, the proposed model offers a 15% improvement over conventional CIT models.
Biolik, A; Heide, S; Lessig, R; Hachmann, V; Stoevesandt, D; Kellner, J; Jäschke, C; Watzke, S
2018-04-01
One option for improving the quality of medical post mortem examinations is through intensified training of medical students, especially in countries where such a requirement exists regardless of the area of specialisation. For this reason, new teaching and learning methods on this topic have recently been introduced. These new approaches include e-learning modules or SkillsLab stations; one way to objectify the resultant learning outcomes is by means of the OSCE process. However, despite offering several advantages, this examination format also requires considerable resources, in particular with regard to medical examiners. For this reason, many clinical disciplines have already implemented computer-based OSCE examination formats. This study investigates whether the conventional exam format for the OSCE forensic "Death Certificate" station could be replaced with a computer-based approach in future. For this study, 123 students completed the OSCE "Death Certificate" station using both a computer-based and a conventional format; half started with the computer-based format and the other half with the conventional approach in their OSCE rotation. Assignment of examination cases was random. The examination results for the two stations were compared, and both overall results and the individual items of the exam checklist were analysed by means of inferential statistics. Following statistical analysis of examination cases of varying difficulty levels and correction of the repeated-measures effect, the results of both examination formats appear to be comparable. Thus, in the descriptive item analysis, while there were some significant differences between the computer-based and conventional OSCE stations, these differences were not reflected in the overall results after a correction factor was applied (e.g. point deductions for assistance from the medical examiner were possible only at the conventional station). Thus, we demonstrate that the computer-based OSCE "Death Certificate" station is a cost-efficient and standardised format for examination that yields results comparable to those from a conventional format exam. Moreover, the examination results also indicate the need to optimise both the test itself (adjusting the degree of difficulty of the case vignettes) and the corresponding instructional and learning methods (including, for example, the use of computer programmes to complete the death certificate in small group formats in the SkillsLab). Copyright © 2018 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.
Bringing a transgenic crop to market: where compositional analysis fits.
Privalle, Laura S; Gillikin, Nancy; Wandelt, Christine
2013-09-04
In the process of developing a biotechnology product, thousands of genes and transformation events are evaluated to select the event that will be commercialized. The ideal event is identified on the basis of multiple characteristics including trait efficacy, the molecular characteristics of the insert, and agronomic performance. Once selected, the commercial event is subjected to a rigorous safety evaluation taking a multipronged approach including examination of the safety of the gene and gene product - the protein, plant performance, impact of cultivating the crop on the environment, agronomic performance, and equivalence of the crop/food to conventional crops/food - by compositional analysis. The compositional analysis is composed of a comparison of the nutrient and antinutrient composition of the crop containing the event, its parental line (variety), and other conventional lines (varieties). Different geographies have different requirements for the compositional analysis studies. Parameters that vary include the number of years (seasons) and locations (environments) to be evaluated, the appropriate comparator(s), analytes to be evaluated, and statistical analysis. Specific examples of compositional analysis results will be presented.
New heterogeneous test statistics for the unbalanced fixed-effect nested design.
Guo, Jiin-Huarng; Billard, L; Luh, Wei-Ming
2011-05-01
When the underlying variances are unknown or/and unequal, using the conventional F test is problematic in the two-factor hierarchical data structure. Prompted by the approximate test statistics (Welch and Alexander-Govern methods), the authors develop four new heterogeneous test statistics to test factor A and factor B nested within A for the unbalanced fixed-effect two-stage nested design under variance heterogeneity. The actual significance levels and statistical power of the test statistics were compared in a simulation study. The results show that the proposed procedures maintain better Type I error rate control and have greater statistical power than those obtained by the conventional F test in various conditions. Therefore, the proposed test statistics are recommended in terms of robustness and easy implementation. ©2010 The British Psychological Society.
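For orientation, the sketch below implements the classic Welch heteroscedastic one-way statistic, the kind of building block the proposed nested-design statistics generalize; it covers only a one-way layout, not the factor-B-within-A tests developed in the paper, and the data are simulated with deliberately unequal variances.

```python
# Sketch of Welch's heteroscedastic one-way test statistic, a building
# block behind approximate tests for unequal variances. (One-way only;
# the paper's nested-design statistics are not reproduced here.)
import numpy as np
from scipy import stats

def welch_anova(groups):
    k = len(groups)
    n = np.array([len(g) for g in groups], float)
    m = np.array([np.mean(g) for g in groups])
    v = np.array([np.var(g, ddof=1) for g in groups])
    w = n / v                              # precision weights
    mw = np.sum(w * m) / np.sum(w)         # weighted grand mean
    num = np.sum(w * (m - mw) ** 2) / (k - 1)
    tmp = np.sum((1 - w / np.sum(w)) ** 2 / (n - 1)) / (k ** 2 - 1)
    f = num / (1 + 2 * (k - 2) * tmp)
    df2 = 1 / (3 * tmp)
    return f, k - 1.0, df2, stats.f.sf(f, k - 1, df2)

rng = np.random.default_rng(2)
groups = [rng.normal(0, 1, 12), rng.normal(0.8, 3, 8), rng.normal(0, 5, 20)]
f, df1, df2, p = welch_anova(groups)
print(f"Welch F = {f:.3f}, df = ({df1:.0f}, {df2:.1f}), p = {p:.4f}")
```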
Arnold, Heino; Stukenborg-Colsman, Christina; Hurschler, Christof; Seehaus, Frank; Bobrowitsch, Evgenij; Waizy, Hazibullah
2012-01-01
The aim of this study was to examine resistance to angulation and displacement of the internal fixation of a proximal first metatarsal lateral displacement osteotomy, using a locking plate system compared with a conventional crossed screw fixation. Seven anatomical human specimens were tested. Each specimen was tested with a locking screw plate as well as a crossed cancellous screw fixation. The statistical analysis was performed by the Friedman test. The level of significance was p = 0.05. We found greater stability about all three axes of movement analyzed for the locking plate (PLATE) than for the crossed screw osteosynthesis (CSO). The Friedman test showed statistical significance at a level of p = 0.05 for all groups and for both translational and rotational movements. The results of our study confirm that fixation of the lateral proximal first metatarsal displacement osteotomy with a locking plate is a technically simple procedure of superior stability.
Bjorgan, Asgeir; Randeberg, Lise Lyngsnes
2015-01-01
Processing line-by-line and in real-time can be convenient for some applications of line-scanning hyperspectral imaging technology. Some types of processing, like inverse modeling and spectral analysis, can be sensitive to noise. The MNF (minimum noise fraction) transform provides suitable denoising performance, but requires full image availability for the estimation of image and noise statistics. In this work, a modified algorithm is proposed. Incrementally-updated statistics enables the algorithm to denoise the image line-by-line. The denoising performance has been compared to conventional MNF and found to be equal. With a satisfying denoising performance and real-time implementation, the developed algorithm can denoise line-scanned hyperspectral images in real-time. The elimination of waiting time before denoised data are available is an important step towards real-time visualization of processed hyperspectral data. The source code can be found at http://www.github.com/ntnu-bioopt/mnf. This includes an implementation of conventional MNF denoising. PMID:25654717
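The crux of the line-by-line variant is that image and noise statistics can be kept as running quantities. The sketch below is an illustrative reconstruction rather than the published code (which is available at the repository cited above): a Welford-style incremental mean/covariance is updated per scan line, with line-to-line differences as a crude noise estimate, and the transform is obtained from a generalized eigendecomposition.

```python
# Sketch of the key ingredient for line-by-line MNF: image and noise
# statistics updated incrementally as each scan line arrives, so the
# denoising transform can be refreshed without the full image in memory.
# (Illustrative only; see the authors' repository for the real code.)
import numpy as np
from scipy.linalg import eigh

class RunningCov:
    def __init__(self, bands):
        self.n = 0
        self.mean = np.zeros(bands)
        self.s = np.zeros((bands, bands))   # sum of residual outer products

    def update(self, line):                 # line: (pixels, bands)
        for x in line:
            self.n += 1
            d = x - self.mean               # Welford-style update
            self.mean += d / self.n
            self.s += np.outer(d, x - self.mean)

    def cov(self):
        return self.s / max(self.n - 1, 1)

bands, pixels = 5, 64
rng = np.random.default_rng(3)
mix = rng.normal(0, 1, (bands, bands))      # fixed band correlation
img_stats, noise_stats = RunningCov(bands), RunningCov(bands)

for _ in range(20):                         # 20 scan lines arriving one by one
    line = rng.normal(0, 1, (pixels, bands)) @ mix
    img_stats.update(line)
    noise_stats.update(np.diff(line, axis=0))   # crude noise estimate

# MNF directions: generalized eigenvectors of noise vs. image covariance.
w, v = eigh(noise_stats.cov(), img_stats.cov())
print("noise fractions (ascending):", np.round(w, 3))
```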
Ullattuthodi, Sujana; Cherian, Kandathil Phillip; Anandkumar, R; Nambiar, M Sreedevi
2017-01-01
This in vitro study seeks to evaluate and compare the marginal and internal fit of cobalt-chromium copings fabricated using the conventional and direct metal laser sintering (DMLS) techniques. A master model of a prepared molar tooth was made using cobalt-chromium alloy. A silicone impression of the master model was made and thirty standardized working models were then produced: twenty working models for the conventional lost-wax technique and ten working models for the DMLS technique. A total of twenty metal copings were fabricated using the two different production techniques, conventional lost-wax and DMLS, with ten samples in each group. The conventional and DMLS copings were cemented to the working models using glass ionomer cement. The marginal gap of each coping was measured at four predetermined points. The dies with the cemented copings were sectioned in a standardized manner with a heavy-duty lathe. Each sectioned sample was then analyzed for the internal gap between the die and the metal coping using a metallurgical microscope. Digital photographs were taken at ×50 magnification and analyzed using measurement software. Statistical analysis was done by the unpaired t-test and analysis of variance (ANOVA). The results of this study reveal that no significant difference was present in the marginal gap of conventional and DMLS copings (P > 0.05) by means of ANOVA. The mean internal gap of DMLS copings was significantly greater than that of conventional copings (P < 0.05). Within the limitations of this in vitro study, it was concluded that the internal fit of conventional copings was superior to that of the DMLS copings. The marginal fit of the copings fabricated by the two different techniques showed no significant difference.
Comprehensive non-dimensional normalization of gait data.
Pinzone, Ornella; Schwartz, Michael H; Baker, Richard
2016-02-01
Normalizing clinical gait analysis data is required to remove variability due to physical characteristics such as leg length and weight. This is particularly important for children, where both are associated with age. In most clinical centres conventional normalization (by mass only) is used, whereas there is a stronger biomechanical argument for non-dimensional normalization. This study used data from 82 typically developing children to compare how the two schemes performed over a wide range of temporal-spatial and kinetic parameters by calculating the coefficients of determination with leg length, weight and height. 81% of the conventionally normalized parameters had a coefficient of determination above the threshold for a statistical association (p<0.05), compared to 23% of those normalized non-dimensionally. All the conventionally normalized parameters exceeding this threshold showed a reduced association with non-dimensional normalization. In conclusion, non-dimensional normalization is more effective than conventional normalization in reducing the effects of height, weight and age in a comprehensive range of temporal-spatial and kinetic parameters. Copyright © 2015 Elsevier B.V. All rights reserved.
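For concreteness, the sketch below applies the standard Hof-style non-dimensional scheme using body mass, leg length and gravity; the formulas are the conventional ones from the biomechanics literature, and the exact parameter set and example numbers are assumptions, not values from the study.

```python
# Sketch of non-dimensional normalization of gait parameters using body
# mass m, leg length l and gravity g (the Hof-style scheme this kind of
# study advocates); parameter choices here are illustrative.
import math

G = 9.81  # m/s^2

def nondimensionalize(mass_kg, leg_m, *, speed=None, cadence=None,
                      moment=None, power=None):
    out = {}
    if speed is not None:                 # v* = v / sqrt(g * l)
        out["speed"] = speed / math.sqrt(G * leg_m)
    if cadence is not None:               # rate* = (steps/s) * sqrt(l / g)
        out["cadence"] = cadence / 60.0 * math.sqrt(leg_m / G)
    if moment is not None:                # M* = M / (m * g * l)
        out["moment"] = moment / (mass_kg * G * leg_m)
    if power is not None:                 # P* = P / (m * g^1.5 * l^0.5)
        out["power"] = power / (mass_kg * G ** 1.5 * math.sqrt(leg_m))
    return out

# A child (25 kg, 0.60 m leg) and an adult (70 kg, 0.92 m leg) walking:
print(nondimensionalize(25, 0.60, speed=1.1, moment=20.0))
print(nondimensionalize(70, 0.92, speed=1.4, moment=55.0))
```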
Statistical inference for noisy nonlinear ecological dynamic systems.
Wood, Simon N
2010-08-26
Chaotic ecological dynamic systems defy conventional statistical analysis. Systems with near-chaotic dynamics are little better. Such systems are almost invariably driven by endogenous dynamic processes plus demographic and environmental process noise, and are only observable with error. Their sensitivity to history means that minute changes in the driving noise realization, or the system parameters, will cause drastic changes in the system trajectory. This sensitivity is inherited and amplified by the joint probability density of the observable data and the process noise, rendering it useless as the basis for obtaining measures of statistical fit. Because the joint density is the basis for the fit measures used by all conventional statistical methods, this is a major theoretical shortcoming. The inability to make well-founded statistical inferences about biological dynamic models in the chaotic and near-chaotic regimes, other than on an ad hoc basis, leaves dynamic theory without the methods of quantitative validation that are essential tools in the rest of biological science. Here I show that this impasse can be resolved in a simple and general manner, using a method that requires only the ability to simulate the observed data on a system from the dynamic model about which inferences are required. The raw data series are reduced to phase-insensitive summary statistics, quantifying local dynamic structure and the distribution of observations. Simulation is used to obtain the mean and the covariance matrix of the statistics, given model parameters, allowing the construction of a 'synthetic likelihood' that assesses model fit. This likelihood can be explored using a straightforward Markov chain Monte Carlo sampler, but one further post-processing step returns pure likelihood-based inference. I apply the method to establish the dynamic nature of the fluctuations in Nicholson's classic blowfly experiments.
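The synthetic likelihood recipe is simple enough to sketch end-to-end. The toy below uses a noisy Ricker map and a handful of phase-insensitive summaries; the choice of model, statistics and replicate count are illustrative assumptions, and Wood's application uses considerably richer summaries.

```python
# Sketch of the synthetic likelihood idea: simulate replicates from the
# dynamic model, reduce each to summary statistics, fit a multivariate
# normal to them, and score the observed summaries under that normal.
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(4)

def simulate_ricker(r, sigma, n=100):
    """Noisy Ricker map, a classic near-chaotic ecological model."""
    x = np.empty(n); x[0] = 1.0
    for t in range(1, n):
        x[t] = x[t - 1] * np.exp(r * (1 - x[t - 1]) + rng.normal(0, sigma))
    return x

def summaries(x):
    # Phase-insensitive statistics: distributional and local-dynamic features.
    return np.array([x.mean(), x.std(), np.corrcoef(x[:-1], x[1:])[0, 1],
                     np.mean(np.abs(np.diff(x)))])

def synthetic_loglik(theta, s_obs, n_rep=200):
    r, sigma = theta
    S = np.array([summaries(simulate_ricker(r, sigma)) for _ in range(n_rep)])
    mu, cov = S.mean(axis=0), np.cov(S, rowvar=False)
    return multivariate_normal(mu, cov, allow_singular=True).logpdf(s_obs)

s_obs = summaries(simulate_ricker(2.5, 0.3))        # "observed" data
for r in (1.5, 2.5, 3.5):
    print(f"r = {r}: synthetic log-lik = "
          f"{synthetic_loglik((r, 0.3), s_obs):.1f}")
```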
Muko, Soyoka; Shimatani, Ichiro K; Nozawa, Yoko
2014-07-01
Spatial distributions of individuals are conventionally analysed by representing objects as dimensionless points, in which spatial statistics are based on centre-to-centre distances. However, if organisms expand without overlapping and show size variations, such as is the case for encrusting corals, interobject spacing is crucial for spatial associations where interactions occur. We introduced new pairwise statistics using minimum distances between objects and demonstrated their utility when examining encrusting coral community data. We also calculated the conventional point process statistics and the grid-based statistics to clarify the advantages and limitations of each spatial statistical method. For simplicity, coral colonies were approximated by disks in these demonstrations. Focusing on short-distance effects, the use of minimum distances revealed that almost all coral genera were aggregated at a scale of 1-25 cm. However, when fragmented colonies (ramets) were treated as a genet, a genet-level analysis indicated weak or no aggregation, suggesting that most corals were randomly distributed and that fragmentation was the primary cause of colony aggregations. In contrast, point process statistics showed larger aggregation scales, presumably because centre-to-centre distances included both intercolony spacing and colony sizes (radius). The grid-based statistics were able to quantify the patch (aggregation) scale of colonies, but the scale was strongly affected by the colony size. Our approach quantitatively showed repulsive effects between an aggressive genus and a competitively weak genus, while the grid-based statistics (covariance function) also showed repulsion although the spatial scale indicated from the statistics was not directly interpretable in terms of ecological meaning. The use of minimum distances together with previously proposed spatial statistics helped us to extend our understanding of the spatial patterns of nonoverlapping objects that vary in size and the associated specific scales. © 2013 The Authors. Journal of Animal Ecology © 2013 British Ecological Society.
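The core quantity is easy to state for disk-approximated colonies: the minimum (edge-to-edge) distance is the centre-to-centre distance minus both radii, floored at zero. The sketch below contrasts it with the conventional centre-to-centre distance; coordinates and radii are hypothetical.

```python
# Minimum (edge-to-edge) distance between two disk-approximated colonies,
# versus the conventional centre-to-centre distance that confounds
# spacing with colony size. (Hypothetical coordinates and radii.)
import numpy as np

def centre_distance(p1, p2):
    return float(np.hypot(*(np.asarray(p1) - np.asarray(p2))))

def min_distance(p1, r1, p2, r2):
    """Edge-to-edge gap between two disks (0 if they touch or overlap)."""
    return max(0.0, centre_distance(p1, p2) - r1 - r2)

# Two large colonies 30 cm apart centre-to-centre but nearly touching:
p1, r1 = (0.0, 0.0), 14.0
p2, r2 = (30.0, 0.0), 14.0
print("centre-to-centre:", centre_distance(p1, p2))        # 30.0 cm
print("minimum distance:", min_distance(p1, r1, p2, r2))   # 2.0 cm
```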
Eun-Sook Kim; Cheol-Min Kim; Jisun Lee; Jong-Su Yim
2015-01-01
Since 1971, South Korea has implemented a national forest inventory (NFI) to track the current state of, and trends in, its national forest resources. NFI1 (1971~1975), NFI2 (1978~1981), NFI3 (1986~1992) and NFI4 (1996~2005) were implemented in order to produce national forest resources statistics. However, since the early 1990s, international conventions...
Hiwatashi, Akio; Togao, Osamu; Yamashita, Koji; Kikuchi, Kazufumi; Yoshikawa, Hiroshi; Obara, Makoto; Honda, Hiroshi
2018-06-01
To differentiate cystic from solid solitary intraorbital tumors using 3D turbo field echo with diffusion-sensitized driven-equilibrium (DSDE-TFE) preparation without contrast material. This retrospective study was approved by our institutional review boards, and written informed consent was waived. A total of 26 patients with intraorbital tumors were studied. Motion probing gradients were conducted in one direction with b-values of 0 and 500 s/mm². The voxel size was 1.5 × 1.5 × 1.5 mm³, and the acquisition time was 5 min 22 s. Additionally, fat-suppressed T2-weighted imaging (T2WI) and T1WI were obtained. The apparent diffusion coefficients (ADC) of the lesions were measured. Signal intensity on conventional magnetic resonance imaging (MRI) relative to normal-appearing white matter was also measured. Statistical analysis was performed with the Mann-Whitney U-test, the Steel-Dwass test and receiver operating characteristic (ROC) analysis. There were 10 cystic (7 dermoids, 2 epidermoids, and 1 cystadenoma) and 16 solid (8 cavernous hemangiomas, 6 pleomorphic adenomas, 1 adenocarcinoma, and 1 sebaceous carcinoma) tumors. The ADC of the cystic tumors (mean ± SD; 2.21 ± 0.76 × 10⁻³ mm²/s) was statistically significantly higher than that of the solid tumors (1.43 ± 0.41 × 10⁻³ mm²/s; P < 0.05); however, there were no statistically significant differences on conventional MRI (P > 0.05). There were no statistically significant differences among tumor subtypes in any parameter (P > 0.05). The ROC analysis showed the best diagnostic performance with ADC (Az = 0.77). With its insensitivity to field inhomogeneity and high spatial resolution, the 3D DSDE-TFE technique enabled us to discriminate cystic tumors from solid tumors.
Singal, Amit G.; Mukherjee, Ashin; Elmunzer, B. Joseph; Higgins, Peter DR; Lok, Anna S.; Zhu, Ji; Marrero, Jorge A; Waljee, Akbar K
2015-01-01
Background Predictive models for hepatocellular carcinoma (HCC) have been limited by modest accuracy and lack of validation. Machine learning algorithms offer a novel methodology, which may improve HCC risk prognostication among patients with cirrhosis. Our study's aim was to develop and compare predictive models for HCC development among cirrhotic patients, using conventional regression analysis and machine learning algorithms. Methods We enrolled 442 patients with Child A or B cirrhosis at the University of Michigan between January 2004 and September 2006 (UM cohort) and prospectively followed them until HCC development, liver transplantation, death, or study termination. Regression analysis and machine learning algorithms were used to construct predictive models for HCC development, which were tested on an independent validation cohort from the Hepatitis C Antiviral Long-term Treatment against Cirrhosis (HALT-C) Trial. Both models were also compared to the previously published HALT-C model. Discrimination was assessed using receiver operating characteristic curve analysis and diagnostic accuracy was assessed with net reclassification improvement and integrated discrimination improvement statistics. Results After a median follow-up of 3.5 years, 41 patients developed HCC. The UM regression model had a c-statistic of 0.61 (95%CI 0.56-0.67), whereas the machine learning algorithm had a c-statistic of 0.64 (95%CI 0.60–0.69) in the validation cohort. The machine learning algorithm had significantly better diagnostic accuracy as assessed by net reclassification improvement (p<0.001) and integrated discrimination improvement (p=0.04). The HALT-C model had a c-statistic of 0.60 (95%CI 0.50-0.70) in the validation cohort and was outperformed by the machine learning algorithm (p=0.047). Conclusion Machine learning algorithms improve the accuracy of risk stratifying patients with cirrhosis and can be used to accurately identify patients at high-risk for developing HCC. PMID:24169273
Balasubramanian, Sasikala; Paneerselvam, Elavenil; Guruprasad, T; Pathumai, M; Abraham, Simin; Krishnakumar Raja, V B
2017-01-01
The aim of this randomized clinical trial was to assess the efficacy of exclusive lingual nerve block (LNB) in achieving selective lingual soft-tissue anesthesia in comparison with conventional inferior alveolar nerve block (IANB). A total of 200 patients indicated for the extraction of lower premolars were recruited for the study. The samples were allocated by randomization into control and study groups. Lingual soft-tissue anesthesia was achieved by IANB and exclusive LNB in the control and study group, respectively. The primary outcome variable studied was anesthesia of ipsilateral lingual mucoperiosteum, floor of mouth and tongue. The secondary variables assessed were (1) taste sensation immediately following administration of local anesthesia and (2) mouth opening and lingual nerve paresthesia on the first postoperative day. Data analysis for descriptive and inferential statistics was performed using SPSS (IBM SPSS Statistics for Windows, Version 22.0, Armonk, NY: IBM Corp. Released 2013) and a P < 0.05 was considered statistically significant. In comparison with the control group, the study group (LNB) showed statistically significant anesthesia of the lingual gingiva of incisors, molars, anterior floor of the mouth, and anterior tongue. Exclusive LNB is superior to IAN nerve block in achieving selective anesthesia of lingual soft tissues. It is technically simple and associated with minimal complications as compared to IAN block.
Park, Yoonah; Yong, Yuen Geng; Yun, Seong Hyeon; Jung, Kyung Uk; Huh, Jung Wook; Cho, Yong Beom; Kim, Hee Cheol; Lee, Woo Yong; Chun, Ho-Kyung
2015-05-01
This study aimed to compare the learning curves and early postoperative outcomes of conventional laparoscopic (CL) and single incision laparoscopic (SIL) right hemicolectomy (RHC). This retrospective study included the initial 35 cases in each group. Learning curves were evaluated by the moving average of operative time, the mean operative time of every five consecutive cases, and cumulative sum (CUSUM) analysis. The learning phase was considered overcome when the moving average of operative times reached a plateau, and when the mean operative time of every five consecutive cases reached a low point and subsequently did not vary by more than 30 minutes. Six patients with missing data in the CL RHC group were excluded from the analyses. According to the mean operative time of every five consecutive cases, the learning phase of SIL and CL RHC was completed between 26 and 30 cases, and 16 and 20 cases, respectively. Moving average analysis revealed that approximately 31 (SIL) and 25 (CL) cases were needed to complete the learning phase. CUSUM analysis demonstrated that 10 (SIL) and two (CL) cases were required to reach a steady state of complication-free performance. The postoperative complication rate was higher in the SIL group than in the CL group, but the difference was not statistically significant (17.1% vs. 3.4%). The learning phase of SIL RHC is longer than that of CL RHC. Early oncological outcomes of both techniques were comparable. However, SIL RHC had a higher, though not statistically significant, complication rate than CL RHC during the learning phase.
Edward, Joseph; Aziz, Mubarak A; Madhu Usha, Arjun; Narayanan, Jyothi K
2017-12-01
Extractions are routine procedures in dental surgery. Traditional extraction techniques use a combination of severing the periodontal attachment, luxation with an elevator, and removal with forceps. A new technique for the extraction of maxillary third molars, the Joedds technique, is introduced in this study and compared with the conventional technique. One hundred people were included in the study and divided into two groups by simple random sampling. In one group the conventional technique of maxillary third molar extraction was used; in the second, the Joedds technique was used. Statistical analysis was carried out with Student's t-test. Analysis of the 100 patients based on these parameters showed that the novel Joedds technique caused minimal trauma to surrounding tissues and fewer tuberosity and root fractures, and the time taken for extraction was less than 2 minutes compared with the other group. With proper case selection and correct technique, this novel approach proved better than the conventional third molar extraction technique, with minimal complications.
Using machine learning to assess covariate balance in matching studies.
Linden, Ariel; Yarnold, Paul R
2016-12-01
In order to assess the effectiveness of matching approaches in observational studies, investigators typically present summary statistics for each observed pre-intervention covariate, with the objective of showing that matching reduces the difference in means (or proportions) between groups to as close to zero as possible. In this paper, we introduce a new approach to distinguish between study groups based on their distributions of the covariates using a machine-learning algorithm called optimal discriminant analysis (ODA). Assessing covariate balance using ODA as compared with the conventional method has several key advantages: the ability to ascertain how individuals self-select based on optimal (maximum-accuracy) cut-points on the covariates; applicability to any variable metric and number of groups; insensitivity to skewed data or outliers; and the use of accuracy measures that can be widely applied to all analyses. Moreover, ODA accepts analytic weights, thereby extending the assessment of covariate balance to any study design where weights are used for covariate adjustment. By comparing the two approaches using empirical data, we demonstrate that using measures of classification accuracy as balance diagnostics produces results highly consistent with those obtained via the conventional approach (in our matched-pairs example, ODA revealed a weak statistically significant relationship not detected by the conventional approach). Thus, investigators should consider ODA as a robust complement, or perhaps alternative, to the conventional approach for assessing covariate balance in matching studies. © 2016 John Wiley & Sons, Ltd.
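A minimal flavour of the idea can be sketched as a brute-force maximum-accuracy cut-point search on a single covariate: if no cut-point separates treated from control cases much better than chance, the groups are balanced on that covariate. This is only the core intuition; real ODA software additionally handles weights, multiple classes and permutation-based significance.

```python
# Minimal sketch of a maximum-accuracy cut-point search on one covariate,
# the intuition behind ODA-style balance checks. (Hypothetical data;
# real ODA adds analytic weights and permutation tests.)
import numpy as np

def best_cutpoint(x, group):
    """Return (cut, accuracy) maximizing mean per-group classification accuracy."""
    best = (None, 0.0)
    for cut in np.unique(x):
        for direction in (1, -1):          # x >= cut, or x <= cut
            pred = (direction * x >= direction * cut).astype(int)
            acc = 0.5 * ((pred[group == 1] == 1).mean() +
                         (pred[group == 0] == 0).mean())
            if acc > best[1]:
                best = (float(cut), float(acc))
    return best

rng = np.random.default_rng(5)
treated = rng.normal(50, 10, 100)          # hypothetical covariate, e.g. age
control = rng.normal(50, 10, 100)          # well matched on this covariate
x = np.concatenate([treated, control])
g = np.array([1] * 100 + [0] * 100)
cut, acc = best_cutpoint(x, g)
print(f"best cut = {cut:.1f}, balanced accuracy = {acc:.2f} (chance = 0.50)")
```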
Duc, Anh Nguyen; Wolbers, Marcel
2017-02-10
Composite endpoints are widely used as primary endpoints of randomized controlled trials across clinical disciplines. A common critique of the conventional analysis of composite endpoints is that all disease events are weighted equally, whereas their clinical relevance may differ substantially. We address this by introducing a framework for the weighted analysis of composite endpoints and interpretable test statistics, which are applicable to both binary and time-to-event data. To cope with the difficulty of selecting an exact set of weights, we propose a method for constructing simultaneous confidence intervals and tests that asymptotically preserve the family-wise type I error in the strong sense across families of weights satisfying flexible inequality or order constraints, based on the theory of chi-bar-squared (χ̄²) distributions. We show that the method achieves the nominal simultaneous coverage rate with substantial efficiency gains over Scheffé's procedure in a simulation study and apply it to trials in cardiovascular disease and enteric fever. © 2016 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.
Ocular Biocompatibility of Nitinol Intraocular Clips
Velez-Montoya, Raul; Erlanger, Michael
2012-01-01
Purpose. To evaluate the tolerance and biocompatibility of a preformed nitinol intraocular clip in an animal model after anterior segment surgery. Methods. Yucatan mini-pigs were used. A 30-gauge prototype injector was used to attach a shape-memory nitinol clip to the iris of five pigs. Another five eyes received conventional polypropylene suture with a modified Seipser slip knot. The authors compared the surgical time of each technique. All eyes underwent standard full-field electroretinography at baseline and 8 weeks after surgery. The animals were euthanized and the eyes collected for histologic analysis 70 days (10 weeks) postsurgery. Corneal thickness, corneal endothelial cell counts, specular microscopy parameters, retina cell counts, and electroretinogram parameters were compared between the groups. A two-sample t-test for means and a P value of 0.05 were used to assess statistical differences between measurements. Results. The injection of the nitinol clip was 15 times faster than conventional suturing. There were no statistical differences between the groups for corneal thickness, endothelial cell counts, specular microscopy parameters, retina cell counts, and electroretinogram measurements. Conclusions. The nitinol clip prototype is well tolerated and showed no evidence of toxicity in the short term. The injectable delivery system was faster and technically less challenging than conventional suture techniques. PMID:22064995
Strappini, Francesca; Gilboa, Elad; Pitzalis, Sabrina; Kay, Kendrick; McAvoy, Mark; Nehorai, Arye; Snyder, Abraham Z
2017-03-01
Temporal and spatial filtering of fMRI data is often used to improve statistical power. However, conventional methods, such as smoothing with fixed-width Gaussian filters, remove fine-scale structure in the data, necessitating a tradeoff between sensitivity and specificity. Specifically, smoothing may increase sensitivity (reduce noise and increase statistical power) but at the cost of specificity, in that fine-scale structure in neural activity patterns is lost. Here, we propose an alternative smoothing method based on Gaussian process (GP) regression for single-subject fMRI experiments. This method adapts the level of smoothing on a voxel-by-voxel basis according to the characteristics of the local neural activity patterns. GP-based fMRI analysis has heretofore been impractical owing to computational demands. Here, we demonstrate a new implementation of GP that makes it possible to handle the massive data dimensionality of the typical fMRI experiment. We demonstrate how GP can be used as a drop-in replacement for conventional preprocessing steps for temporal and spatial smoothing in a standard fMRI pipeline. We present simulated and experimental results that show increased sensitivity and specificity compared to conventional smoothing strategies. Hum Brain Mapp 38:1438-1459, 2017. © 2016 Wiley Periodicals, Inc.
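As a drop-in illustration of adaptive GP smoothing on one voxel's time series, the sketch below fits an RBF-plus-white-noise kernel with scikit-learn, letting the fitted hyperparameters set the degree of smoothing; the TR, signal and noise levels are assumptions, and the paper's contribution is precisely the machinery that scales this far beyond single series.

```python
# Sketch of GP-regression smoothing of one noisy voxel time series; the
# fitted kernel hyperparameters adapt the amount of smoothing to the data.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(6)
t = np.arange(0.0, 120.0, 2.0)[:, None]        # assumed TR = 2 s
signal = np.sin(2 * np.pi * t.ravel() / 30.0)  # slow "activation" component
y = signal + rng.normal(0, 0.5, t.shape[0])    # noisy measurement

kernel = 1.0 * RBF(length_scale=5.0) + WhiteKernel(noise_level=0.25)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(t, y)
y_smooth = gp.predict(t)

print("fitted kernel:", gp.kernel_)
print("residual SD before/after smoothing:",
      round(float(np.std(y - signal)), 2),
      round(float(np.std(y_smooth - signal)), 2))
```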
Reliability of patient specific instrumentation in total knee arthroplasty.
Jennart, Harold; Ngo Yamben, Marie-Ange; Kyriakidis, Theofylaktos; Zorman, David
2015-12-01
The aim of this study was to compare the precision between Patient Specific Instrumentation (PSI) and Conventional Instrumentation (CI) as determined intra-operatively by a pinless navigation system. Eighty patients were included in this prospective comparative study and they were divided into two homogeneous groups. We defined an original score from 6 to 30 points to evaluate the accuracy of the position of the cutting guides. This score is based on 6 objective criteria. The analysis indicated that PSI was not superior to conventional instrumentation in the overall score (p = 0.949). Moreover, no statistically significant difference was observed for any individual criteria of our score. Level of evidence II.
Ratcliffe, J; Van Haselen, R; Buxton, M; Hardy, K; Colehan, J; Partridge, M
2002-01-01
Background: A study was undertaken to investigate the preferences of patients with asthma for attributes or characteristics associated with treatment for their asthma and to investigate the extent to which such preferences may differ between patient subgroups. Methods: The economic technique of conjoint analysis (CA) was used to investigate patients' strength of preference for several key attributes associated with services for the treatment of asthma. A CA questionnaire was administered to two groups of asthma outpatients aged 18 years or older, 150 receiving conventional treatment at Whipps Cross Hospital (WC) and 150 receiving homeopathic treatment at the Royal London Homoeopathic Hospital (RL). Results: An overall response rate of 47% (n=142) was achieved. Statistically significant attributes in influencing preferences for both the WC and RL respondents were (1) the extent to which the doctor gave sufficient time to listen to what the patient has to say, (2) the extent to which the treatment seemed to relieve symptoms, and (3) the travel costs of attending for an asthma consultation. The extent to which the doctor treated the patient as a whole person was also a statistically significant attribute for the RL respondents. Conclusions: This study has shown that aspects associated with the process of delivery of asthma services are important to patients in addition to treatment outcomes. The homeopathic respondents expressed stronger preferences for the doctor to treat them as a whole person than the patients receiving conventional treatment. Overall, the preferences for the attributes included in the study were similar for both groups. PMID:12037224
Human factors flight trial analysis for 3D SVS: part II
NASA Astrophysics Data System (ADS)
Schiefele, Jens; Howland, Duncan; Maris, John; Pschierer, Christian; Wipplinger, Patrick; Meuter, Michael
2005-05-01
This paper describes flight trials performed in Centennial, CO using a Piper Cheyenne owned and operated by Marinvent. The goal of the flight trial was to evaluate the objective performance of pilots using conventional paper charts or a 3D SVS display. Six pilots flew thirty-six approaches to the Colorado Springs airport to accomplish this goal. As dependent variables, positional accuracy and situational awareness probe (SAP) statistics were measured, and the analysis was conducted with an ANOVA test. In parallel, all pilots answered subjective Cooper-Harper, NASA TLX, situation awareness rating technique (SART), Display Readability Rating, Display Flyability Rating and debriefing questionnaires. Three different settings (paper chart, electronic navigation chart, 3D SVS display) were evaluated in a totally randomized manner. This paper describes the comparison between the conventional paper chart and the 3D SVS display. The 3D SVS primary flight display provides a depiction of primary flight data as well as a 3D depiction of airports, terrain and obstacles. In addition, a 3D dynamic channel visualizing the selected approach procedure can be displayed. The results show that pilots flying the 3D SVS display performed no worse than pilots with the conventional paper chart: flight technical error and workload were lower, and situational awareness was equivalent to that with conventional paper charts.
Zheng, Yingyan; Xiao, Zebin; Zhang, Hua; She, Dejun; Lin, Xuehua; Lin, Yu; Cao, Dairong
2018-04-01
To evaluate the discriminative value of conventional magnetic resonance imaging between benign and malignant palatal tumors. Conventional magnetic resonance imaging features of 130 patients with palatal tumors confirmed by histopathologic examination were retrospectively reviewed. Clinical data and imaging findings were assessed between benign and malignant tumors and between benign and low-grade malignant salivary gland tumors. The variables that were significant in differentiating benign from malignant lesions were further identified using logistic regression analysis. Moreover, imaging features of each common palatal histologic entity were statistically analyzed with the rest of the tumors to define their typical imaging features. Older age, partially defined and ill-defined margins, and absence of a capsule were highly suggestive of malignant palatal tumors, especially ill-defined margins (β = 6.400). The precision in determining malignant palatal tumors achieved a sensitivity of 92.8% and a specificity of 85.6%. In addition, irregular shape, ill-defined margins, lack of a capsule, perineural spread, and invasion of surrounding structures were more often associated with low-grade malignant salivary gland tumors. Conventional magnetic resonance imaging is useful for differentiating benign from malignant palatal tumors as well as benign salivary gland tumors from low-grade salivary gland malignancies. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.
Clark, D Angus; Bowles, Ryan P
2018-04-23
In exploratory item factor analysis (IFA), researchers may use model fit statistics and commonly invoked fit thresholds to help determine the dimensionality of an assessment. However, these indices and thresholds may mislead as they were developed in a confirmatory framework for models with continuous, not categorical, indicators. The present study used Monte Carlo simulation methods to investigate the ability of popular model fit statistics (chi-square, root mean square error of approximation, the comparative fit index, and the Tucker-Lewis index) and their standard cutoff values to detect the optimal number of latent dimensions underlying sets of dichotomous items. Models were fit to data generated from three-factor population structures that varied in factor loading magnitude, factor intercorrelation magnitude, number of indicators, and whether cross loadings or minor factors were included. The effectiveness of the thresholds varied across fit statistics, and was conditional on many features of the underlying model. Together, results suggest that conventional fit thresholds offer questionable utility in the context of IFA.
Statistical geometric affinity in human brain electric activity
NASA Astrophysics Data System (ADS)
Chornet-Lurbe, A.; Oteo, J. A.; Ros, J.
2007-05-01
The representation of the human electroencephalogram (EEG) records by neurophysiologists demands standardized time-amplitude scales for their correct conventional interpretation. In a suite of graphical experiments involving scaling affine transformations we have been able to convert electroencephalogram samples corresponding to any particular sleep phase and relaxed wakefulness into each other. We propound a statistical explanation for that finding in terms of data collapse. As a sequel, we determine characteristic time and amplitude scales and outline a possible physical interpretation. An analysis for characteristic times based on lacunarity is also carried out as well as a study of the synchrony between left and right EEG channels.
Initial assessment of hearing loss using a mobile application for audiological evaluation.
Derin, S; Cam, O H; Beydilli, H; Acar, E; Elicora, S S; Sahan, M
2016-03-01
This study aimed to compare an Apple iOS mobile operating system application for audiological evaluation with conventional audiometry, and to determine its accuracy and reliability in the initial evaluation of hearing loss. The study comprised 32 patients (16 females) diagnosed with hearing loss. The patients were first evaluated with conventional audiometry and the degree of hearing loss was recorded. Then they underwent a smartphone-based hearing test and the data were compared using Cohen's kappa analysis. Patients' mean age was 53.59 ± 18.01 years (range, 19-85 years). The mobile phone audiometry results for 39 of the 64 ears were fully compatible with the conventional audiometry results. There was a statistically significant concordant relationship between the two sets of audiometry results (p < 0.05). Ear Trumpet version 1.0.2 is a compact and simple mobile application on the Apple iPhone 5 that can measure hearing loss with reliable results.
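The agreement analysis reported here rests on Cohen's kappa for categorical hearing-loss grades. The sketch below shows the computation on hypothetical grade pairs (the study's data are not reproduced):

```python
# Sketch of the agreement analysis: hearing-loss grades from conventional
# audiometry vs. a mobile app compared with Cohen's kappa.
from sklearn.metrics import cohen_kappa_score

grades = ["normal", "mild", "moderate", "severe"]
conventional = ["mild", "mild", "normal", "severe", "moderate",
                "mild", "normal", "moderate", "severe", "mild"]
mobile_app   = ["mild", "moderate", "normal", "severe", "moderate",
                "mild", "normal", "mild", "severe", "mild"]

kappa = cohen_kappa_score(conventional, mobile_app, labels=grades)
print(f"Cohen's kappa = {kappa:.2f}")   # 1.0 = perfect agreement
```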
Analysis of longitudinal data from animals with missing values using SPSS.
Duricki, Denise A; Soleman, Sara; Moon, Lawrence D F
2016-06-01
Testing of therapies for disease or injury often involves the analysis of longitudinal data from animals. Modern analytical methods have advantages over conventional methods (particularly when some data are missing), yet they are not used widely by preclinical researchers. Here we provide an easy-to-use protocol for the analysis of longitudinal data from animals, and we present a click-by-click guide for performing suitable analyses using the statistical package IBM SPSS Statistics software (SPSS). We guide readers through the analysis of a real-life data set obtained when testing a therapy for brain injury (stroke) in elderly rats. If a few data points are missing, as in this example data set (for example, because of animal dropout), repeated-measures analysis of covariance may fail to detect a treatment effect. An alternative analysis method, such as the use of linear models (with various covariance structures), and analysis using restricted maximum likelihood estimation (to include all available data) can be used to better detect treatment effects. This protocol takes 2 h to carry out.
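The protocol itself is a click-by-click SPSS guide, but the same analysis strategy can be sketched outside SPSS. The example below is an assumption-laden analogue in Python: a linear mixed model fitted by REML uses all available observations, so animals with occasional missing time points still contribute, and the treatment-by-time interaction carries the question of interest. The data are simulated, not the stroke data set from the protocol.

```python
# Analogous mixed-model analysis of longitudinal animal data with ~10%
# missing observations, using statsmodels instead of SPSS. (Simulated data.)
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
rows = []
for animal in range(16):
    treated = animal < 8
    intercept = rng.normal(50, 5)               # animal-specific baseline
    for week in range(1, 9):
        score = intercept - (3.0 if treated else 1.0) * week + rng.normal(0, 3)
        if rng.random() > 0.1:                   # ~10% of observations missing
            rows.append(dict(animal=animal, treated=int(treated),
                             week=week, score=score))
df = pd.DataFrame(rows)

# Random intercept per animal; the week:treated interaction asks whether
# the time course differs between treatment groups.
model = smf.mixedlm("score ~ week * treated", df, groups=df["animal"])
fit = model.fit(reml=True)
print(fit.summary())
```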
Low statistical power in biomedical science: a review of three human research domains.
Dumas-Mallet, Estelle; Button, Katherine S; Boraud, Thomas; Gonon, Francois; Munafò, Marcus R
2017-02-01
Studies with low statistical power increase the likelihood that a statistically significant finding represents a false positive result. We conducted a review of meta-analyses of studies investigating the association of biological, environmental or cognitive parameters with neurological, psychiatric and somatic diseases, excluding treatment studies, in order to estimate the average statistical power across these domains. Taking the effect size indicated by a meta-analysis as the best estimate of the likely true effect size, and assuming a threshold for declaring statistical significance of 5%, we found that approximately 50% of studies have statistical power in the 0-10% or 11-20% range, well below the minimum of 80% that is often considered conventional. Studies with low statistical power appear to be common in the biomedical sciences, at least in the specific subject areas captured by our search strategy. However, we also observe evidence that this depends in part on research methodology, with candidate gene studies showing very low average power and studies using cognitive/behavioural measures showing high average power. This warrants further investigation.
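The power computation underlying such reviews can be sketched in a few lines: take the meta-analytic effect size as the assumed true effect and ask what power a study of a given size achieves at the conventional 5% threshold. The normal-approximation helper below and its example numbers are illustrative assumptions.

```python
# Sketch of the power computation: the meta-analytic effect size is taken
# as the true effect, and power is evaluated for a two-sample comparison
# at alpha = 0.05 (two-sided), via a normal approximation.
from scipy.stats import norm

def two_sample_power(d, n_per_group, alpha=0.05):
    """Approximate power for a two-sample comparison with effect size d."""
    se = (2.0 / n_per_group) ** 0.5          # SE of d-hat, unit-variance groups
    z_crit = norm.ppf(1 - alpha / 2)
    ncp = d / se                              # noncentrality
    return norm.sf(z_crit - ncp) + norm.cdf(-z_crit - ncp)

for n in (10, 25, 50, 200):
    print(f"n = {n:>3} per group, d = 0.30: "
          f"power = {two_sample_power(0.3, n):.2f}")
```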
Applying Bayesian statistics to the study of psychological trauma: A suggestion for future research.
Yalch, Matthew M
2016-03-01
Several contemporary researchers have noted the virtues of Bayesian methods of data analysis. Although debates continue about whether conventional or Bayesian statistics is the "better" approach for researchers in general, there are reasons why Bayesian methods may be well suited to the study of psychological trauma in particular. This article describes how Bayesian statistics offers practical solutions to the problems of data non-normality, small sample size, and missing data common in research on psychological trauma. After a discussion of these problems and the effects they have on trauma research, this article explains the basic philosophical and statistical foundations of Bayesian statistics and how it provides solutions to these problems using an applied example. Results of the literature review and the accompanying example indicate the utility of Bayesian statistics in addressing problems common in trauma research. Bayesian statistics provides a set of methodological tools and a broader philosophical framework that is useful for trauma researchers. Methodological resources are also provided so that interested readers can learn more. (c) 2016 APA, all rights reserved.
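A toy version of the machinery can make the small-sample point concrete. The conjugate normal-normal update below shows how prior information stabilizes the estimate of a mean from a handful of observations; all numbers are hypothetical, and the known-variance assumption is a deliberate simplification.

```python
# Toy illustration of a Bayesian update: a conjugate normal-normal model
# showing how a prior stabilizes the estimated mean of a small, noisy
# sample of the kind common in trauma research. (Hypothetical numbers.)
import numpy as np

# Prior for the mean symptom score, e.g. informed by earlier literature.
prior_mu, prior_sd = 40.0, 8.0

# Small new sample; observation SD assumed known for simplicity.
y = np.array([55.0, 48.0, 61.0, 35.0, 52.0])
sigma = 12.0

# Conjugate update: posterior precision = prior precision + data precision.
post_prec = 1 / prior_sd**2 + len(y) / sigma**2
post_mu = (prior_mu / prior_sd**2 + y.sum() / sigma**2) / post_prec
post_sd = post_prec ** -0.5

print(f"sample mean = {y.mean():.1f}")
print(f"posterior mean = {post_mu:.1f} +/- {post_sd:.1f}")
```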
NASA Technical Reports Server (NTRS)
Lodwick, G. D. (Principal Investigator)
1976-01-01
A digital computer and multivariate statistical techniques were used to analyze 4-band multispectral data. A representation of the original data for each of the four bands allows a certain degree of terrain interpretation; however, variations in the appearance of sites within and between bands, without additional criteria for deciding which representation should be preferred, create difficulties for classification. Investigation of the video data groups produced by principal components analysis and cluster analysis techniques shows that effective correlations with classifications of terrain produced by conventional methods could be carried out. The analyses also highlighted underlying relationships between the various elements. The approach used allows large areas (185 km by 185 km) to be classified into fundamental units within a matter of hours and can be applied to those parts of the Earth where facilities for conventional studies are poor or lacking.
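For illustration, the two multivariate techniques named above can be combined in a few lines. The sketch below runs principal components analysis followed by k-means clustering on synthetic 4-band pixels; the band values and cluster count are arbitrary assumptions, not the study's data.

```python
# PCA + cluster analysis on synthetic 4-band multispectral pixels,
# a minimal sketch of the workflow described above.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# 10,000 synthetic pixels x 4 spectral bands, two loose "terrain" groups
pixels = np.vstack([
    rng.normal([30, 45, 60, 80], 5, size=(5000, 4)),
    rng.normal([55, 50, 40, 25], 5, size=(5000, 4)),
])

components = PCA(n_components=2).fit_transform(pixels)  # decorrelate the bands
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(components)
print(np.bincount(labels))  # pixel count per terrain class
```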
Amidžić Klarić, Daniela; Klarić, Ilija; Mornar, Ana; Velić, Darko; Velić, Natalija
2015-08-01
This study presents data on the content of 21 minerals and heavy metals in 15 blackberry wines made from conventionally and organically grown blackberries. The objective of this study was to classify the blackberry wine samples based on their mineral composition and the applied cultivation method of the starting raw material by using chemometric analysis. The metal content of the Croatian blackberry wine samples was determined by AAS after dry ashing. The comparison between the organic and conventional groups of investigated blackberry wines showed a statistically significant difference in the concentrations of Si and Li, with the organic group containing higher concentrations of these elements. According to multivariate data analysis, the model based on the original metal content data set finally included seven original variables (K, Fe, Mn, Cu, Ba, Cd and Cr) and gave a satisfactory separation of the two applied cultivation methods of the starting raw material.
Westfall, Jacob; Kenny, David A; Judd, Charles M
2014-10-01
Researchers designing experiments in which a sample of participants responds to a sample of stimuli are faced with difficult questions about optimal study design. The conventional procedures of statistical power analysis fail to provide appropriate answers to these questions because they are based on statistical models in which stimuli are not assumed to be a source of random variation in the data, models that are inappropriate for experiments involving crossed random factors of participants and stimuli. In this article, we present new methods of power analysis for designs with crossed random factors, and we give detailed, practical guidance to psychology researchers planning experiments in which a sample of participants responds to a sample of stimuli. We extensively examine 5 commonly used experimental designs, describe how to estimate statistical power in each, and provide power analysis results based on a reasonable set of default parameter values. We then develop general conclusions and formulate rules of thumb concerning the optimal design of experiments in which a sample of participants responds to a sample of stimuli. We show that in crossed designs, statistical power typically does not approach unity as the number of participants goes to infinity but instead approaches a maximum attainable power value that is possibly small, depending on the stimulus sample. We also consider the statistical merits of designs involving multiple stimulus blocks. Finally, we provide a simple and flexible Web-based power application to aid researchers in planning studies with samples of stimuli.
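The central claim above, that power plateaus below one as participants increase while the stimulus sample stays fixed, can be checked with a small Monte Carlo simulation. The sketch below is a crude stand-in for the paper's analytic results: the variance components and effect size are arbitrary assumptions, and a one-sample t-test over stimulus means replaces a full mixed model.

```python
# Monte Carlo sketch of a crossed participants-by-stimuli design.
# Aggregating to stimulus means keeps the stimulus sample as a source of
# random variation, so power levels off as participants increase.
import numpy as np
from scipy.stats import ttest_1samp

rng = np.random.default_rng(1)
n_stimuli, effect, sd_stim, sd_subj, sd_err = 8, 0.3, 0.4, 0.3, 1.0

def simulate_power(n_participants, n_sims=2000, alpha=0.05):
    hits = 0
    for _ in range(n_sims):
        stim = rng.normal(0, sd_stim, n_stimuli)            # stimulus effects
        subj = rng.normal(0, sd_subj, (n_participants, 1))  # participant effects
        noise = rng.normal(0, sd_err, (n_participants, n_stimuli))
        scores = effect + stim + subj + noise               # crossed-design cells
        # average over participants and test the stimulus means against zero
        _, p = ttest_1samp(scores.mean(axis=0), 0.0)
        hits += p < alpha
    return hits / n_sims

for n in (20, 100, 500):
    print(f"{n:4d} participants -> power ~ {simulate_power(n):.2f}")
```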
Farjood, Ehsan; Vojdani, Mahroo; Torabi, Kiyanoosh; Khaledi, Amir Ali Reza
2017-01-01
Given the limitations of conventional waxing, computer-aided design and computer-aided manufacturing (CAD-CAM) technologies have been developed as alternative methods of making patterns. The purpose of this in vitro study was to compare the marginal and internal fit of metal copings derived from wax patterns fabricated by rapid prototyping (RP) with those created by the conventional handmade technique. Twenty-four standardized brass dies were milled and divided into 2 groups (n=12) according to the wax pattern fabrication method. The CAD-RP group was assigned as the experimental group, and the conventional group as the control group. The cross-sectional technique was used to assess the marginal and internal discrepancies at 15 points on the master die by using a digital microscope. An independent t test was used for statistical analysis (α=.01). The CAD-RP group had a total mean (±SD) absolute marginal discrepancy of 117.1 (±11.5) μm and a mean marginal discrepancy of 89.8 (±8.3) μm. The conventional group had an absolute marginal discrepancy of 88.1 (±10.7) μm and a mean marginal discrepancy of 69.5 (±15.6) μm. The overall mean (±SD) of the total internal discrepancy, separately calculated as the axial internal discrepancy and occlusal internal discrepancy, was 95.9 (±8.0) μm for the CAD-RP group and 76.9 (±10.2) μm for the conventional group. The independent t test results showed significant differences between the 2 groups. The CAD-RP group had larger discrepancies than the conventional group at all measured areas, a difference that was statistically significant (P<.01). Within the limitations of this in vitro study, the conventional method of wax pattern fabrication produced copings with better marginal and internal fit than the CAD-RP method. However, the marginal and internal fit for both groups was within clinically acceptable ranges. Copyright © 2016 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.
Damiani, Lucas Petri; Berwanger, Otavio; Paisani, Denise; Laranjeira, Ligia Nasi; Suzumura, Erica Aranha; Amato, Marcelo Britto Passos; Carvalho, Carlos Roberto Ribeiro; Cavalcanti, Alexandre Biasi
2017-01-01
Background The Alveolar Recruitment for Acute Respiratory Distress Syndrome Trial (ART) is an international multicenter randomized pragmatic controlled trial with allocation concealment involving 120 intensive care units in Brazil, Argentina, Colombia, Italy, Poland, Portugal, Malaysia, Spain, and Uruguay. The primary objective of ART is to determine whether maximum stepwise alveolar recruitment associated with PEEP titration, adjusted according to the static compliance of the respiratory system (ART strategy), is able to increase 28-day survival in patients with acute respiratory distress syndrome compared to conventional treatment (ARDSNet strategy). Objective To describe the data management process and statistical analysis plan. Methods The statistical analysis plan was designed by the trial executive committee and reviewed and approved by the trial steering committee. We provide an overview of the trial design with a special focus on describing the primary (28-day survival) and secondary outcomes. We describe our data management process, data monitoring committee, interim analyses, and sample size calculation. We describe our planned statistical analyses for primary and secondary outcomes as well as pre-specified subgroup analyses. We also provide details for presenting results, including mock tables for baseline characteristics, adherence to the protocol and effect on clinical outcomes. Conclusion According to best trial practice, we report our statistical analysis plan and data management plan prior to locking the database and beginning analyses. We anticipate that this document will prevent analysis bias and enhance the utility of the reported results. Trial registration ClinicalTrials.gov number, NCT01374022. PMID:28977255
[Role of BoBs technology in early missed abortion chorionic villi].
Li, Z Y; Liu, X Y; Peng, P; Chen, N; Ou, J; Hao, N; Zhou, J; Bian, X M
2018-05-25
Objective: To investigate the value of bacterial artificial chromosome-on-beads (BoBs) technology in the genetic analysis of early missed abortion chorionic villi. Methods: Early missed abortion chorionic villi were analyzed with both the conventional karyotyping method and BoBs technology at Peking Union Medical Hospital from July 2014 to March 2015. The results of BoBs were compared with those of conventional karyotyping to evaluate the sensitivity, specificity and accuracy of the new method. Results: (1) A total of 161 samples were tested successfully with BoBs and 131 samples were tested successfully with conventional karyotyping. (2) Results were obtained in (2.7±0.6) days with BoBs and in (22.5±1.9) days with conventional karyotyping, a statistically significant difference (t=123.315, P<0.01). (3) Of the 161 cases tested with BoBs, 85 (52.8%, 85/161) had abnormal chromosomes, including 79 cases of chromosome number abnormality, 4 cases of chromosome segment deletion, and 2 cases of mosaicism. Of the 131 cases tested successfully with conventional karyotyping, 79 (60.3%, 79/131) had abnormal chromosomes, including 62 cases of chromosome number abnormality and 17 cases of other chromosome abnormalities; the rates of chromosome abnormality detected by the two methods showed no significant difference (P=0.198). (4) With the conventional karyotyping results serving as the gold standard, the accuracy of BoBs for abnormal chromosomes was 82.4% (108/131); when the analysis was restricted to the normal chromosomes (52 cases) and chromosome number abnormalities (62 cases) identified by conventional karyotyping, the accuracy of BoBs for chromosome number abnormality was 94.7% (108/114). Conclusion: BoBs is a rapid, reliable and easily operated method for detecting chromosomal abnormalities in early missed abortion chorionic villi.
Dasgupta, Aritra; Lee, Joon-Yong; Wilson, Ryan; Lafrance, Robert A; Cramer, Nick; Cook, Kristin; Payne, Samuel
2017-01-01
Combining interactive visualization with automated analytical methods like statistics and data mining facilitates data-driven discovery. These visual analytic methods are beginning to be instantiated within mixed-initiative systems, where humans and machines collaboratively influence evidence-gathering and decision-making. But an open research question remains: when domain experts analyze their data, can they completely trust the outputs and operations on the machine side? Visualization potentially leads to a transparent analysis process, but do domain experts always trust what they see? To address these questions, we present results from the design and evaluation of a mixed-initiative visual analytics system for biologists, focusing on the relationships between the familiarity of an analysis medium and domain experts' trust. We propose a trust-augmented design of the visual analytics system that explicitly takes into account domain-specific tasks, conventions, and preferences. For evaluating the system, we present the results of a controlled user study with 34 biologists in which we compare the variation of the level of trust across conventional and visual analytic mediums and explore the influence of familiarity and task complexity on trust. We find that despite being unfamiliar with a visual analytic medium, scientists seem to have an average level of trust that is comparable with that in a conventional analysis medium. In fact, for complex sense-making tasks, we find that the visual analytic system is able to inspire greater trust than other mediums. We summarize the implications of our findings with directions for future research on the trustworthiness of visual analytic systems.
NASA Astrophysics Data System (ADS)
Zorin, A. B.
1985-03-01
In the present quantum-statistical analysis of SIS heterodyne mixer performance, the conventional three-port model of the mixer circuit and the microscopic theory of superconducting tunnel junctions are used to derive a general expression for a noise parameter previously used for the case of parametric amplifiers. This expression is numerically evaluated for various quasiparticle current step widths, dc bias voltages, local oscillator powers, signal frequencies, signal source admittances, and operation temperatures.
Bashir, Saba; Qamar, Usman; Khan, Farhan Hassan
2015-06-01
Conventional clinical decision support systems are based on individual classifiers or simple combinations of these classifiers, which tend to show moderate performance. This research paper presents a novel classifier ensemble framework based on an enhanced bagging approach with a multi-objective weighted voting scheme for prediction and analysis of heart disease. The proposed model overcomes the limitations of conventional approaches by utilizing an ensemble of five heterogeneous classifiers: Naïve Bayes, linear regression, quadratic discriminant analysis, instance-based learner and support vector machines. Five different datasets are used for experimentation, evaluation and validation. The datasets are obtained from publicly available data repositories. The effectiveness of the proposed ensemble is investigated by comparison of results with several classifiers. Prediction results of the proposed ensemble model are assessed by ten-fold cross validation and ANOVA statistics. The experimental evaluation shows that the proposed framework deals with all types of attributes and achieved high diagnostic accuracy of 84.16%, with 93.29% sensitivity, 96.70% specificity, and an 82.15% f-measure. An f-ratio higher than the f-critical value and a p value less than 0.05 at the 95% confidence level indicate that the results are statistically significant for most of the datasets.
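A heterogeneous voting ensemble of this general shape can be assembled directly in scikit-learn. The sketch below loosely mirrors the paper's five base learners (logistic regression standing in for its linear model) on a public stand-in dataset; the voting weights are illustrative and not the paper's multi-objective weighting scheme.

```python
# Heterogeneous soft-voting ensemble, a sketch in the spirit of the
# framework above. Dataset and weights are stand-ins, not the paper's.
from sklearn.datasets import load_breast_cancer
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)  # public stand-in dataset

ensemble = VotingClassifier(
    estimators=[
        ("nb", GaussianNB()),
        ("lr", LogisticRegression(max_iter=5000)),
        ("qda", QuadraticDiscriminantAnalysis()),
        ("knn", KNeighborsClassifier(n_neighbors=5)),
        ("svm", SVC(probability=True)),
    ],
    voting="soft",
    weights=[1, 2, 1, 1, 2],  # illustrative weights only
)
print("10-fold CV accuracy:", cross_val_score(ensemble, X, y, cv=10).mean())
```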
Neither fixed nor random: weighted least squares meta-regression.
Stanley, T D; Doucouliagos, Hristos
2017-03-01
Our study revisits and challenges two core conventional meta-regression estimators: the prevalent use of 'mixed-effects' or random-effects meta-regression analysis and the correction of standard errors that defines fixed-effects meta-regression analysis (FE-MRA). We show how and explain why an unrestricted weighted least squares MRA (WLS-MRA) estimator is superior to conventional random-effects (or mixed-effects) meta-regression when there is publication (or small-sample) bias, is as good as FE-MRA in all cases, and is better than fixed effects in most practical applications. Simulations and statistical theory show that WLS-MRA provides satisfactory estimates of meta-regression coefficients that are practically equivalent to mixed effects or random effects when there is no publication bias. When there is publication selection bias, WLS-MRA always has smaller bias than mixed effects or random effects. In practical applications, an unrestricted WLS meta-regression is likely to give practically equivalent or superior estimates to fixed-effects, random-effects, and mixed-effects meta-regression approaches. However, random-effects meta-regression remains viable and perhaps somewhat preferable if selection for statistical significance (publication bias) can be ruled out and when random, additive normal heterogeneity is known to directly affect the 'true' regression coefficient. Copyright © 2016 John Wiley & Sons, Ltd.
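The unrestricted WLS estimator discussed above amounts to a weighted regression with inverse-variance weights and a freely estimated multiplicative scale. The sketch below illustrates this with statsmodels on fabricated study-level data; none of the numbers come from the paper.

```python
# Unrestricted WLS meta-regression: weights are 1/SE^2 and the variance
# scale is estimated from the data rather than fixed at 1. Data invented.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
k = 40                                    # number of studies
se = rng.uniform(0.05, 0.4, k)            # reported standard errors
moderator = rng.normal(0, 1, k)           # a study-level moderator
effects = 0.2 + 0.1 * moderator + rng.normal(0, se)  # observed effects

X = sm.add_constant(moderator)
wls = sm.WLS(effects, X, weights=1.0 / se**2).fit()
print(wls.params, wls.bse)                # WLS-MRA coefficients and SEs
```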
13C NMR spectroscopic analysis of poly(electrolyte) cement liquids.
Watts, D C
1979-05-01
13C NMR spectroscopy has been applied to the analysis of carboxylic poly-acid cement liquids. Monomer incorporation, composition ratio, sequence statistics, and stereochemical configuration have been considered theoretically, and determined experimentally, from the spectra. Conventionally polymerized poly(acrylic acid) has an approximately random configuration, but other varieties may be synthesized. Two commercial glass-ionomer cement liquids both contain tartaric acid as a chelating additive, but the compositions of their poly-acids differ. Itaconic acid units, distributed randomly, constitute 21% of the repeating units in one of these polyelectrolytes.
The application of satellite data in monitoring strip mines
NASA Technical Reports Server (NTRS)
Sharber, L. A.; Shahrokhi, F.
1977-01-01
Strip mines in the New River Drainage Basin of Tennessee were studied through use of Landsat-1 imagery and aircraft photography. A multilevel analysis, involving conventional photo interpretation techniques, densitometric methods, multispectral analysis and statistical testing, was applied to the data. The Landsat imagery proved adequate for monitoring large-scale change resulting from active mining and land-reclamation projects. However, the spatial resolution of the satellite imagery rendered it inadequate for assessment of many smaller strip mines in the region, which may be as small as a few hectares.
ICAP - An Interactive Cluster Analysis Procedure for analyzing remotely sensed data
NASA Technical Reports Server (NTRS)
Wharton, S. W.; Turner, B. J.
1981-01-01
An Interactive Cluster Analysis Procedure (ICAP) was developed to derive classifier training statistics from remotely sensed data. ICAP differs from conventional clustering algorithms by allowing the analyst to optimize the cluster configuration by inspection, rather than by manipulating process parameters. Control of the clustering process alternates between the algorithm, which creates new centroids and forms clusters, and the analyst, who can evaluate and elect to modify the cluster structure. Clusters can be deleted, or lumped together pairwise, or new centroids can be added. A summary of the cluster statistics can be requested to facilitate cluster manipulation. The principal advantage of this approach is that it allows prior information (when available) to be used directly in the analysis, since the analyst interacts with ICAP in a straightforward manner, using basic terms with which he is more likely to be familiar. Results from testing ICAP showed that an informed use of ICAP can improve classification, as compared to an existing cluster analysis procedure.
Win, Khin Thanda; Vegas, Juan; Zhang, Chunying; Song, Kihwan; Lee, Sanghyeob
2017-01-01
QTL mapping using NGS-assisted BSA was successfully applied to an F2 population for downy mildew resistance in cucumber. QTLs detected by NGS-assisted BSA were confirmed by conventional QTL analysis. Downy mildew (DM), caused by Pseudoperonospora cubensis, is one of the most destructive foliar diseases in cucumber. QTL mapping is a fundamental approach for understanding the genetic inheritance of DM resistance in cucumber. Recently, many studies have reported that a combination of bulked segregant analysis (BSA) and next-generation sequencing (NGS) can be a rapid and cost-effective way of mapping QTLs. In this study, we applied NGS-assisted BSA to QTL mapping of DM resistance in cucumber and confirmed the results by conventional QTL analysis. By sequencing two DNA pools, each consisting of ten individuals showing high resistance or susceptibility to DM from an F2 population, we identified single nucleotide polymorphisms (SNPs) between the two pools. We employed a statistical method for QTL mapping based on these SNPs. Five QTLs, dm2.2, dm4.1, dm5.1, dm5.2, and dm6.1, were detected, and dm2.2 showed the largest effect on DM resistance. Conventional QTL analysis using the F2 confirmed dm2.2 (R2 = 10.8-24%) and dm5.2 (R2 = 14-27.2%) as two major QTLs and dm4.1 (R2 = 8%) as a minor QTL, but could not detect dm5.1 and dm6.1. A new QTL on chromosome 2, dm2.1 (R2 = 28.2%), was detected by the conventional QTL method using an F3 population. This study demonstrated the effectiveness of NGS-assisted BSA for mapping QTLs conferring DM resistance in cucumber and revealed the unique genetic inheritance of DM resistance in this population through two distinct major QTLs on chromosome 2 that mainly harbor DM resistance.
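The core statistic in NGS-assisted BSA is simple enough to sketch: a SNP-index (alternate-allele read fraction) per bulk and the difference between bulks. The read counts below are fabricated, and real pipelines add sliding windows and confidence intervals.

```python
# Delta SNP-index between resistant and susceptible bulks, the basic
# NGS-assisted BSA statistic. Read counts are fabricated.
import numpy as np

# alt/total read counts at 5 SNPs for resistant (R) and susceptible (S) bulks
alt_r = np.array([18, 9, 25, 5, 22])
tot_r = np.array([30, 28, 31, 27, 30])
alt_s = np.array([4, 11, 6, 13, 5])
tot_s = np.array([29, 30, 28, 31, 27])

idx_r = alt_r / tot_r            # SNP-index in the resistant bulk
idx_s = alt_s / tot_s            # SNP-index in the susceptible bulk
delta = idx_r - idx_s            # large |delta| flags candidate QTL SNPs
print(np.round(delta, 2))
```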
Developing Statistical Evaluation Model of Introduction Effect of MSW Thermal Recycling
NASA Astrophysics Data System (ADS)
Aoyama, Makoto; Kato, Takeyoshi; Suzuoki, Yasuo
For the effective utilization of municipal solid waste (MSW) through thermal recycling, new technologies, such as an incineration plant using a Molten Carbonate Fuel Cell (MCFC), are being developed. The impact of new technologies should be evaluated statistically for various municipalities, so that the target of technological development or the potential cost reduction due to the increased cumulative number of installed systems can be discussed. For this purpose, we developed a model for discussing the impact of new technologies, in which a statistical mesh data set was utilized to estimate the heat demand around the incineration plant. This paper examines a case study using the developed model, in which a conventional-type and an MCFC-type MSW incineration plant are compared in terms of the reduction in primary energy and the revenue from both electricity and heat supply. Based on the difference in annual revenue, we calculate the allowable investment in an MCFC-type MSW incineration plant in addition to that of a conventional plant. The results suggest that the allowable investment can be about 30 million yen/(t/day) in small municipalities, while it is only 10 million yen/(t/day) in large municipalities. The sensitivity analysis shows the model can be useful for discussing the differing impacts of material recycling of plastics on thermal recycling technologies.
Keystroke dynamics in the pre-touchscreen era
Ahmad, Nasir; Szymkowiak, Andrea; Campbell, Paul A.
2013-01-01
Biometric authentication seeks to measure an individual’s unique physiological attributes for the purpose of identity verification. Conventionally, this task has been realized via analyses of fingerprints or signature/iris patterns. However, whilst such methods effectively offer a superior security protocol compared with password-based approaches for example, their substantial infrastructure costs, and intrusive nature, make them undesirable and indeed impractical for many scenarios. An alternative approach seeks to develop similarly robust screening protocols through analysis of typing patterns, formally known as keystroke dynamics. Here, keystroke analysis methodologies can utilize multiple variables, and a range of mathematical techniques, in order to extract individuals’ typing signatures. Such variables may include measurement of the period between key presses, and/or releases, or even key-strike pressures. Statistical methods, neural networks, and fuzzy logic have often formed the basis for quantitative analysis on the data gathered, typically from conventional computer keyboards. Extension to more recent technologies such as numerical keypads and touch-screen devices is in its infancy, but obviously important as such devices grow in popularity. Here, we review the state of knowledge pertaining to authentication via conventional keyboards with a view toward indicating how this platform of knowledge can be exploited and extended into the newly emergent type-based technological contexts. PMID:24391568
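Two of the timing variables mentioned above, dwell time and flight time, are easy to extract from key events. The sketch below pairs them with a naive z-score verification rule; the event data, enrolled profile, and acceptance threshold are all hypothetical.

```python
# Dwell time (key held down) and flight time (release to next press)
# features with a naive statistical verification rule. All data invented.
import numpy as np

# (key, press_time_ms, release_time_ms) for one typing sample
events = [("p", 0, 95), ("a", 180, 260), ("s", 340, 430), ("s", 520, 600)]

def features(evts):
    dwell = [r - p for _, p, r in evts]
    flight = [evts[i + 1][1] - evts[i][2] for i in range(len(evts) - 1)]
    return np.array(dwell + flight, dtype=float)

# enrolled profile: mean/std of features over hypothetical past samples
profile_mean = np.array([90, 85, 88, 82, 80, 85, 92], dtype=float)
profile_std = np.array([12, 10, 11, 9, 15, 14, 16], dtype=float)

z = np.abs(features(events) - profile_mean) / profile_std  # per-feature z-scores
print("accept" if z.mean() < 1.5 else "reject")  # threshold is an assumption
```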
Postoperative discomfort after Nd:YAG laser and conventional frenectomy: comparison of both genders.
Akpınar, A; Toker, H; Lektemur Alpan, A; Çalışır, M
2016-03-01
Evidence has suggested that males and females experience and report pain differently. The aim of this study was to determine the postoperative perception levels of both females and males after neodymium-doped yttrium aluminum garnet (Nd:YAG) laser frenectomy and conventional frenectomy, and to compare the perceptions between genders. Eighty-nine patients requiring frenectomy were randomly assigned to treatment with either conventional frenectomy or the Nd:YAG laser. Postoperative discomfort (pain, chewing, talking) was recorded using a visual analog scale (VAS) on the operation day and postoperative days 1, 3, 7 and 10. According to the female VAS scores, pain, chewing and speaking discomfort were statistically higher in the conventional group than in the laser group on the operation day and on the first and third postoperative days. Pain discomfort in males was statistically higher in the conventional group than in the laser group on the operation day. Speaking discomfort in males was statistically higher in the conventional group than in the laser group on the operation day and the first postoperative day. The present study indicated that, up to the seventh postoperative day, Nd:YAG laser treatment used for frenectomies provides better postoperative comfort than the conventional procedure for both genders, especially in females, in terms of pain, chewing and speaking. According to our results, the Nd:YAG laser may provide safe, bloodless, painless surgery and an impressive alternative for frenectomy operations. © 2015 Australian Dental Association.
Shook, Corey; Kim, Sohyon Michelle; Burnheimer, John
2016-07-01
To evaluate the effect of Damon self-ligating and conventional bracket systems on buccal corridor widths and areas. A retrospective sample of consecutively treated patients using either conventional (CG, n = 45) or Damon self-ligating (SL, n = 39) brackets was analyzed to determine any differences in buccal corridor widths and areas both within and between groups. Pretreatment and posttreatment frontal photographs were transferred to Photoshop CC, standardized using intercanthal width, and linear and area measurements were performed with tools in Photoshop CC. Ratios were then calculated for statistical analysis. Relationships between arch widths and buccal corridors were also examined. There were no significant differences in the posttreatment intercanine or intermolar widths either within or between the CG and SL groups. There were no significant differences in any buccal corridor width or area measurement either within or between the CG and SL groups. There were strong correlations with the intercanine width and the corresponding buccal corridor smile width measurements. There was an inverse correlation with the buccal corridor area in relation to the canine and the total smile width. It is likely that posttreatment increases in arch width can be seen in patients treated with either a conventional bracket system or the Damon system. It is highly unlikely that there is any significant difference in buccal corridor width or area in patients treated with the Damon self-ligating system or a conventional bracket system.
Excoffier, L; Smouse, P E; Quattro, J M
1992-06-01
We present here a framework for the study of molecular variation within a single species. Information on DNA haplotype divergence is incorporated into an analysis of variance format, derived from a matrix of squared-distances among all pairs of haplotypes. This analysis of molecular variance (AMOVA) produces estimates of variance components and F-statistic analogs, designated here as phi-statistics, reflecting the correlation of haplotypic diversity at different levels of hierarchical subdivision. The method is flexible enough to accommodate several alternative input matrices, corresponding to different types of molecular data, as well as different types of evolutionary assumptions, without modifying the basic structure of the analysis. The significance of the variance components and phi-statistics is tested using a permutational approach, eliminating the normality assumption that is conventional for analysis of variance but inappropriate for molecular data. Application of AMOVA to human mitochondrial DNA haplotype data shows that population subdivisions are better resolved when some measure of molecular differences among haplotypes is introduced into the analysis. At the intraspecific level, however, the additional information provided by knowing the exact phylogenetic relations among haplotypes or by a nonlinear translation of restriction-site change into nucleotide diversity does not significantly modify the inferred population genetic structure. Monte Carlo studies show that site sampling does not fundamentally affect the significance of the molecular variance components. The AMOVA treatment is easily extended in several different directions and it constitutes a coherent and flexible framework for the statistical analysis of molecular data.
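The variance partition behind AMOVA can be sketched from nothing more than a matrix of squared distances and group labels. The code below computes the among-population fraction of the total sum of squares and tests it by permuting labels; it is a simplified phi-statistic analog, not the full hierarchical AMOVA.

```python
# Simplified AMOVA-style permutation test: partition squared distances
# into among/within components and permute population labels.
import numpy as np

rng = np.random.default_rng(3)

def ss_from_d2(d2):
    # sum of squares from pairwise squared distances (each pair once)
    n = d2.shape[0]
    return d2[np.triu_indices(n, k=1)].sum() / n

def among_fraction(d2, labels):
    ss_total = ss_from_d2(d2)
    ss_within = sum(ss_from_d2(d2[np.ix_(idx, idx)])
                    for g in np.unique(labels)
                    for idx in [np.where(labels == g)[0]])
    return (ss_total - ss_within) / ss_total

# toy data: 2 populations, 10 haplotypes each, in a 1-D "sequence space"
x = np.concatenate([rng.normal(0, 1, 10), rng.normal(1.5, 1, 10)])
d2 = (x[:, None] - x[None, :]) ** 2
labels = np.repeat([0, 1], 10)

observed = among_fraction(d2, labels)
perms = [among_fraction(d2, rng.permutation(labels)) for _ in range(999)]
p = (1 + sum(f >= observed for f in perms)) / 1000  # permutation P-value
print(f"among-population fraction = {observed:.3f}, P = {p:.3f}")
```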
Cochand-Priollet, Béatrix; Cartier, Isabelle; de Cremoux, Patricia; Le Galès, Catherine; Ziol, Marianne; Molinié, Vincent; Petitjean, Alain; Dosda, Anne; Merea, Estelle; Biaggi, Annonciade; Gouget, Isabelle; Arkwright, Sylviane; Vacher-Lavenu, Marie-Cécile; Vielh, Philippe; Coste, Joël
2005-11-01
Many articles concerning conventional Pap smears, ThinPrep liquid-based cytology (LBC) and the Hybrid-Capture II HPV test (HC II) have been published. This study, carried out by the French Society of Clinical Cytology, is noteworthy for several reasons: it was financially independent; it compared the efficiency of the conventional Pap smear and LBC, and of the conventional Pap smear and HC II, and included an economic study based on real costs; for all the women, a "gold standard" reference method, colposcopy, was available and biopsies were performed whenever a lesion was detected; and the conventional Pap smear, the LBC (split-sample technique), the colposcopy, and the biopsies were done at the same time. This study included 2,585 women divided into two groups: group A, a high-risk population, and group B, a screening population. The statistical analysis of the results showed that conventional Pap smears consistently had superior or equivalent sensitivity and specificity compared with LBC for lesions at the threshold of CIN-I (Cervical Intraepithelial Neoplasia) or CIN-II or higher. It underlined the low specificity of the HC II. Finally, the LBC mean cost was never covered by the Social Security tariff.
Sun, Xing; Li, Xiaoyun; Chen, Cong; Song, Yang
2013-01-01
The frequent rise of interval-censored time-to-event data in randomized clinical trials (e.g., progression-free survival [PFS] in oncology) challenges statistical researchers in the pharmaceutical industry in various ways. These challenges exist in both trial design and data analysis. Conventional statistical methods treating intervals as fixed points, which are generally practiced by the pharmaceutical industry, sometimes yield inferior or even flawed analysis results for interval-censored data in extreme cases. In this article, we examine the limitations of these standard methods under typical clinical trial settings and further review and compare several existing nonparametric likelihood-based methods for interval-censored data, methods that are more sophisticated but robust. Trial design issues involved with interval-censored data comprise another topic explored in this article. Unlike right-censored survival data, the expected sample size or power for a trial with interval-censored data relies heavily on the parametric distribution of the baseline survival function as well as the frequency of assessments. There can be substantial power loss in trials with interval-censored data if the assessments are very infrequent. Such an additional dependency controverts many fundamental assumptions and principles in conventional survival trial designs, especially the group sequential design (e.g., the concept of information fraction). In this article, we discuss these fundamental changes and the available tools to work around their impacts. Although progression-free survival is often used as a discussion point in the article, the general conclusions are equally applicable to other interval-censored time-to-event endpoints.
Jonsen, Ian D; Myers, Ransom A; James, Michael C
2006-09-01
1. Biological and statistical complexity are features common to most ecological data that hinder our ability to extract meaningful patterns using conventional tools. Recent work on implementing modern statistical methods for analysis of such ecological data has focused primarily on population dynamics but other types of data, such as animal movement pathways obtained from satellite telemetry, can also benefit from the application of modern statistical tools. 2. We develop a robust hierarchical state-space approach for analysis of multiple satellite telemetry pathways obtained via the Argos system. State-space models are time-series methods that allow unobserved states and biological parameters to be estimated from data observed with error. We show that the approach can reveal important patterns in complex, noisy data where conventional methods cannot. 3. Using the largest Atlantic satellite telemetry data set for critically endangered leatherback turtles, we show that the diel pattern in travel rates of these turtles changes over different phases of their migratory cycle. While foraging in northern waters the turtles show similar travel rates during day and night, but on their southward migration to tropical waters travel rates are markedly faster during the day. These patterns are generally consistent with diving data, and may be related to changes in foraging behaviour. Interestingly, individuals that migrate southward to breed generally show higher daytime travel rates than individuals that migrate southward in a non-breeding year. 4. Our approach is extremely flexible and can be applied to many ecological analyses that use complex, sequential data.
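The essence of the state-space idea above, estimating an unobserved state from observations with error, can be shown with a scalar Kalman filter. The sketch below filters a random-walk track observed with noise as a crude stand-in for Argos positions; it is not the paper's hierarchical Bayesian model, and all variances are assumed.

```python
# Scalar Kalman filter for a random-walk state observed with error,
# a minimal state-space sketch. Variances and data are synthetic.
import numpy as np

rng = np.random.default_rng(9)
n, q, r = 200, 0.05, 1.0           # steps, process var, observation var
truth = np.cumsum(rng.normal(0, np.sqrt(q), n))  # unobserved track
obs = truth + rng.normal(0, np.sqrt(r), n)       # Argos-like noisy fixes

x, p = 0.0, 1.0                    # state estimate and its variance
est = np.zeros(n)
for i in range(n):
    p += q                          # predict: variance grows by process noise
    k = p / (p + r)                 # Kalman gain
    x += k * (obs[i] - x)           # update with the new observation
    p *= (1 - k)
    est[i] = x

print("obs RMSE :", np.sqrt(np.mean((obs - truth) ** 2)))
print("filt RMSE:", np.sqrt(np.mean((est - truth) ** 2)))
```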
Hulshof, Tessa A; Zuidema, Sytse U; Ostelo, Raymond W J G; Luijendijk, Hendrika J
2015-10-01
Numerous observational studies have reported an increased risk of mortality for conventional antipsychotics in elderly patients, and for haloperidol in particular. Subsequently, health authorities have warned against use of conventional antipsychotics in dementia. Experimental evidence is lacking. To assess the mortality risk of conventional antipsychotics in elderly patients with a meta-analysis of trials. Original studies were identified in electronic databases, online trial registers, and hand-searched references of published reviews. Two investigators found 28 potentially eligible studies, and they selected 17 randomized placebo-controlled trials in elderly patients with dementia, delirium, or a high risk of delirium. Two investigators independently abstracted trial characteristics and deaths, and 3 investigators assessed the risk of bias. Deaths were pooled with RevMan to obtain risk differences and risk ratios. Data of 17 trials with a total of 2387 participants were available. Thirty-two deaths occurred. The pooled risk difference of 0.1% was not statistically significant (95% confidence interval (CI) -1.0%-1.2%). The risk ratio was 1.07 (95% CI 0.54-2.13). Eleven of 17 trials tested haloperidol (n = 1799). The risk difference was 0.4% (95% CI -0.9%-1.6%), the risk ratio was 1.25 (95% CI 0.59-2.65). This meta-analysis of placebo-controlled randomized trials does not show that conventional antipsychotics in general or haloperidol in particular increase the risk of mortality in elderly patients. It questions the observational findings and the warning based on these findings. Copyright © 2015 AMDA – The Society for Post-Acute and Long-Term Care Medicine. Published by Elsevier Inc. All rights reserved.
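Pooled risk differences of the kind reported above are typically computed by inverse-variance weighting (the paper used RevMan). The sketch below shows the calculation on three fabricated trials; the numbers are not those of the 17 trials in the meta-analysis.

```python
# Fixed-effect inverse-variance pooling of per-trial risk differences.
# The three trials below are fabricated for illustration only.
import numpy as np

# (deaths_drug, n_drug, deaths_placebo, n_placebo) per hypothetical trial
trials = [(3, 120, 2, 118), (1, 80, 2, 82), (4, 200, 3, 195)]

rd, var = [], []
for d1, n1, d0, n0 in trials:
    p1, p0 = d1 / n1, d0 / n0
    rd.append(p1 - p0)
    var.append(p1 * (1 - p1) / n1 + p0 * (1 - p0) / n0)

w = 1.0 / np.array(var)
pooled = np.sum(w * np.array(rd)) / w.sum()
se = np.sqrt(1.0 / w.sum())
print(f"pooled RD = {pooled:.4f}, 95% CI = "
      f"({pooled - 1.96 * se:.4f}, {pooled + 1.96 * se:.4f})")
```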
Comparison of ThinPrep and conventional preparations on fine needle aspiration cytology material.
Dey, P; Luthra, U K; George, J; Zuhairy, F; George, S S; Haji, B I
2000-01-01
To compare the various cytologic features on ThinPrep 2000 (TP) (Cytyc Corporation, Marlborough, Massachusetts, U.S.A.) and conventional preparation (CP) specimens from fine needle aspiration cytology (FNAC) material by a semiquantitative scoring system. In this prospective study a total of 71 consecutive cases were included. In each case, two passes were performed. The first pass was used for conventional preparations, with direct smears made and fixed immediately in 95% alcohol for Papanicolaou staining. For the TP preparation, a second pass produced material for processing in the ThinPrep 2000. The TP and CP slides were studied independently by two observers, and representative slides of CP and TP were compared for cellularity, background blood and necrotic cell debris, cell architecture, informative background, presence of monolayer cells, and nuclear and cytoplasmic details by a semiquantitative scoring system. Statistical analysis was performed by Wilcoxon's signed rank test on an SPSS program (Chicago, Illinois, U.S.A.). TP preparations contained adequate diagnostic cells in all cases and were tangibly superior to CP preparations concerning monolayer cells, absence of blood and necrosis, and preservation of nuclear and cytoplasmic detail (statistically significant, Wilcoxon's signed rank test, P < .001). TP preparations are superior to conventional preparations with regard to clear background, monolayer cell preparation and cell preservation. It is easier and less time-consuming to screen and interpret TP preparations because the cells are limited to smaller areas on clear backgrounds, with excellent cellular preservation. However, TP preparations are more expensive than CP and require some experience for interpretation.
Reuschel, Anna; Bogatsch, Holger; Barth, Thomas; Wiedemann, Renate
2010-11-01
To compare the intraoperative and postoperative outcomes of conventional longitudinal phacoemulsification and torsional phacoemulsification. Department of Ophthalmology, University of Leipzig, Germany. Randomized single-center clinical trial. Eyes with senile cataract were randomized to have phacoemulsification using the Infiniti Vision System and the torsional mode (OZil) or conventional longitudinal mode. Primary outcomes were corrected distance visual acuity (CDVA) and central endothelial cell density (ECD), calculated according to the Conference on Harmonisation-E9 Guidelines in which missing values were substituted by the median in each group (primary analysis) and the loss was then calculated using actual data (secondary analysis). Secondary outcomes were ultrasound (US) time, cumulative dissipated energy (CDE), and percentage total equivalent power in position 3. Postoperative follow-up was at 3 months. The mean preoperative CDVA was 0.41 logMAR in the torsional group and 0.38 logMAR in the longitudinal group, improving to 0.07 logMAR postoperatively in both groups. The mean ECD loss was 7.2% ± 4.6% in the torsional group (72 patients) and 7.1% ± 4.4% in the longitudinal group (76 patients), with no statistically significant differences in the primary analysis (P = .342) or secondary analysis (P = .906). The mean US time, CDE, and percentage total equivalent power in position 3 were statistically significantly lower in the torsional group (98 patients) than in the longitudinal group (94 patients) (P<.001). The torsional mode was as safe as the longitudinal mode in phacoemulsification for age-related cataract. Copyright © 2010 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
Park, Yoonah; Yong, Yuen Geng; Jung, Kyung Uk; Huh, Jung Wook; Cho, Yong Beom; Kim, Hee Cheol; Lee, Woo Yong; Chun, Ho-Kyung
2015-01-01
Purpose This study aimed to compare the learning curves and early postoperative outcomes for conventional laparoscopic (CL) and single incision laparoscopic (SIL) right hemicolectomy (RHC). Methods This retrospective study included the initial 35 cases in each group. Learning curves were evaluated by the moving average of operative time, mean operative time of every five consecutive cases, and cumulative sum (CUSUM) analysis. The learning phase was considered overcome when the moving average of operative times reached a plateau, and when the mean operative time of every five consecutive cases reached a low point and subsequently did not vary by more than 30 minutes. Results Six patients with missing data in the CL RHC group were excluded from the analyses. According to the mean operative time of every five consecutive cases, learning phase of SIL and CL RHC was completed between 26 and 30 cases, and 16 and 20 cases, respectively. Moving average analysis revealed that approximately 31 (SIL) and 25 (CL) cases were needed to complete the learning phase, respectively. CUSUM analysis demonstrated that 10 (SIL) and two (CL) cases were required to reach a steady state of complication-free performance, respectively. Postoperative complications rate was higher in SIL than in CL group, but the difference was not statistically significant (17.1% vs. 3.4%). Conclusion The learning phase of SIL RHC is longer than that of CL RHC. Early oncological outcomes of both techniques were comparable. However, SIL RHC had a statistically insignificant higher complication rate than CL RHC during the learning phase. PMID:25960990
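The moving-average and CUSUM summaries used above are one-liners in practice. The sketch below computes both for a fabricated series of operative times; a plateau in the moving average, or a downward turn in the CUSUM, is read as the end of the learning phase.

```python
# Learning-curve summaries: moving average of operative time and CUSUM
# of deviations from the overall mean. Operative times are fabricated.
import numpy as np

times = np.array([210, 195, 220, 185, 190, 170, 175, 160,
                  165, 150, 155, 148, 152, 145, 150], dtype=float)

window = 5
moving_avg = np.convolve(times, np.ones(window) / window, mode="valid")
cusum = np.cumsum(times - times.mean())   # classic CUSUM of deviations

print("moving average:", np.round(moving_avg, 1))
print("CUSUM         :", np.round(cusum, 1))
```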
Adaptive filtering in biological signal processing.
Iyer, V K; Ploysongsang, Y; Ramamoorthy, P A
1990-01-01
The high dependence of conventional optimal filtering methods on a priori knowledge of the signal and noise statistics renders them ineffective in dealing with signals whose statistics cannot be predetermined accurately. Adaptive filtering methods offer a better alternative, since a priori knowledge of statistics is less critical, real-time processing is possible, and the computations are less expensive for this approach. Adaptive filtering methods compute the filter coefficients "on-line", converging to the optimal values in the least-mean-square (LMS) error sense. Adaptive filtering is therefore apt for dealing with the "unknown" statistics situation and has been applied extensively in areas like communication, speech, radar, sonar, seismology, and biological signal processing and analysis for channel equalization, interference and echo canceling, line enhancement, signal detection, system identification, spectral analysis, beamforming, modeling, control, etc. In this review article, adaptive filtering in the context of biological signals is reviewed. An intuitive approach to the underlying theory of adaptive filters and its applicability are presented. Applications of the principles in biological signal processing are discussed in a manner that brings out the key ideas involved. Current and potential future directions in adaptive biological signal processing are also discussed.
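A minimal example of the LMS adaptation described above is an adaptive noise canceller: a reference noise channel is filtered adaptively and subtracted from the primary signal. Everything below (signal, noise path, step size) is synthetic.

```python
# LMS adaptive noise canceller: the reference noise is filtered adaptively
# and subtracted from the primary channel; coefficients converge in the
# least-mean-square sense. All signals and parameters are synthetic.
import numpy as np

rng = np.random.default_rng(4)
n, taps, mu = 5000, 8, 0.01

t = np.arange(n)
signal = np.sin(2 * np.pi * t / 100)             # e.g. a biological rhythm
ref_noise = rng.normal(0, 1, n)                  # reference noise channel
# noise reaching the primary channel is a filtered version of the reference
primary = signal + np.convolve(ref_noise, [0.6, -0.3, 0.1], mode="same")

w = np.zeros(taps)
out = np.zeros(n)
for i in range(taps, n):
    x = ref_noise[i - taps:i][::-1]   # most recent reference samples
    e = primary[i] - w @ x            # error = cleaned output sample
    w += 2 * mu * e * x               # LMS coefficient update
    out[i] = e

print("residual noise power:", np.mean((out[1000:] - signal[1000:]) ** 2))
```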
A Non-Stationary Approach for Estimating Future Hydroclimatic Extremes Using Monte-Carlo Simulation
NASA Astrophysics Data System (ADS)
Byun, K.; Hamlet, A. F.
2017-12-01
There is substantial evidence that observed hydrologic extremes (e.g. floods, extreme stormwater events, and low flows) are changing and that climate change will continue to alter the probability distributions of hydrologic extremes over time. These non-stationary risks imply that conventional approaches for designing hydrologic infrastructure (or making other climate-sensitive decisions) based on retrospective analysis and stationary statistics will become increasingly problematic through time. To develop a framework for assessing risks in a non-stationary environment, our study develops a new approach using a super ensemble of simulated hydrologic extremes based on Monte Carlo (MC) methods. Specifically, using statistically downscaled future GCM projections from the CMIP5 archive (using the Hybrid Delta (HD) method), we extract daily precipitation (P) and temperature (T) at 1/16 degree resolution based on a group of moving 30-yr windows within a given design lifespan (e.g. 10, 25, 50-yr). Using these T and P scenarios we simulate daily streamflow using the Variable Infiltration Capacity (VIC) model for each year of the design lifespan and fit a Generalized Extreme Value (GEV) probability distribution to the simulated annual extremes. MC experiments are then used to construct a random series of 10,000 realizations of the design lifespan, estimating annual extremes using the estimated unique GEV parameters for each individual year of the design lifespan. Our preliminary results for two watersheds in the Midwest show that there are considerable differences in the extreme values for a given percentile between the conventional MC and the non-stationary MC approach. Design standards based on our non-stationary approach are also directly dependent on the design lifespan of the infrastructure, a sensitivity which is notably absent from conventional approaches based on retrospective analysis. The experimental approach can be applied to a wide range of hydroclimatic variables of interest.
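The non-stationary Monte Carlo step can be sketched compactly: give each year of the design lifespan its own GEV parameters and simulate many realizations of the lifetime maximum. The parameter values below, including the linear drift in the location parameter, are illustrative assumptions rather than the study's fitted values.

```python
# Non-stationary Monte Carlo of lifetime maxima: each year of the design
# lifespan gets a drifting GEV location; 10,000 lifespans are simulated
# and a design quantile is compared with the stationary case. Parameters
# are illustrative assumptions.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(5)
lifespan, n_real = 50, 10000
shape, scale = -0.1, 10.0                 # scipy's c = -xi convention
loc0, trend = 100.0, 0.3                  # location drifts 0.3 units/yr

years = np.arange(lifespan)
# (n_real, lifespan) annual maxima with year-specific location parameters
nonstat = genextreme.rvs(shape, loc=loc0 + trend * years, scale=scale,
                         size=(n_real, lifespan), random_state=rng)
stat = genextreme.rvs(shape, loc=loc0, scale=scale,
                      size=(n_real, lifespan), random_state=rng)

q = 0.99  # design percentile of the lifetime maximum
print("non-stationary 99th pct:", np.quantile(nonstat.max(axis=1), q))
print("stationary     99th pct:", np.quantile(stat.max(axis=1), q))
```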
NASA Astrophysics Data System (ADS)
Ren, Weiwei; Yang, Tao; Shi, Pengfei; Xu, Chong-yu; Zhang, Ke; Zhou, Xudong; Shao, Quanxi; Ciais, Philippe
2018-06-01
Climate change imposes a profound influence on the regional hydrological cycle and water security in many alpine regions worldwide. Investigating regional climate impacts using watershed-scale hydrological models requires a large amount of input data such as topography, meteorological and hydrological data. However, data scarcity in alpine regions seriously restricts evaluation of climate change impacts on the water cycle using conventional approaches based on global or regional climate models, statistical downscaling methods and hydrological models. Therefore, this study is dedicated to the development of a probabilistic model to replace the conventional approaches for streamflow projection. The probabilistic model was built upon an advanced Bayesian Neural Network (BNN) approach directly fed by large-scale climate predictor variables and tested in a typical data-sparse alpine region, the Kaidu River basin in Central Asia. Results show that the BNN model performs better than the conventional methods across a number of statistical measures. The BNN method, with flexible model structures provided by active indicator functions, which reduce the dependence on the initial specification of the input variables and the number of hidden units, can work well in a data-limited region. Moreover, it can provide more reliable streamflow projections with a robust generalization ability. Forced by the latest bias-corrected GCM scenarios, streamflow projections for the 21st century under three RCP emission pathways were constructed and analyzed. In brief, the proposed probabilistic projection approach could improve runoff predictive ability over conventional methods and provide better support for water resources planning and management under data-limited conditions, as well as enable a facilitated climate change impact analysis on runoff and water resources in alpine regions worldwide.
Robust Linear Models for Cis-eQTL Analysis.
Rantalainen, Mattias; Lindgren, Cecilia M; Holmes, Christopher C
2015-01-01
Expression Quantitative Trait Loci (eQTL) analysis enables characterisation of functional genetic variation influencing expression levels of individual genes. In outbred populations, including humans, eQTLs are commonly analysed using the conventional linear model, adjusting for relevant covariates, assuming an allelic dosage model and a Gaussian error term. However, gene expression data generally have noise that induces heavy-tailed errors relative to the Gaussian distribution and often include atypical observations, or outliers. Such departures from modelling assumptions can lead to an increased rate of type II errors (false negatives), and to some extent also type I errors (false positives). Careful model checking can reduce the risk of type I errors but often not type II errors, since it is generally too time-consuming to carefully check all models with a non-significant effect in large-scale and genome-wide studies. Here we propose the application of a robust linear model for eQTL analysis to reduce adverse effects of deviations from the assumption of Gaussian residuals. We present results from a simulation study as well as results from the analysis of real eQTL data sets. Our findings suggest that in many situations robust models have the potential to provide more reliable eQTL results compared to conventional linear models, particularly in respect to reducing type II errors due to non-Gaussian noise. Post-genomic data, such as that generated in genome-wide eQTL studies, are often noisy and frequently contain atypical observations. Robust statistical models have the potential to provide more reliable results and increased statistical power under non-Gaussian conditions. The results presented here suggest that robust models should be considered routinely alongside other commonly used methodologies for eQTL analysis.
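The robust alternative proposed above can be tried directly with statsmodels: a Huber M-estimator regression of expression on allelic dosage, compared with ordinary least squares under heavy-tailed noise. The data below are simulated, not a real eQTL set.

```python
# Robust (Huber) vs. OLS regression of expression on allelic dosage under
# heavy-tailed noise. Genotypes and expression values are simulated.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(6)
n = 300
dosage = rng.integers(0, 3, n).astype(float)         # 0/1/2 allele counts
# heavy-tailed errors (t with 3 df) mimic noisy expression measurements
expression = 5.0 + 0.4 * dosage + rng.standard_t(3, n)

X = sm.add_constant(dosage)
ols = sm.OLS(expression, X).fit()
rlm = sm.RLM(expression, X, M=sm.robust.norms.HuberT()).fit()
print("OLS dosage effect:", ols.params[1], "SE:", ols.bse[1])
print("RLM dosage effect:", rlm.params[1], "SE:", rlm.bse[1])
```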
Ellison, Jonathan S; Montgomery, Jeffrey S; Wolf, J Stuart; Hafez, Khaled S; Miller, David C; Weizer, Alon Z
2012-07-01
Minimally invasive nephron sparing surgery is gaining popularity for small renal masses. Few groups have evaluated robot-assisted partial nephrectomy compared to other approaches using comparable patient populations. We present a matched pair analysis of a heterogeneous group of surgeons who performed robot-assisted partial nephrectomy and a single experienced laparoscopic surgeon who performed conventional laparoscopic partial nephrectomy. Perioperative outcomes and complications were compared. All 249 conventional laparoscopic and robot-assisted partial nephrectomy cases from January 2007 to June 2010 were reviewed from our prospectively maintained institutional database. Groups were matched 1:1 (108 matched pairs) by R.E.N.A.L. (radius, exophytic/endophytic properties, nearness of tumor to collecting system or sinus, anterior/posterior, location relative to polar lines) nephrometry score, transperitoneal vs retroperitoneal approach, patient age and hilar nature of the tumor. Statistical analysis was done to compare operative outcomes and complications. Matched analysis revealed that nephrometry score, age, gender, tumor side and American Society of Anesthesia physical status classification were similar. Operative time favored conventional laparoscopic partial nephrectomy. During the study period robot-assisted partial nephrectomy showed significant improvements in estimated blood loss and warm ischemia time compared to those of the experienced conventional laparoscopic group. Postoperative complication rates, and complication distributions by Clavien classification and type were similar for conventional laparoscopic and robot-assisted partial nephrectomy (41.7% and 35.0%, respectively). Robot-assisted partial nephrectomy has a noticeable but rapid learning curve. After it is overcome the robotic procedure results in perioperative outcomes similar to those achieved with conventional laparoscopic partial nephrectomy done by an experienced surgeon. Robot-assisted partial nephrectomy likely improves surgeon and patient accessibility to minimally invasive nephron sparing surgery. Copyright © 2012 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.
Sadeghi, Rokhsareh; Miremadi, Asghar
2015-01-01
Objectives: Implant primary stability is one of the important factors in achieving implant success. The osteotome technique may improve primary stability in patients with poor bone quality. The aim of this study was to compare implant stability using two different techniques namely osteotome versus conventional drilling in the posterior maxilla. Materials and Methods: In this controlled randomized clinical trial, 54 dental implants were placed in 32 patients; 29 implants were placed in the osteotome group and 25 in the conventional drilling group. Implant stability was assessed at four time intervals namely at baseline, one, two and three months after implant placement using resonance frequency analysis (RFA). Results: Primary stability based on implant stability quotient (ISQ) units was 71.4±7 for the osteotome group and 67.4±10 for the control group. There was no statistically significant difference between the two groups in implant stability at any of the measurement times. In each group, changes in implant stability from baseline to one month and also from two months to three months post-operatively were not significant but from one month to two months after implant placement, implant stability showed a significant increase in both groups. Conclusion: The results of this study revealed that in both techniques, good implant stability was achieved and osteotome technique did not have any advantage compared to conventional drilling in this regard. PMID:27148375
NASA Astrophysics Data System (ADS)
Lee, Dong-Sup; Cho, Dae-Seung; Kim, Kookhyun; Jeon, Jae-Jin; Jung, Woo-Jin; Kang, Myeng-Hwan; Kim, Jae-Ho
2015-01-01
Independent Component Analysis (ICA), one of the blind source separation methods, can be applied to extract unknown source signals from received signals alone. This is accomplished by finding statistical independence of signal mixtures, and it has been successfully applied in numerous fields such as medical science and image processing. Nevertheless, there are inherent problems that have been reported when using this technique: instability and invalid ordering of separated signals, particularly when using a conventional ICA technique in vibratory source signal identification of complex structures. In this study, a simple iterative algorithm based on the conventional ICA has been proposed to mitigate these problems. The proposed method to extract more stable source signals having valid order includes an iterative and reordering process of the extracted mixing matrix to reconstruct the finally converged source signals, referring to the magnitudes of correlation coefficients between the intermediately separated signals and the signals measured on or nearby the sources. In order to review the problems of the conventional ICA technique and to validate the proposed method, numerical analyses have been carried out for a virtual response model and a 30 m class submarine model. Moreover, in order to investigate the applicability of the proposed method to a real problem of complex structure, an experiment has been carried out for a scaled submarine mockup. The results show that the proposed method could resolve the inherent problems of a conventional ICA technique.
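The reordering step described above, matching separated components to reference signals by correlation magnitude, can be sketched with an off-the-shelf ICA. FastICA stands in for the paper's algorithm, and the mixtures and references below are synthetic.

```python
# ICA separation followed by correlation-based reordering against
# reference signals measured "on or nearby" the sources. Data synthetic;
# FastICA stands in for the paper's ICA variant.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(7)
t = np.linspace(0, 8, 2000)
sources = np.c_[np.sin(2 * np.pi * 3 * t), np.sign(np.sin(2 * np.pi * 5 * t))]
mixed = sources @ rng.normal(size=(2, 2)).T + 0.05 * rng.normal(size=(2000, 2))

separated = FastICA(n_components=2, random_state=0).fit_transform(mixed)

# references: noisy measurements taken near each source
refs = sources + 0.2 * rng.normal(size=sources.shape)
corr = np.abs(np.corrcoef(separated.T, refs.T)[:2, 2:])  # |corr| matrix
order = corr.argmax(axis=0)          # component best matching each reference
print("component order relative to references:", order)
```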
Karataşlıoglu, E; Aydın, U; Yıldırım, C
2018-02-01
The aim of this in vitro study was to compare the static cyclic fatigue resistance of thermally treated rotary files with a conventional nickel-titanium (NiTi) rotary file. Four groups of 60 rotary files with similar file dimensions, geometries, and motion were selected. The groups were the HyFlex group [controlled memory wire (CM-Wire)], the ProfileVortex group (M-Wire), the Twisted File group (R-Phase wire), and the OneShape group (conventional NiTi wire), tested using a custom-made static cyclic fatigue testing apparatus. The fracture time and fragment length of each file were also recorded. Statistical analysis was performed using one-way analysis of variance and Tukey's test at the 95% confidence level (P = 0.05). The HyFlex group had a significantly higher mean cyclic fatigue resistance than the other three groups (P < 0.001). The OneShape group had the lowest fatigue resistance. The CM-Wire alloy showed the best performance in cyclic fatigue resistance, and the NiTi alloy in R-Phase had the second highest. CM and R-Phase manufacturing technologies applied to the conventional NiTi alloy enhance the cyclic fatigue resistance of files of similar design and size. The M-Wire alloy did not show any superiority in cyclic fatigue resistance when compared with conventional NiTi wire.
Su, Dejun; Zhou, Junmin; Kelley, Megan S; Michaud, Tzeyu L; Siahpush, Mohammad; Kim, Jungyoon; Wilson, Fernando; Stimpson, Jim P; Pagán, José A
2016-06-01
To assess the overall effect of telemedicine on diabetes management and to identify features of telemedicine interventions that are associated with better diabetes management outcomes. Hedges's g was estimated as the summary measure of mean difference in HbA1c between patients with diabetes who went through telemedicine care and those who went through conventional, non-telemedicine care, using a random-effects model. Q statistics were calculated to assess whether the effect of telemedicine on diabetes management differs by type of diabetes, age group of patients, duration of intervention, and primary telemedicine approach used. The analysis included 55 randomized controlled trials with a total of 9258 patients with diabetes, of whom 4607 were randomized to telemedicine groups and 4651 to conventional, non-telemedicine care groups. The results favored telemedicine over conventional care (Hedges's g=-0.48, p<0.001) in diabetes management. The beneficial effect of telemedicine was more pronounced among patients with type 2 diabetes (Hedges's g=-0.63, p<0.001) than among those with type 1 diabetes (Hedges's g=-0.27, p=0.027) (Q=4.25, p=0.04). Compared to conventional care, telemedicine is more effective in improving treatment outcomes for diabetes patients, especially for those with type 2 diabetes.
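For readers unfamiliar with the summary measure, a minimal numpy sketch of the two computations the abstract names follows: Hedges's g per trial and DerSimonian-Laird random-effects pooling. The function and the illustrative numbers are stand-ins, not the study's data.

    import numpy as np

    def hedges_g(m1, sd1, n1, m2, sd2, n2):
        # Standardized mean difference (e.g. telemedicine minus conventional HbA1c)
        # with Hedges' small-sample correction J.
        sp = np.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
        d = (m1 - m2) / sp
        J = 1 - 3 / (4 * (n1 + n2) - 9)
        g = J * d
        var_g = (n1 + n2) / (n1 * n2) + g**2 / (2 * (n1 + n2))  # approximate variance
        return g, var_g

    def dersimonian_laird(g, v):
        # Random-effects pooling of per-trial effects g with within-trial variances v.
        w = 1 / v
        q = np.sum(w * (g - np.sum(w * g) / np.sum(w)) ** 2)
        c = np.sum(w) - np.sum(w**2) / np.sum(w)
        tau2 = max(0.0, (q - (len(g) - 1)) / c)        # between-trial variance
        w_star = 1 / (v + tau2)
        pooled = np.sum(w_star * g) / np.sum(w_star)
        return pooled, np.sqrt(1 / np.sum(w_star))

    # Invented per-trial effects, for illustration only:
    g = np.array([-0.50, -0.30, -0.70])
    v = np.array([0.04, 0.05, 0.03])
    pooled, se = dersimonian_laird(g, v)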
Pang, Tiantian; Huang, Leidan; Deng, Yingyuan; Wang, Tianfu; Chen, Siping; Gong, Xuehao; Liu, Weixiang
2017-01-01
The aim of the study was to screen significant sonographic features by logistic regression analysis and fit a model for diagnosing thyroid nodules. A total of 525 pathologically confirmed thyroid nodules were retrospectively analyzed. All the nodules underwent conventional ultrasonography (US), strain elastosonography (SE), and contrast-enhanced ultrasound (CEUS). Twelve suspicious sonographic features of the nodules were used in the assessment. The significant features for diagnosing thyroid nodules were selected by logistic regression analysis: all variables statistically related to the diagnosis at a level of p < 0.05 were included in the logistic regression model. The significant features in the model were calcification, suspected cervical lymph node metastasis, hypoenhancement pattern, margin, shape, vascularity, posterior acoustic features, echogenicity, and elastography score. From the results of the logistic regression analysis, a formula predicting whether a thyroid nodule is malignant was established. The area under the receiver operating characteristic (ROC) curve was 0.930, and the sensitivity, specificity, accuracy, positive predictive value, and negative predictive value were 83.77%, 89.56%, 87.05%, 86.04%, and 87.79%, respectively. PMID:29228030
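As a generic illustration of the modelling pipeline the abstract describes (feature selection aside), the sketch below fits a logistic model and derives the same performance measures. It uses synthetic data from scikit-learn in place of the sonographic features, so all names and numbers are stand-ins.

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import confusion_matrix, roc_auc_score

    # Stand-in data: rows = nodules, columns = sonographic features
    # (calcification, margin, shape, ...); y = 1 for malignant, 0 for benign.
    X, y = make_classification(n_samples=525, n_features=12, random_state=0)
    model = LogisticRegression(max_iter=1000).fit(X, y)
    p = model.predict_proba(X)[:, 1]        # predicted probability of malignancy

    auc = roc_auc_score(y, p)               # area under the ROC curve
    tn, fp, fn, tp = confusion_matrix(y, p >= 0.5).ravel()
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / len(y)
    ppv, npv = tp / (tp + fp), tn / (tn + fn)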
NASA Astrophysics Data System (ADS)
Magazù, Salvatore; Mezei, Ferenc; Migliardo, Federica
2018-05-01
In a variety of applications of inelastic neutron scattering spectroscopy the goal is to single out the elastic scattering contribution from the total scattered spectrum as a function of momentum transfer and sample environment parameters. The elastic part of the spectrum is defined in such a case by the energy resolution of the spectrometer. Variable elastic energy resolution offers a way to distinguish between elastic and quasi-elastic intensities. Correlation spectroscopy lends itself as an efficient, high intensity approach for accomplishing this at both continuous and pulsed neutron sources. On the one hand, in beam modulation methods the Liouville theorem coupling between intensity and resolution is relaxed, and time-of-flight analysis of the neutron velocity distribution can be performed with 50% duty factor exposure for all available resolutions. On the other hand, the (quasi)elastic part of the spectrum generally contains the major part of the integrated intensity at a given detector, and thus correlation spectroscopy can be applied with the most favorable signal to statistical noise ratio. The novel spectrometer CORELLI at SNS is an example of this type of application of the correlation technique at a pulsed source. On a continuous neutron source a statistical chopper can be used for quasi-random time-dependent beam modulation, and the total time-of-flight of the neutron from the statistical chopper to detection is determined by analyzing the correlation between the temporal fluctuation of the neutron detection rate and the statistical chopper beam modulation pattern. The correlation analysis can be used to determine either the incoming or the scattered neutron velocity, depending on the position of the statistical chopper along the neutron trajectory. These two options are considered together with an evaluation of spectrometer performance compared to conventional spectroscopy, in particular for variable resolution elastic neutron scattering (RENS) studies of relaxation processes and the evolution of mean square displacements. A particular focus of our analysis is the unique feature of correlation spectroscopy of delivering high and resolution-independent beam intensity: the same statistical chopper scan contains both high intensity and high resolution information at the same time, and can be evaluated both ways. This flexibility for variable resolution data handling represents an additional asset for correlation spectroscopy in variable resolution work. Changing the beam width for the same statistical chopper additionally allows resolution to be traded for intensity in two different experimental runs, similarly to conventional single-slit chopper spectroscopy. The combination of these two approaches is a capability of particular value in neutron spectroscopy studies requiring variable energy resolution, such as the systematic study of quasi-elastic scattering and mean square displacements. Furthermore, the statistical chopper approach is particularly advantageous for studying samples with low scattering intensity in the presence of a high, sample-independent background.
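The correlation step lends itself to a small numerical illustration. The sketch below is only a toy, assuming an idealized cyclic chopper with a plain random 0/1 pattern (real statistical choppers use maximum-length sequences, whose circular autocorrelation is exactly flat off-peak): cross-correlating the detector counts with the mean-subtracted modulation pattern recovers the time-of-flight spectrum up to statistical noise.

    import numpy as np

    rng = np.random.default_rng(1)
    n = 256                                  # length of one chopper cycle (time bins)
    seq = rng.integers(0, 2, n)              # pseudo-random 0/1 transmission, ~50% duty
    tof = np.exp(-0.5 * ((np.arange(n) - 60) / 8.0) ** 2)   # "true" TOF spectrum

    # Detector rate: each arrival bin t sums spectrum bins i that were let through
    # by the chopper at time t - i (circular convolution), plus counting noise.
    idx = np.arange(n)
    counts = np.array([seq[(t - idx) % n] @ tof for t in range(n)])
    counts = rng.poisson(counts * 100) / 100.0

    # Correlation analysis: correlating the counts with the mean-subtracted
    # modulation pattern peaks at the true flight time for each spectrum bin.
    pattern = seq - seq.mean()
    recovered = np.array([np.roll(pattern, tau) @ counts for tau in range(n)])
    recovered /= pattern @ pattern           # normalize; recovered ~ tof + noise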
THESEUS: maximum likelihood superpositioning and analysis of macromolecular structures
Theobald, Douglas L.; Wuttke, Deborah S.
2008-01-01
THESEUS is a command line program for performing maximum likelihood (ML) superpositions and analysis of macromolecular structures. While conventional superpositioning methods use ordinary least-squares (LS) as the optimization criterion, ML superpositions provide substantially improved accuracy by down-weighting variable structural regions and by correcting for correlations among atoms. ML superpositioning is robust and insensitive to the specific atoms included in the analysis, and thus it does not require subjective pruning of selected variable atomic coordinates. Output includes both likelihood-based and frequentist statistics for accurate evaluation of the adequacy of a superposition and for reliable analysis of structural similarities and differences. THESEUS performs principal components analysis for analyzing the complex correlations found among atoms within a structural ensemble. PMID:16777907
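To make the down-weighting idea concrete, here is a minimal numpy sketch of a weighted least-squares superposition (a weighted Kabsch rotation). It is not THESEUS's ML algorithm: in the ML setting the weights would be re-estimated iteratively from per-atom variances (roughly w_i proportional to 1/sigma_i^2), whereas here they are simply supplied.

    import numpy as np

    def weighted_superpose(X, Y, w):
        # Rigidly superpose Y (n x 3) onto X (n x 3), minimizing the w-weighted
        # sum of squared deviations; small w_i down-weights variable regions.
        w = np.asarray(w, float) / np.sum(w)
        xc = (w[:, None] * X).sum(axis=0)            # weighted centroids
        yc = (w[:, None] * Y).sum(axis=0)
        A, B = X - xc, Y - yc
        H = (w[:, None] * B).T @ A                   # weighted covariance matrix
        U, _, Vt = np.linalg.svd(H)
        d = np.sign(np.linalg.det(U @ Vt))           # guard against reflections
        R = U @ np.diag([1.0, 1.0, d]) @ Vt
        return B @ R + xc                            # Y expressed in X's frame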
Revealing representational content with pattern-information fMRI--an introductory guide.
Mur, Marieke; Bandettini, Peter A; Kriegeskorte, Nikolaus
2009-03-01
Conventional statistical analysis methods for functional magnetic resonance imaging (fMRI) data are very successful at detecting brain regions that are activated as a whole during specific mental activities. The overall activation of a region is usually taken to indicate involvement of the region in the task. However, such activation analysis does not consider the multivoxel patterns of activity within a brain region. These patterns of activity, which are thought to reflect neuronal population codes, can be investigated by pattern-information analysis. In this framework, a region's multivariate pattern information is taken to indicate representational content. This tutorial introduction motivates pattern-information analysis, explains its underlying assumptions, introduces the most widespread methods in an intuitive way, and outlines the basic sequence of analysis steps.
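As a concrete miniature of the pattern-information framework the tutorial introduces, the sketch below decodes two conditions from synthetic multivoxel patterns with a cross-validated linear classifier; all sizes and effect magnitudes are invented.

    import numpy as np
    from sklearn.svm import LinearSVC
    from sklearn.model_selection import cross_val_score

    # `patterns` would normally hold one response-amplitude estimate per voxel
    # per trial within a region of interest; synthetic data stand in here.
    rng = np.random.default_rng(0)
    n_trials, n_voxels = 80, 120
    labels = np.repeat([0, 1], n_trials // 2)          # condition A vs. condition B
    patterns = rng.normal(size=(n_trials, n_voxels))
    patterns[labels == 1, :10] += 0.5                  # weak information in 10 voxels

    # Above-chance cross-validated decoding accuracy indicates that the region's
    # activity pattern carries condition information, even if its mean
    # activation does not differ between conditions.
    acc = cross_val_score(LinearSVC(dual=False), patterns, labels, cv=5).mean()
    print(f"decoding accuracy: {acc:.2f} (chance = 0.50)")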
Borrowing of strength and study weights in multivariate and network meta-analysis.
Jackson, Dan; White, Ian R; Price, Malcolm; Copas, John; Riley, Richard D
2017-12-01
Multivariate and network meta-analysis have the potential for the estimated mean of one effect to borrow strength from the data on other effects of interest. The extent of this borrowing of strength is usually assessed informally. We present new mathematical definitions of 'borrowing of strength'. Our main proposal is based on a decomposition of the score statistic, which we show can be interpreted as comparing the precision of estimates from the multivariate and univariate models. Our definition of borrowing of strength therefore emulates the usual informal assessment. We also derive a method for calculating study weights, which we embed into the same framework as our borrowing of strength statistics, so that percentage study weights can accompany the results from multivariate and network meta-analyses as they do in conventional univariate meta-analyses. Our proposals are illustrated using three meta-analyses involving correlated effects for multiple outcomes, multiple risk factor associations and multiple treatments (network meta-analysis).
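A toy calculation may help fix ideas. The authors' formal definition is based on a decomposition of the score statistic; the numpy sketch below only reproduces the informal precision comparison that the definition emulates, on an invented bivariate fixed-effect meta-analysis with known within-study covariance matrices.

    import numpy as np

    # Each study reports two correlated effects with known within-study
    # covariance; compare the precision of the multivariate (GLS) estimate of
    # effect 1 with its univariate inverse-variance counterpart.
    S = [np.array([[0.04, 0.03], [0.03, 0.09]]),
         np.array([[0.06, 0.04], [0.04, 0.08]]),
         np.array([[0.05, 0.02], [0.02, 0.10]])]

    V_mv = np.linalg.inv(sum(np.linalg.inv(Si) for Si in S))   # GLS covariance
    var_uv = 1 / sum(1 / Si[0, 0] for Si in S)                 # univariate variance

    # Informal "borrowing of strength" for outcome 1: relative precision gained
    # by modelling the outcomes jointly (0 = none, 1 = total).
    bos = 1 - V_mv[0, 0] / var_uv
    print(f"BoS for outcome 1: {bos:.1%}")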
Early-Stage Estimated Value of Blend Sign on the Prognosis of Patients with Intracerebral Hemorrhage
Zhou, Ningquan; Wang, Chao
2018-01-01
Background and Purpose: This study aimed to explore the relationship between the blend sign and the prognosis of patients with intracerebral hemorrhage (ICH). Methods: Between January 2014 and December 2016, the results of cranial computed tomography imaging within 24 h after the onset of symptoms from 275 patients with ICH were retrospectively analyzed. Patients with and without the blend sign were compared to analyze differences in early coagulation function abnormality, rebleeding, mortality, and bad prognosis rates. Results: Of the 275 patients with ICH, 47 had Blend Sign I (17.09%) and 17 had Blend Sign II (6.18%). The coagulation function abnormality rate did not differ significantly among the Blend Sign I, Blend Sign II, and conventional groups (P > 0.05). In the Blend Sign I group, the rebleeding rate was 4.26%, the bad prognosis rate 25.53%, and the mortality rate 6.38%, none of which differed significantly from the conventional group (P > 0.05). In the Blend Sign II group, the rebleeding rate was 47.06%, the bad prognosis rate 82.35%, and the mortality rate 47.06%, all significantly different from both the conventional and Blend Sign I groups (P < 0.05). Conclusions: For patients with Blend Sign I, the prognosis was equivalent to that of the conventional group, with no statistically significant difference. The rebleeding, bad prognosis, and mortality rates were higher in the Blend Sign II group than in the conventional group and deserve more attention.
Joda, Tim; Brägger, Urs
2015-01-01
To compare time-efficiency in the production of implant crowns using a digital workflow versus the conventional pathway. This prospective clinical study used a crossover design that included 20 study participants receiving single-tooth replacements in posterior sites. Each patient received a customized titanium abutment plus a computer-aided design/computer-assisted manufacture (CAD/CAM) zirconia suprastructure (for those in the test group, using digital workflow) and a standardized titanium abutment plus a porcelain-fused-to-metal crown (for those in the control group, using a conventional pathway). The start of the implant prosthetic treatment was established as the baseline. Time-efficiency analysis was defined as the primary outcome, and was measured for every single clinical and laboratory work step in minutes. Statistical analysis was performed with the Wilcoxon rank sum test. All crowns could be provided within two clinical appointments, independent of the manufacturing process. The mean total production time, as the sum of clinical plus laboratory work steps, was significantly different: the mean ± standard deviation (SD) time was 185.4 ± 17.9 minutes for the digital workflow process and 223.0 ± 26.2 minutes for the conventional pathway (P = .0001). Therefore, digital processing for overall treatment was 16% faster. Detailed analysis of the clinical treatment revealed a significantly reduced mean ± SD chair time of 27.3 ± 3.4 minutes for the test group compared with 33.2 ± 4.9 minutes for the control group (P = .0001). Similar results were found for the mean laboratory work time, with a significant decrease to 158.1 ± 17.2 minutes for the test group vs 189.8 ± 25.3 minutes for the control group (P = .0001). Only a few studies have investigated efficiency parameters of digital workflows compared with conventional pathways in implant dental medicine. This investigation shows that the digital workflow seems to be more time-efficient than the established conventional production pathway for fixed implant-supported crowns. Both clinical chair time and laboratory manufacturing steps could be effectively shortened with the digital process of intraoral scanning plus CAD/CAM technology.
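For readers who want to reproduce this kind of comparison, a minimal scipy sketch of the Wilcoxon rank sum test follows; the timing values are invented stand-ins, not the study's data.

    from scipy import stats

    # Total production times in minutes (illustrative numbers only).
    digital      = [172, 190, 201, 168, 185, 179, 196, 188, 175, 182]
    conventional = [219, 240, 205, 231, 225, 210, 247, 228, 216, 235]
    stat, p = stats.ranksums(digital, conventional)
    print(f"rank-sum statistic = {stat:.2f}, p = {p:.4f}")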
Hydrotherapy after total knee arthroplasty. A follow-up study.
Giaquinto, S; Ciotola, E; Dall'Armi, V; Margutti, F
2010-01-01
The study evaluated the subjective functional outcome following total knee arthroplasty (TKA) in participants who underwent hydrotherapy (HT), six months after discharge from a rehabilitation unit. A total of 70 subjects, 12 of whom were lost to follow-up, were randomly assigned to either conventional gym treatment (N=30) or HT (N=28). A prospective design was used. Participants were interviewed with the Western Ontario and McMaster Universities Osteoarthritis Index (WOMAC) at admission, at discharge and six months later. Kruskal-Wallis and Wilcoxon tests were applied for statistical analysis. Both groups improved. The WOMAC subscales, namely pain, stiffness and function, were all positively affected. Statistical analysis indicated that scores on all subscales were significantly lower for the HT group. The benefits gained by the time of discharge were still present after six months. HT is recommended after TKA in a geriatric population.
Hydrotherapy after total hip arthroplasty: a follow-up study.
Giaquinto, S; Ciotola, E; Dall'armi, V; Margutti, F
2010-01-01
The aim of the study was to evaluate the subjective functional outcome of total hip arthroplasty (THA) in patients who underwent hydrotherapy (HT), 6 months after discharge. A prospective randomized study was performed on 70 elderly inpatients with recent THA who had completed a rehabilitation program. After randomization, 33 of them were treated in conventional gyms (no-hydrotherapy group=NHTG) and 31 received HT (hydrotherapy group=HTG). Interviews with the Western Ontario and McMaster Universities Osteoarthritis Index (WOMAC) were performed at admission, at discharge and 6 months later. Kruskal-Wallis, Mann-Whitney and Wilcoxon tests were applied for statistical analysis. Both groups improved. Pain, stiffness and function were all positively affected. Statistical analysis indicated that WOMAC sub-scale scores were significantly lower for the patients treated with HT. The benefits at discharge still remained after 6 months. We conclude that HT is recommended after THA in a geriatric population.
He, Jingzhen; Zu, Yuliang; Wang, Qing; Ma, Xiangxing
2014-12-01
The purpose of this study was to determine the performance of low-dose computed tomography (CT) scanning with an integrated circuit (IC) detector in defining fine structures of the temporal bone in children, by comparison with a conventional detector. The study was performed with the approval of our institutional review board and the patients' anonymity was maintained. A total of 86 children <3 years of age underwent imaging of the temporal bone with low-dose CT (80 kV/150 mAs) equipped with either the IC detector or a conventional discrete circuit (DC) detector. The image noise was measured for quantitative analysis. Thirty-five structures of the temporal bone were further assessed and rated by 2 radiologists for qualitative analysis. κ statistics were used to determine the agreement between the 2 radiologists on each image. The Mann-Whitney U test was used to determine the difference in image quality between the 2 detector systems. Objective analysis showed that the image noise was significantly lower (P<0.001) with the IC detector than with the DC detector. The κ values for qualitative assessment of the 35 fine anatomical structures revealed high interobserver agreement. The delineation of 30 of the 35 landmarks (86%) with the IC detector was superior to that with the conventional DC detector (P<0.05), although there were no differences in the delineation of the remaining 5 structures (P>0.05). The low-dose CT images acquired with the IC detector provide better depiction of the fine osseous structures of the temporal bone than those acquired with the conventional DC detector.
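The interobserver-agreement step is easy to reproduce; the sketch below computes Cohen's κ for two raters with scikit-learn, on invented ratings.

    from sklearn.metrics import cohen_kappa_score

    # Two radiologists scoring the same structures on a rating scale
    # (illustrative values only).
    rater1 = [4, 3, 4, 2, 4, 3, 3, 4, 1, 4, 3, 4]
    rater2 = [4, 3, 3, 2, 4, 3, 4, 4, 1, 4, 3, 4]
    kappa = cohen_kappa_score(rater1, rater2)
    print(f"kappa = {kappa:.2f}")   # values near 1 indicate high agreement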
Lukas, Roman-Patrik; Gräsner, Jan Thorsten; Seewald, Stephan; Lefering, Rolf; Weber, Thomas Peter; Van Aken, Hugo; Fischer, Matthias; Bohn, Andreas
2012-10-01
Investigating the effects of any intervention during cardiac arrest remains difficult. The ROSC after cardiac arrest score was introduced to facilitate comparison of rates of return of spontaneous circulation (ROSC) between different ambulance services. To study the influence of chest compression quality management (including training, real-time feedback devices, and debriefing) in comparison with conventional cardiopulmonary resuscitation (CPR), a matched-pair analysis was conducted using data from the German Resuscitation Registry, with the calculated ROSC after cardiac arrest score as the baseline. Matching for independent ROSC after cardiac arrest score variables yielded 319 matched cases from the study period (January 2007-March 2011). The score predicted a 45% ROSC rate for the matched pairs. The observed ROSC increased significantly with chest compression quality management, to 52% (P=0.013; 95% CI, 46-57%). No significant differences were seen in the conventional CPR group (47%; 95% CI, 42-53%). The difference between the observed ROSC rates was not statistically significant. Chest compression quality management leads to significantly higher ROSC rates than those predicted by the prognostic score (ROSC after cardiac arrest score). Matched-pair analysis shows that with conventional CPR, the observed ROSC rate was not significantly different from the predicted rate. Analysis shows a trend toward a higher ROSC rate for chest compression quality management in comparison with conventional CPR. It is unclear whether a single aspect of chest compression quality management or the combination of training, real-time feedback, and debriefing contributed to this result.
Simulation-based sensitivity analysis for non-ignorably missing data.
Yin, Peng; Shi, Jian Q
2017-01-01
Sensitivity analysis is popular in dealing with missing data problems, particularly for non-ignorable missingness, where the full-likelihood method cannot be adopted. It analyses how sensitively the conclusions (output) depend on assumptions or parameters (input) about the missing data, i.e. the missing data mechanism; we call models subject to this kind of uncertainty sensitivity models. To make conventional sensitivity analysis more useful in practice we need to define simple and interpretable statistical quantities to assess the sensitivity models and make evidence-based analysis. In this paper we propose a novel approach that investigates the plausibility of each missing data mechanism assumption by comparing datasets simulated from various MNAR (missing not at random) models with the observed data non-parametrically, using K-nearest-neighbour distances. Some asymptotic theory is also provided. A key step of this method is to plug in a plausibility evaluation system for each sensitivity parameter, selecting plausible values and rejecting unlikely ones, instead of considering all proposed values of the sensitivity parameters as in the conventional sensitivity analysis method. The method is generic and has been applied successfully to several specific models in this paper, including a meta-analysis model with publication bias, analysis of incomplete longitudinal data, and mean estimation with non-ignorable missing data.
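A minimal sketch of the comparison step, under invented distributions: datasets are simulated from a hypothetical MNAR model across values of a sensitivity parameter delta, and each is scored against the observed data by the mean K-nearest-neighbour distance (smaller = more plausible). Nothing here comes from the paper's actual models.

    import numpy as np
    from scipy.spatial import cKDTree

    def knn_discrepancy(simulated, observed, k=5):
        # Mean distance from each observed point to its k nearest simulated
        # neighbours; small values mean the model reproduces the data well.
        d, _ = cKDTree(simulated).query(observed, k=k)
        return d.mean()

    rng = np.random.default_rng(0)
    observed = rng.normal(0.3, 1.0, size=(200, 1))     # stand-in observed data

    # Scan the sensitivity parameter; retain plausible values, reject the rest.
    for delta in [0.0, 0.3, 0.6, 1.0]:
        simulated = rng.normal(delta, 1.0, size=(2000, 1))
        print(delta, round(knn_discrepancy(simulated, observed), 3))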
Benic, Goran I; Mühlemann, Sven; Fehmer, Vincent; Hämmerle, Christoph H F; Sailer, Irena
2016-11-01
Trials comparing the overall performance of fully digital and conventional workflows in reconstructive dentistry are needed. The purpose of the first part of this randomized controlled clinical trial was to determine whether optical impressions produce different results from conventional impressions with respect to time efficiency and patient and operator perceptions of the clinical workflow. Three digital impressions and 1 conventional impression were made in each of 10 participants according to a randomly generated sequence. The digital systems were Lava COS, iTero, and Cerec Bluecam. The conventional impression was made with the closed-mouth technique and polyvinyl siloxane material. The time needed for powdering, impressions, and the interocclusal record was recorded. Patient and clinician perceptions of the procedures were rated by means of visual analog scales. The paired t test with Bonferroni correction was applied to detect differences (α=.05/6=.0083). The mean total working time ± standard deviation amounted to 260 ± 66 seconds for the conventional impression, 493 ± 193 seconds for Lava, 372 ± 126 seconds for iTero, and 357 ± 55 seconds for Cerec. The total working time for the conventional impression was significantly lower than that for Lava and Cerec. With regard to the working time without powdering, the differences between the methods were not statistically significant. The patient rating (very uncomfortable=0; comfortable=100) measured 61 ± 34 for the conventional impression, 71 ± 18 for Lava, 66 ± 20 for iTero, and 48 ± 18 for Cerec. The differences were not statistically significant. The clinician rating (simple=0; very difficult=100) was 13 ± 13 for the conventional impression, 54 ± 27 for Lava, 22 ± 11 for iTero, and 36 ± 23 for Cerec. The differences between the conventional impression and Lava and between iTero and Lava were statistically significant. The conventional impression was more time-effective than the digital impressions. In terms of patient comfort, no differences were found between the conventional and the digital techniques. With respect to the clinician perception of difficulty, the conventional impression and the digital impression with iTero revealed more favorable outcomes than the digital impression with Lava.
NASA Astrophysics Data System (ADS)
Yin, Shengwen; Yu, Dejie; Yin, Hui; Lü, Hui; Xia, Baizhan
2017-09-01
Considering the epistemic uncertainties within the hybrid Finite Element/Statistical Energy Analysis (FE/SEA) model when it is used for the response analysis of built-up systems in the mid-frequency range, the hybrid Evidence Theory-based Finite Element/Statistical Energy Analysis (ETFE/SEA) model is established by introducing evidence theory. Based on the hybrid ETFE/SEA model and the sub-interval perturbation technique, the hybrid Sub-interval Perturbation and Evidence Theory-based Finite Element/Statistical Energy Analysis (SIP-ETFE/SEA) approach is proposed. In the hybrid ETFE/SEA model, the uncertainty in the SEA subsystem is modeled by a non-parametric ensemble, while the uncertainty in the FE subsystem is described by focal elements and basic probability assignments (BPAs) and handled with evidence theory. Within the hybrid SIP-ETFE/SEA approach, the mid-frequency response of interest, such as the ensemble average of the energy response and the cross-spectrum response, is calculated analytically using the conventional hybrid FE/SEA method. Inspired by probability theory, intervals of the mean value, variance and cumulative distribution are used to describe the distribution characteristics of the mid-frequency responses of built-up systems with epistemic uncertainties. In order to alleviate the computational burden of the extreme value analysis, the sub-interval perturbation technique based on a first-order Taylor series expansion is used in the ETFE/SEA model to acquire the lower and upper bounds of the mid-frequency responses over each focal element. Three numerical examples are given to illustrate the feasibility and effectiveness of the proposed method.
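The bounding step can be illustrated generically. The sketch below (not from the paper) propagates one focal element's interval through a response function with a first-order Taylor expansion; in the full method such bounds would be computed per focal element and combined with the basic probability assignments into belief and plausibility measures.

    import numpy as np

    def taylor_bounds(f, x_mid, half_widths, h=1e-6):
        # First-order Taylor estimate of the range of f over the box
        # [x_mid - half_widths, x_mid + half_widths]:
        #   f(x) ~ f(x_mid) + grad . (x - x_mid)
        #   => bounds = f(x_mid) -/+ sum_i |g_i| * w_i
        x_mid = np.asarray(x_mid, float)
        g = np.array([(f(x_mid + h * e) - f(x_mid - h * e)) / (2 * h)
                      for e in np.eye(len(x_mid))])    # central-difference gradient
        spread = np.abs(g) @ np.asarray(half_widths, float)
        return f(x_mid) - spread, f(x_mid) + spread

    # Hypothetical response over one focal element of an uncertain parameter box:
    f = lambda x: x[0] ** 2 + 3 * x[0] * x[1]
    lo, hi = taylor_bounds(f, x_mid=[1.0, 2.0], half_widths=[0.05, 0.1])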
A biological compression model and its applications.
Cao, Minh Duc; Dix, Trevor I; Allison, Lloyd
2011-01-01
A biological compression model, the expert model, is presented, which is superior to existing compression algorithms in both compression performance and speed. The model is able to compress whole eukaryotic genomes. Most importantly, the model provides a framework for knowledge discovery from biological data: it can be used for repeat element discovery, sequence alignment and phylogenetic analysis. We demonstrate that the model can handle statistically biased sequences and distantly related sequences where conventional knowledge discovery tools often fail.
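To give a flavour of compression-based sequence comparison (a generic illustration using the normalized compression distance with zlib, not the expert model itself), consider the sketch below: sequences that share structure compress better together than apart, so related sequences score closer to 0.

    import random
    import zlib

    def ncd(x: bytes, y: bytes) -> float:
        # Normalized compression distance between two byte sequences.
        cx, cy = len(zlib.compress(x)), len(zlib.compress(y))
        cxy = len(zlib.compress(x + y))
        return (cxy - min(cx, cy)) / max(cx, cy)

    seq_a = b"ACGTACGTACGTACGTTTGACCA" * 20
    seq_b = seq_a[:200] + b"TTTT" + seq_a[200:]                     # near-copy of seq_a
    seq_c = bytes(random.Random(0).choices(b"ACGT", k=len(seq_a)))  # unrelated
    print(ncd(seq_a, seq_b), ncd(seq_a, seq_c))   # the related pair scores lower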
[The informational support of statistical observation related to children disability].
Son, I M; Polikarpov, A V; Ogrizko, E V; Golubeva, T Yu
2016-01-01
Within the framework of the Convention on the Rights of Persons with Disabilities, a revision is specified of the criteria for identifying disability in children and a reform of the system of medical social expertise according to international standards of health indices and health-related indices. In this connection, it is important to consider the relationship between alterations in the forms of Federal statistical monitoring concerning the registration of disabled children in the Russian Federation and the classifications of health indices and health-related indices applied in the identification of disability. The article presents an analysis of this relationship, covering the International Classification of Impairments, Disabilities and Handicaps (ICIDH), the International Classification of Functioning, Disability and Health (ICF), and its version for children and youth (ICF-CY). Intersectorial interaction is also considered within the framework of statistics on childhood disability.
NASA Astrophysics Data System (ADS)
Chauhan, H.; Krishna Mohan, B.
2014-11-01
The present study was undertaken with the objective of checking the effectiveness of spectral similarity measures in developing precise crop spectra from collected hyperspectral field spectra. In multispectral and hyperspectral remote sensing, classification of pixels is obtained by statistical comparison (by means of spectral similarity) of known field or library spectra to unknown image spectra. Though these algorithms are readily used, little emphasis has been placed on the use of spectral similarity measures to select precise crop spectra from a set of field spectra. Conventionally, crop spectra are developed after rejecting outliers based only on broad-spectrum analysis. Here a successful attempt has been made to develop precise crop spectra based on spectral similarity. As the use of unevaluated data leads to uncertainty in image classification, it is crucial to evaluate the data; hence, in contrast to the conventional method, data precision was assessed explicitly in the present work. The effectiveness of the developed precise field spectra was evaluated by spectral discrimination measures, which yielded higher discrimination values than spectra developed conventionally. The overall classification accuracy is 51.89% for the image classified using conventionally selected field spectra and 75.47% for the image classified using field spectra selected precisely on the basis of spectral similarity. KHAT values are 0.37 and 0.62, and Z values 2.77 and 9.59, for the images classified using conventional and precise field spectra, respectively. The considerably higher classification accuracy, KHAT and Z values show the promise of a new approach to field spectra selection based on spectral similarity measures.
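One common spectral similarity measure is easy to sketch. The code below is an illustration with an arbitrary angle threshold (the study does not specify this exact procedure): it uses the spectral angle to reject outlying field spectra before averaging the rest into a crop spectrum.

    import numpy as np

    def spectral_angle(a, b):
        # Spectral Angle Mapper (SAM): angle in radians between two spectra;
        # smaller angles mean more similar spectral shapes.
        cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
        return np.arccos(np.clip(cos, -1.0, 1.0))

    def precise_mean_spectrum(field_spectra, max_angle=0.05):
        # Keep only field spectra spectrally similar to the provisional mean,
        # then average the retained spectra into a "precise" crop spectrum.
        field_spectra = np.asarray(field_spectra, float)
        mean = field_spectra.mean(axis=0)
        keep = [s for s in field_spectra if spectral_angle(s, mean) <= max_angle]
        return np.mean(keep, axis=0)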
Reynders, Reint Meursinge; de Lange, Jan
2014-12-01
Cochrane Oral Health Group's Trials Register, the Cochrane Central Register of Controlled Trials (CENTRAL), Medline, Embase, key international orthodontic and dental journals and the World Health Organization (WHO) International Clinical Trials Registry Platform. Randomised controlled trials comparing surgical anchorage with conventional anchorage in orthodontic patients. Trials comparing two types of surgical anchorage were also included. Data extraction was performed independently and in duplicate by three review authors, and the Cochrane risk of bias tool was used to assess bias. Random-effects meta-analysis was used for more than three studies when pooling of the data was clinically and statistically appropriate; fixed-effect analysis was undertaken with two or three studies. Fifteen studies, involving 543 analysed participants, were included. Five ongoing studies were identified. Eight studies were assessed to be at high overall risk of bias, six at unclear risk and one study at low risk of bias. Ten studies (407 randomised and 390 analysed patients) compared surgical anchorage with conventional anchorage for the primary outcome. A random-effects meta-analysis of seven studies for the primary outcome found strong evidence of an effect of surgical anchorage. Compared with conventional anchorage, surgical anchorage was more effective in the reinforcement of anchorage by 1.68 mm (95% CI -2.27 mm to -1.09 mm) (moderate quality evidence). This result should be interpreted with some caution, however, as there was a substantial degree of heterogeneity for this comparison. There was no evidence of a difference in overall duration of treatment between surgical and conventional anchorage (low quality evidence). Information on patient-reported outcomes such as pain and acceptability was limited and inconclusive. When direct comparisons were made between two types of surgical anchorage, there was a lack of evidence to suggest that any one technique was better than another. There is moderate quality evidence that reinforcement of anchorage is more effective with surgical anchorage than conventional anchorage, and that results from mini-screw implants are particularly promising. While surgical anchorage is not associated with the inherent risks and compliance issues related to extra-oral headgear, none of the included studies reported on harms of surgical or conventional anchorage.
Near-term hybrid vehicle program, phase 1. Appendix D: Sensitivity analysis report
NASA Technical Reports Server (NTRS)
1979-01-01
Parametric analyses using a hybrid vehicle synthesis and economics program (HYVELD) are described, investigating the sensitivity of hybrid vehicle cost, fuel usage, utility, and marketability to changes in travel statistics, energy costs, vehicle lifetime and maintenance, owner use patterns, internal combustion engine (ICE) reference vehicle fuel economy, and drive-line component costs and type. The lowest initial cost of the hybrid vehicle would be $1200 to $1500 higher than that of the conventional vehicle. For nominal energy costs ($1.00/gal for gasoline and 4.2 cents/kWh for electricity), the ownership cost of the hybrid vehicle is projected to be 0.5 to 1.0 cents/mi less than that of the conventional ICE vehicle. To attain this ownership cost differential, the lifetime of the hybrid vehicle must be extended to 12 years and its maintenance cost reduced by 25 percent compared with the conventional vehicle. The ownership cost advantage of the hybrid vehicle increases rapidly as the price of fuel increases from $1 to $2/gal.
Llano, Sandra M; Muñoz-Jiménez, Ana M; Jiménez-Cartagena, Claudio; Londoño-Londoño, Julián; Medina, Sonia
2018-04-01
Agronomic production systems may affect the levels of food metabolites. Metabolomics approaches have been applied as a useful tool for the characterization of the fruit metabolome. In this study, metabolomics techniques were used to assess the differences in phytochemical composition between goldenberry samples produced by organic and conventional systems. To verify that the organic samples were free of pesticides, individual pesticides were analyzed. Principal component analysis showed a clear separation of goldenberry samples from the two different farming systems. In targeted metabolomics assays, in which carotenoids and ascorbic acid were analyzed, no statistical differences between the two crops were found. Conversely, untargeted metabolomics allowed us to identify two withanolides and one fatty acyl glycoside as tentative metabolites for differentiating goldenberry fruits, with organic fruits containing higher amounts of these compounds than conventional samples. Hence, untargeted metabolomics technology could be suitable for investigating differences in phytochemicals under different agricultural management practices and for authenticating organic products.
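As a generic illustration of the unsupervised step, the sketch below runs a principal component analysis on a synthetic samples-by-features intensity matrix; in the study the features would be metabolite signals, and a scores plot coloured by farming system would reveal the separation.

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    # Stand-in data: 15 "conventional" and 15 "organic" samples, 40 features.
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0.0, 1.0, (15, 40)),     # conventional samples
                   rng.normal(0.6, 1.0, (15, 40))])    # organic samples (shifted)
    groups = ["conventional"] * 15 + ["organic"] * 15

    scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))
    # Plotting scores[:, 0] vs scores[:, 1] coloured by `groups` shows whether
    # the two farming systems separate along the first principal components.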
Garaguso, Ivana; Nardini, Mirella
2015-07-15
Wine exerts beneficial effects on human health when it is drunk in moderation. Nevertheless, wine may also contain components that negatively affect human health; among these, sulfites may induce adverse effects after ingestion. We examined the total polyphenol and flavonoid content, phenolics profile and antioxidant activity of eight organic red wines produced without sulfur dioxide/sulfite addition in comparison with those of eight conventional red wines. Polyphenol and flavonoid content was slightly higher in organic wines than in conventional wines, but the differences did not reach statistical significance. The phenolic acids profile was quite similar in both groups of wines. Antioxidant activity was higher in organic wines than in conventional wines, although the differences were not statistically significant. Our results indicate that organic red wines produced without sulfur dioxide/sulfite addition are comparable to conventional red wines with regard to total polyphenol and flavonoid content, phenolics profile and antioxidant activity.
Demirel, Gamze; Uras, Nurdan; Celik, Istemi H; Aksoy, Hatice T; Oguz, Serife S; Erdeve, Omer; Erel, Ozcan; Dilmen, Ugur
2010-10-01
We evaluated and compared the oxidant and antioxidant status of hyperbilirubinemic infants before and after two forms of phototherapy, conventional and LED, in order to identify the optimal treatment method. Thirty newborns exposed to conventional phototherapy (Group I) and 30 infants exposed to LED phototherapy (Group II) were studied. The serum total antioxidant capacity (TAC) and total oxidant status (TOS) were assessed by Erel's method. There were no statistically significant differences in TAC or TOS levels between Group I and Group II prior to phototherapy, and no statistically significant difference in TAC levels between the two groups after phototherapy; however, TOS levels were significantly lower in Group II than in Group I after phototherapy. The oxidative stress index (OSI) increased after conventional phototherapy (p < 0.05). The increase in TOS following conventional phototherapy was not observed following LED phototherapy. This difference should be considered when choosing a phototherapy method.
Fukuda, Haruhisa; Kuroki, Manabu
2016-03-01
To develop and internally validate a surgical site infection (SSI) prediction model for Japan. Retrospective observational cohort study. We analyzed surveillance data submitted to the Japan Nosocomial Infections Surveillance system for patients who had undergone target surgical procedures from January 1, 2010, through December 31, 2012. Logistic regression analyses were used to develop statistical models for predicting SSIs. An SSI prediction model was constructed for each of the procedure categories by statistically selecting the appropriate risk factors from among the collected surveillance data and determining their optimal categorization. Standard bootstrapping techniques were applied to assess potential overfitting. The C-index was used to compare the predictive performances of the new statistical models with those of models based on conventional risk index variables. The study sample comprised 349,987 cases from 428 participant hospitals throughout Japan, and the overall SSI incidence was 7.0%. The C-indices of the new statistical models were significantly higher than those of the conventional risk index models in 21 (67.7%) of the 31 procedure categories (P<.05). No significant overfitting was detected. Japan-specific SSI prediction models were shown to generally have higher accuracy than conventional risk index models. These new models may have applications in assessing hospital performance and identifying high-risk patients in specific procedure categories.
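The internal-validation step the abstract mentions can be sketched generically. The code below is a standard optimism-bootstrap check of a logistic model's C-index (equivalent to the ROC AUC for binary outcomes); the data, event rate, and the specific correction recipe are stand-ins, not the study's procedure.

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    # Stand-in cohort with a ~7% event rate, loosely echoing the SSI incidence.
    X, y = make_classification(n_samples=2000, n_features=8, weights=[0.93],
                               random_state=0)
    model = LogisticRegression(max_iter=1000).fit(X, y)
    apparent = roc_auc_score(y, model.predict_proba(X)[:, 1])   # apparent C-index

    rng = np.random.default_rng(0)
    optimism = []
    for _ in range(200):
        idx = rng.integers(0, len(y), len(y))          # bootstrap resample
        m = LogisticRegression(max_iter=1000).fit(X[idx], y[idx])
        boot = roc_auc_score(y[idx], m.predict_proba(X[idx])[:, 1])
        test = roc_auc_score(y, m.predict_proba(X)[:, 1])   # on the original sample
        optimism.append(boot - test)
    corrected = apparent - np.mean(optimism)           # optimism-corrected C-index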
Chan, Kwun Chuen Gary; Qin, Jing
2015-10-01
Existing linear rank statistics cannot be applied to cross-sectional survival data without follow-up, since all subjects are essentially censored. However, partial survival information is available from backward recurrence times, which are frequently collected in health surveys without prospective follow-up. Under length-biased sampling, a class of linear rank statistics is proposed based only on backward recurrence times, without any prospective follow-up. When follow-up data are available, the proposed rank statistic and a conventional rank statistic that utilizes follow-up information from the same sample are shown to be asymptotically independent. We discuss four ways to combine these two statistics when follow-up is present. Simulations show that all combined statistics have substantially improved power compared with conventional rank statistics, and that a Mantel-Haenszel test performed best among the proposed statistics. The method is applied to a cross-sectional health survey without follow-up and a study of Alzheimer's disease with prospective follow-up.
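The combination step rests on the asymptotic independence of the two statistics. As a generic illustration (a Stouffer-type weighted combination, not necessarily one of the paper's four rules), independent standard-normal scores can be merged as follows.

    import numpy as np
    from scipy import stats

    def combine_z(z_backward, z_followup, w=0.5):
        # Because the two statistics are asymptotically independent, a weighted
        # sum of standard-normal scores is again standard normal under the null;
        # w balances the backward-recurrence and follow-up information.
        z = (w * z_backward + (1 - w) * z_followup) / np.sqrt(w**2 + (1 - w)**2)
        p = 2 * stats.norm.sf(abs(z))                  # two-sided p-value
        return z, p

    z, p = combine_z(1.7, 2.1)   # illustrative inputs only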
Kashif, Muhammad; Andersson, Claes; Åberg, Magnus; Nygren, Peter; Sjöblom, Tobias; Hammerling, Ulf; Larsson, Rolf; Gustafsson, Mats G
2014-07-01
For decades, the standard procedure when screening for candidate anticancer drug combinations has been to search for synergy, defined as any positive deviation from trivial cases like when the drugs are regarded as diluted versions of each other (Loewe additivity), independent actions (Bliss independence), or no interaction terms in a response surface model (no interaction). Here, we show that this kind of conventional synergy analysis may be completely misleading when the goal is to detect if there is a promising in vitro therapeutic window. Motivated by this result, and the fact that a drug combination offering a promising therapeutic window seldom is interesting if one of its constituent drugs can provide the same window alone, the largely overlooked concept of therapeutic synergy (TS) is reintroduced. In vitro TS is said to occur when the largest therapeutic window obtained by the best drug combination cannot be achieved by any single drug within the concentration range studied. Using this definition of TS, we introduce a procedure that enables its use in modern massively parallel experiments supported by a statistical omnibus test for TS designed to avoid the multiple testing problem. Finally, we suggest how one may perform TS analysis, via computational predictions of the reference cell responses, when only the target cell responses are available. In conclusion, the conventional error-prone search for promising drug combinations may be improved by replacing conventional (toxicology-rooted) synergy analysis with an analysis focused on (clinically motivated) TS.
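A deterministic toy version of the TS definition may clarify it. In the sketch below the response surfaces, the window definition, and the toxicity threshold are all invented; a real screen would additionally apply the omnibus statistical test across replicates that the abstract describes.

    import numpy as np

    def window(eff, tox, max_tox=0.5):
        # One simple therapeutic-window definition: best target-cell kill
        # achievable while normal-cell kill stays below max_tox.
        return float(np.max(np.where(np.asarray(tox) < max_tox, eff, 0.0)))

    # Toy response surfaces over a dose grid; row/column 0 = drug absent, so
    # eff[0, :] and eff[:, 0] are the single-drug dose-response curves.
    doses = np.linspace(0.0, 3.0, 7)
    eff = 1 - np.exp(-0.8 * np.add.outer(doses, doses))       # target cells
    tox = 1 - np.exp(-0.5 * np.maximum.outer(doses, doses))   # normal cells spared
                                                              # by splitting the dose
    single = max(window(eff[0, :], tox[0, :]), window(eff[:, 0], tox[:, 0]))
    combo = window(eff, tox)
    print("therapeutic synergy" if combo > single else "no therapeutic synergy")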
Dadhich, Hrishikesh; Toi, Pampa Ch; Siddaraju, Neelaiah; Sevvanthi, Kalidas
2016-11-01
Clinically, detection of malignant cells in serous body fluids is critical, as their presence implies upstaging of the disease. Cytology of body cavity fluids serves as an important tool when other diagnostic tests cannot be performed. In most laboratories, the effusion fluid samples are currently analysed chiefly by the conventional cytopreparatory (CCP) technique. Although there are several studies comparing liquid-based cytology (LBC) with the CCP technique in the field of cervicovaginal cytology, the literature on such comparison with respect to serous body fluid examination is sparse. One hundred samples of serous body fluids were processed by both the CCP and LBC techniques. Slides prepared by these techniques were studied using six parameters. A comparative analysis of the advantages and disadvantages of the techniques in the detection of malignant cells was carried out with appropriate statistical tests. The samples comprised 52 pleural, 44 peritoneal and 4 pericardial fluids. No statistically significant difference was noted with respect to cellularity (P = 0.22), cell distribution (P = 0.39) or diagnosis of malignancy (P = 0.20). As for the remaining parameters, LBC provided a significantly clearer smear background (P < 0.0001) and shorter screening time (P < 0.0001), while the CCP technique provided significantly better staining quality (P = 0.01) and sharper cytomorphologic features (P = 0.05). Although reduced screening time and a clearer smear background are the two major advantages of LBC, the CCP technique provides better staining quality with sharper cytomorphologic features, which is more critical from the point of view of cytologic interpretation. Diagn. Cytopathol. 2016;44:874-879.
Tsirogiannis, Panagiotis; Reissmann, Daniel R; Heydecke, Guido
2016-09-01
In existing published reports, some studies indicate the superiority of digital impression systems in terms of the marginal accuracy of ceramic restorations, whereas others show that the conventional method provides restorations with better marginal fit than fully digital fabrication. Which impression method provides the lowest mean values for marginal adaptation is inconclusive. The findings from those studies cannot be easily generalized, and in vivo studies that could provide valid and meaningful information are limited in the existing publications. The purpose of this study was to systematically review existing reports and evaluate the marginal fit of ceramic single-tooth restorations after either digital or conventional impression methods by combining the available evidence in a meta-analysis. The search strategy for this systematic review was based on a Population, Intervention, Comparison, and Outcome (PICO) framework. For the statistical analysis, the mean marginal fit values of each study were extracted and categorized according to the impression method, to calculate the mean value together with the 95% confidence interval (CI) of each category and to evaluate the impact of each impression method on marginal adaptation by comparing digital and conventional techniques separately for in vitro and in vivo studies. Twelve studies were included in the meta-analysis from the 63 records identified by the database search. For the in vitro studies, the mean marginal fit value of ceramic restorations fabricated after conventional impressions was 58.9 μm (95% CI: 41.1-76.7 μm), whereas after digital impressions it was 63.3 μm (95% CI: 50.5-76.0 μm). In the in vivo studies, the mean marginal discrepancy of restorations after digital impressions was 56.1 μm (95% CI: 46.3-65.8 μm), whereas after conventional impressions it was 79.2 μm (95% CI: 59.6-98.9 μm). No significant difference was observed regarding the marginal discrepancy of single-unit ceramic restorations fabricated after digital or conventional impressions.
Eighteen-Month Final Evaluation of UPS Second Generation Diesel Hybrid-Electric Delivery Vans
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lammert, M.; Walkowicz, K.
2012-09-01
A parallel hybrid-electric diesel delivery van propulsion system was evaluated at a UPS facility in Minneapolis using on-vehicle data logging, fueling, and maintenance records. Route and drive cycle analysis showed different duty cycles for hybrid vs. conventional delivery vans; routes were switched between the study groups to provide a valid comparison. The hybrids demonstrated greater advantage on the more urban routes; the initial conventional vans' routes had less dense delivery zones. The fuel economy of the hybrids on the original conventional group's routes was 10.4 mpg vs. 9.2 mpg for the conventional group on those routes a year earlier. The hybrid group's fuel economy on the original hybrid route assignments was 9.4 mpg vs. 7.9 mpg for the conventional group on those routes a year later. There was no statistically significant difference in total maintenance cost per mile or in the vehicle total cost of operation per mile. Propulsion-related maintenance cost per mile was 77% higher for the hybrids, but only 52% more on a cost-per-delivery-day basis. Laboratory dynamometer testing demonstrated 13%-36% hybrid fuel economy improvement, depending on duty cycle, and up to a 45% improvement in ton-mi/gal.
Phillips, Christie A; Harrison, Mark A
2005-06-01
Considerable speculation has occurred concerning the potential for higher numbers of foodborne pathogens on organically grown produce compared with produce not grown organically. The microflora composition of spring mix or mesclun, a mixture of multiple salad ingredients, grown either by organic or conventional means was determined. Unwashed or washed spring mix was obtained from a commercial California fresh-cut produce processor who does not use manure in their cultivation practices. Fifty-four samples of each type of product were supplied over a 4-month period. Analysis included enumeration of total mesophiles, psychrotrophs, coliforms, generic Escherichia coli, lactic acid bacteria, yeasts, and molds. In addition, spring mix was analyzed for the presence of Salmonella and Listeria monocytogenes. The mean populations of mesophilic and psychrotrophic bacteria, yeasts, molds, lactic acid bacteria, and coliforms on conventionally grown spring mix were not statistically different (P > 0.05) from respective mean populations on organically grown spring mix. The mean population of each microbial group was significantly higher on unwashed spring mix compared with the washed product. Of the 14 samples found to contain E. coli, eight were from nonwashed conventional spring mix, one was from washed conventional spring mix, and four were from nonwashed organic spring mix. Salmonella and L. monocytogenes were not detected in any of the samples analyzed.
Life-cycle greenhouse gas emissions of shale gas, natural gas, coal, and petroleum.
Burnham, Andrew; Han, Jeongwoo; Clark, Corrie E; Wang, Michael; Dunn, Jennifer B; Palou-Rivera, Ignasi
2012-01-17
The technologies and practices that have enabled the recent boom in shale gas production have also brought attention to the environmental impacts of its use. It has been debated whether the fugitive methane emissions during natural gas production and transmission outweigh the lower carbon dioxide emissions during combustion when compared to coal and petroleum. Using the current state of knowledge of methane emissions from shale gas, conventional natural gas, coal, and petroleum, we estimated up-to-date life-cycle greenhouse gas emissions. In addition, we developed distribution functions for key parameters in each pathway to examine uncertainty and identify data gaps, such as methane emissions from shale gas well completions and conventional natural gas liquid unloadings, that need to be further addressed. Our base case results show that shale gas life-cycle emissions are 6% lower than conventional natural gas, 23% lower than gasoline, and 33% lower than coal. However, the ranges of values for shale and conventional gas overlap, so there is statistical uncertainty as to whether shale gas emissions are indeed lower than conventional gas. Moreover, this life-cycle analysis, among other work in this area, provides insight into critical stages on which the natural gas industry and government agencies can work together to reduce the greenhouse gas footprint of natural gas.
Utah Virtual Lab: JAVA interactivity for teaching science and statistics on line.
Malloy, T E; Jensen, G C
2001-05-01
The Utah on-line Virtual Lab is a JAVA program run dynamically off a database. It is embedded in StatCenter (www.psych.utah.edu/learn/statsampler.html), an on-line collection of tools and text for teaching and learning statistics. Instructors author a statistical virtual reality that simulates theories and data in a specific research focus area by defining independent, predictor, and dependent variables and the relations among them. Students work in an on-line virtual environment to discover the principles of this simulated reality: They go to a library, read theoretical overviews and scientific puzzles, and then go to a lab, design a study, collect and analyze data, and write a report. Each student's design and data analysis decisions are computer-graded and recorded in a database; the written research report can be read by the instructor or by other students in peer groups simulating scientific conventions.
Aslanimehr, Masoomeh; Rezvani, Shirin; Mahmoudi, Ali; Moosavi, Najmeh
2017-01-01
Statement of the Problem: Candida species are believed to play an important role in the initiation and progression of denture stomatitis. The type of denture material also influences the adhesion of candida and the development of stomatitis. Purpose: The aim of this study was to compare the adherence of Candida albicans to conventional and injection molding acrylic denture base materials. Materials and Method: Twenty injection molding and 20 conventional pressure pack acrylic discs (10×10×2 mm) were prepared according to their manufacturers' instructions. Immediately before the study, samples were placed in sterile water for 3 days to remove residual monomers. The samples were then sterilized using an ultraviolet light unit for 10 minutes. A 1×10⁸ CFU/ml suspension of Candida albicans ATCC-10231 was prepared from the organism cultured for 48 h on Sabouraud dextrose agar plates incubated at 37°C. 100 μL of this suspension was placed on the surface of each disk. After being incubated at 37°C for 1 hour, the samples were washed with normal saline to remove non-adherent cells. Attached cells were counted using the colony count method after shaking at 3000 rpm for 20 seconds. Finally, each group was tested 108 times and the data were statistically analyzed by t-test. Results: Quantitative analysis revealed that the difference in mean colony counts of Candida albicans adhering to the conventional acrylic material (8.3×10³) compared with the injection molding acrylic resin (6×10³) was statistically significant (p<0.001). Conclusion: The significant reduction of Candida albicans adherence to the injection acrylic resin materials makes them valuable for patients at high risk of denture stomatitis. PMID:28280761
Effects of self-ligating and conventional brackets on halitosis and periodontal conditions.
Kaygisiz, Emine; Uzuner, Fatma Deniz; Yuksel, Sema; Taner, Levent; Çulhaoğlu, Rana; Sezgin, Yasemin; Ateş, Can
2015-05-01
To evaluate the effects of fixed orthodontic treatment with steel-ligated conventional brackets and self-ligating brackets on halitosis and periodontal health. Sixty patients at the permanent dentition stage, aged 12 to 18 years, who had Angle Class I malocclusion with mild-to-moderate crowding were randomly selected. Inclusion criteria were nonsmokers, no systemic disease, and no use of antibiotics or oral mouth rinses during the 2-month period before the study. The patients were randomly subdivided into three groups: a group treated with conventional brackets ligated with steel ligature wires (group 1, n = 20), a group treated with self-ligating brackets (group 2, n = 20), and a control group (group 3, n = 20). Periodontal records were obtained 1 week before bonding (T1), immediately before bonding (T2), 1 week after bonding (T3), 4 weeks after bonding (T4), and 8 weeks after bonding (T5). Measurements of the control group were repeated at the same time points. The volatile sulfur compounds that determine halitosis were measured with the Halimeter at T2, T3, T4, and T5. A two-way repeated-measures analysis of variance (ANOVA) was used to compare the groups statistically. No statistically significant group × time interactions were found for plaque index, gingival index, pocket depth, bleeding on probing, or halitosis, meaning that the three independent groups changed similarly over time. The risk of the tongue coating index (TCI) being 2 was 10.2 times higher at T1 than at T5 (P < .001); hence, the probability of a higher TCI decreased over time in all groups. Self-ligating brackets do not have an advantage over conventional brackets with respect to periodontal status and halitosis.
Evaluation of a HDR image sensor with logarithmic response for mobile video-based applications
NASA Astrophysics Data System (ADS)
Tektonidis, Marco; Pietrzak, Mateusz; Monnin, David
2017-10-01
The performance of mobile video-based applications using conventional LDR (Low Dynamic Range) image sensors highly depends on the illumination conditions. As an alternative, HDR (High Dynamic Range) image sensors with logarithmic response are capable to acquire illumination-invariant HDR images in a single shot. We have implemented a complete image processing framework for a HDR sensor, including preprocessing methods (nonuniformity correction (NUC), cross-talk correction (CTC), and demosaicing) as well as tone mapping (TM). We have evaluated the HDR sensor for video-based applications w.r.t. the display of images and w.r.t. image analysis techniques. Regarding the display we have investigated the image intensity statistics over time, and regarding image analysis we assessed the number of feature correspondences between consecutive frames of temporal image sequences. For the evaluation we used HDR image data recorded from a vehicle on outdoor or combined outdoor/indoor itineraries, and we performed a comparison with corresponding conventional LDR image data.
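The feature-correspondence criterion used in this kind of evaluation can be sketched with standard tools; the following illustrative Python/OpenCV snippet (not the authors' framework) counts ORB matches between two consecutive frames, here simulated with synthetic images:

```python
import cv2
import numpy as np

# Count feature correspondences between consecutive frames; in practice
# the inputs would be tone-mapped HDR frames rather than synthetic noise.
rng = np.random.default_rng(5)
frame1 = rng.integers(0, 255, (480, 640), dtype=np.uint8)
frame2 = np.roll(frame1, 3, axis=1)  # simulate slight camera motion

orb = cv2.ORB_create(nfeatures=1000)
kp1, des1 = orb.detectAndCompute(frame1, None)
kp2, des2 = orb.detectAndCompute(frame2, None)

matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = matcher.match(des1, des2)
print(len(matches))  # more correspondences suggest more stable image analysis
```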
Liu, Mei-Juan; Men, Yan-Ming; Zhang, Yong-Lin; Zhang, Yu-Xi; Liu, Hao
2017-07-01
We aimed to evaluate the diagnostic values of conventional ultrasound (US), ultrasound contrast (UC), and ultrasound elastography (UE) in distinguishing benign from malignant thyroid nodules. A total of 100 patients with thyroid nodules receiving operative treatment were selected; they underwent conventional US, UE, and UC examinations before operation. The nodules received pathological examination after operation to distinguish benign from malignant lesions. The sensitivity, specificity, and diagnostic accordance rate of each method were evaluated by receiver operating characteristic (ROC) curve analysis, and the area under the curve (AUC) was calculated. On conventional US, malignant thyroid nodules mostly showed hypoechogenicity, heterogeneous echo, irregular shape, unclear boundary, aspect ratio <1, microcalcification, and an irregular peripheral echo halo, with statistically significant differences compared with benign nodules (P<0.05). UE showed statistically significant differences between benign and malignant nodules for elasticity scores of 2, 3, and 4 (P<0.05). On UC, malignant nodules mostly showed irregular shape, obscure boundary, no obvious enhancement, heterogeneous enhancement, and visible perfusion defects, with statistically significant differences compared with benign nodules (P<0.05). ROC analysis showed that both the sensitivity and specificity of UE and UC were superior to those of conventional US. The AUC was largest (AUC = 0.908) and the diagnostic value highest for conventional US combined with UE and UC. Conventional US combined with elastography and UC can significantly improve the sensitivity, specificity, and accuracy of diagnosis of benign and malignant thyroid nodules.
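The ROC analysis underlying results like these can be illustrated with a short sketch; all scores and labels below are synthetic stand-ins, not the study's data:

```python
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

# ROC sketch for a combined ultrasound score:
# label 1 = malignant on pathology, score = hypothetical combined US+UE+UC rating.
rng = np.random.default_rng(6)
y = np.r_[np.zeros(70), np.ones(30)].astype(int)
score = np.r_[rng.normal(2.0, 1.0, 70), rng.normal(3.5, 1.0, 30)]

auc = roc_auc_score(y, score)
fpr, tpr, thresholds = roc_curve(y, score)
best = np.argmax(tpr - fpr)  # Youden index picks an operating point
print(f"AUC={auc:.3f}, sens={tpr[best]:.2f}, spec={1 - fpr[best]:.2f}")
```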
Hiremath, S B; Muraleedharan, A; Kumar, S; Nagesh, C; Kesavadas, C; Abraham, M; Kapilamoorthy, T R; Thomas, B
2017-04-01
Tumefactive demyelinating lesions with atypical features can mimic high-grade gliomas on conventional imaging sequences. The aim of this study was to assess the role of conventional imaging, DTI metrics (p:q tensor decomposition), and DSC perfusion in differentiating tumefactive demyelinating lesions and high-grade gliomas. Fourteen patients with tumefactive demyelinating lesions and 21 patients with high-grade gliomas underwent brain MR imaging with conventional, DTI, and DSC perfusion imaging. Imaging sequences were assessed for differentiation of the lesions. DTI metrics in the enhancing areas and perilesional hyperintensity were obtained by ROI analysis, and the relative CBV values in enhancing areas were calculated on DSC perfusion imaging. Conventional imaging sequences had a sensitivity of 80.9% and specificity of 57.1% in differentiating high-grade gliomas (P = .049) from tumefactive demyelinating lesions. DTI metrics (p:q tensor decomposition) and DSC perfusion demonstrated a statistically significant difference in the mean values of ADC, the isotropic component of the diffusion tensor, the anisotropic component of the diffusion tensor, the total magnitude of the diffusion tensor, and rCBV among enhancing portions in tumefactive demyelinating lesions and high-grade gliomas (P ≤ .02), with the highest specificity for ADC, the anisotropic component of the diffusion tensor, and relative CBV (92.9%). Mean fractional anisotropy values showed no significant statistical difference between tumefactive demyelinating lesions and high-grade gliomas. The combination of DTI and DSC parameters improved the diagnostic accuracy (area under the curve = 0.901). Addition of a heterogeneous enhancement pattern to DTI and DSC parameters improved it further (area under the curve = 0.966). The sensitivity increased from 71.4% to 85.7% after the addition of the enhancement pattern. DTI and DSC perfusion add profoundly to conventional imaging in differentiating tumefactive demyelinating lesions and high-grade gliomas. The combination of DTI metrics and DSC perfusion markedly improved diagnostic accuracy. © 2017 by American Journal of Neuroradiology.
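The p:q tensor decomposition named above has a standard formulation in terms of the diffusion-tensor eigenvalues; a brief sketch with hypothetical eigenvalues (the exact convention is an assumption here, not taken from the paper):

```python
import numpy as np

# p:q decomposition sketch: isotropic (p) and anisotropic (q) components and
# total magnitude (L) of the diffusion tensor, from its eigenvalues
# (hypothetical values in units of 10^-3 mm^2/s).
lam = np.array([1.9, 0.5, 0.4])

md = lam.mean()                       # mean diffusivity
p = np.sqrt(3.0) * md                 # isotropic component
q = np.sqrt(np.sum((lam - md) ** 2))  # anisotropic component
L = np.hypot(p, q)                    # total magnitude of the tensor
print(p, q, L)
```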
Utility of Modified Ultrafast Papanicolaou Stain in Cytological Diagnosis.
Sinkar, Prachi; Arakeri, Surekha Ulhas
2017-03-01
The need for minimal turnaround time in assessing Fine Needle Aspiration Cytology (FNAC) has encouraged innovations in staining techniques that require less staining time while preserving unequivocal cell morphology. The standard protocol for the conventional Papanicolaou (PAP) stain requires about 40 minutes. To overcome this, the Ultrafast Papanicolaou (UFP) stain was introduced, which reduces staining time to 90 seconds and also enhances quality. However, the reagents required for it were not easily available; hence, the Modified Ultrafast Papanicolaou (MUFP) stain was subsequently introduced. The aim was to assess the efficacy of MUFP staining by comparing the quality of the MUFP stain with the conventional PAP stain. The FNAC procedure was performed using a 10 ml disposable syringe and a 22-23 G needle. A total of 131 FNAC cases were studied: lymph node (30), thyroid (38), breast (22), skin and soft tissue (24), salivary gland (11), and visceral organs (6). Two smears were prepared and stained by MUFP and conventional PAP stains. Scores were given on four parameters: background of smears, overall staining pattern, cell morphology, and nuclear staining. A Quality Index (QI) was calculated as the ratio of the total score achieved to the maximum score possible. A chi-square test was applied to each of the four parameters before obtaining the QI for both stains, and Student's t-test was applied to evaluate the efficacy of MUFP in comparison with the conventional PAP stain. The QI of MUFP for thyroid, breast, lymph node, skin and soft tissue, salivary gland, and visceral organs was 0.89, 0.85, 0.89, 0.83, 0.92, and 0.78, respectively. Compared with the conventional PAP stain, the QI of MUFP smears was better in all except visceral organ cases, and the difference was statistically significant. MUFP showed a clear red-blood-cell background, transparent cytoplasm, and crisp nuclear features. MUFP is fast, reliable, and can be done with locally available reagents with unequivocal morphology, which is the need of the hour for a cytopathology set-up.
Combining statistical inference and decisions in ecology
Williams, Perry J.; Hooten, Mevin B.
2016-01-01
Statistical decision theory (SDT) is a sub-field of decision theory that formally incorporates statistical investigation into a decision-theoretic framework to account for uncertainties in a decision problem. SDT provides a unifying analysis of three types of information: statistical results from a data set, knowledge of the consequences of potential choices (i.e., loss), and prior beliefs about a system. SDT links the theoretical development of a large body of statistical methods including point estimation, hypothesis testing, and confidence interval estimation. The theory and application of SDT have mainly been developed and published in the fields of mathematics, statistics, operations research, and other decision sciences, but have had limited exposure in ecology. Thus, we provide an introduction to SDT for ecologists and describe its utility for linking the conventionally separate tasks of statistical investigation and decision making in a single framework. We describe the basic framework of both Bayesian and frequentist SDT, its traditional use in statistics, and discuss its application to decision problems that occur in ecology. We demonstrate SDT with two types of decisions: Bayesian point estimation, and an applied management problem of selecting a prescribed fire rotation for managing a grassland bird species. Central to SDT, and decision theory in general, are loss functions. Thus, we also provide basic guidance and references for constructing loss functions for an SDT problem.
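The central role of loss functions in SDT can be shown in a few lines: the Bayes decision minimizes posterior expected loss, so squared-error loss leads to the posterior mean and absolute-error loss to the posterior median. A sketch with a synthetic posterior:

```python
import numpy as np

# SDT point-estimation sketch: compare Bayes actions under two loss functions.
rng = np.random.default_rng(7)
posterior = rng.gamma(4.0, 0.5, 20_000)  # synthetic posterior draws for a parameter

candidates = np.linspace(0.0, 10.0, 201)
sq_loss = [np.mean((posterior - a) ** 2) for a in candidates]    # squared-error loss
abs_loss = [np.mean(np.abs(posterior - a)) for a in candidates]  # absolute-error loss

print(candidates[np.argmin(sq_loss)], posterior.mean())       # ~ posterior mean
print(candidates[np.argmin(abs_loss)], np.median(posterior))  # ~ posterior median
```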
Effects of Item Exposure for Conventional Examinations in a Continuous Testing Environment.
ERIC Educational Resources Information Center
Hertz, Norman R.; Chinn, Roberta N.
This study explored the effect of item exposure on two conventional examinations administered as computer-based tests. A principal hypothesis was that item exposure would have little or no effect on average difficulty of the items over the course of an administrative cycle. This hypothesis was tested by exploring conventional item statistics and…
Saez, M; Figueiras, A; Ballester, F; Pérez-Hoyos, S; Ocaña, R; Tobías, A
2001-06-01
The objective of this paper is to introduce a different approach, called ecological-longitudinal, to carrying out pooled analysis in time-series ecological studies. Because it yields a larger number of data points and hence increases the statistical power of the analysis, this approach, unlike conventional ones, accommodates random-effects models, lags, and interactions between pollutants and between pollutants and meteorological variables, which are hard to implement in conventional approaches. The approach is illustrated by providing quantitative estimates of the short-term effects of air pollution on mortality in three Spanish cities, Barcelona, Valencia and Vigo, for the period 1992-1994. Because the dependent variable was a count, a Poisson generalised linear model was first specified. Several modelling issues are worth mentioning. Firstly, because the relations between mortality and explanatory variables were non-linear, cubic splines were used for covariate control, leading to a generalised additive model (GAM). Secondly, the effects of the predictors on the response were allowed to occur with some lag. Thirdly, residual autocorrelation due to imperfect control was itself controlled for by means of an autoregressive Poisson GAM. Finally, the longitudinal design demanded consideration of individual heterogeneity, requiring mixed models. The estimates of the relative risks obtained from the individual analyses varied across cities, particularly those associated with sulphur dioxide. The highest relative risks corresponded to black smoke in Valencia. These estimates were higher than those obtained from the ecological-longitudinal analysis. Relative risks estimated from the latter analysis were practically identical across cities, 1.00638 (95% confidence interval 1.0002, 1.0011) for a black smoke increase of 10 microg/m(3) and 1.00415 (95% CI 1.0001, 1.0007) for an increase of 10 microg/m(3) of sulphur dioxide. Because the statistical power is higher than in the individual analyses, more interactions were statistically significant, especially those among air pollutants and meteorological variables. Air pollutant levels were related to mortality in the three cities of the study, Barcelona, Valencia and Vigo. These results are consistent with similar studies in other cities and with other multicentric studies, and coherent with both previous individual studies for each city and multicentric studies for all three cities.
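A stripped-down version of the core model, a lagged Poisson regression with spline control for temperature, can be sketched as follows; the data are simulated, and the paper's full ecological-longitudinal model additionally includes autoregressive terms and random effects:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Sketch of a lagged Poisson regression for daily mortality vs. black smoke,
# with a cubic-spline term for temperature (all series synthetic).
rng = np.random.default_rng(8)
n = 1095  # about three years of daily data
temp = 15 + 8 * np.sin(2 * np.pi * np.arange(n) / 365.25) + rng.normal(0, 2, n)
smoke = rng.gamma(4, 10, n)  # hypothetical black smoke levels, microg/m3
deaths = rng.poisson(np.exp(2.0 + 0.0006 * smoke + 0.005 * np.abs(temp - 18)))

df = pd.DataFrame({"deaths": deaths[1:], "smoke_lag1": smoke[:-1], "temp": temp[1:]})
fit = smf.glm("deaths ~ smoke_lag1 + bs(temp, df=4)", data=df,
              family=sm.families.Poisson()).fit()
print(np.exp(10 * fit.params["smoke_lag1"]))  # relative risk per 10 microg/m3
```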
Bookstein, Fred L; Domjanić, Jacqueline
2014-09-01
The relationship of geometric morphometrics (GMM) to functional analysis of the same morphological resources is currently a topic of active interest among functional morphologists. Although GMM is typically advertised as free of prior assumptions about shape features or morphological theories, it is common for GMM findings to be concordant with findings from studies based on a priori lists of shape features whenever prior insights or theories have been properly accounted for in the study design. The present paper demonstrates this happy possibility by revisiting a previously published GMM analysis of footprint outlines for which there is also functionally relevant information in the form of a priori foot measurements. We show how to convert the conventional measurements into the language of shape, thereby affording two parallel statistical analyses. One is the classic multivariate analysis of "shape features", the other the equally classic GMM of semilandmark coordinates. In this example, the two data sets, analyzed by protocols that are remarkably different in both their geometry and their algebra, nevertheless result in one common biometrical summary: wearing high heels is bad for women inasmuch as it leads to the need for orthotic devices to treat the consequently flattened arch. This concordance bears implications for other branches of applied anthropology. To carry out a good biomedical analysis of applied anthropometric data, it may not matter whether one uses GMM or instead an adequate assortment of conventional measurements. What matters is whether the conventional measurements have been selected to match the natural spectrum of functional variation.
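The GMM side of such an analysis begins with Procrustes superimposition, which removes location, scale, and rotation before shapes are compared; a minimal sketch with synthetic outlines:

```python
import numpy as np
from scipy.spatial import procrustes

# Two hypothetical footprint outlines as (k landmarks x 2) coordinate arrays.
rng = np.random.default_rng(1)
shape_a = rng.normal(size=(30, 2))
# A rotated, sheared, and shifted copy standing in for a second specimen.
shape_b = shape_a @ np.array([[0.9, -0.1], [0.1, 0.9]]) + 0.5

mtx1, mtx2, disparity = procrustes(shape_a, shape_b)
print(disparity)  # residual shape difference after removing location/scale/rotation
```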
Revising the lower statistical limit of x-ray grating-based phase-contrast computed tomography.
Marschner, Mathias; Birnbacher, Lorenz; Willner, Marian; Chabior, Michael; Herzen, Julia; Noël, Peter B; Pfeiffer, Franz
2017-01-01
Phase-contrast x-ray computed tomography (PCCT) is currently investigated as an interesting extension of conventional CT, providing high soft-tissue contrast even when examining weakly absorbing specimens. Until now, the potential for dose reduction was thought to be limited compared to attenuation CT, since meaningful phase retrieval fails for scans with very low photon counts when using the conventional phase retrieval method via phase stepping. In this work, we examine the statistical behaviour of the reverse projection method, an alternative phase retrieval approach, and compare the results to the conventional phase retrieval technique. We investigate the noise levels in the projections as well as the image quality and quantitative accuracy of the reconstructed tomographic volumes. The results of our study show that this method performs better in a low-dose scenario than the conventional phase retrieval approach, resulting in lower noise levels, enhanced image quality, and more accurate quantitative values. Overall, we demonstrate that the lower statistical limit of the phase stepping procedure as proposed by recent literature does not apply to this alternative phase retrieval technique. However, further development is necessary to overcome the experimental challenges posed by this method, which would enable mainstream or even clinical application of PCCT.
Depincé-Berger, Anne E; Moreau, Amelie; Bossy, Virginie; Genin, Christian; Rinaudo, Melanie; Paul, Stephane
2016-09-01
Indirect immunofluorescence plays a major role in the detection of antinuclear antibodies (ANAs) and the follow-up of their titers in the context of connective tissue diseases. Given the numerous unfavorable features of conventional manual reading of HEp-2 slides (the time and expert morphologists needed for reading, lack of standardization, subjectivity of interpretation), the biomedical industry has developed automated techniques of slide preparation and microscope reading. We collected 49 sera previously analyzed by conventional slide reading. They were prepared again by QUANTA-Lyser® and reanalyzed under four different conditions: two screening dilutions (1/40 and 1/80) and two systems of analysis, NOVA View® automated reading (INOVA Diagnostics) with confirmation by the operator, and conventional manual reading by two different qualified operators. The analysis was performed blinded to the first interpretation and clinical diagnosis. The sera were classified into four groups on the basis of the results of the first analysis: negative sera (titer < 1/160; 11 patients), low positives (titer at 1/160; 18 patients), moderate positives (titers between 1/320 and 1/640; 10 patients), and strong positives (titers between 1/1,280 and 1/2,560; 10 patients). Among the 49 patients, 13 presented a connective tissue disease, including 4 systemic scleroderma (SS), 3 rheumatoid arthritis (RA), 2 Gougerot-Sjögren (GS), 2 systemic lupus erythematosus (SLE), 1 polymyositis (PM), 1 Raynaud's syndrome (RS), and 1 CREST syndrome. One patient presented both an SLE and an SS. Regarding the screening dilution, the 1/40 dilution is less specific than the 1/80 dilution for both systems of analysis (5.6% vs. 16.7% for manual reading, and 27.8% vs. 50% for automated reading). It also generates statistically more false positives (P = 0.037 for the conventional analysis and P = 0.003 for the automated system). The automated NOVA View® reading of slides allows a gain in specificity at both dilutions, with statistically fewer false positives (P = 0.002 at 1/40 and P = 0.0006 at 1/80), at the cost of sensitivity at the highest dilution (84.6% vs. 92.3% with manual reading). Thus, according to our analysis of 49 sera, the automated NOVA View® system of slide reading at the 1/80 dilution appears to be an effective condition for the detection of ANAs on HEp-2 cells, close to significance (P = 0.067). Automated NOVA View® slide reading saves time and improves standardization. Nevertheless, it requires confirmation by a qualified operator, in particular to interpret mixed patterns. © 2016 Wiley Periodicals, Inc.
Liu, Dungang; Liu, Regina; Xie, Minge
2014-01-01
Meta-analysis has been widely used to synthesize evidence from multiple studies for common hypotheses or parameters of interest. However, it has not yet been fully developed for incorporating heterogeneous studies, which arise often in applications due to different study designs, populations, or outcomes. For heterogeneous studies, the parameter of interest may not be estimable for certain studies, and in such a case these studies are typically excluded from conventional meta-analysis. The exclusion of part of the studies can lead to a non-negligible loss of information. This paper introduces a meta-analysis for heterogeneous studies by combining the confidence density functions derived from the summary statistics of individual studies, hence referred to as the CD approach. It includes all the studies in the analysis and makes use of all information, direct as well as indirect. Under a general likelihood inference framework, this new approach is shown to have several desirable properties, including: i) it is asymptotically as efficient as the maximum likelihood approach using individual participant data (IPD) from all studies; ii) unlike the IPD analysis, it suffices to use summary statistics to carry out the CD approach, and individual-level data are not required; and iii) it is robust against misspecification of the working covariance structure of the parameter estimates. Besides its own theoretical significance, the last property also substantially broadens the applicability of the CD approach. All the properties of the CD approach are further confirmed by data simulated from a randomized clinical trial setting as well as by real data on aircraft landing performance. Overall, one obtains a unifying approach for combining summary statistics, subsuming many of the existing meta-analysis methods as special cases.
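The essence of combining confidence densities can be sketched for the simplest case: under normal approximations, multiplying per-study confidence densities reduces to inverse-variance weighting. The estimates below are hypothetical:

```python
import numpy as np
from scipy import stats

# Hypothetical per-study estimates and standard errors for a common parameter.
theta = np.array([0.42, 0.30, 0.55])
se = np.array([0.10, 0.15, 0.20])

# Each study's confidence density is approximately N(theta_i, se_i^2);
# combining the densities amounts to inverse-variance weighting.
w = 1.0 / se**2
theta_c = np.sum(w * theta) / np.sum(w)
se_c = np.sqrt(1.0 / np.sum(w))
ci = stats.norm.interval(0.95, loc=theta_c, scale=se_c)
print(theta_c, se_c, ci)
```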
Renormalization Group Tutorial
NASA Technical Reports Server (NTRS)
Bell, Thomas L.
2004-01-01
Complex physical systems sometimes have statistical behavior characterized by power-law dependence on the parameters of the system and spatial variability with no particular characteristic scale as the parameters approach critical values. The renormalization group (RG) approach was developed in the fields of statistical mechanics and quantum field theory to derive quantitative predictions of such behavior in cases where conventional methods of analysis fail. Techniques based on these ideas have since been extended to treat problems in many different fields, and in particular, the behavior of turbulent fluids. This lecture will describe a relatively simple but nontrivial example of the RG approach applied to the diffusion of photons out of a stellar medium when the photons have wavelengths near that of an emission line of atoms in the medium.
Statistical methods and neural network approaches for classification of data from multiple sources
NASA Technical Reports Server (NTRS)
Benediktsson, Jon Atli; Swain, Philip H.
1990-01-01
Statistical methods for classification of data from multiple data sources are investigated and compared to neural network models. A general problem with using conventional multivariate statistical approaches to classify data of multiple types is that a multivariate distribution cannot be assumed for the classes in the data sources. Another common problem with statistical classification methods is that the data sources are not equally reliable. This means that the data sources need to be weighted according to their reliability, but most statistical classification methods have no mechanism for this. This research focuses on statistical methods which can overcome these problems: a method of statistical multisource analysis and consensus theory. Reliability measures for weighting the data sources in these methods are suggested and investigated. Secondly, this research focuses on neural network models. Neural networks are distribution-free, since no prior knowledge of the statistical distribution of the data is needed; this is an obvious advantage over most statistical classification methods. Neural networks also automatically handle the question of how much weight each data source should have. On the other hand, their training process is iterative and can take a very long time. Methods to speed up the training procedure are introduced and investigated. Experimental results of classification using both neural network models and statistical methods are given, and the approaches are compared based on these results.
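Consensus-theoretic combination with reliability weights can be illustrated with a logarithmic opinion pool; the class probabilities and weights below are invented for illustration:

```python
import numpy as np

# Combine class probabilities from two sources with reliability weights.
p_source1 = np.array([0.7, 0.2, 0.1])   # P(class | source 1)
p_source2 = np.array([0.4, 0.5, 0.1])   # P(class | source 2)
reliability = np.array([0.8, 0.5])      # higher weight = more reliable source

# Logarithmic opinion pool: weighted geometric mean, renormalized.
log_pool = reliability[0] * np.log(p_source1) + reliability[1] * np.log(p_source2)
posterior = np.exp(log_pool - log_pool.max())
posterior /= posterior.sum()
print(posterior.argmax(), posterior)
```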
Moura, Renata Vasconcellos; Kojima, Alberto Noriyuki; Saraceni, Cintia Helena Coury; Bassolli, Lucas; Balducci, Ivan; Özcan, Mutlu; Mesquita, Alfredo Mikail Melo
2018-05-01
The increased use of CAD systems can generate doubt about the accuracy of digital impressions for angulated implants. The aim of this study was to evaluate the accuracy of different impression techniques, two conventional and one digital, for implants with and without angulation. We used a polyurethane cast that simulates the human maxilla according to ASTM F1839, and 6 tapered implants were installed with external hexagonal connections to simulate tooth positions 17, 15, 12, 23, 25, and 27. Implants 17 and 23 were placed with 15° of mesial and distal angulation, respectively. Mini cone abutments with a 1-mm-high metal strap were installed on these implants. Conventional and digital impression procedures were performed on the maxillary master cast, and the implants were separated into 6 groups based on the technique used and measurement type: G1 - control, G2 - digital impression, G3 - conventional impression with an open tray, G4 - conventional impression with a closed tray, G5 - conventional impression with an open tray and a digital impression, and G6 - conventional impression with a closed tray and a digital impression. A statistical analysis was performed using two-way repeated-measures ANOVA to compare the groups, and a Kruskal-Wallis test was conducted to analyze the accuracy of the techniques. No significant difference in accuracy was observed between the groups. Therefore, no differences were found between the conventional impressions and the combination of conventional and digital impressions, and the angulation of the implants did not affect the accuracy of the techniques. All of the techniques exhibited trueness and had acceptable precision. © 2018 by the American College of Prosthodontists.
Comparing the effectiveness of laser vs. conventional endoforehead lifting.
Chang, Cheng-Jen; Yu, De-Yi; Chang, Shu-Ying; Hsiao, Yen-Chang
2018-04-01
The objective of this study was to compare the efficacy and safety of laser versus conventional endoforehead lifting. Over a period of 12 years (January 2000-January 2012), a total of 110 patients with hyperactive muscles over the frontal region were collected for a retrospective study. The SurgiLase 150XJ CO2 laser system, in conjunction with the flexible FIBERLASE, was used. The endoscope was 4 mm in diameter with an angle of 30°. The primary efficacy measurement was the assessment of the final outcome of the laser vs. conventional methods. Both groups were observed at three weeks, six weeks, and six months after surgery. The most common complication in early convalescence (three weeks) was swelling, followed by local paraesthesia, ecchymosis, localized hematomas, and scarring with alopecia. All of these problems disappeared completely after the 6-month study period. Based on a chi-square analysis, there were clinically and statistically significant differences favouring laser endoforehead surgery in operative time and in early and late complications. All patients achieved significant improvement in the final outcome after both laser and conventional endoforehead surgery. However, the early and late complications showed a greater difference in favour of the laser group.
The integrity of bonded amalgam restorations: a clinical evaluation after five years.
Mach, Zbynek; Regent, Jan; Staninec, Michal; Mrklas, Lubor; Setcos, James C
2002-04-01
Bonded amalgam restorations have been studied extensively in vitro, but few long-term clinical studies exist. The authors examined the clinical performance of bonded amalgam restorations after five years of clinical service and compared it with that of nonbonded amalgam restorations. The authors placed 75 bonded and 62 nonbonded amalgam restorations in patients needing restorations. Most of the restorations were placed in conventional preparations; six bonded restorations were placed in nonretentive cavities. They were evaluated after a five-year period of clinical service by two trained dentists using a mirror and explorer and following modified U.S. Public Health Service criteria. Statistical analysis (via the Fisher exact test) showed no significant differences between the two techniques when conventional preparations were used. Restorations in nonretentive preparations were successful during this period. Bonded and nonbonded amalgam restorations yielded similar results in conventional preparations after five years of clinical service. Bonded amalgam restorations were clinically successful in a limited number of nonretentive preparations over a five-year period. Bonded amalgam restorations can be used successfully in conventional preparations and possibly in nonretentive preparations as well, and can be expected to last at least five years.
Comparison of atomization characteristics of drop-in and conventional jet fuels
NASA Astrophysics Data System (ADS)
Kannaiyan, Kumaran; Sadr, Reza; Micro Scale Thermo-Fluids Lab Team
2016-11-01
A surge in energy demand and stringent emission norms have been driving interest in alternative drop-in fuels in the aviation industry. Gas-to-liquid (GTL) fuel, a synthetic paraffinic kerosene derived from natural gas, has drawn significant attention as a drop-in fuel due to its cleaner combustion characteristics compared with other alternative fuels derived from various feedstocks. The specifications of drop-in fuels, such as their chemical and physical properties, differ from those of conventional jet fuels, which can affect their atomization characteristics and in turn their combustion performance. The near-nozzle liquid sheet dynamics of the drop-in fuel, GTL, is studied at different nozzle operating conditions and compared with that of the conventional Jet A-1 fuel. Statistical analysis of the near-nozzle sheet dynamics shows that the atomization characteristics of the drop-in fuel are comparable to those of the conventional fuel. Furthermore, the microscopic spray characteristics measured using phase Doppler anemometry at downstream locations differ slightly between the fuels. The authors acknowledge the support of the National Priorities Research Program (NPRP) of the Qatar National Research Fund through Grant NPRP-7-1449-2-523.
van Bochove, J A; van Amerongen, W E
2006-03-01
The aim was to investigate possible differences in discomfort during treatment with the atraumatic restorative treatment (ART) approach or the conventional restorative method, with and without local analgesia (LA). The study group consisted of 6- and 7-year-old children with no previous dental experience (mean age 6.98 years, SD ± 0.52), randomly divided into four treatment groups: the conventional method with and without LA, and ART with and without LA. One or two proximal lesions in primary molars were treated. Heart rate and behaviour (Venham score) were measured. Statistical analysis was performed in SPSS version 10.0. In a first session 300 children were treated, and 109 children were treated a second time in the same way as at the first visit. During the first session, ART without LA caused the least discomfort, while the conventional method without LA caused the most. During the second treatment, the least discomfort was observed with ART without LA and the most with the conventional method with LA. There was a constant preference for hand instruments; the bur was increasingly accepted. The experience with LA was the reverse.
Advanced Artificial Intelligence Technology Testbed
NASA Technical Reports Server (NTRS)
Anken, Craig S.
1993-01-01
The Advanced Artificial Intelligence Technology Testbed (AAITT) is a laboratory testbed for the design, analysis, integration, evaluation, and exercising of large-scale, complex software systems composed of both knowledge-based and conventional components. The AAITT assists its users in the following ways: configuring various problem-solving application suites; observing and measuring the behavior of these applications and the interactions between their constituent modules; gathering and analyzing statistics about the occurrence of key events; and flexibly and quickly altering the interaction of modules within the applications for further study.
Size and shape measurement in contemporary cephalometrics.
McIntyre, Grant T; Mossey, Peter A
2003-06-01
The traditional method of analysing cephalograms, conventional cephalometric analysis (CCA), involves the calculation of linear distance measurements, angular measurements, area measurements, and ratios. Because shape information cannot be determined from these 'size-based' measurements, an increasing number of studies employ geometric morphometric tools in the cephalometric analysis of craniofacial morphology. Most of the discussions surrounding the appropriateness of CCA, Procrustes superimposition, Euclidean distance matrix analysis (EDMA), thin-plate spline analysis (TPS), finite element morphometry (FEM), elliptical Fourier functions (EFF), and medial axis analysis (MAA) have centred upon mathematical and statistical arguments. Surprisingly, little information is available to assist the orthodontist with the clinical relevance of each technique. This article evaluates the advantages and limitations of the above methods currently used to analyse craniofacial morphology on cephalograms and investigates their clinical relevance and possible applications.
Lopes, Lawrence Gonzaga; Franco, Eduardo Batista; Pereira, José Carlos; Mondelli, Rafael Francisco Lia
2008-01-01
The aim of this study was to evaluate the polymerization shrinkage and shrinkage stress of composites polymerized with an LED and a quartz tungsten halogen (QTH) light source. The LED was used in conventional mode (CM) and the QTH was used in both conventional and pulse-delay (PD) modes. The composite resins used were Z100, A110, SureFil and Bisfil 2B (chemical-cured). Composite deformation upon polymerization was measured by the strain gauge method. The shrinkage stress was measured by photoelastic analysis. The polymerization shrinkage data were analyzed statistically using two-way ANOVA and the Tukey test (p≤0.05), and the stress data were analyzed by one-way ANOVA and Tukey's test (p≤0.05). Shrinkage and stress means of Bisfil 2B were statistically significantly lower than those of Z100, A110 and SureFil. In general, the PD mode reduced the contraction and stress values when compared to CM. The LED generated the same stress as the QTH in conventional mode. Regardless of the activation mode, SureFil produced lower contraction and stress values than the other light-cured resins. Conversely, Z100 and A110 produced the greatest contraction and stress values. As expected, the chemically cured resin generated lower shrinkage and stress than the light-cured resins. In conclusion, the PD mode effectively decreased contraction stress for Z100 and A110. Development of stress in light-cured resins depended on the shrinkage value.
Santaella, María L; Font, Ivonne; Disdier, Orville M
2004-06-01
To compare the effectiveness of oral therapy with reduced nicotinamide adenine dinucleotide (NADH) to conventional modalities of treatment in patients with chronic fatigue syndrome (CFS). CFS is a potentially disabling condition of unknown etiology. Although its clinical presentation is associated with a myriad of symptoms, fatigue is a universal and essential finding for its diagnosis. No therapeutic regimen has proven effective for this condition. A total of 31 patients fulfilling the Centers for Disease Control criteria for CFS were randomly assigned to either NADH or nutritional supplements and psychological therapy for 24 months. A thorough medical history, physical examination, and completion of a questionnaire on the severity of fatigue and other symptoms were performed each trimester of therapy. In addition, all patients underwent evaluation of immunological parameters and viral antibody titers. Statistical analysis was applied to the demographic data, as well as to symptom scores at baseline and at each trimester of therapy. The twelve patients who received NADH had a dramatic and statistically significant reduction of the mean symptom score in the first trimester (p < 0.001). However, symptom scores in the subsequent trimesters of therapy were similar in both treatment groups. Elevated IgG and IgE antibody levels were found in a significant number of patients. The observed effectiveness of NADH over conventional treatment in the first trimester of the trial, and the trend of improvement with that modality in the subsequent trimesters, should be further assessed in a larger patient sample.
Porwal, Anand; Khandelwal, Meenakshi; Punia, Vikas; Sharma, Vivek
2017-01-01
Aim: The purpose of this study was to evaluate the effect of different denture cleansers on the color stability, surface hardness, and roughness of different denture base resins. Materials and Methods: Three denture base resin materials (conventional heat-cure resin, high-impact resin, and polyamide denture base resin) were immersed for 180 days in two commercially available denture cleansers (sodium perborate and sodium hypochlorite). Color, surface roughness, and hardness were measured for each sample before and after the immersion procedure. Statistical Analysis: One-way analysis of variance and Tukey's post hoc honestly significant difference test were used to evaluate color, surface roughness, and hardness data before and after immersion in denture cleanser (α = 0.05). Results: All denture base resins tested exhibited a change in color, surface roughness, and hardness to some degree in both denture cleansers. Polyamide resin immersed in sodium perborate showed the maximum change in color after immersion for 180 days. Conventional heat-cure resin immersed in sodium hypochlorite showed the maximum change in surface roughness, and conventional heat-cure resin immersed in sodium perborate showed the maximum change in hardness. Conclusion: Color changes of all denture base resins were within the clinically accepted range for color difference. The surface roughness change of conventional heat-cure resin was not within the clinically accepted range. The choice of denture cleanser for different denture base resins should be based on the chemistry of the resin and cleanser, the denture cleanser concentration, and the duration of immersion.
Cole, Pamela S; Quisberg, Jennifer; Melin, M Mark
2009-01-01
Small studies have indicated that the addition of acoustic pressure wound therapy (APWT) to conventional wound care may hasten healing of chronic wounds. We evaluated our early clinical experience using APWT as an adjunct to conventional wound care. The study was a retrospective chart review of consecutive patients receiving APWT in addition to conventional wound care in a hospital-based, primarily outpatient setting. Medical records of all patients treated with APWT between August 2006 and October 2007 were reviewed. The analysis included the 41 patients with 52 wounds who received APWT at least 2 times per week during the study period. Statistical comparisons were made for wound dimensions, tissue characteristics, and pain at the start versus the end of APWT. Thirty-eight percent of wounds (n = 20) healed completely with a mean of 6.8 weeks of APWT. Median wound area and volume decreased significantly (88% [P < .0001] and 100% [P < .0001], respectively) from start to end of APWT. The proportion of wounds with greater than 75% granulation tissue increased from 26% (n = 12) to 80% (n = 41) (P < .0001), and normal periwound skin increased from 25% (n = 13) to 54% (n = 28) (P = .0001). The presence of greater than 50% fibrin slough decreased from 50% (n = 24) to 9% (n = 4) of wounds (P = .006). This early experience supplementing conventional wound care with APWT suggests it may promote healing in chronic wounds where the ordered cellular and molecular processes leading to healing have stalled.
Rai, Rathika; Kumar, S Arun; Prabhu, R; Govindan, Ranjani Thillai; Tanveer, Faiz Mohamed
2017-01-01
Accuracy of fit of a cast metal restoration has always been one of the primary factors determining the success of the restoration. A well-fitting restoration needs to be accurate both along its margin and with regard to its internal surface. The aim of the study was to compare the marginal fit of metal ceramic crowns obtained from conventional inlay casting wax patterns using conventional impressions with that of metal ceramic crowns obtained by computer-aided design and computer-aided manufacturing (CAD/CAM) using direct and indirect optical scanning. This in vitro study on preformed, custom-made stainless steel models with a former assembly resembling prepared tooth surfaces of standardized dimensions comprised three groups: the first group included ten samples of metal ceramic crowns fabricated with the conventional technique, the second group included CAD/CAM-milled direct metal laser sintering (DMLS) crowns using indirect scanning, and the third group included DMLS crowns fabricated by direct scanning of the stainless steel model. The vertical marginal gap and the internal gap were evaluated with a stereomicroscope (Zoomstar 4); Tukey's post hoc test was used for statistical analysis. One-way analysis of variance was used to compare the mean values. Metal ceramic crowns obtained from direct optical scanning showed the smallest marginal and internal gaps when compared with the castings obtained from inlay casting wax and indirect optical scanning. Indirect and direct optical scanning yielded results within the clinically acceptable range.
Park, Yong Seo; Polovka, Martin; Ham, Kyung-Sik; Park, Yang-Kyun; Vearasilp, Suchada; Namieśnik, Jacek; Toledo, Fernando; Arancibia-Avila, Patricia; Gorinstein, Shela
2016-09-01
Organic, semiorganic, and conventional "Hayward" kiwifruits, treated with ethylene for 24 h and stored for 10 days, were assessed by UV spectrometry, fluorometry, and chemometric analysis for changes in selected characteristics of quality (firmness, dry matter and soluble solids contents, pH, and acidity) and bioactivity (concentration of polyphenols via Folin-Ciocalteu and p-hydroxybenzoic acid assays). All of the monitored quality parameters and bioactivity-related characteristics were affected by cultivation practices, ethylene treatment, or storage. The results, supported by statistical evaluation (Friedman two-way ANOVA) and chemometric analysis, clearly showed that ethylene treatment had the most significant impact on the majority of the evaluated quality and bioactivity parameters of "Hayward" kiwifruit, followed by cultivation practices and postharvest storage. The total concentration of polyphenols expressed via the p-hydroxybenzoic acid assay exhibited the most significant sensitivity to all three evaluated factors, reaching a 16.5% increase for fresh organic fruit compared with a conventional control sample. As a result of postharvest storage coupled with ethylene treatment, the difference increased to 26.3%. Three-dimensional fluorescence showed differences in the position of the main peaks and their fluorescence intensity for conventional, semiorganic, and organic kiwifruits in comparison with ethylene-nontreated samples.
Lu, Chao; Lv, Xueyou; Lin, Yiming; Li, Dejian; Chen, Lihua; Ji, Feng; Li, Youming; Yu, Chaohui
2016-07-01
Conventional forceps biopsy (CFB) is the most common way to screen for gastric epithelial neoplasia (GEN) and adenocarcinoma of the gastric epithelium. The aim of this study was to compare the diagnostic accuracy of conventional forceps biopsy and endoscopic submucosal dissection (ESD). Four hundred forty-four patients who ultimately underwent ESD in our hospital were enrolled from Jan 1, 2009 to Sep 1, 2015. We retrospectively assessed the characteristics of the pathological results of CFB and ESD. The concordance rate between CFB and ESD specimens was 68.92% (306/444). Men showed a lower concordance rate (63.61% vs 79.33%; P = 0.001), and patients with concordant results were younger (P = 0.048). In multivariate analysis, men had a significantly lower concordance rate (coefficient -0.730, P = 0.002) and a higher rate of pathological upgrade (coefficient -0.648, P = 0.015). The location of the CFB did not statistically influence the concordance rate. The concordance rate was relatively high in our hospital. According to our analysis, ESD is strongly suggested for older men with precancerous lesions found at the gastric fundus or antrum on CFB, while young women with low-grade intraepithelial neoplasia could opt for regular follow-up.
Ge, Long; Tian, Jin-hui; Li, Xiu-xia; Song, Fujian; Li, Lun; Zhang, Jun; Li, Ge; Pei, Gai-qin; Qiu, Xia; Yang, Ke-hu
2016-01-01
Because of the methodological complexity of network meta-analyses (NMAs), NMAs may be more vulnerable to methodological risks than conventional pair-wise meta-analyses. Our study aims to investigate the epidemiological characteristics, conduct of the literature search, methodological quality, and reporting of the statistical analysis process of NMAs in the field of cancer, based on the PRISMA extension statement and a modified AMSTAR checklist. We identified and included 102 NMAs in the field of cancer. 61 NMAs were conducted using a Bayesian framework; of these, more than half did not report assessment of convergence (60.66%). Inconsistency was assessed in 27.87% of NMAs. Assessment of heterogeneity in traditional meta-analyses was more common (42.62%) than in NMAs (6.56%). Most NMAs did not report assessment of similarity (86.89%) and did not use the GRADE tool to assess quality of evidence (95.08%). 43 NMAs were adjusted indirect comparisons; the methods used were described in 53.49% of these NMAs. Only 4.65% of NMAs described the details of handling multi-arm trials and 6.98% described the methods of similarity assessment. The median total AMSTAR score was 8.00 (IQR: 6.00-8.25). Methodological quality and reporting of the statistical analysis did not substantially differ by the selected general characteristics. Overall, the quality of NMAs in the field of cancer was generally acceptable.
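The adjusted indirect comparison mentioned above is commonly performed with the Bucher method; a sketch with hypothetical trial summaries (the method assumed here is the standard one, not extracted from the reviewed NMAs):

```python
import numpy as np
from scipy import stats

# Bucher adjusted indirect comparison with hypothetical inputs:
# log hazard ratios for A vs B and C vs B from two independent sets of trials.
d_ab, se_ab = np.log(0.80), 0.10   # A vs B
d_cb, se_cb = np.log(0.95), 0.12   # C vs B

d_ac = d_ab - d_cb                      # indirect A vs C estimate via common comparator B
se_ac = np.sqrt(se_ab**2 + se_cb**2)    # variances add for independent comparisons
z = d_ac / se_ac
p = 2 * stats.norm.sf(abs(z))
lo, hi = np.exp(d_ac - 1.96 * se_ac), np.exp(d_ac + 1.96 * se_ac)
print(f"HR A vs C = {np.exp(d_ac):.2f} (95% CI {lo:.2f}-{hi:.2f}), p = {p:.3f}")
```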
Harrigan, George G; Harrison, Jay M
2012-01-01
New transgenic (GM) crops are subjected to extensive safety assessments that include compositional comparisons with conventional counterparts as a cornerstone of the process. The influence of germplasm, location, environment, and agronomic treatments on compositional variability is, however, often obscured in these pair-wise comparisons. Furthermore, classical statistical significance testing can often provide an incomplete and over-simplified summary of highly responsive variables such as crop composition. In order to more clearly describe the influence of the numerous sources of compositional variation, we present an introduction to two alternative but complementary approaches to data analysis and interpretation. These include i) exploratory data analysis (EDA), with its emphasis on visualization and graphics-based approaches, and ii) Bayesian statistical methodology, which provides easily interpretable and meaningful evaluations of data in terms of probability distributions. The EDA case studies include analyses of herbicide-tolerant GM soybean and insect-protected GM maize and soybean. Bayesian approaches are presented in an analysis of herbicide-tolerant GM soybean. Advantages of these approaches over classical frequentist significance testing include the more direct interpretation of results in terms of probabilities pertaining to quantities of interest and no confusion over the application of corrections for multiple comparisons. It is concluded that a standardized framework for these methodologies could provide specific advantages through enhanced clarity of presentation and interpretation in comparative assessments of crop composition.
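The kind of directly interpretable Bayesian output advocated here can be sketched in a few lines; the data, prior, and analyte values below are hypothetical:

```python
import numpy as np

# Bayesian sketch: posterior for the mean difference in an analyte between
# GM and conventional samples, assuming a known measurement SD and a flat prior.
rng = np.random.default_rng(2)
gm = rng.normal(10.2, 0.8, 20)    # analyte level in GM samples (synthetic)
conv = rng.normal(10.0, 0.8, 20)  # analyte level in conventional samples (synthetic)

diff = gm.mean() - conv.mean()
se = np.sqrt(0.8**2 / 20 + 0.8**2 / 20)  # sampling SD of the mean difference

# With a flat prior and known SD, the posterior for the difference is ~ N(diff, se^2).
draws = rng.normal(diff, se, 100_000)
print((draws > 0).mean())  # posterior probability that the GM mean is higher
```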
A method for analyzing temporal patterns of variability of a time series from Poincare plots.
Fishman, Mikkel; Jacono, Frank J; Park, Soojin; Jamasebi, Reza; Thungtong, Anurak; Loparo, Kenneth A; Dick, Thomas E
2012-07-01
The Poincaré plot is a popular two-dimensional, time series analysis tool because of its intuitive display of dynamic system behavior. Poincaré plots have been used to visualize heart rate and respiratory pattern variabilities. However, conventional quantitative analysis relies primarily on statistical measurements of the cumulative distribution of points, making it difficult to interpret irregular or complex plots. Moreover, the plots are constructed to reflect highly correlated regions of the time series, reducing the amount of nonlinear information that is presented and thereby hiding potentially relevant features. We propose temporal Poincaré variability (TPV), a novel analysis methodology that uses standard techniques to quantify the temporal distribution of points and to detect nonlinear sources responsible for physiological variability. In addition, the analysis is applied across multiple time delays, yielding a richer insight into system dynamics than the traditional circle return plot. The method is applied to data sets of R-R intervals and to synthetic point process data extracted from the Lorenz time series. The results demonstrate that TPV complements the traditional analysis and can be applied more generally, including Poincaré plots with multiple clusters, and more consistently than the conventional measures and can address questions regarding potential structure underlying the variability of a data set.
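The conventional Poincaré descriptors that TPV builds on, SD1 and SD2 computed at multiple lags, can be sketched as follows on synthetic R-R data:

```python
import numpy as np

# Poincare descriptors at multiple lags for an R-R interval series.
rng = np.random.default_rng(3)
rr = 800 + np.cumsum(rng.normal(0, 5, 1000))  # hypothetical R-R intervals (ms)

def poincare_sd(rr, lag=1):
    x, y = rr[:-lag], rr[lag:]          # plot rr[n+lag] against rr[n]
    sd1 = np.std(y - x) / np.sqrt(2)    # dispersion across the identity line (short-term)
    sd2 = np.std(y + x) / np.sqrt(2)    # dispersion along the identity line (long-term)
    return sd1, sd2

for lag in (1, 2, 5, 10):
    print(lag, poincare_sd(rr, lag))
```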
Reliable enumeration of malaria parasites in thick blood films using digital image analysis.
Frean, John A
2009-09-23
Quantitation of malaria parasite density is an important component of laboratory diagnosis of malaria. Microscopy of Giemsa-stained thick blood films is the conventional method for parasite enumeration. Accurate and reproducible parasite counts are difficult to achieve, because of inherent technical limitations and human inconsistency. Inaccurate parasite density estimation may have adverse clinical and therapeutic implications for patients, and for endpoints of clinical trials of anti-malarial vaccines or drugs. Digital image analysis provides an opportunity to improve performance of parasite density quantitation. Accurate manual parasite counts were done on 497 images of a range of thick blood films with varying densities of malaria parasites, to establish a uniformly reliable standard against which to assess the digital technique. By utilizing descriptive statistical parameters of parasite size frequency distributions, particle counting algorithms of the digital image analysis programme were semi-automatically adapted to variations in parasite size, shape and staining characteristics, to produce optimum signal/noise ratios. A reliable counting process was developed that requires no operator decisions that might bias the outcome. Digital counts were highly correlated with manual counts for medium to high parasite densities, and slightly less well correlated with conventional counts. At low densities (fewer than 6 parasites per analysed image) signal/noise ratios were compromised and correlation between digital and manual counts was poor. Conventional counts were consistently lower than both digital and manual counts. Using open-access software and avoiding custom programming or any special operator intervention, accurate digital counts were obtained, particularly at high parasite densities that are difficult to count conventionally. The technique is potentially useful for laboratories that routinely perform malaria parasite enumeration. The requirements of a digital microscope camera, personal computer and good quality staining of slides are potentially reasonably easy to meet.
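The particle-counting step of such a digital method can be sketched with generic image tools; the thresholds, size range, and synthetic image below are assumptions for illustration, not the paper's calibrated settings:

```python
import numpy as np
from scipy import ndimage

# Threshold a grayscale field image, then count connected objects within
# a plausible parasite size range (all values hypothetical).
rng = np.random.default_rng(4)
image = rng.normal(0.2, 0.05, (512, 512))
for _ in range(40):                      # paint synthetic "parasites"
    r, c = rng.integers(20, 492, 2)
    image[r - 2:r + 3, c - 2:c + 3] += 0.5

binary = image > 0.4                     # staining-intensity threshold
labels, n = ndimage.label(binary)
sizes = ndimage.sum(binary, labels, range(1, n + 1))
count = np.sum((sizes >= 9) & (sizes <= 100))  # reject noise specks and clumps
print(int(count))
```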
Lucareli, P R; Lima, M O; Lima, F P S; de Almeida, J G; Brech, G C; D'Andréa Greve, J M
2011-09-01
Single-blind randomized, controlled clinical study. To evaluate, using kinematic gait analysis, the results obtained from gait training on a treadmill with body weight support versus those obtained with conventional gait training and physiotherapy. Thirty patients with sequelae from traumatic incomplete spinal cord injuries sustained at least 12 months earlier were included; all were able to walk and were classified according to motor function as ASIA (American Spinal Injury Association) impairment scale C or D. Patients were divided randomly into two groups of 15 patients by the drawing of opaque envelopes: group A (weight support) and group B (conventional). After an initial assessment, both groups underwent 30 sessions of gait training. Sessions occurred twice a week, lasted for 30 min each and continued for four months. All of the patients were evaluated by a single blinded examiner using movement analysis to measure angular and linear kinematic gait parameters. Six patients (three from group A and three from group B) were excluded because they attended fewer than 85% of the training sessions. There were no statistically significant differences in intra-group comparisons among the spatial-temporal variables in group B. In group A, the following significant differences in the studied spatial-temporal variables were observed: increases in velocity, distance, cadence, step length, swing phase and gait cycle duration, in addition to a reduction in stance phase. There were also no significant differences in intra-group comparisons among the angular variables in group B. However, group A achieved significant improvements in maximum hip extension and plantar flexion during stance. Gait training with body weight support was more effective than conventional physiotherapy for improving the spatial-temporal and kinematic gait parameters among patients with incomplete spinal cord injuries.
Hu, Xiangdong; Liu, Yujiang; Qian, Linxue
2017-10-01
Real-time elastography (RTE) and shear wave elastography (SWE) are noninvasive and easily available imaging techniques that measure the tissue strain, and it has been reported that the sensitivity and the specificity of elastography were better in differentiating between benign and malignant thyroid nodules than conventional technologies. Relevant articles were searched in multiple databases; the comparison of elasticity index (EI) was conducted with Review Manager 5.0. Forest plots of the sensitivity and specificity and SROC curves of RTE and SWE were produced with STATA 10.0 software. In addition, sensitivity analysis and bias analysis of the studies were conducted to examine the quality of articles; to estimate possible publication bias, a funnel plot was used and the Egger test was conducted. Finally, 22 articles that satisfied the inclusion criteria were included in this study. After exclusions, 2106 benign and 613 malignant nodules were included. The meta-analysis suggested that the difference of EI between benign and malignant nodules was statistically significant (SMD = 2.11, 95% CI [1.67, 2.55], P < .00001). The overall sensitivities of RTE and SWE were roughly comparable, whereas the difference of specificities between these 2 methods was statistically significant. In addition, a statistically significant difference in AUC between RTE and SWE was observed (P < .01). The specificity of RTE was statistically higher than that of SWE, which suggests that, compared with SWE, RTE may be more accurate in differentiating benign and malignant thyroid nodules.
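For readers wanting the arithmetic behind a pooled effect size, the Python sketch below computes a standardized mean difference (Cohen's d with Hedges' small-sample correction) and an approximate 95% CI for a single study; the summary numbers are hypothetical, not data from this meta-analysis.

```python
import numpy as np

def smd_with_ci(m1, sd1, n1, m2, sd2, n2):
    """Hedges' g standardized mean difference and approximate 95% CI."""
    sp = np.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / sp                        # Cohen's d with pooled SD
    g = (1 - 3 / (4 * (n1 + n2) - 9)) * d     # Hedges' small-sample correction
    se = np.sqrt((n1 + n2) / (n1 * n2) + g**2 / (2 * (n1 + n2)))
    return g, (g - 1.96 * se, g + 1.96 * se)

# Hypothetical elasticity-index summaries (malignant vs benign) for one study:
g, ci = smd_with_ci(m1=3.1, sd1=0.9, n1=60, m2=1.8, sd2=0.7, n2=200)
print(f"SMD = {g:.2f}, 95% CI [{ci[0]:.2f}, {ci[1]:.2f}]")
```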
Ingraffea, Anthony R; Wells, Martin T; Santoro, Renee L; Shonkoff, Seth B C
2014-07-29
Casing and cement impairment in oil and gas wells can lead to methane migration into the atmosphere and/or into underground sources of drinking water. An analysis of 75,505 compliance reports for 41,381 conventional and unconventional oil and gas wells in Pennsylvania drilled from January 1, 2000-December 31, 2012, was performed with the objective of determining complete and accurate statistics of casing and cement impairment. Statewide data show a sixfold higher incidence of cement and/or casing issues for shale gas wells relative to conventional wells. The Cox proportional hazards model was used to estimate risk of impairment based on existing data. The model identified both temporal and geographic differences in risk. For post-2009 drilled wells, risk of a cement/casing impairment is 1.57-fold [95% confidence interval (CI) (1.45, 1.67); P < 0.0001] higher in an unconventional gas well relative to a conventional well drilled within the same time period. Temporal differences between well types were also observed and may reflect more thorough inspections and greater emphasis on finding well leaks, more detailed note taking in the available inspection reports, or real changes in rates of structural integrity loss due to rushed development or other unknown factors. Unconventional gas wells in northeastern (NE) Pennsylvania are at a 2.7-fold higher risk relative to the conventional wells in the same area. The predicted cumulative risk for all wells (unconventional and conventional) in the NE region is 8.5-fold [95% CI (7.16, 10.18); P < 0.0001] greater than that of wells drilled in the rest of the state.
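As a sketch of the kind of risk model used here, the snippet below fits a Cox proportional hazards model with the lifelines library; the data frame, its column names, and the six toy records are hypothetical stand-ins for the Pennsylvania inspection data, so the resulting hazard ratio is illustrative only.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical records: time to observed impairment (or censoring), an event
# indicator, and a 0/1 flag marking unconventional wells.
df = pd.DataFrame({
    "years_to_impairment": [1.0, 2.1, 3.3, 4.0, 5.2, 6.5],
    "impaired":            [1,   1,   1,   1,   0,   0],
    "unconventional":      [1,   0,   1,   0,   1,   0],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="years_to_impairment", event_col="impaired")
# exp(coef) for "unconventional" is the hazard ratio, the quantity analogous
# to the reported 1.57-fold risk for post-2009 unconventional wells.
cph.print_summary()
```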
Iyer, Sneha R; Gogate, Parag R
2017-01-01
The current work investigates the application of low intensity ultrasonic irradiation for improving the cooling crystallization of Mefenamic Acid for the first time. The crystal shape and size have been analyzed with the help of an optical microscope and image analysis software, respectively. The effect of ultrasonic irradiation on crystal size, particle size distribution (PSD) and yield has been investigated, also establishing the comparison with the conventional approach. It has been observed that application of ultrasound not only enhances the yield but also reduces the induction time for crystallization as compared to the conventional cooling crystallization technique. In the presence of ultrasound, the maximum yield was obtained at optimum conditions of a power dissipation of 30 W and an ultrasonic irradiation time of 10 min. The yield was further improved by application of ultrasound in cycles, where the formed crystals are allowed to grow in the absence of ultrasonic irradiation. It was also observed that the desired crystal morphology was obtained for the ultrasound-assisted crystallization. The conventionally obtained needle-shaped crystals transformed into plate-shaped crystals for the ultrasound-assisted crystallization. The particle size distribution was analyzed using statistical means on the basis of skewness and kurtosis values. It was observed that the skewness and excess kurtosis values for ultrasound-assisted crystallization were significantly lower as compared to the conventional approach. XRD analysis also revealed better crystal properties for the mefenamic acid processed using the ultrasound-assisted approach. The overall process intensification benefits of mefenamic acid crystallization using the ultrasound-assisted approach were reduced particle size, increase in the yield and uniform PSD coupled with desired morphology. Copyright © 2016 Elsevier B.V. All rights reserved.
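The PSD comparison reduces to standard moment statistics; the short Python sketch below (with hypothetical lognormal size samples, not the paper's measurements) shows how skewness and excess kurtosis are computed for two distributions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Hypothetical particle-size samples (micrometres):
conventional = rng.lognormal(mean=4.5, sigma=0.6, size=500)   # broad, skewed PSD
sonicated = rng.lognormal(mean=3.8, sigma=0.25, size=500)     # narrower PSD

for name, psd in (("conventional", conventional), ("ultrasound", sonicated)):
    print(f"{name:12s}  skewness={stats.skew(psd):.2f}  "
          f"excess kurtosis={stats.kurtosis(psd):.2f}")  # Fisher definition: normal = 0
```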
Citirik, Mehmet; Batman, Cosar; Bicer, Tolga; Zilelioglu, Orhan
2009-09-01
To assess the alterations in keratometric astigmatism following 25-gauge transconjunctival sutureless pars plana vitrectomy versus conventional pars plana vitrectomy. Sixteen consecutive patients were enrolled in the study. Conventional vitrectomy was applied in eight of the cases and 25-gauge transconjunctival sutureless vitrectomy was performed in eight patients. Keratometry was performed before and after the surgery. In the 25-gauge transconjunctival sutureless pars plana vitrectomy group, statistically significant changes were not observed in the corneal curvature in any post-operative follow-up measurement (p > 0.05), whereas in the conventional pars plana vitrectomy group, statistically significant changes were observed on the first postoperative day (p = 0.01) and at the first postoperative month (p = 0.03). We noted that these changes returned to baseline within three months (p = 0.26). Both 25-gauge transconjunctival sutureless and conventional pars plana vitrectomy are effective surgical modalities for selected diseases of the posterior segment. Surgical procedures are critical for the visual rehabilitation of the patients. The post-operative corneal astigmatism of vitrectomised eyes can be accurately determined at least two months post-operatively.
Directions for new developments on statistical design and analysis of small population group trials.
Hilgers, Ralf-Dieter; Roes, Kit; Stallard, Nigel
2016-06-14
Most statistical design and analysis methods for clinical trials have been developed and evaluated in settings where at least several hundred patients could be recruited. These methods may not be suitable for evaluating therapies if the sample size is unavoidably small, a setting usually termed small populations. The specific sample size cut-off, below which the standard methods fail, needs to be investigated. In this paper, the authors present their view on new developments for design and analysis of clinical trials in small population groups, where conventional statistical methods may be inappropriate, e.g., because of lack of power or poor adherence to asymptotic approximations due to sample size restrictions. Following the EMA/CHMP guideline on clinical trials in small populations, we consider directions for new developments in the area of statistical methodology for design and analysis of small population clinical trials. We relate the findings to the research activities of three projects, Asterix, IDeAl, and InSPiRe, which have received funding since 2013 within the FP7-HEALTH-2013-INNOVATION-1 framework of the EU. As not all aspects of the wide research area of small population clinical trials can be addressed, we focus on areas where we feel advances are needed and feasible. The general framework of the EMA/CHMP guideline on small population clinical trials stimulates a number of research areas. These serve as the basis for the three projects, Asterix, IDeAl, and InSPiRe, which use various approaches to develop new statistical methodology for design and analysis of small population clinical trials. Small population clinical trials refer to trials with a limited number of patients. Small populations may result from rare diseases or specific subtypes of more common diseases. New statistical methodology needs to be tailored to these specific situations. The main results from the three projects will constitute a useful toolbox for improved design and analysis of small population clinical trials. They address various challenges presented by the EMA/CHMP guideline as well as recent discussions about extrapolation. There is a need for involvement of the patients' perspective in the planning and conduct of small population clinical trials for a successful therapy evaluation.
Kawata, Masaaki; Sato, Chikara
2007-06-01
In determining the three-dimensional (3D) structure of macromolecular assemblies in single particle analysis, a large representative dataset of two-dimensional (2D) average images from a huge number of raw images is key for high resolution. Because alignments prior to averaging are computationally intensive, currently available multireference alignment (MRA) software does not survey every possible alignment. This leads to misaligned images, creating blurred averages and reducing the quality of the final 3D reconstruction. We present a new method, in which multireference alignment is harmonized with classification (multireference multiple alignment: MRMA). This method enables a statistical comparison of multiple alignment peaks, reflecting the similarities between each raw image and a set of reference images. Among the selected alignment candidates for each raw image, misaligned images are statistically excluded, based on the principle that aligned raw images of similar projections have a dense distribution around the correctly aligned coordinates in image space. This newly developed method was examined for accuracy and speed using model image sets with various signal-to-noise ratios, and with electron microscope images of the Transient Receptor Potential C3 channel and the sodium channel. In every data set, the newly developed method outperformed conventional methods in robustness against noise and in speed, creating 2D average images of higher quality. This statistically harmonized alignment-classification combination should greatly improve the quality of single particle analysis.
Holographic Refraction and the Measurement of Spherical Ametropia.
Nguyen, Nicholas Hoai Nam
2016-10-01
To evaluate the performance of a holographic logMAR chart for the subjective spherical refraction of the human eye. Bland-Altman analysis was used to assess the level of agreement between subjective spherical refraction using the holographic logMAR chart and conventional autorefraction and subjective spherical refraction. The 95% limits of agreement (LoA) were calculated between holographic refraction and the two standard methods (subjective and autorefraction). Holographic refraction gave a lower mean spherical refraction when compared to conventional refraction (LoA 0.11 ± 0.65 D) and when compared to autorefraction (LoA 0.36 ± 0.77 D). After correcting for systematic bias, this agreement is comparable to that between autorefraction and conventional subjective refraction (LoA 0.45 ± 0.79 D). After correcting for differences in vergence distance and chromatic aberration between holographic and conventional refraction, approximately 65% (group 1) of measurements between holography and conventional subjective refraction were similar (MD = 0.13 D, SD = 0.00 D). The remaining 35% (group 2) had a mean difference of 0.45 D (SD = 0.12 D) between the two subjective methods. Descriptive statistics showed group 2's mean age (21 years, SD = 13 years) was considerably lower than group 1's mean age (41 years, SD = 17 years), suggesting accommodation may have a role in the greater mean difference of group 2. Overall, holographic refraction has good agreement with conventional refraction and is a viable alternative for spherical subjective refraction. A larger bias between holographic and conventional refraction was found in younger subjects than older subjects, suggesting an association between accommodation and myopic over-correction during holographic refraction.
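Bland-Altman limits of agreement are simple to compute; below is a minimal Python sketch with hypothetical spherical refraction values (the LoA figures quoted in the abstract come from the study's own data).

```python
import numpy as np

def bland_altman(a, b):
    """Mean bias and 95% limits of agreement between two measurement methods."""
    diff = np.asarray(a) - np.asarray(b)
    bias = diff.mean()
    half_width = 1.96 * diff.std(ddof=1)
    return bias, (bias - half_width, bias + half_width)

# Hypothetical spherical refractions (dioptres) by two methods on six eyes:
holographic = np.array([-1.25, -0.50, -2.00, 0.25, -3.25, -0.75])
conventional = np.array([-1.00, -0.50, -1.75, 0.50, -3.00, -0.75])
bias, (lo, hi) = bland_altman(holographic, conventional)
print(f"bias = {bias:+.2f} D, 95% LoA = [{lo:+.2f}, {hi:+.2f}] D")
```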
Tofangchiha, Maryam; Adel, Mamak; Bakhshi, Mahin; Esfehani, Mahsa; Nazeman, Pantea; Ghorbani Elizeyi, Mojgan; Javadi, Amir
2013-01-01
Vertical root fracture (VRF) is a complication which is chiefly diagnosed radiographically. Recently, film-based radiography has been substituted with digital radiography. At the moment, there is a wide range of monitors available in the market for viewing digital images. The present study aims to compare the diagnostic accuracy, sensitivity and specificity of medical and conventional monitors in detection of vertical root fractures. In this in vitro study 228 extracted single-rooted human teeth were endodontically treated. Vertical root fractures were induced in 114 samples. The teeth were imaged by a digital charge-coupled device radiography system using the parallel technique. The images were evaluated twice by a radiologist and an endodontist on two liquid-crystal display (LCD) monitors, one medical-grade and one conventional. A Z-test was used to analyze the sensitivity, accuracy and specificity of each monitor. The significance level was set at 0.05. Inter- and intra-observer agreements were calculated by Cohen's kappa. Accuracy, specificity and sensitivity for the conventional monitor were calculated as 67.5%, 72% and 62.5%, respectively; the corresponding data for the medical-grade monitor were 67.5%, 66.5% and 68%. Statistical analysis showed no significant differences in detecting VRF between the two techniques. Inter-observer agreement for the conventional and medical monitors was 0.47 and 0.55, respectively (moderate). Intra-observer agreement was 0.78 for the medical monitor and 0.87 for the conventional one (substantial). The type of monitor does not influence the diagnosis of vertical root fractures.
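The reported figures follow from a 2x2 confusion table, and the agreement statistics from chance-corrected proportions; here is a small Python sketch of both computations with made-up binary ratings (1 = fracture present).

```python
import numpy as np

def diagnostic_summary(truth, rating):
    """Sensitivity, specificity and accuracy from binary truth/rating vectors."""
    truth, rating = np.asarray(truth), np.asarray(rating)
    tp = np.sum((truth == 1) & (rating == 1))
    tn = np.sum((truth == 0) & (rating == 0))
    fp = np.sum((truth == 0) & (rating == 1))
    fn = np.sum((truth == 1) & (rating == 0))
    return tp / (tp + fn), tn / (tn + fp), (tp + tn) / truth.size

def cohen_kappa(r1, r2):
    """Chance-corrected agreement between two binary raters."""
    r1, r2 = np.asarray(r1, dtype=float), np.asarray(r2, dtype=float)
    po = np.mean(r1 == r2)
    pe = np.mean(r1) * np.mean(r2) + np.mean(1 - r1) * np.mean(1 - r2)
    return (po - pe) / (1 - pe)

truth  = [1, 1, 1, 0, 0, 0, 1, 0]
rater1 = [1, 0, 1, 0, 1, 0, 1, 0]
rater2 = [1, 1, 0, 0, 1, 0, 1, 0]
print(diagnostic_summary(truth, rater1))
print(cohen_kappa(rater1, rater2))
```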
Optimal weighting of data to detect climatic change - Application to the carbon dioxide problem
NASA Technical Reports Server (NTRS)
Bell, T. L.
1982-01-01
It is suggested that a weighting of surface temperature data, using information about the expected level of warming in different seasons and geographical regions and statistical information about the amount of natural variability in surface temperature, can improve the chances of early detection of carbon dioxide concentration-induced climatic warming. A preliminary analysis of the optimal weighting method presented suggests that it is 25 per cent more effective in revealing surface warming than the conventional method, by virtue of the fact that 25 per cent more data must conventionally be analyzed in order to arrive at a similar probability of detection. An approximate calculation suggests that the warming ought to have already been detected if the only sources of significant surface temperature variability had time scales of less than one year.
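Weighting each region by its expected signal relative to its noise variance is essentially a matched filter; the short Python sketch below (with invented per-region signal and variability values, not the paper's data) contrasts the detection signal-to-noise ratio of equal weighting with signal/variance weighting. This reflects our reading of the approach, not the paper's exact scheme.

```python
import numpy as np

rng = np.random.default_rng(2)
n_regions = 10
signal = rng.uniform(0.2, 1.0, n_regions)     # expected warming per region/season
noise_var = rng.uniform(0.5, 2.0, n_regions)  # natural variability per region/season

def detector_snr(w):
    """Expected response of the weighted sum divided by its noise standard deviation."""
    return (w @ signal) / np.sqrt(np.sum(w**2 * noise_var))

w_conv = np.ones(n_regions)   # conventional: weight all data equally
w_opt = signal / noise_var    # optimal: matched-filter weights

print(f"conventional SNR: {detector_snr(w_conv):.2f}")
print(f"optimal SNR:      {detector_snr(w_opt):.2f}")  # never lower, by Cauchy-Schwarz
```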
NASA Astrophysics Data System (ADS)
Chlebda, Damian K.; Majda, Alicja; Łojewski, Tomasz; Łojewska, Joanna
2016-11-01
Differentiation of the written text can be performed with a non-invasive and non-contact tool that connects conventional imaging methods with spectroscopy. Hyperspectral imaging (HSI) is a relatively new and rapid analytical technique that can be applied in forensic science disciplines. It allows an image of the sample to be acquired, with full spectral information within every pixel. For this paper, HSI and three statistical methods (hierarchical cluster analysis, principal component analysis, and spectral angle mapper) were used to distinguish between traces of modern black gel pen inks. Non-invasiveness and high efficiency are among the unquestionable advantages of ink differentiation using HSI. It is also less time-consuming than traditional methods such as chromatography. In this study, a set of 45 modern gel pen ink marks deposited on a paper sheet were registered. The spectral characteristics embodied in every pixel were extracted from an image and analysed using statistical methods, externally and directly on the hypercube. As a result, different black gel inks deposited on paper can be distinguished and classified into several groups, in a non-invasive manner.
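Of the three statistical methods named, the spectral angle mapper is the most compact to illustrate; the Python sketch below classifies each pixel of a hypercube by its spectral angle to a set of reference ink spectra. The array shapes and names are assumptions, and the clustering and PCA steps of the paper are omitted.

```python
import numpy as np

def spectral_angles(flat_pixels, reference):
    """Spectral angle (radians) between each pixel spectrum and one reference spectrum."""
    cos = flat_pixels @ reference / (
        np.linalg.norm(flat_pixels, axis=1) * np.linalg.norm(reference))
    return np.arccos(np.clip(cos, -1.0, 1.0))

def classify_cube(cube, references):
    """Assign each pixel of an (H, W, bands) hypercube to the nearest reference ink."""
    h, w, b = cube.shape
    flat = cube.reshape(-1, b)
    angles = np.stack([spectral_angles(flat, r) for r in references], axis=1)
    return angles.argmin(axis=1).reshape(h, w)

# Tiny synthetic demonstration: a 4x4 cube of 5-band spectra, two reference inks.
rng = np.random.default_rng(7)
refs = [rng.random(5), rng.random(5)]
cube = np.stack([refs[i % 2] + 0.01 * rng.random(5) for i in range(16)]).reshape(4, 4, 5)
print(classify_cube(cube, refs))  # alternating pattern of the two ink classes
```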
Fritscher, Karl; Grunerbl, Agnes; Hanni, Markus; Suhm, Norbert; Hengg, Clemens; Schubert, Rainer
2009-10-01
Currently, conventional X-ray and CT images as well as invasive methods performed during the surgical intervention are used to judge the local quality of a fractured proximal femur. However, these approaches are either dependent on the surgeon's experience or cannot assist diagnostic and planning tasks preoperatively. Therefore, in this work a method for the individual analysis of local bone quality in the proximal femur, based on model-based analysis of CT and X-ray images of femur specimens, is proposed. A combined representation of shape and spatial intensity distribution of an object and different statistical approaches for dimensionality reduction are used to create a statistical appearance model in order to assess the local bone quality in CT and X-ray images. The developed algorithms are tested and evaluated on 28 femur specimens. It is shown that the tools and algorithms presented herein are highly suitable for automatically and objectively predicting bone mineral density values as well as a biomechanical parameter of the bone that can be measured intraoperatively.
NASA Astrophysics Data System (ADS)
Zhang, Zhu; Li, Hongbin; Tang, Dengping; Hu, Chen; Jiao, Yang
2017-10-01
Metering performance is the key parameter of an electronic voltage transformer (EVT), and it requires high accuracy. The conventional off-line calibration method using a standard voltage transformer is not suitable for the key equipment in a smart substation, which needs on-line monitoring. In this article, we propose a method for monitoring the metering performance of an EVT on-line based on cyber-physics correlation analysis. Exploiting the electrical and physical properties of a substation running in three-phase symmetry, the principal component analysis method is used to separate metering deviations caused by primary-side fluctuations from those caused by EVT anomalies. The characteristic statistics of the measured data during operation are extracted, and the metering performance of the EVT is evaluated by analyzing the change in statistics. The experimental results show that the method accurately monitors the metering deviation of a Class 0.2 EVT. The method demonstrates accurate on-line evaluation of the metering performance of an EVT without a standard voltage transformer.
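A hedged sketch of the underlying idea, assuming that a primary-side fluctuation moves all three phase voltages together while an EVT drift disturbs only one channel: fit the principal (common-mode) subspace on healthy data, then monitor the squared prediction error of new samples. The simulation values and control limit below are ours, not the paper's algorithm.

```python
import numpy as np

# Hypothetical setup: rows are time samples, columns the three phase voltages.
rng = np.random.default_rng(3)
common = 1.0 + 0.01 * rng.standard_normal((2000, 1))        # shared primary variation
healthy = common @ np.ones((1, 3)) + 1e-4 * rng.standard_normal((2000, 3))

mean = healthy.mean(axis=0)
_, _, vt = np.linalg.svd(healthy - mean, full_matrices=False)
pc = vt[:1]                                   # principal (common-mode) direction

def spe(x):
    """Squared prediction error after removing the common component."""
    r = (x - mean) - ((x - mean) @ pc.T) @ pc
    return np.sum(r**2, axis=1)

limit = np.quantile(spe(healthy), 0.999)      # empirical control limit
drift = healthy[:5].copy()
drift[:, 0] *= 1.002                          # 0.2% metering deviation on one phase
print(spe(drift) > limit)                     # deviation flagged without a reference VT
```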
Meta-analysis using Dirichlet process.
Muthukumarana, Saman; Tiwari, Ram C
2016-02-01
This article develops a Bayesian approach for meta-analysis using the Dirichlet process. The key aspect of the Dirichlet process in meta-analysis is the ability to assess evidence of statistical heterogeneity or variation in the underlying effects across studies while relaxing the distributional assumptions. We assume that the study effects are generated from a Dirichlet process. Under a Dirichlet process model, the study-effect parameters have support on a discrete space and enable borrowing of information across studies while facilitating clustering among studies. We illustrate the proposed method by applying it to a dataset on the Program for International Student Assessment covering 30 countries. Results from the data analysis, simulation studies, and the log pseudo-marginal likelihood model selection procedure indicate that the Dirichlet process model performs better than conventional alternative methods. © The Author(s) 2012.
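For intuition about why the Dirichlet process clusters study effects, here is a Python sketch of the truncated stick-breaking construction of a DP draw; the base measure, concentration parameter, and truncation level are illustrative assumptions, not the authors' model specification.

```python
import numpy as np

def stick_breaking(alpha, base_draw, n_atoms, rng):
    """Truncated stick-breaking construction of a draw G ~ DP(alpha, G0)."""
    betas = rng.beta(1.0, alpha, size=n_atoms)
    weights = betas * np.concatenate([[1.0], np.cumprod(1.0 - betas)[:-1]])
    return base_draw(n_atoms), weights / weights.sum()  # renormalize the truncation

rng = np.random.default_rng(4)
# Assumed base measure G0 = N(0, 1) for standardized study effects.
atoms, w = stick_breaking(alpha=2.0, base_draw=lambda n: rng.normal(0.0, 1.0, n),
                          n_atoms=50, rng=rng)

# Study effects drawn from G are discrete, so ties occur and studies cluster.
effects = rng.choice(atoms, size=30, p=w)
print(f"{len(np.unique(effects))} distinct effect values across 30 studies")
```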
CTTITEM: SAS macro and SPSS syntax for classical item analysis.
Lei, Pui-Wa; Wu, Qiong
2007-08-01
This article describes the functions of a SAS macro and an SPSS syntax that produce common statistics for conventional item analysis including Cronbach's alpha, item difficulty index (p-value or item mean), and item discrimination indices (D-index, point biserial and biserial correlations for dichotomous items and item-total correlation for polytomous items). These programs represent an improvement over the existing SAS and SPSS item analysis routines in terms of completeness and user-friendliness. To promote routine evaluations of item qualities in instrument development of any scale, the programs are available at no charge for interested users. The program codes along with a brief user's manual that contains instructions and examples are downloadable from suen.ed.psu.edu/-pwlei/plei.htm.
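The statistics these programs produce are straightforward to reproduce; as an illustration (in Python rather than the authors' SAS/SPSS code), the sketch below computes Cronbach's alpha, item difficulty, and corrected item-total correlations from a hypothetical dichotomous score matrix.

```python
import numpy as np

def cronbach_alpha(items):
    """items: (n_examinees, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    return k / (k - 1) * (1 - items.var(axis=0, ddof=1).sum()
                          / items.sum(axis=1).var(ddof=1))

def item_stats(items):
    """Per-item difficulty (item mean) and corrected item-total correlation."""
    items = np.asarray(items, dtype=float)
    total = items.sum(axis=1)
    difficulty = items.mean(axis=0)
    discrimination = np.array([
        np.corrcoef(items[:, j], total - items[:, j])[0, 1]  # exclude the item itself
        for j in range(items.shape[1])
    ])
    return difficulty, discrimination

# Hypothetical 0/1 responses: 6 examinees by 4 items.
x = [[1, 1, 1, 0], [1, 0, 1, 1], [0, 0, 1, 0],
     [1, 1, 1, 1], [0, 0, 0, 0], [1, 1, 0, 1]]
print(cronbach_alpha(x))
print(item_stats(x))
```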
Kohli, Divyata; Badakar, Chandrashekhar M; Vyavahare, Supriya S; Shah, Parin P; Gokhale, Niraj S; Patel, Punit M; Mundada, Madhura V
2017-01-01
Introduction: Early treatment of carious lesions in children is important for the maintenance of oral health. Multicoloured restorations could be the impetus for an extremely nervous or defiant child to accept dental treatment. Aim: The aim of this study was to assess and compare the clinical success of conventional composites and a coloured compomer material in first permanent molars of children with mixed dentition. Materials and Methods: A total of sixty sites, divided into two groups with thirty subjects in each group using a split-mouth design, were chosen amongst patients reporting to the Department of Pedodontics and Preventive Dentistry. In the control group conventional composites were placed; similarly, coloured compomers were placed in the experimental group under a standard operating protocol. Patients were recalled for assessment of clinical success in the control and experimental groups at regular intervals of one, three and six months of follow-up based on the Modified Ryge Criteria. Statistical analysis was done using the Chi-square test in SPSS version 20.0 (Chicago, USA). Results: Both conventional composites and coloured compomers had comparable retention rates in terms of anatomical form, marginal integrity, secondary caries and marginal discolouration. Conclusion: The coloured compomer material showed promising results in this six-month follow-up study in permanent molars and had properties comparable to those of conventional composites.
Identifying sources of metal exposure in organic and conventional dairy farming.
López-Alonso, M; Rey-Crespo, F; Herrero-Latorre, C; Miranda, M
2017-10-01
In humans the main route of exposure to toxic metals is through the diet, and there is therefore a clear need for this source of contamination to be minimized, particularly in food of animal origin. For this purpose, the various sources of toxic metals in livestock farming (which vary depending on the production system) must be taken into account. The objectives of the present study were to establish the profile of metal exposure in dairy cattle in Spain and to determine, by chemometric (multivariate statistical) analysis, any differences between organic and conventional systems. Blood samples from 522 cows (341 from organic farms and 181 from conventional farms) were analysed by inductively coupled plasma mass spectrometry to determine the concentrations of 14 elements: As, Cd, Co, Cr, Cu, Fe, Hg, I, Mn, Mo, Ni, Pb, Se and Zn. In conventional systems the generally high and balanced trace element concentrations in the mineral-supplemented concentrate feed strongly determined the metal status of the cattle. However, in organic systems, soil ingestion was an important contributing factor. Our results demonstrate that general information about the effects of mineral supplementation in conventional farming cannot be directly extrapolated to organic farming and special attention should be given to the contribution of ingestion of soil during grazing and/or ingestion of soil contaminated forage. Copyright © 2017 Elsevier Ltd. All rights reserved.
Folco, Alejandra A; Benítez-Rogé, Sandra C; Iglesias, Marina; Calabrese, Diana; Pelizardi, Cristina; Rosa, Alcira; Brusca, Marisa I; Hecht, Pedro; Mateu, María E
2014-01-01
Orthodontic brackets contribute to the accumulation of bacterial plaque on tooth surfaces because they hinder oral hygiene. In contrast to conventional brackets, self-ligating brackets do not require additional parts to support the arches, thus improving dental hygiene. The aim of this study was to compare the gingival response in orthodontic patients wearing self-ligating or conventional brackets. A sample of 22 patients aged 16 to 30 years was divided into two groups: Group A, treated with self-ligating brackets (Damon system) and Group B, treated with conventional brackets (Roth technique). The following were assessed during the treatment: Plaque Index (PI), Gingival Index (GI) and Probing Depth (PD), and sub-gingival samples were taken from teeth 14/24 for microbiological observation. No statistically significant difference was found between Groups A and B (p>0.05, sign-ranked) or between PI, GI and PD at the different times (Friedman's Analysis of Variance), even though the indices were found to increase at 14 days, particularly for self-ligating brackets. The quantity and quality of microorganisms present were compatible with health on days 0, 28 and 56. From day 14 onward there was a predominance of microbiota compatible with gingivitis in both groups. In the samples studied, orthodontic treatment increases bacterial plaque and inflammatory gingival response, but gingival-periodontal health can be maintained with adequate basic therapy. Self-ligating and conventional brackets produced similar gingival responses.
Dinakaran, Shiji
2015-03-01
Cervical lesions of anterior and posterior teeth are a common finding in routine dental practice. They are of much concern to the patient if present in esthetically sensitive regions. Adhesive tooth-colored restorative materials are generally recommended for treating such lesions. The aim of the present study was to evaluate and compare the effect of various food media (lime juice, tea, coffee, and Coca-Cola) on the marginal integrity of Class V compomer (Dyract®), conventional glass-ionomer (Fuji II) and resin-modified glass-ionomer (Fuji II LC improved) restorations along their cemental and enamel margins, with saline as the control medium. After restoration of prepared Class V cavities in human premolars with the three different materials (n = 8), they were immersed in the test media for 7 days and then stained with methylene blue dye. Buccolingual sections were prepared and examined under a stereomicroscope, and scores (0-2) were given. Data were analyzed statistically using one-way analysis of variance in SPSS version 16.0. P < 0.05 was considered statistically significant. Among the three tested materials, compomer (Dyract®) showed more marginal integrity than the other two. Microleakage values of Fuji II and Fuji II LC improved differed statistically significantly in acidic media (lime juice and Coca-Cola) compared to saline. Enamel margins showed more marginal adaptation than cemental margins.
Vadhana, Sekar; Latha, Jothi; Velmurugan, Natanasabapathy
2015-05-01
This study evaluated the penetration depth of 2% chlorhexidine digluconate (CHX) into root dentinal tubules and the influence of passive ultrasonic irrigation (PUI) using a confocal laser scanning microscope (CLSM). Twenty freshly extracted anterior teeth were decoronated and instrumented using Mtwo rotary files up to size 40, 4% taper. The samples were randomly divided into two groups (n = 10), that is, conventional syringe irrigation (CSI) and PUI. CHX was mixed with Rhodamine B dye and was used as the final irrigant. The teeth were sectioned at coronal, middle and apical levels and viewed under CLSM to record the penetration depth of CHX. The data were statistically analyzed using Kruskal-Wallis and Mann-Whitney U tests. The mean penetration depths of 2% CHX in the coronal, middle and apical thirds were 138 µm, 80 µm and 44 µm in the CSI group, respectively, whereas the mean penetration depths were 209 µm, 138 µm and 72 µm, respectively, in the PUI group. A statistically significant difference was present between the CSI and PUI groups at all three levels (p < 0.01 for the coronal third and p < 0.001 for the middle and apical thirds). On intragroup analysis, both groups showed statistically significant differences among the three levels (p < 0.001). Penetration of 2% CHX into root dentinal tubules was deeper in the coronal third than in the middle and apical thirds. PUI aided in deeper penetration of 2% CHX into dentinal tubules when compared to conventional syringe irrigation at all three levels.
Goat milk free fatty acid characterization during conventional and ohmic heating pasteurization.
Pereira, R N; Martins, R C; Vicente, A A
2008-08-01
The disruption of the milk fat globule membrane can lead to an excessive accumulation of free fatty acids in milk, which is frequently associated with the appearance of rancid flavors. Solid-phase microextraction and gas chromatography techniques have been shown to be useful tools in the quantification of individual free fatty acids in dairy products, providing enough sensitivity to detect levels of rancidity in milk. Therefore, the aim of this study was to characterize the short-chain and medium-chain free fatty acid profile in i) raw untreated goat milk; ii) raw goat milk passing through pumps and heating units (plate-and-frame heat exchanger and ohmic heater); and iii) goat milk processed by conventional and ohmic pasteurization, in order to determine the influence of each treatment on the final quality of the milk. Multivariate statistical analysis showed that the treatments studied were not responsible for the variability found in free fatty acid contents. In particular, it was possible to conclude that ohmic pasteurization at 72 degrees C for 15 s did not promote an extended modification of free fatty acid contents in goat milk when compared with conventional pasteurization. Furthermore, principal component analysis showed that capric acid can be used to discriminate goat milk with different free fatty acid concentrations. Hierarchical cluster analysis showed evidence of correlations between the contents of short- and medium-chain free fatty acids in goat milk.
The influence of the compression interface on the failure behavior and size effect of concrete
NASA Astrophysics Data System (ADS)
Kampmann, Raphael
The failure behavior of concrete materials is not completely understood because conventional test methods fail to assess the material response independently of sample size and shape. To study the influence of strength- and strain-affecting test conditions, four typical concrete sample types were experimentally evaluated in uniaxial compression and analyzed for strength, deformational behavior, crack initiation/propagation, and fracture patterns under varying boundary conditions. Both low-friction and conventional compression interfaces were assessed. High-speed video technology was used to monitor macrocracking. Inferential data analysis showed reliably lower strength results for reduced surface friction at the compression interfaces, regardless of sample shape. Reciprocal comparisons revealed statistically significant strength differences between most sample shapes. Crack initiation and propagation were found to differ for dissimilar compression interfaces. The principal stress and strain distributions were analyzed, and the strain domain was found to resemble the experimental results, whereas the stress analysis failed to explain failure for reduced end confinement. Neither stresses nor strains indicated strength reductions due to reduced friction, and therefore, buckling effects were considered. The high-speed video analysis revealed localized buckling phenomena, regardless of end confinement. Slender elements were the result of low friction, and stocky fragments developed under conventional confinement. The critical buckling load increased accordingly. The research showed that current test methods do not reflect the "true" compressive strength and that concrete failure is strain driven.
Change Detection in Rough Time Series
2014-09-01
Rough time series have a distribution that can present significant challenges to conventional statistical tracking techniques. To address this problem, the proposed method applies hybrid fuzzy statistical techniques to series granules instead of to individual measures. Three examples demonstrated the robust nature of the approach.
Riffel, Philipp; Michaely, Henrik J; Morelli, John N; Paul, Dominik; Kannengiesser, Stephan; Schoenberg, Stefan O; Haneder, Stefan
2015-04-01
The purpose of this study was to evaluate the feasibility and technical quality of a zoomed three-dimensional (3D) turbo spin-echo (TSE) sampling perfection with application optimized contrasts using different flip-angle evolutions (SPACE) sequence of the lumbar spine. In this prospective feasibility study, nine volunteers underwent a 3-T magnetic resonance examination of the lumbar spine including 1) a conventional 3D T2-weighted (T2w) SPACE sequence with generalized autocalibrating partially parallel acquisition technique acceleration factor 2 and 2) a zoomed 3D T2w SPACE sequence with a reduced field of view (reduction factor 2). Images were evaluated with regard to image sharpness, signal homogeneity, and the presence of artifacts by two experienced radiologists. For quantitative analysis, signal-to-noise ratio (SNR) values were calculated. Image sharpness of anatomic structures was statistically significantly greater with zoomed SPACE (P < .0001), whereas the signal homogeneity was statistically significantly greater with conventional SPACE (cSPACE; P = .0003). There were no statistically significant differences in extent of artifacts. Acquisition times were 8:20 minutes for cSPACE and 6:30 minutes for zoomed SPACE. Readers 1 and 2 selected zoomed SPACE as the preferred sequence in five of nine cases. In two of nine cases, both sequences were rated as equally preferred by both readers. SNR values were statistically significantly greater with cSPACE. In comparison to the cSPACE sequence, zoomed SPACE imaging of the lumbar spine provides sharper images in conjunction with a 25% reduction in acquisition time. Copyright © 2015 AUR. Published by Elsevier Inc. All rights reserved.
Duque, Jussaro Alves; Vivan, Rodrigo Ricci; Cavenago, Bruno Cavalini; Amoroso-Silva, Pablo Andrés; Bernardes, Ricardo Affonso; Vasconcelos, Bruno Carvalho de; Duarte, Marco Antonio Hungaro
2017-01-01
This study aimed to evaluate the influence of the NiTi wire in Conventional NiTi (ProTaper Universal, PTU) and Controlled Memory NiTi (ProTaper Gold, PTG) instrument systems on the quality of root canal preparation. Twelve mandibular molars with separate mesial canals were scanned using a high-definition microcomputed tomography system. The PTU and PTG instruments were used to shape twelve mesial canals each. The canals were scanned after preparation with the F2 and F3 instruments of the PTU and PTG systems. The analyzed parameters included the remaining dentin thickness at the apical and cervical levels, root canal volume and untouched canal walls. Data were analyzed for statistical significance by the Friedman and Dunn tests. For the comparison of data between groups, the Mann-Whitney test was used. In the pre-operative analysis, there were no statistically significant differences between the groups in terms of the area and volume of root canals (P>.05). There was also no statistically significant difference between the systems with respect to root canal volume after use of the F2 and F3 instruments. There was no statistical difference in dentin thickness at the first apical level between before and after instrumentation for either system. At the 3 cervical levels, the PTG maintained centralization of the preparation on the transition between the F2 and F3 instruments, which did not occur with the PTU. In conclusion, the Conventional NiTi (PTU) and Controlled Memory NiTi (PTG) instruments displayed comparable capabilities for shaping the straight mesial root canals of mandibular molars, although the PTG was better than the PTU at maintaining the centralization of the shape in the cervical portion.
NASA Astrophysics Data System (ADS)
Ogawa, T.; Sato, T.; Hashimoto, S.; Niita, K.
2013-09-01
The fragmentation cross-sections of relativistic energy nucleus-nucleus collisions were analyzed using the statistical multi-fragmentation model (SMM) incorporated with the Monte-Carlo radiation transport simulation code particle and heavy ion transport code system (PHITS). Comparison with the literature data showed that PHITS-SMM reproduces fragmentation cross-sections of heavy nuclei at relativistic energies better than the original PHITS by up to two orders of magnitude. It was also found that SMM does not degrade the neutron production cross-sections in heavy ion collisions or the fragmentation cross-sections of light nuclei, for which SMM has not been benchmarked. Therefore, SMM is a robust model that can supplement conventional nucleus-nucleus reaction models, enabling more accurate prediction of fragmentation cross-sections.
Combining statistical inference and decisions in ecology.
Williams, Perry J; Hooten, Mevin B
2016-09-01
Statistical decision theory (SDT) is a sub-field of decision theory that formally incorporates statistical investigation into a decision-theoretic framework to account for uncertainties in a decision problem. SDT provides a unifying analysis of three types of information: statistical results from a data set, knowledge of the consequences of potential choices (i.e., loss), and prior beliefs about a system. SDT links the theoretical development of a large body of statistical methods, including point estimation, hypothesis testing, and confidence interval estimation. The theory and application of SDT have mainly been developed and published in the fields of mathematics, statistics, operations research, and other decision sciences, but have had limited exposure in ecology. Thus, we provide an introduction to SDT for ecologists and describe its utility for linking the conventionally separate tasks of statistical investigation and decision making in a single framework. We describe the basic framework of both Bayesian and frequentist SDT, its traditional use in statistics, and discuss its application to decision problems that occur in ecology. We demonstrate SDT with two types of decisions: Bayesian point estimation and an applied management problem of selecting a prescribed fire rotation for managing a grassland bird species. Central to SDT, and decision theory in general, are loss functions. Thus, we also provide basic guidance and references for constructing loss functions for an SDT problem. © 2016 by the Ecological Society of America.
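Because loss functions are central to SDT, a brief Python sketch may help: the Bayes point estimate is whatever action minimizes posterior expected loss, so the estimate shifts with the loss chosen. The posterior draws and the asymmetric loss below are invented for illustration and are not from the article's grassland-bird example.

```python
import numpy as np

rng = np.random.default_rng(5)
posterior = rng.beta(8, 3, size=20_000)   # hypothetical posterior for a survival rate

# Squared-error loss is minimized by the posterior mean,
# absolute-error loss by the posterior median:
est_squared = posterior.mean()
est_absolute = np.median(posterior)

# An asymmetric loss (underestimation three times as costly) moves the
# optimal action upward; minimize expected loss over a grid of actions.
asym = lambda a, th: np.where(th > a, 3.0 * (th - a), a - th)
grid = np.linspace(0.0, 1.0, 201)
est_asym = grid[np.argmin([np.mean(asym(a, posterior)) for a in grid])]

print(f"mean {est_squared:.3f}, median {est_absolute:.3f}, asymmetric {est_asym:.3f}")
```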
Charpentier, R.R.; Klett, T.R.
2005-01-01
During the last 30 years, the methodology for assessment of undiscovered conventional oil and gas resources used by the U.S. Geological Survey has undergone considerable change. This evolution has been based on five major principles. First, the U.S. Geological Survey has responsibility for a wide range of U.S. and world assessments and requires a robust methodology suitable for immaturely explored as well as maturely explored areas. Second, the assessments should be based on as comprehensive a set of geological and exploration history data as possible. Third, the perils of methods that rely solely on statistical methods without geological analysis are recognized. Fourth, the methodology and course of the assessment should be documented as transparently as possible, within the limits imposed by the inevitable use of subjective judgement. Fifth, the multiple uses of the assessments require a continuing effort to provide the documentation in such ways as to increase utility to the many types of users. Undiscovered conventional oil and gas resources are those recoverable volumes in undiscovered, discrete, conventional structural or stratigraphic traps. The USGS 2000 methodology for these resources is based on a framework of assessing numbers and sizes of undiscovered oil and gas accumulations and the associated risks. The input is standardized on a form termed the Seventh Approximation Data Form for Conventional Assessment Units. Volumes of resource are then calculated using a Monte Carlo program named Emc2, but an alternative analytic (non-Monte Carlo) program named ASSESS also can be used. The resource assessment methodology continues to change. Accumulation-size distributions are being examined to determine how sensitive the results are to size-distribution assumptions. The resource assessment output is changing to provide better applicability for economic analysis. The separate methodology for assessing continuous (unconventional) resources also has been evolving. Further studies of the relationship between geologic models of conventional and continuous resources will likely impact the respective resource assessment methodologies. © 2005 International Association for Mathematical Geology.
Camin, Federica; Pavone, Anita; Bontempo, Luana; Wehrens, Ron; Paolini, Mauro; Faberi, Angelo; Marianella, Rosa Maria; Capitani, Donatella; Vista, Silvia; Mannina, Luisa
2016-04-01
Isotope Ratio Mass Spectrometry (IRMS), (1)H Nuclear Magnetic Resonance ((1)H NMR), conventional chemical analysis and chemometric elaboration were used to assess quality and to define and confirm the geographical origin of 177 Italian PDO (Protected Denomination of Origin) olive oils and 86 samples imported from Tunisia. Italian olive oils were richer in squalene and unsaturated fatty acids, whereas Tunisian olive oils showed higher δ(18)O, δ(2)H, linoleic acid, saturated fatty acid, β-sitosterol, and sn-1,3-diglyceride values. Furthermore, all the imported Tunisian samples were of poor quality, with K232 and/or acidity values above the limits established for extra virgin olive oils. By combining isotopic composition with (1)H NMR data using a multivariate statistical approach, a statistical model able to discriminate olive oils from Italy from those imported from Tunisia was obtained, with a differentiation ability of around 98%. Copyright © 2015 Elsevier Ltd. All rights reserved.
Complex Network Analysis for Characterizing Global Value Chains in Equipment Manufacturing.
Xiao, Hao; Sun, Tianyang; Meng, Bo; Cheng, Lihong
2017-01-01
The rise of global value chains (GVCs) characterized by the so-called "outsourcing", "fragmentation production", and "trade in tasks" has been considered one of the most important phenomena for the 21st century trade. GVCs also can play a decisive role in trade policy making. However, due to the increasing complexity and sophistication of international production networks, especially in the equipment manufacturing industry, conventional trade statistics and the corresponding trade indicators may give us a distorted picture of trade. This paper applies various network analysis tools to the new GVC accounting system proposed by Koopman et al. (2014) and Wang et al. (2013) in which gross exports can be decomposed into value-added terms through various routes along GVCs. This helps to divide the equipment manufacturing-related GVCs into some sub-networks with clear visualization. The empirical results of this paper significantly improve our understanding of the topology of equipment manufacturing-related GVCs as well as the interdependency of countries in these GVCs that is generally invisible from the traditional trade statistics.
Effective Thermal Inactivation of the Spores of Bacillus cereus Biofilms Using Microwave.
Park, Hyong Seok; Yang, Jungwoo; Choi, Hee Jung; Kim, Kyoung Heon
2017-07-28
Microwave sterilization was performed to inactivate the spores of biofilms of Bacillus cereus involved in foodborne illness. The sterilization conditions, such as the amount of water and the operating temperature and treatment time, were optimized using statistical analysis based on 15 runs of experimental results designed by the Box-Behnken method. Statistical analysis showed that the optimal conditions for the inactivation of B. cereus biofilms were 14 ml of water, a temperature of 108°C, and a treatment time of 15 min. Interestingly, response surface plots showed that the amount of water is the most important factor for microwave sterilization under the present conditions. Complete inactivation by microwaves was achieved in 5 min, and the inactivation efficiency of the microwave treatment was markedly higher than that of a conventional steam autoclave. Finally, confocal laser scanning microscopy images showed that the principal effect of microwave treatment was cell membrane disruption. Thus, this study can contribute to the development of a process to control food-associated pathogens.
Estimating differential expression from multiple indicators
Ilmjärv, Sten; Hundahl, Christian Ansgar; Reimets, Riin; Niitsoo, Margus; Kolde, Raivo; Vilo, Jaak; Vasar, Eero; Luuk, Hendrik
2014-01-01
Despite the advent of high-throughput sequencing, microarrays remain central in current biomedical research. Conventional microarray analysis pipelines apply data reduction before the estimation of differential expression, which is likely to render the estimates susceptible to noise from signal summarization and reduce statistical power. We present a probe-level framework, which capitalizes on the high number of concurrent measurements to provide more robust differential expression estimates. The framework naturally extends to various experimental designs and target categories (e.g. transcripts, genes, genomic regions) as well as small sample sizes. Benchmarking in relation to popular microarray and RNA-sequencing data-analysis pipelines indicated high and stable performance on the Microarray Quality Control dataset and in a cell-culture model of hypoxia. Experimental data exhibiting long-range epigenetic silencing of gene expression were used to demonstrate the efficacy of detecting differential expression of genomic regions, a level of analysis not embraced by conventional workflows. Finally, we designed and conducted an experiment to identify hypothermia-responsive genes in terms of monotonic time-response. As a novel insight, hypothermia-dependent up-regulation of multiple genes of two major antioxidant pathways was identified and verified by quantitative real-time PCR.
Gomes, Cid André Fidelis de Paula; El Hage, Yasmin; Amaral, Ana Paula; Politti, Fabiano; Biasotto-Gonzalez, Daniela Aparecida
2014-01-01
Temporomandibular disorder (TMD) is the most common source of orofacial pain of a non-dental origin. Sleep bruxism is characterized by clenching and/or grinding the teeth during sleep and is involved in the perpetuation of TMD. The aim of the present study was to investigate the effects of massage therapy, conventional occlusal splint therapy and silicone occlusal splint therapy on electromyographic activity in the masseter and anterior temporal muscles and the intensity of signs and symptoms in individuals with severe TMD and sleep bruxism. Sixty individuals with severe TMD and sleep bruxism were randomly distributed into four treatment groups: 1) massage group, 2) conventional occlusal splint group, 3) massage + conventional occlusal splint group and 4) silicone occlusal splint group. Block randomization was employed and sealed opaque envelopes were used to conceal the allocation. Groups 2, 3 and 4 wore an occlusal splint for four weeks. Groups 1 and 3 received three weekly massage sessions for four weeks. All groups were evaluated before and after treatment through electromyographic analysis of the masseter and anterior temporal muscles and the Fonseca Patient History Index. The Wilcoxon test was used to compare the effects of the different treatments and repeated-measures ANOVA was used to determine the intensity of TMD. The inter-group analysis of variance revealed no statistically significant differences in median frequency among the groups prior to treatment. In the intra-group analysis, no statistically significant differences were found between pre-treatment and post-treatment evaluations in any of the groups. Group 3 demonstrated a greater improvement in the intensity of TMD in comparison to the other groups. Massage therapy and the use of an occlusal splint had no significant influence on electromyographic activity of the masseter or anterior temporal muscles. However, the combination of therapies led to a reduction in the intensity of signs and symptoms among individuals with severe TMD and sleep bruxism. This study was registered in August 2014 at ClinicalTrials.gov (NCT01874041).
Basaki, Kinga; Alkumru, Hasan; De Souza, Grace; Finer, Yoav
To assess the three-dimensional (3D) accuracy and clinical acceptability of implant definitive casts fabricated using a digital impression approach and to compare the results with those of a conventional impression method in a partially edentulous condition. A mandibular reference model was fabricated with implants in the first premolar and molar positions to simulate a patient with bilateral posterior edentulism. Ten implant-level impressions per method were made using either an intraoral scanner with scanning abutments for the digital approach or an open-tray technique and polyvinylsiloxane material for the conventional approach. 3D analysis and comparison of implant location on resultant definitive casts were performed using laser scanner and quality control software. The inter-implant distances and interimplant angulations for each implant pair were measured for the reference model and for each definitive cast (n = 20 per group); these measurements were compared to calculate the magnitude of error in 3D for each definitive cast. The influence of implant angulation on definitive cast accuracy was evaluated for both digital and conventional approaches. Statistical analysis was performed using t test (α = .05) for implant position and angulation. Clinical qualitative assessment of accuracy was done via the assessment of the passivity of a master verification stent for each implant pair, and significance was analyzed using chi-square test (α = .05). A 3D error of implant positioning was observed for the two impression techniques vs the reference model, with mean ± standard deviation (SD) error of 116 ± 94 μm and 56 ± 29 μm for the digital and conventional approaches, respectively (P = .01). In contrast, the inter-implant angulation errors were not significantly different between the two techniques (P = .83). Implant angulation did not have a significant influence on definitive cast accuracy within either technique (P = .64). The verification stent demonstrated acceptable passive fit for 11 out of 20 casts and 18 out of 20 casts for the digital and conventional methods, respectively (P = .01). Definitive casts fabricated using the digital impression approach were less accurate than those fabricated from the conventional impression approach for this simulated clinical scenario. A significant number of definitive casts generated by the digital technique did not meet clinically acceptable accuracy for the fabrication of a multiple implant-supported restoration.
Crack closure on rehydration of glass-ionomer materials.
Sidhu, Sharanbir K; Pilecki, Peter; Sherriff, Martyn; Watson, Timothy F
2004-10-01
Moisture-sensitivity of immature glass-ionomer cements suggests that hydration-induced volumetric expansion might close and potentially heal established cracks. Crack closure in glass-ionomer cements (GICs) was observed following rehydration. Circular cavities were prepared in 15 teeth: 10 were restored with resin-modified GICs (5 with Fuji II LC and 5 with Photac-Fil) and 5 were restored with a conventional GIC (Fuji IX); all were dehydrated for 1 min with air and imaged immediately by confocal microscopy. Crack formation in each was located, after which water was placed on the surface and observed for 15 min via a CCD camera. Dehydration caused cracks with measurable gaps, while rehydration resulted in varying degrees of closure: closure was limited in the conventional GIC, and complete or near complete along part/s of the crack in the resin-modified GICs. In all, closure movement became imperceptible after the first 10 min. Statistical analysis indicated no significant difference between the closure behavior of all materials. However, the resin-modified GICs appeared to show a greater potential for closure of established cracks than the conventional GIC upon rehydration.
Video-based teleradiology for intraosseous lesions. A receiver operating characteristic analysis.
Tyndall, D A; Boyd, K S; Matteson, S R; Dove, S B
1995-11-01
Private dental practitioners often lack immediate access to off-site expert diagnostic consultants for unusual radiographic findings or radiographic quality assurance issues. Teleradiology, a system for transmitting radiographic images, offers a potential solution to this problem. Although much research has been done to evaluate the feasibility and utilization of teleradiology systems in medical imaging, little research on dental applications has been performed. In this investigation, 47 panoramic films with an equal distribution of images with intraosseous jaw lesions and no disease were viewed by a panel of observers using teleradiology and conventional viewing methods. The teleradiology system consisted of an analog video-based system simulating remote radiographic consultation between a general dentist and a dental imaging specialist. Conventional viewing consisted of traditional viewbox methods. Observers were asked to identify the presence or absence of 24 intraosseous lesions and to determine their locations. No statistically significant differences between modalities or among observers were identified at the 0.05 level. The results indicate that viewing intraosseous lesions on video-based panoramic images is equivalent to conventional light-box viewing.
Do flexible acrylic resin lingual flanges improve retention of mandibular complete dentures?
Ahmed Elmorsy, Ayman Elmorsy; Ahmed Ibraheem, Eman Mostafa; Ela, Alaa Aboul; Fahmy, Ahmed; Nassani, Mohammad Zakaria
2015-01-01
Objectives: The aim of this study was to compare the retention of conventional mandibular complete dentures with that of mandibular complete dentures having lingual flanges constructed with flexible acrylic resin “Versacryl.” Materials and Methods: The study sample comprised 10 completely edentulous patients. Each patient received one maxillary complete denture and two mandibular complete dentures. One mandibular denture was made of conventional heat-cured acrylic resin and the other had its lingual flanges made of the flexible acrylic resin Versacryl. A digital force-meter was used to measure retention of the mandibular dentures at delivery and at 2 weeks and 45 days following denture insertion. Results: The statistical analysis showed that at baseline and at the follow-up appointments, retention of mandibular complete dentures with flexible lingual flanges was significantly greater than retention of conventional mandibular dentures (P < 0.05). In both types of mandibular dentures, retention increased significantly over the follow-up period (P < 0.05). Conclusions: The use of flexible acrylic resin lingual flanges in the construction of mandibular complete dentures improved denture retention. PMID:26539387
Tabelow, Karsten; König, Reinhard; Polzehl, Jörg
2016-01-01
Estimation of learning curves is ubiquitously based on proportions of correct responses within moving trial windows. Thereby, it is tacitly assumed that learning performance is constant within the moving windows, which, however, is often not the case. In the present study we demonstrate that violations of this assumption lead to systematic errors in the analysis of learning curves, and we explored the dependency of these errors on window size, different statistical models, and learning phase. To reduce these errors in the analysis of single-subject data as well as on the population level, we propose adequate statistical methods for the estimation of learning curves and the construction of confidence intervals, trial by trial. Applied to data from an avoidance learning experiment with rodents, these methods revealed performance changes occurring at multiple time scales within and across training sessions which were otherwise obscured in the conventional analysis. Our work shows that the proper assessment of the behavioral dynamics of learning at high temporal resolution can shed new light on specific learning processes, and, thus, allows to refine existing learning concepts. It further disambiguates the interpretation of neurophysiological signal changes recorded during training in relation to learning. PMID:27303809
Performance analysis of cutting graphite-epoxy composite using a 90,000 psi abrasive waterjet
NASA Astrophysics Data System (ADS)
Choppali, Aiswarya
Graphite-epoxy composites are widely used in many aerospace and structural applications because of their properties, which include light weight, a high strength-to-weight ratio, and great flexibility in design. However, the inherent anisotropy of these composites makes it difficult to machine them using conventional methods. To overcome the major issues that develop with conventional machining, such as fiber pull-out, delamination, heat generation, and high tooling costs, an effort is herein made to study abrasive waterjet machining of composites. An abrasive waterjet is used to cut 1" thick graphite-epoxy composites based on baseline data obtained from cutting ¼" thick material. The objective of this project is to study the surface roughness of the cut surface, with a focus on demonstrating the benefits of using higher pressures for cutting composites. The effects of the major cutting parameters (jet pressure, traverse speed, abrasive feed rate, and cutting head size) are studied at different levels. Statistical analysis of the experimental data provides an understanding of the effect of the process parameters on surface roughness. Additionally, the effect of these parameters on the taper angle of the cut is studied. The data are analyzed to obtain a set of process parameters that optimize the cutting of 1" thick graphite-epoxy composite, and the statistical analysis is used to validate the experimental data. Costs involved in the cutting process are investigated in terms of abrasive consumed to better understand and illustrate the practical benefits of using higher pressures. It is demonstrated that, as pressure increased, ultra-high-pressure waterjets produced a better surface quality at a faster traverse rate with lower costs.
Quantitative analysis of tympanic membrane perforation: a simple and reliable method.
Ibekwe, T S; Adeosun, A A; Nwaorgu, O G
2009-01-01
Accurate assessment of the features of tympanic membrane perforation, especially size, site, duration, and aetiology, is important, as it enables optimum management. To describe a simple, cheap and effective method of quantitatively analysing tympanic membrane perforations. The system described comprises a video-otoscope (capable of generating still and video images of the tympanic membrane), adapted via a universal serial bus box to a computer screen, with images analysed using the Image J geometrical analysis software package. The reproducibility of results and their correlation with conventional otoscopic methods of estimation were tested statistically with the paired t-test and correlational tests, using the Statistical Package for the Social Sciences version 11 software. The following equation was generated: percentage perforation = (P/T) × 100%, where P is the area (in pixels²) of the tympanic membrane perforation and T is the total area (in pixels²) of the entire tympanic membrane (including the perforation). Illustrations are shown. Comparison of blinded data on tympanic membrane perforation area obtained independently from assessments by two trained otologists, of comparable years of experience, using the video-otoscopy system described, showed similar findings, with strong correlations devoid of inter-observer error (p = 0.000, r = 1). Comparison with conventional otoscopic assessment also indicated significant correlation, comparing results for the two trained otologists, but some inter-observer variation was present (p = 0.000, r = 0.896). Correlation between the two methods for each of the otologists was also highly significant (p = 0.000). A computer-adapted video-otoscope, with images analysed by Image J software, represents a cheap, reliable, technology-driven clinical method of quantitative analysis of tympanic membrane perforations and injuries.
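For illustration, the perforation metric above reduces to a one-line computation once the two pixel areas have been measured (e.g., in Image J). A minimal sketch in Python; the pixel values are invented for the example:

```python
# Hypothetical sketch of the P/T x 100% formula; areas are in pixels^2.
def percent_perforation(perforation_px2: float, total_membrane_px2: float) -> float:
    """Percentage perforation, where the total area includes the perforation."""
    if not 0 < perforation_px2 <= total_membrane_px2:
        raise ValueError("perforation area must be positive and not exceed the total")
    return 100.0 * perforation_px2 / total_membrane_px2

print(percent_perforation(12_450, 98_700))  # ~12.6% of the tympanic membrane
```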
Tur Martínez, Jaume; Petrone, Patrizio; Axelrad, Alexander; Marini, Corrado P
2018-05-12
Thromboelastography (TEG) provides an in-vivo assessment of viscoelastic clot strength in whole blood, in contrast to conventional coagulation tests (CCT), which may not reflect the influence of platelets. The aim of this study was to compare TEG vs. CCT in trauma patients stratified by mechanism of injury (MOI) and pre-existing coagulation status. A retrospective, observational study of 230 polytrauma patients admitted to a University Hospital Level 1 Trauma Center, with TEG and CCT on admission, stratified by MOI: multiple trauma (MT), isolated traumatic brain injury (TBI), or MT+TBI. Statistical analysis included correlation between TEG and CCT in all groups and a subgroup analysis of anticoagulated patients. Data were analyzed with ANOVA, Spearman correlation, and linear regression when appropriate. Statistical significance was accepted at P<0.05. TEG was normal in 28.7%, hypercoagulable in 68.3%, and hypocoagulable in 7% of patients. There was no difference in TEG status among the groups. The coagulation status was not affected by age, ISS, or shock. The CCT were abnormal in 63.6% of patients with normal TEG. Normal or hypercoagulable TEG was found in 21/23 patients on Coumadin who had elevated INR and in 10/11 patients on NOAC. An analysis of the 23 patients on Coumadin stratified by INR showed normal or hypercoagulable TEG in 21/23 patients; only 2 patients had hypocoagulable TEG. Mortality was 5.2% (58.3% severe TBI). TEG is more useful than CCT in polytrauma patients, including patients on anticoagulants. TBI could increase the incidence of hypercoagulability in trauma. CCT are not useful from the standpoint of treatment. Copyright © 2018 AEC. Publicado por Elsevier España, S.L.U. All rights reserved.
Large Metal Heads and Vitamin E Polyethylene Increase Frictional Torque in Total Hip Arthroplasty.
Meneghini, R Michael; Lovro, Luke R; Wallace, Joseph M; Ziemba-Davis, Mary
2016-03-01
Trunnionosis has reemerged in modern total hip arthroplasty for reasons that remain unclear. Bearing frictional torque transmits forces to the modular head-neck interface, which may contribute to taper corrosion. The purpose of this study is to compare frictional torque of modern bearing couples in total hip arthroplasty. Mechanical testing based on in vivo loading conditions was used to measure frictional torque. All bearing couples were lubricated and tested at 1 Hz for more than 2000 cycles. The bearing couples tested included conventional, highly crosslinked (XLPE) and vitamin E polyethylene, CoCr, and ceramic femoral heads and dual-mobility bearings. Statistical analysis was performed using the Student t test for single-variable comparisons and analysis of variance for multivariate comparisons. P ≤ .05 was considered statistically significant. Large CoCr metal heads (≥36 mm) substantially increased frictional torque against XLPE liners (P = .01), a finding not observed in ceramic heads. Vitamin E polyethylene substantially increased frictional torque compared with XLPE in CoCr and ceramic heads (P = .001), whereas a difference between conventional and XLPE was not observed (P = .69) with the numbers available. Dual-mobility bearing with ceramic inner head demonstrated the lowest mean frictional torque of all bearing couples. In this simulated in vivo model, large-diameter CoCr femoral heads and vitamin E polyethylene liners are associated with increased frictional torque compared with smaller metal heads and XLPE, respectively. The increased frictional torque of vitamin E polyethylene and larger-diameter femoral heads should be considered and further studied, along with reported benefits of these modern bearing couples. Copyright © 2016 Elsevier Inc. All rights reserved.
Havla, Lukas; Schneider, Moritz J; Thierfelder, Kolja M; Beyer, Sebastian E; Ertl-Wagner, Birgit; Reiser, Maximilian F; Sommer, Wieland H; Dietrich, Olaf
2016-02-01
The purpose of this study was to propose and evaluate a new wavelet-based technique for classification of arterial and venous vessels using time-resolved cerebral CT perfusion data sets. Fourteen consecutive patients (mean age 73 yr, range 17-97) with suspected stroke but no pathology in follow-up MRI were included. A CT perfusion scan with 32 dynamic phases was performed during intravenous bolus contrast-agent application. After rigid-body motion correction, a Paul wavelet (order 1) was used to calculate, voxel by voxel, the wavelet power spectrum (WPS) of each attenuation-time course. The angiographic intensity A was defined as the maximum of the WPS, located at the coordinates T (time axis) and W (scale/width axis) within the WPS. Using these three parameters (A, T, W) separately as well as combined by (1) Fisher's linear discriminant analysis (FLDA), (2) logistic regression (LogR) analysis, or (3) support vector machine (SVM) analysis, their potential to classify 18 different arterial and venous vessel segments per subject was evaluated. The best vessel classification was obtained using all three parameters A, T, and W [area under the curve (AUC): 0.953 with FLDA and 0.957 with LogR or SVM]. In direct comparison, the wavelet-derived parameters provided performance at least equal to conventional attenuation-time-course parameters. The maximum AUC obtained from the proposed wavelet parameters was slightly (although not statistically significantly) higher than the maximum AUC (0.945) obtained from the conventional parameters. A new method to classify arterial and venous cerebral vessels with high statistical accuracy was introduced based on the time-domain wavelet transform of dynamic CT perfusion data in combination with linear or nonlinear multidimensional classification techniques.
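The classification stage described above can be sketched as follows, assuming the three wavelet features (A, T, W) have already been extracted per vessel segment; the data below are synthetic stand-ins, not the study's measurements:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 252  # e.g., 14 subjects x 18 vessel segments
y = rng.integers(0, 2, n)                 # 0 = vein, 1 = artery (synthetic labels)
X = np.column_stack([
    rng.normal(1.0 + 0.8 * y, 0.5),       # A: invented arterial/venous separation
    rng.normal(12.0 - 3.0 * y, 2.0),      # T: arterial peak assumed earlier
    rng.normal(8.0 + 1.5 * y, 1.5),       # W: bolus width assumed to differ
])

# Cross-validated AUC for the three classifier families named in the abstract.
for name, clf in [("FLDA", LinearDiscriminantAnalysis()),
                  ("LogR", LogisticRegression(max_iter=1000)),
                  ("SVM", SVC(probability=True))]:
    auc = cross_val_score(clf, X, y, cv=5, scoring="roc_auc").mean()
    print(f"{name}: mean AUC = {auc:.3f}")
```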
Li, Xiaohui; Yu, Jianhua; Gong, Yuekun; Ren, Kaijing; Liu, Jun
2015-04-21
To assess the early postoperative clinical and radiographic outcomes after navigation-assisted versus standard instrumentation total knee arthroplasty (TKA). From August 2007 to May 2008, 60 KSS-A type patients underwent 67 primary TKA operations by the same surgical team. Twenty-two operations were performed with an image-free navigation system (mean patient age 64.5 years), while the remaining 45 used conventional manual procedures (mean patient age 66 years). Their preoperative demographic and functional data had no statistical differences (P>0.05). Operative duration, blood loss volume, and hospitalization days were compared between the two groups. Radiographic data included the coronal femoral component angle, coronal tibial component angle, sagittal femoral component angle, sagittal tibial component angle, and coronal tibiofemoral angle at one month. Functional assessment scores were evaluated at 1, 3, and 6 months postoperatively. Operative duration was significantly longer for computer navigation (P<0.05). The average blood loss volume was 555.26 ml in the computer navigation group and 647.56 ml in the conventional manual method group (P<0.05). Hospitalization stay was shorter in the computer navigation group than in the conventional method group (7.74 vs 8.68 days, P=0.04). Alignment deviation was better in the computer-assisted group than in the conventional manual method group (P<0.05). The percentage of patients with a coronal tibiofemoral angle within ±3° of the ideal value was 95.45% for the computer-assisted mini-invasive TKA group and 80% for the conventional TKA group (P=0.003). The Knee Society Clinical Rating Score was higher in the computer-assisted group than in the conventional manual method group at 1 and 3 months post-operation. However, no statistical inter-group difference existed at 6 months post-operation. Navigation allows a surgeon to precisely implant the components for TKA, and it offers faster functional recovery and a shorter hospitalization stay. At 6 months post-operation, there is no statistical inter-group difference in KSS scores.
Przylipiak, Andrzej Feliks; Galicka, Elżbieta; Donejko, Magdalena; Niczyporuk, Marek; Przylipiak, Jerzy
2013-01-01
Background: Liposuction is a type of aesthetic surgery that has been performed on humans for decades. There is little literature addressing pre- and post-surgery blood parameters, although this information is of considerable interest. Documentation on patients who received laser-assisted liposuction treatment is particularly scarce. Until now, there has been no literature reporting values of platelets, lymphocytes, and neutrophils after liposuction. Purpose: The aim of this work is to analyze and interpret values of platelets, lymphocytes, and neutrophils in patient blood before and after liposuction, a surgery in which extraordinarily large amounts of potent drugs are used, and to compare the changes in values between conventional and laser-assisted liposuction patients. Material and methods: We evaluated standard blood samples from patients prior to and after liposuction, covering the numbers of platelets, lymphocytes, and neutrophils. A total of 54 patients were examined. We also compared the change in postoperative values in laser-assisted liposuction patients with that in conventional liposuction patients. A paired two-sided Student’s t-test was used for statistical evaluation, with P < 0.005 taken as statistically significant. Results: Platelet values were raised both in conventional and in laser-assisted liposuction patients, but the difference was statistically non-significant and platelet levels remained within the normal range for healthy patients. Values of neutrophils rose by up to 79.49% ± 7.74% standard deviation (SD) and values of lymphocytes dropped by up to 12.68% ± 5.61% SD. The before/after changes in conventional tumescent local anesthesia liposuction and in laser-assisted liposuction were similar for all measured parameters and showed no statistically significant differences. The mean total operation time without laser assistance was 3 hours 42 minutes (±57 minutes SD, range 2 hours 50 minutes to 5 hours 10 minutes). Surgeries with laser assistance were on average 16 minutes shorter, with a mean duration of 3 hours 26 minutes (±45 minutes SD, range 2 hours 40 minutes to 4 hours 10 minutes); the difference was not statistically significant (P = 0.06). The mean aspirate volume for liposuctions performed without laser support was 2,618 mL (±633.7 SD, range 700 mL to 3,500 mL). Mean aspirate volume for liposuctions with laser assistance was greater by 61 mL (2,677 mL ± 499.5 SD, range 1,800 mL to 3,500 mL); the difference was not statistically significant (P = 0.71). Conclusion: We conclude that conventional liposuction and laser-assisted liposuction have a similar influence on platelets, lymphocytes, and neutrophils. Moreover, laser-assisted liposuction appears to be less time consuming than conventional liposuction. PMID:24143076
Gould, A Lawrence
2016-12-30
Conventional practice monitors accumulating information about drug safety in terms of the numbers of adverse events reported from trials in a drug development program. Estimates of between-treatment adverse event risk differences can be obtained readily from unblinded trials with adjustment for differences among trials using conventional statistical methods. Recent regulatory guidelines require monitoring the cumulative frequency of adverse event reports to identify possible between-treatment adverse event risk differences without unblinding ongoing trials. Conventional statistical methods for assessing between-treatment adverse event risks cannot be applied when the trials are blinded. However, CUSUM charts can be used to monitor the accumulation of adverse event occurrences. CUSUM charts for monitoring adverse event occurrence in a Bayesian paradigm are based on assumptions about the process generating the adverse event counts in a trial as expressed by informative prior distributions. This article describes the construction of control charts for monitoring adverse event occurrence based on statistical models for the processes, characterizes their statistical properties, and describes how to construct useful prior distributions. Application of the approach to two adverse events of interest in a real trial gave nearly identical results for binomial and Poisson observed event count likelihoods. Copyright © 2016 John Wiley & Sons, Ltd.
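A minimal illustration of the charting idea, not the paper's Bayesian construction: a one-sided CUSUM for pooled (blinded) adverse event counts, with the reference value k and decision limit h treated as given. In the paper these would follow from the prior distributions; here they, and the counts, are invented:

```python
# One-sided CUSUM for Poisson-like adverse event counts per monitoring interval.
def poisson_cusum(counts, k=2.0, h=5.0):
    """Signal when the cumulative excess of counts over k exceeds h."""
    s = 0.0
    for t, x in enumerate(counts):
        s = max(0.0, s + x - k)   # accumulate only excess over the reference value
        if s > h:
            return t, s           # first interval at which the chart signals
    return None, s                # no signal over the monitoring horizon

counts = [1, 2, 1, 3, 2, 4, 5, 4, 6]  # illustrative blinded pooled AE counts
print(poisson_cusum(counts))
```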
Goedhart, Paul W; van der Voet, Hilko; Baldacchino, Ferdinando; Arpaia, Salvatore
2014-04-01
Genetic modification of plants may result in unintended effects causing potentially adverse effects on the environment. A comparative safety assessment is therefore required by authorities, such as the European Food Safety Authority, in which the genetically modified plant is compared with its conventional counterpart. Part of the environmental risk assessment is a comparative field experiment in which the effect on non-target organisms is compared. Statistical analysis of such trials comes in two flavors: difference testing and equivalence testing. It is important to know the statistical properties of these, for example the power to detect environmental change of a given magnitude, before the start of an experiment. Such prospective power analysis can best be studied by means of a statistical simulation model. This paper describes a general framework for simulating data typically encountered in environmental risk assessment of genetically modified plants. The simulation model, available as Supplementary Material, can be used to generate count data having different statistical distributions, possibly with excess zeros. In addition, the model employs completely randomized or randomized block experiments, can be used to simulate single or multiple trials across environments, enables genotype-by-environment interaction by adding random variety effects, and includes repeated measures in time following a constant, linear, or quadratic pattern, possibly with some form of autocorrelation. The model also allows a set of reference varieties to be added to the GM plant and its comparator to assess the natural variation, which can then be used to set limits of concern for equivalence testing. The different count distributions are described in some detail, and some examples of how to use the simulation model to study various aspects, including a prospective power analysis, are provided.
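A toy version of such a simulation-based power analysis, assuming zero-inflated negative binomial counts and a rank-based difference test as a stand-in for the framework's full difference/equivalence machinery; all parameter values are illustrative:

```python
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(42)

def zinb(n, mu, disp, p_zero):
    """Zero-inflated negative binomial counts via a gamma-Poisson mixture."""
    lam = rng.gamma(shape=disp, scale=mu / disp, size=n)
    y = rng.poisson(lam)
    y[rng.random(n) < p_zero] = 0
    return y

def power_sim(effect=1.5, n_plots=8, n_sims=2000, alpha=0.05):
    """Fraction of simulated trials in which the difference test rejects."""
    hits = 0
    for _ in range(n_sims):
        gm = zinb(n_plots, mu=10.0 * effect, disp=2.0, p_zero=0.2)
        comparator = zinb(n_plots, mu=10.0, disp=2.0, p_zero=0.2)
        if mannwhitneyu(gm, comparator).pvalue < alpha:
            hits += 1
    return hits / n_sims

print(power_sim())   # prospective power at a 1.5-fold abundance difference
```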
Lee, Jung-Ju; Lee, Sang Kun; Choi, Jang Wuk; Kim, Dong-Wook; Park, Kyung Il; Kim, Bom Sahn; Kang, Hyejin; Lee, Dong Soo; Lee, Seo-Young; Kim, Sung Hun; Chung, Chun Kee; Nam, Hyeon Woo; Kim, Kwang Ki
2009-12-01
Ictal single-photon emission computed tomography (SPECT) is a valuable method for localizing the ictal onset zone in the presurgical evaluation of patients with intractable epilepsy. Conventional methods used to localize the ictal onset zone suffer from the time lag between seizure onset and tracer injection. To evaluate the clinical usefulness of a method that we developed, which involves an attachable automated injector (AAI), in reducing this time lag and improving the ability to localize the zone of seizure onset. Patients admitted to the epilepsy monitoring unit (EMU) between January 1, 2003, and June 30, 2008, were included. The ictal onset zone was defined by comprehensive review of medical records, magnetic resonance imaging (MRI), data from video electroencephalography (EEG) monitoring, and invasive EEG monitoring if available. We evaluated the time lag to injection and the image patterns of ictal SPECT using traditional visual analysis, statistical parametric mapping-assisted analysis, and subtraction ictal SPECT coregistered to MRI-assisted analysis. Image patterns were classified as localizing, lateralizing, or nonlateralizing. In total, 99 patients were included: 48 in the conventional group and 51 in the AAI group. The mean (SD) delay from seizure onset to injection was 12.4 ± 12.0 s in the AAI group and 40.4 ± 26.3 s in the conventional group (P=0.000). The mean delay from seizure detection to injection was 3.2 ± 2.5 s in the AAI group and 21.4 ± 9.7 s in the conventional group (P=0.000). The AAI method was superior to the conventional method in localizing the area of seizure onset (36 of 51 with the AAI method vs. 21 of 48 with the conventional method, P=0.009), especially in non-temporal lobe epilepsy (non-TLE) patients (17 of 27 vs. 3 of 13, P=0.041), and in lateralizing the seizure onset hemisphere (47 of 51 vs. 33 of 48, P=0.004). The AAI method was superior to the conventional method in reducing the time lag of tracer injection and in localizing and lateralizing the ictal onset zone, especially in patients with non-TLE.
Statistical modeling of yield and variance instability in conventional and organic cropping systems
USDA-ARS's Scientific Manuscript database
Cropping systems research was undertaken to address declining crop diversity and verify competitiveness of alternatives to the predominant conventional cropping system in the northern Corn Belt. To understand and capitalize on temporal yield variability within corn and soybean fields, we quantified ...
A study of the utilization of ERTS-1 data from the Wabash River Basin
NASA Technical Reports Server (NTRS)
Landgrebe, D. A. (Principal Investigator)
1974-01-01
The author has identified the following significant results. The identification and area estimation of crops experiment tested the usefulness of ERTS data for crop survey and produced results indicating that crop statistics could be obtained from ERTS imagery. Soil association mapping results showed that strong relationships exist between ERTS data derived maps and conventional soil maps. Urban land use analysis experiment results indicate potential for accurate gross land use mapping. Water resources mapping demonstrated the feasibility of mapping water bodies using ERTS imagery.
Erovic, Boban M; Chan, Harley H L; Daly, Michael J; Pothier, David D; Yu, Eugene; Coulson, Chris; Lai, Philip; Irish, Jonathan C
2014-01-01
Conventional computed tomography (CT) is the standard imaging technique for temporal bone diseases, whereas cone-beam CT (CBCT) is a very fast imaging tool with a significantly lower radiation dose than conventional CT. We hypothesized that a system for intraoperative cone-beam CT provides image quality comparable to diagnostic CT for identifying temporal bone anatomical landmarks in cadaveric specimens. Cross-sectional study. University tertiary care facility. Twenty cadaveric temporal bones were affixed into a head phantom and scanned with both a prototype cone-beam CT C-arm and multislice helical CT. Imaging performance was evaluated by 3 otologic surgeons and 1 head and neck radiologist. Participants were presented images in a randomized order and completed landmark identification questionnaires covering 21 structures. CBCT and multislice CT had comparable performance in identifying temporal bone structures. The three otologic surgeons rated CBCT as statistically equivalent for 19 of 21 landmarks, with CBCT superior to CT for the chorda tympani and inferior for the crura of the stapes. Subgroup analysis showed that CBCT performed better than CT for temporal bone structures. The radiologist rated CBCT and CT as statistically equivalent for 18 of 21 landmarks, with CT superior to CBCT for the crura of the stapes, chorda tympani, and sigmoid sinus. CBCT provides image quality comparable to conventional CT for temporal bone anatomical sites in cadaveric specimens. Clinical applications of low-dose CBCT imaging in surgical planning, intraoperative guidance, and postoperative assessment are promising but require further investigation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Foley, Brian T; Korber, Bette T
2008-01-01
Simian immunodeficiency virus infection of macaques may result in neuroAIDS, a feature more commonly observed in macaques with rapid progressive disease than in those with conventional disease. This is the first report of two conventional progressors (H631 and H636) with encephalitis among rhesus macaques inoculated with a derivative of SIVsmE543-3. Phylogenetic analyses of viruses isolated from the cerebrospinal fluid (CSF) and plasma of both animals demonstrated tissue compartmentalization. Additionally, virus from the central nervous system (CNS) was able to infect primary macaque monocyte-derived macrophages more efficiently than virus from plasma. Conversely, virus isolated from plasma replicated better in peripheral blood mononuclear cells than virus from the CNS. We speculate that these viruses were under different selective pressures in their separate compartments. Furthermore, these viruses appear to have undergone adaptive evolution to preferentially replicate in their respective cell targets. Analysis of the number of potential N-linked glycosylation sites (PNGS) in gp160 showed a statistically significant loss of PNGS in viruses isolated from the CNS of both macaques compared with SIVsmE543-3. Moreover, virus isolated from the brain of H631 had a statistically significant loss of PNGS compared with virus isolated from the CSF and plasma of the same animal. It is possible that the brain isolate adapted to decrease the number of PNGS, given that humoral immune selection pressure is less likely to be encountered in the brain. These viruses provide a relevant model to study the adaptations required for SIV to induce encephalitis.
Boonsiriseth, K; Sirintawat, N; Arunakul, K; Wongsirichat, N
2013-07-01
This study aimed to evaluate the efficacy of anesthesia obtained with a novel injection approach for inferior alveolar nerve block compared with the conventional injection approach. 40 patients in good health randomly received each of the two injection approaches of local anesthetic, one on each side of the mandible, at two separate appointments. A sharp probe and an electric pulp tester were used to test anesthesia before injection, after injection when the patients' sensation changed, and 5 min after injection. Positive aspiration and intravascular injection occurred in 5% of cases, and neurovascular bundle injection in 7.5%, with the conventional inferior alveolar nerve block; neither occurred with the novel injection approach. A visual analog scale (VAS) pain assessment was used during injection and surgery. The significance level used in the statistical analysis was p<0.05. Comparing the novel injection approach with the conventional one, no significant difference was found in subjective onset, objective onset, operation time, duration of anesthesia, or VAS pain score during the operation, but the VAS pain score during injection differed significantly. The novel injection approach for inferior alveolar nerve block provided adequate anesthesia and caused less pain and greater safety during injection. Copyright © 2012 International Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.
Vina, Andres; Peters, Albert J.; Ji, Lei
2003-01-01
There is a global concern about the increase in atmospheric concentrations of greenhouse gases. One method being discussed to encourage greenhouse gas mitigation efforts is based on a trading system whereby carbon emitters can buy effective mitigation efforts from farmers implementing conservation tillage practices. These practices sequester carbon from the atmosphere, and such a trading system would require a low-cost and accurate method of verification. Remote sensing technology can offer such a verification technique. This paper is focused on the use of standard image processing procedures applied to a multispectral Ikonos image, to determine whether it is possible to validate that farmers have complied with agreements to implement conservation tillage practices. A principal component analysis (PCA) was performed in order to isolate image variance in cropped fields. Analyses of variance (ANOVA) statistical procedures were used to evaluate the capability of each Ikonos band and each principal component to discriminate between conventional and conservation tillage practices. A logistic regression model was implemented on the principal component most effective in discriminating between conventional and conservation tillage, in order to produce a map of the probability of conventional tillage. The Ikonos imagery, in combination with ground-reference information, proved to be a useful tool for verification of conservation tillage practices.
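The PCA-plus-logistic-regression pipeline can be sketched as below with scikit-learn; the four-band pixel values are synthetic, and the choice of the second principal component as the most discriminating one is assumed purely for illustration:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
y = rng.integers(0, 2, 200)        # 1 = conventional tillage, 0 = conservation
X = rng.normal(size=(200, 4))      # synthetic 4-band Ikonos spectra (B, G, R, NIR)
X[:, 3] += 0.8 * y                 # invented: residue cover shifts NIR reflectance

pcs = PCA(n_components=4).fit_transform(X)
pc_best = pcs[:, [1]]              # suppose ANOVA singled out PC2 (illustrative)
model = LogisticRegression().fit(pc_best, y)
p_conv = model.predict_proba(pc_best)[:, 1]   # probability-of-conventional-tillage map values
print(p_conv[:5].round(2))
```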
2015-01-01
Attenuation of the pesticide fipronil and its major degradates was determined during conventional wastewater treatment and wetland treatment. Analysis of flow-weighted composite samples by liquid and gas chromatography–tandem mass spectrometry showed fipronil occurrence at 12–31 ng/L in raw sewage, primary effluent, secondary effluent, chlorinated effluent, and wetland effluent. Mean daily loads of total fipronil related compounds in raw sewage and in plant effluent after chlorination were statistically indistinguishable (p = 0.29; n = 10), whereas fipronil itself was partially removed (25 ± 3%; p = 0.00025; n = 10); the associated loss in toxicity was balanced by the formation of toxic fipronil degradates, showing conventional treatment to be unfit for reducing overall toxicity. In contrast to these findings at the municipal wastewater treatment, both parental fipronil and the sum of fipronil-related compounds were removed in the wetland with efficiencies of 44 ± 4% and 47 ± 13%, respectively. Total fipronil concentrations in plant effluent (28 ± 6 ng/L as fipronil) were within an order of magnitude of half-maximal effective concentrations (EC50) of nontarget invertebrates. This is the first systematic assessment of the fate of fipronil and its major degradates during full-scale conventional wastewater and constructed wetland treatment. PMID:26710933
NASA Astrophysics Data System (ADS)
Sudhakar, Beeravelli; Krishna, Mylangam Chaitanya; Murthy, Kolapalli Venkata Ramana
2016-01-01
The aim of the present study was to formulate and evaluate ritonavir-loaded stealth liposomes using a 3² factorial design, intended for parenteral delivery. Liposomes were prepared by the ethanol injection method using the 3² factorial design and characterized for various physicochemical parameters such as drug content, size, zeta potential, entrapment efficiency, and in vitro drug release. The optimization process was carried out using desirability and overlay plots. The selected formulation was subjected to PEGylation using 10% PEG-10000 solution. Stealth liposomes were characterized for the above-mentioned parameters along with surface morphology, Fourier transform infrared spectroscopy, differential scanning calorimetry, stability, and in vivo pharmacokinetic studies in rats. Stealth liposomes showed better results than conventional liposomes owing to the effect of PEG-10000. The in vivo studies revealed that stealth liposomes showed a longer residence time than conventional liposomes and pure drug solution. The conventional liposomes and pure drug showed dose-dependent pharmacokinetics, whereas stealth liposomes showed a long circulation half-life compared with conventional liposomes and pure ritonavir solution. Statistical analysis by one-way ANOVA showed a significant difference (p < 0.05). The results of the present study revealed that stealth liposomes are a promising tool in antiretroviral therapy.
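For reference, a 3² full factorial design simply enumerates all nine combinations of two factors at three coded levels; the factor names below are hypothetical, since the abstract does not name them:

```python
from itertools import product

# Hypothetical factors at coded levels -1/0/+1 for a 3^2 full factorial design.
levels = {"lipid_ratio": [-1, 0, 1], "surfactant_pct": [-1, 0, 1]}
design = list(product(*levels.values()))   # 9 runs in total
for run, (x1, x2) in enumerate(design, 1):
    print(f"run {run}: lipid_ratio={x1:+d}, surfactant_pct={x2:+d}")
```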
Hasan, Shadi W; Elektorowicz, Maria; Oleszkiewicz, Jan A
2012-09-01
The influence of sludge properties in SMEBR and conventional MBR pilot systems on membrane fouling was investigated. Generated data were analyzed using Pearson's product-moment correlation coefficient (r(p)). Analysis showed that TMP had strong direct (r(p)=0.9182) and inverse (r(p)=-0.9205) correlations to mean particle size diameter in MBR and SMEBR, respectively. TMP in SMEBR had a strong direct correlation to the sludge mixed liquor suspended solids concentration (MLSS) (r(p)=0.7757) while a weak direct correlation (r(p)=0.1940) was observed in MBR. SMEBR showed a moderate inverse correlation (r(p)=-0.6118) between TMP and soluble carbohydrates (EPS(c)) and a very weak direct correlation (r(p)=0.3448) to soluble proteins (EPS(p)). Conversely, EPS(p) in MBR had more significant impact (r(p)=0.4856) on membrane fouling than EPS(c) (r(p)=0.3051). The results provide insight into optimization of operational conditions in the SMEBR system to overcome membrane fouling. Copyright © 2012 Elsevier Ltd. All rights reserved.
Data-adaptive test statistics for microarray data.
Mukherjee, Sach; Roberts, Stephen J; van der Laan, Mark J
2005-09-01
An important task in microarray data analysis is the selection of genes that are differentially expressed between different tissue samples, such as healthy and diseased. However, microarray data contain an enormous number of dimensions (genes) and very few samples (arrays), a mismatch which poses fundamental statistical problems for the selection process that have defied easy resolution. In this paper, we present a novel approach to the selection of differentially expressed genes in which test statistics are learned from data using a simple notion of reproducibility in selection results as the learning criterion. Reproducibility, as we define it, can be computed without any knowledge of the 'ground-truth', but takes advantage of certain properties of microarray data to provide an asymptotically valid guide to expected loss under the true data-generating distribution. We are therefore able to indirectly minimize expected loss, and obtain results substantially more robust than conventional methods. We apply our method to simulated and oligonucleotide array data. By request to the corresponding author.
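The reproducibility criterion can be illustrated with a toy split-half scheme: candidate statistics are scored by the expected overlap of their top-k gene lists across random halves of the arrays. A rough sketch under those assumptions, not the authors' estimator; all data are simulated:

```python
import numpy as np

def t_like(X, y):
    """Absolute Welch-type statistic per gene; X is genes x samples."""
    a, b = X[:, y == 0], X[:, y == 1]
    se = np.sqrt(a.var(axis=1) / a.shape[1] + b.var(axis=1) / b.shape[1] + 1e-9)
    return np.abs(a.mean(axis=1) - b.mean(axis=1)) / se

def fold_change(X, y):
    return np.abs(X[:, y == 0].mean(axis=1) - X[:, y == 1].mean(axis=1))

def reproducibility(stat, X, y, k=50, n_splits=100, seed=0):
    """Mean overlap of top-k gene lists across random split halves."""
    rng, n, scores = np.random.default_rng(seed), len(y), []
    for _ in range(n_splits):
        idx = rng.permutation(n)
        h1, h2 = idx[: n // 2], idx[n // 2:]
        top1 = set(np.argsort(stat(X[:, h1], y[h1]))[-k:])
        top2 = set(np.argsort(stat(X[:, h2], y[h2]))[-k:])
        scores.append(len(top1 & top2) / k)
    return float(np.mean(scores))

rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 40))        # 1000 genes, 40 arrays
y = np.repeat([0, 1], 20)
X[:25, y == 1] += 1.0                  # 25 truly differential genes
print(reproducibility(t_like, X, y), reproducibility(fold_change, X, y))
```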
NASA Astrophysics Data System (ADS)
Yi, Yong; Chen, Zhengying; Wang, Liming
2018-05-01
Corona-originated discharge on DC transmission lines is the main cause of the radiated electromagnetic interference (EMI) field in the vicinity of the lines. A joint time-frequency analysis technique was proposed to extract the radiated EMI current (excitation current) of DC corona from statistical measurements of corona current. A reduced-scale experimental platform was set up to measure the statistical distributions of current waveform parameters of an aluminum conductor steel-reinforced (ACSR) conductor. Based on the measured results, the peak, root-mean-square, and average values of the 0.5 MHz radiated EMI current at 9 kHz and 200 Hz bandwidths were calculated by the proposed technique and validated against the conventional excitation function method. Radio interference (RI) was calculated from the radiated EMI current, and a wire-to-plate platform was built to check the validity of the RI computation results. The causes of the deviation between the computations and measurements were analyzed in detail.
NASA Astrophysics Data System (ADS)
Mandelis, Andreas; Zhang, Yu; Melnikov, Alexander
2012-09-01
A solar cell lock-in carrierographic image generation theory based on the concept of non-equilibrium radiation chemical potential was developed. An optoelectronic diode expression was derived linking the emitted radiative recombination photon flux (current density), the solar conversion efficiency, and the external load resistance via the closed- and/or open-circuit photovoltage. The expression was shown to have a structure similar to the conventional electrical photovoltaic I-V equation, thereby allowing the carrierographic image to be used in a quantitative statistical pixel-brightness-distribution analysis whose outcome is the non-contacting measurement of mean values of these important parameters averaged over the entire illuminated solar cell surface. This is the optoelectronic equivalent of the electrical (contacting) measurement method using an external resistor circuit and the outputs of the solar cell electrode grid, the latter acting as an averaging distribution network over the surface. The statistical theory was confirmed using multi-crystalline Si solar cells.
Erfanian, Parham; Tenzif, Siamak; Guerriero, Rocco C
2004-01-01
Objective: To determine the effects of a semi-customized experimental cervical pillow on symptomatic adults with chronic neck pain (with and without headache) during a four-week study. Design: A randomized controlled trial. Sample size: Thirty-six adults were recruited for the trial and randomly assigned to experimental and non-experimental groups of 17 and 19 participants, respectively. Subjects: Adults with chronic biomechanical neck pain who were recruited from the Canadian Memorial Chiropractic College (CMCC) Walk-in Clinic. Outcome measures: Subjective findings were assessed using a mail-in self-report daily pain diary and the CMCC Neck Disability Index (NDI). Statistical analysis: Using repeated-measures analysis of variance, weekly NDI scores and average weekly AM and PM pain scores were compared between the experimental and non-experimental groups throughout the study. Results: The experimental group had statistically significantly lower NDI scores (p < 0.05) than the non-experimental group. The average weekly AM scores were also statistically significantly lower (p < 0.05) in the experimental group. The PM scores in the experimental group were lower than in the other group, but not statistically significantly so. Conclusions: The study results show that, compared to conventional pillows, this experimental semi-customized cervical pillow was effective in reducing low-level neck pain intensity, especially in the morning following its use, over the 4-week study. PMID:17549216
Cox, Tony; Popken, Douglas; Ricci, Paolo F
2013-01-01
Exposures to fine particulate matter (PM2.5) in air (C) have been suspected of contributing causally to increased acute (e.g., same-day or next-day) human mortality rates (R). We tested this causal hypothesis in 100 United States cities using the publicly available NMMAPS database. Although a significant, approximately linear, statistical C-R association exists in simple statistical models, closer analysis suggests that it is not causal. Surprisingly, conditioning on other variables that have been extensively considered in previous analyses (usually using splines or other smoothers to approximate their effects), such as month of the year and mean daily temperature, suggests that they create strong, nonlinear confounding that explains the statistical association between PM2.5 and mortality rates in this data set. As this finding disagrees with conventional wisdom, we apply several different techniques to examine it. Conditional independence tests for potential causation, non-parametric classification tree analysis, Bayesian Model Averaging (BMA), and Granger-Sims causality testing, show no evidence that PM2.5 concentrations have any causal impact on increasing mortality rates. This apparent absence of a causal C-R relation, despite their statistical association, has potentially important implications for managing and communicating the uncertain health risks associated with, but not necessarily caused by, PM2.5 exposures. PMID:23983662
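Of the techniques listed, Granger-Sims causality testing is the most mechanical to reproduce; a hedged sketch with statsmodels on synthetic, causally unrelated series (so the test should not reject) follows. Series lengths, distributions, and lag choice are illustrative:

```python
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(0)
n = 365
pm25 = rng.gamma(4.0, 3.0, n)                                  # synthetic daily PM2.5
day = np.arange(n)
deaths = rng.poisson(30 + 2 * np.sin(2 * np.pi * day / 365))   # seasonal mortality counts

# Column order matters: the test asks whether the SECOND column (pm25)
# helps predict the FIRST (deaths) beyond the latter's own history.
res = grangercausalitytests(np.column_stack([deaths, pm25]), maxlag=2)
print(res[1][0]["ssr_ftest"])   # (F statistic, p-value, df_denom, df_num) at lag 1
```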
The Importance of Proving the Null
Gallistel, C. R.
2010-01-01
Null hypotheses are simple, precise, and theoretically important. Conventional statistical analysis cannot support them; Bayesian analysis can. The challenge in a Bayesian analysis is to formulate a suitably vague alternative, because the vaguer the alternative is (the more it spreads out the unit mass of prior probability), the more the null is favored. A general solution is a sensitivity analysis: Compute the odds for or against the null as a function of the limit(s) on the vagueness of the alternative. If the odds on the null approach 1 from above as the hypothesized maximum size of the possible effect approaches 0, then the data favor the null over any vaguer alternative to it. The simple computations and the intuitive graphic representation of the analysis are illustrated by the analysis of diverse examples from the current literature. They pose 3 common experimental questions: (a) Are 2 means the same? (b) Is performance at chance? (c) Are factors additive? PMID:19348549
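The sensitivity analysis can be reproduced in a few lines for question (b), performance at chance: compare the binomial likelihood under the null (p = .5) with its average under alternatives of increasing vagueness. The trial counts below are invented for illustration:

```python
import numpy as np
from scipy.stats import binom
from scipy.integrate import quad

k, n = 54, 100  # e.g., 54 correct out of 100 trials; chance performance = 0.5

def odds_null(max_effect):
    """Odds for the null (p = .5) vs p ~ Uniform(.5, .5 + max_effect)."""
    like_null = binom.pmf(k, n, 0.5)
    like_alt, _ = quad(lambda p: binom.pmf(k, n, p) / max_effect,
                       0.5, 0.5 + max_effect)
    return like_null / like_alt

# As the alternative grows vaguer, the odds increasingly favor the null.
for d in [0.02, 0.05, 0.1, 0.2, 0.4]:
    print(f"max effect {d:4.2f}: odds for the null = {odds_null(d):.2f}")
```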
Zhao, Bo; Yang, Lanju; Xiao, Lei; Sun, Baoquan; Zou, Xianbao; Gao, Dongmei; Jian, Xiandong
2016-01-01
To observe the effect of sodium bicarbonate combined with ulinastatin on cholinesterase activity in patients with acute phoxim pesticide poisoning. A total of 67 eligible patients with acute phoxim pesticide poisoning, who were admitted to the emergency department of the hospital from March 2011 to February 2014, were randomly divided according to treatment into the conventional treatment group (n=34) and the sodium bicarbonate+ulinastatin group (n=35). The conventional treatment group was given thorough gastric lavage with water; the sodium bicarbonate+ulinastatin group was given gastric lavage with 2% sodium bicarbonate solution. Both groups were given treatments such as catharsis, administration of oxygen, fluid infusion, diuresis, and antidotes such as atropine and pralidoxime methylchloride. On the basis of this comprehensive treatment, patients in the sodium bicarbonate+ulinastatin group were given 5% sodium bicarbonate injection and ulinastatin. The clinical effects in the two groups were compared. Serum cholinesterase activity in the sodium bicarbonate+ulinastatin group was significantly higher than in the conventional treatment group from the 5th day onward (P<0.05). The total atropine dosage, total pralidoxime methylchloride dosage, and hospitalization days were also better than in the conventional treatment group (P<0.05). The difference in the time to atropinization between the two groups was not statistically significant (P>0.05). Arterial blood pH and HCO3- in the sodium bicarbonate+ulinastatin group were higher than in the conventional treatment group, and the difference in HCO3- on the 10th day was statistically significant (P<0.05). Sodium bicarbonate combined with ulinastatin can improve the therapeutic effect and reduce complications in the treatment of acute phoxim pesticide poisoning, and has beneficial effects on the recovery of cholinesterase activity.
An absolute chronology for early Egypt using radiocarbon dating and Bayesian statistical modelling
Dee, Michael; Wengrow, David; Shortland, Andrew; Stevenson, Alice; Brock, Fiona; Girdland Flink, Linus; Bronk Ramsey, Christopher
2013-01-01
The Egyptian state was formed prior to the existence of verifiable historical records. Conventional dates for its formation are based on the relative ordering of artefacts. This approach is no longer considered sufficient for cogent historical analysis. Here, we produce an absolute chronology for Early Egypt by combining radiocarbon and archaeological evidence within a Bayesian paradigm. Our data cover the full trajectory of Egyptian state formation and indicate that the process occurred more rapidly than previously thought. We provide a timeline for the First Dynasty of Egypt of generational-scale resolution that concurs with prevailing archaeological analysis and produce a chronometric date for the foundation of Egypt that distinguishes between historical estimates. PMID:24204188
Impact of satellite-based data on FGGE general circulation statistics
NASA Technical Reports Server (NTRS)
Salstein, David A.; Rosen, Richard D.; Baker, Wayman E.; Kalnay, Eugenia
1987-01-01
The NASA Goddard Laboratory for Atmospheres (GLA) analysis/forecast system was run in two different parallel modes in order to evaluate the influence that data from satellites and other FGGE observation platforms can have on analyses of the large-scale circulation; in the first mode, data from all observation systems were used, while in the second only conventional upper air and surface reports were used. The GLA model was also integrated for the same period without insertion of any data, and an independent objective analysis based only on rawinsonde and pilot balloon data was also performed. A small decrease in the vigor of the general circulation is noted to follow from the inclusion of satellite observations.
Lash, Timothy L
2007-11-26
The associations of pesticide exposure with disease outcomes are estimated without the benefit of a randomized design. For this reason and others, these studies are susceptible to systematic errors. I analyzed studies of the associations between alachlor and glyphosate exposure and cancer incidence, both derived from the Agricultural Health Study cohort, to quantify the bias and uncertainty potentially attributable to systematic error. For each study, I identified the prominent result and important sources of systematic error that might affect it. I assigned probability distributions to the bias parameters that allow quantification of the bias, drew a value at random from each assigned distribution, and calculated the estimate of effect adjusted for the biases. By repeating the draw and adjustment process over multiple iterations, I generated a frequency distribution of adjusted results, from which I obtained a point estimate and simulation interval. These methods were applied without access to the primary record-level dataset. The conventional estimates of effect associating alachlor and glyphosate exposure with cancer incidence were likely biased away from the null and understated the uncertainty by quantifying only random error. For example, the conventional p-value for a test of trend in the alachlor study equaled 0.02, whereas fewer than 20% of the bias analysis iterations yielded a p-value of 0.02 or lower. Similarly, the conventional fully-adjusted result associating glyphosate exposure with multiple myeloma equaled 2.6 with 95% confidence interval of 0.7 to 9.4. The frequency distribution generated by the bias analysis yielded a median hazard ratio equal to 1.5 with 95% simulation interval of 0.4 to 8.9, which was 66% wider than the conventional interval. Bias analysis provides a more complete picture of true uncertainty than conventional frequentist statistical analysis accompanied by a qualitative description of study limitations. The latter approach is likely to lead to overconfidence regarding the potential for causal associations, whereas the former safeguards against such overinterpretations. Furthermore, such analyses, once programmed, allow rapid implementation of alternative assignments of probability distributions to the bias parameters, thereby elevating the discussion of study bias from characterizing studies as "valid" or "invalid" to a critical and quantitative discussion of sources of uncertainty.
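The simulation logic can be sketched as follows for the glyphosate/multiple myeloma result, using a single multiplicative bias term with an invented probability distribution; the paper's actual bias models and parameter assignments are richer than this, so the output will not match its numbers exactly:

```python
import numpy as np

rng = np.random.default_rng(0)
rr_conventional, se_log = 2.6, 0.65   # se_log reproduces the 0.7-9.4 95% CI
n_iter = 100_000

# Assumed bias model: a multiplicative bias on the RR scale, log-normal around
# a bias of 1.3 away from the null (distribution entirely illustrative).
bias = rng.lognormal(mean=np.log(1.3), sigma=0.25, size=n_iter)
random_error = rng.normal(0.0, se_log, size=n_iter)
rr_adjusted = np.exp(np.log(rr_conventional) - np.log(bias) + random_error)

lo, med, hi = np.percentile(rr_adjusted, [2.5, 50, 97.5])
print(f"median RR = {med:.2f}, 95% simulation interval {lo:.2f} to {hi:.2f}")
```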
Saez, M; Figueiras, A; Ballester, F; Perez-Hoyos, S; Ocana, R; Tobias, A
2001-01-01
STUDY OBJECTIVE: The objective of this paper is to introduce a different approach, called ecological-longitudinal, to carrying out pooled analysis in time-series ecological studies. Because it yields a larger number of data points and hence increases the statistical power of the analysis, this approach, unlike conventional ones, permits features such as random-effects models, lags, and interactions between pollutants and between pollutants and meteorological variables, which are hard to implement in conventional approaches. DESIGN: The approach is illustrated by providing quantitative estimates of the short-term effects of air pollution on mortality in three Spanish cities, Barcelona, Valencia and Vigo, for the period 1992-1994. Because the dependent variable was a count, a Poisson generalised linear model was first specified. Several modelling issues are worth mentioning. Firstly, because the relations between mortality and the explanatory variables were non-linear, cubic splines were used for covariate control, leading to a generalised additive model (GAM). Secondly, the effects of the predictors on the response were allowed to occur with some lag. Thirdly, residual autocorrelation, due to imperfect control, was controlled for by means of an autoregressive Poisson GAM. Finally, the longitudinal design demanded consideration of individual heterogeneity, requiring mixed models. MAIN RESULTS: The estimates of the relative risks obtained from the individual analyses varied across cities, particularly those associated with sulphur dioxide. The highest relative risks corresponded to black smoke in Valencia. These estimates were higher than those obtained from the ecological-longitudinal analysis. Relative risks estimated from the latter analysis were practically identical across cities: 1.00638 (95% confidence interval 1.0002, 1.0011) for a black smoke increase of 10 µg/m³ and 1.00415 (95% CI 1.0001, 1.0007) for an increase of 10 µg/m³ of sulphur dioxide. Because the statistical power is higher than in the individual analyses, more interactions were statistically significant, especially those among air pollutants and meteorological variables. CONCLUSIONS: Air pollutant levels were related to mortality in the three cities of the study, Barcelona, Valencia and Vigo. These results are consistent with similar studies in other cities and with other multicentric studies, and coherent with both previous individual-city analyses and multicentric studies covering all three cities. Keywords: air pollution; mortality; longitudinal studies PMID:11351001
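A drastically simplified stand-in for the model described, using a Poisson GLM with a spline term in place of the full autoregressive mixed GAM; all column names, parameter values, and simulated data are fabricated for the sketch:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 1095                                    # three years of fabricated daily data
data = pd.DataFrame({
    "temp": 15 + 10 * np.sin(2 * np.pi * np.arange(n) / 365) + rng.normal(0, 3, n),
    "smoke": rng.gamma(4.0, 10.0, n),       # black smoke, ug/m3 (invented)
})
data["smoke_lag1"] = data["smoke"].shift(1)
data["deaths"] = rng.poisson(np.exp(2.5 + 0.0006 * data["smoke"]
                                    + 0.01 * np.abs(data["temp"] - 18)))

# The spline in temperature stands in for the GAM's smooth covariate control;
# the lagged pollutant term lets effects occur with delay, as in the paper.
fit = smf.glm("deaths ~ bs(temp, df=4) + smoke + smoke_lag1",
              data=data.dropna(), family=sm.families.Poisson()).fit()
print(np.exp(10 * fit.params["smoke"]))     # RR per 10 ug/m3 same-day increase
```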
The Use of Interrupted Case Studies to Enhance Critical Thinking Skills in Biology
White, Tracy K.; Whitaker, Paul; Gonya, Terri; Hein, Richard; Kroening, Dubear; Lee, Kevin; Lee, Laura; Lukowiak, Andrea; Hayes, Elizabeth
2009-01-01
There has been a dramatic increase in the availability of case studies for use in the biology classroom, and perceptions of the effectiveness of case-study-based learning are overwhelmingly positive. Here we report the results of a study in which we evaluated the ability of interrupted case studies to improve critical thinking in the context of experimental design and the conventions of data interpretation. Students were assessed using further case studies designed to evaluate their ability to recognize and articulate problematic approaches to these elements of experimentation. Our work reveals that case studies have broad utility in the classroom. In addition to demonstrating a small but statistically significant increase in the number of students capable of critically evaluating selected aspects of experimental design, we also observed increased student engagement and documented widespread misconceptions regarding the conventions of data acquisition and analysis. PMID:23653687
Petigny, Loïc; Périno, Sandrine; Minuti, Matteo; Visinoni, Francesco; Wajsman, Joël; Chemat, Farid
2014-01-01
Microwave extraction and separation were used to increase the concentration of the extract relative to the conventional method at the same solid/liquid ratio, reducing extraction time while simultaneously separating Volatile Organic Compounds (VOC) from non-Volatile Organic Compounds (NVOC) of boldo leaves. As a preliminary study, a response surface method was used to optimize the extraction of soluble material and the separation of VOC from the plant at laboratory scale. The results of the statistical analysis revealed that the optimized conditions were: microwave power 200 W, extraction time 56 min and a solid/liquid ratio of 7.5% of plants in water. The lab-scale optimized microwave method was compared with conventional distillation and requires a power/mass ratio of 0.4 W/g of water engaged. This power/mass ratio was kept constant in order to upscale from laboratory to pilot plant. PMID:24776762
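A minimal sketch of the response-surface step, assuming synthetic design points: fit a second-order model for extract yield as a function of power, time and solid/liquid ratio, then search the fitted surface for its optimum within the design bounds. The design points and responses below are invented placeholders.

```python
# A minimal response-surface sketch: second-order fit, then optimization.
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from scipy.optimize import minimize

# (power W, time min, solid/liquid %) design points and yields (synthetic)
X = np.array([[100, 30, 5], [200, 30, 5], [100, 60, 5], [200, 60, 10],
              [150, 45, 7.5], [150, 45, 7.5], [200, 56, 7.5], [100, 45, 10]])
y = np.array([12.1, 14.8, 13.9, 15.2, 16.0, 15.8, 16.4, 13.5])

poly = PolynomialFeatures(degree=2, include_bias=False)
model = LinearRegression().fit(poly.fit_transform(X), y)

neg_yield = lambda x: -model.predict(poly.transform(x.reshape(1, -1)))[0]
res = minimize(neg_yield, x0=[150, 45, 7.5],
               bounds=[(100, 200), (30, 60), (5, 10)])
print("optimum (power W, time min, solid/liquid %):", res.x)
```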
Real-time estimation of horizontal gaze angle by saccade integration using in-ear electrooculography
Hládek, Ľuboš; Porr, Bernd; Brimijoin, W Owen
2018-01-01
The manuscript proposes and evaluates a real-time algorithm for estimating eye gaze angle based solely on single-channel electrooculography (EOG), which can be obtained directly from the ear canal using conductive ear moulds. In contrast to conventional high-pass filtering, we used an algorithm that calculates absolute eye gaze angle via statistical analysis of detected saccades. The estimated eye positions of the new algorithm were still noisy. However, the performance in terms of Pearson product-moment correlation coefficients was significantly better than the conventional approach in some instances. The results suggest that in-ear EOG signals captured with conductive ear moulds could serve as a basis for light-weight and portable horizontal eye gaze angle estimation suitable for a broad range of applications, for instance allowing hearing aids to steer the directivity of microphones in the direction of the user's eye gaze. PMID:29304120
Liñero, Olaia; Cidad, Maite; Carrero, Jose Antonio; Nguyen, Christophe; de Diego, Alberto
2015-11-04
A 5-month experiment was performed to study the accumulation of several inorganic elements in tomato plants cultivated using organic or synthetic fertilizer. Plants were harvested in triplicate at six sampling dates during their life cycle. Statistical and chemometric analysis of data indicated the sequestration of toxic elements and of Na, Zn, Fe, and Co in roots, while the rest of the elements, including Cd, were mainly translocated to aboveground organs. A general decreasing trend in element concentrations with time was observed for most of them. A negative correlation between some element concentrations and ripening stage of fruits was identified. Conventionally grown plants seemed to accumulate more Cd and Tl in their tissues, while organic ones were richer in some nutrients. However, there was no clear effect of the fertilizer used (organic vs synthetic) on the elemental composition of fruits.
De-Deus, Gustavo; Brandão, Maria Claudia; Barino, Bianca; Di Giorgi, Karina; Fidel, Rivail Antonio Sergio; Luna, Aderval Severino
2010-09-01
This study was designed to quantitatively evaluate the amount of dentin debris extruded from the apical foramen by comparing the conventional sequence of the ProTaper Universal nickel-titanium (NiTi) files with the single-file ProTaper F2 technique. Thirty mesial roots of lower molars were selected, and the use of different instrumentation techniques resulted in 3 groups (n=10 each). In G1, a crown-down hand-file technique was used, and in G2 the conventional ProTaper Universal technique was used. In G3, the ProTaper F2 file was used in a reciprocating motion. The apical finish preparation was equivalent to ISO size 25. An apparatus was used to evaluate the apically extruded debris. Statistical analysis was performed using 1-way analysis of variance and Tukey multiple comparisons. No significant difference was found in the amount of debris extruded between the conventional sequence of the ProTaper Universal NiTi files and the single-file ProTaper F2 technique (P>.05). In contrast, the hand instrumentation group extruded significantly more debris than both NiTi groups (P<.05). The present results yielded favorable input for the F2 single-file technique in terms of apically extruded debris, as it is the simplest and most cost-effective instrumentation approach. Copyright (c) 2010 Mosby, Inc. All rights reserved.
Franco, Érika Mendonça Fernandes; Valarelli, Fabrício Pinelli; Fernandes, João Batista; Cançado, Rodrigo Hermont; de Freitas, Karina Maria Salvatore
2015-01-01
Abstract Objective: The aim of this study was to compare torque expression in active and passive self-ligating and conventional brackets. Methods: A total of 300 segments of stainless steel wire 0.019 x 0.025-in and six different brands of brackets (Damon 3MX, Portia, In-Ovation R, Bioquick, Roth SLI and Roth Max) were used. Torque moments were measured at 12°, 24°, 36° and 48°, using a wire torsion device associated with a universal testing machine. The data obtained were compared by analysis of variance followed by Tukey test for multiple comparisons. Regression analysis was performed by the least-squares method to generate the mathematical equation of the optimal curve for each brand of bracket. Results: Statistically significant differences were observed in the expression of torque among all evaluated bracket brands in all evaluated torsions (p < 0.05). It was found that Bioquick presented the lowest torque expression in all tested torsions; in contrast, Damon 3MX bracket presented the highest torque expression up to 36° torsion. Conclusions: The connection system between wire/bracket (active, passive self-ligating or conventional with elastic ligature) seems not to interfere in the final torque expression, the latter being probably dependent on the interaction between the wire and the bracket chosen for orthodontic mechanics. PMID:26691972
Tennankore, Karthik K; Na, Yingbo; Wald, Ron; Chan, Christopher T; Perl, Jeffrey
2018-01-01
Home hemodialysis (HHD) has many benefits, but less is known about relative outcomes when comparing different home-based hemodialysis modalities. Here, we compare patient and treatment survival for patients receiving short daily HHD (2-3 hours, 5+ sessions per week), nocturnal HHD (6-8 hours, 5+ sessions per week) and conventional HHD (3-6 hours, 2-4 sessions per week). A nationally representative cohort of Canadian HHD patients from 1996-2012 was studied. The primary outcome was death or treatment failure (defined as a permanent return to in-center hemodialysis or peritoneal dialysis) using an intention-to-treat analysis, with death-censored treatment failure as a secondary outcome. The cohort consisted of 600, 508 and 202 patients receiving conventional, nocturnal, and short daily HHD, respectively. Conventional-HHD patients were more likely to use dialysis catheter access (43%) versus nocturnal or short daily HHD (32% and 31%, respectively). Although point estimates were in favor of both therapies, after multivariable adjustment for patient and center factors there was no statistically significant reduction in the relative hazard for the death/treatment failure composite comparing nocturnal to conventional HHD (hazard ratio 0.83 [95% confidence interval 0.66-1.03]) or short daily to conventional HHD (0.84, 0.63-1.12). Among those with information on vascular access, patients receiving nocturnal HHD had a relative improvement in death-censored treatment survival (0.75, 0.57-0.98). Thus, in this national cohort of HHD patients, those receiving short daily and nocturnal HHD had similar patient/treatment survival compared with patients receiving conventional HHD. Copyright © 2017 International Society of Nephrology. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Filintas, Agathos, , Dr; Hatzigiannakis, Evagellos, , Dr; Arampatzis, George, , Dr; Ilias, Andreas; Panagopoulos, Andreas, , Dr; Hatzispiroglou, Ioannis
2015-04-01
The aim of the present study is a thorough comparison of conventional and innovative hydrometric methods and tools for river flow monitoring. A case study was conducted in Stara river at the Agios Germanos monitoring station (northwest Greece), in order to investigate possible deviations between conventional and innovative methods and tools for river flow velocity and discharge. Two flowmeters, both manufactured in 2013 (OTT Messtechnik GmbH), were used, as follows: a) A conventional propeller flow velocity meter (OTT Model C2), a mechanical current flow meter with a BARGO certification of calibration, operated with a rod and relocating device, along with a digital measuring device including an electronic flow calculator, data logger and real-time control display unit. This flowmeter has a measurement velocity range of 0.025-4.000 m/s. b) An innovative electromagnetic flowmeter (OTT Model MF pro), consisting of a compact, light-weight sensor and a robust handheld unit. Both system components are designed to be attached to conventional wading rods. The electromagnetic flowmeter uses Faraday's law of electromagnetic induction to measure the flow: when an electrically conductive fluid flows along the meter, a voltage is induced between a pair of electrodes placed at right angles to the direction of the magnetic field, and this electrode voltage is directly proportional to the average fluid velocity. The electromagnetic flowmeter was operated with a rod and relocating device, along with a digital measuring device with various logging and graphical capabilities and various methods of velocity measurement (ISO/USGS standards). This flowmeter has a measurement velocity range of 0.000-6.000 m/s. The river flow data were averaged over paired measurements of 60+60 seconds, and the measured river water flow velocity, depths and widths of the segments were used to estimate the cross-section's mean flow velocity in each measured segment. The mid-section method was then used for the overall discharge calculation over the flow area of all segments. The cross-section characteristics, the river flow velocity of the segments, and the mean water flow velocity and total discharge profile were measured, calculated and annotated, respectively. A series of concurrent conventional and innovative (electromagnetic) flow measurements was performed during 2014. The results and statistical analysis showed that the Froude number during the measurement period was in all cases Fr<1, which means that the water flow of the Stara river is classified as subcritical. The 12-month study showed various advantages for the electromagnetic sensor, which is virtually maintenance-free because there are no moving parts; no calibration was required in practice, and it can be used even at the lowest water velocities, from 0.000 m/s. Moreover, based on the concurrent hydrometric measurements of the Stara river, on the velocity and discharge modelling and on the statistical analysis, no significant statistical difference (α=0.05) was found between mean velocities measured with a) the conventional and b) the electromagnetic method, except at low velocities, where a significant statistical difference was found and the electromagnetic method appears to be more accurate. Acknowledgments: Data in this study were collected in the framework of the elaboration of the national water resources monitoring network, supervised by the Special Secretariat for Water, Hellenic Ministry for the Environment and Climate Change.
This project was elaborated in the framework of the operational program "Environment and Sustainable Development", which is co-funded by the National Strategic Reference Framework (NSRF) and the Public Investment Program (PIP).
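A minimal sketch of the mid-section discharge calculation and the Froude-number check described above; the verticals, depths and velocities are illustrative, not measurements from Stara river.

```python
# Mid-section method: each vertical represents a segment reaching halfway
# to its neighbours; discharge is the sum of velocity x depth x width.
import numpy as np

dist = np.array([0.0, 0.5, 1.5, 2.5, 3.5, 4.0])        # m, across the section
depth = np.array([0.0, 0.35, 0.60, 0.55, 0.30, 0.0])   # m
vel = np.array([0.0, 0.28, 0.45, 0.41, 0.22, 0.0])     # m/s, segment means

width = np.empty_like(dist)
width[1:-1] = (dist[2:] - dist[:-2]) / 2
width[0] = (dist[1] - dist[0]) / 2
width[-1] = (dist[-1] - dist[-2]) / 2

Q = np.sum(vel * depth * width)                        # discharge, m3/s
mean_depth = np.average(depth[1:-1], weights=width[1:-1])
Fr = vel.max() / np.sqrt(9.81 * mean_depth)            # Fr < 1 -> subcritical
print(f"Q = {Q:.3f} m3/s, Fr = {Fr:.2f}")
```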
Bayesian analyses of time-interval data for environmental radiation monitoring.
Luo, Peng; Sharp, Julia L; DeVol, Timothy A
2013-01-01
Time-interval (time difference between two consecutive pulses) analysis based on the principles of Bayesian inference was investigated for online radiation monitoring. Using experimental and simulated data, Bayesian analysis of time-interval data [Bayesian (ti)] was compared with Bayesian and a conventional frequentist analysis of counts in a fixed count time [Bayesian (cnt) and single interval test (SIT), respectively]. The performances of the three methods were compared in terms of average run length (ARL) and detection probability for several simulated detection scenarios. Experimental data were acquired with a DGF-4C system in list mode. Simulated data were obtained using Monte Carlo techniques to obtain a random sampling of the Poisson distribution. All statistical algorithms were developed using the R Project for statistical computing. Bayesian analysis of time-interval information provided a similar detection probability to Bayesian analysis of count information, but the authors were able to make a decision with fewer pulses at relatively higher radiation levels. In addition, for cases with a very short source presence (< count time), time-interval information is more sensitive for detecting a change than count information, since the source counts are averaged with the background counts over the entire count time. The relationships of the source time, change points, and modifications to the Bayesian approach for increasing detection probability are presented.
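A minimal sketch of the time-interval idea, assuming illustrative rates and prior: after each detected pulse, the posterior probability that a source is present is updated from the exponential likelihood of the observed inter-arrival time. This is a simplified two-hypothesis version, not the authors' R implementation.

```python
# Sequential Bayesian update on exponential inter-arrival times.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
lam_b, lam_s = 5.0, 10.0                 # background vs background+source (cps)
post = 0.5                               # prior probability a source is present

intervals = rng.exponential(1 / lam_s, 20)   # simulated pulses from a source

for t in intervals:
    like_s = stats.expon.pdf(t, scale=1 / lam_s)
    like_b = stats.expon.pdf(t, scale=1 / lam_b)
    post = post * like_s / (post * like_s + (1 - post) * like_b)
    if post > 0.95:                      # decision threshold reached
        break
print(f"posterior source probability: {post:.3f}")
```

Because the decision can fire on any pulse, fewer pulses are needed at high rates than with a fixed count time, which is the behaviour the abstract reports.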
Why weight? Modelling sample and observational level variability improves power in RNA-seq analyses
Liu, Ruijie; Holik, Aliaksei Z.; Su, Shian; Jansz, Natasha; Chen, Kelan; Leong, Huei San; Blewitt, Marnie E.; Asselin-Labat, Marie-Liesse; Smyth, Gordon K.; Ritchie, Matthew E.
2015-01-01
Variations in sample quality are frequently encountered in small RNA-sequencing experiments, and pose a major challenge in a differential expression analysis. Removal of high variation samples reduces noise, but at a cost of reducing power, thus limiting our ability to detect biologically meaningful changes. Similarly, retaining these samples in the analysis may not reveal any statistically significant changes due to the higher noise level. A compromise is to use all available data, but to down-weight the observations from more variable samples. We describe a statistical approach that facilitates this by modelling heterogeneity at both the sample and observational levels as part of the differential expression analysis. At the sample level this is achieved by fitting a log-linear variance model that includes common sample-specific or group-specific parameters that are shared between genes. The estimated sample variance factors are then converted to weights and combined with observational level weights obtained from the mean–variance relationship of the log-counts-per-million using ‘voom’. A comprehensive analysis involving both simulations and experimental RNA-sequencing data demonstrates that this strategy leads to a universally more powerful analysis and fewer false discoveries when compared to conventional approaches. This methodology has wide application and is implemented in the open-source ‘limma’ package. PMID:25925576
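For intuition (in Python rather than the R 'limma'/'voom' implementation the authors describe), the sketch below estimates a crude per-sample variance factor shared across genes, converts it to inverse-variance weights, and uses the weights in a per-gene weighted regression; all data are simulated and the estimator is deliberately simplified.

```python
# Conceptual sketch of down-weighting variable samples in a per-gene test.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n_genes, n_samples = 500, 8
group = np.array([0, 0, 0, 0, 1, 1, 1, 1])
noise_sd = np.array([1, 1, 1, 3, 1, 1, 1, 3])      # two degraded samples

Y = rng.normal(0, noise_sd, size=(n_genes, n_samples))  # log-expression

# Sample-specific variance factors shared across genes (crude stand-in for
# the log-linear variance model described above)
resid = Y - Y.mean(axis=1, keepdims=True)
var_factor = (resid ** 2).mean(axis=0)
weights = 1 / var_factor                            # convert to weights

X = sm.add_constant(group)
fit = sm.WLS(Y[0], X, weights=weights).fit()        # weighted per-gene test
print(fit.pvalues[1], "- degraded samples down-weighted, not discarded")
```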
Controlling false-negative errors in microarray differential expression analysis: a PRIM approach.
Cole, Steve W; Galic, Zoran; Zack, Jerome A
2003-09-22
Theoretical considerations suggest that current microarray screening algorithms may fail to detect many true differences in gene expression (Type II analytic errors). We assessed 'false negative' error rates in differential expression analyses by conventional linear statistical models (e.g. t-test), microarray-adapted variants (e.g. SAM, Cyber-T), and a novel strategy based on hold-out cross-validation. The latter approach employs the machine-learning algorithm Patient Rule Induction Method (PRIM) to infer minimum thresholds for reliable change in gene expression from Boolean conjunctions of fold-induction and raw fluorescence measurements. Monte Carlo analyses based on four empirical data sets show that conventional statistical models and their microarray-adapted variants overlook more than 50% of genes showing significant up-regulation. Conjoint PRIM prediction rules recover approximately twice as many differentially expressed transcripts while maintaining strong control over false-positive (Type I) errors. As a result, experimental replication rates increase and total analytic error rates decline. RT-PCR studies confirm that gene inductions detected by PRIM but overlooked by other methods represent true changes in mRNA levels. PRIM-based conjoint inference rules thus represent an improved strategy for high-sensitivity screening of DNA microarrays. Freestanding JAVA application at http://microarray.crump.ucla.edu/focus
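As a rough illustration of the kind of Boolean conjunction rule PRIM induces (not the PRIM algorithm itself), the sketch below calls a gene 'changed' only when both a fold-change threshold and a raw-fluorescence threshold are met; the thresholds and simulated data are invented for illustration.

```python
# Conjunction calling rule: fold-change AND raw fluorescence thresholds.
import numpy as np

rng = np.random.default_rng(3)
n = 2000
baseline = rng.lognormal(6, 1, n)            # raw fluorescence, condition A
fold = rng.lognormal(0, 0.3, n)
fold[:200] *= 3.0                            # 200 truly induced genes
signal = baseline * fold                     # condition B

min_fold, min_raw = 1.8, 300.0               # conjunction thresholds (invented)
called = (signal / baseline >= min_fold) & (signal >= min_raw)

truth = np.zeros(n, bool)
truth[:200] = True
tp = (called & truth).sum()
fp = (called & ~truth).sum()
print(f"recovered {tp}/200 induced genes, {fp} false positives")
```

In the paper, the thresholds are inferred by PRIM under hold-out cross-validation rather than fixed by hand as here.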
Vexler, Albert; Yu, Jihnhee
2018-04-13
A common statistical doctrine supported by many introductory courses and textbooks is that t-test type procedures based on normally distributed data points are anticipated to provide a standard in decision-making. In order to motivate scholars to examine this convention, we introduce a simple approach based on graphical tools of receiver operating characteristic (ROC) curve analysis, a well-established biostatistical methodology. In this context, we propose employing a p-value-based method that takes into account the stochastic nature of p-values. Drawing on the modern statistical literature, we address the expected p-value (EPV) as a measure of the performance of decision-making rules. We extend the EPV concept to the ROC curve framework, which provides expressive evaluations and visualizations of a wide spectrum of properties of testing mechanisms. We show that the conventional power characterization of tests is a partial aspect of the presented EPV/ROC technique. We hope this explanation convinces researchers of the usefulness of the EPV/ROC approach for depicting different characteristics of decision-making procedures, in light of the growing interest in correct p-value-based applications.
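The EPV is easy to approximate by simulation. The sketch below estimates it for a two-sample t-test under a fixed alternative, alongside conventional power at the 0.05 level, which (as the abstract notes) captures only one facet of the EPV/ROC description; the sample size and effect size are arbitrary.

```python
# Expected p-value (EPV) of a two-sample t-test under a fixed alternative.
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
n, effect, n_sim = 30, 0.5, 10_000

pvals = np.empty(n_sim)
for i in range(n_sim):
    x = rng.normal(0, 1, n)
    y = rng.normal(effect, 1, n)
    pvals[i] = stats.ttest_ind(x, y).pvalue

print(f"EPV = {pvals.mean():.4f}")                     # smaller is better
print(f"power at 0.05 = {(pvals < 0.05).mean():.3f}")  # one point of the curve
```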
Adjustment of geochemical background by robust multivariate statistics
Zhou, D.
1985-01-01
Conventional analyses of exploration geochemical data assume that the background is a constant or slowly changing value, equivalent to a plane or a smoothly curved surface. However, it is better to regard the geochemical background as a rugged surface, varying with changes in geology and environment. This rugged surface can be estimated from observed geological, geochemical and environmental properties by using multivariate statistics. A method of background adjustment was developed and applied to groundwater and stream sediment reconnaissance data collected from the Hot Springs Quadrangle, South Dakota, as part of the National Uranium Resource Evaluation (NURE) program. Source-rock lithology appears to be a dominant factor controlling the chemical composition of groundwater or stream sediments. The most efficacious adjustment procedure is to regress uranium concentration on selected geochemical and environmental variables for each lithologic unit, and then to delineate anomalies by a common threshold set as a multiple of the standard deviation of the combined residuals. Robust versions of regression and RQ-mode principal components analysis were used rather than ordinary techniques to guard against distortion caused by outliers. Anomalies delineated by this background adjustment procedure correspond with uranium prospects much better than do anomalies delineated by conventional procedures. The procedure should be applicable to geochemical exploration at different scales for other metals. © 1985.
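A minimal sketch of the adjustment idea, assuming synthetic data: robustly regress the element of interest on covariates within each lithologic unit, pool the residuals, and flag values above a common multiple of the residual standard deviation.

```python
# Per-unit robust regression, pooled residuals, threshold-based anomalies.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 300
lith = rng.integers(0, 3, n)                 # three lithologic units
X = rng.normal(0, 1, (n, 2))                 # geochemical/environmental vars
u = 2 + lith * 1.5 + X @ [0.8, -0.4] + rng.normal(0, 0.5, n)
u[::50] += 3.0                               # a few injected anomalies

resid = np.empty(n)
for g in range(3):                           # regress within each unit
    m = lith == g
    fit = sm.RLM(u[m], sm.add_constant(X[m]),
                 M=sm.robust.norms.HuberT()).fit()
    resid[m] = u[m] - fit.predict(sm.add_constant(X[m]))

threshold = 2.5 * resid.std()                # common multiple of pooled SD
print("anomalous samples:", np.flatnonzero(resid > threshold))
```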
Ferreira, Luciano Ambrosio; Grossmann, Eduardo; Januzzi, Eduardo; Gonçalves, Rafael Tardin Rosa Ferraz; Mares, Fernando Antonio Guedes; de Paula, Marcos Vinicius Queiroz; Carvalho, Antonio Carlos Pires
2015-01-01
Ear acupuncture works by reducing painful sensations with an analgesic effect through microsystem therapy, and has been demonstrated to be as effective as conventional therapies in the control of facial pain. This clinical trial aimed to evaluate the adjuvant action of auricular acupuncture by observing the evolution of temporomandibular and masticatory myofascial symptoms in two groups defined by the elected therapy: auricular acupuncture associated with an occlusal splint (study) and use of the occlusal splint alone (control). Twenty patients were selected and randomly allocated into two groups of ten individuals. Symptoms were evaluated at five different moments, every seven days. We analyzed orofacial muscle and joint palpation in order to measure the intensity of the experienced pain. Both groups showed a statistically significant decrease in muscle and joint symptoms (p < 0.05). However, comparisons between the groups showed a marked and significant reduction of symptomatology in the study group (p < 0.05) already in the first week of therapy. According to the results, the methodological criteria developed, and the statistical analysis applied, the conclusion is that auricular acupuncture therapy has a synergistic action on conventional occlusal splint treatment. It was demonstrated to be effective in the reduction of symptoms in the short term. PMID:26351510
Latha, Selvanathan; Sivaranjani, Govindhan; Dhanasekaran, Dharumadurai
2017-09-01
Among diverse actinobacteria, Streptomyces is a renowned ongoing source for the production of a large number of secondary metabolites, furnishing innumerable pharmacological and biological activities. Hence, to meet the demand for new lead compounds for human and animal use, research is constantly targeting the bioprospecting of Streptomyces. Optimization of media components and physicochemical parameters is a plausible approach to the exploration of intensified production of novel as well as existing bioactive metabolites from various microbes, usually achieved by a range of classical techniques, including one factor at a time (OFAT). However, the major drawbacks of conventional optimization methods have prompted the use of statistical optimization approaches in fermentation process development. Response surface methodology (RSM) is one of the empirical techniques extensively used for modeling, optimization and analysis of fermentation processes. To date, several researchers have implemented RSM in different bioprocess optimizations for the production of assorted natural substances from Streptomyces, with very promising results. This review summarizes some of the recent RSM-based studies on the enhanced production of antibiotics, enzymes and probiotics using Streptomyces, with the intention of highlighting the significance of Streptomyces as well as RSM to the research community and industries.
Al-Dwairi, Ziad N; Tahboub, Kawkab Y; Baba, Nadim Z; Goodacre, Charles J
2018-06-13
The introduction of computer-aided design/computer-aided manufacturing (CAD/CAM) technology to the field of removable prosthodontics has recently made it possible to fabricate complete dentures from prepolymerized polymethyl methacrylate (PMMA) blocks, which are claimed to have better mechanical properties; however, no published reports have evaluated the mechanical properties of CAD/CAM PMMA. The purpose of this study was to compare the flexural strength, impact strength, and flexural modulus of two brands of CAD/CAM PMMA and a conventional heat-cured PMMA. 45 rectangular specimens (65 mm × 10 mm × 3 mm) were fabricated (15 CAD/CAM AvaDent PMMA specimens from AvaDent, 15 CAD/CAM Tizian PMMA specimens from Schütz Dental, 15 conventional Meliodent PMMA specimens from Heraeus Kulzer) and stored in distilled water at 37 ± 1°C for 7 days. Specimens (N = 15) in each group were subjected to the three-point bending test and impact strength test, employing the Charpy configuration on unnotched specimens. The morphology of the fractured specimens was studied under a scanning electron microscope (SEM). Statistical analysis was performed using one-way ANOVA and Tukey pairwise multiple comparisons with a 95% confidence interval. The Schütz Dental specimens showed the highest mean flexural strength (130.67 MPa) and impact strength (29.56 kJ/m²). The highest mean flexural modulus was recorded in the AvaDent group (2519.6 MPa). The conventional heat-cured group showed the lowest mean flexural strength (93.33 MPa), impact strength (14.756 kJ/m²), and flexural modulus (2117.2 MPa). Differences in means of flexural properties between AvaDent and Schütz Dental specimens were not statistically significant (p > 0.05). As CAD/CAM PMMA specimens exhibited improved flexural strength, flexural modulus, and impact strength in comparison with the conventional heat-cured group, CAD/CAM dentures are expected to be more durable. Different brands of CAD/CAM PMMA may have inherent variations in mechanical properties. © 2018 by the American College of Prosthodontists.
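For reference, the flexural properties reported above are conventionally computed from a three-point bend test as follows; the support span and load/slope values in this sketch are assumed for illustration, not taken from the study.

```python
# Three-point bending formulas for a rectangular bar (b = width, d = thickness).
def flexural_strength(F, L, b, d):
    """sigma = 3FL / (2 b d^2); MPa when F is in N and lengths in mm."""
    return 3 * F * L / (2 * b * d ** 2)

def flexural_modulus(m, L, b, d):
    """E = L^3 m / (4 b d^3); m is the load/deflection slope in N/mm."""
    return L ** 3 * m / (4 * b * d ** 3)

span, width, thick = 50.0, 10.0, 3.0           # mm; span is an assumption
print(flexural_strength(F=112.0, L=span, b=width, d=thick))  # ~93 MPa
print(flexural_modulus(m=18.3, L=span, b=width, d=thick))    # ~2100 MPa
```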
Duque, Jussaro Alves; Duarte, Marco Antonio Hungaro; Canali, Lyz Cristina Furquim; Zancan, Rafaela Fernandes; Vivan, Rodrigo Ricci; Bernardes, Ricardo Affonso; Bramante, Clovis Monteiro
2017-02-01
The aim of this study was to compare the effectiveness of Easy Clean (Easy Dental Equipment, Belo Horizonte, MG, Brazil) in continuous and reciprocating motion, passive ultrasonic irrigation (PUI), Endoactivator systems (Dentsply Maillefer, Ballaigues, Switzerland), and conventional irrigation for debris removal from root canals and isthmus. Fifty mesial roots of mandibular molars were embedded in epoxy resin using a metal muffle; afterward, the blocks containing the roots were sectioned at 2, 4, and 6 mm from the apex. After instrumentation, the roots were divided into 5 groups (n = 10) for application of the final irrigation protocol using Easy Clean in continuous rotation, Easy Clean in reciprocating motion, PUI, Endoactivator, and conventional irrigation. Scanning electron microscopic images were taken after instrumentation and after the first, second, and third activation of irrigating solution to evaluate the area of remaining debris with image J software (National Institutes of Health, Bethesda, MD). The protocol of 3 irrigating solution activations for 20 seconds provided better cleaning of the canal and isthmus. On conclusion of all procedures, analysis of the canals showed a statistical difference only at 2 mm; the Easy Clean in continuous rotation was more efficient than conventional irrigation (P < .05). On conclusion of all steps, the largest difference was observed in the isthmus in which the Easy Clean in continuous rotation was more effective than conventional irrigation at the 3 levels analyzed and the Endoactivator at 4 mm (P < .05). The PUI promoted greater cleaning than conventional irrigation at 6 mm (P < .05). There was no statistical difference between Easy Clean in continuous rotation, Easy Clean in reciprocating motion, and PUI (P > .05). Irrigating solution activation methods provided better cleaning of the canal and isthmus, especially the Easy Clean used in continuous rotation. The protocol of 3 irrigating solution activations for 20 seconds favored better cleaning. Copyright © 2016 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Cox, M.; Shirono, K.
2017-10-01
A criticism levelled at the Guide to the Expression of Uncertainty in Measurement (GUM) is that it is based on a mixture of frequentist and Bayesian thinking. In particular, the GUM's Type A (statistical) uncertainty evaluations are frequentist, whereas the Type B evaluations, using state-of-knowledge distributions, are Bayesian. In contrast, making the GUM fully Bayesian implies, among other things, that a conventional objective Bayesian approach to Type A uncertainty evaluation for a number n of observations leads to the impractical consequence that n must be at least equal to 4, thus presenting a difficulty for many metrologists. This paper presents a Bayesian analysis of Type A uncertainty evaluation that applies for all n ≥ 2, as in the frequentist analysis in the current GUM. The analysis is based on assuming that the observations are drawn from a normal distribution (as in the conventional objective Bayesian analysis), but uses an informative prior based on lower and upper bounds for the standard deviation of the sampling distribution for the quantity under consideration. The main outcome of the analysis is a closed-form mathematical expression for the factor by which the standard deviation of the mean observation should be multiplied to calculate the required standard uncertainty. Metrological examples are used to illustrate the approach, which is straightforward to apply using a formula or look-up table.
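For context, the n ≥ 4 restriction mentioned above follows from the standard objective Bayesian Type A result (this is the textbook argument, not the informative-prior formula derived in the paper itself):

```latex
% With a noninformative prior, the posterior for the measurand is a
% scaled-and-shifted Student-t with \nu = n-1 degrees of freedom, whose
% standard deviation exists only for \nu > 2, i.e. n \ge 4:
\[
  u(\bar{x}) \;=\; \frac{s}{\sqrt{n}}\,\sqrt{\frac{n-1}{n-3}},
  \qquad n \ge 4 .
\]
```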
Choi, Sang Hyun; Lee, Jeong Hyun; Choi, Young Jun; Park, Ji Eun; Sung, Yu Sub; Kim, Namkug; Baek, Jung Hwan
2017-01-01
This study aimed to explore the added value of histogram analysis of the ratio of initial to final 90-second time-signal intensity AUC (AUCR) for differentiating local tumor recurrence from contrast-enhancing scar on follow-up dynamic contrast-enhanced T1-weighted perfusion MRI of patients treated for head and neck squamous cell carcinoma (HNSCC). AUCR histogram parameters were assessed among tumor recurrence (n = 19) and contrast-enhancing scar (n = 27) at primary sites and compared using the t test. ROC analysis was used to determine the best differentiating parameters. The added value of AUCR histogram parameters was assessed when they were added to inconclusive conventional MRI results. Histogram analysis showed statistically significant differences in the 50th, 75th, and 90th percentiles of the AUCR values between the two groups (p < 0.05). The 90th percentile of the AUCR values (AUCR90) was the best predictor of local tumor recurrence (AUC, 0.77; 95% CI, 0.64-0.91) with an estimated cutoff of 1.02. AUCR90 increased sensitivity by 11.7% over that of conventional MRI alone when added to inconclusive results. Histogram analysis of AUCR can improve the diagnostic yield for local tumor recurrence during surveillance after treatment for HNSCC.
A meta-analysis of aneurysm formation in laser assisted vascular anastomosis (LAVA)
NASA Astrophysics Data System (ADS)
Chen, Chen; Peng, Fei; Xu, Dahai; Cheng, Qinghua
2009-08-01
Laser-assisted vascular anastomosis (LAVA) is regarded as a particularly promising non-suture method for the future. However, aneurysm formation is one of the main reasons delaying the clinical application of LAVA. Several groups have investigated the incidence of aneurysms in animal models. To systematically analyze the literature on the reported incidence of aneurysm formation in LAVA therapy, we performed a meta-analysis comparing LAVA with conventional suture anastomosis (CSA) in animal models. Data were systematically retrieved and selected from PUBMED. In total, 23 studies were retrieved; 18 studies were excluded, and 5 studies involving 647 animals were included. Analysis suggested no statistically significant difference between LAVA and CSA (OR 1.24, 95% CI 0.66-2.32, P=0.51). The result of the meta-analysis suggests that the technology of LAVA is very close to clinical application.
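A fixed-effect summary of this kind can be reproduced with a few lines of inverse-variance pooling; the per-study 2×2 counts below are invented placeholders, not the five included studies.

```python
# Inverse-variance pooling of study odds ratios (fixed-effect model).
import numpy as np

# events/total in the LAVA and CSA arms per study: (a, n1, c, n2), invented
studies = [(3, 40, 2, 38), (1, 25, 2, 27), (4, 60, 3, 55),
           (2, 30, 2, 31), (2, 45, 1, 44)]

log_or, w = [], []
for a, n1, c, n2 in studies:
    b, d = n1 - a, n2 - c
    a, b, c, d = a + 0.5, b + 0.5, c + 0.5, d + 0.5   # continuity correction
    log_or.append(np.log(a * d / (b * c)))
    w.append(1 / (1/a + 1/b + 1/c + 1/d))             # inverse variance

pooled = np.average(log_or, weights=w)
se = 1 / np.sqrt(np.sum(w))
print(f"OR {np.exp(pooled):.2f} "
      f"(95% CI {np.exp(pooled - 1.96*se):.2f}-{np.exp(pooled + 1.96*se):.2f})")
```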
Hu, Xiangdong; Liu, Yujiang; Qian, Linxue
2017-01-01
Abstract Background: Real-time elastography (RTE) and shear wave elastography (SWE) are noninvasive and easily available imaging techniques that measure tissue strain, and it has been reported that the sensitivity and specificity of elastography in differentiating between benign and malignant thyroid nodules are better than those of conventional technologies. Methods: Relevant articles were searched in multiple databases; the comparison of the elasticity index (EI) was conducted with Review Manager 5.0. Forest plots of the sensitivity and specificity and SROC curves of RTE and SWE were produced with STATA 10.0 software. In addition, sensitivity analysis and bias analysis of the studies were conducted to examine the quality of the articles; to estimate possible publication bias, a funnel plot was used and the Egger test was conducted. Results: Finally, 22 articles that satisfied the inclusion criteria were included in this study. After exclusions, 2106 benign and 613 malignant nodules remained. The meta-analysis suggested that the difference in EI between benign and malignant nodules was statistically significant (SMD = 2.11, 95% CI [1.67, 2.55], P < .00001). The overall sensitivities of RTE and SWE were roughly comparable, whereas the difference in specificities between these 2 methods was statistically significant. In addition, a statistically significant difference in AUC was observed between RTE and SWE (P < .01). Conclusion: The specificity of RTE was statistically higher than that of SWE, which suggests that, compared with SWE, RTE may be more accurate at differentiating benign and malignant thyroid nodules. PMID:29068996
Sekine, Leo; Morais, Vinícius Daudt; Lima, Karine Margarites; Onsten, Tor Gunnar Hugo; Ziegelmann, Patrícia Klarmann; Ribeiro, Rodrigo Antonini
2015-12-01
Previous meta-analyses suggested that acute myeloid leukaemia induction regimens containing idarubicin (IDA) or high-dose daunorubicin (HDD) induce higher rates of complete remission (CR) than conventional-dose daunorubicin (CDD), with a possible benefit in overall survival. However, robust comparisons between these regimens are still lacking. We conducted a mixed treatment comparison meta-analysis of these three regimens. Mixed treatment comparison is a statistical method of data summarization that combines direct and indirect effect estimates. The literature search strategy included MEDLINE, EMBASE, Cochrane, Scielo and LILACS, from inception until August 2013, and resulted in the inclusion of 17 trials enrolling 7258 adult patients. HDD [relative risk (RR) 1.13; 95% credible interval (CrI) 1.02-1.26] and IDA (RR 1.13; 95% CrI 1.05-1.23) showed higher CR rates than CDD. IDA also led to lower long-term overall mortality rates when compared with CDD (RR 0.93, 95% CrI 0.86-0.99), whereas HDD and CDD were no different (RR 0.94, 95% CrI 0.85-1.02). The HDD and IDA comparison did not reach statistically significant differences in CR (RR 1.00; 95% CrI 0.89-1.11) or in long-term mortality (RR 1.01, 95% CrI 0.91-1.11). IDA and HDD are consistently superior to CDD in inducing CR, and IDA was associated with lower long-term mortality. On the basis of these findings, we recommend incorporation of IDA and HDD instead of the traditional CDD as standard treatments for acute myeloid leukaemia induction. The lack of HDD benefit on mortality when compared with CDD in this study should be interpreted cautiously, because it may have been underestimated owing to statistical power limitations. Copyright © 2014 John Wiley & Sons, Ltd.
Advances in Bayesian Modeling in Educational Research
ERIC Educational Resources Information Center
Levy, Roy
2016-01-01
In this article, I provide a conceptually oriented overview of Bayesian approaches to statistical inference and contrast them with frequentist approaches that currently dominate conventional practice in educational research. The features and advantages of Bayesian approaches are illustrated with examples spanning several statistical modeling…
NASA Astrophysics Data System (ADS)
Matsuda, Takashi S.; Nakamura, Takuji; Ejiri, Mitsumu K.; Tsutsumi, Masaki; Shiokawa, Kazuo
2014-08-01
We have developed a new analysis method for obtaining the power spectrum in the horizontal phase velocity domain from airglow intensity image data to study atmospheric gravity waves. This method can deal with extensive amounts of imaging data obtained on different years and at various observation sites without bias caused by different event extraction criteria for the person processing the data. The new method was applied to sodium airglow data obtained in 2011 at Syowa Station (69°S, 40°E), Antarctica. The results were compared with those obtained from a conventional event analysis in which the phase fronts were traced manually in order to estimate horizontal characteristics, such as wavelengths, phase velocities, and wave periods. The horizontal phase velocity of each wave event in the airglow images corresponded closely to a peak in the spectrum. The statistical results of spectral analysis showed an eastward offset of the horizontal phase velocity distribution. This could be interpreted as the existence of wave sources around the stratospheric eastward jet. Similar zonal anisotropy was also seen in the horizontal phase velocity distribution of the gravity waves by the event analysis. Both methods produce similar statistical results about directionality of atmospheric gravity waves. Galactic contamination of the spectrum was examined by calculating the apparent velocity of the stars and found to be limited for phase speeds lower than 30 m/s. In conclusion, our new method is suitable for deriving the horizontal phase velocity characteristics of atmospheric gravity waves from an extensive amount of imaging data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Çildağ, Mehmet Burak, E-mail: mbcildag@yahoo.com; Çildağ, Songül, E-mail: songulcildag@yahoo.com; Köseoğlu, Ömer Faruk Kutsi, E-mail: kutsikoseoglu@yahoo.com
Objective: The aim of this study is to investigate the potential association of the neutrophil–lymphocyte ratio (NLR) with primary patency of percutaneous transluminal angioplasty (PTA) in hemodialysis arteriovenous fistula stenosis and with the type (conventional or drug-eluting) of balloon used in PTA. Material-Method: This retrospective study consists of 78 patients with significant arteriovenous fistula stenosis who were treated with PTA using a Drug-Eluting Balloon (DEB) (n = 29) or Conventional Balloon (CB) (n = 49). NLR was calculated from preinterventional blood samples. All patients were classified into two groups: Group A, primary patency <12 months (43/78); Group B, primary patency ≥12 months (35/78). Cox regression analysis and the Kaplan–Meier method were used to determine, respectively, the independent factors affecting primary patency and to compare primary patency between the two balloon types. Results: The NLR and balloon type of the two groups were significantly different (p = 0.002, p = 0.010). The cut-off value of NLR was 3.18 for determination of primary patency, with a sensitivity of 81.4% and a specificity of 51.4%. Primary patency rates between PTA with DEB and CB displayed statistically significant differences (p < 0.05). The cut-off value was 3.28 for determination of 12-month primary patency in the conventional balloon group; sensitivity was 81.8% and specificity was 81.3%. There was no statistical relation between NLR levels and 12-month primary patency in the drug-eluting balloon group (p = 0.927). Conclusion: An increased level of NLR may be a risk factor for the development of early AVF restenosis after successful PTA. Preferring a drug-eluting balloon at an increased level of NLR can be beneficial for prolonging patency.
Sridharan, K; Sandbhor, Shailesh; Rajasekaran, U B; Sam, George; Ramees, M Mohamed; Abraham, Esther A
2017-08-01
The purpose of this research is to compare the frictional attributes of conventional stainless steel brackets and self-ligating stainless steel brackets with different dimensions of archwires. The test was carried out with two sets of maxillary brackets: (1) conventional stainless steel (Victory Series) and (2) stainless steel self-ligating (SmartClip), without first premolar brackets. Stainless steel, nickel-titanium (NiTi), and beta-titanium (beta-Ti) orthodontic archwire alloys were tested in this study. Frictional force was monitored with a universal testing machine (Instron 33R 4467) comprising a 10 kg tension load cell set on a 1 kg range and reading from 0 to 2 kg, which allows an archwire to be drawn along the brackets. One-way analysis of variance was used to test the difference between groups, and Student's t-test was used to analyze the statistical difference between the two groups. For the Victory Series, the p-value was 0.946 for static friction and 0.944 for kinetic friction; for SmartClip, the p-values for static and kinetic frictional resistance were 0.497 and 0.518, respectively. Hence, there was no statistically significant difference between the NiTi and stainless steel archwires. It is concluded that, compared with conventional brackets with stainless steel ligatures, self-ligating brackets can produce significantly less friction during sliding. Beta-Ti archwires expressed the highest frictional resistance and stainless steel archwires the lowest among the archwire materials tested. In orthodontics, frictional resistance has always played a major role: its ability to impair tooth movement leads to the need for higher forces to move the teeth, extending treatment time and resulting in loss of posterior anchorage. Friction in orthodontics is associated with sliding mechanics, when a wire moves through one or a series of bracket slots.
Conventional versus computer-navigated TKA: a prospective randomized study.
Todesca, Alessandro; Garro, Luca; Penna, Massimo; Bejui-Hugues, Jacques
2017-06-01
The purpose of this study was to assess the midterm results of total knee arthroplasty (TKA) implanted with a specific computer navigation system in a group of patients (NAV) and to assess the same prosthesis implanted with the conventional technique in another group (CON); we hypothesized that computer navigation surgery would improve implant alignment, functional scores and survival of the implant compared with the conventional technique. From 2008 to 2009, 225 patients were enrolled in the study and randomly assigned to the CON and NAV groups; 240 consecutive mobile-bearing ultra-congruent Score (Amplitude, Valence, France) TKAs were performed by a single surgeon, 117 using the conventional method and 123 using the computer-navigated approach. Clinical outcome assessment was based on the Knee Society Score (KSS), the Hospital for Special Surgery Knee Score and the Western Ontario and McMaster Universities index score. Component survival was calculated by Kaplan-Meier analysis. Median follow-up was 6.4 years (range 6-7 years). Two patients were lost to follow-up. No differences were seen between the two groups in age, sex, BMI or side of implantation. Three patients in the CON group reported feelings of instability during walking, but clinical tests were all negative. The NAV group showed statistically significantly better KSS scores, wider ROM, and fewer outliers from the neutral mechanical axis, lateral distal femoral angle, medial proximal tibial angle and tibial slope on post-operative radiographic assessment. There was one case of early post-operative superficial infection (caused by Staphylococcus aureus) successfully treated with antibiotics. No mechanical loosening, mobile-bearing dislocation or patellofemoral complication was seen. At 7 years of follow-up, component survival in relation to the risk of aseptic loosening or other complications was 100%. There were no implant revisions. This study demonstrates superior accuracy in implant positioning and statistically significantly better functional outcomes with computer-navigated TKA. Computer navigation for TKAs should be used routinely in primary implants. Level of evidence: II.
Melfa, G I; Raspanti, C; Attard, M; Cocorullo, G; Attard, A; Mazzola, S; Salamone, G; Gulotta, G; Scerrino, G
2016-01-01
Primary hyperparathyroidism (PHPT) originates from a solitary adenoma in 70-95% of cases. Moreover, advances in methods for localizing an abnormal parathyroid gland have made minimally invasive techniques more prominent. This study presents a micro-cost analysis of two parathyroidectomy techniques. 72 consecutive patients who underwent minimally invasive parathyroidectomy for PHPT, either video-assisted (MIVAP, group A, 52 patients) or "open" under local anaesthesia (OMIP, group B, 20 patients), were reviewed. Operating room, consumable, anaesthesia and maintenance costs, equipment depreciation and surgeon/anaesthesiologist fees were evaluated. Patient satisfaction and the rate of conversion to conventional parathyroidectomy were investigated. Student's t-test, the Kolmogorov-Smirnov test and odds ratios were used for statistical analysis. 1 patient of group A and 2 of group B were excluded from the cost analysis because of conversion to the conventional technique. For the remaining patients, the overall average costs were: for the operating room, 1186.69 € for the MIVAP group (51 patients) and 836.11 € for the OMIP group (p<0.001); for the team, 122.93 € (group A) and 90.02 € (group B) (p<0.001); the other operative costs were 1388.32 € (group A) and 928.23 € (group B) (p<0.001). Patient satisfaction was very strongly in favour of group B (odds ratio 20.5 with a 95% confidence interval). MIVAP is more expensive than "open" parathyroidectomy under local anaesthesia because of the costs of general anaesthesia and the longer operative time. Moreover, patients generally prefer local anaesthesia. Nevertheless, the rate of conversion to conventional parathyroidectomy was higher in the local anaesthesia group than with MIVAP, since the latter allows a four-gland exploration.
Assessment of Uncertainties Related to Seismic Hazard Using Fuzzy Analysis
NASA Astrophysics Data System (ADS)
Jorjiashvili, N.; Yokoi, T.; Javakhishvili, Z.
2013-05-01
Seismic hazard analysis has become a very important issue in the last few decades. Recently, new technologies and improved data have helped many scientists understand where and why earthquakes happen and the physics of earthquakes, and to appreciate the role of uncertainty in seismic hazard analysis. However, how to handle the existing uncertainty remains a significant problem, and the same lack of information makes it difficult to quantify uncertainty accurately. Attenuation curves are usually obtained statistically, by regression analysis. Statistical and probabilistic analyses show overlapping results for the site coefficients; this overlap occurs not only at the border between two neighboring classes but also among three or more classes. Although the analysis starts from classifying sites using geological terms, these site coefficients are not well separated by class. In the present study, this problem is addressed using fuzzy set theory: with membership functions, the ambiguities at the border between neighboring classes can be avoided. Fuzzy set theory is applied to southern California data in the conventional way. Standard deviations showing the variation within each site class, obtained by fuzzy set theory and by the classical approach, are compared. The results show that, when data for hazard assessment are insufficient, site classification based on fuzzy set theory yields smaller standard deviations than the classical approach, which is direct evidence of reduced uncertainty.
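A minimal sketch of how membership functions soften a class border, with illustrative breakpoints and hypothetical site coefficients (none of these values are from the study):

```python
# Fuzzy site classification near a class border: a site belongs partly to
# two classes, so its coefficient blends the two instead of jumping.
import numpy as np

def rising(x, lo, hi):
    """0 below lo, 1 above hi, linear in between."""
    return float(np.clip((x - lo) / (hi - lo), 0.0, 1.0))

vs30 = 370.0                        # m/s, near a nominal 360 m/s border
mu_C = rising(vs30, 330, 390)       # stiffer class
mu_D = 1.0 - mu_C                   # softer class (complementary here)

coeff = {"C": 1.15, "D": 1.60}      # hypothetical site amplification factors
blended = mu_C * coeff["C"] + mu_D * coeff["D"]
print(f"mu_C={mu_C:.2f}, mu_D={mu_D:.2f}, blended coefficient={blended:.2f}")
```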
Saadati, Farzaneh; Ahmad Tarmizi, Rohani; Mohd Ayub, Ahmad Fauzi; Abu Bakar, Kamariah
2015-01-01
Because students' ability to use statistics, which is mathematical in nature, is one of educators' concerns, embedding the pedagogical characteristics of learning within an e-learning system is 'value added', because it facilitates the conventional method of learning mathematics. Many researchers emphasize the effectiveness of cognitive apprenticeship in learning and problem solving in the workplace. In a cognitive apprenticeship learning model, skills are learned within a community of practitioners through observation of modelling and then practice plus coaching. This study utilized an internet-based Cognitive Apprenticeship Model (i-CAM) in three phases and evaluated its effectiveness for improving statistics problem-solving performance among postgraduate students. The results showed that, when compared with the conventional mathematics learning model, i-CAM significantly promoted students' problem-solving performance at the end of each phase. In addition, the differences in students' test scores remained statistically significant after controlling for pre-test scores. The findings conveyed in this paper confirm the considerable value of i-CAM in the improvement of statistics learning for non-specialized postgraduate students.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kerr, J.; Jones, G.L.
1996-01-01
Recent advances in hardware and software have given the interpreter and engineer new ways to view 3D seismic data and well bore information. Recent papers have also highlighted the use of various statistics and seismic attributes. By combining new 3D rendering technologies with recent trends in seismic analysis, the interpreter can improve the structural and stratigraphic resolution of hydrocarbon reservoirs. This paper gives several examples using 3D visualization to better define both the structural and stratigraphic aspects of several different structural types from around the world. Statistics, 3D visualization techniques and rapid animation are used to show complex faulting and detailed channel systems. These systems would be difficult to map using either 2D or 3D data with conventional interpretation techniques.
Johnston, David J; Moreau, Robert A
2017-02-01
The aim of this study was to determine whether the compositional differences between grain sorghum and corn impact ethanol yields and coproduct value when grain sorghum is incorporated into existing corn ethanol facilities. Fermentation properties of corn and grain sorghum were compared using two fermentation systems (conventional thermal starch liquefaction and native starch hydrolysis). Fermentation results indicated that protease addition influenced the fermentation rate and yield for grain sorghum, improving yields by 1-2% over non-protease-treated fermentations. Distillers Dried Grains with Solubles produced from sorghum had statistically significantly higher yields and significantly higher protein content relative to corn. Lipid analysis of the Distillers Dried Grains with Solubles showed statistically significant differences between corn and sorghum in triacylglycerol, diacylglycerol and free fatty acid levels. Published by Elsevier Ltd.
THESEUS: maximum likelihood superpositioning and analysis of macromolecular structures.
Theobald, Douglas L; Wuttke, Deborah S
2006-09-01
THESEUS is a command line program for performing maximum likelihood (ML) superpositions and analysis of macromolecular structures. While conventional superpositioning methods use ordinary least-squares (LS) as the optimization criterion, ML superpositions provide substantially improved accuracy by down-weighting variable structural regions and by correcting for correlations among atoms. ML superpositioning is robust and insensitive to the specific atoms included in the analysis, and thus it does not require subjective pruning of selected variable atomic coordinates. Output includes both likelihood-based and frequentist statistics for accurate evaluation of the adequacy of a superposition and for reliable analysis of structural similarities and differences. THESEUS performs principal components analysis for analyzing the complex correlations found among atoms within a structural ensemble. ANSI C source code and selected binaries for various computing platforms are available under the GNU open source license from http://monkshood.colorado.edu/theseus/ or http://www.theseus3d.org.
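For contrast with THESEUS's maximum-likelihood criterion, the sketch below implements the conventional ordinary least-squares superposition (the Kabsch algorithm) that weights all atoms equally; the coordinates are randomly generated to verify the fit.

```python
# Ordinary least-squares structural superposition via the Kabsch algorithm.
import numpy as np

def kabsch(P, Q):
    """Rotate/translate P (n x 3) onto Q (n x 3), minimizing LS distance."""
    Pc, Qc = P - P.mean(0), Q - Q.mean(0)
    V, S, Wt = np.linalg.svd(Pc.T @ Qc)
    sign = np.sign(np.linalg.det(V @ Wt))       # avoid improper rotation
    R = V @ np.diag([1.0, 1.0, sign]) @ Wt
    P_fit = Pc @ R + Q.mean(0)
    rmsd = np.sqrt(((P_fit - Q) ** 2).sum() / len(P))
    return P_fit, rmsd

rng = np.random.default_rng(2)
Q = rng.normal(size=(50, 3))                    # reference coordinates
theta = 0.4                                     # build a rotated/shifted copy
Rz = np.array([[np.cos(theta), -np.sin(theta), 0],
               [np.sin(theta),  np.cos(theta), 0],
               [0, 0, 1]])
P = Q @ Rz.T + np.array([1.0, -2.0, 0.5])
print(kabsch(P, Q)[1])                          # ~0 for an exact copy
```

ML superposition generalizes this by replacing the uniform atom weighting with weights estimated from the structural ensemble's covariance, which is the improvement THESEUS provides.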
Face recognition using an enhanced independent component analysis approach.
Kwak, Keun-Chang; Pedrycz, Witold
2007-03-01
This paper is concerned with an enhanced independent component analysis (ICA) and its application to face recognition. Typically, face representations obtained by ICA involve unsupervised learning and high-order statistics. In this paper, we develop an enhancement of generic ICA by augmenting the method with Fisher linear discriminant analysis (LDA); hence its abbreviation, FICA. The FICA is systematically developed and presented along with its underlying architecture. A comparative analysis explores four distance metrics, as well as classification with support vector machines (SVMs). We demonstrate that the FICA approach leads to the formation of well-separated classes in a low-dimensional subspace and is endowed with a great deal of insensitivity to large variations in illumination and facial expression. Comprehensive experiments were completed on the facial recognition technology (FERET) face database; the comparative analysis demonstrates that FICA yields improved classification rates when compared with some other conventional approaches such as eigenface, fisherface, and ICA itself.
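A conceptual sketch of the FICA pipeline using scikit-learn building blocks (not the authors' implementation): extract independent components, then apply Fisher LDA in the ICA subspace before nearest-neighbour matching; the data are random placeholders rather than FERET images.

```python
# ICA followed by LDA: a rough stand-in for the FICA idea described above.
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(4)
n_subj, per_subj, n_pix = 10, 8, 256
X = rng.normal(size=(n_subj * per_subj, n_pix))    # flattened "images"
y = np.repeat(np.arange(n_subj), per_subj)
X += y[:, None] * 0.3                              # inject class structure

Z = FastICA(n_components=20, random_state=0, max_iter=1000).fit_transform(X)
lda = LinearDiscriminantAnalysis(n_components=n_subj - 1).fit(Z, y)
F = lda.transform(Z)                               # discriminant features

clf = KNeighborsClassifier(1).fit(F, y)            # nearest-neighbour matching
print("training accuracy:", clf.score(F, y))       # optimistic; use held-out data
```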
Hernekamp, J F; Reinecke, A; Neubrech, F; Bickert, B; Kneser, U; Kremer, T
2016-04-01
Four-corner fusion is a standard procedure for advanced carpal collapse. Several operative techniques and numerous implants for osseous fixation have been described. Recently, a specially designed locking plate (Aptus©, Medartis, Basel, Switzerland) was introduced. The purpose of this study was to compare functional results after osseous fixation using K-wires (standard of care, SOC) with four-corner fusion and locking plate fixation. 21 patients who underwent four-corner fusion in our institution between 2008 and 2013 were included in a retrospective analysis. In 11 patients, osseous fixation was performed using locking plates, whereas ten patients underwent bone fixation with conventional K-wires. Outcome parameters were functional outcome, osseous consolidation, patient satisfaction (DASH and Krimmer scores), pain, perioperative morbidity, and the time until patients returned to daily work. Patients were divided into two groups and paired t-tests were performed for statistical analysis. No implant-related complications were observed. Osseous consolidation was achieved in all cases. Differences between groups were not significant regarding active range of motion (AROM), pain and function. Overall patient satisfaction was acceptable in all cases; differences in the DASH questionnaire and the Krimmer questionnaire were not significant. One patient of the plate group required conversion to total wrist arthrodesis without implant-related complications. Both techniques for four-corner fusion have similar healing rates. Using the more expensive locking implant avoids a second operation for K-wire removal, but no statistical differences were detected in functional outcome or patient satisfaction when compared to SOC.
The Impact of Progesterone Level on Day of hCG Injection in IVF Cycles on Clinical Pregnancy Rate.
Ashmita, Jawa; Vikas, Swarankar; Swati, Garg
2017-01-01
Premature progesterone rise (PPR) has long been implicated as contributing to implantation failure. Despite the use of gonadotropin-releasing hormone (GnRH) analogues, subtle increases in serum progesterone (P4) levels beyond a threshold progesterone concentration were observed on the day of trigger in controlled ovarian hyperstimulation cycles. The purpose of the study was to evaluate the incidence of PPR on the day of trigger in conventional IVF/ICSI cycles and its impact on clinical pregnancy rate. A total of 235 patients undergoing conventional IVF/IVF-ICSI with fresh embryo transfer cycles from January 2016 to December 2016 at the infertility unit of a tertiary care hospital were prospectively analyzed. Patients included in the study were subjected to a GnRH agonist long/antagonist protocol. Ovulation induction was achieved with rFSH and/or HMG in both protocols. The cutoff for defining PPR was P4 ≥ 1.5 ng/ml, and an analysis of the role of P4 on clinical pregnancy rate was performed. Statistical analysis was performed with the Statistical Package for the Social Sciences trial version 23.0 software for Windows and Primer software. The overall clinical pregnancy rate per embryo transfer was 30.6%. The clinical pregnancy rate in the patients with P4 < 1.5 ng/ml was significantly higher than in those with elevated levels, P4 ≥ 1.5 ng/ml (33.3% vs. 12.9%; P = 0.037). Premature progesterone elevation in ART cycles is possibly associated with lower clinical pregnancy rates.
Overcoming bias in estimating the volume-outcome relationship.
Tsai, Alexander C; Votruba, Mark; Bridges, John F P; Cebul, Randall D
2006-02-01
To examine the effect of hospital volume on 30-day mortality for patients with congestive heart failure (CHF) using administrative and clinical data in conventional regression and instrumental variables (IV) estimation models. The primary data consisted of longitudinal information on comorbid conditions, vital signs, clinical status, and laboratory test results for 21,555 Medicare-insured patients aged 65 years and older hospitalized for CHF in northeast Ohio in 1991-1997. The patient was the primary unit of analysis. We fit a linear probability model to the data to assess the effects of hospital volume on patient mortality within 30 days of admission. Both administrative and clinical data elements were included for risk adjustment. Linear distances between patients and hospitals were used to construct the instrument, which was then used to assess the endogeneity of hospital volume. When only administrative data elements were included in the risk adjustment model, the estimated volume-outcome effect was statistically significant (p=.029) but small in magnitude. The estimate was markedly attenuated in magnitude and statistical significance when clinical data were added to the model as risk adjusters (p=.39). IV estimation shifted the estimate in a direction consistent with selective referral, but we were unable to reject the consistency of the linear probability estimates. Use of only administrative data for volume-outcomes research may generate spurious findings. The IV analysis further suggests that conventional estimates of the volume-outcome relationship may be contaminated by selective referral effects. Taken together, our results suggest that efforts to concentrate hospital-based CHF care in high-volume hospitals may not reduce mortality among elderly patients.
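A hedged sketch of the estimation strategy: a linear probability model of 30-day mortality with hospital volume instrumented by a distance-based variable. The data file, column names, and use of the linearmodels package are illustrative assumptions, not the authors' code.

```python
import pandas as pd
from linearmodels.iv import IV2SLS

df = pd.read_csv("chf_patients.csv")  # hypothetical patient-level table
df["const"] = 1

# Conventional estimate: volume treated as exogenous (OLS via IV2SLS)
ols = IV2SLS(df["died30"],
             df[["const", "age", "comorbidity", "volume"]],
             None, None).fit()

# 2SLS: distance to the nearest high-volume hospital instruments for volume
iv = IV2SLS(df["died30"],
            df[["const", "age", "comorbidity"]],  # exogenous risk adjusters
            df["volume"],                          # endogenous regressor
            df["dist_highvol"]).fit()              # distance-based instrument
print(iv.summary)
print(iv.wu_hausman())  # tests whether volume is in fact endogenous
```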
Gupta, Alisha; Agarwala, Sandeep; Sreenivas, Vishnubhatla; Srinivas, Madhur; Bhatnagar, Veereshwar
2017-01-01
Females with Krickenbeck low-type anorectal malformations - vestibular fistula (VF) and perineal fistula (PF) - are managed either by a primary definitive or a conventional three-staged approach. Ultimate outcome in these children may be affected by wound dehiscence leading to healing by fibrosis. Most of the literature favors one approach over the other based on retrospective analysis of outcomes. Whether a statistically significant difference in wound dehiscence rates exists between these approaches remained to be determined. A randomized controlled trial of girls <14 years with VF or PF was done. Random number tables were used to randomize 33 children to Group I (primary procedure) and 31 to Group II (three-staged procedure). Statistical analysis was done for significance of difference (P < 0.05) in the primary outcome (wound dehiscence) and secondary outcomes (immediate and early postoperative complications). Of the 64 children randomized, 54 (84%) had VF. Both groups were comparable in demography, clinical profile and age at surgery. The incidence of wound dehiscence (39.4% vs. 18.2%; P = 0.04), immediate postoperative complications (51.5% vs. 12.9%; P = 0.001), and early postoperative complications (42.4% vs. 12.9%; P = 0.01) was significantly higher in Group I as compared to Group II. Six of 13 children (46.2%) with dehiscence in Group I required a diverting colostomy. Females with VF or PF undergoing a primary definitive procedure have a significantly higher incidence of wound dehiscence (P = 0.04) and of immediate (P = 0.001) and early (P = 0.01) postoperative complications.
Cicinelli, Ettore; Trojano, Giuseppe; Mastromauro, Marcella; Vimercati, Antonella; Marinaccio, Marco; Mitola, Paola Carmela; Resta, Leonardo; de Ziegler, Dominique
2017-08-01
To evaluate the association between endometriosis and chronic endometritis (CE) diagnosed by hysteroscopy, conventional histology, and immunohistochemistry. Case-control study. University hospital. Women with and without endometriosis who had undergone hysterectomy. Retrospective evaluation of 78 women who had undergone hysterectomy and were affected by endometriosis and 78 women without endometriosis. CE diagnosed based on conventional histology and immunohistochemistry with anti-syndecan-1 antibodies to identify CD138 cells. The prevalence of CE was statistically significantly higher in the women with endometriosis as compared with the women who did not have endometriosis (33 of 78, 42.3% vs. 12 of 78, 15.4% according to hysteroscopy; and 30 of 78, 38.5% vs. 11 of 78, 14.1% according to histology). The women were divided into two groups, 115 patients without CE and 41 patients with CE. With univariate analysis, parity was associated with a lower risk for CE, and endometriosis was associated with a statistically significantly elevated risk of CE. Using multivariate analysis, parity continued to be associated with a lower incidence of CE, whereas endometriosis was associated with a 2.7-fold higher risk. The diagnosis of CE is more frequent in women with endometriosis. Although no etiologic relationship between CE and endometriosis can be established, this study suggests that CE should be considered and if necessary ruled out in women with endometriosis, particularly if they have abnormal uterine bleeding. Identification and appropriate treatment of CE may avoid unnecessary surgery. Copyright © 2017 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.
Duning, Thomas; Kellinghaus, Christoph; Mohammadi, Siawoosh; Schiffbauer, Hagen; Keller, Simon; Ringelstein, E Bernd; Knecht, Stefan; Deppe, Michael
2010-02-01
Conventional structural MRI fails to identify a cerebral lesion in 25% of patients with cryptogenic partial epilepsy (CPE). Diffusion tensor imaging is an MRI technique sensitive to microstructural abnormalities of cerebral white matter (WM) through quantification of fractional anisotropy (FA). The objective of the present study was to identify focal FA abnormalities in patients with CPE who were deemed MRI negative during routine presurgical evaluation. Diffusion tensor imaging at 3 T was performed in 12 patients with CPE and normal conventional MRI and in 67 age-matched healthy volunteers. WM integrity was compared between groups on the basis of automated voxel-wise statistics of FA maps using an analysis of covariance. Volumetric measurements from high resolution T1-weighted images were also performed. Significant FA reductions in WM regions encompassing diffuse areas of the brain were observed when all patients as a group were compared with controls. On an individual basis, voxel-based analyses revealed widespread symmetrical FA reduction in CPE patients. Furthermore, asymmetrical temporal lobe FA reduction was consistently ipsilateral to the electroclinical focus. No significant correlations were found between FA alterations and clinical data. There were no differences in brain volumes of CPE patients compared with controls. Despite normal conventional MRI, WM integrity abnormalities in CPE patients extend far beyond the epileptogenic zone. Given that unilateral temporal lobe FA abnormalities were consistently observed ipsilateral to the seizure focus, analysis of temporal FA may provide an informative in vivo investigation into the localisation of the epileptogenic zone in MRI negative patients.
Shah, Farhan Khalid; Gebreel, Ashraf; Elshokouki, Ali hamed; Habib, Ahmed Ali
2012-01-01
PURPOSE To compare the changes in occlusal vertical dimension, masseter muscle activity and biting force after insertion of immediate mandibular complete dentures constructed as conventional, tooth-supported and implant-supported prostheses. MATERIALS AND METHODS Patients were selected and treated with all three different concepts, i.e., immediate dentures constructed as conventional (Group A), tooth-supported (Group B) and implant-supported (Group C) immediate mandibular complete dentures. Parameters of evaluation and comparison were occlusal vertical dimension measured by radiograph (at three different time intervals), masseter muscle electromyographic (EMG) measurement by EMG analysis (at three different jaw positions) and bite force measured by force transducer (at two different time intervals). The obtained data were statistically analyzed using the ANOVA F-test at a 5% level of significance. If the F-test was significant, the Least Significant Difference test was performed to test further significant differences between variables. RESULTS Comparison between mean differences in occlusal vertical dimension for tested groups was only statistically significant at 1 year after immediate denture insertion. Comparison between mean differences in wavelet packet coefficients of the electromyographic signals of the masseter muscles for tested groups was not significant at rest position, but was significant at initial contact position and maximum voluntary clench position. Comparison between mean differences in maximum biting force for tested groups was not statistically significant at the 5% level of significance. CONCLUSION Immediate complete overdentures, whether tooth- or implant-supported, are recommended over totally mucosa-supported prostheses. PMID:22737309
A Comparison of Spatial Statistical Methods in a School Finance Policy Context
ERIC Educational Resources Information Center
Slagle, Mike
2010-01-01
A shortcoming of the conventional ordinary least squares (OLS) approaches for estimating median voter models of education demand is the inability to more fully explain the spatial relationships between neighboring school districts. Consequently, two school districts that appear to be descriptively similar in terms of conventional measures of…
Adaptation of zirconia crowns created by conventional versus optical impression: in vitro study
Bahrami, Babak; Fossoyeux, Inès; Atash, Ramin
2017-01-01
PURPOSE The aim of this study was to compare the precision of optical impression (Trios, 3Shape) versus that of conventional impression (Imprint IV, 3M-ESPE) with three different margins (shoulder, chamfer, and knife-edge) on Frasaco teeth. MATERIALS AND METHODS The sample comprised 60 zirconia half-crowns, divided into six groups according to the type of impression and margin. Scanning electron microscopy enabled us to analyze the gap between the zirconia crowns and the Frasaco teeth, using ImageJ software, based on eight reproducible and standardized measuring points. RESULTS No statistically significant difference was found between conventional impressions and optical impressions, except at two of the eight points. A statistically significant difference was observed between the three margin types; the chamfer and knife-edge finishing lines appeared to offer better adaptation results than the shoulder margin. CONCLUSION Zirconia crowns created from optical impression and those created from conventional impression present similar adaptation. While offering identical results, the former have many advantages. In view of our findings, we believe the chamfer margin should be favored. PMID:28680553
[Digital radiography in young children. Considerations based on experiences in practice].
Berkhout, W E R; Mileman, P A; Weerheijm, K L
2004-10-01
In dentistry, digital radiology techniques, such as the charge-coupled device and the storage phosphor plate, are gaining popularity. The objective of this study was to assess the importance of the advantages and disadvantages of digital radiology techniques for bitewing radiography in young children, compared to conventional film. A group of dentists received a questionnaire regarding their experiences with digital radiology techniques or conventional films among young children. Using the Simple Multi-Attributive Rating Technique (SMART), a final weighted score was calculated for the charge-coupled device, the phosphor plate, and conventional film. The scores were 7.40, 7.38, and 6.98 respectively. The differences were not statistically significant (p > 0.47). It could be concluded that, on the basis of experiences in practice, there are no statistically significant preferences for the use of digital radiology techniques for bitewing radiography in young children.
Bayesian statistics in radionuclide metrology: measurement of a decaying source
NASA Astrophysics Data System (ADS)
Bochud, François O.; Bailat, Claude J.; Laedermann, Jean-Pascal
2007-08-01
The most intuitive way of defining a probability is perhaps through the frequency at which it appears when a large number of trials are realized in identical conditions. The probability derived from the obtained histogram characterizes the so-called frequentist or conventional statistical approach. In this sense, probability is defined as a physical property of the observed system. By contrast, in Bayesian statistics, a probability is not a physical property or a directly observable quantity, but a degree of belief or an element of inference. The goal of this paper is to show how Bayesian statistics can be used in radionuclide metrology and what its advantages and disadvantages are compared with conventional statistics. This is performed through the example of an yttrium-90 source typically encountered in environmental surveillance measurement. Because of the very low activity of this kind of source and the small half-life of the radionuclide, this measurement takes several days, during which the source decays significantly. Several methods are proposed to compute simultaneously the number of unstable nuclei at a given reference time, the decay constant and the background. Asymptotically, all approaches give the same result. However, Bayesian statistics produces coherent estimates and confidence intervals in a much smaller number of measurements. Apart from the conceptual understanding of statistics, the main difficulty that could deter radionuclide metrologists from using Bayesian statistics is the complexity of the computation.
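The measurement model lends itself to a compact sketch: counts in each counting window are Poisson with a mean combining a constant background b with the decays expected from N0 nuclei of decay constant lambda. The code below, with invented data and flat priors on the log-parameters, finds the posterior mode with scipy; the paper's full Bayesian treatment characterises the entire posterior rather than just its mode.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln

dt = 0.5                          # window length (days)
t = np.arange(0, 10.0, dt)        # window start times
rng = np.random.default_rng(1)
N0, lam, b = 5e4, np.log(2) / 2.67, 40.0   # Y-90 half-life is about 2.67 d
mu = b * dt + N0 * (np.exp(-lam * t) - np.exp(-lam * (t + dt)))
counts = rng.poisson(mu)          # simulated window counts

def neg_log_post(theta):
    # log-parameterisation keeps N0, lambda and the background positive
    n0, la, bg = np.exp(theta)
    m = bg * dt + n0 * (np.exp(-la * t) - np.exp(-la * (t + dt)))
    return -(counts * np.log(m) - m - gammaln(counts + 1)).sum()

fit = minimize(neg_log_post, x0=np.log([1e4, 0.3, 10.0]), method="Nelder-Mead")
print(np.exp(fit.x))              # posterior mode for (N0, lambda, b)
```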
Ying, Xiao-Ming; Jiang, Yong-Liang; Xu, Peng; Wang, Peng; Zhu, Bo; Guo, Shao-Qing
2016-08-25
To conduct a meta-analysis of studies comparing the therapeutic effect and safety of microendoscopic discectomy with those of conventional open discectomy in the treatment of lumbar disc herniation in China. A systematic literature retrieval was conducted in the Chinese Biomedicine Database, CNKI database, Chongqing VIP database and Wanfang database. The statistical analysis was performed using RevMan 4.2 software. The comparison included excellent rate, operation time, blood loss, periods of bed rest and of resuming daily activities, hospital stay or hospital stay after surgery, and complications of microendoscopic discectomy versus conventional open discectomy. The search yielded 20 reports, which included 2 957 cases treated by microendoscopic discectomy and 2 130 cases treated by conventional open discectomy. There were 12, 11, 7, 5, 4 and 4 reports comparing operation time, blood loss, period of bed rest, period of resuming daily activities, hospital stay and hospital stay after surgery, respectively. Complications were mentioned in 10 reports. Compared with patients treated by open discectomy, patients treated by microendoscopic discectomy had a higher excellent rate [OR=1.29, 95%CI (1.03, 1.62)], less blood loss [OR=-63.67, 95%CI (-86.78, -40.55)], a shorter period of bed rest [OR=-15.33, 95%CI (-17.76, -12.90)], a shorter period of resumption of daily activities [OR=-24.41, 95%CI (-36.86, -11.96)], and a shorter hospital stay [OR=-5.00, 95%CI (-6.94, -3.06)] or hospital stay after surgery [OR=-7.47, 95%CI (-9.17, -5.77)], respectively. However, the incidence of complications and operation time showed no significant difference between microendoscopic discectomy and open discectomy. Microendoscopic discectomy and conventional open discectomy in the treatment of lumbar disc herniation are both safe and effective, with similar complication rates. Patients with lumbar disc herniation treated by microendoscopic discectomy have less blood loss, shorter periods of bed rest and hospital stay, and resume daily activities faster. Techniques should be selected according to indications; microendoscopic discectomy should be carried out when its indications are met.
Williams, L. Keoki; Buu, Anne
2017-01-01
We propose a multivariate genome-wide association test for mixed continuous, binary, and ordinal phenotypes. A latent response model is used to estimate the correlation between phenotypes with different measurement scales so that the empirical distribution of the Fisher’s combination statistic under the null hypothesis is estimated efficiently. The simulation study shows that our proposed correlation estimation methods have high levels of accuracy. More importantly, our approach conservatively estimates the variance of the test statistic so that the type I error rate is controlled. The simulation also shows that the proposed test maintains the power at the level very close to that of the ideal analysis based on known latent phenotypes while controlling the type I error. In contrast, conventional approaches–dichotomizing all observed phenotypes or treating them as continuous variables–could either reduce the power or employ a linear regression model unfit for the data. Furthermore, the statistical analysis on the database of the Study of Addiction: Genetics and Environment (SAGE) demonstrates that conducting a multivariate test on multiple phenotypes can increase the power of identifying markers that may not be, otherwise, chosen using marginal tests. The proposed method also offers a new approach to analyzing the Fagerström Test for Nicotine Dependence as multivariate phenotypes in genome-wide association studies. PMID:28081206
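The combination step itself is compact: per-phenotype association p-values at a marker are pooled as X = -2 * sum(log p_i). Because the phenotypes are correlated, the nominal chi-squared null does not hold exactly, which is why the authors estimate the null empirically; the sketch below approximates that idea with a permutation null on simulated data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 500
genotype = rng.integers(0, 3, n)  # 0/1/2 minor-allele counts at one marker
phenos = rng.multivariate_normal([0, 0], [[1, 0.6], [0.6, 1]], n)  # correlated traits

def fisher_stat(g, P):
    pvals = [stats.linregress(g, P[:, j]).pvalue for j in range(P.shape[1])]
    return -2 * np.log(pvals).sum()

obs = fisher_stat(genotype, phenos)
# Permuting genotypes breaks any genotype-phenotype association while
# preserving the correlation structure among phenotypes.
null = np.array([fisher_stat(rng.permutation(genotype), phenos)
                 for _ in range(999)])
print("empirical p =", (1 + (null >= obs).sum()) / 1000)
```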
Makeyev, Oleksandr; Joe, Cody; Lee, Colin; Besio, Walter G
2017-07-01
Concentric ring electrodes have shown promise in non-invasive electrophysiological measurement demonstrating their superiority to conventional disc electrodes, in particular, in accuracy of Laplacian estimation. Recently, we have proposed novel variable inter-ring distances concentric ring electrodes. Analytic and finite element method modeling results for linearly increasing distances electrode configurations suggested they may decrease the truncation error resulting in more accurate Laplacian estimates compared to currently used constant inter-ring distances configurations. This study assesses statistical significance of Laplacian estimation accuracy improvement due to novel variable inter-ring distances concentric ring electrodes. Full factorial design of analysis of variance was used with one categorical and two numerical factors: the inter-ring distances, the electrode diameter, and the number of concentric rings in the electrode. The response variables were the Relative Error and the Maximum Error of Laplacian estimation computed using a finite element method model for each of the combinations of levels of three factors. Effects of the main factors and their interactions on Relative Error and Maximum Error were assessed and the obtained results suggest that all three factors have statistically significant effects in the model confirming the potential of using inter-ring distances as a means of improving accuracy of Laplacian estimation.
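A minimal sketch of that design: a full factorial ANOVA of Laplacian estimation error on the three factors and their interactions. The data frame and column names are assumed stand-ins for the finite element model outputs.

```python
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

df = pd.read_csv("laplacian_fem_results.csv")  # hypothetical FEM output table

# C(distances): categorical factor (constant vs. linearly increasing);
# the * operator expands to main effects plus all interactions.
model = smf.ols("relative_error ~ C(distances) * diameter * n_rings",
                data=df).fit()
print(anova_lm(model, typ=2))
```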
Electrofacies analysis for coal lithotype profiling based on high-resolution wireline log data
NASA Astrophysics Data System (ADS)
Roslin, A.; Esterle, J. S.
2016-06-01
The traditional approach to coal lithotype analysis is based on a visual characterisation of coal in core, mine or outcrop exposures. As not all wells are fully cored, the petroleum and coal mining industries increasingly use geophysical wireline logs for lithology interpretation. This study demonstrates a method for interpreting coal lithotypes from geophysical wireline logs, and in particular for discriminating between bright or banded coal and dull coal at similar densities, at a decimetre level. The study explores the optimum combination of geophysical log suites for training the coal electrofacies interpretation, using a neural-network approach, and then propagating the results to wells with fewer wireline data. This approach is objective and has a recordable reproducibility and rule set. In addition to conventional gamma ray and density logs, laterolog resistivity, microresistivity and PEF data were used in the study. Array resistivity data from a compact micro imager (CMI tool) were processed into a single microresistivity curve and integrated with the conventional resistivity data in the cluster analysis. Microresistivity data were included to test the hypothesis that the improved vertical resolution of the microresistivity curve can enhance the accuracy of the clustering analysis. The addition of the PEF log allowed discrimination between low density bright to banded coal electrofacies and low density inertinite-rich dull electrofacies. The results of the clustering analysis were validated statistically, and the electrofacies results were compared to manually derived coal lithotype logs.
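As a rough illustration of that workflow, the sketch below clusters standardized wireline-log curves into electrofacies classes. Plain k-means stands in for the neural-network clustering used in the study, and the file and column names are assumptions.

```python
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

logs = pd.read_csv("wireline_logs.csv")  # hypothetical depth-indexed curves
curves = ["gamma", "density", "resistivity", "microres", "pef"]
X = StandardScaler().fit_transform(logs[curves])

# Each cluster is a candidate electrofacies, to be labelled against
# cored intervals (e.g. bright/banded vs. dull coal, non-coal).
logs["electrofacies"] = KMeans(n_clusters=5, n_init=10,
                               random_state=0).fit_predict(X)
print(logs.groupby("electrofacies")[curves].mean())
```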
Rai, Arpita; Acharya, Ashith B.; Naikmasur, Venkatesh G.
2016-01-01
Background: Age estimation of living or deceased individuals is an important aspect of forensic sciences. Conventionally, the pulp-to-tooth area ratio (PTR) measured from periapical radiographs has been utilized as a nondestructive method of age estimation. Cone-beam computed tomography (CBCT) is a new method to acquire three-dimensional images of the teeth in living individuals. Aims: The present study investigated age estimation based on PTR of the maxillary canines measured in three planes obtained from CBCT image data. Settings and Design: Sixty subjects aged 20–85 years were included in the study. Materials and Methods: For each tooth, mid-sagittal, mid-coronal, and three axial sections—cementoenamel junction (CEJ), one-fourth root level from CEJ, and mid-root—were assessed. PTR was calculated using AutoCAD software after outlining the pulp and tooth. Statistical Analysis Used: All statistical analyses were performed using SPSS 17.0 software. Results and Conclusions: Linear regression analysis showed that only PTR in the axial plane at the CEJ had a significant age correlation (r = 0.32; P < 0.05). This is probably because of the clearer demarcation of the pulp and tooth outlines at this level. PMID:28123269
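The reported analysis is a simple linear regression of age on PTR; a sketch with invented values follows (the sign and magnitude of the fit will differ from the study's data).

```python
import numpy as np
from scipy import stats

# Invented PTR values at the CEJ axial section and corresponding ages
ptr_cej = np.array([0.18, 0.15, 0.12, 0.10, 0.08, 0.06, 0.05])
age = np.array([24, 31, 42, 50, 61, 72, 83])

res = stats.linregress(ptr_cej, age)
print(f"r = {res.rvalue:.2f}, p = {res.pvalue:.4f}")
print(f"age = {res.intercept:.1f} + {res.slope:.1f} * PTR")
```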
Manual tracing versus smartphone application (app) tracing: a comparative study.
Sayar, Gülşilay; Kilinc, Delal Dara
2017-11-01
This study aimed to compare the results of conventional manual cephalometric tracing with those acquired with smartphone application cephalometric tracing. The cephalometric radiographs of 55 patients (25 females and 30 males) were traced via the manual and app methods and were subsequently examined with Steiner's analysis. Five skeletal measurements, five dental measurements and two soft tissue measurements were taken based on 21 landmarks. The durations of the two methods were also compared. SNA (Sella, Nasion, A point angle) and SNB (Sella, Nasion, B point angle) values for the manual method were statistically lower (p < .001) than those for the app method. The ANB value for the manual method was statistically lower than that of the app method. L1-NB (°) and upper lip protrusion values for the manual method were statistically higher than those for the app method. Go-GN/SN, U1-NA (°) and U1-NA (mm) values for the manual method were statistically lower than those for the app method. No differences between the two methods were found in the L1-NB (mm), occlusal plane to SN, interincisal angle or lower lip protrusion values. Although statistically significant differences were found between the two methods, cephalometric tracing proceeded faster with the app method than with the manual method.
Datta, Rakesh; Datta, Karuna; Venkatesh, M D
2015-07-01
The classical didactic lecture has been the cornerstone of theoretical undergraduate medical education. Its efficacy, however, is reduced by limited interaction and the short attention span of students. It is hypothesized that the interactive response pad obviates some of these drawbacks. The aim of this study was to evaluate the effectiveness of an interactive response system by comparing it with conventional classroom teaching. A prospective comparative longitudinal study was conducted on 192 students who were exposed to either conventional or interactive teaching over 20 classes. Pre-test, post-test and retention test (8-12 weeks later) scores were collated and statistically analysed. An independent observer counted student interactions in each class. Pre-test scores from both groups were similar (p = 0.71). There was significant improvement in post-test scores compared to pre-test scores with either method (p < 0.001). The interactive post-test score was better than the conventional post-test score (p < 0.001) by 8-10% (mean difference 9.24%, 95% CI 8.2-10.3%). The interactive retention test score was better than the conventional retention test score (p < 0.001) by 15-18% (mean difference 16.64%, 95% CI 15.0-18.2%). There were 51 participative events in the interactive group vs 25 in the conventional group. The interactive response pad method was efficacious in teaching. Students taught with the interactive method were likely to score 8-10% higher (statistically significant) immediately after class and 15-18% higher (statistically significant) after 8-12 weeks. The number of student-teacher interactions increases when using interactive response pads.
Shokry, Mohamed; Aboelsaad, Nayer
2016-01-01
The purpose of this study was to test the effect of the surgical removal of impacted mandibular third molars using piezosurgery versus the conventional surgical technique on postoperative sequelae and bone healing. Material and Methods. This study was carried out as a randomized controlled clinical trial with a split-mouth design. Twenty patients with bilateral mandibular third molar mesioangular impaction, class II position B, indicated for surgical extraction were treated randomly using either the piezosurgery or the conventional bur technique on each site. Duration of the procedure, postoperative edema, trismus, pain, healing, and bone density and quantity were evaluated up to 6 months postoperatively. Results. Test and control sites were compared using paired t-tests. Pain and swelling were significantly reduced at test sites, whereas procedure time was significantly longer at test sites. For bone quantity and quality, statistically significant differences were found, with test sites showing better results. Conclusion. The piezosurgery technique improves patients' quality of life in the form of decreased postoperative pain, trismus, and swelling. Furthermore, it enhances bone quality within the extraction socket and bone quantity along the distal aspect of the mandibular second molar. PMID:27597866
Fu, Huichao; Wang, Jiaxing; Zhou, Shenyuan; Cheng, Tao; Zhang, Wen; Wang, Qi; Zhang, Xianlong
2015-11-01
There is rising interest in the use of patient-specific instrumentation (PSI) during total knee arthroplasty (TKA). The goal of this meta-analysis was to compare PSI with conventional instrumentation (CI) in patients undergoing TKA. A literature search was performed in PubMed, Embase, Springer, Ovid, China National Knowledge Infrastructure, and the Cochrane Library. A total of 10 randomized controlled studies involving 837 knees comparing outcomes of PSI TKAs with CI TKAs were included in the present analysis. Outcomes of interest included component alignment, surgical time, blood loss, and hospital stay. The results showed no significant differences between the two instrumentations in terms of restoring a neutral mechanical axis and femoral component placement. However, differences were noted regarding the alignment of the tibial component in the coronal and sagittal planes. Surgical time was 3 minutes shorter in PSI patients. Based on these findings, PSI appeared not to be superior to CI in terms of the post-operative mechanical axis of the limb or femoral component placement. Despite a statistical difference in operative duration, the benefit of a small reduction in surgical time with PSI is clinically irrelevant. Therapeutic study (systematic review and meta-analysis), Level I.
Meta-analysis of Microbial Fuel Cells Using Waste Substrates.
Dowdy, F Ryan; Kawakita, Ryan; Lange, Matthew; Simmons, Christopher W
2018-05-01
Microbial fuel cell experimentation using waste streams is an increasingly popular field of study. One obstacle to comparing studies has been the lack of consistent conventions for reporting results such that meta-analysis can be used for large groups of experiments. Here, 134 unique microbial fuel cell experiments using waste substrates were compiled for analysis. Findings include that coulombic efficiency correlates positively with volumetric power density (p < 0.001), negatively with working volume (p < 0.05), and positively with percentage removal of chemical oxygen demand (p < 0.005). Power density in mW/m² correlates positively with chemical oxygen demand loading (p < 0.005), and positively with maximum open-circuit voltage (p < 0.05). Finally, single-chamber versus double-chamber reactor configurations differ significantly in maximum open-circuit voltage (p < 0.005). Multiple linear regression to predict either power density or maximum open-circuit voltage produced no significant models due to the amount of multicollinearity between predictor variables. Results indicate that statistically relevant conclusions can be drawn from large microbial fuel cell datasets. Recommendations for future consistency in reporting results following a MIAMFCE convention (Minimum Information About a Microbial Fuel Cell Experiment) are included.
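The core of such a meta-analysis is a correlation scan over the compiled experiment table; a sketch under assumed column names follows.

```python
import pandas as pd
from scipy.stats import spearmanr

df = pd.read_csv("mfc_experiments.csv")  # hypothetical 134-row compilation

pairs = [("coulombic_eff", "power_density_vol"),
         ("coulombic_eff", "working_volume"),
         ("coulombic_eff", "cod_removal_pct"),
         ("power_density_area", "cod_loading")]
for a, b in pairs:
    rho, p = spearmanr(df[a], df[b], nan_policy="omit")
    print(f"{a} vs {b}: rho = {rho:.2f}, p = {p:.4f}")
```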
Subperiosteal preparation using a new piezoelectric device: a histological examination.
Stoetzer, Marcus; Magel, Anja; Kampmann, Andreas; Lemound, Juliana; Gellrich, Nils-Claudius; von See, Constantin
2014-01-01
Subperiosteal preparation using a periosteal elevator leads to disturbances of local immunohistochemistry and periosteal histology due to microtrauma. Usually, soft-tissue damage can be considerably reduced by using piezoelectric technology. For this reason, the effects of a novel piezoelectric device on immunohistochemistry and periosteal histology were examined and compared to conventional preparation of the periosteum using a periosteal elevator. Lewis rats were randomly assigned to one of five groups (n=50). Subperiosteal preparation was performed using either a piezoelectric device or a periosteal elevator. Immunohistochemical and histological analyses were performed immediately after preparation as well as three and eight days postoperatively. A statistical analysis of the histological colouring was performed offline using analysis of variance (ANOVA) on ranks (p<0.05). At all times, immunohistochemical and histological analysis demonstrated a significantly more homogeneous tissue structure in the group of rats that underwent piezosurgery than in the group of rats that underwent treatment with a periosteal elevator. The use of a piezoelectric device for subperiosteal preparation is associated with more harmonious immunohistochemical and histological results for the periosteum than the use of a conventional periosteal elevator. As a result, piezoelectric devices can be expected to have a positive effect primarily on soft tissue, in particular the periosteum as well as surrounding tissues.
Does the extended Glasgow Outcome Scale add value to the conventional Glasgow Outcome Scale?
Weir, James; Steyerberg, Ewout W; Butcher, Isabella; Lu, Juan; Lingsma, Hester F; McHugh, Gillian S; Roozenbeek, Bob; Maas, Andrew I R; Murray, Gordon D
2012-01-01
The Glasgow Outcome Scale (GOS) is firmly established as the primary outcome measure for use in Phase III trials of interventions in traumatic brain injury (TBI). However, the GOS has been criticized for its lack of sensitivity to detect small but clinically relevant changes in outcome. The Glasgow Outcome Scale-Extended (GOSE) potentially addresses this criticism, and in this study we estimate the efficiency gain associated with using the GOSE in place of the GOS in ordinal analysis of 6-month outcome. The study uses both simulation and the reanalysis of existing data from two completed TBI studies, one an observational cohort study and the other a randomized controlled trial. As expected, the results show that using an ordinal technique to analyze the GOS gives a substantial gain in efficiency relative to the conventional analysis, which collapses the GOS onto a binary scale (favorable versus unfavorable outcome). We also found that using the GOSE gave a modest but consistent increase in efficiency relative to the GOS in both studies, corresponding to a reduction in the required sample size of the order of 3-5%. We recommend that the GOSE be used in place of the GOS as the primary outcome measure in trials of TBI, with an appropriate ordinal approach being taken to the statistical analysis.
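The contrast the study draws can be sketched directly: an ordinal (proportional-odds) model over all eight GOSE categories versus a logistic model on the collapsed binary outcome. The data frame, covariates, and the GOSE >= 5 favourable split are assumptions for illustration.

```python
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.miscmodels.ordinal_model import OrderedModel

df = pd.read_csv("tbi_outcomes.csv")  # hypothetical trial data

# Ordinal analysis keeps all 8 GOSE levels (numeric codes are treated
# as ordered categories by OrderedModel).
ordinal = OrderedModel(df["gose"], df[["treated", "age"]],
                       distr="logit").fit(method="bfgs")
print(ordinal.summary())

# Conventional binary analysis discards within-category information.
df["favourable"] = (df["gose"] >= 5).astype(int)
binary = smf.logit("favourable ~ treated + age", data=df).fit()
print(binary.summary())
```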
Lee, Seung Hyun; Lee, Young Han; Hahn, Seok; Yang, Jaemoon; Song, Ho-Taek; Suh, Jin-Suck
2017-01-01
Background Synthetic magnetic resonance imaging (MRI) allows reformatting of various synthetic images by adjustment of scanning parameters such as repetition time (TR) and echo time (TE). Optimized MR images can be reformatted from T1, T2, and proton density (PD) values to achieve maximum tissue contrast between joint fluid and adjacent soft tissue. Purpose To demonstrate the method for optimization of TR and TE by synthetic MRI and to validate the optimized images by comparison with conventional shoulder MR arthrography (MRA) images. Material and Methods Thirty-seven shoulder MRA images acquired by synthetic MRI were retrospectively evaluated for PD, T1, and T2 values at the joint fluid and glenoid labrum. Differences in signal intensity between the fluid and labrum were observed between TR of 500-6000 ms and TE of 80-300 ms in T2-weighted (T2W) images. Conventional T2W and synthetic images were analyzed for diagnostic agreement of supraspinatus tendon abnormalities (kappa statistics) and image quality scores (one-way analysis of variance with post-hoc analysis). Results Optimized mean values of TR and TE were 2724.7 ± 1634.7 and 80.1 ± 0.4, respectively. Diagnostic agreement for supraspinatus tendon abnormalities between conventional and synthetic MR images was excellent (κ = 0.882). The mean image quality score of the joint space in optimized synthetic images was significantly higher compared with those in conventional and synthetic images (2.861 ± 0.351 vs. 2.556 ± 0.607 vs. 2.750 ± 0.439; P < 0.05). Conclusion Synthetic MRI with optimized TR and TE for shoulder MRA enables optimization of soft-tissue contrast.
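The optimisation step invites a worked sketch: with PD, T1, and T2 measured per tissue, a synthetic spin-echo T2W signal S = PD * (1 - exp(-TR/T1)) * exp(-TE/T2) can be evaluated over the TR/TE grid quoted above and the fluid-labrum contrast maximised. The tissue values below are illustrative assumptions, as is the simple spin-echo equation standing in for the scanner's synthesis model.

```python
import numpy as np

def signal(pd_, t1, t2, tr, te):
    # Simplified spin-echo signal equation (times in ms)
    return pd_ * (1 - np.exp(-tr / t1)) * np.exp(-te / t2)

fluid = dict(pd_=100.0, t1=3000.0, t2=800.0)   # assumed joint-fluid values
labrum = dict(pd_=70.0, t1=1000.0, t2=40.0)    # assumed labrum values

TR = np.arange(500, 6001, 100)[:, None]        # grid from the study's range
TE = np.arange(80, 301, 5)[None, :]
contrast = signal(**fluid, tr=TR, te=TE) - signal(**labrum, tr=TR, te=TE)
i, j = np.unravel_index(np.argmax(contrast), contrast.shape)
print(f"optimal TR = {TR[i, 0]} ms, TE = {TE[0, j]} ms")
```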
Mitra, Sumita B; Oxman, Joe D; Falsafi, Afshin; Ton, Tiffany T
2011-12-01
To compare the long-term fluoride release kinetics of a novel nano-filled two-paste resin-modified glass-ionomer (RMGI), Ketac Nano (KN), with that of two powder-liquid resin-modified glass-ionomers, Fuji II LC (FLC) and Vitremer (VT), and one conventional glass-ionomer, Fuji IX (FIX). Fluoride release was measured in vitro using ion-selective electrodes. Kinetic analysis was done using regression analysis and compared with existing models for GIs and compomers. In a separate experiment, samples of KN and two conventional glass-ionomers, FIX and Ketac Molar (KM), were subjected to treatment with an external fluoride source (Oral-B Neutra-Foam) after 3 months of fluoride release, and the recharge behavior was studied for an additional 7-day period. The cumulative amount of fluoride released from KN, VT and FLC and the release profiles were statistically similar but greater than that for FIX at P < 0.05. All four materials, including KN, showed a burst of fluoride ions at shorter times (t) and an overall rate dependence on t^1/2, typical for glass-ionomers. The coating of KN with its primer and of DY with its adhesive did not significantly alter the fluoride release behavior of the respective materials. The overall rate for KN was significantly higher than for the compomer DY. DY showed a linear rate of release vs. t and no burst effect, as expected for compomers. The nanoionomer KN showed fluoride recharge behavior similar to the conventional glass-ionomers FIX and KM. Thus, it was concluded that the new RMGI KN exhibits fluoride ion release behavior similar to typical conventional and resin-modified glass-ionomers and that the primer does not impede the release of fluoride.
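The kinetic form described above (an early burst plus a t^1/2 diffusion term) can be fitted directly; the sketch below uses invented cumulative-release data and an assumed two-term model, not the paper's regression.

```python
import numpy as np
from scipy.optimize import curve_fit

t = np.array([1, 3, 7, 14, 30, 60, 90])                 # days
F = np.array([5.3, 9.7, 15.3, 22.9, 36.4, 56.7, 74.4])  # cumulative release (invented)

def release(t, a, b):
    return a * np.sqrt(t) + b * t   # diffusion (sqrt term) + steady dissolution

(a, b), _ = curve_fit(release, t, F)
print(f"F(t) = {a:.2f}*sqrt(t) + {b:.3f}*t")
```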
Final visual acuity results in the early treatment for retinopathy of prematurity study.
Good, William V; Hardy, Robert J; Dobson, Velma; Palmer, Earl A; Phelps, Dale L; Tung, Betty; Redford, Maryann
2010-06-01
To compare visual acuity at 6 years of age in eyes that received early treatment for high-risk prethreshold retinopathy of prematurity (ROP) with conventionally managed eyes. Infants with symmetrical, high-risk prethreshold ROP (n = 317) had one eye randomized to earlier treatment at high-risk prethreshold disease and the other eye managed conventionally, treated if ROP progressed to threshold severity. For asymmetric cases (n = 84), the high-risk prethreshold eye was randomized to either early treatment or conventional management. The main outcome measure was ETDRS visual acuity measured at 6 years of age by masked testers. Retinal structure was assessed as a secondary outcome. Analysis of all subjects with high-risk prethreshold ROP showed no statistically significant benefit for early treatment (24.3% vs 28.6% [corrected] unfavorable outcome; P = .15). Analysis of 6-year visual acuity results according to the Type 1 and 2 clinical algorithm showed a benefit for Type 1 eyes (25.1% vs 32.8%; P = .02) treated early but not Type 2 eyes (23.6% vs 19.4%; P = .37). Early-treated eyes showed a significantly better structural outcome compared with conventionally managed eyes (8.9% vs 15.2% unfavorable outcome; P < .001), with no greater risk of ocular complications. Early treatment for Type 1 high-risk prethreshold eyes improved visual acuity outcomes at 6 years of age. Early treatment for Type 2 high-risk prethreshold eyes did not. Application to Clinical Practice: Type 1 eyes, not Type 2 eyes, should be treated early. These results are particularly important considering that 52% of Type 2 high-risk prethreshold eyes underwent regression of ROP without requiring treatment. Trial Registration: clinicaltrials.gov Identifier: NCT00027222.
Atalay, Altay; Koc, Ayse Nedret; Suel, Ahmet; Sav, Hafize; Demir, Gonca; Elmali, Ferhan; Cakir, Nuri; Seyedmousavi, Seyedmojtaba
2016-09-01
Aspergillus species cause a wide range of diseases in humans, including allergies, localized infections, and fatal disseminated diseases. Rapid detection and identification of Aspergillus spp. facilitate effective patient management. In the current study we compared conventional morphological methods with PCR sequencing, rep-PCR, and matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF-MS) for the identification of Aspergillus strains. A total of 24 consecutive clinical isolates of Aspergillus were collected during 2012-2014. Conventional morphology and rep-PCR were performed in our Mycology Laboratory. The identification, evaluation, and reporting of strains using MALDI-TOF-MS were performed by BioMérieux Diagnostic, Inc. in Istanbul. DNA sequence analysis of the clinical isolates was performed by the BMLabosis laboratory in Ankara. Samples consisted of 18 (75%) lower respiratory tract specimens, 3 (12.5%) otomycosis ear tissues, 1 sample from keratitis, and 1 sample from a cutaneous wound. According to DNA sequence analysis, 12 (50%) specimens were identified as A. fumigatus, 8 (33.3%) as A. flavus, 3 (12.5%) as A. niger, and 1 (4.2%) as A. terreus. Statistically, there was good agreement of conventional morphology, rep-PCR, and MALDI-TOF-MS, respectively, with the sequencing method; kappa values were κ = 0.869, 0.871, and 0.916 (P < 0.001). The good level of agreement between the methods included in the present study and the sequencing method could be due to the identification of commonly encountered Aspergillus strains. Therefore, it was concluded that studies with a higher number of isolates, including other Aspergillus strains, are required. © 2016 Wiley Periodicals, Inc.
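Agreement of each method with the sequencing reference is quantified with Cohen's kappa; a minimal sketch with invented labels follows.

```python
from sklearn.metrics import cohen_kappa_score

# Reference identifications for the 24 isolates (per the species counts above)
sequencing = (["fumigatus"] * 12 + ["flavus"] * 8
              + ["niger"] * 3 + ["terreus"])
maldi_tof = sequencing.copy()
maldi_tof[0] = "flavus"  # one invented disagreement for illustration

print(cohen_kappa_score(sequencing, maldi_tof))
```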
Three Dimensional CFD Analysis of the GTX Combustor
NASA Technical Reports Server (NTRS)
Steffen, C. J., Jr.; Bond, R. B.; Edwards, J. R.
2002-01-01
The annular combustor geometry of a combined-cycle engine has been analyzed with three-dimensional computational fluid dynamics. Both subsonic combustion and supersonic combustion flowfields have been simulated. The subsonic combustion analysis was executed in conjunction with a direct-connect test rig. Two cold-flow and one hot-flow results are presented. The simulations compare favorably with the test data for the two cold-flow calculations; the hot-flow data were not yet available. The hot-flow simulation indicates that the conventional ejector-ramjet cycle would not provide adequate mixing at the conditions tested. The supersonic combustion ramjet flowfield was simulated with a frozen chemistry model. A five-parameter test matrix was specified, according to statistical design-of-experiments theory. Twenty-seven separate simulations were used to assemble surrogate models for combustor mixing efficiency and total pressure recovery. Scramjet injector design parameters (injector angle, location, and fuel split) as well as mission variables (total fuel massflow and freestream Mach number) were included in the analysis. A promising injector design has been identified that provides good mixing characteristics with low total pressure losses. The surrogate models can be used to develop performance maps of different injector designs. Several complex three-way variable interactions appear within the dataset that are not adequately resolved with the current statistical analysis.
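The surrogate-modelling step can be sketched as a quadratic response surface fitted to the 27-run DOE table; the file and column names are assumptions, and polynomial regression stands in for whatever surrogate form the authors used.

```python
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

doe = pd.read_csv("scramjet_doe.csv")  # hypothetical 27-run DOE matrix
X = doe[["inj_angle", "inj_location", "fuel_split", "fuel_massflow", "mach"]]
y = doe["mixing_efficiency"]

# Quadratic response surface: main effects, squares, and two-way interactions
surrogate = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
surrogate.fit(X, y)
# The fitted surrogate maps injector designs to predicted mixing
# efficiency without further CFD runs.
print(surrogate.predict(X.iloc[:3]))
```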
Al-Saleh, Ayman; Alazzoni, Ashraf; Al Shalash, Saleh; Ye, Chenglin; Mbuagbaw, Lawrence; Thabane, Lehana; Jolly, Sanjit S.
2014-01-01
Background High-sensitivity cardiac troponin assays have been adopted by many clinical centres worldwide; however, clinicians are uncertain how to interpret the results. We sought to assess the utility of these assays in diagnosing acute myocardial infarction (MI). Methods We carried out a systematic review and meta-analysis of studies comparing high-sensitivity with conventional assays of cardiac troponin levels among adults with suspected acute MI in the emergency department. We searched MEDLINE, EMBASE and Cochrane databases up to April 2013 and used bivariable random-effects modelling to obtain summary parameters for diagnostic accuracy. Results We identified 9 studies that assessed the use of high-sensitivity troponin T assays (n = 9186 patients). The summary sensitivity of these tests in diagnosing acute MI at presentation to the emergency department was estimated to be 0.94 (95% confidence interval [CI] 0.89–0.97); for conventional tests, it was 0.72 (95% CI 0.63–0.79). The summary specificity was 0.73 (95% CI 0.64–0.81) for the high-sensitivity assay compared with 0.95 (95% CI 0.93–0.97) for the conventional assay. The differences in estimates of the summary sensitivity and specificity between the high-sensitivity and conventional assays were statistically significant (p < 0.01). The area under the curve was similar for both tests carried out 3–6 hours after presentation. Three studies assessed the use of high-sensitivity troponin I assays and showed similar results. Interpretation Used at presentation to the emergency department, the high-sensitivity cardiac troponin assay has improved sensitivity, but reduced specificity, compared with the conventional troponin assay. With repeated measurements over 6 hours, the area under the curve is similar for both tests, indicating that the major advantage of the high-sensitivity test is early diagnosis. PMID:25295240
Shamata, Awatif; Thompson, Tim
2018-05-10
Non-contact three-dimensional (3D) surface scanning has been applied in forensic medicine and has been shown to mitigate shortcomings of traditional documentation methods. The aim of this paper is to assess the efficiency of structured light 3D surface scanning in recording traumatic injuries of live cases in clinical forensic medicine. The work was conducted at the Medico-Legal Centre in Benghazi, Libya. A structured light 3D surface scanner and an ordinary digital camera with a close-up lens were used to record the injuries and to obtain 3D and two-dimensional (2D) records of the same traumas. Two different types of comparison were performed. First, the 3D wound documents were compared to the 2D documents based on subjective visual assessment. Additionally, 3D wound measurements were compared to conventional measurements to determine whether there was a statistically significant difference between them; for this, the Friedman test was used. The study established that the 3D wound documents had extra features over the 2D documents. Moreover, the 3D scanning method was able to overcome the main deficiencies of digital photography. No statistically significant difference was found between the 3D and conventional wound measurements. Spearman's correlation established a strong, positive correlation between the 3D and conventional measurement methods. Although the 3D surface scanning of the injuries of live subjects faced some difficulties, the 3D results were appreciated, and the validity of 3D measurements based on structured light 3D scanning was established. Further work will be carried out in forensic pathology to scan open injuries with depth information. Crown Copyright © 2018. Published by Elsevier Ltd. All rights reserved.
Anitharaj, Velmurugan; Stephen, Selvaraj; Pradeep, Jothimani; Park, Sungman; Kim, Seung-Han; Kim, Young Jin; Kim, Eun-Ye; Kim, Yoon-Won
2016-11-01
Scrub Typhus (ST) has been reported from different parts of India in the recent past. However, the diagnosis and confirmation of ST cases require specific serological and molecular diagnostic tests. Both rapid and conventional ELISA tests need to be properly evaluated. To evaluate a new ST IgM immunochromatography (ICT) test kit (InBios Scrub Typhus Detect IgM Rapid Test) and compare it with another rapid kit, a conventional ELISA kit and the Weil-Felix (WF) test. This prospective study was carried out at Mahatma Gandhi Medical College and Research Institute, Puducherry, from November 2015 to June 2016. 220 clinically suspected ST patients were examined with the new kit, the InBios Scrub Typhus Detect IgM Rapid Test, taking the conventional InBios Scrub Typhus Detect IgM ELISA as reference. Additional comparison was made with ImmuneMed Scrub Typhus Rapid and the WF test (single OXK titers ≥1:320). Statistical analysis was performed (Chi-square, Spearman's correlation and Kappa) using IBM SPSS Statistics 17 for Windows (SPSS Inc; Chicago, USA). Percentage sensitivity, specificity, positive predictive and negative predictive values for InBios, ImmuneMed and WF were 99.25, 93.02, 95.68, 98.77; 94.87, 94.19, 96.21, 92.05; and 50.38, 95.51, 94.29, 56.67, respectively. A total of 134 patients were positive in the reference standard InBios IgM ELISA. This new rapid ST IgM kit, validated for the first time in India, showed good sensitivity and specificity. As a Point-of-Care (PoC) test, the kit would be helpful in both urban and remote rural parts of India.
Alqahtani, Fawaz
2017-01-01
The purpose of this study was to determine the effect of two extraoral computer-aided design (CAD) and computer-aided manufacturing (CAM) systems, in comparison with conventional techniques, on the marginal fit of monolithic CAD/CAM lithium disilicate ceramic crowns. This is an in vitro interventional study, carried out at the Department of Prosthodontics, School of Dentistry, Prince Sattam Bin Abdul-Aziz University, Saudi Arabia, from December 2015 to April 2016. The marginal gaps of 60 lithium disilicate crowns were evaluated by scanning electron microscopy. In total, 20 pressable lithium disilicate (IPS e.max Press [Ivoclar Vivadent]) ceramic crowns were fabricated using the conventional lost-wax technique as a control group. The experimental all-ceramic crowns were produced based on a scanned stone model and milled using two extraoral CAD/CAM systems: the Cerec group was fabricated using the Cerec CAD/CAM system, and the Trios group was fabricated using Trios CAD and milled using Wieland Zenotec CAM. One-way analysis of variance (ANOVA) and the Scheffe post hoc test were used for statistical comparison of the groups (α=0.05). The mean (±standard deviation) marginal gap of each group was as follows: Control, 91.15 (±15.35) µm; Cerec, 111.07 (±6.33) µm; and Trios, 60.17 (±11.09) µm. One-way ANOVA and the Scheffe post hoc test showed a statistically significant difference in the marginal gap between all groups. It can be concluded from the current study that all-ceramic crowns fabricated using the CAD/CAM system show a marginal accuracy that is acceptable in clinical environments. The Trios CAD group displayed the smallest marginal gap.
Tzavidis, Nikos; Salvati, Nicola; Schmid, Timo; Flouri, Eirini; Midouhas, Emily
2016-02-01
Multilevel modelling is a popular approach for longitudinal data analysis. Statistical models conventionally target a parameter at the centre of a distribution. However, when the distribution of the data is asymmetric, modelling other location parameters, e.g. percentiles, may be more informative. We present a new approach, M-quantile random-effects regression, for modelling multilevel data. The proposed method is used for modelling location parameters of the distribution of the Strengths and Difficulties Questionnaire scores of children in England who participate in the Millennium Cohort Study. Quantile mixed models are also considered. The analyses offer insights to child psychologists about the differential effects of risk factors on children's outcomes.
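M-quantile random-effects regression itself is not available in standard Python libraries; as a simpler relative that conveys the idea of modelling non-central location parameters, the sketch below fits plain quantile regressions of simulated, right-skewed SDQ-like scores on a risk factor at several quantiles.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
df = pd.DataFrame({"risk": rng.uniform(0, 1, 400)})
df["sdq"] = 8 + 6 * df["risk"] + rng.gamma(2, 2, 400)  # right-skewed scores

# If the risk effect grows across quantiles, mean regression understates
# its impact on the children with the highest difficulty scores.
for q in (0.25, 0.5, 0.9):
    fit = smf.quantreg("sdq ~ risk", df).fit(q=q)
    print(f"q = {q}: risk effect = {fit.params['risk']:.2f}")
```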
[The future of forensic DNA analysis for criminal justice].
Laurent, François-Xavier; Vibrac, Geoffrey; Rubio, Aurélien; Thévenot, Marie-Thérèse; Pène, Laurent
2017-11-01
In the criminal framework, the analysis of approximately 20 DNA microsatellites enables the establishment of a genetic profile with a high statistical power of discrimination. This technique makes it possible to establish or exclude a match between a biological trace detected at a crime scene and a suspect whose DNA was collected via an oral swab. However, conventional techniques tend to complicate the interpretation of complex DNA samples, such as degraded DNA and DNA mixtures. The aim of this review is to highlight the power of new forensic DNA methods (including high-throughput sequencing and single-cell sequencing) to facilitate the expert's interpretation in full compliance with existing French legislation. © 2017 médecine/sciences – Inserm.
Barlough, J E; Jacobson, R H; Downing, D R; Lynch, T J; Scott, F W
1987-01-01
The computer-assisted, kinetics-based enzyme-linked immunosorbent assay for coronavirus antibodies in cats was calibrated to the conventional indirect immunofluorescence assay by linear regression analysis and computerized interpolation (generation of "immunofluorescence assay-equivalent" titers). Procedures were developed for normalization and standardization of kinetics-based enzyme-linked immunosorbent assay results through incorporation of five different control sera of predetermined ("expected") titer in daily runs. When used with such sera and with computer assistance, the kinetics-based enzyme-linked immunosorbent assay minimized both within-run and between-run variability while allowing also for efficient data reduction and statistical analysis and reporting of results. PMID:3032390
Qiu, Jin; Cheng, Jiajing; Wang, Qingying; Hua, Jie
2014-01-01
Background: The aim of this study was to compare the effects of the levonorgestrel-releasing intrauterine system (LNG-IUS) with conventional medical treatment in reducing heavy menstrual bleeding. Material/Methods: Relevant studies were identified by a search of MEDLINE, EMBASE, the Cochrane Central Register of Controlled Trials, and clinical trials registries (from inception to April 2014). Randomized controlled trials comparing the LNG-IUS with conventional medical treatment (mefenamic acid, tranexamic acid, norethindrone, medroxyprogesterone acetate injection, or combined oral contraceptive pills) in patients with menorrhagia were included. Results: Eight randomized controlled trials that included 1170 women (LNG-IUS, n=562; conventional medical treatment, n=608) met inclusion criteria. The LNG-IUS was superior to conventional medical treatment in reducing menstrual blood loss (as measured by the alkaline hematin method or estimated by pictorial bleeding assessment chart scores). More women were satisfied with the LNG-IUS than with the use of conventional medical treatment (odds ratio [OR] 5.19, 95% confidence interval [CI] 2.73–9.86). Compared with conventional medical treatment, the LNG-IUS was associated with a lower rate of discontinuation (14.6% vs. 28.9%, OR 0.39, 95% CI 0.20–0.74) and fewer treatment failures (9.2% vs. 31.0%, OR 0.18, 95% CI 0.10–0.34). Furthermore, quality of life assessment favored the LNG-IUS over conventional medical treatment, although use of various measurements limited our ability to pool the data for more powerful evidence. Serious adverse events were statistically comparable between treatments. Conclusions: The LNG-IUS was the more effective first choice for management of menorrhagia compared with conventional medical treatment. Long-term, randomized trials are required to further investigate patient-based outcomes and evaluate the cost-effectiveness of the LNG-IUS and other medical treatments. PMID:25245843
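As background, study-level odds ratios of the kind reported above are typically pooled with inverse-variance weighting on the log scale. The sketch below shows a fixed-effect version; the 2×2 counts are hypothetical, not data from the included trials.

```python
# Fixed-effect inverse-variance pooling of odds ratios on the log scale
# (a minimal sketch; the trial counts below are made up for illustration).
import numpy as np

# Each row: events/total in treatment arm, events/total in control arm.
trials = [(40, 70, 20, 72), (55, 90, 35, 88), (30, 60, 18, 65)]

log_ors, weights = [], []
for a, n1, c, n2 in trials:
    b, d = n1 - a, n2 - c
    log_or = np.log((a * d) / (b * c))
    var = 1/a + 1/b + 1/c + 1/d          # variance of the log odds ratio
    log_ors.append(log_or)
    weights.append(1 / var)

pooled = np.average(log_ors, weights=weights)
se = (1 / np.sum(weights)) ** 0.5
lo, hi = pooled - 1.96 * se, pooled + 1.96 * se
print(f"pooled OR {np.exp(pooled):.2f} (95% CI {np.exp(lo):.2f}-{np.exp(hi):.2f})")
```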
Effects of a Short Drilling Implant Protocol on Osteotomy Site Temperature and Drill Torque.
Mihali, Sorin G; Canjau, Silvana; Cernescu, Anghel; Bortun, Cristina M; Wang, Hom-Lay; Bratu, Emanuel
2018-02-01
To establish a protocol for reducing the drilling sequence during implant site preparation based on temperature and insertion torque. The traditional conventional drilling sequence (using several drills with 0.6-mm increments) was compared with the proposed short drilling protocol (using only 2 drills: an initial and a final drill). One hundred drilling osteotomies were performed in bovine and porcine bones. Sets of 2 osteotomy sites were created in 5 bone densities using the 2 types of drilling protocols. Thermographic pictures were captured throughout all drilling procedures and analyzed using ThermaCAM Researcher Professional 2.10. Torque values were determined during drilling by measuring electrical input and drill speed. There were statistically significant differences in bone temperature between the conventional and short drilling protocols during implant site preparation (analysis of variance, P = 0.0008). However, there were no significant differences between the 2 types of drilling protocols for either implant diameter. Implant site preparation time was significantly reduced when using the short drilling protocol compared with the conventional drilling protocol (P < 0.001). Within the limitations of the study, the short drilling protocol proposed herein may represent a safe approach for implant site preparation.
Visual Biofeedback Balance Training Using Wii Fit after Stroke: A Randomized Controlled Trial
Barcala, Luciana; Grecco, Luanda André Collange; Colella, Fernanda; Lucareli, Paulo Roberto Garcia; Salgado, Afonso Shiguemi Inoue; Oliveira, Claudia Santos
2013-01-01
[Purpose] The aim of the present study was to investigate the effect of balance training with visual biofeedback on balance, body symmetry, and function among individuals with hemiplegia following a stroke. [Subjects and Methods] The present study was performed using a randomized controlled clinical trial with a blinded evaluator. The subjects were twenty adults with hemiplegia following a stroke. The experimental group performed balance training with visual biofeedback using Wii Fit® together with conventional physical therapy. The control group underwent conventional physical therapy alone. The intervention lasted five weeks, with two sessions per week. Body symmetry (baropodometry), static balance (stabilometry), functional balance (Berg Balance Scale), functional mobility (Timed Up and Go test), and independence in activities of daily living (Functional Independence Measure) were assessed before and after the intervention. [Results] No statistically significant differences were found between the experimental and control groups. In the intragroup analysis, both groups demonstrated a significant improvement in all variables studied. [Conclusion] The physical therapy program combined with balance training involving visual biofeedback (Wii Fit®) led to an improvement in body symmetry, balance, and function among stroke victims. However, the improvement was similar to that achieved with conventional physical therapy alone. PMID:24259909
A new structure of permeable pavement for mitigating urban heat island.
Liu, Yong; Li, Tian; Peng, Hangyu
2018-09-01
The urban heat island (UHI) effect has become a serious threat to human habitation, and how to mitigate it has been a global concern for decades. This paper addresses the cooling effect of a novel permeable pavement called evaporation-enhancing permeable pavement, which has capillary columns in the aggregate and a liner at the bottom. To explore its efficiency in mitigating the UHI, bench-scale permeable pavement units with capillary columns were developed and compared with conventional permeable pavement. Capillary capacity of the columns, evaporation rates, and surface temperature of the pavements were monitored under simulated rainfall and local Shanghai weather conditions. Results show the capillary column was important in increasing evaporation by lifting water from the bottom to the surface, and the evaporation-enhancing permeable pavement was cooler than a conventional permeable pavement by as much as 9.4°C during the experimental period. Moreover, the cooling effect of the former pavement persisted for more than seven days without further rainfall. Statistical analysis reveals that evaporation-enhancing permeable pavement can mitigate the UHI effect significantly better than a conventional permeable pavement.
Lima, Ana Paula Barbosa; Vitti, Rafael Pino; Amaral, Marina; Neves, Ana Christina Claro; da Silva Concilio, Lais Regiane
2018-04-01
This study evaluated the dimensional stability of a complete-arch prosthesis processed by a conventional water-bath method or by microwave energy and polymerized using two different curing cycles. Forty maxillary complete-arch prostheses were randomly divided into four groups (n = 10): MW1, acrylic resin cured by one microwave cycle; MW2, acrylic resin cured by two microwave cycles; WB1, conventional acrylic resin polymerized using one curing cycle in a water bath; WB2, conventional acrylic resin polymerized using two curing cycles in a water bath. For evaluation of dimensional stability, occlusal vertical dimension (OVD) and area of contact points were measured at two different times: before and after polymerization. A digital caliper was used for the OVD measurement. Occlusal contact registration strips were used between the maxillary and mandibular dentures to measure the contact points. The images were measured using the software IpWin32, and the differences before and after the polymerization methods were calculated. The data were statistically analyzed using one-way ANOVA and the Tukey test (α = .05). The results demonstrated statistically significant differences in OVD between measurement times for all groups. MW1 presented the highest OVD values, while WB2 had the lowest (P < .05). No statistical differences were found for area of contact points among the groups (P = .7150). The conventional acrylic resin polymerized using two curing cycles in a water bath led to the smallest change in OVD of the complete-arch prosthesis.
The role of simulation in the design of a neural network chip
NASA Technical Reports Server (NTRS)
Desai, Utpal; Roppel, Thaddeus A.; Padgett, Mary L.
1993-01-01
An iterative, simulation-based design procedure for a neural network chip is introduced. For this design procedure, the goal is to produce a chip layout for a neural network in which the weights are determined by transistor gate width-to-length ratios. In a given iteration, the current layout is simulated using the circuit simulator SPICE, and layout adjustments are made based on conventional gradient-descent methods. After the iteration converges, the chip is fabricated. Monte Carlo analysis is used to predict the effect of statistical fabrication process variations on the overall performance of the neural network chip.
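The simulate-adjust loop can be pictured as below. This is a schematic, not the authors' code: a stand-in transfer function replaces the SPICE run, and the target value, learning rate, and two-ratio layout are arbitrary assumptions.

```python
# Schematic simulate-adjust loop: gradient descent on transistor W/L ratios,
# with a toy transfer function standing in for a SPICE simulation.
import numpy as np

def simulate(wl):                      # stand-in for a SPICE run
    return np.tanh(wl @ np.array([0.5, -0.3]))

target = 0.2                           # desired circuit output (assumed)
wl = np.array([2.0, 1.5])              # gate width-to-length ratios (the weights)
lr, eps = 0.5, 1e-4

for step in range(200):
    err = simulate(wl) - target
    # Numerical gradient of the squared error w.r.t. each W/L ratio.
    grad = np.array([
        ((simulate(wl + eps * np.eye(2)[i]) - target) ** 2 - err ** 2) / eps
        for i in range(2)
    ])
    wl -= lr * grad                    # conventional gradient-descent update

print(f"final output {simulate(wl):.3f}, W/L ratios {np.round(wl, 3)}")
```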
Area estimation using multiyear designs and partial crop identification
NASA Technical Reports Server (NTRS)
Sielken, R. L., Jr.
1984-01-01
Statistical procedures were developed for large-area assessments using both satellite and conventional data. Crop acreages, other ground cover indices, and measures of change were the principal characteristics of interest. These characteristics can be estimated from samples collected, possibly from several sources, at varying times and with different levels of identification. Multiyear analysis techniques were extended to include partially identified samples; the best current-year sampling design corresponding to a given sampling history was determined; weights reflecting the precision or confidence in each observation were identified and utilized; and the variation in estimates incorporating partially identified samples was quantified.
Trace element fingerprinting of jewellery rubies by external beam PIXE
NASA Astrophysics Data System (ADS)
Calligaro, T.; Poirot, J.-P.; Querré, G.
1999-04-01
External beam PIXE analysis allows the non-destructive in situ characterisation of gemstones mounted on jewellery pieces. This technique was used to determine the geographical origin of 64 rubies set on a high-valued necklace. The trace element content of these gemstones was measured and compared to that of a set of rubies of known sources. Multivariate statistical processing of the results allowed us to infer the provenance of the rubies: one comes from the Thailand/Cambodia deposit while the remainder are attributed to Burma. This highlights the complementary capabilities of PIXE and conventional gemological observations.
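A toy version of such multivariate provenance attribution is shown below, using linear discriminant analysis on trace-element concentrations. The Cr/Fe/V values are fabricated for illustration; the study's reference data and its choice of multivariate method may differ.

```python
# Provenance attribution from trace-element fingerprints (sketch with made-up
# Cr/Fe/V concentrations in arbitrary ppm units).
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Reference rubies of known origin: [Cr, Fe, V].
X_ref = np.array([[900, 300, 40], [950, 280, 35], [870, 320, 45],    # Burma
                  [800, 2500, 10], [780, 2700, 12], [820, 2400, 9]]) # Thailand/Cambodia
y_ref = ["Burma"] * 3 + ["Thailand/Cambodia"] * 3

lda = LinearDiscriminantAnalysis().fit(X_ref, y_ref)
print(lda.predict([[910, 310, 38]]))   # unknown necklace stone -> ['Burma']
```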
ICRH system performance during ITER-Like Wall operations at JET and the outlook for DT campaign
NASA Astrophysics Data System (ADS)
Monakhov, Igor; Blackman, Trevor; Dumortier, Pierre; Durodié, Frederic; Jacquet, Philippe; Lerche, Ernesto; Noble, Craig
2017-10-01
The performance of the JET ICRH system since installation of the metal ITER-Like Wall (ILW) has been assessed statistically. The data demonstrate a steady increase in the RF power coupled to plasmas over recent years, with the maximum pulse-average and peak values exceeding 6 MW and 8 MW, respectively, in 2016. Analysis and extrapolation of the power capabilities of conventional JET ICRH antennas are provided, and key performance-limiting factors are discussed. The RF plant operational frequency options are presented, highlighting the issues of efficient ICRH application within the foreseeable range of DT plasma scenarios.
NASA Astrophysics Data System (ADS)
Miyazawa, Arata; Hong, Young-Joo; Makita, Shuichi; Kasaragod, Deepa K.; Miura, Masahiro; Yasuno, Yoshiaki
2017-02-01
Local statistics are widely utilized for quantification and image processing of OCT. For example, the local mean is used to reduce speckle, and the local variation of the polarization state (degree of polarization uniformity, DOPU) is used to visualize melanin. Conventionally, these statistics are calculated in a rectangular kernel whose size is uniform over the image. However, the fixed size and shape of the kernel result in a tradeoff between image sharpness and statistical accuracy. A superpixel is a cluster of pixels generated by grouping image pixels based on spatial proximity and similarity of signal values. Superpixels have variable sizes and flexible shapes that preserve tissue structure. Here we demonstrate a new superpixel method tailored for multifunctional Jones matrix OCT (JM-OCT). This method forms superpixels by clustering image pixels in a 6-dimensional (6-D) feature space (two spatial dimensions and four optical-feature dimensions). All image pixels were clustered based on their spatial proximity and optical-feature similarity. The optical features are scattering, OCT-A, birefringence, and DOPU. The method is applied to retinal OCT. The generated superpixels preserve tissue structures such as retinal layers, sclera, vessels, and the retinal pigment epithelium. Hence, a superpixel can be utilized as a local-statistics kernel that is more suitable than a uniform rectangular kernel. The superpixelized image can also be used for further image processing and analysis; since it reduces the number of pixels to be analyzed, it reduces the computational cost of such processing.
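The clustering step can be illustrated with plain k-means over a joint spatial-plus-feature space, as below. K-means stands in for the paper's tailored superpixel algorithm; the compactness weight, cluster count, and synthetic feature channels are assumptions.

```python
# Toy superpixel formation: cluster pixels in a 6-D space made of 2 spatial
# coordinates and 4 optical-feature channels.
import numpy as np
from sklearn.cluster import KMeans

h, w = 64, 64
rng = np.random.default_rng(1)
features = rng.normal(size=(h, w, 4))        # scattering, OCT-A, birefringence, DOPU
yy, xx = np.mgrid[0:h, 0:w]

compactness = 0.1                            # weight of spatial proximity vs similarity
pixels = np.column_stack([
    compactness * yy.ravel(),
    compactness * xx.ravel(),
    features.reshape(-1, 4),
])

labels = KMeans(n_clusters=100, n_init=4, random_state=0).fit_predict(pixels)
superpixels = labels.reshape(h, w)           # each cluster acts as one statistics kernel
print(superpixels.shape, superpixels.max() + 1)
```

Raising the compactness weight yields rounder, more uniform clusters; lowering it lets clusters follow feature boundaries, which is the property the abstract exploits.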
The Circulation Analysis of Serial Use: Numbers Game or Key to Service?
Raisig, L. Miles
1967-01-01
The conventionally erected and reported circulation analysis of serial use in the individual and the feeder library is found to be statistically invalid and misleading, since it measures neither the intellectual use of the serial's contents nor the physical handlings of serial units, and is nonrepresentative of the in-depth library use of serials. It fails utterly to report or even to suggest the relation of intralibrary and interlibrary serial resources. The actual mechanics of the serial use analysis, and the active variables in the library situation which affect serial use, are demonstrated in a simulated analysis and are explained at length. A positive design is offered for the objective gathering and reporting of data on the local intellectual use and physical handling of serials and the relating of resources. Data gathering in the feeder library, and implications for the extension of the feeder library's resources, are discussed. PMID:6055863
Gassaway, Julie; Jones, Michael L; Sweatman, W Mark; Young, Tamara
2017-10-16
Evaluate effects of revised education classes on classroom engagement during inpatient rehabilitation for individuals with spinal cord injury/disease (SCI/D). Multiple-baseline, quasi-experimental design with video-recorded engagement observations during conventional and revised education classes; visual and statistical analysis of differences in positive engagement responses observed in classes using each approach. 81 patients (72% male, 73% white, mean age 36, SD 15.6) admitted for SCI/D inpatient rehabilitation in a non-profit rehabilitation hospital, who attended one or more of 33 care self-management education classes that were video recorded. All study activities were approved by the host facility institutional review board. Conventional nurse-led self-management classes were replaced with revised peer-led classes incorporating approaches to promote transformative learning. Revised classes were introduced across three subject areas in a step-wise fashion over 15 weeks. Positive engagement responses (asking questions, participating in discussion, gesturing, raising hand, or otherwise noting approval) were documented from video recordings of 14 conventional and 19 revised education classes. Significantly higher average (per patient per class) positive engagement responses were observed in the revised compared to conventional classes (p=0.008). Redesigning SCI inpatient rehabilitation care self-management classes to promote transformative learning increased patient engagement. Additional research is needed to examine longer-term outcomes and replicability in other settings.
Complex Network Analysis for Characterizing Global Value Chains in Equipment Manufacturing
Meng, Bo; Cheng, Lihong
2017-01-01
The rise of global value chains (GVCs), characterized by so-called “outsourcing”, “fragmentation production”, and “trade in tasks”, has been considered one of the most important phenomena in 21st-century trade. GVCs can also play a decisive role in trade policy making. However, due to the increasing complexity and sophistication of international production networks, especially in the equipment manufacturing industry, conventional trade statistics and the corresponding trade indicators may give us a distorted picture of trade. This paper applies various network analysis tools to the new GVC accounting system proposed by Koopman et al. (2014) and Wang et al. (2013), in which gross exports can be decomposed into value-added terms through various routes along GVCs. This helps to divide the equipment manufacturing-related GVCs into sub-networks with clear visualization. The empirical results of this paper significantly improve our understanding of the topology of equipment manufacturing-related GVCs as well as the interdependency of countries in these GVCs, which is generally invisible in traditional trade statistics. PMID:28081201
Multi-template tensor-based morphometry: Application to analysis of Alzheimer's disease
Koikkalainen, Juha; Lötjönen, Jyrki; Thurfjell, Lennart; Rueckert, Daniel; Waldemar, Gunhild; Soininen, Hilkka
2012-01-01
In this paper, methods for using multiple templates in tensor-based morphometry (TBM) are presented and compared to the conventional single-template approach. TBM analysis requires non-rigid registrations, which are often subject to registration errors. When using multiple templates and, therefore, multiple registrations, it can be assumed that the registration errors are averaged and eventually compensated. Four different methods are proposed for multi-template TBM. The methods were evaluated using magnetic resonance (MR) images of healthy controls, patients with stable or progressive mild cognitive impairment (MCI), and patients with Alzheimer's disease (AD) from the ADNI database (N=772). The performance of TBM features in classifying images was evaluated both quantitatively and qualitatively. Classification results show that the multi-template methods are statistically significantly better than the single-template method. The overall classification accuracy was 86.0% for the classification of control and AD subjects, and 72.1% for the classification of stable and progressive MCI subjects. The statistical group-level difference maps produced using multi-template TBM were smoother, formed larger continuous regions, and had larger t-values than the maps obtained with single-template TBM. PMID:21419228
ERIC Educational Resources Information Center
Fernandes, Tania; Kolinsky, Regine; Ventura, Paulo
2009-01-01
This study combined artificial language learning (ALL) with conventional experimental techniques to test whether statistical speech segmentation outputs are integrated into adult listeners' mental lexicon. Lexicalization was assessed through inhibitory effects of novel neighbors (created by the parsing process) on auditory lexical decisions to…
Learning from Friends: Measuring Influence in a Dyadic Computer Instructional Setting
ERIC Educational Resources Information Center
DeLay, Dawn; Hartl, Amy C.; Laursen, Brett; Denner, Jill; Werner, Linda; Campe, Shannon; Ortiz, Eloy
2014-01-01
Data collected from partners in a dyadic instructional setting are, by definition, not statistically independent. As a consequence, conventional parametric statistical analyses of change and influence carry considerable risk of bias. In this article, we illustrate a strategy to overcome this obstacle: the longitudinal actor-partner interdependence…
50 CFR 300.107 - Reporting and recordkeeping requirements.
Code of Federal Regulations, 2013 CFR
2013-10-01
... area designated by the Food and Agriculture Organization of the United Nations as Statistical Area 51 or Statistical Area 57 in the eastern and western Indian Ocean outside and north of the Convention Area shall be issued a preapproval. (2) Harvesting vessels. (i) In addition to any AMLR harvesting...
Challenging Conventional Wisdom for Multivariate Statistical Models with Small Samples
ERIC Educational Resources Information Center
McNeish, Daniel
2017-01-01
In education research, small samples are common because of financial limitations, logistical challenges, or exploratory studies. With small samples, statistical principles on which researchers rely do not hold, leading to trust issues with model estimates and possible replication issues when scaling up. Researchers are generally aware of such…
ERIC Educational Resources Information Center
Association for Educational Data Systems, Washington, DC.
Two abstracts and seventeen articles on computer assisted instruction (CAI) presented at the 1976 Association for Educational Data Systems (AEDS) convention are included here. Four new computer programs are described: Author System for Education and Training (ASET); GNOSIS, a Swedish/English CAI package; Statistical Interactive Programming System…
ERIC Educational Resources Information Center
Darwazeh, Afnan N.
The aim of this study was to investigate some of the learner variables that may have an influence on university academic achievement in a distance versus a conventional education setting. Descriptive and analytical statistics were used to analyze the data, using "Pearson r" and "F-test." Results revealed that the university…
Aftab, Syed Arafat; Tay, Kiang Hiong; Irani, Farah G; Gong Lo, Richard Hoau; Gogna, Apoorva; Haaland, Benjamin; Tan, Seck Guan; Chng, Siew Png; Pasupathy, Shanker; Choong, Hui Lin; Tan, Bien Soo
2014-02-01
To compare the efficacy and safety of cutting balloon angioplasty (CBA) versus high-pressure balloon angioplasty (HPBA) for the treatment of hemodialysis autogenous fistula stenoses resistant to conventional percutaneous transluminal angioplasty (PTA). In a prospective, randomized clinical trial, patients with dysfunctional, stenotic hemodialysis arteriovenous fistulas (AVFs) were randomized to receive CBA or HPBA if conventional PTA had a suboptimal result (ie, residual stenosis > 30%). A total of 516 patients consented to participate in the study from October 2008 to September 2011, 85% of whom (n = 439) had technically successful conventional PTA. The remaining 71 patients (mean age, 60 y; 49 men) with suboptimal PTA results were randomized: 36 to the CBA arm and 35 to the HPBA arm. Primary and secondary target lesion patencies were determined by Kaplan-Meier analysis. Clinical success rates were 100% in both arms. Primary target lesion patency rates at 6 months were 66.4% and 39.9% for CBA and HPBA, respectively (P = .01). Secondary target lesion patency rates at 6 months were 96.5% for CBA and 80.0% for HPBA (P = .03). There was a single major complication of venous perforation following CBA. The 30-day mortality rate was 1.4%, with one non-procedure-related death in the HPBA group. Primary and secondary target lesion patency rates of CBA were statistically superior to those of HPBA following suboptimal conventional PTA. For AVF stenoses resistant to conventional PTA, CBA may be a better second-line treatment given its superior patency rates.
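For context, patency curves like those reported here are typically obtained with a Kaplan-Meier estimator and compared with a log-rank test. Below is a sketch using the lifelines package with invented follow-up data, not the trial's records.

```python
# Kaplan-Meier estimate of primary target-lesion patency (hypothetical data;
# assumes the lifelines package is installed).
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

months_cba = np.array([2, 4, 6, 6, 6, 6, 5, 6])   # time to patency loss or censoring
event_cba  = np.array([1, 1, 0, 0, 0, 0, 1, 0])   # 1 = patency lost, 0 = censored
months_hpb = np.array([1, 2, 3, 3, 6, 4, 2, 6])
event_hpb  = np.array([1, 1, 1, 1, 0, 1, 1, 0])

kmf = KaplanMeierFitter()
kmf.fit(months_cba, event_cba, label="CBA")
print(kmf.survival_function_)                      # step-function patency estimate

result = logrank_test(months_cba, months_hpb, event_cba, event_hpb)
print(f"log-rank p = {result.p_value:.3f}")
```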
Demirel, Serdar; Attigah, Nicolas; Bruijnen, Hans; Ringleb, Peter; Eckstein, Hans-Henning; Fraedrich, Gustav; Böckler, Dittmar
2012-07-01
Carotid endarterectomy (CEA) is beneficial in patients with symptomatic carotid artery stenosis. However, randomized trials have not provided evidence concerning the optimal CEA technique, conventional or eversion. The outcome of 563 patients within the surgical randomization arm of the Stent-Protected Angioplasty versus Carotid Endarterectomy in Symptomatic Patients (SPACE-1) trial was analyzed by surgical technique subgroups: eversion endarterectomy versus conventional endarterectomy with patch angioplasty. The primary end point was ipsilateral stroke or death within 30 days after surgery. Secondary outcome events included perioperative adverse events and the 2-year risk of restenosis, stroke, and death. Both groups were similar in terms of demographic and other baseline clinical variables. Shunt frequency was higher in the conventional CEA group (65% versus 17%; P<0.0001). The risk of ipsilateral stroke or death within 30 days after surgery was significantly greater with eversion CEA (9% versus 3%; P=0.005). There were no statistically significant differences in the rate of perioperative secondary outcome events with the exception of a significantly higher risk of intraoperative ipsilateral stroke rate in the eversion CEA group (4% versus 0.3%; P=0.0035). The 2-year risk of ipsilateral stroke occurring after 30 days was significantly higher in the conventional CEA group (2.9% versus 0%; P=0.017). In patients with symptomatic carotid artery stenosis, conventional CEA appears to be associated with better periprocedural neurological outcome than eversion CEA. Eversion CEA, however, may be more effective for long-term prevention of ipsilateral stroke. These findings should be interpreted with caution noting the limitations of the post hoc, nonrandomized nature of the analysis.
Kaur, Ravinder; Dhakad, Megh Singh; Goyal, Ritu; Haque, Absarul; Mukhopadhyay, Gauranga
2016-01-01
Candida infection is a major cause of morbidity and mortality in immunocompromised patients; accurate and early identification is a prerequisite for effective patient management. The purpose of this study was to compare the conventional identification of Candida species with identification by the Vitek-2 system, and antifungal susceptibility testing (AST) by the broth microdilution method with the Vitek-2 AST system. A total of 172 Candida isolates were subjected to identification by conventional methods, the Vitek-2 system, restriction fragment length polymorphism, and random amplified polymorphic DNA analysis. AST was carried out as per the Clinical and Laboratory Standards Institute M27-A3 document and by the Vitek-2 system. Candida albicans (82.51%) was the most common Candida species, followed by Candida tropicalis (6.29%), Candida krusei (4.89%), Candida parapsilosis (3.49%), and Candida glabrata (2.79%). With the Vitek-2 system, 155 of the 172 Candida isolates were correctly identified, 13 were misidentified, and four were identified with low discrimination; with conventional methods, 171 Candida isolates were correctly identified and only a single isolate of C. albicans was misidentified as C. tropicalis. The average measurement of agreement between the Vitek-2 system and conventional methods was >94%. Most of the isolates were susceptible to fluconazole (88.95%) and amphotericin B (97.67%). The measurement of agreement between the AST methods was >94% for fluconazole and >99% for amphotericin B, which was statistically significant (P < 0.01). The study confirmed the importance and reliability of conventional and molecular methods, and the acceptable agreement suggests the Vitek-2 system is an alternative method for speciation and sensitivity testing in Candida species infections.
Analysis of laparoscopic port site complications: A descriptive study
Karthik, Somu; Augustine, Alfred Joseph; Shibumon, Mundunadackal Madhavan; Pai, Manohar Varadaraya
2013-01-01
CONTEXT: The rate of port site complications following conventional laparoscopic surgery is about 21 per 100,000 cases. It has shown a proportional rise with increase in the size of the port site incision and trocar. Although rare, complications that occur at the port site include infection, bleeding, and port site hernia. AIMS: To determine the morbidity associated with ports at the site of their insertion in laparoscopic surgery and to identify risk factors for complications. SETTINGS AND DESIGN: Prospective descriptive study. MATERIALS AND METHODS: In the present descriptive study, a total of 570 patients who underwent laparoscopic surgeries for various ailments between August 2009 and July 2011 at our institute were observed for port site complications prospectively and the complications were reviewed. STATISTICAL ANALYSIS USED: Descriptive statistical analysis was carried out in the present study. The statistical software, namely, SPSS 15.0 was used for the analysis of the data. RESULTS: Of the 570 patients undergoing laparoscopic surgery, 17 (3%) had developed complications specifically related to the port site during a minimum follow-up of three months; port site infection (PSI) was the most frequent (n = 10, 1.8%), followed by port site bleeding (n = 4, 0.7%), omentum-related complications (n = 2; 0.35%), and port site metastasis (n = 1, 0.175%). CONCLUSIONS: Laparoscopic surgeries are associated with minimal port site complications. Complications are related to the increased number of ports. Umbilical port involvement is the commonest. Most complications are manageable with minimal morbidity, and can be further minimized with meticulous surgical technique during entry and exit. PMID:23741110
Vojdani, M; Torabi, K; Farjood, E; Khaledi, Aar
2013-09-01
Metal-ceramic crowns are the most commonly used complete-coverage restorations in daily clinical practice. Disadvantages of conventional hand-made wax patterns have prompted alternative approaches based on CAD/CAM technologies. This study compares the marginal and internal fit of copings cast from CAD/CAM and conventionally fabricated wax patterns. Twenty-four standardized brass dies were prepared and randomly divided into 2 groups according to the wax-pattern fabrication method (CAD/CAM technique and conventional method) (n=12). All the wax patterns were fabricated in a standard fashion with respect to contour, thickness, and internal relief (M1-M12: CAD/CAM group; C1-C12: conventional group). A CAD/CAM milling machine (Cori TEC 340i; imes-icore GmbH, Eiterfeld, Germany) was used to fabricate the CAD/CAM group wax patterns. The copings cast from the 24 wax patterns were cemented to the corresponding dies. For all coping-die assemblies, a cross-sectional technique was used to evaluate the marginal and internal fit at 15 points. The Student's t-test was used for statistical analysis (α=0.05). The overall mean (SD) absolute marginal discrepancy (AMD) was 254.46 (25.10) µm for the CAD/CAM group and 88.08 (10.67) µm for the conventional group (control). The overall mean internal gap total (IGT) was 110.77 (5.92) µm for the CAD/CAM group and 76.90 (10.17) µm for the conventional group. The Student's t-test revealed significant differences between the 2 groups. Marginal and internal gaps were significantly higher at all measured areas in the CAD/CAM group than in the conventional group (p < 0.001). Within the limitations of this study, the conventional method of wax-pattern fabrication produced copings with significantly better marginal and internal fit than the CAD/CAM (machine-milled) technique. All factors were standardized between the 2 groups except the wax-pattern fabrication technique; therefore, only the conventional group resulted in copings with clinically acceptable margins of less than 120 µm.
Unsedated transnasal small-caliber esophagogastroduodenoscopy in elderly and bedridden patients.
Yuki, Mika; Amano, Yuji; Komazawa, Yoshinori; Fukuhara, Hiroyuki; Shizuku, Toshihiro; Yamamoto, Shun; Kinoshita, Yoshikazu
2009-11-28
To evaluate the safety of unsedated transnasal small-caliber esophagogastroduodenoscopy (EGD) for elderly and critically ill bedridden patients. One prospective randomized comparative study (Study 1) and one crossover comparative study (Study 2) between transnasal small-caliber EGD and transoral conventional EGD were done. For the comparative study, we enrolled 240 elderly patients aged > 65 years. For the crossover analysis, we enrolled 30 bedridden patients with percutaneous endoscopic gastrostomy (PEG). We evaluated cardiopulmonary effects by measuring arterial oxygen saturation (SpO2) and calculating the rate-pressure product (RPP) (pulse rate × systolic blood pressure/100) at baseline and at 2 and 5 min after endoscopic intubation in Study 1. To assess the risk for endoscopy-related aspiration pneumonia during EGD, we also measured blood leukocyte counts and serum C-reactive protein (CRP) levels before and 3 d after EGD in Study 2. In Study 1, we observed significant decreases in SpO2 during conventional transoral EGD, but not during transnasal small-caliber EGD (0.24% vs -0.24% after 2 min, and 0.18% vs -0.29% after 5 min; P = 0.034, P = 0.044). Significant differences in the RPP were not found between conventional transoral and transnasal small-caliber EGD. In Study 2, crossover analysis showed statistically significant increases of the RPP at 2 min after intubation and at the end of endoscopy (26.8 and 34.6 vs 3.1 and 15.2; P = 0.044, P = 0.046), and decreases of SpO2 (-0.8% vs -0.1%, P = 0.042), during transoral conventional EGD in comparison with transnasal small-caliber endoscopy. Thus, for bedridden patients with PEG feeding, who were examined in the supine position, transoral conventional EGD suppressed cardiopulmonary function more severely than transnasal small-caliber EGD. There were also significant increases in the markers of inflammation, blood leukocyte counts and serum CRP values, in bedridden patients after transoral conventional EGD, but not after transnasal small-caliber EGD performed with the patient in the supine position. The leukocyte count increased from 6053 ± 1975/L to 6900 ± 3392/L (P = 0.0008) and CRP values increased from 0.93 ± 0.24 to 2.49 ± 0.91 mg/dL (P = 0.0005) at 3 d after transoral conventional EGD. Aspiration pneumonia, possibly caused by the endoscopic examination, was subsequently found in two of 30 patients after transoral conventional EGD. Transnasal small-caliber EGD is a safer method than transoral conventional EGD in critically ill, bedridden patients who are undergoing PEG feeding.
Zhu, Xudong; Arman, Bessembayev; Chu, Ju; Wang, Yonghong; Zhuang, Yingping
2017-05-01
To develop an efficient, cost-effective screening process to improve production of glucoamylase in Aspergillus niger. The cultivation of A. niger was achieved with well-dispersed morphology in 48-deep-well microtiter plates, which increased the throughput of the samples compared to traditional flask cultivation. There was a close negative correlation between glucoamylase activity and the pH of the fermentation broth. A novel high-throughput analysis method using Methyl Orange was developed. When compared to the conventional analysis method using 4-nitrophenyl α-D-glucopyranoside as substrate, a correlation coefficient of 0.96 was obtained by statistical analysis. Using this novel screening method, we acquired a strain with an activity of 2.2 × 10^3 U ml^-1, a 70% higher yield of glucoamylase than its parent strain.
Nonlinear dynamic analysis of voices before and after surgical excision of vocal polyps
NASA Astrophysics Data System (ADS)
Zhang, Yu; McGilligan, Clancy; Zhou, Liang; Vig, Mark; Jiang, Jack J.
2004-05-01
Phase space reconstruction, correlation dimension, and second-order entropy, methods from nonlinear dynamics, are used to analyze sustained vowels generated by patients before and after surgical excision of vocal polyps. Two conventional acoustic perturbation parameters, jitter and shimmer, are also employed to analyze voices before and after surgery. Presurgical and postsurgical analyses of jitter, shimmer, correlation dimension, and second-order entropy are statistically compared. Correlation dimension and second-order entropy show a statistically significant decrease after surgery, indicating reduced complexity and higher predictability of postsurgical voice dynamics. There is not a significant postsurgical difference in shimmer, although jitter shows a significant postsurgical decrease. The results suggest that jitter and shimmer should be applied to analyze disordered voices with caution; however, nonlinear dynamic methods may be useful for analyzing abnormal vocal function and quantitatively evaluating the effects of surgical excision of vocal polyps.
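A bare-bones illustration of the nonlinear-dynamics pipeline named above: delay embedding followed by a Grassberger-Procaccia correlation sum, whose log-log slope estimates the correlation dimension. The signal, delay, and embedding dimension below are placeholders, not the study's settings.

```python
# Delay embedding plus Grassberger-Procaccia correlation-dimension estimate.
import numpy as np

def embed(x, dim, tau):
    """Delay-coordinate phase-space reconstruction."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau: i * tau + n] for i in range(dim)])

t = np.linspace(0, 60, 1500)
x = np.sin(t) + 0.3 * np.sin(2.7 * t)          # stand-in for a voice signal
pts = embed(x, dim=3, tau=12)

# All pairwise distances (upper triangle only, to count each pair once).
dist = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
pair_d = dist[np.triu_indices(len(pts), k=1)]

# C(r) = fraction of pairs closer than r; slope of log C vs log r ~ dimension.
radii = np.logspace(-1.2, -0.2, 8)
logC = np.log([np.mean(pair_d < r) for r in radii])
slope = np.polyfit(np.log(radii), logC, 1)[0]
print(f"estimated correlation dimension ~ {slope:.2f}")
```

A postsurgical drop in this estimate, as the abstract reports, corresponds to a lower-dimensional, more predictable voice signal.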
Quantifying predictability in a model with statistical features of the atmosphere
Kleeman, Richard; Majda, Andrew J.; Timofeyev, Ilya
2002-01-01
The Galerkin truncated inviscid Burgers equation has recently been shown by the authors to be a simple model with many degrees of freedom, with many statistical properties similar to those occurring in dynamical systems relevant to the atmosphere. These properties include long time-correlated, large-scale modes of low frequency variability and short time-correlated “weather modes” at smaller scales. The correlation scaling in the model extends over several decades and may be explained by a simple theory. Here a thorough analysis of the nature of predictability in the idealized system is developed by using a theoretical framework developed by R.K. This analysis is based on a relative entropy functional that has been shown elsewhere by one of the authors to measure the utility of statistical predictions precisely. The analysis is facilitated by the fact that most relevant probability distributions are approximately Gaussian if the initial conditions are assumed to be so. Rather surprisingly this holds for both the equilibrium (climatological) and nonequilibrium (prediction) distributions. We find that in most cases the absolute difference in the first moments of these two distributions (the “signal” component) is the main determinant of predictive utility variations. Contrary to conventional belief in the ensemble prediction area, the dispersion of prediction ensembles is generally of secondary importance in accounting for variations in utility associated with different initial conditions. This conclusion has potentially important implications for practical weather prediction, where traditionally most attention has focused on dispersion and its variability. PMID:12429863
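For Gaussian prediction and climatology distributions, the relative entropy used above separates cleanly into the "signal" and "dispersion" contributions the abstract contrasts. The standard form for k-dimensional Gaussians p (prediction) and q (climatology) is, in notation of my own choosing:

```latex
R = \underbrace{\tfrac{1}{2}\,(\boldsymbol{\mu}_p-\boldsymbol{\mu}_q)^{\mathsf T}
      \boldsymbol{\Sigma}_q^{-1}(\boldsymbol{\mu}_p-\boldsymbol{\mu}_q)}_{\text{signal}}
  \;+\; \underbrace{\tfrac{1}{2}\!\left[\operatorname{tr}\!\big(\boldsymbol{\Sigma}_q^{-1}\boldsymbol{\Sigma}_p\big)
      - \ln\det\!\big(\boldsymbol{\Sigma}_q^{-1}\boldsymbol{\Sigma}_p\big) - k\right]}_{\text{dispersion}}
```

The abstract's conclusion can then be read directly off this decomposition: variations in utility are driven mainly by the first (mean-difference) term rather than by the covariance-dependent second term.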
Saadati, Farzaneh; Ahmad Tarmizi, Rohani
2015-01-01
Because students' ability to use statistics, which is mathematical in nature, is one of the concerns of educators, embedding the pedagogical characteristics of learning within an e-learning system is 'value added' because it facilitates the conventional method of learning mathematics. Many researchers emphasize the effectiveness of cognitive apprenticeship in learning and problem solving in the workplace. In a cognitive apprenticeship learning model, skills are learned within a community of practitioners through observation of modelling and then practice plus coaching. This study utilized an internet-based Cognitive Apprenticeship Model (i-CAM) in three phases and evaluated its effectiveness for improving statistics problem-solving performance among postgraduate students. The results showed that, when compared to the conventional mathematics learning model, the i-CAM could significantly promote students' problem-solving performance at the end of each phase. In addition, the differences in students' test scores remained statistically significant after controlling for pre-test scores. The findings conveyed in this paper confirmed the considerable value of i-CAM in the improvement of statistics learning for non-specialized postgraduate students. PMID:26132553
Economics of Undiscovered Oil and Gas in the North Slope of Alaska: Economic Update and Synthesis
Attanasi, E.D.; Freeman, P.A.
2009-01-01
The U.S. Geological Survey (USGS) has published assessments by geologists of undiscovered conventional oil and gas accumulations in the North Slope of Alaska; these assessments contain a set of scientifically based estimates of undiscovered, technically recoverable quantities of oil and gas in discrete oil and gas accumulations that can be produced with conventional recovery technology. The assessments do not incorporate economic factors such as recovery costs and product prices. The assessors considered undiscovered conventional oil and gas resources in four areas of the North Slope: (1) the central North Slope, (2) the National Petroleum Reserve in Alaska (NPRA), (3) the 1002 Area of the Arctic National Wildlife Refuge (ANWR), and (4) the area west of the NPRA, called in this report the 'western North Slope'. These analyses were prepared at different times with various minimum assessed oil and gas accumulation sizes and with slightly different assumptions. Results of these past studies were recently supplemented with information by the assessment geologists that allowed adjustments for uniform minimum assessed accumulation sizes and a consistent set of assumptions. The effort permitted the statistical aggregation of the assessments of the four areas composing the study area. This economic analysis is based on undiscovered assessed accumulation distributions represented by the four-area aggregation and incorporates updates of costs and technological and fiscal assumptions used in the initial economic analysis that accompanied the geologic assessment of each study area.
Correlation analysis between pulmonary function test parameters and CT image parameters of emphysema
NASA Astrophysics Data System (ADS)
Liu, Cheng-Pei; Li, Chia-Chen; Yu, Chong-Jen; Chang, Yeun-Chung; Wang, Cheng-Yi; Yu, Wen-Kuang; Chen, Chung-Ming
2016-03-01
Conventionally, diagnosis and severity classification of Chronic Obstructive Pulmonary Disease (COPD) are based on pulmonary function tests (PFTs). To reduce the need for PFTs in the diagnosis of COPD, this paper proposes a correlation model between lung CT images and the crucial PFT index, FEV1/FVC, a severity index of COPD distinguishing a normal subject from a COPD patient. A new lung CT image index, the Mirage Index (MI), has been developed to describe the severity of COPD, primarily in emphysema-dominant disease. Unlike the conventional Pixel Index (PI), which takes into account all voxels with HU values less than -950, the proposed approach models these voxels as bullae balls of different sizes and defines MI as a weighted sum of the percentages of the bullae balls of different size classes and locations in a lung. For evaluation of the efficacy of the proposed model, 45 emphysema subjects of different severity were involved in this study. In comparison with the conventional index, PI, the correlation between MI and FEV1/FVC is -0.75±0.08, which substantially outperforms the correlation between PI and FEV1/FVC, i.e., -0.63±0.11. Moreover, we have shown that the emphysematous lesion areas constituted by small bullae balls are basically irrelevant to FEV1/FVC. The statistical analysis and special case study results show that MI can offer better assessment in different analyses.
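The weighted-sum definition of MI can be made concrete with a small sketch. The size classes, weights, and percentages below are invented placeholders; the paper's actual calibration is not reproduced here.

```python
# Schematic Mirage Index: weighted sum over bullae-ball size classes and
# lung locations (all numbers illustrative, not the paper's values).
weight = {("small", "upper"): 0.2, ("small", "lower"): 0.1,
          ("medium", "upper"): 0.8, ("medium", "lower"): 0.6,
          ("large", "upper"): 1.5, ("large", "lower"): 1.2}
pct = {("small", "upper"): 1.4, ("small", "lower"): 0.9,   # % of lung volume
       ("medium", "upper"): 0.7, ("medium", "lower"): 0.5,
       ("large", "upper"): 0.3, ("large", "lower"): 0.2}

mi = sum(weight[k] * pct[k] for k in weight)
print(f"Mirage Index = {mi:.2f}")
```

Down-weighting the small size classes reflects the abstract's finding that small bullae are largely irrelevant to FEV1/FVC.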
Physical properties of conventional and Super Slick elastomeric ligatures after intraoral use.
Crawford, Nicola Louise; McCarthy, Caroline; Murphy, Tanya C; Benson, Philip Edward
2010-01-01
To investigate the change in the physical properties of conventional and Super Slick elastomeric ligatures after they have been in the mouth. Nine healthy volunteers took part. One orthodontic bracket was bonded to a premolar tooth in each of the four quadrants of the mouth. Two conventional and two Super Slick elastomeric ligatures were placed at random locations on either side of the mouth. The ligatures were collected after various time intervals and tested using an Instron Universal testing machine. The two outcome measures were failure load and static frictional resistance. The failure load for conventional ligatures was reduced to 67% of the original value after 6 weeks in situ. Super Slick elastomeric ligatures showed a comparable reduction after 6 weeks in situ (63% of original value). There were no statistical differences in the static friction between conventional and Super Slick elastomerics that had been in situ for either 24 hours (P = .686) or 6 weeks (P = .416). There was a good correlation between failure load and static friction (r = .49). There were statistically significant differences in the failure loads of elastomerics that had not been placed in the mouth and those that had been in the mouth for 6 weeks. There were no differences in the static frictional forces produced by conventional and Super Slick ligatures either before or after they had been placed in the mouth. There appears to be a directly proportional relationship between failure load and static friction of elastomeric ligatures.
Advanced Gear Alloys for Ultra High Strength Applications
NASA Technical Reports Server (NTRS)
Shen, Tony; Krantz, Timothy; Sebastian, Jason
2011-01-01
Single tooth bending fatigue (STBF) test data of UHS Ferrium C61 and C64 alloys are presented in comparison with historical test data of conventional gear steels (9310 and Pyrowear 53), using comparable statistical analysis methods. Pitting and scoring tests of C61 and C64 are works in progress. Boeing statistical analysis of STBF test data for the four gear steels (C61, C64, 9310 and Pyrowear 53) indicates that the UHS grades exhibit increases in fatigue strength in the low cycle fatigue (LCF) regime. In the high cycle fatigue (HCF) regime, the UHS steels exhibit better mean fatigue strength endurance limit behavior (particularly as compared to Pyrowear 53). However, due to considerable scatter in the UHS test data, the anticipated overall benefits of the UHS grades in bending fatigue have not been fully demonstrated. Based on all the test data and on Boeing's analysis, C61 has been selected by Boeing as the gear steel for the final ERDS demonstrator test gearboxes. In terms of potential follow-up work, detailed physics-based, micromechanical analysis and modeling of the fatigue data would allow for a better understanding of the causes of the experimental scatter, and of the transition from high-stress LCF (surface-dominated) to low-stress HCF (subsurface-dominated) fatigue failure. Additional STBF test data and failure analysis work, particularly in the HCF regime and around the endurance limit stress, could allow for better statistical confidence and could reduce the observed effects of experimental test scatter. Finally, the need for further optimization of the residual compressive stress profiles of the UHS steels (resulting from carburization and peening) is noted, particularly for the case of the higher hardness C64 material.
Oral 5-aminosalicylic acid for maintenance of remission in ulcerative colitis.
Wang, Yongjun; Parker, Claire E; Feagan, Brian G; MacDonald, John K
2016-05-09
Oral 5-aminosalicylic acid (5-ASA) preparations were intended to avoid the adverse effects of sulfasalazine (SASP) while maintaining its therapeutic benefits. Previously, it was found that 5-ASA drugs were more effective than placebo but had a statistically significant therapeutic inferiority relative to SASP. This updated review includes more recent studies and evaluates the effectiveness, dose-responsiveness, and safety of 5-ASA preparations used for maintenance of remission in quiescent ulcerative colitis. The primary objectives were to assess the efficacy, dose-responsiveness and safety of oral 5-ASA compared to placebo, SASP, or 5-ASA comparators for maintenance of remission in quiescent ulcerative colitis. A secondary objective was to compare the efficacy and safety of once daily dosing of oral 5-ASA with conventional (two or three times daily) dosing regimens. A literature search for relevant studies (inception to 9 July 2015) was performed using MEDLINE, EMBASE and the Cochrane Library. Review articles and conference proceedings were also searched to identify additional studies. Studies were accepted for analysis if they were randomized controlled trials with a minimum treatment duration of six months. Studies of oral 5-ASA therapy for treatment of patients with quiescent ulcerative colitis compared with placebo, SASP or other 5-ASA formulations were considered for inclusion. Studies that compared once daily 5-ASA treatment with conventional dosing of 5-ASA and 5-ASA dose ranging studies were also considered for inclusion. The primary outcome was the failure to maintain clinical or endoscopic remission. Secondary outcomes included adherence, adverse events, withdrawals due to adverse events, and withdrawals or exclusions after entry. Trials were separated into five comparison groups: 5-ASA versus placebo, 5-ASA versus sulfasalazine, once daily dosing versus conventional dosing, 5-ASA versus comparator 5-ASA formulation, and 5-ASA dose-ranging. Placebo-controlled trials were subgrouped by dosage. Once daily versus conventional dosing studies were subgrouped by formulation. 5-ASA-controlled trials were subgrouped by common 5-ASA comparators (e.g. Asacol and Salofalk). Dose-ranging studies were subgrouped by 5-ASA formulation. We calculated the risk ratio (RR) and 95% confidence intervals (95% CI) for each outcome. Data were analyzed on an intention-to-treat basis. Forty-one studies (8928 patients) were included. The majority of included studies were rated as low risk of bias. Ten studies were rated at high risk of bias. Seven of these studies were single-blind and three studies were open-label. However, two open-label studies and four of the single-blind studies utilized investigator-performed endoscopy as an endpoint, which may protect against bias. 5-ASA was significantly superior to placebo for maintenance of clinical or endoscopic remission. Forty-one per cent of 5-ASA patients relapsed compared to 58% of placebo patients (7 studies, 1298 patients; RR 0.69, 95% CI 0.62 to 0.77). There was a trend towards greater efficacy with higher doses of 5-ASA with a statistically significant benefit for the 1 to 1.9 g/day (RR 0.65; 95% CI 0.56 to 0.76) and the > 2 g/day subgroups (RR 0.73, 95% CI 0.60 to 0.89). SASP was significantly superior to 5-ASA for maintenance of remission. Forty-eight per cent of 5-ASA patients relapsed compared to 43% of SASP patients (12 studies, 1655 patients; RR 1.14, 95% CI 1.03 to 1.27).
A GRADE analysis indicated that the overall quality of the evidence for the primary outcome for the placebo and SASP-controlled studies was high. No statistically significant differences in efficacy or adherence were found between once daily and conventionally dosed 5-ASA. Twenty-nine per cent of once daily patients relapsed over 12 months compared to 31% of conventionally dosed patients (8 studies, 3127 patients; RR 0.91, 95% CI 0.82 to 1.01). Eleven per cent of patients in the once daily group failed to adhere to their medication regimen compared to 9% of patients in the conventional dosing group (6 studies, 1462 patients; RR 1.22, 95% CI 0.91 to 1.64). There does not appear to be any difference in efficacy among the various 5-ASA formulations. Forty-four per cent of patients in the 5-ASA group relapsed compared to 41% of patients in the 5-ASA comparator group (6 studies, 707 patients; RR 1.08, 95% CI 0.91 to 1.28). A pooled analysis of two studies showed no statistically significant difference in efficacy between Balsalazide 6 g and 3 g/day. Twenty-three per cent of patients in the 6 g/day group relapsed compared to 33% of patients in the 3 g/day group (216 patients; RR 0.76; 95% CI 0.45 to 2.79). One study found Balsalazide 4 g to be superior to 2 g/day. Thirty-seven per cent of patients in the 4 g/day Balsalazide group relapsed compared to 55% of patients in the 2 g/day group (133 patients; RR 0.66; 95% CI 0.45 to 0.97). One study found a statistically significant difference between Salofalk granules 3 g and 1.5 g/day. Twenty-five per cent of patients in the Salofalk 3 g/day group relapsed compared to 39% of patients in the 1.5 g/day group (429 patients; RR 0.65; 95% CI 0.49 to 0.86). Common adverse events included flatulence, abdominal pain, nausea, diarrhea, headache, dyspepsia, and nasopharyngitis. There were no statistically significant differences in the incidence of adverse events between 5-ASA and placebo, 5-ASA and SASP, once daily and conventionally dosed 5-ASA, 5-ASA and comparator 5-ASA formulations and 5-ASA dose ranging studies. The trials that compared 5-ASA and SASP may have been biased in favour of SASP because most trials enrolled patients known to be tolerant to SASP which may have minimized SASP-related adverse events. 5-ASA was superior to placebo for maintenance therapy in ulcerative colitis. However, 5-ASA had a statistically significant therapeutic inferiority relative to SASP. Oral 5-ASA administered once daily is as effective and safe as conventional dosing for maintenance of remission in quiescent ulcerative colitis. There does not appear to be any difference in efficacy or safety between the various formulations of 5-ASA. Patients with extensive ulcerative colitis or with frequent relapses may benefit from a higher dose of maintenance therapy. High dose therapy appears to be as safe as low dose and is not associated with a higher incidence of adverse events.
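To make the arithmetic behind figures such as "RR 0.69, 95% CI 0.62 to 0.77" concrete, here is a minimal Python sketch of a risk ratio with its log-normal 95% confidence interval. The counts are hypothetical, back-calculated from the pooled percentages quoted above; an actual meta-analysis pools per-study estimates (e.g. by Mantel-Haenszel or inverse-variance weighting) rather than collapsing everything into a single 2x2 table.

```python
import math

def risk_ratio(events_a, total_a, events_b, total_b):
    """Risk ratio of group A vs group B with a 95% CI (log-normal approximation)."""
    rr = (events_a / total_a) / (events_b / total_b)
    # Standard error of log(RR) from the usual delta-method formula
    se = math.sqrt(1/events_a - 1/total_a + 1/events_b - 1/total_b)
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, lo, hi

# Hypothetical counts mirroring the 5-ASA vs placebo relapse percentages
# (41% vs 58%); not the actual per-study data from the review.
print(risk_ratio(266, 649, 376, 649))  # roughly RR 0.71 with its CI
```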
de Castro, Denise Tornavoi; Lepri, César Penazzo; Valente, Mariana Lima da Costa; dos Reis, Andréa Cândido
2016-01-01
The aim of this study was to compare the compressive strength of a silorane-based composite resin (Filtek P90) to that of conventional composite resins (Charisma, Filtek Z250, Fill Magic, and NT Premium) before and after accelerated artificial aging (AAA). For each composite resin, 16 cylindrical specimens were prepared and divided into 2 groups. One group underwent analysis of compressive strength in a universal testing machine 24 hours after preparation, and the other was subjected first to 192 hours of AAA and then the compressive strength test. Data were analyzed by analysis of variance, followed by the Tukey HSD post hoc test (α = 0.05). Some statistically significant differences in compressive strength were found among the commercial brands (P < 0.001). The conventional composite resin Fill Magic presented the best performance before (P < 0.05) and after AAA (P < 0.05). Values for compressive strength of the silorane-based composite were among the lowest obtained, both before and after aging. Comparison of each material before and after AAA revealed that the aging process did not influence the compressive strength of the tested resins (P = 0.785).
Stoetzer, Marcus; Felgenträger, Dörthe; Kampmann, Andreas; Schumann, Paul; Rücker, Martin; Gellrich, Nils-Claudius; von See, Constantin
2014-07-01
Subperiosteal preparation using a periosteal elevator leads to disturbances of local periosteal microcirculation. Soft-tissue damage can usually be considerably reduced using piezoelectric technology. For this reason, we investigated the effects of a novel piezoelectric device on local periosteal microcirculation and compared this approach with the conventional preparation of the periosteum using a periosteal elevator. A total of 20 Lewis rats were randomly assigned to one of two groups. Subperiosteal preparation was performed using either a piezoelectric device or a periosteal elevator. Intravital microscopy was performed immediately after the procedure as well as three and eight days postoperatively. Statistical analysis of microcirculatory parameters was performed offline using analysis of variance (ANOVA) on ranks (p<0.05). At all time points investigated, intravital microscopy demonstrated significantly higher levels of periosteal perfusion in the group of rats that underwent piezosurgery than in the group of rats that underwent treatment with a periosteal elevator. The use of a piezoelectric device for subperiosteal preparation is associated with better periosteal microcirculation than the use of a conventional periosteal elevator. As a result, piezoelectric devices can be expected to have a positive effect on bone metabolism. Copyright © 2014 Elsevier Inc. All rights reserved.
Lu, Chao; Lv, Xueyou; Lin, Yiming; Li, Dejian; Chen, Lihua; Ji, Feng; Li, Youming; Yu, Chaohui
2016-01-01
Conventional forceps biopsy (CFB) is the most popular way to screen for gastric epithelial neoplasia (GEN) and adenocarcinoma of the gastric epithelium. The aim of this study was to compare the diagnostic accuracy between conventional forceps biopsy and endoscopic submucosal dissection (ESD). Four hundred forty-four patients who ultimately underwent ESD in our hospital were enrolled from Jan 1, 2009 to Sep 1, 2015. We retrospectively assessed the characteristics of the pathological results of CFB and ESD. The concordance rate between CFB and ESD specimens was 68.92% (306/444). Men showed a lower concordance rate (63.61% vs 79.33%; P = 0.001), and concordant patients were younger (P = 0.048). In multivariate analysis, men had a significantly lower concordance rate (coefficient −0.730, P = 0.002) and a higher rate of pathological upgrade (coefficient −0.648, P = 0.015). The location of the CFB did not statistically influence the concordance rate. The concordance rate was relatively high in our hospital. Based on our analysis, ESD is strongly suggested for older men with precancerous lesions found on CFB of the gastric fundus or antrum, whereas young women with low-grade intraepithelial neoplasia could opt for regular follow-up.
Why weight? Modelling sample and observational level variability improves power in RNA-seq analyses.
Liu, Ruijie; Holik, Aliaksei Z; Su, Shian; Jansz, Natasha; Chen, Kelan; Leong, Huei San; Blewitt, Marnie E; Asselin-Labat, Marie-Liesse; Smyth, Gordon K; Ritchie, Matthew E
2015-09-03
Variations in sample quality are frequently encountered in small RNA-sequencing experiments, and pose a major challenge in a differential expression analysis. Removal of high variation samples reduces noise, but at a cost of reducing power, thus limiting our ability to detect biologically meaningful changes. Similarly, retaining these samples in the analysis may not reveal any statistically significant changes due to the higher noise level. A compromise is to use all available data, but to down-weight the observations from more variable samples. We describe a statistical approach that facilitates this by modelling heterogeneity at both the sample and observational levels as part of the differential expression analysis. At the sample level this is achieved by fitting a log-linear variance model that includes common sample-specific or group-specific parameters that are shared between genes. The estimated sample variance factors are then converted to weights and combined with observational level weights obtained from the mean-variance relationship of the log-counts-per-million using 'voom'. A comprehensive analysis involving both simulations and experimental RNA-sequencing data demonstrates that this strategy leads to a universally more powerful analysis and fewer false discoveries when compared to conventional approaches. This methodology has wide application and is implemented in the open-source 'limma' package. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
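The described strategy is implemented in the R/Bioconductor limma package ('voom' combined with sample quality weights); the following Python sketch only illustrates the core idea under simplified assumptions: estimate a per-sample variance factor, convert it to a weight shared across genes, combine it with observation-level weights, and fit a weighted linear model per gene. The crude variance estimate and all names are illustrative stand-ins, not limma's actual estimators.

```python
import numpy as np

rng = np.random.default_rng(0)
n_genes, n_samples = 1000, 6
logcpm = rng.normal(5, 1, size=(n_genes, n_samples))          # placeholder log-CPM matrix
design = np.column_stack([np.ones(n_samples), [0, 0, 0, 1, 1, 1]])  # intercept + group

# Sample-level weights: inverse of a (crudely) estimated per-sample variance factor.
sample_var = logcpm.var(axis=0)
sample_w = sample_var.mean() / sample_var

# Observation-level weights would come from the mean-variance trend ('voom');
# a placeholder of ones is used here.
obs_w = np.ones_like(logcpm)
w = obs_w * sample_w                                          # combined weights, genes x samples

def wls_group_effect(y, X, w_row):
    """Weighted least-squares fit for one gene; returns the group-effect estimate."""
    W = np.diag(w_row)
    beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
    return beta[1]

effects = np.array([wls_group_effect(logcpm[g], design, w[g]) for g in range(n_genes)])
```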
Safety evaluation of joint and conventional lane merge configurations for freeway work zones.
Ishak, Sherif; Qi, Yan; Rayaprolu, Pradeep
2012-01-01
Inefficient operation of traffic in work zone areas not only leads to an increase in travel time delays, queue length, and fuel consumption but also increases the number of forced merges and roadway accidents. This study evaluated the safety performance of work zones with a conventional lane merge (CLM) configuration in Louisiana. Analysis of variance (ANOVA) was used to compare the crash rates for accidents involving fatalities, injuries, and property damage only (PDO) in each of the following 4 areas: (1) advance warning area, (2) transition area, (3) work area, and (4) termination area. The analysis showed that the advance warning area had higher fatality, injury, and PDO crash rates when compared to the transition area, work area, and termination area. This finding confirmed the need to make improvements in the advance warning area where merging maneuvers take place. Therefore, a new lane merge configuration, called joint lane merge (JLM), was proposed and its safety performance was examined and compared to the conventional lane merge configuration using a microscopic simulation model (VISSIM), which was calibrated with real-world data from an existing work zone on I-55 and used to simulate a total of 25 different scenarios with different levels of demand and traffic composition. Safety performance was evaluated using 2 surrogate measures: uncomfortable decelerations and speed variance. Statistical analysis was conducted to determine whether the differences in safety performance between both configurations were significant. The safety analysis indicated that JLM outperformed CLM in most cases with low to moderate flow rates and that the percentage of trucks did not have a significant impact on the safety performance of either configuration. Though the safety analysis did not clearly indicate which lane merge configuration is safer for the overall work zone area, it was able to identify the possibly associated safety changes within the work zone area under different traffic conditions. Copyright © 2012 Taylor & Francis Group, LLC
Nonequilibrium thermodynamics in sheared hard-sphere materials.
Lieou, Charles K C; Langer, J S
2012-06-01
We combine the shear-transformation-zone (STZ) theory of amorphous plasticity with Edwards' statistical theory of granular materials to describe shear flow in a disordered system of thermalized hard spheres. The equations of motion for this system are developed within a statistical thermodynamic framework analogous to that which has been used in the analysis of molecular glasses. For hard spheres, the system volume V replaces the internal energy U as a function of entropy S in conventional statistical mechanics. In place of the effective temperature, the compactivity X=∂V/∂S characterizes the internal state of disorder. We derive the STZ equations of motion for a granular material accordingly, and predict the strain rate as a function of the ratio of the shear stress to the pressure for different values of a dimensionless, temperature-like variable near a jamming transition. We use a simplified version of our theory to interpret numerical simulations by Haxton, Schmiedeberg, and Liu, and in this way are able to obtain useful insights about internal rate factors and relations between jamming and glass transitions.
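In equations, the substitution described above reads as follows, with S the entropy, U the internal energy, V the system volume, and X the compactivity:

```latex
\begin{align*}
  T &= \frac{\partial U}{\partial S} \quad \text{(temperature, conventional statistical mechanics)} \\
  X &= \frac{\partial V}{\partial S} \quad \text{(Edwards compactivity, thermalized hard spheres)}
\end{align*}
```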
permGPU: Using graphics processing units in RNA microarray association studies.
Shterev, Ivo D; Jung, Sin-Ho; George, Stephen L; Owzar, Kouros
2010-06-16
Many analyses of microarray association studies involve permutation, bootstrap resampling and cross-validation, which are ideally formulated as embarrassingly parallel computing problems. Given that these analyses are computationally intensive, scalable approaches that can take advantage of multi-core processor systems need to be developed. We have developed a CUDA-based implementation, permGPU, that employs graphics processing units in microarray association studies. We illustrate the performance and applicability of permGPU within the context of permutation resampling for a number of test statistics. An extensive simulation study demonstrates a dramatic increase in performance when using permGPU on an NVIDIA GTX 280 card compared to an optimized C/C++ solution running on a conventional Linux server. permGPU is available as an open-source stand-alone application and as an extension package for the R statistical environment. It provides a dramatic increase in performance for permutation resampling analysis in the context of microarray association studies. The current version offers six test statistics for carrying out permutation resampling analyses for binary, quantitative and censored time-to-event traits.
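permGPU's kernels are not reproduced here, but the following Python sketch shows the embarrassingly parallel computation it accelerates: permutation p-values for per-gene two-group statistics. The mean-difference statistic and all names are illustrative; permGPU offers six test statistics and runs the permutation loop on the GPU.

```python
import numpy as np

def perm_pvalues(expr, labels, n_perm=10000, seed=1):
    """Permutation p-values for per-gene mean-difference statistics.

    expr: (n_genes, n_samples) expression matrix; labels: boolean group vector.
    A CPU reference for the resampling loop that permGPU offloads to the GPU.
    """
    rng = np.random.default_rng(seed)
    labels = np.asarray(labels, dtype=bool)
    obs = expr[:, labels].mean(1) - expr[:, ~labels].mean(1)
    exceed = np.zeros(expr.shape[0])
    for _ in range(n_perm):
        perm = rng.permutation(labels)               # shuffle group assignments
        stat = expr[:, perm].mean(1) - expr[:, ~perm].mean(1)
        exceed += np.abs(stat) >= np.abs(obs)        # count equally extreme statistics
    return (exceed + 1) / (n_perm + 1)               # add-one correction avoids p = 0
```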
A statistical method for the conservative adjustment of false discovery rate (q-value).
Lai, Yinglei
2017-03-14
q-value is a widely used statistical method for estimating the false discovery rate (FDR), which is a conventional significance measure in the analysis of genome-wide expression data. The q-value is a random variable and it may underestimate the FDR in practice. An underestimated FDR can lead to unexpected false discoveries in follow-up validation experiments. This issue has not been well addressed in the literature, especially when a permutation procedure is necessary for p-value calculation. We proposed a statistical method for the conservative adjustment of the q-value. In practice, it is usually necessary to calculate the p-value by a permutation procedure; this was also considered in our adjustment method. We used simulation data as well as experimental microarray and sequencing data to illustrate the usefulness of our method. The conservativeness of our approach has been mathematically confirmed in this study. We have demonstrated the importance of conservative adjustment of the q-value, particularly when the proportion of differentially expressed genes is small or the overall differential expression signal is weak.
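For orientation, here is a minimal sketch of the baseline q-value computation (Storey-style, with the Benjamini-Hochberg step-up recovered as the special case pi0 = 1). This is the quantity the paper argues may underestimate the FDR; the proposed conservative adjustment itself is not reproduced here.

```python
import numpy as np

def qvalues(p, pi0=1.0):
    """Storey-style q-values from a vector of p-values.

    Simplified sketch: q(p_(i)) = min over j >= i of pi0 * m * p_(j) / j,
    enforced by a reverse cumulative minimum over the sorted p-values.
    """
    p = np.asarray(p, dtype=float)
    m = p.size
    order = np.argsort(p)
    ranked = p[order]
    q = pi0 * m * ranked / np.arange(1, m + 1)
    q = np.minimum.accumulate(q[::-1])[::-1]      # enforce monotonicity
    out = np.empty(m)
    out[order] = np.minimum(q, 1.0)
    return out
```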
On Mars too, expect macroweather
NASA Astrophysics Data System (ADS)
Boisvert, Jean-Philippe; Lovejoy, Shaun; Muller, Jan-Peter
2015-04-01
Terrestrial atmospheric and oceanic spectra show drastic transitions at τw ≈ 10 days and τow ≈ 1 year, respectively; this has been theorized as the lifetime of planetary-scale structures. For wind and temperature, the forms of the low and high frequency parts of the spectra (macroweather, weather) as well as τw can be theoretically estimated, the latter depending notably on the solar-induced turbulent energy flux. We extend the theory to other planets and test it using Viking lander and reanalysis data from Mars. When the Martian spectra are scaled by the theoretical amount, they agree very well with their terrestrial atmospheric counterparts. Although the usual interpretation of Martian atmospheric dynamics is highly mechanistic (e.g. wave and tidal explanations are invoked), trace moment analysis of the reanalysis fields shows that the statistics closely follow the predictions of multiplicative cascade theories. This shows that statistical scaling can be compatible with conventional deterministic thinking. However, since we are usually interested in statistical knowledge, it is the former, not the latter, that is of primary interest. We discuss the implications for understanding planetary fluid dynamical systems.
Himmelreich, Uwe; Somorjai, Ray L.; Dolenko, Brion; Lee, Ok Cha; Daniel, Heide-Marie; Murray, Ronan; Mountford, Carolyn E.; Sorrell, Tania C.
2003-01-01
Nuclear magnetic resonance (NMR) spectra were acquired from suspensions of clinically important yeast species of the genus Candida to characterize the relationship between metabolite profiles and species identification. Major metabolites were identified by using two-dimensional correlation NMR spectroscopy. One-dimensional proton NMR spectra were analyzed by using a staged statistical classification strategy. Analysis of NMR spectra from 442 isolates of Candida albicans, C. glabrata, C. krusei, C. parapsilosis, and C. tropicalis resulted in rapid, accurate identification when compared with conventional and DNA-based identification. Spectral regions used for the classification of the five yeast species revealed species-specific differences in relative amounts of lipids, trehalose, polyols, and other metabolites. Isolates of C. parapsilosis and C. glabrata with unusual PCR fingerprinting patterns also generated atypical NMR spectra, suggesting the possibility of intraspecies discontinuity. We conclude that NMR spectroscopy combined with a statistical classification strategy is a rapid, nondestructive, and potentially valuable method for identification and chemotaxonomic characterization that may be broadly applicable to fungi and other microorganisms.
Liao, Yen-Nung; Liu, Ching-Shen; Tsai, Tong-Rong; Hung, Yu-Chiang; Chang, Shun-Jen; Lin, Hong-Long; Chen, Ying-Chou; Lai, Han-Ming; Yu, Shan-Fu; Chen, Chung-Jen
2011-07-01
Systemic lupus erythematosus (SLE) is a chronic systemic autoimmune disease. Prolonged complete remission is rare. Most patients with SLE need long-term treatment with glucocorticoids and immunomodulators. However, side effects from these medications are common. We evaluated the effect of adding Dan-Chi-Liu-Wei combination (DCLWC) to conventional therapy in SLE patients for tapering steroids and preventing disease flare-up. This was a double-blind, randomized controlled trial. Sixty-six SLE patients were recruited into this study, and 53 patients who fulfilled the 1997 revised criteria for the classification of SLE, with an SLE disease activity index (SLEDAI) score of 2-12 and a daily steroid dose (measured as prednisolone) of less than 20 mg/d, were enrolled. The patients were randomized into either an experimental or a control group. We checked urine analysis, hemogram, liver function, renal function, C3, C4, erythrocyte sedimentation rate, and anti-dsDNA, evaluated the SLEDAI score, and recorded the steroid dose at 0, 3, and 6 months, respectively. After 6 months of the study, the C4 and blood urea nitrogen levels revealed a statistically significant difference in either group. There was a tendency toward a decreased SLEDAI score in the experimental group (p=0.083) but not in the control group (p=0.867). The change in steroid dose was not statistically significant in either group. Renal function and liver function revealed no statistically significant changes in either group. Adding DCLWC to conventional therapy for the treatment of SLE was safe and might have a borderline effect in decreasing disease activity, but it was not possible to taper the dosage of steroid after 6 months of the clinical trial. Therefore, long-term follow-up and a large-scale study are necessary to confirm the effect of DCLWC. Copyright © 2011 Elsevier Taiwan LLC. All rights reserved.
DUQUE, Jussaro Alves; VIVAN, Rodrigo Ricci; CAVENAGO, Bruno Cavalini; AMOROSO-SILVA, Pablo Andrés; BERNARDES, Ricardo Affonso; de VASCONCELOS, Bruno Carvalho; DUARTE, Marco Antonio Hungaro
2017-01-01
Objective: This study aimed to evaluate the influence of the NiTi wire in Conventional NiTi (ProTaper Universal, PTU) and Controlled Memory NiTi (ProTaper Gold, PTG) instrument systems on the quality of root canal preparation. Material and Methods: Twelve mandibular molars with separate mesial canals were scanned using a high-definition microcomputed tomography system. The PTU and PTG instruments were used to shape twelve mesial canals each. The canals were scanned after preparation with the F2 and F3 instruments of the PTU and PTG systems. The analyzed parameters included the remaining dentin thickness at the apical and cervical levels, root canal volume, and untouched canal walls. Data were analyzed for statistical significance by the Friedman and Dunn's tests. For the comparison of data between groups, the Mann-Whitney test was used. Results: In the pre-operative analysis, there were no statistically significant differences between the groups in terms of the area and volume of root canals (P>.05). There was also no statistically significant difference between the systems with respect to root canal volume after use of the F2 and F3 instruments. There was no statistical difference in dentin thickness at the first apical level before and after instrumentation for either system. At the 3 cervical levels, the PTG maintained centralization of the preparation on the transition between the F2 and F3 instruments, which did not occur with the PTU. Conclusion: The Conventional NiTi (PTU) and Controlled Memory NiTi (PTG) instruments displayed comparable capabilities for shaping the straight mesial root canals of mandibular molars, although the PTG was better than the PTU at maintaining the centralization of the shape in the cervical portion.
Huang, Zhi; Wang, Xin-zhi; Hou, Yue-Zhong
2015-02-01
Making impressions for maxillectomy patients is an essential but difficult task. This study developed a novel method to fabricate individual trays by computer-aided design (CAD) and rapid prototyping (RP) to simplify the process and enhance patient safety. Five unilateral maxillectomy patients were recruited for this study. For each patient, a computed tomography (CT) scan was taken. Based on the 3D surface reconstruction of the target area, an individual tray was manufactured by CAD/RP. With a conventional custom tray as control, two final impressions were made using the different types of tray for each patient. The trays were sectioned, and in each section the thickness of the material was measured at six evenly distributed points. Descriptive statistics and the paired t-test were used to examine differences in impression thickness; SAS 9.3 was used for the statistical analysis. All casts were then optically 3D scanned and compared digitally to evaluate the feasibility of this method. Impressions of all five maxillectomy patients were successfully made with individual trays fabricated by CAD/RP and with traditional trays. The descriptive statistics of the impression thickness measurements showed slightly more uneven results for the traditional trays, but the difference was not statistically significant. A 3D digital comparison showed acceptable discrepancies within 1 mm in the majority of cast areas. The largest difference, of 3 mm, was observed in the buccal wall of the defective areas. Moderate deviations of 1 to 2 mm were detected in the buccal and labial vestibular groove areas. This study confirmed the feasibility of a novel method of fabricating individual trays by CAD/RP. Impressions made by individual trays manufactured using CAD/RP had a uniform thickness, with an acceptable level of accuracy compared to those made through conventional processes. © 2014 by the American College of Prosthodontists.
Franceschi, Massimo; Caffarra, Paolo; Savarè, Rita; Cerutti, Renata; Grossi, Enzo
2011-01-01
The early differentiation of Alzheimer's disease (AD) from frontotemporal dementia (FTD) may be difficult. The Tower of London (ToL), thought to assess executive functions such as planning and visuo-spatial working memory, could help in this purpose. Twenty-two Dementia Centers consecutively recruited patients with early FTD or AD. ToL performances of these groups were analyzed using both conventional statistical approaches and Artificial Neural Networks (ANNs) modelling. Ninety-four non-aphasic FTD and 160 AD patients were recruited. The ToL Accuracy Score (AS) significantly (p < 0.05) differentiated FTD from AD patients. However, the discriminant validity of the AS, checked by ROC curve analysis, yielded no significant results in terms of sensitivity and specificity (AUC 0.63). The performances on the 12 Success Subscores (SS), together with age, gender, and schooling years, were entered into advanced ANNs developed by the Semeion Institute. The best ANNs were selected and submitted to ROC curves. The non-linear model was able to discriminate FTD from AD with an average AUC of 0.82 over 7 independent trials. The use of the hidden information contained in the different items of the ToL and the non-linear processing of the data through ANNs allow high discrimination between FTD and AD in individual patients.
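The Semeion ANNs used in the study are proprietary, but the general workflow, a non-linear classifier over the 12 ToL subscores plus age, gender and schooling, evaluated by ROC AUC, can be sketched with a generic multilayer perceptron. The data below are random placeholders, so the AUC will hover near 0.5; with the real subscores the reported average was 0.82.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import roc_auc_score

# Hypothetical stand-in data: 12 ToL success subscores plus age, gender and
# schooling years per patient; labels 1 = FTD, 0 = AD.
rng = np.random.default_rng(0)
X = rng.normal(size=(254, 15))
y = rng.integers(0, 2, size=254)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(X_tr, y_tr)
print(roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]))  # discrimination on held-out cases
```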
Terrain-analysis procedures for modeling radar backscatter
Schaber, Gerald G.; Pike, Richard J.; Berlin, Graydon Lennis
1978-01-01
The collection and analysis of detailed information on the surface of natural terrain are important aspects of radar-backscatter modeling. Radar is especially sensitive to surface-relief changes at the millimeter-to-decimeter scale for conventional K-band (~1-cm wavelength) to L-band (~25-cm wavelength) radar systems. Surface roughness statistics that characterize these changes in detail have been generated from sets of field measurements by a comprehensive set of seven programmed calculations for radar-backscatter modeling. The seven programs are: 1) formatting of data in readable form for the subsequent topographic analysis programs; 2) relief analysis; 3) power spectral analysis; 4) power spectrum plots; 5) slope angle between slope reversals; 6) slope angle against slope interval plots; and 7) base-length slope angle and curvature. This complete Fortran IV software package, 'Terrain Analysis', is presented here for the first time. It was originally developed a decade ago for investigations of lunar morphology and surface trafficability for the Apollo Lunar Roving Vehicle.
The importance of proving the null.
Gallistel, C R
2009-04-01
Null hypotheses are simple, precise, and theoretically important. Conventional statistical analysis cannot support them; Bayesian analysis can. The challenge in a Bayesian analysis is to formulate a suitably vague alternative, because the vaguer the alternative is (the more it spreads out the unit mass of prior probability), the more the null is favored. A general solution is a sensitivity analysis: Compute the odds for or against the null as a function of the limit(s) on the vagueness of the alternative. If the odds on the null approach 1 from above as the hypothesized maximum size of the possible effect approaches 0, then the data favor the null over any vaguer alternative to it. The simple computations and the intuitive graphic representation of the analysis are illustrated by the analysis of diverse examples from the current literature. They pose 3 common experimental questions: (a) Are 2 means the same? (b) Is performance at chance? (c) Are factors additive? © 2009 APA, all rights reserved
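A minimal sketch of the described sensitivity analysis for question (a), assuming a normal mean with known sampling standard deviation: the null fixes mu = 0, the alternative spreads mu over N(0, tau^2), and both marginal likelihoods are then available in closed form. As tau grows (a vaguer alternative) the odds increasingly favor the null; as tau shrinks toward 0 they approach 1, which is exactly the diagnostic described above. All numbers are illustrative.

```python
import numpy as np
from scipy import stats

def bf01(xbar, n, sigma, tau):
    """Bayes factor (null over alternative) for H0: mu = 0 vs mu ~ N(0, tau^2),
    with known sampling sd sigma. Under each hypothesis the sample mean is
    normal, so both marginal likelihoods have closed forms."""
    like_null = stats.norm.pdf(xbar, 0, sigma / np.sqrt(n))
    like_alt = stats.norm.pdf(xbar, 0, np.sqrt(sigma**2 / n + tau**2))
    return like_null / like_alt

# Sensitivity analysis: odds on the null as the alternative's vagueness grows.
for tau in [0.05, 0.1, 0.5, 1.0, 5.0]:
    print(tau, bf01(xbar=0.02, n=50, sigma=1.0, tau=tau))
```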
SU-F-I-10: Spatially Local Statistics for Adaptive Image Filtering
DOE Office of Scientific and Technical Information (OSTI.GOV)
Iliopoulos, AS; Sun, X; Floros, D
Purpose: To facilitate adaptive image filtering operations, addressing spatial variations in both noise and signal. Such issues are prevalent in cone-beam projections, where physical effects such as X-ray scattering result in spatially variant noise, violating common assumptions of homogeneous noise and challenging conventional filtering approaches to signal extraction and noise suppression. Methods: We present a computational mechanism for probing into and quantifying the spatial variance of noise throughout an image. The mechanism builds a pyramid of local statistics at multiple spatial scales; local statistical information at each scale includes (weighted) mean, median, standard deviation, median absolute deviation, as well as histogram or dynamic range after local mean/median shifting. Based on inter-scale differences of local statistics, the spatial scope of distinguishable noise variation is detected in a semi- or un-supervised manner. Additionally, we propose and demonstrate the incorporation of such information in globally parametrized (i.e., non-adaptive) filters, effectively transforming the latter into spatially adaptive filters. The multi-scale mechanism is realized by efficient algorithms and implemented on parallel CPU/GPU architectures. Results: We demonstrate the impact of local statistics for adaptive image processing and analysis using cone-beam projections of a Catphan phantom, fitted within an annulus to increase X-ray scattering. The effective spatial scope of local statistics calculations is shown to vary throughout the image domain, necessitating multi-scale noise and signal structure analysis. Filtering results with and without spatial filter adaptation are compared visually, illustrating improvements in imaging signal extraction and noise suppression, and in preserving information in low-contrast regions. Conclusion: Local image statistics can be incorporated in filtering operations to equip them with spatial adaptivity to spatial signal/noise variations. An efficient multi-scale computational mechanism is developed to curtail processing latency. Spatially adaptive filtering may impact subsequent processing tasks such as reconstruction and numerical gradient computations for deformable registration. NIH Grant No. R01-184173.
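A simplified sketch of the local-statistics pyramid, assuming a 2-D float image and using box filters for the local moments; the actual mechanism also tracks medians, MAD and shifted histograms, and runs on parallel CPU/GPU architectures.

```python
import numpy as np
from scipy import ndimage

def local_stats_pyramid(img, scales=(3, 9, 27)):
    """Local mean and standard deviation of a 2-D float image at several
    window sizes. Inter-scale differences of these maps can flag where the
    noise properties change spatially."""
    pyramid = {}
    for k in scales:
        mean = ndimage.uniform_filter(img, size=k)
        sq_mean = ndimage.uniform_filter(img * img, size=k)
        std = np.sqrt(np.maximum(sq_mean - mean**2, 0.0))  # clamp rounding error
        pyramid[k] = (mean, std)
    return pyramid

# A globally parametrized smoother can be made spatially adaptive by modulating
# its strength with the local noise estimate, e.g. smoothing more aggressively
# where the local standard deviation is high.
```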
An evaluation of student and clinician perception of digital and conventional implant impressions.
Lee, Sang J; Macarthur, Robert X; Gallucci, German O
2013-11-01
The accuracy and efficiency of digital implant impressions should match conventional impressions. Comparisons should be made with clinically relevant data. The purpose of this study was to evaluate the difficulty level and operator's perception between dental students and experienced clinicians when making digital and conventional implant impressions. Thirty experienced dental professionals and 30 second-year dental students made conventional and digital impressions of a single implant model. A visual analog scale (VAS) and multiple-choice questionnaires were used to assess the participant's perception of difficulty, preference, and effectiveness. Wilcoxon signed-rank test within the groups and Wilcoxon rank-sum test between the groups were used for statistical analysis (α=.05). On a 0 to 100 VAS, the student group scored a mean difficulty level of 43.1 (±18.5) for the conventional impression technique and 30.6 (±17.6) for the digital impression technique (P=.006). The clinician group scored a mean (standard deviation) difficulty level of 30.9 (±19.6) for conventional impressions and 36.5 (±20.6) for digital impressions (P=.280). Comparison between groups showed a mean difficulty level with the conventional impression technique significantly higher in the student group (P=.030). The digital impression was not significantly different between the groups (P=.228). Sixty percent of the students preferred the digital impression and 7% the conventional impression; 33% expressed no preference. In the clinician group, 33% preferred the digital impression and 37% the conventional impression; 30% had no preference. Seventy-seven percent of the student group felt most effective with digital impressions, 10% with conventional impressions, and 13% with either technique, whereas 40% of the clinician group chose the digital impression as the most effective technique, 53% the conventional impression, and 7% either technique. The conventional impression was more difficult to perform for the student group than the clinician group; however, the difficulty level of the digital impression was the same in both groups. It was also determined that the student group preferred the digital impression as the most efficient impression technique, and the clinician group had an even distribution in the choice of preferred and efficient impression techniques. Copyright © 2013 Editorial Council for the Journal of Prosthetic Dentistry. Published by Mosby, Inc. All rights reserved.
Is laser conditioning a valid alternative to conventional etching for aesthetic brackets?
Sfondrini, M F; Calderoni, G; Vitale, M C; Gandini, P; Scribante, A
2018-03-01
Er:YAG lasers have been described as a more conservative alternative to the conventional acid-etching enamel conditioning technique when bonding conventional metallic orthodontic brackets. Since the use of aesthetic orthodontic brackets is constantly increasing, the purpose of the present report was to test laser conditioning with different aesthetic brackets. Study Design: Five different aesthetic brackets (microfilled copolymer, glass fiber, sapphire, polyoxymethylene and sintered ceramic) were tested for shear bond strength and Adhesive Remnant Index scores using two different enamel conditioning techniques (acid etching and Er:YAG laser application). Two hundred bovine incisors were extracted, cleaned and embedded in resin. Specimens were then divided into 10 groups with random tables. Half of the specimens were conditioned with conventional orthophosphoric acid gel, the other half with an Er:YAG laser. Different aesthetic brackets (microfilled copolymer, glass fiber, sapphire, polyoxymethylene and sintered ceramic) were then bonded to the teeth. Subsequently, all groups were tested in shear mode with a Universal Testing Machine. Shear bond strength values and adhesive remnant index scores were recorded, and statistical analysis was performed. With the conventional acid etching technique, sapphire, polyoxymethylene and sintered ceramic brackets exhibited the highest SBS values; the lowest values were reported for microfilled copolymer and glass fiber appliances. A significant decrease in SBS values after laser conditioning was reported for sapphire, polyoxymethylene and sintered ceramic brackets, whereas no significant difference was reported for microfilled copolymer and glass fiber brackets. Significant differences in ARI scores were also reported. Laser etching can significantly reduce the bonding efficacy of sapphire, polyoxymethylene and sintered ceramic brackets.
Quantifying Attachment and Antibiotic Resistance of Escherichia coli from Conventional and Organic Swine Manure.
Zwonitzer, Martha R; Soupir, Michelle L; Jarboe, Laura R; Smith, Douglas R
2016-03-01
Broad-spectrum antibiotics are often administered to swine, contributing to the occurrence of antibiotic-resistant bacteria in their manure. During land application, the bacteria in swine manure preferentially attach to particles in the soil, affecting their transport in overland flow. However, a quantitative understanding of these attachment mechanisms is lacking, and their relationship to antibiotic resistance is unknown. The objective of this study is to examine the relationships between antibiotic resistance and attachment to very fine silica sand in Escherichia coli collected from swine manure. A total of 556 isolates were collected from six farms, two organic and four conventional (antibiotics fed prophylactically). Antibiotic resistance was quantified using 13 antibiotics at three minimum inhibitory concentrations: resistant, intermediate, and susceptible. Of the 556 isolates used in the antibiotic resistance assays, 491 were subjected to an attachment assay. Results show that isolates from conventional systems were significantly more resistant to amoxicillin, ampicillin, chlortetracycline, erythromycin, kanamycin, neomycin, streptomycin, tetracycline, and tylosin (P < 0.001). Results also indicate that E. coli isolated from conventional systems attached to very fine silica sand at significantly higher levels than those from organic systems (P < 0.001). Statistical analysis showed that a significant relationship did not exist between antibiotic resistance levels and attachment in E. coli from conventional systems but did for organic systems (P < 0.001). Better quantification of these relationships is critical to understanding the behavior of E. coli in the environment and preventing exposure of human populations to antibiotic-resistant bacteria. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.
Wang, Hai-Peng; Wang, Cui-Yan; Pan, Zheng-Lun; Zhao, Jun-Yu; Zhao, Bin
2016-01-01
Background: Conventional magnetic resonance imaging (MRI) is the preferred neuroimaging method in the evaluation of neuropsychiatric systemic lupus erythematosus (NPSLE). The purpose of this study was to investigate the association of clinical and immunological features with MRI abnormalities in female patients with NPSLE, to screen for the value of conventional MRI in NPSLE. Methods: A total of 59 female NPSLE patients with conventional MRI examinations were enrolled in this retrospective study. All patients were classified into different groups according to MRI abnormalities. Both clinical and immunological features were compared between MRI abnormal and normal groups. One-way analysis of variance was used to compare the systemic lupus erythematosus disease activity index (SLEDAI) score for MRI abnormalities. Multivariate logistic regression analysis investigated the correlation between immunological features, neuropsychiatric manifestations, and MRI abnormalities. Results: Thirty-six NPSLE patients (61%) showed a variety of MRI abnormalities. There were statistically significant differences in SLEDAI scores (P < 0.001), incidence of neurologic disorders (P = 0.001), levels of 24-h proteinuria (P = 0.001) and immunoglobulin M (P = 0.004), and incidence of acute confusional state (P = 0.002), cerebrovascular disease (P = 0.004), and seizure disorder (P = 0.028) between MRI abnormal and normal groups. In the MRI abnormal group, SLEDAI scores for cerebral atrophy (CA), cortex involvement, and restricted diffusion (RD) were much higher than in the MRI normal group (P < 0.001, P = 0.002, P = 0.038, respectively). Statistically significant positive correlations between seizure disorder and cortex involvement (odds ratio [OR] = 14.90; 95% confidence interval [CI], 1.50–151.70; P = 0.023) and cerebrovascular disease and infratentorial involvement (OR = 10.00; 95% CI, 1.70–60.00; P = 0.012) were found. Conclusions: MRI abnormalities in NPSLE, especially CA, cortex involvement, and RD, might be markers of high systemic lupus erythematosus activity. Some MRI abnormalities might correspond to neuropsychiatric manifestations and might be helpful in understanding the pathophysiology of NPSLE.
Analysis by gender and Visual Imagery Reactivity of conventional and imagery Rorschach.
Yanovski, A; Menduke, H; Albertson, M G
1995-06-01
Examined here are the effects of gender and Visual Imagery Reactivity in 80 consecutively selected psychiatric outpatients. The participants were grouped by gender and by the amount of responsiveness to preceding therapy work using imagery (Imagery Nonreactors and Reactors). In the group of Imagery Nonreactors were 13 men and 22 women, and in the Reactor group were 17 men and 28 women. Compared were the responses to the standard Rorschach (Conventional condition) with visual associations to memory images of Rorschach inkblots (Imagery condition). Responses were scored using the Visual Imagery Reactivity (VIR) scoring system, a general, test-nonspecific scoring method. Nonparametric statistical analysis showed that critical indicators of Imagery Reactivity, encoded as the High Affect/Conflict score and its derivatives associated with sexual or bizarre content, were not significantly associated with gender; neither was the Neutral Content score, which categorizes "non-Reactivity." These results support the notion that the system's criteria of Visual Imagery Reactivity can be applied equally to both men and women for the classification of Imagery Reactors and Nonreactors. Also discussed are the speculative consequences of extending the tolerance range of significance levels for the interaction between Reactivity and sex above the customary limit of p < .05 in borderline cases. The results of such an analysis may imply a trend toward more rigid defensiveness under Imagery and toward lesser verbal productivity in response to either the Conventional or the Imagery task among women who are Nonreactors. Among Reactors, men produced significantly more Sexual Reference scores (in the subcategory not associated with High Affect/Conflict) than women, but this could be attributed to the combined effect of the tester's and subjects' gender.
A Protein Domain and Family Based Approach to Rare Variant Association Analysis.
Richardson, Tom G; Shihab, Hashem A; Rivas, Manuel A; McCarthy, Mark I; Campbell, Colin; Timpson, Nicholas J; Gaunt, Tom R
2016-01-01
It has become common practice to analyse large scale sequencing data with statistical approaches based around the aggregation of rare variants within the same gene. We applied a novel approach to rare variant analysis by collapsing variants together using protein domain and family coordinates, regarded as a more discrete definition of a biologically functional unit. Using Pfam definitions, we collapsed rare variants (Minor Allele Frequency ≤ 1%) together in three different ways: 1) variants within single genomic regions which map to individual protein domains; 2) variants within two individual protein domain regions which are predicted to be responsible for a protein-protein interaction; 3) all variants within combined regions from multiple genes responsible for coding the same protein domain (i.e. protein families). A conventional collapsing analysis using gene coordinates was also undertaken for comparison. We used UK10K sequence data and investigated associations between regions of variants and lipid traits using the sequence kernel association test (SKAT). We observed no strong evidence of association between regions of variants based on Pfam domain definitions and lipid traits. Quantile-quantile plots illustrated that the overall distributions of p-values from the protein domain analyses were comparable to that of a conventional gene-based approach. Deviations from this distribution suggested that collapsing by either protein domain or gene definitions may be favourable depending on the trait analysed. We have collapsed rare variants together using protein domain and family coordinates as an alternative to collapsing across conventionally used gene-based regions. Although no strong evidence of association was detected in these analyses, future studies may still find value in adopting these approaches to detect previously unidentified association signals.
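The collapsing step can be sketched as follows under simplified assumptions: genotypes are rare-allele counts, each variant is assigned to a collapsing unit (a gene, a Pfam domain region, or a protein family), and a simple burden regression stands in for the SKAT variance-component test actually used. The Pfam identifiers and all data below are hypothetical.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical inputs: rare-allele counts (0/1/2) per person for each variant,
# and a mapping from each variant to its collapsing unit. Swapping the mapping
# switches between gene-based, domain-based and family-based collapsing.
rng = np.random.default_rng(0)
n_people, n_variants = 2000, 40
genotypes = rng.binomial(2, 0.005, size=(n_people, n_variants))
region_of = np.array(["PF00001"] * 25 + ["PF07714"] * 15)   # assumed Pfam IDs
trait = rng.normal(size=n_people)                           # placeholder lipid trait

for region in np.unique(region_of):
    burden = genotypes[:, region_of == region].sum(axis=1)  # collapse variants in unit
    fit = sm.OLS(trait, sm.add_constant(burden)).fit()      # simple burden test, not SKAT
    print(region, fit.pvalues[1])
```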
Lai, Foon Yin; Thai, Phong K; O'Brien, Jake; Gartner, Coral; Bruno, Raimondo; Kele, Benjamin; Ort, Christoph; Prichard, Jeremy; Kirkbride, Paul; Hall, Wayne; Carter, Steve; Mueller, Jochen F
2013-11-01
Wastewater analysis provides a non-intrusive way of measuring drug use within a population. We used this approach to determine daily use of conventional illicit drugs [cannabis, cocaine, methamphetamine and 3,4-methylenedioxymethamphetamine (MDMA)] and emerging illicit psychostimulants (benzylpiperazine, mephedrone and methylone) in two consecutive years (2010 and 2011) at an annual music festival. Daily composite wastewater samples, representative of the festival, were collected from the on-site wastewater treatment plant and analysed for drug metabolites. Data over the 2 years were compared using the Wilcoxon matched-pair test. Data from the 2010 festival were compared with data collected at the same time from a nearby urban community using equivalent methods. Conventional illicit drugs were detected in all samples, whereas emerging illicit psychostimulants were found only on specific days. The estimated per capita consumption of MDMA, cocaine and cannabis was similar between the two festival years. Statistically significant (P < 0.05; Z = −2.0 to −2.2) decreases were observed in the use of methamphetamine and one emerging illicit psychostimulant (benzylpiperazine). Only consumption of MDMA was elevated at the festival compared with the nearby urban community. Rates of substance use at this festival remained relatively consistent over the two monitoring years. Compared with the urban community, drug use among festival goers was elevated only for MDMA, confirming its popularity in music settings. Our study demonstrated that wastewater analysis can objectively capture changes in substance use at a music setting without raising major ethical issues. It would potentially allow effective assessments of drug prevention strategies in such settings in the future. © 2013 Australasian Professional Society on Alcohol and other Drugs.
Mirzakouchaki, Behnam; Shirazi, Sajjad; Sharghi, Reza; Shirazi, Samaneh; Moghimi, Mahsan; Shahrbaf, Shirin
2016-02-01
Different in-vitro studies have reported various results regarding the shear bond strength (SBS) of orthodontic brackets when the SEP technique is compared to the conventional system. This in-vivo study was designed to compare the effect of conventional acid-etching and self-etching primer (SEP) adhesive systems on the SBS and debonding characteristics of metal and ceramic orthodontic brackets. 120 intact first maxillary and mandibular premolars of 30 orthodontic patients were selected and bonded with metal and ceramic brackets using the conventional acid-etch or self-etch primer system. The bonded brackets were incorporated into the wire during the study period to simulate real orthodontic treatment conditions. The teeth were extracted and debonded after 30 days. The SBS, debonding characteristics and adhesive remnant indices (ARI) were determined in all groups. The mean SBS of metal brackets was 10.63±1.42 MPa in the conventional and 9.38±1.53 MPa in the SEP system (P=0.004). No statistically significant difference was noted between conventional and SEP systems for ceramic brackets. ARI scores of 1, 2 and 3 and debonding within the adhesive were the most common among all groups. No statistically significant difference was observed regarding ARI or failure mode of debonded specimens between different brackets or bonding systems. The SBS of metal brackets bonded using the conventional system was significantly higher than with the SEP system, although the SBS of the SEP system was clinically acceptable. No significant difference was found between conventional and SEP systems used with ceramic brackets. The total SBS of metal brackets was significantly higher than that of ceramic brackets. Given the adequate SBS of the SEP system in bonding metal brackets, it can be used as an alternative to the conventional system.
Percutaneous dilatational versus conventional surgical tracheostomy in intensive care patients
Youssef, Tarek F.; Ahmed, Mohamed Rifaat; Saber, Aly
2011-01-01
Background: Tracheostomy is usually performed in patients with difficult weaning from mechanical ventilation or some catastrophic neurologic insult. Conventional tracheostomy involves dissection of the pretracheal tissues and insertion of the tracheostomy tube into the trachea under direct vision. Percutaneous dilatational tracheostomy is increasingly popular and has gained widespread acceptance in many intensive care units and trauma centers. Aim: The aim of the study was to compare percutaneous dilatational tracheostomy with conventional tracheostomy in intensive care patients. Patients and Methods: 64 critically ill patients admitted to the intensive care unit were subjected to tracheostomy and randomly divided into two groups: percutaneous dilatational tracheostomy and conventional surgical tracheostomy. Results: The mean duration of the procedure was similar between the two techniques, while the mean size of the tracheostomy tube was smaller with the percutaneous technique. In addition, the lowest SpO2 during the procedure, PaCO2 after operation, and intra-operative bleeding were nearly similar for both groups, without any statistically significant difference. Postoperative infection after 7 days was statistically lower and the length of the scar tended to be smaller among PDT patients. Conclusion: The PDT technique is as effective and safe as CST, with a low incidence of postoperative complications.
Exploring the Replicability of a Study's Results: Bootstrap Statistics for the Multivariate Case.
ERIC Educational Resources Information Center
Thompson, Bruce
Conventional statistical significance tests do not inform the researcher regarding the likelihood that results will replicate. One strategy for evaluating result replication is to use a "bootstrap" resampling of a study's data so that the stability of results across numerous configurations of the subjects can be explored. This paper…
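A minimal univariate illustration of the strategy, assuming subjects are resampled with replacement and a correlation is re-estimated on each resampled configuration; the spread of the bootstrap distribution then speaks to the stability, and hence plausible replicability, of the result. The multivariate case the paper addresses follows the same pattern with multivariate statistics.

```python
import numpy as np

def bootstrap_stability(x, y, n_boot=5000, seed=0):
    """Bootstrap the correlation between two 1-D numpy arrays to gauge how
    stable the result is across resampled configurations of the subjects.
    Returns the 2.5th, 50th and 97.5th percentiles of the bootstrap estimates."""
    rng = np.random.default_rng(seed)
    n = len(x)
    reps = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, n, n)            # resample subjects with replacement
        reps[b] = np.corrcoef(x[idx], y[idx])[0, 1]
    return np.percentile(reps, [2.5, 50, 97.5])
```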
ERIC Educational Resources Information Center
Umeasiegbu, Veronica I.; Bishop, Malachy; Mpofu, Elias
2013-01-01
This article presents an analysis of the United Nations Convention on the Rights of Persons with Disabilities (CRPD) in relation to prior United Nations conventions on disability and U.S. disability policy law with a view to identifying the conventional and also the incremental advances of the CRPD. Previous United Nations conventions related to…
Weak-value amplification and optimal parameter estimation in the presence of correlated noise
NASA Astrophysics Data System (ADS)
Sinclair, Josiah; Hallaji, Matin; Steinberg, Aephraim M.; Tollaksen, Jeff; Jordan, Andrew N.
2017-11-01
We analytically and numerically investigate the performance of weak-value amplification (WVA) and related parameter estimation methods in the presence of temporally correlated noise. WVA is a special instance of a general measurement strategy that involves sorting data into separate subsets based on the outcome of a second "partitioning" measurement. Using a simplified correlated noise model that can be analyzed exactly together with optimal statistical estimators, we compare WVA to a conventional measurement method. We find that WVA indeed yields a much lower variance of the parameter of interest than the conventional technique does, optimized in the absence of any partitioning measurements. In contrast, a statistically optimal analysis that employs partitioning measurements, incorporating all partitioned results and their known correlations, is found to yield an improvement—typically slight—over the noise reduction achieved by WVA. This result occurs because the simple WVA technique is not tailored to any specific noise environment and therefore does not make use of correlations between the different partitions. We also compare WVA to traditional background subtraction, a familiar technique where measurement outcomes are partitioned to eliminate unknown offsets or errors in calibration. Surprisingly, for the cases we consider, background subtraction turns out to be a special case of the optimal partitioning approach, possessing a similar typically slight advantage over WVA. These results give deeper insight into the role of partitioning measurements (with or without postselection) in enhancing measurement precision, which some have found puzzling. They also resolve previously made conflicting claims about the usefulness of weak-value amplification to precision measurement in the presence of correlated noise. We finish by presenting numerical results to model a more realistic laboratory situation of time-decaying correlations, showing that our conclusions hold for a wide range of statistical models.
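The quantum-optical details are beyond a short sketch, but the statistical point about partitioning under correlated noise can be illustrated classically: interleave "signal-on" and "signal-off" shots under slowly drifting (AR(1)) noise, then compare a conventional average of the signal-on shots with background subtraction across the partition. All parameters below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, rho = 10000, 1e-3, 0.99    # shots, tiny parameter to estimate, noise correlation

def ar1(n, rho, rng):
    """Stationary AR(1) noise with unit variance."""
    x = np.empty(n)
    x[0] = rng.normal()
    for t in range(1, n):
        x[t] = rho * x[t - 1] + np.sqrt(1 - rho**2) * rng.normal()
    return x

signal = np.where(np.arange(n) % 2 == 0, d, 0.0)    # alternate signal-on/off shots
data = signal + ar1(n, rho, rng)

conventional = data[::2].mean()                     # average of signal-on shots only
partitioned = data[::2].mean() - data[1::2].mean()  # background subtraction
print(conventional, partitioned)  # the partitioned estimate sits far closer to d,
                                  # because adjacent shots share most of their noise
```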
Kalkhoff, Will; Marcussen, Kristen; Serpe, Richard T
2016-07-01
After many years of research across disciplines, it remains unclear whether people are more motivated to seek appraisals that accurately match self-views (self-verification) or are as favorable as possible (self-enhancement). Within sociology, mixed findings in identity theory have fueled the debate. A problem here is that a commonly employed statistical approach does not take into account the direction of a discrepancy between how we see ourselves and how we think others see us in terms of a given identity, yet doing so is critical for determining which self-motive is at play. We offer a test of three competing models of identity processes, including a new "mixed motivations" model where self-verification and self-enhancement operate simultaneously. We compare the models using the conventional statistical approach versus response surface analysis. The latter method allows us to determine whether identity discrepancies involving over-evaluation are as distressing as those involving under-evaluation. We use nationally representative data and compare results across four different identities and multiple outcomes. The two statistical approaches lead to the same conclusions more often than not and mostly support identity theory and its assumption that people seek self-verification. However, response surface tests reveal patterns that are mistaken as evidence of self-verification by conventional procedures, especially for the spouse identity. We also find that identity discrepancies have different effects on distress and self-conscious emotions (guilt and shame). Our findings have implications not only for research on self and identity across disciplines, but also for many other areas of research that incorporate these concepts and/or use difference scores as explanatory variables. Copyright © 2016 Elsevier Inc. All rights reserved.
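A sketch of the response-surface idea under hypothetical data: regress distress on self-view (x), reflected appraisal (y), and their quadratic terms, so that over- and under-evaluation (the sign of y − x) can have distinct effects, information a plain difference score discards. Variable names and the data-generating model are invented for illustration.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
x = rng.normal(size=500)                   # how I see myself (hypothetical)
y = x + rng.normal(scale=0.5, size=500)    # how I think others see me (hypothetical)
distress = (x - y)**2 + rng.normal(scale=0.5, size=500)  # verification-like surface

# Quadratic response surface: distress ~ x, y, x^2, x*y, y^2
X = np.column_stack([x, y, x**2, x * y, y**2])
fit = sm.OLS(distress, sm.add_constant(X)).fit()
print(fit.params)

# Self-verification predicts a surface rising on both sides of the x = y line
# (over- and under-evaluation both distressing); pure self-enhancement predicts
# distress falling as y - x grows. The signed direction of the discrepancy is
# exactly what a squared or absolute difference score throws away.
```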
Pandey, Pinki; Dixit, Alok; Tanwar, Aparna; Sharma, Anuradha; Mittal, Sanjeev
2014-07-01
Our study presents a new deparaffinizing and hematoxylin and eosin (H and E) staining method that uses easily available, nontoxic and eco-friendly diluted liquid dish-washing soap (DWS), completely eliminating expensive and hazardous xylene and alcohol from deparaffinizing and rehydration prior to staining, from staining, and from dehydration prior to mounting. The aim was to evaluate and compare the quality of liquid DWS treated xylene- and alcohol-free (XAF) sections with that of conventional H and E sections. A total of 100 paraffin embedded tissue blocks from different tissues were included. From each tissue block, one section was stained with conventional H and E (normal sections) and the other with the XAF H and E (soapy sections) staining method. Slides were scored using five parameters: nuclear, cytoplasmic, clarity, uniformity, and crispness of staining. The Z-test was used for statistical analysis. Soapy sections scored better for cytoplasmic (90%) and crisp staining (95%), with a statistically significant difference, whereas for uniformity of staining, normal sections (88%) scored over soapy sections (72%) (Z = 2.82, P < 0.05). For nuclear (90%) and clarity of staining (90%), total scores favored soapy sections, but the difference was not statistically significant. About 84% of normal sections stained adequately for diagnosis compared with 86% of soapy sections (Z = 0.396, P > 0.05). Liquid DWS is a safe and efficient alternative to xylene and alcohol in deparaffinization and the routine H and E staining procedure. We are documenting this project so that it can be used as a model for other histology laboratories.
Saleh, H.; Suryadi, D.; Dahlan, J. A.
2018-01-01
The aim of this research was to find out whether the 7E learning cycle under the hypnoteaching model can enhance students' mathematical problem-solving skill. This research was a quasi-experimental study with a pretest-posttest control group design. Two groups were sampled: the experimental group received the 7E learning cycle under the hypnoteaching model, while the control group received the conventional model. The population of this study was students of the mathematics education program at one university in Tangerang. The t-test and the Mann-Whitney U test were used to test the hypotheses. The results of this study show that: (1) the achievement in mathematical problem-solving skill of students who received the 7E learning cycle under the hypnoteaching model is higher than that of students who received the conventional model; and (2) there are differences in students' enhancement of mathematical problem-solving skill across prior mathematical knowledge (PMK) categories (high, middle, and low).
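The abstract names the t-test and the Mann-Whitney U test as the hypothesis tests; a minimal sketch of how those two group comparisons are typically run (the scores below are simulated placeholders, not the study's data):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical problem-solving gain scores for the two groups.
experimental = rng.normal(0.6, 0.2, 40)  # 7E learning cycle + hypnoteaching
control = rng.normal(0.4, 0.2, 40)       # conventional model

# Parametric comparison when normality holds.
t, p_t = stats.ttest_ind(experimental, control)
# Nonparametric alternative when it does not.
u, p_u = stats.mannwhitneyu(experimental, control, alternative="two-sided")
print(f"t-test: t={t:.2f}, p={p_t:.4f}; Mann-Whitney: U={u:.0f}, p={p_u:.4f}")
```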
Quantifying Cancer Risk from Radiation.
Keil, Alexander P; Richardson, David B
2017-12-06
Complex statistical models fitted to data from studies of atomic bomb survivors are used to estimate the human health effects of ionizing radiation exposure. We describe and illustrate an approach to estimating population risks from ionizing radiation exposure that relaxes many assumptions about radiation-related mortality. The approach draws on developments in methods for causal inference. The results offer a different way to quantify radiation's effects and show that conventional estimates of the population burden of excess cancer at high radiation doses are driven strongly by projection outside the range of current data. Summary results obtained using the proposed approach are similar in magnitude to those obtained using conventional methods, although estimates of radiation-related excess cancers differ for many age, sex, and dose groups. At low doses relevant to typical exposures, the strength of evidence in the data is surprisingly weak, and statements about human health effects at such doses rely strongly on modeling assumptions.
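The conventional modeling that the abstract contrasts itself with typically summarizes risk through a linear excess relative risk (ERR) model, RR(d) = 1 + beta*d; the sketch below illustrates the projection step the abstract warns about, with the baseline rate, ERR coefficient, and dose groups chosen purely for illustration:

```python
import numpy as np

def excess_cases(baseline_rate, err_per_gy, doses, person_years):
    """Excess cancer cases under a linear ERR model: rate(d) = rate0 * (1 + beta*d)."""
    excess_rate = baseline_rate * err_per_gy * doses
    return excess_rate * person_years

doses = np.array([0.005, 0.1, 0.5, 1.0, 2.0])  # Gy; low doses dominate typical exposures
py = np.full_like(doses, 1e5)                  # person-years per dose group (illustrative)
cases = excess_cases(baseline_rate=0.002, err_per_gy=0.47, doses=doses, person_years=py)
for d, c in zip(doses, cases):
    print(f"{d:5.3f} Gy -> {c:7.1f} excess cases per 100,000 person-years")
# At 0.005 Gy the estimate rests almost entirely on the linearity
# assumption rather than on data, which is the abstract's point.
```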
A single test for rejecting the null hypothesis in subgroups and in the overall sample.
Lin, Yunzhi; Zhou, Kefei; Ganju, Jitendra
2017-01-01
In clinical trials, some patient subgroups are likely to demonstrate larger effect sizes than other subgroups. For example, the effect size, or informally the benefit with treatment, is often greater in patients with a moderate condition of a disease than in those with a mild condition. A limitation of the usual method of analysis is that it does not incorporate this ordering of effect size by patient subgroup. We propose a test statistic which supplements the conventional test by including this information and simultaneously tests the null hypothesis in pre-specified subgroups and in the overall sample. It results in more power than the conventional test when the differences in effect sizes across subgroups are at least moderately large; otherwise it loses power. The method involves combining p-values from models fit to pre-specified subgroups and the overall sample in a manner that assigns greater weight to subgroups in which a larger effect size is expected. Results are presented for randomized trials with two and three subgroups.
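The abstract does not name the combination rule; one standard way to combine p-values with pre-specified weights is the weighted inverse-normal (Stouffer) method, sketched here under that assumption with illustrative weights and p-values:

```python
from math import sqrt
from scipy.stats import norm

def weighted_stouffer(p_values, weights):
    """Combine one-sided p-values with pre-specified weights (inverse-normal method)."""
    z = [norm.isf(p) for p in p_values]  # small p maps to large z
    z_comb = sum(w * zi for w, zi in zip(weights, z)) / sqrt(sum(w**2 for w in weights))
    return norm.sf(z_comb)  # combined one-sided p-value

# Illustrative: overall sample plus two subgroups, with more weight on the
# subgroup (moderate disease) expected to show the larger effect size.
p_overall, p_moderate, p_mild = 0.04, 0.01, 0.30
print(weighted_stouffer([p_overall, p_moderate, p_mild], weights=[1.0, 0.8, 0.4]))
```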
Spray visualization of alternative fuels at hot ambient conditions
Kannaiyan, Kumaran; Sadr, Reza
2017-11-01
Gas-to-Liquid (GTL) fuel has gained significant interest as a drop-in alternative jet fuel owing to its cleaner combustion characteristics. The physical and evaporation properties of GTL fuels differ from those of conventional jet fuels, and those differences affect the spray and, in turn, combustion performance. In this study, the non-reacting near-nozzle spray dynamics of GTL fuel, such as spray cone angle, liquid sheet breakup, and liquid velocity, will be investigated and compared with those of conventional jet fuel. This work follows up a preliminary study performed at atmospheric ambient conditions, where differences were observed in the near-nozzle spray characteristics of the two fuels. In the present study, by contrast, spray visualization will be performed in a hot, inert environment to account for the difference in the evaporation characteristics of the fuels. The spray images will be captured using the shadowgraph technique, and a rigorous statistical analysis of the images will be performed to compare the spray dynamics of the fuels.
Evaluation of mechanical and thermal properties of commonly used denture base resins.
Phoenix, Rodney D; Mansueto, Michael A; Ackerman, Neal A; Jones, Robert E
2004-03-01
The purpose of this investigation was to evaluate and compare the mechanical and thermal properties of 6 commonly used polymethyl methacrylate denture base resins. Sorption, solubility, color stability, adaptation, flexural stiffness, and hardness were assessed to determine compliance with ADA Specification No. 12. Thermal assessments were performed using differential scanning calorimetry and dynamic mechanical analysis. Results were assessed using statistical and observational analyses. All materials satisfied ADA requirements for sorption, solubility, and color stability. Adaptation testing indicated that microwave-activated systems provided better adaptation to associated casts than conventional heat-activated resins. According to flexural testing results, microwaveable resins were relatively stiff, while rubber-modified resins were more flexible. Differential scanning calorimetry indicated that microwave-activated systems were more completely polymerized than conventional heat-activated materials. The microwaveable resins displayed better adaptation, greater stiffness, and greater surface hardness than other denture base resins included in this investigation. Elastomeric toughening agents yielded decreased stiffness, decreased surface hardness, and decreased glass transition temperatures.
Physiological ICSI (PICSI) vs. Conventional ICSI in Couples with Male Factor: A Systematic Review.
Avalos-Durán, Georgina; Ángel, Ana María Emilia Cañedo-Del; Rivero-Murillo, Juana; Zambrano-Guerrero, Jaime Enoc; Carballo-Mondragón, Esperanza; Checa-Vizcaíno, Miguel Ángel
2018-04-19
To determine the efficacy of the physiological ICSI technique (PICSI) vs. conventional ICSI with respect to the following outcome measures: live birth, clinical pregnancy, implantation, embryo quality, fertilization, and miscarriage rates. We conducted a systematic review of the literature, extracting raw data and performing data analysis, in couples with male factor infertility undergoing in-vitro fertilization. In the systematic search, we found 2,918 studies plus one additional study from other sources; only two studies fulfilled the inclusion criteria for this review. The rates of live birth, clinical pregnancy, implantation, embryo quality, fertilization, and miscarriage were similar for both groups, with no statistically significant difference between PICSI and ICSI for any of the outcomes analyzed. The available evidence is insufficient to demonstrate the superiority of PICSI over conventional ICSI in couples with male factor infertility.
Use of Negative Pressure Wound Therapy for Lower Limb Bypass Incisions.
Tan, Kah Wei; Lo, Zhiwen Joseph; Hong, Qiantai; Narayanan, Sriram; Tan, Glenn Wei Leong; Chandrasekar, Sadhana
2017-12-25
Objective: The use of negative pressure wound therapy (NPWT) after cardiothoracic, orthopedic, plastic, and obstetric and gynecologic procedures has been described, but there are no data on its use for lower limb bypass incisions. We aimed to investigate the outcomes of NPWT in preventing surgical site infection (SSI) in patients with lower limb arterial bypass incisions. Materials and Methods: We retrospectively reviewed data from 42 patients who underwent lower limb arterial bypass with reversed great saphenous vein between March 2014 and June 2016, comparing conventional wound therapy with NPWT for the prevention of SSI. Results: Twenty-eight (67%) patients received conventional wound therapy and 14 (33%) received NPWT. There were no statistically significant differences in patient characteristics or mean SSI risk scores between the two groups (13.7% for conventional wound therapy vs. 13.4% for NPWT; P=0.831). In the conventional group there were nine instances of SSI (32%), three (11%) of which required subsequent surgical wound debridement, whereas there was no incidence of SSI in the NPWT group (P=0.019). Secondary outcomes such as length of hospital stay, 30-day readmission rate, and need for secondary vascular procedures did not differ significantly between the groups. Conclusion: NPWT for lower limb arterial bypass incisions may be superior to conventional wound therapy in preventing SSIs.
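The abstract does not name the test behind P=0.019, but with a zero-event cell (0/14 SSIs under NPWT vs. 9/28 under conventional therapy) Fisher's exact test is the natural choice; a quick check under that assumption reproduces the reported value:

```python
from scipy.stats import fisher_exact

# Rows: conventional therapy, NPWT; columns: SSI, no SSI.
table = [[9, 19], [0, 14]]
odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"two-sided p = {p_value:.3f}")  # -> 0.019, matching the reported value
```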
The Spinel Explorer: Interactive Visual Analysis of Spinel Group Minerals.
Luján Ganuza, María; Ferracutti, Gabriela; Gargiulo, María Florencia; Castro, Silvia Mabel; Bjerg, Ernesto; Gröller, Eduard; Matković, Krešimir
2014-12-01
Geologists usually deal with rocks that are up to several thousand million years old. They try to reconstruct the tectonic settings where these rocks were formed and the history of events that affected them through geological time. The spinel group minerals provide useful information about the geological environment in which the host rocks were formed: they are excellent indicators of tectonic settings and are of invaluable help in the search for mineral deposits of economic interest. The current workflow requires scientists to work with different applications to analyze spinel data; they use specific diagrams, but these are usually not interactive. This workflow prevents domain experts from fully exploiting the potential of tediously and expensively collected data. In this paper, we introduce the Spinel Explorer, an interactive visual analysis application for spinel group minerals. The design of the Spinel Explorer and of its newly introduced interactions is the result of a careful study of geologists' tasks. The Spinel Explorer includes most of the diagrams commonly used for analyzing spinel group minerals, including 2D binary plots, ternary plots, and 3D spinel prism plots. Besides these specific plots, conventional information visualization views are also integrated, and all views are interactive and linked. The Spinel Explorer supports the conventional statistics commonly used in spinel mineral exploration; the statistics views and different data derivation techniques are fully integrated in the system. Besides presenting the Spinel Explorer as a new interactive exploration system, we also describe the identified analysis tasks and propose a new workflow. We evaluate the Spinel Explorer using real-life data from two locations in Argentina: the Frontal Cordillera in the Central Andes, and Patagonia. We describe new findings of the geologists that would have been much more difficult to achieve with the current workflow alone. Very positive feedback from the geologists confirms the usefulness of the Spinel Explorer.
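Ternary plots, central to the tool described above, map three non-negative components of a composition onto two dimensions; a minimal sketch of that projection (using matplotlib, with the spinel component names chosen purely for illustration):

```python
import numpy as np
import matplotlib.pyplot as plt

def ternary_xy(a, b, c):
    """Project a composition (a, b, c) onto 2D ternary coordinates."""
    total = a + b + c
    x = 0.5 * (2 * b + c) / total
    y = (np.sqrt(3) / 2) * c / total
    return x, y

# Illustrative spinel compositions as (Cr, Al, Fe3+) fractions.
compositions = np.array([[0.6, 0.3, 0.1], [0.2, 0.7, 0.1], [0.3, 0.3, 0.4]])
x, y = ternary_xy(*compositions.T)

fig, ax = plt.subplots()
# Triangle frame: left vertex = pure Cr, right = pure Al, top = pure Fe3+.
ax.plot([0, 1, 0.5, 0], [0, 0, np.sqrt(3) / 2, 0], color="gray")
ax.scatter(x, y)
ax.set_aspect("equal")
plt.show()
```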