Sample records for classical statistical techniques

  1. Data Analysis Techniques for Physical Scientists

    NASA Astrophysics Data System (ADS)

    Pruneau, Claude A.

    2017-10-01

    Preface; How to read this book; 1. The scientific method; Part I. Foundation in Probability and Statistics: 2. Probability; 3. Probability models; 4. Classical inference I: estimators; 5. Classical inference II: optimization; 6. Classical inference III: confidence intervals and statistical tests; 7. Bayesian inference; Part II. Measurement Techniques: 8. Basic measurements; 9. Event reconstruction; 10. Correlation functions; 11. The multiple facets of correlation functions; 12. Data correction methods; Part III. Simulation Techniques: 13. Monte Carlo methods; 14. Collision and detector modeling; List of references; Index.

  2. Actinic cheilitis: aesthetic and functional comparative evaluation of vermilionectomy using the classic and W-plasty techniques.

    PubMed

    Rossoe, Ed Wilson Tsuneo; Tebcherani, Antonio José; Sittart, José Alexandre; Pires, Mario Cezar

    2011-01-01

    Chronic actinic cheilitis is actinic keratosis located on the vermilion border. Treatment is essential because of the potential for malignant transformation. The aim was to evaluate the aesthetic and functional results of vermilionectomy using the classic and W-plasty techniques in actinic cheilitis. In the classic technique the scar is linear, whereas in the W-plasty technique it is a broken line. Thirty-two patients with a clinical and histopathological diagnosis of actinic cheilitis were treated: 15 underwent the W-plasty technique and 17 the classic one. We evaluated parameters such as scar retraction and functional changes. A statistically significant association between the technique used and scar retraction was found, with retraction more frequent after the classic technique (p = 0.01 with Yates' correction). The odds ratio was calculated at 11.25, i.e., there was a greater chance of retraction in patients undergoing the classic technique. Neither technique produced functional changes. We evaluated postoperative complications such as the presence of crusts, dry lips, paresthesia, and suture dehiscence; there was no statistically significant association between complications and the technique used (p = 0.69). We concluded that vermilionectomy using the W-plasty technique shows better cosmetic results and similar complication rates.
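
    As an editorial illustration, the reported figures can be re-checked in a few lines. The 2x2 counts below are hypothetical (the abstract gives only the group sizes, the Yates-corrected p of 0.01, and the odds ratio of 11.25), but they are consistent with those numbers:

```python
# Hypothetical 2x2 table consistent with the abstract: 17 classic, 15 W-plasty,
# odds ratio 11.25; rows = technique, columns = (retraction, no retraction).
import numpy as np
from scipy.stats import chi2_contingency

table = np.array([[15, 2],    # classic technique
                  [6, 9]])    # W-plasty technique

chi2, p, dof, expected = chi2_contingency(table, correction=True)  # Yates
odds_ratio = (table[0, 0] * table[1, 1]) / (table[0, 1] * table[1, 0])
print(f"chi2 = {chi2:.2f}, p = {p:.3f}, odds ratio = {odds_ratio:.2f}")
# -> p close to the reported 0.01; odds ratio = 11.25
```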

  3. Classical Statistics and Statistical Learning in Imaging Neuroscience

    PubMed Central

    Bzdok, Danilo

    2017-01-01

    Brain-imaging research has predominantly generated insight by means of classical statistics, including regression-type analyses and null-hypothesis testing using t-tests and ANOVA. In recent years, statistical learning methods have enjoyed increasing popularity, especially for applications to rich and complex data, including cross-validated out-of-sample prediction using pattern classification and sparsity-inducing regression. This concept paper discusses the implications of inferential justifications and algorithmic methodologies in common data analysis scenarios in neuroimaging. It retraces how classical statistics and statistical learning originated from different historical contexts, build on different theoretical foundations, make different assumptions, and evaluate different outcome metrics to permit differently nuanced conclusions. The present considerations should help reduce current confusion between model-driven classical hypothesis testing and data-driven learning algorithms for investigating the brain with imaging techniques. PMID:29056896
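
    A minimal sketch of the two regimes the paper contrasts, run on the same synthetic "imaging features" (the group shift, sample sizes, and all other settings are illustrative assumptions):

```python
import numpy as np
from scipy.stats import ttest_ind
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
patients = rng.normal(0.3, 1.0, size=(50, 10))   # 50 subjects x 10 features
controls = rng.normal(0.0, 1.0, size=(50, 10))

# Classical statistics: in-sample null-hypothesis test on one feature
t, p = ttest_ind(patients[:, 0], controls[:, 0])
print(f"t = {t:.2f}, p = {p:.3f}")

# Statistical learning: cross-validated out-of-sample prediction, all features
X = np.vstack([patients, controls])
y = np.r_[np.ones(50), np.zeros(50)]
acc = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=5)
print(f"cross-validated accuracy = {acc.mean():.2f}")
```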

  4. Rotation of EOFs by the Independent Component Analysis: Towards A Solution of the Mixing Problem in the Decomposition of Geophysical Time Series

    NASA Technical Reports Server (NTRS)

    Aires, Filipe; Rossow, William B.; Chedin, Alain; Hansen, James E. (Technical Monitor)

    2001-01-01

    The Independent Component Analysis is a recently developed technique for component extraction. This new method requires the statistical independence of the extracted components, a stronger constraint that uses higher-order statistics, instead of the classical decorrelation, a weaker constraint that uses only second-order statistics. This technique has been used recently for the analysis of geophysical time series with the goal of investigating the causes of variability in observed data (i.e. exploratory approach). We demonstrate with a data simulation experiment that, if initialized with a Principal Component Analysis, the Independent Component Analysis performs a rotation of the classical PCA (or EOF) solution. This rotation uses no localization criterion like other Rotation Techniques (RT), only the global generalization of decorrelation by statistical independence is used. This rotation of the PCA solution seems to be able to solve the tendency of PCA to mix several physical phenomena, even when the signal is just their linear sum.
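
    A small simulation in the spirit of the experiment described above (the signal shapes and mixing matrix are illustrative assumptions, not the paper's geophysical data): PCA returns decorrelated but still mixed components, while ICA recovers the independent sources.

```python
import numpy as np
from sklearn.decomposition import PCA, FastICA

rng = np.random.default_rng(1)
t = np.linspace(0, 8, 2000)
sources = np.c_[np.sin(2 * t), np.sign(np.sin(3 * t))]   # independent signals
mixed = sources @ np.array([[1.0, 0.6], [0.4, 1.0]]).T   # observed linear sum

pca_comp = PCA(n_components=2).fit_transform(mixed)      # decorrelation only
ica_comp = FastICA(n_components=2, random_state=0).fit_transform(mixed)

for name, comp in [("PCA", pca_comp), ("ICA", ica_comp)]:
    corr = np.abs(np.corrcoef(comp.T, sources.T)[:2, 2:])
    print(name, "max |corr| with true sources:", corr.max(axis=1).round(2))
```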

  5. A reductionist perspective on quantum statistical mechanics: Coarse-graining of path integrals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sinitskiy, Anton V.; Voth, Gregory A., E-mail: gavoth@uchicago.edu

    2015-09-07

    Computational modeling of the condensed phase based on classical statistical mechanics has been rapidly developing over the last few decades and has yielded important information on various systems containing up to millions of atoms. However, if a system of interest contains important quantum effects, well-developed classical techniques cannot be used. One way of treating finite temperature quantum systems at equilibrium has been based on Feynman’s imaginary time path integral approach and the ensuing quantum-classical isomorphism. This isomorphism is exact only in the limit of infinitely many classical quasiparticles representing each physical quantum particle. In this work, we present a reductionist perspective on this problem based on the emerging methodology of coarse-graining. This perspective allows for the representations of one quantum particle with only two classical-like quasiparticles and their conjugate momenta. One of these coupled quasiparticles is the centroid particle of the quantum path integral quasiparticle distribution. Only this quasiparticle feels the potential energy function. The other quasiparticle directly provides the observable averages of quantum mechanical operators. The theory offers a simplified perspective on quantum statistical mechanics, revealing its most reductionist connection to classical statistical physics. By doing so, it can facilitate a simpler representation of certain quantum effects in complex molecular environments.

  6. A reductionist perspective on quantum statistical mechanics: Coarse-graining of path integrals.

    PubMed

    Sinitskiy, Anton V; Voth, Gregory A

    2015-09-07

    Computational modeling of the condensed phase based on classical statistical mechanics has been rapidly developing over the last few decades and has yielded important information on various systems containing up to millions of atoms. However, if a system of interest contains important quantum effects, well-developed classical techniques cannot be used. One way of treating finite temperature quantum systems at equilibrium has been based on Feynman's imaginary time path integral approach and the ensuing quantum-classical isomorphism. This isomorphism is exact only in the limit of infinitely many classical quasiparticles representing each physical quantum particle. In this work, we present a reductionist perspective on this problem based on the emerging methodology of coarse-graining. This perspective allows for the representations of one quantum particle with only two classical-like quasiparticles and their conjugate momenta. One of these coupled quasiparticles is the centroid particle of the quantum path integral quasiparticle distribution. Only this quasiparticle feels the potential energy function. The other quasiparticle directly provides the observable averages of quantum mechanical operators. The theory offers a simplified perspective on quantum statistical mechanics, revealing its most reductionist connection to classical statistical physics. By doing so, it can facilitate a simpler representation of certain quantum effects in complex molecular environments.
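
    For orientation, the many-bead quantum-classical isomorphism that the paper starts from can be sketched with a primitive path-integral Monte Carlo run for a single 1D harmonic oscillator (hbar = m = omega = kB = 1). This is the standard P-bead construction, not the paper's two-quasiparticle coarse-graining, and all run parameters are illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)
P, beta = 32, 2.0                    # beads per particle, inverse temperature
tau = beta / P
nmove, step = 200000, 0.4

def action(path):
    springs = np.sum((np.roll(path, -1) - path) ** 2) / (2.0 * tau)
    return springs + tau * np.sum(0.5 * path ** 2)       # V(x) = x^2 / 2

x, S = np.zeros(P), action(np.zeros(P))
energies = []
for move in range(nmove):
    i = rng.integers(P)
    trial = x.copy()
    trial[i] += rng.uniform(-step, step)                 # single-bead move
    S_new = action(trial)
    if S_new <= S or rng.random() < np.exp(S - S_new):   # Metropolis accept
        x, S = trial, S_new
    if move > nmove // 4 and move % 100 == 0:            # primitive estimator
        spr = np.sum((np.roll(x, -1) - x) ** 2)
        energies.append(P / (2 * beta) - P * spr / (2 * beta ** 2)
                        + np.mean(0.5 * x ** 2))

print(f"PIMC <E> ~ {np.mean(energies):.2f}  (exact {0.5 / np.tanh(beta / 2):.2f})")
```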

  7. A simple white noise analysis of neuronal light responses.

    PubMed

    Chichilnisky, E J

    2001-05-01

    A white noise technique is presented for estimating the response properties of spiking visual system neurons. The technique is simple, robust, efficient and well suited to simultaneous recordings from multiple neurons. It provides a complete and easily interpretable model of light responses even for neurons that display a common form of response nonlinearity that precludes classical linear systems analysis. A theoretical justification of the technique is presented that relies only on elementary linear algebra and statistics. Implementation is described with examples. The technique and the underlying model of neural responses are validated using recordings from retinal ganglion cells, and in principle are applicable to other neurons. Advantages and disadvantages of the technique relative to classical approaches are discussed.
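
    A minimal sketch of the reverse-correlation idea on a simulated linear-nonlinear-Poisson neuron (the filter shape, nonlinearity, and run length are illustrative assumptions): the spike-triggered average recovers the linear filter.

```python
import numpy as np

rng = np.random.default_rng(3)
T, L = 200000, 20
true_filter = np.exp(-np.arange(L) / 4.0) * np.sin(np.arange(L) / 2.0)

stimulus = rng.normal(size=T)                      # Gaussian white noise
drive = np.convolve(stimulus, true_filter)[:T]     # linear stage
rate = np.clip(np.exp(drive - 2.0), 0, 10)         # static nonlinearity
spikes = rng.poisson(rate)                         # Poisson spike counts

sta = np.zeros(L)                                  # spike-triggered average
for t in np.nonzero(spikes)[0]:
    if t >= L:
        sta += spikes[t] * stimulus[t - L + 1:t + 1][::-1]
sta /= spikes[L:].sum()

print(f"corr(STA, true filter) = {np.corrcoef(sta, true_filter)[0, 1]:.3f}")
```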

  8. Performance Characterization of an Instrument.

    ERIC Educational Resources Information Center

    Salin, Eric D.

    1984-01-01

    Describes an experiment designed to teach students to apply the same statistical awareness to instrumentation they commonly apply to classical techniques. Uses propagation of error techniques to pinpoint instrumental limitations and breakdowns and to demonstrate capabilities and limitations of volumetric and gravimetric methods. Provides lists of…
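
    A small propagation-of-error calculation in the spirit of the experiment (the balance and pipette uncertainties below are illustrative values, not the article's): for c = m/V the relative uncertainties add in quadrature, which pinpoints the limiting step.

```python
import numpy as np

m, sigma_m = 0.2500, 0.0002    # mass in g and balance uncertainty
V, sigma_V = 25.00, 0.03       # volume in mL and volumetric uncertainty

c = m / V
# For c = m/V: (sigma_c / c)^2 = (sigma_m / m)^2 + (sigma_V / V)^2
rel = np.hypot(sigma_m / m, sigma_V / V)
print(f"c = {c:.5f} g/mL +/- {c * rel:.1e} ({100 * rel:.2f}% relative)")
print("limiting step:", "volume" if sigma_V / V > sigma_m / m else "mass")
```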

  9. Towards Solving the Mixing Problem in the Decomposition of Geophysical Time Series by Independent Component Analysis

    NASA Technical Reports Server (NTRS)

    Aires, Filipe; Rossow, William B.; Chedin, Alain; Hansen, James E. (Technical Monitor)

    2000-01-01

    The use of the Principal Component Analysis technique for the analysis of geophysical time series has been questioned in particular for its tendency to extract components that mix several physical phenomena even when the signal is just their linear sum. We demonstrate with a data simulation experiment that the Independent Component Analysis, a recently developed technique, is able to solve this problem. This new technique requires the statistical independence of components, a stronger constraint that uses higher-order statistics, instead of the classical decorrelation, a weaker constraint that uses only second-order statistics. Furthermore, ICA does not require additional a priori information such as the localization constraint used in Rotational Techniques.

  10. The Shock and Vibration Bulletin. Part 2. Invited Papers, Structural Dynamics

    DTIC Science & Technology

    1974-08-01

    VIKING LANDER DYNAMICS, Mr. Joseph C. Pohlen, Martin Marietta Aerospace, Denver, Colorado. Structural Dynamics: PERFORMANCE OF STATISTICAL ENERGY ANALYSIS ... aerospace structures. Analytical prediction of these environments is beyond the current scope of classical modal techniques. Statistical energy analysis methods have been developed that circumvent the difficulties of high-frequency modal analysis. These statistical energy analysis methods are evaluated...

  11. Algorithms for tensor network renormalization

    NASA Astrophysics Data System (ADS)

    Evenbly, G.

    2017-01-01

    We discuss in detail algorithms for implementing tensor network renormalization (TNR) for the study of classical statistical and quantum many-body systems. First, we recall established techniques for how the partition function of a 2D classical many-body system or the Euclidean path integral of a 1D quantum system can be represented as a network of tensors, before describing how TNR can be implemented to efficiently contract the network via a sequence of coarse-graining transformations. The efficacy of the TNR approach is then benchmarked for the 2D classical statistical and 1D quantum Ising models; in particular the ability of TNR to maintain a high level of accuracy over sustained coarse-graining transformations, even at a critical point, is demonstrated.
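
    The first step described above, representing the partition function as a tensor network, can be verified on a tiny lattice. The sketch below builds the standard local tensor for the 2D Ising model and contracts a 2 x 2 torus exactly against the brute-force spin sum (no TNR coarse-graining or truncation is performed here):

```python
import numpy as np
from itertools import product

beta = 0.4
W = np.array([[np.exp(beta), np.exp(-beta)],
              [np.exp(-beta), np.exp(beta)]])    # bond Boltzmann weights
evals, U = np.linalg.eigh(W)
M = U @ np.diag(np.sqrt(evals))                  # W = M @ M.T (evals > 0)

# One tensor per site, legs ordered (up, left, down, right); each bond
# weight W is split between the two site tensors it connects.
T = np.einsum('su,sl,sd,sr->uldr', M, M, M, M)

# Exact contraction of a 2x2 periodic lattice of T tensors
Z_tn = np.einsum('bfae,decf,ahbg,cgdh->', T, T, T, T)

# Brute force: 2^4 spin configurations; bonds are doubled by the wrap-around
Z_bf = 0.0
for s00, s01, s10, s11 in product([1, -1], repeat=4):
    pairs = [(s00, s01), (s01, s00), (s10, s11), (s11, s10),
             (s00, s10), (s10, s00), (s01, s11), (s11, s01)]
    Z_bf += np.exp(beta * sum(a * b for a, b in pairs))

print(f"tensor network Z = {Z_tn:.6f}, brute force Z = {Z_bf:.6f}")
```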

  12. Structural Equation Modeling: Possibilities for Language Learning Researchers

    ERIC Educational Resources Information Center

    Hancock, Gregory R.; Schoonen, Rob

    2015-01-01

    Although classical statistical techniques have been a valuable tool in second language (L2) research, L2 research questions have started to grow beyond those techniques' capabilities, and indeed are often limited by them. Questions about how complex constructs relate to each other or to constituent subskills, about longitudinal development in…

  13. Integration of ecological indices in the multivariate evaluation of an urban inventory of street trees

    Treesearch

    J. Grabinsky; A. Aldama; A. Chacalo; H. J. Vazquez

    2000-01-01

    Inventory data of Mexico City's street trees were studied using classical statistical, arboricultural, and ecological statistical approaches. Multivariate techniques were applied to both. Results did not differ substantially and were complementary. It was possible to reduce inventory data and to group species, boroughs, blocks, and variables.

  14. A heuristic statistical stopping rule for iterative reconstruction in emission tomography.

    PubMed

    Ben Bouallègue, F; Crouzet, J F; Mariano-Goulart, D

    2013-01-01

    We propose a statistical stopping criterion for iterative reconstruction in emission tomography based on a heuristic statistical description of the reconstruction process. The method was assessed for MLEM reconstruction. Based on Monte-Carlo numerical simulations and using a perfectly modeled system matrix, our method was compared with classical iterative reconstruction followed by low-pass filtering in terms of Euclidean distance to the exact object, noise, and resolution. The stopping criterion was then evaluated with realistic PET data of a Hoffman brain phantom produced using the GATE platform for different count levels. The numerical experiments showed that compared with the classical method, our technique yielded significant improvement of the noise-resolution tradeoff for a wide range of counting statistics compatible with routine clinical settings. When working with realistic data, the stopping rule allowed a qualitatively and quantitatively efficient determination of the optimal image. Our method appears to give a reliable estimation of the optimal stopping point for iterative reconstruction. It should thus be of practical interest as it produces images with similar or better quality than classical post-filtered iterative reconstruction, with a controlled computation time.
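
    For context, the MLEM update that such a stopping rule monitors is compact; the system matrix, phantom, and count level below are toy stand-ins, not the paper's GATE simulations:

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.uniform(0, 1, size=(40, 16))       # toy system matrix (bins x pixels)
x_true = rng.uniform(0, 5, size=16)        # toy object
y = rng.poisson(A @ x_true)                # Poisson emission data

x = np.ones(16)                            # uniform initial image
sensitivity = A.T @ np.ones(len(y))
for k in range(100):
    x *= (A.T @ (y / np.clip(A @ x, 1e-12, None))) / sensitivity
    # a statistical stopping rule would monitor y versus A @ x here

print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```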

  15. Acceleration techniques for dependability simulation. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Barnette, James David

    1995-01-01

    As computer systems increase in complexity, the need to project system performance from the earliest design and development stages increases. We have to employ simulation for detailed dependability studies of large systems. However, as the complexity of the simulation model increases, the time required to obtain statistically significant results also increases. This paper discusses an approach that is application independent and can be readily applied to any process-based simulation model. Topics include background on classical discrete event simulation and techniques for random variate generation and statistics gathering to support simulation.
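
    Two of the ingredients mentioned above, random variate generation and statistics gathering, in minimal form: inverse-transform sampling of an exponential service time plus Welford's online mean/variance update (the rate parameter is an illustrative assumption):

```python
import numpy as np

rng = np.random.default_rng(5)
rate = 2.0                                   # toy service rate

n, mean, m2 = 0, 0.0, 0.0
for _ in range(100000):
    u = rng.random()
    service = -np.log(1.0 - u) / rate        # inverse CDF of Exponential(rate)
    n += 1
    delta = service - mean                   # Welford running statistics
    mean += delta / n
    m2 += delta * (service - mean)

print(f"mean = {mean:.4f} (exact 0.5), var = {m2 / (n - 1):.4f} (exact 0.25)")
```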

  16. Acute effect of scapular proprioceptive neuromuscular facilitation (PNF) techniques and classic exercises in adhesive capsulitis: a randomized controlled trial

    PubMed Central

    Balcı, Nilay Comuk; Yuruk, Zeliha Ozlem; Zeybek, Aslican; Gulsen, Mustafa; Tekindal, Mustafa Agah

    2016-01-01

    [Purpose] The aim of our study was to compare the initial effects of scapular proprioceptive neuromuscular facilitation techniques and classic exercise interventions with physiotherapy modalities on pain, scapular dyskinesis, range of motion, and function in adhesive capsulitis. [Subjects and Methods] Fifty-three subjects were allocated to 3 groups: scapular proprioceptive neuromuscular facilitation exercises plus physiotherapy modalities, classic exercise plus physiotherapy modalities, and physiotherapy modalities only. The intervention was applied in a single session. The Visual Analog Scale, Lateral Scapular Slide Test, range of motion, and Simple Shoulder Test were evaluated before and immediately after the one-hour intervention in the same session. [Results] All of the groups showed significant differences in shoulder flexion and abduction range of motion and Simple Shoulder Test scores. There were statistically significant differences in Visual Analog Scale scores in the proprioceptive neuromuscular facilitation and control groups, and no treatment method had a significant effect on the Lateral Scapular Slide Test results. There were no statistically significant differences between the groups before and after the intervention. [Conclusion] Proprioceptive neuromuscular facilitation, classic exercise, and physiotherapy modalities had immediate effects on adhesive capsulitis in our study. However, there was no additional benefit of exercises in one session over physiotherapy modalities. Also, an effective treatment regimen for shoulder rehabilitation of adhesive capsulitis patients should include scapular exercises. PMID:27190456

  17. Analytic Methods for Adjusting Subjective Rating Schemes.

    ERIC Educational Resources Information Center

    Cooper, Richard V. L.; Nelson, Gary R.

    Statistical and econometric techniques of correcting for supervisor bias in models of individual performance appraisal were developed, using a variant of the classical linear regression model. Location bias occurs when individual performance is systematically overestimated or underestimated, while scale bias results when raters either exaggerate…

  18. Emotion Recognition From Singing Voices Using Contemporary Commercial Music and Classical Styles.

    PubMed

    Hakanpää, Tua; Waaramaa, Teija; Laukkanen, Anne-Maria

    2018-02-22

    This study examines the recognition of emotion in contemporary commercial music (CCM) and classical styles of singing. This information may be useful in improving the training of interpretation in singing. This is an experimental comparative study. Thirteen singers (11 female, 2 male) with a minimum of 3 years' professional-level singing studies (in CCM or classical technique or both) participated. They sang at three pitches (females: a, e1, a1, males: one octave lower) expressing anger, sadness, joy, tenderness, and a neutral state. Twenty-nine listeners listened to 312 short (0.63- to 4.8-second) voice samples, 135 of which were sung using a classical singing technique and 165 of which were sung in a CCM style. The listeners were asked which emotion they heard. Activity and valence were derived from the chosen emotions. The percentage of correct recognitions out of all the answers in the listening test (N = 9048) was 30.2%. The recognition percentage for the CCM-style singing technique was higher (34.5%) than for the classical-style technique (24.5%). Valence and activation were better perceived than the emotions themselves, and activity was better recognized than valence. A higher pitch was more likely to be perceived as joy or anger, and a lower pitch as sorrow. Both valence and activation were better recognized in the female CCM samples than in the other samples. There are statistically significant differences in the recognition of emotions between classical and CCM styles of singing. Furthermore, in the singing voice, pitch affects the perception of emotions, and valence and activity are more easily recognized than emotions.

  19. A Matched Filter Technique for Slow Radio Transient Detection and First Demonstration with the Murchison Widefield Array

    NASA Astrophysics Data System (ADS)

    Feng, L.; Vaulin, R.; Hewitt, J. N.; Remillard, R.; Kaplan, D. L.; Murphy, Tara; Kudryavtseva, N.; Hancock, P.; Bernardi, G.; Bowman, J. D.; Briggs, F.; Cappallo, R. J.; Deshpande, A. A.; Gaensler, B. M.; Greenhill, L. J.; Hazelton, B. J.; Johnston-Hollitt, M.; Lonsdale, C. J.; McWhirter, S. R.; Mitchell, D. A.; Morales, M. F.; Morgan, E.; Oberoi, D.; Ord, S. M.; Prabu, T.; Udaya Shankar, N.; Srivani, K. S.; Subrahmanyan, R.; Tingay, S. J.; Wayth, R. B.; Webster, R. L.; Williams, A.; Williams, C. L.

    2017-03-01

    Many astronomical sources produce transient phenomena at radio frequencies, but the transient sky at low frequencies (<300 MHz) remains relatively unexplored. Blind surveys with new wide-field radio instruments are setting increasingly stringent limits on the transient surface density on various timescales. Although many of these instruments are limited by classical confusion noise from an ensemble of faint, unresolved sources, one can in principle detect transients below the classical confusion limit to the extent that the classical confusion noise is independent of time. We develop a technique for detecting radio transients that is based on temporal matched filters applied directly to time series of images, rather than relying on source-finding algorithms applied to individual images. This technique has well-defined statistical properties and is applicable to variable and transient searches for both confusion-limited and non-confusion-limited instruments. Using the Murchison Widefield Array as an example, we demonstrate that the technique works well on real data despite the presence of classical confusion noise, sidelobe confusion noise, and other systematic errors. We searched for transients lasting between 2 minutes and 3 months. We found no transients and set improved upper limits on the transient surface density at 182 MHz for flux densities between ~20 and 200 mJy, providing the best limits to date for hour- and month-long transients.
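
    A minimal sketch of the matched-filter statistic on a single pixel's light curve (the transient amplitude, duration, and stationary noise level are illustrative assumptions): the S/N peaks for the template whose width matches the transient.

```python
import numpy as np

rng = np.random.default_rng(6)
nt, sigma = 500, 1.0
lightcurve = rng.normal(0, sigma, nt)     # e.g. residual confusion noise
lightcurve[200:230] += 0.8                # weak 30-sample transient

def matched_filter(series, width, sigma):
    template = np.ones(width)             # boxcar transient template
    return np.correlate(series, template, mode='valid') / (sigma * np.sqrt(width))

for width in (5, 30, 120):                # bank of trial timescales
    snr = matched_filter(lightcurve, width, sigma)
    print(f"width {width:3d}: peak S/N = {snr.max():.1f}")
```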

  20. From classical to quantum and back: Hamiltonian adaptive resolution path integral, ring polymer, and centroid molecular dynamics

    NASA Astrophysics Data System (ADS)

    Kreis, Karsten; Kremer, Kurt; Potestio, Raffaello; Tuckerman, Mark E.

    2017-12-01

    Path integral-based methodologies play a crucial role for the investigation of nuclear quantum effects by means of computer simulations. However, these techniques are significantly more demanding than corresponding classical simulations. To reduce this numerical effort, we recently proposed a method, based on a rigorous Hamiltonian formulation, which restricts the quantum modeling to a small but relevant spatial region within a larger reservoir where particles are treated classically. In this work, we extend this idea and show how it can be implemented along with state-of-the-art path integral simulation techniques, including path-integral molecular dynamics, which allows for the calculation of quantum statistical properties, and ring-polymer and centroid molecular dynamics, which allow the calculation of approximate quantum dynamical properties. To this end, we derive a new integration algorithm that also makes use of multiple time-stepping. The scheme is validated via adaptive classical-path-integral simulations of liquid water. Potential applications of the proposed multiresolution method are diverse and include efficient quantum simulations of interfaces as well as complex biomolecular systems such as membranes and proteins.

  21. Confidence of compliance: a Bayesian approach for percentile standards.

    PubMed

    McBride, G B; Ellis, J C

    2001-04-01

    Rules for assessing compliance with percentile standards commonly limit the number of exceedances permitted in a batch of samples taken over a defined assessment period. Such rules are commonly developed using classical statistical methods. Results from alternative Bayesian methods are presented (using beta-distributed prior information and a binomial likelihood), resulting in "confidence of compliance" graphs. These allow simple reading of the consumer's risk and the supplier's risks for any proposed rule. The influence of the prior assumptions required by the Bayesian technique on the confidence results is demonstrated, using two reference priors (uniform and Jeffreys') and also using optimistic and pessimistic user-defined priors. All four give less pessimistic results than does the classical technique, because interpreting classical results as "confidence of compliance" actually invokes a Bayesian approach with an extreme prior distribution. Jeffreys' prior is shown to be the most generally appropriate choice of prior distribution. Cost savings can be expected using rules based on this approach.
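
    The computation behind such "confidence of compliance" graphs is a one-line conjugate update: with a Beta prior on the true exceedance probability p and k exceedances observed in n samples, the confidence is a Beta CDF. The 95th-percentile standard, n, and k below are illustrative, not the paper's:

```python
from scipy.stats import beta

n, k, standard = 60, 1, 0.05          # samples, exceedances, exceedance limit

priors = {"uniform":  (1.0, 1.0),
          "Jeffreys": (0.5, 0.5)}
for name, (a, b) in priors.items():
    confidence = beta.cdf(standard, a + k, b + n - k)   # P(p <= 0.05 | data)
    print(f"{name:8s} prior: confidence of compliance = {confidence:.3f}")
```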

  22. Prerequisites for Systems Analysts: Analytic and Management Demands of a New Approach to Educational Administration.

    ERIC Educational Resources Information Center

    Ammentorp, William

    There is much to be gained by using systems analysis in educational administration. Most administrators, presently relying on classical statistical techniques restricted to problems having few variables, should be trained to use more sophisticated tools such as systems analysis. The systems analyst, interested in the basic processes of a group or…

  23. The Statistical Interpretation of Classical Thermodynamic Heating and Expansion Processes

    ERIC Educational Resources Information Center

    Cartier, Stephen F.

    2011-01-01

    A statistical model has been developed and applied to interpret thermodynamic processes typically presented from the macroscopic, classical perspective. Through this model, students learn and apply the concepts of statistical mechanics, quantum mechanics, and classical thermodynamics in the analysis of the (i) constant volume heating, (ii)…

  24. A statistical theory for sound radiation and reflection from a duct

    NASA Technical Reports Server (NTRS)

    Cho, Y. C.

    1979-01-01

    A new analytical method is introduced for the study of the sound radiation and reflection from the open end of a duct. The sound is thought of as an aggregation of quasiparticles (phonons). The motion of the latter is described in terms of the statistical distribution, which is derived from the classical wave theory. The results are in good agreement with the solutions obtained using the Wiener-Hopf technique when the latter is applicable, but the new method is simple and provides a straightforward physical interpretation of the problem. Furthermore, it is applicable to a problem involving a duct in which modes are difficult to determine or cannot be defined at all, whereas the Wiener-Hopf technique is not.

  25. Hearing Outcome With the Use of Glass Ionomer Cement as an Alternative to Crimping in Stapedotomy.

    PubMed

    Elzayat, Saad; Younes, Ahmed; Fouad, Ayman; Erfan, Fatthe; Mahrous, Ali

    2017-10-01

    To evaluate early hearing outcomes using glass ionomer cement to fix the Teflon piston prosthesis onto the long process of the incus, to minimize residual conductive hearing loss after stapedotomy. Original report of a prospective randomized controlled study. Tertiary referral center. A total of 80 consecutive patients with otosclerosis were randomized into two groups. Group A is a control group in which 40 patients underwent small fenestra stapedotomy using the classic technique. Group B included 40 patients who were subjected to small fenestra stapedotomy with fixation of the incus-prosthesis junction with glass ionomer bone cement. Stapedotomy with the classical technique in group A and the alternative technique in group B. The audiometric results before and after surgery. Analysis of the results was performed using the paired t test to compare pre- and postoperative results. The χ² test was used to compare the results of the two groups. A p value less than 0.05 was considered statistically significant. Significant postoperative improvement of both pure-tone air conduction thresholds and air-bone gaps was reported in the two studied groups. The postoperative average residual air-bone gap and hearing gain were statistically significantly better in group B (p < 0.05) than in group A. The use of glass ionomer bone cement in primary otosclerosis surgery using the aforementioned prosthesis and the surgical technique is of significant value in producing maximal closure of the air-bone gap and better audiological outcomes.

  26. Accessible Information Without Disturbing Partially Known Quantum States on a von Neumann Algebra

    NASA Astrophysics Data System (ADS)

    Kuramochi, Yui

    2018-04-01

    This paper addresses the problem of how much information we can extract without disturbing a statistical experiment, which is a family of partially known normal states on a von Neumann algebra. We define the classical part of a statistical experiment as the restriction of the equivalent minimal sufficient statistical experiment to the center of the outcome space, which, in the case of density operators on a Hilbert space, corresponds to the classical probability distributions appearing in the maximal decomposition by Koashi and Imoto (Phys. Rev. A 66, 022318, 2002). We show that we can access by a Schwarz or completely positive channel at most the classical part of a statistical experiment if we do not disturb the states. We apply this result to the broadcasting problem of a statistical experiment. We also show that the classical part of the direct product of statistical experiments is the direct product of the classical parts of the statistical experiments. The proof of the latter result is based on the theorem that the direct product of minimal sufficient statistical experiments is also minimal sufficient.

  27. Density-based empirical likelihood procedures for testing symmetry of data distributions and K-sample comparisons.

    PubMed

    Vexler, Albert; Tanajian, Hovig; Hutson, Alan D

    In practice, parametric likelihood-ratio techniques are powerful statistical tools. In this article, we propose and examine novel and simple distribution-free test statistics that efficiently approximate parametric likelihood ratios to analyze and compare distributions of K groups of observations. Using the density-based empirical likelihood methodology, we develop a Stata package that applies a test for symmetry of data distributions and compares K-sample distributions. Recognizing that recent statistical software packages do not sufficiently address K-sample nonparametric comparisons of data distributions, we propose a new Stata command, vxdbel, to execute exact density-based empirical likelihood-ratio tests using K samples. To calculate p-values of the proposed tests, we use the following methods: 1) a classical technique based on Monte Carlo p-value evaluations; 2) an interpolation technique based on tabulated critical values; and 3) a new hybrid technique that combines methods 1 and 2. The third, cutting-edge method is shown to be very efficient in the context of exact-test p-value computations. This Bayesian-type method considers tabulated critical values as prior information and Monte Carlo generations of test statistic values as data used to depict the likelihood function. In this case, a nonparametric Bayesian method is proposed to compute critical values of exact tests.
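
    A minimal sketch of method 1 above, a classical Monte Carlo p-value, here for a crude symmetry test by random sign-flipping around the median (the statistic and data are illustrative choices, not the vxdbel implementation):

```python
import numpy as np

rng = np.random.default_rng(7)

def statistic(z):
    return abs(np.mean(z) - np.median(z))        # crude asymmetry measure

x = rng.exponential(size=80)                     # skewed sample under test
z = x - np.median(x)
observed = statistic(z)

B = 9999                                          # Monte Carlo resamples
null = np.array([statistic(z * rng.choice([-1.0, 1.0], size=z.size))
                 for _ in range(B)])
p_mc = (1 + np.sum(null >= observed)) / (B + 1)   # (B + 1) correction
print(f"Monte Carlo p-value for symmetry: {p_mc:.4f}")
```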

  28. A magnet built on bronchoscopic suction for extraction of tracheobronchial headscarf pins: a novel technique and review of a tertiary centre experience

    PubMed Central

    Elsayed, Hany H.; Mostafa, Ahmed M.; Soliman, Saleh; El-Bawab, Hatem Y.; Moharram, Adel A.; El-Nori, Ahmed A.

    2016-01-01

    OBJECTIVES Airway metal pins are one of the most commonly inhaled foreign bodies in Eastern societies in young females wearing headscarves. We innovated a modified bronchoscopic technique to extract tracheobronchial headscarf pins by the insertion of a magnet to allow an easy and non-traumatic extraction of the pins. The aim of this study was to assess the feasibility and safety of our new technique and compare it with our large previous experience with the classic bronchoscopic method of extraction of tracheobronchial headscarf pins. METHODS We performed a study comparing our retrospective experience of classic bronchoscopic extraction from February 2004 to January 2014 and prospective experience with our modified technique using the magnet from January 2014 to June 2015. An institutional review board and new device approval were obtained. RESULTS Three hundred and twenty-six procedures on 315 patients were performed during our initial 10-year experience. Of them, 304 patients were females. The median age of our group was 13 (0–62). The median time from inhalation to procedure was 1 day (0–1022). After introducing our modified new technique using the magnet, 20 procedures were performed. Nineteen were females. The median time of the procedure and the need to forcefully bend the pin for extraction were in favour of the new technique in comparison with our classic approach (2 vs 6 min; P < 0.001) (2 patients = 20% vs 192 = 58%; P < 0.001). The conversion rate to surgery was also in favour of the modified technique but did not reach statistical significance (0 = 0% vs 15 = 4.8%; P = 0.32). All patients who underwent the modified technique were discharged home on the same day of the procedure. No procedural complications were recorded. All remain well on a follow-up period of up to 14 months. CONCLUSIONS Bronchoscopic extraction of tracheobronchial inhaled headscarf pins using a novel technique using homemade magnets was safer and simpler in comparison with our large experience with the classic approach. We advise the use of this device (or concept) in selected patients in centres dealing with this problem. PMID:26850113

  29. Statistical mechanics based on fractional classical and quantum mechanics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Korichi, Z.; Meftah, M. T., E-mail: mewalid@yahoo.com

    2014-03-15

    The purpose of this work is to study some problems in statistical mechanics based on fractional classical and quantum mechanics. At the first stage, we present the thermodynamical properties of the classical ideal gas and of a system of N classical oscillators; in both cases, the Hamiltonian contains fractional exponents of the phase-space variables (position and momentum). At the second stage, in the context of fractional quantum mechanics, we calculate the thermodynamical properties of black-body radiation and study Bose-Einstein statistics, with the related problem of condensation, and Fermi-Dirac statistics.

  30. New efficient optimizing techniques for Kalman filters and numerical weather prediction models

    NASA Astrophysics Data System (ADS)

    Famelis, Ioannis; Galanis, George; Liakatas, Aristotelis

    2016-06-01

    The need for accurate local environmental predictions and simulations beyond classical meteorological forecasts has been increasing in recent years because of the great number of applications that are directly or indirectly affected: renewable energy resource assessment, natural hazards early warning systems, and questions on global warming and climate change can be listed among them. Within this framework, the utilization of numerical weather and wave prediction systems, in conjunction with advanced statistical techniques that support the elimination of the model bias and the reduction of the error variability, may successfully address the above issues. In the present work, new optimization methods are studied and tested in selected areas of Greece where the use of renewable energy sources is of critical importance. The added value of the proposed work lies in the solid mathematical background adopted, making use of information geometry and statistical techniques, new versions of Kalman filters, and state-of-the-art numerical analysis tools.
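
    A minimal sketch of the Kalman-filter postprocessing idea mentioned above: the model's systematic forecast bias is tracked as a scalar hidden state and subtracted from each new forecast (the synthetic series and noise settings are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(8)
n = 300
truth = 10 + np.cumsum(rng.normal(0, 0.3, n))     # synthetic observed series
forecast = truth + 2.0 + rng.normal(0, 1.0, n)    # model output with +2 bias

bias, P = 0.0, 1.0         # bias estimate and its variance
Q, R = 0.01, 1.0           # process and observation noise variances
corrected = np.empty(n)
for t in range(n):
    P += Q                                        # predict: bias ~ random walk
    corrected[t] = forecast[t] - bias             # correct today's forecast
    K = P / (P + R)                               # Kalman gain
    bias += K * (forecast[t] - truth[t] - bias)   # update after verification
    P *= 1 - K

print(f"mean error: raw = {np.mean(forecast - truth):.2f}, "
      f"corrected = {np.mean(corrected - truth):.2f}")
```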

  31. Diffusion-weighted imaging and demyelinating diseases: new aspects of an old advanced sequence.

    PubMed

    Rueda-Lopes, Fernanda C; Hygino da Cruz, Luiz C; Doring, Thomas M; Gasparetto, Emerson L

    2014-01-01

    The purpose of this article is to discuss classic applications of diffusion-weighted imaging (DWI) in demyelinating disease and the likely progression of DWI in the near future. DWI is an advanced technique used in the follow-up of demyelinating disease patients, focusing on the diagnosis of a new lesion before contrast enhancement. With technical advances, diffusion-tensor imaging, new postprocessing techniques such as tract-based spatial statistics, new ways of calculating diffusion such as kurtosis, and new applications for DWI and its spectrum are about to arise.

  32. Decomposition of the Inequality of Income Distribution by Income Types—Application for Romania

    NASA Astrophysics Data System (ADS)

    Andrei, Tudorel; Oancea, Bogdan; Richmond, Peter; Dhesi, Gurjeet; Herteliu, Claudiu

    2017-09-01

    This paper identifies the salient factors that characterize the inequality income distribution for Romania. Data analysis is rigorously carried out using sophisticated techniques borrowed from classical statistics (Theil). Decomposition of the inequalities measured by the Theil index is also performed. This study relies on an exhaustive (11.1 million records for 2014) data-set for total personal gross income of Romanian citizens.
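
    For reference, the Theil T index used above decomposes exactly into within-group and between-group terms. The sketch below uses synthetic two-region incomes as stand-ins for the Romanian micro-data:

```python
import numpy as np

rng = np.random.default_rng(9)

def theil(y):
    m = y.mean()
    return np.mean((y / m) * np.log(y / m))

region_a = rng.lognormal(mean=9.0, sigma=0.5, size=50000)
region_b = rng.lognormal(mean=9.6, sigma=0.7, size=30000)
y = np.concatenate([region_a, region_b])

groups = [region_a, region_b]
shares = [g.sum() / y.sum() for g in groups]              # income shares s_g
within = sum(s * theil(g) for s, g in zip(shares, groups))
between = sum(s * np.log(g.mean() / y.mean())
              for s, g in zip(shares, groups))

print(f"T = {theil(y):.4f} = within {within:.4f} + between {between:.4f}")
```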

  33. Influence of two different surgical techniques on the difficulty of impacted lower third molar extraction and their post-operative complications.

    PubMed

    Mavrodi, Alexandra; Ohanyan, Ani; Kechagias, Nikos; Tsekos, Antonis; Vahtsevanos, Konstantinos

    2015-09-01

    Post-operative complications of various degrees of severity are commonly observed in third molar impaction surgery. For this reason, a surgical procedure that decreases the trauma of bone and soft tissues should be a priority for surgeons. In the present study, we compare the efficacy and the post-operative complications of patients to whom two different surgical techniques were applied for impacted lower third molar extraction. Patients of the first group underwent the classical bur technique, while patients of the second group underwent another technique, in which an elevator was placed on the buccal surface of the impacted molar in order to luxate the alveolar socket more easily. Comparing the two techniques, we observed a statistically significant decrease in the duration of the procedure and in the need for tooth sectioning when applying the second surgical technique, while the post-operative complications were similar in the two groups. We also found a statistically significant lower incidence of lingual nerve lesions and only a slightly higher frequency of sharp mandibular bone irregularities in the second group, which however was not statistically significant. The results of our study indicate that the surgical technique using an elevator on the buccal surface of the tooth seems to be a reliable method to extract impacted third molars safely, easily, quickly and with the minimum trauma to the surrounding tissues.

  34. Degraded Chinese rubbing images thresholding based on local first-order statistics

    NASA Astrophysics Data System (ADS)

    Wang, Fang; Hou, Ling-Ying; Huang, Han

    2017-06-01

    Segmenting Chinese characters from degraded document images is a necessary step for optical character recognition (OCR); however, it is challenging because of the various kinds of noise in such images. In this paper, we present three local first-order statistics methods for adaptive thresholding that segment the text and non-text regions of Chinese rubbing images. The segmentation results were assessed both by visual inspection and numerically. In experiments, the methods obtained better results than classical techniques in the binarization of real Chinese rubbing images and the PHIBD 2012 dataset.
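
    A minimal sketch of thresholding by local first-order statistics, here a Niblack-style rule t(x, y) = local mean + k * local std in a sliding window (the paper's three specific statistics may differ, and the toy image is an illustrative stand-in for a rubbing):

```python
import numpy as np
from scipy.ndimage import uniform_filter

def local_threshold(image, window=25, k=-0.2):
    img = image.astype(float)
    mean = uniform_filter(img, window)                 # local first moment
    sq_mean = uniform_filter(img ** 2, window)
    std = np.sqrt(np.clip(sq_mean - mean ** 2, 0, None))
    return img < mean + k * std                        # True where ink

rng = np.random.default_rng(10)
img = (200 - 60 * np.linspace(0, 1, 256)[None, :]      # uneven illumination
       + rng.normal(0, 12, (256, 256)))                # paper noise
img[100:110, 40:200] -= 90                             # one dark stroke
print(f"fraction labeled as text: {local_threshold(img).mean():.3f}")
```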

  35. Full statistical mode reconstruction of a light field via a photon-number-resolved measurement

    NASA Astrophysics Data System (ADS)

    Burenkov, I. A.; Sharma, A. K.; Gerrits, T.; Harder, G.; Bartley, T. J.; Silberhorn, C.; Goldschmidt, E. A.; Polyakov, S. V.

    2017-05-01

    We present a method to reconstruct the complete statistical mode structure and optical losses of multimode conjugated optical fields using an experimentally measured joint photon-number probability distribution. We demonstrate that this method evaluates classical and nonclassical properties using a single measurement technique and is well suited for quantum mesoscopic state characterization. We obtain a nearly perfect reconstruction of a field comprised of up to ten modes based on a minimal set of assumptions. To show the utility of this method, we use it to reconstruct the mode structure of an unknown bright parametric down-conversion source.

  36. Quantum formalism for classical statistics

    NASA Astrophysics Data System (ADS)

    Wetterich, C.

    2018-06-01

    In static classical statistical systems the problem of information transport from a boundary to the bulk finds a simple description in terms of wave functions or density matrices. While the transfer matrix formalism is a type of Heisenberg picture for this problem, we develop here the associated Schrödinger picture that keeps track of the local probabilistic information. The transport of the probabilistic information between neighboring hypersurfaces obeys a linear evolution equation, and therefore the superposition principle for the possible solutions. Operators are associated to local observables, with rules for the computation of expectation values similar to quantum mechanics. We discuss how non-commutativity naturally arises in this setting. Also other features characteristic of quantum mechanics, such as complex structure, change of basis or symmetry transformations, can be found in classical statistics once formulated in terms of wave functions or density matrices. We construct for every quantum system an equivalent classical statistical system, such that time in quantum mechanics corresponds to the location of hypersurfaces in the classical probabilistic ensemble. For suitable choices of local observables in the classical statistical system one can, in principle, compute all expectation values and correlations of observables in the quantum system from the local probabilistic information of the associated classical statistical system. Realizing a static memory material as a quantum simulator for a given quantum system is not a matter of principle, but rather of practical simplicity.

  37. A Comparative Study between Universal Eclectic Septoplasty Technique and Cottle

    PubMed Central

    Amaral Neto, Odim Ferreira do; Mizoguchi, Flavio Massao; Freitas, Renato da Silva; Maniglia, João Jairney; Maniglia, Fábio Fabrício; Maniglia, Ricardo Fabrício

    2017-01-01

    Introduction: Since the last century, surgical correction of nasal septum deviation has been improved. The Universal Eclectic Technique was recently reported, and there are still few studies dedicated to addressing this surgical approach. Objective: The objective of this study is to compare the results of septal deviation correction achieved using the Universal Eclectic Technique (UET) with those obtained through Cottle's technique. Methods: This is a prospective study with two consecutive case series totaling 90 patients (40 women and 50 men), aged between 18 and 55 years. We divided patients into two groups according to the surgical approach: fifty-three patients underwent septoplasty through the Universal Eclectic Technique and thirty-seven patients were submitted to the classical Cottle's septoplasty technique. All patients answered the Nasal Obstruction Symptom Evaluation Scale (NOSE) questionnaire to assess pre- and postoperative nasal obstruction. Results: Statistical analysis showed a significantly shorter operating time for the UET group. Nasal edema assessment performed seven days after the surgery showed a prevalence of mild edema in the UET group and moderate edema in the Cottle's technique group. In regard to complication rates, UET presented a single case of septal hematoma, while in the Cottle's technique group we observed two cases of severe edema, one case of incapacitating headache, and one complaint of nasal pain. Conclusion: The Universal Eclectic Technique has proven to be a safe and effective surgical technique with faster symptomatic improvement, low complication rates, and reduced surgical time when compared with the classical Cottle's technique. PMID:28680499

  38. Introduction to Geostatistics

    NASA Astrophysics Data System (ADS)

    Kitanidis, P. K.

    1997-05-01

    Introduction to Geostatistics presents practical techniques for engineers and earth scientists who routinely encounter interpolation and estimation problems when analyzing data from field observations. Requiring no background in statistics, and with a unique approach that synthesizes classic and geostatistical methods, this book offers linear estimation methods for practitioners and advanced students. Well illustrated with exercises and worked examples, Introduction to Geostatistics is designed for graduate-level courses in earth sciences and environmental engineering.

  39. Nonlinear multivariate and time series analysis by neural network methods

    NASA Astrophysics Data System (ADS)

    Hsieh, William W.

    2004-03-01

    Methods in multivariate statistical analysis are essential for working with large amounts of geophysical data, data from observational arrays, from satellites, or from numerical model output. In classical multivariate statistical analysis, there is a hierarchy of methods, starting with linear regression at the base, followed by principal component analysis (PCA) and finally canonical correlation analysis (CCA). A multivariate time series method, the singular spectrum analysis (SSA), has been a fruitful extension of the PCA technique. The common drawback of these classical methods is that only linear structures can be correctly extracted from the data. Since the late 1980s, neural network methods have become popular for performing nonlinear regression and classification. More recently, neural network methods have been extended to perform nonlinear PCA (NLPCA), nonlinear CCA (NLCCA), and nonlinear SSA (NLSSA). This paper presents a unified view of the NLPCA, NLCCA, and NLSSA techniques and their applications to various data sets of the atmosphere and the ocean (especially for the El Niño-Southern Oscillation and the stratospheric quasi-biennial oscillation). These data sets reveal that the linear methods are often too simplistic to describe real-world systems, with a tendency to scatter a single oscillatory phenomenon into numerous unphysical modes or higher harmonics, which can be largely alleviated in the new nonlinear paradigm.

  40. A Monte Carlo–Based Bayesian Approach for Measuring Agreement in a Qualitative Scale

    PubMed Central

    Pérez Sánchez, Carlos Javier

    2014-01-01

    Agreement analysis has been an active research area whose techniques have been widely applied in psychology and other fields. However, statistical agreement among raters has been mainly considered from a classical statistics point of view. Bayesian methodology is a viable alternative that allows the inclusion of subjective initial information coming from expert opinions, personal judgments, or historical data. A Bayesian approach is proposed by providing a unified Monte Carlo–based framework to estimate all types of measures of agreement in a qualitative scale of response. The approach is conceptually simple and it has a low computational cost. Both informative and non-informative scenarios are considered. In case no initial information is available, the results are in line with the classical methodology, but providing more information on the measures of agreement. For the informative case, some guidelines are presented to elicitate the prior distribution. The approach has been applied to two applications related to schizophrenia diagnosis and sensory analysis. PMID:29881002
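
    A minimal sketch of the Monte Carlo-based Bayesian computation for one agreement measure, Cohen's kappa: place a Dirichlet prior on the cell probabilities of the raters' contingency table and push posterior draws through the kappa formula (the counts are illustrative, not the paper's applications):

```python
import numpy as np

rng = np.random.default_rng(11)
counts = np.array([[40, 5, 2],
                   [6, 30, 4],
                   [3, 5, 25]])            # rater A rows, rater B columns

prior = np.ones_like(counts)               # non-informative Dirichlet(1) prior
draws = rng.dirichlet((counts + prior).ravel(), size=20000).reshape(-1, 3, 3)

po = np.trace(draws, axis1=1, axis2=2)               # observed agreement
pe = (draws.sum(axis=2) * draws.sum(axis=1)).sum(1)  # chance agreement
kappa = (po - pe) / (1 - pe)

lo, hi = np.percentile(kappa, [2.5, 97.5])
print(f"posterior mean kappa = {kappa.mean():.3f}, 95% CI ({lo:.3f}, {hi:.3f})")
```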

  41. On the implications of the classical ergodic theorems: analysis of developmental processes has to focus on intra-individual variation.

    PubMed

    Molenaar, Peter C M

    2008-01-01

    It is argued that general mathematical-statistical theorems imply that standard statistical analysis techniques of inter-individual variation are invalid to investigate developmental processes. Developmental processes have to be analyzed at the level of individual subjects, using time series data characterizing the patterns of intra-individual variation. It is shown that standard statistical techniques based on the analysis of inter-individual variation appear to be insensitive to the presence of arbitrary large degrees of inter-individual heterogeneity in the population. An important class of nonlinear epigenetic models of neural growth is described which can explain the occurrence of such heterogeneity in brain structures and behavior. Links with models of developmental instability are discussed. A simulation study based on a chaotic growth model illustrates the invalidity of standard analysis of inter-individual variation, whereas time series analysis of intra-individual variation is able to recover the true state of affairs.

  42. Statistical Mechanics of Coherent Ising Machine — The Case of Ferromagnetic and Finite-Loading Hopfield Models —

    NASA Astrophysics Data System (ADS)

    Aonishi, Toru; Mimura, Kazushi; Utsunomiya, Shoko; Okada, Masato; Yamamoto, Yoshihisa

    2017-10-01

    The coherent Ising machine (CIM) has attracted attention as one of the most effective Ising computing architectures for solving large scale optimization problems because of its scalability and high-speed computational ability. However, it is difficult to implement the Ising computation in the CIM because the theories and techniques of classical thermodynamic equilibrium Ising spin systems cannot be directly applied to the CIM. This means we have to adapt these theories and techniques to the CIM. Here we focus on a ferromagnetic model and a finite loading Hopfield model, which are canonical models sharing a common mathematical structure with almost all other Ising models. We derive macroscopic equations to capture nonequilibrium phase transitions in these models. The statistical mechanical methods developed here constitute a basis for constructing evaluation methods for other Ising computation models.

  43. Statistical ultrasonics: the influence of Robert F. Wagner

    NASA Astrophysics Data System (ADS)

    Insana, Michael F.

    2009-02-01

    An important ongoing question for higher education is how to successfully mentor the next generation of scientists and engineers. It has been my privilege to have been mentored by one of the best, Dr Robert F. Wagner and his colleagues at the CDRH/FDA during the mid 1980s. Bob introduced many of us in medical ultrasonics to statistical imaging techniques. These ideas continue to broadly influence studies on adaptive aperture management (beamforming, speckle suppression, compounding), tissue characterization (texture features, Rayleigh/Rician statistics, scatterer size and number density estimators), and fundamental questions about how limitations of the human eye-brain system for extracting information from textured images can motivate image processing. He adapted the classical techniques of signal detection theory to coherent imaging systems that, for the first time in ultrasonics, related common engineering metrics for image quality to task-based clinical performance. This talk summarizes my wonderfully-exciting three years with Bob as I watched him explore topics in statistical image analysis that formed a rational basis for many of the signal processing techniques used in commercial systems today. It is a story of an exciting time in medical ultrasonics, and of how a sparkling personality guided and motivated the development of junior scientists who flocked around him in admiration and amazement.

  44. A magnet built on bronchoscopic suction for extraction of tracheobronchial headscarf pins: a novel technique and review of a tertiary centre experience.

    PubMed

    Elsayed, Hany H; Mostafa, Ahmed M; Soliman, Saleh; El-Bawab, Hatem Y; Moharram, Adel A; El-Nori, Ahmed A

    2016-05-01

    Airway metal pins are one of the most commonly inhaled foreign bodies in Eastern societies in young females wearing headscarves. We innovated a modified bronchoscopic technique to extract tracheobronchial headscarf pins by the insertion of a magnet to allow an easy and non-traumatic extraction of the pins. The aim of this study was to assess the feasibility and safety of our new technique and compare it with our large previous experience with the classic bronchoscopic method of extraction of tracheobronchial headscarf pins. We performed a study comparing our retrospective experience of classic bronchoscopic extraction from February 2004 to January 2014 and prospective experience with our modified technique using the magnet from January 2014 to June 2015. An institutional review board and new device approval were obtained. Three hundred and twenty-six procedures on 315 patients were performed during our initial 10-year experience. Of them, 304 patients were females. The median age of our group was 13 (0-62). The median time from inhalation to procedure was 1 day (0-1022). After introducing our modified new technique using the magnet, 20 procedures were performed. Nineteen were females. The median time of the procedure and the need to forcefully bend the pin for extraction were in favour of the new technique in comparison with our classic approach (2 vs 6 min; P < 0.001) (2 patients = 20% vs 192 = 58%; P < 0.001). The conversion rate to surgery was also in favour of the modified technique but did not reach statistical significance (0 = 0% vs 15 = 4.8%; P = 0.32). All patients who underwent the modified technique were discharged home on the same day of the procedure. No procedural complications were recorded. All remain well on a follow-up period of up to 14 months. Bronchoscopic extraction of tracheobronchial inhaled headscarf pins using a novel technique using homemade magnets was safer and simpler in comparison with our large experience with the classic approach. We advise the use of this device (or concept) in selected patients in centres dealing with this problem.

  45. On Ruch's Principle of Decreasing Mixing Distance in classical statistical physics

    NASA Astrophysics Data System (ADS)

    Busch, Paul; Quadt, Ralf

    1990-10-01

    Ruch's Principle of Decreasing Mixing Distance is reviewed as a statistical physical principle, and its basic support and geometric interpretation, the Ruch-Schranner-Seligman theorem, is generalized to be applicable to a large representative class of classical statistical systems.

  46. The modification of generalized uncertainty principle applied in the detection technique of femtosecond laser

    NASA Astrophysics Data System (ADS)

    Li, Ziyi

    2017-12-01

    The generalized uncertainty principle (GUP), also known as the generalized uncertainty relationship, is a modified form of the classical Heisenberg uncertainty principle in special cases. When we apply quantum gravity theories such as string theory, the theoretical results suggest that there should be a "minimum length of observation" of about the Planck scale (10⁻³⁵ m). Taking this basic scale of existence into account, we need to fix a new common form of the Heisenberg uncertainty principle in thermodynamic systems and make effective corrections to statistical physics questions concerning the quantum density of states. Especially at high temperatures and high energy levels, generalized uncertainty calculations have a disruptive impact on classical statistical physics theories, but the present theory of the femtosecond laser is still established on the classical Heisenberg uncertainty principle. In order to improve the detection accuracy and temporal resolution of the femtosecond laser, we applied the modified form of the generalized uncertainty principle to the wavelength, energy, and pulse time of the femtosecond laser in our work. We designed three typical systems, from micro to macro size, to estimate the feasibility of our theoretical model and method, respectively in a chemical solution, a crystal lattice, and a nuclear fission reactor.

  7. Nonparametric functional data estimation applied to ozone data: prediction and extreme value analysis.

    PubMed

    Quintela-del-Río, Alejandro; Francisco-Fernández, Mario

    2011-02-01

    The study of extreme values and prediction of ozone data is an important topic of research when dealing with environmental problems. Classical extreme value theory is usually used in air-pollution studies. It consists in fitting a parametric generalised extreme value (GEV) distribution to a data set of extreme values, and using the estimated distribution to compute return levels and other quantities of interest. Here, we propose to estimate these values using nonparametric functional data methods. Functional data analysis is a relatively new statistical methodology that generally deals with data consisting of curves or multi-dimensional variables. In this paper, we use this technique, jointly with nonparametric curve estimation, to provide alternatives to the usual parametric statistical tools. The nonparametric estimators are applied to real samples of maximum ozone values obtained from several monitoring stations belonging to the Automatic Urban and Rural Network (AURN) in the UK. The results show that nonparametric estimators work satisfactorily, outperforming the behaviour of classical parametric estimators. Functional data analysis is also used to predict stratospheric ozone concentrations. We show an application, using the data set of mean monthly ozone concentrations in Arosa, Switzerland, and the results are compared with those obtained by classical time series (ARIMA) analysis. Copyright © 2010 Elsevier Ltd. All rights reserved.
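
    As a concrete illustration of the kind of nonparametric curve estimation invoked here, the following minimal Python sketch implements a generic textbook Nadaraya-Watson kernel regression estimator on invented data; it is not the authors' exact functional-data method.

    ```python
    import numpy as np

    def nadaraya_watson(x0, x, y, h):
        """Gaussian-kernel estimate of E[y | x = x0] with bandwidth h."""
        w = np.exp(-0.5 * ((x - x0) / h) ** 2)
        return np.sum(w * y) / np.sum(w)

    # Invented daily maximum ozone values (ppb) versus temperature (deg C).
    rng = np.random.default_rng(0)
    temp = rng.uniform(10, 35, 100)
    ozone = 20 + 2.5 * temp + rng.normal(0, 8, 100)

    # Distribution-free estimate of the mean ozone level at 25 deg C.
    print(nadaraya_watson(25.0, temp, ozone, h=2.0))
    ```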

  8. Quantum Behavior of an Autonomous Maxwell Demon

    NASA Astrophysics Data System (ADS)

    Chapman, Adrian; Miyake, Akimasa

    2015-03-01

    A Maxwell Demon is an agent that can exploit knowledge of a system's microstate to perform useful work. The second law of thermodynamics is only recovered upon taking into account the work required to irreversibly update the demon's memory, bringing information theoretic concepts into a thermodynamic framework. Recently, there has been interest in modeling a classical Maxwell demon as an autonomous physical system to study this information-work tradeoff explicitly. Motivated by the idea that states with non-local entanglement structure can be used as a computational resource, we ask whether these states have thermodynamic resource quality as well by generalizing a particular classical autonomous Maxwell demon to the quantum regime. We treat the full quantum description using a matrix product operator formalism, which allows us to handle quantum and classical correlations in a unified framework. Applying this, together with techniques from statistical mechanics, we are able to approximate nonlocal quantities such as the erasure performed on the demon's memory register when correlations are present. Finally, we examine how the demon may use these correlations as a resource to outperform its classical counterpart.

  9. Legitimate Techniques for Improving the R-Square and Related Statistics of a Multiple Regression Model

    DTIC Science & Technology

    1981-01-01

    explanatory variable has been omitted. Ramsey (1974) has developed a rather interesting test for detecting specification errors using estimates of the... Peter. (1979) A Guide to Econometrics, Cambridge, MA: The MIT Press. Ramsey, J.B. (1974), "Classical Model Selection Through Specification Error Tests," in P. Zarembka, Ed., Frontiers in Econometrics, New York: Academic Press. Theil, Henri. (1971), Principles of Econometrics, New York: John Wiley

  10. Bacterial turbulence in motion

    NASA Astrophysics Data System (ADS)

    Rusconi, Roberto; Smriga, Steven; Stocker, Roman; Secchi, Eleonora; Buzzaccaro, Stefano; Piazza, Roberto

    2014-11-01

    Dense suspensions of motile bacteria exhibit collective dynamics akin to those observed in classic, high-Reynolds-number turbulence, yet this analogy has remained largely qualitative. Here we present experiments in which a dense suspension of Bacillus subtilis bacteria was driven through narrow microchannels and the velocity statistics of the flowing suspension were accurately quantified with a recently developed velocimetry technique. This revealed a robust intermittency phenomenon, whereby the average velocity profile of the flowing suspension oscillated between a plug-like flow and a parabolic flow. This intermittency is a hallmark of classic turbulence and was associated with the presence of collective structures in the suspension. Furthermore, quantification of the Reynolds stress profile revealed a direct link between the turbulent nature of the suspension and its anomalous viscosity.

  11. Contribution of artificial intelligence to the knowledge of prognostic factors in laryngeal carcinoma.

    PubMed

    Zapater, E; Moreno, S; Fortea, M A; Campos, A; Armengot, M; Basterra, J

    2000-11-01

    Many studies have investigated prognostic factors in laryngeal carcinoma, with sometimes conflicting results. Apart from the importance of environmental factors, the different statistical methods employed may have contributed to such discrepancies. A program based on artificial intelligence techniques was designed to determine the prognostic factors in a series of 122 laryngeal carcinomas. The results obtained were compared with those derived from two classical statistical methods (Cox regression and mortality tables). Tumor location was found to be the most important prognostic factor by all methods. The proposed intelligent system was found to be a sound method capable of detecting exceptional cases.

  12. Fundamental limits to superresolution fluorescence microscopy

    NASA Astrophysics Data System (ADS)

    Small, Alex

    2013-02-01

    Superresolution fluorescence microscopy techniques such as PALM, STORM, STED, and Structured Illumination Microscopy (SIM) enable imaging of live cells at nanometer resolution. The common theme in all of these techniques is that the diffraction limit is circumvented by controlling the states of fluorescent molecules. Although the samples are labeled very densely (i.e. with spacing much smaller than the Airy distance), not all of the molecules are emitting at the same time. Consequently, one does not encounter overlapping blurs. In the deterministic techniques (STED, SIM) the achievable resolution scales as the wavelength of light divided by the square root of the intensity of a beam used to control the fluorescent state. In the stochastic techniques (PALM, STORM), the achievable resolution scales as the wavelength of light divided by the square root of the number of photons collected. Although these limits arise from very different mechanisms (parabolic beam profiles for STED and SIM, statistics for PALM and STORM), in all cases the resolution scales inversely with the square root of a measure of the number of photons used in the experiment. We have developed a proof that this relationship between resolution and photon count is universal to techniques that control the states of fluorophores using classical light. Our proof encompasses linear and nonlinear optics, as well as computational post-processing techniques for extracting information beyond the diffraction limit. If there are techniques that can achieve a more efficient relationship between resolution and photon count, those techniques will require light exhibiting non-classical correlations.

  13. Rescaled earthquake recurrence time statistics: application to microrepeaters

    NASA Astrophysics Data System (ADS)

    Goltz, Christian; Turcotte, Donald L.; Abaimov, Sergey G.; Nadeau, Robert M.; Uchida, Naoki; Matsuzawa, Toru

    2009-01-01

    Slip on major faults primarily occurs during `characteristic' earthquakes. The recurrence statistics of characteristic earthquakes play an important role in seismic hazard assessment. A major problem in determining applicable statistics is the short sequences of characteristic earthquakes that are available worldwide. In this paper, we introduce a rescaling technique in which sequences can be superimposed to establish larger numbers of data points. We consider the Weibull and log-normal distributions, in both cases we rescale the data using means and standard deviations. We test our approach utilizing sequences of microrepeaters, micro-earthquakes which recur in the same location on a fault. It seems plausible to regard these earthquakes as a miniature version of the classic characteristic earthquakes. Microrepeaters are much more frequent than major earthquakes, leading to longer sequences for analysis. In this paper, we present results for the analysis of recurrence times for several microrepeater sequences from Parkfield, CA as well as NE Japan. We find that, once the respective sequence can be considered to be of sufficient stationarity, the statistics can be well fitted by either a Weibull or a log-normal distribution. We clearly demonstrate this fact by our technique of rescaled combination. We conclude that the recurrence statistics of the microrepeater sequences we consider are similar to the recurrence statistics of characteristic earthquakes on major faults.
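
    To make the rescaling idea concrete, here is a minimal Python sketch under the assumption that each sequence is simply divided by its own mean before pooling (one common variant; the paper also rescales with standard deviations). All numbers are invented.

    ```python
    import numpy as np
    from scipy import stats

    # Invented recurrence-time sequences (e.g., days between microrepeater
    # events at two different fault patches).
    sequences = [
        np.array([310.0, 295.0, 402.0, 350.0, 281.0]),
        np.array([120.0, 150.0, 90.0, 135.0, 160.0, 110.0]),
    ]

    # Rescale each short sequence so the sequences can be superimposed
    # into one larger sample of data points.
    pooled = np.concatenate([s / s.mean() for s in sequences])

    # Fit both candidate distributions to the pooled recurrence times and
    # compare the fits with a Kolmogorov-Smirnov statistic.
    weib = stats.weibull_min.fit(pooled, floc=0.0)
    logn = stats.lognorm.fit(pooled, floc=0.0)
    print("Weibull:   ", stats.kstest(pooled, "weibull_min", args=weib))
    print("log-normal:", stats.kstest(pooled, "lognorm", args=logn))
    ```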

  14. Biomechanical evaluation of knotless anatomical double-layer double-row rotator cuff repair: a comparative ex vivo study.

    PubMed

    Hepp, Pierre; Osterhoff, Georg; Engel, Thomas; Marquass, Bastian; Klink, Thomas; Josten, Christoph

    2009-07-01

    The layered configuration of the rotator cuff tendon is not taken into account in classic rotator cuff tendon repair techniques. The mechanical properties of (1) the classic double-row technique, (2) a double-layer double-row (DLDR) technique in simple suture configuration, and (3) a DLDR technique in mattress suture configuration are significantly different. Controlled laboratory study. Twenty-four sheep shoulders were assigned to 3 repair groups of full-thickness infraspinatus tears: group 1, traditional double-row repair; group 2, DLDR anchor repair with simple suture configuration; and group 3, DLDR knotless repair with mattress suture configuration. After ultrasound evaluation of the repair, each specimen was cyclically loaded with 10 to 100 N for 50 cycles. Each specimen was then loaded to failure at a rate of 1 mm/s. There were no statistically significant differences among the 3 testing groups for the mean footprint area. The cyclic loading test revealed no significant difference among the 3 groups with regard to elongation. For the load-to-failure test, groups 2 and 3 showed no differences in ultimate tensile load when compared with group 1. However, when compared to group 2, group 3 was found to have significantly higher values regarding ultimate load, ultimate elongation, and energy absorbed. The DLDR fixation techniques may provide strength of initial repair comparable with that of commonly used double-row techniques. When compared with the knotless technique with mattress sutures, simple suture configuration of DLDR repair may be too weak. Knotless DLDR rotator cuff repair may (1) restore the footprint by the use of double-row principles and (2) enable restoration of the shape and profile. Double-layer double-row fixation in mattress suture configuration has initial fixation strength comparable with that of the classic double-row fixation and so may potentially improve functional results of rotator cuff repair.

  15. Teaching Statistics Using Classic Psychology Research: An Activities-Based Approach

    ERIC Educational Resources Information Center

    Holmes, Karen Y.; Dodd, Brett A.

    2012-01-01

    In this article, we discuss a collection of active learning activities derived from classic psychology studies that illustrate the appropriate use of descriptive and inferential statistics. (Contains 2 tables.)

  16. The Development of Bayesian Theory and Its Applications in Business and Bioinformatics

    NASA Astrophysics Data System (ADS)

    Zhang, Yifei

    2018-03-01

    Bayesian theory originated from an essay of the British mathematician Thomas Bayes in 1763 and, after its development in the 20th century, Bayesian statistics has come to play a significant part in statistical study across all fields. Due to recent breakthroughs in high-dimensional integration, Bayesian statistics has been improved and perfected, and it can now be used to solve problems that classical statistics failed to solve. This paper summarizes the history, concepts and applications of Bayesian statistics, illustrated in five parts: the history of Bayesian statistics, the weaknesses of classical statistics, Bayesian theory, and its development and applications. The first two parts make a comparison between Bayesian statistics and classical statistics from a macroscopic perspective, and the last three parts focus on Bayesian theory in detail -- from introducing some particular Bayesian concepts to outlining their development and finally their applications.
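
    As a toy illustration of the Bayesian updating that the paper contrasts with classical estimation, the following minimal Python sketch performs a conjugate Beta-Binomial update; the prior choice and data are invented for the example.

    ```python
    from scipy import stats

    # Invented data: 7 successes in 10 trials.
    successes, trials = 7, 10

    # Classical (frequentist) point estimate: the sample proportion.
    p_hat = successes / trials

    # Bayesian estimate: a uniform Beta(1, 1) prior updated with the data;
    # conjugacy gives the posterior Beta(1 + s, 1 + f) in closed form.
    posterior = stats.beta(1 + successes, 1 + trials - successes)

    print(f"classical estimate: {p_hat:.3f}")
    print(f"posterior mean:     {posterior.mean():.3f}")
    print("95% credible interval:", posterior.interval(0.95))
    ```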

  17. The fracture load and failure types of veneered anterior zirconia crowns: an analysis of normal and Weibull distribution of complete and censored data.

    PubMed

    Stawarczyk, Bogna; Ozcan, Mutlu; Hämmerle, Christoph H F; Roos, Malgorzata

    2012-05-01

    The aim of this study was to compare the fracture load of veneered anterior zirconia crowns using normal and Weibull distributions of complete and censored data. Standardized zirconia frameworks for maxillary canines were milled using a CAD/CAM system and randomly divided into 3 groups (N=90, n=30 per group). They were veneered with three veneering ceramics, namely GC Initial ZR, Vita VM9 and IPS e.max Ceram, using a layering technique. The crowns were cemented with glass ionomer cement on metal abutments. The specimens were then loaded to fracture (1 mm/min) in a universal testing machine. The data were analyzed using the classical method (normal data distribution (μ, σ); Levene test and one-way ANOVA) and according to Weibull statistics (s, m). In addition, fracture load results were analyzed depending on complete and censored failure types (only chipping vs. total fracture together with chipping). When computed with complete data, significantly higher mean fracture loads (N) were observed for GC Initial ZR (μ=978, σ=157; s=1043, m=7.2) and VITA VM9 (μ=1074, σ=179; s=1139, m=7.8) than for IPS e.max Ceram (μ=798, σ=174; s=859, m=5.8) (p<0.05) by classical and Weibull statistics, respectively. When the data were censored for only total fracture, IPS e.max Ceram presented the lowest fracture load for chipping with both the classical distribution (μ=790, σ=160) and Weibull statistics (s=836, m=6.5). When total fracture with chipping (classical distribution) was considered as failure, IPS e.max Ceram did not show a significantly different fracture load for total fracture (μ=1054, σ=110) compared to the other groups (GC Initial ZR: μ=1039, σ=152; VITA VM9: μ=1170, σ=166). According to the Weibull-distributed data, VITA VM9 showed a significantly higher fracture load (s=1228, m=9.4) than the other groups. Both the classical distribution and Weibull statistics for complete data yielded similar outcomes. Censored data analysis of all-ceramic systems based on failure types is essential and brings additional information regarding the susceptibility to chipping or total fracture. Copyright © 2011 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.
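
    A minimal sketch, on invented fracture loads, of the two descriptive analyses the study compares; scipy's weibull_min shape and scale parameters play the roles of the Weibull modulus m and characteristic load s.

    ```python
    import numpy as np
    from scipy import stats

    # Invented fracture loads (N) for one veneering ceramic group.
    loads = np.array([850., 920., 1010., 780., 990., 1120., 870., 950., 1040., 890.])

    # Classical description: mean and standard deviation of a normal model.
    mu, sigma = loads.mean(), loads.std(ddof=1)

    # Weibull description: modulus m (scipy's shape) and characteristic
    # load s (scipy's scale), fitted with the location fixed at zero.
    m, _, s = stats.weibull_min.fit(loads, floc=0.0)

    print(f"normal:  mu = {mu:.0f} N, sigma = {sigma:.0f} N")
    print(f"Weibull: m = {m:.1f}, s = {s:.0f} N")
    ```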

  18. A Meta-Analysis of Hypnotherapeutic Techniques in the Treatment of PTSD Symptoms.

    PubMed

    O'Toole, Siobhan K; Solomon, Shelby L; Bergdahl, Stephen A

    2016-02-01

    The efficacy of hypnotherapeutic techniques as treatment for symptoms of posttraumatic stress disorder (PTSD) was explored through meta-analytic methods. Studies were selected through a search of 29 databases. Altogether, 81 studies discussing hypnotherapy and PTSD were reviewed for inclusion criteria. The outcomes of 6 studies representing 391 participants were analyzed using meta-analysis. Evaluation of effect sizes related to avoidance and intrusion, in addition to overall PTSD symptoms after hypnotherapy treatment, revealed that all studies showed that hypnotherapy had a positive effect on PTSD symptoms. The overall Cohen's d was large (-1.18) and statistically significant (p < .001). Effect sizes varied based on study quality; however, they were large and statistically significant. Using the classic fail-safe N to assess for publication bias, it was determined it would take 290 nonsignificant studies to nullify these findings. Copyright © 2016 International Society for Traumatic Stress Studies.
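
    The classic fail-safe N mentioned above can be computed from per-study z-scores with Rosenthal's formula; the sketch below uses invented z-scores, not the study's data.

    ```python
    def failsafe_n(z_scores, z_alpha=1.645):
        """Rosenthal's classic fail-safe N: how many unpublished null-result
        studies would be needed to drag the combined z below significance
        (z_alpha = 1.645 for one-tailed p = .05, so z_alpha^2 = 2.706)."""
        k = len(z_scores)
        return (sum(z_scores) ** 2) / (z_alpha ** 2) - k

    # Invented per-study z-scores (the meta-analysis pooled 6 studies).
    z = [2.8, 3.1, 2.2, 3.6, 2.9, 3.3]
    print(f"fail-safe N ~ {failsafe_n(z):.0f} null studies")
    ```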

  19. How to interpret the results of medical time series data analysis: Classical statistical approaches versus dynamic Bayesian network modeling.

    PubMed

    Onisko, Agnieszka; Druzdzel, Marek J; Austin, R Marshall

    2016-01-01

    Classical statistics is a well-established approach in the analysis of medical data. While the medical community seems to be familiar with the concept of a statistical analysis and its interpretation, the Bayesian approach, argued by many of its proponents to be superior to the classical frequentist approach, is still not well-recognized in the analysis of medical data. The goal of this study is to encourage data analysts to use the Bayesian approach, such as modeling with graphical probabilistic networks, as an insightful alternative to classical statistical analysis of medical data. This paper offers a comparison of two approaches to analysis of medical time series data: (1) classical statistical approach, such as the Kaplan-Meier estimator and the Cox proportional hazards regression model, and (2) dynamic Bayesian network modeling. Our comparison is based on time series cervical cancer screening data collected at Magee-Womens Hospital, University of Pittsburgh Medical Center over 10 years. The main outcomes of our comparison are cervical cancer risk assessments produced by the three approaches. However, our analysis discusses also several aspects of the comparison, such as modeling assumptions, model building, dealing with incomplete data, individualized risk assessment, results interpretation, and model validation. Our study shows that the Bayesian approach is (1) much more flexible in terms of modeling effort, and (2) it offers an individualized risk assessment, which is more cumbersome for classical statistical approaches.
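
    A minimal sketch of the classical side of this comparison, a Kaplan-Meier survival estimate, using the third-party lifelines library on invented follow-up data (a Cox model would be fitted analogously with lifelines' CoxPHFitter).

    ```python
    import pandas as pd
    from lifelines import KaplanMeierFitter

    # Invented follow-up data: time to event (months) and an indicator of
    # whether the event was observed (1) or the record was censored (0).
    df = pd.DataFrame({
        "months":   [6, 12, 18, 24, 30, 36, 42, 48, 54, 60],
        "observed": [1,  0,  1,  1,  0,  1,  0,  0,  1,  0],
    })

    kmf = KaplanMeierFitter()
    kmf.fit(df["months"], event_observed=df["observed"])
    print(kmf.survival_function_)   # classical survival curve estimate
    ```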

  20. Comparison of classical statistical methods and artificial neural network in traffic noise prediction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nedic, Vladimir, E-mail: vnedic@kg.ac.rs; Despotovic, Danijela, E-mail: ddespotovic@kg.ac.rs; Cvetanovic, Slobodan, E-mail: slobodan.cvetanovic@eknfak.ni.ac.rs

    2014-11-15

    Traffic is the main source of noise in urban environments and significantly affects human mental and physical health and labor productivity. It is therefore very important to model the noise produced by various vehicles. Techniques for traffic noise prediction are mainly based on regression analysis, which generally is not good enough to describe the trends of noise. In this paper the application of artificial neural networks (ANNs) for the prediction of traffic noise is presented. As input variables of the neural network, the structure of the traffic flow and the average speed of the traffic flow are chosen. The output variable of the network is the equivalent noise level L_eq in the given time period. Based on these parameters, the network is modeled, trained and tested through a comparative analysis of the calculated values and measured levels of traffic noise, using an originally developed user-friendly software package. It is shown that artificial neural networks can be a useful tool for the prediction of noise with sufficient accuracy. In addition, the measured values were also used to calculate the equivalent noise level by means of classical methods, and a comparative analysis is given. The results clearly show that the ANN approach is superior to any other statistical method for traffic noise level prediction. - Highlights: • We propose an ANN model for the prediction of traffic noise. • We developed an originally designed user-friendly software package. • The results are compared with classical statistical methods. • The ANN model shows much better predictive capabilities.
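
    A small sketch of the kind of comparison described above, assuming synthetic traffic data generated from an invented, mildly nonlinear noise law; scikit-learn's MLPRegressor stands in for the paper's ANN.

    ```python
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPRegressor
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)

    # Synthetic stand-ins: light-vehicle flow (veh/h), heavy-vehicle flow
    # (veh/h) and average speed (km/h) as inputs; equivalent noise level
    # Leq (dBA) as output.
    X = rng.uniform([100, 10, 20], [2000, 300, 90], size=(500, 3))
    leq = (35 + 10 * np.log10(X[:, 0] + 8 * X[:, 1])
           - 0.05 * X[:, 2] + rng.normal(0, 1.0, 500))

    X_tr, X_te, y_tr, y_te = train_test_split(X, leq, random_state=0)

    # Classical statistical baseline: linear regression.
    lin = LinearRegression().fit(X_tr, y_tr)

    # ANN alternative: a small multilayer perceptron on standardized inputs.
    ann = make_pipeline(StandardScaler(),
                        MLPRegressor(hidden_layer_sizes=(20, 20),
                                     max_iter=5000, random_state=0))
    ann.fit(X_tr, y_tr)

    print(f"linear regression R^2: {lin.score(X_te, y_te):.3f}")
    print(f"ANN R^2:               {ann.score(X_te, y_te):.3f}")
    ```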

  1. Feynman graphs and the large dimensional limit of multipartite entanglement

    NASA Astrophysics Data System (ADS)

    Di Martino, Sara; Facchi, Paolo; Florio, Giuseppe

    2018-01-01

    In this paper, we extend the analysis of multipartite entanglement, based on techniques from classical statistical mechanics, to a system composed of n d-level parties (qudits). We introduce a suitable partition function at a fictitious temperature with the average local purity of the system as Hamiltonian. In particular, we analyze the high-temperature expansion of this partition function, prove the convergence of the series, and study its asymptotic behavior as d → ∞. We make use of a diagrammatic technique, classify the graphs, and study their degeneracy. We are thus able to evaluate their contributions and estimate the moments of the distribution of the local purity.

  2. Semi-Poisson statistics in quantum chaos.

    PubMed

    García-García, Antonio M; Wang, Jiao

    2006-03-01

    We investigate the quantum properties of a nonrandom Hamiltonian with a steplike singularity. It is shown that the eigenfunctions are multifractals and, in a certain range of parameters, the level statistics is described exactly by semi-Poisson statistics (SP) typical of pseudointegrable systems. It is also shown that our results are universal, namely, they depend exclusively on the presence of the steplike singularity and are not modified by smooth perturbations of the potential or the addition of a magnetic flux. Although the quantum properties of our system are similar to those of a disordered conductor at the Anderson transition, we report important quantitative differences in both the level statistics and the multifractal dimensions controlling the transition. Finally, the study of quantum transport properties suggests that the classical singularity induces quantum anomalous diffusion. We discuss how these findings may be experimentally corroborated by using ultracold atoms techniques.

  3. Quantum speedup of Monte Carlo methods.

    PubMed

    Montanaro, Ashley

    2015-09-08

    Monte Carlo methods use random sampling to estimate numerical quantities which are hard to compute deterministically. One important example is the use in statistical physics of rapidly mixing Markov chains to approximately compute partition functions. In this work, we describe a quantum algorithm which can accelerate Monte Carlo methods in a very general setting. The algorithm estimates the expected output value of an arbitrary randomized or quantum subroutine with bounded variance, achieving a near-quadratic speedup over the best possible classical algorithm. Combining the algorithm with the use of quantum walks gives a quantum speedup of the fastest known classical algorithms with rigorous performance bounds for computing partition functions, which use multiple-stage Markov chain Monte Carlo techniques. The quantum algorithm can also be used to estimate the total variation distance between probability distributions efficiently.
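
    For orientation, the classical baseline being sped up here is plain Monte Carlo mean estimation, whose error shrinks like 1/sqrt(N); the toy sketch below (invented integrand) illustrates that scaling, which the quantum algorithm improves to roughly 1/N.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def mc_estimate(n):
        """Plain Monte Carlo estimate of E[exp(-X^2)] for X ~ N(0, 1)."""
        x = rng.standard_normal(n)
        return np.exp(-x ** 2).mean()

    exact = 1 / np.sqrt(3)  # closed form of E[exp(-X^2)] for a standard normal

    # Classical error shrinks roughly like 1/sqrt(N).
    for n in (10**3, 10**4, 10**5):
        errs = [abs(mc_estimate(n) - exact) for _ in range(20)]
        print(f"N = {n:>6}: mean |error| = {np.mean(errs):.2e}")
    ```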

  4. Quantum speedup of Monte Carlo methods

    PubMed Central

    Montanaro, Ashley

    2015-01-01

    Monte Carlo methods use random sampling to estimate numerical quantities which are hard to compute deterministically. One important example is the use in statistical physics of rapidly mixing Markov chains to approximately compute partition functions. In this work, we describe a quantum algorithm which can accelerate Monte Carlo methods in a very general setting. The algorithm estimates the expected output value of an arbitrary randomized or quantum subroutine with bounded variance, achieving a near-quadratic speedup over the best possible classical algorithm. Combining the algorithm with the use of quantum walks gives a quantum speedup of the fastest known classical algorithms with rigorous performance bounds for computing partition functions, which use multiple-stage Markov chain Monte Carlo techniques. The quantum algorithm can also be used to estimate the total variation distance between probability distributions efficiently. PMID:26528079

  5. Quantum mechanics as classical statistical mechanics with an ontic extension and an epistemic restriction.

    PubMed

    Budiyono, Agung; Rohrlich, Daniel

    2017-11-03

    Where does quantum mechanics part ways with classical mechanics? How does quantum randomness differ fundamentally from classical randomness? We cannot fully explain how the theories differ until we can derive them within a single axiomatic framework, allowing an unambiguous account of how one theory is the limit of the other. Here we derive non-relativistic quantum mechanics and classical statistical mechanics within a common framework. The common axioms include conservation of average energy and conservation of probability current. But two axioms distinguish quantum mechanics from classical statistical mechanics: an "ontic extension" defines a nonseparable (global) random variable that generates physical correlations, and an "epistemic restriction" constrains allowed phase space distributions. The ontic extension and epistemic restriction, with strength on the order of Planck's constant, imply quantum entanglement and uncertainty relations. This framework suggests that the wave function is epistemic, yet it does not provide an ontic dynamics for individual systems.

  6. Statistical mechanics in the context of special relativity. II.

    PubMed

    Kaniadakis, G

    2005-09-01

    The special relativity laws emerge as one-parameter (light speed) generalizations of the corresponding laws of classical physics. These generalizations, imposed by the Lorentz transformations, affect both the definition of the various physical observables (e.g., momentum, energy, etc.), as well as the mathematical apparatus of the theory. Here, following the general lines of [Phys. Rev. E 66, 056125 (2002)], we show that the Lorentz transformations impose also a proper one-parameter generalization of the classical Boltzmann-Gibbs-Shannon entropy. The obtained relativistic entropy permits us to construct a coherent and self-consistent relativistic statistical theory, preserving the main features of the ordinary statistical theory, which is recovered in the classical limit. The predicted distribution function is a one-parameter continuous deformation of the classical Maxwell-Boltzmann distribution and has a simple analytic form, showing power law tails in accordance with the experimental evidence. Furthermore, this statistical mechanics can be obtained as the stationary case of a generalized kinetic theory governed by an evolution equation obeying the H theorem and reproducing the Boltzmann equation of the ordinary kinetics in the classical limit.
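
    For reference, the one-parameter deformation at the heart of this framework is usually expressed through the κ-exponential; the following is the form commonly quoted in the κ-statistics literature (reproduced here as an assumption, so consult the paper for its exact conventions):

    $$ \exp_\kappa(x) = \left( \sqrt{1 + \kappa^2 x^2} + \kappa x \right)^{1/\kappa}, \qquad \lim_{\kappa \to 0} \exp_\kappa(x) = e^x , $$

    so that a stationary distribution of the form $f(E) \propto \exp_\kappa\!\left(-(E-\mu)/k_B T\right)$ reduces to the Maxwell-Boltzmann distribution as $\kappa \to 0$ while exhibiting the power-law tails $\exp_\kappa(-x) \sim (2\kappa x)^{-1/\kappa}$ at large $x$.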

  7. Free Fermions and the Classical Compact Groups

    NASA Astrophysics Data System (ADS)

    Cunden, Fabio Deelan; Mezzadri, Francesco; O'Connell, Neil

    2018-06-01

    There is a close connection between the ground state of non-interacting fermions in a box with classical (absorbing, reflecting, and periodic) boundary conditions and the eigenvalue statistics of the classical compact groups. The associated determinantal point processes can be extended in two natural directions: (i) we consider the full family of admissible quantum boundary conditions (i.e., self-adjoint extensions) for the Laplacian on a bounded interval, and the corresponding projection correlation kernels; (ii) we construct the grand canonical extensions at finite temperature of the projection kernels, interpolating from Poisson to random matrix eigenvalue statistics. The scaling limits in the bulk and at the edges are studied in a unified framework, and the question of universality is addressed. Whether the finite temperature determinantal processes correspond to the eigenvalue statistics of some matrix models is, a priori, not obvious. We complete the picture by constructing a finite temperature extension of the Haar measure on the classical compact groups. The eigenvalue statistics of the resulting grand canonical matrix models (of random size) corresponds exactly to the grand canonical measure of free fermions with classical boundary conditions.

  8. Strategies for Fermentation Medium Optimization: An In-Depth Review

    PubMed Central

    Singh, Vineeta; Haque, Shafiul; Niwas, Ram; Srivastava, Akansha; Pasupuleti, Mukesh; Tripathi, C. K. M.

    2017-01-01

    Optimization of the production medium is required to maximize metabolite yield. This can be achieved using a wide range of techniques, from classical "one-factor-at-a-time" to modern statistical and mathematical techniques, viz. artificial neural networks (ANN), genetic algorithms (GA), etc. Every technique comes with its own advantages and disadvantages, and despite their drawbacks some techniques are applied to obtain the best results. The use of various optimization techniques in combination also provides the desired results. In this article an attempt has been made to review the media optimization techniques currently applied during fermentation for metabolite production. A comparative analysis of the merits and demerits of various conventional as well as modern optimization techniques has been done, and a logical basis for selecting the design of the fermentation medium is given in the present review. Overall, this review provides the rationale for the selection of a suitable optimization technique for media design employed during the fermentation process of metabolite production. PMID:28111566
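
    To give a flavor of the modern techniques listed above, here is a toy mutation-only evolutionary (GA-style) search over three medium component concentrations; the yield function is invented and treated as a black box by the optimizer.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def yield_fn(c):
        """Invented metabolite yield as a function of three medium component
        concentrations; its optimum sits at (4, 2, 7)."""
        return -((c[..., 0] - 4) ** 2 + (c[..., 1] - 2) ** 2 + (c[..., 2] - 7) ** 2)

    pop = rng.uniform(0, 10, size=(30, 3))            # 30 candidate media
    for _ in range(50):
        fitness = yield_fn(pop)
        parents = pop[np.argsort(fitness)[-10:]]      # selection: keep best 10
        children = (parents[rng.integers(0, 10, 20)]  # clone random parents
                    + rng.normal(0, 0.3, (20, 3)))    # Gaussian mutation
        pop = np.vstack([parents, np.clip(children, 0, 10)])

    best = pop[np.argmax(yield_fn(pop))]
    print("best medium composition:", best.round(2))  # approaches (4, 2, 7)
    ```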

  9. Statistical atmospheric inversion of local gas emissions by coupling the tracer release technique and local-scale transport modelling: a test case with controlled methane emissions

    NASA Astrophysics Data System (ADS)

    Ars, Sébastien; Broquet, Grégoire; Yver Kwok, Camille; Roustan, Yelva; Wu, Lin; Arzoumanian, Emmanuel; Bousquet, Philippe

    2017-12-01

    This study presents a new concept for estimating the pollutant emission rates of a site and its main facilities using a series of atmospheric measurements across the pollutant plumes. This concept combines the tracer release method, local-scale atmospheric transport modelling and a statistical atmospheric inversion approach. The conversion between the controlled emission and the measured atmospheric concentrations of the released tracer across the plume places valuable constraints on the atmospheric transport. This is used to optimise the configuration of the transport model parameters and the model uncertainty statistics in the inversion system. The emission rates of all sources are then inverted to optimise the match between the concentrations simulated with the transport model and the pollutants' measured atmospheric concentrations, accounting for the transport model uncertainty. In principle, by using atmospheric transport modelling, this concept does not strongly rely on the good colocation between the tracer and pollutant sources and can be used to monitor multiple sources within a single site, unlike the classical tracer release technique. The statistical inversion framework and the use of the tracer data for the configuration of the transport and inversion modelling systems should ensure that the transport modelling errors are correctly handled in the source estimation. The potential of this new concept is evaluated with a relatively simple practical implementation based on a Gaussian plume model and a series of inversions of controlled methane point sources using acetylene as a tracer gas. The experimental conditions are chosen so that they are suitable for the use of a Gaussian plume model to simulate the atmospheric transport. In these experiments, different configurations of methane and acetylene point source locations are tested to assess the efficiency of the method in comparison to the classic tracer release technique in coping with the distances between the different methane and acetylene sources. The results from these controlled experiments demonstrate that, when the targeted and tracer gases are not well collocated, this new approach provides a better estimate of the emission rates than the tracer release technique. As an example, the relative error between the estimated and actual emission rates is reduced from 32 % with the tracer release technique to 16 % with the combined approach in the case of a tracer located 60 m upwind of a single methane source. Further studies and more complex implementations with more advanced transport models and more advanced optimisations of their configuration will be required to generalise the applicability of the approach and strengthen its robustness.
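
    A minimal sketch of the Gaussian plume forward model on which such an inversion rests, with crude illustrative power laws for the dispersion widths sigma_y and sigma_z; all parameter values are invented.

    ```python
    import numpy as np

    def gaussian_plume(q, u, x, y, z, h, a=0.08, b=0.06):
        """Ground-reflected Gaussian plume concentration (kg/m^3).

        q: emission rate (kg/s), u: wind speed (m/s), (x, y, z): downwind,
        crosswind and vertical receptor coordinates (m), h: effective source
        height (m). sigma_y, sigma_z grow with downwind distance via crude
        illustrative power laws (real schemes depend on atmospheric stability).
        """
        sig_y = a * x ** 0.9
        sig_z = b * x ** 0.85
        lateral = np.exp(-y ** 2 / (2 * sig_y ** 2))
        vertical = (np.exp(-(z - h) ** 2 / (2 * sig_z ** 2))
                    + np.exp(-(z + h) ** 2 / (2 * sig_z ** 2)))  # ground image term
        return q / (2 * np.pi * u * sig_y * sig_z) * lateral * vertical

    # A 1 g/s methane source observed 100 m downwind at 2 m height.
    print(gaussian_plume(q=1e-3, u=3.0, x=100.0, y=0.0, z=2.0, h=1.5))
    ```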

  10. Efficient computational model for classification of protein localization images using Extended Threshold Adjacency Statistics and Support Vector Machines.

    PubMed

    Tahir, Muhammad; Jan, Bismillah; Hayat, Maqsood; Shah, Shakir Ullah; Amin, Muhammad

    2018-04-01

    Discriminative and informative feature extraction is the core requirement for accurate and efficient classification of protein subcellular localization images so that drug development could be more effective. The objective of this paper is to propose a novel modification in the Threshold Adjacency Statistics technique and enhance its discriminative power. In this work, we utilized Threshold Adjacency Statistics from a novel perspective to enhance its discrimination power and efficiency. In this connection, we utilized seven threshold ranges to produce seven distinct feature spaces, which are then used to train seven SVMs. The final prediction is obtained through the majority voting scheme. The proposed ETAS-SubLoc system is tested on two benchmark datasets using 5-fold cross-validation technique. We observed that our proposed novel utilization of TAS technique has improved the discriminative power of the classifier. The ETAS-SubLoc system has achieved 99.2% accuracy, 99.3% sensitivity and 99.1% specificity for Endogenous dataset outperforming the classical Threshold Adjacency Statistics technique. Similarly, 91.8% accuracy, 96.3% sensitivity and 91.6% specificity values are achieved for Transfected dataset. Simulation results validated the effectiveness of ETAS-SubLoc that provides superior prediction performance compared to the existing technique. The proposed methodology aims at providing support to pharmaceutical industry as well as research community towards better drug designing and innovation in the fields of bioinformatics and computational biology. The implementation code for replicating the experiments presented in this paper is available at: https://drive.google.com/file/d/0B7IyGPObWbSqRTRMcXI2bG5CZWs/view?usp=sharing. Copyright © 2018 Elsevier B.V. All rights reserved.
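
    A schematic sketch of the ensemble idea (seven feature spaces, one SVM each, majority vote), using random feature subsets of synthetic data as stand-ins for the seven TAS threshold ranges.

    ```python
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.svm import SVC

    # Synthetic stand-in features; in the paper each of the seven feature
    # spaces comes from Threshold Adjacency Statistics computed over a
    # different threshold range on the protein localization images.
    X, y = make_classification(n_samples=300, n_features=28, random_state=0)

    rng = np.random.default_rng(0)
    n_views = 7
    # Random feature subsets act as placeholders for the seven TAS spaces.
    views = [rng.choice(X.shape[1], size=10, replace=False) for _ in range(n_views)]

    # One SVM per feature space, trained on the first 200 samples.
    svms = [SVC(kernel="rbf").fit(X[:200][:, v], y[:200]) for v in views]

    # Final prediction on held-out samples by majority vote over the 7 SVMs.
    votes = np.stack([clf.predict(X[200:][:, v]) for clf, v in zip(svms, views)])
    majority = (votes.sum(axis=0) > n_views / 2).astype(int)
    print("held-out accuracy:", (majority == y[200:]).mean())
    ```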

  11. Active control on high-order coherence and statistic characterization on random phase fluctuation of two classical point sources.

    PubMed

    Hong, Peilong; Li, Liming; Liu, Jianji; Zhang, Guoquan

    2016-03-29

    Young's double-slit or two-beam interference is of fundamental importance for understanding various interference effects, in which the stationary phase difference between the two beams plays the key role in the first-order coherence. Different from the case of first-order coherence, in high-order optical coherence the statistical behavior of the optical phase plays the key role. In this article, by employing a fundamental interfering configuration with two classical point sources, we show that the high-order optical coherence between two classical point sources can be actively designed by controlling the statistical behavior of the relative phase difference between the two point sources. Synchronous-position Nth-order subwavelength interference with an effective wavelength of λ/M was demonstrated, in which λ is the wavelength of the point sources and M is an integer not larger than N. Interestingly, we found that the synchronous-position Nth-order interference fringe fingerprints the statistical trace of the random phase fluctuation of the two classical point sources; therefore, it provides an effective way to characterize the statistical properties of phase fluctuation for incoherent light sources.

  12. Connected Text Reading and Differences in Text Reading Fluency in Adult Readers

    PubMed Central

    Wallot, Sebastian; Hollis, Geoff; van Rooij, Marieke

    2013-01-01

    The process of connected text reading has received very little attention in contemporary cognitive psychology. This lack of attention is partly due to a research tradition that emphasizes the role of basic lexical constituents, which can be studied in isolated words or sentences. However, it is also partly due to the lack of statistical analysis techniques that accommodate interdependent time series. In this study, we investigate text reading performance with traditional and nonlinear analysis techniques and show how outcomes from multiple analyses can be used to create a more detailed picture of the process of text reading. Specifically, we investigate the reading performance of groups of literate adult readers that differ in reading fluency during a self-paced text reading task. Our results indicate that classical metrics of reading (such as word frequency) do not capture text reading very well, and that classical measures of reading fluency (such as average reading time) distinguish relatively poorly between participant groups. Nonlinear analyses of distribution tails and reading time fluctuations provide more fine-grained information about the reading process and reading fluency. PMID:23977177

  13. Lenard-Balescu calculations and classical molecular dynamics simulations of electrical and thermal conductivities of hydrogen plasmas

    DOE PAGES

    Whitley, Heather D.; Scullard, Christian R.; Benedict, Lorin X.; ...

    2014-12-04

    Here, we present a discussion of kinetic theory treatments of linear electrical and thermal transport in hydrogen plasmas, for a regime of interest to inertial confinement fusion applications. In order to assess the accuracy of one of the more involved of these approaches, classical Lenard-Balescu theory, we perform classical molecular dynamics simulations of hydrogen plasmas using 2-body quantum statistical potentials and compute both electrical and thermal conductivity from our particle trajectories using the Kubo approach. Our classical Lenard-Balescu results employing the identical statistical potentials agree well with the simulations.

  14. Efficacy of the Greater Occipital Nerve Block for Cervicogenic Headache: Comparing Classical and Subcompartmental Techniques.

    PubMed

    Lauretti, Gabriela R; Corrêa, Selma W R O; Mattos, Anita L

    2015-09-01

    The aim of the study was to compare the efficacy of the greater occipital nerve (GON) block using the classical technique and different volumes of injectate with the subcompartmental technique for the treatment of cervicogenic headache (CH). Thirty patients each acted as their own control. All patients underwent the GON block by the classical technique with 10 mg dexamethasone plus 40 mg lidocaine (5 mL volume). Patients were randomly allocated to 1 of 3 groups (n = 10) when pain VAS was > 3 cm. Each group underwent a GON subcompartmental technique (10 mg dexamethasone + 40 mg lidocaine + nonionic iodine contrast + saline) under fluoroscopy using either 5, 10, or 15 mL final volume. Analgesia and quality of life were evaluated. The classical GON technique resulted in 2 weeks of analgesia and less rescue analgesic consumption, compared to 24 weeks after the subcompartmental technique (P < 0.01). Quality of life improved at 2 and 24 weeks after the classical and the subcompartmental techniques, respectively (P < 0.05). The data revealed that the groups were similar regarding analgesia regardless of the injected volume (P > 0.05). While the classical technique for the GON block resulted in only 2 weeks of analgesia, the subcompartmental technique resulted in at least 24 weeks of analgesia, with a 5 mL volume being sufficient for performing the block under fluoroscopy. © 2014 World Institute of Pain.

  15. ON THE DYNAMICAL DERIVATION OF EQUILIBRIUM STATISTICAL MECHANICS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prigogine, I.; Balescu, R.; Henin, F.

    1960-12-01

    Work on nonequilibrium statistical mechanics, which allows an extension of the kinetic proof to all results of equilibrium statistical mechanics involving a finite number of degrees of freedom, is summarized. As an introduction to the general N-body problem, the scattering theory in classical mechanics is considered. The general N-body problem is considered for the case of classical mechanics, quantum mechanics with Boltzmann statistics, and quantum mechanics including quantum statistics. Six basic diagrams, which describe the elementary processes of the dynamics of correlations, were obtained. (M.C.G.)

  16. Western classical music development: a statistical analysis of composers similarity, differentiation and evolution.

    PubMed

    Georges, Patrick

    2017-01-01

    This paper proposes a statistical analysis that captures similarities and differences between classical music composers with the eventual aim to understand why particular composers 'sound' different even if their 'lineages' (influences network) are similar or why they 'sound' alike if their 'lineages' are different. In order to do this we use statistical methods and measures of association or similarity (based on presence/absence of traits such as specific 'ecological' characteristics and personal musical influences) that have been developed in biosystematics, scientometrics, and bibliographic coupling. This paper also represents a first step towards a more ambitious goal of developing an evolutionary model of Western classical music.
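
    A minimal sketch of a presence/absence similarity of this kind, using the Jaccard index on invented composer trait sets; the paper's exact association measures may differ.

    ```python
    def jaccard(a, b):
        """Similarity between two presence/absence trait sets."""
        a, b = set(a), set(b)
        return len(a & b) / len(a | b)

    # Invented trait/influence lists for three hypothetical composers.
    traits = {
        "Composer A": {"counterpoint", "opera", "influence:Bach"},
        "Composer B": {"counterpoint", "symphony", "influence:Bach"},
        "Composer C": {"serialism", "symphony", "influence:Wagner"},
    }

    names = list(traits)
    for i, x in enumerate(names):
        for y in names[i + 1:]:
            print(f"{x} ~ {y}: {jaccard(traits[x], traits[y]):.2f}")
    ```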

  17. A novel semiconductor-based, fully incoherent amplified spontaneous emission light source for ghost imaging

    PubMed Central

    Hartmann, Sébastien; Elsäßer, Wolfgang

    2017-01-01

    Initially, ghost imaging (GI) was demonstrated with entangled light from parametric down conversion. Later, classical light sources were introduced with the development of thermal light GI concepts. State-of-the-art classical GI light sources rely either on complex combinations of coherent light with spatially randomizing optical elements or on incoherent lamps with monochromating optics, however suffering strong losses of efficiency and directionality. Here, a broad-area superluminescent diode is proposed as a new light source for classical ghost imaging. The coherence behavior of this spectrally broadband emitting opto-electronic light source is investigated in detail. An interferometric two-photon detection technique is exploited in order to resolve the ultra-short correlation timescales. We thereby quantify the coherence time, the photon statistics as well as the number of spatial modes unveiling a complete incoherent light behavior. With a one-dimensional proof-of-principle GI experiment, we introduce these compact emitters to the field which could be beneficial for high-speed GI systems as well as for long range GI sensing in future applications. PMID:28150737

  18. InGaAs tunnel diodes for the calibration of semi-classical and quantum mechanical band-to-band tunneling models

    NASA Astrophysics Data System (ADS)

    Smets, Quentin; Verreck, Devin; Verhulst, Anne S.; Rooyackers, Rita; Merckling, Clément; Van De Put, Maarten; Simoen, Eddy; Vandervorst, Wilfried; Collaert, Nadine; Thean, Voon Y.; Sorée, Bart; Groeseneken, Guido; Heyns, Marc M.

    2014-05-01

    Promising predictions are made for III-V tunnel field-effect transistors (TFETs), but there is still uncertainty about the parameters used in the band-to-band tunneling models. Therefore, two simulators are calibrated in this paper: the first uses a semi-classical tunneling model based on Kane's formalism, and the second is a quantum mechanical simulator implemented with an envelope function formalism. The calibration is done for In0.53Ga0.47As using several p+/intrinsic/n+ diodes with different intrinsic region thicknesses. The dopant profile is determined by SIMS and capacitance-voltage measurements. Error bars are used based on statistical and systematic uncertainties in the measurement techniques. The obtained parameters are in close agreement with theoretically predicted values and validate the semi-classical and quantum mechanical models. Finally, the models are applied to predict the input characteristics of In0.53Ga0.47As n- and p-line TFETs, with the n-line TFET showing competitive performance compared to the MOSFET.

  19. Noninformative prior in the quantum statistical model of pure states

    NASA Astrophysics Data System (ADS)

    Tanaka, Fuyuhiko

    2012-06-01

    In the present paper, we consider a suitable definition of a noninformative prior on the quantum statistical model of pure states. While the full pure-states model is invariant under unitary rotation and admits the Haar measure, restricted models, which we often see in quantum channel estimation and quantum process tomography, have less symmetry and no compelling rationale for any choice. We adopt a game-theoretic approach that is applicable to classical Bayesian statistics and yields a noninformative prior for a general class of probability distributions. We define the quantum detection game and show that there exist noninformative priors for a general class of a pure-states model. Theoretically, it gives one of the ways that we represent ignorance on the given quantum system with partial information. Practically, our method proposes a default distribution on the model in order to use the Bayesian technique in the quantum-state tomography with a small sample.

  20. Statistical methods for thermonuclear reaction rates and nucleosynthesis simulations

    NASA Astrophysics Data System (ADS)

    Iliadis, Christian; Longland, Richard; Coc, Alain; Timmes, F. X.; Champagne, Art E.

    2015-03-01

    Rigorous statistical methods for estimating thermonuclear reaction rates and nucleosynthesis are becoming increasingly established in nuclear astrophysics. The main challenge being faced is that experimental reaction rates are highly complex quantities derived from a multitude of different measured nuclear parameters (e.g., astrophysical S-factors, resonance energies and strengths, particle and γ-ray partial widths). We discuss the application of the Monte Carlo method to two distinct, but related, questions. First, given a set of measured nuclear parameters, how can one best estimate the resulting thermonuclear reaction rates and associated uncertainties? Second, given a set of appropriate reaction rates, how can one best estimate the abundances from nucleosynthesis (i.e., reaction network) calculations? The techniques described here provide probability density functions that can be used to derive statistically meaningful reaction rates and final abundances for any desired coverage probability. Examples are given for applications to s-process neutron sources, core-collapse supernovae, classical novae, and Big Bang nucleosynthesis.
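
    A toy sketch of the Monte Carlo propagation described here: sample the measured nuclear inputs from assumed (lognormal/normal) uncertainty distributions and read off rate percentiles from the resulting distribution; all numbers are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n = 10_000

    # Invented example: a total rate built from two resonance contributions
    # with lognormal (factor-type) experimental uncertainties plus a
    # direct-capture term with a Gaussian uncertainty.
    res1 = rng.lognormal(mean=np.log(3.0e-9), sigma=0.25, size=n)
    res2 = rng.lognormal(mean=np.log(1.2e-8), sigma=0.40, size=n)
    direct = np.clip(rng.normal(5.0e-10, 1.0e-10, size=n), 0.0, None)

    rate = res1 + res2 + direct

    # The sampled probability density yields a rate for any desired coverage
    # probability, e.g. the 16th/50th/84th percentiles.
    lo, med, hi = np.percentile(rate, [16, 50, 84])
    print(f"rate = {med:.3e} (+{hi - med:.2e} / -{med - lo:.2e})")
    ```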

  1. Gaussian orthogonal ensemble statistics in graphene billiards with the shape of classically integrable billiards.

    PubMed

    Yu, Pei; Li, Zi-Yuan; Xu, Hong-Ya; Huang, Liang; Dietz, Barbara; Grebogi, Celso; Lai, Ying-Cheng

    2016-12-01

    A crucial result in quantum chaos, which has been established for a long time, is that the spectral properties of classically integrable systems generically are described by Poisson statistics, whereas those of time-reversal symmetric, classically chaotic systems coincide with those of random matrices from the Gaussian orthogonal ensemble (GOE). Does this result hold for two-dimensional Dirac material systems? To address this fundamental question, we investigate the spectral properties in a representative class of graphene billiards with shapes of classically integrable circular-sector billiards. Naively one may expect to observe Poisson statistics, which is indeed true for energies close to the band edges where the quasiparticle obeys the Schrödinger equation. However, for energies near the Dirac point, where the quasiparticles behave like massless Dirac fermions, Poisson statistics is extremely rare in the sense that it emerges only under quite strict symmetry constraints on the straight boundary parts of the sector. An arbitrarily small amount of imperfection of the boundary results in GOE statistics. This implies that, for circular-sector confinements with arbitrary angle, the spectral properties will generically be GOE. These results are corroborated by extensive numerical computation. Furthermore, we provide a physical understanding for our results.

  2. Gaussian orthogonal ensemble statistics in graphene billiards with the shape of classically integrable billiards

    NASA Astrophysics Data System (ADS)

    Yu, Pei; Li, Zi-Yuan; Xu, Hong-Ya; Huang, Liang; Dietz, Barbara; Grebogi, Celso; Lai, Ying-Cheng

    2016-12-01

    A crucial result in quantum chaos, which has been established for a long time, is that the spectral properties of classically integrable systems generically are described by Poisson statistics, whereas those of time-reversal symmetric, classically chaotic systems coincide with those of random matrices from the Gaussian orthogonal ensemble (GOE). Does this result hold for two-dimensional Dirac material systems? To address this fundamental question, we investigate the spectral properties in a representative class of graphene billiards with shapes of classically integrable circular-sector billiards. Naively one may expect to observe Poisson statistics, which is indeed true for energies close to the band edges where the quasiparticle obeys the Schrödinger equation. However, for energies near the Dirac point, where the quasiparticles behave like massless Dirac fermions, Poisson statistics is extremely rare in the sense that it emerges only under quite strict symmetry constraints on the straight boundary parts of the sector. An arbitrarily small amount of imperfection of the boundary results in GOE statistics. This implies that, for circular-sector confinements with arbitrary angle, the spectral properties will generically be GOE. These results are corroborated by extensive numerical computation. Furthermore, we provide a physical understanding for our results.

  3. Quantile regression for the statistical analysis of immunological data with many non-detects.

    PubMed

    Eilers, Paul H C; Röder, Esther; Savelkoul, Huub F J; van Wijk, Roy Gerth

    2012-07-07

    Immunological parameters are hard to measure. A well-known problem is the occurrence of values below the detection limit, the non-detects. Non-detects are a nuisance, because classical statistical analyses, like ANOVA and regression, cannot be applied. The more advanced statistical techniques currently available for the analysis of datasets with non-detects can only be used if a small percentage of the data are non-detects. Quantile regression, a generalization of percentiles to regression models, models the median or higher percentiles and tolerates very high numbers of non-detects. We present a non-technical introduction and illustrate it with an implementation to real data from a clinical trial. We show that by using quantile regression, groups can be compared and that meaningful linear trends can be computed, even if more than half of the data consists of non-detects. Quantile regression is a valuable addition to the statistical methods that can be used for the analysis of immunological datasets with non-detects.
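
    A minimal sketch of upper-quantile regression in the presence of many non-detects, using statsmodels' quantreg on synthetic left-censored data.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(1)

    # Synthetic left-censored data: values below the detection limit are
    # recorded at the limit itself, mimicking a dataset with many non-detects.
    n = 200
    dose = rng.uniform(0, 10, n)
    true_conc = np.exp(0.3 * dose + rng.normal(0, 1, n))
    detection_limit = 5.0
    measured = np.maximum(true_conc, detection_limit)

    df = pd.DataFrame({"log_conc": np.log(measured), "dose": dose})

    # A high percentile is identified wherever it lies above the detection
    # limit, so quantile regression tolerates a large share of non-detects.
    fit = smf.quantreg("log_conc ~ dose", df).fit(q=0.75)
    print(fit.params)
    ```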

  4. Quantum-classical boundary for precision optical phase estimation

    NASA Astrophysics Data System (ADS)

    Birchall, Patrick M.; O'Brien, Jeremy L.; Matthews, Jonathan C. F.; Cable, Hugo

    2017-12-01

    Understanding the fundamental limits on the precision to which an optical phase can be estimated is of key interest for many investigative techniques utilized across science and technology. We study the estimation of a fixed optical phase shift due to a sample which has an associated optical loss, and compare phase estimation strategies using classical and nonclassical probe states. These comparisons are based on the attainable (quantum) Fisher information calculated per number of photons absorbed or scattered by the sample throughout the sensing process. We find that for a given number of incident photons upon the unknown phase, nonclassical techniques in principle provide less than a 20 % reduction in root-mean-square error (RMSE) in comparison with ideal classical techniques in multipass optical setups. Using classical techniques in a different optical setup that we analyze, which incorporates additional stages of interference during the sensing process, the achievable reduction in RMSE afforded by nonclassical techniques falls to only ≃4 % . We explain how these conclusions change when nonclassical techniques are compared to classical probe states in nonideal multipass optical setups, with additional photon losses due to the measurement apparatus.

  5. Free-energy landscapes from adaptively biased methods: Application to quantum systems

    NASA Astrophysics Data System (ADS)

    Calvo, F.

    2010-10-01

    Several parallel adaptive biasing methods are applied to the calculation of free-energy pathways along reaction coordinates, choosing as a difficult example the double-funnel landscape of the 38-atom Lennard-Jones cluster. In the case of classical statistics, the Wang-Landau and adaptively biased molecular-dynamics (ABMD) methods are both found efficient if multiple walkers and replication and deletion schemes are used. An extension of the ABMD technique to quantum systems, implemented through the path-integral MD framework, is presented and tested on Ne38 against the quantum superposition method.

  6. [A new hypothesis for the treatment of amblyopia: the flicker stimulator].

    PubMed

    Parrozzani, A; Fedriga, P; Ferrari, E; De Vincentiis, L

    1984-01-01

    A variety of cells are involved in the pathogenesis of amblyopia: ON, OFF and ON-OFF cells, postsynaptic cells, neurons of the striate cortex, and the selective involvement of the macula. The need to stimulate these cells in treating amblyopia forms the theoretical basis of the flicker stimulator with red monochromatic light (LED, 655 nm). The authors present a clinical investigation of 35 subjects with anisometropic or strabismic amblyopia who had previously received extensive treatment with classic anti-amblyopic techniques without satisfactory improvement; the flicker treatment obtained statistically significant results (p < 0.001).

  7. [Small infundibulectomy versus ventriculotomy in tetralogy of Fallot].

    PubMed

    Bojórquez-Ramos, Julio César

    2013-01-01

    The surgical correction of tetralogy of Fallot (TOF) is standardized with respect to closing the septal defect, but differs in the way the right ventricular outflow tract (RVOT) is enlarged. The aim was to compare the early postoperative clinical course of RVOT obstruction enlargement by the classical ventriculotomy technique and by small infundibulectomy (SI). We analyzed the database of the pediatric heart surgery service from 2008 to 2011. Patients with non-complex TOF undergoing complete correction by classical ventriculotomy or SI were selected. ANOVA, χ² and Fisher exact tests were applied. The data included 47 patients, 55% (26) male, with a mean age of 43 months (6-172); classical ventriculotomy was performed in 61.7% (29). This group had higher peak lactate levels (9.07 versus 6.8 mmol/L, p = 0.049) and greater bleeding per kg in the first 12 hours (39.1 versus 20.3 mL/kg, p = 0.016). Death occurred in 9 cases (31.03%) versus one (5.6%) in the SI group (p = 0.037); complications exclusive to this group, such as acute renal failure, hemopneumothorax, pneumonia, permanent AV block and multiple organ failure, were observed. Morbidity and mortality were higher in the classical ventriculotomy group than in the SI group, possibly associated with the greater bleeding volume.

  8. Quantum correlations and dynamics from classical random fields valued in complex Hilbert spaces

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khrennikov, Andrei

    2010-08-15

    One of the crucial differences between mathematical models of classical and quantum mechanics (QM) is the use of the tensor product of the state spaces of subsystems as the state space of the corresponding composite system. (To describe an ensemble of classical composite systems, one uses random variables taking values in the Cartesian product of the state spaces of subsystems.) We show that, nevertheless, it is possible to establish a natural correspondence between the classical and the quantum probabilistic descriptions of composite systems. Quantum averages for composite systems (including entangled) can be represented as averages with respect to classical random fields. It is essentially what Albert Einstein dreamed of. QM is represented as classical statistical mechanics with infinite-dimensional phase space. While the mathematical construction is completely rigorous, its physical interpretation is a complicated problem. We present the basic physical interpretation of prequantum classical statistical field theory in Sec. II. However, this is only the first step toward real physical theory.

  9. Large-scale quantitative analysis of painting arts.

    PubMed

    Kim, Daniel; Son, Seung-Woo; Jeong, Hawoong

    2014-12-11

    Scientists have made efforts to understand the beauty of painting art in their own languages. As digital image acquisition of painting arts has made rapid progress, researchers have come to a point where it is possible to perform statistical analysis of a large-scale database of artistic paintings to build a bridge between art and science. Using digital image processing techniques, we investigate three quantitative measures of images - the usage of individual colors, the variety of colors, and the roughness of the brightness. We found a difference in color usage between classical paintings and photographs, and a significantly low color variety in the medieval period. Interestingly, moreover, the increase of the roughness exponent as painting techniques such as chiaroscuro and sfumato advanced is consistent with historical circumstances.

  10. [Surgical closure of patent ductus arteriosus in premature neonates: Does the surgical technique affect the outcome?].

    PubMed

    Avila-Alvarez, Alejandro; Serantes Lourido, Marta; Barriga Bujan, Rebeca; Blanco Rodriguez, Carolina; Portela-Torron, Francisco; Bautista-Hernandez, Victor

    2017-05-01

    Surgical closure of patent ductus arteriosus in premature neonates is an aggressive technique and is not free of complications. A study was designed with the aim of describing our experience with a less invasive technique, the extra-pleural approach via a posterior minithoracotomy, and comparing the results with the classic transpleural approach. A retrospective cohort study was conducted on premature neonates on whom surgical closure of the ductus was performed during a ten-year period (March 2005 to March 2015). A comparison was made of the acute complications, the outcomes on discharge, and follow-up, between the extra-pleural approach and the classic transpleural approach. The study included 48 patients, 30 in the classical approach group and 18 in the extra-pleural group. The demographic and pre-operative characteristics were similar in both groups. No differences were found between the 2 groups in the incidence of acute post-operative complications (56.6 vs. 44.4%), in the dependence on oxygen at 36 weeks (33.3 vs. 55.5%), or in hospital mortality (10 vs. 16.6%). As regards short-term progress, the extra-pleural group required fewer days until the withdrawal of supplementary oxygen (36.3 vs. 28.9) and until hospital discharge (67.5 vs. 53.2), although only the time until extubation reached a statistically significant difference (11.5 vs. 2.7, P=.03). The extra-pleural approach by posterior minithoracotomy for the surgical closure of ductus in the premature infant is viable and could bring some clinical benefits in the short term. Copyright © 2015 Asociación Española de Pediatría. Published by Elsevier España, S.L.U. All rights reserved.

  11. Experimentally modeling stochastic processes with less memory by the use of a quantum processor

    PubMed Central

    Palsson, Matthew S.; Gu, Mile; Ho, Joseph; Wiseman, Howard M.; Pryde, Geoff J.

    2017-01-01

    Computer simulation of observable phenomena is an indispensable tool for engineering new technology, understanding the natural world, and studying human society. However, the most interesting systems are often so complex that simulating their future behavior demands storing immense amounts of information regarding how they have behaved in the past. For increasingly complex systems, simulation becomes increasingly difficult and is ultimately constrained by resources such as computer memory. Recent theoretical work shows that quantum theory can reduce this memory requirement beyond ultimate classical limits, as measured by a process’ statistical complexity, C. We experimentally demonstrate this quantum advantage in simulating stochastic processes. Our quantum implementation observes a memory requirement of Cq = 0.05 ± 0.01, far below the ultimate classical limit of C = 1. Scaling up this technique would substantially reduce the memory required in simulations of more complex systems. PMID:28168218
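
    The classical benchmark quoted in this record, C = 1, is the statistical complexity of the simulated process: the Shannon entropy of the stationary distribution over its causal states. A minimal sketch for a two-state "perturbed coin" process follows; the process choice and flip probability are illustrative assumptions, and the quantum construction achieving Cq < C is not shown.

```python
import numpy as np

def stationary(T):
    """Stationary distribution of a row-stochastic transition matrix."""
    vals, vecs = np.linalg.eig(T.T)
    v = np.real(vecs[:, np.argmax(np.real(vals))])
    return v / v.sum()

def statistical_complexity(T):
    """C = Shannon entropy (bits) of the stationary causal-state distribution."""
    pi = stationary(T)
    pi = pi[pi > 0]
    return -np.sum(pi * np.log2(pi))

# Perturbed coin: a coin in a box flips with probability p at each step;
# the two causal states correspond to "last outcome heads/tails".
p = 0.4
T = np.array([[1 - p, p],
              [p, 1 - p]])
print(statistical_complexity(T))   # -> 1.0 bit, the classical limit C = 1
```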

  12. Pauli structures arising from confined particles interacting via a statistical potential

    NASA Astrophysics Data System (ADS)

    Batle, Josep; Ciftja, Orion; Farouk, Ahmed; Alkhambashi, Majid; Abdalla, Soliman

    2017-09-01

    There have been suggestions that the Pauli exclusion principle alone can lead a non-interacting (free) system of identical fermions to form crystalline structures dubbed Pauli crystals. Single-shot imaging experiments for the case of ultra-cold systems of free spin-polarized fermionic atoms in a two-dimensional harmonic trap appear to show geometric arrangements that cannot be characterized as Wigner crystals. This work explores this idea and considers a well-known approach that enables one to treat a quantum system of free fermions as a system of classical particles interacting with a statistical interaction potential. The model under consideration, though classical in nature, incorporates the quantum statistics by endowing the classical particles with an effective interaction potential. The reasonable expectation is that possible Pauli crystal features seen in experiments may manifest in this model, which captures the correct quantum statistics as a first-order correction. We use the Monte Carlo simulated annealing method to obtain the most stable configurations of finite two-dimensional systems of confined particles that interact with an appropriate statistical repulsion potential. We consider both an isotropic harmonic and a hard-wall confinement potential. Despite minor differences, the most stable configurations observed in our model correspond to the reported Pauli crystals in single-shot imaging experiments of free spin-polarized fermions in a harmonic trap. The crystalline configurations observed appear to be different from the classical Wigner crystal structures that would emerge had the confined classical particles interacted with a pair-wise Coulomb repulsion.
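
    A minimal sketch of the simulated-annealing step described here, under stated assumptions: a handful of particles in a 2-D isotropic harmonic trap interacting via the textbook Uhlenbeck-Gropper statistical repulsion v(r) = -kT ln(1 - e^(-2πr²/λ²)). The particle number, cooling schedule, and step size are arbitrary demo choices, not the authors' settings, and the exact potential the authors used is an assumption.

```python
import numpy as np

rng = np.random.default_rng(1)
N, lam = 6, 1.0    # particle number and thermal de Broglie wavelength (demo values)

def total_energy(pos):
    """Isotropic harmonic confinement plus the Uhlenbeck-Gropper statistical
    repulsion v(r) = -ln(1 - exp(-2*pi*r^2/lam^2)), in units with k_B T = 1."""
    conf = 0.5 * np.sum(pos ** 2)
    diff = pos[:, None, :] - pos[None, :, :]
    r2 = np.sum(diff ** 2, axis=-1)[np.triu_indices(N, k=1)]
    pair = -np.sum(np.log(np.maximum(1.0 - np.exp(-2.0 * np.pi * r2 / lam ** 2),
                                     1e-300)))   # guard against coincident points
    return conf + pair

pos = rng.normal(size=(N, 2))
E = total_energy(pos)
T = 1.0
while T > 1e-3:                    # geometric cooling schedule
    for _ in range(300):
        trial = pos.copy()
        trial[rng.integers(N)] += 0.1 * rng.normal(size=2)
        E_t = total_energy(trial)
        if E_t < E or rng.random() < np.exp((E - E_t) / T):   # Metropolis step
            pos, E = trial, E_t
    T *= 0.95

print(pos)   # most stable configuration found (compare with Pauli-crystal shells)
```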

  13. Gaining Insights on Nasopharyngeal Carcinoma Treatment Outcome Using Clinical Data Mining Techniques.

    PubMed

    Ghaibeh, A Ammar; Kasem, Asem; Ng, Xun Jin; Nair, Hema Latha Krishna; Hirose, Jun; Thiruchelvam, Vinesh

    2018-01-01

    The analysis of Electronic Health Records (EHRs) is attracting a lot of research attention in the medical informatics domain. Hospitals and medical institutes started to use data mining techniques to gain new insights from the massive amounts of data that can be made available through EHRs. Researchers in the medical field have often used descriptive statistics and classical statistical methods to prove assumed medical hypotheses. However, discovering new insights from large amounts of data solely based on experts' observations is difficult. Using data mining techniques and visualizations, practitioners can find hidden knowledge, identify interesting patterns, or formulate new hypotheses to be further investigated. This paper describes a work in progress on using data mining methods to analyze clinical data of Nasopharyngeal Carcinoma (NPC) cancer patients. NPC is the fifth most common cancer among Malaysians, and the data analyzed in this study was collected from three states in Malaysia (Kuala Lumpur, Sabah and Sarawak), and is considered to be the largest up-to-date dataset of its kind. This research is addressing the issue of cancer recurrence after the completion of radiotherapy and chemotherapy treatment. We describe the procedure, problems, and insights gained during the process.

  14. A comparative study of two different uncinectomy techniques: swing-door and classical.

    PubMed

    Singhania, Ankit A; Bansal, Chetan; Chauhan, Nirali; Soni, Saurav

    2012-01-01

    The aim of this study was to determine which technique of uncinectomy, classical or swing-door, gives better results. Four hundred eighty cases of sinusitis were selected and operated on with Functional Endoscopic Sinus Surgery (FESS). Of these 480 uncinectomies, 240 were performed with the classical technique and 240 with the swing-door technique. Patients were initially managed medically according to their symptoms and prior treatment. Patients who had received adequate previous medical management were evaluated with CT scans of the sinuses, and if disease persisted they were operated on with FESS. The authors' experience indicates that functional endoscopic sinus surgery can be performed under local or general anesthesia, as permitted or tolerated. Among the 240 classical uncinectomies, ethmoidal complex injury was noted in 4 cases, missed maxillary ostium syndrome (incomplete removal) was reported in 12 patients, and orbital fat exposure was encountered in 5 patients. Among the 240 uncinectomies done with the swing-door technique, incomplete removal was evident in 2 cases and lacrimal duct injury was reported in 3 cases. The swing-door technique combines the conservation goals of the anterior-to-posterior approach with the anatomic virtues of the posterior-to-anterior approach to ethmoidectomy. The incidence of orbital penetration, incomplete removal, ethmoidal complex injury and ostium non-identification was significantly lower with the new technique, although three lacrimal injuries occurred with the swing-door technique compared with none with the classical technique. The authors recommend the swing-door technique, as it is easy to learn, allows complete removal of the uncinate flush with the lateral nasal wall, and allows easy identification of the natural ostium of the maxillary sinus without injuring the ethmoidal complex.

  15. Computational studies of thermal and quantum phase transitions approached through non-equilibrium quenching

    NASA Astrophysics Data System (ADS)

    Liu, Cheng-Wei

    Phase transitions and their associated critical phenomena are of fundamental importance and play a crucial role in the development of statistical physics for both classical and quantum systems. Phase transitions embody diverse aspects of physics and also have numerous applications outside physics, e.g., in chemistry, biology, and combinatorial optimization problems in computer science. Many problems can be reduced to a system consisting of a large number of interacting agents, which under some circumstances (e.g., changes of external parameters) exhibit collective behavior; this type of scenario also underlies phase transitions. The theoretical understanding of equilibrium phase transitions was put on a solid footing with the establishment of the renormalization group. In contrast, non-equilibrium phase transitions are relatively less understood and are currently a very active research topic. One important milestone here is the Kibble-Zurek (KZ) mechanism, which provides a useful framework for describing a system whose transition point is approached through a non-equilibrium quench process. I developed two efficient Monte Carlo techniques for studying phase transitions, one for classical phase transitions and the other for quantum phase transitions, both within the framework of KZ scaling. For classical phase transitions, I develop a non-equilibrium quench (NEQ) simulation that can completely avoid the critical slowing-down problem. For quantum phase transitions, I develop a new algorithm, named the quasi-adiabatic quantum Monte Carlo (QAQMC) algorithm, for studying quantum quenches. I demonstrate the utility of QAQMC on the quantum Ising model and obtain high-precision results at the transition point, in particular showing generalized dynamic scaling in the quantum system. To further extend the methods, I study more complex systems such as spin glasses and random graphs. The techniques allow us to investigate these problems efficiently. From the classical perspective, using the NEQ approach I verify the universality class of 3D Ising spin glasses. I also investigate random 3-regular graphs in terms of both classical and quantum phase transitions. I demonstrate that under this simulation scheme one can extract information associated with the classical and quantum spin-glass transitions without any knowledge prior to the simulation.

  16. Quantum state engineering of light with continuous-wave optical parametric oscillators.

    PubMed

    Morin, Olivier; Liu, Jianli; Huang, Kun; Barbosa, Felippe; Fabre, Claude; Laurat, Julien

    2014-05-30

    Engineering non-classical states of the electromagnetic field is a central quest for quantum optics(1,2). Beyond their fundamental significance, such states are the resources for implementing various protocols, ranging from enhanced metrology to quantum communication and computing. A variety of devices can be used to generate non-classical states, such as single emitters, light-matter interfaces or non-linear systems(3). We focus here on the use of a continuous-wave optical parametric oscillator(3,4). This system is based on a non-linear χ(2) crystal inserted inside an optical cavity and is now well known as a very efficient source of non-classical light, such as single-mode or two-mode squeezed vacuum, depending on the crystal phase matching. Squeezed vacuum is a Gaussian state, as its quadrature distributions follow Gaussian statistics. However, it has been shown that a number of protocols require non-Gaussian states(5). Generating such states directly is a difficult task and would require strong χ(3) non-linearities. Another procedure, probabilistic but heralded, consists in using a measurement-induced non-linearity via a conditional preparation technique operated on Gaussian states. Here, we detail this generation protocol for two non-Gaussian states, the single-photon state and a superposition of coherent states, using two differently phase-matched parametric oscillators as primary resources. This technique enables a high fidelity with the targeted state and generation of the state in a well-controlled spatiotemporal mode.

  17. Noninvasive fetal QRS detection using an echo state network and dynamic programming.

    PubMed

    Lukoševičius, Mantas; Marozas, Vaidotas

    2014-08-01

    We address a classical fetal QRS detection problem from abdominal ECG recordings with a data-driven statistical machine learning approach. Our goal is to have a powerful, yet conceptually clean, solution. There are two novel key components at the heart of our approach: an echo state recurrent neural network that is trained to indicate fetal QRS complexes, and several increasingly sophisticated versions of statistics-based dynamic programming algorithms, which are derived from and rooted in probability theory. We also employ a standard technique for preprocessing and removing maternal ECG complexes from the signals, but do not take this as the main focus of this work. The proposed approach is quite generic and can be extended to other types of signals and annotations. Open-source code is provided.
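
    To make the first key component concrete, here is a minimal echo state network sketch in the spirit of the record: a fixed random reservoir driven by a 1-D signal, with a ridge-regression readout trained to indicate spike-like events. The toy signal, reservoir size, spectral radius, and ridge penalty are all illustrative assumptions; the authors' network, training data, and dynamic-programming post-processing are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(2)

# --- Build a small echo state network (the reservoir is random and fixed) ---
n_in, n_res = 1, 200
W_in = rng.uniform(-0.5, 0.5, size=(n_res, n_in))
W = rng.normal(size=(n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # spectral radius below 1

def run_reservoir(u):
    """Drive the reservoir with input sequence u and collect its states."""
    x = np.zeros(n_res)
    states = np.empty((len(u), n_res))
    for t, ut in enumerate(u):
        x = np.tanh(W_in @ np.atleast_1d(ut) + W @ x)
        states[t] = x
    return states

# --- Toy stand-in for an abdominal ECG: noisy sinusoid with sparse spikes ---
t = np.arange(2000)
u = 0.3 * np.sin(2 * np.pi * t / 40) + 0.05 * rng.normal(size=t.size)
labels = np.zeros(t.size)
spike_locs = np.arange(25, 2000, 70)              # hypothetical "QRS" positions
u[spike_locs] += 1.0
labels[spike_locs] = 1.0

# --- Ridge-regression readout trained to indicate the spike-like events ---
X = run_reservoir(u)
ridge = 1e-2
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ labels)
indicator = X @ W_out                             # peaks near the spikes
print(indicator[spike_locs].mean(), indicator.mean())
```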

  18. Metal-ceramic bond strength between a feldspathic porcelain and a Co-Cr alloy fabricated with Direct Metal Laser Sintering technique.

    PubMed

    Dimitriadis, Konstantinos; Spyropoulos, Konstantinos; Papadopoulos, Triantafillos

    2018-02-01

    The aim of the present study was to record the metal-ceramic bond strength of a feldspathic dental porcelain and a Co-Cr alloy, using the Direct Metal Laser Sintering (DMLS) technique for the fabrication of metal substrates. Ten metal substrates were fabricated with powder of a dental Co-Cr alloy using the DMLS technique (test group) in dimensions according to ISO 9693. Another ten substrates were fabricated with a casting dental Co-Cr alloy using the classic casting technique (control group) for comparison. Another three substrates were fabricated using each technique to record the Modulus of Elasticity (E) of the alloys used. All substrates were examined to record external and internal porosity. Feldspathic porcelain was applied on the substrates. Specimens were tested using the three-point bending test. The failure mode was determined using optical and scanning electron microscopy. The statistical analysis was performed using the t-test. Substrates prepared using the DMLS technique did not show internal porosity, in contrast to those produced using the casting technique. The E of the control and test groups was 222 ± 5.13 GPa and 227 ± 3 GPa, respectively. The bond strength was 51.87 ± 7.50 MPa for the test group and 54.60 ± 6.20 MPa for the control group. No statistically significant differences between the two groups were recorded. The mode of failure was mainly cohesive for all specimens. Specimens produced by the DMLS technique meet the lowest acceptable metal-ceramic bond strength of 25 MPa specified in ISO 9693 and present satisfactory bond strength for clinical use.
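
    The reported comparison can be re-checked directly from the summary statistics with a two-sample t-test; a minimal sketch follows, assuming equal variances (the default), since the study's exact test settings are not stated.

```python
from scipy.stats import ttest_ind_from_stats

# Reported bond strengths (MPa), n = 10 substrates per group.
res = ttest_ind_from_stats(mean1=51.87, std1=7.50, nobs1=10,   # DMLS test group
                           mean2=54.60, std2=6.20, nobs2=10)   # cast control group
print(res.statistic, res.pvalue)   # p well above 0.05 -> no significant difference
```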

  19. Nonclassical light revealed by the joint statistics of simultaneous measurements.

    PubMed

    Luis, Alfredo

    2016-04-15

    Nonclassicality cannot be a single-observable property, since the statistics of any quantum observable is compatible with classical physics. We develop a general procedure to reveal nonclassical behavior of light states from the joint statistics arising in the practical measurement of multiple observables. Besides embracing previous approaches, this protocol can disclose nonclassical features for standard examples of classical-like behavior, such as SU(2) and Glauber coherent states. When combined with other criteria, this would imply that every light state is nonclassical.

  20. Eigensystem analysis of classical relaxation techniques with applications to multigrid analysis

    NASA Technical Reports Server (NTRS)

    Lomax, Harvard; Maksymiuk, Catherine

    1987-01-01

    Classical relaxation techniques are related to numerical methods for solution of ordinary differential equations. Eigensystems for Point-Jacobi, Gauss-Seidel, and SOR methods are presented. Solution techniques such as eigenvector annihilation, eigensystem mixing, and multigrid methods are examined with regard to the eigenstructure.

  1. Estimation of plant sampling uncertainty: an example based on chemical analysis of moss samples.

    PubMed

    Dołęgowska, Sabina

    2016-11-01

    In order to estimate the level of uncertainty arising from sampling, 54 samples (primary and duplicate) of the moss species Pleurozium schreberi (Brid.) Mitt. were collected within three forested areas (Wierna Rzeka, Piaski, Posłowice Range) in the Holy Cross Mountains (south-central Poland). During the fieldwork, each primary sample composed of 8 to 10 increments (subsamples) was taken over an area of 10 m², whereas duplicate samples were collected in the same way at a distance of 1-2 m. Subsequently, all samples were triple-rinsed with deionized water, dried, milled, and digested (8 mL HNO₃ (1:1) + 1 mL 30% H₂O₂) in a closed microwave system Multiwave 3000. The prepared solutions were analyzed twice for Cu, Fe, Mn, and Zn using FAAS and GFAAS techniques. All datasets were checked for normality; for the normally distributed elements (Cu from Piaski; Zn from Posłowice; Fe and Zn from Wierna Rzeka), the sampling uncertainty was computed with (i) classical ANOVA, (ii) classical RANOVA, (iii) modified RANOVA, and (iv) range statistics. For the remaining elements, the sampling uncertainty was calculated with traditional and/or modified RANOVA (if the amount of outliers did not exceed 10%) or classical ANOVA after Box-Cox transformation (if the amount of outliers exceeded 10%). The highest concentrations of all elements were found in moss samples from Piaski, whereas the sampling uncertainty calculated with the different statistical methods ranged from 4.1 to 22%.
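
    As an illustration of the duplicate-based estimation named here, a minimal sketch using range statistics and the classical one-way ANOVA view; the concentration values below are hypothetical stand-ins, not the study's data.

```python
import numpy as np

# Hypothetical duplicate-pair concentrations (mg/kg) at n sites; the real
# study used 27 primary/duplicate pairs across three areas.
primary   = np.array([6.1, 5.8, 7.0, 6.5, 5.9, 6.8])
duplicate = np.array([5.7, 6.2, 6.6, 6.9, 5.5, 6.4])

# Range statistics: for duplicate pairs, the pooled standard deviation of the
# sampling(+analytical) step is mean(|d|)/d2, with d2 = 1.128 for pairs (n = 2).
d = np.abs(primary - duplicate)
s_meas = d.mean() / 1.128
u_rel = 100 * s_meas / np.concatenate([primary, duplicate]).mean()
print(f"relative sampling uncertainty ~ {u_rel:.1f} %")

# Classical one-way ANOVA view: sites as groups, duplicates as replicates;
# the within-group mean square estimates the same variance component.
pairs = np.stack([primary, duplicate], axis=1)
ms_within = np.mean(np.var(pairs, axis=1, ddof=1))
print(f"ANOVA within-pair s = {np.sqrt(ms_within):.3f}")
```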

  2. Large-Scale Quantitative Analysis of Painting Arts

    PubMed Central

    Kim, Daniel; Son, Seung-Woo; Jeong, Hawoong

    2014-01-01

    Scientists have made efforts to understand the beauty of painting art in their own languages. As digital image acquisition of painting arts has made rapid progress, researchers have come to a point where it is possible to perform statistical analysis of a large-scale database of artistic paintings to build a bridge between art and science. Using digital image processing techniques, we investigate three quantitative measures of images – the usage of individual colors, the variety of colors, and the roughness of the brightness. We found a difference in color usage between classical paintings and photographs, and a significantly low color variety in the medieval period. Interestingly, moreover, the increase of the roughness exponent as painting techniques such as chiaroscuro and sfumato advanced is consistent with historical circumstances. PMID:25501877

  3. Aspects of Geodesical Motion with Fisher-Rao Metric: Classical and Quantum

    NASA Astrophysics Data System (ADS)

    Ciaglia, Florio M.; Cosmo, Fabio Di; Felice, Domenico; Mancini, Stefano; Marmo, Giuseppe; Pérez-Pardo, Juan M.

    The purpose of this paper is to exploit the geometric structure of quantum mechanics and of statistical manifolds to study the qualitative effect that the quantum properties have in the statistical description of a system. We show that the end points of geodesics in the classical setting coincide with the probability distributions that minimise Shannon’s entropy, i.e. with distributions of zero dispersion. In the quantum setting this happens only for particular initial conditions, which in turn correspond to classical submanifolds. This result can be interpreted as a geometric manifestation of the uncertainty principle.

  4. Bounding the Set of Classical Correlations of a Many-Body System

    NASA Astrophysics Data System (ADS)

    Fadel, Matteo; Tura, Jordi

    2017-12-01

    We present a method to certify the presence of Bell correlations in experimentally observed statistics, and to obtain new Bell inequalities. Our approach is based on relaxing the conditions defining the set of correlations obeying a local hidden variable model, yielding a convergent hierarchy of semidefinite programs (SDPs). Because the size of these SDPs is independent of the number of parties involved, this technique allows us to characterize correlations in many-body systems. As an example, we illustrate our method with the experimental data presented in Science 352, 441 (2016), 10.1126/science.aad8665.
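
    For intuition about the classical set being bounded, the simplest two-party case can be handled exactly by brute force rather than by the record's SDP hierarchy: enumerating all deterministic local strategies recovers the CHSH bound. This is only an illustration; enumeration does not scale to the many-body setting the method addresses.

```python
import itertools

# Each party assigns an outcome in {-1, +1} to each of its two settings;
# local hidden variable correlations are mixtures of these 16 strategies.
best = 0
for a0, a1, b0, b1 in itertools.product([-1, 1], repeat=4):
    S = a0 * b0 + a0 * b1 + a1 * b0 - a1 * b1   # CHSH combination
    best = max(best, abs(S))
print(best)            # -> 2, the local-hidden-variable (classical) bound
print(2 * 2 ** 0.5)    # Tsirelson's quantum bound ~ 2.828 exceeds it
```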

  5. Tuning the Photon Statistics of a Strongly Coupled Nanophotonic System

    NASA Astrophysics Data System (ADS)

    Dory, C.; Fischer, K. A.; Müller, K.; Lagoudakis, K. G.; Sarmiento, T.; Rundquist, A.; Zhang, J. L.; Kelaita, Y.; Sapra, N. V.; Vučković, J.

    Strongly coupled quantum-dot-photonic-crystal cavity systems provide a nonlinear ladder of hybridized light-matter states, which are a promising platform for non-classical light generation. The transmission of light through such systems enables light generation with tunable photon counting statistics. By detuning the frequencies of quantum emitter and cavity, we can tune the transmission of light to strongly enhance either single- or two-photon emission processes. However, these nanophotonic systems show a strongly dissipative nature, and classical light obscures any quantum character of the emission. In this work, we utilize a self-homodyne interference technique combined with frequency filtering to overcome this obstacle. This allows us to generate emission with a strong two-photon component in the multi-photon regime, where we measure a second-order coherence value of g(2)[0] = 1.490 ± 0.034. We propose rate equation models that capture the dominant processes of emission in both the single- and multi-photon regimes and support them by quantum-optical simulations that fully capture the frequency filtering of emission from our solid-state system. Finally, we simulate a third-order coherence value of g(3)[0] = 0.872 ± 0.021. Funding: Army Research Office (ARO) (W911NF1310309), National Science Foundation (1503759), Stanford Graduate Fellowship.

  6. Avoid lost discoveries, because of violations of standard assumptions, by using modern robust statistical methods.

    PubMed

    Wilcox, Rand; Carlson, Mike; Azen, Stan; Clark, Florence

    2013-03-01

    Recently, there have been major advances in statistical techniques for assessing central tendency and measures of association. The practical utility of modern methods has been documented extensively in the statistics literature, but they remain underused and relatively unknown in clinical trials. Our objective was to address this issue. STUDY DESIGN AND PURPOSE: The first purpose was to review common problems associated with standard methodologies (low power, lack of control over type I errors, and incorrect assessments of the strength of the association). The second purpose was to summarize some modern methods that can be used to circumvent such problems. The third purpose was to illustrate the practical utility of modern robust methods using data from the Well Elderly 2 randomized controlled trial. In multiple instances, robust methods uncovered differences among groups and associations among variables that were not detected by classic techniques. In particular, the results demonstrated that details of the nature and strength of the association were sometimes overlooked when using ordinary least squares regression and Pearson correlation. Modern robust methods can make a practical difference in detecting and describing differences between groups and associations between variables. Such procedures should be applied more frequently when analyzing trial-based data. Copyright © 2013 Elsevier Inc. All rights reserved.
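
    A minimal sketch of the kind of modern robust comparison advocated here: 20% trimmed means with a percentile-bootstrap confidence interval, on hypothetical heavy-tailed trial data. The data, trimming fraction, and bootstrap size are illustrative assumptions, not the Well Elderly 2 analysis.

```python
import numpy as np
from scipy.stats import trim_mean, ttest_ind

rng = np.random.default_rng(3)
# Hypothetical trial arms: a modest shift masked by heavy-tailed noise.
control   = rng.standard_t(df=2, size=60)
treatment = rng.standard_t(df=2, size=60) + 0.6

print(ttest_ind(control, treatment).pvalue)    # Student's t: variance inflated by outliers
print(trim_mean(control, 0.2), trim_mean(treatment, 0.2))   # 20% trimmed means

# Percentile-bootstrap CI for the difference of 20% trimmed means.
boots = [trim_mean(rng.choice(treatment, 60), 0.2)
         - trim_mean(rng.choice(control, 60), 0.2) for _ in range(2000)]
print(np.percentile(boots, [2.5, 97.5]))       # CI excluding 0 -> detected shift
```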

  7. Taking a statistical approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wild, M.; Rouhani, S.

    1995-02-01

    A typical site investigation entails extensive sampling and monitoring. In the past, sampling plans have been designed on purely ad hoc bases, leading to significant expenditures and, in some cases, collection of redundant information. In many instances, sampling costs exceed the true worth of the collected data. The US Environmental Protection Agency (EPA) therefore has advocated the use of geostatistics to provide a logical framework for sampling and analysis of environmental data. Geostatistical methodology uses statistical techniques for the spatial analysis of a variety of earth-related data. The use of geostatistics was developed by the mining industry to estimate ore concentrations. The same procedure is effective in quantifying environmental contaminants in soils for risk assessments. Unlike classical statistical techniques, geostatistics offers procedures to incorporate the underlying spatial structure of the investigated field. Sample points spaced close together tend to be more similar than samples spaced further apart. This can guide sampling strategies and determine complex contaminant distributions. Geostatistical techniques can be used to evaluate site conditions on the basis of regular, irregular, random and even spatially biased samples. In most environmental investigations, it is desirable to concentrate sampling in areas of known or suspected contamination. The rigorous mathematical procedures of geostatistics allow for accurate estimates at unsampled locations, potentially reducing sampling requirements. The use of geostatistics serves as a decision-aiding and planning tool and can significantly reduce short-term site assessment costs and long-term sampling and monitoring needs, as well as lead to more accurate and realistic remedial design criteria.
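
    The spatial structure that distinguishes geostatistics from classical statistics is usually summarized first by an empirical semivariogram, the precursor to kriging. A minimal sketch on hypothetical sample locations and concentrations follows; the synthetic field and bin edges are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)
# Hypothetical contaminant measurements z at random 2-D sample locations.
xy = rng.uniform(0, 100, size=(150, 2))
z = np.sin(xy[:, 0] / 15) + 0.1 * rng.normal(size=150)

# Empirical semivariogram: gamma(h) = mean of (z_i - z_j)^2 / 2 over all
# pairs whose separation distance falls in the bin around lag h.
d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
sq = 0.5 * (z[:, None] - z[None, :]) ** 2
iu = np.triu_indices(len(z), k=1)
dist, semivar = d[iu], sq[iu]

bins = np.linspace(0, 50, 11)
for lo, hi in zip(bins[:-1], bins[1:]):
    m = (dist >= lo) & (dist < hi)
    if m.any():   # gamma rising with lag, then leveling off, signals structure
        print(f"h = {lo:4.0f}-{hi:3.0f}: gamma = {semivar[m].mean():.3f}")
```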

  8. From Wald to Savage: homo economicus becomes a Bayesian statistician.

    PubMed

    Giocoli, Nicola

    2013-01-01

    Bayesian rationality is the paradigm of rational behavior in neoclassical economics. An economic agent is deemed rational when she maximizes her subjective expected utility and consistently revises her beliefs according to Bayes's rule. The paper raises the question of how, when and why this characterization of rationality came to be endorsed by mainstream economists. Though no definitive answer is provided, it is argued that the question is of great historiographic importance. The story begins with Abraham Wald's behaviorist approach to statistics and culminates with Leonard J. Savage's elaboration of subjective expected utility theory in his 1954 classic The Foundations of Statistics. The latter's acknowledged failure to achieve a reinterpretation of traditional inference techniques along subjectivist and behaviorist lines raises the puzzle of how a failed project in statistics could turn into such a big success in economics. Possible answers call into play the emphasis on consistency requirements in neoclassical theory and the impact of the postwar transformation of U.S. business schools. © 2012 Wiley Periodicals, Inc.

  9. Perturbative thermodynamic geometry of nonextensive ideal classical, Bose, and Fermi gases.

    PubMed

    Mohammadzadeh, Hosein; Adli, Fereshteh; Nouri, Sahereh

    2016-12-01

    We investigate the perturbative thermodynamic geometry of nonextensive ideal classical, Bose, and Fermi gases. We show that the intrinsic statistical interaction of a nonextensive Bose (Fermi) gas is attractive (repulsive), as in the extensive case, but the value of the thermodynamic curvature is changed by the nonextensive parameter. In contrast to the extensive ideal classical gas, the nonextensive one may be divided into two different regimes. Depending on the deviation of the system from the extensive case, one can find a special value of the fugacity, z*, where the sign of the thermodynamic curvature changes. Therefore, we argue that the nonextensive parameter induces an attractive (repulsive) statistical interaction for z < z* (z > z*) in an ideal classical gas. Also, from the singular point of the thermodynamic curvature, we consider the condensation of the nonextensive Bose gas.

  10. Metal-ceramic bond strength between a feldspathic porcelain and a Co-Cr alloy fabricated with Direct Metal Laser Sintering technique

    PubMed Central

    Spyropoulos, Konstantinos

    2018-01-01

    PURPOSE The aim of the present study was to record the metal-ceramic bond strength of a feldspathic dental porcelain and a Co-Cr alloy, using the Direct Metal Laser Sintering (DMLS) technique for the fabrication of metal substrates. MATERIALS AND METHODS Ten metal substrates were fabricated with powder of a dental Co-Cr alloy using DMLS technique (test group) in dimensions according to ISO 9693. Another ten substrates were fabricated with a casting dental Co-Cr alloy using classic casting technique (control group) for comparison. Another three substrates were fabricated using each technique to record the Modulus of Elasticity (E) of the alloys used. All substrates were examined to record external and internal porosity. Feldspathic porcelain was applied on the substrates. Specimens were tested using the three-point bending test. The failure mode was determined using optical and scanning electron microscopy. The statistical analysis was performed using the t-test. RESULTS Substrates prepared using the DMLS technique did not show internal porosity, in contrast to those produced using the casting technique. The E of the control and test groups was 222 ± 5.13 GPa and 227 ± 3 GPa, respectively. The bond strength was 51.87 ± 7.50 MPa for the test group and 54.60 ± 6.20 MPa for the control group. No statistically significant differences between the two groups were recorded. The mode of failure was mainly cohesive for all specimens. CONCLUSION Specimens produced by the DMLS technique meet the lowest acceptable metal-ceramic bond strength of 25 MPa specified in ISO 9693 and present satisfactory bond strength for clinical use. PMID:29503711

  11. Strong correlations between the exponent α and the particle number for a Renyi monoatomic gas in Gibbs' statistical mechanics.

    PubMed

    Plastino, A; Rocca, M C

    2017-06-01

    Appealing to the 1902 Gibbs formalism for classical statistical mechanics (SM), the first axiomatic SM theory ever to successfully explain equilibrium thermodynamics, we show that already at the classical level there is a strong correlation between Renyi's exponent α and the number of particles for very simple systems. No reference to heat baths is needed for such a purpose.

  12. Management of Type II Odontoid Fracture for Osteoporotic Bone Structure: Preliminary Report.

    PubMed

    Cosar, Murat; Ozer, A Fahir; Alkan, Bahadır; Guven, Mustafa; Akman, Tarık; Aras, Adem Bozkurt; Ceylan, Davut; Tokmak, Mehmet

    2015-01-01

    The anterior transodontoid screw fixation technique is generally chosen for the management of type II odontoid fractures. Nonunion of type II odontoid fractures is still a major problem, especially in elderly and osteoporotic patients. Eleven osteoporotic patients with type II odontoid fractures are presented in this article. We divided the 11 patients into two groups, classical technique and Ozer's technique, and compared (radiologically and clinically) the classical anterior transodontoid screw fixation (group II: 6 cases) and Ozer's transodontoid screw fixation technique (group I: 5 cases) retrospectively. There was no difference regarding the clinical features of the groups. However, the radiological results showed 100% fusion for Ozer's screw fixation technique and 83% fusion for the classical screw fixation technique. In conclusion, we suggest that Ozer's technique may help to increase the fusion capacity for osteoporotic type II odontoid fractures.

  13. Meta-analysis of diagnostic test data: a bivariate Bayesian modeling approach.

    PubMed

    Verde, Pablo E

    2010-12-30

    In the last decades, the amount of published results on clinical diagnostic tests has expanded very rapidly. The counterpart to this development has been the formal evaluation and synthesis of diagnostic results. However, published results present substantial heterogeneity, and they can be regarded as so far removed from the classical domain of meta-analysis that they provide a rather severe test of classical statistical methods. Recently, bivariate random-effects meta-analytic methods, which model the pairs of sensitivities and specificities, have been presented from the classical point of view. In this work a bivariate Bayesian modeling approach is presented. This approach substantially extends the scope of classical bivariate methods by allowing the structural distribution of the random effects to depend on multiple sources of variability. The meta-analysis is summarized by the predictive posterior distributions for sensitivity and specificity. This new approach also allows substantial model checking, model diagnostics and model selection to be performed. Statistical computations are implemented in public domain statistical software (WinBUGS and R) and illustrated with real data examples. Copyright © 2010 John Wiley & Sons, Ltd.

  14. Universal self-similar dynamics of relativistic and nonrelativistic field theories near nonthermal fixed points

    NASA Astrophysics Data System (ADS)

    Piñeiro Orioli, Asier; Boguslavski, Kirill; Berges, Jürgen

    2015-07-01

    We investigate universal behavior of isolated many-body systems far from equilibrium, which is relevant for a wide range of applications from ultracold quantum gases to high-energy particle physics. The universality is based on the existence of nonthermal fixed points, which represent nonequilibrium attractor solutions with self-similar scaling behavior. The corresponding dynamic universality classes turn out to be remarkably large, encompassing both relativistic as well as nonrelativistic quantum and classical systems. For the examples of nonrelativistic (Gross-Pitaevskii) and relativistic scalar field theory with quartic self-interactions, we demonstrate that infrared scaling exponents as well as scaling functions agree. We perform two independent nonperturbative calculations, first by using classical-statistical lattice simulation techniques and second by applying a vertex-resummed kinetic theory. The latter extends kinetic descriptions to the nonperturbative regime of overoccupied modes. Our results open new perspectives for learning, from experiments with cold atoms, about the dynamics of the early stages of our universe.

  15. Intermittent turbulence in flowing bacterial suspensions

    PubMed Central

    Secchi, Eleonora; Rusconi, Roberto; Buzzaccaro, Stefano; Salek, M. Mehdi; Smriga, Steven; Piazza, Roberto; Stocker, Roman

    2016-01-01

    Dense suspensions of motile bacteria, possibly including the human gut microbiome, exhibit collective dynamics akin to those observed in classic, high Reynolds number turbulence with important implications for chemical and biological transport, yet this analogy has remained primarily qualitative. Here, we present experiments in which a dense suspension of Bacillus subtilis bacteria was flowed through microchannels and the velocity statistics of the flowing suspension were quantified using a recently developed velocimetry technique coupled with vortex identification methods. Observations revealed a robust intermittency phenomenon, whereby the average velocity profile of the suspension fluctuated between a plug-like flow and a parabolic flow profile. This intermittency is a hallmark of the onset of classic turbulence and Lagrangian tracking revealed that it here originates from the presence of transient vortices in the active, collective motion of the bacteria locally reinforcing the externally imposed flow. These results link together two entirely different manifestations of turbulence and show the potential of the microfluidic approach to mimic the environment characteristic of certain niches of the human microbiome. PMID:27307513

  16. Intermittent turbulence in flowing bacterial suspensions.

    PubMed

    Secchi, Eleonora; Rusconi, Roberto; Buzzaccaro, Stefano; Salek, M Mehdi; Smriga, Steven; Piazza, Roberto; Stocker, Roman

    2016-06-01

    Dense suspensions of motile bacteria, possibly including the human gut microbiome, exhibit collective dynamics akin to those observed in classic, high Reynolds number turbulence with important implications for chemical and biological transport, yet this analogy has remained primarily qualitative. Here, we present experiments in which a dense suspension of Bacillus subtilis bacteria was flowed through microchannels and the velocity statistics of the flowing suspension were quantified using a recently developed velocimetry technique coupled with vortex identification methods. Observations revealed a robust intermittency phenomenon, whereby the average velocity profile of the suspension fluctuated between a plug-like flow and a parabolic flow profile. This intermittency is a hallmark of the onset of classic turbulence and Lagrangian tracking revealed that it here originates from the presence of transient vortices in the active, collective motion of the bacteria locally reinforcing the externally imposed flow. These results link together two entirely different manifestations of turbulence and show the potential of the microfluidic approach to mimic the environment characteristic of certain niches of the human microbiome. © 2016 The Author(s).

  17. [Comparison of two cesarean techniques: classic versus Misgav Ladach cesarean].

    PubMed

    Moreira, P; Moreau, J C; Faye, M E; Ka, S; Kane Guèye, S M; Faye, E O; Dieng, T; Diadhiou, F

    2002-10-01

    The aim of the study was to compare two cesarean section techniques. A prospective study was conducted on 400 cesareans performed at the Gynecological and Obstetric Clinic of the Dakar Teaching Hospital between March 2000 and August 2000. Two hundred patients underwent the classical procedure (CL group) and the other 200 the Misgav Ladach procedure (ML group). Per- and post-operative data were compared between the two groups with Student's test and the χ² test. A p-value less than 0.05 was considered statistically significant. The two groups were similar for socio-demographic and clinical data. The delay between the skin incision and infant delivery was significantly shorter in the ML group (5 minutes 26 seconds versus 6 minutes 20 seconds). The same trend was found for the length of operation (36 minutes 36 seconds versus 54 minutes 38 seconds). Fewer sutures were used in the ML group (2.92 versus 4.14). There was no significant difference in dose of analgesia, post-operative complications or time to hospital discharge. Cost analysis demonstrated that the Misgav Ladach procedure was 10,000 FCFA (15 euros) less costly. The Misgav Ladach method is a simple, rapid, cost-effective cesarean procedure which appears to be an attractive alternative to the traditional cesarean section.

  18. Assessing the Kansas water-level monitoring program: An example of the application of classical statistics to a geological problem

    USGS Publications Warehouse

    Davis, J.C.

    2000-01-01

    Geologists may feel that geological data are not amenable to statistical analysis, or at best require specialized approaches such as nonparametric statistics and geostatistics. However, there are many circumstances, particularly in systematic studies conducted for environmental or regulatory purposes, where traditional parametric statistical procedures can be beneficial. An example is the application of analysis of variance to data collected in an annual program of measuring groundwater levels in Kansas. Influences such as well conditions, operator effects, and use of the water can be assessed and wells that yield less reliable measurements can be identified. Such statistical studies have resulted in yearly improvements in the quality and reliability of the collected hydrologic data. Similar benefits may be achieved in other geological studies by the appropriate use of classical statistical tools.

  19. HYPOTHESIS SETTING AND ORDER STATISTIC FOR ROBUST GENOMIC META-ANALYSIS.

    PubMed

    Song, Chi; Tseng, George C

    2014-01-01

    Meta-analysis techniques have been widely developed and applied in genomic applications, especially for combining multiple transcriptomic studies. In this paper, we propose an order statistic of p-values (the rth ordered p-value, rOP) across combined studies as the test statistic. We illustrate different hypothesis settings that detect gene markers differentially expressed (DE) "in all studies", "in the majority of studies", or "in one or more studies", and specify rOP as a suitable method for detecting DE genes "in the majority of studies". We develop methods to estimate the parameter r in rOP for real applications. Statistical properties such as its asymptotic behavior and a one-sided testing correction for detecting markers of concordant expression changes are explored. Power calculations and simulation show better performance of rOP compared with the classical Fisher's method, Stouffer's method, minimum p-value method and maximum p-value method under the focused hypothesis setting. Theoretically, rOP is found to be connected to the naïve vote counting method and can be viewed as a generalized form of vote counting with better statistical properties. The method is applied to three microarray meta-analysis examples including major depressive disorder, brain cancer and diabetes. The results demonstrate rOP as a more generalizable, robust and sensitive statistical framework to detect disease-related markers.
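
    A minimal sketch of the rOP statistic on a hypothetical p-value matrix. It uses the fact that the rth order statistic of n independent uniform p-values follows Beta(r, n - r + 1) under the null, which gives an analytic combined p-value. The gene count, study count, and choice of r are illustrative assumptions (the record estimates r from the data).

```python
import numpy as np
from scipy.stats import beta, combine_pvalues

rng = np.random.default_rng(5)
n_genes, n_studies, r = 1000, 8, 5   # r targets "the majority of studies"

# Hypothetical p-value matrix: most genes null; gene 0 is DE in 6 of 8 studies.
P = rng.uniform(size=(n_genes, n_studies))
P[0, :6] = rng.uniform(0, 1e-3, size=6)

# rOP statistic: the r-th smallest p-value per gene, referred to its
# exact null distribution Beta(r, n_studies - r + 1).
rop = np.sort(P, axis=1)[:, r - 1]
p_rop = beta.cdf(rop, r, n_studies - r + 1)
print(p_rop[0], np.median(p_rop))    # gene 0 highly significant; nulls ~ uniform

# Fisher's method for contrast (driven by any single small p-value).
p_fisher = np.array([combine_pvalues(row, method="fisher")[1] for row in P])
print(p_fisher[0])
```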

  20. Microgravity experiments on vibrated granular gases in a dilute regime: non-classical statistics

    NASA Astrophysics Data System (ADS)

    Leconte, M.; Garrabos, Y.; Falcon, E.; Lecoutre-Chabot, C.; Palencia, F.; Évesque, P.; Beysens, D.

    2006-07-01

    We report on an experimental study of a dilute gas of steel spheres colliding inelastically and excited by a piston performing sinusoidal vibration, in low gravity. Using an improved experimental apparatus, we present here some results concerning the collision statistics of particles on a wall of the container. We also propose a simple model in which the non-classical statistics obtained from our data are attributed to the boundary condition playing the role of a 'velostat' instead of a thermostat. The significant differences from the kinetic theory of ordinary gases are related to the inelasticity of the collisions.

  1. Pedagogical Utilization and Assessment of the Statistic Online Computational Resource in Introductory Probability and Statistics Courses.

    PubMed

    Dinov, Ivo D; Sanchez, Juana; Christou, Nicolas

    2008-01-01

    Technology-based instruction represents a recent pedagogical paradigm that is rooted in the realization that new generations are much more comfortable with, and excited about, new technologies. The rapid technological advancement over the past decade has fueled an enormous demand for the integration of modern networking, informational and computational tools with classical pedagogical instruments. Consequently, teaching with technology typically involves utilizing a variety of IT and multimedia resources for online learning, course management, electronic course materials, and novel tools for communication, engagement, experimentation, critical thinking and assessment. The NSF-funded Statistics Online Computational Resource (SOCR) provides a number of interactive tools for enhancing instruction in various undergraduate and graduate courses in probability and statistics. These resources include online instructional materials, statistical calculators, interactive graphical user interfaces, computational and simulation applets, and tools for data analysis and visualization. The tools provided as part of SOCR include conceptual simulations and statistical computing interfaces, which are designed to bridge between the introductory and the more advanced computational and applied probability and statistics courses. In this manuscript, we describe our designs for utilizing SOCR technology in instruction in a recent study. In addition, we present the results of the effectiveness of using SOCR tools at two different course intensity levels on three outcome measures: exam scores, student satisfaction, and choice of technology to complete assignments. A learning styles assessment was completed at baseline. We used three very different designs for three different undergraduate classes. Each course included a treatment group, using the SOCR resources, and a control group, using classical instruction techniques. Our findings include marginal effects of the SOCR treatment for individual classes; however, pooling the results across all courses and sections, SOCR effects on the treatment groups were exceptionally robust and significant. Coupling these findings with a clear decrease in the variance of the quantitative examination measures in the treatment groups indicates that employing technology, like SOCR, in a sound pedagogical and scientific manner enhances the students' overall understanding and suggests better long-term knowledge retention.

  2. Pedagogical Utilization and Assessment of the Statistic Online Computational Resource in Introductory Probability and Statistics Courses

    PubMed Central

    Dinov, Ivo D.; Sanchez, Juana; Christou, Nicolas

    2009-01-01

    Technology-based instruction represents a recent pedagogical paradigm that is rooted in the realization that new generations are much more comfortable with, and excited about, new technologies. The rapid technological advancement over the past decade has fueled an enormous demand for the integration of modern networking, informational and computational tools with classical pedagogical instruments. Consequently, teaching with technology typically involves utilizing a variety of IT and multimedia resources for online learning, course management, electronic course materials, and novel tools for communication, engagement, experimentation, critical thinking and assessment. The NSF-funded Statistics Online Computational Resource (SOCR) provides a number of interactive tools for enhancing instruction in various undergraduate and graduate courses in probability and statistics. These resources include online instructional materials, statistical calculators, interactive graphical user interfaces, computational and simulation applets, and tools for data analysis and visualization. The tools provided as part of SOCR include conceptual simulations and statistical computing interfaces, which are designed to bridge between the introductory and the more advanced computational and applied probability and statistics courses. In this manuscript, we describe our designs for utilizing SOCR technology in instruction in a recent study. In addition, we present the results of the effectiveness of using SOCR tools at two different course intensity levels on three outcome measures: exam scores, student satisfaction, and choice of technology to complete assignments. A learning styles assessment was completed at baseline. We used three very different designs for three different undergraduate classes. Each course included a treatment group, using the SOCR resources, and a control group, using classical instruction techniques. Our findings include marginal effects of the SOCR treatment for individual classes; however, pooling the results across all courses and sections, SOCR effects on the treatment groups were exceptionally robust and significant. Coupling these findings with a clear decrease in the variance of the quantitative examination measures in the treatment groups indicates that employing technology, like SOCR, in a sound pedagogical and scientific manner enhances the students' overall understanding and suggests better long-term knowledge retention. PMID:19750185

  3. Zubarev's Nonequilibrium Statistical Operator Method in the Generalized Statistics of Multiparticle Systems

    NASA Astrophysics Data System (ADS)

    Glushak, P. A.; Markiv, B. B.; Tokarchuk, M. V.

    2018-01-01

    We present a generalization of Zubarev's nonequilibrium statistical operator method based on the principle of maximum Renyi entropy. In the framework of this approach, we obtain transport equations for the basic set of parameters of the reduced description of nonequilibrium processes in a classical system of interacting particles, using Liouville equations with fractional derivatives. For a classical system of particles in a medium with a fractal structure, we obtain a non-Markovian diffusion equation with fractional spatial derivatives. For a concrete model of the frequency dependence of a memory function, we obtain a generalized Cattaneo-type diffusion equation with the spatial and temporal fractality taken into account. We present a generalization of nonequilibrium thermofield dynamics in Zubarev's nonequilibrium statistical operator method in the framework of Renyi statistics.

  4. Cryopreservation of human sperm: efficacy and use of a new nitrogen-free controlled rate freezer versus liquid nitrogen vapour freezing.

    PubMed

    Creemers, E; Nijs, M; Vanheusden, E; Ombelet, W

    2011-12-01

    Preservation of spermatozoa is an important aspect of assisted reproductive medicine. The aim of this study was to investigate the efficacy and use of a recently developed liquid nitrogen- and cryogen-free controlled rate freezer, compared with the classical liquid nitrogen vapour freezing method, for the cryopreservation of human spermatozoa. Ten patients entering the IVF programme donated semen samples for the study. Samples were analysed according to the World Health Organization guidelines. No significant difference in total sperm motility after freeze-thawing was demonstrated between the new technique and the classical technique. The advantage of the new freezing technique is that it uses no liquid nitrogen during the freezing process, hence being safer to use and clean-room compatible. Investment costs are higher for the apparatus, but running costs are only 1% of those of classical liquid nitrogen freezing. In conclusion, post-thaw motility of samples frozen with the classical liquid nitrogen vapour technique was comparable with samples frozen with the new nitrogen-free freezing technique. This latter technique can thus be a very useful asset to the sperm cryopreservation laboratory. © 2011 Blackwell Verlag GmbH.

  5. Quantum Mechanics From the Cradle?

    ERIC Educational Resources Information Center

    Martin, John L.

    1974-01-01

    States that the major problem in learning quantum mechanics is often the student's ignorance of classical mechanics and that one conceptual hurdle in quantum mechanics is its statistical nature, in contrast to the determinism of classical mechanics. (MLH)

  6. [Comparison of transverse short-axis classic and oblique long-axis "Syringe-Free" approaches for internal jugular venous catheterization under ultrasound guidance].

    PubMed

    Ince, Ilker; Arı, Muhammet Ali; Sulak, Muhammet Mustafa; Aksoy, Mehmet

    There are different ultrasound probe positions used for internal jugular venous catheter placement, and an in-plane or out-of-plane needle approach may be used for catheterization. The transverse short-axis classic approach is the most commonly performed approach in the literature. "Syringe-Free" is a newly described technique that is performed with an oblique long-axis approach. We aimed to compare the performance of these two approaches. This study was conducted as a prospective, randomized study. Eighty patients were included in the study and divided into two groups, named Group C (transverse short-axis classic approach) and Group SF (oblique long-axis syringe-free approach), by computer-generated randomization. The primary outcome was the mean time until the guidewire was seen in the internal jugular vein (performing time). The secondary outcomes were to compare the number of needle passes, number of skin punctures and complications between the two groups. Demographic and hemodynamic data were not significantly different. The mean performing time was 54.9±19.1s in Group C and 43.9±15.8s in Group SF, a significant difference between the groups (p=0.006). The mean number of needle passes was 3.2(±2.1) in Group C and 2.1(±1.6) in Group SF, a statistically significant difference (p=0.002). The number of skin punctures was 1.6(±0.8) and 1.2(±0.5) in Groups C and SF, respectively (p=0.027). The "Syringe-Free" technique has a lower performing time and fewer needle passes and skin punctures. It also allows the progress of the guidewire to be followed under continuous ultrasound visualization, and the procedure does not need assistance during catheter insertion. In short, "Syringe-Free" is an effective, safe and fast technique that may be used to place an internal jugular venous catheter. Copyright © 2017 Sociedade Brasileira de Anestesiologia. Published by Elsevier Editora Ltda. All rights reserved.

  7. Investigation of Particle Sampling Bias in the Shear Flow Field Downstream of a Backward Facing Step

    NASA Technical Reports Server (NTRS)

    Meyers, James F.; Kjelgaard, Scott O.; Hepner, Timothy E.

    1990-01-01

    The flow field about a backward facing step was investigated to determine the characteristics of particle sampling bias in the various flow phenomena. The investigation used the calculation of the velocity:data rate correlation coefficient as a measure of statistical dependence and thus of the degree of velocity bias. While the investigation found negligible dependence within the free stream region, increased dependence was found within the boundary and shear layers. Full classical correction techniques over-corrected the data, since the dependence was weak even in the boundary layer and shear regions. The paper emphasizes the necessity of determining the degree of particle sampling bias for each measurement ensemble rather than using generalized assumptions to correct the data. Further, it recommends that the calculation of the velocity:data rate correlation coefficient become a standard statistical calculation in the analysis of all laser velocimeter data.
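
    For readers who want to apply the recommended check, a minimal sketch of the velocity:data rate correlation computation is given below (in Python, with hypothetical arrival-time and velocity arrays; the paper itself provides no code):

    ```python
    import numpy as np

    def velocity_data_rate_correlation(arrival_times, velocities):
        """Correlate velocity samples with the instantaneous data rate.

        The data rate is estimated as the inverse of the inter-arrival
        time between consecutive particle measurements; a correlation
        near zero suggests negligible velocity bias, while a large |r|
        flags an ensemble that needs bias correction.
        """
        dt = np.diff(arrival_times)        # inter-arrival times
        rate = 1.0 / dt                    # instantaneous data rate
        v = velocities[1:]                 # align velocities with rates
        return np.corrcoef(v, rate)[0, 1]  # Pearson correlation coefficient
    ```

    Applied per measurement ensemble, this yields the kind of diagnostic the paper recommends computing before deciding whether any bias correction is warranted.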

  8. Destructive testings: dry drilling operations with TruPro system to collect samples in a powder form, from two hulls containing immobilized wastes in a hydraulic binder

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pombet, Denis; Desnoyers, Yvon; Charters, Grant

    2013-07-01

    The TruPro® process enables the collection of a significant number of samples to characterize radiological materials. This innovative and alternative technique is tested here for the ANDRA quality-control inspection of cemented packages. It proves to be quicker and more prolific than the current methodology. Using classical statistics and geostatistics approaches, the physical and radiological characteristics of two hulls containing wastes (sludges or concentrates) immobilized in a hydraulic binder are assessed in this paper. The waste homogeneity is also evaluated against the ANDRA criterion. Sensitivity to sample size (support effect), the presence of extreme values, the acceptable deviation rate and the minimum number of data are discussed. The final objectives are to check the homogeneity of the two characterized radwaste packages and also to validate and reinforce this alternative characterization methodology. (authors)

  9. A Revelation: Quantum-Statistics and Classical-Statistics are Analytic-Geometry Conic-Sections and Numbers/Functions: Euler, Riemann, Bernoulli Generating-Functions: Conics to Numbers/Functions Deep Subtle Connections

    NASA Astrophysics Data System (ADS)

    Descartes, R.; Rota, G.-C.; Euler, L.; Bernoulli, J. D.; Siegel, Edward Carl-Ludwig

    2011-03-01

    Quantum-statistics dichotomy: Fermi-Dirac (FDQS) versus Bose-Einstein (BEQS), respectively with contact-repulsion/non-condensation (FDCR) versus attraction/condensation (BEC), are manifestly demonstrated by Taylor-expansion ONLY of their denominator exponential, identified BOTH as Descartes analytic-geometry conic-sections, FDQS as Ellipse (homotopy to rectangle FDQS distribution-function), VIA Maxwell-Boltzmann classical-statistics (MBCS) to Parabola MORPHISM, VS. BEQS to Hyperbola, Archimedes' HYPERBOLICITY INEVITABILITY, and as well generating-functions [Abramowitz-Stegun, Handbook of Mathematical Functions, p. 804], respectively of Euler-numbers/functions (via Riemann zeta-function; domination of quantum-statistics: [Pathria, Statistical Mechanics; Huang, Statistical Mechanics]) VS. Bernoulli-numbers/functions. Much can be learned about statistical physics from Euler-numbers/functions via Riemann zeta-function(s) VS. Bernoulli-numbers/functions [Conway-Guy, Book of Numbers], and about Euler-numbers/functions, via Riemann zeta-function(s) MORPHISM, VS. Bernoulli-numbers/functions, vice versa. Ex.: Riemann-hypothesis PHYSICS proof PARTLY as BEQS BEC/BEA.
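
    For reference (and independently of the abstract's conic-section reading, which is the authors' own), the three occupation-number functions being expanded are the standard ones:

    ```latex
    n_{\mathrm{FD}}(\varepsilon)=\frac{1}{e^{(\varepsilon-\mu)/k_BT}+1},\qquad
    n_{\mathrm{BE}}(\varepsilon)=\frac{1}{e^{(\varepsilon-\mu)/k_BT}-1},\qquad
    n_{\mathrm{MB}}(\varepsilon)=e^{-(\varepsilon-\mu)/k_BT},
    ```

    with Maxwell-Boltzmann statistics arising as the common limit of both quantum distributions when $e^{(\varepsilon-\mu)/k_BT}\gg 1$.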

  10. Comparison of massage based on the tensegrity principle and classic massage in treating chronic shoulder pain.

    PubMed

    Kassolik, Krzysztof; Andrzejewski, Waldemar; Brzozowski, Marcin; Wilk, Iwona; Górecka-Midura, Lucyna; Ostrowska, Bożena; Krzyżanowski, Dominik; Kurpas, Donata

    2013-09-01

    The purpose of this study was to compare the clinical outcomes of classic massage with those of massage based on the tensegrity principle for patients with chronic idiopathic shoulder pain. Thirty subjects with chronic shoulder pain symptoms were divided into 2 groups: 15 subjects received classic (Swedish) massage of the tissues surrounding the glenohumeral joint, and 15 subjects received massage using techniques based on the tensegrity principle. The tensegrity principle is based on directing treatment to the painful area and to the tissues (muscles, fascia, and ligaments) that structurally support the painful area, thus treating tissues that have direct and indirect influence on the motion segment. Both treatment groups received 10 sessions over 2 weeks; each session lasted 20 minutes. The McGill Pain Questionnaire and glenohumeral ranges of motion were measured immediately before the first massage session, on the day the therapy ended 2 weeks after it started, and 1 month after the last massage. Subjects receiving massage based on the tensegrity principle demonstrated statistically significant improvement in the passive and active ranges of flexion and abduction of the glenohumeral joint. Pain decreased in both massage groups. This study showed increases in passive and active ranges of motion for flexion and abduction in patients who had massage based on the tensegrity principle. For pain outcomes, both the classic and tensegrity massage groups demonstrated improvement. Copyright © 2013 National University of Health Sciences. Published by Mosby, Inc. All rights reserved.

  11. Statistical Thermodynamics and Microscale Thermophysics

    NASA Astrophysics Data System (ADS)

    Carey, Van P.

    1999-08-01

    Many exciting new developments in microscale engineering are based on the application of traditional principles of statistical thermodynamics. In this text Van Carey offers a modern view of thermodynamics, interweaving classical and statistical thermodynamic principles and applying them to current engineering systems. He begins with coverage of microscale energy storage mechanisms from a quantum mechanics perspective and then develops the fundamental elements of classical and statistical thermodynamics. Subsequent chapters discuss applications of equilibrium statistical thermodynamics to solid, liquid, and gas phase systems. The remainder of the book is devoted to nonequilibrium thermodynamics of transport phenomena and to nonequilibrium effects and noncontinuum behavior at the microscale. Although the text emphasizes mathematical development, Carey includes many examples and exercises to illustrate how the theoretical concepts are applied to systems of scientific and engineering interest. In the process he offers a fresh view of statistical thermodynamics for advanced undergraduate and graduate students, as well as practitioners, in mechanical, chemical, and materials engineering.

  12. [Inheritance and evolution of acupuncture manipulation techniques of Zhejiang acupuncture masters in modern times].

    PubMed

    Yu, Daxiong; Ma, Ruijie; Fang, Jianqiao

    2015-05-01

    There are many eminent acupuncture masters of modern times in Zhejiang province, who have developed acupuncture schools of numerous characteristics and exerted an important influence at home and abroad. Through collection of the literature on the acupuncture schools in Zhejiang and interviews with the parties involved, it has been discovered that the acupuncture manipulation techniques of these modern masters are specifically featured. The techniques are developed on the basis of Neijing (Internal Classic), Jinzhenfu (Ode to Gold Needle) and Zhenjiu Dacheng (Great Compendium of Acupuncture and Moxibustion). Whether following the old maxims or studying independently, every master lays emphasis on the research and interpretation of classical theories and integrates the traditional with the modern. In this paper, the acupuncture manipulation techniques of the modern Zhejiang acupuncture masters are described in four aspects: needling techniques in the Internal Classic, the feijingzouqi needling technique, the penetrating needling technique, and innovations in acupuncture manipulation.

  13. Information transport in classical statistical systems

    NASA Astrophysics Data System (ADS)

    Wetterich, C.

    2018-02-01

    For "static memory materials" the bulk properties depend on boundary conditions. Such materials can be realized by classical statistical systems which admit no unique equilibrium state. We describe the propagation of information from the boundary to the bulk by classical wave functions. The dependence of wave functions on the location of hypersurfaces in the bulk is governed by a linear evolution equation that can be viewed as a generalized Schrödinger equation. Classical wave functions obey the superposition principle, with local probabilities realized as bilinears of wave functions. For static memory materials the evolution within a subsector is unitary, as characteristic for the time evolution in quantum mechanics. The space-dependence in static memory materials can be used as an analogue representation of the time evolution in quantum mechanics - such materials are "quantum simulators". For example, an asymmetric Ising model on a Euclidean two-dimensional lattice represents the time evolution of free relativistic fermions in two-dimensional Minkowski space.

  14. Single-Trial Normalization for Event-Related Spectral Decomposition Reduces Sensitivity to Noisy Trials

    PubMed Central

    Grandchamp, Romain; Delorme, Arnaud

    2011-01-01

    In electroencephalography, the classical event-related potential model often proves to be a limited method for studying complex brain dynamics. For this reason, spectral techniques adapted from signal processing, such as event-related spectral perturbation (ERSP) and its variants event-related synchronization and event-related desynchronization, have been used over the past 20 years. They represent average spectral changes in response to a stimulus. There is, however, no strong consensus on how these spectral methods should compare pre- and post-stimulus activity. When computing ERSP, pre-stimulus baseline removal is usually performed after averaging the spectral estimates of multiple trials. Correcting the baseline of each single trial prior to averaging spectral estimates is an alternative baseline correction method. However, we show that this method leads to positively skewed post-stimulus ERSP values. We then present new single-trial-based ERSP baseline correction methods that perform trial normalization or centering prior to applying classical baseline correction methods. We show that single-trial correction methods minimize the contribution of artifactual data trials with high-amplitude spectral estimates and are robust to outliers when performing statistical inference testing. We then characterize these methods in terms of their time–frequency responses and behavior compared to classical ERSP methods. PMID:21994498
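
    As an illustration of the single-trial idea, here is a minimal Python sketch (assuming a trials-by-samples array and a spectrogram-based decomposition; the authors' exact pipeline and parameters are not reproduced):

    ```python
    import numpy as np
    from scipy.signal import spectrogram

    def single_trial_ersp(trials, fs, baseline_end_s):
        """ERSP with single-trial full-epoch normalization before averaging.

        trials: array (n_trials, n_samples); baseline_end_s: end of the
        pre-stimulus baseline in seconds from epoch start.
        """
        powers = []
        for x in trials:
            f, t, sxx = spectrogram(x, fs=fs, nperseg=128, noverlap=96)
            # single-trial normalization: divide each frequency bin by its
            # mean power over the full epoch, down-weighting noisy trials
            powers.append(sxx / sxx.mean(axis=1, keepdims=True))
        mean_power = np.mean(powers, axis=0)
        # classical baseline correction applied after averaging, in dB
        base = mean_power[:, t < baseline_end_s].mean(axis=1, keepdims=True)
        return f, t, 10.0 * np.log10(mean_power / base)
    ```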

  15. Electroencephalography signatures of attention-deficit/hyperactivity disorder: clinical utility.

    PubMed

    Alba, Guzmán; Pereda, Ernesto; Mañas, Soledad; Méndez, Leopoldo D; González, Almudena; González, Julián J

    2015-01-01

    This work reviews the techniques for extracting different measures from the electroencephalogram (EEG), and the most important results obtained with them, that can be clinically useful for studying subjects with attention-deficit/hyperactivity disorder (ADHD). First, we discuss briefly and in simple terms the EEG analysis and processing techniques most used in the context of ADHD. We review techniques that analyze individual EEG channels (univariate measures) as well as those that study the statistical interdependence between different EEG channels (multivariate measures), the so-called functional brain connectivity. Among the former, we review the classical indices of absolute and relative spectral power and estimations of the complexity of the channels, such as the approximate entropy and the Lempel-Ziv complexity. Among the latter, we focus on the magnitude squared coherence and on different measures based on the concept of generalized synchronization and its estimation in state space. Second, from a historical point of view, we present the most important results achieved with these techniques and their clinical utility (sensitivity, specificity, and accuracy) in diagnosing ADHD. Finally, we propose future research lines based on these results.
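
    Two of the measures mentioned are easy to sketch; the following Python fragment (band limits and segment lengths are illustrative choices, not the review's prescriptions) computes relative band power for one channel and magnitude squared coherence between two channels:

    ```python
    import numpy as np
    from scipy.signal import welch, coherence

    BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

    def relative_band_power(x, fs):
        """Relative spectral power of one EEG channel in the classical bands."""
        f, pxx = welch(x, fs=fs, nperseg=int(2 * fs))
        keep = (f >= 1) & (f <= 30)
        total = np.trapz(pxx[keep], f[keep])
        return {name: np.trapz(pxx[(f >= lo) & (f < hi)],
                               f[(f >= lo) & (f < hi)]) / total
                for name, (lo, hi) in BANDS.items()}

    def band_coherence(x, y, fs, band=(8, 13)):
        """Magnitude squared coherence between two channels, band-averaged."""
        f, cxy = coherence(x, y, fs=fs, nperseg=int(2 * fs))
        lo, hi = band
        return cxy[(f >= lo) & (f < hi)].mean()
    ```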

  16. Time series modeling in traffic safety research.

    PubMed

    Lavrenz, Steven M; Vlahogianni, Eleni I; Gkritza, Konstantina; Ke, Yue

    2018-08-01

    The use of statistical models for analyzing traffic safety (crash) data has been well-established. However, time series techniques have traditionally been underrepresented in the corresponding literature, due to challenges in data collection, along with a limited knowledge of proper methodology. In recent years, new types of high-resolution traffic safety data, especially in measuring driver behavior, have made time series modeling techniques an increasingly salient topic of study. Yet there remains a dearth of information to guide analysts in their use. This paper provides an overview of the state of the art in using time series models in traffic safety research, and discusses some of the fundamental techniques and considerations in classic time series modeling. It also presents ongoing and future opportunities for expanding the use of time series models, and explores newer modeling techniques, including computational intelligence models, which hold promise in effectively handling ever-larger data sets. The information contained herein is meant to guide safety researchers in understanding this broad area of transportation data analysis, and provide a framework for understanding safety trends that can influence policy-making. Copyright © 2017 Elsevier Ltd. All rights reserved.
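
    By way of illustration, a classic univariate time series model of the kind the paper surveys can be fit in a few lines; the file name and column names below are hypothetical:

    ```python
    import pandas as pd
    from statsmodels.tsa.arima.model import ARIMA

    # monthly crash counts; 'crashes.csv' with columns 'month' and 'count'
    # is a hypothetical dataset
    y = (pd.read_csv("crashes.csv", parse_dates=["month"])
           .set_index("month")["count"]
           .asfreq("MS"))

    # seasonal ARIMA: one regular and one seasonal AR/MA term, yearly period
    fit = ARIMA(y, order=(1, 1, 1), seasonal_order=(1, 0, 1, 12)).fit()
    print(fit.summary())               # coefficients and AIC for comparison
    forecast = fit.forecast(steps=12)  # one-year-ahead forecast of counts
    ```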

  17. Comparison of two modalities: a novel technique, 'chromohysteroscopy', and blind endometrial sampling for the evaluation of abnormal uterine bleeding.

    PubMed

    Alay, Asli; Usta, Taner A; Ozay, Pinar; Karadugan, Ozgur; Ates, Ugur

    2014-05-01

    The objective of this study was to compare classical blind endometrial tissue sampling with hysteroscopic biopsy sampling following methylene blue dyeing in premenopausal and postmenopausal patients with abnormal uterine bleeding. A prospective case-control study was carried out in the Office Hysteroscopy Unit. Fifty-four patients with complaints of abnormal uterine bleeding were evaluated, and data from 38 patients were included in the statistical analysis. Three groups were compared by examining samples obtained through hysteroscopic biopsy before and after methylene blue dyeing, and through classical blind endometrial tissue sampling. First, the uterine cavity was evaluated with office hysteroscopy. Methylene blue dye was then administered through the hysteroscopic inlet, and tissue samples were obtained from stained and non-stained areas. Blind endometrial sampling was performed in the same patients immediately after the hysteroscopy procedure. The results of hysteroscopic biopsies from methylene blue-stained and non-stained areas and of blind biopsy were compared. No statistically significant differences were found between the biopsy samples obtained from methylene blue-stained areas, non-stained areas and blind biopsy (P > 0.05). We suggest that chromohysteroscopy is not superior to blind endometrial sampling in cases of abnormal uterine bleeding. Further studies with greater sample sizes should be performed to assess the validity of routine use of endometrial dyeing. © 2014 The Authors. Journal of Obstetrics and Gynaecology Research © 2014 Japan Society of Obstetrics and Gynecology.

  18. High-resolution image reconstruction technique applied to the optical testing of ground-based astronomical telescopes

    NASA Astrophysics Data System (ADS)

    Jin, Zhenyu; Lin, Jing; Liu, Zhong

    2008-07-01

    Through a study of the classical testing techniques (such as the Shack-Hartmann wave-front sensor) adopted in testing the aberrations of ground-based astronomical optical telescopes, we propose two testing methods founded on high-resolution image reconstruction technology. One is based on the averaged short-exposure OTF and the other on the speckle interferometric OTF of Antoine Labeyrie. Research by J. Ohtsubo, F. Roddier, Richard Barakat and J.-Y. Zhang indicated that the SITF statistics are affected by the telescope's optical aberrations, meaning that the SITF statistical results are a function of the optical system aberration and the atmospheric Fried parameter (seeing). Telescope diffraction-limited information can be obtained through two statistical treatments of abundant speckle images: with the first method, we can extract low-frequency information such as the full width at half maximum (FWHM) of the telescope PSF to estimate the optical quality; with the second method, we can obtain a more precise description of the telescope PSF including high-frequency information. We will apply the two testing methods to the 2.4 m optical telescope of the GMG Observatory in China to validate their repeatability and correctness, and compare the testing results with those obtained with the Shack-Hartmann wave-front sensor. This part is described in detail in our paper.

  19. [First discrimination of the meanings of the seven words relevant to acupuncture in Huangdi Neijing (Yellow Emperor's Internal Classic)].

    PubMed

    Li, Shuo; Fu, Haiyan; Ju, Baozhao

    2015-10-01

    Huangdi Neijing (Yellow Emperor's Internal Classic) is the earliest medical classic still extant among the treasures of Chinese medicine and is the foundation of TCM. It not only contains a rich medical vocabulary, but also supplies new meanings for seven words, i.e., Wang, Xiu, Yuan, Fang, Xu, Jiu and Bian, denoting needle removal, needle retention, reinforcing technique, reducing technique, slow needling, moxibustion and stone-needle puncturing, respectively.

  20. Renormalization group theory outperforms other approaches in statistical comparison between upscaling techniques for porous media

    NASA Astrophysics Data System (ADS)

    Hanasoge, Shravan; Agarwal, Umang; Tandon, Kunj; Koelman, J. M. Vianney A.

    2017-09-01

    Determining the pressure differential required to achieve a desired flow rate in a porous medium requires solving Darcy's law, a Laplace-like equation, with a spatially varying tensor permeability. In various scenarios, the permeability coefficient is sampled at high spatial resolution, which makes solving Darcy's equation numerically prohibitively expensive. As a consequence, much effort has gone into creating upscaled or low-resolution effective models of the coefficient while ensuring that the estimated flow rate is well reproduced, bringing to the fore the classic tradeoff between computational cost and numerical accuracy. Here we perform a statistical study to characterize the relative success of upscaling methods on a large sample of permeability coefficients that are above the percolation threshold. We introduce a technique based on mode-elimination renormalization group theory (MG) to build coarse-scale permeability coefficients. Comparing the results with coefficients upscaled using other methods, we find that MG is consistently more accurate, particularly due to its ability to address the tensorial nature of the coefficients. MG places a low computational demand, in the manner in which we have implemented it, and accurate flow-rate estimates are obtained when using MG-upscaled permeabilities that approach or are beyond the percolation threshold.
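
    To make the idea of upscaling concrete, here is a toy Python sketch of block coarse-graining for a scalar permeability field; it uses a simple geometric-mean rule, not the authors' mode-elimination renormalization-group scheme, and ignores the tensorial effects the paper emphasizes:

    ```python
    import numpy as np

    def upscale_once(k):
        """Coarsen a 2-D scalar permeability field by a factor of two.

        Replaces each 2x2 block by its geometric mean, which lies between
        the harmonic (flow in series) and arithmetic (flow in parallel)
        bounds on the effective permeability.
        """
        nx, ny = k.shape
        blocks = k[:nx - nx % 2, :ny - ny % 2].reshape(nx // 2, 2, ny // 2, 2)
        return np.exp(np.log(blocks).mean(axis=(1, 3)))

    # two levels of coarsening of a synthetic lognormal permeability field
    k_fine = np.exp(np.random.default_rng(0).normal(size=(256, 256)))
    k_coarse = upscale_once(upscale_once(k_fine))
    ```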

  1. Quantum state reconstruction and photon number statistics for low dimensional semiconductor opto-electronic devices

    NASA Astrophysics Data System (ADS)

    Böhm, Fabian; Grosse, Nicolai B.; Kolarczik, Mirco; Herzog, Bastian; Achtstein, Alexander; Owschimikow, Nina; Woggon, Ulrike

    2017-09-01

    Quantum state tomography and the reconstruction of the photon number distribution are techniques to extract the properties of a light field from measurements of its mean and fluctuations. These techniques are particularly useful when dealing with macroscopic or mesoscopic systems, where a description limited to the second order autocorrelation soon becomes inadequate. In particular, the emission of nonclassical light is expected from mesoscopic quantum dot systems strongly coupled to a cavity or in systems with large optical nonlinearities. We analyze the emission of a quantum dot-semiconductor optical amplifier system by quantifying the modifications of a femtosecond laser pulse propagating through the device. Using a balanced detection scheme in a self-heterodyning setup, we achieve precise measurements of the quadrature components and their fluctuations at the quantum noise limit. We resolve the photon number distribution and the thermal-to-coherent evolution in the photon statistics of the emission. The interferometric detection achieves a high sensitivity in the few photon limit. From our data, we can also reconstruct the second order autocorrelation function with higher precision and time resolution compared with classical Hanbury Brown-Twiss experiments.

  2. Quantum theory of multiscale coarse-graining.

    PubMed

    Han, Yining; Jin, Jaehyeok; Wagner, Jacob W; Voth, Gregory A

    2018-03-14

    Coarse-grained (CG) models serve as a powerful tool to simulate molecular systems at much longer temporal and spatial scales. Previously, CG models and methods have been built upon classical statistical mechanics. The present paper develops a theory and numerical methodology for coarse-graining in quantum statistical mechanics, by generalizing the multiscale coarse-graining (MS-CG) method to quantum Boltzmann statistics. A rigorous derivation of the sufficient thermodynamic consistency condition is first presented via imaginary time Feynman path integrals. It identifies the optimal choice of CG action functional and effective quantum CG (qCG) force field to generate a quantum MS-CG (qMS-CG) description of the equilibrium system that is consistent with the quantum fine-grained model projected onto the CG variables. A variational principle then provides a class of algorithms for optimally approximating the qMS-CG force fields. Specifically, a variational method based on force matching, which was also adopted in the classical MS-CG theory, is generalized to quantum Boltzmann statistics. The qMS-CG numerical algorithms and practical issues in implementing this variational minimization procedure are also discussed. Then, two numerical examples are presented to demonstrate the method. Finally, as an alternative strategy, a quasi-classical approximation for the thermal density matrix expressed in the CG variables is derived. This approach provides an interesting physical picture for coarse-graining in quantum Boltzmann statistical mechanics in which the consistency with the quantum particle delocalization is obviously manifest, and it opens up an avenue for using path integral centroid-based effective classical force fields in a coarse-graining methodology.
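
    For orientation, the classical force-matching functional that MS-CG minimizes, and that this paper generalizes to quantum Boltzmann statistics, can be written (in the standard classical form; the quantum version derived in the paper differs):

    ```latex
    \chi^2[\mathbf{F}] \;=\; \frac{1}{3N}\left\langle \sum_{I=1}^{N}
    \left| \mathbf{f}_I(\mathbf{r}^n) - \mathbf{F}_I\big(\mathbf{M}(\mathbf{r}^n)\big) \right|^2 \right\rangle,
    ```

    where $\mathbf{f}_I$ is the net fine-grained force mapped onto CG site $I$, $\mathbf{F}_I$ is the CG force field being optimized, $\mathbf{M}$ is the CG mapping, and the average is over the equilibrium ensemble.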

  3. Relative velocity change measurement based on seismic noise analysis in exploration geophysics

    NASA Astrophysics Data System (ADS)

    Corciulo, M.; Roux, P.; Campillo, M.; Dubuq, D.

    2011-12-01

    Passive monitoring techniques based on noise cross-correlation analysis are still debated in exploration geophysics, even though recent studies have shown impressive performance in seismology at larger scales. Studying the time evolution of complex geological structures from noise data involves localizing noise sources and measuring relative velocity variations. Monitoring relative velocity variations only requires measuring phase shifts of seismic noise cross-correlation functions computed for successive time recordings. Existing algorithms, such as Stretching and the Doublet method, classically demand great computation time, making them impractical when continuous datasets are acquired on dense arrays. We present here an innovative technique for passive monitoring based on measuring the instantaneous phase of noise-correlated signals. The Instantaneous Phase Variation (IPV) technique aims to combine the advantages of the Stretching and Doublet methods while providing a faster measurement of the relative velocity change. IPV takes advantage of the Hilbert transform to compute, in the time domain, the phase difference between two noise correlation functions. The relative velocity variation is measured through the slope of the linear regression of the phase-difference curve as a function of correlation time. The large number of noise correlation functions classically available at exploration scale on dense arrays allows for a statistical analysis that further improves the precision of the velocity-change estimate. In this work, numerical tests first compare IPV performance to the Stretching and Doublet techniques in terms of accuracy, robustness and computation time. Experimental results are then presented using a seismic noise dataset with five days of continuous recording on 397 geophones spread over a ~1 km² area.
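
    A minimal sketch of the IPV measurement, under a narrowband assumption around a dominant frequency f0 (the published method is more general), might look like this in Python:

    ```python
    import numpy as np
    from scipy.signal import hilbert

    def ipv_dvv(ref_corr, cur_corr, fs, f0):
        """Instantaneous-phase estimate of the relative velocity change dv/v.

        A homogeneous velocity change stretches the correlation coda
        (dt/t = -dv/v), so for a narrowband signal around f0 the phase
        difference grows linearly with lapse time with slope 2*pi*f0*dv/v.
        """
        phi_ref = np.unwrap(np.angle(hilbert(ref_corr)))
        phi_cur = np.unwrap(np.angle(hilbert(cur_corr)))
        t = np.arange(len(ref_corr)) / fs            # correlation lapse time
        slope = np.polyfit(t, phi_cur - phi_ref, 1)[0]
        return slope / (2.0 * np.pi * f0)
    ```

    Averaging such estimates over the many station pairs of a dense array gives the statistical gain the abstract describes.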

  4. Effect of postmortem sampling technique on the clinical significance of autopsy blood cultures.

    PubMed

    Hove, M; Pencil, S D

    1998-02-01

    Our objective was to investigate the value of postmortem autopsy blood cultures performed with an iodine-subclavian technique relative to the classical method of atrial heat searing, and relative to antemortem blood cultures. The study consisted of a prospective autopsy series, with each case serving as its own control relative to subsequent testing, and a retrospective survey of patients coming to autopsy who had both autopsy blood cultures and premortem blood cultures. A busy academic autopsy service (600 cases per year) at the University of Texas Medical Branch Hospitals, Galveston, Texas, served as the setting for this work. The incidence of non-clinically relevant (false-positive) culture results was compared using different methods for collecting blood samples in a prospective series of 38 adult autopsy specimens. One hundred eleven adult autopsy specimens in which both postmortem and antemortem blood cultures were obtained were studied retrospectively. For both studies, positive culture results were scored as either clinically relevant or false positive based on analysis of the autopsy findings and the clinical summary. The rate of false-positive culture results obtained by the iodine-subclavian technique from blood drawn soon after death was statistically significantly lower (13%) than with the classical method of obtaining blood through the atrium after heat searing at the time of the autopsy (34%) in the same set of autopsy subjects. When autopsy results were compared with subjects' antemortem blood culture results, there was no significant difference in the rate of non-clinically relevant culture results in a paired retrospective series of antemortem blood cultures and postmortem blood cultures using the iodine-subclavian postmortem method (11.7% v 13.5%). The results indicate that autopsy blood cultures obtained using the iodine-subclavian technique have reliability equivalent to that of antemortem blood cultures.

  5. Thermodynamics and statistical mechanics. [thermodynamic properties of gases

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The basic thermodynamic properties of gases are reviewed and the relations between them are derived from the first and second laws. The elements of statistical mechanics are then formulated and the partition function is derived. The classical form of the partition function is used to obtain the Maxwell-Boltzmann distribution of kinetic energies in the gas phase and the equipartition of energy theorem is given in its most general form. The thermodynamic properties are all derived as functions of the partition function. Quantum statistics are reviewed briefly and the differences between the Boltzmann distribution function for classical particles and the Fermi-Dirac and Bose-Einstein distributions for quantum particles are discussed.
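
    For reference, the central relations the review derives are the standard ones:

    ```latex
    Z=\sum_i e^{-E_i/k_BT},\qquad
    F=-k_BT\ln Z,\qquad
    U=k_BT^2\,\frac{\partial \ln Z}{\partial T},\qquad
    S=\frac{U-F}{T},
    ```

    and the classical partition function then yields the Maxwell-Boltzmann speed distribution $f(v)\propto v^2 e^{-mv^2/2k_BT}$ referred to in the text.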

  6. Quick Overview Scout 2008 Version 1.0

    EPA Science Inventory

    The Scout 2008 version 1.0 statistical software package has been updated from past DOS and Windows versions to provide classical and robust univariate and multivariate graphical and statistical methods that are not typically available in commercial or freeware statistical software…

  7. The Relationship between Background Classical Music and Reading Comprehension on Seventh and Eighth Grade Students

    ERIC Educational Resources Information Center

    Falcon, Evelyn

    2017-01-01

    The purpose of this study was to examine if there is any relationship on reading comprehension when background classical music is played in the setting of a 7th and 8th grade classroom. This study also examined if there was a statistically significant difference in test anxiety when listening to classical music while completing a test. Reading…

  8. Guided SAR image despeckling with probabilistic non local weights

    NASA Astrophysics Data System (ADS)

    Gokul, Jithin; Nair, Madhu S.; Rajan, Jeny

    2017-12-01

    SAR images are generally corrupted by granular disturbances called speckle, which make visual analysis and detail extraction a difficult task. Non-local despeckling techniques with probabilistic similarity have been a recent trend in SAR despeckling. To achieve effective speckle suppression without compromising detail preservation, we propose an improvement to the existing Generalized Guided Filter with Bayesian Non-Local Means (GGF-BNLM) method. The proposed method (Guided SAR Image Despeckling with Probabilistic Non-Local Weights) replaces the heuristic parametric constants of the GGF-BNLM method with values derived dynamically from the image statistics for weight computation. The proposed changes make the GGF-BNLM method adaptive and, as a result, yield significant improvement in performance. Experimental analysis on SAR images shows excellent speckle reduction without compromising feature preservation when compared to the GGF-BNLM method. Results are also compared with other state-of-the-art and classic SAR despeckling techniques to demonstrate the effectiveness of the proposed method.

  9. Student Support for Research in Hierarchical Control and Trajectory Planning

    NASA Technical Reports Server (NTRS)

    Martin, Clyde F.

    1999-01-01

    Generally, classical polynomial splines tend to exhibit unwanted undulations. In this work, we discuss a technique, based on control principles, for eliminating these undulations and increasing the smoothness properties of the spline interpolants. We give a generalization of the classical polynomial splines and show that this generalization is, in fact, a family of splines that covers the broad spectrum of polynomial, trigonometric and exponential splines. A particular element in this family is determined by the appropriate control data. It is shown that this technique is easy to implement. Several numerical and curve-fitting examples are given to illustrate the advantages of this technique over the classical approach. Finally, we discuss the convergence properties of the interpolant.

  10. On the potential for the Partial Triadic Analysis to grasp the spatio-temporal variability of groundwater hydrochemistry

    NASA Astrophysics Data System (ADS)

    Gourdol, L.; Hissler, C.; Pfister, L.

    2012-04-01

    The Luxembourg sandstone aquifer is of major relevance for the national supply of drinking water in Luxembourg. The city of Luxembourg (20% of the country's population) gets almost 2/3 of its drinking water from this aquifer. As a consequence, the study of groundwater hydrochemistry and of its spatial and temporal variations is considered of the highest priority. Since 2005, a monitoring network has been implemented by the Water Department of Luxembourg City, with a view to a more sustainable management of this strategic water resource. The data collected to date form a large and complex dataset describing spatial and temporal variations of many hydrochemical parameters. The issue of data treatment is tightly connected to this kind of water monitoring programme and its complex databases. Standard multivariate statistical techniques, such as principal components analysis and hierarchical cluster analysis, have been widely used as unbiased methods for extracting meaningful information from groundwater quality data and are now classically used in many hydrogeological studies, in particular to characterize temporal or spatial hydrochemical variations induced by natural and anthropogenic factors. But these classical multivariate methods deal with two-way matrices, usually parameters/sites or parameters/time, whereas the dataset resulting from a water quality monitoring programme should often be seen as a datacube parameters/sites/time. Three-way matrices, such as the one we propose here, are difficult to handle and to analyse with classical multivariate statistical tools and should thus be treated with approaches designed for three-way data structures. One possible approach is partial triadic analysis (PTA). PTA has previously been used with success in many ecological studies but never to date in the domain of hydrogeology. Applied to the dataset of the Luxembourg sandstone aquifer, PTA appears to be a promising new statistical instrument for hydrogeologists, in particular for characterizing temporal and spatial hydrochemical variations induced by natural and anthropogenic factors. This new approach to groundwater management offers potential for 1) identifying a common multivariate spatial structure, 2) uncovering the different hydrochemical patterns and explaining their controlling factors and 3) analysing the temporal variability of this structure and grasping hydrochemical changes.

  11. Frequent statistics of link-layer bit stream data based on AC-IM algorithm

    NASA Astrophysics Data System (ADS)

    Cao, Chenghong; Lei, Yingke; Xu, Yiming

    2017-08-01

    At present, there is much research on data processing using classical pattern matching and its improved algorithms, but little on statistics over link-layer bit stream data. This paper adopts a frequent-statistics method for link-layer bit stream data based on the AC-IM algorithm, because classical multi-pattern matching algorithms such as the AC algorithm have high computational complexity and low efficiency and cannot be applied directly to binary bit stream data. The method's maximum jump distance on the pattern tree is the length of the shortest pattern string plus 3, with no matches missed. This paper first analyses the principle of the algorithm's construction theoretically; the experimental results then show that the algorithm can operate in a binary bit stream environment and extract frequent sequences accurately, with obvious effect. Meanwhile, compared with the classical AC algorithm and other improved algorithms, the AC-IM algorithm has a greater maximum jump distance and is less time-consuming.
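
    For comparison with the baseline the paper improves on, here is a compact Python implementation of the classical Aho-Corasick (AC) automaton over the binary alphabet, counting pattern occurrences in a bit stream; the AC-IM jump-distance optimization itself is not reproduced:

    ```python
    from collections import deque

    def build_ac(patterns):
        """Build goto/fail/output tables for an Aho-Corasick automaton."""
        goto, fail, out = [{}], [0], [set()]
        for p in patterns:
            s = 0
            for c in p:
                if c not in goto[s]:
                    goto.append({}); fail.append(0); out.append(set())
                    goto[s][c] = len(goto) - 1
                s = goto[s][c]
            out[s].add(p)
        queue = deque(goto[0].values())     # depth-1 states fail to the root
        while queue:
            r = queue.popleft()
            for c, s in goto[r].items():
                queue.append(s)
                f = fail[r]
                while f and c not in goto[f]:
                    f = fail[f]
                fail[s] = goto[f].get(c, 0)
                out[s] |= out[fail[s]]      # inherit matches via failure link
        return goto, fail, out

    def count_matches(bits, patterns):
        """Count occurrences of each binary pattern in a '0'/'1' stream."""
        goto, fail, out = build_ac(patterns)
        counts, s = {p: 0 for p in patterns}, 0
        for c in bits:
            while s and c not in goto[s]:
                s = fail[s]
            s = goto[s].get(c, 0)
            for p in out[s]:
                counts[p] += 1
        return counts

    # count_matches("0110100111010", ["011", "1101", "00"])
    ```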

  12. Response to traumatic brain injury neurorehabilitation through an artificial intelligence and statistics hybrid knowledge discovery from databases methodology.

    PubMed

    Gibert, Karina; García-Rudolph, Alejandro; García-Molina, Alberto; Roig-Rovira, Teresa; Bernabeu, Montse; Tormos, José María

    2008-01-01

    Objective: develop a classificatory tool to identify different populations of patients with traumatic brain injury based on the characteristics of deficit and response to treatment. A KDD framework was used in which, first, descriptive statistics of every variable were computed, followed by data cleaning and selection of relevant variables. The data were then mined using a generalization of clustering based on rules (CIBR), a hybrid AI and statistics technique which combines inductive learning (AI) and clustering (statistics). A prior knowledge base (KB) is considered to properly bias the clustering; semantic constraints implied by the KB hold in the final clusters, guaranteeing interpretability of the results. A generalization (exogenous clustering based on rules, ECIBR) is presented, allowing the KB to be defined in terms of variables which are not considered in the clustering process itself, for greater flexibility. Several tools, such as the class panel graph, are introduced in the methodology to assist final interpretation. A set of 5 classes was recommended by the system, and interpretation permitted the labelling of profiles. From the medical point of view, the composition of the classes corresponds well with different patterns of increasing level of response to rehabilitation treatments. All the patients initially assessable form a single group. Severely impaired patients are subdivided into four profiles with clearly distinct response patterns. Particularly interesting is the partial-response profile, in which patients could not improve executive functions. Meaningful classes were obtained and, from a semantic point of view, the results were appreciably improved with respect to classical clustering, supporting our opinion that hybrid AI and statistics techniques are more powerful for KDD than pure ones.

  13. Asymptotic Linear Spectral Statistics for Spiked Hermitian Random Matrices

    NASA Astrophysics Data System (ADS)

    Passemier, Damien; McKay, Matthew R.; Chen, Yang

    2015-07-01

    Using the Coulomb Fluid method, this paper derives central limit theorems (CLTs) for linear spectral statistics of three "spiked" Hermitian random matrix ensembles. These include Johnstone's spiked model (i.e., central Wishart with spiked correlation), non-central Wishart with rank-one non-centrality, and a related class of non-central matrices. For a generic linear statistic, we derive simple and explicit CLT expressions as the matrix dimensions grow large. For all three ensembles under consideration, we find that the primary effect of the spike is to introduce a correction term to the asymptotic mean of the linear spectral statistic, which we characterize with simple formulas. The utility of our proposed framework is demonstrated through application to three different linear statistics problems: the classical likelihood ratio test for a population covariance, the capacity analysis of multi-antenna wireless communication systems with a line-of-sight transmission path, and a classical multiple sample significance testing problem.

  14. Continuous quantum measurement and the quantum to classical transition

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bhattacharya, Tanmoy; Habib, Salman; Jacobs, Kurt

    2003-04-01

    While ultimately they are described by quantum mechanics, macroscopic mechanical systems are nevertheless observed to follow the trajectories predicted by classical mechanics. Hence, in the regime defining macroscopic physics, the trajectories of the correct classical motion must emerge from quantum mechanics, a process referred to as the quantum to classical transition. Extending previous work [Bhattacharya, Habib, and Jacobs, Phys. Rev. Lett. 85, 4852 (2000)], here we elucidate this transition in some detail, showing that once the measurement processes that affect all macroscopic systems are taken into account, quantum mechanics indeed predicts the emergence of classical motion. We derive inequalities that describe the parameter regime in which classical motion is obtained, and provide numerical examples. We also demonstrate two further important properties of the classical limit: first, that multiple observers all agree on the motion of an object, and second, that classical statistical inference may be used to correctly track the classical motion.

  15. Least Squares Procedures.

    ERIC Educational Resources Information Center

    Hester, Yvette

    Least squares methods are sophisticated mathematical curve fitting procedures used in all classical parametric methods. The linear least squares approximation is most often associated with finding the "line of best fit" or the regression line. Since all statistical analyses are correlational and all classical parametric methods are least…

  16. Scout 2008 Version 1.0 User Guide

    EPA Science Inventory

    The Scout 2008 version 1.0 software package provides a wide variety of classical and robust statistical methods that are not typically available in other commercial software packages. A major part of Scout deals with classical, robust, and resistant univariate and multivariate ou...

  17. Prediction of lung cancer patient survival via supervised machine learning classification techniques.

    PubMed

    Lynch, Chip M; Abdollahi, Behnaz; Fuqua, Joshua D; de Carlo, Alexandra R; Bartholomai, James A; Balgemann, Rayeanne N; van Berkel, Victor H; Frieboes, Hermann B

    2017-12-01

    Outcomes for cancer patients have been previously estimated by applying various machine learning techniques to large datasets such as the Surveillance, Epidemiology, and End Results (SEER) program database. In particular for lung cancer, it is not well understood which types of techniques would yield more predictive information, and which data attributes should be used in order to determine this information. In this study, a number of supervised learning techniques are applied to the SEER database to classify lung cancer patients in terms of survival, including linear regression, Decision Trees, Gradient Boosting Machines (GBM), Support Vector Machines (SVM), and a custom ensemble. Key data attributes in applying these methods include tumor grade, tumor size, gender, age, stage, and number of primaries, with the goal of enabling comparison of predictive power between the various methods. The prediction is treated as a continuous target, rather than a classification into categories, as a first step towards improving survival prediction. The results show that the predicted values agree with actual values for low to moderate survival times, which constitute the majority of the data. The best performing technique was the custom ensemble with a Root Mean Square Error (RMSE) value of 15.05. The most influential model within the custom ensemble was GBM, while Decision Trees may be inapplicable as they had too few discrete outputs. The results further show that among the five individual models generated, the most accurate was GBM with an RMSE value of 15.32. Although SVM underperformed with an RMSE value of 15.82, statistical analysis singles out the SVM as the only model that generated a distinctive output. The results of the models are consistent with a classical Cox proportional hazards model used as a reference technique. We conclude that application of these supervised learning techniques to lung cancer data in the SEER database may be of use for estimating patient survival time, with the ultimate goal of informing patient care decisions, and that the performance of these techniques with this particular dataset may be on par with that of classical methods. Copyright © 2017 Elsevier B.V. All rights reserved.
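
    As a schematic of the workflow (with synthetic stand-in data rather than SEER records, and illustrative hyperparameters), the GBM regression step might look like:

    ```python
    import numpy as np
    from sklearn.ensemble import GradientBoostingRegressor
    from sklearn.metrics import mean_squared_error
    from sklearn.model_selection import train_test_split

    # six attributes standing in for tumor grade, tumor size, gender, age,
    # stage, and number of primaries; y is survival time in months
    rng = np.random.default_rng(0)
    X = rng.normal(size=(5000, 6))
    y = np.clip(60 + 10 * X[:, 0] - 8 * X[:, 3]
                + rng.normal(scale=15, size=5000), 0, None)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2,
                                              random_state=0)
    gbm = GradientBoostingRegressor(n_estimators=300, max_depth=3,
                                    learning_rate=0.05).fit(X_tr, y_tr)
    rmse = mean_squared_error(y_te, gbm.predict(X_te)) ** 0.5
    print(f"GBM RMSE: {rmse:.1f} months")  # the paper's comparison metric
    ```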

  18. A statistical physics view of pitch fluctuations in the classical music from Bach to Chopin: evidence for scaling.

    PubMed

    Liu, Lu; Wei, Jianrong; Zhang, Huishu; Xin, Jianhong; Huang, Jiping

    2013-01-01

    Because classical music has greatly affected our life and culture in its long history, it has attracted extensive attention from researchers seeking to understand the laws behind it. Based on statistical physics, here we use a different method to investigate classical music, namely, by analyzing cumulative distribution functions (CDFs) and autocorrelation functions of pitch fluctuations in compositions. We analyze 1,876 compositions of five representative classical music composers across 164 years, from Bach, to Mozart, to Beethoven, to Mendelssohn, and to Chopin. We report that the biggest pitch fluctuations of a composer gradually increase as time evolves from Bach's time to Mendelssohn's/Chopin's time. In particular, for the compositions of a composer, the positive and negative tails of a CDF of pitch fluctuations are distributed not only as power laws (with the scale-free property), but also symmetrically (namely, the probability of a treble following a bass and that of a bass following a treble are basically the same for each composer). The power-law exponent decreases as time elapses. Further, we also calculate the autocorrelation function of the pitch fluctuations, which shows a power-law distribution for each composer. The power-law exponents vary with the composers, indicating their different levels of long-range correlation of notes. This work not only suggests a way to understand and develop music from the viewpoint of statistical physics, but also enriches the realm of traditional statistical physics by analyzing music.
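
    The two quantities analyzed, the tail distribution and the autocorrelation of pitch fluctuations, are straightforward to compute; a Python sketch (treating a composition as a sequence of MIDI note numbers, a simplification of the authors' data) follows:

    ```python
    import numpy as np

    def pitch_fluctuation_stats(pitches, max_lag=50):
        """Tail CDF and autocorrelation of pitch fluctuations.

        pitches: note pitches in score order; fluctuations are successive
        differences, positive for rising intervals, negative for falling.
        """
        df = np.diff(np.asarray(pitches, dtype=float))
        pos = np.sort(df[df > 0])
        # empirical survival function of the positive tail: a power law
        # appears as a straight line on log-log axes
        ccdf = np.arange(len(pos), 0, -1) / len(pos)
        x = df - df.mean()
        acf = np.array([np.dot(x[:len(x) - k], x[k:]) / np.dot(x, x)
                        for k in range(1, max_lag + 1)])
        return pos, ccdf, acf
    ```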

  19. Phase-Sensitive Coherence and the Classical-Quantum Boundary in Ghost Imaging

    NASA Technical Reports Server (NTRS)

    Erkmen, Baris I.; Hardy, Nicholas D.; Venkatraman, Dheera; Wong, Franco N. C.; Shapiro, Jeffrey H.

    2011-01-01

    The theory of partial coherence has a long and storied history in classical statistical optics. The vast majority of this work addresses fields that are statistically stationary in time, hence their complex envelopes only have phase-insensitive correlations. The quantum optics of squeezed-state generation, however, depends on nonlinear interactions producing baseband field operators with phase-insensitive and phase-sensitive correlations. Utilizing quantum light to enhance imaging has been a topic of considerable current interest, much of it involving biphotons, i.e., streams of entangled-photon pairs. Biphotons have been employed for quantum versions of optical coherence tomography, ghost imaging, holography, and lithography. However, their seemingly quantum features have been mimicked with classical-state light, raising the question of where the classical-quantum boundary lies. We have shown, for the case of Gaussian-state light, that this boundary is intimately connected to the theory of phase-sensitive partial coherence. Here we present that theory, contrasting it with the familiar case of phase-insensitive partial coherence, and use it to elucidate the classical-quantum boundary of ghost imaging. We show, both theoretically and experimentally, that classical phase-sensitive light produces ghost images that most closely mimic those obtained with biphotons, and we derive the spatial resolution, image contrast, and signal-to-noise ratio of a standoff-sensing ghost imager, taking into account target-induced speckle.

  20. Profiling of modified nucleosides from ribonucleic acid digestion by supercritical fluid chromatography coupled to high resolution mass spectrometry.

    PubMed

    Laboureur, Laurent; Guérineau, Vincent; Auxilien, Sylvie; Yoshizawa, Satoko; Touboul, David

    2018-02-16

    A method based on supercritical fluid chromatography coupled to high resolution mass spectrometry for the profiling of canonical and modified nucleosides was optimized and compared to classical reverse-phase liquid chromatography in terms of separation, number of detected modified nucleosides and sensitivity. Limits of detection and quantification were measured using a statistical method, and quantification of twelve nucleosides of a tRNA digest from E. coli is in good agreement with previously reported data. The results highlight the complementarity of both separation techniques to cover the largest view of nucleoside modifications for forthcoming epigenetic studies. Copyright © 2017 Elsevier B.V. All rights reserved.
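
    The abstract does not state which statistical method was used for the detection limits; a common calibration-based choice (assumed here, not confirmed by the paper) is the ICH definition:

    ```latex
    \mathrm{LOD}=\frac{3.3\,\sigma}{S},\qquad \mathrm{LOQ}=\frac{10\,\sigma}{S},
    ```

    where $\sigma$ is the standard deviation of the response (for example, of the calibration-curve intercept) and $S$ is the slope of the calibration curve.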

  1. Study of optimum methods of optical communication

    NASA Technical Reports Server (NTRS)

    Harger, R. O.

    1972-01-01

    Optimum methods of optical communication accounting for the effects of the turbulent atmosphere and of quantum mechanics, both by the semi-classical method and by a full quantum-theoretical model, are described. A concerted effort was made to apply the techniques of communication theory to the novel problems of optical communication through careful study of realistic models and their statistical descriptions, the finding of appropriate optimum structures, the calculation of their performance and, insofar as possible, comparison with conventional and other suboptimal systems. In this unified way the bounds on performance and the structure of optimum communication systems for transmission of information, imaging, tracking, and estimation can be determined for optical channels.

  2. Local coexistence of VO2 phases revealed by deep data analysis

    DOE PAGES

    Strelcov, Evgheni; Ievlev, Anton; Tselev, Alexander; ...

    2016-07-07

    We report a synergistic approach of micro-Raman spectroscopic mapping and deep data analysis to study the distribution of crystallographic phases and ferroelastic domains in a defected Al-doped VO2 microcrystal. Bayesian linear unmixing revealed an uneven distribution of the T phase, stabilized by the surface defects and uneven local doping, that went undetected by other classical analysis techniques such as PCA and SIMPLISMA. This work demonstrates the impact of information recovery via statistical analysis and full mapping in spectroscopic studies of vanadium dioxide systems, where mapping is commonly substituted by averaging or single-point-probing approaches, both of which suffer from information misinterpretation due to low resolving power.

  3. Numerical methods for coupled fracture problems

    NASA Astrophysics Data System (ADS)

    Viesca, Robert C.; Garagash, Dmitry I.

    2018-04-01

    We consider numerical solutions in which the linear elastic response to an opening- or sliding-mode fracture couples with one or more processes. Classic examples of such problems include traction-free cracks leading to stress singularities or cracks with cohesive-zone strength requirements leading to non-singular stress distributions. These classical problems have characteristic square-root asymptotic behavior for stress, relative displacement, or their derivatives. Prior work has shown that such asymptotics lead to a natural quadrature of the singular integrals at roots of Chebyshev polynomials of the first, second, third, or fourth kind. We show that such quadratures lead to convenient techniques for interpolation, differentiation, and integration, with the potential for spectral accuracy. We further show that these techniques, with slight amendment, may continue to be used for non-classical problems which lack the classical asymptotic behavior. We consider solutions to example problems of both the classical and non-classical variety (e.g., fluid-driven opening-mode fracture and fault shear rupture driven by thermal weakening), with comparisons to analytical solutions or asymptotes, where available.
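
    For concreteness, the first-kind rule underlying such quadratures is the Gauss-Chebyshev formula

    ```latex
    \int_{-1}^{1}\frac{f(x)}{\sqrt{1-x^2}}\,dx \;\approx\; \frac{\pi}{n}\sum_{k=1}^{n} f(x_k),
    \qquad x_k=\cos\!\left(\frac{(2k-1)\pi}{2n}\right),
    ```

    whose nodes are the roots of the Chebyshev polynomial of the first kind $T_n$; analogous rules at the roots of the second-, third-, and fourth-kind polynomials handle the other square-root weight combinations mentioned in the abstract.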

  4. Probability and Statistics: A Prelude.

    ERIC Educational Resources Information Center

    Goodman, A. F.; Blischke, W. R.

    Probability and statistics have become indispensable to scientific, technical, and management progress. They serve as essential dialects of mathematics, the classical language of science, and as instruments necessary for intelligent generation and analysis of information. A prelude to probability and statistics is presented by examination of the…

  5. Use of Fermi-Dirac statistics for defects in solids

    NASA Astrophysics Data System (ADS)

    Johnson, R. A.

    1981-12-01

    The Fermi-Dirac distribution function is an approximation describing a special case of Boltzmann statistics. A general occupation probability formula is derived and a criterion given for the use of Fermi-Dirac statistics. Application to classical problems of defects in solids is discussed.
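
    The abstract does not reproduce the formula; for a defect level of energy $E$ with degeneracy factor $g$, the textbook occupation probability that such a derivation typically yields is

    ```latex
    f=\frac{1}{1+g^{-1}\,e^{(E-E_F)/k_BT}},
    ```

    which reduces to the ordinary Fermi-Dirac function for $g=1$ and to a Boltzmann factor when $E-E_F\gg k_BT$.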

  6. Comparison of Classical and Quantum Mechanical Uncertainties.

    ERIC Educational Resources Information Center

    Peslak, John, Jr.

    1979-01-01

    Comparisons are made for the particle-in-a-box, the harmonic oscillator, and the one-electron atom. A classical uncertainty principle is derived and compared with its quantum-mechanical counterpart. The results are discussed in terms of the statistical interpretation of the uncertainty principle. (Author/BB)
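
    As an example of the kind of comparison discussed (an illustrative computation, since the abstract does not quote specific results), for a particle in a box of width $L$ the position uncertainties are

    ```latex
    \Delta x_{\mathrm{cl}}=\frac{L}{\sqrt{12}},\qquad
    \Delta x_{\mathrm{qm}}^{(n=1)}=L\sqrt{\frac{1}{12}-\frac{1}{2\pi^2}},
    ```

    the classical value following from a uniform position distribution and the quantum ground-state value from $\psi_1(x)=\sqrt{2/L}\,\sin(\pi x/L)$; the two agree in the large-$n$ limit.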

  7. A Review of the Integration of Classical Biological Control with other Management Techniques to Manage Invasive Weeds in Natural Areas and Rangelands

    USDA-ARS?s Scientific Manuscript database

    Integrating classical biological control with other management techniques such as herbicide, fire, mechanical control, grazing, or plant competition, can be the most effective way to manage invasive weeds in natural areas and rangelands. Biological control agents can be protected from potential nega...

  8. Machine learning of frustrated classical spin models. I. Principal component analysis

    NASA Astrophysics Data System (ADS)

    Wang, Ce; Zhai, Hui

    2017-10-01

    This work aims at determining whether artificial intelligence can recognize a phase transition without prior human knowledge. If this were successful, it could be applied to, for instance, analyzing data from the quantum simulation of unsolved physical models. Toward this goal, we first need to apply the machine learning algorithm to well-understood models and see whether the outputs are consistent with our prior knowledge, which serves as the benchmark for this approach. In this work, we feed the computer data generated by the classical Monte Carlo simulation for the XY model on frustrated triangular and union jack lattices, which has two order parameters and exhibits two phase transitions. We show that the outputs of the principal component analysis agree very well with our understanding of different orders in different phases, and the temperature dependences of the major components detect the nature and the locations of the phase transitions. Our work offers promise for using machine learning techniques to study sophisticated statistical models, and our results can be further improved by using principal component analysis with kernel tricks and the neural network method.
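
    A minimal version of this analysis pipeline in Python (with hypothetical Monte Carlo configurations; the frustrated-lattice simulation itself is not reproduced) is:

    ```python
    import numpy as np
    from sklearn.decomposition import PCA

    def pca_on_spin_configs(thetas, n_components=2):
        """PCA on classical XY spin configurations.

        thetas: array (n_samples, n_sites) of spin angles collected from
        Monte Carlo runs across temperatures. Feeding (cos, sin) pairs
        rather than raw angles respects the periodicity of XY spins.
        """
        X = np.hstack([np.cos(thetas), np.sin(thetas)])
        pca = PCA(n_components=n_components)
        proj = pca.fit_transform(X)     # leading components track the order
        return proj, pca.explained_variance_ratio_

    # plotting the leading component against temperature typically shows a
    # drop at the transition, locating it without prior labels
    ```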

  9. Brain-computer interaction research at the Computer Vision and Multimedia Laboratory, University of Geneva.

    PubMed

    Pun, Thierry; Alecu, Teodor Iulian; Chanel, Guillaume; Kronegg, Julien; Voloshynovskiy, Sviatoslav

    2006-06-01

    This paper describes the work being conducted in the domain of brain-computer interaction (BCI) at the Multimodal Interaction Group, Computer Vision and Multimedia Laboratory, University of Geneva, Geneva, Switzerland. The application focus of this work is on multimodal interaction rather than on rehabilitation, that is, on how to augment classical interaction by means of physiological measurements. Three main research topics are addressed. The first one concerns the more general problem of brain source activity recognition from EEGs. In contrast with classical deterministic approaches, we studied iterative, robust, stochastic reconstruction procedures that model source and noise statistics, to overcome known limitations of current techniques. We also developed procedures for optimal electroencephalogram (EEG) sensor system design in terms of placement and number of electrodes. The second topic is the study of BCI protocols and performance from an information-theoretic point of view. Various information rate measurements have been compared for assessing BCI abilities. The third research topic concerns the use of EEG and other physiological signals for assessing a user's emotional status.

  10. Speed and heart-rate profiles in skating and classical cross-country skiing competitions.

    PubMed

    Bolger, Conor M; Kocbach, Jan; Hegge, Ann Magdalen; Sandbakk, Øyvind

    2015-10-01

    To compare the speed and heart-rate profiles during international skating and classical competitions in male and female world-class cross-country skiers. Four male and 5 female skiers performed individual time trials of 15 km (men) and 10 km (women) in the skating and classical techniques on 2 consecutive days. Races were performed on the same 5-km course. The course was mapped with GPS and a barometer to provide a valid course and elevation profile. Time, speed, and heart rate were determined for uphill, flat, and downhill terrains throughout the entire competition by wearing a GPS and a heart-rate monitor. Times in uphill, flat, and downhill terrain were ~55%, 15-20%, and 25-30%, respectively, of the total race time for both techniques and genders. The average speed differences between skating and classical skiing were 9% and 11% for men and women, respectively, and these values were 12% and 15% for uphill, 8% and 13% for flat (all P < .05), and 2% and 1% for downhill terrain. The average speeds for men were 9% and 11% faster than for women in skating and classical, respectively, with corresponding numbers of 11% and 14% for uphill, 6% and 11% for flat, and 4% and 5% for downhill terrain (all P < .05). Heart-rate profiles were relatively independent of technique and gender. The greatest performance differences between the skating and classical techniques and between the 2 genders were found on uphill terrain. Therefore, these speed differences could not be explained by variations in exercise intensity.

  11. Off-diagonal expansion quantum Monte Carlo

    NASA Astrophysics Data System (ADS)

    Albash, Tameem; Wagenbreth, Gene; Hen, Itay

    2017-12-01

    We propose a Monte Carlo algorithm designed to simulate quantum as well as classical systems at equilibrium, bridging the algorithmic gap between quantum and classical thermal simulation algorithms. The method is based on a decomposition of the quantum partition function that can be viewed as a series expansion about its classical part. We argue that the algorithm not only provides a theoretical advancement in the field of quantum Monte Carlo simulations, but is optimally suited to tackle quantum many-body systems that exhibit a range of behaviors from "fully quantum" to "fully classical," in contrast to many existing methods. We demonstrate the advantages, sometimes by orders of magnitude, of the technique by comparing it against existing state-of-the-art schemes such as path integral quantum Monte Carlo and stochastic series expansion. We also illustrate how our method allows for the unification of quantum and classical thermal parallel tempering techniques into a single algorithm and discuss its practical significance.

  12. Off-diagonal expansion quantum Monte Carlo.

    PubMed

    Albash, Tameem; Wagenbreth, Gene; Hen, Itay

    2017-12-01

    We propose a Monte Carlo algorithm designed to simulate quantum as well as classical systems at equilibrium, bridging the algorithmic gap between quantum and classical thermal simulation algorithms. The method is based on a decomposition of the quantum partition function that can be viewed as a series expansion about its classical part. We argue that the algorithm not only provides a theoretical advancement in the field of quantum Monte Carlo simulations, but is optimally suited to tackle quantum many-body systems that exhibit a range of behaviors from "fully quantum" to "fully classical," in contrast to many existing methods. We demonstrate the advantages, sometimes by orders of magnitude, of the technique by comparing it against existing state-of-the-art schemes such as path integral quantum Monte Carlo and stochastic series expansion. We also illustrate how our method allows for the unification of quantum and classical thermal parallel tempering techniques into a single algorithm and discuss its practical significance.

  13. Limb Lengthening and Then Insertion of an Intramedullary Nail: A Case-matched Comparison

    PubMed Central

    Kleinman, Dawn; Fragomen, Austin T.; Ilizarov, Svetlana

    2008-01-01

    Distraction osteogenesis is an effective method for lengthening, deformity correction, and treatment of nonunions and bone defects. The classic method uses an external fixator for both distraction and consolidation, leading to lengthy times in frames, and there is a risk of refracture after frame removal. We suggest a new technique, lengthening and then nailing (LATN), in which the frame is used for gradual distraction and a reamed intramedullary nail is then inserted to support the bone during the consolidation phase, allowing early removal of the external fixator. We performed a retrospective case-matched comparison of patients lengthened with the LATN technique (39 limbs in 27 patients) versus the classic technique (34 limbs in 27 patients). The LATN group wore the external fixator for less time than the classic group (12 versus 29 weeks). The LATN group had a lower external fixation index (0.5 versus 1.9) and a lower bone healing index (0.8 versus 1.9) than the classic group. LATN confers advantages over the classic method including shorter times needed in external fixation, quicker bone healing, and protection against refracture. There are also advantages over the lengthening over a nail and internal lengthening nail techniques. Level of Evidence: Level III, therapeutic study. See the Guidelines for Authors for a complete description of levels of evidence. PMID:18800209

  14. Methods for Multiloop Identification of Visual and Neuromuscular Pilot Responses.

    PubMed

    Olivari, Mario; Nieuwenhuizen, Frank M; Venrooij, Joost; Bülthoff, Heinrich H; Pollini, Lorenzo

    2015-12-01

    In this paper, identification methods are proposed to estimate the neuromuscular and visual responses of a multiloop pilot model. A conventional and widely used technique for simultaneous identification of the neuromuscular and visual systems makes use of cross-spectral density estimates. This paper shows that this technique requires a specific noninterference hypothesis, often implicitly assumed, that may be difficult to meet during actual experimental designs. A mathematical justification of the necessity of the noninterference hypothesis is given. Furthermore, two methods are proposed that do not have the same limitations. The first method is based on autoregressive models with exogenous inputs, whereas the second one combines cross-spectral estimators with interpolation in the frequency domain. The two identification methods are validated by offline simulations and contrasted to the classic method. The results reveal that the classic method fails when the noninterference hypothesis is not fulfilled; on the contrary, the two proposed techniques give reliable estimates. Finally, the three identification methods are applied to experimental data from a closed-loop control task with pilots. The two proposed techniques give comparable estimates, different from those obtained by the classic method. The differences match those found with the simulations. Thus, the two identification methods provide a good alternative to the classic method and make it possible to simultaneously estimate a human's neuromuscular and visual responses in cases where the classic method fails.
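
    The cross-spectral estimator discussed above can be sketched in a few lines; this toy single-loop example (ours, with an assumed low-pass "pilot" dynamic and remnant noise) estimates a frequency response as H(f) = S_xy(f) / S_xx(f) with scipy.

      import numpy as np
      from scipy import signal

      rng = np.random.default_rng(1)
      fs, n = 100.0, 20000
      x = rng.standard_normal(n)                    # excitation signal
      b, a = signal.butter(2, 10.0, fs=fs)          # "pilot" modeled as a low-pass filter
      y = signal.lfilter(b, a, x) + 0.1 * rng.standard_normal(n)  # response + remnant

      f, Sxy = signal.csd(x, y, fs=fs, nperseg=1024)
      _, Sxx = signal.welch(x, fs=fs, nperseg=1024)
      H = Sxy / Sxx        # consistent when the remnant is uncorrelated with x
      print(np.abs(H[:5]))  # gain estimate at the lowest frequencies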

  15. On Some Assumptions of the Null Hypothesis Statistical Testing

    ERIC Educational Resources Information Center

    Patriota, Alexandre Galvão

    2017-01-01

    Bayesian and classical statistical approaches are based on different types of logical principles. In order to avoid mistaken inferences and misguided interpretations, the practitioner must respect the inference rules embedded into each statistical method. Ignoring these principles leads to the paradoxical conclusions that the hypothesis…

  16. QSAR as a random event: modeling of nanoparticles uptake in PaCa2 cancer cells.

    PubMed

    Toropov, Andrey A; Toropova, Alla P; Puzyn, Tomasz; Benfenati, Emilio; Gini, Giuseppina; Leszczynska, Danuta; Leszczynski, Jerzy

    2013-06-01

    Quantitative structure-property/activity relationships (QSPRs/QSARs) are a tool to predict various endpoints for various substances. The "classic" QSPR/QSAR analysis is based on the representation of the molecular structure by the molecular graph. However, the simplified molecular input-line entry system (SMILES) is gradually becoming the most popular representation of the molecular structure in the databases available on the Internet. Under such circumstances, the development of molecular descriptors calculated directly from SMILES becomes an attractive alternative to "classic" descriptors. The CORAL software (http://www.insilico.eu/coral) provides SMILES-based optimal molecular descriptors which are aimed to correlate with various endpoints. We analyzed a data set on nanoparticle uptake in PaCa2 pancreatic cancer cells. The data set includes 109 nanoparticles with the same core but different surface modifiers (small organic molecules). The concept of a QSAR as a random event is suggested, in opposition to "classic" QSARs, which are based on only one distribution of the available data into the training and validation sets. In other words, five random splits into the "visible" training set and the "invisible" validation set were examined. The SMILES-based optimal descriptors (obtained by the Monte Carlo technique) for these splits are calculated with the CORAL software. The statistical quality of all these models is good. Copyright © 2013 Elsevier Ltd. All rights reserved.
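
    A minimal sketch of the "QSAR as a random event" idea (ours, with simulated descriptors and endpoint values standing in for the CORAL descriptors): the same model is refit over five random training/validation splits and the spread of validation scores is inspected.

      import numpy as np
      from sklearn.linear_model import LinearRegression
      from sklearn.model_selection import train_test_split
      from sklearn.metrics import r2_score

      rng = np.random.default_rng(2)
      X = rng.standard_normal((109, 5))             # 109 nanoparticles, 5 toy descriptors
      y = X @ np.array([1.0, -0.5, 0.3, 0.0, 0.2]) + 0.3 * rng.standard_normal(109)

      scores = []
      for seed in range(5):                         # five random splits, as in the paper
          Xtr, Xva, ytr, yva = train_test_split(X, y, test_size=0.3, random_state=seed)
          model = LinearRegression().fit(Xtr, ytr)
          scores.append(r2_score(yva, model.predict(Xva)))
      print("validation r2 per split:", np.round(scores, 3))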

  17. Pattern-Based Inverse Modeling for Characterization of Subsurface Flow Models with Complex Geologic Heterogeneity

    NASA Astrophysics Data System (ADS)

    Golmohammadi, A.; Jafarpour, B.; M Khaninezhad, M. R.

    2017-12-01

    Calibration of heterogeneous subsurface flow models leads to ill-posed nonlinear inverse problems, where too many unknown parameters are estimated from limited response measurements. When the underlying parameters form complex (non-Gaussian) structured spatial connectivity patterns, classical variogram-based geostatistical techniques cannot describe the underlying connectivity patterns. Modern pattern-based geostatistical methods that incorporate higher-order spatial statistics are more suitable for describing such complex spatial patterns. Moreover, when the underlying unknown parameters are discrete (geologic facies distribution), conventional model calibration techniques that are designed for continuous parameters cannot be applied directly. In this paper, we introduce a novel pattern-based model calibration method to reconstruct discrete and spatially complex facies distributions from dynamic flow response data. To reproduce complex connectivity patterns during model calibration, we impose a feasibility constraint to ensure that the solution follows the expected higher-order spatial statistics. For model calibration, we adopt a regularized least-squares formulation, involving data mismatch, pattern connectivity, and feasibility constraint terms. Using an alternating directions optimization algorithm, the regularized objective function is divided into a continuous model calibration problem, followed by mapping the solution onto the feasible set. The feasibility constraint to honor the expected spatial statistics is implemented using a supervised machine learning algorithm. The two steps of the model calibration formulation are repeated until the convergence criterion is met. Several numerical examples are used to evaluate the performance of the developed method.
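
    A heavily simplified sketch of such an alternating scheme (ours; the supervised pattern-learning projection of the paper is replaced by plain thresholding onto two facies values, and the forward model is a random linear map).

      import numpy as np

      rng = np.random.default_rng(12)
      n, m = 100, 60
      G = rng.standard_normal((m, n))               # toy linear forward model
      true = (np.sin(np.linspace(0, 6, n)) > 0).astype(float)  # two discrete facies
      d = G @ true + 0.05 * rng.standard_normal(m)  # noisy flow-response data

      u = np.full(n, 0.5)                           # continuous parameter estimate
      lam, step = 1.0, 0.003
      for it in range(2000):
          v = (u > 0.5).astype(float)               # map onto the feasible (discrete) set
          grad = G.T @ (G @ u - d) + lam * (u - v)  # data mismatch + feasibility term
          u -= step * grad                          # continuous calibration step

      match = int((np.round(np.clip(u, 0, 1)) == true).sum())
      print("cells matching true facies:", match, "/", n)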

  18. Refined genetic algorithm -- Economic dispatch example

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sheble, G.B.; Brittig, K.

    1995-02-01

    A genetic-based algorithm is used to solve an economic dispatch (ED) problem. The algorithm utilizes payoff information of prospective solutions to evaluate optimality. Thus, the constraints of classical Lagrangian techniques on unit curves are eliminated. Using an economic dispatch problem as a basis for comparison, several different techniques which enhance program efficiency and accuracy, such as mutation prediction, elitism, interval approximation and penalty factors, are explored. Two unique genetic algorithms are also compared. The results are verified for a sample problem using a classical technique.
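
    A generic sketch of the approach (ours; cost coefficients, penalty weight, and GA settings are invented, not the paper's): a real-coded genetic algorithm with elitism, arithmetic crossover, mutation, and a power-balance penalty factor for a three-unit dispatch.

      import numpy as np

      rng = np.random.default_rng(3)
      a = np.array([0.008, 0.009, 0.007])           # quadratic cost coefficients
      b = np.array([7.0, 6.3, 6.8])                 # linear cost coefficients
      pmin, pmax, demand = 50.0, 200.0, 450.0

      def fitness(pop):
          cost = (a * pop**2 + b * pop).sum(axis=1)
          penalty = 1000.0 * np.abs(pop.sum(axis=1) - demand)  # power-balance penalty
          return cost + penalty

      pop = rng.uniform(pmin, pmax, size=(60, 3))
      for gen in range(300):
          f = fitness(pop)
          elite = pop[np.argsort(f)[:10]]           # elitism: keep the best solutions
          parents = elite[rng.integers(10, size=(50, 2))]
          w = rng.random((50, 1))
          children = w * parents[:, 0] + (1 - w) * parents[:, 1]  # arithmetic crossover
          children += rng.normal(0, 2.0, children.shape)          # mutation
          pop = np.clip(np.vstack([elite, children]), pmin, pmax)

      best = pop[np.argmin(fitness(pop))]
      print("dispatch:", np.round(best, 1), " total:", round(float(best.sum()), 1))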

  19. Robust Statistics: What They Are, and Why They Are So Important

    ERIC Educational Resources Information Center

    Corlu, Sencer M.

    2009-01-01

    The problem with "classical" statistics, all of which invoke the mean, is that these estimates are notoriously influenced by atypical scores (outliers), partly because the mean itself is differentially influenced by outliers. In theory, "modern" statistics may generate more replicable characterizations of data, because at least in some…
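
    A one-outlier example makes the point concrete (our illustration): the mean moves far more than the median or a 20% trimmed mean.

      import numpy as np
      from scipy import stats

      scores = np.array([12, 13, 14, 15, 15, 16, 17, 18, 95.0])  # 95 is an atypical score
      print("mean        :", scores.mean())          # pulled toward the outlier
      print("median      :", np.median(scores))
      print("trimmed mean:", stats.trim_mean(scores, 0.2))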

  20. Teaching Classical Statistical Mechanics: A Simulation Approach.

    ERIC Educational Resources Information Center

    Sauer, G.

    1981-01-01

    Describes a one-dimensional model for an ideal gas to study development of disordered motion in Newtonian mechanics. A Monte Carlo procedure for simulation of the statistical ensemble of an ideal gas with fixed total energy is developed. Compares both approaches for a pseudoexperimental foundation of statistical mechanics. (Author/JN)
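
    In the same spirit as the simulation approach described (though not the article's actual procedure), the toy Monte Carlo below lets ideal-gas particles exchange energy at fixed total energy; the per-particle energy distribution relaxes from perfectly ordered to the Boltzmann (exponential) form.

      import numpy as np

      rng = np.random.default_rng(4)
      N, E_total = 1000, 1000.0
      e = np.full(N, E_total / N)                   # start fully ordered: equal energies

      for _ in range(200000):
          i, j = rng.integers(N, size=2)
          pool = e[i] + e[j]
          e[i] = rng.random() * pool                # energy-conserving random exchange
          e[j] = pool - e[i]

      # compare with the Boltzmann prediction P(e) ~ exp(-e/<e>), with <e> = 1 here
      hist, edges = np.histogram(e, bins=20, range=(0, 5), density=True)
      print(np.round(hist[:5], 3), "vs", np.round(np.exp(-edges[:5]), 3))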

  1. Uterine preservation in pelvic organ prolapse using robot assisted laparoscopic sacrohysteropexy: quality of life and technique.

    PubMed

    Mourik, Sarah L; Martens, Jolise E; Aktas, Mustafa

    2012-11-01

    Measuring quality of life of women with disorders of the pelvic floor is crucial when evaluating a therapy. The aim of this study is to profile health related quality of life of women with pelvic organ prolapse who are treated with robot assisted laparoscopic sacrohysteropexy (RALS). We also compare the operative characteristics and learning curve in this study with the current literature and describe the surgical technique. A prospective cohort study in a teaching hospital in The Netherlands. Fifty women with uterovaginal prolapse were treated with RALS. This study presents the largest cohort in Europe treated by RALS to date. Quality of life was assessed pre- and post-operatively using the UDI/IIQ validated self-questionnaire designed for Dutch-speaking patients. Clinical and operative data were prospectively collected up to 29 months. RALS was performed with preservation of the uterus. Statistical analysis of categorical data was performed with the paired T-test. Descriptive statistics were computed with the use of standard methods for means, median and proportions. Before operation, overall wellbeing was scored at 67.7% and after surgery this improved to 82.1% (p=0.03). Feelings of nervousness, frustration and embarrassment reduced significantly. Sexual functioning improved, but not significantly. The mean operative time was 223 (103-340) min. Operative time decreased significantly with gained experience and became comparable to the operative time for abdominal sacrocolpopexy and classic laparoscopy. Average blood loss was less than 50 ml and patients had a mean hospital stay of 2 days. Of all women, 95.2% were very satisfied with the result after RALS. Health related quality of life improves significantly after RALS. There are high rates of patient satisfaction. RALS proves to be a safe and effective treatment of pelvic organ prolapse. Operative time is comparable to abdominal sacrocolpopexy and classic laparoscopy in the current literature. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  2. Linear and Non-linear Information Flows In Rainfall Field

    NASA Astrophysics Data System (ADS)

    Molini, A.; La Barbera, P.; Lanza, L. G.

    The rainfall process is the result of a complex framework of non-linear dynamical interactions between the different components of the atmosphere. It preserves the complexity and the intermittent features of the generating system in space and time as well as the strong dependence of these properties on the scale of observations. The understanding and quantification of how the non-linearity of the generating process comes to influence single rain events constitute relevant research issues in the field of hydro-meteorology, especially in those applications where a timely and effective forecasting of heavy rain events is able to reduce the risk of failure. This work focuses on the characterization of the non-linear properties of the observed rain process and on the influence of these features on hydrological models. Among the goals of such a survey are the search for regular structures of the rainfall phenomenon and the study of the information flows within the rain field. The research focuses on three basic evolution directions for the system: in time, in space and between the different scales. In fact, the information flows that force the system to evolve represent in general a connection between the different locations in space, the different instants in time and, unless the hypothesis of scale invariance is verified "a priori", the different characteristic scales. A first phase of the analysis is carried out by means of classic statistical methods, then a survey of the information flows within the field is developed by means of techniques borrowed from Information Theory, and finally an analysis of the rain signal in the time and frequency domains is performed, with particular reference to its intermittent structure. The methods adopted in this last part of the work are both the classic techniques of statistical inference and a few procedures for the detection of non-linear and non-stationary features within the process starting from measured data.
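
    One of the information-theoretic quantities such a survey can employ is time-lagged mutual information; the sketch below (ours, on a synthetic intermittent series, with a simple histogram estimator) illustrates the idea.

      import numpy as np

      def lagged_mi(x, lag, bins=16):
          # histogram estimate of I(x_t ; x_{t+lag}) in nats
          a, b = x[:-lag], x[lag:]
          pxy, _, _ = np.histogram2d(a, b, bins=bins)
          pxy = pxy / pxy.sum()
          px, py = pxy.sum(axis=1), pxy.sum(axis=0)
          nz = pxy > 0
          return float((pxy[nz] * np.log(pxy[nz] / np.outer(px, py)[nz])).sum())

      rng = np.random.default_rng(5)
      x = rng.gamma(0.3, size=4000)                 # intermittent, rainfall-like series
      x[1:] += 0.7 * x[:-1]                         # inject temporal dependence
      for lag in (1, 5, 20):
          print(f"lag {lag:2d}: MI = {lagged_mi(x, lag):.3f}")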

  3. Chemical information obtained from Auger depth profiles by means of advanced factor analysis (MLCFA)

    NASA Astrophysics Data System (ADS)

    De Volder, P.; Hoogewijs, R.; De Gryse, R.; Fiermans, L.; Vennik, J.

    1993-01-01

    The advanced multivariate statistical technique "maximum likelihood common factor analysis (MLCFA)" is shown to be superior to "principal component analysis (PCA)" for decomposing overlapping peaks into their individual component spectra of which neither the number of components nor the peak shape of the component spectra is known. An examination of the maximum resolving power of both techniques, MLCFA and PCA, by means of artificially created series of multicomponent spectra confirms this finding unambiguously. Substantial progress in the use of AES as a chemical-analysis technique is accomplished through the implementation of MLCFA. Chemical information from Auger depth profiles is extracted by investigating the variation of the line shape of the Auger signal as a function of the changing chemical state of the element. In particular, MLCFA combined with Auger depth profiling has been applied to problems related to steelcord-rubber tyre adhesion. MLCFA allows one to elucidate the precise nature of the interfacial layer of reaction products between natural rubber vulcanized on a thin brass layer. This study reveals many interesting chemical aspects of the oxi-sulfidation of brass undetectable with classical AES.
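
    A toy version of the comparison (ours, with synthetic overlapping Gaussian peaks; scikit-learn's maximum-likelihood FactorAnalysis stands in for MLCFA, and PCA is applied to the same series of two-component spectra).

      import numpy as np
      from sklearn.decomposition import PCA, FactorAnalysis

      rng = np.random.default_rng(13)
      e = np.linspace(0, 10, 300)                   # energy axis
      comp = np.vstack([np.exp(-(e - 4.5)**2 / 0.5),   # two overlapping line shapes
                        np.exp(-(e - 5.5)**2 / 0.5)])
      mix = rng.uniform(0, 1, size=(40, 2))         # depth-dependent concentrations
      spectra = mix @ comp + 0.02 * rng.standard_normal((40, 300))

      pca = PCA(n_components=2).fit(spectra)
      fa = FactorAnalysis(n_components=2).fit(spectra)
      print("PCA explained variance ratio:", np.round(pca.explained_variance_ratio_, 3))
      print("FA mean noise variance      :", round(float(fa.noise_variance_.mean()), 5))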

  4. Classical Electrodynamics: Lecture notes

    NASA Astrophysics Data System (ADS)

    Likharev, Konstantin K.

    2018-06-01

    Essential Advanced Physics is a series comprising four parts: Classical Mechanics, Classical Electrodynamics, Quantum Mechanics and Statistical Mechanics. Each part consists of two volumes, Lecture notes and Problems with solutions, further supplemented by an additional collection of test problems and solutions available to qualifying university instructors. This volume, Classical Electrodynamics: Lecture notes, is intended to be the basis for a two-semester graduate-level course on electricity and magnetism, including not only the interaction and dynamics of charged point particles, but also the properties of dielectric, conducting, and magnetic media. The course also covers special relativity, including its kinematics and particle-dynamics aspects, and electromagnetic radiation by relativistic particles.

  5. Perceptual basis of evolving Western musical styles

    PubMed Central

    Rodriguez Zivic, Pablo H.; Shifres, Favio; Cecchi, Guillermo A.

    2013-01-01

    The brain processes temporal statistics to predict future events and to categorize perceptual objects. These statistics, called expectancies, are found in music perception, and they span a variety of different features and time scales. Specifically, there is evidence that music perception involves strong expectancies regarding the distribution of a melodic interval, namely, the distance between two consecutive notes within the context of another. The recent availability of a large Western music dataset, consisting of the historical record condensed as melodic interval counts, has opened new possibilities for data-driven analysis of musical perception. In this context, we present an analytical approach that, based on cognitive theories of music expectation and machine learning techniques, recovers a set of factors that accurately identifies historical trends and stylistic transitions between the Baroque, Classical, Romantic, and Post-Romantic periods. We also offer a plausible musicological and cognitive interpretation of these factors, allowing us to propose them as data-driven principles of melodic expectation. PMID:23716669

  6. Notes on power of normality tests of error terms in regression models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Střelec, Luboš

    2015-03-10

    Normality is one of the basic assumptions in applying statistical procedures. For example, in linear regression most of the inferential procedures are based on the assumption of normality, i.e. the disturbance vector is assumed to be normally distributed. Failure to assess non-normality of the error terms may lead to incorrect results of usual statistical inference techniques such as the t-test or F-test. Thus, error terms should be normally distributed in order to allow us to make exact inferences. As a consequence, normally distributed stochastic errors are necessary in order to make inferences that are not misleading, which explains the necessity and importance of robust tests of normality. Therefore, the aim of this contribution is to discuss normality testing of error terms in regression models. In this contribution, we introduce the general RT class of robust tests for normality, and present and discuss the trade-off between power and robustness of selected classical and robust normality tests of error terms in regression models.
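
    The setting is easy to reproduce in miniature (our sketch, not the paper's RT tests): fit a regression with heavy-tailed errors and apply two classical normality tests to the residuals.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(6)
      x = rng.uniform(0, 10, 200)
      y = 2.0 + 0.5 * x + stats.t.rvs(df=3, size=200, random_state=7)  # heavy-tailed errors

      slope, intercept = np.polyfit(x, y, 1)
      resid = y - (slope * x + intercept)

      print("Shapiro-Wilk p:", stats.shapiro(resid).pvalue)
      print("Jarque-Bera  p:", stats.jarque_bera(resid).pvalue)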

  7. Fast Quantum Algorithm for Predicting Descriptive Statistics of Stochastic Processes

    NASA Technical Reports Server (NTRS)

Williams, Colin P.

    1999-01-01

    Stochastic processes are used as a modeling tool in several sub-fields of physics, biology, and finance. Analytic understanding of the long-term behavior of such processes is only tractable for very simple types of stochastic processes such as Markovian processes. However, in real-world applications more complex stochastic processes often arise. In physics, the complicating factor might be nonlinearities; in biology it might be memory effects; and in finance it might be the non-random intentional behavior of participants in a market. In the absence of analytic insight, one is forced to understand these more complex stochastic processes via numerical simulation techniques. In this paper we present a quantum algorithm for performing such simulations. In particular, we show how a quantum algorithm can predict arbitrary descriptive statistics (moments) of N-step stochastic processes in just O(square root of N) time. That is, the quantum complexity is the square root of the classical complexity for performing such simulations. This is a significant speedup in comparison to the current state of the art.

  8. Energy expenditure for massage therapists during performing selected classical massage techniques.

    PubMed

    Więcek, Magdalena; Szymura, Jadwiga; Maciejczyk, Marcin; Szyguła, Zbigniew; Cempla, Jerzy; Borkowski, Mateusz

    2018-04-11

    The aim of the study is to evaluate the intensity of the effort and energy expenditure in the course of performing selected classical massage techniques and to assess the workload of a massage therapist during a work shift. Thirteen massage therapists (age: 21.9±1.9 years old, body mass index: 24.5±2.8 kg×m(-2), maximal oxygen consumption per body mass (VO2max×BM(-1)): 42.3±7 ml×kg(-1)×min(-1)) were involved in the study. The stress test consisted in performing selected classical massage techniques in the following order: stroking, kneading, shaking, beating, rubbing and direct vibration, during which the cardio-respiratory responses and the subjective rating of perceived exertion (RPE) were assessed. Intensity of exercise during each massage technique was expressed as % VO2max, % maximal heart rate (HRmax) and % heart rate reserve (HRR). During each massage technique, net energy expenditure (EE) and energy cost of work using the metabolic equivalent of task (MET) were determined. The intensity of exercise was 47.2±6.2% as expressed in terms of % VO2max, and 74.7±3.2% as expressed in terms of % HRmax, while it was 47.8±1.7% on average when expressed in terms of % HRR during the whole procedure. While performing the classical massage, the average EE and MET were 5.6±0.9 kcal×min(-1) and 5.6±0.2, respectively. The average RPE calculated for the entire procedure was 12.1±1.4. During the performance of a classical massage technique for a single treatment during the study, the average total EE was 176.5±29.6 kcal, resulting in an energy expenditure of 336.2±56.4 kcal×h(-1). In the case of the classical massage technique, rubbing was the highest intensity exercise for the masseur who performed the massage (%VO2max = 57.4±13.1%, HRmax = 79.6±7.7%, HRR = 58.5±13.1%, MET = 6.7±1.1, EE = 7.1±1.4 kcal×min(-1), RPE = 13.4±1.3). In the objective assessment, physical exercise while performing a single classical massage is characterized by hard work. The technique of classical massage during which the masseur performs the highest exercise intensity is rubbing. According to the classification of work intensity based on energy expenditure, the masseur's work is considered heavy during the whole work shift. This work is available in Open Access model and licensed under a CC BY-NC 3.0 PL license.

  9. Ehrenfest dynamics is purity non-preserving: A necessary ingredient for decoherence

    DOE Office of Scientific and Technical Information (OSTI.GOV)

Alonso, J. L.

    2012-08-07

    We discuss the evolution of purity in mixed quantum/classical approaches to electronic nonadiabatic dynamics in the context of the Ehrenfest model. As it is impossible to exactly determine initial conditions for a realistic system, we choose to work in the statistical Ehrenfest formalism that we introduced in Alonso et al. [J. Phys. A: Math. Theor. 44, 396004 (2011)]. From it, we develop a new framework to determine exactly the change in the purity of the quantum subsystem along with the evolution of a statistical Ehrenfest system. In a simple case, we verify how and to which extent Ehrenfest statistical dynamics makes a system with more than one classical trajectory and an initial quantum pure state become a quantum mixed one. We prove this numerically, showing how the evolution of purity depends on time, on the dimension of the quantum state space D, and on the number of classical trajectories N of the initial distribution. The results in this work open new perspectives for studying decoherence with Ehrenfest dynamics.

  10. Generalized relative entropies in the classical limit

    NASA Astrophysics Data System (ADS)

    Kowalski, A. M.; Martin, M. T.; Plastino, A.

    2015-03-01

    Our protagonists are (i) the Cressie-Read family of divergences (characterized by the parameter γ), (ii) Tsallis' generalized relative entropies (characterized by the q one), and, as a particular instance of both, (iii) the Kullback-Leibler (KL) relative entropy. In their normalized versions, we ascertain the equivalence between (i) and (ii). Additionally, we employ these three entropic quantifiers in order to provide a statistical investigation of the classical limit of a semiclassical model, whose properties are well known from a purely dynamic viewpoint. This places us in a good position to assess the appropriateness of our statistical quantifiers for describing involved systems. We compare the behaviour of (i), (ii), and (iii) as one proceeds towards the classical limit. We determine optimal ranges for γ and/or q. It is shown that the Tsallis quantifier is better than the KL one for 1.5 < q < 2.5.
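
    The quantifiers themselves are straightforward to evaluate numerically; a hedged sketch for discrete distributions follows (conventions for the Tsallis relative entropy vary in the literature; the form below recovers KL as q -> 1).

      import numpy as np

      def kl(p, r):
          # Kullback-Leibler relative entropy, sum p log(p/r)
          return float(np.sum(p * np.log(p / r)))

      def tsallis(p, r, q):
          # one common form of Tsallis' relative entropy; -> KL as q -> 1
          return float((np.sum(p**q * r**(1.0 - q)) - 1.0) / (q - 1.0))

      p = np.array([0.5, 0.3, 0.2])
      r = np.array([0.4, 0.4, 0.2])
      print("KL            :", round(kl(p, r), 5))
      for q in (1.001, 1.5, 2.0, 2.5):
          print(f"Tsallis q={q}:", round(tsallis(p, r, q), 5))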

  11. Dynamically biased statistical model for the ortho/para conversion in the H2 + H3+ → H3+ + H2 reaction.

    PubMed

    Gómez-Carrasco, Susana; González-Sánchez, Lola; Aguado, Alfredo; Sanz-Sanz, Cristina; Zanchet, Alexandre; Roncero, Octavio

    2012-09-07

    In this work we present a dynamically biased statistical model to describe the evolution of the title reaction from statistical to a more direct mechanism, using quasi-classical trajectories (QCT). The method is based on the one previously proposed by Park and Light [J. Chem. Phys. 126, 044305 (2007)]. A recent global potential energy surface is used here to calculate the capture probabilities, instead of the long-range ion-induced dipole interactions. The dynamical constraints are introduced by considering a scrambling matrix which depends on energy and determines the probability of the identity/hop/exchange mechanisms. These probabilities are calculated using QCT. It is found that the high zero-point energy of the fragments is transferred to the rest of the degrees of freedom, which shortens the lifetime of H(5)(+) complexes and, as a consequence, the exchange mechanism is produced in lower proportion. The zero-point energy (ZPE) is not properly described in quasi-classical trajectory calculations, and an approximation is made in which the initial ZPE of the reactants is reduced in QCT calculations to obtain a new ZPE-biased scrambling matrix. This reduction of the ZPE is explained by the need to correct the pure classical level number of the H(5)(+) complex, as done in classical simulations of unimolecular processes and to get equivalent quantum and classical rate constants using Rice-Ramsperger-Kassel-Marcus theory. This matrix allows one to obtain a ratio of hop/exchange mechanisms, α(T), in rather good agreement with recent experimental results by Crabtree et al. [J. Chem. Phys. 134, 194311 (2011)] at room temperature. At lower temperatures, however, the present simulations predict ratios that are too high because the biased scrambling matrix is not statistical enough. This demonstrates the importance of applying quantum methods to simulate this reaction at the low temperatures of astrophysical interest.

  12. Dynamically biased statistical model for the ortho/para conversion in the H2 + H3+ --> H3+ + H2 reaction

    NASA Astrophysics Data System (ADS)

    Gómez-Carrasco, Susana; González-Sánchez, Lola; Aguado, Alfredo; Sanz-Sanz, Cristina; Zanchet, Alexandre; Roncero, Octavio

    2012-09-01

    In this work we present a dynamically biased statistical model to describe the evolution of the title reaction from statistical to a more direct mechanism, using quasi-classical trajectories (QCT). The method is based on the one previously proposed by Park and Light [J. Chem. Phys. 126, 044305 (2007), 10.1063/1.2430711]. A recent global potential energy surface is used here to calculate the capture probabilities, instead of the long-range ion-induced dipole interactions. The dynamical constraints are introduced by considering a scrambling matrix which depends on energy and determines the probability of the identity/hop/exchange mechanisms. These probabilities are calculated using QCT. It is found that the high zero-point energy of the fragments is transferred to the rest of the degrees of freedom, which shortens the lifetime of H_5^+ complexes and, as a consequence, the exchange mechanism is produced in lower proportion. The zero-point energy (ZPE) is not properly described in quasi-classical trajectory calculations, and an approximation is made in which the initial ZPE of the reactants is reduced in QCT calculations to obtain a new ZPE-biased scrambling matrix. This reduction of the ZPE is explained by the need to correct the pure classical level number of the H_5^+ complex, as done in classical simulations of unimolecular processes and to get equivalent quantum and classical rate constants using Rice-Ramsperger-Kassel-Marcus theory. This matrix allows one to obtain a ratio of hop/exchange mechanisms, α(T), in rather good agreement with recent experimental results by Crabtree et al. [J. Chem. Phys. 134, 194311 (2011), 10.1063/1.3587246] at room temperature. At lower temperatures, however, the present simulations predict ratios that are too high because the biased scrambling matrix is not statistical enough. This demonstrates the importance of applying quantum methods to simulate this reaction at the low temperatures of astrophysical interest.

  13. APPROACH TO EQUILIBRIUM OF A QUANTUM PLASMA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Balescu, R.

    1961-01-01

    The treatment of irreversible processes in a classical plasma (R. Balescu, Phys. Fluids 3, 62(1960)) was extended to a gas of charged particles obeying quantum statistics. The various contributions to the equation of evolution for the reduced one-particle Wigner function were written in a form analogous to the classical formalism. The summation was then performed in a straightforward manner. The resulting equation describes collisions between particles "dressed" by their polarization clouds, exactly as in the classical situation. (auth)

  14. Unbiased estimators for spatial distribution functions of classical fluids

    NASA Astrophysics Data System (ADS)

    Adib, Artur B.; Jarzynski, Christopher

    2005-01-01

    We use a statistical-mechanical identity closely related to the familiar virial theorem, to derive unbiased estimators for spatial distribution functions of classical fluids. In particular, we obtain estimators for both the fluid density ρ(r) in the vicinity of a fixed solute and the pair correlation g(r) of a homogeneous classical fluid. We illustrate the utility of our estimators with numerical examples, which reveal advantages over traditional histogram-based methods of computing such distributions.
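
    For contrast with the paper's unbiased estimators, the traditional histogram-based computation of g(r) looks as follows (our sketch, for an uncorrelated 2-D "fluid", where g(r) should come out approximately 1 within noise).

      import numpy as np

      rng = np.random.default_rng(8)
      N, Lbox = 400, 10.0
      pos = rng.uniform(0, Lbox, size=(N, 2))       # ideal-gas configuration

      d = pos[:, None, :] - pos[None, :, :]
      d -= Lbox * np.round(d / Lbox)                # minimum-image convention
      r = np.sqrt((d**2).sum(-1))[np.triu_indices(N, k=1)]

      edges = np.linspace(0.1, 4.0, 25)
      hist, _ = np.histogram(r, bins=edges)
      rho = N / Lbox**2
      shell = np.pi * (edges[1:]**2 - edges[:-1]**2)  # 2-D shell areas
      g = hist / (0.5 * N * rho * shell)              # normalize by ideal-gas pair counts
      print(np.round(g[:6], 2))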

  15. Soil genotoxicity assessment: a new strategy based on biomolecular tools and plant bioindicators.

    PubMed

    Citterio, Sandra; Aina, Roberta; Labra, Massimo; Ghiani, Alessandra; Fumagalli, Pietro; Sgorbati, Sergio; Santagostino, Angela

    2002-06-15

    The setting up of efficient early warning systems is a challenge to research for preventing environmental alteration and human disease. In this paper, we report the development and the field application of a new biomonitoring methodology for assessing soil genotoxicity. In the first part, the use of amplified fragment length polymorphism and flow cytometry techniques to detect DNA damage induced by soils artificially contaminated with heavy metals as potentially genotoxic compounds is explained. Results show that the combination of the two techniques leads to efficient detection of the sublethal genotoxic effect induced in the plant bioindicator by contaminated soil. By contrast, the classic mortality, root, and shoot growth vegetative endpoints prove inappropriate for assessing soil genotoxicity because, although they cause genotoxic damage, some heavy metals do not affect sentinel plant development negatively. The statistical elaboration of the data obtained led to the development of a statistical predictive model which differentiates four different levels of soil genotoxic pollution and can be used everywhere. The second part deals with the application of the biomonitoring protocol in the genotoxic assessment of two areas surrounding a steelworks in northern Italy and the effectiveness of this methodology. In this particular case, in these areas, the predictive model reveals a pollution level strictly correlated to the heavy metal concentrations revealed by traditional chemical analysis.

  16. Improvement in Generic Problem-Solving Abilities of Students by Use of Tutor-less Problem-Based Learning in a Large Classroom Setting

    PubMed Central

    Klegeris, Andis; Bahniwal, Manpreet; Hurren, Heather

    2013-01-01

    Problem-based learning (PBL) was originally introduced in medical education programs as a form of small-group learning, but its use has now spread to large undergraduate classrooms in various other disciplines. Introduction of new teaching techniques, including PBL-based methods, needs to be justified by demonstrating the benefits of such techniques over classical teaching styles. Previously, we demonstrated that introduction of tutor-less PBL in a large third-year biochemistry undergraduate class increased student satisfaction and attendance. The current study assessed the generic problem-solving abilities of students from the same class at the beginning and end of the term, and compared student scores with similar data obtained in three classes not using PBL. Two generic problem-solving tests of equal difficulty were administered such that students took different tests at the beginning and the end of the term. Blinded marking showed a statistically significant 13% increase in the test scores of the biochemistry students exposed to PBL, while no trend toward significant change in scores was observed in any of the control groups not using PBL. Our study is among the first to demonstrate that use of tutor-less PBL in a large classroom leads to statistically significant improvement in generic problem-solving skills of students. PMID:23463230

  17. Response surface methodology: A non-conventional statistical tool to maximize the throughput of Streptomyces species biomass and their bioactive metabolites.

    PubMed

    Latha, Selvanathan; Sivaranjani, Govindhan; Dhanasekaran, Dharumadurai

    2017-09-01

    Among diverse actinobacteria, Streptomyces is a renowned ongoing source for the production of a large number of secondary metabolites, furnishing immeasurable pharmacological and biological activities. Hence, to meet the demand of new lead compounds for human and animal use, research is constantly targeting the bioprospecting of Streptomyces. Optimization of media components and physicochemical parameters is a plausible approach for the exploration of intensified production of novel as well as existing bioactive metabolites from various microbes, which is usually achieved by a range of classical techniques including one factor at a time (OFAT). However, the major drawbacks of conventional optimization methods have directed the use of statistical optimization approaches in fermentation process development. Response surface methodology (RSM) is one of the empirical techniques extensively used for modeling, optimization and analysis of fermentation processes. To date, several researchers have implemented RSM in different bioprocess optimization accountable for the production of assorted natural substances from Streptomyces in which the results are very promising. This review summarizes some of the recent RSM adopted studies for the enhanced production of antibiotics, enzymes and probiotics using Streptomyces with the intention to highlight the significance of Streptomyces as well as RSM to the research community and industries.
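
    At its core, RSM fits a second-order polynomial to designed experiments and locates a stationary point; the sketch below (ours, with invented two-factor yield data in coded units) shows the mechanics.

      import numpy as np

      # coded factor levels (e.g., pH and temperature) and measured yields
      X1 = np.array([-1, -1, 1, 1, 0, 0, 0, -1.41, 1.41, 0, 0])
      X2 = np.array([-1, 1, -1, 1, 0, 0, 0, 0, 0, -1.41, 1.41])
      Y  = np.array([62, 65, 68, 63, 75, 76, 74, 60, 64, 61, 66.0])

      # design matrix for y = b0 + b1 x1 + b2 x2 + b11 x1^2 + b22 x2^2 + b12 x1 x2
      A = np.column_stack([np.ones_like(X1), X1, X2, X1**2, X2**2, X1 * X2])
      b, *_ = np.linalg.lstsq(A, Y, rcond=None)

      # stationary point: solve grad y = 0 for the fitted quadratic surface
      H = np.array([[2 * b[3], b[5]], [b[5], 2 * b[4]]])
      xs = np.linalg.solve(H, -b[1:3])
      print("coefficients:", np.round(b, 2))
      print("stationary point (coded units):", np.round(xs, 2))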

  18. The modified Misgav-Ladach versus the Pfannenstiel-Kerr technique for cesarean section: a randomized trial.

    PubMed

    Xavier, Pedro; Ayres-De-Campos, Diogo; Reynolds, Ana; Guimarães, Mariana; Costa-Santos, Cristina; Patrício, Belmiro

    2005-09-01

    Modifications to the classic cesarean section technique described by Pfannenstiel and Kerr have been proposed in the last few years. The objective of this trial was to compare intraoperative and short-term postoperative outcomes between the Pfannenstiel-Kerr and the modified Misgav-Ladach (MML) techniques for cesarean section. This prospective randomized trial involved 162 patients undergoing transverse lower uterine segment cesarean section. Patients were allocated to one of the two arms: 88 to the MML technique and 74 to the Pfannenstiel-Kerr technique. Main outcome measures were defined as the duration of surgery, analgesic requirements, and bowel restitution by the second postoperative day. Additional outcomes evaluated were febrile morbidity, postoperative antibiotic use, postpartum endometritis, and wound complications. Student's t, Mann-Whitney, and Chi-square tests were used for statistical analysis of the results, and a p < 0.05 was considered as the probability level reflecting significant differences. No differences between groups were noted in the incidence of analgesic requirements, bowel restitution by the second postoperative day, febrile morbidity, antibiotic requirements, endometritis, or wound complications. The MML technique took on average 12 min less to complete (p = 0.001). The MML technique is faster to perform and similar in terms of febrile morbidity, time to bowel restitution, or need for postoperative medications. It is likely to be more cost-effective.

  19. Realistic finite temperature simulations of magnetic systems using quantum statistics

    NASA Astrophysics Data System (ADS)

    Bergqvist, Lars; Bergman, Anders

    2018-01-01

    We have performed realistic atomistic simulations at finite temperatures using Monte Carlo and atomistic spin dynamics simulations incorporating quantum (Bose-Einstein) statistics. The description is much improved at low temperatures compared to classical (Boltzmann) statistics normally used in these kinds of simulations, while at higher temperatures the classical statistics are recovered. This corrected low-temperature description is reflected in both the magnetization and the magnetic specific heat, the latter allowing for improved modeling of the magnetic contribution to free energies. A central property in the method is the magnon density of states at finite temperatures, and we have compared several different implementations for obtaining it. The method has no restrictions regarding chemical and magnetic order of the considered materials. This is demonstrated by applying the method to elemental ferromagnetic systems, including Fe and Ni, as well as Fe-Co random alloys and the ferrimagnetic system GdFe3.
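
    The statistics swap at the heart of the method can be illustrated on a toy magnon density of states (our sketch, not the authors' implementation): thermal energy per mode under Bose-Einstein occupation versus the classical equipartition value kT.

      import numpy as np

      eps = np.linspace(1e-4, 0.1, 2000)            # magnon energies (eV), toy scale
      de = eps[1] - eps[0]
      g = np.sqrt(eps)                              # toy DOS ~ sqrt(E)
      g /= (g * de).sum()                           # normalize DOS to one mode
      kB = 8.617e-5                                 # Boltzmann constant in eV/K

      for T in (10, 100, 300, 1000):
          n_be = 1.0 / np.expm1(eps / (kB * T))     # Bose-Einstein occupation
          E_q = (g * eps * n_be * de).sum()         # quantum thermal energy per mode
          E_c = kB * T                              # classical equipartition: kT per mode
          print(f"T = {T:4d} K   E_quantum / E_classical = {E_q / E_c:.3f}")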

  20. [The new methods in gerontology for life expectancy prediction of the indigenous population of Yugra].

    PubMed

    Gavrilenko, T V; Es'kov, V M; Khadartsev, A A; Khimikova, O I; Sokolova, A A

    2014-01-01

    The behavior of the state vector of the human cardio-vascular system in different age groups was investigated according to the methods of the theory of chaos-self-organization and the methods of classical statistics. Observations were made on the indigenous people of the North of the Russian Federation. Using the methods of the theory of chaos-self-organization, differences in the parameters of quasi-attractors of the state vector of the cardio-vascular system of these people were shown. A comparison with the results obtained by classical statistics was made.

  1. Rapid sequence induction has no use in pediatric anesthesia.

    PubMed

    Engelhardt, Thomas

    2015-01-01

    (Classic) rapid sequence induction and intubation (RSII) has been considered fundamental to the provision of safe anesthesia. This technique consists of a combination of drugs and techniques and is intended to prevent pulmonary aspiration of gastric content, which can have catastrophic outcomes for the patient. This review investigates aspects of this technique and highlights dangers and pitfalls if it is transferred directly into pediatric anesthesia practice. The author recommends a controlled anesthesia induction by a trained pediatric anesthesiologist with suitable equipment for children considered at risk of pulmonary aspiration. RSII is a dangerous technique if adopted without modification into pediatric anesthesia and in its 'classic' form has no use. © 2014 John Wiley & Sons Ltd.

  2. Combining Feature Extraction Methods to Assist the Diagnosis of Alzheimer's Disease.

    PubMed

    Segovia, F; Górriz, J M; Ramírez, J; Phillips, C

    2016-01-01

    Neuroimaging data such as (18)F-FDG PET are widely used to assist the diagnosis of Alzheimer's disease (AD). Looking for regions with hypoperfusion/hypometabolism, clinicians may predict or corroborate the diagnosis of the patients. Modern computer aided diagnosis (CAD) systems based on the statistical analysis of whole neuroimages are more accurate than classical systems based on quantifying the uptake of some predefined regions of interest (ROIs). In addition, these new systems allow determining new ROIs and take advantage of the huge amount of information comprised in neuroimaging data. A major branch of modern CAD systems for AD is based on multivariate techniques, which analyse a neuroimage as a whole, considering not only the voxel intensities but also the relations among them. In order to deal with the vast dimensionality of the data, a number of feature extraction methods have been successfully applied. In this work, we propose a CAD system based on the combination of several feature extraction techniques. First, some commonly used feature extraction methods based on the analysis of the variance (as principal component analysis), on the factorization of the data (as non-negative matrix factorization) and on classical magnitudes (as Haralick features) were simultaneously applied to the original data. These feature sets were then combined by means of two different combination approaches: i) using a single classifier and a multiple kernel learning approach, and ii) using an ensemble of classifiers and selecting the final decision by majority voting. The proposed approach was evaluated using a labelled neuroimaging database along with a cross validation scheme. In conclusion, the proposed CAD system performed better than approaches using only one feature extraction technique. We also provide a fair comparison (using the same database) of the selected feature extraction methods.
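
    A compact sketch of the second combination approach, majority voting across classifiers trained on different feature sets (ours, on synthetic data; PCA, NMF and the raw features stand in for the feature families named above).

      import numpy as np
      from sklearn.datasets import make_classification
      from sklearn.decomposition import PCA, NMF
      from sklearn.preprocessing import FunctionTransformer
      from sklearn.svm import SVC
      from sklearn.model_selection import train_test_split

      X, y = make_classification(n_samples=300, n_features=50, random_state=0)
      X = np.abs(X)                                 # NMF requires non-negative inputs
      Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

      extractors = [PCA(n_components=10),               # variance-based
                    NMF(n_components=10, max_iter=1000),  # factorization-based
                    FunctionTransformer()]              # raw intensities as a third view

      votes = []
      for ext in extractors:
          clf = SVC().fit(ext.fit_transform(Xtr), ytr)
          votes.append(clf.predict(ext.transform(Xte)))

      pred = (np.sum(votes, axis=0) >= 2).astype(int)  # majority of the three votes
      print("ensemble accuracy:", round(float((pred == yte).mean()), 3))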

  3. The Multiphoton Interaction of Lambda Model Atom and Two-Mode Fields

    NASA Technical Reports Server (NTRS)

    Liu, Tang-Kun

    1996-01-01

    The system of two-mode fields interacting with an atom by means of multiphotons is addressed, and the non-classical statistical properties of the interacting two-mode fields are discussed. Through mathematical calculation, some new rules for the non-classical effects of two-mode fields, which evolve with time, are established.

  4. A Review of Classical Methods of Item Analysis.

    ERIC Educational Resources Information Center

    French, Christine L.

    Item analysis is a very important consideration in the test development process. It is a statistical procedure to analyze test items that combines methods used to evaluate the important characteristics of test items, such as difficulty, discrimination, and distractibility of the items in a test. This paper reviews some of the classical methods for…

  5. Non-classical State via Superposition of Two Opposite Coherent States

    NASA Astrophysics Data System (ADS)

    Ren, Gang; Du, Jian-ming; Yu, Hai-jun

    2018-04-01

    We study the non-classical properties of the states generated by superpositions of two opposite coherent states with arbitrary relative phase factors. We show that the relative phase factors play an important role in these superpositions. We demonstrate this result by discussing their squeezing properties, quantum statistical properties and fidelity in principle.

  6. Detection of low-contrast images in film-grain noise.

    PubMed

    Naderi, F; Sawchuk, A A

    1978-09-15

    When low-contrast photographic images are digitized by a very small aperture, extreme film-grain noise almost completely obliterates the image information. Using a large aperture to average out the noise destroys the fine details of the image. In these situations conventional statistical restoration techniques have little effect, and well-chosen heuristic algorithms have yielded better results. In this paper we analyze the noise-cheating algorithm of Zweig et al. [J. Opt. Soc. Am. 65, 1347 (1975)] and show that it can be justified by classical maximum-likelihood detection theory. A more general algorithm applicable to a broader class of images is then developed by considering the signal-dependent nature of film-grain noise. Finally, a Bayesian detection algorithm with improved performance is presented.

  7. For a statistical interpretation of Helmholtz' thermal displacement

    NASA Astrophysics Data System (ADS)

    Podio-Guidugli, Paolo

    2016-11-01

    Starting from the classic papers by Einstein and Langevin on Brownian motion, two consistent statistical interpretations are given for the thermal displacement, a scalar field formally introduced by Helmholtz, whose time derivative is by definition the absolute temperature.

  8. New technologies in treatment of atrial fibrillation in cardiosurgical patients

    NASA Astrophysics Data System (ADS)

    Evtushenko, A. V.; Evtushenko, V. V.; Bykov, A. N.; Sergeev, V. S.; Syryamkin, V. I.; Kistenev, Yu. V.; Anfinogenova, Ya. D.; Smyshlyaev, K. A.; Kurlov, I. O.

    2015-11-01

    The article is devoted to the evaluation of the results of clinical application of penetrating radiofrequency ablation techniques on the atrial myocardium. A total of 241 patients with valvular heart disease or coronary heart disease complicated by atrial fibrillation were operated on. All operations were performed under cardiopulmonary bypass and cardioplegia. The main group consisted of 141 patients who were operated on using the penetrating radiofrequency technique. The control group consisted of 100 patients who underwent surgery with the use of the "classical" monopolar RF-ablation technique. The two groups did not differ significantly on any measure before surgery. Patients with previous heart surgery were excluded during the selection of candidates for the procedure, because pericardial adhesions do not allow visualization of the left atrium sufficient to perform this procedure. The penetrating technique has significantly higher efficiency compared to the "classical" technique in the early and long-term postoperative periods. In the early postoperative period its efficiency is 93%, and in the long term it is 88%. The efficacy of the "classical" monopolar procedure is lower: 86% and 68%, respectively.

  9. Algebraic techniques for diagonalization of a split quaternion matrix in split quaternionic mechanics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

Jiang, Tongsong; Jiang, Ziwu

    In the study of the relation between complexified classical and non-Hermitian quantum mechanics, physicists found that there are links to quaternionic and split quaternionic mechanics, and this leads to the possibility of employing algebraic techniques of split quaternions to tackle some problems in complexified classical and quantum mechanics. This paper, by means of real representation of a split quaternion matrix, studies the problem of diagonalization of a split quaternion matrix and gives algebraic techniques for diagonalization of split quaternion matrices in split quaternionic mechanics.
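
    One concrete real representation (our construction for illustration; the paper's may differ) maps 1, i, j, k to 2x2 real matrices satisfying i^2 = -1, j^2 = k^2 = +1 and ij = k, and extends blockwise to matrices via Kronecker products, so a split quaternion matrix can be diagonalized through its real image.

      import numpy as np

      I2 = np.eye(2)
      i_m = np.array([[0.0, 1.0], [-1.0, 0.0]])   # i^2 = -1
      j_m = np.array([[1.0, 0.0], [0.0, -1.0]])   # j^2 = +1
      k_m = i_m @ j_m                              # k = ij, k^2 = +1

      def real_rep(A, B, C, D):
          """Real image of the split quaternion matrix A + Bi + Cj + Dk."""
          return (np.kron(A, I2) + np.kron(B, i_m) +
                  np.kron(C, j_m) + np.kron(D, k_m))

      rng = np.random.default_rng(9)
      A, B, C, D = (rng.standard_normal((2, 2)) for _ in range(4))
      A2, B2, C2, D2 = (rng.standard_normal((2, 2)) for _ in range(4))

      # homomorphism check: rep(M1) rep(M2) equals rep(M1 M2), where M1 M2 is
      # expanded with the split quaternion relations i^2=-1, j^2=k^2=1, ij=k=-ji
      lhs = real_rep(A, B, C, D) @ real_rep(A2, B2, C2, D2)
      A12 = A @ A2 - B @ B2 + C @ C2 + D @ D2
      B12 = A @ B2 + B @ A2 - C @ D2 + D @ C2
      C12 = A @ C2 + C @ A2 - B @ D2 + D @ B2
      D12 = A @ D2 + D @ A2 + B @ C2 - C @ B2
      print(np.allclose(lhs, real_rep(A12, B12, C12, D12)))

      # eigenvalues of the split quaternion matrix, read off from its real image
      print(np.linalg.eigvals(real_rep(A, B, C, D)))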

  10. Bayes and the Law

    PubMed Central

    Fenton, Norman; Neil, Martin; Berger, Daniel

    2016-01-01

    Although the last forty years has seen considerable growth in the use of statistics in legal proceedings, it is primarily classical statistical methods rather than Bayesian methods that have been used. Yet the Bayesian approach avoids many of the problems of classical statistics and is also well suited to a broader range of problems. This paper reviews the potential and actual use of Bayes in the law and explains the main reasons for its lack of impact on legal practice. These include misconceptions by the legal community about Bayes’ theorem, over-reliance on the use of the likelihood ratio and the lack of adoption of modern computational methods. We argue that Bayesian Networks (BNs), which automatically produce the necessary Bayesian calculations, provide an opportunity to address most concerns about using Bayes in the law. PMID:27398389

  11. Bayes and the Law.

    PubMed

    Fenton, Norman; Neil, Martin; Berger, Daniel

    2016-06-01

    Although the last forty years has seen considerable growth in the use of statistics in legal proceedings, it is primarily classical statistical methods rather than Bayesian methods that have been used. Yet the Bayesian approach avoids many of the problems of classical statistics and is also well suited to a broader range of problems. This paper reviews the potential and actual use of Bayes in the law and explains the main reasons for its lack of impact on legal practice. These include misconceptions by the legal community about Bayes' theorem, over-reliance on the use of the likelihood ratio and the lack of adoption of modern computational methods. We argue that Bayesian Networks (BNs), which automatically produce the necessary Bayesian calculations, provide an opportunity to address most concerns about using Bayes in the law.
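
    The core calculation the paper wants handled correctly is a one-liner (our toy numbers): posterior odds are prior odds times the likelihood ratio, so even a very large likelihood ratio does not by itself yield near-certain guilt.

      # prior odds: e.g., one plausible suspect among 10,000 people
      prior_odds = 1 / 10000
      lr = 1_000_000                    # forensic match probability of 1 in a million
      posterior_odds = prior_odds * lr
      p = posterior_odds / (1 + posterior_odds)
      print(f"posterior probability = {p:.3f}")   # ~0.990, not 1 - 1e-6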

  12. Composite aortic root replacement using the classic or modified Cabrol coronary artery implantation technique.

    PubMed

    Garlicki, Miroslaw; Roguski, K; Puchniewicz, M; Ehrlich, Marek P

    2006-08-01

    We report in this study our results with composite aortic root replacement (CVR) using the classic or modified Cabrol coronary implantation technique. From October 2001 to March 2005, 25 patients underwent aortic root replacement. In all cases, the indication for surgery was a degenerative aneurysm with a diameter of more than 6 cm. Seven patients had undergone a previous aortic operation on the ascending aorta. Mean age was 53+/-13 years and 22 patients were male. Mean Euroscore was 5.2+/-2.4. Aortic insufficiency was present in all patients. Two patients had Marfan syndrome. The 30-day mortality was 0%. Two patients required profound hypothermic circulatory arrest. Mean aortic cross-clamp time was 91+/-24 minutes and the mean circulatory arrest time was 24+/-15 minutes. No patients developed a pseudoaneurysm after the operation. We conclude that composite aortic root replacement with the classic or modified Cabrol technique results in a low operative mortality. However, it should be only used when a "button" technique is not feasible.

  13. Overview Experimental Diagnostics for Rarefied Flows - Selected Topics

    DTIC Science & Technology

    2011-01-01

    Overview of experimental diagnostics for rarefied flows occurring, e.g., in electrical thrusters or plasma wind tunnels. Classical intrusive techniques like Pitot, heat flux, and enthalpy probes, developed and applied at the IRS, are especially designed for the characterisation of flows produced by electrical thrusters and within plasma wind tunnels.

  14. Uniform quantized electron gas

    NASA Astrophysics Data System (ADS)

    Høye, Johan S.; Lomba, Enrique

    2016-10-01

    In this work we study the correlation energy of the quantized electron gas of uniform density at temperature T = 0. To do so we utilize methods from classical statistical mechanics. The basis for this is the Feynman path integral for the partition function of quantized systems. With this representation the quantum mechanical problem can be interpreted as, and is equivalent to, a classical polymer problem in four dimensions where the fourth dimension is imaginary time. Thus methods, results, and properties obtained in the statistical mechanics of classical fluids can be utilized. From this viewpoint we recover the well known RPA (random phase approximation). Then to improve it we modify the RPA by requiring the corresponding correlation function to be such that electrons with equal spins can not be on the same position. Numerical evaluations are compared with well known results of a standard parameterization of Monte Carlo correlation energies.

  15. Classical Electrodynamics: Problems with solutions

    NASA Astrophysics Data System (ADS)

    Likharev, Konstantin K.

    2018-06-01

    Essential Advanced Physics is a series comprising four parts: Classical Mechanics, Classical Electrodynamics, Quantum Mechanics and Statistical Mechanics. Each part consists of two volumes, Lecture notes and Problems with solutions, further supplemented by an additional collection of test problems and solutions available to qualifying university instructors. This volume, Classical Electrodynamics: Problems with solutions, accompanies the two-semester graduate-level course on electricity and magnetism, including not only the interaction and dynamics of charged point particles, but also the properties of dielectric, conducting, and magnetic media. The course also covers special relativity, including its kinematics and particle-dynamics aspects, and electromagnetic radiation by relativistic particles.

  16. Evaluation of anterior knee pain in a PS total knee arthroplasty: the role of patella-friendly femoral component and patellar size.

    PubMed

Atzori, F; Sabatini, L; Deledda, D; Schirò, M; Lo Baido, R; Massè, A

    2015-04-01

    Total knee arthroplasty gives excellent objective results. Nevertheless, the subjective findings do not match normal knee perception; often, this depends on the onset of patellar pain. In this study, we analyzed clinical and radiological items that can affect resurfaced patellar tracking, and the role of a patella-friendly femoral component and patellar size in the onset of patellar pain. Thirty consecutive patients were implanted using the same cemented posterior-stabilized TKA associated with patella resurfacing. Fifteen patients were implanted using a classical femoral component, while another 15 patients were implanted using a patella-friendly femoral component. The statistical analysis was set to detect a significant difference (p < 0.05) in clinical and radiological outcomes related to several surgical parameters. Clinical and functional outcomes were recorded using the Knee Society Scoring System (KSS) and patellar pain with the Burnett questionnaire. Mean follow-up was 25 months. KSS results were excellent in both groups. Group 2 (patella-friendly femoral model) reached a higher percentage of 100 points in the clinical and functional KSS, but there was no statistical difference. Also, no statistical differences for Burnett questionnaire results were recorded. We had one case of patellar clunk syndrome in the standard femoral component group and one poor result in the second group. Postoperative radiographic measurements evidenced no statistical differences in either group. In group 1 (classical femoral component), a significantly better result (p < 0.05) was recorded at clinical evaluation according to the Knee Society Scoring System (KSS) when a wider patellar component was resurfaced. The present study reveals no statistically significant difference in the incidence of anterior knee pain between classical and "patella-friendly" femoral components. With the particular type of implant design utilized in this study, when the classical femoral component is used, bigger patellar implant sizes (38 and 41 mm) showed superior clinical outcome.

  17. Effect of Turkish classical music on blood pressure: a randomized controlled trial in hypertensive elderly patients.

    PubMed

    Bekiroğlu, Tansel; Ovayolu, Nimet; Ergün, Yusuf; Ekerbiçer, Hasan Çetin

    2013-06-01

    Existing studies suggest that music therapy can have favorable effects on hypertension and anxiety. We therefore set out to investigate whether Turkish classical music has positive effects on blood pressure and anxiety levels in hypertensive elderly patients. This was a randomized controlled trial performed on 60 hypertensive patients living in a local elderly home in Adana, Turkey. Following the completion of a socio-demographic form for each patient, the Hamilton Anxiety Scale (HAM-A) was applied. Thereafter, the subjects were randomly divided into two equal-size groups and were allowed to either listen to Turkish classical music (music therapy group) or have a resting period (control group) for 25 min. The primary and secondary outcome measures were blood pressure and HAM-A scores, respectively. The mean reduction in systolic blood pressure was 13.00 mmHg in the music therapy group and 6.50 mmHg in the control group. The baseline-adjusted between-group difference was not statistically significant (95% CI 6.80-9.36). The median reductions in diastolic blood pressure were 10 mmHg in both the music therapy and control groups. The between-group difference was not statistically significant (Mann-Whitney U test, P = 0.839). The mean reduction in HAM-A was 1.63 in the music therapy group and 0.77 in the control group. The baseline-adjusted between-group difference was not statistically significant (95% CI 0.82-1.92). The study demonstrated that both Turkish classical music and resting alone have positive effects on blood pressure in patients with hypertension. Copyright © 2013 Elsevier Ltd. All rights reserved.

  18. Efficient Transition Probability Computation for Continuous-Time Branching Processes via Compressed Sensing.

    PubMed

    Xu, Jason; Minin, Vladimir N

    2015-07-01

    Branching processes are a class of continuous-time Markov chains (CTMCs) with ubiquitous applications. A general difficulty in statistical inference under partially observed CTMC models arises in computing transition probabilities when the discrete state space is large or uncountable. Classical methods such as matrix exponentiation are infeasible for large or countably infinite state spaces, and sampling-based alternatives are computationally intensive, requiring integration over all possible hidden events. Recent work has successfully applied generating function techniques to computing transition probabilities for linear multi-type branching processes. While these techniques often require significantly fewer computations than matrix exponentiation, they also become prohibitive in applications with large populations. We propose a compressed sensing framework that significantly accelerates the generating function method, decreasing computational cost up to a logarithmic factor by only assuming the probability mass of transitions is sparse. We demonstrate accurate and efficient transition probability computations in branching process models for blood cell formation and evolution of self-replicating transposable elements in bacterial genomes.
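
    As a minimal sketch of the generating-function side of this approach (dense inversion only; the paper's compressed-sensing acceleration, which recovers a sparse probability vector from far fewer evaluations, is not reproduced), one can recover transition probabilities of a linear birth-death process from its closed-form probability generating function by evaluating it at roots of unity and applying an FFT. The rates lam and mu and the grid size m are illustrative, and lam != mu is assumed:

      import numpy as np

      def birth_death_pgf(s, t, lam, mu):
          # closed-form PGF E[s^X(t) | X(0) = 1] of a linear birth-death process
          E = np.exp((lam - mu) * t)
          alpha = mu * (E - 1) / (lam * E - mu)    # extinction probability
          beta = lam * (E - 1) / (lam * E - mu)
          return (alpha + (1 - alpha - beta) * s) / (1 - beta * s)

      def transition_probs(n0, t, lam, mu, m=1024):
          # [G(s,t)]**n0 is the PGF started from n0 particles; evaluate it at the
          # m-th roots of unity and invert by FFT (aliasing is negligible as long
          # as the process stays well below m particles)
          s = np.exp(2j * np.pi * np.arange(m) / m)
          vals = birth_death_pgf(s, t, lam, mu) ** n0
          return np.clip(np.fft.fft(vals).real / m, 0.0, 1.0)

      p = transition_probs(n0=10, t=1.0, lam=0.5, mu=0.3)
      print(p[:15].round(4), p.sum())   # P(X(1) = k | X(0) = 10) for k = 0..14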

  19. Path-integral simulation of solids.

    PubMed

    Herrero, C P; Ramírez, R

    2014-06-11

    The path-integral formulation of the statistical mechanics of quantum many-body systems is described, with the purpose of introducing practical techniques for the simulation of solids. Monte Carlo and molecular dynamics methods for distinguishable quantum particles are presented, with particular attention to the isothermal-isobaric ensemble. Applications of these computational techniques to different types of solids are reviewed, including noble-gas solids (helium and heavier elements), group-IV materials (diamond and elemental semiconductors), and molecular solids (with emphasis on hydrogen and ice). Structural, vibrational, and thermodynamic properties of these materials are discussed. Applications also include point defects in solids (structure and diffusion), as well as nuclear quantum effects in solid surfaces and adsorbates. Different phenomena are discussed, such as solid-to-solid and orientational phase transitions, rates of quantum processes, classical-to-quantum crossover, and various finite-temperature anharmonic effects (thermal expansion, isotopic effects, electron-phonon interactions). Nuclear quantum effects are most remarkable in the presence of light atoms, so special emphasis is placed on solids containing hydrogen as a constituent element or as an impurity.
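
    A minimal sketch of the primitive-approximation path-integral Monte Carlo such reviews introduce, reduced here to the simplest test case: a single 1D harmonic degree of freedom with m = hbar = k_B = 1 (real simulations of solids add many particles, staging or normal-mode moves, and the isothermal-isobaric ensemble; all parameters below are illustrative):

      import numpy as np

      rng = np.random.default_rng(0)
      P, beta = 32, 10.0          # beads (imaginary-time slices), inverse temperature
      tau = beta / P              # imaginary-time step
      x = np.zeros(P)             # ring-polymer bead positions
      step, n_sweeps, n_burn = 0.5, 20000, 2000
      samples = []

      def local_action(x, k):
          # terms of the discretized action involving bead k; V(x) = x**2 / 2
          left, right = x[(k - 1) % P], x[(k + 1) % P]
          spring = ((x[k] - left) ** 2 + (x[k] - right) ** 2) / (2.0 * tau)
          return spring + tau * 0.5 * x[k] ** 2

      for sweep in range(n_sweeps):
          for k in range(P):                    # Metropolis single-bead moves
              old, s_old = x[k], local_action(x, k)
              x[k] = old + step * rng.uniform(-1.0, 1.0)
              if rng.random() >= np.exp(s_old - local_action(x, k)):
                  x[k] = old                    # reject the move
          if sweep >= n_burn:
              samples.append(np.mean(x ** 2))

      # quantum <x^2> = coth(beta/2)/2 ~ 0.5 here; the classical value is 1/beta = 0.1,
      # so the simulation exhibits the nuclear quantum effect directly
      print(np.mean(samples))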

  20. Structure–property relationships in atomic-scale junctions: Histograms and beyond

    DOE PAGES

    Mark S. Hybertsen; Venkataraman, Latha

    2016-03-03

    Over the past 10 years, there has been tremendous progress in the measurement, modeling and understanding of structure–function relationships in single molecule junctions. Numerous research groups have addressed significant scientific questions, directed both to conductance phenomena at the single molecule level and to the fundamental chemistry that controls junction functionality. Many different functionalities have been demonstrated, including single-molecule diodes, optically and mechanically activated switches, and, significantly, physical phenomena with no classical analogues, such as those based on quantum interference effects. Experimental techniques for reliable and reproducible single molecule junction formation and characterization have led to this progress. In particular, the scanning tunneling microscope based break-junction (STM-BJ) technique has enabled rapid, sequential measurement of large numbers of nanoscale junctions allowing a statistical analysis to readily distinguish reproducible characteristics. Furthermore, harnessing fundamental link chemistry has provided the necessary chemical control over junction formation, enabling measurements that revealed clear relationships between molecular structure and conductance characteristics.

  2. Efficient Transition Probability Computation for Continuous-Time Branching Processes via Compressed Sensing

    PubMed Central

    Xu, Jason; Minin, Vladimir N.

    2016-01-01

    Branching processes are a class of continuous-time Markov chains (CTMCs) with ubiquitous applications. A general difficulty in statistical inference under partially observed CTMC models arises in computing transition probabilities when the discrete state space is large or uncountable. Classical methods such as matrix exponentiation are infeasible for large or countably infinite state spaces, and sampling-based alternatives are computationally intensive, requiring integration over all possible hidden events. Recent work has successfully applied generating function techniques to computing transition probabilities for linear multi-type branching processes. While these techniques often require significantly fewer computations than matrix exponentiation, they also become prohibitive in applications with large populations. We propose a compressed sensing framework that significantly accelerates the generating function method, decreasing computational cost up to a logarithmic factor by only assuming the probability mass of transitions is sparse. We demonstrate accurate and efficient transition probability computations in branching process models for blood cell formation and evolution of self-replicating transposable elements in bacterial genomes. PMID:26949377

  3. Observational study of differences in head position for high notes in famous classical and non-classical male singers.

    PubMed

    Amarante Andrade, Pedro; Švec, Jan G

    2016-07-01

    Differences in classical and non-classical singing are due primarily to aesthetic style requirements. The head position can affect the sound quality. This study aimed at comparing the head position for famous classical and non-classical male singers performing high notes. Images of 39 Western classical and 34 non-classical male singers during live performances were obtained from YouTube. Ten raters evaluated the frontal rotational head position (depression versus elevation) and transverse head position (retraction versus protraction) visually using a visual analogue scale. The results showed a significant difference for frontal rotational head position. Most non-classical singers in the sample elevated their heads for high notes while the classical singers were observed to keep it around the neutral position. This difference may be attributed to different singing techniques and phonatory system adjustments utilized by each group.

  4. Splitting livers: Trans-hilar or trans-umbilical division? Technical aspects and comparative outcomes.

    PubMed

    de Ville de Goyet, J; di Francesco, F; Sottani, V; Grimaldi, C; Tozzi, A E; Monti, L; Muiesan, P

    2015-08-01

    Controversy remains about the best line of division for liver splitting, through Segment IV or through the umbilical fissure. Both techniques are currently used, with the choice varying between surgical teams in the absence of an evidence-based choice. We conducted a single-center retrospective analysis of 47 left split liver grafts that were procured with two different division techniques: "classical" (N = 28, Group A) or through the umbilical fissure and plate (N = 19, Group B). The allocation of recipients to each group was at random; a single transplant team performed all transplantations. Demographics, characteristics, technical aspects, and outcomes were similar in both groups. The grafts in Group A, prepared with the classical technique, were procured more often with a single bile duct (BD) orifice compared with the grafts in Group B; however, this was not associated with a higher incidence of biliary problems in this series of transplants (96% actual graft survival rate [median ± s.d. 26 ± 20 months]). Both techniques provide good quality split grafts and an excellent outcome; surgical expertise with a given technique is more relevant than the technique itself. The classical technique, however, seems to be more flexible in various ways, and surgeons may find it to be preferable. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  5. Making Classical Conditioning Understandable through a Demonstration Technique.

    ERIC Educational Resources Information Center

    Gibb, Gerald D.

    1983-01-01

    One lemon, an assortment of other fruits and vegetables, a tennis ball, and a Galvanic Skin Response meter are needed to implement this approach to teaching about classical conditioning in introductory psychology courses. (RM)

  6. A brief overview of current relationships of geography, statistics, and taxonomy with the classical integrated control concept

    USDA-ARS?s Scientific Manuscript database

    A classic paper on the integrated control concept appeared in the latter part of the 1950s, led by Vernon Stern, Ray Smith, Robert van den Bosch, and Kenneth Hagen. Numerous concepts and definitions were formulated at that time. In this presentation, a short philosophical summary will be presented...

  7. An approach for the assessment of the statistical aspects of the SEA coupling loss factors and the vibrational energy transmission in complex aircraft structures: Experimental investigation and methods benchmark

    NASA Astrophysics Data System (ADS)

    Bouhaj, M.; von Estorff, O.; Peiffer, A.

    2017-09-01

    In the application of Statistical Energy Analysis (SEA) to complex assembled structures, a purely predictive model often exhibits errors. These errors are mainly due to a lack of accurate modelling of the power transmission mechanism described through the Coupling Loss Factors (CLF). Experimental SEA (ESEA) is used in practice by the automotive and aerospace industry to verify and update the model or to derive the CLFs for use in an SEA predictive model when analytical estimates cannot be made. This work is particularly motivated by the lack of procedures that allow an estimate to be made of the variance and confidence intervals of the statistical quantities when using the ESEA technique. The aim of this paper is to introduce procedures enabling a statistical description of measured power input, vibration energies and the derived SEA parameters. Particular emphasis is placed on the identification of structural CLFs of complex built-up structures, comparing different methods. By adopting a Stochastic Energy Model (SEM), the ensemble average in ESEA is also addressed. For this purpose, expressions are obtained to randomly perturb the energy matrix elements and generate individual samples for the Monte Carlo (MC) technique applied to derive the ensemble-averaged CLF. From results of ESEA tests conducted on an aircraft fuselage section, the SEM approach provides better-performing estimates of the CLFs than classical matrix inversion methods. The expected range of CLF values and the synthesized energy are used as quality criteria of the matrix inversion, allowing one to assess critical SEA subsystems, which might require a more refined statistical description of the excitation and the response fields. Moreover, the impact of the variance of the normalized vibration energy on the uncertainty of the derived CLFs is outlined.
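
    A toy numerical sketch of the classical ESEA inversion and of the Monte Carlo perturbation idea described above (two subsystems only; the energy matrix, input powers, lognormal noise model and 10% relative uncertainty are illustrative assumptions, not the paper's measured data or its SEM-derived perturbation scheme):

      import numpy as np

      rng = np.random.default_rng(1)
      omega = 2 * np.pi * 1000.0              # band centre frequency [rad/s]
      # E[i, j] = energy of subsystem i when subsystem j is excited [J]
      E = np.array([[1.0e-3, 2.0e-5],
                    [3.0e-5, 8.0e-4]])
      P = np.diag([1.0, 1.0])                 # measured input powers [W]

      def esea_inversion(E):
          # classical ESEA: omega * L * E = P  =>  L = P @ inv(E) / omega;
          # coupling loss factors sit (up to sign convention) in the off-diagonals
          return P @ np.linalg.inv(E) / omega

      rel_sigma = 0.10                        # assumed relative uncertainty of E
      L_samples = np.array([
          esea_inversion(E * rng.lognormal(0.0, rel_sigma, E.shape))
          for _ in range(5000)
      ])
      clf = np.abs(L_samples[:, 0, 1])        # one coupling loss factor, |L_12|
      print(f"CLF estimate: {clf.mean():.3e}, std: {clf.std():.3e}")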

  8. Ghost imaging of phase objects with classical incoherent light

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shirai, Tomohiro; Setaelae, Tero; Friberg, Ari T.

    2011-10-15

    We describe an optical setup for performing spatial Fourier filtering in ghost imaging with classical incoherent light. This is achieved by a modification of the conventional geometry for lensless ghost imaging. It is shown on the basis of classical coherence theory that with this technique one can realize what we call phase-contrast ghost imaging to visualize pure phase objects.

  9. MSUSTAT.

    ERIC Educational Resources Information Center

    Mauriello, David

    1984-01-01

    Reviews an interactive statistical analysis package (designed to run on 8- and 16-bit machines that utilize CP/M 80 and MS-DOS operating systems), considering its features and uses, documentation, operation, and performance. The package consists of 40 general purpose statistical procedures derived from the classic textbook "Statistical…

  10. Pathways to dewetting in hydrophobic confinement.

    PubMed

    Remsing, Richard C; Xi, Erte; Vembanur, Srivathsan; Sharma, Sumit; Debenedetti, Pablo G; Garde, Shekhar; Patel, Amish J

    2015-07-07

    Liquid water can become metastable with respect to its vapor in hydrophobic confinement. The resulting dewetting transitions are often impeded by large kinetic barriers. According to macroscopic theory, such barriers arise from the free energy required to nucleate a critical vapor tube that spans the region between two hydrophobic surfaces--tubes with smaller radii collapse, whereas larger ones grow to dry the entire confined region. Using extensive molecular simulations of water between two nanoscopic hydrophobic surfaces, in conjunction with advanced sampling techniques, here we show that for intersurface separations that thermodynamically favor dewetting, the barrier to dewetting does not correspond to the formation of a (classical) critical vapor tube. Instead, it corresponds to an abrupt transition from an isolated cavity adjacent to one of the confining surfaces to a gap-spanning vapor tube that is already larger than the critical vapor tube anticipated by macroscopic theory. Correspondingly, the barrier to dewetting is also smaller than the classical expectation. We show that the peculiar nature of water density fluctuations adjacent to extended hydrophobic surfaces--namely, the enhanced likelihood of observing low-density fluctuations relative to Gaussian statistics--facilitates this nonclassical behavior. By stabilizing isolated cavities relative to vapor tubes, enhanced water density fluctuations thus stabilize novel pathways, which circumvent the classical barriers and offer diminished resistance to dewetting. Our results thus suggest a key role for fluctuations in speeding up the kinetics of numerous phenomena ranging from Cassie-Wenzel transitions on superhydrophobic surfaces, to hydrophobically driven biomolecular folding and assembly.

  11. Quantum enhanced superresolution microscopy (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Oron, Dan; Tenne, Ron; Israel, Yonatan; Silberberg, Yaron

    2017-02-01

    Far-field optical microscopy beyond the Abbe diffraction limit, making use of nonlinear excitation (e.g. STED), or temporal fluctuations in fluorescence (PALM, STORM, SOFI) is already a reality. In contrast, overcoming the diffraction limit using non-classical properties of light is very difficult to achieve due to the fragility of quantum states of light. Here, we experimentally demonstrate superresolution microscopy based on quantum properties of light naturally emitted by fluorophores used as markers in fluorescence microscopy. Our approach is based on photon antibunching, the tendency of fluorophores to emit photons one by one rather than in bursts. Although a distinctively quantum phenomenon, antibunching is readily observed in most common fluorophores even at room temperature. This nonclassical resource can be utilized directly to enhance the imaging resolution, since the non-classical far-field intensity correlations induced by antibunching carry high spatial frequency information on the spatial distribution of emitters. Detecting photon statistics simultaneously in the entire field of view, we were able to detect non-classical correlations of the second and third order, and reconstructed images with resolution significantly beyond the diffraction limit. Alternatively, we demonstrate the utilization of antibunching for augmenting the capabilities of localization-based superresolution imaging in the presence of multiple emitters, using a novel detector comprised of an array of single photon detectors connected to a densely packed fiber bundle. These features allow us to enhance the spatial and temporal resolution with which multiple emitters can be imaged compared with other techniques that rely on CCD cameras.

  12. The Concept of Command Leadership in the Military Classics: Ardant du Picq and Foch.

    DTIC Science & Technology

    1986-04-01

    counseling techniques. These lessons are then linked to some form of socio-psychological model designed to provide the officer with a list of leadership...the military has substituted contemporary quasi-psychology and business leadership models for the classical combat models" (12:1). In response to this...leadership, the military seems to concentrate far more heavily on socio-psychological factors and modern managerial techniques than on the traits and

  13. Photoacoustic discrimination of vascular and pigmented lesions using classical and Bayesian methods

    NASA Astrophysics Data System (ADS)

    Swearingen, Jennifer A.; Holan, Scott H.; Feldman, Mary M.; Viator, John A.

    2010-01-01

    Discrimination of pigmented and vascular lesions in skin can be difficult due to factors such as size, subungual location, and the nature of lesions containing both melanin and vascularity. Misdiagnosis may lead to precancerous or cancerous lesions not receiving proper medical care. To aid in the rapid and accurate diagnosis of such pathologies, we develop a photoacoustic system to determine the nature of skin lesions in vivo. By irradiating skin with two laser wavelengths, 422 and 530 nm, we induce photoacoustic responses, and the relative response at these two wavelengths indicates whether the lesion is pigmented or vascular. This response is due to the distinct absorption spectrum of melanin and hemoglobin. In particular, pigmented lesions have ratios of photoacoustic amplitudes of approximately 1.4 to 1 at the two wavelengths, while vascular lesions have ratios of about 4.0 to 1. Furthermore, we consider two statistical methods for conducting classification of lesions: standard multivariate analysis classification techniques and a Bayesian-model-based approach. We study 15 human subjects with eight vascular and seven pigmented lesions. Using the classical method, we achieve a perfect classification rate, while the Bayesian approach has an error rate of 20%.
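
    A minimal sketch of a classifier in the spirit of the "classical" analysis described above (the ratio values are synthetic, loosely matching the quoted 1.4:1 and 4.0:1 figures; the study's actual multivariate and Bayesian classifiers operate on richer features than this toy univariate model):

      import numpy as np

      # synthetic 422/530 nm photoacoustic amplitude ratios (illustrative values)
      pigmented = np.array([1.3, 1.5, 1.4, 1.2, 1.6, 1.45, 1.35])
      vascular = np.array([3.8, 4.2, 4.0, 3.5, 4.4, 3.9, 4.1, 4.3])

      # fit a 1D Gaussian to the log-ratio of each lesion class
      params = {"pigmented": (np.log(pigmented).mean(), np.log(pigmented).std(ddof=1)),
                "vascular": (np.log(vascular).mean(), np.log(vascular).std(ddof=1))}

      def classify(ratio):
          # maximum-likelihood class under the per-class Gaussian log-ratio model
          def loglik(mu, sigma):
              z = (np.log(ratio) - mu) / sigma
              return -0.5 * z ** 2 - np.log(sigma)
          return max(params, key=lambda c: loglik(*params[c]))

      print(classify(1.5), classify(3.7))    # -> pigmented vascular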

  14. Lower Leg Anterior and Lateral Intracompartmental Pressure Changes Before and After Classic Versus Skate Nordic Rollerskiing

    PubMed Central

    Woods, Katherine M.; Petron, David J.; Shultz, Barry B.; Hicks-Little, Charlie A.

    2015-01-01

    Context: Chronic exertional compartment syndrome (CECS) is a debilitating condition resulting in loss of function and a decrease in athletic performance. Cases of CECS are increasing among Nordic skiers; therefore, analysis of intracompartmental pressures (ICPs) before and after Nordic skiing is warranted. Objective: To determine if lower leg anterior and lateral ICPs and subjective lower leg pain levels increased after a 20-minute Nordic rollerskiing time trial and to examine if differences existed between postexercise ICPs for the 2 Nordic rollerskiing techniques, classic and skate. Design: Crossover study. Setting: Outdoor paved loop. Patients or Other Participants: Seven healthy Division I Nordic skiers (3 men, 4 women; age = 22.71 ± 1.38 y, height = 175.36 ± 6.33 cm, mass = 70.71 ± 6.58 kg). Intervention(s): Participants completed two 20-minute rollerskiing time trials using the classic and skate technique in random order. The time trials were completed 7 days apart. Anterior and lateral ICPs and lower leg pain scores were obtained at baseline and at minutes 1 and 5 after rollerskiing. Main Outcome Measure(s): Anterior and lateral ICPs (mm Hg) were measured using a Stryker Quic STIC handheld monitor. Subjective measures of lower leg pain were recorded using the 11-point Numeric Rating Scale. Results: Increases in both anterior (P = .000) and lateral compartment (P = .002) ICPs were observed, regardless of rollerskiing technique used. Subjective lower leg pain increased after the classic technique for the men from baseline to 1 minute postexercise and after the skate technique for the women. Significant 3-way interactions (technique × time × sex) were observed for the anterior (P = .002) and lateral (P = .009) compartment ICPs and lower leg pain (P = .005). Conclusions: Postexercise anterior and lateral ICPs increased compared with preexercise ICPs after both classic and skate rollerskiing techniques. Lower leg pain is a primary symptom of CECS. The subjective lower leg pain 11-point Numeric Rating Scale results indicate that increases in lower leg ICPs sustained during Nordic rollerskiing may increase discomfort during activity. Our results therefore suggest that Nordic rollerskiing contributes to increases in ICPs, which may lead to the development of CECS. PMID:26090709

  15. Lower Leg Anterior and Lateral Intracompartmental Pressure Changes Before and After Classic Versus Skate Nordic Rollerskiing.

    PubMed

    Woods, Katherine M; Petron, David J; Shultz, Barry B; Hicks-Little, Charlie A

    2015-08-01

    Chronic exertional compartment syndrome (CECS) is a debilitating condition resulting in loss of function and a decrease in athletic performance. Cases of CECS are increasing among Nordic skiers; therefore, analysis of intracompartmental pressures (ICPs) before and after Nordic skiing is warranted. To determine if lower leg anterior and lateral ICPs and subjective lower leg pain levels increased after a 20-minute Nordic rollerskiing time trial and to examine if differences existed between postexercise ICPs for the 2 Nordic rollerskiing techniques, classic and skate. Crossover study. Outdoor paved loop. Seven healthy Division I Nordic skiers (3 men, 4 women; age = 22.71 ± 1.38 y, height = 175.36 ± 6.33 cm, mass = 70.71 ± 6.58 kg). Participants completed two 20-minute rollerskiing time trials using the classic and skate technique in random order. The time trials were completed 7 days apart. Anterior and lateral ICPs and lower leg pain scores were obtained at baseline and at minutes 1 and 5 after rollerskiing. Anterior and lateral ICPs (mm Hg) were measured using a Stryker Quic STIC handheld monitor. Subjective measures of lower leg pain were recorded using the 11-point Numeric Rating Scale. Increases in both anterior (P = .000) and lateral compartment (P = .002) ICPs were observed, regardless of rollerskiing technique used. Subjective lower leg pain increased after the classic technique for the men from baseline to 1 minute postexercise and after the skate technique for the women. Significant 3-way interactions (technique × time × sex) were observed for the anterior (P = .002) and lateral (P = .009) compartment ICPs and lower leg pain (P = .005). Postexercise anterior and lateral ICPs increased compared with preexercise ICPs after both classic and skate rollerskiing techniques. Lower leg pain is a primary symptom of CECS. The subjective lower leg pain 11-point Numeric Rating Scale results indicate that increases in lower leg ICPs sustained during Nordic rollerskiing may increase discomfort during activity. Our results therefore suggest that Nordic rollerskiing contributes to increases in ICPs, which may lead to the development of CECS.

  16. White matter pathology in ALS and lower motor neuron ALS variants: a diffusion tensor imaging study using tract-based spatial statistics.

    PubMed

    Prudlo, Johannes; Bißbort, Charlotte; Glass, Aenne; Grossmann, Annette; Hauenstein, Karlheinz; Benecke, Reiner; Teipel, Stefan J

    2012-09-01

    The aim of this work was to investigate white-matter microstructural changes within and outside the corticospinal tract in classical amyotrophic lateral sclerosis (ALS) and in lower motor neuron (LMN) ALS variants by means of diffusion tensor imaging (DTI). We investigated 22 ALS patients and 21 age-matched controls utilizing a whole-brain approach with a 1.5-T scanner for DTI. The patient group comprised 15 classical ALS and seven LMN ALS-variant patients (progressive muscular atrophy, flail arm and flail leg syndrome). Disease severity was measured by the revised version of the functional rating scale. White matter fractional anisotropy (FA) was assessed using tract-based spatial statistics (TBSS) and a region of interest (ROI) approach. We found significant FA reductions in motor and extra-motor cerebral fiber tracts in classical ALS and in the LMN ALS-variant patients compared to controls. The voxel-based TBSS results were confirmed by the ROI findings. The white matter damage correlated with the disease severity in the patient group and was found in a similar distribution, but to a lesser extent, among the LMN ALS-variant subgroup. ALS and LMN ALS variants are multisystem degenerations. DTI shows the potential to enable an earlier diagnosis, particularly in LMN ALS variants. The statistically identical findings of white matter lesions in classical ALS and LMN variants as ascertained by DTI further underline that these variants should be regarded as part of the ALS spectrum.

  17. Interference in the classical probabilistic model and its representation in complex Hilbert space

    NASA Astrophysics Data System (ADS)

    Khrennikov, Andrei Yu.

    2005-10-01

    The notion of a context (complex of physical conditions, that is to say: specification of the measurement setup) is basic in this paper. We show that the main structures of quantum theory (interference of probabilities, Born's rule, complex probabilistic amplitudes, Hilbert state space, representation of observables by operators) are present already in a latent form in the classical Kolmogorov probability model. However, this model should be considered as a calculus of contextual probabilities. In our approach it is forbidden to consider abstract context-independent probabilities: “first context and only then probability”. We construct the representation of the general contextual probabilistic dynamics in the complex Hilbert space. Thus dynamics of the wave function (in particular, Schrödinger's dynamics) can be considered as Hilbert space projections of a realistic dynamics in a “prespace”. The basic condition for representing the prespace dynamics is the law of statistical conservation of energy: conservation of probabilities. In general the Hilbert space projection of the “prespace” dynamics can be nonlinear and even irreversible (but it is always unitary). Methods developed in this paper can be applied not only to quantum mechanics, but also to classical statistical mechanics. The main quantum-like structures (e.g., interference of probabilities) might be found in some models of classical statistical mechanics. Quantum-like probabilistic behavior can be demonstrated by biological systems. In particular, it was recently found in some psychological experiments.
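
    For a dichotomous context variable a ∈ {α₁, α₂}, the "interference of probabilities" referred to above takes an explicit form (written here in notation adapted from Khrennikov's contextual framework):

      \[
      P(b=\beta) = \sum_{i=1,2} P(a=\alpha_i)\,P(b=\beta \mid a=\alpha_i)
      + 2\cos\theta_\beta \sqrt{\prod_{i=1,2} P(a=\alpha_i)\,P(b=\beta \mid a=\alpha_i)},
      \]

    where cos θ_β = 0 recovers the classical formula of total probability, |cos θ_β| ≤ 1 characterizes trigonometric, quantum-like contexts, and the complex amplitudes of the Hilbert-space representation are built from exactly these square roots and phases.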

  18. Epistemic View of Quantum States and Communication Complexity of Quantum Channels

    NASA Astrophysics Data System (ADS)

    Montina, Alberto

    2012-09-01

    The communication complexity of a quantum channel is the minimal amount of classical communication required for classically simulating a process of state preparation, transmission through the channel and subsequent measurement. It establishes a limit on the power of quantum communication in terms of classical resources. We show that classical simulations employing a finite amount of communication can be derived from a special class of hidden variable theories where quantum states represent statistical knowledge about the classical state and not an element of reality. This special class has attracted strong interest very recently. The communication cost of each derived simulation is given by the mutual information between the quantum state and the classical state of the parent hidden variable theory. Finally, we find that the communication complexity for single qubits is smaller than 1.28 bits. The previous known upper bound was 1.85 bits.

  19. Neuromuscular taping versus sham therapy on muscular strength and motor performance in multiple sclerosis patients.

    PubMed

    Costantino, Cosimo; Pedrini, Martina Francesca; Licari, Oriana

    2016-01-01

    The purpose of this study was to evaluate differences in leg muscle strength and motor performance between neuromuscular taping (NT) and sham tape groups. Relapsing-remitting (RR) multiple sclerosis (MS) patients were recruited and randomly assigned to NT or sham tape groups. All patients underwent the treatment 5 times at 5-d intervals. They performed a 6-minute walk test and an isokinetic test (peak torque) at the beginning (T0), at the end (T1) and 2 months after the end of the treatment (T2). Forty MS patients (38 F; 2 M; mean age 45.5 ± 6.5 years) were assigned to the NT group (n = 20) and to the sham tape group (n = 20). Differences in peak torque (T1-T0 and T2-T0) between the two groups were statistically significant for the quadriceps (p = 0.007; 0.000) and hamstrings (p = 0.011; 0.007). The difference between the two groups on the 6-minute walk test was not statistically significant, but an increasing trend in distance covered was noticed in the NT group. In this single-blind randomized controlled trial, NT seemed to increase strength in leg muscles, compared to a sham device, in RR MS patients. Further studies are needed to consider this therapy as a complement to classic physical therapy. Neuromuscular taping (NT) in multiple sclerosis: NT is well tolerated by multiple sclerosis patients and should be a complement to classic physical therapy. This technique normalizes muscular function, strengthens weakened muscles and assists postural alignment.

  20. Statistical benchmark for BosonSampling

    NASA Astrophysics Data System (ADS)

    Walschaers, Mattia; Kuipers, Jack; Urbina, Juan-Diego; Mayer, Klaus; Tichy, Malte Christopher; Richter, Klaus; Buchleitner, Andreas

    2016-03-01

    Boson samplers—set-ups that generate complex many-particle output states through the transmission of elementary many-particle input states across a multitude of mutually coupled modes—promise the efficient quantum simulation of a classically intractable computational task, and challenge the extended Church-Turing thesis, one of the fundamental dogmas of computer science. However, as in all experimental quantum simulations of truly complex systems, one crucial problem remains: how to certify that a given experimental measurement record unambiguously results from enforcing the claimed dynamics, on bosons, fermions or distinguishable particles? Here we offer a statistical solution to the certification problem, identifying an unambiguous statistical signature of many-body quantum interference upon transmission across a multimode, random scattering device. We show that statistical analysis of only partial information on the output state allows one to characterise the imparted dynamics through particle type-specific features of the emerging interference patterns. The relevant statistical quantifiers are classically computable, define a falsifiable benchmark for BosonSampling, and reveal distinctive features of many-particle quantum dynamics, which go much beyond mere bunching or anti-bunching effects.
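
    The particle-type-specific interference underlying such benchmarks already shows up in single-outcome probabilities: for a collision-free outcome, bosons give |perm(U_sub)|², fermions give |det(U_sub)|², and distinguishable particles give perm(|U_sub|²). A small numerical sketch of this contrast (random 6-mode unitary and arbitrary mode choices, purely illustrative; the paper's quantifiers are statistical features of many outcomes, not single probabilities):

      import itertools
      import numpy as np

      def permanent(A):
          # Ryser's formula, O(2^n * n); adequate for the tiny matrices of a sketch
          n = A.shape[0]
          total = 0.0
          for r in range(1, n + 1):
              for cols in itertools.combinations(range(n), r):
                  total += (-1) ** r * np.prod(A[:, cols].sum(axis=1))
          return (-1) ** n * total

      rng = np.random.default_rng(0)
      U = np.linalg.qr(rng.normal(size=(6, 6)) + 1j * rng.normal(size=(6, 6)))[0]
      sub = U[np.ix_([0, 1, 2], [1, 3, 4])]      # 3 photons in, 3 distinct modes out
      print("bosons:         ", abs(permanent(sub)) ** 2)
      print("fermions:       ", abs(np.linalg.det(sub)) ** 2)
      print("distinguishable:", permanent(np.abs(sub) ** 2).real)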

  1. Statistical Interpretation of the Local Field Inside Dielectrics.

    ERIC Educational Resources Information Center

    Berrera, Ruben G.; Mello, P. A.

    1982-01-01

    Compares several derivations of the Clausius-Mossotti relation to analyze consistently the nature of the approximations used and their range of applicability. Also presents a statistical-mechanical calculation of the local field for a classical system of harmonic oscillators interacting via the Coulomb potential. (Author/SK)
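
    For reference, the relation under discussion and the Lorentz local field on which its standard derivation rests read (Gaussian units; n the number density of molecules or oscillators, α the polarizability):

      \[
      \frac{\varepsilon - 1}{\varepsilon + 2} = \frac{4\pi}{3}\, n\alpha,
      \qquad
      \mathbf{E}_{\mathrm{loc}} = \mathbf{E} + \frac{4\pi}{3}\,\mathbf{P}.
      \]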

  2. Bayesian Statistics for Biological Data: Pedigree Analysis

    ERIC Educational Resources Information Center

    Stanfield, William D.; Carlton, Matthew A.

    2004-01-01

    The use of Bayes' formula is applied to the biological problem of pedigree analysis to show that Bayes' formula and non-Bayesian or "classical" methods of probability calculation give different answers. First-year college students of biology can be introduced to Bayesian statistics.
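
    A worked example of the kind of calculation the article describes (a standard X-linked recessive pedigree, not necessarily the article's own): a woman whose mother is a known carrier has prior carrier probability 1/2; if she then has two unaffected sons, Bayes' formula updates this to

      \[
      P(\text{carrier} \mid 2\ \text{unaffected sons}) =
      \frac{\tfrac{1}{2}\cdot\left(\tfrac{1}{2}\right)^{2}}
           {\tfrac{1}{2}\cdot\left(\tfrac{1}{2}\right)^{2} + \tfrac{1}{2}\cdot 1}
      = \frac{1/8}{5/8} = \frac{1}{5},
      \]

    whereas the "classical" answer ignores the sons and stays at 1/2, which is exactly how the two approaches diverge.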

  3. Properties of the Boltzmann equation in the classical approximation

    DOE PAGES

    Epelbaum, Thomas; Gelis, François; Tanji, Naoto; ...

    2014-12-30

    We examine the Boltzmann equation with elastic point-like scalar interactions in two different versions of the classical approximation. Solving the Boltzmann equation numerically with the unapproximated collision term poses no problem, and this allows one to study the effect of the ultraviolet cutoff in these approximations. This cutoff dependence in the classical approximations of the Boltzmann equation is closely related to the non-renormalizability of the classical statistical approximation of the underlying quantum field theory. The kinetic theory setup considered here allows one to study the dependence on the ultraviolet cutoff in a much simpler way, since one also has access to the non-approximated result for comparison.
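
    Schematically, the collision term in question carries the Bose statistical factors (standard form; the precise variants are defined in the paper):

      \[
      C[f] \;\propto\; \int_{2,3,4} \big[\, f_3 f_4 (1+f_1)(1+f_2) \;-\; f_1 f_2 (1+f_3)(1+f_4) \,\big],
      \]

    where the quartic terms cancel upon expansion, leaving a cubic piece, f₃f₄(f₁+f₂) − f₁f₂(f₃+f₄), and a quadratic piece, f₃f₄ − f₁f₂. Classical approximations retain the cubic piece, which dominates when occupation numbers are large (for instance via the substitution f → f + 1/2), and differ in how much of the quadratic piece they keep; this is the origin of the cutoff sensitivity studied in the paper.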

  4. Relationships among Classical Test Theory and Item Response Theory Frameworks via Factor Analytic Models

    ERIC Educational Resources Information Center

    Kohli, Nidhi; Koran, Jennifer; Henn, Lisa

    2015-01-01

    There are well-defined theoretical differences between the classical test theory (CTT) and item response theory (IRT) frameworks. It is understood that in the CTT framework, person and item statistics are test- and sample-dependent. This is not the perception with IRT. For this reason, the IRT framework is considered to be theoretically superior…
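
    For concreteness, the two frameworks' core equations (standard definitions, added here for orientation): CTT decomposes an observed score into a true score and error, with item statistics such as difficulty being sample-dependent, while a two-parameter IRT model expresses the probability of a correct response through item parameters a (discrimination) and b (difficulty) that are, in principle, sample-invariant:

      \[
      X = T + E,
      \qquad
      P(X_{ij}=1 \mid \theta_i) = \frac{1}{1 + e^{-a_j(\theta_i - b_j)}}.
      \]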

  5. Evidence of non-classical (squeezed) light in biological systems

    NASA Astrophysics Data System (ADS)

    Popp, F. A.; Chang, J. J.; Herzog, A.; Yan, Z.; Yan, Y.

    2002-01-01

    By use of coincidence measurements on “ultraweak” photon emission, the photocount statistics (PCS) of artificial visible light turn out to follow, as expected, super-Poissonian statistics. Biophotons, originating from spontaneous or light-induced emission of living systems, display super-Poissonian, Poissonian and even sub-Poissonian PCS. This result shows, for the first time, evidence of non-classical (squeezed) light in living tissues.
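
    A standard quantifier behind these classifications (textbook definition, not taken from the paper) is the Mandel parameter

      \[
      Q = \frac{\langle (\Delta n)^{2} \rangle - \langle n \rangle}{\langle n \rangle},
      \]

    with Q > 0 super-Poissonian, Q = 0 Poissonian (coherent light), and Q < 0 sub-Poissonian; since any classical intensity distribution yields Q ≥ 0, an observed Q < 0 is direct evidence of non-classical (e.g., squeezed or antibunched) light.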

  6. Rydberg Atoms in Strong Fields: a Testing Ground for Quantum Chaos.

    NASA Astrophysics Data System (ADS)

    Courtney, Michael

    1995-01-01

    Rydberg atoms in strong static electric and magnetic fields provide experimentally accessible systems for studying the connections between classical chaos and quantum mechanics in the semiclassical limit. This experimental accessibility has motivated the development of reliable quantum mechanical solutions. This thesis uses both experimental and computed quantum spectra to test the central approaches to quantum chaos. These central approaches consist mainly of developing methods to compute the spectra of quantum systems in non-perturbative regimes, correlating statistical descriptions of eigenvalues with the classical behavior of the same Hamiltonian, and the development of semiclassical methods such as periodic-orbit theory. Particular emphasis is given to identifying the spectral signature of recurrences--quantum wave packets which follow classical orbits. The new findings include: the breakdown of the connection between energy-level statistics and classical chaos in odd-parity diamagnetic lithium, the discovery of the signature of very long period orbits in atomic spectra, quantitative evidence for the scattering of recurrences by the alkali-metal core, quantitative description of the behavior of recurrences near bifurcations, and a semiclassical interpretation of the evolution of continuum Stark spectra.
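
    The "energy-level statistics" referred to above are usually nearest-neighbour spacing distributions; for unit mean spacing the two limiting benchmarks are (standard results, not specific to this thesis)

      \[
      P_{\mathrm{Poisson}}(s) = e^{-s} \quad \text{(classically regular dynamics)},
      \qquad
      P_{\mathrm{Wigner}}(s) = \frac{\pi s}{2}\, e^{-\pi s^{2}/4} \quad \text{(classically chaotic, GOE)},
      \]

    so a measured spectrum is scored against these distributions; the "breakdown" result above is a case where a spectrum fails to follow the distribution its classical dynamics would predict.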

  7. A Computer-Aided Instruction Program for Teaching the TOPS20-MM Facility on the DDN (Defense Data Network)

    DTIC Science & Technology

    1988-06-01

    Keywords: Computer Assisted Instruction; Artificial Intelligence. ...while he/she tries to perform given tasks. Means-ends analysis, a classic technique for solving search problems in Artificial Intelligence, has been used

  8. Insecticide ADME for support of early-phase discovery: combining classical and modern techniques.

    PubMed

    David, Michael D

    2017-04-01

    The two factors that determine an insecticide's potency are its binding to a target site (intrinsic activity) and the ability of its active form to reach the target site (bioavailability). Bioavailability is dictated by the compound's stability and transport kinetics, which are determined by both physical and biochemical characteristics. At BASF Global Insecticide Research, we characterize bioavailability in early research with an ADME (Absorption, Distribution, Metabolism and Excretion) approach, combining classical and modern techniques. For biochemical assessment of metabolism, we purify native insect enzymes using classical techniques, and recombinantly express individual insect enzymes that are known to be relevant in insecticide metabolism and resistance. For analytical characterization of an experimental insecticide and its metabolites, we conduct classical radiotracer translocation studies when a radiolabel is available. In discovery, where typically no radiolabel has been synthesized, we utilize modern high-resolution mass spectrometry to probe complex systems for the test compounds and their metabolites. By using these combined approaches, we can rapidly compare the ADME properties of sets of new experimental insecticides and aid in the design of structures with an improved potential to advance in the research pipeline. © 2016 Society of Chemical Industry.

  9. Prequantum classical statistical field theory: background field as a source of everything?

    NASA Astrophysics Data System (ADS)

    Khrennikov, Andrei

    2011-07-01

    Prequantum classical statistical field theory (PCSFT) is a new attempt to consider quantum mechanics (QM) as an emergent phenomenon, cf. De Broglie's "double solution" approach, Bohmian mechanics, stochastic electrodynamics (SED), Nelson's stochastic QM and its generalization by Davidson, 't Hooft's models and their development by Elze. PCSFT is a comeback to a purely wave viewpoint on QM, cf. the early Schrödinger. There are no quantum particles at all, only waves. In particular, photons are simply wave-pulses of the classical electromagnetic field, cf. SED. Moreover, even massive particles are special "prequantum fields": the electron field, the neutron field, and so on. PCSFT claims that (sooner or later) people will be able to measure components of these fields: components of the "photonic field" (the classical electromagnetic field of low intensity), the electronic field, the neutronic field, and so on. At the moment we are able to produce quantum correlations as correlations of classical Gaussian random fields. In this paper we are interested in the mathematical and physical reasons for the usage of Gaussian fields. We consider prequantum signals (corresponding to quantum systems) as composed of a huge number of wave-pulses (on a very fine prequantum time scale). We speculate that the prequantum background field (the field of "vacuum fluctuations") might play the role of a source of such pulses, i.e., the source of everything.
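
    The core PCSFT correspondence invoked here can be written compactly (schematically, up to scaling and normalization details developed in Khrennikov's papers): quantum averages are averages of quadratic forms of a classical random field φ,

      \[
      \langle \hat A \rangle_\rho = \mathrm{Tr}\,\rho\hat A
      = \int_{H} \langle \hat A\,\phi, \phi \rangle \, d\mu_\rho(\phi),
      \]

    where μ_ρ is a Gaussian measure on the classical field space H whose covariance operator is (proportional to) the density operator ρ; quantum correlations then arise as correlations of these classical Gaussian random fields.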

  10. Statistical Extremes of Turbulence and a Cascade Generalisation of Euler's Gyroscope Equation

    NASA Astrophysics Data System (ADS)

    Tchiguirinskaia, Ioulia; Scherzer, Daniel

    2016-04-01

    Turbulence refers to a rather well defined hydrodynamical phenomenon uncovered by Reynolds. Nowadays, the word turbulence is used to designate the loss of order in many different geophysical fields and the related fundamental extreme variability of environmental data over a wide range of scales. Classical statistical techniques for estimating the extremes, being largely limited to statistical distributions, do not take into account the mechanisms generating such extreme variability. Alternative approaches to nonlinear variability are based on a fundamental property of the non-linear equations: scale invariance, which means that these equations are formally invariant under given scale transforms. Its specific framework is that of multifractals. In this framework extreme variability builds up scale by scale, leading to non-classical statistics. Although multifractals are increasingly understood as a basic framework for handling such variability, there is still a gap between their potential and their actual use. In this presentation we discuss how to deal with highly theoretical problems of mathematical physics together with a wide range of geophysical applications. We use Euler's gyroscope equation as a basic element in constructing a complex deterministic system that preserves not only the scale symmetry of the Navier-Stokes equations, but some more of their symmetries. Euler's equation has been not only the object of many theoretical investigations of the gyroscope device, but also generalised enough to become the basic equation of fluid mechanics. Therefore, it is no surprise that a cascade generalisation of this equation can be used to characterise the intermittency of turbulence, and to better understand the links between the multifractal exponents and the structure of a simplified, but not simplistic, version of the Navier-Stokes equations. In a given way, this approach is similar to that of Lorenz, who studied how the flap of a butterfly wing could generate a cyclone with the help of a 3D ordinary differential system. Being well supported by extensive numerical results, the cascade generalisation of Euler's gyroscope equation opens new horizons for the predictability and prediction of processes having long-range dependences.

  11. Analysis of Classical Time-Trial Performance and Technique-Specific Physiological Determinants in Elite Female Cross-Country Skiers.

    PubMed

    Sandbakk, Øyvind; Losnegard, Thomas; Skattebo, Øyvind; Hegge, Ann M; Tønnessen, Espen; Kocbach, Jan

    2016-01-01

    The present study investigated the contribution of performance on uphill, flat, and downhill sections to overall performance in an international 10-km classical time-trial in elite female cross-country skiers, as well as the relationships between performance on snow and laboratory-measured physiological variables in the double poling (DP) and diagonal (DIA) techniques. Ten elite female cross-country skiers were continuously measured by a global positioning system device during an international 10-km cross-country skiing time-trial in the classical technique. One month prior to the race, all skiers performed a 5-min submaximal and 3-min self-paced performance test while roller skiing on a treadmill, both in the DP and DIA techniques. The time spent on uphill (r = 0.98) and flat (r = 0.91) sections of the race correlated most strongly with the overall 10-km performance (both p < 0.05). Approximately 56% of the racing time was spent uphill, and stepwise multiple regression revealed that uphill time explained 95.5% of the variance in overall performance (p < 0.001). Distance covered during the 3-min roller-skiing test and body-mass normalized peak oxygen uptake (VO2peak) in both techniques showed the strongest correlations with overall time-trial performance (r = 0.66-0.78), with DP capacity tending to have greatest impact on the flat and DIA capacity on uphill terrain (all p < 0.05). Our present findings reveal that the time spent uphill most strongly determines classical time-trial performance, and that the major portion of the performance differences among elite female cross-country skiers can be explained by variations in technique-specific aerobic power.

  12. Analysis of Classical Time-Trial Performance and Technique-Specific Physiological Determinants in Elite Female Cross-Country Skiers

    PubMed Central

    Sandbakk, Øyvind; Losnegard, Thomas; Skattebo, Øyvind; Hegge, Ann M.; Tønnessen, Espen; Kocbach, Jan

    2016-01-01

    The present study investigated the contribution of performance on uphill, flat, and downhill sections to overall performance in an international 10-km classical time-trial in elite female cross-country skiers, as well as the relationships between performance on snow and laboratory-measured physiological variables in the double poling (DP) and diagonal (DIA) techniques. Ten elite female cross-country skiers were continuously measured by a global positioning system device during an international 10-km cross-country skiing time-trial in the classical technique. One month prior to the race, all skiers performed a 5-min submaximal and 3-min self-paced performance test while roller skiing on a treadmill, both in the DP and DIA techniques. The time spent on uphill (r = 0.98) and flat (r = 0.91) sections of the race correlated most strongly with the overall 10-km performance (both p < 0.05). Approximately 56% of the racing time was spent uphill, and stepwise multiple regression revealed that uphill time explained 95.5% of the variance in overall performance (p < 0.001). Distance covered during the 3-min roller-skiing test and body-mass normalized peak oxygen uptake (VO2peak) in both techniques showed the strongest correlations with overall time-trial performance (r = 0.66–0.78), with DP capacity tending to have greatest impact on the flat and DIA capacity on uphill terrain (all p < 0.05). Our present findings reveal that the time spent uphill most strongly determines classical time-trial performance, and that the major portion of the performance differences among elite female cross-country skiers can be explained by variations in technique-specific aerobic power. PMID:27536245

  13. Response statistics of rotating shaft with non-linear elastic restoring forces by path integration

    NASA Astrophysics Data System (ADS)

    Gaidai, Oleg; Naess, Arvid; Dimentberg, Michael

    2017-07-01

    Extreme statistics of random vibrations are studied for a Jeffcott rotor under uniaxial white noise excitation. The restoring force is modelled as elastically non-linear; a comparison with a linearized restoring force shows the effect of force non-linearity on the response statistics. While analytical solutions and stability conditions are available for the linear model, this is generally not the case for the non-linear system, except in some special cases. The statistics of the non-linear case are studied by applying the path integration (PI) method, which is based on the Markov property of the coupled dynamic system. The Jeffcott rotor response statistics can be obtained by solving the Fokker-Planck (FP) equation of the 4D dynamic system. An efficient implementation of the PI algorithm is applied; namely, the fast Fourier transform (FFT) is used to simulate the additive noise of the dynamic system, which significantly reduces computational time compared to classical PI. The excitation is modelled as Gaussian white noise; however, white noise with any distribution can be implemented with the same PI technique. Multidirectional Markov noise can also be modelled with PI in the same way as unidirectional noise. PI is accelerated by using a Monte Carlo (MC) estimated joint probability density function (PDF) as the initial input. The symmetry of the dynamic system was utilized to afford higher mesh resolution. Both internal (rotating) and external damping are included in the mechanical model of the rotor. The main advantage of using PI rather than MC is that PI offers high accuracy in the probability distribution tail. The latter is of critical importance for, e.g., extreme value statistics, system reliability, and first passage probability.
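
    A one-dimensional sketch of the path integration idea described above (the paper works with the 4D Jeffcott-rotor state space and uses FFT to handle the noise; here an overdamped toy SDE with a cubic restoring term is used, and all parameters are illustrative):

      import numpy as np

      # evolve p(x) for dX = -(X + eps*X**3) dt + sigma dW through the
      # Chapman-Kolmogorov step p_{n+1}(x) = int K(x|x') p_n(x') dx',
      # with a Gaussian one-step kernel from the Euler-Maruyama scheme
      eps, sigma, dt = 1.0, 0.5, 0.01
      x = np.linspace(-3.0, 3.0, 401)
      dx = x[1] - x[0]
      p = np.exp(-x ** 2 / 0.1)
      p /= p.sum() * dx                          # arbitrary initial PDF

      mean = x - dt * (x + eps * x ** 3)         # one-step drift from each grid node
      var = sigma ** 2 * dt
      K = np.exp(-(x[:, None] - mean[None, :]) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)

      for _ in range(2000):                      # iterate towards the stationary PDF
          p = (K @ p) * dx

      exact = np.exp(-2 * (x ** 2 / 2 + eps * x ** 4 / 4) / sigma ** 2)
      exact /= exact.sum() * dx                  # analytic stationary density
      print(np.abs(p - exact).max())             # small residual checks the method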

  14. Measuring uncertainty by extracting fuzzy rules using rough sets and extracting fuzzy rules under uncertainty and measuring definability using rough sets

    NASA Technical Reports Server (NTRS)

    Worm, Jeffrey A.; Culas, Donald E.

    1991-01-01

    Computers are not designed to handle terms where uncertainty is present. To deal with uncertainty, techniques other than classical logic must be developed. This paper examines the concepts of statistical analysis, the Dempster-Shafer theory, rough set theory, and fuzzy set theory to solve this problem. The fundamentals of these theories are combined to provide a possible optimal solution. By incorporating principles from these theories, a decision-making process may be simulated by extracting two sets of fuzzy rules: certain rules and possible rules. From these rules, a corresponding measure of how strongly we believe each rule is constructed. From this, the idea of how far a fuzzy diagnosis is definable in terms of its fuzzy attributes is studied.
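
    A minimal sketch of the rough-set machinery behind the "certain" and "possible" rules mentioned above (toy symptom table; the attribute names and values are invented for illustration): objects indiscernible on their condition attributes form equivalence classes, and a decision class is approximated from below (certain membership) and from above (possible membership):

      from collections import defaultdict

      # object id -> (condition-attribute values, decision)
      table = {
          1: (("high_fever", "cough"), "flu"),
          2: (("high_fever", "cough"), "flu"),
          3: (("high_fever", "no_cough"), "flu"),
          4: (("no_fever", "cough"), "cold"),
          5: (("high_fever", "no_cough"), "cold"),
      }

      blocks = defaultdict(set)                 # indiscernibility classes
      for obj, (attrs, _) in table.items():
          blocks[attrs].add(obj)

      target = {o for o, (_, d) in table.items() if d == "flu"}
      lower = set().union(*(b for b in blocks.values() if b <= target))
      upper = set().union(*(b for b in blocks.values() if b & target))
      print("lower (certain) :", sorted(lower))   # [1, 2] -> certain rules
      print("upper (possible):", sorted(upper))   # [1, 2, 3, 5] -> possible rules

    The degree of definability mentioned at the end corresponds to the accuracy ratio |lower| / |upper| (here 2/4 = 0.5).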

  15. Learning moment-based fast local binary descriptor

    NASA Astrophysics Data System (ADS)

    Bellarbi, Abdelkader; Zenati, Nadia; Otmane, Samir; Belghit, Hayet

    2017-03-01

    Recently, binary descriptors have attracted significant attention due to their speed and low memory consumption; however, using intensity differences to calculate the binary descriptor vector is not efficient enough. We propose an approach to binary description called POLAR_MOBIL, in which we perform binary tests between geometrical and statistical information using moments in the patch, instead of the classical intensity binary test. In addition, we introduce a learning technique to select an optimized set of binary tests with low correlation and high variance. This approach offers high distinctiveness and robustness to affine transformations and appearance changes. An extensive evaluation on well-known benchmark datasets reveals the robustness and effectiveness of the proposed descriptor, as well as its good performance in terms of low computational complexity when compared with state-of-the-art real-time local descriptors.
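
    A rough sketch of the moment-based binary-test idea (the actual POLAR_MOBIL polar layout, moment set, and the learned selection of low-correlation, high-variance tests are not reproduced; the grid, moments, and random test pairs below are stand-ins):

      import numpy as np

      def cell_moments(cell):
          # low-order moments of a sub-region: mean intensity, centroid, spread
          h, w = cell.shape
          ys, xs = np.mgrid[0:h, 0:w]
          m00 = cell.sum() + 1e-9
          return np.array([m00 / (h * w),
                           (xs * cell).sum() / m00,
                           (ys * cell).sum() / m00,
                           (((xs - w / 2) ** 2 + (ys - h / 2) ** 2) * cell).sum() / m00])

      def describe(patch, tests, g=4):
          # binary tests compare a chosen moment of two grid cells, instead of
          # comparing raw pixel intensities as in classical binary descriptors
          h, w = patch.shape
          cells = [patch[i * h // g:(i + 1) * h // g, j * w // g:(j + 1) * w // g]
                   for i in range(g) for j in range(g)]
          M = np.array([cell_moments(c) for c in cells])
          return np.array([M[a, k] < M[b, k] for a, b, k in tests], dtype=np.uint8)

      rng = np.random.default_rng(0)
      tests = [(rng.integers(16), rng.integers(16), rng.integers(4)) for _ in range(64)]
      bits = describe(rng.random((32, 32)), tests)   # 64-bit descriptor of one patch
      print(bits)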

  16. Profiling the nucleobase and structure selectivity of anticancer drugs and other DNA alkylating agents by RNA sequencing.

    PubMed

    Gillingham, Dennis; Sauter, Basilius

    2018-05-06

    Drugs that covalently modify DNA are components of most chemotherapy regimens, often serving as first-line treatments. Classically, the chemical reactivity of DNA alkylators has been determined in vitro with short oligonucleotides. Here we use next-generation RNA sequencing to report on the chemoselectivity of alkylating agents. We develop the method with the well-known, clinically used DNA-modifying drugs streptozotocin and temozolomide, and then apply the technique to profile RNA modification by uncharacterized alkylation reactions, such as those of powerful electrophiles like trimethylsilyldiazomethane. The multiplexed and massively parallel format of NGS allows analyses of chemical reactivity in nucleic acids to be accomplished in less time and with greater statistical power. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  17. Soft-tissue injuries from sports activities and traffic accidents--treatment with low-level laser therapy: a multicenter double-blind placebo-controlled clinical study on 132 patients

    NASA Astrophysics Data System (ADS)

    Simunovic, Zlatko; Trobonjaca, Tatjana

    2000-06-01

    The aim of the current multicenter clinical study was to assess the efficacy of low-level laser therapy (LLLT) in the treatment of soft tissue injuries compared to placebo and classical physiotherapeutic procedures. This clinical study was conducted in two centers located in Locarno, Switzerland and Opatija, Croatia. Two types of irradiation techniques were used: (1) a direct, skin-contact technique for treatment of trigger points, where an 830 nm continuous-wave IR diode laser was applied; and (2) a scanning technique for irradiation of a larger surface area, using a 632.8 nm Helium-Neon laser combined with a 904 nm pulsed-wave IR diode laser. Results were evaluated according to clinical parameters such as hematoma, swelling, heat, pain and loss of function. The findings were statistically analyzed via the chi-square test. Results demonstrated that the recovery process was accelerated in 85 percent of patients treated with LLLT compared to the control group of patients. The results and advantages obtained proved once again the efficacy of LLLT as a new and successful way to treat soft tissue injuries.

  18. Introduction

    NASA Astrophysics Data System (ADS)

    Cohen, E. G. D.

    Lecture notes are organized around the key word dissipation, while focusing on a presentation of modern theoretical developments in the study of irreversible phenomena. A broad cross-disciplinary perspective towards non-equilibrium statistical mechanics is backed by the general theory of nonlinear and complex dynamical systems. The classical-quantum intertwine and semiclassical dissipative borderline issue (decoherence, "classical out of quantum") are here included . Special emphasis is put on links between the theory of classical and quantum dynamical systems (temporal disorder, dynamical chaos and transport processes) with central problems of non-equilibrium statistical mechanics like e.g. the connection between dynamics and thermodynamics, relaxation towards equilibrium states and mechanisms capable to drive and next maintain the physical system far from equilibrium, in a non-equilibrium steady (stationary) state. The notion of an equilibrium state - towards which a system naturally evolves if left undisturbed - is a fundamental concept of equilibrium statistical mechanics. Taken as a primitive point of reference that allows to give an unambiguous status to near equilibrium and far from equilibrium systems, together with the dynamical notion of a relaxation (decay) towards a prescribed asymptotic invariant measure or probability distribution (properties of ergodicity and mixing are implicit). A related issue is to keep under control the process of driving a physical system away from an initial state of equilibrium and either keeping it in another (non-equilibrium) steady state or allowing to restore the initial data (return back, relax). To this end various models of environment (heat bath, reservoir, thermostat, measuring instrument etc.), and the environment - system coupling are analyzed. The central theme of the book is the dynamics of dissipation and various mechanisms responsible for the irreversible behaviour (transport properties) of open systems on classical and quantum levels of description. A distinguishing feature of these lecture notes is that microscopic foundations of irreversibility are investigated basically in terms of "small" systems, when the "system" and/or "environment" may have a finite (and small) number of degrees of freedom and may be bounded. This is to be contrasted with the casual understanding of statistical mechanics which is regarded to refer to systems with a very large number of degrees of freedom. In fact, it is commonly accepted that the accumulation of effects due to many (range of the Avogadro number) particles is required for statistical mechanics reasoning. Albeit those large numbers are not at all sufficient for transport properties. A helpful hint towards this conceptual turnover comes from the observation that for chaotic dynamical systems the random time evolution proves to be compatible with the underlying purely deterministic laws of motion. Chaotic features of the classical dynamics already appear in systems with two degrees of freedom and such systems need to be described in statistical terms, if we wish to quantify the dynamics of relaxation towards an invariant ergodic measure. The relaxation towards equilibrium finds a statistical description through an analysis of statistical ensembles. This entails an extension of the range of validity of statistical mechanics to small classical systems. 
On the other hand, the dynamics of fluctuations in macroscopic dissipative systems (due to their molecular composition and thermal mobility) may render a characterization of such systems as chaotic. That motivates attempts to understand the role of microscopic chaos and various "chaotic hypotheses" in non-equilibrium transport phenomena - the dynamical systems approach is being pushed down to the level of atoms, molecules, and complex matter constituents, whose natural substitutes are low-dimensional model subsystems (encompassing as well the mesoscopic "quantum chaos"). Along the way a number of questions are addressed, e.g.: is there a connection, and of what nature, between chaos (the modern theory of dynamical systems) and irreversible thermodynamics; can quantum chaos really explain some peculiar features of quantum transport? The answer in both cases is positive, modulo a careful discrimination between viewing dynamical chaos as a necessary or a sufficient basis for irreversibility. In those dynamical contexts, another key term, dynamical semigroups, refers to the major technical tools appropriate for "dissipative mathematics", modelling irreversible behaviour on the classical and quantum levels of description. Dynamical systems theory and "quantum chaos" research involve both a high level of mathematical sophistication and heavy computer "experimentation". One of the present volume's specific flavors is a tutorial access to quite advanced mathematical tools. They gradually penetrate the classical and quantum dynamical semigroup description, culminating in the noncommutative Brillouin zone construction as a prerequisite for understanding transport in aperiodic solids. The lecture notes are structured into chapters to give a better insight into the major conceptual streamlines. Chapter I is devoted to a discussion of non-equilibrium steady states and, through the so-called chaotic hypothesis combined with suitable fluctuation theorems, elucidates the role of the Sinai-Ruelle-Bowen distribution in both equilibrium and non-equilibrium statistical physics frameworks (E. G. D. Cohen). Links between dynamics and statistics (Boltzmann versus Tsallis) are also discussed. Fluctuation relations and a survey of deterministic thermostats are given in the context of non-equilibrium steady states of fluids (L. Rondoni). The response of systems driven far from equilibrium is analyzed on the basis of a central assertion about the existence of a statistical representation in terms of an ensemble of dynamical realizations of the driving process. A non-equilibrium work relation is deduced for irreversible processes (C. Jarzynski). The survey of non-equilibrium steady states in the statistical mechanics of classical and quantum systems employs heat bath models and input from random matrix theory. The quantum heat bath analysis and the derivation of fluctuation-dissipation theorems are performed by means of the influence functional technique, adopted to solve quantum master equations (D. Kusnezov). Chapter II deals with the issue of relaxation and its dynamical theory in both classical and quantum contexts. The Pollicott-Ruelle resonance background for the exponential decay scenario is discussed for irreversible processes of diffusion in the Lorentz gas and multibaker models (P. Gaspard).
The Pollicott-Ruelle theory reappears as a major inspiration in the survey of the behaviour of ensembles of chaotic systems, with a focus on model systems for which no rigorous results concerning the exponential decay of correlations in time are available (S. Fishman). The observation that non-equilibrium transport processes in simple classical chaotic systems can be described in terms of fractal structures developing in the system's phase space links their formation and properties with the entropy production in the course of diffusion processes of low-dimensional deterministic (chaotic) origin (J. R. Dorfman). Chapter III offers an introduction to the theory of dynamical semigroups. Asymptotic properties of Markov operators and Markov semigroups acting on the set of probability densities (the statistical ensemble notion is implicit) are analyzed. Ergodicity, mixing, strong (complete) mixing, and sweeping are discussed in the familiar setting of "noise, chaos and fractals" (R. Rudnicki). The next step comprises a passage to quantum dynamical semigroups and completely positive dynamical maps, with the ultimate goal of introducing a consistent framework for the analysis of irreversible phenomena in open quantum systems, where dissipation and decoherence are crucial concepts (R. Alicki). Friction and damping in the classical and quantum mechanics of finite dissipative systems are analyzed by means of Markovian quantum semigroups, with special emphasis on the issue of complete positivity (M. Fannes). Specific two-level model systems of elementary particle physics (kaons) and rudiments of neutron interferometry are employed to elucidate the distinction between positivity and complete positivity (F. Benatti). Quantization of the dynamics of stochastic models related to equilibrium Gibbs states results in dynamical maps which form quantum stochastic dynamical semigroups (W. A. Majewski). Chapter IV addresses diverse but deeply interrelated features of driven chaotic (mesoscopic) classical and quantum systems, their dissipative properties, and the notions of quantum irreversibility, entanglement, dephasing, and decoherence. A survey of non-perturbative quantum effects for open quantum systems is concluded by outlining the discrepancies between random matrix theory and non-perturbative semiclassical predictions (D. Cohen). As a useful supplement to the subject of bounded open systems, methods of quantum state control in a cavity (coherent versus incoherent dynamics and dissipation) are described for low-dimensional quantum systems (A. Buchleitner). The dynamics of open quantum systems can alternatively be described by means of a non-Markovian stochastic Schrödinger equation, jointly for an open system and its environment, which moves us beyond the Lindblad evolution scenario of Markovian dynamical semigroups. The quantum Brownian motion is considered (W. Strunz). Chapter V enforces a conceptual transition from "small" to "large" systems, with emphasis on the irreversible thermodynamics of quantum transport. Typical features of the statistical mechanics of infinitely extended systems and of the dynamical (small) systems approach are described by means of representative examples of relaxation towards asymptotic steady states: a quantum one-dimensional lattice conductor and an open multibaker map (S. Tasaki). Dissipative transport in aperiodic solids is reviewed by invoking methods of noncommutative geometry. The anomalous Drude formula is derived. The occurrence of quantum chaos is discussed together with its main consequences (J.
Bellissard). The chapter is concluded by a survey of scaling limits of the N-body Schrödinger quantum dynamics, where classical evolution equations of irreversible statistical mechanics (linear Boltzmann, Hartree, Vlasov) emerge "out of quantum". In particular, a scaling limit of one-body quantum dynamics with impurities (a static random potential) and that of quantum dynamics with weakly coupled phonons are shown to yield the linear Boltzmann equation (L. Erdös). Various interrelations between chapters and individual lectures, plus detailed, fine-tuned information about the subject matter coverage of the volume, can be recovered by examining an extensive index.

  19. Identification and characterization of earthquake clusters: a comparative analysis for selected sequences in Italy

    NASA Astrophysics Data System (ADS)

    Peresan, Antonella; Gentili, Stefania

    2017-04-01

    Identification and statistical characterization of seismic clusters may provide useful insights about the features of seismic energy release and their relation to physical properties of the crust within a given region. Moreover, a number of studies based on the spatio-temporal analysis of main-shock occurrence require preliminary declustering of earthquake catalogs. Since various methods, relying on different physical/statistical assumptions, may lead to diverse classifications of earthquakes into main events and related events, we aim to investigate the classification differences among different declustering techniques. Accordingly, a formal selection and comparative analysis of earthquake clusters is carried out for the most relevant earthquakes in North-Eastern Italy, as reported in the local OGS-CRS bulletins, compiled at the National Institute of Oceanography and Experimental Geophysics since 1977. The comparison is then extended to selected earthquake sequences associated with a different seismotectonic setting, namely to events that occurred in the region struck by the recent destructive Central Italy earthquakes, making use of INGV data. Various techniques, ranging from classical space-time window methods to ad hoc manual identification of aftershocks, are applied for the detection of earthquake clusters. In particular, a statistical method based on nearest-neighbor distances of events in the space-time-energy domain is considered. Results from cluster identification by the nearest-neighbor method turn out to be quite robust with respect to the time span of the input catalogue, as well as to the minimum magnitude cutoff. The identified clusters for the largest events reported in North-Eastern Italy since 1977 are consistent with those reported in earlier studies, which were aimed at detailed manual aftershock identification. The study shows that the data-driven approach, based on nearest-neighbor distances, can be satisfactorily applied to decompose the seismic catalog into background seismicity and individual sequences of earthquake clusters, also in areas characterized by moderate seismic activity, where standard declustering techniques may turn out to be rather gross approximations. With these results in hand, the main statistical features of seismic clusters are explored, including the complex interdependence of related events, with the aim of characterizing the space-time patterns of earthquake occurrence in North-Eastern Italy and capturing their basic differences from the Central Italy sequences.
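
    A minimal sketch of the nearest-neighbor space-time-energy distance underlying the cluster identification described above (in the spirit of Zaliapin and Ben-Zion's formulation); the helper function and the parameter values b and d_f are illustrative assumptions, not the paper's implementation.

```python
# Sketch: nearest-neighbor distance eta_ij = dt * r**d_f * 10**(-b*m_i) between
# each event j and its closest earlier event i; small eta flags clustered events.
# b (Gutenberg-Richter slope) and d_f (fractal dimension) are assumed values.
import numpy as np

def nearest_neighbor_distances(t, x, y, m, b=1.0, d_f=1.6):
    """Events must be numpy arrays sorted by time t; x, y in km, m magnitudes."""
    n = len(t)
    eta = np.full(n, np.inf)
    parent = np.full(n, -1)
    for j in range(1, n):
        dt = t[j] - t[:j]                            # inter-event times
        r = np.hypot(x[j] - x[:j], y[j] - y[:j])     # epicentral distances
        d = dt * np.maximum(r, 1e-3) ** d_f * 10.0 ** (-b * m[:j])
        i = int(np.argmin(d))
        eta[j], parent[j] = d[i], i
    return eta, parent   # threshold eta to split background vs. clustered events
```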

  20. Classical analogues of two-photon quantum interference.

    PubMed

    Kaltenbaek, R; Lavoie, J; Resch, K J

    2009-06-19

    Chirped-pulse interferometry (CPI) captures the metrological advantages of quantum Hong-Ou-Mandel (HOM) interferometry in a completely classical system. Modified HOM interferometers are the basis for a number of seminal quantum-interference effects. Here, the corresponding modifications to CPI allow for the first observation of classical analogues to the HOM peak and quantum beating. They also allow a new classical technique for generating phase super-resolution exhibiting a coherence length dramatically longer than that of the laser light, analogous to increased two-photon coherence lengths in entangled states.

  1. Hidden Statistics of Schroedinger Equation

    NASA Technical Reports Server (NTRS)

    Zak, Michail

    2011-01-01

    Work was carried out to determine the mathematical origin of randomness in quantum mechanics and to create a hidden statistics of the Schrödinger equation; i.e., to expose the transitional stochastic process as a "bridge" to the quantum world. The governing equations of the hidden statistics would preserve such properties of quantum physics as superposition, entanglement, and direct-product decomposability, while allowing one to measure its state variables using classical methods.

  2. Electric Field Fluctuations in Water

    NASA Astrophysics Data System (ADS)

    Thorpe, Dayton; Limmer, David; Chandler, David

    2013-03-01

    Charge transfer in solution, such as autoionization and ion pair dissociation in water, is governed by rare electric field fluctuations of the solvent. Knowing the statistics of such fluctuations can help explain the dynamics of these rare events. Trajectories short enough to be tractable by computer simulation are virtually certain not to sample the large fluctuations that promote rare events. Here, we employ importance sampling techniques with classical molecular dynamics simulations of liquid water to study the statistics of electric field fluctuations far from their means. We find that the distributions of electric fields located on individual water molecules are not, in general, Gaussian. Near the mean, this non-Gaussianity is due to the internal charge distribution of the water molecule. Further from the mean, however, there is a previously unreported Bjerrum-like defect that stabilizes certain large fluctuations out of equilibrium. As expected, differences in electric fields acting between molecules are Gaussian to a remarkable degree. By studying these differences, though, we are able to determine which configurations result not only in large electric fields, but also in electric fields with long spatial correlations that may be needed to promote charge separation.
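
    The importance-sampling idea invoked above can be illustrated in miniature: estimating a far-tail probability of a Gaussian by sampling from a shifted density and reweighting. This toy sketch is not the authors' simulation protocol, just the underlying statistical trick.

```python
# Estimate P(X > a) for X ~ N(0,1), a deep in the tail, by importance sampling.
import numpy as np

rng = np.random.default_rng(0)
a, n = 4.0, 100_000

naive = np.mean(rng.standard_normal(n) > a)   # almost always 0: tail unsampled

x = rng.normal(a, 1.0, n)                     # propose from the shifted N(a, 1)
w = np.exp(-a * x + a**2 / 2)                 # likelihood ratio phi(x)/phi(x-a)
tilted = np.mean(w * (x > a))

print(naive, tilted)                          # exact value is about 3.17e-5
```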

  3. Cognitive biases, linguistic universals, and constraint-based grammar learning.

    PubMed

    Culbertson, Jennifer; Smolensky, Paul; Wilson, Colin

    2013-07-01

    According to classical arguments, language learning is both facilitated and constrained by cognitive biases. These biases are reflected in linguistic typology-the distribution of linguistic patterns across the world's languages-and can be probed with artificial grammar experiments on child and adult learners. Beginning with a widely successful approach to typology (Optimality Theory), and adapting techniques from computational approaches to statistical learning, we develop a Bayesian model of cognitive biases and show that it accounts for the detailed pattern of results of artificial grammar experiments on noun-phrase word order (Culbertson, Smolensky, & Legendre, 2012). Our proposal has several novel properties that distinguish it from prior work in the domains of linguistic theory, computational cognitive science, and machine learning. This study illustrates how ideas from these domains can be synthesized into a model of language learning in which biases range in strength from hard (absolute) to soft (statistical), and in which language-specific and domain-general biases combine to account for data from the macro-level scale of typological distribution to the micro-level scale of learning by individuals. Copyright © 2013 Cognitive Science Society, Inc.

  4. Note onset deviations as musical piece signatures.

    PubMed

    Serrà, Joan; Özaslan, Tan Hakan; Arcos, Josep Lluis

    2013-01-01

    A competent interpretation of a musical composition presents several non-explicit departures from the written score. Timing variations are perhaps the most important ones: they are fundamental for expressive performance and a key ingredient for conferring a human-like quality to machine-based music renditions. However, the nature of such variations is still an open research question, with diverse theories that indicate a multi-dimensional phenomenon. In the present study, we consider event-shift timing variations and show that sequences of note onset deviations are robust and reliable predictors of the musical piece being played, irrespective of the performer. In fact, our results suggest that only a few consecutive onset deviations are already enough to identify a musical composition with statistically significant accuracy. We consider a mid-size collection of commercial recordings of classical guitar pieces and follow a quantitative approach based on the combination of standard statistical tools and machine learning techniques with the semi-automatic estimation of onset deviations. Besides the reported results, we believe that the considered materials and the methodology followed widen the testing ground for studying musical timing and could open new perspectives in related research fields.

  5. 21 CFR 820.250 - Statistical techniques.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Statistical techniques. 820.250 Section 820.250...) MEDICAL DEVICES QUALITY SYSTEM REGULATION Statistical Techniques § 820.250 Statistical techniques. (a... statistical techniques required for establishing, controlling, and verifying the acceptability of process...

  6. 21 CFR 820.250 - Statistical techniques.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Statistical techniques. 820.250 Section 820.250...) MEDICAL DEVICES QUALITY SYSTEM REGULATION Statistical Techniques § 820.250 Statistical techniques. (a... statistical techniques required for establishing, controlling, and verifying the acceptability of process...

  7. Ischemia may be the primary cause of the neurologic deficits in classic migraine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Skyhoj Olsen, T.; Friberg, L.; Lassen, N.A.

    1987-02-01

    This study investigates whether the cerebral blood flow reduction occurring in attacks of classic migraine is sufficient to cause neurologic deficits. Regional cerebral blood flow measured with the xenon 133 intracarotid injection technique was analyzed in 11 patients in whom a low-flow area developed during attacks of classic migraine. When measured with this technique, regional cerebral blood flow in focal low-flow areas will be overestimated because of the effect of scattered radiation (Compton scatter) on the recordings. In this study, this effect was particularly taken into account when evaluating the degree of blood flow reduction. During attacks of classic migraine, cerebral blood flow reductions averaging 52% were observed focally in the 11 patients. Cerebral blood flow levels known to be insufficient for normal cortical function (less than 16 to 23 mL/100 g/min) were measured in seven patients during the attacks. This was probably also the case in the remaining four patients, but the effect of scattered radiation made a reliable evaluation of blood flow impossible. It is concluded that the blood flow reduction that occurs during attacks of classic migraine is sufficient to cause ischemia and neurologic deficits. Hence, this study suggests a vascular origin of the prodromal neurologic deficits that may accompany attacks of classic migraine.

  8. Expected values and variances of Bragg peak intensities measured in a nanocrystalline powder diffraction experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Öztürk, Hande; Noyan, I. Cevdet

    A rigorous study of sampling and intensity statistics applicable to a powder diffraction experiment as a function of crystallite size is presented. Our analysis yields approximate equations for the expected value, variance, and standard deviation of both the number of diffracting grains and the corresponding diffracted intensity for a given Bragg peak. The classical formalism published in 1948 by Alexander, Klug & Kummer [J. Appl. Phys. (1948), 19, 742–753] appears here as a special case, limited to large crystallite sizes. It is observed that both the Lorentz probability expression and the statistics equations used in the classical formalism are inapplicable for nanocrystalline powder samples.

  9. Expected values and variances of Bragg peak intensities measured in a nanocrystalline powder diffraction experiment

    DOE PAGES

    Öztürk, Hande; Noyan, I. Cevdet

    2017-08-24

    A rigorous study of sampling and intensity statistics applicable to a powder diffraction experiment as a function of crystallite size is presented. Our analysis yields approximate equations for the expected value, variance, and standard deviation of both the number of diffracting grains and the corresponding diffracted intensity for a given Bragg peak. The classical formalism published in 1948 by Alexander, Klug & Kummer [J. Appl. Phys. (1948), 19, 742–753] appears here as a special case, limited to large crystallite sizes. It is observed that both the Lorentz probability expression and the statistics equations used in the classical formalism are inapplicable for nanocrystalline powder samples.

  10. Statistical measures of Planck scale signal correlations in interferometers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hogan, Craig J.; Kwon, Ohkyung

    2015-06-22

    A model-independent statistical framework is presented to interpret data from systems where the mean time derivative of positional cross correlation between world lines, a measure of spreading in a quantum geometrical wave function, is measured with a precision smaller than the Planck time. The framework provides a general way to constrain possible departures from perfect independence of classical world lines, associated with Planck scale bounds on positional information. A parametrized candidate set of possible correlation functions is shown to be consistent with the known causal structure of the classical geometry measured by an apparatus, and with the holographic scaling of information suggested by gravity. Frequency-domain power spectra are derived that can be compared with interferometer data. As a result, simple projections of sensitivity for specific experimental set-ups suggest that measurements will directly yield constraints on a universal time derivative of the correlation function, and thereby confirm or rule out a class of Planck scale departures from classical geometry.

  11. An Introduction to Confidence Intervals for Both Statistical Estimates and Effect Sizes.

    ERIC Educational Resources Information Center

    Capraro, Mary Margaret

    This paper summarizes methods of estimating confidence intervals, including classical intervals and intervals for effect sizes. The recent American Psychological Association (APA) Task Force on Statistical Inference report suggested that confidence intervals should always be reported, and the fifth edition of the APA "Publication Manual"…
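
    As a concrete companion to the summary above, here is a minimal sketch of a classical 95% confidence interval for a mean; the data vector is made up for illustration.

```python
# Classical t-based 95% confidence interval for a sample mean.
import numpy as np
from scipy import stats

x = np.array([4.2, 5.1, 3.8, 4.9, 5.5, 4.4, 4.8])   # illustrative data
lo, hi = stats.t.interval(0.95, len(x) - 1, loc=x.mean(), scale=stats.sem(x))
print(f"mean = {x.mean():.2f}, 95% CI = ({lo:.2f}, {hi:.2f})")
```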

  12. The use of classical and operant conditioning in training Aldabra tortoises (Geochelone gigantea) for venipuncture and other husbandry issues.

    PubMed

    Weiss, Emily; Wilson, Sandra

    2003-01-01

    A variety of nonhuman animals in zoo and research settings have been the subjects of classical and operant conditioning techniques. Much of the published work has focused on mammals, husbandry training, and veterinary issues. Several zoos are also training reptiles and birds for similar procedures, but little of this work has been published. Using positive reinforcement techniques enabled the training of 2 male and 2 female Aldabra tortoises (Geochelone gigantea) to approach a target, hold steady on target, and stretch and hold for venipuncture. This article discusses training techniques, the venipuncture site, and future training.

  13. Chemical Potential for the Interacting Classical Gas and the Ideal Quantum Gas Obeying a Generalized Exclusion Principle

    ERIC Educational Resources Information Center

    Sevilla, F. J.; Olivares-Quiroz, L.

    2012-01-01

    In this work, we address the concept of the chemical potential μ in classical and quantum gases towards the calculation of the equation of state μ = μ(n, T), where n is the particle density and T the absolute temperature, using the methods of equilibrium statistical mechanics. Two cases seldom discussed in elementary textbooks are…

  14. Recurrence theorems: A unified account

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wallace, David, E-mail: david.wallace@balliol.ox.ac.uk

    I discuss classical and quantum recurrence theorems in a unified manner, treating both as generalisations of the fact that a system with a finite state space only has so many places to go. Along the way, I prove versions of the recurrence theorem applicable to dynamics on linear and metric spaces and make some comments about applications of the classical recurrence theorem in the foundations of statistical mechanics.

  15. Fisher information as a generalized measure of coherence in classical and quantum optics.

    PubMed

    Luis, Alfredo

    2012-10-22

    We show that metrological resolution in the detection of small phase shifts provides a suitable generalization of the degrees of coherence and polarization. Resolution is estimated via Fisher information. Besides the standard two-beam Gaussian case, this approach also provides good results for multiple field components and non-Gaussian statistics. It works equally well in quantum and classical optics.

  16. OPLS statistical model versus linear regression to assess sonographic predictors of stroke prognosis.

    PubMed

    Vajargah, Kianoush Fathi; Sadeghi-Bazargani, Homayoun; Mehdizadeh-Esfanjani, Robab; Savadi-Oskouei, Daryoush; Farhoudi, Mehdi

    2012-01-01

    The objective of the present study was to assess the comparable applicability of the orthogonal projections to latent structures (OPLS) statistical model vs traditional linear regression in order to investigate the role of transcranial Doppler (TCD) sonography in predicting ischemic stroke prognosis. The study was conducted on 116 ischemic stroke patients admitted to a specialty neurology ward. The Unified Neurological Stroke Scale was used once for clinical evaluation in the first week of admission and again six months later. All data were primarily analyzed using simple linear regression and later considered for multivariate analysis using PLS/OPLS models through the SIMCA P+12 statistical software package. The linear regression analysis results used for the identification of TCD predictors of stroke prognosis were confirmed through the OPLS modeling technique. Moreover, in comparison to linear regression, the OPLS model appeared to have higher sensitivity in detecting the predictors of ischemic stroke prognosis and detected several more predictors. Applying the OPLS model made it possible to use both single TCD measures/indicators and arbitrarily dichotomized measures of TCD single vessel involvement, as well as the overall TCD result. In conclusion, the authors recommend PLS/OPLS methods as complementary rather than alternative to the available classical regression models such as linear regression.
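
    A rough sketch of the kind of comparison described above, using ordinary least squares and PLS regression: scikit-learn provides PLS, while OPLS itself is not in scikit-learn and SIMCA is commercial software, so plain PLS stands in here. The data are synthetic, not the study's.

```python
# Compare OLS and PLS on synthetic "TCD-like" predictors of an outcome score.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)
X = rng.normal(size=(116, 8))                    # 8 hypothetical TCD measures
beta = np.array([1.5, 0.0, 0.8, 0.0, 0.0, 0.0, 0.3, 0.0])
y = X @ beta + rng.normal(size=116)              # outcome, e.g. a stroke scale

ols = LinearRegression().fit(X, y)
pls = PLSRegression(n_components=2).fit(X, y)
print("OLS R^2:", ols.score(X, y), " PLS R^2:", pls.score(X, y))
```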

  17. Quantum approach to classical statistical mechanics.

    PubMed

    Somma, R D; Batista, C D; Ortiz, G

    2007-07-20

    We present a new approach to study the thermodynamic properties of d-dimensional classical systems by reducing the problem to the computation of ground state properties of a d-dimensional quantum model. This classical-to-quantum mapping allows us to extend the scope of standard optimization methods by unifying them under a general framework. The quantum annealing method is naturally extended to simulate classical systems at finite temperatures. We derive the rates to assure convergence to the optimal thermodynamic state using the adiabatic theorem of quantum mechanics. For simulated and quantum annealing, we obtain the asymptotic rates T(t) ≈ pN/(k_B log t) and γ(t) ≈ (Nt)^(-c/N), for the temperature and magnetic field, respectively. Other annealing strategies are also discussed.
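
    The temperature schedule quoted above has the classic logarithmic form; a toy simulated-annealing loop with such a schedule looks like the sketch below. The constants are illustrative, not the paper's pN/k_B prefactor.

```python
# Toy simulated annealing with a logarithmic cooling schedule T(t) ~ c / log t.
import math, random

def energy(s):                        # rugged one-dimensional toy landscape
    return math.sin(5.0 * s) + 0.1 * s * s

random.seed(0)
s = 4.0
for t in range(2, 20_000):
    T = 2.0 / math.log(t)             # logarithmic schedule (c = 2 is arbitrary)
    s_new = s + random.gauss(0.0, 0.5)
    dE = energy(s_new) - energy(s)
    if dE < 0 or random.random() < math.exp(-dE / T):
        s = s_new
print(s, energy(s))                   # should land near the global minimum
```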

  18. Limit Theorems for Dispersing Billiards with Cusps

    NASA Astrophysics Data System (ADS)

    Bálint, P.; Chernov, N.; Dolgopyat, D.

    2011-12-01

    Dispersing billiards with cusps are deterministic dynamical systems with a mild degree of chaos, exhibiting "intermittent" behavior that alternates between regular and chaotic patterns. Their statistical properties are therefore weak and delicate. They are characterized by a slow (power-law) decay of correlations, and as a result the classical central limit theorem fails. We prove that a non-classical central limit theorem holds, with a scaling factor of √(n log n) replacing the standard √n. We also derive the respective Weak Invariance Principle, and we identify the class of observables for which the classical CLT still holds.
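
    In display form, the non-classical normalization stated above reads as follows, for a zero-mean observable f under the billiard map T; this is a paraphrase of the abstract, not the paper's exact hypotheses.

```latex
\[
  \frac{1}{\sqrt{n\log n}} \sum_{k=0}^{n-1} f\circ T^{k}
  \;\xrightarrow{\ d\ }\; \mathcal{N}\!\left(0,\sigma_f^{2}\right),
  \qquad\text{in place of the classical normalization } \frac{1}{\sqrt{n}}.
\]
```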

  19. Quantum probabilistic logic programming

    NASA Astrophysics Data System (ADS)

    Balu, Radhakrishnan

    2015-05-01

    We describe a quantum mechanics based logic programming language that supports Horn clauses, random variables, and covariance matrices to express and solve problems in probabilistic logic. The Horn clauses of the language wrap random variables, including infinite-valued ones, to express probability distributions and statistical correlations, a powerful feature for capturing relationships between distributions that are not independent. The expressive power of the language is based on a mechanism to implement statistical ensembles and to solve the underlying SAT instances using quantum mechanical machinery. We exploit the fact that classical random variables have quantum decompositions to build the Horn clauses. We establish the semantics of the language in a rigorous fashion by considering an existing probabilistic logic language called PRISM, with classical probability measures defined on the Herbrand base, and extending it to the quantum context. In the classical case, H-interpretations form the sample space, and probability measures defined on them lead to a consistent definition of probabilities for well-formed formulae. In the quantum counterpart, we define probability amplitudes on H-interpretations, facilitating model generation and verification via quantum mechanical superpositions and entanglements. We cast the well-formed formulae of the language as quantum mechanical observables, thus providing an elegant interpretation for their probabilities. We discuss several examples that combine statistical ensembles and predicates of first order logic to reason about situations involving uncertainty.

  20. Renewing Literary Classics.

    ERIC Educational Resources Information Center

    Karolides, Nicholas J., Ed.

    1983-01-01

    The articles in this journal issue suggest techniques for classroom use of literature that has "withstood the test of time." The titles of the articles and their authors are as follows: (1) "The Storytelling Connection for the Classics" (Mary Ellen Martin); (2) "Elizabeth Bennet: A Liberated Woman" (Geneva Marking);…

  1. A quantum–quantum Metropolis algorithm

    PubMed Central

    Yung, Man-Hong; Aspuru-Guzik, Alán

    2012-01-01

    The classical Metropolis sampling method is a cornerstone of many statistical modeling applications that range from physics, chemistry, and biology to economics. This method is particularly suitable for sampling the thermal distributions of classical systems. The challenge of extending this method to the simulation of arbitrary quantum systems is that, in general, eigenstates of quantum Hamiltonians cannot be obtained efficiently with a classical computer. However, this challenge can be overcome by quantum computers. Here, we present a quantum algorithm which fully generalizes the classical Metropolis algorithm to the quantum domain. The meaning of quantum generalization is twofold: The proposed algorithm is not only applicable to both classical and quantum systems, but also offers a quantum speedup relative to the classical counterpart. Furthermore, unlike the classical method of quantum Monte Carlo, this quantum algorithm does not suffer from the negative-sign problem associated with fermionic systems. Applications of this algorithm include the study of low-temperature properties of quantum systems, such as the Hubbard model, and preparing the thermal states of sizable molecules to simulate, for example, chemical reactions at an arbitrary temperature. PMID:22215584
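
    For reference, the classical Metropolis method that the paper generalizes can be written in a few lines; this sketch samples a one-dimensional Boltzmann distribution and is purely illustrative.

```python
# Classical Metropolis sampling of p(x) ~ exp(-E(x)/T).
import math, random

def metropolis(energy, x0, T=1.0, steps=10_000, width=0.5):
    x, samples = x0, []
    for _ in range(steps):
        x_new = x + random.uniform(-width, width)     # symmetric proposal
        dE = energy(x_new) - energy(x)
        if dE <= 0 or random.random() < math.exp(-dE / T):
            x = x_new                                 # accept the move
        samples.append(x)                             # record current state
    return samples

chain = metropolis(lambda x: x * x, x0=0.0)           # samples ~ exp(-x^2/T)
```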

  2. Turbulent scaling laws as solutions of the multi-point correlation equation using statistical symmetries

    NASA Astrophysics Data System (ADS)

    Oberlack, Martin; Rosteck, Andreas; Avsarkisov, Victor

    2013-11-01

    Text-book knowledge proclaims that Lie symmetries such as the Galilean transformation lie at the heart of fluid dynamics. These important properties also carry over to the statistical description of turbulence, i.e. to the Reynolds stress transport equations and their generalization, the multi-point correlation equations (MPCE). Interestingly enough, the MPCE admit a much larger set of symmetries, in fact infinite dimensional, subsequently named statistical symmetries. Most importantly, these new symmetries have important consequences for our understanding of turbulent scaling laws. The symmetries form the essential foundation for constructing exact solutions to the infinite set of MPCE, which in turn are identified as classical and new turbulent scaling laws. Examples of various classical and new shear flow scaling laws, including higher order moments, will be presented. New scaling laws have even been forecast from these symmetries and in turn validated by DNS. Turbulence modellers have implicitly recognized at least one of the statistical symmetries, as this is the basis for the usual log-law which has been employed for calibrating essentially all engineering turbulence models. An obvious conclusion is to generally make turbulence models consistent with the new statistical symmetries.

  3. Modulation Doped GaAs/Al sub xGA sub (1-x)As Layered Structures with Applications to Field Effect Transistors.

    DTIC Science & Technology

    1982-02-15

    function of the doping density at 300 and 77 K for the classical Boltzmann statistics or depletion approximation (solid line) and for the approximate... Fermi-Dirac statistics (equation (19), dotted line). This comparison demonstrates that the deviation from Boltzmann statistics is quite noticeable... tunneling Schottky barriers cannot be obtained at these doping levels. The dotted lines are obtained when Boltzmann statistics are used in the Al Ga

  4. Reliability of a Measure of Institutional Discrimination against Minorities

    DTIC Science & Technology

    1979-12-01

    samples are presented. The first is based upon classical statistical theory and the second derives from a series of computer-generated Monte Carlo... Institutional racism and sexism. Englewood Cliffs, N.J.: Prentice-Hall, Inc., 1978. Hays, W. L. and Winkler, R. L. Statistics: probability, inference... statistical measure of the e of institutional discrimination are discussed. Two methods of dealing with the problem of reliability of the measure in small

  5. Quantum vertex model for reversible classical computing.

    PubMed

    Chamon, C; Mucciolo, E R; Ruckenstein, A E; Yang, Z-C

    2017-05-12

    Mappings of classical computation onto statistical mechanics models have led to remarkable successes in addressing some complex computational problems. However, such mappings display thermodynamic phase transitions that may prevent reaching solution even for easy problems known to be solvable in polynomial time. Here we map universal reversible classical computations onto a planar vertex model that exhibits no bulk classical thermodynamic phase transition, independent of the computational circuit. Within our approach the solution of the computation is encoded in the ground state of the vertex model and its complexity is reflected in the dynamics of the relaxation of the system to its ground state. We use thermal annealing with and without 'learning' to explore typical computational problems. We also construct a mapping of the vertex model into the Chimera architecture of the D-Wave machine, initiating an approach to reversible classical computation based on state-of-the-art implementations of quantum annealing.

  6. Quantum vertex model for reversible classical computing

    NASA Astrophysics Data System (ADS)

    Chamon, C.; Mucciolo, E. R.; Ruckenstein, A. E.; Yang, Z.-C.

    2017-05-01

    Mappings of classical computation onto statistical mechanics models have led to remarkable successes in addressing some complex computational problems. However, such mappings display thermodynamic phase transitions that may prevent reaching solution even for easy problems known to be solvable in polynomial time. Here we map universal reversible classical computations onto a planar vertex model that exhibits no bulk classical thermodynamic phase transition, independent of the computational circuit. Within our approach the solution of the computation is encoded in the ground state of the vertex model and its complexity is reflected in the dynamics of the relaxation of the system to its ground state. We use thermal annealing with and without `learning' to explore typical computational problems. We also construct a mapping of the vertex model into the Chimera architecture of the D-Wave machine, initiating an approach to reversible classical computation based on state-of-the-art implementations of quantum annealing.

  7. High-Speed Imaging Analysis of Register Transitions in Classically and Jazz-Trained Male Voices.

    PubMed

    Dippold, Sebastian; Voigt, Daniel; Richter, Bernhard; Echternach, Matthias

    2015-01-01

    Little data are available concerning register functions in different styles of singing such as classically or jazz-trained voices. Differences between registers seem to be much more audible in jazz singing than classical singing, and so we hypothesized that classically trained singers exhibit a smoother register transition, stemming from more regular vocal fold oscillation patterns. High-speed digital imaging (HSDI) was used for 19 male singers (10 jazz-trained singers, 9 classically trained) who performed a glissando from modal to falsetto register across the register transition. Vocal fold oscillation patterns were analyzed in terms of different parameters of regularity such as relative average perturbation (RAP), correlation dimension (D2) and shimmer. HSDI observations showed more regular vocal fold oscillation patterns during the register transition for the classically trained singers. Additionally, the RAP and D2 values were generally lower and more consistent for the classically trained singers compared to the jazz singers. However, intergroup comparisons showed no statistically significant differences. Some of our results may support the hypothesis that classically trained singers exhibit a smoother register transition from modal to falsetto register. © 2015 S. Karger AG, Basel.

  8. Statistical methods for biodosimetry in the presence of both Berkson and classical measurement error

    NASA Astrophysics Data System (ADS)

    Miller, Austin

    In radiation epidemiology, the true dose received by those exposed cannot be assessed directly. Physical dosimetry uses a deterministic function of the source term, distance and shielding to estimate dose. For the atomic bomb survivors, the physical dosimetry system is well established. The classical measurement errors plaguing the location and shielding inputs to the physical dosimetry system are well known. Adjusting for the associated biases requires an estimate for the classical measurement error variance, for which no data-driven estimate exists. In this case, an instrumental variable solution is the most viable option to overcome the classical measurement error indeterminacy. Biological indicators of dose may serve as instrumental variables. Specification of the biodosimeter dose-response model requires identification of the radiosensitivity variables, for which we develop statistical definitions and variables. More recently, researchers have recognized Berkson error in the dose estimates, introduced by averaging assumptions for many components in the physical dosimetry system. We show that Berkson error induces a bias in the instrumental variable estimate of the dose-response coefficient, and then address the estimation problem. This model is specified by developing an instrumental variable mixed measurement error likelihood function, which is then maximized using a Monte Carlo EM Algorithm. These methods produce dose estimates that incorporate information from both physical and biological indicators of dose, as well as the first instrumental variable based data-driven estimate for the classical measurement error variance.
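
    The distinction between the two error types driving this work can be made concrete with a toy regression simulation: classical error in the predictor attenuates the fitted dose-response slope, while Berkson error leaves it unbiased. All numbers below are illustrative.

```python
# Classical vs. Berkson measurement error in a linear dose-response (true slope 1).
import numpy as np

rng = np.random.default_rng(2)
n = 10_000

# Classical: observed dose = true dose + noise.
true_dose = rng.normal(10, 2, n)
observed = true_dose + rng.normal(0, 2, n)
y_c = true_dose + rng.normal(0, 1, n)
print(np.polyfit(observed, y_c, 1)[0])     # ~0.5: slope attenuated by error variance

# Berkson: true dose = assigned dose + noise.
assigned = rng.normal(10, 2, n)
true_b = assigned + rng.normal(0, 2, n)
y_b = true_b + rng.normal(0, 1, n)
print(np.polyfit(assigned, y_b, 1)[0])     # ~1.0: unbiased, just noisier
```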

  9. Diagrammar in classical scalar field theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cattaruzza, E., E-mail: Enrico.Cattaruzza@gmail.com; Gozzi, E., E-mail: gozzi@ts.infn.it; INFN, Sezione di Trieste

    2011-09-15

    In this paper we analyze perturbatively a gφ⁴ classical field theory with and without temperature. In order to do that, we make use of a path-integral approach developed some time ago for classical theories. It turns out that the diagrams appearing at the classical level are many more than at the quantum level due to the presence of extra auxiliary fields in the classical formalism. We shall show that a universal supersymmetry present in the classical path-integral mentioned above is responsible for the cancelation of various diagrams. The same supersymmetry allows the introduction of super-fields and super-diagrams which considerably simplify the calculations and make the classical perturbative calculations almost 'identical' formally to the quantum ones. Using the super-diagrams technique, we develop the classical perturbation theory up to third order. We conclude the paper with a perturbative check of the fluctuation-dissipation theorem. - Highlights: > We provide the Feynman diagrams of perturbation theory for a classical field theory. > We give a super-formalism which links the quantum diagrams to the classical ones. > We check perturbatively the fluctuation-dissipation theorem.

  10. Selected topics from classical bacterial genetics.

    PubMed

    Raleigh, Elisabeth A; Elbing, Karen; Brent, Roger

    2002-08-01

    Current cloning technology exploits many facts learned from classical bacterial genetics. This unit covers those that are critical to understanding the techniques described in this book. Topics include antibiotics, the lac operon, the F factor, nonsense suppressors, genetic markers, genotype and phenotype, DNA restriction, modification, and methylation, and recombination.

  11. Classical-to-Quantum Transition with Broadband Four-Wave Mixing

    NASA Astrophysics Data System (ADS)

    Vered, Rafi Z.; Shaked, Yaakov; Ben-Or, Yelena; Rosenbluh, Michael; Pe'er, Avi

    2015-02-01

    A key question of quantum optics is how nonclassical biphoton correlations at low power evolve into classical coherence at high power. Direct observation of the crossover from quantum to classical behavior is desirable, but difficult due to the lack of adequate experimental techniques that cover the ultrawide dynamic range in photon flux from the single photon regime to the classical level. We investigate biphoton correlations within the spectrum of light generated by broadband four-wave mixing over a large dynamic range of ~80 dB in photon flux across the classical-to-quantum transition using a two-photon interference effect that distinguishes between classical and quantum behavior. We explore the quantum-classical nature of the light by observing the interference contrast dependence on internal loss and demonstrate quantum collapse and revival of the interference when the four-wave mixing gain in the fiber becomes imaginary.

  12. Application Of Iterative Reconstruction Techniques To Conventional Circular Tomography

    NASA Astrophysics Data System (ADS)

    Ghosh Roy, D. N.; Kruger, R. A.; Yih, B. C.; Del Rio, S. P.; Power, R. L.

    1985-06-01

    Two "point-by-point" iteration procedures, namely, Iterative Least Square Technique (ILST) and Simultaneous Iterative Reconstructive Technique (SIRT) were applied to classical circular tomographic reconstruction. The technique of tomosynthetic DSA was used in forming the tomographic images. Reconstructions of a dog's renal and neck anatomy are presented.

  13. Focus-based filtering + clustering technique for power-law networks with small world phenomenon

    NASA Astrophysics Data System (ADS)

    Boutin, François; Thièvre, Jérôme; Hascoët, Mountaz

    2006-01-01

    Realistic interaction networks usually present two main properties: a power-law degree distribution and small world behavior. Few nodes are linked to many nodes, and adjacent nodes are likely to share common neighbors. Moreover, the graph structure usually presents a dense core that is difficult to explore with classical filtering and clustering techniques. In this paper, we propose a new filtering technique that accounts for a user focus. This technique extracts a tree-like graph that also has a power-law degree distribution and small world behavior. The resulting structure is easily drawn with classical force-directed drawing algorithms. It is also quickly clustered and displayed as a multi-level silhouette tree (MuSi-Tree) from any user focus. We built a new graph filtering + clustering + drawing API and report a case study.

  14. Statistical Methodology for the Analysis of Repeated Duration Data in Behavioral Studies

    ERIC Educational Resources Information Center

    Letué, Frédérique; Martinez, Marie-José; Samson, Adeline; Vilain, Anne; Vilain, Coriandre

    2018-01-01

    Purpose: Repeated duration data are frequently used in behavioral studies. Classical linear or log-linear mixed models are often inadequate to analyze such data, because they usually consist of nonnegative and skew-distributed variables. Therefore, we recommend use of a statistical methodology specific to duration data. Method: We propose a…

  15. Seed Dispersal Near and Far: Patterns Across Temperate and Tropical Forests

    Treesearch

    James S. Clark; Miles Silman; Ruth Kern; Eric Macklin; Janneke HilleRisLambers

    1999-01-01

    Dispersal affects community dynamics and vegetation response to global change. Understanding these effects requires descriptions of dispersal at local and regional scales and statistical models that permit estimation. Classical models of dispersal describe local or long-distance dispersal, but not both. The lack of statistical methods means that models have rarely been...

  16. Teaching Bayesian Statistics to Undergraduate Students through Debates

    ERIC Educational Resources Information Center

    Stewart, Sepideh; Stewart, Wayne

    2014-01-01

    This paper describes a lecturer's approach to teaching Bayesian statistics to students who were only exposed to the classical paradigm. The study shows how the lecturer extended himself by making use of ventriloquist dolls to grab hold of students' attention and embed important ideas in revealing the differences between the Bayesian and classical…

  17. Non-Gaussian statistics and nanosecond dynamics of electrostatic fluctuations affecting optical transitions in proteins.

    PubMed

    Martin, Daniel R; Matyushov, Dmitry V

    2012-08-30

    We show that electrostatic fluctuations of the protein-water interface are globally non-Gaussian. The electrostatic component of the optical transition energy (energy gap) in a hydrated green fluorescent protein is studied here by classical molecular dynamics simulations. The distribution of the energy gap displays a high excess in the breadth of electrostatic fluctuations over the prediction of the Gaussian statistics. The energy gap dynamics include a nanosecond component. When simulations are repeated with frozen protein motions, the statistics shifts to the expectations of linear response and the slow dynamics disappear. We therefore suggest that both the non-Gaussian statistics and the nanosecond dynamics originate largely from global, low-frequency motions of the protein coupled to the interfacial water. The non-Gaussian statistics can be experimentally verified from the temperature dependence of the first two spectral moments measured at constant-volume conditions. Simulations at different temperatures are consistent with other indicators of the non-Gaussian statistics. In particular, the high-temperature part of the energy gap variance (second spectral moment) scales linearly with temperature and extrapolates to zero at a temperature characteristic of the protein glass transition. This result, violating the classical limit of the fluctuation-dissipation theorem, leads to a non-Boltzmann statistics of the energy gap and corresponding non-Arrhenius kinetics of radiationless electronic transitions, empirically described by the Vogel-Fulcher-Tammann law.

  18. Fiber Breakage Model for Carbon Composite Stress Rupture Phenomenon: Theoretical Development and Applications

    NASA Technical Reports Server (NTRS)

    Murthy, Pappu L. N.; Phoenix, S. Leigh; Grimes-Ledesma, Lorie

    2010-01-01

    Stress rupture failure of Carbon Composite Overwrapped Pressure Vessels (COPVs) is of serious concern to the Science Mission and Constellation programs, since a number of COPVs on board space vehicles store gases under high pressure for long durations. It has become customary to establish the reliability of these vessels using so-called classic models, which are based on Weibull statistics fitted to observed stress rupture data. These stochastic models cannot account for any additional damage due to the complex pressure-time histories characteristic of COPVs being supplied for NASA missions. In particular, it is suspected that the effects of the proof test could significantly reduce the stress rupture lifetime of COPVs; the classic model cannot account for damage due to proof testing, which every flight vessel undergoes. The focus of this paper is to present an analytical appraisal of a model that incorporates damage due to the proof test. The model examined in the current paper is based on physical mechanisms, such as micromechanics based load sharing concepts, coupled with creep rupture and Weibull statistics. The paper compares the current model to the classic model with a number of examples. In addition, several applications of the model to current ISS and Constellation program issues are also examined.
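
    The "classic" reliability model referred to above is a Weibull fit; a minimal sketch of the resulting survival curve follows, with made-up shape and scale parameters rather than fitted COPV values.

```python
# Weibull stress-rupture reliability R(t) = exp(-(t/eta)**beta).
import numpy as np

def reliability(t, eta, beta):
    """Probability that a vessel survives beyond time t (illustrative parameters)."""
    return np.exp(-((t / eta) ** beta))

print(reliability(np.array([1.0, 5.0, 10.0, 15.0]), eta=50.0, beta=1.2))
```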

  19. Development of quantitative structure-activity relationships and its application in rational drug design.

    PubMed

    Yang, Guang-Fu; Huang, Xiaoqin

    2006-01-01

    Over forty years have elapsed since Hansch and Fujita published their pioneering work on quantitative structure-activity relationships (QSAR). Following the introduction of Comparative Molecular Field Analysis (CoMFA) by Cramer in 1988, other three-dimensional QSAR methods have been developed. Currently, the combination of classical QSAR with other computational techniques at the three-dimensional level is of greatest interest and is generally used in the process of modern drug discovery and design. During the last several decades, a number of different methodologies incorporating a range of molecular descriptors and different statistical regression methods have been proposed and successfully applied in the development of new drugs. The QSAR method has thus proven indispensable, not only for the reliable prediction of specific properties of new compounds, but also for helping to elucidate the possible molecular mechanisms of receptor-ligand interactions. Here, we review the recent developments in QSAR and their applications in rational drug design, focusing on the reasonable selection of novel molecular descriptors and the construction of predictive QSAR models with the help of advanced computational techniques.

  20. The 2.4 μm Galaxy Luminosity Function As Measured Using WISE. I. Measurement Techniques

    NASA Astrophysics Data System (ADS)

    Lake, S. E.; Wright, E. L.; Tsai, C.-W.; Lam, A.

    2017-04-01

    The astronomy community has at its disposal a large back catalog of public spectroscopic galaxy redshift surveys that can be used for the measurement of luminosity functions (LFs). Utilizing the back catalog with new photometric surveys to maximum efficiency requires modeling the color selection bias imposed on the selection of target galaxies by flux limits at multiple wavelengths. The likelihood derived herein can address, in principle, all possible color selection biases through the use of a generalization of the LF, Φ(L), over the space of all spectra: the spectro-luminosity functional, Ψ[L_ν]. It is, therefore, the first estimator capable of simultaneously analyzing multiple redshift surveys in a consistent way. We also propose a new way of parametrizing the evolution of the classic Schechter function parameters, L⋆ and ϕ⋆, that improves both the physical realism and statistical performance of the model. The techniques derived in this paper are used in a companion paper by Lake et al. to measure the LF of galaxies at the rest-frame wavelength of 2.4 μm using the Wide-field Infrared Survey Explorer (WISE).
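
    For reference, the classic Schechter form whose parameters L⋆ and ϕ⋆ are re-parametrized in the paper is the standard textbook definition below, not a formula quoted from the paper.

```latex
\[
  \Phi(L)\,dL \;=\; \phi^{\star}
  \left(\frac{L}{L^{\star}}\right)^{\alpha}
  e^{-L/L^{\star}}\,\frac{dL}{L^{\star}},
\]
% where \phi^{\star} sets the normalization, L^{\star} the knee luminosity,
% and \alpha the faint-end slope.
```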

  1. Color stability of shade guides after autoclave sterilization.

    PubMed

    Schmeling, Max; Sartori, Neimar; Monteiro, Sylvio; Baratieri, Luiz

    2014-01-01

    This study evaluated the influence of 120 autoclave sterilization cycles on the color stability of two commercial shade guides (Vita Classical and Vita System 3D-Master). The specimens were evaluated with a spectrophotometer before and after the sterilization cycles. Color was described using the three-dimensional CIELab system. Statistical analysis was performed on the three chromaticity coordinates, before and after the sterilization cycles, using the paired samples t test. All specimens became darker after the autoclave sterilization cycles. However, specimens of Vita Classical became redder, while those of Vita System 3D-Master became more yellow. Repeated cycles of autoclave sterilization caused statistically significant changes in the color coordinates of both shade guides. However, these differences are considered clinically acceptable.
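
    The color-difference bookkeeping described above can be sketched as follows; the CIE76 ΔE formula is standard, and the numbers are placeholders rather than the study's measurements.

```python
# CIE76 Delta E between paired before/after CIELab readings, plus paired t tests.
import numpy as np
from scipy import stats

before = np.array([[62.1, 1.8, 14.2], [60.4, 2.1, 15.0], [63.0, 1.5, 13.8]])  # L*, a*, b*
after = np.array([[60.0, 2.4, 14.9], [58.8, 2.6, 15.8], [61.1, 2.0, 14.5]])

delta_e = np.linalg.norm(after - before, axis=1)   # CIE76: sqrt(dL^2 + da^2 + db^2)
print("Delta E per specimen:", delta_e)

for k, name in enumerate(("L*", "a*", "b*")):
    t_stat, p = stats.ttest_rel(before[:, k], after[:, k])
    print(name, round(t_stat, 2), round(p, 3))
```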

  2. Much Polyphony but Little Harmony: Otto Sackur's Groping for a Quantum Theory of Gases

    NASA Astrophysics Data System (ADS)

    Badino, Massimiliano; Friedrich, Bretislav

    2013-09-01

    The endeavor of Otto Sackur (1880-1914) was driven, on the one hand, by his interest in Nernst's heat theorem, statistical mechanics, and the problem of chemical equilibrium and, on the other hand, by his goal to shed light on classical mechanics from the quantum vantage point. Inspired by the interplay between classical physics and quantum theory, Sackur chanced to expound his personal take on the role of the quantum in the changing landscape of physics in the turbulent 1910s. We tell the story of this enthusiastic practitioner of the old quantum theory and early contributor to quantum statistical mechanics, whose scientific ontogenesis provides a telling clue about the phylogeny of his contemporaries.

  3. Finite-range Coulomb gas models of banded random matrices and quantum kicked rotors

    NASA Astrophysics Data System (ADS)

    Pandey, Akhilesh; Kumar, Avanish; Puri, Sanjay

    2017-11-01

    Dyson demonstrated an equivalence between infinite-range Coulomb gas models and classical random matrix ensembles for the study of eigenvalue statistics. We introduce finite-range Coulomb gas (FRCG) models via a Brownian matrix process, and study them analytically and by Monte Carlo simulations. These models yield new universality classes, and provide a theoretical framework for the study of banded random matrices (BRMs) and quantum kicked rotors (QKRs). We demonstrate that, for a BRM of bandwidth b and a QKR of chaos parameter α, the appropriate FRCG model has the effective range d = b²/N = α²/N, for large matrix dimensionality N. As d increases, there is a transition from Poisson to classical random matrix statistics.

  4. Quantum gas-liquid condensation in an attractive Bose gas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Koh, Shun-ichiro

    Gas-liquid condensation (GLC) in an attractive Bose gas is studied on the basis of statistical mechanics. Using some results in combinatorial mathematics, the following are derived. (1) With decreasing temperature, the Bose-statistical coherence grows in the many-body wave function, which gives rise to the divergence of the grand partition function prior to Bose-Einstein condensation. It is a quantum-mechanical analogue to the GLC in a classical gas (quantum GLC). (2) This GLC is triggered by the bosons with zero momentum. Compared with the classical GLC, an incomparably weaker attractive force creates it. For a system showing the quantum GLC, we discuss a cold helium 4 gas at sufficiently low pressure.

  5. Finite-range Coulomb gas models of banded random matrices and quantum kicked rotors.

    PubMed

    Pandey, Akhilesh; Kumar, Avanish; Puri, Sanjay

    2017-11-01

    Dyson demonstrated an equivalence between infinite-range Coulomb gas models and classical random matrix ensembles for the study of eigenvalue statistics. We introduce finite-range Coulomb gas (FRCG) models via a Brownian matrix process, and study them analytically and by Monte Carlo simulations. These models yield new universality classes, and provide a theoretical framework for the study of banded random matrices (BRMs) and quantum kicked rotors (QKRs). We demonstrate that, for a BRM of bandwidth b and a QKR of chaos parameter α, the appropriate FRCG model has the effective range d=b^{2}/N=α^{2}/N, for large N matrix dimensionality. As d increases, there is a transition from Poisson to classical random matrix statistics.
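
    The Poisson-to-random-matrix transition described in this record is usually diagnosed through the nearest-neighbour spacing distribution. The sketch below builds a banded random matrix, unfolds the bulk spectrum to unit mean spacing, and compares the spacing histogram with the two limiting laws; the size N and bandwidth b are arbitrary demonstration values, not choices from the paper.

        # Hedged sketch: spacing statistics of a banded random matrix.
        import numpy as np

        N, b = 400, 20
        rng = np.random.default_rng(1)
        A = rng.normal(size=(N, N))
        H = (A + A.T) / 2.0
        band = np.abs(np.subtract.outer(np.arange(N), np.arange(N))) <= b
        H = H * band                        # zero out entries beyond the band

        eigs = np.sort(np.linalg.eigvalsh(H))
        bulk = eigs[N // 4 : 3 * N // 4]    # keep the bulk, away from the edges
        s = np.diff(bulk)
        s = s / s.mean()                    # crude unfolding to unit mean spacing

        # Limiting laws: Poisson exp(-s) vs Wigner surmise (pi s/2) exp(-pi s^2/4).
        hist, edges = np.histogram(s, bins=20, range=(0.0, 3.0), density=True)
        mid = (edges[:-1] + edges[1:]) / 2
        print(np.c_[mid, hist, np.exp(-mid),
                    (np.pi * mid / 2) * np.exp(-np.pi * mid ** 2 / 4)])

    With these values the effective range is d = b^2/N = 1, so the empirical histogram should fall between the two limiting curves.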

  6. Quantum-optical coherence tomography with classical light.

    PubMed

    Lavoie, J; Kaltenbaek, R; Resch, K J

    2009-03-02

    Quantum-optical coherence tomography (Q-OCT) is an interferometric technique for axial imaging offering several advantages over conventional methods. Chirped-pulse interferometry (CPI) was recently demonstrated to exhibit all of the benefits of the quantum interferometer upon which Q-OCT is based. Here we use CPI to measure axial interferograms to profile a sample accruing the important benefits of Q-OCT, including automatic dispersion cancellation, but with 10 million times higher signal. Our technique solves the artifact problem in Q-OCT and highlights the power of classical correlation in optical imaging.

  7. [Postmortem imaging studies with data processing and 3D reconstruction: a new path of development of classic forensic medicine?].

    PubMed

    Woźniak, Krzysztof; Moskała, Artur; Urbanik, Andrzej; Kopacz, Paweł; Kłys, Małgorzata

    2009-01-01

    The techniques employed in "classic" forensic autopsy have been virtually unchanged for many years. One of the fundamental purposes of forensic documentation is to register as objectively as possible the changes found by forensic pathologists. The authors present the review of techniques of postmortem imaging studies, which aim not only at increased objectivity of observations, but also at extending the scope of the registered data. The paper is illustrated by images originating from research carried out by the authors.

  8. Measuring uncertainty by extracting fuzzy rules using rough sets

    NASA Technical Reports Server (NTRS)

    Worm, Jeffrey A.

    1991-01-01

    Despite the advancements in the computer industry in the past 30 years, there is still one major deficiency: computers are not designed to handle terms where uncertainty is present. To deal with uncertainty, techniques other than classical logic must be developed. The methods of statistical analysis, Dempster-Shafer theory, rough set theory, and fuzzy set theory are examined as candidate solutions to this problem. The fundamentals of these theories are combined to possibly provide the optimal solution. By incorporating principles from these theories, a decision-making process may be simulated by extracting two sets of fuzzy rules: certain rules and possible rules. From these rules, a corresponding measure of how strongly each rule is believed is constructed. From this, the idea of how much a fuzzy diagnosis is definable in terms of a set of fuzzy attributes is studied.
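
    In rough set terms, certain rules come from the lower approximation of a concept (indiscernibility classes wholly contained in it) and possible rules from the upper approximation (classes that merely intersect it). A minimal Python sketch of the two approximations follows; the objects, attributes, and target concept are hypothetical.

        # Hedged sketch: rough-set lower and upper approximations.
        from collections import defaultdict

        # Objects described by condition attributes; X is the concept to approximate.
        objects = {
            "o1": ("high", "yes"), "o2": ("high", "yes"), "o3": ("low", "no"),
            "o4": ("low", "yes"), "o5": ("high", "no"),
        }
        X = {"o1", "o4"}  # e.g. objects with a positive diagnosis

        # Indiscernibility classes: objects with identical attribute values.
        classes = defaultdict(set)
        for obj, attrs in objects.items():
            classes[attrs].add(obj)

        # Lower approximation (certain rules): classes entirely inside X.
        lower = set().union(*(c for c in classes.values() if c <= X))
        # Upper approximation (possible rules): classes that intersect X.
        upper = set().union(*(c for c in classes.values() if c & X))

        print("certain:", sorted(lower))           # ['o4']
        print("possible:", sorted(upper))          # ['o1', 'o2', 'o4']
        print("boundary:", sorted(upper - lower))  # the uncertain region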

  9. The current and future status of the concealed information test for field use.

    PubMed

    Matsuda, Izumi; Nittono, Hiroshi; Allen, John J B

    2012-01-01

    The Concealed Information Test (CIT) is a psychophysiological technique for examining whether a person has knowledge of crime-relevant information. Many laboratory studies have shown that the CIT has good scientific validity. However, the CIT has seldom been used for actual criminal investigations. One successful exception is its use by the Japanese police. In Japan, the CIT has been widely used for criminal investigations, although its probative force in court is not strong. In this paper, we first review the current use of the field CIT in Japan. Then, we discuss two possible approaches to increase its probative force: sophisticated statistical judgment methods and combining new psychophysiological measures with classic autonomic measures. On the basis of these considerations, we propose several suggestions for future practice and research involving the field CIT.

  10. The Current and Future Status of the Concealed Information Test for Field Use

    PubMed Central

    Matsuda, Izumi; Nittono, Hiroshi; Allen, John J. B.

    2012-01-01

    The Concealed Information Test (CIT) is a psychophysiological technique for examining whether a person has knowledge of crime-relevant information. Many laboratory studies have shown that the CIT has good scientific validity. However, the CIT has seldom been used for actual criminal investigations. One successful exception is its use by the Japanese police. In Japan, the CIT has been widely used for criminal investigations, although its probative force in court is not strong. In this paper, we first review the current use of the field CIT in Japan. Then, we discuss two possible approaches to increase its probative force: sophisticated statistical judgment methods and combining new psychophysiological measures with classic autonomic measures. On the basis of these considerations, we propose several suggestions for future practice and research involving the field CIT. PMID:23205018

  11. Distance majorization and its applications.

    PubMed

    Chi, Eric C; Zhou, Hua; Lange, Kenneth

    2014-08-01

    The problem of minimizing a continuously differentiable convex function over an intersection of closed convex sets is ubiquitous in applied mathematics. It is particularly interesting when it is easy to project onto each separate set, but nontrivial to project onto their intersection. Algorithms based on Newton's method such as the interior point method are viable for small to medium-scale problems. However, modern applications in statistics, engineering, and machine learning are posing problems with potentially tens of thousands of parameters or more. We revisit this convex programming problem and propose an algorithm that scales well with dimensionality. Our proposal is an instance of a sequential unconstrained minimization technique and revolves around three ideas: the majorization-minimization principle, the classical penalty method for constrained optimization, and quasi-Newton acceleration of fixed-point algorithms. The performance of our distance majorization algorithms is illustrated in several applications.
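
    The three ingredients above combine into a very small algorithm when f is smooth: replace each constraint by the penalty (rho/2) dist(x, C_i)^2, use the fact that the gradient of that penalty is rho (x - P_{C_i}(x)), take gradient steps, and let rho grow. The sketch below projects a point onto the intersection of a ball and a halfspace; the sets, penalty schedule, and step size are illustrative choices, and the paper's quasi-Newton acceleration is omitted.

        # Hedged sketch: penalty/majorization scheme for set-constrained
        # minimization, here computing a projection onto ball ∩ halfspace.
        import numpy as np

        def proj_ball(x, r=1.0):                 # Euclidean ball, centre 0
            n = np.linalg.norm(x)
            return x if n <= r else r * x / n

        def proj_halfspace(x, a, b):             # {x : a.x <= b}
            v = a @ x - b
            return x if v <= 0 else x - v * a / (a @ a)

        y = np.array([2.0, 2.0])                 # point to project
        grad_f = lambda x: x - y                 # f(x) = 0.5 * ||x - y||^2
        a, b = np.array([1.0, 0.0]), 0.3         # halfspace: x1 <= 0.3

        x, rho = y.copy(), 1.0
        for _ in range(25):                      # outer loop: tighten penalty
            for _ in range(200):                 # inner loop: gradient steps
                g = grad_f(x) + rho * ((x - proj_ball(x))
                                       + (x - proj_halfspace(x, a, b)))
                x = x - g / (1.0 + 2.0 * rho)    # step = 1 / Lipschitz bound
            rho *= 2.0
        print(x)  # ~ [0.30, 0.95]: closest feasible point to y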

  12. Comparison of sampling strategies for object-based classification of urban vegetation from Very High Resolution satellite images

    NASA Astrophysics Data System (ADS)

    Rougier, Simon; Puissant, Anne; Stumpf, André; Lachiche, Nicolas

    2016-09-01

    Vegetation monitoring is becoming a major issue in the urban environment due to the services it provides, and it necessitates accurate and up-to-date mapping. Very High Resolution satellite images enable a detailed mapping of urban tree and herbaceous vegetation. Several supervised classifications with statistical learning techniques have provided good results for the detection of urban vegetation, but they necessitate a large amount of training data. In this context, this study investigates the performance of different sampling strategies in order to reduce the number of examples needed. Two state-of-the-art window-based active learning algorithms are compared to a classical stratified random sampling, and a third strategy combining active learning and stratification is proposed. The efficiency of these strategies is evaluated on two medium-sized French cities, Strasbourg and Rennes, associated with different datasets. Results demonstrate that classical stratified random sampling can in some cases be just as effective as active learning methods, and that it should be used more frequently to evaluate new active learning methods. Moreover, the active learning strategies proposed in this work reduce the computational runtime by selecting multiple windows at each iteration without increasing the number of windows needed.
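
    As a point of reference, the stratified baseline amounts to drawing a fixed budget of labelled windows while preserving class proportions. The feature matrix, labels, and class priors below are synthetic placeholders.

        # Hedged sketch: class-stratified random sampling of labelled windows.
        import numpy as np
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(2)
        X = rng.normal(size=(5000, 16))   # one feature row per image window
        y = rng.choice(["tree", "herbaceous", "other"], size=5000,
                       p=[0.2, 0.3, 0.5])

        # Draw 300 training windows, preserving the class proportions.
        X_train, _, y_train, _ = train_test_split(
            X, y, train_size=300, stratify=y, random_state=0)
        print({c: int((y_train == c).sum()) for c in np.unique(y_train)})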

  13. Contribution of Glottic Insufficiency to Perceived Breathiness in Classically Trained Singers.

    PubMed

    Graham, Ellen; Angadi, Vrushali; Sloggy, Joanna; Stemple, Joseph

    2016-09-01

    Breathiness in the singing voice is problematic for classical singers. Voice students and singing teachers typically attribute breathiness to breath management issues and breathing technique. The present study sought to determine whether glottic insufficiency may also contribute to breathiness in a singer's voice. Studies have revealed a relationship between insufficient vocal fold closure and inefficiency in the speaking voice. However, the effect of insufficient vocal fold closure on vocal efficiency in singers has yet to be determined. Two groups of voice students identified with and without breathiness issues underwent aerodynamic and acoustic voice assessment as well as laryngeal stroboscopy of the vocal folds to quantify the prevalence of insufficient vocal fold closure, also known as glottic insufficiency. These assessments revealed four groups: 1) those with glottic insufficiency and no perceived voice breathiness; 2) those with glottic sufficiency and perceived voice breathiness; 3) those with glottic insufficiency and perceived breathiness; and 4) those with glottic sufficiency and no perceived breathiness. Results suggest that previously undiscovered glottal insufficiency is common in young singers, particularly women, though the correlation with identified breathiness was not statistically significant. Acoustic and aerodynamic measures including noise-to-harmonics ratio, maximum phonation time, airflow rate, subglottal pressure, and laryngeal airway resistance were most sensitive to glottic insufficiency.

  14. Notes on quantitative structure-properties relationships (QSPR) (1): A discussion on a QSPR dimensionality paradox (QSPR DP) and its quantum resolution.

    PubMed

    Carbó-Dorca, Ramon; Gallegos, Ana; Sánchez, Angel J

    2009-05-01

    Classical quantitative structure-properties relationship (QSPR) statistical techniques unavoidably present an inherent paradoxical computational context. They rely on the definition of a Gram matrix in descriptor spaces, which is used afterwards to reduce the original dimension via several possible kinds of algebraic manipulations. From there, effective models for the computation of unknown properties of known molecular structures are obtained. However, the reduced descriptor dimension causes linear dependence within the set of discrete vector molecular representations, leading to positive semi-definite Gram matrices in molecular spaces. To resolve this QSPR dimensionality paradox (QSPR DP), it is proposed here to adopt as a starting point the quantum QSPR (QQSPR) computational framework, in which density functions act as infinite-dimensional descriptors. The fundamental QQSPR equation, deduced from employing quantum expectation value numerical evaluation, can be approximately solved in order to obtain models exempt from the QSPR DP. The substitution of the quantum similarity matrix by an empirical Gram matrix in molecular spaces, built from the original, non-manipulated discrete molecular descriptor vectors, permits obtaining classical QSPR models with the same characteristics as in QQSPR, that is: possessing a certain degree of causality and being explicitly independent of the descriptor dimension. © 2008 Wiley Periodicals, Inc.
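
    The linear-algebra core of the paradox is easy to see numerically: with n molecules described by d < n descriptors, the n x n Gram matrix G = XX^T has rank at most d and is therefore only positive semi-definite. A short demonstration with arbitrary sizes:

        # Hedged sketch: rank deficiency of a descriptor-space Gram matrix.
        import numpy as np

        n, d = 20, 5                               # 20 molecules, 5 descriptors
        X = np.random.default_rng(3).normal(size=(n, d))
        G = X @ X.T                                # n x n Gram matrix

        print("rank(G):", np.linalg.matrix_rank(G))                # at most d = 5
        print("smallest eigenvalues:", np.linalg.eigvalsh(G)[:3])  # ~0: semi-definite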

  15. JOURNAL SCOPE GUIDELINES: Paper classification scheme

    NASA Astrophysics Data System (ADS)

    2005-06-01

    This scheme is used to clarify the journal's scope and enable authors and readers to more easily locate the appropriate section for their work. For each of the sections listed in the scope statement we suggest some more detailed subject areas which help define that subject area. These lists are by no means exhaustive and are intended only as a guide to the type of papers we envisage appearing in each section. We acknowledge that no classification scheme can be perfect and that there are some papers which might be placed in more than one section. We are happy to provide further advice on paper classification to authors upon request (please email jphysa@iop.org).
    1. Statistical physics: numerical and computational methods; statistical mechanics, phase transitions and critical phenomena; quantum condensed matter theory; Bose-Einstein condensation; strongly correlated electron systems; exactly solvable models in statistical mechanics; lattice models, random walks and combinatorics; field-theoretical models in statistical mechanics; disordered systems, spin glasses and neural networks; nonequilibrium systems; network theory.
    2. Chaotic and complex systems: nonlinear dynamics and classical chaos; fractals and multifractals; quantum chaos; classical and quantum transport; cellular automata; granular systems and self-organization; pattern formation; biophysical models.
    3. Mathematical physics: combinatorics; algebraic structures and number theory; matrix theory; classical and quantum groups, symmetry and representation theory; Lie algebras, special functions and orthogonal polynomials; ordinary and partial differential equations; difference and functional equations; integrable systems; soliton theory; functional analysis and operator theory; inverse problems; geometry, differential geometry and topology; numerical approximation and analysis; geometric integration; computational methods.
    4. Quantum mechanics and quantum information theory: coherent states; eigenvalue problems; supersymmetric quantum mechanics; scattering theory; relativistic quantum mechanics; semiclassical approximations; foundations of quantum mechanics and measurement theory; entanglement and quantum nonlocality; geometric phases and quantum tomography; quantum tunnelling; decoherence and open systems; quantum cryptography, communication and computation; theoretical quantum optics.
    5. Classical and quantum field theory: quantum field theory; gauge and conformal field theory; quantum electrodynamics and quantum chromodynamics; Casimir effect; integrable field theory; random matrix theory applications in field theory; string theory and its developments; classical field theory and electromagnetism; metamaterials.
    6. Fluid and plasma theory: turbulence; fundamental plasma physics; kinetic theory; magnetohydrodynamics and multifluid descriptions; strongly coupled plasmas; one-component plasmas; non-neutral plasmas; astrophysical and dusty plasmas.

  16. Inferring Intra-Community Microbial Interaction Patterns from Metagenomic Datasets Using Associative Rule Mining Techniques

    PubMed Central

    Mande, Sharmila S.

    2016-01-01

    The nature of inter-microbial metabolic interactions defines the stability of microbial communities residing in any ecological niche. Deciphering these interaction patterns is crucial for understanding the mode/mechanism(s) through which an individual microbial community transitions from one state to another (e.g. from a healthy to a diseased state). Statistical correlation techniques have traditionally been employed for mining microbial interaction patterns from taxonomic abundance data corresponding to a given microbial community. In spite of their efficiency, these correlation techniques can capture only 'pair-wise interactions'. Moreover, their emphasis on statistical significance can potentially result in missing several interactions that are relevant from a biological standpoint. This study explores the applicability of one of the earliest association rule mining algorithms, i.e. the 'Apriori algorithm', for deriving 'microbial association rules' from the taxonomic profile of a given microbial community. The classical Apriori approach derives association rules by analysing patterns of co-occurrence/co-exclusion between various '(subsets of) features/items' across various samples. Using real-world microbiome data, the efficiency/utility of this rule mining approach in deciphering multiple (biologically meaningful) association patterns between 'subsets/subgroups' of microbes (constituting microbiome samples) is demonstrated. As an example, association rules derived from publicly available gut microbiome datasets indicate an association between a group of microbes (Faecalibacterium, Dorea, and Blautia) that are known to have mutualistic metabolic associations among themselves. Application of the rule mining approach on gut microbiomes (sourced from the Human Microbiome Project) further indicated similar microbial association patterns in gut microbiomes irrespective of the gender of the subjects. A Linux implementation of the Association Rule Mining (ARM) software (customised for deriving 'microbial association rules' from microbiome data) is freely available for download from the following link: http://metagenomics.atc.tcs.com/arm. PMID:27124399

  17. Inferring Intra-Community Microbial Interaction Patterns from Metagenomic Datasets Using Associative Rule Mining Techniques.

    PubMed

    Tandon, Disha; Haque, Mohammed Monzoorul; Mande, Sharmila S

    2016-01-01

    The nature of inter-microbial metabolic interactions defines the stability of microbial communities residing in any ecological niche. Deciphering these interaction patterns is crucial for understanding the mode/mechanism(s) through which an individual microbial community transitions from one state to another (e.g. from a healthy to a diseased state). Statistical correlation techniques have traditionally been employed for mining microbial interaction patterns from taxonomic abundance data corresponding to a given microbial community. In spite of their efficiency, these correlation techniques can capture only 'pair-wise interactions'. Moreover, their emphasis on statistical significance can potentially result in missing several interactions that are relevant from a biological standpoint. This study explores the applicability of one of the earliest association rule mining algorithms, i.e. the 'Apriori algorithm', for deriving 'microbial association rules' from the taxonomic profile of a given microbial community. The classical Apriori approach derives association rules by analysing patterns of co-occurrence/co-exclusion between various '(subsets of) features/items' across various samples. Using real-world microbiome data, the efficiency/utility of this rule mining approach in deciphering multiple (biologically meaningful) association patterns between 'subsets/subgroups' of microbes (constituting microbiome samples) is demonstrated. As an example, association rules derived from publicly available gut microbiome datasets indicate an association between a group of microbes (Faecalibacterium, Dorea, and Blautia) that are known to have mutualistic metabolic associations among themselves. Application of the rule mining approach on gut microbiomes (sourced from the Human Microbiome Project) further indicated similar microbial association patterns in gut microbiomes irrespective of the gender of the subjects. A Linux implementation of the Association Rule Mining (ARM) software (customised for deriving 'microbial association rules' from microbiome data) is freely available for download from the following link: http://metagenomics.atc.tcs.com/arm.
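
    To make the level-wise idea concrete, the sketch below mines frequent taxa subsets from presence/absence profiles: frequent k-sets are joined into (k+1)-set candidates and kept if their support clears a threshold, after which a rule such as {Faecalibacterium, Dorea} => {Blautia} can be read off with its confidence. The four sample profiles and the threshold are invented for illustration.

        # Hedged sketch of Apriori-style frequent itemset mining.
        from itertools import combinations

        samples = [
            {"Faecalibacterium", "Dorea", "Blautia", "Bacteroides"},
            {"Faecalibacterium", "Dorea", "Blautia"},
            {"Faecalibacterium", "Blautia", "Prevotella"},
            {"Dorea", "Blautia", "Bacteroides"},
        ]
        min_support = 0.5

        def support(itemset):
            return sum(itemset <= s for s in samples) / len(samples)

        # Level 1: frequent single taxa.
        taxa = {t for s in samples for t in s}
        frequent = [frozenset([t]) for t in sorted(taxa)
                    if support(frozenset([t])) >= min_support]

        # Level k+1: join frequent k-sets, keep candidates with enough support.
        level = frequent
        while level:
            candidates = {a | b for a, b in combinations(level, 2)
                          if len(a | b) == len(a) + 1}
            level = [c for c in candidates if support(c) >= min_support]
            frequent.extend(level)

        for f in frequent:
            print(sorted(f), round(support(f), 2))

        # Rule confidence, e.g. {Faecalibacterium, Dorea} => {Blautia}:
        body = frozenset({"Faecalibacterium", "Dorea"})
        print("confidence:", support(body | {"Blautia"}) / support(body))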

  18. Rope-based oral fluid sampling for early detection of classical swine fever in domestic pigs at group level.

    PubMed

    Dietze, Klaas; Tucakov, Anna; Engel, Tatjana; Wirtz, Sabine; Depner, Klaus; Globig, Anja; Kammerer, Robert; Mouchantat, Susan

    2017-01-05

    Non-invasive sampling techniques based on the analysis of oral fluid specimens have gained substantial importance in the field of swine herd management. Methodological advances have focused on endemic viral diseases in commercial pig production. More recently, these approaches have been adapted to non-invasive sampling of wild boar for transboundary animal disease detection, for which such effective population-level sampling methods have not been available. In this study, a rope-in-a-bait based oral fluid sampling technique was tested for detecting classical swine fever virus nucleic acid shedding from experimentally infected domestic pigs. The pigs were separated into two groups treated identically; the course of the infection differed slightly between groups in terms of onset of clinical signs and levels of viral ribonucleic acid detected in blood and oral fluid. The technique was capable of detecting classical swine fever virus nucleic acid as of day 7 post infection, coinciding with the first detection in conventional oropharyngeal swab samples from some individual animals. Except for day 7 post infection in the "slower onset group", the chances of classical swine fever virus nucleic acid detection in ropes were identical to or higher than those of individual sampling. With the provided evidence, non-invasive oral fluid sampling at group level can be considered an additional cost-effective detection tool in classical swine fever prevention and control strategies. The proposed methodology is of particular use in production systems with reduced access to veterinary services, such as backyard or scavenging pig production, where it can be integrated into feeding or baiting practices.

  19. Using qubits to reveal quantum signatures of an oscillator

    NASA Astrophysics Data System (ADS)

    Agarwal, Shantanu

    In this thesis, we seek to study the qubit-oscillator system with the aim to identify and quantify inherent quantum features of the oscillator. We show that the quantum signatures of the oscillator get imprinted on the dynamics of the joint system. The two key features which we explore are the quantized energy spectrum of the oscillator and the non-classicality of the oscillator's wave function. To investigate the consequences of the oscillator's discrete energy spectrum, we consider the qubit to be coupled to the oscillator through the Rabi Hamiltonian. Recent developments in fabrication technology have opened up the possibility to explore parameter regimes which were conventionally inaccessible. Motivated by these advancements, we investigate in this thesis a parameter space where the qubit frequency is much smaller than the oscillator frequency and the Rabi frequency is allowed to be an appreciable fraction of the bare frequency of the oscillator. We use the adiabatic approximation to understand the dynamics in this quasi-degenerate qubit regime. By deriving a dressed master equation, we systematically investigate the effects of the environment on the system dynamics. We develop a spectroscopic technique, using which one can probe the steady state response of the driven and damped system. The spectroscopic signal clearly reveals the quantized nature of the oscillator's energy spectrum. We extend the adiabatic approximation, earlier developed only for the single qubit case, to a scenario where multiple qubits interact with the oscillator. Using the extended adiabatic approximation, we study the collapse and revival of multi-qubit observables. We develop analytic expressions for the revival signals which are in good agreement with the numerically evaluated results. Within the quantum restriction imposed by Heisenberg's uncertainty principle, the uncertainty in the position and momentum of an oscillator is minimum and shared equally when the oscillator is prepared in a coherent state. For this reason, coherent states and states which can be thought of as a statistical mixture of coherent states are categorized as classical; whereas states which are not valid coherent state mixtures are classified as non-classical. In this thesis, we propose a new non-classicality witness operation which does not require a tomography of the oscillator's state. We show that by coupling a qubit longitudinally to the oscillator, one can infer about the non-classical nature of the initial state of the oscillator. Using a qubit observable, we derive a non-classicality witness inequality, a violation of which definitively indicates the non-classical nature of an oscillator's state.

  20. An overview of techniques for linking high-dimensional molecular data to time-to-event endpoints by risk prediction models.

    PubMed

    Binder, Harald; Porzelius, Christine; Schumacher, Martin

    2011-03-01

    Analysis of molecular data promises identification of biomarkers for improving prognostic models, thus potentially enabling better patient management. For identifying such biomarkers, risk prediction models can be employed that link high-dimensional molecular covariate data to a clinical endpoint. In low-dimensional settings, a multitude of statistical techniques already exists for building such models, e.g. allowing for variable selection or for quantifying the added value of a new biomarker. We provide an overview of techniques for regularized estimation that transfer this toward high-dimensional settings, with a focus on models for time-to-event endpoints. Techniques for incorporating specific covariate structure are discussed, as well as techniques for dealing with more complex endpoints. Employing gene expression data from patients with diffuse large B-cell lymphoma, some typical modeling issues from low-dimensional settings are illustrated in a high-dimensional application. First, the performance of classical stepwise regression is compared to stage-wise regression, as implemented by a component-wise likelihood-based boosting approach. A second issue arises when artificially transforming the response into a binary variable. The effects of the resulting loss of efficiency and potential bias in a high-dimensional setting are illustrated, and a link to competing risks models is provided. Finally, we discuss conditions for adequately quantifying the added value of high-dimensional gene expression measurements, both at the stage of model fitting and when performing evaluation. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
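
    Stage-wise estimation of the kind referenced above updates one covariate at a time by a small amount, rather than refitting a whole model as in stepwise selection. The sketch below shows the component-wise idea for a binary outcome with gradient steps on the log-likelihood; the data are simulated, and the paper's likelihood-based boosting for time-to-event models is analogous but not identical.

        # Hedged sketch: component-wise (stage-wise) boosting, logistic loss.
        import numpy as np

        rng = np.random.default_rng(4)
        n, p = 200, 50
        X = rng.normal(size=(n, p))
        X = (X - X.mean(0)) / X.std(0)        # standardized columns
        true_logits = 1.5 * X[:, 0] - 1.0 * X[:, 1]
        y = rng.binomial(1, 1.0 / (1.0 + np.exp(-true_logits)))

        beta = np.zeros(p)
        intercept = np.log(y.mean() / (1.0 - y.mean()))
        nu = 0.1                              # shrinkage: many small updates
        for _ in range(300):
            eta = intercept + X @ beta
            resid = y - 1.0 / (1.0 + np.exp(-eta))  # negative gradient of log-loss
            score = X.T @ resid
            j = int(np.argmax(np.abs(score)))       # best single covariate
            beta[j] += nu * score[j] / n            # update only that coefficient
        print("selected covariates:", np.flatnonzero(np.abs(beta) > 0.05))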

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hogan, Craig

    It is argued by extrapolation of general relativity and quantum mechanics that a classical inertial frame corresponds to a statistically defined observable that rotationally fluctuates due to Planck scale indeterminacy. Physical effects of exotic nonlocal rotational correlations on large scale field states are estimated. Their entanglement with the strong interaction vacuum is estimated to produce a universal, statistical centrifugal acceleration that resembles the observed cosmological constant.

  2. Computational algorithms dealing with the classical and statistical mechanics of celestial scale polymers in space elevator technology

    NASA Astrophysics Data System (ADS)

    Knudsen, Steven; Golubovic, Leonardo

    Prospects to build Space Elevator (SE) systems have become realistic with ultra-strong materials such as carbon nano-tubes and diamond nano-threads. At cosmic length-scales, space elevators can be modeled as polymer-like floppy strings of tethered mass beads. A new avenue in SE science has emerged with the introduction of the Rotating Space Elevator (RSE) concept, supported by the novel algorithms discussed in this presentation. An RSE is a loopy string reaching into outer space. Unlike the classical geostationary SE concepts of Tsiolkovsky, Artsutanov, and Pearson, our RSE exhibits an internal rotation. Thanks to this, objects sliding along the RSE loop spontaneously oscillate between two turning points, one of which is close to the Earth whereas the other one is in outer space. The RSE concept thus solves a major problem in SE technology: how to supply energy to the climbers moving along space elevator strings. The investigation of the classical and statistical mechanics of a floppy string interacting with objects sliding along it required the development of subtle computational algorithms described in this presentation.

  3. Surveillance and simulation of bovine spongiform encephalopathy and scrapie in small ruminants in Switzerland

    PubMed Central

    2010-01-01

    Background: After bovine spongiform encephalopathy (BSE) emerged in European cattle livestock in 1986, a fundamental question was whether the agent also established itself in the small ruminant population. In Switzerland, transmissible spongiform encephalopathies (TSEs) in small ruminants have been monitored since 1990. While in the most recent TSE cases a BSE infection could be excluded, for historical cases techniques to discriminate scrapie from BSE had not been available at the time of diagnosis, and thus their status remained unclear. We herein applied state-of-the-art techniques to retrospectively classify these animals and to re-analyze the affected flocks for secondary cases. These results were the basis for models simulating the course of TSEs over a period of 70 years. The aim was to arrive at a statistically based overall assessment of the TSE situation in the domestic small ruminant population in Switzerland. Results: In sum, 16 TSE cases have been identified in small ruminants in Switzerland since 1981, of which eight were atypical and six were classical scrapie. In two animals, retrospective analysis did not allow any further classification due to the lack of appropriate tissue samples. We found no evidence for an infection with the BSE agent in the cases under investigation. In none of the affected flocks were secondary cases identified. A Bayesian prevalence calculation resulted in most likely estimates of one case of BSE, five cases of classical scrapie, and 21 cases of atypical scrapie per 100,000 small ruminants. According to our models, none of the TSEs is expected to cause a broader epidemic in Switzerland. In a closed population, they are rather expected to fade out in the next decades or, in case of a sporadic origin, may remain at a very low level. Conclusions: In summary, these data indicate that despite a significant epidemic of BSE in cattle, there is no evidence that BSE established itself in the small ruminant population in Switzerland. Classical and atypical scrapie both occur at a very low level and are not expected to escalate into an epidemic. In this situation, the extent of TSE surveillance in small ruminants requires re-evaluation based on cost-benefit analysis. PMID:20398417
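
    For the Bayesian prevalence figure, a minimal version of the calculation can be written down directly: with x detected cases among n animals and a flat Beta(1, 1) prior, the prevalence posterior is Beta(x + 1, n - x + 1). The counts below are placeholders, not the surveillance data behind the study's estimates.

        # Hedged sketch: Beta-binomial posterior for a disease prevalence.
        from scipy import stats

        x, n = 6, 120000                  # hypothetical: cases / animals tested
        posterior = stats.beta(x + 1, n - x + 1)
        lo, hi = posterior.ppf([0.025, 0.975])

        print(f"posterior mode: {x / n * 1e5:.1f} per 100,000")
        print(f"95% credible interval: {lo * 1e5:.1f} to {hi * 1e5:.1f} per 100,000")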

  4. The effect of live classical piano music on the vital signs of patients undergoing ophthalmic surgery.

    PubMed

    Camara, Jorge G; Ruszkowski, Joseph M; Worak, Sandra R

    2008-06-25

    Music and surgery. To determine the effect of live classical piano music on vital signs of patients undergoing ophthalmic surgery. Retrospective case series. 203 patients who underwent various ophthalmologic procedures in a period during which a piano was present in the operating room of St. Francis Medical Center. [Note: St. Francis Medical Center has recently been renamed Hawaii Medical Center East.] Demographic data, surgical procedures, and the vital signs of 203 patients who underwent ophthalmic procedures were obtained from patient records. Blood pressure, heart rate, and respiratory rate measured in the preoperative holding area were compared with the same parameters taken in the operating room, with and without exposure to live piano music. A paired t-test was used for statistical analysis. Mean arterial pressure, heart rate, and respiratory rate. 115 patients who were exposed to live piano music showed a statistically significant decrease in mean arterial blood pressure, heart rate, and respiratory rate in the operating room compared with their vital signs measured in the preoperative holding area (P < .0001). The control group of 88 patients not exposed to live piano music showed a statistically significant increase in mean arterial blood pressure (P < .0002) and heart rate and respiratory rate (P < .0001). Live classical piano music lowered the blood pressure, heart rate, and respiratory rate in patients undergoing ophthalmic surgery.

  5. Statistical mechanical foundation of the peridynamic nonlocal continuum theory: energy and momentum conservation laws.

    PubMed

    Lehoucq, R B; Sears, Mark P

    2011-09-01

    The purpose of this paper is to derive the energy and momentum conservation laws of the peridynamic nonlocal continuum theory using the principles of classical statistical mechanics. The peridynamic laws allow the consideration of discontinuous motion, or deformation, by relying on integral operators. These operators sum forces and power expenditures separated by a finite distance and so represent nonlocal interaction. The integral operators replace the differential divergence operators conventionally used, thereby obviating special treatment at points of discontinuity. The derivation presented employs a general multibody interatomic potential, avoiding the standard assumption of a pairwise decomposition. The integral operators are also expressed in terms of a stress tensor and heat flux vector under the assumption that these fields are differentiable, demonstrating that the classical continuum energy and momentum conservation laws are consequences of the more general peridynamic laws. An important conclusion is that nonlocal interaction is intrinsic to continuum conservation laws when derived using the principles of statistical mechanics.
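
    For orientation, the bond-based form of the peridynamic momentum balance replaces the divergence of a stress tensor with an integral over a finite neighborhood (the horizon H_x); the expression below is the standard textbook form, not necessarily the exact multibody formulation derived in the paper:

        \rho(\mathbf{x})\,\ddot{\mathbf{u}}(\mathbf{x},t)
            = \int_{\mathcal{H}_{\mathbf{x}}}
              \mathbf{f}\big(\mathbf{u}(\mathbf{x}',t)-\mathbf{u}(\mathbf{x},t),\;\mathbf{x}'-\mathbf{x}\big)\,
              \mathrm{d}V_{\mathbf{x}'}
            + \mathbf{b}(\mathbf{x},t),

    where f is the pairwise force density and b the body force. Because no spatial derivatives of u appear, the equation remains meaningful across cracks and other discontinuities, which is the property the statistical mechanical derivation must preserve.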

  6. On Probability Domains IV

    NASA Astrophysics Data System (ADS)

    Frič, Roman; Papčo, Martin

    2017-12-01

    Stressing a categorical approach, we continue our study of fuzzified domains of probability, in which classical random events are replaced by measurable fuzzy random events. In operational probability theory (S. Bugajski) classical random variables are replaced by statistical maps (generalized distribution maps induced by random variables) and in fuzzy probability theory (S. Gudder) the central role is played by observables (maps between probability domains). We show that to each of the two generalized probability theories there corresponds a suitable category and the two resulting categories are dually equivalent. Statistical maps and observables become morphisms. A statistical map can send a degenerated (pure) state to a non-degenerated one (a quantum phenomenon) and, dually, an observable can map a crisp random event to a genuine fuzzy random event (a fuzzy phenomenon). The dual equivalence means that the operational probability theory and the fuzzy probability theory coincide and the resulting generalized probability theory has two dual aspects: quantum and fuzzy. We close with some notes on products and coproducts in the dual categories.

  7. RANDOMNESS of Numbers DEFINITION(QUERY:WHAT? V HOW?) ONLY Via MAXWELL-BOLTZMANN CLASSICAL-Statistics(MBCS) Hot-Plasma VS. Digits-Clumping Log-Law NON-Randomness Inversion ONLY BOSE-EINSTEIN QUANTUM-Statistics(BEQS) .

    NASA Astrophysics Data System (ADS)

    Siegel, Z.; Siegel, Edward Carl-Ludwig

    2011-03-01

    RANDOMNESS of Numbers cognitive-semantics DEFINITION VIA Cognition (QUERY: WHAT???, NOT HOW?) VS. computer-"science" mindLESS number-crunching (Harrel-Sipser-...) algorithmics Goldreich "PSEUDO-randomness" [Not.AMS(02)] mea-culpa is ONLY via MAXWELL-BOLTZMANN CLASSICAL-STATISTICS (NOT FDQS!!!) "hot-plasma" REPULSION VERSUS Newcomb(1881)-Weyl(1914;1916)-Benford(1938) "NeWBe" logarithmic-law digit-CLUMPING/CLUSTERING NON-Randomness simple Siegel [AMS Joint.Mtg.(02)-Abs. # 973-60-124] algebraic-inversion to THE QUANTUM and ONLY BEQS preferentially SEQUENTIALLY lower-DIGITS CLUMPING/CLUSTERING with d = 0 BEC, is ONLY VIA Siegel-Baez FUZZYICS=CATEGORYICS (SON OF TRIZ)/"Category-Semantics" (C-S), latter intersection/union of Lawvere(1964)-Siegel(1964) category-theory (matrix: MORPHISMS V FUNCTORS) "+" "cognitive-semantics" (matrix: ANTONYMS V SYNONYMS) yields Siegel-Baez FUZZYICS=CATEGORYICS/C-S tabular list-format matrix truth-table analytics: MBCS RANDOMNESS TRUTH/EMET!!!

  8. Reversibility in Quantum Models of Stochastic Processes

    NASA Astrophysics Data System (ADS)

    Gier, David; Crutchfield, James; Mahoney, John; James, Ryan

    Natural phenomena such as time series of neural firing, orientation of layers in crystal stacking and successive measurements in spin-systems are inherently probabilistic. The provably minimal classical models of such stochastic processes are ɛ-machines, which consist of internal states, transition probabilities between states and output values. The topological properties of the ɛ-machine for a given process characterize the structure, memory and patterns of that process. However, ɛ-machines are often not ideal because their statistical complexity (Cμ) is demonstrably greater than the excess entropy (E) of the processes they represent. Quantum models (q-machines) of the same processes can do better in that their statistical complexity (Cq) obeys the relation Cμ ≥ Cq ≥ E. q-machines can be constructed to consider longer lengths of strings, resulting in greater compression. With code-words of sufficiently long length, the statistical complexity becomes time-symmetric - a feature apparently novel to this quantum representation. This result has ramifications for compression of classical information in quantum computing and quantum communication technology.

  9. Automatic Classification of Sub-Techniques in Classical Cross-Country Skiing Using a Machine Learning Algorithm on Micro-Sensor Data

    PubMed Central

    Seeberg, Trine M.; Tjønnås, Johannes; Haugnes, Pål; Sandbakk, Øyvind

    2017-01-01

    The automatic classification of sub-techniques in classical cross-country skiing provides unique possibilities for analyzing the biomechanical aspects of outdoor skiing. This is currently possible due to the miniaturization and flexibility of wearable inertial measurement units (IMUs) that allow researchers to bring the laboratory to the field. In this study, we aimed to optimize the accuracy of the automatic classification of classical cross-country skiing sub-techniques by using two IMUs attached to the skier’s arm and chest together with a machine learning algorithm. The novelty of our approach is the reliable detection of individual cycles using a gyroscope on the skier’s arm, while a neural network machine learning algorithm robustly classifies each cycle to a sub-technique using sensor data from an accelerometer on the chest. In this study, 24 datasets from 10 different participants were separated into the categories training-, validation- and test-data. Overall, we achieved a classification accuracy of 93.9% on the test-data. Furthermore, we illustrate how an accurate classification of sub-techniques can be combined with data from standard sports equipment including position, altitude, speed and heart rate measuring systems. Combining this information has the potential to provide novel insight into physiological and biomechanical aspects valuable to coaches, athletes and researchers. PMID:29283421
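
    A scaled-down version of such a pipeline is easy to prototype: compute a feature vector per detected cycle, then train a small neural network classifier. The sketch below uses synthetic features and three dummy sub-technique labels; the authors' exact features, network architecture, and gyroscope-based cycle detection are not reproduced here.

        # Hedged sketch: per-cycle IMU features -> neural network classifier.
        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.neural_network import MLPClassifier
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(5)
        n_cycles = 600
        X = rng.normal(size=(n_cycles, 12))     # e.g. per-cycle accelerometer stats
        y = rng.integers(0, 3, size=n_cycles)   # sub-technique label per cycle

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                                  random_state=0)
        clf = make_pipeline(StandardScaler(),
                            MLPClassifier(hidden_layer_sizes=(32,),
                                          max_iter=500, random_state=0))
        clf.fit(X_tr, y_tr)
        print("test accuracy:", clf.score(X_te, y_te))  # near chance on noise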

  10. An experimental model for the study of cognitive disorders: the hippocampus and associative learning in mice.

    PubMed

    Delgado-García, José M; Gruart, Agnès

    2008-12-01

    The availability of transgenic mice mimicking selective human neurodegenerative and psychiatric disorders calls for new electrophysiological and microstimulation techniques capable of being applied in vivo in this species. In this article, we will concentrate on experiments and techniques developed in our laboratory during the past few years. Thus we have developed different techniques for the study of learning and memory capabilities of wild-type and transgenic mice with deficits in cognitive functions, using classical conditioning procedures. These techniques include different trace (tone/SHOCK and shock/SHOCK) conditioning procedures, that is, a classical conditioning task involving the cerebral cortex, including the hippocampus. We have also developed implantation and recording techniques for evoking long-term potentiation (LTP) in behaving mice and for recording the evolution of field excitatory postsynaptic potentials (fEPSP) evoked in the hippocampal CA1 area by the electrical stimulation of the commissural/Schaffer collateral pathway across conditioning sessions. Computer programs have also been developed to quantify the appearance and evolution of eyelid conditioned responses and the slope of evoked fEPSPs. According to the present results, the in vivo recording of the electrical activity of selected hippocampal sites during classical conditioning of eyelid responses appears to be a suitable experimental procedure for studying learning capabilities in genetically modified mice, and an excellent model for the study of selected neuropsychiatric disorders compromising cerebral cortex functioning.

  11. Three Perspectives on: Children's Classics in a Non-Classical Age

    ERIC Educational Resources Information Center

    Fadiman, Clifton

    1972-01-01

    Along with pioneering thrusts into new thematic territory for children's literature has come experimentation in form, style, and technique, even more marked in the field of illustration than in verbal narrative. This article serves as an introduction to contributions by English, French and American experts on children's literature. (Author/SJ)

  12. The Gaussian beam mode analysis of classical phase aberrations in diffraction-limited optical systems

    NASA Astrophysics Data System (ADS)

    Trappe, Neil; Murphy, J. Anthony; Withington, Stafford

    2003-07-01

    Gaussian beam mode analysis (GBMA) offers a more intuitive physical insight into how light beams evolve as they propagate than the conventional Fresnel diffraction integral approach. In this paper we illustrate that GBMA is a computationally efficient, alternative technique for tracing the evolution of a diffracting coherent beam. In previous papers we demonstrated the straightforward application of GBMA to the computation of the classical diffraction patterns associated with a range of standard apertures. In this paper we show how the GBMA technique can be expanded to investigate the effects of aberrations in the presence of diffraction by introducing the appropriate phase error term into the propagating quasi-optical beam. We compare our technique to the standard diffraction integral calculation for coma, astigmatism and spherical aberration, taking—for comparison—examples from the classic text 'Principles of Optics' by Born and Wolf. We show the advantages of GBMA for allowing the defocusing of an aberrated image to be evaluated quickly, which is particularly important and useful for probing the consequences of astigmatism and spherical aberration.

  13. Novel high dose rate lip brachytherapy technique to improve dose homogeneity and reduce toxicity by customized mold.

    PubMed

    Feldman, Jon; Appelbaum, Limor; Sela, Mordechay; Voskoboinik, Ninel; Kadouri, Sarit; Weinberger, Jeffrey; Orion, Itzhak; Meirovitz, Amichay

    2014-12-23

    The purpose of this study is to describe a novel brachytherapy technique for lip squamous cell carcinoma, utilizing a customized mold with embedded brachytherapy sleeves, which separates the lip from the mandible and improves dose homogeneity. Seven patients with T2 lip cancer treated with a "sandwich" technique of High Dose Rate (HDR) brachytherapy to the lip, consisting of interstitial catheters and a customized mold with embedded catheters, were reviewed for dosimetry and outcome using 3D planning. A dosimetric comparison was made between the "sandwich" technique and the "classic" plan of interstitial catheters only. We compared dose-volume histograms for Clinical Tumor Volume (CTV), normal-tissue "hot spots", and mandible dose. We report according to ICRU 58 and calculated the Conformal Index (COIN) to show the advantage of our technique. The seven patients (ages 36-81 years, all male) had a median follow-up of 47 months. Four patients received brachytherapy and External Beam Radiation Therapy; three patients received brachytherapy alone. All achieved local control, with excellent esthetic and functional results, and all patients are disease free. The Customized Mold Sandwich technique (CMS) reduced the high-dose region receiving 150% of the prescribed dose (V150) by an average of 20% (range 1-47%). The low-dose region (less than 90% of the prescribed dose) improved on average by 73% with the CMS technique. The COIN value for the CMS was on average 0.92, as opposed to 0.88 for the interstitial catheters only. All differences (excluding the low-dose region) were statistically significant. The CMS technique significantly reduces the high-dose volume and increases treatment homogeneity. This may reduce the potential toxicity to the lip and adjacent mandible, and results in excellent tumor control, cosmesis, and functionality.

  14. Detection of chromosomal changes in chronic lymphocytic leukemia using classical cytogenetic methods and FISH: application of rich mitogen mixtures for lymphocyte cultures.

    PubMed

    Koczkodaj, Dorota; Popek, Sylwia; Zmorzyński, Szymon; Wąsik-Szczepanek, Ewa; Filip, Agata A

    2016-04-01

    One of the research methods of prognostic value in chronic lymphocytic leukemia (CLL) is cytogenetic analysis. This method requires the presence of appropriate B-cell mitogens in cultures in order to obtain a high mitotic index. The aim of our research was to determine the most effective methods of in vitro B-cell stimulation to maximize the number of metaphases from peripheral blood cells of patients with CLL for classical cytogenetic examination, and then to correlate the results with those obtained using fluorescence in situ hybridization (FISH). The study group involved 50 consecutive patients with CLL. Cell cultures were maintained with the basic composition of culture medium and the addition of the respective stimulators: Pokeweed Mitogen (PWM), 12-O-tetradecanoylphorbol 13-acetate (TPA), ionophore (I), lipopolysaccharide (LPS), and the CpG-oligonucleotide DSP30. We obtained the highest mitotic index using the mixture of PWM+TPA+I+DSP30. With classical cytogenetic tests using banding techniques, numerical and structural aberrations of chromosomes were detected in 46 patients; no changes were found in the remaining four patients. The results clearly confirmed the value of using cell cultures enriched with the mixture of cell stimulators, and of combining classical cytogenetic techniques with the FISH technique in subsequent patient diagnosis. Copyright © 2016 American Federation for Medical Research.

  15. The ambiguity of simplicity in quantum and classical simulation

    NASA Astrophysics Data System (ADS)

    Aghamohammadi, Cina; Mahoney, John R.; Crutchfield, James P.

    2017-04-01

    A system's perceived simplicity depends on whether it is represented classically or quantally. This is not so surprising, as classical and quantum physics are descriptive frameworks built on different assumptions that capture, emphasize, and express different properties and mechanisms. What is surprising is that, as we demonstrate, simplicity is ambiguous: the relative simplicity between two systems can change sign when moving between classical and quantum descriptions. Here, we associate simplicity with small model-memory. We see that the notions of absolute physical simplicity at best form a partial, not a total, order. This suggests that appeals to principles of physical simplicity, via Ockham's Razor or to the "elegance" of competing theories, may be fundamentally subjective. Recent rapid progress in quantum computation and quantum simulation suggests that the ambiguity of simplicity will strongly impact statistical inference and, in particular, model selection.

  16. Modern modelling techniques are data hungry: a simulation study for predicting dichotomous endpoints.

    PubMed

    van der Ploeg, Tjeerd; Austin, Peter C; Steyerberg, Ewout W

    2014-12-22

    Modern modelling techniques may potentially provide more accurate predictions of binary outcomes than classical techniques. We aimed to study the predictive performance of different modelling techniques in relation to the effective sample size ("data hungriness"). We performed simulation studies based on three clinical cohorts: 1282 patients with head and neck cancer (with 46.9% 5 year survival), 1731 patients with traumatic brain injury (22.3% 6 month mortality) and 3181 patients with minor head injury (7.6% with CT scan abnormalities). We compared three relatively modern modelling techniques: support vector machines (SVM), neural nets (NN), and random forests (RF) and two classical techniques: logistic regression (LR) and classification and regression trees (CART). We created three large artificial databases with 20 fold, 10 fold and 6 fold replication of subjects, where we generated dichotomous outcomes according to different underlying models. We applied each modelling technique to increasingly larger development parts (100 repetitions). The area under the ROC-curve (AUC) indicated the performance of each model in the development part and in an independent validation part. Data hungriness was defined by plateauing of AUC and small optimism (difference between the mean apparent AUC and the mean validated AUC <0.01). We found that a stable AUC was reached by LR at approximately 20 to 50 events per variable, followed by CART, SVM, NN and RF models. Optimism decreased with increasing sample sizes and the same ranking of techniques. The RF, SVM and NN models showed instability and a high optimism even with >200 events per variable. Modern modelling techniques such as SVM, NN and RF may need over 10 times as many events per variable to achieve a stable AUC and a small optimism than classical modelling techniques such as LR. This implies that such modern techniques should only be used in medical prediction problems if very large data sets are available.
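
    The simulation design above amounts to tracking validated AUC as the development set grows. A compact version of that comparison for one classical and one modern technique, on synthetic data rather than the clinical cohorts, might look as follows.

        # Hedged sketch: validated AUC vs development-set size, LR vs RF.
        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score

        X, y = make_classification(n_samples=20000, n_features=20,
                                   n_informative=8, random_state=0)
        X_dev, y_dev = X[:10000], y[:10000]
        X_val, y_val = X[10000:], y[10000:]   # large independent validation part

        for n in (200, 500, 1000, 5000, 10000):
            aucs = []
            for model in (LogisticRegression(max_iter=1000),
                          RandomForestClassifier(n_estimators=200, random_state=0)):
                model.fit(X_dev[:n], y_dev[:n])
                prob = model.predict_proba(X_val)[:, 1]
                aucs.append(round(roc_auc_score(y_val, prob), 3))
            print(n, "LR:", aucs[0], "RF:", aucs[1])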

  17. Chance, determinism and the classical theory of probability.

    PubMed

    Vasudevan, Anubav

    2018-02-01

    This paper situates the metaphysical antinomy between chance and determinism in the historical context of some of the earliest developments in the mathematical theory of probability. Since Hacking's seminal work on the subject, it has been a widely held view that the classical theorists of probability were guilty of an unwitting equivocation between a subjective, or epistemic, interpretation of probability, on the one hand, and an objective, or statistical, interpretation, on the other. While there is some truth to this account, I argue that the tension at the heart of the classical theory of probability is not best understood in terms of the duality between subjective and objective interpretations of probability. Rather, the apparent paradox of chance and determinism, when viewed through the lens of the classical theory of probability, manifests itself in a much deeper ambivalence on the part of the classical probabilists as to the rational commensurability of causal and probabilistic reasoning. Copyright © 2017 Elsevier Ltd. All rights reserved.

  18. Power-law distributions for a trapped ion interacting with a classical buffer gas.

    PubMed

    DeVoe, Ralph G

    2009-02-13

    Classical collisions with an ideal gas generate non-Maxwellian distribution functions for a single ion in a radio frequency ion trap. The distributions have power-law tails whose exponent depends on the ratio of buffer gas to ion mass. This provides a statistical explanation for the previously observed transition from cooling to heating. Monte Carlo results approximate a Tsallis distribution over a wide range of parameters and have ab initio agreement with experiment.

  19. Study on elevated-temperature flow behavior of Ni-Cr-Mo-B ultra-heavy-plate steel via experiment and modelling

    NASA Astrophysics Data System (ADS)

    Gao, Zhi-yu; Kang, Yu; Li, Yan-shuai; Meng, Chao; Pan, Tao

    2018-04-01

    Elevated-temperature flow behavior of a novel Ni-Cr-Mo-B ultra-heavy-plate steel was investigated by conducting hot compressive deformation tests on a Gleeble-3800 thermo-mechanical simulator over a temperature range of 1123 K to 1423 K, with strain rates from 0.01 s⁻¹ to 10 s⁻¹ and a height reduction of 70%. Based on the experimental results, a classic strain-compensated Arrhenius-type model, a new revised strain-compensated Arrhenius-type model, and a classic modified Johnson-Cook constitutive model were developed for predicting the high-temperature deformation behavior of the steel. The predictability of these models was comparatively evaluated in terms of statistical parameters including the correlation coefficient (R), average absolute relative error (AARE), root mean square error (RMSE), normalized mean bias error (NMBE), and relative error. The statistical results indicate that the new revised strain-compensated Arrhenius-type model accurately predicts the elevated-temperature flow stress of the steel over the entire range of process conditions. The values predicted by the classic modified Johnson-Cook model, however, did not agree well with the experimental values; the classic strain-compensated Arrhenius-type model tracked the deformation behavior more accurately than the modified Johnson-Cook model, but less accurately than the new revised strain-compensated Arrhenius-type model. In addition, reasons for the differences in predictability of these models are discussed in detail.
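
    For reference, strain-compensated Arrhenius-type models are typically built around the hyperbolic-sine law and the Zener-Hollomon parameter; the forms below are the standard ones from the hot-deformation literature, written here for orientation rather than with this paper's fitted coefficients:

        \dot{\varepsilon} = A\,[\sinh(\alpha\sigma)]^{n}\exp\!\left(-\frac{Q}{RT}\right),
        \qquad
        Z = \dot{\varepsilon}\exp\!\left(\frac{Q}{RT}\right),

    with the strain compensation introduced by fitting A, α, n, and Q as polynomial functions of strain. The error metrics cited in the abstract are conventionally defined as

        \mathrm{AARE} = \frac{1}{N}\sum_{i=1}^{N}\left|\frac{E_i - P_i}{E_i}\right| \times 100\%,
        \qquad
        \mathrm{RMSE} = \sqrt{\frac{1}{N}\sum_{i=1}^{N}(E_i - P_i)^2},

    where E_i and P_i are the experimental and predicted stresses.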

  20. Number statistics for β-ensembles of random matrices: Applications to trapped fermions at zero temperature.

    PubMed

    Marino, Ricardo; Majumdar, Satya N; Schehr, Grégory; Vivo, Pierpaolo

    2016-09-01

    Let P_{β}^{(V)}(N_{I}) be the probability that an N×N β-ensemble of random matrices with confining potential V(x) has N_{I} eigenvalues inside an interval I=[a,b] on the real line. We introduce a general formalism, based on the Coulomb gas technique and the resolvent method, to compute analytically P_{β}^{(V)}(N_{I}) for large N. We show that this probability scales for large N as P_{β}^{(V)}(N_{I})≈exp[-βN^{2}ψ^{(V)}(N_{I}/N)], where β is the Dyson index of the ensemble. The rate function ψ^{(V)}(k_{I}), independent of β, is computed in terms of single integrals that can be easily evaluated numerically. The general formalism is then applied to the classical β-Gaussian (I=[-L,L]), β-Wishart (I=[1,L]), and β-Cauchy (I=[-L,L]) ensembles. Expanding the rate function around its minimum, we find that generically the number variance var(N_{I}) exhibits a nonmonotonic behavior as a function of the size of the interval, with a maximum that can be precisely characterized. These analytical results, corroborated by numerical simulations, provide the full counting statistics of many systems where random matrix models apply. In particular, we present results for the full counting statistics of zero-temperature one-dimensional spinless fermions in a harmonic trap.
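
    The large-N prediction can be checked directly by Monte Carlo for the β = 1 (Gaussian orthogonal) case: sample matrices, count eigenvalues in I, and compare the variance of the count with its mean. The matrix size, interval, and trial count below are arbitrary demonstration values.

        # Hedged sketch: counting statistics N_I for GOE matrices.
        import numpy as np

        rng = np.random.default_rng(6)
        N, trials = 100, 400
        a, b = -0.5, 0.5                     # interval I; semicircle support [-2, 2]
        counts = np.empty(trials, dtype=int)
        for t in range(trials):
            A = rng.normal(size=(N, N))
            H = (A + A.T) / np.sqrt(2 * N)   # scaling puts the spectrum in [-2, 2]
            eigs = np.linalg.eigvalsh(H)
            counts[t] = int(((eigs >= a) & (eigs <= b)).sum())

        # For random-matrix statistics var(N_I) << mean(N_I), unlike Poisson.
        print("mean N_I:", counts.mean(), " var N_I:", counts.var())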

  1. The Application of FT-IR Spectroscopy for Quality Control of Flours Obtained from Polish Producers

    PubMed Central

    Ceglińska, Alicja; Reder, Magdalena; Ciemniewska-Żytkiewicz, Hanna

    2017-01-01

    Samples of wheat, spelt, rye, and triticale flours produced by different Polish mills were studied by both classic chemical methods and FT-IR MIR spectroscopy. An attempt was made to statistically correlate FT-IR spectral data with reference data regarding the content of various components, for example proteins, fats, ash, and fatty acids, as well as properties such as moisture, falling number, and energetic value. This correlation resulted in calibrated and validated statistical models for versatile evaluation of unknown flour samples. The calibration data set was used to construct calibration models using the CSR and PLS methods with the leave-one-out cross-validation technique. The calibrated models were validated with a validation data set. The results obtained confirmed that application of statistical models based on MIR spectral data is a robust, accurate, precise, rapid, inexpensive, and convenient methodology for determination of flour characteristics, as well as for detection of the content of selected flour ingredients. The obtained models' characteristics were as follows: R2 = 0.97, PRESS = 2.14; R2 = 0.96, PRESS = 0.69; R2 = 0.95, PRESS = 1.27; R2 = 0.94, PRESS = 0.76, for content of proteins, lipids, ash, and moisture level, respectively. The best results of the CSR models were obtained for protein, ash, and crude fat (R2 = 0.86, 0.82, and 0.78, respectively). PMID:28243483
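
    The PLS-with-LOO workflow reported above maps onto a few lines of scikit-learn; the simulated spectra and reference values below stand in for the FT-IR measurements and wet-chemistry assays, and the component count is an arbitrary choice.

        # Hedged sketch: PLS calibration with leave-one-out cross-validation.
        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import LeaveOneOut, cross_val_predict

        rng = np.random.default_rng(7)
        n_samples, n_wavenumbers = 40, 300
        X = rng.normal(size=(n_samples, n_wavenumbers))            # "spectra"
        y = X[:, :5].sum(axis=1) + rng.normal(0, 0.3, n_samples)   # "protein %"

        pls = PLSRegression(n_components=5)
        y_cv = cross_val_predict(pls, X, y, cv=LeaveOneOut())

        press = float(((y - y_cv.ravel()) ** 2).sum())
        r2 = 1.0 - press / float(((y - y.mean()) ** 2).sum())
        print(f"LOO R^2 = {r2:.3f}, PRESS = {press:.2f}")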

  2. Bladder cancer mapping in Libya based on standardized morbidity ratio and log-normal model

    NASA Astrophysics Data System (ADS)

    Alhdiri, Maryam Ahmed; Samat, Nor Azah; Mohamed, Zulkifley

    2017-05-01

    Disease mapping comprises a set of statistical techniques that produce detailed maps of rates based on estimated mortality, morbidity, and prevalence. A traditional approach to measuring the relative risk of a disease is the Standardized Morbidity Ratio (SMR). It is the ratio of the observed to the expected number of cases in an area, and it has the greatest uncertainty if the disease is rare or if the geographical area is small. Therefore, Bayesian models or statistical smoothing based on the log-normal model are introduced, which might solve the SMR problem. This study estimates the relative risk for bladder cancer incidence in Libya from 2006 to 2007 based on the SMR and the log-normal model, which were fitted to data using WinBUGS software. This study starts with a brief review of these models, starting with the SMR method and followed by the log-normal model, which is then applied to bladder cancer incidence in Libya. All results are compared using maps and tables. The study concludes that the log-normal model gives better relative risk estimates compared to the classical method. The log-normal model can overcome the SMR problem when there is no observed bladder cancer in an area.
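
    The SMR itself is a simple ratio, which makes its instability easy to see. A toy sketch (the numbers are illustrative, not the Libyan data):

      import numpy as np

      def smr(observed, expected):
          # Standardized Morbidity Ratio per area: O_i / E_i.
          # Unstable when E_i is small (rare disease or small area),
          # and exactly zero wherever no case is observed.
          return np.asarray(observed, float) / np.asarray(expected, float)

      O = np.array([0, 3, 7, 1])           # observed cases per area
      E = np.array([1.2, 2.5, 5.8, 0.9])   # expected cases from reference rates
      print(smr(O, E))                     # note the zero estimate where O_i = 0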

  3. Estimation of critical behavior from the density of states in classical statistical models

    NASA Astrophysics Data System (ADS)

    Malakis, A.; Peratzakis, A.; Fytas, N. G.

    2004-12-01

    We present a simple and efficient approximation scheme which greatly facilitates the extension of Wang-Landau sampling (or similar techniques) in large systems for the estimation of critical behavior. The method, presented in an algorithmic approach, is based on a very simple idea, familiar in statistical mechanics from the notion of thermodynamic equivalence of ensembles and the central limit theorem. It is illustrated that we can predict with high accuracy the critical part of the energy space and by using this restricted part we can extend our simulations to larger systems and improve the accuracy of critical parameters. It is proposed that the extensions of the finite-size critical part of the energy space, determining the specific heat, satisfy a scaling law involving the thermal critical exponent. The method is applied successfully for the estimation of the scaling behavior of specific heat of both square and simple cubic Ising lattices. The proposed scaling law is verified by estimating the thermal critical exponent from the finite-size behavior of the critical part of the energy space. The density of states of the zero-field Ising model on these lattices is obtained via a multirange Wang-Landau sampling.
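
    For reference, a bare-bones version of the underlying Wang-Landau sampling for the 2D Ising model (a sketch of the standard algorithm with small illustrative parameters, without the paper's restriction to the critical part of the energy space):

      import numpy as np

      def wang_landau(L=8, flat=0.8, lnf_final=1e-6, seed=1):
          # Estimate ln g(E) for the L x L periodic Ising model.
          rng = np.random.default_rng(seed)
          s = rng.choice([-1, 1], size=(L, L))
          E = -np.sum(s * np.roll(s, 1, 0)) - np.sum(s * np.roll(s, 1, 1))
          lng, hist, lnf = {}, {}, 1.0
          while lnf > lnf_final:
              for _ in range(10000 * L * L):
                  i, j = rng.integers(L), rng.integers(L)
                  dE = 2 * s[i, j] * (s[(i+1) % L, j] + s[(i-1) % L, j]
                                      + s[i, (j+1) % L] + s[i, (j-1) % L])
                  # accept with min(1, g(E)/g(E+dE)) using current estimates
                  if np.log(rng.random()) < lng.get(E, 0.0) - lng.get(E + dE, 0.0):
                      s[i, j] *= -1
                      E += dE
                  lng[E] = lng.get(E, 0.0) + lnf
                  hist[E] = hist.get(E, 0) + 1
              h = np.array(list(hist.values()))
              if h.min() > flat * h.mean():   # flat-histogram criterion
                  hist = {}
                  lnf /= 2.0                  # refine modification factor, f -> sqrt(f)
          return lng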

  4. A low-order model for wave propagation in random waveguides

    NASA Astrophysics Data System (ADS)

    Millet, Christophe; Bertin, Michael; Bouche, Daniel

    2014-11-01

    In numerical modeling of infrasound propagation in the atmosphere, the wind and temperature profiles are usually obtained as a result of matching atmospheric models to empirical data and thus inevitably involve some random errors. In the present approach, the sound speed profiles are considered as random functions and the wave equation is solved using a reduced-order model, starting from the classical normal mode technique. We focus on the asymptotic behavior of the transmitted waves in the weakly heterogeneous regime (the coupling between the wave and the medium is weak), with a fixed number of propagating modes that can be obtained by rearranging the eigenvalues by decreasing Sobol indices. The most important feature of the stochastic approach lies in the fact that the model order can be computed to satisfy a given statistical accuracy whatever the frequency. The statistics of a transmitted broadband pulse are computed by decomposing the original pulse into a sum of modal pulses that can be described by a front pulse stabilization theory. The method is illustrated on two large-scale infrasound calibration experiments that were conducted at the Sayarim Military Range, Israel, in 2009 and 2011.

  5. Biological effective dose evaluation in gynaecological brachytherapy: LDR and HDR treatments, dependence on radiobiological parameters, and treatment optimisation.

    PubMed

    Bianchi, C; Botta, F; Conte, L; Vanoli, P; Cerizza, L

    2008-10-01

    This study was undertaken to compare the biological efficacy of different high-dose-rate (HDR) and low-dose-rate (LDR) treatments of gynaecological lesions, to identify the causes of possible nonuniformity and to optimise treatment through customised calculation. The study considered 110 patients treated between 2001 and 2006 with external beam radiation therapy and/or brachytherapy with either LDR (afterloader Selectron, (137)Cs) or HDR (afterloader microSelectron Classic, (192)Ir). The treatments were compared in terms of biologically effective dose (BED) to the tumour and to the rectum (linear-quadratic model) by using statistical tests for comparisons between independent samples. The difference between the two treatments was statistically significant in one case only. However, within each technique, we identified considerable nonuniformity in therapeutic efficacy due to differences in fractionation schemes and overall treatment time. To solve this problem, we created a Microsoft Excel spreadsheet allowing calculation of the optimal treatment for each patient: best efficacy (BED(tumour)) without exceeding toxicity threshold (BED(rectum)). The efficacy of a treatment may vary as a result of several factors. Customised radiobiological evaluation is a useful adjunct to clinical evaluation in planning equivalent treatments that satisfy all dosimetric constraints.
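
    The BED comparison rests on the linear-quadratic model. A minimal sketch of the fractionated (HDR-style) case, ignoring dose-rate and repopulation corrections; the alpha/beta values below are common textbook assumptions, not the paper's:

      def bed_fractionated(n, d, alpha_beta):
          # Biologically effective dose for n fractions of d Gy each,
          # linear-quadratic model without a time factor (simplified):
          # BED = n * d * (1 + d / (alpha/beta)).
          return n * d * (1.0 + d / alpha_beta)

      # Illustrative HDR scheme of 5 x 7 Gy, evaluated for tumour and rectum
      print(bed_fractionated(5, 7.0, 10.0))  # BED_tumour (alpha/beta ~ 10 Gy)
      print(bed_fractionated(5, 7.0, 3.0))   # BED_rectum (alpha/beta ~ 3 Gy, full dose)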

  6. An Efficient Augmented Lagrangian Method for Statistical X-Ray CT Image Reconstruction.

    PubMed

    Li, Jiaojiao; Niu, Shanzhou; Huang, Jing; Bian, Zhaoying; Feng, Qianjin; Yu, Gaohang; Liang, Zhengrong; Chen, Wufan; Ma, Jianhua

    2015-01-01

    Statistical iterative reconstruction (SIR) for X-ray computed tomography (CT) under the penalized weighted least-squares criteria can yield significant gains over conventional analytical reconstruction from the noisy measurement. However, due to the nonlinear expression of the objective function, most existing algorithms related to the SIR unavoidably suffer from a heavy computation load and a slow convergence rate, especially when an edge-preserving or sparsity-based penalty or regularization is incorporated. In this work, to address the abovementioned issues of general SIR algorithms, we propose an adaptive nonmonotone alternating direction algorithm in the framework of the augmented Lagrangian multiplier method, termed "ALM-ANAD". The algorithm effectively combines an alternating direction technique with an adaptive nonmonotone line search to minimize the augmented Lagrangian function at each iteration. To evaluate the present ALM-ANAD algorithm, both qualitative and quantitative studies were conducted by using digital and physical phantoms. Experimental results show that the present ALM-ANAD algorithm can achieve noticeable gains over the classical nonlinear conjugate gradient algorithm and state-of-the-art split Bregman algorithm in terms of noise reduction, contrast-to-noise ratio, convergence rate, and universal quality index metrics.

  7. Low order models for uncertainty quantification in acoustic propagation problems

    NASA Astrophysics Data System (ADS)

    Millet, Christophe

    2016-11-01

    Long-range sound propagation problems are characterized by both a large number of length scales and a large number of normal modes. In the atmosphere, these modes are confined within waveguides causing the sound to propagate through multiple paths to the receiver. For uncertain atmospheres, the modes are described as random variables. Concise mathematical models and analysis reveal fundamental limitations in classical projection techniques due to different manifestations of the fact that modes that carry small variance can have important effects on the large variance modes. In the present study, we propose a systematic strategy for obtaining statistically accurate low order models. The normal modes are sorted in decreasing Sobol indices using asymptotic expansions, and the relevant modes are extracted using a modified iterative Krylov-based method. The statistics of acoustic signals are computed by decomposing the original pulse into a truncated sum of modal pulses that can be described by a stationary phase method. As the low-order acoustic model preserves the overall structure of waveforms under perturbations of the atmosphere, it can be applied to uncertainty quantification. The result of this study is a new algorithm which applies on the entire phase space of acoustic fields.

  8. Note Onset Deviations as Musical Piece Signatures

    PubMed Central

    Serrà, Joan; Özaslan, Tan Hakan; Arcos, Josep Lluis

    2013-01-01

    A competent interpretation of a musical composition presents several non-explicit departures from the written score. Timing variations are perhaps the most important ones: they are fundamental for expressive performance and a key ingredient for conferring a human-like quality to machine-based music renditions. However, the nature of such variations is still an open research question, with diverse theories that indicate a multi-dimensional phenomenon. In the present study, we consider event-shift timing variations and show that sequences of note onset deviations are robust and reliable predictors of the musical piece being played, irrespective of the performer. In fact, our results suggest that only a few consecutive onset deviations are already enough to identify a musical composition with statistically significant accuracy. We consider a mid-size collection of commercial recordings of classical guitar pieces and follow a quantitative approach based on the combination of standard statistical tools and machine learning techniques with the semi-automatic estimation of onset deviations. Besides the reported results, we believe that the considered materials and the methodology followed widen the testing ground for studying musical timing and could open new perspectives in related research fields. PMID:23935971

  9. Malnutrition and Environmental Enrichment: A Statistical Reappraisal of the Findings of the Adoption Study of Winick et al. (1975).

    ERIC Educational Resources Information Center

    Trueman, Mark

    1985-01-01

    Critically reviews the influential study "Malnutrition and Environmental Enrichment" by Winick et al. (1975) and highlights what are considered to be statistical flaws in its analysis. Data in the classic study of height, weight, and IQ changes in three groups of adopted, malnourished Korean girls are reanalysed and conclusions…

  10. Information flow and quantum cryptography using statistical fluctuations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Home, D.; Whitaker, M.A.B.

    2003-02-01

    A procedure is formulated, using the quantum teleportation arrangement, that communicates knowledge of an apparatus setting between the wings of the experiment, using statistical fluctuations in a sequence of measurement results. It requires an entangled state, and transmission of classical information totally unrelated to the apparatus setting actually communicated. Our procedure has conceptual interest, and has applications to quantum cryptography.

  11. Extraction of decision rules via imprecise probabilities

    NASA Astrophysics Data System (ADS)

    Abellán, Joaquín; López, Griselda; Garach, Laura; Castellano, Javier G.

    2017-05-01

    Data analysis techniques can be applied to discover important relations among features. This is the main objective of the Information Root Node Variation (IRNV) technique, a new method to extract knowledge from data via decision trees. The decision trees used by the original method were built using classic split criteria. The performance of new split criteria based on imprecise probabilities and uncertainty measures, called credal split criteria, differs significantly from the performance obtained using the classic criteria. This paper extends the IRNV method using two credal split criteria: one based on a mathematical parametric model, and the other based on a non-parametric model. The performance of the method is analyzed using a case study of traffic accident data to identify patterns related to the severity of an accident. We found that a larger number of rules is generated, significantly supplementing the information obtained using the classic split criteria.

  12. OSM-Classic : An optical imaging technique for accurately determining strain

    NASA Astrophysics Data System (ADS)

    Aldrich, Daniel R.; Ayranci, Cagri; Nobes, David S.

    OSM-Classic is a program designed in MATLAB® to provide a method of accurately determining strain in a test sample using an optical imaging technique. Measuring strain for the mechanical characterization of materials is most commonly performed with extensometers, LVDTs (linear variable differential transformers), and strain gauges; however, these strain measurement methods suffer from their fragile nature, and it is not particularly easy to attach these devices to the material for testing. To alleviate these potential problems, an optical approach that does not require contact with the specimen can be implemented to measure the strain. OSM-Classic is software that interrogates a series of images to determine elongation in a test sample and hence, strain of the specimen. It was designed to provide a graphical user interface that includes image processing with a dynamic region of interest. Additionally, the strain is calculated directly while providing active feedback during the processing.
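
    The underlying computation is marker tracking plus engineering strain, (L - L0)/L0. A Python toy version of the idea (OSM-Classic itself is a MATLAB® program; the two-marker thresholding scheme below is our simplification, not its actual algorithm):

      import numpy as np

      def marker_positions(img, thresh=0.8):
          # Row centroids of two bright fiducial markers painted on the
          # sample; img is a 2-D grayscale array scaled to [0, 1].
          rows = np.where((img > thresh).any(axis=1))[0]
          gap = np.argmax(np.diff(rows))        # split upper/lower marker bands
          return rows[:gap + 1].mean(), rows[gap + 1:].mean()

      def engineering_strain(images):
          # Strain history from a sequence of images of the gauge section.
          top0, bot0 = marker_positions(images[0])
          L0 = bot0 - top0                      # initial gauge length, pixels
          strains = []
          for img in images[1:]:
              top, bot = marker_positions(img)
              strains.append(((bot - top) - L0) / L0)
          return np.array(strains)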

  13. Crop improvement and conservation through tissue culture techniques

    USDA-ARS?s Scientific Manuscript database

    Crop improvement through classic breeding and/or genetic engineering methods is possible in the majority of cultivated crops. However, gene manipulations, chromosome duplication, protoplast fusion, bioassays, and interspecific cross recovery involve tissue culture techniques. For vegetatively propagated...

  14. A Whirlwind Tour of Computational Geometry.

    ERIC Educational Resources Information Center

    Graham, Ron; Yao, Frances

    1990-01-01

    Described is computational geometry, which uses concepts and results from classical geometry, topology, combinatorics, as well as standard algorithmic techniques such as sorting and searching, graph manipulations, and linear programming. Also included are special techniques and paradigms. (KR)

  15. Wigner surmises and the two-dimensional homogeneous Poisson point process.

    PubMed

    Sakhr, Jamal; Nieminen, John M

    2006-04-01

    We derive a set of identities that relate the higher-order interpoint spacing statistics of the two-dimensional homogeneous Poisson point process to the Wigner surmises for the higher-order spacing distributions of eigenvalues from the three classical random matrix ensembles. We also report a remarkable identity that equates the second-nearest-neighbor spacing statistics of the points of the Poisson process and the nearest-neighbor spacing statistics of complex eigenvalues from Ginibre's ensemble of 2 x 2 complex non-Hermitian random matrices.
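
    The first identity is easy to verify numerically: nearest-neighbor spacings of a 2D homogeneous Poisson process, normalized to unit mean, follow the GOE Wigner surmise p(s) = (π s/2) exp(-π s²/4). A sketch (ours; boundary effects are ignored):

      import numpy as np
      from scipy.spatial import cKDTree

      rng = np.random.default_rng(0)
      pts = rng.random((20000, 2))          # 2-D homogeneous Poisson sample
      d, _ = cKDTree(pts).query(pts, k=2)   # k=2: nearest neighbor besides self
      s = d[:, 1] / d[:, 1].mean()          # spacings normalized to unit mean

      # GOE Wigner surmise, identical to the 2-D Poisson NN spacing law
      wigner_goe = lambda s: (np.pi / 2) * s * np.exp(-np.pi * s**2 / 4)
      hist, edges = np.histogram(s, bins=50, density=True)
      centres = (edges[:-1] + edges[1:]) / 2
      print(np.max(np.abs(hist - wigner_goe(centres))))  # small deviation expected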

  16. A low-order model for long-range infrasound propagation in random atmospheric waveguides

    NASA Astrophysics Data System (ADS)

    Millet, C.; Lott, F.

    2014-12-01

    In numerical modeling of long-range infrasound propagation in the atmosphere, the wind and temperature profiles are usually obtained as a result of matching atmospheric models to empirical data. The atmospheric models are classically obtained from operational numerical weather prediction centers (NOAA Global Forecast System or ECMWF Integrated Forecast system) as well as atmospheric climate reanalysis activities and thus do not explicitly resolve atmospheric gravity waves (GWs). The GWs are generally too small to be represented in Global Circulation Models, and their effects on the resolved scales need to be parameterized in order to account for fine-scale atmospheric inhomogeneities (for length scales less than 100 km). In the present approach, the sound speed profiles are considered as random functions, obtained by superimposing a stochastic GW field on the ECMWF reanalysis ERA-Interim. The spectral domain is binned by a large number of monochromatic GWs, and the breaking of each GW is treated independently from the others. The wave equation is solved using a reduced-order model, starting from the classical normal mode technique. We focus on the asymptotic behavior of the transmitted waves in the weakly heterogeneous regime (for which the coupling between the wave and the medium is weak), with a fixed number of propagating modes that can be obtained by rearranging the eigenvalues by decreasing Sobol indices. The most important feature of the stochastic approach lies in the fact that the model order (i.e. the number of relevant eigenvalues) can be computed to satisfy a given statistical accuracy whatever the frequency. As the low-order model preserves the overall structure of waveforms under sufficiently small perturbations of the profile, it can be applied to sensitivity analysis and uncertainty quantification. The gain in CPU cost provided by the low-order model is essential for extracting statistical information from simulations. The statistics of a transmitted broadband pulse are computed by decomposing the original pulse into a sum of modal pulses that propagate with different phase speeds and can be described by a front pulse stabilization theory. The method is illustrated on two large-scale infrasound calibration experiments that were conducted at the Sayarim Military Range, Israel, in 2009 and 2011.

  17. A Fully Automated Method to Detect and Segment a Manufactured Object in an Underwater Color Image

    NASA Astrophysics Data System (ADS)

    Barat, Christian; Phlypo, Ronald

    2010-12-01

    We propose a fully automated active contours-based method for the detection and the segmentation of a moored manufactured object in an underwater image. Detection of objects in underwater images is difficult due to the variable lighting conditions and shadows on the object. The proposed technique is based on the information contained in the color maps and uses the visual attention method, combined with a statistical approach for the detection and an active contour for the segmentation of the object to overcome the above problems. In the classical active contour method the region descriptor is fixed and the convergence of the method depends on the initialization. With our approach, this dependence is overcome with an initialization using the visual attention results and a criterion to select the best region descriptor. This approach improves the convergence and the processing time while providing the advantages of a fully automated method.

  18. Two worlds collide: Image analysis methods for quantifying structural variation in cluster molecular dynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steenbergen, K. G., E-mail: kgsteen@gmail.com; Gaston, N.

    2014-02-14

    Inspired by methods of remote sensing image analysis, we analyze structural variation in cluster molecular dynamics (MD) simulations through a unique application of the principal component analysis (PCA) and Pearson Correlation Coefficient (PCC). The PCA analysis characterizes the geometric shape of the cluster structure at each time step, yielding a detailed and quantitative measure of structural stability and variation at finite temperature. Our PCC analysis captures bond structure variation in MD, which can be used to both supplement the PCA analysis as well as compare bond patterns between different cluster sizes. Relying only on atomic position data, without requirement for a priori structural input, PCA and PCC can be used to analyze both classical and ab initio MD simulations for any cluster composition or electronic configuration. Taken together, these statistical tools represent powerful new techniques for quantitative structural characterization and isomer identification in cluster MD.

  19. Distance majorization and its applications

    PubMed Central

    Chi, Eric C.; Zhou, Hua; Lange, Kenneth

    2014-01-01

    The problem of minimizing a continuously differentiable convex function over an intersection of closed convex sets is ubiquitous in applied mathematics. It is particularly interesting when it is easy to project onto each separate set, but nontrivial to project onto their intersection. Algorithms based on Newton’s method such as the interior point method are viable for small to medium-scale problems. However, modern applications in statistics, engineering, and machine learning are posing problems with potentially tens of thousands of parameters or more. We revisit this convex programming problem and propose an algorithm that scales well with dimensionality. Our proposal is an instance of a sequential unconstrained minimization technique and revolves around three ideas: the majorization-minimization principle, the classical penalty method for constrained optimization, and quasi-Newton acceleration of fixed-point algorithms. The performance of our distance majorization algorithms is illustrated in several applications. PMID:25392563
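
    A minimal sketch of the distance-majorization idea for the projection problem (the sets, penalty schedule, and names below are our toy choices; the paper additionally uses quasi-Newton acceleration, omitted here):

      import numpy as np

      def proj_ball(x, c, r):
          # Projection onto the closed ball of centre c and radius r.
          d = x - c
          n = np.linalg.norm(d)
          return x if n <= r else c + r * d / n

      def proj_halfspace(x, a, b):
          # Projection onto the halfspace {x : a.x <= b}.
          viol = a @ x - b
          return x if viol <= 0 else x - viol * a / (a @ a)

      def distance_majorize(y, projections, mu=1.0, iters=200):
          # Point of the intersection closest to y. Each MM step minimises
          # ||x - y||^2/2 + (mu/2) * sum_i ||x - P_i(x_k)||^2, which has the
          # closed-form update below; mu is slowly increased (penalty method).
          x = y.copy()
          for _ in range(iters):
              anchors = [P(x) for P in projections]
              x = (y + mu * np.sum(anchors, axis=0)) / (1.0 + mu * len(anchors))
              mu *= 1.05
          return x

      y = np.array([3.0, 0.0])
      sets = [lambda x: proj_ball(x, np.zeros(2), 1.0),
              lambda x: proj_halfspace(x, np.array([0.0, -1.0]), -0.5)]  # x2 >= 0.5
      print(distance_majorize(y, sets))   # converges near (0.866, 0.5)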

  20. Natural air leak test without submergence for spontaneous pneumothorax.

    PubMed

    Uramoto, Hidetaka; Tanaka, Fumihiro

    2011-12-24

    Postoperative air leaks are frequent complications after surgery for a spontaneous pneumothorax (SP). We herein describe a new method to test for air leaks by using a transparent film and thoracic tube in a closed system. Between 2005 and 2010, 35 patients underwent a novel method for evaluating air leaks without submergence, and their clinical records were retrospectively reviewed. The data on patient characteristics, surgical details, and perioperative outcomes were analyzed. The differences in the clinical background and intraoperative factors did not reach a statistically significant level between the new and classical methods. The incidence of recurrence was also equivalent to the standard method. However, the length of the operation and drainage periods were significantly shorter in patients evaluated using the new method than the conventional method. Further, no postoperative complications were observed in patients evaluated using the new method. This simple technique is satisfactorily effective and does not result in any complications.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harstad, E. N.; Harlow, Francis Harvey; Schreyer, H. L.

    Our goal is to develop constitutive relations for the behavior of a solid polymer during high-strain-rate deformations. In contrast to the classic thermodynamic techniques for deriving stress-strain response in static (equilibrium) circumstances, we employ a statistical-mechanics approach, in which we evolve a probability distribution function (PDF) for the velocity fluctuations of the repeating units of the chain. We use a Langevin description for the dynamics of a single repeating unit and a Liouville equation to describe the variations of the PDF. Moments of the PDF give the conservation equations for a single polymer chain embedded in other similar chains. To extract single-chain analytical constitutive relations these equations have been solved for representative loading paths. By this process we discover that a measure of nonuniform chain link displacement serves this purpose very well. We then derive an evolution equation for the descriptor function, with the result being a history-dependent constitutive relation.

  2. Ultra-sensitive flow measurement in individual nanopores through pressure-driven particle translocation.

    PubMed

    Gadaleta, Alessandro; Biance, Anne-Laure; Siria, Alessandro; Bocquet, Lyderic

    2015-05-07

    A challenge for the development of nanofluidics is to develop new instrumentation tools able to probe the extremely small mass transport across individual nanochannels. Such tools are a prerequisite for the fundamental exploration of the breakdown of continuum transport in nanometric confinement. In this letter, we propose a novel method for the measurement of the hydrodynamic permeability of nanometric pores, by diverting the classical technique of Coulter counting to characterize a pressure-driven flow across an individual nanopore. Both the analysis of the translocation rate and the detailed statistics of the dwell time of nanoparticles flowing across a single nanopore allow us to evaluate the permeability of the system. We reach a sensitivity for the water flow down to a few femtoliters per second, which is more than two orders of magnitude better than state-of-the-art alternative methods.
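
    In the ideal case, the flow estimate from the translocation rate reduces to Q = J/c. A sketch (ours) under the assumption that every particle advected through the pore is detected:

      def flow_from_event_rate(rate_hz, particles_per_litre):
          # Volumetric flow through a single nanopore inferred from the
          # Coulter-counting event rate J (events/s) and the particle
          # concentration c, via Q = J / c (simplified; no rejection,
          # no diffusion correction).
          c = particles_per_litre * 1e3     # particles per m^3
          q_m3_per_s = rate_hz / c
          return q_m3_per_s * 1e18          # femtolitres per second (1 m^3 = 1e18 fL)

      print(flow_from_event_rate(0.5, 1e12))   # example: 0.5 events/s at 1e12 /L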

  3. A computationally efficient description of heterogeneous freezing: A simplified version of the Soccer ball model

    NASA Astrophysics Data System (ADS)

    Niedermeier, Dennis; Ervens, Barbara; Clauss, Tina; Voigtländer, Jens; Wex, Heike; Hartmann, Susan; Stratmann, Frank

    2014-01-01

    In a recent study, the Soccer ball model (SBM) was introduced for modeling and/or parameterizing heterogeneous ice nucleation processes. The model applies classical nucleation theory. It allows for a consistent description of both apparently singular and stochastic ice nucleation behavior, by distributing contact angles over the nucleation sites of a particle population assuming a Gaussian probability density function. The original SBM utilizes the Monte Carlo technique, which hampers its usage in atmospheric models, as fairly time-consuming calculations must be performed to obtain statistically significant results. Thus, we have developed a simplified and computationally more efficient version of the SBM. We successfully used the new SBM to parameterize experimental nucleation data of, e.g., bacterial ice nucleation. Both SBMs give identical results; however, the new model is computationally less expensive as confirmed by cloud parcel simulations. Therefore, it is a suitable tool for describing heterogeneous ice nucleation processes in atmospheric models.

  4. Advanced spectrophotometric chemometric methods for resolving the binary mixture of doxylamine succinate and pyridoxine hydrochloride.

    PubMed

    Katsarov, Plamen; Gergov, Georgi; Alin, Aylin; Pilicheva, Bissera; Al-Degs, Yahya; Simeonov, Vasil; Kassarova, Margarita

    2018-03-01

    The prediction power of partial least squares (PLS) and multivariate curve resolution-alternating least squares (MCR-ALS) methods has been studied for simultaneous quantitative analysis of the binary drug combination of doxylamine succinate and pyridoxine hydrochloride. Analysis of first-order UV overlapped spectra was performed using different PLS models - classical PLS1 and PLS2 as well as partial robust M-regression (PRM). These linear models were compared to MCR-ALS with equality and correlation constraints (MCR-ALS-CC). All techniques operated within the full spectral region and extracted maximum information for the drugs analysed. The developed chemometric methods were validated on external sample sets and were applied to the analyses of pharmaceutical formulations. The obtained statistical parameters were satisfactory for both calibration and validation sets. All developed methods can be successfully applied for simultaneous spectrophotometric determination of doxylamine and pyridoxine both in laboratory-prepared mixtures and commercial dosage forms.

  5. Mutual information, neural networks and the renormalization group

    NASA Astrophysics Data System (ADS)

    Koch-Janusz, Maciej; Ringel, Zohar

    2018-06-01

    Physical systems differing in their microscopic details often display strikingly similar behaviour when probed at macroscopic scales. Those universal properties, largely determining their physical characteristics, are revealed by the powerful renormalization group (RG) procedure, which systematically retains `slow' degrees of freedom and integrates out the rest. However, the important degrees of freedom may be difficult to identify. Here we demonstrate a machine-learning algorithm capable of identifying the relevant degrees of freedom and executing RG steps iteratively without any prior knowledge about the system. We introduce an artificial neural network based on a model-independent, information-theoretic characterization of a real-space RG procedure, which performs this task. We apply the algorithm to classical statistical physics problems in one and two dimensions. We demonstrate RG flow and extract the Ising critical exponent. Our results demonstrate that machine-learning techniques can extract abstract physical concepts and consequently become an integral part of theory- and model-building.

  6. Omics integrating physical techniques: aged Piedmontese meat analysis.

    PubMed

    Lana, Alessandro; Longo, Valentina; Dalmasso, Alessandra; D'Alessandro, Angelo; Bottero, Maria Teresa; Zolla, Lello

    2015-04-01

    Piedmontese meat tenderness becomes higher by extending the ageing period after slaughter up to 44 days. Classical physical analyses only partially explain this evidence, so in order to discover the reason for the potential beneficial effects of prolonged ageing, we performed omics analyses in the Longissimus thoracis muscle by examining the main biochemical changes through mass spectrometry-based metabolomics and proteomics. We observed a progressive decline in myofibrillar structural integrity (underpinning meat tenderness) and impaired energy metabolism. Markers of autophagic responses (e.g. serine and glutathione metabolism) and nitrogen metabolism (urea cycle intermediates) accumulated until the end of the assayed period. Key metabolites such as glutamate, a mediator of the appreciated umami taste of the meat, were found to constantly accumulate until day 44. Finally, statistical analyses revealed that glutamate, serine and arginine could serve as good predictors of ultimate meat quality parameters, even though further studies are mandatory.

  7. The Parker-Sochacki Method of Solving Differential Equations: Applications and Limitations

    NASA Astrophysics Data System (ADS)

    Rudmin, Joseph W.

    2006-11-01

    The Parker-Sochacki method is a powerful but simple technique of solving systems of differential equations, giving either analytical or numerical results. It has been in use for about 10 years now since its discovery by G. Edgar Parker and James Sochacki of the James Madison University Dept. of Mathematics and Statistics. It is being presented here because it is still not widely known and can benefit the listeners. It is a method of rapidly generating the Maclaurin series to high order, non-iteratively. It has been successfully applied to more than a hundred systems of equations, including the classical many-body problem. Its advantages include its speed of calculation, its simplicity, and the fact that it uses only addition, subtraction and multiplication. It is not just a polynomial approximation, because it yields the Maclaurin series, and therefore exhibits the advantages and disadvantages of that series. A few applications will be presented.
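
    The flavor of the method is easy to show on the simple harmonic oscillator, where the Maclaurin coefficients follow from a two-term recurrence (our example, not from the talk):

      def parker_sochacki_sho(y0, v0, order=20):
          # Maclaurin coefficients for y'' = -y, written as the first-order
          # system y' = v, v' = -y. The coefficients follow non-iteratively
          # from y_{n+1} = v_n/(n+1), v_{n+1} = -y_n/(n+1), using only
          # addition, subtraction and multiplication (plus one division).
          y, v = [y0], [v0]
          for n in range(order):
              y.append(v[n] / (n + 1))
              v.append(-y[n] / (n + 1))
          return y                      # y(t) ~ sum_n y[n] * t**n

      # y(0)=1, y'(0)=0 reproduces the cosine series 1 - t^2/2 + t^4/24 - ...
      print(parker_sochacki_sho(1.0, 0.0)[:5])   # [1.0, 0.0, -0.5, 0.0, 0.0416...]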

  8. Two worlds collide: image analysis methods for quantifying structural variation in cluster molecular dynamics.

    PubMed

    Steenbergen, K G; Gaston, N

    2014-02-14

    Inspired by methods of remote sensing image analysis, we analyze structural variation in cluster molecular dynamics (MD) simulations through a unique application of the principal component analysis (PCA) and Pearson Correlation Coefficient (PCC). The PCA analysis characterizes the geometric shape of the cluster structure at each time step, yielding a detailed and quantitative measure of structural stability and variation at finite temperature. Our PCC analysis captures bond structure variation in MD, which can be used to both supplement the PCA analysis as well as compare bond patterns between different cluster sizes. Relying only on atomic position data, without requirement for a priori structural input, PCA and PCC can be used to analyze both classical and ab initio MD simulations for any cluster composition or electronic configuration. Taken together, these statistical tools represent powerful new techniques for quantitative structural characterization and isomer identification in cluster MD.

  9. Quantum and classical behavior in interacting bosonic systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hertzberg, Mark P.

    It is understood that in free bosonic theories, the classical field theory accurately describes the full quantum theory when the occupancy numbers of systems are very large. However, the situation is less understood in interacting theories, especially on time scales longer than the dynamical relaxation time. Recently there have been claims that the quantum theory deviates spectacularly from the classical theory on this time scale, even if the occupancy numbers are extremely large. Furthermore, it is claimed that the quantum theory quickly thermalizes while the classical theory does not. The evidence for these claims comes from noticing a spectacular difference in the time evolution of expectation values of quantum operators compared to the classical micro-state evolution. If true, this would have dramatic consequences for many important phenomena, including laboratory studies of interacting BECs, dark matter axions, preheating after inflation, etc. In this work we critically examine these claims. We show that in fact the classical theory can describe the quantum behavior in the high occupancy regime, even when interactions are large. The connection is that the expectation values of quantum operators in a single quantum micro-state are approximated by a corresponding classical ensemble average over many classical micro-states. Furthermore, by the ergodic theorem, a classical ensemble average of local fields with statistical translation invariance is the spatial average of a single micro-state. So the correlation functions of the quantum and classical field theories of a single micro-state approximately agree at high occupancy, even in interacting systems. Furthermore, both quantum and classical field theories can thermalize, when appropriate coarse graining is introduced, with the classical case requiring a cutoff on low occupancy UV modes. We discuss applications of our results.

  10. Airborne non-contact and contact broadband ultrasounds for frequency attenuation profile estimation of cementitious materials.

    PubMed

    Gosálbez, J; Wright, W M D; Jiang, W; Carrión, A; Genovés, V; Bosch, I

    2018-08-01

    In this paper, the study of frequency-dependent ultrasonic attenuation in strongly heterogeneous cementitious materials is addressed. To accurately determine the attenuation over a wide frequency range, it is necessary to have suitable excitation techniques. We have analysed two kinds of ultrasound techniques: contact ultrasound and airborne non-contact ultrasound. The mathematical formulation for frequency-dependent attenuation has been established, revealing that the two techniques may achieve similar results but require different, technique-specific calibration processes. In particular, the airborne non-contact technique suffers high attenuation due to energy losses at the air-material interfaces. Thus, its bandwidth is limited to low frequencies, but it does not require physical contact between transducer and specimen. In contrast, the classical contact technique can manage higher frequencies, but the measurement depends on the pressure between the transducer and the specimen. Cement specimens have been tested with both techniques and the frequency dependence of the attenuation has been estimated. Similar results were achieved in the overlapping bandwidth, and it has been demonstrated that the airborne non-contact ultrasound technique could be a viable alternative to the classical contact technique.
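
    A sketch of the spectral-ratio estimate of frequency-dependent attenuation that both techniques feed into (our simplification; it omits the technique-specific interface-loss calibration the paper emphasizes):

      import numpy as np

      def attenuation_profile(ref, sam, fs, thickness_m):
          # Attenuation in dB/m by the spectral-ratio method: a reference
          # through-transmission signal `ref` (calibration path) is compared
          # with the signal `sam` transmitted through the specimen.
          n = max(len(ref), len(sam))
          f = np.fft.rfftfreq(n, d=1.0 / fs)
          A_ref = np.abs(np.fft.rfft(ref, n))
          A_sam = np.abs(np.fft.rfft(sam, n))
          alpha = 20.0 * np.log10((A_ref + 1e-12) / (A_sam + 1e-12)) / thickness_m
          return f, alpha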

  11. Symmetry-Based Techniques for Qualitative Understanding of Rovibrational Effects in Spherical-Top Molecular Spectra and Dynamics

    NASA Astrophysics Data System (ADS)

    Mitchell, Justin Chadwick

    2011-12-01

    Using light to probe the structure of matter is as natural as opening our eyes. Modern physics and chemistry have turned this art into a rich science, measuring the delicate interactions possible at the molecular level. Perhaps the most commonly used tool in computational spectroscopy is that of matrix diagonalization. While this is invaluable for calculating everything from molecular structure and energy levels to dipole moments and dynamics, the process of numerical diagonalization is an opaque one. This work applies symmetry and semi-classical techniques to elucidate numerical spectral analysis for high-symmetry molecules. Semi-classical techniques, such as the Potential Energy Surfaces, have long been used to help understand molecular vibronic and rovibronic spectra and dynamics. This investigation focuses on newer semi-classical techniques that apply Rotational Energy Surfaces (RES) to rotational energy level clustering effects in high-symmetry molecules. Such clusters exist in rigid rotor molecules as well as deformable spherical tops. This study begins by using the simplicity of rigid symmetric top molecules to clarify the classical-quantum correspondence of RES semi-classical analysis and then extends it to a more precise and complete theory of modern high-resolution spectra. RES analysis is extended to molecules having more complex and higher rank tensorial rotational and rovibrational Hamiltonians than were possible to understand before. Such molecules are shown to produce an extraordinary range of rotational level clusters, corresponding to a panoply of symmetries ranging from C4v to C2 and C1 (no symmetry) with a corresponding range of new angular momentum localization and J-tunneling effects. Using RES topography analysis and the commutation duality relations between symmetry group operators in the lab-frame to those in the body-frame, it is shown how to better describe and catalog complex splittings found in rotational level clusters. Symmetry character analysis is generalized to give analytic eigensolutions. An appendix provides vibrational analogies. For the first time, interactions between molecular vibrations (polyads) are described semi-classically by multiple RES. This is done for the nu 3/2nu4 dyad of CF4. The nine-surface RES topology of the U(9)-dyad agrees with both computational and experimental work. A connection between this and a simpler U(2) example is detailed in an Appendix.

  12. Preoperative and Postoperative CT Scan Assessment of Pterygomaxillary Junction in Patients Undergoing Le Fort I Osteotomy: Comparison of Pterygomaxillary Dysjunction Technique and Trimble Technique-A Pilot Study.

    PubMed

    Dadwal, Himani; Shanmugasundaram, S; Krishnakumar Raja, V B

    2015-09-01

    To determine the rate of complications and occurrence of pterygoid plate fractures comparing two techniques of Le Fort I osteotomy i.e., Classic Pterygomaxillary Dysjunction technique and Trimble technique and to know whether the dimensions of pterygomaxillary junction [determined preoperatively by computed tomography (CT) scan] have any influence on pterygomaxillary separation achieved during surgery. The study group consisted of eight South Indian patients with maxillary excess. A total of 16 sides were examined by CT. Preoperative CT was analyzed for all the patients. The thickness and width of the pterygomaxillary junction and the distance of the greater palatine canal from the pterygomaxillary junction was noted. Pterygomaxillary dysjunction was achieved by two techniques, the classic pterygomaxillary dysjunction technique (Group I) and Trimble technique (Group II). Patients were selected randomly and equally for both the techniques. Dysjunction was analyzed by postoperative CT. The average thickness of the pterygomaxillary junction on 16 sides was 4.5 ± 1.2 mm. Untoward pterygoid plate fractures occurred in Group I in 3 sides out of 8. In Trimble technique (Group II), no pterygoid plate fractures were noted. The average width of the pterygomaxillary junction was 7.8 ± 1.5 mm, distance of the greater palatine canal from pterygomaxillary junction was 7.4 ± 1.6 mm and the length of fusion of pterygomaxillary junction was 8.0 ± 1.9 mm. The Le Fort I osteotomy has become a standard procedure for correcting various dentofacial deformities. In an attempt to make Le Fort I osteotomy safer and avoid the problems associated with sectioning with an osteotome between the maxillary tuberosity and the pterygoid plates, Trimble suggested sectioning across the posterior aspect of the maxillary tuberosity itself. In our study, comparison between the classic pterygomaxillary dysjunction technique and the Trimble technique was made by using postoperative CT scan. It was found that unfavorable pterygoid plate fractures occurred only in dysjunction group and not in Trimble technique group. Preoperative CT scan assessment was done for all the patients to determine the dimension of the pterygomaxillary region. Preoperative CT scan proved to be helpful in not only determining the dimensions of the pterygomaxillary region but we also found out that thickness of the pterygomaxillary junction was an important parameter which may influence the separation at the pterygomaxillary region. No untoward fractures of the pterygoid plates were seen in Trimble technique (Group II) which makes it a safer technique than classic dysjunction technique. It was noted that pterygoid plate fractures occurred in patients in whom the thickness of the pterygomaxillary junction was <3.6 mm (preoperatively). Therefore, preoperative evaluation is important, on the basis of which we can decide upon the technique to be selected for safer and acceptable separation of pterygomaxillary region.

  13. Point and path performance of light aircraft: A review and analysis

    NASA Technical Reports Server (NTRS)

    Smetana, F. O.; Summey, D. C.; Johnson, W. D.

    1973-01-01

    The literature on methods for predicting the performance of light aircraft is reviewed. The methods discussed in the review extend from the classical instantaneous maximum or minimum technique to techniques for generating mathematically optimum flight paths. Classical point performance techniques are shown to be adequate in many cases but their accuracies are compromised by the need to use simple lift, drag, and thrust relations in order to get closed form solutions. Also the investigation of the effect of changes in weight, altitude, configuration, etc. involves many essentially repetitive calculations. Accordingly, computer programs are provided which can fit arbitrary drag polars and power curves with very high precision and which can then use the resulting fits to compute the performance under the assumption that the aircraft is not accelerating.
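
    The closed-form point-performance methods referred to here rest on simple fits such as the parabolic drag polar CD = CD0 + k·CL². A sketch with placeholder data (the coefficients are illustrative, not from the report):

      import numpy as np

      # Lift and drag coefficients at several trim points (placeholders)
      CL = np.array([0.2, 0.4, 0.6, 0.8, 1.0, 1.2])
      CD = np.array([0.025, 0.031, 0.041, 0.055, 0.073, 0.095])

      # Parabolic drag polar CD = CD0 + k*CL^2, fitted by least squares
      k, CD0 = np.polyfit(CL**2, CD, 1)
      print(f"CD0 = {CD0:.4f}, k = {k:.4f}")

      # Best glide occurs where CL/CD is maximum, i.e. at CL = sqrt(CD0/k),
      # where CD = 2*CD0, so (L/D)max = CL_bg / (2*CD0).
      CL_bg = np.sqrt(CD0 / k)
      print(f"best-glide CL = {CL_bg:.3f}, (L/D)max = {CL_bg / (2 * CD0):.1f}")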

  14. Fuzzy logic controller versus classical logic controller for residential hybrid solar-wind-storage energy system

    NASA Astrophysics Data System (ADS)

    Derrouazin, A.; Aillerie, M.; Mekkakia-Maaza, N.; Charles, J. P.

    2016-07-01

    Much research on the management of diverse hybrid energy systems has been carried out, and many techniques have been proposed for robustness, savings and environmental purposes. In this work we aim to make a comparative study between two supervision and control techniques, fuzzy and classical logic, to manage a hybrid energy system for typical housing fed by solar and wind power, with a rack of batteries for storage. The system is assisted by the electric grid during energy drop moments. A hydrogen production device is integrated into the system to retrieve surplus energy production from renewable sources for household purposes, intending the maximum exploitation of these sources over the years. The models have been implemented, and the command signals generated for the electronic switches by both proposed techniques are presented and discussed in this paper.

  15. Techniques in Adlerian Psychology.

    ERIC Educational Resources Information Center

    Carlson, Jon, Ed.; Slavik, Steven, Ed.

    This book is a collection of classic and recent papers (published between 1964 and 1994) reprinted from the "Journal of Juvenile Psychology," "Individual Psychologist," and "Individual Psychology." Each of the five sections is introduced by the editor's comments. "General Techniques" contains the following…

  16. Could the clinical interpretability of subgroups detected using clustering methods be improved by using a novel two-stage approach?

    PubMed

    Kent, Peter; Stochkendahl, Mette Jensen; Christensen, Henrik Wulff; Kongsted, Alice

    2015-01-01

    Recognition of homogeneous subgroups of patients can usefully improve prediction of their outcomes and the targeting of treatment. There are a number of research approaches that have been used to recognise homogeneity in such subgroups and to test their implications. One approach is to use statistical clustering techniques, such as Cluster Analysis or Latent Class Analysis, to detect latent relationships between patient characteristics. Influential patient characteristics can come from diverse domains of health, such as pain, activity limitation, physical impairment, social role participation, psychological factors, biomarkers and imaging. However, such 'whole person' research may result in data-driven subgroups that are complex, difficult to interpret and challenging to recognise clinically. This paper describes a novel approach to applying statistical clustering techniques that may improve the clinical interpretability of derived subgroups and reduce sample size requirements. This approach involves clustering in two sequential stages. The first stage involves clustering within health domains and therefore requires creating as many clustering models as there are health domains in the available data. This first stage produces scoring patterns within each domain. The second stage involves clustering using the scoring patterns from each health domain (from the first stage) to identify subgroups across all domains. We illustrate this using chest pain data from the baseline presentation of 580 patients. The new two-stage clustering resulted in two subgroups that approximated the classic textbook descriptions of musculoskeletal chest pain and atypical angina chest pain. The traditional single-stage clustering resulted in five clusters that were also clinically recognisable but displayed less distinct differences. In this paper, a new approach to using clustering techniques to identify clinically useful subgroups of patients is suggested. Research designs, statistical methods and outcome metrics suitable for performing that testing are also described. This approach has potential benefits but requires broad testing, in multiple patient samples, to determine its clinical value. The usefulness of the approach is likely to be context-specific, depending on the characteristics of the available data and the research question being asked of it.
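
    A schematic of the two-stage idea (with k-means standing in for whichever clustering method, such as Cluster Analysis or Latent Class Analysis, is actually used, and random placeholders for the health-domain data):

      import numpy as np
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(0)
      n = 580   # patients, as in the chest pain example
      # Placeholder feature blocks for three health domains (e.g. pain,
      # activity limitation, psychological factors).
      domains = [rng.random((n, 6)), rng.random((n, 4)), rng.random((n, 5))]

      # Stage 1: one clustering model per health domain, giving each patient
      # a within-domain scoring pattern (here simply the cluster label).
      stage1 = np.column_stack([
          KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
          for X in domains])

      # Stage 2: cluster the domain-level patterns to form final subgroups.
      subgroups = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(stage1)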

  17. Quantum-Classical Correspondence Principle for Work Distributions

    NASA Astrophysics Data System (ADS)

    Jarzynski, Christopher; Quan, H. T.; Rahav, Saar

    2015-07-01

    For closed quantum systems driven away from equilibrium, work is often defined in terms of projective measurements of initial and final energies. This definition leads to statistical distributions of work that satisfy nonequilibrium work and fluctuation relations. While this two-point measurement definition of quantum work can be justified heuristically by appeal to the first law of thermodynamics, its relationship to the classical definition of work has not been carefully examined. In this paper, we employ semiclassical methods, combined with numerical simulations of a driven quartic oscillator, to study the correspondence between classical and quantal definitions of work in systems with 1 degree of freedom. We find that a semiclassical work distribution, built from classical trajectories that connect the initial and final energies, provides an excellent approximation to the quantum work distribution when the trajectories are assigned suitable phases and are allowed to interfere. Neglecting the interferences between trajectories reduces the distribution to that of the corresponding classical process. Hence, in the semiclassical limit, the quantum work distribution converges to the classical distribution, decorated by a quantum interference pattern. We also derive the form of the quantum work distribution at the boundary between classically allowed and forbidden regions, where this distribution tunnels into the forbidden region. Our results clarify how the correspondence principle applies in the context of quantum and classical work distributions and contribute to the understanding of work and nonequilibrium work relations in the quantum regime.

  18. Characterizing chaotic melodies in automatic music composition

    NASA Astrophysics Data System (ADS)

    Coca, Andrés E.; Tost, Gerard O.; Zhao, Liang

    2010-09-01

    In this paper, we initially present an algorithm for automatic composition of melodies using chaotic dynamical systems. Afterward, we characterize chaotic music in a comprehensive way as comprising three perspectives: musical discrimination, dynamical influence on musical features, and musical perception. With respect to the first perspective, the coherence between generated chaotic melodies (continuous as well as discrete chaotic melodies) and a set of classical reference melodies is characterized by statistical descriptors and melodic measures. The significant differences among the three types of melodies are determined by discriminant analysis. Regarding the second perspective, the influence of dynamical features of chaotic attractors, e.g., Lyapunov exponent, Hurst coefficient, and correlation dimension, on melodic features is determined by canonical correlation analysis. The last perspective is related to perception of originality, complexity, and degree of melodiousness (Euler's gradus suavitatis) of chaotic and classical melodies by nonparametric statistical tests.
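
    A toy example of a discrete chaotic melody generator in the spirit described (the logistic map and the pitch mapping are our choices, not the paper's system):

      def logistic_melody(x0=0.4, r=3.99, length=32,
                          scale=(60, 62, 64, 65, 67, 69, 71, 72)):
          # Iterate the logistic map x -> r*x*(1-x) in its chaotic regime
          # and quantise each value onto a C-major octave of MIDI pitches.
          x, notes = x0, []
          for _ in range(length):
              x = r * x * (1.0 - x)
              notes.append(scale[int(x * len(scale))])
          return notes

      print(logistic_melody())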

  19. The Effect of Live Classical Piano Music on the Vital Signs of Patients Undergoing Ophthalmic Surgery

    PubMed Central

    Camara, Jorge G.; Ruszkowski, Joseph M.; Worak, Sandra R.

    2008-01-01

    Context: Music and surgery. Objective: To determine the effect of live classical piano music on vital signs of patients undergoing ophthalmic surgery. Design: Retrospective case series. Setting and Patients: 203 patients who underwent various ophthalmologic procedures in a period during which a piano was present in the operating room of St. Francis Medical Center. [Note: St. Francis Medical Center has recently been renamed Hawaii Medical Center East.] Intervention: Demographic data, surgical procedures, and the vital signs of 203 patients who underwent ophthalmic procedures were obtained from patient records. Blood pressure, heart rate, and respiratory rate measured in the preoperative holding area were compared with the same parameters taken in the operating room, with and without exposure to live piano music. A paired t-test was used for statistical analysis. Main outcome measure: Mean arterial pressure, heart rate, and respiratory rate. Results: 115 patients who were exposed to live piano music showed a statistically significant decrease in mean arterial blood pressure, heart rate, and respiratory rate in the operating room compared with their vital signs measured in the preoperative holding area (P < .0001). The control group of 88 patients not exposed to live piano music showed a statistically significant increase in mean arterial blood pressure (P < .0002) and heart rate and respiratory rate (P < .0001). Conclusion: Live classical piano music lowered the blood pressure, heart rate, and respiratory rate in patients undergoing ophthalmic surgery. PMID:18679538
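
    The analysis is a standard paired t-test of the same patients' vital signs in the two settings. A sketch with placeholder values (not the study's data):

      from scipy import stats

      # Mean arterial pressure (mmHg) in the holding area vs the operating
      # room for the same patients; values are illustrative only.
      map_holding = [98, 102, 95, 110, 101, 99, 105, 97]
      map_operating = [92, 97, 93, 104, 96, 95, 100, 94]

      t, p = stats.ttest_rel(map_holding, map_operating)  # paired t-test
      print(f"t = {t:.2f}, p = {p:.4f}")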

  20. On the importance of an accurate representation of the initial state of the system in classical dynamics simulations

    NASA Astrophysics Data System (ADS)

    García-Vela, A.

    2000-05-01

    A definition of a quantum-type phase-space distribution is proposed in order to represent the initial state of the system in a classical dynamics simulation. The central idea is to define an initial quantum phase-space state of the system as the direct product of the coordinate and momentum representations of the quantum initial state. The phase-space distribution is then obtained as the square modulus of this phase-space state. The resulting phase-space distribution closely resembles the quantum nature of the system initial state. The initial conditions are sampled with the distribution, using a grid technique in phase space. With this type of sampling the distribution of initial conditions reproduces more faithfully the shape of the original phase-space distribution. The method is applied to generate initial conditions describing the three-dimensional state of the Ar-HCl cluster prepared by ultraviolet excitation. The photodissociation dynamics is simulated by classical trajectories, and the results are compared with those of a wave packet calculation. The classical and quantum descriptions are found in good agreement for those dynamical events less subject to quantum effects. The classical result fails to reproduce the quantum mechanical one for the more strongly quantum features of the dynamics. The properties and applicability of the phase-space distribution and the sampling technique proposed are discussed.
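
    The proposed distribution is the product of the coordinate and momentum densities of the initial quantum state. A sketch of the grid sampling for a 1D Gaussian wave packet (ħ = 1; our toy stand-in for the Ar-HCl state):

      import numpy as np

      rng = np.random.default_rng(0)

      # Gaussian wave packet: |psi(x)|^2 in coordinate space and |phi(p)|^2
      # for its Fourier transform; the proposed phase-space distribution is
      # the product |psi(x)|^2 * |phi(p)|^2.
      sigma = 1.0
      x = np.linspace(-6, 6, 201)
      p = np.linspace(-6, 6, 201)
      Px = np.exp(-x**2 / sigma**2)      # |psi(x)|^2, unnormalised
      Pp = np.exp(-p**2 * sigma**2)      # |phi(p)|^2, unnormalised

      # Sample classical initial conditions on the (x, p) grid with the
      # product weight, as in the grid technique described.
      W = np.outer(Px, Pp)
      idx = rng.choice(W.size, size=1000, p=(W / W.sum()).ravel())
      ix, ip = np.unravel_index(idx, W.shape)
      initial_conditions = np.column_stack([x[ix], p[ip]])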

  1. Quantum probability, choice in large worlds, and the statistical structure of reality.

    PubMed

    Ross, Don; Ladyman, James

    2013-06-01

    Classical probability models of incentive response are inadequate in "large worlds," where the dimensions of relative risk and the dimensions of similarity in outcome comparisons typically differ. Quantum probability models for choice in large worlds may be motivated pragmatically - there is no third theory - or metaphysically: statistical processing in the brain adapts to the true scale-relative structure of the universe.

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paavola, Janika; Hall, Michael J. W.; Paris, Matteo G. A.

    The transition from quantum to classical, in the case of a quantum harmonic oscillator, is typically identified with the transition from a quantum superposition of macroscopically distinguishable states, such as the Schroedinger-cat state, into the corresponding statistical mixture. This transition is commonly characterized by the asymptotic loss of the interference term in the Wigner representation of the cat state. In this paper we show that the quantum-to-classical transition has different dynamical features depending on the measure for nonclassicality used. Measures based on an operatorial definition have well-defined physical meaning and allow a deeper understanding of the quantum-to-classical transition. Our analysis shows that, for most nonclassicality measures, the Schroedinger-cat state becomes classical after a finite time. Moreover, our results challenge the prevailing idea that more macroscopic states are more susceptible to decoherence in the sense that the transition from quantum to classical occurs faster. Since nonclassicality is a prerequisite for entanglement generation our results also bridge the gap between decoherence, which is lost only asymptotically, and entanglement, which may show a ''sudden death''. In fact, whereas the loss of coherences still remains asymptotic, we emphasize that the transition from quantum to classical can indeed occur at a finite time.

  3. Tsallis non-extensive statistics and solar wind plasma complexity

    NASA Astrophysics Data System (ADS)

    Pavlos, G. P.; Iliopoulos, A. C.; Zastenker, G. N.; Zelenyi, L. M.; Karakatsanis, L. P.; Riazantseva, M. O.; Xenakis, M. N.; Pavlos, E. G.

    2015-03-01

    This article presents novel results revealing non-equilibrium phase transition processes in the solar wind plasma during a strong shock event, which took place on 26 September 2011. Solar wind plasma is a typical case of a stochastic spatiotemporal distribution of physical state variables such as force fields (the magnetic and electric fields B and E) and matter fields (particle and current densities or bulk plasma distributions). This study clearly shows the non-extensive and non-Gaussian character of the solar wind plasma and the existence of multi-scale strong correlations from the microscopic to the macroscopic level. It also underlines the inability of classical magneto-hydro-dynamic (MHD) or plasma statistical theories, based on the classical central limit theorem (CLT), to explain the complexity of the solar wind dynamics, since these theories assume smooth and differentiable spatial-temporal functions (MHD theory) or Gaussian statistics (Boltzmann-Maxwell statistical mechanics). On the contrary, the results of this study indicate the presence of non-Gaussian, non-extensive statistics with heavy-tailed probability distribution functions, which are related to the q-extension of the CLT. Finally, the results of this study can be understood in the framework of modern theoretical concepts such as non-extensive statistical mechanics (Tsallis, 2009), fractal topology (Zelenyi and Milovanov, 2004), turbulence theory (Frisch, 1996), strange dynamics (Zaslavsky, 2002), percolation theory (Milovanov, 1997), anomalous diffusion theory and anomalous transport theory (Milovanov, 2001), fractional dynamics (Tarasov, 2013) and non-equilibrium phase transition theory (Chang, 1992).
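
    As a small worked complement to the q-statistics invoked above, the sketch below evaluates a q-Gaussian, the heavy-tailed density central to Tsallis non-extensive mechanics (it reduces to an ordinary Gaussian as q approaches 1). The q and beta values are hypothetical and are not fitted to the solar wind data discussed in the article.

    ```python
    import numpy as np

    # q-exponential: exp_q(x) = [1 + (1 - q) x]_+^{1/(1-q)}, -> exp(x) as q -> 1.
    def q_exponential(x, q):
        if abs(q - 1.0) < 1e-12:
            return np.exp(x)
        base = 1.0 + (1.0 - q) * x
        safe = np.where(base > 0.0, base, 1.0)          # avoid invalid powers
        return np.where(base > 0.0, safe ** (1.0 / (1.0 - q)), 0.0)

    # Unnormalized q-Gaussian; q = 1.6 and beta = 1.0 are hypothetical values.
    def q_gaussian(x, q=1.6, beta=1.0):
        return q_exponential(-beta * x ** 2, q)

    x = np.linspace(-10.0, 10.0, 2001)
    pdf = q_gaussian(x)
    pdf /= pdf.sum() * (x[1] - x[0])                    # numerical normalization
    # pdf now decays like a power law (heavy tails), unlike a Gaussian.
    ```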

  4. Geometric Algebra for Physicists

    NASA Astrophysics Data System (ADS)

    Doran, Chris; Lasenby, Anthony

    2007-11-01

    Preface; Notation; 1. Introduction; 2. Geometric algebra in two and three dimensions; 3. Classical mechanics; 4. Foundations of geometric algebra; 5. Relativity and spacetime; 6. Geometric calculus; 7. Classical electrodynamics; 8. Quantum theory and spinors; 9. Multiparticle states and quantum entanglement; 10. Geometry; 11. Further topics in calculus and group theory; 12. Lagrangian and Hamiltonian techniques; 13. Symmetry and gauge theory; 14. Gravitation; Bibliography; Index.

  5. Biogeochemical behaviour and bioremediation of uranium in waters of abandoned mines.

    PubMed

    Mkandawire, Martin

    2013-11-01

    The discharges of uranium and associated radionuclides, as well as heavy metals and metalloids, from waste and tailing dumps at abandoned uranium mining and processing sites pose contamination risks to surface water and groundwater. Although many new mines are being planned for nuclear energy purposes, most abandoned uranium mines are a legacy of the uranium production that fuelled the arms race during the Cold War of the last century. Since the end of the Cold War, there have been efforts to rehabilitate the mining sites, initially using classical remediation techniques based on intensive chemical and civil engineering. Recently, bioremediation technology has been sought as an alternative to the classical approach for reasons that include: (a) the high number of sites requiring remediation; (b) the economic implications of running and maintaining the facilities, due to high energy and workforce demands; and (c) the pattern and characteristics of contaminant discharges at most former uranium mining and processing sites, which prevent the use of classical methods. This review discusses the risks of uranium contamination from abandoned uranium mines from a biogeochemical point of view, and the potential and limitations of uranium bioremediation techniques as an alternative to the classical approach at abandoned uranium mining and processing sites.

  6. A methodology for the stochastic generation of hourly synthetic direct normal irradiation time series

    NASA Astrophysics Data System (ADS)

    Larrañeta, M.; Moreno-Tejera, S.; Lillo-Bravo, I.; Silva-Pérez, M. A.

    2018-02-01

    Many of the available solar radiation databases only provide global horizontal irradiance (GHI), while there is a growing need for extensive databases of direct normal irradiance (DNI), mainly for the development of concentrated solar power and concentrated photovoltaic technologies. In the present work, we propose a methodology for the generation of synthetic hourly DNI data from hourly average GHI values by dividing the irradiance into a deterministic and a stochastic component, with the aim of emulating the dynamics of the solar radiation. The deterministic component is modeled through a simple classical model. The stochastic component is fitted to measured data in order to maintain the consistency of the synthetic data with the state of the sky, generating statistically consistent DNI data with a cumulative frequency distribution very similar to that of the measured data. The adaptation and application of the model to the location of Seville shows significant improvements in terms of frequency distribution over the classical models. The proposed methodology, applied to other locations with different climatological characteristics, also yields better results than the classical models in terms of frequency distribution, reaching a 50% reduction in the Finkelstein-Schafer (FS) and Kolmogorov-Smirnov test integral (KSI) statistics.
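
    A schematic of the deterministic-plus-stochastic decomposition could look as follows. The clear-sky-style closure and the AR(1) residual below are illustrative stand-ins chosen for the sketch, not the component models actually used in the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    def deterministic_dni(ghi, solar_elevation_deg):
        # Hypothetical closure: scale GHI by an elevation-dependent factor.
        mu = np.sin(np.radians(np.clip(solar_elevation_deg, 1.0, 90.0)))
        return np.clip(0.7 * ghi / mu, 0.0, 1100.0)   # crude beam estimate, W/m^2

    def stochastic_component(n, phi=0.8, sigma=60.0):
        # AR(1) residual; in the real method this part is fitted to measured
        # data so the synthetic series keeps the observed frequency distribution.
        r = np.zeros(n)
        eps = rng.normal(0.0, sigma, n)
        for t in range(1, n):
            r[t] = phi * r[t - 1] + eps[t]
        return r

    ghi = np.full(24, 500.0)                          # toy hourly GHI, W/m^2
    elev = np.linspace(5.0, 60.0, 24)                 # toy solar elevations, deg
    dni = np.clip(deterministic_dni(ghi, elev) + stochastic_component(24),
                  0.0, None)
    ```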

  7. Quantum cryptography: a view from classical cryptography

    NASA Astrophysics Data System (ADS)

    Buchmann, Johannes; Braun, Johannes; Demirel, Denise; Geihs, Matthias

    2017-06-01

    Much digital data requires long-term protection of confidentiality, for example, medical health records. Cryptography provides such protection. However, currently used cryptographic techniques such as Diffie-Hellman key exchange may not provide long-term security. Such techniques rely on certain computational assumptions, such as the hardness of the discrete logarithm problem, that may turn out to be incorrect. On the other hand, quantum cryptography---in particular quantum random number generation and quantum key distribution---offers information-theoretic protection. In this paper, we explore the challenge of providing long-term confidentiality and we argue that a combination of quantum cryptography and classical cryptography can provide such protection.

  8. Recommender engine for continuous-time quantum Monte Carlo methods

    NASA Astrophysics Data System (ADS)

    Huang, Li; Yang, Yi-feng; Wang, Lei

    2017-03-01

    Recommender systems play an essential role in the modern business world. They recommend favorable items such as books, movies, and search queries to users based on their past preferences. Applying similar ideas and techniques to Monte Carlo simulations of physical systems boosts their efficiency without sacrificing accuracy. Exploiting the quantum to classical mapping inherent in the continuous-time quantum Monte Carlo methods, we construct a classical molecular gas model to reproduce the quantum distributions. We then utilize powerful molecular simulation techniques to propose efficient quantum Monte Carlo updates. The recommender engine approach provides a general way to speed up the quantum impurity solvers.

  9. Quantum key distribution with 1.25 Gbps clock synchronization.

    PubMed

    Bienfang, J; Gross, A; Mink, A; Hershman, B; Nakassis, A; Tang, X; Lu, R; Su, D; Clark, Charles; Williams, Carl; Hagley, E; Wen, Jesse

    2004-05-03

    We have demonstrated the exchange of sifted quantum cryptographic key over a 730 meter free-space link at rates of up to 1.0 Mbps, two orders of magnitude faster than previously reported results. A classical channel at 1550 nm operates in parallel with a quantum channel at 845 nm. Clock recovery techniques on the classical channel at 1.25 Gbps enable quantum transmission at up to the clock rate. System performance is currently limited by the timing resolution of our silicon avalanche photodiode detectors. With improved detector resolution, our technique will yield another order of magnitude increase in performance, with existing technology.

  10. Making classical ground-state spin computing fault-tolerant.

    PubMed

    Crosson, I J; Bacon, D; Brown, K R

    2010-09-01

    We examine a model of classical deterministic computing in which the ground state of the classical system is a spatial history of the computation. This model is relevant to quantum dot cellular automata as well as to recent universal adiabatic quantum computing constructions. In its most primitive form, systems constructed in this model cannot compute in an error-free manner when working at nonzero temperature. However, by exploiting a mapping between the partition function for this model and probabilistic classical circuits we are able to show that it is possible to make this model effectively error-free. We achieve this by using techniques in fault-tolerant classical computing and the result is that the system can compute effectively error-free if the temperature is below a critical temperature. We further link this model to computational complexity and show that a certain problem concerning finite temperature classical spin systems is complete for the complexity class Merlin-Arthur. This provides an interesting connection between the physical behavior of certain many-body spin systems and computational complexity.

  11. Comparison of incidence of intravascular injections during transforaminal epidural steroid injection using different needle types

    PubMed Central

    Lee, Yong Ho

    2014-01-01

    Background Infrequent but serious complications of transforaminal epidural steroid injection (TFESI) occur due to inadvertent intravascular injection. A few studies have reported that needle type can influence the incidence of intravascular injection in TFESI. This study prospectively evaluated whether short-bevel needles can reduce the incidence of intravascular injection in TFESI compared to long-bevel needles. Methods From March 2013 to December 2013, 239 consecutive patients were enrolled and received 249 fluoroscopically guided TFESIs using the classic technique. Confirmation of intravascular spread was performed initially with real-time fluoroscopy and then with the digital subtraction angiography method in the same patient. The injection technique for TFESI was the same for both short-bevel and long-bevel needle types. Results The incidences of intravascular injection with the long-bevel and short-bevel needles were 15.0% (21/140) and 9.2% (10/109), respectively. More than half of the intravascular injections occurred simultaneously with epidural injections (8.0%, 20/249). There were no statistically significant differences between the long-bevel and the short-bevel needles in the rates of intravascular injection (P = 0.17). Conclusions Short-bevel needles did not demonstrate any benefit in reducing the incidence of intravascular injection. PMID:25302096
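
    For readers who want to reproduce the flavor of the comparison, the sketch below runs a standard 2x2 chi-square test on the reported incidences; the study's exact test choice is not specified here, so this is only an approximate re-analysis.

    ```python
    from scipy.stats import chi2_contingency

    # 2x2 table built from the reported counts:
    #   long-bevel:  21 intravascular out of 140
    #   short-bevel: 10 intravascular out of 109
    table = [[21, 140 - 21],
             [10, 109 - 10]]
    chi2, p, dof, expected = chi2_contingency(table, correction=False)
    print(f"chi2 = {chi2:.2f}, p = {p:.2f}")   # p lands near the reported P = 0.17
    ```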

  12. Enamel Thickness before and after Orthodontic Treatment Analysed in Optical Coherence Tomography

    PubMed Central

    Koprowski, Robert; Safranow, Krzysztof; Woźniak, Krzysztof

    2017-01-01

    Despite the continuous development of materials and techniques of adhesive bonding, the basic procedure remains relatively constant. The technique is based on three components: an etching substance, an adhesive system, and a composite material. The use of etchants during the bonding of orthodontic brackets carries the risk of damage to the enamel. Therefore, this article examines the effect of the manner of enamel etching on enamel thickness before and after orthodontic treatment. The study was carried out in vitro on a group of 80 teeth, divided into two subgroups of 40 teeth each. The procedure of enamel etching was performed under laboratory conditions. In the first subgroup, the classic method of enamel etching and a fifth-generation bonding system were used. In the second subgroup, a seventh-generation (self-etching) bonding system was used. In both subgroups, metal orthodontic brackets were fixed, and after their removal the enamel was cleaned with a cutter mounted on a micromotor. Before and after the treatment, two-dimensional optical coherence tomography scans were performed, and the enamel thickness was assessed on these scans. The difference in average enamel thickness between the two subgroups was not statistically significant. PMID:28243604

  13. Estimating background-subtracted fluorescence transients in calcium imaging experiments: a quantitative approach.

    PubMed

    Joucla, Sébastien; Franconville, Romain; Pippow, Andreas; Kloppenburg, Peter; Pouzat, Christophe

    2013-08-01

    Calcium imaging has become a routine technique in neuroscience for subcellular- to network-level investigations. The fast progress in the development of new indicators and imaging techniques calls for dedicated, reliable analysis methods. In particular, efficient and quantitative background fluorescence subtraction routines would benefit most of the calcium imaging research field. A background-subtracted fluorescence transient estimation method that does not require any independent background measurement is therefore developed. This method is based on a fluorescence model fitted to single-trial data using a classical nonlinear regression approach. The model includes an appropriate probabilistic description of the acquisition system's noise, leading to accurate confidence intervals on all quantities of interest (background fluorescence, normalized background-subtracted fluorescence time course) when background fluorescence is homogeneous. An automatic procedure detecting background inhomogeneities inside the region of interest is also developed and is shown to be efficient on simulated data. The implementation and performance of the proposed method on experimental recordings from the mouse hypothalamus are presented in detail. This method, which applies to both single-cell and bulk-stained tissue recordings, should help improve the statistical comparison of fluorescence calcium signals between experiments and studies. Copyright © 2013 Elsevier Ltd. All rights reserved.

  14. Data re-arranging techniques leading to proper variable selections in high energy physics

    NASA Astrophysics Data System (ADS)

    Kůs, Václav; Bouř, Petr

    2017-12-01

    We introduce a new data-based approach to homogeneity testing and variable selection carried out in high energy physics experiments, where one of the basic tasks is to test the homogeneity of weighted samples, mainly Monte Carlo simulations (weighted) and real data measurements (unweighted). This technique, called 'data re-arranging', enables variable selection by means of classical statistical homogeneity tests such as the Kolmogorov-Smirnov, Anderson-Darling, or Pearson's chi-square divergence test. P-values of our variants of the homogeneity tests are investigated, and empirical verification on 46-dimensional high energy particle physics data sets is accomplished under the newly proposed (equiprobable) quantile binning. In particular, the homogeneity testing procedure is applied to re-arranged Monte Carlo samples and real data sets measured at the Tevatron particle accelerator at Fermilab in the DØ experiment, originating from top-antitop quark pair production in two decay channels (electron, muon) with 2, 3, or 4+ jets detected. Finally, the variable selections in the electron and muon channels induced by the re-arranging procedure for homogeneity testing are provided for the Tevatron top-antitop quark data sets.
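
    A minimal sketch of homogeneity testing under equiprobable quantile binning, in the spirit of the procedure described above, is given below; the toy samples stand in for the Monte Carlo and measured data and are not the DØ data sets.

    ```python
    import numpy as np
    from scipy.stats import chisquare, ks_2samp

    rng = np.random.default_rng(1)
    mc = rng.normal(0.00, 1.0, 5000)     # toy "Monte Carlo" sample
    data = rng.normal(0.05, 1.0, 2000)   # toy "real data" sample

    # Equiprobable quantile binning: edges at quantiles of the pooled sample,
    # so every bin carries roughly the same expected occupancy.
    k = 20
    edges = np.quantile(np.concatenate([mc, data]), np.linspace(0.0, 1.0, k + 1))
    edges[0], edges[-1] = -np.inf, np.inf

    obs = np.histogram(data, bins=edges)[0]
    exp = np.histogram(mc, bins=edges)[0] * (len(data) / len(mc))
    p_chi2 = chisquare(obs, f_exp=exp).pvalue

    p_ks = ks_2samp(mc, data).pvalue     # unbinned KS test, for comparison
    print(f"chi-square p = {p_chi2:.3f}, KS p = {p_ks:.3f}")
    ```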

  15. Estimating size and scope economies in the Portuguese water sector using the Bayesian stochastic frontier analysis.

    PubMed

    Carvalho, Pedro; Marques, Rui Cunha

    2016-02-15

    This study searches for economies of size and scope in the Portuguese water sector, applying Bayesian and classical statistics to make inference in stochastic frontier analysis (SFA). The study demonstrates the usefulness and advantages of Bayesian statistics for making inference in SFA over traditional SFA, which uses only classical statistics. The resulting Bayesian methods overcome some problems that arise in the application of traditional SFA, such as bias in small samples and skewness of residuals. In the present case study of the water sector in Portugal, these Bayesian methods provide more plausible and acceptable results. Based on the results obtained, we found that there are important economies of output density, economies of size, economies of vertical integration and economies of scope in the Portuguese water sector, pointing to the considerable advantages of undertaking mergers by joining the retail and wholesale components and by joining the drinking water and wastewater services. Copyright © 2015 Elsevier B.V. All rights reserved.

  16. Thermodynamics and Kinetics of Prenucleation Clusters, Classical and Non-Classical Nucleation

    PubMed Central

    Zahn, Dirk

    2015-01-01

    Recent observations of prenucleation species and multi-stage crystal nucleation processes challenge the long-established view of the thermodynamics of crystal formation. Here, we review and generalize extensions to classical nucleation theory. Going beyond the conventional implementation, as used for more than a century now, nucleation inhibitors, precursor clusters and non-classical nucleation processes are rationalized as well by analogous concepts based on competing interface and bulk energy terms. This is illustrated by recent examples of species formed prior to, or instead of, crystal nucleation and by multi-step nucleation processes. Many of the discussed insights were obtained from molecular simulation using advanced sampling techniques, briefly summarized herein for both nucleation-controlled and diffusion-controlled aggregate formation. PMID:25914369

  17. An Update on Statistical Boosting in Biomedicine.

    PubMed

    Mayr, Andreas; Hofner, Benjamin; Waldmann, Elisabeth; Hepp, Tobias; Meyer, Sebastian; Gefeller, Olaf

    2017-01-01

    Statistical boosting algorithms have triggered a lot of research during the last decade. They combine a powerful machine learning approach with classical statistical modelling, offering various practical advantages like automated variable selection and implicit regularization of effect estimates. They are extremely flexible, as the underlying base-learners (regression functions defining the type of effect for the explanatory variables) can be combined with any kind of loss function (target function to be optimized, defining the type of regression setting). In this review article, we highlight the most recent methodological developments on statistical boosting regarding variable selection, functional regression, and advanced time-to-event modelling. Additionally, we provide a short overview on relevant applications of statistical boosting in biomedicine.
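
    As a concrete illustration of how statistical boosting yields automated variable selection and implicit shrinkage, the sketch below implements componentwise L2 boosting with simple linear base-learners on simulated data; it is one minimal flavor of the algorithm family reviewed in the article, not a reproduction of any specific package.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n, p = 200, 10
    X = rng.normal(size=(n, p))
    y = 2.0 * X[:, 0] - 1.0 * X[:, 3] + rng.normal(scale=0.5, size=n)

    beta = np.zeros(p)
    nu = 0.1                                # step length: implicit regularization
    for _ in range(300):                    # m_stop, the main tuning parameter
        r = y - X @ beta                    # current residuals
        coefs = (X * r[:, None]).sum(0) / (X ** 2).sum(0)  # per-covariate LS fits
        sse = ((r[:, None] - X * coefs) ** 2).sum(0)
        j = int(np.argmin(sse))             # best-fitting base-learner...
        beta[j] += nu * coefs[j]            # ...is the only one updated
    print(np.round(beta, 2))                # nonzero essentially at indices 0 and 3
    ```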

  18. Visualizing the semantic structure in classical music works.

    PubMed

    Chan, Wing-Yi; Qu, Huamin; Mak, Wai-Ho

    2010-01-01

    A major obstacle in the appreciation of classical music is that extensive training is required to understand musical structure and compositional techniques toward comprehending the thoughts behind the musical work. In this paper, we propose an innovative visualization solution to reveal the semantic structure in classical orchestral works such that users can gain insights into musical structure and appreciate the beauty of music. We formulate the semantic structure into macrolevel layer interactions, microlevel theme variations, and macro-micro relationships between themes and layers to abstract the complicated construction of a musical composition. The visualization has been applied with success in understanding some classical music works as supported by highly promising user study results with the general audience and very positive feedback from music students and experts, demonstrating its effectiveness in conveying the sophistication and beauty of classical music to novice users with informative and intuitive displays.

  19. Improving esthetic results in benign parotid surgery: statistical evaluation of facelift approach, sternocleidomastoid flap, and superficial musculoaponeurotic system flap application.

    PubMed

    Bianchi, Bernardo; Ferri, Andrea; Ferrari, Silvano; Copelli, Chiara; Sesenna, Enrico

    2011-04-01

    The purpose of this article was to analyze the efficacy of the facelift incision, sternocleidomastoid muscle flap, and superficial musculoaponeurotic system flap for improving the esthetic results in patients undergoing partial parotidectomy for benign parotid tumor resection. The usefulness of partial parotidectomy is discussed, and a statistical evaluation of the esthetic results was performed. From January 1, 1996, to January 1, 2007, 274 patients treated for benign parotid tumors were studied. Of these, 172 underwent partial parotidectomy and were divided into 4 groups: partial parotidectomy with classic or modified Blair incision without reconstruction (group 1), partial parotidectomy with facelift incision and without reconstruction (group 2), partial parotidectomy with facelift incision associated with sternocleidomastoid muscle flap (group 3), and partial parotidectomy with facelift incision associated with superficial musculoaponeurotic system flap (group 4). Patients were considered, after a follow-up of at least 18 months, for functional and esthetic evaluation. The functional outcome was assessed considering facial nerve function, Frey syndrome, and recurrence. The esthetic evaluation was performed by inviting the patients and a blind panel of 1 surgeon and 2 secretaries of the department to give a score of 1 to 10 for the final cosmetic outcome. The statistical analysis was performed using the Mann-Whitney U test for nonparametric data to compare the results of the different groups; P less than .05 was considered significant. No recurrence developed in any of the 4 groups or in any of the 274 patients during the follow-up period. The statistical analysis comparing group 1 with the other groups revealed a highly significant difference (P < .0001) for all groups. When group 2 was compared with groups 3 and 4, the differences were also highly statistically significant (P = .0018 for group 3 and P = .0005 for group 4). Finally, when groups 3 and 4 were compared, the difference was not statistically significant (P = .3467). Partial parotidectomy is the key to improving esthetic results in benign parotid surgery. The evaluation of functional complications and the recurrence rate in this series of patients confirmed that this technique can be safely used for benign parotid tumor resection. The use of a facelift incision alone led to a highly statistically significant improvement in the esthetic outcome. When the facelift incision was combined with reconstructive techniques, such as the sternocleidomastoid muscle flap or the superficial musculoaponeurotic system flap, the esthetic results improved further. Finally, no statistically significant difference was found when comparing the superficial musculoaponeurotic system flap and the sternocleidomastoid muscle flap. Copyright © 2011 American Association of Oral and Maxillofacial Surgeons. Published by Elsevier Inc. All rights reserved.
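
    The group comparisons reduce to a standard nonparametric two-sample test. The sketch below applies the Mann-Whitney U test to hypothetical 1-10 cosmetic scores; the arrays are invented placeholders, not the study's data.

    ```python
    import numpy as np
    from scipy.stats import mannwhitneyu

    rng = np.random.default_rng(7)
    scores_blair = rng.integers(3, 8, size=30)      # hypothetical group 1 scores
    scores_facelift = rng.integers(6, 11, size=30)  # hypothetical group 4 scores

    u, p = mannwhitneyu(scores_blair, scores_facelift, alternative="two-sided")
    print(f"U = {u}, p = {p:.4f}")                  # P < .05 read as significant
    ```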

  20. Fuzzy logic controller versus classical logic controller for residential hybrid solar-wind-storage energy system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Derrouazin, A., E-mail: derrsid@gmail.com; Université de Lorraine, LMOPS, EA 4423, 57070 Metz; CentraleSupélec, LMOPS, 57070 Metz

    Various techniques have been proposed for the management of diverse hybrid energy systems, with robustness, savings and environmental goals in mind. In this work we present a comparative study of two supervision and control techniques, fuzzy and classical logic, for managing a hybrid energy system supplying a typical house fed by solar and wind power, with a rack of batteries for storage. The system is assisted by the electric grid during periods of insufficient renewable production. A hydrogen production device is integrated into the system to recover surplus energy production from the renewable sources for household purposes, aiming at the maximum exploitation of these sources over the years. The models have been implemented, and the command signals generated for the electronic switches by both proposed techniques are presented and discussed in this paper.

  1. Time Domain Stability Margin Assessment Method

    NASA Technical Reports Server (NTRS)

    Clements, Keith

    2017-01-01

    The baseline stability margins for NASA's Space Launch System (SLS) launch vehicle were generated via the classical approach of linearizing the system equations of motion and determining the gain and phase margins from the resulting frequency domain model. To improve the fidelity of the classical methods, the linear frequency domain approach can be extended by replacing static, memoryless nonlinearities with describing functions. This technique, however, does not address the time varying nature of the dynamics of a launch vehicle in flight. An alternative technique for the evaluation of the stability of the nonlinear launch vehicle dynamics along its trajectory is to incrementally adjust the gain and/or time delay in the time domain simulation until the system exhibits unstable behavior. This technique has the added benefit of providing a direct comparison between the time domain and frequency domain tools in support of simulation validation.

  2. Time-Domain Stability Margin Assessment

    NASA Technical Reports Server (NTRS)

    Clements, Keith

    2016-01-01

    The baseline stability margins for NASA's Space Launch System (SLS) launch vehicle were generated via the classical approach of linearizing the system equations of motion and determining the gain and phase margins from the resulting frequency domain model. To improve the fidelity of the classical methods, the linear frequency domain approach can be extended by replacing static, memoryless nonlinearities with describing functions. This technique, however, does not address the time varying nature of the dynamics of a launch vehicle in flight. An alternative technique for the evaluation of the stability of the nonlinear launch vehicle dynamics along its trajectory is to incrementally adjust the gain and/or time delay in the time domain simulation until the system exhibits unstable behavior. This technique has the added benefit of providing a direct comparison between the time domain and frequency domain tools in support of simulation validation.
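
    The time-domain procedure described in both records lends itself to a toy illustration: increase a feedback gain multiplier in simulation until the closed-loop response diverges, and take the last stable multiplier as the gain margin. The plant, controller, and delay below are hypothetical stand-ins, not the SLS models.

    ```python
    import numpy as np

    def simulate(k_mult, dt=0.001, t_end=10.0, delay=0.05):
        """Return True if the closed loop stays bounded for t_end seconds."""
        x = np.array([0.1, 0.0])              # [attitude error, rate]
        buf = [0.0] * int(delay / dt)         # actuator transport-delay buffer
        for _ in range(int(t_end / dt)):
            u_cmd = -k_mult * (8.0 * x[0] + 4.0 * x[1])   # PD law, scaled gain
            buf.append(u_cmd)
            u = buf.pop(0)                    # delayed command reaches the plant
            acc = 2.0 * x[0] + u              # unstable toy plant: xddot = 2x + u
            x = x + dt * np.array([x[1], acc])
            if abs(x[0]) > 1e3:               # divergence check
                return False
        return True

    k = 1.0
    while simulate(k * 1.05):                 # push the gain up until unstable
        k *= 1.05
    print(f"time-domain gain margin ~ {20.0 * np.log10(k):.1f} dB")
    ```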

  3. Methods for the calculation of axial wave numbers in lined ducts with mean flow

    NASA Technical Reports Server (NTRS)

    Eversman, W.

    1981-01-01

    A survey is made of the methods available for the calculation of axial wave numbers in lined ducts. Rectangular and circular ducts with both uniform and non-uniform flow are considered as are ducts with peripherally varying liners. A historical perspective is provided by a discussion of the classical methods for computing attenuation when no mean flow is present. When flow is present these techniques become either impractical or impossible. A number of direct eigenvalue determination schemes which have been used when flow is present are discussed. Methods described are extensions of the classical no-flow technique, perturbation methods based on the no-flow technique, direct integration methods for solution of the eigenvalue equation, an integration-iteration method based on the governing differential equation for acoustic transmission, Galerkin methods, finite difference methods, and finite element methods.

  4. Resolution of quantum singularities

    NASA Astrophysics Data System (ADS)

    Konkowski, Deborah; Helliwell, Thomas

    2017-01-01

    A review of quantum singularities in static and conformally static spacetimes is given. A spacetime is said to be quantum mechanically non-singular if a quantum wave packet does not feel, in some sense, the presence of a singularity; mathematically, this means that the wave operator is essentially self-adjoint on the space of square integrable functions. Spacetimes ranging from those with mild classical singularities (quasiregular ones) to those with strong classical curvature singularities have been tested. Here we discuss the similarities and differences between classical singularities that are healed quantum mechanically and those that are not. Possible extensions of the mathematical technique to more physically realistic spacetimes are discussed.

  5. Quantum machine learning.

    PubMed

    Biamonte, Jacob; Wittek, Peter; Pancotti, Nicola; Rebentrost, Patrick; Wiebe, Nathan; Lloyd, Seth

    2017-09-13

    Fuelled by increasing computer power and algorithmic advances, machine learning techniques have become powerful tools for finding patterns in data. Quantum systems produce atypical patterns that classical systems are thought not to produce efficiently, so it is reasonable to postulate that quantum computers may outperform classical computers on machine learning tasks. The field of quantum machine learning explores how to devise and implement quantum software that could enable machine learning that is faster than that of classical computers. Recent work has produced quantum algorithms that could act as the building blocks of machine learning programs, but the hardware and software challenges are still considerable.

  6. Quantum machine learning

    NASA Astrophysics Data System (ADS)

    Biamonte, Jacob; Wittek, Peter; Pancotti, Nicola; Rebentrost, Patrick; Wiebe, Nathan; Lloyd, Seth

    2017-09-01

    Fuelled by increasing computer power and algorithmic advances, machine learning techniques have become powerful tools for finding patterns in data. Quantum systems produce atypical patterns that classical systems are thought not to produce efficiently, so it is reasonable to postulate that quantum computers may outperform classical computers on machine learning tasks. The field of quantum machine learning explores how to devise and implement quantum software that could enable machine learning that is faster than that of classical computers. Recent work has produced quantum algorithms that could act as the building blocks of machine learning programs, but the hardware and software challenges are still considerable.

  7. On information, negentropy and H-theorem

    NASA Astrophysics Data System (ADS)

    Chakrabarti, C. G.; Sarker, N. G.

    1983-09-01

    The paper deals with the importance of the Kullback discrimination information in the statistical characterization of the negentropy of a non-equilibrium state and the irreversibility of a classical dynamical system. The theory, based on the Kullback discrimination information as the H-function, gives new insight into the interrelation between the concepts of coarse-graining and the principle of sufficiency, leading to an important statistical characterization of the thermal equilibrium of a closed system.
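
    For concreteness, the Kullback discrimination information of a distribution p relative to a reference q is D(p||q) = sum_i p_i ln(p_i / q_i). A minimal numerical sketch, with toy distributions only:

    ```python
    import numpy as np

    def kullback(p, q):
        p, q = np.asarray(p, float), np.asarray(q, float)
        mask = p > 0
        return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

    p = [0.7, 0.2, 0.1]       # non-equilibrium occupation probabilities (toy)
    q = [1/3, 1/3, 1/3]       # equilibrium (uniform) reference
    print(kullback(p, q))     # >= 0, and zero only when p == q
    ```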

  8. Minimum Uncertainty Coherent States Attached to Nondegenerate Parametric Amplifiers

    NASA Astrophysics Data System (ADS)

    Dehghani, A.; Mojaveri, B.

    2015-06-01

    Exact analytical solutions for the two-mode nondegenerate parametric amplifier have been obtained by using a transformation from the two-dimensional harmonic oscillator Hamiltonian. Some important physical properties, such as the quantum statistics and quadrature squeezing of the corresponding states, are investigated. In addition, these states carry classical features such as Poissonian statistics and minimize the Heisenberg uncertainty relation for a pair of coordinate and momentum operators.

  9. Factorization approach to superintegrable systems: Formalism and applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ballesteros, Á., E-mail: angelb@ubu.es; Herranz, F. J., E-mail: fjherranz@ubu.es; Kuru, Ş., E-mail: kuru@science.ankara.edu.tr

    2017-03-15

    The factorization technique for superintegrable Hamiltonian systems is revisited and applied in order to obtain additional (higher-order) constants of the motion. In particular, the factorization approach to the classical anisotropic oscillator on the Euclidean plane is reviewed, and new classical (super) integrable anisotropic oscillators on the sphere are constructed. The Tremblay–Turbiner–Winternitz system on the Euclidean plane is also studied from this viewpoint.

  10. Conveying the Complex: Updating U.S. Joint Systems Analysis Doctrine with Complexity Theory

    DTIC Science & Technology

    2013-12-10

  11. Resolving anthropogenic aerosol pollution types - deconvolution and exploratory classification of pollution events

    NASA Astrophysics Data System (ADS)

    Äijälä, Mikko; Heikkinen, Liine; Fröhlich, Roman; Canonaco, Francesco; Prévôt, André S. H.; Junninen, Heikki; Petäjä, Tuukka; Kulmala, Markku; Worsnop, Douglas; Ehn, Mikael

    2017-03-01

    Mass spectrometric measurements commonly yield data on hundreds of variables over thousands of points in time. Refining and synthesizing this raw data into chemical information necessitates the use of advanced, statistics-based data analytical techniques. In the field of analytical aerosol chemistry, statistical dimensionality-reduction methods have become widespread in the last decade, yet comparable advanced chemometric techniques for data classification and identification remain marginal. Here we present an example of combining data dimensionality reduction (factorization) with exploratory classification (clustering), and show that the results can not only reproduce and corroborate earlier findings, but also complement and broaden our current perspectives on aerosol chemical classification. We find that applying positive matrix factorization to extract spectral characteristics of the organic component of air pollution plumes, together with an unsupervised clustering algorithm, k-means++, for classification, reproduces classical organic aerosol speciation schemes. Applying appropriately chosen metrics for spectral dissimilarity along with optimized data weighting, the source-specific pollution characteristics can be statistically resolved even for spectrally very similar aerosol types, such as different combustion-related anthropogenic aerosol species and atmospheric aerosols with a similar degree of oxidation. In addition to the typical oxidation-level and source-driven aerosol classification, we were also able to classify and characterize outlier groups that would likely be disregarded in a more conventional analysis. Evaluating solution quality for the classification also provides a means to assess the performance of mass spectral similarity metrics and to optimize the weighting of mass spectral variables. This facilitates algorithm-based evaluation of aerosol spectra, which may prove invaluable for the future development of automatic methods for spectrum identification and classification. Robust, statistics-based results and data visualizations also provide important clues to a human analyst on the existence and chemical interpretation of data structures. Applying these methods to a test set of aerosol mass spectrometric data of organic aerosol from a boreal forest site yielded five to seven different recurring pollution types from various sources, including traffic, cooking, biomass burning and nearby sawmills. Additionally, three distinct, minor pollution types were discovered and identified as amine-dominated aerosols.
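
    The two-stage scheme (factorization, then clustering) can be sketched compactly. In the example below, scikit-learn's NMF serves as a stand-in for positive matrix factorization and the spectra are random placeholders; k-means++ seeding is the default in scikit-learn's KMeans.

    ```python
    import numpy as np
    from sklearn.decomposition import NMF
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)
    spectra = rng.random((500, 120))        # toy time x m/z matrix, non-negative

    nmf = NMF(n_components=5, init="nndsvda", random_state=0)
    contributions = nmf.fit_transform(spectra)   # factor time series
    profiles = nmf.components_                   # factor mass-spectral profiles

    # k-means++ seeding; in a real analysis the dissimilarity metric and
    # variable weighting would be tuned as discussed in the article.
    labels = KMeans(n_clusters=3, init="k-means++", n_init=10,
                    random_state=0).fit_predict(profiles)
    print(labels)
    ```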

  12. Implementation of generalized quantum measurements: Superadditive quantum coding, accessible information extraction, and classical capacity limit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Takeoka, Masahiro; Fujiwara, Mikio; Mizuno, Jun

    2004-05-01

    Quantum-information theory predicts that when the transmission resource is doubled in quantum channels, the amount of information transmitted can be increased more than twice by quantum-channel coding technique, whereas the increase is at most twice in classical information theory. This remarkable feature, the superadditive quantum-coding gain, can be implemented by appropriate choices of code words and corresponding quantum decoding which requires a collective quantum measurement. Recently, an experimental demonstration was reported [M. Fujiwara et al., Phys. Rev. Lett. 90, 167906 (2003)]. The purpose of this paper is to describe our experiment in detail. Particularly, a design strategy of quantum-collective decoding in physical quantum circuits is emphasized. We also address the practical implication of the gain on communication performance by introducing the quantum-classical hybrid coding scheme. We show how the superadditive quantum-coding gain, even in a small code length, can boost the communication performance of conventional coding techniques.

  13. Multiple Point Statistics algorithm based on direct sampling and multi-resolution images

    NASA Astrophysics Data System (ADS)

    Julien, S.; Renard, P.; Chugunova, T.

    2017-12-01

    Multiple Point Statistics (MPS) has been popular for more than a decade in the Earth Sciences, because these methods allow random fields to be generated that reproduce the highly complex spatial features given in a conceptual model, the training image, whereas classical geostatistics techniques based on two-point statistics (covariance or variogram) fail to generate realistic models. Among MPS methods, direct sampling consists in borrowing patterns from the training image to populate a simulation grid. The grid is filled sequentially by visiting each of its nodes in a random order, and the patterns, whose number of nodes is fixed, become narrower during the simulation process as the simulation grid becomes more densely informed. Hence, large-scale structures are captured at the beginning of the simulation and small-scale ones at the end. However, MPS may mix spatial characteristics distinguishable at different scales in the training image, and thereby lose the spatial arrangement of different structures. To overcome this limitation, we propose to perform MPS simulation using a decomposition of the training image into a set of images at multiple resolutions. Applying a Gaussian kernel to the training image (convolution) results in a lower-resolution image, and by iterating this process a pyramid of images depicting fewer details at each level is built, as is done in image processing, for example, to reduce the storage size of a photograph. Direct sampling is then employed to simulate the lowest-resolution level, and then each level in turn, up to the finest resolution, conditioned on the level one rank coarser. This scheme helps reproduce the spatial structures at every scale of the training image and thus generates more realistic models. We illustrate the method with aerial photographs (satellite images) and natural textures. Indeed, these kinds of images often display typical structures at different scales and are well suited to MPS simulation techniques.
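
    The multi-resolution decomposition is straightforward to sketch: repeatedly smooth with a Gaussian kernel and downsample. In the fragment below the training image is a random placeholder, and the per-level direct-sampling simulation itself is only indicated by the loop structure.

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def gaussian_pyramid(image, levels=3, sigma=1.0):
        pyramid = [image]
        for _ in range(levels):
            smoothed = gaussian_filter(pyramid[-1], sigma)
            pyramid.append(smoothed[::2, ::2])   # smooth, then halve resolution
        return pyramid                           # finest first, coarsest last

    training_image = np.random.default_rng(3).random((128, 128))  # toy stand-in
    levels = gaussian_pyramid(training_image)

    for img in reversed(levels):                 # coarsest -> finest
        # Direct-sampling simulation of this level, conditioned on the
        # previously simulated (coarser) level, would go here.
        pass
    ```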

  14. A Photon Interference Detector with Continuous Display.

    ERIC Educational Resources Information Center

    Gilmore, R. S.

    1978-01-01

    Describes an apparatus which attempts to give a direct visual impression of the random detection of individual photons, coupled with the recognition of the classical intensity distribution as a result of fairly high photon statistics. (Author/GA)

  15. Colors of Inner Disk Classical Kuiper Belt Objects

    NASA Astrophysics Data System (ADS)

    Romanishin, W.; Tegler, S. C.; Consolmagno, G. J.

    2010-07-01

    We present new optical broadband colors, obtained with the Keck 1 and Vatican Advanced Technology telescopes, for six objects in the inner classical Kuiper Belt. Objects in the inner classical Kuiper Belt are of interest as they may represent the surviving members of the primordial Kuiper Belt that formed interior to the current position of the 3:2 resonance with Neptune, the current position of the plutinos, or, alternatively, they may be objects formed at a different heliocentric distance that were then moved to their present locations. The six new colors, combined with four previously published, show that the ten inner belt objects with known colors form a neutral clump and a reddish clump in B-R color. Nonparametric statistical tests show no significant difference between the B-R color distribution of the inner disk objects compared to the color distributions of Centaurs, plutinos, or scattered disk objects. However, the B-R color distribution of the inner classical Kuiper Belt Objects does differ significantly from the distribution of colors in the cold (low inclination) main classical Kuiper Belt. The cold main classical objects are predominately red, while the inner classical belt objects are a mixture of neutral and red. The color difference may reveal the existence of a gradient in the composition and/or surface processing history in the primordial Kuiper Belt, or indicate that the inner disk objects are not dynamically analogous to the cold main classical belt objects.

  16. COLORS OF INNER DISK CLASSICAL KUIPER BELT OBJECTS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Romanishin, W.; Tegler, S. C.; Consolmagno, G. J., E-mail: wromanishin@ou.ed, E-mail: Stephen.Tegler@nau.ed, E-mail: gjc@specola.v

    2010-07-15

    We present new optical broadband colors, obtained with the Keck 1 and Vatican Advanced Technology telescopes, for six objects in the inner classical Kuiper Belt. Objects in the inner classical Kuiper Belt are of interest as they may represent the surviving members of the primordial Kuiper Belt that formed interior to the current position of the 3:2 resonance with Neptune, the current position of the plutinos, or, alternatively, they may be objects formed at a different heliocentric distance that were then moved to their present locations. The six new colors, combined with four previously published, show that the ten inner belt objects with known colors form a neutral clump and a reddish clump in B-R color. Nonparametric statistical tests show no significant difference between the B-R color distribution of the inner disk objects compared to the color distributions of Centaurs, plutinos, or scattered disk objects. However, the B-R color distribution of the inner classical Kuiper Belt Objects does differ significantly from the distribution of colors in the cold (low inclination) main classical Kuiper Belt. The cold main classical objects are predominately red, while the inner classical belt objects are a mixture of neutral and red. The color difference may reveal the existence of a gradient in the composition and/or surface processing history in the primordial Kuiper Belt, or indicate that the inner disk objects are not dynamically analogous to the cold main classical belt objects.

  17. Statistics of transmission eigenvalues in two-dimensional quantum cavities: Ballistic versus stochastic scattering

    NASA Astrophysics Data System (ADS)

    Rotter, Stefan; Aigner, Florian; Burgdörfer, Joachim

    2007-03-01

    We investigate the statistical distribution of transmission eigenvalues in phase-coherent transport through quantum dots. In two-dimensional ab initio simulations for both clean and disordered two-dimensional cavities, we find markedly different quantum-to-classical crossover scenarios for these two cases. In particular, we observe the emergence of “noiseless scattering states” in clean cavities, irrespective of sharp-edged entrance and exit lead mouths. We find the onset of these “classical” states to be largely independent of the cavity’s classical chaoticity, but very sensitive with respect to bulk disorder. Our results suggest that for weakly disordered cavities, the transmission eigenvalue distribution is determined both by scattering at the disorder potential and the cavity walls. To properly account for this intermediate parameter regime, we introduce a hybrid crossover scheme, which combines previous models that are valid in the ballistic and the stochastic limit, respectively.

  18. Interconnect fatigue design for terrestrial photovoltaic modules

    NASA Technical Reports Server (NTRS)

    Mon, G. R.; Moore, D. M.; Ross, R. G., Jr.

    1982-01-01

    The results of a comprehensive investigation of interconnect fatigue, which has led to the definition of useful reliability-design and life-prediction algorithms, are presented. Experimental data indicate that the classical strain-cycle (fatigue) curve for the interconnect material is a good model of mean interconnect fatigue performance, but it fails to account for the broad statistical scatter, which is critical to reliability prediction. To remedy this shortcoming, the classical fatigue curve is combined with experimental cumulative interconnect failure rate data to yield statistical fatigue curves (having failure probability as a parameter) which enable (1) the prediction of cumulative interconnect failures during the design life of an array field, and (2) the unambiguous--i.e., quantitative--interpretation of data from field-service qualification (accelerated thermal cycling) tests. Optimal interconnect cost-reliability design algorithms are derived based on minimizing the cost of energy over the design life of the array field.

  19. Interconnect fatigue design for terrestrial photovoltaic modules

    NASA Astrophysics Data System (ADS)

    Mon, G. R.; Moore, D. M.; Ross, R. G., Jr.

    1982-03-01

    The results of a comprehensive investigation of interconnect fatigue, which has led to the definition of useful reliability-design and life-prediction algorithms, are presented. Experimental data indicate that the classical strain-cycle (fatigue) curve for the interconnect material is a good model of mean interconnect fatigue performance, but it fails to account for the broad statistical scatter, which is critical to reliability prediction. To remedy this shortcoming, the classical fatigue curve is combined with experimental cumulative interconnect failure rate data to yield statistical fatigue curves (having failure probability as a parameter) which enable (1) the prediction of cumulative interconnect failures during the design life of an array field, and (2) the unambiguous--i.e., quantitative--interpretation of data from field-service qualification (accelerated thermal cycling) tests. Optimal interconnect cost-reliability design algorithms are derived based on minimizing the cost of energy over the design life of the array field.
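
    The statistical fatigue curves described in these records can be illustrated with a Coffin-Manson-type mean curve combined with lognormal scatter, giving cycles-to-failure estimates parameterized by failure probability. All coefficients below are hypothetical, not the study's fitted interconnect values.

    ```python
    import numpy as np
    from scipy.stats import norm

    def cycles_to_failure(strain, prob=0.5, c=0.3, b=-0.5, sigma_ln=0.8):
        # Mean curve: strain = c * N^b, inverted for N, then shifted to the
        # requested failure-probability quantile via lognormal scatter.
        n_mean = (strain / c) ** (1.0 / b)
        return n_mean * np.exp(norm.ppf(prob) * sigma_ln)

    strain = 0.01                                # cyclic strain amplitude (toy)
    for p in (0.01, 0.50, 0.99):
        print(f"P(fail) = {p:.0%}: {cycles_to_failure(strain, p):.3g} cycles")
    ```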

  20. Turbulent statistics and intermittency enhancement in coflowing superfluid 4He

    NASA Astrophysics Data System (ADS)

    Biferale, L.; Khomenko, D.; L'vov, V.; Pomyalov, A.; Procaccia, I.; Sahoo, G.

    2018-02-01

    The large-scale turbulent statistics of mechanically driven superfluid 4He was shown experimentally to follow the classical counterpart. In this paper, we use direct numerical simulations to study the whole range of scales at a range of temperatures T ∈ [1.3, 2.1] K. The numerics employ self-consistent and nonlinearly coupled normal and superfluid components. The main results are that (i) the velocity fluctuations of the normal and super components are well correlated in the inertial range of scales, but decorrelate at small scales; (ii) the energy transfer by mutual friction between the components is particularly efficient in the temperature range between 1.8 and 2 K, leading to an enhancement of small-scale intermittency at these temperatures; and (iii) at low T and close to Tλ, the scaling properties of the energy spectra and structure functions of the two components approach those of classical hydrodynamic turbulence.

  1. Quantum chaos: an introduction via chains of interacting spins-1/2

    NASA Astrophysics Data System (ADS)

    Gubin, Aviva; Santos, Lea

    2012-02-01

    We discuss aspects of quantum chaos by focusing on spectral statistical properties and structures of eigenstates of quantum many-body systems. Quantum systems whose classical counterparts are chaotic have properties that differ from those of quantum systems whose classical counterparts are regular. One of the main signatures of what became known as quantum chaos is a spectrum showing repulsion of the energy levels. We show how level repulsion may develop in one-dimensional systems of interacting spins-1/2 which are devoid of random elements and involve only two-body interactions. We present a simple recipe to unfold the spectrum and emphasize the importance of taking into account the symmetries of the system. In addition to the statistics of eigenvalues, we also analyze how the structure of the eigenstates may indicate chaos. This is done by computing quantities that measure the level of delocalization of the eigenstates.
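
    Level repulsion is easy to exhibit numerically. The sketch below generates a GOE-like random matrix (a stand-in for the spin-chain Hamiltonian, which is not constructed here), unfolds the spectrum with a polynomial fit to the spectral staircase, and histograms the nearest-neighbor spacings.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 1000
    a = rng.normal(size=(n, n))
    H = (a + a.T) / 2.0                       # GOE-like symmetric matrix
    E = np.sort(np.linalg.eigvalsh(H))

    # Unfold: fit the spectral staircase with a smooth polynomial so that the
    # mapped spectrum has unit mean level spacing.
    staircase = np.arange(1, n + 1)
    smooth = np.polynomial.Polynomial.fit(E, staircase, deg=9)
    s = np.diff(smooth(E))                    # unfolded nearest-neighbor spacings

    hist, _ = np.histogram(s, bins=30, range=(0.0, 4.0), density=True)
    # hist should follow the Wigner surmise (pi/2) s exp(-pi s^2 / 4), which
    # vanishes at s = 0 (level repulsion), rather than the Poisson form exp(-s).
    ```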

  2. Finite-size effect on optimal efficiency of heat engines.

    PubMed

    Tajima, Hiroyasu; Hayashi, Masahito

    2017-07-01

    The optimal efficiency of quantum (or classical) heat engines whose heat baths are n-particle systems is given by the strong large deviation principle. We give the optimal work extraction process as a concrete energy-preserving unitary time evolution among the heat baths and the work storage. We show that our optimal work extraction turns the disordered energy of the heat baths into the ordered energy of the work storage, by evaluating the ratio of the entropy difference to the energy difference in the heat baths and the work storage, respectively. By comparing the statistical mechanical optimal efficiency with the macroscopic thermodynamic bound, we evaluate the accuracy of macroscopic thermodynamics with finite-size heat baths from the statistical mechanical viewpoint. We also evaluate the effect of quantum coherence on the optimal efficiency of cycle processes, without restricting their cycle time, by comparing the classical and quantum optimal efficiencies.

  3. Multiple Query Evaluation Based on an Enhanced Genetic Algorithm.

    ERIC Educational Resources Information Center

    Tamine, Lynda; Chrisment, Claude; Boughanem, Mohand

    2003-01-01

    Explains the use of genetic algorithms to combine results from multiple query evaluations to improve relevance in information retrieval. Discusses niching techniques, relevance feedback techniques, and evolution heuristics, and compares retrieval results obtained by both genetic multiple query evaluation and classical single query evaluation…

  4. $\mathscr{H}_2$ optimal control techniques for resistive wall mode feedback in tokamaks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clement, Mitchell; Hanson, Jeremy; Bialek, Jim

    DIII-D experiments show that a new, advanced algorithm improves resistive wall mode (RWM) stability control in high performance discharges using external coils. DIII-D can excite strong, locked or nearly locked external kink modes whose rotation frequencies and growth rates are on the order of the magnetic flux diffusion time of the vacuum vessel wall. The VALEN RWM model has been used to gauge the effectiveness of RWM control algorithms in tokamaks. Simulations and experiments have shown that modern control techniques like Linear Quadratic Gaussian (LQG) control will perform better, using 77% less current, than classical techniques when using control coils external to DIII-D's vacuum vessel. Experiments were conducted to develop control of a rotating n = 1 perturbation using an LQG controller derived from VALEN and external coils. Feedback using this LQG algorithm outperformed a proportional-gain-only controller in these perturbation experiments over a range of frequencies. Results from high-βN experiments also show that advanced feedback techniques using external control coils may be as effective as internal control coil feedback using classical control techniques.

  5. $\mathscr{H}_2$ optimal control techniques for resistive wall mode feedback in tokamaks

    DOE PAGES

    Clement, Mitchell; Hanson, Jeremy; Bialek, Jim; ...

    2018-02-28

    DIII-D experiments show that a new, advanced algorithm improves resistive wall mode (RWM) stability control in high performance discharges using external coils. DIII-D can excite strong, locked or nearly locked external kink modes whose rotation frequencies and growth rates are on the order of the magnetic flux diffusion time of the vacuum vessel wall. The VALEN RWM model has been used to gauge the effectiveness of RWM control algorithms in tokamaks. Simulations and experiments have shown that modern control techniques like Linear Quadratic Gaussian (LQG) control will perform better, using 77% less current, than classical techniques when using control coils external to DIII-D's vacuum vessel. Experiments were conducted to develop control of a rotating n = 1 perturbation using an LQG controller derived from VALEN and external coils. Feedback using this LQG algorithm outperformed a proportional-gain-only controller in these perturbation experiments over a range of frequencies. Results from high-βN experiments also show that advanced feedback techniques using external control coils may be as effective as internal control coil feedback using classical control techniques.
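
    The optimal-control half of an LQG design reduces to a Riccati equation, which SciPy solves directly. The sketch below computes an LQR gain for a toy two-state unstable plant; it is a stand-in, not the VALEN RWM model, and the Kalman filter half of LQG is omitted.

    ```python
    import numpy as np
    from scipy.linalg import solve_continuous_are

    A = np.array([[0.0, 1.0],
                  [3.0, -0.5]])              # unstable open-loop toy plant
    B = np.array([[0.0],
                  [1.0]])
    Q = np.diag([10.0, 1.0])                 # state penalty
    R = np.array([[1.0]])                    # control-effort penalty

    P = solve_continuous_are(A, B, Q, R)     # Riccati solution
    K = np.linalg.solve(R, B.T @ P)          # optimal state feedback u = -K x

    print(np.linalg.eigvals(A - B @ K))      # negative real parts: stabilized
    ```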

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gevorkyan, A. S., E-mail: g-ashot@sci.am; Sahakyan, V. V.

    We study classical 1D Heisenberg spin glasses in the framework of the nearest-neighbor model. Based on the Hamilton equations, we obtain a system of recurrence equations that allows node-by-node calculation of a spin chain. It is shown that calculation from the first principles of classical mechanics leads to an NP-hard problem, which, however, in the limit of statistical equilibrium can be solved by a polynomial-time (P) algorithm. For the partition function of the ensemble a new representation is offered, in the form of a one-dimensional integral over the spin chains' energy distribution.

  7. A classical density-functional theory for describing water interfaces.

    PubMed

    Hughes, Jessica; Krebs, Eric J; Roundy, David

    2013-01-14

    We develop a classical density functional for water which combines the White Bear fundamental-measure theory (FMT) functional for the hard sphere fluid with attractive interactions based on the statistical associating fluid theory variable range (SAFT-VR). This functional reproduces the properties of water at both long and short length scales over a wide range of temperatures and is computationally efficient, comparable to the cost of FMT itself. We demonstrate our functional by applying it to systems composed of two hard rods, four hard rods arranged in a square, and hard spheres in water.

  8. Quantum communication complexity advantage implies violation of a Bell inequality

    PubMed Central

    Buhrman, Harry; Czekaj, Łukasz; Grudka, Andrzej; Horodecki, Michał; Horodecki, Paweł; Markiewicz, Marcin; Speelman, Florian; Strelchuk, Sergii

    2016-01-01

    We obtain a general connection between a large quantum advantage in communication complexity and Bell nonlocality. We show that given any protocol offering a sufficiently large quantum advantage in communication complexity, there exists a way of obtaining measurement statistics that violate some Bell inequality. Our main tool is port-based teleportation. If the gap between quantum and classical communication complexity can grow arbitrarily large, the ratio of the quantum value to the classical value of the Bell quantity becomes unbounded with the increase in the number of inputs and outputs. PMID:26957600

  9. Neuroanatomical morphometric characterization of sex differences in youth using statistical learning.

    PubMed

    Sepehrband, Farshid; Lynch, Kirsten M; Cabeen, Ryan P; Gonzalez-Zacarias, Clio; Zhao, Lu; D'Arcy, Mike; Kesselman, Carl; Herting, Megan M; Dinov, Ivo D; Toga, Arthur W; Clark, Kristi A

    2018-05-15

    Exploring neuroanatomical sex differences using a multivariate statistical learning approach can yield insights that cannot be derived with univariate analysis. While gross differences in total brain volume are well-established, uncovering the more subtle, regional sex-related differences in neuroanatomy requires a multivariate approach that can accurately model spatial complexity as well as the interactions between neuroanatomical features. Here, we developed a multivariate statistical learning model using a support vector machine (SVM) classifier to predict sex from MRI-derived regional neuroanatomical features from a single-site study of 967 healthy youth from the Philadelphia Neurodevelopmental Cohort (PNC). Then, we validated the multivariate model on an independent dataset of 682 healthy youth from the multi-site Pediatric Imaging, Neurocognition and Genetics (PING) cohort study. The trained model exhibited an 83% cross-validated prediction accuracy, and correctly predicted the sex of 77% of the subjects from the independent multi-site dataset. Results showed that cortical thickness of the middle occipital lobes and the angular gyri are major predictors of sex. Results also demonstrated the inferential benefits of going beyond classical regression approaches to capture the interactions among brain features in order to better characterize sex differences in male and female youths. We also identified specific cortical morphological measures and parcellation techniques, such as cortical thickness as derived from the Destrieux atlas, that are better able to discriminate between males and females in comparison to other brain atlases (Desikan-Killiany, Brodmann and subcortical atlases). Copyright © 2018 Elsevier Inc. All rights reserved.
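
    The multivariate pipeline described above can be outlined in a few lines with scikit-learn: an SVM classifier predicting sex from regional features, scored by cross-validation. The feature matrix and labels below are random placeholders, not PNC or PING data (so the expected accuracy is chance level).

    ```python
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    X = rng.normal(size=(967, 300))          # subjects x morphometric features (toy)
    y = rng.integers(0, 2, size=967)         # sex labels, random here

    clf = make_pipeline(StandardScaler(), SVC(kernel="linear", C=1.0))
    scores = cross_val_score(clf, X, y, cv=10)
    print(f"cross-validated accuracy: {scores.mean():.2f}")
    ```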

  10. Accuracy of specific BIVA for the assessment of body composition in the United States population.

    PubMed

    Buffa, Roberto; Saragat, Bruno; Cabras, Stefano; Rinaldi, Andrea C; Marini, Elisabetta

    2013-01-01

    Bioelectrical impedance vector analysis (BIVA) is a technique for the assessment of hydration and nutritional status, used in clinical practice. Specific BIVA is an analytical variant, recently proposed for the Italian elderly population, that adjusts bioelectrical values for body geometry. The aim of this study was to evaluate the accuracy of specific BIVA in the adult U.S. population, compared with the 'classic' BIVA procedure, using DXA as the reference technique, in order to obtain an interpretative model of body composition. A cross-sectional sample of 1590 adult individuals (836 men and 754 women, 21-49 years old) derived from the NHANES 2003-2004 was considered. Classic and specific BIVA were applied. The sensitivity and specificity in recognizing individuals below the 5th and above the 95th percentiles of percent fat (FMDXA%) and extracellular/intracellular water (ECW/ICW) ratio were evaluated by receiver operating characteristic (ROC) curves. Classic and specific BIVA results were compared by probit multiple regression. Specific BIVA was significantly more accurate than classic BIVA in evaluating FMDXA% (ROC areas: 0.84-0.92 and 0.49-0.61, respectively; p = 0.002). The evaluation of ECW/ICW was accurate (ROC areas between 0.83 and 0.96) and similarly performed by the two procedures (p = 0.829). The accuracy of specific BIVA was similar in the two sexes (p = 0.144) and in FMDXA% and ECW/ICW (p = 0.869). Specific BIVA proved to be an accurate technique. The tolerance ellipses of specific BIVA can be used for evaluating FM% and ECW/ICW in the U.S. adult population.

  11. MetaboLyzer: A Novel Statistical Workflow for Analyzing Post-Processed LC/MS Metabolomics Data

    PubMed Central

    Mak, Tytus D.; Laiakis, Evagelia C.; Goudarzi, Maryam; Fornace, Albert J.

    2014-01-01

    Metabolomics, the global study of small molecules in a particular system, has in the last few years risen to become a primary -omics platform for the study of metabolic processes. With the ever-increasing pool of quantitative data yielded from metabolomic research, specialized methods and tools with which to analyze and extract meaningful conclusions from these data are becoming more and more crucial. Furthermore, the depth of knowledge and expertise required to undertake a metabolomics-oriented study is a daunting obstacle to investigators new to the field. As such, we have created a new statistical analysis workflow, MetaboLyzer, which aims both to simplify analysis for investigators new to metabolomics and to provide experienced investigators the flexibility to conduct sophisticated analysis. MetaboLyzer's workflow is specifically tailored to the unique characteristics and idiosyncrasies of post-processed liquid chromatography/mass spectrometry (LC/MS)-based metabolomic datasets. It utilizes a wide gamut of statistical tests, procedures, and methodologies that belong to classical biostatistics, as well as several novel statistical techniques that we have developed specifically for metabolomics data. Furthermore, MetaboLyzer conducts rapid putative ion identification and putative biologically relevant analysis via incorporation of four major small molecule databases: KEGG, HMDB, Lipid Maps, and BioCyc. MetaboLyzer incorporates these aspects into a comprehensive workflow that outputs easy-to-understand, statistically significant, and potentially biologically relevant information in the form of heatmaps, volcano plots, 3D visualization plots, correlation maps, and metabolic pathway hit histograms. For demonstration purposes, a urine metabolomics dataset from a previously reported radiobiology study, in which samples were collected from mice exposed to gamma radiation, was analyzed. MetaboLyzer identified 243 statistically significant ions out of a total of 1942. Numerous putative metabolites and pathways were found to be biologically significant in the putative ion identification workflow. PMID:24266674
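
    One core step of any such workflow, per-ion significance testing with multiple-testing control, yields the quantities plotted in a volcano plot and can be sketched generically. This is not MetaboLyzer's actual code; the Welch t-test, the Benjamini-Hochberg correction, and the group sizes are illustrative assumptions.

      # Hedged sketch of one MetaboLyzer-style step (not the actual tool):
      # per-ion significance testing with FDR control, yielding the
      # fold-change / p-value pairs that a volcano plot displays.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(1)
      n_ions = 1942                                # total ions, as in the demo dataset
      control = rng.lognormal(size=(10, n_ions))   # stand-in intensities, control mice
      exposed = rng.lognormal(size=(10, n_ions))   # stand-in intensities, irradiated mice

      t, p = stats.ttest_ind(control, exposed, equal_var=False)  # Welch t-test per ion
      log2_fc = np.log2(exposed.mean(0) / control.mean(0))

      # Benjamini-Hochberg step-up FDR control at 5%.
      order = np.argsort(p)
      ranked = p[order] * n_ions / np.arange(1, n_ions + 1)
      adjusted = np.minimum.accumulate(ranked[::-1])[::-1]
      significant = np.zeros(n_ions, bool)
      significant[order] = adjusted <= 0.05
      print(f"{significant.sum()} ions pass FDR; volcano axes = (log2_fc, -log10 p)")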

  12. The Effect of Mass, Wind Angle, and Erection Technique on the Aeroelastic Behaviour of a Cable-Stayed Bridge Model (Effet de la Masse, de L’Angle du Vent et de la Technique D’Erection sur le Comportement Aeroelastique d’une Maquette de Pont a Haubans).

    DTIC Science & Technology

    1987-09-01

    response. An estimate of the buffeting response for the two cases is presented in Figure 4, using the theory of Irwin (Reference 7). Data acquisition was...values were obtained using the log decrement method by exciting the bridge in one mode and observing the decay of the response. Classical theory would...added mass or structural damping level. The addition of inertia to the deck would tend to lower the response according to classical vibration theory

  13. A comparison of cone-beam computed tomography and direct measurement in the examination of the mandibular canal and adjacent structures.

    PubMed

    Kim, Thomas S; Caruso, Joseph M; Christensen, Heidi; Torabinejad, Mahmoud

    2010-07-01

    The purpose of this investigation was to assess the ability of cone-beam computed tomography (CBCT) scanning to measure distances from the apices of selected posterior teeth to the mandibular canal. Measurements were taken from the apices of all posterior teeth that were superior to the mandibular canal. A pilot study was performed to determine the scanning parameters that produced the most diagnostic image and the best dissection technique. Twelve human hemimandibles with posterior teeth were scanned at .20 voxels on an I-CAT Classic CBCT device (Imaging Sciences International, Hatfield, PA), and the scans were exported in Digital Imaging and Communications in Medicine (DICOM) format. The scans were examined in InVivo Dental software (Anatomage, San Jose, CA), and measurements were taken from the apex of each root along its long axis to the upper portion of the mandibular canal. The specimens were dissected under a dental operating microscope, and analogous direct measurements were taken with a Boley gauge. All measurements were taken in triplicate at least 1 week apart by one individual (TSK). The results were averaged and the data separated into matching pairs for statistical analysis. There was no statistical difference (alpha = .05) between the methods of measurement according to the Wilcoxon matched pairs test (p = 0.676). For the anatomic measurements, the intra-rater correlation coefficient (ICC) was .980 and for the CBCT it was .949, indicating that both methods were highly reproducible. Both measurement methods were highly predictive of and highly correlated to each other according to regression and correlation analysis, respectively. Based on the results of this study, the I-CAT Classic can be used to measure distances from the apices of the posterior teeth to the mandibular canal as accurately as direct anatomic dissection. Copyright 2010 American Association of Endodontists. All rights reserved.

  14. Exploring Attitudes of Indian Classical Singers Toward Seeking Vocal Health Care.

    PubMed

    Gunjawate, Dhanshree R; Aithal, Venkataraja U; Guddattu, Vasudeva; Kishore, Amrutha; Bellur, Rajashekhar

    2016-11-01

    The attitude of Indian classical singers toward seeking vocal health care is a dimension yet to be explored. The current study aimed to determine the attitudes of these singers toward seeking vocal health care and further understand the influence of age and gender. Cross-sectional. A 10-item self-report questionnaire adapted from a study on contemporary commercial music singers was used. An additional question was added to ask if the singer was aware of the profession and role of speech-language pathologists (SLPs). The questionnaire was administered to 55 randomly selected self-identified trained Indian classical singers who rated the items using a five-point Likert scale. Demographic variables were summarized using descriptive statistics, and a t test was used to compare the mean scores between genders and age groups. Of the singers, 78.2% were likely to see a doctor for health-related problems, whereas 81.8% were unlikely to seek medical care for voice-related problems; the difference was statistically significant (P < 0.001). Responses to the questions assessing attitudes toward findings from medical examination by a specialist revealed a statistically significant difference (P = 0.02) between the genders. Age did not have a significant influence on the responses. Only 23.6% of the respondents were aware of the profession and the role of SLPs. The findings are in tune with Western literature reporting the hesitation of singers toward seeking vocal health care and draw the attention of SLPs to promote their role in vocal health awareness and management. Copyright © 2016 The Voice Foundation. Published by Elsevier Inc. All rights reserved.

  15. Impact of the initial classic section during a simulated cross-country skiing skiathlon on the cardiopulmonary responses during the subsequent period of skate skiing.

    PubMed

    Mourot, Laurent; Fabre, Nicolas; Andersson, Erik; Willis, Sarah J; Hébert-Losier, Kim; Holmberg, Hans-Christer

    2014-08-01

    The aim of this study was to assess potential changes in the performance and cardiorespiratory responses of elite cross-country skiers following transition from the classic (CL) to the skating (SK) technique during a simulated skiathlon. Eight elite male skiers performed two 6 km (2 × 3 km) roller-skiing time trials on a treadmill at racing speed: one starting with the classic and switching to the skating technique (CL1-SK2) and another employing the skating technique throughout (SK1-SK2), with continuous monitoring of gas exchanges, heart rates, and kinematics (video). The overall performance times in the CL1-SK2 (21:12 ± 1:24) and SK1-SK2 (20:48 ± 2:00) trials were similar, and during the second section of each performance times and overall cardiopulmonary responses were also comparable. However, in comparison with SK1-SK2, the CL1-SK2 trial involved significantly higher increases in minute ventilation (V̇E, 89.8 ± 26.8 vs. 106.8 ± 17.6 L·min(-1)) and oxygen uptake (V̇O2; 3.1 ± 0.8 vs 3.5 ± 0.5 L·min(-1)) 2 min after the transition as well as longer time constants for V̇E, V̇O2, and heart rate during the first 3 min after the transition. This higher cardiopulmonary exertion was associated with ∼3% faster cycle rates. In conclusion, overall performance during the 2 time trials did not differ. The similar performance times during the second sections were achieved with comparable mean cardiopulmonary responses. However, the observation that during the initial 3-min post-transition following classic skiing cardiopulmonary responses and cycle rates were slightly higher supports the conclusion that an initial section of classic skiing exerts an impact on performance during a subsequent section of skate skiing.

  16. Quantum-Like Representation of Non-Bayesian Inference

    NASA Astrophysics Data System (ADS)

    Asano, M.; Basieva, I.; Khrennikov, A.; Ohya, M.; Tanaka, Y.

    2013-01-01

    This research is related to the problem of "irrational decision making or inference" that has been discussed in cognitive psychology. There are some experimental studies whose statistical data cannot be described by classical probability theory, and the process of decision making generating these data cannot be reduced to classical Bayesian inference. For this problem, a number of quantum-like cognitive models of decision making have been proposed. Our previous work represented classical Bayesian inference in a natural way within the framework of quantum mechanics. Using this representation, in this paper we discuss the non-Bayesian (irrational) inference that is biased by effects like quantum interference. Further, we describe the "psychological factor" disturbing "rationality" as an "environment" correlating with the "main system" of usual Bayesian inference.

  17. Non-classical Signature of Parametric Fluorescence and its Application in Metrology

    NASA Astrophysics Data System (ADS)

    Hamar, M.; Michálek, V.; Pathak, A.

    2014-08-01

    The article provides a short theoretical background of what non-classical light means. We applied the criterion for the existence of non-classical effects derived by C.T. Lee to parametric fluorescence. The criterion was originally derived for the study of two light beams with one mode per beam. We checked through numerical simulations whether the criterion still works for two multimode beams of parametric down-conversion. The theoretical results were tested by measuring the photon number statistics of twin beams emitted by a nonlinear BBO crystal pumped by an intense femtosecond UV pulse. We used an ICCD camera as the detector of photons in both beams. It appears that the criterion can be used for the measurement of the quantum efficiencies of ICCD cameras.

  18. Determination of Phosphates by the Gravimetric Quimociac Technique

    ERIC Educational Resources Information Center

    Shaver, Lee Alan

    2008-01-01

    The determination of phosphates by the classic quimociac gravimetric technique was used successfully as a laboratory experiment in our undergraduate analytical chemistry course. Phosphate-containing compounds are dissolved in acid and converted to soluble orthophosphate ion (PO₄³⁻). The soluble phosphate is easily…

  19. Comparison of learning curves and skill transfer between classical and robotic laparoscopy according to the viewing conditions: implications for training.

    PubMed

    Blavier, Adélaïde; Gaudissart, Quentin; Cadière, Guy-Bernard; Nyssen, Anne-Sophie

    2007-07-01

    The purpose of this study was to evaluate the perceptual (2-dimensional [2D] vs. 3-dimensional [3D] view) and instrumental (classical vs. robotic) impacts of a new robotic system on learning curves. Forty medical students without any surgical experience were randomized into 4 groups (classical laparoscopy with 3D direct view or with 2D indirect view, robotic system in 3D or in 2D) and repeated a laparoscopic task 6 times. After these 6 repetitions, they performed 2 trials with the same technique but in the other viewing condition (perceptive switch). Finally, subjects performed the last 3 trials with the technique they had never used (technical switch). Subjects evaluated their performance by answering a questionnaire (impressions of mastery, familiarity, satisfaction, self-confidence, and difficulty). Our study showed better performance and improvement in the 3D view than in the 2D view whatever the instrumental aspect. Participants reported less mastery, familiarity, and self-confidence and more difficulty in classical laparoscopy with 2D indirect view than in the other conditions. Robotic surgery improves surgical performance and learning, particularly through the advantage of the 3D view. However, the perceptive and technical switches emphasize the need to adapt and pursue training with traditional technology as well, to prevent risks during conversion procedures.

  20. Hearing the shape of the Ising model with a programmable superconducting-flux annealer.

    PubMed

    Vinci, Walter; Markström, Klas; Boixo, Sergio; Roy, Aidan; Spedalieri, Federico M; Warburton, Paul A; Severini, Simone

    2014-07-16

    Two objects can be distinguished if they have different measurable properties. Thus, distinguishability depends on the physics of the objects. In considering graphs, we revisit the Ising model as a framework to define physically meaningful spectral invariants. In this context, we introduce a family of refinements of the classical spectrum and consider the quantum partition function. We demonstrate that the energy spectrum of the quantum Ising Hamiltonian is a stronger invariant than the classical one without refinements. For the purpose of implementing the related physical systems, we perform experiments on a programmable annealer with superconducting flux technology. Departing from the paradigm of adiabatic computation, we take advantage of a noisy evolution of the device to generate statistics of low energy states. The graphs considered in the experiments have the same classical partition functions, but different quantum spectra. The data obtained from the annealer distinguish non-isomorphic graphs via information contained in the classical refinements of the functions but not via the differences in the quantum spectra.

  1. Minimized state complexity of quantum-encoded cryptic processes

    NASA Astrophysics Data System (ADS)

    Riechers, Paul M.; Mahoney, John R.; Aghamohammadi, Cina; Crutchfield, James P.

    2016-05-01

    The predictive information required for proper trajectory sampling of a stochastic process can be more efficiently transmitted via a quantum channel than a classical one. This recent discovery allows quantum information processing to drastically reduce the memory necessary to simulate complex classical stochastic processes. It also points to a new perspective on the intrinsic complexity that nature must employ in generating the processes we observe. The quantum advantage increases with codeword length: the length of process sequences used in constructing the quantum communication scheme. In analogy with the classical complexity measure, statistical complexity, we use this reduced communication cost as an entropic measure of state complexity in the quantum representation. Previously difficult to compute, the quantum advantage is expressed here in closed form using spectral decomposition. This allows for efficient numerical computation of the quantum-reduced state complexity at all encoding lengths, including infinite. Additionally, it makes clear how finite-codeword reduction in state complexity is controlled by the classical process's cryptic order, and it allows asymptotic analysis of infinite-cryptic-order processes.

  2. Single-snapshot DOA estimation by using Compressed Sensing

    NASA Astrophysics Data System (ADS)

    Fortunati, Stefano; Grasso, Raffaele; Gini, Fulvio; Greco, Maria S.; LePage, Kevin

    2014-12-01

    This paper deals with the problem of estimating the directions of arrival (DOA) of multiple source signals from a single observation vector of array data. In particular, four estimation algorithms based on the theory of compressed sensing (CS) are analyzed: the classical ℓ1 minimization (or Least Absolute Shrinkage and Selection Operator, LASSO), the fast smooth ℓ0 minimization, the Sparse Iterative Covariance-based Estimator (SPICE), and the Iterative Adaptive Approach for Amplitude and Phase Estimation (IAA-APES). Their statistical properties are investigated and compared with the classical Fourier beamformer (FB) in different simulated scenarios. We show that unlike the classical FB, a CS-based beamformer (CSB) has some desirable properties typical of the adaptive algorithms (e.g., Capon and MUSIC) even in the single-snapshot case. Particular attention is devoted to the super-resolution property. Theoretical arguments and simulation analysis provide evidence that a CS-based beamformer can achieve resolution beyond the classical Rayleigh limit. Finally, the theoretical findings are validated by processing a real sonar dataset.
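
    A minimal single-snapshot LASSO beamformer of the kind analyzed can be sketched as follows, assuming a uniform linear array, an angle-grid dictionary, and sources with real, non-negative amplitudes (so the complex model can be stacked into real form for scikit-learn); the array size, regularization weight, and noise level are illustrative assumptions, not the paper's settings.

      # Single-snapshot DOA sketch using the LASSO, one of the four CS estimators
      # discussed. The complex model y = A x + n is stacked into real form.
      import numpy as np
      from sklearn.linear_model import Lasso

      M, d = 16, 0.5                                 # sensors, spacing in wavelengths
      grid = np.deg2rad(np.arange(-90, 90.5, 0.5))   # DOA dictionary grid
      A = np.exp(-2j * np.pi * d * np.outer(np.arange(M), np.sin(grid)))

      rng = np.random.default_rng(2)
      true = np.deg2rad([-10.0, -6.0])               # two closely spaced unit sources
      y = sum(np.exp(-2j * np.pi * d * np.arange(M) * np.sin(t)) for t in true)
      y = y + 0.05 * (rng.normal(size=M) + 1j * rng.normal(size=M))

      # Stack real and imaginary parts; amplitudes assumed real and non-negative.
      A_r = np.vstack([A.real, A.imag])
      y_r = np.concatenate([y.real, y.imag])
      x = Lasso(alpha=0.05, positive=True, max_iter=50000).fit(A_r, y_r).coef_

      peaks = np.rad2deg(grid[x > 0.1 * x.max()])
      print("estimated DOAs (deg):", peaks)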

  3. Canonical partition functions: ideal quantum gases, interacting classical gases, and interacting quantum gases

    NASA Astrophysics Data System (ADS)

    Zhou, Chi-Chun; Dai, Wu-Sheng

    2018-02-01

    In statistical mechanics, for a system with a fixed number of particles, e.g. a finite-size system, the thermodynamic quantities, strictly speaking, need to be calculated in the canonical ensemble. Nevertheless, the calculation of the canonical partition function is difficult. In this paper, based on the mathematical theory of symmetric functions, we suggest a method for the calculation of the canonical partition function of ideal quantum gases, including ideal Bose, Fermi, and Gentile gases. Moreover, we express the canonical partition functions of interacting classical and quantum gases, given by the classical and quantum cluster expansion methods, in terms of the Bell polynomial. The virial coefficients of ideal Bose, Fermi, and Gentile gases are calculated from the exact canonical partition function. The virial coefficients of interacting classical and quantum gases are calculated from the canonical partition function by using the expansion of the Bell polynomial, rather than from the grand canonical potential.
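
    For ideal Bose and Fermi gases, the canonical partition function the paper targets can also be obtained from a standard recursion over particle number, Z_N = (1/N) Σ_{k=1..N} (±1)^{k+1} z_k Z_{N−k} with z_k = Σ_i exp(−kβε_i), which is mathematically equivalent to the symmetric-function formulation. The sketch below uses this recursion (not necessarily the authors' algorithm) on a truncated harmonic spectrum, an illustrative assumption.

      # Canonical partition function of N ideal bosons (sign = +1) or fermions
      # (sign = -1) via the standard recursion over particle number.
      import numpy as np

      def canonical_Z(eps, beta, N, sign=+1):
          zk = [np.exp(-k * beta * eps).sum() for k in range(N + 1)]  # zk[0] unused
          Z = [1.0]                                                   # Z_0 = 1
          for n in range(1, N + 1):
              Z.append(sum(sign**(k + 1) * zk[k] * Z[n - k]
                           for k in range(1, n + 1)) / n)
          return Z[N]

      eps = np.arange(50)           # truncated 1D harmonic levels (hbar*omega = 1)
      print(canonical_Z(eps, beta=1.0, N=5, sign=+1))   # 5 ideal bosons
      print(canonical_Z(eps, beta=1.0, N=5, sign=-1))   # 5 ideal fermions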

  4. Enzyme-based electrochemical biosensors for determination of organophosphorus and carbamate pesticides

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Everett, W.R.; Rechnitz, G.A.

    1999-01-01

    A mini review of enzyme-based electrochemical biosensors for inhibition analysis of organophosphorus and carbamate pesticides is presented. Discussion includes the most recent literature on advances in detection limits, selectivity, and real sample analysis. Recent reviews on the monitoring of pesticides and their residues suggest that the classical analytical techniques of gas and liquid chromatography are the most widely used methods of detection. These techniques, although very accurate in their determinations, can be quite time consuming and expensive and usually require extensive sample clean-up and pre-concentration. For these and many other reasons, the classical techniques are very difficult to adapt for field use. Numerous researchers, in the past decade, have developed and made improvements on biosensors for use in pesticide analysis. This mini review will focus on recent advances made in enzyme-based electrochemical biosensors for the determination of organophosphorus and carbamate pesticides.

  5. Time Domain Stability Margin Assessment of the NASA Space Launch System GN&C Design for Exploration Mission One

    NASA Technical Reports Server (NTRS)

    Clements, Keith; Wall, John

    2017-01-01

    The baseline stability margins for NASA's Space Launch System (SLS) launch vehicle were generated via the classical approach of linearizing the system equations of motion and determining the gain and phase margins from the resulting frequency domain model. To improve the fidelity of the classical methods, the linear frequency domain approach can be extended by replacing static, memoryless nonlinearities with describing functions. This technique, however, does not address the time varying nature of the dynamics of a launch vehicle in flight. An alternative technique for the evaluation of the stability of the nonlinear launch vehicle dynamics along its trajectory is to incrementally adjust the gain and/or time delay in the time domain simulation until the system exhibits unstable behavior. This technique has the added benefit of providing a direct comparison between the time domain and frequency domain tools in support of simulation validation.
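
    The time-domain procedure described, increasing the loop gain until the simulated response diverges, can be illustrated on a toy system. The sketch below is not the SLS simulation: the plant, the pure feedback delay, and the growth criterion are all illustrative assumptions.

      # Toy illustration of the time-domain margin technique: simulate a plant
      # with delayed feedback, y''(t) + y'(t) + gain * y(t - delay) = 0, and
      # sweep the gain until the response starts to grow instead of decay.
      import numpy as np

      def growth_ratio(gain, delay=0.5, dt=0.001, T=60.0):
          n, nd = int(T / dt), int(delay / dt)
          y = np.zeros(n)
          y[0] = y[1] = 1e-3                       # small initial perturbation
          for k in range(1, n - 1):
              yd = y[k - nd] if k >= nd else 0.0   # delayed feedback signal
              y[k + 1] = (2*y[k] - y[k - 1]
                          - dt*(y[k] - y[k - 1]) - dt*dt*gain*yd)
          # >1 means the late response exceeds the early response: unstable.
          return np.abs(y[3*n//4:]).max() / np.abs(y[:n//4]).max()

      for gain in np.arange(0.5, 4.01, 0.5):
          verdict = "unstable" if growth_ratio(gain) > 1.0 else "stable"
          print(f"gain {gain:4.2f}: {verdict}")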

  7. Robust Stability Analysis of the Space Launch System Control Design: A Singular Value Approach

    NASA Technical Reports Server (NTRS)

    Pei, Jing; Newsome, Jerry R.

    2015-01-01

    Classical stability analysis consists of breaking the feedback loops one at a time and determining separately how much gain or phase variations would destabilize the stable nominal feedback system. For typical launch vehicle control design, classical control techniques are generally employed. In addition to stability margins, frequency domain Monte Carlo methods are used to evaluate the robustness of the design. However, such techniques were developed for Single-Input-Single-Output (SISO) systems and do not take into consideration the off-diagonal terms in the transfer function matrix of Multi-Input-Multi-Output (MIMO) systems. Robust stability analysis techniques such as H∞ and μ are applicable to MIMO systems but have not been adopted as standard practices within the launch vehicle controls community. This paper took advantage of a simple singular-value-based MIMO stability margin evaluation method based on work done by Mukhopadhyay and Newsom and applied it to the SLS high-fidelity dynamics model. The method computes a simultaneous multi-loop gain and phase margin that could be related back to classical margins. The results presented in this paper suggest that for the SLS system, traditional SISO stability margins are similar to the MIMO margins. This additional level of verification provides confidence in the robustness of the control design.
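
    A common singular-value relation of the kind the paper builds on evaluates α = min over ω of σ_min(I + L(jω)) for the open-loop matrix L, then converts it into guaranteed simultaneous margins: the gain factor may vary in [1/(1+α), 1/(1−α)] and the phase by ±2·arcsin(α/2). The sketch below applies this to an invented 2×2 loop; the transfer functions are illustrative assumptions, not the SLS model, and α < 1 is assumed.

      # Singular-value multiloop margin sketch in the spirit of the method
      # described: alpha = min over frequency of sigma_min(I + L(jw)), mapped
      # to simultaneous gain/phase margin guarantees.
      import numpy as np

      def loop_tf(w):
          """Illustrative 2x2 open-loop frequency response L(jw)."""
          s = 1j * w
          return np.array([[5/(s*(s+1)), 0.5/(s+2)],
                           [0.3/(s+1),   4/(s*(s+0.5))]])

      freqs = np.logspace(-2, 2, 2000)
      alpha = min(np.linalg.svd(np.eye(2) + loop_tf(w), compute_uv=False)[-1]
                  for w in freqs)

      gm_lo, gm_hi = 1/(1 + alpha), 1/(1 - alpha)   # simultaneous gain interval
      pm = 2 * np.degrees(np.arcsin(alpha / 2))     # simultaneous phase margin
      print(f"alpha = {alpha:.3f}, gain factor in [{gm_lo:.2f}, {gm_hi:.2f}], "
            f"phase margin +/-{pm:.1f} deg")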

  8. Semi-empirical seismic relations of A-F stars from COROT and Kepler legacy data

    NASA Astrophysics Data System (ADS)

    Moya, A.; Suárez, J. C.; García Hernández, A.; Mendoza, M. A.

    2017-10-01

    Asteroseismology is witnessing a revolution, thanks to high-precision asteroseismic space data (MOST, COROT, Kepler, BRITE) and their large ground-based follow-up programs. These instruments have provided an unprecedented amount of information, which allows us to scrutinize its statistical properties in the quest for hidden relations among pulsational and/or physical observables. This approach might be particularly useful for stars whose pulsation content is difficult to interpret. This is the case for intermediate-mass classical pulsating stars (i.e., γ Dor, δ Scuti, hybrids), for which current theories do not properly predict the observed oscillation spectra. Here, we establish a first step in finding such hidden relations from data mining techniques for these stars. We searched for those hidden relations in a sample of δ Scuti and hybrid stars observed by COROT and Kepler (74 and 153, respectively). No significant correlations between pairs of observables were found. However, two statistically significant correlations emerged from multivariable correlations in the observed seismic data, describing the total number of observed frequencies and the largest one, respectively. Moreover, three different sets of stars were found to cluster according to their frequency density distribution. Such sets are in apparent agreement with the asteroseismic properties commonly accepted for A-F pulsating stars.

  9. Hybrid Diffusion Imaging in Mild Traumatic Brain Injury.

    PubMed

    Wu, Yu-Chien; Mustafi, Sourajit Mitra; Harezlak, Jaroslaw; Kodiweera, Chandana; Flashman, Laura A; McAllister, Thomas

    2018-05-22

    Mild traumatic brain injury (mTBI) is an important public health problem. Although conventional medical imaging techniques can detect moderate-to-severe injuries, they are relatively insensitive to mTBI. In this study, we used hybrid diffusion imaging (HYDI) to detect white-matter alterations in 19 patients with mTBI and 23 other trauma-control patients. Within 15 days (SD = 10) of brain injury, all subjects underwent magnetic-resonance HYDI and were assessed with a battery of neuropsychological tests of sustained attention, memory, and executive function. Tract-based spatial statistics (TBSS) were used for voxelwise statistical analyses within the white-matter skeleton to study between-group differences in diffusion metrics, within-group correlations between diffusion metrics and clinical outcomes, and between-group interaction effects. The advanced diffusion imaging techniques, including neurite orientation dispersion and density imaging (NODDI) and q-space analyses, appeared to be more sensitive than classic diffusion tensor imaging (DTI). Only the NODDI-derived intra-axonal volume fraction (Vic) demonstrated significant group differences (i.e., 5% to 9% lower in the injured brain). Within the mTBI group, Vic and a q-space measure, P0, correlated with 6 of 10 neuropsychological tests, including measures of attention, memory, and executive function. In addition, the direction of correlations differed significantly between the groups (R2 > 0.71 and P_interaction < 0.03). Specifically, in the control group, higher Vic and P0 were associated with better performance on clinical assessments, whereas in the mTBI group, higher Vic and P0 were associated with worse performance, with correlation coefficients > 0.83. In summary, the NODDI-derived axonal density index and the q-space measure of tissue restriction demonstrated superior sensitivity to white-matter changes shortly after mTBI. These techniques hold promise as a neuroimaging biomarker for mTBI.

  10. Single anterior portal: A better option for arthroscopic treatment of traumatic anterior shoulder instability?

    PubMed

    Çiçek, Hakan; Tuhanioğlu, Ümit; Oğur, Hasan Ulaş; Seyfettinoğlu, Fırat; Çiloğlu, Osman; Beyzadeoğlu, Tahsin

    2017-07-01

    The aim of this study was to compare single and double anterior portal techniques in the arthroscopic treatment of traumatic anterior shoulder instability. A total of 91 cases who underwent arthroscopic Bankart repair for anterior shoulder instability were reviewed. The patients were divided into 2 groups as Group 1 (47 male and 2 female; mean age: 25.8 ± 6.8) for arthroscopic single anterior portal approach and Group 2 (41 male and 1 female; mean age: 25.4 ± 6.6) for the classical anterior double portal approach. The groups were compared for clinical scores, range of motion, analgesia requirement, complications, duration of surgery, cost and learning curve according to a short questionnaire completed by the relevant healthcare professionals. No statistically significant difference was found between the 2 groups in terms of pre-operative and post-operative Constant and Rowe Shoulder Scores, range of motion and complications (p > 0.05). In Group 2 patients, the requirement for post-operative analgesics was significantly higher (p < 0.001), whereas the duration of surgery was statistically significantly shorter in Group 1 (p < 0.001). In the assessment of the questionnaire, it was seen that a single portal anterior approach was preferred at a higher ratio (p = 0.035). The cost analysis revealed that the cost was 5.7% less for patients with a single portal. In the arthroscopic treatment of traumatic anterior shoulder instability accompanied by a Bankart lesion, the anterior single portal technique is as successful in terms of clinical results as the conventional double portal approach. The single portal technique has advantages such as less postoperative pain, a shorter surgical learning curve and lower costs. Level III, Therapeutic study. Copyright © 2017 Turkish Association of Orthopaedics and Traumatology. Production and hosting by Elsevier B.V. All rights reserved.

  11. An Update of the Classical and Novel Methods Used for Measuring Fast Neurotransmitters During Normal and Brain Altered Function

    PubMed Central

    Cifuentes Castro, Victor Hugo; López Valenzuela, Carmen Lucía; Salazar Sánchez, Juan Carlos; Peña, Kenia Pardo; López Pérez, Silvia J.; Ibarra, Jorge Ortega; Villagrán, Alberto Morales

    2014-01-01

    To better understand cerebral function, several methods have been developed to study brain activity; they may draw on morphological, electrophysiological, molecular, and neurochemical techniques. Monitoring neurotransmitter concentration plays a key role in understanding how the brain works during normal or pathological conditions, as well as in studying the changes in neurotransmitter concentration produced by drugs that could affect or reestablish normal brain activity. The immediate response of the brain to environmental conditions is related to the release of the fast-acting neurotransmitters glutamate (Glu), γ-aminobutyric acid (GABA), and acetylcholine (ACh) through the opening of ligand-operated ion channels. Neurotransmitter release is mainly determined by the classical microdialysis technique, generally coupled to high performance liquid chromatography (HPLC). Detection of neurotransmitters can be done by fluorescence, optical density, electrochemistry, or other, more sophisticated detection systems. Although microdialysis is the gold-standard technique for monitoring brain neurotransmitters, it has poor temporal resolution. Recently, with the use of biosensors, the drawback of temporal resolution has been improved considerably; however, other problems have emerged, such as stability, reproducibility, and the lack of reliable biosensors, particularly for GABA. The aim of this review is to show the important advances in the different ways to measure neurotransmitter concentrations, both with classic techniques and with novel methods and alternative approaches to improve temporal resolution. PMID:25977677

  12. Biomechanical and energetic determinants of technique selection in classical cross-country skiing.

    PubMed

    Pellegrini, Barbara; Zoppirolli, Chiara; Bortolan, Lorenzo; Holmberg, Hans-Christer; Zamparo, Paola; Schena, Federico

    2013-12-01

    Classical cross-country skiing can be performed using three main techniques: diagonal stride (DS), double poling (DP), and double poling with kick (DK). As with other forms of human and animal gait, it is currently unclear whether technique selection occurs to minimize metabolic cost or to keep some mechanical factors below a given threshold. The aim of this study was to find the determinants of technique selection. Ten male athletes roller skied on a treadmill at different slopes (from 0° to 7° at 10 km/h) and speeds (from 6 to 18 km/h at 2°). The technique preferred by the skiers was recorded for every proposed condition. Biomechanical parameters and metabolic cost were then measured for each condition and technique. Skiers preferred DP for skiing on the flat, and they transitioned to DK and then to DS with increasing slope steepness; with increasing speed, all skiers preferred DP. The data suggest that selection mainly occurs to remain below a threshold of poling force. Second, critically low values of leg thrust time may limit the use of leg-based techniques at high speeds. A small role has been identified for the metabolic cost of locomotion, which determined the selection of DP for flat skiing. Copyright © 2013 Elsevier B.V. All rights reserved.

  13. Dielectric properties of classical and quantized ionic fluids.

    PubMed

    Høye, Johan S

    2010-06-01

    We study time-dependent correlation functions of classical and quantum gases using methods of equilibrium statistical mechanics for systems of uniform as well as nonuniform densities. The basis for our approach is the path integral formalism of quantum mechanical systems. With this approach, the statistical mechanics of a quantum mechanical system becomes the equivalent of a classical polymer problem in four dimensions, where imaginary time is the fourth dimension. Several nontrivial results for quantum systems have been obtained earlier by this analogy. Here, we focus upon the presence of a time-dependent electromagnetic pair interaction, in which the electromagnetic vector potential, which depends upon currents, is present. Thus both density and current correlations are needed to evaluate the influence of this interaction. We then use the fact that densities and currents can be expressed by polarizations, by which the ionic fluid can be regarded as a dielectric one for which a nonlocal susceptibility is found. A consequence of this nonlocality is that we find no contribution from a possible transverse electric zero-frequency mode to the Casimir force between metallic plates. Further, we establish expressions for a leading correction to ab initio calculations of the energies of the quantized electrons of molecules, where retardation effects are now also taken into account.

  14. Playing-related disabling musculoskeletal disorders in young and adult classical piano students.

    PubMed

    Bruno, S; Lorusso, A; L'Abbate, N

    2008-07-01

    To determine the prevalence of instrument-related musculoskeletal problems in classical piano students and investigate piano-specific risk factors. A specially developed four-part questionnaire was administered to classical piano students of two Apulian conservatories in southern Italy. A cross-sectional design was used. Prevalences of playing-related musculoskeletal disorders (MSDs) were calculated and cases were compared with non-cases. A total of 195 of the 224 piano students responded (87%). Among the 195 responders, 75 (38.4%) were considered affected according to the pre-established criteria. Disabling MSDs showed similar prevalence rates for the neck (29.3%), thoracic spine (21.3%), and upper limbs (from 20.0 to 30.4%) in the affected group. Univariate analyses showed statistical differences concerning mean age, number of hours per week spent playing, more than 60 min of continuous playing without breaks, lack of sport practice, and acceptance of the "No pain, no gain" criterion in students with music-related pain compared with pianists not affected. A statistical correlation was found only between upper-limb disorders in pianists and hand size. No correlation with the model of piano played was found in the affected group. Multivariate analyses performed by logistic regression confirmed the independent correlation of the risk factors age, lack of sport practice, and acceptance of the "No pain, no gain" criterion. Our study showed MSDs to be a common problem among classical piano students. At variance with several reported studies, older students appeared to be more frequently affected by disabling MSDs, and no difference in the prevalence rate of the disorders was found in females.

  15. Objective Dysphonia Quantification in Vocal Fold Paralysis: Comparing Nonlinear with Classical Measures

    PubMed Central

    Little, Max A.; Costello, Declan A. E.; Harries, Meredydd L.

    2010-01-01

    Clinical acoustic voice-recording analysis is usually performed using classical perturbation measures, including jitter, shimmer, and noise-to-harmonic ratios (NHRs). However, restrictive mathematical limitations of these measures prevent analysis for severely dysphonic voices. Previous studies of alternative nonlinear random measures addressed wide varieties of vocal pathologies. Here, we analyze a single vocal pathology cohort, testing the performance of these alternative measures alongside classical measures. We present voice analysis pre- and postoperatively in 17 patients with unilateral vocal fold paralysis (UVFP). The patients underwent standard medialization thyroplasty surgery, and the voices were analyzed using jitter, shimmer, NHR, nonlinear recurrence period density entropy (RPDE), detrended fluctuation analysis (DFA), and correlation dimension. In addition, we similarly analyzed 11 healthy controls. Systematizing the preanalysis editing of the recordings, we found that the novel measures were more stable and, hence, more reliable than the classical measures on healthy controls. RPDE and jitter are sensitive to improvements pre- to postoperation. Shimmer, NHR, and DFA showed no significant change (P > 0.05). All measures detect statistically significant and clinically important differences between controls and patients, both treated and untreated (P < 0.001, area under curve [AUC] > 0.7). Pre- to postoperation grade, roughness, breathiness, asthenia, and strain (GRBAS) ratings show statistically significant and clinically important improvement in overall dysphonia grade (G) (AUC = 0.946, P < 0.001). Recalculating AUCs from other study data, we compare these results in terms of clinical importance. We conclude that, when preanalysis editing is systematized, nonlinear random measures may be useful for monitoring UVFP-treatment effectiveness, and there may be applications to other forms of dysphonia. PMID:19900790

  16. Quantifying the statistical importance of utilizing regression over classic energy intensity calculations for tracking efficiency improvements in industry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nimbalkar, Sachin U.; Wenning, Thomas J.; Guo, Wei

    In the United States, manufacturing facilities accounted for about 32% of total domestic energy consumption in 2014. Robust energy tracking methodologies are critical to understanding energy performance in manufacturing facilities. Due to its simplicity and intuitiveness, the classic energy intensity method (i.e., the ratio of total energy use over total production) is the most widely adopted. However, the classic energy intensity method does not take into account the variation of other relevant parameters (i.e., product type, feedstock type, weather, etc.). Furthermore, the energy intensity method assumes that the facility's base energy consumption (energy use at zero production) is zero, which rarely holds true. Therefore, it is commonly recommended to utilize regression models rather than the energy intensity approach for tracking improvements at the facility level. Unfortunately, many energy managers have difficulty understanding why regression models are statistically better than the classic energy intensity method. While anecdotes and qualitative information may convince some, many have major reservations about the accuracy of regression models and whether it is worth the time and effort to gather data and build quality regression models. This paper explains why regression models are theoretically and quantitatively more accurate for tracking energy performance improvements. Based on the analysis of data from 114 manufacturing plants over 12 years, this paper presents quantitative results on the importance of utilizing regression models over the energy intensity methodology. This paper also documents scenarios where regression models do not have significant relevance over the energy intensity method.
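
    The core argument is easy to reproduce numerically: with a nonzero base load, the energy-per-unit ratio swings with production volume even when efficiency is unchanged, while a regression with an intercept recovers both the marginal rate and the base load. The data below are made up for illustration, and the units are nominal.

      # Toy illustration of the paper's point: a facility with a base load makes
      # the classic intensity ratio unstable, while regression with an intercept
      # recovers the true marginal energy rate and base load.
      import numpy as np

      rng = np.random.default_rng(3)
      production = rng.uniform(50, 150, 24)        # 24 months of production units
      base, slope = 400.0, 2.0                     # base load + energy per unit
      energy = base + slope * production + rng.normal(0, 10, 24)

      intensity = energy / production              # classic metric: varies with volume
      coef = np.polyfit(production, energy, 1)     # regression: slope and intercept
      print(f"intensity ranges {intensity.min():.1f}-{intensity.max():.1f} per unit")
      print(f"regression recovers slope {coef[0]:.2f} and base load {coef[1]:.0f}")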

  17. Retrograde pyelogram using the flexible cystoscope.

    PubMed

    Reddy, P K; Hulbert, J C

    1986-12-01

    A retrograde pyelogram was performed on 2 men with the flexible choledochonephroscope and a 5F whistle-tip ureteral catheter. The procedure was done on an outpatient basis with topical anesthesia and patient tolerance was good. The technique is simple and is a useful alternative to the classical rigid cystoscopic technique.

  18. A Diagnostics Tool to detect ensemble forecast system anomaly and guide operational decisions

    NASA Astrophysics Data System (ADS)

    Park, G. H.; Srivastava, A.; Shrestha, E.; Thiemann, M.; Day, G. N.; Draijer, S.

    2017-12-01

    The hydrologic community is moving toward using ensemble forecasts to take uncertainty into account during the decision-making process. The New York City Department of Environmental Protection (DEP) implements several types of ensemble forecasts in its decision-making process: ensemble products for a statistical model (Hirsch and enhanced Hirsch); the National Weather Service (NWS) Advanced Hydrologic Prediction Service (AHPS) forecasts based on the classical Ensemble Streamflow Prediction (ESP) technique; and the new NWS Hydrologic Ensemble Forecasting Service (HEFS) forecasts. To remove structural error and apply the forecasts to additional forecast points, the DEP post-processes both the AHPS and the HEFS forecasts. These ensemble forecasts provide vast quantities of complex data, and drawing conclusions from them is time-consuming and difficult. The complexity of these forecasts also makes it difficult to identify system failures resulting from poor data, missing forecasts, and server breakdowns. To address these issues, we developed a diagnostic tool that summarizes ensemble forecasts and provides additional information such as historical forecast statistics, forecast skill, and model forcing statistics. This additional information highlights the key information that enables operators to evaluate the forecast in real time, dynamically interact with the data, and review additional statistics, if needed, to make better decisions. We used Bokeh, a Python interactive visualization library, and a multi-database management system to create this interactive tool. The tool compiles and stores data into HTML pages that allow operators to readily analyze the data with built-in user interaction features. This paper will present a brief description of the ensemble forecasts, forecast verification results, and the intended applications for the diagnostic tool.
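
    A minimal Bokeh view of the kind described, ensemble traces with a summary overlay saved as an interactive HTML page, might look like the sketch below. The data and layout are invented; the DEP tool's actual statistics and pages are not reproduced here.

      # Hedged sketch of an interactive ensemble summary page with Bokeh
      # (made-up forecast traces, not the DEP tool).
      import numpy as np
      from bokeh.plotting import figure, output_file, save

      rng = np.random.default_rng(6)
      days = np.arange(30)
      ensemble = 100 + rng.normal(0, 10, (50, 30)).cumsum(axis=1)  # 50 traces

      p = figure(title="Ensemble streamflow forecast",
                 x_axis_label="lead time (days)", y_axis_label="flow (cfs)")
      for trace in ensemble:
          p.line(days, trace, line_alpha=0.15)     # individual ensemble members
      p.line(days, np.median(ensemble, 0), line_width=3, legend_label="median")

      output_file("ensemble_forecast.html")        # HTML page with pan/zoom tools
      save(p)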

  19. GWAR: robust analysis and meta-analysis of genome-wide association studies.

    PubMed

    Dimou, Niki L; Tsirigos, Konstantinos D; Elofsson, Arne; Bagos, Pantelis G

    2017-05-15

    In the context of genome-wide association studies (GWAS), a variety of statistical techniques are available for conducting the analysis, but in most cases the underlying genetic model is unknown. Under these circumstances, the classical Cochran-Armitage trend test (CATT) is suboptimal. Robust procedures that maximize the power and preserve the nominal type I error rate are preferable. Moreover, performing a meta-analysis using robust procedures is of great interest and had never been addressed in the past. The primary goal of this work is to implement several robust methods for analysis and meta-analysis in the statistical package Stata and subsequently to make the software available to the scientific community. The CATT under recessive, additive, and dominant models of inheritance, as well as robust methods based on the Maximum Efficiency Robust Test statistic, the MAX statistic, and MIN2, were implemented in Stata. Concerning MAX and MIN2, we calculated their asymptotic null distributions relying on numerical integration, resulting in a great gain in computational time without losing accuracy. All the aforementioned approaches were employed in a fixed- or random-effects meta-analysis setting using summary data with weights equal to the reciprocal of the combined cases and controls. Overall, this is the first complete effort to implement procedures for analysis and meta-analysis in GWAS using Stata. A Stata program and a web server are freely available for academic users at http://www.compgen.org/tools/GWAR. pbagos@compgen.org. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
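
    For reference, the classical CATT baseline that the robust procedures improve upon is short to implement. The sketch below uses additive weights (0, 1, 2) and an asymptotic normal approximation; the genotype counts are invented for illustration.

      # Cochran-Armitage trend test (CATT) sketch with additive weights.
      import numpy as np
      from scipy.stats import norm

      def catt(cases, controls, w=(0.0, 1.0, 2.0)):
          cases = np.asarray(cases, float)
          controls = np.asarray(controls, float)
          w = np.asarray(w, float)
          n_i = cases + controls                 # per-genotype totals
          n, R = n_i.sum(), cases.sum()
          p = n_i / n
          U = w @ cases - R * (w @ p)            # observed minus expected weighted count
          var = R * (n - R) / (n - 1) * ((w**2) @ p - (w @ p)**2)
          z = U / np.sqrt(var)
          return z, 2 * norm.sf(abs(z))          # two-sided p-value

      z, p = catt(cases=[120, 240, 140], controls=[180, 250, 90])
      print(f"Z = {z:.2f}, p = {p:.2e}")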

  20. Reconstruction of early phase deformations by integrated magnetic and mesotectonic data evaluation

    NASA Astrophysics Data System (ADS)

    Sipos, András A.; Márton, Emő; Fodor, László

    2018-02-01

    Markers of brittle faulting are widely used for recovering past deformation phases. Rocks often have oriented magnetic fabrics, which can be interpreted as connected to ductile deformation before cementation of the sediment. This paper reports a novel statistical procedure for simultaneous evaluation of AMS (Anisotropy of Magnetic Susceptibility) and fault-slip data. The new method analyzes the AMS data, without linearization techniques, so that weak AMS lineation and rotational AMS can be assessed that are beyond the scope of classical methods. This idea is extended to the evaluation of fault-slip data. While the traditional assumptions of stress inversion are not rejected, the method recovers the stress field via statistical hypothesis testing. In addition it provides statistical information needed for the combined evaluation of the AMS and the mesotectonic (0.1 to 10 m) data. In the combined evaluation a statistical test is carried out that helps to decide if the AMS lineation and the mesotectonic markers (in case of repeated deformation of the oldest set of markers) were formed in the same or different deformation phases. If this condition is met, the combined evaluation can improve the precision of the reconstruction. When the two data sets do not have a common solution for the direction of the extension, the deformational origin of the AMS is questionable. In this case the orientation of the stress field responsible for the AMS lineation might be different from that which caused the brittle deformation. Although most of the examples demonstrate the reconstruction of weak deformations in sediments, the new method is readily applicable to investigate the ductile-brittle transition of any rock formation as long as AMS and fault-slip data are available.

  1. Machine learning methods as a tool to analyse incomplete or irregularly sampled radon time series data.

    PubMed

    Janik, M; Bossew, P; Kurihara, O

    2018-07-15

    Machine learning is a class of statistical techniques which has proven to be a powerful tool for modelling the behaviour of complex systems, in which response quantities depend on assumed controls or predictors in a complicated way. In this paper, as our first purpose, we propose the application of machine learning to reconstruct incomplete or irregularly sampled data of time series of indoor radon (²²²Rn). The physical assumption underlying the modelling is that the Rn concentration in air is controlled by environmental variables such as air temperature and pressure. The algorithms "learn" from complete sections of the multivariate series, derive a dependence model, and apply it to sections where the controls are available but not the response (Rn), and in this way complete the Rn series. Three machine learning techniques are applied in this study, namely random forest, its extension called the gradient boosting machine, and deep learning. For comparison, we apply classical multiple regression in a generalized linear model version. Performance of the models is evaluated through different metrics. The performance of the gradient boosting machine is found to be superior to that of the other techniques. By applying these learning machines, we show, as our second purpose, that missing data or periods of Rn series data can be reconstructed and resampled on a regular grid reasonably well, if data for appropriate physical controls are available. The techniques also identify to which degree the assumed controls contribute to imputing missing Rn values. Our third purpose, though no less important from the viewpoint of physics, is identifying to which degree physical, in this case environmental, variables are relevant as Rn predictors, or in other words, which predictors explain most of the temporal variability of Rn. We show that the variables which contribute most to the Rn series reconstruction are temperature, relative humidity, and day of the year. The first two are physical predictors, while "day of the year" is a statistical proxy or surrogate for missing or unknown predictors. Copyright © 2018 Elsevier B.V. All rights reserved.
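
    The reconstruction idea generalizes readily: fit a model of Rn on the environmental controls where the series is complete, then predict where it is not. The sketch below uses a random forest (one of the three techniques applied); the synthetic data, column names, and hyperparameters are illustrative assumptions.

      # Hedged sketch of the imputation idea, random-forest variant: learn Rn as
      # a function of environmental controls on complete time steps, then fill
      # the gaps where only the controls are recorded. All data are made up.
      import numpy as np
      import pandas as pd
      from sklearn.ensemble import RandomForestRegressor

      rng = np.random.default_rng(4)
      t = pd.date_range("2017-01-01", periods=2000, freq="H")
      df = pd.DataFrame({"temperature": 15 + 10*np.sin(np.arange(2000)/24),
                         "pressure": 1013 + rng.normal(0, 5, 2000),
                         "humidity": rng.uniform(30, 90, 2000)}, index=t)
      df["radon"] = 40 - 1.5*df.temperature + 0.2*df.pressure + rng.normal(0, 5, 2000)
      df.loc[df.sample(frac=0.3, random_state=0).index, "radon"] = np.nan  # 30% gaps

      known = df.radon.notna()
      controls = ["temperature", "pressure", "humidity"]
      rf = RandomForestRegressor(n_estimators=200, random_state=0)
      rf.fit(df.loc[known, controls], df.loc[known, "radon"])
      df.loc[~known, "radon"] = rf.predict(df.loc[~known, controls])

      # Feature importances indicate which controls explain Rn variability.
      print(dict(zip(controls, rf.feature_importances_.round(2))))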

  2. Improved Statistical Fault Detection Technique and Application to Biological Phenomena Modeled by S-Systems.

    PubMed

    Mansouri, Majdi; Nounou, Mohamed N; Nounou, Hazem N

    2017-09-01

    In our previous work, we demonstrated the effectiveness of the linear multiscale principal component analysis (PCA)-based moving window (MW)-generalized likelihood ratio test (GLRT) technique over the classical PCA and multiscale principal component analysis (MSPCA)-based GLRT methods. The developed fault detection algorithm provided optimal properties by maximizing the detection probability for a particular false alarm rate (FAR) for different window sizes. However, most real systems are nonlinear, and the linear PCA method cannot tackle this nonlinearity to a great extent. Thus, in this paper, first, we apply a nonlinear PCA to obtain an accurate principal component of a set of data and handle a wide range of nonlinearities using the kernel principal component analysis (KPCA) model. The KPCA is among the most popular nonlinear statistical methods. Second, we extend the MW-GLRT technique to one that applies exponential weights to the residuals in the moving window (instead of equal weighting), as this can further improve fault detection performance by reducing the FAR using an exponentially weighted moving average (EWMA). The developed detection method, called EWMA-GLRT, provides improved properties, such as smaller missed detection rates, smaller FARs, and a smaller average run length. The idea behind the developed EWMA-GLRT is to compute a new GLRT statistic that integrates current and previous data information in a decreasing exponential fashion, giving more weight to the more recent data. This provides a more accurate estimation of the GLRT statistic and a stronger memory that enables better decision making with respect to fault detection. Therefore, in this paper, a KPCA-based EWMA-GLRT method is developed and applied in practice to improve fault detection in biological phenomena modeled by S-systems and to enhance the monitoring of the process mean. The idea is to combine the advantages of the proposed EWMA-GLRT fault detection chart with the KPCA model. It is used to enhance fault detection in the Cad System in E. coli model by monitoring some of the key variables involved in this model, such as enzymes, transport proteins, regulatory proteins, lysine, and cadaverine. The results demonstrate the effectiveness of the proposed KPCA-based EWMA-GLRT method over the Q, GLRT, EWMA, Shewhart, and MW-GLRT methods. The detection performance is assessed and evaluated in terms of FAR, missed detection rates, and average run length (ARL1) values.
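
    The exponential-weighting idea at the heart of the EWMA-GLRT chart can be isolated in a few lines: filter the model residuals with an EWMA and flag points that exceed variance-based control limits. This sketch is not the authors' KPCA pipeline; the smoothing factor, limit width, and simulated fault are illustrative assumptions.

      # Minimal EWMA monitoring sketch: exponentially weighted residuals with
      # the standard variance-based control limits.
      import numpy as np

      def ewma_chart(residuals, lam=0.2, L=3.0):
          z, sigma2 = 0.0, np.var(residuals[:50])  # baseline variance, early data
          flags = []
          for t, r in enumerate(residuals, 1):
              z = lam * r + (1 - lam) * z          # more weight on recent residuals
              var_z = sigma2 * lam / (2 - lam) * (1 - (1 - lam)**(2 * t))
              flags.append(abs(z) > L * np.sqrt(var_z))
          return np.array(flags)

      rng = np.random.default_rng(5)
      res = rng.normal(0, 1, 300)
      res[200:] += 1.5                             # simulated fault: mean shift
      alarms = ewma_chart(res)
      print("first alarm at sample", np.argmax(alarms))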

  3. Organ culture as a technique for causal embryology and its application in radiobiology (in German)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    BORGHESE, ELIO

    1961-11-01

    The classical methods of experimental embryology in amphibia are compared with the more recently introduced technique of in vitro culture of embryonic organs of warm-blooded animals. Some isolation and recombination experiments carried out by means of organ culture are described. It is shown, by examples taken from research in progress, how this technique is applicable to radiobiological experiments.

  4. A Gaussian wave packet phase-space representation of quantum canonical statistics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coughtrie, David J.; Tew, David P.

    2015-07-28

    We present a mapping of quantum canonical statistical averages onto a phase-space average over thawed Gaussian wave-packet (GWP) parameters, which is exact for harmonic systems at all temperatures. The mapping invokes an effective potential surface, experienced by the wave packets, and a temperature-dependent phase-space integrand, to correctly transition from the GWP average at low temperature to classical statistics at high temperature. Numerical tests on weakly and strongly anharmonic model systems demonstrate that thermal averages of the system energy and geometric properties are accurate to within 1% of the exact quantum values at all temperatures.

  5. [Bayesian statistics in medicine -- part II: main applications and inference].

    PubMed

    Montomoli, C; Nichelatti, M

    2008-01-01

    Bayesian statistics is not only used when one is dealing with 2-way tables; it can also be used for inferential purposes. Using the basic concepts presented in the first part, this paper aims to give a simple overview of Bayesian methods by introducing their foundation (Bayes' theorem) and then applying this rule to a very simple practical example; whenever possible, the elementary processes at the basis of the analysis are compared to those of frequentist (classical) statistical analysis. Bayesian reasoning is naturally connected to medical activity, since it appears to be quite similar to a diagnostic process.

  6. Models of dyadic social interaction.

    PubMed Central

    Griffin, Dale; Gonzalez, Richard

    2003-01-01

    We discuss the logic of research designs for dyadic interaction and present statistical models with parameters that are tied to psychologically relevant constructs. Building on Karl Pearson's classic nineteenth-century statistical analysis of within-organism similarity, we describe several approaches to indexing dyadic interdependence and provide graphical methods for visualizing dyadic data. We also describe several statistical and conceptual solutions to the 'levels of analysis' problem in analysing dyadic data. These analytic strategies allow the researcher to examine and measure psychological questions of interdependence and social influence. We provide illustrative data from casually interacting and romantic dyads. PMID:12689382
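
    One common index of dyadic interdependence in the spirit of Pearson's within-organism analysis is the double-entry intraclass correlation; the sketch below, with invented dyad scores, is an illustration of that idea rather than the authors' exact procedure.

        import numpy as np

        def pairwise_intraclass_r(x1, x2):
            """Double-entry intraclass correlation for exchangeable dyad
            members: each dyad contributes both (x1, x2) and (x2, x1)."""
            a = np.concatenate([x1, x2])
            b = np.concatenate([x2, x1])
            return np.corrcoef(a, b)[0, 1]

        # Hypothetical satisfaction scores for 5 dyads (member 1, member 2):
        m1 = np.array([4.0, 3.5, 5.0, 2.0, 4.5])
        m2 = np.array([4.5, 3.0, 4.8, 2.5, 4.0])
        print(pairwise_intraclass_r(m1, m2))   # index of interdependence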

  7. Fisher, Neyman, and Bayes at FDA.

    PubMed

    Rubin, Donald B

    2016-01-01

    The wise use of statistical ideas in practice essentially requires some Bayesian thinking, in contrast to the classical rigid frequentist dogma. This dogma too often has seemed to influence the applications of statistics, even at agencies like the FDA. Greg Campbell was one of the most important advocates there for more nuanced modes of thought, especially Bayesian statistics. Because two brilliant statisticians, Ronald Fisher and Jerzy Neyman, are often credited with instilling the traditional frequentist approach in current practice, I argue that both men were actually seeking very Bayesian answers, and neither would have endorsed the rigid application of their ideas.

  8. Simultaneous spectrophotometric determination of glimepiride and pioglitazone in binary mixture and combined dosage form using chemometric-assisted techniques

    NASA Astrophysics Data System (ADS)

    El-Zaher, Asmaa A.; Elkady, Ehab F.; Elwy, Hanan M.; Saleh, Mahmoud Abo El Makarim

    2017-07-01

    In the present work, pioglitazone and glimepiride, two widely used antidiabetics, were simultaneously determined by a chemometric-assisted UV-spectrophotometric method applied to a binary synthetic mixture and a pharmaceutical preparation containing both drugs. Three chemometric techniques - concentration residual augmented classical least-squares (CRACLS), principal component regression (PCR), and partial least-squares (PLS) - were implemented using synthetic mixtures containing the two drugs in acetonitrile. The absorbance data matrix corresponding to the concentration data matrix was obtained by measuring absorbances between 215 and 235 nm at intervals of Δλ = 0.4 nm in the zero-order spectra. Calibration (regression) models were then built from the absorbance and concentration data matrices to predict the unknown concentrations of pioglitazone and glimepiride in their mixtures. The described techniques were validated by analyzing synthetic mixtures containing the two drugs, showing good mean recovery values between 98 and 100%. In addition, the accuracy and precision of the three methods were assured by recovery values between 98 and 102% and R.S.D.% <0.6 for intra-day precision and <1.2 for inter-day precision. The proposed chemometric techniques were successfully applied to a pharmaceutical preparation containing a combination of pioglitazone and glimepiride in a 30:4 ratio, showing good recovery values. Finally, statistical analysis was carried out to further verify the proposed methods, both by an intrinsic comparison among the three chemometric techniques and by comparing the present results with those obtained by reference pharmacopeial methods for each of pioglitazone and glimepiride.
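
    A minimal sketch of the PLS branch of such a multivariate calibration, using scikit-learn; the simulated spectra, the 51-point wavelength grid (215-235 nm at 0.4 nm steps), and the component count are assumptions standing in for the real absorbance data.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        # Hypothetical calibration set: 20 synthetic mixtures x 51 wavelengths,
        # with known concentrations of the two analytes.
        rng = np.random.default_rng(1)
        concentrations = rng.uniform(1, 30, size=(20, 2))   # [pioglitazone, glimepiride]
        pure_spectra = rng.uniform(0.01, 0.05, size=(2, 51))
        spectra = concentrations @ pure_spectra             # Beer-Lambert mixing
        spectra += rng.normal(scale=1e-3, size=spectra.shape)  # instrument noise

        pls = PLSRegression(n_components=2).fit(spectra, concentrations)
        unknown = spectra[:3]           # pretend these rows are unknowns
        print(pls.predict(unknown))     # predicted concentrations of both drugs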

  9. Classical boson sampling algorithms with superior performance to near-term experiments

    NASA Astrophysics Data System (ADS)

    Neville, Alex; Sparrow, Chris; Clifford, Raphaël; Johnston, Eric; Birchall, Patrick M.; Montanaro, Ashley; Laing, Anthony

    2017-12-01

    It is predicted that quantum computers will dramatically outperform their conventional counterparts. However, large-scale universal quantum computers are yet to be built. Boson sampling is a rudimentary quantum algorithm tailored to the platform of linear optics, which has sparked interest as a rapid way to demonstrate such quantum supremacy. Photon statistics are governed by intractable matrix functions, which suggests that sampling from the distribution obtained by injecting photons into a linear optical network could be solved more quickly by a photonic experiment than by a classical computer. The apparently low resource requirements for large boson sampling experiments have raised expectations of a near-term demonstration of quantum supremacy by boson sampling. Here we present classical boson sampling algorithms and theoretical analyses of prospects for scaling boson sampling experiments, showing that near-term quantum supremacy via boson sampling is unlikely. Our classical algorithm, based on Metropolised independence sampling, allowed the boson sampling problem to be solved for 30 photons with standard computing hardware. Compared to current experiments, a demonstration of quantum supremacy over a successful implementation of these classical methods on a supercomputer would require the number of photons and experimental components to increase by orders of magnitude, while tackling exponentially scaling photon loss.
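
    The following is a generic Metropolised independence sampler on a toy one-dimensional target, illustrating the accept/reject rule that such algorithms are built on; the distributions here are placeholders, not the paper's permanent-based boson sampling target.

        import numpy as np

        def metropolised_independence_sampler(log_target, sample_q, log_q, n, rng):
            """Metropolised independence sampling: proposals come from a fixed
            distribution q and are accepted with probability
            min(1, p(y) q(x) / (p(x) q(y)))."""
            x = sample_q(rng)
            chain = []
            for _ in range(n):
                y = sample_q(rng)
                log_alpha = (log_target(y) - log_target(x)) + (log_q(x) - log_q(y))
                if np.log(rng.uniform()) < log_alpha:
                    x = y                       # accept the proposal
                chain.append(x)
            return np.array(chain)

        # Toy usage: target N(2, 1) sampled through a broad N(0, 3) proposal.
        rng = np.random.default_rng(0)
        chain = metropolised_independence_sampler(
            log_target=lambda x: -0.5 * (x - 2.0) ** 2,
            sample_q=lambda rng: rng.normal(0.0, 3.0),
            log_q=lambda x: -0.5 * (x / 3.0) ** 2,
            n=10000, rng=rng)
        print(chain.mean())   # should be close to 2.0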

  10. Universal scaling for the quantum Ising chain with a classical impurity

    NASA Astrophysics Data System (ADS)

    Apollaro, Tony J. G.; Francica, Gianluca; Giuliano, Domenico; Falcone, Giovanni; Palma, G. Massimo; Plastina, Francesco

    2017-10-01

    We study finite-size scaling for the magnetic observables of an impurity residing at the end point of an open quantum Ising chain with transverse magnetic field, realized by locally rescaling the field by a factor μ ≠1 . In the homogeneous chain limit at μ =1 , we find the expected finite-size scaling for the longitudinal impurity magnetization, with no specific scaling for the transverse magnetization. At variance, in the classical impurity limit μ =0 , we recover finite scaling for the longitudinal magnetization, while the transverse one basically does not scale. We provide both analytic approximate expressions for the magnetization and the susceptibility as well as numerical evidences for the scaling behavior. At intermediate values of μ , finite-size scaling is violated, and we provide a possible explanation of this result in terms of the appearance of a second, impurity-related length scale. Finally, by going along the standard quantum-to-classical mapping between statistical models, we derive the classical counterpart of the quantum Ising chain with an end-point impurity as a classical Ising model on a square lattice wrapped on a half-infinite cylinder, with the links along the first circle modified as a function of μ .

  11. Statistical speed of quantum states: Generalized quantum Fisher information and Schatten speed

    NASA Astrophysics Data System (ADS)

    Gessner, Manuel; Smerzi, Augusto

    2018-02-01

    We analyze families of measures for the quantum statistical speed which include as special cases the quantum Fisher information, the trace speed, i.e., the quantum statistical speed obtained from the trace distance, and more general quantifiers obtained from the family of Schatten norms. These measures quantify the statistical speed under generic quantum evolutions and are obtained by maximizing classical measures over all possible quantum measurements. We discuss general properties, optimal measurements, and upper bounds on the speed of separable states. We further provide a physical interpretation for the trace speed by linking it to an analog of the quantum Cramér-Rao bound for median-unbiased quantum phase estimation.

  12. Quantum work in the Bohmian framework

    NASA Astrophysics Data System (ADS)

    Sampaio, R.; Suomela, S.; Ala-Nissila, T.; Anders, J.; Philbin, T. G.

    2018-01-01

    At nonzero temperature classical systems exhibit statistical fluctuations of thermodynamic quantities arising from the variation of the system's initial conditions and its interaction with the environment. The fluctuating work, for example, is characterized by the ensemble of system trajectories in phase space and, by including the probabilities for various trajectories to occur, a work distribution can be constructed. However, without phase-space trajectories, the task of constructing a work probability distribution in the quantum regime has proven elusive. Here we use quantum trajectories in phase space and define fluctuating work as power integrated along the trajectories, in complete analogy to classical statistical physics. The resulting work probability distribution is valid for any quantum evolution, including cases with coherences in the energy basis. We demonstrate the quantum work probability distribution and its properties with an exactly solvable example of a driven quantum harmonic oscillator. An important feature of the work distribution is its dependence on the initial statistical mixture of pure states, which is reflected in higher moments of the work. The proposed approach introduces a fundamentally different perspective on quantum thermodynamics, allowing full thermodynamic characterization of the dynamics of quantum systems, including the measurement process.

  13. Quantum machine learning: a classical perspective

    NASA Astrophysics Data System (ADS)

    Ciliberto, Carlo; Herbster, Mark; Ialongo, Alessandro Davide; Pontil, Massimiliano; Rocchetto, Andrea; Severini, Simone; Wossnig, Leonard

    2018-01-01

    Recently, increased computational power and data availability, as well as algorithmic advances, have led machine learning (ML) techniques to impressive results in regression, classification, data generation and reinforcement learning tasks. Despite these successes, the proximity to the physical limits of chip fabrication alongside the increasing size of datasets is motivating a growing number of researchers to explore the possibility of harnessing the power of quantum computation to speed up classical ML algorithms. Here we review the literature in quantum ML and discuss perspectives for a mixed readership of classical ML and quantum computation experts. Particular emphasis will be placed on clarifying the limitations of quantum algorithms, how they compare with their best classical counterparts and why quantum resources are expected to provide advantages for learning problems. Learning in the presence of noise and certain computationally hard problems in ML are identified as promising directions for the field. Practical questions, such as how to upload classical data into quantum form, will also be addressed.

  14. Quantum machine learning: a classical perspective

    PubMed Central

    Ciliberto, Carlo; Herbster, Mark; Ialongo, Alessandro Davide; Pontil, Massimiliano; Severini, Simone; Wossnig, Leonard

    2018-01-01

    Recently, increased computational power and data availability, as well as algorithmic advances, have led machine learning (ML) techniques to impressive results in regression, classification, data generation and reinforcement learning tasks. Despite these successes, the proximity to the physical limits of chip fabrication alongside the increasing size of datasets is motivating a growing number of researchers to explore the possibility of harnessing the power of quantum computation to speed up classical ML algorithms. Here we review the literature in quantum ML and discuss perspectives for a mixed readership of classical ML and quantum computation experts. Particular emphasis will be placed on clarifying the limitations of quantum algorithms, how they compare with their best classical counterparts and why quantum resources are expected to provide advantages for learning problems. Learning in the presence of noise and certain computationally hard problems in ML are identified as promising directions for the field. Practical questions, such as how to upload classical data into quantum form, will also be addressed. PMID:29434508

  15. Quantum machine learning: a classical perspective.

    PubMed

    Ciliberto, Carlo; Herbster, Mark; Ialongo, Alessandro Davide; Pontil, Massimiliano; Rocchetto, Andrea; Severini, Simone; Wossnig, Leonard

    2018-01-01

    Recently, increased computational power and data availability, as well as algorithmic advances, have led machine learning (ML) techniques to impressive results in regression, classification, data generation and reinforcement learning tasks. Despite these successes, the proximity to the physical limits of chip fabrication alongside the increasing size of datasets is motivating a growing number of researchers to explore the possibility of harnessing the power of quantum computation to speed up classical ML algorithms. Here we review the literature in quantum ML and discuss perspectives for a mixed readership of classical ML and quantum computation experts. Particular emphasis will be placed on clarifying the limitations of quantum algorithms, how they compare with their best classical counterparts and why quantum resources are expected to provide advantages for learning problems. Learning in the presence of noise and certain computationally hard problems in ML are identified as promising directions for the field. Practical questions, such as how to upload classical data into quantum form, will also be addressed.

  16. Experimental multiplexing of quantum key distribution with classical optical communication

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Liu-Jun; Chen, Luo-Kan; Ju, Lei

    2015-02-23

    We demonstrate the realization of quantum key distribution (QKD) combined with classical optical communication and synchronous signals within a single optical fiber. In the experiment, the classical communication sources use Fabry-Pérot (FP) lasers, which are implemented extensively in optical access networks. To perform QKD, multistage band-stop filtering techniques are developed, and a wavelength-division multiplexing scheme is designed for the multi-longitudinal-mode FP lasers. We have managed to maintain sufficient isolation among the quantum channel, the synchronous channel and the classical channels to guarantee good QKD performance. Finally, the quantum bit error rate remains below 2% across the entire practical application range. The proposed multiplexing scheme can ensure low classical light loss, and enables QKD over fiber lengths of up to 45 km simultaneously when the fibers are populated with bidirectional FP laser communications. Our demonstration paves the way for application of QKD to current optical access networks, where FP lasers are widely used by the end users.

  17. The application of compressed sensing to long-term acoustic emission-based structural health monitoring

    NASA Astrophysics Data System (ADS)

    Cattaneo, Alessandro; Park, Gyuhae; Farrar, Charles; Mascareñas, David

    2012-04-01

    The acoustic emission (AE) phenomena generated by a rapid release in the internal stress of a material represent a promising technique for structural health monitoring (SHM) applications. AE events typically result in a discrete number of short-time, transient signals. The challenge associated with capturing these events using classical techniques is that very high sampling rates must be used over extended periods of time. The result is that a very large amount of data is collected to capture a phenomenon that rarely occurs. Furthermore, the high energy consumption associated with the required high sampling rates makes the implementation of high-endurance, low-power, embedded AE sensor nodes difficult to achieve. The relatively rare occurrence of AE events over long time scales implies that these measurements are inherently sparse in the spike domain. The sparse nature of AE measurements makes them an attractive candidate for the application of compressed sampling techniques. Collecting compressed measurements of sparse AE signals will relax the requirements on the sampling rate and memory demands. The focus of this work is to investigate the suitability of compressed sensing techniques for AE-based SHM. The work explores estimating AE signal statistics in the compressed domain for low-power classification applications. In the event compressed classification finds an event of interest, ℓ1-norm minimization will be used to reconstruct the measurement for further analysis. The impact of structured noise on compressive measurements is specifically addressed. The suitability of a particular algorithm, called Justice Pursuit, to increase robustness to a small amount of arbitrary measurement corruption is investigated.
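
    A minimal basis-pursuit sketch of the ℓ1-norm reconstruction step, posing min ||x||_1 subject to Ax = b as a linear program; the random measurement matrix and sparse spike signal are simulated assumptions, not the paper's AE data or the Justice Pursuit algorithm.

        import numpy as np
        from scipy.optimize import linprog

        def basis_pursuit(A, b):
            """l1-norm minimization as a linear program: split x = u - v
            with u, v >= 0 and minimize sum(u) + sum(v) s.t. A(u - v) = b."""
            m, n = A.shape
            c = np.ones(2 * n)
            A_eq = np.hstack([A, -A])
            res = linprog(c, A_eq=A_eq, b_eq=b, bounds=[(0, None)] * (2 * n))
            u, v = res.x[:n], res.x[n:]
            return u - v

        # A 5-spike signal of length 128 recovered from 40 random projections.
        rng = np.random.default_rng(0)
        x_true = np.zeros(128)
        x_true[rng.choice(128, 5, replace=False)] = rng.normal(size=5)
        A = rng.normal(size=(40, 128)) / np.sqrt(40)
        x_hat = basis_pursuit(A, A @ x_true)
        print(np.max(np.abs(x_hat - x_true)))   # small reconstruction error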

  18. Replacement of Expensive, Disposable Instruments With Old-fashioned Surgical Techniques for Improved Cost-effectiveness in Laparoscopic Hysterectomy

    PubMed Central

    Morrison, John E.

    2004-01-01

    Objective: Patients demand that health care and procedures in rural areas be provided by ambulatory surgery centers close to home. However, the reimbursement rate for such procedures in ambulatory centers is extremely low, so a standard classic intrafascial supracervical hysterectomy procedure needs to be more cost effective to be performed there. Instruments and disposable devices can make up ≥50% of hospital costs for this procedure, so any cost reduction has to focus on this aspect. Methods: We identified the 3 most expensive disposable devices: (1) an Endostapler, US $498 and 3 staple reloads, US $179 each; (2) a calibrated uterine resection tool 15 mm for encoring of the endocervical canal, US $853; and (3) a serrated edged macro morcellator for intraabdominal uterus morcellation, US $321, and substituted them using classic conservative surgical techniques. Results: From September 2001 to September 2002, we performed 26 procedures with this modified technique at an ambulatory surgery center with a follow-up of 6.7 (2 to 14) months. This modified operative technique was feasible; no conversions were necessary, and no complications occurred. Cost savings were US $2209 per procedure; additional costs were US $266.33 for suture material and an Endopouch, resulting in an overall savings of US $50,509.42. The disadvantage was an increase in operating room time of about 1 hour 20 minutes per case. Conclusion: These modifications in the classic intrafascial supracervical hysterectomy technique have proven to be feasible, safe, and highly cost effective, especially for a rural ambulatory surgery center. Long-term follow-up is necessary to further evaluate these operative modifications. PMID:15119671

  19. Physiologic Waveform Analysis for Early Detection of Hemorrhage during Transport and Higher Echelon Medical Care of Combat Casualties

    DTIC Science & Technology

    2014-03-01

    waveforms that are easier to measure than ABP (e.g., pulse oximeter waveforms); (3) a NIH SBIR Phase I proposal with Retia Medical to develop automated...the training dataset. Integrating the technique with non-invasive pulse transit time (PTT) was most effective. The integrated technique specifically...the peripheral ABP waveforms in the training dataset. These techniques included the rudimentary mean ABP technique, the classic pulse pressure times

  20. A comparison of mini-FLOTAC and FLOTAC with classic methods to diagnosing intestinal parasites of dogs from Brazil.

    PubMed

    Lima, Victor Fernando Santana; Cringoli, Giuseppe; Rinaldi, Laura; Monteiro, Maria Fernanda Melo; Calado, Andréa Maria Campos; Ramos, Rafael Antonio Nascimento; Meira-Santos, Patrícia Oliveira; Alves, Leucio Câmara

    2015-09-01

    Dogs may be affected by different species of gastrointestinal parasites, which are of great importance in veterinary medicine and public health. Several techniques for diagnosing these parasites have been proposed, but the differing performance of each method makes choosing the best technique difficult. In this study, the performance of two classic methods (i.e., the Willis and Hoffman techniques) and two recent techniques (i.e., FLOTAC and Mini-FLOTAC) for diagnosing gastrointestinal parasites of dogs was evaluated. Fecal samples (n = 127) of dogs divided in pools (n = 30) were collected and analyzed using the four techniques. Eggs and/or oocysts of gastrointestinal parasites were detected in 93.3% (28/30) of the samples. In particular, 20% (6/30) were detected by the Hoffman method, 53.3% (16/30) by the Willis technique, and 63.3% (19/30) and 90% (27/30) by Mini-FLOTAC and FLOTAC, respectively. Ancylostomatidae, Trichuris vulpis and Toxocara canis were the most frequent parasites detected. The FLOTAC and Mini-FLOTAC techniques were the most efficient tools for detecting eggs and/or oocysts of gastrointestinal parasites of dogs; their use is therefore recommended in routine veterinary laboratory practice. This study is the first report of the use of both techniques (i.e., FLOTAC and Mini-FLOTAC) for diagnosing parasites of dogs in Brazil.

  1. Classical subjective expected utility.

    PubMed

    Cerreia-Vioglio, Simone; Maccheroni, Fabio; Marinacci, Massimo; Montrucchio, Luigi

    2013-04-23

    We consider decision makers who know that payoff-relevant observations are generated by a process that belongs to a given class M, as postulated in Wald [Wald A (1950) Statistical Decision Functions (Wiley, New York)]. We incorporate this Waldean piece of objective information within an otherwise subjective setting à la Savage [Savage LJ (1954) The Foundations of Statistics (Wiley, New York)] and show that this leads to a two-stage subjective expected utility model that accounts for both state and model uncertainty.

  2. Asymmetrical flow field-flow fractionation with multi-angle light scattering and quasi-elastic light scattering for characterization of polymersomes: comparison with classical techniques.

    PubMed

    Till, Ugo; Gaucher-Delmas, Mireille; Saint-Aguet, Pascale; Hamon, Glenn; Marty, Jean-Daniel; Chassenieux, Christophe; Payré, Bruno; Goudounèche, Dominique; Mingotaud, Anne-Françoise; Violleau, Frédéric

    2014-12-01

    Polymersomes formed from amphiphilic block copolymers, such as poly(ethyleneoxide-b-ε-caprolactone) (PEO-b-PCL) or poly(ethyleneoxide-b-methylmethacrylate), were characterized by asymmetrical flow field-flow fractionation coupled with quasi-elastic light scattering (QELS), multi-angle light scattering (MALS), and refractive index detection, leading to the determination of their size, shape, and molecular weight. The method was cross-checked against more classical ones, such as batch dynamic and static light scattering, electron microscopy, and atomic force microscopy. The results show good complementarity among all the techniques, asymmetrical flow field-flow fractionation being the most pertinent when the sample contains several distinct populations.

  3. Potential utilization of the absolute point cumulative semivariogram technique for the evaluation of distribution coefficient.

    PubMed

    Külahci, Fatih; Sen, Zekâi

    2009-09-15

    The classical solid/liquid distribution coefficient, K(d), for radionuclides in water-sediment systems depends on many regional parameters such as flow, geology, pH, acidity, alkalinity, total hardness, and radioactivity concentration. Accounting for all these effects requires a regional analysis with an effective methodology, which in this paper is based on the cumulative semivariogram concept. Classical K(d) calculations are point-based and cannot represent regional patterns, so a regional calculation methodology is proposed through the use of the Absolute Point Cumulative SemiVariogram (APCSV) technique. The application of the methodology is presented for (137)Cs and (90)Sr measurements at a set of points in the Keban Dam reservoir, Turkey.
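
    For orientation, this sketch computes a classical experimental semivariogram, the building block that the APCSV extends; the coordinates, K(d)-like values, and lag bins are invented for illustration and do not reproduce the paper's APCSV formulation.

        import numpy as np

        def experimental_semivariogram(coords, values, bin_edges):
            """Classical experimental semivariogram:
            gamma(h) = (1 / 2N(h)) * sum over pairs at lag h of (z_i - z_j)^2."""
            d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
            sq = (values[:, None] - values[None, :]) ** 2
            i, j = np.triu_indices(len(values), k=1)   # each pair once
            gammas = []
            for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
                mask = (d[i, j] >= lo) & (d[i, j] < hi)
                gammas.append(0.5 * sq[i, j][mask].mean() if mask.any() else np.nan)
            return np.array(gammas)

        # Hypothetical sampling points with measured distribution coefficients:
        rng = np.random.default_rng(0)
        pts = rng.uniform(0, 10, size=(50, 2))
        kd = np.sin(pts[:, 0]) + rng.normal(scale=0.1, size=50)
        print(experimental_semivariogram(pts, kd, np.linspace(0, 5, 6)))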

  4. Specificity and timescales of cortical adaptation as inferences about natural movie statistics.

    PubMed

    Snow, Michoel; Coen-Cagli, Ruben; Schwartz, Odelia

    2016-10-01

    Adaptation is a phenomenological umbrella term under which a variety of temporal contextual effects are grouped. Previous models have shown that some aspects of visual adaptation reflect optimal processing of dynamic visual inputs, suggesting that adaptation should be tuned to the properties of natural visual inputs. However, the link between natural dynamic inputs and adaptation is poorly understood. Here, we extend a previously developed Bayesian modeling framework for spatial contextual effects to the temporal domain. The model learns temporal statistical regularities of natural movies and links these statistics to adaptation in primary visual cortex via divisive normalization, a ubiquitous neural computation. In particular, the model divisively normalizes the present visual input by the past visual inputs only to the degree that these are inferred to be statistically dependent. We show that this flexible form of normalization reproduces classical findings on how brief adaptation affects neuronal selectivity. Furthermore, prior knowledge acquired by the Bayesian model from natural movies can be modified by prolonged exposure to novel visual stimuli. We show that this updating can explain classical results on contrast adaptation. We also simulate the recent finding that adaptation maintains population homeostasis, namely, a balanced level of activity across a population of neurons with different orientation preferences. Consistent with previous disparate observations, our work further clarifies the influence of stimulus-specific and neuronal-specific normalization signals in adaptation.
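
    A minimal sketch of the divisive normalization computation the model builds on, with past inputs acting as the normalization pool; the semi-saturation constant, pool weight, and signals are illustrative assumptions, not the fitted Bayesian model.

        import numpy as np

        def divisive_normalization(drive, pool, sigma=1.0, w=1.0):
            """Canonical divisive normalization: each unit's drive is divided
            by a weighted sum of squared pool activity plus a semi-saturation
            constant sigma."""
            return drive / (sigma ** 2 + w * np.sum(pool ** 2))

        # Present input normalized by recent stimulus history (adaptation):
        present = np.array([2.0, 1.0, 0.5])
        past = np.array([1.5, 1.2, 0.8])
        print(divisive_normalization(present, past))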

  5. Specificity and timescales of cortical adaptation as inferences about natural movie statistics

    PubMed Central

    Snow, Michoel; Coen-Cagli, Ruben; Schwartz, Odelia

    2016-01-01

    Adaptation is a phenomenological umbrella term under which a variety of temporal contextual effects are grouped. Previous models have shown that some aspects of visual adaptation reflect optimal processing of dynamic visual inputs, suggesting that adaptation should be tuned to the properties of natural visual inputs. However, the link between natural dynamic inputs and adaptation is poorly understood. Here, we extend a previously developed Bayesian modeling framework for spatial contextual effects to the temporal domain. The model learns temporal statistical regularities of natural movies and links these statistics to adaptation in primary visual cortex via divisive normalization, a ubiquitous neural computation. In particular, the model divisively normalizes the present visual input by the past visual inputs only to the degree that these are inferred to be statistically dependent. We show that this flexible form of normalization reproduces classical findings on how brief adaptation affects neuronal selectivity. Furthermore, prior knowledge acquired by the Bayesian model from natural movies can be modified by prolonged exposure to novel visual stimuli. We show that this updating can explain classical results on contrast adaptation. We also simulate the recent finding that adaptation maintains population homeostasis, namely, a balanced level of activity across a population of neurons with different orientation preferences. Consistent with previous disparate observations, our work further clarifies the influence of stimulus-specific and neuronal-specific normalization signals in adaptation. PMID:27699416

  6. Influence of complaints and singing style in singers voice handicap.

    PubMed

    Moreti, Felipe; Ávila, Maria Emília Barros de; Rocha, Clara; Borrego, Maria Cristina de Menezes; Oliveira, Gisele; Behlau, Mara

    2012-01-01

    The aim of this research was to verify whether differences in singing style and the presence of vocal complaints influence singers' perception of voice handicap. One hundred eighteen singing voice handicap self-assessment protocols were selected: 17 popular singers with vocal complaints, 42 popular singers without complaints, 17 classical singers with complaints, and 42 classical singers without complaints. The groups were similar regarding age, gender and voice types. Both protocols used - the Modern Singing Handicap Index (MSHI) and the Classical Singing Handicap Index (CSHI) - have questions specific to their respective singing styles and consist of 30 items equally divided into three subscales: disability (functional domain), handicap (emotional domain) and impairment (organic domain), answered according to frequency of occurrence. Each subscale has a maximum of 40 points, and the total score is 120 points; the higher the score, the greater the perceived singing voice handicap. For statistical analysis, we used ANOVA with a 5% significance level. Classical and popular singers reported greater impairment, followed by disability and handicap; however, the degree of this perception varied with singing style and the presence of vocal complaints. Classical singers with vocal complaints showed a higher voice handicap than popular singers with vocal complaints, while classical singers without complaints reported a lower handicap than popular singers without complaints. This indicates that classical singers have a heightened perception of their own voice, and that vocal disturbances in this group may cause a greater voice handicap than in popular singers.

  7. Quantum walks with tuneable self-avoidance in one dimension

    PubMed Central

    Camilleri, Elizabeth; Rohde, Peter P.; Twamley, Jason

    2014-01-01

    Quantum walks exhibit many unique characteristics compared to classical random walks. In the classical setting, self-avoiding random walks have been studied as a variation on the usual classical random walk. Here the walker has memory of its previous locations and preferentially avoids stepping back to locations where it has previously resided. Classical self-avoiding random walks have found numerous algorithmic applications, most notably in the modelling of protein folding. We consider the analogous problem in the quantum setting – a quantum walk in one dimension with tunable levels of self-avoidance. We complement a quantum walk with a memory register that records where the walker has previously resided. The walker is then able to avoid returning back to previously visited sites or apply more general memory conditioned operations to control the walk. We characterise this walk by examining the variance of the walker's distribution against time, the standard metric for quantifying how quantum or classical a walk is. We parameterise the strength of the memory recording and the strength of the memory back-action on the walker, and investigate their effect on the dynamics of the walk. We find that by manipulating these parameters, which dictate the degree of self-avoidance, the walk can be made to reproduce ideal quantum or classical random walk statistics, or a plethora of more elaborate diffusive phenomena. In some parameter regimes we observe a close correspondence between classical self-avoiding random walks and the quantum self-avoiding walk. PMID:24762398
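
    A purely classical analogue with a tunable avoidance parameter can be simulated in a few lines; the redirection rule below is one simple illustrative choice under assumed parameters, not the paper's memory-register quantum walk.

        import numpy as np

        def tunable_self_avoiding_walk(steps, avoidance, rng):
            """Classical 1D random walk with tunable self-avoidance: with
            probability `avoidance`, a step into a previously visited site
            is redirected when the opposite neighbour is unvisited."""
            pos, visited = 0, {0}
            for _ in range(steps):
                step = int(rng.choice([-1, 1]))
                if pos + step in visited and pos - step not in visited:
                    if rng.uniform() < avoidance:
                        step = -step          # avoid revisiting
                pos += step
                visited.add(pos)
            return pos

        # The variance of the final position over many runs quantifies how
        # diffusive (classical) or spread-out the walk statistics are.
        rng = np.random.default_rng(0)
        finals = [tunable_self_avoiding_walk(200, 0.9, rng) for _ in range(2000)]
        print(np.var(finals))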

  8. The reliability of using postero-anterior cephalometry and cone-beam CT to determine transverse dimensions in clinical practice.

    PubMed

    Tai, Benjamin; Goonewardene, Mithran Suresh; Murray, Kevin; Koong, Bernard; Islam, Syed Mohammed Shamsul

    2014-11-01

    This study primarily aimed to assess the accuracy of classically-advocated reference points for the measurement of transverse jaw-base and dental relationships using conventional Postero-Anterior Cephalometry (PAC) and Cone-Beam Computed Tomography (CBCT). PAC and CBCT images were collected from 31 randomly selected orthodontic patients (12 males, 19 females), all of whom had a full permanent dentition. The transverse widths of the maxilla, mandible and the dentition were measured using reference points on both image modalities. Confidence intervals, intra-class coefficients and Bland-Altman plots were used to assess the measurement differences derived from the two acquisition methods. Measurements on PAC and CBCT images demonstrated statistically significant differences in the majority of the assessed variables. The interjugal (J-J) width was one of only two variables that did not demonstrate a statistically significant difference on image comparison. The mean difference in the antegonial width (Ag-Ag) (-4.44 mm, 95% CI -5.38 to -3.51) represented the greatest difference between the imaging techniques. The application of these points to a transverse skeletal analysis (J-J/Ag-Ag ratio) revealed that five of the 31 subjects (16%) recorded 'false positive' readings according to the derived data. It is recommended that clinicians be cautious when interpreting and making decisions related to transverse dimensions derived from a PAC. The PAC has a higher tendency to falsely identify individuals who require maxillary expansion procedures based on conventional clinical criteria. The errors primarily associated with identifying structures that represent the width of the mandible are significant in both PAC and CBCT techniques and require further investigation. It is postulated that the confounding effects of overlying soft tissues have a significant impact on a clinician's ability to identify relevant landmarks.

  9. Classical molecular dynamics simulations for non-equilibrium correlated plasmas

    NASA Astrophysics Data System (ADS)

    Ferri, S.; Calisti, A.; Talin, B.

    2017-03-01

    A classical molecular dynamics model was recently extended to simulate neutral multi-component plasmas in which various charge states of the same atom coexist with electrons. It is used to investigate plasma effects on the ion charge and on the ionization potential in dense plasmas. Several simulated statistical properties show that the concept of isolated particles is lost in such correlated plasmas. The charge equilibration is discussed for a carbon plasma at solid density, and the charge distribution and ionization potential depression (IPD) in aluminum plasmas are examined with reference to existing experiments.

  10. Signatures of chaos in the Brillouin zone.

    PubMed

    Barr, Aaron; Barr, Ariel; Porter, Max D; Reichl, Linda E

    2017-10-01

    When the classical dynamics of a particle in a finite two-dimensional billiard undergoes a transition to chaos, the quantum dynamics of the particle also shows manifestations of chaos in the form of scarring of wave functions and changes in energy level spacing distributions. If we "tile" an infinite plane with such billiards, we find that the Bloch states on the lattice undergo avoided crossings, energy level spacing statistics change from Poisson-like to Wigner-like, and energy sheets of the Brillouin zone begin to "mix" as the classical dynamics of the billiard changes from regular to chaotic behavior.
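
    The spacing statistics mentioned here can be probed with a few lines of NumPy; the Poisson-like and GOE toy spectra below are standard illustrations (a rigorous analysis would also unfold the spectrum), not the billiard's actual Bloch bands.

        import numpy as np

        def spacing_distribution(levels):
            """Nearest-neighbour level spacings normalized to unit mean.
            Poisson-like statistics, P(s) = exp(-s), signal regular dynamics;
            Wigner-like, P(s) = (pi/2) s exp(-pi s^2 / 4), signal chaos."""
            s = np.diff(np.sort(levels))
            return s / s.mean()

        rng = np.random.default_rng(0)
        poisson_levels = np.cumsum(rng.exponential(size=2000))  # uncorrelated
        H = rng.normal(size=(500, 500))
        H = (H + H.T) / 2                                       # GOE-like matrix
        wigner_levels = np.linalg.eigvalsh(H)

        # Level repulsion suppresses small spacings in the chaotic case:
        print(np.mean(spacing_distribution(poisson_levels) < 0.25),
              np.mean(spacing_distribution(wigner_levels) < 0.25))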

  11. ASSESSING THE IMPACTS OF ANTHROPOGENIC STRESSORS ON MACROINVERTEBRATE INDICATORS IN OHIO

    EPA Science Inventory

    In the past few years, there has been increasing interest in using biological community data to provide information about specific anthropogenic factors impacting streams. Previous studies have used statistical approaches that are variants of classical and modern multiple regres...

  12. Survey of statistical techniques used in validation studies of air pollution prediction models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bornstein, R D; Anderson, S F

    1979-03-01

    Statistical techniques used by meteorologists to validate predictions made by air pollution models are surveyed. Techniques are divided into the following three groups: graphical, tabular, and summary statistics. Some of the practical problems associated with verification are also discussed. Characteristics desired in any validation program are listed and a suggested combination of techniques that possesses many of these characteristics is presented.
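
    A sketch of the kind of summary statistics such validation programs report (bias, RMSE, correlation, and the fraction of predictions within a factor of two of observations); the paired concentration values are invented for illustration.

        import numpy as np

        def validation_summary(predicted, observed):
            """Common summary statistics for model validation: a sketch."""
            err = predicted - observed
            return {
                "bias": err.mean(),                   # systematic over/under-prediction
                "rmse": np.sqrt((err ** 2).mean()),   # overall error magnitude
                "correlation": np.corrcoef(predicted, observed)[0, 1],
                "fraction_within_2x": np.mean(
                    (predicted >= observed / 2) & (predicted <= observed * 2)),
            }

        # Hypothetical predicted vs measured pollutant concentrations:
        obs = np.array([12.0, 30.0, 7.5, 22.0, 15.0])
        pred = np.array([10.0, 35.0, 9.0, 18.0, 16.0])
        print(validation_summary(pred, obs))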

  13. Multilinear Graph Embedding: Representation and Regularization for Images.

    PubMed

    Chen, Yi-Lei; Hsu, Chiou-Ting

    2014-02-01

    Given a set of images, finding a compact and discriminative representation is still a big challenge especially when multiple latent factors are hidden in the way of data generation. To represent multifactor images, although multilinear models are widely used to parameterize the data, most methods are based on high-order singular value decomposition (HOSVD), which preserves global statistics but interprets local variations inadequately. To this end, we propose a novel method, called multilinear graph embedding (MGE), as well as its kernelization MKGE to leverage the manifold learning techniques into multilinear models. Our method theoretically links the linear, nonlinear, and multilinear dimensionality reduction. We also show that the supervised MGE encodes informative image priors for image regularization, provided that an image is represented as a high-order tensor. From our experiments on face and gait recognition, the superior performance demonstrates that MGE better represents multifactor images than classic methods, including HOSVD and its variants. In addition, the significant improvement in image (or tensor) completion validates the potential of MGE for image regularization.

  14. Cognitive methodology for forecasting oil and gas industry using pattern-based neural information technologies

    NASA Astrophysics Data System (ADS)

    Gafurov, O.; Gafurov, D.; Syryamkin, V.

    2018-05-01

    The paper analyses a field of computer science formed at the intersection of artificial intelligence, mathematical statistics, and database theory, referred to as "Data Mining" (knowledge discovery in data). The theory of neural networks is applied alongside classical methods of mathematical analysis and numerical simulation. The paper describes the technique protected by a patent of the Russian Federation for the invention “A Method for Determining Location of Production Wells during the Development of Hydrocarbon Fields” [1–3] and implemented using the geoinformation system NeuroInformGeo; there are no analogues in domestic or international practice. The paper gives an example comparing a geophysicist interpreter's forecast of oil reservoir quality made using standard methods with the forecast made using this technology. The technical result demonstrates increased efficiency, effectiveness, and environmental compatibility in the development of mineral deposits, and led to the discovery of a new oil deposit.

  15. Sub-millisecond ligand probing of cell receptors with multiple solution exchange

    PubMed Central

    Sylantyev, Sergiy; Rusakov, Dmitri A

    2013-01-01

    The accurate knowledge of receptor kinetics is crucial to our understanding of cell signal transduction in general and neural function in particular. The classical technique of probing membrane receptors on a millisecond scale involves placing a recording micropipette with a membrane patch in front of a double-barrel (θ-glass) application pipette mounted on a piezo actuator. Driven by electric pulses, the actuator can rapidly shift the θ-glass pipette tip, thus exposing the target receptors to alternating ligand solutions. However, membrane patches survive for only a few minutes, thus normally restricting such experiments to a single-application protocol. In order to overcome this deficiency, we have introduced pressurized supply microcircuits in the θ-glass channels, thus enabling repeated replacement of application solutions within 10–15 s. This protocol, which has been validated in our recent studies and takes 20–60 min to implement, allows the characterization of ligand-receptor interactions with high sensitivity, thereby also enabling a powerful paired-sample statistical design. PMID:23744290

  16. Bayesian generalized linear mixed modeling of Tuberculosis using informative priors.

    PubMed

    Ojo, Oluwatobi Blessing; Lougue, Siaka; Woldegerima, Woldegebriel Assefa

    2017-01-01

    TB is rated as one of the world's deadliest diseases, and South Africa ranks 9th among the 22 countries hardest hit by TB. Although much research has been carried out on this subject, this paper goes further by incorporating past knowledge into the model, using a Bayesian approach with an informative prior. The Bayesian approach to statistics is increasingly popular in data analysis, but most applications of Bayesian inference are limited to situations with non-informative priors, where there is no solid external information about the distribution of the parameter of interest. The main aim of this study is to profile people living with TB in South Africa. In this paper, identical regression models are fitted under the classical approach and under Bayesian approaches with both non-informative and informative priors, using the South Africa General Household Survey (GHS) data for the year 2014. For the Bayesian model with informative prior, the South Africa General Household Survey datasets for the years 2011 to 2013 are used to construct priors for the 2014 model.
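
    To illustrate the informative-prior idea in the simplest conjugate setting (not the paper's generalized linear mixed model), the sketch below encodes hypothetical earlier-survey counts as a Beta prior on TB prevalence and updates it with hypothetical 2014 counts; all numbers are assumptions.

        from scipy import stats

        # Informative prior from the 2011-2013 surveys: suppose roughly
        # 150 cases in 10,000 respondents (hypothetical), i.e. Beta(150, 9850).
        a_prior, b_prior = 150, 9850
        y, n = 18, 1200          # hypothetical 2014 survey: 18 cases of 1,200

        # Beta-binomial conjugacy: posterior is Beta(a + y, b + n - y).
        informative = stats.beta(a_prior + y, b_prior + (n - y))
        flat = stats.beta(1 + y, 1 + (n - y))   # non-informative Beta(1, 1) prior

        # The informative posterior interval is narrower, showing how prior
        # knowledge sharpens the estimate when data are limited.
        print("informative 95% CI:", informative.interval(0.95))
        print("flat 95% CI:       ", flat.interval(0.95))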

  17. Identification of volatile markers for indoor fungal growth and chemotaxonomic classification of Aspergillus species.

    PubMed

    Polizzi, Viviana; Adams, An; Malysheva, Svetlana V; De Saeger, Sarah; Van Peteghem, Carlos; Moretti, Antonio; Picco, Anna Maria; De Kimpe, Norbert

    2012-09-01

    Microbial volatile organic compounds (MVOCs) were collected in water-damaged buildings to evaluate their use as possible indicators of indoor fungal growth. Fungal species isolated from contaminated buildings were screened for MVOC production on malt extract agar by means of headspace solid-phase microextraction followed by gas chromatography-mass spectrometry (GC-MS) analysis. Some sesquiterpenes, specifically derived from fungal growth, were detected in the sampled environments and the corresponding fungal producers were identified. Statistical analysis of the detected MVOC profiles allowed the identification of species-specific MVOCs or MVOC patterns for the Aspergillus versicolor group, Aspergillus ustus, and Eurotium amstelodami. In addition, Chaetomium spp. and Epicoccum spp. were clearly differentiated by their volatile production from a group of 76 fungal strains belonging to different genera. These results are useful for the chemotaxonomic discrimination of fungal species, as an aid to classical morphological and molecular identification techniques.

  18. Vibrational monitor of early demineralization in tooth enamel after in vitro exposure to phosphoridic liquid

    NASA Astrophysics Data System (ADS)

    Pezzotti, Giuseppe; Adachi, Tetsuya; Gasparutti, Isabella; Vincini, Giulio; Zhu, Wenliang; Boffelli, Marco; Rondinella, Alfredo; Marin, Elia; Ichioka, Hiroaki; Yamamoto, Toshiro; Marunaka, Yoshinori; Kanamura, Narisato

    2017-02-01

    Raman spectroscopy has been applied to quantitatively assess the in vitro degree of demineralization in healthy human teeth. Based on previous evaluations of Raman selection rules (empowered by an orientation distribution function (ODF) statistical algorithm) and on a newly proposed analysis of the phonon density of states (PDOS) for selected vibrational modes of the hexagonal structure of hydroxyapatite, a molecular-scale evaluation of the demineralization process upon in vitro exposure to a highly acidic beverage (i.e., CocaCola™ Classic, pH = 2.5) could be obtained. The Raman method proved quite sensitive, and spectroscopic features could be directly related to an increase in off-stoichiometry of the enamel surface structure from the very early stages of the demineralization process (i.e., while still invisible to other conventional analytical techniques). The proposed Raman spectroscopic algorithm may possess some generality for caries risk assessment, allowing prompt, non-contact diagnostic practice in dentistry.

  19. The effect of repeated firings on the color change and surface roughness of dental ceramics

    PubMed Central

    Yılmaz, Kerem; Ozturk, Caner

    2014-01-01

    PURPOSE The color of ceramic restorations is affected by various factors such as brand, thickness of the layered ceramic, condensation technique, surface smoothness, number of firings, firing temperature and dentin thickness. The aim of this study was to evaluate the color change and surface roughness of dental porcelain of different thicknesses during repeated firings. MATERIALS AND METHODS Disc-shaped metal-ceramic samples (N=21; IPS Classic; Ivoclar Vivadent, Schaan, Liechtenstein) of different thicknesses were exposed to repeated firings. Color measurements of the samples were made using a colorimeter, and a profilometer was used to determine surface roughness. ANOVA and Tukey tests with repeated measurements were used for statistical analysis. RESULTS A total ceramic thickness of less than 2 mm had a significantly detrimental effect on the surface properties and color of the porcelains during firings (P<.05). CONCLUSION Repeated firings affect the color and surface roughness of dental ceramics and should be avoided. PMID:25177475
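
    Colorimeter readings of this kind are typically compared via the CIE76 colour difference; a sketch with hypothetical CIELAB coordinates follows (the ~3.3 perceptibility threshold in the comment is a commonly cited convention, not this study's criterion).

        import numpy as np

        def delta_e_ab(lab1, lab2):
            """CIE76 colour difference between two CIELAB coordinates:
            Delta E*ab = sqrt(dL^2 + da^2 + db^2)."""
            return float(np.linalg.norm(np.asarray(lab1) - np.asarray(lab2)))

        # Hypothetical readings before and after repeated firings, as (L*, a*, b*):
        before = (78.2, 1.4, 16.8)
        after = (75.9, 1.9, 18.4)
        print(delta_e_ab(before, after))   # differences above ~3.3 are often
                                           # considered clinically perceptible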

  20. Effectiveness of spinal anesthesia combined with obturator nerve blockade in preventing adductor muscle contraction during transurethral resection of bladder tumor

    PubMed Central

    Alavi, Cyrus Emir; Asgari, Seyed Alaeddin; Falahatkar, Siavash; Rimaz, Siamak; Naghipour, Mohammadreza; Khoshrang, Hossein; Jafari, Mehdi; Herfeh, Nadia

    2017-01-01

    Objective To determine whether spinal anesthesia combined with obturator nerve blockade (SOB) is effective in preventing obturator nerve stimulation, jerking and bladder perforation during transurethral resection of bladder tumor (TURBT). Material and methods In this clinical trial, 30 patients were randomly divided into two groups: spinal anesthesia (SA) and SOB. In the SA group, 2.5 cc of 0.5% bupivacaine was injected intrathecally using a 25-gauge spinal needle; in the SOB group, a classic obturator nerve blockade was performed after spinal anesthesia using a nerve stimulation technique. Results There was a statistically significant difference in the incidence of jerking between the two groups (p=0.006). During TURBT, surgeon satisfaction was significantly higher in the SOB group than in the SA group (p=0.006). There were no significant differences in sex, patient age or location of bladder tumor between the groups (p>0.05). Conclusion Obturator nerve blockade using 15 cc of 1% lidocaine is effective in preventing adductor muscle spasms during TURBT. PMID:29201516
