Advances in Statistical Approaches to Oncology Drug Development
Ivanova, Anastasia; Rosner, Gary L.; Marchenko, Olga; Parke, Tom; Perevozskaya, Inna; Wang, Yanping
2014-01-01
We describe some recent developments in statistical methodology and practice in oncology drug development from an academic and an industry perspective. Many adaptive designs were pioneered in oncology, and oncology is still at the forefront of novel methods to enable better and faster Go/No-Go decision making while controlling the cost. PMID:25949927
Intermediate/Advanced Research Design and Statistics
NASA Technical Reports Server (NTRS)
Ploutz-Snyder, Robert
2009-01-01
The purpose of this module is to provide Institutional Researchers (IRs) with an understanding of the principles of advanced research design and of the intermediate/advanced statistical procedures consistent with such designs.
Advanced Placement Course Description. Statistics.
ERIC Educational Resources Information Center
College Entrance Examination Board, New York, NY.
The Advanced Placement (AP) program is a cooperative educational effort of secondary schools, colleges, and the College Board that consists of 30 college-level courses and examinations in 17 academic disciplines for highly motivated students in secondary schools. AP courses are offered in more than 11,000 high schools and are recognized by nearly…
Statistical Approach to Protein Quantification
Gerster, Sarah; Kwon, Taejoon; Ludwig, Christina; Matondo, Mariette; Vogel, Christine; Marcotte, Edward M.; Aebersold, Ruedi; Bühlmann, Peter
2014-01-01
A major goal in proteomics is the comprehensive and accurate description of a proteome. This task includes not only the identification of proteins in a sample, but also the accurate quantification of their abundance. Although mass spectrometry typically provides information on peptide identity and abundance in a sample, it does not directly measure the concentration of the corresponding proteins. Specifically, most mass-spectrometry-based approaches (e.g. shotgun proteomics or selected reaction monitoring) allow one to quantify peptides using chromatographic peak intensities or spectral counting information. Ultimately, based on these measurements, one wants to infer the concentrations of the corresponding proteins. Inferring properties of the proteins based on experimental peptide evidence is often a complex problem because of the ambiguity of peptide assignments and different chemical properties of the peptides that affect the observed concentrations. We present SCAMPI, a novel generic and statistically sound framework for computing protein abundance scores based on quantified peptides. In contrast to most previous approaches, our model explicitly includes information from shared peptides to improve protein quantitation, especially in eukaryotes with many homologous sequences. The model accounts for uncertainty in the input data, leading to statistical prediction intervals for the protein scores. Furthermore, peptides with extreme abundances can be reassessed and classified as either regular data points or actual outliers. We used the proposed model with several datasets and compared its performance to that of other, previously used approaches for protein quantification in bottom-up mass spectrometry. PMID:24255132
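To make the inference problem concrete, here is a hypothetical minimal stand-in (not SCAMPI itself, which is a full statistical model with prediction intervals and outlier reassessment): treat each peptide intensity as a noisy sum of the abundances of the proteins it maps to, shared peptides included, and solve by least squares.

```python
import numpy as np

def protein_scores(assign, peptide_intensity):
    """Least-squares protein abundances from peptide intensities.
    assign: (n_peptides, n_proteins) 0/1 peptide-to-protein map, where a
    shared peptide has several 1s in its row; peptide_intensity: (n_peptides,).
    A real model would also constrain non-negativity and model the noise."""
    scores, *_ = np.linalg.lstsq(assign, peptide_intensity, rcond=None)
    return scores
```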
Advanced Statistical Properties of Dispersing Billiards
NASA Astrophysics Data System (ADS)
Chernov, N.
2006-03-01
A new approach to statistical properties of hyperbolic dynamical systems emerged recently; it was introduced by L.-S. Young and modified by D. Dolgopyat. It is based on the coupling method borrowed from probability theory. We apply it here to one of the most physically interesting models, Sinai billiards. It allows us to derive a series of new results, as well as to make significant improvements to existing results. First we establish sharp bounds on correlations (including multiple correlations). Then we use our correlation bounds to obtain the central limit theorem (CLT), the almost sure invariance principle (ASIP), the law of the iterated logarithm, and integral tests.
Recent advances in statistical energy analysis
NASA Technical Reports Server (NTRS)
Heron, K. H.
1992-01-01
Statistical Energy Analysis (SEA) has traditionally been developed using a modal summation and averaging approach, which has led to the need for many restrictive SEA assumptions. The assumption of 'weak coupling' is particularly unacceptable when attempts are made to apply SEA to structural coupling. It is now believed that this assumption is more a consequence of the modal formulation than a necessary ingredient of SEA. The present analysis drops this restriction and describes a wave approach to the calculation of plate-to-plate coupling loss factors. Predictions based on this method are compared with results obtained from experiments using point excitation on one side of an irregular six-sided box structure. The conclusion is that the use and calculation of infinite transmission coefficients is the way forward for the development of a purely predictive SEA code.
Boltzmann's Approach to Statistical Mechanics
NASA Astrophysics Data System (ADS)
Goldstein, Sheldon
In the last quarter of the nineteenth century, Ludwig Boltzmann explained how irreversible macroscopic laws, in particular the second law of thermodynamics, originate in the time-reversible laws of microscopic physics. Boltzmann's analysis, the essence of which I shall review here, is basically correct. The most famous criticisms of Boltzmann's later work on the subject have little merit. Most twentieth-century innovations, such as the identification of the state of a physical system with a probability distribution ρ on its phase space, of its thermodynamic entropy with the Gibbs entropy of ρ, and the invocation of the notions of ergodicity and mixing for the justification of the foundations of statistical mechanics, are thoroughly misguided.
Teaching Classical Statistical Mechanics: A Simulation Approach.
ERIC Educational Resources Information Center
Sauer, G.
1981-01-01
Describes a one-dimensional model for an ideal gas used to study the development of disordered motion in Newtonian mechanics. A Monte Carlo procedure for simulation of the statistical ensemble of an ideal gas with fixed total energy is developed. Compares both approaches as a pseudoexperimental foundation for statistical mechanics. (Author/JN)
Conceptualizing a Framework for Advanced Placement Statistics Teaching Knowledge
ERIC Educational Resources Information Center
Haines, Brenna
2015-01-01
The purpose of this article is to sketch a conceptualization of a framework for Advanced Placement (AP) Statistics Teaching Knowledge. Recent research continues to problematize the lack of knowledge and preparation among secondary level statistics teachers. The College Board's AP Statistics course continues to grow and gain popularity, but is a…
Reconciling statistical and systems science approaches to public health.
Ip, Edward H; Rahmandad, Hazhir; Shoham, David A; Hammond, Ross; Huang, Terry T-K; Wang, Youfa; Mabry, Patricia L
2013-10-01
Although systems science has emerged as a set of innovative approaches to study complex phenomena, many topically focused researchers, including clinicians and scientists working in public health, are somewhat befuddled by a methodology that at times appears radically different from the analytic methods, such as statistical modeling, to which they are accustomed. There also appear to be conflicts between complex systems approaches and traditional statistical methodologies, both in terms of their underlying strategies and the languages they use. We argue that the conflicts are resolvable, and the sooner the better for the field. In this article, we show how statistical and systems science approaches can be reconciled, and how together they can advance solutions to complex problems. We do this by comparing the methods within a theoretical framework based on the work of population biologist Richard Levins. We present different types of models as representing different tradeoffs among the four desiderata of generality, realism, fit, and precision. PMID:24084395
Advance Report of Final Mortality Statistics, 1985.
ERIC Educational Resources Information Center
Monthly Vital Statistics Report, 1987
1987-01-01
This document presents mortality statistics for 1985 for the entire United States. Data analysis and discussion of these factors are included: death and death rates; death rates by age, sex, and race; expectation of life at birth and at specified ages; causes of death; infant mortality; and maternal mortality. Highlights reported include: (1) the…
Statistical approach to nuclear level density
Sen'kov, R. A.; Horoi, M.; Zelevinsky, V. G.
2014-10-15
We discuss the level density in a finite many-body system with strong interaction between the constituents. Our primary object of application is the atomic nucleus, but the same techniques can be applied to other mesoscopic systems. We calculate and compare nuclear level densities for given quantum numbers obtained by different methods: the nuclear shell model (the most successful microscopic approach); our main instrument, the moments method (a statistical approach); and the Fermi-gas model. The calculation with the moments method can use any shell-model Hamiltonian, excluding the spurious states of center-of-mass motion. Our goal is to investigate the statistical properties of the nuclear level density, define its phenomenological parameters, and offer an affordable and reliable way of calculating it.
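A hedged sketch of the moments-method approximation at the core of such calculations (the standard spectral-distribution form; the paper's exact partitioning into quantum-number classes may differ): the level density is assembled from Gaussians whose centroids and widths come from traces of the shell-model Hamiltonian over configurations κ of dimension D_κ,

\[
\rho(E) \simeq \sum_{\kappa} D_\kappa\, G\!\left(E;\, E_\kappa, \sigma_\kappa\right),
\qquad
E_\kappa = \frac{\operatorname{Tr}_\kappa H}{D_\kappa},
\qquad
\sigma_\kappa^{2} = \frac{\operatorname{Tr}_\kappa H^{2}}{D_\kappa} - E_\kappa^{2}.
\]

Because only low moments of H are needed, no full diagonalization is required, which is what makes the method affordable compared with the shell model itself.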
Pooling Morphometric Estimates: A Statistical Equivalence Approach.
Pardoe, Heath R; Cutter, Gary R; Alter, Rachel; Hiess, Rebecca Kucharsky; Semmelroch, Mira; Parker, Donna; Farquharson, Shawna; Jackson, Graeme D; Kuzniecky, Ruben
2016-01-01
Changes in hardware or image-processing settings are a common issue for large multicenter studies. To pool MRI data acquired under these changed conditions, it is necessary to demonstrate that the changes do not affect MRI-based measurements. In these circumstances, classical inference testing is inappropriate because it is designed to detect differences, not prove similarity. We used a method known as statistical equivalence testing to address this limitation. Equivalence testing was carried out on 3 datasets: (1) cortical thickness and automated hippocampal volume estimates obtained from healthy individuals imaged using different multichannel head coils; (2) manual hippocampal volumetry obtained using two readers; and (3) corpus callosum area estimates obtained using an automated method with manual cleanup carried out by two readers. Equivalence testing was carried out using the "two one-sided tests" (TOST) approach. Power analyses of the TOST were used to estimate sample sizes required for well-powered equivalence testing analyses. Mean and standard deviation estimates from the automated hippocampal volume dataset were used to carry out an example power analysis. Cortical thickness values were found to be equivalent over 61% of the cortex when different head coils were used (q < .05, false discovery rate correction). Automated hippocampal volume estimates obtained using the same two coils were statistically equivalent (TOST P = 4.28 × 10^-15). Manual hippocampal volume estimates obtained using two readers were not statistically equivalent (TOST P = .97). The use of different readers to carry out limited correction of automated corpus callosum segmentations yielded equivalent area estimates (TOST P = 1.28 × 10^-14). Power analysis of simulated and automated hippocampal volume data demonstrated that the equivalence margin affects the number of subjects required for well-powered equivalence tests. We have presented a statistical method for determining if
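A minimal sketch of the TOST procedure used throughout, assuming a symmetric equivalence margin and a simple pooled-degrees-of-freedom t approximation (both assumptions mine, not details from the paper):

```python
import numpy as np
from scipy import stats

def tost(x, y, margin):
    """Two one-sided tests (TOST) for equivalence of two sample means.
    H0 (non-equivalence): |mean(x) - mean(y)| >= margin.
    Returns the TOST p-value: the larger of the two one-sided p-values."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    diff = x.mean() - y.mean()
    se = np.sqrt(x.var(ddof=1) / len(x) + y.var(ddof=1) / len(y))
    df = len(x) + len(y) - 2  # simple pooled-df approximation (assumption)
    p_lower = 1 - stats.t.cdf((diff + margin) / se, df)  # H0: diff <= -margin
    p_upper = stats.t.cdf((diff - margin) / se, df)      # H0: diff >= +margin
    return max(p_lower, p_upper)
```

Equivalence at level α is declared when the returned p-value falls below α, i.e., when both one-sided non-equivalence nulls are rejected; as the abstract notes, the choice of margin drives the sample size needed for a well-powered test.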
Introducing linear functions: an alternative statistical approach
NASA Astrophysics Data System (ADS)
Nolan, Caroline; Herbert, Sandra
2015-12-01
The introduction of linear functions is the turning point where many students decide if mathematics is useful or not. This means the role of parameters and variables in linear functions could be considered to be 'threshold concepts'. There is recognition that linear functions can be taught in context through the exploration of linear modelling examples, but this has its limitations. Currently, statistical data are easily obtainable, and graphics or computer algebra system (CAS) calculators are common in many classrooms. The use of this technology provides ease of access to different representations of linear functions as well as the ability to fit a least-squares line for real-life data. This means these calculators could support a possible alternative approach to the introduction of linear functions. This study compares the results of an end-of-topic test for two classes of Australian middle secondary students at a regional school to determine if such an alternative approach is feasible. In this study, test questions were grouped by concept and subjected to concept-by-concept analysis of the means of test results of the two classes. This analysis revealed that the students following the alternative approach demonstrated greater competence with non-standard questions.
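The least-squares line referred to above is a one-line computation once data are in hand; a sketch with made-up illustrative data (the study itself used real-life datasets on CAS/graphics calculators):

```python
import numpy as np

# Hypothetical data: hours studied vs. test score (illustrative only).
hours = np.array([1, 2, 3, 4, 5, 6], dtype=float)
score = np.array([52, 55, 61, 64, 70, 73], dtype=float)

slope, intercept = np.polyfit(hours, score, 1)  # degree-1 least-squares fit
print(f"score = {slope:.2f} * hours + {intercept:.2f}")
```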
Phase statistics approach to human ventricular fibrillation
NASA Astrophysics Data System (ADS)
Wu, Ming-Chya; Watanabe, Eiichi; Struzik, Zbigniew R.; Hu, Chin-Kun; Yamamoto, Yoshiharu
2009-11-01
Ventricular fibrillation (VF) is known to be the most dangerous cardiac arrhythmia, frequently leading to sudden cardiac death (SCD). During VF, cardiac output drops to nil and, unless the fibrillation is promptly halted, death usually ensues within minutes. While delivering life-saving electrical shocks is a method of preventing SCD, it has been recognized that some, though not many, VF episodes are self-terminating, and understanding the mechanism of spontaneous defibrillation might provide newer therapeutic options for treatment of this otherwise fatal arrhythmia. Using the phase statistics approach, recently developed to study financial and physiological time series, here we reveal the timing characteristics of transient features of ventricular tachyarrhythmia (mostly VF) electrocardiogram (ECG) and find that there are three distinct types of probability density function (PDF) of phase distributions: uniform (UF), concave (CC), and convex (CV). Our data show that VF patients with UF or CC types of PDF have approximately the same probability of survival and nonsurvival, while VF patients with CV type PDF have zero probability of survival, implying that their VF episodes are never self-terminating. Our results suggest that detailed phase statistics of human ECG data may be a key to understanding the mechanism of spontaneous defibrillation of fatal VF.
Multivariate analysis: A statistical approach for computations
NASA Astrophysics Data System (ADS)
Michu, Sachin; Kaushik, Vandana
2014-10-01
Multivariate analysis is a statistical approach commonly used in automotive diagnosis, education, cluster evaluation in finance, etc., and more recently in the health-related professions. The objective of this paper is to provide a detailed exploratory discussion of factor analysis (FA) in image retrieval and of correlation analysis (CA) of network traffic. Image retrieval methods aim to retrieve relevant images from a collected database, based on their content. The problem is made more difficult by the high dimension of the variable space in which the images are represented. Multivariate correlation analysis proposes an anomaly detection and analysis method based on the correlation coefficient matrix. Anomalous behaviors in the network include various attacks on the network, such as DDoS attacks and network scanning.
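A minimal sketch of correlation-coefficient-matrix anomaly detection in the spirit described (generic, not the paper's exact method): score a traffic window by how far its feature-correlation matrix drifts from a baseline.

```python
import numpy as np

def correlation_anomaly_score(window, baseline_corr):
    """Frobenius-norm distance between a data window's feature-correlation
    matrix and a baseline correlation matrix.
    window: (n_samples, n_features); baseline_corr: (n_features, n_features)."""
    corr = np.corrcoef(window, rowvar=False)
    return np.linalg.norm(corr - baseline_corr)
```

A window whose score exceeds a threshold calibrated on normal traffic would be flagged as anomalous.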
Robot Trajectories Comparison: A Statistical Approach
Ansuategui, A.; Arruti, A.; Susperregi, L.; Yurramendi, Y.; Jauregi, E.; Lazkano, E.; Sierra, B.
2014-01-01
The task of planning a collision-free trajectory from a start to a goal position is fundamental for an autonomous mobile robot. Although path planning has been extensively investigated since the beginning of robotics, there is no agreement on how to measure the performance of a motion algorithm. This paper presents a new approach to perform robot trajectories comparison that could be applied to any kind of trajectories and in both simulated and real environments. Given an initial set of features, it automatically selects the most significant ones and performs a statistical comparison using them. Additionally, a graphical data visualization named polygraph which helps to better understand the obtained results is provided. The proposed method has been applied, as an example, to compare two different motion planners, FM2 and WaveFront, using different environments, robots, and local planners. PMID:25525618
Teaching Statistics by Spreadsheet: A Developmental Approach.
ERIC Educational Resources Information Center
Ostrowski, John W.
1988-01-01
Presents a framework for using spreadsheet software (Lotus 1-2-3) on a microcomputer to develop statistical procedure templates for teaching statistical concepts. Provides an overview of traditional computer-based statistical applications, an outline for teaching-oriented statistical applications with illustrations, and suggestions for integrating…
Intelligence and embodiment: a statistical mechanics approach.
Chinea, Alejandro; Korutcheva, Elka
2013-04-01
Evolutionary neuroscience has been mainly dominated by the principle of phylogenetic conservation, specifically, by the search for similarities in brain organization. This principle states that closely related species tend to be similar because they have a common ancestor. However, explaining, for instance, behavioral differences between humans and chimpanzees has proved notoriously difficult. In this paper, the hypothesis of a common information-processing principle exploited by brains evolved through natural evolution is explored. A model combining recent advances in cognitive psychology and evolutionary neuroscience is presented. The macroscopic effects associated with the intelligence-like structures postulated by the model are analyzed from a statistical mechanics point of view. As a result of this analysis, some plausible explanations are put forward concerning the disparities and similarities in cognitive capacities which are observed in nature across species. Furthermore, an interpretation of the efficiency of the brain's computations is also provided. These theoretical results and their implications for modern theories of intelligence are shown to be consistent with the formulated hypothesis. PMID:23454920
Statistical approach to partial equilibrium analysis
NASA Astrophysics Data System (ADS)
Wang, Yougui; Stanley, H. E.
2009-04-01
A statistical approach to market equilibrium and efficiency analysis is proposed in this paper. One factor that governs the exchange decisions of traders in a market, termed the willingness price, is highlighted and forms the basis of the whole theory. The supply and demand functions are formulated as the distributions of the corresponding willing exchange over the willingness price. The laws of supply and demand can be derived directly from these distributions. The characteristics of the excess demand function are analyzed, and the necessary conditions for the existence and uniqueness of the equilibrium point of the market are specified. The rationing rates of buyers and sellers are introduced to describe the ratio of realized exchange to willing exchange, and their dependence on the market price is studied in the cases of shortage and surplus. The realized market surplus, which is the criterion of market efficiency, can be written as a function of the distributions of willing exchange and the rationing rates. With this approach we can strictly prove that a market is efficient in the state of equilibrium.
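One way to formalize "supply and demand as distributions of willing exchange over the willingness price" (my formalization for illustration; the paper's notation may differ): with f_s(w) and f_b(w) the densities of sellers' and buyers' willing exchange over willingness price w,

\[
S(p) = \int_{0}^{p} f_s(w)\,dw,
\qquad
D(p) = \int_{p}^{\infty} f_b(w)\,dw,
\]

since a seller trades when the market price is at or above her willingness price and a buyer when it is at or below his. S is then nondecreasing and D nonincreasing in p, which is the stated derivation of the laws of supply and demand.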
Uncertainty quantification approaches for advanced reactor analyses.
Briggs, L. L.; Nuclear Engineering Division
2009-03-24
The original approach to nuclear reactor design or safety analyses was to make very conservative modeling assumptions so as to ensure meeting the required safety margins. Traditional regulation, as established by the U.S. Nuclear Regulatory Commission, required conservatisms that have subsequently been shown to be excessive. The Commission has therefore moved away from excessively conservative evaluations and has determined best-estimate calculations to be an acceptable alternative to conservative models, provided the best-estimate results are accompanied by an uncertainty evaluation which can demonstrate that, when a set of analysis cases which statistically account for uncertainties of all types are generated, there is a 95% probability that at least 95% of the cases meet the safety margins. To date, nearly all published work addressing uncertainty evaluations of nuclear power plant calculations has focused on light water reactors and on large-break loss-of-coolant accident (LBLOCA) analyses. However, there is nothing in the uncertainty evaluation methodologies that is limited to a specific type of reactor or to specific types of plant scenarios. These same methodologies can be equally well applied to analyses for high-temperature gas-cooled reactors and to liquid metal reactors, and they can be applied to steady-state calculations, operational transients, or severe accident scenarios. This report reviews and compares both statistical and deterministic uncertainty evaluation approaches. Recommendations are given for selection of an uncertainty methodology and for considerations to be factored into the process of evaluating uncertainties for advanced reactor best-estimate analyses.
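The 95%/95% criterion mentioned above is commonly implemented with a nonparametric order-statistics (Wilks) sample-size bound; the report does not spell this formula out, so the following is a standard sketch rather than a quote of its methodology:

```python
import math

def wilks_n(beta=0.95, gamma=0.95):
    """Smallest number of code runs n such that the sample maximum bounds
    the gamma-quantile of the output with confidence beta: 1 - gamma**n >= beta."""
    return math.ceil(math.log(1 - beta) / math.log(gamma))

print(wilks_n())  # 59 runs for a one-sided 95%/95% tolerance bound
```

In words: 59 independent best-estimate runs suffice for the largest computed value to bound the 95th percentile of the output with 95% confidence, regardless of the output's distribution.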
ERIC Educational Resources Information Center
McGrath, April L.; Ferns, Alyssa; Greiner, Leigh; Wanamaker, Kayla; Brown, Shelley
2015-01-01
In this study we assessed the usefulness of a multifaceted teaching framework in an advanced statistics course. We sought to expand on past findings by using this framework to assess changes in anxiety and self-efficacy, and we collected focus group data to ascertain whether students attribute such changes to a multifaceted teaching approach.…
Project T.E.A.M. (Technical Education Advancement Modules). Advanced Statistical Process Control.
ERIC Educational Resources Information Center
Dunlap, Dale
This instructional guide, one of a series developed by the Technical Education Advancement Modules (TEAM) project, is a 20-hour advanced statistical process control (SPC) and quality improvement course designed to develop the following competencies: (1) understanding quality systems; (2) knowing the process; (3) solving quality problems; and (4)…
Tools for the advancement of undergraduate statistics education
NASA Astrophysics Data System (ADS)
Schaffner, Andrew Alan
To keep pace with advances in applied statistics and to maintain literate consumers of quantitative analyses, statistics educators stress the need for change in the classroom (Cobb, 1992; Garfield, 1993, 1995; Moore, 1991a; Snee, 1993; Steinhorst and Keeler, 1995). These authors call for a more concept-oriented undergraduate introductory statistics course that emphasizes true understanding over mechanical skills. Drawing on recent educational research, this dissertation attempts to realize this vision by developing tools and pedagogy to assist statistics instructors. This dissertation describes statistical facets, pieces of statistical understanding that are building blocks of knowledge, and discusses DIANA, a World-Wide Web tool for diagnosing facets. Further, I show how facets may be incorporated into course design through the development of benchmark lessons based on the principles of collaborative learning (diSessa and Minstrell, 1995; Cohen, 1994; Reynolds et al., 1995; Bruer, 1993; von Glasersfeld, 1991) and activity-based courses (Jones, 1991; Yackel, Cobb and Wood, 1991). To support benchmark lessons and collaborative learning in large classes I describe Virtual Benchmark Instruction, benchmark lessons which take place on a structured hypertext bulletin board using the technology of the World-Wide Web. Finally, I present randomized experiments which suggest that these educational developments are effective in a university introductory statistics course.
Optimal Statistical Approach to Optoacoustic Image Reconstruction
NASA Astrophysics Data System (ADS)
Zhulina, Yulia V.
2000-11-01
An optimal statistical approach is applied to the task of image reconstruction in photoacoustics. The physical essence of the task is as follows: pulsed laser irradiation induces an ultrasound wave at the inhomogeneities inside the investigated volume. This acoustic wave is received by a set of receivers outside this volume. It is necessary to reconstruct a spatial image of these inhomogeneities. Mathematical techniques developed in radio-location theory are used for solving the task. An algorithm of maximum likelihood is synthesized for the image reconstruction. The obtained algorithm is investigated by digital modeling. The number of receivers and their disposition in space are arbitrary. Results of the synthesis are applied to noninvasive medical diagnostics (breast cancer). The capability of the algorithm is tested on real signals. The image is built with the use of signals obtained in vitro. The essence of the algorithm includes (i) summing of all signals in the image plane, with a transform from the time coordinates of the signals to the spatial coordinates of the image, and (ii) optimal spatial filtration of this sum. The results are shown in the figures.
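Step (i), summing signals under a transform from time coordinates to image coordinates, is in essence delay-and-sum reconstruction; a minimal sketch under an assumed constant sound speed and known receiver geometry (all names and parameters here are illustrative, and step (ii), the optimal spatial filtration, is omitted):

```python
import numpy as np

def delay_and_sum(signals, fs, receivers, grid, c=1500.0):
    """signals: (n_rx, n_t) received waveforms sampled at fs [Hz];
    receivers: (n_rx, 2) positions [m]; grid: (n_pix, 2) image points [m];
    c: assumed speed of sound [m/s]. Returns the summed image (step i only)."""
    n_rx, n_t = signals.shape
    image = np.zeros(len(grid))
    for r in range(n_rx):
        d = np.linalg.norm(grid - receivers[r], axis=1)  # pixel-to-receiver distance
        idx = np.round(d / c * fs).astype(int)           # time of flight -> sample index
        valid = idx < n_t
        image[valid] += signals[r, idx[valid]]
    return image
```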
A Statistical Approach to Automatic Speech Summarization
NASA Astrophysics Data System (ADS)
Hori, Chiori; Furui, Sadaoki; Malkin, Rob; Yu, Hua; Waibel, Alex
2003-12-01
This paper proposes a statistical approach to automatic speech summarization. In our method, a set of words maximizing a summarization score indicating the appropriateness of summarization is extracted from automatically transcribed speech and then concatenated to create a summary. The extraction process is performed using a dynamic programming (DP) technique based on a target compression ratio. In this paper, we demonstrate how an English news broadcast transcribed by a speech recognizer is automatically summarized. We adapted our method, which was originally proposed for Japanese, to English by modifying the model for estimating word concatenation probabilities based on a dependency structure in the original speech given by a stochastic dependency context free grammar (SDCFG). We also propose a method of summarizing multiple utterances using a two-level DP technique. The automatically summarized sentences are evaluated by summarization accuracy based on a comparison with a manual summary of speech that has been correctly transcribed by human subjects. Our experimental results indicate that the method we propose can effectively extract relatively important information and remove redundant and irrelevant information from English news broadcasts.
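A minimal stand-in for the DP extraction step: choose an m-word subsequence of the transcript (m set by the target compression ratio) maximizing the sum of per-word significance scores and pairwise concatenation scores. The paper's actual summarization score also includes linguistic likelihood and confidence terms, and its concatenation probabilities come from the SDCFG dependency structure; `sig` and `concat` here are illustrative inputs.

```python
def summarize(words, sig, concat, m):
    """Pick an m-word subsequence maximizing sum of word scores sig[i]
    plus concatenation scores concat[i][j] for consecutive picks i < j.
    Assumes m <= len(words). O(n^2 * m) dynamic program."""
    n = len(words)
    NEG = float("-inf")
    dp = [[NEG] * (m + 1) for _ in range(n)]    # dp[j][k]: best k-word summary ending at j
    back = [[None] * (m + 1) for _ in range(n)]
    for i in range(n):
        dp[i][1] = sig[i]
    for k in range(2, m + 1):
        for j in range(n):
            for i in range(j):
                if dp[i][k - 1] > NEG:
                    cand = dp[i][k - 1] + sig[j] + concat[i][j]
                    if cand > dp[j][k]:
                        dp[j][k], back[j][k] = cand, i
    j = max(range(n), key=lambda t: dp[t][m])   # best final word
    chosen, k = [], m
    while j is not None:                        # backtrack the chosen path
        chosen.append(words[j])
        j, k = back[j][k], k - 1
    return list(reversed(chosen))
```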
Statistical physics approaches to understanding physiological signals
NASA Astrophysics Data System (ADS)
Chen, Zhi
This thesis applies novel statistical physics approaches to investigate complex mechanisms underlying some physiological signals related to human motor activity and stroke. The scale-invariant properties of motor activity fluctuations and the phase coupling between blood flow (BF) in the brain and blood pressure (BP) at the finger are studied. Both BF and BP signals are controlled by cerebral autoregulation, the impairment of which is relevant to stroke. Part I of this thesis introduces experimental methods of assessing human activity fluctuations, BF and BP signals. These signals are often nonstationary, i.e., the mean and the standard deviation of signals are not invariant under time shifts. This fact imposes challenges in correctly analyzing properties of such signals. A review of conventional methods and the methods from statistical physics in quantifying long-range power-law correlations (an important scale-invariant property) and phase coupling in nonstationary signals is provided. Part II investigates the effects of trends, nonstationarities and applying certain nonlinear filters on the scale-invariant properties of signals. Nonlinear logarithmic filters are shown to change correlation properties of anti-correlated signals and strongly positively-correlated signals. It is also shown that different types of trends may change correlation properties and thus mask true correlations in the original signal. A "superposition rule" is established to quantitatively describe the relationship among correlation properties of any two signals and the sum of these two signals. Based on this rule, simulations are conducted to show how to distinguish the correlations due to trends and nonstationarities from the true correlations in real-world signals. Part III investigates dynamics of human activity fluctuations. Results suggest that apparently random forearm motion possesses previously unrecognized dynamic patterns characterized by common distribution forms, scale
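The long-range power-law correlations discussed above are conventionally quantified by detrended fluctuation analysis (DFA); a minimal first-order sketch (my simplification, not the thesis code):

```python
import numpy as np

def dfa(x, scales):
    """First-order DFA: returns the fluctuation F(s) for each window size s.
    The scaling exponent alpha is the slope of log F(s) vs log s."""
    y = np.cumsum(np.asarray(x, float) - np.mean(x))  # integrated profile
    F = []
    for s in scales:
        n_seg = len(y) // s
        t = np.arange(s)
        ms = []
        for k in range(n_seg):
            seg = y[k * s:(k + 1) * s]
            trend = np.polyval(np.polyfit(t, seg, 1), t)  # local linear detrend
            ms.append(np.mean((seg - trend) ** 2))
        F.append(np.sqrt(np.mean(ms)))
    return np.array(F)
```

An exponent of 0.5 indicates uncorrelated noise; values above 0.5 indicate the persistent long-range correlations of interest here.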
Statistical physics approaches to Alzheimer's disease
NASA Astrophysics Data System (ADS)
Peng, Shouyong
Alzheimer's disease (AD) is the most common cause of late-life dementia. In the brain of an AD patient, neurons are lost and spatial neuronal organizations (microcolumns) are disrupted. An adequate quantitative analysis of microcolumns requires that we automate the neuron recognition stage in the analysis of microscopic images of human brain tissue. We propose a recognition method based on statistical physics. Specifically, Monte Carlo simulations of an inhomogeneous Potts model are applied for image segmentation. Unlike most traditional methods, this method improves the recognition of overlapped neurons, and thus improves the overall recognition percentage. Although the exact causes of AD are unknown, as experimental advances have revealed the molecular origin of AD, they have continued to support the amyloid cascade hypothesis, which states that early stages of aggregation of amyloid beta (Abeta) peptides lead to neurodegeneration and death. X-ray diffraction studies reveal the common cross-beta structural features of the final stable aggregates: amyloid fibrils. Solid-state NMR studies also reveal structural features for some well-ordered fibrils. But currently there is no feasible experimental technique that can reveal the exact structure or the precise dynamics of assembly and thus help us understand the aggregation mechanism. Computer simulation offers a way to understand the aggregation mechanism on the molecular level. Because traditional all-atom continuous molecular dynamics simulations are not fast enough to investigate the whole aggregation process, we apply coarse-grained models and discrete molecular dynamics methods to increase the simulation speed. First we use a coarse-grained two-bead (two beads per amino acid) model. Simulations show that peptides can aggregate into multilayer beta-sheet structures, which agree with X-ray diffraction experiments. To better represent the secondary structure transition happening during aggregation, we refine the
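A generic sketch of the Monte Carlo segmentation idea named above: one Metropolis sweep of an image-coupled Potts model whose energy trades off agreement of pixel intensity with a class mean against label smoothness. The energy is illustrative, not the paper's exact inhomogeneous Hamiltonian.

```python
import numpy as np

def potts_sweep(labels, image, means, beta, rng):
    """One Metropolis sweep of an image-coupled Potts model.
    Energy: sum_pixels (I - mu_label)^2  -  beta * (# equal-label neighbor pairs).
    labels: (H, W) int array; image: (H, W) floats; means: per-class means."""
    H, W = labels.shape
    q = len(means)
    for _ in range(H * W):
        i, j = int(rng.integers(H)), int(rng.integers(W))
        old, new = labels[i, j], int(rng.integers(q))
        # Data term: how much better/worse the new class explains the pixel.
        dE = (image[i, j] - means[new]) ** 2 - (image[i, j] - means[old]) ** 2
        # Smoothness term: losing same-label neighbors costs energy.
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < H and 0 <= nj < W:
                dE += beta * (int(labels[ni, nj] == old) - int(labels[ni, nj] == new))
        if dE <= 0 or rng.random() < np.exp(-dE):
            labels[i, j] = new
    return labels
```

Repeated sweeps with `rng = np.random.default_rng(0)` relax the labels toward a segmentation that balances intensity fit against contiguity, which is what helps separate overlapping neurons.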
Hidden Statistics Approach to Quantum Simulations
NASA Technical Reports Server (NTRS)
Zak, Michail
2010-01-01
Recent advances in quantum information theory have inspired an explosion of interest in new quantum algorithms for solving hard computational (quantum and non-quantum) problems. The basic principle of quantum computation is that quantum properties can be used to represent and structure data, and that quantum mechanisms can be devised and built to perform operations on these data. Three basic non-classical properties of quantum mechanics (superposition, entanglement, and direct-product decomposability) were the main reasons for optimism about the capabilities of quantum computers, which promised simultaneous processing of large massifs of highly correlated data. Unfortunately, these advantages of quantum mechanics came at a high price. One major problem is keeping the components of the computer in a coherent state, as the slightest interaction with the external world would cause the system to decohere. That is why the hardware implementation of a quantum computer is still unsolved. The basic idea of this work is to create a new kind of dynamical system that would preserve the three main properties of quantum physics (superposition, entanglement, and direct-product decomposability) while allowing one to measure its state variables using classical methods. In other words, such a system would reinforce the advantages and minimize the limitations of both quantum and classical aspects. Based upon a concept of hidden statistics, a new kind of dynamical system for simulation of the Schroedinger equation is proposed. The system represents a modified Madelung version of the Schroedinger equation. It preserves superposition, entanglement, and direct-product decomposability while allowing one to measure its state variables using classical methods. Such an optimal combination of characteristics is a perfect match for simulating quantum systems. The model includes a transitional component of quantum potential (which has been overlooked in previous treatments of the Madelung equation). The role of the
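For reference, the standard Madelung decomposition on which the "modified Madelung version" builds: writing the wave function as $\psi = \sqrt{\varrho}\,e^{iS/\hbar}$ turns the Schroedinger equation into a continuity equation plus a Hamilton-Jacobi equation with an extra quantum potential $Q$ (the transitional component the author adds to $Q$ is not reproduced here):

\[
\frac{\partial \varrho}{\partial t} + \nabla\!\cdot\!\left(\varrho\,\frac{\nabla S}{m}\right) = 0,
\qquad
\frac{\partial S}{\partial t} + \frac{(\nabla S)^2}{2m} + V + Q = 0,
\qquad
Q = -\frac{\hbar^2}{2m}\,\frac{\nabla^2 \sqrt{\varrho}}{\sqrt{\varrho}}.
\]

The appeal of this form is that $\varrho$ and $S$ are classical-looking fields, which is precisely what makes the state variables classically measurable.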
A statistical approach to root system classification
Bodner, Gernot; Leitner, Daniel; Nakhforoosh, Alireza; Sobotik, Monika; Moder, Karl; Kaul, Hans-Peter
2013-01-01
Plant root systems have a key role in ecology and agronomy. In spite of the fast increase in root studies, there is still no classification that allows distinguishing among distinctive characteristics within the diversity of rooting strategies. Our hypothesis is that a multivariate approach for “plant functional type” identification in ecology can be applied to the classification of root systems. The classification method presented is based on a data-defined statistical procedure without a priori decision on the classifiers. The study demonstrates that principal-component-based rooting types provide efficient and meaningful multi-trait classifiers. The classification method is exemplified with simulated root architectures and morphological field data. Simulated root architectures showed that morphological attributes with spatial distribution parameters capture the most distinctive features within root system diversity. While developmental type (tap vs. shoot-borne systems) is a strong, but coarse classifier, topological traits provide the most detailed differentiation among distinctive groups. Adequacy of commonly available morphologic traits for classification is supported by field data. Rooting types emerging from measured data were mainly distinguished into diameter/weight-dominated and density-dominated types. Similarity of root systems within distinctive groups was the joint result of phylogenetic relation and environmental as well as human selection pressure. We concluded that the data-defined classification is appropriate for integration of knowledge obtained with different root measurement methods and at various scales. Currently root morphology is the most promising basis for classification due to widely used common measurement protocols. To capture details of root diversity, efforts in architectural measurement techniques are essential. PMID:23914200
A statistical mechanics approach to Granovetter theory
NASA Astrophysics Data System (ADS)
Barra, Adriano; Agliari, Elena
2012-05-01
In this paper we try to bridge breakthroughs in quantitative sociology/econometrics, pioneered during the last decades by McFadden, Brock-Durlauf, Granovetter and Watts-Strogatz, by introducing a minimal model able to reproduce essentially all the features of social behavior highlighted by these authors. Our model relies on a pairwise Hamiltonian for decision-maker interactions which naturally extends the multi-population approaches by shifting and biasing the pattern definitions of a Hopfield model of neural networks. Once introduced, the model is investigated through graph theory (to recover the Granovetter and Watts-Strogatz results) and statistical mechanics (to recover the McFadden and Brock-Durlauf results). Due to the internal symmetries of our model, the latter is obtained as the relaxation of a proper Markov process, allowing us even to study its out-of-equilibrium properties. The method used to solve its equilibrium is an adaptation of the Hamilton-Jacobi technique recently introduced by Guerra in the spin-glass scenario, and the picture obtained is the following: shifting the patterns from [-1,+1] to [0,+1] implies that the larger the amount of similarities among decision makers, the stronger their relative influence, and this is enough to explain both the different roles of strong and weak ties in the social network as well as its small-world properties. As a result, imitative interaction strengths seem essentially a robust requirement (enough to break the gauge symmetry in the couplings); furthermore, this naturally leads to discrete-choice modeling when dealing with external influences and to imitative behavior à la Curie-Weiss as the one introduced by Brock and Durlauf.
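A schematic of the kind of pairwise decision-maker Hamiltonian described, in standard Hopfield notation (the shift of patterns from {-1,+1} to {0,+1} is the paper's key modification; its precise biasing terms are omitted here):

\[
H(\sigma) = -\frac{1}{N}\sum_{i<j} J_{ij}\,\sigma_i \sigma_j,
\qquad
J_{ij} = \sum_{\mu=1}^{p} \xi_i^{\mu}\,\xi_j^{\mu},
\qquad
\xi_i^{\mu} \in \{0,+1\},
\]

so the coupling $J_{ij}$ simply counts shared traits: the more similarities two decision makers have, the stronger their mutual influence, which is the mechanism the abstract invokes for strong versus weak ties.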
Statistical physics approaches to financial fluctuations
NASA Astrophysics Data System (ADS)
Wang, Fengzhong
2009-12-01
Complex systems attract many researchers from various scientific fields. Financial markets are one of these widely studied complex systems. Statistical physics, which was originally developed to study large systems, provides novel ideas and powerful methods to analyze financial markets. The study of financial fluctuations characterizes market behavior, and helps to better understand the underlying market mechanism. Our study focuses on volatility, a fundamental quantity to characterize financial fluctuations. We examine equity data of the entire U.S. stock market during 2001 and 2002. To analyze the volatility time series, we develop a new approach, called return interval analysis, which examines the time intervals between two successive volatilities exceeding a given threshold value. We find that the return interval distribution displays scaling over a wide range of thresholds. This scaling is valid for a range of time windows, from one minute up to one day. Moreover, our results are similar for commodities, interest rates, currencies, and for stocks of different countries. Further analysis shows some systematic deviations from a scaling law, which we can attribute to nonlinear correlations in the volatility time series. We also find a memory effect in return intervals for different time scales, which is related to the long-term correlations in the volatility. To further characterize the mechanism of price movement, we simulate the volatility time series using two different models, fractionally integrated generalized autoregressive conditional heteroscedasticity (FIGARCH) and fractional Brownian motion (fBm), and test these models with the return interval analysis. We find that both models can mimic time memory but only fBm shows scaling in the return interval distribution. In addition, we examine the volatility of daily opening to closing and of closing to opening. We find that each volatility distribution has a power law tail. Using the detrended fluctuation
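The return interval itself reduces to a very small computation; a sketch (variable names illustrative):

```python
import numpy as np

def return_intervals(volatility, threshold):
    """Time intervals between successive volatility values exceeding a threshold."""
    exceed_times = np.flatnonzero(np.asarray(volatility) > threshold)
    return np.diff(exceed_times)
```

The scaling result described above concerns how the distribution of these intervals collapses across a wide range of `threshold` values once rescaled by the mean interval.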
An Active Learning Approach to Teaching Statistics.
ERIC Educational Resources Information Center
Dolinsky, Beverly
2001-01-01
Provides suggestions for using active learning as the primary means to teaching statistics in order to create a collaborative environment. Addresses such strategies as using SPSS Base 7.5 for Windows and course periods centered on answering student-generated questions. Discusses various writing intensive assignments. (CMK)
Unified statistical approach to cortical thickness analysis.
Chung, Moo K; Robbins, Steve; Evans, Alan C
2005-01-01
This paper presents a unified image processing and analysis framework for cortical thickness in characterizing a clinical population. The emphasis is placed on the development of the data smoothing and analysis framework. The human brain cortex is a highly convoluted surface. Due to the convoluted non-Euclidean surface geometry, data smoothing and analysis on the cortex are inherently difficult. When measurements lie on a curved surface, it is natural to assign kernel smoothing weights based on the geodesic distance along the surface rather than the Euclidean distance. We present a new data smoothing framework that addresses this problem implicitly, without actually computing the geodesic distance, and present its statistical properties. Afterwards, the statistical inference is based on multiple comparison correction using random field theory. As an illustration, we have applied the method to detecting the regions of abnormal cortical thickness in 16 high-functioning autistic children. PMID:17354731
A statistical approach for polarized parton distributions
NASA Astrophysics Data System (ADS)
Bourrely, C.; Soffer, J.; Buccella, F.
2002-04-01
A global next-to-leading order QCD analysis of unpolarized and polarized deep-inelastic scattering data is performed with parton distributions constructed in a statistical physical picture of the nucleon. The chiral properties of QCD lead to strong relations between quark and antiquark distributions, and the importance of the Pauli exclusion principle is also emphasized. We obtain a good description, in a broad range of x and Q^2, of all measured structure functions in terms of very few free parameters. We stress the fact that at RHIC-BNL the ratio of the unpolarized cross sections for the production of W^+ and W^- in pp collisions will directly probe the behavior of the $\bar{d}(x)/\bar{u}(x)$ ratio for x ≥ 0.2, a definite and important test for the statistical model. Finally, we give specific predictions for various helicity asymmetries for W^± and Z production in pp collisions at high energies, which will be measured in forthcoming experiments at RHIC-BNL and which are sensitive tests of the statistical model for $\Delta\bar{u}(x)$ and $\Delta\bar{d}(x)$.
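At the input scale, the statistical picture gives the quark distributions a Fermi-Dirac shape in $x$; schematically (normalization factors and the diffractive term of the published fit are omitted, and details may differ from the paper's exact parameterization):

\[
x\,q^{h}(x, Q_0^2) \;\sim\; \frac{A\,x^{b}}{\exp\!\left[\left(x - X_{0q}^{h}\right)/\bar{x}\right] + 1},
\]

where $X_{0q}^{h}$ plays the role of a thermodynamic potential for a quark of flavor $q$ and helicity $h$, and $\bar{x}$ that of a universal "temperature"; the Pauli exclusion principle stressed in the abstract enters through this Fermi-Dirac form.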
Supersymmetric Liouville theory: A statistical mechanical approach
Barrozo, M.C.; Belvedere, L.V.
1996-02-01
The statistical mechanical system associated with the two-dimensional supersymmetric Liouville theory is obtained through an infrared-finite perturbation expansion. Considering the system confined in a finite volume and in the presence of a uniform neutralizing background, we show that the grand-partition function of this system describes a one-component gas, in which the Boltzmann factor is weighted by an integration over the Grassmann variables. This weight function introduces the dimensional reduction phenomenon. After performing the thermodynamic limit, the resulting supersymmetric quantum theory is translationally invariant.
Random graph coloring: statistical physics approach.
van Mourik, J; Saad, D
2002-11-01
The problem of vertex coloring in random graphs is studied using methods of statistical physics and probability. Our analytical results are compared to those obtained by exact enumeration and Monte Carlo simulations. We critically discuss the merits and shortcomings of the various methods, and interpret the results obtained. We present an exact analytical expression for the two-coloring problem as well as general replica symmetric approximated solutions for the thermodynamics of the graph coloring problem with p colors and K-body edges. PMID:12513569
Statistical mechanical approach to human language
NASA Astrophysics Data System (ADS)
Kosmidis, Kosmas; Kalampokis, Alkiviadis; Argyrakis, Panos
2006-07-01
We use the formulation of equilibrium statistical mechanics in order to study some important characteristics of language. Using a simple expression for the Hamiltonian of a language system, which is directly implied by the Zipf law, we are able to explain several characteristic features of human language that seem completely unrelated, such as the universality of the Zipf exponent, the vocabulary size of children, the reduced communication abilities of people suffering from schizophrenia, etc. While several explanations are necessarily only qualitative at this stage, we have, nevertheless, been able to derive a formula for the vocabulary size of children as a function of age, which agrees rather well with experimental data.
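A sketch of the kind of Hamiltonian "directly implied by the Zipf law" (a common construction in this literature; the paper's exact expression may differ): assign the word of frequency rank $r$ an energy growing logarithmically in rank, so that the Boltzmann factor reproduces a power law,

\[
\varepsilon_r = k \ln r
\;\Longrightarrow\;
P(r) \propto e^{-\beta \varepsilon_r} = r^{-\beta k},
\]

recovering Zipf's law $P(r) \sim r^{-\alpha}$ with exponent $\alpha = \beta k$; the observed universality of the Zipf exponent then maps onto the universality of the product $\beta k$.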
A Hierarchical Statistic Methodology for Advanced Memory System Evaluation
Sun, X.-J.; He, D.; Cameron, K.W.; Luo, Y.
1999-04-12
Advances in technology have resulted in a widening of the gap between computing speed and memory access time. Data access time has become increasingly important for computer system design. Various hierarchical memory architectures have been developed. The performance of these advanced memory systems, however, varies with applications and problem sizes. How to reach an optimal cost/performance design still eludes researchers. In this study, the authors introduce an evaluation methodology for advanced memory systems. This methodology is based on statistical factorial analysis and performance scalability analysis. It is twofold: it first determines the impact of memory systems and application programs on overall performance; it then identifies the bottleneck in a memory hierarchy and provides cost/performance comparisons via scalability analysis. Different memory systems can be compared in terms of mean performance or scalability over a range of codes and problem sizes. Experimental testing has been performed extensively on the Department of Energy's Accelerated Strategic Computing Initiative (ASCI) machines and benchmarks available at the Los Alamos National Laboratory to validate this newly proposed methodology. Experimental and analytical results show this methodology is simple and effective. It is a practical tool for memory system evaluation and design. Its extension to general architectural evaluation and to parallel computer systems is possible and should be further explored.
A Statistical Approach for Ambiguous Sequence Mappings
Technology Transfer Automated Retrieval System (TEKTRAN)
When attempting to map RNA sequences to a reference genome, high percentages of short sequence reads are often assigned to multiple genomic locations. One approach to handling these “ambiguous mappings” has been to discard them. This results in a loss of data, which can sometimes be as much as 45% o...
Statistical Physics Approaches to RNA Editing
NASA Astrophysics Data System (ADS)
Bundschuh, Ralf
2012-02-01
The central dogma of molecular biology states that DNA is transcribed base by base into RNA, which is in turn translated into proteins. However, some organisms edit their RNA before translation by inserting, deleting, or substituting individual bases or short stretches of bases. In many instances the mechanisms by which an organism recognizes the positions at which to edit, or by which it performs the actual editing, are unknown. One model system that stands out by its very high editing rate, on average one out of every 25 bases, is the Myxomycetes, a class of slime molds. In this talk we will show how computational methods and concepts from statistical physics can be used to analyze DNA and protein sequence data to predict editing sites in these slime molds and to guide experiments that identified previously unknown types of editing, as well as the complete set of editing events in the slime mold Physarum polycephalum.
Advanced Safeguards Approaches for New Reprocessing Facilities
Durst, Philip C.; Therios, Ike; Bean, Robert; Dougan, A.; Boyer, Brian; Wallace, Richard; Ehinger, Michael H.; Kovacic, Don N.; Tolk, K.
2007-06-24
U.S. efforts to promote the international expansion of nuclear energy through the Global Nuclear Energy Partnership (GNEP) will result in a dramatic expansion of nuclear fuel cycle facilities in the United States. New demonstration facilities, such as the Advanced Fuel Cycle Facility (AFCF), the Advanced Burner Reactor (ABR), and the Consolidated Fuel Treatment Center (CFTC) will use advanced nuclear and chemical process technologies that must incorporate increased proliferation resistance to enhance nuclear safeguards. The ASA-100 Project, “Advanced Safeguards Approaches for New Nuclear Fuel Cycle Facilities,” commissioned by the NA-243 Office of NNSA, has been tasked with reviewing and developing advanced safeguards approaches for these demonstration facilities. Because one goal of GNEP is developing and sharing proliferation-resistant nuclear technology and services with partner nations, the safeguards approaches considered are consistent with international safeguards as currently implemented by the International Atomic Energy Agency (IAEA). This first report reviews possible safeguards approaches for the new fuel reprocessing processes to be deployed at the AFCF and CFTC facilities. Similar analyses addressing the ABR and transuranic (TRU) fuel fabrication lines at AFCF and CFTC will be presented in subsequent reports.
An Integrated, Statistical Molecular Approach to the Physical Chemistry Curriculum
ERIC Educational Resources Information Center
Cartier, Stephen F.
2009-01-01
As an alternative to the "thermodynamics first" or "quantum first" approaches to the physical chemistry curriculum, the statistical definition of entropy and the Boltzmann distribution are introduced in the first days of the course and the entire two-semester curriculum is then developed from these concepts. Once the tools of statistical mechanics…
Statistical Approach To Determination Of Texture In SAR
NASA Technical Reports Server (NTRS)
Rignot, Eric J.; Kwok, Ronald
1993-01-01
Paper presents a statistical approach to the analysis of texture in synthetic-aperture-radar (SAR) images. Objective: to extract the intrinsic spatial variability of a distributed target from the overall spatial variability of the SAR image.
Chemical Approaches for Advanced Optical Imaging
NASA Astrophysics Data System (ADS)
Chen, Zhixing
Advances in optical microscopy have been constantly expanding our knowledge of biological systems. The achievements therein are a result of close collaborations between physicists/engineers who build the imaging instruments and chemists/biochemists who design the corresponding probe molecules. In this work I present a number of chemical approaches for the development of advanced optical imaging methods. Chapter 1 provides an overview of the recent advances of novel imaging approaches taking advantage of chemical tag technologies. Chapter 2 describes the second-generation covalent trimethoprim-tag as a viable tool for live cell protein-specific labeling and imaging. In Chapter 3 we present a fluorescence lifetime imaging approach to map protein-specific micro-environment in live cells using TMP-Cy3 as a chemical probe. In Chapter 4, we present a method harnessing photo-activatable fluorophores to extend the fundamental depth limit in multi-photon microscopy. Chapter 5 describes the development of isotopically edited alkyne palette for multi-color live cell vibrational imaging of cellular small molecules. These studies exemplify the impact of modern chemical approaches in the development of advanced optical microscopies.
An approach to dyspnea in advanced disease.
Gallagher, Romayne
2003-01-01
INTRODUCTION: To describe an approach to assessment and treatment of dyspnea. SOURCES OF INFORMATION: New level I evidence can guide management of dyspnea in advanced illness. Assessment and use of adjuvant medications and oxygen relies on level II and III evidence. MAIN MESSAGE: Opioids are first-line therapy for managing dyspnea in advanced illness. They are safe and effective in reducing shortness of breath. Neuroleptics are useful adjuvant medications. Evidence does not support use of oxygen for every patient experiencing dyspnea; it should be tried for patients who do not benefit from first-line medications and nonmedicinal therapies. CONCLUSION: Opioids relieve dyspnea and are indicated as first-line treatment for dyspnea arising from advanced disease of any cause. PMID:14708926
NASA Astrophysics Data System (ADS)
Mountcastle, Donald B.; Bucy, Brandon R.; Thompson, John R.
2007-11-01
Equilibrium properties of macroscopic systems are highly predictable as n, the number of particles, approaches and exceeds Avogadro's number; theories of statistical physics depend on these results. Typical pedagogical devices used in statistical physics textbooks to introduce entropy (S) and multiplicity (ω) (where S = k ln(ω)) include flipping coins and/or other equivalent binary events, repeated n times. Prior to instruction, our statistical mechanics students usually gave reasonable answers about the probabilities, but not the relative uncertainties, of the predicted outcomes of such events. However, they reliably predicted that the uncertainty in a measured continuous quantity (e.g., the amount of rainfall) does decrease as the number of measurements increases. Typical textbook presentations assume that students understand that the relative uncertainty of binary outcomes will similarly decrease as the number of events increases. This is at odds with our findings, even though most of our students had previously completed mathematics courses in statistics, as well as an advanced electronics laboratory course that included statistical analysis of distributions of dart scores as n increased.
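The step students reportedly miss can be made explicit: for binary outcomes the absolute spread grows with n while the relative spread shrinks, exactly as for repeated continuous measurements. For n fair-coin flips with heads count $N_H$,

\[
\langle N_H \rangle = \frac{n}{2},
\qquad
\sigma_{N_H} = \frac{\sqrt{n}}{2},
\qquad
\frac{\sigma_{N_H}}{\langle N_H \rangle} = \frac{1}{\sqrt{n}},
\]

so at $n \sim 10^{23}$ the relative fluctuation falls below $10^{-11}$, which is why equilibrium properties become highly predictable.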
Automated statistical approach to Langley evaluation for a solar radiometer.
Kuester, Michele A; Thome, Kurtis J; Reagan, John A
2003-08-20
We present a statistical approach to Langley evaluation (SALE) leading to an improved method of calibration of an automated solar radiometer. Software was developed with the SALE method to first determine whether a day is a good calibration day and then to automatically calculate an intercept value for the solar radiometer. Results from manual processing of calibration data sets agree with those of the automated method to within the errors of each approach. PMID:12952339
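The Langley method rests on the Beer-Lambert relation V = V0 exp(-tau m): on a stable day, ln(V) is linear in airmass m, and extrapolating to m = 0 gives the calibration constant V0. A hedged sketch with synthetic data (the paper's screening criteria for a "good calibration day" are not reproduced here):

    import numpy as np

    def langley_intercept(airmass, voltage):
        """Fit ln(V) = ln(V0) - tau*m; return V0 (zero-airmass intercept) and tau."""
        slope, ln_v0 = np.polyfit(airmass, np.log(voltage), 1)
        return np.exp(ln_v0), -slope

    # synthetic morning data with 0.5% noise (illustrative values only)
    rng = np.random.default_rng(1)
    m = np.linspace(2.0, 6.0, 20)
    v = 1.23 * np.exp(-0.18 * m) * (1.0 + 0.005 * rng.standard_normal(20))
    v0, tau = langley_intercept(m, v)
    print(f"V0 = {v0:.3f}, tau = {tau:.3f}")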
Advances in Testing the Statistical Significance of Mediation Effects
ERIC Educational Resources Information Center
Mallinckrodt, Brent; Abraham, W. Todd; Wei, Meifen; Russell, Daniel W.
2006-01-01
P. A. Frazier, A. P. Tix, and K. E. Barron (2004) highlighted a normal theory method popularized by R. M. Baron and D. A. Kenny (1986) for testing the statistical significance of indirect effects (i.e., mediator variables) in multiple regression contexts. However, simulation studies suggest that this method lacks statistical power relative to some…
Reconciling Statistical and Systems Science Approaches to Public Health
ERIC Educational Resources Information Center
Ip, Edward H.; Rahmandad, Hazhir; Shoham, David A.; Hammond, Ross; Huang, Terry T. -K.; Wang, Youfa; Mabry, Patricia L.
2013-01-01
Although systems science has emerged as a set of innovative approaches to study complex phenomena, many topically focused researchers including clinicians and scientists working in public health are somewhat befuddled by this methodology that at times appears to be radically different from analytic methods, such as statistical modeling, to which…
A Standardization Approach to Adjusting Pretest Item Statistics.
ERIC Educational Resources Information Center
Chang, Shun-Wen; Hanson, Bradley A.; Harris, Deborah J.
This study presents and evaluates a method of standardization that may be used by test practitioners to standardize classical item statistics when sample sizes are small. The effectiveness of this standardization approach was compared through simulation with the one-parameter logistic (1PL) and three parameter logistic (3PL) models based on the…
New Results in the Quantum Statistical Approach to Parton Distributions
NASA Astrophysics Data System (ADS)
Soffer, Jacques; Bourrely, Claude; Buccella, Franco
2015-02-01
We will describe the quantum statistical approach to parton distributions, which allows one to obtain simultaneously the unpolarized distributions and the helicity distributions. We will present some recent results, in particular related to the nucleon spin structure in QCD. Future measurements will provide challenging checks of the validity of this novel physical framework.
A κ-generalized statistical mechanics approach to income analysis
NASA Astrophysics Data System (ADS)
Clementi, F.; Gallegati, M.; Kaniadakis, G.
2009-02-01
This paper proposes a statistical mechanics approach to the analysis of income distribution and inequality. A new distribution function, having its roots in the framework of κ-generalized statistics, is derived that is particularly suitable for describing the whole spectrum of incomes, from the low-middle income region up to the high income Pareto power-law regime. Analytical expressions for the shape, moments and some other basic statistical properties are given. Furthermore, several well-known econometric tools for measuring inequality, which all exist in a closed form, are considered. A method for parameter estimation is also discussed. The model is shown to fit remarkably well the data on personal income for the United States, and the analysis of inequality performed in terms of its parameters is revealed as very powerful.
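For readers unfamiliar with the formalism, the model's survival function is built from the Kaniadakis kappa-exponential. A sketch under the assumption that the complementary CDF takes the form exp_kappa(-(x/beta)^alpha), with made-up parameter values:

    import numpy as np

    def exp_kappa(x, kappa):
        """Kaniadakis kappa-exponential; reduces to exp(x) as kappa -> 0."""
        return (np.sqrt(1.0 + kappa**2 * x**2) + kappa * x) ** (1.0 / kappa)

    def income_ccdf(x, alpha, beta, kappa):
        """P(X > x) for the kappa-generalized income model."""
        return exp_kappa(-(x / beta) ** alpha, kappa)

    x = np.logspace(0, 3, 7)  # incomes in arbitrary units
    print(income_ccdf(x, alpha=2.0, beta=50.0, kappa=0.7))
    # the large-x tail decays like x**(-alpha/kappa): the Pareto power-law regime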
A Statistical Approach to Autocorrelation Detection of Low Frequency Earthquakes
NASA Astrophysics Data System (ADS)
Aguiar, A. C.; Beroza, G. C.
2012-12-01
We have analyzed tremor data during the April 2006 tremor episode in the Nankai Trough in SW Japan using the autocorrelation approach of Brown et al. (2008), which detects low frequency earthquakes (LFEs) based on pair-wise matching. We have found that the statistical behavior of the autocorrelations differs from station to station, and for this reason we have based our LFE detection method on the autocorrelation of each station individually. Analyzing one station at a time ensures that the detection threshold depends only on the station being analyzed. Once detections are found at each station individually, using a low detection threshold based on a Gaussian distribution of the correlation coefficients, the results are compared across stations and declared a detection if they appear at a statistically significant number of stations, following multinomial statistics. We have compared our detections using the single-station method to the detections found by Shelly et al. (2007) for the 2006 April 16 events and find a significant number of similar detections as well as many new detections that were not found using templates from known LFEs. We are working towards developing a sound statistical basis for event detection. This approach should improve our ability to detect LFEs within weak tremor signals where they are not already identified, and should be applicable to earthquake swarms and sequences in general.
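A schematic rendering of the two-stage logic described in the abstract, with synthetic numbers; the per-station threshold, window bookkeeping, and station count are assumptions for illustration:

    import numpy as np
    from scipy import stats

    def station_detections(cc, p_tail=1e-4):
        """Flag windows whose correlation coefficient exceeds a threshold set
        from a Gaussian fit to that station's own coefficient distribution."""
        thresh = cc.mean() + stats.norm.isf(p_tail) * cc.std()
        return cc > thresh

    def network_detection(flags, p_single=1e-4, alpha=1e-5):
        """Declare an event when the number of stations flagging the same window
        is improbably large under independent chance (binomial tail test)."""
        counts = flags.sum(axis=0)
        p_vals = stats.binom.sf(counts - 1, flags.shape[0], p_single)  # P(X >= k)
        return p_vals < alpha

    rng = np.random.default_rng(2)
    cc = rng.normal(0.1, 0.05, size=(8, 20000))  # 8 stations, noise-only coefficients
    flags = np.vstack([station_detections(c) for c in cc])
    print(network_detection(flags).sum(), "windows pass the network consensus test")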
Statistics of topography : multifractal approach to describe planetary topography
NASA Astrophysics Data System (ADS)
Landais, Francois; Schmidt, Frédéric; Lovejoy, Shaun
2016-04-01
In the last decades, a huge amount of topographic data has been obtained by several techniques (laser and radar altimetry, DTM…) for different bodies in the solar system. In each case, topographic fields exhibit an extremely high variability with details at every scale, from millimeters to thousands of kilometers. In our study, we investigate the statistical properties of the topography. Our statistical approach is motivated by the well-known scaling behavior of topography that has been widely studied in the past. Indeed, scaling laws are strongly present in geophysical fields and can be studied using the fractal formalism. More precisely, we expect multifractal behavior in global topographic fields. This behavior reflects the high variability and intermittency observed in topographic fields, which cannot be generated by simple scaling models. In the multifractal formalism, each statistical moment exhibits a different scaling law characterized by a function called the moment scaling function. Previous studies conducted at regional scale demonstrated that topography presents multifractal statistics (Gagnon et al., 2006, NPG). We have obtained similar results on Mars (Landais et al. 2015) and more recently on different bodies in the solar system including the Moon, Venus and Mercury. We present the results of different multifractal approaches performed on a global and regional basis and compare the fractal parameters from one body to another.
Statistical Approach to Quality Control of Large Thermodynamic Databases
NASA Astrophysics Data System (ADS)
Nyman, Henrik; Talonen, Tarja; Roine, Antti; Hupa, Mikko; Corander, Jukka
2012-10-01
In chemistry and engineering, thermodynamic databases are widely used to obtain the basic properties of pure substances or mixtures. Large and reliable databases are the basis of all thermodynamic modeling of complex chemical processes or systems. However, the effort needed in the establishment, maintenance, and management of a database increases exponentially along with the size and scope of the database. Therefore, we developed a statistical modeling approach to assist an expert in the evaluation and management process, which can pinpoint various types of erroneous records in a database. We have applied this method to investigate the enthalpy, entropy, and heat capacity characteristics in a large commercial database for approximately 25,000 chemical species. Our highly successful results show that a statistical approach is a valuable tool (1) for managing such databases and (2) for creating enthalpy, entropy, and heat capacity estimates for species for which thermochemical data are not available.
Primordial statistical anisotropies: the effective field theory approach
NASA Astrophysics Data System (ADS)
Akbar Abolhasani, Ali; Akhshik, Mohammad; Emami, Razieh; Firouzjahi, Hassan
2016-03-01
In this work we present the effective field theory of primordial statistical anisotropies generated during anisotropic inflation involving a background U(1) gauge field. Besides the usual Goldstone boson associated with the breaking of time diffeomorphism we have two additional Goldstone bosons associated with the breaking of spatial diffeomorphisms. We further identify these two new Goldstone bosons with the expected two transverse degrees of the U(1) gauge field fluctuations. Upon defining the appropriate unitary gauge, we present the most general quadratic action which respects the remnant symmetry in the unitary gauge. The interactions between the various Goldstone bosons lead to statistical anisotropy in the curvature perturbation power spectrum. Calculating the general results for power spectrum anisotropy, we recover the previously known results in specific models of anisotropic inflation. In addition, we present novel results for statistical anisotropy in models with non-trivial sound speed for inflaton fluctuations. We also identify the interaction which leads to birefringence-like effects in the anisotropic power spectrum, in which the speed of gauge field fluctuations depends on the direction of mode propagation and the two polarizations of the gauge field fluctuations contribute differently to the statistical anisotropy. As another interesting application, our EFT approach naturally captures interactions generating parity-violating statistical anisotropies.
A new statistical approach to climate change detection and attribution
NASA Astrophysics Data System (ADS)
Ribes, Aurélien; Zwiers, Francis W.; Azaïs, Jean-Marc; Naveau, Philippe
2016-04-01
We propose here a new statistical approach to climate change detection and attribution that is based on additive decomposition and simple hypothesis testing. Most current statistical methods for detection and attribution rely on linear regression models where the observations are regressed onto expected response patterns to different external forcings. These methods do not use physical information provided by climate models regarding the expected response magnitudes to constrain the estimated responses to the forcings. Climate modelling uncertainty is difficult to take into account with regression-based methods and is almost never treated explicitly. As an alternative to this approach, our statistical model is based only on the additivity assumption; the proposed method does not regress observations onto expected response patterns. We introduce estimation and testing procedures based on likelihood maximization, and show that climate modelling uncertainty can easily be accounted for. Some discussion is provided on how to practically estimate the climate modelling uncertainty based on an ensemble of opportunity. Our approach is based on the "models are statistically indistinguishable from the truth" paradigm, where the difference between any given model and the truth has the same distribution as the difference between any pair of models, but other choices might also be considered. The properties of this approach are illustrated and discussed based on synthetic data. Lastly, the method is applied to the linear trend in global mean temperature over the period 1951-2010. Consistent with the last IPCC assessment report, we find that most of the observed warming over this period (+0.65 K) is attributable to anthropogenic forcings (+0.67 ± 0.12 K, 90 % confidence range), with a very limited contribution from natural forcings (-0.01 ± 0.02 K).
New Statistical Approaches to RHESSI Solar Flare Imaging
NASA Astrophysics Data System (ADS)
Schwartz, Richard A.; Benvenuto, F.; Massone, A.; Piana, M.; Sorrentino, A.
2012-05-01
We present two statistical approaches to image reconstruction from RHESSI measurements. The first approach implements maximum likelihood by means of an expectation-maximization algorithm resembling the Lucy-Richardson method. The second approach is genuinely Bayesian in that it introduces a prior probability distribution coding information known a priori about the flaring source. The posterior distribution is computed by means of an importance sampling Monte Carlo technique. Further, this approach will be extended to a filtering method in which the posterior distribution at a specific energy or time interval is used as a prior for the next interval. Finally, we will also study the possibility of adapting this method to multi-scale reconstruction exploiting the different resolution powers provided by the nine RHESSI collimators.
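The first approach can be illustrated with the generic Lucy-Richardson multiplicative update for Poisson maximum likelihood; the toy response matrix below is an assumption, not the RHESSI modulation response:

    import numpy as np

    def richardson_lucy(y, A, n_iter=200):
        """EM iteration for Poisson counts y ~ Poisson(A x): each step multiplies
        the current image by the back-projected ratio of observed to model counts."""
        x = np.full(A.shape[1], y.sum() / A.shape[1])  # flat, flux-conserving start
        sens = A.sum(axis=0)                           # per-pixel sensitivity
        for _ in range(n_iter):
            model = A @ x
            x *= (A.T @ (y / np.maximum(model, 1e-12))) / np.maximum(sens, 1e-12)
        return x

    # toy problem: 64 detector bins, 32 image pixels, two point sources
    rng = np.random.default_rng(3)
    A = rng.random((64, 32))
    x_true = np.zeros(32)
    x_true[10], x_true[20] = 50.0, 30.0
    y = rng.poisson(A @ x_true).astype(float)
    print(richardson_lucy(y, A).round(1))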
Advanced Approach of Multiagent Based Buoy Communication
Gricius, Gediminas; Drungilas, Darius; Andziulis, Arunas; Dzemydiene, Dale; Voznak, Miroslav; Kurmis, Mindaugas; Jakovlev, Sergej
2015-01-01
Usually, a hydrometeorological information system is faced with great data flows, but the data levels are often excessive, depending on the observed region of the water. The paper presents advanced buoy communication technologies based on multiagent interaction and data exchange between several monitoring system nodes. The proposed management of buoy communication is based on a clustering algorithm, which enables the performance of the hydrometeorological information system to be enhanced. The experiment is based on the design and analysis of the inexpensive but reliable Baltic Sea autonomous monitoring network (buoys), which would be able to continuously monitor and collect temperature, waviness, and other required data. The proposed approach of multiagent based buoy communication enables all the data from the coastal-based station to be monitored with limited transition speed by setting different tasks for the agent-based buoy system according to the clustering information. PMID:26345197
Defining statistical perceptions with an empirical Bayesian approach
NASA Astrophysics Data System (ADS)
Tajima, Satohiro
2013-04-01
Extracting statistical structures (including textures or contrasts) from a natural stimulus is a central challenge in both biological and engineering contexts. This study interprets the process of statistical recognition in terms of hyperparameter estimations and free-energy minimization procedures with an empirical Bayesian approach. This mathematical interpretation resulted in a framework for relating physiological insights in animal sensory systems to the functional properties of recognizing stimulus statistics. We applied the present theoretical framework to two typical models of natural images that are encoded by a population of simulated retinal neurons, and demonstrated that the resulting cognitive performances could be quantified with the Fisher information measure. The current enterprise yielded predictions about the properties of human texture perception, suggesting that the perceptual resolution of image statistics depends on visual field angles, internal noise, and neuronal information processing pathways, such as the magnocellular, parvocellular, and koniocellular systems. Furthermore, the two conceptually similar natural-image models were found to yield qualitatively different predictions, striking a note of warning against confusing the two models when describing a natural image.
Advances in assessing geomorphic plausibility in statistical susceptibility modelling
NASA Astrophysics Data System (ADS)
Steger, Stefan; Brenning, Alexander; Bell, Rainer; Petschko, Helene; Glade, Thomas
2014-05-01
The quality, reliability and applicability of landslide susceptibility maps are regularly deduced directly by interpreting quantitative model performance measures. These quantitative estimates are usually calculated for an independent test sample of a landslide inventory. Numerous studies demonstrate that totally unbiased landslide inventories are rarely available. We assume that such biases are also inherent in the test sample used to quantitatively validate the models. Therefore we suppose that the explanatory power of statistical performance measures is limited by the quality of the inventory used to calculate these statistics. To investigate this assumption, we generated and validated 16 statistical susceptibility models by using two landslide inventories of differing quality for the Rhenodanubian Flysch zone of Lower Austria (1,354 km²). The ALS-based (Airborne Laser Scan) inventory (n=6,218) was mapped specifically for susceptibility modelling from a high-resolution hillshade and exhibits a high positional accuracy. The less accurate building ground register (BGR; n=681) provided by the Geological Survey of Lower Austria represents reported damaging events and shows a substantially lower completeness. Both inventories exhibit differing systematic biases regarding land cover. For instance, due to human impact on the visibility of geomorphic structures (e.g. planation), few ALS landslides could be mapped on settlements and pastures (ALS-mapping bias). In contrast, damaging events were frequently reported for settlements and pastures (BGR-report bias). Susceptibility maps were calculated by applying four multivariate classification methods, namely generalized linear model, generalized additive model, random forest and support vector machine, separately for both inventories and two sets of explanatory variables (with and without land cover). Quantitative validation was performed by calculating the area under the receiver operating characteristics curve (AUROC
A statistical approach to optimizing concrete mixture design.
Ahmad, Shamsad; Alghamdi, Saeid A
2014-01-01
A step-by-step statistical approach is proposed to obtain optimum proportioning of concrete mixtures using the data obtained through a statistically planned experimental program. The utility of the proposed approach for optimizing the design of concrete mixture is illustrated considering a typical case in which trial mixtures were considered according to a full factorial experiment design involving three factors and their three levels (3³). A total of 27 concrete mixtures with three replicates (81 specimens) were considered by varying the levels of key factors affecting compressive strength of concrete, namely, water/cementitious materials ratio (0.38, 0.43, and 0.48), cementitious materials content (350, 375, and 400 kg/m³), and fine/total aggregate ratio (0.35, 0.40, and 0.45). The experimental data were utilized to carry out analysis of variance (ANOVA) and to develop a polynomial regression model for compressive strength in terms of the three design factors considered in this study. The developed statistical model was used to show how optimization of concrete mixtures can be carried out with different possible options. PMID:24688405
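The design itself is straightforward to reproduce. A sketch of the 3³ factorial grid and a quadratic response-surface fit; the strength values below are placeholders, not the paper's measurements:

    import itertools
    import numpy as np

    # the three factors and levels from the study
    wc = [0.38, 0.43, 0.48]   # water/cementitious materials ratio
    cm = [350, 375, 400]      # cementitious materials content, kg/m^3
    fa = [0.35, 0.40, 0.45]   # fine/total aggregate ratio

    design = np.array(list(itertools.product(wc, cm, fa)))  # 27 runs (3^3)

    def quadratic_terms(X):
        """Columns: intercept, linear, squared, and pairwise interaction terms."""
        x1, x2, x3 = X.T
        return np.column_stack([np.ones(len(X)), x1, x2, x3,
                                x1**2, x2**2, x3**2, x1*x2, x1*x3, x2*x3])

    # placeholder responses standing in for the 27 mean compressive strengths
    rng = np.random.default_rng(4)
    y = 40.0 - 60.0 * (design[:, 0] - 0.43) + 0.02 * design[:, 1] + rng.normal(0, 1, 27)

    beta, *_ = np.linalg.lstsq(quadratic_terms(design), y, rcond=None)
    print("fitted polynomial coefficients:", np.round(beta, 3))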
Statistical Methods Handbook for Advanced Gas Reactor Fuel Materials
J. J. Einerson
2005-05-01
Fuel materials such as kernels, coated particles, and compacts are being manufactured for experiments simulating service in the next generation of high temperature gas reactors. These must meet predefined acceptance specifications. Many tests are performed for quality assurance, and many of these correspond to criteria that must be met with specified confidence, based on random samples. This report describes the statistical methods to be used. The properties of the tests are discussed, including the risk of false acceptance, the risk of false rejection, and the assumption of normality. Methods for calculating sample sizes are also described.
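One concrete instance of the class of calculations described is the classic zero-failure attribute plan: the smallest sample size n for which zero observed defects demonstrates, at confidence C, that the defect fraction is at most p satisfies (1-p)^n <= 1-C. A sketch (not the report's specific acceptance plan):

    import math

    def zero_failure_sample_size(p_defect_max, confidence):
        """Smallest n such that zero defects in n random samples demonstrates,
        at the given confidence, a true defect fraction <= p_defect_max."""
        return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - p_defect_max))

    # e.g. demonstrate at most 1% defective coated particles at 95% confidence
    print(zero_failure_sample_size(0.01, 0.95))  # -> 299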
An advanced approach to reactivity rating.
Kossoy, A; Benin, A; Akhmetshin, Yu
2005-02-14
Reactive hazards remain a significant safety challenge in the chemical industry despite the continual attention devoted to this problem. The application of various criteria recommended by the guidelines for assessment of reactive hazards often leads to unsafe results. The main origins of such failures are as follows: (a) the reactivity of a compound is treated as an inherent property of the compound; (b) some of the relevant criteria are determined using overly simple methods that cannot reveal potential hazards properly. Four well-known hazard indicators--time to certain conversion limit, TCL; adiabatic time to maximum rate, TMR; adiabatic temperature rise; and NFPA reactivity rating number, Nr--are analyzed in the paper. It was ascertained that they can be safely used for preliminary assessment of reactive hazards provided that: (a) the selected indicator is appropriate for the specific conditions of a process; (b) the indicators have been determined using pertinent methods. The applicability limits for each indicator were determined, and an advanced kinetics-based simulation approach, which allows reliable determination of the indicators, is proposed. The technique of applying this approach is illustrated by two practical examples. PMID:15721524
Advanced Safeguards Approaches for New Fast Reactors
Durst, Philip C.; Therios, Ike; Bean, Robert; Dougan, A.; Boyer, Brian; Wallace, Rick L.; Ehinger, Michael H.; Kovacic, Don N.; Tolk, K.
2007-12-15
This third report in the series reviews possible safeguards approaches for new fast reactors in general, and the ABR in particular. Fast-neutron spectrum reactors have been used since the early 1960s on an experimental and developmental level, generally with fertile blanket fuels to “breed” nuclear fuel such as plutonium. Whether the reactor is designed to breed plutonium, or to transmute and “burn” actinides, depends mainly on the design of the reactor neutron reflector and on whether the blanket fuel is “fertile” or suitable for transmutation. However, the safeguards issues are very similar, since they pertain mainly to the receipt, shipment and storage of fresh and spent plutonium and actinide-bearing “TRU” fuel. For these reasons, the design of existing fast reactors and details concerning how they have been safeguarded were studied in developing advanced safeguards approaches for the new fast reactors. In this regard, the design of the Experimental Breeder Reactor-II (EBR-II) at the Idaho National Laboratory (INL) was of interest, because it was designed as a collocated fast reactor with a pyrometallurgical reprocessing and fuel fabrication line – a design option being considered for the ABR. Similarly, the design of the Fast Flux Test Facility (FFTF) on the Hanford Site was studied, because it was a successful prototype fast reactor that ran for two decades to evaluate fuels and the design for commercial-scale fast reactors.
A Flexible Approach for the Statistical Visualization of Ensemble Data
Potter, K.; Wilson, A.; Bremer, P.; Williams, Dean N.; Pascucci, V.; Johnson, C.
2009-09-29
Scientists are increasingly moving towards ensemble data sets to explore relationships present in dynamic systems. Ensemble data sets combine spatio-temporal simulation results generated using multiple numerical models, sampled input conditions and perturbed parameters. While ensemble data sets are a powerful tool for mitigating uncertainty, they pose significant visualization and analysis challenges due to their complexity. We present a collection of overview and statistical displays linked through a high level of interactivity to provide a framework for gaining key scientific insight into the distribution of the simulation results as well as the uncertainty associated with the data. In contrast to methods that present large amounts of diverse information in a single display, we argue that combining multiple linked statistical displays yields a clearer presentation of the data and facilitates a greater level of visual data analysis. We demonstrate this approach using driving problems from climate modeling and meteorology and discuss generalizations to other fields.
Statistically Based Approach to Broadband Liner Design and Assessment
NASA Technical Reports Server (NTRS)
Nark, Douglas M. (Inventor); Jones, Michael G. (Inventor)
2016-01-01
A broadband liner design optimization includes utilizing in-duct attenuation predictions with a statistical fan source model to obtain optimum impedance spectra over a number of flow conditions for one or more liner locations in a bypass duct. The predicted optimum impedance information is then used with acoustic liner modeling tools to design liners having impedance spectra that most closely match the predicted optimum values. Design selection is based on an acceptance criterion that provides the ability to apply increasing weighting to specific frequencies and/or operating conditions. One or more broadband design approaches are utilized to produce a broadband liner that targets a full range of frequencies and operating conditions.
Statistical Approaches for the Study of Cognitive and Brain Aging
Chen, Huaihou; Zhao, Bingxin; Cao, Guanqun; Proges, Eric C.; O'Shea, Andrew; Woods, Adam J.; Cohen, Ronald A.
2016-01-01
Neuroimaging studies of cognitive and brain aging often yield massive datasets that create many analytic and statistical challenges. In this paper, we discuss and address several limitations in the existing work. (1) Linear models are often used to model the age effects on neuroimaging markers, which may be inadequate in capturing the potential nonlinear age effects. (2) Marginal correlations are often used in brain network analysis, which are not efficient in characterizing a complex brain network. (3) Due to the challenge of high-dimensionality, only a small subset of the regional neuroimaging markers is considered in a prediction model, which could miss important regional markers. To overcome those obstacles, we introduce several advanced statistical methods for analyzing data from cognitive and brain aging studies. Specifically, we introduce semiparametric models for modeling age effects, graphical models for brain network analysis, and penalized regression methods for selecting the most important markers in predicting cognitive outcomes. We illustrate these methods using the healthy aging data from the Active Brain Study. PMID:27486400
STATISTICS OF DARK MATTER HALOS FROM THE EXCURSION SET APPROACH
Lapi, A.; Salucci, P.; Danese, L.
2013-08-01
We exploit the excursion set approach in integral formulation to derive novel, accurate analytic approximations of the unconditional and conditional first crossing distributions for random walks with uncorrelated steps and general shapes of the moving barrier; we find the corresponding approximations of the unconditional and conditional halo mass functions for cold dark matter (DM) power spectra to represent very well the outcomes of state-of-the-art cosmological N-body simulations. In addition, we apply these results to derive, and confront with simulations, other quantities of interest in halo statistics, including the rates of halo formation and creation, the average halo growth history, and the halo bias. Finally, we discuss how our approach and main results change when considering random walks with correlated instead of uncorrelated steps, and warm instead of cold DM power spectra.
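The first-crossing distribution that the analytic formulas approximate can also be estimated by brute force. A minimal Monte Carlo sketch for uncorrelated-step walks; the moving-barrier shape below is a hypothetical example, not the paper's barrier:

    import numpy as np

    def first_crossing_hist(n_walks=20000, n_steps=400, ds=0.05):
        """Monte Carlo first-crossing distribution in the variance S for random
        walks with uncorrelated steps and a moving barrier B(S)."""
        rng = np.random.default_rng(5)
        s = ds * np.arange(1, n_steps + 1)
        barrier = 1.686 + 0.4 * np.sqrt(s)  # hypothetical moving barrier B(S)
        walks = np.cumsum(rng.normal(0.0, np.sqrt(ds), (n_walks, n_steps)), axis=1)
        crossed = walks > barrier           # barrier broadcasts across walks
        hit = crossed.any(axis=1)
        first_idx = np.argmax(crossed, axis=1)  # index of first True per walk
        hist, edges = np.histogram(s[first_idx[hit]], bins=40, density=True)
        return edges, hist

    edges, hist = first_crossing_hist()
    print(np.round(hist[:8], 3))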
The statistical multifragmentation model: Origins and recent advances
NASA Astrophysics Data System (ADS)
Donangelo, R.; Souza, S. R.
2016-07-01
We review the Statistical Multifragmentation Model (SMM), which considers a generalization of the liquid-drop model for hot nuclei and allows one to calculate thermodynamic quantities characterizing the nuclear ensemble at the disassembly stage. We show how to determine probabilities of definite partitions of finite nuclei and how to determine, through Monte Carlo calculations, observables such as the caloric curve, multiplicity distributions, and heat capacity, among others. Some experimental measurements of the caloric curve confirmed the SMM predictions made over 10 years earlier, leading to a surge of interest in the model. However, the experimental determination of the fragmentation temperatures relies on the yields of different isotopic species, which were not correctly calculated in the schematic liquid-drop picture employed in the SMM. This led to a series of improvements in the SMM, in particular to a more careful choice of nuclear masses and energy densities, especially for the lighter nuclei. With these improvements the SMM is able to make quantitative determinations of isotope production. We show the application of the SMM to the production of exotic nuclei through multifragmentation. These preliminary calculations demonstrate the need for a careful choice of the system size and excitation energy to attain maximum yields.
An alternative approach to advancing resuscitation science.
Kern, Karl B; Valenzuela, Terence D; Clark, Lani L; Berg, Robert A; Hilwig, Ronald W; Berg, Marc D; Otto, Charles W; Newburn, Daniel; Ewy, Gordon A
2005-03-01
Stagnant survival rates in out-of-hospital cardiac arrest remain a great impetus for advancing resuscitation science. International resuscitation guidelines, with all their advantages for standardizing resuscitation therapeutic protocols, can be difficult to change. A formalized evidence-based process has been adopted by the International Liaison Committee on Resuscitation (ILCOR) in formulating such guidelines. Currently, randomized clinical trials are considered optimal evidence, and very few major changes in the Guidelines for Cardiopulmonary Resuscitation and Emergency Cardiovascular Care are made without such trials. An alternative approach is to give externally controlled clinical trials more weight in guideline formulation and resuscitation protocol adoption. In Tucson, Arizona (USA), the Fire Department cardiac arrest database has revealed a number of resuscitation issues. These include a poor bystander CPR rate, a lack of response to initial defibrillation after prolonged ventricular fibrillation, and substantial time without chest compressions during the resuscitation effort. A local change in our previous resuscitation protocols was instituted based upon this historical database information. PMID:15733752
A feature refinement approach for statistical interior CT reconstruction
NASA Astrophysics Data System (ADS)
Hu, Zhanli; Zhang, Yunwan; Liu, Jianbo; Ma, Jianhua; Zheng, Hairong; Liang, Dong
2016-07-01
Interior tomography is clinically desired to reduce the radiation dose delivered to patients. In this work, a new statistical interior tomography approach for computed tomography is proposed. The developed design focuses on taking into account the statistical nature of local projection data and recovering fine structures that are lost in conventional total-variation (TV) minimization reconstruction. The proposed method falls within the compressed sensing framework of TV minimization, which only assumes that the interior ROI is piecewise constant or polynomial and does not need any additional prior knowledge. To integrate the statistical distribution property of the projection data, the objective function is built under the criterion of penalized weighted least-squares with TV regularization (PWLS-TV). In the implementation of the proposed method, an interior projection extrapolation based FBP reconstruction is first used as the initial guess to mitigate truncation artifacts and also provide an extended field of view. Moreover, an interior feature refinement step, as an important processing operation, is performed after each iteration of PWLS-TV to recover the desired structure information which is lost during the TV minimization. Here, a feature descriptor is specifically designed and employed to distinguish structure from noise and noise-like artifacts. A modified steepest descent algorithm is adopted to minimize the associated objective function. The proposed method is applied to both digital phantom and in vivo micro-CT datasets, and compared to FBP, ART-TV and PWLS-TV. The reconstruction results demonstrate that the proposed method performs better than the other conventional methods in suppressing noise, reducing truncation and streak artifacts, and preserving features. The proposed approach demonstrates its potential usefulness for feature preservation in interior tomography under truncated projection measurements.
Statistical physics approach to earthquake occurrence and forecasting
NASA Astrophysics Data System (ADS)
de Arcangelis, Lucilla; Godano, Cataldo; Grasso, Jean Robert; Lippiello, Eugenio
2016-04-01
There is striking evidence that the dynamics of the Earth's crust is controlled by a wide variety of mutually dependent mechanisms acting at different spatial and temporal scales. The interplay of these mechanisms produces instabilities in the stress field, leading to abrupt energy releases, i.e., earthquakes. As a consequence, the evolution towards instability before a single event is very difficult to monitor. On the other hand, collective behavior in stress transfer and relaxation within the Earth's crust leads to emergent properties described by stable phenomenological laws for a population of many earthquakes in the size, time and space domains. This observation has stimulated a statistical mechanics approach to earthquake occurrence, applying ideas and methods such as scaling laws, universality, fractal dimension, and the renormalization group to characterize the physics of earthquakes. In this review we first present a description of the phenomenological laws of earthquake occurrence which represent the frame of reference for a variety of statistical mechanical models, ranging from the spring-block to more complex fault models. Next, we discuss the problem of seismic forecasting in the general framework of stochastic processes, where seismic occurrence can be described as a branching process implementing space-time-energy correlations between earthquakes. In this context we show how correlations originate from dynamical scaling relations between time and energy, able to account for universality and provide a unifying description of the phenomenological power laws. Then we discuss how branching models can be implemented to forecast the temporal evolution of the earthquake occurrence probability and allow one to discriminate among different physical mechanisms responsible for earthquake triggering. In particular, the forecasting problem will be presented in a rigorous mathematical framework, discussing the relevance of the processes acting at different temporal scales for different
Thermodynamics, reversibility and Jaynes' approach to statistical mechanics
NASA Astrophysics Data System (ADS)
Parker, Daniel N.
This dissertation contests David Albert's recent arguments that the proposition that the universe began in a particularly low entropy state (the "past hypothesis") is necessary and sufficient to ground the thermodynamic asymmetry against the reversibility objection, which states that the entropy of thermodynamic systems was previously larger than it is now. In turn, it argues that this undermines Albert's suggestion that the past hypothesis can underwrite other temporal asymmetries such as those of records and causation. This thesis thus concerns the broader philosophical problem of understanding the interrelationships among the various temporal asymmetries that we find in the world, such as those of thermodynamic phenomena, causation, human agency and inference. The position argued for is that the thermodynamic asymmetry is nothing more than an inferential asymmetry, reflecting a distinction between the inferences made towards the past and the future. As such, it cannot be used to derive a genuine physical asymmetry. At most, an inferential asymmetry can provide evidence for an asymmetry not itself forthcoming from the formalism of statistical mechanics. The approach offered here utilises an epistemic, information-theoretic interpretation of thermodynamics applied to individual "branch" systems in order to ground irreversible thermodynamic behaviour (Branch systems are thermodynamic systems quasi-isolated from their environments for short periods of time). I argue that such an interpretation solves the reversibility objection by treating thermodynamics as part of a more general theory of statistical inference supported by information theory and developed in the context of thermodynamics by E.T. Jaynes. It is maintained that by using an epistemic interpretation of probability (where the probabilities reflect one's knowledge about a thermodynamic system rather than a property of the system itself), the reversibility objection can be disarmed by severing the link
Multilayer Approach for Advanced Hybrid Lithium Battery.
Ming, Jun; Li, Mengliu; Kumar, Pushpendra; Li, Lain-Jong
2016-06-28
Conventional intercalated rechargeable batteries have shown their capacity limit, and the development of an alternative battery system with higher capacity is strongly needed for sustainable electric vehicles and hand-held devices. Herein, we introduce a feasible and scalable multilayer approach to fabricate a promising hybrid lithium battery with superior capacity and multivoltage plateaus. A sulfur-rich electrode (90 wt % S) is covered by a dual layer of graphite/Li4Ti5O12, where the active materials S and Li4Ti5O12 can both take part in redox reactions and thus deliver a high capacity of 572 mAh g_cathode^(-1) (vs the total mass of the electrode) or 1866 mAh g_S^(-1) (vs the mass of sulfur) at 0.1C (with the definition 1C = 1675 mA g_S^(-1)). The battery shows unique voltage platforms at 2.35 and 2.1 V, contributed from S, and 1.55 V from Li4Ti5O12. A high rate capability of 566 mAh g_cathode^(-1) at 0.25C and 376 mAh g_cathode^(-1) at 1C, with durable cycling over 100 cycles, can be achieved. Operando Raman and electron microscope analysis confirm that the graphite/Li4Ti5O12 layer slows the dissolution/migration of polysulfides, thereby giving rise to higher sulfur utilization and slower capacity decay. This advanced hybrid battery, with a multilayer concept for marrying different voltage plateaus from various electrode materials, opens a way of providing tunable capacity and multiple voltage platforms for energy device applications. PMID:27268064
Statistical approaches and software for clustering islet cell functional heterogeneity
Wills, Quin F.; Boothe, Tobias; Asadi, Ali; Ao, Ziliang; Warnock, Garth L.; Kieffer, Timothy J.
2016-01-01
Worldwide efforts are underway to replace or repair lost or dysfunctional pancreatic β-cells to cure diabetes. However, it is unclear what the final product of these efforts should be, as β-cells are thought to be heterogeneous. To enable the analysis of β-cell heterogeneity in an unbiased and quantitative way, we developed model-free and model-based statistical clustering approaches, and created new software called TraceCluster. Using an example data set, we illustrate the utility of these approaches by clustering dynamic intracellular Ca2+ responses to high glucose in ∼300 simultaneously imaged single islet cells. Using feature extraction from the Ca2+ traces on this reference data set, we identified 2 distinct populations of cells with β-like responses to glucose. To the best of our knowledge, this report represents the first unbiased cluster-based analysis of human β-cell functional heterogeneity of simultaneous recordings. We hope that the approaches and tools described here will be helpful for those studying heterogeneity in primary islet cells, as well as excitable cells derived from embryonic stem cells or induced pluripotent cells. PMID:26909740
Urban pavement surface temperature. Comparison of numerical and statistical approach
NASA Astrophysics Data System (ADS)
Marchetti, Mario; Khalifa, Abderrahmen; Bues, Michel; Bouilloud, Ludovic; Martin, Eric; Chancibaut, Katia
2015-04-01
The forecast of pavement surface temperature is very specific in the context of urban winter maintenance, as it drives decisions on snow plowing and the salting of roads. Such forecasts mainly rely on numerical models based on a description of the energy balance between the atmosphere, the buildings and the pavement, with a canyon configuration. Nevertheless, there is a specific need for a physical description and numerical implementation of traffic in the energy flux balance. Traffic was originally considered as a constant. Many changes were made to a numerical model to describe as accurately as possible the traffic effects on this urban energy balance, such as tire friction, the pavement-air exchange coefficient, and the net infrared flux balance. Experiments based on infrared thermography and radiometry were then conducted to quantify the effect of traffic on urban pavement surfaces. Based on meteorological data, the corresponding pavement temperature forecasts were calculated and compared with field measurements. Results indicated good agreement between the forecasts from the numerical model based on this energy balance approach and the measurements. A complementary forecast approach based on principal component analysis (PCA) and partial least-squares regression (PLS) was also developed, with data from thermal mapping using infrared radiometry. The forecast of pavement surface temperature from air temperature was obtained for the specific case of the urban configuration, with traffic included in the measurements used for the statistical analysis. A comparison between results from the numerical model based on the energy balance and from PCA/PLS was then conducted, indicating the advantages and limits of each approach.
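The statistical branch of such a comparison can be sketched with scikit-learn; the predictors below are synthetic stand-ins for the thermal-mapping and weather variables, so only the workflow, not the result, carries over:

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(6)
    # columns: air temperature, wind, traffic count, ... (synthetic stand-ins)
    X = rng.normal(size=(200, 6))
    y = 2.0 * X[:, 0] - 0.5 * X[:, 2] + rng.normal(0.0, 0.3, 200)  # surface temp

    pca = PCA(n_components=3).fit(X)               # inspect dominant drivers
    pls = PLSRegression(n_components=3).fit(X, y)  # predictive model
    print("explained variance:", np.round(pca.explained_variance_ratio_, 2))
    print("PLS R^2:", round(pls.score(X, y), 3))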
Masked Areas in Shear Peak Statistics: A Forward Modeling Approach
NASA Astrophysics Data System (ADS)
Bard, D.; Kratochvil, J. M.; Dawson, W.
2016-03-01
The statistics of shear peaks have been shown to provide valuable cosmological information beyond the power spectrum, and will be an important constraint on models of cosmology in forthcoming astronomical surveys. Surveys include masked areas due to bright stars, bad pixels, etc., which must be accounted for in producing constraints on cosmology from shear maps. We advocate a forward-modeling approach, where the impacts of masking and other survey artifacts are accounted for in the theoretical prediction of cosmological parameters, rather than correcting survey data to remove them. We use masks based on the Deep Lens Survey, and explore the impact of up to 37% of the survey area being masked on LSST and DES-scale surveys. By reconstructing maps of aperture mass the masking effect is smoothed out, resulting in up to 14% smaller statistical uncertainties compared to simply reducing the survey area by the masked area. We show that, even in the presence of large survey masks, the bias in cosmological parameter estimation produced in the forward-modeling process is ≈1%, dominated by bias caused by limited simulation volume. We also explore how this potential bias scales with survey area and evaluate how much small survey areas are impacted by the differences in cosmological structure in the data and simulated volumes, due to cosmic variance.
Statistical approach to the analysis of cell desynchronization data
NASA Astrophysics Data System (ADS)
Milotti, Edoardo; Del Fabbro, Alessio; Dalla Pellegrina, Chiara; Chignola, Roberto
2008-07-01
Experimental measurements on semi-synchronous tumor cell populations show that after a few cell cycles they desynchronize completely, and this desynchronization reflects the intercell variability of cell-cycle duration. It is important to identify the sources of randomness that desynchronize a population of cells living in a homogeneous environment: for example, being able to reduce randomness and induce synchronization would aid in targeting tumor cells with chemotherapy or radiotherapy. Here we describe a statistical approach to the analysis of the desynchronization measurements that is based on minimal modeling hypotheses, and can be derived from simple heuristics. We use the method to analyze existing desynchronization data and to draw conclusions on the randomness of cell growth and proliferation.
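The heuristic can be made concrete with a toy simulation: give each cell an independent, noisy cycle duration and track how the spread of division times grows with the number of completed cycles. Parameter values are illustrative only:

    import numpy as np

    rng = np.random.default_rng(7)
    n_cells, cv, mean_cycle = 10000, 0.15, 18.0  # cv = coefficient of variation

    def phase_spread(k_cycles):
        """Std of k-th division times across the population, in cycle units."""
        durations = rng.normal(mean_cycle, cv * mean_cycle, (n_cells, k_cycles))
        return durations.sum(axis=1).std() / mean_cycle

    for k in [1, 2, 4, 8]:
        # spread grows like cv * sqrt(k); near ~0.5 cycles the population
        # is effectively desynchronized
        print(f"after {k} cycles: spread = {phase_spread(k):.2f} cycles")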
Rate-equation approach to atomic-laser light statistics
Chusseau, Laurent; Arnaud, Jacques; Philippe, Fabrice
2002-11-01
We consider three- and four-level atomic lasers that are either incoherently (unidirectionally) or coherently (bidirectionally) pumped, the single-mode cavity being resonant with the laser transition. The intracavity Fano factor and the photocurrent spectral density are evaluated on the basis of rate equations. According to that approach, fluctuations are caused by jumps in active and detecting atoms. The algebra is simple. Whenever a comparison is made, the expressions obtained coincide with the previous results. The conditions under which the output light exhibits sub-Poissonian statistics are considered in detail. Analytical results, based on linearization, are verified by comparison with Monte Carlo simulations. An essentially exhaustive investigation of sub-Poissonian light generation by three- and four-level lasers has been performed. Only special forms were reported earlier.
Statistical physics approach to quantifying differences in myelinated nerve fibers
Comin, César H.; Santos, João R.; Corradini, Dario; Morrison, Will; Curme, Chester; Rosene, Douglas L.; Gabrielli, Andrea; da F. Costa, Luciano; Stanley, H. Eugene
2014-01-01
We present a new method to quantify differences in myelinated nerve fibers. These differences range from morphologic characteristics of individual fibers to differences in macroscopic properties of collections of fibers. Our method uses statistical physics tools to improve on traditional measures, such as fiber size and packing density. As a case study, we analyze cross-sectional electron micrographs from the fornix of young and old rhesus monkeys using a semi-automatic detection algorithm to identify and characterize myelinated axons. We then apply a feature selection approach to identify the features that best distinguish between the young and old age groups, achieving a maximum accuracy of 94% when assigning samples to their age groups. This analysis shows that the best discrimination is obtained using the combination of two features: the fraction of occupied axon area and the effective local density. The latter is a modified calculation of axon density, which reflects how closely axons are packed. Our feature analysis approach can be applied to characterize differences that result from biological processes such as aging, damage from trauma or disease or developmental differences, as well as differences between anatomical regions such as the fornix and the cingulum bundle or corpus callosum. PMID:24676146
A statistical approach to nuclear fuel design and performance
NASA Astrophysics Data System (ADS)
Cunning, Travis Andrew
As CANDU fuel failures can have significant economic and operational consequences for the Canadian nuclear power industry, it is essential that factors impacting fuel performance are adequately understood. Current industrial practice relies on deterministic safety analysis and the highly conservative "limit of operating envelope" approach, where all parameters are assumed to be at their limits simultaneously. This results in a conservative prediction of event consequences with little consideration given to the high quality and precision of current manufacturing processes. This study employs a novel approach to the prediction of CANDU fuel reliability. Probability distributions are fitted to actual fuel manufacturing datasets provided by Cameco Fuel Manufacturing, Inc. They are used to form input for two industry-standard fuel performance codes: ELESTRES for the steady-state case and ELOCA for the transient case, a hypothesized 80% reactor outlet header break loss of coolant accident. Using a Monte Carlo technique for input generation, 10^5 independent trials are conducted and probability distributions are fitted to key model output quantities. Comparing model output against recognized industrial acceptance criteria, no fuel failures are predicted for either case. Output distributions are well removed from failure limit values, implying that margin exists in current fuel manufacturing and design. To validate the results and reduce the simulation burden of the methodology, two dimension-reduction methods are assessed. Using just 36 trials, both methods are able to produce output distributions that agree strongly with those obtained via the brute-force Monte Carlo method, often to a relative discrepancy of less than 0.3% when predicting the first statistical moment, and a relative discrepancy of less than 5% when predicting the second statistical moment. In terms of global sensitivity, pellet density proves to have the greatest impact on fuel performance.
Statistical approach to meteoroid shape estimation based on recovered meteorites
NASA Astrophysics Data System (ADS)
Vinnikov, V.; Gritsevich, M.; Turchak, L.
2014-07-01
Each meteorite sample can provide data on the chemical and physical properties of interplanetary matter. The set of recovered fragments within one meteorite fall can give additional information on the history of its parent asteroid. A reliably estimated meteoroid shape is a valuable input parameter for the atmospheric entry scenario, since the pre-entry mass, terminal meteorite mass, and fireball luminosity are proportional to the pre-entry shape factor of the meteoroid to the power of 3 [1]. We present a statistical approach to the estimation of meteoroid pre-entry shape [2], applied to the detailed data on recovered meteorite fragments. This is a development of our recent study on the fragment mass distribution functions for the Košice meteorite fall [3]. The idea of the shape estimation technique is based on experiments showing that brittle fracturing produces multiple fragments of sizes smaller than or equal to the smallest dimension of the body [2]. Such shattering has fractal properties similar to many other natural phenomena [4], and this self-similarity of the scaling mass sequences can be described by power-law statistical expressions [5]. The finite mass and the number of fragments N are represented via an exponential cutoff at the maximum fragment mass $m_U$. The undersampling of tiny unrecoverable fragments is handled via an additional constraint on the minimum fragment mass $m_L$. The complementary cumulative distribution function has the form $F(m) = \frac{N-j}{m_j}\left(\frac{m}{m_j}\right)^{-\beta_0}\exp\left(-\frac{m-m_j}{m_U}\right)$. The parameters sought (the scaling exponent $\beta_0$ and the mass limits) are computed to fit the empirical fragment mass distribution by minimizing $S(\beta_0, j, m_U) = \sum_{i=j}^{N}\left[F(m_i) - \frac{N-j}{m_j}\right]^2$, with $m_j = m_L$. The scaling exponent correlates with the dimensionless shape parameter $d$ [2]: $0.13d^2 - 0.21d + 1.1 - \beta = 0$, which, in turn, is expressed via the ratios of the linear dimensions $a$, $b$, $c$ of the shattering body [2]: $d = 1 + 2(ab+ac+bc)(a^2+b^2+c^2)^{-1}$. We apply the
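The final two relations translate directly into code. A small sketch evaluating the shape parameter d and the implied scaling exponent beta for a few hypothetical dimension ratios a:b:c:

    def shape_parameter(a, b, c):
        """Dimensionless d from the linear dimensions of the shattering body."""
        return 1.0 + 2.0 * (a*b + a*c + b*c) / (a**2 + b**2 + c**2)

    def scaling_exponent(d):
        """beta from the quoted correlation 0.13 d^2 - 0.21 d + 1.1 - beta = 0."""
        return 0.13 * d**2 - 0.21 * d + 1.1

    for dims in [(1, 1, 1), (2, 1, 1), (4, 2, 1)]:
        d = shape_parameter(*dims)
        print(dims, f"d = {d:.3f}, beta = {scaling_exponent(d):.3f}")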
New Statistical Approach to the Analysis of Hierarchical Data
NASA Astrophysics Data System (ADS)
Neuman, S. P.; Guadagnini, A.; Riva, M.
2014-12-01
Many variables possess a hierarchical structure reflected in how their increments vary in space and/or time. Quite commonly the increments (a) fluctuate in a highly irregular manner; (b) possess symmetric, non-Gaussian frequency distributions characterized by heavy tails that often decay with separation distance or lag; (c) exhibit nonlinear power-law scaling of sample structure functions in a midrange of lags, with breakdown in such scaling at small and large lags; (d) show extended power-law scaling (ESS) at all lags; and (e) display nonlinear scaling of power-law exponent with order of sample structure function. Some interpret this to imply that the variables are multifractal, which explains neither breakdowns in power-law scaling nor ESS. We offer an alternative interpretation consistent with all above phenomena. It views data as samples from stationary, anisotropic sub-Gaussian random fields subordinated to truncated fractional Brownian motion (tfBm) or truncated fractional Gaussian noise (tfGn). The fields are scaled Gaussian mixtures with random variances. Truncation of fBm and fGn entails filtering out components below data measurement or resolution scale and above domain scale. Our novel interpretation of the data allows us to obtain maximum likelihood estimates of all parameters characterizing the underlying truncated sub-Gaussian fields. These parameters in turn make it possible to downscale or upscale all statistical moments to situations entailing smaller or larger measurement or resolution and sampling scales, respectively. They also allow one to perform conditional or unconditional Monte Carlo simulations of random field realizations corresponding to these scales. Aspects of our approach are illustrated on field and laboratory measured porous and fractured rock permeabilities, as well as soil texture characteristics and neural network estimates of unsaturated hydraulic parameters in a deep vadose zone near Phoenix, Arizona. We also use our approach
Statistical physics and physiology: monofractal and multifractal approaches
NASA Technical Reports Server (NTRS)
Stanley, H. E.; Amaral, L. A.; Goldberger, A. L.; Havlin, S.; Peng, C. K.
1999-01-01
Even under healthy, basal conditions, physiologic systems show erratic fluctuations resembling those found in dynamical systems driven away from a single equilibrium state. Do such "nonequilibrium" fluctuations simply reflect the fact that physiologic systems are being constantly perturbed by external and intrinsic noise? Or do these fluctuations actually contain useful, "hidden" information about the underlying nonequilibrium control mechanisms? We report some recent attempts to understand the dynamics of complex physiologic fluctuations by adapting and extending concepts and methods developed very recently in statistical physics. Specifically, we focus on interbeat interval variability as an important quantity to help elucidate possibly non-homeostatic physiologic variability because (i) the heart rate is under direct neuroautonomic control, (ii) interbeat interval variability is readily measured by noninvasive means, and (iii) analysis of these heart rate dynamics may provide important practical diagnostic and prognostic information not obtainable with current approaches. The analytic tools we discuss may be used on a wider range of physiologic signals. We first review recent progress using two analysis methods--detrended fluctuation analysis and wavelets--sufficient for quantifying monofractal structures. We then describe recent work that quantifies multifractal features of interbeat interval series, and the discovery that the multifractal structure of healthy subjects is different from that of diseased subjects.
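Of the two monofractal tools mentioned, detrended fluctuation analysis is compact enough to sketch. A minimal numpy implementation, assuming first-order (linear) detrending and applied to white noise, for which the expected scaling exponent is α ≈ 0.5:

```python
import numpy as np

def dfa(x, scales, order=1):
    """Detrended fluctuation analysis: returns F(n) for each box size n."""
    y = np.cumsum(x - np.mean(x))              # integrated profile
    F = []
    for n in scales:
        rms = []
        for k in range(len(y) // n):
            seg = y[k * n:(k + 1) * n]
            t = np.arange(n)
            coef = np.polyfit(t, seg, order)   # local polynomial trend
            rms.append(np.sqrt(np.mean((seg - np.polyval(coef, t)) ** 2)))
        F.append(np.mean(rms))
    return np.array(F)

rng = np.random.default_rng(2)
x = rng.normal(size=2**12)                     # white noise: expect alpha ~ 0.5
scales = np.unique(np.logspace(2, 8, 12, base=2).astype(int))
F = dfa(x, scales)
alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
print(f"DFA scaling exponent alpha = {alpha:.2f}")
```

For an interbeat interval series, x would be the sequence of RR intervals; α near 1 is the 1/f-like behavior reported for healthy heart dynamics.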
A descriptive statistical approach to the Korean pharmacopuncture therapy.
Kim, Jungdae; Kang, Dae-In
2010-09-01
This paper reviews trends in research related to Korean pharmacopuncture therapy. Specifically, basic and clinical research in pharmacopuncture within the last decade is summarized by introducing categorical variables for classification. These variables are also analyzed for association. This literature review is based on articles published from February 1997 to December 2008 in a Korean journal, the Journal of the Korean Institute of Herbal Acupuncture, which was renamed the Journal of the Korean Pharmacopuncture Institute in 2007. Among the total of 379 papers published in the journal during this period, 164 papers were selected for their direct relevance to pharmacopuncture research and were categorized according to three variables: medicinal materials, acupuncture points and disease. The most frequently studied medicinal materials were bee-venom pharmacopuncture (42%), followed by meridian-field pharmacopuncture (24%), single-compound pharmacopuncture (24%), and eight-principle pharmacopuncture (10%). The frequency distributions of the acupuncture points and meridians for the injection of medicinal materials are presented. The most frequently used meridian and acupuncture point were the Bladder meridian and ST36, respectively. Contingency tables are also displayed to analyze the relationship between the categorized variables. Chi-squared analysis showed a significant association between the type of pharmacopuncture and disease. The trend in research reports on Korean pharmacopuncture therapy was reviewed and analyzed using a descriptive statistical approach to evaluate the therapeutic value of this technique for future research. PMID:20869014
Understanding Vrikshasana using body mounted sensors: A statistical approach
Yelluru, Suhas Niranjan; Shanbhag, Ranjith Ravindra; Omkar, SN
2016-01-01
Aim: A scheme for understanding how the human body organizes postural movements while performing Vrikshasana is developed in this paper. Settings and Design: The structural characteristics of the body and the geometry of the muscular actions are incorporated into a graphical representation of the human movement mechanics in the frontal plane. A series of neural organizational hypotheses enables us to understand the mechanics behind the hip and ankle strategy: (1) body sway in the mediolateral direction; and (2) the influence of the hip and ankle in correcting instabilities of the body while performing Vrikshasana. Materials and Methods: A methodological study on 10 participants was performed by mounting four inertial measurement units on the surface of the trapezius, thoracolumbar fascia, vastus lateralis, and gastrocnemius muscles. The kinematic accelerations of three separate trials were recorded for a period of 30 s. Results: The results of every trial were processed using two statistical signal-processing approaches, namely variance and cross-correlation analysis. Conclusions from both analyses supported the initial hypothesis. Conclusions: This study enabled us to understand the role of hip abductors and adductors, and ankle extensions and flexions, in correcting the posture while performing Vrikshasana. PMID:26865765
ERIC Educational Resources Information Center
Hassan, Mahamood M.; Schwartz, Bill N.
2014-01-01
This paper discusses a student research project that is part of an advanced cost accounting class. The project emphasizes active learning, integrates cost accounting with macroeconomics and statistics by "learning by doing" using real world data. Students analyze sales data for a publicly listed company by focusing on the company's…
ERIC Educational Resources Information Center
Averitt, Sallie D.
This instructor guide, which was developed for use in a manufacturing firm's advanced technical preparation program, contains the materials required to present a learning module that is designed to prepare trainees for the program's statistical process control module by improving their basic math skills and instructing them in basic calculator…
ERIC Educational Resources Information Center
Billings, Paul H.
This instructional guide, one of a series developed by the Technical Education Advancement Modules (TEAM) project, is a 6-hour introductory module on statistical process control (SPC), designed to develop competencies in the following skill areas: (1) identification of the three classes of SPC use; (2) understanding a process and how it works; (3)…
Statistical approach to color rendition properties of solid state light sources
NASA Astrophysics Data System (ADS)
Žukauskas, Artūras; Vaicekauskas, Rimantas; Tuzikas, Arūnas; Vitta, Pranciškus; Shur, Michael
2011-10-01
Versatile spectral power distribution of solid-state light sources offers vast possibilities in color rendition engineering. The optimization of such sources requires the development and psychophysical validation of an advanced metric for assessing their color quality. Here we report on the application and validation of the recently introduced statistical approach to the color quality of illumination. This new metric uses the computational grouping of a large number of test color samples depending on the magnitude and direction of color-shift vectors with respect to just-perceptible differences of chromaticity and luminance. The approach introduces single-format statistical color rendition indices, such as the Color Fidelity Index, Color Saturation Index and Color Dulling Index, which are the percentages of test color samples with particular behavior of the color-shift vectors. The new metric has been used for the classification of practical phosphor-conversion white light-emitting diodes (LEDs) and polychromatic LED clusters into several distinct categories, such as high-fidelity, color-saturating, and color-dulling light sources. We also report on the development of a tetrachromatic light source with dynamically tailored color rendition properties, and on using this source for the psychophysical validation of the statistical metric and for identifying subjective preferences for the color quality of lighting.
Linear induction accelerator approach for advanced radiography
Caporaso, G.J.
1997-05-01
Recent advances in induction accelerator technology make it possible to envision a single accelerator that can serve as an intense, precision multiple-pulse x-ray source for advanced radiography. Through the use of solid-state modulator technology, repetition rates on the order of 1 MHz can be achieved with beam pulse lengths ranging from 200 ns to 2 μs. By using fast kickers, these pulses may be sectioned into pieces which are directed to different beam lines so as to interrogate the object under study from multiple lines of sight. The ultimate aim is to do a time-dependent tomographic reconstruction of a dynamic object. The technology to accomplish these objectives, along with a brief discussion of the experimental plans to verify it, will be presented.
Resistive switching phenomena: A review of statistical physics approaches
Lee, Jae Sung; Lee, Shinbuhm; Noh, Tae Won
2015-08-31
Here we report that resistive switching (RS) phenomena are reversible changes in the metastable resistance state induced by external electric fields. After their discovery ~50 years ago, RS phenomena have attracted great attention due to their potential application in next-generation electrical devices. Considerable research has been performed to understand the physical mechanisms of RS and explore the feasibility and limits of such devices. There have also been several reviews on RS that attempt to explain the microscopic origins of how regions that were originally insulators can change into conductors. However, little attention has been paid to the most important factor in determining resistance: how conducting local regions are interconnected. Here, we provide an overview of the underlying physics behind connectivity changes in highly conductive regions under an electric field. We first classify RS phenomena according to their characteristic current–voltage curves: unipolar, bipolar, and threshold switchings. Second, we outline the microscopic origins of RS in oxides, focusing on the roles of oxygen vacancies: the effect of concentration, the mechanisms of channel formation and rupture, and the driving forces of oxygen vacancies. Third, we review RS studies from the perspective of statistical physics to understand connectivity change in RS phenomena. We discuss percolation model approaches and the theory for the scaling behaviors of numerous transport properties observed in RS. Fourth, we review various switching-type conversion phenomena in RS: bipolar-unipolar, memory-threshold, figure-of-eight, and counter-figure-of-eight conversions. Finally, we review several related technological issues, such as improvement in high resistance fluctuations, sneak-path problems, and multilevel switching problems.
ERIC Educational Resources Information Center
Petocz, Agnes; Newbery, Glenn
2010-01-01
Statistics education in psychology often falls disappointingly short of its goals. The increasing use of qualitative approaches in statistics education research has extended and enriched our understanding of statistical cognition processes, and thus facilitated improvements in statistical education and practices. Yet conceptual analysis, a…
Symmetries and the approach to statistical equilibrium in isotropic turbulence
NASA Astrophysics Data System (ADS)
Clark, Timothy T.; Zemach, Charles
1998-11-01
The relaxation in time of an arbitrary isotropic turbulent state to a state of statistical equilibrium is identified as a transition to a state which is invariant under a symmetry group. We deduce the allowed self-similar forms and time-decay laws for equilibrium states by applying Lie-group methods (a) to a family of scaling symmetries, for the limit of high Reynolds number, as well as (b) to a unique scaling symmetry, for nonzero viscosity or nonzero hyperviscosity. This explains why a diverse collection of turbulence models, going back half a century, arrived at the same time-decay laws, either through derivations embedded in the mechanics of a particular model, or through numerical computation. Because the models treat the same dynamical variables having the same physical dimensions, they are subject to the same scaling invariances and hence to the same time-decay laws, independent of the eccentricities of their different formulations. We show in turn, by physical argument, by an explicitly solvable analytical model, and by numerical computation in more sophisticated models, that the physical mechanism which drives (this is distinct from the mathematical circumstance which allows) the relaxation to equilibrium is the cascade of turbulence energy toward higher wave numbers, with the rate of cascade approaching zero in the low wave-number limit and approaching infinity in the high wave-number limit. Only the low-wave-number properties of the initial state can influence the equilibrium state. This supplies the physical basis, beyond simple dimensional analysis, for quantitative estimates of relaxation times. These relaxation times are estimated to be as large as hundreds or more times the initial dominant-eddy cycle times, and are determined by the large-eddy cycle times. This mode of analysis, applied to a viscous turbulent system in a wind tunnel with typical initial laboratory parameters, shows that the time necessary to reach the final stage of decay is
Ice Shelf Modeling: A Cross-Polar Bayesian Statistical Approach
NASA Astrophysics Data System (ADS)
Kirchner, N.; Furrer, R.; Jakobsson, M.; Zwally, H. J.
2010-12-01
Ice streams interlink glacial terrestrial and marine environments: embedded in a grounded inland ice mass such as the Antarctic Ice Sheet or the paleo ice sheets that covered extensive parts of the Eurasian and Amerasian Arctic, ice streams are major drainage agents facilitating the discharge of substantial portions of continental ice into the ocean. At their seaward side, ice streams can either extend onto the ocean as floating ice tongues (such as the Drygalski Ice Tongue/East Antarctica), or feed large ice shelves (as is the case for e.g. the Siple Coast and the Ross Ice Shelf/West Antarctica). The flow behavior of ice streams has been recognized to be intimately linked with configurational changes in their attached ice shelves; in particular, ice shelf disintegration is associated with rapid ice stream retreat and increased mass discharge from the continental ice mass, contributing eventually to sea level rise. Investigations of ice stream retreat mechanisms are however incomplete if based on terrestrial records only: rather, the dynamics of ice shelves (and, eventually, the impact of the ocean on the latter) must be accounted for. However, since floating ice shelves leave hardly any traces behind when melting, uncertainty regarding the spatio-temporal distribution and evolution of ice shelves in times prior to instrumented and recorded observation is high, thus calling for a statistical modeling approach. Complementing ongoing large-scale numerical modeling efforts (Pollard & DeConto, 2009), we model the configuration of ice shelves using a Bayesian Hierarchical Modeling (BHM) approach. We adopt a cross-polar perspective accounting for the fact that currently, ice shelves exist mainly along the coastline of Antarctica (and are virtually non-existent in the Arctic), while Arctic Ocean ice shelves repeatedly impacted the Arctic ocean basin during former glacial periods. Modeled Arctic Ocean ice shelf configurations are compared with geological spatial…
Statistical Learning of Phonetic Categories: Insights from a Computational Approach
ERIC Educational Resources Information Center
McMurray, Bob; Aslin, Richard N.; Toscano, Joseph C.
2009-01-01
Recent evidence (Maye, Werker & Gerken, 2002) suggests that statistical learning may be an important mechanism for the acquisition of phonetic categories in the infant's native language. We examined the sufficiency of this hypothesis and its implications for development by implementing a statistical learning mechanism in a computational model…
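The core of the distributional-learning hypothesis, that categories emerge as modes of the input distribution, can be illustrated with a two-component Gaussian mixture fitted by expectation-maximization to synthetic voice-onset-time data. The authors' model is richer (it must also discover how many categories there are), so this sketch shows only the underlying mechanism, with invented values.

```python
import numpy as np

rng = np.random.default_rng(3)
# Synthetic voice-onset-time data (ms) from a bimodal /b/-/p/ distribution,
# standing in for the infant's speech input; the values are illustrative only.
vot = np.concatenate([rng.normal(0, 5, 500), rng.normal(50, 10, 500)])

# Two-component 1-D Gaussian mixture fitted by EM: statistical learning of
# phonetic categories as density estimation over the input.
w = np.array([0.5, 0.5]); mu = np.array([10.0, 30.0]); sd = np.array([20.0, 20.0])
for _ in range(200):
    # E-step: responsibility of each category for each token
    pdf = np.exp(-0.5 * ((vot[:, None] - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))
    r = w * pdf
    r /= r.sum(axis=1, keepdims=True)
    # M-step: re-estimate category weights, means, and spreads
    nk = r.sum(axis=0)
    w = nk / len(vot)
    mu = (r * vot[:, None]).sum(axis=0) / nk
    sd = np.sqrt((r * (vot[:, None] - mu) ** 2).sum(axis=0) / nk)

print("category means (ms):", np.round(mu, 1), " weights:", np.round(w, 2))
```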
Artificial Intelligence Approach to Support Statistical Quality Control Teaching
ERIC Educational Resources Information Center
Reis, Marcelo Menezes; Paladini, Edson Pacheco; Khator, Suresh; Sommer, Willy Arno
2006-01-01
Statistical quality control--SQC (consisting of Statistical Process Control, Process Capability Studies, Acceptance Sampling and Design of Experiments) is a very important tool to obtain, maintain and improve the Quality level of goods and services produced by an organization. Despite its importance, and the fact that it is taught in technical and…
Advancing Instructional Communication: Integrating a Biosocial Approach
ERIC Educational Resources Information Center
Horan, Sean M.; Afifi, Tamara D.
2014-01-01
Celebrating 100 years of the National Communication Association necessitates that, as we commemorate our past, we also look toward our future. As part of a larger conversation about the future of instructional communication, this essay reinvestigates the importance of integrating biosocial approaches into instructional communication research. In…
Approaches for advancing scientific understanding of macrosystems
Levy, Ofir; Ball, Becky A.; Bond-Lamberty, Ben; Cheruvelil, Kendra S.; Finley, Andrew O.; Lottig, Noah R.; Surangi W. Punyasena; Xiao, Jingfeng; Zhou, Jizhong; Buckley, Lauren B.; Filstrup, Christopher T.; Keitt, Tim H.; Kellner, James R.; Knapp, Alan K.; Richardson, Andrew D.; Tcheng, David; Toomey, Michael; Vargas, Rodrigo; Voordeckers, James W.; Wagner, Tyler; Williams, John W.
2014-01-01
The emergence of macrosystems ecology (MSE), which focuses on regional- to continental-scale ecological patterns and processes, builds upon a history of long-term and broad-scale studies in ecology. Scientists face the difficulty of integrating the many elements that make up macrosystems, which consist of hierarchical processes at interacting spatial and temporal scales. Researchers must also identify the most relevant scales and variables to be considered, the required data resources, and the appropriate study design to provide the proper inferences. The large volumes of multi-thematic data often associated with macrosystem studies typically require validation, standardization, and assimilation. Finally, analytical approaches need to describe how cross-scale and hierarchical dynamics and interactions relate to macroscale phenomena. Here, we elaborate on some key methodological challenges of MSE research and discuss existing and novel approaches to meet them.
Advanced drug delivery approaches against periodontitis.
Joshi, Deeksha; Garg, Tarun; Goyal, Amit K; Rath, Goutam
2016-01-01
Periodontitis is an inflammatory disease of the gums involving the degeneration of periodontal ligaments, creation of periodontal pockets and resorption of alveolar bone, resulting in the disruption of the support structure of teeth. According to WHO, 10-15% of the global population suffers from severe periodontitis. The disease results from the growth of a diverse microflora (especially anaerobes) in the pockets, the release of toxins and enzymes, and stimulation of the body's immune response. Various local or systemic approaches have been used for the effective treatment of periodontitis. Currently, the controlled local drug delivery approach is more favorable than the systemic approach because it mainly focuses on improving therapeutic outcomes by achieving factors like site-specific delivery, low dose requirement, bypass of first-pass metabolism, reduction in gastrointestinal side effects and decrease in dosing frequency. Overall it provides a safe and effective mode of treatment, which enhances patient compliance. Complete eradication of the organisms from the sites has not been achieved with various surgical and mechanical treatments. So a number of polymer-based delivery systems like fibers, films, chips, strips, microparticles, nanoparticles and nanofibers made from a variety of natural and synthetic materials have been successfully tested to deliver a variety of drugs. These systems are biocompatible and biodegradable, completely fill the pockets, and have strong retention on the target site due to excellent mucoadhesion properties. The review summarizes various available and recently developing targeted delivery devices for the treatment of periodontitis. PMID:25005586
Statistical Approach To Extraction Of Texture In SAR
NASA Technical Reports Server (NTRS)
Rignot, Eric J.; Kwok, Ronald
1992-01-01
An improved statistical method for the extraction of textural features in synthetic-aperture-radar (SAR) images takes account of the effects of the scheme used to sample raw SAR data, system noise, the resolution of the radar equipment, and speckle. Speckle is incorporated into an overall statistical treatment together with system noise and natural variations in texture. The speckle autocorrelation function is computed from the system transfer function, which expresses the effect of the radar aperture and incorporates the range and azimuth resolutions.
Advances in myelofibrosis: a clinical case approach.
Mascarenhas, John O; Orazi, Attilio; Bhalla, Kapil N; Champlin, Richard E; Harrison, Claire; Hoffman, Ronald
2013-10-01
Primary myelofibrosis is a member of the myeloproliferative neoplasms, a diverse group of bone marrow malignancies. Symptoms of myelofibrosis, particularly those associated with splenomegaly (abdominal distention and pain, early satiety, dyspnea, and diarrhea) and constitutional symptoms, represent a substantial burden to patients. Most patients eventually die from the disease, with a median survival of approximately 5-7 years. Mutations in Janus kinase 2 (JAK2), a kinase that is essential for the normal development of erythrocytes, granulocytes, and platelets, notably the V617F mutation, have been identified in approximately 50% of patients with myelofibrosis. The approval of a JAK2 inhibitor in 2011 has improved the outlook of many patients with myelofibrosis and has changed the treatment landscape. This article focuses on some of the important issues in current myelofibrosis treatment management, including differentiation of myelofibrosis from essential thrombocythemia and polycythemia vera, updated data on the results of JAK2 inhibitor therapy, the role of epigenetic mechanisms in myelofibrosis pathogenesis, investigational therapies for myelofibrosis, and advances in hematopoietic stem cell transplant. Three myelofibrosis cases are included to underscore the issues in diagnosing and treating this complex disease. PMID:24091929
Advancing Profiling Sensors with a Wireless Approach
Galvis, Alex; Russomanno, David J.
2012-01-01
The notion of a profiling sensor was first realized by a Near-Infrared (N-IR) retro-reflective prototype consisting of a vertical column of wired sparse detectors. This paper extends that prior work and presents a wireless version of a profiling sensor as a collection of sensor nodes. The sensor incorporates wireless sensing elements, a distributed data collection and aggregation scheme, and an enhanced classification technique. In this novel approach, a base station pre-processes the data collected from the sensor nodes and performs data re-alignment. A back-propagation neural network was also developed for the wireless version of the N-IR profiling sensor that classifies objects into the broad categories of human, animal or vehicle with an accuracy of approximately 94%. These enhancements improve deployment options as compared with the first generation of wired profiling sensors, possibly increasing the application scenarios for such sensors, including intelligent fence applications. PMID:23443371
Accuracy Evaluation of a Mobile Mapping System with Advanced Statistical Methods
NASA Astrophysics Data System (ADS)
Toschi, I.; Rodríguez-Gonzálvez, P.; Remondino, F.; Minto, S.; Orlandini, S.; Fuller, A.
2015-02-01
This paper discusses a methodology to evaluate the precision and the accuracy of a commercial Mobile Mapping System (MMS) with advanced statistical methods. So far, the metric potential of this emerging mapping technology has been studied in only a few papers, where generally the assumption is made that errors follow a normal distribution. In fact, this hypothesis should be carefully verified in advance, in order to test how well classic Gaussian statistics can adapt to datasets that are usually affected by asymmetrical gross errors. The workflow adopted in this study relies on a Gaussian assessment, followed by an outlier filtering process. Finally, non-parametric statistical models are applied, in order to achieve a robust estimation of the error dispersion. Among the different MMSs available on the market, the latest solution provided by RIEGL is tested here, i.e. the VMX-450 Mobile Laser Scanning System. The test area is the historic city centre of Trento (Italy), selected in order to assess the system performance in dealing with a challenging, historic urban scenario. Reference measures are derived from photogrammetric and Terrestrial Laser Scanning (TLS) surveys. All datasets show a large lack of symmetry, leading to the conclusion that standard normal parameters are not adequate for assessing this type of data. The use of non-normal statistics thus gives a more appropriate description of the data and yields results that meet the quoted a priori errors.
Europe's Neogene and Quaternary lake gastropod diversity - a statistical approach
NASA Astrophysics Data System (ADS)
Neubauer, Thomas A.; Georgopoulou, Elisavet; Harzhauser, Mathias; Mandic, Oleg; Kroh, Andreas
2014-05-01
During the Neogene, Europe's geodynamic history gave rise to several long-lived lakes with conspicuous endemic radiations. However, such lacustrine systems are rare today, as in the past, compared to the enormous numbers of "normal" lakes. Most extant European lakes are mainly results of the Ice Ages and are, due to their (geologically) temporary nature, largely confined to the Pleistocene-Holocene. As glacial lakes are also geographically restricted to glacial regions (and their catchment areas), their preservation potential is fairly low. Deposits of streams, springs, and groundwater, which today are inhabited by species-rich gastropod assemblages, are also rarely preserved. Thus, the pre-Quaternary lacustrine record is biased towards long-lived systems, such as the Late Miocene Lake Pannon, the Early to Middle Miocene Dinaride Lake System, the Middle Miocene Lake Steinheim and several others. All these systems have been studied for more than 150 years with regard to their mollusk inventories, and the taxonomic literature is formidable. However, apart from a few general overviews, precise studies on the γ-diversities of the post-Oligocene European lake systems and the shifting biodiversity in European freshwater systems through space and time are entirely missing. Even for the modern faunas, literature on large-scale freshwater gastropod diversity in extant lakes is scarce and lacks a statistical approach. Our preliminary data suggest fundamental differences between modern and pre-Pleistocene freshwater biogeography in central Europe. A rather homogeneous central European Pleistocene and Holocene lake fauna is contrasted by considerable provincialism during the early Middle Miocene. Aside from the ancient Dessaretes lakes of the Balkan Peninsula, Holocene lake faunas are dominated in species numbers by planorbids and lymnaeids. This composition differs considerably from many Miocene and Pliocene lake faunas, which comprise pyrgulid-, hydrobiid-, viviparid-, melanopsid…
ERIC Educational Resources Information Center
Johnson, H. Dean; Dasgupta, Nairanjana; Zhang, Hao; Evans, Marc A.
2009-01-01
The use of the Internet as a teaching tool continues to grow in popularity at colleges and universities. We consider, from the students' perspective, the use of an Internet approach compared to a lecture and lab-based approach for teaching an introductory course in statistical methods. We conducted a survey of introductory statistics students.…
Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.
A BAYESIAN STATISTICAL APPROACH FOR THE EVALUATION OF CMAQ
This research focuses on the application of spatial statistical techniques for the evaluation of the Community Multiscale Air Quality (CMAQ) model. The upcoming release version of the CMAQ model was run for the calendar year 2001 and is in the process of being evaluated by EPA an...
An Experimental Approach to Teaching and Learning Elementary Statistical Mechanics
ERIC Educational Resources Information Center
Ellis, Frank B.; Ellis, David C.
2008-01-01
Introductory statistical mechanics is studied for a simple two-state system using an inexpensive and easily built apparatus. A large variety of demonstrations, suitable for students in high school and introductory university chemistry courses, are possible. This article details demonstrations for exothermic and endothermic reactions, the dynamic…
A BAYESIAN STATISTICAL APPROACH FOR THE EVALUATION OF CMAQ
Bayesian statistical methods are used to evaluate Community Multiscale Air Quality (CMAQ) model simulations of sulfate aerosol over a section of the eastern US for 4-week periods in summer and winter 2001. The observed data come from two U.S. Environmental Protection Agency data ...
Teaching MBA Statistics Online: A Pedagogically Sound Process Approach
ERIC Educational Resources Information Center
Grandzol, John R.
2004-01-01
Delivering MBA statistics in the online environment presents significant challenges to educators and students alike because of varying student preparedness levels, complexity of content, difficulty in assessing learning outcomes, and faculty availability and technological expertise. In this article, the author suggests a process model that…
Analysis of Coastal Dunes: A Remote Sensing and Statistical Approach.
ERIC Educational Resources Information Center
Jones, J. Richard
1985-01-01
Remote sensing analysis and statistical methods were used to analyze the coastal dunes of Plum Island, Massachusetts. The research methodology used provides an example of a student project for remote sensing, geomorphology, or spatial analysis courses at the university level. (RM)
Statistical and Microscopic Approach to Gas Phase Chemical Kinetics.
ERIC Educational Resources Information Center
Perez, J. M.; Quereda, R.
1983-01-01
Describes an advanced undergraduate laboratory exercise examining how rate constants and instantaneous concentrations depend on the nature and energy content of a complex gas-phase reaction. A computer program (with instructions and computation flow charts) used with the exercise is available from the author. (Author/JN)
Statistical approaches to pharmacodynamic modeling: motivations, methods, and misperceptions.
Mick, R; Ratain, M J
1993-01-01
We have attempted to outline the fundamental statistical aspects of pharmacodynamic modeling. Unexpected yet substantial variability in effect in a group of similarly treated patients is the key motivation for pharmacodynamic investigations. Pharmacokinetic and/or pharmacodynamic factors may influence this variability. Residual variability in effect that persists after accounting for drug exposure indicates that further statistical modeling with pharmacodynamic factors is warranted. Factors that significantly predict interpatient variability in effect may then be employed to individualize the drug dose. In this paper we have emphasized the need to understand the properties of the effect measure and explanatory variables in terms of scale, distribution, and statistical relationship. The assumptions that underlie many types of statistical models have been discussed. The role of residual analysis has been stressed as a useful method to verify assumptions. We have described transformations and alternative regression methods that are employed when these assumptions are found to be in violation. Sequential selection procedures for the construction of multivariate models have been presented. The importance of assessing model performance has been underscored, most notably in terms of bias and precision. In summary, pharmacodynamic analyses are now commonly performed and reported in the oncologic literature. The content and format of these analyses has been variable. The goals of such analyses are to identify and describe pharmacodynamic relationships and, in many cases, to propose a statistical model. However, the appropriateness and performance of the proposed model are often difficult to judge. Table 1 displays suggestions (in a checklist format) for structuring the presentation of pharmacodynamic analyses, which reflect the topics reviewed in this paper. PMID:8269582
Students' Attitudes toward Statistics across the Disciplines: A Mixed-Methods Approach
ERIC Educational Resources Information Center
Griffith, James D.; Adams, Lea T.; Gu, Lucy L.; Hart, Christian L.; Nichols-Whitehead, Penney
2012-01-01
Students' attitudes toward statistics were investigated using a mixed-methods approach including a discovery-oriented qualitative methodology among 684 undergraduate students across business, criminal justice, and psychology majors where at least one course in statistics was required. Students were asked about their attitudes toward statistics and…
Class G cement in Brazil - A statistical approach
Rosa, F.C.; Coelho, O. Jr.; Parente, F.J. )
1993-09-01
Since 1975, Petrobras has worked with Brazilian Portland cement manufacturers to develop high-quality Class G cements. The Petrobras R and D Center has analyzed each batch of Class G cement manufactured by prequalified producers to API Spec. 10 standards and to Brazilian Assoc. of Technical Standards (ABNT) NBR 9831 standards. As a consequence, the Drilling Dept. at Petrobras now is supplied by three approved Class G cement factories strategically located in Brazil. This paper statistically analyzes test results on the basis of physical parameters of these Class G cements over 3 years. Statistical indices are reported to evaluate dispersion of the physical properties to obtain a reliability index for each Class G cement.
Statistical approach to linewidth control in a logic fab
NASA Astrophysics Data System (ADS)
Pitter, Michael; Doleschel, Bernhard; Eibl, Ludwig; Steinkirchner, Erwin; Grassmann, Andreas
1999-04-01
We designed an adaptive linewidth controller specially tailored to the needs of a highly diversified logic fab. Simulations of different controller types fed with historical CD data show advantages of an SPC-based controller over a run-by-run controller. This result confirms the SPC assumption that, as long as a process is in statistical control, changing the process parameters will only increase the variability of the output.
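The quoted SPC assumption is easy to reproduce in a toy simulation: for a process already in statistical control, a controller that corrects every run inflates output variability, while one that reacts only to out-of-control signals leaves it essentially untouched. The gains and limits below are illustrative assumptions, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(4)
n, sigma, target = 2000, 1.0, 0.0

def simulate(kind):
    """Simulate CD output for a process that is in statistical control."""
    offset, out = 0.0, []
    for _ in range(n):
        cd = target + offset + rng.normal(0, sigma)
        out.append(cd)
        if kind == "run_by_run":
            offset -= 0.5 * (cd - target)        # correction applied every run
        elif kind == "spc" and abs(cd - target) > 3 * sigma:
            offset -= (cd - target)              # act only on 3-sigma signals
    return np.std(out)

for kind in ("none", "spc", "run_by_run"):
    print(f"{kind:10s} output sigma = {simulate(kind):.3f}")
```

With these assumptions the run-by-run controller yields an output sigma of about 1.15σ versus roughly 1.0σ for the uncontrolled and SPC cases, the variance inflation the abstract refers to.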
A statistical mechanics approach to autopoietic immune networks
NASA Astrophysics Data System (ADS)
Barra, Adriano; Agliari, Elena
2010-07-01
In this work we aim to bridge theoretical immunology and disordered statistical mechanics. We introduce a model for the behavior of B-cells which naturally merges the clonal selection theory and the autopoietic network theory as a whole. From the analysis of its features we recover several basic phenomena such as low-dose tolerance, dynamical memory of antigens and self/non-self discrimination.
W± bosons production in the quantum statistical parton distributions approach
NASA Astrophysics Data System (ADS)
Bourrely, Claude; Buccella, Franco; Soffer, Jacques
2013-10-01
We consider W± gauge boson production in connection with recent results from BNL-RHIC and FNAL-Tevatron and interesting predictions from the statistical parton distributions. They concern relevant aspects of the structure of the nucleon sea and the high-x region of the valence quark distributions. We also give predictions in view of future proton-neutron collision experiments at BNL-RHIC.
Extreme event statistics of daily rainfall: dynamical systems approach
NASA Astrophysics Data System (ADS)
Cigdem Yalcin, G.; Rabassa, Pau; Beck, Christian
2016-04-01
We analyse the probability densities of daily rainfall amounts at a variety of locations on Earth. The observed distributions of the amount of rainfall fit well to a q-exponential distribution with exponent q ≈ 1.3. We discuss possible reasons for the emergence of this power law. In contrast, the waiting time distribution between rainy days is observed to follow a near-exponential distribution. A careful investigation shows that a q-exponential with q ≈ 1.05 yields the best fit to the data. A Poisson process whose rate fluctuates slightly in a superstatistical way is discussed as a possible model for this. We discuss the extreme value statistics of extreme daily rainfall, which can potentially lead to flooding. These are described by Fréchet distributions, since the corresponding distributions of the amount of daily rainfall decay with a power law. Looking at the extreme event statistics of waiting times between rainy days (very long dry periods lead to droughts), the observed near-exponential decay of waiting times yields extreme event statistics close to Gumbel distributions. We discuss superstatistical dynamical systems as simple models in this context.
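For concreteness, a q-exponential density on x ≥ 0 with 1 < q < 2 can be fitted by maximum likelihood. The sketch below draws synthetic "rainfall" by inverse-CDF sampling at q = 1.3 and recovers the parameters; the normalization and sampling formulas follow the standard q-exponential definition, not the paper's data pipeline.

```python
import numpy as np
from scipy.optimize import minimize

def qexp_logpdf(x, q, b):
    """log pdf of the q-exponential p(x) = ((2-q)/b)[1+(q-1)x/b]^(-1/(q-1))."""
    return np.log((2 - q) / b) - (1.0 / (q - 1)) * np.log1p((q - 1) * x / b)

rng = np.random.default_rng(5)
u = rng.uniform(size=5000)
q_true, b_true = 1.3, 8.0
# Inverse-CDF sampling from the survival function S(x) = [1+(q-1)x/b]^(-(2-q)/(q-1))
x = b_true / (q_true - 1) * ((1 - u) ** (-(q_true - 1) / (2 - q_true)) - 1)

nll = lambda p: -np.sum(qexp_logpdf(x, *p))
res = minimize(nll, x0=[1.2, 5.0], bounds=[(1.01, 1.99), (0.1, 100.0)])
print("fitted q, b:", np.round(res.x, 3))   # expect q near 1.3, b near 8
```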
A Statistical Approach to Characterizing the Reliability of Systems Utilizing HBT Devices
NASA Technical Reports Server (NTRS)
Chen, Yuan; Wang, Qing; Kayali, Sammy
2004-01-01
This paper presents a statistical approach to characterizing the reliability of systems with HBT devices. The proposed approach utilizes the statistical reliability information of the HBT individual devices, along with the analysis on the critical paths of the system, to provide more accurate and more comprehensive reliability information about the HBT systems compared to the conventional worst-case method.
NASA Astrophysics Data System (ADS)
Tsallis, Constantino
2006-03-01
Boltzmann-Gibbs (BG) statistical mechanics has, for well over a century, been successfully used for many nonlinear dynamical systems which, in one way or another, exhibit strong chaos. A typical case is a classical many-body short-range-interacting Hamiltonian system (e.g., the Lennard-Jones model for a real gas at moderately high temperature). Its Lyapunov spectrum (which characterizes the sensitivity to initial conditions) includes positive values. This leads to ergodicity, the stationary state being thermal equilibrium, hence standard applicability of the BG theory is verified. The situation appears to be of a different nature for various phenomena occurring in living organisms. Indeed, such systems exhibit a complexity which does not really accommodate this standard dynamical behavior. Life appears to emerge and evolve in a kind of delicate situation, at the frontier between large order (low adaptability and long memory; typically characterized by regular dynamics, hence only nonpositive Lyapunov exponents) and large disorder (high adaptability and short memory; typically characterized by strong chaos, hence at least one positive Lyapunov exponent). Along this frontier, the maximal relevant Lyapunov exponents are either zero or close to it, characterizing what is currently referred to as weak chaos. This type of situation is shared by a great variety of similar complex phenomena in economics and linguistics, to cite but a few. BG statistical mechanics is built upon the entropy S_BG = -k ∑_i p_i ln p_i. A generalization of this form, S_q = k(1 - ∑_i p_i^q)/(q-1) (with S_1 = S_BG), was proposed in 1988 as a basis for formulating what is nowadays called nonextensive statistical mechanics. This theory appears to be particularly adapted to nonlinear dynamical systems exhibiting, precisely, weak chaos. Here, we briefly review the theory, its dynamical foundation, its applications in a variety of disciplines (with special emphasis on living systems), and its connections with…
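A quick numerical check (with k = 1) that the generalized entropy recovers the BG form as q → 1:

```python
import numpy as np

def S_BG(p, k=1.0):
    """Boltzmann-Gibbs entropy S_BG = -k sum_i p_i ln p_i."""
    p = p[p > 0]
    return -k * np.sum(p * np.log(p))

def S_q(p, q, k=1.0):
    """Tsallis entropy S_q = k (1 - sum_i p_i^q) / (q - 1)."""
    return k * (1.0 - np.sum(p ** q)) / (q - 1.0)

p = np.full(4, 0.25)                         # equiprobable 4-state system
print(f"S_BG = {S_BG(p):.4f}  (ln 4 = {np.log(4):.4f})")
for q in (0.5, 0.999, 1.001, 2.0):
    print(f"S_q(q={q}) = {S_q(p, q):.4f}")   # q near 1 reproduces S_BG
```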
Statistical Thermodynamic Approach to Vibrational Solitary Waves in Acetanilide
NASA Astrophysics Data System (ADS)
Vasconcellos, Áurea R.; Mesquita, Marcus V.; Luzzi, Roberto
1998-03-01
We analyze the behavior of the macroscopic thermodynamic state of polymers, centering on acetanilide. The nonlinear equations of evolution for the populations and the statistically averaged field amplitudes of CO-stretching modes are derived. The existence of excitations of the solitary wave type is evidenced. The infrared spectrum is calculated and compared with the experimental data of Careri et al. [Phys. Rev. Lett. 51, 104 (1983)], resulting in good agreement. We also consider the situation of a nonthermally highly excited sample, predicting the occurrence of a large increase in the lifetime of the solitary wave excitation.
A Statistical Approach to Establishing Subsystem Environmental Test Specifications
NASA Technical Reports Server (NTRS)
Keegan, W. B.
1974-01-01
Results are presented of a research task to evaluate structural responses at various subsystem mounting locations during spacecraft level test exposures to the environments of mechanical shock, acoustic noise, and random vibration. This statistical evaluation is presented in the form of recommended subsystem test specifications for these three environments as normalized to a reference set of spacecraft test levels and are thus suitable for extrapolation to a set of different spacecraft test levels. The recommendations are dependent upon a subsystem's mounting location in a spacecraft, and information is presented on how to determine this mounting zone for a given subsystem.
Recent Advances in Targeted Drug Delivery Approaches Using Dendritic Polymers
Bugno, Jason; Hsu, Hao-Jui; Hong, Seungpyo
2014-01-01
Since they were first synthesized over 30 years ago, dendrimers have seen rapid translation into various biomedical applications. A number of reports have not only demonstrated their clinical utility, but also revealed novel design approaches and strategies based on the elucidation of underlying mechanisms governing their biological interactions. This review focuses on presenting the latest advances in dendrimer design, discussing the current mechanistic understandings, and highlighting recent developments and targeted approaches using dendrimers in drug/gene delivery. PMID:26221937
Geo-Statistical Approach to Estimating Asteroid Exploration Parameters
NASA Technical Reports Server (NTRS)
Lincoln, William; Smith, Jeffrey H.; Weisbin, Charles
2011-01-01
NASA's vision for space exploration calls for a human visit to a near earth asteroid (NEA). Potential human operations at an asteroid include exploring a number of sites and analyzing and collecting multiple surface samples at each site. In this paper two approaches to formulation and scheduling of human exploration activities are compared given uncertain information regarding the asteroid prior to visit. In the first approach a probability model was applied to determine best estimates of mission duration and exploration activities consistent with exploration goals and existing prior data about the expected aggregate terrain information. These estimates were compared to a second approach or baseline plan where activities were constrained to fit within an assumed mission duration. The results compare the number of sites visited, number of samples analyzed per site, and the probability of achieving mission goals related to surface characterization for both cases.
Demarcating Advanced Learning Approaches from Methodological and Technological Perspectives
ERIC Educational Resources Information Center
Horvath, Imre; Peck, David; Verlinden, Jouke
2009-01-01
In the field of design and engineering education, the fast and expansive evolution of information and communication technologies is steadily converting traditional learning approaches into more advanced ones. Facilitated by broadband (high-bandwidth) personal computers, distance learning has developed into web-hosted electronic learning. The…
New Therapeutic Approaches for Advanced Gastrointestinal Stromal Tumors (GISTs)
Somaiah, Neeta
2010-01-01
Synopsis: The management of advanced GIST is increasingly complex due to imatinib refractory disease. Primary resistance to imatinib is uncommon, and most patients progress after development of additional genetic changes. This article reviews management strategies including surgical approaches, local modalities for progressive liver metastases, as well as novel therapeutic agents. PMID:19248977
Advance Approach to Concept and Design Studies for Space Missions
NASA Technical Reports Server (NTRS)
Deutsch, M.; Nichols, J.
1999-01-01
Recent automated and advanced techniques developed at JPL have created a streamlined and fast-track approach to initial mission conceptualization and system architecture design, answering the need for rapid turnaround of trade studies for potential proposers, as well as mission and instrument study groups.
A statistical modeling approach for detecting generalized synchronization
Schumacher, Johannes; Haslinger, Robert; Pipa, Gordon
2012-01-01
Detecting nonlinear correlations between time series presents a hard problem for data analysis. We present a generative statistical modeling method for detecting nonlinear generalized synchronization. Truncated Volterra series are used to approximate functional interactions. The Volterra kernels are modeled as linear combinations of basis splines, whose coefficients are estimated via l1 and l2 regularized maximum likelihood regression. The regularization manages the high number of kernel coefficients and allows feature selection strategies yielding sparse models. The method's performance is evaluated on different coupled chaotic systems in various synchronization regimes and analytical results for detecting m:n phase synchrony are presented. Experimental applicability is demonstrated by detecting nonlinear interactions between neuronal local field potentials recorded in different parts of macaque visual cortex. PMID:23004851
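A stripped-down version of the method's core step, expanding a lagged driver signal in basis functions and fitting the expansion to the response by regularized maximum likelihood, can be sketched as follows. Gaussian bumps stand in for the paper's B-spline basis, and a closed-form l2 (ridge) penalty stands in for the full l1/l2 machinery; a shuffled surrogate provides the null comparison.

```python
import numpy as np

rng = np.random.default_rng(7)

# Two unidirectionally coupled signals: y is a nonlinear function of lagged x.
n = 3000
x = np.sin(0.07 * np.arange(n)) + 0.3 * rng.normal(size=n)
y = np.tanh(2.0 * np.roll(x, 5)) + 0.1 * rng.normal(size=n)

# Design matrix: Gaussian-bump basis functions of x at candidate lags, a crude
# stand-in for the spline-expanded truncated Volterra series of the paper.
cols = []
for lag in range(10):
    xl = np.roll(x, lag)
    for c in np.linspace(-2, 2, 8):
        cols.append(np.exp(-0.5 * ((xl - c) / 0.5) ** 2))
X = np.column_stack(cols)

def ridge_r2(target, lam=1.0):
    """l2-regularized least-squares fit; returns the variance explained."""
    w = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ target)
    return 1.0 - np.var(target - X @ w) / np.var(target)

print(f"coupled signal:     R^2 = {ridge_r2(y):.3f}")                    # high
print(f"shuffled surrogate: R^2 = {ridge_r2(rng.permutation(y)):.3f}")   # near 0
```

A large gap between the coupled and surrogate fits is the signature of a detectable functional interaction; the paper's l1 penalty additionally sparsifies the kernel coefficients.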
Bayesian statistical approach to binary asteroid orbit determination
NASA Astrophysics Data System (ADS)
Kovalenko, Irina D.; Stoica, Radu S.; Emelyanov, N. V.; Doressoundiram, A.; Hestroffer, D.
2016-01-01
The problem of binary asteroid orbit determination is of particular interest, since knowledge of the orbit is the best way to derive the mass of the system. Orbit determination from observed points is a classic problem of celestial mechanics. However, in the case of binary asteroids, particularly with a small number of observations, the solution is not straightforward to derive. For resolved binaries the problem consists in determining the relative orbit from observed relative positions of the secondary asteroid with respect to the primary. In this work, the problem is investigated as a statistical inverse problem. Within this context, we propose a method based on Bayesian modelling together with a global optimisation procedure based on the simulated annealing algorithm.
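The two ingredients, a posterior over orbital parameters and a simulated-annealing search, can be sketched on a toy problem where the relative orbit is circular with unknown period and phase. The real problem involves full Keplerian elements and observation geometry, so everything below is an illustrative assumption.

```python
import numpy as np

rng = np.random.default_rng(8)

# Toy stand-in for the relative-orbit problem: observed positions of the
# secondary modeled as a unit circular orbit with unknown period P and phase.
t_obs = np.sort(rng.uniform(0, 100, 12))            # sparse observation epochs
P_true, phi_true = 37.0, 0.8

def positions(P, phi):
    ang = 2 * np.pi * t_obs / P + phi
    return np.column_stack([np.cos(ang), np.sin(ang)])

obs = positions(P_true, phi_true) + rng.normal(0, 0.02, (t_obs.size, 2))

def log_posterior(theta):
    """Gaussian likelihood with a flat prior on P in (10, 100)."""
    P, phi = theta
    if not 10.0 < P < 100.0:
        return -np.inf
    return -0.5 * np.sum((obs - positions(P, phi)) ** 2) / 0.02**2

# Simulated annealing: Metropolis steps with a slowly decreasing temperature.
theta = np.array([50.0, 0.0])
lp = log_posterior(theta)
best, lp_best = theta.copy(), lp
for k in range(20000):
    T = 10.0 * 0.9995**k
    prop = theta + rng.normal(0.0, [0.5, 0.1])       # proposal step sizes
    lp_prop = log_posterior(prop)
    if np.log(rng.uniform()) < (lp_prop - lp) / T:
        theta, lp = prop, lp_prop
        if lp > lp_best:
            best, lp_best = theta.copy(), lp
print(f"recovered P = {best[0]:.2f} (true {P_true}), phi = {best[1] % (2*np.pi):.2f}")
```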
Nonextensive statistical mechanics approach to electron trapping in degenerate plasmas
NASA Astrophysics Data System (ADS)
Mebrouk, Khireddine; Gougam, Leila Ait; Tribeche, Mouloud
2016-06-01
The electron trapping in a weakly nondegenerate plasma is reformulated and re-examined by incorporating the nonextensive entropy prescription. Using the q-deformed Fermi-Dirac distribution function including the quantum as well as the nonextensive statistical effects, we derive a new generalized electron density with a new contribution proportional to the electron temperature T, which may dominate the usual thermal correction (∼T^2) at very low temperatures. To make the physics behind the effect of this new contribution more transparent, we analyze the modifications arising in the propagation of ion-acoustic solitary waves. Interestingly, we find that due to the nonextensive correction, our plasma model allows the possibility of existence of quantum ion-acoustic solitons with velocity higher than the Fermi ion-sound velocity. Moreover, as the nonextensive parameter q increases, the critical temperature Tc beyond which coexistence of compressive and rarefactive solitons sets in, is shifted towards higher values.
Statistical mechanics approach to lock-key supramolecular chemistry interactions.
Odriozola, Gerardo; Lozada-Cassou, Marcelo
2013-03-01
In the supramolecular chemistry field, intuitive concepts such as molecular complementarity and molecular recognition are used to explain the mechanism of lock-key associations. However, these concepts lack a precise definition, and consequently this mechanism is not well defined and understood. Here we address the physical basis of this mechanism, based on formal statistical mechanics, through Monte Carlo simulation, and compare our results with recent experimental data for charged or uncharged lock-key colloids. We find that, given the size range of the molecules involved in these associations, the entropy contribution, driven by the solvent, dominates the interaction over that of the enthalpy. A universal behavior for the uncharged lock-key association is found. Based on our results, we propose a supramolecular chemistry definition. PMID:23521272
A Statistical Approach To An Expert Diagnostic Ultrasonic System
NASA Astrophysics Data System (ADS)
Insana, Michael F.; Wagner, Robert F.; Garra, Brian S.; Shawker, Thomas H.
1986-06-01
The techniques of statistical pattern recognition are implemented to determine the best combination of tissue characterization parameters for maximizing the diagnostic accuracy of a given task. In this paper, we considered combinations of four ultrasonic tissue parameters to discriminate between normal liver and chronic hepatitis. The separation between normal and diseased samples was made by application of the Bayes test for minimum risk which minimizes the error rate for classifying tissue states while including the prior probability for the presence of disease and the cost of misclassification. Large differences in classification performance of various tissue parameter combinations were demonstrated by ROC analysis. The power of additional features to classify tissue states, even those derived from other imaging modalities, can be compared directly in this manner.
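The Bayes test for minimum risk is compact: weight each class-conditional likelihood by its prior and by the cost of misclassifying that class, then choose the state with the lower expected cost. The densities, prevalence, and costs below are invented for illustration; the paper's feature vectors combine four ultrasonic tissue parameters.

```python
import numpy as np

# Hypothetical 1-D feature (e.g., one ultrasonic tissue parameter) with
# Gaussian class-conditional densities for normal liver vs chronic hepatitis.
mu = {"normal": 0.50, "hepatitis": 0.65}
sd = {"normal": 0.05, "hepatitis": 0.06}
prior = {"normal": 0.9, "hepatitis": 0.1}    # assumed prior probability of disease
cost_fn, cost_fp = 10.0, 1.0                 # missing disease costs more than a false alarm

def gauss(x, m, s):
    return np.exp(-0.5 * ((x - m) / s) ** 2) / (s * np.sqrt(2 * np.pi))

def decide(x):
    """Bayes minimum-risk rule: call 'hepatitis' when the expected cost of
    calling 'normal' exceeds the expected cost of calling 'hepatitis'."""
    post_h = gauss(x, mu["hepatitis"], sd["hepatitis"]) * prior["hepatitis"]
    post_n = gauss(x, mu["normal"], sd["normal"]) * prior["normal"]
    return "hepatitis" if cost_fn * post_h > cost_fp * post_n else "normal"

for x in (0.50, 0.58, 0.70):
    print(f"feature = {x:.2f} -> {decide(x)}")
```

With zero cost for correct decisions, this rule reduces to comparing cost-weighted posteriors, which is the form used above.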
A statistical approach to the temporal development of orbital associations
NASA Astrophysics Data System (ADS)
Kastinen, D.; Kero, J.
2016-01-01
We have performed preliminary studies on the use of a Monte Carlo based statistical toolbox for small-body solar system dynamics to find trends in the temporal development of orbital associations. As part of this preliminary study, four different similarity functions were implemented and applied to the 21P/Giacobini-Zinner meteoroid stream and the resulting simulated meteor showers. The simulations indicate that the temporal behavior of orbital element distributions in the meteoroid stream and the meteor shower differ on century-long time scales. The configuration of the meteor shower remains compact for a long time and dissipates an order of magnitude more slowly than the stream. The main effect driving the shower dissipation is shown to be the addition of new trails to the stream.
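The abstract does not name the four similarity functions; the classic Southworth-Hawkins D-criterion is a representative choice and can be sketched as follows (the orbit values are illustrative only, loosely Draconid-like).

```python
import numpy as np

def d_sh(o1, o2):
    """Southworth-Hawkins D-criterion between two orbits, each given as
    (q [AU], e, i, node, argp) with angles in radians. The sign convention
    for large node differences is ignored in this sketch."""
    q1, e1, i1, O1, w1 = o1
    q2, e2, i2, O2, w2 = o2
    # squared chord of the mutual inclination I21: (2 sin(I21/2))^2
    s2I = (2 * np.sin((i2 - i1) / 2)) ** 2 \
        + np.sin(i1) * np.sin(i2) * (2 * np.sin((O2 - O1) / 2)) ** 2
    cI2 = np.sqrt(1.0 - s2I / 4.0)               # cos(I21/2)
    # difference of longitudes of perihelion measured from the mutual node
    pi21 = (w2 - w1) + 2 * np.arcsin(
        np.cos((i2 + i1) / 2) * np.sin((O2 - O1) / 2) / cI2)
    D2 = (q2 - q1) ** 2 + (e2 - e1) ** 2 + s2I \
        + ((e1 + e2) / 2) ** 2 * (2 * np.sin(pi21 / 2)) ** 2
    return np.sqrt(D2)

# Illustrative orbit pairs: (q, e, i, node, argp)
a = (0.996, 0.706, np.radians(31.9), np.radians(195.4), np.radians(172.8))
b = (0.990, 0.710, np.radians(32.3), np.radians(196.0), np.radians(173.5))
print(f"D_SH = {d_sh(a, b):.4f}")   # small values suggest a common origin
```

Tracking how the distribution of pairwise D values widens with time is one way to quantify the dissipation rates contrasted in the abstract.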
Statistical Approaches to Aerosol Dynamics for Climate Simulation
Zhu, Wei
2014-09-02
In this work, we introduce two general non-parametric regression analysis methods for errors-in-variables (EIV) models: the compound regression and the constrained regression. It is shown that these approaches are equivalent to each other and to the general parametric structural modeling approach. The advantages of these methods lie in their intuitive geometric representations, their distribution-free nature, and their ability to offer a practical solution when the ratio of the error variances is unknown. Each includes the classic non-parametric regression methods of ordinary least squares, geometric mean regression, and orthogonal regression as special cases. Both methods can be readily generalized to multiple linear regression with two or more random regressors.
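As an illustration of the special cases mentioned above, the sketch below contrasts the ordinary least squares and geometric mean regression slope estimates on synthetic errors-in-variables data; the data and noise levels are invented for the demonstration.

    import numpy as np

    rng = np.random.default_rng(0)
    x_true = np.linspace(0.0, 10.0, 200)
    y_true = 2.0 * x_true + 1.0
    x = x_true + rng.normal(0.0, 1.0, x_true.size)  # noisy regressor (EIV)
    y = y_true + rng.normal(0.0, 1.0, y_true.size)  # noisy response

    sx, sy = np.std(x, ddof=1), np.std(y, ddof=1)
    r = np.corrcoef(x, y)[0, 1]

    slope_ols = r * sy / sx           # ordinary least squares of y on x
    slope_gmr = np.sign(r) * sy / sx  # geometric mean regression
    print(f"OLS: {slope_ols:.3f}  GMR: {slope_gmr:.3f}  (true slope: 2)")

OLS attenuates the slope when the regressor carries error, while GMR, which treats x and y symmetrically, is far less attenuated here; the compound and constrained regressions of the abstract interpolate between such special cases as the assumed error-variance ratio changes.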
Inverse problems and computational cell metabolic models: a statistical approach
NASA Astrophysics Data System (ADS)
Calvetti, D.; Somersalo, E.
2008-07-01
In this article, we give an overview of the Bayesian modelling of metabolic systems at the cellular and subcellular level. The models are based on a detailed description of the key biochemical reactions occurring in tissue, which may in turn be compartmentalized into cytosol and mitochondria, and of the transports between the compartments. The classical deterministic approach, which models metabolic systems as dynamical systems with Michaelis-Menten kinetics, is replaced by a stochastic extension where the model parameters are interpreted as random variables with an appropriate probability density. The inverse problem of cell metabolism in this setting consists of estimating the density of the model parameters. After discussing some possible approaches to solving the problem, we address the issue of how to assess the reliability of the predictions of a stochastic model by proposing an output analysis in terms of model uncertainties. Visualization modalities for organizing the large amount of information provided by the Bayesian dynamic sensitivity analysis are also illustrated.
Application of statistical physics approaches to complex organizations
NASA Astrophysics Data System (ADS)
Matia, Kaushik
The first part of this thesis studies two different kinds of financial markets, namely, the stock market and the commodity market. Stock price fluctuations display certain scale-free statistical features that are not unlike those found in strongly-interacting physical systems. The possibility that new insights can be gained using concepts and methods developed to understand scale-free physical phenomena has stimulated considerable research activity in the physics community. In the first part of this thesis a comparative study of stocks and commodities is performed in terms of the probability density function and correlations of stock price fluctuations. It is found that the probability density of the stock price fluctuation has a power-law functional form with an exponent 3, which is similar across different markets around the world. We present an autoregressive model to explain the origin of the power-law functional form of the probability density function of the price fluctuation. The first part also presents the discovery of unique features of the Indian economy, which we find displays a scale-dependent probability density function. In the second part of this thesis we quantify the statistical properties of fluctuations of complex systems like business firms and world scientific publications. We analyze the class size of these systems, in which units agglomerate to form classes. We find that the width of the probability density function of the growth rate decays with the class size as a power law with an exponent beta which is universal in the sense that beta is independent of the system studied. We also identify two other scaling exponents: gamma, connecting the unit size to the class size, and gamma, connecting the number of units to the class size, where products are units and firms are classes. Finally we propose a generalized preferential attachment model to describe the class size distribution. This model is successful in explaining the growth rate and class size distributions.
Advanced Safeguards Approaches for New TRU Fuel Fabrication Facilities
Durst, Philip C.; Ehinger, Michael H.; Boyer, Brian; Therios, Ike; Bean, Robert; Dougan, A.; Tolk, K.
2007-12-15
This second report in a series of three reviews possible safeguards approaches for the new transuranic (TRU) fuel fabrication processes to be deployed at AFCF – specifically, the ceramic TRU (MOX) fuel fabrication line and the metallic (pyroprocessing) line. The most common TRU fuel has been fuel composed of mixed plutonium and uranium dioxide, referred to as “MOX”. However, under the Advanced Fuel Cycle projects custom-made fuels with higher contents of neptunium, americium, and curium may also be produced to evaluate if these “minor actinides” can be effectively burned and transmuted through irradiation in the ABR. A third and final report in this series will evaluate and review the advanced safeguards approach options for the ABR. In reviewing and developing the advanced safeguards approach for the new TRU fuel fabrication processes envisioned for AFCF, the existing international (IAEA) safeguards approach at the Plutonium Fuel Production Facility (PFPF) and the conceptual approach planned for the new J-MOX facility in Japan have been considered as a starting point of reference. The pyro-metallurgical reprocessing and fuel fabrication process at EBR-II near Idaho Falls also provided insight for safeguarding the additional metallic pyroprocessing fuel fabrication line planned for AFCF.
Clusterization of water molecules as deduced from statistical mechanical approach
NASA Astrophysics Data System (ADS)
Krasnoholovets, Volodymyr
2004-12-01
Using the methods of statistical mechanics we have shown that a homogeneous water network is unstable and spontaneously disintegrates into a nonhomogeneous state (i.e. peculiar clusters), which can be treated as the ordinary state of liquid water. The major peculiarity of the concept is that it separates the paired potential into two independent components, the attractive potential and the repulsive one, which in turn feature very different dependences on the distance from the particle (a water molecule in the present case). We choose the interaction potential as a combination of the ionic crystal potential and the vibratory potential associated with the elastic properties of the water system as a whole. The number ℵ of water molecules that enter a cluster is calculated as a function of several parameters, such as the dielectric constant, the mass of a water molecule, the distance between nearest molecules, and the vibrations of nearest molecules in their nodes. The number of H2O molecules that comprise a cluster is estimated to be about ℵ ≈ 900, which agrees with the available experimental data.
Glass viscosity calculation based on a global statistical modelling approach
Fluegel, Alex
2007-02-01
A global statistical glass viscosity model was developed for predicting the complete viscosity curve, based on more than 2200 composition-property data for silicate glasses from the scientific literature, including soda-lime-silica container and float glasses, TV panel glasses, borosilicate fiber wool and E-type glasses, low-expansion borosilicate glasses, glasses for nuclear waste vitrification, lead crystal glasses, binary alkali silicates, and various further compositions from over half a century. It is shown that within a measurement series from a specific laboratory the reported viscosity values are often over-estimated at higher temperatures due to alkali and boron oxide evaporation during the measurement and glass preparation, including data by Lakatos et al. (1972) and the recently published High temperature glass melt property database for process modeling by Seward et al. (2005). Similarly, in the glass transition range many experimental data for borosilicate glasses are reported too high due to phase separation effects. The developed global model corrects those errors. The model standard error was 9-17°C, with R² = 0.985-0.989. The prediction 95% confidence interval for glass in mass production largely depends on the glass composition of interest, the composition uncertainty, and the viscosity level. New insights into the mixed-alkali effect are provided.
[Statistical Process Control applied to viral genome screening: experimental approach].
Reifenberg, J M; Navarro, P; Coste, J
2001-10-01
During the National Multicentric Study concerning the introduction of NAT for HCV and HIV-1 viruses in blood donation screening, which was supervised by the Medical and Scientific departments of the French Blood Establishment (Etablissement français du sang--EFS), Transcription-Mediated Amplification (TMA) technology (Chiron/Gen-Probe) was tested in the Molecular Biology Laboratory of Montpellier, EFS Pyrénées-Méditerranée. After a preliminary phase of qualification of the material and training of the technicians, routine screening of homologous blood and apheresis donations using this technology was applied for two months. In order to evaluate the different NAT systems, exhaustive daily operations and data were registered. Among these, the luminescence results, expressed as RLU, of the positive and negative calibrators and the associated internal controls were analysed using control charts, the Statistical Process Control methods that make it possible to detect process drift rapidly and to anticipate the occurrence of incidents. This study demonstrated the value of these quality control methods, mainly used for industrial purposes, for following and improving the quality of any transfusion process. It also showed the difficulty of retrospectively investigating uncontrolled sources of variation in a process that was still experimental. Such tools are fully in accordance with the new version of the ISO 9000 norms, which are particularly focused on the use of adapted indicators for process control, and could be extended to other transfusion activities, such as blood collection and component preparation. PMID:11729395
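A minimal sketch of the control-chart idea applied to run-calibrator luminescence: the RLU values below are invented, and classical Shewhart 3-sigma limits computed from a baseline phase stand in for whatever chart rules the laboratory actually used.

    import numpy as np

    # Hypothetical daily RLU readings of a positive run calibrator.
    rlu = np.array([1182, 1250, 1198, 1304, 1221, 1189, 1276, 1243,
                    1310, 1150, 1232, 1287, 1199, 1265, 1241, 1490])

    center = rlu[:10].mean()       # baseline from a qualification phase
    sigma = rlu[:10].std(ddof=1)
    ucl, lcl = center + 3 * sigma, center - 3 * sigma  # 3-sigma limits

    for day, value in enumerate(rlu, start=1):
        flag = "" if lcl <= value <= ucl else "OUT OF CONTROL"
        print(f"day {day:2d}: {value:5d} {flag}")

The final reading falls above the upper control limit and would be flagged, which is exactly the kind of drift detection the abstract describes.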
The interaction of physical properties of seawater via statistical approach
NASA Astrophysics Data System (ADS)
Hamzah, Firdaus Mohamad; Jaafar, Othman; Sabri, Samsul Rijal Mohd; Ismail, Mohd Tahir; Jaafar, Khamisah; Arbin, Norazman
2015-09-01
It is of importance to determine the relationships between physical parameters in marine ecology. Models and expert opinion are needed to explore the form of the relationship between two parameters due to the complexity of the ecosystems, and these need justification with data observed over particular periods. Novel statistical techniques such as nonparametric regression are presented to investigate these ecological relationships, demonstrated on the features of pH, salinity and conductivity in the Straits of Johor. The monthly measurements from 2004 until 2013 at a chosen sampling location are examined. Testing for no effect, followed by linearity testing, for the relationships between salinity and pH, conductivity and pH, and conductivity and salinity is carried out, with the ecological objective of investigating the evidence of changes in each of the above physical parameters. The findings reveal the appropriateness of a smooth function to explain the variation of pH in response to changes in salinity, whilst the changes in conductivity with regard to different concentrations of salinity could be modelled parametrically. The analysis highlights the importance of both parametric and nonparametric models for assessing the ecological response to environmental change in seawater.
Statistical approach to anatomical landmark extraction in AP radiographs
NASA Astrophysics Data System (ADS)
Bernard, Rok; Pernus, Franjo
2001-07-01
A novel method for the automated extraction of important geometrical parameters of the pelvis and hips from APR images is presented. The shape and intensity variations in APR images are encompassed by the statistical shape and appearance models built from a set of training images for each of the three anatomies, i.e., pelvis, right and left hip, separately. The identification of the pelvis and hips is defined as a flexible object recognition problem, which is solved by generating anatomically plausible object instances and matching them to the APR image. The criterion function minimizes the resulting match error and considers the object topology. The obtained flexible object defines the positions of anatomical landmarks, which are further used to calculate the hip joint contact stress. A leave-one-out test was used to evaluate the performance of the proposed method on a set of 26 APR images. The results show the method is able to properly treat image variations and can reliably and accurately identify anatomies in the image and extract the anatomical landmarks needed in the hip joint contact stress calculation.
Territorial developments based on graffiti: A statistical mechanics approach
NASA Astrophysics Data System (ADS)
Barbaro, Alethea B. T.; Chayes, Lincoln; D'Orsogna, Maria R.
2013-01-01
We study the well-known sociological phenomenon of gang aggregation and territory formation through an interacting agent system defined on a lattice. We introduce a two-gang Hamiltonian model where agents have red or blue affiliation but are otherwise indistinguishable. In this model, all interactions are indirect and occur only via graffiti markings, on-site as well as on nearest-neighbor locations. We also allow for gang proliferation and graffiti suppression. Within the context of this model, we show that gang clustering and territory formation may arise under specific parameter choices and that a phase transition may occur between well-mixed, possibly dilute configurations and well-separated, clustered ones. Using methods from statistical mechanics, we study the phase transition between these two qualitatively different scenarios. In the mean-field rendition of this model, we identify parameter regimes where the transition is first or second order. In all cases, we have found that the transitions are a consequence solely of the gang-to-graffiti couplings, implying that direct gang-to-gang interactions are not strictly necessary for gang territory formation; in particular, graffiti may be the sole driving force behind gang clustering. We further discuss possible sociological, as well as ecological, ramifications of our results.
Statistical approaches to short-term electricity forecasting
NASA Astrophysics Data System (ADS)
Kellova, Andrea
The study of the short-term forecasting of electricity demand has played a key role in the economic optimization of the electric energy industry and is essential for power systems planning and operation. In electric energy markets, accurate short-term forecasting of electricity demand is necessary mainly for economic operations. Our focus is directed to the question of electricity demand forecasting in the Czech Republic. Firstly, we describe the current structure and organization of the Czech, as well as the European, electricity market. Secondly, we provide a comprehensive description of the most powerful external factors influencing electricity consumption. The choice of the most appropriate model is conditioned by these demand-determining factors. Thirdly, we build several types of multivariate forecasting models, both linear and nonlinear. These models are, respectively, linear regression models and artificial neural networks. Finally, we compare the forecasting power of both kinds of models using several statistical accuracy measures. Our results suggest that although electricity demand forecasting in the Czech Republic was, for the years considered, a nonlinear rather than a linear problem, simple linear models with nonlinear inputs can be adequate for practical purposes. This is confirmed by the values of the empirical loss function applied to the forecasting results.
Bayesian Statistical Approach To Binary Asteroid Orbit Determination
NASA Astrophysics Data System (ADS)
Dmitrievna Kovalenko, Irina; Stoica, Radu S.
2015-08-01
Orbit determination from observations is one of the classical problems in celestial mechanics. Deriving the trajectory of a binary asteroid with high precision is much more complicated than deriving that of a single asteroid. Here we present a method of orbit determination based on a Markov chain Monte Carlo (MCMC) algorithm. This method can be used for preliminary orbit determination with a relatively small number of observations, or for the adjustment of a previously determined orbit. The problem consists in determining a conditional a posteriori probability density given the observations. Applying Bayesian statistics, the a posteriori probability density of the binary asteroid orbital parameters is proportional to the product of the a priori and likelihood probability densities. The likelihood function is related to the noise probability density and can be calculated from O-C deviations (Observed minus Calculated positions). The optionally used a priori probability density takes into account information about the population of discovered asteroids and is used to constrain the phase space of possible orbits. As the MCMC method, the Metropolis-Hastings algorithm has been applied, with the addition of a globally convergent coefficient. The sequence of possible orbits is derived by sampling each orbital parameter and applying the acceptance criterion. The method makes it possible to map the phase space of possible orbits parameter by parameter; it can also be used to derive the single orbit with the highest posterior probability density.
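The following Python sketch shows the Metropolis-Hastings core of such an approach under a flat prior and Gaussian O-C noise. The orbit propagator is replaced by a toy sinusoidal position model, all parameter values and the 0.05-arcsecond noise level are invented, and the authors' globally convergent coefficient is omitted.

    import numpy as np

    rng = np.random.default_rng(42)

    def model_position(params, times):
        # Toy stand-in for the binary-orbit propagator: a circular-orbit
        # projection with parameters (amplitude, period, phase).
        a, p, phi = params
        return a * np.sin(2.0 * np.pi * times / p + phi)

    def log_posterior(params, observed, times):
        # Flat prior assumed; Gaussian likelihood built from O-C residuals.
        resid = observed - model_position(params, times)
        return -0.5 * np.sum((resid / 0.05) ** 2)

    def metropolis_hastings(observed, times, start, n_steps=20000, step=0.01):
        current = np.asarray(start, dtype=float)
        logp = log_posterior(current, observed, times)
        chain = [current.copy()]
        for _ in range(n_steps):
            proposal = current + rng.normal(0.0, step, current.size)
            logp_new = log_posterior(proposal, observed, times)
            if np.log(rng.uniform()) < logp_new - logp:  # acceptance rule
                current, logp = proposal, logp_new
            chain.append(current.copy())
        return np.array(chain)

    times = np.linspace(0.0, 3.0, 40)
    observed = model_position((0.8, 1.2, 0.3), times) + rng.normal(0.0, 0.05, 40)
    chain = metropolis_hastings(observed, times, start=(1.0, 1.0, 0.0))
    print(chain[5000:].mean(axis=0))  # posterior means after burn-in

The retained chain samples map out the posterior over the parameters; the single most probable orbit corresponds to the chain state with the highest posterior density.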
Modeling Insurgent Dynamics Including Heterogeneity. A Statistical Physics Approach
NASA Astrophysics Data System (ADS)
Johnson, Neil F.; Manrique, Pedro; Hui, Pak Ming
2013-05-01
Despite the myriad complexities inherent in human conflict, a common pattern has been identified across a wide range of modern insurgencies and terrorist campaigns involving the severity of individual events—namely an approximate power law x^(-α) with exponent α ≈ 2.5. We recently proposed a simple toy model to explain this finding, built around the reported loose and transient nature of operational cells of insurgents or terrorists. Although it reproduces the 2.5 power law, this toy model assumes every actor is identical. Here we generalize this toy model to incorporate individual heterogeneity while retaining the model's analytic solvability. In the case of kinship or team rules guiding the cell dynamics, we find that this 2.5 analytic result persists—however an interesting new phase transition emerges whereby this cell distribution undergoes a transition to a phase in which the individuals become isolated and hence all the cells have spontaneously disintegrated. Apart from extending our understanding of the empirical 2.5 result for insurgencies and terrorism, this work illustrates how other statistical physics models of human grouping might usefully be generalized in order to explore the effect of diverse human social, cultural or behavioral traits.
A Statistical Approach to Provide Individualized Privacy for Surveys
Esponda, Fernando; Huerta, Kael; Guerrero, Victor M.
2016-01-01
In this paper we propose an instrument for collecting sensitive data that allows for each participant to customize the amount of information that she is comfortable revealing. Current methods adopt a uniform approach where all subjects are afforded the same privacy guarantees; however, privacy is a highly subjective property with intermediate points between total disclosure and non-disclosure: each respondent has a different criterion regarding the sensitivity of a particular topic. The method we propose empowers respondents in this respect while still allowing for the discovery of interesting findings through the application of well-known inferential procedures. PMID:26824758
Statistical Approaches for Estimating Actinobacterial Diversity in Marine Sediments
Stach, James E. M.; Maldonado, Luis A.; Masson, Douglas G.; Ward, Alan C.; Goodfellow, Michael; Bull, Alan T.
2003-01-01
Bacterial diversity in a deep-sea sediment was investigated by constructing actinobacterium-specific 16S ribosomal DNA (rDNA) clone libraries from sediment sections taken 5 to 12, 15 to 18, and 43 to 46 cm below the sea floor at a depth of 3,814 m. Clones were placed into operational taxonomic unit (OTU) groups with ≥99% 16S rDNA sequence similarity; the cutoff value for an OTU was derived by comparing 16S rRNA homology with DNA-DNA reassociation values for members of the class Actinobacteria. Diversity statistics were used to determine how the level of dominance, species richness, and genetic diversity varied with sediment depth. The reciprocal of Simpson's index (1/D) indicated that the pattern of diversity shifted toward dominance from uniformity with increasing sediment depth. Nonparametric estimation of the species richness in the 5- to 12-, 15- to 18-, and 43- to 46-cm sediment sections revealed a trend of decreasing species number with depth, 1,406, 308, and 212 OTUs, respectively. Application of the LIBSHUFF program indicated that the 5- to 12-cm clone library was composed of OTUs significantly (P = 0.001) different from those of the 15- to 18- and 43- to 46-cm libraries. FST and phylogenetic grouping of taxa (P tests) were both significant (P < 0.00001 and P < 0.001, respectively), indicating that genetic diversity decreased with sediment depth and that each sediment community harbored unique phylogenetic lineages. It was also shown that even nonconservative OTU definitions result in severe underestimation of species richness; unique phylogenetic clades detected in one OTU group suggest that OTUs do not correspond to real ecological groups sensu Palys (T. Palys, L. K. Nakamura, and F. M. Cohan, Int. J. Syst. Bacteriol. 47:1145-1156, 1997). Mechanisms responsible for diversity and their implications are discussed. PMID:14532080
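For readers unfamiliar with the diversity statistic used here, a short sketch of the reciprocal of Simpson's index on made-up clone counts (the counts are illustrative, not the paper's data):

    def inverse_simpson(otu_counts):
        """Reciprocal of Simpson's index, 1/D: higher values indicate a more
        uniform community, lower values indicate dominance by a few OTUs."""
        n = sum(otu_counts)
        d = sum(c * (c - 1) for c in otu_counts) / (n * (n - 1))
        return 1.0 / d

    shallow = [12, 10, 9, 8, 8, 7, 6, 5, 5, 4]  # fairly even community
    deep = [40, 5, 3, 2, 2, 1, 1]               # dominated by one OTU
    print(inverse_simpson(shallow))  # ~10: high uniformity
    print(inverse_simpson(deep))     # ~2: strong dominance

The shift from the first to the second pattern is the kind of depth-driven move from uniformity toward dominance the abstract reports.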
Evaluation of current statistical approaches for predictive geomorphological mapping
NASA Astrophysics Data System (ADS)
Luoto, Miska; Hjort, Jan
2005-04-01
Predictive models are increasingly used in geomorphology, but systematic evaluations of novel statistical techniques are still limited. The aim of this study was to compare the accuracy of generalized linear models (GLM), generalized additive models (GAM), classification tree analysis (CTA), neural networks (ANN) and multiple adaptive regression splines (MARS) in predictive geomorphological modelling. Five different distribution models both for non-sorted and sorted patterned ground were constructed on the basis of four terrain parameters and four soil variables. To evaluate the models, the original data set of 9997 squares of 1 ha in size was randomly divided into model training (70%, n=6998) and model evaluation sets (30%, n=2999). In general, active sorted patterned ground is clearly defined in upper fell areas with high slope angle and till soils. Active non-sorted patterned ground is more common in valleys with higher soil moisture and fine-scale concave topography. The predictive performance of each model was evaluated using the area under the receiver operating characteristic curve (AUC) and the Kappa value. The relatively high discrimination capacity of all models, AUC = 0.85-0.88 and Kappa = 0.49-0.56, implies that the models' predictions provide an acceptable index of sorted and non-sorted patterned ground occurrence. The best performance on the model calibration data for both data sets was achieved by CTA. However, when the predictive mapping ability was explored through the evaluation data set, the accuracy of CTA decreased markedly compared to the other modelling techniques. For the model evaluation data, MARS performed marginally best. Our results show that digital elevation model and soil data can be used to predict relatively robustly the activity of patterned ground at a fine scale in a subarctic landscape. This indicates that predictive geomorphological modelling has the advantage of providing relevant and useful information on earth surface processes.
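A minimal sketch of the evaluation protocol (70/30 split, AUC and Kappa) using scikit-learn, with synthetic stand-ins for the terrain and soil predictors; only logistic regression, the GLM of the study, is shown, and all data are simulated.

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import roc_auc_score, cohen_kappa_score

    rng = np.random.default_rng(1)
    # Synthetic predictors standing in for terrain parameters and soil variables.
    X = rng.normal(size=(9997, 8))
    logit = 1.2 * X[:, 0] - 0.8 * X[:, 1] + 0.5 * X[:, 2]
    y = (logit + rng.logistic(size=X.shape[0])) > 0  # patterned-ground presence

    # 70% training / 30% evaluation, mirroring the paper's protocol.
    X_tr, X_ev, y_tr, y_ev = train_test_split(X, y, test_size=0.3, random_state=0)

    glm = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
    prob = glm.predict_proba(X_ev)[:, 1]
    print("AUC:  ", round(roc_auc_score(y_ev, prob), 3))
    print("Kappa:", round(cohen_kappa_score(y_ev, prob > 0.5), 3))

Evaluating on the held-out 30% rather than the calibration data is what exposed the overfitting of CTA reported above.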
Statistical physics approaches to quantifying sleep-stage transitions
NASA Astrophysics Data System (ADS)
Lo, Chung-Chuan
Sleep can be viewed as a sequence of transitions in a very complex neuronal system. Traditionally, studies of the dynamics of sleep control have focused on the circadian rhythm of sleep-wake transitions or on the ultradian rhythm of the sleep cycle. However, very little is known about the mechanisms responsible for the time structure or even the statistics of the rapid sleep-stage transitions that appear without periodicity. I study the time dynamics of sleep-wake transitions for different species, including humans, rats, and mice, and find that the wake and sleep episodes exhibit completely different behaviors: the durations of wake episodes are characterized by a scale-free power-law distribution, while the durations of sleep episodes have an exponential distribution with a characteristic time scale. The functional forms of the distributions of the sleep and wake durations hold for human subjects of different ages and for subjects with sleep apnea. They also hold for all the species I investigate. Surprisingly, all species have the same power-law exponent for the distribution of wake durations, but the exponential characteristic time of the distribution of sleep durations changes across species. I develop a stochastic model which accurately reproduces our empirical findings. The model suggests that the difference between the dynamics of the sleep and wake states arises from the constraints on the number of microstates in the sleep-wake system. I develop a measure of asymmetry in sleep-stage transitions using a transition probability matrix. I find that both normal and sleep apnea subjects are characterized by two types of asymmetric sleep-stage transition paths, and that the sleep apnea group exhibits less asymmetry in the sleep-stage transitions.
A Nonequilibrium Statistical Thermodynamics Approach to Non-Gaussian Statistics in Space Plasmas.
NASA Astrophysics Data System (ADS)
Consolini, G.
2005-12-01
One of the most interesting aspects of magnetic field and plasma parameter fluctuations is the non-Gaussian shape of the Probability Distribution Functions (PDFs). This fact, along with the occurrence of scaling features, has been read as evidence of intermittency. In the past, several models have been proposed for the non-Gaussianity of the PDFs (Castaing et al., 1990; Frisch, 1996; Frisch & Sornette, 1997; Arimitsu & Arimitsu, 2000; Beck, 2000; Leubner & Vörös, 2005). Recently, by introducing the concept of randomized operational temperature, Beck & Cohen proposed superstatistics (Beck & Cohen, 2003; Beck, 2004) as the origin of non-Gaussian PDFs in nonequilibrium, long-range correlated systems. Here, the origin of non-Gaussian PDFs in space plasmas is discussed in the framework of composite thermodynamic systems, starting from the idea of randomized operational temperature and using the concept of the Lévy transformation. This approach is motivated by recent theoretical and experimental evidence of multiscale magnetic and plasma structures in space plasmas (Chang, 1999; Chang et al., 2004). A novel shape of the small-scale PDFs is derived and compared with PDFs computed from magnetic field measurements in space plasmas.
Improved Test Planning and Analysis Through the Use of Advanced Statistical Methods
NASA Technical Reports Server (NTRS)
Green, Lawrence L.; Maxwell, Katherine A.; Glass, David E.; Vaughn, Wallace L.; Barger, Weston; Cook, Mylan
2016-01-01
The goal of this work is, through computational simulations, to provide statistically-based evidence to convince the testing community that a distributed testing approach is superior to a clustered testing approach for most situations. For clustered testing, numerous, repeated test points are acquired at a limited number of test conditions. For distributed testing, only one or a few test points are requested at many different conditions. The statistical techniques of Analysis of Variance (ANOVA), Design of Experiments (DOE) and Response Surface Methods (RSM) are applied to enable distributed test planning, data analysis and test augmentation. The D-Optimal class of DOE is used to plan an optimally efficient single- and multi-factor test. The resulting simulated test data are analyzed via ANOVA and a parametric model is constructed using RSM. Finally, ANOVA can be used to plan a second round of testing to augment the existing data set with new data points. The use of these techniques is demonstrated through several illustrative examples. To date, many thousands of comparisons have been performed and the results strongly support the conclusion that the distributed testing approach outperforms the clustered testing approach.
ERIC Educational Resources Information Center
Perrett, Jamis J.
2012-01-01
This article demonstrates how textbooks differ in their description of the term "experimental unit". Advanced Placement Statistics teachers and students are often limited in their statistical knowledge by the information presented in their classroom textbook. Definitions and descriptions differ among textbooks as well as among different editions…
Statistical Physics Approaches to Respiratory Dynamics and Lung Structure
NASA Astrophysics Data System (ADS)
Suki, Bela
2004-03-01
The lung consists of a branching airway tree embedded in viscoelastic tissue and provides life-sustaining gas exchange to the body. In diseases, its structure is damaged and its function is compromised. We review two recent works about lung structure and dynamics and how they change in disease. 1) We introduced a new acoustic imaging approach to study airway structure. When airways in a collapsed lung are inflated, they pop open in avalanches. A single opening emits a sound package called crackle consisting of an initial spike (s) followed by ringing. The distribution n(s) of s follows a power law and the exponent of n(s) can be used to calculate the diameter ratio d defined as the ratio of the diameters of an airway to that of its parent averaged over all bifurcations. To test this method, we measured crackles in dogs, rabbits, rats and mice by inflating collapsed isolated lungs with air or helium while recording crackles with a microphone. In each species, n(s) follows a power law with an exponent that depends on species, but not on gas in agreement with theory. Values of d from crackles compare well with those calculated from morphometric data suggesting that this approach is suitable to study airway structure in disease. 2) Using novel experiments and computer models, we studied pulmonary emphysema which is caused by cigarette smoking. In emphysema, the elastic protein fibers of the tissue are actively remodeled by lung cells due to the chemicals present in smoke. We measured the mechanical properties of tissue sheets from normal and emphysematous lungs and imaged its structure which appears as a heterogeneous hexagonal network of fibers. We found evidence that during uniaxial stretching, the collagen and elastin fibers in emphysematous tissue can fail at a critical stress generating holes of various sizes (h). We developed network models of the failure process. When the failure is governed by mechanical forces, the distribution n(h) of h is a power law which
Biorefinery approach for coconut oil valorisation: a statistical study.
Bouaid, Abderrahim; Martínez, Mercedes; Aracil, José
2010-06-01
The biorefinery approach, consisting of transesterification using methanol with potassium hydroxide as catalyst, has been used to assess coconut oil valorisation. Due to the fatty acid composition of coconut oil, low (LMWME) and high (HMWME) molecular weight fatty acid methyl esters were obtained. Methyl laurate (78.30 wt.%) is the major component of the low molecular weight fraction. The influence of variables such as temperature and catalyst concentration on the production of both fractions has been studied and optimized by means of factorial design and response surface methodology (RSM). Two separate optima were found: a catalyst concentration of 0.9% at an operating temperature of 42.5 °C for LMWME, and 1% at 57 °C for HMWME, giving conversions of 77.54% and 25.41%, respectively. The valuable components of LMWME may be recovered for sale as biolubricants or biosolvents, while the remaining fraction could be used as biodiesel, matching the corresponding European Standard. PMID:20129777
Jensen-Feynman approach to the statistics of interacting electrons
Pain, Jean-Christophe; Gilleron, Franck; Faussurier, Gerald
2009-08-15
Faussurier et al. [Phys. Rev. E 65, 016403 (2001)] proposed to use a variational principle relying on the Jensen-Feynman (or Gibbs-Bogoliubov) inequality in order to optimize the accounting for two-particle interactions in the calculation of canonical partition functions. It consists of a decomposition into a reference electron system and a first-order correction. The procedure proves very efficient for evaluating the free energy and the orbital populations. In this work, we present numerical applications of the method and propose to extend it using a reference energy which includes the interaction between two electrons inside a given orbital. This is possible thanks to our efficient recursion relation for the calculation of partition functions. We also show, however, that a linear reference energy is usually sufficient to achieve good precision, and that the most promising way to improve the approach of Faussurier et al. is to apply Jensen's inequality to a more convenient convex function.
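For reference, the Jensen-Feynman (Gibbs-Bogoliubov) bound invoked above can be written, in standard notation (ours, not necessarily the paper's), as

    F \le F_0 + \langle H - H_0 \rangle_0

where H_0 is a solvable reference Hamiltonian, F_0 its free energy, and \langle \cdot \rangle_0 denotes the average in the reference ensemble; minimizing the right-hand side over the free parameters of H_0 selects the optimal reference system.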
Predicting major element mineral/melt equilibria - A statistical approach
NASA Technical Reports Server (NTRS)
Hostetler, C. J.; Drake, M. J.
1980-01-01
Empirical equations have been developed for calculating the mole fractions of NaO0.5, MgO, AlO1.5, SiO2, KO0.5, CaO, TiO2, and FeO in a solid phase of initially unknown identity given only the composition of the coexisting silicate melt. The approach involves a linear multivariate regression analysis in which solid composition is expressed as a Taylor series expansion of the liquid compositions. An internally consistent precision of approximately 0.94 is obtained, that is, the nature of the liquidus phase in the input data set can be correctly predicted for approximately 94% of the entries. The composition of the liquidus phase may be calculated to better than 5 mol % absolute. An important feature of this 'generalized solid' model is its reversibility; that is, the dependent and independent variables in the linear multivariate regression may be inverted to permit prediction of the composition of a silicate liquid produced by equilibrium partial melting of a polymineralic source assemblage.
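A sketch of the underlying computation, linear multivariate least squares with a first-order (Taylor) design in the liquid composition, on fully synthetic data; the number of components matches the eight oxides listed, but the coefficients and compositions are invented.

    import numpy as np

    rng = np.random.default_rng(7)
    n_samples, n_oxides = 300, 8  # NaO0.5, MgO, AlO1.5, SiO2, KO0.5, CaO, TiO2, FeO

    # Hypothetical liquid compositions and a linear liquid-to-solid map.
    liquid = rng.dirichlet(np.ones(n_oxides), size=n_samples)
    design = np.hstack([np.ones((n_samples, 1)), liquid])  # constant + 1st-order terms
    true_coeff = rng.normal(0.0, 0.5, size=(n_oxides + 1, n_oxides))
    solid = design @ true_coeff + rng.normal(0.0, 0.01, size=(n_samples, n_oxides))

    # Least-squares fit of solid composition as an expansion of the liquid's.
    coeff, *_ = np.linalg.lstsq(design, solid, rcond=None)
    print("max abs prediction error:", np.abs(design @ coeff - solid).max())

The reversibility noted in the abstract amounts to swapping the roles of `liquid` and `solid` in the regression.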
Whole-genome CNV analysis: advances in computational approaches
Pirooznia, Mehdi; Goes, Fernando S.; Zandi, Peter P.
2015-01-01
Accumulating evidence indicates that DNA copy number variation (CNV) is likely to make a significant contribution to human diversity and also play an important role in disease susceptibility. Recent advances in genome sequencing technologies have enabled the characterization of a variety of genomic features, including CNVs. This has led to the development of several bioinformatics approaches to detect CNVs from next-generation sequencing data. Here, we review recent advances in CNV detection from whole genome sequencing. We discuss the informatics approaches and current computational tools that have been developed as well as their strengths and limitations. This review will assist researchers and analysts in choosing the most suitable tools for CNV analysis as well as provide suggestions for new directions in future development. PMID:25918519
P37: Locally advanced thymoma-robotic approach
Asaf, Belal B.; Kumar, Arvind
2015-01-01
Background The conventional approach to locally advanced thymoma has been via a sternotomy. VATS and robotic thymectomies have been described but are typically reserved for patients with myasthenia gravis only or for small, encapsulated thymic tumors. There have been few reports of minimally invasive resection of locally advanced thymomas. Our objective is to present a case in which a large, locally advanced thymoma was resected en bloc with the pericardium employing a robotic-assisted thoracoscopic approach. Methods We describe the case of an asymptomatic 29-year-old female found to have an 11 cm anterior mediastinal mass on CT scan. A right-sided, 4-port robotic approach was utilized, with the camera port in the 5th intercostal space anterior axillary line and two accessory ports for robotic arms 1 and 2 in the 3rd intercostal space anterior axillary line and 8th intercostal space anterior axillary line. A 5 mm port was used between the camera and 2nd robotic arm for assistance. On exploration the mass was found to be adherent to the pericardium and was resected en bloc via anterior pericardiectomy. Her post-operative course was uncomplicated, and she was discharged home on postoperative day 1. Results Final pathology revealed an 11 cm × 7.5 cm × 3.0 cm WHO class B2 thymoma invading the pericardium, TNM stage T3N0M0, with negative margins. The patient was subsequently sent to receive 5,040 cGy of adjuvant radiation, and a follow-up CT scan 6 months postoperatively showed no evidence of disease. Conclusions Very little data exist demonstrating the efficacy of resecting locally advanced thymomas via a minimally invasive approach. Our case demonstrates that a robotic-assisted thoracoscopic approach is feasible for performing thymectomy for locally advanced thymomas. This may help limit the morbidity of a trans-sternal approach while achieving comparable oncologic results. However, further studies are needed to evaluate its efficacy and long-term outcomes.
Advanced statistical process control: controlling sub-0.18-μm lithography and other processes
NASA Astrophysics Data System (ADS)
Zeidler, Amit; Veenstra, Klaas-Jelle; Zavecz, Terrence E.
2001-08-01
access of the analysis to include the external variables involved in CMP, deposition, etc. We then applied yield analysis methods to identify the significant lithography-external process variables from the history of lots, subsequently adding the identified process variables to the signatures database and to the PPC calculations. With these improvements, the authors anticipate a 50% improvement of the process window. This improvement results in a significant reduction of rework and improved yield, depending on process demands and equipment configuration. A statistical theory that explains the PPC is then presented. This theory can be used to simulate a general PPC application. In conclusion, the PPC concept is not limited to lithography or to semiconductors; in fact it is applicable to any production process that is signature-biased (chemical industry, car industry, etc.). Requirements for the PPC are large-scale data collection, a controllable process that is not too expensive to tune for every lot, and the ability to employ feedback calculations. PPC is a major change in the process management approach and will therefore first be employed where the need is high and the return on investment is very fast. The best industry to start with is semiconductors, and the most likely process area to start with is lithography.
Papa, Lesther A.; Litson, Kaylee; Lockhart, Ginger; Chassin, Laurie; Geiser, Christian
2015-01-01
Testing mediation models is critical for identifying potential variables that need to be targeted to effectively change one or more outcome variables. In addition, it is now common practice for clinicians to use multiple informant (MI) data in studies of statistical mediation. By coupling the use of MI data with statistical mediation analysis, clinical researchers can combine the benefits of both techniques. Integrating the information from MIs into a statistical mediation model creates various methodological and practical challenges. The authors review prior methodological approaches to MI mediation analysis in clinical research and propose a new latent variable approach that overcomes some limitations of prior approaches. An application of the new approach to mother, father, and child reports of impulsivity, frustration tolerance, and externalizing problems (N = 454) is presented. The results showed that frustration tolerance mediated the relationship between impulsivity and externalizing problems. The new approach allows for a more comprehensive and effective use of MI data when testing mediation models. PMID:26617536
Statistical methods and neural network approaches for classification of data from multiple sources
NASA Technical Reports Server (NTRS)
Benediktsson, Jon Atli; Swain, Philip H.
1990-01-01
Statistical methods for classification of data from multiple data sources are investigated and compared to neural network models. A general problem with using conventional multivariate statistical approaches to classify data of multiple types is that a multivariate distribution cannot be assumed for the classes in the data sources. Another common problem with statistical classification methods is that the data sources are not equally reliable. This means that the data sources need to be weighted according to their reliability, but most statistical classification methods have no mechanism for this. This research focuses on statistical methods which can overcome these problems: a method of statistical multisource analysis and consensus theory. Reliability measures for weighting the data sources in these methods are suggested and investigated. Secondly, this research focuses on neural network models. The neural networks are distribution-free, since no prior knowledge of the statistical distribution of the data is needed. This is an obvious advantage over most statistical classification methods. The neural networks also automatically take care of the problem of how much weight each data source should have. On the other hand, their training process is iterative and can take a very long time. Methods to speed up the training procedure are introduced and investigated. Experimental results of classification using both neural network models and statistical methods are given, and the approaches are compared based on these results.
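The consensus-theoretic combination alluded to above can be sketched with the two standard pooling rules; the class probabilities and reliability weights below are illustrative only.

    import numpy as np

    # Per-source class probabilities for one pixel (rows: sources, cols: classes).
    p = np.array([[0.70, 0.20, 0.10],   # e.g. multispectral source
                  [0.40, 0.40, 0.20],   # e.g. radar source
                  [0.50, 0.30, 0.20]])  # e.g. topographic source
    w = np.array([1.0, 0.5, 0.8])       # reliability weights per source

    # Linear opinion pool: reliability-weighted average of the probabilities.
    linear = w @ p / w.sum()

    # Logarithmic opinion pool: weighted geometric mean, renormalized.
    log_pool = np.exp((w[:, None] * np.log(p)).sum(axis=0) / w.sum())
    log_pool /= log_pool.sum()

    print("linear pool:", linear.round(3), "-> class", linear.argmax())
    print("log pool:   ", log_pool.round(3), "-> class", log_pool.argmax())

The weights are exactly the mechanism for down-weighting a less reliable source that the abstract says most statistical classifiers lack.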
Dual-band, infrared buried mine detection using a statistical pattern recognition approach
Buhl, M.R.; Hernandez, J.E.; Clark, G.A.; Sengupta, S.K.
1993-08-01
The main objective of this work was to detect surrogate land mines, which were buried in clay and sand, using dual-band, infrared images. A statistical pattern recognition approach was used to achieve this objective. This approach is discussed and results of applying it to real images are given.
Advanced Stirling Convertor Dynamic Test Approach and Results
NASA Technical Reports Server (NTRS)
Meer, David W.; Hill, Dennis; Ursic, Joseph J.
2010-01-01
The U.S. Department of Energy (DOE), Lockheed Martin Corporation (LM), and NASA Glenn Research Center (GRC) have been developing the Advanced Stirling Radioisotope Generator (ASRG) for use as a power system for space science missions. As part of the extended operation testing of this power system, the Advanced Stirling Convertors (ASC) at NASA GRC undergo a vibration test sequence intended to simulate the vibration history that an ASC would experience when used in an ASRG for a space mission. This sequence includes testing at workmanship and flight acceptance levels interspersed with periods of extended operation to simulate pre-fueling and post-fueling. The final step in the test sequence utilizes additional testing at flight acceptance levels to simulate launch. To better replicate the acceleration profile seen by an ASC incorporated into an ASRG, the input spectra used in testing the convertors were modified based on dynamic testing of the ASRG Engineering Unit (ASRG EU) at LM. This paper outlines the overall test approach, summarizes the test results from the ASRG EU, describes the incorporation of those results into the test approach, and presents the results of applying the test approach to the ASC-1 #3 and #4 convertors. The test results include data from several accelerometers mounted on the convertors as well as the piston position and output power variables.
NASA Astrophysics Data System (ADS)
Stefani, Jerry A.; Poarch, Scott; Saxena, Sharad; Mozumder, P. K.
1994-09-01
An advanced multivariable off-line process control system, which combines traditional Statistical Process Control (SPC) with feedback control, has been applied to the CVD tungsten process on an Applied Materials Centura reactor. The goal of the model-based controller is to compensate for shifts in the process and maintain the wafer-state responses on target. In the present application the controller employs measurements made on test wafers by off-line metrology tools to track the process behavior. This is accomplished by using model-based SPC, which compares the measurements with predictions obtained from empirically derived process models. For CVD tungsten, a physically based modeling approach was employed, based on the kinetically limited H2 reduction of WF6. On detecting a statistically significant shift in the process, the controller calculates adjustments to the settings to bring the process responses back on target. To achieve this, a few additional test wafers are processed at slightly different settings than the nominal ones. This local experiment allows the models to be updated to reflect the current process performance. The model updates are expressed as multiplicative or additive changes in the process inputs and a change in the model constant. This approach to model updating not only tracks the present process/equipment state but also provides some diagnostic capability regarding the cause of the process shift. The updated models are used by an optimizer to compute new settings to bring the responses back to target. The optimizer is capable of incrementally entering controllables into the strategy, reflecting the degree to which the engineer desires to manipulate each setting. The capability of the controller to compensate for shifts in the CVD tungsten process has been demonstrated. Targets for film bulk resistivity and deposition rate were maintained while satisfying constraints on film stress and WF6 conversion efficiency.
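A toy sketch of that feedback loop: an Arrhenius-type deposition-rate model stands in for the kinetically limited H2/WF6 kinetics, a shift is detected against the target, the model constant is rescaled to match the test-wafer data, and the temperature setting is re-solved. All constants and measurements are hypothetical.

    import numpy as np

    # Arrhenius-type stand-in for the kinetically limited deposition rate.
    R, EA, K0 = 8.314, 67000.0, 2.0e9  # gas constant, activation energy, prefactor

    def dep_rate(temp_k, k0):
        return k0 * np.exp(-EA / (R * temp_k))

    target = dep_rate(700.0, K0)  # nominal rate at the 700 K setpoint
    k0_est = K0                   # current model constant

    # Test-wafer measurements come in low after, say, chamber maintenance.
    measured = np.array([0.97, 0.95, 0.94, 0.93]) * target
    if abs(measured.mean() - target) > 2.0 * measured.std(ddof=1):
        # Model update: rescale the constant to match the observed rates.
        k0_est *= measured.mean() / dep_rate(700.0, k0_est)
        # Re-optimize: solve dep_rate(T, k0_est) = target for the new setpoint.
        temp_new = -EA / (R * np.log(target / k0_est))
        print(f"shift detected; new temperature setpoint: {temp_new:.1f} K")

Rescaling the prefactor is the multiplicative model update described above; the full controller also handles additive input shifts and multi-response constrained optimization.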
Unal, Cetin; Pasamehmetoglu, Kemal; Carmack, Jon
2010-01-01
Advancing the performance of Light Water Reactors, Advanced Nuclear Fuel Cycles, and Advanced Reactors, such as the Next Generation Nuclear Power Plants, requires enhancing our fundamental understanding of fuel and materials behavior under irradiation. The capability to accurately model nuclear fuel systems is critical. In order to understand specific aspects of nuclear fuel, fully coupled fuel simulation codes are required to achieve licensing of specific nuclear fuel designs for operation. The backbone of these codes, models, and simulations is a fundamental understanding of, and predictive capability for simulating, the phase and microstructural behavior of the nuclear fuel system materials and matrices. The purpose of this paper is to identify the modeling and simulation approach needed to deliver predictive tools for advanced fuels development. Coordination between experimental nuclear fuel design and development technical experts and computational fuel modeling and simulation technical experts is a critical aspect of the approach; it naturally leads to an integrated, goal-oriented, science-based R&D approach and strengthens both the experimental and computational efforts. The Advanced Fuels Campaign (AFC) and the Nuclear Energy Advanced Modeling and Simulation (NEAMS) Fuels Integrated Performance and Safety Code (IPSC) are working together to determine experimental data and modeling needs. The primary objective of the NEAMS Fuels IPSC project is to deliver a coupled, three-dimensional, predictive computational platform for modeling the fabrication and both normal and abnormal operation of nuclear fuel pins and assemblies, applicable to both existing and future reactor fuel designs. The science-based program is pursuing the development of an integrated multi-scale and multi-physics modeling and simulation platform for nuclear fuels. This overview paper discusses the vision, goals, and approaches for developing and implementing this new approach.
NASA Astrophysics Data System (ADS)
Andronov, I. L.; Chinarova, L. L.; Kudashkina, L. S.; Marsakova, V. I.; Tkachenko, M. G.
2016-06-01
We have elaborated a set of new algorithms and programs for advanced time series analysis of (generally) multi-component multi-channel observations with irregularly spaced times of observation, which is a common case for large photometric surveys. Our earlier methods for periodogram, scalegram, wavelet, and autocorrelation analysis, as well as for "running" or "sub-interval" local approximations, were self-reviewed in (2003ASPC..292..391A). For approximating the phase light curves of nearly-periodic pulsating stars, we use a Trigonometric Polynomial (TP) fit of the statistically optimal degree, with initial period improvement using differential corrections (1994OAP.....7...49A). For the determination of the parameters of "characteristic points" (minima, maxima, crossings of some constant value, etc.) we use a set of methods self-reviewed in 2005ASPC..335...37A. Results of the analysis of the catalogs compiled using these programs are presented in 2014AASP....4....3A. For more complicated signals, we use "phenomenological approximations" with "special shapes" based on functions defined on sub-intervals rather than on the complete interval. E.g., for the Algol-type stars we developed the NAV ("New Algol Variable") algorithm (2012Ap.....55..536A, 2012arXiv1212.6707A, 2015JASS...32..127A), which was compared to common methods of Trigonometric Polynomial (TP) fit or local Algebraic Polynomial (A) fit of a fixed or (alternately) statistically optimal degree. The method allows one to determine the minimal set of parameters required for the "General Catalogue of Variable Stars", as well as an extended set of phenomenological and astrophysical parameters which may be used for classification. In total, more than 1900 variable stars were studied in our group using these methods in the frame of the "Inter-Longitude Astronomy" campaign (2010OAP....23....8A) and the "Ukrainian Virtual Observatory" project (2012KPCB...28...85V).
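A minimal version of the trigonometric polynomial fit for irregularly spaced data (the period here is assumed known; the cited differential-corrections step that refines it is omitted, and the star's light curve is simulated):

    import numpy as np

    def trig_poly_fit(t, mag, period, degree):
        """Least-squares trigonometric polynomial:
        m(t) = a0 + sum_k [ ak*cos(2*pi*k*t/P) + bk*sin(2*pi*k*t/P) ]."""
        omega = 2.0 * np.pi / period
        cols = [np.ones_like(t)]
        for k in range(1, degree + 1):
            cols += [np.cos(k * omega * t), np.sin(k * omega * t)]
        design = np.column_stack(cols)
        coef, *_ = np.linalg.lstsq(design, mag, rcond=None)
        return coef, design @ coef

    # Irregularly spaced observations of a hypothetical pulsating star.
    rng = np.random.default_rng(3)
    t = np.sort(rng.uniform(0.0, 50.0, 120))
    mag = 10.0 + 0.4 * np.sin(2.0 * np.pi * t / 5.37) + rng.normal(0.0, 0.02, 120)
    coef, fit = trig_poly_fit(t, mag, period=5.37, degree=3)
    print("residual rms:", np.std(mag - fit))

In practice the degree would be chosen by a statistical criterion (e.g. an F-test on the highest harmonic), which is what "statistically optimal degree" refers to.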
ERIC Educational Resources Information Center
Potter, James Thomson, III
2012-01-01
Research into teaching practices and strategies has been performed separately in AP Statistics and in K-12 online learning (Garfield, 2002; Ferdig, DiPietro, Black & Dawson, 2009). This study seeks to combine the two and to build on the need for more investigation into online teaching and learning in specific content (Ferdig et al, 2009; DiPietro,…
"I am Not a Statistic": Identities of African American Males in Advanced Science Courses
NASA Astrophysics Data System (ADS)
Johnson, Diane Wynn
The United States Bureau of Labor Statistics (2010) expects new industries to generate approximately 2.7 million jobs in science and technology by the year 2018, and there is concern as to whether there will be enough trained individuals to fill these positions. A tremendous resource remains untapped: African American students, especially African American males (National Science Foundation, 2009). Historically, African American males have been omitted from the so-called science pipeline. Fewer African American males pursue a science discipline due, in part, to limiting factors they experience in school and at home (Ogbu, 2004). This is a case study of African American males who are enrolled in advanced science courses at a predominantly African American (84%) urban high school. Guided by expectancy-value theory (EVT) of achievement-related results (Eccles, 2009; Eccles et al., 1983), twelve African American male students in two advanced science courses were observed in their science classrooms weekly, participated in an in-depth interview, developed a presentation to share with students enrolled in a tenth grade science course, responded to an open-ended identity questionnaire, and were surveyed about their perceptions of school. Additionally, the students' teachers were interviewed, as were seven of the students' parents. The interview data analyses highlighted the important role of supportive parents (key socializers) who had high expectations for their sons and who pushed them academically. The students clearly attributed their enrollment in advanced science courses to their high regard for their science teachers, which included positive relationships, hands-on learning in class, and an inviting and encouraging learning environment. Additionally, other family members and coaches played important roles in these young men's lives. Students' PowerPoint© presentations to younger high school students on why they should take advanced science courses highlighted these
Classification of human colonic tissues using FTIR spectra and advanced statistical techniques
NASA Astrophysics Data System (ADS)
Zwielly, A.; Argov, S.; Salman, A.; Bogomolny, E.; Mordechai, S.
2010-04-01
One of the major public health hazards is colon cancer. There is a great necessity to develop new methods for the early detection of cancer. If colon cancer is detected and treated early, a cure rate of more than 90% can be achieved. In this study we used FTIR microscopy (MSP), which has shown good potential over the last 20 years in the fields of medical diagnostics and early detection of abnormal tissues. A large database of FTIR microscopic spectra was acquired from 230 human colonic biopsies. Five different subgroups were included in our database: normal and cancer tissues as well as three stages of benign colonic polyps, namely mild, moderate and severe polyps, which are precursors of carcinoma. In this study we applied advanced mathematical and statistical techniques, including principal component analysis (PCA) and linear discriminant analysis (LDA), to the human colonic FTIR spectra in order to differentiate among the tissues of the mentioned subgroups. Good classification accuracy between the normal, polyp and cancer groups was achieved, with a success rate of approximately 85%. Our results show great potential for developing FTIR microspectroscopy as a simple, reagent-free, viable tool for the early detection of colon cancer, in particular the early stages of premalignancy among benign colonic polyps.
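A compact scikit-learn sketch of the PCA-plus-LDA pipeline on simulated stand-ins for the five spectral subgroups; the spectra, class structure, and dimensionalities are invented, and only the sample count (230) follows the abstract.

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.pipeline import make_pipeline
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    n_spectra, n_wavenumbers, n_classes = 230, 900, 5
    labels = rng.integers(0, n_classes, n_spectra)
    signatures = rng.normal(0.0, 1.0, (n_classes, n_wavenumbers))
    spectra = signatures[labels] + rng.normal(0.0, 2.0, (n_spectra, n_wavenumbers))

    # PCA compresses the spectra; LDA then separates the diagnostic groups.
    model = make_pipeline(PCA(n_components=20), LinearDiscriminantAnalysis())
    print("CV accuracy:", cross_val_score(model, spectra, labels, cv=5).mean())

Running PCA before LDA is the usual remedy when the number of spectral channels far exceeds the number of biopsies, as it does here.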
A Novel Statistical Approach for Brain MR Images Segmentation Based on Relaxation Times
Ferraioli, Giampaolo; Pascazio, Vito
2015-01-01
Brain tissue segmentation in Magnetic Resonance Imaging is useful for a wide range of applications. Classical approaches exploit the gray-level image and implement criteria for differentiating regions. Within this paper a novel approach for joint brain tissue segmentation and classification is presented. Starting from the estimation of proton density and relaxation times, we propose a novel method for identifying the optimal decision regions. The approach exploits the statistical distribution of the involved signals in the complex domain. The technique, compared to classical threshold-based ones, is able to globally improve the classification rate. The effectiveness of the approach is evaluated on both simulated and real datasets. PMID:26798631
The Advanced Statistical Trajectory Regional Air Pollution (ASTRAP) model simulates long-term transport and deposition of oxides of sulfur and nitrogen. It is a potential screening tool for assessing long-term effects on regional visibility from sulfur emission sources. However, a rigorous…
ERIC Educational Resources Information Center
McCarthy, Christopher J.; Lambert, Richard G.; Crowe, Elizabeth W.; McCarthy, Colleen J.
2010-01-01
This study examined the relationship of teachers' perceptions of coping resources and demands to job satisfaction factors. Participants were 158 Advanced Placement Statistics high school teachers who completed measures of personal resources for stress prevention, classroom demands and resources, job satisfaction, and intention to leave the field…
ERIC Educational Resources Information Center
Averitt, Sallie D.
This instructor guide, which was developed for use in a manufacturing firm's advanced technical preparation program, contains the materials required to present a learning module that is designed to prepare trainees for the program's statistical process control module by improving their basic math skills in working with line graphs and teaching…
"I am Not a Statistic": Identities of African American Males in Advanced Science Courses
NASA Astrophysics Data System (ADS)
Johnson, Diane Wynn
The United States Bureau of Labor Statistics (2010) expects new industries to generate approximately 2.7 million jobs in science and technology by the year 2018, and there is concern as to whether there will be enough trained individuals to fill these positions. A tremendous resource remains untapped: African American students, especially African American males (National Science Foundation, 2009). Historically, African American males have been omitted from the so-called science pipeline. Fewer African American males pursue a science discipline due, in part, to limiting factors they experience in school and at home (Ogbu, 2004). This is a case study of African American males who are enrolled in advanced science courses at a predominantly African American (84%) urban high school. Guided by expectancy-value theory (EVT) of achievement-related choices (Eccles, 2009; Eccles et al., 1983), twelve African American male students in two advanced science courses were observed in their science classrooms weekly, participated in an in-depth interview, developed a presentation to share with students enrolled in a tenth grade science course, responded to an open-ended identity questionnaire, and were surveyed about their perceptions of school. Additionally, the students' teachers were interviewed, as were seven of the students' parents. The interview data analyses highlighted the important role of supportive parents (key socializers) who had high expectations for their sons and who pushed them academically. The students clearly attributed their enrollment in advanced science courses to their high regard for their science teachers, which included positive relationships, hands-on learning in class, and an inviting and encouraging learning environment. Additionally, other family members and coaches played important roles in these young men's lives. Students' PowerPoint(c) presentations to younger high school students on why they should take advanced science courses highlighted these
How large is the gluon polarization in the statistical parton distributions approach?
Soffer, Jacques; Bourrely, Claude; Buccella, Franco
2015-04-10
We review the theoretical foundations of the quantum statistical approach to parton distributions and we show that by using some recent experimental results from Deep Inelastic Scattering, we are able to improve the description of the data by means of a new determination of the parton distributions. We will see that a large gluon polarization emerges, giving a significant contribution to the proton spin.
A Statistical Filtering Approach for Gravity Recovery and Climate Experiment (GRACE) Gravity Data
NASA Technical Reports Server (NTRS)
Davis, J. L.; Tamisiea, M. E.; Elosegui, P.; Mitrovica, J. X.; Hill, E. M.
2008-01-01
We describe and analyze a statistical filtering approach for GRACE data that uses a parametrized model for the temporal evolution of the GRACE coefficients. After least-squares adjustment, a statistical test is performed to assess the significance of the estimated parameters. If the test is passed, the parameters are used by the filter in the reconstruction of the field; otherwise they are rejected. The test is performed, and the filter is formed, separately for annual components of the model and the trend. This new approach is distinct from Gaussian smoothing since it uses the data themselves to test for specific components of the time-varying gravity field. The statistical filter appears inherently to remove most of the "stripes" present in the GRACE fields, although destriping the fields prior to filtering seems to help the trend recovery. We demonstrate that the statistical filter produces reasonable maps for the annual components and trend. We furthermore assess the statistical filter for the annual components using ground-based GPS data in South America by assuming that the annual component of the gravity signal is associated only with groundwater storage. The un-destriped, statistically filtered field has a χ2 value relative to the GPS data consistent with the best result from smoothing. In the space domain, the statistical filters are qualitatively similar to Gaussian smoothing. Unlike Gaussian smoothing, however, the statistical filter has significant sidelobes, including large negative sidelobes on the north-south axis, potentially revealing information on the errors, and the correlations among the errors, for the GRACE coefficients.
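To make the filtering step concrete, here is a minimal numpy/scipy sketch of the idea as summarized above: each coefficient's time series is fit with an offset, trend, and annual sinusoid, and a component is retained in the reconstruction only if its t-test passes. The function name, design matrix, and 5% threshold are illustrative assumptions, not the authors' exact implementation.

```python
# Hedged sketch: fit offset + trend + annual sinusoid, keep significant terms.
import numpy as np
from scipy import stats

def filter_coefficient(t_yr, y, alpha=0.05):
    # Design matrix: offset, trend, annual cosine and sine (t in years).
    X = np.column_stack([np.ones_like(t_yr), t_yr,
                         np.cos(2 * np.pi * t_yr), np.sin(2 * np.pi * t_yr)])
    beta, res, *_ = np.linalg.lstsq(X, y, rcond=None)
    dof = len(y) - X.shape[1]
    sigma2 = res[0] / dof                     # residual variance
    cov = sigma2 * np.linalg.inv(X.T @ X)     # parameter covariance
    tstat = beta / np.sqrt(np.diag(cov))      # per-parameter t statistics
    keep = stats.t.sf(np.abs(tstat), dof) * 2 < alpha
    return beta * keep                        # zero out insignificant components

t = np.arange(0, 8, 1 / 12)                   # monthly samples over 8 years
y = 0.5 * t + 2 * np.sin(2 * np.pi * t) + np.random.default_rng(0).normal(0, 0.3, t.size)
print(filter_coefficient(t, y))               # trend and annual terms survive
```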
Simulating advanced life support systems to test integrated control approaches
NASA Astrophysics Data System (ADS)
Kortenkamp, D.; Bell, S.
Simulations allow for testing of life support control approaches before hardware is designed and built. Simulations also allow for the safe exploration of alternative control strategies during life support operation. As such, they are an important component of any life support research program and testbed. This paper describes a specific advanced life support simulation being created at NASA Johnson Space Center. It is a discrete-event simulation that is dynamic and stochastic. It simulates all major components of an advanced life support system, including crew (with variable ages, weights and genders), biomass production (with scalable plantings of ten different crops), water recovery, air revitalization, food processing, solid waste recycling and energy production. Each component is modeled as a producer of certain resources and a consumer of certain resources. The control system must monitor (via sensors) and control (via actuators) the flow of resources throughout the system to provide life support functionality. The simulation is written in an object-oriented paradigm that makes it portable, extensible and reconfigurable.
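The producer/consumer bookkeeping described above can be illustrated with a toy time-stepped loop; the components, rates, and resource names below are invented stand-ins, and the real JSC simulation is discrete-event, stochastic, and far more detailed.

```python
# Toy sketch of components as producers/consumers of shared resource stores.
import random

stores = {"water": 100.0, "o2": 50.0, "co2": 5.0, "biomass": 0.0}

def crew(s):            # consumes water and O2, produces CO2
    s["water"] -= 3.5; s["o2"] -= 0.8; s["co2"] += 1.0

def plants(s):          # consume CO2 and water, produce O2 and biomass
    uptake = min(s["co2"], 0.9 * random.uniform(0.8, 1.2))  # stochastic growth
    s["co2"] -= uptake; s["o2"] += uptake
    s["water"] -= 1.0; s["biomass"] += 0.4 * uptake

def water_recovery(s):  # reclaims a fixed fraction of used water
    s["water"] += 3.0

for day in range(30):   # simple daily time step; the real model is event-driven
    for component in (crew, plants, water_recovery):
        component(stores)
    assert all(v >= 0 for v in stores.values()), f"resource exhausted on day {day}"
```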
Reliability Demonstration Approach for Advanced Stirling Radioisotope Generator
NASA Technical Reports Server (NTRS)
Ha, CHuong; Zampino, Edward; Penswick, Barry; Spronz, Michael
2010-01-01
Developed for future space missions as a high-efficiency power system, the Advanced Stirling Radioisotope Generator (ASRG) has a design life requirement of 14 yr in space following a potential storage of 3 yr after fueling. In general, the demonstration of long-life dynamic systems remains difficult, in part due to the perception that the wearout of moving parts cannot be minimized and that the associated failures are unpredictable. This paper shows that a combination of systematic analytical methods, extensive experience gained from technology development, and well-planned tests can be used to ensure a high level of reliability for the ASRG. With this approach, all potential risks from each life phase of the system are evaluated and their mitigation adequately addressed. This paper also provides a summary of important test results obtained to date for the ASRG and the planned effort for system-level extended operation.
A Novel Approach to Material Development for Advanced Reactor Systems
Was, G.S.; Atzmon, M.; Wang, L.
1999-12-22
OAK B188 A Novel Approach to Material Development for Advanced Reactor Systems. Year one of this project had three major goals. First, to specify, order and install a new high current ion source for more rapid and stable proton irradiation. Second, to assess the use of low temperature irradiation and chromium pre-enrichment in an effort to isolate a radiation damage microstructure in stainless steels without the effects of RIS. Third, to prepare for the irradiation of reactor pressure vessel steel and Zircaloy. In year 1 quarter 1, the project goal was to order the high current ion source and to procure and prepare samples of stainless steel for low temperature proton irradiation.
A Novel Approach to Material Development for Advanced Reactor Systems
Was, G.S.; Atzmon, M.; Wang, L.
2000-06-27
OAK B188 A Novel Approach to Material Development for Advanced Reactor Systems. Year one of this project had three major goals. First, to specify, order and install a new high current ion source for more rapid and stable proton irradiation. Second, to assess the use of low temperature irradiation and chromium pre-enrichment in an effort to isolate a radiation damage microstructure in stainless steel without the effects of RIS. Third, to initiate irradiation of reactor pressure vessel steel and Zircaloy. In year 1 quarter 3, the project goal was to complete irradiation of model alloys of RPV steels for a range of doses and begin sample characterization. We also planned to prepare samples for microstructure isolation in stainless steels, and to identify sources of Zircaloy for irradiation and characterization.
NASA Astrophysics Data System (ADS)
Plotnikov, M. Yu.; Shkarupa, E. V.
2015-11-01
Presently, the direct simulation Monte Carlo (DSMC) method is widely used for solving rarefied gas dynamics problems. As applied to steady-state problems, a feature of this method is the use of dependent sample values of random variables for the calculation of the macroparameters of gas flows. A new combined approach to estimating the statistical error of the method is proposed that requires practically no additional computations and is applicable for any degree of probabilistic dependence of the sample values. Features of the proposed approach are analyzed theoretically and numerically. The approach is tested on the classical Fourier problem and the problem of supersonic flow of a rarefied gas through a permeable obstacle.
A Statistical Approach to Identifying Compact Objects in X-ray Binaries
NASA Astrophysics Data System (ADS)
Vrtilek, Saeqa D.
2013-04-01
A standard approach towards statistical inference in astronomy has been the application of Principal Components Analysis (PCA) to reduce dimensionality. However, for non-linear distributions this is not always an effective approach. A non-linear technique called "diffusion maps" (Freeman et al. 2009; Richards et al. 2009; Lee & Wasserman 2010), a robust eigenmode-based framework, allows retention of the full "connectivity" of the data points. Through this approach we define the highly non-linear geometry of X-ray binaries in a color-color-intensity diagram in an efficient and statistically sound manner, providing a broadly applicable means of distinguishing between black holes and neutron stars in Galactic X-ray binaries.
Murari, A; Gelfusa, M; Peluso, E; Gaudio, P; Mazon, D; Hawkes, N; Point, G; Alper, B; Eich, T
2014-12-01
In a Tokamak the configuration of the magnetic fields remains the key element to improve performance and to maximise the scientific exploitation of the device. On the other hand, the quality of the reconstructed fields depends crucially on the measurements available. Traditionally, in the least-squares minimisation phase of the algorithms used to obtain the magnetic field topology, all the diagnostics are given the same weights, apart from a corrective factor taking into account the error bars. This assumption unduly penalises complex diagnostics, such as polarimetry, which have a limited number of highly significant measurements. A completely new method to choose the weights to be given to the internal measurements of the magnetic fields, for improved equilibrium reconstructions, is presented in this paper. The approach is based on various statistical indicators applied to the residuals, the difference between the actual measurements and their estimates from the reconstructed equilibrium. The potential of the method is exemplified using the measurements of the Faraday rotation derived from the JET polarimeter. The results indicate quite clearly that the weights have to be determined carefully, since an inappropriate choice can have significant repercussions on the quality of the magnetic reconstruction both in the edge and in the core. These results confirm the limitations of the assumption that all the diagnostics have to be given the same weight, irrespective of the number of measurements they provide and the region of the plasma they probe. PMID:25554293
Lung volume reduction for advanced emphysema: surgical and bronchoscopic approaches.
Tidwell, Sherry L; Westfall, Elizabeth; Dransfield, Mark T
2012-01-01
Chronic obstructive pulmonary disease is the third leading cause of death in the United States, affecting more than 24 million people. Inhaled bronchodilators are the mainstay of therapy; they improve symptoms and quality of life and reduce exacerbations. These, together with smoking cessation and long-term oxygen therapy for hypoxemic patients, are the only medical treatments definitively demonstrated to reduce mortality. Surgical approaches include lung transplantation and lung volume reduction; the latter has been shown to improve exercise tolerance, quality of life, and survival in highly selected patients with advanced emphysema. However, lung volume reduction surgery is associated with a short-term risk of mortality and a more significant risk of cardiac and pulmonary perioperative complications. Interest has therefore been growing in the use of noninvasive, bronchoscopic methods to address the pathological hyperinflation that drives the dyspnea and exercise intolerance characteristic of emphysema. In this review, the mechanism by which lung volume reduction improves pulmonary function is outlined, along with the risks and benefits of the traditional surgical approach. In addition, the emerging bronchoscopic techniques for lung volume reduction are introduced and recent clinical trials examining their efficacy are summarized. PMID:22189668
NASA Astrophysics Data System (ADS)
Bourrely, Claude; Buccella, Franco; Soffer, Jacques
2011-04-01
We consider the extension of the statistical parton distributions to include their transverse momentum dependence, by using two different methods, one is based on our quantum statistical approach, the other on a relativistic covariant method. We take into account the effects of the Melosh-Wigner rotation for the polarized distributions. The results obtained can be compared with recent semi-inclusive deep inelastic scattering (DIS) data on the cross section and double longitudinal-spin asymmetries from JLab. We also give some predictions for future experiments on electron-neutron scattering.
NASA Astrophysics Data System (ADS)
Ruggles, Adam J.
2015-11-01
This paper presents improved statistical insight regarding the self-similar scalar mixing process of atmospheric hydrogen jets and the downstream region of under-expanded hydrogen jets. Quantitative planar laser Rayleigh scattering imaging is used to probe both jets. The self-similarity of statistical moments up to the sixth order (beyond the second order established in the literature) is documented in both cases. This is achieved using a novel self-similar normalization method that facilitated a degree of statistical convergence typically limited to continuous, point-based measurements. This demonstrates that image-based measurements of a limited number of samples can be used for self-similar scalar mixing studies. Both jets exhibit the same radial trends of these moments, demonstrating that advanced atmospheric self-similarity can be applied in the analysis of under-expanded jets. Self-similar histograms away from the centerline are shown to be the combination of two distributions. The first is attributed to turbulent mixing. The second, a symmetric Poisson-type distribution centered on zero mass fraction, progressively becomes the dominant and eventually sole distribution at the edge of the jet. This distribution is attributed to shot-noise-affected pure air measurements, rather than a diffusive superlayer at the jet boundary. This conclusion is reached after a rigorous measurement uncertainty analysis and inspection of pure air data collected with each hydrogen data set. A threshold based upon the measurement noise analysis is used to separate the turbulent and pure air data, and thus estimate intermittency. Beta-distributions (four parameters) are used to accurately represent the turbulent distribution moments. This combination of measured intermittency and four-parameter beta-distributions constitutes a new, simple approach to model scalar mixing. Comparisons between global moments from the data and moments calculated using the proposed model show excellent
A Statistical Approach for Testing Cross-Phenotype Effects of Rare Variants.
Broadaway, K Alaine; Cutler, David J; Duncan, Richard; Moore, Jacob L; Ware, Erin B; Jhun, Min A; Bielak, Lawrence F; Zhao, Wei; Smith, Jennifer A; Peyser, Patricia A; Kardia, Sharon L R; Ghosh, Debashis; Epstein, Michael P
2016-03-01
Increasing empirical evidence suggests that many genetic variants influence multiple distinct phenotypes. When cross-phenotype effects exist, multivariate association methods that consider pleiotropy are often more powerful than univariate methods that model each phenotype separately. Although several statistical approaches exist for testing cross-phenotype effects for common variants, there is a lack of similar tests for gene-based analysis of rare variants. In order to fill this important gap, we introduce a statistical method for cross-phenotype analysis of rare variants using a nonparametric distance-covariance approach that compares similarity in multivariate phenotypes to similarity in rare-variant genotypes across a gene. The approach can accommodate both binary and continuous phenotypes and further can adjust for covariates. Our approach yields a closed-form test whose significance can be evaluated analytically, thereby improving computational efficiency and permitting application on a genome-wide scale. We use simulated data to demonstrate that our method, which we refer to as the Gene Association with Multiple Traits (GAMuT) test, provides increased power over competing approaches. We also illustrate our approach using exome-chip data from the Genetic Epidemiology Network of Arteriopathy. PMID:26942286
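The core of the approach described above, comparing pairwise phenotype similarity to pairwise genotype similarity, can be illustrated with a generic distance-covariance statistic; the sketch below is a plain dCov² computation on synthetic data, not the authors' closed-form GAMuT test or its analytic p-value.

```python
# Illustrative sketch of a distance-covariance comparison between multivariate
# phenotypes and rare-variant genotypes; data and sizes are invented.
import numpy as np

def centered_dist(M):
    """Double-centered Euclidean distance matrix of the rows of M."""
    D = np.linalg.norm(M[:, None, :] - M[None, :, :], axis=-1)
    return D - D.mean(0) - D.mean(1)[:, None] + D.mean()

def dcov2(X, Y):
    """Squared sample distance covariance between row-matched X and Y."""
    A, B = centered_dist(X), centered_dist(Y)
    return (A * B).mean()

rng = np.random.default_rng(1)
phenotypes = rng.normal(size=(200, 3))             # 3 traits for 200 subjects
genotypes = rng.binomial(2, 0.01, size=(200, 40))  # 40 rare variants in a gene
print(dcov2(phenotypes, genotypes))
```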
MacKinnon, David P.; Pirlott, Angela G.
2016-01-01
Statistical mediation methods provide valuable information about underlying mediating psychological processes, but the ability to infer that the mediator variable causes the outcome variable is more complex than widely known. Researchers have recently emphasized how violating assumptions about confounder bias severely limits causal inference of the mediator to dependent variable relation. Our article describes and addresses these limitations by drawing on new statistical developments in causal mediation analysis. We first review the assumptions underlying causal inference and discuss three ways to examine the effects of confounder bias when assumptions are violated. We then describe four approaches to address the influence of confounding variables and enhance causal inference, including comprehensive structural equation models, instrumental variable methods, principal stratification, and inverse probability weighting. Our goal is to further the adoption of statistical methods to enhance causal inference in mediation studies. PMID:25063043
NASA Technical Reports Server (NTRS)
Benediktsson, Jon A.; Swain, Philip H.; Ersoy, Okan K.
1990-01-01
Neural network learning procedures and statistical classification methods are applied and compared empirically in the classification of multisource remote sensing and geographic data. Statistical multisource classification, by means of a method based on Bayesian classification theory, is also investigated and modified. The modifications permit control of the influence of the data sources involved in the classification process. Reliability measures are introduced to rank the quality of the data sources. The data sources are then weighted according to these rankings in the statistical multisource classification. Four data sources are used in experiments: Landsat MSS data and three forms of topographic data (elevation, slope, and aspect). Experimental results show that the two approaches have unique advantages and disadvantages in this classification application.
Pulsipher, B.A.; Kuhn, W.L.
1987-02-01
Current planning for the liquid high-level nuclear wastes existing in the US includes processing in a liquid-fed ceramic melter to incorporate them into a high-quality glass, and placement in a deep geologic repository. The nuclear waste vitrification process requires assurance of a quality product with little or no final inspection. Statistical process control (SPC) is a quantitative approach to one quality assurance aspect of vitrified nuclear waste. This method for monitoring and controlling a process in the presence of uncertainties provides a statistical basis for decisions concerning product quality improvement. Statistical process control is shown to be a feasible and beneficial tool for helping waste glass producers demonstrate that the vitrification process can be controlled sufficiently to produce an acceptable product. This quantitative aspect of quality assurance could be an effective means of establishing confidence in claims to a quality product. 2 refs., 4 figs.
A Challenging Surgical Approach to Locally Advanced Primary Urethral Carcinoma
Lucarelli, Giuseppe; Spilotros, Marco; Vavallo, Antonio; Palazzo, Silvano; Miacola, Carlos; Forte, Saverio; Matera, Matteo; Campagna, Marcello; Colamonico, Ottavio; Schiralli, Francesco; Sebastiani, Francesco; Di Cosmo, Federica; Bettocchi, Carlo; Di Lorenzo, Giuseppe; Buonerba, Carlo; Vincenti, Leonardo; Ludovico, Giuseppe; Ditonno, Pasquale; Battaglia, Michele
2016-01-01
Abstract Primary urethral carcinoma (PUC) is a rare and aggressive cancer, often underdetected and consequently unsatisfactorily treated. We report a case of advanced PUC, surgically treated with combined approaches. A 47-year-old man underwent transurethral resection of a urethral lesion with histological evidence of a poorly differentiated squamous cancer of the bulbomembranous urethra. Computed tomography (CT) and bone scans excluded metastatic spread of the disease but showed involvement of both corpora cavernosa (cT3N0M0). A radical surgical approach was advised, but the patient refused it and opted for chemotherapy. After 17 months the patient was referred to our department because of a fistula in the scrotal area. CT scan showed bilateral metastatic disease in the inguinal, external iliac, and obturator lymph nodes as well as involvement of both corpora cavernosa. Additionally, a fistula originating from the right corpus cavernosum extended to the scrotal skin. At this stage, the patient accepted the surgical treatment, which consisted of several phases. Phase I: radical extraperitoneal cystoprostatectomy with iliac-obturator lymph node dissection. Phase II: creation of a urinary diversion through a Bricker ileal conduit. Phase III: repositioning of the patient in the lithotomy position for an overturned Y skin incision, total penectomy, fistula excision, and “en bloc” removal of the surgical specimens, including the bladder, through the perineal breach. Phase IV: right inguinal lymphadenectomy. The procedure lasted 9-and-a-half hours, was complication-free, and intraoperative blood loss was 600 mL. The patient was discharged 8 days after surgery. Pathological examination documented a T4N2M0 tumor. The clinical situation was stable during the first 3 months postoperatively, but metastatic spread then occurred that was not responsive to adjuvant chemotherapy, leading to the patient's death 6 months after surgery. Patients with advanced stage tumors of
Organic and inorganic nitrogen dynamics in soil - advanced Ntrace approach
NASA Astrophysics Data System (ADS)
Andresen, Louise C.; Björsne, Anna-Karin; Bodé, Samuel; Klemedtsson, Leif; Boeckx, Pascal; Rütting, Tobias
2016-04-01
Depolymerization of soil organic nitrogen (SON) into monomers (e.g. amino acids) is currently thought to be the rate-limiting step for the terrestrial nitrogen (N) cycle. The production of free amino acids (AA) is followed by AA mineralization to ammonium, which is an important fraction of the total N mineralization. Accurate assessment of depolymerization and AA mineralization rates is important for a better understanding of the rate-limiting steps. Recent developments in 15N pool dilution techniques, based on 15N labelling of AAs, allow quantification of gross rates of SON depolymerization and AA mineralization (Wanek et al., 2010; Andresen et al., 2015) in addition to gross N mineralization. However, it is well known that the 15N pool dilution approach has limitations; in particular, gross rates of consumption processes (e.g. AA mineralization) are overestimated. This has consequences for evaluating the rate-limiting step of the N cycle, as well as for estimating the nitrogen use efficiency (NUE). Here we present a novel 15N tracing approach, which combines 15N-AA labelling with an advanced version of the 15N tracing model Ntrace (Müller et al., 2007) explicitly accounting for AA turnover in soil. This approach (1) provides a more robust quantification of gross depolymerization and AA mineralization and (2) suggests a more realistic estimate for the microbial NUE of amino acids. Advantages of the new 15N tracing approach will be discussed and further improvements will be identified. References: Andresen, L.C., Bodé, S., Tietema, A., Boeckx, P., and Rütting, T.: Amino acid and N mineralization dynamics in heathland soil after long-term warming and repetitive drought, SOIL, 1, 341-349, 2015. Müller, C., Rütting, T., Kattge, J., Laughlin, R. J., and Stevens, R. J.: Estimation of parameters in complex 15N tracing models via Monte Carlo sampling, Soil Biology & Biochemistry, 39, 715-726, 2007. Wanek, W., Mooshammer, M., Blöchl, A., Hanreich, A., and Richter
NASA Astrophysics Data System (ADS)
Demura, A. V.; Kadomtsev, M. B.; Lisitsa, V. S.; Shurygin, V. A.
2015-06-01
A universal statistical approach for the calculation of radiative and collisional processes involving multielectron ions in plasmas is developed. It is based on an atomic structure representation similar to that used for a condensed medium. The distribution of the local atomic electron density determines the set of elementary excitations with the classical plasma frequency. The statistical method is tested by calculations of the total electron impact single ionization cross-sections, ionization rates and radiative losses of various ions. In the coronal limit, the radiative losses of heavy plasma impurities with any type of multielectron ion are determined by the excitation of collective atomic oscillations due to collisions with plasma electrons. It is shown that for low plasma densities the scatter of the total radiative losses of tungsten ions within the universal statistical approach does not exceed that of similar results from current complex numerical codes over a wide range of plasma temperatures. A general expression for the radiative losses in the intermediate case between the limiting cases of coronal and Boltzmann population distributions is derived as well. The total electron impact ionization cross-sections and ionization rates for ions of various charge stages, for a wide range of elements from Ar to U, are compared with experimental data and data from conventional complex codes, showing satisfactory agreement. As the universal statistical method operates in terms of collective excitations, it implicitly includes direct and indirect ionization processes.
Time series expression analyses using RNA-seq: a statistical approach.
Oh, Sunghee; Song, Seongho; Grabowski, Gregory; Zhao, Hongyu; Noonan, James P
2013-01-01
RNA-seq is becoming the de facto standard approach for transcriptome analysis, with ever-decreasing cost. It has considerable advantages over conventional technologies (microarrays) because it allows for direct identification and quantification of transcripts. Many time series RNA-seq datasets have been collected to study the dynamic regulation of transcripts. However, statistically rigorous and computationally efficient methods are needed to explore the time-dependent changes of gene expression in biological systems. These methods should explicitly account for the dependencies of expression patterns across time points. Here, we discuss several methods that can be applied to model time-course RNA-seq data, including statistical evolutionary trajectory index (SETI), autoregressive time-lagged regression (AR(1)), and hidden Markov model (HMM) approaches. We use three real datasets and simulation studies to demonstrate the utility of these dynamic methods in temporal analysis. PMID:23586021
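Of the three model classes named above, the AR(1) time-lagged regression is the simplest to sketch: regress each gene's expression at time t on its value at t-1 and test the lag coefficient. The sketch below is a hedged illustration on synthetic counts, not the exact formulation in the chapter.

```python
# Minimal AR(1) time-lagged regression sketch on synthetic count data.
import numpy as np
from scipy import stats

def ar1_fit(y):
    """Return (slope, p-value) for y_t = a + b*y_{t-1} + noise."""
    y_lag, y_now = y[:-1], y[1:]
    b, a, r, p, se = stats.linregress(y_lag, y_now)
    return b, p

rng = np.random.default_rng(2)
counts = rng.poisson(50, size=(5, 12)).astype(float)  # 5 genes x 12 time points
logged = np.log1p(counts)                             # variance-stabilise the counts
for g, y in enumerate(logged):
    slope, pval = ar1_fit(y)
    print(f"gene {g}: AR(1) slope={slope:+.2f}, p={pval:.3f}")
```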
Tomlinson, Alan; Hair, Mario; McFadyen, Angus
2013-10-01
Dry eye is a multifactorial disease, which suggests that a broad spectrum of test measures is needed in the monitoring of its treatment and diagnosis. However, studies have typically reported improvements in individual measures with treatment. Alternative approaches involve multiple, combined outcomes being assessed by different statistical analyses. In order to assess the effect of various statistical approaches to the use of single and combined test measures in dry eye, this review reanalyzed measures from two previous studies (osmolarity, evaporation, tear turnover rate, and lipid film quality). These analyses assessed the measures as single variables within groups, pre- and post-intervention with a lubricant supplement, by creating combinations of these variables, and by validating these combinations with the combined sample of data from all groups of dry eye subjects. The effectiveness of single measures and combinations in the diagnosis of dry eye was also considered. PMID:24112230
Burn, K.W.
1995-01-01
The Direct Statistical Approach (DSA) to surface splitting and Russian Roulette (RR) is one of the current routes toward automatism in Monte Carlo and is currently applied to fixed source particle transport problems. A general volumetric particle bifurcation capability has been inserted into the Direct Statistical Approach (DSA) surface parameter and cell models. The resulting extended DSA describes the second moment and time functions in terms of phase-space surface splitting/Russian roulette parameters (surface parameter model) or phase-space cell importances (cell model) in the presence of volumetric particle bifurcations including both natural events [such as (n,xn) or gamma production from neutron collisions] and artificial events (such as DXTRAN). At the same time, other limitations in the DSA models (concerning tally scores direct from the source and tracks surviving an event at which a tally score occurs) are removed. Given the second moment and time functions, the foregoing surface or cell parameters may then be optimized.
Ni, Weiping; Yan, Weidong; Bian, Hui; Wu, Junzheng
2014-01-01
A novel fast SAR image change detection method is presented in this paper. Based on a Bayesian approach, the prior information that speckles follow the Nakagami distribution is incorporated into the difference image (DI) generation process. The new DI performs much better than the familiar log-ratio (LR) DI as well as the cumulant-based Kullback-Leibler divergence (CKLD) DI. The statistical region merging (SRM) approach is introduced into the change detection context for the first time. A new clustering procedure, with the region variance as the statistical inference variable, is presented, tailored to SAR image change detection, with only two classes in the final map: the unchanged and changed classes. The most prominent advantages of the proposed modified SRM (MSRM) method are its ability to cope with noise corruption and its quick implementation. Experimental results show that the proposed method is superior in both change detection accuracy and operational efficiency. PMID:25258740
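For context on the baseline the paper improves upon, the sketch below computes the familiar log-ratio difference image and a crude percentile threshold on synthetic speckled intensities; the Nakagami-based Bayesian DI and the modified SRM clustering themselves are not reproduced here, and all names and numbers are illustrative.

```python
# Sketch of the classical log-ratio difference image for SAR change detection.
import numpy as np

rng = np.random.default_rng(3)
img1 = rng.gamma(4.0, 25.0, size=(128, 128))   # synthetic speckled intensities
img2 = img1 * rng.gamma(4.0, 0.25, size=(128, 128))

eps = 1e-6                                     # guard against division by zero
log_ratio_di = np.abs(np.log((img1 + eps) / (img2 + eps)))
changed = log_ratio_di > np.percentile(log_ratio_di, 95)  # crude two-class map
print(changed.mean())
```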
Sound source measurement by using a passive sound insulation and a statistical approach
NASA Astrophysics Data System (ADS)
Dragonetti, Raffaele; Di Filippo, Sabato; Mercogliano, Francesco; Romano, Rosario A.
2015-10-01
This paper describes a measurement technique developed by the authors that allows acoustic measurements to be carried out inside noisy environments while reducing background noise effects. The proposed method is based on the integration of a traditional passive noise insulation system with a statistical approach. The latter is applied to signals picked up by the usual sensors (microphones and accelerometers) equipping the passive sound insulation system. The statistical approach improves, at low frequencies, upon the sound insulation provided by the passive system alone. The measurement technique has been validated by means of numerical simulations and measurements carried out inside a real noisy environment. For the case studies reported here, an average improvement of about 10 dB has been obtained in a frequency range up to about 250 Hz. Considerations on the lowest sound pressure level that can be measured by applying the proposed method, and on the measurement error related to its application, are reported as well.
A Statistical-Physics Approach to Language Acquisition and Language Change
NASA Astrophysics Data System (ADS)
Cassandro, Marzio; Collet, Pierre; Galves, Antonio; Galves, Charlotte
1999-02-01
The aim of this paper is to explain why Statistical Physics can help in understanding two related linguistic questions. The first question is how to model first language acquisition by a child. The second question is how language change proceeds in time. Our approach is based on a Gibbsian model for the interface between syntax and prosody. We also present a simulated annealing model of language acquisition, which extends the Triggering Learning Algorithm recently introduced in the linguistic literature.
Carboni, Michele; Gianneo, Andrea; Giglio, Marco
2015-07-01
This research investigates a Lamb-wave-based structural health monitoring approach using out-of-phase actuation of a pair of piezoceramic transducers at low frequency. The target is a typical quasi-isotropic carbon fibre reinforced polymer aeronautical laminate subjected to artificial delaminations, via Teflon patches, and natural ones, via suitable low-velocity drop-weight impact tests. The performance and main influencing factors of the approach are studied through a Design of Experiments statistical method, considering both Pulse Echo and Pitch Catch configurations of the PZT sensors. Results show that some factors and their interactions can effectively influence the detection of delamination-like damage. PMID:25746761
NASA Astrophysics Data System (ADS)
Peng, C.-K.; Yang, Albert C.-C.; Goldberger, Ary L.
2007-03-01
We recently proposed a novel approach to categorize information carried by symbolic sequences based on their usage of repetitive patterns. A simple quantitative index to measure the dissimilarity between two symbolic sequences can be defined. This information dissimilarity index, defined by our formula, is closely related to the Shannon entropy and rank order of the repetitive patterns in the symbolic sequences. Here we discuss the underlying statistical physics assumptions of this dissimilarity index. We use human cardiac interbeat interval time series and DNA sequences as examples to illustrate the applicability of this generic approach to real-world problems.
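As a concrete, hedged illustration of a rank-order-based dissimilarity between symbolic sequences, the sketch below counts fixed-length words in two binary strings, ranks them by frequency, and averages the rank differences; this conveys the flavor of the index described above but is not the authors' exact formula.

```python
# Toy rank-order dissimilarity between two symbolic sequences (illustrative).
from collections import Counter

def word_ranks(seq, m=3):
    """Rank each length-m word by its frequency in the sequence (0 = most common)."""
    counts = Counter(seq[i:i + m] for i in range(len(seq) - m + 1))
    ordered = sorted(counts, key=counts.get, reverse=True)
    return {w: r for r, w in enumerate(ordered)}

def dissimilarity(s1, s2, m=3):
    r1, r2 = word_ranks(s1, m), word_ranks(s2, m)
    shared = set(r1) | set(r2)
    worst = max(len(r1), len(r2))  # rank assigned to words absent from a sequence
    return sum(abs(r1.get(w, worst) - r2.get(w, worst)) for w in shared) / (worst * len(shared))

print(dissimilarity("101101110110", "000100010001"))
```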
Advancement in contemporary diagnostic and therapeutic approaches for rheumatoid arthritis.
Kumar, L Dinesh; Karthik, R; Gayathri, N; Sivasudha, T
2016-04-01
This review is intended to provide a summary of the pathogenesis, diagnosis and therapies for rheumatoid arthritis. Rheumatoid arthritis (RA) is a common form of inflammatory autoimmune disease with unknown aetiology. Bone degradation and cartilage and synovial destruction are the three major pathways of RA pathology. Sentinel cells, including dendritic cells, macrophages and mast cells, bind the autoantigens and initiate inflammation of the joints. These cells further activate the immune cells on the synovial membrane by releasing inflammatory cytokines such as interleukins 1, 6 and 17. Diagnosis of this disease is a combinational approach comprising radiological imaging and assessment of blood and serology markers. The treatment of RA still remains inadequate due to the lack of knowledge of disease development. Non-steroidal anti-inflammatory drugs, disease-modifying anti-rheumatic drugs and corticosteroids are the commercial drugs used to reduce pain and swelling and to suppress several disease factors. Arthroscopy is a useful method when degradation of joint tissues is severe. Gene therapy is a major advancement in RA: suppressor gene loci for inflammatory mediators and matrix-degrading enzymes are inserted into the affected area to reduce disease progression. To overcome the issues arising from those therapies, such as side effects and expense, phytocompounds have been investigated and certain compounds have been shown to have anti-arthritic potential. Furthermore, certain complementary and alternative therapies such as yoga, acupuncture, massage therapy and tai chi have also been shown to be of value in RA treatment. PMID:27044812
NASA Astrophysics Data System (ADS)
Tsutsumi, Morito; Seya, Hajime
2009-12-01
This study discusses the theoretical foundation of the application of spatial hedonic approaches—the hedonic approach employing spatial econometrics and/or spatial statistics—to benefits evaluation. The study highlights the limitations of the spatial econometrics approach, since it uses a spatial weight matrix that is not employed by the spatial statistics approach. Further, the study presents empirical analyses applying the Spatial Autoregressive Error Model (SAEM), which is based on the spatial econometrics approach, and the Spatial Process Model (SPM), which is based on the spatial statistics approach. SPMs are estimated under both isotropy and anisotropy and applied to different mesh sizes. The empirical analysis reveals that the estimated benefits are quite different, especially between the isotropic and anisotropic SPM and between the isotropic SPM and the SAEM; the estimated benefits are similar for the SAEM and the anisotropic SPM. The study demonstrates that the mesh size does not affect the estimated amount of benefits. Finally, the study provides a confidence interval for the estimated benefits and raises an issue with regard to benefit evaluation.
A Statistical Approach for the Concurrent Coupling of Molecular Dynamics and Finite Element Methods
NASA Technical Reports Server (NTRS)
Saether, E.; Yamakov, V.; Glaessgen, E.
2007-01-01
Molecular dynamics (MD) methods are opening new opportunities for simulating the fundamental processes of material behavior at the atomistic level. However, increasing the size of the MD domain quickly presents intractable computational demands. A robust approach to surmount this computational limitation has been to unite continuum modeling procedures such as the finite element method (FEM) with MD analyses thereby reducing the region of atomic scale refinement. The challenging problem is to seamlessly connect the two inherently different simulation techniques at their interface. In the present work, a new approach to MD-FEM coupling is developed based on a restatement of the typical boundary value problem used to define a coupled domain. The method uses statistical averaging of the atomistic MD domain to provide displacement interface boundary conditions to the surrounding continuum FEM region, which, in return, generates interface reaction forces applied as piecewise constant traction boundary conditions to the MD domain. The two systems are computationally disconnected and communicate only through a continuous update of their boundary conditions. With the use of statistical averages of the atomistic quantities to couple the two computational schemes, the developed approach is referred to as an embedded statistical coupling method (ESCM) as opposed to a direct coupling method where interface atoms and FEM nodes are individually related. The methodology is inherently applicable to three-dimensional domains, avoids discretization of the continuum model down to atomic scales, and permits arbitrary temperatures to be applied.
NASA Astrophysics Data System (ADS)
Wang, Lian-xing; Ju, Hua-lang; Chen, Zhen-ming
1995-03-01
Eighty-three patients suffering from moderate or advanced malignant tumors were treated with combined chemotherapy and photodynamic therapy (PDT) in our hospital. The short-term result of this management is very promising: the effectiveness appears to be nearly 100%, and the overall response rate is 79.5% (CR + PR). Compared with another group of 84 similar patients who were treated with PDT alone, for whom the short-term efficacy was 85.7% and the overall response rate 54.7%, the difference is statistically significant (P < 0.01). The better result of the combined approach is probably due to the action of the chemotherapeutic agent, which potentially blocks mitosis at certain phases of the cancer cell cycle, makes the cell membrane more permeable to the photochemical agent, HPD, and elicits a better cancerocidal effect.
Halpin, Peter F; Stam, Henderikus J
2006-01-01
The application of statistical testing in psychological research over the period of 1940-1960 is examined in order to address psychologists' reconciliation of the extant controversy between the Fisher and Neyman-Pearson approaches. Textbooks of psychological statistics and the psychological journal literature are reviewed to examine the presence of what Gigerenzer (1993) called a hybrid model of statistical testing. Such a model is present in the textbooks, although the mathematically incomplete character of this model precludes the appearance of a similarly hybridized approach to statistical testing in the research literature. The implications of this hybrid model for psychological research and the statistical testing controversy are discussed. PMID:17286092
A combinatorial approach to the discovery of advanced materials
NASA Astrophysics Data System (ADS)
Sun, Xiao-Dong
This thesis discusses the application of combinatorial methods to the search for advanced materials. The goal of this research is to develop a "parallel" or "fast sequential" methodology for both the synthesis and characterization of materials with novel electronic, magnetic and optical properties. Our hope is to dramatically accelerate the rate at which materials are generated and studied. We have developed two major combinatorial methodologies to this end. One involves generating thin film materials libraries using a combination of various thin film deposition and masking strategies with multi-layer thin film precursors. The second approach is to generate powder materials libraries with solution precursors delivered with a multi-nozzle inkjet system. The first step in this multistep combinatorial process involves the design and synthesis of high density libraries of diverse materials aimed at exploring a large segment of the compositional space of interest, based on our understanding of the physical and structural properties of a particular class of materials. Rapid, sensitive measurements of one or more relevant physical properties of each library member result in the identification of a family of "lead" compositions with a desired property. These compositions are then optimized by continuously varying the stoichiometries of a more focused set of precursors. Materials with the optimal composition are then synthesized in quantities sufficient for detailed characterization of their structural and physical properties. Finally, the information obtained from this process should enhance our predictive ability in subsequent experiments. Combinatorial methods have been successfully used in the synthesis and discovery of materials with novel properties. For example, a class of cobaltite-based giant magnetoresistance (GMR) ceramics was discovered; application of this method to luminescence materials has resulted in the discovery of a few highly efficient tricolor
Land cover change using an energy transition paradigm in a statistical mechanics approach
NASA Astrophysics Data System (ADS)
Zachary, Daniel S.
2013-10-01
This paper explores a statistical mechanics approach as a means to better understand specific land cover changes on a continental scale. Integrated assessment models are used to calculate the impact of anthropogenic emissions via the coupling of technoeconomic and earth/atmospheric system models, and they have often overlooked or oversimplified the evolution of land cover change. The different time scales and the uncertainties inherent in long-term projections of land cover make their coupling to integrated assessment models difficult. The mainstream approach to land cover modelling is rule-based methodology, and this necessarily implies that decision mechanisms are often removed from the physical geospatial realities; therefore a number of questions remain: How much of the predictive power of land cover change can be linked to the physical situation, as opposed to social and policy realities? Can land cover change be understood using a statistical approach that includes only economic drivers and the availability of resources? In this paper, we use an energy transition paradigm as a means to predict this change. A cost function is applied to developed land covers for urban and agricultural areas. The counting of area is addressed using specific examples of a Pólya process involving Maxwell-Boltzmann and Bose-Einstein statistics. We apply an iterative counting method and compare the simulated statistics with fractional land cover data from a multi-national database. An energy-level paradigm is used as the basis of a flow model for land cover change. The model is compared with tabulated land cover change in Europe for the period 1990-2000 and post-predicts the changes for each nation. When strong extraneous factors are absent, the model shows promise in reproducing the data and can provide a means to test hypotheses for the standard rule-based algorithms.
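The Maxwell-Boltzmann versus Bose-Einstein counting contrast invoked above can be illustrated with a toy Pólya-type urn: parcels are allocated either uniformly or preferentially, in proportion to current occupation. The category names and sizes below are invented for illustration and do not come from the paper.

```python
# Toy Pólya-urn allocation: uniform (MB-like) vs preferential (BE-like).
import random

categories = ["urban", "agriculture", "forest"]

def allocate(n_parcels, preferential, seed=0):
    rng = random.Random(seed)
    counts = {c: 1 for c in categories}  # one initial "ball" per cover type
    for _ in range(n_parcels):
        if preferential:                 # BE-like: occupied categories attract more
            total = sum(counts.values())
            pick = rng.choices(categories,
                               weights=[counts[c] / total for c in categories])[0]
        else:                            # MB-like: every category equally likely
            pick = rng.choice(categories)
        counts[pick] += 1
    return counts

print("MB-like:", allocate(10_000, preferential=False))
print("BE-like:", allocate(10_000, preferential=True))
```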
NASA Astrophysics Data System (ADS)
Dhakal, Nirajan; Jain, Shaleen; Gray, Alexander; Dandy, Michael; Stancioff, Esperanza
2015-06-01
Changes in the seasonality of extreme storms have important implications for public safety, storm water infrastructure and, in general, adaptation strategies in a changing climate. While past research on this topic offers some approaches to characterize seasonality, the methods are somewhat limited in their ability to discern the diversity of distributional types for extreme precipitation dates. Herein, we present a comprehensive approach for the assessment of temporal changes in the calendar dates of extreme precipitation within a circular statistics framework, which entails: (a) three measures to summarize circular random variables (the traditional approach), (b) four nonparametric statistical tests, and (c) a new nonparametric circular density method to provide a robust assessment of the nature of the probability distribution and its changes. Two 30 year blocks (1951-1980 and 1981-2010) of annual maximum daily precipitation from 10 stations across the state of Maine were used for our analysis. Assessment of seasonality based on the nonparametric approach indicated nonstationarity; some stations exhibited shifts in the significant mode toward the spring season in the recent period, while some other stations exhibited a multimodal seasonal pattern in both periods. The nonparametric circular density method used in this study allows for an adaptive estimation of the seasonal density. Despite the limitation of being sensitive to the smoothing parameter, this method can accurately characterize one or more modes of seasonal peaks, as well as pave the way toward assessment of changes in seasonality over time.
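The traditional circular summary measures in item (a) above are easy to sketch: map each date to an angle on the annual cycle and compute the circular mean date and the mean resultant length. The data below are synthetic stand-ins, not the Maine station records.

```python
# Circular mean date and mean resultant length for extreme-precipitation dates.
import numpy as np

def circular_summary(day_of_year, period=365.25):
    theta = 2 * np.pi * np.asarray(day_of_year) / period
    C, S = np.cos(theta).mean(), np.sin(theta).mean()
    mean_angle = np.arctan2(S, C) % (2 * np.pi)
    R = np.hypot(C, S)                 # 1 = fully concentrated, 0 = uniform
    return mean_angle * period / (2 * np.pi), R

rng = np.random.default_rng(4)
dates = rng.normal(120, 25, size=30) % 365.25  # 30 annual maxima near late April
mean_day, concentration = circular_summary(dates)
print(f"mean date = day {mean_day:.0f}, resultant length R = {concentration:.2f}")
```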
Carlsen, Michelle; Fu, Guifang; Bushman, Shaun; Corcoran, Christopher
2016-02-01
Genome-wide data with millions of single-nucleotide polymorphisms (SNPs) can be highly correlated due to linkage disequilibrium (LD). The ultrahigh dimensionality of big data brings unprecedented challenges to statistical modeling, such as noise accumulation, the curse of dimensionality, computational burden, spurious correlations, and a processing and storing bottleneck. Traditional statistical approaches lose their power because p ≫ n (n is the number of observations and p is the number of SNPs) and because of the complex correlation structure among SNPs. In this article, we propose an integrated distance correlation ridge regression (DCRR) approach to accommodate the ultrahigh dimensionality, joint polygenic effects of multiple loci, and the complex LD structures. Initially, a distance correlation (DC) screening approach is used to extensively remove noise, after which the LD structure is addressed using a ridge-penalized multiple logistic regression (LRR) model. The false discovery rate, true positive discovery rate, and computational cost were simultaneously assessed through a large number of simulations. A binary trait of Arabidopsis thaliana, the hypersensitive response to the bacterial elicitor AvrRpm1, was analyzed in 84 inbred lines (28 susceptible and 56 resistant) with 216,130 SNPs. Compared to previous SNP discovery methods implemented on the same data set, the DCRR approach successfully detected the causative SNP while dramatically reducing spurious associations and computational time. PMID:26661113
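A minimal two-stage sketch of the screen-then-shrink idea described above appears below: SNPs are ranked by distance correlation with the binary trait, and a ridge-penalized logistic regression is fit on the survivors. The toy data dimensions, the top-50 cutoff, and the planted causal SNP are all assumptions for illustration, not the paper's settings.

```python
# Hedged sketch of distance-correlation screening + ridge logistic regression.
import numpy as np
from sklearn.linear_model import LogisticRegression

def dcor(x, y):
    """Sample distance correlation between two 1-D arrays."""
    def cdm(v):
        D = np.abs(v[:, None] - v[None, :])
        return D - D.mean(0) - D.mean(1)[:, None] + D.mean()
    A, B = cdm(np.asarray(x, float)), cdm(np.asarray(y, float))
    denom = np.sqrt((A * A).mean() * (B * B).mean())
    return np.sqrt(max((A * B).mean(), 0.0) / denom) if denom > 0 else 0.0

rng = np.random.default_rng(5)
G = rng.binomial(2, 0.3, size=(84, 2000))               # 84 lines x 2000 SNPs (toy scale)
y = (G[:, 7] + rng.normal(0, 1, 84) > 1.5).astype(int)  # trait driven by SNP 7

scores = np.array([dcor(G[:, j], y) for j in range(G.shape[1])])
kept = np.argsort(scores)[-50:]                         # retain the top 50 SNPs
ridge = LogisticRegression(penalty="l2", C=1.0, max_iter=1000).fit(G[:, kept], y)
print("SNP 7 retained:", 7 in kept, "| training accuracy:", ridge.score(G[:, kept], y))
```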
Bayesian approach for counting experiment statistics applied to a neutrino point source analysis
NASA Astrophysics Data System (ADS)
Bose, D.; Brayeur, L.; Casier, M.; de Vries, K. D.; Golup, G.; van Eijndhoven, N.
2013-12-01
In this paper we present a model-independent analysis method following Bayesian statistics to analyse data from a generic counting experiment, and we apply it to the search for neutrinos from point sources. We discuss a test statistic, defined within a Bayesian framework, that will be used in the search for a signal. In case no signal is found, we derive an upper limit without introducing approximations. The Bayesian approach allows us to obtain the full probability density function for both the background and the signal rate; as such, we have direct access to any signal upper limit. The upper limit derivation compares directly with a frequentist approach and is robust in the case of low-counting observations. Furthermore, it also allows one to account for previous upper limits obtained by other analyses via the concept of prior information, without the need for the ad hoc application of trial factors. To investigate the validity of the presented Bayesian approach, we have applied this method to the public IceCube 40-string configuration data for 10 nearby blazars and we have obtained a flux upper limit which is in agreement with the upper limits determined via a frequentist approach. Furthermore, the upper limit obtained compares well with the previously published result of IceCube, using the same data set.
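The mechanics of a Bayesian upper limit for a counting experiment, as described above, can be sketched in a few lines: with a Poisson likelihood, a known background expectation, and a flat prior on the signal rate, the posterior is integrated out to the desired credibility. The observed count, background value, and grid below are illustrative assumptions, not IceCube numbers.

```python
# Bayesian credible upper limit for a Poisson counting experiment (sketch).
import numpy as np
from scipy import stats

n_obs, background = 4, 3.2        # observed events and expected background (invented)
s_grid = np.linspace(0, 30, 3001)  # signal-rate grid
ds = s_grid[1] - s_grid[0]

# Flat prior on the signal rate, so the posterior is proportional to the likelihood.
posterior = stats.poisson.pmf(n_obs, background + s_grid)
posterior /= posterior.sum() * ds  # normalise the density numerically

cdf = np.cumsum(posterior) * ds
upper90 = s_grid[np.searchsorted(cdf, 0.90)]  # 90% credible upper limit
print(f"90% upper limit on signal rate: {upper90:.2f} events")
```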
TAMIS for rectal tumors: advancements of a new approach.
Rega, Daniela; Pace, Ugo; Niglio, Antonello; Scala, Dario; Sassaroli, Cinzia; Delrio, Paolo
2016-03-01
TAMIS allows transanal excision of rectal lesions by means of a single-incision access port and traditional laparoscopic instruments. This technique represents a promising treatment for rectal neoplasms since it guarantees precise dissection and a reproducible approach. From May 2010 to September 2015, we performed excision of rectal lesions in 55 patients using a SILS port. The preoperative diagnosis was 26 tumours, 26 low- and high-grade dysplasias and 3 other benign neoplasias. 11 patients had neoadjuvant treatment. Pneumorectum was established at a pressure of 15-20 mmHg CO2 with continuous insufflation, and ordinary laparoscopic instruments were used to perform full-thickness resection of the rectal neoplasm with a conventional 5-mm 30° laparoscopic camera. The average operative time was 78 min. Postoperative recovery was uneventful in 53 cases: in one case a Hartmann procedure was necessary two days postoperatively due to an intraoperative intraperitoneal perforation; in another, a diverting colostomy was required five days postoperatively due to an intraoperative perforation of the vaginal wall. Unclear resection margins were detected in six patients: five of them subsequently underwent radical surgery; the remaining patient was unfit for radical surgery but is currently alive and well. Patients were discharged after a median of 3 days. Transanal minimally invasive surgery is an advanced transanal platform that provides a safe and effective method for low rectal tumors. The feasibility of TAMIS also for malignant lesions treated in a neoadjuvant setting could be cautiously evaluated in the future. PMID:27052544
Zhang, Jiang; Lanham, Kevin A; Heideman, Warren; Peterson, Richard E.; Li, Lingjun
2013-01-01
2,3,7,8-Tetrachlorodibenzo-p-dioxin (TCDD) is a persistent environmental pollutant and teratogen that produces cardiac toxicity in the developing zebrafish. Here we adopted a label-free quantitative proteomic approach based on the normalized spectral abundance factor (NSAF) to investigate the disturbance of the cardiac proteome induced by TCDD in the adult zebrafish heart. The protein expression level changes between heart samples from TCDD-treated and control zebrafish were systematically evaluated by a large-scale MudPIT analysis, which incorporated triplicate analyses for both control and TCDD-exposed heart proteomic samples to overcome the data-dependent variation in shotgun proteomic experiments and obtain a statistically significant protein dataset with improved quantification confidence. A total of 519 and 443 proteins were identified in hearts collected from control and TCDD-treated zebrafish, respectively, among which 106 proteins showed statistically significant expression changes. After correcting for the experimental variation between replicate analyses by statistical evaluation, 55 proteins exhibited an NSAF ratio above 2 and 43 proteins displayed an NSAF ratio smaller than 0.5, with statistical significance by t-test (p < 0.05). The proteins identified as altered by TCDD encompass a wide range of biological functions, including calcium handling, myocardium cell architecture, energy production and metabolism, mitochondrial homeostasis, and stress response. Collectively, our results indicate that TCDD exposure alters the adult zebrafish heart in a way that could result in cardiac hypertrophy and heart failure, and they suggest a potential mechanism for the diastolic dysfunction observed in TCDD-exposed embryos. PMID:23682714
Griffith, Lauren E.; van den Heuvel, Edwin; Fortier, Isabel; Sohel, Nazmul; Hofer, Scott M.; Payette, Hélène; Wolfson, Christina; Belleville, Sylvie; Kenny, Meghan; Doiron, Dany; Raina, Parminder
2015-01-01
Objectives To identify statistical methods for harmonization that could be used in the context of summary-data and individual-participant-data meta-analysis of cognitive measures. Study Design and Setting Environmental scan methods were used to conduct two reviews to identify: 1) studies that quantitatively combined data on cognition, and 2) general literature on statistical methods for data harmonization. Search results were rapidly screened to identify articles of relevance. Results All 33 meta-analyses combining cognition measures either restricted their analyses to a subset of studies using a common measure or combined standardized effect sizes across studies; none reported their harmonization steps prior to producing summary effects. In the second scan, three general classes of statistical harmonization models were identified: 1) standardization methods, 2) latent variable models, and 3) multiple imputation models; few publications compared methods. Conclusions Although it is an implicit part of conducting a meta-analysis or pooled analysis, the methods used to assess inferential equivalence of complex constructs are rarely reported or discussed. Progress in this area will be supported by guidelines for the conduct and reporting of data harmonization and integration, and by evaluating and developing statistical approaches to harmonization. PMID:25497980
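The first class identified above, standardization methods, is the simplest to illustrate. A minimal sketch with hypothetical scores from two studies using different cognitive instruments; note that z-scoring assumes the study populations are comparable, an assumption the latent-variable models in the second class are designed to relax:

```python
import numpy as np

# Hypothetical cognition scores from two studies on different scales.
study_a = np.array([24.0, 27.0, 22.0, 30.0, 26.0])   # e.g. an MMSE-like scale
study_b = np.array([88.0, 95.0, 79.0, 102.0, 91.0])  # e.g. a different test

def standardize(x):
    """Convert raw scores to study-specific z-scores before pooling."""
    return (x - x.mean()) / x.std(ddof=1)

pooled = np.concatenate([standardize(study_a), standardize(study_b)])
print(pooled.round(2))
```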
Novel Method of Interconnect Worstcase Establishment with Statistically-Based Approaches
NASA Astrophysics Data System (ADS)
Jung, Won-Young; Kim, Hyungon; Kim, Yong-Ju; Wee, Jae-Kyung
For interconnect effects due to process-induced variations to be accounted for in designs at 0.13 μm and below, it is necessary to determine and characterize realistic interconnect worstcase models with high accuracy and speed. This paper proposes new statistically-based approaches to the characterization of realistic interconnect worstcase models that take process-induced variations into account. The Effective Common Geometry (ECG) and Accumulated Maximum Probability (AMP) algorithms have been developed and implemented in a new statistical interconnect worstcase design environment. To verify this environment, 31-stage ring oscillators were fabricated and measured in a UMC 0.13 μm logic process; 15-stage ring oscillators were fabricated and measured in a 0.18 μm standard CMOS process to investigate the method's flexibility across technologies. The results show that the relative errors of the new method are less than 1.00%, twice as accurate as the conventional worstcase method. Furthermore, the new interconnect worstcase design environment improves optimization speed by 29.61-32.01% compared with conventional worstcase optimization. The new statistical interconnect worstcase design environment accurately predicts the worstcase and bestcase corners of non-normal distributions, where conventional methods perform poorly.
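The ECG and AMP algorithms themselves are not reproduced here, but the core advantage claimed, honoring a non-normal distribution instead of assuming mean ± 3σ corners, can be illustrated with a generic Monte Carlo sketch. The delay model and variation parameters below are toy assumptions, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical process variations affecting an RC interconnect delay.
width = rng.normal(1.0, 0.05, 100_000)       # normalized line width
thick = rng.lognormal(0.0, 0.08, 100_000)    # skewed (non-normal) thickness

# Toy delay model: R ~ 1/(width*thick), C ~ width plus a fringe term.
delay = (1.0 / (width * thick)) * (width + 0.3)   # arbitrary units

# Percentile corners follow the true skewed distribution, unlike mean +/- 3 sigma.
best, worst = np.percentile(delay, [0.135, 99.865])   # 3-sigma-equivalent tails
print(f"best-case {best:.3f}, worst-case {worst:.3f}, mean {delay.mean():.3f}")
```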
Robust statistical approaches to assess the degree of agreement of clinical data
NASA Astrophysics Data System (ADS)
Grilo, Luís M.; Grilo, Helena L.
2016-06-01
To analyze blood samples from patients who took vitamin B12 over a period of time, two different measurement methods were used: the established method, which involves more human intervention, and an alternative method that relies essentially on machines. Given the non-normality of the differences between the two measurement methods, the limits of agreement are also estimated using a non-parametric approach to assess the degree of agreement of the clinical data. The bootstrap resampling method is applied to obtain robust confidence intervals for the mean and median of the differences. The approaches used are easy to apply with user-friendly software, and their outputs are easy to interpret. In this case study the parametric and non-parametric approaches lead to different statistical conclusions, but the decision whether agreement is acceptable remains a clinical judgment.
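A minimal sketch of the two ingredients described: non-parametric limits of agreement from the percentiles of the paired differences, and bootstrap confidence intervals for their mean and median. The measurement values are hypothetical stand-ins for the B12 data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical paired B12 measurements: established vs. automated method.
established = rng.normal(400, 60, 40)
automated = established + rng.standard_t(3, 40) * 15 + 5   # heavy-tailed differences
diff = automated - established

# Non-parametric limits of agreement: 2.5th and 97.5th percentiles of differences.
loa = np.percentile(diff, [2.5, 97.5])

# Bootstrap confidence intervals for the mean and median difference.
boot = rng.choice(diff, size=(10_000, diff.size), replace=True)
ci_mean = np.percentile(boot.mean(axis=1), [2.5, 97.5])
ci_median = np.percentile(np.median(boot, axis=1), [2.5, 97.5])
print(f"LoA {loa.round(1)}, mean CI {ci_mean.round(1)}, median CI {ci_median.round(1)}")
```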
Multi-level approach for statistical appearance models with probabilistic correspondences
NASA Astrophysics Data System (ADS)
Krüger, Julia; Ehrhardt, Jan; Handels, Heinz
2016-03-01
Statistical shape and appearance models are often based on the accurate identification of one-to-one correspondences in a training data set. At the same time, determining these corresponding landmarks is the most challenging part of such methods. Hufnagel et al. [1] developed an alternative method using correspondence probabilities for a statistical shape model. In Krüger et al. [2, 3] we proposed the use of probabilistic correspondences for statistical appearance models by incorporating appearance information into the framework. We employ a point-based representation of image data combining position and appearance information. The model is optimized and adapted by a maximum a posteriori (MAP) approach, deriving a single global optimization criterion, with respect to model parameters and observation-dependent parameters, that directly affects the shape and appearance information of the considered structures. Because initially unknown correspondence probabilities are used and a higher number of degrees of freedom is introduced into the model, regularization of the model generation process is advantageous. For this purpose we extend the derived global criterion by a regularization term that penalizes implausible topological changes. Furthermore, we propose a multi-level approach for the optimization to increase the robustness of the model generation process.
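The general structure of such a MAP criterion, a data term plus a regularizer, can be sketched with a toy point-based shape model. This is not the authors' criterion: the mean shape, variation modes, observation, and the simple quadratic penalty below are all hypothetical stand-ins:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)

# Toy point-based shape model: mean shape plus two variation modes (hypothetical).
mean_shape = np.array([0.0, 0.0, 1.0, 0.0, 1.0, 1.0, 0.0, 1.0])  # 4 points, xy-flattened
modes = rng.normal(0, 0.3, (2, 8))                                # stand-in learned modes
observed = mean_shape + 0.8 * modes[0] + rng.normal(0, 0.05, 8)   # noisy observation

def neg_log_posterior(b, lam=1.0):
    """Generic MAP criterion: data misfit plus a regularizer on mode weights."""
    residual = observed - (mean_shape + b @ modes)
    return np.sum(residual**2) + lam * np.sum(b**2)

b_map = minimize(neg_log_posterior, x0=np.zeros(2)).x
print("MAP mode weights:", b_map.round(3))
```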
Design of Complex Systems in the presence of Large Uncertainties: a statistical approach
Koutsourelakis, P
2007-07-31
The design or optimization of engineering systems is generally based on several assumptions related to the loading conditions, physical or mechanical properties, environmental effects, initial or boundary conditions, etc. The effect of those assumptions on the optimum design or the design finally adopted is generally unknown, particularly in large, complex systems. A rational recourse is to cast the problem in a probabilistic framework, which accounts for the various uncertainties and also allows one to quantify their effect on the response/behavior/performance of the system. In such a framework the performance functions of interest are also random, and optimization of the system with respect to the design variables has to be reformulated with respect to statistical properties of these objective functions (e.g. the probability of exceeding certain thresholds). Analysis tools are usually restricted to elaborate legacy codes that have been developed over a long period of time and are generally well-tested (e.g. finite elements). These, however, do not include any stochastic components, and altering them is impossible or ill-advised. Furthermore, as the number of uncertainties and design variables grows, the problem quickly becomes computationally intractable. The present paper advocates the use of statistical learning to perform these tasks for any system of arbitrary complexity, as long as a deterministic solver is available. The proposed computational framework consists of two components. First, advanced sampling techniques are employed to efficiently explore the dependence of the performance on the uncertain and design variables. The proposed algorithm is directly parallelizable and attempts to maximize the amount of information extracted with the least possible number of calls to the deterministic solver. The output of this process is utilized by statistical classification procedures in order to derive the dependence of the performance
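The two-component idea, sample the deterministic solver, then let a classifier learn the safe/failed boundary, can be sketched in a few lines. The limit-state function, variable ranges, and choice of logistic regression below are illustrative assumptions, not the paper's framework:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)

# Stand-in for an expensive deterministic solver: performance g of a system
# with one design variable d and one uncertain parameter u; failure if g < 0.
def solver(d, u):
    return 2.0 - d * u**2          # toy limit-state function (hypothetical)

d = rng.uniform(0.5, 2.0, 500)     # samples over the design variable
u = rng.normal(1.0, 0.3, 500)      # samples over the uncertain parameter
failed = (solver(d, u) < 0).astype(int)

# A classifier trained on the sampled runs replaces further solver calls:
# it learns the boundary between safe and failed regions.
clf = LogisticRegression().fit(np.column_stack([d, u]), failed)
print("P(fail | d=1.8, u=1.2) ~", clf.predict_proba([[1.8, 1.2]])[0, 1].round(3))
```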
NASA Astrophysics Data System (ADS)
Appelhans, Tim; Mwangomo, Ephraim; Otte, Insa; Detsch, Florian; Nauss, Thomas; Hemp, Andreas; Ndyamkama, Jimmy
2015-04-01
This study introduces the set-up and characteristics of a meteorological station network on the southern slopes of Mt. Kilimanjaro, Tanzania. The set-up follows a hierarchical approach covering an elevational as well as a land-use disturbance gradient. The network consists of 52 basic stations measuring ambient air temperature and above-ground air humidity, and 11 precipitation measurement sites. We provide in-depth descriptions of the various machine learning and classical geostatistical methods used to fill observation gaps and extend the spatial coverage of the network to a total of 60 research sites. Performance statistics for these methods indicate that the presented data sets provide reliable measurements of the meteorological reality at Mt. Kilimanjaro. These data provide an excellent basis for ecological studies and are also of great value for regional atmospheric numerical modelling studies, for which such comprehensive in-situ validation observations are rare, especially in tropical regions of complex terrain.
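Machine-learning gap filling of the kind described typically predicts a station's missing values from simultaneous observations at neighbouring stations. A minimal sketch under that assumption, with synthetic temperature series standing in for the network data (the specific predictors and model choice here are illustrative, not the study's):

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(4)

# Hypothetical hourly temperatures: three neighbouring stations (predictors)
# and one target station with gaps (NaN) to be filled.
n = 1000
base = 15 + 5 * np.sin(np.linspace(0, 40, n))
neighbours = np.column_stack([base + rng.normal(0, 0.5, n) for _ in range(3)])
target = base - 2.0 + rng.normal(0, 0.4, n)            # cooler site up-slope
target[rng.choice(n, 150, replace=False)] = np.nan     # introduce gaps

mask = np.isnan(target)
model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(neighbours[~mask], target[~mask])
target[mask] = model.predict(neighbours[mask])          # fill the gaps
print(f"filled {mask.sum()} gaps")
```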
Shelton, M.L.; Gregory, B.A.; Doughty, R.L.; Kiss, T.; Moses, H.L.
1993-07-01
In aircraft engine design (and in other applications), small improvements in turbine efficiency may be significant. Since analytical tools for predicting transonic turbine losses are still being developed, experimental efforts are required to evaluate various designs, calibrate design methods, and validate CFD analysis tools. However, these experiments must be very accurate to resolve performance differences at the levels demanded by the highly competitive aircraft engine market. Due to the sensitivity of transonic and supersonic flow fields, it is often difficult to obtain the desired level of accuracy. In this paper, a statistical approach is applied to the experimental evaluation of transonic turbine airfoils in the VPI and SU transonic cascade facility in order to quantify the differences among three transonic turbine airfoils. This study determines whether the measured performance differences among the three airfoils are statistically significant. It also assesses the degree of confidence in the transonic cascade testing process at VPI and SU.
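A standard way to test whether mean performance differences among three designs exceed run-to-run scatter is a one-way analysis of variance. A minimal sketch with hypothetical loss coefficients (the abstract does not report the actual statistical test or the measured values):

```python
import numpy as np
from scipy import stats

# Hypothetical loss coefficients for three transonic airfoil designs,
# several repeated cascade runs each (values are illustrative only).
airfoil_a = np.array([0.052, 0.055, 0.051, 0.054, 0.053])
airfoil_b = np.array([0.049, 0.050, 0.048, 0.051, 0.049])
airfoil_c = np.array([0.053, 0.052, 0.054, 0.055, 0.053])

# One-way ANOVA: are the mean losses distinguishable given the scatter?
f, p = stats.f_oneway(airfoil_a, airfoil_b, airfoil_c)
print(f"F = {f:.2f}, p = {p:.4f}")   # small p -> differences exceed measurement noise
```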
Jacquin, Hugo; Shakhnovich, Eugene; Cocco, Simona; Monasson, Rémi
2016-01-01
Inverse statistical approaches to determine protein structure and function from Multiple Sequence Alignments (MSA) are emerging as powerful tools in computational biology. However, the underlying assumptions about the relationship between the inferred effective Potts Hamiltonian and real protein structure and energetics have remained untested so far. Here we use the lattice protein (LP) model to benchmark those inverse statistical approaches. We build MSAs of highly stable sequences in target LP structures and infer the effective pairwise Potts Hamiltonians from those MSAs. We find that the inferred Potts Hamiltonians reproduce many important aspects of the ‘true’ LP structures and energetics. Careful analysis reveals that the effective pairwise couplings in inferred Potts Hamiltonians depend not only on the energetics of the native structure but also on competing folds; in particular, the coupling values reflect both positive design (stabilization of the native conformation) and negative design (destabilization of competing folds). In addition to providing detailed structural information, the inferred Potts models, used as protein Hamiltonians for the design of new sequences, are able to generate with high probability completely new sequences with the desired folds, which is not possible using independent-site models. These are remarkable results, as the effective LP Hamiltonians used to generate the MSAs are not simple pairwise models, owing to the competition between the folds. Our findings elucidate the reasons for the success of inverse approaches to the modelling of proteins from sequence data, and their limitations. PMID:27177270
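A common inference scheme for such Potts models is the mean-field approximation, in which couplings are read off the negative inverse of the regularized site-covariance matrix of the one-hot-encoded MSA. A minimal sketch of that approximation on a toy alignment; the abstract does not specify which inference method the authors used, and a 4-letter alphabet stands in for the 20 amino acids:

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy MSA: 200 sequences, 10 sites, alphabet of 4 letters.
msa = rng.integers(0, 4, (200, 10))

# One-hot encode, dropping one state per site so the covariance is invertible.
q, L = 4, msa.shape[1]
X = np.zeros((msa.shape[0], L * (q - 1)))
for i in range(L):
    for a in range(q - 1):
        X[:, i * (q - 1) + a] = (msa[:, i] == a)

# Mean-field inverse Potts: couplings ~ negative inverse of the regularized
# covariance matrix; off-diagonal blocks couple pairs of sites.
C = np.cov(X, rowvar=False) + 0.1 * np.eye(X.shape[1])   # pseudocount-style ridge
J = -np.linalg.inv(C)
print("coupling block for sites (0, 1):\n", J[:q - 1, q - 1:2 * (q - 1)].round(3))
```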
A statistical approach for analyzing the development of 1H multiple-quantum coherence in solids.
Mogami, Yuuki; Noda, Yasuto; Ishikawa, Hiroto; Takegoshi, K
2013-05-21
A novel statistical approach for analyzing ¹H multiple-quantum (MQ) spin dynamics in so-called spin-counting solid-state NMR experiments is presented. The approach is based on percolation theory with Monte Carlo methods and is examined by applying it to experimental results for three solid samples whose hydrogen arrangements are distinctive in one to three dimensions: the n-alkane/d-urea inclusion complex as a one-dimensional (1D) system, whose ¹H nuclei align approximately in 1D, and magnesium hydroxide and adamantane as two-dimensional (2D) and three-dimensional (3D) systems, respectively. Four lattice models (linear, honeycomb, square, and cubic) are used to represent the ¹H arrangements of the three samples. It is shown that the MQ dynamics in adamantane is consistent with that calculated using the cubic lattice, and that in Mg(OH)2 with that calculated using the honeycomb and square lattices. For n-C20H42/d-urea, these four lattice models fail to reproduce the result; a more realistic model representing the ¹H arrangement of n-C20H42/d-urea is shown to describe it. The present approach can thus be used to determine ¹H arrangements in solids. PMID:23580152
Predicting future protection of respirator users: Statistical approaches and practical implications.
Hu, Chengcheng; Harber, Philip; Su, Jing
2016-05-01
The purpose of this article is to describe a statistical approach for predicting a respirator user's fit factor in the future based upon results from initial tests. A statistical prediction model was developed based upon the joint distribution of multiple fit factor measurements over time, obtained from linear mixed effect models. The model accounts for within-subject correlation as well as short-term (within one day) and longer-term variability. As an example of applying this approach, model parameters were estimated from a research study in which volunteers were trained by three different modalities to use one of two types of respirators. They underwent two quantitative fit tests at the initial session and two on the same day approximately six months later. The fitted models demonstrated within-subject correlation and gave the estimated distribution of future fit test results conditional on an individual worker's past results. This approach can be applied to establishing a criterion value for passing an initial fit test, to provide reasonable likelihood that a worker will be adequately protected in the future, and to optimizing the repeat fit-test interval individually for each user for cost-effective testing. PMID:26771896
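A random-intercept linear mixed model is the simplest member of the model family described: the worker-level intercept induces the within-subject correlation that makes early results informative about later ones. A minimal sketch with simulated log fit factors (the structure, sample sizes, and parameter values are hypothetical, not the study's):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(6)

# Hypothetical log fit factors: 30 workers, two tests at month 0 and month 6.
workers = np.repeat(np.arange(30), 4)
month = np.tile([0, 0, 6, 6], 30)
subject_effect = np.repeat(rng.normal(0, 0.4, 30), 4)   # between-worker variability
logff = 2.0 - 0.02 * month + subject_effect + rng.normal(0, 0.25, 120)
df = pd.DataFrame({"worker": workers, "month": month, "logff": logff})

# Random-intercept model: within-worker correlation lets initial results
# predict the distribution of a worker's future fit factors.
result = smf.mixedlm("logff ~ month", df, groups=df["worker"]).fit()
print(result.summary())
```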
Mougabure-Cueto, G; Sfara, V
2016-04-25
Dose-response relations can be obtained from systems at any structural level of biological matter, from the molecular to the organismic level. There are two types of approaches for analyzing dose-response curves: a deterministic approach, based on the law of mass action, and a statistical approach, based on an assumed probability distribution of phenotypic characters. Models based on the law of mass action have been proposed to analyze dose-response relations across the entire range of biological systems. The purpose of this paper is to discuss the principles that determine dose-response relations. Dose-response curves of simple systems are the result of chemical interactions between reacting molecules and are therefore supported by the law of mass action; consequently, the shape of these curves is fully grounded in physicochemical features. However, dose-response curves of bioassays with quantal response are not explained by the simple collision of molecules but by phenotypic variation among individuals, and can be interpreted in terms of individual tolerances. The expression of tolerance is the result of many genetic and environmental factors and can thus be considered a random variable. Consequently, the shape of the associated dose-response curve has no physicochemical basis; it originates instead from random biological variation. Because tolerance is random, there is no reason to use deterministic equations for its analysis; on the contrary, statistical models are the appropriate tools for analyzing these dose-response relations. PMID:26952004
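The classic statistical treatment of a quantal bioassay is probit analysis: the response probability is the CDF of a normal tolerance distribution on log-dose, exactly the "random tolerance" view argued for above. A minimal sketch with hypothetical counts, using the GLM interface of recent statsmodels versions:

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical quantal bioassay: insects exposed at increasing doses.
dose = np.array([1.0, 2.0, 4.0, 8.0, 16.0, 32.0])
n = np.array([50, 50, 50, 50, 50, 50])
killed = np.array([3, 9, 21, 34, 44, 49])

# Probit model: response probability is the CDF of a normal tolerance
# distribution on log-dose, a statistical rather than mass-action description.
X = sm.add_constant(np.log10(dose))
probit = sm.GLM(np.column_stack([killed, n - killed]), X,
                family=sm.families.Binomial(link=sm.families.links.Probit())).fit()
ld50 = 10 ** (-probit.params[0] / probit.params[1])   # dose where probit = 0
print(f"estimated LD50 ~ {ld50:.2f}")
```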
Statistical Analysis of fMRI Time-Series: A Critical Review of the GLM Approach.
Monti, Martin M
2011-01-01
Functional magnetic resonance imaging (fMRI) is one of the most widely used tools to study the neural underpinnings of human cognition. Standard analysis of fMRI data relies on a general linear model (GLM) approach to separate stimulus-induced signals from noise. Crucially, this approach relies on a number of assumptions about the data which, for inferences to be valid, must be met. The current paper reviews the GLM approach to the analysis of fMRI time-series, focusing in particular on the degree to which such data abide by the assumptions of the GLM framework, and on the methods that have been developed to correct for violations of those assumptions. Rather than biasing estimates of effect size, the major consequence of non-conformity to the assumptions is to introduce bias into estimates of the variance, thus affecting test statistics, power, and false positive rates. Furthermore, this bias can have pervasive effects on both individual-subject and group-level statistics, potentially yielding qualitatively different results across replications, especially after the thresholding procedures commonly used for inference-making. PMID:21442013
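The variance-bias mechanism described above can be made concrete with a toy single-voxel GLM whose noise is serially correlated, violating the i.i.d. assumption. Everything below (the block design, AR(1) parameter, betas) is a synthetic illustration, not data from the review:

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy single-voxel GLM: boxcar task regressor plus drift, with AR(1) noise,
# the kind of serial correlation that violates the i.i.d. GLM assumption.
n = 200
task = (np.arange(n) // 20) % 2                      # hypothetical block design
drift = np.linspace(0, 1, n)
X = np.column_stack([np.ones(n), task, drift])

noise = np.zeros(n)
for t in range(1, n):                                # AR(1) noise, rho = 0.4
    noise[t] = 0.4 * noise[t - 1] + rng.normal(0, 1)
y = X @ np.array([10.0, 1.5, 2.0]) + noise

beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
# The naive variance estimate ignores autocorrelation and understates
# uncertainty; prewhitening (estimate rho, filter X and y) is the standard fix.
se_naive = np.sqrt(resid.var(ddof=X.shape[1]) * np.linalg.inv(X.T @ X)[1, 1])
print(f"task beta = {beta[1]:.2f}, naive SE = {se_naive:.3f}")
```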
ERIC Educational Resources Information Center
McLoughlin, M. Padraig M. M.
2008-01-01
The author of this paper submits the thesis that learning requires doing; only through inquiry is learning achieved. Hence this paper proposes a programme based on a modified Moore method for a Probability and Mathematical Statistics (PAMS) course sequence to teach students PAMS. Furthermore, the author of this paper opines that set theory…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-24
... HUMAN SERVICES Workshop: Advancing Research on Mixtures; New Perspectives and Approaches for Predicting... ``Advancing Research on Mixtures: New Perspectives and Approaches for Predicting Adverse Human Health Effects... Research and Training, NIEHS, P.O. Box 12233, MD K3-04, Research Triangle Park, NC 27709, (telephone)...
ERIC Educational Resources Information Center
Touchton, Michael
2015-01-01
I administer a quasi-experiment using undergraduate political science majors in statistics classes to evaluate whether "flipping the classroom" (the treatment) alters students' applied problem-solving performance and satisfaction relative to students in a traditional classroom environment (the control). I also assess whether general…
A Statistical Dynamic Approach to Structural Evolution of Complex Capital Market Systems
NASA Astrophysics Data System (ADS)
Shao, Xiao; Chai, Li H.
As an important part of modern financial systems, capital markets have played a crucial role in diverse social resource allocation and economic exchange. Moving beyond traditional models and theories based on neoclassical economics, and considering capital markets as typical complex open systems, this paper attempts to develop a new approach that overcomes some shortcomings of the available research. By defining a generalized entropy of capital market systems, a theoretical model and a nonlinear dynamic equation for the operation of capital markets are proposed from a statistical dynamics perspective. The US securities market from 1995 to 2001 is then simulated and analyzed as a typical case. Some instructive results are discussed and summarized.
Canadian Educational Approaches for the Advancement of Pharmacy Practice
Louizos, Christopher; Austin, Zubin
2014-01-01
Canadian faculties (schools) of pharmacy are actively engaged in the advancement and restructuring of their programs in response to the profession's shift toward pharmacists assuming an advanced practitioner role. Unfortunately, there is a paucity of evidence outlining optimal strategies for accomplishing this task. This review explores several educational changes proposed in the literature to aid in the advancement of pharmacy education, such as program admission requirements, critical-thinking assessment and teaching methods, improvement of course content delivery, the value of interprofessional education, advancement of practical experiential education, and mentorship strategies. Collectively, the implementation of these improvements to pharmacy education will be crucial in determining the direction the profession takes. PMID:25258448
Shabbiri, Khadija; Adnan, Ahmad; Jamil, Sania; Ahmad, Waqar; Noor, Bushra; Rafique, H.M.
2012-01-01
Various cultivation parameters were optimized for the production of extracellular protease by Brevibacterium linens DSM 20158 grown under solid-state fermentation conditions using a statistical approach. The cultivation variables were screened by the Plackett–Burman design, and four significant variables (soybean meal, wheat bran, (NH4)2SO4, and inoculum size) were further optimized via central composite design (CCD) using a response surface methodological approach. Using the optimal factor levels (soybean meal 12.0 g, wheat bran 8.50 g, (NH4)2SO4 0.45 g, and inoculum size 3.50%), the rate of protease production was found to be twofold higher in the optimized medium than in the unoptimized reference medium. PMID:24031928
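A central composite design combines factorial, axial, and centre points so that a second-order response surface can be fitted. A minimal two-factor sketch with entirely hypothetical activity values (the study optimized four factors; two are used here to keep the design small):

```python
import numpy as np

# Face-centred central composite design for two coded factors
# (e.g. soybean meal and wheat bran); all values are illustrative.
factorial = np.array([[-1, -1], [-1, 1], [1, -1], [1, 1]])
axial = np.array([[-1, 0], [1, 0], [0, -1], [0, 1]])
center = np.zeros((3, 2))
design = np.vstack([factorial, axial, center])

# Hypothetical protease activities measured at each design point.
y = np.array([210, 250, 260, 330, 230, 300, 240, 290, 310, 305, 308])

# Fit y = b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2 by least squares.
x1, x2 = design[:, 0], design[:, 1]
X = np.column_stack([np.ones(len(y)), x1, x2, x1 * x2, x1**2, x2**2])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
print("response-surface coefficients:", b.round(1))
```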
A statistical approach to close packing of elastic rods and to DNA packaging in viral capsids
Katzav, E.; Adda-Bedia, M.; Boudaoud, A.
2006-01-01
We propose a statistical approach for studying the close packing of elastic rods. This phenomenon belongs to the class of problems of confinement of low dimensional objects, such as DNA packaging in viral capsids. The method developed is based on Edwards' approach, which was successfully applied to polymer physics and to granular matter. We show that the confinement induces a configurational phase transition from a disordered (isotropic) phase to an ordered (nematic) phase. In each phase, we derive the pressure exerted by the rod (DNA) on the container (capsid) and the force necessary to inject (eject) the rod into (out of) the container. Finally, we discuss the relevance of the present results with respect to physical and biological problems. Regarding DNA packaging in viral capsids, these results establish the existence of ordered configurations, a hypothesis upon which previous calculations were built. They also show that such ordering can result from simple mechanical constraints. PMID:17146049
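The isotropic-to-nematic transition described above is conventionally quantified by the nematic order parameter S = <(3 cos²θ − 1)/2>, where θ is the angle between a rod segment and the director. A toy illustration of how S separates the two phases, with hypothetical tangent vectors rather than the paper's Edwards-type calculation:

```python
import numpy as np

rng = np.random.default_rng(8)

def order_parameter(tangents, director):
    """S = <(3 cos^2(theta) - 1) / 2>; S ~ 0 isotropic, S ~ 1 nematic."""
    cos_t = tangents @ director
    return np.mean((3 * cos_t**2 - 1) / 2)

# Hypothetical unit tangents: random (isotropic) vs. nearly aligned (nematic).
iso = rng.normal(size=(5000, 3))
iso /= np.linalg.norm(iso, axis=1, keepdims=True)

nem = np.tile([0.0, 0.0, 1.0], (5000, 1)) + rng.normal(0, 0.2, (5000, 3))
nem /= np.linalg.norm(nem, axis=1, keepdims=True)

z = np.array([0.0, 0.0, 1.0])
print(f"S(isotropic) = {order_parameter(iso, z):.3f}, "
      f"S(nematic) = {order_parameter(nem, z):.3f}")
```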
Advanced statistical methods for improved data analysis of NASA astrophysics missions
NASA Technical Reports Server (NTRS)
Feigelson, Eric D.
1992-01-01
The investigators under this grant studied ways to improve the statistical analysis of astronomical data, looking at existing techniques, the development of new techniques, and the production and distribution of specialized software to the astronomical community. Abstracts of nine papers that were produced are included, as well as brief descriptions of four software packages. The abstracted articles discuss analytical and Monte Carlo comparisons of six different linear least-squares fits, a second paper on linear regression in astronomy, two reviews of public-domain software for the astronomer, subsample and half-sample methods for estimating sampling distributions, nonparametric estimation of survival functions under dependent competing risks, censoring in astronomical data due to nondetections, an astronomy survival analysis computer package called ASURV, and improving the statistical methodology of astronomical data analysis.
Sivasamy, Aneetha Avalappampatty; Sundan, Bose
2015-01-01
The ever-expanding communication requirements of today's world demand extensive and efficient network systems, with equally efficient and reliable security features integrated for safe, confident, and secure communication and data transfer. Providing effective security protocols for any network environment therefore assumes paramount importance, and attempts are continuously being made to design more efficient and dynamic network intrusion detection models. In this work, an approach based on Hotelling's T² method, a multivariate statistical analysis technique, has been employed for intrusion detection, especially in network environments. Components for preprocessing, multivariate statistical analysis, and attack detection have been incorporated in developing the multivariate Hotelling's T² statistical model, and the necessary profiles have been generated based on the T² distance metric. With a threshold range obtained using the central limit theorem, observed traffic profiles have been classified as either normal or attack types. The performance of the model, as evaluated through validation and testing using the KDD Cup '99 dataset, has shown very high detection rates for all classes with low false alarm rates. The accuracy of the model presented in this work has been found to be much better than that of existing models. PMID:26357668
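The per-observation T² distance at the heart of this approach is the Mahalanobis distance from the normal-traffic profile. A minimal sketch with hypothetical traffic features; the chi-square cutoff used here is a common large-sample choice, whereas the paper derives its threshold range via the central limit theorem:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)

# Hypothetical training traffic features (e.g. duration, bytes, packet rate).
train = rng.normal([1.0, 500.0, 20.0], [0.2, 80.0, 4.0], size=(2000, 3))
mu = train.mean(axis=0)
S_inv = np.linalg.inv(np.cov(train, rowvar=False))

def t_square(x):
    """Per-observation Hotelling T^2 distance from the normal-traffic profile."""
    d = x - mu
    return d @ S_inv @ d

# Chi-square approximation for the detection threshold (large training sample).
threshold = stats.chi2.ppf(0.999, df=3)
probe = np.array([1.9, 950.0, 45.0])     # a suspicious observation
print(f"T2 = {t_square(probe):.1f}, threshold = {threshold:.1f}, "
      f"attack = {t_square(probe) > threshold}")
```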