Confounding in statistical mediation analysis: What it is and how to address it.
Valente, Matthew J; Pelham, William E; Smyth, Heather; MacKinnon, David P
2017-11-01
Psychology researchers are often interested in mechanisms underlying how randomized interventions affect outcomes such as substance use and mental health. Mediation analysis is a common statistical method for investigating psychological mechanisms that has benefited from exciting new methodological improvements over the last 2 decades. One of the most important new developments is methodology for estimating causal mediated effects using the potential outcomes framework for causal inference. Potential outcomes-based methods developed in epidemiology and statistics have important implications for understanding psychological mechanisms. We aim to provide a concise introduction to and illustration of these new methods and emphasize the importance of confounder adjustment. First, we review the traditional regression approach for estimating mediated effects. Second, we describe the potential outcomes framework. Third, we define what a confounder is and how the presence of a confounder can provide misleading evidence regarding mechanisms of interventions. Fourth, we describe experimental designs that can help rule out confounder bias. Fifth, we describe new statistical approaches to adjust for measured confounders of the mediator-outcome relation and sensitivity analyses to probe effects of unmeasured confounders on the mediated effect. All approaches are illustrated with application to a real counseling intervention dataset. Counseling psychologists interested in understanding the causal mechanisms of their interventions can benefit from incorporating the most up-to-date techniques into their mediation analyses. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
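As an illustrative aside, the traditional regression ("product of coefficients") approach to the mediated effect described above can be sketched on simulated data. All variable names and effect sizes below are hypothetical; this is a minimal sketch, not the authors' analysis.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Simulated trial: randomized treatment X, mediator M, outcome Y.
x = rng.integers(0, 2, n).astype(float)      # randomized intervention
m = 0.5 * x + rng.normal(size=n)             # mediator model: M = a*X + e1
y = 0.7 * m + 0.2 * x + rng.normal(size=n)   # outcome model: Y = b*M + c'*X + e2

def ols(X, y):
    """Least-squares coefficients with an intercept column prepended."""
    X = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

a = ols(x.reshape(-1, 1), m)[1]              # X -> M path
b = ols(np.column_stack([m, x]), y)[1]       # M -> Y path, adjusting for X
mediated = a * b                             # product-of-coefficients estimate
print(f"a = {a:.3f}, b = {b:.3f}, mediated effect a*b = {mediated:.3f}")
```

Note that randomization justifies the X-to-M path causally, but the M-to-Y path still requires the no-unmeasured-confounding assumptions the abstract emphasizes.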
ERIC Educational Resources Information Center
Everson, Howard T.; And Others
This paper explores the feasibility of neural computing methods such as artificial neural networks (ANNs) and abductory induction mechanisms (AIM) for use in educational measurement. ANN and AIM methods are contrasted with more traditional statistical techniques, such as multiple regression and discriminant function analyses, for making…
Statistical reporting of clinical pharmacology research.
Ring, Arne; Schall, Robert; Loke, Yoon K; Day, Simon
2017-06-01
Research in clinical pharmacology covers a wide range of experiments, trials and investigations: clinical trials, systematic reviews and meta-analyses of drug usage after market approval, the investigation of pharmacokinetic-pharmacodynamic relationships, the search for mechanisms of action or for potential signals for efficacy and safety using biomarkers. Often these investigations are exploratory in nature, which has implications for the way the data should be analysed and presented. Here we summarize some of the statistical issues that are of particular importance in clinical pharmacology research. © 2017 The British Pharmacological Society.
Brenčič, Mihael
2016-01-01
Northern hemisphere elementary circulation mechanisms, defined with the Dzerdzeevski classification and published on a daily basis from 1899–2012, are analysed with statistical methods as continuous categorical time series. The classification consists of 41 elementary circulation mechanisms (ECMs), which are assigned to calendar days. Empirical marginal probabilities of each ECM were determined. Seasonality and periodicity effects were investigated with moving dispersion filters and a randomisation procedure on the ECM categories, as well as with time analyses of the ECM mode. The time series were determined to be non-stationary, with strong time-dependent trends. During the investigated period, intervals of periodicity alternate with periods in which no seasonality is present. In the time series structure, the strongest division is visible at the milestone of 1986, showing that the atmospheric circulation pattern reflected in the ECMs has changed significantly. This change is a result of a change in the frequency of ECM categories; before 1986 the ECMs that appeared were more diverse, and afterwards fewer ECMs appear. The statistical approach applied to categorical climatic time series opens up potential new insights for future studies of climate variability and change. PMID:27116375
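A minimal sketch of the empirical-marginal-probability and mode steps described above, using hypothetical category labels in place of the real ECM catalogue:

```python
from collections import Counter

# Hypothetical daily series of circulation-type labels (stand-ins for the
# 41 ECM categories; the real Dzerdzeevski catalogue is not reproduced here).
series = ["ECM12a", "ECM13", "ECM12a", "ECM9a", "ECM13", "ECM12a"]

counts = Counter(series)
n = len(series)
# Empirical marginal probability of each category = relative frequency.
marginal = {cat: c / n for cat, c in counts.items()}

# The mode (most frequent category) underlies the time-of-mode analysis.
mode_cat, _ = counts.most_common(1)[0]
print(marginal, mode_cat)
```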
Accelerated testing of space batteries
NASA Technical Reports Server (NTRS)
Mccallum, J.; Thomas, R. E.; Waite, J. H.
1973-01-01
An accelerated life test program for space batteries is presented that fully satisfies empirical, statistical, and physical criteria for validity. The program includes thermal and other nonmechanical stress analyses as well as mechanical stress, strain, and rate of strain measurements.
Dynamic systems approaches and levels of analysis in the nervous system
Parker, David; Srivastava, Vipin
2013-01-01
Various analyses are applied to physiological signals. While epistemological diversity is necessary to address effects at different levels, there is often a sense of competition between analyses rather than integration. This is evidenced by the differences in the criteria needed to claim understanding in different approaches. In the nervous system, neuronal analyses that attempt to explain network outputs in cellular and synaptic terms are rightly criticized as being insufficient to explain global effects, emergent or otherwise, while higher-level statistical and mathematical analyses can provide quantitative descriptions of outputs but can only hypothesize on their underlying mechanisms. The major gap in neuroscience is arguably our inability to translate what should be seen as complementary effects between levels. We thus ultimately need approaches that allow us to bridge between different spatial and temporal levels. Analytical approaches derived from critical phenomena in the physical sciences are increasingly being applied to physiological systems, including the nervous system, and claim to provide novel insight into physiological mechanisms and opportunities for their control. Analyses of criticality have suggested several important insights that should be considered in cellular analyses. However, there is a mismatch between lower-level neurophysiological approaches and statistical phenomenological analyses that assume that lower-level effects can be abstracted away, which means that these effects are unknown or inaccessible to experimentalists. As a result experimental designs often generate data that is insufficient for analyses of criticality. 
This review considers the relevance of insights from analyses of criticality to neuronal network analyses, and highlights that to move the analyses forward and close the gap between the theoretical and neurobiological levels, it is necessary to consider that effects at each level are complementary rather than in competition. PMID:23386835
Sullivan, Thomas R; Yelland, Lisa N; Lee, Katherine J; Ryan, Philip; Salter, Amy B
2017-08-01
After completion of a randomised controlled trial, an extended follow-up period may be initiated to learn about longer term impacts of the intervention. Since extended follow-up studies often involve additional eligibility restrictions and consent processes for participation, and a longer duration of follow-up entails a greater risk of participant attrition, missing data can be a considerable threat in this setting. As a potential source of bias, it is critical that missing data are appropriately handled in the statistical analysis, yet little is known about the treatment of missing data in extended follow-up studies. The aims of this review were to summarise the extent of missing data in extended follow-up studies and the use of statistical approaches to address this potentially serious problem. We performed a systematic literature search in PubMed to identify extended follow-up studies published from January to June 2015. Studies were eligible for inclusion if the original randomised controlled trial results were also published and if the main objective of extended follow-up was to compare the original randomised groups. We recorded information on the extent of missing data and the approach used to treat missing data in the statistical analysis of the primary outcome of the extended follow-up study. Of the 81 studies included in the review, 36 (44%) reported additional eligibility restrictions and 24 (30%) consent processes for entry into extended follow-up. Data were collected at a median of 7 years after randomisation. Excluding 28 studies with a time to event primary outcome, 51/53 studies (96%) reported missing data on the primary outcome. The median percentage of randomised participants with complete data on the primary outcome was just 66% in these studies. The most common statistical approach to address missing data was complete case analysis (51% of studies), while likelihood-based analyses were also well represented (25%). 
Sensitivity analyses around the missing data mechanism were rarely performed (25% of studies), and when they were, they often involved unrealistic assumptions about the mechanism. Despite missing data being a serious problem in extended follow-up studies, statistical approaches to addressing missing data were often inadequate. We recommend researchers clearly specify all sources of missing data in follow-up studies and use statistical methods that are valid under a plausible assumption about the missing data mechanism. Sensitivity analyses should also be undertaken to assess the robustness of findings to assumptions about the missing data mechanism.
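The hazard of complete case analysis noted above can be illustrated with a small simulation in which dropout depends on the (unobserved) outcome, i.e. data are missing not at random; all numbers are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20_000

# Hypothetical follow-up outcome: treatment shifts the mean by 1.0.
treat = rng.integers(0, 2, n)
y = treat * 1.0 + rng.normal(size=n)

# MNAR dropout: participants with worse (higher) outcomes are more likely
# to be lost to follow-up, and only in the treatment arm.
p_miss = np.where((treat == 1) & (y > 1.0), 0.6, 0.05)
observed = rng.random(n) > p_miss

full_effect = y[treat == 1].mean() - y[treat == 0].mean()
cc_effect = y[(treat == 1) & observed].mean() - y[(treat == 0) & observed].mean()
print(f"full-data effect: {full_effect:.2f}, complete-case effect: {cc_effect:.2f}")
```

Under this mechanism the complete case estimate is biased well below the full-data effect, which is why sensitivity analyses around the missingness assumption matter.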
Memory matters: influence from a cognitive map on animal space use.
Gautestad, Arild O
2011-10-21
A vertebrate individual's cognitive map provides a capacity for site fidelity and long-distance returns to favorable patches. Fractal-geometrical analysis of individual space use based on collection of telemetry fixes makes it possible to verify the influence of a cognitive map on the spatial scatter of habitat use and also to what extent space use has been of a scale-specific versus a scale-free kind. This approach rests on a statistical mechanical level of system abstraction, where micro-scale details of behavioral interactions are coarse-grained to macro-scale observables like the fractal dimension of space use. In this manner, the magnitude of the fractal dimension becomes a proxy variable for distinguishing between main classes of habitat exploration and site fidelity, like memory-less (Markovian) Brownian motion and Levy walk and memory-enhanced space use like Multi-scaled Random Walk (MRW). In this paper previous analyses are extended by exploring MRW simulations under three scenarios: (1) central place foraging, (2) behavioral adaptation to resource depletion (avoidance of latest visited locations) and (3) transition from MRW towards Levy walk by narrowing memory capacity to a trailing time window. A generalized statistical-mechanical theory with the power to model cognitive map influence on individual space use will be important for statistical analyses of animal habitat preferences and the mechanics behind site fidelity and home ranges. Copyright © 2011 Elsevier Ltd. All rights reserved.
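A rough sketch of the box-counting estimate of a movement path's fractal dimension, using a memory-less Brownian walk as a stand-in for a collection of telemetry fixes (an assumption for illustration only):

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical "fixes": a 2-D Brownian-motion path, the memory-less
# reference case; a real analysis would use telemetry relocations.
path = np.cumsum(rng.normal(size=(20_000, 2)), axis=0)
path -= path.min(axis=0)
path /= path.max()          # rescale into the unit square

def box_count(points, k):
    """Number of occupied cells on a k x k grid over the unit square."""
    idx = np.minimum((points * k).astype(int), k - 1)
    return len(np.unique(idx[:, 0] * k + idx[:, 1]))

ks = np.array([4, 8, 16, 32, 64])
counts = np.array([box_count(path, k) for k in ks])

# Fractal dimension estimate = slope of log(count) versus log(k).
D = np.polyfit(np.log(ks), np.log(counts), 1)[0]
print(f"box-counting dimension estimate: {D:.2f}")
```

In the framework above, the magnitude of such a D serves as the proxy for distinguishing scale-free, memory-less movement from memory-enhanced space use.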
Improta, Roberto; Vitagliano, Luigi; Esposito, Luciana
2015-11-01
The elucidation of the mutual influence between peptide bond geometry and local conformation has important implications for protein structure refinement, validation, and prediction. To gain insights into the structural determinants and the energetic contributions associated with protein/peptide backbone plasticity, we here report an extensive analysis of the variability of the peptide bond angles by combining statistical analyses of protein structures and quantum mechanics calculations on small model peptide systems. Our analyses demonstrate that all the backbone bond angles strongly depend on the peptide conformation and unveil the existence of regular trends as a function of ψ and/or φ. The excellent agreement of the quantum mechanics calculations with the statistical surveys of protein structures validates the computational scheme employed here and demonstrates that the valence geometry of the protein/peptide backbone is primarily dictated by local interactions. Notably, for the first time we show that the position of the H(α) hydrogen atom, which is an important parameter in NMR structural studies, is also dependent on the local conformation. Most of the trends observed may be satisfactorily explained by invoking steric repulsive interactions; in some specific cases the valence bond variability is also influenced by hydrogen-bond-like interactions. Moreover, we can provide a reliable estimate of the energies involved in the interplay between geometry and conformations. © 2015 Wiley Periodicals, Inc.
Interrelationship of mechanical and corrosion-mechanical characteristics of type 12KhN4MF steel
DOE Office of Scientific and Technical Information (OSTI.GOV)
Voronin, V.P.; Goncharov, A.F.; Maslov, V.A.
1985-11-01
The investigations presented include a comparative evaluation of the corrosion-mechanical characteristics of specimens of high-strength chrome-nickel-molybdenum steel, taking into consideration the different methods of melting the original metal. A comparison of the corrosion-mechanical test results with the results of acceptance tests is presented. A study of the fracture surfaces and the specimen material using fractographic, macroscopic, and microscopic analyses is given. The systematization of the corrosion-mechanical test results using methods of mathematical statistics is presented.
Atmospheric Convective Organization: Self-Organized Criticality or Homeostasis?
NASA Astrophysics Data System (ADS)
Yano, Jun-Ichi
2015-04-01
Atmospheric convection tends to organize on a hierarchy of scales ranging from the mesoscale to planetary scales, the latter manifested especially by the Madden-Julian oscillation. The present talk examines two major possible mechanisms of self-organization identified in the wider literature from a phenomenological thermodynamic point of view, by analysing a planetary-scale cloud-resolving model simulation. The first mechanism is self-organized criticality: a saturation tendency of the precipitation rate with increasing column-integrated water, reminiscent of critical phenomena, indicates self-organized criticality. The second is a self-regulation mechanism of the kind known as homeostasis in biology. A thermodynamic argument suggests that such self-regulation maintains the column-integrated water below a threshold by increasing the precipitation rate. Previous analyses of both observational data and cloud-resolving model (CRM) experiments give mixed results. A satellite data analysis suggests self-organized criticality. Some observational data as well as CRM experiments support homeostasis. Other analyses point to a combination of these two interpretations. In this study, a CRM experiment over a planetary-scale domain with constant sea-surface temperature is analyzed. This analysis shows that the relation between column-integrated total water and precipitation suggests self-organized criticality, whereas the relation between column-integrated water vapor and precipitation suggests homeostasis. The concurrent presence of these two mechanisms is further elaborated by detailed statistical and budget analyses. These statistics are scale invariant, reflecting a spatial scaling of precipitation processes. These self-organization mechanisms are most likely best understood theoretically through the energy cycle of the convective systems, consisting of the kinetic energy and the cloud-work function. 
The author has already investigated the behavior of this cycle system under a zero-dimensional configuration. Preliminary simulations of this cycle system over a two-dimensional domain will be presented.
ERIC Educational Resources Information Center
Bringuier, E.
2009-01-01
The paper analyses particle diffusion from a thermodynamic standpoint. The main goal of the paper is to highlight the conceptual connection between particle diffusion, which belongs to non-equilibrium statistical physics, and mechanics, which deals with particle motion, at the level of third-year university courses. We start out from the fact…
Terides, Matthew D; Dear, Blake F; Fogliati, Vincent J; Gandy, Milena; Karin, Eyal; Jones, Michael P; Titov, Nickolai
2018-01-01
Cognitive-behavioural therapy (CBT) is an effective treatment for clinical and subclinical symptoms of depression and general anxiety, and increases life satisfaction. Patients' usage of CBT skills is a core aspect of treatment but there is insufficient empirical evidence suggesting that skills usage behaviours are a mechanism of clinical change. This study investigated if an internet-delivered CBT (iCBT) intervention increased the frequency of CBT skills usage behaviours and if this statistically mediated reductions in symptoms and increased life satisfaction. A two-group randomised controlled trial was conducted comparing internet-delivered CBT (n = 65) with a waitlist control group (n = 75). Participants were individuals experiencing clinically significant symptoms of depression or general anxiety. Mixed-linear models analyses revealed that the treatment group reported a significantly higher frequency of skills usage, lower symptoms, and higher life satisfaction by the end of treatment compared with the control group. Results from bootstrapping mediation analyses revealed that the increased skills usage behaviours statistically mediated symptom reductions and increased life satisfaction. Although skills usage and symptom outcomes were assessed concurrently, these findings support the notion that iCBT increases the frequency of skills usage behaviours and suggest that this may be an important mechanism of change.
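The bootstrapped mediation analysis mentioned above can be sketched on simulated stand-in data (treatment, skills usage, symptoms); the effect sizes are hypothetical, and a percentile bootstrap interval for the indirect effect a*b is computed:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 500

# Simulated stand-ins: x = treatment assignment, m = CBT skills usage,
# y = symptom score (hypothetical effect sizes).
x = rng.integers(0, 2, n).astype(float)
m = 0.6 * x + rng.normal(size=n)
y = -0.5 * m + rng.normal(size=n)

def ab(idx):
    """Product-of-coefficients mediated effect on a (bootstrap) sample."""
    xs, ms, ys = x[idx], m[idx], y[idx]
    a = np.linalg.lstsq(np.column_stack([np.ones(len(idx)), xs]),
                        ms, rcond=None)[0][1]
    b = np.linalg.lstsq(np.column_stack([np.ones(len(idx)), ms, xs]),
                        ys, rcond=None)[0][1]
    return a * b

boot = np.array([ab(rng.integers(0, n, n)) for _ in range(1000)])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"mediated effect 95% bootstrap CI: [{lo:.3f}, {hi:.3f}]")
```

A percentile interval excluding zero is the usual evidence for statistical mediation; as the abstract notes, concurrent measurement still limits causal interpretation.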
NASA Astrophysics Data System (ADS)
Monteiro, Mayra; Oliveira, Victor; Santos, Francisco; Barros Neto, Eduardo; Silva, Karyn; Silva, Rayane; Henrique, João; Chibério, Abimaelle
2017-08-01
In order to obtain cassava starch films with mechanical properties improved relative to synthetic polymers used in packaging production, a complete 2³ factorial design was carried out to investigate which factors significantly influence the tensile strength of the biofilm. The factors investigated were the cassava starch, glycerol, and modified clay contents. Modified bentonite clay was used as the filler material of the biofilm, and glycerol was the plasticizer used to thermoplastify the cassava starch. The factorial analysis suggested a regression model capable of predicting the optimal mechanical property of the cassava starch film from the maximization of the tensile strength. The reliability of the regression model was tested against the experimental data using a Pareto chart. The modified clay was the factor of greatest statistical significance for the observed response variable, being the factor that contributed most to the improvement of the mechanical properties of the starch film. The factorial experiments showed that the interaction of glycerol with both modified clay and cassava starch was significant for the reduction of biofilm ductility. Modified clay and cassava starch contributed to the maximization of biofilm ductility, while glycerol contributed to its minimization.
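A minimal sketch of how main and interaction effects are estimated in a 2³ factorial design; the coded design matrix is standard, but the response values below are illustrative, not the study's data:

```python
import numpy as np
from itertools import product

# Coded (-1/+1) design matrix for a 2^3 full factorial: starch (A),
# glycerol (B), modified clay (C).
design = np.array(list(product((-1, 1), repeat=3)))
A, B, C = design.T
y = np.array([6.9, 9.8, 5.9, 8.8, 7.2, 10.1, 6.1, 9.2])  # tensile strength, MPa

def effect(factor):
    """Main effect: mean response at +1 minus mean response at -1."""
    return y[factor == 1].mean() - y[factor == -1].mean()

for name, f in (("starch", A), ("glycerol", B), ("clay", C)):
    print(f"{name:9s} main effect: {effect(f):+.2f}")
# Interaction effects use the elementwise product of the coded columns.
print(f"glycerol x clay interaction: {effect(B * C):+.2f}")
```

With these illustrative responses the clay effect dominates, mirroring the abstract's finding that modified clay was the most significant factor.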
Gandy, M; Karin, E; Jones, M P; McDonald, S; Sharpe, L; Titov, N; Dear, B F
2018-05-13
The evidence for Internet-delivered pain management programs for chronic pain is growing, but there is little empirical understanding of how they effect change. Understanding mechanisms of clinical response to these programs could inform their effective development and delivery. A large sample (n = 396) from a previous randomized controlled trial of a validated internet-delivered psychological pain management program, the Pain Course, was used to examine the influence of three potential psychological mechanisms (pain acceptance, pain self-efficacy, fear of movement/re-injury) on treatment-related change in disability, depression, anxiety and average pain. Analyses involved generalized estimating equation models for clinical outcomes that adjusted for co-occurring change in psychological variables. This was paired with cross-lagged analysis to assess for evidence of causality. Analyses involved two time points, pre-treatment and post-treatment. Changes in pain-acceptance were strongly associated with changes in three (depression, anxiety and average pain) of the four clinical outcomes. Changes in self-efficacy were also strongly associated with two (anxiety and average pain) clinical outcomes. These findings suggest that participants were unlikely to improve in these clinical outcomes without also experiencing increases in their pain self-efficacy and pain acceptance. However, there was no clear evidence from cross-lagged analyses to currently support these psychological variables as direct mechanisms of clinical improvements. There was only statistical evidence to suggest higher levels of self-efficacy moderated improvements in depression. The findings suggest that, while clinical improvements are closely associated with improvements in pain acceptance and self-efficacy, these psychological variables may not drive the treatment effects observed. 
This study employed robust statistical techniques to assess the psychological mechanisms of an established internet-delivered pain management program. While clinical improvements (e.g. depression, anxiety, pain) were closely associated with improvements in psychological variables (e.g. pain self-efficacy and pain acceptance), these variables do not appear to be treatment mechanisms. © 2018 European Pain Federation - EFIC®.
Emprechtinger, Robert; Piso, Brigitte; Ringleb, Peter A
2017-03-01
Mechanical thrombectomy with stent retrievers is an effective treatment for patients with ischemic stroke. Results of recent meta-analyses report that the treatment is safe. However, the endpoints recurrent stroke, vasospasms, and subarachnoid hemorrhage have not been evaluated sufficiently. Hence, we extracted data on these outcomes from the five recent thrombectomy trials published in 2015 (MR CLEAN, ESCAPE, REVASCAT, SWIFT PRIME, and EXTEND IA). Subsequently, we conducted meta-analyses for each outcome, reporting the results of both the fixed and the random effects models. Three studies reported data on recurrent strokes. While the results did not reach statistical significance in the random effects model (despite a threefold elevated risk), the fixed effects model revealed a significantly higher rate of recurrent strokes after thrombectomy. Four studies reported data on subarachnoid hemorrhage. The higher pooled rates in the intervention groups were statistically significant in both the fixed and the random effects models. One study reported on vasospasms; we recorded 14 events in the intervention group and none in the control group. The efficacy of mechanical thrombectomy is not in question, yet our results indicate an increased risk of recurrent strokes, subarachnoid hemorrhage, and vasospasms post-treatment. We therefore strongly recommend thorough surveillance of these adverse events in future clinical trials and routine registries.
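A compact sketch of the fixed-effect (inverse-variance) and random-effects (DerSimonian-Laird) pooling used in such meta-analyses; the trial-level numbers below are illustrative, not the extracted trial data:

```python
import math

# Hypothetical log risk ratios and standard errors from three trials.
log_rr = [0.2, 1.1, 1.8]
se     = [0.4, 0.45, 0.5]

# Fixed-effect model: inverse-variance weighted average.
w = [1 / s**2 for s in se]
fixed = sum(wi * y for wi, y in zip(w, log_rr)) / sum(w)

# Random-effects model (DerSimonian-Laird): estimate between-trial
# variance tau^2 from Cochran's Q and add it to each trial's variance.
q = sum(wi * (y - fixed)**2 for wi, y in zip(w, log_rr))
c = sum(w) - sum(wi**2 for wi in w) / sum(w)
tau2 = max(0.0, (q - (len(w) - 1)) / c)
w_re = [1 / (s**2 + tau2) for s in se]
random_eff = sum(wi * y for wi, y in zip(w_re, log_rr)) / sum(w_re)

print(f"fixed: RR = {math.exp(fixed):.2f}, random: RR = {math.exp(random_eff):.2f}")
```

When trials are heterogeneous (tau² > 0), the random-effects weights are more even and its confidence interval wider, which is why the two models can disagree on significance as in the recurrent-stroke result above.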
NASA Astrophysics Data System (ADS)
Takabe, Satoshi; Hukushima, Koji
2016-05-01
Typical behavior of the linear programming (LP) problem is studied as a relaxation of the minimum vertex cover (min-VC), a type of integer programming (IP) problem. A lattice-gas model on the Erdös-Rényi random graphs of α-uniform hyperedges is proposed to express both the LP and IP problems of the min-VC in a common statistical mechanical model with a one-parameter family. Statistical mechanical analyses reveal for α = 2 that the LP optimal solution is typically equal to that given by the IP below the critical average degree c = e in the thermodynamic limit. The critical threshold for good accuracy of the relaxation extends the mathematical result c = 1 and coincides with the replica symmetry-breaking threshold of the IP. The LP relaxation for the minimum hitting sets with α ≥ 3, minimum vertex covers on α-uniform random graphs, is also studied. Analytic and numerical results strongly suggest that the LP relaxation fails to estimate optimal values above the critical average degree c = e/(α − 1), where the replica symmetry is broken.
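The LP/IP distinction for minimum vertex cover can be made concrete on a toy instance. Since LP optima for vertex cover are half-integral (Nemhauser-Trotter), a brute force over values in {0, 1/2, 1} suffices on a tiny graph; the triangle already separates the two problems:

```python
from itertools import product

def cover_value(edges, x):
    """Objective if x is a feasible (fractional) vertex cover, else None."""
    if all(x[u] + x[v] >= 1 for u, v in edges):
        return sum(x)
    return None

def optimum(edges, n, levels):
    """Minimum objective over all assignments drawn from the given levels."""
    vals = (cover_value(edges, x) for x in product(levels, repeat=n))
    return min(v for v in vals if v is not None)

triangle = [(0, 1), (1, 2), (0, 2)]
ip_opt = optimum(triangle, 3, (0, 1))         # integer min vertex cover
lp_opt = optimum(triangle, 3, (0, 0.5, 1))    # LP relaxation optimum
print(f"IP = {ip_opt}, LP = {lp_opt}")
```

The gap (IP = 2 versus LP = 1.5, with all vertices at 1/2) is a miniature instance of the relaxation inaccuracy that the statistical mechanical analysis locates above the critical average degree.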
Sieve analysis in HIV-1 vaccine efficacy trials
Edlefsen, Paul T.; Gilbert, Peter B.; Rolland, Morgane
2013-01-01
Purpose of review The genetic characterization of HIV-1 breakthrough infections in vaccine and placebo recipients offers new ways to assess vaccine efficacy trials. Statistical and sequence analysis methods provide opportunities to mine the mechanisms behind the effect of an HIV vaccine. Recent findings The release of results from two HIV-1 vaccine efficacy trials, Step/HVTN-502 and RV144, led to numerous studies in the last five years, including efforts to sequence HIV-1 breakthrough infections and compare viral characteristics between the vaccine and placebo groups. Novel genetic and statistical analysis methods uncovered features that distinguished founder viruses isolated from vaccinees from those isolated from placebo recipients, and identified HIV-1 genetic targets of vaccine-induced immune responses. Summary Studies of HIV-1 breakthrough infections in vaccine efficacy trials can provide an independent confirmation to correlates of risk studies, as they take advantage of vaccine/placebo comparisons while correlates of risk analyses are limited to vaccine recipients. Through the identification of viral determinants impacted by vaccine-mediated host immune responses, sieve analyses can shed light on potential mechanisms of vaccine protection. PMID:23719202
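A sieve-style comparison of a binary viral feature between vaccine and placebo breakthrough infections can be sketched with a two-proportion z-test; the counts below are illustrative, not trial data:

```python
import math

# Hypothetical comparison: frequency of a vaccine-matched residue at one
# Env site among founder viruses in each arm's breakthrough infections.
match_vax, n_vax = 18, 60      # vaccine group
match_pbo, n_pbo = 35, 70      # placebo group

p1, p2 = match_vax / n_vax, match_pbo / n_pbo
p_pool = (match_vax + match_pbo) / (n_vax + n_pbo)
se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_vax + 1 / n_pbo))
z = (p1 - p2) / se

# Two-sided p-value from the normal approximation: 2*(1 - Phi(|z|)).
p_value = math.erfc(abs(z) / math.sqrt(2))
print(f"z = {z:.2f}, p = {p_value:.3f}")
```

A deficit of vaccine-matched residues in the vaccine arm, as in this sketch, is the signature of vaccine-induced immune pressure that sieve analyses look for (real analyses also adjust for viral phylogeny and multiple testing).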
Body Weight Reducing Effect of Oral Boric Acid Intake
Aysan, Erhan; Sahin, Fikrettin; Telci, Dilek; Yalvac, Mehmet Emir; Emre, Sinem Hocaoglu; Karaca, Cetin; Muslumanoglu, Mahmut
2011-01-01
Background: Boric acid is widely used in biology, but its body-weight-reducing effect has not been researched. Methods: Twenty mice were divided into two equal groups. Control group mice drank standard tap water, while study group mice drank tap water with 0.28 mg/250 ml boric acid added, over five days. Total body weight changes, major organ histopathology, blood biochemistry, and urine and feces analyses were compared. Results: Study group mice lost a mean of 28.1% of their body weight, whereas control group mice lost no weight and instead gained a mean of 0.09% (p<0.001). Total drinking water and urine outputs were not statistically different. Cholesterol, LDL, AST, ALT, LDH, amylase and urobilinogen levels were statistically significantly higher in the study group. Other variables were not statistically different. No histopathologic differences were detected in evaluations of all resected major organs. Conclusion: Low-dose oral boric acid intake causes serious body weight reduction. Blood and urine analyses suggest elevated glucose and lipid catabolism and moderate protein catabolism, but the mechanism is unclear. PMID:22135611
An evaluation of GTAW-P versus GTA welding of alloy 718
NASA Technical Reports Server (NTRS)
Gamwell, W. R.; Kurgan, C.; Malone, T. W.
1991-01-01
Mechanical properties were evaluated to determine statistically whether the pulsed current gas tungsten arc welding (GTAW-P) process produces welds in alloy 718 with room temperature structural performance equivalent to current Space Shuttle Main Engine (SSME) welds manufactured by the constant current GTAW process. Evaluations were conducted on two base metal lots, two filler metal lots, two heat input levels, and two welding processes. The material form was 0.125-inch (3.175-mm) alloy 718 sheet. Prior to welding, sheets were treated to either the ST or STA-1 condition. After welding, panels were left as welded or heat treated to the STA-1 condition, and weld beads were left intact or machined flush. Statistical analyses were performed on yield strength, ultimate tensile strength (UTS), and high cycle fatigue (HCF) properties for all the post welded material conditions. Analyses of variance were performed on the data to determine if there were any significant effects on UTS or HCF life due to variations in base metal, filler metal, heat input level, or welding process. Statistical analyses showed that the GTAW-P process does produce welds with room temperature structural performance equivalent to current SSME welds manufactured by the GTAW process, regardless of prior material condition or post welding condition.
Tonelli, Adriano R.; Zein, Joe; Adams, Jacob; Ioannidis, John P.A.
2014-01-01
Purpose Multiple interventions have been tested in acute respiratory distress syndrome (ARDS). We examined the entire agenda of published randomized controlled trials (RCTs) in ARDS that reported on mortality, along with the respective meta-analyses. Methods We searched PubMed, the Cochrane Library and Web of Knowledge until July 2013. We included RCTs in ARDS published in English. We excluded trials of newborns and children; and those on short-term interventions, ARDS prevention or post-traumatic lung injury. We also reviewed all meta-analyses of RCTs in this field that addressed mortality. Treatment modalities were grouped into five categories: mechanical ventilation strategies and respiratory care, enteral or parenteral therapies, inhaled/intratracheal medications, nutritional support and hemodynamic monitoring. Results We identified 159 published RCTs of which 93 had overall mortality reported (n= 20,671 patients) - 44 trials (14,426 patients) reported mortality as a primary outcome. A statistically significant survival benefit was observed in 8 trials (7 interventions) and two trials reported an adverse effect on survival. Among RCTs with >50 deaths in at least 1 treatment arm (n=21), 2 showed a statistically significant mortality benefit of the intervention (lower tidal volumes and prone positioning), 1 showed a statistically significant mortality benefit only in adjusted analyses (cisatracurium) and 1 (high-frequency oscillatory ventilation) showed a significant detrimental effect. Across 29 meta-analyses, the most consistent evidence was seen for low tidal volumes and prone positioning in severe ARDS. Conclusions There is limited supportive evidence that specific interventions can decrease mortality in ARDS. While low tidal volumes and prone positioning in severe ARDS seem effective, most sporadic findings of interventions suggesting reduced mortality are not corroborated consistently in large-scale evidence including meta-analyses. PMID:24667919
On-Orbit System Identification
NASA Technical Reports Server (NTRS)
Mettler, E.; Milman, M. H.; Bayard, D.; Eldred, D. B.
1987-01-01
Information derived from accelerometer readings benefits important engineering and control functions. Report discusses methodology for detection, identification, and analysis of motions within space station. Techniques of vibration and rotation analyses, control theory, statistics, filter theory, and transform methods integrated to form system for generating models and model parameters that characterize total motion of complicated space station, with respect to both control-induced and random mechanical disturbances.
A toolbox for determining subdiffusive mechanisms
NASA Astrophysics Data System (ADS)
Meroz, Yasmine; Sokolov, Igor M.
2015-04-01
Subdiffusive processes have become a field of great interest in recent decades, due to mounting experimental evidence of subdiffusive behavior in complex systems, and especially in biological systems. Different physical scenarios leading to subdiffusion differ in the details of their dynamics. These differences are what make it possible to reconstruct the underlying physics from the results of observations, and they are the topic of this review. We review the main statistical analyses available today to distinguish between these scenarios, categorizing them according to the relevant characteristics. We collect the available tools and statistical tests, presenting them within a broader perspective. We also consider possible complications such as the subordination of subdiffusive mechanisms. Given the advances in single-particle tracking experiments in recent years, we focus on the relevant case in which the available experimental data are scant, at the level of single trajectories.
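A basic diagnostic that such toolboxes typically include is the time-averaged mean-squared displacement (TA-MSD), whose growth exponent distinguishes normal diffusion (alpha near 1) from subdiffusion (alpha below 1). The sketch below applies it to a simulated Brownian trajectory; all data and parameters are invented for illustration, not taken from the review.

```python
import math
import random

# Synthetic 1-D trajectory: ordinary Brownian motion as a baseline case.
random.seed(0)
x = [0.0]
for _ in range(10000):
    x.append(x[-1] + random.gauss(0.0, 1.0))

def ta_msd(traj, lag):
    """Time-averaged mean-squared displacement of a trajectory at a given lag."""
    n = len(traj)
    return sum((traj[i + lag] - traj[i]) ** 2 for i in range(n - lag)) / (n - lag)

msd_10 = ta_msd(x, 10)
msd_100 = ta_msd(x, 100)

# Estimate the growth exponent from TA-MSD ~ lag**alpha; here alpha should be
# close to 1 (normal diffusion); a subdiffusive trajectory would give alpha < 1.
alpha = math.log(msd_100 / msd_10) / math.log(10.0)
```

In practice one would fit alpha over many lags and many trajectories; the two-lag estimate here only illustrates the idea.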
Observational Word Learning: Beyond Propose-But-Verify and Associative Bean Counting.
Roembke, Tanja; McMurray, Bob
2016-04-01
Learning new words is difficult. In any naming situation, there are multiple possible interpretations of a novel word. Recent approaches suggest that learners may solve this problem by tracking co-occurrence statistics between words and referents across multiple naming situations (e.g. Yu & Smith, 2007), overcoming the ambiguity in any one situation. Yet, there remains debate around the underlying mechanisms. We conducted two experiments in which learners acquired eight word-object mappings using cross-situational statistics while eye-movements were tracked. These addressed four unresolved questions regarding the learning mechanism. First, eye-movements during learning showed evidence that listeners maintain multiple hypotheses for a given word and bring them all to bear in the moment of naming. Second, trial-by-trial analyses of accuracy suggested that listeners accumulate continuous statistics about word/object mappings, over and above prior hypotheses they have about a word. Third, consistent, probabilistic context can impede learning, as false associations between words and highly co-occurring referents are formed. Finally, a number of factors not previously considered in prior analysis impact observational word learning: knowledge of the foils, spatial consistency of the target object, and the number of trials between presentations of the same word. This evidence suggests that observational word learning may derive from a combination of gradual statistical or associative learning mechanisms and more rapid real-time processes such as competition, mutual exclusivity and even inference or hypothesis testing.
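The co-occurrence-tracking account described above can be sketched as a simple associative "bean counting" model: accumulate word-referent co-occurrence counts across ambiguous naming situations and read off the most frequent pairing. The trials, words, and objects below are invented for illustration.

```python
from collections import defaultdict

# Hypothetical cross-situational trials: each exposes one novel word together
# with a set of candidate referents (the true referent plus a foil).
trials = [
    ("dax", {"cup", "ball"}),
    ("dax", {"cup", "shoe"}),
    ("wug", {"ball", "shoe"}),
    ("wug", {"ball", "cup"}),
]

# Associative accumulation: every co-present word-referent pair gains a count.
counts = defaultdict(lambda: defaultdict(int))
for word, referents in trials:
    for ref in referents:
        counts[word][ref] += 1

# The learned mapping is the referent with the highest accumulated count;
# the correct referent wins because only it co-occurs on every trial.
learned = {w: max(refs, key=refs.get) for w, refs in counts.items()}
# learned == {"dax": "cup", "wug": "ball"}
```

Note this is the purely gradual-statistical end of the debate; a propose-but-verify learner would instead store a single hypothesis per word and revise it on disconfirmation.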
Genser, Bernd; Fischer, Joachim E; Figueiredo, Camila A; Alcântara-Neves, Neuza; Barreto, Mauricio L; Cooper, Philip J; Amorim, Leila D; Saemann, Marcus D; Weichhart, Thomas; Rodrigues, Laura C
2016-05-20
Immunologists often measure several correlated immunological markers, such as concentrations of different cytokines produced by different immune cells and/or measured under different conditions, to draw insights from complex immunological mechanisms. Although there have been recent methodological efforts to improve the statistical analysis of immunological data, a framework is still needed for the simultaneous analysis of multiple, often correlated, immune markers. This framework would allow the immunologists' hypotheses about the underlying biological mechanisms to be integrated. We present an analytical approach for statistical analysis of correlated immune markers, such as those commonly collected in modern immuno-epidemiological studies. We demonstrate i) how to deal with interdependencies among multiple measurements of the same immune marker, ii) how to analyse association patterns among different markers, iii) how to aggregate different measures and/or markers to immunological summary scores, iv) how to model the inter-relationships among these scores, and v) how to use these scores in epidemiological association analyses. We illustrate the application of our approach to multiple cytokine measurements from 818 children enrolled in a large immuno-epidemiological study (SCAALA Salvador), which aimed to quantify the major immunological mechanisms underlying atopic diseases or asthma. We demonstrate how to aggregate systematically the information captured in multiple cytokine measurements to immunological summary scores aimed at reflecting the presumed underlying immunological mechanisms (Th1/Th2 balance and immune regulatory network). We show how these aggregated immune scores can be used as predictors in regression models with outcomes of immunological studies (e.g. specific IgE) and compare the results to those obtained by a traditional multivariate regression approach. 
The proposed analytical approach may be especially useful to quantify complex immune responses in immuno-epidemiological studies, where investigators examine the relationship among epidemiological patterns, immune response, and disease outcomes.
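Step iii above, aggregating measures into immunological summary scores, is often implemented by standardizing each marker and combining z-scores. The sketch below uses invented cytokine values (not SCAALA data) and an illustrative Th1/Th2 balance score; marker names are assumptions for the example only.

```python
import statistics

# Hypothetical cytokine measurements for five subjects (illustrative values).
markers = {
    "IFN_gamma": [12.0, 30.0, 8.0, 25.0, 15.0],   # Th1-type marker
    "IL_4":      [3.0, 1.0, 6.0, 2.0, 4.0],       # Th2-type marker
}

def z_scores(values):
    """Standardize a marker so different assay scales become comparable."""
    mu = statistics.mean(values)
    sd = statistics.stdev(values)
    return [(v - mu) / sd for v in values]

z = {name: z_scores(vals) for name, vals in markers.items()}

# One simple summary score: a per-subject Th1/Th2 "balance", the difference
# of the standardized markers (higher = more Th1-skewed response).
balance = [z["IFN_gamma"][i] - z["IL_4"][i] for i in range(5)]
```

Such scores can then enter regression models as predictors of outcomes like specific IgE, as the abstract describes.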
Carter, Rebecca R; DiFeo, Analisa; Bogie, Kath; Zhang, Guo-Qiang; Sun, Jiayang
2014-01-01
Ovarian cancer is the most lethal gynecologic disease in the United States, with more women dying from this cancer than from all other gynecological cancers combined. Ovarian cancer has been termed the "silent killer" because some patients do not show clear symptoms at an early stage. Currently, there is a lack of approved and effective early diagnostic tools for ovarian cancer. There is also an apparent severe knowledge gap of ovarian cancer in general and of its indicative symptoms among both public and many health professionals. These factors have significantly contributed to the late stage diagnosis of most ovarian cancer patients (63% are diagnosed at Stage III or above), where the 5-year survival rate is less than 30%. The extent of public knowledge concerning ovarian cancer in the United States is unknown. The present investigation examined current public awareness and knowledge about ovarian cancer. The study implemented design strategies to develop an unbiased survey with quality control measures, including the modern application of multiple statistical analyses. The survey assessed a reasonable proxy of the US population by crowdsourcing participants through the online task marketplace Amazon Mechanical Turk, at a highly condensed rate of cost and time compared to traditional recruitment methods. Knowledge of ovarian cancer was compared to that of breast cancer using repeated measures, bias control and other quality control measures in the survey design. Analyses included multinomial logistic regression and categorical data analysis procedures such as correspondence analysis, among other statistics. We confirmed the relatively poor public knowledge of ovarian cancer among the US population. The simple, yet novel design should set an example for designing surveys to obtain quality data via Amazon Mechanical Turk with the associated analyses.
Association between sleep difficulties as well as duration and hypertension: is BMI a mediator?
Carrillo-Larco, R M; Bernabe-Ortiz, A; Sacksteder, K A; Diez-Canseco, F; Cárdenas, M K; Gilman, R H; Miranda, J J
2017-01-01
Sleep difficulties and short sleep duration have been associated with hypertension. Though body mass index (BMI) may be a mediator variable, the mediation effect has not been defined. We aimed to assess the association between sleep duration and sleep difficulties with hypertension, to determine if BMI is a mediator variable, and to quantify the mediation effect. We conducted a mediation analysis and calculated prevalence ratios with 95% confidence intervals. The exposure variables were sleep duration and sleep difficulties, and the outcome was hypertension. Sleep difficulties were statistically significantly associated with a 43% higher prevalence of hypertension in multivariable analyses; results were not statistically significant for sleep duration. In these analyses, and in sex-specific subgroup analyses, we found no strong evidence that BMI mediated the association between sleep indices and risk of hypertension. Our findings suggest that BMI does not appear to mediate the association between sleep patterns and hypertension. These results highlight the need to further study the mechanisms underlying the relationship between sleep patterns and cardiovascular risk factors.
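The mediated effect in analyses like this is often summarized by the product-of-coefficients idea: the X-to-M path times the M-to-Y path adjusted for X. The sketch below illustrates it with simple linear regressions on simulated data (the paper itself used prevalence ratios, and all names and numbers here are invented); the X-to-M path is set to zero, so the mediated effect comes out near zero, mirroring the finding that BMI did not mediate the association.

```python
import random

# Simulated data: X (e.g. sleep difficulty score), M (e.g. BMI), Y (e.g.
# blood pressure). M is independent of X, so there is no mediation to find.
random.seed(1)
n = 500
x = [random.gauss(0, 1) for _ in range(n)]
m = [random.gauss(0, 1) for _ in range(n)]              # no X -> M effect
y = [0.4 * xi + 0.5 * mi + random.gauss(0, 1) for xi, mi in zip(x, m)]

def slope(pred, out):
    """OLS slope of out on a single predictor (means removed)."""
    mp = sum(pred) / len(pred)
    mo = sum(out) / len(out)
    num = sum((p - mp) * (o - mo) for p, o in zip(pred, out))
    return num / sum((p - mp) ** 2 for p in pred)

a = slope(x, m)                                  # X -> M path (~0 here)
# b path: slope of Y on M adjusting for X, computed via residual regression
# (regress M on X and Y on X, then regress the Y-residuals on M-residuals).
m_resid = [mi - a * xi for mi, xi in zip(m, x)]
y_resid = [yi - slope(x, y) * xi for yi, xi in zip(y, x)]
b = slope(m_resid, y_resid)                      # ~0.5 by construction

mediated = a * b                                 # near zero: no mediation
```

In a real analysis one would also attach confidence intervals (e.g. by bootstrap) to the product a*b before concluding absence of mediation.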
Unperturbed Schelling Segregation in Two or Three Dimensions
NASA Astrophysics Data System (ADS)
Barmpalias, George; Elwes, Richard; Lewis-Pye, Andrew
2016-09-01
Schelling's models of segregation, first described in 1969 (Am Econ Rev 59:488-493, 1969) are among the best known models of self-organising behaviour. Their original purpose was to identify mechanisms of urban racial segregation. But his models form part of a family which arises in statistical mechanics, neural networks, social science, and beyond, where populations of agents interact on networks. Despite extensive study, unperturbed Schelling models have largely resisted rigorous analysis, prior results generally focusing on variants in which noise is introduced into the dynamics, the resulting system being amenable to standard techniques from statistical mechanics or stochastic evolutionary game theory (Young in Individual strategy and social structure: an evolutionary theory of institutions, Princeton University Press, Princeton, 1998). A series of recent papers (Brandt et al. in: Proceedings of the 44th annual ACM symposium on theory of computing (STOC 2012), 2012); Barmpalias et al. in: 55th annual IEEE symposium on foundations of computer science, Philadelphia, 2014, J Stat Phys 158:806-852, 2015), has seen the first rigorous analyses of 1-dimensional unperturbed Schelling models, in an asymptotic framework largely unknown in statistical mechanics. Here we provide the first such analysis of 2- and 3-dimensional unperturbed models, establishing most of the phase diagram, and answering a challenge from Brandt et al. in: Proceedings of the 44th annual ACM symposium on theory of computing (STOC 2012), 2012).
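A minimal unperturbed Schelling simulation conveys the self-organising segregation the abstract discusses. This is an illustrative sketch, not the paper's exact asymptotic model: the grid size, tolerance, vacancy share, and relocation rule are all arbitrary choices.

```python
import random

# Two agent types (1 and 2) on an N x N grid with vacancies (0). Any agent
# whose fraction of like-type occupied neighbours falls below TOL relocates
# to a random empty cell (the classic "unhappy agents move" dynamics).
random.seed(2)
N, TOL = 20, 0.5
grid = {(r, c): random.choice([0, 1, 1, 2, 2]) for r in range(N) for c in range(N)}

def neighbours(r, c):
    return [(r + dr, c + dc) for dr in (-1, 0, 1) for dc in (-1, 0, 1)
            if (dr, dc) != (0, 0) and 0 <= r + dr < N and 0 <= c + dc < N]

def happy(r, c):
    t = grid[(r, c)]
    occ = [grid[p] for p in neighbours(r, c) if grid[p] != 0]
    return not occ or sum(v == t for v in occ) / len(occ) >= TOL

def similarity():
    """Share of occupied neighbour pairs that are same-type (segregation index)."""
    pairs = same = 0
    for (r, c), t in grid.items():
        if t == 0:
            continue
        for p in neighbours(r, c):
            if grid[p] != 0:
                pairs += 1
                same += grid[p] == t
    return same / pairs

before = similarity()
for _ in range(30):                      # relocation sweeps
    for cell in list(grid):
        if grid[cell] != 0 and not happy(*cell):
            empties = [p for p, t in grid.items() if t == 0]
            dest = random.choice(empties)
            grid[dest], grid[cell] = grid[cell], 0
after = similarity()   # same-type neighbour share rises above its initial ~0.5
```

Even with a mild tolerance of 0.5, the final similarity typically far exceeds what any individual agent demands, which is the counterintuitive point of Schelling's original argument.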
Progressive statistics for studies in sports medicine and exercise science.
Hopkins, William G; Marshall, Stephen W; Batterham, Alan M; Hanin, Juri
2009-01-01
Statistical guidelines and expert statements are now available to assist in the analysis and reporting of studies in some biomedical disciplines. We present here a more progressive resource for sample-based studies, meta-analyses, and case studies in sports medicine and exercise science. We offer forthright advice on the following controversial or novel issues: using precision of estimation for inferences about population effects in preference to null-hypothesis testing, which is inadequate for assessing clinical or practical importance; justifying sample size via acceptable precision or confidence for clinical decisions rather than via adequate power for statistical significance; showing SD rather than SEM, to better communicate the magnitude of differences in means and nonuniformity of error; avoiding purely nonparametric analyses, which cannot provide inferences about magnitude and are unnecessary; using regression statistics in validity studies, in preference to the impractical and biased limits of agreement; making greater use of qualitative methods to enrich sample-based quantitative projects; and seeking ethics approval for public access to the depersonalized raw data of a study, to address the need for more scrutiny of research and better meta-analyses. Advice on less contentious issues includes the following: using covariates in linear models to adjust for confounders, to account for individual differences, and to identify potential mechanisms of an effect; using log transformation to deal with nonuniformity of effects and error; identifying and deleting outliers; presenting descriptive, effect, and inferential statistics in appropriate formats; and contending with bias arising from problems with sampling, assignment, blinding, measurement error, and researchers' prejudices. This article should advance the field by stimulating debate, promoting innovative approaches, and serving as a useful checklist for authors, reviewers, and editors.
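Two of the recommendations above can be made concrete: use a log transformation when errors scale with the mean, back-transforming the mean difference into a percent effect, and report SD rather than SEM to communicate the spread of individual values. All numbers below are invented (e.g. sprint power in watts for two training groups).

```python
import math
import statistics

# Invented outcome data for a control and a treated group.
control = [250.0, 280.0, 310.0, 265.0, 295.0]
treated = [270.0, 305.0, 340.0, 285.0, 330.0]

# Analyse log(y): differences on the log scale are ratio (percent) effects.
log_c = [math.log(v) for v in control]
log_t = [math.log(v) for v in treated]
diff = statistics.mean(log_t) - statistics.mean(log_c)
percent_effect = 100 * (math.exp(diff) - 1)   # positive = treated higher

# Report SD, not SEM: SEM shrinks with n and understates individual spread.
sd_c = statistics.stdev(control)
sem_c = sd_c / math.sqrt(len(control))
```

A full analysis in the article's spirit would add a confidence interval for the percent effect and interpret its magnitude against a practically important threshold rather than a null-hypothesis test.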
Tempo-spatial analysis of Fennoscandian intraplate seismicity
NASA Astrophysics Data System (ADS)
Roberts, Roland; Lund, Björn
2017-04-01
Coupled spatial-temporal patterns of the occurrence of earthquakes in Fennoscandia are analysed using non-parametric methods. The occurrence of larger events is unambiguously and very strongly temporally clustered, with major implications for the assessment of seismic hazard in areas such as Fennoscandia. In addition, there is a clear pattern of geographical migration of activity. Data from the Swedish National Seismic Network and a collated international catalogue are analysed. Results show consistent patterns on different spatial and temporal scales. We are currently investigating these patterns in order to assess the statistical significance of the tempo-spatial patterns, and to what extent these may be consistent with stress transfer mechanisms such as Coulomb stress and pore fluid migration. Indications are that some further mechanism is necessary to explain the data, perhaps related to post-glacial uplift, which is up to 1 cm/year.
Keith Jennings; Julia A. Jones
2015-01-01
This study tested multiple hydrologic mechanisms to explain snowpack dynamics in extreme rain-on-snow floods, which occur widely in the temperate and polar regions. We examined 26 large 10-day storm events over the period 1992-2012 in the H.J. Andrews Experimental Forest in western Oregon, using statistical analyses (regression, ANOVA, and wavelet coherence) of hourly...
Olimpo, Jeffrey T.; Pevey, Ryan S.; McCabe, Thomas M.
2018-01-01
Course-based undergraduate research experiences (CUREs) provide an avenue for student participation in authentic scientific opportunities. Within the context of such coursework, students are often expected to collect, analyze, and evaluate data obtained from their own investigations. Yet, limited research has been conducted that examines mechanisms for supporting students in these endeavors. In this article, we discuss the development and evaluation of an interactive statistics workshop that was expressly designed to provide students with an open platform for graduate teaching assistant (GTA)-mentored data processing, statistical testing, and synthesis of their own research findings. Mixed methods analyses of pre/post-intervention survey data indicated a statistically significant increase in students’ reasoning and quantitative literacy abilities in the domain, as well as enhancement of student self-reported confidence in and knowledge of the application of various statistical metrics to real-world contexts. Collectively, these data reify an important role for scaffolded instruction in statistics in preparing emergent scientists to be data-savvy researchers in a globally expansive STEM workforce. PMID:29904549
Origin of the correlations between exit times in pedestrian flows through a bottleneck
NASA Astrophysics Data System (ADS)
Nicolas, Alexandre; Touloupas, Ioannis
2018-01-01
Robust statistical features have emerged from the microscopic analysis of dense pedestrian flows through a bottleneck, notably with respect to the time gaps between successive passages. We pinpoint the mechanisms at the origin of these features thanks to simple models that we develop and analyse quantitatively. We disprove the idea that anticorrelations between successive time gaps (i.e. an alternation between shorter ones and longer ones) are a hallmark of a zipper-like intercalation of pedestrian lines and show that they simply result from the possibility that pedestrians from distinct ‘lines’ or directions cross the bottleneck within a short time interval. A second feature concerns the bursts of escapes, i.e. egresses that come in fast succession. Despite the ubiquity of exponential distributions of burst sizes, entailed by a Poisson process, we argue that anomalous (power-law) statistics arise if the bottleneck is nearly congested, albeit only in a tiny portion of parameter space. The generality of the proposed mechanisms implies that similar statistical features should also be observed for other types of particulate flows.
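The link the abstract draws between a Poisson egress process and exponentially (geometrically) distributed burst sizes can be checked with a short simulation; the rate and burst threshold below are arbitrary choices, not values from the paper.

```python
import math
import random

# Egress time gaps from a Poisson process; a "burst" is a run of successive
# gaps shorter than a threshold, ending at the first longer gap.
random.seed(3)
rate, thresh, n = 1.0, 0.5, 20000
gaps = [random.expovariate(rate) for _ in range(n)]

sizes = []
run = 1
for g in gaps:
    if g < thresh:
        run += 1          # another egress joins the current burst
    else:
        sizes.append(run)  # long gap: the burst closes
        run = 1

# For a Poisson process, P(gap < thresh) = 1 - exp(-rate * thresh), so burst
# sizes are geometric with mean 1 / (1 - p): the discrete exponential law.
p = 1 - math.exp(-rate * thresh)
mean_size = sum(sizes) / len(sizes)
expected = 1 / (1 - p)
```

The paper's point is that deviations from this exponential baseline, such as power-law burst statistics near congestion, signal dynamics beyond a simple Poisson process.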
Mars: Noachian hydrology by its statistics and topology
NASA Technical Reports Server (NTRS)
Cabrol, N. A.; Grin, E. A.
1993-01-01
Discrimination between fluvial features generated by surface drainage and subsurface aquifer discharges will provide clues to the understanding of early Mars' climatic history. Our approach is to define the process of formation of the oldest fluvial valleys by statistical and topological analyses. Formation of fluvial valley systems reached its highest statistical concentration during the Noachian Period. Nevertheless, they are a scarce phenomenon in Martian history, localized on the cratered uplands, and subject to latitudinal distribution. They occur sparsely on Noachian geological units with a weak distribution density, and appear as small isolated areas (around 5 x 10(exp 3) sq km), drained by short streams (100-300 km in length). Topological analysis of the internal organization of 71 surveyed Noachian fluvial valley networks also provides information on the mechanisms of formation.
Investigation of serum biomarkers in primary gout patients using iTRAQ-based screening.
Ying, Ying; Chen, Yong; Zhang, Shun; Huang, Haiyan; Zou, Rouxin; Li, Xiaoke; Chu, Zanbo; Huang, Xianqian; Peng, Yong; Gan, Minzhi; Geng, Baoqing; Zhu, Mengya; Ying, Yinyan; Huang, Zuoan
2018-03-21
Primary gout is a major disease that affects human health; however, its pathogenesis is not well known. The purpose of this study was to identify biomarkers to explore the underlying mechanisms of primary gout. We used the isobaric tags for relative and absolute quantitation (iTRAQ) technique combined with liquid chromatography-tandem mass spectrometry to screen differentially expressed proteins between gout patients and controls. We also identified proteins potentially involved in gout pathogenesis by analysing biological processes, cellular components, molecular functions, Kyoto Encyclopedia of Genes and Genomes (KEGG) pathways and protein-protein interactions. We further verified some samples using enzyme-linked immunosorbent assay (ELISA). Statistical analyses were carried out using SPSS v. 20.0 and ROC (receiver operating characteristic) curve analyses were carried out using Medcalc software. Two-sided p-values <0.05 were deemed to be statistically significant for all analyses. We identified 95 differentially expressed proteins (50 up-regulated and 45 down-regulated), and selected nine proteins (α-enolase (ENOA), glyceraldehyde-3-phosphate dehydrogenase (G3P), complement component C9 (CO9), profilin-1 (PROF1), lipopolysaccharide-binding protein (LBP), tubulin beta-4A chain (TBB4A), phosphoglycerate kinase (PGK1), glucose-6-phosphate isomerase (G6PI), and transketolase (TKT)) for verification. This showed that the level of TBB4A was significantly higher in primary gout than in controls (p=0.023). iTRAQ technology was useful in the selection of differentially expressed proteins from proteomes, and provides a strong theoretical basis for the study of biomarkers and mechanisms in primary gout. In addition, TBB4A protein may be associated with primary gout.
Chang, Suchi; Shi, Jindong; Fu, Cuiping; Wu, Xu; Li, Shanqun
2016-01-01
Background COPD is the third leading cause of death worldwide. Acute exacerbations of COPD may cause respiratory failure, requiring intensive care unit admission and mechanical ventilation. Intensive care unit patients with acute exacerbations of COPD requiring mechanical ventilation have higher mortality rates than other hospitalized patients. Although mechanical ventilation is the most effective intervention for these conditions, invasive ventilation techniques have yielded variable effects. Objective We evaluated pressure-regulated volume control (PRVC) ventilation treatment efficacy and preventive effects on pulmonary barotrauma in elderly COPD patients with respiratory failure. Patients and methods Thirty-nine intubated patients were divided into experimental and control groups and treated with the PRVC and synchronized intermittent mandatory ventilation – volume control methods, respectively. Vital signs, respiratory mechanics, and arterial blood gas analyses were monitored for 2–4 hours and 48 hours. Results Both groups showed rapidly improved pH, partial pressure of oxygen (PaO2), and PaO2 per fraction of inspired O2 levels and lower partial pressure of carbon dioxide (PaCO2) levels. The pH and PaCO2 levels at 2–4 hours were lower and higher, respectively, in the test group than those in the control group (P<0.05 for both); after 48 hours, blood gas analyses showed no statistical difference in any marker (P>0.05). Vital signs during 2–4 hours and 48 hours of treatment showed no statistical difference in either group (P>0.05). The level of peak inspiratory pressure in the experimental group after mechanical ventilation for 2–4 hours and 48 hours was significantly lower than that in the control group (P<0.05), while other variables were not significantly different between groups (P>0.05). 
Conclusion Among elderly COPD patients with respiratory failure, application of PRVC resulted in rapid improvement in arterial blood gas analyses while maintaining a low peak inspiratory pressure. PRVC can reduce pulmonary barotrauma risk, making it a safer protective ventilation mode than synchronized intermittent mandatory ventilation – volume control. PMID:27274223
Index of mechanical work in gait of children with cerebral palsy.
Dziuba, Alicja Katarzyna; Tylkowska, Małgorzata; Jaroszczuk, Sebastian
2014-01-01
The pathological gait of children with cerebral palsy involves higher mechanical work, which limits their ability to function properly in society. Mechanical work is directly related to walking speed and, although a number of studies have been carried out in this field, few of them analysed the effect of the speed. The study aimed to develop standards for mechanical work during gait of children with cerebral palsy depending on the walking speed. The study covered 18 children with cerebral palsy and 14 healthy children. The BTS Smart software and the authors' own software were used to evaluate mechanical work and the kinetic, potential and rotational energy connected with motion of the children's bodies during walking. Compared to healthy subjects, mechanical work in children with cerebral palsy increases with the degree of disability. It can be expressed as a linear function of walking speed and shows strong and statistically significant correlations with walking speed. A negative statistically significant correlation between the degree of disability and walking speed can be observed. The highest contribution to the total mechanical energy during gait is from the mechanical energy of the feet. The instantaneous value of rotational energy is 700 times lower than the instantaneous mechanical energy. An increase in walking speed increases the contribution of the kinetic energy index to total mechanical work. The method described can provide an objective supplement for doctors and physical therapists, enabling a simple and immediate diagnosis without much technical knowledge.
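The energy components named above are standard mechanics: translational kinetic, gravitational potential, and rotational energy of each body segment. The sketch below evaluates them for a single segment at one instant; the mass, speeds, and height are invented round numbers, not values from the study.

```python
# Illustrative energy bookkeeping for one body segment (e.g. a foot) at one
# instant of the gait cycle. All parameter values are hypothetical.
m = 1.2          # segment mass, kg
v = 1.4          # linear speed of the segment's centre of mass, m/s
h = 0.08         # height of the centre of mass, m
I = 0.005        # moment of inertia about the centre of mass, kg*m^2
omega = 3.0      # angular velocity, rad/s
g = 9.81         # gravitational acceleration, m/s^2

e_kin = 0.5 * m * v ** 2          # translational kinetic energy, J
e_pot = m * g * h                 # gravitational potential energy, J
e_rot = 0.5 * I * omega ** 2      # rotational energy, J

total = e_kin + e_pot + e_rot
# As in the abstract, the rotational term is far smaller than the
# translational and potential terms for typical segment parameters.
```

Summing such terms over all segments and over the gait cycle, normalized by distance or body mass, yields the mechanical work indices the study compares across groups.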
Meghezi, Sébastien; Couet, Frédéric; Chevallier, Pascale; Mantovani, Diego
2012-01-01
Vascular tissue engineering focuses on the replacement of diseased small-diameter blood vessels with a diameter less than 6 mm for which adequate substitutes still do not exist. One approach to vascular tissue engineering is to culture vascular cells on a scaffold in a bioreactor. The bioreactor establishes pseudophysiological conditions for culture (medium culture, 37°C, mechanical stimulation). Collagen gels are widely used as scaffolds for tissue regeneration due to their biological properties; however, they exhibit low mechanical properties. Mechanical characterization of these scaffolds requires establishing the conditions of testing in regard to the conditions set in the bioreactor. The effects of different parameters used during mechanical testing on the collagen gels were evaluated in terms of mechanical and viscoelastic properties. Thus, a factorial experiment was adopted, and three relevant factors were considered: temperature (23°C or 37°C), hydration (aqueous saline solution or air), and mechanical preconditioning (with or without). Statistical analyses showed significant effects of these factors on the mechanical properties which were assessed by tensile tests as well as stress relaxation tests. The latter tests provide a more complete picture of the gels' viscoelastic properties. Therefore, performing mechanical analyses on hydrogels requires setting an adequate environment in terms of temperature and aqueous saline solution as well as choosing the adequate test. PMID:22844285
Statistics for NAEG: past efforts, new results, and future plans
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gilbert, R.O.; Simpson, J.C.; Kinnison, R.R.
A brief review of Nevada Applied Ecology Group (NAEG) objectives is followed by a summary of past statistical analyses conducted by Pacific Northwest Laboratory for the NAEG. Estimates of spatial pattern of radionuclides and other statistical analyses at NS's 201, 219 and 221 are reviewed as background for new analyses presented in this paper. Suggested NAEG activities and statistical analyses needed for the projected termination date of NAEG studies in March 1986 are given.
Dishon-Brown, Amanda; Golder, Seana; Renn, Tanya; Winham, Katherine; Higgins, George E; Logan, T K
2017-06-01
Justice-involved women report high rates of victimization across their life span, and these experiences contribute to their involvement in the criminal justice (CJ) system. Within this population, research has identified an overlap between victimization and substance use, a high-risk coping mechanism. Furthermore, research indicates that attachment style is related to coping and high-risk behaviors. Research is needed to understand the relationships among these mechanisms as they relate to intimate partner violence (IPV). To address this gap, this study investigated the relationships between attachment, coping, childhood victimization, substance use, and IPV among 406 victimized women on probation/parole. Results of 6 multivariate regression analyses were statistically significant, accounting for 8%-13% of the variance in IPV. In particular, childhood sexual victimization and negative coping were significant in all analyses. Findings provide practitioners, administrators, and policymakers with information about the specific needs of justice-involved women.
Laufer, Vincent A; Chen, Jake Y; Langefeld, Carl D; Bridges, S Louis
2017-08-01
The use of high-throughput omics may help to understand the contribution of genetic variants to the pathogenesis of rheumatic diseases. We discuss the concept of missing heritability: that genetic variants do not explain the heritability of rheumatoid arthritis and related rheumatologic conditions. In addition to an overview of how integrative data analysis can lead to novel insights into mechanisms of rheumatic diseases, we describe statistical approaches to prioritizing genetic variants for future functional analyses. We illustrate how analyses of large datasets provide hope for improved approaches to the diagnosis, treatment, and prevention of rheumatic diseases. Copyright © 2017 Elsevier Inc. All rights reserved.
Earthquake triggering in southeast Africa following the 2012 Indian Ocean earthquake
NASA Astrophysics Data System (ADS)
Neves, Miguel; Custódio, Susana; Peng, Zhigang; Ayorinde, Adebayo
2018-02-01
In this paper we present evidence of earthquake dynamic triggering in southeast Africa. We analysed seismic waveforms recorded at 53 broad-band and short-period stations in order to identify possible increases in the rate of microearthquakes and tremor due to the passage of teleseismic waves generated by the Mw 8.6 2012 Indian Ocean earthquake. We found evidence of triggered local earthquakes and no evidence of triggered tremor in the region. We assessed the statistical significance of the increase in the number of local earthquakes using β-statistics. Statistically significant dynamic triggering of local earthquakes was observed at 7 of the 53 analysed stations. Two of these stations are located on the northeast coast of Madagascar and the other five are located in the Kaapvaal Craton, southern Africa. We found no evidence of dynamically triggered seismic activity at stations located near the structures of the East African Rift System. Hydrothermal activity exists close to the stations that recorded dynamic triggering; however, it also exists near the East African Rift System structures, where no triggering was observed. Our results suggest that factors other than tectonic regime and geothermalism alone are needed to explain the mechanisms that underlie earthquake triggering.
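The β-statistic used above compares the number of events observed in a window after the teleseismic arrival with the count expected from the long-term background rate. A minimal sketch (an illustration, not the authors' code; the binomial-variance form shown here is one common variant, with |β| ≳ 2 often taken as significant):

```python
import math

def beta_statistic(n_after, n_total, t_after, t_total):
    """Standardized excess of events in a time window.

    n_after : events observed in the post-trigger window
    n_total : events observed over the whole catalogue interval
    t_after : duration of the post-trigger window
    t_total : duration of the whole catalogue interval
    """
    p = t_after / t_total               # fraction of time in the window
    expected = n_total * p              # count expected under a uniform rate
    variance = n_total * p * (1.0 - p)  # binomial variance of the window count
    return (n_after - expected) / math.sqrt(variance)
```

For example, 15 of 20 catalogued events falling in the second half of the interval gives β = (15 − 10)/√5 ≈ 2.24, a marginally significant rate increase under this convention.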
Climate sensitivity to the lower stratospheric ozone variations
NASA Astrophysics Data System (ADS)
Kilifarska, N. A.
2012-12-01
The strong sensitivity of the Earth's radiation balance to variations in lower stratospheric ozone, reported previously, is analysed here using non-linear statistical methods. Our non-linear model of land air temperature (T), driven by the measured Arosa total ozone (TOZ), explains 75% of the total variability of Earth's T variations during the period 1926-2011. We have also analysed the factors which could influence TOZ variability and found that the strongest impact belongs to the multi-decadal variations of galactic cosmic rays. By constructing a statistical model of the ozone variability, we have been able to predict the tendency in land air T evolution until the end of the current decade. Results show that Earth is facing a weak cooling of the surface T by 0.05-0.25 K (depending on the ozone model) until the end of the current solar cycle. A new mechanism for O3 influence on climate is proposed.
A marked correlation function for constraining modified gravity models
NASA Astrophysics Data System (ADS)
White, Martin
2016-11-01
Future large scale structure surveys will provide increasingly tight constraints on our cosmological model. These surveys will report results on the distance scale and growth rate of perturbations through measurements of Baryon Acoustic Oscillations and Redshift-Space Distortions. It is interesting to ask: what further analyses should become routine, so as to test as-yet-unknown models of cosmic acceleration? Models which aim to explain the accelerated expansion rate of the Universe by modifications to General Relativity often invoke screening mechanisms which can imprint a non-standard density dependence on their predictions. This suggests density-dependent clustering as a `generic' constraint. This paper argues that a density-marked correlation function provides a density-dependent statistic which is easy to compute and report and requires minimal additional infrastructure beyond what is routinely available to such survey analyses. We give one realization of this idea and study it using low order perturbation theory. We encourage groups developing modified gravity theories to see whether such statistics provide discriminatory power for their models.
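A density-marked correlation function of the kind proposed here can be sketched as the mean mark product over pairs at a given separation, normalized by the squared mean mark, so that M(r) = 1 when marks carry no spatial information. This is a toy brute-force estimator for small point sets (real survey analyses use pair counts against random catalogues and estimator corrections), with the density-dependent mark left generic:

```python
import numpy as np

def marked_correlation(pos, marks, r_edges):
    """Brute-force marked correlation function M(r).

    pos     : (n, 3) array of positions
    marks   : (n,) array of marks (e.g. a function of local density)
    r_edges : bin edges in pair separation
    Returns M(r) per bin; NaN for empty bins.
    """
    n = len(pos)
    d = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=-1)
    iu = np.triu_indices(n, k=1)                       # each pair once
    sep = d[iu]
    mm = (marks[:, None] * marks[None, :])[iu]         # mark products
    mbar2 = marks.mean() ** 2
    M = np.full(len(r_edges) - 1, np.nan)
    for i in range(len(r_edges) - 1):
        in_bin = (sep >= r_edges[i]) & (sep < r_edges[i + 1])
        if in_bin.any():
            M[i] = mm[in_bin].mean() / mbar2
    return M
```

With uniform marks the estimator returns exactly 1 in every populated bin; deviations from 1 with density-dependent marks are the screening-sensitive signal the paper advocates reporting.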
A practical and systematic review of Weibull statistics for reporting strengths of dental materials
Quinn, George D.; Quinn, Janet B.
2011-01-01
Objectives: To review the history, theory and current applications of Weibull analyses sufficient to make informed decisions regarding practical use of the analysis in dental material strength testing. Data: References are made to examples in the engineering and dental literature, but this paper also includes illustrative analyses of Weibull plots, fractographic interpretations, and Weibull distribution parameters obtained for a dense alumina, two feldspathic porcelains, and a zirconia. Sources: Informational sources include Weibull's original articles, later articles specific to applications and theoretical foundations of Weibull analysis, texts on statistics and fracture mechanics, and the international standards literature. Study Selection: The chosen Weibull analyses are used to illustrate technique, the importance of flaw size distributions, the physical meaning of Weibull parameters, and the concept of “equivalent volumes” used to compare measured strengths obtained from different test configurations. Conclusions: Weibull analysis has a strong theoretical basis and can be of particular value in dental applications, primarily because of test specimen size limitations and the use of different test configurations. Also endemic to dental materials, however, is increased difficulty in satisfying application requirements, such as confirming fracture origin type and diligence in obtaining quality strength data. PMID:19945745
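The two-parameter Weibull strength analysis reviewed above, P_f = 1 − exp(−(σ/σ₀)^m), is commonly fitted by linearizing the CDF: ln(−ln(1 − P_i)) = m·ln σ_i − m·ln σ₀. A minimal sketch (illustrative, not the authors' code), using the common (i − 0.5)/N rank-probability estimator:

```python
import math

def weibull_fit(strengths):
    """Least-squares fit of Weibull modulus m and characteristic
    strength sigma0 on the linearized Weibull plot."""
    s = sorted(strengths)
    n = len(s)
    x = [math.log(v) for v in s]                                # ln(strength)
    y = [math.log(-math.log(1.0 - (i + 0.5) / n))               # ln(-ln(1-P_i))
         for i in range(n)]
    xbar, ybar = sum(x) / n, sum(y) / n
    m = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
         / sum((xi - xbar) ** 2 for xi in x))                   # slope = modulus
    sigma0 = math.exp(xbar - ybar / m)                          # from intercept
    return m, sigma0
```

A high modulus m indicates a narrow flaw-size distribution (reliable strengths); σ₀ is the stress at 63.2% failure probability for the tested specimen geometry.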
Ito, Reio; Shinoda, Masamichi; Honda, Kuniya; Urata, Kentaro; Lee, Jun; Maruno, Mitsuru; Soma, Kumi; Okada, Shinji; Gionhaku, Nobuhito; Iwata, Koichi
To determine the involvement of tumor necrosis factor alpha (TNFα) signaling in the trigeminal ganglion (TG) in the mechanical hypersensitivity of the masseter muscle during temporomandibular joint (TMJ) inflammation. A total of 55 male Sprague-Dawley rats were used. Following injection of Complete Freund's Adjuvant into the TMJ, the mechanical sensitivities of the masseter muscle and the overlying facial skin were measured. Satellite glial cell (SGC) activation and TNFα expression in the TG were investigated immunohistochemically, and the effects of their inhibition on the mechanical hypersensitivity of the masseter muscle were also examined. Student t test or two-way repeated-measures analysis of variance followed by Bonferroni multiple comparisons test were used for statistical analyses. P < .05 was considered to reflect statistical significance. Mechanical allodynia in the masseter muscle was induced without any inflammatory cell infiltration in the muscle after TMJ inflammation. SGC activation and an increased number of TNFα-immunoreactive cells were induced in the TG following TMJ inflammation. Intra-TG administration of an inhibitor of SGC activity or of TNFα-neutralizing antibody depressed both the increased number of TG cells encircled by activated SGCs and the mechanical hypersensitivity of the masseter following TMJ inflammation. These findings suggest that persistent masseter hypersensitivity associated with TMJ inflammation was mediated by SGC-TG neuron interactions via TNFα signaling in the TG.
Kawamoto, Taisuke; Ito, Yuichi; Morita, Osamu; Honda, Hiroshi
2017-01-01
Cholestasis is one of the major causes of drug-induced liver injury (DILI), which can result in withdrawal of approved drugs from the market. Early identification of cholestatic drugs is difficult due to the complex mechanisms involved. In order to develop a strategy for mechanism-based risk assessment of cholestatic drugs, we analyzed gene expression data obtained from the livers of rats that had been orally administered 12 known cholestatic compounds repeatedly for 28 days at three dose levels. Qualitative analyses were performed using two statistical approaches (hierarchical clustering and principal component analysis), in addition to pathway analysis. The transcriptional benchmark dose (tBMD) and the tBMD 95% lower limit (tBMDL) were used for quantitative analyses, which revealed three compound sub-groups that produced different types of differential gene expression; these groups of genes were mainly involved in inflammation, cholesterol biosynthesis, and oxidative stress. Furthermore, the tBMDL values for each test compound were in good agreement with the relevant no-observed-adverse-effect level. These results indicate that our novel strategy for drug safety evaluation using mechanism-based classification and tBMDL would facilitate the application of toxicogenomics to risk assessment of cholestatic DILI.
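The benchmark-dose idea behind the tBMD can be illustrated with a deliberately simplified sketch: fit a dose-response model (here, a straight line; real tBMD workflows fit richer model families and report the lower confidence bound, tBMDL) and solve for the dose at which the fitted response exceeds the fitted dose-zero response by a chosen benchmark response (BMR). All names and the linear model are illustrative assumptions, not the paper's method:

```python
import numpy as np

def benchmark_dose(doses, responses, bmr):
    """Point estimate of the benchmark dose under a linear model.

    Fits response = slope*dose + intercept, then solves
    slope*BMD = bmr, i.e. the dose where the fitted response rises
    bmr units above the fitted control (dose-0) response.
    """
    slope, intercept = np.polyfit(doses, responses, 1)
    return bmr / slope
```

With richer models the same idea applies: invert the fitted curve at (control response + BMR), then profile or bootstrap for the lower bound.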
Cohen, Alan A; Milot, Emmanuel; Li, Qing; Legault, Véronique; Fried, Linda P; Ferrucci, Luigi
2014-09-01
Measuring physiological dysregulation during aging could be a key tool both to understand underlying aging mechanisms and to predict clinical outcomes in patients. However, most existing indices are either circular or hard to interpret biologically. Recently, we showed that statistical distance of 14 common blood biomarkers (a measure of how strange an individual's biomarker profile is) was associated with age and mortality in the WHAS II data set, validating its use as a measure of physiological dysregulation. Here, we extend the analyses to other data sets (WHAS I and InCHIANTI) to assess the stability of the measure across populations. We found that the statistical criteria used to determine the original 14 biomarkers produced diverging results across populations; in other words, had we started with a different data set, we would have chosen a different set of markers. Nonetheless, the same 14 markers (or the subset of 12 available for InCHIANTI) produced highly similar predictions of age and mortality. We include analyses of all combinatorial subsets of the markers and show that results do not depend much on biomarker choice or data set, but that more markers produce a stronger signal. We conclude that statistical distance as a measure of physiological dysregulation is stable across populations in Europe and North America. Copyright © 2014 Elsevier Inc. All rights reserved.
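The "statistical distance" of a biomarker profile described above is the Mahalanobis distance of an individual's profile from the population centroid, which accounts for the correlations among biomarkers. A minimal sketch (illustrative; the published analyses involve additional preprocessing and marker selection):

```python
import numpy as np

def statistical_distance(X):
    """Mahalanobis distance of each row (one individual's biomarker
    profile) from the sample centroid, using the sample covariance.
    Larger values indicate a 'stranger' profile (dysregulation proxy)."""
    mu = X.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(X, rowvar=False))
    diff = X - mu
    # d_i = sqrt( (x_i - mu)^T  Sigma^{-1}  (x_i - mu) )
    return np.sqrt(np.einsum('ij,jk,ik->i', diff, cov_inv, diff))
```

When the biomarkers are uncorrelated with unit variance this reduces to the Euclidean distance from the mean, which is why the measure is read as "how unusual is this profile given the population's correlation structure".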
Wingate, Peter H; Thornton, George C; McIntyre, Kelly S; Frame, Jennifer H
2003-02-01
The present study examined relationships between reduction-in-force (RIF) personnel practices, presentation of statistical evidence, and litigation outcomes. Policy capturing methods were utilized to analyze the components of 115 federal district court opinions involving age discrimination disparate treatment allegations and organizational downsizing. Univariate analyses revealed meaningful links between RIF personnel practices, use of statistical evidence, and judicial verdict. The defendant organization was awarded summary judgment in 73% of the claims included in the study. Judicial decisions in favor of the defendant organization were found to be significantly related to such variables as formal performance appraisal systems, termination decision review within the organization, methods of employee assessment and selection for termination, and the presence of a concrete layoff policy. The use of statistical evidence in ADEA disparate treatment litigation was investigated and found to be a potentially persuasive type of indirect evidence. Legal, personnel, and evidentiary ramifications are reviewed, and a framework of downsizing mechanics emphasizing legal defensibility is presented.
Grotjahn, Richard; Black, Robert; Leung, Ruby; ...
2015-05-22
This paper reviews research approaches and open questions regarding data, statistical analyses, dynamics, modeling efforts, and trends in relation to temperature extremes. Our specific focus is upon extreme events of short duration (roughly less than 5 days) that affect parts of North America. These events are associated with large-scale meteorological patterns (LSMPs). Methods used to define extreme event statistics and to identify and connect LSMPs to extreme temperatures are presented. Recent advances in statistical techniques can connect LSMPs to extreme temperatures through appropriately defined covariates that supplement more straightforward analyses. A wide array of LSMPs, ranging from synoptic to planetary scale phenomena, have been implicated as contributors to extreme temperature events. Current knowledge about the physical nature of these contributions and the dynamical mechanisms leading to the implicated LSMPs is incomplete. There is a pressing need for (a) systematic study of the physics of LSMP life cycles and (b) comprehensive model assessment of LSMP-extreme temperature event linkages and LSMP behavior. Generally, climate models capture the observed heat waves and cold air outbreaks with some fidelity. However, they overestimate warm wave frequency, underestimate cold air outbreak frequency, and underestimate the collective influence of low-frequency modes on temperature extremes. Climate models have been used to investigate past changes and project future trends in extreme temperatures. Overall, modeling studies have identified important mechanisms such as the effects of large-scale circulation anomalies and land-atmosphere interactions on changes in extreme temperatures. However, few studies have examined changes in LSMPs more specifically to understand the role of LSMPs in past and future extreme temperature changes. Even though LSMPs are resolvable by global and regional climate models, they are not necessarily well simulated, so more research is needed to understand the limitations of climate models and to improve model skill in simulating extreme temperatures and their associated LSMPs. The paper concludes with unresolved issues and research questions.
Effect of tulle on the mechanical properties of a maxillofacial silicone elastomer.
Gunay, Yumushan; Kurtoglu, Cem; Atay, Arzu; Karayazgan, Banu; Gurbuz, Cihan Cem
2008-11-01
The purpose of this research was to investigate whether the physical properties of a maxillofacial silicone elastomer could be improved by incorporating a tulle reinforcement material. A-2186 silicone elastomer was used in this study. The study group consisted of 20 elastomer specimens incorporating tulle, fabricated as dumbbell-shaped silicone patterns according to ASTM D412 and D624 standards. The control group consisted of 20 elastomer specimens fabricated without tulle. Tensile strength, ultimate elongation, and tear strength of all specimens were measured and analyzed. Statistical analyses were performed using the Mann-Whitney U test with statistical significance at the 95% confidence level. It was found that the tensile and tear strengths of the tulle-incorporated maxillofacial silicone elastomer were higher than those without tulle incorporation (p < 0.05). Therefore, the findings of this study suggest that tulle successfully reinforced a maxillofacial silicone elastomer by providing it with better mechanical properties and augmented strength, especially for the delicate edges of maxillofacial prostheses.
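The Mann-Whitney U statistic used for the comparisons above can be computed by direct pair counting: U is the number of pairs in which a study-group value exceeds a control-group value, with ties counted as one half. A minimal sketch (the p-value step, via tables or the normal approximation, is omitted):

```python
def mann_whitney_u(a, b):
    """Mann-Whitney U statistic for sample a versus sample b.

    Counts pairs (x in a, y in b) with x > y; ties contribute 0.5.
    U ranges from 0 to len(a)*len(b)."""
    u = 0.0
    for x in a:
        for y in b:
            if x > y:
                u += 1.0
            elif x == y:
                u += 0.5
    return u
```

Complete separation of the samples gives U = 0 or U = n·m; values near n·m/2 indicate overlapping distributions.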
When mechanism matters: Bayesian forecasting using models of ecological diffusion
Hefley, Trevor J.; Hooten, Mevin B.; Russell, Robin E.; Walsh, Daniel P.; Powell, James A.
2017-01-01
Ecological diffusion is a theory that can be used to understand and forecast spatio-temporal processes such as dispersal, invasion, and the spread of disease. Hierarchical Bayesian modelling provides a framework to make statistical inference and probabilistic forecasts, using mechanistic ecological models. To illustrate, we show how hierarchical Bayesian models of ecological diffusion can be implemented for large data sets that are distributed densely across space and time. The hierarchical Bayesian approach is used to understand and forecast the growth and geographic spread in the prevalence of chronic wasting disease in white-tailed deer (Odocoileus virginianus). We compare statistical inference and forecasts from our hierarchical Bayesian model to phenomenological regression-based methods that are commonly used to analyse spatial occurrence data. The mechanistic statistical model based on ecological diffusion led to important ecological insights, obviated a commonly ignored type of collinearity, and was the most accurate method for forecasting.
Modeling Cross-Situational Word–Referent Learning: Prior Questions
Yu, Chen; Smith, Linda B.
2013-01-01
Both adults and young children possess powerful statistical computation capabilities—they can infer the referent of a word from highly ambiguous contexts involving many words and many referents by aggregating cross-situational statistical information across contexts. This ability has been explained by models of hypothesis testing and by models of associative learning. This article describes a series of simulation studies and analyses designed to understand the different learning mechanisms posited by the 2 classes of models and their relation to each other. Variants of a hypothesis-testing model and a simple or dumb associative mechanism were examined under different specifications of information selection, computation, and decision. Critically, these 3 components of the models interact in complex ways. The models illustrate a fundamental tradeoff between amount of data input and powerful computations: With the selection of more information, dumb associative models can mimic the powerful learning that is accomplished by hypothesis-testing models with fewer data. However, because of the interactions among the component parts of the models, the associative model can mimic various hypothesis-testing models, producing the same learning patterns but through different internal components. The simulations argue for the importance of a compositional approach to human statistical learning: the experimental decomposition of the processes that contribute to statistical learning in human learners and models with the internal components that can be evaluated independently and together. PMID:22229490
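The "dumb" associative mechanism contrasted with hypothesis testing above can be sketched as a co-occurrence counter: every word in a scene is associated with every candidate referent, and the learned mapping is simply the referent with the highest accumulated count. The scene format here is a hypothetical simplification, not the article's simulation code:

```python
def train_associative(scenes):
    """Accumulate word-referent co-occurrence counts across ambiguous
    scenes, each given as (list_of_words, list_of_referents)."""
    counts = {}
    for words, referents in scenes:
        for w in words:
            for r in referents:
                counts[(w, r)] = counts.get((w, r), 0) + 1
    return counts

def lookup(counts, word):
    """Learned referent of a word: the one with the highest count."""
    cands = {r: c for (w, r), c in counts.items() if w == word}
    return max(cands, key=cands.get)
```

Even though each individual scene is ambiguous, the correct pairings co-occur in every scene while spurious pairings do not, so the counts disambiguate after only a few scenes; this is the cross-situational aggregation the article's models formalize.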
DMINDA: an integrated web server for DNA motif identification and analyses
Ma, Qin; Zhang, Hanyuan; Mao, Xizeng; Zhou, Chuan; Liu, Bingqiang; Chen, Xin; Xu, Ying
2014-01-01
DMINDA (DNA motif identification and analyses) is an integrated web server for DNA motif identification and analyses, which is accessible at http://csbl.bmb.uga.edu/DMINDA/. This web site is freely available to all users and there is no login requirement. This server provides a suite of cis-regulatory motif analysis functions on DNA sequences, which are important to elucidation of the mechanisms of transcriptional regulation: (i) de novo motif finding for a given set of promoter sequences along with statistical scores for the predicted motifs derived based on information extracted from a control set, (ii) scanning motif instances of a query motif in provided genomic sequences, (iii) motif comparison and clustering of identified motifs, and (iv) co-occurrence analyses of query motifs in given promoter sequences. The server is powered by a backend computer cluster with over 150 computing nodes, and is particularly useful for motif prediction and analyses in prokaryotic genomes. We believe that DMINDA, as a new and comprehensive web server for cis-regulatory motif finding and analyses, will benefit the genomic research community in general and prokaryotic genome researchers in particular. PMID:24753419
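Motif scanning of the kind offered by function (ii) above can be sketched as log-odds scoring of a position weight matrix (PWM) against each window of a sequence. This is an illustrative sketch, not DMINDA's implementation; it assumes a uniform background and a PWM already pseudocounted so that no probability is zero:

```python
import math

def scan_motif(seq, pwm, background=0.25):
    """Score every window of a DNA sequence against a PWM.

    pwm : list of dicts mapping base -> probability, one per motif column
    Returns a list of (start_position, log2-odds score) pairs; high
    scores mark likely motif instances."""
    w = len(pwm)
    hits = []
    for i in range(len(seq) - w + 1):
        score = sum(math.log2(pwm[j][seq[i + j]] / background)
                    for j in range(w))
        hits.append((i, score))
    return hits
```

In practice a score threshold (or an empirical p-value from a control set, as the server computes) decides which windows count as instances.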
Statistical modeling implicates neuroanatomical circuit mediating stress relief by ‘comfort’ food
Ulrich-Lai, Yvonne M.; Christiansen, Anne M.; Wang, Xia; Song, Seongho; Herman, James P.
2015-01-01
A history of eating highly palatable foods reduces physiological and emotional responses to stress. For instance, we have previously shown that limited sucrose intake (4 ml of 30% sucrose twice daily for 14 days) reduces hypothalamic-pituitary-adrenocortical (HPA) axis responses to stress. However, the neural mechanisms underlying stress relief by such ‘comfort’ foods are unclear, and could reveal an endogenous brain pathway for stress mitigation. As such, the present work assessed the expression of several proteins related to neuronal activation and/or plasticity in multiple stress- and reward-regulatory brain regions of rats after limited sucrose (vs. water control) intake. These data were then subjected to a series of statistical analyses, including Bayesian modeling, to identify the most likely neurocircuit mediating stress relief by sucrose. The analyses suggest that sucrose reduces HPA activation by dampening an excitatory basolateral amygdala-medial amygdala circuit, while also potentiating an inhibitory circuit mediated by the principal subdivision of the bed nucleus of the stria terminalis, resulting in reduced HPA activation after stress. Collectively, the results support the hypothesis that sucrose limits stress responses via plastic changes to the structure and function of stress-regulatory neural circuits. The work also illustrates that advanced statistical methods are useful approaches to identify potentially novel and important underlying relationships in biological data sets. PMID:26246177
Statistical modeling implicates neuroanatomical circuit mediating stress relief by 'comfort' food.
Ulrich-Lai, Yvonne M; Christiansen, Anne M; Wang, Xia; Song, Seongho; Herman, James P
2016-07-01
A history of eating highly palatable foods reduces physiological and emotional responses to stress. For instance, we have previously shown that limited sucrose intake (4 ml of 30% sucrose twice daily for 14 days) reduces hypothalamic-pituitary-adrenocortical (HPA) axis responses to stress. However, the neural mechanisms underlying stress relief by such 'comfort' foods are unclear, and could reveal an endogenous brain pathway for stress mitigation. As such, the present work assessed the expression of several proteins related to neuronal activation and/or plasticity in multiple stress- and reward-regulatory brain regions of rats after limited sucrose (vs. water control) intake. These data were then subjected to a series of statistical analyses, including Bayesian modeling, to identify the most likely neurocircuit mediating stress relief by sucrose. The analyses suggest that sucrose reduces HPA activation by dampening an excitatory basolateral amygdala-medial amygdala circuit, while also potentiating an inhibitory circuit mediated by the principal subdivision of the bed nucleus of the stria terminalis, resulting in reduced HPA activation after stress. Collectively, the results support the hypothesis that sucrose limits stress responses via plastic changes to the structure and function of stress-regulatory neural circuits. The work also illustrates that advanced statistical methods are useful approaches to identify potentially novel and important underlying relationships in biological datasets.
"What If" Analyses: Ways to Interpret Statistical Significance Test Results Using EXCEL or "R"
ERIC Educational Resources Information Center
Ozturk, Elif
2012-01-01
The present paper aims to review two motivations to conduct "what if" analyses using Excel and "R" to understand the statistical significance tests through the sample size context. "What if" analyses can be used to teach students what statistical significance tests really do and in applied research either prospectively to estimate what sample size…
ON THE DYNAMICAL DERIVATION OF EQUILIBRIUM STATISTICAL MECHANICS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prigogine, I.; Balescu, R.; Henin, F.
1960-12-01
Work on nonequilibrium statistical mechanics, which allows an extension of the kinetic proof to all results of equilibrium statistical mechanics involving a finite number of degrees of freedom, is summarized. As an introduction to the general N-body problem, the scattering theory in classical mechanics is considered. The general N-body problem is considered for the case of classical mechanics, quantum mechanics with Boltzmann statistics, and quantum mechanics including quantum statistics. Six basic diagrams, which describe the elementary processes of the dynamics of correlations, were obtained. (M.C.G.)
Arizpe, Joseph; Kravitz, Dwight J; Walsh, Vincent; Yovel, Galit; Baker, Chris I
2016-01-01
The Other-Race Effect (ORE) is the robust and well-established finding that people are generally poorer at facial recognition of individuals of another race than of their own race. Over the past four decades, much research has focused on the ORE because understanding this phenomenon is expected to elucidate fundamental face processing mechanisms and the influence of experience on such mechanisms. Several recent studies of the ORE in which the eye-movements of participants viewing own- and other-race faces were tracked have, however, reported highly conflicting results regarding the presence or absence of differential patterns of eye-movements to own- versus other-race faces. This discrepancy, of course, leads to conflicting theoretical interpretations of the perceptual basis for the ORE. Here we investigate fixation patterns to own- versus other-race (African and Chinese) faces for Caucasian participants using different analysis methods. While we detect statistically significant, though subtle, differences in fixation pattern using an Area of Interest (AOI) approach, we fail to detect significant differences when applying a spatial density map approach. Though there were no significant differences in the spatial density maps, the qualitative patterns matched the results of the AOI analyses, reflecting how, in certain contexts, AOI analyses can be more sensitive in detecting differential fixation patterns than spatial density analyses, due to the spatial pooling of data within AOIs. AOI analyses, however, also come with the limitation of requiring a priori specification. These findings provide evidence that the conflicting reports in the prior literature may be at least partially accounted for by differences in the statistical sensitivity associated with the different analysis methods employed across studies. Overall, our results suggest that detection of differences in eye-movement patterns can be analysis-dependent and rests on the assumptions inherent in the given analysis.
The mechanics of state dependent neural correlations
Doiron, Brent; Litwin-Kumar, Ashok; Rosenbaum, Robert; Ocker, Gabriel K.; Josić, Krešimir
2016-01-01
Simultaneous recordings from large neural populations are becoming increasingly common. An important feature of the population activity is the trial-to-trial correlated fluctuations of the spike train outputs of recorded neuron pairs. Like the firing rate of single neurons, correlated activity can be modulated by a number of factors, from changes in arousal and attentional state to learning and task engagement. However, the network mechanisms that underlie these changes are not fully understood. We review recent theoretical results that identify three separate biophysical mechanisms that modulate spike train correlations: changes in input correlations, internal fluctuations, and the transfer function of single neurons. We first examine these mechanisms in feedforward pathways, and then show how the same approach can explain the modulation of correlations in recurrent networks. Such mechanistic constraints on the modulation of population activity will be important in statistical analyses of high dimensional neural data. PMID:26906505
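The trial-to-trial correlated fluctuations discussed above are typically quantified as the Pearson correlation of two neurons' spike counts across repeated trials (the "noise correlation"). A minimal sketch of that computation (illustrative, not from the review):

```python
import math

def spike_count_correlation(counts_a, counts_b):
    """Pearson correlation of two neurons' spike counts across trials.

    counts_a, counts_b : per-trial spike counts for the two neurons.
    Returns a value in [-1, 1]; near 0 means independent fluctuations."""
    n = len(counts_a)
    ma = sum(counts_a) / n
    mb = sum(counts_b) / n
    cov = sum((a - ma) * (b - mb)
              for a, b in zip(counts_a, counts_b)) / (n - 1)
    va = sum((a - ma) ** 2 for a in counts_a) / (n - 1)
    vb = sum((b - mb) ** 2 for b in counts_b) / (n - 1)
    return cov / math.sqrt(va * vb)
```

The mechanisms reviewed above (input correlations, internal fluctuations, single-neuron transfer) all act by changing this quantity's dependence on network state.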
NASA Astrophysics Data System (ADS)
Coletta, Vincent P.; Phillips, Jeffrey A.; Savinainen, Antti; Steinert, Jeffrey J.
2008-09-01
In a recent article, Ates and Cataloglu (2007 Eur. J. Phys. 28 1161-71), in analysing results for a course in introductory mechanics for prospective science teachers, found no statistically significant correlation between students' pre-instruction scores on the Lawson classroom test of scientific reasoning ability (CTSR) and post-instruction scores on the force concept inventory (FCI). As a possible explanation, the authors suggest that the FCI does not probe for skills required to determine reasoning abilities. Our previously published research directly contradicts the authors' finding. We summarize our research and present a likely explanation for their observation of no correlation.
Bi, Xiaohong; Grafe, Ingo; Ding, Hao; Flores, Rene; Munivez, Elda; Jiang, Ming Ming; Dawson, Brian; Lee, Brendan; Ambrose, Catherine G
2017-02-01
Osteogenesis imperfecta (OI) is a group of genetic disorders characterized by brittle bones that are prone to fracture. Although previous studies in animal models investigated the mechanical properties and material composition of OI bone, little work has been conducted to statistically correlate these parameters to identify key compositional contributors to the impaired bone mechanical behaviors in OI. Further, although increased TGF-β signaling has been demonstrated as a contributing mechanism to the bone pathology in OI models, the relationship between mechanical properties and bone composition after anti-TGF-β treatment in OI has not been studied. Here, we performed follow-up analyses of femurs collected in an earlier study from OI mice with and without anti-TGF-β treatment from both recessive (Crtap-/-) and dominant (Col1a2+/P.G610C) OI mouse models and WT mice. Mechanical properties were determined using three-point bending tests and evaluated for statistical correlation with molecular composition in bone tissue assessed by Raman spectroscopy. Statistical regression analysis was conducted to determine significant compositional determinants of mechanical integrity. Interestingly, we found differences in the relationships between bone composition and mechanical properties and in the response to anti-TGF-β treatment. Femurs of both OI models exhibited increased brittleness, which was associated with reduced collagen content and carbonate substitution. In the Col1a2+/P.G610C femurs, reduced hydroxyapatite crystallinity was also found to be associated with increased brittleness, and increased mineral-to-collagen ratio was correlated with increased ultimate strength, elastic modulus, and bone brittleness. In both models of OI, regression analysis demonstrated that collagen content was an important predictor of the increased brittleness.
In summary, this work provides new insights into the relationships between bone composition and material properties in models of OI, identifies key bone compositional parameters that correlate with the impaired mechanical integrity of OI bone, and explores the effects of anti-TGF-β treatment on bone-quality parameters in these models. © 2016 American Society for Bone and Mineral Research.
NASA Astrophysics Data System (ADS)
Pignalosa, Antonio; Di Crescenzo, Giuseppe; Marino, Ermanno; Terracciano, Rosario; Santo, Antonio
2015-04-01
The work presented here concerns a case study in which a complete multidisciplinary workflow has been applied for an extensive assessment of rockslide susceptibility and hazard in a common scenario: vertical, fractured rocky cliffs. The studied area is located in a high-relief zone in Southern Italy (Sacco, Salerno, Campania), characterized by wide vertical rocky cliffs formed by tectonized thick successions of shallow-water limestones. The study comprised the following phases: a) topographic surveying integrating 3D laser scanning, photogrammetry and GNSS; b) geological surveying, characterization of single instabilities and geomechanical surveying, conducted by geologist rock climbers; c) processing of 3D data and reconstruction of high-resolution geometrical models; d) structural and geomechanical analyses; e) data filing in a GIS-based spatial database; f) geo-statistical and spatial analyses and mapping of the whole set of data; g) 3D rockfall analysis. The main goals of the study have been a) to set up an investigation method achieving a complete and thorough characterization of the slope stability conditions and b) to provide a detailed basis for an accurate definition of the reinforcement and mitigation systems. For these purposes, the most up-to-date methods of field surveying, remote sensing, 3D modelling and geospatial data analysis have been integrated in a systematic workflow, accounting for the economic sustainability of the whole project. A novel integrated approach has been applied, fusing deterministic and statistical surveying methods. This approach made it possible to deal with the wide extension of the studied area (nearly 200,000 m2) without compromising the accuracy of the results. The deterministic phase, based on a field characterization of single instabilities and their further analysis on 3D models, has been applied to delineate the peculiarities of each single feature.
The statistical approach, based on geostructural field mapping and on point geomechanical data from scan-line surveying, allowed the partitioning of the rock mass into homogeneous geomechanical sectors and data interpolation through bounded geostatistical analyses on 3D models. All data resulting from both approaches have been referenced and filed in a single spatial database and considered in global geo-statistical analyses to derive a fully modelled and comprehensive evaluation of the rockslide susceptibility. The described workflow yielded the following innovative results: a) a detailed census of single potential instabilities, through a spatial database recording the geometrical, geological and mechanical features along with the expected failure modes; b) a high-resolution characterization of the rockslide susceptibility of the whole slope, based on the partitioning of the area according to stability and mechanical conditions that can be directly related to specific hazard mitigation systems; c) the exact extension of the area exposed to rockslide hazard, along with the dynamic parameters of the expected phenomena; d) an intervention design for hazard mitigation.
Pereira, G K R; Fraga, S; Montagner, A F; Soares, F Z M; Kleverlaan, C J; Valandro, L F
2016-10-01
The aim of this study was to systematically review the literature to assess the effect of grinding on the mechanical properties, structural stability and superficial characteristics of Y-TZP ceramics. The MEDLINE (via PubMed) and Web of Science (ISI Web of Knowledge) electronic databases were searched, including peer-reviewed publications in the English language with no publication-year limit. From 342 potentially eligible studies, 73 were selected for full-text analysis, 30 were included in the systematic review and 20 were considered in the meta-analysis. Two reviewers independently selected the studies, extracted the data, and assessed the risk of bias. Statistical analyses were performed using RevMan 5.1, with a random effects model, at a significance level of 0.05. A descriptive analysis considering phase transformation, Y-TZP grain size, Vickers hardness, residual stress and aging of all included studies was performed. Four outcomes were considered in the meta-analyses (factor: grinding vs. as-sintered) in global and subgroup analyses (grinding tool, grit size and cooling) for flexural strength and roughness (Ra) data. A significant difference (p<0.05) was observed in the global analysis for strength, favoring as-sintered; subgroup analyses revealed that different parameters lead to different effects on strength. In the global analysis for roughness, a significant difference (p<0.05) was observed between conditions, favoring grinding; subgroup analyses revealed that different parameters also lead to different effects on roughness. High heterogeneity was found in some comparisons. In general, grinding decreases the strength and increases the roughness of Y-TZP ceramics. However, the use of a grinding tool that allows greater accuracy of movement (i.e. contra-angle handpieces coupled to slow-speed turbines), a small grit size (<50 μm) and plentiful coolant seem to be the main factors that limit defect introduction and allow the toughening transformation mechanism to occur, decreasing the risk of a deleterious impact on Y-TZP mechanical properties. Copyright © 2016 Elsevier Ltd. All rights reserved.
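The abstract reports random-effects pooling performed in RevMan 5.1. As an illustration only (a minimal sketch, not the authors' RevMan workflow; the function name is hypothetical), the DerSimonian-Laird random-effects estimator that underlies such analyses can be written as:

```python
from math import sqrt

def dersimonian_laird(effects, variances):
    """Pool per-study effect sizes with the DerSimonian-Laird
    random-effects model; returns (pooled effect, SE, tau^2)."""
    w = [1.0 / v for v in variances]                            # inverse-variance weights
    fe = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)    # fixed-effect mean
    q = sum(wi * (yi - fe) ** 2 for wi, yi in zip(w, effects))  # Cochran's Q
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)               # between-study variance
    w_re = [1.0 / (v + tau2) for v in variances]                # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_re, effects)) / sum(w_re)
    return pooled, sqrt(1.0 / sum(w_re)), tau2
```

When the studies agree exactly, Q falls below its degrees of freedom, tau^2 is truncated to zero, and the estimate reduces to the fixed-effect mean.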
Statistical Data Analyses of Trace Chemical, Biochemical, and Physical Analytical Signatures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Udey, Ruth Norma
Analytical and bioanalytical chemistry measurement results are most meaningful when interpreted using rigorous statistical treatments of the data. The same data set may provide many dimensions of information depending on the questions asked through the applied statistical methods. Three principal projects illustrated the wealth of information gained through the application of statistical data analyses to diverse problems.
Statistical learning of novel graphotactic constraints in children and adults.
Samara, Anna; Caravolas, Markéta
2014-05-01
The current study explored statistical learning processes in the acquisition of orthographic knowledge in school-aged children and skilled adults. Learning of novel graphotactic constraints on the position and context of letter distributions was induced by means of a two-phase learning task adapted from Onishi, Chambers, and Fisher (Cognition, 83 (2002) B13-B23). Following incidental exposure to pattern-embedding stimuli in Phase 1, participants' learning generalization was tested in Phase 2 with legality judgments about novel conforming/nonconforming word-like strings. Test phase performance was above chance, suggesting that both types of constraints were reliably learned even after relatively brief exposure. As hypothesized, signal detection theory d' analyses confirmed that learning permissible letter positions (d'=0.97) was easier than permissible neighboring letter contexts (d'=0.19). Adults were more accurate than children in all but a strict analysis of the contextual constraints condition. Consistent with the statistical learning perspective in literacy, our results suggest that statistical learning mechanisms contribute to children's and adults' acquisition of knowledge about graphotactic constraints similar to those existing in their orthography. Copyright © 2013 Elsevier Inc. All rights reserved.
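The d' values above come from standard signal detection theory. A minimal sketch of how d' is conventionally computed from hit and false-alarm counts (the 1/(2N) clamping correction and function name are illustrative assumptions, not the authors' exact procedure):

```python
from statistics import NormalDist

def d_prime(hits, misses, false_alarms, correct_rejections):
    """d' = z(hit rate) - z(false-alarm rate), with a 1/(2N)
    correction so rates of exactly 0 or 1 remain invertible."""
    z = NormalDist().inv_cdf
    n_sig = hits + misses
    n_noise = false_alarms + correct_rejections
    h = min(max(hits / n_sig, 0.5 / n_sig), 1 - 0.5 / n_sig)
    fa = min(max(false_alarms / n_noise, 0.5 / n_noise), 1 - 0.5 / n_noise)
    return z(h) - z(fa)
```

At chance (equal hit and false-alarm rates) d' is 0; better discrimination pushes the two rates apart and d' upward.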
Hart, Corey B.; Giszter, Simon F.
2013-01-01
We present and apply a method that uses point process statistics to discriminate the forms of synergies in motor pattern data, prior to explicit synergy extraction. The method uses electromyogram (EMG) pulse peak timing or onset timing. Peak timing is preferable in complex patterns where pulse onsets may be overlapping. An interval statistic derived from the point processes of EMG peak timings distinguishes time-varying synergies from synchronous synergies (SS). Model data show that the statistic is robust under most conditions. Its application to both frog hindlimb EMG and rat locomotion hindlimb EMG shows that data from these preparations are clearly most consistent with synchronous synergy models (p < 0.001). Additional direct tests of pulse and interval relations in the frog data further bolster the support for synchronous synergy mechanisms. Our method and analyses support separated control of rhythm and pattern of motor primitives, with the low-level execution primitives comprising pulsed SS in both frog and rat, in both episodic and rhythmic behaviors. PMID:23675341
Modelling multiple sources of dissemination bias in meta-analysis.
Bowden, Jack; Jackson, Dan; Thompson, Simon G
2010-03-30
Asymmetry in the funnel plot for a meta-analysis suggests the presence of dissemination bias. This may be caused by publication bias through the decisions of journal editors, by selective reporting of research results by authors or by a combination of both. Typically, study results that are statistically significant or have larger estimated effect sizes are more likely to appear in the published literature, hence giving a biased picture of the evidence-base. Previous statistical approaches for addressing dissemination bias have assumed only a single selection mechanism. Here we consider a more realistic scenario in which multiple dissemination processes, involving both the publishing authors and journals, are operating. In practical applications, the methods can be used to provide sensitivity analyses for the potential effects of multiple dissemination biases operating in meta-analysis.
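A common single-mechanism diagnostic for the funnel-plot asymmetry described above is Egger's regression, sketched here for orientation (the paper's own multiple-dissemination models are more elaborate; this is not their method):

```python
def egger_test(effects, std_errors):
    """Egger's regression: standardized effect (y/se) against precision
    (1/se); an intercept far from zero signals funnel-plot asymmetry."""
    y = [e / s for e, s in zip(effects, std_errors)]
    x = [1.0 / s for s in std_errors]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx             # estimate of the underlying effect
    intercept = my - slope * mx   # asymmetry (small-study bias) term
    return intercept, slope
```

With a constant true effect and no selection, the regression passes through the origin and the slope recovers the effect size.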
A canonical neural mechanism for behavioral variability
NASA Astrophysics Data System (ADS)
Darshan, Ran; Wood, William E.; Peters, Susan; Leblois, Arthur; Hansel, David
2017-05-01
The ability to generate variable movements is essential for learning and adjusting complex behaviours. This variability has been linked to the temporal irregularity of neuronal activity in the central nervous system. However, how neuronal irregularity actually translates into behavioural variability is unclear. Here we combine modelling, electrophysiological and behavioural studies to address this issue. We demonstrate that a model circuit comprising topographically organized and strongly recurrent neural networks can autonomously generate irregular motor behaviours. Simultaneous recordings of neurons in singing finches reveal that neural correlations increase across the circuit driving song variability, in agreement with the model predictions. Analysing behavioural data, we find remarkable similarities in the babbling statistics of 5-6-month-old human infants and juveniles from three songbird species and show that our model naturally accounts for these 'universal' statistics.
Quantum-mechanical analysis of low-gain free-electron laser oscillators
NASA Astrophysics Data System (ADS)
Fares, H.; Yamada, M.; Chiadroni, E.; Ferrario, M.
2018-05-01
In the previous classical theory of low-gain free-electron laser (FEL) oscillators, the electron is described as a point-like particle, a delta function in position space. In the previous quantum treatments, on the other hand, the electron is described as a plane wave with a single momentum state, a delta function in momentum space. In reality, an electron must have statistical uncertainties in both the position and momentum domains; it is neither a point-like charge nor a plane wave of a single momentum. In this paper, we rephrase the theory of the low-gain FEL with the interacting electron represented quantum mechanically by a plane wave with a finite spreading length (i.e., a wave packet). Using the concepts of the transformation of reference frames and statistical quantum mechanics, an expression for the single-pass radiation gain is derived. The spectral broadening of the radiation is expressed in terms of the spreading length of an electron, the relaxation time characterizing the energy spread of electrons, and the interaction time. We compare our results with those obtained in the known classical analyses and find good agreement. Beyond this correspondence, novel insights into the electron dynamics and the interaction mechanism are presented.
Online incidental statistical learning of audiovisual word sequences in adults: a registered report.
Kuppuraj, Sengottuvel; Duta, Mihaela; Thompson, Paul; Bishop, Dorothy
2018-02-01
Statistical learning has been proposed as a key mechanism in language learning. Our main goal was to examine whether adults are capable of simultaneously extracting statistical dependencies in a task where stimuli include a range of structures amenable to statistical learning within a single paradigm. We devised an online statistical learning task using real word auditory-picture sequences that vary in two dimensions: (i) predictability and (ii) adjacency of dependent elements. This task was followed by an offline recall task to probe learning of each sequence type. We registered three hypotheses with specific predictions. First, adults would extract regular patterns from continuous stream (effect of grammaticality). Second, within grammatical conditions, they would show differential speeding up for each condition as a factor of statistical complexity of the condition and exposure. Third, our novel approach to measure online statistical learning would be reliable in showing individual differences in statistical learning ability. Further, we explored the relation between statistical learning and a measure of verbal short-term memory (STM). Forty-two participants were tested and retested after an interval of at least 3 days on our novel statistical learning task. We analysed the reaction time data using a novel regression discontinuity approach. Consistent with prediction, participants showed a grammaticality effect, agreeing with the predicted order of difficulty for learning different statistical structures. Furthermore, a learning index from the task showed acceptable test-retest reliability (r = 0.67). However, STM did not correlate with statistical learning. We discuss the findings noting the benefits of online measures in tracking the learning process.
DMINDA: an integrated web server for DNA motif identification and analyses.
Ma, Qin; Zhang, Hanyuan; Mao, Xizeng; Zhou, Chuan; Liu, Bingqiang; Chen, Xin; Xu, Ying
2014-07-01
DMINDA (DNA motif identification and analyses) is an integrated web server for DNA motif identification and analyses, which is accessible at http://csbl.bmb.uga.edu/DMINDA/. This web site is freely available to all users and there is no login requirement. This server provides a suite of cis-regulatory motif analysis functions on DNA sequences, which are important to elucidation of the mechanisms of transcriptional regulation: (i) de novo motif finding for a given set of promoter sequences along with statistical scores for the predicted motifs derived based on information extracted from a control set, (ii) scanning motif instances of a query motif in provided genomic sequences, (iii) motif comparison and clustering of identified motifs, and (iv) co-occurrence analyses of query motifs in given promoter sequences. The server is powered by a backend computer cluster with over 150 computing nodes, and is particularly useful for motif prediction and analyses in prokaryotic genomes. We believe that DMINDA, as a new and comprehensive web server for cis-regulatory motif finding and analyses, will benefit the genomic research community in general and prokaryotic genome researchers in particular. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.
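The server's internal algorithms are not described in this record; the motif-scanning function (ii) can, however, be illustrated with a generic log-odds position-weight-matrix (PWM) scan, a minimal sketch whose names, scoring scheme, and parameters are all assumptions rather than DMINDA's actual implementation:

```python
from math import log2

def scan_pwm(sequence, pwm, background=0.25, threshold=0.0):
    """Slide a position weight matrix over a DNA sequence and report
    start positions whose log-odds score exceeds the threshold.
    `pwm` is a list of dicts mapping each base to its probability."""
    hits = []
    w = len(pwm)
    for i in range(len(sequence) - w + 1):
        score = sum(log2(pwm[j].get(sequence[i + j], 1e-9) / background)
                    for j in range(w))
        if score > threshold:
            hits.append((i, round(score, 3)))
    return hits
```

Scoring against a uniform background, only windows that match the matrix well clear a zero log-odds threshold.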
Early Warning Signs of Suicide in Service Members Who Engage in Unauthorized Acts of Violence
2016-06-01
observable to military law enforcement personnel. Statistical analyses tested for differences in warning signs between cases of suicide, violence, or...indicators, (2) Behavioral Change indicators, (3) Social indicators, and (4) Occupational indicators. Statistical analyses were conducted to test for...
Generalized statistical mechanics of cosmic rays: Application to positron-electron spectral indices.
Yalcin, G Cigdem; Beck, Christian
2018-01-29
Cosmic ray energy spectra exhibit power law distributions over many orders of magnitude that are very well described by the predictions of q-generalized statistical mechanics, based on a q-generalized Hagedorn theory for transverse momentum spectra and hard QCD scattering processes. QCD at the largest center-of-mass energies predicts the entropic index to be [Formula: see text]. Here we show that the escort duality of the nonextensive thermodynamic formalism predicts an energy split of the effective temperature given by Δ[Formula: see text] MeV, where T_H is the Hagedorn temperature. We carefully analyse the measured data of the AMS-02 collaboration and provide evidence that the predicted temperature split is indeed observed, leading to a different energy dependence of the e+ and e- spectral indices. We also observe a distinguished energy scale E* ≈ 50 GeV where the e+ and e- spectral indices differ the most. Linear combinations of the escort and non-escort q-generalized canonical distributions yield excellent agreement with the measured AMS-02 data over the entire energy range.
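The paper's specific formulas are elided in this record ("[Formula: see text]"). For orientation only, the generic Tsallis q-exponential underlying such q-generalized spectra, whose high-energy tail is a power law with logarithmic slope -1/(q-1), can be sketched as follows (an illustration of the formalism, not the paper's fitted distribution; the example value q = 11/9 is an assumption):

```python
from math import log

def q_exponential(E, T, q):
    """Tsallis q-exponential (1 + (q-1)E/T)^(-1/(q-1)); for E >> T
    it decays as the power law E^(-1/(q-1))."""
    return (1.0 + (q - 1.0) * E / T) ** (-1.0 / (q - 1.0))

def log_slope(f, E, h=1e-3):
    """Numerical logarithmic derivative d(log f)/d(log E) at E."""
    return (log(f(E * (1 + h))) - log(f(E))) / (log(E * (1 + h)) - log(E))
```

For q = 11/9 the asymptotic spectral index is 1/(q-1) = 4.5, i.e. the distribution falls off as E^-4.5 far above the temperature scale.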
[Statistical analysis using freely-available "EZR (Easy R)" software].
Kanda, Yoshinobu
2015-10-01
Clinicians must often perform statistical analyses for purposes such as evaluating preexisting evidence and designing or executing clinical studies. R is a free software environment for statistical computing. R supports many statistical analysis functions but does not incorporate a statistical graphical user interface (GUI). The R commander provides an easy-to-use basic-statistics GUI for R. However, the statistical functionality of the R commander is limited, especially in the field of biostatistics. Therefore, the author added several important statistical functions to the R commander and named the result "EZR (Easy R)", which is now being distributed from the following website: http://www.jichi.ac.jp/saitama-sct/. EZR allows point-and-click application of statistical functions that are frequently used in clinical studies, such as survival analyses, including competing-risk analyses and the use of time-dependent covariates. In addition, by saving the script automatically created by EZR, users can learn R script writing, maintain the traceability of the analysis, and assure that the statistical process is overseen by a supervisor.
Generalized Models for Rock Joint Surface Shapes
Du, Shigui; Hu, Yunjin; Hu, Xiaofei
2014-01-01
Generalized models of joint surface shapes are the foundation for mechanism studies on the mechanical effects of rock joint surface shapes. Based on extensive field investigations of rock joint surface shapes, generalized models for three levels of shape, named macroscopic outline, surface undulating shape, and microcosmic roughness, were established through statistical analyses of 20,078 rock joint surface profiles. The relative amplitude of the profile curves was used as the borderline for the division of the different levels of shape. The results show that the macroscopic outline has three basic features: planar, arc-shaped, and stepped; the surface undulating shape has three basic features: planar, undulating, and stepped; and the microcosmic roughness has two basic features: smooth and rough. PMID:25152901
Groundwater monitoring in the Savannah River Plant Low Level Waste Burial Ground
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carlton, W.H.
1983-12-31
This document describes chemical mechanisms that may affect trace-level radionuclide migration through acidic sandy clay soils in a humid environment, and summarizes the extensive chemical and radiochemical analyses of the groundwater directly below the SRP Low-Level Waste (LLW) Burial Ground (643-G). Anomalies were identified in the chemistry of individual wells which appear to be related to small amounts of fission product activity that have reached the water table. The chemical properties that were statistically related to trace-level transport of Cs-137 and Sr-90 were iron, potassium, sodium and calcium. Concentrations on the order of 100 ppm appear sufficient to affect nuclide migration. Several complexation mechanisms for plutonium migration were investigated.
Design solutions for the solar cell interconnect fatigue fracture problem
NASA Technical Reports Server (NTRS)
Mon, G. R.; Ross, R. G., Jr.
1982-01-01
Mechanical fatigue of solar cell interconnects is a major failure mechanism in photovoltaic arrays. A comprehensive approach to the reliability design of interconnects, together with extensive design data for the fatigue properties of copper interconnects, has been published. This paper extends the previous work, developing failure prediction (fatigue) data for additional interconnect material choices, including aluminum and a variety of copper-Invar and copper-steel claddings. An improved global fatigue function is used to model the probability-of-failure statistics of each material as a function of the level and number of cycles of applied strain. Life-cycle economic analyses are used to evaluate the relative merits of each material choice. The copper-Invar clad composites demonstrate superior performance over pure copper. Aluminum results are disappointing.
Gautestad, Arild O
2013-03-01
The flow of GPS data on animal space use is challenging old paradigms, such as the issue of the scale-free Lévy walk versus scale-specific Brownian motion. Since these movement classes often require different protocols for ecological analyses, further theoretical development in this field is important. I describe central concepts such as scale-specific versus scale-free movement and the difference between mechanistic and statistical-mechanical levels of analysis. Next, I report how a specific sampling scheme may have produced much confusion: a Lévy walk may be wrongly categorized as Brownian motion if the duration of a move, or bout, is used as a proxy for step length and a move is subjectively defined. Hence, the categorization and recategorization of movement-class compliance surrounding the Lévy walk controversy may have been based on a statistical artifact. This issue may be avoided by collecting relocations at a fixed rate, at a temporal scale that minimizes over- and undersampling.
Statistical summaries of fatigue data for design purposes
NASA Technical Reports Server (NTRS)
Wirsching, P. H.
1983-01-01
Two methods are discussed for constructing a design curve on the safe side of fatigue data. Both the tolerance interval and equivalent prediction interval (EPI) concepts provide such a curve while accounting for both the distribution of the estimators in small samples and the data scatter. The EPI is also useful as a mechanism for providing the necessary statistics on S-N data for a full reliability analysis which includes uncertainty in all fatigue design factors. Examples of statistical analyses of the general strain-life relationship are presented. The tolerance limit and EPI techniques for defining a design curve are demonstrated. Examples using WASPALOY B and RQC-100 data demonstrate that a reliability model could be constructed by considering the fatigue strength and fatigue ductility coefficients as two independent random variables. A technique is given for establishing the fatigue strength at high-cycle lives that relies on extrapolation and also accounts for "runouts." A reliability model or design value can be specified.
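The tolerance-interval design curve described above can be sketched with Natrella's closed-form approximation to the one-sided normal tolerance factor (chosen here for self-containment; this is an approximation, not necessarily the report's exact noncentral-t computation, and the function names are illustrative):

```python
from statistics import NormalDist, mean, stdev

def one_sided_tolerance_factor(n, coverage=0.99, confidence=0.95):
    """Natrella's approximation to the one-sided normal tolerance
    factor k: mean - k*s bounds `coverage` of the population
    with probability `confidence`. Valid for moderate n."""
    z = NormalDist().inv_cdf
    zp, zc = z(coverage), z(confidence)
    a = 1.0 - zc ** 2 / (2.0 * (n - 1))
    b = zp ** 2 - zc ** 2 / n
    return (zp + (zp ** 2 - a * b) ** 0.5) / a

def design_value(data, coverage=0.99, confidence=0.95):
    """Safe-side design value: sample mean minus k sample stdevs."""
    k = one_sided_tolerance_factor(len(data), coverage, confidence)
    return mean(data) - k * stdev(data)
```

The factor k exceeds the plain normal quantile and shrinks toward it as the sample grows, which is exactly the small-sample estimator penalty the abstract describes.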
The Statistical Basis of Chemical Equilibria.
ERIC Educational Resources Information Center
Hauptmann, Siegfried; Menger, Eva
1978-01-01
Describes a machine which demonstrates the statistical basis of chemical equilibrium and, in doing so, conveys insight into the connections among statistical mechanics, quantum mechanics, Maxwell-Boltzmann statistics, statistical thermodynamics, and transition state theory. (GA)
Zhang, Harrison G; Ying, Gui-Shuang
2018-02-09
The aim of this study is to evaluate the current practice of statistical analysis of eye data in clinical science papers published in the British Journal of Ophthalmology (BJO) and to determine whether the practice of statistical analysis has improved in the past two decades. All clinical science papers (n=125) published in BJO in January-June 2017 were reviewed for their statistical approaches to analysing the primary ocular measure. We compared our findings to the results from a previous paper that reviewed BJO papers in 1995. Of 112 papers eligible for analysis, half of the studies analysed the data at an individual level because of the nature of the observations, 16 (14%) studies analysed data from one eye only, 36 (32%) studies analysed data from both eyes at the ocular level, one study (1%) analysed the overall summary of ocular findings per individual and three (3%) studies used paired comparisons. Among studies with data available from both eyes, 50 (89%) of 56 papers in 2017 did not analyse data from both eyes or ignored the intereye correlation, compared with 60 (90%) of 67 papers in 1995 (P=0.96). Among studies that analysed data from both eyes at an ocular level, 33 (92%) of 36 studies completely ignored the intereye correlation in 2017, compared with 16 (89%) of 18 studies in 1995 (P=0.40). A majority of studies did not analyse the data properly when data from both eyes were available. The practice of statistical analysis did not improve over the past two decades. Collaborative efforts should be made in the vision research community to improve the practice of statistical analysis for ocular data. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
Mukaka, Mavuto; White, Sarah A; Terlouw, Dianne J; Mwapasa, Victor; Kalilani-Phiri, Linda; Faragher, E Brian
2016-07-22
Missing outcomes can seriously impair the ability to make correct inferences from randomized controlled trials (RCTs). Complete case (CC) analysis is commonly used, but it reduces sample size and is perceived to lead to reduced statistical efficiency of estimates while increasing the potential for bias. As multiple imputation (MI) methods preserve sample size, they are generally viewed as the preferred analytical approach. We examined this assumption, comparing the performance of CC and MI methods to determine risk difference (RD) estimates in the presence of missing binary outcomes. We conducted simulation studies of 5000 data sets with 50 imputations of RCTs with one primary follow-up endpoint at different underlying levels of RD (3-25 %) and missing outcomes (5-30 %). For missing at random (MAR) or missing completely at random (MCAR) outcomes, CC method estimates generally remained unbiased and achieved precision similar to or better than MI methods, and high statistical coverage. Missing not at random (MNAR) scenarios yielded invalid inferences with both methods. Effect size estimate bias was reduced in MI methods by always including group membership even if this was unrelated to missingness. Surprisingly, under MAR and MCAR conditions in the assessed scenarios, MI offered no statistical advantage over CC methods. While MI must inherently accompany CC methods for intention-to-treat analyses, these findings endorse CC methods for per protocol risk difference analyses in these conditions. These findings provide an argument for the use of the CC approach to always complement MI analyses, with the usual caveat that the validity of the mechanism for missingness be thoroughly discussed. More importantly, researchers should strive to collect as much data as possible.
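The MCAR finding for complete-case estimates can be illustrated with a small simulation: binary outcomes are deleted completely at random and the complete-case risk difference is averaged over replicates (a minimal sketch; the parameters and function name below are arbitrary assumptions, not the paper's simulation design):

```python
import random

def complete_case_rd(n_per_arm, p_treat, p_ctrl, p_miss, n_sims, seed=1):
    """Average complete-case risk-difference estimate over simulated
    two-arm trials with binary outcomes missing completely at random."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_sims):
        rates = []
        for p in (p_treat, p_ctrl):
            outcomes = [rng.random() < p for _ in range(n_per_arm)]
            kept = [y for y in outcomes if rng.random() >= p_miss]  # MCAR deletion
            rates.append(sum(kept) / len(kept))
        total += rates[0] - rates[1]
    return total / n_sims
```

Because missingness is independent of the outcome, dropping incomplete cases shrinks the sample but leaves the expected risk difference unchanged, matching the unbiasedness the abstract reports.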
Using R-Project for Free Statistical Analysis in Extension Research
ERIC Educational Resources Information Center
Mangiafico, Salvatore S.
2013-01-01
One option for Extension professionals wishing to use free statistical software is to use online calculators, which are useful for common, simple analyses. A second option is to use a free computing environment capable of performing statistical analyses, like R-project. R-project is free, cross-platform, powerful, and respected, but may be…
Effect of curing mode on the micro-mechanical properties of dual-cured self-adhesive resin cements.
Ilie, Nicoleta; Simon, Alexander
2012-04-01
Light supply to luting resin cements is impeded in several clinical situations, raising the question of whether these materials can be properly cured to achieve adequate mechanical properties. The aim of this study was therefore to analyse the effect of light on the micro-mechanical properties of eight popular dual-cured self-adhesive resin cements by comparing them with two conventional, also dual-cured, resin cements. Four different curing procedures were applied: auto-polymerisation (dark curing) and light curing (LED unit, Freelight 2, 20 s) with the unit applied directly on the samples' surface and at distances of 5 and 10 mm. Twenty minutes after curing, the samples were stored for 1 week at 37°C in a water-saturated atmosphere. The micro-mechanical properties (Vickers hardness, modulus of elasticity, creep and elastic/plastic deformation) were measured. Data were analysed with multivariate ANOVA followed by Tukey's test and partial eta-squared statistics (p < 0.05). A very strong influence of the material as well as of filler volume and weight on the micro-mechanical properties was measured, whereas the influence of the curing procedure and of the type of cement (conventional or self-adhesive) was generally low. The influence of light on the polymerisation process was material dependent, with four distinguishable behaviour patterns. As a material category, significantly higher micro-mechanical properties were measured for the conventional than for the self-adhesive resin cements, although this difference was small. Within the self-adhesive resin cement group, the variation in micro-mechanical properties was high. Suitable resin cements should therefore be selected by considering, in addition to adhesive properties, their micro-mechanical properties and curing behaviour.
Satellite disintegration dynamics
NASA Technical Reports Server (NTRS)
Dasenbrock, R. R.; Kaufman, B.; Heard, W. B.
1975-01-01
The subject of satellite disintegration is examined in detail. Elements of the orbits of individual fragments, determined by DOD space surveillance systems, are used to accurately predict the time and place of fragmentation. Dual time-independent and time-dependent analyses are performed for simulated and real breakups. Methods of statistical mechanics are used to study the evolution of the fragment clouds. The fragments are treated as an ensemble of non-interacting particles. A solution of Liouville's equation is obtained which enables the spatial density to be calculated as a function of position, time and initial velocity distribution.
Cumulative sum control charts for assessing performance in arterial surgery.
Beiles, C Barry; Morton, Anthony P
2004-03-01
The Melbourne Vascular Surgical Association (Melbourne, Australia) undertakes surveillance of mortality following aortic aneurysm surgery, patency at discharge following infrainguinal bypass and stroke and death following carotid endarterectomy. Quality improvement protocol employing the Deming cycle requires that the system for performing surgery first be analysed and optimized. Then process and outcome data are collected and these data require careful analysis. There must be a mechanism so that the causes of unsatisfactory outcomes can be determined and a good feedback mechanism must exist so that good performance is acknowledged and unsatisfactory performance corrected. A simple method for analysing these data that detects changes in average outcome rates is available using cumulative sum statistical control charts. Data have been analysed both retrospectively from 1999 to 2001, and prospectively during 2002 using cumulative sum control methods. A pathway to deal with control chart signals has been developed. The standard of arterial surgery in Victoria, Australia, is high. In one case a safe and satisfactory outcome was achieved by following the pathway developed by the audit committee. Cumulative sum control charts are a simple and effective tool for the identification of variations in performance standards in arterial surgery. The establishment of a pathway to manage problem performance is a vital part of audit activity.
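The cumulative sum chart described above can be sketched in a few lines: the statistic accumulates observed-minus-acceptable outcomes and signals when it crosses a decision limit. This is a generic one-sided tabular CUSUM for a binary adverse outcome, with illustrative values for the acceptable rate p0, allowance k and limit h (none of these are from the Melbourne audit):

```python
def cusum_chart(outcomes, p0, k=0.05, h=2.0):
    """One-sided CUSUM for a binary adverse outcome (1 = death/failure).
    S accumulates (observed - p0 - k) and is floored at 0; crossing the
    decision limit h flags a run of worse-than-acceptable outcomes."""
    s, signals = 0.0, []
    for i, x in enumerate(outcomes):
        s = max(0.0, s + (x - p0 - k))
        if s > h:
            signals.append(i)
            s = 0.0  # restart surveillance after a signal
    return signals

# 30 acceptable cases (acceptable mortality 5%) then a cluster of deaths
series = [0] * 30 + [1, 1, 1, 0, 1]
print(cusum_chart(series, p0=0.05))  # [32]: signal at the third death
```

The floor at zero is what lets the chart detect a change in the average rate quickly: good outcomes cannot build up "credit" that masks a later run of failures.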
Relating triggering processes in lab experiments with earthquakes.
NASA Astrophysics Data System (ADS)
Baro Urbea, J.; Davidsen, J.; Kwiatek, G.; Charalampidou, E. M.; Goebel, T.; Stanchits, S. A.; Vives, E.; Dresen, G.
2016-12-01
Statistical relations such as Gutenberg-Richter's, Omori-Utsu's and the productivity of aftershocks were first observed in seismology, but are also common to other physical phenomena exhibiting avalanche dynamics such as solar flares, rock fracture, structural phase transitions and even stock market transactions. All these examples exhibit spatio-temporal correlations that can be explained as triggering processes: Instead of being activated as a response to external driving or fluctuations, some events are consequence of previous activity. Although different plausible explanations have been suggested in each system, the ubiquity of such statistical laws remains unknown. However, the case of rock fracture may exhibit a physical connection with seismology. It has been suggested that some features of seismology have a microscopic origin and are reproducible over a vast range of scales. This hypothesis has motivated mechanical experiments to generate artificial catalogues of earthquakes at a laboratory scale -so called labquakes- and under controlled conditions. Microscopic fractures in lab tests release elastic waves that are recorded as ultrasonic (kHz-MHz) acoustic emission (AE) events by means of piezoelectric transducers. Here, we analyse the statistics of labquakes recorded during the failure of small samples of natural rocks and artificial porous materials under different controlled compression regimes. Temporal and spatio-temporal correlations are identified in certain cases. Specifically, we distinguish between the background and triggered events, revealing some differences in the statistical properties. We fit the data to statistical models of seismicity. As a particular case, we explore the branching process approach simplified in the Epidemic Type Aftershock Sequence (ETAS) model. We evaluate the empirical spatio-temporal kernel of the model and investigate the physical origins of triggering. 
Our analysis of the focal mechanisms implies that the occurrence of the empirical laws extends well beyond purely frictional sliding events, in contrast to what is often assumed.
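Of the statistical laws mentioned above, the Gutenberg-Richter relation is the simplest to fit to a labquake or earthquake catalogue; the standard maximum-likelihood (Aki, 1965) estimator of the b-value needs only the mean magnitude above the completeness threshold. A generic sketch on a synthetic catalogue (the parameters are illustrative, not from the experiments described):

```python
import numpy as np

def b_value_aki(mags, m_c):
    """Maximum-likelihood (Aki, 1965) estimate of the Gutenberg-Richter
    b-value from magnitudes at or above the completeness magnitude m_c:
    b = log10(e) / (mean(M) - m_c)."""
    m = np.asarray(mags, dtype=float)
    m = m[m >= m_c]
    return np.log10(np.e) / (m.mean() - m_c)

# synthetic catalogue: magnitudes above m_c = 2.0 with a true b-value of 1.0
rng = np.random.default_rng(42)
mags = 2.0 + rng.exponential(scale=np.log10(np.e) / 1.0, size=20000)
print(round(float(b_value_aki(mags, 2.0)), 1))  # recovers b close to 1.0
```

The same estimator applies to acoustic-emission amplitudes once they are converted to a magnitude scale, which is one way the lab-to-field comparison in the abstract is made quantitative.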
Didia, E E; Akon, A B; Thiam, A; Djeredou, K B
2010-03-01
One of the concerns of the dental surgeon in any operative procedure is its durability. The mechanical resistance of provisional prostheses contributes in large part to their durability. Resins in general have weak mechanical properties. The purpose of this study was to evaluate the flexural resistance of temporary bridges reinforced with glass fibre. To remedy the weak mechanical properties of the resins, we reinforced them with glass fibres. For this purpose, we fabricated, with two different resins, four groups of three-unit temporary bridges, two groups reinforced with glass fibre and two not. Three-point bending tests were performed on these bridges and resistance to fracture was analysed. The statistical tests showed a significant difference among the four groups, with better resistance for the reinforced bridges.
The Problem of Auto-Correlation in Parasitology
Pollitt, Laura C.; Reece, Sarah E.; Mideo, Nicole; Nussey, Daniel H.; Colegrave, Nick
2012-01-01
Explaining the contribution of host and pathogen factors in driving infection dynamics is a major ambition in parasitology. There is increasing recognition that analyses based on single summary measures of an infection (e.g., peak parasitaemia) do not adequately capture infection dynamics and so, the appropriate use of statistical techniques to analyse dynamics is necessary to understand infections and, ultimately, control parasites. However, the complexities of within-host environments mean that tracking and analysing pathogen dynamics within infections and among hosts poses considerable statistical challenges. Simple statistical models make assumptions that will rarely be satisfied in data collected on host and parasite parameters. In particular, model residuals (unexplained variance in the data) should not be correlated in time or space. Here we demonstrate how failure to account for such correlations can result in incorrect biological inference from statistical analysis. We then show how mixed effects models can be used as a powerful tool to analyse such repeated measures data in the hope that this will encourage better statistical practices in parasitology. PMID:22511865
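Mixed-effects models address exactly the pseudo-replication problem described above by fitting, e.g., a random intercept per host. A minimal numpy illustration (hypothetical host and sampling-day numbers) of why repeated measures from the same host must not be treated as independent observations:

```python
import numpy as np

rng = np.random.default_rng(1)
n_hosts, n_days = 10, 20
# a host-level random effect makes repeated measures within a host correlated
host_effect = rng.normal(0.0, 2.0, n_hosts)
data = host_effect[:, None] + rng.normal(0.0, 1.0, (n_hosts, n_days))

# naive: pool all 200 points as if independent (pseudo-replication)
pooled = data.ravel()
naive_se = pooled.std(ddof=1) / np.sqrt(pooled.size)

# correlation-aware: one summary (mean) per host, 10 independent units
host_means = data.mean(axis=1)
correct_se = host_means.std(ddof=1) / np.sqrt(n_hosts)

print(naive_se < correct_se)  # True: the naive SE understates uncertainty
```

A random-intercept mixed model generalises the per-host summary approach while still using every time point, which is why it is the recommended tool for infection-dynamics data.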
The effect of mechanical vibration on orthodontically induced root resorption.
Yadav, Sumit; Dobie, Thomas; Assefnia, Amir; Kalajzic, Zana; Nanda, Ravindra
2016-09-01
To investigate the effect of low-frequency mechanical vibration (LFMV) on orthodontically induced root resorption. Forty male CD1, 12-week-old mice were used for the study. The mice were randomly divided into five groups: group 1 (baseline)-no spring and no mechanical vibration, group 2-orthodontic spring but no vibration, group 3-orthodontic spring and 5 Hz of vibration applied to the maxillary first molar, group 4-orthodontic spring and 10 Hz of vibration applied to the maxillary first molar, and group 5-orthodontic spring and 20 Hz of vibration applied to the maxillary first molar. In the different experimental groups, the first molar was moved mesially for 2 weeks using a nickel-titanium coil spring delivering 10 g of force. LFMVs were applied at 5 Hz, 10 Hz, and 20 Hz. Microfocus X-ray computed tomography imaging was used to analyze root resorption. Additionally, to understand the mechanism, we applied LFMV to MC3T3 cells, and gene expression analyses were done for receptor activator of nuclear factor kappa-B ligand (RANKL) and osteoprotegerin (OPG). Orthodontic tooth movement leads to decreased root volume (increased root resorption craters). Our in vivo experiments showed a trend toward an increase in root volume with the different frequencies of mechanical vibration. In vitro gene expression analyses showed that with 20 Hz of mechanical vibration, there was a significant decrease in RANKL and a significant increase in OPG expression. There was a trend toward decreased root resorption with the different LFMVs (5 Hz, 10 Hz, and 20 Hz); however, the differences from the orthodontic-spring-only group were not statistically significant.
A canonical neural mechanism for behavioral variability
Darshan, Ran; Wood, William E.; Peters, Susan; Leblois, Arthur; Hansel, David
2017-01-01
The ability to generate variable movements is essential for learning and adjusting complex behaviours. This variability has been linked to the temporal irregularity of neuronal activity in the central nervous system. However, how neuronal irregularity actually translates into behavioural variability is unclear. Here we combine modelling, electrophysiological and behavioural studies to address this issue. We demonstrate that a model circuit comprising topographically organized and strongly recurrent neural networks can autonomously generate irregular motor behaviours. Simultaneous recordings of neurons in singing finches reveal that neural correlations increase across the circuit driving song variability, in agreement with the model predictions. Analysing behavioural data, we find remarkable similarities in the babbling statistics of 5–6-month-old human infants and juveniles from three songbird species and show that our model naturally accounts for these ‘universal' statistics. PMID:28530225
Statistical Analyses of Hydrophobic Interactions: A Mini-Review
Pratt, Lawrence R.; Chaudhari, Mangesh I.; Rempe, Susan B.
2016-07-14
Here this review focuses on the striking recent progress in solving for hydrophobic interactions between small inert molecules. We discuss several new understandings. First, the inverse temperature phenomenology of hydrophobic interactions, i.e., strengthening of hydrophobic bonds with increasing temperature, is decisively exhibited by hydrophobic interactions between atomic-scale hard sphere solutes in water. Second, inclusion of attractive interactions associated with atomic-size hydrophobic reference cases leads to substantial, nontrivial corrections to reference results for purely repulsive solutes. Hydrophobic bonds are weakened by adding solute dispersion forces to treatment of reference cases. The classic statistical mechanical theory for those corrections is not accurate in this application, but molecular quasi-chemical theory shows promise. Lastly, because of the masking roles of excluded volume and attractive interactions, comparisons that do not discriminate the different possibilities face an interpretive danger.
Micromechanical investigation of sand migration in gas hydrate-bearing sediments
NASA Astrophysics Data System (ADS)
Uchida, S.; Klar, A.; Cohen, E.
2017-12-01
Past field gas production tests from hydrate-bearing sediments have indicated that sand migration is an important phenomenon that needs to be considered for successful long-term gas production. The authors previously developed a continuum-based analytical thermo-hydro-mechanical sand migration model that can be applied to predict wellbore responses during gas production. However, the parameters involved in the model still need to be calibrated and studied thoroughly, and it remains a challenge to conduct well-defined laboratory experiments on sand migration, especially in hydrate-bearing sediments. Taking advantage of the micromechanical modelling approach through the discrete element method (DEM), this work presents a first step towards quantifying one of the model parameters, which governs stress reduction due to grain detachment. Grains represented by DEM particles are randomly removed from an isotropically loaded DEM specimen, and statistical analyses reveal that linear proportionality exists between the normalized volume of detached solids and the normalized reduced stresses. DEM specimens with different porosities (different packing densities) are also considered, and statistical analyses show that there is a clear transition between loose sand behavior and dense sand behavior, characterized by the relative density.
Helmholtz and Gibbs ensembles, thermodynamic limit and bistability in polymer lattice models
NASA Astrophysics Data System (ADS)
Giordano, Stefano
2017-12-01
Representing polymers by random walks on a lattice is a fruitful approach largely exploited to study configurational statistics of polymer chains and to develop efficient Monte Carlo algorithms. Nevertheless, the stretching and the folding/unfolding of polymer chains within the Gibbs (isotensional) and the Helmholtz (isometric) ensembles of statistical mechanics have not yet been thoroughly analysed by means of the lattice methodology. This topic, motivated by the recent introduction of several single-molecule force spectroscopy techniques, is investigated in the present paper. In particular, we analyse the force-extension curves under the Gibbs and Helmholtz conditions and we give a proof of the equivalence of the ensembles in the thermodynamic limit for polymers represented by a standard random walk on a lattice. Then, we generalize these concepts for lattice polymers that can undergo conformational transitions or, equivalently, for chains composed of bistable or two-state elements (that can be either folded or unfolded). In this case, the isotensional condition leads to a plateau-like force-extension response, whereas the isometric condition causes a sawtooth-like force-extension curve, as predicted by numerous experiments. The equivalence of the ensembles is finally proved also for lattice polymer systems exhibiting conformational transitions.
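The simplest case of the isotensional (Gibbs) ensemble discussed above is a 1D lattice chain of two-state links, each pointing +a or -a along the force. The links decouple under constant force, so the mean extension follows directly from the single-link partition function; a sketch verifying the closed form against a brute-force two-state computation (unit parameters are illustrative):

```python
import numpy as np

def mean_extension(f, n_links=100, a=1.0, beta=1.0):
    """Gibbs (isotensional) ensemble for a 1D lattice chain whose links
    point +a or -a: links are independent under constant force f, so the
    mean end-to-end distance is N * a * tanh(beta * f * a)."""
    return n_links * a * np.tanh(beta * f * a)

# brute-force check from the two-state single-link partition function
f, a, beta = 0.5, 1.0, 1.0
z_plus, z_minus = np.exp(beta * f * a), np.exp(-beta * f * a)
per_link = (a * z_plus - a * z_minus) / (z_plus + z_minus)
print(np.isclose(100 * per_link, mean_extension(0.5)))  # True
```

In the Helmholtz (isometric) ensemble the links are constrained by the fixed end-to-end distance and no longer decouple, which is why the two ensembles only agree in the thermodynamic limit, as the paper proves.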
NASA Astrophysics Data System (ADS)
Queirós, S. M. D.; Tsallis, C.
2005-11-01
The GARCH algorithm is the most renowned generalisation of Engle's original proposal for modelling returns, the ARCH process. Both cases are characterised by presenting a time-dependent and correlated variance or volatility. Besides a memory parameter, b, (present in ARCH) and an independent and identically distributed noise, ω, GARCH involves another parameter, c, such that, for c=0, the standard ARCH process is reproduced. In this manuscript we use a generalised noise following a distribution characterised by an index q_n, such that q_n=1 recovers the Gaussian distribution. Matching low statistical moments of the GARCH distribution for returns with a q-Gaussian distribution obtained through maximising the entropy S_q = (1 - Σ_i p_i^q)/(q - 1), the basis of nonextensive statistical mechanics, we obtain a sole analytical connection between q and (b, c, q_n) which turns out to be remarkably good when compared with computational simulations. With this result we also derive an analytical approximation for the stationary distribution for the (squared) volatility. Using a generalised Kullback-Leibler relative entropy form based on S_q, we also analyse the degree of dependence between successive returns, z_t and z_{t+1}, of GARCH(1,1) processes. This degree of dependence is quantified by an entropic index, q_op. Our analysis points to the existence of a unique relation between the three entropic indexes q_op, q and q_n of the problem, independent of the value of (b,c).
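A minimal simulation of the GARCH(1,1) recursion described above, in the Gaussian-noise (q_n = 1) case: the conditional variance follows s2_t = a + b*x_{t-1}^2 + c*s2_{t-1}, and c = 0 reduces to Engle's ARCH(1). The parameter values are illustrative, not from the paper:

```python
import numpy as np

def simulate_garch(n, a=0.1, b=0.2, c=0.7, seed=0):
    """GARCH(1,1) with Gaussian noise: returns x_t = sqrt(s2_t) * w_t with
    s2_t = a + b * x_{t-1}**2 + c * s2_{t-1}. Requires b + c < 1 for a
    finite stationary variance a / (1 - b - c)."""
    rng = np.random.default_rng(seed)
    x = np.zeros(n)
    s2_path = np.zeros(n)
    s2 = a / (1 - b - c)  # start at the stationary variance
    for t in range(n):
        s2_path[t] = s2
        x[t] = rng.normal() * np.sqrt(s2)
        s2 = a + b * x[t] ** 2 + c * s2
    return x, s2_path

x, s2 = simulate_garch(200_000)
print(round(float(np.var(x)), 1))  # near a / (1 - b - c) = 1.0
```

Even though each x_t is conditionally Gaussian, the fluctuating variance makes the unconditional return distribution fat-tailed, which is what the q-Gaussian matching in the paper quantifies.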
1992-10-01
Summary statistics (N=8 and N=4) and results of statistical analyses are tabulated for impact tests performed on the forefoot of unworn and worn footwear. Early footwear evaluations used tests to assess heel and forefoot shock absorption, upper and sole durability, and flexibility (Cavanagh, 1978). Later, the number of tests was
2011-01-01
Background Clinical researchers have often preferred to use a fixed effects model for the primary interpretation of a meta-analysis. Heterogeneity is usually assessed via the well known Q and I² statistics, along with the random effects estimate they imply. In recent years, alternative methods for quantifying heterogeneity have been proposed, that are based on a 'generalised' Q statistic. Methods We review 18 IPD meta-analyses of RCTs into treatments for cancer, in order to quantify the amount of heterogeneity present and also to discuss practical methods for explaining heterogeneity. Results Differing results were obtained when the standard Q and I² statistics were used to test for the presence of heterogeneity. The two meta-analyses with the largest amount of heterogeneity were investigated further, and on inspection the straightforward application of a random effects model was not deemed appropriate. Compared to the standard Q statistic, the generalised Q statistic provided a more accurate platform for estimating the amount of heterogeneity in the 18 meta-analyses. Conclusions Explaining heterogeneity via the pre-specification of trial subgroups, graphical diagnostic tools and sensitivity analyses produced a more desirable outcome than an automatic application of the random effects model. Generalised Q statistic methods for quantifying and adjusting for heterogeneity should be incorporated as standard into statistical software. Software is provided to help achieve this aim. PMID:21473747
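The standard Q and I² statistics referred to above are short computations from the study-level effect estimates and their variances: Q is the inverse-variance-weighted sum of squared deviations from the pooled estimate, and I² re-expresses Q as the share of variability beyond chance. A generic sketch with hypothetical trial values:

```python
import numpy as np

def q_and_i2(effects, variances):
    """Cochran's Q and the I^2 statistic for k study effect estimates.
    Q = sum w_i (y_i - y_bar)^2 with weights w_i = 1 / var_i;
    I^2 = max(0, (Q - df) / Q) with df = k - 1."""
    y = np.asarray(effects, dtype=float)
    w = 1.0 / np.asarray(variances, dtype=float)
    y_bar = np.sum(w * y) / np.sum(w)
    q = float(np.sum(w * (y - y_bar) ** 2))
    df = len(y) - 1
    i2 = max(0.0, (q - df) / q) if q > 0 else 0.0
    return q, i2

# five hypothetical trials with identical effects: Q = 0 and I^2 = 0
print(q_and_i2([0.3] * 5, [0.04] * 5))  # (0.0, 0.0)
```

The 'generalised' Q approaches discussed in the paper modify the weights used in this sum; the basic structure of the statistic is the same.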
Gaskin, Cadeyrn J; Happell, Brenda
2014-05-01
To (a) assess the statistical power of nursing research to detect small, medium, and large effect sizes; (b) estimate the experiment-wise Type I error rate in these studies; and (c) assess the extent to which (i) a priori power analyses, (ii) effect sizes (and interpretations thereof), and (iii) confidence intervals were reported. Statistical review. Papers published in the 2011 volumes of the 10 highest ranked nursing journals, based on their 5-year impact factors. Papers were assessed for statistical power, control of experiment-wise Type I error, reporting of a priori power analyses, reporting and interpretation of effect sizes, and reporting of confidence intervals. The analyses were based on 333 papers, from which 10,337 inferential statistics were identified. The median power to detect small, medium, and large effect sizes was .40 (interquartile range [IQR]=.24-.71), .98 (IQR=.85-1.00), and 1.00 (IQR=1.00-1.00), respectively. The median experiment-wise Type I error rate was .54 (IQR=.26-.80). A priori power analyses were reported in 28% of papers. Effect sizes were routinely reported for Spearman's rank correlations (100% of papers in which this test was used), Poisson regressions (100%), odds ratios (100%), Kendall's tau correlations (100%), Pearson's correlations (99%), logistic regressions (98%), structural equation modelling/confirmatory factor analyses/path analyses (97%), and linear regressions (83%), but were reported less often for two-proportion z tests (50%), analyses of variance/analyses of covariance/multivariate analyses of variance (18%), t tests (8%), Wilcoxon's tests (8%), Chi-squared tests (8%), and Fisher's exact tests (7%), and not reported for sign tests, Friedman's tests, McNemar's tests, multi-level models, and Kruskal-Wallis tests. Effect sizes were infrequently interpreted. Confidence intervals were reported in 28% of papers. 
The use, reporting, and interpretation of inferential statistics in nursing research need substantial improvement. Most importantly, researchers should abandon the misleading practice of interpreting the results from inferential tests based solely on whether they are statistically significant (or not) and, instead, focus on reporting and interpreting effect sizes, confidence intervals, and significance levels. Nursing researchers also need to conduct and report a priori power analyses, and to address the issue of Type I experiment-wise error inflation in their studies. Crown Copyright © 2013. Published by Elsevier Ltd. All rights reserved.
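The a priori power analyses the review calls for can be approximated in closed form for a two-sample comparison; this normal-approximation sketch (fixed two-sided alpha = .05; sample size chosen for illustration) shows why power to detect small effects is so often low:

```python
from math import erf, sqrt

def phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def approx_power(d, n_per_group):
    """Normal-approximation power of a two-sided, two-sample test
    (alpha = .05) to detect a standardized effect size d."""
    z_crit = 1.959964  # critical z for two-sided alpha = .05
    return phi(d * sqrt(n_per_group / 2.0) - z_crit)

# Cohen's small/medium/large benchmarks with n = 64 per group
for d in (0.2, 0.5, 0.8):
    print(round(approx_power(d, 64), 2))  # roughly .20, .81, .99
```

The pattern mirrors the review's findings: with typical sample sizes, medium and large effects are detected reliably while power for small effects is far below the conventional .80 target.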
Elnaghy, A M; Elsaka, S E
2017-08-01
To assess and compare the mechanical properties of TRUShape (TRS) with several nickel-titanium rotary instruments. Cyclic fatigue, torsional resistance, flexibility and surface microhardness of TRS (size 25, 0.06 taper), ProTaper Next X2 (PTN X2, size 25, 0.06 taper), ProTaper Gold (PTG F2; size 25, 0.08 taper) and ProTaper Universal (PTU F2; size 25, 0.08 taper) instruments were evaluated. The topographical structures of the fracture surfaces of instruments were assessed using a scanning electron microscope. The cyclic fatigue resistance, torsional resistance and microhardness data were analysed using one-way analysis of variance (ANOVA) and Tukey's post hoc tests. The fragment length and bending resistance data were analysed statistically with the Kruskal-Wallis H-test and Mann-Whitney U-tests. The statistical significance level was set at P < 0.05. PTN and PTG instruments revealed significantly higher resistance to cyclic fatigue than TRS and PTU instruments (P < 0.001). PTN instruments revealed significantly higher torsional resistance compared with the other instruments (P < 0.001). The PTG instrument had significantly higher flexibility than the other tested brands (P < 0.05). However, for microhardness, the PTU had significantly higher surface microhardness values compared with the other tested brands (P < 0.05). TRS instruments had lower resistance to cyclic fatigue and lower flexibility compared with PTG and PTN instruments. TRS, PTG and PTU instruments had lower resistance to torsional stress than PTN instruments. TRS and PTG instruments had comparable surface microhardness. © 2016 International Endodontic Journal. Published by John Wiley & Sons Ltd.
Kasenda, Benjamin; Sauerbrei, Willi; Royston, Patrick; Briel, Matthias
2014-05-20
Categorizing an inherently continuous predictor in prognostic analyses raises several critical methodological issues: dependence of the statistical significance on the number and position of the chosen cut-point(s), loss of statistical power, and faulty interpretation of the results if a non-linear association is incorrectly assumed to be linear. This also applies to a therapeutic context where investigators of randomized clinical trials (RCTs) are interested in interactions between treatment assignment and one or more continuous predictors. Our goal is to apply the multivariable fractional polynomial interaction (MFPI) approach to investigate interactions between continuous patient baseline variables and the allocated treatment in an individual patient data meta-analysis of three RCTs (N = 2,299) from the intensive care field. For each study, MFPI will provide a continuous treatment effect function. Functions from each of the three studies will be averaged by a novel meta-analysis approach for functions. We will plot treatment effect functions separately for each study and also the averaged function. The averaged function with a related confidence interval will provide a suitable basis to assess whether a continuous patient characteristic modifies the treatment comparison and may be relevant for clinical decision-making. The compared interventions will be a higher or lower positive end-expiratory pressure (PEEP) ventilation strategy in patients requiring mechanical ventilation. The continuous baseline variables body mass index, PaO2/FiO2, respiratory compliance, and oxygenation index will be the investigated potential effect modifiers. Clinical outcomes for this analysis will be in-hospital mortality, time to death, time to unassisted breathing, and pneumothorax. This project will be the first meta-analysis to combine continuous treatment effect functions derived by the MFPI procedure separately in each of several RCTs. 
Such an approach requires individual patient data (IPD). They are available from an earlier IPD meta-analysis using different methods for analysis. This new analysis strategy allows assessing whether treatment effects interact with continuous baseline patient characteristics and avoids categorization-based subgroup analyses. These interaction analyses of the present study will be exploratory in nature. However, they may help to foster future research using the MFPI approach to improve interaction analyses of continuous predictors in RCTs and IPD meta-analyses. This study is registered in PROSPERO (CRD42012003129).
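The fractional-polynomial machinery underlying MFPI starts from the degree-1 case: try each power in the standard FP set for the continuous predictor, fit by least squares, and keep the best-fitting transform. A simplified, hypothetical sketch of that selection step (not the MFPI procedure itself, which additionally models treatment interaction):

```python
import numpy as np

FP_POWERS = (-2, -1, -0.5, 0, 0.5, 1, 2, 3)  # standard FP1 power set

def fp_transform(x, p):
    """Power 0 denotes the log transform, by FP convention."""
    return np.log(x) if p == 0 else x ** p

def fit_fp1(x, y):
    """Degree-1 fractional polynomial: for each candidate power p, fit
    y = b0 + b1 * x**p by least squares and keep the lowest-RSS fit."""
    best = None
    for p in FP_POWERS:
        design = np.column_stack([np.ones_like(x), fp_transform(x, p)])
        coef, *_ = np.linalg.lstsq(design, y, rcond=None)
        resid = y - design @ coef
        rss = float(resid @ resid)
        if best is None or rss < best[0]:
            best = (rss, p, coef)
    return best[1], best[2]

# noiseless log-shaped data should select the log transform (p = 0)
x = np.linspace(1.0, 10.0, 50)
p_best, coef = fit_fp1(x, 3.0 + 2.0 * np.log(x))
print(p_best)  # 0
```

MFPI applies this selection within each treatment arm (or with a treatment-by-FP interaction term), yielding the continuous treatment effect functions that the meta-analysis then averages across trials.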
Random mechanics: Nonlinear vibrations, turbulences, seisms, swells, fatigue
NASA Astrophysics Data System (ADS)
Kree, P.; Soize, C.
The random modeling of physical phenomena, together with probabilistic methods for the numerical calculation of random mechanical forces, are analytically explored. Attention is given to theoretical examinations such as probabilistic concepts, linear filtering techniques, and trajectory statistics. Applications of the methods to structures experiencing atmospheric turbulence, the quantification of turbulence, and the dynamic responses of the structures are considered. A probabilistic approach is taken to study the effects of earthquakes on structures and to the forces exerted by ocean waves on marine structures. Theoretical analyses by means of vector spaces and stochastic modeling are reviewed, as are Markovian formulations of Gaussian processes and the definition of stochastic differential equations. Finally, random vibrations with a variable number of links and linear oscillators undergoing the square of Gaussian processes are investigated.
An investigation on seismo-ionospheric precursors in various earthquake zones
NASA Astrophysics Data System (ADS)
Su, Y.; Liu, J. G.; Chen, M.
2011-12-01
This paper examines the relationships between the ionosphere and earthquakes occurring in different earthquake zones, e.g. the Malaysia area, the Tibet plateau, mid-ocean ridges and the Andes, to reveal the possible seismo-ionospheric precursors for these areas. Because the lithology, the focal mechanisms of earthquakes and the electrodynamics of the ionosphere differ among these areas, diverse ionospheric reactions before large earthquakes are probable. In addition to statistical analyses of increase or decrease anomalies of the ionospheric electron density a few days before large earthquakes, we focus on the seismo-ionospheric precursors for oceanic and land earthquakes as well as for earthquakes with different focal mechanisms.
Haebig, Eileen; Saffran, Jenny R; Ellis Weismer, Susan
2017-11-01
Word learning is an important component of language development that influences child outcomes across multiple domains. Despite the importance of word knowledge, word-learning mechanisms are poorly understood in children with specific language impairment (SLI) and children with autism spectrum disorder (ASD). This study examined underlying mechanisms of word learning, specifically, statistical learning and fast-mapping, in school-aged children with typical and atypical development. Statistical learning was assessed through a word segmentation task and fast-mapping was examined in an object-label association task. We also examined children's ability to map meaning onto newly segmented words in a third task that combined exposure to an artificial language and a fast-mapping task. Children with SLI had poorer performance on the word segmentation and fast-mapping tasks relative to the typically developing and ASD groups, who did not differ from one another. However, when children with SLI were exposed to an artificial language with phonemes used in the subsequent fast-mapping task, they successfully learned more words than in the isolated fast-mapping task. There was some evidence that word segmentation abilities are associated with word learning in school-aged children with typical development and ASD, but not SLI. Follow-up analyses also examined performance in children with ASD who did and did not have a language impairment. Children with ASD with language impairment evidenced intact statistical learning abilities, but subtle weaknesses in fast-mapping abilities. As the Procedural Deficit Hypothesis (PDH) predicts, children with SLI have impairments in statistical learning. However, children with SLI also have impairments in fast-mapping. Nonetheless, they are able to take advantage of additional phonological exposure to boost subsequent word-learning performance. 
In contrast to the PDH, children with ASD appear to have intact statistical learning, regardless of language status; however, fast-mapping abilities differ according to broader language skills. © 2017 Association for Child and Adolescent Mental Health.
Non-equilibrium statistical mechanics theory for the large scales of geophysical flows
NASA Astrophysics Data System (ADS)
Eric, S.; Bouchet, F.
2010-12-01
The aim of any theory of turbulence is to understand the statistical properties of the velocity field. As a huge number of degrees of freedom is involved, statistical mechanics is a natural approach. The self-organization of two-dimensional and geophysical turbulent flows is addressed based on statistical mechanics methods. We discuss classical and recent works on this subject; from the statistical mechanics basis of the theory up to applications to Jupiter’s troposphere and ocean vortices and jets. The equilibrium microcanonical measure is built from the Liouville theorem. Important statistical mechanics concepts (large deviations, mean field approach) and thermodynamic concepts (ensemble inequivalence, negative heat capacity) are briefly explained and used to predict statistical equilibria for turbulent flows. This is applied to make quantitative models of two-dimensional turbulence, the Great Red Spot and other Jovian vortices, ocean jets like the Gulf Stream, and ocean vortices. A detailed comparison between these statistical equilibria and real flow observations will be discussed. We also present recent results for non-equilibrium situations, for which forces and dissipation are in a statistical balance. As an example, the concept of phase transition allows us to describe drastic changes of the whole system when a few external parameters are changed. F. Bouchet and E. Simonnet, Random Changes of Flow Topology in Two-Dimensional and Geophysical Turbulence, Physical Review Letters 102 (2009), no. 9, 094504. F. Bouchet and J. Sommeria, Emergence of intense jets and Jupiter's Great Red Spot as maximum-entropy structures, Journal of Fluid Mechanics 464 (2002), 165-207. A. Venaille and F. Bouchet, Ocean rings and jets as statistical equilibrium states, submitted to JPO. F. Bouchet and A.
Venaille, Statistical mechanics of two-dimensional and geophysical flows, submitted to Physics Reports. Non-equilibrium phase transitions for the 2D Navier-Stokes equations with stochastic forces (time series and probability density functions (PDFs) of the modulus of the largest-scale Fourier component, showing bistability between dipole and unidirectional flows). This bistability is predicted by statistical mechanics.
40 CFR 91.512 - Request for public hearing.
Code of Federal Regulations, 2010 CFR
2010-07-01
... plans and statistical analyses have been properly applied (specifically, whether sampling procedures and statistical analyses specified in this subpart were followed and whether there exists a basis for... will be made available to the public during Agency business hours. ...
Jin, Zhichao; Yu, Danghui; Zhang, Luoman; Meng, Hong; Lu, Jian; Gao, Qingbin; Cao, Yang; Ma, Xiuqiang; Wu, Cheng; He, Qian; Wang, Rui; He, Jia
2010-05-25
High-quality clinical research requires not only advanced professional knowledge but also sound study design and correct statistical analyses. The number of clinical research articles published in Chinese medical journals has increased immensely in the past decade, but study design quality and statistical analyses have remained suboptimal. The aim of this investigation was to gather evidence on the quality of study design and statistical analyses in clinical research conducted in China during the first decade of the new millennium. Ten leading Chinese medical journals were selected, and all original articles published in 1998 (N = 1,335) and 2008 (N = 1,578) were thoroughly categorized and reviewed. A well-defined and validated checklist on study design, statistical analyses, results presentation, and interpretation was used for review and evaluation. Main outcomes were the frequencies of different types of study design, error/defect proportions in design and statistical analyses, and implementation of CONSORT in randomized clinical trials. From 1998 to 2008, the error/defect proportion in statistical analyses decreased significantly (χ² = 12.03, p < 0.001), from 59.8% (545/1,335) in 1998 to 52.2% (664/1,578) in 2008. The overall error/defect proportion in study design also decreased (χ² = 21.22, p < 0.001), from 50.9% (680/1,335) to 42.4% (669/1,578). In 2008, randomized clinical trials remained rare (3.8%, 60/1,578), and two-thirds of them showed poor results reporting (defects in 44 papers, 73.3%). Nearly half of the published studies were retrospective in nature: 49.3% (658/1,335) in 1998 compared to 48.2% (761/1,578) in 2008. Defect proportions also decreased in both results presentation (χ² = 93.26, p < 0.001), from 92.7% (945/1,019) to 78.2% (1,023/1,309), and interpretation (χ² = 27.26, p < 0.001), from 9.7% (99/1,019) to 4.3% (56/1,309), although some serious defects persisted. 
Chinese medical research seems to have made significant progress regarding statistical analyses, but there remains ample room for improvement regarding study designs. Retrospective clinical studies are the most often used design, whereas randomized clinical trials are rare and often show methodological weaknesses. Urgent implementation of the CONSORT statement is imperative.
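The between-year comparisons in this record are two-proportion chi-square tests. The following stdlib-Python function is an illustrative sketch of that computation, not the authors' code; the example call reuses two of the reported counts purely for demonstration.

```python
def chi_square_2x2(a, n1, b, n2):
    """Pearson chi-square statistic comparing two proportions a/n1 vs b/n2."""
    # Build the 2x2 table: events and non-events in each group.
    table = [[a, n1 - a], [b, n2 - b]]
    row = [sum(r) for r in table]
    col = [table[0][j] + table[1][j] for j in range(2)]
    total = n1 + n2
    chi2 = 0.0
    for i in range(2):
        for j in range(2):
            expected = row[i] * col[j] / total  # expected count under independence
            chi2 += (table[i][j] - expected) ** 2 / expected
    return chi2

# e.g. comparing the 545/1,335 (1998) and 664/1,578 (2008) error counts;
# compare the statistic against the chi-square critical value 3.84 (df = 1, alpha = 0.05)
chi_square_2x2(545, 1335, 664, 1578)
```

The statistic has one degree of freedom for a 2x2 table, so values above 3.84 are significant at the 5% level.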
ERIC Educational Resources Information Center
Cafri, Guy; Kromrey, Jeffrey D.; Brannick, Michael T.
2010-01-01
This article uses meta-analyses published in "Psychological Bulletin" from 1995 to 2005 to describe meta-analyses in psychology, including examination of statistical power, Type I errors resulting from multiple comparisons, and model choice. Retrospective power estimates indicated that univariate categorical and continuous moderators, individual…
Statistical mechanics based on fractional classical and quantum mechanics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Korichi, Z.; Meftah, M. T., E-mail: mewalid@yahoo.com
2014-03-15
The purpose of this work is to study some problems in statistical mechanics based on fractional classical and quantum mechanics. In the first stage, we present the thermodynamic properties of the classical ideal gas and of a system of N classical oscillators; in both cases, the Hamiltonian contains fractional exponents of the phase-space variables (position and momentum). In the second stage, in the context of fractional quantum mechanics, we calculate the thermodynamic properties of black-body radiation, study Bose-Einstein statistics with the related condensation problem, and treat Fermi-Dirac statistics.
Epidemiology of injuries in the Spanish national futsal male team: a five-season retrospective study
Martinez-Riaza, Luis; Herrero-Gonzalez, Helena; Lopez-Alcorocho, Juan M; Guillen-Garcia, Pedro; Fernandez-Jaen, Tomas F
2016-01-01
Background Futsal started being played in 1930 and the number of futsal players has increased all over the world ever since. Nonetheless, despite the fact that Spain is one of the most relevant national teams worldwide, information on the incidence of injuries and their anthropometric characteristics is sparse in this country. Aim To analyse medical assistance provided to players in their prematch concentration camps with the Spanish national team over five seasons, from 2010–2011 to 2014–2015, and also to collect data regarding anthropometric characteristics. Materials and methods This is a retrospective and detailed study of injuries players suffered over these five seasons. All variables were registered on an Excel spreadsheet and later analysed statistically. Results 411 injuries were studied in total. The dominant somatotype was mesomorph and the injured pivots were both the most endomorphic and the most mesomorphic. The most injured body structure was the hamstring muscles, occurring due to training and intrinsic mechanisms, where fatigue was the most frequent diagnosis. Only a few complementary examinations were carried out and prematch withdrawal was rare. Discussion The skinfold test total sum was lower than that of the Spanish 11-a-side players or than that in the lower category futsal Spanish players. In various research studies analysing exclusively injuries occurring in matches, the most frequent injury is ligament injury by extrinsic mechanism. The body mass index was not a useful parameter when assessing players’ appropriate weight. Most injuries occurred in training sessions, mostly by intrinsic mechanism; the highest percentage of traumatic injuries occurred in official matches. PMID:28879032
Algorithm for Identifying Erroneous Rain-Gauge Readings
NASA Technical Reports Server (NTRS)
Rickman, Doug
2005-01-01
An algorithm analyzes rain-gauge data to identify statistical outliers that could be deemed to be erroneous readings. Heretofore, analyses of this type have been performed in burdensome manual procedures that have involved subjective judgements. Sometimes, the analyses have included computational assistance for detecting values falling outside of arbitrary limits. The analyses have been performed without statistically valid knowledge of the spatial and temporal variations of precipitation within rain events. In contrast, the present algorithm makes it possible to automate such an analysis, makes the analysis objective, takes account of the spatial distribution of rain gauges in conjunction with the statistical nature of spatial variations in rainfall readings, and minimizes the use of arbitrary criteria. The algorithm implements an iterative process that involves nonparametric statistics.
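The NASA algorithm combines the spatial distribution of gauges with iterative nonparametric statistics. As a much simpler illustration of the nonparametric-outlier idea only (the median/MAD "modified z-score" rule is an assumption here, not necessarily the rule the algorithm uses), one might write:

```python
from statistics import median

def flag_outliers(readings, threshold=3.5):
    """Return indices of readings whose modified z-score exceeds the threshold."""
    med = median(readings)
    mad = median(abs(x - med) for x in readings)  # median absolute deviation
    if mad == 0:
        return []  # no spread, nothing to flag
    flagged = []
    for i, x in enumerate(readings):
        # 0.6745 scales the MAD to be consistent with the standard deviation
        # under normality (the Iglewicz-Hoaglin modified z-score).
        z = 0.6745 * (x - med) / mad
        if abs(z) > threshold:
            flagged.append(i)
    return flagged

flag_outliers([2.1, 2.4, 2.2, 2.3, 9.8, 2.0])  # → [4]: the 9.8 mm reading stands out
```

Because the rule uses medians rather than means, a single extreme reading cannot mask itself by inflating the spread estimate, which is the appeal of nonparametric screening for gauge data.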
Citation of previous meta-analyses on the same topic: a clue to perpetuation of incorrect methods?
Li, Tianjing; Dickersin, Kay
2013-06-01
Systematic reviews and meta-analyses serve as a basis for decision-making and clinical practice guidelines and should be carried out using appropriate methodology to avoid incorrect inferences. We describe the characteristics, statistical methods used for meta-analyses, and citation patterns of all 21 glaucoma systematic reviews we identified pertaining to the effectiveness of prostaglandin analog eye drops in treating primary open-angle glaucoma, published between December 2000 and February 2012. We abstracted data, assessed whether appropriate statistical methods were applied in meta-analyses, and examined citation patterns of included reviews. We identified two forms of problematic statistical analyses in 9 of the 21 systematic reviews examined. Except in 1 case, none of the 9 reviews that used incorrect statistical methods cited a previously published review that used appropriate methods. Reviews that used incorrect methods were cited 2.6 times more often than reviews that used appropriate statistical methods. We speculate that by emulating the statistical methodology of previous systematic reviews, systematic review authors may have perpetuated incorrect approaches to meta-analysis. The use of incorrect statistical methods, perhaps through emulating methods described in previous research, calls conclusions of systematic reviews into question and may lead to inappropriate patient care. We urge systematic review authors and journal editors to seek the advice of experienced statisticians before undertaking or accepting for publication a systematic review and meta-analysis. The author(s) have no proprietary or commercial interest in any materials discussed in this article. Copyright © 2013 American Academy of Ophthalmology. Published by Elsevier Inc. All rights reserved.
Many-Body Localization and Thermalization in Quantum Statistical Mechanics
NASA Astrophysics Data System (ADS)
Nandkishore, Rahul; Huse, David A.
2015-03-01
We review some recent developments in the statistical mechanics of isolated quantum systems. We provide a brief introduction to quantum thermalization, paying particular attention to the eigenstate thermalization hypothesis (ETH) and the resulting single-eigenstate statistical mechanics. We then focus on a class of systems that fail to quantum thermalize and whose eigenstates violate the ETH: These are the many-body Anderson-localized systems; their long-time properties are not captured by the conventional ensembles of quantum statistical mechanics. These systems can forever locally remember information about their local initial conditions and are thus of interest for possibilities of storing quantum information. We discuss key features of many-body localization (MBL) and review a phenomenology of the MBL phase. Single-eigenstate statistical mechanics within the MBL phase reveal dynamically stable ordered phases, and phase transitions among them, that are invisible to equilibrium statistical mechanics and can occur at high energy and low spatial dimensionality, where equilibrium ordering is forbidden.
Wu, Robert; Glen, Peter; Ramsay, Tim; Martel, Guillaume
2014-06-28
Observational studies dominate the surgical literature. Statistical adjustment is an important strategy to account for confounders in observational studies. Research has shown that published articles are often poor in statistical quality, which may jeopardize their conclusions. The Statistical Analyses and Methods in the Published Literature (SAMPL) guidelines have been published to help establish standards for statistical reporting. This study will seek to determine whether the quality of statistical adjustment and the reporting of these methods are adequate in surgical observational studies. We hypothesize that incomplete reporting will be found in all surgical observational studies, and that these methods will be of lower quality and less completely reported in surgical journals than in medical journals. Finally, this work will seek to identify predictors of high-quality reporting. This work will examine the top five general surgical and medical journals, based on a 5-year impact factor (2007-2012). All observational studies investigating an intervention related to an essential component area of general surgery (defined by the American Board of Surgery), with an exposure, outcome, and comparator, will be included in this systematic review. Essential elements related to statistical reporting and quality were extracted from the SAMPL guidelines and include domains such as intent of analysis, primary analysis, multiple comparisons, numbers and descriptive statistics, association and correlation analyses, linear regression, logistic regression, Cox proportional hazard analysis, analysis of variance, survival analysis, propensity analysis, and independent and correlated analyses. Each article will be scored as a proportion based on fulfilling criteria in relevant analyses used in the study. A logistic regression model will be built to identify variables associated with high-quality reporting. 
A comparison will be made between the scores of surgical observational studies published in medical versus surgical journals. Secondary outcomes will pertain to individual domains of analysis. Sensitivity analyses will be conducted. This study will explore the reporting and quality of statistical analyses in surgical observational studies published in the most referenced surgical and medical journals in 2013 and examine whether variables (including the type of journal) can predict high-quality reporting.
Compendium of Abstracts on Statistical Applications in Geotechnical Engineering.
1983-09-01
research in the application of probabilistic and statistical methods to soil mechanics, rock mechanics, and engineering geology problems have grown markedly...probability, statistics, soil mechanics, rock mechanics, and engineering geology. 2. The purpose of this report is to make available to the U. S...Deformation Dynamic Response Analysis Seepage, Soil Permeability and Piping Earthquake Engineering, Seismology, Settlement and Heave Seismic Risk Analysis
Statistical analyses of commercial vehicle accident factors. Volume 1 Part 1
DOT National Transportation Integrated Search
1978-02-01
Procedures for conducting statistical analyses of commercial vehicle accidents have been established and initially applied. A file of some 3,000 California Highway Patrol accident reports from two areas of California during a period of about one year...
40 CFR 90.712 - Request for public hearing.
Code of Federal Regulations, 2010 CFR
2010-07-01
... sampling plans and statistical analyses have been properly applied (specifically, whether sampling procedures and statistical analyses specified in this subpart were followed and whether there exists a basis... Clerk and will be made available to the public during Agency business hours. ...
Gillespie, Paddy; O'Shea, Eamon; Smith, Susan M; Cupples, Margaret E; Murphy, Andrew W
2016-12-01
Data on health care utilization may be collected using a variety of mechanisms within research studies, each of which may have implications for cost and cost effectiveness. The aim of this observational study is to compare data collected from medical records searches and self-report questionnaires for the cost analysis of a cardiac secondary prevention intervention. Secondary data analysis of the Secondary Prevention of Heart Disease in General Practice (SPHERE) randomized controlled trial (RCT). Resource use data for a range of health care services were collected by research nurse searches of medical records and self-report questionnaires and costs of care estimated for each data collection mechanism. A series of statistical analyses were conducted to compare the mean costs for medical records data versus questionnaire data and to conduct incremental analyses for the intervention and control arms in the trial. Data were available to estimate costs for 95% of patients in the intervention and 96% of patients in the control using the medical records data compared to 65% and 66%, respectively, using the questionnaire data. The incremental analysis revealed a statistically significant difference in mean cost of -€796 (95% CI: -1447, -144; P-value: 0.017) for the intervention relative to the control. This compared to no significant difference in mean cost (95% CI: -1446, 860; P-value: 0.619) for the questionnaire analysis. Our findings illustrate the importance of the choice of health care utilization data collection mechanism for the conduct of economic evaluation alongside randomized trials in primary care. This choice will have implications for the costing methodology employed and potentially, for the cost and cost effectiveness outcomes generated. © The Author 2016. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Neonatal Risk Factors for Treatment-Demanding Retinopathy of Prematurity: A Danish National Study.
Slidsborg, Carina; Jensen, Aksel; Forman, Julie Lyng; Rasmussen, Steen; Bangsgaard, Regitze; Fledelius, Hans Callø; Greisen, Gorm; la Cour, Morten
2016-04-01
One goal of the study was to identify "new" statistically independent risk factors for treatment-demanding retinopathy of prematurity (ROP). Another goal was to evaluate whether any new risk factors could explain the increase in the incidence of treatment-demanding ROP over time in Denmark. A retrospective, register-based cohort study. The study included premature infants (n = 6490) born in Denmark from 1997 to 2008. The study sample and the 31 candidate risk factors were identified in 3 national registers. Data were linked through a unique civil registration number. Each of the 31 candidate risk factors were evaluated in univariate analyses, while adjusted for known risk factors (i.e., gestational age [GA] at delivery, small for gestational age [SGA], multiple births, and male sex). Significant outcomes were analyzed thereafter in a backward selection multiple logistic regression model. Treatment-demanding ROP and its associations to candidate risk factors. Mechanical ventilation (odds ratio [OR], 2.84; 95% confidence interval [CI], 1.99-4.08; P < 0.01) and blood transfusion (OR, 1.97; 95% CI, 1.20-3.14; P = 0.01) were the only new statistically independent risk factors, in addition to GA at delivery, SGA, multiple births, and male sex. Modification in these prognostic factors for ROP did not cause an increase in treatment-demanding ROP. In a large study population, blood transfusion and mechanical ventilation were the only new statistically independent risk factors to predict the development of treatment-demanding ROP. Modification in the neonatal treatment with mechanical ventilation or blood transfusion did not cause the observed increase in the incidence of preterm infants with treatment-demanding ROP during a recent birth period (2003-2008). Copyright © 2016 American Academy of Ophthalmology. Published by Elsevier Inc. All rights reserved.
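The odds ratios quoted in this record are adjusted estimates from multiple logistic regression. As a hedged, simplified illustration of the related crude (unadjusted) odds ratio and its Wald 95% confidence interval from a single 2x2 table (the counts below are invented, not the Danish register data):

```python
from math import exp, log, sqrt

def odds_ratio(a, b, c, d):
    """Crude OR and Wald 95% CI for an exposed group (a events, b non-events)
    vs an unexposed group (c events, d non-events)."""
    or_ = (a * d) / (b * c)                       # cross-product ratio
    se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)      # SE of log(OR)
    lo = exp(log(or_) - 1.96 * se)
    hi = exp(log(or_) + 1.96 * se)
    return or_, (lo, hi)

# invented counts: 10/100 treated infants vs 5/100 untreated infants with the outcome
odds_ratio(10, 90, 5, 95)
```

An adjusted analysis would instead enter the exposure together with covariates (GA at delivery, SGA, multiple births, sex) into a logistic model, which is what produces figures such as OR 2.84 (95% CI 1.99-4.08).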
A climatology of total ozone mapping spectrometer data using rotated principal component analysis
NASA Astrophysics Data System (ADS)
Eder, Brian K.; Leduc, Sharon K.; Sickles, Joseph E.
1999-02-01
The spatial and temporal variability of total column ozone (Ω) obtained from the total ozone mapping spectrometer (TOMS version 7.0) during the period 1980-1992 was examined through the use of a multivariate statistical technique called rotated principal component analysis. Utilization of Kaiser's varimax orthogonal rotation led to the identification of 14, mostly contiguous subregions that together accounted for more than 70% of the total Ω variance. Each subregion displayed statistically unique Ω characteristics that were further examined through time series and spectral density analyses, revealing significant periodicities on semiannual, annual, quasi-biennial, and longer term time frames. This analysis facilitated identification of the probable mechanisms responsible for the variability of Ω within the 14 homogeneous subregions. The mechanisms were either dynamical in nature (i.e., advection associated with baroclinic waves, the quasi-biennial oscillation, or El Niño-Southern Oscillation) or photochemical in nature (i.e., production of odd oxygen (O or O3) associated with the annual progression of the Sun). The analysis has also revealed that the influence of a data retrieval artifact, found in equatorial latitudes of version 6.0 of the TOMS data, has been reduced in version 7.0.
Sud, Sachin; Friedrich, Jan O.; Adhikari, Neill K. J.; Taccone, Paolo; Mancebo, Jordi; Polli, Federico; Latini, Roberto; Pesenti, Antonio; Curley, Martha A.Q.; Fernandez, Rafael; Chan, Ming-Cheng; Beuret, Pascal; Voggenreiter, Gregor; Sud, Maneesh; Tognoni, Gianni; Gattinoni, Luciano; Guérin, Claude
2014-01-01
Background: Mechanical ventilation in the prone position is used to improve oxygenation and to mitigate the harmful effects of mechanical ventilation in patients with acute respiratory distress syndrome (ARDS). We sought to determine the effect of prone positioning on mortality among patients with ARDS receiving protective lung ventilation. Methods: We searched electronic databases and conference proceedings to identify relevant randomized controlled trials (RCTs) published through August 2013. We included RCTs that compared prone and supine positioning during mechanical ventilation in patients with ARDS. We assessed risk of bias and obtained data on all-cause mortality (determined at hospital discharge or, if unavailable, after longest follow-up period). We used random-effects models for the pooled analyses. Results: We identified 11 RCTs (n = 2341) that met our inclusion criteria. In the 6 trials (n = 1016) that used a protective ventilation strategy with reduced tidal volumes, prone positioning significantly reduced mortality (risk ratio 0.74, 95% confidence interval 0.59–0.95; I2 = 29%) compared with supine positioning. The mortality benefit remained in several sensitivity analyses. The overall quality of evidence was high. The risk of bias was low in all of the trials except one, which was small. Statistical heterogeneity was low (I2 < 50%) for most of the clinical and physiologic outcomes. Interpretation: Our analysis of high-quality evidence showed that use of the prone position during mechanical ventilation improved survival among patients with ARDS who received protective lung ventilation. PMID:24863923
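The pooled risk ratio above comes from a random-effects model. Below is a minimal sketch of DerSimonian-Laird pooling on invented inputs (log risk ratios and variances; these are not the trial data from this review):

```python
from math import exp, sqrt

def dersimonian_laird(log_rr, var):
    """Pool per-trial log risk ratios with DerSimonian-Laird random-effects weights."""
    w = [1 / v for v in var]                                  # fixed-effect weights
    fixed = sum(wi * y for wi, y in zip(w, log_rr)) / sum(w)  # fixed-effect pooled estimate
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, log_rr))  # Cochran's Q
    df = len(log_rr) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                             # between-trial variance
    w_star = [1 / (v + tau2) for v in var]                    # random-effects weights
    pooled = sum(wi * y for wi, y in zip(w_star, log_rr)) / sum(w_star)
    se = sqrt(1 / sum(w_star))
    return exp(pooled), (exp(pooled - 1.96 * se), exp(pooled + 1.96 * se))

# three hypothetical trials with log risk ratios and variances
dersimonian_laird([-0.30, -0.22, -0.10], [0.05, 0.03, 0.08])
```

When the trials are heterogeneous (Q well above its degrees of freedom), tau² grows and the weights flatten toward equality, which is why random-effects intervals are wider than fixed-effect ones.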
Dickerson, James H.; Krejci, Alex J.; Mendoza-Garcia, Adriana; ...
2015-08-01
Ordered assemblies of nanoparticles remain challenging to fabricate, yet could open the door to many potential applications of nanomaterials. Here, we demonstrate that locally ordered arrays of nanoparticles produced by electrophoretic deposition can be extended to long-range order among the constituents. Voronoi tessellations along with multiple statistical analyses show dramatic increases in order compared with previously reported assemblies formed through electric field-assisted assembly. Finally, based on subsequent physical measurements of the nanoparticles and the deposition system, we infer the underlying mechanisms that generate the increased order.
Luo, Sean X; Wall, Melanie; Covey, Lirio; Hu, Mei-Chen; Scodes, Jennifer M; Levin, Frances R; Nunes, Edward V; Winhusen, Theresa
2018-01-25
A double-blind, placebo-controlled randomized trial (NCT00253747) evaluating osmotic-release oral system methylphenidate (OROS-MPH) for smoking cessation revealed a significant interaction effect in which participants with higher baseline ADHD severity had better abstinence outcomes with OROS-MPH while participants with lower baseline ADHD severity had worse outcomes. The current report examines secondary outcomes that might bear on the mechanism for this differential treatment effect. Longitudinal analyses were conducted to evaluate the effect of OROS-MPH on three secondary outcomes (ADHD symptom severity, nicotine craving, and withdrawal) in the total sample (N = 255, 56% male) and in the high (N = 134) and low (N = 121) baseline ADHD severity groups. OROS-MPH significantly improved ADHD symptoms and nicotine withdrawal symptoms in the total sample, and exploratory analyses showed that, in both the higher and lower baseline severity groups, OROS-MPH statistically significantly improved these two outcomes. No overall effect on craving was detected, though exploratory analyses showed a statistically significant decrease in craving among the high-ADHD-severity participants on OROS-MPH. No treatment-by-baseline-severity interaction was detected for these outcomes. Methylphenidate improved secondary outcomes during smoking cessation independent of baseline ADHD severity, with no evident treatment-baseline severity interaction. Our results suggest that the divergent responses to smoking cessation treatment in the higher and lower severity groups cannot be explained by concordant divergence in craving, withdrawal, or ADHD symptom severity, and alternative hypotheses may need to be identified.
Supply Chain Collaboration: Information Sharing in a Tactical Operating Environment
2013-06-01
architecture, there are four tiers: Client (Web Application Clients ), Presentation (Web-Server), Processing (Application-Server), Data (Database...organization in each period. This data will be collected to analyze. i) Analyses and Validation: We will do a statistics test in this data, Pareto ...notes, outstanding deliveries, and inventory. i) Analyses and Validation: We will do a statistics test in this data, Pareto analyses and confirmation
Universal properties of mythological networks
NASA Astrophysics Data System (ADS)
Mac Carron, Pádraig; Kenna, Ralph
2012-07-01
As in statistical physics, the concept of universality plays an important, albeit qualitative, role in the field of comparative mythology. Here we apply statistical mechanical tools to analyse the networks underlying three iconic mythological narratives with a view to identifying common and distinguishing quantitative features. Of the three narratives, an Anglo-Saxon and a Greek text are mostly believed by antiquarians to be partly historically based, while the third, an Irish epic, is often considered to be fictional. Here we use network analysis in an attempt to discriminate real from imaginary social networks and place mythological narratives on the spectrum between them. This suggests that the perceived artificiality of the Irish narrative can be traced back to anomalous features associated with six characters. Speculating that these are amalgams of several entities or proxies renders the plausibility of the Irish text comparable to the others from a network-theoretic point of view.
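The kind of character-network statistics used in such analyses can be illustrated with a toy example (the co-appearance edges below are invented for demonstration, not the paper's datasets):

```python
from collections import defaultdict
from statistics import mean

def build_network(edges):
    """Build an undirected network as an adjacency map from co-appearance edges."""
    adj = defaultdict(set)
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    return adj

# invented toy co-appearances among Beowulf characters
edges = [("Beowulf", "Hrothgar"), ("Beowulf", "Wiglaf"),
         ("Beowulf", "Grendel"), ("Hrothgar", "Wealhtheow")]
net = build_network(edges)
degrees = {c: len(neighbours) for c, neighbours in net.items()}
# mean degree and the maximum-degree "hub" character
print(mean(degrees.values()), max(degrees, key=degrees.get))  # → 1.6 Beowulf
```

Real versus fictional networks are then distinguished by comparing such summaries (degree distributions, assortativity, clustering) against those of known social networks.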
Multiscale volatility duration characteristics on financial multi-continuum percolation dynamics
NASA Astrophysics Data System (ADS)
Wang, Min; Wang, Jun
A random stock price model based on the multi-continuum percolation system is developed to investigate the nonlinear dynamics of stock price volatility duration, in an attempt to explain various statistical facts found in financial data and to gain a deeper understanding of the mechanisms at work in the financial market. The continuum percolation system, usually referred to as a random coverage process or a Boolean model, is a member of a class of statistical physics systems. In this paper, multi-continuum percolation (with different values of the radius) is employed to model and reproduce the dispersal of information among investors. To test the validity of the proposed model, nonlinear analyses of the return volatility duration series are performed by multifractal detrending moving average analysis and Zipf analysis. The comparative empirical results indicate similar nonlinear behaviors for the proposed model and the actual Chinese stock market.
Teaching Classical Statistical Mechanics: A Simulation Approach.
ERIC Educational Resources Information Center
Sauer, G.
1981-01-01
Describes a one-dimensional model for an ideal gas to study development of disordered motion in Newtonian mechanics. A Monte Carlo procedure for simulation of the statistical ensemble of an ideal gas with fixed total energy is developed. Compares both approaches for a pseudoexperimental foundation of statistical mechanics. (Author/JN)
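A minimal sketch in the spirit of that simulation approach (an illustration under assumed details, not the article's actual procedure): N "gas particles" share a fixed total energy, and random pairwise exchanges drive the ensemble toward the exponential (Boltzmann) energy distribution.

```python
import random

def simulate(n=1000, total_energy=1000.0, steps=200000, seed=1):
    """Fixed-total-energy ensemble evolved by random pairwise energy exchanges."""
    random.seed(seed)
    e = [total_energy / n] * n  # start from equal sharing
    for _ in range(steps):
        i, j = random.randrange(n), random.randrange(n)
        if i == j:
            continue
        pool = e[i] + e[j]
        # Redistribute the pair's energy uniformly at random; the pair sum
        # (and hence the total energy) is conserved at every step.
        e[i] = random.uniform(0.0, pool)
        e[j] = pool - e[i]
    return e

energies = simulate()
# The histogram of `energies` approximates exp(-E/kT) with kT equal to the
# mean energy (here 1.0), the classical Boltzmann result.
```

This Monte Carlo view complements the Newtonian one: disorder emerges statistically from the exchange dynamics even though each step is a trivial, energy-conserving move.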
Research of Extension of the Life Cycle of Helicopter Rotor Blade in Hungary
2003-02-01
Radiography (DXR), and (iii) Vibration Diagnostics (VD) with Statistical Energy Analysis (SEA) were semi-simultaneously applied [1]. The used three...2.2. Vibration Diagnostics (VD) Parallel to the NDT measurements the Statistical Energy Analysis (SEA) as a vibration diagnostic tool were...noises were analysed with a dual-channel real time frequency analyser (BK2035). In addition to the Statistical Energy Analysis measurement a small
Hamel, Jean-Francois; Saulnier, Patrick; Pe, Madeline; Zikos, Efstathios; Musoro, Jammbe; Coens, Corneel; Bottomley, Andrew
2017-09-01
Over the last decades, health-related quality of life (HRQoL) end-points have become an important outcome of randomised controlled trials (RCTs). HRQoL methodology in RCTs has improved following international consensus recommendations. However, no international recommendations exist concerning the statistical analysis of such data. The aim of our study was to identify and characterise the quality of the statistical methods commonly used for analysing HRQoL data in cancer RCTs. Building on our recently published systematic review, we analysed a total of 33 published RCTs, covering the HRQoL methods reported in RCTs since 1991. We focussed on the ability of the methods to deal with the three major problems commonly encountered when analysing HRQoL data: their multidimensional and longitudinal structure and the commonly high rate of missing data. All studies reported HRQoL being assessed repeatedly over time for a period ranging from 2 to 36 months. Missing data were common, with compliance rates ranging from 45% to 90%. From the 33 studies considered, 12 different statistical methods were identified. Twenty-nine studies analysed each of the questionnaire sub-dimensions without type I error adjustment. Thirteen studies repeated the HRQoL analysis at each assessment time, again without type I error adjustment. Only 8 studies used methods suitable for repeated measurements. Our findings show a lack of consistency in statistical methods for analysing HRQoL data. Problems related to multiple comparisons were rarely considered, leading to a high risk of false positive results. It is therefore critical that international recommendations for improving such statistical practices are developed. Copyright © 2017. Published by Elsevier Ltd.
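A standard remedy for the type I error inflation this review identifies is a family-wise correction such as Holm's step-down procedure. A minimal sketch (illustrative p-values, not taken from any of the reviewed trials):

```python
def holm_bonferroni(p_values, alpha=0.05):
    """Holm's step-down adjustment for multiple comparisons.

    Returns a list of booleans (reject/retain) in the original order.
    Controls the family-wise error rate at `alpha`."""
    m = len(p_values)
    order = sorted(range(m), key=lambda k: p_values[k])
    reject = [False] * m
    for rank, idx in enumerate(order):
        if p_values[idx] <= alpha / (m - rank):
            reject[idx] = True
        else:
            break  # once one hypothesis is retained, all larger p-values are too
    return reject

# Example: four sub-dimension tests from one questionnaire
print(holm_bonferroni([0.01, 0.04, 0.03, 0.005]))  # → [True, False, False, True]
```

Holm's procedure is uniformly more powerful than plain Bonferroni while making no independence assumptions, which makes it a reasonable default for multidimensional HRQoL end-points.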
Sunspot activity and influenza pandemics: a statistical assessment of the purported association.
Towers, S
2017-10-01
Since 1978, a series of papers in the literature have claimed to find a significant association between sunspot activity and the timing of influenza pandemics. This paper examines these analyses, and attempts to recreate the three most recent statistical analyses by Ertel (1994), Tapping et al. (2001), and Yeung (2006), which all have purported to find a significant relationship between sunspot numbers and pandemic influenza. As will be discussed, each analysis had errors in the data. In addition, in each analysis arbitrary selections or assumptions were also made, and the authors did not assess the robustness of their analyses to changes in those arbitrary assumptions. Varying the arbitrary assumptions to other, equally valid, assumptions negates the claims of significance. Indeed, an arbitrary selection made in one of the analyses appears to have resulted in almost maximal apparent significance; changing it only slightly yields a null result. This analysis applies statistically rigorous methodology to examine the purported sunspot/pandemic link, using more statistically powerful un-binned analysis methods, rather than relying on arbitrarily binned data. The analyses are repeated using both the Wolf and Group sunspot numbers. In all cases, no statistically significant evidence of any association was found. However, while the focus in this particular analysis was on the purported relationship of influenza pandemics to sunspot activity, the faults found in the past analyses are common pitfalls; inattention to analysis reproducibility and robustness assessment are common problems in the sciences, that are unfortunately not noted often enough in review.
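The un-binned, robustness-oriented approach the paper advocates can be illustrated with a permutation test, which replaces arbitrary binning choices with resampling. A hedged sketch (the test statistic and data here are illustrative, not the paper's):

```python
import random

def permutation_pvalue(x, y, n_perm=2000, seed=0):
    """Two-sided permutation test for association between x and y, using
    the absolute sample covariance as an un-binned test statistic.
    No arbitrary bin widths or thresholds enter the analysis."""
    def abs_cov(a, b):
        n = len(a)
        ma, mb = sum(a) / n, sum(b) / n
        return abs(sum((ai - ma) * (bi - mb) for ai, bi in zip(a, b)) / n)
    rng = random.Random(seed)
    observed = abs_cov(x, y)
    y_shuffled = list(y)
    count = 0
    for _ in range(n_perm):
        rng.shuffle(y_shuffled)
        if abs_cov(x, y_shuffled) >= observed:
            count += 1
    return (count + 1) / (n_perm + 1)  # add-one rule avoids p = 0

# Strongly associated toy data should give a small p-value
x = list(range(20))
y = [2 * v for v in x]
print(permutation_pvalue(x, y))
```

Robustness can then be checked by repeating the test under each equally valid analysis choice (e.g. Wolf vs. Group sunspot numbers), exactly the exercise the paper performs.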
Zhang, Baoping; Li, Long; Li, Zhiqiang; Liu, Yang; Zhang, Hong; Wang, Jizeng
2016-01-01
An apoptotic model was established based on the results of five hepatocellular carcinoma (HCC) cell lines irradiated with carbon ions to investigate the coupling interplay between apoptotic signaling and morphological and mechanical cellular remodeling. The expression levels of key apoptotic proteins and the changes in morphological characteristics and mechanical properties were systematically examined in the irradiated HCC lines. We observed that caspase-3 was activated and that the Bax/Bcl-2 ratio was significantly increased over time. Cellular morphology and mechanics analyses indicated monotonic decreases in spatial sizes, an increase in surface roughness, a considerable reduction in stiffness, and disassembly of the cytoskeletal architecture. A theoretical model of apoptosis revealed that mechanical changes in cells induce the characteristic cellular budding of apoptotic bodies. Statistical analysis indicated that the projected area, stiffness, and cytoskeletal density of the irradiated cells were positively correlated, whereas stiffness and caspase-3 expression were negatively correlated, suggesting a tight coupling interplay between the cellular structures, mechanical properties, and apoptotic protein levels. These results help to clarify a novel arbitration mechanism of cellular demise induced by carbon ions. This biomechanics strategy for evaluating apoptosis contributes to our understanding of cancer-killing mechanisms in the context of carbon ion radiotherapy. PMID:27731354
NASA Astrophysics Data System (ADS)
Kaplan, D. A.; Casey, S. T.; Cohen, M. J.; Acharya, S.; Jawitz, J. W.
2016-12-01
A century of hydrologic modification has altered the physical and biological drivers of landscape processes in the Everglades (Florida, USA). Restoring the ridge-slough patterned landscape, a dominant feature of the historical system, is a priority, but requires an understanding of pattern genesis and degradation mechanisms. Physical experiments to evaluate alternative pattern formation mechanisms are limited by the long time scales of peat accumulation and loss, necessitating model-based comparisons, where support for a particular mechanism is based on model replication of extant patterning and trajectories of degradation. However, multiple mechanisms yield patch elongation in the direction of historical flow (a central feature of ridge-slough patterning), limiting the utility of that characteristic for discriminating among alternatives. Using data from vegetation maps, we investigated the statistical features of ridge-slough spatial patterning (ridge density, patch perimeter, elongation, patch-size distributions, and spatial periodicity) to establish more rigorous criteria for evaluating model performance and to inform controls on pattern variation across the contemporary system. Two independent analyses (2-D periodograms and patch size distributions) provide strong evidence against regular patterning, with the landscape exhibiting neither a characteristic wavelength nor a characteristic patch size, both of which are expected under conditions that produce regular patterns. Rather, landscape properties suggest robust scale-free patterning, indicating genesis from the coupled effects of local facilitation and a global negative feedback operating uniformly at the landscape-scale. This finding challenges widespread invocation of scale-dependent negative feedbacks for explaining ridge-slough pattern origins. 
These results help discern among genesis mechanisms and provide an improved statistical description of the landscape that can be used to compare among model outputs, as well as to assess the success of future restoration projects.
Sander, Edward A; Lynch, Kaari A; Boyce, Steven T
2014-05-01
Engineered skin substitutes (ESSs) have been reported to close full-thickness burn wounds but are subject to loss from mechanical shear due to their deficiencies in tensile strength and elasticity. Hypothetically, if the mechanical properties of ESS matched those of native skin, losses due to shear or fracture could be reduced. To consider modifications of the composition of ESS to improve homology with native skin, biomechanical analyses of the current composition of ESS were performed. ESSs consist of a degradable biopolymer scaffold of type I collagen and chondroitin-sulfate (CGS) that is populated sequentially with cultured human dermal fibroblasts (hF) and epidermal keratinocytes (hK). In the current study, the hydrated biopolymer scaffold (CGS), the scaffold populated with hF dermal skin substitute (DSS), or the complete ESS were evaluated mechanically for linear stiffness (N/mm), ultimate tensile load at failure (N), maximum extension at failure (mm), and energy absorbed up to the point of failure (N-mm). These biomechanical end points were also used to evaluate ESS at six weeks after grafting to full-thickness skin wounds in athymic mice and compared to murine autograft or excised murine skin. The data showed statistically significant differences (p <0.05) between ESS in vitro and after grafting for all four structural properties. Grafted ESS differed statistically from murine autograft with respect to maximum extension at failure, and from intact murine skin with respect to linear stiffness and maximum extension. These results demonstrate rapid changes in mechanical properties of ESS after grafting that are comparable to murine autograft. These values provide instruction for improvement of the biomechanical properties of ESS in vitro that may reduce clinical morbidity from graft loss.
Hayhoe, Richard P G; Lentjes, Marleen A H; Luben, Robert N; Khaw, Kay-Tee; Welch, Ailsa A
2015-08-01
In our aging population, maintenance of bone health is critical to reduce the risk of osteoporosis and potentially debilitating consequences of fractures in older individuals. Among modifiable lifestyle and dietary factors, dietary magnesium and potassium intakes are postulated to influence bone quality and osteoporosis, principally via calcium-dependent alteration of bone structure and turnover. We investigated the influence of dietary magnesium and potassium intakes, as well as circulating magnesium, on bone density status and fracture risk in an adult population in the United Kingdom. A random subset of 4000 individuals from the European Prospective Investigation into Cancer and Nutrition-Norfolk cohort of 25,639 men and women with baseline data was used for bone density cross-sectional analyses and combined with fracture cases (n = 1502) for fracture case-cohort longitudinal analyses (mean follow-up 13.4 y). Relevant biological, lifestyle, and dietary covariates were used in multivariate regression analyses to determine associations between dietary magnesium and potassium intakes and calcaneal broadband ultrasound attenuation (BUA), as well as in Prentice-weighted Cox regression to determine associated risk of fracture. Separate analyses, excluding dietary covariates, investigated associations of BUA and fractures with serum magnesium concentration. Statistically significant positive trends in calcaneal BUA for women (n = 1360) but not men (n = 968) were apparent across increasing quintiles of magnesium plus potassium (Mg+K) z score intake (P = 0.03) or potassium intake alone (P = 0.04). Reduced hip fracture risk in both men (n = 1958) and women (n = 2755) was evident for individuals in specific Mg+K z score intake quintiles compared with the lowest. Statistically significant trends in fracture risk in men across serum magnesium concentration groups were apparent for spine fractures (P = 0.02) and total hip, spine, and wrist fractures (P = 0.02). 
None of these individual statistically significant associations remained after adjustment for multiple testing. These findings enhance the limited literature studying the association of magnesium and potassium with bone density and demonstrate that further investigation is warranted into the mechanisms involved and the potential protective role against osteoporosis. © 2015 American Society for Nutrition.
Jin, Zhichao; Yu, Danghui; Zhang, Luoman; Meng, Hong; Lu, Jian; Gao, Qingbin; Cao, Yang; Ma, Xiuqiang; Wu, Cheng; He, Qian; Wang, Rui; He, Jia
2010-01-01
Background: High quality clinical research not only requires advanced professional knowledge, but also needs sound study design and correct statistical analyses. The number of clinical research articles published in Chinese medical journals has increased immensely in the past decade, but study design quality and statistical analyses have remained suboptimal. The aim of this investigation was to gather evidence on the quality of study design and statistical analyses in clinical research conducted in China during the first decade of the new millennium. Methodology/Principal Findings: Ten (10) leading Chinese medical journals were selected and all original articles published in 1998 (N = 1,335) and 2008 (N = 1,578) were thoroughly categorized and reviewed. A well-defined and validated checklist on study design, statistical analyses, results presentation, and interpretation was used for review and evaluation. Main outcomes were the frequencies of different types of study design, error/defect proportion in design and statistical analyses, and implementation of CONSORT in randomized clinical trials. From 1998 to 2008: The error/defect proportion in statistical analyses decreased significantly (χ² = 12.03, p<0.001), 59.8% (545/1,335) in 1998 compared to 52.2% (664/1,578) in 2008. The overall error/defect proportion of study design also decreased (χ² = 21.22, p<0.001), 50.9% (680/1,335) compared to 42.4% (669/1,578). In 2008, the proportion of randomized clinical trials remained in the low single digits (3.8%, 60/1,578), with two-thirds showing poor results reporting (defects in 44 papers, 73.3%). Nearly half of the published studies were retrospective in nature, 49.3% (658/1,335) in 1998 compared to 48.2% (761/1,578) in 2008. Decreases in defect proportions were observed in both results presentation (χ² = 93.26, p<0.001), 92.7% (945/1,019) compared to 78.2% (1023/1,309), and interpretation (χ² = 27.26, p<0.001), 9.7% (99/1,019) compared to 4.3% (56/1,309), although some serious defects persisted. 
Conclusions/Significance: Chinese medical research seems to have made significant progress regarding statistical analyses, but there remains ample room for improvement regarding study designs. Retrospective clinical studies are the most often used design, whereas randomized clinical trials are rare and often show methodological weaknesses. Urgent implementation of the CONSORT statement is imperative. PMID:20520824
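The year-to-year comparisons of error/defect proportions reported above are chi-square tests on 2×2 tables. A minimal sketch of the statistic (illustrative counts, not the paper's data):

```python
def chi_square_2x2(table):
    """Pearson chi-square statistic for a 2x2 contingency table
    (e.g. error/defect counts vs. year of publication)."""
    (a, b), (c, d) = table
    n = a + b + c + d
    row1, row2 = a + b, c + d
    col1, col2 = a + c, b + d
    chi2 = 0.0
    for obs, row, col in ((a, row1, col1), (b, row1, col2),
                          (c, row2, col1), (d, row2, col2)):
        expected = row * col / n
        chi2 += (obs - expected) ** 2 / expected
    return chi2

print(chi_square_2x2([[10, 20], [20, 10]]))  # → 6.666...
```

With 1 degree of freedom, a statistic above 3.84 corresponds to p < 0.05; the values reported in the abstract (12.03 to 93.26) are all far beyond that threshold.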
Transfer of mechanical energy during the shot put.
Błażkiewicz, Michalina; Łysoń, Barbara; Chmielewski, Adam; Wit, Andrzej
2016-09-01
The aim of this study was to analyse transfer of mechanical energy between body segments during the glide shot put. A group of eight elite throwers from the Polish National Team was analysed in the study. Motion analysis of each throw was recorded using an optoelectronic Vicon system composed of nine infrared camcorders and Kistler force plates. The power and energy were computed for the phase of final acceleration of the glide shot put. The data were normalized with respect to time using the algorithm of the fifth order spline and their values were interpolated with respect to the percentage of total time, assuming that the time of the final weight acceleration movement was different for each putter. Statistically significant transfer was found in the study group between the following segments: Right Knee - Right Hip (p = 0.0035), Left Hip - Torso (p = 0.0201), Torso - Right Shoulder (p = 0.0122) and Right Elbow - Right Wrist (p = 0.0001). Furthermore, the results of cluster analysis showed that the kinetic chain used during the final shot acceleration movement had two different models. Differences between the groups were revealed mainly in the energy generated by the hips and trunk.
Use of Statistical Analyses in the Ophthalmic Literature
Lisboa, Renato; Meira-Freitas, Daniel; Tatham, Andrew J.; Marvasti, Amir H.; Sharpsten, Lucie; Medeiros, Felipe A.
2014-01-01
Purpose: To identify the most commonly used statistical analyses in the ophthalmic literature and to determine the likely gain in comprehension of the literature that readers could expect if they were to sequentially add knowledge of more advanced techniques to their statistical repertoire. Design: Cross-sectional study. Methods: All articles published from January 2012 to December 2012 in Ophthalmology, American Journal of Ophthalmology and Archives of Ophthalmology were reviewed. A total of 780 peer-reviewed articles were included. Two reviewers examined each article and assigned categories to each one depending on the type of statistical analyses used. Discrepancies between reviewers were resolved by consensus. Main Outcome Measures: Total number and percentage of articles containing each category of statistical analysis were obtained. Additionally, we estimated the accumulated number and percentage of articles that a reader would be expected to be able to interpret depending on their statistical repertoire. Results: Readers with little or no statistical knowledge would be expected to be able to interpret the statistical methods presented in only 20.8% of articles. In order to understand more than half (51.4%) of the articles published, readers were expected to be familiar with at least 15 different statistical methods. Knowledge of 21 categories of statistical methods was necessary to comprehend 70.9% of articles, while knowledge of more than 29 categories was necessary to comprehend more than 90% of articles. Articles in retina and glaucoma subspecialties showed a tendency for using more complex analysis when compared to cornea. Conclusions: Readers of clinical journals in ophthalmology need to have substantial knowledge of statistical methodology to understand the results of published studies in the literature. 
The frequency of use of complex statistical analyses also indicates that those involved in the editorial peer-review process must have sound statistical knowledge in order to critically appraise articles submitted for publication. The results of this study could provide guidance to direct the statistical learning of clinical ophthalmologists, researchers and educators involved in the design of courses for residents and medical students. PMID:24612977
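The cumulative-comprehension estimate described in the abstract can be computed greedily: track the set of methods a reader knows and count the articles whose required methods are all covered. A toy sketch (article data invented for illustration, not the study's):

```python
def cumulative_comprehension(articles, method_order):
    """Given each article's set of required statistical methods and an
    ordering in which a reader learns methods, return the fraction of
    articles fully interpretable after learning each successive method."""
    known = set()
    fractions = []
    for method in method_order:
        known.add(method)
        readable = sum(1 for required in articles if required <= known)
        fractions.append(readable / len(articles))
    return fractions

# Four toy articles; the last requires no statistics at all
articles = [{"t-test"}, {"t-test", "anova"}, {"regression"}, set()]
print(cumulative_comprehension(articles, ["t-test", "anova", "regression"]))
# → [0.5, 0.75, 1.0]
```

Ordering the methods from most to least frequently required yields the fastest-growing comprehension curve, which is presumably how a guided statistics curriculum would be sequenced.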
Josefsson, Torbjörn; Ivarsson, Andreas; Lindwall, Magnus; Gustafsson, Henrik; Stenling, Andreas; Böröy, Jan; Mattsson, Emil; Carnebratt, Jakob; Sevholt, Simon; Falkevik, Emil
2017-01-01
The main objective of the project was to examine a proposed theoretical model of mindfulness mechanisms in sports. We conducted two studies (the first study using a cross-sectional design and the second a longitudinal design) to investigate if rumination and emotion regulation mediate the relation between dispositional mindfulness and sport-specific coping. Two hundred and forty-two young elite athletes, drawn from various sports, were recruited for the cross-sectional study. For the longitudinal study, 65 elite athletes were recruited. All analyses were performed using Bayesian statistics. The path analyses showed credible indirect effects of dispositional mindfulness on coping via rumination and emotion regulation in both the cross-sectional study and the longitudinal study. Additionally, the results in both studies showed credible direct effects of dispositional mindfulness on rumination and emotion regulation. Further, credible direct effects of emotion regulation as well as rumination on coping were also found in both studies. Our findings support the theoretical model, indicating that rumination and emotion regulation function as essential mechanisms in the relation between dispositional mindfulness and sport-specific coping skills. Increased dispositional mindfulness in competitive athletes (i.e. by practicing mindfulness) may lead to reductions in rumination, as well as an improved capacity to regulate negative emotions. By doing so, athletes may improve their sport-related coping skills, and thereby enhance athletic performance.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dharmarajan, Guha; Beasley, James C.; Beatty, William S.
Many aspects of parasite biology critically depend on their hosts, and understanding how host-parasite populations are co-structured can help improve our understanding of the ecology of parasites, their hosts, and host-parasite interactions. Here, this study utilized genetic data collected from raccoons (Procyon lotor) and a specialist parasite, the raccoon tick (Ixodes texanus), to test for genetic co-structuring of host-parasite populations at both landscape and host scales. At the landscape scale, our analyses revealed a significant correlation between genetic and geographic distance matrices (i.e., isolation by distance) in ticks, but not their hosts. While there are several mechanisms that could lead to a stronger pattern of isolation by distance in tick vs. raccoon datasets, our analyses suggest that at least one reason for the above pattern is the substantial increase in statistical power (due to the ≈8-fold increase in sample size) afforded by sampling parasites. Host-scale analyses indicated higher relatedness between ticks sampled from related vs. unrelated raccoons trapped within the same habitat patch, a pattern likely driven by increased contact rates between related hosts. Lastly, by utilizing fine-scale genetic data from both parasites and hosts, our analyses help improve our understanding of epidemiology and host ecology.
Chung, Dongjun; Kim, Hang J; Zhao, Hongyu
2017-02-01
Genome-wide association studies (GWAS) have identified tens of thousands of genetic variants associated with hundreds of phenotypes and diseases, which have provided clinical and medical benefits to patients with novel biomarkers and therapeutic targets. However, identification of risk variants associated with complex diseases remains challenging as they are often affected by many genetic variants with small or moderate effects. There has been accumulating evidence suggesting that different complex traits share common risk basis, namely pleiotropy. Recently, several statistical methods have been developed to improve statistical power to identify risk variants for complex traits through a joint analysis of multiple GWAS datasets by leveraging pleiotropy. While these methods were shown to improve statistical power for association mapping compared to separate analyses, they are still limited in the number of phenotypes that can be integrated. In order to address this challenge, in this paper, we propose a novel statistical framework, graph-GPA, to integrate a large number of GWAS datasets for multiple phenotypes using a hidden Markov random field approach. Application of graph-GPA to a joint analysis of GWAS datasets for 12 phenotypes shows that graph-GPA improves statistical power to identify risk variants compared to statistical methods based on smaller number of GWAS datasets. In addition, graph-GPA also promotes better understanding of genetic mechanisms shared among phenotypes, which can potentially be useful for the development of improved diagnosis and therapeutics. The R implementation of graph-GPA is currently available at https://dongjunchung.github.io/GGPA/.
NASA Astrophysics Data System (ADS)
Corten-Gualtieri, Pascale; Ritter, Christian; Plumat, Jim; Keunings, Roland; Lebrun, Marcel; Raucent, Benoit
2016-07-01
Most students enter their first university physics course with a system of beliefs and intuitions which are often inconsistent with the Newtonian frame of reference. This article presents an experiment of collaborative learning aiming at helping first-year students in an engineering programme to transition from their naïve intuition about dynamics to the Newtonian way of thinking. In a first activity, students were asked to critically analyse the contents of two video clips from the point of view of Newtonian mechanics. In a second activity, students had to design and realise their own video clip to illustrate a given aspect of Newtonian mechanics. The preparation of the scenario for the second activity required looking up and assimilating scientific knowledge. The efficiency of the activity was assessed on an enhanced version of the statistical analysis method proposed by Hestenes and Halloun, which relies on a pre-test and a post-test to measure individual learning.
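The pre-test/post-test comparison mentioned above is commonly summarised in physics education research by Hake's normalized gain; the article uses an enhanced variant of the Hestenes-Halloun analysis, so the basic form below is only indicative:

```python
def normalized_gain(pre_pct, post_pct):
    """Hake's normalized gain g = (post - pre) / (100 - pre): the fraction
    of the maximum possible improvement actually achieved by the class.
    Scores are percentages on the same concept inventory."""
    if pre_pct >= 100:
        raise ValueError("pre-test score already at ceiling")
    return (post_pct - pre_pct) / (100 - pre_pct)

print(normalized_gain(40, 70))  # → 0.5
```

Because g is normalised by the room left for improvement, it allows comparison between classes that start at different pre-test levels.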
de Blasio, Birgitte Freiesleben; Seierstad, Taral Guldahl; Aalen, Odd O
2011-01-01
Preferential attachment is a proportionate growth process in networks, where nodes receive new links in proportion to their current degree. Preferential attachment is a popular generative mechanism to explain the widespread observation of power-law-distributed networks. An alternative explanation for the phenomenon is a randomly grown network with large individual variation in growth rates among the nodes (frailty). We derive analytically the distribution of individual rates, which will reproduce the connectivity distribution that is obtained from a general preferential attachment process (Yule process), and the structural differences between the two types of graphs are examined by simulations. We present a statistical test to distinguish the two generative mechanisms from each other and we apply the test to both simulated data and two real data sets of scientific citation and sexual partner networks. The findings from the latter analyses argue for frailty effects as an important mechanism underlying the dynamics of complex networks. PMID:21572513
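The preferential attachment (Yule) process analysed above is easy to simulate with the classic repeated-endpoints trick, where sampling uniformly from a list of edge endpoints is automatically degree-proportional. An illustrative sketch (not the authors' code):

```python
import random

def preferential_attachment(n_nodes, seed=0):
    """Grow a network by preferential attachment: each new node links to
    one existing node chosen with probability proportional to its degree.
    Drawing uniformly from the list of all edge endpoints implements
    degree-proportional sampling without explicit weights."""
    rng = random.Random(seed)
    endpoints = [0, 1]           # start from a single edge 0-1
    degree = {0: 1, 1: 1}
    for new in range(2, n_nodes):
        target = rng.choice(endpoints)
        endpoints += [new, target]
        degree[new] = 1
        degree[target] += 1
    return degree

deg = preferential_attachment(5000)
print(max(deg.values()), min(deg.values()))  # heavy tail: a few hubs, many leaves
```

A frailty alternative would instead assign each node a fixed random growth rate and attach in proportion to that rate; both mechanisms produce heavy-tailed degree distributions, which is why the paper's statistical test is needed to tell them apart.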
Evaluation of mechanical and thermal properties of commonly used denture base resins.
Phoenix, Rodney D; Mansueto, Michael A; Ackerman, Neal A; Jones, Robert E
2004-03-01
The purpose of this investigation was to evaluate and compare the mechanical and thermal properties of 6 commonly used polymethyl methacrylate denture base resins. Sorption, solubility, color stability, adaptation, flexural stiffness, and hardness were assessed to determine compliance with ADA Specification No. 12. Thermal assessments were performed using differential scanning calorimetry and dynamic mechanical analysis. Results were assessed using statistical and observational analyses. All materials satisfied ADA requirements for sorption, solubility, and color stability. Adaptation testing indicated that microwave-activated systems provided better adaptation to associated casts than conventional heat-activated resins. According to flexural testing results, microwaveable resins were relatively stiff, while rubber-modified resins were more flexible. Differential scanning calorimetry indicated that microwave-activated systems were more completely polymerized than conventional heat-activated materials. The microwaveable resins displayed better adaptation, greater stiffness, and greater surface hardness than other denture base resins included in this investigation. Elastomeric toughening agents yielded decreased stiffness, decreased surface hardness, and decreased glass transition temperatures.
NASA Astrophysics Data System (ADS)
Coulibaly, S.; Clerc, M. G.; Selmi, F.; Barbay, S.
2017-02-01
The occurrence of extreme events in a spatially extended microcavity laser has been recently reported [Selmi et al., Phys. Rev. Lett. 116, 013901 (2016), 10.1103/PhysRevLett.116.013901] to be correlated to emergence of spatiotemporal chaos. In this dissipative system, the role of spatial coupling through diffraction is essential to observe the onset of spatiotemporal complexity. We investigate further the formation mechanism of extreme events by comparing the statistical and dynamical analyses. Experimental measurements together with numerical simulations allow us to assign the quasiperiodicity mechanism as the route to spatiotemporal chaos in this system. Moreover, by investigating the fine structure of the maximum Lyapunov exponent, of the Lyapunov spectrum, and of the Kaplan-Yorke dimension of the chaotic attractor, we are able to deduce that intermittency plays a key role in the proportion of extreme events measured. We assign the observed mechanism of generation of extreme events to quasiperiodic extended spatiotemporal intermittency.
Rotenone and paraquat perturb dopamine metabolism: a computational analysis of pesticide toxicity
Qi, Zhen; Miller, Gary W.; Voit, Eberhard O.
2014-01-01
Pesticides, such as rotenone and paraquat, are suspected in the pathogenesis of Parkinson’s disease (PD), whose hallmark is the progressive loss of dopaminergic neurons in the substantia nigra pars compacta. Thus, compounds expected to play a role in the pathogenesis of PD will likely impact the function of dopaminergic neurons. To explore the relationship between pesticide exposure and dopaminergic toxicity, we developed a custom-tailored mathematical model of dopamine metabolism and utilized it to infer potential mechanisms underlying the toxicity of rotenone and paraquat, asking how these pesticides perturb specific processes. We performed two types of analyses, which are conceptually different and complement each other. The first analysis, a purely algebraic reverse engineering approach, analytically and deterministically computes the altered profile of enzyme activities that characterize the effects of a pesticide. The second method consists of large-scale Monte Carlo simulations that statistically reveal possible mechanisms of pesticides. The results from the reverse engineering approach show that rotenone and paraquat exposures lead to distinctly different flux perturbations. Rotenone seems to affect all fluxes associated with dopamine compartmentalization, whereas paraquat exposure perturbs fluxes associated with dopamine and its breakdown metabolites. The statistical results of the Monte-Carlo analysis suggest several specific mechanisms. The findings are interesting, because no a priori assumptions are made regarding specific pesticide actions, and all parameters characterizing the processes in the dopamine model are treated in an unbiased manner. Our results show how approaches from computational systems biology can help identify mechanisms underlying the toxicity of pesticide exposure. PMID:24269752
Global atmospheric circulation statistics, 1000-1 mb
NASA Technical Reports Server (NTRS)
Randel, William J.
1992-01-01
The atlas presents atmospheric general circulation statistics derived from twelve years (1979-90) of daily National Meteorological Center (NMC) operational geopotential height analyses; it is an update of a prior atlas using data over 1979-1986. These global analyses are available on pressure levels covering 1000-1 mb (approximately 0-50 km). The geopotential grids are a combined product of the Climate Analysis Center (which produces analyses over 70-1 mb) and operational NMC analyses (over 1000-100 mb). Balance horizontal winds and hydrostatic temperatures are derived from the geopotential fields.
Mortality inequality in two native population groups.
Saarela, Jan; Finnäs, Fjalar
2005-11-01
A sample of people aged 40-67 years, taken from a longitudinal register compiled by Statistics Finland, is used to analyse mortality differences between Swedish speakers and Finnish speakers in Finland. Finnish speakers are known to have higher death rates than Swedish speakers. The purpose is to explore whether labour-market experience and partnership status, treated as proxies for measures of variation in health-related characteristics, are related to the mortality differential. Persons who are single, disability pensioners, and those having experienced unemployment are found to have substantially higher death rates than those with a partner and employed persons. Swedish speakers have a more favourable distribution on both variables, which thus notably helps to reduce the Finnish-Swedish mortality gradient. A conclusion from this study is that future analyses on the topic should focus on mechanisms that bring a greater proportion of Finnish speakers into the groups with poor health or supposed unhealthy behaviour.
Real-time quality assurance testing using photonic techniques: Application to iodine water system
NASA Technical Reports Server (NTRS)
Arendale, W. F.; Hatcher, Richard; Garlington, Yadilett; Harwell, Jack; Everett, Tracey
1990-01-01
A feasibility study has been conducted on the use of inspection systems incorporating photonic sensors and multivariate analyses to provide an instrumentation system that, in real time, assures quality and that the system is in control. A system is in control when the near future of the product quality is predictable. Off-line chemical analyses can be used for a chemical process when slow kinetics allows time to take a sample to the laboratory and the system provides a recovery mechanism that returns the system to statistical control without intervention of the operator. The objective of this study has been the implementation of do-it-right-the-first-time and just-in-time philosophies. The Environmental Control and Life Support System (ECLSS) water reclamation system, which adds iodine for biocidal control, is an ideal candidate for the study and implementation of do-it-right-the-first-time technologies.
Ice tracking techniques, implementation, performance, and applications
NASA Technical Reports Server (NTRS)
Rothrock, D. A.; Carsey, F. D.; Curlander, J. C.; Holt, B.; Kwok, R.; Weeks, W. F.
1992-01-01
Present techniques of ice tracking make use both of cross-correlation and of edge tracking, the former being more successful in heavy pack ice, the latter being critical for the broken ice of the pack margins. Algorithms must assume some constraints on the spatial variations of displacements to eliminate fliers, but must avoid introducing any errors into the spatial statistics of the measured displacement field. We draw our illustrations from the implementation of an automated tracking system for kinematic analyses of ERS-1 and JERS-1 SAR imagery at the University of Alaska - the Alaska SAR Facility's Geophysical Processor System. Analyses of the ice kinematic data that might have some general interest to analysts of cloud-derived wind fields are the spatial structure of the fields, and the evaluation and variability of average deformation and its invariants: divergence, vorticity and shear. Many problems in sea ice dynamics and mechanics can be addressed with the kinematic data from SAR.
Timing, Emission, and Spectral Studies of Rotating Radio Transients
NASA Astrophysics Data System (ADS)
Cui, Bingyi; McLaughlin, Maura
2018-01-01
Rotating Radio Transients (RRATs) are a class of pulsars with unusually sporadic pulse emissions which were discovered only through their single pulses. We report new timing solutions, pulse amplitude measurements, and spectral measurements for a number of RRATs. Timing solutions provide derived physical properties of these sources, allowing comparison with other classes of neutron stars. Analyses of single pulse properties also contribute to this study by measuring composite profiles and flux density distributions, which can constrain the RRATs' emission mechanism. We make statistical comparisons between RRATs and canonical pulsars and show that with the same spin period, RRATs are more likely to have larger period derivatives, which may indicate a higher magnetic field. Spectral analyses were also performed in order to compare spectra with those of other source classes. We describe this work and plans for application to much larger numbers of sources in the future.
Development of the Statistical Reasoning in Biology Concept Inventory (SRBCI)
Deane, Thomas; Nomme, Kathy; Jeffery, Erica; Pollock, Carol; Birol, Gülnur
2016-01-01
We followed established best practices in concept inventory design and developed a 12-item inventory to assess student ability in statistical reasoning in biology (Statistical Reasoning in Biology Concept Inventory [SRBCI]). It is important to assess student thinking in this conceptual area, because it is a fundamental requirement of being statistically literate and associated skills are needed in almost all walks of life. Despite this, previous work shows that non–expert-like thinking in statistical reasoning is common, even after instruction. As science educators, our goal should be to move students along a novice-to-expert spectrum, which could be achieved with growing experience in statistical reasoning. We used item response theory analyses (the one-parameter Rasch model and associated analyses) to assess responses gathered from biology students in two populations at a large research university in Canada in order to test SRBCI’s robustness and sensitivity in capturing useful data relating to the students’ conceptual ability in statistical reasoning. Our analyses indicated that SRBCI is a unidimensional construct, with items that vary widely in difficulty and provide useful information about such student ability. SRBCI should be useful as a diagnostic tool in a variety of biology settings and as a means of measuring the success of teaching interventions designed to improve statistical reasoning skills. PMID:26903497
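The one-parameter Rasch model used in the SRBCI analyses specifies the probability of a correct response as a logistic function of the gap between student ability theta and item difficulty b. A minimal sketch, with invented difficulties and ability rather than SRBCI item parameters:

```python
import math

def p_correct(theta, b):
    """Rasch (1PL) probability of a correct response: logistic in (theta - b)."""
    return 1 / (1 + math.exp(-(theta - b)))

difficulties = [-1.5, -0.5, 0.0, 0.8, 2.0]   # five hypothetical items, easy -> hard
theta = 0.5                                   # one hypothetical student's ability

probs = [p_correct(theta, b) for b in difficulties]
print([round(p, 2) for p in probs])           # success probability falls as difficulty rises
```

Because the model has a single discrimination parameter, items differ only in difficulty, which is what lets the inventory place items and students on one common scale.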
O'Hara, Jane Kathryn; Armitage, Gerry; Reynolds, Caroline; Coulson, Claire; Thorp, Liz; Din, Ikhlaq; Watt, Ian; Wright, John
2017-01-01
Emergent evidence suggests that patients can identify and report safety issues while in hospital. However, little is known about the best method for collecting information from patients about safety concerns. This study presents an exploratory pilot of three mechanisms for collecting data on safety concerns from patients during their hospital stay. Three mechanisms for capturing safety concerns were coproduced with healthcare professionals and patients, before being tested in an exploratory trial using cluster randomisation at the ward level. Nine wards participated, with each mechanism being tested over a 3-month study period. Patients were asked to feed back safety concerns via the mechanism on their ward (interviewing at their bedside, paper-based form or patient safety 'hotline'). Safety concerns were subjected to a two-stage review process to identify those that would meet the definition of a patient safety incident. Differences between mechanisms on a range of outcomes were analysed using inferential statistics. Safety concerns were thematically analysed to develop reporting categories. 178 patients were recruited. Patients in the face-to-face interviewing condition provided significantly more safety concerns per patient (1.91) compared with the paper-based form (0.92) and the patient safety hotline (0.43). They were also significantly more likely to report one or more concerns, with 64% reporting via the face-to-face mechanism, compared with 41% via the paper-based form and 19% via the patient safety hotline. No mechanism differed significantly in the number of classified patient safety incidents or physician-rated preventability and severity. Interviewing at the patient's bedside is likely to be the most effective means of gathering safety concerns from inpatients, potentially providing an opportunity for health services to gather patient feedback about safety from their perspective. Published by the BMJ Publishing Group Limited. 
Education on invasive mechanical ventilation involving intensive care nurses: a systematic review.
Guilhermino, Michelle C; Inder, Kerry J; Sundin, Deborah
2018-03-26
Intensive care unit nurses are critical for managing mechanical ventilation. Continuing education is essential in building and maintaining nurses' knowledge and skills, potentially improving patient outcomes. The aim of this study was to determine whether continuing education programmes on invasive mechanical ventilation involving intensive care unit nurses are effective in improving patient outcomes. Five electronic databases were searched from 2001 to 2016 using keywords such as mechanical ventilation, nursing and education. Inclusion criteria were invasive mechanical ventilation continuing education programmes that involved nurses and measured patient outcomes. Primary outcomes were intensive care unit mortality and in-hospital mortality. Secondary outcomes included hospital and intensive care unit length of stay, length of intubation, failed weaning trials, re-intubation incidence, ventilator-associated pneumonia rate and lung-protective ventilator strategies. Studies were excluded if they excluded nurses, patients were ventilated for less than 24 h, the education content focused on protocol implementation or oral care exclusively or the outcomes were participant satisfaction. Quality was assessed by two reviewers using an education intervention critical appraisal worksheet and a risk of bias assessment tool. Data were extracted independently by two reviewers and analysed narratively due to heterogeneity. Twelve studies met the inclusion criteria for full review: 11 pre- and post-intervention observational and 1 quasi-experimental design. Studies reported statistically significant reductions in hospital length of stay, length of intubation, ventilator-associated pneumonia rates, failed weaning trials and improvements in lung-protective ventilation compliance. Non-statistically significant results were reported for in-hospital and intensive care unit mortality, re-intubation and intensive care unit length of stay.
Limited evidence of the effectiveness of continuing education programmes on mechanical ventilation involving nurses in improving patient outcomes exists. Comprehensive continuing education is required. Well-designed trials are required to confirm that comprehensive continuing education involving intensive care nurses about mechanical ventilation improves patient outcomes. © 2018 British Association of Critical Care Nurses.
Secondary Analysis of National Longitudinal Transition Study 2 Data
ERIC Educational Resources Information Center
Hicks, Tyler A.; Knollman, Greg A.
2015-01-01
This review examines published secondary analyses of National Longitudinal Transition Study 2 (NLTS2) data, with a primary focus upon statistical objectives, paradigms, inferences, and methods. Its primary purpose was to determine which statistical techniques have been common in secondary analyses of NLTS2 data. The review begins with an…
A Nonparametric Geostatistical Method For Estimating Species Importance
Andrew J. Lister; Rachel Riemann; Michael Hoppus
2001-01-01
Parametric statistical methods are not always appropriate for conducting spatial analyses of forest inventory data. Parametric geostatistical methods such as variography and kriging are essentially averaging procedures, and thus can be affected by extreme values. Furthermore, non-normal distributions violate the assumptions of analyses in which test statistics are...
ERIC Educational Resources Information Center
Ellis, Barbara G.; Dick, Steven J.
1996-01-01
Employs the statistics-documentation portion of a word-processing program's grammar-check feature together with qualitative analyses to determine that Henry Watterson, long-time editor of the "Louisville Courier-Journal," was probably the South's famed Civil War correspondent "Shadow." (TB)
Risk assessment of maintenance operations: the analysis of performing task and accident mechanism.
Carrillo-Castrillo, Jesús A; Rubio-Romero, Juan Carlos; Guadix, Jose; Onieva, Luis
2015-01-01
Maintenance operations cover a great number of occupations. Most small and medium-sized enterprises lack the appropriate information to conduct risk assessments of maintenance operations. The objective of this research is to provide a method based on the concepts of task and accident mechanisms for an initial risk assessment by taking into consideration the prevalence and severity of the maintenance accidents reported. Data were gathered from 11,190 reported accidents in maintenance operations in the manufacturing sector of Andalusia from 2003 to 2012. By using a semi-quantitative methodology, likelihood and severity were evaluated based on the actual distribution of accident mechanisms in each of the tasks. Accident mechanisms and tasks were identified by using those variables included in the European Statistics of Accidents at Work methodology. As main results, the estimated risk of the most frequent accident mechanisms identified for each of the analysed tasks is low and the only accident mechanisms with medium risk are accidents when lifting or pushing with physical stress on the musculoskeletal system in tasks involving carrying, and impacts against objects after slipping or stumbling for tasks involving movements. The prioritisation of public preventive actions for the accident mechanisms with a higher estimated risk is highly recommended.
1993-08-01
subtitled "Simulation Data," consists of detailed information on the design parameter variations tested, subsequent statistical analyses conducted...used with confidence during the design process. The data quality can be examined in various forms such as statistical analyses of measure of merit data...merit, such as time to capture or maximum pitch rate, can be calculated from the simulation time history data. Statistical techniques are then used
Chung, Sang M; Lee, David J; Hand, Austin; Young, Philip; Vaidyanathan, Jayabharathi; Sahajwalla, Chandrahas
2015-12-01
The study evaluated whether the renal function decline rate per year with age in adults varies depending on two primary statistical analyses: cross-sectional (CS), using one observation per subject, and longitudinal (LT), using multiple observations per subject over time. A total of 16,628 records (3,946 subjects; age range 30-92 years) of creatinine clearance and relevant demographic data were used. On average, four samples per subject were collected for up to 2,364 days (mean: 793 days). A simple linear regression and random coefficient models were selected for CS and LT analyses, respectively. The renal function decline rates per year were 1.33 and 0.95 ml/min/year for CS and LT analyses, respectively, so the estimated decline was slower when the repeated individual measurements were considered. The study confirms that the estimated rates differ depending on the statistical analysis, and that a statistically robust longitudinal model with a proper sampling design provides reliable individual as well as population estimates of the renal function decline rates per year with age in adults. In conclusion, our findings indicated that one should be cautious in interpreting the renal function decline rate with aging because its estimation was highly dependent on the statistical analysis. From our analyses, a population longitudinal analysis (e.g. a random coefficient model) is recommended if individualization is critical, such as a dose adjustment based on renal function during a chronic therapy. Copyright © 2015 John Wiley & Sons, Ltd.
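The CS-versus-LT discrepancy described above can be sketched in a stdlib-only simulation. The decline rates, cohort effect, and visit schedule below are invented for illustration; they are not the study's data, and the average of within-subject slopes is only a crude stand-in for a random coefficient model:

```python
import random

random.seed(0)

TRUE_WITHIN = -0.95   # assumed true within-subject decline, ml/min/year
COHORT = -0.40        # assumed extra between-cohort difference per year of age

def ols_slope(xs, ys):
    """Slope of a simple least-squares line through (xs, ys)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

subjects = []
for _ in range(2000):
    age0 = random.uniform(30, 80)
    # older cohorts start from a lower baseline (cohort effect + accrued aging)
    base = 120 + (TRUE_WITHIN + COHORT) * age0 + random.gauss(0, 8)
    # four visits, one year apart, like repeated creatinine-clearance measures
    obs = [(age0 + t, base + TRUE_WITHIN * t + random.gauss(0, 2))
           for t in range(4)]
    subjects.append(obs)

# CS: one observation per subject, regressed across subjects' ages
cs = ols_slope([s[0][0] for s in subjects], [s[0][1] for s in subjects])

# LT: average of within-subject slopes (crude random-coefficient stand-in)
lt = sum(ols_slope([a for a, _ in s], [y for _, y in s])
         for s in subjects) / len(subjects)

print(f"CS slope {cs:.2f}, LT slope {lt:.2f}")  # CS steeper than LT
```

The cross-sectional slope confounds between-cohort baseline differences with true within-subject aging, which is why it comes out steeper than the longitudinal estimate, mirroring the 1.33 vs 0.95 ml/min/year pattern reported in the abstract.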
The development of ensemble theory. A new glimpse at the history of statistical mechanics
NASA Astrophysics Data System (ADS)
Inaba, Hajime
2015-12-01
This paper investigates the history of statistical mechanics from the viewpoint of the development of the ensemble theory from 1871 to 1902. In 1871, Ludwig Boltzmann introduced a prototype model of an ensemble that represents a polyatomic gas. In 1879, James Clerk Maxwell defined an ensemble as copies of systems of the same energy. Inspired by H.W. Watson, he called his approach "statistical". Boltzmann and Maxwell regarded the ensemble theory as a much more general approach than the kinetic theory. In the 1880s, influenced by Hermann von Helmholtz, Boltzmann made use of ensembles to establish thermodynamic relations. In Elementary Principles in Statistical Mechanics of 1902, Josiah Willard Gibbs tried to get his ensemble theory to mirror thermodynamics, including thermodynamic operations in its scope. Thermodynamics played the role of a "blind guide". His theory of ensembles can be characterized as more mathematically oriented than Einstein's theory proposed in the same year. Mechanical, empirical, and statistical approaches to foundations of statistical mechanics are presented. Although it was formulated in classical terms, the ensemble theory provided an infrastructure still valuable in quantum statistics because of its generality.
Surface and mechanical analysis of explanted Poly Implant Prosthèse silicone breast implants.
Yildirimer, L; Seifalian, A M; Butler, P E
2013-05-01
The recent events surrounding Poly Implant Prosthèse (PIP) breast implants have renewed the debate about the safety profile of silicone implants. The intentional use of industrial-grade instead of certified medical-grade silicone is thought to be responsible for reportedly higher frequencies of implant rupture in vivo. The differences in mechanical and viscoelastic properties between PIP and medical-grade silicone implant shells were investigated. Surface characterization of shells and gels was carried out to determine structural changes occurring after implantation. Breast implants were obtained from women at the Royal Free Hospital (London, UK). PIP implants were compared with medical-grade control silicone implants. Tensile strength, tear resistance and elongation at break were assessed using a tensile tester. Surfaces were analysed using attenuated total reflectance-Fourier transform infrared (ATR-FTIR) spectroscopy. Spearman correlation analyses and Kruskal-Wallis one-way statistical tests were performed for mechanical data. There were 18 PIP and four medical-grade silicone implants. PIP silicone shells had significantly weaker mechanical strength than control shells (P < 0·009). There were negative correlations between mechanical properties of PIP shells and implantation times, indicative of deterioration of PIP shells over time in vivo (r(s) = -0·75, P = 0·009 for tensile strength; r(s) = -0·76, P = 0·001 for maximal strain). Comparison of ATR-FTIR spectra of PIP and control silicones demonstrated changes in material characteristics during the period of implantation suggestive of time-dependent bond breakage and degradation of the material. This study demonstrated an increased weakness of PIP shells with time and therefore supports the argument for prophylactic removal of PIP breast implants. © 2013 British Journal of Surgery Society Ltd. Published by John Wiley & Sons Ltd.
Fuller-Thomson, Esme; Dalton, Angela D
2011-05-15
This study used a large, nationally representative sample to examine the gender-specific association between parental divorce and the cumulative lifetime incidence of suicidal ideation. Known risk factors for suicidal ideation, such as childhood stressors, socioeconomic factors, adult health behaviors and stressors, marital status, and any history of mood and/or anxiety disorders were controlled. Gender-specific analyses revealed that for men, the parental divorce-suicidal ideation relationship remained statistically significant even when the above-listed cluster of risk factors were included in the analyses (odds ratio (OR)=2.36, 95% confidence interval (CI)=1.56, 3.58). For women, the association between parental divorce and suicidal ideation was reduced to non-significance when other adverse childhood experiences were included in the analyses (full adjustment OR=1.04, 95% CI=0.72, 1.50). These findings indicate a need for screening of suicidal ideation among individuals, particularly men and those with mood and/or anxiety disorders, who have experienced parental divorce. Future research should focus on the mechanisms linking parental divorce and suicidal ideation. Copyright © 2010 Elsevier Ltd. All rights reserved.
The History of the AutoChemist®: From Vision to Reality.
Peterson, H E; Jungner, I
2014-05-22
This paper discusses the early history and development of a clinical analyser system in Sweden (AutoChemist, 1965). It highlights the importance of such a high-capacity system both for clinical use and health care screening. The device was developed to assure the quality of results and to automatically handle the orders, store the results in digital form for later statistical analyses, and distribute the results to the patients' physicians by using the analyser's computer. The most important result of the construction of an analyser able to produce analytical results on a mass scale was the development of a mechanical multi-channel analyser for clinical laboratories that handled discrete sample technology and could prevent carry-over to the next test samples while incorporating computer technology to improve the quality of test results. The AutoChemist could handle 135 samples per hour in an 8-hour shift and up to 24 analysis channels, resulting in 3,200 results per hour. Later versions would double this capacity. Some customers used the equipment 24 hours per day. With a capacity of 3,000 to 6,000 analyses per hour, pneumatically driven pipettes, special units for corrosive liquids or special activities, and an integrated computer, the AutoChemist system was unique and the largest of its kind for many years. Its successor, the AutoChemist PRISMA (PRogrammable Individually Selective Modular Analyzer), was smaller in size but had a higher capacity. Both analysers established new standards of operation for clinical laboratories and encouraged others to use new technologies for building new analysers.
Inferential Statistics in "Language Teaching Research": A Review and Ways Forward
ERIC Educational Resources Information Center
Lindstromberg, Seth
2016-01-01
This article reviews all (quasi)experimental studies appearing in the first 19 volumes (1997-2015) of "Language Teaching Research" (LTR). Specifically, it provides an overview of how statistical analyses were conducted in these studies and of how the analyses were reported. The overall conclusion is that there has been a tight adherence…
Statistical mechanics explanation for the structure of ocean eddies and currents
NASA Astrophysics Data System (ADS)
Venaille, A.; Bouchet, F.
2010-12-01
The equilibrium statistical mechanics of two-dimensional and geostrophic flows predicts the outcome for the large scales of the flow resulting from turbulent mixing. This theory has been successfully applied to describe detailed properties of Jupiter's Great Red Spot. We discuss the range of applicability of this theory to ocean dynamics. It is able to reproduce mesoscale structures like ocean rings. It explains, from statistical mechanics, the westward drift of rings at the speed of non-dispersive baroclinic waves, and the recently observed (Chelton et al.) slower northward drift of cyclonic eddies and southward drift of anticyclonic eddies. We also uncover relations between strong eastward mid-basin inertial jets, like the Kuroshio extension and the Gulf Stream, and statistical equilibria, and we explain under which conditions such strong mid-basin jets can be understood as statistical equilibria. We claim that these results are complementary to the classical Sverdrup-Munk theory: they explain the inertial part of the basin dynamics and the jets' structure and location using very simple theoretical arguments. Even in such out-of-equilibrium situations, equilibrium statistical mechanics predicts the overall qualitative flow structure remarkably well, and accounts for the observed westward drift of ocean eddies and the slower northward drift of cyclones and southward drift of anticyclones reported by Chelton et al. References: A. Venaille and F. Bouchet, Ocean rings and jets as statistical equilibrium states, submitted to JPO; F. Bouchet and A. Venaille, Statistical mechanics of two-dimensional and geophysical flows, arXiv ..., submitted to Physics Reports; P. Berloff, A. M. Hogg, and W. Dewar, The Turbulent Oscillator: A Mechanism of Low-Frequency Variability of the Wind-Driven Ocean Gyres, Journal of Physical Oceanography 37 (2007) 2363; D. B. Chelton, M. G. Schlax, R. M. Samelson, and R. A. de Szoeke, Global observations of large oceanic eddies, Geophys. Res. Lett. 34 (2007) 15606. [Figure caption: (a) streamfunction predicted by statistical mechanics; (b), (c) snapshots of streamfunction and potential vorticity (red: positive values; blue: negative values) in the upper layer of a three-layer quasi-geostrophic model of a mid-latitude ocean basin (from Berloff et al.).]
SOCR Analyses - an Instructional Java Web-based Statistical Analysis Toolkit.
Chu, Annie; Cui, Jenny; Dinov, Ivo D
2009-03-01
The Statistical Online Computational Resource (SOCR) designs web-based tools for educational use in a variety of undergraduate courses (Dinov 2006). Several studies have demonstrated that these resources significantly improve students' motivation and learning experiences (Dinov et al. 2008). SOCR Analyses is a new component that concentrates on data modeling and analysis using parametric and non-parametric techniques supported with graphical model diagnostics. Currently implemented analyses include commonly used models in undergraduate statistics courses like linear models (Simple Linear Regression, Multiple Linear Regression, One-Way and Two-Way ANOVA). In addition, we implemented tests for sample comparisons, such as the t-test in the parametric category; and the Wilcoxon rank sum test, Kruskal-Wallis test, and Friedman's test in the non-parametric category. SOCR Analyses also includes several hypothesis test models, such as contingency tables and Fisher's exact test. The code itself is open source (http://socr.googlecode.com/), hoping to contribute to the efforts of the statistical computing community. The code includes functionality for each specific analysis model and it has general utilities that can be applied in various statistical computing tasks. For example, concrete methods with an API (Application Programming Interface) have been implemented for statistical summaries, least squares solutions of general linear models, rank calculations, etc. HTML interfaces, tutorials, source code, activities, and data are freely available via the web (www.SOCR.ucla.edu). Code examples for developers and demos for educators are provided on the SOCR Wiki website. In this article, the pedagogical utilization of the SOCR Analyses is discussed, as well as the underlying design framework. As the SOCR project is on-going and more functions and tools are being added to it, these resources are constantly improved.
The reader is strongly encouraged to check the SOCR site for most updated information and newly added models.
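As a rough illustration of the kind of non-parametric comparison SOCR Analyses implements (SOCR itself is a Java toolkit; this stdlib-Python sketch is not SOCR code), here is a two-sided Wilcoxon rank-sum test using midranks and a normal approximation, on made-up samples:

```python
import math
from statistics import NormalDist

def rank_sum_test(x, y):
    """Two-sided Wilcoxon rank-sum (Mann-Whitney) test.

    Uses midranks for ties and a normal approximation for the p-value;
    the variance is not tie-corrected, so heavy ties make it conservative.
    """
    combined = sorted((v, g) for g, vals in ((0, x), (1, y)) for v in vals)
    vals = [v for v, _ in combined]
    ranks = [0.0] * len(vals)
    i = 0
    while i < len(vals):
        j = i
        while j < len(vals) and vals[j] == vals[i]:
            j += 1
        mid = (i + j + 1) / 2          # average 1-based rank of the tied run
        for k in range(i, j):
            ranks[k] = mid
        i = j
    w = sum(r for r, (_, g) in zip(ranks, combined) if g == 0)
    n1, n2 = len(x), len(y)
    mu = n1 * (n1 + n2 + 1) / 2
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    z = (w - mu) / sigma
    p = 2 * (1 - NormalDist().cdf(abs(z)))
    return w, p

w, p = rank_sum_test([12, 15, 9, 20, 17], [22, 25, 30, 19, 28])
print(w, p)  # rank sum of the first sample and its two-sided p-value
```

The exact-distribution version used for small samples, and the paired Wilcoxon signed-rank and Kruskal-Wallis variants, follow the same ranking idea.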
Bynum, T E; Koch, G G
1991-08-08
We sought to compare the efficacy of sucralfate to placebo for the prevention of duodenal ulcer recurrence and to determine that the efficacy of sucralfate was due to a true reduction in ulcer prevalence and not due to secondary effects such as analgesic activity or accelerated healing. This was a double-blind, randomized, placebo-controlled, parallel groups, multicenter clinical study with 254 patients. All patients had a past history of at least two duodenal ulcers with at least one ulcer diagnosed by endoscopic examination 3 months or less before the start of the study. Complete ulcer healing without erosions was required to enter the study. Sucralfate or placebo were dosed as a 1-g tablet twice a day for 4 months, or until ulcer recurrence. Endoscopic examinations once a month and when symptoms developed determined the presence or absence of duodenal ulcers. If a patient developed an ulcer between monthly scheduled visits, the patient was dosed with a 1-g sucralfate tablet twice a day until the next scheduled visit. Statistical analyses of the results determined the efficacy of sucralfate compared with placebo for preventing duodenal ulcer recurrence. Comparisons of therapeutic agents for preventing duodenal ulcers have usually been made by testing for statistical differences in the cumulative rates for all ulcers developed during a follow-up period, regardless of the time of detection. Statistical experts at the United States Food and Drug Administration (FDA) and on the FDA Advisory Panel expressed doubts about clinical study results based on this type of analysis. They suggested three possible mechanisms for reducing the number of observed ulcers: (a) analgesic effects, (b) accelerated healing, and (c) true ulcer prevention. Traditional ulcer analysis could miss recurring ulcers due to an analgesic effect or accelerated healing. Point-prevalence analysis could miss recurring ulcers due to accelerated healing between endoscopic examinations. 
Maximum ulcer analysis, a novel statistical method, eliminated analgesic effects through regularly scheduled endoscopies and ruled out accelerated healing of recurring ulcers through frequent endoscopies and an open-label phase. Maximum ulcer analysis therefore reflects true ulcer recurrence and prevention. Sucralfate was significantly superior to placebo in reducing ulcer prevalence by all analyses. Significance (p less than 0.05) was found at months 3 and 4 for all analyses: all months were significant in the traditional analysis, months 2-4 in the point-prevalence analysis, and months 3-4 in the maximum ulcer prevalence analysis. Sucralfate was shown to be effective for the prevention of duodenal ulcer recurrence through a true reduction in new ulcer development.
NASA Astrophysics Data System (ADS)
Schneider, Markus P. A.
This dissertation contributes to two areas in economics: the understanding of the distribution of earned income and Bayesian analysis of distributional data. Recently, physicists claimed that the distribution of earned income is exponential (see Yakovenko, 2009). The first chapter explores the perspective that the economy is a statistical mechanical system, and the implication for labor market outcomes is considered critically. The robustness of the empirical results that lead to the physicists' claims, the significance of the exponential distribution in statistical mechanics, and the case for a conservation law in economics are discussed. The conclusion reached is that the physicists' conception of the economy is too narrow even within their chosen framework, but that their overall approach is insightful. The dual labor market theory of segmented labor markets is invoked to understand why the observed distribution may be a mixture of distributional components, corresponding to different generating mechanisms described in Reich et al. (1973). The application of informational entropy in chapter II connects this work to Bayesian analysis and maximum entropy econometrics. The analysis follows E. T. Jaynes's treatment of Wolf's dice data, but is applied to the distribution of earned income based on CPS data. The results are calibrated to account for rounded survey responses using a simple simulation, and respond to the graphical analyses by the physicists. The results indicate that neither the income distribution of all respondents nor of the subpopulation used by the physicists appears to be exponential. The empirics do support the claim that a mixture with exponential and log-normal distributional components fits the data. In the final chapter, a log-linear model is used to fit the exponential to the earned income distribution.
Separating the CPS data by gender and marital status reveals that the exponential is only an appropriate model for a limited number of subpopulations, namely the never married and women. The estimated parameter for never-married men's incomes is significantly different from the parameter estimated for never-married women, implying that either the combined distribution is not exponential or that the individual distributions are not exponential. However, it substantiates the existence of a persistent gender income gap among the never-married. References: Reich, M., D. M. Gordon, and R. C. Edwards (1973). A Theory of Labor Market Segmentation. Quarterly Journal of Economics 63, 359-365. Yakovenko, V. M. (2009). Econophysics, Statistical Mechanics Approach to. In R. A. Meyers (Ed.), Encyclopedia of Complexity and System Science. Springer.
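The exponential fit discussed above can be sketched in two equivalent ways: the maximum-likelihood estimate is the reciprocal of the sample mean, and a log-linear regression of the empirical survival function exploits log S(x) = -λx for exponential data. The data here are simulated with a hypothetical mean of $50,000; they are not CPS data:

```python
import math
import random

random.seed(1)

LAM = 2.0e-5                     # hypothetical rate: mean "income" of $50,000
data = sorted(random.expovariate(LAM) for _ in range(50_000))
n = len(data)

# MLE for an exponential distribution: lambda_hat = 1 / sample mean
lam_mle = n / sum(data)

# Log-linear check: regress log of the empirical survival function on x
xs, ys = [], []
for i in range(0, n, 500):       # thin to ~100 grid points
    s = (n - i) / n              # empirical survival fraction at data[i]
    xs.append(data[i])
    ys.append(math.log(s))
mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
lam_ols = -slope

print(lam_mle, lam_ols)          # both close to LAM for truly exponential data
```

For genuinely exponential data the two estimates agree; a systematic curvature in the log-survival plot is precisely the kind of evidence the dissertation uses against the pure-exponential claim and in favor of a mixture.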
Mechanical properties of amyloid-like fibrils defined by secondary structures
NASA Astrophysics Data System (ADS)
Bortolini, C.; Jones, N. C.; Hoffmann, S. V.; Wang, C.; Besenbacher, F.; Dong, M.
2015-04-01
Amyloid and amyloid-like fibrils represent a generic class of highly ordered nanostructures that are implicated in some of the most fatal neurodegenerative diseases. On the other hand, amyloids, by possessing outstanding mechanical robustness, have also been successfully employed as functional biomaterials. For these reasons, the physical and chemical factors driving fibril self-assembly and morphology are extensively studied; among these parameters, the secondary structures and the pH have been revealed to be crucial, since a variation in pH changes the fibril morphology and net chirality during protein aggregation. It is important to quantify the mechanical properties of these fibrils in order to help the design of effective strategies for treating diseases related to the presence of amyloid fibrils. In this work, we show that by changing pH the mechanical properties of amyloid-like fibrils vary as well. In particular, we reveal that these mechanical properties are strongly related to the content of secondary structures. We analysed and estimated the Young's modulus (E) by comparing the persistence length (Lp), measured from TEM images using statistical mechanics arguments, with the mechanical information provided by peak force quantitative nanomechanical property mapping (PF-QNM). The secondary structure content and the chirality are investigated by means of synchrotron radiation circular dichroism (SR-CD). Results arising from this study could be fruitfully used as a protocol to investigate other peptide fibrils of medical or engineering relevance. Electronic supplementary information (ESI) available: a molecular model for the peptide studied and the charge chart associated with it, plus an AFM image of pH 4 fibrils. See DOI: 10.1039/c4nr05109b
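The persistence-length route to stiffness used in such studies follows the worm-like chain relation Lp = E·I/(kB·T), so E follows from Lp and the fibril cross-section. A minimal sketch, with purely illustrative numbers (not the paper's measurements) and a cylindrical cross-section assumed:

```python
import math

def youngs_modulus_from_lp(lp_m, diameter_m, temp_k=298.0):
    """Estimate Young's modulus E from persistence length Lp via
    Lp = E*I / (kB*T), with I the second moment of area of a
    cylindrical cross-section (I = pi*d^4/64)."""
    kB = 1.380649e-23  # Boltzmann constant, J/K
    I = math.pi * diameter_m ** 4 / 64
    return lp_m * kB * temp_k / I

# Illustrative values only: a 10 nm thick fibril with a 400 um
# persistence length, typical orders of magnitude for stiff fibrils.
E = youngs_modulus_from_lp(400e-6, 10e-9)
print(f"E = {E / 1e9:.2f} GPa")  # → E = 3.35 GPa
```

Because I scales as d^4, an error in the measured fibril diameter dominates the uncertainty in E estimated this way.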
The Lidcombe Program and child language development: Long-term assessment.
Imeson, Juliet; Lowe, Robyn; Onslow, Mark; Munro, Natalie; Heard, Rob; O'Brian, Sue; Arnott, Simone
2018-03-15
This study was driven by the need to understand the mechanisms underlying Lidcombe Program treatment efficacy. The aim of the present study was to extend existing data exploring whether the stuttering reductions observed when children are successfully treated with the Lidcombe Program are associated with restricted language development. Audio recordings of 10-min parent-child conversations at home were transcribed verbatim for 11 pre-school-age children with various stuttering severities. Language samples from three assessments (pre-treatment, and 9 and 18 months after beginning treatment) were analysed using SALT software for lexical diversity, utterance length and sentence complexity. At 18 months after treatment commencement, the children had attained and maintained statistically significant stuttering reductions. During that period, there was no evidence that Lidcombe Program treatment was associated with restricted language development. The continued search for the mechanisms underlying this successful treatment needs to focus on other domains.
Anomalous Diffusion of Single Particles in Cytoplasm
Regner, Benjamin M.; Vučinić, Dejan; Domnisoru, Cristina; Bartol, Thomas M.; Hetzer, Martin W.; Tartakovsky, Daniel M.; Sejnowski, Terrence J.
2013-01-01
The crowded intracellular environment poses a formidable challenge to experimental and theoretical analyses of intracellular transport mechanisms. Our measurements of single-particle trajectories in cytoplasm and their random-walk interpretations elucidate two of these mechanisms: molecular diffusion in crowded environments and cytoskeletal transport along microtubules. We employed acousto-optic deflector microscopy to map out the three-dimensional trajectories of microspheres migrating in the cytosolic fraction of a cellular extract. Classical Brownian motion (BM), continuous time random walk, and fractional BM were alternatively used to represent these trajectories. The comparison of the experimental and numerical data demonstrates that cytoskeletal transport along microtubules and diffusion in the cytosolic fraction exhibit anomalous (non-Fickian) behavior and possess statistically distinct signatures. Among the three random-walk models used, continuous time random walk provides the best representation of diffusion, whereas microtubular transport is accurately modeled with fractional BM. PMID:23601312
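The standard diagnostic behind such classifications is the mean squared displacement, MSD(t) ~ t^alpha, with alpha = 1 for classical Brownian motion and alpha < 1 for subdiffusion. A minimal one-dimensional sketch (the simulated track and lag choices are illustrative, not the paper's data):

```python
import math
import random

def msd(track, lag):
    """Mean squared displacement of a 1-D track at a given time lag,
    averaged over all overlapping windows."""
    disps = [(track[i + lag] - track[i]) ** 2 for i in range(len(track) - lag)]
    return sum(disps) / len(disps)

random.seed(0)
# Simulate a classical Brownian track; for Fickian diffusion alpha = 1.
x, track = 0.0, [0.0]
for _ in range(20000):
    x += random.gauss(0, 1)
    track.append(x)

lags = [10, 20, 40, 80, 160]
msds = [msd(track, lag) for lag in lags]

# The slope of log(MSD) vs log(lag) estimates the anomalous exponent alpha.
n = len(lags)
lx = [math.log(lag) for lag in lags]
ly = [math.log(m) for m in msds]
mx, my = sum(lx) / n, sum(ly) / n
alpha = (sum((a - mx) * (b - my) for a, b in zip(lx, ly))
         / sum((a - mx) ** 2 for a in lx))
print(f"alpha ~ {alpha:.2f}")  # near 1 here; alpha < 1 would signal subdiffusion
```

Distinguishing continuous time random walks from fractional Brownian motion requires statistics beyond the MSD exponent (e.g. ergodicity and increment correlations), which is the point of the model comparison above.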
Analysis of Failures of High Speed Shaft Bearing System in a Wind Turbine
NASA Astrophysics Data System (ADS)
Wasilczuk, Michał; Gawarkiewicz, Rafał; Bastian, Bartosz
2018-01-01
During the operation of wind turbines with a gearbox of traditional configuration, consisting of one planetary stage and two helical stages, a high failure rate of high-speed shaft bearings is observed. Such a high failure frequency is not reflected in the results of standard calculations of bearing durability. Most probably it can be attributed to an atypical failure mechanism. The authors studied problems in 1.5 MW wind turbines of one of the Polish wind farms. The analysis showed that problems of high failure rate are commonly met all over the world and that the statistics for the analysed turbines were very similar. After a study of the potential failure mechanism and its potential causes, a modification of the existing bearing system was proposed. Various options, with different bearing types, were investigated. The different versions were examined for expected durability increase, the extent of necessary gearbox modifications, and the possibility of solving existing problems in operation.
Waste management CDM projects barriers NVivo 10® qualitative dataset.
Bufoni, André Luiz; de Sousa Ferreira, Aracéli Cristina; Oliveira, Luciano Basto
2017-12-01
This article contains one NVivo 10® file with the complete 432 project design documents (PDD) of seven waste management sector industries registered as Clean Development Mechanism (CDM) projects under the United Nations Framework Convention on Climate Change (UNFCCC) Kyoto Protocol initiative from 2004 to 2014. All data analyses and sample statistics made during the research remain in the file. We coded the PDDs into 890 fragments of text, classified in five categories of barriers (nodes): technological, financial, human resources, regulatory, and socio-political. The data support the findings of the author's thesis [1] and two other indexed publications in the Waste Management journal: "The financial attractiveness assessment of large waste management projects registered as clean development mechanism" and "The declared barriers of the large developing countries waste management projects: The STAR model" [2], [3]. The data allow any computer-assisted qualitative content analysis (CAQCA) on the sector and are available at Mendeley [4].
Roos, Per M; Vesterberg, Olof; Syversen, Tore; Flaten, Trond Peder; Nordberg, Monica
2013-02-01
Amyotrophic lateral sclerosis (ALS) is a progressive and fatal degenerative disorder of motor neurons. The cause of this degeneration is unknown, and different causal hypotheses include genetic, viral, traumatic and environmental mechanisms. In this study, we have analyzed metal concentrations in cerebrospinal fluid (CSF) and blood plasma in a well-defined cohort (n = 17) of ALS patients diagnosed with quantitative electromyography. Metal analyses were performed with high-resolution inductively coupled plasma mass spectrometry. Statistically significant higher concentrations of manganese, aluminium, cadmium, cobalt, copper, zinc, lead, vanadium and uranium were found in ALS CSF compared to control CSF. We also report higher concentrations of these metals in ALS CSF than in ALS blood plasma, which indicate mechanisms of accumulation, e.g. inward directed transport. A pattern of multiple toxic metals is seen in ALS CSF. The results support the hypothesis that metals with neurotoxic effects are involved in the pathogenesis of ALS.
Zhang, Zhefeng; Xian, Jiahui; Zhang, Chunyong; Fu, Degang
2017-09-01
This study investigated the degradation performance and mechanism of creatinine (a urine metabolite) with boron-doped diamond (BDD) anodes. Experiments were performed using a synthetic creatinine solution containing two supporting electrolytes (NaCl and Na2SO4). A three-level central composite design was adopted to optimize the degradation process; a mathematical model was thus constructed and used to explore the optimum operating conditions. A maximum mineralization percentage of 80%, together with full creatinine removal, was achieved within 120 min of electrolysis, confirming the strong oxidation capability of BDD anodes. Moreover, the results obtained suggested that supporting electrolyte concentration should be listed as one of the most important parameters in BDD technology. Lastly, based on the results from quantum chemistry calculations and LC/MS analyses, two different reaction pathways governing the electrocatalytic oxidation of creatinine, irrespective of the supporting electrolyte, were identified. Copyright © 2017 Elsevier Ltd. All rights reserved.
A d-statistic for single-case designs that is equivalent to the usual between-groups d-statistic.
Shadish, William R; Hedges, Larry V; Pustejovsky, James E; Boyajian, Jonathan G; Sullivan, Kristynn J; Andrade, Alma; Barrientos, Jeannette L
2014-01-01
We describe a standardised mean difference statistic (d) for single-case designs that is equivalent to the usual d in between-groups experiments. We show how it can be used to summarise treatment effects over cases within a study, to do power analyses in planning new studies and grant proposals, and to meta-analyse effects across studies of the same question. We discuss limitations of this d-statistic, and possible remedies to them. Even so, this d-statistic is better founded statistically than other effect size measures for single-case design, and unlike many general linear model approaches such as multilevel modelling or generalised additive models, it produces a standardised effect size that can be integrated over studies with different outcome measures. SPSS macros for both effect size computation and power analysis are available.
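The core of any standardised mean difference is a mean contrast divided by a pooled standard deviation. The published single-case d adds small-sample and autocorrelation corrections not shown here; this minimal sketch, with invented phase data for one case, illustrates only the basic between-groups form:

```python
import math
import statistics

def pooled_d(baseline, treatment):
    """Standardised mean difference between treatment and baseline
    phases, using a pooled standard deviation (the usual between-groups d).
    Ignores the autocorrelation and small-sample corrections of the
    full single-case d-statistic."""
    n1, n2 = len(baseline), len(treatment)
    s1, s2 = statistics.variance(baseline), statistics.variance(treatment)
    sp = math.sqrt(((n1 - 1) * s1 + (n2 - 1) * s2) / (n1 + n2 - 2))
    return (statistics.mean(treatment) - statistics.mean(baseline)) / sp

# Hypothetical session scores from one case's baseline (A) and treatment (B).
A = [12, 14, 13, 15, 12]
B = [20, 22, 21, 23, 24]
print(round(pooled_d(A, B), 2))  # → 6.07
```

Because this d is on the same scale as the between-groups d, effects from single-case and group designs can in principle be pooled in one meta-analysis, which is the paper's central claim.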
Landau's statistical mechanics for quasi-particle models
NASA Astrophysics Data System (ADS)
Bannur, Vishnu M.
2014-04-01
Landau's formalism of statistical mechanics [following L. D. Landau and E. M. Lifshitz, Statistical Physics (Pergamon Press, Oxford, 1980)] is applied to the quasi-particle model of quark-gluon plasma. Here, one starts from the expression for pressure and develops all of thermodynamics. It is a general formalism and is consistent with our earlier studies [V. M. Bannur, Phys. Lett. B647, 271 (2007)] based on Pathria's formalism [following R. K. Pathria, Statistical Mechanics (Butterworth-Heinemann, Oxford, 1977)], in which one starts from the expression for energy density and develops thermodynamics. Both formalisms are consistent with thermodynamics and statistical mechanics. Under certain conditions, which are wrongly called the thermodynamic consistency relation, we recover the other formalism for quasi-particle systems, as in M. I. Gorenstein and S. N. Yang, Phys. Rev. D52, 5206 (1995), widely studied in quark-gluon plasma.
Ergodic theorem, ergodic theory, and statistical mechanics
Moore, Calvin C.
2015-01-01
This perspective highlights the mean ergodic theorem established by John von Neumann and the pointwise ergodic theorem established by George Birkhoff, proofs of which were published nearly simultaneously in PNAS in 1931 and 1932. These theorems were of great significance both in mathematics and in statistical mechanics. In statistical mechanics they provided a key insight into a 60-y-old fundamental problem of the subject, namely, the rationale for the hypothesis that time averages can be set equal to phase averages. The evolution of this problem is traced from the origins of statistical mechanics and Boltzmann's ergodic hypothesis to the Ehrenfests' quasi-ergodic hypothesis, and then to the ergodic theorems. We discuss communications between von Neumann and Birkhoff in the Fall of 1931 leading up to the publication of these papers and related issues of priority. These ergodic theorems initiated a new field of mathematical research called ergodic theory that has thrived ever since, and we discuss some recent developments in ergodic theory that are relevant for statistical mechanics. PMID:25691697
ERIC Educational Resources Information Center
Thompson, Bruce; Melancon, Janet G.
Effect sizes have been increasingly emphasized in research as more researchers have recognized that: (1) all parametric analyses (t-tests, analyses of variance, etc.) are correlational; (2) effect sizes have played an important role in meta-analytic work; and (3) statistical significance testing is limited in its capacity to inform scientific…
Comments on `A Cautionary Note on the Interpretation of EOFs'.
NASA Astrophysics Data System (ADS)
Behera, Swadhin K.; Rao, Suryachandra A.; Saji, Hameed N.; Yamagata, Toshio
2003-04-01
The misleading aspect of the statistical analyses used in Dommenget and Latif, which raises concerns about some of the reported climate modes, is demonstrated. Adopting simple statistical techniques, the physical existence of the Indian Ocean dipole mode is shown, and then the limitations of varimax and regression analyses in capturing the climate mode are discussed.
Barbie, Dana L.; Wehmeyer, Loren L.
2012-01-01
Trends in selected streamflow statistics during 1922-2009 were evaluated at 19 long-term streamflow-gaging stations considered indicative of outflows from Texas to Arkansas, Louisiana, Galveston Bay, and the Gulf of Mexico. The U.S. Geological Survey, in cooperation with the Texas Water Development Board, evaluated streamflow data from streamflow-gaging stations with more than 50 years of record that were active as of 2009. The outflows into Arkansas and Louisiana were represented by 3 streamflow-gaging stations, and outflows into the Gulf of Mexico, including Galveston Bay, were represented by 16 streamflow-gaging stations. Monotonic trend analyses were done using the following three streamflow statistics generated from daily mean values of streamflow: (1) annual mean daily discharge, (2) annual maximum daily discharge, and (3) annual minimum daily discharge. The trend analyses were based on the nonparametric Kendall's Tau test, which is useful for the detection of monotonic upward or downward trends with time. A total of 69 trend analyses by Kendall's Tau were computed: 19 periods of streamflow multiplied by the 3 streamflow statistics, plus 12 additional trend analyses because the periods of record for 2 streamflow-gaging stations were divided into periods representing pre- and post-reservoir impoundment. Unless otherwise described, each trend analysis used the entire period of record for each streamflow-gaging station. The monotonic trend analysis detected 11 statistically significant downward trends, 37 instances of no trend, and 21 statistically significant upward trends. One region with relatively more upward trends for many of the streamflow statistics analyzed comprises the rivers and associated creeks and bayous draining to Galveston Bay in the Houston metropolitan area.
Lastly, the westernmost river basins considered (the Nueces and the Rio Grande) had statistically significant downward trends for many of the streamflow statistics analyzed.
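A minimal sketch of the Kendall's Tau statistic underlying these trend tests (the discharge series below is invented, and neither tie corrections nor a significance test is included):

```python
def sign(v):
    """Sign of a number: -1, 0, or +1."""
    return (v > 0) - (v < 0)

def kendall_tau(x, y):
    """Kendall's tau-a: (concordant - discordant) pairs over all pairs.
    Real analyses typically use tau-b, which corrects for ties."""
    n = len(x)
    s = sum(sign(x[j] - x[i]) * sign(y[j] - y[i])
            for i in range(n) for j in range(i + 1, n))
    return 2 * s / (n * (n - 1))

# Hypothetical annual mean daily discharges over 10 years.
years = list(range(2000, 2010))
flow = [100, 104, 103, 110, 115, 112, 120, 125, 123, 130]
tau = kendall_tau(years, flow)
print(round(tau, 2))  # → 0.87, a strong monotonic upward trend
```

Because the test depends only on the signs of pairwise differences, it is robust to the skewed, non-normal distributions typical of streamflow data.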
A statistical mechanical approach to restricted integer partition functions
NASA Astrophysics Data System (ADS)
Zhou, Chi-Chun; Dai, Wu-Sheng
2018-05-01
The main aim of this paper is twofold: (1) suggesting a statistical mechanical approach to the calculation of the generating function of restricted integer partition functions, which count the number of partitions, i.e., ways of writing an integer as a sum of other integers under certain restrictions. In this approach, the generating function of restricted integer partition functions is constructed from the canonical partition functions of various quantum gases. (2) Introducing a new type of restricted integer partition function corresponding to general statistics, a generalization of Gentile statistics in statistical mechanics; many kinds of restricted integer partition functions are special cases of this restricted integer partition function. Moreover, with statistical mechanics as a bridge, we reveal a mathematical fact: the generating function of a restricted integer partition function is just a symmetric function, a class of functions invariant under the action of permutation groups. Using this approach, we provide some expressions of restricted integer partition functions as examples.
Insights into Corona Formation through Statistical Analyses
NASA Technical Reports Server (NTRS)
Glaze, L. S.; Stofan, E. R.; Smrekar, S. E.; Baloga, S. M.
2002-01-01
Statistical analysis of an expanded database of coronae on Venus indicates that the populations of Type 1 (with fracture annuli) and Type 2 (without fracture annuli) corona diameters are statistically indistinguishable, and therefore we have no basis for assuming different formation mechanisms. Analysis of the topography and diameters of coronae shows that coronae that are depressions, rimmed depressions, and domes tend to be significantly smaller than those that are plateaus, rimmed plateaus, or domes with surrounding rims. This is consistent with the model of Smrekar and Stofan and inconsistent with predictions of the spreading drop model of Koch and Manga. The diameter range for domes, the initial stage of corona formation, provides a broad constraint on the buoyancy of corona-forming plumes. Coronae are only slightly more likely to be topographically raised than depressions, with Type 1 coronae most frequently occurring as rimmed depressions and Type 2 coronae most frequently occurring with flat interiors and raised rims. Most Type 1 coronae are located along chasmata systems or fracture belts, while Type 2 coronae are found predominantly as isolated features in the plains. Coronae at hotspot rises tend to be significantly larger than coronae in other settings, consistent with a hotter upper mantle at hotspot rises and their active state.
Association analysis of multiple traits by an approach of combining P values.
Chen, Lili; Wang, Yong; Zhou, Yajing
2018-03-01
Increasing evidence shows that one variant can affect multiple traits, a widespread phenomenon in complex diseases. Joint analysis of multiple traits can increase the statistical power of association analysis and uncover the underlying genetic mechanism. Although there are many statistical methods to analyse multiple traits, most of these methods are suitable for detecting common variants associated with multiple traits. However, because of the low minor allele frequencies of rare variants, these methods are not optimal for rare variant association analysis. In this paper, we extend an adaptive combination of P values method (termed ADA) for a single trait to test association between multiple traits and rare variants in a given region. For a given region, we use a reverse regression model to test each rare variant for association with multiple traits and obtain the P value of the single-variant test. Further, we take the weighted combination of these P values as the test statistic. Extensive simulation studies show that our approach is more powerful than several other comparison methods in most cases and is robust to the inclusion of a high proportion of neutral variants and to differing directions of effects of causal variants.
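The "weighted combination of P values" step can be sketched in a few lines. This is a toy version, not the authors' exact statistic: the truncation threshold, weights, and P values below are all hypothetical, and in practice significance would be assessed by permutation:

```python
import math

def ada_statistic(p_values, weights=None, threshold=0.10):
    """Toy adaptive combination of P values: keep only per-variant
    P values below a truncation threshold and combine them as a
    weighted sum of -log(p) (Fisher-style). Neutral variants with
    large P values are thereby dropped adaptively."""
    if weights is None:
        weights = [1.0] * len(p_values)
    kept = [(p, w) for p, w in zip(p_values, weights) if p < threshold]
    return sum(w * -math.log(p) for p, w in kept)

# Hypothetical single-variant P values for five rare variants in a region.
pvals = [0.003, 0.4, 0.08, 0.9, 0.02]
stat = ada_statistic(pvals)
print(f"combined statistic = {stat:.2f}")
```

Truncation is what makes the combination robust to a high proportion of neutral variants: variants with unremarkable P values contribute nothing to the statistic.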
How big should a mammal be? A macroecological look at mammalian body size over space and time
Smith, Felisa A.; Lyons, S. Kathleen
2011-01-01
Macroecology was developed as a big picture statistical approach to the study of ecology and evolution. By focusing on broadly occurring patterns and processes operating at large spatial and temporal scales rather than on localized and/or fine-scaled details, macroecology aims to uncover general mechanisms operating at organism, population, and ecosystem levels of organization. Macroecological studies typically involve the statistical analysis of fundamental species-level traits, such as body size, area of geographical range, and average density and/or abundance. Here, we briefly review the history of macroecology and use the body size of mammals as a case study to highlight current developments in the field, including the increasing linkage with biogeography and other disciplines. Characterizing the factors underlying the spatial and temporal patterns of body size variation in mammals is a daunting task and moreover, one not readily amenable to traditional statistical analyses. Our results clearly illustrate remarkable regularities in the distribution and variation of mammalian body size across both geographical space and evolutionary time that are related to ecology and trophic dynamics and that would not be apparent without a broader perspective. PMID:21768152
Reproducibility of ZrO2-based freeze casting for biomaterials.
Naleway, Steven E; Fickas, Kate C; Maker, Yajur N; Meyers, Marc A; McKittrick, Joanna
2016-04-01
The processing technique of freeze casting has been intensely researched for its potential to create porous scaffold and infiltrated composite materials for biomedical implants and structural materials. However, in order for this technique to be employed medically or commercially, it must be able to reliably produce materials in great quantities with similar microstructures and properties. Here we investigate the reproducibility of the freeze casting process by independently fabricating three sets of eight ZrO2-epoxy composite scaffolds with the same processing conditions but varying solid loading (10, 15 and 20 vol.%). Statistical analyses (one-way ANOVA and Tukey's HSD tests) run upon measurements of the microstructural dimensions of these composite scaffold sets show that, while the majority of microstructures are similar, in all cases the composite scaffolds display statistically significant variability. In addition, the composite scaffolds were mechanically compressed and statistically analyzed. As with the microstructures, almost all of the resultant properties displayed significant variability, though most composite scaffolds were similar. These results suggest that additional research to improve control of the freeze casting technique is required before scaffolds and composite scaffolds can reliably be reproduced for commercial or medical applications. Copyright © 2015 Elsevier B.V. All rights reserved.
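The one-way ANOVA used in such batch-to-batch comparisons reduces to a ratio of between-group to within-group mean squares. A minimal sketch with invented scaffold measurements (Tukey's HSD post hoc test is not shown):

```python
import statistics

def one_way_anova_f(groups):
    """F statistic for a one-way ANOVA: between-group mean square
    divided by within-group mean square."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (statistics.mean(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - statistics.mean(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical wall thicknesses (um) from three independently cast batches.
batches = [[10.1, 10.4, 9.9, 10.2],
           [10.3, 10.6, 10.5, 10.4],
           [9.7, 9.9, 9.8, 10.0]]
F = one_way_anova_f(batches)
print(f"F(2, 9) = {F:.2f}")  # a large F indicates batch-to-batch variability
```

A significant F says only that some batch differs; identifying which pairs differ, as in the study, requires a post hoc procedure such as Tukey's HSD.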
Papageorgiou, Spyridon N; Kloukos, Dimitrios; Petridis, Haralampos; Pandis, Nikolaos
2015-10-01
To assess the hypothesis that there is excessive reporting of statistically significant studies published in prosthodontic and implantology journals, which could indicate selective publication. The last 30 issues of 9 journals in prosthodontics and implant dentistry were hand-searched for articles with statistical analyses. The percentages of significant and non-significant results were tabulated by parameter of interest. Univariable/multivariable logistic regression analyses were applied to identify possible predictors of reporting statistically significant findings. The results of this study were compared with similar studies in dentistry with random-effects meta-analyses. Of the 2323 included studies, 71% reported statistically significant results, with the significant results ranging from 47% to 86%. Multivariable modeling identified geographical area and involvement of a statistician as predictors of statistically significant results. Compared to interventional studies, the odds that in vitro and observational studies would report statistically significant results were increased by 1.20 times (OR: 2.20, 95% CI: 1.66-2.92) and 0.35 times (OR: 1.35, 95% CI: 1.05-1.73), respectively. The probability of statistically significant results from randomized controlled trials was significantly lower compared to various study designs (difference: 30%, 95% CI: 11-49%). Likewise, the probability of statistically significant results in prosthodontics and implant dentistry was lower compared to other dental specialties, but this result did not reach statistical significance (P>0.05). The majority of studies identified in the fields of prosthodontics and implant dentistry presented statistically significant results. The same trend existed in publications of other specialties in dentistry. Copyright © 2015 Elsevier Ltd. All rights reserved.
Methodological reporting of randomized trials in five leading Chinese nursing journals.
Shi, Chunhu; Tian, Jinhui; Ren, Dan; Wei, Hongli; Zhang, Lihuan; Wang, Quan; Yang, Kehu
2014-01-01
Randomized controlled trials (RCTs) are not always well reported, especially in terms of their methodological descriptions. This study aimed to investigate the adherence of methodological reporting complying with CONSORT and explore associated trial level variables in the Chinese nursing care field. In June 2012, we identified RCTs published in five leading Chinese nursing journals and included trials with details of randomized methods. The quality of methodological reporting was measured through the methods section of the CONSORT checklist and the overall CONSORT methodological items score was calculated and expressed as a percentage. Meanwhile, we hypothesized that some general and methodological characteristics were associated with reporting quality and conducted a regression with these data to explore the correlation. The descriptive and regression statistics were calculated via SPSS 13.0. In total, 680 RCTs were included. The overall CONSORT methodological items score was 6.34 ± 0.97 (Mean ± SD). No RCT reported descriptions and changes in "trial design," changes in "outcomes" and "implementation," or descriptions of the similarity of interventions for "blinding." Poor reporting was found in detailing the "settings of participants" (13.1%), "type of randomization sequence generation" (1.8%), calculation methods of "sample size" (0.4%), explanation of any interim analyses and stopping guidelines for "sample size" (0.3%), "allocation concealment mechanism" (0.3%), additional analyses in "statistical methods" (2.1%), and targeted subjects and methods of "blinding" (5.9%). More than 50% of trials described randomization sequence generation, the eligibility criteria of "participants," "interventions," and definitions of the "outcomes" and "statistical methods." The regression analysis found that publication year and ITT analysis were weakly associated with CONSORT score. 
The completeness of methodological reporting of RCTs in the Chinese nursing care field is poor, especially with regard to the reporting of trial design, changes in outcomes, sample size calculation, allocation concealment, blinding, and statistical methods.
dos Reis, Andréa Cândido; de Castro, Denise Tornavoi; Schiavon, Marco Antônio; da Silva, Leandro Jardel; Agnelli, José Augusto Marcondes
2013-01-01
The aim of this study was to investigate the influence of accelerated artificial aging (AAA) on the microstructure and mechanical properties of the Filtek Z250, Filtek Supreme, 4 Seasons, Herculite, P60, Tetric Ceram, Charisma and Filtek Z100 composite resins. The composites were characterized by Fourier-transform infrared spectroscopy (FTIR) and thermal analyses (Differential Scanning Calorimetry - DSC and Thermogravimetry - TG). The microstructure of the materials was examined by scanning electron microscopy. Surface hardness and compressive strength data of the resins were recorded and the mean values were analyzed statistically by ANOVA and Tukey's test (α=0.05). The results showed significant differences among the commercial brands for surface hardness (F=86.74, p<0.0001) and compressive strength (F=40.31, p<0.0001), but AAA did not affect the properties (surface hardness: F=0.39, p=0.53; compressive strength: F=2.82, p=0.09) of any of the composite resins. FTIR, DSC and TG analyses showed that resin polymerization was complete, and there were no differences between the spectra and thermal curve profiles of the materials obtained before and after AAA. TG confirmed the absence of volatile compounds and evidenced good thermal stability up to 200 °C, and similar amounts of residues were found in all resins evaluated before and after AAA. The AAA treatment did not significantly affect the resin surface. Therefore, regardless of the resin brand, AAA did not influence the microstructure or the mechanical properties.
Impact of implementation choices on quantitative predictions of cell-based computational models
NASA Astrophysics Data System (ADS)
Kursawe, Jochen; Baker, Ruth E.; Fletcher, Alexander G.
2017-09-01
'Cell-based' models provide a powerful computational tool for studying the mechanisms underlying the growth and dynamics of biological tissues in health and disease. An increasing amount of quantitative data with cellular resolution has paved the way for the quantitative parameterisation and validation of such models. However, the numerical implementation of cell-based models remains challenging, and little work has been done to understand to what extent implementation choices may influence model predictions. Here, we consider the numerical implementation of a popular class of cell-based models called vertex models, which are often used to study epithelial tissues. In two-dimensional vertex models, a tissue is approximated as a tessellation of polygons and the vertices of these polygons move due to mechanical forces originating from the cells. Such models have been used extensively to study the mechanical regulation of tissue topology in the literature. Here, we analyse how the model predictions may be affected by numerical parameters, such as the size of the time step, and non-physical model parameters, such as length thresholds for cell rearrangement. We find that vertex positions and summary statistics are sensitive to several of these implementation parameters. For example, the predicted tissue size decreases with decreasing cell cycle durations, and cell rearrangement may be suppressed by large time steps. These findings are counter-intuitive and illustrate that model predictions need to be thoroughly analysed and implementation details carefully considered when applying cell-based computational models in a quantitative setting.
The shape of CMB temperature and polarization peaks on the sphere
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marcos-Caballero, A.; Fernández-Cobos, R.; Martínez-González, E.
2016-04-01
We present a theoretical study of CMB temperature peaks, including their effect on the polarization field, and allowing nonzero eccentricity. The formalism is developed in harmonic space and uses the covariant derivative on the sphere, which guarantees that the expressions obtained are completely valid at large scales (i.e., no flat approximation). The expected patterns induced by the peak, either in temperature or polarization, are calculated, as well as their covariances. It is found that the eccentricity introduces a quadrupolar dependence in the peak shape, which is proportional to a complex bias parameter b_ε characterizing the peak asymmetry and orientation. In addition, the one-point statistics of the variables defining the peak on the sphere is reviewed, finding some differences with respect to the flat case for large peaks. Finally, we present a mechanism to simulate constrained CMB maps with a particular peak on the field, which is an interesting tool for analysing the statistical properties of the peaks present in the data.
DESIGNING ENVIRONMENTAL MONITORING DATABASES FOR STATISTICAL ASSESSMENT
Databases designed for statistical analyses have characteristics that distinguish them from databases intended for general use. EMAP uses a probabilistic sampling design to collect data to produce statistical assessments of environmental conditions. In addition to supporting the ...
Comparing Visual and Statistical Analysis of Multiple Baseline Design Graphs.
Wolfe, Katie; Dickenson, Tammiee S; Miller, Bridget; McGrath, Kathleen V
2018-04-01
A growing number of statistical analyses are being developed for single-case research. One important factor in evaluating these methods is the extent to which each corresponds to visual analysis. Few studies have compared statistical and visual analysis, and information about more recently developed statistics is scarce. Therefore, our purpose was to evaluate the agreement between visual analysis and four statistical analyses: improvement rate difference (IRD); Tau-U; Hedges, Pustejovsky, Shadish (HPS) effect size; and between-case standardized mean difference (BC-SMD). Results indicate that IRD and BC-SMD had the strongest overall agreement with visual analysis. Although Tau-U had strong agreement with visual analysis on raw values, it had poorer agreement when those values were dichotomized to represent the presence or absence of a functional relation. Overall, visual analysis appeared to be more conservative than statistical analysis, but further research is needed to evaluate the nature of these disagreements.
USDA-ARS?s Scientific Manuscript database
Agronomic and Environmental research experiments result in data that are analyzed using statistical methods. These data are unavoidably accompanied by uncertainty. Decisions about hypotheses, based on statistical analyses of these data are therefore subject to error. This error is of three types,...
The Statistical Interpretation of Classical Thermodynamic Heating and Expansion Processes
ERIC Educational Resources Information Center
Cartier, Stephen F.
2011-01-01
A statistical model has been developed and applied to interpret thermodynamic processes typically presented from the macroscopic, classical perspective. Through this model, students learn and apply the concepts of statistical mechanics, quantum mechanics, and classical thermodynamics in the analysis of the (i) constant volume heating, (ii)…
Statistical Learning of Phonetic Categories: Insights from a Computational Approach
ERIC Educational Resources Information Center
McMurray, Bob; Aslin, Richard N.; Toscano, Joseph C.
2009-01-01
Recent evidence (Maye, Werker & Gerken, 2002) suggests that statistical learning may be an important mechanism for the acquisition of phonetic categories in the infant's native language. We examined the sufficiency of this hypothesis and its implications for development by implementing a statistical learning mechanism in a computational model…
The Large-Scale Structure of Semantic Networks: Statistical Analyses and a Model of Semantic Growth
ERIC Educational Resources Information Center
Steyvers, Mark; Tenenbaum, Joshua B.
2005-01-01
We present statistical analyses of the large-scale structure of 3 types of semantic networks: word associations, WordNet, and Roget's Thesaurus. We show that they have a small-world structure, characterized by sparse connectivity, short average path lengths between words, and strong local clustering. In addition, the distributions of the number of…
Wright, Aidan G C; Simms, Leonard J
2014-01-01
The current study examines the relations among contemporary models of pathological and normal-range personality traits. Specifically, we report on (a) conjoint exploratory factor analyses of the Computerized Adaptive Test of Personality Disorder static form (CAT-PD-SF) with the Personality Inventory for the Diagnostic and Statistical Manual of Mental Disorders, fifth edition (PID-5) and the NEO Personality Inventory-3 First Half (NEO-PI-3FH), and (b) unfolding hierarchical analyses of the three measures in a large general psychiatric outpatient sample (n = 628; 64% female). A five-factor solution provided conceptually coherent alignment among the CAT-PD-SF, PID-5, and NEO-PI-3FH scales. Hierarchical solutions suggested that higher-order factors bear strong resemblance to dimensions that emerge from structural models of psychopathology (e.g., Internalizing and Externalizing spectra). These results demonstrate that the CAT-PD-SF adheres to the consensual structure of broad trait domains at the five-factor level. Additionally, patterns of scale loadings further inform questions of structure and bipolarity of facet- and domain-level constructs. Finally, hierarchical analyses strengthen the argument for using broad dimensions that span normative and pathological functioning to scaffold a quantitatively derived phenotypic structure of psychopathology to orient future research on explanatory, etiological, and maintenance mechanisms.
Genetic co-structuring in host-parasite systems: Empirical data from raccoons and raccoon ticks
Dharmarajan, Guha; Beasley, James C.; Beatty, William S.; ...
2016-03-31
Many aspects of parasite biology critically depend on their hosts, and understanding how host-parasite populations are co-structured can help improve our understanding of the ecology of parasites, their hosts, and host-parasite interactions. Here, this study utilized genetic data collected from raccoons (Procyon lotor), and a specialist parasite, the raccoon tick (Ixodes texanus), to test for genetic co-structuring of host-parasite populations at both landscape and host scales. At the landscape scale, our analyses revealed a significant correlation between genetic and geographic distance matrices (i.e., isolation by distance) in ticks, but not their hosts. While there are several mechanisms that could lead to a stronger pattern of isolation by distance in tick vs. raccoon datasets, our analyses suggest that at least one reason for the above pattern is the substantial increase in statistical power (due to the ≈8-fold increase in sample size) afforded by sampling parasites. Host-scale analyses indicated higher relatedness between ticks sampled from related vs. unrelated raccoons trapped within the same habitat patch, a pattern likely driven by increased contact rates between related hosts. Lastly, by utilizing fine-scale genetic data from both parasites and hosts, our analyses help improve our understanding of epidemiology and host ecology.
Elastin: a representative ideal protein elastomer.
Urry, D W; Hugel, T; Seitz, M; Gaub, H E; Sheiba, L; Dea, J; Xu, J; Parker, T
2002-01-01
During the last half century, identification of an ideal (predominantly entropic) protein elastomer was generally thought to require that the ideal protein elastomer be a random chain network. Here, we report two new sets of data and review previous data. The first set of new data utilizes atomic force microscopy to report single-chain force-extension curves for (GVGVP)(251) and (GVGIP)(260), and provides evidence for single-chain ideal elasticity. The second class of new data provides a direct contrast between low-frequency sound absorption (0.1-10 kHz) exhibited by random-chain network elastomers and by elastin protein-based polymers. Earlier composition, dielectric relaxation (1-1000 MHz), thermoelasticity, molecular mechanics and dynamics calculations and thermodynamic and statistical mechanical analyses are presented, that combine with the new data to contrast with random-chain network rubbers and to detail the presence of regular non-random structural elements of the elastin-based systems that lose entropic elastomeric force upon thermal denaturation. The data and analyses affirm an earlier contrary argument that components of elastin, the elastic protein of the mammalian elastic fibre, and purified elastin fibre itself contain dynamic, non-random, regularly repeating structures that exhibit dominantly entropic elasticity by means of a damping of internal chain dynamics on extension. PMID:11911774
Apparent Interfacial Fracture Toughness of Resin/Ceramic Systems
Della Bona, A.; Anusavice, K.J.; Mecholsky, J.J.
2008-01-01
We suggest that the apparent interfacial fracture toughness (KA) may be estimated by fracture mechanics and fractography. This study tested the hypothesis that the KA of the adhesion zone of resin/ceramic systems is affected by the ceramic microstructure. Lithia disilicate-based (Empress2-E2) and leucite-based (Empress-E1) ceramics were surface-treated with hydrofluoric acid (HF) and/or silane (S), followed by an adhesive resin. Microtensile test specimens (n = 30; area of 1 ± 0.01 mm²) were indented (9.8 N) at the interface and loaded to failure in tension. We used tensile strength (σ) and the critical crack size (c) to calculate KA (KA = Yσc^1/2, with Y = 1.65). ANOVA and Weibull analyses were used for statistical analyses. Mean KA (MPa·m^1/2) values were: (E1HF) 0.26 ± 0.06; (E1S) 0.23 ± 0.06; (E1HFS) 0.30 ± 0.06; (E2HF) 0.31 ± 0.06; (E2S) 0.13 ± 0.05; and (E2HFS) 0.41 ± 0.07. All fractures originated from indentation sites. Estimation of interfacial toughness was feasible by fracture mechanics and fractography. The KA for the systems tested was affected by the ceramic microstructure and surface treatment. PMID:17062746
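The abstract gives the estimation formula explicitly (KA = Yσc^1/2, Y = 1.65), so it can be sketched directly. The input values below (a 30 MPa tensile strength and a 40 µm critical crack size) are illustrative assumptions, not the study's measured data; they merely land in the reported KA range.

```python
# Sketch of the apparent interfacial fracture toughness estimate from the
# abstract: K_A = Y * sigma * sqrt(c), with Y = 1.65 as reported.
def apparent_toughness(sigma_mpa: float, c_m: float, Y: float = 1.65) -> float:
    """Return K_A in MPa·m^0.5 given tensile strength (MPa) and crack size (m)."""
    return Y * sigma_mpa * c_m ** 0.5

# Illustrative inputs: sigma = 30 MPa, critical crack size c = 40 µm
ka = apparent_toughness(30.0, 40e-6)
print(f"K_A = {ka:.2f} MPa·m^0.5")  # ~0.31, within the reported 0.13-0.41 range
```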
Refined elasticity sampling for Monte Carlo-based identification of stabilizing network patterns.
Childs, Dorothee; Grimbs, Sergio; Selbig, Joachim
2015-06-15
Structural kinetic modelling (SKM) is a framework to analyse whether a metabolic steady state remains stable under perturbation, without requiring detailed knowledge about individual rate equations. It provides a representation of the system's Jacobian matrix that depends solely on the network structure, steady state measurements, and the elasticities at the steady state. For a measured steady state, stability criteria can be derived by generating a large number of SKMs with randomly sampled elasticities and evaluating the resulting Jacobian matrices. The elasticity space can be analysed statistically in order to detect network positions that contribute significantly to the perturbation response. Here, we extend this approach by examining the kinetic feasibility of the elasticity combinations created during Monte Carlo sampling. Using a set of small example systems, we show that the majority of sampled SKMs would yield negative kinetic parameters if they were translated back into kinetic models. To overcome this problem, a simple criterion is formulated that screens out such infeasible models. After evaluating the small example pathways, the methodology was used to study two steady states of the neuronal TCA cycle and the intrinsic mechanisms responsible for their stability or instability. The findings of the statistical elasticity analysis confirm that several elasticities are jointly coordinated to control stability and that the main sources of potential instability are mutations in the enzyme alpha-ketoglutarate dehydrogenase. © The Author 2015. Published by Oxford University Press.
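The core sampling-and-evaluation loop described above can be caricatured in a few lines. This is not the authors' SKM implementation: real SKM constrains the Jacobian via network structure and elasticities, and the paper's kinetic-feasibility screen is not shown. The toy below only illustrates classifying randomly sampled Jacobians as stable when all eigenvalues have negative real part.

```python
# Toy Monte Carlo stability screen: sample random Jacobian matrices and
# count how often the steady state is linearly stable (all eigenvalue
# real parts negative). Matrix structure and shift are illustrative.
import numpy as np

rng = np.random.default_rng(4)
n, trials = 4, 2000
stable = 0
for _ in range(trials):
    # Random couplings with a negatively shifted diagonal (self-damping)
    J = rng.uniform(-1, 1, (n, n)) - 2.0 * np.eye(n)
    if np.all(np.linalg.eigvals(J).real < 0):
        stable += 1
print(f"stable fraction = {stable / trials:.2f}")
```

In SKM proper, the interesting statistics come from comparing the elasticity values of the stable and unstable subsamples, which is what localizes the destabilizing network positions.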
Differences in Performance Among Test Statistics for Assessing Phylogenomic Model Adequacy.
Duchêne, David A; Duchêne, Sebastian; Ho, Simon Y W
2018-05-18
Statistical phylogenetic analyses of genomic data depend on models of nucleotide or amino acid substitution. The adequacy of these substitution models can be assessed using a number of test statistics, allowing the model to be rejected when it is found to provide a poor description of the evolutionary process. A potentially valuable use of model-adequacy test statistics is to identify when data sets are likely to produce unreliable phylogenetic estimates, but their differences in performance are rarely explored. We performed a comprehensive simulation study to identify test statistics that are sensitive to some of the most commonly cited sources of phylogenetic estimation error. Our results show that, for many test statistics, traditional thresholds for assessing model adequacy can fail to reject the model when the phylogenetic inferences are inaccurate and imprecise. This is particularly problematic when analysing loci that have few variable informative sites. We propose new thresholds for assessing substitution model adequacy and demonstrate their effectiveness in analyses of three phylogenomic data sets. These thresholds lead to frequent rejection of the model for loci that yield topological inferences that are imprecise and are likely to be inaccurate. We also propose the use of a summary statistic that provides a practical assessment of overall model adequacy. Our approach offers a promising means of enhancing model choice in genome-scale data sets, potentially leading to improvements in the reliability of phylogenomic inference.
Yin, Xiaoyan; Subramanian, Subha; Hwang, Shih-Jen; O’Donnell, Christopher J.; Fox, Caroline S.; Courchesne, Paul; Muntendam, Pieter; Adourian, Aram; Juhasz, Peter; Larson, Martin G.; Levy, Daniel
2014-01-01
Objective: Incorporation of novel plasma protein biomarkers may improve current models for prediction of atherosclerotic cardiovascular disease (ASCVD) risk. Approach and Results: We utilized discovery mass spectrometry (MS) to determine plasma concentrations of 861 proteins in 135 myocardial infarction (MI) cases and 135 matched controls. We then measured 59 markers by targeted MS in 336 ASCVD case-control pairs. Associations with MI or ASCVD were tested in single marker and multimarker analyses adjusted for established ASCVD risk factors. Twelve single markers from discovery MS were associated with MI incidence (at p<0.01) adjusting for clinical risk factors. Seven proteins in aggregate (cyclophilin A, CD5 antigen-like, cell surface glycoprotein MUC18, collagen-alpha 1 [XVIII] chain, salivary alpha-amylase 1, C-reactive protein, and multimerin-2) were highly associated with MI (p<0.0001) and significantly improved its prediction compared to a model with clinical risk factors alone (C-statistic of 0.71 vs. 0.84). Through targeted MS, twelve single proteins were predictors of ASCVD (at p<0.05) after adjusting for established risk factors. In multimarker analyses, four proteins in combination (alpha-1-acid glycoprotein 1, paraoxonase 1, tetranectin, and CD5 antigen-like) predicted incident ASCVD (p<0.0001) and moderately improved the C-statistic from the model with clinical covariates alone (C-statistic of 0.69 vs. 0.73). Conclusions: Proteomics profiling identified single and multimarker protein panels that are associated with new onset ASCVD and may lead to a better understanding of underlying disease mechanisms. Our findings include many novel protein biomarkers that, if externally validated, may improve risk assessment for MI and ASCVD. PMID:24526693
Statistical Analyses of Raw Material Data for MTM45-1/CF7442A-36% RW: CMH Cure Cycle
NASA Technical Reports Server (NTRS)
Coroneos, Rula; Pai, Shantaram, S.; Murthy, Pappu
2013-01-01
This report describes statistical characterization of physical properties of the composite material system MTM45-1/CF7442A, which has been tested and is currently being considered for use on spacecraft structures. This composite system is made of 6K plain weave graphite fibers in a highly toughened resin system. This report summarizes the distribution types and statistical details of the tests and the conditions for the experimental data generated. These distributions will be used in multivariate regression analyses to help determine material and design allowables for similar material systems and to establish a procedure for other material systems. Additionally, these distributions will be used in future probabilistic analyses of spacecraft structures. The specific properties that are characterized are the ultimate strength, modulus, and Poisson's ratio by using a commercially available statistical package. Results are displayed using graphical and semigraphical methods and are included in the accompanying appendixes.
Statistical emulation of landslide-induced tsunamis at the Rockall Bank, NE Atlantic.
Salmanidou, D M; Guillas, S; Georgiopoulou, A; Dias, F
2017-04-01
Statistical methods constitute a useful approach to understand and quantify the uncertainty that governs complex tsunami mechanisms. Numerical experiments may often have a high computational cost. This forms a limiting factor for performing uncertainty and sensitivity analyses, where numerous simulations are required. Statistical emulators, as surrogates of these simulators, can provide predictions of the physical process in a much faster and computationally inexpensive way. They can form a prominent solution to explore thousands of scenarios that would be otherwise numerically expensive and difficult to achieve. In this work, we build a statistical emulator of the deterministic codes used to simulate submarine sliding and tsunami generation at the Rockall Bank, NE Atlantic Ocean, in two stages. First we calibrate, against observations of the landslide deposits, the parameters used in the landslide simulations. This calibration is performed under a Bayesian framework using Gaussian Process (GP) emulators to approximate the landslide model, and the discrepancy function between model and observations. Distributions of the calibrated input parameters are obtained as a result of the calibration. In a second step, a GP emulator is built to mimic the coupled landslide-tsunami numerical process. The emulator propagates the uncertainties in the distributions of the calibrated input parameters inferred from the first step to the outputs. As a result, a quantification of the uncertainty of the maximum free surface elevation at specified locations is obtained.
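The emulation idea above (replace an expensive simulator with a cheap Gaussian-process surrogate trained on a few runs) can be sketched with a 1-D toy. This is not the authors' landslide-tsunami code or calibration pipeline; the "simulator", design points, and kernel length-scale below are illustrative assumptions.

```python
# Minimal GP emulator sketch: fit a noise-free Gaussian process to a few
# "expensive" simulator runs, then predict cheaply at many new inputs.
import numpy as np

def rbf(a, b, ell=0.3, var=1.0):
    """Squared-exponential covariance between 1-D input arrays a and b."""
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / ell) ** 2)

simulator = lambda x: np.sin(2 * np.pi * x)   # stand-in for the expensive code
X = np.linspace(0, 1, 8)                      # design points (simulator runs)
y = simulator(X)

Xs = np.linspace(0, 1, 50)                    # cheap prediction inputs
K = rbf(X, X) + 1e-8 * np.eye(len(X))         # jitter for numerical stability
Ks = rbf(Xs, X)
alpha = np.linalg.solve(K, y)
mean = Ks @ alpha                             # GP posterior mean (emulator)
cov = rbf(Xs, Xs) - Ks @ np.linalg.solve(K, Ks.T)
sd = np.sqrt(np.clip(np.diag(cov), 0.0, None))  # predictive uncertainty

print(f"max |emulator - simulator| = {np.abs(mean - simulator(Xs)).max():.3f}")
```

The predictive standard deviation `sd` is what makes the emulator useful for uncertainty propagation: it quantifies how much to trust predictions away from the design points.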
Potential Mediators in Parenting and Family Intervention: Quality of Mediation Analyses
Patel, Chandni C.; Fairchild, Amanda J.; Prinz, Ronald J.
2017-01-01
Parenting and family interventions have repeatedly shown effectiveness in preventing and treating a range of youth outcomes. Accordingly, investigators in this area have conducted a number of studies using statistical mediation to examine some of the potential mechanisms of action by which these interventions work. This review examined, from a methodological perspective, in what ways and how well family-based intervention studies tested statistical mediation. A systematic search identified 73 published outcome studies that tested mediation for family-based interventions across a wide range of child and adolescent outcomes (i.e., externalizing, internalizing, and substance-abuse problems; high-risk sexual activity; and academic achievement), for putative mediators pertaining to positive and negative parenting, family functioning, youth beliefs and coping skills, and peer relationships. Taken as a whole, the studies used designs that adequately addressed temporal precedence. The majority of studies used the product-of-coefficients approach to mediation, which is preferred and less limiting than the causal-steps approach. Statistical significance testing did not always make use of the most recently developed approaches, which would better accommodate small sample sizes and more complex functions. Specific recommendations are offered for future mediation studies in this area with respect to full longitudinal design, mediation approach, significance testing method, documentation and reporting of statistics, testing of multiple mediators, and control for Type I error. PMID:28028654
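The preferred product-of-coefficients approach with a bootstrap confidence interval, as recommended above, looks roughly like this on simulated data. The variable names, effect sizes, and sample size are illustrative assumptions, not data from any of the reviewed studies.

```python
# Sketch: product-of-coefficients mediated effect (a*b) with a percentile
# bootstrap CI. x = randomized intervention, m = mediator, y = outcome.
import numpy as np

rng = np.random.default_rng(1)
n = 500
x = rng.integers(0, 2, n).astype(float)        # randomized to 0/1
m = 0.5 * x + rng.normal(0, 1, n)              # a-path set to 0.5
y = 0.4 * m + 0.2 * x + rng.normal(0, 1, n)    # b-path set to 0.4

def ab(x, m, y):
    a = np.polyfit(x, m, 1)[0]                       # slope of m ~ x
    X = np.column_stack([np.ones_like(x), x, m])
    b = np.linalg.lstsq(X, y, rcond=None)[0][2]      # slope of y ~ m, given x
    return a * b

# Percentile bootstrap: resample cases, recompute a*b each time
boot = np.array([ab(*(arr[idx] for arr in (x, m, y)))
                 for idx in (rng.integers(0, n, n) for _ in range(1000))])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"mediated effect = {ab(x, m, y):.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")
```

Because the true mediated effect here is 0.5 × 0.4 = 0.2, the bootstrap interval excludes zero; the bootstrap is exactly the kind of significance-testing approach the review notes accommodates the non-normal sampling distribution of a*b better than normal-theory tests.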
Playing at Statistical Mechanics
ERIC Educational Resources Information Center
Clark, Paul M.; And Others
1974-01-01
Discussed are the applications of counting techniques of a sorting game to distributions and concepts in statistical mechanics. Included are the following distributions: Fermi-Dirac, Bose-Einstein, and most probable. (RH)
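The three distributions named in the abstract have standard closed forms for the mean occupation number of a state; a quick numerical check (in units with k_B = 1, with illustrative parameter values) also shows all three converging in the classical limit.

```python
# Mean occupation number of a single-particle state at energy E, chemical
# potential mu, and temperature T, for the distributions named above.
import math

def occupation(E, mu, T, kind):
    x = (E - mu) / T
    if kind == "fermi-dirac":      # bounded: 0 <= n <= 1 (exclusion principle)
        return 1.0 / (math.exp(x) + 1.0)
    if kind == "bose-einstein":    # requires E > mu; diverges as E -> mu
        return 1.0 / (math.exp(x) - 1.0)
    if kind == "boltzmann":        # classical (most-probable) limit
        return math.exp(-x)
    raise ValueError(kind)

# For (E - mu)/T >> 1 all three statistics approach the Boltzmann limit
for kind in ("fermi-dirac", "bose-einstein", "boltzmann"):
    print(kind, round(occupation(5.0, 0.0, 1.0, kind), 5))
```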
Post Hoc Analyses of ApoE Genotype-Defined Subgroups in Clinical Trials.
Kennedy, Richard E; Cutter, Gary R; Wang, Guoqiao; Schneider, Lon S
2016-01-01
Many post hoc analyses of clinical trials in Alzheimer's disease (AD) and mild cognitive impairment (MCI) are in small Phase 2 trials. Subject heterogeneity may lead to statistically significant post hoc results that cannot be replicated in larger follow-up studies. We investigated the extent of this problem using simulation studies mimicking current trial methods with post hoc analyses based on ApoE4 carrier status. We used a meta-database of 24 studies, including 3,574 subjects with mild AD and 1,171 subjects with MCI/prodromal AD, to simulate clinical trial scenarios. Post hoc analyses examined if rates of progression on the Alzheimer's Disease Assessment Scale-cognitive (ADAS-cog) differed between ApoE4 carriers and non-carriers. Across studies, ApoE4 carriers were younger and had lower baseline scores, greater rates of progression, and greater variability on the ADAS-cog. Up to 18% of post hoc analyses for 18-month trials in AD showed greater rates of progression for ApoE4 non-carriers that were statistically significant but unlikely to be confirmed in follow-up studies. The frequency of erroneous conclusions dropped below 3% with trials of 100 subjects per arm. In MCI, rates of statistically significant differences with greater progression in ApoE4 non-carriers remained below 3% unless sample sizes were below 25 subjects per arm. Statistically significant differences for ApoE4 in post hoc analyses often reflect heterogeneity among small samples rather than true differential effect among ApoE4 subtypes. Such analyses must be viewed cautiously. ApoE genotype should be incorporated into the design stage to minimize erroneous conclusions.
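The small-sample point above is easy to reproduce in a toy simulation: even with no true subgroup effect, unadjusted post hoc comparisons flag "significant" differences at roughly the nominal Type I error rate, and each such flag invites an erroneous subgroup story. The sample size and score SD below are assumptions for illustration, not values from the meta-database.

```python
# Toy post hoc subgroup simulation: carrier and non-carrier change scores
# are drawn from the SAME distribution (no true ApoE4 effect), yet a
# fraction of trials still shows p < 0.05 by chance.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
n_per_arm, sims, hits = 25, 500, 0
for _ in range(sims):
    carriers = rng.normal(0, 8, n_per_arm)       # identical distributions:
    noncarriers = rng.normal(0, 8, n_per_arm)    # no real subgroup difference
    if stats.ttest_ind(carriers, noncarriers).pvalue < 0.05:
        hits += 1
print(f"spurious significant subgroup splits: {100 * hits / sims:.1f}%")
```

Around 5% of these null comparisons come out "significant", which is why the abstract urges building genotype into the design stage rather than relying on post hoc splits.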
Rao, Goutham; Lopez-Jimenez, Francisco; Boyd, Jack; D'Amico, Frank; Durant, Nefertiti H; Hlatky, Mark A; Howard, George; Kirley, Katherine; Masi, Christopher; Powell-Wiley, Tiffany M; Solomonides, Anthony E; West, Colin P; Wessel, Jennifer
2017-09-05
Meta-analyses are becoming increasingly popular, especially in the fields of cardiovascular disease prevention and treatment. They are often considered to be a reliable source of evidence for making healthcare decisions. Unfortunately, problems among meta-analyses such as the misapplication and misinterpretation of statistical methods and tests are long-standing and widespread. The purposes of this statement are to review key steps in the development of a meta-analysis and to provide recommendations that will be useful for carrying out meta-analyses and for readers and journal editors, who must interpret the findings and gauge methodological quality. To make the statement practical and accessible, detailed descriptions of statistical methods have been omitted. Based on a survey of cardiovascular meta-analyses, published literature on methodology, expert consultation, and consensus among the writing group, key recommendations are provided. Recommendations reinforce several current practices, including protocol registration; comprehensive search strategies; methods for data extraction and abstraction; methods for identifying, measuring, and dealing with heterogeneity; and statistical methods for pooling results. Other practices should be discontinued, including the use of levels of evidence and evidence hierarchies to gauge the value and impact of different study designs (including meta-analyses) and the use of structured tools to assess the quality of studies to be included in a meta-analysis. We also recommend choosing a pooling model for conventional meta-analyses (fixed effect or random effects) on the basis of clinical and methodological similarities among studies to be included, rather than the results of a test for statistical heterogeneity. © 2017 American Heart Association, Inc.
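The fixed-effect versus random-effects pooling choice discussed above reduces, computationally, to a choice of weights. The sketch below implements standard inverse-variance pooling and the DerSimonian-Laird between-study variance on illustrative effect sizes (the numbers are made up, not drawn from any surveyed meta-analysis); it is meant to show the mechanics, not to endorse choosing the model from the heterogeneity statistic, which the statement advises against.

```python
# Inverse-variance fixed-effect and DerSimonian-Laird random-effects pooling.
import numpy as np

yi = np.array([0.30, 0.10, 0.45, 0.25, 0.05])   # illustrative study effects
se = np.array([0.12, 0.10, 0.20, 0.15, 0.08])   # their standard errors

w = 1.0 / se**2                                  # fixed-effect weights
fixed = np.sum(w * yi) / np.sum(w)

# DerSimonian-Laird between-study variance tau^2 from Cochran's Q
Q = np.sum(w * (yi - fixed) ** 2)
df = len(yi) - 1
c = np.sum(w) - np.sum(w**2) / np.sum(w)
tau2 = max(0.0, (Q - df) / c)

w_re = 1.0 / (se**2 + tau2)                      # random-effects weights
random_eff = np.sum(w_re * yi) / np.sum(w_re)
print(f"fixed = {fixed:.3f}, random = {random_eff:.3f}, tau^2 = {tau2:.4f}")
```

Adding tau^2 to every study's variance flattens the weights, so small studies count for relatively more under the random-effects model, which is why the two pooled estimates differ when heterogeneity is present.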
What controls the variability of oxygen in the subpolar North Pacific?
NASA Astrophysics Data System (ADS)
Takano, Yohei
Dissolved oxygen is a widely observed chemical quantity in the oceans along with temperature and salinity. Changes in dissolved oxygen have been observed over the world oceans. Observed oxygen at Ocean Station Papa (OSP, 50°N, 145°W) in the Gulf of Alaska exhibits strong variability over interannual and decadal timescales; however, the mechanisms driving the observed variability are not yet fully understood. Furthermore, irregular sampling frequency and relatively short record length make it difficult to detect low-frequency variability. Motivated by these observations, we investigate the mechanisms driving the low-frequency variability of oxygen in the subpolar North Pacific. The specific purposes of this study are (1) to evaluate the robustness of the observed low-frequency variability of dissolved oxygen and (2) to determine the mechanisms driving the observed variability using statistical data analysis and numerical simulations. To evaluate the robustness of the low-frequency variability, we conducted spectral analyses on the observed oxygen at OSP. To address the irregular sampling frequency, we randomly sub-sampled the raw data to form 500 ensemble members with a regular time interval, and then performed spectral analyses. The resulting power spectrum of oxygen exhibits a robust low-frequency variability, and a statistically significant spectral peak is identified at a timescale of 15-20 years. The wintertime oceanic barotropic streamfunction is significantly correlated with the observed oxygen anomaly at OSP with a north-south dipole structure over the North Pacific. We hypothesize that the observed low-frequency variability is primarily driven by the variability of large-scale ocean circulation in the North Pacific. To test this hypothesis, we simulate the three-dimensional distribution of oxygen anomaly from 1952 to 2001 using data-constrained circulation fields.
The simulated oxygen anomaly shows pronounced variability in the Gulf of Alaska, showing that this region is a hotspot of oxygen fluctuation. Anomalous advection acting on the climatological mean oxygen gradient is the source of oxygen variability in this simulation. Empirical Orthogonal Function (EOF) analyses of the simulated oxygen show that the two dominant modes of the oxygen anomaly explain more than 50% of the oxygen variance over the North Pacific and are closely related to the dominant modes of climate variability in the North Pacific (the Pacific Decadal Oscillation and the North Pacific Oscillation). Our results imply an important link between large-scale climate fluctuations, ocean circulation, and biogeochemical tracers in the North Pacific.
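The EOF step described above amounts to an SVD of the anomaly (time × space) matrix. The sketch below applies it to a synthetic field with one planted spatial mode plus noise; it is not the study's simulated oxygen data, and the field dimensions and amplitudes are illustrative.

```python
# EOF analysis sketch: dominant spatial modes of an anomaly field via SVD.
import numpy as np

rng = np.random.default_rng(2)
nt, nx = 200, 40
t = np.arange(nt)
pattern = np.sin(np.linspace(0, np.pi, nx))      # one planted spatial mode
field = (np.outer(np.sin(2 * np.pi * t / 50), pattern)
         + 0.1 * rng.normal(size=(nt, nx)))      # mode + observational noise

anom = field - field.mean(axis=0)                # remove the time mean
U, s, Vt = np.linalg.svd(anom, full_matrices=False)
# Rows of Vt are the EOF spatial patterns; columns of U are their time series
var_frac = s**2 / np.sum(s**2)                   # variance explained per mode
print(f"EOF1 explains {100 * var_frac[0]:.1f}% of the variance")
```

The variance-explained fractions are what support statements like "the two dominant modes explain more than 50% of the variance"; relating the EOF time series (columns of `U`) to climate indices is then a simple correlation exercise.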
Predicting the binding preference of transcription factors to individual DNA k-mers.
Alleyne, Trevis M; Peña-Castillo, Lourdes; Badis, Gwenael; Talukder, Shaheynoor; Berger, Michael F; Gehrke, Andrew R; Philippakis, Anthony A; Bulyk, Martha L; Morris, Quaid D; Hughes, Timothy R
2009-04-15
Recognition of specific DNA sequences is a central mechanism by which transcription factors (TFs) control gene expression. Many TF-binding preferences, however, are unknown or poorly characterized, in part due to the difficulty associated with determining their specificity experimentally, and an incomplete understanding of the mechanisms governing sequence specificity. New techniques that estimate the affinity of TFs to all possible k-mers provide a new opportunity to study DNA-protein interaction mechanisms, and may facilitate inference of binding preferences for members of a given TF family when such information is available for other family members. We employed a new dataset consisting of the relative preferences of mouse homeodomains for all eight-base DNA sequences in order to ask how well we can predict the binding profiles of homeodomains when only their protein sequences are given. We evaluated a panel of standard statistical inference techniques, as well as variations of the protein features considered. A nearest-neighbour approach based on functionally important residues emerged as one of the most effective methods. Our results underscore the complexity of TF-DNA recognition, and suggest a rational approach for future analyses of TF families.
Izquierdo, Paula P; de Biasi, Ronaldo S; Elias, Carlos N; Nojima, Lincoln I
2010-12-01
Our purpose was to study the mechanical properties and phase transformations of orthodontic wires submitted to in-vivo exposure in the mouth for different periods of time. Stainless steel wires were tied to fixed orthodontic appliances of 30 patients from the orthodontics clinic of Universidade Federal do Rio de Janeiro School of Dentistry in Brazil. According to the duration of the clinical treatment, the patients were divided into 3 groups. After in-vivo exposure, the samples were studied by mechanical testing (torsion) and ferromagnetic resonance. Statistical analyses were carried out to evaluate the correlation between time of exposure, mechanical properties, and austenite-to-martensite transformation among the groups. The results were compared with as-received control samples. The torque values increased as time in the mouth increased. The increase in torque resistance showed high correlations with time of exposure (P = 0.005) and austenite-martensite phase transformation. The resistance of stainless steel orthodontic wires increases as the time in the mouth increases; this effect is attributed to the austenite-to-martensite transformation. Copyright © 2010 American Association of Orthodontists. Published by Mosby, Inc. All rights reserved.
Three-dimensional cellular deformation analysis with a two-photon magnetic manipulator workstation.
Huang, Hayden; Dong, Chen Y; Kwon, Hyuk-Sang; Sutin, Jason D; Kamm, Roger D; So, Peter T C
2002-04-01
The ability to apply quantifiable mechanical stresses at the microscopic scale is critical for studying cellular responses to mechanical forces. This necessitates the use of force transducers that can apply precisely controlled forces to cells while monitoring the responses noninvasively. This paper describes the development of a micromanipulation workstation integrating two-photon, three-dimensional imaging with a high-force, uniform-gradient magnetic manipulator. The uniform-gradient magnetic field applies nearly uniform forces to a large cell population, permitting statistical quantification of select molecular responses to mechanical stresses. The magnetic transducer design is capable of exerting over 200 pN of force on 4.5-microm-diameter paramagnetic particles and over 800 pN on 5.0-microm ferromagnetic particles. These forces vary within +/-10% over an area 500 x 500 microm2. The compatibility with the use of high numerical aperture (approximately 1.0) objectives is an integral part of the workstation design allowing submicron-resolution, three-dimensional, two-photon imaging. Three-dimensional analyses of cellular deformation under localized mechanical strain are reported. These measurements indicate that the response of cells to large focal stresses may contain three-dimensional global deformations and show the suitability of this workstation to further studying cellular response to mechanical stresses.
Menzerath-Altmann Law: Statistical Mechanical Interpretation as Applied to a Linguistic Organization
NASA Astrophysics Data System (ADS)
Eroglu, Sertac
2014-10-01
The distribution behavior described by the empirical Menzerath-Altmann law is frequently encountered during the self-organization of linguistic and non-linguistic natural organizations at various structural levels. This study presents a statistical mechanical derivation of the law based on the analogy between the classical particles of a statistical mechanical organization and the distinct words of a textual organization. The derived model, a transformed (generalized) form of the Menzerath-Altmann model, was termed the statistical mechanical Menzerath-Altmann model. The derived model allows interpreting the model parameters in terms of physical concepts. We also propose that many organizations presenting the Menzerath-Altmann law behavior, whether linguistic or not, can be methodically examined by the transformed distribution model through the properly defined structure-dependent parameter and the energy-associated states.
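The abstract does not write out the model itself; as a hedged illustration, the widely cited classical form of the Menzerath-Altmann law, y(x) = a·x^(−b)·e^(−cx) (constituent size y as a function of construct size x), can be sketched and evaluated in a few lines. The parameter values below are invented purely for illustration:

```python
import math

def menzerath_altmann(x, a, b, c):
    """Classical Menzerath-Altmann model: constituent size y as a
    function of construct size x, y(x) = a * x**(-b) * exp(-c * x)."""
    return a * x ** (-b) * math.exp(-c * x)

# Illustrative (hypothetical) parameters: with b, c > 0 the model
# predicts that constituents shrink as constructs grow.
a, b, c = 4.0, 0.3, 0.05
sizes = [menzerath_altmann(x, a, b, c) for x in range(1, 11)]
assert all(s1 > s2 for s1, s2 in zip(sizes, sizes[1:]))  # monotone decrease
```

With positive b and c the curve is strictly decreasing, which is the qualitative behaviour the law describes.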
[Biomechanical significance of the acetabular roof and its reaction to mechanical injury].
Domazet, N; Starović, D; Nedeljković, R
1999-01-01
The introduction of morphometry into the quantitative analysis of the bone system and the functional adaptation of the acetabulum to mechanical damage and injury enabled a relatively simple and acceptable examination of morphological acetabular changes in patients with damaged hip joints. Measurements of the depth and form of the acetabulum can be done by radiological methods, computerized tomography and ultrasound (1-9). The aim of the study was to obtain data on the behaviour of the acetabular roof, the so-called "eyebrow", by morphometric analyses during different mechanical injuries. Clinical studies of the effect of different loads on the acetabular roof were carried out in 741 patients. Radiographic findings of 400 men and 341 women were analysed. The control group was composed of 148 patients with normal hip joints. The average age of the patients was 54.7 years and that of the control subjects 52.0 years. Data processing was done for all examined patients. On the basis of our measurements, the average size of the female "eyebrow" ranged from 24.8 mm to 31.5 mm with a standard deviation of 0.93, and in men from 29.4 mm to 40.3 mm with a standard deviation of 1.54. The average size in the whole population was 32.1 mm with a standard deviation of 15.61. Statistical analyses revealed a statistically significant inverse correlation between age and "eyebrow" size in men (r = 0.124; p < 0.05) (Graph 1). In female patients, however, the correlation was not statistically significant (r = 0.060; p > 0.05). Examination of the collodiaphysial angle and the length of the "eyebrow" revealed that "eyebrow" length was in inverse proportion to the size of the collodiaphysial angle (r = 0.113; p < 0.05). The average "eyebrow" length in relation to the size of the collodiaphysial angle ranged from 21.3 mm to 35.2 mm with a standard deviation of 1.60.
There was no statistically significant correlation between "eyebrow" size and Wiberg's angle in male (r = 0.049; p > 0.05) or female (r = 0.005; p > 0.05) patients. The "eyebrow" length was proportionally dependent on the size of the shortened extremity in all examined subjects. This dependence was statistically significant both in female (r = 0.208; p < 0.05) and male (r = 0.193; p < 0.05) patients. The size, form and cross-section of the acetabulum changed under different loads, but its dimensions and morphology differed only slightly, and not significantly, from those of the control group. These findings are graphically presented in Figure 5 and numerically in Tables 1 and 2. The study of the spatial orientation of the hip joints revealed that the fossa acetabuli was directed forwards, downwards and laterally; this was in accordance with the results of other authors (1, 7, 9, 15, 18). There was a statistically significant difference in "eyebrow" size between patients and normal subjects (t = 3.88; p < 0.05); the average difference in "eyebrow" size was 6.892 mm. A larger "eyebrow" was found in patients with a normally loaded hip. There was also a significant difference in "eyebrow" size between female patients and healthy female subjects (t = 4.605; p < 0.05), with an "eyebrow" larger by 8.79 mm in female subjects with a normally loaded hip. On the basis of our study it can be concluded that findings related to changes in the acetabular roof, the so-called "eyebrow", are important in the diagnosis, follow-up and therapy of the pathogenetic processes of these disorders.
Youngstrom, Eric A
2014-03-01
To offer a practical demonstration of receiver operating characteristic (ROC) analyses, diagnostic efficiency statistics, and their application to clinical decision making using a popular parent checklist to assess for potential mood disorder. Secondary analyses of data from 589 families seeking outpatient mental health services, completing the Child Behavior Checklist and semi-structured diagnostic interviews. Internalizing Problems raw scores discriminated mood disorders significantly better than did age- and gender-normed T scores, or an Affective Problems score. Internalizing scores <8 had a diagnostic likelihood ratio <0.3, and scores >30 had a diagnostic likelihood ratio of 7.4. This study illustrates a series of steps in defining a clinical problem, operationalizing it, selecting a valid study design, and using ROC analyses to generate statistics that support clinical decisions. The ROC framework offers important advantages for clinical interpretation. Appendices include sample scripts using SPSS and R to check assumptions and conduct ROC analyses.
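The diagnostic likelihood ratios reported above follow directly from sensitivity and specificity at a given cutoff. A minimal sketch, using invented 2x2 counts rather than the study's data:

```python
def diagnostic_likelihood_ratios(tp, fp, fn, tn):
    """Positive and negative diagnostic likelihood ratios from a 2x2 table.
    DLR+ = sensitivity / (1 - specificity); DLR- = (1 - sens) / spec."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    return sens / (1 - spec), (1 - sens) / spec

# Hypothetical counts for a checklist cutoff (not the study's data):
dlr_pos, dlr_neg = diagnostic_likelihood_ratios(tp=80, fp=30, fn=20, tn=170)
# A DLR+ well above 1 raises the post-test odds of a mood disorder;
# a DLR- well below 1 lowers them.
```

Multiplying the pre-test odds by the relevant DLR gives the post-test odds, which is what makes these statistics directly usable in clinical decision making.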
Winer, E Samuel; Cervone, Daniel; Bryant, Jessica; McKinney, Cliff; Liu, Richard T; Nadorff, Michael R
2016-09-01
A popular way to attempt to discern causality in clinical psychology is through mediation analysis. However, mediation analysis is sometimes applied to research questions in clinical psychology when inferring causality is impossible. This practice may soon increase with new, readily available, and easy-to-use statistical advances. Thus, we here provide a heuristic to remind clinical psychological scientists of the assumptions of mediation analyses. We describe recent statistical advances and unpack assumptions of causality in mediation, underscoring the importance of time in understanding mediational hypotheses and analyses in clinical psychology. Example analyses demonstrate that statistical mediation can occur despite theoretical mediation being improbable. We propose a delineation of mediational effects derived from cross-sectional designs into the terms temporal and atemporal associations to emphasize time in conceptualizing process models in clinical psychology. The general implications for mediational hypotheses and the temporal frameworks from within which they may be drawn are discussed. © 2016 Wiley Periodicals, Inc.
Kyong, Jin Burm; Lee, Yelin; D’Souza, Malcolm John; Kevill, Dennis Neil
2012-01-01
The “parent” tertiary alkyl chloroformate, tert-butyl chloroformate, is unstable, but tert-butyl chlorothioformate (1) is of increased stability, and a kinetic investigation of its solvolyses is presented. Analyses in terms of the simple and extended Grunwald-Winstein equations are carried out. The original one-term equation satisfactorily correlates the data, with a sensitivity towards changes in solvent ionizing power of 0.73 ± 0.03. When the two-term equation is applied, the sensitivity towards changes in solvent nucleophilicity of 0.13 ± 0.09 is associated with a high (0.17) probability that the term it governs is not statistically significant. PMID:23538747
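As a hedged illustration of the extended Grunwald-Winstein correlation, log(k/k0) = l·N + m·Y + c, the two-term analysis can be sketched as an ordinary least-squares fit. The solvent scales and rate constants below are synthetic, not the paper's data:

```python
import numpy as np

# Extended Grunwald-Winstein equation: log(k/k0) = l*N + m*Y + c,
# where N is solvent nucleophilicity and Y is solvent ionizing power.
# Synthetic illustration (values invented, not the paper's data):
rng = np.random.default_rng(0)
N = rng.uniform(-4, 1, 20)           # nucleophilicity scale
Y = rng.uniform(-2, 5, 20)           # ionizing-power scale
l_true, m_true, c_true = 0.13, 0.73, 0.05
logk = l_true * N + m_true * Y + c_true + rng.normal(0, 0.02, 20)

# Least-squares fit of the two-term equation:
X = np.column_stack([N, Y, np.ones_like(N)])
(l_hat, m_hat, c_hat), *_ = np.linalg.lstsq(X, logk, rcond=None)
```

In practice the standard errors of l and m (and the associated p-values) are what decide whether the nucleophilicity term is statistically justified, which is the question the abstract addresses.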
Laser surface texturing of polypropylene to increase adhesive bonding
NASA Astrophysics Data System (ADS)
Mandolfino, Chiara; Pizzorni, Marco; Lertora, Enrico; Gambaro, Carla
2018-05-01
In this paper, the main parameters of laser surface texturing of polymeric substrates have been studied. The final aim of the texturing is to increase the performance of bonded joints of grey-pigmented polypropylene substrates. The experimental investigation started from the identification of the most effective treatment parameters, in order to achieve a good texture without compromising the characteristics of the bulk material. For each of these parameters, three values were identified and 27 sets of samples were realised. The surface treatment was analysed and related to the mechanical characteristics of the bonded joints by performing lap-shear tests. A statistical analysis to identify the most influential parameter completed the work.
Correlation of Mechanical Properties with Diameter and Cooling Rate of 1080 Wire-Rod
NASA Astrophysics Data System (ADS)
Kohli, A.; Poirier, D. R.
2017-12-01
More than 540 heats of 1080 wire-rod were statistically analyzed by regression analyses to see whether tensile strength and percent reduction in area (%RA) relate to wire-rod diameter and composition. As diameter increases from 5.6 to 12.7 mm, the trend in %RA shows a decrease with negligible effect on the trend of the tensile strength. It was found that the estimated cooling rate at 700 °C during controlled cooling is responsible for the "diameter effect." The effect of composition on %RA is minor when contrasted to the "diameter effect." In particular, the effect of the concentrations of the residual elements on %RA within the compositional range studied is negligible.
A novel alkaloid isolated from Crotalaria paulina and identified by NMR and DFT calculations
NASA Astrophysics Data System (ADS)
Oliveira, Ramon Prata; Demuner, Antonio Jacinto; Alvarenga, Elson Santiago; Barbosa, Luiz Claudio Almeida; de Melo Silva, Thiago
2018-01-01
Pyrrolizidine alkaloids (PAs) are secondary metabolites found in Crotalaria genus and are known to have several biological activities. A novel macrocycle bislactone alkaloid, coined ethylcrotaline, was isolated and purified from the aerial parts of Crotalaria paulina. The novel macrocycle was identified with the aid of high resolution mass spectrometry and advanced nuclear magnetic resonance techniques. The relative stereochemistry of the alkaloid was defined by comparing the calculated quantum mechanical hydrogen and carbon chemical shifts of eight candidate structures with the experimental NMR data. The best fit between the eight candidate structures and the experimental NMR chemical shifts was defined by the DP4 statistical analyses and the Mean Absolute Error (MAE) calculations.
Budiyono, Agung; Rohrlich, Daniel
2017-11-03
Where does quantum mechanics part ways with classical mechanics? How does quantum randomness differ fundamentally from classical randomness? We cannot fully explain how the theories differ until we can derive them within a single axiomatic framework, allowing an unambiguous account of how one theory is the limit of the other. Here we derive non-relativistic quantum mechanics and classical statistical mechanics within a common framework. The common axioms include conservation of average energy and conservation of probability current. But two axioms distinguish quantum mechanics from classical statistical mechanics: an "ontic extension" defines a nonseparable (global) random variable that generates physical correlations, and an "epistemic restriction" constrains allowed phase space distributions. The ontic extension and epistemic restriction, with strength on the order of Planck's constant, imply quantum entanglement and uncertainty relations. This framework suggests that the wave function is epistemic, yet it does not provide an ontic dynamics for individual systems.
Holocaust exposure and subsequent suicide risk: a population-based study.
Bursztein Lipsicas, Cendrine; Levav, Itzhak; Levine, Stephen Z
2017-03-01
To examine the association between the extent of genocide exposure and subsequent suicide risk among Holocaust survivors. Persons born in Holocaust-exposed European countries during the years 1922-1945 who immigrated to Israel by 1965 were identified in the Population Registry (N = 209,429) and followed up for suicide (1950-2014). They were divided into three groups based on likely exposure to Nazi persecution: those who immigrated before (indirect; n = 20,229; 10%), during (partial direct; n = 17,189; 8%), and after (full direct; n = 172,061; 82%) World War II. Groups were contrasted for suicide risk, accounting for the extent of genocide in their respective countries of origin, high (>70%) or lower (<50%) levels. Cox model survival analyses were computed, examining calendar year at suicide. Sensitivity analyses were recomputed for two additional suicide-associated variables (age and years since immigration) for each exposure group. All analyses were adjusted for confounders. Survival analysis showed that, compared to the indirect exposure group, the partial direct exposure group from countries with a high genocide level had a statistically significant (P < .05) increased suicide risk for the main outcome (calendar year: HR 1.78, 95% CI 1.09, 2.90). This effect significantly (P < .05) replicated in two sensitivity analyses for countries with higher relative levels of genocide (age: HR 1.77, 95% CI 1.09, 2.89; years since immigration: HR 1.85, 95% CI 1.14, 3.02). The full direct exposure group was not at significant suicide risk compared to the indirect exposure group. Suicide associations for groups from countries with relatively lower levels of genocide were not statistically significant. This study partly converges with findings identifying Holocaust survivors (full direct exposure) as a resilient group.
A tentative mechanism for higher vulnerability to suicide risk of the partial direct exposure group from countries with higher genocide exposure includes protracted guilt feelings, having directly witnessed atrocities and escaped death.
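The hazard ratios above come from Cox models fitted to individual-level registry data, which cannot be reproduced here. As a crude, hand-computable analogue, an incidence rate ratio with an approximate Poisson confidence interval can be sketched from aggregate counts (the numbers below are invented, not the study's):

```python
import math

def incidence_rate_ratio(events_a, pyears_a, events_b, pyears_b):
    """Crude incidence rate ratio (group A vs group B) with an
    approximate 95% CI on the log scale (event counts assumed Poisson)."""
    irr = (events_a / pyears_a) / (events_b / pyears_b)
    se_log = math.sqrt(1 / events_a + 1 / events_b)
    lo = math.exp(math.log(irr) - 1.96 * se_log)
    hi = math.exp(math.log(irr) + 1.96 * se_log)
    return irr, (lo, hi)

# Hypothetical counts, not the study's data:
irr, ci = incidence_rate_ratio(events_a=40, pyears_a=500_000,
                               events_b=30, pyears_b=650_000)
```

Unlike this crude ratio, a Cox model additionally adjusts for confounders and handles censoring, which is why the study uses it; the IRR only illustrates the underlying rate comparison.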
This tool allows users to animate cancer trends over time by cancer site, cause of death, race, and sex. It provides access to incidence, mortality, and survival statistics. Select the type of statistic, variables, and format, and then extract the statistics in a delimited format for further analyses.
Spatially explicit spectral analysis of point clouds and geospatial data
Buscombe, Daniel D.
2015-01-01
The increasing use of spatially explicit analyses of high-resolution spatially distributed data (imagery and point clouds) for the purposes of characterising spatial heterogeneity in geophysical phenomena necessitates the development of custom analytical and computational tools. In recent years, such analyses have become the basis of, for example, automated texture characterisation and segmentation, roughness and grain size calculation, and feature detection and classification, from a variety of data types. In such work, much use has been made of statistical descriptors of localised spatial variations in amplitude variance (roughness); however, the horizontal scale (wavelength) and spacing of roughness elements is rarely considered. This is despite the fact that the ratio of characteristic vertical to horizontal scales is not constant and can yield important information about physical scaling relationships. Spectral analysis is a hitherto under-utilised but powerful means to acquire statistical information about relevant amplitude and wavelength scales, simultaneously and with computational efficiency. Further, quantifying spatially distributed data in the frequency domain lends itself to the development of stochastic models for probing the underlying mechanisms which govern the spatial distribution of geological and geophysical phenomena. The software package PySESA (Python program for Spatially Explicit Spectral Analysis) has been developed for generic analyses of spatially distributed data in both the spatial and frequency domains. Developed predominantly in Python, it accesses libraries written in Cython and C++ for efficiency. It is open source and modular, therefore readily incorporated into, and combined with, other data analysis tools and frameworks, with particular utility for supporting research in the fields of geomorphology, geophysics, hydrography, photogrammetry and remote sensing.
The analytical and computational structure of the toolbox is described, and its functionality illustrated with an example of a high-resolution bathymetric point cloud data collected with multibeam echosounder.
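A minimal sketch of the kind of frequency-domain analysis the abstract describes (not the actual PySESA implementation): recovering the dominant roughness wavelength of a synthetic surface from its 2-D grid via a periodogram.

```python
import numpy as np

# Synthetic "roughness" surface: ridges of known wavelength aligned
# with the y-axis (grid spacing in metres is an assumed value).
n, dx = 128, 0.1                 # grid size and spacing
x = np.arange(n) * dx
X, Y = np.meshgrid(x, x)
wavelength = 1.6                 # metres, the scale we hope to recover
z = np.sin(2 * np.pi * X / wavelength)

# Row-averaged periodogram along the x-direction:
power = np.abs(np.fft.rfft(z, axis=1)).mean(axis=0) ** 2
freqs = np.fft.rfftfreq(n, d=dx)
peak = freqs[np.argmax(power[1:]) + 1]   # skip the zero-frequency bin
recovered = 1.0 / peak                   # dominant wavelength, metres
```

The spectral peak simultaneously carries amplitude (its height) and horizontal scale (its frequency), which is the pairing of information the abstract argues is usually lost when only amplitude variance is reported.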
Adaptation of Chain Event Graphs for use with Case-Control Studies in Epidemiology.
Keeble, Claire; Thwaites, Peter Adam; Barber, Stuart; Law, Graham Richard; Baxter, Paul David
2017-09-26
Case-control studies are used in epidemiology to try to uncover the causes of diseases, but are a retrospective study design known to suffer from non-participation and recall bias, which may explain their decreased popularity in recent years. Traditional analyses usually report only the odds ratio for given exposures and the binary disease status. Chain event graphs are a graphical representation of a statistical model derived from event trees, which have been developed in artificial intelligence and statistics and only recently introduced to the epidemiology literature. They are a modern Bayesian technique enabling prior knowledge to be incorporated into the data analysis; the agglomerative hierarchical clustering algorithm is used to form a suitable chain event graph. Additionally, they can account for missing data and be used to explore missingness mechanisms. Here we adapt the chain event graph framework to suit scenarios often encountered in case-control studies, to strengthen this study design, which is efficient in both time and cost. We demonstrate eight adaptations to the graphs: two suitable for full case-control study analysis, four which can be used in interim analyses to explore biases, and two which aim to improve the ease and accuracy of analyses. The adaptations are illustrated with complete, reproducible, fully interpreted examples, including the event tree and chain event graph. Chain event graphs are used here for the first time to summarise non-participation, data collection techniques, data reliability, and disease severity in case-control studies. We demonstrate how these features of a case-control study can be incorporated into the analysis to provide further insight, which can help to identify potential biases and lead to more accurate study results.
Statistical Thermodynamics and Microscale Thermophysics
NASA Astrophysics Data System (ADS)
Carey, Van P.
1999-08-01
Many exciting new developments in microscale engineering are based on the application of traditional principles of statistical thermodynamics. In this text Van Carey offers a modern view of thermodynamics, interweaving classical and statistical thermodynamic principles and applying them to current engineering systems. He begins with coverage of microscale energy storage mechanisms from a quantum mechanics perspective and then develops the fundamental elements of classical and statistical thermodynamics. Subsequent chapters discuss applications of equilibrium statistical thermodynamics to solid, liquid, and gas phase systems. The remainder of the book is devoted to nonequilibrium thermodynamics of transport phenomena and to nonequilibrium effects and noncontinuum behavior at the microscale. Although the text emphasizes mathematical development, Carey includes many examples and exercises to illustrate how the theoretical concepts are applied to systems of scientific and engineering interest. In the process he offers a fresh view of statistical thermodynamics for advanced undergraduate and graduate students, as well as practitioners, in mechanical, chemical, and materials engineering.
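As a hedged example of the microscale energy-storage ideas the text covers, the equilibrium occupancy of a two-level system follows directly from the Boltzmann distribution and its partition function:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def two_level_occupancy(delta_e, temperature):
    """Equilibrium probability of the excited state in a two-level
    system, from the Boltzmann distribution: p = e^(-dE/kT) / Z,
    with partition function Z = 1 + e^(-dE/kT)."""
    boltzmann_factor = math.exp(-delta_e / (K_B * temperature))
    z = 1.0 + boltzmann_factor          # partition function
    return boltzmann_factor / z

# When the level spacing equals the thermal energy k_B*T, the
# excited-state occupancy is e^-1 / (1 + e^-1), about 0.27.
p = two_level_occupancy(delta_e=K_B * 300.0, temperature=300.0)
```

The same machinery, summed over quantum states, underlies the equilibrium statistical thermodynamics applications the book develops for solid, liquid, and gas phase systems.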
Acoustic Features Influence Musical Choices Across Multiple Genres.
Barone, Michael D; Bansal, Jotthi; Woolhouse, Matthew H
2017-01-01
Based on a large behavioral dataset of music downloads, two analyses investigate whether the acoustic features of listeners' preferred musical genres influence their choice of tracks within non-preferred, secondary musical styles. Analysis 1 identifies feature distributions for pairs of genre-defined subgroups that are distinct. Using correlation analysis, these distributions are used to test the degree of similarity between subgroups' main genres and the other music within their download collections. Analysis 2 explores the issue of main-to-secondary genre influence through the production of 10 feature-influence matrices, one per acoustic feature, in which cell values indicate the percentage change in features for genres and subgroups compared to overall population averages. In total, 10 acoustic features and 10 genre-defined subgroups are explored within the two analyses. Results strongly indicate that the acoustic features of people's main genres influence the tracks they download within non-preferred, secondary musical styles. The nature of this influence and its possible actuating mechanisms are discussed with respect to research on musical preference, personality, and statistical learning.
NASA Technical Reports Server (NTRS)
Erickson, J. D.; Macdonald, R. B. (Principal Investigator)
1982-01-01
A "quick look" investigation of the initial LANDSAT-4, thematic mapper (TM) scene received from Goddard Space Flight Center was performed to gain early insight into the characteristics of TM data. The initial scene, containing only the first four bands of the seven bands recorded by the TM, was acquired over the Detroit, Michigan, area on July 20, 1982. It yielded abundant information for scientific investigation. A wide variety of studies were conducted to assess all aspects of TM data. They ranged from manual analyses of image products to detect obvious optical, electronic, or mechanical defects to detailed machine analyses of the digital data content for evaluation of spectral separability of vegetative/nonvegetative classes. These studies were applied to several segments extracted from the full scene. No attempt was made to perform end-to-end statistical evaluations. However, the outputs of these studies identify a degree of positive performance from the TM and its potential for advancing state-of-the-art crop inventory and condition assessment technology.
SOCR Analyses – an Instructional Java Web-based Statistical Analysis Toolkit
Chu, Annie; Cui, Jenny; Dinov, Ivo D.
2011-01-01
The Statistical Online Computational Resource (SOCR) designs web-based tools for educational use in a variety of undergraduate courses (Dinov 2006). Several studies have demonstrated that these resources significantly improve students' motivation and learning experiences (Dinov et al. 2008). SOCR Analyses is a new component that concentrates on data modeling and analysis using parametric and non-parametric techniques supported with graphical model diagnostics. Currently implemented analyses include models commonly used in undergraduate statistics courses, such as linear models (Simple Linear Regression, Multiple Linear Regression, One-Way and Two-Way ANOVA). In addition, we implemented tests for sample comparisons, such as the t-test in the parametric category, and the Wilcoxon rank sum test, Kruskal-Wallis test, and Friedman's test in the non-parametric category. SOCR Analyses also includes several hypothesis test models, such as contingency tables, Friedman's test and Fisher's exact test. The code itself is open source (http://socr.googlecode.com/), in the hope of contributing to the efforts of the statistical computing community. The code includes functionality for each specific analysis model and general utilities that can be applied in various statistical computing tasks. For example, concrete methods with an API (Application Programming Interface) have been implemented for statistical summaries, least square solutions of general linear models, rank calculations, etc. HTML interfaces, tutorials, source code, activities, and data are freely available via the web (www.SOCR.ucla.edu). Code examples for developers and demos for educators are provided on the SOCR Wiki website. In this article, the pedagogical utilization of SOCR Analyses is discussed, as well as the underlying design framework. As the SOCR project is ongoing and more functions and tools are being added to it, these resources are constantly improved.
The reader is strongly encouraged to check the SOCR site for most updated information and newly added models. PMID:21546994
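One-Way ANOVA, listed above among the implemented SOCR models, reduces to a ratio of mean squares. A self-contained sketch of that computation (plain Python, not the SOCR Java code):

```python
def one_way_anova_f(*groups):
    """F statistic for a one-way ANOVA: between-group mean square
    divided by within-group mean square."""
    k = len(groups)                               # number of groups
    n = sum(len(g) for g in groups)               # total sample size
    grand = sum(sum(g) for g in groups) / n       # grand mean
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Toy data (invented): three groups with clearly different means.
f = one_way_anova_f([1.0, 2.0, 1.5], [3.0, 3.5, 2.5], [5.0, 4.5, 5.5])
```

The F statistic is then compared against the F distribution with (k − 1, n − k) degrees of freedom, which is the step a tool like SOCR Analyses performs for the user.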
Dissecting the genetics of complex traits using summary association statistics.
Pasaniuc, Bogdan; Price, Alkes L
2017-02-01
During the past decade, genome-wide association studies (GWAS) have been used to successfully identify tens of thousands of genetic variants associated with complex traits and diseases. These studies have produced extensive repositories of genetic variation and trait measurements across large numbers of individuals, providing tremendous opportunities for further analyses. However, privacy concerns and other logistical considerations often limit access to individual-level genetic data, motivating the development of methods that analyse summary association statistics. Here, we review recent progress on statistical methods that leverage summary association data to gain insights into the genetic basis of complex traits and diseases.
Statistical innovations in diagnostic device evaluation.
Yu, Tinghui; Li, Qin; Gray, Gerry; Yue, Lilly Q
2016-01-01
Due to rapid technological development, innovations in diagnostic devices are proceeding at an extremely fast pace. Accordingly, the needs for adopting innovative statistical methods have emerged in the evaluation of diagnostic devices. Statisticians in the Center for Devices and Radiological Health at the Food and Drug Administration have provided leadership in implementing statistical innovations. The innovations discussed in this article include: the adoption of bootstrap and Jackknife methods, the implementation of appropriate multiple reader multiple case study design, the application of robustness analyses for missing data, and the development of study designs and data analyses for companion diagnostics.
Hydrologic controls on aperiodic spatial organization of the ridge-slough patterned landscape
NASA Astrophysics Data System (ADS)
Casey, Stephen T.; Cohen, Matthew J.; Acharya, Subodh; Kaplan, David A.; Jawitz, James W.
2016-11-01
A century of hydrologic modification has altered the physical and biological drivers of landscape processes in the Everglades (Florida, USA). Restoring the ridge-slough patterned landscape, a dominant feature of the historical system, is a priority but requires an understanding of pattern genesis and degradation mechanisms. Physical experiments to evaluate alternative pattern formation mechanisms are limited by the long timescales of peat accumulation and loss, necessitating model-based comparisons, where support for a particular mechanism is based on model replication of extant patterning and trajectories of degradation. However, multiple mechanisms yield a central feature of ridge-slough patterning (patch elongation in the direction of historical flow), limiting the utility of that characteristic for discriminating among alternatives. Using data from vegetation maps, we investigated the statistical features of ridge-slough spatial patterning (ridge density, patch perimeter, elongation, patch size distributions, and spatial periodicity) to establish more rigorous criteria for evaluating model performance and to inform controls on pattern variation across the contemporary system. Mean water depth explained significant variation in ridge density, total perimeter, and length : width ratios, illustrating an important pattern response to existing hydrologic gradients. Two independent analyses (2-D periodograms and patch size distributions) provide strong evidence against regular patterning, with the landscape exhibiting neither a characteristic wavelength nor a characteristic patch size, both of which are expected under conditions that produce regular patterns. Rather, landscape properties suggest robust scale-free patterning, indicating genesis from the coupled effects of local facilitation and a global negative feedback operating uniformly at the landscape scale. 
Critically, this challenges widespread invocation of scale-dependent negative feedbacks for explaining ridge-slough pattern origins. These results help discern among genesis mechanisms and provide an improved statistical description of the landscape that can be used to compare among model outputs, as well as to assess the success of future restoration projects.
Huvane, Jacqueline; Komarow, Lauren; Hill, Carol; Tran, Thuy Tien T.; Pereira, Carol; Rosenkranz, Susan L.; Finnemeyer, Matt; Earley, Michelle; Jiang, Hongyu (Jeanne); Wang, Rui; Lok, Judith
2017-01-01
The Statistical and Data Management Center (SDMC) provides the Antibacterial Resistance Leadership Group (ARLG) with statistical and data management expertise to advance the ARLG research agenda. The SDMC is active at all stages of a study, including design; data collection and monitoring; data analyses and archival; and publication of study results. The SDMC enhances the scientific integrity of ARLG studies through the development and implementation of innovative and practical statistical methodologies and by educating research colleagues regarding the application of clinical trial fundamentals. This article summarizes the challenges and roles, as well as the innovative contributions in the design, monitoring, and analyses of clinical trials and diagnostic studies, of the ARLG SDMC. PMID:28350899
Does educational status impact adult mortality in Denmark? A twin approach.
Madsen, Mia; Andersen, Anne-Marie Nybo; Christensen, Kaare; Andersen, Per Kragh; Osler, Merete
2010-07-15
To disentangle an independent effect of educational status on mortality risk from direct and indirect selection mechanisms, the authors used a discordant twin pair design, which allowed them to isolate the effect of education by means of adjustment for genetic and environmental confounding per design. The study is based on data from the Danish Twin Registry and Statistics Denmark. Using Cox regression, they estimated hazard ratios for mortality according to the highest attained education among 5,260 monozygotic and 11,088 dizygotic same-sex twin pairs born during 1921-1950 and followed during 1980-2008. Both standard cohort and intrapair analyses were conducted separately for zygosity, gender, and birth cohort. Educational differences in mortality were demonstrated in the standard cohort analyses but attenuated in the intrapair analyses in all subgroups but men born during 1921-1935, and no effect modification by zygosity was observed. Hence, the results are most compatible with an effect of early family environment in explaining the educational inequality in mortality. However, large educational differences were still reflected in mortality risk differences within twin pairs, thus supporting some degree of independent effect of education. In addition, the effect of education may be more pronounced in older cohorts of Danish men.
Cross-Sectional Analysis of Longitudinal Mediation Processes.
O'Laughlin, Kristine D; Martin, Monica J; Ferrer, Emilio
2018-01-01
Statistical mediation analysis can help to identify and explain the mechanisms behind psychological processes. Examining a set of variables for mediation effects is a ubiquitous process in the social sciences literature; however, despite evidence suggesting that cross-sectional data can misrepresent the mediation of longitudinal processes, cross-sectional analyses continue to be used in this manner. Alternative longitudinal mediation models, including those rooted in a structural equation modeling framework (cross-lagged panel, latent growth curve, and latent difference score models) are currently available and may provide a better representation of mediation processes for longitudinal data. The purpose of this paper is twofold: first, we provide a comparison of cross-sectional and longitudinal mediation models; second, we advocate using models to evaluate mediation effects that capture the temporal sequence of the process under study. Two separate empirical examples are presented to illustrate differences in the conclusions drawn from cross-sectional and longitudinal mediation analyses. Findings from these examples yielded substantial differences in interpretations between the cross-sectional and longitudinal mediation models considered here. Based on these observations, researchers should use caution when attempting to use cross-sectional data in place of longitudinal data for mediation analyses.
Intracolonial genetic variation in the scleractinian coral Seriatopora hystrix
NASA Astrophysics Data System (ADS)
Maier, E.; Buckenmaier, A.; Tollrian, R.; Nürnberger, B.
2012-06-01
In recent years, increasing numbers of studies revealed intraorganismal genetic variation, primarily in modular organisms like plants or colonial marine invertebrates. Two underlying mechanisms are distinguished: Mosaicism is caused by somatic mutation, whereas chimerism originates from allogeneic fusion. We investigated the occurrence of intracolonial genetic variation at microsatellite loci in five natural populations of the scleractinian coral Seriatopora hystrix on the Great Barrier Reef. This coral is a widely distributed, brooding species that is at present a target of intensive population genetic research on reproduction and dispersal patterns. From each of 155 S. hystrix colonies, either two or three samples were genotyped at five or six loci. Twenty-seven (~17%) genetically heterogeneous colonies were found. Statistical analyses indicated the occurrence of both mosaicism and chimerism. In most cases, intracolonial variation was found only at a single allele. Our analyses suggest that somatic mutations present a major source of genetic heterogeneity within a single colony. Moreover, we observed large, apparently stable chimeric colonies that harbored clearly distinct genotypes and contrast these findings with the patterns typically observed in laboratory-based experiments. We discuss the error that mosaicism and chimerism introduce into population genetic analyses.
Pompei-Reynolds, Renée C; Kanavakis, Georgios
2014-08-01
The manufacturing process for copper-nickel-titanium archwires is technique sensitive. The primary aim of this investigation was to examine the interlot consistency of the mechanical properties of copper-nickel-titanium wires from 2 manufacturers. Wires of 2 sizes (0.016 and 0.016 × 0.022 in) and 3 advertised austenite finish temperatures (27°C, 35°C, and 40°C) from 2 manufacturers were tested for transition temperature ranges and force delivery using differential scanning calorimetry and the 3-point bend test, respectively. Variations of these properties were analyzed for statistical significance by calculating the F statistic for equality of variances for transition temperature and force delivery in each group of wires. All statistical analyses were performed at the 0.05 level of significance. Statistically significant interlot variations in austenite finish were found for the 0.016 in/27°C (P = 0.041) and 0.016 × 0.022 in/35°C (P = 0.048) wire categories, and in austenite start for the 0.016 × 0.022 in/35°C wire category (P = 0.01). In addition, significant variations in force delivery were found between the 2 manufacturers for the 0.016 in/27°C (P = 0.002), 0.016 in/35.0°C (P = 0.049), and 0.016 × 0.022 in/35°C (P = 0.031) wires. Orthodontic wires of the same material, dimension, and manufacturer but from different production lots do not always have similar mechanical properties. Clinicians should be aware that copper-nickel-titanium wires might not always deliver the expected force, even when they come from the same manufacturer, because of interlot variations in the performance of the material. Copyright © 2014 American Association of Orthodontists. Published by Mosby, Inc. All rights reserved.
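The interlot comparison in this abstract rests on an F statistic for equality of variances. As a minimal illustrative sketch (not the authors' code; the transition-temperature readings below are hypothetical), the statistic is simply the ratio of the two sample variances, conventionally taken larger over smaller:

```python
def f_statistic(sample_a, sample_b):
    """F statistic for equality of variances: the ratio of the two
    sample variances, larger over smaller."""
    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    va, vb = var(sample_a), var(sample_b)
    return max(va, vb) / min(va, vb)

# Hypothetical austenite-finish readings (degrees C) from two production lots
lot1 = [26.8, 27.1, 27.3, 26.9, 27.0]
lot2 = [26.5, 27.9, 27.4, 26.2, 28.0]
F = f_statistic(lot1, lot2)
```

The observed F would then be referred to the F distribution with (n1 - 1, n2 - 1) degrees of freedom at the 0.05 level used in the study.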
ISSUES IN THE STATISTICAL ANALYSIS OF SMALL-AREA HEALTH DATA. (R825173)
The availability of geographically indexed health and population data, with advances in computing, geographical information systems and statistical methodology, have opened the way for serious exploration of small area health statistics based on routine data. Such analyses may be...
Nonindependence and sensitivity analyses in ecological and evolutionary meta-analyses.
Noble, Daniel W A; Lagisz, Malgorzata; O'dea, Rose E; Nakagawa, Shinichi
2017-05-01
Meta-analysis is an important tool for synthesizing research on a variety of topics in ecology and evolution, including molecular ecology, but can be susceptible to nonindependence. Nonindependence can affect two major interrelated components of a meta-analysis: (i) the calculation of effect size statistics and (ii) the estimation of overall meta-analytic estimates and their uncertainty. While some solutions to nonindependence exist at the statistical analysis stages, there is little advice on what to do when complex analyses are not possible, or when studies with nonindependent experimental designs exist in the data. Here we argue that exploring the effects of procedural decisions in a meta-analysis (e.g. inclusion of different quality data, choice of effect size) and statistical assumptions (e.g. assuming no phylogenetic covariance) using sensitivity analyses are extremely important in assessing the impact of nonindependence. Sensitivity analyses can provide greater confidence in results and highlight important limitations of empirical work (e.g. impact of study design on overall effects). Despite their importance, sensitivity analyses are seldom applied to problems of nonindependence. To encourage better practice for dealing with nonindependence in meta-analytic studies, we present accessible examples demonstrating the impact that ignoring nonindependence can have on meta-analytic estimates. We also provide pragmatic solutions for dealing with nonindependent study designs, and for analysing dependent effect sizes. Additionally, we offer reporting guidelines that will facilitate disclosure of the sources of nonindependence in meta-analyses, leading to greater transparency and more robust conclusions. © 2017 John Wiley & Sons Ltd.
ERIC Educational Resources Information Center
Mirman, Daniel; Estes, Katharine Graf; Magnuson, James S.
2010-01-01
Statistical learning mechanisms play an important role in theories of language acquisition and processing. Recurrent neural network models have provided important insights into how these mechanisms might operate. We examined whether such networks capture two key findings in human statistical learning. In Simulation 1, a simple recurrent network…
Generalized statistical mechanics approaches to earthquakes and tectonics.
Vallianatos, Filippos; Papadakis, Giorgos; Michas, Georgios
2016-12-01
Despite the extreme complexity that characterizes the mechanism of the earthquake generation process, simple empirical scaling relations apply to the collective properties of earthquakes and faults in a variety of tectonic environments and scales. The physical characterization of those properties and the scaling relations that describe them attract a wide scientific interest and are incorporated in the probabilistic forecasting of seismicity in local, regional and planetary scales. Considerable progress has been made in the analysis of the statistical mechanics of earthquakes, which, based on the principle of entropy, can provide a physical rationale to the macroscopic properties frequently observed. The scale-invariant properties, the (multi) fractal structures and the long-range interactions that have been found to characterize fault and earthquake populations have recently led to the consideration of non-extensive statistical mechanics (NESM) as a consistent statistical mechanics framework for the description of seismicity. The consistency between NESM and observations has been demonstrated in a series of publications on seismicity, faulting, rock physics and other fields of geosciences. The aim of this review is to present in a concise manner the fundamental macroscopic properties of earthquakes and faulting and how these can be derived by using the notions of statistical mechanics and NESM, providing further insights into earthquake physics and fault growth processes.
Generalized statistical mechanics approaches to earthquakes and tectonics
Papadakis, Giorgos; Michas, Georgios
2016-01-01
Despite the extreme complexity that characterizes the mechanism of the earthquake generation process, simple empirical scaling relations apply to the collective properties of earthquakes and faults in a variety of tectonic environments and scales. The physical characterization of those properties and the scaling relations that describe them attract a wide scientific interest and are incorporated in the probabilistic forecasting of seismicity in local, regional and planetary scales. Considerable progress has been made in the analysis of the statistical mechanics of earthquakes, which, based on the principle of entropy, can provide a physical rationale to the macroscopic properties frequently observed. The scale-invariant properties, the (multi) fractal structures and the long-range interactions that have been found to characterize fault and earthquake populations have recently led to the consideration of non-extensive statistical mechanics (NESM) as a consistent statistical mechanics framework for the description of seismicity. The consistency between NESM and observations has been demonstrated in a series of publications on seismicity, faulting, rock physics and other fields of geosciences. The aim of this review is to present in a concise manner the fundamental macroscopic properties of earthquakes and faulting and how these can be derived by using the notions of statistical mechanics and NESM, providing further insights into earthquake physics and fault growth processes. PMID:28119548
Statistical Analysis on the Mechanical Properties of Magnesium Alloys
Liu, Ruoyu; Jiang, Xianquan; Zhang, Hongju; Zhang, Dingfei; Wang, Jingfeng; Pan, Fusheng
2017-01-01
Knowledge of statistical characteristics of mechanical properties is very important for the practical application of structural materials. Unfortunately, the scatter characteristics of magnesium alloys for mechanical performance remain poorly understood until now. In this study, the mechanical reliability of magnesium alloys is systematically estimated using Weibull statistical analysis. Interestingly, the Weibull modulus, m, of strength for magnesium alloys is as high as that for aluminum and steels, confirming the very high reliability of magnesium alloys. The high predictability in the tensile strength of magnesium alloys represents the capability of preventing catastrophic premature failure during service, which is essential for safety and reliability assessment. PMID:29113116
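One common way to estimate a Weibull modulus m like the one discussed above is linear regression on the linearized Weibull CDF with median-rank plotting positions. The sketch below uses hypothetical tensile strengths, not the study's measurements:

```python
import math

def weibull_modulus(strengths):
    """Estimate the Weibull modulus m by least-squares regression of
    ln(ln(1/(1-F_i))) on ln(strength_i), with median-rank estimates
    F_i = (i - 0.3)/(n + 0.4) for the failure probabilities."""
    xs = sorted(strengths)
    n = len(xs)
    X, Y = [], []
    for i, s in enumerate(xs, start=1):
        F = (i - 0.3) / (n + 0.4)  # median-rank plotting position
        X.append(math.log(s))
        Y.append(math.log(-math.log(1.0 - F)))
    mx, my = sum(X) / n, sum(Y) / n
    # Regression slope = Weibull modulus m
    return sum((x - mx) * (y - my) for x, y in zip(X, Y)) / \
           sum((x - mx) ** 2 for x in X)

# Hypothetical tensile strengths (MPa); a narrow scatter yields a high m
data = [228, 232, 235, 238, 240, 243, 245, 248, 251, 255]
m = weibull_modulus(data)
```

A higher m means less scatter in strength, which is the sense in which the abstract equates a high modulus with high mechanical reliability.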
DOE Office of Scientific and Technical Information (OSTI.GOV)
Metoyer, Candace N.; Walsh, Stephen J.; Tardiff, Mark F.
2008-10-30
The detection and identification of weak gaseous plumes using thermal imaging data is complicated by many factors. These include variability due to atmosphere, ground and plume temperature, and background clutter. This paper presents an analysis of one formulation of the physics-based model that describes the at-sensor observed radiance. The motivating question for the analyses performed in this paper is as follows. Given a set of backgrounds, is there a way to predict the background over which the probability of detecting a given chemical will be the highest? Two statistics were developed to address this question. These statistics incorporate data from the long-wave infrared band to predict the background over which chemical detectability will be the highest. These statistics can be computed prior to data collection. As a preliminary exploration into the predictive ability of these statistics, analyses were performed on synthetic hyperspectral images. Each image contained one chemical (either carbon tetrachloride or ammonia) spread across six distinct background types. The statistics were used to generate predictions for the background ranks. Then, the predicted ranks were compared to the empirical ranks obtained from the analyses of the synthetic images. For the simplified images under consideration, the predicted and empirical ranks showed a promising amount of agreement. One statistic accurately predicted the best and worst background for detection in all of the images. Future work may include explorations of more complicated plume ingredients, background types, and noise structures.
Tuuli, Methodius G; Odibo, Anthony O
2011-08-01
The objective of this article is to discuss the rationale for common statistical tests used for the analysis and interpretation of prenatal diagnostic imaging studies. Examples from the literature are used to illustrate descriptive and inferential statistics. The uses and limitations of linear and logistic regression analyses are discussed in detail.
Using a Five-Step Procedure for Inferential Statistical Analyses
ERIC Educational Resources Information Center
Kamin, Lawrence F.
2010-01-01
Many statistics texts pose inferential statistical problems in a disjointed way. By using a simple five-step procedure as a template for statistical inference problems, the student can solve problems in an organized fashion. The problem and its solution will thus be a stand-by-itself organic whole and a single unit of thought and effort. The…
Harris, Michael; Radtke, Arthur S.
1976-01-01
Linear regression and discriminant analyses techniques were applied to gold, mercury, arsenic, antimony, barium, copper, molybdenum, lead, zinc, boron, tellurium, selenium, and tungsten analyses from drill holes into unoxidized gold ore at the Carlin gold mine near Carlin, Nev. The statistical treatments employed were used to judge proposed hypotheses on the origin and geochemical paragenesis of this disseminated gold deposit.
ERIC Educational Resources Information Center
Neumann, David L.; Hood, Michelle
2009-01-01
A wiki was used as part of a blended learning approach to promote collaborative learning among students in a first year university statistics class. One group of students analysed a data set and communicated the results by jointly writing a practice report using a wiki. A second group analysed the same data but communicated the results in a…
Extreme between-study homogeneity in meta-analyses could offer useful insights.
Ioannidis, John P A; Trikalinos, Thomas A; Zintzaras, Elias
2006-10-01
Meta-analyses are routinely evaluated for the presence of large between-study heterogeneity. We examined whether it is also important to probe whether there is extreme between-study homogeneity. We used heterogeneity tests with left-sided statistical significance for inference and developed a Monte Carlo simulation test for testing extreme homogeneity in risk ratios across studies, using the empiric distribution of the summary risk ratio and heterogeneity statistic. A left-sided P=0.01 threshold was set for claiming extreme homogeneity to minimize type I error. Among 11,803 meta-analyses with binary contrasts from the Cochrane Library, 143 (1.21%) had left-sided P-value <0.01 for the asymptotic Q statistic and 1,004 (8.50%) had left-sided P-value <0.10. The frequency of extreme between-study homogeneity did not depend on the number of studies in the meta-analyses. We identified examples where extreme between-study homogeneity (left-sided P-value <0.01) could result from various possibilities beyond chance. These included inappropriate statistical inference (asymptotic vs. Monte Carlo), use of a specific effect metric, correlated data or stratification using strong predictors of outcome, and biases and potential fraud. Extreme between-study homogeneity may provide useful insights about a meta-analysis and its constituent studies.
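The left-sided Monte Carlo test described above can be sketched as follows. This is a simplified illustration, not the authors' implementation: Cochran's Q is computed for hypothetical log risk ratios, and the null distribution is simulated by redrawing study effects around the pooled estimate:

```python
import math, random

def cochran_q(effects, variances):
    """Cochran's Q heterogeneity statistic for a fixed-effect meta-analysis."""
    w = [1.0 / v for v in variances]
    pooled = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    return sum(wi * (e - pooled) ** 2 for wi, e in zip(w, effects))

def left_sided_p(effects, variances, n_sim=5000, seed=1):
    """Monte Carlo left-sided p-value: the fraction of simulated Q values
    at or below the observed Q, with study effects drawn around the pooled
    estimate. A very small p flags extreme between-study homogeneity."""
    rng = random.Random(seed)
    w = [1.0 / v for v in variances]
    pooled = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q_obs = cochran_q(effects, variances)
    hits = 0
    for _ in range(n_sim):
        sim = [rng.gauss(pooled, math.sqrt(v)) for v in variances]
        if cochran_q(sim, variances) <= q_obs:
            hits += 1
    return hits / n_sim

# Hypothetical log risk ratios that agree suspiciously well across studies
effects = [0.50, 0.50, 0.51, 0.49, 0.50]
variances = [0.04, 0.05, 0.03, 0.06, 0.04]
p = left_sided_p(effects, variances)
```

A left-sided p below the article's 0.01 threshold would prompt a closer look at the constituent studies for correlated data, metric artifacts, or fraud.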
Cavalcante, Y L; Hauser-Davis, R A; Saraiva, A C F; Brandão, I L S; Oliveira, T F; Silveira, A M
2013-01-01
This paper compared and evaluated seasonal variations in physico-chemical parameters and metals at a hydroelectric power station reservoir by applying Multivariate Analyses and Artificial Neural Networks (ANN) statistical techniques. A Factor Analysis was used to reduce the number of variables: the first factor was composed of elements Ca, K, Mg and Na, and the second by Chemical Oxygen Demand. The ANN showed 100% correct classifications in training and validation samples. Physico-chemical analyses showed that water pH values were not statistically different between the dry and rainy seasons, while temperature, conductivity, alkalinity, ammonia and DO were higher in the dry period. TSS, hardness and COD, on the other hand, were higher during the rainy season. The statistical analyses showed that Ca, K, Mg and Na are directly connected to the Chemical Oxygen Demand, which indicates a possibility of their input into the reservoir system by domestic sewage and agricultural run-offs. These statistical applications, thus, are also relevant in cases of environmental management and policy decision-making processes, to identify which factors should be further studied and/or modified to recover degraded or contaminated water bodies. Copyright © 2012 Elsevier B.V. All rights reserved.
From sexless to sexy: Why it is time for human genetics to consider and report analyses of sex.
Powers, Matthew S; Smith, Phillip H; McKee, Sherry A; Ehringer, Marissa A
2017-01-01
Science has come a long way with regard to the consideration of sex differences in clinical and preclinical research, but one field remains behind the curve: human statistical genetics. The goal of this commentary is to raise awareness and discussion about how to best consider and evaluate possible sex effects in the context of large-scale human genetic studies. Over the course of this commentary, we reinforce the importance of interpreting genetic results in the context of biological sex, establish evidence that sex differences are not being considered in human statistical genetics, and discuss how best to conduct and report such analyses. Our recommendation is to run stratified analyses by sex no matter the sample size or the result and report the findings. Summary statistics from stratified analyses are helpful for meta-analyses, and patterns of sex-dependent associations may be hidden in a combined dataset. In the age of declining sequencing costs, large consortia efforts, and a number of useful control samples, it is now time for the field of human genetics to appropriately include sex in the design, analysis, and reporting of results.
NASA Astrophysics Data System (ADS)
Gregoire, Alexandre David
2011-07-01
The goal of this research was to accurately predict the ultimate compressive load of impact damaged graphite/epoxy coupons using a Kohonen self-organizing map (SOM) neural network and multivariate statistical regression analysis (MSRA). An optimized use of these data treatment tools allowed the generation of a simple, physically understandable equation that predicts the ultimate failure load of an impacted damaged coupon based uniquely on the acoustic emissions it emits at low proof loads. Acoustic emission (AE) data were collected using two 150 kHz resonant transducers which detected and recorded the AE activity given off during compression to failure of thirty-four impacted 24-ply bidirectional woven cloth laminate graphite/epoxy coupons. The AE quantification parameters duration, energy and amplitude for each AE hit were input to the Kohonen self-organizing map (SOM) neural network to accurately classify the material failure mechanisms present in the low proof load data. The number of failure mechanisms from the first 30% of the loading for twenty-four coupons were used to generate a linear prediction equation which yielded a worst case ultimate load prediction error of 16.17%, just outside of the +/-15% B-basis allowables, which was the goal for this research. Particular emphasis was placed upon the noise removal process which was largely responsible for the accuracy of the results.
Huang, Qi; Lv, Xin; He, Yushuang; Wei, Xing; Ma, Meigang; Liao, Yuhan; Qin, Chao; Wu, Yuan
2017-12-01
Patients with epilepsy (PWE) are more likely to suffer from migraine attack, and aberrant white matter (WM) organization may be the mechanism underlying this phenomenon. This study aimed to use diffusion tensor imaging (DTI) technique to quantify WM structural differences in PWE with interictal migraine. Diffusion tensor imaging data were acquired in 13 PWE with migraine and 12 PWE without migraine. Diffusion metrics were analyzed using tract-atlas-based spatial statistics analysis. Atlas-based and tract-based spatial statistical analyses were conducted for robustness analysis. Correlation was explored between altered DTI metrics and clinical parameters. The main results are as follows: (i) Axonal damage plays a key role in PWE with interictal migraine. (ii) Significant diffusing alterations included higher fractional anisotropy (FA) in the fornix, higher mean diffusivity (MD) in the middle cerebellar peduncle (CP), left superior CP, and right uncinate fasciculus, and higher axial diffusivity (AD) in the middle CP and right medial lemniscus. (iii) Diffusion tensor imaging metrics has the tendency of correlation with seizure/migraine type and duration. Results indicate that characteristic structural impairments exist in PWE with interictal migraine. Epilepsy may contribute to migraine by altering WMs in the brain stem. White matter tracts in the fornix and right uncinate fasciculus also mediate migraine after epilepsy. This finding may improve our understanding of the pathological mechanisms underlying migraine attack after epilepsy. Copyright © 2017 Elsevier Inc. All rights reserved.
Scaling and universality in the human voice.
Luque, Jordi; Luque, Bartolo; Lacasa, Lucas
2015-04-06
Speech is a distinctive complex feature of human capabilities. In order to understand the physics underlying speech production, in this work, we empirically analyse the statistics of large human speech datasets ranging several languages. We first show that during speech, the energy is unevenly released and power-law distributed, reporting a universal robust Gutenberg-Richter-like law in speech. We further show that such 'earthquakes in speech' show temporal correlations, as the interevent statistics are again power-law distributed. As this feature takes place in the intraphoneme range, we conjecture that the process responsible for this complex phenomenon is not cognitive, but it resides in the physiological (mechanical) mechanisms of speech production. Moreover, we show that these waiting time distributions are scale invariant under a renormalization group transformation, suggesting that the process of speech generation is indeed operating close to a critical point. These results are put in contrast with current paradigms in speech processing, which point towards low dimensional deterministic chaos as the origin of nonlinear traits in speech fluctuations. As these latter fluctuations are indeed the aspects that humanize synthetic speech, these findings may have an impact in future speech synthesis technologies. Results are robust and independent of the communication language or the number of speakers, pointing towards a universal pattern and yet another hint of complexity in human speech. © 2015 The Author(s) Published by the Royal Society. All rights reserved.
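The power-law claims above hinge on estimating a tail exponent from data. A minimal sketch of the standard maximum-likelihood (Hill) estimator, applied here to a synthetic sample rather than speech energies, is:

```python
import math, random

def hill_exponent(values, xmin):
    """Maximum-likelihood (Hill) estimate of the exponent alpha in a
    power-law tail p(x) ~ x**(-alpha) for x >= xmin:
    alpha_hat = 1 + n / sum(ln(x_i / xmin))."""
    tail = [v for v in values if v >= xmin]
    return 1.0 + len(tail) / sum(math.log(v / xmin) for v in tail)

# Synthetic power-law sample (alpha = 2.5) via inverse-transform sampling:
# x = xmin * (1 - u) ** (-1 / (alpha - 1)) for uniform u
rng = random.Random(42)
alpha_true, xmin = 2.5, 1.0
data = [xmin * (1.0 - rng.random()) ** (-1.0 / (alpha_true - 1.0))
        for _ in range(2000)]
alpha_hat = hill_exponent(data, xmin)
```

In practice xmin itself must be chosen (for example by minimizing a goodness-of-fit distance), which is often the delicate step in power-law analyses of empirical data.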
NASA Astrophysics Data System (ADS)
Bobaru, F.
2007-07-01
The peridynamic method is used here to analyse the effect of van der Waals forces on the mechanical behaviour and strength and toughness properties of three-dimensional nanofibre networks under imposed stretch deformation. The peridynamic formulation allows for a natural inclusion of long-range forces (such as van der Waals forces) by considering all interactions as 'long-range'. We use van der Waals interactions only between different fibres and do not need to model individual atoms. Fracture is introduced at the microstructural (peridynamic bond) level for the microelastic type bonds, while van der Waals bonds can reform at any time. We conduct statistical studies to determine a certain volume element for which the network of randomly oriented fibres becomes quasi-isotropic and insensitive to statistical variations. This qualitative study shows that the presence of van der Waals interactions and of heterogeneities (sacrificial bonds) in the strength of the bonds at the crosslinks between fibres can help in increasing the strength and toughness of the nanofibre network. Two main mechanisms appear to control the deformation of nanofibre networks: fibre reorientation (caused by deformation and breakage) and fibre accretion (due to van der Waals interaction). Similarities to the observed toughness of polymer adhesive in the abalone shell composition are explained. The author would like to dedicate this work to the 60th anniversary of Professor Subrata Mukherjee.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lin, Shu-Kun
1996-12-31
Gibbs paradox statement of entropy of mixing has been regarded as the theoretical foundation of statistical mechanics, quantum theory and biophysics. However, all the relevant chemical experimental observations and logical analyses indicate that the Gibbs paradox statement is false. I prove that this statement is wrong: the Gibbs paradox statement implies that entropy decreases with the increase in symmetry (as represented by a symmetry number σ; see any statistical mechanics textbook). From group theory, any system has at least a symmetry number σ = 1, which is the identity operation for a strictly asymmetric system. It follows that the entropy of a system is equal to, or less than, zero. However, from either the von Neumann-Shannon entropy formula (S = -Σ p_i ln p_i) or the Boltzmann entropy formula (S = ln w) and the original definition, entropy is non-negative. Therefore, this statement is false. It should not be a surprise that, for the first time, many outstanding problems such as the validity of Pauling's resonance theory, the explanation of second-order phase transition phenomena, the biophysical problem of protein folding and the related hydrophobic effect, etc., can be solved. Empirical principles such as the Pauli principle (and Hund's rule) and the HSAB principle, etc., can also be given a theoretical explanation.
Geospatial Characterization of Fluvial Wood Arrangement in a Semi-confined Alluvial River
NASA Astrophysics Data System (ADS)
Martin, D. J.; Harden, C. P.; Pavlowsky, R. T.
2014-12-01
Large woody debris (LWD) has become universally recognized as an integral component of fluvial systems, and as a result, has become increasingly common as a river restoration tool. However, "natural" processes of wood recruitment and the subsequent arrangement of LWD within the river network are poorly understood. This research used a suite of spatial statistics to investigate longitudinal arrangement patterns of LWD in a low-gradient, Midwestern river. First, a large-scale GPS inventory of LWD, performed on the Big River in the eastern Missouri Ozarks, resulted in over 4,000 logged positions of LWD along seven river segments that covered nearly 100 km of the 237 km river system. A global Moran's I analysis indicates that LWD density is spatially autocorrelated and displays a clustering tendency within all seven river segments (P-value range = 0.000 to 0.054). A local Moran's I analysis identified specific locations along the segments where clustering occurs and revealed that, on average, clusters of LWD density (high or low) spanned 400 m. Spectral analyses revealed that, in some segments, LWD density is spatially periodic. Two segments displayed strong periodicity, while the remaining segments displayed varying degrees of noisiness. Periodicity showed a positive association with gravel bar spacing and meander wavelength, although there were insufficient data to statistically confirm the relationship. A wavelet analysis was then performed to investigate periodicity relative to location along the segment. The wavelet analysis identified significant (α = 0.05) periodicity at discrete locations along each of the segments. Those reaches yielding strong periodicity showed stronger relationships between LWD density and the geomorphic/riparian independent variables tested. Analyses consistently identified valley width and sinuosity as being associated with LWD density. 
The results of these analyses contribute a new perspective on the longitudinal distribution of LWD in a river system, which should help identify physical and/or riparian control mechanisms of LWD arrangement and support the development of models of LWD arrangement. Additionally, the spatial statistical tools presented here have shown to be valuable for identifying longitudinal patterns in river system components.
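The global Moran's I used above can be illustrated for a one-dimensional sequence of reaches with adjacent-neighbour weights. The LWD densities below are hypothetical, arranged so that high and low values cluster:

```python
def morans_i(values):
    """Global Moran's I for a 1-D sequence using adjacent-segment
    neighbour weights; values well above 0 indicate spatial clustering,
    values below 0 indicate alternation."""
    n = len(values)
    mean = sum(values) / n
    dev = [v - mean for v in values]
    # Cross-products over neighbouring pairs, doubled for symmetric weights
    num = 2 * sum(dev[i] * dev[i + 1] for i in range(n - 1))
    den = sum(d * d for d in dev)
    w_sum = 2 * (n - 1)  # total weight: each adjacent pair counted twice
    return (n / w_sum) * (num / den)

# Hypothetical LWD densities (pieces per 100 m) along consecutive reaches
clustered = [9, 8, 9, 10, 2, 1, 2, 1, 8, 9, 10, 9]
I = morans_i(clustered)
```

Significance would then be judged against a permutation or normal-approximation null, as in the P-values the abstract reports; the local variant computes an analogous quantity per location to map where the clusters occur.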
ERIC Educational Resources Information Center
National Centre for Vocational Education Research, Leabrook (Australia).
Statistics regarding Australians participating in apprenticeships and traineeships in the mechanical engineering and fabrication trades in 1995-1999 were reviewed to provide an indication of where skill shortages may be occurring or will likely occur in relation to the following occupations: mechanical engineering trades; fabrication engineering…
Pike, Katie; Nash, Rachel L; Murphy, Gavin J; Reeves, Barnaby C; Rogers, Chris A
2015-02-22
The Transfusion Indication Threshold Reduction (TITRe2) trial is the largest randomized controlled trial to date to compare red blood cell transfusion strategies following cardiac surgery. This update presents the statistical analysis plan, detailing how the study will be analyzed and presented. The statistical analysis plan has been written following recommendations from the International Conference on Harmonisation of Technical Requirements for Registration of Pharmaceuticals for Human Use, prior to database lock and the final analysis of trial data. Outlined analyses are in line with the Consolidated Standards of Reporting Trials (CONSORT). The study aims to randomize 2000 patients from 17 UK centres. Patients are randomized to either a restrictive (transfuse if haemoglobin concentration <7.5 g/dl) or liberal (transfuse if haemoglobin concentration <9 g/dl) transfusion strategy. The primary outcome is a binary composite outcome of any serious infectious or ischaemic event in the first 3 months following randomization. The statistical analysis plan details how non-adherence with the intervention, withdrawals from the study, and the study population will be derived and dealt with in the analysis. The planned analyses of the trial primary and secondary outcome measures are described in detail, including approaches taken to deal with multiple testing, model assumptions not being met and missing data. Details of planned subgroup and sensitivity analyses and pre-specified ancillary analyses are given, along with potential issues that have been identified with such analyses and possible approaches to overcome such issues. Trial registration: ISRCTN70923932.
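The two arms differ only in the haemoglobin threshold that triggers transfusion. As a minimal sketch of the decision rule described above (the function name is mine, not from the trial protocol):

```python
def should_transfuse(hb_g_dl: float, strategy: str) -> bool:
    """Transfuse when haemoglobin falls below the arm's threshold:
    restrictive < 7.5 g/dl, liberal < 9.0 g/dl."""
    thresholds = {"restrictive": 7.5, "liberal": 9.0}
    return hb_g_dl < thresholds[strategy]

# e.g. a patient at 8.0 g/dl is transfused only under the liberal strategy
```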
Mechanical and Statistical Evidence of Human-Caused Earthquakes - A Global Data Analysis
NASA Astrophysics Data System (ADS)
Klose, C. D.
2012-12-01
The causality of large-scale geoengineering activities and the occurrence of earthquakes with magnitudes of up to M=8 is discussed and mechanical and statistical evidence is provided. The earthquakes were caused by artificial water reservoir impoundments, underground and open-pit mining, coastal management, hydrocarbon production and fluid injections/extractions. The presented global earthquake catalog has been recently published in the Journal of Seismology and is available for the public at www.cdklose.com. The data show evidence that geomechanical relationships exist with statistical significance between a) seismic moment magnitudes of observed earthquakes, b) anthropogenic mass shifts on the Earth's crust, and c) lateral distances of the earthquake hypocenters to the locations of the mass shifts. Research findings depend on uncertainties, in particular, of source parameter estimations of seismic events before instrumental recording. First analyses, however, indicate that small- to medium-size earthquakes (
Bredbenner, Todd L.; Eliason, Travis D.; Francis, W. Loren; McFarland, John M.; Merkle, Andrew C.; Nicolella, Daniel P.
2014-01-01
Cervical spine injuries are a significant concern in all types of trauma. Recent military conflicts have demonstrated the substantial risk of spinal injury for the modern warfighter. Finite element models used to investigate injury mechanisms often fail to examine the effects of variation in geometry or material properties on mechanical behavior. The goals of this study were to model geometric variation for a set of cervical spines, to extend this model to a parametric finite element model, and, as a first step, to validate the parametric model against experimental data for low-loading conditions. Individual finite element models were created using cervical spine (C3–T1) computed tomography data for five male cadavers. Statistical shape modeling (SSM) was used to generate a parametric finite element model incorporating variability of spine geometry, and soft-tissue material property variation was also included. The probabilistic loading response of the parametric model was determined under flexion-extension, axial rotation, and lateral bending and validated by comparison to experimental data. Based on qualitative and quantitative comparison of the experimental loading response and model simulations, we suggest that the model performs adequately under relatively low-level loading conditions in multiple loading directions. In conclusion, SSM methods coupled with finite element analyses within a probabilistic framework, along with the ability to statistically validate the overall model performance, provide innovative and important steps toward describing the differences in vertebral morphology, spinal curvature, and variation in material properties. We suggest that these methods, with additional investigation and validation under injurious loading conditions, will lead to understanding and mitigating the risks of injury in the spine and other musculoskeletal structures. PMID:25506051
Aftershock Energy Distribution by Statistical Mechanics Approach
NASA Astrophysics Data System (ADS)
Daminelli, R.; Marcellini, A.
2015-12-01
The aim of our work is to find the most probable distribution of aftershock energies. We started from one of the fundamental principles of statistical mechanics, which, in the case of aftershock sequences, can be expressed as: the greater the number of different ways in which the energy of aftershocks can be arranged among the energy cells in phase space, the more probable the distribution. We assume that each cell in phase space is equally likely to be occupied, and that more than one cell in phase space can have the same energy. Because seismic energy is proportional to a product of several parameters, different combinations of parameters can yield the same energy (e.g., different combinations of stress drop and fault area can release the same seismic energy). Let us assume that there are g_i cells in the aftershock phase space characterised by the same released energy ɛ_i. We can then assume that Maxwell-Boltzmann statistics apply to aftershock sequences, with the proviso that the validity of this hypothesis is judged by agreement with the data. The aftershock energy distribution can therefore be written as: n(ɛ) = A g(ɛ) exp(-βɛ), where n(ɛ) is the number of aftershocks with energy ɛ, and A and β are constants. Under the above hypothesis, we take g(ɛ) to be proportional to ɛ. We selected and analysed different aftershock sequences (data extracted from the Earthquake Catalogs of SCEC, of INGV-CNT and other institutions) with a minimum retained magnitude ML = 2 (in some cases ML = 2.6) and a time window of 35 days. The results of our model are in agreement with the data, except in the very low energy band, where the model moderately overestimates the counts.
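With g(ɛ) taken proportional to ɛ, the model density ɛ·exp(−βɛ) peaks at ɛ = 1/β (setting the derivative exp(−βɛ)(1 − βɛ) to zero). A small numerical check of that property; the constants A and β here are arbitrary, purely for illustration:

```python
import math

def aftershock_count(eps, A, beta):
    """Model n(eps) = A * g(eps) * exp(-beta * eps) with g(eps) proportional to eps."""
    return A * eps * math.exp(-beta * eps)

# the density peaks where d/d_eps [eps * exp(-beta*eps)] = 0, i.e. at eps = 1/beta
beta = 2.0
grid = [i / 1000 for i in range(1, 5000)]
peak = max(grid, key=lambda e: aftershock_count(e, 1.0, beta))
```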
A quantitative study of nanoparticle skin penetration with interactive segmentation.
Lee, Onseok; Lee, See Hyun; Jeong, Sang Hoon; Kim, Jaeyoung; Ryu, Hwa Jung; Oh, Chilhwan; Son, Sang Wook
2016-10-01
In the last decade, the application of nanotechnology techniques has expanded within diverse areas such as pharmacology, medicine, and optical science. Despite such wide-ranging possibilities for implementation into practice, the mechanisms behind nanoparticle skin absorption remain unknown. Moreover, the main mode of investigation has been qualitative analysis. Using interactive segmentation, this study suggests a method of objectively and quantitatively analyzing the mechanisms underlying the skin absorption of nanoparticles. Silica nanoparticles (SNPs) were assessed using transmission electron microscopy and applied to the human skin equivalent model. Captured fluorescence images of this model were used to evaluate degrees of skin penetration. These images underwent interactive segmentation and image processing in addition to statistical quantitative analyses of calculated image parameters including the mean, integrated density, skewness, kurtosis, and area fraction. In images from both groups, the distribution area and intensity of fluorescent silica gradually increased in proportion to time. Statistical significance was reached after 2 days in the negative charge group but only after 4 days in the positive charge group, indicating that the time course of penetration differed between the groups. Furthermore, the quantity of silica per unit area showed a dramatic change after 6 days in the negative charge group. Although this quantitative result agrees with the qualitative assessment, it is meaningful in that it was established by statistical analysis of image-based quantitation. The present study suggests that the surface charge of SNPs could play an important role in the percutaneous absorption of NPs. These findings can help achieve a better understanding of the percutaneous transport of NPs. In addition, these results provide important guidance for the design of NPs for biomedical applications.
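The per-image parameters listed above (mean, integrated density, skewness, kurtosis, area fraction) can all be computed from raw pixel intensities. A minimal pure-Python sketch over a flattened grayscale image; the pixel values and the threshold convention are invented for illustration:

```python
def image_stats(pixels, threshold=0.0):
    """Descriptive statistics of a flattened grayscale image."""
    n = len(pixels)
    mean = sum(pixels) / n
    m2 = sum((p - mean) ** 2 for p in pixels) / n   # variance
    m3 = sum((p - mean) ** 3 for p in pixels) / n
    m4 = sum((p - mean) ** 4 for p in pixels) / n
    return {
        "mean": mean,
        "integrated_density": mean * n,                  # sum of intensities
        "skewness": m3 / m2 ** 1.5 if m2 else 0.0,
        "kurtosis": m4 / m2 ** 2 - 3 if m2 else 0.0,     # excess kurtosis
        "area_fraction": sum(p > threshold for p in pixels) / n,
    }

stats = image_stats([0, 0, 10, 10])  # toy 2x2 "image"
```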
Wang, Xiaoliang; Shojaie, Ali; Zhang, Yuzheng; Shelley, David; Lampe, Paul D; Levy, Lisa; Peters, Ulrike; Potter, John D; White, Emily; Lampe, Johanna W
2017-01-01
Long-term use of aspirin is associated with lower risk of colorectal cancer and other cancers; however, the mechanism of chemopreventive effect of aspirin is not fully understood. Animal studies suggest that COX-2, NFκB signaling and Wnt/β-catenin pathways may play a role, but no clinical trials have systematically evaluated the biological response to aspirin in healthy humans. Using a high-density antibody array, we assessed the difference in plasma protein levels after 60 days of regular dose aspirin (325 mg/day) compared to placebo in a randomized double-blinded crossover trial of 44 healthy non-smoking men and women, aged 21-45 years. The plasma proteome was analyzed on an antibody microarray with ~3,300 full-length antibodies, printed in triplicate. Moderated paired t-tests were performed on individual antibodies, and gene-set analyses were performed based on KEGG and GO pathways. Among the 3,000 antibodies analyzed, statistically significant differences in plasma protein levels were observed for nine antibodies after adjusting for false discoveries (FDR adjusted p-value<0.1). The most significant protein was succinate dehydrogenase subunit C (SDHC), a key enzyme complex of the mitochondrial tricarboxylic acid (TCA) cycle. The other statistically significant proteins (NR2F1, MSI1, MYH1, FOXO1, KHDRBS3, NFKBIE, LYZ and IKZF1) are involved in multiple pathways, including DNA base-pair repair, inflammation and oncogenic pathways. None of the 258 KEGG and 1,139 GO pathways was found to be statistically significant after FDR adjustment. This study suggests several chemopreventive mechanisms of aspirin in humans, which have previously been reported to play a role in anti- or pro-carcinogenesis in cell systems; however, larger, confirmatory studies are needed.
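Controlling false discoveries across thousands of antibodies is typically done with the Benjamini-Hochberg step-up procedure; a self-contained sketch of BH-adjusted p-values (the example p-values are invented):

```python
def benjamini_hochberg(pvals):
    """Benjamini-Hochberg adjusted p-values, returned in the input order."""
    n = len(pvals)
    order = sorted(range(n), key=lambda i: pvals[i])
    adjusted = [0.0] * n
    running_min = 1.0
    # step up from the largest rank, enforcing monotonicity
    for rank in range(n, 0, -1):
        idx = order[rank - 1]
        running_min = min(running_min, pvals[idx] * n / rank)
        adjusted[idx] = running_min
    return adjusted
```

A protein is then declared significant when its adjusted value falls below the chosen FDR level (0.1 in the study above).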
Wei, L; Liu, M; Xiong, H; Peng, B
2017-11-06
To investigate the effects of the pro-inflammatory and Th17-polarizing mediator IL-17 on HDPFs-mediated IL-23 production and the molecular mechanism involved. Interleukin (IL)-17R expression was determined by semi-quantitative reverse transcriptase-polymerase chain reaction and Western blot in cultured human dental pulp fibroblasts (HDPFs). Quantitative real-time polymerase chain reaction and enzyme-linked immunosorbent assay were used to determine IL-23 mRNA and protein levels in IL-17-stimulated HDPFs, respectively. The nuclear factor-kappa B (NF-κB) and mitogen-activated protein kinases (MAPKs) signalling pathways that mediate the IL-17-stimulated production of IL-23 were investigated using Western blot and specific signalling inhibitor analyses. Statistical analyses were performed using Kruskal-Wallis tests followed by the Mann-Whitney U-test. Statistical significance was set at P < 0.05. Primary HDPFs steadily expressed IL-17R mRNA and surface-bound protein. IL-17 stimulated the expression of IL-23 mRNA and protein in cultured human dental pulp fibroblasts, which was attenuated by IL-17 or IL-17R neutralizing antibodies. In accordance with the enhanced expression of IL-23, IL-17 stimulation resulted in rapid activation of p38 MAPK, extracellular signal-regulated kinase (ERK) 1/2, c-Jun-N-terminal kinase (JNK) and NF-κB in HDPFs. Inhibitors of p38 MAPK, ERK 1/2 or NF-κB significantly suppressed, whereas blocking JNK substantially augmented IL-23 production from IL-17-stimulated HDPFs. HDPFs expressed IL-17R and responded to IL-17 to produce IL-23 via the activation of the NF-κB and MAPK signalling pathways. The findings provide insights into the cellular mechanisms of the participation of IL-17 in the activation of HDPFs in inflamed pulp tissue. © 2017 International Endodontic Journal. Published by John Wiley & Sons Ltd.
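The Mann-Whitney U statistic used for the pairwise comparisons counts, over all cross-group pairs, how often one group's value exceeds the other's, with ties contributing one half. A minimal sketch (the readings are invented, not the cytokine data):

```python
def mann_whitney_u(a, b):
    """U statistic for group a versus group b (ties contribute 0.5)."""
    return sum(1.0 if x > y else 0.5 if x == y else 0.0
               for x in a for y in b)

# under the null of identical distributions, E[U] = len(a) * len(b) / 2
u = mann_whitney_u([3, 4, 5], [1, 2, 3])
```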
Repairability of CAD/CAM high-density PMMA- and composite-based polymers.
Wiegand, Annette; Stucki, Lukas; Hoffmann, Robin; Attin, Thomas; Stawarczyk, Bogna
2015-11-01
The study aimed to analyse the shear bond strength of computer-aided design and computer-aided manufacturing (CAD/CAM) polymethyl methacrylate (PMMA)- and composite-based polymer materials repaired with a conventional methacrylate-based composite after different surface pretreatments. Forty-eight specimens were prepared from each of six different CAD/CAM polymer materials (Ambarino high-class, artBloc Temp, CAD-Temp, Lava Ultimate, Telio CAD, Everest C-Temp) and from a conventional dimethacrylate-based composite (Filtek Supreme XTE, control), and were aged by thermal cycling (5000 cycles, 5-55 °C). The surfaces were left untreated or were pretreated by mechanical roughening, aluminium oxide air abrasion or silica coating/silanization (each subgroup n = 12). The surfaces were further conditioned with an etch&rinse adhesive (OptiBond FL) before the repair composite (Filtek Supreme XTE) was adhered to the surface. After further thermal cycling, shear bond strength was tested, and failure modes were assessed. Shear bond strength was statistically analysed by two- and one-way ANOVAs and Weibull statistics, failure mode by chi-squared test (p ≤ 0.05). Shear bond strength was highest for silica coating/silanization > aluminium oxide air abrasion = mechanical roughening > no surface pretreatment. Independently of the repair pretreatment, the highest bond strength values were observed in the control group and for the composite-based Everest C-Temp and Ambarino high-class, while PMMA-based materials (artBloc Temp, CAD-Temp and Telio CAD) showed the lowest values. For all materials, repair without any surface pretreatment resulted in adhesive failures only, which mostly were reduced when surface pretreatment was performed. Repair of CAD/CAM high-density polymers requires surface pretreatment prior to adhesive and composite application. However, four of the six tested CAD/CAM materials did not achieve the repair bond strength of a conventional dimethacrylate-based composite.
Repair of PMMA- and composite-based polymers can be achieved by surface pretreatment followed by application of an adhesive and a conventional methacrylate-based composite.
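Weibull statistics of bond strength are conventionally obtained by linearizing the cumulative failure distribution, ln(−ln(1 − F)) = m·ln σ − m·ln σ0, with median-rank estimates F_i = (i − 0.5)/n; the least-squares slope is the Weibull modulus m. A sketch, with hypothetical strength values (not the study's measurements):

```python
import math

def weibull_modulus(strengths):
    """Weibull modulus m from a least-squares fit of ln(-ln(1-F)) on ln(strength)."""
    s = sorted(strengths)
    n = len(s)
    xs = [math.log(v) for v in s]
    # median-rank failure probability F_i = (i - 0.5)/n for i = 1..n
    ys = [math.log(-math.log(1.0 - (i + 0.5) / n)) for i in range(n)]
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

m = weibull_modulus([18.2, 21.5, 23.1, 24.8, 26.0, 27.9, 29.4, 31.6])  # MPa, invented
```

A higher modulus means less scatter in the bond strengths, i.e. a more reliable repair.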
Injury Severity Score coding: Data analyst v. emerging m-health technology.
Spence, R T; Zargaran, E; Hameed, M; Fong, D; Shangguan, E; Martinez, R; Navsaria, P; Nicol, A
2016-09-08
The cost of Abbreviated Injury Scale (AIS) coding has limited its utility in areas of the world with the highest incidence of trauma. We hypothesised that emerging mobile health (m-health) technology could offer a cost-effective alternative to the current gold-standard AIS mechanism in a high-volume trauma centre in South Africa. A prospectively collected sample of consecutive patients admitted following a traumatic injury that required an operation during a 1-month period was selected for the study. AISs and Injury Severity Scores (ISSs) were generated by clinician-entered data using an m-health application (ISS eTHR) as well as by a team of AIS coders at Vancouver General Hospital, Canada (ISS VGH). Rater agreements for ISSs were analysed using Bland-Altman plots with 95% limits of agreement (LoA) and kappa statistics of the ISSs grouped into ordinal categories. Reliability was analysed using a two-way mixed-model intraclass correlation coefficient (ICC). Calibration and discrimination of univariate logistic regression models built to predict in-hospital complications using ISSs coded by the two methods were also compared. Fifty-seven patients were managed operatively during the study period. The mean age of the cohort was 27.2 years (range 14-62), and 96.3% were male. The mechanism of injury was penetrating in 93.4% of cases, of which 52.8% were gunshot injuries. The 95% LoA ranged from -8.6 to 9.4. The mean ISS difference was 0.4 (95% CI -0.8 to 1.6). The kappa statistic was 0.53. The ICC of the individual ISS was 0.88 (95% CI 0.81-0.93) and the categorical ISS was 0.81 (95% CI 0.68-0.87). Model performance to predict in-hospital complications using either the ISS eTHR or the ISS VGH was equivalent. ISSs calculated by the eTHR and gold-standard coding were comparable. Emerging m-health technology provides a cost-effective alternative for injury severity scoring.
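The Bland-Altman 95% limits of agreement reported above are the mean paired difference ± 1.96 standard deviations of the differences. A minimal sketch; the paired scores below are invented, not the trial data:

```python
def bland_altman_loa(a, b):
    """Mean paired difference and 95% limits of agreement (mean ± 1.96 SD)."""
    diffs = [x - y for x, y in zip(a, b)]
    n = len(diffs)
    mean = sum(diffs) / n
    sd = (sum((d - mean) ** 2 for d in diffs) / (n - 1)) ** 0.5  # sample SD
    return mean, mean - 1.96 * sd, mean + 1.96 * sd

# invented paired ISS scores from two raters
mean_diff, lower, upper = bland_altman_loa([10, 12, 14, 16], [9, 13, 13, 17])
```

Agreement is judged by whether the limits are narrow enough to be clinically acceptable, not by a p-value.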
Learning physics concepts as a function of colloquial language usage
NASA Astrophysics Data System (ADS)
Maier, Steven J.
Data from two sections of college introductory, algebra-based physics courses (n1 = 139, n2 = 91) were collected using three separate instruments to investigate the relationships between reasoning ability, conceptual gain and colloquial language usage. To obtain a measure of reasoning ability, Lawson's Classroom Test of Scientific Reasoning Ability (TSR) was administered once near mid-term for each sample. The Force Concept Inventory (FCI) was administered at the beginning and at the end of the term for pre- and post-test measures. Pre- and post-test data from the Mechanics Language Usage instrument were also collected in conjunction with FCI data collection at the beginning and end of the term. The MLU was developed specifically for this study prior to data collection, and results of a pilot test to establish validity and reliability are reported. T-tests were performed on the data collected to compare the means from each sample. In addition, correlations among the measures were investigated between the samples separately and combined. Results from these investigations served as justification for combining the samples into a single sample of 230 for performing further statistical analyses. The primary objective of this study was to determine if scientific reasoning ability (a function of developmental stage) and conceptual gains in Newtonian mechanics predict students' usages of "force" as measured by the MLU. Regression analyses were performed to evaluate these mediated relationships, with TSR and FCI performance as predictors of MLU performance. Statistically significant correlations and relationships existed among several of the measures, which are discussed at length in the body of the narrative.
The findings of this research indicate that although there is a discernible relationship between reasoning ability and conceptual change, more work is needed to establish improved quantitative measures of the role language usage plays in developing understanding of course content.
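Mediated relationships of the kind evaluated above are conventionally estimated as a product of coefficients: a (predictor → mediator) times b (mediator → outcome, adjusting for the predictor). A pure-Python sketch with invented scores; the role assignments (X predictor, M mediator, Y outcome) are illustrative only, not the study's model:

```python
def mediated_effect(x, m, y):
    """Product-of-coefficients indirect effect a*b from two OLS fits."""
    n = len(x)
    mx, mm, my = sum(x) / n, sum(m) / n, sum(y) / n
    sxx = sum((v - mx) ** 2 for v in x)
    smm = sum((v - mm) ** 2 for v in m)
    sxm = sum((xi - mx) * (mi - mm) for xi, mi in zip(x, m))
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    smy = sum((mi - mm) * (yi - my) for mi, yi in zip(m, y))
    a = sxm / sxx                                          # X -> M
    b = (smy * sxx - sxy * sxm) / (sxx * smm - sxm ** 2)   # M -> Y, adjusting for X
    return a * b

effect = mediated_effect([0, 1, 2, 3], [1, 1, 3, 7], [1, 2, 5, 10])
```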
Hill, Genevieve; Nagaraja, Srinidhi; Akbarnia, Behrooz A; Pawelek, Jeff; Sponseller, Paul; Sturm, Peter; Emans, John; Bonangelino, Pablo; Cockrum, Joshua; Kane, William; Dreher, Maureen
2017-10-01
Growing rod constructs are an important contribution for treating patients with early-onset scoliosis. These devices experience high failure rates, including rod fractures. The objective of this study was to identify the failure mechanism of retrieved growing rods, and to identify differences between patients with failed and intact constructs. Growing rod patients who had implant removal and were previously enrolled in a multicenter registry were eligible for this study. Forty dual-rod constructs were retrieved from 36 patients across four centers, and 34 of those constructs met the inclusion criteria. Eighteen constructs failed due to rod fracture. Sixteen intact constructs were removed due to final fusion (n=7), implant exchange (n=5), infection (n=2), or implant prominence (n=2). Analyses of clinical registry data, radiographs, and retrievals were the outcome measures. Retrievals were analyzed with microscopic imaging (optical and scanning electron microscopy) for areas of mechanical failure, damage, and corrosion. Failure analyses were conducted on the fracture surfaces to identify failure mechanism(s). Statistical analyses were performed to determine significant differences between the failed and intact groups. The failed rods fractured due to bending fatigue under flexion motion. Construct configuration and loading dictate high bending stresses at three distinct locations along the construct: (1) mid-construct, (2) adjacent to the tandem connector, or (3) adjacent to the distal anchor foundation. In addition, high torques used to insert set screws may create an initiation point for fatigue. Syndromic scoliosis, prior rod fractures, increase in patient weight, and rigid constructs consisting of tandem connectors and multiple crosslinks were associated with failure. This is the first study to examine retrieved, failed growing rod implants across multiple centers. 
Our analysis found that rod fractures are due to bending fatigue, and that stress concentrations play an important role in rod fractures. Recommendations are made on surgical techniques, such as the use of torque-limiting wrenches or not exceeding the prescribed torques. Additional recommendations include frequent rod replacement in select patients during scheduled surgeries. Published by Elsevier Inc.
NASA Astrophysics Data System (ADS)
Bianchi, Eugenio; Haggard, Hal M.; Rovelli, Carlo
2017-08-01
We show that in Oeckl's boundary formalism the boundary vectors that do not have a tensor form represent, in a precise sense, statistical states. Therefore the formalism incorporates quantum statistical mechanics naturally. We formulate general-covariant quantum statistical mechanics in this language. We illustrate the formalism by showing how it accounts for the Unruh effect. We observe that the distinction between pure and mixed states weakens in the general covariant context, suggesting that local gravitational processes are naturally statistical without a sharp quantal versus probabilistic distinction.
Introduction to the topical issue: Nonadditive entropy and nonextensive statistical mechanics
NASA Astrophysics Data System (ADS)
Sugiyama, Masaru
Dear CMT readers, it is my pleasure to introduce you to this topical issue dealing with a new research field of great interest, nonextensive statistical mechanics. This theory was initiated by Constantino Tsallis' work in 1988, as a possible generalization of Boltzmann-Gibbs thermostatistics. It is based on a nonadditive entropy, nowadays referred to as the Tsallis entropy. Nonextensive statistical mechanics is expected to be a consistent and unified theoretical framework for describing the macroscopic properties of complex systems that are anomalous in view of ordinary thermostatistics. In such systems, the long-standing problem regarding the relationship between statistical and dynamical laws becomes highlighted, since ergodicity and mixing may not be well realized in situations such as the edge of chaos. The phase space appears to self-organize in a structure that is not simply Euclidean but (multi)fractal. Due to this nontrivial structure, the concept of homogeneity of the system, which is the basic premise in ordinary thermodynamics, is violated and accordingly the additivity postulate for the thermodynamic quantities such as the internal energy and entropy may not be justified, in general. (Physically, nonadditivity is deeply relevant to nonextensivity of a system, in which the thermodynamic quantities do not scale with size in a simple way. Typical examples are systems with long-range interactions like self-gravitating systems as well as nonneutral charged ones.) A point of crucial importance here is that, phenomenologically, such an exotic phase-space structure has a fairly long lifetime. Therefore, this state, referred to as a metaequilibrium state or a nonequilibrium stationary state, appears to be described by a generalized entropic principle different from the traditional Boltzmann-Gibbs form, even though it may eventually approach the Boltzmann-Gibbs equilibrium state.
The limits t → ∞ and N → ∞ do not commute, where t and N are time and the number of particles, respectively. The present topical issue is devoted to summarizing the current status of nonextensive statistical mechanics from various perspectives. It is my hope that this issue can inform the reader of one of the foremost research areas in thermostatistics. This issue consists of eight articles. The first one by Tsallis and Brigatti presents a general introduction and an overview of nonextensive statistical mechanics. At first glance, generalization of the ordinary Boltzmann-Gibbs-Shannon entropy might be completely arbitrary. But Abe's article explains how Tsallis' generalization of the statistical entropy can uniquely be characterized by both physical and mathematical principles. Then, the article by Pluchino, Latora, and Rapisarda presents strong evidence that nonextensive statistical mechanics is in fact relevant to nonextensive systems with long-range interactions. The articles by Rajagopal, by Wada, and by Plastino, Miller, and Plastino are concerned with the macroscopic thermodynamic properties of nonextensive statistical mechanics. Rajagopal discusses the first and second laws of thermodynamics. Wada develops a discussion about the condition under which the nonextensive statistical-mechanical formalism is thermodynamically stable. The work of Plastino, Miller, and Plastino addresses the thermodynamic Legendre-transform structure and its robustness for generalizations of entropy. After these fundamental investigations, Sakagami and Taruya examine the theory for self-gravitating systems. Finally, Beck presents a novel idea of the so-called superstatistics, which provides nonextensive statistical mechanics with a physical interpretation based on nonequilibrium concepts including temperature fluctuations. Its applications to hydrodynamic turbulence and pattern formation in thermal convection states are also discussed.
Nonextensive statistical mechanics is already a well-studied field, and a number of works are available in the literature. It is recommended that the interested reader visit http://tsallis.cat.cbpf.br/TEMUCO.pdf. There, one can find a comprehensive list of references to more than one thousand papers including important results that, due to lack of space, have not been mentioned in the present issue. Though there are so many published works, nonextensive statistical mechanics is still a developing field. This can naturally be understood, since the program that has been undertaken is an extremely ambitious one that makes a serious attempt to enlarge the horizons of the realm of statistical mechanics. The possible influence of nonextensive statistical mechanics on continuum mechanics and thermodynamics seems to be wide and deep. I will therefore be happy if this issue contributes to attracting the interest of researchers and stimulates research activities not only in the very field of nonextensive statistical mechanics but also in the field of continuum mechanics and thermodynamics in a wider context. As the editor of the present topical issue, I would like to express my sincere thanks to all those who contributed to this issue. I cordially thank Professor S. Abe for advising me on the editorial policy. Without his help, the present topical issue would never have been brought out.
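For reference, the nonadditive (Tsallis) entropy discussed throughout this issue is, for a discrete distribution {p_i},

```latex
S_q = k\,\frac{1 - \sum_i p_i^{\,q}}{q - 1},
\qquad
\lim_{q \to 1} S_q = -k \sum_i p_i \ln p_i ,
```

recovering the Boltzmann-Gibbs-Shannon form at q = 1. For two independent subsystems A and B it satisfies the pseudo-additivity relation

```latex
S_q(A{+}B) = S_q(A) + S_q(B) + \frac{1-q}{k}\, S_q(A)\, S_q(B),
```

so that ordinary additivity, too, is recovered only in the limit q → 1.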
Conceptual and statistical problems associated with the use of diversity indices in ecology.
Barrantes, Gilbert; Sandoval, Luis
2009-09-01
Diversity indices, particularly the Shannon-Wiener index, have extensively been used in analyzing patterns of diversity at different geographic and ecological scales. These indices have serious conceptual and statistical problems which make comparisons of species richness or species abundances across communities nearly impossible. There is often no single statistical method that retains all information needed to answer even a simple question. However, multivariate analyses could be used instead of diversity indices, such as cluster analyses or multiple regressions. More complex multivariate analyses, such as Canonical Correspondence Analysis, provide very valuable information on environmental variables associated with the presence and abundance of the species in a community. In addition, particular hypotheses associated with changes in species richness across localities, or change in abundance of one, or a group of species can be tested using univariate, bivariate, and/or rarefaction statistical tests. The rarefaction method has proved robust for standardizing all samples to a common size. Even the simplest approach, reporting the number of species per taxonomic category, may provide more information than a diversity index value.
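Two of the quantities discussed, the Shannon-Wiener index and rarefaction (the expected species richness in a random subsample of n individuals, in Hurlbert's formulation), can be sketched directly from species abundance counts; the counts below are invented:

```python
from math import comb, log

def shannon(counts):
    """Shannon-Wiener index H' = -sum p_i ln p_i over nonzero abundances."""
    total = sum(counts)
    return -sum((c / total) * log(c / total) for c in counts if c > 0)

def rarefied_richness(counts, n):
    """Expected number of species observed in a random subsample of n individuals."""
    N = sum(counts)
    # each species contributes the probability that it appears in the subsample
    return sum(1 - comb(N - c, n) / comb(N, n) for c in counts)

h = shannon([12, 7, 3, 1])          # invented abundances of four species
s_20 = rarefied_richness([12, 7, 3, 1], 20)
```

Rarefaction makes richness comparable across samples of unequal size, which is exactly the standardization the abstract recommends.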
Sex differences in thickness, and folding developments throughout the cortex.
Mutlu, A Kadir; Schneider, Maude; Debbané, Martin; Badoud, Deborah; Eliez, Stephan; Schaer, Marie
2013-11-15
While significant differences in male and female brain structures have commonly been reported, only a few studies have focused on the sex differences in the way the cortex matures over time. Here, we investigated cortical thickness maturation between the ages of 6 and 30 years, using 209 longitudinally-acquired brain MRI scans. Significant sex differences in the trajectories of cortical thickness change with age were evidenced using non-linear mixed effects models. Similar statistical analyses were computed to quantify the differences between cortical gyrification changes with age in males and females. During adolescence, we observed a statistically significant higher rate of cortical thinning in females compared to males in the right temporal regions, the left temporoparietal junction and the left orbitofrontal cortex. This finding is interpreted as a faster maturation of the social brain areas in females. Concomitantly, statistically significant sex differences in cortical folding changes with age were observed only in one cluster of the right prefrontal regions, suggesting that the mechanisms underlying cortical thickness and gyrification changes with age are quite distinct. Sexual dimorphism in the developmental course of the cortical maturation may be associated with the different age of onset and clinical presentation of many psychiatric disorders between males and females. Copyright © 2013 Elsevier Inc. All rights reserved.
Learning Predictive Statistics: Strategies and Brain Mechanisms.
Wang, Rui; Shen, Yuan; Tino, Peter; Welchman, Andrew E; Kourtzi, Zoe
2017-08-30
When immersed in a new environment, we are challenged to decipher initially incomprehensible streams of sensory information. However, quite rapidly, the brain finds structure and meaning in these incoming signals, helping us to predict and prepare ourselves for future actions. This skill relies on extracting the statistics of event streams in the environment that contain regularities of variable complexity from simple repetitive patterns to complex probabilistic combinations. Here, we test the brain mechanisms that mediate our ability to adapt to the environment's statistics and predict upcoming events. By combining behavioral training and multisession fMRI in human participants (male and female), we track the corticostriatal mechanisms that mediate learning of temporal sequences as they change in structure complexity. We show that learning of predictive structures relates to individual decision strategy; that is, selecting the most probable outcome in a given context (maximizing) versus matching the exact sequence statistics. These strategies engage distinct human brain regions: maximizing engages dorsolateral prefrontal, cingulate, sensory-motor regions, and basal ganglia (dorsal caudate, putamen), whereas matching engages occipitotemporal regions (including the hippocampus) and basal ganglia (ventral caudate). Our findings provide evidence for distinct corticostriatal mechanisms that facilitate our ability to extract behaviorally relevant statistics to make predictions. SIGNIFICANCE STATEMENT Making predictions about future events relies on interpreting streams of information that may initially appear incomprehensible. Past work has studied how humans identify repetitive patterns and associative pairings. However, the natural environment contains regularities that vary in complexity from simple repetition to complex probabilistic combinations. 
Here, we combine behavior and multisession fMRI to track the brain mechanisms that mediate our ability to adapt to changes in the environment's statistics. We provide evidence for an alternate route for learning complex temporal statistics: extracting the most probable outcome in a given context is implemented by interactions between executive and motor corticostriatal mechanisms compared with visual corticostriatal circuits (including hippocampal cortex) that support learning of the exact temporal statistics. Copyright © 2017 Wang et al.
NASA Astrophysics Data System (ADS)
Lu, Q.-B.
2013-07-01
This study is focused on the effects of cosmic rays (solar activity) and halogen-containing molecules (mainly chlorofluorocarbons, CFCs) on atmospheric ozone depletion and global climate change. Brief reviews are first given on the cosmic-ray-driven electron-induced-reaction (CRE) theory for O3 depletion and the warming theory of halogenated molecules for climate change. Then natural and anthropogenic contributions to these phenomena are examined in detail and clearly separated through in-depth statistical analyses of comprehensive measured datasets of quantities, including cosmic rays (CRs), total solar irradiance, sunspot number, halogenated gases (CFCs, CCl4 and HCFCs), CO2, total O3, lower stratospheric temperatures and global surface temperatures. For O3 depletion, it is shown that an analytical equation derived from the CRE theory reproduces well the 11-year cyclic variations of both polar O3 loss and stratospheric cooling, and new statistical analyses of the CRE equation with observed data of total O3 and stratospheric temperature give high linear correlation coefficients ≥ 0.92. After the removal of the CR effect, a pronounced recovery by 20-25% of the Antarctic O3 hole is found, while no recovery of O3 loss in mid-latitudes has been observed. These results show both the correctness and dominance of the CRE mechanism and the success of the Montreal Protocol. For global climate change, in-depth analyses of the observed data clearly show that the solar effect and human-made halogenated gases played the dominant role in Earth's climate change prior to and after 1970, respectively. Remarkably, a statistical analysis gives a nearly zero correlation coefficient (R = -0.05) between global surface temperature data corrected by removing the solar effect and CO2 concentration during 1850-1970. 
In striking contrast, a nearly perfect linear correlation with coefficients as high as 0.96-0.97 is found between corrected or uncorrected global surface temperature and total amount of stratospheric halogenated gases during 1970-2012. Furthermore, a new theoretical calculation on the greenhouse effect of halogenated gases shows that they (mainly CFCs) could alone result in the global surface temperature rise of 0.6°C in 1970-2002. These results provide solid evidence that recent global warming was indeed caused by the greenhouse effect of anthropogenic halogenated gases. Thus, a slow reversal of global temperature to the 1950 value is predicted over the coming 5-7 decades. It is also expected that the global sea level will continue to rise in the coming 1-2 decades until the effect of the global temperature recovery dominates over that of the polar O3 hole recovery; after that, both will drop concurrently. All the observed, analytical and theoretical results presented lead to a convincing conclusion that both the CRE mechanism and the CFC-warming mechanism not only provide new fundamental understandings of the O3 hole and global climate change but have superior predictive capabilities, compared with the conventional models.
Six new mechanics corresponding to further shape theories
NASA Astrophysics Data System (ADS)
Anderson, Edward
2016-02-01
In this paper, a suite of relational notions of shape is presented at the level of configuration space geometry, with corresponding new theories of shape mechanics and shape statistics. These further generalize two quite well known examples: (i) Kendall’s (metric) shape space with his shape statistics and Barbour’s mechanics thereupon. (ii) Leibnizian relational space alias metric scale-and-shape space to which corresponds Barbour-Bertotti mechanics. This paper’s new theories include, using the invariant and group namings, (iii) Angle alias conformal shape mechanics. (iv) Area ratio alias e shape mechanics. (v) Area alias e scale-and-shape mechanics. (iii)-(v) rest respectively on angle space, area-ratio space, and area space configuration spaces. Probability and statistics applications are also pointed to in outline. (vi) Various supersymmetric counterparts of (i)-(v) are considered. Since supergravity differs considerably from GR-based conceptions of background independence, some of the new supersymmetric shape mechanics are compared with both. These reveal compatibility between supersymmetry and GR-based conceptions of background independence, at least within these simpler model arenas.
NASA Technical Reports Server (NTRS)
Morrissey, L. A.; Weinstock, K. J.; Mouat, D. A.; Card, D. H.
1984-01-01
An evaluation of Thematic Mapper Simulator (TMS) data for the geobotanical discrimination of rock types based on vegetative cover characteristics is addressed in this research. A methodology for accomplishing this evaluation utilizing univariate and multivariate techniques is presented. TMS data acquired with a Daedalus DEI-1260 multispectral scanner were integrated with vegetation and geologic information for subsequent statistical analyses, which included a chi-square test, an analysis of variance, stepwise discriminant analysis, and Duncan's multiple range test. Results indicate that ultramafic rock types are spectrally separable from nonultramafics based on vegetative cover through the use of statistical analyses.
[Clinical research=design*measurements*statistical analyses].
Furukawa, Toshiaki
2012-06-01
A clinical study must address true endpoints that matter for the patients and the doctors. A good clinical study starts with a good clinical question. Formulating a clinical question in the form of PECO can sharpen one's original question. In order to perform a good clinical study one must have a knowledge of study design, measurements and statistical analyses: The first is taught by epidemiology, the second by psychometrics and the third by biostatistics.
Reframing Serial Murder Within Empirical Research.
Gurian, Elizabeth A
2017-04-01
Empirical research on serial murder is limited due to the lack of consensus on a definition, the continued use of primarily descriptive statistics, and linkage to popular culture depictions. These limitations also inhibit our understanding of these offenders and affect credibility in the field of research. Therefore, this comprehensive overview of a sample of 508 cases (738 total offenders, including partnered groups of two or more offenders) provides analyses of solo male, solo female, and partnered serial killers to elucidate statistical differences and similarities in offending and adjudication patterns among the three groups. This analysis of serial homicide offenders not only supports previous research on offending patterns present in the serial homicide literature but also reveals that empirically based analyses can enhance our understanding beyond traditional case studies and descriptive statistics. Further research based on these empirical analyses can aid in the development of more accurate classifications and definitions of serial murderers.
Morgan, Elise F.; Mason, Zachary D.; Chien, Karen B.; Pfeiffer, Anthony J.; Barnes, George L.; Einhorn, Thomas A.; Gerstenfeld, Louis C.
2009-01-01
Non-invasive characterization of fracture callus structure and composition may facilitate development of surrogate measures of the regain of mechanical function. As such, quantitative computed tomography- (CT-) based analyses of fracture calluses could enable more reliable clinical assessments of bone healing. Although previous studies have used CT to quantify and predict fracture healing, it is unclear which of the many CT-derived metrics of callus structure and composition are the most predictive of callus mechanical properties. The goal of this study was to identify the changes in fracture callus structure and composition that occur over time and that are most closely related to the regain of mechanical function. Micro-computed tomography (μCT) imaging and torsion testing were performed on murine fracture calluses (n=188) at multiple post-fracture timepoints and under different experimental conditions that alter fracture healing. Total callus volume (TV), mineralized callus volume (BV), callus mineralized volume fraction (BV/TV), bone mineral content (BMC), tissue mineral density (TMD), standard deviation of mineral density (σTMD), effective polar moment of inertia (Jeff), torsional strength, and torsional rigidity were quantified. Multivariate statistical analyses, including multivariate analysis of variance, principal components analysis, and stepwise regression were used to identify differences in callus structure and composition among experimental groups and to determine which of the μCT outcome measures were the strongest predictors of mechanical properties. Although calluses varied greatly in the absolute and relative amounts of mineralized tissue (BV, BMC, and BV/TV), differences among timepoints were most strongly associated with changes in tissue mineral density. 
Torsional strength and rigidity were dependent on mineral density as well as the amount of mineralized tissue: TMD, BV, and σTMD explained 62% of the variation in torsional strength (p<0.001); and TMD, BMC, BV/TV, and σTMD explained 70% of the variation in torsional rigidity (p<0.001). These results indicate that fracture callus mechanical properties can be predicted by several μCT-derived measures of callus structure and composition. These findings form the basis for developing non-invasive assessments of fracture healing and for identifying biological and biomechanical mechanisms that lead to impaired or enhanced healing. PMID:19013264
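The regression step described above can be sketched numerically. The snippet below is a minimal illustration, not the study's actual analysis or data: it fits an ordinary least-squares model predicting a synthetic "torsional strength" outcome from three hypothetical μCT-style predictors (TMD, BV, σTMD) and reports the explained variance, analogous to the R² values quoted in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 188  # sample size matching the study; the data here are synthetic

# Hypothetical muCT predictors (arbitrary units, invented for illustration)
TMD = rng.normal(600, 50, n)    # tissue mineral density
BV = rng.normal(10, 2, n)       # mineralized callus volume
sTMD = rng.normal(80, 10, n)    # standard deviation of mineral density

# Synthetic outcome: strength depends on the predictors plus noise
strength = 0.05 * TMD + 1.5 * BV - 0.2 * sTMD + rng.normal(0, 3, n)

# Ordinary least squares with an intercept column
X = np.column_stack([np.ones(n), TMD, BV, sTMD])
beta, *_ = np.linalg.lstsq(X, strength, rcond=None)
pred = X @ beta

# Fraction of variance explained (the quantity reported in the abstract)
r2 = 1 - np.sum((strength - pred) ** 2) / np.sum((strength - strength.mean()) ** 2)
print(f"R^2 = {r2:.2f}")
```

A full stepwise procedure would additionally add or drop predictors based on significance at each step; the single fit above shows only the variance-explained calculation.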
Nimptsch, Ulrike; Wengler, Annelene; Mansky, Thomas
2016-11-01
In Germany, nationwide hospital discharge data (DRG statistics provided by the research data centers of the Federal Statistical Office and the Statistical Offices of the 'Länder') are increasingly used as data source for health services research. Within this data hospitals can be separated via their hospital identifier ([Institutionskennzeichen] IK). However, this hospital identifier primarily designates the invoicing unit and is not necessarily equivalent to one hospital location. Aiming to investigate direction and extent of possible bias in hospital-level analyses this study examines the continuity of the hospital identifier within a cross-sectional and longitudinal approach and compares the results to official hospital census statistics. Within the DRG statistics from 2005 to 2013 the annual number of hospitals as classified by hospital identifiers was counted for each year of observation. The annual number of hospitals derived from DRG statistics was compared to the number of hospitals in the official census statistics 'Grunddaten der Krankenhäuser'. Subsequently, the temporal continuity of hospital identifiers in the DRG statistics was analyzed within cohorts of hospitals. Until 2013, the annual number of hospital identifiers in the DRG statistics fell by 175 (from 1,725 to 1,550). This decline affected only providers with small or medium case volume. The number of hospitals identified in the DRG statistics was lower than the number given in the census statistics (e.g., in 2013 1,550 IK vs. 1,668 hospitals in the census statistics). The longitudinal analyses revealed that the majority of hospital identifiers persisted in the years of observation, while one fifth of hospital identifiers changed. In cross-sectional studies of German hospital discharge data the separation of hospitals via the hospital identifier might lead to underestimating the number of hospitals and consequential overestimation of caseload per hospital. 
Discontinuities of hospital identifiers over time might impair the follow-up of hospital cohorts. These limitations must be taken into account in analyses of German hospital discharge data focusing on the hospital level. Copyright © 2016. Published by Elsevier GmbH.
Han, Kyunghwa; Jung, Inkyung
2018-05-01
This review article presents an assessment of trends in statistical methods and an evaluation of their appropriateness in articles published in the Archives of Plastic Surgery (APS) from 2012 to 2017. We reviewed 388 original articles published in APS between 2012 and 2017. We categorized the articles that used statistical methods according to the type of statistical method, the number of statistical methods, and the type of statistical software used. We checked whether there were errors in the description of statistical methods and results. A total of 230 articles (59.3%) published in APS between 2012 and 2017 used one or more statistical methods. Within these articles, there were 261 applications of statistical methods with continuous or ordinal outcomes, and 139 applications of statistical methods with categorical outcomes. The Pearson chi-square test (17.4%) and the Mann-Whitney U test (14.4%) were the most frequently used methods. Errors in describing statistical methods and results were found in 133 of the 230 articles (57.8%). Inadequate description of P-values was the most common error (39.1%). Among the 230 articles that used statistical methods, 71.7% provided details about the statistical software programs used for the analyses. SPSS was predominantly used in the articles that presented statistical analyses. We found that the use of statistical methods in APS has increased over the last 6 years. It seems that researchers have been paying more attention to the proper use of statistics in recent years. It is expected that these positive trends will continue in APS.
Capture approximations beyond a statistical quantum mechanical method for atom-diatom reactions
NASA Astrophysics Data System (ADS)
Barrios, Lizandra; Rubayo-Soneira, Jesús; González-Lezana, Tomás
2016-03-01
Statistical techniques constitute useful approaches to investigate atom-diatom reactions mediated by insertion dynamics which involves complex-forming mechanisms. Different capture schemes based on energy considerations regarding the specific diatom rovibrational states are suggested to evaluate the corresponding probabilities of formation of such collision species between reactants and products, in an attempt to provide reliable alternatives for computationally demanding processes. These approximations are tested in combination with a statistical quantum mechanical method for the S + H2(v = 0 ,j = 1) → SH + H and Si + O2(v = 0 ,j = 1) → SiO + O reactions, where this dynamical mechanism plays a significant role, in order to probe their validity.
The Immuno-Dynamics of Conflict Intervention in Social Systems
Krakauer, David C.; Page, Karen; Flack, Jessica
2011-01-01
We present statistical evidence and dynamical models for the management of conflict and a division of labor (task specialization) in a primate society. Two broad intervention strategy classes are observed– a dyadic strategy – pacifying interventions, and a triadic strategy –policing interventions. These strategies, their respective degrees of specialization, and their consequences for conflict dynamics can be captured through empirically-grounded mathematical models inspired by immuno-dynamics. The spread of aggression, analogous to the proliferation of pathogens, is an epidemiological problem. We show analytically and computationally that policing is an efficient strategy as it requires only a small proportion of a population to police to reduce conflict contagion. Policing, but not pacifying, is capable of effectively eliminating conflict. These results suggest that despite implementation differences there might be universal features of conflict management mechanisms for reducing contagion-like dynamics that apply across biological and social levels. Our analyses further suggest that it can be profitable to conceive of conflict management strategies at the behavioral level as mechanisms of social immunity. PMID:21887221
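The contagion argument can be illustrated with a toy SIR-style sketch. This is an assumption of the illustration, not the authors' immuno-dynamic equations: aggression spreads by contact at rate beta, decays naturally at rate gamma, and a policing fraction p adds an extra removal rate. Even a small p sharply reduces the total "attack size" of a conflict cascade.

```python
# Toy contagion model of conflict spread (illustrative only).
def total_conflict(p, beta=0.6, gamma=0.2, gamma_p=2.0,
                   steps=2000, dt=0.05):
    """Return the cumulative fraction of the group drawn into conflict,
    given a policing fraction p (forward-Euler integration)."""
    s, i = 0.99, 0.01  # susceptible and actively aggressive fractions
    for _ in range(steps):
        new = beta * s * i                  # new conflicts by contact
        removed = (gamma + gamma_p * p) * i # decay plus policing removal
        s += dt * (-new)
        i += dt * (new - removed)
    return 1.0 - s  # attack size: everyone who was ever drawn in

print("no policing:  ", total_conflict(0.0))
print("10% policing: ", total_conflict(0.1))
```

With these toy parameters, policing 10% of the group roughly halves the reproduction number of the cascade, echoing the abstract's point that only a small policing proportion is needed.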
Modeling the atmospheric chemistry of TICs
NASA Astrophysics Data System (ADS)
Henley, Michael V.; Burns, Douglas S.; Chynwat, Veeradej; Moore, William; Plitz, Angela; Rottmann, Shawn; Hearn, John
2009-05-01
An atmospheric chemistry model that describes the behavior and disposition of environmentally hazardous compounds discharged into the atmosphere was coupled with the transport and diffusion model, SCIPUFF. The atmospheric chemistry model was developed by reducing a detailed atmospheric chemistry mechanism to a simple empirical effective degradation rate term (keff) that is a function of important meteorological parameters such as solar flux, temperature, and cloud cover. Empirically derived keff functions that describe the degradation of target toxic industrial chemicals (TICs) were derived by statistically analyzing data generated from the detailed chemistry mechanism run over a wide range of (typical) atmospheric conditions. To assess and identify areas to improve the developed atmospheric chemistry model, sensitivity and uncertainty analyses were performed to (1) quantify the sensitivity of the model output (TIC concentrations) with respect to changes in the input parameters and (2) improve, where necessary, the quality of the input data based on sensitivity results. The model predictions were evaluated against experimental data. Chamber data were used to remove the complexities of dispersion in the atmosphere.
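The reduction from a detailed mechanism to an empirical keff can be sketched as a regression problem. Everything below is synthetic and hypothetical (the real keff functions, their functional form, and their parameters are not given in the abstract): a linear keff(flux, T) is fitted to simulated "detailed mechanism" degradation rates generated over a range of conditions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "detailed mechanism" output: degradation rate vs. conditions
flux = rng.uniform(0.0, 1.0, 200)    # normalized solar flux (hypothetical)
temp = rng.uniform(260.0, 310.0, 200)  # temperature in K (hypothetical)
k_detailed = (1e-4 * flux + 2e-6 * (temp - 260.0)
              + rng.normal(0.0, 2e-6, 200))  # rate with simulation noise

# Empirical keff as a linear function of flux and temperature
A = np.column_stack([np.ones_like(flux), flux, temp])
coef, *_ = np.linalg.lstsq(A, k_detailed, rcond=None)
k_eff = A @ coef

# Goodness of the reduction: RMSE of keff against the detailed rates
rmse = np.sqrt(np.mean((k_eff - k_detailed) ** 2))
print(f"RMSE = {rmse:.2e}")
```

In the actual work the functional form also included cloud cover and was chosen to feed directly into SCIPUFF as a single loss-rate term; the fit above shows only the statistical reduction step.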
Lakatos, Eszter; Salehi-Reyhani, Ali; Barclay, Michael; Stumpf, Michael P H; Klug, David R
2017-01-01
We determine p53 protein abundances and cell to cell variation in two human cancer cell lines with single cell resolution, and show that the fractional width of the distributions is the same in both cases despite a large difference in average protein copy number. We developed a computational framework to identify dominant mechanisms controlling the variation of protein abundance in a simple model of gene expression from the summary statistics of single cell steady state protein expression distributions. Our results, based on single cell data analysed in a Bayesian framework, lend strong support to a model in which variation in the basal p53 protein abundance may be best explained by variations in the rate of p53 protein degradation. This is supported by measurements of the relative average levels of mRNA, which are very similar despite large variation in the level of protein.
Single-Case Experimental Designs to Evaluate Novel Technology-Based Health Interventions
Cassidy, Rachel N; Raiff, Bethany R
2013-01-01
Technology-based interventions to promote health are expanding rapidly. Assessing the preliminary efficacy of these interventions can be achieved by employing single-case experiments (sometimes referred to as n-of-1 studies). Although single-case experiments are often misunderstood, they offer excellent solutions to address the challenges associated with testing new technology-based interventions. This paper provides an introduction to single-case techniques and highlights advances in developing and evaluating single-case experiments, which help ensure that treatment outcomes are reliable, replicable, and generalizable. These advances include quality control standards, heuristics to guide visual analysis of time-series data, effect size calculations, and statistical analyses. They also include experimental designs to isolate the active elements in a treatment package and to assess the mechanisms of behavior change. The paper concludes with a discussion of issues related to the generality of findings derived from single-case research and how generality can be established through replication and through analysis of behavioral mechanisms. PMID:23399668
An organelle-specific protein landscape identifies novel diseases and molecular mechanisms
Boldt, Karsten; van Reeuwijk, Jeroen; Lu, Qianhao; Koutroumpas, Konstantinos; Nguyen, Thanh-Minh T.; Texier, Yves; van Beersum, Sylvia E. C.; Horn, Nicola; Willer, Jason R.; Mans, Dorus A.; Dougherty, Gerard; Lamers, Ideke J. C.; Coene, Karlien L. M.; Arts, Heleen H.; Betts, Matthew J.; Beyer, Tina; Bolat, Emine; Gloeckner, Christian Johannes; Haidari, Khatera; Hetterschijt, Lisette; Iaconis, Daniela; Jenkins, Dagan; Klose, Franziska; Knapp, Barbara; Latour, Brooke; Letteboer, Stef J. F.; Marcelis, Carlo L.; Mitic, Dragana; Morleo, Manuela; Oud, Machteld M.; Riemersma, Moniek; Rix, Susan; Terhal, Paulien A.; Toedt, Grischa; van Dam, Teunis J. P.; de Vrieze, Erik; Wissinger, Yasmin; Wu, Ka Man; Apic, Gordana; Beales, Philip L.; Blacque, Oliver E.; Gibson, Toby J.; Huynen, Martijn A.; Katsanis, Nicholas; Kremer, Hannie; Omran, Heymut; van Wijk, Erwin; Wolfrum, Uwe; Kepes, François; Davis, Erica E.; Franco, Brunella; Giles, Rachel H.; Ueffing, Marius; Russell, Robert B.; Roepman, Ronald; Al-Turki, Saeed; Anderson, Carl; Antony, Dinu; Barroso, Inês; Bentham, Jamie; Bhattacharya, Shoumo; Carss, Keren; Chatterjee, Krishna; Cirak, Sebahattin; Cosgrove, Catherine; Danecek, Petr; Durbin, Richard; Fitzpatrick, David; Floyd, Jamie; Reghan Foley, A.; Franklin, Chris; Futema, Marta; Humphries, Steve E.; Hurles, Matt; Joyce, Chris; McCarthy, Shane; Mitchison, Hannah M.; Muddyman, Dawn; Muntoni, Francesco; O'Rahilly, Stephen; Onoufriadis, Alexandros; Payne, Felicity; Plagnol, Vincent; Raymond, Lucy; Savage, David B.; Scambler, Peter; Schmidts, Miriam; Schoenmakers, Nadia; Semple, Robert; Serra, Eva; Stalker, Jim; van Kogelenberg, Margriet; Vijayarangakannan, Parthiban; Walter, Klaudia; Whittall, Ros; Williamson, Kathy
2016-01-01
Cellular organelles provide opportunities to relate biological mechanisms to disease. Here we use affinity proteomics, genetics and cell biology to interrogate cilia: poorly understood organelles, where defects cause genetic diseases. Two hundred and seventeen tagged human ciliary proteins create a final landscape of 1,319 proteins, 4,905 interactions and 52 complexes. Reverse tagging, repetition of purifications, and statistical analyses produce a high-resolution network that reveals organelle-specific interactions and complexes not apparent in larger studies, and links vesicle transport, the cytoskeleton, signalling and ubiquitination to ciliary signalling and proteostasis. We observe sub-complexes in exocyst and intraflagellar transport complexes, which we validate biochemically, and by probing structurally predicted, disruptive, genetic variants from ciliary disease patients. The landscape suggests other genetic diseases could be ciliary, including 3M syndrome. We show that 3M genes are involved in ciliogenesis, and that patient fibroblasts lack cilia. Overall, this organelle-specific targeting strategy shows considerable promise for Systems Medicine. PMID:27173435
Castro, Marcelo P; Pataky, Todd C; Sole, Gisela; Vilas-Boas, Joao Paulo
2015-07-16
Ground reaction force (GRF) data from men and women are commonly pooled for analyses. However, it may not be justifiable to pool sexes on the basis of discrete parameters extracted from continuous GRF gait waveforms because this can miss continuous effects. Forty healthy participants (20 men and 20 women) walked at a cadence of 100 steps per minute across two force plates, recording GRFs. Two statistical methods were used to test the null hypothesis of no mean GRF differences between sexes: (i) Statistical Parametric Mapping-using the entire three-component GRF waveform; and (ii) traditional approach-using the first and second vertical GRF peaks. Statistical Parametric Mapping results suggested large sex differences, which post-hoc analyses suggested were due predominantly to higher anterior-posterior and vertical GRFs in early stance in women compared to men. Statistically significant sex differences were observed for the first GRF peak, whereas values were similar for the second GRF peak. These contrasting results emphasise that different parts of the waveform have different signal strengths and thus that one may use the traditional approach to choose arbitrary metrics and make arbitrary conclusions. We suggest that researchers and clinicians consider both the entire gait waveforms and sex-specificity when analysing GRF data. Copyright © 2015 Elsevier Ltd. All rights reserved.
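The contrast between waveform-level and peak-based testing can be illustrated with a toy example (synthetic waveforms, not the study's data): pointwise two-sample t-tests across a simulated stance-phase waveform flag the early-stance region where the groups differ. A real Statistical Parametric Mapping analysis additionally corrects for multiple comparisons across the field using random field theory, which this sketch omits.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
t_ax = np.linspace(0.0, 1.0, 101)  # normalized stance phase (0-100%)

def grf(early_amp, n):
    """Synthetic vertical-GRF-like waveforms for n subjects; early_amp
    adds a hypothetical early-stance bump around 20% stance."""
    base = np.sin(np.pi * t_ax) + 0.3 * np.sin(2 * np.pi * t_ax)
    bump = early_amp * np.exp(-((t_ax - 0.2) / 0.08) ** 2)
    return base + bump + rng.normal(0.0, 0.05, (n, t_ax.size))

men = grf(0.00, 20)    # no early-stance bump
women = grf(0.15, 20)  # higher early-stance GRF (as the study reports)

# "SPM-like" pointwise t-statistics across the whole waveform
t_vals, p_vals = stats.ttest_ind(women, men, axis=0)
sig = p_vals < 0.05  # uncorrected; SPM would correct across the field

print("significant nodes in early stance (15-25%):", int(sig[15:26].sum()))
```

A peak-only analysis would test just one or two of these 101 nodes, which is why it can miss (or overstate) differences localized elsewhere in the waveform.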
ERIC Educational Resources Information Center
Merrill, Ray M.; Chatterley, Amanda; Shields, Eric C.
2005-01-01
This study explored the effectiveness of selected statistical measures at motivating or maintaining regular exercise among college students. The study also considered whether ease in understanding these statistical measures was associated with perceived effectiveness at motivating or maintaining regular exercise. Analyses were based on a…
ERIC Educational Resources Information Center
Petocz, Peter; Sowey, Eric
2012-01-01
The term "data snooping" refers to the practice of choosing which statistical analyses to apply to a set of data after having first looked at those data. Data snooping contradicts a fundamental precept of applied statistics, that the scheme of analysis is to be planned in advance. In this column, the authors shall elucidate the…
The Empirical Nature and Statistical Treatment of Missing Data
ERIC Educational Resources Information Center
Tannenbaum, Christyn E.
2009-01-01
Introduction. Missing data is a common problem in research and can produce severely misleading analyses, including biased estimates of statistical parameters, and erroneous conclusions. In its 1999 report, the APA Task Force on Statistical Inference encouraged authors to report complications such as missing data and discouraged the use of…
ERIC Educational Resources Information Center
Norris, John M.
2015-01-01
Traditions of statistical significance testing in second language (L2) quantitative research are strongly entrenched in how researchers design studies, select analyses, and interpret results. However, statistical significance tests using "p" values are commonly misinterpreted by researchers, reviewers, readers, and others, leading to…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-05
...] Guidance for Industry on Documenting Statistical Analysis Programs and Data Files; Availability AGENCY... Programs and Data Files.'' This guidance is provided to inform study statisticians of recommendations for documenting statistical analyses and data files submitted to the Center for Veterinary Medicine (CVM) for the...
How did the economic recession (2008-2010) influence traffic fatalities in OECD-countries?
Wegman, Fred; Allsop, Richard; Antoniou, Constantinos; Bergel-Hayat, Ruth; Elvik, Rune; Lassarre, Sylvain; Lloyd, Daryl; Wijnen, Wim
2017-05-01
This paper presents analyses of how the economic recession that started in 2008 has influenced the number of traffic fatalities in OECD countries. Previous studies of the relationship between economic recessions and changes in the number of traffic fatalities are reviewed. Based on these studies, a causal diagram of the relationship between changes of the business cycle and changes in the number of traffic fatalities is proposed. This causal model is tested empirically by means of multivariate analyses and analyses of accident statistics for Great Britain and Sweden. Economic recession, as indicated both by slower growth or decline of gross national product and by increased unemployment, is associated with an accelerated decline in the number of traffic fatalities, i.e. a larger decline than the long-term trend that is normal in OECD countries. The principal mechanisms bringing this about are a disproportionate reduction of driving among high-risk drivers, in particular young drivers, and a reduction of fatality rate per kilometre of travel, probably attributable to changes in road user behaviour that are only partly observable. The total number of vehicle kilometres of travel did not change very much as a result of the recession. The paper is based on an ITF-report that presents the analyses in greater detail. Copyright © 2017 Elsevier Ltd. All rights reserved.
Development of the Statistical Reasoning in Biology Concept Inventory (SRBCI).
Deane, Thomas; Nomme, Kathy; Jeffery, Erica; Pollock, Carol; Birol, Gülnur
2016-01-01
We followed established best practices in concept inventory design and developed a 12-item inventory to assess student ability in statistical reasoning in biology (Statistical Reasoning in Biology Concept Inventory [SRBCI]). It is important to assess student thinking in this conceptual area, because it is a fundamental requirement of being statistically literate and associated skills are needed in almost all walks of life. Despite this, previous work shows that non-expert-like thinking in statistical reasoning is common, even after instruction. As science educators, our goal should be to move students along a novice-to-expert spectrum, which could be achieved with growing experience in statistical reasoning. We used item response theory analyses (the one-parameter Rasch model and associated analyses) to assess responses gathered from biology students in two populations at a large research university in Canada in order to test SRBCI's robustness and sensitivity in capturing useful data relating to the students' conceptual ability in statistical reasoning. Our analyses indicated that SRBCI is a unidimensional construct, with items that vary widely in difficulty and provide useful information about such student ability. SRBCI should be useful as a diagnostic tool in a variety of biology settings and as a means of measuring the success of teaching interventions designed to improve statistical reasoning skills. © 2016 T. Deane et al. CBE—Life Sciences Education © 2016 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).
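The one-parameter Rasch model underlying these analyses has a simple closed form: the probability of a correct response depends only on the difference between student ability θ and item difficulty b. The sketch below uses hypothetical difficulties and simulated students, not SRBCI data, and shows the model along with the expected ordering of item difficulty in observed proportions correct.

```python
import numpy as np

rng = np.random.default_rng(3)

def rasch_p(theta, b):
    """One-parameter Rasch model: P(correct | ability theta, difficulty b)."""
    return 1.0 / (1.0 + np.exp(-(theta - b)))

# Simulate 12 items (as in the SRBCI) answered by 500 hypothetical students
difficulties = np.linspace(-2.0, 2.0, 12)  # invented item difficulties
abilities = rng.normal(0.0, 1.0, 500)      # invented student abilities
p = rasch_p(abilities[:, None], difficulties[None, :])
responses = rng.uniform(size=p.shape) < p  # Bernoulli responses

# Easier items should be answered correctly more often
prop_correct = responses.mean(axis=0)
print(np.round(prop_correct, 2))
```

Fitting the model to real response data (estimating θ and b jointly) requires maximum-likelihood routines such as those in dedicated IRT packages; the simulation above only shows the model's forward direction.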
Evaluation and application of summary statistic imputation to discover new height-associated loci.
Rüeger, Sina; McDaid, Aaron; Kutalik, Zoltán
2018-05-01
As most of the heritability of complex traits is attributed to common and low frequency genetic variants, imputing them by combining genotyping chips and large sequenced reference panels is the most cost-effective approach to discover the genetic basis of these traits. Association summary statistics from genome-wide meta-analyses are available for hundreds of traits. Updating these to ever-increasing reference panels is very cumbersome as it requires reimputation of the genetic data, rerunning the association scan, and meta-analysing the results. A much more efficient method is to directly impute the summary statistics, termed summary statistics imputation, which we improved to accommodate variable sample size across SNVs. Its performance relative to genotype imputation and practical utility has not yet been fully investigated. To this end, we compared the two approaches on real (genotyped and imputed) data from 120K samples from the UK Biobank and show that genotype imputation boasts a 3- to 5-fold lower root-mean-square error, and better distinguishes true associations from null ones: We observed the largest differences in power for variants with low minor allele frequency and low imputation quality. For fixed false positive rates of 0.001, 0.01, 0.05, using summary statistics imputation yielded a decrease in statistical power by 9, 43 and 35%, respectively. To test its capacity to discover novel associations, we applied summary statistics imputation to the GIANT height meta-analysis summary statistics covering HapMap variants, and identified 34 novel loci, 19 of which replicated using data in the UK Biobank. Additionally, we successfully replicated 55 out of the 111 variants published in an exome chip study. Our study demonstrates that summary statistics imputation is a very efficient and cost-effective way to identify and fine-map trait-associated loci. 
Moreover, the ability to impute summary statistics is important for follow-up analyses, such as Mendelian randomisation or LD-score regression.
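The core computation this abstract describes — imputing z-scores at untyped variants from typed ones — reduces, under a multivariate-normal model of summary statistics, to a conditional expectation weighted by LD. A minimal sketch, with an illustrative ridge term `lam` for numerical stability (function and argument names are ours, not the authors' software):

```python
import numpy as np

def impute_zscores(z_typed, ld_tt, ld_ut, lam=1e-3):
    """Impute z-scores of untyped variants from typed ones.

    z_typed: (t,)   z-scores at typed SNVs
    ld_tt:   (t, t) LD correlation among typed SNVs
    ld_ut:   (u, t) LD between untyped and typed SNVs
    lam:     ridge regularization so the inverse is stable
    """
    t = ld_tt.shape[0]
    # E[z_untyped | z_typed] for jointly normal z-scores:
    # Sigma_ut @ Sigma_tt^{-1} @ z_typed
    return ld_ut @ np.linalg.solve(ld_tt + lam * np.eye(t), z_typed)
```

If an "untyped" variant is in perfect LD with one typed SNV and uncorrelated with the rest, its imputed z-score recovers that SNV's observed z-score (up to the ridge shrinkage).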
NASA Astrophysics Data System (ADS)
Pieczara, Łukasz
2015-09-01
The paper presents the results of analysis of surface roughness parameters in the Krosno Sandstones of Mucharz, southern Poland. It was aimed at determining whether these parameters are influenced by structural features (mainly the laminar distribution of mineral components and directional distribution of non-isometric grains) and fracture processes. The tests applied in the analysis enabled us to determine and describe the primary statistical parameters used in the quantitative description of surface roughness, as well as specify the usefulness of contact profilometry as a method of visualizing spatial differentiation of fracture processes in rocks. These aims were achieved by selecting a model material (Krosno Sandstones from the Górka-Mucharz Quarry) and an appropriate research methodology. The schedule of laboratory analyses included: identification analyses connected with non-destructive ultrasonic tests, aimed at the preliminary determination of rock anisotropy, strength point load tests (cleaved surfaces were obtained due to destruction of rock samples), microscopic analysis (observation of thin sections in order to determine the mechanism of inducing fracture processes) and a test method of measuring surface roughness (two- and three-dimensional diagrams, topographic and contour maps, and statistical parameters of surface roughness). The highest values of roughness indicators were achieved for surfaces formed under the influence of intragranular fracture processes (cracks propagating directly through grains). This is related to the structural features of the Krosno Sandstones (distribution of lamination and bedding).
Statistical mechanics in the context of special relativity.
Kaniadakis, G
2002-11-01
In Ref. [Physica A 296, 405 (2001)], starting from the one-parameter deformation of the exponential function exp_κ(x) = (√(1 + κ²x²) + κx)^(1/κ), a statistical mechanics has been constructed which reduces to the ordinary Boltzmann-Gibbs statistical mechanics as the deformation parameter κ approaches zero. The distribution f = exp_κ(−βE + βμ) obtained within this statistical mechanics shows a power-law tail and depends on the unspecified parameter β, containing all the information about the temperature of the system. On the other hand, the entropic form S_κ = ∫ d³p (c_κ f^(1+κ) + c_{−κ} f^(1−κ)), which after maximization produces the distribution f and reduces to the standard Boltzmann-Shannon entropy S₀ as κ → 0, contains the coefficient c_κ whose expression involves, besides the Boltzmann constant, another unspecified parameter α. In the present effort we show that S_κ is the unique existing entropy obtained by a continuous deformation of S₀ that preserves unaltered its fundamental properties of concavity, additivity, and extensivity. These properties of S_κ permit us to determine unequivocally the values of the above-mentioned parameters β and α. Subsequently, we explain the origin of the deformation mechanism introduced by κ and show that this deformation emerges naturally within Einstein's special relativity. Furthermore, we extend the theory in order to treat statistical systems in a time-dependent and relativistic context. Then, we show that it is possible to determine, in a self-consistent scheme within special relativity, the value of the free parameter κ, which turns out to depend on the light speed c and reduces to zero as c → ∞, recovering in this way the ordinary statistical mechanics and thermodynamics.
The statistical mechanics presented here contains no free parameters, preserves unaltered the mathematical and epistemological structure of ordinary statistical mechanics, and is suitable for describing a very large class of experimentally observed phenomena in low- and high-energy physics and in the natural, economic, and social sciences. Finally, in order to test the correctness and predictive power of the theory, as a working example we consider the cosmic-ray spectrum, which spans 13 decades in energy and 33 decades in flux, finding high-quality agreement between our predictions and the observed data.
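The deformed exponential this abstract builds on is simple to state in code. A sketch verifying its two key limits — the ordinary exponential as κ → 0, and a power-law tail ~ (2κx)^(1/κ) for large argument:

```python
import math

def exp_kappa(x, kappa):
    """One-parameter deformed exponential of Kaniadakis statistics.

    exp_kappa(x) = (sqrt(1 + kappa^2 x^2) + kappa*x)^(1/kappa);
    reduces to the ordinary exp(x) as kappa -> 0.
    """
    if kappa == 0:
        return math.exp(x)
    return (math.sqrt(1 + kappa**2 * x**2) + kappa * x) ** (1 / kappa)
```

For small κ the deformation is invisible near the origin; for large x the distribution f = exp_κ(−βE + βμ) inherits the power-law tail that distinguishes this statistics from Boltzmann-Gibbs.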
ParallABEL: an R library for generalized parallelization of genome-wide association studies.
Sangket, Unitsa; Mahasirimongkol, Surakameth; Chantratita, Wasun; Tandayya, Pichaya; Aulchenko, Yurii S
2010-04-29
Genome-Wide Association (GWA) analysis is a powerful method for identifying loci associated with complex traits and drug response. Parts of GWA analyses, especially those involving thousands of individuals and consuming hours to months, will benefit from parallel computation. It is arduous acquiring the necessary programming skills to correctly partition and distribute data, control and monitor tasks on clustered computers, and merge output files. Most components of GWA analysis can be divided into four groups based on the types of input data and statistical outputs. The first group contains statistics computed for a particular Single Nucleotide Polymorphism (SNP), or trait, such as SNP characterization statistics or association test statistics. The input data of this group includes the SNPs/traits. The second group concerns statistics characterizing an individual in a study, for example, the summary statistics of genotype quality for each sample. The input data of this group includes individuals. The third group consists of pair-wise statistics derived from analyses between each pair of individuals in the study, for example genome-wide identity-by-state or genomic kinship analyses. The input data of this group includes pairs of individuals. The final group concerns pair-wise statistics derived for pairs of SNPs, such as linkage disequilibrium characterisation. The input data of this group includes pairs of SNPs. We developed the ParallABEL library, which utilizes the Rmpi library, to parallelize these four types of computations. The ParallABEL library is not only aimed at GenABEL, but may also be employed to parallelize various GWA packages in R. The data set from the North American Rheumatoid Arthritis Consortium (NARAC), comprising 2,062 individuals genotyped at 545,080 SNPs, was used to measure ParallABEL performance. Almost perfect speed-up was achieved for many types of analyses.
For example, the computing time for the identity-by-state matrix was linearly reduced from approximately eight hours to one hour when ParallABEL employed eight processors. Executing genome-wide association analysis using the ParallABEL library on a computer cluster is an effective way to boost performance, and simplify the parallelization of GWA studies. ParallABEL is a user-friendly parallelization of GenABEL.
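The third group of computations — pair-wise statistics over individuals, such as identity-by-state — is the type whose eight-fold speed-up the abstract reports. A serial sketch of the underlying IBS matrix, whose outer loop over individuals is what a tool like ParallABEL distributes across processors (the coding and function names here are illustrative, not GenABEL's implementation):

```python
import numpy as np

def ibs_matrix(genotypes):
    """Pairwise identity-by-state for an individuals-by-SNPs genotype
    matrix coded 0/1/2 (minor-allele counts).

    IBS(i, j) = 1 - mean(|g_i - g_j|) / 2, so identical genotype rows
    score 1.0 and opposite homozygotes score 0.0.
    """
    g = np.asarray(genotypes, dtype=float)
    n = g.shape[0]
    out = np.ones((n, n))
    # The rows of this loop are independent, which is why the pairwise
    # group parallelizes so well: each worker takes a block of i values.
    for i in range(n):
        for j in range(i + 1, n):
            s = 1.0 - np.mean(np.abs(g[i] - g[j])) / 2.0
            out[i, j] = out[j, i] = s
    return out
```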
Saffran, Jenny R.; Kirkham, Natasha Z.
2017-01-01
Perception involves making sense of a dynamic, multimodal environment. In the absence of mechanisms capable of exploiting the statistical patterns in the natural world, infants would face an insurmountable computational problem. Infant statistical learning mechanisms facilitate the detection of structure. These abilities allow the infant to compute across elements in their environmental input, extracting patterns for further processing and subsequent learning. In this selective review, we summarize findings that show that statistical learning is both a broad and flexible mechanism (supporting learning from different modalities across many different content areas) and input specific (shifting computations depending on the type of input and goal of learning). We suggest that statistical learning not only provides a framework for studying language development and object knowledge in constrained laboratory settings, but also allows researchers to tackle real-world problems, such as multilingualism, the role of ever-changing learning environments, and differential developmental trajectories. PMID:28793812
Damiani, Lucas Petri; Berwanger, Otavio; Paisani, Denise; Laranjeira, Ligia Nasi; Suzumura, Erica Aranha; Amato, Marcelo Britto Passos; Carvalho, Carlos Roberto Ribeiro; Cavalcanti, Alexandre Biasi
2017-01-01
Background: The Alveolar Recruitment for Acute Respiratory Distress Syndrome Trial (ART) is an international multicenter randomized pragmatic controlled trial with allocation concealment involving 120 intensive care units in Brazil, Argentina, Colombia, Italy, Poland, Portugal, Malaysia, Spain, and Uruguay. The primary objective of ART is to determine whether maximum stepwise alveolar recruitment associated with PEEP titration, adjusted according to the static compliance of the respiratory system (ART strategy), is able to increase 28-day survival in patients with acute respiratory distress syndrome compared to conventional treatment (ARDSNet strategy). Objective: To describe the data management process and statistical analysis plan. Methods: The statistical analysis plan was designed by the trial executive committee and reviewed and approved by the trial steering committee. We provide an overview of the trial design with a special focus on describing the primary (28-day survival) and secondary outcomes. We describe our data management process, data monitoring committee, interim analyses, and sample size calculation. We describe our planned statistical analyses for primary and secondary outcomes as well as pre-specified subgroup analyses. We also provide details for presenting results, including mock tables for baseline characteristics, adherence to the protocol and effect on clinical outcomes. Conclusion: According to best trial practice, we report our statistical analysis plan and data management plan prior to locking the database and beginning analyses. We anticipate that this document will prevent analysis bias and enhance the utility of the reported results. Trial registration: ClinicalTrials.gov number, NCT01374022. PMID:28977255
DNA viewed as an out-of-equilibrium structure
NASA Astrophysics Data System (ADS)
Provata, A.; Nicolis, C.; Nicolis, G.
2014-05-01
The complexity of the primary structure of human DNA is explored using methods from nonequilibrium statistical mechanics, dynamical systems theory, and information theory. A collection of statistical analyses is performed on the DNA data and the results are compared with sequences derived from different stochastic processes. The use of χ² tests shows that DNA cannot be described as a low-order Markov chain of order up to r = 6. Although detailed balance seems to hold at the level of a binary alphabet, it fails when all four base pairs are considered, suggesting spatial asymmetry and irreversibility. Furthermore, the block entropy does not increase linearly with the block size, reflecting the long-range nature of the correlations in the human genomic sequences. To probe locally the spatial structure of the chain, we study the exit distances from a specific symbol, the distribution of recurrence distances, and the Hurst exponent, all of which show power law tails and long-range characteristics. These results suggest that human DNA can be viewed as a nonequilibrium structure maintained in its state through interactions with a constantly changing environment. Based solely on the exit distance distribution accounting for the nonequilibrium statistics and using the Monte Carlo rejection sampling method, we construct a model DNA sequence. This method allows us to keep both long- and short-range statistical characteristics of the native DNA data. The model sequence presents the same characteristic exponents as the natural DNA but fails to capture spatial correlations and point-to-point details.
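The block-entropy diagnostic used above is straightforward to compute: for an uncorrelated sequence it grows linearly with block size, so sublinear growth signals long-range correlations. A minimal sketch (an illustration of the quantity, not the authors' code):

```python
import math
from collections import Counter

def block_entropy(seq, n):
    """Shannon entropy (in bits) of the overlapping length-n blocks
    of a symbol sequence. For an i.i.d. uniform binary sequence this
    grows ~ n bits; sublinear growth indicates long-range correlations."""
    blocks = [tuple(seq[i:i + n]) for i in range(len(seq) - n + 1)]
    total = len(blocks)
    return -sum((c / total) * math.log2(c / total)
                for c in Counter(blocks).values())
```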
Ventral and dorsal streams for choosing word order during sentence production
Thothathiri, Malathi; Rattinger, Michelle
2015-01-01
Proficient language use requires speakers to vary word order and choose between different ways of expressing the same meaning. Prior statistical associations between individual verbs and different word orders are known to influence speakers’ choices, but the underlying neural mechanisms are unknown. Here we show that distinct neural pathways are used for verbs with different statistical associations. We manipulated statistical experience by training participants in a language containing novel verbs and two alternative word orders (agent-before-patient, AP; patient-before-agent, PA). Some verbs appeared exclusively in AP, others exclusively in PA, and yet others in both orders. Subsequently, we used sparse sampling neuroimaging to examine the neural substrates as participants generated new sentences in the scanner. Behaviorally, participants showed an overall preference for AP order, but also increased PA order for verbs experienced in that order, reflecting statistical learning. Functional activation and connectivity analyses revealed distinct networks underlying the increased PA production. Verbs experienced in both orders during training preferentially recruited a ventral stream, indicating the use of conceptual processing for mapping meaning to word order. In contrast, verbs experienced solely in PA order recruited dorsal pathways, indicating the use of selective attention and sensorimotor integration for choosing words in the right order. These results show that the brain tracks the structural associations of individual verbs and that the same structural output may be achieved via ventral or dorsal streams, depending on the type of regularities in the input. PMID:26621706
Formalizing the definition of meta-analysis in Molecular Ecology.
ArchMiller, Althea A; Bauer, Eric F; Koch, Rebecca E; Wijayawardena, Bhagya K; Anil, Ammu; Kottwitz, Jack J; Munsterman, Amelia S; Wilson, Alan E
2015-08-01
Meta-analysis, the statistical synthesis of pertinent literature to develop evidence-based conclusions, is relatively new to the field of molecular ecology, with the first meta-analysis published in the journal Molecular Ecology in 2003 (Slate & Phua 2003). The goal of this article is to formalize the definition of meta-analysis for the authors, editors, reviewers and readers of Molecular Ecology by completing a review of the meta-analyses previously published in this journal. We also provide a brief overview of the many components required for meta-analysis with a more specific discussion of the issues related to the field of molecular ecology, including the use and statistical considerations of Wright's FST and its related analogues as effect sizes in meta-analysis. We performed a literature review to identify articles published as 'meta-analyses' in Molecular Ecology, which were then evaluated by at least two reviewers. We specifically targeted Molecular Ecology publications because as a flagship journal in this field, meta-analyses published in Molecular Ecology have the potential to set the standard for meta-analyses in other journals. We found that while many of these reviewed articles were strong meta-analyses, others failed to follow standard meta-analytical techniques. One of these unsatisfactory meta-analyses was in fact a secondary analysis. Other studies attempted meta-analyses but lacked the fundamental statistics that are considered necessary for an effective and powerful meta-analysis. By drawing attention to the inconsistency of studies labelled as meta-analyses, we emphasize the importance of understanding the components of traditional meta-analyses to fully embrace the strengths of quantitative data synthesis in the field of molecular ecology. © 2015 John Wiley & Sons Ltd.
Statistical power analyses using G*Power 3.1: tests for correlation and regression analyses.
Faul, Franz; Erdfelder, Edgar; Buchner, Axel; Lang, Albert-Georg
2009-11-01
G*Power is a free power analysis program for a variety of statistical tests. We present extensions and improvements of the version introduced by Faul, Erdfelder, Lang, and Buchner (2007) in the domain of correlation and regression analyses. In the new version, we have added procedures to analyze the power of tests based on (1) single-sample tetrachoric correlations, (2) comparisons of dependent correlations, (3) bivariate linear regression, (4) multiple linear regression based on the random predictor model, (5) logistic regression, and (6) Poisson regression. We describe these new features and provide a brief introduction to their scope and handling.
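For the simplest case this abstract covers — the power of a two-sided test of a single bivariate correlation — the Fisher z transform gives a closed-form approximation. A sketch (an approximation for illustration, not G*Power's exact routines):

```python
import math
from statistics import NormalDist

def power_correlation(r, n, alpha=0.05):
    """Approximate power of a two-sided test of H0: rho = 0 given a
    true correlation r and sample size n, via the Fisher z transform:
    atanh(r_hat) is ~Normal(atanh(r), 1/(n-3))."""
    z = math.atanh(r) * math.sqrt(n - 3)
    zcrit = NormalDist().inv_cdf(1 - alpha / 2)
    return (1 - NormalDist().cdf(zcrit - z)) + NormalDist().cdf(-zcrit - z)
```

At r = 0 the "power" equals the significance level α, as it should, and power increases monotonically with n for any fixed r ≠ 0.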
Probabilistic tsunami hazard analysis: Multiple sources and global applications
Grezio, Anita; Babeyko, Andrey; Baptista, Maria Ana; Behrens, Jörn; Costa, Antonio; Davies, Gareth; Geist, Eric L.; Glimsdal, Sylfest; González, Frank I.; Griffin, Jonathan; Harbitz, Carl B.; LeVeque, Randall J.; Lorito, Stefano; Løvholt, Finn; Omira, Rachid; Mueller, Christof; Paris, Raphaël; Parsons, Thomas E.; Polet, Jascha; Power, William; Selva, Jacopo; Sørensen, Mathilde B.; Thio, Hong Kie
2017-01-01
Applying probabilistic methods to infrequent but devastating natural events is intrinsically challenging. For tsunami analyses, a suite of geophysical assessments should in principle be evaluated because of the different causes generating tsunamis (earthquakes, landslides, volcanic activity, meteorological events, and asteroid impacts) with varying mean recurrence rates. Probabilistic Tsunami Hazard Analyses (PTHAs) are conducted in different areas of the world at global, regional, and local scales with the aim of understanding tsunami hazard to inform tsunami risk reduction activities. PTHAs enhance knowledge of the potential tsunamigenic threat by estimating the probability of exceeding specific levels of tsunami intensity metrics (e.g., run-up or maximum inundation heights) within a certain period of time (exposure time) at given locations (target sites); these estimates can be summarized in hazard maps or hazard curves. This discussion presents a broad overview of PTHA, including (i) sources and mechanisms of tsunami generation, emphasizing the variety and complexity of the tsunami sources and their generation mechanisms, (ii) developments in modeling the propagation and impact of tsunami waves, and (iii) statistical procedures for tsunami hazard estimates that include the associated epistemic and aleatoric uncertainties. Key elements in understanding the potential tsunami hazard are discussed, in light of the rapid development of PTHA methods during the last decade and the globally distributed applications, including the importance of considering multiple sources, their relative intensities, probabilities of occurrence, and uncertainties in an integrated and consistent probabilistic framework.
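The hazard-curve quantity PTHA reports — the probability of exceeding an intensity level within an exposure time — is commonly computed from a mean exceedance rate under a Poisson occurrence assumption. A minimal sketch of that step (illustrative, not any specific PTHA code):

```python
import math

def exceedance_probability(rate_per_year, exposure_years):
    """Probability of at least one exceedance during the exposure time,
    assuming Poissonian (memoryless) event occurrence:
    P = 1 - exp(-rate * T)."""
    return 1 - math.exp(-rate_per_year * exposure_years)
```

With a mean return period of 475 years and a 50-year exposure this yields roughly 10%, the familiar benchmark carried over from probabilistic seismic hazard analysis.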
Variation of a test's sensitivity and specificity with disease prevalence.
Leeflang, Mariska M G; Rutjes, Anne W S; Reitsma, Johannes B; Hooft, Lotty; Bossuyt, Patrick M M
2013-08-06
Anecdotal evidence suggests that the sensitivity and specificity of a diagnostic test may vary with disease prevalence. Our objective was to investigate the associations between disease prevalence and test sensitivity and specificity using studies of diagnostic accuracy. We used data from 23 meta-analyses, each of which included 10-39 studies (416 total). The median prevalence per review ranged from 1% to 77%. We evaluated the effects of prevalence on sensitivity and specificity using a bivariate random-effects model for each meta-analysis, with prevalence as a covariate. We estimated the overall effect of prevalence by pooling the effects using the inverse variance method. Within a given review, a change in prevalence from the lowest to highest value resulted in a corresponding change in sensitivity or specificity from 0 to 40 percentage points. This effect was statistically significant (p < 0.05) for either sensitivity or specificity in 8 meta-analyses (35%). Overall, specificity tended to be lower with higher disease prevalence; there was no such systematic effect for sensitivity. The sensitivity and specificity of a test often vary with disease prevalence; this effect is likely to be the result of mechanisms, such as patient spectrum, that affect prevalence, sensitivity and specificity. Because it may be difficult to identify such mechanisms, clinicians should use prevalence as a guide when selecting studies that most closely match their situation.
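The pooling step this abstract describes — combining per-review prevalence effects by inverse variance — is a one-liner. A fixed-effect sketch (the study itself used a bivariate random-effects model, which this simplifies):

```python
def inverse_variance_pool(effects, variances):
    """Fixed-effect inverse-variance pooling: each estimate is weighted
    by the reciprocal of its variance. Returns the pooled estimate and
    its variance (the reciprocal of the summed weights)."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    return pooled, 1.0 / sum(weights)
```

More precise studies (smaller variances) pull the pooled estimate toward themselves, which is the intended behaviour of the method.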
Getting a scientific paper published in Epilepsia: an editor's perspective.
Schwartzkroin, Philip A
2013-11-01
Getting a paper published in Epilepsia depends first and foremost on the quality of the work reported, and on the clarity and convincingness of the presentation. Papers should focus on important and interesting topics with clearly stated objectives and goals. The observations and findings are of greatest interest when they are novel and change our views on the mechanisms and/or treatment of an epileptic disease. Studies should be carefully designed to include adequate sample size, comparison groups, and statistical analyses. Critically, the data must be clearly presented and appropriately interpreted. If followed, these recommendations will improve an author's chances of having his/her paper accepted in a high quality journal like Epilepsia. Wiley Periodicals, Inc. © 2013 International League Against Epilepsy.
Antipodal hotspot pairs on the earth
NASA Technical Reports Server (NTRS)
Rampino, Michael R.; Caldeira, Ken
1992-01-01
The results of statistical analyses performed on three published hotspot distributions suggest that significantly more hotspots occur as nearly antipodal pairs than is anticipated from a random distribution, or from their association with geoid highs and divergent plate margins. The observed number of antipodal hotspot pairs depends on the maximum allowable deviation from exact antipodality. At a maximum deviation of not greater than 700 km, 26 to 37 percent of hotspots form antipodal pairs in the published lists examined here, significantly more than would be expected from the general hotspot distribution. Two possible mechanisms that might create such a distribution include: (1) symmetry in the generation of mantle plumes, and (2) melting related to antipodal focusing of seismic energy from large-body impacts.
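The pairing criterion in this abstract is concrete: two hotspots form a pair if one lies within 700 km of the other's antipode. A sketch of counting such pairs with the haversine distance (an illustrative reimplementation, not the authors' code):

```python
import math

R_EARTH_KM = 6371.0

def great_circle_km(lat1, lon1, lat2, lon2):
    """Haversine great-circle distance in km between two (lat, lon) points."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = p2 - p1
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * R_EARTH_KM * math.asin(math.sqrt(a))

def antipodal_pairs(hotspots, max_dev_km=700.0):
    """Index pairs (i, j) where hotspot j lies within max_dev_km of the
    antipode (-lat_i, lon_i + 180) of hotspot i. The haversine formula is
    periodic in longitude, so lon + 180 needs no wrapping."""
    pairs = []
    for i, (la1, lo1) in enumerate(hotspots):
        anti_lat, anti_lon = -la1, lo1 + 180.0
        for j in range(i + 1, len(hotspots)):
            la2, lo2 = hotspots[j]
            if great_circle_km(anti_lat, anti_lon, la2, lo2) <= max_dev_km:
                pairs.append((i, j))
    return pairs
```

The statistical question in the paper is then whether the observed count of such pairs exceeds what random or geoid-correlated hotspot placements would produce.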
Annual modulation of seismicity along the San Andreas Fault near Parkfield, CA
Christiansen, L.B.; Hurwitz, S.; Ingebritsen, S.E.
2007-01-01
We analyze seismic data from the San Andreas Fault (SAF) near Parkfield, California, to test for annual modulation in seismicity rates. We use statistical analyses to show that seismicity is modulated with an annual period in the creeping section of the fault and a semiannual period in the locked section of the fault. Although the exact mechanism for seasonal triggering is undetermined, it appears that stresses associated with the hydrologic cycle are sufficient to fracture critically stressed rocks either through pore-pressure diffusion or crustal loading/unloading. These results shed additional light on the state of stress along the SAF, indicating that hydrologically induced stress perturbations of ~2 kPa may be sufficient to trigger earthquakes.
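A standard way to test for the kind of annual modulation described here is the Schuster test, whose p-value is exp(−R²/N) for the resultant length R of N event phases. A sketch (the authors' exact statistical procedure may differ):

```python
import math

def schuster_p(event_days, period_days=365.25):
    """Schuster test p-value for periodic modulation of event times.

    Each event time is mapped to a phase on [0, 2*pi) at the candidate
    period; p = exp(-R^2 / N) where R is the vector resultant length.
    A small p-value indicates non-uniform phases (seasonality)."""
    phases = [2 * math.pi * (t % period_days) / period_days for t in event_days]
    c = sum(math.cos(ph) for ph in phases)
    s = sum(math.sin(ph) for ph in phases)
    n = len(event_days)
    return math.exp(-(c * c + s * s) / n)
```

Events recurring at the same time of year give a tiny p-value; events spread evenly through the year give p near 1.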
SEER Cancer Query Systems (CanQues)
These applications provide access to cancer statistics including incidence, mortality, survival, prevalence, and probability of developing or dying from cancer. Users can display reports of the statistics or extract them for additional analyses.
From Mechanical Motion to Brownian Motion, Thermodynamics and Particle Transport Theory
ERIC Educational Resources Information Center
Bringuier, E.
2008-01-01
The motion of a particle in a medium is dealt with either as a problem of mechanics or as a transport process in non-equilibrium statistical physics. The two kinds of approach are often unrelated as they are taught in different textbooks. The aim of this paper is to highlight the link between the mechanical and statistical treatments of particle…
Facilitating the Transition from Bright to Dim Environments
2016-03-04
For the parametric data, a multivariate ANOVA was used in determining the systematic presence of any statistically significant performance differences...performed. All significance levels were p < 0.05, and statistical analyses were performed with the Statistical Package for Social Sciences (SPSS)...1950. Age changes in rate and level of visual dark adaptation. Journal of Applied Physiology, 2, 407–411. Field, A. 2009. Discovering statistics
Huh, Iksoo; Wu, Xin; Park, Taesung; Yi, Soojin V
2017-07-21
DNA methylation is one of the most extensively studied epigenetic modifications of genomic DNA. In recent years, sequencing of bisulfite-converted DNA, particularly via next-generation sequencing technologies, has become a widely popular method to study DNA methylation. This method can be readily applied to a variety of species, dramatically expanding the scope of DNA methylation studies beyond the traditionally studied human and mouse systems. In parallel to the increasing wealth of genomic methylation profiles, many statistical tools have been developed to detect differentially methylated loci (DMLs) or differentially methylated regions (DMRs) between biological conditions. We discuss and summarize several key properties of currently available tools to detect DMLs and DMRs from sequencing of bisulfite-converted DNA. However, the majority of the statistical tools developed for DML/DMR analyses have been validated using only mammalian data sets, and less priority has been placed on the analyses of invertebrate or plant DNA methylation data. We demonstrate that genomic methylation profiles of non-mammalian species are often highly distinct from those of mammalian species using examples of honey bees and humans. We then discuss how such differences in data properties may affect statistical analyses. Based on these differences, we provide three specific recommendations to improve the power and accuracy of DML and DMR analyses of invertebrate data when using currently available statistical tools. These considerations should facilitate systematic and robust analyses of DNA methylation from diverse species, thus advancing our understanding of DNA methylation. © The Author 2017. Published by Oxford University Press.
Imaging Depression in Adults with ASD
2017-10-01
collected temporally close enough to imaging data in Phase 2 to be confidently incorporated in the planned statistical analyses, and (b) not unduly risk ... Phase 2 to be confidently incorporated in the planned statistical analyses, and (b) not unduly risk attrition between Phase 1 and 2, we chose to hold ... supervision is ongoing (since 9/2014). • Co-I Dr. Lerner's 2nd year Clinical Psychology PhD students have participated in ADOS-2 Introductory Clinical
Schulz, Marcus; Neumann, Daniel; Fleet, David M; Matthies, Michael
2013-12-01
During the last decades, marine pollution with anthropogenic litter has become a worldwide major environmental concern. Standardized monitoring of litter since 2001 on 78 beaches selected within the framework of the Convention for the Protection of the Marine Environment of the North-East Atlantic (OSPAR) has been used to identify temporal trends of marine litter. Based on statistical analyses of this dataset a two-part multi-criteria evaluation system for beach litter pollution of the North-East Atlantic and the North Sea is proposed. Canonical correlation analyses, linear regression analyses, and non-parametric analyses of variance were used to identify different temporal trends. A classification of beaches was derived from cluster analyses and served to define different states of beach quality according to abundances of 17 input variables. The evaluation system is easily applicable and relies on the above-mentioned classification and on significant temporal trends implied by significant rank correlations. Copyright © 2013 Elsevier Ltd. All rights reserved.
Effect of Correlated Rotational Noise
NASA Astrophysics Data System (ADS)
Hancock, Benjamin; Wagner, Caleb; Baskaran, Aparna
The traditional model of a self-propelled particle (SPP) is one where the body axis along which the particle travels reorients itself through rotational diffusion. If the reorientation process is instead driven by colored noise rather than standard Gaussian white noise, the resulting statistical mechanics cannot be accessed through conventional methods. In this talk we present results comparing three methods of deriving the statistical mechanics of an SPP whose reorientation process is driven by colored noise. We illustrate the differences and similarities in the resulting statistical mechanics through their ability to accurately capture the particle's response to external aligning fields.
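The contrast between white and colored rotational noise described in this abstract can be illustrated with a minimal simulation. This is not the authors' method: the function name, parameters, and the Euler-Maruyama discretization are all assumptions made here. The Ornstein-Uhlenbeck noise strength is scaled so both cases share the same long-time angular diffusivity D_r:

```python
import numpy as np

def simulate_spp(n_steps=20000, dt=0.01, v0=1.0, D_r=1.0, tau=None, seed=0):
    """2D self-propelled particle. tau=None gives standard white rotational
    noise; tau>0 drives reorientation with Ornstein-Uhlenbeck (colored)
    noise whose integral has long-time angular diffusivity D_r."""
    rng = np.random.default_rng(seed)
    theta, eta = 0.0, 0.0
    pos = np.zeros((n_steps, 2))
    for i in range(1, n_steps):
        if tau is None:
            # white-noise rotational diffusion
            theta += np.sqrt(2.0 * D_r * dt) * rng.standard_normal()
        else:
            # OU process for the angular velocity eta
            eta += (-eta / tau) * dt + np.sqrt(2.0 * D_r * dt) / tau * rng.standard_normal()
            theta += eta * dt
        pos[i] = pos[i - 1] + v0 * dt * np.array([np.cos(theta), np.sin(theta)])
    return pos

white = simulate_spp()            # standard SPP
colored = simulate_spp(tau=0.5)   # colored-noise reorientation
```

Comparing statistics of the two trajectory ensembles (e.g. orientation correlations in an external field) is where the methods discussed in the talk would differ.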
A statistical study of ionopause perturbation and associated boundary wave formation at Venus.
NASA Astrophysics Data System (ADS)
Chong, G. S.; Pope, S. A.; Walker, S. N.; Zhang, T.; Balikhin, M. A.
2017-12-01
In contrast to Earth, Venus does not possess an intrinsic magnetic field. Hence the interaction between solar wind and Venus is significantly different when compared to Earth, even though these two planets were once considered similar. Within the induced magnetosphere and ionosphere of Venus, previous studies have shown the existence of ionospheric boundary waves. These structures may play an important role in the atmospheric evolution of Venus. By using Venus Express data, the crossings of the ionopause boundary are determined based on the observations of photoelectrons during 2011. Pulses of dropouts in the electron energy spectrometer were observed in 92 events, which suggests potential perturbations of the boundary. Minimum variance analysis of the 1Hz magnetic field data for the perturbations is conducted and used to confirm the occurrence of the boundary waves. Statistical analysis shows that they were propagating mainly in the ±VSO-Y direction in the polar north terminator region. The generation mechanisms of boundary waves and their evolution into the potential nonlinear regime are discussed and analysed.
Qin, Zhao; Fabre, Andrea; Buehler, Markus J
2013-05-01
The stability of alpha helices is important in protein folding and bioinspired materials design, and it controls many biological properties under physiological and disease conditions. Here we show that a naturally favored alpha helix length of 9 to 17 amino acids exists, at which the propensity towards the formation of this secondary structure is maximized. We use a combination of thermodynamic analysis, well-tempered metadynamics molecular simulation, and statistical analyses of experimental alpha helix length distributions, and find that the favored alpha helix length is caused by a competition between alpha helix folding, unfolding into a random coil, and formation of higher-order tertiary structures. This theoretical result may explain the statistical distribution of alpha helix lengths observed in natural protein structures. Our study provides mechanistic insight into fundamental controlling parameters in alpha helix structure formation, and potentially other biopolymers or synthetic materials. The result advances our fundamental understanding of size effects in the stability of protein structures and may enable the design of de novo alpha-helical protein materials.
Fluctuating observation time ensembles in the thermodynamics of trajectories
NASA Astrophysics Data System (ADS)
Budini, Adrián A.; Turner, Robert M.; Garrahan, Juan P.
2014-03-01
The dynamics of stochastic systems, both classical and quantum, can be studied by analysing the statistical properties of dynamical trajectories. The properties of ensembles of such trajectories for long, but fixed, times are described by large-deviation (LD) rate functions. These LD functions play the role of dynamical free energies: they are cumulant generating functions for time-integrated observables, and their analytic structure encodes dynamical phase behaviour. This ‘thermodynamics of trajectories’ approach is to trajectories and dynamics what the equilibrium ensemble method of statistical mechanics is to configurations and statics. Here we show that, just like in the static case, there are a variety of alternative ensembles of trajectories, each defined by their global constraints, with that of trajectories of fixed total time being just one of these. We show how the LD functions that describe an ensemble of trajectories where some time-extensive quantity is constant (and large) but where total observation time fluctuates can be mapped to those of the fixed-time ensemble. We discuss how the correspondence between generalized ensembles can be exploited in path sampling schemes for generating rare dynamical trajectories.
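As a minimal numerical illustration of the "dynamical free energy" idea (not taken from the paper): for a Poisson process with rate λ, the event count K over a fixed-time trajectory of length T has the scaled cumulant generating function θ(s) = λ(e⁻ˢ − 1), which can be checked against an ensemble average over trajectories:

```python
import numpy as np

rng = np.random.default_rng(1)
lam, T, n_traj, s = 2.0, 50.0, 200_000, 0.3

# K: a time-integrated observable (event count) for each fixed-time trajectory
K = rng.poisson(lam * T, size=n_traj)

# Empirical dynamical free energy (scaled cumulant generating function)
theta_emp = np.log(np.mean(np.exp(-s * K))) / T

# Analytic SCGF for Poisson counting statistics
theta_exact = lam * (np.exp(-s) - 1.0)
```

The generalized-ensemble mappings discussed in the paper concern exactly such LD functions when the total observation time T is allowed to fluctuate instead of being fixed.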
Big heart data: advancing health informatics through data sharing in cardiovascular imaging.
Suinesiaputra, Avan; Medrano-Gracia, Pau; Cowan, Brett R; Young, Alistair A
2015-07-01
The burden of heart disease is rapidly worsening due to the increasing prevalence of obesity and diabetes. Data sharing and open database resources for heart health informatics are important for advancing our understanding of cardiovascular function, disease progression and therapeutics. Data sharing enables valuable information, often obtained at considerable expense and effort, to be reused beyond the specific objectives of the original study. Many government funding agencies and journal publishers are requiring data reuse, and are providing mechanisms for data curation and archival. Tools and infrastructure are available to archive anonymous data from a wide range of studies, from descriptive epidemiological data to gigabytes of imaging data. Meta-analyses can be performed to combine raw data from disparate studies to obtain unique comparisons or to enhance statistical power. Open benchmark datasets are invaluable for validating data analysis algorithms and objectively comparing results. This review provides a rationale for increased data sharing and surveys recent progress in the cardiovascular domain. We also highlight the potential of recent large cardiovascular epidemiological studies enabling collaborative efforts to facilitate data sharing, algorithms benchmarking, disease modeling and statistical atlases.
Language experience changes subsequent learning
Onnis, Luca; Thiessen, Erik
2013-01-01
What are the effects of experience on subsequent learning? We explored the effects of language-specific word order knowledge on the acquisition of sequential conditional information. Korean and English adults were engaged in a sequence learning task involving three different sets of stimuli: auditory linguistic (nonsense syllables), visual non-linguistic (nonsense shapes), and auditory non-linguistic (pure tones). The forward and backward probabilities between adjacent elements generated two equally probable and orthogonal perceptual parses of the elements, such that any significant preference at test must be due to either general cognitive biases, or prior language-induced biases. We found that language modulated parsing preferences with the linguistic stimuli only. Intriguingly, these preferences are congruent with the dominant word order patterns of each language, as corroborated by corpus analyses, and are driven by probabilistic preferences. Furthermore, although the Korean individuals had received extensive formal explicit training in English and lived in an English-speaking environment, they exhibited statistical learning biases congruent with their native language. Our findings suggest that mechanisms of statistical sequential learning are implicated in language across the lifespan, and experience with language may affect cognitive processes and later learning. PMID:23200510
Compressor seal rub energetics study
NASA Technical Reports Server (NTRS)
Laverty, W. F.
1978-01-01
The rub mechanics of compressor abradable blade tip seals at simulated engine conditions were investigated. Twelve statistically planned, instrumented rub tests were conducted with titanium blades and Feltmetal fibermetal rubstrips. The tests were conducted with single stationary blades rubbing against seal material bonded to rotating test disks. The instantaneous rub torque, speed, incursion rate and blade temperatures were continuously measured and recorded. Basic rub parameters (incursion rate, rub depth, abradable density, blade thickness and rub velocity) were varied to determine the effects on rub energy and heat split between the blade, rubstrip surface and rub debris. The test data was reduced, energies were determined and statistical analyses were completed to determine the primary and interactive effects. Wear surface morphology, profile measurements and metallographic analysis were used to determine wear, glazing, melting and material transfer. The rub energies for these tests were most significantly affected by the incursion rate while rub velocity and blade thickness were of secondary importance. The ratios of blade wear to seal wear were representative of those experienced in engine operation of these seal system materials.
At least some errors are randomly generated (Freud was wrong)
NASA Technical Reports Server (NTRS)
Sellen, A. J.; Senders, J. W.
1986-01-01
An experiment was carried out to expose something about human error generating mechanisms. In the context of the experiment, an error was made when a subject pressed the wrong key on a computer keyboard or pressed no key at all in the time allotted. These might be considered, respectively, errors of substitution and errors of omission. Each of seven subjects saw a sequence of three digital numbers, made an easily learned binary judgement about each, and was to press the appropriate one of two keys. Each session consisted of 1,000 presentations of randomly permuted, fixed numbers broken into 10 blocks of 100. One of two keys should have been pressed within one second of the onset of each stimulus. These data were subjected to statistical analyses in order to probe the nature of the error generating mechanisms. Goodness of fit tests for a Poisson distribution for the number of errors per 50 trial interval and for an exponential distribution of the length of the intervals between errors were carried out. There is evidence for an endogenous mechanism that may best be described as a random error generator. Furthermore, an item analysis of the number of errors produced per stimulus suggests the existence of a second mechanism operating on task driven factors producing exogenous errors. Some errors, at least, are the result of constant probability generating mechanisms with error rate idiosyncratically determined for each subject.
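The two distributional checks described above can be sketched as follows. This is a hedged illustration with simulated stand-in data, since the original keystroke data are not available; the bin choices and sample sizes are assumptions:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Stand-in data: if errors come from a constant-probability generator,
# errors per 50-trial block should be Poisson and the gaps between
# successive errors should be exponential.
counts = rng.poisson(2.0, size=200)       # errors per 50-trial block
gaps = rng.exponential(25.0, size=200)    # trials between successive errors

# Poisson goodness of fit: bin counts as 0,1,2,3,4+ and compare to the
# Poisson pmf with the rate estimated from the data (hence ddof=1)
lam = counts.mean()
observed = np.array([(counts == k).sum() for k in range(4)]
                    + [(counts >= 4).sum()], dtype=float)
probs = stats.poisson.pmf(np.arange(4), lam)
probs = np.append(probs, 1.0 - probs.sum())
chi2_stat, chi2_p = stats.chisquare(observed, probs * counts.size, ddof=1)

# Exponential goodness of fit via a Kolmogorov-Smirnov test
ks_stat, ks_p = stats.kstest(gaps, "expon", args=(0, gaps.mean()))
```

Failing either test would point away from the random-generator hypothesis toward a task-driven (exogenous) error mechanism.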
A new statistical method for design and analyses of component tolerance
NASA Astrophysics Data System (ADS)
Movahedi, Mohammad Mehdi; Khounsiavash, Mohsen; Otadi, Mahmood; Mosleh, Maryam
2017-03-01
Tolerancing conducted by design engineers to meet customers' needs is a prerequisite for producing high-quality products. Engineers use handbooks to conduct tolerancing. While the use of statistical methods for tolerancing is not new, engineers often assume known distributions, including the normal distribution. Yet, if the statistical distribution of the given variable is unknown, a new statistical method is needed to design tolerances. In this paper, we use the generalized lambda distribution for the design and analysis of component tolerance. We use the percentile method (PM) to estimate the distribution parameters. The findings indicated that, when the distribution of the component data is unknown, the proposed method can be used to expedite the design of component tolerance. Moreover, in the case of assembled sets, a more extensive tolerance for each component with the same target performance can be utilized.
Sequi, Marco; Campi, Rita; Clavenna, Antonio; Bonati, Maurizio
2013-03-01
To evaluate the quality of data reporting and statistical methods performed in drug utilization studies in the pediatric population. Drug utilization studies evaluating all drug prescriptions to children and adolescents published between January 1994 and December 2011 were retrieved and analyzed. For each study, information on measures of exposure/consumption, the covariates considered, descriptive and inferential analyses, statistical tests, and methods of data reporting was extracted. An overall quality score was created for each study using a 12-item checklist that took into account the presence of outcome measures, covariates of measures, descriptive measures, statistical tests, and graphical representation. A total of 22 studies were reviewed and analyzed. Of these, 20 studies reported at least one descriptive measure. The mean was the most commonly used measure (18 studies), but only five of these also reported the standard deviation. Statistical analyses were performed in 12 studies, with the chi-square test being the most commonly performed test. Graphs were presented in 14 papers. Sixteen papers reported the number of drug prescriptions and/or packages, and ten reported the prevalence of the drug prescription. The mean quality score was 8 (median 9). Only seven of the 22 studies received a score of ≥10, while four studies received a score of <6. Our findings document that only a few of the studies reviewed applied statistical methods and reported data in a satisfactory manner. We therefore conclude that the methodology of drug utilization studies needs to be improved.
2015-08-01
the nine questions. The Statistical Package for the Social Sciences (SPSS) [11] was used to conduct statistical analysis on the sample. Two types ... constructs. SPSS was again used to conduct statistical analysis on the sample. This time factor analysis was conducted. Factor analysis attempts to ... Business Research Methods and Statistics using SPSS. P432. 11 IBM SPSS Statistics. (2012) 12 Burns, R.B., Burns, R.A. (2008) 'Business Research
Drug-Induced Dental Caries: A Disproportionality Analysis Using Data from VigiBase.
de Campaigno, Emilie Patras; Kebir, Inès; Montastruc, Jean-Louis; Rueter, Manuela; Maret, Delphine; Lapeyre-Mestre, Maryse; Sallerin, Brigitte; Despas, Fabien
2017-12-01
Dental caries is defined as a pathological breakdown of the tooth. It is an infectious phenomenon involving a multifactorial aetiology. The impact of drugs on cariogenic risk has been poorly investigated. In this study, we identified drugs suspected to induce dental caries as adverse drug reactions (ADRs) and then studied a possible pathogenic mechanism for each drug that had a statistically significant disproportionality. We extracted individual case safety reports of dental caries associated with drugs from VigiBase® (the World Health Organization global individual case safety report database). We calculated disproportionality for each drug with a reporting odds ratio (ROR) and 99% confidence interval. We analysed the pharmacodynamics of each drug that had a statistically significant disproportionality. In VigiBase®, 5229 safety reports for dental caries concerning 733 drugs were identified. Among these drugs, 88 had a significant ROR, and for 65 of them (73.9%), no information about dental caries was found in the summaries of the product characteristics, the Micromedex® DRUGDEX, or the Martindale databases. Regarding the pharmacological classes of drugs involved in dental caries, we identified bisphosphonates, atropinic drugs, antidepressants, corticoids, immunomodulating drugs, antipsychotics, antiepileptics, opioids and β2-adrenoreceptor agonist drugs. Regarding possible pathogenic mechanisms for these drugs, we identified changes in salivary flow/composition for 54 drugs (61.4%), bone metabolism changes for 31 drugs (35.2%), hyperglycaemia for 32 drugs (36.4%) and/or immunosuppression for 23 drugs (26.1%). For nine drugs (10.2%), the mechanism was unclear. We identified 88 drugs with a significant positive disproportionality for dental caries. Special attention has to be paid to bisphosphonates, atropinic drugs, immunosuppressants and drugs causing hyperglycaemia.
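The reporting odds ratio with a 99% confidence interval is computed from a standard 2x2 pharmacovigilance table; a sketch (the counts below are hypothetical, not taken from VigiBase):

```python
import math

def reporting_odds_ratio(a, b, c, d, z=2.576):
    """ROR with a 99% CI (z=2.576) from a 2x2 pharmacovigilance table:
    a/b = target-reaction / other-reaction reports for the drug of interest,
    c/d = the same two counts for all other drugs in the database."""
    ror = (a / b) / (c / d)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # SE of log(ROR)
    lo = ror * math.exp(-z * se)
    hi = ror * math.exp(z * se)
    return ror, lo, hi

# Hypothetical counts: 40 caries reports among 2000 reports for the drug,
# 5189 caries reports among ~5.2 million reports for all other drugs
ror, lo, hi = reporting_odds_ratio(40, 1960, 5189, 5194811)
```

A disproportionality signal is declared when the lower bound of the CI exceeds 1, as with the hypothetical counts above.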
Research Design and Statistical Methods in Indian Medical Journals: A Retrospective Survey
Hassan, Shabbeer; Yellur, Rajashree; Subramani, Pooventhan; Adiga, Poornima; Gokhale, Manoj; Iyer, Manasa S.; Mayya, Shreemathi S.
2015-01-01
Good quality medical research generally requires not only an expertise in the chosen medical field of interest but also a sound knowledge of statistical methodology. The number of medical research articles which have been published in Indian medical journals has increased quite substantially in the past decade. The aim of this study was to collate all evidence on study design quality and statistical analyses used in selected leading Indian medical journals. Ten (10) leading Indian medical journals were selected based on impact factors and all original research articles published in 2003 (N = 588) and 2013 (N = 774) were categorized and reviewed. A validated checklist on study design, statistical analyses, results presentation, and interpretation was used for review and evaluation of the articles. Main outcomes considered in the present study were study design types and their frequencies, error/defects proportion in study design, statistical analyses, and implementation of the CONSORT checklist in RCTs (randomized clinical trials). From 2003 to 2013: The proportion of erroneous statistical analyses did not decrease (χ2=0.592, Φ=0.027, p=0.4418), 25% (80/320) in 2003 compared to 22.6% (111/490) in 2013. Compared with 2003, significant improvement was seen in 2013; the proportion of papers using statistical tests increased significantly (χ2=26.96, Φ=0.16, p<0.0001) from 42.5% (250/588) to 56.7% (439/774). The overall proportion of errors in study design decreased significantly (χ2=16.783, Φ=0.12, p<0.0001), 41.3% (243/588) compared to 30.6% (237/774). In 2013, the proportion of randomized clinical trial designs remained very low (7.3%, 43/588), with the majority showing some errors (41 papers, 95.3%). The majority of the published studies were retrospective in nature both in 2003 [79.1% (465/588)] and in 2013 [78.2% (605/774)].
Major decreases in error proportions were observed in both results presentation (χ2=24.477, Φ=0.17, p<0.0001), 82.2% (263/320) compared to 66.3% (325/490), and interpretation (χ2=25.616, Φ=0.173, p<0.0001), 32.5% (104/320) compared to 17.1% (84/490), though some serious errors were still present. Indian medical research seems to have made no major progress in the use of correct statistical analyses, but errors/defects in study design have decreased significantly. Randomized clinical trials are quite rarely published and have a high proportion of methodological problems. PMID:25856194
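The abstract's headline comparison can be reproduced from the counts it reports; for example, the χ² for the increase in papers using statistical tests (250/588 in 2003 vs. 439/774 in 2013):

```python
import numpy as np
from scipy.stats import chi2_contingency

# Papers using statistical tests: 250/588 (2003) vs. 439/774 (2013)
table = np.array([[250, 588 - 250],
                  [439, 774 - 439]])

# Pearson chi-square without Yates continuity correction
chi2, p, dof, expected = chi2_contingency(table, correction=False)
# chi2 reproduces the 26.96 reported in the abstract; p < 0.0001
```

The same two-line pattern reproduces the other proportion comparisons in the abstract from their reported numerators and denominators.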
Statistical Literacy in the Data Science Workplace
ERIC Educational Resources Information Center
Grant, Robert
2017-01-01
Statistical literacy, the ability to understand and make use of statistical information including methods, has particular relevance in the age of data science, when complex analyses are undertaken by teams from diverse backgrounds. Not only is it essential to communicate to the consumers of information but also within the team. Writing from the…
Reporting Practices and Use of Quantitative Methods in Canadian Journal Articles in Psychology.
Counsell, Alyssa; Harlow, Lisa L
2017-05-01
With recent focus on the state of research in psychology, it is essential to assess the nature of the statistical methods and analyses used and reported by psychological researchers. To that end, we investigated the prevalence of different statistical procedures and the nature of statistical reporting practices in recent articles from the four major Canadian psychology journals. The majority of authors evaluated their research hypotheses through the use of analysis of variance (ANOVA), t-tests, and multiple regression. Multivariate approaches were less common. Null hypothesis significance testing remains a popular strategy, but the majority of authors reported a standardized or unstandardized effect size measure alongside their significance test results. Confidence intervals on effect sizes were infrequently employed. Many authors provided minimal details about their statistical analyses, and fewer than a third of the articles reported on data complications such as missing data and violations of statistical assumptions. Strengths of and areas needing improvement for reporting quantitative results are highlighted. The paper concludes with recommendations for how researchers and reviewers can improve comprehension and transparency in statistical reporting.
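The recommended practice of pairing a significance test with an effect size can be sketched as follows (simulated data; the pooled-SD form of Cohen's d shown here is one common choice, not necessarily the one used by the surveyed authors):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
group_a = rng.normal(0.5, 1.0, 40)   # simulated treatment scores
group_b = rng.normal(0.0, 1.0, 40)   # simulated control scores

# Null hypothesis significance test (independent-samples t-test)
t, p = stats.ttest_ind(group_a, group_b)

# Cohen's d with a pooled standard deviation, reported alongside p
n1, n2 = len(group_a), len(group_b)
pooled_sd = np.sqrt(((n1 - 1) * group_a.var(ddof=1) +
                     (n2 - 1) * group_b.var(ddof=1)) / (n1 + n2 - 2))
d = (group_a.mean() - group_b.mean()) / pooled_sd
```

For the equal-variance t-test, d and t are linked exactly by d = t·√(1/n1 + 1/n2), which makes reporting both a cheap transparency gain.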
The SPARC Intercomparison of Middle Atmosphere Climatologies
NASA Technical Reports Server (NTRS)
Randel, William; Fleming, Eric; Geller, Marvin; Gelman, Mel; Hamilton, Kevin; Karoly, David; Ortland, Dave; Pawson, Steve; Swinbank, Richard; Udelhofen, Petra
2003-01-01
Our current confidence in 'observed' climatological winds and temperatures in the middle atmosphere (over altitudes approx. 10-80 km) is assessed by detailed intercomparisons of contemporary and historic data sets. These data sets include global meteorological analyses and assimilations, climatologies derived from research satellite measurements, and historical reference atmosphere circulation statistics. We also include comparisons with historical rocketsonde wind and temperature data, and with more recent lidar temperature measurements. The comparisons focus on a few basic circulation statistics, such as temperature, zonal wind, and eddy flux statistics. Special attention is focused on tropical winds and temperatures, where large differences exist among separate analyses. Assimilated data sets provide the most realistic tropical variability, but substantial differences exist among current schemes.
NASA Technical Reports Server (NTRS)
1982-01-01
A FORTRAN coded computer program and method to predict the reaction control fuel consumption statistics for a three axis stabilized rocket vehicle upper stage is described. A Monte Carlo approach is used, made more efficient by closed-form estimates of impulses. The effects of rocket motor thrust misalignment, static unbalance, aerodynamic disturbances, and deviations in trajectory, mass properties and control system characteristics are included. This routine can be applied to many types of on-off reaction controlled vehicles. The pseudorandom number generation and statistical analyses subroutines including the output histograms can be used for other Monte Carlo analyses problems.
Stopka, Thomas J; Goulart, Michael A; Meyers, David J; Hutcheson, Marga; Barton, Kerri; Onofrey, Shauna; Church, Daniel; Donahue, Ashley; Chui, Kenneth K H
2017-04-20
Hepatitis C virus (HCV) infections have increased during the past decade but little is known about geographic clustering patterns. We used a unique analytical approach, combining geographic information systems (GIS), spatial epidemiology, and statistical modeling to identify and characterize HCV hotspots, statistically significant clusters of census tracts with elevated HCV counts and rates. We compiled sociodemographic and HCV surveillance data (n = 99,780 cases) for Massachusetts census tracts (n = 1464) from 2002 to 2013. We used a five-step spatial epidemiological approach, calculating incremental spatial autocorrelations and Getis-Ord Gi* statistics to identify clusters. We conducted logistic regression analyses to determine factors associated with the HCV hotspots. We identified nine HCV clusters, with the largest in Boston, New Bedford/Fall River, Worcester, and Springfield (p < 0.05). In multivariable analyses, we found that HCV hotspots were independently and positively associated with the percent of the population that was Hispanic (adjusted odds ratio [AOR]: 1.07; 95% confidence interval [CI]: 1.04, 1.09) and the percent of households receiving food stamps (AOR: 1.83; 95% CI: 1.22, 2.74). HCV hotspots were independently and negatively associated with the percent of the population that were high school graduates or higher (AOR: 0.91; 95% CI: 0.89, 0.93) and the percent of the population in the "other" race/ethnicity category (AOR: 0.88; 95% CI: 0.85, 0.91). We identified locations where HCV clusters were a concern, and where enhanced HCV prevention, treatment, and care can help combat the HCV epidemic in Massachusetts. GIS, spatial epidemiological and statistical analyses provided a rigorous approach to identify hotspot clusters of disease, which can inform public health policy and intervention targeting. 
Further studies that incorporate spatiotemporal cluster analyses, Bayesian spatial and geostatistical models, spatially weighted regression analyses, and assessment of associations between HCV clustering and the built environment are needed to expand upon our combined spatial epidemiological and statistical methods.
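The hotspot step of the pipeline above rests on Getis-Ord Gi* z-scores. A minimal sketch of the statistic on a toy 1-D arrangement of "tracts" (the data and binary neighbor weights are invented for illustration; the study's actual weights and software are not specified here):

```python
import numpy as np

def getis_ord_gi_star(x, w):
    """Getis-Ord Gi* z-scores for values x and spatial weights w
    (n x n, with the self-weight included on the diagonal)."""
    n = x.size
    xbar = x.mean()
    s = np.sqrt((x**2).mean() - xbar**2)        # population SD
    wi = w.sum(axis=1)                          # sum of weights per site
    s1 = (w**2).sum(axis=1)                     # sum of squared weights
    num = w @ x - xbar * wi
    den = s * np.sqrt((n * s1 - wi**2) / (n - 1))
    return num / den

# Toy "census tracts" on a line with a run of high counts in the middle
x = np.array([1.0, 1, 1, 1, 9, 9, 9, 1, 1, 1, 1])
n = x.size
w = np.eye(n)                                   # self-weight
for i in range(n - 1):                          # immediate neighbors
    w[i, i + 1] = w[i + 1, i] = 1.0

z = getis_ord_gi_star(x, w)
# The central tract yields a large positive z: a hotspot candidate
```

Tracts whose z exceeds the significance threshold (e.g. 1.96 for p < 0.05, before any multiple-comparison correction) form the hotspot clusters that the logistic regression step then characterizes.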
Assessment of modified gold surfaced titanium implants on skeletal fixation
Zainali, Kasra; Danscher, Gorm; Jakobsen, Thomas; Baas, Jorgen; Møller, Per; Bechtold, Joan E.; Soballe, Kjeld
2013-01-01
Noncemented implants are the primary choice for younger patients undergoing total hip replacements. However, the major concern regarding revision in this group of patients is wear particles, periimplant inflammation, and subsequent aseptic implant loosening. Macrophages have been shown to liberate gold ions through the process termed dissolucytosis. Furthermore, gold ions are known to act in an anti-inflammatory manner by inhibiting cellular NF-κB-DNA binding. The present study investigated whether partial coating of titanium implants could augment early osseointegration and increase mechanical fixation. Cylindrical porous coated Ti-6Al-4V implants partially coated with metallic gold were inserted in the proximal region of the humerus in ten canines, and control implants without gold were inserted in the contralateral humerus. Observation time was 4 weeks. Biomechanical push-out tests and stereological histomorphometrical analyses showed no statistically significant differences between the two groups. The unchanged parameters are considered an improvement of the coating properties, as a previous completely gold-coated implant showed inferior mechanical fixation and reduced osseointegration compared to control titanium implants in a similar model. Since sufficient early mechanical fixation is achieved with this new coating, it is reasonable to investigate the implant further in long-term studies. PMID:22847873
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jiang, Tengfei; Spinella, Laura; Im, Jay
2013-11-18
In this paper, we demonstrated the plasticity mechanism for copper (Cu) extrusion in through-silicon via structures under thermal cycling. The local plasticity was directly observed by synchrotron x-ray micro-diffraction near the top of the via with the amount increasing with the peak temperature. The Cu extrusion was confirmed by Atomic Force Microscopy (AFM) measurements and found to be consistent with the observed Cu plasticity behavior. A simple analytical model elucidated the role of plasticity during thermal cycling, and finite element analyses were carried out to confirm the plasticity mechanism as well as the effect of the via/Si interface. The model predictions were able to account for the via extrusions observed in two types of experiments, with one representing a nearly free sliding interface and the other a strongly bonded interface. Interestingly, the AFM extrusion profiles seemed to contour with the local grain structures near the top of the via, suggesting that the grain structure not only affects the yield strength of the Cu and thus its plasticity but could also be important in controlling the pop-up behavior and the statistics for a large ensemble of vias.
Xie, Hualin; Liu, Zhifei; Wang, Peng; Liu, Guiying; Lu, Fucai
2013-01-01
Ecological land is one of the key resources and conditions for the survival of humans because it can provide ecosystem services and is particularly important to public health and safety. It is extremely valuable for effective ecological management to explore the evolution mechanisms of ecological land. Based on spatial statistical analyses, we explored the spatial disparities and primary potential drivers of ecological land change in the Poyang Lake Eco-economic Zone of China. The results demonstrated that the global Moran’s I value was 0.1646 for the 1990 to 2005 period, indicating significant positive spatial autocorrelation (p < 0.05). The results also imply that the clustering trend of ecological land changes weakened in the study area. Some potential driving forces were identified by applying the spatial autoregressive model in this study. The results demonstrated that a higher economic development level and industrialization rate were the main drivers of the faster change of ecological land in the study area. This study also tested the superiority of the spatial autoregressive model for studying the mechanisms of ecological land change by comparing it with the traditional linear regression model. PMID:24384778
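For reference, the global Moran's I reported above (0.1646) is a spatial cross-product statistic. A minimal pure-Python sketch on toy data, not the Poyang Lake dataset:

```python
def morans_i(values, weights):
    """Global Moran's I; weights[i][j] is a spatial weight with w_ii = 0."""
    n = len(values)
    xbar = sum(values) / n
    dev = [x - xbar for x in values]
    w_total = sum(sum(row) for row in weights)
    num = sum(weights[i][j] * dev[i] * dev[j]
              for i in range(n) for j in range(n))
    den = sum(d * d for d in dev)
    return (n / w_total) * (num / den)

# Toy line of six zones with binary adjacency weights (invented values).
adj = [[1 if abs(i - j) == 1 else 0 for j in range(6)] for i in range(6)]
i_clustered = morans_i([1, 1, 1, 5, 5, 5], adj)    # similar values adjacent
i_alternating = morans_i([1, 5, 1, 5, 1, 5], adj)  # dissimilar values adjacent
```

Positive values indicate clustering of similar values (as in the study), values near zero indicate spatial randomness, and negative values indicate dispersion.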
Deducing Wild 2 Components with a Statistical Dataset of Olivine in Chondrite Matrix
NASA Technical Reports Server (NTRS)
Frank, D. R.; Zolensky, M. E.; Le, L.
2012-01-01
Introduction: A preliminary exam of the Wild 2 olivine yielded a major element distribution that is strikingly similar to those for aqueously altered carbonaceous chondrites (CI, CM, and CR) [1], in which FeO-rich olivine is preferentially altered. With evidence lacking for large-scale alteration in Wild 2, the mechanism for this apparent selectivity is poorly understood. We use a statistical approach to explain this distribution in terms of relative contributions from different chondrite forming regions. Samples and Analyses: We have made a particular effort to obtain the best possible analyses of both major and minor elements in Wild 2 olivine and the 5-30 micrometer population in chondrite matrix. Previous studies of chondrite matrix either include larger isolated grains (not found in the Wild 2 collection) or lack minor element abundances. To overcome this gap in the existing data, we have now compiled greater than 10(exp 3) EPMA analyses of matrix olivine in CI, CM, CR, CH, Kakangari, C2-ungrouped, and the least equilibrated CO, CV, LL, and EH chondrites. Also, we are acquiring TEM/EDXS analyses of the Wild 2 olivine with 500s count times, to reduce relative errors of minor elements with respect to those otherwise available. Results: Using our Wild 2 analyses and those from [2], the revised major element distribution is more similar to anhydrous IDPs than previous results, which were based on more limited statistics (see figure below). However, a large frequency peak at Fa(sub 0-1) still persists. All but one of these grains has no detectable Cr, which is dissimilar to the Fa(sub 0-1) found in the CI and CM matrices. In fact, Fa(sub 0-1) with strongly depleted Cr content is a composition that appears to be unique to Kakangari and enstatite (highly reduced) chondrites. We also note the paucity of Fa(sub greater than 58), which would typically indicate crystallization in a more oxidizing environment [3]. 
We conclude that, relative to the bulk of anhydrous IDPs, Wild 2 may have received a larger contribution from the Kakangari and/or enstatite chondrite forming regions. Alternatively, Wild 2 may have undergone accretion in an anomalously reducing region, marked by nebular condensation of this atypical forsterite. In [4], a similar conclusion was reached with an Fe-XANES study. We will also use similar lines of reasoning, and our previous conclusions in [5], to constrain the relative contributions of silicates that appear to have been radially transported from different ordinary and carbonaceous chondrite forming regions to the Kuiper Belt. In addition, the widespread depletion of Cr in these FeO-rich (Fa(sub greater than 20)) fragments is consistent with mild thermal metamorphism in Wild 2.
Racial and ethnic differences in experimental pain sensitivity: systematic review and meta-analysis.
Kim, Hee Jun; Yang, Gee Su; Greenspan, Joel D; Downton, Katherine D; Griffith, Kathleen A; Renn, Cynthia L; Johantgen, Meg; Dorsey, Susan G
2017-02-01
Our objective was to describe the racial and ethnic differences in experimental pain sensitivity. Four databases (PubMed, EMBASE, the Cochrane Central Register of Controlled Trials, and PsycINFO) were searched for studies examining racial/ethnic differences in experimental pain sensitivity. Thermal-heat, cold-pressor, pressure, ischemic, mechanical cutaneous, electrical, and chemical experimental pain modalities were assessed. Risk of bias was assessed using the Agency for Healthcare Research and Quality guideline. Meta-analysis was used to calculate standardized mean differences (SMDs) by pain sensitivity measures. Studies comparing African Americans (AAs) and non-Hispanic whites (NHWs) were included for meta-analyses because of high heterogeneity in other racial/ethnic group comparisons. Statistical heterogeneity was assessed by subgroup analyses by sex, sample size, sample characteristics, and pain modalities. A total of 41 studies met the review criteria. Overall, AAs, Asians, and Hispanics had higher pain sensitivity compared with NHWs, particularly lower pain tolerance, higher pain ratings, and greater temporal summation of pain. Meta-analyses revealed that AAs had lower pain tolerance (SMD: -0.90, 95% confidence intervals [CIs]: -1.10 to -0.70) and higher pain ratings (SMD: 0.50, 95% CI: 0.30-0.69) but no significant differences in pain threshold (SMD: -0.06, 95% CI: -0.23 to 0.10) compared with NHWs. Estimates did not vary by pain modalities, nor by other demographic factors; however, SMDs were significantly different based on the sample size. Racial/ethnic differences in experimental pain sensitivity were more pronounced with suprathreshold than with threshold stimuli, which is important in clinical pain treatment. Additional studies examining mechanisms to explain such differences in pain tolerance and pain ratings are needed.
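The pooled SMDs and 95% CIs reported above come from inverse-variance meta-analysis of standardized mean differences. A compact fixed-effect sketch with invented study summaries (the means, SDs, and group sizes below are illustrative, not taken from the review):

```python
import math

def smd_hedges_g(m1, sd1, n1, m2, sd2, n2):
    """Hedges' g standardized mean difference and its sampling variance."""
    sp = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    j = 1 - 3 / (4 * (n1 + n2) - 9)        # small-sample bias correction
    g = j * (m1 - m2) / sp
    var = (n1 + n2) / (n1 * n2) + g**2 / (2 * (n1 + n2))
    return g, var

def pool_fixed(effects):
    """Inverse-variance fixed-effect pooled SMD with a 95% CI."""
    ws = [1 / v for _, v in effects]
    est = sum(w * g for (g, _), w in zip(effects, ws)) / sum(ws)
    se = math.sqrt(1 / sum(ws))
    return est, (est - 1.96 * se, est + 1.96 * se)

# Two invented studies of pain tolerance (group means, SDs, sizes).
studies = [smd_hedges_g(40, 10, 30, 50, 10, 30),
           smd_hedges_g(42, 12, 50, 50, 12, 50)]
pooled, (lo95, hi95) = pool_fixed(studies)
```

A negative pooled SMD here corresponds to lower tolerance in the first group, matching the sign convention of the AA-vs-NHW tolerance estimate above; a random-effects variant would widen the CI when between-study heterogeneity is present.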
ERIC Educational Resources Information Center
Kadhi, Tau; Holley, D.
2010-01-01
The following report gives the statistical findings of the July 2010 TMSL Bar results. Procedures: Data are pre-existing and were provided to the Evaluator by email from the Registrar and Dean. Statistical analyses were run using SPSS 17 to address the following research questions: 1. What are the statistical descriptors of the July 2010 overall TMSL…
Data-driven subtypes of major depressive disorder: a systematic review
2012-01-01
Background According to current classification systems, patients with major depressive disorder (MDD) may have very different combinations of symptoms. This symptomatic diversity hinders the progress of research into the causal mechanisms and treatment allocation. Theoretically founded subtypes of depression such as atypical, psychotic, and melancholic depression have limited clinical applicability. Data-driven analyses of symptom dimensions or subtypes of depression are scarce. In this systematic review, we examine the evidence for the existence of data-driven symptomatic subtypes of depression. Methods We undertook a systematic literature search of MEDLINE, PsycINFO and Embase in May 2012. We included studies analyzing the depression criteria of the Diagnostic and Statistical Manual of Mental Disorders, fourth edition (DSM-IV) of adults with MDD in latent variable analyses. Results In total, 1176 articles were retrieved, of which 20 satisfied the inclusion criteria. These reports described a total of 34 latent variable analyses: 6 confirmatory factor analyses, 6 exploratory factor analyses, 12 principal component analyses, and 10 latent class analyses. The latent class techniques distinguished 2 to 5 classes, which mainly reflected subgroups with different overall severity: 62 of 71 significant differences on symptom level were congruent with a latent class solution reflecting severity. The latent class techniques did not consistently identify specific symptom clusters. Latent factor techniques mostly found a factor explaining the variance in the symptoms depressed mood and interest loss (11 of 13 analyses), often complemented by psychomotor retardation or fatigue (8 of 11 analyses). However, differences among the identified factors and classes were substantial. Conclusions The studies performed to date do not provide conclusive evidence for the existence of depressive symptom dimensions or symptomatic subtypes. 
The wide diversity of identified factors and classes might result either from the absence of patterns to be found, or from the theoretical and modeling choices preceding analysis. PMID:23210727
Evaluating the consistency of gene sets used in the analysis of bacterial gene expression data.
Tintle, Nathan L; Sitarik, Alexandra; Boerema, Benjamin; Young, Kylie; Best, Aaron A; Dejongh, Matthew
2012-08-08
Statistical analyses of whole genome expression data require functional information about genes in order to yield meaningful biological conclusions. The Gene Ontology (GO) and Kyoto Encyclopedia of Genes and Genomes (KEGG) are common sources of functionally grouped gene sets. For bacteria, the SEED and MicrobesOnline provide alternative, complementary sources of gene sets. To date, no comprehensive evaluation of the data obtained from these resources has been performed. We define a series of gene set consistency metrics directly related to the most common classes of statistical analyses for gene expression data, and then perform a comprehensive analysis of 3581 Affymetrix® gene expression arrays across 17 diverse bacteria. We find that gene sets obtained from GO and KEGG demonstrate lower consistency than those obtained from the SEED and MicrobesOnline, regardless of gene set size. Despite the widespread use of GO and KEGG gene sets in bacterial gene expression data analysis, the SEED and MicrobesOnline provide more consistent sets for a wide variety of statistical analyses. Increased use of the SEED and MicrobesOnline gene sets in the analysis of bacterial gene expression data may improve statistical power and utility of expression data.
NASA Technical Reports Server (NTRS)
Foster, Winfred A., Jr.; Crowder, Winston; Steadman, Todd E.
2014-01-01
This paper presents the results of statistical analyses performed to predict the thrust imbalance between two solid rocket motor boosters to be used on the Space Launch System (SLS) vehicle. Two legacy internal ballistics codes developed for the Space Shuttle program were coupled with a Monte Carlo analysis code to determine a thrust imbalance envelope for the SLS vehicle based on the performance of 1000 motor pairs. Thirty three variables which could impact the performance of the motors during the ignition transient and thirty eight variables which could impact the performance of the motors during steady state operation of the motor were identified and treated as statistical variables for the analyses. The effects of motor to motor variation as well as variations between motors of a single pair were included in the analyses. The statistical variations of the variables were defined based on data provided by NASA's Marshall Space Flight Center for the upgraded five segment booster and from the Space Shuttle booster when appropriate. The results obtained for the statistical envelope are compared with the design specification thrust imbalance limits for the SLS launch vehicle.
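The Monte Carlo pairing logic described above can be caricatured in a few lines. All numbers here (nominal thrust, variation magnitudes) are illustrative placeholders, not SLS values, and the real analysis propagated 30+ ballistic variables through internal ballistics codes rather than a simple Gaussian model:

```python
import random
import statistics

random.seed(1)

# Illustrative placeholders -- not SLS values; the real analysis varied
# 30+ ignition-transient and steady-state ballistic parameters.
NOMINAL = 16_000_000.0   # nominal booster thrust, N
SIGMA_LOT = 0.01         # lot-level relative variation shared by a pair
SIGMA_MOTOR = 0.002      # additional motor-to-motor variation within a pair

imbalances = []
for _ in range(1000):                     # 1000 motor pairs, as in the study
    shared = random.gauss(0, SIGMA_LOT)   # variation common to both motors
    a = NOMINAL * (1 + shared + random.gauss(0, SIGMA_MOTOR))
    b = NOMINAL * (1 + shared + random.gauss(0, SIGMA_MOTOR))
    imbalances.append(a - b)

imbalances.sort()
lo, hi = imbalances[4], imbalances[-5]    # ~99% thrust-imbalance envelope
mean_imb = statistics.mean(imbalances)
```

Note that variation shared by a motor pair cancels out of the difference; only the within-pair term drives the imbalance envelope, which is why the study modeled the two sources separately.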
Statistical Analyses of Scatterplots to Identify Important Factors in Large-Scale Simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kleijnen, J.P.C.; Helton, J.C.
1999-04-01
The robustness of procedures for identifying patterns in scatterplots generated in Monte Carlo sensitivity analyses is investigated. These procedures are based on attempts to detect increasingly complex patterns in the scatterplots under consideration and involve the identification of (1) linear relationships with correlation coefficients, (2) monotonic relationships with rank correlation coefficients, (3) trends in central tendency as defined by means, medians and the Kruskal-Wallis statistic, (4) trends in variability as defined by variances and interquartile ranges, and (5) deviations from randomness as defined by the chi-square statistic. The following two topics related to the robustness of these procedures are considered for a sequence of example analyses with a large model for two-phase fluid flow: the presence of Type I and Type II errors, and the stability of results obtained with independent Latin hypercube samples. Observations from analysis include: (1) Type I errors are unavoidable, (2) Type II errors can occur when inappropriate analysis procedures are used, (3) physical explanations should always be sought for why statistical procedures identify variables as being important, and (4) the identification of important variables tends to be stable for independent Latin hypercube samples.
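Steps (1) and (2) of the scatterplot-screening sequence above, linear relationships via correlation coefficients and monotonic relationships via rank correlations, can be illustrated with a toy monotone but nonlinear input-output pair:

```python
import math
import random

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def spearman(x, y):
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0.0] * len(v)
        for rank, i in enumerate(order):
            r[i] = float(rank)
        return r
    return pearson(ranks(x), ranks(y))   # assumes no ties in x or y

random.seed(0)
x = [random.uniform(0, 1) for _ in range(200)]
y = [xi ** 5 for xi in x]                # monotone but strongly nonlinear
r_lin = pearson(x, y)                    # step (1): linear relationship
r_rank = spearman(x, y)                  # step (2): monotonic relationship
```

The rank correlation recovers the full strength of the monotone relationship while the linear coefficient understates it, which is exactly why the screening sequence tries rank correlations before moving to the more complex pattern tests.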
Physical concepts in the development of constitutive equations
NASA Technical Reports Server (NTRS)
Cassenti, B. N.
1985-01-01
Proposed viscoplastic material models include in their formulation observed material response but do not generally incorporate principles from thermodynamics, statistical mechanics, and quantum mechanics. Numerous hypotheses have been made for material response based on first principles, and many of these hypotheses have been tested experimentally. The proposed viscoplastic theories must therefore be checked against these hypotheses and their experimental basis. The physics of thermodynamics, statistical mechanics and quantum mechanics, and the effects of defects, are reviewed for their application to the development of constitutive laws.
Maximum entropy models of ecosystem functioning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bertram, Jason, E-mail: jason.bertram@anu.edu.au
2014-12-05
Using organism-level traits to deduce community-level relationships is a fundamental problem in theoretical ecology. This problem parallels the physical one of using particle properties to deduce macroscopic thermodynamic laws, which was successfully achieved with the development of statistical physics. Drawing on this parallel, theoretical ecologists from Lotka onwards have attempted to construct statistical mechanistic theories of ecosystem functioning. Jaynes’ broader interpretation of statistical mechanics, which hinges on the entropy maximisation algorithm (MaxEnt), is of central importance here because the classical foundations of statistical physics do not have clear ecological analogues (e.g. phase space, dynamical invariants). However, models based on the information theoretic interpretation of MaxEnt are difficult to interpret ecologically. Here I give a broad discussion of statistical mechanical models of ecosystem functioning and the application of MaxEnt in these models. Emphasising the sample frequency interpretation of MaxEnt, I show that MaxEnt can be used to construct models of ecosystem functioning which are statistical mechanical in the traditional sense, using a savanna plant ecology model as an example.
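The MaxEnt algorithm at the heart of these models picks the least-biased distribution consistent with known constraints. A minimal sketch for a discrete state space with a single mean constraint (Jaynes' classic loaded-die setup, not the savanna model itself):

```python
import math

def maxent_mean(states, target_mean, tol=1e-10):
    """Maximum-entropy distribution over discrete states with a fixed mean.

    The solution is p_k proportional to exp(-lam * s_k); lam is found by
    bisection, since the constrained mean decreases monotonically in lam.
    """
    def mean_for(lam):
        w = [math.exp(-lam * s) for s in states]
        z = sum(w)
        return sum(s * wi for s, wi in zip(states, w)) / z

    lo, hi = -50.0, 50.0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if mean_for(mid) > target_mean:
            lo = mid          # mean too high -> need larger lam
        else:
            hi = mid
    lam = (lo + hi) / 2
    w = [math.exp(-lam * s) for s in states]
    z = sum(w)
    return [wi / z for wi in w]

# Jaynes' die: constrain the mean of faces 1..6 to 4.5 (3.5 gives a fair die).
p_loaded = maxent_mean(range(1, 7), 4.5)
p_fair = maxent_mean(range(1, 7), 3.5)
```

When the constraint matches the unconstrained mean, MaxEnt returns the uniform distribution; a biased constraint tilts the probabilities exponentially, which is the discrete analogue of the Boltzmann distribution in traditional statistical mechanics.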
NASA Technical Reports Server (NTRS)
Grotjahn, Richard; Black, Robert; Leung, Ruby; Wehner, Michael F.; Barlow, Mathew; Bosilovich, Michael G.; Gershunov, Alexander; Gutowski, William J., Jr.; Gyakum, John R.; Katz, Richard W.;
2015-01-01
The objective of this paper is to review statistical methods, dynamics, modeling efforts, and trends related to temperature extremes, with a focus upon extreme events of short duration that affect parts of North America. These events are associated with large scale meteorological patterns (LSMPs). The statistics, dynamics, and modeling sections of this paper are written to be autonomous and so can be read separately. Methods to define extreme-event statistics and to identify and connect LSMPs to extreme temperature events are presented. Recent advances in statistical techniques connect LSMPs to extreme temperatures through appropriately defined covariates that supplement more straightforward analyses. Various LSMPs, ranging from synoptic to planetary scale structures, are associated with extreme temperature events. Current knowledge about the synoptics and the dynamical mechanisms leading to the associated LSMPs is incomplete. Systematic studies of the physics of LSMP life cycles, comprehensive model assessment of LSMP-extreme temperature event linkages, and LSMP properties are needed. Generally, climate models capture observed properties of heat waves and cold air outbreaks with some fidelity. However, they overestimate warm wave frequency and underestimate cold air outbreak frequency, and underestimate the collective influence of low-frequency modes on temperature extremes. Modeling studies have identified the impact of large-scale circulation anomalies and land-atmosphere interactions on changes in extreme temperatures. However, few studies have examined changes in LSMPs to more specifically understand the role of LSMPs on past and future extreme temperature changes. Even though LSMPs are resolvable by global and regional climate models, they are not necessarily well simulated. The paper concludes with unresolved issues and research questions.
Meng, Xiang-He; Shen, Hui; Chen, Xiang-Ding; Xiao, Hong-Mei; Deng, Hong-Wen
2018-03-01
Genome-wide association studies (GWAS) have successfully identified numerous genetic variants associated with diverse complex phenotypes and diseases, and provided tremendous opportunities for further analyses using summary association statistics. Recently, Pickrell et al. developed a robust method for causal inference using independent putative causal SNPs. However, this method may fail to infer the causal relationship between two phenotypes when only a limited number of independent putative causal SNPs are identified. Here, we extended Pickrell's method to make it more applicable to general situations. We extended the causal inference method by replacing the putative causal SNPs with the lead SNPs (the set of the most significant SNPs in each independent locus) and tested the performance of our extended method using both simulation and empirical data. Simulations suggested that when the same number of genetic variants is used, our extended method had a similar test-statistic distribution under the null model as well as comparable power under the causal model compared with the original method by Pickrell et al. In practice, however, our extended method would generally be more powerful because the number of independent lead SNPs was often larger than the number of independent putative causal SNPs. Including more SNPs, on the other hand, would not cause more false positives. By applying our extended method to summary statistics from GWAS for blood metabolites and femoral neck bone mineral density (FN-BMD), we successfully identified ten blood metabolites that may causally influence FN-BMD. We extended a causal inference method for inferring the putative causal relationship between two phenotypes using summary statistics from GWAS, and identified a number of potential causal metabolites for FN-BMD, which may provide novel insights into the pathophysiological mechanisms underlying osteoporosis.
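The "lead SNPs" in the extended method are the most significant SNPs in each independent locus. A hypothetical distance-based clumping sketch (real pipelines clump on linkage disequilibrium, not just base-pair distance; the function name, window size, and SNP data below are all invented):

```python
def lead_snps(assoc, window=500_000):
    """Greedy selection of one 'lead' SNP per independent locus.

    assoc: (chromosome, position, p-value) tuples from GWAS summary stats.
    Takes the most significant SNP, drops everything within `window` bp
    on the same chromosome, and repeats -- a distance-based stand-in
    for LD clumping.
    """
    leads = []
    for chrom, pos, p in sorted(assoc, key=lambda t: t[2]):
        if all(c != chrom or abs(pos - q) > window for c, q, _ in leads):
            leads.append((chrom, pos, p))
    return leads

# Invented summary statistics: two loci on chr1 plus one on chr2.
snps = [(1, 100_000, 1e-12), (1, 150_000, 1e-9),
        (1, 900_000, 1e-8), (2, 100_000, 1e-7)]
leads = lead_snps(snps)
```

The greedy pass keeps the strongest signal per locus and discards its neighbours, so the surviving set is (approximately) independent, which is the property the causal-inference test relies on.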
Henley, Amy J; DiGennaro Reed, Florence D; Reed, Derek D; Kaplan, Brent A
2016-09-01
Incentives are a popular method to achieve desired employee performance; however, research on optimal incentive magnitude is lacking. Behavioral economic demand curves model persistence of responding in the face of increasing cost and may be suitable to examine the reinforcing value of incentives on work performance. The present use-inspired basic study integrated an experiential human operant task within a crowdsourcing platform to evaluate the applicability of behavioral economics for quantifying changes in workforce attrition. Participants included 88 Amazon Mechanical Turk Workers who earned either a $0.05 or $0.10 incentive for completing a progressively increasing response requirement. Analyses revealed statistically significant differences in breakpoint between the two groups. Additionally, a novel translation of the Kaplan-Meier survival-curve analyses for use within a demand curve framework allowed for examination of elasticity of workforce attrition. Results indicate greater inelastic attrition in the $0.05 group. We discuss the benefits of a behavioral economic approach to modeling employee behavior, how the metrics obtained from the elasticity of workforce attrition analyses (e.g., Pmax) may be used to set goals for employee behavior while balancing organizational costs, and how economy type may have influenced observed outcomes. © 2016 Society for the Experimental Analysis of Behavior.
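The Pmax metric mentioned above is the price at which demand shifts from inelastic to elastic (unit elasticity). A sketch using the Hursh-Silberberg exponential demand equation, with invented parameter values (q0, alpha, and k below are illustrative, not fitted to the study's data):

```python
import math

def demand(c, q0, alpha, k=2.0):
    """Hursh-Silberberg exponential demand: consumption at unit cost c."""
    return q0 * 10 ** (k * (math.exp(-alpha * q0 * c) - 1))

def pmax(q0, alpha, k=2.0):
    """Price of unit elasticity, found by scanning log-spaced prices."""
    c = 1e-6
    while c < 1e6:
        # local elasticity d ln Q / d ln C over a 1% price step
        e = (math.log(demand(c * 1.01, q0, alpha, k)) -
             math.log(demand(c, q0, alpha, k))) / math.log(1.01)
        if e <= -1:
            return c
        c *= 1.01
    raise ValueError("demand never reaches unit elasticity")

p_max_est = pmax(100, 0.003)   # q0 and alpha are invented for illustration
```

Below Pmax, raising the response requirement still increases total output (inelastic demand); above it, output falls, which is the sense in which Pmax can anchor a cost-balanced performance goal.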
Does Educational Status Impact Adult Mortality in Denmark? A Twin Approach
Madsen, Mia; Andersen, Anne-Marie Nybo; Christensen, Kaare; Andersen, Per Kragh; Osler, Merete
2010-01-01
To disentangle an independent effect of educational status on mortality risk from direct and indirect selection mechanisms, the authors used a discordant twin pair design, which allowed them to isolate the effect of education by means of adjustment for genetic and environmental confounding per design. The study is based on data from the Danish Twin Registry and Statistics Denmark. Using Cox regression, they estimated hazard ratios for mortality according to the highest attained education among 5,260 monozygotic and 11,088 dizygotic same-sex twin pairs born during 1921–1950 and followed during 1980–2008. Both standard cohort and intrapair analyses were conducted separately for zygosity, gender, and birth cohort. Educational differences in mortality were demonstrated in the standard cohort analyses but attenuated in the intrapair analyses in all subgroups but men born during 1921–1935, and no effect modification by zygosity was observed. Hence, the results are most compatible with an effect of early family environment in explaining the educational inequality in mortality. However, large educational differences were still reflected in mortality risk differences within twin pairs, thus supporting some degree of independent effect of education. In addition, the effect of education may be more pronounced in older cohorts of Danish men. PMID:20530466
Responding to Nonwords in the Lexical Decision Task: Insights from the English Lexicon Project
Yap, Melvin J.; Sibley, Daragh E.; Balota, David A.; Ratcliff, Roger; Rueckl, Jay
2014-01-01
Researchers have extensively documented how various statistical properties of words (e.g., word-frequency) influence lexical processing. However, the impact of lexical variables on nonword decision-making performance is less clear. This gap is surprising, since a better specification of the mechanisms driving nonword responses may provide valuable insights into early lexical processes. In the present study, item-level and participant-level analyses were conducted on the trial-level lexical decision data for almost 37,000 nonwords in the English Lexicon Project in order to identify the influence of different psycholinguistic variables on nonword lexical decision performance, and to explore individual differences in how participants respond to nonwords. Item-level regression analyses reveal that nonword response time was positively correlated with number of letters, number of orthographic neighbors, number of affixes, and baseword number of syllables, and negatively correlated with Levenshtein orthographic distance and baseword frequency. Participant-level analyses also point to within- and between-session stability in nonword responses across distinct sets of items, and intriguingly reveal that higher vocabulary knowledge is associated with less sensitivity to some dimensions (e.g., number of letters) but more sensitivity to others (e.g., baseword frequency). The present findings provide well-specified and interesting new constraints for informing models of word recognition and lexical decision. PMID:25329078
NASA Astrophysics Data System (ADS)
Bovier, Anton
2006-06-01
Our mathematical understanding of the statistical mechanics of disordered systems is going through a period of stunning progress. This self-contained book is a graduate-level introduction for mathematicians and for physicists interested in the mathematical foundations of the field, and can be used as a textbook for a two-semester course on mathematical statistical mechanics. It assumes only basic knowledge of classical physics and, on the mathematics side, a good working knowledge of graduate-level probability theory. The book starts with a concise introduction to statistical mechanics, proceeds to disordered lattice spin systems, and concludes with a presentation of the latest developments in the mathematical understanding of mean-field spin glass models. In particular, recent progress towards a rigorous understanding of the replica symmetry-breaking solutions of the Sherrington-Kirkpatrick spin glass models, due to Guerra, Aizenman-Sims-Starr and Talagrand, is reviewed in some detail. The book offers a comprehensive introduction to an active and fascinating area of research, with a clear exposition that builds to the state of the art in the mathematics of spin glasses, written by a well-known and active researcher in the field.
Lo Iacono, Giovanni; Armstrong, Ben; Fleming, Lora E.; Elson, Richard; Kovats, Sari; Vardoulakis, Sotiris; Nichols, Gordon L.
2017-01-01
Infectious diseases attributable to unsafe water supply, sanitation and hygiene (e.g. Cholera, Leptospirosis, Giardiasis) remain an important cause of morbidity and mortality, especially in low-income countries. Climate and weather factors are known to affect the transmission and distribution of infectious diseases, and statistical and mathematical modelling methods are continuously being developed to investigate the impact of weather and climate on water-associated diseases. There have been few critical analyses of the methodological approaches. Our objective is to review and summarize statistical and modelling methods used to investigate the effects of weather and climate on infectious diseases associated with water, in order to identify limitations and knowledge gaps for the development of new methods. We conducted a systematic review of English-language papers published from 2000 to 2015. Search terms included concepts related to water-associated diseases, weather and climate, and statistical, epidemiological and modelling methods. We found 102 full-text papers that met our criteria and were included in the analysis. The most commonly used methods were grouped in two clusters: process-based models (PBM) and time series and spatial epidemiology (TS-SE). In general, PBM methods were employed when the bio-physical mechanism of the pathogen under study was relatively well known (e.g. Vibrio cholerae); TS-SE tended to be used when the specific environmental mechanisms were unclear (e.g. Campylobacter). Important data and methodological challenges emerged, with implications for surveillance and control of water-associated infections. The most common limitations comprised: non-inclusion of key factors (e.g. biological mechanism, demographic heterogeneity, human behavior), reporting bias, poor data quality, and collinearity in exposures. Furthermore, the methods often did not distinguish among the multiple sources of time-lags (e.g.
patient physiology, reporting bias, healthcare access) between environmental drivers/exposures and disease detection. Key areas of future research include: disentangling the complex effects of weather/climate on each exposure-health outcome pathway (e.g. person-to-person vs environment-to-person), and linking weather data to individual cases longitudinally. PMID:28604791
2012-01-01
Background Huntington’s disease (HD) is a fatal progressive neurodegenerative disorder caused by the expansion of the polyglutamine repeat region in the huntingtin gene. Although the disease is triggered by the mutation of a single gene, intensive research has linked numerous other genes to its pathogenesis. To obtain a systematic overview of these genes, which may serve as therapeutic targets, CHDI Foundation has recently established the HD Research Crossroads database. With currently over 800 cataloged genes, this web-based resource constitutes the most extensive curation of genes relevant to HD. It provides us with an unprecedented opportunity to survey molecular mechanisms involved in HD in a holistic manner. Methods To gain a synoptic view of therapeutic targets for HD, we have carried out a variety of bioinformatical and statistical analyses to scrutinize the functional association of genes curated in the HD Research Crossroads database. In particular, enrichment analyses were performed with respect to Gene Ontology categories, KEGG signaling pathways, and Pfam protein families. For selected processes, we also analyzed differential expression, using published microarray data. Additionally, we generated a candidate set of novel genetic modifiers of HD by combining information from the HD Research Crossroads database with previous genome-wide linkage studies. Results Our analyses led to a comprehensive identification of molecular mechanisms associated with HD. Remarkably, we not only recovered processes and pathways, which have frequently been linked to HD (such as cytotoxicity, apoptosis, and calcium signaling), but also found strong indications for other potentially disease-relevant mechanisms that have been less intensively studied in the context of HD (such as the cell cycle and RNA splicing, as well as Wnt and ErbB signaling). 
For follow-up studies, we provide a regularly updated compendium of molecular mechanisms associated with HD at http://hdtt.sysbiolab.eu. Additionally, we derived a candidate set of 24 novel genetic modifiers, including histone deacetylase 3 (HDAC3), metabotropic glutamate receptor 1 (GRM1), CDK5 regulatory subunit 2 (CDK5R2), and coactivator 1β of the peroxisome proliferator-activated receptor gamma (PPARGC1B). Conclusions The results of our study give us an intriguing picture of the molecular complexity of HD. Our analyses can be seen as a first step towards a comprehensive list of biological processes, molecular functions, and pathways involved in HD, and may provide a basis for the development of more holistic disease models and new therapeutics. PMID:22741533
Coordinate based random effect size meta-analysis of neuroimaging studies.
Tench, C R; Tanasescu, Radu; Constantinescu, C S; Auer, D P; Cottam, W J
2017-06-01
Low power in neuroimaging studies can make them difficult to interpret, and coordinate based meta-analysis (CBMA) may go some way to mitigating this issue. CBMA has been used in many analyses to detect where published functional MRI or voxel-based morphometry studies testing similar hypotheses report significant summary results (coordinates) consistently. Only the reported coordinates and possibly t statistics are analysed, and statistical significance of clusters is determined by coordinate density. Here a method of performing coordinate based random effect size meta-analysis and meta-regression is introduced. The algorithm (ClusterZ) analyses both coordinates and the reported t statistic or Z score, standardised by the number of subjects. Statistical significance is determined not by coordinate density, but by a random-effects meta-analysis of reported effects performed cluster-wise using standard statistical methods and taking account of the censoring inherent in the published summary results. Type 1 error control is achieved using the false cluster discovery rate (FCDR), which is based on the false discovery rate. This controls both the family-wise error rate under the null hypothesis that coordinates are randomly drawn from a standard stereotaxic space, and the proportion of significant clusters that are expected under the null. Such control is necessary to avoid propagating, and even amplifying, the very issues motivating the meta-analysis in the first place. ClusterZ is demonstrated on both numerically simulated data and on real data from reports of grey matter loss in multiple sclerosis (MS) and syndromes suggestive of MS, and of painful stimulus in healthy controls. The software implementation is available to download and use freely. Copyright © 2017 Elsevier Inc. All rights reserved.
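The cluster-wise pooling step described in the abstract above is, at its core, a standard random-effects meta-analysis of per-study effects. The following is a minimal sketch of DerSimonian-Laird pooling, not ClusterZ itself (which additionally handles censoring and FCDR control); the function name and example values are illustrative:

```python
import math

def random_effects_meta(effects, variances):
    """DerSimonian-Laird random-effects pooling of per-study effect sizes."""
    w = [1.0 / v for v in variances]                       # inverse-variance weights
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    # Cochran's Q and the method-of-moments between-study variance tau^2
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)
    # Re-weight including tau^2 and pool
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * e for wi, e in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return pooled, se, tau2
```

In a CBMA setting, `effects` and `variances` would come from the standardised effects of the studies contributing coordinates to one spatial cluster.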
Competitive Processes in Cross-Situational Word Learning
Yurovsky, Daniel; Yu, Chen; Smith, Linda B.
2013-01-01
Cross-situational word learning, like any statistical learning problem, involves tracking the regularities in the environment. But the information that learners pick up from these regularities is dependent on their learning mechanism. This paper investigates the role of one type of mechanism in statistical word learning: competition. Competitive mechanisms would allow learners to find the signal in noisy input, and would help to explain the speed with which learners succeed in statistical learning tasks. Because cross-situational word learning provides information at multiple scales – both within and across trials/situations –learners could implement competition at either or both of these scales. A series of four experiments demonstrate that cross-situational learning involves competition at both levels of scale, and that these mechanisms interact to support rapid learning. The impact of both of these mechanisms is then considered from the perspective of a process-level understanding of cross-situational learning. PMID:23607610
A Conditional Curie-Weiss Model for Stylized Multi-group Binary Choice with Social Interaction
NASA Astrophysics Data System (ADS)
Opoku, Alex Akwasi; Edusei, Kwame Owusu; Ansah, Richard Kwame
2018-04-01
This paper proposes a conditional Curie-Weiss model as a model for decision making in a stylized society made up of binary decision makers who face a dichotomous choice between two options. Following Brock and Durlauf (Discrete choice with social interactions I: theory, 1995), we set up both socio-economic and statistical mechanical models for the choice problem. We point out when the socio-economic and statistical mechanical models give rise to the same self-consistent equilibrium mean choice level(s). The phase diagram of the associated statistical mechanical model and its socio-economic implications are discussed.
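In the standard Curie-Weiss setting, the self-consistent equilibrium mean choice level mentioned above solves m = tanh(β(Jm + h)). A minimal fixed-point sketch, assuming that standard notation (β inverse temperature, J interaction strength, h external field; the function name is illustrative):

```python
import math

def mean_choice(beta, J, h, m0=0.5, tol=1e-10, max_iter=10000):
    """Solve the Curie-Weiss self-consistency equation m = tanh(beta*(J*m + h))
    by fixed-point iteration starting from m0."""
    m = m0
    for _ in range(max_iter):
        m_new = math.tanh(beta * (J * m + h))
        if abs(m_new - m) < tol:
            return m_new
        m = m_new
    return m
```

For βJ < 1 (and h = 0) the iteration collapses to the single equilibrium m = 0; for βJ > 1 it converges to one of the two nonzero equilibria, the phase-transition structure the abstract alludes to.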
Welch, Kyle J; Hastings-Hauss, Isaac; Parthasarathy, Raghuveer; Corwin, Eric I
2014-04-01
We have constructed a macroscopic driven system of chaotic Faraday waves whose statistical mechanics, we find, are surprisingly simple, mimicking those of a thermal gas. We use real-time tracking of a single floating probe, energy equipartition, and the Stokes-Einstein relation to define and measure a pseudotemperature and diffusion constant and then self-consistently determine a coefficient of viscous friction for a test particle in this pseudothermal gas. Because of its simplicity, this system can serve as a model for direct experimental investigation of nonequilibrium statistical mechanics, much as the ideal gas epitomizes equilibrium statistical mechanics.
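The pseudotemperature-to-friction chain in the abstract rests on the Einstein relation D = kT/γ together with a diffusion constant measured from probe tracking. A minimal sketch of that measurement pipeline, under the assumption of simple 2-D Brownian motion (function and variable names are illustrative, not the authors' code):

```python
import math
import random

def diffusion_from_track(xs, ys, dt):
    """Estimate a 2-D diffusion constant from a single probe trajectory
    using the lag-1 mean squared displacement: <dr^2> = 4 * D * dt."""
    steps = len(xs) - 1
    msd = sum((xs[i + 1] - xs[i]) ** 2 + (ys[i + 1] - ys[i]) ** 2
              for i in range(steps)) / steps
    return msd / (4.0 * dt)

def friction_coefficient(kT, D):
    """Einstein relation D = kT / gamma, inverted for the friction coefficient."""
    return kT / D
```

Given a pseudotemperature from energy equipartition and D from the track, the friction coefficient of the test particle follows self-consistently, mirroring the procedure described above.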
Dwan, Kerry; Altman, Douglas G.; Clarke, Mike; Gamble, Carrol; Higgins, Julian P. T.; Sterne, Jonathan A. C.; Williamson, Paula R.; Kirkham, Jamie J.
2014-01-01
Background Most publications about selective reporting in clinical trials have focussed on outcomes. However, selective reporting of analyses for a given outcome may also affect the validity of findings. If analyses are selected on the basis of the results, reporting bias may occur. The aims of this study were to review and summarise the evidence from empirical cohort studies that assessed discrepant or selective reporting of analyses in randomised controlled trials (RCTs). Methods and Findings A systematic review was conducted and included cohort studies that assessed any aspect of the reporting of analyses of RCTs by comparing different trial documents, e.g., protocol compared to trial report, or different sections within a trial publication. The Cochrane Methodology Register, Medline (Ovid), PsycInfo (Ovid), and PubMed were searched on 5 February 2014. Two authors independently selected studies, performed data extraction, and assessed the methodological quality of the eligible studies. Twenty-two studies (containing 3,140 RCTs) published between 2000 and 2013 were included. Twenty-two studies reported on discrepancies between information given in different sources. Discrepancies were found in statistical analyses (eight studies), composite outcomes (one study), the handling of missing data (three studies), unadjusted versus adjusted analyses (three studies), handling of continuous data (three studies), and subgroup analyses (12 studies). Discrepancy rates varied, ranging from 7% (3/42) to 88% (7/8) in statistical analyses, 46% (36/79) to 82% (23/28) in adjusted versus unadjusted analyses, and 61% (11/18) to 100% (25/25) in subgroup analyses. This review is limited in that none of the included studies investigated the evidence for bias resulting from selective reporting of analyses. It was not possible to combine studies to provide overall summary estimates, and so the results of studies are discussed narratively. 
Conclusions Discrepancies in analyses between publications and other study documentation were common, but reasons for these discrepancies were not discussed in the trial reports. To ensure transparency, protocols and statistical analysis plans need to be published, and investigators should adhere to these or explain discrepancies. Please see later in the article for the Editors' Summary PMID:24959719
Krifka, Stephanie; Stangl, Martin; Wiesbauer, Sarah; Hiller, Karl-Anton; Schmalz, Gottfried; Federlin, Marianne
2009-09-01
No information is available to date about the cusp design of thin (1.0 mm) non-functional cusps and its influence upon (1) the marginal integrity of ceramic inlays (CI) and partial ceramic crowns (PCC) and (2) crack formation in dental tissues. The aim of this in vitro study was to investigate the effect of cusp coverage of thin non-functional cusps on marginal integrity and enamel crack formation. CI and PCC preparations were performed on extracted human molars. Non-functional cusps were adjusted to 1.0-mm wall thickness and to 1.0-mm wall thickness with horizontal reduction of about 2.0 mm. Ceramic restorations (Vita Mark II, Cerec3 System) were adhesively luted with Excite/Variolink II. The specimens were exposed to thermocycling and central mechanical loading. Marginal integrity was assessed by evaluating dye penetration after thermal cycling and mechanical loading. Enamel cracks were documented under a reflected-light microscope. The data were statistically analysed with the Mann-Whitney U test, Fisher's exact test (alpha = 0.05) and the error rates method. PCC with horizontal reduction of non-functional cusps showed statistically significantly less microleakage than PCC without such cusp coverage. Preparation designs with horizontal reduction of non-functional cusps showed a tendency towards less enamel crack formation than preparation designs without cusp coverage. Thin non-functional cusp walls of adhesively bonded restorations should be completely covered or reduced to avoid enamel cracks and marginal deficiency.
ERIC Educational Resources Information Center
Tractenberg, Rochelle E.
2017-01-01
Statistical literacy is essential to an informed citizenry; and two emerging trends highlight a growing need for training that achieves this literacy. The first trend is towards "big" data: while automated analyses can exploit massive amounts of data, the interpretation--and possibly more importantly, the replication--of results are…
Using DEWIS and R for Multi-Staged Statistics e-Assessments
ERIC Educational Resources Information Center
Gwynllyw, D. Rhys; Weir, Iain S.; Henderson, Karen L.
2016-01-01
We demonstrate how the DEWIS e-Assessment system may use embedded R code to facilitate the assessment of students' ability to perform involved statistical analyses. The R code has been written to emulate SPSS output and thus the statistical results for each bespoke data set can be generated efficiently and accurately using standard R routines.…
Jeffrey P. Prestemon
2009-01-01
Timber product markets are subject to large shocks deriving from natural disturbances and policy shifts. Statistical modeling of shocks is often done to assess their economic importance. In this article, I simulate the statistical power of univariate and bivariate methods of shock detection using time series intervention models. Simulations show that bivariate methods...
The fossil record of evolution: Analysis of extinction
NASA Technical Reports Server (NTRS)
Raup, D. M.
1986-01-01
There is increasing evidence that events in space have had direct effects on Earth history and on the history of life on Earth. Nowhere is this more evident than in mass extinction. The biosphere has undergone repeated devastation caused by relatively short-lived environmental stress, with species kill rates of up to 80 and 95%. For five of the mass extinctions, geochemical or other evidence was reported suggesting large-body impact as the cause of the environmental stress producing the extinctions. It was argued on statistical grounds that the major extinction events are uniformly periodic in geological time. If it is true that large-body impact is a principal cause of mass extinctions and if the periodicity is real, then a cosmic driving mechanism is inescapable. Paleontological data sets were developed which detail the ranges in geological time of about 4,000 families and 25,000 genera of fossil marine organisms. Analyses to date have concentrated on the most recent 250 million years. Associated with these studies are analyses of other aspects of Earth history which may have signatures indicative of extraterrestrial effects.
NASA Technical Reports Server (NTRS)
Komar, P. D.
1984-01-01
The diversity of proposed origins for the large Martian outflow channels results from the differing interpretations given to the landforms associated with them. In an attempt to help constrain the possible mechanisms of channel erosion, detailed studies were made of three of the channel features: the streamlined islands, longitudinal grooves, and scour marks. This examination involved a comparison of the Martian streamlined islands with various streamlined landforms on Earth, including those found in the Channel Scabland and in large rivers, glacial drumlins, and desert yardangs. The comparisons included statistical analyses of the landform lengths versus widths and positions of maximum width, and an examination of the degree of shape agreement with the geometric lemniscate, which was in turn demonstrated to correspond closely with true airfoil shapes. The analyses showed that the shapes of the Martian islands correspond closely to those of the streamlined islands in rivers and the Channel Scabland. Drumlins show a much smaller correlation. Erosional rock islands formed by glaciers are very different in shape.
Influences of roughness on the inertial mechanism of turbulent boundary-layer scale separation
NASA Astrophysics Data System (ADS)
Ebner, Rachel
Measurements and scaling analyses are conducted to clarify the combined effects of roughness and Reynolds number on momentum transport in the rough-wall, zero-pressure-gradient turbulent boundary layer. A series of multi-sensor hot-wire experiments are presented that cover nearly a decade in Reynolds number and nearly three decades in the inner-normalized sand grain roughness. This dissertation utilizes the difference between two velocity-vorticity correlations to represent the turbulent inertia term in the statement of the mean dynamics for turbulent boundary layer flow. Analyses focus on the first term on the right-hand side of the equation, because it is physically affiliated with change-of-scale effects (Tennekes and Lumley, 1972). Similarity analysis, streamwise correlations, and spectral methods are used to elucidate the scaling behaviors of the turbulent inertia term relative to the mean dynamics. The present results reveal complex behaviors in the long-time statistics of the velocity-vorticity correlations that exhibit both Reynolds number and roughness dependencies. The results broadly support the combined roughness-Reynolds number description provided by Mehdi et al. (2013).
Glavičić, Snježana; Anić, Ivica; Braut, Alen; Miletić, Ivana; Borčić, Josipa
2011-08-01
The purpose was to measure and analyse the vertical force and torque developed in wider and narrower root canals during hand ProTaper instrumentation. Twenty human incisors were divided into two groups: upper incisors served as the experimental model for wide root canals, and lower incisors for narrow root canals. Measurements of force and torque were made with a device constructed for this purpose. Differences between the groups were statistically analysed by the Mann-Whitney U-test with the significance level set at P<0.05. Vertical force in the upper incisors ranged from 0.25 to 2.58 N, while in the lower incisors it ranged from 0.38 to 6.94 N. Measured torque in the upper incisors ranged from 0.53 to 12.03 Nmm, while in the lower incisors it ranged from 0.94 to 10.0 Nmm. Vertical force and torque were higher in root canals of smaller diameter. An increase in the contact surface results in an increase in the vertical force and torque in both narrower and wider root canals. © 2010 The Authors. Australian Endodontic Journal © 2010 Australian Society of Endodontology.
Acoustic Features Influence Musical Choices Across Multiple Genres
Barone, Michael D.; Bansal, Jotthi; Woolhouse, Matthew H.
2017-01-01
Based on a large behavioral dataset of music downloads, two analyses investigate whether the acoustic features of listeners' preferred musical genres influence their choice of tracks within non-preferred, secondary musical styles. Analysis 1 identifies feature distributions for pairs of genre-defined subgroups that are distinct. Using correlation analysis, these distributions are used to test the degree of similarity between subgroups' main genres and the other music within their download collections. Analysis 2 explores the issue of main-to-secondary genre influence through the production of 10 feature-influence matrices, one per acoustic feature, in which cell values indicate the percentage change in features for genres and subgroups compared to overall population averages. In total, 10 acoustic features and 10 genre-defined subgroups are explored within the two analyses. Results strongly indicate that the acoustic features of people's main genres influence the tracks they download within non-preferred, secondary musical styles. The nature of this influence and its possible actuating mechanisms are discussed with respect to research on musical preference, personality, and statistical learning. PMID:28725200
Fracture Mechanics Analyses of Reinforced Carbon-Carbon Wing-Leading-Edge Panels
NASA Technical Reports Server (NTRS)
Raju, Ivatury S.; Phillips, Dawn R.; Knight, Norman F., Jr.; Song, Kyongchan
2010-01-01
Fracture mechanics analyses of subsurface defects within the joggle regions of the Space Shuttle wing-leading-edge RCC panels are performed. A 2D plane strain idealized joggle finite element model is developed to study the fracture behavior of the panels for three distinct loading conditions - lift-off and ascent, on-orbit, and entry. For lift-off and ascent, an estimated bounding aerodynamic pressure load is used for the analyses, while for on-orbit and entry, thermo-mechanical analyses are performed using the extreme cold and hot temperatures experienced by the panels. In addition, a best estimate for the material stress-free temperature is used in the thermo-mechanical analyses. In the finite element models, the substrate and coating are modeled separately as two distinct materials. Subsurface defects are introduced at the coating-substrate interface and within the substrate. The objective of the fracture mechanics analyses is to evaluate the defect driving forces, which are characterized by the strain energy release rates, and determine if defects can become unstable for each of the loading conditions.
Huvane, Jacqueline; Komarow, Lauren; Hill, Carol; Tran, Thuy Tien T; Pereira, Carol; Rosenkranz, Susan L; Finnemeyer, Matt; Earley, Michelle; Jiang, Hongyu Jeanne; Wang, Rui; Lok, Judith; Evans, Scott R
2017-03-15
The Statistical and Data Management Center (SDMC) provides the Antibacterial Resistance Leadership Group (ARLG) with statistical and data management expertise to advance the ARLG research agenda. The SDMC is active at all stages of a study, including design; data collection and monitoring; data analyses and archival; and publication of study results. The SDMC enhances the scientific integrity of ARLG studies through the development and implementation of innovative and practical statistical methodologies and by educating research colleagues regarding the application of clinical trial fundamentals. This article summarizes the challenges and roles, as well as the innovative contributions in the design, monitoring, and analyses of clinical trials and diagnostic studies, of the ARLG SDMC. © The Author 2017. Published by Oxford University Press for the Infectious Diseases Society of America. All rights reserved. For permissions, e-mail: journals.permissions@oup.com.
P-MartCancer-Interactive Online Software to Enable Analysis of Shotgun Cancer Proteomic Datasets.
Webb-Robertson, Bobbie-Jo M; Bramer, Lisa M; Jensen, Jeffrey L; Kobold, Markus A; Stratton, Kelly G; White, Amanda M; Rodland, Karin D
2017-11-01
P-MartCancer is an interactive web-based software environment that enables statistical analyses of peptide or protein data, quantitated from mass spectrometry-based global proteomics experiments, without requiring in-depth knowledge of statistical programming. P-MartCancer offers a series of statistical modules associated with quality assessment, peptide and protein statistics, protein quantification, and exploratory data analyses, driven by the user via customized workflows and interactive visualization. Currently, P-MartCancer offers access and the capability to analyze multiple cancer proteomic datasets generated through the Clinical Proteomics Tumor Analysis Consortium at the peptide, gene, and protein levels. P-MartCancer is deployed as a web service (https://pmart.labworks.org/cptac.html) and is alternatively available via Docker Hub (https://hub.docker.com/r/pnnl/pmart-web/). Cancer Res; 77(21); e47-50. ©2017 American Association for Cancer Research.
Glass-Kaastra, Shiona K.; Pearl, David L.; Reid-Smith, Richard J.; McEwen, Beverly; Slavic, Durda; McEwen, Scott A.; Fairles, Jim
2014-01-01
Antimicrobial susceptibility data on Escherichia coli F4, Pasteurella multocida, and Streptococcus suis isolates from Ontario swine (January 1998 to October 2010) were acquired from a comprehensive diagnostic veterinary laboratory in Ontario, Canada. In relation to the possible development of a surveillance system for antimicrobial resistance, data were assessed for ease of management, completeness, consistency, and applicability for temporal and spatial statistical analyses. Limited farm location data precluded spatial analyses and missing demographic data limited their use as predictors within multivariable statistical models. Changes in the standard panel of antimicrobials used for susceptibility testing reduced the number of antimicrobials available for temporal analyses. Data consistency and quality could improve over time in this and similar diagnostic laboratory settings by encouraging complete reporting with sample submission and by modifying database systems to limit free-text data entry. These changes could make more statistical methods available for disease surveillance and cluster detection. PMID:24688133
Ratio index variables or ANCOVA? Fisher's cats revisited.
Tu, Yu-Kang; Law, Graham R; Ellison, George T H; Gilthorpe, Mark S
2010-01-01
Over 60 years ago Ronald Fisher demonstrated a number of potential pitfalls with statistical analyses using ratio variables. Nonetheless, these pitfalls are largely overlooked in contemporary clinical and epidemiological research, which routinely uses ratio variables in statistical analyses. This article aims to demonstrate how very different findings can be generated as a result of less than perfect correlations among the data used to generate ratio variables. These imperfect correlations result from measurement error and random biological variation. While the former can often be reduced by improvements in measurement, random biological variation is difficult to estimate and eliminate in observational studies. Moreover, wherever the underlying biological relationships among epidemiological variables are unclear, and hence the choice of statistical model is also unclear, the different findings generated by different analytical strategies can lead to contradictory conclusions. Caution is therefore required when interpreting analyses of ratio variables whenever the underlying biological relationships among the variables involved are unspecified or unclear. (c) 2009 John Wiley & Sons, Ltd.
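Fisher's pitfall can be reproduced in a few lines: dividing one noisy measurement by another induces a spurious correlation between the ratio and its denominator even when the two underlying measurements are statistically independent. A minimal simulation sketch (the distributions and function names are illustrative, not taken from the article):

```python
import math
import random

def pearson(a, b):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)

def spurious_ratio_correlation(n=5000, seed=1):
    """Correlate the ratio y/x with its denominator x when x and y are
    generated independently: the ratio inherits a negative association
    with x purely by construction."""
    rng = random.Random(seed)
    x = [rng.gauss(10.0, 2.0) for _ in range(n)]
    y = [rng.gauss(10.0, 2.0) for _ in range(n)]  # independent of x
    ratio = [yi / xi for yi, xi in zip(y, x)]
    return pearson(x, y), pearson(ratio, x)
```

The raw variables show essentially zero correlation, while the ratio index correlates strongly and negatively with its own denominator, which is exactly the kind of analytically generated "finding" the article warns about.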
Brands, H; Maassen, S R; Clercx, H J
1999-09-01
In this paper the applicability of a statistical-mechanical theory to freely decaying two-dimensional (2D) turbulence on a bounded domain is investigated. We consider an ensemble of direct numerical simulations in a square box with stress-free boundaries, with a Reynolds number that is of the same order as in experiments on 2D decaying Navier-Stokes turbulence. The results of these simulations are compared with the corresponding statistical equilibria, calculated from different stages of the evolution. It is shown that the statistical equilibria calculated from early times of the Navier-Stokes evolution do not correspond to the dynamical quasistationary states. At best, the global topological structure is correctly predicted from a relatively late time in the Navier-Stokes evolution, when the quasistationary state has almost been reached. This failure of the (basically inviscid) statistical-mechanical theory is related to viscous dissipation and net leakage of vorticity in the Navier-Stokes dynamics at moderate values of the Reynolds number.
Spin Glass a Bridge Between Quantum Computation and Statistical Mechanics
NASA Astrophysics Data System (ADS)
Ohzeki, Masayuki
2013-09-01
In this chapter, we present two fascinating topics lying between quantum information processing and statistical mechanics. First, we introduce an elaborate technique, the surface code, for preparing a particular quantum state with robustness against decoherence. Interestingly, the theoretical limitation of the surface code, the accuracy threshold for restoring the quantum state, has a close connection with the problem of the phase transition in a special model known as a spin glass, one of the most active research areas in statistical mechanics. The phase transition in spin glasses is an intractable problem, since we must deal with a many-body system whose complicated interactions change sign depending on the distance between spins. Fortunately, recent progress in spin-glass theory enables us to predict the precise location of the critical point at which the phase transition occurs. This means that statistical mechanics can be used to reveal one of the most interesting parts of quantum information processing. We show how to import this special tool from statistical mechanics into the problem of the accuracy threshold in quantum computation. Second, we present another interesting technique that exploits quantum nature: quantum annealing. The purpose of quantum annealing is to search for the most favored solution of a multivariable function, namely to solve an optimization problem. The most typical instance is the traveling salesman problem, which asks for the minimum-length tour visiting all the cities. In quantum annealing, we introduce quantum fluctuations to drive a system with an artificial Hamiltonian whose ground state represents the optimal solution of the specific problem we wish to solve. The quantum fluctuations give rise to the quantum tunneling effect, which allows nontrivial hopping from state to state. We then sketch a strategy for controlling the quantum fluctuations so as to reach the ground state efficiently.
Such a generic framework is called quantum annealing. The most typical instance is quantum adiabatic computation, based on the adiabatic theorem. Quantum adiabatic computation, as discussed in another chapter, unfortunately has a crucial bottleneck for some optimization problems. We introduce several recent attempts to overcome this weak point by using developments in statistical mechanics. Through both topics, we shed light on the birth of the interdisciplinary field between quantum mechanics and statistical mechanics.
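Quantum annealing replaces thermal fluctuations with quantum ones, but its classical ancestor, simulated annealing, shows the same three ingredients the abstract describes: a cost function, a fluctuation schedule, and stochastic hops between states. A deliberately classical sketch on a small Ising chain (this is Metropolis simulated annealing, not the quantum algorithm; the couplings and cooling schedule are illustrative):

```python
import math
import random

def energy(spins, J):
    """Energy of a 1-D Ising chain: E = -sum_i J[i] * s_i * s_{i+1}."""
    return -sum(J[i] * spins[i] * spins[i + 1] for i in range(len(J)))

def simulated_annealing(J, steps=20000, t_hot=2.0, t_cold=0.01, seed=0):
    """Single-spin-flip Metropolis dynamics under a geometric cooling schedule."""
    rng = random.Random(seed)
    n = len(J) + 1
    spins = [rng.choice([-1, 1]) for _ in range(n)]
    e = energy(spins, J)
    for k in range(steps):
        t = t_hot * (t_cold / t_hot) ** (k / steps)  # temperature at step k
        i = rng.randrange(n)
        spins[i] *= -1                               # propose a flip
        e_new = energy(spins, J)
        if e_new <= e or rng.random() < math.exp((e - e_new) / t):
            e = e_new                                # accept the flip
        else:
            spins[i] *= -1                           # reject and restore
    return spins, e
```

In quantum annealing the temperature schedule is replaced by a decreasing transverse field, and tunneling, rather than thermal activation, provides the hops between configurations.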
Physical Activity for Cognitive and Mental Health in Youth: A Systematic Review of Mechanisms.
Lubans, David; Richards, Justin; Hillman, Charles; Faulkner, Guy; Beauchamp, Mark; Nilsson, Michael; Kelly, Paul; Smith, Jordan; Raine, Lauren; Biddle, Stuart
2016-09-01
Physical activity can improve cognitive and mental health, but the underlying mechanisms have not been established. Our objectives were to present a conceptual model explaining the mechanisms by which physical activity affects cognitive and mental health in young people and to conduct a systematic review of the evidence. Six electronic databases (PubMed, PsycINFO, SCOPUS, Ovid Medline, SportDiscus, and Embase) were searched. School-, home-, or community-based physical activity interventions and laboratory-based exercise interventions were assessed. Studies were eligible if they reported statistical analyses of changes in the following: (1) cognition or mental health; and (2) neurobiological, psychosocial, and behavioral mechanisms. Data relating to methods, assessment period, participant characteristics, intervention type, setting, and facilitator/delivery were extracted. Twenty-five articles reporting results from 22 studies were included. Mechanisms studied were neurobiological (6 studies), psychosocial (18 studies), and behavioral (2 studies). Significant changes in at least 1 potential neurobiological mechanism were reported in 5 studies, and significant effects for at least 1 cognitive outcome were also found in 5 studies. One of 2 studies reported a significant effect for self-regulation, but neither study reported a significant impact on mental health. Limitations included the small number of studies and high levels of study heterogeneity. The strongest evidence was found for improvements in physical self-perceptions, which accompanied enhanced self-esteem in the majority of studies measuring these outcomes. Few studies examined neurobiological and behavioral mechanisms, and we were unable to draw conclusions regarding their role in enhancing cognitive and mental health. Copyright © 2016 by the American Academy of Pediatrics.
The inverse relationship between prostate-specific antigen (PSA) and obesity.
Aref, Adel; Vincent, Andrew D; O'Callaghan, Michael; Martin, Sean; Sutherland, Peter; Hoy, Andrew; Butler, Lisa M; Wittert, Gary
2018-06-25
Obese men have lower serum prostate-specific antigen (PSA) than comparably aged lean men, but the underlying mechanism remains unclear. The aim of this study was to determine the effect of obesity on PSA and the potential contributing mechanisms. We assessed a cohort of 1195 men aged 35 years and over at recruitment, with demographic, anthropometric (body mass index (BMI), waist circumference (WC)), serum hormone (testosterone (T), estradiol (E2)), PSA and hematology assessments obtained over two waves. Men with a history of prostate cancer or missing PSA were excluded, leaving 970 men for the final analysis. Mixed-effects regressions and mediation analyses adjusting for hormonal and volumetric factors were used to explore the potential mechanisms relating obesity to PSA. After adjusting for age, PSA levels were lower in men with greater WC (p=0.001). In a multivariable model including WC, age, E2/T and plasma volume (PlasV) as predictors, no statistically significant associations were observed between PSA and either WC (p=0.36) or PlasV (p=0.49), while strong associations were observed with both E2/T (p<0.001) and age (p<0.001). In the mediation analyses with PlasV as the mediator, the average causal mediation effect (ACME) explained roughly 20% of the total effect of WC on PSA (p=0.31), while with E2/T as the mediator, the ACME explained roughly 50% of the effect (p<0.001). Our findings indicate that the lower PSA levels of obese men, as compared to normal-weight men, can be explained both by hormonal changes (an elevated E2/T ratio) and by haemodilution. Hormonal factors therefore represent a substantial but underappreciated mediating pathway.
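The average causal mediation effect estimated above reduces, in the simplest linear case, to the classic product-of-coefficients mediated effect a*b. A small sketch on synthetic data (the coefficients 2, 3 and 1 are invented for illustration; this is not the study's dataset or model):

```python
import random

def ols(x_cols, y):
    """Least-squares coefficients for y ~ 1 + x_cols, via the normal
    equations and a tiny Gaussian elimination (fine for few predictors)."""
    n = len(y)
    cols = [[1.0] * n] + x_cols                      # prepend intercept column
    k = len(cols)
    A = [[sum(cols[i][t] * cols[j][t] for t in range(n)) for j in range(k)]
         for i in range(k)]
    rhs = [sum(cols[i][t] * y[t] for t in range(n)) for i in range(k)]
    for i in range(k):                               # forward elimination
        p = max(range(i, k), key=lambda r: abs(A[r][i]))
        A[i], A[p] = A[p], A[i]
        rhs[i], rhs[p] = rhs[p], rhs[i]
        for r in range(i + 1, k):
            f = A[r][i] / A[i][i]
            for c in range(i, k):
                A[r][c] -= f * A[i][c]
            rhs[r] -= f * rhs[i]
    beta = [0.0] * k                                 # back substitution
    for i in reversed(range(k)):
        beta[i] = (rhs[i] - sum(A[i][c] * beta[c]
                                for c in range(i + 1, k))) / A[i][i]
    return beta

# Synthetic example: true a = 2 (X -> M), b = 3 (M -> Y), direct c' = 1.
random.seed(1)
x = [random.gauss(0, 1) for _ in range(200)]
m = [2.0 * xi + random.gauss(0, 0.1) for xi in x]
y = [3.0 * mi + 1.0 * xi + random.gauss(0, 0.1) for mi, xi in zip(m, x)]

a_path = ols([x], m)[1]                  # mediator model: M ~ X
_, c_prime, b_path = ols([x, m], y)      # outcome model: Y ~ X + M
mediated = a_path * b_path               # product-of-coefficients effect, near 6
total = c_prime + mediated               # total effect, near 7
```

Causal mediation methods generalize this estimator and, as the abstract's confounder adjustments reflect, its causal reading relies on no unmeasured mediator-outcome confounding.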
ERIC Educational Resources Information Center
Kadhi, T.; Holley, D.; Rudley, D.; Garrison, P.; Green, T.
2010-01-01
The following report gives the statistical findings of the 2010 Thurgood Marshall School of Law (TMSL) Texas Bar results. These data were pre-existing and were provided to the evaluator by email from the Dean. In-depth statistical analyses were then run in SPSS 17 to address the following questions: 1. What are the statistical descriptors of the…
A statistical package for computing time and frequency domain analysis
NASA Technical Reports Server (NTRS)
Brownlow, J.
1978-01-01
The spectrum analysis (SPA) program is a general purpose digital computer program designed to aid in data analysis. The program does time and frequency domain statistical analyses as well as some preanalysis data preparation. The capabilities of the SPA program include linear trend removal and/or digital filtering of data, plotting and/or listing of both filtered and unfiltered data, time domain statistical characterization of data, and frequency domain statistical characterization of data.
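The preanalysis capabilities listed above (e.g., linear trend removal before frequency-domain characterization) can be illustrated with a short least-squares detrend; this is a generic sketch, not the SPA program's own code:

```python
def detrend(y):
    """Remove the least-squares linear trend from an evenly sampled series,
    a common preanalysis step before frequency-domain characterization."""
    n = len(y)
    t_mean = (n - 1) / 2.0
    y_mean = sum(y) / n
    sxy = sum((t - t_mean) * (v - y_mean) for t, v in enumerate(y))
    sxx = sum((t - t_mean) ** 2 for t in range(n))
    slope = sxy / sxx
    # Subtract the fitted line; the residual series has zero mean and trend.
    return [v - (y_mean + slope * (t - t_mean)) for t, v in enumerate(y)]

residual = detrend([0.5 * t + 1.0 for t in range(10)])  # a pure ramp detrends to ~0
```

Removing the trend first matters because a linear drift otherwise leaks power across the entire low-frequency end of a spectrum estimate.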
NASA Astrophysics Data System (ADS)
Rich, Grayson Currie
The COHERENT Collaboration has produced the first-ever observation, with a significance of 6.7σ, of a process consistent with coherent, elastic neutrino-nucleus scattering (CEνNS) as first predicted and described by D.Z. Freedman in 1974. The physics of the CEνNS process is presented along with its relationship to future measurements in the arenas of nuclear physics, fundamental particle physics, and astroparticle physics, where the newly observed interaction presents a viable tool for investigating numerous outstanding questions about the nature of the universe. To enable the CEνNS observation with a 14.6-kg CsI[Na] detector, new measurements of the response of CsI[Na] to low-energy nuclear recoils, the only mechanism by which CEνNS is detectable, were carried out at Triangle Universities Nuclear Laboratory; these measurements are detailed, and an effective nuclear-recoil quenching factor of (8.78 ± 1.66)% is established for CsI[Na] in the recoil-energy range of 5-30 keV, based on new and literature data. Following separate analyses of the CEνNS-search data by groups at the University of Chicago and the Moscow Engineering and Physics Institute, information from simulations, calculations, and ancillary measurements was used to inform statistical analyses of the collected data. Based on input from the Chicago analysis, the number of CEνNS events expected from the Standard Model is 173 ± 48; interpretation as a simple counting experiment finds 136 ± 31 CEνNS counts in the data, while a two-dimensional profile likelihood fit yields 134 ± 22 CEνNS counts. Details of the simulations, calculations, and supporting measurements are discussed, in addition to the statistical procedures. Finally, potential improvements to the CsI[Na]-based CEνNS measurement are presented along with future possibilities for the COHERENT Collaboration, including new CEνNS detectors and measurement of the neutrino-induced neutron spallation process.
Burkhart, Timothy A; Asa, Benjamin; Payne, Michael W C; Johnson, Marjorie; Dunning, Cynthia E; Wilson, Timothy D
2015-02-01
A result of below-knee amputations (BKAs) is abnormal motion that occurs about the proximal tibiofibular joint (PTFJ). While it is known that joint morphology may play a role in joint kinematics, this is not well understood with respect to the PTFJ. Therefore, the purposes of this study were: (i) to characterize the anatomy of the PTFJ and statistically analyze the relationships within the joint; and (ii) to determine the relationships between the PTFJ characteristics and the degree of movement of the fibula in BKAs. The PTFJ was characterized in 40 embalmed specimens disarticulated at the knee, and amputated through the mid-tibia and fibula. Four metrics were measured: inclination angle (angle at which the fibula articulates with the tibia); tibial and fibular articular surface areas; articular surface concavity and shape. The specimens were mechanically tested by applying a load through the biceps femoris tendon, and the degree of motion about the tibiofibular joint was measured. Regression analyses were performed to determine the relationships between the different PTFJ characteristics and the magnitude of fibular abduction. Finally, Pearson correlation analyses were performed on inclination angle and surface area vs. fibular kinematics. The inclination angle measured on the fibula was significantly greater than that measured on the tibia. This difference may be attributed to differences in concavity of the tibial and fibular surfaces. Surface area measured on the tibia and fibula was not statistically different. The inclination angle was not statistically correlated to surface area. However, when correlating fibular kinematics in BKAs, inclination angle was positively correlated to the degree of fibular abduction, whereas surface area was negatively correlated. The characteristics of the PTFJ dictate the amount of fibular movement, specifically, fibular abduction in BKAs. 
Predicting BKA complications based on PTFJ characteristics can lead to recommendations in treatment. © 2014 Anatomical Society.
Statistical mechanics of multipartite entanglement
NASA Astrophysics Data System (ADS)
Facchi, P.; Florio, G.; Marzolino, U.; Parisi, G.; Pascazio, S.
2009-02-01
We characterize the multipartite entanglement of a system of n qubits in terms of the distribution function of the bipartite purity over all balanced bipartitions. We search for those (maximally multipartite entangled) states whose purity is minimum for all bipartitions and recast this optimization problem into a problem of statistical mechanics.
An Experimental Approach to Teaching and Learning Elementary Statistical Mechanics
ERIC Educational Resources Information Center
Ellis, Frank B.; Ellis, David C.
2008-01-01
Introductory statistical mechanics is studied for a simple two-state system using an inexpensive and easily built apparatus. A large variety of demonstrations, suitable for students in high school and introductory university chemistry courses, are possible. This article details demonstrations for exothermic and endothermic reactions, the dynamic…
Orbital Roof Fractures as an Indicator for Concomitant Ocular Injury
2017-11-12
Comparisons between these two groups used Pearson's χ² test or Fisher's exact test to identify statistically significant differences in mechanism of injury, subjective symptoms, and CT and examination findings.
NASA Astrophysics Data System (ADS)
Gebremicael, Tesfay G.; Mohamed, Yasir A.; Zaag, Pieter v.; Hagos, Eyasu Y.
2017-04-01
The Upper Tekezē-Atbara river sub-basin, part of the Nile Basin, is characterized by high temporal and spatial variability of rainfall and streamflow. In spite of its importance for sustainable water use and food security, the changing pattern of streamflow and its association with climate change are not well understood. This study aims to improve the understanding of the linkages between rainfall and streamflow trends and to identify possible drivers of streamflow variability in the basin. Trends and change points in rainfall and streamflow were detected using the Mann-Kendall and Pettitt tests, respectively, applied to records from 21 rainfall and 9 streamflow stations. The nature of the changes and the linkages between rainfall and streamflow were carefully examined for monthly, seasonal and annual flows, as well as for indicators of hydrologic alteration (IHA). The trend and change-point analyses found that 19 of the 21 tested rainfall stations did not show statistically significant changes. In contrast, trend analyses of the streamflow showed both significant increasing and decreasing patterns. A decreasing trend in the dry season (October to February), short rainy season (March to May), main rainy season (June to September) and annual totals is dominant in six of the nine stations. Only one of the nine gauging stations experienced significantly increasing flow in the dry and short rainy seasons, attributed to the construction of the Tekezē hydropower dam upstream of this station in 2009. Overall, streamflow trends and change-point timings were found to be inconsistent among the stations. Changes in streamflow without significant changes in rainfall suggest that factors other than rainfall drive the change. Most likely, the observed changes in streamflow regimes are due to changes in the catchment characteristics of the basin.
Further studies are needed to verify and quantify the hydrological changes shown in statistical tests by identifying the physical mechanisms behind those changes. The findings from this study are useful as a prerequisite for studying the effects of catchment management dynamics on the hydrological variabilities in the basin.
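The Mann-Kendall test applied above is rank-based: it counts concordant minus discordant pairs in the series and compares the result with its null variance. A minimal sketch (tie correction omitted for clarity):

```python
import math

def mann_kendall(x):
    """Mann-Kendall trend test: S counts concordant minus discordant pairs;
    Z is its normal approximation (no correction for ties)."""
    n = len(x)
    s = sum((x[j] > x[i]) - (x[j] < x[i])
            for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0   # null variance of S
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)          # continuity correction
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z

s_up, z_up = mann_kendall(list(range(20)))  # monotone series: S = n(n-1)/2 = 190
```

Because it uses only the ordering of observations, the test tolerates the skewed, non-Gaussian distributions typical of streamflow records.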
Lewis, Gregory F.; Furman, Senta A.; McCool, Martha F.; Porges, Stephen W.
2011-01-01
Three frequently used RSA metrics are investigated to document violations of assumptions for parametric analyses, moderation by respiration, influences of nonstationarity, and sensitivity to vagal blockade. Although all metrics are highly correlated, new findings illustrate that the metrics are noticeably different on the above dimensions. Only one method conforms to the assumptions for parametric analyses, is not moderated by respiration, is not influenced by nonstationarity, and reliably generates stronger effect sizes. Moreover, this method is also the most sensitive to vagal blockade. Specific features of this method may provide insights into improving the statistical characteristics of other commonly used RSA metrics. These data provide the evidence to question, based on statistical grounds, published reports using particular metrics of RSA. PMID:22138367
Confidence crisis of results in biomechanics research.
Knudson, Duane
2017-11-01
Many biomechanics studies have small sample sizes and incorrect statistical analyses, so reports of inaccurate inferences and inflated effect magnitudes are common in the field. This review examines these issues in biomechanics research and summarises potential solutions from research in other fields to increase confidence in the experimental effects reported in biomechanics. Authors, reviewers and editors of biomechanics research reports are encouraged to improve sample sizes and the resulting statistical power, improve reporting transparency, improve the rigour of the statistical analyses used, and increase the acceptance of replication studies to improve the validity of inferences from data in biomechanics research. The application of sports biomechanics research results would also improve if a larger percentage of unbiased effects and their uncertainty were reported in the literature.
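The sample-size and power concern raised above can be made concrete with a rough power calculation; this sketch uses a normal approximation to the two-sample t test, with illustrative effect sizes not taken from the review:

```python
import math
from statistics import NormalDist

def power_two_sample(d, n_per_group, alpha=0.05):
    """Approximate power of a two-sided two-sample comparison at effect
    size d (Cohen's d), using a normal approximation to the t test."""
    nd = NormalDist()
    z_a = nd.inv_cdf(1 - alpha / 2)             # two-sided critical value
    ncp = d * math.sqrt(n_per_group / 2.0)      # noncentrality of the test
    return (1 - nd.cdf(z_a - ncp)) + nd.cdf(-z_a - ncp)

p_small = power_two_sample(0.5, 20)   # a typical small biomechanics design
p_large = power_two_sample(0.5, 64)   # roughly the n needed for 80% power
```

At a "medium" effect of d = 0.5, 20 participants per group yields only about 35% power, while roughly 64 per group is needed for the conventional 80%, which illustrates why small studies so often report inflated effects.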
Bennett, Bradley C; Husby, Chad E
2008-03-28
Botanical pharmacopoeias are non-random subsets of floras, with some taxonomic groups over- or under-represented. Moerman [Moerman, D.E., 1979. Symbols and selectivity: a statistical analysis of Native American medical ethnobotany. Journal of Ethnopharmacology 1, 111-119] introduced linear regression/residual analysis to examine these patterns. However, regression, the commonly employed analysis, suffers from several statistical flaws. We use contingency table and binomial analyses to examine patterns of Shuar medicinal plant use (from Amazonian Ecuador). We first analyzed the Shuar data using Moerman's approach, modified to better meet the requirements of linear regression analysis. Second, we assessed the exact randomization contingency table test for goodness of fit. Third, we developed a binomial model to test for non-random selection of plants in individual families. Modified regression models (which accommodated the assumptions of linear regression) reduced R² from 0.59 to 0.38, but did not eliminate all problems associated with regression analyses. Contingency table analyses revealed that the entire flora departs from the null model of equal proportions of medicinal plants in all families. In the binomial analysis, only 10 angiosperm families (of 115) differed significantly from the null model. These 10 families are largely responsible for the patterns seen at higher taxonomic levels. Contingency table and binomial analyses offer an easy and statistically valid alternative to the regression approach.
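The binomial model described above asks whether a family's count of medicinal species is surprising given the flora-wide proportion of medicinal plants. A minimal sketch with invented counts (not the Shuar data):

```python
from math import comb

def binom_tail(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p): the upper-tail probability that a
    family contributes at least k medicinal species by chance alone."""
    return sum(comb(n, i) * p ** i * (1 - p) ** (n - i)
               for i in range(k, n + 1))

# Invented family: 30 of its 60 species are medicinal, against a
# flora-wide medicinal proportion of 25%.
p_over = binom_tail(30, 60, 0.25)   # far in the tail, so very small
```

Unlike the regression approach, this test is exact for each family and needs no distributional assumptions beyond independent selection under the null.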
Validating Future Force Performance Measures (Army Class): Concluding Analyses
2016-06-01
Table 3.10, Descriptive Statistics and Intercorrelations for LV Final Predictor Factor Scores; Table 4.7, Descriptive Statistics for Analysis Criteria. Predictors of Soldier attrition and performance: Dependability (Non-Delinquency), Adjustment, Physical Conditioning, Leadership, Work Orientation, and Agreeableness.
FHWA statistical program : a customer's guide to using highway statistics
DOT National Transportation Integrated Search
1995-08-01
The appropriate level of spatial and temporal data aggregation for highway vehicle emissions analyses is one of several important analytical questions that has received considerable interest following passage of the Clean Air Act Amendments (CAAA) of...
ParallABEL: an R library for generalized parallelization of genome-wide association studies
2010-01-01
Background Genome-Wide Association (GWA) analysis is a powerful method for identifying loci associated with complex traits and drug response. Parts of GWA analyses, especially those involving thousands of individuals and consuming hours to months, will benefit from parallel computation. It is arduous to acquire the necessary programming skills to correctly partition and distribute data, control and monitor tasks on clustered computers, and merge output files. Results Most components of GWA analysis can be divided into four groups based on the types of input data and statistical outputs. The first group contains statistics computed for a particular Single Nucleotide Polymorphism (SNP) or trait, such as SNP characterization statistics or association test statistics; the input data of this group comprises the individual SNPs/traits. The second group concerns statistics characterizing an individual in a study, for example the summary statistics of genotype quality for each sample; the input data of this group comprises the individuals. The third group consists of pair-wise statistics derived from analyses between each pair of individuals in the study, for example genome-wide identity-by-state or genomic kinship analyses; the input data of this group comprises pairs of individuals. The final group concerns pair-wise statistics derived for pairs of SNPs, such as linkage disequilibrium characterisation; the input data of this group comprises pairs of SNPs. We developed the ParallABEL library, which utilizes the Rmpi library, to parallelize these four types of computations. The ParallABEL library is not only aimed at GenABEL, but may also be employed to parallelize various GWA packages in R. The data set from the North American Rheumatoid Arthritis Consortium (NARAC), which includes 2,062 individuals genotyped at 545,080 SNPs, was used to measure ParallABEL performance. Almost perfect speed-up was achieved for many types of analyses.
For example, the computing time for the identity-by-state matrix was linearly reduced from approximately eight hours to one hour when ParallABEL employed eight processors. Conclusions Executing genome-wide association analysis using the ParallABEL library on a computer cluster is an effective way to boost performance, and simplify the parallelization of GWA studies. ParallABEL is a user-friendly parallelization of GenABEL. PMID:20429914
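The key observation behind the "first-group" parallelization is that per-SNP statistics are independent, so SNP columns can be partitioned across workers and the outputs merged in order. A conceptual Python analogue (ParallABEL itself is an R library built on Rmpi; the thread pool and the allele-frequency statistic here are illustrative stand-ins):

```python
from concurrent.futures import ThreadPoolExecutor

def snp_stat(column):
    """Per-SNP summary: allele frequency from 0/1/2 genotype codes."""
    return sum(column) / (2.0 * len(column))

def parallel_by_snp(genotypes, workers=4):
    """Map an independent per-SNP statistic across workers; map() returns
    results in input order, so merging is trivial."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(snp_stat, genotypes))

geno = [[0, 1, 2, 1], [2, 2, 2, 2], [0, 0, 0, 0]]   # 3 SNPs x 4 individuals
freqs = parallel_by_snp(geno)
```

The same partition-map-merge pattern covers the other three groups; only the unit of partitioning (individuals, pairs of individuals, or pairs of SNPs) changes.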
Weir, Christopher J; Butcher, Isabella; Assi, Valentina; Lewis, Stephanie C; Murray, Gordon D; Langhorne, Peter; Brady, Marian C
2018-03-07
Rigorous, informative meta-analyses rely on availability of appropriate summary statistics or individual participant data. For continuous outcomes, especially those with naturally skewed distributions, summary information on the mean or variability often goes unreported. While full reporting of original trial data is the ideal, we sought to identify methods for handling unreported mean or variability summary statistics in meta-analysis. We undertook two systematic literature reviews to identify methodological approaches used to deal with missing mean or variability summary statistics. Five electronic databases were searched, in addition to the Cochrane Colloquium abstract books and the Cochrane Statistics Methods Group mailing list archive. We also conducted cited reference searching and emailed topic experts to identify recent methodological developments. Details recorded included the description of the method, the information required to implement the method, any underlying assumptions and whether the method could be readily applied in standard statistical software. We provided a summary description of the methods identified, illustrating selected methods in example meta-analysis scenarios. For missing standard deviations (SDs), following screening of 503 articles, fifteen methods were identified in addition to those reported in a previous review. These included Bayesian hierarchical modelling at the meta-analysis level; summary statistic level imputation based on observed SD values from other trials in the meta-analysis; a practical approximation based on the range; and algebraic estimation of the SD based on other summary statistics. Following screening of 1124 articles for methods estimating the mean, one approximate Bayesian computation approach and three papers based on alternative summary statistics were identified. 
Illustrative meta-analyses showed that when replacing a missing SD the approximation using the range minimised loss of precision and generally performed better than omitting trials. When estimating missing means, a formula using the median, lower quartile and upper quartile performed best in preserving the precision of the meta-analysis findings, although in some scenarios, omitting trials gave superior results. Methods based on summary statistics (minimum, maximum, lower quartile, upper quartile, median) reported in the literature facilitate more comprehensive inclusion of randomised controlled trials with missing mean or variability summary statistics within meta-analyses.
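Two of the simplest methods of the kinds surveyed above can be written down directly. The range/4 rule for a missing SD and the (q1 + median + q3)/3 estimate of a missing mean are widely used approximations consistent with the approaches described; the review also covers more refined variants:

```python
def sd_from_range(minimum, maximum):
    """Range/4 rule: a practical approximation for a missing SD when only
    the minimum and maximum were reported."""
    return (maximum - minimum) / 4.0

def mean_from_quartiles(q1, median, q3):
    """(q1 + median + q3) / 3: a simple estimate of a missing mean from
    quartile summaries, usable for skewed outcomes."""
    return (q1 + median + q3) / 3.0

sd_hat = sd_from_range(10.0, 50.0)              # range 40 -> SD estimate 10
mean_hat = mean_from_quartiles(2.0, 4.0, 9.0)   # pulls the mean toward the skew
```

As the illustrative meta-analyses note, such imputations generally preserve more precision than omitting the affected trials, though their accuracy depends on the underlying distribution.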
Quantitative approaches in climate change ecology
Brown, Christopher J; Schoeman, David S; Sydeman, William J; Brander, Keith; Buckley, Lauren B; Burrows, Michael; Duarte, Carlos M; Moore, Pippa J; Pandolfi, John M; Poloczanska, Elvira; Venables, William; Richardson, Anthony J
2011-01-01
Contemporary impacts of anthropogenic climate change on ecosystems are increasingly being recognized. Documenting the extent of these impacts requires quantitative tools for analyses of ecological observations to distinguish climate impacts in noisy data and to understand interactions between climate variability and other drivers of change. To assist the development of reliable statistical approaches, we review the marine climate change literature and provide suggestions for quantitative approaches in climate change ecology. We compiled 267 peer-reviewed articles that examined relationships between climate change and marine ecological variables. Of the articles with time series data (n = 186), 75% used statistics to test for a dependency of ecological variables on climate variables. We identified several common weaknesses in statistical approaches, including marginalizing other important non-climate drivers of change, ignoring temporal and spatial autocorrelation, averaging across spatial patterns and not reporting key metrics. We provide a list of issues that need to be addressed to make inferences more defensible, including the consideration of (i) data limitations and the comparability of data sets; (ii) alternative mechanisms for change; (iii) appropriate response variables; (iv) a suitable model for the process under study; (v) temporal autocorrelation; (vi) spatial autocorrelation and patterns; and (vii) the reporting of rates of change. While the focus of our review was marine studies, these suggestions are equally applicable to terrestrial studies. Consideration of these suggestions will help advance global knowledge of climate impacts and understanding of the processes driving ecological change.
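Among the issues listed, temporal autocorrelation is especially easy to check before trusting an ordinary regression on a time series. A minimal lag-1 autocorrelation estimator (a generic sketch, not code from the review):

```python
def lag1_autocorr(x):
    """Lag-1 autocorrelation of a series: values near 0 support treating
    observations as independent; large values argue for time-series models
    or autocorrelation-robust errors."""
    n = len(x)
    mean = sum(x) / n
    num = sum((x[t] - mean) * (x[t + 1] - mean) for t in range(n - 1))
    den = sum((v - mean) ** 2 for v in x)
    return num / den

r_trend = lag1_autocorr([float(t) for t in range(50)])  # trending: strongly positive
```

Ignoring a large lag-1 correlation inflates the effective sample size and hence the apparent significance of climate-ecology relationships, one of the common weaknesses the review identifies.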
Soil cover characterization at large scale: the example of Perugia Province in central Italy
NASA Astrophysics Data System (ADS)
Fanelli, Giulia; Salciarini, Diana; Tamagnini, Claudio
2015-04-01
In recent years, physically-based models aimed at predicting the occurrence of landslides have become widespread, because landslide susceptibility maps can be essential for reducing damage and human losses. On the one hand, physically-based models analyse problems rationally, because they mathematically describe the physical processes that actually occur; on the other hand, their diffusion is limited by the difficulty of obtaining and managing accurate data over large areas. For this reason, and also because geotechnical data in the Perugia province are partial and not regularly distributed, a data collection campaign was started in order to build a broad physical-mechanical data set that can be used with any physically-based model. The collected data were derived from mechanical tests and investigations performed to characterize the soil. The data set includes about 3000 points, and each record contains the following quantitative information: coordinates, geological description, cohesion, and friction angle. In addition, the records contain the results of seismic tests that give the shear-wave velocity in the uppermost 30 meters of soil. The database covers the whole territory of the Perugia province and can be used to evaluate the effects of both rainfall-induced and earthquake-induced landslides. The database was analysed in order to exclude possible outliers; starting from the full data set, 16 lithological units were isolated, each with homogeneous geological features and the same mechanical behaviour. It is important to investigate the quality of the data and to know how reliable they are; therefore, statistical analyses were performed to quantify the dispersion of the data (i.e., relative and cumulative frequencies), along with geostatistical analyses to characterize the spatial correlation (i.e., the variogram).
The empirical variogram is a common and useful tool in geostatistics because it quantifies the spatial correlation between data. Once the variogram has been calculated, it can be used to estimate a parameter at points where information is missing. One of the most widely used interpolation techniques is kriging, which predicts the value of a function at a given point as a weighted average of its known values at nearby points, with the weights derived from the variogram.
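The empirical semivariogram can be computed with a simple pair-counting loop: for each lag bin, average the squared differences of values at point pairs separated by roughly that distance. A minimal sketch (distance-tolerance binning; illustrative, not the authors' workflow):

```python
import math

def semivariogram(points, values, lags, tol):
    """Empirical semivariogram: for each lag h, gamma(h) is half the mean
    squared difference of values at point pairs separated by about h."""
    gammas = []
    for h in lags:
        sq, count = 0.0, 0
        for i in range(len(points)):
            for j in range(i + 1, len(points)):
                if abs(math.dist(points[i], points[j]) - h) <= tol:
                    sq += (values[i] - values[j]) ** 2
                    count += 1
        gammas.append(0.5 * sq / count if count else float("nan"))
    return gammas

# Four collinear points carrying a linear field: gamma grows with lag,
# the signature of spatially correlated data.
gammas = semivariogram([(0, 0), (1, 0), (2, 0), (3, 0)],
                       [0.0, 1.0, 2.0, 3.0], [1.0, 2.0], 0.1)
```

Fitting a model (spherical, exponential, etc.) to these binned estimates is what supplies the kriging weights mentioned above.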
Jacob, Benjamin J; Krapp, Fiorella; Ponce, Mario; Gottuzzo, Eduardo; Griffith, Daniel A; Novak, Robert J
2010-05-01
Spatial autocorrelation is problematic for the classical hierarchical cluster detection tests commonly used in multi-drug resistant tuberculosis (MDR-TB) analyses, as considerable random error can occur. Therefore, when MDR-TB clusters are spatially autocorrelated, the assumption that the clusters are independently random is invalid. In this research, a product moment correlation coefficient (i.e., Moran's coefficient) was used to quantify local spatial variation in multiple clinical and environmental predictor variables sampled in San Juan de Lurigancho, Lima, Peru. Initially, QuickBird 0.61 m data, encompassing the visible and near-infrared bands, were selected to synthesize images of land cover attributes of the study site. Residential addresses of individual patients with smear-positive MDR-TB were geocoded, prevalence rates calculated, and the data digitally overlaid onto the satellite imagery within a 2 km buffer of 31 georeferenced health centers, using a 10 m² grid-based algorithm. Geographical information system (GIS)-gridded measurements of each health center were generated based on preliminary base maps of the georeferenced data aggregated to block groups and census tracts within each buffered area. A three-dimensional model of the study site was constructed from a digital elevation model (DEM) to determine terrain covariates associated with the sampled MDR-TB covariates. Pearson's correlation was used to evaluate the linear relationship between the DEM and the sampled MDR-TB data. A SAS/GIS® module was then used to calculate univariate statistics and to perform linear and non-linear regression analyses using the sampled predictor variables. The estimates generated from a global autocorrelation analysis were then spatially decomposed into empirical orthogonal bases using a negative binomial regression with a non-homogeneous mean.
Results of the DEM analyses indicated a statistically non-significant, linear relationship between georeferenced health centers and the sampled covariate elevation. The data exhibited positive spatial autocorrelation and the decomposition of Moran's coefficient into uncorrelated, orthogonal map pattern components revealed global spatial heterogeneities necessary to capture latent autocorrelation in the MDR-TB model. It was thus shown that Poisson regression analyses and spatial eigenvector mapping can elucidate the mechanics of MDR-TB transmission by prioritizing clinical and environmental-sampled predictor variables for identifying high risk populations.
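Moran's coefficient, used above to quantify spatial autocorrelation, compares cross-products of neighbouring deviations from the mean with the overall variance. A minimal sketch for a few sites with a binary adjacency matrix (invented data, not the study's):

```python
def morans_i(values, weights):
    """Moran's coefficient I: cross-products of neighbouring deviations
    from the mean, scaled by the variance and the total weight. I > 0
    indicates positive spatial autocorrelation, I < 0 negative."""
    n = len(values)
    mean = sum(values) / n
    dev = [v - mean for v in values]
    w_sum = sum(sum(row) for row in weights)
    num = sum(weights[i][j] * dev[i] * dev[j]
              for i in range(n) for j in range(n))
    den = sum(d * d for d in dev)
    return (n / w_sum) * (num / den)

# Four sites in a chain (binary adjacency): a smooth gradient gives I > 0,
# an alternating pattern gives I < 0.
W = [[0, 1, 0, 0], [1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0]]
i_smooth = morans_i([1.0, 2.0, 3.0, 4.0], W)
i_checker = morans_i([1.0, -1.0, 1.0, -1.0], W)
```

A significantly positive I, as found for the MDR-TB data, is exactly the condition under which the eigenvector (spatial filter) decomposition described above becomes necessary.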
2012-01-01
Background It is known from recent studies that more than 90% of human multi-exon genes are subject to Alternative Splicing (AS), a key molecular mechanism by which multiple transcripts may be generated from a single gene. It is widely recognized that a breakdown in AS mechanisms plays an important role in cellular differentiation and pathologies. Polymerase chain reaction, microarray and sequencing technologies have been applied to the study of transcript diversity arising from alternative expression. The latest-generation Affymetrix GeneChip Human Exon 1.0 ST Arrays offer a more detailed view of the gene expression profile, providing information on AS patterns. The exon array technology, with more than five million data points, can detect approximately one million exons, and it allows analyses at both the gene and exon levels. In this paper we describe BEAT, an integrated user-friendly bioinformatics framework to store, analyze and visualize exon array datasets. It combines a data warehouse approach with rigorous statistical methods for assessing the AS of genes involved in diseases. Meta statistics are proposed as a novel approach to explore the analysis results. BEAT is available at http://beat.ba.itb.cnr.it. Results BEAT is a web tool which allows uploading and analyzing exon array datasets using standard statistical methods and an easy-to-use graphical web front-end. BEAT has been tested on a dataset with 173 samples and tuned using new datasets of exon array experiments from 28 colorectal cancer and 26 renal cell cancer samples produced at the Medical Genetics Unit of IRCCS Casa Sollievo della Sofferenza. To highlight all possible AS events, alternative names, accession IDs, Gene Ontology terms and biochemical pathway annotations are integrated with exon- and gene-level expression plots.
The user can customize the results choosing custom thresholds for the statistical parameters and exploiting the available clinical data of the samples for a multivariate AS analysis. Conclusions Despite exon array chips being widely used for transcriptomics studies, there is a lack of analysis tools offering advanced statistical features and requiring no programming knowledge. BEAT provides a user-friendly platform for a comprehensive study of AS events in human diseases, displaying the analysis results with easily interpretable and interactive tables and graphics. PMID:22536968
Mathematical background and attitudes toward statistics in a sample of Spanish college students.
Carmona, José; Martínez, Rafael J; Sánchez, Manuel
2005-08-01
To examine the relation between mathematical background and initial attitudes toward statistics among Spanish college students in the social sciences, the Survey of Attitudes Toward Statistics was given to 827 students. Multivariate analyses tested the effects of two indicators of mathematical background (amount of exposure and achievement in previous courses) on the four subscales. The analyses suggested that grades in previous courses are more strongly related to initial attitudes toward statistics than the number of mathematics courses taken. Mathematical background was related to students' affective responses to statistics but not to their valuing of statistics. Implications for future research are discussed.
In vitro determination of the mechanical and chemical properties of a fibre orthodontic retainer.
Silvestrini-Biavati, Armando; Angiero, Francesca; Gibelli, Francesca; Signore, Antonio; Benedicenti, Stefano
2012-12-01
The aim of this study was to analyse, in vitro, the chemical and mechanical properties of a new fibre retainer, Everstick, comparing its characteristics with the requirements for an orthodontic retainer. Chemical analysis was used to examine seven fibre bundles exposed to a photocuring lamp and then to different acids, and resistance to corrosion by artificial saliva fortified with plaque acids. The mechanical properties examined were tensile strength and resistance to flexural force. Ten fibre samples were tested for each mechanical analysis, and the mean value and standard deviation were calculated. The Wilcoxon signed-rank test was used to evaluate the change in weight after treatment in each group. To determine changes over time between the groups for each acid considered separately, repeated-measures analysis of variance (ANOVA) was applied both to the original data and to rank-transformed data. If the results differed, the ANOVA on rank-transformed data was considered. Acetic acid was found to be the most corrosive and caused the most substance loss, both pure and at salivary pH. Hydrofluoric acid was the most damaging. For all acids analysed in both groups (lactic, formic, acetic, propionic), changes after treatment were statistically different between the two groups (P < 0.001 for lactic, acetic, and propionic; P = 0.004 for formic acid). The mean Young's modulus value was 68 510 MPa. Deformation before the fibre separated into its constituent elements (glass fibre and composite) was 3.9 per cent, stress to rupture was 1546 MPa, and resistance to bending was 534 MPa. The deflection produced over a length of 12 mm was 1.4 mm. The fibre bundle was attacked by acids potentially present in the oral cavity, with the degree of aggressiveness depending on the acid concentration. To preserve fibre bundles long term, careful plaque control is necessary, especially in the interproximal spaces, to avoid acid formation.
The tested product was found to be sufficiently strong to resist flexural and occlusal forces.
Mechanisms of ACL injury in professional rugby union: a systematic video analysis of 36 cases.
Montgomery, Connor; Blackburn, Jeff; Withers, Daniel; Tierney, Gregory; Moran, Cathal; Simms, Ciaran
2016-12-30
The mechanisms of ACL injury in rugby are not well defined. To describe the mechanisms of ACL injury in male professional rugby players using systematic video analysis. 36 cases from games played in top professional leagues and international matches were analysed. 5 analysts independently assessed all videos to record the estimated frame/time of initial ground contact, frame/time of ACL tear and a range of play specific variables. This included contact versus non-contact ACL injuries, injury timing, joint flexion angles and foot contact with the ground. 37 side-stepping manoeuvres from a control game were analysed to allow comparison of non-injury versus injury situations. 57% of ACL injuries occurred in a contact manner. 2 main scenarios were identified: (1) offensive running and (2) being tackled, indicating that the ball carrier might be at higher risk of ACL injury. The majority of non-contact ACL injuries resulted from a side-stepping manoeuvre. In most non-contact cases, initial ground contact was through heel strike. Statistical assessment of heel strike at initial ground contact versus non-heel strike cases showed a significant difference in injury versus non-injury outcomes, with heel strike associated with higher injury risk. Non-contact ACL injuries had lower median knee flexion angles and a more dorsiflexed ankle when compared with a control group (10° vs 20°, p≤0.001 and 10° vs 0°, p=0.033 respectively). Over half of ACL injuries in rugby in our analysis resulted from a contact mechanism. For non-contact injuries, lower knee flexion angles and heel-first ground contact in a side-stepping manoeuvre were associated with ACL injury. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
Fluctuations of conserved charges in relativistic heavy ion collisions: An introduction
NASA Astrophysics Data System (ADS)
Asakawa, Masayuki; Kitazawa, Masakiyo
2016-09-01
Bulk fluctuations of conserved charges measured by event-by-event analysis in relativistic heavy ion collisions are observables believed to carry a significant amount of information on the hot medium created by the collisions. These fluctuations have recently been studied actively in experiments, in theory, and on the lattice. In particular, the non-Gaussianity of the fluctuations has attracted much attention. In this review, we give a pedagogical introduction to these issues and survey recent developments in this field of research. Starting from the definition of cumulants, basic concepts in fluctuation physics, such as thermal fluctuations in statistical mechanics and the time evolution of fluctuations in diffusive systems, are described. Phenomena expected to occur in finite-temperature and/or finite-density QCD matter and their measurement by event-by-event analyses are also elucidated.
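The cumulant-based description of fluctuations mentioned above can be made concrete. The following is a minimal illustration (not taken from the review itself) of how the first four cumulants are estimated from event-by-event samples; for a Gaussian signal the third and fourth cumulants vanish, so nonzero values are a signature of non-Gaussianity:

```python
import random

def cumulants(samples):
    """First four cumulants from central moments:
    kappa1 = mean, kappa2 = variance,
    kappa3 = third central moment,
    kappa4 = fourth central moment - 3 * variance**2."""
    n = len(samples)
    mean = sum(samples) / n
    m2 = sum((x - mean) ** 2 for x in samples) / n
    m3 = sum((x - mean) ** 3 for x in samples) / n
    m4 = sum((x - mean) ** 4 for x in samples) / n
    return mean, m2, m3, m4 - 3 * m2 ** 2

# For Gaussian fluctuations, kappa3 and kappa4 are consistent with zero;
# deviations from zero quantify the non-Gaussianity of the distribution.
random.seed(0)
gauss = [random.gauss(0.0, 1.0) for _ in range(100000)]
k1, k2, k3, k4 = cumulants(gauss)
```

The same estimator applied to measured event-by-event charge distributions would return the quantities (variance, skewness- and kurtosis-related cumulants) discussed in the review.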
Ab initio Study on Ionization Energies of 3-Amino-1-propanol
NASA Astrophysics Data System (ADS)
Wang, Ke-dong; Jia, Ying-bin; Lai, Zhen-jiang; Liu, Yu-fang
2011-06-01
Fourteen conformers of 3-amino-1-propanol, as minima on the potential energy surface, are examined at the MP2/6-311++G** level. Their relative energies calculated at the B3LYP, MP3 and MP4 levels of theory indicate that the two most stable conformers display intramolecular OH···N hydrogen bonds. The vertical ionization energies of these conformers calculated with ab initio electron propagator theory in the P3/aug-cc-pVTZ approximation are in agreement with experimental data from photoelectron spectroscopy. Natural bond orbital analyses were used to explain the differences in the ionization energies of the highest occupied molecular orbitals of the conformers. Combined with statistical mechanics principles, conformational distributions at various temperatures are obtained and the temperature dependence of the photoelectron spectra is interpreted.
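The statistical-mechanics step referred to above amounts to Boltzmann-weighting the conformer energies. The sketch below illustrates this with hypothetical relative energies (placeholders, not the paper's computed values):

```python
import math

# Hypothetical relative conformer energies in kJ/mol (illustrative only).
energies = [0.0, 1.2, 3.5, 5.0]
R = 8.314462618e-3  # gas constant in kJ/(mol*K)

def boltzmann_populations(energies, T):
    """Fractional populations p_i = exp(-E_i/RT) / sum_j exp(-E_j/RT)."""
    weights = [math.exp(-E / (R * T)) for E in energies]
    Z = sum(weights)  # partition function over the conformer set
    return [w / Z for w in weights]

pops_298 = boltzmann_populations(energies, 298.15)
pops_500 = boltzmann_populations(energies, 500.0)
# Raising the temperature flattens the distribution, shifting weight
# toward higher-energy conformers -- the origin of the temperature
# dependence of the photoelectron spectra.
```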
Iancu, Violeta; Hla, Saw-Wai
2006-01-01
Single chlorophyll-a molecules, a vital resource for the sustenance of life on Earth, have been investigated by using scanning tunneling microscope manipulation and spectroscopy on a gold substrate at 4.6 K. Chlorophyll-a binds on Au(111) via its porphyrin unit while the phytyl-chain is elevated from the surface by the support of four CH3 groups. By injecting tunneling electrons from the scanning tunneling microscope tip, we are able to bend the phytyl-chain, which enables the switching of four molecular conformations in a controlled manner. Statistical analyses and structural calculations reveal that all reversible switching mechanisms are initiated by a single tunneling-electron energy-transfer process, which induces bond rotation within the phytyl-chain. PMID:16954201
Rainfall variability in southern Spain on decadal to centennial time scales
NASA Astrophysics Data System (ADS)
Rodrigo, F. S.; Esteban-Parra, M. J.; Pozo-Vázquez, D.; Castro-Díez, Y.
2000-06-01
In this work a long rainfall series in Andalusia (southern Spain) is analysed. Methods of historical climatology were used to reconstruct a 500-year series from historical sources. Different statistical tools were used to detect and characterize significant changes in this series. Results indicate rainfall fluctuations, without abrupt changes, in the following alternating dry and wet phases: 1501-1589 dry, 1590-1649 wet, 1650-1775 dry, 1776-1937 wet and 1938-1997 dry. Possible causal mechanisms are discussed, emphasizing the important contribution of the North Atlantic Oscillation (NAO) to rainfall variability in the region. Solar activity is discussed in relation to the Maunder Minimum period, and finally the past and present are compared. Results indicate that the magnitude of fluctuations is similar in the past and present.
Williams, Paige; Kern, Margaret L; Waters, Lea
2016-01-01
Employee psychological capital (PsyCap), perceptions of organizational virtue (OV), and work happiness have been shown to be associated within and over time. This study examines selective exposure and confirmation bias as potential processes underlying the PsyCap, OV, and work happiness associations. As part of a quasi-experimental study design, school staff (N = 69) completed surveys at three time points. After the first assessment, some staff (n = 51) completed a positive psychology training intervention. Results of descriptive statistics, correlation, and regression analyses on the intervention group provide some support for selective exposure and confirmation bias as explanatory mechanisms. In focusing on the processes through which employee attitudes may influence work happiness, this study advances theoretical understanding, specifically of selective exposure and confirmation bias in a field study context.
Quantum mechanics of black holes.
Witten, Edward
2012-08-03
The popular conception of black holes reflects the behavior of the massive black holes found by astronomers and described by classical general relativity. These objects swallow up whatever comes near and emit nothing. Physicists who have tried to understand the behavior of black holes from a quantum mechanical point of view, however, have arrived at quite a different picture. The difference is analogous to the difference between thermodynamics and statistical mechanics. The thermodynamic description is a good approximation for a macroscopic system, but statistical mechanics describes what one will see if one looks more closely.
NASA Astrophysics Data System (ADS)
Descartes, R.; Rota, G.-C.; Euler, L.; Bernoulli, J. D.; Siegel, Edward Carl-Ludwig
2011-03-01
Quantum-statistics Dichotomy: Fermi-Dirac (FDQS) versus Bose-Einstein (BEQS), respectively with contact-repulsion/non-condensation (FDCR) versus attraction/condensation (BEC), are manifestly demonstrated by Taylor-expansion ONLY of their denominator exponential, identified BOTH as Descartes analytic-geometry conic-sections, FDQS as Ellipse (homotopy to rectangle FDQS distribution-function), VIA Maxwell-Boltzmann classical-statistics (MBCS) to Parabola MORPHISM, VS. BEQS to Hyperbola, Archimedes' HYPERBOLICITY INEVITABILITY, and as well generating-functions [Abramowitz-Stegun, Handbook of Mathematical Functions, p. 804], respectively of Euler-numbers/functions (via Riemann zeta-function; domination of quantum-statistics: [Pathria, Statistical-Mechanics; Huang, Statistical-Mechanics]) VS. Bernoulli-numbers/functions. Much can be learned about statistical-physics from Euler-numbers/functions via Riemann zeta-function(s) VS. Bernoulli-numbers/functions [Conway-Guy, Book of Numbers], and about Euler-numbers/functions via Riemann zeta-function(s) MORPHISM VS. Bernoulli-numbers/functions, vice versa. Ex.: Riemann-hypothesis PHYSICS proof PARTLY as BEQS BEC/BEA.
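Stripped of the idiosyncratic terminology, the dichotomy this abstract refers to is the familiar one between the Fermi-Dirac, Bose-Einstein, and Maxwell-Boltzmann occupation functions. A minimal numerical check (an illustration, not the author's construction) that the two quantum statistics bracket the classical curve and merge with it at large x = (E − μ)/kT:

```python
import math

def fermi_dirac(x):
    """Mean occupation 1 / (exp(x) + 1); the +1 encodes Pauli exclusion."""
    return 1.0 / (math.exp(x) + 1.0)

def bose_einstein(x):
    """Mean occupation 1 / (exp(x) - 1), defined for x > 0;
    the -1 permits macroscopic occupation (condensation) as x -> 0."""
    return 1.0 / (math.exp(x) - 1.0)

def maxwell_boltzmann(x):
    """Classical limit exp(-x), obtained from either quantum form by
    Taylor-expanding / neglecting the +/-1 in the denominator."""
    return math.exp(-x)

# In the dilute, high-energy limit (large x) both quantum statistics
# collapse onto the classical Maxwell-Boltzmann curve.
x = 5.0
fd, be, mb = fermi_dirac(x), bose_einstein(x), maxwell_boltzmann(x)
```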
Hawe, Penelope; Bond, Lyndal; Ghali, Laura M; Perry, Rosemary; Davison, Colleen M; Casey, David M; Butler, Helen; Webster, Cynthia M; Scholz, Bert
2015-03-19
Whole school, ethos-changing interventions reduce risk behaviours in middle adolescence, more than curriculum-based approaches. Effects on older ages are not known. We set out to replicate one of these interventions, Australia's Gatehouse Project, in a rural Canadian high school. A guided, whole school change process sought to make students feel more safe, connected, and valued by: changes in teaching practices, orientation processes, professional development of staff, recognition and reward mechanisms, elevating student voice, and strategies to involve greater proactivity and participation. We conducted risk behaviour surveys in grades 10 to 12 before the intervention and 2 years afterwards, and social network analyses with the staff. Changes in health and health risk behaviours were assessed using chi-square. Interactions between the intervention and gender and between the intervention and school engagement were assessed using interaction terms in logistic regression models. Changes in the density of relationships among staff were tested with methods analogous to paired t-tests. Like Gatehouse, there was no statistically significant reduction in depressive symptoms or bullying, though the trend was in that direction. Among girls, there was a statistically significant decrease in low school engagement (45% relative reduction), and decreases in drinking (46% relative reduction), unprotected sex (61% relative reduction) and poor health (relative reduction of 73%). The reduction in drinking matched the national trend. Reductions in unprotected sex and poor health went against the national trend. We found no statistically significant changes for boys. The effects coincided with statistically significant increases in the densities of staff networks, indicating that part of the mechanism may be through relationships at school. 
A non-specific, risk protective intervention in the social environment of the school had a significant impact on a cluster of risk behaviours for girls. Results were remarkably like reports from similar school environment interventions elsewhere, albeit with different behaviours being affected. It may be that this type of intervention activates change processes that interact highly with context, impacting different risks differently, according to the prevalence, salience and distribution of the risk and the interconnectivity of relationships between staff and students. This requires further exploration.
Gupta, Sumit; Wilejto, Marta; Pole, Jason D; Guttmann, Astrid; Sung, Lillian
2014-01-01
While low socioeconomic status (SES) has been associated with inferior cancer outcome among adults, its impact in pediatric oncology is unclear. Our objective was therefore to conduct a systematic review to determine the impact of SES upon outcome in children with cancer. We searched Ovid Medline, EMBASE and CINAHL from inception to December 2012. Studies for which survival-related outcomes were reported by socioeconomic subgroups were eligible for inclusion. Two reviewers independently assessed articles and extracted data. Given anticipated heterogeneity, no quantitative meta-analyses were planned a priori. Of 7,737 publications, 527 in ten languages met criteria for full review; 36 studies met final inclusion criteria. In low- and middle-income countries (LMIC), lower SES was uniformly associated with inferior survival, regardless of the measure chosen. The majority of associations were statistically significant. Of 52 associations between socioeconomic variables and outcome among high-income country (HIC) children, 38 (73.1%) found low SES to be associated with worse survival, 15 of which were statistically significant. Of the remaining 14 (no association or high SES associated with worse survival), only one was statistically significant. Both HIC studies examining the effect of insurance found uninsured status to be statistically associated with inferior survival. Socioeconomic gradients in which low SES is associated with inferior childhood cancer survival are ubiquitous in LMIC and common in HIC. Future studies should elucidate mechanisms underlying these gradients, allowing the design of interventions mediating socioeconomic effects. Targeting the effect of low SES will allow for further improvements in childhood cancer survival.
Statistical Mechanics of Prion Diseases
NASA Astrophysics Data System (ADS)
Slepoy, A.; Singh, R. R.; Pázmándi, F.; Kulkarni, R. V.; Cox, D. L.
2001-07-01
We present a two-dimensional, lattice-based, protein-level statistical mechanical model for prion diseases (e.g., mad cow disease) with concomitant prion protein misfolding and aggregation. Our studies lead us to the hypothesis that the observed broad incubation time distribution in epidemiological data reflects fluctuation-dominated growth seeded by a few nanometer-scale aggregates, while the much narrower incubation time distributions for inoculated lab animals arise from statistical self-averaging. We model ``species barriers'' to prion infection and assess a related treatment protocol.
van 't Hoff, Marcel; Reuter, Marcel; Dryden, David T F; Oheim, Martin
2009-09-21
Bacteriophage lambda-DNA molecules are frequently used as a scaffold to characterize the action of single proteins unwinding, translocating, digesting or repairing DNA. However, scaling up such single-DNA-molecule experiments under identical conditions to attain statistically relevant sample sizes remains challenging. Additionally, the movies obtained are frequently noisy and difficult to analyse with any precision. We address these two problems here using, firstly, a novel variable-angle total internal reflection fluorescence (VA-TIRF) reflector composed of a minimal set of optical reflective elements, and secondly, singular value decomposition (SVD) to improve the signal-to-noise ratio prior to analysing time-lapse image stacks. As an example, we visualize under identical optical conditions hundreds of surface-tethered single lambda-DNA molecules, stained with the intercalating dye YOYO-1 iodide, and stretched out in a microcapillary flow. Another novelty of our approach is that we arrange on a mechanically driven stage several capillaries containing saline, calibration buffer and lambda-DNA, respectively, thus extending the approach to high-content, high-throughput screening of single molecules. Our length measurements of individual DNA molecules from noise-reduced kymograph images using SVD display a 6-fold enhanced precision compared to raw-data analysis, reaching approximately 1 kbp resolution. Combining these two methods, our approach provides a straightforward yet powerful way of collecting statistically relevant amounts of data in a semi-automated manner. We believe that our conceptually simple technique should be of interest for a broader range of single-molecule studies, well beyond the specific example of lambda-DNA shown here.
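The SVD denoising idea can be sketched as a rank truncation of the image: structured signal (e.g., a stretched molecule in a kymograph) concentrates in the leading singular components, while pixel noise spreads across all of them. This is a generic illustration with synthetic data, not the authors' exact pipeline:

```python
import numpy as np

def svd_denoise(image, rank):
    """Low-rank approximation keeping the largest `rank` singular values."""
    U, s, Vt = np.linalg.svd(image, full_matrices=False)
    s_trunc = np.zeros_like(s)
    s_trunc[:rank] = s[:rank]
    return (U * s_trunc) @ Vt

# Toy kymograph: a bright horizontal stripe (the "molecule") plus noise.
rng = np.random.default_rng(0)
clean = np.zeros((64, 200))
clean[30:34, :] = 1.0
noisy = clean + 0.3 * rng.standard_normal(clean.shape)
denoised = svd_denoise(noisy, rank=1)

err_noisy = np.linalg.norm(noisy - clean)
err_denoised = np.linalg.norm(denoised - clean)
# The rank-1 reconstruction lies closer to the clean image than the raw
# data, which is the precision gain the abstract reports for kymographs.
```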
Brorsson, C.; Hansen, N. T.; Lage, K.; Bergholdt, R.; Brunak, S.; Pociot, F.
2009-01-01
Aim To develop novel methods for identifying new genes that contribute to the risk of developing type 1 diabetes within the Major Histocompatibility Complex (MHC) region on chromosome 6, independently of the known linkage disequilibrium (LD) between human leucocyte antigen (HLA)-DRB1, -DQA1, -DQB1 genes. Methods We have developed a novel method that combines single nucleotide polymorphism (SNP) genotyping data with protein–protein interaction (ppi) networks to identify disease-associated network modules enriched for proteins encoded from the MHC region. Approximately 2500 SNPs located in the 4 Mb MHC region were analysed in 1000 affected offspring trios generated by the Type 1 Diabetes Genetics Consortium (T1DGC). The most associated SNP in each gene was chosen and genes were mapped to ppi networks for identification of interaction partners. The association testing and resulting interacting protein modules were statistically evaluated using permutation. Results A total of 151 genes could be mapped to nodes within the protein interaction network and their interaction partners were identified. Five protein interaction modules reached statistical significance using this approach. The identified proteins are well known in the pathogenesis of T1D, but the modules also contain additional candidates that have been implicated in β-cell development and diabetic complications. Conclusions The extensive LD within the MHC region makes it important to develop new methods for analysing genotyping data for identification of additional risk genes for T1D. Combining genetic data with knowledge about functional pathways provides new insight into mechanisms underlying T1D. PMID:19143816
Nielsen, Rasmus Østergaard; Malisoux, Laurent; Møller, Merete; Theisen, Daniel; Parner, Erik Thorlund
2016-04-01
The etiological mechanism underpinning any sports-related injury is complex and multifactorial. Frequently, athletes perceive "excessive training" as the principal factor in their injury, an observation that is biologically plausible yet somewhat ambiguous. If the applied training load is suddenly increased, this may increase the risk for sports injury development, irrespective of the absolute amount of training. Indeed, little to no rigorous scientific evidence exists to support the hypothesis that fluctuations in training load, compared to absolute training load, are more important in explaining sports injury development. One reason for this could be that prospective data from scientific studies should be analyzed in a different manner. Time-to-event analysis is a useful statistical tool with which to analyze the influence of changing exposures on injury risk. However, the potential of time-to-event analysis remains insufficiently exploited in sports injury research. Therefore, the purpose of the present article was to present and discuss measures of association used in time-to-event analyses and to present the advanced concept of time-varying exposures and outcomes. In the paper, different measures of association, such as the cumulative relative risk, the cumulative risk difference, and the classical hazard rate ratio, are presented in a nontechnical manner, and suggestions for the interpretation of study results are provided. To summarize, time-to-event analysis complements the statistical arsenal of sports injury prevention researchers, because it enables them to analyze the complex and highly dynamic reality of injury etiology, injury recurrence, and time to recovery across a range of sporting contexts.
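As a concrete companion to this discussion, the sketch below implements the Kaplan-Meier survival estimator, the standard nonparametric building block of time-to-event analysis, in plain Python. The injury times and censoring flags are invented for illustration:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival curve S(t) with right-censoring.

    times  : follow-up time for each athlete
    events : 1 if the injury was observed, 0 if censored
             (e.g., the season ended injury-free)
    Returns a list of (event_time, survival_probability) steps.
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        d = sum(e for tt, e in data if tt == t)        # injuries at time t
        removed = sum(1 for tt, _ in data if tt == t)  # all leaving risk set
        if d > 0:
            surv *= 1.0 - d / n_at_risk  # multiply conditional survival
            curve.append((t, surv))
        n_at_risk -= removed
        i += removed
    return curve

# Hypothetical data: weeks until injury; 0 marks censored follow-up.
times  = [2, 3, 3, 5, 8, 8, 12, 12]
events = [1, 1, 0, 1, 1, 0, 0, 0]
curve = kaplan_meier(times, events)
```

Hazard-based measures such as the hazard rate ratio compare the step-downs of such curves between exposure groups, which is how time-varying training-load exposures enter the analysis.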
Mass univariate analysis of event-related brain potentials/fields I: a critical tutorial review.
Groppe, David M; Urbach, Thomas P; Kutas, Marta
2011-12-01
Event-related potentials (ERPs) and magnetic fields (ERFs) are typically analyzed via ANOVAs on mean activity in a priori windows. Advances in computing power and statistics have produced an alternative: mass univariate analyses, consisting of thousands of statistical tests and powerful corrections for multiple comparisons. Such analyses are most useful when one has little a priori knowledge of effect locations or latencies, and for delineating effect boundaries. Mass univariate analyses complement and, at times, obviate traditional analyses. Here we review this approach as applied to ERP/ERF data and four methods for multiple comparison correction: strong control of the familywise error rate (FWER) via permutation tests, weak control of FWER via cluster-based permutation tests, false discovery rate control, and control of the generalized FWER. We end with recommendations for their use and introduce free MATLAB software for their implementation. Copyright © 2011 Society for Psychophysiological Research.
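The first of these corrections, strong FWER control via permutation tests, can be sketched with a one-sample sign-flipping scheme and the maximum-statistic null distribution. This is a simplified illustration on toy data (using mean amplitude rather than t-values, and only two "time points"), not the reviewed MATLAB implementation:

```python
import random

def sign_flip_max_stat(data, n_perm=2000, seed=1):
    """One-sample sign-flipping permutation test across several time
    points, with strong FWER control via the max-statistic distribution.

    data: one inner list per subject, one value per time point.
    Returns (observed |mean| per time point, corrected p-values).
    """
    n_subj = len(data)
    n_pts = len(data[0])

    def abs_means(signs):
        return [abs(sum(s * subj[j] for s, subj in zip(signs, data)) / n_subj)
                for j in range(n_pts)]

    observed = abs_means([1] * n_subj)
    rng = random.Random(seed)
    # Null distribution of the maximum statistic across all time points.
    max_null = [max(abs_means([rng.choice((-1, 1)) for _ in range(n_subj)]))
                for _ in range(n_perm)]
    p_corr = [(1 + sum(m >= o for m in max_null)) / (n_perm + 1)
              for o in observed]
    return observed, p_corr

# Toy ERP-like data: a real effect at the first time point only.
rng = random.Random(0)
data = [[rng.gauss(3.0, 1.0), rng.gauss(0.0, 1.0)] for _ in range(16)]
observed, p_corr = sign_flip_max_stat(data)
```

Comparing every observed statistic against the null distribution of the maximum is what guarantees strong control: any true null time point exceeds its corrected threshold with probability at most alpha.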
Impact of ontology evolution on functional analyses.
Groß, Anika; Hartung, Michael; Prüfer, Kay; Kelso, Janet; Rahm, Erhard
2012-10-15
Ontologies are used in the annotation and analysis of biological data. As knowledge accumulates, ontologies and annotations undergo constant modification to reflect this new knowledge. These modifications may influence the results of statistical applications such as functional enrichment analyses, which describe experimental data in terms of ontological groupings. Here, we investigate to what degree modifications of the Gene Ontology (GO) impact these statistical analyses for both experimental and simulated data. The analysis is based on new measures for the stability of result sets and considers different ontology and annotation changes. Our results show that past changes in the GO are non-uniformly distributed over different branches of the ontology. Considering the semantic relatedness of significant categories in analysis results allows a more realistic stability assessment for functional enrichment studies. We observe that the results of term-enrichment analyses tend to be surprisingly stable despite changes in ontology and annotation.
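The sensitivity of enrichment results to annotation changes can be illustrated with the standard one-sided hypergeometric enrichment test. All counts below are hypothetical, chosen only to show how re-annotation shifts a p-value:

```python
from math import comb

def hypergeom_enrichment_p(n_pop, n_annotated, n_study, n_hits):
    """One-sided hypergeometric p-value for functional enrichment:
    probability of drawing >= n_hits annotated genes when sampling
    n_study genes from a population of n_pop containing n_annotated."""
    total = comb(n_pop, n_study)
    p = 0.0
    for k in range(n_hits, min(n_annotated, n_study) + 1):
        p += comb(n_annotated, k) * comb(n_pop - n_annotated, n_study - k) / total
    return p

# Hypothetical numbers: 10000 genes, 200 annotated with a GO term,
# a study set of 100 genes of which 8 carry the annotation.
p_before = hypergeom_enrichment_p(10000, 200, 100, 8)
# If an ontology/annotation update tags 200 additional genes with this
# term, the identical study set looks much less enriched.
p_after = hypergeom_enrichment_p(10000, 400, 100, 8)
```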
Predicting Subsequent Myopia in Initially Pilot-Qualified USAFA Cadets.
1985-12-27
Refraction Measurement. 4.0 Results. 4.1 Descriptive Statistics. 4.2 Predictive Statistics. ...mentioned), and three were missing a status. The data of the subject who was commissionable were dropped from the statistical analyses. Of the 91... relatively equal numbers of participants from all classes will become obvious within the results. 4.1 Descriptive Statistics: In the original plan
Brennan, Jennifer Sousa
2010-01-01
This chapter is an introductory reference guide highlighting some of the most common statistical topics, broken down into both command-line syntax and graphical-interface point-and-click commands. It serves to supplement more formal statistics lessons and to expedite the use of Stata for basic analyses.
Brady, Timothy F; Oliva, Aude
2008-07-01
Recent work has shown that observers can parse streams of syllables, tones, or visual shapes and learn statistical regularities in them without conscious intent (e.g., learn that A is always followed by B). Here, we demonstrate that these statistical-learning mechanisms can operate at an abstract, conceptual level. In Experiments 1 and 2, observers incidentally learned which semantic categories of natural scenes covaried (e.g., kitchen scenes were always followed by forest scenes). In Experiments 3 and 4, category learning with images of scenes transferred to words that represented the categories. In each experiment, the category of the scenes was irrelevant to the task. Together, these results suggest that statistical-learning mechanisms can operate at a categorical level, enabling generalization of learned regularities using existing conceptual knowledge. Such mechanisms may guide learning in domains as disparate as the acquisition of causal knowledge and the development of cognitive maps from environmental exploration.
NASA Astrophysics Data System (ADS)
Emoto, K.; Saito, T.; Shiomi, K.
2017-12-01
Short-period (<1 s) seismograms are strongly affected by small-scale (<10 km) heterogeneities in the lithosphere. In general, short-period seismograms are analysed based on the statistical method by considering the interaction between seismic waves and randomly distributed small-scale heterogeneities. Statistical properties of the random heterogeneities have been estimated by analysing short-period seismograms. However, generally, the small-scale random heterogeneity is not taken into account for the modelling of long-period (>2 s) seismograms. We found that the energy of the coda of long-period seismograms shows a spatially flat distribution. This phenomenon is well known in short-period seismograms and results from the scattering by small-scale heterogeneities. We estimate the statistical parameters that characterize the small-scale random heterogeneity by modelling the spatiotemporal energy distribution of long-period seismograms. We analyse three moderate-size earthquakes that occurred in southwest Japan. We calculate the spatial distribution of the energy density recorded by a dense seismograph network in Japan at the period bands of 8-16 s, 4-8 s and 2-4 s and model them by using 3-D finite difference (FD) simulations. Compared to conventional methods based on statistical theories, we can calculate more realistic synthetics by using the FD simulation. It is not necessary to assume a uniform background velocity, body or surface waves and scattering properties considered in general scattering theories. By taking the ratio of the energy of the coda area to that of the entire area, we can separately estimate the scattering and the intrinsic absorption effects. Our result reveals the spectrum of the random inhomogeneity in a wide wavenumber range including the intensity around the corner wavenumber as P(m) = 8πε²a³/(1 + a²m²)², where ε = 0.05 and a = 3.1 km, even though past studies analysing higher-frequency records could not detect the corner.
Finally, we estimate the intrinsic attenuation by modelling the decay rate of the energy. The method proposed in this study is suitable for quantifying the statistical properties of long-wavelength subsurface random inhomogeneity, which leads the way to characterizing a wider wavenumber range of spectra, including the corner wavenumber.
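The fitted spectrum can be evaluated directly. The sketch below encodes P(m) = 8πε²a³/(1 + a²m²)² with the reported ε = 0.05 and a = 3.1 km, showing the flat portion below the corner wavenumber m ≈ 1/a and the m⁻⁴ roll-off above it:

```python
import math

def psdf(m, eps=0.05, a=3.1):
    """Power spectral density of the random heterogeneity,
    P(m) = 8*pi*eps**2 * a**3 / (1 + a**2 * m**2)**2,
    with the fitted values eps = 0.05 and a = 3.1 km."""
    return 8.0 * math.pi * eps ** 2 * a ** 3 / (1.0 + a ** 2 * m ** 2) ** 2

flat = psdf(0.01)         # m << 1/a: nearly flat part of the spectrum
corner = psdf(1.0 / 3.1)  # at the corner wavenumber m = 1/a: P drops to P(0)/4
steep = psdf(10.0)        # m >> 1/a: power-law roll-off as m**-4
```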
Proliferative changes in the bronchial epithelium of former smokers treated with retinoids.
Hittelman, Walter N; Liu, Diane D; Kurie, Jonathan M; Lotan, Reuben; Lee, Jin Soo; Khuri, Fadlo; Ibarguen, Heladio; Morice, Rodolfo C; Walsh, Garrett; Roth, Jack A; Minna, John; Ro, Jae Y; Broxson, Anita; Hong, Waun Ki; Lee, J Jack
2007-11-07
Retinoids have shown antiproliferative and chemopreventive activity. We analyzed data from a randomized, placebo-controlled chemoprevention trial to determine whether a 3-month treatment with either 9-cis-retinoic acid (RA) or 13-cis-RA and alpha-tocopherol reduced Ki-67, a proliferation biomarker, in the bronchial epithelium. Former smokers (n = 225) were randomly assigned to receive 3 months of daily oral 9-cis-RA (100 mg), 13-cis-RA (1 mg/kg) and alpha-tocopherol (1200 IU), or placebo. Bronchoscopic biopsy specimens obtained before and after treatment were immunohistochemically assessed for changes in the Ki-67 proliferative index (i.e., percentage of cells with Ki-67-positive nuclear staining) in the basal and parabasal layers of the bronchial epithelium. Per-subject and per-biopsy site analyses were conducted. Multicovariable analyses, including a mixed-effects model and a generalized estimating equations model, were used to investigate the treatment effect (Ki-67 labeling index and percentage of bronchial epithelial biopsy sites with a Ki-67 index > or = 5%) with adjustment for multiple covariates, such as smoking history and metaplasia. Coefficient estimates and 95% confidence intervals (CIs) were obtained from the models. All statistical tests were two-sided. In per-subject analyses, Ki-67 labeling in the basal layer was not changed by any treatment; the percentage of subjects with a high Ki-67 labeling in the parabasal layer dropped statistically significantly after 13-cis-RA and alpha-tocopherol treatment (P = .04) compared with placebo, but the drop was not statistically significant after 9-cis-RA treatment (P = .17).
A similar effect was observed in the parabasal layer in a per-site analysis; the percentage of sites with high Ki-67 labeling dropped statistically significantly after 9-cis-RA treatment (coefficient estimate = -0.72, 95% CI = -1.24 to -0.20; P = .007) compared with placebo, and after 13-cis-RA and alpha-tocopherol treatment (coefficient estimate = -0.66, 95% CI = -1.15 to -0.17; P = .008). In per-subject analyses, treatment with 13-cis-RA and alpha-tocopherol, compared with placebo, was statistically significantly associated with reduced bronchial epithelial cell proliferation; treatment with 9-cis-RA was not. In per-site analyses, statistically significant associations were obtained with both treatments.
Proliferative Changes in the Bronchial Epithelium of Former Smokers Treated With Retinoids
Hittelman, Walter N.; Liu, Diane D.; Kurie, Jonathan M.; Lotan, Reuben; Lee, Jin Soo; Khuri, Fadlo; Ibarguen, Heladio; Morice, Rodolfo C.; Walsh, Garrett; Roth, Jack A.; Minna, John; Ro, Jae Y.; Broxson, Anita; Hong, Waun Ki; Lee, J. Jack
2012-01-01
Background Retinoids have shown antiproliferative and chemopreventive activity. We analyzed data from a randomized, placebo-controlled chemoprevention trial to determine whether a 3-month treatment with either 9-cis-retinoic acid (RA) or 13-cis-RA and α-tocopherol reduced Ki-67, a proliferation biomarker, in the bronchial epithelium. Methods Former smokers (n = 225) were randomly assigned to receive 3 months of daily oral 9-cis-RA (100 mg), 13-cis-RA (1 mg/kg) and α-tocopherol (1200 IU), or placebo. Bronchoscopic biopsy specimens obtained before and after treatment were immunohistochemically assessed for changes in the Ki-67 proliferative index (i.e., percentage of cells with Ki-67–positive nuclear staining) in the basal and parabasal layers of the bronchial epithelium. Per-subject and per–biopsy site analyses were conducted. Multicovariable analyses, including a mixed-effects model and a generalized estimating equations model, were used to investigate the treatment effect (Ki-67 labeling index and percentage of bronchial epithelial biopsy sites with a Ki-67 index ≥ 5%) with adjustment for multiple covariates, such as smoking history and metaplasia. Coefficient estimates and 95% confidence intervals (CIs) were obtained from the models. All statistical tests were two-sided. Results In per-subject analyses, Ki-67 labeling in the basal layer was not changed by any treatment; the percentage of subjects with a high Ki-67 labeling in the parabasal layer dropped statistically significantly after treatment with 13-cis-RA and α-tocopherol (P = .04) compared with placebo, but the drop was not statistically significant after 9-cis-RA treatment (P = .17).
A similar effect was observed in the parabasal layer in a per-site analysis; the percentage of sites with high Ki-67 labeling dropped statistically significantly after 9-cis-RA treatment (coefficient estimate = −0.72, 95% CI = −1.24 to −0.20; P = .007) compared with placebo, and after 13-cis-RA and α-tocopherol treatment (coefficient estimate = −0.66, 95% CI = −1.15 to −0.17; P = .008). Conclusions In per-subject analyses, treatment with 13-cis-RA and α-tocopherol, compared with placebo, was statistically significantly associated with reduced bronchial epithelial cell proliferation; treatment with 9-cis-RA was not. In per-site analyses, statistically significant associations were obtained with both treatments. PMID:17971525
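The per-site analysis above must respect the clustering of biopsy sites within subjects; the trial used mixed-effects and GEE models for this. The core idea can be illustrated more simply with a subject-level (cluster) bootstrap, which resamples whole subjects rather than individual sites. The subject IDs and Ki-67 flags below are invented for illustration only, not taken from the trial:

```python
import random

# Hypothetical per-site data: subject_id -> list of 0/1 flags
# (1 = biopsy site with Ki-67 labeling index >= 5%). All values invented.
sites = {
    "s1": [1, 0, 1], "s2": [0, 0], "s3": [1, 1, 0, 1],
    "s4": [0, 1], "s5": [0, 0, 1], "s6": [1, 0],
}

def high_ki67_rate(data):
    """Proportion of sites with high Ki-67 labeling, pooled over subjects."""
    flat = [f for flags in data.values() for f in flags]
    return sum(flat) / len(flat)

def cluster_bootstrap_ci(data, n_boot=2000, alpha=0.05, seed=1):
    """Percentile CI that resamples whole subjects (clusters), not sites,
    so within-subject correlation of biopsy sites is preserved."""
    random.seed(seed)
    ids = list(data)
    stats = []
    for _ in range(n_boot):
        draw = [random.choice(ids) for _ in ids]
        stats.append(high_ki67_rate({i: data[k] for i, k in enumerate(draw)}))
    stats.sort()
    lo = stats[int((alpha / 2) * n_boot)]
    hi = stats[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

print(high_ki67_rate(sites))
print(cluster_bootstrap_ci(sites))
```

This is only a sketch of the clustering idea; the trial itself adjusted for covariates (smoking history, metaplasia) within GEE and mixed-effects models, which a plain bootstrap does not do.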
Implicit Statistical Learning and Language Skills in Bilingual Children
ERIC Educational Resources Information Center
Yim, Dongsun; Rudoy, John
2013-01-01
Purpose: Implicit statistical learning in 2 nonlinguistic domains (visual and auditory) was used to investigate (a) whether linguistic experience influences the underlying learning mechanism and (b) whether there are modality constraints in predicting implicit statistical learning with age and language skills. Method: Implicit statistical learning…
NASA Technical Reports Server (NTRS)
Yeh, Leehwa
1993-01-01
The phase-space-picture approach to quantum non-equilibrium statistical mechanics via the characteristic function of infinite-mode squeezed coherent states is introduced. We use quantum Brownian motion as an example to show how this approach provides an interesting geometrical interpretation of quantum non-equilibrium phenomena.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rawle, Rachel A.; Hamerly, Timothy; Tripet, Brian P.
Studies of interspecies interactions are inherently difficult due to the complex mechanisms which enable these relationships. A model system for studying interspecies interactions is the marine hyperthermophiles Ignicoccus hospitalis and Nanoarchaeum equitans. Recent independently-conducted ‘omics’ analyses have generated insights into the molecular factors modulating this association. However, significant questions remain about the nature of the interactions between these archaea. We jointly analyzed multiple levels of omics datasets obtained from published, independent transcriptomics, proteomics, and metabolomics analyses. DAVID identified functionally-related groups enriched when I. hospitalis is grown alone or in co-culture with N. equitans. Enriched molecular pathways were subsequently visualized using interaction maps generated with STRING. Key findings of our multi-level omics analysis indicated that I. hospitalis provides precursors to N. equitans for energy metabolism. The analysis indicated a previously unnoticed overall reduction in the diversity of metabolic precursors in the I. hospitalis–N. equitans co-culture, which has been connected to the differential use of ribosomal subunits. We also identified differences in precursors linked to amino acid metabolism, NADH metabolism, and carbon fixation, providing new insights into the metabolic adaptations of I. hospitalis enabling the growth of N. equitans. In conclusion, this multi-omics analysis builds upon previously identified cellular patterns while offering new insights into mechanisms that enable the I. hospitalis–N. equitans association. This study applies statistical and visualization techniques to a mixed-source omics dataset to yield a more global insight into a complex system that was not readily discernable from separate omics studies.
DNA Methylation Analysis of HTR2A Regulatory Region in Leukocytes of Autistic Subjects.
Hranilovic, Dubravka; Blazevic, Sofia; Stefulj, Jasminka; Zill, Peter
2016-02-01
Disturbed brain and peripheral serotonin homeostasis is often found in subjects with autism spectrum disorder (ASD). The role of the serotonin receptor 2A (HTR2A) in the regulation of central and peripheral serotonin homeostasis, as well as its altered expression in autistic subjects, have implicated the HTR2A gene as a major candidate for the serotonin disturbance seen in autism. Several studies, yielding so far inconclusive results, have attempted to associate autism with a functional SNP -1438 G/A (rs6311) in the HTR2A promoter region, while possible contribution of epigenetic mechanisms, such as DNA methylation, to HTR2A dysregulation in autism has not yet been investigated. In this study, we compared the mean DNA methylation within the regulatory region of the HTR2A gene between autistic and control subjects. DNA methylation was analysed in peripheral blood leukocytes using bisulfite conversion and sequencing of the HTR2A region containing rs6311 polymorphism. Autistic subjects of rs6311 AG genotype displayed higher mean methylation levels within the analysed region than the corresponding controls (P < 0.05), while there was no statistically significant difference for AA and GG carriers. Our study provides preliminary evidence for increased HTR2A promoter methylation in leukocytes of a portion of adult autistic subjects, indicating that epigenetic mechanisms might contribute to HTR2A dysregulation observed in individuals with ASD. © 2015 International Society for Autism Research, Wiley Periodicals, Inc.
A systems biology approach toward understanding seed composition in soybean.
Li, Ling; Hur, Manhoi; Lee, Joon-Yong; Zhou, Wenxu; Song, Zhihong; Ransom, Nick; Demirkale, Cumhur Yusuf; Nettleton, Dan; Westgate, Mark; Arendsee, Zebulun; Iyer, Vidya; Shanks, Jackie; Nikolau, Basil; Wurtele, Eve Syrkin
2015-01-01
The molecular, biochemical, and genetic mechanisms that regulate the complex metabolic network of soybean seed development determine the ultimate balance of protein, lipid, and carbohydrate stored in the mature seed. Many of the genes and metabolites that participate in seed metabolism are unknown or poorly defined; even more remains to be understood about the regulation of their metabolic networks. A global omics analysis can provide insights into the regulation of seed metabolism, even without a priori assumptions about the structure of these networks. With the future goal of predictive biology in mind, we have combined metabolomics, transcriptomics, and metabolic flux technologies to reveal the global developmental and metabolic networks that determine the structure and composition of the mature soybean seed. We have coupled this global approach with interactive bioinformatics and statistical analyses to gain insights into the biochemical programs that determine soybean seed composition. For this purpose, we used the Plant/Eukaryotic and Microbial Metabolomics Systems Resource (PMR; http://www.metnetdb.org/pmr), a platform that incorporates metabolomics data to develop hypotheses concerning the organization and regulation of metabolic networks, and the MetNet systems biology tools (http://www.metnetdb.org) for plant omics data, a framework to enable interactive visualization of metabolic and regulatory networks. This combination of high-throughput experimental data and bioinformatics analyses has revealed sets of specific genes, genetic perturbations and mechanisms, and metabolic changes that are associated with the developmental variation in soybean seed composition. Researchers can explore these metabolomics and transcriptomics data interactively at PMR.
Streetscape greenery and health: stress, social cohesion and physical activity as mediators.
de Vries, Sjerp; van Dillen, Sonja M E; Groenewegen, Peter P; Spreeuwenberg, Peter
2013-10-01
Several studies have shown a positive relationship between local greenspace availability and residents' health, which may offer opportunities for health improvement. This study focuses on three mechanisms through which greenery might exert its positive effect on health: stress reduction, stimulating physical activity and facilitating social cohesion. Knowledge on mechanisms helps to identify which type of greenspace is most effective in generating health benefits. In eighty neighbourhoods in four Dutch cities data on quantity and quality of streetscape greenery were collected by observations. Data on self-reported health and proposed mediators were obtained for adults by mail questionnaires (N = 1641). Multilevel regression analyses, controlling for socio-demographic characteristics, revealed that both quantity and quality of streetscape greenery were related to perceived general health, acute health-related complaints, and mental health. Relationships were generally stronger for quality than for quantity. Stress and social cohesion were the strongest mediators. Total physical activity was not a mediator. Physical activity that could be undertaken in the public space (green activity) was, but less so than stress and social cohesion. With all three mediators included in the analysis, complete mediation could statistically be proven in five out of six cases. In these analyses the contribution of green activity was often not significant. The possibility that the effect of green activity is mediated by stress and social cohesion, rather than that it has a direct health effect, is discussed. Copyright © 2013 Elsevier Ltd. All rights reserved.
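The mediation logic in the study above (greenery → stress/social cohesion/activity → health) follows the classic product-of-coefficients approach: a is the effect of the exposure X on the mediator M, b is the effect of M on the outcome Y adjusting for X, and a·b estimates the mediated effect. A minimal sketch on synthetic data (all numbers invented; real analyses would add covariates and multilevel structure):

```python
# Product-of-coefficients mediation sketch: X = greenery, M = mediator
# (e.g. low stress), Y = health. Synthetic data for illustration only.

def ols(X, y):
    """Ordinary least squares via the normal equations (Gaussian elimination)."""
    k = len(X[0])
    A = [[sum(X[i][p] * X[i][q] for i in range(len(X))) for q in range(k)]
         for p in range(k)]
    b = [sum(X[i][p] * y[i] for i in range(len(X))) for p in range(k)]
    for p in range(k):                       # forward elimination
        for q in range(p + 1, k):
            f = A[q][p] / A[p][p]
            for r in range(k):
                A[q][r] -= f * A[p][r]
            b[q] -= f * b[p]
    beta = [0.0] * k
    for p in reversed(range(k)):             # back substitution
        beta[p] = (b[p] - sum(A[p][q] * beta[q] for q in range(p + 1, k))) / A[p][p]
    return beta

x = [0, 0, 1, 1, 0, 1, 0, 1]                 # exposure (greenery)
m = [2.0 * xi + e for xi, e in zip(x, [0.1, -0.2, 0.0, 0.3, 0.1, -0.1, -0.2, 0.2])]
y = [0.5 * xi + 1.5 * mi + e for xi, mi, e in
     zip(x, m, [0.05, -0.1, 0.1, 0.0, -0.05, 0.1, 0.0, -0.1])]

a = ols([[1, xi] for xi in x], m)[1]                       # path X -> M
b_path = ols([[1, xi, mi] for xi, mi in zip(x, m)], y)[2]  # path M -> Y given X
print("mediated effect a*b =", a * b_path)
```

With the generating coefficients above (a ≈ 2, b ≈ 1.5), the recovered product lands near 3, as expected; inference on a·b (e.g. bootstrap confidence intervals) is a separate step not shown here.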
Lociciro, S; Esseiva, P; Hayoz, P; Dujourdy, L; Besacier, F; Margot, P
2008-05-20
Harmonisation and optimization of analytical and statistical methodologies were carried out between two forensic laboratories (Lausanne, Switzerland and Lyon, France) in order to provide drug intelligence for cross-border cocaine seizures. Part I dealt with the optimization of the analytical method and its robustness. This second part investigates statistical methodologies that will provide reliable comparison of cocaine seizures analysed on two different gas chromatographs interfaced with flame ionisation detectors (GC-FIDs) in two distinct laboratories. Sixty-six statistical combinations (ten data pre-treatments followed by six different distance measurements and correlation coefficients) were applied. One pre-treatment (N+S: the area of each peak is divided by its standard deviation calculated from the whole data set) followed by the Cosine or Pearson correlation coefficients was found to be the best statistical compromise for optimal discrimination of linked and non-linked samples. The centralisation of the analyses in one single laboratory is no longer a required condition for comparing samples seized in different countries. This allows collaboration as well as jurisdictional control over data.
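The best-performing combination reported above can be sketched directly: the "N+S" pre-treatment divides each peak area by that peak's standard deviation over the whole data set, after which profiles are compared with a cosine (or Pearson) similarity. The peak areas below are invented for illustration:

```python
import math

# Invented peak areas for three cocaine impurity profiles (4 target peaks).
profiles = [
    [120.0, 30.0, 55.0, 9.0],   # seizure A
    [118.0, 33.0, 50.0, 10.0],  # seizure B (similar profile to A)
    [40.0, 80.0, 12.0, 2.0],    # seizure C (dissimilar profile)
]

def n_plus_s(data):
    """N+S pre-treatment: divide each peak by its standard deviation
    computed across the whole data set (columns must not be constant)."""
    n, k = len(data), len(data[0])
    cols = []
    for j in range(k):
        col = [row[j] for row in data]
        mu = sum(col) / n
        sd = math.sqrt(sum((v - mu) ** 2 for v in col) / n)
        cols.append([v / sd for v in col])
    return [list(row) for row in zip(*cols)]   # back to one row per sample

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v)))

treated = n_plus_s(profiles)
print(cosine(treated[0], treated[1]))  # similar pair: scores near 1
print(cosine(treated[0], treated[2]))  # dissimilar pair: scores lower
```

In practice a decision threshold separating "linked" from "non-linked" scores would be calibrated on known-linked reference seizures, which is the intelligence step the paper optimizes.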
Zhang, Ying; Sun, Jin; Zhang, Yun-Jiao; Chai, Qian-Yun; Zhang, Kang; Ma, Hong-Li; Wu, Xiao-Ke; Liu, Jian-Ping
2016-10-21
Although Traditional Chinese Medicine (TCM) has been widely used in clinical settings, a major challenge that remains in TCM is to evaluate its efficacy scientifically. This randomized controlled trial aims to evaluate the efficacy and safety of berberine in the treatment of patients with polycystic ovary syndrome. In order to improve the transparency and research quality of this clinical trial, we prepared this statistical analysis plan (SAP). The trial design, primary and secondary outcomes, and safety outcomes were declared to reduce selection biases in data analysis and result reporting. We specified detailed methods for data management and statistical analyses. Statistics in corresponding tables, listings, and graphs were outlined. The SAP provides more detailed information than the trial protocol on data management and statistical analysis methods. Any post hoc analyses can be identified by referring to this SAP, and possible selection bias and performance bias will be reduced in the trial. This study is registered at ClinicalTrials.gov, NCT01138930, registered on 7 June 2010.
Quantum Mechanics From the Cradle?
ERIC Educational Resources Information Center
Martin, John L.
1974-01-01
States that the major problem in learning quantum mechanics is often the student's ignorance of classical mechanics and that one conceptual hurdle in quantum mechanics is its statistical nature, in contrast to the determinism of classical mechanics. (MLH)
Becker, Betsy Jane; Aloe, Ariel M; Duvendack, Maren; Stanley, T D; Valentine, Jeffrey C; Fretheim, Atle; Tugwell, Peter
2017-09-01
To outline issues of importance to analytic approaches to the synthesis of quasi-experiments (QEs) and to provide a statistical model for use in analysis. We drew on studies of statistics, epidemiology, and social-science methodology to outline methods for synthesis of QE studies. The design and conduct of QEs, effect sizes from QEs, and moderator variables for the analysis of those effect sizes were discussed. Biases, confounding, design complexities, and comparisons across designs offer serious challenges to syntheses of QEs. Key components of meta-analyses of QEs were identified, including the aspects of QE study design to be coded and analyzed. Of utmost importance are the design and statistical controls implemented in the QEs. Such controls and any potential sources of bias and confounding must be modeled in analyses, along with aspects of the interventions and populations studied. Because of such controls, effect sizes from QEs are more complex than those from randomized experiments. A statistical meta-regression model that incorporates important features of the QEs under review was presented. Meta-analyses of QEs provide particular challenges, but thorough coding of intervention characteristics and study methods, along with careful analysis, should allow for sound inferences. Copyright © 2017 Elsevier Inc. All rights reserved.
Detection of semi-volatile organic compounds in permeable ...
Abstract The Edison Environmental Center (EEC) has a research and demonstration permeable parking lot comprising three different permeable systems: permeable asphalt, porous concrete and interlocking concrete permeable pavers. Water quality and quantity analysis has been ongoing since January 2010. This paper describes a subset of the water quality analysis, analysis of semivolatile organic compounds (SVOCs), to determine if hydrocarbons were in water infiltrated through the permeable surfaces. SVOCs were analyzed in samples collected from 11 dates over a 3 year period, from 2/8/2010 to 4/1/2013. Results are broadly divided into three categories: 42 chemicals were never detected; 12 chemicals (11 chemical tests) were detected at a rate of 10% or less; and 22 chemicals were detected at a frequency of 10% or greater (ranging from 10% to 66.5% detections). Fundamental and exploratory statistical analyses were performed on these latter results by grouping them by surface type. The statistical analyses were limited due to the low frequency of detections and to sample dilutions, which impacted detection limits. The infiltrate data through the three permeable surfaces were analyzed as non-parametric data by the Kaplan-Meier estimation method for fundamental statistics; there were some statistically observable differences in concentration between pavement types when using the Tarone-Ware comparison hypothesis test. Additionally Spearman Rank order non-parame
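The Kaplan-Meier approach named above handles concentration data with nondetects (values known only to be below a detection limit) by "flipping" the left-censored values into right-censored survival form, applying the product-limit estimator, and reading the result back. A minimal sketch with invented concentrations (this is the general technique, not the EEC data or exact workflow):

```python
# Minimal Kaplan-Meier product-limit estimator; observed[i] is False
# when the i-th value is censored. All data below are invented.

def kaplan_meier(times, observed):
    """Return [(t, S(t))] at each event time."""
    data = sorted(zip(times, observed))
    n_at_risk = len(data)
    surv, out, i = 1.0, [], 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(1 for tt, ob in data if tt == t and ob)
        total = sum(1 for tt, _ in data if tt == t)
        if deaths:
            surv *= 1 - deaths / n_at_risk
            out.append((t, surv))
        n_at_risk -= total
        i += total
    return out

# Left-censored SVOC concentrations (units arbitrary); False = below the
# detection limit, i.e. only an upper bound on the true value is known.
conc = [0.5, 0.8, 0.8, 1.2, 2.0, 3.5, 0.3]
detected = [True, False, True, True, True, True, False]

FLIP = 10.0                        # any constant larger than every value
flipped = [FLIP - c for c in conc] # left-censoring becomes right-censoring
km = kaplan_meier(flipped, detected)
print(km)                          # survival curve on the flipped scale
```

Summary statistics (means, percentiles) are then computed on the flipped scale and transformed back by subtracting from the same constant.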
Accounting for Multiple Births in Neonatal and Perinatal Trials: Systematic Review and Case Study
Hibbs, Anna Maria; Black, Dennis; Palermo, Lisa; Cnaan, Avital; Luan, Xianqun; Truog, William E; Walsh, Michele C; Ballard, Roberta A
2010-01-01
Objectives To determine the prevalence in the neonatal literature of statistical approaches accounting for the unique clustering patterns of multiple births. To explore the sensitivity of an actual trial to several analytic approaches to multiples. Methods A systematic review of recent perinatal trials assessed the prevalence of studies accounting for clustering of multiples. The NO CLD trial served as a case study of the sensitivity of the outcome to several statistical strategies. We calculated odds ratios using non-clustered (logistic regression) and clustered (generalized estimating equations, multiple outputation) analyses. Results In the systematic review, most studies did not describe the randomization of twins and did not account for clustering. Of those studies that did, exclusion of multiples and generalized estimating equations were the most common strategies. The NO CLD study included 84 infants with a sibling enrolled in the study. Multiples were more likely than singletons to be white and were born to older mothers (p<0.01). Analyses that accounted for clustering were statistically significant; analyses assuming independence were not. Conclusions The statistical approach to multiples can influence the odds ratio and width of confidence intervals, thereby affecting the interpretation of a study outcome. A minority of perinatal studies address this issue. PMID:19969305
Accounting for multiple births in neonatal and perinatal trials: systematic review and case study.
Hibbs, Anna Maria; Black, Dennis; Palermo, Lisa; Cnaan, Avital; Luan, Xianqun; Truog, William E; Walsh, Michele C; Ballard, Roberta A
2010-02-01
To determine the prevalence in the neonatal literature of statistical approaches accounting for the unique clustering patterns of multiple births and to explore the sensitivity of an actual trial to several analytic approaches to multiples. A systematic review of recent perinatal trials assessed the prevalence of studies accounting for clustering of multiples. The Nitric Oxide to Prevent Chronic Lung Disease (NO CLD) trial served as a case study of the sensitivity of the outcome to several statistical strategies. We calculated odds ratios using nonclustered (logistic regression) and clustered (generalized estimating equations, multiple outputation) analyses. In the systematic review, most studies did not describe the random assignment of twins and did not account for clustering. Of those studies that did, exclusion of multiples and generalized estimating equations were the most common strategies. The NO CLD study included 84 infants with a sibling enrolled in the study. Multiples were more likely than singletons to be white and were born to older mothers (P < .01). Analyses that accounted for clustering were statistically significant; analyses assuming independence were not. The statistical approach to multiples can influence the odds ratio and width of confidence intervals, thereby affecting the interpretation of a study outcome. A minority of perinatal studies address this issue. Copyright 2010 Mosby, Inc. All rights reserved.
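One of the clustered strategies named in this review, multiple outputation, is simple enough to sketch: repeatedly keep one randomly chosen infant per family, run the ordinary unclustered analysis on the remaining independent observations, and average the estimates over many draws. Toy data, invented for illustration:

```python
import random

# family_id -> outcomes (1 = event) for each enrolled sibling; invented data.
families = {
    1: [1], 2: [0], 3: [1, 1], 4: [0, 0], 5: [1, 0],
    6: [1], 7: [0], 8: [1, 1], 9: [0], 10: [1, 0],
}

def outputation_estimate(data, n_rep=1000, seed=7):
    """Average the simple event rate over repeated one-sibling-per-family
    subsamples, so no analysis ever mixes correlated siblings."""
    random.seed(seed)
    estimates = []
    for _ in range(n_rep):
        kept = [random.choice(sibs) for sibs in data.values()]  # one per family
        estimates.append(sum(kept) / len(kept))
    return sum(estimates) / n_rep

print(outputation_estimate(families))
```

The within-family outputation step is what distinguishes this from simply pooling all infants, which (as the review found) can produce misleadingly narrow confidence intervals; variance estimation for multiple outputation has its own correction, not shown here.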
Hetland, Breanna; Lindquist, Ruth; Weinert, Craig R.; Peden-McAlpine, Cynthia; Savik, Kay; Chlan, Linda
2017-01-01
Background Weaning from mechanical ventilation requires increased respiratory effort, which can heighten anxiety and later prolong the need for mechanical ventilation. Objectives To examine the predictive associations of music intervention, anxiety, sedative exposure, and patients’ characteristics on time to initiation and duration of weaning trials of patients receiving mechanical ventilation. Methods A descriptive, correlational design was used for a secondary analysis of data from a randomized trial. Music listening was defined as self-initiated, patient-directed music via headphones. Anxiety was measured daily with a visual analog scale. Sedative exposure was operationalized as a daily sedation intensity score and a sedative dose frequency. Analyses consisted of descriptive statistics, graphing, survival analysis, Cox proportional hazards regression, and linear regression. Results Of 307 patients, 52% were women and 86% were white. Mean age was 59.3 (SD, 14.4) years, mean Acute Physiology and Chronic Health Evaluation III score was 62.9 (SD, 21.6), mean duration of ventilatory support was 8 (range, 1–52) days, and mean stay in the intensive care unit was 18 (range, 2–71) days. Music listening, anxiety levels, and sedative exposure did not influence time to initial weaning trial or duration of trials. Clinical factors of illness severity, days of weaning trials, and tracheostomy placement influenced weaning patterns in this sample. Conclusions Prospective studies of music intervention and other psychophysiological factors during weaning from mechanical ventilation are needed to better understand factors that promote successful weaning. PMID:28461543
Kohda, Naohisa; Iijima, Masahiro; Muguruma, Takeshi; Brantley, William A; Ahluwalia, Karamdeep S; Mizoguchi, Itaru
2013-05-01
To measure the forces delivered by thermoplastic appliances made from three materials and investigate effects of mechanical properties, material thickness, and amount of activation on orthodontic forces. Three thermoplastic materials, Duran (Scheu Dental), Erkodur (Erkodent Erich Kopp GmbH), and Hardcast (Scheu Dental), with two different thicknesses were selected. Values of elastic modulus and hardness were obtained from nanoindentation measurements at 28°C. A custom-fabricated system with a force sensor was employed to obtain measurements of in vitro force delivered by the thermoplastic appliances for 0.5-mm and 1.0-mm activation for bodily tooth movement. Experimental results were subjected to several statistical analyses. Hardcast had significantly lower elastic modulus and hardness than Duran and Erkodur, whose properties were not significantly different. Appliances fabricated from thicker material (0.75 mm or 0.8 mm) always produced significantly greater force than those fabricated from thinner material (0.4 mm or 0.5 mm). Appliances with 1.0-mm activation produced significantly lower force than those with 0.5-mm activation, except for 0.4-mm thick Hardcast appliances. A strong correlation was found between mechanical properties of the thermoplastic materials and force produced by the appliances. Orthodontic forces delivered by thermoplastic appliances depend on the material, thickness, and amount of activation. Mechanical properties of the polymers obtained by nanoindentation testing are predictive of force delivery by these appliances.
Oliveira, Dayane Carvalho Ramos Salles de; Souza-Junior, Eduardo José; Dobson, Adam; Correr, Ana Rosa Costa; Brandt, William Cunha; Sinhoreti, Mário Alexandre Coelho
2016-01-01
To evaluate the influence of phenyl-propanedione on yellowing and chemical-mechanical properties of experimental resin-based materials photoactivated using different light curing units (LCUs). Experimental resin-based materials with the same organic matrix (60:40 wt% BisGMA:TEGDMA) were mechanically blended using a centrifugal mixing device. To this blend, different photoinitiator systems were added in equimolar concentrations with aliphatic amine doubled by wt%: 0.4 wt% CQ; 0.38 wt% PPD; or 0.2 wt% CQ and 0.19 wt% PPD. The degree of conversion (DC), flexural strength (FS), Young's modulus (YM), Knoop hardness (KNH), crosslinking density (CLD), and yellowing (Y) were evaluated (n=10). All samples were light cured with the following LCUs: a halogen lamp (XL 2500), a monowave LED (Radii), or a polywave LED (Valo) with 16 J/cm2. The results were analysed by two-way ANOVA and Tukey's test (α=0.05). No statistical differences were found between the different photoinitiator systems for KNH, CLD, FS, and YM properties (p≥0.05). The PPD/CQ association showed higher DC values than the CQ and PPD isolated systems when photoactivated by a polywave LED (p≤0.05). Y values were highest for the CQ compared with the PPD systems (p≤0.05). The PPD isolated system promoted similar chemical and mechanical properties and less yellowing compared with the CQ isolated system, regardless of the LCU used.
Biological and mechanical interplay at the Macro- and Microscales Modulates the Cell-Niche Fate.
Sarig, Udi; Sarig, Hadar; Gora, Aleksander; Krishnamoorthi, Muthu Kumar; Au-Yeung, Gigi Chi Ting; de-Berardinis, Elio; Chaw, Su Yin; Mhaisalkar, Priyadarshini; Bogireddi, Hanumakumar; Ramakrishna, Seeram; Boey, Freddy Yin Chiang; Venkatraman, Subbu S; Machluf, Marcelle
2018-03-02
Tissue development, regeneration, and de-novo tissue engineering in-vitro are based on reciprocal cell-niche interactions. Early tissue formation mechanisms, however, remain largely unknown given complex in-vivo multifactoriality and limited tools to effectively characterize and correlate specific micro-scaled bio-mechanical interplay. We developed a unique model system, based on decellularized porcine cardiac extracellular matrices (pcECMs)-as representative natural soft-tissue biomaterial-to study a spectrum of common cell-niche interactions. Model monocultures and 1:1 co-cultures on the pcECM of human umbilical vein endothelial cells (HUVECs) and human mesenchymal stem cells (hMSCs) were mechano-biologically characterized using macro- (Instron) and micro- (AFM) mechanical testing, histology, SEM, and molecular biology aspects using RT-PCR arrays. The data obtained were analyzed using purpose-developed statistical, principal component, and gene-set analysis tools. Our results indicated biomechanical cell-type dependency, bi-modal elasticity distributions at the micron cell-ECM interaction level, and corresponding differing gene expression profiles. We further show that hMSCs remodel the ECM, HUVECs enable ECM tissue-specific recognition, and their co-cultures synergistically contribute to tissue integration-mimicking conserved developmental pathways. We also suggest novel quantifiable measures as indicators of tissue assembly and integration. This work may benefit basic and translational research in materials science, developmental biology, tissue engineering, regenerative medicine and cancer biomechanics.
NASA Astrophysics Data System (ADS)
Aligholi, Saeed; Lashkaripour, Gholam Reza; Ghafoori, Mohammad
2017-01-01
This paper sheds further light on the fundamental relationships between simple methods, rock strength, and brittleness of igneous rocks. In particular, the relationship between mechanical (point load strength index Is(50) and brittleness value S20), basic physical (dry density and porosity), and dynamic properties (P-wave velocity and Schmidt rebound values) for a wide range of Iranian igneous rocks is investigated. First, 30 statistical models (including simple and multiple linear regression analyses) were built to identify the relationships between mechanical properties and simple methods. The results imply that rocks with different Schmidt hardness (SH) rebound values have different physicomechanical properties or relations. Second, using these results, it was proved that dry density, P-wave velocity, and SH rebound value provide a fine complement to mechanical properties classification of rock materials. Further, a detailed investigation was conducted on the relationships between mechanical and simple tests, which are established with limited ranges of P-wave velocity and dry density. The results show that strength values decrease with the SH rebound value. In addition, there is a systematic trend between dry density, P-wave velocity, rebound hardness, and brittleness value of the studied rocks, and rocks with medium hardness have a higher brittleness value. Finally, a strength classification chart and a brittleness classification table are presented, providing reliable and low-cost methods for the classification of igneous rocks.
Power-up: A Reanalysis of 'Power Failure' in Neuroscience Using Mixture Modeling
Wood, John
2017-01-01
Recently, evidence for endemically low statistical power has cast neuroscience findings into doubt. If low statistical power plagues neuroscience, then this reduces confidence in the reported effects. However, if statistical power is not uniformly low, then such blanket mistrust might not be warranted. Here, we provide a different perspective on this issue, analyzing data from an influential study reporting a median power of 21% across 49 meta-analyses (Button et al., 2013). We demonstrate, using Gaussian mixture modeling, that the sample of 730 studies included in that analysis comprises several subcomponents so the use of a single summary statistic is insufficient to characterize the nature of the distribution. We find that statistical power is extremely low for studies included in meta-analyses that reported a null result and that it varies substantially across subfields of neuroscience, with particularly low power in candidate gene association studies. Therefore, whereas power in neuroscience remains a critical issue, the notion that studies are systematically underpowered is not the full story: low power is far from a universal problem. SIGNIFICANCE STATEMENT Recently, researchers across the biomedical and psychological sciences have become concerned with the reliability of results. One marker for reliability is statistical power: the probability of finding a statistically significant result given that the effect exists. Previous evidence suggests that statistical power is low across the field of neuroscience. Our results present a more comprehensive picture of statistical power in neuroscience: on average, studies are indeed underpowered—some very seriously so—but many studies show acceptable or even exemplary statistical power. We show that this heterogeneity in statistical power is common across most subfields in neuroscience. 
This new, more nuanced picture of statistical power in neuroscience could affect not only scientific understanding, but potentially policy and funding decisions for neuroscience research. PMID:28706080
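The mixture-modeling reanalysis above rests on fitting Gaussian components to the distribution of study-level statistical power instead of summarizing it with one median. The mechanics can be sketched with a two-component EM fit on synthetic "power" values (not the Button et al. data; all numbers invented):

```python
import math, random

# Synthetic power values: a large underpowered cluster and a smaller
# well-powered cluster. Purely illustrative.
random.seed(0)
low  = [random.gauss(0.15, 0.05) for _ in range(200)]
high = [random.gauss(0.80, 0.10) for _ in range(100)]
data = low + high

def em_two_gaussians(x, iters=200):
    """EM for a 1-D two-component Gaussian mixture (no model selection)."""
    mu = [min(x), max(x)]            # crude initialisation from the data range
    sd = [0.1, 0.1]
    w = [0.5, 0.5]
    for _ in range(iters):
        # E-step: responsibility of each component for each point
        resp = []
        for v in x:
            p = [w[k] / (sd[k] * math.sqrt(2 * math.pi))
                 * math.exp(-((v - mu[k]) ** 2) / (2 * sd[k] ** 2))
                 for k in range(2)]
            s = p[0] + p[1]
            resp.append([p[0] / s, p[1] / s])
        # M-step: re-estimate weights, means, standard deviations
        for k in range(2):
            nk = sum(r[k] for r in resp)
            w[k] = nk / len(x)
            mu[k] = sum(r[k] * v for r, v in zip(resp, x)) / nk
            sd[k] = math.sqrt(sum(r[k] * (v - mu[k]) ** 2
                                  for r, v in zip(resp, x)) / nk) or 1e-6
    return w, mu, sd

w, mu, sd = em_two_gaussians(data)
print("weights", w, "means", mu)
```

The fitted weights and means recover the two planted subpopulations, which is the sense in which "a single summary statistic is insufficient": the mixture exposes structure a lone median hides. The actual paper additionally selects the number of components, which is omitted here.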
Assessing the significance of pedobarographic signals using random field theory.
Pataky, Todd C
2008-08-07
Traditional pedobarographic statistical analyses are conducted over discrete regions. Recent studies have demonstrated that regionalization can corrupt pedobarographic field data through conflation when arbitrary dividing lines inappropriately delineate smooth field processes. An alternative is to register images such that homologous structures optimally overlap and then conduct statistical tests at each pixel to generate statistical parametric maps (SPMs). The significance of SPM processes may be assessed within the framework of random field theory (RFT). RFT is ideally suited to pedobarographic image analysis because its fundamental data unit is a lattice sampling of a smooth and continuous spatial field. To correct for the vast number of multiple comparisons inherent in such data, recent pedobarographic studies have employed a Bonferroni correction to retain a constant family-wise error rate. This approach unfortunately neglects the spatial correlation of neighbouring pixels, so provides an overly conservative (albeit valid) statistical threshold. RFT generally relaxes the threshold depending on field smoothness and on the geometry of the search area, but it also provides a framework for assigning p values to suprathreshold clusters based on their spatial extent. The current paper provides an overview of basic RFT concepts and uses simulated and experimental data to validate both RFT-relevant field smoothness estimations and RFT predictions regarding the topological characteristics of random pedobarographic fields. Finally, previously published experimental data are re-analysed using RFT inference procedures to demonstrate how RFT yields easily understandable statistical results that may be incorporated into routine clinical and laboratory analyses.
Impact of Injury Mechanisms on Patterns and Management of Facial Fractures.
Greathouse, S Travis; Adkinson, Joshua M; Garza, Ramon; Gilstrap, Jarom; Miller, Nathan F; Eid, Sherrine M; Murphy, Robert X
2015-07-01
Mechanisms causing facial fractures have evolved over time and may be predictive of the types of injuries sustained. The objective of this study is to examine the impact of mechanisms of injury on the type and management of facial fractures at our Level 1 Trauma Center. The authors performed an Institutional Review Board-approved review of our network's trauma registry from 2006 to 2010, documenting age, sex, mechanism, Injury Severity Score, Glasgow Coma Scale, facial fracture patterns (nasal, maxillary/malar, orbital, mandible), and reconstructions. Mechanism rates were compared using a Pearson χ2 test. The database identified 23,318 patients, including 1686 patients with facial fractures and a subset of 1505 patients sustaining 2094 fractures by motor vehicle collision (MVC), fall, or assault. Nasal fractures were the most common injuries sustained by all mechanisms. MVCs were most likely to cause nasal and malar/maxillary fractures (P < 0.01). Falls were the least likely and assaults the most likely to cause mandible fractures (P < 0.001), the most common injury leading to surgical intervention (P < 0.001). Although not statistically significant, fractures sustained in MVCs were the most likely overall to undergo surgical intervention. Age, number of fractures, and alcohol level were statistically significant variables associated with operative management. Age and number of fractures sustained were associated with operative intervention. Although there is a statistically significant correlation between mechanism of injury and type of facial fracture sustained, none of the mechanisms evaluated herein are statistically associated with surgical intervention. Clinical Question/Level of Evidence: Therapeutic, III.
Thompson, Ronald E.; Hoffman, Scott A.
2006-01-01
A suite of 28 streamflow statistics, ranging from extreme low to high flows, was computed for 17 continuous-record streamflow-gaging stations and predicted for 20 partial-record stations in Monroe County and contiguous counties in north-eastern Pennsylvania. The predicted statistics for the partial-record stations were based on regression analyses relating intermittent flow measurements made at the partial-record stations indexed to concurrent daily mean flows at continuous-record stations during base-flow conditions. The same statistics also were predicted for 134 ungaged stream locations in Monroe County on the basis of regression analyses relating the statistics to GIS-determined basin characteristics for the continuous-record station drainage areas. The prediction methodology for developing the regression equations used to estimate statistics was developed for estimating low-flow frequencies. This study and a companion study found that the methodology also has application potential for predicting intermediate- and high-flow statistics. The statistics included mean monthly flows, mean annual flow, 7-day low flows for three recurrence intervals, nine flow durations, mean annual base flow, and annual mean base flows for two recurrence intervals. Low standard errors of prediction and high coefficients of determination (R2) indicated good results in using the regression equations to predict the statistics. Regression equations for the larger flow statistics tended to have lower standard errors of prediction and higher coefficients of determination (R2) than equations for the smaller flow statistics. The report discusses the methodologies used in determining the statistics and the limitations of the statistics and the equations used to predict the statistics. Caution is indicated in using the predicted statistics for small drainage area situations.
Study results constitute input needed by water-resource managers in Monroe County for planning purposes and evaluation of water-resources availability.
ERIC Educational Resources Information Center
Dahabreh, Issa J.; Chung, Mei; Kitsios, Georgios D.; Terasawa, Teruhiko; Raman, Gowri; Tatsioni, Athina; Tobar, Annette; Lau, Joseph; Trikalinos, Thomas A.; Schmid, Christopher H.
2013-01-01
We performed a survey of meta-analyses of test performance to describe the evolution in their methods and reporting. Studies were identified through MEDLINE (1966-2009), reference lists, and relevant reviews. We extracted information on clinical topics, literature review methods, quality assessment, and statistical analyses. We reviewed 760…
NASA Astrophysics Data System (ADS)
Miller, C. Cameron; van Zee, Roger D.; Stephenson, John C.
2001-01-01
The mechanism of the reaction CH4+O(1D2)→CH3+OH was investigated by ultrafast, time-resolved and state-resolved experiments. In the ultrafast experiments, short ultraviolet pulses photolyzed ozone in the CH4·O3 van der Waals complex to produce O(1D2). The ensuing reaction with CH4 was monitored by measuring the appearance rate of OH(v=0,1;J,Ω,Λ) by laser-induced fluorescence, through the OH A←X transition, using short probe pulses. These spectrally broad pulses, centered between 307 and 316 nm, probe many different OH rovibrational states simultaneously. At each probe wavelength, both a fast and a slow rise time were evident in the fluorescence signal, and the ratio of the fast-to-slow signal varied with probe wavelength. The distribution of OH(v,J,Ω,Λ) states, Pobs(v,J,Ω,Λ), was determined by laser-induced fluorescence using a high-resolution, tunable dye laser. The Pobs(v,J,Ω,Λ) data and the time-resolved data were analyzed under the assumption that different formation times represent different reaction mechanisms and that each mechanism produces a characteristic rovibrational distribution. The state-resolved and the time-resolved data can be fit independently using a two-mechanism model: Pobs(v,J,Ω,Λ) can be decomposed into two components, and the appearance of OH can be fit by two exponential rise times. However, these independent analyses are not mutually consistent. The time-resolved and state-resolved data can be consistently fit using a three-mechanism model. The OH appearance signals, at all probe wavelengths, were fit with times τfast≈0.2 ps, τinter≈0.5 ps and τslow≈5.4 ps. The slowest of these three is the rate for dissociation of a vibrationally excited methanol intermediate (CH3OH*) predicted by statistical theory after complete intramolecular energy redistribution following insertion of O(1D2) into CH4.
The Pobs(v,J,Ω,Λ) was decomposed into three components, each with a linear surprisal, under the assumption that the mechanism producing OH at a statistical rate would be characterized by a statistical prior. Dissociation of a CH4O* intermediate before complete energy randomization was identified as producing OH at the intermediate rate and was associated with a population distribution with more rovibrational energy than the slow mechanism. The third mechanism produces OH promptly with a cold rovibrational distribution, indicative of a collinear abstraction mechanism. After these identifications were made, it was possible to predict the fraction of signal associated with each mechanism at different probe wavelengths in the ultrafast experiment, and the predictions proved consistent with measured appearance signals. This model also reconciles data from a variety of previous experiments. While this model is the simplest that is consistent with the data, it is not definitive for several reasons. First, the appearance signals measured in these experiments probe simultaneously many OH(v,J,Ω,Λ) states, which would tend to obfuscate differences in the appearance rate of specific rovibrational states. Second, only about half of the OH(v,J,Ω,Λ) states populated by this reaction could be probed by laser-induced fluorescence through the OH A←X band with our apparatus. Third, the cluster environment might influence the dynamics compared to the free bimolecular reaction.
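The "linear surprisal" invoked in the decomposition above has a standard information-theoretic form (written generically here, not transcribed from the paper): the surprisal of a product state is the log deviation of its observed population from the statistical prior, and each mechanism is characterized by that deviation being linear in a reduced energy variable.

```latex
% Surprisal of product state v relative to the statistical prior P^0.
% A linear surprisal means I(v) is linear in a reduced variable g_v
% (e.g., the fraction of available energy in vibration), with the slope
% lambda_v characterizing the mechanism.
I(v) \;=\; -\ln\frac{P_{\mathrm{obs}}(v)}{P^{0}(v)} \;=\; \lambda_{0} + \lambda_{v}\,g_{v}
```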
Grid indentation analysis of mechanical properties of composite electrodes in Li-ion batteries
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vasconcelos, Luize Scalco de; Xu, Rong; Li, Jianlin
We report that electrodes in commercial rechargeable batteries are microscopically heterogeneous materials. The constituent components, including active materials, polymeric binders, and porous conductive matrix, often have large variation in their mechanical properties, making the mechanical characterization of composite electrodes a challenging task. In a model system of LiNi0.5Mn0.3Co0.2O2 cathode, we employ the instrumented grid indentation to determine the elastic modulus and hardness of the constituent phases. The approach relies on a large array of nanoindentation experiments and statistical analysis of the resulting data provided that the maximum indentation depth is carefully chosen. The statistically extracted properties of the active particles and the surrounding medium are in good agreement with the tests of targeted indentation at selected sites. Lastly, the combinatory technique of grid indentation and statistical deconvolution represents a fast and reliable route to quantify the mechanical properties of composite electrodes that feed the parametric input for the mechanics models.
Entropy for Mechanically Vibrating Systems
NASA Astrophysics Data System (ADS)
Tufano, Dante
The research contained within this thesis deals with the subject of entropy as defined for and applied to mechanically vibrating systems. This work begins with an overview of entropy as it is understood in the fields of classical thermodynamics, information theory, statistical mechanics, and statistical vibroacoustics. Khinchin's definition of entropy, which is the primary definition used for the work contained in this thesis, is introduced in the context of vibroacoustic systems. The main goal of this research is to establish a mathematical framework for the application of Khinchin's entropy in the field of statistical vibroacoustics by examining the entropy context of mechanically vibrating systems. The introduction of this thesis provides an overview of statistical energy analysis (SEA), a modeling approach to vibroacoustics that motivates this work on entropy. The objective of this thesis is given, followed by a discussion of the intellectual merit of this work as well as a literature review of relevant material. Following the introduction, an entropy analysis of systems of coupled oscillators is performed utilizing Khinchin's definition of entropy. This analysis develops the mathematical theory relating to mixing entropy, which is generated by the coupling of vibroacoustic systems. The mixing entropy is shown to provide insight into the qualitative behavior of such systems. Additionally, it is shown that the entropy inequality property of Khinchin's entropy can be reduced to an equality using the mixing entropy concept. This equality can be interpreted as a facet of the second law of thermodynamics for vibroacoustic systems. Following this analysis, an investigation of continuous systems is performed using Khinchin's entropy. It is shown that entropy analyses using Khinchin's entropy are valid for continuous systems that can be decomposed into a finite number of modes.
The results are shown to be analogous to those obtained for simple oscillators, which demonstrates the applicability of entropy-based approaches to real-world systems. Three systems are considered to demonstrate these findings: 1) a rod end-coupled to a simple oscillator, 2) two end-coupled rods, and 3) two end-coupled beams. The aforementioned work utilizes the weak coupling assumption to determine the entropy of composite systems. Following this discussion, a direct method of finding entropy is developed which does not rely on this limiting assumption. The resulting entropy provides a useful benchmark for evaluating the accuracy of the weak coupling approach, and is validated using systems of coupled oscillators. The later chapters of this work discuss Khinchin's entropy as applied to nonlinear and nonconservative systems, respectively. The discussion of entropy for nonlinear systems is motivated by the desire to expand the applicability of SEA techniques beyond the linear regime. The discussion of nonconservative systems is also crucial, since real-world systems interact with their environment, and it is necessary to confirm the validity of an entropy approach for systems that are relevant in the context of SEA. Having developed a mathematical framework for determining entropy under a number of previously unexplored cases, the relationship between thermodynamics and statistical vibroacoustics can be better understood. Specifically, vibroacoustic temperatures can be obtained for systems that are not necessarily linear or weakly coupled. In this way, entropy provides insight into how the power flow proportionality of statistical energy analysis (SEA) can be applied to a broader class of vibroacoustic systems. As such, entropy is a useful tool for both justifying and expanding the foundational results of SEA.
Turking Statistics: Student-Generated Surveys Increase Student Engagement and Performance
ERIC Educational Resources Information Center
Whitley, Cameron T.; Dietz, Thomas
2018-01-01
Thirty years ago, Hubert M. Blalock Jr. published an article in "Teaching Sociology" about the importance of teaching statistics. We honor Blalock's legacy by assessing how using Amazon Mechanical Turk (MTurk) in statistics classes can enhance student learning and increase statistical literacy among social science graduate students. In…
NASA Technical Reports Server (NTRS)
Davis, B. J.; Feiveson, A. H.
1975-01-01
Results are presented of CITARS data processing in raw form. Tables of descriptive statistics are given along with descriptions and results of inferential analyses. The inferential results are organized by questions which CITARS was designed to answer.
Determining Functional Reliability of Pyrotechnic Mechanical Devices
NASA Technical Reports Server (NTRS)
Bement, Laurence J.; Multhaup, Herbert A.
1997-01-01
This paper describes a new approach for evaluating mechanical performance and predicting the mechanical functional reliability of pyrotechnic devices. Not included are other possible failure modes, such as the initiation of the pyrotechnic energy source. The generally accepted go/no-go statistical approach, which requires hundreds or thousands of consecutive successful tests on identical components for reliability predictions, routinely ignores the physics of failure. The approach described in this paper begins with measuring, understanding, and controlling mechanical performance variables. Then, the energy required to accomplish the function is compared to that delivered by the pyrotechnic energy source to determine the mechanical functional margin. Finally, the data collected in establishing functional margin are analyzed to predict mechanical functional reliability, using small-sample statistics. A careful application of this approach can provide considerable cost improvements and understanding over go/no-go statistics. Performance and the effects of variables can be defined, and reliability predictions can be made by evaluating 20 or fewer units. The application of this approach to a pin puller used on a successful NASA mission is provided as an example.
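A minimal sketch of the margin-then-reliability idea described above, with invented energy measurements (all numbers hypothetical, and the normal-distribution assumption is ours; the paper's actual data and distributional choices may differ):

```python
from statistics import NormalDist, mean, stdev

# Hypothetical measurements (joules): energy the mechanism needs to
# function vs. energy the pyrotechnic source delivers.
required = [41.0, 43.5, 42.2, 44.1, 40.8, 42.9]
delivered = [55.2, 57.1, 54.8, 56.3, 55.9, 56.7]

# Functional margin: delivered minus required; for independent normal
# variables the variances add.
margin_mean = mean(delivered) - mean(required)
margin_sd = (stdev(delivered) ** 2 + stdev(required) ** 2) ** 0.5

# Reliability = probability that the margin is positive.
reliability = 1.0 - NormalDist(margin_mean, margin_sd).cdf(0.0)
print(round(margin_mean, 2), reliability)
```

With a margin many standard deviations above zero, a handful of units already supports a high reliability estimate, which is the cost argument made in the abstract.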
Johnson, Quentin R; Lindsay, Richard J; Shen, Tongye
2018-02-21
A computational method which extracts the dominant motions from an ensemble of biomolecular conformations via a correlation analysis of residue-residue contacts is presented. The algorithm first renders the structural information into contact matrices, then constructs the collective modes based on the correlated dynamics of a selected set of dynamic contacts. Associated programs can bridge the results for further visualization using graphics software. The aim of this method is to provide an analysis of conformations of biopolymers from the contact viewpoint. It may assist in the systematic uncovering of conformational switching mechanisms existing in proteins and biopolymer systems in general by statistical analysis of simulation snapshots. In contrast to conventional correlation analyses of Cartesian coordinates (such as distance covariance analysis and Cartesian principal component analysis), this program also provides an alternative way to locate essential collective motions in general. Herein, we detail the algorithm in a stepwise manner and comment on the importance of the method as applied to decoding allosteric mechanisms. © 2018 Wiley Periodicals, Inc.
Buckner, Julia D.; Schmidt, Norman B.
2009-01-01
Background: Individuals with social anxiety disorder (SAD) appear particularly vulnerable to marijuana-related problems. Yet, mechanisms underlying this association are unclear. Methods: This study examined the role of marijuana effect expectancies in the relation between SAD and marijuana problems among 107 marijuana users (43.0% female), 26.2% of whom met Diagnostic and Statistical Manual of Mental Disorders—Fourth Edition criteria for SAD. Anxiety and mood disorders were determined during clinical interviews using the Anxiety Disorders Interview Schedule—IV-L (ADIS-IV). Results: Analyses (including sex, marijuana use frequency, major depressive disorder, and other anxiety disorders) suggest that SAD was the only disorder significantly associated with past 3-month marijuana problems. Compared to those without SAD, individuals with SAD were more likely to endorse the following marijuana expectancies: cognitive/behavioral impairment and global negative expectancies. Importantly, these expectancies mediated the relations between SAD status and marijuana problems. Conclusions: These data support the contention that SAD is uniquely related to marijuana problems and provide insight into mechanisms underlying this vulnerability. PMID:19373871
Bahrami, Hoda; Keshel, Saeed Heidari; Chari, Aliakbar Jafari; Biazar, Esmaeil
2016-09-01
Unrestricted somatic stem cells (USSCs) loaded in nanofibrous polycaprolactone (PCL) scaffolds can be used for skin regeneration when grafted onto full-thickness skin defects of rats. Nanofibrous PCL scaffolds were designed by the electrospinning method and crosslinked with laminin protein. Afterwards, the scaffolds were evaluated by scanning electron microscopy, and physical and mechanical assays. In this study, nanofibrous PCL scaffolds loaded with USSCs were grafted onto the skin defects. The wounds were subsequently investigated 21 days after grafting. Results of mechanical and physical analyses showed good resilience and compliance to movement as a skin graft. In animal models, study samples exhibited the most pronounced effect on wound closure, with statistically significant improvement in wound healing being seen at 21 days post-operatively. Histological examinations of healed wounds from all samples showed a thin epidermis plus recovered skin appendages in the dermal layer for cell-loaded samples. Thus, the graft of nanofibrous PCL scaffolds loaded with USSCs showed better results during the healing process of skin defects in rat models.
From behavioural analyses to models of collective motion in fish schools
Lopez, Ugo; Gautrais, Jacques; Couzin, Iain D.; Theraulaz, Guy
2012-01-01
Fish schooling is a phenomenon of long-lasting interest in ethology and ecology, widely spread across taxa and ecological contexts, and has attracted much interest from statistical physics and theoretical biology as a case of self-organized behaviour. One topic of intense interest is the search for the specific behavioural mechanisms at stake at the individual level, from which the school properties emerge. This is fundamental for understanding how selective pressure acting at the individual level promotes adaptive properties of schools and for disambiguating functional properties from non-adaptive epiphenomena. Decades of studies on collective motion by means of individual-based modelling have allowed a qualitative understanding of the self-organization processes leading to collective properties at the school level, and provided insight into the behavioural mechanisms that result in coordinated motion. Here, we emphasize a set of paradigmatic modelling assumptions whose validity remains unclear, both from a behavioural point of view and in terms of quantitative agreement between model outcome and empirical data. We advocate a specific and biologically oriented re-examination of these assumptions through experiment-based behavioural analysis and modelling. PMID:24312723
Quasi-Static Probabilistic Structural Analyses Process and Criteria
NASA Technical Reports Server (NTRS)
Goldberg, B.; Verderaime, V.
1999-01-01
Current deterministic structural methods are easily applied to substructures and components, and analysts have built great design insights and confidence in them over the years. However, deterministic methods cannot support systems risk analyses, and it was recently reported that deterministic treatment of statistical data is inconsistent with error propagation laws, which can result in unevenly conservative structural predictions. Assuming normal distributions and using statistical data formats throughout prevailing deterministic stress processes leads to a safety factor in statistical format which, integrated into the safety index, provides a safety-factor and first-order reliability relationship. The embedded safety factor in the safety index expression allows a historically based risk to be determined and verified over a variety of quasi-static metallic substructures, consistent with traditional safety factor methods and NASA Std. 5001 criteria.
One-dimensional statistical parametric mapping in Python.
Pataky, Todd C
2012-01-01
Statistical parametric mapping (SPM) is a topological methodology for detecting field changes in smooth n-dimensional continua. Many classes of biomechanical data are smooth and contained within discrete bounds and as such are well suited to SPM analyses. The current paper accompanies release of 'SPM1D', a free and open-source Python package for conducting SPM analyses on a set of registered 1D curves. Three example applications are presented: (i) kinematics, (ii) ground reaction forces and (iii) contact pressure distribution in probabilistic finite element modelling. In addition to offering a high-level interface to a variety of common statistical tests like t tests, regression and ANOVA, SPM1D also emphasises fundamental concepts of SPM theory through stand-alone example scripts. Source code and documentation are available at: www.tpataky.net/spm1d/.
Football goal distributions and extremal statistics
NASA Astrophysics Data System (ADS)
Greenhough, J.; Birch, P. C.; Chapman, S. C.; Rowlands, G.
2002-12-01
We analyse the distributions of the number of goals scored by home teams, away teams, and the total scored in the match, in domestic football games from 169 countries between 1999 and 2001. The probability density functions (PDFs) of goals scored are too heavy-tailed to be fitted over their entire ranges by Poisson or negative binomial distributions which would be expected for uncorrelated processes. Log-normal distributions cannot include zero scores and here we find that the PDFs are consistent with those arising from extremal statistics. In addition, we show that it is sufficient to model English top division and FA Cup matches in the seasons of 1970/71-2000/01 on Poisson or negative binomial distributions, as reported in analyses of earlier seasons, and that these are not consistent with extremal statistics.
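A quick sanity check of the tail comparison described above can be sketched in Python (the Poisson mean and the empirical tail fraction below are assumed for illustration, not fitted from the article's data):

```python
import math

def poisson_pmf(k: int, lam: float) -> float:
    """P(X = k) for a Poisson distribution with mean lam."""
    return math.exp(-lam) * lam ** k / math.factorial(k)

lam = 1.5  # assumed mean goals per team per match
# Poisson probability of a team scoring 6 or more goals.
p_poisson_tail = 1.0 - sum(poisson_pmf(k, lam) for k in range(6))

# Hypothetical empirical fraction of scores of 6+ goals; a heavy-tailed
# PDF puts more mass here than a Poisson fit to the bulk predicts.
empirical_tail = 0.01

print(p_poisson_tail < empirical_tail)  # the Poisson tail is lighter
```

This is the qualitative mismatch that motivates fitting the full range of the goal distribution with extremal statistics rather than Poisson or negative binomial forms.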
Logistic regression applied to natural hazards: rare event logistic regression with replications
NASA Astrophysics Data System (ADS)
Guns, M.; Vanacker, V.
2012-06-01
Statistical analysis of natural hazards needs particular attention, as most of these phenomena are rare events. This study shows that ordinary rare event logistic regression, as it is now commonly used in geomorphologic studies, does not always lead to a robust detection of controlling factors, as the results can be strongly sample-dependent. In this paper, we introduce concepts from Monte Carlo simulation into rare event logistic regression. This technique, termed rare event logistic regression with replications, combines the strengths of probabilistic and statistical methods and overcomes some of the limitations of previous developments through robust variable selection. The technique was developed here for the analysis of landslide controlling factors, but the concept is widely applicable to statistical analyses of natural hazards.
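The replication idea can be sketched as follows (a pure-Python toy: all rare events are kept, the abundant non-events are repeatedly subsampled, and a univariate log odds ratio stands in for the full logistic fit; all counts are invented):

```python
import math
import random

random.seed(1)

# 1 = controlling factor present, 0 = absent.
events = [1] * 40 + [0] * 10          # 50 rare events (e.g., landslide cells)
nonevents = [1] * 300 + [0] * 700     # 1000 abundant non-event cells

estimates = []
for _ in range(200):                  # Monte Carlo replications
    sample = random.sample(nonevents, 100)  # fresh subsample of non-events
    a, b = sum(events), len(events) - sum(events)
    c, d = sum(sample), len(sample) - sum(sample)
    estimates.append(math.log((a * d) / (b * c)))  # log odds ratio "fit"

# A factor is retained only if its effect is stable across replications,
# here meaning every replicated estimate has the same sign.
stable = min(estimates) > 0
print(len(estimates), stable)
```

Robust variable selection then keeps only factors whose estimated effect survives across the replicated subsamples, rather than trusting a single sample-dependent fit.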
Gadbury, Gary L.; Allison, David B.
2012-01-01
Much has been written regarding p-values below certain thresholds (most notably 0.05) denoting statistical significance and the tendency of such p-values to be more readily publishable in peer-reviewed journals. Intuition suggests that there may be a tendency to manipulate statistical analyses to push a “near significant p-value” to a level that is considered significant. This article presents a method for detecting the presence of such manipulation (herein called “fiddling”) in a distribution of p-values from independent studies. Simulations are used to illustrate the properties of the method. The results suggest that the method has low type I error and that power approaches acceptable levels as the number of p-values being studied approaches 1000. PMID:23056287
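The signature such a method looks for can be illustrated with a small simulation (a hypothetical sketch of the idea only, not the article's actual detection procedure):

```python
import random

random.seed(0)

# Under the null, p-values from independent studies are uniform on (0, 1),
# so narrow bins on either side of 0.05 should hold similar counts.
pvals = [random.random() for _ in range(1000)]

# Simulated "fiddling": nudge near-significant p-values in [0.05, 0.06)
# to just below the 0.05 threshold.
fiddled = [p - 0.011 if 0.05 <= p < 0.06 else p for p in pvals]

below = sum(1 for p in fiddled if 0.04 <= p < 0.05)
above = sum(1 for p in fiddled if 0.05 <= p < 0.06)
print(below, above)  # an excess just below 0.05 betrays the manipulation
```

The article's point about power is visible here too: with only a few dozen p-values the bin counts are noisy, and the asymmetry becomes reliably detectable only as the collection approaches about 1000 p-values.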
Prison Radicalization: The New Extremist Training Grounds?
2007-09-01
distributing and collecting survey data, and the data analysis. The analytical methodology includes descriptive and inferential statistical methods, in... statistical analysis of the responses to identify significant correlations and relationships. B. SURVEY DATA COLLECTION To effectively access a...Q18, Q19, Q20, and Q21. Due to the exploratory nature of this small survey, data analyses were confined mostly to descriptive statistics and
Chen, C; Xiang, J Y; Hu, W; Xie, Y B; Wang, T J; Cui, J W; Xu, Y; Liu, Z; Xiang, H; Xie, Q
2015-11-01
To screen and identify safe micro-organisms used during Douchi fermentation, and verify the feasibility of producing high-quality Douchi using these identified micro-organisms. PCR-denaturing gradient gel electrophoresis (DGGE) and automatic amino-acid analyser were used to investigate the microbial diversity and free amino acids (FAAs) content of 10 commercial Douchi samples. The correlations between microbial communities and FAAs were analysed by statistical analysis. Ten strains with significant positive correlation were identified. Then an experiment on Douchi fermentation by identified strains was carried out, and the nutritional composition in Douchi was analysed. Results showed that FAAs and relative content of isoflavone aglycones in verification Douchi samples were generally higher than those in commercial Douchi samples. Our study indicated that fungi, yeasts, Bacillus and lactic acid bacteria were the key players in Douchi fermentation, and with identified probiotic micro-organisms participating in fermentation, a higher quality Douchi product was produced. This is the first report to analyse and confirm the key micro-organisms during Douchi fermentation by statistical analysis. This work proves fermentation micro-organisms to be the key influencing factor of Douchi quality, and demonstrates the feasibility of fermenting Douchi using identified starter micro-organisms. © 2015 The Society for Applied Microbiology.
Unconscious analyses of visual scenes based on feature conjunctions.
Tachibana, Ryosuke; Noguchi, Yasuki
2015-06-01
To efficiently process a cluttered scene, the visual system analyzes statistical properties or regularities of visual elements embedded in the scene. It is controversial, however, whether those scene analyses could also work for stimuli unconsciously perceived. Here we show that our brain performs the unconscious scene analyses not only using a single featural cue (e.g., orientation) but also based on conjunctions of multiple visual features (e.g., combinations of color and orientation information). Subjects foveally viewed a stimulus array (duration: 50 ms) where 4 types of bars (red-horizontal, red-vertical, green-horizontal, and green-vertical) were intermixed. Although a conscious perception of those bars was inhibited by a subsequent mask stimulus, the brain correctly analyzed the information about color, orientation, and color-orientation conjunctions of those invisible bars. The information of those features was then used for the unconscious configuration analysis (statistical processing) of the central bars, which induced a perceptual bias and illusory feature binding in visible stimuli at peripheral locations. While statistical analyses and feature binding are normally 2 key functions of the visual system to construct coherent percepts of visual scenes, our results show that a high-level analysis combining those 2 functions is correctly performed by unconscious computations in the brain. (c) 2015 APA, all rights reserved.
Study Designs and Statistical Analyses for Biomarker Research
Gosho, Masahiko; Nagashima, Kengo; Sato, Yasunori
2012-01-01
Biomarkers are becoming increasingly important for streamlining drug discovery and development. In addition, biomarkers are widely expected to be used as a tool for disease diagnosis, personalized medication, and surrogate endpoints in clinical research. In this paper, we highlight several important aspects related to study design and statistical analysis for clinical research incorporating biomarkers. We describe the typical and current study designs for exploring, detecting, and utilizing biomarkers. Furthermore, we introduce statistical issues such as confounding and multiplicity for statistical tests in biomarker research. PMID:23012528
NASA Technical Reports Server (NTRS)
Young, M.; Koslovsky, M.; Schaefer, Caroline M.; Feiveson, A. H.
2017-01-01
Back by popular demand, the JSC Biostatistics Laboratory and LSAH statisticians are offering an opportunity to discuss your statistical challenges and needs. Take the opportunity to meet the individuals offering expert statistical support to the JSC community. Join us for an informal conversation about any questions you may have encountered with issues of experimental design, analysis, or data visualization. Get answers to common questions about sample size, repeated measures, statistical assumptions, missing data, multiple testing, time-to-event data, and when to trust the results of your analyses.
Accuracy of medical subject heading indexing of dental survival analyses.
Layton, Danielle M; Clarke, Michael
2014-01-01
To assess the Medical Subject Headings (MeSH) indexing of articles that employed time-to-event analyses to report outcomes of dental treatment in patients. Articles published in 2008 in 50 dental journals with the highest impact factors were hand searched to identify articles reporting dental treatment outcomes over time in human subjects with time-to-event statistics (included, n = 95), without time-to-event statistics (active controls, n = 91), and all other articles (passive controls, n = 6,769). The search was systematic (kappa 0.92 for screening, 0.86 for eligibility). Outcome-, statistic- and time-related MeSH were identified, and differences in allocation between groups were analyzed with chi-square and Fisher exact statistics. The most frequently allocated MeSH for included and active control articles were "dental restoration failure" (77% and 52%, respectively) and "treatment outcome" (54% and 48%, respectively). Outcome MeSH allocation was similar between these groups (86% and 77%, respectively) and significantly greater than for passive controls (10%, P < .001). Significantly more statistical MeSH were allocated to the included articles than to the active or passive controls (67%, 15%, and 1%, respectively, P < .001). Sixty-nine included articles specifically used Kaplan-Meier or life table analyses, but only 42% (n = 29) were indexed as such. Significantly more time-related MeSH were allocated to the included articles than to the active controls (92% and 79%, respectively, P = .02), or to the passive controls (22%, P < .001). MeSH allocation within MEDLINE to time-to-event dental articles was inaccurate and inconsistent. Statistical MeSH were omitted from 30% of the included articles and incorrectly allocated to 15% of active controls. Such errors adversely impact search accuracy.
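The Kaplan-Meier analyses whose indexing this study audits can be sketched in a few lines. This is a minimal product-limit estimator; the follow-up times and censoring flags are invented, not from the audited articles:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier product-limit survival curve.
    times: follow-up time for each subject; events: 1 = failure, 0 = censored.
    Returns [(t, S(t))] at each distinct failure time."""
    n_at_risk = len(times)
    surv = 1.0
    curve = []
    # walk through the distinct observed times in order
    for t in sorted(set(times)):
        d = sum(1 for ti, ei in zip(times, events) if ti == t and ei == 1)
        c = sum(1 for ti, ei in zip(times, events) if ti == t and ei == 0)
        if d > 0:
            surv *= 1 - d / n_at_risk       # step down at each failure time
            curve.append((t, surv))
        n_at_risk -= d + c                  # remove failures and censorings
    return curve

# five hypothetical restorations: failures at 2 and 5 years, censoring at 3, 6, 6
print(kaplan_meier([2, 3, 5, 6, 6], [1, 0, 1, 0, 0]))
```

The censored subject at year 3 still contributes to the risk set at year 2 but not at year 5, which is exactly the feature that distinguishes time-to-event statistics from a simple failure proportion.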
Fracture Mechanics Analyses of the Slip-Side Joggle Regions of Wing-Leading-Edge Panels
NASA Technical Reports Server (NTRS)
Raju, Ivatury S.; Knight, Norman F., Jr.; Song, Kyongchan; Phillips, Dawn R.
2011-01-01
The Space Shuttle wing-leading edge consists of panels that are made of reinforced carbon-carbon. Coating spallation was observed near the slip-side region of the panels that experience extreme heating. To understand this phenomenon, a root-cause investigation was conducted. As part of that investigation, fracture mechanics analyses of the slip-side joggle regions of the hot panels were conducted. This paper presents an overview of the fracture mechanics analyses.
Statistical analysis of iron geochemical data suggests limited late Proterozoic oxygenation
NASA Astrophysics Data System (ADS)
Sperling, Erik A.; Wolock, Charles J.; Morgan, Alex S.; Gill, Benjamin C.; Kunzmann, Marcus; Halverson, Galen P.; MacDonald, Francis A.; Knoll, Andrew H.; Johnston, David T.
2015-07-01
Sedimentary rocks deposited across the Proterozoic-Phanerozoic transition record extreme climate fluctuations, a potential rise in atmospheric oxygen or re-organization of the seafloor redox landscape, and the initial diversification of animals. It is widely assumed that the inferred redox change facilitated the observed trends in biodiversity. Establishing this palaeoenvironmental context, however, requires that changes in marine redox structure be tracked by means of geochemical proxies and translated into estimates of atmospheric oxygen. Iron-based proxies are among the most effective tools for tracking the redox chemistry of ancient oceans. These proxies are inherently local, but have global implications when analysed collectively and statistically. Here we analyse about 4,700 iron-speciation measurements from shales 2,300 to 360 million years old. Our statistical analyses suggest that subsurface water masses in mid-Proterozoic oceans were predominantly anoxic and ferruginous (depleted in dissolved oxygen and iron-bearing), but with a tendency towards euxinia (sulfide-bearing) that is not observed in the Neoproterozoic era. Analyses further indicate that early animals did not experience appreciable benthic sulfide stress. Finally, unlike proxies based on redox-sensitive trace-metal abundances, iron geochemical data do not show a statistically significant change in oxygen content through the Ediacaran and Cambrian periods, sharply constraining the magnitude of the end-Proterozoic oxygen increase. Indeed, this re-analysis of trace-metal data is consistent with oxygenation continuing well into the Palaeozoic era. Therefore, if changing redox conditions facilitated animal diversification, it did so through a limited rise in oxygen past critical functional and ecological thresholds, as is seen in modern oxygen minimum zone benthic animal communities.
Ensor, Joie; Riley, Richard D.
2016-01-01
Meta‐analysis using individual participant data (IPD) obtains and synthesises the raw, participant‐level data from a set of relevant studies. The IPD approach is becoming an increasingly popular tool as an alternative to traditional aggregate data meta‐analysis, especially as it avoids reliance on published results and provides an opportunity to investigate individual‐level interactions, such as treatment‐effect modifiers. There are two statistical approaches for conducting an IPD meta‐analysis: one‐stage and two‐stage. The one‐stage approach analyses the IPD from all studies simultaneously, for example, in a hierarchical regression model with random effects. The two‐stage approach derives aggregate data (such as effect estimates) in each study separately and then combines these in a traditional meta‐analysis model. There have been numerous comparisons of the one‐stage and two‐stage approaches via theoretical consideration, simulation and empirical examples, yet there remains confusion regarding when each approach should be adopted, and indeed why they may differ. In this tutorial paper, we outline the key statistical methods for one‐stage and two‐stage IPD meta‐analyses, and provide 10 key reasons why they may produce different summary results. We explain that most differences arise because of different modelling assumptions, rather than the choice of one‐stage or two‐stage itself. We illustrate the concepts with recently published IPD meta‐analyses, summarise key statistical software and provide recommendations for future IPD meta‐analyses. © 2016 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd. PMID:27747915
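The two-stage approach described above reduces, in its simplest form, to estimating an effect and variance per study and then pooling by inverse-variance weighting. A minimal sketch with a fixed-effect second stage; the participant data are invented:

```python
import math

def study_effect(treat, control):
    """Stage 1: mean difference and its variance from one study's IPD."""
    def mean(x): return sum(x) / len(x)
    def var(x):
        m = mean(x)
        return sum((v - m) ** 2 for v in x) / (len(x) - 1)
    d = mean(treat) - mean(control)
    v = var(treat) / len(treat) + var(control) / len(control)
    return d, v

def pool_fixed(effects):
    """Stage 2: inverse-variance fixed-effect pooling of (effect, variance) pairs."""
    w = [1 / v for _, v in effects]
    est = sum(wi * d for wi, (d, _) in zip(w, effects)) / sum(w)
    se = math.sqrt(1 / sum(w))
    return est, se

# two hypothetical studies, each as (treatment outcomes, control outcomes)
studies = [
    ([5.1, 6.0, 5.5, 6.2], [4.0, 4.4, 3.9, 4.6]),
    ([5.8, 6.4, 6.1], [4.9, 5.2, 4.7]),
]
est, se = pool_fixed([study_effect(t, c) for t, c in studies])
print(round(est, 3), round(se, 3))
```

A one-stage analysis would instead fit a single (hierarchical) model to all participant rows at once; as the tutorial stresses, differences between the two usually trace back to modelling assumptions such as fixed versus random effects, not to the staging itself.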
Eu, Byung Chan
2008-09-07
In the traditional theories of irreversible thermodynamics and fluid mechanics, the specific volume and molar volume have been used interchangeably for pure fluids, but in this work we show that they should be distinguished from each other and given distinct statistical mechanical representations. In this paper, we present a general formula for the statistical mechanical representation of the molecular domain (volume or space) by using the Voronoi volume and its mean value, which may be regarded as the molar domain (volume), together with the statistical mechanical representation of the volume flux. By using their statistical mechanical formulas, the evolution equations of volume transport are derived from the generalized Boltzmann equation of fluids. Approximate solutions of the evolution equations of volume transport provide kinetic theory formulas for the molecular domain, the constitutive equations for the molar domain (volume) and volume flux, and the dissipation of energy associated with volume transport. Together with the constitutive equation for the mean velocity of the fluid obtained in a previous paper, the evolution equations for volume transport not only shed fresh light on, and insight into, irreversible phenomena in fluids but also can be applied to study fluid flow problems in a manner hitherto unavailable in fluid dynamics and irreversible thermodynamics. Their roles in the generalized hydrodynamics will be considered in the sequel.
Statistical analysis and handling of missing data in cluster randomized trials: a systematic review.
Fiero, Mallorie H; Huang, Shuang; Oren, Eyal; Bell, Melanie L
2016-02-09
Cluster randomized trials (CRTs) randomize participants in groups, rather than as individuals, and are key tools used to assess interventions in health research where treatment contamination is likely or individual randomization is not feasible. Two major potential pitfalls exist regarding CRTs, namely handling missing data and failing to account for clustering in the primary analysis. The aim of this review was to evaluate approaches for handling missing data and for statistical analysis with respect to the primary outcome in CRTs. We systematically searched for CRTs published between August 2013 and July 2014 using PubMed, Web of Science, and PsycINFO. For each trial, two independent reviewers assessed the extent of the missing data and the method(s) used for handling missing data in the primary and sensitivity analyses. We evaluated the primary analysis and determined whether it was at the cluster or individual level. Of the 86 included CRTs, 80 (93%) trials reported some missing outcome data. Of those reporting missing data, the median percent of individuals with a missing outcome was 19% (range 0.5 to 90%). The most common way to handle missing data in the primary analysis was complete case analysis (44, 55%), whereas 18 (22%) used mixed models, six (8%) used single imputation, four (5%) used unweighted generalized estimating equations, and two (2%) used multiple imputation. Fourteen (16%) trials reported a sensitivity analysis for missing data, but most assumed the same missing data mechanism as in the primary analysis. Overall, 67 (78%) trials accounted for clustering in the primary analysis. High rates of missing outcome data are present in the majority of CRTs, yet handling missing data in practice remains suboptimal. Researchers and applied statisticians should use appropriate missing data methods, valid under plausible assumptions, to increase statistical power in trials and reduce the possibility of bias.
Sensitivity analyses should be performed under weakened assumptions about the missing data mechanism, to explore the robustness of the results reported in the primary analysis.
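The clustering pitfall the review highlights is often quantified through the design effect, which deflates the effective sample size of a CRT. A minimal sketch; the cluster size and intraclass correlation (ICC) values are hypothetical:

```python
def design_effect(cluster_size, icc):
    """Variance inflation from cluster randomization: 1 + (m - 1) * ICC,
    for average cluster size m and intraclass correlation ICC."""
    return 1 + (cluster_size - 1) * icc

def effective_n(n_total, cluster_size, icc):
    """Effective sample size after deflating for clustering."""
    return n_total / design_effect(cluster_size, icc)

# e.g. 20 clinics of 30 patients each, with a modest ICC of 0.05
n = 20 * 30
print(design_effect(30, 0.05))            # variance inflated ~2.4-fold
print(round(effective_n(n, 30, 0.05), 1))
```

Even a small ICC more than doubles the variance here, which is why an analysis that ignores clustering (as 22% of the reviewed trials did) can badly overstate its precision.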
NASA Astrophysics Data System (ADS)
Mascaro, G.; Vivoni, E. R.; Gochis, D. J.; Watts, C. J.; Rodriguez, J. C.
2013-12-01
In northwest Mexico, the statistical properties of rainfall at high temporal resolution (up to 1 min) have been poorly characterized, mainly due to a lack of observations. Under a combined effort of US and Mexican institutions initiated during the North American Monsoon-Soil Moisture Experiment in 2004 (NAME-SMEX04), a network of 8 tipping-bucket rain gauges was installed across a topographic transect in the Sierra Los Locos basin of Sonora, Mexico. The transect spans a distance of ~14 km and an elevation difference of 748 m, thus including valley, mid-elevation and ridge sites where rainfall generation mechanisms in the summer and winter seasons are potentially affected by orography. In this study, we used the data collected during 2007-2010 to characterize the rainfall statistical properties over a wide range of time scales (1 min to ~45 days) and analyzed how these properties change as a function of elevation, gauge separation distance, and season (summer versus winter). We found that the total summer (winter) rainfall decreases (increases) with elevation, and that rainfall has a clear diurnal cycle in the summertime, with a peak around 9 pm at all gauges. The correlation structure across the transect indicates that, when the time series are aggregated to intervals of 3 hours or longer, the correlation distance exceeds the maximum separation distance (~14 km), whereas it decreases dramatically at finer resolutions (e.g., to ~1.5 km at a 10-min resolution). Consistent with other semiarid regions, spectral and scale invariance analyses show the presence of different scaling regimes, associated with single convective events and larger stratiform systems, with different intermittency properties depending on the rainfall season.
Results of this work are useful for the interpretation of storm generation mechanisms and hydrologic response in the region, as well as for the calibration of high-resolution, stochastic rainfall models used in climate, hydrology, and engineering applications.
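The dependence of inter-gauge correlation on aggregation interval reported above can be reproduced with a toy calculation. The sketch below uses two invented 1-min gauge series recording the same storms offset by a couple of minutes; at the native resolution the offset destroys correlation, while coarser aggregation recovers it:

```python
def aggregate(series, window):
    """Sum a 1-min series into consecutive blocks of `window` minutes."""
    return [sum(series[i:i + window]) for i in range(0, len(series), window)]

def pearson(x, y):
    """Pearson correlation of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

# two hypothetical gauges seeing the same two storms, offset by ~2 min
gauge_a = [0] * 40
gauge_b = [0] * 40
gauge_a[2:7] = [1, 2, 3, 2, 1]; gauge_b[4:9] = [1, 2, 3, 2, 1]
gauge_a[24:27] = [2, 4, 2];     gauge_b[26:29] = [2, 4, 2]

for w in (1, 5, 10):
    r = pearson(aggregate(gauge_a, w), aggregate(gauge_b, w))
    print(f"window {w:>2} min: r = {r:.2f}")
```

The correlation climbs from near zero at 1-min resolution to 1.0 at 10-min blocks, a miniature version of why the observed correlation distance grows with the aggregation interval.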
Material Phase Causality or a Dynamics-Statistical Interpretation of Quantum Mechanics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Koprinkov, I. G.
2010-11-25
The internal phase dynamics of a quantum system interacting with an electromagnetic field is revealed in detail. Theoretical and experimental evidence of a causal relation of the phase of the wave function to the dynamics of the quantum system is presented systematically for the first time. A dynamics-statistical interpretation of quantum mechanics is introduced.
ERIC Educational Resources Information Center
Findley, Bret R.; Mylon, Steven E.
2008-01-01
We introduce a computer exercise that bridges spectroscopy and thermodynamics using statistical mechanics and the experimental data taken from the commonly used laboratory exercise involving the rotational-vibrational spectrum of HCl. Based on the results from the analysis of their HCl spectrum, students calculate bulk thermodynamic properties…
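The statistical-mechanics bridge this exercise exploits can be sketched numerically: from a rotational constant, a rigid-rotor partition function and the most populated rotational level at room temperature follow directly. The constant below (B ≈ 10.59 cm⁻¹ for HCl) and the rigid-rotor approximation are assumptions of this sketch, not values from the article:

```python
import math

B = 10.59                  # HCl rotational constant, cm^-1 (rigid-rotor assumption)
kT_hc = 0.6950 * 298.15    # k/(h*c) ~ 0.6950 cm^-1 per kelvin, at 298.15 K

def q_rot(n_terms=100):
    """Rigid-rotor rotational partition function:
    q = sum over J of (2J+1) * exp(-B*J*(J+1) / (kT/hc))."""
    return sum((2 * J + 1) * math.exp(-B * J * (J + 1) / kT_hc)
               for J in range(n_terms))

def most_populated_J(n_terms=100):
    """J maximizing the Boltzmann population (2J+1) * exp(-E_J/kT)."""
    return max(range(n_terms),
               key=lambda J: (2 * J + 1) * math.exp(-B * J * (J + 1) / kT_hc))

print(round(q_rot(), 2))   # close to the high-temperature limit kT/(hcB) ~ 19.6
print(most_populated_J())  # the level where the rot-vib line intensities peak
```

The most populated level (J = 3 with these constants) is exactly where students see the intensity maximum in each branch of their HCl spectrum, tying the spectroscopic observation to the Boltzmann distribution.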
Non-equilibrium dog-flea model
NASA Astrophysics Data System (ADS)
Ackerson, Bruce J.
2017-11-01
We develop the open dog-flea model to serve as a check of proposed non-equilibrium theories of statistical mechanics. The model is developed in detail and then applied to four recent models for non-equilibrium statistical mechanics. Comparing the dog-flea solution with these different models allows their claims to be checked and provides a concrete example for each theoretical model.
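The dog-flea (Ehrenfest) model underlying this paper is simple to simulate. Below is a minimal sketch of the classic closed version (the abstract's open variant adds exchange with a reservoir, which this sketch omits); flea count, step count, and seed are arbitrary choices:

```python
import random

def dog_flea(n_fleas=50, steps=2000, seed=1):
    """Closed Ehrenfest dog-flea model: at each step one flea, chosen
    uniformly at random, jumps to the other dog. Returns the number of
    fleas on dog A after each step."""
    rng = random.Random(seed)
    on_a = n_fleas            # start with every flea on dog A
    history = [on_a]
    for _ in range(steps):
        # the chosen flea sits on A with probability on_a / n_fleas
        if rng.random() < on_a / n_fleas:
            on_a -= 1         # a flea jumps from A to B
        else:
            on_a += 1         # a flea jumps from B to A
        history.append(on_a)
    return history

h = dog_flea()
# relaxation from the all-on-A initial state toward equipartition (n_fleas / 2)
print(h[0], round(sum(h[-500:]) / 500, 1))
```

The deterministic relaxation of the mean toward N/2, with persistent fluctuations about it, is the behavior against which candidate non-equilibrium theories can be checked.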