Improving surveillance for injuries associated with potential motor vehicle safety defects
Whitfield, R; Whitfield, A
2004-01-01
Objective: To improve surveillance for deaths and injuries associated with potential motor vehicle safety defects. Design: Vehicles in fatal crashes can be studied for indications of potential defects using an "early warning" surveillance statistic previously suggested for screening reports of adverse drug reactions. This statistic is illustrated with time series data for fatal tire-related and fire-related crashes. Geographic analyses are used to augment the tire-related statistics. Results: A statistical criterion based on the Poisson distribution that tests the likelihood of an expected number of events, given the number of events that actually occurred, is a promising method that can be readily adapted for use in injury surveillance. Conclusions: Use of the demonstrated techniques could have helped to avert a well-known injury surveillance failure. This method can be adapted to help direct engineering and statistical reviews to prevent deaths and injuries associated with potential motor vehicle safety defects using available databases. PMID:15066972
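A minimal sketch of such a Poisson screening test, assuming hypothetical counts and a hypothetical flagging threshold (the paper's exact statistic and cutoffs are not reproduced here):

```python
from scipy.stats import poisson

def poisson_signal(observed, expected, alpha=0.01):
    """Flag a vehicle group if the observed count of, e.g., tire-related
    fatal crashes is improbably high given the expected baseline count.
    Computes P(X >= observed | mu = expected) under a Poisson model; a
    small tail probability suggests a potential defect worth review.
    """
    p_tail = poisson.sf(observed - 1, expected)  # sf(k) = P(X > k)
    return p_tail, p_tail < alpha

# Example: 17 tire-related fatal crashes observed where 6 were expected.
p, flag = poisson_signal(17, 6)
print(f"tail probability = {p:.2g}, flag = {flag}")
```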
Confidence crisis of results in biomechanics research.
Knudson, Duane
2017-11-01
Many biomechanics studies have small sample sizes and incorrect statistical analyses, so inaccurate inferences and inflated effect magnitudes are commonly reported in the field. This review examines these issues in biomechanics research and summarises potential solutions from research in other fields to increase the confidence in the experimental effects reported in biomechanics. Authors, reviewers and editors of biomechanics research reports are encouraged to improve sample sizes and the resulting statistical power, improve reporting transparency, improve the rigour of statistical analyses used, and increase the acceptance of replication studies to improve the validity of inferences from data in biomechanics research. The application of sports biomechanics research results would also improve if a larger percentage of unbiased effects and their uncertainty were reported in the literature.
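To make the power problem concrete, a hedged illustration with statsmodels (the effect size and group sizes are hypothetical, not figures from the review):

```python
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()

# Per-group sample size needed to detect a medium effect (Cohen's d = 0.5)
# with 80% power at two-sided alpha = 0.05.
n_per_group = analysis.solve_power(effect_size=0.5, alpha=0.05, power=0.8)
print(f"required n per group: {n_per_group:.0f}")  # about 64

# Power actually achieved by a typical small study (n = 10 per group).
power = analysis.solve_power(effect_size=0.5, alpha=0.05, nobs1=10)
print(f"power with n = 10: {power:.2f}")  # well under 0.5
```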
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bender, Jason D.; Doraiswamy, Sriram; Candler, Graham V., E-mail: truhlar@umn.edu, E-mail: candler@aem.umn.edu
2014-02-07
Fitting potential energy surfaces to analytic forms is an important first step for efficient molecular dynamics simulations. Here, we present an improved version of the local interpolating moving least squares method (L-IMLS) for such fitting. Our method has three key improvements. First, pairwise interactions are modeled separately from many-body interactions. Second, permutational invariance is incorporated in the basis functions, using permutationally invariant polynomials in Morse variables, and in the weight functions. Third, computational cost is reduced by statistical localization, in which we statistically correlate the cutoff radius with data point density. We motivate our discussion in this paper with a review of global and local least-squares-based fitting methods in one dimension. Then, we develop our method in six dimensions, and we note that it allows the analytic evaluation of gradients, a feature that is important for molecular dynamics. The approach, which we call statistically localized, permutationally invariant, local interpolating moving least squares fitting of the many-body potential (SL-PI-L-IMLS-MP, or, more simply, L-IMLS-G2), is used to fit a potential energy surface to an electronic structure dataset for N₄. We discuss its performance on the dataset and give directions for further research, including applications to trajectory calculations.
Falgreen, Steffen; Laursen, Maria Bach; Bødker, Julie Støve; Kjeldsen, Malene Krag; Schmitz, Alexander; Nyegaard, Mette; Johnsen, Hans Erik; Dybkær, Karen; Bøgsted, Martin
2014-06-05
In vitro generated dose-response curves of human cancer cell lines are widely used to develop new therapeutics. The curves are summarised by simplified statistics that ignore the conventionally used dose-response curves' dependency on drug exposure time and growth kinetics. This may lead to suboptimal exploitation of data and biased conclusions on the potential of the drug in question. Therefore we set out to improve the dose-response assessments by eliminating the impact of time dependency. First, a mathematical model for drug induced cell growth inhibition was formulated and used to derive novel dose-response curves and improved summary statistics that are independent of time under the proposed model. Next, a statistical analysis workflow for estimating the improved statistics was suggested consisting of 1) nonlinear regression models for estimation of cell counts and doubling times, 2) isotonic regression for modelling the suggested dose-response curves, and 3) resampling based method for assessing variation of the novel summary statistics. We document that conventionally used summary statistics for dose-response experiments depend on time so that fast growing cell lines compared to slowly growing ones are considered overly sensitive. The adequacy of the mathematical model is tested for doxorubicin and found to fit real data to an acceptable degree. Dose-response data from the NCI60 drug screen were used to illustrate the time dependency and demonstrate an adjustment correcting for it. The applicability of the workflow was illustrated by simulation and application on a doxorubicin growth inhibition screen. The simulations show that under the proposed mathematical model the suggested statistical workflow results in unbiased estimates of the time independent summary statistics. Variance estimates of the novel summary statistics are used to conclude that the doxorubicin screen covers a significant diverse range of responses ensuring it is useful for biological interpretations. Time independent summary statistics may aid the understanding of drugs' action mechanism on tumour cells and potentially renew previous drug sensitivity evaluation studies.
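A minimal sketch of the workflow's isotonic-regression step (step 2), assuming hypothetical doses and viability fractions; isotonic regression enforces the monotone dose-response shape without committing to a parametric curve:

```python
import numpy as np
from sklearn.isotonic import IsotonicRegression

dose = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0])           # uM (hypothetical)
viability = np.array([0.98, 1.02, 0.91, 0.74, 0.55, 0.30, 0.12])  # noisy fractions

# Constrain the fitted curve to be non-increasing in dose.
iso = IsotonicRegression(increasing=False)
fitted = iso.fit_transform(np.log10(dose), viability)
print(np.round(fitted, 3))
```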
Passage relevance models for genomics search.
Urbain, Jay; Frieder, Ophir; Goharian, Nazli
2009-03-19
We present a passage relevance model for integrating syntactic and semantic evidence of biomedical concepts and topics using a probabilistic graphical model. Component models of topics, concepts, terms, and document are represented as potential functions within a Markov Random Field. The probability of a passage being relevant to a biologist's information need is represented as the joint distribution across all potential functions. Relevance model feedback of top ranked passages is used to improve distributional estimates of query concepts and topics in context, and a dimensional indexing strategy is used for efficient aggregation of concept and term statistics. By integrating multiple sources of evidence including dependencies between topics, concepts, and terms, we seek to improve genomics literature passage retrieval precision. Using this model, we are able to demonstrate statistically significant improvements in retrieval precision using a large genomics literature corpus.
Detecting changes in dynamic and complex acoustic environments
Boubenec, Yves; Lawlor, Jennifer; Górska, Urszula; Shamma, Shihab; Englitz, Bernhard
2017-01-01
Natural sounds, such as wind or rain, are characterized by the statistical occurrence of their constituents. Despite their complexity, listeners readily detect changes in these contexts. Here we address the neural basis of statistical decision-making using a combination of psychophysics, EEG and modelling. In a texture-based, change-detection paradigm, human performance and reaction times improved with longer pre-change exposure, consistent with improved estimation of baseline statistics. Change-locked and decision-related EEG responses were found in a centro-parietal scalp location, whose slope depended on change size, consistent with sensory evidence accumulation. The potential's amplitude scaled with the duration of pre-change exposure, suggesting a time-dependent decision threshold. Auditory cortex-related potentials showed no response to the change. A dual-timescale, statistical estimation model accounted for subjects' performance. Furthermore, a decision-augmented auditory cortex model accounted for performance and reaction times, suggesting that the primary cortical representation requires little post-processing to enable change-detection in complex acoustic environments. DOI: http://dx.doi.org/10.7554/eLife.24910.001 PMID:28262095
2013-02-01
of a bearing must be put into practice. There are many potential methods, the most traditional being the use of statistical time-domain features...accelerate degradation to test multiple bearings to gain statistical relevance and extrapolate results to scale for field conditions. Temperature...as time statistics, frequency estimation to improve the fault frequency detection. For future investigations, one can further explore the
Safaie, Ammar; Wendzel, Aaron; Ge, Zhongfu; Nevers, Meredith; Whitman, Richard L.; Corsi, Steven R.; Phanikumar, Mantha S.
2016-01-01
Statistical and mechanistic models are popular tools for predicting the levels of indicator bacteria at recreational beaches. Researchers tend to use one class of model or the other, and it is difficult to generalize statements about their relative performance due to differences in how the models are developed, tested, and used. We describe a cooperative modeling approach for freshwater beaches impacted by point sources in which insights derived from mechanistic modeling were used to further improve the statistical models and vice versa. The statistical models provided a basis for assessing the mechanistic models which were further improved using probability distributions to generate high-resolution time series data at the source, long-term “tracer” transport modeling based on observed electrical conductivity, better assimilation of meteorological data, and the use of unstructured-grids to better resolve nearshore features. This approach resulted in improved models of comparable performance for both classes including a parsimonious statistical model suitable for real-time predictions based on an easily measurable environmental variable (turbidity). The modeling approach outlined here can be used at other sites impacted by point sources and has the potential to improve water quality predictions resulting in more accurate estimates of beach closures.
Adaptive interference cancel filter for evoked potential using high-order cumulants.
Lin, Bor-Shyh; Lin, Bor-Shing; Chong, Fok-Ching; Lai, Feipei
2004-01-01
This paper presents evoked potential (EP) processing using an adaptive interference cancel (AIC) filter with second- and higher-order cumulants. In the conventional ensemble averaging method, experiments must be repeated many times to record the required data. Recently, the use of the AIC structure with second-order statistics for EP processing has proved more efficient than the traditional averaging method, but it is sensitive both to the reference signal statistics and to the choice of step size. Thus, we propose a higher-order-statistics-based AIC method to address these disadvantages. The method was tested on somatosensory EPs corrupted with EEG. A gradient-type algorithm is used in the AIC filter. Comparisons of AIC filters based on second-, third-, and fourth-order statistics are also presented in this paper. We observed that the AIC filter with third-order statistics has better convergence performance for EP processing and is not sensitive to the selection of step size and reference input.
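For orientation, here is a minimal second-order (LMS) adaptive interference canceller; the paper's cumulant-based variants replace this correlation-driven update with higher-order cumulant estimates. The function name and parameters are illustrative, not the authors' implementation:

```python
import numpy as np

def lms_canceller(primary, reference, n_taps=8, mu=0.01):
    """primary   : EP recording = evoked potential + EEG interference
    reference : signal correlated with the EEG interference only
    Returns the error signal, i.e., the running estimate of the EP."""
    w = np.zeros(n_taps)
    out = np.zeros(len(primary))
    for n in range(n_taps, len(primary)):
        x = reference[n - n_taps:n][::-1]  # reference tap vector
        y = w @ x                          # adaptive interference estimate
        e = primary[n] - y                 # error = cleaned EP sample
        w += 2 * mu * e * x                # stochastic-gradient update
        out[n] = e
    return out
```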
Grbovic, Vesna; Jurisic-Skevin, Aleksandra; Djukic, Svetlana; Stefanović, Srdjan; Nurkovic, Jasmin
2016-01-01
[Purpose] Painful diabetic polyneuropathy occurs as a complication in 16% of all patients with diabetes mellitus. [Subjects and Methods] A clinical, prospective, open-label randomized intervention study was conducted on 60 adult patients with type 2 diabetes mellitus and distal sensorimotor diabetic neuropathy, divided into two groups of 30 patients. Patients in group A were treated with combined physical procedures, and patients in group B were treated with alpha-lipoic acid. [Results] There were statistically significant improvements in terminal latency and the amplitude of the action potential in group A patients, while group B patients showed statistically significant improvements in conduction velocity and terminal latency of n. peroneus. Group A patients showed statistically significant improvements in conduction velocity and terminal latency, while group B patients also showed statistically significant improvements in conduction velocity and terminal latency. This was reflected in significant improvements in the electrophysiological parameters (conduction velocity, amplitude, and latency) of the motor and sensory nerves (n. peroneus, n. suralis). [Conclusion] These results present further evidence justifying the use of physical agents in the treatment of diabetic sensorimotor polyneuropathy. PMID:27065527
Statistical methods for nuclear material management
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bowen, W.M.; Bennett, C.A.
1988-12-01
This book is intended as a reference manual of statistical methodology for nuclear material management practitioners. It describes statistical methods currently or potentially important in nuclear material management, explains the choice of methods for specific applications, and provides examples of practical applications to nuclear material management problems. Together with the accompanying training manual, which contains fully worked out problems keyed to each chapter, this book can also be used as a textbook for courses in statistical methods for nuclear material management. It should provide increased understanding and guidance to help improve the application of statistical methods to nuclear material management problems.
Biostatistical analysis of quantitative immunofluorescence microscopy images.
Giles, C; Albrecht, M A; Lam, V; Takechi, R; Mamo, J C
2016-12-01
Semiquantitative immunofluorescence microscopy has become a key methodology in biomedical research. Typical statistical workflows are considered in the context of avoiding pseudo-replication and marginalising experimental error. However, immunofluorescence microscopy naturally generates hierarchically structured data that can be leveraged to improve statistical power and enrich biological interpretation. Herein, we describe a robust distribution fitting procedure and compare several statistical tests, outlining their potential advantages/disadvantages in the context of biological interpretation. Further, we describe tractable procedures for power analysis that incorporates the underlying distribution, sample size and number of images captured per sample. The procedures outlined have significant potential for increasing understanding of biological processes and decreasing both ethical and financial burden through experimental optimization. © 2016 The Authors Journal of Microscopy © 2016 Royal Microscopical Society.
Packet Randomized Experiments for Eliminating Classes of Confounders
Pavela, Greg; Wiener, Howard; Fontaine, Kevin R.; Fields, David A.; Voss, Jameson D.; Allison, David B.
2014-01-01
Background Although randomization is considered essential for causal inference, it is often not possible to randomize in nutrition and obesity research. To address this, we develop a framework for an experimental design—packet randomized experiments (PREs), which improves causal inferences when randomization on a single treatment variable is not possible. This situation arises when subjects are randomly assigned to a condition (such as a new roommate) which varies in one characteristic of interest (such as weight), but also varies across many others. There has been no general discussion of this experimental design, including its strengths, limitations, and statistical properties. As such, researchers are left to develop and apply PREs on an ad hoc basis, limiting its potential to improve causal inferences among nutrition and obesity researchers. Methods We introduce PREs as an intermediary design between randomized controlled trials and observational studies. We review previous research that used the PRE design and describe its application in obesity-related research, including random roommate assignments, heterochronic parabiosis, and the quasi-random assignment of subjects to geographic areas. We then provide a statistical framework to control for potential packet-level confounders not accounted for by randomization. Results PREs have successfully been used to improve causal estimates of the effect of roommates, altitude, and breastfeeding on weight outcomes. When certain assumptions are met, PREs can asymptotically control for packet-level characteristics. This has the potential to statistically estimate the effect of a single treatment even when randomization to a single treatment did not occur. Conclusions Applying PREs to obesity-related research will improve decisions about clinical, public health, and policy actions insofar as it offers researchers new insight into cause and effect relationships among variables. PMID:25444088
Park, Jungkap; Saitou, Kazuhiro
2014-09-18
Multibody potentials accounting for cooperative effects of molecular interactions have shown better accuracy than typical pairwise potentials. The main challenge in the development of such potentials is to find relevant structural features that characterize the tightly folded proteins. Also, the side-chains of residues adopt several specific, staggered conformations, known as rotamers within protein structures. Different molecular conformations result in different dipole moments and induce charge reorientations. However, until now modeling of the rotameric state of residues had not been incorporated into the development of multibody potentials for modeling non-bonded interactions in protein structures. In this study, we develop a new multibody statistical potential which can account for the influence of rotameric states on the specificity of atomic interactions. In this potential, named "rotamer-dependent atomic statistical potential" (ROTAS), the interaction between two atoms is specified by not only the distance and relative orientation but also by two state parameters concerning the rotameric state of the residues to which the interacting atoms belong. It was clearly found that the rotameric state is correlated to the specificity of atomic interactions. Such rotamer-dependencies are not limited to specific type or certain range of interactions. The performance of ROTAS was tested using 13 sets of decoys and was compared to those of existing atomic-level statistical potentials which incorporate orientation-dependent energy terms. The results show that ROTAS performs better than other competing potentials not only in native structure recognition, but also in best model selection and correlation coefficients between energy and model quality. A new multibody statistical potential, ROTAS accounting for the influence of rotameric states on the specificity of atomic interactions was developed and tested on decoy sets. The results show that ROTAS has improved ability to recognize native structure from decoy models compared to other potentials. The effectiveness of ROTAS may provide insightful information for the development of many applications which require accurate side-chain modeling such as protein design, mutation analysis, and docking simulation.
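ROTAS itself is built from the authors' count tables over distance, relative orientation, and rotamer state. As a hedged illustration of the inverse-Boltzmann construction that such knowledge-based potentials share, with rotamer dependence reduced to extra dimensions of the count tables:

```python
import numpy as np

def statistical_potential(obs_counts, ref_counts, kT=1.0, pseudo=1.0):
    """Energy per bin from observed vs. reference contact counts.
    In a rotamer-dependent potential, the count tables would carry extra
    indices for orientation bins and the rotameric states of the two
    residues; the formula itself is unchanged."""
    p_obs = (obs_counts + pseudo) / (obs_counts.sum() + pseudo * obs_counts.size)
    p_ref = (ref_counts + pseudo) / (ref_counts.sum() + pseudo * ref_counts.size)
    return -kT * np.log(p_obs / p_ref)

# Scoring a candidate model sums the bin energies over all its atom
# pairs; lower totals should favour native-like structures.
```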
Chung, Dongjun; Kim, Hang J; Zhao, Hongyu
2017-02-01
Genome-wide association studies (GWAS) have identified tens of thousands of genetic variants associated with hundreds of phenotypes and diseases, which have provided clinical and medical benefits to patients with novel biomarkers and therapeutic targets. However, identification of risk variants associated with complex diseases remains challenging as they are often affected by many genetic variants with small or moderate effects. There has been accumulating evidence suggesting that different complex traits share common risk basis, namely pleiotropy. Recently, several statistical methods have been developed to improve statistical power to identify risk variants for complex traits through a joint analysis of multiple GWAS datasets by leveraging pleiotropy. While these methods were shown to improve statistical power for association mapping compared to separate analyses, they are still limited in the number of phenotypes that can be integrated. In order to address this challenge, in this paper, we propose a novel statistical framework, graph-GPA, to integrate a large number of GWAS datasets for multiple phenotypes using a hidden Markov random field approach. Application of graph-GPA to a joint analysis of GWAS datasets for 12 phenotypes shows that graph-GPA improves statistical power to identify risk variants compared to statistical methods based on smaller number of GWAS datasets. In addition, graph-GPA also promotes better understanding of genetic mechanisms shared among phenotypes, which can potentially be useful for the development of improved diagnosis and therapeutics. The R implementation of graph-GPA is currently available at https://dongjunchung.github.io/GGPA/.
Use of iPhone technology in improving acetabular component position in total hip arthroplasty.
Tay, Xiau Wei; Zhang, Benny Xu; Gayagay, George
2017-09-01
Improper acetabular cup positioning is associated with a high risk of complications after total hip arthroplasty. The aim of our study is to objectively compare 3 methods, namely (1) free hand, (2) alignment jig (Sputnik), and (3) iPhone application, to identify an easy, reproducible, and accurate method for improving acetabular cup placement. We designed a simple setup and carried out a simple experiment (see Method section). Using statistical analysis, the difference in inclination angles using the iPhone application compared with the freehand method was found to be statistically significant (F(2,51) = 4.17, P = .02) in the "untrained" group. No statistical significance was detected for the other groups. This suggests a potential role for iPhone applications for junior surgeons in overcoming the steep learning curve.
Statistical iterative reconstruction to improve image quality for digital breast tomosynthesis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xu, Shiyu, E-mail: shiyu.xu@gmail.com; Chen, Ying, E-mail: adachen@siu.edu; Lu, Jianping
2015-09-15
Purpose: Digital breast tomosynthesis (DBT) is a novel modality with the potential to improve early detection of breast cancer by providing three-dimensional (3D) imaging with a low radiation dose. 3D image reconstruction presents some challenges: cone-beam and flat-panel geometry, and highly incomplete sampling. A promising means to overcome these challenges is statistical iterative reconstruction (IR), since it provides the flexibility of accurate physics modeling and a general description of system geometry. The authors' goal was to develop techniques for applying statistical IR to tomosynthesis imaging data. Methods: These techniques include the following: a physics model with a local voxel-pair based prior with flexible parameters to fine-tune image quality; a precomputed parameter λ in the prior, to remove data dependence and to achieve a uniform resolution property; an effective ray-driven technique to compute the forward and backprojection; and an oversampled, ray-driven method to perform high resolution reconstruction with a practical region-of-interest technique. To assess the performance of these techniques, the authors acquired phantom data on the stationary DBT prototype system. To solve the estimation problem, the authors proposed an optimization-transfer based algorithm framework that potentially allows fewer iterations to achieve an acceptably converged reconstruction. Results: IR improved the detectability of low-contrast and small microcalcifications, reduced cross-plane artifacts, improved spatial resolution, and lowered noise in reconstructed images. Conclusions: Although the computational load remains a significant challenge for practical development, the superior image quality provided by statistical IR, combined with advancing computational techniques, may bring benefits to screening, diagnostics, and intraoperative imaging in clinical applications.
Duncan, Fiona; Haigh, Carol
2013-10-01
To explore and improve the quality of continuous epidural analgesia for pain relief using Statistical Process Control tools. Measuring the quality of pain management interventions is complex. Intermittent audits do not accurately capture the results of quality improvement initiatives. The failure rate for one intervention, epidural analgesia, is approximately 30% in everyday practice, so it is an important area for improvement. Continuous measurement and analysis are required to understand the multiple factors involved in providing effective pain relief. The design was process control and quality improvement. Routine, prospectively acquired data collection started in 2006. Patients were asked about their pain and side effects of treatment. Statistical Process Control methods were applied for continuous data analysis. A multidisciplinary group worked together to identify reasons for variation in the data and instigated ideas for improvement. The key measure for improvement was a reduction in the percentage of patients with an epidural in severe pain. The baseline control charts illustrated the recorded variation in the rate of several processes and outcomes for 293 surgical patients. The mean visual analogue pain score (VNRS) was four. There was no special cause variation when data were stratified by surgeons, clinical area or patients who had experienced pain before surgery. Fifty-seven per cent of patients were hypotensive on the first day after surgery. We were able to demonstrate a significant improvement in the failure rate of epidurals as the project continued with quality improvement interventions. Statistical Process Control is a useful tool for measuring and improving the quality of pain management. The applications of Statistical Process Control methods offer the potential to learn more about the process of change and outcomes in an Acute Pain Service both locally and nationally. We have been able to develop measures for improvement and benchmarking in routine care that have led to the establishment of a national pain registry. © 2013 Blackwell Publishing Ltd.
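A sketch of the p-chart arithmetic behind this kind of monitoring, assuming hypothetical monthly audit counts (three-sigma limits on the proportion of epidural patients in severe pain):

```python
import numpy as np

def p_chart_limits(severe_pain_counts, audited_counts):
    counts = np.asarray(severe_pain_counts, dtype=float)
    n = np.asarray(audited_counts, dtype=float)
    p = counts / n                     # per-subgroup proportions
    p_bar = counts.sum() / n.sum()     # centre line
    sigma = np.sqrt(p_bar * (1 - p_bar) / n)
    ucl = np.clip(p_bar + 3 * sigma, 0, 1)
    lcl = np.clip(p_bar - 3 * sigma, 0, 1)
    return p, p_bar, lcl, ucl

# Points beyond the limits signal special-cause variation to investigate;
# a sustained run below the centre line after an intervention is
# evidence of genuine improvement rather than noise.
p, centre, lcl, ucl = p_chart_limits([6, 8, 4, 9, 3], [40, 45, 38, 50, 42])
```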
NASA Astrophysics Data System (ADS)
Shan, X.; Zhang, K.; Zhuang, Y.; Fu, R.; Hong, Y.
2017-12-01
Seasonal prediction of rainfall during the dry-to-wet transition season in austral spring (September-November) over southern Amazonia is central to improving crop planting and fire mitigation in that region. Previous studies have identified the key large-scale atmospheric dynamic and thermodynamic pre-conditions during the dry season (June-August) that influence the rainfall anomalies during the dry-to-wet transition season over Southern Amazonia. Based on these key pre-conditions during the dry season, we have evaluated several statistical models and developed a Neural Network based statistical prediction system to predict rainfall during the dry-to-wet transition for Southern Amazonia (5-15°S, 50-70°W). Multivariate Empirical Orthogonal Function (EOF) analysis is applied to the following four fields during JJA from the ECMWF Reanalysis (ERA-Interim) spanning 1979 to 2015: geopotential height at 200 hPa, surface relative humidity, convective inhibition (CIN) index and convective available potential energy (CAPE), to filter out noise and highlight the most coherent spatial and temporal variations. The first 10 EOF modes are retained as inputs to the statistical models, accounting for at least 70% of the total variance in the predictor fields. We have tested several linear and non-linear statistical methods. While regularized Ridge Regression and Lasso Regression can generally capture the spatial pattern and magnitude of rainfall anomalies, we found that the Neural Network performs best, with an accuracy greater than 80%, as expected from the non-linear dependence of the rainfall on the large-scale atmospheric thermodynamic conditions and circulation. Further tests with various prediction skill metrics and hindcasts also suggest that this Neural Network approach significantly improves seasonal prediction skill over dynamical predictions and regression-based statistical predictions. This statistical prediction system thus shows potential to improve real-time seasonal rainfall predictions in the future.
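A hedged sketch of such a pipeline: PCA on standardized predictor fields plays the role of the EOF analysis, feeding the leading modes to a small neural network. Array shapes and hyperparameters below are placeholders, not the study's configuration:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(37, 500))  # placeholder: years 1979-2015 x stacked JJA fields
y = rng.normal(size=37)         # placeholder: SON rainfall anomaly per year

model = make_pipeline(
    StandardScaler(),
    PCA(n_components=10),        # retain ~10 leading EOF-like modes
    MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0),
)
model.fit(X, y)                  # in practice, leave-one-year-out hindcasts
```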
A quality improvement management model for renal care.
Vlchek, D L; Day, L M
1991-04-01
The purpose of this article is to explore the potential for applying the theory and tools of quality improvement (total quality management) in the renal care setting. We believe that the coupling of the statistical techniques used in the Deming method of quality improvement, with modern approaches to outcome and process analysis, will provide the renal care community with powerful tools, not only for improved quality (i.e., reduced morbidity and mortality), but also for technology evaluation and resource allocation.
α -induced reactions on 115In: Cross section measurements and statistical model analysis
NASA Astrophysics Data System (ADS)
Kiss, G. G.; Szücs, T.; Mohr, P.; Török, Zs.; Huszánk, R.; Gyürky, Gy.; Fülöp, Zs.
2018-05-01
Background: α-nucleus optical potentials are basic ingredients of statistical model calculations used in nucleosynthesis simulations. While the nucleon+nucleus optical potential is fairly well known, several different parameter sets exist for the α+nucleus optical potential, and large deviations, sometimes reaching an order of magnitude, are found between the cross section predictions calculated using different parameter sets. Purpose: A measurement of the radiative α-capture and the α-induced reaction cross sections on the nucleus 115In at low energies allows a stringent test of statistical model predictions. Since experimental data are scarce in this mass region, this measurement can be an important input to test the global applicability of α+nucleus optical model potentials and further ingredients of the statistical model. Methods: The reaction cross sections were measured by means of the activation method. The produced activities were determined by off-line detection of the γ rays and characteristic x rays emitted during the electron capture decay of the produced Sb isotopes. The 115In(α,γ)119Sb and 115In(α,n)118mSb reaction cross sections were measured between Ec.m. = 8.83 and 15.58 MeV, and the 115In(α,n)118gSb reaction was studied between Ec.m. = 11.10 and 15.58 MeV. The theoretical analysis was performed within the statistical model. Results: The simultaneous measurement of the (α,γ) and (α,n) cross sections allowed us to determine a best-fit combination of all parameters for the statistical model. The α+nucleus optical potential is identified as the most important input for the statistical model. The best fit is obtained for the new Atomki-V1 potential, and good reproduction of the experimental data is also achieved for the first version of the Demetriou potentials and the simple McFadden-Satchler potential. The nucleon optical potential, the γ-ray strength function, and the level density parametrization are also constrained by the data, although there is no unique best-fit combination. Conclusions: The best-fit calculations allow us to extrapolate the low-energy (α,γ) cross section of 115In to the astrophysical Gamow window with reasonable uncertainties. However, further improvements of the α-nucleus potential are still required for a global description of elastic (α,α) scattering and α-induced reactions in a wide range of masses and energies.
Sinko, William; de Oliveira, César Augusto F; Pierce, Levi C T; McCammon, J Andrew
2012-01-10
Molecular dynamics (MD) is one of the most common tools in computational chemistry. Recently, our group has employed accelerated molecular dynamics (aMD) to improve the conformational sampling over conventional molecular dynamics techniques. In the original aMD implementation, sampling is greatly improved by raising energy wells below a predefined energy level. Recently, our group presented an alternative aMD implementation where simulations are accelerated by lowering energy barriers of the potential energy surface. When coupled with thermodynamic integration simulations, this implementation showed very promising results. However, when applied to large systems, such as proteins, the simulation tends to be biased to high energy regions of the potential landscape. The reason for this behavior lies in the boost equation used since the highest energy barriers are dramatically more affected than the lower ones. To address this issue, in this work, we present a new boost equation that prevents oversampling of unfavorable high energy conformational states. The new boost potential provides not only better recovery of statistics throughout the simulation but also enhanced sampling of statistically relevant regions in explicit solvent MD simulations.
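The paper's new boost equation is not reproduced here; for context, a sketch of the conventional aMD boost of Hamelberg et al. that it modifies, in which wells below a threshold E are raised:

```python
import numpy as np

def amd_boosted(V, E, alpha):
    """Conventional aMD boost: dV(r) = (E - V)**2 / (alpha + E - V)
    for V < E, else 0; wells are raised, barriers left untouched."""
    V = np.asarray(V, dtype=float)
    dV = np.zeros_like(V)
    low = V < E
    dV[low] = (E - V[low]) ** 2 / (alpha + E - V[low])
    return V + dV

# Observables are later reweighted by exp(+dV/kT); very large dV values
# inflate the reweighting noise, which is the statistical-recovery
# problem that motivates modifying the boost form.
```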
An Exploratory Study of OEE Implementation in Indian Manufacturing Companies
NASA Astrophysics Data System (ADS)
Kumar, J.; Soni, V. K.
2015-04-01
Globally, the implementation of Overall Equipment Effectiveness (OEE) has proven to be highly effective in improving availability, performance rate and quality rate while reducing unscheduled breakdowns and wastage that stem from the equipment. This paper investigates the present status and future scope of OEE metrics in Indian manufacturing companies through an extensive survey. In this survey, the opinions of production and maintenance managers have been analyzed statistically to explore the relationship between factors, perspectives of OEE, and potential uses of OEE metrics. Although the sample was diverse in terms of product, process type, size, and geographic location of the companies, all are under pressure to implement improvement techniques such as OEE metrics to improve performance. The findings reveal that OEE metrics have huge potential and scope to improve performance. Responses indicate that Indian companies are aware of OEE but are not utilizing the full potential of OEE metrics.
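OEE is the product of the availability, performance and quality rates; a small worked sketch with hypothetical shift figures:

```python
def oee(planned_time, downtime, ideal_cycle_time, total_count, good_count):
    """OEE = Availability x Performance x Quality."""
    run_time = planned_time - downtime
    availability = run_time / planned_time                    # uptime fraction
    performance = ideal_cycle_time * total_count / run_time   # speed fraction
    quality = good_count / total_count                        # first-pass yield
    return availability * performance * quality

# Hypothetical shift: 480 min planned, 47 min down, 1.0 min ideal cycle,
# 400 parts made, 380 good.
print(f"OEE = {oee(480, 47, 1.0, 400, 380):.1%}")  # about 79%
```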
Ezard, Thomas H.G.; Jørgensen, Peter S.; Zimmerman, Naupaka; Chamberlain, Scott; Salguero-Gómez, Roberto; Curran, Timothy J.; Poisot, Timothée
2014-01-01
Proficiency in mathematics and statistics is essential to modern ecological science, yet few studies have assessed the level of quantitative training received by ecologists. To do so, we conducted an online survey. The 937 respondents were mostly early-career scientists who studied biology as undergraduates. We found a clear self-perceived lack of quantitative training: 75% were not satisfied with their understanding of mathematical models; 75% felt that the level of mathematics was “too low” in their ecology classes; 90% wanted more mathematics classes for ecologists; and 95% more statistics classes. Respondents thought that 30% of classes in ecology-related degrees should be focused on quantitative disciplines, which is likely higher than for most existing programs. The main suggestion to improve quantitative training was to relate theoretical and statistical modeling to applied ecological problems. Improving quantitative training will require dedicated, quantitative classes for ecology-related degrees that contain good mathematical and statistical practice. PMID:24688862
Analysis of Statistical Methods and Errors in the Articles Published in the Korean Journal of Pain
Yim, Kyoung Hoon; Han, Kyoung Ah; Park, Soo Young
2010-01-01
Background Statistical analysis is essential in regard to obtaining objective reliability for medical research. However, medical researchers do not have enough statistical knowledge to properly analyze their study data. To help understand and potentially alleviate this problem, we have analyzed the statistical methods and errors of articles published in the Korean Journal of Pain (KJP), with the intention to improve the statistical quality of the journal. Methods All the articles, except case reports and editorials, published from 2004 to 2008 in the KJP were reviewed. The types of applied statistical methods and errors in the articles were evaluated. Results One hundred and thirty-nine original articles were reviewed. Inferential statistics and descriptive statistics were used in 119 papers and 20 papers, respectively. Only 20.9% of the papers were free from statistical errors. The most commonly adopted statistical method was the t-test (21.0%) followed by the chi-square test (15.9%). Errors of omission were encountered 101 times in 70 papers. Among the errors of omission, "no statistics used even though statistical methods were required" was the most common (40.6%). The errors of commission were encountered 165 times in 86 papers, among which "parametric inference for nonparametric data" was the most common (33.9%). Conclusions We found various types of statistical errors in the articles published in the KJP. This suggests that meticulous attention should be given not only in applying statistical procedures but also in the reviewing process to improve the value of the article. PMID:20552071
Weiss, Jeremy C; Page, David; Peissig, Peggy L; Natarajan, Sriraam; McCarty, Catherine
2013-01-01
Electronic health records (EHRs) are an emerging relational domain with large potential to improve clinical outcomes. We apply two statistical relational learning (SRL) algorithms to the task of predicting primary myocardial infarction. We show that one SRL algorithm, relational functional gradient boosting, outperforms propositional learners particularly in the medically-relevant high recall region. We observe that both SRL algorithms predict outcomes better than their propositional analogs and suggest how our methods can augment current epidemiological practices. PMID:25360347
Hunt, R.J.; Anderson, M.P.; Kelson, V.A.
1998-01-01
This paper demonstrates that analytic element models have potential as powerful screening tools that can facilitate or improve calibration of more complicated finite-difference and finite-element models. We demonstrate how a two-dimensional analytic element model was used to identify errors in a complex three-dimensional finite-difference model caused by incorrect specification of boundary conditions. An improved finite-difference model was developed using boundary conditions developed from a far-field analytic element model. Calibration of a revised finite-difference model was achieved using fewer zones of hydraulic conductivity and lake bed conductance than the original finite-difference model. Calibration statistics were also improved in that simulated base-flows were much closer to measured values. The improved calibration is due mainly to improved specification of the boundary conditions made possible by first solving the far-field problem with an analytic element model.
Quantitative evaluation of pairs and RS steganalysis
NASA Astrophysics Data System (ADS)
Ker, Andrew D.
2004-06-01
We give initial results from a new project which performs statistically accurate evaluation of the reliability of image steganalysis algorithms. The focus here is on the Pairs and RS methods, for detection of simple LSB steganography in grayscale bitmaps, due to Fridrich et al. Using libraries totalling around 30,000 images we have measured the performance of these methods and suggest changes which lead to significant improvements. Particular results from the project presented here include notes on the distribution of the RS statistic, the relative merits of different "masks" used in the RS algorithm, the effect on reliability when previously compressed cover images are used, and the effect of repeating steganalysis on the transposed image. We also discuss improvements to the Pairs algorithm, restricting it to spatially close pairs of pixels, which leads to a substantial performance improvement, even to the extent of surpassing the RS statistic which was previously thought superior for grayscale images. We also describe some of the questions for a general methodology of evaluation of steganalysis, and potential pitfalls caused by the differences between uncompressed, compressed, and resampled cover images.
A review of failure models for unidirectional ceramic matrix composites under monotonic loads
NASA Technical Reports Server (NTRS)
Tripp, David E.; Hemann, John H.; Gyekenyesi, John P.
1989-01-01
Ceramic matrix composites offer significant potential for improving the performance of turbine engines. In order to achieve their potential, however, improvements in design methodology are needed. In the past most components using structural ceramic matrix composites were designed by trial and error since the emphasis of feasibility demonstration minimized the development of mathematical models. To understand the key parameters controlling response and the mechanics of failure, the development of structural failure models is required. A review of short term failure models with potential for ceramic matrix composite laminates under monotonic loads is presented. Phenomenological, semi-empirical, shear-lag, fracture mechanics, damage mechanics, and statistical models for the fast fracture analysis of continuous fiber unidirectional ceramic matrix composites under monotonic loads are surveyed.
Differences in Performance Among Test Statistics for Assessing Phylogenomic Model Adequacy.
Duchêne, David A; Duchêne, Sebastian; Ho, Simon Y W
2018-05-18
Statistical phylogenetic analyses of genomic data depend on models of nucleotide or amino acid substitution. The adequacy of these substitution models can be assessed using a number of test statistics, allowing the model to be rejected when it is found to provide a poor description of the evolutionary process. A potentially valuable use of model-adequacy test statistics is to identify when data sets are likely to produce unreliable phylogenetic estimates, but their differences in performance are rarely explored. We performed a comprehensive simulation study to identify test statistics that are sensitive to some of the most commonly cited sources of phylogenetic estimation error. Our results show that, for many test statistics, traditional thresholds for assessing model adequacy can fail to reject the model when the phylogenetic inferences are inaccurate and imprecise. This is particularly problematic when analysing loci that have few variable informative sites. We propose new thresholds for assessing substitution model adequacy and demonstrate their effectiveness in analyses of three phylogenomic data sets. These thresholds lead to frequent rejection of the model for loci that yield topological inferences that are imprecise and are likely to be inaccurate. We also propose the use of a summary statistic that provides a practical assessment of overall model adequacy. Our approach offers a promising means of enhancing model choice in genome-scale data sets, potentially leading to improvements in the reliability of phylogenomic inference.
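The authors' specific test statistics and proposed thresholds are not reproduced here; as a hedged sketch, model-adequacy assessment generally follows a parametric-bootstrap recipe of this shape:

```python
import numpy as np

def adequacy_p_value(observed_stat, simulate_stat, n_sims=1000, seed=None):
    """Locate the observed test statistic in a null distribution built by
    simulating data under the fitted substitution model.
    simulate_stat : callable(rng) -> statistic from one simulated dataset
    """
    rng = np.random.default_rng(seed)
    null = np.array([simulate_stat(rng) for _ in range(n_sims)])
    upper = np.mean(null >= observed_stat)
    lower = np.mean(null <= observed_stat)
    return 2 * min(upper, lower)  # two-sided tail area; small => reject

# A small value indicates the fitted model fails to reproduce the
# observed statistic, flagging the locus for cautious interpretation.
```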
Experimental Design in Clinical 'Omics Biomarker Discovery.
Forshed, Jenny
2017-11-03
This tutorial highlights some issues in the experimental design of clinical 'omics biomarker discovery, how to avoid bias and get as true quantities as possible from biochemical analyses, and how to select samples to improve the chance of answering the clinical question at issue. This includes the importance of defining clinical aim and end point, knowing the variability in the results, randomization of samples, sample size, statistical power, and how to avoid confounding factors by including clinical data in the sample selection, that is, how to avoid unpleasant surprises at the point of statistical analysis. The aim of this Tutorial is to help translational clinical and preclinical biomarker candidate research and to improve the validity and potential of future biomarker candidate findings.
Fairchild, Amanda J.; Abara, Winston E.; Gottschall, Amanda C.; Tein, Jenn-Yun; Prinz, Ronald J.
2015-01-01
The purpose of this article is to introduce and describe a statistical model that researchers can use to evaluate underlying mechanisms of behavioral onset and other event occurrence outcomes. Specifically, the article develops a framework for estimating mediation effects with outcomes measured in discrete-time epochs by integrating the statistical mediation model with discrete-time survival analysis. The methodology has the potential to help strengthen health research by targeting prevention and intervention work more effectively as well as by improving our understanding of discretized periods of risk. The model is applied to an existing longitudinal data set to demonstrate its use, and programming code is provided to facilitate its implementation. PMID:24296470
SSGP: SNP-set based genomic prediction to incorporate biological information
USDA-ARS's Scientific Manuscript database
Genomic prediction has emerged as an effective approach in plant and animal breeding and in precision medicine. Much research has been devoted to an improved accuracy in genomic prediction, and one of the potential ways is to incorporate biological information. Due to the statistical and computation...
Cohn, T.A.; England, J.F.; Berenbrock, C.E.; Mason, R.R.; Stedinger, J.R.; Lamontagne, J.R.
2013-01-01
The Grubbs-Beck test is recommended by the federal guidelines for detection of low outliers in flood flow frequency computation in the United States. This paper presents a generalization of the Grubbs-Beck test for normal data (similar to the Rosner (1983) test; see also Spencer and McCuen (1996)) that can provide a consistent standard for identifying multiple potentially influential low flows. In cases where low outliers have been identified, they can be represented as "less-than" values, and a frequency distribution can be developed using censored-data statistical techniques, such as the Expected Moments Algorithm. This approach can improve the fit of the right-hand tail of a frequency distribution and provide protection from lack-of-fit due to unimportant but potentially influential low flows (PILFs) in a flood series, thus making the flood frequency analysis procedure more robust.
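The multiple-PILF generalization is defined in the paper; below is a hedged sketch of the classic single-threshold Grubbs-Beck screen it extends. The polynomial approximation for the 10% critical value K_N is an assumption taken from Bulletin 17B practice:

```python
import numpy as np

def grubbs_beck_low_threshold(flows):
    """Classic one-sided Grubbs-Beck low-outlier threshold on log10 flows.
    K_N approximation assumed (per Bulletin 17B); the paper's version
    instead re-tests recursively to flag multiple low outliers."""
    logq = np.log10(np.asarray(flows, dtype=float))
    n = len(logq)
    k_n = -0.9043 + 3.345 * np.sqrt(np.log10(n)) - 0.4046 * np.log10(n)
    return 10 ** (logq.mean() - k_n * logq.std(ddof=1))

flows = [120, 340, 560, 2100, 3300, 4800, 5200, 8900, 12000, 15000]
print(f"low-outlier threshold: {grubbs_beck_low_threshold(flows):.0f}")
# Flows below the threshold would be recoded as "less-than" values for
# censored-data fitting such as the Expected Moments Algorithm.
```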
Note: Modification of the Gay-Berne potential for improved accuracy and speed
NASA Astrophysics Data System (ADS)
Persson, Rasmus A. X.
2012-06-01
A modification of the Gay-Berne (GB) potential is proposed which is about 10% to 20% more computationally efficient and statistically more accurate in reproducing the energy of interaction of two linear Lennard-Jones tetratomics averaged over all orientations. For the special cases of end-to-end and side-by-side configurations, the new potential is equivalent to the GB one. A simple generalization to dissimilar particles of D∞h symmetry is presented but does not retain the superior agreement over its GB counterpart, except at close range.
Seasonal Drought Prediction: Advances, Challenges, and Future Prospects
NASA Astrophysics Data System (ADS)
Hao, Zengchao; Singh, Vijay P.; Xia, Youlong
2018-03-01
Drought prediction is of critical importance to early warning for drought management. This review provides a synthesis of drought prediction based on statistical, dynamical, and hybrid methods. Statistical drought prediction is achieved by modeling the relationship between drought indices of interest and a suite of potential predictors, including large-scale climate indices, local climate variables, and land initial conditions. Dynamical meteorological drought prediction relies on seasonal climate forecasts from general circulation models (GCMs), which can be employed to drive hydrological models for agricultural and hydrological drought prediction, with the predictability determined by both climate forcings and initial conditions. Challenges still exist in drought prediction at long lead times and under a changing environment resulting from natural and anthropogenic factors. Future research prospects to improve drought prediction include, but are not limited to, high-quality data assimilation, improved model development with key processes related to drought occurrence, optimal ensemble forecasts to select or weight ensembles, and hybrid drought prediction to merge statistical and dynamical forecasts.
NASA Technical Reports Server (NTRS)
Safford, Robert R.; Jackson, Andrew E.; Swart, William W.; Barth, Timothy S.
1994-01-01
Successful ground processing at KSC requires that flight hardware and ground support equipment conform to specifications at tens of thousands of checkpoints. Knowledge of conformance is an essential requirement for launch. That knowledge of conformance at every requisite point does not, however, enable identification of past problems with equipment, or potential problem areas. This paper describes how the introduction of Statistical Process Control and Process Capability Analysis identification procedures into existing shuttle processing procedures can enable identification of potential problem areas and candidates for improvements to increase processing performance measures. Results of a case study describing application of the analysis procedures to Thermal Protection System processing are used to illustrate the benefits of the approaches described in the paper.
Modelling for Prediction vs. Modelling for Understanding: Commentary on Musso et al. (2013)
ERIC Educational Resources Information Center
Edelsbrunner, Peter; Schneider, Michael
2013-01-01
Musso et al. (2013) predict students' academic achievement with high accuracy one year in advance from cognitive and demographic variables, using artificial neural networks (ANNs). They conclude that ANNs have high potential for theoretical and practical improvements in learning sciences. ANNs are powerful statistical modelling tools but they can…
Bio-Security Proficiencies Project for Beginning Producers in 4-H
ERIC Educational Resources Information Center
Smith, Martin H.; Meehan, Cheryl L.; Borba, John A.
2014-01-01
Improving bio-security practices among 4-H members who raise and show project animals is important. Bio-security measures can reduce the risk of disease spread and mitigate potential health and economic risks of disease outbreaks involving animal and zoonotic pathogens. Survey data provided statistical evidence that the Bio-Security Proficiencies…
Confounding in statistical mediation analysis: What it is and how to address it.
Valente, Matthew J; Pelham, William E; Smyth, Heather; MacKinnon, David P
2017-11-01
Psychology researchers are often interested in mechanisms underlying how randomized interventions affect outcomes such as substance use and mental health. Mediation analysis is a common statistical method for investigating psychological mechanisms that has benefited from exciting new methodological improvements over the last 2 decades. One of the most important new developments is methodology for estimating causal mediated effects using the potential outcomes framework for causal inference. Potential outcomes-based methods developed in epidemiology and statistics have important implications for understanding psychological mechanisms. We aim to provide a concise introduction to and illustration of these new methods and emphasize the importance of confounder adjustment. First, we review the traditional regression approach for estimating mediated effects. Second, we describe the potential outcomes framework. Third, we define what a confounder is and how the presence of a confounder can provide misleading evidence regarding mechanisms of interventions. Fourth, we describe experimental designs that can help rule out confounder bias. Fifth, we describe new statistical approaches to adjust for measured confounders of the mediator-outcome relation and sensitivity analyses to probe effects of unmeasured confounders on the mediated effect. All approaches are illustrated with application to a real counseling intervention dataset. Counseling psychologists interested in understanding the causal mechanisms of their interventions can benefit from incorporating the most up-to-date techniques into their mediation analyses. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
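To ground the terminology, a hedged sketch of the traditional regression (product-of-coefficients) approach on simulated data; the potential-outcomes methods reviewed here add confounder adjustment and sensitivity analysis on top of this baseline:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 500
X = rng.integers(0, 2, n).astype(float)      # randomized intervention
M = 0.5 * X + rng.normal(size=n)             # mediator (a path)
Y = 0.4 * M + 0.2 * X + rng.normal(size=n)   # outcome (b and c' paths)

a = sm.OLS(M, sm.add_constant(X)).fit().params[1]
fit_y = sm.OLS(Y, sm.add_constant(np.column_stack([X, M]))).fit()
c_prime, b = fit_y.params[1], fit_y.params[2]

# a*b estimates the mediated effect, but it equals the causal mediated
# effect only under no unmeasured mediator-outcome confounding; that is
# the assumption the potential-outcomes framework makes explicit.
print(f"mediated effect a*b = {a * b:.3f}, direct effect c' = {c_prime:.3f}")
```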
Austin, Peter C; Fine, Jason P
2017-04-15
In studies with survival or time-to-event outcomes, a competing risk is an event whose occurrence precludes the occurrence of the primary event of interest. Specialized statistical methods must be used to analyze survival data in the presence of competing risks. We conducted a review of randomized controlled trials with survival outcomes that were published in high-impact general medical journals. Of 40 studies that we identified, 31 (77.5%) were potentially susceptible to competing risks. However, in the majority of these studies, the potential presence of competing risks was not accounted for in the statistical analyses that were described. Of the 31 studies potentially susceptible to competing risks, 24 (77.4%) reported the results of a Kaplan-Meier survival analysis, while only five (16.1%) reported using cumulative incidence functions to estimate the incidence of the outcome over time in the presence of competing risks. The former approach will tend to result in an overestimate of the incidence of the outcome over time, while the latter approach will result in unbiased estimation of the incidence of the primary outcome over time. We provide recommendations on the analysis and reporting of randomized controlled trials with survival outcomes in the presence of competing risks. © 2017 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd.
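The cumulative incidence function recommended above can be computed nonparametrically; a minimal sketch on hypothetical data follows. Using 1 minus a Kaplan-Meier estimate that censors competing events would overstate these incidences, which is exactly the bias the review describes.

```python
import numpy as np

def cumulative_incidence(time, event, cause=1):
    """Nonparametric CIF under competing risks.
    event: 0 = censored; 1, 2, ... = competing event types."""
    at_risk = len(time)
    surv = 1.0            # all-cause Kaplan-Meier estimate just before t
    cif = 0.0
    out = []
    for t in np.unique(time):
        mask = time == t
        d_all = np.sum(event[mask] > 0)            # events of any type at t
        d_cause = np.sum(event[mask] == cause)     # events of the cause of interest
        cif += surv * d_cause / at_risk            # increment: S(t-) times cause-specific hazard
        surv *= 1 - d_all / at_risk
        at_risk -= mask.sum()
        out.append((float(t), cif))
    return out

time = np.array([2., 3., 3., 5., 8., 9., 12.])
event = np.array([1, 2, 0, 1, 2, 1, 0])            # hypothetical data with two event types
print(cumulative_incidence(time, event, cause=1))
```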
Statistical Methods Applied to Gamma-ray Spectroscopy Algorithms in Nuclear Security Missions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fagan, Deborah K.; Robinson, Sean M.; Runkle, Robert C.
2012-10-01
In a wide range of nuclear security missions, gamma-ray spectroscopy is a critical research and development priority. One particularly relevant challenge is the interdiction of special nuclear material, for which gamma-ray spectroscopy supports the goals of detecting and identifying gamma-ray sources. This manuscript examines the existing set of spectroscopy methods, attempts to categorize them by the statistical methods on which they rely, and identifies methods that have yet to be considered. Our examination shows that current methods effectively estimate the effect of counting uncertainty but in many cases do not address larger, significantly more complex sources of decision uncertainty. We thus explore the premise that significantly improving algorithm performance requires greater coupling between the problem physics that drives data acquisition and the statistical methods that analyze such data. Untapped statistical methods, such as Bayesian model averaging and hierarchical and empirical Bayes methods, have the potential to reduce decision uncertainty by more rigorously and comprehensively incorporating all sources of uncertainty. We expect that application of such methods will demonstrate progress in meeting the needs of nuclear security missions by improving on the existing numerical infrastructure, for which these analyses have not previously been conducted.
Perinatal legislative policies and health outcomes.
Lorch, Scott A
2017-10-01
Perinatal epidemiology examines the variation and determinants of pregnancy outcomes from a maternal and neonatal perspective. However, improving public and population health also requires the translation of this evidence base into substantive public policies. Assessing the impact of such public policies requires sufficient data to include potential confounding factors in the analysis, such as coexisting medical conditions and socioeconomic status, and appropriate statistical and epidemiological techniques. This review will explore policies addressing three areas of perinatal medicine (elective deliveries prior to 39 weeks' gestation, perinatal regionalization, and mandatory paid maternity leave policies) to illustrate the challenges when assessing the impact of specific policies at the patient and population level. Data support the use of these policies to improve perinatal health, but with weaker and less certain effect sizes when compared to the initial patient-level studies. Improved data collection and epidemiological techniques will allow for improved assessment of these policies and the identification of potential areas of improvement when translating patient-level studies into public policies. Copyright © 2017 Elsevier Inc. All rights reserved.
Walker, Martin; Basáñez, María-Gloria; Ouédraogo, André Lin; Hermsen, Cornelus; Bousema, Teun; Churcher, Thomas S
2015-01-16
Quantitative molecular methods (QMMs) such as quantitative real-time polymerase chain reaction (q-PCR), reverse-transcriptase PCR (qRT-PCR) and quantitative nucleic acid sequence-based amplification (QT-NASBA) are increasingly used to estimate pathogen density in a variety of clinical and epidemiological contexts. These methods are often classified as semi-quantitative, yet estimates of reliability or sensitivity are seldom reported. Here, a statistical framework is developed for assessing the reliability (uncertainty) of pathogen densities estimated using QMMs and the associated diagnostic sensitivity. The method is illustrated with quantification of Plasmodium falciparum gametocytaemia by QT-NASBA. The reliability of pathogen (e.g. gametocyte) densities, and the accompanying diagnostic sensitivity, estimated by two contrasting statistical calibration techniques, a traditional method and a Bayesian mixed model approach, are compared. The latter accounts for statistical dependence of QMM assays run under identical laboratory protocols and permits structural modelling of experimental measurements, allowing precision to vary with pathogen density. Traditional calibration cannot account for inter-assay variability arising from imperfect QMMs and generates estimates of pathogen density that have poor reliability, are variable among assays and inaccurately reflect diagnostic sensitivity. The Bayesian mixed model approach assimilates information from replica QMM assays, improving reliability and inter-assay homogeneity, and providing an accurate appraisal of quantitative and diagnostic performance. Bayesian mixed model statistical calibration supersedes traditional techniques in the context of QMM-derived estimates of pathogen density, offering the potential to substantially improve the depth and quality of clinical and epidemiological inference for a wide variety of pathogens.
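The hierarchical Bayesian calibration developed above is too involved for a short excerpt, but a simpler frequentist stand-in, a calibration line with a random intercept per assay, conveys the core idea of pooling information across replica assays. The file name and column names are hypothetical.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Assumed columns: signal (QMM readout), log_density (known standard), assay (run id)
df = pd.read_csv("calibration_standards.csv")

# A random intercept per assay absorbs inter-assay variability in the calibration curve
fit = smf.mixedlm("signal ~ log_density", df, groups=df["assay"]).fit()
print(fit.summary())

# Invert the fitted line to estimate pathogen density from a new sample's signal
intercept, slope = fit.params["Intercept"], fit.params["log_density"]
new_signal = 12.3   # hypothetical measurement
print("estimated log density:", (new_signal - intercept) / slope)
```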
Gallagher, Robyn; Roach, Kellie; Belshaw, Julie; Kirkness, Ann; Sadler, Leonie; Warrington, Darrell
2013-05-01
Patient delay in recognizing and responding to potential acute myocardial infarction (AMI) symptoms is an international issue. Cardiac rehabilitation provides an ideal opportunity to deliver an intervention. This study examines an individual educational intervention on knowledge of heart attack warning signs and specific chest pain action plans for people with coronary heart disease. Cardiac rehabilitation participants at five hospitals were assessed at program entry, and tailored education was provided using the Heart Foundation of Australia's Heart Attack Warning Signs campaign educational tool. Participants (n=137) were reassessed at program conclusion (six to eight weeks). Study participants had a mean age of 64.48 years (SD 12.22), were predominantly male (78%), and most commonly presented with a current referral diagnosis of a percutaneous coronary intervention (PCI) (80%) and/or AMI (60%). There were statistically significant improvements in the reporting of 11 of the 14 warning signs of heart attack, with patients reporting 2.56 more warning signs on average at outcome (p<.0001). Patients reported more heart attack warning signs if they had completed high school education (β=1.14) or had better knowledge before the intervention (β=.57). There were statistically significant improvements in reporting of all appropriate actions in response to potential AMI symptoms, with patients reporting an average of 1.3 more actions at outcome (p<.001), with no change in the median time they would tolerate symptoms (p=.16). A brief education session using a single standardised tool, adapted to an individual patient assessment, is effective in improving knowledge of potential AMI symptoms and appropriate responses among cardiac rehabilitation participants for up to two months afterwards. Copyright © 2012 Australian College of Critical Care Nurses Ltd. Published by Elsevier Ltd. All rights reserved.
Verhulst, Brad
2016-01-01
P values have become the scapegoat for a wide variety of problems in science. P values are generally over-emphasized, often incorrectly applied, and in some cases even abused. However, alternative methods of hypothesis testing will likely fall victim to the same criticisms currently leveled at P values if more fundamental changes are not made in the research process. Increasing the general level of statistical literacy and enhancing training in statistical methods provide a potential avenue for identifying, correcting, and preventing erroneous conclusions from entering the academic literature and for improving the general quality of patient care. PMID:28366961
Mercury concentrations in lentic fish populations related to ecosystem and watershed characteristics
Andrew L. Rypel
2010-01-01
Predicting mercury (Hg) concentrations of fishes at large spatial scales is a fundamental environmental challenge with the potential to improve human health. In this study, mercury concentrations were examined for five species across 161 lakes, and ecosystem and watershed parameters were investigated as explanatory variables in statistical models. For all species, Hg...
Improving the College Scorecard: Using Student Feedback to Create an Effective Disclosure
ERIC Educational Resources Information Center
Morgan, Julie Margetta; Dechter, Gadi
2012-01-01
The White House will soon unveil a final version of its "college scorecard"--an online tool giving college-bound students and their families a hype-free snapshot of reliable information about any U.S. campus: real costs, graduation rates, student debt statistics, and earning potential of graduates. The college scorecard is a good idea…
Preparing Teachers for a Mobile World, to Improve Access to Education
ERIC Educational Resources Information Center
Ally, Mohamed; Grimus, Margarete; Ebner, Martin
2014-01-01
Recent statistics on the use of mobile technology proclaim that the world is becoming mobile. People use their phones to socialize, to conduct business, to search for information, and more. For the first time in history, people around the world have the potential to learn from any location at their own convenience. But first, education systems…
NASA Astrophysics Data System (ADS)
Zhang, Yu; Li, Fei; Zhang, Shengkai; Zhu, Tingting
2017-04-01
Synthetic Aperture Radar (SAR) is important for polar remote sensing since it can provide continuous observations day and night in all weather. SAR can be used to extract surface roughness information characterized by the variance of dielectric properties and different polarization channels, which makes it possible to observe different ice types and surface structure for deformation analysis. In November 2016, the 33rd Chinese National Antarctic Research Expedition (CHINARE) cruise set sail into the Antarctic sea ice zone. An accurate map of the spatial distribution of leads in the sea ice zone is essential for routine planning of ship navigation. In this study, the semantic relationship between leads and sea ice categories is described by a Conditional Random Field (CRF) model, and lead characteristics are modeled by statistical distributions in SAR imagery. In the proposed algorithm, a mixture-statistical-distribution-based CRF is developed, considering both contextual information and the statistical characteristics of sea ice, to improve lead detection in Sentinel-1A dual-polarization SAR imagery. The unary and pairwise potentials in the CRF model are constructed by integrating the posterior probabilities estimated from the statistical distributions. The Method of Logarithmic Cumulants (MoLC) is used to estimate the parameters of each single statistical distribution, and an iterative Expectation Maximization (EM) algorithm is used to calculate the parameters of the mixture-distribution-based CRF model. For posterior probability inference, a graph-cut energy minimization method is adopted for the initial lead detection. Post-processing procedures, including an aspect ratio constraint and spatial smoothing, are applied to improve the visual result. The proposed method is validated on Sentinel-1A SAR C-band Extra Wide Swath (EW) Ground Range Detected (GRD) imagery with a pixel spacing of 40 meters near the Prydz Bay area, East Antarctica. The main work is as follows: 1) A mixture-statistical-distribution-based CRF algorithm has been developed for lead detection from Sentinel-1A dual-polarization images. 2) An assessment of the proposed mixture-distribution-based CRF method against a single-distribution-based CRF algorithm is presented. 3) Preferable parameter sets, including the statistical distributions, the aspect ratio threshold, and the spatial smoothing window size, are provided. In the future, the proposed algorithm will be developed for operational processing of the Sentinel series data sets owing to its low computational cost and high accuracy in lead detection.
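The parameter-estimation step can be illustrated with a simplified stand-in: EM for a two-component Gaussian mixture on, say, log-intensities. The study itself uses SAR-specific distributions initialized via MoLC, so this sketch only conveys the iterative E-step/M-step structure.

```python
import numpy as np

def em_two_component(x, iters=100):
    """EM for a 2-component Gaussian mixture (a stand-in for the SAR intensity
    mixtures in the paper, which use SAR-specific distributions and MoLC)."""
    mu = np.percentile(x, [25, 75]).astype(float)
    var = np.full(2, x.var())
    w = np.array([0.5, 0.5])
    for _ in range(iters):
        # E-step: responsibility of each component for each pixel
        pdf = np.exp(-(x[:, None] - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)
        r = w * pdf
        r /= r.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and variances
        nk = r.sum(axis=0)
        w, mu = nk / len(x), (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    # Per-pixel posteriors from these parameters would feed the CRF unary potentials
    return w, mu, var

rng = np.random.default_rng(5)
x = np.concatenate([rng.normal(-1, 0.5, 800), rng.normal(2, 0.7, 200)])
print(em_two_component(x))
```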
Space, time, and the third dimension (model error)
Moss, Marshall E.
1979-01-01
The space-time tradeoff of hydrologic data collection (the ability to substitute spatial coverage for temporal extension of records or vice versa) is controlled jointly by the statistical properties of the phenomena that are being measured and by the model that is used to meld the information sources. The control exerted on the space-time tradeoff by the model and its accompanying errors has seldom been studied explicitly. The technique, known as Network Analyses for Regional Information (NARI), permits such a study of the regional regression model that is used to relate streamflow parameters to the physical and climatic characteristics of the drainage basin. The NARI technique shows that model improvement is a viable and sometimes necessary means of improving regional data collection systems. Model improvement provides an immediate increase in the accuracy of regional parameter estimation and also increases the information potential of future data collection. Model improvement, which can only be measured in a statistical sense, cannot be quantitatively estimated prior to its achievement; thus an attempt to upgrade a particular model entails a certain degree of risk on the part of the hydrologist.
Ledbetter, Alexander K; Sohlberg, McKay Moore; Fickas, Stephen F; Horney, Mark A; McIntosh, Kent
2017-11-06
This study evaluated a computer-based prompting intervention for improving expository essay writing after acquired brain injury (ABI). Four undergraduate participants aged 18-21 with mild-moderate ABI and impaired fluid cognition, at least 6 months post-injury, reported difficulty with the writing process after injury. The study employed a non-concurrent multiple-probe-across-participants single-case design. Outcome measures included essay quality scores and the number of revisions to writing, counted then coded by type using a revision taxonomy. An inter-scorer agreement procedure was completed for quality scores for 50% of essays, with data indicating that agreement exceeded a goal of 85%. Visual analysis of results showed increased essay quality for all participants in the intervention phase compared with baseline, maintained 1 week later. Statistical analyses showed statistically significant results for two of the four participants. The authors discuss external cuing for self-monitoring and tapping of existing writing knowledge as possible explanations for improvement. The study provides preliminary evidence that computer-based prompting has potential to improve writing quality for undergraduates with ABI.
Nurse-Administered Hand Massage: Integration Into an Infusion Suite's Standard of Care.
Braithwaite, Caitlin M; Ringdahl, Deborah
2017-08-01
Nurse-delivered hand massage is a safe and effective intervention that has potential for positively affecting nursing and patient outcomes. Nurses in a National Cancer Institute-designated academic health center outpatient chemotherapy infusion suite were taught how to administer a hand massage to strengthen the nurse-patient relationship and improve patient experience, comfort, satisfaction, stress, and anxiety. A pre-/postimplementation group comparison design was used. Patients in both groups completed self-reported measures of stress, comfort, satisfaction, and anxiety. Nurses completed Likert-type scales pre- and postimplementation on the perceived benefits of hand massage to the patient and nursing practice, impact on patient anxiety, and preparation in providing a hand massage. A positive trend was seen in all indicators. Patients who received a hand massage had a statistically significant improvement in comfort (p = 0.025) compared to those who did not. A statistically significant improvement was seen in all nurse indicators pre- to postimplementation.
Improved population estimates through the use of auxiliary information
Johnson, D.H.; Ralph, C.J.; Scott, J.M.
1981-01-01
When estimating the size of a population of birds, the investigator may have, in addition to an estimator based on a statistical sample, information on one of several auxiliary variables, such as: (1) estimates of the population made on previous occasions, (2) measures of habitat variables associated with the size of the population, and (3) estimates of the population sizes of other species that correlate with the species of interest. Although many studies have described the relationships between each of these kinds of data and the population size to be estimated, very little work has been done to improve the estimator by incorporating such auxiliary information. A statistical methodology termed 'empirical Bayes' seems to be appropriate to these situations. The potential that empirical Bayes methodology has for improved estimation of the population size of the Mallard (Anas platyrhynchos) is explored. In the example considered, three empirical Bayes estimators were found to reduce the error by one-fourth to one-half of that of the usual estimator.
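A minimal sketch of the normal-normal empirical Bayes shrinkage idea follows; the counts, standard errors, and method-of-moments variance estimate are illustrative assumptions, not the authors' actual estimators or Mallard data.

```python
import numpy as np

def eb_shrink(y, se):
    """Shrink direct estimates toward a common mean (normal-normal model).
    y: direct population estimates; se: their standard errors."""
    mu = np.average(y, weights=1 / se**2)                  # precision-weighted grand mean
    tau2 = max(np.var(y, ddof=1) - np.mean(se**2), 0.0)    # crude between-unit variance
    shrink = tau2 / (tau2 + se**2)                         # precise estimates shrink less
    return shrink * y + (1 - shrink) * mu

y = np.array([820., 640., 910., 700.])    # hypothetical survey estimates
se = np.array([120., 90., 150., 80.])
print(eb_shrink(y, se))
```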
Hilarion, Pilar; Groene, Oliver; Colom, Joan; Lopez, Rosa M; Suñol, Rosa
2010-10-23
The Health Department of the Regional Government of Catalonia, Spain, issued a quality plan for substance abuse centers. The objective of this paper is to evaluate the impact of a multidimensional quality improvement initiative in the field of substance abuse care and to discuss potentials and limitations for further quality improvement. The study uses an uncontrolled, sector-wide pre-post design. All centers providing services for persons with substance abuse issues in the Autonomous Community of Catalonia participated in this assessment. Measures of compliance were developed based on indicators reported in the literature and by broad stakeholder involvement. We compared pre-post differences in dimension-specific and overall compliance scores using one-way ANOVA for repeated measures and the Friedman statistic. We described the spread of the data using the inter-quartile range and the Fligner-Killeen statistic. Finally, we adjusted compliance scores for location and size using linear and logistic regression models. We performed a baseline and follow-up assessment in 22 centers for substance abuse care and observed substantial and statistically significant improvements for overall compliance (pre: 60.9%; post: 79.1%) and for compliance in the dimensions 'care pathway' (pre: 66.5%; post: 83.5%) and 'organization and management' (pre: 50.5%; post: 77.2%). We observed improvements in the dimension 'environment and infrastructure' (pre: 81.8%; post: 95.5%) and in the dimension 'relations and user rights' (pre: 66.5%; post: 72.5%); however, these were not statistically significant. The regression analysis suggests that improvements in compliance are positively influenced by being located in the Barcelona region in the case of the dimension 'relations and user rights'. The positive results of this quality improvement initiative are possibly associated with the successful involvement of stakeholders, the consciously constructed feedback reports on individual and sector-wide performance, and the support of evidence-based guidance wherever possible. Further research should address how contextual issues shape the uptake and effectiveness of quality improvement actions and how such quality improvements can be sustained.
Providing peak river flow statistics and forecasting in the Niger River basin
NASA Astrophysics Data System (ADS)
Andersson, Jafet C. M.; Ali, Abdou; Arheimer, Berit; Gustafsson, David; Minoungou, Bernard
2017-08-01
Flooding is a growing concern in West Africa. Improved quantification of discharge extremes and associated uncertainties is needed to improve infrastructure design, and operational forecasting is needed to provide timely warnings. In this study, we use discharge observations, a hydrological model (Niger-HYPE) and extreme value analysis to estimate peak river flow statistics (e.g. the discharge magnitude with a 100-year return period) across the Niger River basin. To test the model's capacity to predict peak flows, we compared 30-year maximum discharge and peak flow statistics derived from the model with those derived from nine observation stations. The results indicate that the model simulates peak discharge reasonably well (on average +20%). However, the peak flow statistics have a large uncertainty range, which ought to be considered in infrastructure design. We then applied the methodology to derive basin-wide maps of peak flow statistics and their associated uncertainty. The results indicate that the method is applicable across the hydrologically active part of the river basin, and that the uncertainty varies substantially depending on location. Subsequently, we used the most recent bias-corrected climate projections to analyze potential changes in peak flow statistics in a changed climate. The results are generally ambiguous, with consistent changes only in very few areas. To test the forecasting capacity, we ran Niger-HYPE with a combination of meteorological data sets for the 2008 high-flow season and compared with observations. The results indicate reasonable forecasting capacity (on average 17% deviation), but additional years should also be evaluated. We finish by presenting a strategy and pilot project which will develop an operational flood monitoring and forecasting system based on in-situ data, earth observations, modelling, and extreme statistics. In this way we aim to build capacity to ultimately improve resilience toward floods, protecting lives and infrastructure in the region.
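A return-level calculation of the kind described can be sketched with scipy's generalized extreme value distribution; the synthetic annual maxima and the parametric bootstrap are illustrative, not the Niger-HYPE workflow.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
annual_max = rng.gumbel(loc=1500, scale=400, size=30)   # hypothetical annual peak flows (m3/s)

# Fit a GEV to the annual maxima; the 100-year level is exceeded with probability 1/100 per year
c, loc, scale = stats.genextreme.fit(annual_max)
q100 = stats.genextreme.isf(1 / 100, c, loc, scale)

# Parametric bootstrap for the uncertainty range the authors stress
boot = [stats.genextreme.isf(1 / 100, *stats.genextreme.fit(
            stats.genextreme.rvs(c, loc, scale, size=annual_max.size, random_state=i)))
        for i in range(200)]
print(round(q100), np.percentile(boot, [5, 95]).round())
```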
A critical assessment of mortality statistics in Thailand: potential for improvements.
Tangcharoensathien, Viroj; Faramnuayphol, Pinij; Teokul, Waranya; Bundhamcharoen, Kanitta; Wibulpholprasert, Suwit
2006-01-01
This study evaluates the collection and flow of mortality and cause-of-death (COD) data in Thailand, identifying areas of weakness and presenting potential approaches to improve these statistics. Methods include systems analysis, literature review, and the application of the Health Metrics Network (HMN) self-assessment tool by key stakeholders. We identified two weaknesses underlying incompleteness of death registration and inaccuracy of COD attribution: problems in recording events or certifying deaths, and problems in transferring information from death certificates to death registers. Deaths occurring outside health facilities, representing 65% of all deaths in Thailand, contribute to the inaccuracy of cause-of-death data because they must be certified by village heads with limited knowledge and expertise in cause-of-death attribution. However, problems also exist with in-hospital cause-of-death certification by physicians. Priority should be given to training medical personnel in death certification, review of medical records by health personnel in district hospitals, and use of verbal autopsy techniques for assessing internal consistency. This should be coupled with stronger collaboration with district registrars for the 65% of deaths that occur outside hospitals. Training of physicians and data coders and harmonization of death certificates and registries would improve COD data for the 35% of deaths that take place in hospital. Public awareness of the importance of registering all deaths and the application of registration requirements prior to funerals would also improve coverage, though enforcement would be difficult. PMID:16583083
NASA Astrophysics Data System (ADS)
Xu, Xianjin; Yan, Chengfei; Zou, Xiaoqin
2017-08-01
The growing number of protein-ligand complex structures, particularly the structures of proteins co-bound with different ligands, in the Protein Data Bank helps us tackle two major challenges in molecular docking studies: protein flexibility and the scoring function. Here, we introduced a systematic strategy that uses the information embedded in known protein-ligand complex structures to improve both binding mode and binding affinity predictions. Specifically, a ligand similarity calculation method was employed to search for a receptor structure with a bound ligand sharing high similarity with the query ligand for use in docking. The strategy was applied to two datasets (HSP90 and MAP4K4) from the recent D3R Grand Challenge 2015. In addition, for the HSP90 dataset, a system-specific scoring function (ITScore2_hsp90) was generated by recalibrating our statistical potential-based scoring function (ITScore2) using the known protein-ligand complex structures and the statistical mechanics-based iterative method. For the HSP90 dataset, better performances were achieved for both binding mode and binding affinity predictions compared with the original ITScore2 and with ensemble docking. For the MAP4K4 dataset, although there were only eight known protein-ligand complex structures, our docking strategy achieved a performance comparable to ensemble docking. Our method for receptor conformational selection and our iterative method for developing system-specific statistical potential-based scoring functions can be easily applied to other protein targets that have a number of protein-ligand complex structures available to improve predictions on binding.
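The abstract does not specify the similarity metric used; a common choice, assumed here, is the Tanimoto coefficient on Morgan fingerprints via RDKit. The SMILES strings are placeholders.

```python
from rdkit import Chem, DataStructs
from rdkit.Chem import AllChem

def best_template(query_smiles, cocrystal_smiles):
    """Pick the co-crystallized ligand most similar to the query; the receptor
    it was bound to would then be selected for docking."""
    fp = lambda s: AllChem.GetMorganFingerprintAsBitVect(Chem.MolFromSmiles(s), 2, nBits=2048)
    qfp = fp(query_smiles)
    return max((DataStructs.TanimotoSimilarity(qfp, fp(s)), s) for s in cocrystal_smiles)

print(best_template("CCOc1ccccc1", ["CCOc1ccccc1C", "c1ccncc1", "CC(=O)O"]))
```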
Yoga-Enhanced Cognitive Behavioral Therapy (Y-CBT) for Anxiety Management: A Pilot Study
Khalsa, Manjit K.; Greiner-Ferris, Julie M.; Hofmann, Stefan G.; Khalsa, Sat Bir S.
2014-01-01
Cognitive behavioral therapy is an effective treatment for generalized anxiety disorder (GAD), but there is still room for improvement. The aim of the present study was to examine the potential benefit of enriching cognitive behavioral therapy (CBT) with Kundalini Yoga (Y-CBT). Participants consisted of treatment resistant clients at a community mental health clinic. A total of 32 participants enrolled in the study and 22 completed the program. After the Y-CBT intervention, pre-post comparisons showed statistically significant improvements in state and trait anxiety, depression, panic, sleep, and quality of life. Results from this preliminary study suggest that Y-CBT may have potential as a promising treatment for those suffering from GAD. PMID:24804619
Barber, Julie A; Thompson, Simon G
1998-01-01
Objective: To review critically the statistical methods used for health economic evaluations in randomised controlled trials where an estimate of cost is available for each patient in the study. Design: Survey of published randomised trials including an economic evaluation with cost values suitable for statistical analysis; 45 such trials published in 1995 were identified from Medline. Main outcome measures: The use of statistical methods for cost data was assessed in terms of the descriptive statistics reported, use of statistical inference, and whether the reported conclusions were justified. Results: Although all 45 trials reviewed apparently had cost data for each patient, only 9 (20%) reported adequate measures of variability for these data and only 25 (56%) gave results of statistical tests or a measure of precision for the comparison of costs between the randomised groups. Only 16 (36%) of the articles gave conclusions which were justified on the basis of results presented in the paper. No paper reported sample size calculations for costs. Conclusions: The analysis and interpretation of cost data from published trials reveal a lack of statistical awareness. Strong and potentially misleading conclusions about the relative costs of alternative therapies have often been reported in the absence of supporting statistical evidence. Improvements in the analysis and reporting of health economic assessments are urgently required. Health economic guidelines need to be revised to incorporate more detailed statistical advice. Key messages: Health economic evaluations required for important healthcare policy decisions are often carried out in randomised controlled trials. A review of such published economic evaluations assessed whether statistical methods for cost outcomes have been appropriately used and interpreted. Few publications presented adequate descriptive information for costs or performed appropriate statistical analyses. In at least two thirds of the papers, the main conclusions regarding costs were not justified. The analysis and reporting of health economic assessments within randomised controlled trials urgently need improving. PMID:9794854
NASA Astrophysics Data System (ADS)
Al Khatri, Ayisha; Grundmann, Jens; van der Weth, Rüdiger; Schütze, Niels
2015-04-01
Al Batinah coastal area is the main agricultural region in Oman. Agriculture is concentrated in Al Batinah because of its more fertile soils and easier access to water, in the form of groundwater, compared with other administrative areas in the country. The region is now facing a problem as a result of over-abstraction of fresh groundwater for irrigation from the main aquifer along the coast. This induces the inflow of sea water into the coastal aquifer and causes salinization of the groundwater. As a consequence, the groundwater is no longer suitable for irrigation, which impacts the social and economic situation of farmers as well as the environment. The existing situation therefore generates conflicts between different stakeholders regarding water availability, sustainable aquifer management, and profitable agricultural production in the Al Batinah region. Several management measures to maintain the groundwater aquifer in the region were implemented by the government; however, these solutions showed only limited success for the existing problem. The aim of this study is to evaluate the implementation potential of several management interventions and their combinations by analysing the opinions and responses of all relevant stakeholders in the region. This is done in order to identify potential conflicts among stakeholders in a participatory process within the frame of integrated water resources management, and to support decision makers in taking more informed decisions. Questionnaires were designed for collecting data from different groups of stakeholders, e.g. water professionals, farmers from the study area, and decision makers of different organizations and ministries. These data were analysed statistically for each group separately, as well as for relations among groups, using the SPSS (Statistical Package for the Social Sciences) software. Results show that the need to improve the situation is supported by all groups. However, significant differences exist between groups on how to achieve this improvement, since farmers prefer management interventions operating more on the water resources side, while decision makers support measures for better management on the water demand side. Furthermore, the opinions within single groups are sometimes contradictory for several management interventions. The use of more advanced statistical methods such as discriminant analysis or Bayesian networks allows factors and drivers to be identified that explain these differences. Both approaches will help to understand stakeholders' behaviour and to evaluate the implementation potential of several management interventions. Keywords: IWRM, stakeholder participation, field survey, statistical analysis, Oman
Statistical Techniques for Assessing Water-Quality Effects of BMPs
Walker, John F.
1994-01-01
Little has been published on the effectiveness of various management practices in small rural lakes and streams at the watershed scale. In this study, statistical techniques were used to test for changes in water-quality data from watersheds where best management practices (BMPs) were implemented. Reductions in data variability due to climate and seasonality were accomplished through the use of regression methods. This study discusses the merits of using storm-mass-transport data as a means of improving the ability to detect BMP effects on stream-water quality. Statistical techniques were applied to suspended-sediment records from three rural watersheds in Illinois for the period 1981-84. None of the techniques identified changes in suspended sediment, primarily because of the small degree of BMP implementation and because of potential errors introduced through the estimation of storm-mass transport. A Monte Carlo sensitivity analysis was used to determine the level of discrete change that could be detected for each watershed. In all cases, the use of regressions improved the ability to detect trends.
Birenbaum, H J; Pfoh, E R; Helou, S; Pane, M A; Marinkovich, G A; Dentry, A; Yeh, Hsin-Chieh; Updegraff, L; Arnold, C; Liverman, S; Cawman, H
2016-05-19
We previously demonstrated a significant reduction in the incidence of chronic lung disease in our NICU using potentially better practices of avoiding delivery room endotracheal intubation and using early nasal CPAP. We sought to demonstrate whether these improvements were sustained or improved over time. We conducted a retrospective, cross-sectional analysis of infants 501-1500 grams born at our hospital between 2005 and 2013. Infants born during the 2005-2007, 2008-2010 and 2011-2013 epochs were grouped together, respectively. Descriptive analysis was conducted to determine the number and percent of maternal and neonatal characteristics by year grouping. Chi-squared tests were used to determine whether there were any statistically significant changes in characteristics across year groupings. Two outcome variables were assessed: a diagnosis of chronic lung disease based on the Vermont Oxford Network definition, and being discharged home on supplemental oxygen. There was a statistically significant improvement in the incidence of chronic lung disease in infants below 27 weeks' gestation in the 2011-2013 cohort compared with the 2005-2007 cohort. We also found a statistically significant improvement in the number of infants discharged on home oxygen among infants with birth weights of 751-1000 grams and infants with gestational age less than 27 weeks in the 2011-2013 cohort compared to the 2005-2007 cohort. We demonstrated sustained improvement in our incidence of CLD between 2005 and 2013. We speculate that a multifaceted strategy of avoiding intubation and excessive oxygen in the delivery room, the early use of CPAP, as well as the use of volume-targeted ventilation, when needed, may help significantly reduce the incidence of CLD.
Collection development using interlibrary loan borrowing and acquisitions statistics.
Byrd, G D; Thomas, D A; Hughes, K E
1982-01-01
Libraries, especially those supporting the sciences, continually face the problem of selecting appropriate new books for their users. Traditional collection development techniques include the use of librarian or user subject specialists, user recommendations, and approval plans. These methods of selection, however, are most effective in large libraries and do not systematically correlate new book purchases with the actual demands of the users served. This paper describes a statistical method for determining subject strengths and weaknesses in a library book collection in relation to user demand. Using interlibrary loan borrowing and book acquisition statistics gathered for one fiscal year from three health sciences libraries, the authors developed a way to graph the broad and narrow subject fields of strength and potential weakness in a book collection. This method has the advantages of simplicity, speed of implementation, and clarity. It can also be used over a period of time to verify the success or failure of a collection development program. Finally, the method has potential as a tool for use by two or more libraries seeking to improve cooperative collection development in a network or consortium. PMID:7059712
Hao, Chen; LiJun, Chen; Albright, Thomas P.
2007-01-01
Invasive exotic species pose a growing threat to the economy, public health, and ecological integrity of nations worldwide. Explaining and predicting the spatial distribution of invasive exotic species is of great importance to prevention and early warning efforts. We are investigating the potential distribution of invasive exotic species, the environmental factors that influence these distributions, and the ability to predict them using statistical and information-theoretic approaches. For some species, detailed presence/absence occurrence data are available, allowing the use of a variety of standard statistical techniques. However, for most species, absence data are not available. Presented with the challenge of developing a model based on presence-only information, we developed an improved logistic regression approach using Information Theory and Frequency Statistics to produce a relative suitability map. This paper generated a variety of distributions of ragweed (Ambrosia artemisiifolia L.) from logistic regression models applied to herbarium specimen location data and a suite of GIS layers including climatic, topographic, and land cover information. Our logistic regression model was based on Akaike's Information Criterion (AIC) from a suite of ecologically reasonable predictor variables. Based on the results we provided a new Frequency Statistical method to compartmentalize habitat-suitability in the native range. Finally, we used the model and the compartmentalized criterion developed in native ranges to "project" a potential distribution onto the exotic ranges to build habitat-suitability maps. © Science in China Press 2007.
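An AIC-based comparison of candidate logistic regression models, similar in spirit to the approach above, can be sketched as follows; the simulated presence/absence data and predictor names are placeholders, and the paper's presence-only frequency adjustments are not reproduced.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 1000
climate = rng.normal(size=n)
landcover = rng.normal(size=n)
presence = rng.binomial(1, 1 / (1 + np.exp(-(0.8 * climate - 0.5))))  # synthetic occurrences

# Rank candidate predictor sets by AIC, as in model selection with Akaike's criterion
candidates = {"climate": [climate], "climate + landcover": [climate, landcover]}
for name, cols in candidates.items():
    X = sm.add_constant(np.column_stack(cols))
    res = sm.Logit(presence, X).fit(disp=0)
    print(name, "AIC:", round(res.aic, 1))
```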
Makeyev, Oleksandr; Joe, Cody; Lee, Colin; Besio, Walter G
2017-07-01
Concentric ring electrodes have shown promise in non-invasive electrophysiological measurement, demonstrating their superiority to conventional disc electrodes, in particular in the accuracy of Laplacian estimation. Recently, we have proposed novel variable inter-ring distances concentric ring electrodes. Analytic and finite element method modeling results for linearly increasing distances electrode configurations suggested they may decrease the truncation error, resulting in more accurate Laplacian estimates compared with the currently used constant inter-ring distances configurations. This study assesses the statistical significance of the improvement in Laplacian estimation accuracy due to the novel variable inter-ring distances concentric ring electrodes. A full factorial analysis of variance design was used with one categorical and two numerical factors: the inter-ring distances, the electrode diameter, and the number of concentric rings in the electrode. The response variables were the Relative Error and the Maximum Error of Laplacian estimation, computed using a finite element method model for each combination of levels of the three factors. Effects of the main factors and their interactions on Relative Error and Maximum Error were assessed, and the obtained results suggest that all three factors have statistically significant effects in the model, confirming the potential of using inter-ring distances as a means of improving the accuracy of Laplacian estimation.
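Such a full factorial analysis can be expressed compactly with statsmodels; the file and column names below are hypothetical stand-ins for the finite element model outputs described.

```python
import pandas as pd
from statsmodels.formula.api import ols
from statsmodels.stats.anova import anova_lm

# Assumed columns: relative_error, distances (categorical), diameter, n_rings
df = pd.read_csv("fem_simulations.csv")

# Full factorial model: main effects of all three factors plus their interactions
model = ols("relative_error ~ C(distances) * diameter * n_rings", data=df).fit()
print(anova_lm(model, typ=2))   # F-tests for effects on Laplacian estimation error
```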
An analysis of landing rates and separations at the Dallas/Fort Worth International Airport
NASA Technical Reports Server (NTRS)
Ballin, Mark G.; Erzberger, Heinz
1996-01-01
Advanced air traffic management systems such as the Center/TRACON Automation System (CTAS) should yield a wide range of benefits, including reduced aircraft delays and controller workload. To determine the traffic-flow benefits achievable from future terminal airspace automation, live radar information was used to perform an analysis of current aircraft landing rates and separations at the Dallas/Fort Worth International Airport. Separation statistics that result when controllers balance complex control procedural constraints in order to maintain high landing rates are presented. In addition, the analysis estimates the potential for airport capacity improvements by determining the unused landing opportunities that occur during rush traffic periods. Results suggest a large potential for improving the accuracy and consistency of spacing between arrivals on final approach, and they support earlier simulation findings that improved air traffic management would increase capacity and reduce delays.
Statistics, Structures & Satisfied Customers: Using Web Log Data to Improve Site Performance.
ERIC Educational Resources Information Center
Peacock, Darren
This paper explores some of the ways in which the National Museum of Australia is using Web analysis tools to shape its future directions in the delivery of online services. In particular, it explores the potential of quantitative analysis, based on Web server log data, to convert these ephemeral traces of user experience into a strategic…
Dickinson, Kathleen; Place, Maurice
2016-06-01
Problems with social functioning are a major area of difficulty for children with autism. Such problems have the potential to exert a negative influence on several aspects of the children's functioning, including their ability to access education. This study looked to examine if a computer-based activity program could improve the social functioning of these children. Using a pooled subject design, 100 children with autistic spectrum disorder were randomly allocated, controlling where possible for age and gender, to either an intervention or a control group. The children in the intervention group were encouraged to use the Nintendo (Kyoto, Japan) Wii™ and the software package "Mario & Sonic at the Olympics" in addition to their routine school physical education classes over a 9-month period. The control group attended only the routine physical education classes. After 1 year, analysis of the changes in the scores of teacher-completed measures of social functioning showed that boys in the intervention group had made statistically significant improvement in their functioning when compared with controls. The number of girls in the study was too small for any change to reach statistical significance. This type of intervention appears to have potential as a mechanism to produce improvement in the social functioning, at least of boys, as part of a physical education program.
NASA Astrophysics Data System (ADS)
Bhowmik, Mrinal Kanti; Gogoi, Usha Rani; Das, Kakali; Ghosh, Anjan Kumar; Bhattacharjee, Debotosh; Majumdar, Gautam
2016-05-01
The non-invasive, painless, radiation-free and cost-effective infrared breast thermography (IBT) makes a significant contribution to improving the survival rate of breast cancer patients by detecting the disease early. This paper presents a set of standard breast thermogram acquisition protocols to improve the potential and accuracy of infrared breast thermograms in early breast cancer detection. By maintaining all these protocols, an infrared breast thermogram acquisition setup has been established at the Regional Cancer Centre (RCC) of Government Medical College (AGMC), Tripura, India. The acquisition of breast thermograms is followed by their interpretation to identify the presence of any abnormality. However, due to the presence of complex vascular patterns, accurate interpretation of breast thermograms is a very challenging task. The bilateral symmetry of the thermal patterns in each breast thermogram is quantitatively assessed by statistical feature analysis. A series of statistical features are extracted from a set of 20 thermograms of both healthy and unhealthy subjects. Finally, the extracted features are analyzed for breast abnormality detection. The key contributions of this paper are: a) the design of a standard protocol suite for accurate acquisition of breast thermograms; b) the creation of a new breast thermogram dataset acquired under this protocol suite; and c) statistical analysis of the thermograms for abnormality detection. In this way, the proposed work can minimize the rate of false findings in breast thermograms and thus increase their utility in early breast cancer detection.
On the analysis of very small samples of Gaussian repeated measurements: an alternative approach.
Westgate, Philip M; Burchett, Woodrow W
2017-03-15
The analysis of very small samples of Gaussian repeated measurements can be challenging. First, due to a very small number of independent subjects contributing outcomes over time, statistical power can be quite small. Second, nuisance covariance parameters must be appropriately accounted for in the analysis in order to maintain the nominal test size. However, available statistical strategies that ensure valid statistical inference may lack power, whereas more powerful methods may have the potential for inflated test sizes. Therefore, we explore an alternative approach to the analysis of very small samples of Gaussian repeated measurements, with the goal of maintaining valid inference while also improving statistical power relative to other valid methods. This approach uses generalized estimating equations with a bias-corrected empirical covariance matrix that accounts for all small-sample aspects of nuisance correlation parameter estimation in order to maintain valid inference. Furthermore, the approach utilizes correlation selection strategies with the goal of choosing the working structure that will result in the greatest power. In our study, we show that when accurate modeling of the nuisance correlation structure impacts the efficiency of regression parameter estimation, this method can improve power relative to existing methods that yield valid inference. Copyright © 2017 John Wiley & Sons, Ltd.
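A sketch of the general approach with statsmodels' GEE implementation follows, requesting a bias-reduced empirical covariance for small samples; the data file, column names, and the exchangeable working structure are illustrative assumptions, and the correlation-selection step described above is not shown.

```python
import pandas as pd
import statsmodels.api as sm

# Assumed columns: y (Gaussian outcome), time, subject (very few subjects)
df = pd.read_csv("repeated_measures.csv")

model = sm.GEE.from_formula("y ~ time", groups="subject", data=df,
                            cov_struct=sm.cov_struct.Exchangeable())
# 'bias_reduced' asks for a small-sample-corrected empirical covariance matrix
result = model.fit(cov_type="bias_reduced")
print(result.summary())
```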
Chiou, Chei-Chang; Wang, Yu-Min; Lee, Li-Tze
2014-08-01
Statistical knowledge is widely used in academia; however, statistics teachers struggle with the issue of how to reduce students' statistics anxiety and enhance students' statistics learning. This study assesses the effectiveness of a "one-minute paper strategy" in reducing students' statistics-related anxiety and in improving students' statistics-related achievement. Participants were 77 undergraduates from two classes enrolled in applied statistics courses. An experiment was implemented according to a pretest/posttest comparison group design. The quasi-experimental design showed that the one-minute paper strategy significantly reduced students' statistics anxiety and improved students' statistics learning achievement. The strategy was a better instructional tool than the textbook exercise for reducing students' statistics anxiety and improving students' statistics achievement.
Massive parallelization of serial inference algorithms for a complex generalized linear model
Suchard, Marc A.; Simpson, Shawn E.; Zorych, Ivan; Ryan, Patrick; Madigan, David
2014-01-01
Following a series of high-profile drug safety disasters in recent years, many countries are redoubling their efforts to ensure the safety of licensed medical products. Large-scale observational databases such as claims databases or electronic health record systems are attracting particular attention in this regard, but present significant methodological and computational concerns. In this paper we show how high-performance statistical computation, including graphics processing units (GPUs), relatively inexpensive and highly parallel computing devices, can enable complex methods in large databases. We focus on optimization and massive parallelization of cyclic coordinate descent approaches to fit a conditioned generalized linear model involving tens of millions of observations and thousands of predictors in a Bayesian context. We find orders-of-magnitude improvement in overall run-time. Coordinate descent approaches are ubiquitous in high-dimensional statistics, and the algorithms we propose open up exciting new methodological possibilities with the potential to significantly improve drug safety. PMID:25328363
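The cyclic coordinate descent inner loop is easy to state; below is a serial lasso-regression version in numpy as a stand-in. The paper's model is a conditioned generalized linear model with a Bayesian prior, and its contribution is the GPU parallelization, neither of which is reproduced here.

```python
import numpy as np

def lasso_cd(X, y, lam, iters=100):
    """Cyclic coordinate descent for the lasso: minimize 0.5*||y - Xb||^2 + lam*||b||_1."""
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)
    resid = y - X @ beta
    for _ in range(iters):
        for j in range(p):
            resid += X[:, j] * beta[j]                    # remove coordinate j's contribution
            rho = X[:, j] @ resid
            beta[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]  # soft-threshold update
            resid -= X[:, j] * beta[j]                    # restore with the updated coefficient
    return beta

rng = np.random.default_rng(4)
X = rng.normal(size=(200, 10))
y = 2 * X[:, 0] - X[:, 3] + rng.normal(size=200)
print(np.round(lasso_cd(X, y, lam=20.0), 2))
```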
Enhanced Higgs boson to τ(+)τ(-) search with deep learning.
Baldi, P; Sadowski, P; Whiteson, D
2015-03-20
The Higgs boson is thought to provide the interaction that imparts mass to the fundamental fermions, but while measurements at the Large Hadron Collider (LHC) are consistent with this hypothesis, current analysis techniques lack the statistical power to cross the traditional 5σ significance barrier without more data. Deep learning techniques have the potential to increase the statistical power of this analysis by automatically learning complex, high-level data representations. In this work, deep neural networks are used to detect the decay of the Higgs boson to a pair of tau leptons. A Bayesian optimization algorithm is used to tune the network architecture and training algorithm hyperparameters, resulting in a deep network of eight nonlinear processing layers that improves upon the performance of shallow classifiers even without the use of features specifically engineered by physicists for this application. The improvement in discovery significance is equivalent to an increase in the accumulated data set of 25%.
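A minimal deep network in the spirit described, eight nonlinear layers ahead of a sigmoid output, can be sketched in PyTorch; the feature count, width, activation, and training details are illustrative assumptions rather than the tuned architecture from the paper.

```python
import torch
import torch.nn as nn

def make_net(n_features=28, width=300, depth=8):
    """Stack `depth` nonlinear processing layers, then a sigmoid signal/background score."""
    layers = []
    for i in range(depth):
        layers += [nn.Linear(n_features if i == 0 else width, width), nn.Tanh()]
    layers += [nn.Linear(width, 1), nn.Sigmoid()]
    return nn.Sequential(*layers)

net = make_net()
x = torch.randn(64, 28)                      # a batch of low-level event features
y = torch.randint(0, 2, (64, 1)).float()     # signal vs. background labels
loss = nn.BCELoss()(net(x), y)               # one training step's loss
loss.backward()
print(float(loss))
```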
Stern, Hal S; Blower, Daniel; Cohen, Michael L; Czeisler, Charles A; Dinges, David F; Greenhouse, Joel B; Guo, Feng; Hanowski, Richard J; Hartenbaum, Natalie P; Krueger, Gerald P; Mallis, Melissa M; Pain, Richard F; Rizzo, Matthew; Sinha, Esha; Small, Dylan S; Stuart, Elizabeth A; Wegman, David H
2018-03-09
This article summarizes the recommendations on data and methodology issues for studying commercial motor vehicle driver fatigue of a National Academies of Sciences, Engineering, and Medicine study. A framework is provided that identifies the various factors affecting driver fatigue and relating driver fatigue to crash risk and long-term driver health. The relevant factors include characteristics of the driver, vehicle, carrier and environment. Limitations of existing data are considered and potential sources of additional data described. Statistical methods that can be used to improve understanding of the relevant relationships from observational data are also described. The recommendations for enhanced data collection and the use of modern statistical methods for causal inference have the potential to enhance our understanding of the relationship of fatigue to highway safety and to long-term driver health. Copyright © 2018 The Authors. Published by Elsevier Ltd.. All rights reserved.
Improved side-chain torsion potentials for the Amber ff99SB protein force field
Lindorff-Larsen, Kresten; Piana, Stefano; Palmo, Kim; Maragakis, Paul; Klepeis, John L; Dror, Ron O; Shaw, David E
2010-01-01
Recent advances in hardware and software have enabled increasingly long molecular dynamics (MD) simulations of biomolecules, exposing certain limitations in the accuracy of the force fields used for such simulations and spurring efforts to refine these force fields. Recent modifications to the Amber and CHARMM protein force fields, for example, have improved the backbone torsion potentials, remedying deficiencies in earlier versions. Here, we further advance simulation accuracy by improving the amino acid side-chain torsion potentials of the Amber ff99SB force field. First, we used simulations of model alpha-helical systems to identify the four residue types whose rotamer distribution differed the most from expectations based on Protein Data Bank statistics. Second, we optimized the side-chain torsion potentials of these residues to match new, high-level quantum-mechanical calculations. Finally, we used microsecond-timescale MD simulations in explicit solvent to validate the resulting force field against a large set of experimental NMR measurements that directly probe side-chain conformations. The new force field, which we have termed Amber ff99SB-ILDN, exhibits considerably better agreement with the NMR data. Proteins 2010. © 2010 Wiley-Liss, Inc. PMID:20408171
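The quantity being refitted here is the classic periodic dihedral term; a small evaluator follows, with placeholder coefficients (the actual ff99SB-ILDN parameters ship with the force-field files, and k_n corresponds to V_n/2 in the usual Amber convention).

```python
import numpy as np

def torsion_energy(phi, terms):
    """Amber-style dihedral energy: U(phi) = sum_n k_n * (1 + cos(n*phi - gamma_n)).
    terms: iterable of (k_n, periodicity n, phase gamma_n); phi in radians."""
    return sum(k * (1 + np.cos(n * phi - gamma)) for k, n, gamma in terms)

phi = np.linspace(-np.pi, np.pi, 7)
# Placeholder coefficients for a hypothetical side-chain torsion
print(torsion_energy(phi, [(1.0, 1, 0.0), (0.5, 2, np.pi), (0.2, 3, 0.0)]))
```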
NASA Astrophysics Data System (ADS)
Kim, Dae Hoe; Choi, Jae Young; Choi, Seon Hyeong; Ro, Yong Man
2012-03-01
In this study, a novel mammogram enhancement solution is proposed, aiming to improve the quality of subsequent mass segmentation in mammograms. It has been widely accepted that masses are usually hyper-dense or of uniform density with respect to their background. Also, their core parts are likely to have high intensity values, while intensity tends to decrease as the distance from the core increases. Based on the aforementioned observations, we develop a new and effective mammogram enhancement method by combining local statistical measurements and Sliding Band Filtering (SBF). By effectively combining local statistical measurements and SBF, we are able to improve the contrast of the bright and smooth regions (which represent potential mass regions), as well as the regions whose surrounding gradients converge to the centers of regions of interest. In this study, 89 mammograms were collected from the public MIAS database (DB) to demonstrate the effectiveness of the proposed enhancement solution in terms of improving mass segmentation. As the segmentation method, a widely used contour-based approach was employed. The contour-based method in conjunction with the proposed enhancement solution achieved an overall detection accuracy of 92.4%, with a total of 85 correct cases. On the other hand, without our enhancement solution, the overall detection accuracy of the contour-based method was only 78.3%. In addition, experimental results demonstrated the feasibility of our enhancement solution for improving detection accuracy on mammograms containing dense parenchymal patterns.
Cardiac surgery report cards: comprehensive review and statistical critique.
Shahian, D M; Normand, S L; Torchiana, D F; Lewis, S M; Pastore, J O; Kuntz, R E; Dreyer, P I
2001-12-01
Public report cards and confidential, collaborative peer education represent distinctly different approaches to cardiac surgery quality assessment and improvement. This review discusses the controversies regarding their methodology and relative effectiveness. Report cards have been the more commonly used approach, typically as a result of state legislation. They are based on the presumption that publication of outcomes effectively motivates providers, and that market forces will reward higher quality. Numerous studies have challenged the validity of these hypotheses. Furthermore, although states with report cards have reported significant decreases in risk-adjusted mortality, it is unclear whether this improvement resulted from public disclosure or, rather, from the development of internal quality programs by hospitals. An additional confounding factor is the nationwide decline in heart surgery mortality, including states without quality monitoring. Finally, report cards may engender negative behaviors such as high-risk case avoidance and "gaming" of the reporting system, especially if individual surgeon results are published. The alternative approach, continuous quality improvement, may provide an opportunity to enhance performance and reduce interprovider variability while avoiding the unintended negative consequences of report cards. This collaborative method, which uses exchange visits between programs and determination of best practice, has been highly effective in northern New England and in the Veterans Affairs Administration. However, despite their potential advantages, quality programs based solely on confidential continuous quality improvement do not address the issue of public accountability. For this reason, some states may continue to mandate report cards. In such instances, it is imperative that appropriate statistical techniques and report formats are used, and that professional organizations simultaneously implement continuous quality improvement programs. The statistical methodology underlying current report cards is flawed, and does not justify the degree of accuracy presented to the public. All existing risk-adjustment methods have substantial inherent imprecision, and this is compounded when the results of such patient-level models are aggregated and used inappropriately to assess provider performance. Specific problems include sample size differences, clustering of observations, multiple comparisons, and failure to account for the random component of interprovider variability. We advocate the use of hierarchical or multilevel statistical models to address these concerns, as well as report formats that emphasize the statistical uncertainty of the results.
Rossaro, Lorenzo; Tran, Thu P; Ransibrahmanakul, Kanat; Rainwater, Julie A; Csik, Genell; Cole, Stacey L; Prosser, Colette C; Nesbitt, Thomas S
2007-06-01
This study compared the impact of multipoint videoconferencing (VC) versus standard lecturing (ST) on primary care providers' (MDs, NPs/PAs, and RNs) education regarding hepatitis C virus (HCV). The hypothesis was that the educational impact of teaching through telemedicine is comparable to the traditional method. The aim was to provide participants clinically relevant information and knowledge about the natural history, diagnosis, and management of HCV. Improvement in knowledge was scored from a 10-item quiz administered before and after the educational intervention. Comparison of the pretest knowledge scores within provider groups showed no statistically significant difference in baseline knowledge for the ST versus VC method. However, for all practitioners combined, the VC group scored significantly lower on the pretest than the ST group (p < 0.05). All three types of learners improved their knowledge scores following the intervention. On average, MDs and NPs/PAs correctly answered 2 to 3.5 more questions on the posttest. RNs showed the greatest improvements, correctly answering an average of 4 to 5 more questions following the intervention. Improvement in knowledge scores between the two methods was statistically significant in favor of VC for the MDs (VC = 3.56 +/- 1.92 vs. ST = 2.13 +/- 1.89, p < 0.001) and all groups combined (VC = 4.37 +/- 1.92 vs. ST = 3.06 +/- 1.89, p < 0.001). The results of this study indicate that VC is equivalent to, if not better than, standard continuing medical education (CME). VC can potentially improve clinician education regarding the history, diagnosis, and management of HCV, thereby making a substantial impact on the clinical course of patients with this condition. In addition, VC has the potential to eliminate the financial and geographic barriers to professional education for rural practitioners.
An emission-weighted proximity model for air pollution exposure assessment.
Zou, Bin; Wilson, J Gaines; Zhan, F Benjamin; Zeng, Yongnian
2009-08-15
Among the most common spatial models for estimating personal exposure are Traditional Proximity Models (TPMs). Though TPMs are straightforward to configure and interpret, they are prone to extensive errors in exposure estimates and do not provide prospective estimates. To resolve these inherent problems, we introduce a novel Emission Weighted Proximity Model (EWPM), which takes into consideration the emissions from all sources potentially influencing the receptors. EWPM performance was evaluated by comparing the normalized exposure risk values of sulfur dioxide (SO2) calculated by EWPM with those calculated by TPM and with monitored observations over a one-year period in two large Texas counties. To investigate whether the limitations of TPM in predicting potential exposure risk without recorded incidence can be overcome, we also introduce a hybrid framework, a 'Geo-statistical EWPM': a synthesis of ordinary kriging geostatistical interpolation and EWPM. The prediction results are presented as two potential exposure risk maps. The performance of these two exposure maps in predicting individual SO2 exposure risk was validated with 10 virtual cases in prospective exposure scenarios. Risk values from EWPM agreed with the observed concentrations clearly better than those from TPM. Over the entire study area, the mean SO2 exposure risk from EWPM was higher relative to TPM (1.00 vs. 0.91). The mean bias of the exposure risk values of the 10 virtual cases between EWPM and 'Geo-statistical EWPM' is much smaller than that between TPM and 'Geo-statistical TPM' (5.12 vs. 24.63). EWPM appears to portray individual exposure more accurately than TPM. The 'Geo-statistical EWPM' effectively augments the role of the standard proximity model and makes it possible to predict individual risk in future exposure scenarios involving adverse health effects from environmental pollution.
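A minimal sketch of the contrast between the two proximity ideas, assuming simple inverse-distance damping (the paper's exact weighting may differ); coordinates and emission rates are hypothetical.

```python
import numpy as np

def tpm_risk(receptor, sources):
    """Traditional proximity: risk ~ 1 / distance to the nearest source."""
    d = np.linalg.norm(sources - receptor, axis=1)
    return 1.0 / d.min()

def ewpm_risk(receptor, sources, emissions):
    """Emission-weighted proximity: every source contributes, scaled by
    its emission rate and damped by distance (inverse-distance weighting
    is an assumption made for illustration)."""
    d = np.linalg.norm(sources - receptor, axis=1)
    return float(np.sum(emissions / d))

sources = np.array([[0.0, 0.0], [5.0, 0.0]])   # source coordinates (km)
emissions = np.array([100.0, 10.0])            # e.g. SO2 tons/year
receptor = np.array([1.0, 1.0])
print(tpm_risk(receptor, sources), ewpm_risk(receptor, sources, emissions))
```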
ERIC Educational Resources Information Center
Currie, Janet; Moretti, Enrico
This study estimates the effect of maternal education on birth outcomes using data from the Vital Statistics Natality files for 1970 to 1999. It also assesses the importance of four potential channels through which maternal education may improve birth outcomes: use of prenatal care, smoking behavior, marriage, and fertility. In an effort to…
Aqua/Aura Updated Inclination Adjust Maneuver Performance Prediction Model
NASA Technical Reports Server (NTRS)
Boone, Spencer
2017-01-01
This presentation will discuss the updated Inclination Adjust Maneuver (IAM) performance prediction model that was developed for Aqua and Aura following the 2017 IAM series. This updated model uses statistical regression methods to identify potential long-term trends in maneuver parameters, yielding improved predictions when re-planning past maneuvers. The presentation has been reviewed and approved by Eric Moyer, ESMO Deputy Project Manager.
Health benefits achieved through the Seventh-Day Adventist Wellness Challenge program.
Kamieneski, R; Brown, C M; Mitchell, C; Perrin, K M; Dindial, K
2000-11-01
The Wellness Challenge program introduces the philosophy of the healing power of God and stresses the importance of developing a sense of spirituality in conjunction with the promotion of good health. To apply scientific rigor to the outcome measures of the Seventh-Day Adventist Wellness Challenge program. A 2-tailed, paired-sample t test. East Pasco Medical Center in Zephyrhills, Fla. 165 participants. Presurvey, 21-day outpatient wellness intervention; postsurvey, 6 weeks after completion of the program. Changes in behaviors related to cigarette smoking, alcohol use, eating patterns, exercise, water consumption, rest, relaxation, and time spent outdoors, as well as demographic data. Statistically significant differences were found between the pre- and postprogram clinical and laboratory test results for the participants' blood pressure, weight, glucose levels, and cholesterol at the .05 alpha level. Furthermore, self-health improvements measured by pre- and postsurvey responses confirmed a statistically significant improvement in participants' willingness to improve their lifestyle behaviors for a potentially greater quality of life. The Wellness Challenge program offers ways to reduce risk factors related to chronic disease while improving the quality of life of an adult population by allowing people to slowly incorporate newly acquired tools into their everyday life.
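The design described is a two-tailed, paired-sample t test on pre/post measurements; a minimal sketch with hypothetical blood-pressure values follows.

```python
import numpy as np
from scipy import stats

# hypothetical pre/post systolic blood pressure for the same participants
pre = np.array([142.0, 150.0, 138.0, 160.0, 147.0, 155.0])
post = np.array([136.0, 151.0, 130.0, 152.0, 141.0, 149.0])

t, p = stats.ttest_rel(pre, post)   # two-tailed paired-sample t test
print(f"t = {t:.2f}, p = {p:.4f}")  # significant at alpha = .05 if p < .05
```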
Bell Correlations in a Many-Body System with Finite Statistics
NASA Astrophysics Data System (ADS)
Wagner, Sebastian; Schmied, Roman; Fadel, Matteo; Treutlein, Philipp; Sangouard, Nicolas; Bancal, Jean-Daniel
2017-10-01
A recent experiment reported the first violation of a Bell correlation witness in a many-body system [Science 352, 441 (2016)]. Following discussions in this Letter, we address here the question of the statistics required to witness Bell correlated states, i.e., states violating a Bell inequality, in such experiments. We start by deriving multipartite Bell inequalities involving an arbitrary number of measurement settings, two outcomes per party and one- and two-body correlators only. Based on these inequalities, we then build up improved witnesses able to detect Bell correlated states in many-body systems using two collective measurements only. These witnesses can potentially detect Bell correlations in states with an arbitrarily low amount of spin squeezing. We then establish an upper bound on the statistics needed to convincingly conclude that a measured state is Bell correlated.
Milic, Natasa M.; Trajkovic, Goran Z.; Bukumiric, Zoran M.; Cirkovic, Andja; Nikolic, Ivan M.; Milin, Jelena S.; Milic, Nikola V.; Savic, Marko D.; Corac, Aleksandar M.; Marinkovic, Jelena M.; Stanisavljevic, Dejana M.
2016-01-01
Background Although recent studies report on the benefits of blended learning in improving medical student education, there is still no empirical evidence on the relative effectiveness of blended over traditional learning approaches in medical statistics. We implemented blended along with on-site (i.e. face-to-face) learning to further assess the potential value of web-based learning in medical statistics. Methods This was a prospective study conducted with third year medical undergraduate students attending the Faculty of Medicine, University of Belgrade, who passed (440 of 545) the final exam of the obligatory introductory statistics course during 2013–14. Student statistics achievements were stratified based on the two methods of education delivery: blended learning and on-site learning. Blended learning included a combination of face-to-face and distance learning methodologies integrated into a single course. Results Mean exam scores for the blended learning student group were higher than for the on-site student group for both final statistics score (89.36±6.60 vs. 86.06±8.48; p = 0.001) and knowledge test score (7.88±1.30 vs. 7.51±1.36; p = 0.023) with a medium effect size. There were no differences in sex or study duration between the groups. Current grade point average (GPA) was higher in the blended group. In a multivariable regression model, current GPA and knowledge test scores were associated with the final statistics score after adjusting for study duration and learning modality (p<0.001). Conclusion This study provides empirical evidence to support educator decisions to implement different learning environments for teaching medical statistics to undergraduate medical students. Blended and on-site training formats led to similar knowledge acquisition; however, students with higher GPA preferred the technology assisted learning format. Implementation of blended learning approaches can be considered an attractive, cost-effective, and efficient alternative to traditional classroom training in medical statistics. PMID:26859832
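The reported analyses (a group comparison of exam scores plus a multivariable regression adjusting for study duration and learning modality) can be sketched on simulated data; all numbers below are hypothetical and only loosely mimic the reported effects.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy import stats

rng = np.random.default_rng(1)
n = 200  # hypothetical students
df = pd.DataFrame({
    "blended": rng.integers(0, 2, n),      # 1 = blended, 0 = on-site
    "gpa": rng.normal(8.5, 0.5, n),        # current grade point average
    "knowledge": rng.normal(7.7, 1.3, n),  # knowledge test score
    "years": rng.integers(3, 5, n),        # study duration
})
df["final"] = (60 + 2.5 * df.gpa + 1.0 * df.knowledge
               + 1.5 * df.blended + rng.normal(0, 5, n))

# group comparison, as in the exam-score contrast
a, b = df[df.blended == 1].final, df[df.blended == 0].final
print(stats.ttest_ind(a, b, equal_var=False))

# multivariable model: final score vs GPA and knowledge test,
# adjusted for study duration and learning modality
print(smf.ols("final ~ gpa + knowledge + years + blended", df).fit().summary())
```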
New U.S. Geological Survey Method for the Assessment of Reserve Growth
Klett, Timothy R.; Attanasi, E.D.; Charpentier, Ronald R.; Cook, Troy A.; Freeman, P.A.; Gautier, Donald L.; Le, Phuong A.; Ryder, Robert T.; Schenk, Christopher J.; Tennyson, Marilyn E.; Verma, Mahendra K.
2011-01-01
Reserve growth is defined as the estimated increases in quantities of crude oil, natural gas, and natural gas liquids that have the potential to be added to remaining reserves in discovered accumulations through extension, revision, improved recovery efficiency, and additions of new pools or reservoirs. A new U.S. Geological Survey method was developed to assess the reserve-growth potential of technically recoverable crude oil and natural gas to be added to reserves under proven technology currently in practice within the trend or play, or which reasonably can be extrapolated from geologically similar trends or plays. This method currently is in use to assess potential additions to reserves in discovered fields of the United States. The new approach involves (1) individual analysis of selected large accumulations that contribute most to reserve growth, and (2) conventional statistical modeling of reserve growth in remaining accumulations. This report will focus on the individual accumulation analysis. In the past, the U.S. Geological Survey estimated reserve growth by statistical methods using historical recoverable-quantity data. Those statistical methods were based on growth rates averaged by the number of years since accumulation discovery. Accumulations in mature petroleum provinces with volumetrically significant reserve growth, however, bias statistical models of the data; therefore, accumulations with significant reserve growth are best analyzed separately from those with less significant reserve growth. Large (greater than 500 million barrels) and older (with respect to year of discovery) oil accumulations increase in size at greater rates late in their development history in contrast to more recently discovered accumulations that achieve most growth early in their development history. Such differences greatly affect the statistical methods commonly used to forecast reserve growth. The individual accumulation-analysis method involves estimating the in-place petroleum quantity and its uncertainty, as well as the estimated (forecasted) recoverability and its respective uncertainty. These variables are assigned probabilistic distributions and are combined statistically to provide probabilistic estimates of ultimate recoverable quantities. Cumulative production and remaining reserves are then subtracted from the estimated ultimate recoverable quantities to provide potential reserve growth. In practice, results of the two methods are aggregated to various scales, the highest of which includes an entire country or the world total. The aggregated results are reported along with the statistically appropriate uncertainties.
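The combination step for an individually analyzed accumulation can be sketched as a simple Monte Carlo calculation; the distribution families and all quantities below are illustrative assumptions, not USGS inputs.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000  # Monte Carlo trials

# hypothetical accumulation: uncertain in-place oil and recovery factor
in_place = rng.lognormal(mean=np.log(2000.0), sigma=0.3, size=n)  # MMbbl
recovery = rng.beta(a=8, b=12, size=n)                            # ~0.4 mean

ultimate = in_place * recovery        # ultimate recoverable, per trial
produced, reserves = 500.0, 150.0     # known quantities (MMbbl)
growth = np.maximum(ultimate - produced - reserves, 0.0)

# probabilistic estimate of potential reserve growth (fractiles)
f95, f50, f05 = np.percentile(growth, [5, 50, 95])
print(f"F95={f95:.0f}  F50={f50:.0f}  F05={f05:.0f} MMbbl")
```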
Health benefits of particle filtration.
Fisk, W J
2013-10-01
The evidence of health benefits of particle filtration in homes and commercial buildings is reviewed. Prior reviews of papers published before 2000 are summarized. The results of 16 more recent intervention studies are compiled and analyzed. Also reviewed are four studies that modeled the health benefits of using filtration to reduce indoor exposures to particles from outdoors. Prior reviews generally concluded that particle filtration is, at best, a source of small improvements in allergy and asthma health effects; however, many early studies had weak designs. A majority of recent intervention studies employed strong designs, and more of these studies report statistically significant improvements in health symptoms or objective health outcomes, particularly for subjects with allergies or asthma. The percentage improvement in health outcomes is typically modest, for example, 7% to 25%. Delivery of filtered air to the breathing zone of sleeping allergic or asthmatic persons may be more consistently effective in improving health than room air filtration. Notable are two studies that report statistically significant improvements, with filtration, in markers that predict future adverse coronary events. From modeling, the largest potential benefits of indoor particle filtration may be reductions in morbidity and mortality from reducing indoor exposures to particles from outdoor air. Published 2013. This article is a US Government work and is in the public domain in the USA.
NASA Astrophysics Data System (ADS)
Wu, Qing; Luu, Quang-Hung; Tkalich, Pavel; Chen, Ge
2018-04-01
Having great impacts on human lives, global warming and the associated sea level rise are believed to be strongly linked to anthropogenic causes. A statistical approach offers a simple and yet conceptually verifiable combination of remotely connected climate variables and indices, including sea level and surface temperature. We propose an improved statistical reconstruction model based on an empirical dynamic control system that takes into account climate variability and derives its parameters from Monte Carlo cross-validation random experiments. For the historical data from 1880 to 2001, our model yielded higher correlations than other dynamic empirical models. The averaged root mean square errors are reduced in both reconstructed fields, namely the global mean surface temperature (by 24-37%) and the global mean sea level (by 5-25%). Our model is also more robust, as it notably reduces the instability associated with varying initial values. These results suggest that the model not only significantly enhances the global mean reconstructions of temperature and sea level but also has the potential to improve future projections.
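A minimal sketch of Monte Carlo cross-validation, the parameter-selection device mentioned above, using ordinary least squares as a stand-in for the reconstruction model; all data and function names are illustrative.

```python
import numpy as np

def mc_cross_validate(X, y, fit, predict, n_splits=200, test_frac=0.3, seed=0):
    """Monte Carlo cross-validation: many random train/test partitions.

    Returns the RMSE for each split; averaging over splits gives a
    skill estimate that is insensitive to any single partition.
    """
    rng = np.random.default_rng(seed)
    n = len(y)
    rmses = []
    for _ in range(n_splits):
        idx = rng.permutation(n)
        test, train = idx[: int(test_frac * n)], idx[int(test_frac * n):]
        params = fit(X[train], y[train])
        err = y[test] - predict(X[test], params)
        rmses.append(np.sqrt(np.mean(err ** 2)))
    return np.array(rmses)

# toy stand-in model: ordinary least squares as the "reconstruction"
fit = lambda X, y: np.linalg.lstsq(X, y, rcond=None)[0]
predict = lambda X, params: X @ params

rng = np.random.default_rng(1)
X = np.column_stack([np.ones(120), rng.normal(size=120)])
y = X @ np.array([0.5, 1.2]) + rng.normal(scale=0.3, size=120)
print(mc_cross_validate(X, y, fit, predict).mean())
```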
Reinventing Biostatistics Education for Basic Scientists
Weissgerber, Tracey L.; Garovic, Vesna D.; Milin-Lazovic, Jelena S.; Winham, Stacey J.; Obradovic, Zoran; Trzeciakowski, Jerome P.; Milic, Natasa M.
2016-01-01
Numerous studies demonstrating that statistical errors are common in basic science publications have led to calls to improve statistical training for basic scientists. In this article, we sought to evaluate statistical requirements for PhD training and to identify opportunities for improving biostatistics education in the basic sciences. We provide recommendations for improving statistics training for basic biomedical scientists, including: 1. Encouraging departments to require statistics training, 2. Tailoring coursework to the students’ fields of research, and 3. Developing tools and strategies to promote education and dissemination of statistical knowledge. We also provide a list of statistical considerations that should be addressed in statistics education for basic scientists. PMID:27058055
Analyses on hydrophobicity and attractiveness of all-atom distance-dependent potentials
Shirota, Matsuyuki; Ishida, Takashi; Kinoshita, Kengo
2009-01-01
Accurate model evaluation is a crucial step in protein structure prediction. For this purpose, statistical potentials, which evaluate a model structure based on the observed atomic distance frequencies in comparison with those in reference states, have been widely used. The reference state is a virtual state in which all atomic interactions are turned off, and it provides a standard against which to measure the observed frequencies. In this study, we examined seven all-atom distance-dependent potentials with different reference states. We observed that variations in the atom-pair composition and in the distance distributions of the reference states produced systematic changes in the hydrophobic and attractive characteristics of the potentials. Performance evaluations with the CASP7 structures indicated that a preference for hydrophobic interactions improved the correlation between the energy and the GDT-TS score but decreased the Z-score of the native structure. The attractiveness of the potential improved both the correlation and the Z-score for template-based modeling targets, but the benefit was smaller for free modeling targets. These results indicate that the performance of the potentials was more strongly influenced by these characteristics than by the accuracy of the definitions of the reference states. PMID:19588493
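Distance-dependent statistical potentials of the kind examined here generally take the inverse-Boltzmann form E(d) = -ln(P_obs(d)/P_ref(d)); a minimal sketch for one atom-pair type, with toy data standing in for the observed and reference distance distributions.

```python
import numpy as np

def knowledge_based_potential(obs_dists, ref_dists, bins):
    """Inverse-Boltzmann statistical potential for one atom-pair type.

    E(d) = -ln(P_obs(d) / P_ref(d)), where P_obs comes from distances
    observed in native structures and P_ref from the chosen reference
    state. As discussed above, the reference state shapes the
    potential's hydrophobic and attractive character.
    """
    p_obs, _ = np.histogram(obs_dists, bins=bins, density=True)
    p_ref, _ = np.histogram(ref_dists, bins=bins, density=True)
    eps = 1e-12  # avoid log(0) in sparsely populated bins
    return -np.log((p_obs + eps) / (p_ref + eps))

bins = np.linspace(2.0, 15.0, 27)   # 0.5 A distance bins
rng = np.random.default_rng(0)
obs = rng.normal(5.0, 1.0, 5000)    # toy "observed" contact distances
ref = rng.uniform(2.0, 15.0, 5000)  # toy uniform reference state
energy = knowledge_based_potential(obs, ref, bins)  # minimum near 5 A
```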
Fault Diagnosis for Rotating Machinery Using Vibration Measurement Deep Statistical Feature Learning
Li, Chuan; Sánchez, René-Vinicio; Zurita, Grover; Cerrada, Mariela; Cabrera, Diego
2016-01-01
Fault diagnosis is important for the maintenance of rotating machinery. The detection of faults and fault patterns is a challenging part of machinery fault diagnosis. To tackle this problem, a model for deep statistical feature learning from vibration measurements of rotating machinery is presented in this paper. Vibration sensor signals collected from rotating mechanical systems are represented in the time, frequency, and time-frequency domains, each of which is then used to produce a statistical feature set. For learning statistical features, real-value Gaussian-Bernoulli restricted Boltzmann machines (GRBMs) are stacked to develop a Gaussian-Bernoulli deep Boltzmann machine (GDBM). The suggested approach is applied as a deep statistical feature learning tool for both gearbox and bearing systems. The fault classification performances in experiments using this approach are 95.17% for the gearbox, and 91.75% for the bearing system. The proposed approach is compared to such standard methods as a support vector machine, GRBM and a combination model. In experiments, the best fault classification rate was detected using the proposed model. The results show that deep learning with statistical feature extraction has an essential improvement potential for diagnosing rotating machinery faults. PMID:27322273
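A hedged sketch of the statistical-feature step: the features below are a common time-domain set for vibration signals; the paper's exact feature list and the GDBM learner itself are not reproduced here.

```python
import numpy as np
from scipy import stats

def time_domain_features(x):
    """Common statistical features of one vibration signal segment.

    A feature set like this (computed per time, frequency, and
    time-frequency domain) is the kind of input a deep statistical
    feature learner would consume.
    """
    rms = np.sqrt(np.mean(x ** 2))
    peak = np.max(np.abs(x))
    return {
        "mean": np.mean(x),
        "std": np.std(x),
        "rms": rms,
        "peak": peak,
        "crest_factor": peak / rms,     # impulsiveness
        "kurtosis": stats.kurtosis(x),  # spikiness (bearing faults)
        "skewness": stats.skew(x),
    }

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 4096)
signal = np.sin(2 * np.pi * 60 * t) + 0.3 * rng.normal(size=t.size)
print(time_domain_features(signal))
```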
Efficacy of Aquatic Treadmill Training on Gait Symmetry and Balance in Subacute Stroke Patients.
Lee, Mi Eun; Jo, Geun Yeol; Do, Hwan Kwon; Choi, Hee Eun; Kim, Woo Jin
2017-06-01
To determine the efficacy of aquatic treadmill training (ATT) as a new modality for stroke rehabilitation, by assessing changes in gait symmetry, balance function, and subjective balance confidence for the paretic and non-paretic leg in stroke patients. Twenty-one subacute stroke patients participated in 15 intervention sessions of aquatic treadmill training. The Comfortable 10-Meter Walk Test (CWT), spatiotemporal gait parameters, Berg Balance Scale (BBS), and Activities-specific Balance Confidence scale (ABC) were assessed pre- and post-interventions. From pre- to post-intervention, statistically significant improvements were observed in the CWT (0.471±0.21 to 0.558±0.23, p<0.001), BBS (39.66±8.63 to 43.80±5.21, p<0.001), and ABC (38.39±13.46 to 46.93±12.32, p<0.001). The step-length symmetry (1.017±0.25 to 0.990±0.19, p=0.720) and overall temporal symmetry (1.404±0.36 to 1.314±0.34, p=0.218) showed improvement without statistical significance. ATT improves the functional aspects of gait, including CWT, BBS and ABC, and spatiotemporal gait symmetry, though without statistical significance. Further studies are required to examine and compare the potential benefits of ATT as a new modality for stroke therapy, with other modalities.
Fine, Jason P.
2017-01-01
In studies with survival or time‐to‐event outcomes, a competing risk is an event whose occurrence precludes the occurrence of the primary event of interest. Specialized statistical methods must be used to analyze survival data in the presence of competing risks. We conducted a review of randomized controlled trials with survival outcomes that were published in high‐impact general medical journals. Of 40 studies that we identified, 31 (77.5%) were potentially susceptible to competing risks. However, in the majority of these studies, the potential presence of competing risks was not accounted for in the statistical analyses that were described. Of the 31 studies potentially susceptible to competing risks, 24 (77.4%) reported the results of a Kaplan–Meier survival analysis, while only five (16.1%) reported using cumulative incidence functions to estimate the incidence of the outcome over time in the presence of competing risks. The former approach will tend to result in an overestimate of the incidence of the outcome over time, while the latter approach will result in unbiased estimation of the incidence of the primary outcome over time. We provide recommendations on the analysis and reporting of randomized controlled trials with survival outcomes in the presence of competing risks. © 2017 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd. PMID:28102550
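The contrast between the two estimators can be made concrete: the sketch below implements a simple Aalen-Johansen cumulative incidence function, which, unlike 1 minus Kaplan-Meier with competing events treated as censoring, does not overstate the primary event's incidence. Event coding and data are illustrative.

```python
import numpy as np

def cumulative_incidence(time, event, cause=1):
    """Aalen-Johansen cumulative incidence for one cause.

    event coding: 0 = censored, 1 = primary event, 2 = competing event.
    """
    time, event = np.asarray(time, float), np.asarray(event)
    surv, cif, out = 1.0, 0.0, []
    for t in np.unique(time[event > 0]):      # event times, ascending
        at_risk = np.sum(time >= t)
        d_cause = np.sum((time == t) & (event == cause))
        d_all = np.sum((time == t) & (event > 0))
        cif += surv * d_cause / at_risk       # mass failing from `cause`
        surv *= 1.0 - d_all / at_risk         # overall event-free survival
        out.append((t, cif))
    return np.array(out)

# toy data: cause 2 competes with the primary event (cause 1)
time = [2, 3, 3, 5, 8, 9, 12, 14, 15, 16]
event = [1, 2, 1, 0, 2, 1, 0, 1, 2, 0]
print(cumulative_incidence(time, event, cause=1))
```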
Applications of spatial statistical network models to stream data
Isaak, Daniel J.; Peterson, Erin E.; Ver Hoef, Jay M.; Wenger, Seth J.; Falke, Jeffrey A.; Torgersen, Christian E.; Sowder, Colin; Steel, E. Ashley; Fortin, Marie-Josée; Jordan, Chris E.; Ruesch, Aaron S.; Som, Nicholas; Monestiez, Pascal
2014-01-01
Streams and rivers host a significant portion of Earth's biodiversity and provide important ecosystem services for human populations. Accurate information regarding the status and trends of stream resources is vital for their effective conservation and management. Most statistical techniques applied to data measured on stream networks were developed for terrestrial applications and are not optimized for streams. A new class of spatial statistical model, based on valid covariance structures for stream networks, can be used with many common types of stream data (e.g., water quality attributes, habitat conditions, biological surveys) through application of appropriate distributions (e.g., Gaussian, binomial, Poisson). The spatial statistical network models account for spatial autocorrelation (i.e., nonindependence) among measurements, which allows their application to databases with clustered measurement locations. Large amounts of stream data exist in many areas where spatial statistical analyses could be used to develop novel insights, improve predictions at unsampled sites, and aid in the design of efficient monitoring strategies at relatively low cost. We review the topic of spatial autocorrelation and its effects on statistical inference, demonstrate the use of spatial statistics with stream datasets relevant to common research and management questions, and discuss additional applications and development potential for spatial statistics on stream networks. Free software for implementing the spatial statistical network models has been developed that enables custom applications with many stream databases.
Trends in Child Poverty Using an Improved Measure of Poverty.
Wimer, Christopher; Nam, JaeHyun; Waldfogel, Jane; Fox, Liana
2016-04-01
The official measure of poverty has been used to assess trends in children's poverty rates for many decades. But because of flaws in official poverty statistics, these basic trends have the potential to be misleading. We use an augmented Current Population Survey data set that calculates an improved measure of poverty to reexamine child poverty rates between 1967 and 2012. This measure, the Anchored Supplemental Poverty Measure, is based partially on the US Census Bureau and Bureau of Labor Statistics' new Supplemental Poverty Measure. We focus on 3 age groups of children, those aged 0 to 5, 6 to 11, and 12 to 17 years. Young children have the highest poverty rates, both historically and today. However, among all age groups, long-term poverty trends have been more favorable than official statistics would suggest. This is entirely due to the effect of counting resources from government policies and programs, which have reduced poverty rates substantially for children of all ages. However, despite this progress, considerable disparities in the risk of poverty continue to exist by education level and family structure. Copyright © 2016 Academic Pediatric Association. Published by Elsevier Inc. All rights reserved.
Sakaguchi, Hideo
2014-06-01
Oral function improvement programs utilizing health behavior theories are considered to be effective in preventing the need for long-term social care. In the present study, an oral function improvement program based upon health behavior theories was designed, and its utility was assessed in 102 pre-frail elderly persons (33 males, 69 females, mean age: 76.9 +/- 5.7) considered to be in potential need of long-term social care and attending a long-term care prevention class in Sayama City, Saitama Prefecture, Japan. The degree of improvement in oral functions (7 items) and oral hygienic conditions (3 items) was assessed by comparing oral health before and after participation in the program. The results showed statistically significant improvements in the following oral functions: (1) lip functions (oral diadochokinesis, measured by the regularity of the repetition of the syllable "Pa"), (2) tongue functions, (3) tongue root motor skills (oral diadochokinesis, measured by the regularity of the repetition of the syllables "Ta" and "Ka"), (4) tongue extension/retraction, (5) side-to-side tongue movement functions, (6) cheek motor skills, and (7) repetitive saliva swallowing test (RSST). The following measures of oral hygiene also showed a statistically significant improvement: (1) debris on dentures or teeth, (2) coated tongue, and (3) frequency of oral cleaning. These findings demonstrated that an improvement program informed by health behavior theories is useful in improving oral functions and oral hygiene conditions.
Incorporation of operator knowledge for improved HMDS GPR classification
NASA Astrophysics Data System (ADS)
Kennedy, Levi; McClelland, Jessee R.; Walters, Joshua R.
2012-06-01
The Husky Mine Detection System (HMDS) detects and alerts operators to potential threats observed in ground-penetrating RADAR (GPR) data. In the current system architecture, the classifiers have been trained using available data from multiple training sites. Changes in target types, clutter types, and operational conditions may result in statistical differences between the training data and the testing data for the underlying features used by the classifier, potentially resulting in an increased false alarm rate or a lower probability of detection for the system. In the current mode of operation, the automated detection system alerts the human operator when a target-like object is detected. The operator then uses data visualization software, contextual information, and human intuition to decide whether the alarm presented is an actual target or a false alarm. When the statistics of the training data and the testing data are mismatched, the automated detection system can overwhelm the analyst with an excessive number of false alarms; this is evident in the performance of, and the data collected from, deployed systems. This work demonstrates that analyst feedback can be successfully used to re-train a classifier to account for variable testing data statistics not originally captured in the initial training data.
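A minimal sketch of such a retraining loop, assuming scikit-learn and synthetic features (the HMDS feature set and classifier are not public): analyst-adjudicated alarms drawn from the shifted deployment distribution are appended to the training set before refitting.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# original training-site data: features extracted from GPR alarms
X_train = rng.normal(0.0, 1.0, (500, 8))
y_train = (X_train[:, 0] + 0.5 * X_train[:, 1] > 0.8).astype(int)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)

# deployment: feature statistics shift (new clutter, new conditions);
# the analyst adjudicates each alarm as target (1) or false alarm (0)
X_deploy = rng.normal(0.4, 1.2, (200, 8))
y_feedback = (X_deploy[:, 0] + 0.5 * X_deploy[:, 1] > 0.8).astype(int)

# re-train on the union of original data and analyst-labeled alarms
X_new = np.vstack([X_train, X_deploy])
y_new = np.concatenate([y_train, y_feedback])
clf_updated = RandomForestClassifier(n_estimators=200, random_state=0)
clf_updated.fit(X_new, y_new)
```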
Public health information and statistics dissemination efforts for Indonesia on the Internet.
Hanani, Febiana; Kobayashi, Takashi; Jo, Eitetsu; Nakajima, Sawako; Oyama, Hiroshi
2011-01-01
To elucidate current issues related to health statistics dissemination efforts on the Internet in Indonesia and to propose a new dissemination website as a solution. A cross-sectional survey was conducted. Sources of statistics were identified using link relationships and Google™ search. For each site, we assessed the menus used to locate statistics, the modes of presentation and means of access to statistics, and the statistics available. Assessment results were used to derive a design specification; a prototype system was developed and evaluated with a usability test. Forty-nine sources were identified on 18 governmental, 8 international, and 5 non-governmental websites. Of the 49 menus identified, 33% used non-intuitive titles that led to inefficient searches; 69% of these were on government websites. Of the 31 websites, only 39% used graphs/charts and only 23% used maps for presentation. Further, only 32%, 39%, and 19% provided query, export, and print features, respectively. While >50% of sources reported morbidity, risk factor, and service provision statistics, <40% reported health resource and mortality statistics. A statistics portal website was developed using the Joomla!™ content management system. A usability test demonstrated its potential to improve data accessibility. Government efforts to disseminate statistics in Indonesia are supported by non-governmental and international organizations, but the existing information may not be very useful because it is: a) not widely distributed, b) difficult to locate, and c) not effectively communicated. Actions are needed to ensure information usability, and one such action is the development of a statistics portal website.
Soft-tissue imaging with C-arm cone-beam CT using statistical reconstruction
NASA Astrophysics Data System (ADS)
Wang, Adam S.; Webster Stayman, J.; Otake, Yoshito; Kleinszig, Gerhard; Vogt, Sebastian; Gallia, Gary L.; Khanna, A. Jay; Siewerdsen, Jeffrey H.
2014-02-01
The potential for statistical image reconstruction methods such as penalized-likelihood (PL) to improve C-arm cone-beam CT (CBCT) soft-tissue visualization for intraoperative imaging over conventional filtered backprojection (FBP) is assessed in this work by making a fair comparison in relation to soft-tissue performance. A prototype mobile C-arm was used to scan anthropomorphic head and abdomen phantoms as well as a cadaveric torso at doses substantially lower than typical values in diagnostic CT, and the effects of dose reduction via tube current reduction and sparse sampling were also compared. Matched spatial resolution between PL and FBP was determined by the edge spread function of low-contrast (~40-80 HU) spheres in the phantoms, which were representative of soft-tissue imaging tasks. PL using the non-quadratic Huber penalty was found to substantially reduce noise relative to FBP, especially at lower spatial resolution where PL provides a contrast-to-noise ratio increase up to 1.4-2.2x over FBP at 50% dose reduction across all objects. Comparison of sampling strategies indicates that soft-tissue imaging benefits from fully sampled acquisitions at dose above ~1.7 mGy and benefits from 50% sparsity at dose below ~1.0 mGy. Therefore, an appropriate sampling strategy along with the improved low-contrast visualization offered by statistical reconstruction demonstrates the potential for extending intraoperative C-arm CBCT to applications in soft-tissue interventions in neurosurgery as well as thoracic and abdominal surgeries by overcoming conventional tradeoffs in noise, spatial resolution, and dose.
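A toy 1D analogue of penalized-likelihood estimation with a Huber penalty; a real CBCT reconstruction would use the projection operator and Poisson statistics, so everything below is an illustrative simplification.

```python
import numpy as np

def huber_grad(t, delta):
    # gradient of the Huber penalty: quadratic near zero, linear in the
    # tails, so noise is smoothed while sharp tissue edges survive
    return np.where(np.abs(t) <= delta, t, delta * np.sign(t))

def pl_denoise(y, beta=2.0, delta=0.05, n_iter=300, step=0.1):
    """Toy penalized-likelihood estimate for a 1D signal y:
    minimize 0.5*||x - y||^2 + beta * sum Huber(x[i+1] - x[i])."""
    x = y.copy()
    for _ in range(n_iter):
        g = huber_grad(np.diff(x), delta)   # penalty gradient on differences
        pen = np.zeros_like(x)
        pen[:-1] -= g                       # d(diff)/dx_i   = -1
        pen[1:] += g                        # d(diff)/dx_i+1 = +1
        x -= step * ((x - y) + beta * pen)  # gradient-descent update
    return x

rng = np.random.default_rng(0)
truth = np.repeat([0.0, 0.5, 0.2], 50)      # piecewise-constant "tissue"
noisy = truth + rng.normal(0, 0.08, truth.size)
recon = pl_denoise(noisy)                   # lower noise, edges preserved
```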
NASA Astrophysics Data System (ADS)
Walz, M. A.; Donat, M.; Leckebusch, G. C.
2017-12-01
As extreme wind speeds are responsible for large socio-economic losses in Europe, skillful predictions would be of great benefit for disaster prevention as well as for the actuarial community. Here we evaluate patterns of large-scale atmospheric variability and the seasonal predictability of extreme wind speeds (e.g. >95th percentile) in the European domain in the dynamical seasonal forecast system ECMWF System 4, and compare them to the predictability based on a statistical prediction model. The dominant patterns of atmospheric variability show distinct differences between reanalysis and ECMWF System 4, with most patterns in System 4 extended downstream in comparison to ERA-Interim. The dissimilar manifestations of the patterns within the two models lead to substantially different drivers associated with the occurrence of extreme winds in the respective model. While ECMWF System 4 is shown to provide some predictive power over Scandinavia and the eastern Atlantic, only very few grid cells in the European domain have significant correlations for extreme wind speeds in System 4 compared to ERA-Interim. In contrast, a statistical model predicts extreme wind speeds during boreal winter in better agreement with the observations. Our results suggest that System 4 does not capture the potential predictability of extreme winds that exists in the real world and therefore fails to provide reliable seasonal predictions for lead months 2-4. This is likely related to the unrealistic representation of large-scale patterns of atmospheric variability. Hence our study points to potential improvements in dynamical prediction skill through improved simulation of large-scale atmospheric dynamics.
Omnibus Risk Assessment via Accelerated Failure Time Kernel Machine Modeling
Sinnott, Jennifer A.; Cai, Tianxi
2013-01-01
Integrating genomic information with traditional clinical risk factors to improve the prediction of disease outcomes could profoundly change the practice of medicine. However, the large number of potential markers and possible complexity of the relationship between markers and disease make it difficult to construct accurate risk prediction models. Standard approaches for identifying important markers often rely on marginal associations or linearity assumptions and may not capture non-linear or interactive effects. In recent years, much work has been done to group genes into pathways and networks. Integrating such biological knowledge into statistical learning could potentially improve model interpretability and reliability. One effective approach is to employ a kernel machine (KM) framework, which can capture nonlinear effects if nonlinear kernels are used (Scholkopf and Smola, 2002; Liu et al., 2007, 2008). For survival outcomes, KM regression modeling and testing procedures have been derived under a proportional hazards (PH) assumption (Li and Luan, 2003; Cai et al., 2011). In this paper, we derive testing and prediction methods for KM regression under the accelerated failure time model, a useful alternative to the PH model. We approximate the null distribution of our test statistic using resampling procedures. When multiple kernels are of potential interest, it may be unclear in advance which kernel to use for testing and estimation. We propose a robust Omnibus Test that combines information across kernels, and an approach for selecting the best kernel for estimation. The methods are illustrated with an application in breast cancer. PMID:24328713
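The flavor of kernel machine estimation can be sketched with kernel ridge regression on uncensored data; the paper's accelerated failure time formulation with censoring and resampling-based testing is considerably more involved, and the kernels below are illustrative choices only.

```python
import numpy as np

def gaussian_kernel(X, Z, rho=1.0):
    """Gaussian (RBF) kernel matrix; captures non-linear marker effects."""
    sq = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq / rho)

def km_fit_predict(K, y, lam=1.0):
    """Kernel machine (kernel ridge) estimate: alpha = (K + lam*I)^-1 y."""
    n = len(y)
    alpha = np.linalg.solve(K + lam * np.eye(n), y)
    return K @ alpha

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))                      # gene-expression markers
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=100)   # non-linear signal

fits = {}
for name, K in {"linear": X @ X.T,
                "gaussian": gaussian_kernel(X, X, rho=5.0)}.items():
    f = km_fit_predict(K, y, lam=1.0)
    fits[name] = np.mean((f - y) ** 2)             # in-sample fit per kernel
print(fits)  # an omnibus procedure would combine evidence across kernels
```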
Tanaka, H; Hamatsu, T; Mori, K
2017-01-01
Potential fecundity models of walleye pollock (Alaska pollock) Gadus chalcogrammus in the Pacific waters off Hokkaido, Japan, were developed. They were compared using a generalized linear model with either standard body length (L_S) or total body mass (M_T) as the main covariate, along with Fulton's condition factor (K) and mean oocyte diameter (D_O) as additional potential covariates to account for maternal condition and maturity stage. The results of model selection showed that M_T was a better single predictor of potential fecundity (F_P) than L_S. The biological importance of K for F_P was unclear, because it was statistically significant when used in the predictor with L_S (i.e. the length-based model), but not when used with M_T (i.e. the mass-based model). Meanwhile, D_O was statistically significant in both the length- and mass-based models, suggesting the importance of down-regulation of oocyte number with advancing maturation. Among all candidate models, the model with M_T and D_O in the predictor had the lowest Akaike's information criterion value, suggesting better predictive power. These newly developed models will improve future comparisons of potential fecundity within and among stocks by excluding potential biases other than body size. © 2016 The Fisheries Society of the British Isles.
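A hedged sketch of a mass-based GLM with M_T and D_O as covariates, fit to simulated data; the Poisson family with log link is an assumption for illustration, not necessarily the paper's error structure.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 150  # hypothetical females
df = pd.DataFrame({
    "M_T": rng.uniform(300, 1500, n),  # total body mass (g)
    "D_O": rng.uniform(0.4, 1.0, n),   # mean oocyte diameter (mm)
})
# simulate potential fecundity rising with mass, falling with maturation
mu = np.exp(8.0 + 0.0015 * df.M_T - 1.2 * df.D_O)
df["F_P"] = rng.poisson(mu)

# mass-based model with oocyte diameter, log link
model = smf.glm("F_P ~ M_T + D_O", data=df,
                family=sm.families.Poisson()).fit()
print(model.summary())
# candidate models could be compared via AIC, e.g. model.aic
```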
Starr, P; Starr, S
1995-01-01
Vital statistics offers a case study in the potential of new information technology and reengineering to achieve better public sector performance. New technology--notably the shift from a paper to an electronic process for recording vital events and transmitting the data to public agencies--is creating opportunities to produce more timely, accurate, and useful information. The furthest advanced innovation is the electronic birth certificate. At the same time, changes in welfare policy and health care--including efforts to establish paternity at the time of birth and to improve health care outcomes--are creating pressures for more policy-relevant data about vital events. In addition, the rise of integrated health plans and health information networks is radically altering the organizational context of vital statistics. On the basis of a State-by-State survey of vital statistics officials, the authors estimate that at the end of 1994, 58 percent of all births in the United States were being recorded on an electronic birth certificate and communicated to a public agency electronically. Nearly all respondents reported that the electronic birth certificate brought improvements in both timeliness and accuracy of data. Achieving the full promise of the new technology, however, will require more fundamental changes in institutions and policies and a reconceptualization of the birth certificate as part of a broader perinatal information system.
Skill of Global Raw and Postprocessed Ensemble Predictions of Rainfall over Northern Tropical Africa
NASA Astrophysics Data System (ADS)
Vogel, Peter; Knippertz, Peter; Fink, Andreas H.; Schlueter, Andreas; Gneiting, Tilmann
2018-04-01
Accumulated precipitation forecasts are of high socioeconomic importance for agriculturally dominated societies in northern tropical Africa. In this study, we analyze the performance of nine operational global ensemble prediction systems (EPSs) relative to climatology-based forecasts for 1- to 5-day accumulated precipitation, based on the monsoon seasons 2007-2014, for three regions within northern tropical Africa. To assess the full potential of raw ensemble forecasts across spatial scales, we apply state-of-the-art statistical postprocessing methods in the form of Bayesian Model Averaging (BMA) and Ensemble Model Output Statistics (EMOS), and verify against station and spatially aggregated, satellite-based gridded observations. Raw ensemble forecasts are uncalibrated, unreliable, and underperform relative to climatology, independently of region, accumulation time, monsoon season, and ensemble. Differences between raw ensemble and climatological forecasts are large and partly stem from poor prediction of low precipitation amounts. BMA and EMOS postprocessed forecasts are calibrated, reliable, and strongly improve on the raw ensembles but - somewhat disappointingly - typically do not outperform climatology. Most EPSs exhibit slight improvements over the period 2007-2014 but overall have little added value compared to climatology. We suspect that the parametrization of convection is a potential cause of the sobering lack of ensemble forecast skill in a region dominated by mesoscale convective systems.
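A minimal Gaussian EMOS sketch: the parameters of N(a + b*mean, c + d*var) are fit by minimizing the average closed-form CRPS. Precipitation EMOS in practice uses censored or shifted distributions, so this synthetic example shows only the calibration idea.

```python
import numpy as np
from scipy import optimize, stats

def crps_normal(mu, sigma, y):
    """Closed-form CRPS of a normal forecast (lower is better)."""
    z = (y - mu) / sigma
    return sigma * (z * (2 * stats.norm.cdf(z) - 1)
                    + 2 * stats.norm.pdf(z) - 1 / np.sqrt(np.pi))

def fit_emos(ens_mean, ens_var, y):
    """Gaussian EMOS: N(a + b*mean, c + d*var), parameters chosen to
    minimize the average CRPS on training forecast/observation pairs."""
    def loss(p):
        a, b, c, d = p
        mu = a + b * ens_mean
        sigma = np.sqrt(np.maximum(c + d * ens_var, 1e-6))
        return crps_normal(mu, sigma, y).mean()
    res = optimize.minimize(loss, x0=[0.0, 1.0, 1.0, 1.0],
                            method="Nelder-Mead")
    return res.x

rng = np.random.default_rng(0)
truth = rng.gamma(2.0, 2.0, 500)                        # pseudo-observations
ens = truth[:, None] + rng.normal(1.0, 2.0, (500, 20))  # biased ensemble
params = fit_emos(ens.mean(1), ens.var(1), truth)
print(params)  # corrects the +1 bias and rescales the spread
```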
Investigation of improving MEMS-type VOA reliability
NASA Astrophysics Data System (ADS)
Hong, Seok K.; Lee, Yeong G.; Park, Moo Y.
2003-12-01
MEMS technologies have been applied in many areas, such as optical communications, gyroscopes, and bio-medical components. In the optical communication field, MEMS technologies are essential, especially in multi-dimensional optical switches and Variable Optical Attenuators (VOAs). This paper describes the process of developing MEMS-type VOAs with good optical performance and improved reliability. Generally, MEMS VOAs have been fabricated using a silicon micro-machining process, precise fibre alignment, and a sophisticated packaging process. Because the device is composed of many structures with various materials, it is difficult to make it reliable. We developed MEMS-type VOAs with failure modes considered from the initial design step (FMEA: Failure Mode Effect Analysis), predicted critical failure factors and revised the design accordingly, and confirmed the reliability by preliminary testing. The predicted failure factors were moisture, the bonding strength of the wire between the MEMS chip and the TO-CAN, and instability of the supplied signals. Statistical quality control tools (ANOVA, t tests, and so on) were used to control these potential failure factors and to derive optimum manufacturing conditions. In summary, we have successfully developed reliable MEMS-type VOAs with good optical performance by controlling potential failure factors using statistical quality control tools. The developed VOAs passed the international reliability standard Telcordia GR-1221-CORE.
Comparing estimates of climate change impacts from process-based and statistical crop models
NASA Astrophysics Data System (ADS)
Lobell, David B.; Asseng, Senthold
2017-01-01
The potential impacts of climate change on crop productivity are of widespread interest to those concerned with addressing climate change and improving global food security. Two common approaches to assess these impacts are process-based simulation models, which attempt to represent key dynamic processes affecting crop yields, and statistical models, which estimate functional relationships between historical observations of weather and yields. Examples of both approaches are increasingly found in the scientific literature, although often published in different disciplinary journals. Here we compare published sensitivities to changes in temperature, precipitation, carbon dioxide (CO2), and ozone from each approach for the subset of crops, locations, and climate scenarios for which both have been applied. Despite a common perception that statistical models are more pessimistic, we find no systematic differences between the predicted sensitivities to warming from process-based and statistical models up to +2 °C, with limited evidence at higher levels of warming. For precipitation, there are many reasons why estimates could be expected to differ, but few estimates exist to develop robust comparisons, and precipitation changes are rarely the dominant factor for predicting impacts given the prominent role of temperature, CO2, and ozone changes. A common difference between process-based and statistical studies is that the former tend to include the effects of CO2 increases that accompany warming, whereas statistical models typically do not. Major needs moving forward include incorporating CO2 effects into statistical studies, improving both approaches’ treatment of ozone, and increasing the use of both methods within the same study. At the same time, those who fund or use crop model projections should understand that in the short-term, both approaches when done well are likely to provide similar estimates of warming impacts, with statistical models generally requiring fewer resources to produce robust estimates, especially when applied to crops beyond the major grains.
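A minimal sketch of the statistical-model approach discussed here: log yield regressed on growing-season temperature and precipitation. The data and the warming sensitivity below are simulated for illustration only.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 300  # hypothetical county-years
df = pd.DataFrame({
    "temp": rng.normal(22, 2, n),       # growing-season mean temp (C)
    "precip": rng.normal(500, 100, n),  # growing-season precip (mm)
})
# simulate log yield with a negative warming sensitivity
df["log_yield"] = (1.0 - 0.05 * (df.temp - 22)
                   + 0.0004 * (df.precip - 500)
                   + rng.normal(0, 0.1, n))

m = smf.ols("log_yield ~ temp + precip", df).fit()
# implied yield sensitivity to +2 C warming, the kind of quantity
# compared across statistical and process-based models in the text
print(f"{100 * 2 * m.params['temp']:.1f}% yield change per +2 C")
```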
Using ontology network structure in text mining.
Berndt, Donald J; McCart, James A; Luther, Stephen L
2010-11-13
Statistical text mining treats documents as bags of words, with a focus on term frequencies within documents and across document collections. Unlike natural language processing (NLP) techniques that rely on an engineered vocabulary or a full-featured ontology, statistical approaches do not make use of domain-specific knowledge. The freedom from biases can be an advantage, but at the cost of ignoring potentially valuable knowledge. The approach proposed here investigates a hybrid strategy based on computing graph measures of term importance over an entire ontology and injecting the measures into the statistical text mining process. As a starting point, we adapt existing search engine algorithms such as PageRank and HITS to determine term importance within an ontology graph. The graph-theoretic approach is evaluated using a smoking data set from the i2b2 National Center for Biomedical Computing, cast as a simple binary classification task for categorizing smoking-related documents, demonstrating consistent improvements in accuracy.
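A hedged sketch of the graph-measure step using networkx on a toy ontology fragment; the smoking-related concepts and the simple frequency scaling are illustrative assumptions, not the authors' pipeline.

```python
import networkx as nx

# toy ontology fragment: edges point from broader to narrower concepts
G = nx.DiGraph([
    ("substance_use", "tobacco"), ("tobacco", "smoking"),
    ("smoking", "cigarette"), ("smoking", "smoker"),
    ("substance_use", "alcohol"), ("alcohol", "drinking"),
])

# graph-theoretic term importance over the whole ontology
pagerank = nx.pagerank(G, alpha=0.85)
hubs, authorities = nx.hits(G)

# inject importance into the bag-of-words weights: a term's frequency
# is scaled by its ontology-derived score before classification
term_freqs = {"smoking": 4, "cigarette": 2, "drinking": 1}
weighted = {t: f * pagerank.get(t, min(pagerank.values()))
            for t, f in term_freqs.items()}
print(weighted)
```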
Do telemedical interventions improve quality of life in patients with COPD? A systematic review
Gregersen, Thorbjørn L; Green, Allan; Frausing, Ejvind; Ringbæk, Thomas; Brøndum, Eva; Suppli Ulrik, Charlotte
2016-01-01
Objective Telehealth is an approach to disease management that may hold the potential to improve some of the features associated with COPD, including a positive impact on disease progression, and thus possibly limit further reduction in quality of life (QoL). Our objective was, therefore, to summarize studies addressing the impact of telehealth on QoL in patients with COPD. Design Systematic review. Methods A series of systematic searches were carried out using the following databases: PubMed, EMBASE, Cochrane Controlled Trials Register, and ClinicalTrials.gov (last updated November 2015). A predefined search algorithm was utilized with the intention to capture all results related to COPD, QoL, and telehealth published since the year 2000. Outcome measures Primary outcome was QoL, assessed by validated measures. Results Of the 18 studies fulfilling the criteria for inclusion in this review, three found statistically significant improvements in QoL for patients allocated to telemedical interventions. All of the other included studies found no statistically significant differences between control and telemedical intervention groups in terms of QoL. Conclusion Telehealth does not make a strong case for itself when looking exclusively at QoL as an outcome, since statistically significant improvements relative to control groups have been observed in only a few of the available studies. Nonetheless, this does not rule out the possibility that telehealth is superior to standard care with regard to other outcomes, and it seems to call for more research, not least in large-scale controlled trials. PMID:27143872
Arthroscopic lysis of adhesions for the stiff total knee: results after failed manipulation.
Tjoumakaris, Fotios Paul; Tucker, Bradford Schofield; Post, Zachary; Pepe, Matthew David; Orozco, Fabio; Ong, Alvin C
2014-05-01
Arthrofibrosis after total knee arthroplasty (TKA) is a potentially devastating complication, resulting in loss of motion and function and residual pain. For patients in whom aggressive physical therapy and manipulation under anesthesia fail, lysis of adhesions may be the only option to rescue the stiff TKA. The purpose of this study is to report the results of arthroscopic lysis of adhesions after failed manipulation for a stiff, cruciate-substituting TKA. This retrospective study evaluated patients who had undergone arthroscopic lysis of adhesions for arthrofibrosis after TKA between 2007 and 2011. Minimum follow-up was 12 months (average, 31 months). Average total range of motion of patients in this series was 62.3°. Average preoperative flexion contracture was 16° and average flexion was 78.6°. Statistical analysis was performed using Student's t test. Pre- to postoperative increase in range of motion was significant (P<.001) (average, 62° preoperatively to 98° postoperatively). Average preoperative extension deficit was 16°, which was reduced to 4° at final follow-up. This value was also found to be statistically significant (P<.0001). With regard to ultimate flexion attained, average preoperative flexion was 79°, which was improved to 103° at final follow-up. This improvement in flexion was statistically significant (P<.0001). Patients can reliably expect an improvement after arthroscopic lysis of adhesions for a stiff TKA using a standardized arthroscopic approach; however, patients achieved approximately half of the improvement that was obtained at the time of surgery. Copyright 2014, SLACK Incorporated.
Wolstenholme, Daniel; Downes, Tom; Leaver, Jackie; Partridge, Rebecca; Langley, Joseph
2014-01-01
Advances in surgical and medical management have significantly reduced the length of time that patients with spinal cord injury (SCI) have to stay in hospital, but have left patients with potentially less time to adjust psychologically. Following a pilot in 2012, this project was designed to test the effect of "design thinking" workshops on the self-efficacy of people undergoing rehabilitation following spinal injuries. Design thinking is about understanding the approaches and methods that designers use and then applying these to think creatively about problems and suggest ways to solve them. In this instance, design thinking is not about designing new products (although the approaches can be used to do this) but about developing a long-term creative and explorative mind-set through skills such as lateral thinking, prototyping, and verbal and visual communication. The principles of design thinking have underpinned design education and practice for many years and are also recognised in business and innovation, but a literature review indicated that there was no evidence of their use in rehabilitation or spinal injury settings. Twenty participants took part in the study; 13 (65%) were male, and the average age was 37 years (range 16 to 72). Statistically significant improvements were seen for EQ-5D score (t = -3.13, p = 0.007) and Patient Activation Measure score (t = -3.85, p = 0.001). Other outcome measures improved, but not statistically significantly. There were no statistical effects on length of stay or readmission rates, but qualitative interviews indicated improved patient experience.
NASA Astrophysics Data System (ADS)
Hendikawati, P.; Dewi, N. R.
2017-04-01
Statistics is needed in the data analysis process and is widely applied in daily life, so students must master statistical material well. The use of a Statistics textbook supported by ICT and a portfolio assessment approach was expected to help students improve their mathematical connection skills. The subjects of this research were 30 student teachers taking Statistics courses. The results of this research show that the use of a Statistics textbook supported by ICT and a portfolio assessment approach can improve students' mathematical connection skills.
Change in perceived psychosocial status following a 12-week Tai Chi exercise programme.
Taylor-Piliae, Ruth E; Haskell, William L; Waters, Catherine M; Froelicher, Erika Sivarajan
2006-05-01
This paper reports a study to examine change in psychosocial status following a 12-week Tai Chi exercise intervention among ethnic Chinese people with cardiovascular disease risk factors living in the United States of America. Regular participation in physical activity is associated with protection against cardiovascular disease and improvements in physical and psychological health. A growing body of scientific evidence suggests that mind-body exercise, such as Tai Chi, is related to improvements in mental health, emotional well-being, and stress reduction. No prior study has examined the effect of a Tai Chi exercise intervention on psychosocial status among people with cardiovascular disease risk factors. This was a quasi-experimental study. Participants attended a 60-minute Tai Chi exercise class three times per week for 12 weeks. Data were collected at baseline and at 6 and 12 weeks after the start of the intervention. Psychosocial status was assessed using Chinese versions of Cohen's Perceived Stress Scale, the Profile of Mood States, the Multidimensional Scale of Perceived Social Support, and a Tai Chi exercise self-efficacy scale. A total of 39 participants took part; on average they were 66 years old (+/-8.3), married (85%), Cantonese-speaking (97%) immigrants. The majority were women (69%), with < or = 12 years of education (87%). Statistically significant improvements in all measures of psychosocial status were found (P < or = 0.05) following the intervention. Improvement in mood state (eta2 = 0.12) and reduction in perceived stress (eta2 = 0.13) were found. In addition, Tai Chi exercise statistically significantly increased self-efficacy to overcome barriers to Tai Chi (eta2 = 0.19), confidence to perform Tai Chi (eta2 = 0.27), and perceived social support (eta2 = 0.12). Tai Chi was a culturally appropriate mind-body exercise for these older adults, with statistically significant psychosocial benefits observed over 12 weeks. Further research examining Tai Chi exercise using a randomized clinical trial design with an attention-control group may reduce potential confounding effects, while exploring potential mechanisms underlying the relaxation response associated with mind-body exercise. In addition, future studies with people with other chronic illnesses in all ethnic groups are recommended to determine if similar benefits can be achieved.
NASA Astrophysics Data System (ADS)
Gagnon, Pieter; Margolis, Robert; Melius, Jennifer; Phillips, Caleb; Elmore, Ryan
2018-02-01
We provide a detailed estimate of the technical potential of rooftop solar photovoltaic (PV) electricity generation throughout the contiguous United States. This national estimate is based on an analysis of select US cities that combines light detection and ranging (lidar) data with a validated analytical method for determining rooftop PV suitability employing geographic information systems. We use statistical models to extend this analysis to estimate the quantity and characteristics of roofs in areas not covered by lidar data. Finally, we model PV generation for all rooftops to yield technical potential estimates. At the national level, 8.13 billion m2 of suitable roof area could host 1118 GW of PV capacity, generating 1432 TWh of electricity per year. This would equate to 38.6% of the electricity that was sold in the contiguous United States in 2013. This estimate is substantially higher than a previous estimate made by the National Renewable Energy Laboratory. The difference can be attributed to increases in PV module power density, improved estimation of building suitability, higher estimates of total number of buildings, and improvements in PV performance simulation tools that previously tended to underestimate productivity. Also notable, the nationwide percentage of buildings suitable for at least some PV deployment is high—82% for buildings smaller than 5000 ft2 and over 99% for buildings larger than that. In most states, rooftop PV could enable small, mostly residential buildings to offset the majority of average household electricity consumption. Even in some states with a relatively poor solar resource, such as those in the Northeast, the residential sector has the potential to offset around 100% of its total electricity consumption with rooftop PV.
Dalton, Kieran; O'Brien, Gary; O'Mahony, Denis; Byrne, Stephen
2018-06-08
Computerised interventions have been suggested as an effective strategy to reduce potentially inappropriate prescribing (PIP) for hospitalised older adults. This systematic review and meta-analysis examined the evidence for efficacy of computerised interventions designed to reduce PIP in this patient group. An electronic literature search was conducted using eight databases up to October 2017. Included studies were controlled trials of computerised interventions aiming to reduce PIP in hospitalised older adults (≥65 years). Risk of bias was assessed using Cochrane's Effective Practice and Organisation of Care criteria. Of 653 records identified, eight studies were included: two randomised controlled trials, two interrupted time series analysis studies, and four controlled before-after studies. Included studies were mostly at a low risk of bias. Overall, seven studies showed either a statistically significant reduction in the proportion of patients prescribed a potentially inappropriate medicine (PIM) (absolute risk reduction {ARR} 1.3-30.1%) or in PIMs ordered (ARR 2-5.9%). However, there is insufficient evidence thus far to suggest that these interventions can routinely improve patient-related outcomes. It was only possible to include three studies in the meta-analysis, which demonstrated that intervention patients were less likely to be prescribed a PIM (odds ratio 0.6; 95% CI 0.38, 0.93). No computerised intervention targeting potential prescribing omissions (PPOs) was identified. This systematic review concludes that computerised interventions are capable of statistically significantly reducing PIMs in hospitalised older adults. Future interventions should strive to target both PIMs and PPOs, ideally demonstrating both cost-effectiveness data and clinically significant improvements in patient-related outcomes.
Lim, Kyungjae; Kwon, Heejin; Cho, Jinhan; Oh, Jongyoung; Yoon, Seongkuk; Kang, Myungjin; Ha, Dongho; Lee, Jinhwa; Kang, Eunju
2015-01-01
The purpose of this study was to assess the image quality of a novel advanced iterative reconstruction (IR) method called "adaptive statistical IR V" (ASIR-V) by comparing its image noise, contrast-to-noise ratio (CNR), and spatial resolution with those of filtered back projection (FBP) and adaptive statistical IR (ASIR) on computed tomography (CT) phantom images. We performed CT scans at 5 different tube currents (50, 70, 100, 150, and 200 mA) using 3 types of CT phantoms. Scanned images were subsequently reconstructed with 7 different settings: FBP and 3 levels (30%, 50%, and 70%) each of ASIR and ASIR-V. Image noise was measured in the first study using a body phantom, CNR was measured in the second study using a contrast phantom, and spatial resolution was measured in the third study using a high-resolution phantom. We compared the image noise, CNR, and spatial resolution among the 7 reconstruction settings to determine whether noise reduction, high CNR, and high spatial resolution could be achieved with ASIR-V. Quantitative analysis in the first and second studies showed that images reconstructed using ASIR-V had reduced image noise and improved CNR compared with those of FBP and ASIR (P < 0.001). Qualitative analysis in the third study likewise showed that images reconstructed using ASIR-V had significantly better spatial resolution than those of FBP and ASIR (P < 0.001). Our phantom studies showed that ASIR-V provides a significant reduction in image noise and a significant improvement in CNR as well as spatial resolution. Therefore, this technique has the potential to reduce the radiation dose further without compromising image quality.
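For reference, the noise and CNR figures of merit used in phantom studies of this kind have simple operational definitions: noise is the standard deviation of voxel values within a uniform region of interest (ROI), and CNR contrasts a signal ROI against a background ROI relative to the background noise. The sketch below computes both on a synthetic slice; the ROI positions, sizes, and image values are illustrative assumptions, not this study's protocol.

```python
import numpy as np

def roi_stats(image, y, x, size):
    """Mean and standard deviation inside a square region of interest."""
    roi = image[y:y + size, x:x + size]
    return roi.mean(), roi.std(ddof=1)

def cnr(image, signal_roi, background_roi, size=20):
    """Contrast-to-noise ratio between a signal ROI and a background ROI."""
    mu_s, _ = roi_stats(image, *signal_roi, size)
    mu_b, sd_b = roi_stats(image, *background_roi, size)
    return abs(mu_s - mu_b) / sd_b

# Synthetic stand-in for a reconstructed phantom slice.
rng = np.random.default_rng(0)
slice_img = rng.normal(40.0, 10.0, size=(256, 256))  # background HU + noise
slice_img[100:140, 100:140] += 60.0                  # contrast insert

noise = roi_stats(slice_img, 10, 10, 20)[1]          # SD in a uniform region
print(f"image noise (SD): {noise:.1f} HU")
print(f"CNR: {cnr(slice_img, (110, 110), (10, 10)):.1f}")
```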
On the Use of Ocean Dynamic Temperature for Hurricane Intensity Forecasting
DOE Office of Scientific and Technical Information (OSTI.GOV)
Balaguru, Karthik; Foltz, Gregory R.; Leung, L. Ruby
Sea surface temperature (SST) and the Tropical Cyclone Heat Potential (TCHP) are metrics used to incorporate the ocean's influence on hurricane intensification in the National Hurricane Center's Statistical Hurricane Intensity Prediction Scheme (SHIPS). While both SST and TCHP serve as useful measures of the upper-ocean heat content, they do not accurately represent ocean stratification effects. Here we show that replacing SST in the SHIPS framework with a dynamic temperature (Tdy), which accounts for the oceanic negative feedback to the hurricane's intensity arising from storm-induced vertical mixing and sea-surface cooling, improves the model performance. While the model with SST and TCHP explains nearly 41% of the variance in 36-hr intensity changes, replacing SST with Tdy increases the variance explained to nearly 44%. Our results suggest that representation of the oceanic feedback, even through relatively simple formulations such as Tdy, may improve the performance of statistical hurricane intensity prediction models such as SHIPS.
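The abstract does not reproduce the Tdy formulation, but the basic idea of a mixing-aware temperature can be sketched by depth-averaging the upper-ocean temperature profile over a storm-induced mixing depth. In the sketch below, the profile shape and the 80 m mixing depth are illustrative assumptions; in practice the mixing depth would be derived from stratification and storm characteristics.

```python
import numpy as np

def dynamic_temperature(z, temp, mixing_depth):
    """Average the temperature profile from the surface down to an assumed
    storm-induced mixing depth (trapezoidal rule)."""
    m = z <= mixing_depth
    zz, tt = z[m], temp[m]
    integral = np.sum(0.5 * (tt[1:] + tt[:-1]) * np.diff(zz))
    return integral / (zz[-1] - zz[0])

# Illustrative stratified profile: warm mixed layer above a thermocline.
z = np.arange(0.0, 205.0, 5.0)                           # depth (m)
temp = 29.0 - 8.0 / (1.0 + np.exp(-(z - 60.0) / 12.0))   # temperature (deg C)

sst = temp[0]
tdy = dynamic_temperature(z, temp, mixing_depth=80.0)
print(f"SST = {sst:.2f} C, Tdy = {tdy:.2f} C")  # Tdy < SST over a stratified ocean
```

For a stratified profile Tdy falls below the SST, which is what allows it to encode the cold-wake negative feedback that SST and TCHP miss.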
NASA Astrophysics Data System (ADS)
Boswijk, G.; Fowler, A. M.; Palmer, J. G.; Fenwick, P.; Hogg, A.; Lorrey, A.; Wunder, J.
2014-04-01
Millennial and multi-millennial tree-ring chronologies can provide useful proxy records of past climate, giving insight into a more complete range of natural climate variability prior to the 20th century. Since the 1980s a multi-millennial tree-ring chronology has been developed from kauri (Agathis australis) from the upper North Island, New Zealand. Previous work has demonstrated the sensitivity of kauri to the El Niño-Southern Oscillation (ENSO). Here we present recent additions and extensions to the late Holocene kauri chronology (LHKC), and assess the potential of a composite master chronology, AGAUc13, for palaeoclimate reconstruction. The updated composite kauri chronology now spans 4491 years (2488 BCE-2002 CE) and includes data from 18 modern sites, 25 archaeological sites, and 18 sub-fossil (swamp) kauri sites. Consideration of the composition and statistical quality of AGAUc13 suggests the LHKC has utility for palaeoclimate reconstruction but there are caveats. These include: (a) differences in character between the three assemblages including growth rate and sensitivity; (b) low sample depth and low statistical quality in the 10th-13th century CE, when the record transitions from modern and archaeological material to the swamp kauri; (c) a potential difference in amplitude of the signal in the swamp kauri; (d) a westerly bias in site distribution prior to 911 CE; (e) variable statistical quality across the entire record associated with variable replication; and (f) complex changes in sample depth and tree age and size which may influence centennial scale trends in the data. Further tree ring data are required to improve statistical quality, particularly in the first half of the second millennium CE.
Creighton, Doug; Gruca, Mark; Marsh, Douglas; Murphy, Nancy
2014-11-01
Cervical mobilization and manipulation have been shown to improve cervical range of motion and pain. Rotatory thrust manipulation applied to the lower cervical segments is associated with controversy and the potential for eliciting adverse reactions (AR). The purpose of this clinical trial was to describe two translatory non-thrust mobilization techniques and evaluate their effect on cervical pain, motion restriction, and whether any adverse effects were reported when applied to the C7 segment. This trial included 30 participants with painful and restricted cervical rotation. Participants were randomly assigned to receive one of the two mobilization techniques. Active cervical rotation and pain intensity measurements were recorded pre- and post-intervention. Within group comparisons were determined using the Wilcoxon signed-rank test and between group comparisons were analyzed using the Mann-Whitney U test. Significance was set at P = 0.05. Thirty participants were evaluated immediately after one of the two mobilization techniques was applied. There was a statistically significant difference (improvement) for active cervical rotation after application of the C7 facet distraction technique for both right (P = 0.022) and left (P = 0.022) rotation. Statistically significant improvement was also found for the C7 facet gliding technique for both right (P = 0.022) and left rotation (P = 0.020). Pain reduction was statistically significant for both right and left rotation after application of both techniques. Both mobilization techniques produced similar positive effects and one was not statistically superior to the other. A single application of both C7 mobilization techniques improved active cervical rotation, reduced perceived pain, and did not produce any AR in 30 patients with neck pain and movement limitation. These two non-thrust techniques may offer clinicians an additional safe and effective manual intervention for patients with limited and painful cervical rotation. A more robust experimental design is recommended to further examine these and similar cervical translatory mobilization techniques.
Monitoring and Evaluation: Statistical Support for Life-cycle Studies, Annual Report 2003.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Skalski, John
2003-11-01
The ongoing mission of this project is the development of statistical tools for analyzing fisheries tagging data in the most precise and appropriate manner possible. This mission also includes providing statistical guidance on the best ways to design large-scale tagging studies. This mission continues because the technologies for conducting fish tagging studies continuously evolve. In just the last decade, fisheries biologists have seen the evolution from freeze-brands and coded wire tags (CWT) to passive integrated transponder (PIT) tags, balloon-tags, radiotelemetry, and now, acoustic-tags. With each advance, the technology holds the promise of more detailed and precise information. However, the technology for analyzing and interpreting the data also becomes more complex as the tagging techniques become more sophisticated. The goal of the project is to develop the analytical tools in parallel with the technical advances in tagging studies, so that maximum information can be extracted on a timely basis. Associated with this mission is the transfer of these analytical capabilities to the field investigators to assure consistency and the highest levels of design and analysis throughout the fisheries community. Consequently, this project provides detailed technical assistance on the design and analysis of tagging studies to groups requesting assistance throughout the fisheries community. Ideally, each project and each investigator would invest in the statistical support needed for the successful completion of their study. However, this is an ideal that is rarely if ever attained. Furthermore, there is only a small pool of highly trained scientists in this specialized area of tag analysis here in the Northwest. Project 198910700 provides the financial support to sustain this local expertise on the statistical theory of tag analysis at the University of Washington and make it available to the fisheries community. Piecemeal and fragmented support from various agencies and organizations would be incapable of maintaining a center of expertise. The mission of the project is to help assure that tagging studies are designed and analyzed from the onset to extract the best available information using state-of-the-art statistical methods. The overarching goal of the project is to assure statistically sound survival studies so that fish managers can focus on the management implications of their findings and not be distracted by concerns about whether the studies are statistically reliable. Specific goals and objectives of the study include the following: (1) Provide consistent application of statistical methodologies for survival estimation across all salmon life cycle stages to assure comparable performance measures and assessment of results through time, to maximize learning and adaptive management opportunities, and to improve and maintain the ability to responsibly evaluate the success of implemented Columbia River FWP salmonid mitigation programs and identify future mitigation options. (2) Improve analytical capabilities to conduct research on survival processes of wild and hatchery chinook and steelhead during smolt outmigration, to improve monitoring and evaluation capabilities and assist in-season river management to optimize operational and fish passage strategies to maximize survival. (3) Extend statistical support to estimate ocean survival and in-river survival of returning adults. Provide statistical guidance in implementing a river-wide adult PIT-tag detection capability.
(4) Develop statistical methods for survival estimation for all potential users and make this information available through peer-reviewed publications, statistical software, and technology transfers to organizations such as NOAA Fisheries, the Fish Passage Center, US Fish and Wildlife Service, US Geological Survey (USGS), US Army Corps of Engineers (USACE), Public Utility Districts (PUDs), the Independent Scientific Advisory Board (ISAB), and other members of the Northwest fisheries community. (5) Provide and maintain statistical software for tag analysis and user support. (6) Provide improvements in statistical theory and software as requested by user groups. These improvements include extending software capabilities to address new research issues, adapting tagging techniques to new study designs, and extending the analysis capabilities to new technologies such as radio-tags and acoustic-tags.
Improving risk classification of critical illness with biomarkers: a simulation study
Seymour, Christopher W.; Cooke, Colin R.; Wang, Zheyu; Kerr, Kathleen F.; Yealy, Donald M.; Angus, Derek C.; Rea, Thomas D.; Kahn, Jeremy M.; Pepe, Margaret S.
2012-01-01
Purpose: Optimal triage of patients at risk of critical illness requires accurate risk prediction, yet few data exist on the performance criteria required of a potential biomarker to be clinically useful. Materials and Methods: We studied an adult cohort of non-arrest, non-trauma emergency medical services encounters transported to a hospital from 2002 to 2006. We simulated hypothetical biomarkers increasingly associated with critical illness during hospitalization, and determined the biomarker strength and sample size necessary to improve risk classification beyond a best clinical model. Results: Of 57,647 encounters, 3,121 (5.4%) were hospitalized with critical illness and 54,526 (94.6%) without critical illness. The addition of a moderate-strength biomarker (odds ratio = 3.0 for critical illness) to a clinical model improved discrimination (c-statistic 0.85 vs. 0.8, p < 0.01) and reclassification (net reclassification improvement = 0.15, 95% CI: 0.13, 0.18), and increased the proportion of cases in the highest risk category by +8.6% (95% CI: 7.5, 10.8%). Introducing correlation between the biomarker and physiological variables in the clinical risk score did not modify the results. Statistically significant changes in net reclassification required a sample size of at least 1000 subjects. Conclusions: Clinical models for triage of critical illness could be significantly improved by incorporating biomarkers, yet substantial sample sizes and biomarker strength may be required. PMID:23566734
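The kind of simulation the authors describe can be sketched as follows: generate a clinical risk score, add a hypothetical biomarker whose association with the outcome has a specified odds ratio, and compare the discrimination (c-statistic/AUC) of models with and without it. All parameter values below are illustrative assumptions, not the study's.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 20_000

# Clinical risk score and a hypothetical biomarker (log-odds scale).
clinical = rng.normal(size=n)
biomarker = rng.normal(size=n)
or_biomarker = 3.0  # assumed odds ratio per SD of the biomarker
logit = -3.0 + 1.2 * clinical + np.log(or_biomarker) * biomarker
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

# Fit and score the clinical-only and clinical-plus-biomarker models.
X_clin = clinical.reshape(-1, 1)
X_full = np.column_stack([clinical, biomarker])
auc_clin = roc_auc_score(y, LogisticRegression().fit(X_clin, y).predict_proba(X_clin)[:, 1])
auc_full = roc_auc_score(y, LogisticRegression().fit(X_full, y).predict_proba(X_full)[:, 1])
print(f"c-statistic: clinical {auc_clin:.3f} -> with biomarker {auc_full:.3f}")
```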
Quantifying the transmission potential of pandemic influenza
NASA Astrophysics Data System (ADS)
Chowell, Gerardo; Nishiura, Hiroshi
2008-03-01
This article reviews quantitative methods to estimate the basic reproduction number of pandemic influenza, a key threshold quantity to help determine the intensity of interventions required to control the disease. Although it is difficult to assess the transmission potential of a probable future pandemic, historical epidemiologic data are readily available from previous pandemics, and as a reference quantity for future pandemic planning, mathematical and statistical analyses of historical data are crucial. In particular, because many historical records tend to document only the temporal distribution of cases or deaths (i.e. the epidemic curve), our review focuses on methods to maximize the utility of time-evolution data and to clarify the detailed mechanisms of the spread of influenza. First, we highlight structured epidemic models and their parameter estimation method, which can quantify detailed disease dynamics, including those we cannot observe directly. Duration-structured epidemic systems are subsequently presented, offering a firm understanding of the definition of the basic and effective reproduction numbers. When the initial growth phase of an epidemic is investigated, the distribution of the generation time is key statistical information for appropriately estimating the transmission potential using the intrinsic growth rate. Applications of stochastic processes are also highlighted to estimate the transmission potential using similar data. Critically important characteristics of influenza data are subsequently summarized, followed by our conclusions to suggest potential future methodological improvements.
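As a concrete illustration of the growth-rate approach the review describes, the sketch below estimates the exponential growth rate r from an early epidemic curve and converts it to a reproduction number via the relation R = 1/M(-r), where M is the moment generating function of the generation time (Wallinga and Lipsitch, 2007). The gamma generation-time parameters and the case counts are hypothetical values for illustration.

```python
import numpy as np

def growth_rate(cases, dt=1.0):
    """Exponential growth rate r from a log-linear fit to the initial epidemic curve."""
    t = np.arange(len(cases)) * dt
    slope, _ = np.polyfit(t, np.log(cases), 1)
    return slope

def reproduction_number(r, gen_mean, gen_shape):
    """R = 1 / M(-r) for a gamma-distributed generation time,
    which evaluates to (1 + scale * r) ** shape."""
    scale = gen_mean / gen_shape
    return (1.0 + scale * r) ** gen_shape

# Hypothetical early epidemic curve (daily case counts).
cases = np.array([3, 5, 8, 12, 19, 30, 47, 74])
r = growth_rate(cases)
R = reproduction_number(r, gen_mean=3.0, gen_shape=2.0)
print(f"r = {r:.3f} per day, R = {R:.2f}")
```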
Powerful Statistical Inference for Nested Data Using Sufficient Summary Statistics
Dowding, Irene; Haufe, Stefan
2018-01-01
Hierarchically-organized data arise naturally in many psychology and neuroscience studies. As the standard assumption of independent and identically distributed samples does not hold for such data, two important problems are to accurately estimate group-level effect sizes, and to obtain powerful statistical tests against group-level null hypotheses. A common approach is to summarize subject-level data by a single quantity per subject, which is often the mean or the difference between class means, and treat these as samples in a group-level t-test. This “naive” approach is, however, suboptimal in terms of statistical power, as it ignores information about the intra-subject variance. To address this issue, we review several approaches to deal with nested data, with a focus on methods that are easy to implement. With what we call the sufficient-summary-statistic approach, we highlight a computationally efficient technique that can improve statistical power by taking into account within-subject variances, and we provide step-by-step instructions on how to apply this approach to a number of frequently-used measures of effect size. The properties of the reviewed approaches and the potential benefits over a group-level t-test are quantitatively assessed on simulated data and demonstrated on EEG data from a simulated-driving experiment. PMID:29615885
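A minimal sketch of the precision-weighting idea behind the sufficient-summary-statistic approach follows: each subject's effect estimate is weighted by the inverse of its estimated variance, rather than treating all subject means as equally reliable as the naive group-level t-test does. The simulated data and the fixed-effects weighting shown here are illustrative assumptions, not the authors' exact procedure, and the weighting ignores between-subject variance.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Simulated nested data: subjects with unequal trial counts and noise levels.
n_subjects = 15
true_effect = 0.3
subj_means, subj_vars = [], []
for _ in range(n_subjects):
    n_trials = rng.integers(20, 200)
    sd = rng.uniform(0.5, 3.0)
    trials = rng.normal(true_effect, sd, n_trials)
    subj_means.append(trials.mean())
    subj_vars.append(trials.var(ddof=1) / n_trials)  # variance of the subject mean

m = np.array(subj_means)
v = np.array(subj_vars)

# Naive approach: one-sample t-test on the unweighted subject means.
t_naive, p_naive = stats.ttest_1samp(m, 0.0)

# Precision-weighted combination: z = sum(w*m) / sqrt(sum(w)), w = 1/v,
# since under H0 the weighted sum has variance sum(w).
w = 1.0 / v
z = np.sum(w * m) / np.sqrt(np.sum(w))
p_weighted = 2 * stats.norm.sf(abs(z))

print(f"naive t-test:    p = {p_naive:.4f}")
print(f"weighted z-test: p = {p_weighted:.4f}")
```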
Visual and brainstem auditory evoked potentials in infants with severe vitamin B12 deficiency.
Demir, Nihat; Koç, Ahmet; Abuhandan, Mahmut; Calik, Mustafa; Işcan, Akin
2015-01-01
Vitamin B12 plays an important role in the development of mental, motor, cognitive, and social functions via its role in DNA synthesis and nerve myelination. Its deficiency in infants might cause neuromotor retardation as well as megaloblastic anemia. The objective of this study was to investigate the effects of infantile vitamin B12 deficiency on evoked brain potentials and determine whether improvement could be obtained with vitamin B12 replacement at appropriate dosages. Thirty patients with vitamin B12 deficiency and 30 age-matched healthy controls were included in the study. Hematological parameters, visual evoked potentials, and brainstem auditory evoked potentials tests were performed prior to treatment, 1 week after treatment, and 3 months after treatment. Visual evoked potentials (VEPs) and brainstem auditory evoked potentials (BAEPs) were found to be prolonged in 16 (53.3%) and 15 (50%) patients, respectively. Statistically significant improvements in VEP and BAEP examinations were determined 3 months after treatment. Three months after treatment, VEP and BAEP examinations returned to normal in 81.3% and 53.3% of subjects with prolonged VEPs and BAEPs, respectively. These results demonstrate that vitamin B12 deficiency in infants causes significant impairment in the auditory and visual functioning tests of the brain, such as VEP and BAEP.
Climate and dengue transmission: evidence and implications.
Morin, Cory W; Comrie, Andrew C; Ernst, Kacey
2013-01-01
Climate influences dengue ecology by affecting vector dynamics, agent development, and mosquito/human interactions. Although these relationships are known, the impact climate change will have on transmission is unclear. Climate-driven statistical and process-based models are being used to refine our knowledge of these relationships and predict the effects of projected climate change on dengue fever occurrence, but results have been inconsistent. We sought to identify major climatic influences on dengue virus ecology and to evaluate the ability of climate-based dengue models to describe associations between climate and dengue, simulate outbreaks, and project the impacts of climate change. We reviewed the evidence for direct and indirect relationships between climate and dengue generated from laboratory studies, field studies, and statistical analyses of associations between vectors, dengue fever incidence, and climate conditions. We assessed the potential contribution of climate-driven, process-based dengue models and provide suggestions to improve their performance. Relationships between climate variables and factors that influence dengue transmission are complex. A climate variable may increase dengue transmission potential through one aspect of the system while simultaneously decreasing transmission potential through another. This complexity may at least partly explain inconsistencies in statistical associations between dengue and climate. Process-based models can account for the complex dynamics but often omit important aspects of dengue ecology, notably virus development and host-species interactions. Synthesizing and applying current knowledge of climatic effects on all aspects of dengue virus ecology will help direct future research and enable better projections of climate change effects on dengue incidence.
Formulating Spatially Varying Performance in the Statistical Fusion Framework
Landman, Bennett A.
2012-01-01
To date, label fusion methods have primarily relied either on global (e.g. STAPLE, globally weighted vote) or voxelwise (e.g. locally weighted vote) performance models. Optimality of the statistical fusion framework hinges upon the validity of the stochastic model of how a rater errs (i.e., the labeling process model). Hitherto, approaches have tended to focus on the extremes of potential models. Herein, we propose an extension to the STAPLE approach to seamlessly account for spatially varying performance by extending the performance level parameters to account for a smooth, voxelwise performance level field that is unique to each rater. This approach, Spatial STAPLE, provides significant improvements over state-of-the-art label fusion algorithms in both simulated and empirical data sets. PMID:22438513
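To make the contrast with the spatially varying extension concrete, here is a minimal sketch of the classic global STAPLE EM iteration for binary labels, in which each rater is summarized by a single sensitivity p and specificity q for the whole volume; Spatial STAPLE instead lets these performance parameters vary smoothly across voxels. The toy data and initialization are illustrative.

```python
import numpy as np

def staple(D, prior=0.5, n_iter=50):
    """Classic global STAPLE (EM) for binary labels.
    D: raters x voxels array of 0/1 decisions.
    Returns consensus probabilities W and per-rater (sensitivity, specificity)."""
    R, N = D.shape
    p = np.full(R, 0.9)  # initial sensitivities
    q = np.full(R, 0.9)  # initial specificities
    for _ in range(n_iter):
        # E-step: P(true label = 1 | decisions, p, q) at each voxel.
        a = prior * np.prod(np.where(D == 1, p[:, None], 1 - p[:, None]), axis=0)
        b = (1 - prior) * np.prod(np.where(D == 0, q[:, None], 1 - q[:, None]), axis=0)
        W = a / (a + b)
        # M-step: update each rater's single global performance parameters.
        p = (D * W).sum(axis=1) / W.sum()
        q = ((1 - D) * (1 - W)).sum(axis=1) / (1 - W).sum()
    return W, p, q

# Toy example: 3 raters, 12 voxels, rater 3 noisier than the others.
D = np.array([[1, 1, 1, 1, 0, 0, 0, 0, 1, 1, 0, 0],
              [1, 1, 1, 1, 0, 0, 0, 0, 1, 0, 0, 0],
              [1, 0, 1, 0, 1, 0, 1, 0, 1, 1, 1, 0]])
W, p, q = staple(D)
print(np.round(W, 2), np.round(p, 2), np.round(q, 2))
```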
Whitehead, Lisa; Seaton, Philippa
2016-05-16
Long-term conditions and their concomitant management place considerable pressure on patients, communities, and health care systems worldwide. International clinical guidelines on the majority of long-term conditions recommend the inclusion of self-management programs in routine management. Self-management programs have been associated with improved health outcomes; however, the successful and sustainable transfer of research programs into clinical practice has been inconsistent. Recent developments in mobile technology, such as mobile phone and tablet computer apps, could help in developing a platform for the delivery of self-management interventions that are adaptable, of low cost, and easily accessible. We conducted a systematic review to assess the effectiveness of mobile phone and tablet apps in self-management of key symptoms of long-term conditions. We searched PubMed, Embase, EBSCO databases, the Cochrane Library, and The Joanna Briggs Institute Library for randomized controlled trials that assessed the effectiveness of mobile phone and tablet apps in self-management of diabetes mellitus, cardiovascular disease, and chronic lung diseases from 2005-2016. We searched registers of current and ongoing trials, as well as the gray literature. We then checked the reference lists of all primary studies and review papers for additional references. The last search was run in February 2016. Of the 9 papers we reviewed, 6 of the interventions demonstrated a statistically significant improvement in the primary measure of clinical outcome. Where the intervention comprised an app only, 3 studies demonstrated a statistically significant improvement. Interventions to address diabetes mellitus (5/9) were the most common, followed by chronic lung disease (3/9) and cardiovascular disease (1/9). A total of 3 studies included multiple intervention groups using permutations of an intervention involving an app. The duration of the intervention ranged from 6 weeks to 1 year, and final follow-up data ranged from 3 months to 1 year. Sample size ranged from 48 to 288 participants. The evidence indicates the potential of apps in improving symptom management through self-management interventions. The use of apps in mHealth has the potential to improve health outcomes among those living with chronic diseases through enhanced symptom control. Further innovation, optimization, and rigorous research around the potential of apps in mHealth technology will move the field toward the reality of improved health care delivery and outcomes.
Preoperative and post-operative sleep quality evaluation in rotator cuff tear patients.
Serbest, Sancar; Tiftikçi, Uğur; Askın, Aydogan; Yaman, Ferda; Alpua, Murat
2017-07-01
The aim of this study was to examine the potential relationship between subjective sleep quality and degree of pain in patients with rotator cuff repair. Thirty-one patients who underwent rotator cuff repair prospectively completed the Pittsburgh Sleep Quality Index (PSQI), the Western Ontario Rotator Cuff Index, and the Constant and Murley shoulder scores before surgery and at 6 months after surgery. Preoperative demographic, clinical, and radiologic parameters were also evaluated. The study analysed 31 patients with a median age of 61 years. There was a significant difference preoperatively versus post-operatively in terms of all PSQI global scores and subdivisions (p < 0.001). A statistically significant improvement was determined by the Western Ontario Rotator Cuff Scale and the Constant and Murley shoulder scores (p < 0.001). Sleep disorders are commonly seen in patients with rotator cuff tear, and after repair, there is an increase in the quality of sleep with a parallel improvement in shoulder functions. However, no statistically significant correlation was determined between arthroscopic procedures and the size of the tear and sleep quality. It is suggested that rotator cuff tear repair improves the quality of sleep and the quality of life. Level of evidence: IV.
Vanniyasingam, Thuva; Daly, Caitlin; Jin, Xuejing; Zhang, Yuan; Foster, Gary; Cunningham, Charles; Thabane, Lehana
2018-06-01
This study reviews simulation studies of discrete choice experiments (DCEs) to determine (i) how survey design features affect statistical efficiency and (ii) to appraise their reporting quality. Statistical efficiency was measured using relative design (D-) efficiency, D-optimality, or D-error. For this systematic survey, we searched Journal Storage (JSTOR), Science Direct, PubMed, and OVID, which included a search within EMBASE. Searches were conducted up to 2016 for simulation studies investigating the impact of DCE design features on statistical efficiency. Studies were screened and data were extracted independently and in duplicate. Results for each included study were summarized by design characteristic. Previously developed criteria for the reporting quality of simulation studies were also adapted and applied to each included study. Of 371 potentially relevant studies, 9 were found to be eligible, with several varying in study objectives. Statistical efficiency improved when increasing the number of choice tasks or alternatives; decreasing the number of attributes and attribute levels; using an unrestricted continuous "manipulator" attribute; using model-based approaches with covariates incorporating response behaviour; using sampling approaches that incorporate previous knowledge of response behaviour; incorporating heterogeneity in a model-based design; correctly specifying Bayesian priors; minimizing parameter prior variances; and using an appropriate method to create the DCE design for the research question. The simulation studies performed well in terms of reporting quality. Improvement is needed with regard to clearly specifying study objectives, number of failures, random number generators, starting seeds, and the software used. These results identify the best approaches to structuring a DCE. An investigator can manipulate design characteristics to help reduce response burden and increase statistical efficiency. Since studies varied in their objectives, conclusions were made on several design characteristics; however, the validity of each conclusion was limited. Further research should be conducted to explore all conclusions in various design settings and scenarios. Additional reviews to explore other statistical efficiency outcomes and databases could also be performed to strengthen the conclusions identified in this review.
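The D-error criterion these simulation studies optimize can be computed directly from the multinomial logit information matrix: D-error = det(I(beta))^(-1/K), with I(beta) summed over choice tasks, and lower values indicating a more efficient design. The sketch below evaluates a small design; the design matrix and prior coefficients are chosen purely for illustration.

```python
import numpy as np

def d_error(choice_sets, beta):
    """Local D-error of a multinomial logit DCE design:
    det(information matrix)^(-1/K); lower is better.
    choice_sets: list of (alternatives x K) attribute-level matrices."""
    K = len(beta)
    info = np.zeros((K, K))
    for X in choice_sets:
        u = X @ beta
        p = np.exp(u - u.max())
        p /= p.sum()
        info += X.T @ (np.diag(p) - np.outer(p, p)) @ X
    return np.linalg.det(info) ** (-1.0 / K)

# Illustrative design: 4 choice tasks, 2 alternatives, 2 effects-coded attributes.
tasks = [np.array([[ 1,  1], [-1, -1]]),
         np.array([[ 1, -1], [-1,  1]]),
         np.array([[-1,  1], [ 1,  1]]),
         np.array([[-1, -1], [ 1, -1]])]
prior_beta = np.array([0.5, -0.3])  # assumed prior coefficients
print(f"D-error: {d_error(tasks, prior_beta):.4f}")
```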
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stachel, Shawn J.; Zerbinatti, Celina; Rudd, Michael T.
2016-04-14
Herein, we describe the development of a functionally selective liver X receptor β (LXRβ) agonist series optimized for Emax selectivity, solubility, and physical properties to allow efficacy and safety studies in vivo. Compound 9 showed central pharmacodynamic effects in rodent models, evidenced by statistically significant increases in apolipoprotein E (apoE) and ATP-binding cassette transporter levels in the brain, along with a greatly improved peripheral lipid safety profile when compared to those of full dual agonists. These findings were replicated by subchronic dosing studies in non-human primates, where cerebrospinal fluid levels of apoE and amyloid-β peptides were increased concomitantly with an improved peripheral lipid profile relative to that of nonselective compounds. These results suggest that optimization of LXR agonists for Emax selectivity may have the potential to circumvent the adverse lipid-related effects of hepatic LXR activity.
Heffez, Dan S; Ross, Ruth E; Shade-Zeldow, Yvonne; Kostas, Konstantinos; Morrissey, Mary; Elias, Dean A; Shepard, Alan
2007-09-01
Some patients with fibromyalgia also exhibit the neurological signs of cervical myelopathy. We sought to determine if treatment of cervical myelopathy in patients with fibromyalgia improves the symptoms of fibromyalgia and the patients' quality of life. A non-randomized, prospective, case control study comparing the outcome of surgical (n = 40) versus non-surgical (n = 31) treatment of cervical myelopathy in patients with fibromyalgia was conducted. Outcomes were compared using SF-36, screening test for somatization, HADS, MMPI-2 scale 1 (Hypochondriasis), and self reported severity of symptoms 1 year after treatment. There was no significant difference in initial clinical presentation or demographic characteristics between the patients treated by surgical decompression and those treated by non-surgical means. There was a striking and statistically significant improvement in all symptoms attributed to the fibromyalgia syndrome in the surgical patients but not in the non-surgical patients at 1 year following the treatment of cervical myelopathy (P
Physiological improvement with moderate exercise in type II diabetic neuropathy.
Fisher, M A; Langbein, W E; Collins, E G; Williams, K; Corzine, L
2007-01-01
The objective of this study was to demonstrate improvement in nerve function with moderate exercise in patients with type II diabetic neuropathies. Five subjects with type II diabetes mellitus and distal, predominantly sensory polyneuropathies were studied. The subjects completed an 8-week supervised moderate exercise program (40-75% of maximal O2 uptake reserve), followed by a 16-week program of monitored similar exercise. The same experienced electrophysiologist performed the electrodiagnostic studies both before and after the 24-week exercise period. These studies monitored physiological changes (conduction velocities, response amplitudes) in motor and sensory fibers as well as F-wave latencies. The exercise program produced a documented increase in aerobic exercise capacity. Despite the small number of subjects studied and the relatively short exercise period, there was a statistically significant improvement in nearly all electrophysiological parameters evaluated post exercise, including motor conduction velocities and amplitudes, sensory conduction velocities, and F-wave latencies. This improvement included a statistically significant increase in absolute median motor evoked response amplitudes as well as the recording of sensory nerve action potentials not present prior to exercise. There were no adverse effects from the exercise. This study supports the hypothesis that exercise can be performed safely in patients with type II diabetic neuropathies and can produce improvement in their nerve function. This study also supports the hypothesis that ischemia may have a meaningful role in the pathogenesis of neuropathies in patients with type II diabetes mellitus.
Treatment of dry eye syndrome with orally administered CF101: data from a phase 2 clinical trial.
Avni, Isaac; Garzozi, Hanna J; Barequet, Irina S; Segev, Fanni; Varssano, David; Sartani, Gil; Chetrit, Noa; Bakshi, Erez; Zadok, David; Tomkins, Oren; Litvin, Gilad; Jacobson, Kenneth A; Fishman, Sari; Harpaz, Zivit; Farbstein, Motti; Yehuda, Sara Bar; Silverman, Michael H; Kerns, William D; Bristol, David R; Cohn, Ilan; Fishman, Pnina
2010-07-01
To explore the safety and efficacy of CF101, an A(3) adenosine receptor agonist, in patients with moderate to severe dry eye syndrome. Phase 2, multicenter, randomized, double-masked, placebo-controlled, parallel-group study. Sixty-eight patients completed the study, 35 patients in the placebo group and 33 patients in the CF101 group. Patients were treated orally with either 1 mg CF101 pills or matching vehicle-filled placebo pills, given twice daily for 12 weeks, followed by a 2-week posttreatment observation. An improvement of more than 25% over baseline at week 12 in one of the following parameters: (1) tear break-up time (BUT); (2) superficial punctate keratitis assessed by fluorescein staining results; and (3) Schirmer tear test 1 results. Clinical laboratory safety tests, ophthalmic examinations, intraocular pressure (IOP) measurements, electrocardiographic evaluations, vital sign measurements, and monitoring of adverse events. A statistically significant increase in the proportion of patients who achieved more than 25% improvement in the corneal staining and in the clearance of corneal staining was noted between the CF101-treated group and the placebo group. Treatment with CF101 resulted in a statistically significant improvement in the mean change from baseline at week 12 of the corneal staining, BUT, and tear meniscus (TM) height in the CF101-treated group. CF101 was well tolerated and exhibited an excellent safety profile with no serious adverse events. A statistically significant decrease from baseline was observed in the IOP of the CF101-treated group in comparison with the placebo group. CF101, given orally, induced a statistically significant improvement in the corneal staining and an improvement in the BUT and TM in patients with moderate to severe dry eye syndrome. The drug was very well tolerated. These data and the anti-inflammatory characteristic of CF101 support further study of the drug as a potential treatment for the signs and symptoms of dry eye syndrome. Proprietary or commercial disclosure may be found after the references. Copyright 2010 American Academy of Ophthalmology. Published by Elsevier Inc. All rights reserved.
Tabash, Mohammed I; Hussein, Rim A; Mahmoud, Aleya H; El-Borgy, Mohamed D; Abu-Hamad, Bassam A
2016-04-01
In health care facilities, pharmaceutical waste is generally discharged down the drain or sent to landfill. Poor knowledge about its potential downstream impacts may be a primary factor in improper disposal behavior. The objective of this study was to determine the impact of an intervention program on the knowledge and practice of health care staff regarding pharmaceutical waste management (PWM). The study was designed as a pre/posttest intervention study. The total sample size was 530 in the pre-intervention phase; a subsample of 69 individuals was then selected for the intervention and post-intervention phases. A paired-sample t test was used to assess the difference between pretest and follow-up test results. A statistically significant improvement in knowledge and practice was achieved (P < 0.001). Poor knowledge and poor practice levels (scores < 50%) were found to improve to satisfactory levels (scores ≥ 75%). Therefore, educational programs could be considered an effective tool for changing health care staff practice in PWM. It is thus recommended that authorities implement training-of-trainers (TOT) programs to educate health care staff on PWM and organize refresher workshops regularly.
Omnibus risk assessment via accelerated failure time kernel machine modeling.
Sinnott, Jennifer A; Cai, Tianxi
2013-12-01
Integrating genomic information with traditional clinical risk factors to improve the prediction of disease outcomes could profoundly change the practice of medicine. However, the large number of potential markers and possible complexity of the relationship between markers and disease make it difficult to construct accurate risk prediction models. Standard approaches for identifying important markers often rely on marginal associations or linearity assumptions and may not capture non-linear or interactive effects. In recent years, much work has been done to group genes into pathways and networks. Integrating such biological knowledge into statistical learning could potentially improve model interpretability and reliability. One effective approach is to employ a kernel machine (KM) framework, which can capture nonlinear effects if nonlinear kernels are used (Scholkopf and Smola, 2002; Liu et al., 2007, 2008). For survival outcomes, KM regression modeling and testing procedures have been derived under a proportional hazards (PH) assumption (Li and Luan, 2003; Cai, Tonini, and Lin, 2011). In this article, we derive testing and prediction methods for KM regression under the accelerated failure time (AFT) model, a useful alternative to the PH model. We approximate the null distribution of our test statistic using resampling procedures. When multiple kernels are of potential interest, it may be unclear in advance which kernel to use for testing and estimation. We propose a robust Omnibus Test that combines information across kernels, and an approach for selecting the best kernel for estimation. The methods are illustrated with an application in breast cancer. © 2013, The International Biometric Society.
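The kernel-combination idea can be illustrated with a simple permutation scheme: compute a score-type statistic for each candidate kernel, take the minimum p-value as the omnibus statistic, and calibrate it by permuting outcomes. The sketch below is a generic stand-in, not the authors' derivation; it uses uncensored outcomes and a quadratic statistic for brevity, whereas the actual test is derived under the AFT model with resampling.

```python
import numpy as np

def kernel_stat(K, y):
    """Generic score-type statistic: y'Ky with centred outcomes."""
    yc = y - y.mean()
    return yc @ K @ yc

def omnibus_pvalue(kernels, y, n_perm=2000, seed=0):
    """Min-p omnibus test across kernels, calibrated by permutation."""
    rng = np.random.default_rng(seed)
    obs = np.array([kernel_stat(K, y) for K in kernels])
    perm = np.empty((n_perm, len(kernels)))
    for b in range(n_perm):
        yp = rng.permutation(y)
        perm[b] = [kernel_stat(K, yp) for K in kernels]
    # Per-kernel permutation p-values, then min-p as the omnibus statistic.
    p_each = (perm >= obs).mean(axis=0)
    p_perm = np.array([(perm >= perm[b]).mean(axis=0) for b in range(n_perm)])
    return (p_perm.min(axis=1) <= p_each.min()).mean()

# Toy data: 60 subjects, 5 markers; linear and Gaussian kernels as candidates.
rng = np.random.default_rng(1)
X = rng.normal(size=(60, 5))
y = X[:, 0] ** 2 + rng.normal(scale=0.5, size=60)  # nonlinear signal
K_lin = X @ X.T
d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K_rbf = np.exp(-d2 / X.shape[1])
print(f"omnibus p = {omnibus_pvalue([K_lin, K_rbf], y):.4f}")
```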
Improving the performance of a filling line based on simulation
NASA Astrophysics Data System (ADS)
Jasiulewicz-Kaczmarek, M.; Bartkowiak, T.
2016-08-01
The paper describes a method for improving the performance of a filling line based on simulation. The study concerns a production line located in a manufacturing centre of an FMCG company. A discrete event simulation model was built using data provided by a maintenance data acquisition system. Two types of failures were identified in the system and were approximated using continuous statistical distributions. The model was validated taking into consideration line performance measures. A brief Pareto analysis of line failures was conducted to identify potential areas of improvement. Two improvement scenarios were proposed and tested via simulation, and the outcomes of the simulations formed the basis of a financial analysis. NPV and ROI values were calculated taking into account depreciation, profits, losses, the current CIT rate, and inflation. A validated simulation model can be a useful tool in the maintenance decision-making process.
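A minimal sketch of this simulation-plus-finance workflow follows: a machine's failure/repair cycle is sampled from fitted distributions to estimate line availability, and the cash flows of an improvement scenario are discounted to an NPV. The distribution parameters, cash flows, and discount rate are illustrative assumptions, not the paper's data.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_availability(mtbf, mttr_shape, mttr_scale, horizon=10_000.0):
    """Alternate exponential times-to-failure with gamma repair times and
    return the fraction of the horizon during which the line is running."""
    t = up = 0.0
    while t < horizon:
        ttf = rng.exponential(mtbf)
        repair = rng.gamma(mttr_shape, mttr_scale)
        up += min(ttf, horizon - t)
        t += ttf + repair
    return up / horizon

base = simulate_availability(mtbf=180.0, mttr_shape=2.0, mttr_scale=15.0)
improved = simulate_availability(mtbf=260.0, mttr_shape=2.0, mttr_scale=10.0)

# Financial appraisal of the improvement scenario over a 5-year horizon.
extra_profit_per_year = (improved - base) * 1_200_000  # illustrative margin
investment = 250_000
rate = 0.08                                            # discount rate
cash_flows = [-investment] + [extra_profit_per_year] * 5
npv = sum(cf / (1 + rate) ** k for k, cf in enumerate(cash_flows))
roi = (sum(cash_flows[1:]) - investment) / investment
print(f"availability: {base:.3f} -> {improved:.3f}, NPV = {npv:,.0f}, ROI = {roi:.1%}")
```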
Change in perception of sclerotherapy results after exposure to pre-post intervention photographs.
Santiago, Fabricio R; Piscoya, Mario; Chi, Yung-Wei
2018-05-01
Objective: To evaluate patients' self-perception of cosmetic improvement before and after they were presented with pre- and postprocedure photographs following sclerotherapy with 75% dextrose. Methods: Treatments included sclerotherapy of reticular and varicose veins using 75% dextrose. All treated limbs were photographed and classified according to the Clinical, Etiology, Anatomy, and Pathology (CEAP) classification and the Venous Clinical Severity Score pre- and posttreatment. The patients were queried before and after viewing the photos during these visits and indicated whether they were very unsatisfied, dissatisfied, satisfied, or very satisfied. The nonparametric kappa coefficient and a chi-square test were used to measure agreement (p < 0.05 indicated statistical significance). The paired Wilcoxon test was used to compare mean Venous Clinical Severity Scores measured at different times (p < 0.05 indicated statistical significance). Data were analyzed using STATA software (version 12). Results: Individuals were more satisfied with the results of sclerotherapy after exposure to images portraying their limbs two months after the procedure (p = 0.0028). This effect was maintained six months after sclerotherapy (p = 0.0027). Conclusion: Patient exposure to pre- and postprocedure photographs is a simple intervention with the potential to improve patient satisfaction up to six months after treatment with sclerotherapy.
Nearfield Summary and Statistical Analysis of the Second AIAA Sonic Boom Prediction Workshop
NASA Technical Reports Server (NTRS)
Park, Michael A.; Nemec, Marian
2017-01-01
A summary is provided for the Second AIAA Sonic Boom Workshop, held 8-9 January 2017 in conjunction with AIAA SciTech 2017. The workshop used three required models of increasing complexity: an axisymmetric body, a wing body, and a complete configuration with a flow-through nacelle. An optional complete configuration with propulsion boundary conditions was also provided. These models are designed with similar nearfield signatures to isolate geometry and shock/expansion interaction effects. Eleven international participant groups submitted nearfield signatures with forces, pitching moment, and iterative convergence norms. Statistics and grid convergence of these nearfield signatures are presented. The submissions are propagated to the ground and noise levels are computed, allowing the grid convergence and the statistical distribution of a noise level to be examined. While progress since the first workshop is documented, improvements to the analysis methods for a possible subsequent workshop are suggested. The complete configuration with flow-through nacelle showed the most dramatic improvement between the two workshops. The current workshop cases are more relevant to vehicles with lower loudness, and have the potential for lower annoyance, than the first workshop cases. The models for this workshop, with quieter ground noise levels than those of the first workshop, exposed weaknesses in analysis, particularly in convective discretization.
Evaluation of risk communication in a mammography patient decision aid.
Klein, Krystal A; Watson, Lindsey; Ash, Joan S; Eden, Karen B
2016-07-01
We characterized patients' comprehension, memory, and impressions of risk communication messages in a patient decision aid (PtDA), Mammopad, and clarified perceived importance of numeric risk information in medical decision making. Participants were 75 women in their forties with average risk factors for breast cancer. We used mixed methods, comprising a risk estimation problem administered within a pretest-posttest design, and semi-structured qualitative interviews with a subsample of 21 women. Participants' positive predictive value estimates of screening mammography improved after using Mammopad. Although risk information was only briefly memorable, through content analysis, we identified themes describing why participants value quantitative risk information, and obstacles to understanding. We describe ways the most complicated graphic was incompletely comprehended. Comprehension of risk information following Mammopad use could be improved. Patients valued receiving numeric statistical information, particularly in pictograph format. Obstacles to understanding risk information, including potential for confusion between statistics, should be identified and mitigated in PtDA design. Using simple pictographs accompanied by text, PtDAs may enhance a shared decision-making discussion. PtDA designers and providers should be aware of benefits and limitations of graphical risk presentations. Incorporating comprehension checks could help identify and correct misapprehensions of graphically presented statistics. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Wang, S.; Ancell, B. C.; Huang, G. H.; Baetz, B. W.
2018-03-01
Data assimilation using the ensemble Kalman filter (EnKF) has been increasingly recognized as a promising tool for probabilistic hydrologic predictions. However, little effort has been made to conduct the pre- and post-processing of assimilation experiments, posing a significant challenge in achieving the best performance of hydrologic predictions. This paper presents a unified data assimilation framework for improving the robustness of hydrologic ensemble predictions. Statistical pre-processing of assimilation experiments is conducted through factorial design and analysis to identify the best EnKF settings with maximized performance. After the data assimilation operation, statistical post-processing analysis is performed through factorial polynomial chaos expansion to efficiently address uncertainties in hydrologic predictions, as well as to explicitly reveal potential interactions among model parameters and their contributions to predictive accuracy. In addition, Gaussian anamorphosis is used to establish a seamless bridge between data assimilation and uncertainty quantification of hydrologic predictions. Both synthetic and real data assimilation experiments are carried out to demonstrate the feasibility and applicability of the proposed methodology in the Guadalupe River basin, Texas. Results suggest that statistical pre- and post-processing of data assimilation experiments provide meaningful insights into the dynamic behavior of hydrologic systems and enhance the robustness of hydrologic ensemble predictions.
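Since the abstract centers on EnKF-based assimilation, a toy analysis step may help readers unfamiliar with the filter. This is a generic stochastic EnKF update with a linear observation operator, not the authors' implementation; all dimensions and values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(42)

def enkf_update(ensemble, obs, obs_err_var, H):
    """One stochastic EnKF analysis step for a state ensemble.

    ensemble : (n_members, n_state) forecast ensemble
    obs      : (n_obs,) observation vector
    H        : (n_obs, n_state) linear observation operator
    """
    X = ensemble - ensemble.mean(axis=0)            # state anomalies
    Y = X @ H.T                                     # anomalies in obs space
    n = ensemble.shape[0]
    Pyy = Y.T @ Y / (n - 1) + obs_err_var * np.eye(len(obs))
    Pxy = X.T @ Y / (n - 1)
    K = Pxy @ np.linalg.inv(Pyy)                    # Kalman gain
    # Perturb observations for each member (stochastic EnKF).
    obs_pert = obs + rng.normal(0, np.sqrt(obs_err_var), (n, len(obs)))
    innov = obs_pert - ensemble @ H.T
    return ensemble + innov @ K.T

# Toy example: 50-member ensemble of a 3-variable state, observing variable 0.
ens = rng.normal(size=(50, 3))
H = np.array([[1.0, 0.0, 0.0]])
updated = enkf_update(ens, np.array([0.5]), 0.1, H)
print(updated.mean(axis=0))
```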
Beyond δ: Tailoring marked statistics to reveal modified gravity
NASA Astrophysics Data System (ADS)
Valogiannis, Georgios; Bean, Rachel
2018-01-01
Models that seek to explain cosmic acceleration through modifications to general relativity (GR) evade stringent Solar System constraints through a restoring, screening mechanism. Down-weighting the high-density, screened regions in favor of the low-density, unscreened ones offers the potential to enhance the amount of information carried in such modified gravity models. In this work, we assess the performance of a new "marked" transformation and perform a systematic comparison with the clipping and logarithmic transformations, in the context of ΛCDM and the symmetron and f(R) modified gravity models. Performance is measured in terms of the fractional boost in the Fisher information and the signal-to-noise ratio (SNR) for these models relative to the statistics derived from the standard density distribution. We find that all three statistics provide improved Fisher boosts over the basic density statistics. The model parameters for the marked and clipped transformations that best enhance signals and the Fisher boosts are determined. We also show that the mark is useful both as a Fourier- and real-space transformation; a marked correlation function also enhances the SNR relative to the standard correlation function, and can on mildly nonlinear scales show a significant difference between the ΛCDM and the modified gravity models. Our results demonstrate how a series of simple analytical transformations could dramatically increase the information on deviations from GR extracted from large-scale surveys, making a potential detection much more feasible.
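To make the marked transformation concrete, the sketch below applies a density-dependent mark of the functional form commonly used in the marked-statistics literature, which up-weights low-density (unscreened) regions, and takes a power spectrum of the marked field. The density field, mark parameters, and units are illustrative, and this is not the authors' pipeline.

```python
import numpy as np

def marked_field(delta, delta_s=0.25, p=1.0):
    """Density-dependent mark that up-weights low-density regions;
    delta_s and p are tunable mark parameters."""
    return ((1.0 + delta_s) / (1.0 + delta_s + delta)) ** p

rng = np.random.default_rng(1)
delta = rng.lognormal(sigma=0.5, size=(64, 64, 64)) - 1.0  # toy density field

mark = marked_field(delta)
weighted = mark * (1.0 + delta)
marked_delta = weighted / weighted.mean() - 1.0  # marked overdensity

# Power spectrum of the marked field via FFT (arbitrary units).
fk = np.fft.rfftn(marked_delta)
power = np.abs(fk) ** 2
print("mean mark:", mark.mean(), "P(k) sum:", power.sum())
```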
Tropical forest heterogeneity from TanDEM-X InSAR and lidar observations in Indonesia
NASA Astrophysics Data System (ADS)
De Grandi, Elsa Carla; Mitchard, Edward
2016-10-01
Fires exacerbated during the El Niño Southern Oscillation are a serious threat in Indonesia, leading to the destruction and degradation of tropical forests and emissions of CO2 into the atmosphere. Forest structural changes caused by the 1997-1998 El Niño Southern Oscillation in the Sungai Wain Protection Forest (East Kalimantan, Indonesia), a previously intact forest reserve, have led to the development of a range of landcover types from secondary forest to areas dominated by grassland. These structural differences can be appreciated over large areas by remote sensing instruments such as TanDEM-X and LiDAR, which provide information sensitive to vertical and horizontal vegetation structure. One-point statistics of TanDEM-X coherence (mean and CV) and the LiDAR canopy height model (mean and CV), together with derived metrics such as vegetation volume and canopy cover, were tested for discrimination between four landcover classes. Jeffries-Matusita (JM) separability was high between forest classes (primary or secondary forest) and non-forest (grassland), while primary and secondary forest were not separable. The study demonstrates the potential of TanDEM-X coherence and LiDAR observations to characterize structural heterogeneity in tropical forest based on one-point statistics, but suggests that improved characterization requires two-point statistical measures.
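The Jeffries-Matusita separability used above has a closed form for Gaussian class distributions. A univariate Python sketch follows; the coherence values are simulated and purely illustrative.

```python
import numpy as np

def jeffries_matusita(x1, x2):
    """JM separability of two classes from 1-D feature samples, assuming
    each class is Gaussian (univariate Bhattacharyya distance)."""
    m1, m2 = x1.mean(), x2.mean()
    v1, v2 = x1.var(ddof=1), x2.var(ddof=1)
    b = 0.125 * (m1 - m2) ** 2 / ((v1 + v2) / 2) \
        + 0.5 * np.log((v1 + v2) / (2 * np.sqrt(v1 * v2)))
    return 2.0 * (1.0 - np.exp(-b))  # ranges 0 (inseparable) to 2

rng = np.random.default_rng(7)
# Hypothetical mean-coherence samples for grassland vs. secondary forest.
grass = rng.normal(0.75, 0.05, 500)
forest = rng.normal(0.45, 0.08, 500)
print("JM(grass, forest) =", jeffries_matusita(grass, forest))
```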
Forest tree species discrimination in western Himalaya using EO-1 Hyperion
NASA Astrophysics Data System (ADS)
George, Rajee; Padalia, Hitendra; Kushwaha, S. P. S.
2014-05-01
The information acquired in the narrow bands of hyperspectral remote sensing data has the potential to capture plant species spectral variability, thereby improving forest tree species mapping. This study assessed the utility of spaceborne EO-1 Hyperion data in the discrimination and classification of broadleaved evergreen and conifer forest tree species in the western Himalaya. The pre-processing of 242 bands of Hyperion data resulted in 160 noise-free and vertical-stripe-corrected reflectance bands. Of these, 29 bands were selected through step-wise exclusion of bands (Wilks' lambda). Spectral Angle Mapper (SAM) and Support Vector Machine (SVM) algorithms were applied to the selected bands to assess their effectiveness in classification. SVM was also applied to broadband data (Landsat TM) to compare the variation in classification accuracy. All six commonly occurring gregarious tree species, viz., white oak, brown oak, chir pine, blue pine, cedar and fir in the western Himalaya could be effectively discriminated. SVM produced a better species classification (overall accuracy 82.27%, kappa statistic 0.79) than SAM (overall accuracy 74.68%, kappa statistic 0.70). It was noticed that the classification accuracy achieved with Hyperion bands was significantly higher than with Landsat TM bands (overall accuracy 69.62%, kappa statistic 0.65). The study demonstrated the potential utility of the narrow spectral bands of Hyperion data in discriminating tree species in hilly terrain.
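The SAM classifier named above reduces to an angle comparison between each pixel spectrum and class reference spectra. A minimal sketch, with simulated spectra standing in for Hyperion reflectance:

```python
import numpy as np

def spectral_angle(pixel, reference):
    """Angle (radians) between a pixel spectrum and a class reference
    spectrum; smaller angles mean a closer spectral match."""
    cos = np.dot(pixel, reference) / (
        np.linalg.norm(pixel) * np.linalg.norm(reference))
    return np.arccos(np.clip(cos, -1.0, 1.0))

def sam_classify(image, references):
    """Assign each pixel to the class whose reference spectrum subtends
    the smallest spectral angle.

    image      : (n_pixels, n_bands) reflectance array
    references : (n_classes, n_bands) class mean spectra
    """
    angles = np.array([[spectral_angle(px, ref) for ref in references]
                       for px in image])
    return angles.argmin(axis=1)

rng = np.random.default_rng(3)
refs = rng.random((6, 29))           # e.g., six tree species, 29 bands
pixels = refs[rng.integers(0, 6, 100)] + rng.normal(0, 0.02, (100, 29))
labels = sam_classify(pixels, refs)
print(np.bincount(labels))
```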
Eisenberg, Dan T A; Kuzawa, Christopher W; Hayes, M Geoffrey
2015-01-01
Telomere length (TL) is commonly measured using quantitative PCR (qPCR). Although easier than the Southern blot terminal restriction fragment (TRF) TL measurement method, one drawback of qPCR is that it introduces greater measurement error and thus reduces the statistical power of analyses. To address a potential source of measurement error, we consider the effect of well position on qPCR TL measurements. qPCR TL data from 3,638 people run on a Bio-Rad iCycler iQ are reanalyzed here. To evaluate measurement validity, correspondence with TRF, age, and between mother and offspring are examined. First, we present evidence for systematic variation in qPCR TL measurements in relation to thermocycler well position. Controlling for these well-position effects consistently improves measurement validity and yields estimated improvements in statistical power equivalent to increasing sample sizes by 16%. We additionally evaluated the linearity of the relationships between telomere and single-copy gene control amplicons and between qPCR and TRF measures. We find that, unlike some previous reports, our data exhibit linear relationships. We introduce the standard error in percent, a superior method for quantifying measurement error as compared with the commonly used coefficient of variation. Using this measure, we find that excluding samples with high measurement error does not improve measurement validity in our study. Future studies using block-based thermocyclers should consider well-position effects. Since additional information can be gleaned from well-position corrections, rerunning analyses of previous results with well-position correction could serve as an independent test of the validity of those results. © 2015 Wiley Periodicals, Inc.
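The well-position correction described is, at heart, a regression of measured T/S ratios on plate row and column, with residuals retained as corrected values. A sketch on simulated plate data follows; the layout, effect sizes, and model form are assumptions for illustration, not the study's exact procedure.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical layout: a 96-well plate with row/column effects added to
# the true T/S ratios, mimicking systematic well-position variation.
rng = np.random.default_rng(5)
n = 96
df = pd.DataFrame({
    "row": np.repeat(list("ABCDEFGH"), 12),
    "col": np.tile(np.arange(1, 13), 8),
    "true_ts": rng.normal(1.0, 0.15, n),
})
row_fx = dict(zip("ABCDEFGH", rng.normal(0, 0.05, 8)))
df["measured_ts"] = (df.true_ts + df.row.map(row_fx)
                     + 0.004 * df.col + rng.normal(0, 0.02, n))

# Regress measurements on well position; the residuals (plus the grand
# mean) are the position-corrected TL measurements.
fit = smf.ols("measured_ts ~ C(row) + col", data=df).fit()
df["corrected_ts"] = fit.resid + df.measured_ts.mean()
print("r(true, raw)       =", round(df.true_ts.corr(df.measured_ts), 3))
print("r(true, corrected) =", round(df.true_ts.corr(df.corrected_ts), 3))
```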
Dua, Anahita; Sudan, Ranjan; Desai, Sapan S
2014-01-01
The American Board of Surgery In-Training Examination (ABSITE) is a predictor of resident performance on the general surgery qualifying examination and plays a role in obtaining competitive fellowships. A learning management system (LMS) permits the delivery of a structured curriculum that appeals to the modern resident owing to its ease of accessibility and all-in-one organization. This study hypothesizes that trainees using a structured, surgeon-directed LMS will achieve improved ABSITE scores compared with those using an unstructured approach to the examination. A multidisciplinary print and digital review course with practice questions, review textbooks, weekly reading assignments, and slide and audio reviews integrated within an online LMS was made available to postgraduate year (PGY)-3 and PGY-4 residents in 2008 and 2009. Surveys were emailed requesting ABSITE scores to compare outcomes in those trainees who used the course with those who used an unstructured approach. Statistical analysis was conducted via descriptive statistics and Pearson chi-square, with p < 0.05 deemed statistically significant. Surveys were sent to 508 trainees. There was an 80% (408) response rate. Residents who used structured approaches in both years achieved the highest scores, followed by those who adopted a structured approach in PGY-4. Residents using an unstructured approach in both years showed no significant improvement. Residents who used a structured LMS performed significantly better than their counterparts who used an unstructured approach. A properly constructed online education curriculum has the potential to improve ABSITE scores. Copyright © 2014 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
Municipal Water Demand: Statistical and Management Issues
NASA Astrophysics Data System (ADS)
Martin, William E.
In the foreword to this volume, Charles W. Howe, general editor of the Westview Press series on water policy and management, states that the goal of this book is to emphasize “the potential for improved water management with reduced economic and environmental costs by utilizing modern methods of demand estimation that take into account user responsiveness to price, conservation measures, and economic-demographic changes.” The authors accomplish their purpose, but the book itself leaves much to be desired.
NASA Astrophysics Data System (ADS)
Li, S.; Rupp, D. E.; Hawkins, L.; Mote, P.; McNeall, D. J.; Sarah, S.; Wallom, D.; Betts, R. A.
2017-12-01
This study investigates the potential to reduce known summer hot/dry biases over the Pacific Northwest in the UK Met Office's atmospheric model (HadAM3P) by simultaneously varying multiple model parameters. The bias-reduction process proceeds through a series of steps: 1) generation of a perturbed physics ensemble (PPE) through the volunteer computing network weather@home; 2) using machine learning to train "cheap" and fast statistical emulators of the climate model to rule out regions of parameter space that lead to model variants that do not satisfy observational constraints, where the observational constraints (e.g., top-of-atmosphere energy flux, magnitude of the annual temperature cycle, summer/winter temperature and precipitation) are introduced sequentially; 3) designing a new PPE by "pre-filtering" using the emulator results. Steps 1) through 3) are repeated until results are considered satisfactory (three times in our case). The process includes a sensitivity analysis to find dominant parameters for various model output metrics, which reduces the number of parameters to be perturbed with each new PPE. Relative to observational uncertainty, we achieve regional improvements without introducing large biases in other parts of the globe. Our results illustrate the potential of using machine learning to train cheap and fast statistical emulators of a climate model, in combination with PPEs, for systematic model improvement.
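Steps 2) and 3) of the loop, training a cheap emulator and pre-filtering candidate parameter settings against an observational constraint, can be sketched as below. The toy response surface, tolerance, and Gaussian process settings are illustrative, not the study's configuration.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

rng = np.random.default_rng(11)

# Stage 1: a small perturbed-physics ensemble (PPE), here faked with a toy
# model mapping 2 parameters to one output metric (e.g., a regional bias).
params = rng.uniform(0, 1, size=(40, 2))
metric = (params[:, 0] - 0.3) ** 2 + 0.5 * params[:, 1] \
         + rng.normal(0, 0.01, 40)

# Stage 2: train a cheap statistical emulator of the model response.
emulator = GaussianProcessRegressor().fit(params, metric)

# Stage 3: pre-filter a dense candidate design, keeping only parameter
# settings the emulator predicts to satisfy the observational constraint.
candidates = rng.uniform(0, 1, size=(10000, 2))
pred, sd = emulator.predict(candidates, return_std=True)
constraint = np.abs(pred - 0.2) < 2 * sd + 0.05   # tolerance is illustrative
next_ppe = candidates[constraint]
print(f"{len(next_ppe)} of {len(candidates)} candidates retained")
```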
Ng, Chin Ting Justin; Newman, Simon; Harris, Simon; Clarke, Susannah; Cobb, Justin
2017-07-01
Patient-specific instrumentation (PSI) has the potential to offer numerous benefits, not least improved resection accuracy, but its potential has not been realised in clinical studies. An explanation may be the focus of such studies on the total knee replacement (TKR), a common procedure with which surgeons are generally very familiar. Consequently, we sought to investigate the potential role of PSI in guiding novice surgeons to perform the more technically demanding and less familiar lateral unicondylar knee replacement (LUKR). Twelve orthopaedic trainees naive to LUKR were instructed to perform the procedure according to a pre-operative plan. The procedures were carried out on synthetic sawbones and were completed once with conventional instrumentation alone and once with the adjunct of PSI, allowing a comparison of the plan adherence achieved by the two sets of instrumentation. There was a tendency for PSI to demonstrate improved plan adherence, though a statistically significant improvement was only seen in the compound rotational error of the femoral implant (p = 0.004). PSI was, however, able to produce narrower standard deviations in the mean translational displacement of the femoral implant and also in the mean rotational displacement of both implants, suggesting a higher degree of precision. Our study provides some evidence that PSI can improve the ability of novice surgeons to replicate a pre-operative plan, but our results suggest the need for larger-scale clinical studies to establish the role of PSI in this procedure.
Ferrada, Evandro; Vergara, Ismael A; Melo, Francisco
2007-01-01
The correct discrimination between native and near-native protein conformations is essential for achieving accurate computer-based protein structure prediction. However, this has proven to be a difficult task, since currently available physical energy functions, empirical potentials and statistical scoring functions are still limited in achieving this goal consistently. In this work, we assess and compare the ability of different full-atom knowledge-based potentials to discriminate between native protein structures and near-native protein conformations generated by comparative modeling. Using a benchmark of 152 near-native protein models and their corresponding native structures that encompass several different folds, we demonstrate that the incorporation of close non-bonded pairwise atom terms improves the discriminating power of the empirical potentials. Since the direct and unbiased derivation of close non-bonded terms from current experimental data is not possible, we obtained and used those terms from the corresponding pseudo-energy functions of a non-local knowledge-based potential. It is shown that this methodology significantly improves the discrimination between native and near-native protein conformations, suggesting that a proper description of close non-bonded terms is important to achieve a more complete and accurate description of native protein conformations. Some external knowledge-based energy functions that are widely used in model assessment performed poorly, indicating that the benchmark of models and the specific discrimination task tested in this work constitute a difficult challenge.
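Knowledge-based potentials of this general kind are typically derived by an inverse-Boltzmann comparison of observed and reference distance distributions. The following sketch shows that generic construction for one atom-type pair; it is not the authors' exact formulation, and the distance data are simulated.

```python
import numpy as np

def pairwise_pseudo_energy(obs_dists, ref_dists, bins):
    """Distance-dependent pseudo-energy for one atom-type pair, derived by
    the inverse-Boltzmann relation from observed vs. reference distance
    distributions (a generic knowledge-based potential)."""
    p_obs, _ = np.histogram(obs_dists, bins=bins, density=True)
    p_ref, _ = np.histogram(ref_dists, bins=bins, density=True)
    eps = 1e-9  # regularize empty bins
    return -np.log((p_obs + eps) / (p_ref + eps))

rng = np.random.default_rng(2)
# Toy data: native-like contacts cluster near 4 Å; the reference state is
# broad. The close non-bonded bins are where the two differ most.
native = rng.normal(4.0, 0.5, 20000)
reference = rng.uniform(2.0, 10.0, 20000)
bins = np.linspace(2.0, 10.0, 33)
energy = pairwise_pseudo_energy(native, reference, bins)
print(energy[:8].round(2))  # energies for the closest distance bins
```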
2013-01-01
Background We describe the setup of a neonatal quality improvement tool and list which peer-reviewed requirements it fulfils and which it does not. We report on the effects observed so far, how the units can identify quality improvement potential, and how they can measure the effect of changes made to improve quality. Methods Application of a prospective longitudinal national cohort data collection that uses algorithms to ensure high data quality (i.e. checks for completeness, plausibility and reliability) and to perform data imaging (Plsek's p-charts and standardized mortality or morbidity ratio (SMR) charts). The collected data allow monitoring of a study collective of very low birth-weight infants born from 2009 to 2011 by applying a quality cycle following the steps 'guideline – perform – falsify – reform'. Results 2025 VLBW live-births from 2009 to 2011, representing 96.1% of all VLBW live-births in Switzerland, display a similar mortality rate but better morbidity rates when compared with other networks. Data quality in general is high but subject to improvement in some units. Seven measurements display quality improvement potential in individual units. The methods used fulfil several international recommendations. Conclusions The Quality Cycle of the Swiss Neonatal Network is a helpful instrument to monitor and gradually help improve the quality of care in a region with high quality standards and low statistical discrimination capacity. PMID:24074151
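The SMR charts mentioned in the methods compare observed with expected event counts per unit. A minimal sketch using the standard exact Poisson confidence interval; the counts below are hypothetical.

```python
from scipy.stats import chi2

def smr_with_ci(observed, expected, alpha=0.05):
    """Standardized mortality/morbidity ratio with an exact Poisson CI
    (the usual chi-square formulation)."""
    smr = observed / expected
    lo = chi2.ppf(alpha / 2, 2 * observed) / (2 * expected) if observed > 0 else 0.0
    hi = chi2.ppf(1 - alpha / 2, 2 * (observed + 1)) / (2 * expected)
    return smr, lo, hi

# Hypothetical unit: 9 cases of a morbidity observed where network-wide
# rates predict 5.4; the CI shows whether the excess is a signal.
print(smr_with_ci(observed=9, expected=5.4))
```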
Reimold, Matthias; Slifstein, Mark; Heinz, Andreas; Mueller-Schauenburg, Wolfgang; Bares, Roland
2006-06-01
Voxelwise statistical analysis has become popular in explorative functional brain mapping with fMRI or PET. Usually, results are presented as voxelwise levels of significance (t-maps), and for clusters that survive correction for multiple testing the coordinates of the maximum t-value are reported. Before calculating a voxelwise statistical test, spatial smoothing is required to achieve reasonable statistical power. Little attention is given to the fact that smoothing has a nonlinear effect on the voxel variances and thus on the local characteristics of a t-map, which becomes most evident after smoothing over different types of tissue. We investigated the related artifacts, for example, white matter peaks whose positions depend on the relative variance (variance over contrast) of the surrounding regions, and suggest improving spatial precision with 'masked contrast images': color codes are assigned to the voxelwise contrast, and significant clusters (e.g., detected with statistical parametric mapping, SPM) are enlarged by including contiguous voxels with a contrast above the mean contrast in the original cluster, provided they satisfy P < 0.05. The potential benefit is demonstrated with simulations and data from a [11C]carfentanil PET study. We conclude that spatial smoothing may lead to critical, sometimes counterintuitive artifacts in t-maps, especially in subcortical brain regions. If significant clusters are detected, for example with SPM, the suggested method is one way to improve spatial precision and may give the investigator a more direct sense of the underlying data. Its simplicity and the fact that no further assumptions are needed make it a useful complement to standard methods of statistical mapping.
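The cluster-growing step, adding contiguous voxels whose contrast exceeds the original cluster mean and that pass P < 0.05, can be sketched on synthetic data as below. Shapes, thresholds, and the toy p-map are illustrative assumptions.

```python
import numpy as np
from scipy import ndimage

def enlarge_cluster(contrast, p_map, seed_mask, p_thresh=0.05):
    """Grow a significant cluster by adding contiguous voxels whose
    contrast exceeds the mean contrast of the original cluster and whose
    voxelwise p-value is below p_thresh."""
    mean_c = contrast[seed_mask].mean()
    candidate = (contrast > mean_c) & (p_map < p_thresh) | seed_mask
    labeled, _ = ndimage.label(candidate)
    # Keep only connected components that touch the seed cluster.
    keep = np.unique(labeled[seed_mask])
    return np.isin(labeled, keep[keep > 0])

rng = np.random.default_rng(8)
contrast = rng.normal(0, 1, (32, 32, 32))
contrast[12:20, 12:20, 12:20] += 2.0           # a true activation blob
p_map = rng.uniform(0, 1, contrast.shape)
p_map[contrast > 1.5] = 0.01                   # crude voxelwise significance
seed = np.zeros_like(p_map, dtype=bool)
seed[14:18, 14:18, 14:18] = True               # detected cluster core
print("voxels:", seed.sum(), "->", enlarge_cluster(contrast, p_map, seed).sum())
```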
2015-01-01
The conversion efficiency (εc) of absorbed radiation into biomass (MJ of dry matter per MJ of absorbed photosynthetically active radiation) is a component of yield potential that has been estimated at less than half the theoretical maximum. Various strategies have been proposed to improve εc, but a statistical analysis to establish baseline εc levels across different crop functional types is lacking. Data from 164 published εc studies conducted in relatively unstressed growth conditions were used to determine the means, greatest contributors to variation, and genetic trends in εc across important food and biofuel crop species. εc was greatest in biofuel crops (0.049–0.066), followed by C4 food crops (0.046–0.049), C3 nonlegumes (0.036–0.041), and finally C3 legumes (0.028–0.035). Despite confining our analysis to relatively unstressed growth conditions, total incident solar radiation and average growing season temperature most often accounted for the largest portion of εc variability. Genetic improvements in εc, when present, were less than 0.7% per year, revealing the unrealized potential of improving εc as a promising contributing strategy to meet projected future agricultural demand. PMID:25829463
Adams, Debra; Hine, Victoria; Bucior, Helen; Foster, Wendy; Mukombe, Nyarayi; Ryan, Jane; Smirthwaite, Sandra; Winfield, Jodie
2018-03-01
In response to the ongoing infection prevention (IP) challenges in England, a 90-day quality improvement (QI) collaborative programme was developed. The paper discusses the approach, benefits, challenges and evaluation of the programme. The objective of the collaborative was to develop new approaches to enable sustainable and effective IP. Six trusts in the region participated in the collaborative, and each defined its own bespoke IP focus. There was no expectation that statistically significant measurable improvements would be identified during the short time frame. The experiences of the participants were sought both during the programme, to facilitate its constant review, and at the end of the programme, to evaluate its effectiveness. The feedback focused on achievements, barriers to change and benefits of participating in a QI collaborative. To measure the potential success of the projects, participants completed the Model for Understanding Success in Quality (MUSIQ) framework (Kaplan et al., 2012). Since each trust's IP focus was bespoke, commonalities of success were not evaluated. Participants identified a positive outcome from their QI interventions. The MUSIQ score identified that the projects had the potential for success. The feedback from the participants demonstrated that the programme is worthy of further development.
NASA Technical Reports Server (NTRS)
Kaplan, Michael L.; Lin, Yuh-Lang
2005-01-01
The purpose of the research was to develop and test improved hazard algorithms that could result in the development of sensors that are better able to anticipate potentially severe atmospheric turbulence, which affects aircraft safety. The research focused on employing numerical simulation models to develop improved algorithms for the prediction of aviation turbulence. This involved producing both research simulations and real-time simulations of environments predisposed to moderate and severe aviation turbulence. The research resulted in the following fundamental advancements toward the aforementioned goal: 1) very high resolution simulations of turbulent environments suggested how predictive hazard indices could be improved, resulting in a candidate hazard index with the potential to outperform existing operational indices; 2) a real-time turbulence hazard numerical modeling system was improved by correcting deficiencies in its simulation of moist convection; and 3) the same real-time predictive system was tested by running the code twice daily, with the hazard prediction indices updated and improved. Additionally, a simple validation study was undertaken to determine how well a real-time hazard predictive index performed when compared with commercial pilot observations of aviation turbulence. Simple statistical analyses performed in this validation study indicated potential skill in employing the hazard prediction index to predict regions of varying intensities of aviation turbulence. Data sets from a research numerical model were provided to NASA for use in a large eddy simulation numerical model. A NASA contractor report and several refereed journal articles were prepared and submitted for publication during the course of this research.
Ratio index variables or ANCOVA? Fisher's cats revisited.
Tu, Yu-Kang; Law, Graham R; Ellison, George T H; Gilthorpe, Mark S
2010-01-01
Over 60 years ago Ronald Fisher demonstrated a number of potential pitfalls with statistical analyses using ratio variables. Nonetheless, these pitfalls are largely overlooked in contemporary clinical and epidemiological research, which routinely uses ratio variables in statistical analyses. This article aims to demonstrate how very different findings can be generated as a result of less than perfect correlations among the data used to generate ratio variables. These imperfect correlations result from measurement error and random biological variation. While the former can often be reduced by improvements in measurement, random biological variation is difficult to estimate and eliminate in observational studies. Moreover, wherever the underlying biological relationships among epidemiological variables are unclear, and hence the choice of statistical model is also unclear, the different findings generated by different analytical strategies can lead to contradictory conclusions. Caution is therefore required when interpreting analyses of ratio variables whenever the underlying biological relationships among the variables involved are unspecified or unclear. (c) 2009 John Wiley & Sons, Ltd.
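Fisher's pitfall is easy to reproduce: two quantities that are truly unrelated become correlated once both are divided by the same noisy denominator. A minimal simulation, with all distributions hypothetical:

```python
import numpy as np

rng = np.random.default_rng(13)
n = 1000

# Fisher's setting: two quantities that are truly independent of each
# other, each expressed as a ratio over the same noisy denominator.
numer_a = rng.normal(10.0, 1.0, n)
numer_b = rng.normal(12.0, 1.5, n)
denom = rng.normal(100.0, 15.0, n)   # e.g., body weight

print("r(raw numerators) =", np.corrcoef(numer_a, numer_b)[0, 1].round(3))
print("r(ratios)         =",
      np.corrcoef(numer_a / denom, numer_b / denom)[0, 1].round(3))
```

The shared denominator alone manufactures a sizeable spurious correlation between the ratios, which is exactly why interpretation requires knowing the underlying biological relationships.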
Wavelet methodology to improve single unit isolation in primary motor cortex cells
Ortiz-Rosario, Alexis; Adeli, Hojjat; Buford, John A.
2016-01-01
The proper isolation of action potentials recorded extracellularly from neural tissue is an active area of research in the fields of neuroscience and biomedical signal processing. This paper presents an isolation methodology for neural recordings using the wavelet transform (WT), a statistical thresholding scheme, and the principal component analysis (PCA) algorithm. The effectiveness of five different mother wavelets was investigated: biorthogonal, Daubechies, discrete Meyer, symmetric, and Coifman, along with three different wavelet coefficient thresholding schemes: fixed form threshold, Stein's unbiased estimate of risk, and minimax; and two different thresholding rules: soft and hard thresholding. The signal quality was evaluated using three different statistical measures: mean squared error, root mean square, and signal-to-noise ratio. The clustering quality was evaluated using two different statistical measures: isolation distance and L-ratio. This research shows that the selection of the mother wavelet has a strong influence on the clustering and isolation of single unit neural activity, with the Daubechies 4 wavelet and minimax thresholding scheme performing the best. PMID:25794461
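The core of such a pipeline, wavelet denoising with a fixed-form threshold followed by PCA on spike snippets, can be sketched with PyWavelets and scikit-learn. The trace is synthetic, and the snippet windowing is simplified relative to real spike extraction.

```python
import numpy as np
import pywt
from sklearn.decomposition import PCA

def wavelet_denoise(signal, wavelet="db4", level=4):
    """Denoise a trace with the discrete wavelet transform, a fixed-form
    ('universal') threshold, and soft thresholding."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745   # robust noise estimate
    thresh = sigma * np.sqrt(2 * np.log(len(signal)))
    coeffs = [coeffs[0]] + [pywt.threshold(c, thresh, mode="soft")
                            for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(signal)]

rng = np.random.default_rng(4)
t = np.linspace(0, 1, 2048)
trace = np.exp(-((t - 0.5) * 200) ** 2) + rng.normal(0, 0.2, t.size)  # one "spike"
clean = wavelet_denoise(trace)

# After denoising, spike snippets would be windowed out and projected onto
# their first principal components before clustering.
snippets = np.array([clean[i:i + 64] for i in range(0, 1984, 64)])
scores = PCA(n_components=2).fit_transform(snippets)
print(scores.shape)
```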
The impact of protocol on nurses' role stress: a longitudinal perspective.
Dodd-McCue, Diane; Tartaglia, Alexander; Veazey, Kenneth W; Streetman, Pamela S
2005-04-01
The study examined the impact of a protocol directed at increasing organ donation on the role stress and work attitudes of critical care nurses involved in potential organ donation cases. The research examined whether the protocol could positively affect nurses' perceptions of role stress, and if so, could the work environment improvements be sustained over time. The Family Communication Coordinator (FCC) protocol promotes effective communication during potential organ donation cases using a multidisciplinary team approach. Previous research found it associated with improved donation outcomes and with improved perceptions of role stress by critical care nurses. However, the previous study lacked methodological rigor necessary to determine causality and sustainability over time. The study used a quasi-experimental prospective longitudinal design. The sample included critical care nurses who had experience with potential organ donation cases with the protocol. Survey data were collected at 4 points over 2 years. Surveys used previously validated and reliable measures of role stress (role ambiguity, role conflict, role overload) and work attitudes (commitment, satisfaction). Interviews supplemented these data. The nurses' perceptions of role stress associated with potential organ donation cases dramatically dropped after the protocol was implemented. All measures of role stress, particularly role ambiguity and role conflict, showed statistically significant and sustained improvement. Nurses' professional, unit, and hospital commitment and satisfaction reflect an increasingly positive workplace. The results demonstrate that the FCC protocol positively influenced the workplace through its impact on role stress over the first 2 years following its implementation. The findings suggest that similar protocols may be appropriate in improving the critical care environment by reducing the stress and uncertainty of professionals involved in other end-of-life situations. However, the most striking implication relates to the reality of the workplace: meeting the goals of improved patient care outcomes and those of improving the healthcare work environment are not mutually exclusive and may be mutually essential.
Assessment of MSFCs Process for the Development and Activation of Space Act Agreements
NASA Technical Reports Server (NTRS)
Daugherty, Rachel A.
2014-01-01
A Space Act Agreement (SAA) is a contractual vehicle that NASA utilizes to form partnerships with non-NASA entities to stimulate cutting-edge innovation within the science and technology communities while concurrently supporting the NASA missions. SAAs are similar to traditional contracts in that they involve the commitment of Agency resources but allow more flexibility and are more cost effective to implement than traditional contracts. Consequently, the use of SAAs to develop partnerships has greatly increased over the past several years. To facilitate this influx of SAAs, Marshall Space Flight Center (MSFC) developed a process during a kaizen event to streamline and improve the quality of SAAs developed at the Center level. This study assessed the current SAA process to determine if improvements could be implemented to increase productivity, decrease time to activation, and improve the quality of deliverables. Using a combination of direct procedural observation, personnel interviews, and statistical analysis, elements of the process in need of remediation were identified and potential solutions developed. The findings focus primarily on the difficulties surrounding tracking and enforcing process adherence and communication issues among stakeholders. Potential solutions include utilizing customer relationship management (CRM) software to facilitate process coordination and co-locating or potentially merging the two separate organizations involved in SAA development and activation at MSFC.
NASA Astrophysics Data System (ADS)
Herman, J. D.; Steinschneider, S.; Nayak, M. A.
2017-12-01
Short-term weather forecasts are not codified into the operating policies of federal, multi-purpose reservoirs, despite their potential to improve service provision. This is particularly true for facilities that provide flood protection and water supply, since the potential flood damages are often too severe to accept the risk of inaccurate forecasts. Instead, operators must maintain empty storage capacity to mitigate flood risk, even if the system is currently in drought, as occurred in California from 2012-2016. This study investigates the potential for forecast-informed operating rules to improve water supply efficiency while maintaining flood protection, combining state-of-the-art weather hindcasts with a novel tree-based policy optimization framework. We hypothesize that forecasts need only accurately predict the occurrence of a storm, rather than its intensity, to be effective in regions like California where wintertime, synoptic-scale storms dominate the flood regime. We also investigate the potential for downstream groundwater injection to improve the utility of forecasts. These hypotheses are tested in a case study of Folsom Reservoir on the American River. Because available weather hindcasts are relatively short (10-20 years), we propose a new statistical framework to develop synthetic forecasts to assess the risk associated with inaccurate forecasts. The efficiency of operating policies is tested across a range of scenarios that include varying forecast skill and additional groundwater pumping capacity. Results suggest that the combined use of groundwater storage and short-term weather forecasts can substantially improve the tradeoff between water supply and flood control objectives in large, multi-purpose reservoirs in California.
Bringing modeling to the masses: A web based system to predict potential species distributions
Graham, Jim; Newman, Greg; Kumar, Sunil; Jarnevich, Catherine S.; Young, Nick; Crall, Alycia W.; Stohlgren, Thomas J.; Evangelista, Paul
2010-01-01
Predicting current and potential species distributions and abundance is critical for managing invasive species, preserving threatened and endangered species, and conserving native species and habitats. Accurate predictive models are needed at local, regional, and national scales to guide field surveys, improve monitoring, and set priorities for conservation and restoration. Modeling capabilities, however, are often limited by access to software and environmental data required for predictions. To address these needs, we built a comprehensive web-based system that: (1) maintains a large database of field data; (2) provides access to field data and a wealth of environmental data; (3) accesses values in rasters representing environmental characteristics; (4) runs statistical spatial models; and (5) creates maps that predict the potential species distribution. The system is available online at www.niiss.org, and provides web-based tools for stakeholders to create potential species distribution models and maps under current and future climate scenarios.
Effects of Inductively Coupled Plasma Hydrogen on Long-Wavelength Infrared HgCdTe Photodiodes
NASA Astrophysics Data System (ADS)
Boieriu, P.; Buurma, C.; Bommena, R.; Blissett, C.; Grein, C.; Sivananthan, S.
2013-12-01
Bulk passivation of semiconductors with hydrogen continues to be investigated for its potential to improve device performance. In this work, hydrogen-only inductively coupled plasma (ICP) was used to incorporate hydrogen into long-wavelength infrared HgCdTe photodiodes grown by molecular-beam epitaxy. Fully fabricated devices exposed to ICP showed statistically significant increases in zero-bias impedance values, improved uniformity, and decreased dark currents. HgCdTe photodiodes on Si substrates passivated with amorphous ZnS exhibited reductions in shunt currents, whereas devices on CdZnTe substrates passivated with polycrystalline CdTe exhibited reduced surface leakage, suggesting that hydrogen passivates defects in bulk HgCdTe and in CdTe.
Strategies for improving approximate Bayesian computation tests for synchronous diversification.
Overcast, Isaac; Bagley, Justin C; Hickerson, Michael J
2017-08-24
Estimating the variability in isolation times across co-distributed taxon pairs that may have experienced the same allopatric isolating mechanism is a core goal of comparative phylogeography. The use of hierarchical Approximate Bayesian Computation (ABC) and coalescent models to infer temporal dynamics of lineage co-diversification has been a contentious topic in recent years. Key issues that remain unresolved include the choice of an appropriate prior on the number of co-divergence events (Ψ), as well as the optimal strategies for data summarization. Through simulation-based cross validation we explore the impact of the strategy for sorting summary statistics and the choice of prior on Ψ on the estimation of co-divergence variability. We also introduce a new setting (β) that can potentially improve estimation of Ψ by enforcing a minimal temporal difference between pulses of co-divergence. We apply this new method to three empirical datasets: one dataset each of co-distributed taxon pairs of Panamanian frogs and freshwater fishes, and a large set of Neotropical butterfly sister-taxon pairs. We demonstrate that the choice of prior on Ψ has little impact on inference, but that sorting summary statistics yields substantially more reliable estimates of co-divergence variability despite violations of assumptions about exchangeability. We find the implementation of β improves estimation of Ψ, with improvement being most dramatic given larger numbers of taxon pairs. We find equivocal support for synchronous co-divergence for both of the Panamanian groups, but we find considerable support for asynchronous divergence among the Neotropical butterflies. Our simulation experiments demonstrate that using sorted summary statistics results in improved estimates of the variability in divergence times, whereas the choice of hyperprior on Ψ has negligible effect. Additionally, we demonstrate that estimating the number of pulses of co-divergence across co-distributed taxon-pairs is improved by applying a flexible buffering regime over divergence times. This improves the correlation between Ψ and the true variability in isolation times and allows for more meaningful interpretation of this hyperparameter. This will allow for more accurate identification of the number of temporally distinct pulses of co-divergence that generated the diversification pattern of a given regional assemblage of sister-taxon-pairs.
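The role of sorted summary statistics in ABC rejection is straightforward to illustrate: sorting restores exchangeability across taxon pairs before distances are computed. The sketch below uses divergence times directly as summaries, a strong simplification of the coalescent machinery, with the prior, tolerance, and toy simulator all illustrative.

```python
import numpy as np

rng = np.random.default_rng(21)
n_pairs, n_sims = 8, 20000

def simulate_divergence_times(psi):
    """Draw divergence times for n_pairs taxon pairs given psi
    co-divergence pulses (a toy stand-in for coalescent simulation)."""
    pulses = rng.uniform(0.5, 5.0, psi)
    return rng.choice(pulses, n_pairs) + rng.normal(0, 0.05, n_pairs)

# "Observed" data generated with two true pulses.
observed = np.sort(simulate_divergence_times(2))

# ABC rejection: simulate under a prior on psi, compare SORTED summary
# statistics, and accept the closest simulations.
psi_prior = rng.integers(1, n_pairs + 1, n_sims)
dists = np.empty(n_sims)
for i, psi in enumerate(psi_prior):
    sims = np.sort(simulate_divergence_times(psi))
    dists[i] = np.linalg.norm(sims - observed)

accepted = psi_prior[dists <= np.quantile(dists, 0.01)]
print("posterior mode of psi:", np.bincount(accepted).argmax())
```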
From dinner table to digital tablet: technology's potential for reducing loneliness in older adults.
McCausland, Lauren; Falk, Nancy L
2012-05-01
Statistics estimate that close to 35% of our nation's older individuals experience loneliness. Feelings of loneliness have been associated with physical and psychological illness in several research studies. As technology advances and connectivity through tablet devices becomes increasingly user friendly, the potential for tablets to reduce loneliness among older adults is substantial. This article discusses the issue of loneliness among older adults and suggests tablet technology as a tool to improve connectivity and reduce loneliness in the older adult population. As nurses, we have the opportunity to help enhance the quality of life for our clients. Tablet technology offers a new option that should be fully explored. Copyright 2012, SLACK Incorporated.
Protein mass spectra data analysis for clinical biomarker discovery: a global review.
Roy, Pascal; Truntzer, Caroline; Maucort-Boulch, Delphine; Jouve, Thomas; Molinari, Nicolas
2011-03-01
The identification of new diagnostic or prognostic biomarkers is one of the main aims of clinical cancer research. In recent years there has been a growing interest in using high-throughput technologies for the detection of such biomarkers. In particular, mass spectrometry appears as an exciting tool with great potential. However, to extract any benefit from the massive potential of clinical proteomic studies, appropriate methods, improvements, and validation are required. To better understand the key statistical points involved in such studies, this review presents the main steps of protein mass spectra data analysis, from pre-processing of the data to the identification and validation of biomarkers.
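Among the pre-processing steps such reviews cover, smoothing, baseline correction, and peak detection are readily sketched with SciPy. The synthetic spectrum, window sizes, and thresholds below are illustrative only.

```python
import numpy as np
from scipy.signal import find_peaks, savgol_filter

rng = np.random.default_rng(17)
mz = np.linspace(2000, 20000, 5000)
spectrum = (np.exp(-(mz - 9000) ** 2 / 2e5)
            + 0.5 * np.exp(-(mz - 4000) ** 2 / 1e5)
            + 5e-5 * (20000 - mz)          # sloping chemical baseline
            + rng.normal(0, 0.02, mz.size))

smoothed = savgol_filter(spectrum, window_length=51, polyorder=3)
# Crude baseline estimate: a rolling minimum, subtracted before peak picking.
k = 250
baseline = np.array([smoothed[max(0, i - k):i + k].min()
                     for i in range(mz.size)])
corrected = smoothed - baseline
peaks, _ = find_peaks(corrected, height=0.1, distance=100)
print("peak m/z:", mz[peaks].round(0))
```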
Euler, André; Solomon, Justin; Marin, Daniele; Nelson, Rendon C; Samei, Ehsan
2018-06-01
The purpose of this study was to assess image noise, spatial resolution, lesion detectability, and the dose reduction potential of a proprietary third-generation adaptive statistical iterative reconstruction (ASIR-V) technique. A phantom representing five different body sizes (12-37 cm) and a contrast-detail phantom containing lesions of five low-contrast levels (5-20 HU) and three sizes (2-6 mm) were deployed. Both phantoms were scanned on a 256-MDCT scanner at six different radiation doses (1.25-10 mGy). Images were reconstructed with filtered back projection (FBP), ASIR-V with 50% blending with FBP (ASIR-V 50%), and ASIR-V without blending (ASIR-V 100%). In the first phantom, noise properties were assessed by noise power spectrum analysis. Spatial resolution properties were measured by use of task transfer functions for objects of different contrasts. Noise magnitude, noise texture, and resolution were compared between the three groups. In the second phantom, low-contrast detectability was assessed by nine human readers independently for each condition. The dose reduction potential of ASIR-V was estimated on the basis of a generalized linear statistical regression model. On average, image noise was reduced 37.3% with ASIR-V 50% and 71.5% with ASIR-V 100% compared with FBP. ASIR-V shifted the noise power spectrum toward lower frequencies compared with FBP. The spatial resolution of ASIR-V was equivalent or slightly superior to that of FBP, except for the low-contrast object, which had lower resolution. Lesion detection significantly increased with both ASIR-V levels (p = 0.001), with an estimated radiation dose reduction potential of 15% ± 5% (SD) for ASIR-V 50% and 31% ± 9% for ASIR-V 100%. ASIR-V reduced image noise and improved lesion detection compared with FBP and had potential for radiation dose reduction while preserving low-contrast detectability.
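The noise power spectrum analysis used for the first phantom follows a standard ensemble-average definition, sketched below on simulated uniform-phantom regions of interest; the pixel size and noise levels are hypothetical.

```python
import numpy as np

def noise_power_spectrum(rois, pixel_mm=0.5):
    """2-D noise power spectrum from an ensemble of uniform-phantom ROIs
    (standard ensemble-average definition)."""
    rois = np.asarray(rois, dtype=float)
    n, ny, nx = rois.shape
    detrended = rois - rois.mean(axis=(1, 2), keepdims=True)
    dft = np.fft.fft2(detrended)
    nps = (np.abs(dft) ** 2).mean(axis=0) * pixel_mm ** 2 / (nx * ny)
    return np.fft.fftshift(nps)

rng = np.random.default_rng(6)
rois = rng.normal(0, 10, (64, 128, 128))      # hypothetical FBP noise ROIs
nps = noise_power_spectrum(rois)
# Iterative reconstruction typically lowers total noise power and shifts
# the spectrum toward low frequencies; comparing nps.sum() and the radial
# profile between FBP and ASIR-V ROIs quantifies both effects.
print("noise variance from NPS:", nps.sum() / (0.5 ** 2 * 128 * 128))
```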
Paré, Pierre; Lee, Joanna; Hawes, Ian A
2010-03-01
To determine whether strategies to counsel and empower patients with heartburn-predominant dyspepsia could improve health-related quality of life. Using a cluster randomized, parallel group, multicentre design, nine centres were assigned to provide either basic or comprehensive counselling to patients (age range 18 to 50 years) presenting with heartburn-predominant upper gastrointestinal symptoms, who would be considered for drug therapy without further investigation. Patients were treated for four weeks with esomeprazole 40 mg once daily, followed by six months of treatment that was at the physician's discretion. The primary end point was the baseline change in Quality of Life in Reflux and Dyspepsia (QOLRAD) questionnaire score. A total of 135 patients from nine centres were included in the intention-to-treat analysis. There was a statistically significant baseline improvement in all domains of the QOLRAD questionnaire in both study arms at four and seven months (P<0.0001). After four months, the overall mean change in QOLRAD score appeared greater in the comprehensive counselling group than in the basic counselling group (1.77 versus 1.47, respectively); however, this difference was not statistically significant (P=0.07). After seven months, the overall mean baseline change in QOLRAD score between the comprehensive and basic counselling groups was not statistically significant (1.69 versus 1.56, respectively; P=0.63). A standardized, comprehensive counselling intervention showed a positive initial trend in improving quality of life in patients with heartburn-predominant uninvestigated dyspepsia. Further investigation is needed to confirm the potential benefits of providing patients with comprehensive counselling regarding disease management.
Hitting Is Contagious in Baseball: Evidence from Long Hitting Streaks
Bock, Joel R.; Maewal, Akhilesh; Gough, David A.
2012-01-01
Data analysis is used to test the hypothesis that “hitting is contagious”. A statistical model is described to study the effect of a hot hitter upon his teammates’ batting during a consecutive-game hitting streak. Box score data were compiled for entire seasons comprising qualifying hitting streaks. Treatment and control sample groups were constructed from core lineups of players on the streaking batter’s team. The percentile-method bootstrap was used to calculate confidence intervals for statistics representing differences in the mean distributions of two batting statistics between groups. Batters in the treatment group (hot streak active) showed statistically significant improvements in hitting performance, as compared against the control: mean batting average for the treatment group was higher during hot streaks, and the batting heat index introduced here also increased. For each performance statistic, the null hypothesis was rejected. We conclude that the evidence suggests the potential existence of a “statistical contagion effect”. Psychological mechanisms essential to the empirical results are suggested, as several studies from the scientific literature lend credence to contagious phenomena in sports. Causal inference from these results is difficult, but we suggest and discuss several latent variables that may contribute to the observed results, and offer possible directions for future research. PMID:23251507
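The percentile-method bootstrap named above is simple to sketch; the batting averages below are simulated, not the study's data.

```python
import numpy as np

def bootstrap_mean_diff_ci(treatment, control, n_boot=10000, alpha=0.05):
    """Percentile-method bootstrap CI for the difference in group means."""
    rng = np.random.default_rng(0)
    diffs = np.empty(n_boot)
    for i in range(n_boot):
        t = rng.choice(treatment, size=len(treatment), replace=True)
        c = rng.choice(control, size=len(control), replace=True)
        diffs[i] = t.mean() - c.mean()
    return np.quantile(diffs, [alpha / 2, 1 - alpha / 2])

rng = np.random.default_rng(9)
# Hypothetical teammate batting averages with and without an active streak.
streak_on = rng.normal(0.268, 0.030, 400)
streak_off = rng.normal(0.260, 0.030, 400)
lo, hi = bootstrap_mean_diff_ci(streak_on, streak_off)
print(f"95% CI for mean difference: [{lo:.4f}, {hi:.4f}]")
```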
An Improved Incremental Learning Approach for KPI Prognosis of Dynamic Fuel Cell System.
Yin, Shen; Xie, Xiaochen; Lam, James; Cheung, Kie Chung; Gao, Huijun
2016-12-01
The key performance indicator (KPI) has an important practical value with respect to the product quality and economic benefits for modern industry. To cope with the KPI prognosis issue under nonlinear conditions, this paper presents an improved incremental learning approach based on available process measurements. The proposed approach takes advantage of the algorithm overlapping of locally weighted projection regression (LWPR) and partial least squares (PLS), implementing the PLS-based prognosis in each locally linear model produced by the incremental learning process of LWPR. The global prognosis results including KPI prediction and process monitoring are obtained from the corresponding normalized weighted means of all the local models. The statistical indicators for prognosis are enhanced as well by the design of novel KPI-related and KPI-unrelated statistics with suitable control limits for non-Gaussian data. For application-oriented purpose, the process measurements from real datasets of a proton exchange membrane fuel cell system are employed to demonstrate the effectiveness of KPI prognosis. The proposed approach is finally extended to a long-term voltage prediction for potential reference of further fuel cell applications.
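The locally weighted structure of the approach, local models with receptive-field weights combined as a normalized weighted mean, can be sketched as follows. Ordinary least squares stands in for the PLS step inside each local model, and all data and settings are illustrative.

```python
import numpy as np

class LocalLinearEnsemble:
    """Minimal sketch of LWPR-style prediction: several local linear
    models, each with a Gaussian receptive field; the global prediction
    is the normalized weighted mean of the local predictions."""

    def __init__(self, centers, width=1.0):
        self.centers = centers
        self.width = width
        self.coefs = []

    def fit(self, X, y):
        Xa = np.column_stack([np.ones(len(X)), X])
        for c in self.centers:
            w = np.exp(-np.sum((X - c) ** 2, axis=1) / (2 * self.width ** 2))
            sw = np.sqrt(w)
            beta = np.linalg.lstsq(Xa * sw[:, None], y * sw, rcond=None)[0]
            self.coefs.append(beta)
        return self

    def predict(self, X):
        Xa = np.column_stack([np.ones(len(X)), X])
        w = np.array([np.exp(-np.sum((X - c) ** 2, axis=1)
                             / (2 * self.width ** 2))
                      for c in self.centers])          # (n_models, n_samples)
        preds = np.array([Xa @ b for b in self.coefs])  # local predictions
        return (w * preds).sum(axis=0) / w.sum(axis=0)  # normalized mean

rng = np.random.default_rng(14)
X = rng.uniform(-3, 3, (300, 1))
y = np.sin(X[:, 0]) + rng.normal(0, 0.05, 300)
model = LocalLinearEnsemble(centers=np.linspace(-3, 3, 7)[:, None]).fit(X, y)
print(np.abs(model.predict(X) - y).mean())
```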
Vestibular rehabilitation using video gaming in adults with dizziness: a pilot study.
Phillips, J S; Fitzgerald, J; Phillis, D; Underwood, A; Nunney, I; Bath, A
2018-03-01
To determine the effectiveness of vestibular rehabilitation using the Wii Fit balance platform, in adults with dizziness. A single-site prospective clinical trial was conducted in a university hospital in the UK. Forty patients with dizziness, who would normally be candidates for vestibular rehabilitation, were identified and considered as potential participants. Participants were randomised into either the treatment group (the Wii Fit group) or the control group (standard customised vestibular rehabilitation protocol). Participants were assessed over a 16-week period using several balance and quality of life questionnaires. Both exercise regimes resulted in a reduction of dizziness and an improvement in quality of life scores over time, but no statistically significant difference between the two interventions was identified. This pilot study demonstrated that use of the Wii Fit balance platform resulted in a statistically significant improvement in balance function and quality of life. Furthermore, outcomes were comparable to a similar group of individuals following a standard customised vestibular rehabilitation protocol. The study provides useful information to inform the design and execution of a larger clinical trial.
Wang, Mingyu; Han, Lijuan; Liu, Shasha; Zhao, Xuebing; Yang, Jinghua; Loh, Soh Kheang; Sun, Xiaomin; Zhang, Chenxi; Fang, Xu
2015-09-01
Renewable energy from lignocellulosic biomass has been deemed an alternative to depleting fossil fuels. In order to improve this technology, we aim to develop robust mathematical models for the enzymatic lignocellulose degradation process. By analyzing 96 groups of previously published and newly obtained lignocellulose saccharification results and fitting them to the Weibull distribution, we discovered that Weibull statistics can accurately predict lignocellulose saccharification data, regardless of the type of substrates, enzymes and saccharification conditions. A mathematical model for enzymatic lignocellulose degradation was subsequently constructed based on Weibull statistics. Further analysis of the mathematical structure of the model and experimental saccharification data showed the significance of the two parameters in this model. In particular, the λ value, defined as the characteristic time, represents the overall performance of the saccharification system. This suggestion was further supported by statistical analysis of experimental saccharification data and analysis of the glucose production levels when λ and n values change. In conclusion, the constructed Weibull statistics-based model can accurately predict lignocellulose hydrolysis behavior, and the λ parameter can be used to assess the overall performance of enzymatic lignocellulose degradation. Advantages and potential applications of the model and the λ value in saccharification performance assessment are discussed. Copyright © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
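A Weibull-form saccharification curve can be fitted with SciPy as below; the glucose time course is hypothetical, with a smaller λ indicating an earlier approach to the yield plateau, consistent with the abstract's reading of λ as an overall performance measure.

```python
import numpy as np
from scipy.optimize import curve_fit

def weibull_saccharification(t, y_max, lam, n):
    """Weibull-form hydrolysis curve: y_max is the yield plateau, lam the
    characteristic time, n the shape parameter."""
    return y_max * (1.0 - np.exp(-(t / lam) ** n))

# Hypothetical time course of glucose release (h, g/L).
t = np.array([2, 4, 8, 12, 24, 48, 72.0])
y = np.array([3.1, 5.6, 9.2, 11.5, 15.0, 17.2, 17.8])

popt, _ = curve_fit(weibull_saccharification, t, y, p0=[18.0, 12.0, 1.0])
y_max, lam, n = popt
print(f"y_max={y_max:.1f} g/L, lambda={lam:.1f} h, n={n:.2f}")
```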
Inverse probability weighting for covariate adjustment in randomized studies.
Shen, Changyu; Li, Xiaochun; Li, Lingling
2014-02-20
Covariate adjustment in randomized clinical trials has the potential benefit of precision gain. It also has the potential pitfall of reduced objectivity as it opens the possibility of selecting a 'favorable' model that yields strong treatment benefit estimate. Although there is a large volume of statistical literature targeting on the first aspect, realistic solutions to enforce objective inference and improve precision are rare. As a typical randomized trial needs to accommodate many implementation issues beyond statistical considerations, maintaining the objectivity is at least as important as precision gain if not more, particularly from the perspective of the regulatory agencies. In this article, we propose a two-stage estimation procedure based on inverse probability weighting to achieve better precision without compromising objectivity. The procedure is designed in a way such that the covariate adjustment is performed before seeing the outcome, effectively reducing the possibility of selecting a 'favorable' model that yields a strong intervention effect. Both theoretical and numerical properties of the estimation procedure are presented. Application of the proposed method to a real data example is presented. Copyright © 2013 John Wiley & Sons, Ltd.
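A minimal sketch of the two-stage idea: the treatment-given-covariates model is fitted without reference to outcomes, and the resulting inverse-probability weights then enter a weighted difference in means. The data and model below are simulated and illustrative, not the authors' estimator in full.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(15)
n = 2000
x = rng.normal(size=(n, 2))                     # baseline covariates
z = rng.integers(0, 2, n)                       # randomized treatment
y = 1.0 * z + x[:, 0] - 0.5 * x[:, 1] + rng.normal(size=n)

# Stage 1 (before any outcome data are examined): model treatment given
# covariates and form inverse-probability weights. In a randomized trial
# the true propensity is known, but using the ESTIMATED one corrects for
# chance covariate imbalance and so improves precision.
ps = LogisticRegression().fit(x, z).predict_proba(x)[:, 1]
w = z / ps + (1 - z) / (1 - ps)

# Stage 2: weighted difference in mean outcomes.
est = (np.sum(w * z * y) / np.sum(w * z)
       - np.sum(w * (1 - z) * y) / np.sum(w * (1 - z)))
naive = y[z == 1].mean() - y[z == 0].mean()
print(f"IPW estimate: {est:.3f}, unadjusted: {naive:.3f} (truth: 1.0)")
```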
Self-assessed performance improves statistical fusion of image labels
Bryan, Frederick W.; Xu, Zhoubing; Asman, Andrew J.; Allen, Wade M.; Reich, Daniel S.; Landman, Bennett A.
2014-01-01
Purpose: Expert manual labeling is the gold standard for image segmentation, but this process is difficult, time-consuming, and prone to inter-individual differences. While fully automated methods have successfully targeted many anatomies, they have not yet been developed for numerous essential structures (e.g., the internal structure of the spinal cord as seen on magnetic resonance imaging). Collaborative labeling is a new paradigm that offers a robust alternative, one that may realize both the throughput of automation and the guidance of experts. Yet distributing manual labeling expertise across individuals and sites introduces potential human factors concerns (e.g., training, software usability) and statistical considerations (e.g., fusion of information, assessment of confidence, bias) that must be further explored. During the labeling process, it is simple to ask raters to self-assess the confidence of their labels, but this is rarely done and has not previously been studied quantitatively. Herein, the authors explore the utility of self-assessment in relation to automated assessment of rater performance in the context of statistical fusion. Methods: The authors conducted a study of 66 volumes manually labeled by 75 minimally trained human raters recruited from the university undergraduate population. Raters were given 15 min of training during which they were shown examples of correct segmentation, and the online segmentation tool was demonstrated. The volumes were labeled 2D slice-wise, and the slices were unordered. A self-assessed quality metric was produced by raters for each slice by marking a confidence bar superimposed on the slice. Volumes produced by both voting and statistical fusion algorithms were compared against a set of expert segmentations of the same volumes. Results: Labels for 8825 distinct slices were obtained. Simple majority voting resulted in statistically poorer performance than voting weighted by self-assessed performance. Statistical fusion resulted in performance statistically indistinguishable from self-assessed weighted voting. The authors developed a new theoretical basis for using self-assessed performance in the framework of statistical fusion and demonstrated that the combined sources of information (both statistical assessment and self-assessment) yielded statistically significant improvement over either method considered separately. Conclusions: The authors present the first systematic characterization of self-assessed performance in manual labeling. The authors demonstrate that self-assessment and statistical fusion yield similar, but complementary, benefits for label fusion. Finally, the authors present a new theoretical basis for combining self-assessments with statistical label fusion. PMID:24593721
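A minimal sketch of the core comparison, confidence-weighted voting versus simple majority voting, on hypothetical binary labels; the statistical fusion algorithms the study also evaluates are not shown.

```python
# Hedged sketch: fuse binary labels from several raters, with and without
# self-assessed confidence weights. Labels and confidences are hypothetical.
import numpy as np

labels = np.array([            # 5 raters x 8 pixels
    [1, 1, 0, 0, 1, 0, 1, 1],
    [1, 0, 0, 0, 1, 0, 1, 0],
    [1, 1, 1, 0, 1, 0, 0, 1],
    [0, 1, 0, 0, 1, 1, 1, 1],
    [1, 1, 0, 1, 1, 0, 1, 1],
])
conf = np.array([0.9, 0.4, 0.8, 0.5, 0.7])   # self-assessed quality per rater

majority = (labels.mean(axis=0) > 0.5).astype(int)
w = conf / conf.sum()                        # normalize confidences to weights
weighted = (w @ labels > 0.5).astype(int)    # confidence-weighted vote

print("majority:", majority)
print("weighted:", weighted)
```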
A statistical model for predicting muscle performance
NASA Astrophysics Data System (ADS)
Byerly, Diane Leslie De Caix
The objective of these studies was to develop a capability for predicting muscle performance and fatigue for both space- and ground-based applications. To develop this predictive model, healthy test subjects performed a defined, repetitive dynamic exercise to failure using a Lordex spinal machine. Throughout the exercise, surface electromyography (SEMG) data were collected from the erector spinae using a Mega Electronics ME3000 muscle tester and surface electrodes placed on both sides of the back muscle. These data were analyzed using a fifth-order autoregressive (AR) model and statistical regression analysis. An AR-derived parameter, the mean average magnitude of the AR poles, significantly correlated with the maximum number of repetitions (designated Rmax) that a test subject was able to perform. Using the mean average magnitude of the AR poles, a test subject's performance to failure could be predicted as early as the sixth repetition of the exercise. This predictive model has the potential to provide a basis for improving post-spaceflight recovery; monitoring muscle atrophy in astronauts and assessing the effectiveness of countermeasures; monitoring astronaut performance and fatigue during Extravehicular Activity (EVA) operations; providing pre-flight assessment of the ability of an EVA crewmember to perform a given task; improving the design of training protocols and simulations for strenuous International Space Station assembly EVA; and enabling EVA work task sequences to be planned in ways that enhance astronaut performance and safety. Potential ground-based medical applications of the predictive model include monitoring muscle deterioration and performance resulting from illness, establishing industry safety guidelines for repetitive tasks, monitoring the stages of rehabilitation for muscle-related injuries sustained in sports and accidents, and enhancing athletic performance through improved training protocols while reducing injury.
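A minimal sketch of the statistic described above: fit an AR(5) model to an SEMG segment by least squares and take the mean magnitude of the model's poles. The signal here is synthetic; in the study this quantity, tracked per repetition, correlated with Rmax.

```python
# Hedged sketch: mean magnitude of AR(5) poles for a (synthetic) SEMG segment.
import numpy as np

def mean_ar_pole_magnitude(x, p=5):
    """Least-squares AR(p) fit; returns the mean magnitude of the poles."""
    N = len(x)
    X = np.column_stack([x[p - 1 - j : N - 1 - j] for j in range(p)])
    a, *_ = np.linalg.lstsq(X, x[p:], rcond=None)
    poles = np.roots(np.r_[1.0, -a])  # roots of z^p - a1*z^(p-1) - ... - ap
    return np.abs(poles).mean()

rng = np.random.default_rng(1)
t = np.arange(2000) / 1000.0
semg = np.sin(2 * np.pi * 60 * t) + 0.5 * rng.normal(size=t.size)  # synthetic
print(f"mean AR pole magnitude: {mean_ar_pole_magnitude(semg):.3f}")
```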
Blinded Validation of Breath Biomarkers of Lung Cancer, a Potential Ancillary to Chest CT Screening
Phillips, Michael; Bauer, Thomas L.; Cataneo, Renee N.; Lebauer, Cassie; Mundada, Mayur; Pass, Harvey I.; Ramakrishna, Naren; Rom, William N.; Vallières, Eric
2015-01-01
Background Breath volatile organic compounds (VOCs) have been reported as biomarkers of lung cancer, but it is not known if biomarkers identified in one group can identify disease in a separate independent cohort. Also, it is not known if combining breath biomarkers with chest CT has the potential to improve the sensitivity and specificity of lung cancer screening. Methods Model-building phase (unblinded): Breath VOCs were analyzed with gas chromatography mass spectrometry in 82 asymptomatic smokers having screening chest CT, 84 symptomatic high-risk subjects with a tissue diagnosis, 100 without a tissue diagnosis, and 35 healthy subjects. Multiple Monte Carlo simulations identified breath VOC mass ions with greater than random diagnostic accuracy for lung cancer, and these were combined in a multivariate predictive algorithm. Model-testing phase (blinded validation): We analyzed breath VOCs in an independent cohort of similar subjects (n = 70, 51, 75 and 19 respectively). The algorithm predicted discriminant function (DF) values in blinded replicate breath VOC samples analyzed independently at two laboratories (A and B). Outcome modeling: We modeled the expected effects of combining breath biomarkers with chest CT on the sensitivity and specificity of lung cancer screening. Results Unblinded model-building phase. The algorithm identified lung cancer with sensitivity 74.0%, specificity 70.7% and C-statistic 0.78. Blinded model-testing phase: The algorithm identified lung cancer at Laboratory A with sensitivity 68.0%, specificity 68.4%, C-statistic 0.71; and at Laboratory B with sensitivity 70.1%, specificity 68.0%, C-statistic 0.70, with linear correlation between replicates (r = 0.88). In a projected outcome model, breath biomarkers increased the sensitivity, specificity, and positive and negative predictive values of chest CT for lung cancer when the tests were combined in series or parallel. Conclusions Breath VOC mass ion biomarkers identified lung cancer in a separate independent cohort, in a blinded replicated study. Combining breath biomarkers with chest CT could potentially improve the sensitivity and specificity of lung cancer screening. Trial Registration ClinicalTrials.gov NCT00639067 PMID:26698306
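A minimal sketch of the outcome-modeling step, using the standard conditional-independence formulas for two tests combined in series (positive only if both are positive) or in parallel (positive if either is positive). The breath-test values are the unblinded-phase estimates quoted above; the chest CT values are hypothetical placeholders.

```python
# Hedged sketch: combining two diagnostic tests under an independence
# assumption. CT operating characteristics below are hypothetical.
def combine_series(se1, sp1, se2, sp2):
    # positive only if both tests are positive
    return se1 * se2, 1 - (1 - sp1) * (1 - sp2)

def combine_parallel(se1, sp1, se2, sp2):
    # positive if either test is positive
    return 1 - (1 - se1) * (1 - se2), sp1 * sp2

breath = (0.74, 0.707)   # sensitivity, specificity from the unblinded phase
ct = (0.93, 0.73)        # hypothetical chest CT values

for name, fn in [("series", combine_series), ("parallel", combine_parallel)]:
    se, sp = fn(*breath, *ct)
    print(f"{name}: sensitivity={se:.2f}, specificity={sp:.2f}")
```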
Recent Development on O(+) - O Collision Frequency and Ionosphere-Thermosphere Coupling
NASA Technical Reports Server (NTRS)
Omidvar, K.; Menard, R.
1999-01-01
The collision frequency between an oxygen atom and its singly charged ion controls the momentum transfer between the ionosphere and the thermosphere. There has been a long-standing discrepancy, extending over a decade, between the theoretical and empirical determinations of this frequency: the empirical value exceeded the theoretical value by a factor of 1.7. Recent improvements in theory were obtained by using accurate oxygen ion-oxygen atom potential energy curves and partial-wave quantum mechanical calculations. We have now applied three independent statistical methods to the observational data, obtained at the MIT/Millstone Hill Observatory, consisting of two sets, A and B. These methods give results consistent with each other and, together with the recent theoretical improvements, bring the ratio close to unity, as it should be. The three statistical methods lead to an average for the ratio of the empirical to the theoretical values equal to 0.98, with an uncertainty of +/-8%, resolving the old discrepancy between theory and observation. The Hines statistics and the lognormal distribution statistics both give lower and upper bounds for Set A equal to 0.89 and 1.02, respectively; the corresponding bounds for Set B are 1.06 and 1.17. The average values of these bounds thus bracket the ideal value of the ratio, which should be equal to unity. The main source of uncertainty is error in the oxygen atom density profile, which is of the order of 11%. An alternative method for finding the oxygen atom density is suggested.
Mirro, Amy E.; Brady, Samuel L.; Kaufman, Robert. A.
2016-01-01
Purpose: To implement the maximum level of statistical iterative reconstruction that can be used to establish dose-reduced head CT protocols in a primarily pediatric population. Methods: Select head examinations (brain, orbits, sinus, maxilla, and temporal bones) were investigated. Dose-reduced head protocols using adaptive statistical iterative reconstruction (ASiR) were compared for image quality with the original filtered back projection (FBP) reconstructed protocols in phantom using the following metrics: image noise frequency (change in perceived appearance of noise texture), image noise magnitude, contrast-to-noise ratio (CNR), and spatial resolution. Dose reduction estimates were based on computed tomography dose index (CTDIvol) values. Patient CTDIvol and image noise magnitude were assessed in 737 pre- and post-dose-reduction examinations. Results: Image noise texture was acceptable up to 60% ASiR for the Soft reconstruction kernel (at both 100 and 120 kVp), and up to 40% ASiR for the Standard reconstruction kernel. Implementation of 40% and 60% ASiR led to an average reduction in CTDIvol of 43% for brain, 41% for orbits, 30% for maxilla, 43% for sinus, and 42% for temporal bone protocols for patients between 1 month and 26 years, while maintaining an average noise magnitude difference of 0.1% (range: −3% to 5%), improving CNR of low-contrast soft tissue targets, and improving spatial resolution of high-contrast bony anatomy, as compared to FBP. Conclusion: This study demonstrates a methodology for maximizing patient dose reduction while maintaining image quality using statistical iterative reconstruction for a primarily pediatric population undergoing head CT examination. PMID:27056425
Okokon, Enembe Oku; Roivainen, Päivi; Kheifets, Leeka; Mezei, Gabor; Juutilainen, Jukka
2014-01-01
Previous studies have shown that populations of multiapartment buildings with indoor transformer stations may serve as a basis for improved epidemiological studies on the relationship between childhood leukaemia and extremely-low-frequency (ELF) magnetic fields (MFs). This study investigated whether classification based on structural characteristics of the transformer stations would improve ELF MF exposure assessment. The data included MF measurements in apartments directly above transformer stations ("exposed" apartments) in 30 buildings in Finland, and reference apartments in the same buildings. Transformer structural characteristics (type and location of low-voltage conductors) were used to classify exposed apartments into high-exposure (HE) and intermediate-exposure (IE) categories. An exposure gradient was observed: both the time-average MF and time above a threshold (0.4 μT) were highest in the HE apartments and lowest in the reference apartments, showing a statistically significant trend. The differences between HE and IE apartments, however, were not statistically significant. A simulation exercise showed that the three-category classification did not perform better than a two-category classification (exposed and reference apartments) in detecting the existence of an increased risk. However, data on the structural characteristics of transformers are potentially useful for evaluating the exposure-response relationship.
Patrick, Hannah; Sims, Andrew; Burn, Julie; Bousfield, Derek; Colechin, Elaine; Reay, Christopher; Alderson, Neil; Goode, Stephen; Cunningham, David; Campbell, Bruce
2013-03-01
New devices and procedures are often introduced into health services when the evidence base for their efficacy and safety is limited. The authors sought to assess the availability and accuracy of routinely collected Hospital Episodes Statistics (HES) data in the UK and their potential contribution to the monitoring of new procedures. Four years of HES data (April 2006-March 2010) were analysed to identify episodes of hospital care involving a sample of 12 new interventional procedures. HES data were cross checked against other relevant sources including national or local registers and manufacturers' information. HES records were available for all 12 procedures during the entire study period. Comparative data sources were available from national (5), local (2) and manufacturer (2) registers. Factors found to affect comparisons were miscoding, alternative coding and inconsistent use of subsidiary codes. The analysis of provider coverage showed that HES is sensitive at detecting centres which carry out procedures, but specificity is poor in some cases. Routinely collected HES data have the potential to support quality improvements and evidence-based commissioning of devices and procedures in health services but achievement of this potential depends upon the accurate coding of procedures.
Jochem, Warren C; Razzaque, Abdur; Root, Elisabeth Dowling
2016-09-01
Respiratory infections continue to be a public health threat, particularly to young children in developing countries. Understanding the geographic patterns of diseases and the role of potential risk factors can help improve future mitigation efforts. Toward this goal, this paper applies a spatial scan statistic combined with a zero-inflated negative-binomial regression to re-examine the impacts of a community-based treatment program on the geographic patterns of acute lower respiratory infection (ALRI) mortality in an area of rural Bangladesh. Exposure to arsenic-contaminated drinking water is also a serious threat to the health of children in this area, and the variation in exposure to arsenic must be considered when evaluating the health interventions. ALRI mortality data were obtained for children under 2 years old from 1989 to 1996 in the Matlab Health and Demographic Surveillance System. This study period covers the years immediately following the implementation of an ALRI control program. A zero-inflated negative binomial (ZINB) regression model was first used to simultaneously estimate mortality rates and the likelihood of no deaths in groups of related households while controlling for socioeconomic status, potential arsenic exposure, and access to care. Next a spatial scan statistic was used to assess the location and magnitude of clusters of ALRI mortality. The ZINB model was used to adjust the scan statistic for multiple social and environmental risk factors. The results of the ZINB models and spatial scan statistic suggest that the ALRI control program was successful in reducing child mortality in the study area. Exposure to arsenic-contaminated drinking water was not associated with increased mortality. Higher socioeconomic status also significantly reduced mortality rates, even among households who were in the treatment program area. Community-based ALRI interventions can be effective at reducing child mortality, though socioeconomic factors may continue to influence mortality patterns. The combination of spatial and non-spatial methods used in this paper has not been applied previously in the literature, and this study demonstrates the importance of such approaches for evaluating and improving public health intervention programs.
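A minimal sketch of the first stage on simulated data: a ZINB model of death counts with a person-time exposure term, via statsmodels. The covariates are hypothetical stand-ins, the optimizer may need tuning on real data, and the second step (feeding the fitted expected counts into a covariate-adjusted spatial scan statistic) is not shown.

```python
# Hedged sketch: zero-inflated negative binomial counts model. All data and
# covariate names are simulated placeholders, not the Matlab HDSS data.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.discrete.count_model import ZeroInflatedNegativeBinomialP

rng = np.random.default_rng(2)
n = 400
df = pd.DataFrame({
    "ses": rng.normal(size=n),                 # socioeconomic index
    "arsenic": rng.integers(0, 2, size=n),     # exposure indicator
    "in_program": rng.integers(0, 2, size=n),  # ALRI program area
    "children": rng.integers(5, 60, size=n),   # person-time denominator
})
mu = np.exp(-3.0 - 0.5 * df.in_program + 0.2 * df.ses) * df.children
deaths = rng.poisson(mu) * (rng.random(n) > 0.3)   # inject extra zeros

X = sm.add_constant(df[["ses", "arsenic", "in_program"]])
zinb = ZeroInflatedNegativeBinomialP(
    deaths, X, exog_infl=np.ones((n, 1)),  # constant-only inflation part
    exposure=df["children"], p=2,
).fit(method="bfgs", maxiter=500, disp=0)
print(zinb.params.round(2))  # fitted means could then adjust a scan statistic
```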
ERIC Educational Resources Information Center
Rabin, Laura A.; Nutter-Upham, Katherine E.
2010-01-01
We describe an active learning exercise intended to improve undergraduate students' understanding of statistics by grounding complex concepts within a meaningful, applied context. Students in a journal excerpt activity class read brief excerpts of statistical reporting from published research articles, answered factual and interpretive questions,…
Improving Student Understanding of Spatial Ecology Statistics
ERIC Educational Resources Information Center
Hopkins, Robert, II; Alberts, Halley
2015-01-01
This activity is designed as a primer to teaching population dispersion analysis. The aim is to help improve students' spatial thinking and their understanding of how spatial statistic equations work. Students use simulated data to develop their own statistic and apply that equation to experimental behavioral data for Gambusia affinis (western…
Throughput Benefit Assessment for Tactical Runway Configuration Management (TRCM)
NASA Technical Reports Server (NTRS)
Phojanamongkolkij, Nipa; Oseguera-Lohr, Rosa M.; Lohr, Gary W.; Fenbert, James W.
2014-01-01
The System-Oriented Runway Management (SORM) concept is a collection of needed capabilities focused on more efficient use of runways while considering all of the factors that affect runway use. Tactical Runway Configuration Management (TRCM), one of the SORM capabilities, provides runway configuration and runway usage recommendations and monitors the active runway configuration for suitability given existing factors, based on a 90 minute planning horizon. This study evaluates the throughput benefits using a representative sample of today's traffic volumes at three airports: Memphis International Airport (MEM), Dallas-Fort Worth International Airport (DFW), and John F. Kennedy International Airport (JFK). Based on this initial assessment, there are statistically significant throughput benefits for both arrivals and departures at MEM, averaging 4% for arrivals and 6% for departures. For DFW, there is a statistically significant benefit for arrivals, averaging 3%; although an average 1% benefit is observed for departures, it is not statistically significant. For JFK, there is a 12% benefit for arrivals, but a 2% penalty for departures. The results obtained are for current traffic volumes and should show greater benefit under increased future demand. This paper also proposes some potential TRCM algorithm improvements for future research. A continuing research plan is being developed to implement these improvements and to re-assess the throughput benefit for today's and future projected traffic volumes.
Yu, Wenxi; Liu, Yang; Ma, Zongwei; Bi, Jun
2017-08-01
Using satellite-based aerosol optical depth (AOD) measurements and statistical models to estimate ground-level PM2.5 is a promising way to fill in the areas that are not covered by ground PM2.5 monitors. The statistical models used in previous studies are primarily Linear Mixed Effects (LME) and Geographically Weighted Regression (GWR) models. In this study, we developed a new regression model between PM2.5 and AOD using Gaussian processes in a Bayesian hierarchical setting. Gaussian processes model the stochastic nature of the spatial random effects, where the mean surface and the covariance function are specified. The spatial stochastic process is incorporated under the Bayesian hierarchical framework to explain the variation of PM2.5 concentrations together with other factors, such as AOD and spatial and non-spatial random effects. We evaluate the results of our model and compare them with those of other, conventional statistical models (GWR and LME) by within-sample model fitting and out-of-sample validation (cross validation, CV). The results show that our model possesses a CV result (R² = 0.81) that reflects higher accuracy than that of GWR and LME (0.74 and 0.48, respectively). Our results indicate that Gaussian process models have the potential to improve the accuracy of satellite-based PM2.5 estimates.
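A minimal sketch of the modeling idea on simulated data: Gaussian-process regression of PM2.5 on station coordinates and AOD using scikit-learn, standing in for the paper's full Bayesian hierarchical formulation.

```python
# Hedged sketch: GP regression of PM2.5 on (lon, lat, AOD). Simulated data;
# a simplification of the Bayesian hierarchical model described above.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
n = 200
lonlat = rng.uniform(0, 10, size=(n, 2))              # station coordinates
aod = rng.uniform(0.1, 1.5, size=n)
spatial = np.sin(lonlat[:, 0]) + 0.5 * np.cos(lonlat[:, 1])
pm25 = 20 * aod + 5 * spatial + rng.normal(0, 2, size=n)

X = np.column_stack([lonlat, aod])                    # features: lon, lat, AOD
kernel = 1.0 * RBF(length_scale=[2.0, 2.0, 0.5]) + WhiteKernel()
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
print(f"cross-validated R^2: {cross_val_score(gp, X, pm25, cv=5).mean():.2f}")
```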
Estimating the Proportion of True Null Hypotheses Using the Pattern of Observed p-values
Tong, Tiejun; Feng, Zeny; Hilton, Julia S.; Zhao, Hongyu
2013-01-01
Estimating the proportion of true null hypotheses, π0, has attracted much attention in the recent statistical literature. Besides its apparent relevance for a set of specific scientific hypotheses, an accurate estimate of this parameter is key for many multiple testing procedures. Most existing methods for estimating π0 in the literature are motivated by the independence assumption of test statistics, which is often not true in reality. Simulations indicate that most existing estimators can perform poorly in the presence of dependence among test statistics, mainly due to the increase of variation in these estimators. In this paper, we propose several data-driven methods for estimating π0 by incorporating the distribution pattern of the observed p-values as a practical approach to address potential dependence among test statistics. Specifically, we use a linear fit to give a data-driven estimate for the proportion of true-null p-values in (λ, 1] over the whole range [0, 1] instead of using the expected proportion at 1 − λ. We find that the proposed estimators may substantially decrease the variance of the estimated true null proportion and thus improve the overall performance. PMID:24078762
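A minimal sketch of the idea on simulated p-values, contrasting a single-λ tail estimator with a simple linear-fit variant over a grid of λ values: regress the tail counts #{p > λ} on (1 − λ) through the origin, so the slope divided by m estimates π0. This illustrates the approach, not the authors' exact estimator.

```python
# Hedged sketch: estimating pi0 from the pattern of p-values. The linear-fit
# variant here is an illustration, not the paper's precise method.
import numpy as np

rng = np.random.default_rng(4)
m, pi0_true = 2000, 0.8
null_p = rng.uniform(size=int(m * pi0_true))
alt_p = rng.beta(0.3, 4.0, size=m - null_p.size)  # alternatives pile near 0
p = np.concatenate([null_p, alt_p])

lam = 0.5
single = (p > lam).sum() / (p.size * (1 - lam))   # classic single-lambda form

grid = np.arange(0.05, 0.96, 0.05)
tail = np.array([(p > l).sum() for l in grid])
slope = np.sum((1 - grid) * tail) / np.sum((1 - grid) ** 2)  # through origin
linear = slope / p.size

print(f"single-lambda: {single:.3f}, linear fit: {linear:.3f}, truth: {pi0_true}")
```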
Kakourou, Alexia; Vach, Werner; Nicolardi, Simone; van der Burgt, Yuri; Mertens, Bart
2016-10-01
Mass spectrometry based clinical proteomics has emerged as a powerful tool for high-throughput protein profiling and biomarker discovery. Recent improvements in mass spectrometry technology have boosted the potential of proteomic studies in biomedical research. However, the complexity of the proteomic expression introduces new statistical challenges in summarizing and analyzing the acquired data. Statistical methods for optimally processing proteomic data are currently a growing field of research. In this paper we present simple, yet appropriate methods to preprocess, summarize and analyze high-throughput MALDI-FTICR mass spectrometry data, collected in a case-control fashion, while dealing with the statistical challenges that accompany such data. The known statistical properties of the isotopic distribution of the peptide molecules are used to preprocess the spectra and translate the proteomic expression into a condensed data set. Information on either the intensity level or the shape of the identified isotopic clusters is used to derive summary measures on which diagnostic rules for disease status allocation will be based. Results indicate that both the shape of the identified isotopic clusters and the overall intensity level carry information on the class outcome and can be used to predict the presence or absence of the disease.
Spezia, Riccardo; Martínez-Nuñez, Emilio; Vazquez, Saulo; Hase, William L
2017-04-28
In this Introduction, we outline the basic problems of non-statistical and non-equilibrium phenomena related to the papers collected in this themed issue. Over the past few years, significant advances in both computing power and the development of theories have allowed the study of larger systems, increased the time length of simulations, and improved the quality of potential energy surfaces. In particular, the possibility of using quantum chemistry to calculate energies and forces 'on the fly' has paved the way to directly studying chemical reactions. This has provided a valuable tool to explore molecular mechanisms at given temperatures and energies and to see whether these reactive trajectories follow statistical laws and/or minimum energy pathways. This themed issue collects different aspects of the problem and gives an overview of recent works and developments in different contexts, from the gas phase to the condensed phase to excited states. This article is part of the themed issue 'Theoretical and computational studies of non-equilibrium and non-statistical dynamics in the gas phase, in the condensed phase and at interfaces'. © 2017 The Author(s).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hagos, Samson M.; Feng, Zhe; Burleyson, Casey D.
Regional cloud permitting model simulations of cloud populations observed during the 2011 ARM Madden-Julian Oscillation Investigation Experiment/Dynamics of Madden-Julian Experiment (AMIE/DYNAMO) field campaign are evaluated against radar and ship-based measurements. Sensitivity of model-simulated surface rain rate statistics to parameters and parameterization of hydrometeor sizes in five commonly used WRF microphysics schemes is examined. It is shown that at 2 km grid spacing, the model generally overestimates rain rate from large and deep convective cores. Sensitivity runs involving variation of parameters that affect rain drop or ice particle size distribution (e.g., a more aggressive break-up process) generally reduce the bias in rain-rate and boundary layer temperature statistics, as the smaller particles become more vulnerable to evaporation. Furthermore, significant improvement in the convective rain-rate statistics is observed when the horizontal grid spacing is reduced to 1 km and 0.5 km, while it is worsened when run at 4 km grid spacing, as increased turbulence enhances evaporation. The results suggest that modulation of evaporation processes, through parameterization of turbulent mixing and break-up of hydrometeors, may provide a potential avenue for correcting cloud statistics and associated boundary layer temperature biases in regional and global cloud permitting model simulations.
Arul, P
2017-01-01
Asphalts are bitumens consisting of complex hydrocarbon mixtures, used mainly in road construction and maintenance. This study was undertaken to evaluate the micronucleus (MN) assay of exfoliated buccal epithelial cells in road construction workers using a liquid-based cytology (LBC) preparation. Three different stains (May-Grunwald Giemsa, hematoxylin and eosin, and Papanicolaou) were used to evaluate the frequency of MN in exfoliated buccal epithelial cells of 100 participants (fifty road construction workers and fifty administrative staff) using the LBC preparation. Statistical analysis was performed with Student's t-test, and P < 0.05 was considered statistically significant. The mean frequency of MN for cases was significantly higher than that of controls (P = 0.001) regardless of the staining method used, and cases with an exposure period of more than 5 years showed a statistically significant difference (P < 0.05) compared with cases with shorter exposure. Conclusion: The present study concluded that workers exposed to asphalts during road construction exhibit a higher frequency of MN in exfoliated buccal epithelial cells and are at significant risk of cytogenetic damage. The LBC preparation has potential application for the evaluation of MN frequency; this technique may be advocated for those who are occupationally exposed to potentially carcinogenic agents, in view of its improvement in smear quality and visualization of cell morphology.
Saravanakumar, Padmapriya; Higgins, Isabel Johanna; van der Riet, Pamela Jane; Marquez, Jodie; Sibbritt, David
2014-01-01
Abstract: Falls amongst older people are a global public health concern. Whilst falling is not a typical feature of ageing, older people are more likely to fall. Fall injuries amongst older people are a leading cause of death and disability. Many older people do not do regular exercise, so they lose muscle tone, strength, and flexibility, which affects balance and predisposes them to falls. The management of falls in residential care settings is a major concern, with strategies for prevention and monitoring a particular focus in this setting. Yoga and tai chi have shown potential to improve balance and prevent falls in older adults; they also have potential to improve pain and quality of life. The aim of this study was to determine the feasibility of conducting a three-arm randomised controlled trial (RCT) with frail older people in a residential care setting to test the hypothesis that a 14-week modified tai chi or yoga programme is more effective than usual care activity in improving balance function, quality of life, and pain experience and in reducing the number of falls. There were no statistically significant differences between the three groups in the occurrence of falls. Yoga demonstrated a slight decrease in fall incidence; quality of life improved for the tai chi group. Only the yoga group experienced a reduction in average pain scores, though this was not statistically significant. The findings of the study suggest it is possible to safely implement modified yoga and tai chi in a residential care setting and to evaluate them using an RCT design. They show positive changes to balance, pain, and quality of life, and a high level of interest, reflected in attendance, amongst the older participants. The results support offering tai chi and yoga to older people who are frail and dependent with physical and cognitive limitations.
Endahl, Lars A; Utzon, Jan
2002-09-16
It is well known that publication of hospital quality indicators may lead to improvements in treatment. But publication can also have negative side effects. Focus may shift to the evaluated areas at the expense of non-evaluated areas. The most ill patients may be sorted out, and high-risk patients may be transferred to other hospitals or discharged in order to avoid their dying during hospitalisation, thereby improving the statistics. Patient risk may be overestimated in order to improve relative treatment outcomes. An increasing flow of patients to hospitals with high scores on quality indicators may cause imbalance between activities and budgets, and hence longer waiting times and reduced quality of treatment. Negative publicity due to low scores on quality indicators may lead to under-utilisation of hospital capacity, patient and staff insecurity, and staff wastage. Thus, publication of quality indicators may improve quality within the health sector, but it is very important to recognise the potential pitfalls and negative side effects.
Zhang, Jian-Hua; Böhme, Johann F
2007-11-01
In this paper we report an adaptive regularization network (ARN) approach to realizing fast blind separation of cerebral evoked potentials (EPs) from background electroencephalogram (EEG) activity with no need to make any explicit assumption on the statistical (or deterministic) signal model. The ARNs are proposed to construct nonlinear EEG and EP signal models. A novel adaptive regularization training (ART) algorithm is proposed to improve the generalization performance of the ARN. Two adaptive neural modeling methods based on the ARN are developed and their implementation and performance analysis are also presented. The computer experiments using simulated and measured visual evoked potential (VEP) data have shown that the proposed ARN modeling paradigm yields computationally efficient and more accurate VEP signal estimation owing to its intrinsic model-free and nonlinear processing characteristics.
2016-01-01
Background: Long-term conditions and their concomitant management place considerable pressure on patients, communities, and health care systems worldwide. International clinical guidelines on the majority of long-term conditions recommend the inclusion of self-management programs in routine management. Self-management programs have been associated with improved health outcomes; however, the successful and sustainable transfer of research programs into clinical practice has been inconsistent. Recent developments in mobile technology, such as mobile phone and tablet computer apps, could help in developing a platform for the delivery of self-management interventions that are adaptable, of low cost, and easily accessible. Objective: We conducted a systematic review to assess the effectiveness of mobile phone and tablet apps in self-management of key symptoms of long-term conditions. Methods: We searched PubMed, Embase, EBSCO databases, the Cochrane Library, and The Joanna Briggs Institute Library for randomized controlled trials that assessed the effectiveness of mobile phone and tablet apps in self-management of diabetes mellitus, cardiovascular disease, and chronic lung diseases from 2005 to 2016. We searched registers of current and ongoing trials, as well as the gray literature. We then checked the reference lists of all primary studies and review papers for additional references. The last search was run in February 2016. Results: Of the 9 studies we reviewed, 6 interventions demonstrated a statistically significant improvement in the primary measure of clinical outcome. Where the intervention comprised an app only, 3 studies demonstrated a statistically significant improvement. Interventions to address diabetes mellitus (5/9) were the most common, followed by chronic lung disease (3/9) and cardiovascular disease (1/9). A total of 3 studies included multiple intervention groups using permutations of an intervention involving an app. The duration of the intervention ranged from 6 weeks to 1 year, and final follow-up ranged from 3 months to 1 year. Sample size ranged from 48 to 288 participants. Conclusions: The evidence indicates the potential of apps to improve symptom management through self-management interventions. The use of apps in mHealth has the potential to improve health outcomes among those living with chronic diseases through enhanced symptom control. Further innovation, optimization, and rigorous research around the potential of apps in mHealth technology will move the field toward the reality of improved health care delivery and outcomes. PMID:27185295
Wu, Johnny C; Gardner, David P; Ozer, Stuart; Gutell, Robin R; Ren, Pengyu
2009-08-28
The accurate prediction of the secondary and tertiary structure of an RNA with different folding algorithms is dependent on several factors, including the energy functions. However, an RNA's higher-order structure cannot be predicted accurately from its sequence based on a limited set of energy parameters. The inter- and intramolecular forces between this RNA and other small molecules and macromolecules, in addition to other factors in the cell such as pH, ionic strength, and temperature, influence the complex dynamics associated with the transition of a single-stranded RNA to its secondary and tertiary structure. Since all of the factors that affect the formation of an RNA's 3D structure cannot be determined experimentally, statistically derived potential energy has been used instead, as in the prediction of protein structure. In the current work, we evaluate the statistical free energy of various secondary structure motifs, including base-pair stacks, hairpin loops, and internal loops, using their statistical frequency obtained from the comparative analysis of more than 50,000 RNA sequences stored in the RNA Comparative Analysis Database (rCAD) at the Comparative RNA Web (CRW) Site. Statistical energy was computed from the structural statistics for several datasets. While the statistical energy for a base-pair stack correlates with experimentally derived free energy values, suggesting a Boltzmann-like distribution, variation is observed between different molecules and their locations on the phylogenetic tree of life. Our statistical energy values calculated for several structural elements were utilized in the Mfold RNA-folding algorithm. The combined statistical energy values for base-pair stacks, hairpins, and internal loop flanks result in a significant improvement in the accuracy of secondary structure prediction, with the hairpin flanks contributing the most.
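A minimal sketch of the Boltzmann-style conversion behind such statistical energies: observed motif frequencies become pseudo free energies via E = −RT·ln(f_obs/f_ref). The counts and the uniform reference state below are hypothetical modeling choices, not values from rCAD.

```python
# Hedged sketch: knowledge-based energies from motif counts. Counts and the
# reference state are illustrative assumptions.
import math

RT = 0.616  # kcal/mol at ~310 K

def statistical_energy(f_obs, f_ref):
    """Pseudo free energy of a motif from observed vs. reference frequency."""
    return -RT * math.log(f_obs / f_ref)

counts = {"GC/CG stack": 5200, "AU/UA stack": 1800, "GU/UG stack": 400}
total = sum(counts.values())
f_ref = 1.0 / len(counts)   # uniform reference state (a modeling choice)

for motif, c in counts.items():
    print(f"{motif}: {statistical_energy(c / total, f_ref):+.2f} kcal/mol")
```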
NASA Astrophysics Data System (ADS)
Mukherjee, S.; Salazar, L.; Mittelstaedt, J.; Valdez, O.
2017-11-01
Supernovae in our universe are potential sources of gravitational waves (GW) that could be detected in a network of GW detectors like LIGO and Virgo. Core-collapse supernovae are rare, but the associated gravitational radiation is likely to carry profuse information about the underlying processes driving the supernovae. Calculations based on analytic models predict GW energies within the detection range of the Advanced LIGO detectors out to tens of Mpc for certain types of signals, e.g., coalescing binary neutron stars; for supernovae, however, the corresponding distances are much smaller. Thus, methods that can improve the sensitivity of searches for GW signals from supernovae are desirable, especially in the advanced detector era. Several methods have been proposed based on various likelihood-based regulators that work on data from a network of detectors to detect burst-like signals (as is the case for signals from supernovae) from potential GW sources. To address this problem, we have developed an analysis pipeline based on a noise reduction method known as the harmonic regeneration noise reduction (HRNR) algorithm. To demonstrate the method, sixteen supernova waveforms from the Murphy et al. 2009 catalog have been used in the presence of LIGO science data. A comparative analysis is presented to show detection statistics for a standard network analysis as commonly used in GW pipelines and for the same analysis implementing the new method in conjunction with the network. The result shows significant improvement in detection statistics.
The use of six sigma in health care management: are we using it to its full potential?
DelliFraine, Jami L; Wang, Zheng; McCaughey, Deirdre; Langabeer, James R; Erwin, Cathleen O
2014-01-01
Popular quality improvement tools such as Six Sigma (SS) claim to provide health care managers the opportunity to improve health care quality on the basis of sound methodology and data. However, it is unclear whether this quality improvement tool is being used correctly and improves health care quality. The authors conducted a comprehensive literature review to assess the correct use and implementation of SS and the empirical evidence demonstrating the relationship between SS and improved quality of care in health care organizations. The authors identified 310 articles on SS published in the last 15 years. However, only 55 were empirical peer-reviewed articles, 16 of which reported the correct use of SS. Only 7 of these articles included statistical analyses to test for significant changes in quality of care, and only 16 calculated defects per million opportunities or sigma level. This review demonstrates that there are significant gaps in the Six Sigma health care quality improvement literature and very weak evidence that Six Sigma is being used correctly to improve health care quality.
Towards Enhanced Underwater Lidar Detection via Source Separation
NASA Astrophysics Data System (ADS)
Illig, David W.
Interest in underwater optical sensors has grown as technologies enabling autonomous underwater vehicles have been developed. Propagation of light through water is complicated by the dual challenges of absorption and scattering. While absorption can be reduced by operating in the blue-green region of the visible spectrum, reducing scattering is a more significant challenge. Collection of scattered light negatively impacts underwater optical ranging, imaging, and communications applications. This thesis concentrates on the ranging application, where scattering reduces operating range as well as range accuracy. The focus of this thesis is on the problem of backscatter, which can create a "clutter" return that may obscure submerged target(s) of interest. The main contributions of this thesis are explorations of signal processing approaches to increase the separation between the target and backscatter returns. Increasing this separation allows detection of weak targets in the presence of strong scatter, increasing both operating range and range accuracy. Simulation and experimental results are presented for a variety of approaches as functions of water clarity and target position. This work provides several novel contributions to the underwater lidar field:
1. Quantification of temporal separation approaches: While temporal separation has been studied extensively, this work provides a quantitative assessment of the extent to which both high-frequency modulation and spatial filter approaches improve the separation between target and backscatter.
2. Development and assessment of frequency separation: This work includes the first frequency-based separation approach for underwater lidar, in which the channel frequency response is measured with a wideband waveform. Transforming to the time domain gives a channel impulse response, in which target and backscatter returns may appear in unique range bins and thus be separated.
3. Development and assessment of statistical separation: The first investigations of statistical separation approaches for underwater lidar are presented. By demonstrating that target and backscatter returns have different statistical properties, a new separation axis is opened. This work investigates and quantifies the performance of three statistical separation approaches.
4. Application of detection theory to underwater lidar: While many similar applications use detection theory to assess performance, less development has occurred in the underwater lidar field. This work applies these concepts to statistical separation approaches, providing another perspective from which to assess performance. In addition, by using detection theory approaches, statistical metrics can be used to associate a level of confidence with each ranging measurement.
5. Preliminary investigation of forward scatter suppression: If backscatter is sufficiently suppressed, forward scattering becomes a performance-limiting factor. This work presents a proof-of-concept demonstration of the potential for statistical separation approaches to suppress both forward and backward scatter.
These results demonstrate the capability of signal processing to improve separation between target and backscatter. Separation capability improves in the transition from temporal to frequency to statistical separation approaches, with the statistical approaches improving target detection sensitivity by as much as 30 dB. Ranging and detection results demonstrate the enhanced performance this would allow in ranging applications. This increased performance is an important step in moving underwater lidar capability toward the requirements of the next generation of sensors.
Enhancement of MS Signal Processing For Improved Cancer Biomarker Discovery
NASA Astrophysics Data System (ADS)
Si, Qian
Technological advances in proteomics have shown great potential for detecting cancer at the earliest stages. One approach is to use time-of-flight mass spectrometry to identify biomarkers, early disease indicators related to the cancer. Pattern analysis of time-of-flight mass spectral data from blood and tissue samples gives great hope for the identification of potential biomarkers, among the complex mixture of biological and chemical samples, for early cancer detection. One of the key issues is the pre-processing of raw mass spectral data, where several challenges need to be addressed: unknown noise characteristics associated with the large volume of data, high variability in the mass spectrometry measurements, a poorly understood signal background, and so on. This dissertation focuses on developing statistical algorithms and creating data mining tools for computationally improved signal processing of mass spectrometry data. I introduce an improved, accurate estimate of the noise model and a semi-supervised method of mass spectrum data processing that requires little prior knowledge about the data.
Vo, Mary L; Chin, Russell L; Miranda, Caroline; Latov, Norman
2017-10-01
Gait impairment is a common presenting symptom in patients with chronic inflammatory demyelinating polyneuropathy (CIDP). However, gait parameters have not previously been evaluated in detail as potential independent outcome measures. We prospectively measured changes in spatiotemporal gait parameters of 20 patients with CIDP at baseline and following treatment with intravenous immunoglobulin (IVIG), using GAITRite®, a computerized walkway system with embedded sensors. Overall, study patients showed significant improvements in gait velocity, cadence, stride length, double support time, stance phase, and swing phase following IVIG treatment. Mean changes in velocity, stance phase, and swing phase exhibited the greatest statistical significance in the subgroup that exhibited clinically meaningful improvement in Inflammatory Neuropathy Cause and Treatment disability score, Medical Research Council sum score, and grip strength. Assessment of gait parameters, in particular velocity, stance phase, and swing phase, is a potentially sensitive outcome measure for evaluating treatment response in CIDP. Muscle Nerve 56: 732-736, 2017. © 2017 Wiley Periodicals, Inc.
Improving estimates of air pollution exposure through ubiquitous sensing technologies
de Nazelle, Audrey; Seto, Edmund; Donaire-Gonzalez, David; Mendez, Michelle; Matamala, Jaume; Nieuwenhuijsen, Mark J; Jerrett, Michael
2013-01-01
Traditional methods of exposure assessment in epidemiological studies often fail to integrate important information on activity patterns, which may lead to bias, loss of statistical power, or both in health effects estimates. Novel sensing technologies integrated with mobile phones offer the potential to reduce exposure measurement error. We sought to demonstrate the usability and relevance of the CalFit smartphone technology for tracking person-level time, geographic location, and physical activity patterns for improved air pollution exposure assessment. We deployed CalFit-equipped smartphones in a free-living population of 36 subjects in Barcelona, Spain. Information obtained on physical activity and geographic location was linked to space-time air pollution mapping. For instance, we found that on average travel activities accounted for 6% of people's time and 24% of their daily inhaled NO2. Due to the large number of mobile phone users, this technology potentially provides an unobtrusive means of collecting epidemiologic exposure data at low cost. PMID:23416743
Process improvement methodologies uncover unexpected gaps in stroke care.
Kuner, Anthony D; Schemmel, Andrew J; Pooler, B Dustin; Yu, John-Paul J
2018-01-01
Background: The diagnosis and treatment of acute stroke require timely and coordinated effort across multiple clinical teams. Purpose: To analyze the frequency and temporal distribution of emergent stroke evaluations (ESEs) to identify potential contributory workflow factors that may delay the initiation and subsequent evaluation of emergency department stroke patients. Material and Methods: A total of 719 sentinel ESEs with concurrent neuroimaging were identified over a 22-month retrospective period. Frequency data were tabulated and odds ratios calculated. Results: Of all ESEs, 5% occurred between 01:00 and 07:00. ESEs were most frequent during the late morning and early afternoon hours (10:00-14:00). Unexpectedly, there was a statistically significant decline in the frequency of ESEs at the 14:00 time point. Conclusion: Temporal analysis of ESEs in the emergency department allowed us to identify an unexpected decrease in ESEs and, through process improvement methodologies (Lean and Six Sigma), to identify potential workflow elements contributing to this observation.
Educational Statistics and School Improvement. Statistics and the Federal Role in Education.
ERIC Educational Resources Information Center
Hawley, Willis D.
This paper focuses on how educational statistics might better serve the quest for educational improvement in elementary and secondary schools. A model for conceptualizing the sources and processes of school productivity is presented. The Learning Productivity Model suggests that school outcomes are the consequence of the interaction of five…
CORSSA: The Community Online Resource for Statistical Seismicity Analysis
Michael, Andrew J.; Wiemer, Stefan
2010-01-01
Statistical seismology is the application of rigorous statistical methods to earthquake science with the goal of improving our knowledge of how the earth works. Within statistical seismology there is a strong emphasis on the analysis of seismicity data in order to improve our scientific understanding of earthquakes and to improve the evaluation and testing of earthquake forecasts, earthquake early warning, and seismic hazard assessments. Given the societal importance of these applications, statistical seismology must be done well. Unfortunately, a lack of educational resources and available software tools makes it difficult for students and new practitioners to learn about this discipline. The goal of the Community Online Resource for Statistical Seismicity Analysis (CORSSA) is to promote excellence in statistical seismology by providing the knowledge and resources necessary to understand and implement the best practices, so that readers can apply these methods to their own research. This introduction describes the motivation for and vision of CORSSA, as well as its structure and contents.
New statistical potential for quality assessment of protein models and a survey of energy functions
2010-01-01
Background: Scoring functions, such as molecular mechanics force fields and statistical potentials, are fundamentally important tools in protein structure modeling and quality assessment. Results: The performances of a number of publicly available scoring functions are compared with statistical rigor, with an emphasis on knowledge-based potentials. We explored the effect on accuracy of alternative choices for representing interaction center types and other features of scoring functions, such as using information on solvent accessibility and torsion angles, and accounting for secondary structure preferences and side chain orientation. Partially based on these observations, we present a novel residue-based statistical potential, which employs a shuffled reference state definition and takes into account the mutual orientation of residue side chains. Atom- and residue-level statistical potentials and Linux executables to calculate the energy of a given protein proposed in this work can be downloaded from http://www.fiserlab.org/potentials. Conclusions: Among the most influential terms, we observed a critical role for a proper reference state definition and the benefits of including information about the microenvironment of interaction centers. Molecular mechanical potentials were also tested and found to be over-sensitive to small local imperfections in a structure, requiring unfeasibly long energy relaxation before energy scores started to correlate with model quality. PMID:20226048
Ross, Ruth E.; Shade-Zeldow, Yvonne; Kostas, Konstantinos; Morrissey, Mary; Elias, Dean A.; Shepard, Alan
2007-01-01
Some patients with fibromyalgia also exhibit the neurological signs of cervical myelopathy. We sought to determine whether treatment of cervical myelopathy in patients with fibromyalgia improves the symptoms of fibromyalgia and the patients' quality of life. A non-randomized, prospective, case-control study comparing the outcome of surgical (n = 40) versus non-surgical (n = 31) treatment of cervical myelopathy in patients with fibromyalgia was conducted. Outcomes were compared using the SF-36, a screening test for somatization, the HADS, MMPI-2 scale 1 (Hypochondriasis), and self-reported severity of symptoms 1 year after treatment. There was no significant difference in initial clinical presentation or demographic characteristics between the patients treated by surgical decompression and those treated by non-surgical means. There was a striking and statistically significant improvement in all symptoms attributed to the fibromyalgia syndrome in the surgical patients but not in the non-surgical patients at 1 year following the treatment of cervical myelopathy (P ≤ 0.018–0.001, chi-square or Fisher's exact test). At the 1 year follow-up, there was a statistically significant improvement in both physical and mental quality of life as measured by the SF-36 score for the surgical group as compared to the non-surgical group (repeated-measures ANOVA, P < 0.01). There was a statistically significant improvement in the scores from Scale 1 of the MMPI-2 and the screening test for somatization disorder, and in the anxiety and depression scores, exclusively in the surgical patients (Wilcoxon signed rank, P < 0.001). The surgical treatment of cervical myelopathy due to spinal cord or caudal brainstem compression in patients carrying the diagnosis of fibromyalgia can result in a significant improvement in a wide array of symptoms usually attributed to fibromyalgia, with attendant measurable improvements in quality of life. We recommend detailed neurological and neuroradiological evaluation of patients with fibromyalgia in order to exclude compressive cervical myelopathy, a potentially treatable condition. PMID:17426987
Efficacy of Curcuma for Treatment of Osteoarthritis
Perkins, Kimberly; Sahy, William; Beckett, Robert D.
2016-01-01
The objective of this review is to identify, summarize, and evaluate clinical trials to determine the efficacy of curcuma in the treatment of osteoarthritis. A literature search for interventional studies assessing the efficacy of curcuma was performed, resulting in 8 clinical trials. Studies have investigated the effect of curcuma on pain, stiffness, and functionality in patients with knee osteoarthritis. Curcuma-containing products consistently demonstrated statistically significant improvement in osteoarthritis-related endpoints compared with placebo, with one exception. When compared with active control, curcuma-containing products were similar to nonsteroidal anti-inflammatory drugs, and potentially to glucosamine. While statistically significant differences in outcomes were reported in a majority of studies, the small magnitude of effect and presence of major study limitations hinder application of these results. Further rigorous studies are needed before curcuma can be recommended as an effective alternative therapy for knee osteoarthritis. PMID:26976085
Boe, Debra Thingstad; Parsons, Helen
2009-01-01
Local public health agencies are challenged to continually improve service delivery, yet they frequently operate with constrained resources. Quality improvement methods and techniques such as statistical process control are commonly used in other industries, and they have recently been proposed as a means of improving service delivery and performance in public health settings. We analyzed a quality improvement project undertaken at a local Special Supplemental Nutrition Program for Women, Infants, and Children (WIC) clinic to reduce waiting times and improve client satisfaction with a walk-in nutrition education service. We used statistical process control techniques to evaluate initial process performance, implement an intervention, and assess process improvements. We found that implementation of these techniques significantly reduced waiting time and improved clients' satisfaction with the WIC service. PMID:19608964
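Statistical process control of the kind used in this project is commonly implemented with control charts. The following is a minimal sketch of an individuals (XmR) chart on hypothetical weekly waiting-time data; it illustrates the general technique, not the clinic's actual analysis:

```python
import numpy as np

def xmr_limits(x):
    """Control limits for an individuals (XmR) chart.

    Uses the average moving range; 2.66 is the standard XmR constant
    (3 / d2, with d2 = 1.128 for subgroups of size 2).
    """
    x = np.asarray(x, dtype=float)
    mr_bar = np.mean(np.abs(np.diff(x)))   # average moving range
    center = x.mean()
    return center - 2.66 * mr_bar, center, center + 2.66 * mr_bar

# Hypothetical weekly mean waiting times (minutes) before an intervention
waits = [32, 28, 35, 30, 33, 29, 31, 34, 27, 30]
lcl, cl, ucl = xmr_limits(waits)
print(f"LCL={lcl:.1f}  CL={cl:.1f}  UCL={ucl:.1f}")
signals = [w for w in waits if w > ucl or w < lcl]  # special-cause points
```

Points falling outside the limits after an intervention (or sustained runs on one side of the center line) are the usual evidence that a process change, rather than common-cause variation, has occurred.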
Slattery, Rebecca A; Ort, Donald R
2015-06-01
The conversion efficiency (ε(c)) of absorbed radiation into biomass (MJ of dry matter per MJ of absorbed photosynthetically active radiation) is a component of yield potential that has been estimated at less than half the theoretical maximum. Various strategies have been proposed to improve ε(c), but a statistical analysis to establish baseline ε(c) levels across different crop functional types is lacking. Data from 164 published ε(c) studies conducted in relatively unstressed growth conditions were used to determine the means, greatest contributors to variation, and genetic trends in ε(c )across important food and biofuel crop species. ε(c) was greatest in biofuel crops (0.049-0.066), followed by C4 food crops (0.046-0.049), C3 nonlegumes (0.036-0.041), and finally C3 legumes (0.028-0.035). Despite confining our analysis to relatively unstressed growth conditions, total incident solar radiation and average growing season temperature most often accounted for the largest portion of ε(c) variability. Genetic improvements in ε(c), when present, were less than 0.7% per year, revealing the unrealized potential of improving ε(c) as a promising contributing strategy to meet projected future agricultural demand. © 2015 American Society of Plant Biologists. All Rights Reserved.
Multiscale decoding for reliable brain-machine interface performance over time.
Han-Lin Hsieh; Wong, Yan T; Pesaran, Bijan; Shanechi, Maryam M
2017-07-01
Recordings from invasive implants can degrade over time, resulting in a loss of spiking activity for some electrodes. For brain-machine interfaces (BMI), such a signal degradation lowers control performance. Achieving reliable performance over time is critical for BMI clinical viability. One approach to improve BMI longevity is to simultaneously use spikes and other recording modalities such as local field potentials (LFP), which are more robust to signal degradation over time. We have developed a multiscale decoder that can simultaneously model the different statistical profiles of multi-scale spike/LFP activity (discrete spikes vs. continuous LFP). This decoder can also run at multiple time-scales (millisecond for spikes vs. tens of milliseconds for LFP). Here, we validate the multiscale decoder for estimating the movement of 7 major upper-arm joint angles in a non-human primate (NHP) during a 3D reach-to-grasp task. The multiscale decoder uses motor cortical spike/LFP recordings as its input. We show that the multiscale decoder can improve decoding accuracy by adding information from LFP to spikes, while running at the fast millisecond time-scale of the spiking activity. Moreover, this improvement is achieved using relatively few LFP channels, demonstrating the robustness of the approach. These results suggest that using multiscale decoders has the potential to improve the reliability and longevity of BMIs.
Balsis, Steve; Choudhury, Tabina K; Geraci, Lisa; Benge, Jared F; Patrick, Christopher J
2018-04-01
Alzheimer's disease (AD) affects neurological, cognitive, and behavioral processes. Thus, to accurately assess this disease, researchers and clinicians need to combine and incorporate data across these domains. This presents not only distinct methodological and statistical challenges but also unique opportunities for the development and advancement of psychometric techniques. In this article, we describe relatively recent research using item response theory (IRT) that has been used to make progress in assessing the disease across its various symptomatic and pathological manifestations. We focus on applications of IRT to improve scoring, test development (including cross-validation and adaptation), and linking and calibration. We conclude by describing potential future multidimensional applications of IRT techniques that may improve the precision with which AD is measured.
Interrupted Time Series Versus Statistical Process Control in Quality Improvement Projects.
Andersson Hagiwara, Magnus; Andersson Gäre, Boel; Elg, Mattias
2016-01-01
To measure the effect of quality improvement interventions, it is appropriate to use analysis methods that measure data over time. Examples of such methods include statistical process control analysis and interrupted time series with segmented regression analysis. This article compares the use of statistical process control analysis and interrupted time series with segmented regression analysis for evaluating the longitudinal effects of quality improvement interventions, using as an example a study evaluating a computerized decision support system.
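For readers unfamiliar with the second method: interrupted time series with segmented regression fits a baseline level and trend plus an immediate level change and a trend change at the intervention point. A minimal sketch on simulated data (the variable names and simulated series are assumptions):

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical monthly outcome; intervention takes effect at month 24
n, t0 = 48, 24
t = np.arange(n)
post = (t >= t0).astype(float)   # indicator: post-intervention period
t_post = post * (t - t0)         # time elapsed since the intervention
rng = np.random.default_rng(0)
y = 50 + 0.2 * t - 4.0 * post - 0.3 * t_post + rng.normal(0, 1.5, n)

X = sm.add_constant(np.column_stack([t, post, t_post]))
fit = sm.OLS(y, X).fit()
# params: baseline level, baseline trend, level change, trend change
print(fit.params)
```

In practice the residuals of such series are often autocorrelated, so robust (e.g., Newey-West) standard errors or ARIMA error models are commonly added on top of this basic specification.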
Molecular Risk Factors for Schizophrenia.
Modai, Shira; Shomron, Noam
2016-03-01
Schizophrenia (SZ) is a complex and strongly heritable mental disorder, which is also associated with developmental-environmental triggers. As opposed to most diagnosable diseases (yet similar to other mental disorders), SZ diagnosis is commonly based on psychiatric evaluations. Recently, large-scale genetic and epigenetic approaches have been applied to SZ research with the goal of potentially improving diagnosis. Increased computational analyses and applied statistical algorithms may shed some light on the complex genetic and epigenetic pathways contributing to SZ pathogenesis. This review discusses the latest advances in molecular risk factors and diagnostics for SZ. Approaches such as these may lead to a more accurate definition of SZ and assist in creating extended and reliable clinical diagnoses with the potential for personalized treatment. Copyright © 2016 Elsevier Ltd. All rights reserved.
Number needed to eat: pizza and resident conference attendance.
Cosimini, Michael J; Mackintosh, Liza; Chang, Todd P
2016-12-01
The didactic conference is a common part of the resident education curriculum. Given the demands of clinical responsibilities and restrictions on duty hours, maximising education is a challenge faced by all residency programmes. To date, little research exists with respect to how the provision of complimentary food affects physician and resident conference attendance. The objective of this study was to determine whether complimentary food improves resident arrival times and attendance at educational conferences and, furthermore, to test whether this provision is a potentially cost-effective tool for improving education. A retrospective review of 36 resident educational Friday noon conferences, including 1043 resident arrivals, was performed. Data were analysed for total attendance, arrival times, number needed to eat (NNE) and the percentage of residents arriving on time, and compared between days on which food was and was not provided. Median attendance was 3.7% higher (p = 0.04) on days on which food was provided, at a cost of US$46 for each additional resident in attendance. Arrival times were also statistically significantly improved when food was provided, with a median improvement of 0.7 minutes (p = 0.02) and an 11.0% increase in on-time arrivals (p < 0.001). The NNE was 10.6. Complimentary food improves both attendance and arrival times by a small, but statistically significant, degree. The provision of complimentary food can be considered as an incentive for attendance and on-time arrival at didactic educational sessions, although more cost-effective modalities may exist. © 2016 John Wiley & Sons Ltd and The Association for the Study of Medical Education.
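The number needed to eat reported above is the attendance analogue of a number needed to treat: the reciprocal of the absolute difference in attendance proportions between food and no-food days. In hypothetical symbols:

```latex
\mathrm{NNE} \;=\; \frac{1}{p_{\text{food}} - p_{\text{no food}}}
```

On this NNT-style reading (an assumption about the authors' definition), an NNE of 10.6 corresponds to roughly one additional resident in attendance for every 11 residents offered food.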
Schenone, Mauro; Ziebarth, Sarah; Duncan, Jose; Stokes, Lea; Hernandez, Angela
2018-02-05
To investigate the proportion of documented ultrasound findings that were unsupported by stored ultrasound images in the obstetric ultrasound unit, before and after the implementation of a quality improvement process consisting of a checklist and feedback. A quality improvement process was created involving utilization of a checklist and feedback from physician to sonographer. The feedback was based on findings of the physician's review of the report and images using a check list. To assess the impact of this process, two groups were compared. Group 1 consisted of 58 ultrasound reports created prior to initiation of the process. Group 2 included 65 ultrasound reports created after process implementation. Each chart was reviewed by a physician and a sonographer. Findings considered unsupported by stored images by both reviewers were used for analysis, and the proportion of unsupported findings was compared between the two groups. Results are expressed as mean ± standard error. A p value of < .05 was used to determine statistical significance. Univariate analysis of baseline characteristics and potential confounders showed no statistically significant difference between the groups. The mean proportion of unsupported findings in Group 1 was 5.1 ± 0.87, with Group 2 having a significantly lower proportion (2.6 ± 0.62) (p value = .018). Results suggest a significant decrease in the proportion of unsupported findings in ultrasound reports after quality improvement process implementation. Thus, we present a simple yet effective quality improvement process to reduce unsupported ultrasound findings.
Ha Dinh, Thi Thuy; Bonner, Ann; Clark, Robyn; Ramsbotham, Joanne; Hines, Sonia
2016-01-01
Chronic diseases are increasing worldwide and have become a significant burden to those affected by them. Disease-specific education programs have demonstrated improved outcomes, although people often forget information quickly or memorize it incorrectly. The teach-back method was introduced in an attempt to reinforce education to patients. To date, the evidence regarding the effectiveness of health education employing the teach-back method in improved care has not yet been reviewed systematically. This systematic review examined the evidence on using the teach-back method in health education programs for improving adherence and self-management of people with chronic disease. Adults aged 18 years and over with one or more chronic diseases. All types of interventions which included the teach-back method in an education program for people with chronic diseases. The comparator was chronic disease education programs that did not involve the teach-back method. Randomized and non-randomized controlled trials, cohort studies, before-after studies and case-control studies. The outcomes of interest were adherence, self-management, disease-specific knowledge, readmission, knowledge retention, self-efficacy and quality of life. Searches were conducted in CINAHL, MEDLINE, EMBASE, Cochrane CENTRAL, Web of Science, ProQuest Nursing and Allied Health Source, and Google Scholar databases. Search terms were combined by AND or OR in search strings. Reference lists of included articles were also searched for further potential references. Two reviewers conducted quality appraisal of papers using the Joanna Briggs Institute Meta-Analysis of Statistics Assessment and Review Instrument. Data were extracted using the Joanna Briggs Institute Meta-Analysis of Statistics Assessment and Review Instrument data extraction instruments. There was significant heterogeneity in selected studies, hence a meta-analysis was not possible and the results were presented in narrative form. Of the 21 articles retrieved in full, 12 on the use of the teach-back method met the inclusion criteria and were selected for analysis. Four studies confirmed improved disease-specific knowledge in intervention participants. One study showed a statistically significant improvement in adherence to medication and diet among patients with type 2 diabetes in the intervention group compared to the control group (p < 0.001). Two studies found statistically significant improvements in self-efficacy (p = 0.0026 and p < 0.001) in the intervention groups. One study examined quality of life in heart failure patients but the results did not improve from the intervention (p = 0.59). Five studies found a reduction in readmission rates and hospitalization but these were not always statistically significant. Two studies showed improvement in daily weighing among heart failure participants, and in adherence to diet, exercise and foot care among those with type 2 diabetes. Overall, the teach-back method showed positive effects in a wide range of health care outcomes although these were not always statistically significant. Studies in this systematic review revealed improved outcomes in disease-specific knowledge, adherence, self-efficacy and the inhaler technique. There was a positive but inconsistent trend also seen in improved self-care and reduction of hospital readmission rates.
There was limited evidence on improvement in quality of life or disease-related knowledge retention. Evidence from the systematic review supports the use of the teach-back method in educating people with chronic disease to maximize their disease understanding and promote knowledge, adherence, self-efficacy and self-care skills. Future studies are required to strengthen the evidence on effects of the teach-back method. Larger randomized controlled trials will be needed to determine the effectiveness of the teach-back method on quality of life, readmission, and hospitalization rates.
Robot-assisted gait training in patients with Parkinson disease: a randomized controlled trial.
Picelli, Alessandro; Melotti, Camilla; Origano, Francesca; Waldner, Andreas; Fiaschi, Antonio; Santilli, Valter; Smania, Nicola
2012-05-01
Gait impairment is a common cause of disability in Parkinson disease (PD), and electromechanical devices to assist stepping have been suggested as a potential intervention. The objective was to evaluate whether a rehabilitation program of robot-assisted gait training (RAGT) is more effective than conventional physiotherapy to improve walking. A total of 41 patients with PD were randomly assigned to 45-minute treatment sessions (12 in all), 3 days a week, for 4 consecutive weeks of either robotic stepper training (RST; n = 21) using the Gait Trainer or physiotherapy (PT; n = 20) with active joint mobilization and a modest amount of conventional gait training. Participants were evaluated before, immediately after, and 1 month after treatment. Primary outcomes were 10-m walking speed and distance walked in 6 minutes. Baseline measures revealed no statistical differences between groups, but the PT group walked 0.12 m/s slower; 5 patients withdrew. A statistically significant improvement was found in favor of the RST group (walking speed 1.22 ± 0.19 m/s [P = .035]; distance 366.06 ± 78.54 m [P < .001]) compared with the PT group (0.98 ± 0.32 m/s; 280.11 ± 106.61 m). The RAGT mean speed increased by 0.13 m/s, which is probably not clinically important. Improvements were maintained 1 month later. RAGT may improve aspects of walking ability in patients with PD. Future trials should compare robotic assistive training with treadmill or equal amounts of overground walking practice.
NASA Astrophysics Data System (ADS)
Manzanas, R.; Lucero, A.; Weisheimer, A.; Gutiérrez, J. M.
2018-02-01
Statistical downscaling methods are popular post-processing tools which are widely used in many sectors to adapt the coarse-resolution biased outputs from global climate simulations to the regional-to-local scale typically required by users. They range from simple and pragmatic Bias Correction (BC) methods, which directly adjust the model outputs of interest (e.g. precipitation) according to the available local observations, to more complex Perfect Prognosis (PP) ones, which indirectly derive local predictions (e.g. precipitation) from appropriate upper-air large-scale model variables (predictors). Statistical downscaling methods have been extensively used and critically assessed in climate change applications; however, their advantages and limitations in seasonal forecasting are not well understood yet. In particular, a key problem in this context is whether they serve to improve the forecast quality/skill of raw model outputs beyond the adjustment of their systematic biases. In this paper we analyze this issue by applying two state-of-the-art BC and two PP methods to downscale precipitation from a multimodel seasonal hindcast in a challenging tropical region, the Philippines. To properly assess the potential added value beyond the reduction of model biases, we consider two validation scores which are not sensitive to changes in the mean (correlation and reliability categories). Our results show that, whereas BC methods maintain or worsen the skill of the raw model forecasts, PP methods can yield significant skill improvement (worsening) in cases for which the large-scale predictor variables considered are better (worse) predicted by the model than precipitation. For instance, PP methods are found to increase (decrease) model reliability in nearly 40% of the stations considered in boreal summer (autumn). Therefore, the choice of a convenient downscaling approach (either BC or PP) depends on the region and the season.
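The contrast between BC and PP methods above turns on what each can change: BC only reshapes the forecast distribution toward the observed climatology. A minimal sketch of one standard BC technique, empirical quantile mapping, on synthetic data (this is illustrative and not the specific method pair evaluated in the paper):

```python
import numpy as np

def quantile_map(model_hist, obs_hist, model_fcst):
    """Empirical quantile mapping: replace each forecast value with the
    observed quantile matching its rank in the model climatology."""
    model_hist = np.sort(np.asarray(model_hist, dtype=float))
    obs_hist = np.asarray(obs_hist, dtype=float)
    # quantile of each forecast value within the model climatology
    q = np.searchsorted(model_hist, model_fcst) / len(model_hist)
    return np.quantile(obs_hist, np.clip(q, 0.0, 1.0))

# Hypothetical daily precipitation (mm): model drizzles too often
rng = np.random.default_rng(1)
model_clim = rng.gamma(0.8, 4.0, 3000)   # model climatology
obs_clim = rng.gamma(0.5, 9.0, 3000)     # observed climatology
print(quantile_map(model_clim, obs_clim, np.array([0.5, 5.0, 20.0])))
```

Because the mapping is monotone, it largely preserves the ranking of forecasts, which is consistent with the authors' finding that BC adjusts systematic biases without adding skill.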
Petropoulou, Maria; Nikolakopoulou, Adriani; Veroniki, Areti-Angeliki; Rios, Patricia; Vafaei, Afshin; Zarin, Wasifa; Giannatsi, Myrsini; Sullivan, Shannon; Tricco, Andrea C; Chaimani, Anna; Egger, Matthias; Salanti, Georgia
2017-02-01
To assess the characteristics and core statistical methodology specific to network meta-analyses (NMAs) in clinical research articles. We searched MEDLINE, EMBASE, and the Cochrane Database of Systematic Reviews from inception until April 14, 2015, for NMAs of randomized controlled trials including at least four different interventions. Two reviewers independently screened potential studies, whereas data abstraction was performed by a single reviewer and verified by a second. A total of 456 NMAs, which included a median (interquartile range) of 21 (13-40) studies and 7 (5-9) treatment nodes, were assessed. A total of 125 NMAs (27%) were star networks; this proportion declined from 100% in 2005 to 19% in 2015 (P = 0.01 by test of trend). An increasing number of NMAs discussed transitivity or inconsistency (0% in 2005, 86% in 2015, P < 0.01) and 150 (45%) used appropriate methods to test for inconsistency (14% in 2006, 74% in 2015, P < 0.01). Heterogeneity was explored in 256 NMAs (56%), with no change over time (P = 0.10). All pairwise effects were reported in 234 NMAs (51%), with some increase over time (P = 0.02). The hierarchy of treatments was presented in 195 NMAs (43%), the probability of being best was most commonly reported (137 NMAs, 70%), but use of surface under the cumulative ranking curves increased steeply (0% in 2005, 33% in 2015, P < 0.01). Many NMAs published in the medical literature have significant limitations in both the conduct and reporting of the statistical analysis and numerical results. The situation has, however, improved in recent years, in particular with respect to the evaluation of the underlying assumptions, but considerable room for further improvements remains. Copyright © 2016 Elsevier Inc. All rights reserved.
A geostatistical state-space model of animal densities for stream networks.
Hocking, Daniel J; Thorson, James T; O'Neil, Kyle; Letcher, Benjamin H
2018-06-21
Population dynamics are often correlated in space and time due to correlations in environmental drivers as well as synchrony induced by individual dispersal. Many statistical analyses of populations ignore potential autocorrelations and assume that survey methods (distance and time between samples) eliminate these correlations, allowing samples to be treated independently. If these assumptions are incorrect, results and therefore inference may be biased and uncertainty under-estimated. We developed a novel statistical method to account for spatio-temporal correlations within dendritic stream networks, while accounting for imperfect detection in the surveys. Through simulations, we found this model decreased predictive error relative to standard statistical methods when data were spatially correlated based on stream distance and performed similarly when data were not correlated. We found that increasing the number of years surveyed substantially improved the model accuracy when estimating spatial and temporal correlation coefficients, especially from 10 to 15 years. Increasing the number of survey sites within the network improved the performance of the non-spatial model but only marginally improved the density estimates in the spatio-temporal model. We applied this model to Brook Trout data from the West Susquehanna Watershed in Pennsylvania collected over 34 years, from 1981 to 2014. We found the model including temporal and spatio-temporal autocorrelation best described young-of-the-year (YOY) and adult density patterns. YOY densities were positively related to forest cover and negatively related to spring temperatures with low temporal autocorrelation and moderately high spatio-temporal correlation. Adult densities were less strongly affected by climatic conditions and less temporally variable than YOY but with similar spatio-temporal correlation and higher temporal autocorrelation. This article is protected by copyright. All rights reserved.
Motivating factors among Iranian nurses
Negarandeh, Reza; Dehghan-Nayeri, Nahid; Ghasemi, Elham
2015-01-01
Background: One of the most important challenges of the Iranian health care system is “quality of care,” and it is assumed that motivated nurses are more ready to provide better care. Few studies have investigated Iranian nurses’ motivation, and the factors that motivate them have not yet been identified. Identifying the motivating factors enables nurse managers to inspire nurses for continuous quality improvement. The aim of this study was to identify motivating factors for Iranian hospital nurses. Materials and Methods: This is a cross-sectional descriptive study in which 310 nurses working at 14 hospitals of Tehran University of Medical Sciences were selected by proportionate stratified random sampling. Data were collected in 2010 by a researcher-developed questionnaire. Descriptive statistics and independent t-test, analysis of variance, Tukey post-hoc test, Chi-square and Fisher's exact test were used for statistical analysis with the Statistical Package for the Social Sciences (SPSS) version 16. Results: The mean score of motivation was 90.53 ± 10.76 (range: 59–121). Four motivating factors including “career development” (22.63 ± 5.66), “job characteristics” (34.29 ± 4), “job authority” (18.48 ± 2.79), and “recognition” (15.12 ± 2.5) were recognized. Adjusted for the number of items, the lowest mean motivation score was 3.23 for career development, while the highest was 3.81 for job characteristics. Conclusions: The findings showed that motivation of nurses was at a medium level, which calls for improvement. The factors that have the greatest potential to motivate nurses were identified in this study, and they can help managers to achieve the goal of continuous quality improvement. PMID:26257797
Péron, Julien; Pond, Gregory R; Gan, Hui K; Chen, Eric X; Almufti, Roula; Maillet, Denis; You, Benoit
2012-07-03
The Consolidated Standards of Reporting Trials (CONSORT) guidelines were developed in the mid-1990s for the explicit purpose of improving clinical trial reporting. However, there is little information regarding the adherence to CONSORT guidelines of recent publications of randomized controlled trials (RCTs) in oncology. All phase III RCTs published between 2005 and 2009 were reviewed using an 18-point overall quality score for reporting based on the 2001 CONSORT statement. Multivariable linear regression was used to identify features associated with improved reporting quality. To provide baseline data for future evaluations of reporting quality, RCTs were also assessed according to the 2010 revised CONSORT statement. All statistical tests were two-sided. A total of 357 RCTs were reviewed. The mean 2001 overall quality score was 13.4 on a scale of 0-18, whereas the mean 2010 overall quality score was 19.3 on a scale of 0-27. The overall RCT reporting quality score improved by 0.21 points per year from 2005 to 2009. Poorly reported items included method used to generate the random allocation (adequately reported in 29% of trials), whether and how blinding was applied (41%), method of allocation concealment (51%), and participant flow (59%). High impact factor (IF, P = .003), recent publication date (P = .008), and geographic origin of RCTs (P = .003) were independent factors statistically significantly associated with higher reporting quality in a multivariable regression model. Sample size, tumor type, and positivity of trial results were not associated with higher reporting quality, whereas funding source and treatment type had a borderline statistically significant impact. The results show that numerous items remained unreported for many trials. Thus, given the potential impact of poorly reported trials, oncology journals should require even stricter adherence to the CONSORT guidelines.
Copie, X; Blankoff, I; Hnatkova, K; Fei, L; Camm, A J; Malik, M
1996-06-01
The authors studied the possibility of improving the reproducibility of the signal-averaged ECG by increasing the number of averaged QRS complexes. One hundred patients were included in the study. In each case, 400 QRS complexes were recorded twice, consecutively, under strictly identical conditions. During each recording, the total duration of the amplified and averaged QRS complex (tQRS), the duration of the terminal signal below 40 microV (LAS) and the root mean square of the amplitude of the last 40 ms (RMS) were determined for 100, 200, 300 and 400 recorded QRS complexes. The presence of late potentials was defined as the positivity of two of the following criteria: tQRS > 114 ms, LAS > 38 ms, RMS < 20 microV. The number of contradictory diagnostic conclusions between two successive recordings of the same duration decreased progressively with the number of averaged QRS complexes: 10 for 100 QRS, 10 for 200 QRS, 9 for 300 QRS and 6 for 400 QRS complexes, but this improvement was not statistically significant. The absolute differences of tQRS and RMS between two successive recordings of the same duration were statistically different for the four durations of recording (p = 0.05) and there was a tendency towards statistical significance for LAS (p = 0.09). The best quantitative reproducibility of the 3 parameters was obtained with the recording of 300 QRS complexes. In conclusion, the reproducibility of the signal-averaged ECG is improved when the number of averaged QRS complexes is increased. The authors' results suggest that reproducibility is optimal with the amplification and averaging of 300 QRS complexes.
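The improvement with more averaged beats is what simple signal-averaging theory predicts: for uncorrelated noise, the residual noise amplitude of the average falls with the square root of the number of complexes. As a back-of-envelope check (assuming beat-to-beat noise is approximately uncorrelated):

```latex
\sigma_{\bar{x}} = \frac{\sigma}{\sqrt{N}}, \qquad
\frac{\sigma_{300}}{\sigma_{100}} = \sqrt{\frac{100}{300}} \approx 0.58
```

Moving from 100 to 300 averaged complexes should therefore cut residual noise by roughly 40%, consistent with the better reproducibility the authors observed at 300 complexes.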
Efficient Simulation of Tropical Cyclone Pathways with Stochastic Perturbations
NASA Astrophysics Data System (ADS)
Webber, R.; Plotkin, D. A.; Abbot, D. S.; Weare, J.
2017-12-01
Global Climate Models (GCMs) are known to statistically underpredict intense tropical cyclones (TCs) because they fail to capture the rapid intensification and high wind speeds characteristic of the most destructive TCs. Stochastic parametrization schemes have the potential to improve the accuracy of GCMs. However, current analysis of these schemes through direct sampling is limited by the computational expense of simulating a rare weather event at fine spatial gridding. The present work introduces a stochastically perturbed parametrization tendency (SPPT) scheme to increase simulated intensity of TCs. We adapt the Weighted Ensemble algorithm to simulate the distribution of TCs at a fraction of the computational effort required in direct sampling. We illustrate the efficiency of the SPPT scheme by comparing simulations at different spatial resolutions and stochastic parameter regimes. Stochastic parametrization and rare event sampling strategies have great potential to improve TC prediction and aid understanding of tropical cyclogenesis. Since rising sea surface temperatures are postulated to increase the intensity of TCs, these strategies can also improve predictions about climate change-related weather patterns. The rare event sampling strategies used in the current work are not only a novel tool for studying TCs, but they may also be applied to sampling any range of extreme weather events.
Roshanov, Pavel S; Misra, Shikha; Gerstein, Hertzel C; Garg, Amit X; Sebaldt, Rolf J; Mackay, Jean A; Weise-Kelly, Lorraine; Navarro, Tamara; Wilczynski, Nancy L; Haynes, R Brian
2011-08-03
The use of computerized clinical decision support systems (CCDSSs) may improve chronic disease management, which requires recurrent visits to multiple health professionals, ongoing disease and treatment monitoring, and patient behavior modification. The objective of this review was to determine if CCDSSs improve the processes of chronic care (such as diagnosis, treatment, and monitoring of disease) and associated patient outcomes (such as effects on biomarkers and clinical exacerbations). We conducted a decision-maker-researcher partnership systematic review. We searched MEDLINE, EMBASE, Ovid's EBM Reviews database, Inspec, and reference lists for potentially eligible articles published up to January 2010. We included randomized controlled trials that compared the use of CCDSSs to usual practice or non-CCDSS controls. Trials were eligible if at least one component of the CCDSS was designed to support chronic disease management. We considered studies 'positive' if they showed a statistically significant improvement in at least 50% of relevant outcomes. Of 55 included trials, 87% (n = 48) measured system impact on the process of care and 52% (n = 25) of those demonstrated statistically significant improvements. Sixty-five percent (36/55) of trials measured impact on, typically, non-major (surrogate) patient outcomes, and 31% (n = 11) of those demonstrated benefits. Factors of interest to decision makers, such as cost, user satisfaction, system interface and feature sets, unique design and deployment characteristics, and effects on user workflow were rarely investigated or reported. A small majority (just over half) of CCDSSs improved care processes in chronic disease management and some improved patient health. Policy makers, healthcare administrators, and practitioners should be aware that the evidence of CCDSS effectiveness is limited, especially with respect to the small number and size of studies measuring patient outcomes. PMID:21824386
Outcomes of a novel minimalist approach for the treatment of cubital tunnel syndrome.
Lan, Zheng D; Tatsui, Claudio E; Jalali, Ali; Humphries, William E; Rilea, Katheryn; Patel, Akash; Ehni, Bruce L
2015-06-01
We describe a minimalist approach to perform in situ decompression of the ulnar nerve. Our technique employs a unique small skin incision strategically placed to minimize postoperative scarring over the ulnar nerve and potentially decrease the risk of iatrogenic injury to the medial antebrachial cutaneous nerve. We retrospectively report the outcome of patients who have undergone this procedure at our institution, the Michael E. DeBakey Veterans Affairs Medical Center, from January 1, 2007, through November 29, 2010. All individuals underwent in situ decompression via the previously described minimalist approach. Outcome variables were Louisiana State University Medical Center (LSU) ulnar neuropathy grade, patient satisfaction, subjective improvement, complications and re-operation rate. A total of 44 procedures were performed in this cohort of 41 patients. Overall, patients' postoperative LSU grades showed a statistically significant improvement (p=0.0019) compared to preoperative grades. Improvement of at least one grade in the LSU scale was observed in 50% of the procedures with a preoperative grade of four or less. Overall procedure satisfaction rate was 88% (39 of 44), with 70% (31 of 44) of the procedures resulting in improvement of symptoms. There were no intraoperative or postoperative complications. One patient required re-operation due to failure of neurological improvement. Our minimalist approach to in situ decompression of the ulnar nerve at the cubital tunnel is both safe and effective. We observed a statistically significant improvement in LSU ulnar neuropathy grades and a success rate comparable to those reported for other more extensive surgical techniques while providing the benefit of a smaller incision, less scarring, decreased risk of iatrogenic nerve injury and minimal complications. Copyright © 2015 Elsevier Ltd. All rights reserved.
Galloway, Claire R; Lebois, Evan P; Shagarabi, Shezza L; Hernandez, Norma A; Manns, Joseph R
2014-01-01
Acetylcholine signaling through muscarinic receptors has been shown to benefit memory performance in some conditions, but pan-muscarinic activation also frequently leads to peripheral side effects. Drug therapies that selectively target M1 or M4 muscarinic receptors could potentially improve memory while minimizing side effects mediated by the other muscarinic receptor subtypes. The ability of three recently developed drugs that selectively activate M1 or M4 receptors to improve recognition memory was tested by giving Long-Evans rats subcutaneous injections of three different doses of the M1 agonist VU0364572, the M1 positive allosteric modulator BQCA or the M4 positive allosteric modulator VU0152100 before performing an object recognition memory task. VU0364572 at 0.1 mg/kg, BQCA at 1.0 mg/kg and VU0152100 at 3.0 and 30.0 mg/kg improved the memory performance of rats that performed poorly at baseline, yet the improvements in memory performance were the most statistically robust for VU0152100 at 3.0 mg/kg. The results suggested that selective M1 and M4 receptor activation each improved memory but that the likelihood of obtaining behavioral efficacy at a given dose might vary between subjects even in healthy groups depending on baseline performance. These results also highlighted the potential of drug therapies that selectively target M1 or M4 receptors to improve memory performance in individuals with impaired memory.
Thermal conductance of and heat generation in tire-pavement interface and effect on aircraft braking
NASA Technical Reports Server (NTRS)
Miller, C. D.
1976-01-01
A finite-difference analysis was performed on temperature records obtained from a free-rolling automotive tire and from the pavement surface. A high thermal contact conductance between tire and asphalt was found on a statistical basis. Average slip due to squirming between tire and asphalt was about 1.5 mm. Consequent friction heat was estimated as 64 percent of total power absorbed by the bias-ply, belted tire. Extrapolation of results to an aircraft tire indicates potential braking improvement from even a moderate increase in the heat-absorbing capacity of the runway surface.
Adapt-Mix: learning local genetic correlation structure improves summary statistics-based analyses
Park, Danny S.; Brown, Brielin; Eng, Celeste; Huntsman, Scott; Hu, Donglei; Torgerson, Dara G.; Burchard, Esteban G.; Zaitlen, Noah
2015-01-01
Motivation: Approaches to identifying new risk loci, training risk prediction models, imputing untyped variants and fine-mapping causal variants from summary statistics of genome-wide association studies are playing an increasingly important role in the human genetics community. Current summary statistics-based methods rely on global ‘best guess’ reference panels to model the genetic correlation structure of the dataset being studied. This approach, especially in admixed populations, has the potential to produce misleading results, ignores variation in local structure and is not feasible when appropriate reference panels are missing or small. Here, we develop a method, Adapt-Mix, that combines information across all available reference panels to produce estimates of local genetic correlation structure for summary statistics-based methods in arbitrary populations. Results: We applied Adapt-Mix to estimate the genetic correlation structure of both admixed and non-admixed individuals using simulated and real data. We evaluated our method by measuring the performance of two summary statistics-based methods: imputation and joint-testing. When using our method as opposed to the current standard of ‘best guess’ reference panels, we observed a 28% decrease in mean-squared error for imputation and a 73.7% decrease in mean-squared error for joint-testing. Availability and implementation: Our method is publicly available in a software package called ADAPT-Mix available at https://github.com/dpark27/adapt_mix. Contact: noah.zaitlen@ucsf.edu PMID:26072481
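One of the summary-statistics applications mentioned above, imputing untyped variants, conventionally treats z-scores as multivariate normal with covariance given by the local genetic correlation (LD) matrix; Adapt-Mix supplies a better estimate of that matrix. Below is a hedged sketch of the conditional-mean imputation step that consumes such an estimate; the LD blocks and ridge regularization are hypothetical, and this is not the ADAPT-Mix weighting scheme itself:

```python
import numpy as np

def impute_z(z_typed, R_tt, R_ut, lam=0.1):
    """Impute z-scores of untyped variants from typed ones.

    Under a multivariate-normal model for GWAS z-scores,
    E[z_u | z_t] = R_ut R_tt^{-1} z_t, where R is the local LD matrix.
    lam: ridge regularization on the typed-typed block (illustrative).
    """
    R_reg = R_tt + lam * np.eye(R_tt.shape[0])
    w = np.linalg.solve(R_reg, z_typed)
    return R_ut @ w

# Hypothetical region: 3 typed variants and 1 untyped variant
R_tt = np.array([[1.0, 0.6, 0.2], [0.6, 1.0, 0.4], [0.2, 0.4, 1.0]])
R_ut = np.array([[0.5, 0.7, 0.3]])
print(impute_z(np.array([2.5, 3.1, 1.2]), R_tt, R_ut))
```

The quality of the imputed z-score is governed almost entirely by how well R matches the true local correlation structure, which is exactly where the mixture-of-panels estimate is claimed to help.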
STATISTICAL SAMPLING AND DATA ANALYSIS
Research is being conducted to develop approaches to improve soil and sediment sampling techniques, measurement design and geostatistics, and data analysis via chemometric, environmetric, and robust statistical methods. Improvements in sampling contaminated soil and other hetero...
Identification of Water Bodies in a Landsat 8 OLI Image Using a J48 Decision Tree.
Acharya, Tri Dev; Lee, Dong Ha; Yang, In Tae; Lee, Jae Kang
2016-07-12
Water bodies are essential to humans and other forms of life. Identification of water bodies can be useful in various ways, including estimation of water availability, demarcation of flooded regions, change detection, and so on. In past decades, Landsat satellite sensors have been used for land use classification and water body identification. Due to the introduction of the new Operational Land Imager (OLI) sensor on Landsat 8, with a high spectral resolution and improved signal-to-noise ratio, the quality of imagery sensed by Landsat 8 has improved, enabling better characterization of land cover and increased data size. Therefore, it is necessary to explore the most appropriate and practical water identification methods that take advantage of the improved image quality and use the fewest inputs based on the original OLI bands. The objective of the study is to explore the potential of a J48 decision tree (JDT) in identifying water bodies using reflectance bands from Landsat 8 OLI imagery. J48 is an open-source implementation of the C4.5 decision tree algorithm. The test site for the study is in the Northern Han River Basin, which is located in Gangwon province, Korea. Training data with individual bands were used to develop the JDT model and later applied to the whole study area. The performance of the model was statistically analysed using the kappa statistic and area under the curve (AUC). The results were compared with five other known water identification methods using a confusion matrix and related statistics. Almost all the methods showed high accuracy, and the JDT was successfully applied to the OLI image using only four bands, where the new additional deep blue band of OLI was found to have the third highest information gain. Thus, the JDT can be a good method for water body identification based on images with improved resolution and increased size.
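For orientation, a J48-style classifier can be approximated in scikit-learn, whose DecisionTreeClassifier implements CART rather than C4.5 but supports the same entropy-based splitting; the band reflectances and labels below are hypothetical stand-ins for real training pixels:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Hypothetical training pixels: four OLI reflectance bands per pixel
# (e.g. deep blue, green, NIR, SWIR); label 1 = water, 0 = non-water
X = np.array([[0.10, 0.08, 0.03, 0.01],   # water: low NIR/SWIR
              [0.11, 0.09, 0.04, 0.02],
              [0.09, 0.12, 0.30, 0.25],   # vegetation/soil: high NIR
              [0.12, 0.14, 0.35, 0.30]])
y = np.array([1, 1, 0, 0])

# J48 implements C4.5; CART with the entropy criterion is a close analogue
clf = DecisionTreeClassifier(criterion="entropy", max_depth=3, random_state=0)
clf.fit(X, y)
print(clf.predict([[0.10, 0.09, 0.05, 0.02]]))  # expect water (1)
```

The learned split thresholds on individual bands mirror the information-gain ranking the abstract reports, e.g. the high gain of the deep blue band.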
Keith, Jeff; Westbury, Chris; Goldman, James
2015-09-01
Corpus-based semantic space models, which primarily rely on lexical co-occurrence statistics, have proven effective in modeling and predicting human behavior in a number of experimental paradigms that explore semantic memory representation. The most widely studied extant models, however, are strongly influenced by orthographic word frequency (e.g., Shaoul & Westbury, Behavior Research Methods, 38, 190-195, 2006). This has the implication that high-frequency closed-class words can potentially bias co-occurrence statistics. Because these closed-class words are purported to carry primarily syntactic, rather than semantic, information, the performance of corpus-based semantic space models may be improved by excluding closed-class words (using stop lists) from co-occurrence statistics, while retaining their syntactic information through other means (e.g., part-of-speech tagging and/or affixes from inflected word forms). Additionally, very little work has been done to explore the effect of employing morphological decomposition on the inflected forms of words in corpora prior to compiling co-occurrence statistics, despite (controversial) evidence that humans perform early morphological decomposition in semantic processing. In this study, we explored the impact of these factors on corpus-based semantic space models. From this study, morphological decomposition appears to significantly improve performance in word-word co-occurrence semantic space models, providing some support for the claim that sublexical information-specifically, word morphology-plays a role in lexical semantic processing. An overall decrease in performance was observed in models employing stop lists (e.g., excluding closed-class words). Furthermore, we found some evidence that weakens the claim that closed-class words supply primarily syntactic information in word-word co-occurrence semantic space models.
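A minimal sketch of the word-word co-occurrence counting such models build on, with an illustrative stop list standing in for closed-class word exclusion (full morphological decomposition would additionally reduce tokens to stems before counting; everything here, including the stop list, is an assumption rather than the authors' pipeline):

```python
from collections import Counter

STOP = {"the", "a", "of", "and", "to", "in", "on"}   # illustrative closed-class list

def cooccurrence(tokens, window=5, use_stoplist=True):
    """Symmetric word-word co-occurrence counts within a sliding window."""
    if use_stoplist:
        tokens = [t for t in tokens if t not in STOP]
    counts = Counter()
    for i, w in enumerate(tokens):
        for c in tokens[i + 1 : i + 1 + window]:
            counts[tuple(sorted((w, c)))] += 1
    return counts

text = "the cat sat on the mat and the dog sat on the rug".split()
print(cooccurrence(text, window=3).most_common(5))
```

Toggling use_stoplist reproduces, in miniature, the manipulation whose effect the study measures: with the stop list off, high-frequency closed-class words dominate the counts.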
Viewing health expenditures, payment and coping mechanisms with an equity lens in Nigeria
2013-01-01
Background This paper examines socio-economic and geographic differences in payment and payment coping mechanisms for health services in southeast Nigeria. It shows the extent to which the poor and rural dwellers disproportionately bear the burden of health care costs and offers policy recommendations for improvements. Methods Questionnaires were used to collect data from 3071 randomly selected households in six communities in southeast Nigeria using a four-week recall. The sample was divided into quintiles (Q1-Q5) using a socio-economic status (SES) index as well as into geographic groups (rural, peri-urban and urban). Tabulations and logistic regression were used to determine the relationships between payment and payment coping mechanisms and key independent variables. Q1/Q5 and rural/urban ratios were the measures of equity. Results Most of the respondents used out-of-pocket spending (OOPS) and their own money to pay for healthcare. There were statistically significant geographic differences in the use of own money to pay for health services, indicating more use among rural dwellers. Logistic regression showed statistically significant geographic differences in the use of both OOPS and own money when controlling for the effects of potential confounders. Conclusions This study shows statistically significant geographic differences in the use of OOPS and own money to pay for health services. Though the SES differences were not statistically significant, they showed high equity ratios indicating more use among poor and rural dwellers. The high expenditure incurred on drugs alone highlights the need for expediting pro-poor interventions like exemptions and waivers aimed at improving access to health care for the vulnerable poor and rural dwellers. PMID:23497246
Building a database for statistical characterization of ELMs on DIII-D
NASA Astrophysics Data System (ADS)
Fritch, B. J.; Marinoni, A.; Bortolon, A.
2017-10-01
Edge localized modes (ELMs) are bursty instabilities which occur in the edge region of H-mode plasmas and have the potential to damage in-vessel components of future fusion machines by exposing the divertor region to large energy and particle fluxes during each ELM event. While most ELM studies focus on average quantities (e.g. energy loss per ELM), this work investigates the statistical distributions of ELM characteristics, as a function of plasma parameters. A semi-automatic algorithm is being used to create a database documenting trigger times of the tens of thousands of ELMs for DIII-D discharges in scenarios relevant to ITER, thus allowing statistically significant analysis. Probability distributions of inter-ELM periods and energy losses will be determined and related to relevant plasma parameters such as density, stored energy, and current in order to constrain models and improve estimates of the expected inter-ELM periods and sizes, both of which must be controlled in future reactors. Work supported in part by US DoE under the Science Undergraduate Laboratory Internships (SULI) program, DE-FC02-04ER54698 and DE-FG02- 94ER54235.
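A hedged sketch of the kind of burst detection and inter-ELM statistics such a database enables, using a synthetic D-alpha-like trace (the peak-prominence threshold rule and the signal itself are illustrative assumptions, not the semi-automatic algorithm used on DIII-D):

```python
import numpy as np
from scipy.signal import find_peaks

def elm_statistics(t, d_alpha, prominence=None):
    """Detect ELM bursts as peaks in a D-alpha trace and return the
    inter-ELM periods for distribution-level (not just mean) analysis."""
    if prominence is None:
        prominence = 3 * np.std(d_alpha)        # crude burst threshold
    peaks, _ = find_peaks(d_alpha, prominence=prominence)
    periods = np.diff(t[peaks])
    return t[peaks], periods

# Hypothetical signal: quasi-periodic bursts on a noisy baseline
t = np.linspace(0.0, 1.0, 10000)
sig = np.random.default_rng(2).normal(0, 0.05, t.size)
for tc in np.arange(0.05, 1.0, 0.07):           # bursts every ~70 ms
    sig += np.exp(-((t - tc) / 0.001) ** 2)
times, periods = elm_statistics(t, sig)
print(len(times), periods.mean())               # counts and mean period
```

Histograms of the returned periods (and of per-burst energy proxies) are the probability distributions the database is designed to constrain.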
Wavelet methodology to improve single unit isolation in primary motor cortex cells.
Ortiz-Rosario, Alexis; Adeli, Hojjat; Buford, John A
2015-05-15
The proper isolation of action potentials recorded extracellularly from neural tissue is an active area of research in the fields of neuroscience and biomedical signal processing. This paper presents an isolation methodology for neural recordings using the wavelet transform (WT), a statistical thresholding scheme, and the principal component analysis (PCA) algorithm. The effectiveness of five different mother wavelets was investigated: biorthogonal, Daubechies, discrete Meyer, symmetric, and Coifman; along with three different wavelet coefficient thresholding schemes: fixed form threshold, Stein's unbiased estimate of risk, and minimax; and two different thresholding rules: soft and hard thresholding. The signal quality was evaluated using three different statistical measures: mean-squared error, root mean square, and signal-to-noise ratio. The clustering quality was evaluated using two different statistical measures: isolation distance and L-ratio. This research shows that the selection of the mother wavelet has a strong influence on the clustering and isolation of single unit neural activity, with the Daubechies 4 wavelet and minimax thresholding scheme performing the best. Copyright © 2015. Published by Elsevier B.V.
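A minimal sketch of wavelet denoising with one of the studied configurations, a Daubechies 4 decomposition with the fixed-form (universal) threshold and soft thresholding, using PyWavelets on synthetic data; the study's full pipeline additionally involves PCA-based clustering, which is omitted here:

```python
import numpy as np
import pywt

def wavelet_denoise(x, wavelet="db4", level=4, mode="soft"):
    """Denoise a recording with the fixed-form (universal) threshold.

    sigma is estimated from the finest detail coefficients (MAD/0.6745);
    threshold = sigma * sqrt(2 ln n), applied to detail coefficients only.
    """
    coeffs = pywt.wavedec(x, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thr = sigma * np.sqrt(2 * np.log(len(x)))
    coeffs[1:] = [pywt.threshold(c, thr, mode=mode) for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(x)]

# Hypothetical extracellular trace: spike-like transients plus noise
rng = np.random.default_rng(3)
x = rng.normal(0, 1, 4096)
x[[500, 1500, 3000]] += 12.0   # three injected "spikes"
clean = wavelet_denoise(x)
```

Swapping mode="soft" for "hard", or the threshold rule for SURE or minimax values, reproduces the axes of comparison the paper evaluates.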
Statistical algorithms improve accuracy of gene fusion detection
Hsieh, Gillian; Bierman, Rob; Szabo, Linda; Lee, Alex Gia; Freeman, Donald E.; Watson, Nathaniel; Sweet-Cordero, E. Alejandro
2017-01-01
Gene fusions are known to play critical roles in tumor pathogenesis. Yet, sensitive and specific algorithms to detect gene fusions in cancer do not currently exist. In this paper, we present a new statistical algorithm, MACHETE (Mismatched Alignment CHimEra Tracking Engine), which achieves highly sensitive and specific detection of gene fusions from RNA-Seq data, including the highest Positive Predictive Value (PPV) compared to the current state-of-the-art, as assessed in simulated data. We show that the best performing published algorithms either find large numbers of fusions in negative control data or suffer from low sensitivity detecting known driving fusions in gold standard settings, such as EWSR1-FLI1. As proof of principle that MACHETE discovers novel gene fusions with high accuracy in vivo, we mined public data to discover and subsequently PCR validate novel gene fusions missed by other algorithms in the ovarian cancer cell line OVCAR3. These results highlight the gains in accuracy achieved by introducing statistical models into fusion detection, and pave the way for unbiased discovery of potentially driving and druggable gene fusions in primary tumors. PMID:28541529
Enhancing residents’ neonatal resuscitation competency through unannounced simulation-based training
Surcouf, Jeffrey W.; Chauvin, Sheila W.; Ferry, Jenelle; Yang, Tong; Barkemeyer, Brian
2013-01-01
Background Almost half of pediatric third-year residents surveyed in 2000 had never led a resuscitation event. With increasing restrictions on residency work hours and a decline in patient volume in some hospitals, there is potential for fewer opportunities. Purpose Our primary purpose was to test the hypothesis that an unannounced mock resuscitation in a high-fidelity in-situ simulation training program would improve both residents’ self-confidence and observed performance of adopted best practices in neonatal resuscitation. Methods Each pediatric and medicine–pediatric resident in one pediatric residency program responded to an unannounced scenario that required resuscitation of the high-fidelity infant simulator. Structured debriefing followed in the same setting, and a second cycle of scenario response and debriefing occurred before ending the 1-hour training experience. Measures included pre- and post-program confidence questionnaires and trained observer assessments of live and videotaped performances. Results Statistically significant pre–post gains in self-confidence were observed for 8 of the 14 NRP critical behaviors (p=0.00–0.03), reflecting knowledge, technical, and non-technical (teamwork) skills. The pre–post gain in overall confidence score was statistically significant (p=0.00). With a maximum possible assessment score of 41, the average pre–post gain was 8.28 and statistically significant (p<0.001). Results of the video-based assessments revealed statistically significant performance gains (p<0.0001). Correlations between live and video-based assessments were strong for pre- and post-training scenario performances (pre: r=0.64, p<0.0001; post: r=0.75, p<0.0001). Conclusions Results revealed high receptivity to in-situ, simulation-based training and significant positive gains in confidence and observed competency-related abilities. Results support the potential for other applications in residency and continuing education. PMID:23522399
NASA Astrophysics Data System (ADS)
di Luca, Alejandro; de Elía, Ramón; Laprise, René
2012-03-01
Regional Climate Models (RCMs) constitute the most often used method to perform affordable high-resolution regional climate simulations. The key issue in the evaluation of nested regional models is to determine whether RCM simulations improve the representation of climatic statistics compared to the driving data, that is, whether RCMs add value. In this study we examine a necessary condition that some climate statistics derived from the precipitation field must satisfy in order that the RCM technique can generate some added value: we focus on whether the climate statistics of interest contain some fine spatial-scale variability that would be absent on a coarser grid. The presence and magnitude of fine-scale precipitation variance required to adequately describe a given climate statistics will then be used to quantify the potential added value (PAV) of RCMs. Our results show that the PAV of RCMs is much higher for short temporal scales (e.g., 3-hourly data) than for long temporal scales (16-day average data) due to the filtering resulting from the time-averaging process. PAV is higher in warm season compared to cold season due to the higher proportion of precipitation falling from small-scale weather systems in the warm season. In regions of complex topography, the orographic forcing induces an extra component of PAV, no matter the season or the temporal scale considered. The PAV is also estimated using high-resolution datasets based on observations allowing the evaluation of the sensitivity of changing resolution in the real climate system. The results show that RCMs tend to reproduce relatively well the PAV compared to observations although showing an overestimation of the PAV in warm season and mountainous regions.
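The PAV diagnostic rests on how much of a statistic's spatial variance lives at scales absent from a coarse grid. Below is a simplified proxy for that idea, assuming block averaging as the coarse-graining operator and a synthetic field (this is an illustration of the principle, not the authors' exact formulation):

```python
import numpy as np

def potential_added_value(field, block=4):
    """Fraction of spatial variance lost by coarse-graining a 2D field.

    Upscales by block-averaging (as a coarse GCM grid would), maps the
    coarse field back to the fine grid, and returns the fine-scale share
    of variance: var(fine - coarse) / var(fine)."""
    ny, nx = field.shape
    ny, nx = ny - ny % block, nx - nx % block
    f = field[:ny, :nx]
    coarse = f.reshape(ny // block, block, nx // block, block).mean(axis=(1, 3))
    back = np.repeat(np.repeat(coarse, block, axis=0), block, axis=1)
    return (f - back).var() / f.var()

# Hypothetical precipitation field with small-scale structure
rng = np.random.default_rng(4)
field = rng.gamma(2.0, 1.0, (64, 64))
print(potential_added_value(field, block=8))
```

Applied to time-averaged versus instantaneous fields, this ratio shrinks with averaging, which is the mechanism behind the paper's finding that PAV is much higher at 3-hourly than at 16-day time scales.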
de Manincor, Michael; Bensoussan, Alan; Smith, Caroline A; Barr, Kylie; Schweickle, Monica; Donoghoe, Lee-Lee; Bourchier, Suzannah; Fahey, Paul
2016-09-01
Depression and anxiety are leading causes of disability worldwide. Current treatments are primarily pharmaceutical and psychological. Questions remain about effectiveness and suitability for different people. Previous research suggests potential benefits of yoga for reducing depression and anxiety. The aim of this study is to investigate the effects of an individualized yoga intervention. A sample of 101 people with symptoms of depression and/or anxiety participated in a randomized controlled trial comparing a 6-week yoga intervention with waitlist control. Yoga was additional to usual treatment. The control group was offered the yoga following the waitlist period. Measures included Depression Anxiety Stress Scale (DASS-21), Kessler Psychological Distress Scale (K10), Short-Form Health Survey (SF12), Scale of Positive and Negative Experience (SPANE), Flourishing Scale (FS), and Connor-Davidson Resilience Scale (CD-RISC2). There were statistically significant differences between yoga and control groups on reduction of depression scores (-4.30; 95% CI: -7.70, -0.01; P = .01; ES -.44). Differences in reduced anxiety scores were not statistically significant (-1.91; 95% CI: -4.58, 0.76; P = .16). Statistically significant differences in favor of yoga were also found on total DASS (P = .03), K10, SF12 mental health, SPANE, FS, and resilience scores (P < .01 for each). Differences in stress and SF12 physical health scores were not statistically significant. Benefits were maintained at 6-week follow-up. Yoga plus regular care was effective in reducing symptoms of depression compared with regular care alone. Further investigation is warranted regarding potential benefits in anxiety. Individualized yoga may be particularly beneficial in mental health care in the broader community. © 2016 Wiley Periodicals, Inc.
Kuwatsuka, Yachiyo
2016-01-01
Observational studies from national and international registries with large volumes of patients are commonly performed to identify superior strategies for hematopoietic stem cell transplantation. Major international and national stem cell transplant registries collect outcome data using electronic data capture systems, and a systematic study support process has been developed. Statistical support for studies is available from some major international registries, and international and national registries also mutually collaborate to promote stem cell transplant outcome studies and transplant-related activities. Transplant registries additionally take measures to improve data quality, and thereby the quality of outcome studies, by utilizing data capture systems and manual data management. Data auditing could potentially improve data quality even further; however, human and budgetary resources can be limiting factors in system construction, and audits of the Japanese transplant registry are not currently performed.
Evans, Jamie; Fitch, Christopher; Collard, Sharon; Henderson, Claire
2018-04-27
In recent years, the UK debt collection industry has taken steps to improve its policies and practices in relation to customers with mental health problems. Little data, however, have been collected to evidence change. This paper examines whether the reported attitudes and practices of debt collection staff when working with customers with mental health problems have changed between 2010 and 2016. This paper draws on descriptive and regression analyses of two cross-sectional surveys of debt collection staff: one conducted in 2010 and one conducted in 2016. All variables analysed show statistically significant changes between 2010 and 2016 indicative of improved reported attitudes and practices. While results suggest an improvement in attitudes and practice may have occurred between 2010 and 2016, research is required to understand this potential shift, its likely causes, and concrete impact on customers.
Topiramate on the quality of life in childhood epilepsy.
Jung, Da-Eun; Kim, Heung-Dong; Hur, Yun-Jung; Eom, So-Yong
2011-10-01
This study evaluated the effect of topiramate (TPM) on the quality of life (QOL) in childhood epilepsy, using the Korean quality of life in childhood epilepsy (K-QOLCE) questionnaire. An open label, prospective, observational study of the families of 664 children with epilepsy from 41 centers was conducted. The parents completed the K-QOLCE at the baseline visit and again 6 months after starting TPM treatment. The parents reported the seizure frequency at both assessment dates. Statistically significant improvements from baseline scores in all K-QOLCE domains except social functioning were found at 6 months after starting TPM treatment (P<0.05). However, improved QOL scores were not dependent on the reduction in seizure frequency. TPM significantly improved QOL in children with epilepsy, suggesting its potential clinical benefits. Copyright © 2010 The Japanese Society of Child Neurology. Published by Elsevier B.V. All rights reserved.
Dexter, Franklin; Shafer, Steven L
2017-03-01
Considerable attention has been drawn to poor reproducibility in the biomedical literature. One explanation is inadequate reporting of statistical methods by authors and inadequate assessment of statistical reporting and methods during peer review. In this narrative review, we examine scientific studies of several well-publicized efforts to improve statistical reporting. We also review several retrospective assessments of the impact of these efforts. These studies show, first, that instructions to authors and statistical checklists are not sufficient; no findings suggested that either improves the quality of statistical methods and reporting. Second, even basic statistics, such as power analyses, are frequently missing or incorrectly performed. Third, statistical review is needed for all papers that involve data analysis. A consistent finding in the studies was that nonstatistical reviewers (eg, "scientific reviewers") and journal editors generally poorly assess statistical quality. We finish by discussing our experience with statistical review at Anesthesia & Analgesia from 2006 to 2016.
ERIC Educational Resources Information Center
Vaughn, Brandon K.; Wang, Pei-Yu
2009-01-01
The emergence of technology has led to numerous changes in mathematical and statistical teaching and learning which have improved the quality of instruction and teacher/student interactions. The teaching of statistics, for example, has shifted from mathematical calculations to higher level cognitive abilities such as reasoning, interpretation, and…
Statistical Reasoning Ability, Self-Efficacy, and Value Beliefs in a University Statistics Course
ERIC Educational Resources Information Center
Olani, A.; Hoekstra, R.; Harskamp, E.; van der Werf, G.
2011-01-01
Introduction: The study investigated the degree to which students' statistical reasoning abilities, statistics self-efficacy, and perceived value of statistics improved during a reform based introductory statistics course. The study also examined whether the changes in these learning outcomes differed with respect to the students' mathematical…
Webster, R J; Williams, A; Marchetti, F; Yauk, C L
2018-07-01
Mutations in germ cells pose potential genetic risks to offspring. However, de novo mutations are rare events that are spread across the genome and are difficult to detect. Thus, studies in this area have generally been under-powered, and no human germ cell mutagen has been identified. Whole Genome Sequencing (WGS) of human pedigrees has been proposed as an approach to overcome these technical and statistical challenges. WGS enables analysis of a much wider breadth of the genome than traditional approaches. Here, we performed power analyses to determine the feasibility of using WGS in human families to identify germ cell mutagens. Different statistical models were compared in the power analyses (ANOVA and multiple regression for one-child families, and mixed effect models sampling two to four siblings per family). Assumptions were based on parameters from the existing literature, such as the mutation-by-paternal-age effect. We explored two scenarios: a constant effect due to an exposure that occurred in the past, and an accumulating effect where the exposure is continuing. Our analysis revealed the importance of modeling inter-family variability of the mutation-by-paternal-age effect. Statistical power was improved by models accounting for the family-to-family variability. Our power analyses suggest that sufficient statistical power can be attained with 4-28 four-sibling families per treatment group for increases in mutations of 40% down to 10%, respectively. Modeling family variability using mixed effect models provided a reduction in sample size compared to a multiple regression approach. Much larger sample sizes were required to detect an interaction effect between environmental exposures and paternal age. These findings inform study design and statistical modeling approaches to improve power and reduce sequencing costs for future studies in this area. Crown Copyright © 2018. Published by Elsevier B.V. All rights reserved.
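To make the power-analysis logic concrete, here is a minimal simulation-based sketch under invented parameters (baseline mutation count, exposure effect, family frailty). It tests an exposure effect on family-mean de novo counts with a simple t-test rather than the paper's mixed effect models, so it illustrates the general approach, not the reported numbers.

```python
# Hedged sketch of a simulation-based power analysis for detecting a germ
# cell mutagen from family-based WGS counts. All parameters are invented,
# and the test (t-test on family means) is a simplification of the paper's
# mixed-model analysis.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

def simulate_power(n_families=20, n_sibs=4, base=60, effect=0.2,
                   family_sd=0.15, n_rep=1000, alpha=0.05):
    """Power of a t-test on family-mean mutation counts, exposed vs control."""
    hits = 0
    for _ in range(n_rep):
        def group(exposed):
            # Each family gets a lognormal frailty; siblings are Poisson draws.
            mu = base * (1 + effect * exposed) * rng.lognormal(0, family_sd, n_families)
            counts = rng.poisson(np.repeat(mu, n_sibs)).reshape(n_families, n_sibs)
            return counts.mean(axis=1)  # family means absorb within-family noise
        _, p = stats.ttest_ind(group(1), group(0))
        hits += p < alpha
    return hits / n_rep

print(f"Estimated power: {simulate_power():.2f}")
```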
Ebbs, Phillip; Middleton, Paul M; Bonner, Ann; Loudfoot, Allan; Elliott, Peter
2012-07-01
Is the Clinical Safety Chart clinical improvement programme (CIP) effective at improving paramedic key performance indicator (KPI) results within the Ambulance Service of New South Wales? The CIP intervention area was compared with the non-intervention area in order to determine whether there was a statistically significant improvement in KPI results. The CIP was associated with a statistically significant improvement in paramedic KPI results within the intervention area. The strategies used within this CIP are recommended for further consideration.
2017-01-01
Recent advances in understanding protein folding have benefitted from coarse-grained representations of protein structures. Empirical energy functions derived from these techniques occasionally succeed in distinguishing native structures from their corresponding ensembles of nonnative folds or decoys which display varying degrees of structural dissimilarity to the native proteins. Here we utilized atomic coordinates of single protein chains, comprising a large diverse training set, to develop and evaluate twelve all-atom four-body statistical potentials obtained by exploring alternative values for a pair of inherent parameters. Delaunay tessellation was performed on the atomic coordinates of each protein to objectively identify all quadruplets of interacting atoms, and atomic potentials were generated via statistical analysis of the data and implementation of the inverted Boltzmann principle. Our potentials were evaluated using benchmarking datasets from Decoys-'R'-Us, and comparisons were made with twelve other physics- and knowledge-based potentials. Ranking 3rd, our best potential tied CHARMM19 and surpassed AMBER force field potentials. We illustrate how a generalized version of our potential can be used to empirically calculate binding energies for target-ligand complexes, using HIV-1 protease-inhibitor complexes for a practical application. The combined results suggest an accurate and efficient atomic four-body statistical potential for protein structure prediction and assessment. PMID:29119109
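The inverted Boltzmann step mentioned above has a compact general form, E = -kT ln(f_obs / f_ref). The sketch below applies it to invented quadruplet frequencies standing in for the Delaunay-derived counts; the constants and numbers are illustrative only.

```python
# Minimal sketch of the inverted Boltzmann principle behind knowledge-based
# statistical potentials: score = -kT * ln(f_observed / f_reference).
# Frequencies below are invented; in the paper they would come from
# Delaunay tessellation of training-set atomic coordinates.
import numpy as np

KT = 0.593  # kcal/mol at ~298 K

def statistical_potential(f_obs, f_ref, pseudo=1e-6):
    """Convert observed vs reference contact frequencies into energies."""
    return -KT * np.log((f_obs + pseudo) / (f_ref + pseudo))

f_obs = np.array([0.05, 0.01, 0.002])   # observed quadruplet frequencies
f_ref = np.array([0.02, 0.01, 0.010])   # frequencies expected by chance
print(statistical_potential(f_obs, f_ref))  # favorable contacts get E < 0
```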
Crandall, K. Jason; Steenbergen, Katryn I.
2015-01-01
Combining exercise, health education, and the game of bingo may help older adults remain independent. The objective was to determine whether a 10-week health promotion program (Bingocize®) improves functional performance and health knowledge in older adults. Participants were assigned to experimental (n = 13) or control (n = 14) groups. The intervention was administered twice per week at two independent living facilities. Functional performance and health knowledge were measured pre- and post-intervention. A mixed between-within subjects ANOVA was used to detect differences between groups (p < .05). Improvements were found in all dependent variables except lower body flexibility, systolic blood pressure, and health knowledge. Adherence was 97.31% ± 2.59%. Bingocize® has the potential to help older adults remain independent by improving functional performance. Statistical improvements in health knowledge were not found, but future researchers may explore modifying the health education component or using a different measure of health knowledge to detect changes. PMID:28138476
Zhang, Wei; Gkritza, Konstantina; Keren, Nir; Nambisan, Shashi
2011-10-01
This paper investigates potential gender and age differences in conviction and crash occurrence subsequent to being directed to attend Iowa's Driver Improvement Program (DIP). Binary logit models were developed to investigate the factors that influence conviction occurrence after DIP by gender and age. Because of the low crash occurrence subsequent to DIP, association rules were applied to investigate the factors that influence crash occurrence subsequent to DIP, in lieu of econometric models. There were statistically significant differences by driver gender, age, and conviction history in the likelihood of subsequent convictions. However, this paper found no association between DIP outcome, crash history, and crash occurrence. Evaluating the differences in conviction and crash occurrence subsequent to DIP between female and male drivers, and among different age groups, can lead to improvements in the effectiveness of DIPs and help to identify low-cost intervention measures, customized based on drivers' gender and age, for improving driving behaviors. Copyright © 2011 National Safety Council and Elsevier Ltd. All rights reserved.
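A minimal sketch of the kind of binary logit described above, fitted with statsmodels on invented driver records; the variable names, coefficients, and simulated data are placeholders, not Iowa DIP data.

```python
# Hedged sketch of a binary logit for post-DIP conviction occurrence.
# All data and effect sizes are simulated placeholders.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 500
age = rng.integers(16, 80, n)
male = rng.integers(0, 2, n)
prior = rng.poisson(1.0, n)                       # prior convictions
logit_p = -2.0 - 0.02 * age + 0.4 * male + 0.5 * prior
y = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(float)

X = sm.add_constant(np.column_stack([age, male, prior]))
model = sm.Logit(y, X).fit(disp=False)
print(model.summary(xname=["const", "age", "male", "prior_convictions"]))
```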
Improving the Crossing-SIBTEST Statistic for Detecting Non-uniform DIF.
Chalmers, R Philip
2018-06-01
This paper demonstrates that, after applying a simple modification to Li and Stout's (Psychometrika 61(4):647-677, 1996) CSIBTEST statistic, an improved variant of the statistic could be realized. It is shown that this modified version of CSIBTEST has a more direct association with the SIBTEST statistic presented by Shealy and Stout (Psychometrika 58(2):159-194, 1993). In particular, the asymptotic sampling distributions and general interpretation of the effect size estimates are the same for SIBTEST and the new CSIBTEST. Given the more natural connection to SIBTEST, it is shown that Li and Stout's hypothesis testing approach is insufficient for CSIBTEST; thus, an improved hypothesis testing procedure is required. Based on the presented arguments, a new chi-squared-based hypothesis testing approach is proposed for the modified CSIBTEST statistic. Positive results from a modest Monte Carlo simulation study strongly suggest the original CSIBTEST procedure and randomization hypothesis testing approach should be replaced by the modified statistic and hypothesis testing method.
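For orientation, the SIBTEST family of statistics has the following general form (after Shealy and Stout, 1993); the notation below is our paraphrase from the general literature, not the paper's exact symbols.

```latex
% General form of the SIBTEST effect size; a hedged paraphrase, not the
% paper's notation.
\[
  \hat{\beta}_{\mathrm{UNI}}
    = \sum_{k=0}^{K} \hat{p}_k
      \left( \bar{Y}^{*}_{Rk} - \bar{Y}^{*}_{Fk} \right),
  \qquad
  z = \frac{\hat{\beta}_{\mathrm{UNI}}}{\hat{\sigma}\!\left(\hat{\beta}_{\mathrm{UNI}}\right)},
\]
% where k indexes matching-score strata, \hat{p}_k is the stratum weight, and
% \bar{Y}^{*}_{Rk}, \bar{Y}^{*}_{Fk} are regression-corrected mean item scores
% for the reference and focal groups. The crossing variant accumulates the
% differences separately on either side of an estimated crossing point, which
% is what motivates the chi-squared-based test proposed in the paper.
```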
Utturkar, Sagar M.; Klingeman, Dawn Marie; Land, Miriam L.; ...
2014-06-14
Our motivation with this work was to assess the potential of different types of sequence data combined with de novo and hybrid assembly approaches to improve existing draft genome sequences. Illumina, 454 and PacBio sequencing technologies were used to generate de novo and hybrid genome assemblies for four different bacteria, which were assessed for quality using summary statistics (e.g. number of contigs, N50) and in silico evaluation tools. Differences in predictions of multiple copies of rDNA operons for each respective bacterium were evaluated by PCR and Sanger sequencing, and the validated results were applied as an additional criterion to rank assemblies. In general, assemblies using longer PacBio reads were better able to resolve repetitive regions. In this study, the combination of Illumina and PacBio sequence data assembled through the ALLPATHS-LG algorithm gave the best summary statistics and most accurate rDNA operon number predictions. This study will aid others looking to improve existing draft genome assemblies. As to availability and implementation, all assembly tools except CLC Genomics Workbench are freely available under the GNU General Public License.
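N50, one of the summary statistics cited above, has a precise definition: the length of the contig at which the cumulative length of contigs, sorted longest first, reaches half the total assembly size. A small sketch with invented contig lengths:

```python
# Simple sketch of the N50 assembly statistic mentioned above.
# Contig lengths are invented placeholders.
def n50(contig_lengths):
    lengths = sorted(contig_lengths, reverse=True)
    half = sum(lengths) / 2
    running = 0
    for length in lengths:
        running += length
        if running >= half:
            return length

print(n50([5000, 4000, 3000, 2000, 1000]))  # -> 4000 (5000 + 4000 >= 7500)
```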
Ensemble stacking mitigates biases in inference of synaptic connectivity.
Chambers, Brendan; Levy, Maayan; Dechery, Joseph B; MacLean, Jason N
2018-01-01
A promising alternative to directly measuring the anatomical connections in a neuronal population is inferring the connections from the activity. We employ simulated spiking neuronal networks to compare and contrast commonly used inference methods that identify likely excitatory synaptic connections using statistical regularities in spike timing. We find that simple adjustments to standard algorithms improve inference accuracy: A signing procedure improves the power of unsigned mutual-information-based approaches and a correction that accounts for differences in mean and variance of background timing relationships, such as those expected to be induced by heterogeneous firing rates, increases the sensitivity of frequency-based methods. We also find that different inference methods reveal distinct subsets of the synaptic network and each method exhibits different biases in the accurate detection of reciprocity and local clustering. To correct for errors and biases specific to single inference algorithms, we combine methods into an ensemble. Ensemble predictions, generated as a linear combination of multiple inference algorithms, are more sensitive than the best individual measures alone, and are more faithful to ground-truth statistics of connectivity, mitigating biases specific to single inference methods. These weightings generalize across simulated datasets, emphasizing the potential for the broad utility of ensemble-based approaches.
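A hedged sketch of the stacking idea on synthetic data: per-method connectivity scores become features of a logistic regression fit against ground truth, and the learned linear combination outperforms each method alone. The three "methods" below are simulated noisy scores, not the paper's actual estimators.

```python
# Hedged sketch of ensemble stacking for connectivity inference: individual
# inference scores (standing in for cross-correlation, mutual information,
# frequency-based statistics, etc.) are combined by a logistic regression
# trained against simulated ground-truth synapses.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(3)
n_pairs = 2000
truth = rng.integers(0, 2, n_pairs)               # ground-truth connections
# Three noisy single-method scores, each weakly informative on its own.
scores = np.column_stack([truth + rng.normal(0, s, n_pairs) for s in (1.5, 2.0, 2.5)])

stack = LogisticRegression().fit(scores, truth)    # learned linear combination
ensemble = stack.predict_proba(scores)[:, 1]
for i in range(3):
    print(f"method {i} AUC: {roc_auc_score(truth, scores[:, i]):.3f}")
print(f"ensemble AUC: {roc_auc_score(truth, ensemble):.3f}")
```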
Social motivation and health in college club swimming.
Anderson, Austin R; Ramos, William D
2018-03-22
Participation in recreational sport clubs on campus is a popular student activity nationwide. These sport-based organizations provide a host of benefits within recognized dimensions of health and wellness. Understanding participants' motives for engaging in these types of activities can provide insight into design and delivery and enhance participant health. This study focuses on outcomes related to the social motivations for participation in a recreational sport swim club and their potential relationship to social health. Current members of recreational swimming clubs were contacted for participation in the study from March-April 2016. A Leisure Motivation Scale (LMS) survey was sent electronically to 196 collegiate swim clubs nationwide. Aggregate and multivariate analyses from 1011 responses were conducted to examine the social motivation and motivational differences of participants. Social motivations emerged as the predominant motivational construct, indicating important implications for social health improvement through participation. Demographically, results indicated no statistically significant differences in social motivation factors based on participant gender, and statistically significant differences within participant race, university affiliation and practice frequency. The impacts of these findings are important for practitioners and participants when evaluating the potential these programs have to influence participant social health.
NASA Astrophysics Data System (ADS)
Farrell, S. L.; Kurtz, N. T.; Richter-Menge, J.; Harbeck, J. P.; Onana, V.
2012-12-01
Satellite-derived estimates of ice thickness and observations of ice extent over the last decade point to a downward trend in the basin-scale ice volume of the Arctic Ocean. This loss has broad-ranging impacts on the regional climate and ecosystems, as well as implications for regional infrastructure, marine navigation, national security, and resource exploration. New observational datasets at small spatial and temporal scales are now required to improve our understanding of physical processes occurring within the ice pack and advance parameterizations in the next generation of numerical sea-ice models. High-resolution airborne and satellite observations of the sea ice are now available at meter-scale resolution or better, providing new details on the properties and morphology of the ice pack across basin scales. For example, the NASA IceBridge airborne campaign routinely surveys the sea ice of the Arctic and Southern Oceans with an advanced sensor suite including laser and radar altimeters and digital cameras that together provide high-resolution measurements of sea ice freeboard, thickness, snow depth and lead distribution. Here we present statistical analyses of the ice pack primarily derived from the following IceBridge instruments: the Digital Mapping System (DMS), a nadir-looking, high-resolution digital camera; the Airborne Topographic Mapper, a scanning lidar; and the University of Kansas snow radar, a novel instrument designed to estimate snow depth on sea ice. Together these instruments provide data from which a wide range of sea ice properties may be derived. We provide statistics on lead distribution and spacing, lead width and area, floe size and distance between floes, as well as ridge height, frequency and distribution. The goals of this study are to (i) identify unique statistics that can be used to describe the characteristics of specific ice regions, for example first-year/multi-year ice, diffuse ice edge/consolidated ice pack, and convergent/divergent ice zones, (ii) provide datasets that support enhanced parameterizations in numerical models as well as model initialization and validation, (iii) provide parameters of interest to Arctic stakeholders for marine navigation and ice engineering studies, and (iv) provide statistics that support algorithm development for the next generation of airborne and satellite altimeters, including NASA's ICESat-2 mission. We describe the potential contribution our results can make towards the improvement of coupled ice-ocean numerical models, and discuss how data synthesis and integration with high-resolution models may improve our understanding of sea ice variability and our capabilities in predicting the future state of the ice pack.
Perspective: chemical dynamics simulations of non-statistical reaction dynamics
Ma, Xinyou; Hase, William L.
2017-01-01
Non-statistical chemical dynamics are exemplified by disagreements with the transition state (TS), RRKM and phase space theories of chemical kinetics and dynamics. The intrinsic reaction coordinate (IRC) is often used for the former two theories, and non-statistical dynamics arising from non-IRC dynamics are often important. In this perspective, non-statistical dynamics are discussed for chemical reactions, with results primarily obtained from chemical dynamics simulations and to a lesser extent from experiment. The non-statistical dynamical properties discussed are: post-TS dynamics, including potential energy surface bifurcations, product energy partitioning in unimolecular dissociation and avoiding exit-channel potential energy minima; non-RRKM unimolecular decomposition; non-IRC dynamics; direct mechanisms for bimolecular reactions with pre- and/or post-reaction potential energy minima; non-TS theory barrier recrossings; and roaming dynamics. This article is part of the themed issue ‘Theoretical and computational studies of non-equilibrium and non-statistical dynamics in the gas phase, in the condensed phase and at interfaces’. PMID:28320906
Clayson, Peter E; Miller, Gregory A
2017-01-01
Failing to consider psychometric issues related to reliability and validity, differential deficits, and statistical power potentially undermines the conclusions of a study. In research using event-related brain potentials (ERPs), numerous contextual factors (population sampled, task, data recording, analysis pipeline, etc.) can impact the reliability of ERP scores. The present review considers the contextual factors that influence ERP score reliability and the downstream effects that reliability has on statistical analyses. Given the context-dependent nature of ERPs, it is recommended that ERP score reliability be formally assessed on a study-by-study basis. Recommended guidelines for ERP studies include 1) reporting the threshold of acceptable reliability and reliability estimates for observed scores, 2) specifying the approach used to estimate reliability, and 3) justifying how trial-count minima were chosen. A reliability threshold for internal consistency of at least 0.70 is recommended, and a threshold of 0.80 is preferred. The review also advocates the use of generalizability theory for estimating score dependability (the generalizability theory analog to reliability) as an improvement on classical test theory reliability estimates, suggesting that the latter is less well suited to ERP research. To facilitate the calculation and reporting of dependability estimates, an open-source Matlab program, the ERP Reliability Analysis Toolbox, is presented. Copyright © 2016 Elsevier B.V. All rights reserved.
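As an illustration of the generalizability-theory alternative advocated above, the sketch below estimates a dependability coefficient (Φ) for a one-facet subjects-by-trials design from ANOVA variance components. The toy data and the specific design are assumptions; the ERP Reliability Analysis Toolbox mentioned in the abstract implements the full procedure.

```python
# Hedged sketch of a dependability (Phi) estimate for ERP scores in a
# one-facet subjects-by-trials design, via ANOVA variance components.
# Data are simulated placeholders.
import numpy as np

def dependability(x):
    """x: subjects x trials matrix of single-trial ERP scores."""
    n_p, n_t = x.shape
    grand = x.mean()
    ms_p = n_t * np.sum((x.mean(axis=1) - grand) ** 2) / (n_p - 1)
    ms_t = n_p * np.sum((x.mean(axis=0) - grand) ** 2) / (n_t - 1)
    ss_res = np.sum((x - x.mean(1, keepdims=True) - x.mean(0) + grand) ** 2)
    ms_res = ss_res / ((n_p - 1) * (n_t - 1))
    var_p = max((ms_p - ms_res) / n_t, 0.0)          # person variance
    var_t = max((ms_t - ms_res) / n_p, 0.0)          # trial variance
    return var_p / (var_p + (var_t + ms_res) / n_t)  # absolute-decision Phi

rng = np.random.default_rng(4)
true_amp = rng.normal(5, 2, size=(30, 1))            # stable subject effects
data = true_amp + rng.normal(0, 4, size=(30, 40))    # 40 noisy trials each
print(f"Phi = {dependability(data):.2f}")            # compare against 0.70/0.80
```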
Fidler, Meredith C; Beusmans, Jack; Panorchan, Paul; Van Goor, Fredrick
2017-01-01
Ivacaftor, a CFTR potentiator that enhances chloride transport by acting directly on CFTR to increase its channel gating activity, has been evaluated in patients with different CFTR mutations. Several previous analyses have reported no statistical correlation between change from baseline in ppFEV1 and reduction in sweat chloride levels for individuals treated with ivacaftor. The objective of the post hoc analysis described here was to expand upon previous analyses and evaluate the correlation between sweat chloride levels and absolute ppFEV1 changes across multiple cohorts of patients with different CF-causing mutations who were treated with ivacaftor. The goal of the analysis was to help define the potential value of sweat chloride as a pharmacodynamic biomarker for use in CFTR modulator trials. For any given study, reductions in sweat chloride levels and improvements in absolute ppFEV1 were not correlated for individual patients. However, when the data from all studies were combined, a statistically significant correlation between sweat chloride levels and ppFEV1 changes was observed (p<0.0001). Thus, sweat chloride level changes in response to potentiation of the CFTR protein by ivacaftor appear to be a predictive pharmacodynamic biomarker of lung function changes on a population basis but are unsuitable for the prediction of treatment benefits for individuals. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
Randomized controlled trial of a positive affect intervention for people newly diagnosed with HIV
Moskowitz, Judith T.; Carrico, Adam W.; Duncan, Larissa G.; Cohn, Michael A.; Cheung, Elaine O.; Batchelder, Abigail; Martinez, Lizet; Segawa, Eisuke; Acree, Michael; Folkman, Susan
2017-01-01
Objective We conducted a randomized controlled trial to determine whether IRISS (Intervention for those Recently Informed of their Seropositive Status), a positive affect skills intervention, improved positive emotion, psychological health, physical health, and health behaviors in people newly diagnosed with HIV. Method 159 participants who had received an HIV diagnosis in the past 3 months were randomized to a 5-session, in-person, individually-delivered positive affect skills intervention or an attention-matched control condition. Results For the primary outcome of positive affect, the group difference in change from baseline over time did not reach statistical significance (p = .12; d = .30). Planned secondary analyses within assessment point showed that the intervention led to higher levels of past-day positive affect at 5, 10, and 15 months post diagnosis compared to an attention control. For antidepressant use, the between group difference in change from baseline was statistically significant (p = .006; d = −.78 baseline to 15 months) and the difference in change over time for intrusive and avoidant thoughts related to HIV was also statistically significant (p = .048; d = .29). Contrary to findings for most health behavior interventions in which effects wane over the follow up period, effect sizes in IRISS seemed to increase over time for most outcomes. Conclusions This comparatively brief positive affect skills intervention achieved modest improvements in psychological health, and may have the potential to support adjustment to a new HIV diagnosis. PMID:28333512
NASA Astrophysics Data System (ADS)
Rakesh, V.; Kantharao, B.
2017-03-01
Data assimilation is considered one of the effective tools for improving the forecast skill of mesoscale models. However, for optimum utilization and effective assimilation of observations, many factors need to be taken into account while designing a data assimilation methodology. One of the critical components that determines the amount and propagation of observation information in the analysis is the model background error statistics (BES). The objective of this study is to quantify how the BES used in data assimilation impact the simulation of heavy rainfall events over a southern state in India, Karnataka. Simulations of 40 heavy rainfall events were carried out using the Weather Research and Forecasting Model with and without data assimilation. The assimilation experiments were conducted using global and regional BES, while the experiment with no assimilation was used as the baseline for assessing the impact of data assimilation. The simulated rainfall is verified against high-resolution rain-gage observations over Karnataka. Statistical evaluation using several accuracy and skill measures shows that data assimilation has improved the heavy rainfall simulation. Our results showed that the experiment using regional BES outperformed the one that used global BES. Critical thermodynamic variables conducive to heavy rainfall, such as convective available potential energy, are simulated more realistically using regional BES than global BES. These results have important practical implications for the design of forecast platforms and for decision-making during extreme weather events.
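For orientation, a standard variational assimilation cost function shows where the background error statistics (the B matrix) enter; the notation below is generic, not taken from the study itself.

```latex
% Generic 3D-Var cost function; B is the background error covariance (BES),
% R the observation error covariance, H the observation operator.
\[
  J(\mathbf{x}) =
    \tfrac{1}{2}\,(\mathbf{x} - \mathbf{x}_b)^{\mathsf{T}} \mathbf{B}^{-1} (\mathbf{x} - \mathbf{x}_b)
  + \tfrac{1}{2}\,\bigl(\mathbf{y} - H(\mathbf{x})\bigr)^{\mathsf{T}} \mathbf{R}^{-1} \bigl(\mathbf{y} - H(\mathbf{x})\bigr)
\]
% B controls how observation increments spread in space and across variables,
% which is why the regional vs global BES choice matters in this study.
```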
NASA Astrophysics Data System (ADS)
Seibert, S. P.; Skublics, D.; Ehret, U.
2014-09-01
The coordinated operation of reservoirs in large-scale river basins has great potential to improve flood mitigation. However, this requires large-scale hydrological models to translate the effect of reservoir operation to downstream points of interest, in a quality sufficient for the iterative development of optimized operation strategies. And, of course, it requires reservoirs large enough to make a noticeable impact. In this paper, we present and discuss several methods dealing with these prerequisites for reservoir operation using the example of three major floods in the Bavarian Danube basin (45,000 km²) and nine reservoirs therein: We start by presenting an approach for multi-criteria evaluation of model performance during floods, including aspects of local sensitivity to simulation quality. Then we investigate the potential of joint hydrologic-2D-hydrodynamic modeling to improve model performance. Based on this, we evaluate upper limits of reservoir impact under idealized conditions (perfect knowledge of future rainfall) with two methods: detailed simulations and statistical analysis of the reservoirs' specific retention volume. Finally, we investigate to what degree reservoir operation strategies optimized for local (downstream vicinity to the reservoir) and regional (at the Danube) points of interest are compatible. With respect to model evaluation, we found that the consideration of local sensitivities to simulation quality added valuable information not included in the other evaluation criteria (Nash-Sutcliffe efficiency and peak timing). With respect to the second question, adding hydrodynamic models to the model chain did, contrary to our expectations, not improve simulations, despite the fact that under idealized conditions (using observed instead of simulated lateral inflow) the hydrodynamic models clearly outperformed the routing schemes of the hydrological models. Apparently, the advantages of hydrodynamic models could not be fully exploited when fed by output from hydrological models afflicted with systematic errors in volume and timing. This effect could potentially be reduced by joint calibration of the hydrological-hydrodynamic model chain. Finally, based on the combination of the simulation-based and statistical impact assessment, we identified one reservoir potentially useful for coordinated, regional flood mitigation for the Danube. While this finding is specific to our test basin, the more interesting and generally valid finding is that operation strategies optimized for local and regional flood mitigation are not necessarily mutually exclusive; sometimes they are identical, and sometimes they can, due to temporal offsets, be pursued simultaneously.
Second-hand smoking and carboxyhemoglobin levels in children: a prospective observational study.
Yee, Branden E; Ahmed, Mohammed I; Brugge, Doug; Farrell, Maureen; Lozada, Gustavo; Idupaganthi, Raghu; Schumann, Roman
2010-01-01
To establish baseline noninvasive carboxyhemoglobin (COHb) levels in children and determine the influence of exposure to environmental sources of carbon monoxide (CO), especially environmental tobacco smoke, on such levels. Second-hand smoking may be a risk factor for adverse outcomes following anesthesia and surgery in children (1) and may potentially be preventable. Parents and their children between the ages of 1 and 12 were enrolled on the day of elective surgery. The preoperative COHb levels of the children were assessed noninvasively using a CO-Oximeter (Radical-7 Rainbow SET Pulse CO-Oximeter; Masimo, Irvine, CA, USA). The parents were asked to complete an environmental air-quality questionnaire. The COHb levels were tabulated and correlated with responses to the survey in aggregate analysis. Statistical analyses were performed using the nonparametric Mann-Whitney and Kruskal-Wallis tests. P < 0.05 was considered statistically significant. Two hundred children with their parents were enrolled. Children exposed to parental smoking had higher COHb levels than the children of nonsmoking controls. Higher COHb values were seen in the youngest children, ages 1-2, exposed to parental cigarette smoke. However, these trends did not reach statistical significance, and confidence intervals were wide. This study revealed interesting trends of COHb levels in children presenting for anesthesia and surgery. However, the COHb levels measured in our patients were close to the error margin of the device used in our study. An expected improvement in measurement technology may allow screening children for potential pulmonary perioperative risk factors in the future.
Metz, Anneke M
2008-01-01
There is an increasing need for students in the biological sciences to build a strong foundation in quantitative approaches to data analyses. Although most science, engineering, and math field majors are required to take at least one statistics course, statistical analysis is poorly integrated into undergraduate biology course work, particularly at the lower-division level. Elements of statistics were incorporated into an introductory biology course, including a review of statistics concepts and opportunity for students to perform statistical analysis in a biological context. Learning gains were measured with an 11-item statistics learning survey instrument developed for the course. Students showed a statistically significant 25% (p < 0.005) increase in statistics knowledge after completing introductory biology. Students improved their scores on the survey after completing introductory biology, even if they had previously completed an introductory statistics course (9% improvement, p < 0.005). Students retested 1 yr after completing introductory biology showed no loss of their statistics knowledge as measured by this instrument, suggesting that the use of statistics in biology course work may aid long-term retention of statistics knowledge. No statistically significant differences in learning were detected between male and female students in the study.
Disjunctive Normal Shape and Appearance Priors with Applications to Image Segmentation.
Mesadi, Fitsum; Cetin, Mujdat; Tasdizen, Tolga
2015-10-01
The use of appearance and shape priors in image segmentation is known to improve accuracy; however, existing techniques have several drawbacks. Active shape and appearance models require landmark points and assume unimodal shape and appearance distributions. Level set based shape priors are limited to global shape similarity. In this paper, we present novel shape and appearance priors for image segmentation based on an implicit parametric shape representation called the disjunctive normal shape model (DNSM). The DNSM is formed by a disjunction of conjunctions of half-spaces defined by discriminants. We learn shape and appearance statistics at varying spatial scales using nonparametric density estimation. Our method can generate a rich set of shape variations by locally combining training shapes. Additionally, by studying the intensity and texture statistics around each discriminant of our shape model, we construct a local appearance probability map. Experiments carried out on both medical and natural image datasets show the potential of the proposed method.
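The disjunctive-normal construction described above can be written compactly: a union (disjunction) of convex polytopes, each an intersection (conjunction) of half-spaces. The smooth relaxation below is our paraphrase of this idea, not the paper's exact formulation.

```latex
% Hedged sketch of a disjunctive normal shape representation.
\[
  f(\mathbf{x}) \;=\; \bigvee_{i=1}^{N} \bigwedge_{j=1}^{M}
      \bigl[\, \mathbf{w}_{ij}^{\mathsf{T}} \mathbf{x} + b_{ij} > 0 \,\bigr]
  \;\approx\;
  1 - \prod_{i=1}^{N} \Bigl( 1 - \prod_{j=1}^{M}
      \sigma\!\bigl(\mathbf{w}_{ij}^{\mathsf{T}} \mathbf{x} + b_{ij}\bigr) \Bigr),
\]
% where \sigma is a sigmoid; the shape interior is the region with f(x) near 1,
% and the discriminant parameters w_ij, b_ij are what the shape statistics
% are learned over.
```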
Gamma Oscillations of Spiking Neural Populations Enhance Signal Discrimination
Masuda, Naoki; Doiron, Brent
2007-01-01
Selective attention is an important filter for complex environments where distractions compete with signals. Attention increases both the gamma-band power of cortical local field potentials and the spike-field coherence within the receptive field of an attended object. However, the mechanisms by which gamma-band activity enhances, if at all, the encoding of input signals are not well understood. We propose that gamma oscillations induce binomial-like spike-count statistics across noisy neural populations. Using simplified models of spiking neurons, we show how the discrimination of static signals based on the population spike-count response is improved with gamma induced binomial statistics. These results give an important mechanistic link between the neural correlates of attention and the discrimination tasks where attention is known to enhance performance. Further, they show how a rhythmicity of spike responses can enhance coding schemes that are not temporally sensitive. PMID:18052541
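The variance argument can be seen with a back-of-the-envelope d' comparison: for the same mean population counts, binomial statistics (variance np(1-p)) beat Poisson-like statistics (variance equal to the mean). The numbers below are invented for illustration.

```python
# Toy illustration of the claim above: binomial-like population spike counts
# have lower variance than Poisson-like counts with the same means, so two
# input signals become easier to discriminate (higher d').
import numpy as np

def dprime(mu1, var1, mu2, var2):
    return abs(mu1 - mu2) / np.sqrt(0.5 * (var1 + var2))

n_neurons = 100
p_weak, p_strong = 0.3, 0.4          # per-neuron spike probabilities per cycle

# Poisson-like population counts: variance equals the mean.
poisson_d = dprime(n_neurons * p_weak, n_neurons * p_weak,
                   n_neurons * p_strong, n_neurons * p_strong)
# Binomial counts: variance n*p*(1-p) < mean, as with gamma-entrained spiking.
binom_d = dprime(n_neurons * p_weak, n_neurons * p_weak * (1 - p_weak),
                 n_neurons * p_strong, n_neurons * p_strong * (1 - p_strong))
print(f"d' Poisson-like: {poisson_d:.2f}, d' binomial: {binom_d:.2f}")
```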
Bilateral preictal signature of phase-amplitude coupling in canine epilepsy.
Gagliano, Laura; Bou Assi, Elie; Nguyen, Dang K; Rihana, Sandy; Sawan, Mohamad
2018-01-01
Seizure forecasting would improve the quality of life of patients with refractory epilepsy. Although early findings were optimistic, no single feature has been found capable of individually characterizing brain dynamics during transition to seizure. Cross-frequency phase amplitude coupling has been recently proposed as a precursor of seizure activity. This work evaluates the existence of a statistically significant difference in mean phase amplitude coupling distribution between the preictal and interictal states of seizures in dogs with bilaterally implanted intracranial electrodes. Results show a statistically significant change (p<0.05) of phase amplitude coupling during the preictal phase. This change is correlated with the position of implanted electrodes and is more significant within high-gamma frequency bands. These findings highlight the potential benefit of bilateral iEEG analysis and the feasibility of seizure forecasting based on slow modulation of high frequency amplitude. Copyright © 2017 Elsevier B.V. All rights reserved.
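A hedged sketch of one common phase-amplitude coupling measure, the mean vector length modulation index (after Canolty et al.); the filter bands and synthetic signal below are invented, and this is not necessarily the exact estimator used in the study.

```python
# Hedged sketch of a mean-vector-length phase-amplitude coupling index.
# Bands and the synthetic theta-modulated gamma signal are placeholders.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def bandpass(x, lo, hi, fs, order=4):
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

def modulation_index(x, fs, phase_band=(4, 8), amp_band=(80, 150)):
    phase = np.angle(hilbert(bandpass(x, *phase_band, fs)))
    amp = np.abs(hilbert(bandpass(x, *amp_band, fs)))
    return np.abs(np.mean(amp * np.exp(1j * phase)))  # mean vector length

fs = 1000
t = np.arange(0, 10, 1 / fs)
theta = np.sin(2 * np.pi * 6 * t)
gamma = (1 + theta) * np.sin(2 * np.pi * 100 * t)     # amplitude tied to phase
signal = theta + 0.5 * gamma + np.random.default_rng(5).normal(0, 0.5, t.size)
print(f"MI = {modulation_index(signal, fs):.3f}")
```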
A statistical model for water quality predictions from a river discharge using coastal observations
NASA Astrophysics Data System (ADS)
Kim, S.; Terrill, E. J.
2007-12-01
Understanding and predicting coastal ocean water quality has benefits for reducing human health risks, protecting the environment, and improving local economies which depend on clean beaches. Continuous observations of coastal physical oceanography increase the understanding of the processes which control the fate and transport of a riverine plume which potentially contains high levels of contaminants from the upstream watershed. A data-driven model of the fate and transport of river plume water from the Tijuana River has been developed using surface current observations provided by a network of HF radar operated as part of a local coastal observatory that has been in place since 2002. The model outputs are compared with water quality sampling of shoreline indicator bacteria, and the skill of an alarm for low water quality is evaluated using the receiver operating characteristic (ROC) curve. In addition, statistical analysis of beach closures in comparison with environmental variables is also discussed.
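A minimal sketch of the ROC evaluation described above, scoring a binary low-water-quality alarm against exceedance observations; the data are synthetic placeholders, not Tijuana River measurements.

```python
# Hedged sketch: ROC/AUC evaluation of a water-quality alarm score.
# Labels and scores are simulated placeholders.
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(6)
exceeded = rng.integers(0, 2, 300)                      # 1 = bacteria over limit
alarm_score = exceeded * 0.8 + rng.normal(0, 0.6, 300)  # model's risk score

fpr, tpr, thresholds = roc_curve(exceeded, alarm_score)
print(f"AUC = {roc_auc_score(exceeded, alarm_score):.2f}")
# One way to pick an operating threshold: maximize tpr - fpr (Youden's J).
best = thresholds[np.argmax(tpr - fpr)]
print(f"alarm threshold = {best:.2f}")
```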
Efficacy of Curcuma for Treatment of Osteoarthritis.
Perkins, Kimberly; Sahy, William; Beckett, Robert D
2017-01-01
The objective of this review is to identify, summarize, and evaluate clinical trials to determine the efficacy of curcuma in the treatment of osteoarthritis. A literature search for interventional studies assessing efficacy of curcuma was performed, resulting in 8 clinical trials. Studies have investigated the effect of curcuma on pain, stiffness, and functionality in patients with knee osteoarthritis. Curcuma-containing products consistently demonstrated statistically significant improvement in osteoarthritis-related endpoints compared with placebo, with one exception. When compared with active control, curcuma-containing products were similar to nonsteroidal anti-inflammatory drugs, and potentially to glucosamine. While statistically significant differences in outcomes were reported in a majority of studies, the small magnitude of effect and presence of major study limitations hinder application of these results. Further rigorous studies are needed prior to recommending curcuma as an effective alternative therapy for knee osteoarthritis. © The Author(s) 2016.
McGuire, Thomas G; Ayanian, John Z; Ford, Daniel E; Henke, Rachel E M; Rost, Kathryn M; Zaslavsky, Alan M
2008-01-01
Objective To test for discrimination by race/ethnicity arising from clinical uncertainty in treatment for depression, also known as “statistical discrimination.” Data Sources We used survey data from 1,321 African-American, Hispanic, and white adults identified with depression in primary care. Surveys were administered every six months for two years in the Quality Improvement for Depression (QID) studies. Study Design To examine whether and how change in depression severity affects change in treatment intensity by race/ethnicity, we used multivariate cross-sectional and change models that difference out unobserved time-invariant patient characteristics potentially correlated with race/ethnicity. Data Collection/Extraction Methods Treatment intensity was operationalized as expenditures on drugs, primary care, and specialty services, weighted by national prices from the Medical Expenditure Panel Survey. Patient race/ethnicity was collected at baseline by self-report. Principal Findings Change in depression severity is less associated with change in treatment intensity in minority patients than in whites, consistent with the hypothesis of statistical discrimination. The differential effect by racial/ethnic group was accounted for by use of mental health specialists. Conclusions Enhanced physician–patient communication and use of standardized depression instruments may reduce statistical discrimination arising from clinical uncertainty and be useful in reducing racial/ethnic inequities in depression treatment. PMID:18370966
The exposome concept: a challenge and a potential driver for environmental health research.
Siroux, Valérie; Agier, Lydiane; Slama, Rémy
2016-06-01
The exposome concept was defined in 2005 as encompassing all environmental exposures from conception onwards, as a new strategy to evidence environmental disease risk factors. Although very appealing, the exposome concept is challenging in many respects. In terms of assessment, several hundred time-varying exposures need to be considered, but increasing the number of exposures assessed should not be done at the cost of increased exposure misclassification. Accurately assessing the exposome currently requires numerous measurements, which rely on different technologies, resulting in an expensive set of protocols. In the future, high-throughput 'omics technologies may be a promising technique to integrate a wide range of exposures from a small number of biological matrices. Assessing the association between many exposures and health raises statistical challenges. Due to the correlation structure of the exposome, existing statistical methods cannot fully and efficiently untangle the exposures truly affecting the health outcome from correlated exposures. Other statistical challenges relate to accounting for exposure misclassification or identifying synergistic effects between exposures. On-going exposome projects are trying to overcome technical and statistical challenges. From a public health perspective, a better understanding of the environmental risk factors should open the way to improved prevention strategies. Copyright ©ERS 2016.
The Southampton-York Natural Scenes (SYNS) dataset: Statistics of surface attitude
Adams, Wendy J.; Elder, James H.; Graf, Erich W.; Leyland, Julian; Lugtigheid, Arthur J.; Muryy, Alexander
2016-01-01
Recovering 3D scenes from 2D images is an under-constrained task; optimal estimation depends upon knowledge of the underlying scene statistics. Here we introduce the Southampton-York Natural Scenes dataset (SYNS: https://syns.soton.ac.uk), which provides comprehensive scene statistics useful for understanding biological vision and for improving machine vision systems. In order to capture the diversity of environments that humans encounter, scenes were surveyed at random locations within 25 indoor and outdoor categories. Each survey includes (i) spherical LiDAR range data (ii) high-dynamic range spherical imagery and (iii) a panorama of stereo image pairs. We envisage many uses for the dataset and present one example: an analysis of surface attitude statistics, conditioned on scene category and viewing elevation. Surface normals were estimated using a novel adaptive scale selection algorithm. Across categories, surface attitude below the horizon is dominated by the ground plane (0° tilt). Near the horizon, probability density is elevated at 90°/270° tilt due to vertical surfaces (trees, walls). Above the horizon, probability density is elevated near 0° slant due to overhead structure such as ceilings and leaf canopies. These structural regularities represent potentially useful prior assumptions for human and machine observers, and may predict human biases in perceived surface attitude. PMID:27782103
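For orientation, the standard PCA approach to estimating a surface normal from a local patch of range points is sketched below; the paper's novel adaptive scale selection (choosing how large a patch to use) is not reproduced here, and the data are invented.

```python
# Hedged sketch of PCA-based surface normal estimation from a local
# neighborhood of 3D range points; adaptive scale selection is omitted.
import numpy as np

def surface_normal(points):
    """points: (n, 3) array of 3D points in a local neighborhood."""
    centered = points - points.mean(axis=0)
    # The right singular vector with the smallest singular value is the
    # direction of least variance, i.e. the surface normal.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return vt[-1]

rng = np.random.default_rng(8)
# Noisy samples from the plane z = 0 (a ground plane, 0 degrees tilt).
pts = np.column_stack([rng.uniform(-1, 1, (200, 2)), rng.normal(0, 0.01, 200)])
print(surface_normal(pts))  # approximately (0, 0, +/-1)
```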
PRECISE:PRivacy-prEserving Cloud-assisted quality Improvement Service in hEalthcare
Chen, Feng; Wang, Shuang; Mohammed, Noman; Cheng, Samuel; Jiang, Xiaoqian
2015-01-01
Quality improvement (QI) requires systematic and continuous efforts to enhance healthcare services. A healthcare provider might wish to compare local statistics with those from other institutions in order to identify problems and develop intervention to improve the quality of care. However, the sharing of institution information may be deterred by institutional privacy as publicizing such statistics could lead to embarrassment and even financial damage. In this article, we propose a PRivacy-prEserving Cloud-assisted quality Improvement Service in hEalthcare (PRECISE), which aims at enabling cross-institution comparison of healthcare statistics while protecting privacy. The proposed framework relies on a set of state-of-the-art cryptographic protocols including homomorphic encryption and Yao’s garbled circuit schemes. By securely pooling data from different institutions, PRECISE can rank the encrypted statistics to facilitate QI among participating institutes. We conducted experiments using MIMIC II database and demonstrated the feasibility of the proposed PRECISE framework. PMID:26146645
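A hedged sketch of the additively homomorphic piece of a PRECISE-style workflow, using the python-paillier (`phe`) library: institutions encrypt local counts and the cloud aggregates them without decrypting. The secure ranking step via Yao's garbled circuits is not shown, and this is an illustration of the building block, not the paper's actual protocol.

```python
# Hedged sketch: additively homomorphic pooling of institutional statistics
# with the python-paillier (phe) library. Counts are invented placeholders.
from phe import paillier

public_key, private_key = paillier.generate_paillier_keypair()

# Each institution encrypts its local statistic (e.g., an adverse-event count).
local_counts = [42, 17, 23]
ciphertexts = [public_key.encrypt(c) for c in local_counts]

# The cloud sums ciphertexts directly; plaintexts never leave the institutions.
encrypted_total = ciphertexts[0] + ciphertexts[1] + ciphertexts[2]
print(private_key.decrypt(encrypted_total))  # 82, recovered only by key holder
```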
Statistical Reference Datasets
National Institute of Standards and Technology Data Gateway
Statistical Reference Datasets (Web, free access) The Statistical Reference Datasets is also supported by the Standard Reference Data Program. The purpose of this project is to improve the accuracy of statistical software by providing reference datasets with certified computational results that enable the objective evaluation of statistical software.
Potentiation Following Ballistic and Nonballistic Complexes: The Effect of Strength Level.
Suchomel, Timothy J; Sato, Kimitake; DeWeese, Brad H; Ebben, William P; Stone, Michael H
2016-07-01
Suchomel, TJ, Sato, K, DeWeese, BH, Ebben, WP, and Stone, MH. Potentiation following ballistic and nonballistic complexes: the effect of strength level. J Strength Cond Res 30(7): 1825-1833, 2016. The purpose of this study was to compare the temporal profile of strong and weak subjects during ballistic and nonballistic potentiation complexes. Eight strong (relative back squat = 2.1 ± 0.1 times body mass) and 8 weak (relative back squat = 1.6 ± 0.2 times body mass) males performed squat jumps immediately and every minute up to 10 minutes following potentiation complexes that included ballistic or nonballistic concentric-only half-squats (COHS) performed at 90% of their 1 repetition maximum COHS. Jump height (JH) and allometrically scaled peak power (PPa) were compared using a series of 2 × 12 repeated measures analyses of variance. No statistically significant strength level main effects for JH (p = 0.442) or PPa (p = 0.078) existed during the ballistic condition. In contrast, statistically significant main effects for time existed for both JH (p = 0.014) and PPa (p < 0.001); however, no statistically significant pairwise comparisons were present (p > 0.05). Statistically significant strength level main effects existed for PPa (p = 0.039) but not for JH (p = 0.137) during the nonballistic condition. Post hoc analysis revealed that the strong subjects produced statistically greater PPa than the weaker subjects (p = 0.039). Statistically significant main effects for time existed for PPa (p = 0.015), but not for JH (p = 0.178). No statistically significant strength level × time interaction effects for JH (p = 0.319) or PPa (p = 0.203) were present for the ballistic or nonballistic conditions. Practical significance indicated by effect sizes and the relationships between maximum potentiation and relative strength suggest that stronger subjects potentiate earlier and to a greater extent than weaker subjects during ballistic and nonballistic potentiation complexes.
Transpalpebral electrotherapy for dry age-related macular degeneration (AMD): an exploratory trial.
Anastassiou, Gerasimos; Schneegans, Anna-Lena; Selbach, Michael; Kremmer, Stephan
2013-01-01
To evaluate the effect of transpalpebral electrotherapy on patients with dry age-related macular degeneration (AMD). Twenty-two patients were randomized into two groups to either receive therapy (n = 12) or placebo (n = 10). There was no statistically significant difference in age or initial visual acuity (VA) between the two groups (p = 0.6; ANOVA). Treatment was performed on 5 consecutive days. On each day two sessions were applied. Every session included 8 spots (40 sec/spot) around the eye globe. The current applied (changing frequency 5-80 Hz) varied individually between 150 and 220 μA. Patients were examined before treatment, at the end of the 5-day treatment period, after 4 weeks and at 6 months. Examinations included standardized VA testing using ETDRS letters, contrast sensitivity, macular sensitivity and fixation stability using microperimetry, and measurements with SD-OCT. At the end of week 1, mean VA improved markedly (p = 0.001; T test), with 7 out of 12 patients showing an improvement of more than 5 letters. After 4 weeks, there was an improvement of more than 10 letters in 3 patients (mean +5.7 letters; p = 0.001; T test), whereas at 6 months a loss of 1.6 letters was observed. Only 4 (33%) of our patients did not show any improvement at all. Contrast sensitivity displayed a similar pattern. Within one week after treatment, there was a rapid improvement (+4.4 optotypes; p = 0.006; T test). After 6 months, the gain in contrast sensitivity had declined (+1.5 optotypes; p = 0.2; T test). Compared with the placebo group, changes in VA failed to reach statistical significance (p = 0.1 at week 4; T test), whereas changes in contrast sensitivity were statistically significant (p = 0.01 at week 4; T test). No adverse events were seen or reported during the study period. To the best of our knowledge, this is the first report of transpalpebral electrostimulation in patients with dry AMD that demonstrates a temporary increase in visual function in some of these patients; results that seem to justify further research on this potential treatment option for dry AMD.
Kon, Elizaveta; Filardo, Giuseppe; Brittberg, Mats; Busacca, Maurizio; Condello, Vincenzo; Engebretsen, Lars; Marlovits, Stefan; Niemeyer, Philipp; Platzer, Patrik; Posthumus, Michael; Verdonk, Peter; Verdonk, Renè; Victor, Jan; van der Merwe, Willem; Widuchowski, Wojciech; Zorzi, Claudio; Marcacci, Maurilio
2017-09-14
The increasing awareness of the role of subchondral bone in the etiopathology of articular surface lesions has led to the development of osteochondral scaffolds. While safety and promising results have been suggested, there are no trials proving the real potential of the osteochondral regenerative approach. The aim was to assess the benefit provided by a nanostructured collagen-hydroxyapatite (coll-HA) multilayer scaffold for the treatment of chondral and osteochondral knee lesions. In this multicentre randomized controlled clinical trial, 100 patients affected by symptomatic chondral and osteochondral lesions were treated and evaluated for up to 2 years (51 study group and 49 control group). A biomimetic coll-HA scaffold was studied, and bone marrow stimulation (BMS) was used as the reference intervention. The primary efficacy measurement was the IKDC subjective score at 2 years. Secondary efficacy measurements were: KOOS, IKDC Knee Examination Form, Tegner and VAS Pain scores evaluated at 6, 12 and 24 months. Tissue regeneration was evaluated with the MRI MOCART scoring system at 6, 12 and 24 months. An external independent agency was involved to ensure data correctness and objectivity. A statistically significant improvement of all clinical scores was obtained from basal evaluation to 2-year follow-up in both groups, although no overall statistically significant differences were detected between the two treatments. Conversely, the subgroup of patients affected by deep osteochondral lesions (i.e. Outerbridge grade IV and OCD) showed a statistically significant better IKDC subjective outcome (+12.4 points, p = 0.036) in the coll-HA group. Statistically significant better results were also found for another challenging group: sport active patients (+16.0, p = 0.027). Severe adverse events related to treatment were documented only in three patients in the coll-HA group and in one in the BMS group. The MOCART score showed no statistical difference between the two groups. This study highlighted the safety and potential of a biomimetic implant. While no statistically significant differences were found compared to BMS for chondral lesions, this procedure can be considered a suitable option for the treatment of osteochondral lesions. Level of evidence: I.
[The main directions of reforming the service of medical statistics in Ukraine].
Golubchykov, Mykhailo V; Orlova, Nataliia M; Bielikova, Inna V
2018-01-01
Introduction: The implementation of new methods of information support for managerial decision-making should ensure effective health system reform and create conditions for improving the quality of operational management, sound planning of medical care, and more efficient use of system resources. Reform of the Medical Statistics Service of Ukraine should be considered only in the context of the reform of the entire health system. The aim: This work analyses the current situation and justifies the main directions of reform of the Medical Statistics Service of Ukraine. Material and methods: The work uses a range of methods: content analysis, bibliosemantic analysis, and a systematic approach. The information base of the research comprised WHO strategic and program documents and data from the Medical Statistics Center of the Ministry of Health of Ukraine. Review: The Medical Statistics Service of Ukraine has a complete and effective structure, headed by the State Institution "Medical Statistics Center of the Ministry of Health of Ukraine." This institution reports on behalf of the Ministry of Health of Ukraine to the State Statistical Service of Ukraine, the WHO European Office and other international organizations. An analysis of the current situation showed that achieving this goal requires: improving the system of statistical indicators for an adequate assessment of the performance of health institutions, including in the economic aspect; creating a developed medical-statistical base for administrative territories; changing the existing technologies for forming information resources; strengthening the material and technical base of the structural units of the Medical Statistics Service; improving the system of training and retraining of personnel for the medical statistics service; developing international cooperation in the methodology and practice of medical statistics, and implementing internationally accepted methods for collecting, processing, analyzing and disseminating medical-statistical information; and creating a medical-statistical service that is adapted to the specifics of market relations in health care and is flexible and responsive to changes in international methodologies and standards. Conclusions: The data of medical statistics are the basis for managerial decisions by managers at all levels of health care. Reform of the Medical Statistics Service of Ukraine should be considered only in the context of the reform of the entire health system. The main directions of the reform of the medical statistics service in Ukraine are: the introduction of information technologies, improved training of personnel for the service, improved material and technical equipment, and maximum reuse of the data obtained, which requires the unification of primary data and a system of indicators. The most difficult area is the formation of information funds and the introduction of modern information technologies.
Increased orthogeriatrician involvement in hip fracture care and its impact on mortality in England.
Neuburger, Jenny; Currie, Colin; Wakeman, Robert; Johansen, Antony; Tsang, Carmen; Plant, Fay; Wilson, Helen; Cromwell, David A; van der Meulen, Jan; De Stavola, Bianca
2017-03-01
To describe the increase in orthogeriatrician involvement in hip fracture care in England and its association with improvements in time to surgery and mortality, we analysed Hospital Episode Statistics for 196,401 patients presenting with hip fracture to 150 hospitals in England between 1 April 2010 and 28 February 2014, combined with data on orthogeriatrician hours from a national organisational survey. We examined changes in the average number of hours worked by orthogeriatricians in orthopaedic departments per patient with hip fracture, and their potential effect on mortality within 30 days of presentation. The role of prompt surgery (on the day of, or the day after, presentation) was explored as a potential confounding factor. Associations were assessed using conditional Poisson regression models with adjustment for patients' sex, age, comorbidity and year, with hospitals treated as fixed effects. Between 2010 and 2013, there was an increase of 2.5 hours per patient in the median number of hours worked by orthogeriatricians, from 1.5 to 4.0 hours. An increase of 2.5 hours per patient was associated with a relative reduction in mortality of 3.4% (95% confidence interval 0.9% to 5.9%, P = 0.01), corresponding to an absolute reduction of approximately 0.3%. Higher numbers of orthogeriatrician hours were associated with higher rates of prompt surgery, but were independently associated with lower mortality. In the context of initiatives to improve hip fracture care, we identified statistically significant and robust associations between increased orthogeriatrician hours per patient and reduced 30-day mortality. © Crown copyright 2016
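The conditional Poisson model with hospital fixed effects described above can be sketched as follows. This is a minimal illustration on simulated data; the variable names (ortho_hours, deaths, patients) and effect sizes are assumptions, not the study's data.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_hosp, n_years = 20, 4
df = pd.DataFrame({
    "hospital": np.repeat(np.arange(n_hosp), n_years),
    "year": np.tile(np.arange(2010, 2010 + n_years), n_hosp),
})
df["ortho_hours"] = rng.gamma(2.0, 1.5, len(df))   # hours per patient (simulated)
df["patients"] = rng.integers(200, 400, len(df))   # hip fracture cases at risk
# Simulate 30-day deaths with a small protective effect of orthogeriatrician hours.
log_rate = np.log(0.08) - 0.015 * df["ortho_hours"]
df["deaths"] = rng.poisson(df["patients"] * np.exp(log_rate))

fit = smf.glm(
    "deaths ~ ortho_hours + C(year) + C(hospital)",   # hospitals as fixed effects
    data=df,
    family=sm.families.Poisson(),
    offset=np.log(df["patients"]),                    # exposure: cases at risk
).fit()
# exp(coef) approximates the relative change in 30-day mortality per extra hour
print(np.exp(fit.params["ortho_hours"]))
```

Exponentiating the ortho_hours coefficient gives the relative mortality change per additional hour per patient, the quantity reported in the abstract.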
Monitoring of International Space Station Telemetry Using Shewhart Control Charts
NASA Technical Reports Server (NTRS)
Fitch, Jeffery T.; Simon, Alan L.; Gouveia, John A.; Hillin, Andrew M.; Hernandez, Steve A.
2012-01-01
Shewhart control charts have been established as an expedient method for analyzing dynamic, trending data in order to identify anomalous subsystem performance as soon as such performance would exceed a statistically established baseline. Additionally, this leading-indicator tool integrates a selection methodology that reduces false positive indications, optimizes true leading-indicator events, minimizes computer processor unit duty cycles, and addresses human factors concerns (i.e., the potential for flight-controller data overload). This innovation leverages statistical process control and provides a relatively simple way to allow flight controllers to focus their attention on subtle system changes that could lead to dramatic off-nominal system performance. Finally, this capability improves response time to potential hardware damage and/or crew injury, thereby improving space flight safety. Shewhart control charts require normalized data. However, the telemetry from the ISS Early External Thermal Control System (EETCS) was not normally distributed. A method for normalizing the data was implemented, as was a means of selecting data windows, the number of standard deviations (Sigma Level), the number of consecutive points out of limits (Sequence), and direction (increasing or decreasing trend data). By varying these options, and treating them like dial settings, the number of nuisance alerts and leading indicators was optimized. The goal was to capture all leading indicators while minimizing the number of nuisances. Lean Six Sigma (L6S) design-of-experiments methodologies were employed. To optimize the results, the Perl programming language was used to automate the processing of the massive amounts of telemetry data, the control chart plots, and the data analysis.
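The alerting logic can be sketched in a few lines. This is a schematic reconstruction, not NASA's implementation: telemetry is normalized against a rolling baseline window, and an alert fires only when a run of consecutive points exceeds the chosen sigma level in one direction. The window, sigma_level and sequence arguments play the role of the "dial settings" mentioned above.

```python
import numpy as np

def shewhart_alerts(x, window=100, sigma_level=3.0, sequence=5):
    """Flag indices where `sequence` consecutive points exceed sigma_level."""
    x = np.asarray(x, dtype=float)
    alerts = []
    run_hi = run_lo = 0
    for i in range(window, len(x)):
        baseline = x[i - window:i]
        mu, sd = baseline.mean(), baseline.std(ddof=1)
        if sd == 0:
            continue
        z = (x[i] - mu) / sd
        run_hi = run_hi + 1 if z > sigma_level else 0
        run_lo = run_lo + 1 if z < -sigma_level else 0
        if run_hi >= sequence or run_lo >= sequence:
            alerts.append(i)     # leading-indicator event at sample i
            run_hi = run_lo = 0  # reset after an alert to avoid repeats
    return alerts

# Example: a slow upward drift injected after sample 300
rng = np.random.default_rng(1)
signal = rng.normal(20.0, 0.5, 600)
signal[300:] += np.linspace(0, 4, 300)
print(shewhart_alerts(signal))
```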
ERIC Educational Resources Information Center
Larson-Hall, Jenifer; Herrington, Richard
2010-01-01
In this article we introduce language acquisition researchers to two broad areas of applied statistics that can improve the way data are analyzed. First we argue that visual summaries of information are as vital as numerical ones, and suggest ways to improve them. Specifically, we recommend choosing boxplots over barplots and adding locally…
Using health statistics: a Nightingale legacy.
Schloman, B F
2001-01-01
No more forceful example of the value of using health statistics to understand and improve health conditions exists than that displayed by Florence Nightingale. The recent book by Dossey (1999), Florence Nightingale: Mystic, Visionary, Healer, relates the dramatic tale of Nightingale's use of statistics to understand the causes of deaths in the Crimean War and of her advocacy to standardize the collection of medical data within the army and in civilian hospitals. For her, the use of health statistics was a major tool to improve health and influence public opinion.
Baryon interactions from lattice QCD with physical masses — S = -3 sector: ΞΣ and ΞΣ-ΞΛ —
NASA Astrophysics Data System (ADS)
Ishii, Noriyoshi; Aoki, Sinya; Doi, Takumi; Gongyo, Shinya; Hatsuda, Tetsuo; Ikeda, Yoichi; Inoue, Takashi; Iritani, Takumi; Miyamoto, Takaya; Nemura, Hidekatsu; Sasaki, Kenji
2018-03-01
Hyperon-nucleon and hyperon-hyperon interactions are important for studying the properties of hypernuclei in hypernuclear physics. However, unlike nucleons, which are quite stable, hyperons are unstable, so direct scattering experiments are difficult; this leads to large uncertainty in the phenomenological determination of hyperon potentials. In this talk, we use gauge configurations generated at the (almost) physical point (mπ = 146 MeV) in a huge spatial volume ((8.1 fm)³) to present our latest results on the hyperon-hyperon potentials in the S = -3 sector (ΞΣ single channel and ΞΣ-ΞΛ coupled channel) from Nambu-Bethe-Salpeter wave functions based on the HAL QCD method with improved statistics.
Trends in the predictive performance of raw ensemble weather forecasts
NASA Astrophysics Data System (ADS)
Hemri, Stephan; Scheuerer, Michael; Pappenberger, Florian; Bogner, Konrad; Haiden, Thomas
2015-04-01
Over the last two decades the paradigm in weather forecasting has shifted from being deterministic to probabilistic. Accordingly, numerical weather prediction (NWP) models have been run increasingly as ensemble forecasting systems. The goal of such ensemble forecasts is to approximate the forecast probability distribution by a finite sample of scenarios. Global ensemble forecast systems, like the European Centre for Medium-Range Weather Forecasts (ECMWF) ensemble, are prone to probabilistic biases, and are therefore not reliable. They particularly tend to be underdispersive for surface weather parameters. Hence, statistical post-processing is required in order to obtain reliable and sharp forecasts. In this study we apply statistical post-processing to ensemble forecasts of near-surface temperature, 24-hour precipitation totals, and near-surface wind speed from the global ECMWF model. Our main objective is to evaluate the evolution of the difference in skill between the raw ensemble and the post-processed forecasts. The ECMWF ensemble is under continuous development, and hence its forecast skill improves over time. Parts of these improvements may be due to a reduction of probabilistic bias. Thus, we first hypothesize that the gain by post-processing decreases over time. Based on ECMWF forecasts from January 2002 to March 2014 and corresponding observations from globally distributed stations we generate post-processed forecasts by ensemble model output statistics (EMOS) for each station and variable. Parameter estimates are obtained by minimizing the Continuous Ranked Probability Score (CRPS) over rolling training periods that consist of the n days preceding the initialization dates. Given the higher average skill in terms of CRPS of the post-processed forecasts for all three variables, we analyze the evolution of the difference in skill between raw ensemble and EMOS forecasts. The fact that the gap in skill remains almost constant over time, especially for near-surface wind speed, suggests that improvements to the atmospheric model have an effect quite different from what calibration by statistical post-processing is doing. That is, they are increasing potential skill. Thus this study indicates that (a) further model development is important even if one is just interested in point forecasts, and (b) statistical post-processing is important because it will keep adding skill in the foreseeable future.
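A minimal sketch of Gaussian EMOS in the spirit described above, assuming the common affine form in which the predictive mean and variance are linear in the ensemble mean and variance, with coefficients fit by minimizing the closed-form CRPS of a normal distribution. The data here are synthetic, not ECMWF forecasts.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def crps_normal(mu, sigma, y):
    # Closed-form CRPS of a normal predictive distribution N(mu, sigma^2).
    z = (y - mu) / sigma
    return sigma * (z * (2 * norm.cdf(z) - 1) + 2 * norm.pdf(z) - 1 / np.sqrt(np.pi))

def fit_emos(ens_mean, ens_var, obs):
    def mean_crps(p):
        a, b, c, d = p
        mu = a + b * ens_mean
        sigma = np.sqrt(np.maximum(c + d * ens_var, 1e-6))  # keep variance positive
        return crps_normal(mu, sigma, obs).mean()
    return minimize(mean_crps, x0=[0.0, 1.0, 1.0, 1.0], method="Nelder-Mead").x

# Synthetic biased, underdispersive "ensemble": spread ~1 but true error ~2.
rng = np.random.default_rng(2)
state = rng.normal(15, 5, 1000)
ens = state[:, None] + 0.5 + rng.normal(0, 1.0, (1000, 50))
obs = state + rng.normal(0, 2.0, 1000)
a, b, c, d = fit_emos(ens.mean(axis=1), ens.var(axis=1, ddof=1), obs)
print(a, b, c, d)   # a should absorb the +0.5 bias; c, d inflate the spread
```

Fitting such coefficients over a rolling training window of the n preceding days, per station and variable, mirrors the setup described in the abstract.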
Effective potentials in nonlinear polycrystals and quadrature formulae
NASA Astrophysics Data System (ADS)
Michel, Jean-Claude; Suquet, Pierre
2017-08-01
This study presents a family of estimates for effective potentials in nonlinear polycrystals. Noting that these potentials are given as averages, several quadrature formulae are investigated to express these integrals of nonlinear functions of local fields in terms of the moments of these fields. Two of these quadrature formulae reduce to known schemes, including a recent proposition (Ponte Castañeda 2015 Proc. R. Soc. A 471, 20150665 (doi:10.1098/rspa.2015.0665)) obtained by completely different means. Other formulae are also reviewed that make use of statistical information on the fields beyond their first and second moments. These quadrature formulae are applied to the estimation of effective potentials in polycrystals governed by two potentials, by means of a reduced-order model proposed by the authors (non-uniform transformation field analysis). It is shown how the quadrature formulae improve on the tangent second-order approximation in porous crystals at high stress triaxiality. It is found that, in order to retrieve a satisfactory accuracy for highly nonlinear porous crystals under high stress triaxiality, a quadrature formula of higher order is required.
Alshehry, Zahir H; Mundra, Piyushkumar A; Barlow, Christopher K; Mellett, Natalie A; Wong, Gerard; McConville, Malcolm J; Simes, John; Tonkin, Andrew M; Sullivan, David R; Barnes, Elizabeth H; Nestel, Paul J; Kingwell, Bronwyn A; Marre, Michel; Neal, Bruce; Poulter, Neil R; Rodgers, Anthony; Williams, Bryan; Zoungas, Sophia; Hillis, Graham S; Chalmers, John; Woodward, Mark; Meikle, Peter J
2016-11-22
Clinical lipid measurements do not show the full complexity of the altered lipid metabolism associated with diabetes mellitus or cardiovascular disease. Lipidomics enables the assessment of hundreds of lipid species as potential markers for disease risk. Plasma lipid species (310) were measured by a targeted lipidomic analysis with liquid chromatography electrospray ionization-tandem mass spectrometry on a case-cohort (n=3779) subset from the ADVANCE trial (Action in Diabetes and Vascular Disease: Preterax and Diamicron-MR Controlled Evaluation). The case-cohort was 61% male with a mean age of 67 years. All participants had type 2 diabetes mellitus with ≥1 additional cardiovascular risk factors, and 35% had a history of macrovascular disease. Weighted Cox regression was used to identify lipid species associated with future cardiovascular events (nonfatal myocardial infarction, nonfatal stroke, and cardiovascular death) and cardiovascular death during a 5-year follow-up period. Multivariable models combining traditional risk factors with lipid species were optimized with the Akaike information criteria. C statistics and NRIs were calculated within a 5-fold cross-validation framework. Sphingolipids, phospholipids (including lyso- and ether- species), cholesteryl esters, and glycerolipids were associated with future cardiovascular events and cardiovascular death. The addition of 7 lipid species to a base model (14 traditional risk factors and medications) to predict cardiovascular events increased the C statistic from 0.680 (95% confidence interval [CI], 0.678-0.682) to 0.700 (95% CI, 0.698-0.702; P<0.0001) with a corresponding continuous NRI of 0.227 (95% CI, 0.219-0.235). The prediction of cardiovascular death was improved with the incorporation of 4 lipid species into the base model, showing an increase in the C statistic from 0.740 (95% CI, 0.738-0.742) to 0.760 (95% CI, 0.757-0.762; P<0.0001) and a continuous net reclassification index of 0.328 (95% CI, 0.317-0.339). The results were validated in a subcohort with type 2 diabetes mellitus (n=511) from the LIPID trial (Long-Term Intervention With Pravastatin in Ischemic Disease). The improvement in the prediction of cardiovascular events, above traditional risk factors, demonstrates the potential of plasma lipid species as biomarkers for cardiovascular risk stratification in diabetes mellitus. URL: https://clinicaltrials.gov. Unique identifier: NCT00145925. © 2016 American Heart Association, Inc.
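The model-comparison metrics used above can be illustrated as follows. This sketch uses a binary-event AUC as a stand-in for the survival C statistic, together with the standard continuous NRI formula; the data, model sizes and effect sizes are synthetic assumptions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

def continuous_nri(risk_base, risk_new, event):
    """Continuous NRI = (P(up|event) - P(down|event)) + (P(down|nonevent) - P(up|nonevent))."""
    up = risk_new > risk_base
    ev, nev = event == 1, event == 0
    nri_events = up[ev].mean() - (~up[ev]).mean()
    nri_nonevents = (~up[nev]).mean() - up[nev].mean()
    return nri_events + nri_nonevents

rng = np.random.default_rng(3)
n = 4000
trad = rng.normal(size=(n, 5))        # stand-in for traditional risk factors
lipids = rng.normal(size=(n, 7))      # stand-in for selected lipid species
logit = trad[:, 0] + 0.5 * lipids[:, 0] + 0.4 * lipids[:, 1] - 2.5
event = rng.binomial(1, 1 / (1 + np.exp(-logit)))

base = LogisticRegression().fit(trad, event)
full = LogisticRegression().fit(np.hstack([trad, lipids]), event)
r0 = base.predict_proba(trad)[:, 1]
r1 = full.predict_proba(np.hstack([trad, lipids]))[:, 1]
print("C base:", roc_auc_score(event, r0), "C full:", roc_auc_score(event, r1))
print("continuous NRI:", continuous_nri(r0, r1, event))
```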
Murphy, Susan L; Barber, Mary; Homer, Kristen; Dodge, Carole; Cutter, Gary; Khanna, Dinesh
2018-01-30
To determine feasibility and preliminary effects of an occupational therapy treatment to improve upper extremity (UE) function in patients with early systemic sclerosis (SSc) who have UE contractures, a one-arm pilot clinical rehabilitation trial was conducted at a university health system. Participants with SSc and ≥1 UE contracture (n = 21) participated in a total of 8 weekly in-person occupational therapy sessions. The therapy consisted of thermal modalities, tissue mobilization, and UE mobility. Between sessions, participants were instructed to complete UE home exercises. Feasibility was measured by percent enrollment and session attendance and duration. The primary outcome measure was the QuickDASH; secondary and exploratory outcomes included PROMIS physical function, objective UE measures, and skin thickening. Linear mixed models were performed to determine treatment effects on primary and secondary outcomes. Fifty percent (24/48) of potentially eligible participants were interested. Of those, 88% (21/24) enrolled, and 19 of 21 (91%) completed all sessions. The mean (SD) age was 47.9 years (±16.1); 100% had diffuse SSc, and mean disease duration was 3.1 years. At 8 weeks, participants reported statistically significant improvement on the QuickDASH and PROMIS physical function measures (p = .0012 and p = .004). Forty-seven percent and 53% of the sample, respectively, achieved improvements that exceeded minimally important differences. In-person treatment sessions were feasible for individuals with SSc and demonstrated statistically significant and clinically meaningful improvements in UE and physical function. Future studies need to examine effects against a control condition and examine durability of treatment effects. This article is protected by copyright. All rights reserved.
Jørgensen, R; Licht, R W; Lysaker, P H; Munk-Jørgensen, P; Buck, K D; Jensen, S O W; Hansson, L; Zoffmann, V
2015-07-01
Poor insight has a negative impact on the outcome in schizophrenia; consequently, poor insight is a logical target for treatment. However, neither medication nor psychosocial interventions have been demonstrated to improve poor insight. A method originally designed for diabetes patients to improve their illness management, Guided Self-Determination (GSD), has been adapted for use in patients with schizophrenia (GSD-SZ). The purpose of this study was to investigate the effect on insight of GSD-SZ as a supplement to treatment as usual (TAU) as compared to TAU alone in outpatients diagnosed with schizophrenia. The design was an open randomized trial. The primary hypothesis was that cognitive insight would improve in patients who received GSD-SZ+TAU, as assessed by the BCIS. We additionally explored whether the intervention led to changes in clinical insight, self-perceived recovery, self-esteem, social functioning and symptom severity. Assessments were conducted at baseline, and at 3-, 6- and 12-month follow-up. Analysis was based on the principles of intention to treat, and potential confounders were taken into account by applying a multivariate approach. A total of 101 participants were randomized to GSD-SZ+TAU (n=50) or to TAU alone (n=51). No statistically significant differences were found in cognitive insight. However, at 12-month follow-up, clinical insight (measured by G12 from the Positive and Negative Syndrome Scale), symptom severity, and social functioning had statistically significantly improved in the intervention group as compared to the control group. "Improving insight in patients diagnosed with schizophrenia", NCT01282307, http://clinicaltrials.gov/. Copyright © 2015 Elsevier Masson SAS. All rights reserved.
Intraocular methotrexate in the treatment of uveitis and uveitic cystoid macular edema.
Taylor, Simon R J; Habot-Wilner, Zohar; Pacheco, Patricio; Lightman, Sue L
2009-04-01
A pilot study to evaluate the use of intravitreal methotrexate (MTX) for the treatment of uveitis and uveitic cystoid macular edema (CME). Prospective, consecutive, interventional case series. Fifteen eyes of 15 patients with a unilateral exacerbation of noninfectious intermediate, posterior uveitis, or panuveitis and/or CME such that visual acuity (VA) was 20/40 or worse, together with a history of increased intraocular pressure (IOP) in response to corticosteroid administration. Intravitreal injection of 400 μg MTX in 0.1 ml. The primary outcome measure was VA (using the Early Treatment Diabetic Retinopathy Study chart). Other outcome measures included ocular inflammation scores, time to relapse, levels of systemic corticosteroid and immunosuppressive therapy, and optical coherence tomography. Potential complications of intravitreal MTX injection, including cataract progression, vitreous hemorrhage, retinal detachment, and corneal epitheliopathy, were assessed. VA improved at all time points, and the improvement was statistically significant at the 3- and 6-month follow-up examinations. The mean visual improvement was 4 lines at 3 months and 4.5 lines at 6 months, with no statistical difference between the best VA obtained after MTX injection and after previous corticosteroid treatment, including intravitreal triamcinolone acetate injection. Five patients relapsed after a median of 4 months; a similar improvement was seen after re-injection. Ocular inflammation scores improved at all time points, and systemic immunosuppressive medication was reduced in 3 of 7 patients taking this at the start of the trial. In patients with uveitis and uveitic CME, intravitreal MTX can improve VA and reduce CME and, in some patients, allows the reduction of immunosuppressive therapy. Relapse occurs at a median of 4 months in some patients, but reinjection has similar efficacy.
ERIC Educational Resources Information Center
Petocz, Agnes; Newbery, Glenn
2010-01-01
Statistics education in psychology often falls disappointingly short of its goals. The increasing use of qualitative approaches in statistics education research has extended and enriched our understanding of statistical cognition processes, and thus facilitated improvements in statistical education and practices. Yet conceptual analysis, a…
NASA Astrophysics Data System (ADS)
Giuliani, M.; Pianosi, F.; Castelletti, A.
2015-11-01
Advances in environmental monitoring systems are making a wide range of data available at increasingly high temporal and spatial resolution. This creates an opportunity to enhance real-time understanding of water system conditions and to improve prediction of their future evolution, ultimately increasing our ability to make better decisions. Yet many water systems are still operated using very simple information systems, typically based on simple statistical analysis and the operator's experience. In this work, we propose a framework to automatically select the most valuable information for informing water system operations, supported by quantitative metrics to assess the operational and economic value of this information. The Hoa Binh reservoir in Vietnam is used to demonstrate the proposed framework in a multiobjective context, accounting for hydropower production and flood control. First, we quantify the expected value of perfect information, meaning the potential space for improvement under the assumption of exact knowledge of future system conditions. Second, we automatically select the most valuable information that could actually be used to improve the Hoa Binh operations. Finally, we assess the economic value of sample information on the basis of the resulting policy performance. Results show that our framework successfully selects information that enhances the performance of the operating policies with respect to both competing objectives, attaining a 40% improvement close to the target trade-off selected as a potentially good compromise between hydropower production and flood control.
Costigan, S A; Eather, N; Plotnikoff, R C; Taaffe, D R; Lubans, D R
2015-10-01
High-intensity interval training (HIIT) may be a feasible and efficacious strategy for improving health-related fitness in young people. The objective of this systematic review and meta-analysis was to evaluate the utility of HIIT to improve health-related fitness in adolescents and to identify potential moderators of training effects. Studies were considered eligible if they: (1) examined adolescents (13-18 years); (2) examined health-related fitness outcomes; (3) involved an intervention of ≥4 weeks in duration; (4) included a control or moderate intensity comparison group; and (5) prescribed high-intensity activity for the HIIT condition. Meta-analyses were conducted to determine the effect of HIIT on health-related fitness components using Comprehensive Meta-analysis software, and potential moderators were explored (i.e., study duration, risk of bias and type of comparison group). The effects of HIIT on cardiorespiratory fitness and body composition were large and medium, respectively. Study duration was a moderator for the effect of HIIT on body fat percentage. Intervention effects for waist circumference and muscular fitness were not statistically significant. HIIT is a feasible and time-efficient approach for improving cardiorespiratory fitness and body composition in adolescent populations. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
Hughes, Kim; Flynn, Tanya; de Zoysa, Janak; Dalbeth, Nicola; Merriman, Tony R
2014-02-01
Increased serum urate predicts chronic kidney disease independent of other risk factors. The use of xanthine oxidase inhibitors coincides with improved renal function. Whether this is due to reduced serum urate or reduced production of oxidants by xanthine oxidase or another physiological mechanism remains unresolved. Here we applied Mendelian randomization, a statistical genetics approach allowing disentangling of cause and effect in the presence of potential confounding, to determine whether lowering of serum urate by genetic modulation of renal excretion benefits renal function using data from 7979 patients of the Atherosclerosis Risk in Communities and Framingham Heart studies. Mendelian randomization by the two-stage least squares method was done with serum urate as the exposure, a uric acid transporter genetic risk score as instrumental variable, and estimated glomerular filtration rate and serum creatinine as the outcomes. Increased genetic risk score was associated with significantly improved renal function in men but not in women. Analysis of individual genetic variants showed the effect size associated with serum urate did not correlate with that associated with renal function in the Mendelian randomization model. This is consistent with the possibility that the physiological action of these genetic variants in raising serum urate correlates directly with improved renal function. Further studies are required to understand the mechanism of the potential renal function protection mediated by xanthine oxidase inhibitors.
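A minimal two-stage least squares sketch of the Mendelian randomization design described above, with a genetic risk score instrumenting serum urate. The data, effect sizes and confounder are synthetic assumptions, and the manual second stage yields the point estimate only (its standard errors would need correction).

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 8000
grs = rng.normal(size=n)                      # uric acid transporter risk score
confounder = rng.normal(size=n)               # unmeasured, e.g. adiposity
urate = 0.3 * grs + 0.5 * confounder + rng.normal(size=n)
egfr = -0.2 * urate - 0.6 * confounder + rng.normal(size=n)  # true effect -0.2

# Naive OLS is confounded; 2SLS via the genetic instrument recovers the effect.
naive = sm.OLS(egfr, sm.add_constant(urate)).fit()
stage1 = sm.OLS(urate, sm.add_constant(grs)).fit()
stage2 = sm.OLS(egfr, sm.add_constant(stage1.fittedvalues)).fit()
print("naive:", naive.params[1], "2SLS:", stage2.params[1])
```

Because the instrument is independent of the confounder, the second-stage coefficient estimates the causal effect of urate even when OLS is biased.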
Cook, Seth L; Ma, Zhao
2014-02-15
Rangelands can be managed to increase soil carbon and help mitigate emissions of carbon dioxide. This study assessed Utah rangeland owners' environmental values, beliefs about climate change, and awareness of and attitudes towards carbon sequestration, as well as their perceptions of potential policy strategies for promoting carbon sequestration on private rangelands. Data were collected from semi-structured interviews and a statewide survey of Utah rangeland owners, and were analyzed using descriptive and bivariate statistics. Over two-thirds of respondents reported some level of awareness of carbon sequestration and a generally positive attitude towards it, in contrast to their lack of interest in participating in a relevant program in the future. Having a positive attitude was statistically significantly associated with having more "biocentric" environmental values, believing the climate had been changing over the past 30 years, and having a stronger belief that human activities influence the climate. Respondents valued the potential ecological benefits of carbon sequestration more than the potential financial or climate change benefits. Additionally, respondents indicated a preference for educational approaches over financial incentives. They also preferred to work with a private agricultural entity over a non-profit or government entity on improving land management practices to sequester carbon. These results suggest potential challenges for developing technically sound and socially acceptable policies and programs for promoting carbon sequestration on private rangelands. Potential strategies for overcoming these challenges include emphasizing the ecological benefits associated with sequestering carbon to appeal to landowners with ecologically oriented management objectives, enhancing the cooperation between private agricultural organizations and government agencies, and funneling resources for promoting carbon sequestration into existing land management and conservation programs that may produce carbon benefits. Copyright © 2014 Elsevier Ltd. All rights reserved.
Single-Item Measurement of Suicidal Behaviors: Validity and Consequences of Misclassification
Millner, Alexander J.; Lee, Michael D.; Nock, Matthew K.
2015-01-01
Suicide is a leading cause of death worldwide. Although research has made strides in better defining suicidal behaviors, there has been less focus on accurate measurement. Currently, the widespread use of self-report, single-item questions to assess suicide ideation, plans and attempts may contribute to measurement problems and misclassification. We examined the validity of single-item measurement and the potential for statistical errors. Over 1,500 participants completed an online survey containing single-item questions regarding a history of suicidal behaviors, followed by questions with more precise language, multiple response options and narrative responses to examine the validity of single-item questions. We also conducted simulations to test whether common statistical tests are robust against the degree of misclassification produced by the use of single-items. We found that 11.3% of participants that endorsed a single-item suicide attempt measure engaged in behavior that would not meet the standard definition of a suicide attempt. Similarly, 8.8% of those who endorsed a single-item measure of suicide ideation endorsed thoughts that would not meet standard definitions of suicide ideation. Statistical simulations revealed that this level of misclassification substantially decreases statistical power and increases the likelihood of false conclusions from statistical tests. Providing a wider range of response options for each item reduced the misclassification rate by approximately half. Overall, the use of single-item, self-report questions to assess the presence of suicidal behaviors leads to misclassification, increasing the likelihood of statistical decision errors. Improving the measurement of suicidal behaviors is critical to increase understanding and prevention of suicide. PMID:26496707
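The misclassification simulations described above can be sketched along these lines; the prevalence and flip rates below are illustrative choices, not the paper's values.

```python
import numpy as np
from scipy.stats import chi2_contingency

rng = np.random.default_rng(5)

def power(p1, p2, n, flip, reps=2000, alpha=0.05):
    """Fraction of simulated two-group comparisons that reach significance."""
    hits = 0
    for _ in range(reps):
        y1 = rng.binomial(1, p1, n)
        y2 = rng.binomial(1, p2, n)
        # Randomly misclassify a fraction `flip` of responses in each group.
        y1 = np.where(rng.random(n) < flip, 1 - y1, y1)
        y2 = np.where(rng.random(n) < flip, 1 - y2, y2)
        table = [[y1.sum(), n - y1.sum()], [y2.sum(), n - y2.sum()]]
        if chi2_contingency(table)[1] < alpha:
            hits += 1
    return hits / reps

print("power, clean labels:  ", power(0.10, 0.15, 400, flip=0.0))
print("power, 10% misclassed:", power(0.10, 0.15, 400, flip=0.10))
```

Random misclassification pulls the two observed rates toward each other, which is why power drops even though the true group difference is unchanged.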
Trickey, Amber W; Crosby, Moira E; Singh, Monika; Dort, Jonathan M
2014-12-01
The application of evidence-based medicine to patient care requires unique skills of the physician. Advancing residents' abilities to accurately evaluate the quality of evidence is built on understanding of fundamental research concepts. The American Board of Surgery In-Training Examination (ABSITE) provides a relevant measure of surgical residents' knowledge of research design and statistics. We implemented a research education curriculum in an independent academic medical center general residency program, and assessed the effect on ABSITE scores. The curriculum consisted of five 1-hour monthly research and statistics lectures. The lectures were presented before the 2012 and 2013 examinations. Forty residents completing ABSITE examinations from 2007 to 2013 were included in the study. Two investigators independently identified research-related item topics from examination summary reports. Correct and incorrect responses were compared precurriculum and postcurriculum. Regression models were calculated to estimate improvement in postcurriculum scores, adjusted for individuals' scores over time and postgraduate year level. Residents demonstrated significant improvement in postcurriculum examination scores for research and statistics items. Correct responses increased 27% (P < .001). Residents were 5 times more likely to achieve a perfect score on research and statistics items postcurriculum (P < .001). Residents at all levels demonstrated improved research and statistics scores after receiving the curriculum. Because the ABSITE includes a wide spectrum of research topics, sustained improvements suggest a genuine level of understanding that will promote lifelong evaluation and clinical application of the surgical literature.
Fricke, Silke; Burgoyne, Kelly; Bowyer-Crane, Claudine; Kyriacou, Maria; Zosimidou, Alexandra; Maxwell, Liam; Lervåg, Arne; Snowling, Margaret J; Hulme, Charles
2017-10-01
Oral language skills are a critical foundation for literacy and more generally for educational success. The current study shows that oral language skills can be improved by providing suitable additional help to children with language difficulties in the early stages of formal education. We conducted a randomized controlled trial with 394 children in England, comparing a 30-week oral language intervention programme starting in nursery (N = 132) with a 20-week version of the same programme starting in Reception (N = 133). The intervention groups were compared to an untreated waiting control group (N = 129). The programmes were delivered by trained teaching assistants (TAs) working in the children's schools/nurseries. All testers were blind to group allocation. Both the 20- and 30-week programmes produced improvements on primary outcome measures of oral language skill compared to the untreated control group. Effect sizes were small to moderate (20-week programme: d = .21; 30-week programme: d = .30) immediately following the intervention and were maintained at follow-up 6 months later. The difference in improvement between the 20-week and 30-week programmes was not statistically significant. Neither programme produced statistically significant improvements in children's early word reading or reading comprehension skills (secondary outcome measures). This study provides further evidence that oral language interventions can be delivered successfully by trained TAs to children with oral language difficulties in nursery and Reception classes. The methods evaluated have potentially important policy implications for early education. © 2017 Association for Child and Adolescent Mental Health.
Husnu, Tokgoz; Ersoz, Akyurek; Bulent, Erol; Tacettin, Ornek; Remzi, Altin; Bulent, Akduman; Aydin, Mungan
2015-03-01
The aim of this age-matched, controlled, prospective clinical study was to investigate frequency and degree of erectile dysfunction (ED) in patients with obstructive sleep apnea syndrome (OSAS) and to evaluate the results of only continuous positive airway pressure (CPAP) therapy on ED in patients with OSAS. A total of 90 patients were evaluated for potential OSAS. They were given an International Index of Erectile Function questionnaire (IIEF) and Beck Depression Inventory. Sixty-two patients with the diagnosis of OSAS were regarded as study group. Twenty-eight patients in whom the OSAS was excluded, were regarded as the control group. Biochemical and hormonal laboratory evaluation were performed. Then all patients underwent a full-night in laboratory polysomnography examination. The degree of OSAS were evaluated by an expert from chest diseases department. When compared to the control group, a decrease in IIEF-5 scores was found in patients with OSAS. However, this decrease was not statistically significant. After 3 months of CPAP usage in patients with mild to moderate and severe degree OSAS, improvement in IIEF-5 scores was statistically significant. Mean value of IIEF-5 score was 16.63±5.91 before CPAP and were improved up to 20.92±6.79 (P=0.001). It is not certainly possible to say that OSAS is clearly associated with ED. However, after 3 months of regular CPAP usage, ED complaints in patients with OSAS might improve positively. Trials with larger series may give more conclusive data.
Improving estimates of air pollution exposure through ubiquitous sensing technologies.
de Nazelle, Audrey; Seto, Edmund; Donaire-Gonzalez, David; Mendez, Michelle; Matamala, Jaume; Nieuwenhuijsen, Mark J; Jerrett, Michael
2013-05-01
Traditional methods of exposure assessment in epidemiological studies often fail to integrate important information on activity patterns, which may lead to bias, loss of statistical power, or both in health effects estimates. Novel sensing technologies integrated with mobile phones offer potential to reduce exposure measurement error. We sought to demonstrate the usability and relevance of the CalFit smartphone technology to track person-level time, geographic location, and physical activity patterns for improved air pollution exposure assessment. We deployed CalFit-equipped smartphones in a free-living population of 36 subjects in Barcelona, Spain. Information obtained on physical activity and geographic location was linked to space-time air pollution mapping. We found that information from CalFit could substantially alter exposure estimates. For instance, on average travel activities accounted for 6% of people's time and 24% of their daily inhaled NO2. Due to the large number of mobile phone users, this technology potentially provides an unobtrusive means of enhancing epidemiologic exposure data at low cost. Copyright © 2013 Elsevier Ltd. All rights reserved.
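A toy version of the exposure logic implied above: inhaled dose accumulates as concentration × ventilation rate × duration over activity episodes, which is why short high-ventilation trips can dominate daily intake. All numbers are made-up placeholders, not values from the study.

```python
import pandas as pd

episodes = pd.DataFrame({
    "activity":  ["sleep", "home", "commute_bike", "office", "commute_bike"],
    "hours":     [8.0, 4.0, 0.5, 9.0, 0.5],
    "no2_ugm3":  [20.0, 25.0, 80.0, 30.0, 85.0],  # ambient NO2 at each location
    "vent_m3h":  [0.45, 0.54, 2.5, 0.54, 2.5],    # ventilation rate by activity
})
episodes["inhaled_ug"] = episodes.no2_ugm3 * episodes.vent_m3h * episodes.hours
share = episodes.groupby("activity").inhaled_ug.sum() / episodes.inhaled_ug.sum()
print(share.round(2))   # brief cycling trips take an outsized share of intake
```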
Yoga-enhanced cognitive behavioural therapy (Y-CBT) for anxiety management: a pilot study.
Khalsa, Manjit K; Greiner-Ferris, Julie M; Hofmann, Stefan G; Khalsa, Sat Bir S
2015-01-01
Cognitive behavioural therapy (CBT) is an effective treatment for generalized anxiety disorder, but there is still room for improvement. The aim of the present study was to examine the potential benefit of enriching CBT with kundalini yoga (Y-CBT). Participants consisted of treatment resistant clients at a community mental health clinic. A total of 32 participants enrolled in the study and 22 completed the programme. After the Y-CBT intervention, pre-post comparisons showed statistically significant improvements in state and trait anxiety, depression, panic, sleep and quality of life. Results from this preliminary study suggest that Y-CBT may have potential as a promising treatment for those suffering from generalized anxiety disorder. Yoga-enhanced cognitive behavioural therapy (Y-CBT) may be a promising new treatment for those suffering from generalized anxiety disorder. Y-CBT may also reduce depression in those suffering from generalized anxiety. Y-CBT may reduce depression and anxiety in a clinic population where clients suffer from multiple diagnoses including generalized anxiety disorder. Copyright © 2014 John Wiley & Sons, Ltd.
The use of administrative and other records for the analysis of internal migration.
1983-01-01
There are 5 main types of administrative records that are of potential use in the analysis of internal migration in Africa: 1) population registers, 2) electoral rolls, 3) school records, 4) labor or employment records, and 5) social security records. The population register provides legal identification for the individual and records his movements from 1 civil subdivision to another. The process of establishing a population register is not a simple one. All 5 of these records are incomplete, defective, and in most cases decentralized; yet, in spite of these limitations, administrative records are potential sources of migration data. Because of their incompleteness, major biases are likely to arise in their use. The 1st step is for National Statistical Services to assist in improving the coverage of events expected to be registered in any of these records. The 2nd step is to try to use the data through some form of ratio or regression estimation. If use is not made of the records for migration data, it is unlikely that the quality of the migration data in the records will ever improve.
Synthetic Vision Enhanced Surface Operations With Head-Worn Display for Commercial Aircraft
NASA Technical Reports Server (NTRS)
Arthur, Jarvis J., III; Prinzel, Lawrence J., III; Shelton, Kevin J.; Kramer, Lynda J.; Williams, Steven P.; Bailey, Randall E.; Norman, R. M.
2007-01-01
Experiments and flight tests have shown that airport surface operations can be enhanced by using synthetic vision and associated technologies, employed on a Head-Up Display (HUD) and head-down display electronic moving maps (EMM). Although HUD applications have shown the greatest potential operational improvements, the research noted that two major limitations during ground operations were its monochrome form and limited, fixed field-of-regard. A potential solution to these limitations may be the application of advanced Head Worn Displays (HWDs) particularly during low-visibility operations wherein surface movement is substantially limited because of the impaired vision of pilots and air traffic controllers. The paper describes the results of ground simulation experiments conducted at the NASA Langley Research Center. The results of the experiments showed that the fully integrated HWD concept provided significantly improved path performance compared to using paper charts alone. When comparing the HWD and HUD concepts, there were no statistically-significant differences in path performance or subjective ratings of situation awareness and workload. Implications and directions for future research are described.
2008-01-01
There is an increasing need for students in the biological sciences to build a strong foundation in quantitative approaches to data analyses. Although most science, engineering, and math field majors are required to take at least one statistics course, statistical analysis is poorly integrated into undergraduate biology course work, particularly at the lower-division level. Elements of statistics were incorporated into an introductory biology course, including a review of statistics concepts and opportunity for students to perform statistical analysis in a biological context. Learning gains were measured with an 11-item statistics learning survey instrument developed for the course. Students showed a statistically significant 25% (p < 0.005) increase in statistics knowledge after completing introductory biology. Students improved their scores on the survey after completing introductory biology, even if they had previously completed an introductory statistics course (9% improvement, p < 0.005). Students retested 1 yr after completing introductory biology showed no loss of their statistics knowledge as measured by this instrument, suggesting that the use of statistics in biology course work may aid long-term retention of statistics knowledge. No statistically significant differences in learning were detected between male and female students in the study. PMID:18765754
Morbi, Abigail H M; Hamady, Mohamad S; Riga, Celia V; Kashef, Elika; Pearch, Ben J; Vincent, Charles; Moorthy, Krishna; Vats, Amit; Cheshire, Nicholas J W; Bicknell, Colin D
2012-08-01
To determine the type and frequency of errors during vascular interventional radiology (VIR) and design and implement an intervention to reduce error and improve efficiency in this setting. Ethical guidance was sought from the Research Services Department at Imperial College London. Informed consent was not obtained. Field notes were recorded during 55 VIR procedures by a single observer. Two blinded assessors identified failures from field notes and categorized them into one or more errors by using a 22-part classification system. The potential to cause harm, disruption to procedural flow, and preventability of each failure was determined. A preprocedural team rehearsal (PPTR) was then designed and implemented to target frequent preventable potential failures. Thirty-three procedures were observed subsequently to determine the efficacy of the PPTR. Nonparametric statistical analysis was used to determine the effect of intervention on potential failure rates, potential to cause harm and procedural flow disruption scores (Mann-Whitney U test), and number of preventable failures (Fisher exact test). Before intervention, 1197 potential failures were recorded, of which 54.6% were preventable. A total of 2040 errors were deemed to have occurred to produce these failures. Planning error (19.7%), staff absence (16.2%), equipment unavailability (12.2%), communication error (11.2%), and lack of safety consciousness (6.1%) were the most frequent errors, accounting for 65.4% of the total. After intervention, 352 potential failures were recorded. Classification resulted in 477 errors. Preventable failures decreased from 54.6% to 27.3% (P < .001) with implementation of PPTR. Potential failure rates per hour decreased from 18.8 to 9.2 (P < .001), with no increase in potential to cause harm or procedural flow disruption per failure. Failures during VIR procedures are largely because of ineffective planning, communication error, and equipment difficulties, rather than a result of technical or patient-related issues. Many of these potential failures are preventable. A PPTR is an effective means of targeting frequent preventable failures, reducing procedural delays and improving patient safety.
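The statistical comparisons named above can be sketched with scipy. The 2×2 table reuses the preventable-failure counts reported in the abstract (54.6% of 1197 before the intervention, 27.3% of 352 after); the per-hour failure rates are simulated around the reported means for illustration only.

```python
import numpy as np
from scipy.stats import mannwhitneyu, fisher_exact

rng = np.random.default_rng(6)
# Simulated per-procedure failure rates per hour around the reported means.
rates_pre = rng.normal(18.8, 4.0, 55).clip(min=0)
rates_post = rng.normal(9.2, 3.0, 33).clip(min=0)
u_stat, p_rate = mannwhitneyu(rates_pre, rates_post, alternative="two-sided")

# Preventable vs non-preventable failures, before and after the team rehearsal.
table = np.array([[654, 543],    # pre:  ~54.6% of 1197 preventable
                  [96, 256]])    # post: ~27.3% of 352 preventable
odds, p_prev = fisher_exact(table)
print("Mann-Whitney p:", p_rate, "Fisher exact p:", p_prev)
```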
Shah, Nirmal V; Seth, Avinash K; Balaraman, R; Aundhia, Chintan J; Maheshwari, Rajesh A; Parmar, Ghanshyam R
2016-05-01
The objective of the present work was to utilize the potential of nanostructured lipid carriers (NLCs) for improving the oral bioavailability of raloxifene hydrochloride (RLX). RLX loaded NLCs were prepared by solvent diffusion method using glyceryl monostearate and Capmul MCM C8 as solid lipid and liquid lipid, respectively. A full 3² factorial design was utilized to study the effect of two independent parameters, namely solid lipid to liquid lipid ratio and concentration of stabilizer, on the entrapment efficiency of prepared NLCs. The statistical evaluation confirmed pronounced improvement in entrapment efficiency when liquid lipid content in the formulation increased from 5% w/w to 15% w/w. Solid-state characterization studies (DSC and XRD) of the optimized formulation NLC-8 revealed transformation of RLX from crystalline to amorphous form. The optimized formulation showed 32.50 ± 5.12 nm average particle size and -12.8 ± 3.2 mV zeta potential, which impart good stability to the NLC dispersion. In vitro release study showed burst release for the initial 8 h followed by sustained release up to 36 h. TEM study confirmed smooth-surfaced, discrete, spherical nano-sized particles. To draw a final conclusion, an in vivo pharmacokinetic study was carried out that showed a 3.75-fold enhancement in bioavailability with the optimized NLC formulation compared with plain drug suspension. These results showed the potential of NLCs for significant improvement in oral bioavailability of poorly soluble RLX.
Potentials for Platooning in U.S. Highway Freight Transport: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Muratori, Matteo; Holden, Jacob; Lammert, Michael
2017-03-15
Smart technologies enabling connection among vehicles and between vehicles and infrastructure as well as vehicle automation to assist human operators are receiving significant attention as means for improving road transportation systems by reducing fuel consumption - and related emissions - while also providing additional benefits through improving overall traffic safety and efficiency. For truck applications, currently responsible for nearly three-quarters of the total U.S. freight energy use and greenhouse gas (GHG) emissions, platooning has been identified as an early feature for connected and automated vehicles (CAVs) that could provide significant fuel savings and improved traffic safety and efficiency without radical design or technology changes compared to existing vehicles. A statistical analysis was performed based on a large collection of real-world U.S. truck usage data to estimate the fraction of total miles that are technically suitable for platooning. In particular, our analysis focuses on estimating 'platoonable' mileage based on overall highway vehicle use and prolonged high-velocity traveling, establishing that about 65% of the total miles driven by combination trucks could be driven in platoon formation, leading to a 4% reduction in total truck fuel consumption. This technical potential for 'platoonable' miles in the U.S. provides an upper bound for scenario analysis considering fleet willingness to platoon as an estimate of overall benefits of early adoption of CAV technologies. A benefit analysis is proposed to assess the overall potential for energy savings and emissions mitigation by widespread implementation of highway platooning for trucks.
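As a back-of-envelope check of the headline numbers: if about 65% of combination-truck miles are platoonable and the fleet-wide saving is about 4%, the implied average saving on platooned miles is roughly 6%. The per-mile figure below is inferred for illustration, not quoted from the study.

```python
# Reproduces the 65% -> ~4% relationship under an assumed per-mile saving.
platoonable_share = 0.65   # fraction of combination-truck miles (from text)
per_mile_saving = 0.062    # assumed average fuel saving while platooning
fleet_saving = platoonable_share * per_mile_saving
print(f"fleet-wide fuel reduction: {fleet_saving:.1%}")   # -> 4.0%
```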
Moyib, O K; Mkumbira, J; Odunola, O A; Dixon, A G
2012-12-01
Cyanogenic potential (CNp) of cassava constitutes a serious problem for over 500 million people who rely on the crop as their main source of calories. Genetic diversity is a key to successful crop improvement for breeding new improved variability for target traits. Forty-three improved genotypes of cassava developed by the International Institute of Tropical Agriculture (IITA), Ibadan, were characterized for the CNp trait using 35 Simple Sequence Repeat (SSR) markers. The essential colorimetric picric test was used for evaluation of CNp on a color scale of 1 to 14. The CNp scores obtained ranged from 3 to 9, with a mean score of 5.48 (± 0.09) based on the Statistical Analysis System (SAS) package. TMS M98/0068 (4.0 ± 0.25) was identified as the best genotype with low CNp while TMS M98/0028 (7.75 ± 0.25) was the worst. The 43 genotypes were assigned into 7 phenotypic groups based on rank-sum analysis in SAS. Dissimilarity Analysis and Representation for Windows (DARwin) generated a phylogenetic tree with 5 clusters which represented hybridizing groups. Each of the clusters (except 4) contained low CNp genotypes that could be used for improving the high CNp genotypes in the same or a nearby cluster. The scatter plot of the genotypes showed that there was little or no demarcation for phenotypic CNp groupings in the molecular groupings. The result of this study demonstrated that SSR markers are powerful tools for the assessment of genetic variability, and for proper identification and selection of parents for genetic improvement of the low CNp trait among the IITA cassava collection.
Liu, Yafei; Zhang, You; Li, Chuang; Bai, Yun; Zhang, Daoming; Xue, Chunyu; Liu, Guangqing
2018-05-15
Pollutant emissions from incomplete combustion of raw coal in low-efficiency residential heating stoves greatly contribute to winter haze in China. Semi-coke coals and improved heating stoves are expected to lower air pollutant emissions and are vigorously promoted by the Chinese government in many national and local plans. In this study, the thermal performance and air pollutant emissions from semi-coke combustion in improved heating stoves were measured in a pilot rural county and compared to the baseline of burning raw coal to quantify the mitigation potential of air pollutant emissions. A total of five stove-fuel combinations were tested, and 27 samples from 27 different volunteered households were obtained. The heating efficiency of improved stoves increased, but fuel consumption appeared higher with more useful energy output compared to traditional stoves. The emission factors of PM2.5, SO2, and CO2 of semi-coke burning in specified improved stoves were lower than the baseline of burning raw coal chunk, but no significant NOx and CO decreases were observed. The total amount of PM2.5 and SO2 emissions per household in one heating season was lower, but CO, CO2, and NOx increased when semi-coke coal and specified improved stoves were deployed. Most differences were not statistically significant due to the limited samples and large variation, indicating that further evaluation would be needed to make conclusions that could be considered for policy. Copyright © 2018 Elsevier Ltd. All rights reserved.
When Mathematics and Statistics Collide in Assessment Tasks
ERIC Educational Resources Information Center
Bargagliotti, Anna; Groth, Randall
2016-01-01
Because the disciplines of mathematics and statistics are naturally intertwined, designing assessment questions that disentangle mathematical and statistical reasoning can be challenging. We explore the writing of statistics assessment tasks that take into consideration the mathematical reasoning they may inadvertently activate.
The non-statistical dynamics of the ¹⁸O + ³²O₂ isotope exchange reaction at two energies
NASA Astrophysics Data System (ADS)
Van Wyngarden, Annalise L.; Mar, Kathleen A.; Quach, Jim; Nguyen, Anh P. Q.; Wiegel, Aaron A.; Lin, Shi-Ying; Lendvay, Gyorgy; Guo, Hua; Lin, Jim J.; Lee, Yuan T.; Boering, Kristie A.
2014-08-01
The dynamics of the ¹⁸O(³P) + ³²O₂ isotope exchange reaction were studied using crossed atomic and molecular beams at collision energies (Ecoll) of 5.7 and 7.3 kcal/mol, and experimental results were compared with quantum statistical (QS) and quasi-classical trajectory (QCT) calculations on the O₃(X¹A′) potential energy surface (PES) of Babikov et al. [D. Babikov, B. K. Kendrick, R. B. Walker, R. T. Pack, P. Fleurat-Lesard, and R. Schinke, J. Chem. Phys. 118, 6298 (2003)]. In both QS and QCT calculations, agreement with experiment was markedly improved by performing calculations with the experimental distribution of collision energies instead of fixed at the average collision energy. At both collision energies, the scattering displayed a forward bias, with a smaller bias at the lower Ecoll. Comparisons with the QS calculations suggest that ³⁴O₂ is produced with a non-statistical rovibrational distribution that is hotter than predicted, and the discrepancy is larger at the lower Ecoll. If this underprediction of rovibrational excitation by the QS method is not due to PES errors and/or to non-adiabatic effects not included in the calculations, then this collision energy dependence is opposite to what might be expected based on collision complex lifetime arguments and opposite to that measured for the forward bias. While the QCT calculations captured the experimental product vibrational energy distribution better than the QS method, the QCT results underpredicted rotationally excited products, overpredicted forward-bias and predicted a trend in the strength of forward-bias with collision energy opposite to that measured, indicating that it does not completely capture the dynamic behavior measured in the experiment. Thus, these results further underscore the need for improvement in theoretical treatments of dynamics on the O₃(X¹A′) PES and perhaps of the PES itself in order to better understand and predict non-statistical effects in this reaction and in the formation of ozone (in which the intermediate O₃* complex is collisionally stabilized by a third body). The scattering data presented here at two different collision energies provide important benchmarks to guide these improvements.
Gandy, M; Karin, E; Jones, M P; McDonald, S; Sharpe, L; Titov, N; Dear, B F
2018-05-13
The evidence for Internet-delivered pain management programs for chronic pain is growing, but there is little empirical understanding of how they effect change. Understanding mechanisms of clinical response to these programs could inform their effective development and delivery. A large sample (n = 396) from a previous randomized controlled trial of a validated internet-delivered psychological pain management program, the Pain Course, was used to examine the influence of three potential psychological mechanisms (pain acceptance, pain self-efficacy, fear of movement/re-injury) on treatment-related change in disability, depression, anxiety and average pain. Analyses involved generalized estimating equation models for clinical outcomes that adjusted for co-occurring change in psychological variables. This was paired with cross-lagged analysis to assess for evidence of causality. Analyses involved two time points, pre-treatment and post-treatment. Changes in pain-acceptance were strongly associated with changes in three (depression, anxiety and average pain) of the four clinical outcomes. Changes in self-efficacy were also strongly associated with two (anxiety and average pain) clinical outcomes. These findings suggest that participants were unlikely to improve in these clinical outcomes without also experiencing increases in their pain self-efficacy and pain acceptance. However, there was no clear evidence from cross-lagged analyses to currently support these psychological variables as direct mechanisms of clinical improvements. There was only statistical evidence to suggest higher levels of self-efficacy moderated improvements in depression. The findings suggest that, while clinical improvements are closely associated with improvements in pain acceptance and self-efficacy, these psychological variables may not drive the treatment effects observed. This study employed robust statistical techniques to assess the psychological mechanisms of an established internet-delivered pain management program. While clinical improvements (e.g. depression, anxiety, pain) were closely associated with improvements in psychological variables (e.g. pain self-efficacy and pain acceptance), these variables do not appear to be treatment mechanisms. © 2018 European Pain Federation - EFIC®.
Cosmological Constraints from Fourier Phase Statistics
NASA Astrophysics Data System (ADS)
Ali, Kamran; Obreschkow, Danail; Howlett, Cullan; Bonvin, Camille; Llinares, Claudio; Oliveira Franco, Felipe; Power, Chris
2018-06-01
Most statistical inference from cosmic large-scale structure relies on two-point statistics, i.e. on the galaxy-galaxy correlation function (2PCF) or the power spectrum. These statistics capture the full information encoded in the Fourier amplitudes of the galaxy density field but do not describe the Fourier phases of the field. Here, we quantify the information contained in the line correlation function (LCF), a three-point Fourier phase correlation function. Using cosmological simulations, we estimate the Fisher information (at redshift z = 0) of the 2PCF, LCF and their combination, regarding the cosmological parameters of the standard ΛCDM model, as well as a Warm Dark Matter (WDM) model and the f(R) and Symmetron modified gravity models. The galaxy bias is accounted for at the level of a linear bias. The relative information of the 2PCF and the LCF depends on the survey volume, sampling density (shot noise) and the bias uncertainty. For a volume of 1 h⁻³ Gpc³, sampled with points of mean density n̄ = 2 × 10⁻³ h³ Mpc⁻³ and a bias uncertainty of 13%, the LCF improves the parameter constraints by about 20% in the ΛCDM cosmology and potentially even more in alternative models. Finally, since a linear bias only affects the Fourier amplitudes (2PCF), but not the phases (LCF), the combination of the 2PCF and the LCF can be used to break the degeneracy between the linear bias and σ8, present in 2-point statistics.
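As an illustrative aside (not from the paper), the key amplitude-versus-phase argument can be checked in a few lines of numpy on a toy one-dimensional field; the bias value is an arbitrary assumption:

```python
import numpy as np

rng = np.random.default_rng(0)
delta = rng.standard_normal(256)       # toy 1D matter density contrast
b = 1.8                                # assumed linear galaxy bias
f_m, f_g = np.fft.rfft(delta), np.fft.rfft(b * delta)

# amplitudes (what 2-point statistics see) rescale with the bias ...
print(np.allclose(np.abs(f_g), b * np.abs(f_m)))    # True
# ... while the phases (what the LCF probes) are untouched, which is why
# combining the 2PCF and the LCF can break the bias-sigma8 degeneracy
print(np.allclose(np.angle(f_g), np.angle(f_m)))    # True
```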
Data on the migration of health-care workers: sources, uses, and challenges.
Diallo, Khassoum
2004-01-01
The migration of health workers within and between countries is a growing concern worldwide because of its impact on health systems in developing and developed countries alike. Policy decisions need to be made at the national, regional and international levels to manage this phenomenon more effectively, but those decisions will be effective and correctly implemented and evaluated only if they are based on adequate statistical data. Most statistics on the migration of health-care workers are neither complete nor fully comparable, and they are often underused, limited (because they often give only a broad description of the phenomena) and not as timely as required. There is also a striking mismatch between the wide range of potential sources of data and the poor statistical evidence on the migration of health personnel. There are two major problems facing researchers who wish to provide evidence on this migration: the problems commonly faced when studying migration in general, such as the definitional and comparability problems of "worker migrations", and those related to the specific movements of the health workforce. This paper presents information on the uses of statistics and those who use them, the strengths and limitations of the main data sources, and other challenges that need to be met to obtain good evidence on the migration of health workers. This paper also proposes methods to improve the collection, analysis, sharing, and use of statistics on the migration of health workers. PMID:15375450
van Klaveren, David; Steyerberg, Ewout W; Serruys, Patrick W; Kent, David M
2018-02-01
Clinical prediction models that support treatment decisions are usually evaluated for their ability to predict the risk of an outcome rather than treatment benefit, i.e. the difference between outcome risk with vs. without therapy. We aimed to define performance metrics for a model's ability to predict treatment benefit. We analyzed data from the Synergy between Percutaneous Coronary Intervention with Taxus and Cardiac Surgery (SYNTAX) trial and from three recombinant tissue plasminogen activator trials. We assessed alternative prediction models with a conventional risk concordance statistic (c-statistic) and a novel c-statistic for benefit. We defined observed treatment benefit by the outcomes in pairs of patients matched on predicted benefit but discordant for treatment assignment. The 'c-for-benefit' represents the probability that, of two randomly chosen matched patient pairs with unequal observed benefit, the pair with greater observed benefit also has the higher predicted benefit. Compared to a model without treatment interactions, the SYNTAX score II had improved ability to discriminate treatment benefit (c-for-benefit 0.590 vs. 0.552), despite having similar risk discrimination (c-statistic 0.725 vs. 0.719). However, for the simplified stroke-thrombolytic predictive instrument (TPI) vs. the original stroke-TPI, the c-for-benefit (0.584 vs. 0.578) was similar. The proposed methodology has the potential to measure a model's ability to predict treatment benefit that is not captured with conventional performance metrics. Copyright © 2017 Elsevier Inc. All rights reserved.
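A toy sketch of the c-for-benefit computation as described above; the pairing by ranked predicted benefit and the outcome coding are simplifying assumptions, and all names and data are hypothetical:

```python
import numpy as np

def c_for_benefit(pred_t, out_t, pred_c, out_c):
    """Toy c-for-benefit: match treated and control patients by ranked
    predicted benefit; observed benefit in a pair = control event minus
    treated event (outcomes coded 1 = adverse event, 0 = none)."""
    n = min(len(pred_t), len(pred_c))
    it, ic = np.argsort(pred_t)[:n], np.argsort(pred_c)[:n]
    pred = (pred_t[it] + pred_c[ic]) / 2     # pair-level predicted benefit
    obs = out_c[ic] - out_t[it]              # pair-level observed benefit
    conc = ties = total = 0
    for i in range(n):
        for j in range(i + 1, n):
            if obs[i] == obs[j]:
                continue                     # only pairs-of-pairs with unequal benefit
            hi, lo = (i, j) if obs[i] > obs[j] else (j, i)
            total += 1
            conc += int(pred[hi] > pred[lo])
            ties += int(pred[hi] == pred[lo])
    return (conc + 0.5 * ties) / total

rng = np.random.default_rng(0)
pred_t, pred_c = rng.uniform(0, 0.3, 60), rng.uniform(0, 0.3, 60)
out_t = (rng.uniform(size=60) < 0.15).astype(int)
out_c = (rng.uniform(size=60) < 0.30).astype(int)
print(c_for_benefit(pred_t, out_t, pred_c, out_c))
```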
Using Bayes' theorem for free energy calculations
NASA Astrophysics Data System (ADS)
Rogers, David M.
Statistical mechanics is fundamentally based on calculating the probabilities of molecular-scale events. Although Bayes' theorem has generally been recognized as providing key guiding principles for the setup and analysis of statistical experiments [83], classical frequentist models still predominate in the world of computational experimentation. As a starting point for widespread application of Bayesian methods in statistical mechanics, we investigate the central quantity of free energies from this perspective. This dissertation thus reviews the basics of Bayes' view of probability theory and the maximum entropy formulation of statistical mechanics before providing examples of its application to several advanced research areas. We first apply Bayes' theorem to a multinomial counting problem in order to determine inner shell and hard sphere solvation free energy components of Quasi-Chemical Theory [140]. We proceed to consider the general problem of free energy calculations from samples of interaction energy distributions. From there, we turn to spline-based estimation of the potential of mean force [142], and empirical modeling of observed dynamics using integrator matching. The results of this research are expected to advance the state of the art in coarse-graining methods, as they allow a systematic connection from high-resolution (atomic) to low-resolution (coarse) structure and dynamics. In total, our work on these problems constitutes a critical starting point for further application of Bayes' theorem in all areas of statistical mechanics. It is hoped that the understanding so gained will allow for improvements in comparisons between theory and experiment.
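For orientation, the classical baseline for free energy estimation from sampled interaction energies is exponential averaging (the Zwanzig identity). A minimal sketch with synthetic Gaussian samples, not the dissertation's Bayesian estimator:

```python
import numpy as np

rng = np.random.default_rng(1)
kT = 0.596                        # kcal/mol at ~300 K
dU = rng.normal(2.0, 1.0, 5000)   # sampled perturbation (interaction) energies

# Zwanzig / free energy perturbation:  dF = -kT * ln < exp(-dU / kT) >
dF = -kT * np.log(np.mean(np.exp(-dU / kT)))

# for Gaussian dU the exact answer is mu - var/(2 kT), a convenient check
print(dF, 2.0 - 1.0**2 / (2 * kT))
```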
Effects of sibutramine alone and with alcohol on cognitive function in healthy volunteers
Wesnes, K A; Garratt, C; Wickens, M; Gudgeon, A; Oliver, S
2000-01-01
Aims To investigate the effects of sibutramine in combination with alcohol in a double-blind, randomised, placebo-controlled, four-way crossover study in 20 healthy volunteers. Methods On each study day each volunteer received either: sibutramine 20 mg+0.5 g kg−1 alcohol; sibutramine 20 mg+placebo alcohol; placebo capsules+0.5 g kg−1 alcohol; or placebo capsules+placebo alcohol. Alcohol was administered 2 h following ingestion of the study capsules. During each study day, assessments of cognitive performance were made prior to dosing, and at 3, 4.5, 6 and 10 h post dosing. Blood alcohol concentration was estimated using a breath alcometer immediately prior to each cognitive performance test session. Each study day was followed by a minimum 7 day washout period. Results Alcohol was found to produce statistically significant impairments in tests of attention (maximum impairment to speed of digit vigilance=49 ms) and episodic memory (maximum impairment to speed of word recognition=74 ms). Alcohol also increased body sway (maximum increase 17.4 units) and lowered self-rated alertness (maximum decrease 13.6 mm). These effects were produced by an inferred blood alcohol level of 53.2 mg dl−1. Sibutramine was not found to potentiate any of the effects of alcohol. There was a small, yet statistically significant, interaction effect observed on the sensitivity index of the picture recognition task. In this test, the combined effects of sibutramine and alcohol were smaller than the impairments produced by alcohol alone. Sibutramine, when dosed alone, was associated with improved performance on several tasks. Sibutramine improved attention (mean speed of digit vigilance improved by 21 ms), picture recognition speed (improvement at 3 h of 81 ms) and motor control (tracking error at 3 h reduced by 1.58 mm). Sibutramine also improved postural stability (reducing body sway at 3 h by 14.2 units). Adverse events reported were unremarkable and consistent with the known pharmacology of sibutramine and alcohol. Conclusions There was little evidence of a clinically relevant interaction of sibutramine with the impairment of cognitive function produced by alcohol in healthy volunteers. The single statistically significant interaction indicated a reduction, rather than a worsening, of alcohol-induced impairment when sibutramine is taken concomitantly. Sibutramine when administered alone is associated with improved performance on several tasks. PMID:10671904
EoR Foregrounds: the Faint Extragalactic Radio Sky
NASA Astrophysics Data System (ADS)
Prandoni, Isabella
2018-05-01
A wealth of new data from upgraded and new radio interferometers is rapidly improving and transforming our understanding of the faint extragalactic radio sky. Indeed the mounting statistics at sub-mJy and μJy flux levels are finally allowing us to place stringent observational constraints on the faint radio population and on the modeling of its various components. In this paper I provide a brief overview of the latest results in areas that are potentially important for an accurate treatment of extragalactic foregrounds in experiments designed to probe the Epoch of Reionization.
Self-consistent assessment of Englert-Schwinger model on atomic properties
NASA Astrophysics Data System (ADS)
Lehtomäki, Jouko; Lopez-Acevedo, Olga
2017-12-01
Our manuscript investigates a self-consistent solution of the statistical atom model proposed by Berthold-Georg Englert and Julian Schwinger (the ES model) and benchmarks it against atomic Kohn-Sham and two orbital-free models of the Thomas-Fermi-Dirac (TFD)-λvW family. Results show that the ES model generally offers the same accuracy as the well-known TFD-(1/5)vW model; however, the ES model corrects the failure of the Pauli potential in the near-nucleus region. We also point to the inability to describe low-Z atoms as the foremost concern in improving the present model.
George, Stephen L; Buyse, Marc
2015-01-01
Highly publicized cases of fabrication or falsification of data in clinical trials have occurred in recent years and it is likely that there are additional undetected or unreported cases. We review the available evidence on the incidence of data fraud in clinical trials, describe several prominent cases, present information on motivation and contributing factors and discuss cost-effective ways of early detection of data fraud as part of routine central statistical monitoring of data quality. Adoption of these clinical trial monitoring procedures can identify potential data fraud not detected by conventional on-site monitoring and can improve overall data quality. PMID:25729561
Evolutionary computing for the design search and optimization of space vehicle power subsystems
NASA Technical Reports Server (NTRS)
Kordon, Mark; Klimeck, Gerhard; Hanks, David; Hua, Hook
2004-01-01
Evolutionary computing has proven to be a straightforward and robust approach for optimizing a wide range of difficult analysis and design problems. This paper discusses the application of these techniques to an existing space vehicle power subsystem resource and performance analysis simulation in a parallel processing environment. Our preliminary results demonstrate that this approach has the potential to improve the space system trade study process by allowing engineers to statistically weight subsystem goals of mass, cost and performance and then automatically size power elements based on anticipated performance of the subsystem rather than on worst-case estimates.
Zheng, Wenjun
2010-01-01
Protein conformational dynamics, despite its significant anharmonicity, has been widely explored by normal mode analysis (NMA) based on atomic or coarse-grained potential functions. To account for the anharmonic aspects of protein dynamics, this study proposes and performs an anharmonic NMA (ANMA) based on Cα-only elastic network models, which assume elastic interactions between pairs of residues whose Cα atoms or heavy atoms are within a cutoff distance. The key step of ANMA is to sample an anharmonic potential function along the directions of the eigenvectors of the lowest normal modes to determine the mean-squared fluctuations along these directions. ANMA was evaluated by modeling the anisotropic displacement parameters (ADPs) from a list of 83 high-resolution protein crystal structures. Significant improvement was found in the modeling of ADPs by ANMA compared with standard NMA. Further improvement in the modeling of ADPs is attained if the interactions between a protein and its crystalline environment are taken into account. In addition, this study has determined the optimal cutoff distances for ADP modeling based on elastic network models, and these agree well with the peaks of the statistical distributions of distances between Cα atoms or heavy atoms derived from a large set of protein crystal structures. PMID:20550915
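A minimal sketch of the elastic-network machinery underlying this kind of analysis: a standard anisotropic network model Hessian built from Cα coordinates, with its soft modes giving per-residue fluctuations. Cutoff, spring constant and the random toy coordinates are all assumptions:

```python
import numpy as np

def anm_modes(coords, cutoff=13.0, gamma=1.0):
    """Anisotropic (elastic) network model: springs between C-alpha pairs
    within `cutoff` angstroms; returns non-rigid-body modes of the Hessian.
    Assumes a connected network (exactly six zero modes)."""
    n = len(coords)
    H = np.zeros((3 * n, 3 * n))
    for i in range(n):
        for j in range(i + 1, n):
            d = coords[j] - coords[i]
            r2 = d @ d
            if r2 > cutoff ** 2:
                continue
            blk = -gamma * np.outer(d, d) / r2
            H[3*i:3*i+3, 3*j:3*j+3] = blk
            H[3*j:3*j+3, 3*i:3*i+3] = blk
            H[3*i:3*i+3, 3*i:3*i+3] -= blk
            H[3*j:3*j+3, 3*j:3*j+3] -= blk
    w, v = np.linalg.eigh(H)
    return w[6:], v[:, 6:]

coords = np.random.default_rng(1).uniform(0, 30, (50, 3))  # toy C-alpha coords
w, v = anm_modes(coords)
# per-residue mean-square fluctuations (the isotropic part of model ADPs)
msf = ((v ** 2).reshape(50, 3, -1).sum(axis=1) / w).sum(axis=1)
```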
Rizwanullah, Md; Amin, Saima; Ahmad, Javed
2017-01-01
In the present study, rosuvastatin calcium (ROS-Ca)-loaded nanostructured lipid carriers (NLC) were developed and optimized for improved efficacy. The ROS-Ca-loaded NLC were prepared using the melt emulsification-ultrasonication technique and optimized by a Box-Behnken statistical design. The optimized NLC were composed of glyceryl monostearate (solid lipid) and Capmul MCM EP (liquid lipid) as the lipid phase (3% w/v), with poloxamer 188 (1%) and Tween 80 (1%) as surfactants. The mean particle size, polydispersity index (PDI), zeta potential (ζ) and entrapment efficiency (%) of the optimized NLC formulation were 150.3 ± 4.67 nm, 0.175 ± 0.022, -32.9 ± 1.36 mV and 84.95 ± 5.63%, respectively. The NLC formulation showed better in vitro release in simulated intestinal fluid (pH 6.8) than the API suspension. Confocal laser scanning microscopy showed deeper permeation of the formulation across rat intestine compared to rhodamine B dye solution. A pharmacokinetic study in female albino Wistar rats showed a 5.4-fold increase in relative bioavailability with the NLC compared to the API suspension. The optimized NLC formulation also showed a significant (p < 0.01) lipid-lowering effect in hyperlipidemic rats. Therefore, NLC represent great potential for improved efficacy of ROS-Ca after oral administration.
On improving the communication between models and data.
Dietze, Michael C; Lebauer, David S; Kooper, Rob
2013-09-01
The potential for model-data synthesis is growing in importance as we enter an era of 'big data', greater connectivity and faster computation. Realizing this potential requires that the research community broaden its perspective about how and why they interact with models. Models can be viewed as scaffolds that allow data at different scales to inform each other through our understanding of underlying processes. Perceptions of relevance, accessibility and informatics are presented as the primary barriers to broader adoption of models by the community, while an inability to fully utilize the breadth of expertise and data from the community is a primary barrier to model improvement. Overall, we promote a community-based paradigm to model-data synthesis and highlight some of the tools and techniques that facilitate this approach. Scientific workflows address critical informatics issues in transparency, repeatability and automation, while intuitive, flexible web-based interfaces make running and visualizing models more accessible. Bayesian statistics provides powerful tools for assimilating a diversity of data types and for the analysis of uncertainty. Uncertainty analyses enable new measurements to target those processes most limiting our predictive ability. Moving forward, tools for information management and data assimilation need to be improved and made more accessible. © 2013 John Wiley & Sons Ltd.
Uchôa, Severina Alice da Costa; Arcêncio, Ricardo Alexandre; Fronteira, Inês Santos Estevinho; Coêlho, Ardigleusa Alves; Martiniano, Claudia Santos; Brandão, Isabel Cristina Araújo; Yamamura, Mellina; Maroto, Renata Melo
2016-01-01
Objective: to analyze the influence of contextual indicators on the performance of municipalities regarding potential access to primary health care in Brazil and to discuss the contribution of nurses to this access. Method: a multicenter descriptive study based on secondary data from the External Evaluation of the National Program for Access and Quality Improvement in Primary Care, with the participation of 17,202 primary care teams. The chi-square test of proportions was used to verify differences between the municipalities stratified by size of the coverage area, supply, coordination, and integration; when necessary, the chi-square test with Yates correction or Fisher's exact test was employed. For the population variable, the Kruskal-Wallis test was used. Results: the majority of participants were nurses (n=15,876; 92.3%). Statistically significant differences were observed between the municipalities in terms of territory (p=0.0000), availability (p=0.0000), coordination of care (p=0.0000), integration (p=0.0000) and supply (p=0.0000), verifying that the municipalities in area 6 tend to have better performance in these dimensions. Conclusion: areas 4, 5 and 6 performed better in every analyzed dimension, and nurses had a leading role in the potential to access primary health care in Brazil. PMID:26959332
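The tests named here are standard; a hedged scipy illustration with invented counts (not the study's data):

```python
from scipy import stats

# toy 2x2 table: stratum (A vs B) x criterion met (yes / no)
table = [[120, 80],
         [90, 110]]

chi2, p, dof, _ = stats.chi2_contingency(table, correction=False)  # plain chi-square
chi2_y, p_y, _, _ = stats.chi2_contingency(table)   # Yates correction (default for 2x2)
odds, p_fisher = stats.fisher_exact(table)          # exact test for sparse tables

# Kruskal-Wallis for a skewed variable (e.g., population) across three strata
h, p_kw = stats.kruskal([12, 35, 50], [110, 90, 300], [800, 1200, 950])
print(p, p_y, p_fisher, p_kw)
```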
Improving UWB-Based Localization in IoT Scenarios with Statistical Models of Distance Error.
Monica, Stefania; Ferrari, Gianluigi
2018-05-17
Interest in the Internet of Things (IoT) is rapidly increasing, as the number of connected devices is exponentially growing. One of the application scenarios envisaged for IoT technologies involves indoor localization and context awareness. In this paper, we focus on a localization approach that relies on a particular type of communication technology, namely Ultra Wide Band (UWB). UWB technology is an attractive choice for indoor localization, owing to its high accuracy. Since localization algorithms typically rely on estimated inter-node distances, the goal of this paper is to evaluate the improvement brought by a simple (linear) statistical model of the distance error. On the basis of an extensive experimental measurement campaign, we propose a general analytical framework, based on a Least Square (LS) method, to derive a novel statistical model for the range estimation error between a pair of UWB nodes. The proposed statistical model is then applied to improve the performance of a few illustrative localization algorithms in various realistic scenarios. The obtained experimental results show that the use of the proposed statistical model improves the accuracy of the considered localization algorithms with a reduction of the localization error up to 66%.
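A toy numpy sketch of the idea: fit a linear model of the range error on calibration data by least squares, then subtract the predicted error from new measurements. The error-model coefficients and noise level below are invented, not the paper's measured values:

```python
import numpy as np

rng = np.random.default_rng(2)
d_true = rng.uniform(1.0, 10.0, 200)      # calibration ground-truth ranges (m)
# synthetic UWB ranges with a distance-dependent bias plus noise (invented)
d_meas = 1.05 * d_true + 0.10 + rng.normal(0, 0.05, 200)

# least-squares fit of the linear error model  err(d_meas) = a * d_meas + b
A = np.c_[d_meas, np.ones_like(d_meas)]
(a, b), *_ = np.linalg.lstsq(A, d_meas - d_true, rcond=None)

d_corr = d_meas - (a * d_meas + b)        # bias-corrected range estimates
print(np.std(d_meas - d_true), np.std(d_corr - d_true))
```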
A DMAIC approach for process capability improvement in an engine crankshaft manufacturing process
NASA Astrophysics Data System (ADS)
Sharma, G. V. S. S.; Rao, P. Srinivasa
2014-05-01
The define-measure-analyze-improve-control (DMAIC) approach comprises five strata: define, measure, analyze, improve and control. It is a scientific approach for reducing deviations and improving the capability levels of manufacturing processes. The present work elaborates on the DMAIC approach applied to reducing the process variations of the stub-end-hole boring operation in the manufacture of crankshafts. This statistical process control study starts with the selection of the critical-to-quality (CTQ) characteristic in the define stratum. The next stratum constitutes the collection of dimensional measurement data for the identified CTQ characteristic. This is followed by the analysis and improvement strata, where various quality control tools such as the Ishikawa diagram, physical mechanism analysis, failure modes and effects analysis and analysis of variance are applied. Finally, process monitoring charts are deployed at the workplace for regular monitoring and control of the concerned CTQ characteristic. By adopting the DMAIC approach, the standard deviation was reduced from 0.003 to 0.002. The process potential capability index (CP) improved from 1.29 to 2.02 and the process performance capability index (CPK) improved from 0.32 to 1.45.
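The capability indices quoted here follow the usual definitions and can be computed directly; the spec limits and measurements below are hypothetical:

```python
import numpy as np

def capability(x, lsl, usl):
    """Process capability indices from measurements and spec limits."""
    mu, sigma = np.mean(x), np.std(x, ddof=1)
    cp = (usl - lsl) / (6 * sigma)               # potential capability (CP)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)  # performance capability (CPK)
    return cp, cpk

# toy stub-end-hole diameters (mm); spec limits are invented
x = np.random.default_rng(4).normal(25.000, 0.002, 60)
print(capability(x, lsl=24.994, usl=25.006))
```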
Statistics Section. Management and Technology Division. Papers.
ERIC Educational Resources Information Center
International Federation of Library Associations, The Hague (Netherlands).
Papers on library statistics, which were presented at the 1983 International Federation of Library Associations (IFLA) conference, include: (1) "Network Statistics and Library Management," in which Glyn T. Evans (United States) suggests that network statistics can be used to improve internal library decisionmaking, enhance group resource…
Second Language Experience Facilitates Statistical Learning of Novel Linguistic Materials.
Potter, Christine E; Wang, Tianlin; Saffran, Jenny R
2017-04-01
Recent research has begun to explore individual differences in statistical learning, and how those differences may be related to other cognitive abilities, particularly their effects on language learning. In this research, we explored a different type of relationship between language learning and statistical learning: the possibility that learning a new language may also influence statistical learning by changing the regularities to which learners are sensitive. We tested two groups of participants, Mandarin Learners and Naïve Controls, at two time points, 6 months apart. At each time point, participants performed two different statistical learning tasks: an artificial tonal language statistical learning task and a visual statistical learning task. Only the Mandarin-learning group showed significant improvement on the linguistic task, whereas both groups improved equally on the visual task. These results support the view that there are multiple influences on statistical learning. Domain-relevant experiences may affect the regularities that learners can discover when presented with novel stimuli. Copyright © 2016 Cognitive Science Society, Inc.
Dudko, Yevgeni; Kruger, Estie; Tennant, Marc
2017-01-01
Australia is one of the least densely populated countries in the world, with a population concentrated on or around coastal areas. Up to 33% of the Australian population are likely to have untreated dental decay, while people with inadequate dentition (fewer than 21 teeth) account for up to 34% of Australian adults. Historically, inadequate access to public dental care has resulted in long waiting lists, received much media coverage and been the subject of a new federal and state initiative. The objective of this research was to gauge the potential for reducing the national dental waiting list through the geographical advantage that could arise from subcontracting the delivery of subsidised dental care to the existing network of private dental clinics across Australia. Eligible population data were collected from the Australian Bureau of Statistics website. Waiting list data from across Australia were collected from publicly available sources and confirmed through direct communication with each individual state or territory dental health body. Quantum geographic information system software was used to map the distribution of the eligible population across Australia by statistical area, and to plot the locations of government and private dental clinics. Catchment areas of 5 km for metropolitan clinics and 5 km and 50 km for rural clinics were defined. The number of people on the waiting list and those eligible for subsidised dental care covered by each of the catchment areas was calculated. The percentage of the eligible population and of those on the waiting list that could benefit from the potential improvement in geographic access was ascertained for metropolitan and rural residents. Fifty-three percent of people on the waiting list resided within metropolitan areas. Rural and remote residents made up 47% of the population waiting to receive care. The utilisation of both government and private dental clinics for the delivery of subsidised dental care has the potential to improve geographic access for up to 25% of those residing within metropolitan areas and up to 59% of eligible country residents. This research finds that utilisation of the existing network of private dental practices across Australia for delivery of subsidised dental care could dramatically increase geographic reach, reduce waiting lists, and possibly make good oral health a more realistic goal for the economically disadvantaged members of the community. In addition, this approach has the potential to improve service availability in rural and remote areas for entire communities where existing socioeconomic dynamics do not foster new practice start-ups.
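A minimal sketch of the catchment computation: great-circle (haversine) distance against a 5 km radius, with the fraction of the eligible population covered by at least one clinic. Clinic and population coordinates are invented:

```python
import numpy as np

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between points given in decimal degrees."""
    lat1, lon1, lat2, lon2 = map(np.radians, (lat1, lon1, lat2, lon2))
    a = (np.sin((lat2 - lat1) / 2) ** 2
         + np.cos(lat1) * np.cos(lat2) * np.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371.0 * np.arcsin(np.sqrt(a))

# hypothetical clinic locations and population centroids (lat, lon)
clinics = np.array([[-33.87, 151.21], [-37.81, 144.96]])
people = np.array([[-33.90, 151.18], [-34.50, 150.90], [-37.70, 145.10]])

within = [any(haversine_km(p[0], p[1], c[0], c[1]) <= 5.0 for c in clinics)
          for p in people]
print(f"covered by a 5 km catchment: {100 * np.mean(within):.0f}%")
```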
Investigation of statistical iterative reconstruction for dedicated breast CT
Makeev, Andrey; Glick, Stephen J.
2013-01-01
Purpose: Dedicated breast CT has great potential for improving the detection and diagnosis of breast cancer. Statistical iterative reconstruction (SIR) in dedicated breast CT is a promising alternative to traditional filtered backprojection (FBP). One of the difficulties in using SIR is the presence of free parameters in the algorithm that control the appearance of the resulting image. These parameters require tuning in order to achieve high quality reconstructions. In this study, the authors investigated the penalized maximum likelihood (PML) method with two commonly used types of roughness penalty functions: hyperbolic potential and anisotropic total variation (TV) norm. Reconstructed images were compared with images obtained using standard FBP. Optimal parameters for PML with the hyperbolic prior are reported for the task of detecting microcalcifications embedded in breast tissue. Methods: Computer simulations were used to acquire projections in a half-cone beam geometry. The modeled setup describes a realistic breast CT benchtop system, with an x-ray spectra produced by a point source and an a-Si, CsI:Tl flat-panel detector. A voxelized anthropomorphic breast phantom with 280 μm microcalcification spheres embedded in it was used to model attenuation properties of the uncompressed woman's breast in a pendant position. The reconstruction of 3D images was performed using the separable paraboloidal surrogates algorithm with ordered subsets. Task performance was assessed with the ideal observer detectability index to determine optimal PML parameters. Results: The authors' findings suggest that there is a preferred range of values of the roughness penalty weight and the edge preservation threshold in the penalized objective function with the hyperbolic potential, which resulted in low noise images with high contrast microcalcifications preserved. In terms of numerical observer detectability index, the PML method with optimal parameters yielded substantially improved performance (by a factor of greater than 10) compared to FBP. The hyperbolic prior was also observed to be superior to the TV norm. A few of the best-performing parameter pairs for the PML method also demonstrated superior performance for various radiation doses. In fact, using PML with certain parameter values results in better images, acquired using 2 mGy dose, than FBP-reconstructed images acquired using 6 mGy dose. Conclusions: A range of optimal free parameters for the PML algorithm with hyperbolic and TV norm-based potentials is presented for the microcalcification detection task, in dedicated breast CT. The reported values can be used as starting values of the free parameters, when SIR techniques are used for image reconstruction. Significant improvement in image quality can be achieved by using PML with optimal combination of parameters, as compared to FBP. Importantly, these results suggest improved detection of microcalcifications can be obtained by using PML with lower radiation dose to the patient, than using FBP with higher dose. PMID:23927318
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wurtz, R.; Kaplan, A.
Pulse shape discrimination (PSD) is a type of statistical classifier. Fully-realized statistical classifiers rely on a comprehensive set of tools for designing, building, and implementing them. Advances in PSD rely on improvements to the implemented algorithm and can be achieved by using conventional statistical-classifier or machine learning methods. This paper provides the reader with a glossary of classifier-building elements and their functions in a fully-designed and operational classifier framework that can be used to discover opportunities for improving PSD classifier projects. This paper recommends reporting the PSD classifier's receiver operating characteristic (ROC) curve and its behavior at a gamma rejection rate (GRR) relevant for realistic applications.
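A hedged sketch of the recommended reporting: build a ROC curve from toy PSD scores and read off the neutron acceptance at a fixed gamma rejection rate. The score distributions and the GRR value are assumptions:

```python
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(5)
# toy PSD scores: label 1 = neutron, 0 = gamma (distributions invented)
scores = np.r_[rng.normal(1.0, 0.3, 5000), rng.normal(0.0, 0.3, 50000)]
labels = np.r_[np.ones(5000), np.zeros(50000)]

fpr, tpr, thr = roc_curve(labels, scores)

# at a gamma rejection rate (GRR) of 99.9%, the allowed FPR is 1 - GRR
grr = 0.999
neutron_eff = np.interp(1 - grr, fpr, tpr)
print(f"neutron acceptance at GRR={grr}: {neutron_eff:.3f}")
```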
Brandt, Laura A.; Benscoter, Allison; Harvey, Rebecca G.; Speroterra, Carolina; Bucklin, David N.; Romañach, Stephanie; Watling, James I.; Mazzotti, Frank J.
2017-01-01
Climate envelope models are widely used to describe the potential future distribution of species under different climate change scenarios. It is broadly recognized that there are both strengths and limitations to using climate envelope models and that outcomes are sensitive to initial assumptions, inputs, and modeling methods. Selection of predictor variables, a central step in modeling, is one of the areas where different techniques can yield varying results. Selection of climate variables to use as predictors is often done using statistical approaches that develop correlations between occurrences and climate data. These approaches have received criticism in that they rely on the statistical properties of the data rather than directly incorporating biological information about species responses to temperature and precipitation. We evaluated and compared models and prediction maps for 15 threatened or endangered species in Florida based on two variable selection techniques: expert opinion and a statistical method. We compared model performance between these two approaches for contemporary predictions, and the spatial correlation, spatial overlap and area predicted for contemporary and future climate predictions. In general, experts identified more variables as being important than the statistical method, and there was low overlap in the variable sets (<40%) between the two methods. Despite these differences in variable sets (expert versus statistical), models had high performance metrics (>0.9 for area under the curve (AUC) and >0.7 for true skill statistic (TSS)). Spatial overlap, which compares the spatial configuration between maps constructed using the different variable selection techniques, was only moderate overall (about 60%), with a great deal of variability across species. Difference in spatial overlap was even greater under future climate projections, indicating additional divergence of model outputs from different variable selection techniques. Our work is in agreement with other studies which have found that for broad-scale species distribution modeling, using statistical methods of variable selection is a useful first step, especially when there is a need to model a large number of species or expert knowledge of the species is limited. Expert input can then be used to refine models that seem unrealistic or for species that experts believe are particularly sensitive to change. It also emphasizes the importance of using multiple models to reduce uncertainty and improve map outputs for conservation planning. Where outputs overlap or show the same direction of change there is greater certainty in the predictions. Areas of disagreement can be used for learning by asking why the models do not agree, and may highlight areas where additional on-the-ground data collection could improve the models.
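The two performance metrics cited (AUC and TSS) can be computed as follows on toy presence/absence data; the 0.5 classification threshold is an assumption:

```python
import numpy as np
from sklearn.metrics import roc_auc_score, confusion_matrix

rng = np.random.default_rng(3)
y = rng.integers(0, 2, 500)                            # toy presence/absence
p = np.clip(0.3 * y + rng.uniform(0, 0.7, 500), 0, 1)  # toy model output

auc = roc_auc_score(y, p)
tn, fp, fn, tp = confusion_matrix(y, (p > 0.5).astype(int)).ravel()
tss = tp / (tp + fn) + tn / (tn + fp) - 1   # sensitivity + specificity - 1
print(f"AUC={auc:.2f}  TSS={tss:.2f}")
```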
Wang, Xiaobin; Cai, Dianxiong; Hoogmoed, Willem B; Oenema, Oene
2011-08-30
An apparently large disparity still exists between developed and developing countries in historical trends of the amounts of nitrogen (N) fertilizers consumed, and the same situation holds true in China. The situation of either N overuse or underuse has become one of the major limiting factors in agricultural production and economic development in China. The issue of food security in N-poor regions has been given the greatest attention internationally. Balanced and appropriate use of N fertilizer for enriching soil fertility is an effective step in preventing soil degradation, ensuring food security, and further contributing to poverty alleviation and rural economic development in the N-poor regions. Based on the China Statistical Yearbook (2007), there could be scope for improvement of N use efficiency (NUE) in N-rich regions by reducing N fertilizer input to an optimal level (≤180 kg N ha−1), and also potential for increasing yield in the N-poor regions by further increasing N fertilizer supply (up to 116 kg N ha−1). For the N-rich regions, the average estimated potential of N saving and NUE increase could be about 15% and 23%, respectively, while for the N-poor regions the average estimated potential for yield increase could be 21% on a regional scale, when N input is increased by 13%. The study suggests that to achieve the goals of regional yield improvement, it is necessary to readjust and optimize the regional distribution of N fertilizer use between the N-poor and N-rich regions in China, in combination with other nutrient management practices. Copyright © 2011 Society of Chemical Industry.
NASA Astrophysics Data System (ADS)
McCray, Wilmon Wil L., Jr.
The research was prompted by the need to assess the process improvement, quality management and analytical techniques taught to students in undergraduate and graduate systems engineering and computing science (e.g., software engineering, computer science, and information technology) degree programs at U.S. colleges and universities that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, process improvement methods and how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI®) models as process improvement frameworks to improve business process performance. High maturity process areas in the CMMI model imply the use of analytical, statistical and quantitative management techniques, and of process performance modeling, to identify and eliminate sources of variation, continually improve process performance, reduce cost and predict future outcomes. The research study identifies and provides a detailed discussion of the gap analysis findings on the process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, the gaps that exist in the literature, and a comparison analysis identifying the gaps between the SEI's "healthy ingredients" of a process performance model and the courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models. The research also includes a Monte Carlo simulation optimization model and dashboard that demonstrates the use of statistical methods, statistical process control, sensitivity analysis, quantitative and optimization techniques to establish a baseline and predict future customer satisfaction index scores (outcomes). The American Customer Satisfaction Index (ACSI) model and industry benchmarks were used as a framework for the simulation model.
Using CRANID to test the population affinity of known crania.
Kallenberger, Lauren; Pilbrow, Varsha
2012-11-01
CRANID is a statistical program used to infer the source population of a cranium of unknown origin by comparing its cranial dimensions with a worldwide craniometric database. It has great potential for estimating ancestry in archaeological, forensic and repatriation cases. In this paper we test the validity of CRANID in classifying crania of known geographic origin. Twenty-three crania of known geographic origin but unknown sex were selected from the osteological collections of the University of Melbourne. Only 18 crania showed a good statistical match with the CRANID database. Without considering accuracy of sex allocation, 11 crania were accurately classified into major geographic regions and nine were correctly classified to the geographically closest available reference populations. Four of the five crania with a poor statistical match were nonetheless correctly allocated to major geographical regions, although none was accurately assigned to the geographically closest reference samples. We conclude that if sex allocations are overlooked, CRANID can accurately assign 39% of specimens to the geographically closest matching reference samples and 48% to major geographic regions. Better source population representation may improve goodness of fit, but samples of known sex are needed to further test the utility of CRANID. © 2012 The Authors Journal of Anatomy © 2012 Anatomical Society.
Martian cratering 11. Utilizing decameter scale crater populations to study Martian history
NASA Astrophysics Data System (ADS)
Hartmann, W. K.; Daubar, I. J.
2017-03-01
New information has been obtained in recent years regarding formation rates and the production size-frequency distribution (PSFD) of decameter-scale primary Martian craters formed during recent orbiter missions. Here we compare the PSFD of the currently forming small primaries (P) with new data on the PSFD of the total small crater population that includes primaries and field secondaries (P + fS), which represents an average over longer time periods. The two data sets, if used in a combined manner, have extraordinary potential for clarifying not only the evolutionary history and resurfacing episodes of small Martian geological formations (as small as one or few km2) but also possible episodes of recent climatic change. In response to recent discussions of statistical methodologies, we point out that crater counts do not produce idealized statistics, and that inherent uncertainties limit improvements that can be made by more sophisticated statistical analyses. We propose three mutually supportive procedures for interpreting crater counts of small craters in this context. Applications of these procedures support suggestions that topographic features in upper meters of mid-latitude ice-rich areas date only from the last few periods of extreme Martian obliquity, and associated predicted climate excursions.
Reproducibility of ZrO2-based freeze casting for biomaterials.
Naleway, Steven E; Fickas, Kate C; Maker, Yajur N; Meyers, Marc A; McKittrick, Joanna
2016-04-01
The processing technique of freeze casting has been intensely researched for its potential to create porous scaffold and infiltrated composite materials for biomedical implants and structural materials. However, in order for this technique to be employed medically or commercially, it must be able to reliably produce materials in great quantities with similar microstructures and properties. Here we investigate the reproducibility of the freeze casting process by independently fabricating three sets of eight ZrO2-epoxy composite scaffolds with the same processing conditions but varying solid loading (10, 15 and 20 vol.%). Statistical analyses (one-way ANOVA and Tukey's HSD tests) run on measurements of the microstructural dimensions of these composite scaffold sets show that, while the majority of microstructures are similar, in all cases the composite scaffolds display statistically significant variability. In addition, the composite scaffolds were mechanically compressed and the results statistically analyzed. As with the microstructures, almost all of the resultant properties displayed significant variability, though most composite scaffolds were similar. These results suggest that additional research to improve control of the freeze casting technique is required before scaffolds and composite scaffolds can reliably be reproduced for commercial or medical applications. Copyright © 2015 Elsevier B.V. All rights reserved.
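The statistical workflow named here (one-way ANOVA followed by Tukey's HSD) in a short scipy/statsmodels sketch; the batch measurements are synthetic stand-ins for the microstructural dimensions:

```python
import numpy as np
from scipy.stats import f_oneway
from statsmodels.stats.multicomp import pairwise_tukeyhsd

rng = np.random.default_rng(7)
# toy microstructural dimension (um) for three independently cast batches
b1, b2, b3 = (rng.normal(m, 2.0, 8) for m in (20.0, 21.5, 20.5))

F, p = f_oneway(b1, b2, b3)                  # one-way ANOVA across batches
tukey = pairwise_tukeyhsd(np.r_[b1, b2, b3],
                          np.repeat(["b1", "b2", "b3"], 8), alpha=0.05)
print(F, p)
print(tukey)                                 # pairwise batch comparisons
```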
Shitara, Kohei; Matsuo, Keitaro; Oze, Isao; Mizota, Ayako; Kondo, Chihiro; Nomura, Motoo; Yokota, Tomoya; Takahari, Daisuke; Ura, Takashi; Muro, Kei
2011-08-01
We performed a systematic review and meta-analysis to determine the impact of neutropenia or leukopenia experienced during chemotherapy on survival. Eligible studies included prospective or retrospective analyses that evaluated neutropenia or leukopenia as a prognostic factor for overall survival or disease-free survival. Statistical analyses were conducted to calculate a summary hazard ratio and 95% confidence interval (CI) using random-effects or fixed-effects models based on the heterogeneity of the included studies. Thirteen trials were selected for the meta-analysis, with a total of 9,528 patients. The hazard ratio of death was 0.69 (95% CI, 0.64-0.75) for patients with higher-grade neutropenia or leukopenia compared to patients with lower-grade or no cytopenia. Our analysis was also stratified by statistical method (use of any statistical method to decrease lead-time bias: time-varying analysis or landmark analysis), but no differences were observed. Our results indicate that neutropenia or leukopenia experienced during chemotherapy is associated with improved survival in patients with advanced cancer or hematological malignancies undergoing chemotherapy. Future prospective analyses designed to investigate the potential impact of chemotherapy dose adjustment, coupled with monitoring of neutropenia or leukopenia, on survival are warranted.
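A minimal sketch of inverse-variance pooling of log hazard ratios, with a DerSimonian-Laird heterogeneity estimate of the kind used to choose between fixed- and random-effects models; the per-study numbers are invented, not the trials analyzed here:

```python
import numpy as np

# toy per-study hazard ratios with 95% CIs
hr = np.array([0.72, 0.65, 0.80, 0.60])
lo = np.array([0.60, 0.50, 0.62, 0.42])
hi = np.array([0.86, 0.85, 1.03, 0.86])

y = np.log(hr)
se = (np.log(hi) - np.log(lo)) / (2 * 1.96)   # SE of log-HR from the CI
w = 1 / se**2                                 # fixed-effect (inverse-variance) weights

mu = np.sum(w * y) / np.sum(w)
pooled = np.exp(mu)
ci = np.exp(mu + np.array([-1, 1]) * 1.96 / np.sqrt(np.sum(w)))

# DerSimonian-Laird tau^2 quantifies between-study heterogeneity
Q = np.sum(w * (y - mu) ** 2)
tau2 = max(0.0, (Q - (len(y) - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
print(pooled, ci, tau2)
```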
Hansen, John P
2003-01-01
Healthcare quality improvement professionals need to understand and use inferential statistics to interpret sample data from their organizations. In quality improvement and healthcare research studies, all the data from a population are often not available, so investigators take samples and make inferences about the population by using inferential statistics. This three-part series will give readers an understanding of the concepts of inferential statistics as well as the specific tools for calculating confidence intervals for samples of data. This article, Part 2, describes probability, populations, and samples. The uses of descriptive and inferential statistics are outlined. The article also discusses the properties and probability of normal distributions, including the standard normal distribution.
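As a companion to the concepts described, a minimal confidence-interval calculation for a small-sample mean using the t distribution (the data are invented):

```python
import numpy as np
from scipy import stats

x = np.array([12.1, 11.4, 13.0, 12.6, 11.8, 12.9, 12.2, 11.6])  # a sample
m, s, n = x.mean(), x.std(ddof=1), len(x)

# 95% CI for the population mean; t distribution for a small sample
t = stats.t.ppf(0.975, df=n - 1)
print(m - t * s / np.sqrt(n), m + t * s / np.sqrt(n))
```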
Model identification using stochastic differential equation grey-box models in diabetes.
Duun-Henriksen, Anne Katrine; Schmidt, Signe; Røge, Rikke Meldgaard; Møller, Jonas Bech; Nørgaard, Kirsten; Jørgensen, John Bagterp; Madsen, Henrik
2013-03-01
The acceptance of virtual preclinical testing of control algorithms is growing and thus also the need for robust and reliable models. Models based on ordinary differential equations (ODEs) can rarely be validated with standard statistical tools. Stochastic differential equations (SDEs) offer the possibility of building models that can be validated statistically and that are capable of predicting not only a realistic trajectory, but also the uncertainty of the prediction. In an SDE, the prediction error is split into two noise terms. This separation ensures that the errors are uncorrelated and provides the possibility to pinpoint model deficiencies. An identifiable model of the glucoregulatory system in a type 1 diabetes mellitus (T1DM) patient is used as the basis for development of a stochastic-differential-equation-based grey-box model (SDE-GB). The parameters are estimated on clinical data from four T1DM patients. The optimal SDE-GB is determined from likelihood-ratio tests. Finally, parameter tracking is used to track the variation in the "time to peak of meal response" parameter. We found that the transformation of the ODE model into an SDE-GB resulted in a significant improvement in the prediction and uncorrelated errors. Tracking of the "peak time of meal absorption" parameter showed that the absorption rate varied according to meal type. This study shows the potential of using SDE-GBs in diabetes modeling. Improved model predictions were obtained due to the separation of the prediction error. SDE-GBs offer a solid framework for using statistical tools for model validation and model development. © 2013 Diabetes Technology Society.
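A toy Euler-Maruyama sketch showing the two separated noise terms that distinguish an SDE grey-box model from an ODE: system (diffusion) noise inside the dynamics and observation noise on the measurements. Parameter values are invented, not the paper's glucoregulatory model:

```python
import numpy as np

rng = np.random.default_rng(8)
dt, n = 1.0, 300                      # minutes, number of steps
theta, mu, sigma = 0.02, 6.0, 0.08    # hypothetical glucose dynamics (mmol/L)

x = np.empty(n)
x[0] = 9.0
for k in range(n - 1):
    # Euler-Maruyama: drift plus system (diffusion) noise -- the SDE part
    x[k + 1] = x[k] + theta * (mu - x[k]) * dt + sigma * np.sqrt(dt) * rng.normal()

# separate observation noise, e.g. a glucose sensor -- the second noise term
y = x + rng.normal(0, 0.3, n)
```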
Petrillo, S; Longo, U G; Papalia, R; Denaro, V
2017-08-01
To report the outcomes and complications of reverse shoulder arthroplasty (RSA) in massive irreparable rotator cuff tears (MIRCT) and cuff tear arthropathy (CTA). A systematic review of the literature contained in the Medline, Cochrane, EMBASE, Google Scholar and Ovid databases was conducted on May 1, 2016, according to PRISMA guidelines. The key words "reverse total shoulder arthroplasty" or "reverse total shoulder prostheses" were matched with "rotator cuff tears"; "failed rotator cuff surgery"; "massive rotator cuff tears"; "irreparable rotator cuff tears"; "cuff tear arthropathy"; "outcomes"; "complications". All articles reporting outcomes and complications of RSA for the management of MIRCT or CTA were included. The comparison between preoperative and postoperative clinical scores, as well as range of motion (ROM), was performed using the Wilcoxon-Mann-Whitney test. P values lower than 0.05 were considered statistically significant. Seven articles were included in our qualitative synthesis. A statistically significant improvement in all clinical scores and ROM was found when comparing preoperative with postoperative values. The degree of retroversion of the humeral stem of the RSA does not influence the functional outcomes in a statistically significant fashion. The overall complication rate was 17.4%. The most frequent complication was heterotopic ossification, occurring in 6.6% of patients. Revision surgery was necessary in 7.3% of patients. RSA restores pain-free ROM and improves function of the shoulder in patients with MIRCT or CTA. However, complications occur in a high percentage of patients. The lack of level I studies limits the real understanding of the potentials and limitations of RSA for the management of MIRCT and CTA.
Fancher, Crystal E; Scott, Anthony; Allen, Ahkeel; Dale, Paul
2017-08-01
This is a 10-year retrospective chart review evaluating the potential impact of the most recent American Cancer Society mammography screening guidelines, which exclude female patients aged 40 to 44 years from routine annual screening mammography. Instead they recommend screening mammography starting at age 45, with the option to begin screening earlier if the patient desires. The institutional cancer registry was systematically searched to identify all women aged 40 to 44 years treated for breast cancer over a 10-year period. These women were separated into two cohorts: screening mammography detected cancer (SMDC) and nonscreening mammography detected cancer (NSMDC). Statistical analysis of the cohorts was performed for sentinel lymph node (SLN) status, five-year disease-free survival, and five-year overall survival. Women with SMDC had a significantly lower incidence of SLN-positive cancer than the NSMDC group, 9 of 63 (14.3%) versus 36 of 81 (44%; P < 0.001). The five-year disease-free survival was 84 per cent for SMDC and 80 per cent for NSMDC; this difference was not statistically significant. The five-year overall survival was statistically significant at 94 per cent for the SMDC group and 80 per cent for the NSMDC group (P < 0.05). This review demonstrates the significance of mammographic screening for early detection and treatment of breast cancer. Mammographic screening in women aged 40 to 44 detected tumors with fewer nodal metastases, resulting in improved survival and reaffirming the need for annual mammographic screening in this age group.
Prasifka, J R; Hellmich, R L; Dively, G P; Higgins, L S; Dixon, P M; Duan, J J
2008-02-01
One of the possible adverse effects of transgenic insecticidal crops is the unintended decline in the abundance of nontarget arthropods. Field trials designed to evaluate potential nontarget effects can be more complex than expected because decisions to conduct field trials and the selection of taxa to include are not always guided by the results of laboratory tests. Also, recent studies emphasize the potential for indirect effects (adverse impacts to nontarget arthropods without feeding directly on plant tissues), which are difficult to predict because of interactions among nontarget arthropods, target pests, and transgenic crops. As a consequence, field studies may attempt to monitor expansive lists of arthropod taxa, making the design of such broad studies more difficult and reducing the likelihood of detecting any negative effects that might be present. To improve the taxonomic focus and statistical rigor of future studies, existing field data and corresponding power analysis may provide useful guidance. Analysis of control data from several nontarget field trials using repeated-measures designs suggests that while detection of small effects may require considerable increases in replication, there are taxa from different ecological roles that are sampled effectively using standard methods. The use of statistical power to guide selection of taxa for nontarget trials reflects scientists' inability to predict the complex interactions among arthropod taxa, particularly when laboratory trials fail to provide guidance on which groups are more likely to be affected. However, scientists still may exercise judgment, including taxa that are not included in or supported by power analyses.
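A hedged sketch of the kind of power analysis advocated: replicates needed per treatment at standard effect sizes, using a simple two-group approximation rather than the repeated-measures designs actually analyzed in the paper:

```python
from statsmodels.stats.power import TTestIndPower

# replicates needed to detect a standardized effect on an abundance
# endpoint with 80% power at alpha = 0.05 (toy two-group approximation)
for d in (0.8, 0.5, 0.2):             # large, medium, small effects
    n = TTestIndPower().solve_power(effect_size=d, alpha=0.05, power=0.8)
    print(f"effect d={d}: ~{n:.0f} replicates per treatment")
```

Detecting small effects (d = 0.2) requires roughly an order of magnitude more replication than large effects, which is the practical force of the paper's argument.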
González-Rodríguez, M L; Barros, L B; Palma, J; González-Rodríguez, P L; Rabasco, A M
2007-06-07
In this paper, we have used statistical experimental design to investigate the effect of several factors on the coating process of lidocaine hydrochloride (LID) liposomes by a biodegradable polymer (chitosan, CH). These variables were the concentration of the CH coating solution, the dripping rate of this solution onto the liposome colloidal dispersion, the stirring rate, the time from liposome production to liposome coating and, finally, the amount of drug entrapped in the liposomes. The selected response variables were drug encapsulation efficiency (EE, %), coating efficiency (CE, %) and zeta potential. Liposomes were obtained by the thin-layer evaporation method. They were subsequently coated with CH according to the experimental plan provided by a fractional factorial (2^(5-1)) screening matrix. We used spectroscopic methods to determine the zeta potential values. The EE (%) assay was carried out in dialysis bags, and the brilliant red probe was used to determine CE (%) owing to its property of forming molecular complexes with CH. The graphic analysis of the effects allowed the identification of the main formulation and technological factors from the selected responses and the determination of the proper level of these factors for response improvement. Moreover, the fractional design allowed quantifying the interactions between the factors, which will be considered in subsequent experiments. The results obtained pointed out that the LID amount was the predominant factor that increased the drug entrapment capacity (EE). The CE (%) response was mainly affected by the concentration of the CH solution and the stirring rate, although all the interactions between the main factors had statistical significance.
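A 2^(5-1) screening matrix like the one used here can be generated from a defining relation; a minimal sketch (the generator E = ABCD, giving a resolution-V design, is an assumption about the design actually used):

```python
from itertools import product

# 2^(5-1) design: vary factors A-D over all +/-1 combinations, set E = ABCD
runs = [(a, b, c, d, a * b * c * d)          # defining relation I = ABCDE
        for a, b, c, d in product((-1, 1), repeat=4)]

for r in runs:                               # 16 runs instead of the full 32
    print(r)
```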
Aczel, Balazs; Bago, Bence; Szollosi, Aba; Foldes, Andrei; Lukacs, Bence
2015-01-01
The aim of this study was to initiate the exploration of debiasing methods applicable in real-life settings for achieving lasting improvement in decision making competence regarding multiple decision biases. Here, we tested the potential of the analogical encoding method for decision debiasing. The advantage of this method is that it can foster the transfer from learning abstract principles to improving behavioral performance. For the purpose of the study, we devised an analogical debiasing technique for 10 biases (covariation detection, insensitivity to sample size, base rate neglect, regression to the mean, outcome bias, sunk cost fallacy, framing effect, anchoring bias, overconfidence bias, planning fallacy) and assessed the susceptibility of the participants (N = 154) to these biases before and 4 weeks after the training. We also compared the effect of the analogical training to the effects of 'awareness training' and a 'no-training' control group. Results suggested improved performance of the analogical training group only on tasks where violations of statistical principles are measured. The interpretation of these findings requires further investigation, yet it is possible that analogical training is most effective for learning abstract concepts, such as statistical principles, which are otherwise difficult to master. The study encourages systematic research on debiasing training and the development of intervention assessment methods to measure the endurance of behavior change in decision debiasing. PMID:26300816
MixGF: spectral probabilities for mixture spectra from more than one peptide.
Wang, Jian; Bourne, Philip E; Bandeira, Nuno
2014-12-01
In large-scale proteomic experiments, multiple peptide precursors are often cofragmented simultaneously in the same mixture tandem mass (MS/MS) spectrum. These spectra tend to elude current computational tools because of the ubiquitous assumption that each spectrum is generated from only one peptide. Therefore, tools that consider multiple peptide matches to each MS/MS spectrum can potentially improve the relatively low spectrum identification rate often observed in proteomics experiments. More importantly, data independent acquisition protocols promoting the cofragmentation of multiple precursors are emerging as alternative methods that can greatly improve the throughput of peptide identifications, but their success also depends on the availability of algorithms to identify multiple peptides from each MS/MS spectrum. Here we address a fundamental question in the identification of mixture MS/MS spectra: determining the statistical significance of multiple peptides matched to a given MS/MS spectrum. We propose the MixGF generating function model to rigorously compute the statistical significance of peptide identifications for mixture spectra and show that this approach improves the sensitivity of current mixture spectra database search tools by ≈30-390%. Analysis of multiple data sets with MixGF reveals that in complex biological samples the number of identified mixture spectra can be as high as 20% of all the identified spectra and the number of unique peptides identified only in mixture spectra can be up to 35.4% of those identified in single-peptide spectra. © 2014 by The American Society for Biochemistry and Molecular Biology, Inc.
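To make the generating-function idea concrete, here is a toy single-peptide version in the spirit of the spectral probability computation that MixGF builds on: a dynamic program over integer prefix masses that sums the probability, over random peptides of a given precursor mass, of scoring at least a threshold. The alphabet, masses, and peak scores are deliberately simplified stand-ins; MixGF's actual model extends this to pairs of cofragmented peptides.

```python
# Sketch: a toy spectral-probability generating function. dp[m][s] holds
# the probability mass of all residue sequences whose prefix masses sum
# to m with accumulated score s against the spectrum.
from collections import defaultdict

AA = {"G": 57, "A": 71, "S": 87, "P": 97, "V": 99}      # toy alphabet
PRIOR = {aa: 1.0 / len(AA) for aa in AA}                # i.i.d. residue prior
peak_score = defaultdict(int, {57: 2, 128: 3, 215: 1})  # toy prefix-mass scores

def spectral_probability(precursor_mass: int, t: int) -> float:
    dp = [defaultdict(float) for _ in range(precursor_mass + 1)]
    dp[0][0] = 1.0
    for m in range(1, precursor_mass + 1):
        gain = peak_score[m]          # reward for landing on a prefix mass
        for aa, w in AA.items():
            if w <= m:
                for s, p in dp[m - w].items():
                    dp[m][s + gain] += p * PRIOR[aa]
    # Probability that a random peptide of this mass scores >= t.
    return sum(p for s, p in dp[precursor_mass].items() if s >= t)

print(spectral_probability(precursor_mass=215, t=5))
```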
Speeding in highway work zone: An Evaluation of methods of speed control.
Ravani, Bahram; Wang, Chao
2018-04-01
Highway workers frequently work in close proximity to live traffic in highway work zones, so traffic accidents can have devastating effects on worker safety. In order to reduce the potential for such accidents, methods involving the use of advisory signs and police presence have been used to mitigate accident risks and improve safety for highway workers. This research evaluates the magnitude of the speeding problem in highway work zones and the effects of four levels of police presence on improving work zone safety. Speed data were collected at six different work zone locations in northern and southern California and used to determine the magnitude and nature of the speeding problem in highway work zones. In addition, data were collected over 11 test-days in four work zones with four levels of police presence: radar speed display with police decal and lighting, passive use of a police vehicle with radar speed display, passive use of a police vehicle without radar speed display, and active police speed enforcement near work zones. This paper analyzes these data using statistical methods to evaluate the effectiveness of these different methods of speed control on the safety of the work zone. Four Measures of Effectiveness (MOE) were used in this evaluation, consisting of average speed reduction, speed variance, 85th percentile speed, and proportion of high speed vehicles. The results indicate that all levels of police presence provided statistically significant improvements in one or more of the MOEs. Copyright © 2018 Elsevier Ltd. All rights reserved.
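The four MOEs are straightforward to compute from before/after spot-speed samples. The sketch below shows one plausible implementation on synthetic data, with an assumed 55 mph threshold for "high speed" vehicles and Welch's t-test as a generic significance check; the paper does not specify these particular choices.

```python
# Sketch: the four work-zone Measures of Effectiveness on synthetic
# before/after speed samples (mph).
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
before = rng.normal(63, 6, 400)   # speeds without the treatment
after = rng.normal(58, 5, 400)    # speeds with a given police-presence level

print("mean speed reduction:", before.mean() - after.mean())
print("speed variance      :", before.var(ddof=1), "->", after.var(ddof=1))
print("85th percentile     :", np.percentile(before, 85), "->",
      np.percentile(after, 85))
print("share > 55 mph      :", (before > 55).mean(), "->", (after > 55).mean())

# Welch's t-test as one plausible significance check for the mean MOE.
t, p = stats.ttest_ind(before, after, equal_var=False)
print(f"t = {t:.2f}, p = {p:.4f}")
```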
Kelly, J J
1987-01-01
This article summarizes the 3 main types of interrelated activities which the Conference of European Statisticians has worked on to improve the measurement and international comparability of international migration flows. The work has encompassed collaborating with the UN Statistical Commission on the preparation and implementation of the revised international recommendations on statistics of international migration, organizing a regular exchange of data on immigration and emigration flows among the UN Economic Commission for Europe countries and selected countries in other regions, and conducting bilateral studies on international migration within the framework of the Conference's program of work in this field of statistics. The bulk of the work which has been carried out to date by the conference has been conducted rather anonymously and even unobtrusively by the staff of national statistical offices in Economic Commission for Europe countries; they have achieved a modest but important amount of progress during the past 15 years. There is reason to expect that further progress will be made over the next decade, particularly if national statistical offices in the region continue to undertake bilateral studies and endeavor to improve their migration statistics. However, more substantial progress could be achieved if additional countries and organizations established projects aimed at achieving these ends (author's modified).
Dodd, Lori E; Wagner, Robert F; Armato, Samuel G; McNitt-Gray, Michael F; Beiden, Sergey; Chan, Heang-Ping; Gur, David; McLennan, Geoffrey; Metz, Charles E; Petrick, Nicholas; Sahiner, Berkman; Sayre, Jim
2004-04-01
Cancer of the lung and bronchus is the leading fatal malignancy in the United States. Five-year survival is low, but treatment of early stage disease considerably improves chances of survival. Advances in multidetector-row computed tomography technology provide detection of smaller lung nodules and offer a potentially effective screening tool. The large number of images per exam, however, requires considerable radiologist time for interpretation and is an impediment to clinical throughput. Thus, computer-aided diagnosis (CAD) methods are needed to assist radiologists with their decision making. To promote the development of CAD methods, the National Cancer Institute formed the Lung Image Database Consortium (LIDC). The LIDC is charged with developing the consensus and standards necessary to create an image database of multidetector-row computed tomography lung images as a resource for CAD researchers. To develop such a prospective database, its potential uses must be anticipated. The ultimate applications will influence the information that must be included along with the images, the relevant measures of algorithm performance, and the number of required images. In this article we outline assessment methodologies and statistical issues as they relate to several potential uses of the LIDC database. We review methods for performance assessment and discuss issues of defining "truth" as well as the complications that arise when truth information is not available. We also discuss issues about sizing and populating a database.
NASA Astrophysics Data System (ADS)
Shafian, S.; Maas, S. J.
2015-12-01
Variations in soil moisture strongly affect surface energy balances, regional runoff, land erosion and vegetation productivity (i.e., potential crop yield). Hence, the estimation of soil moisture is very valuable in the social, economic, humanitarian (food security) and environmental segments of society. Extensive efforts to exploit the potential of remotely sensed observations to help quantify this complex variable are ongoing. This study aims to develop a new index, the Thermal Ground cover Moisture Index (TGMI), for estimating soil moisture content. This index is based on empirical parameterization of the relationship between raw image digital count (DC) data in the thermal infrared spectral band and ground cover (determined from raw image digital count data in the red and near-infrared spectral bands). The index uses satellite-derived information only, and the potential for its operational application is therefore great. This study was conducted in 18 commercial agricultural fields near Lubbock, TX (USA). Soil moisture was measured in these fields over two years and statistically compared to corresponding values of TGMI determined from Landsat image data. Results indicate statistically significant correlations between TGMI and field measurements of soil moisture (R2 = 0.73, RMSE = 0.05, MBE = 0.17 and AAE = 0.049), suggesting that soil moisture can be estimated using this index. It was further demonstrated that maps of TGMI developed from Landsat imagery could be constructed to show the relative spatial distribution of soil moisture across a region.
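The agreement statistics quoted above (R2, RMSE, MBE, AAE) have standard definitions; a minimal sketch with synthetic placeholder data follows. Note that R2 is computed here as one minus the ratio of residual to total sum of squares, which is one common convention and may differ slightly from the regression-based R2 the authors report.

```python
# Sketch: standard agreement statistics between estimated and measured
# volumetric soil moisture.
import numpy as np

def agreement(measured: np.ndarray, estimated: np.ndarray) -> dict:
    resid = estimated - measured
    ss_res = np.sum(resid ** 2)
    ss_tot = np.sum((measured - measured.mean()) ** 2)
    return {
        "R2": 1.0 - ss_res / ss_tot,
        "RMSE": np.sqrt(np.mean(resid ** 2)),
        "MBE": resid.mean(),           # mean bias error
        "AAE": np.abs(resid).mean(),   # average absolute error
    }

rng = np.random.default_rng(2)
measured = rng.uniform(0.05, 0.35, 50)           # synthetic field data
estimated = measured + rng.normal(0, 0.04, 50)   # synthetic index estimates
print(agreement(measured, estimated))
```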
Combined dietary and exercise intervention for control of serum cholesterol in the workplace
NASA Technical Reports Server (NTRS)
Angotti, C. M.; Chan, W. T.; Sample, C. J.; Levine, M. S.
2000-01-01
PURPOSE: To elucidate the potential effect of a combined dietary and exercise intervention on cardiovascular risk reduction among National Aeronautics and Space Administration Headquarters employees. DESIGN: A nonexperimental, longitudinal, clinical-chart review study (1987 to 1996) of an identified intervention group and a reference (not a control) group. SETTING: The study group worked in an office environment and participated in the annual medical examinations. SUBJECTS: An intervention group of 858 people with initially elevated serum cholesterol, and a reference group of 963 people randomly sampled from 10% of the study group. MEASURES: Serum cholesterol data were obtained for both groups, respectively, from pre- and postintervention and annual examinations. The reference group was adjusted by statistical exclusion of potential intervention participants. Regression equations (cholesterol vs. study years) for the unadjusted/adjusted reference groups were tested for statistical significance. INTERVENTION: An 8-week individualized, combined dietary and exercise program was instituted with annual follow-ups and was repeated where warranted. RESULTS: Only the unadjusted (but not the adjusted) reference group with initial mean total serum cholesterol levels above 200 mg/dL showed a significant 9-year decline trend and significant beta coefficient tests. An intervention effect is suggested. Mean high density lipoprotein cholesterol rose slightly in the intervention group but was maintained in the reference group. CONCLUSION: With potential design limitations, the NASA intervention program focusing on a high risk group may be associated, to some degree if not fully, with an overall improvement in cardiovascular risk profile.
Ghodrati, Masoud; Ghodousi, Mahrad; Yoonessi, Ali
2016-01-01
Humans are fast and accurate in categorizing complex natural images. It is, however, unclear what features of visual information are exploited by the brain to perceive images with such speed and accuracy. It has been shown that low-level contrast statistics of natural scenes can explain the variance of the amplitude of event-related potentials (ERPs) in response to rapidly presented images. In this study, we investigated the effect of these statistics on the frequency content of ERPs. We recorded ERPs from human subjects while they viewed natural images, each presented for 70 ms. Our results showed that Weibull contrast statistics, as a biologically plausible model, explained the variance of ERPs best, compared to the other image statistics that we assessed. Our time-frequency analysis revealed a significant correlation between these statistics and ERP power within the theta frequency band (~3-7 Hz). This is interesting, as the theta band is believed to be involved in context updating and semantic encoding. This correlation became significant at ~110 ms after stimulus onset and peaked at 138 ms. Our results show that not only the amplitude but also the frequency content of neural responses can be modulated by low-level contrast statistics of natural images, and they highlight the potential role of these statistics in scene perception.
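The Weibull contrast model is typically fit to the distribution of local contrast (often gradient magnitude) in an image, yielding shape and scale parameters that are then related to the neural response. Below is a minimal sketch on a synthetic image; the use of gradient magnitude as the contrast measure and the zero location parameter are common conventions, not details taken from this study.

```python
# Sketch: fitting a two-parameter Weibull model to an image's local
# contrast values, approximated here by gradient magnitude.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
image = rng.random((128, 128))        # stand-in for a natural image
gy, gx = np.gradient(image.astype(float))
contrast = np.hypot(gx, gy).ravel()
contrast = contrast[contrast > 0]

# Fixing the location at 0 leaves the shape (gamma) and scale (beta)
# parameters, the quantities usually correlated with ERP measures.
shape, loc, scale = stats.weibull_min.fit(contrast, floc=0)
print(f"gamma (shape) = {shape:.3f}, beta (scale) = {scale:.4f}")
```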
[Nootropics and antioxidants in the complex therapy of symptomatic posttraumatic epilepsy].
Savenkov, A A; Badalian, O L; Avakian, G N
2013-01-01
To study the possibility of using nootropics and antioxidants in complex antiepileptic therapy, we examined 75 patients with symptomatic focal posttraumatic epilepsy. A statistically significant reduction in the number of epileptic seizures, improvement of cognitive function and quality of life of the patients, as well as a decrease in the severity of depression and of epileptic changes in the EEG, were identified. Potentiation of the antiepileptic activity of the basic drugs, normalization of the brain's electrical activity and reduction in EEG epileptiform activity, in particular in coherence indicators of slow-wave activity, were noted after treatment with the antioxidant mexidol. A trend towards improvement of neuropsychological performance and quality of life was observed. Notably, there was none of the seizure aggravation typical of many nootropic drugs. Thus, phenotropil and mexidol can be recommended for the complex treatment of symptomatic posttraumatic epilepsy.
Neuroplasticity and Clinical Practice: Building Brain Power for Health.
Shaffer, Joyce
2016-01-01
The focus of this review is on driving neuroplasticity in a positive direction using evidence-based interventions that also have the potential to improve general health. One goal is to provide an overview of the many ways new neuroscience can inform treatment protocols to empower and motivate clients to make the lifestyle choices that could help build brain power and could increase adherence to healthy lifestyle changes that have also been associated with simultaneously enhancing vigorous longevity, health, happiness, and wellness. Another goal is to explore the use of a focus in clinical practice on helping clients appreciate this new evidence and use evolving neuroscience in establishing individualized goals, designing strategies for achieving them and increasing treatment compliance. The timing is urgent for such interventions with goals of enhancing brain health across the lifespan and improving statistics on dementia worldwide.
Strategies to improve learning of all students in a class
NASA Astrophysics Data System (ADS)
Suraishkumar, G. K.
2018-05-01
The statistical distribution of student learning abilities in a typical undergraduate engineering class poses a significant challenge to simultaneously improving the learning of all the students in the class. With traditional instruction styles, the students with significantly high learning abilities are not satisfied due to a feeling of unfulfilled potential, and the students with significantly low learning abilities feel lost. To address this challenge in an undergraduate core/required course on 'transport phenomena in biological systems', a combination of learning strategies such as active learning (including co-operative group learning), challenge exercises, and others was employed in a pro-advising context. The short-term and long-term impacts were evaluated through student course performance and student feedback, respectively. The results show that it is possible to effectively address the challenge posed by the distribution of student learning abilities in a class.
Toward Intraoperative Image-Guided Transoral Robotic Surgery
Liu, Wen P.; Reaugamornrat, Sureerat; Deguet, Anton; Sorger, Jonathan M.; Siewerdsen, Jeffrey H.; Richmon, Jeremy; Taylor, Russell H.
2014-01-01
This paper presents the development and evaluation of video augmentation on the stereoscopic da Vinci S system with intraoperative image guidance for base of tongue tumor resection in transoral robotic surgery (TORS). The proposed workflow for image-guided TORS begins by identifying and segmenting critical oropharyngeal structures (e.g., the tumor and adjacent arteries and nerves) from preoperative computed tomography (CT) and/or magnetic resonance (MR) imaging. These preoperative planning data can be deformably registered to the intraoperative endoscopic view using mobile C-arm cone-beam computed tomography (CBCT) [1, 2]. Augmentation of TORS endoscopic video with surgical targets and critical structures has the potential to improve navigation, spatial orientation, and confidence in tumor resection. Experiments in animal specimens achieved a statistically significant improvement in target localization error when comparing the proposed image guidance system to simulated current practice. PMID:25525474
Idris, Ayman Salih Omer; Pandey, Ashok; Rao, S S; Sukumaran, Rajeev K
2017-10-01
The production of cellulase by Trichoderma reesei RUT C-30 under solid-state fermentation (SSF) on wheat bran and cellulose was optimized employing a two-stage statistical design of experiments. Optimization of process parameters resulted in a 3.2-fold increase in CMCase production, to 959.53 IU/gDS. The process was evaluated at pilot scale in tray fermenters and yielded 457 IU/gDS under the laboratory conditions, indicating the possibility of further improvement. The cellulase could effectively hydrolyze alkali-pretreated sorghum stover, and addition of Aspergillus niger β-glucosidase improved the hydrolytic efficiency by 174%, indicating the potential of this blend for effective saccharification of sorghum stover biomass. The enzymatic hydrolysate of sorghum stover was fermented to ethanol with ∼80% efficiency. Copyright © 2017 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Kataoka, Norio; Kasama, Kiyonobu; Zen, Kouki; Chen, Guangqi
This paper presents a probabilistic method for assessing the liquefaction risk of cement-treated ground, an anti-liquefaction ground improvement produced by cement-mixing. In this study, the liquefaction potential of cement-treated ground is analyzed statistically using Monte Carlo simulation based on nonlinear earthquake response analysis considering the spatial variability of soil properties. The seismic bearing capacity of partially liquefied ground is analyzed in order to estimate damage costs induced by partial liquefaction. Finally, the annual liquefaction risk is calculated by multiplying the liquefaction potential by the damage costs. The results indicated that the proposed method makes it possible to evaluate the probability of liquefaction and to estimate the damage costs using the hazard curve, the fragility curve induced by liquefaction, and the liquefaction risk curve.
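The risk arithmetic described above (annual risk = event frequency x probability of liquefaction x damage cost) can be sketched with a simple Monte Carlo over spatially variable soil strength. All distributions, the trigger criterion, and the cost and frequency figures below are illustrative assumptions, not the paper's calibrated model.

```python
# Sketch: Monte Carlo liquefaction risk. Lognormal variability in the
# cyclic resistance ratio (CRR) of the treated soil is compared against
# an earthquake-induced cyclic stress ratio (CSR).
import numpy as np

rng = np.random.default_rng(4)
n_sim = 100_000

crr = rng.lognormal(mean=np.log(0.35), sigma=0.25, size=n_sim)
csr = rng.lognormal(mean=np.log(0.25), sigma=0.30, size=n_sim)

p_liq = np.mean(csr > crr)    # probability of liquefaction given the event
damage_cost = 2.0e6           # assumed damage cost per liquefaction event
annual_rate = 0.01            # assumed annual frequency of the design event

print(f"P(liquefaction | event) = {p_liq:.3f}")
print(f"annual risk ~ {annual_rate * p_liq * damage_cost:,.0f} cost units/yr")
```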
Solar thermal power plants in small utilities - An economic impact analysis
NASA Technical Reports Server (NTRS)
Bluhm, S. A.; Ferber, R. R.; Mayo, L. G.
1979-01-01
A study was performed to assess the potential economic impact of small solar thermal electric power systems in statistically representative synthetic small utilities of the Southwestern United States. Power supply expansion plans were compared on the basis of the present worth of future revenue requirements for 1980-2000 with and without solar thermal plants. Coal-fired and oil-fired municipal utility expansion plans with 5 percent solar penetration were 0.5 percent and 2.25 percent less expensive, respectively, than the corresponding conventional plans. At $969/kWe, which assumes the same low cost solar equipment but no improvement in site development costs, solar penetration of 5 percent in the oil-fired municipal utility reduced revenue requirements 0.88 percent. The paper concludes that some solar thermal plants are potentially economic in small community utilities of the Southwest.
Balancing Treatment and Control Groups in Quasi-Experiments: An Introduction to Propensity Scoring
ERIC Educational Resources Information Center
Connelly, Brian S.; Sackett, Paul R.; Waters, Shonna D.
2013-01-01
Organizational and applied sciences have long struggled with improving causal inference in quasi-experiments. We introduce organizational researchers to propensity scoring, a statistical technique that has become popular in other applied sciences as a means for improving internal validity. Propensity scoring statistically models how individuals in…
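Although the abstract is truncated, the core mechanics of propensity scoring are standard: model the probability of treatment from observed covariates, then match or weight on that probability. A minimal sketch with synthetic data and simple 1:1 nearest-neighbor matching follows; it illustrates the generic technique, not the specific procedure the authors recommend.

```python
# Sketch: propensity scores via logistic regression, then 1:1
# nearest-neighbor matching on the fitted probability (no caliper).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(5)
n = 500
X = rng.normal(size=(n, 3))                              # observed covariates
treated = (X @ [0.8, -0.5, 0.3] + rng.normal(size=n)) > 0

score = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

t_idx = np.flatnonzero(treated)
c_idx = np.flatnonzero(~treated)
# For each treated unit, pick the control with the closest score.
diffs = np.abs(score[t_idx][:, None] - score[c_idx][None, :])
matches = c_idx[diffs.argmin(axis=1)]
print("matched control pool size:", len(np.unique(matches)))
```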
Recommendations for Improved Performance Appraisal in the Federal Sector
1986-01-01
camera-ready copy of a Participant's Coursebook to be used in conducting sessions of the course, and (d) an evaluation instrument for use in obtaining... Timeliness and Availability of Departmental Statistics and Analyses: develop complete plans for conducting the 1990 census; improve statistics on...
McCarthy, Bridie; Trace, Anna; O'Donovan, Moira; O'Regan, Patricia; Brady-Nevin, Caroline; O'Shea, Maria; Martin, Ann-Marie; Murphy, Margaret
2018-02-01
Knowledge of coping mechanisms is important for nursing and midwifery students to cope with stressful events during undergraduate education. The aim was to evaluate the impact of a psycho-educational intervention, "Coping with Stressful Events", with first year undergraduate nursing and midwifery students. A quasi-experimental, one-group pre-post-test design was used in one school of nursing/midwifery in one university in Ireland, with a convenience sample of all first year undergraduate nursing and midwifery students (n=197). Of these, 166 completed the pretest and 138 completed the post-test. Using the COPE Inventory questionnaire (Carver et al., 1989), data were collected by two research assistants before and after delivery of the psycho-educational intervention and analysed using IBM SPSS Statistics version 22 (NY, USA). Results demonstrated improved coping skills among students. There were statistically significant differences between pre- and post-intervention scores for some coping subscales. For example, the mean subscale scores were lower post-intervention for restraint and mental disengagement, and higher for use of emotional and instrumental social support, indicating improved coping strategies. This intervention has the potential to influence undergraduate nursing and midwifery students' coping skills during the first year of an undergraduate programme. Copyright © 2017 Elsevier Ltd. All rights reserved.
Turner, Katrina; McCarthy, Valerie Lander
2017-01-01
Undergraduate nursing students experience significant stress and anxiety, inhibiting learning and increasing attrition. Twenty-six intervention studies were identified and evaluated, updating a previous systematic review which categorized interventions targeting: (1) stressors, (2) coping, or (3) appraisal. The majority of interventions in this review aimed to reduce numbers or intensity of stressors through curriculum development (12) or to improve students' coping skills (8). Two studies reported interventions using only cognitive reappraisal while three interventions combined reappraisal with other approaches. Strength of evidence was limited by choice of study design, sample size, and lack of methodological rigor. Some statistically significant support was found for interventions focused on reducing stressors through curriculum development or improving students' coping skills. No statistically significant studies using reappraisal, either alone or in combination with other approaches, were identified, although qualitative findings suggested the potential benefits of this approach do merit further study. Progress was noted since 2008 in the increased number of studies and greater use of validated outcome measures but the review concluded further methodologically sound, adequately powered studies, especially randomized controlled trials, are needed to determine which interventions are effective to address the issue of excessive stress and anxiety among undergraduate nursing students. Copyright © 2016 Elsevier Ltd. All rights reserved.
Fourtune, Lisa; Prunier, Jérôme G; Paz-Vinas, Ivan; Loot, Géraldine; Veyssière, Charlotte; Blanchet, Simon
2018-04-01
Identifying landscape features that affect functional connectivity among populations is a major challenge in fundamental and applied sciences. Landscape genetics combines landscape and genetic data to address this issue, with the main objective of disentangling direct and indirect relationships among an intricate set of variables. Causal modeling has strong potential to address the complex nature of landscape genetic data sets. However, this statistical approach was not initially developed to handle the pairwise distance matrices commonly used in landscape genetics. Here, we aimed to extend the applicability of two causal modeling methods, maximum-likelihood path analysis and the directional separation test, by developing statistical approaches for handling distance matrices and improving functional connectivity inference. Using simulations, we showed that these approaches greatly improved the robustness of the absolute (using a frequentist approach) and relative (using an information-theoretic approach) fits of the tested models. We used an empirical data set combining genetic information on a freshwater fish species (Gobio occitaniae) and detailed landscape descriptors to demonstrate the usefulness of causal modeling for identifying functional connectivity in wild populations. Specifically, we demonstrated how direct and indirect relationships involving altitude, temperature, and oxygen concentration influenced within- and between-population genetic diversity of G. occitaniae.
Fulcher, Yan G.; Fotso, Martial; Chang, Chee-Hoon; Rindt, Hans; Reinero, Carol R.
2016-01-01
Asthma is prevalent in children and cats, and means of noninvasive diagnosis are needed. We sought to noninvasively distinguish 53 cats before and soon after induction of allergic asthma, using NMR spectra of exhaled breath condensate (EBC). Statistical pattern recognition was improved considerably by preprocessing the spectra with probabilistic quotient normalization and glog transformation. Classification of the 106 preprocessed spectra by principal component analysis and partial least squares with discriminant analysis (PLS-DA) appears to be impaired by variances unrelated to eosinophilic asthma. By filtering out confounding variances, orthogonal signal correction (OSC) PLS-DA greatly improved the separation of the healthy and early asthmatic states, attaining 94% specificity and 94% sensitivity in predictions. OSC enhancement of multi-level PLS-DA boosted the specificity of the prediction to 100%. OSC-PLS-DA of the normalized spectra suggests that the most promising biomarkers of allergic asthma in cats include increased acetone, metabolite(s) with overlapped NMR peaks near 5.8 ppm, and a hydroxyphenyl-containing metabolite, as well as decreased phthalate. Acetone is elevated in the EBC of 74% of the cats with early asthma. The noninvasive detection of early experimental asthma, biomarkers in EBC, and metabolic perturbation invite further investigation of the diagnostic potential in humans. PMID:27764146
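Both preprocessing steps named above have standard textbook forms: PQN rescales each spectrum by the median quotient against a reference spectrum (here the median spectrum), and the glog transform stabilizes variance at low intensities. The sketch below shows these standard forms on synthetic data; the glog lambda is a tuning constant chosen arbitrarily, and the authors' exact parameterization may differ.

```python
# Sketch: probabilistic quotient normalization (PQN) followed by a
# generalized log (glog) transform, the two steps named in the abstract.
import numpy as np

def pqn(spectra: np.ndarray) -> np.ndarray:
    """Rows are spectra; columns are (binned) NMR variables."""
    reference = np.median(spectra, axis=0)
    quotients = spectra / reference
    dilution = np.median(quotients, axis=1, keepdims=True)
    return spectra / dilution

def glog(x: np.ndarray, lam: float = 1e-4) -> np.ndarray:
    return np.log(x + np.sqrt(x ** 2 + lam))

rng = np.random.default_rng(6)
# Synthetic positive spectra with row-wise "dilution" differences.
raw = rng.gamma(2.0, 1.0, size=(10, 200)) * rng.uniform(0.5, 2.0, (10, 1))
processed = glog(pqn(raw))
print(processed.shape)
```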
Cherry, Kevin M; Peplinski, Brandon; Kim, Lauren; Wang, Shijun; Lu, Le; Zhang, Weidong; Liu, Jianfei; Wei, Zhuoshi; Summers, Ronald M
2015-01-01
Given the potential importance of marginal artery localization for automated registration in computed tomography colonography (CTC), we have devised a semi-automated method of marginal vessel detection employing sequential Monte Carlo tracking (also known as particle filter tracking) by multiple cue fusion based on intensity, vesselness, organ detection, and minimum spanning tree information for poorly enhanced vessel segments. We then employed a random forest algorithm for intelligent cue fusion and decision making, which achieved high sensitivity and robustness. After applying a vessel pruning procedure to the tracking results, we achieved statistically significantly improved precision compared to a baseline Hessian detection method (2.7% for the baseline versus 75.2%, p<0.001). The method also showed a statistically significantly improved recall rate compared to a 2-cue baseline method using fewer vessel cues (30.7% for the baseline versus 67.7%, p<0.001). These results demonstrate that marginal artery localization on CTC is feasible by combining a discriminative classifier (i.e., random forest) with a sequential Monte Carlo tracking mechanism. In so doing, we present the effective application of an anatomical probability map to vessel pruning as well as a supplementary spatial coordinate system for colonic segmentation and registration when this task has been confounded by colon lumen collapse. Published by Elsevier B.V.
NASA Astrophysics Data System (ADS)
Lee, J.
2017-12-01
As of April 2017, California is the third most prevalent state in the United States for Zika infection, and Southern California has an ever-growing population of Aedes mosquitos. Zika is a disease that poses a significant risk to humans and other mammals due to its effects on pregnancy. This emerging disease is highly contagious, with infection spread primarily by Aedes aegypti mosquitos. Aedes mosquitos are able to breed in small rain-collecting containers, which allows the species to persevere in urban and semi-urban environments. We hope to identify potential areas at risk of human infection within Los Angeles and its surrounding areas. This study integrates remote sensing, GIS, statistical, and environmental techniques to study favorable habitats for this particular species of mosquito and its larvae. The study of the geographic and landscape factors that promote larval development allows the spread of the disease to be analyzed and modeled. The development of this study has several goals: coordinating statistical data with local epidemiology departments, identifying workflows to improve efficiency, creating models that can be utilized for disease prevention, and identifying geographic risk factors for the spread of Zika.
Relationship between metabolic syndrome and moderate-to-vigorous physical activity in youth.
Machado-Rodrigues, Aristides M; Leite, Neiva; Coelho e Silva, Manuel J; Valente-dos-Santos, João; Martins, Raul A; Mascarenhas, Luis P G; Boguszewski, Margaret C S; Padez, Cristina; Malina, Robert M
2015-01-01
Associations of metabolic syndrome (MetS) with lifestyle behaviors in youth are potentially important for identifying subgroups at risk and encouraging interventions. This study evaluates the associations between the clustering of metabolic risk factors and moderate-to-vigorous physical activity (MVPA) in youth. The sample comprised 522 girls and 402 boys (N = 924) aged 11 to 17 years. Height, weight, waist circumference (WC), fasting glucose, high-density lipoprotein cholesterol, triglycerides, and blood pressures were measured. Cardiorespiratory fitness (CRF) was assessed using the 20-m shuttle run test. MVPA was estimated with a 3-day diary. Outcome variables were statistically normalized and expressed as z scores. A clustered metabolic risk score was computed as the mean of the z scores. Multiple linear regression was used to test associations between metabolic risk and MVPA by sex, adjusted for age, WC, and CRF. After adjustment for potential confounders, MVPA was inversely associated with the clustering of metabolic risk factors in girls, but not in boys; in addition, after adjusting for WC, the statistical model of that relationship was substantially improved in girls. MVPA was independently associated with increased risk of MetS in girls. Additional efforts are needed to encourage research with different analytical approaches and standardization of criteria for MetS in youth.
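The clustered risk score construction described here is easy to reproduce: z-score each risk factor (with HDL inverted so that higher always means worse), average per subject, then regress on MVPA with covariate adjustment. The sketch below uses synthetic data, and statsmodels OLS stands in for the study's multiple linear regression.

```python
# Sketch: clustered metabolic risk score = mean of z-scored risk
# factors, regressed on MVPA with a covariate adjustment.
import numpy as np
import statsmodels.api as sm
from scipy.stats import zscore

rng = np.random.default_rng(7)
n = 300
wc, glucose, tg, bp = (rng.normal(size=n) for _ in range(4))
hdl = rng.normal(size=n)
mvpa = rng.normal(size=n)                       # activity, standardized here

risk = np.column_stack([wc, glucose, tg, bp, -hdl])   # invert HDL
risk_score = zscore(risk, axis=0).mean(axis=1)

X = sm.add_constant(np.column_stack([mvpa, wc]))      # adjust for WC, etc.
fit = sm.OLS(risk_score, X).fit()
print(fit.params)
print(fit.pvalues)
```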
Predictors of work-related sensitisation, allergic rhinitis and asthma in early work life.
Kellberger, Jessica; Peters-Weist, Astrid S; Heinrich, Sabine; Pfeiffer, Susanne; Vogelberg, Christian; Roller, Diana; Genuneit, Jon; Weinmayr, Gudrun; von Mutius, Erika; Heumann, Christian; Nowak, Dennis; Radon, Katja
2014-09-01
Although work-related asthma and allergies are a huge burden for society, investigation of occupational exposures in early work life using an unexposed reference group is rare. Thus, the present analyses aimed to assess the potential impact of occupational exposure and other risk factors on the prevalence of work-related sensitisation and incidence of allergic rhinitis/asthma using a population-based approach and taking into account an unexposed reference group. In SOLAR (Study on Occupational Allergy Risks) II, German participants of ISAAC (International Study of Asthma and Allergies in Childhood) phase II were followed from childhood (9-11 years) until early adulthood (19-24 years). Data on 1570 participants were available to fit predictive models. Occupational exposure was not statistically significantly associated with disease prevalence/incidence. Sensitisation in childhood, parental asthma, environmental tobacco smoke exposure during puberty, sex and study location were statistically significant predictors of outcome. Our results indicate that occupational exposure is of little relevance for work-related sensitisation prevalence and allergic rhinitis/asthma incidence in early work life, while other risk factors can be used to improve career guidance for adolescents. Further research on the role of a potential healthy hire effect and the impact of longer exposure duration is needed. ©ERS 2014.
Sel, İlker; Çakmakcı, Mehmet; Özkaya, Bestamin; Suphi Altan, H
2016-10-01
The main objective of this study was to develop a statistical model for easier and faster Biochemical Methane Potential (BMP) prediction of landfilled municipal solid waste by analyzing the waste composition of samples excavated from 12 sampling points and three waste depths, representing different landfilling ages, in the closed and active sections of a sanitary landfill site located in İstanbul, Turkey. Results of Principal Component Analysis (PCA) were used as a decision support tool to evaluate and describe the waste composition variables. Four principal components were extracted, describing 76% of the data set variance. The most effective components were determined to be PCB, PO, T, D, W, FM, moisture and BMP for the data set. Multiple Linear Regression (MLR) models were built from the original compositional data and from transformed data to determine differences. It was observed that even though the residual plots were better for the transformed data, the R2 and adjusted R2 values were not improved significantly. The best preliminary BMP prediction models consisted of the D, W, T and FM waste fractions for both versions of the regressions. Adjusted R2 values of the raw and transformed models were 0.69 and 0.57, respectively. Copyright © 2016 Elsevier Ltd. All rights reserved.
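A minimal sketch of this PCA-then-regression workflow follows: PCA to summarize the composition variables, an MLR on a subset of fractions to predict BMP, and adjusted R2 by its usual formula. The fraction names mirror the abstract, but the data are synthetic, so the numbers are illustrative only.

```python
# Sketch: PCA for exploring composition variables, then MLR for BMP
# prediction with adjusted R^2.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(8)
n = 36                                   # e.g., 12 points x 3 depths
X = rng.random((n, 4))                   # D, W, T, FM fractions (synthetic)
bmp = 20 + X @ [30, 25, -10, 15] + rng.normal(0, 3, n)

pca = PCA(n_components=4).fit(X)
print("cumulative variance explained:", pca.explained_variance_ratio_.cumsum())

model = LinearRegression().fit(X, bmp)
r2 = model.score(X, bmp)
p = X.shape[1]
adj_r2 = 1 - (1 - r2) * (n - 1) / (n - p - 1)
print(f"R^2 = {r2:.2f}, adjusted R^2 = {adj_r2:.2f}")
```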
Deriving Vegetation Dynamics of Natural Terrestrial Ecosystems from MODIS NDVI/EVI Data over Turkey.
Evrendilek, Fatih; Gulbeyaz, Onder
2008-09-01
The 16-day composite MODIS vegetation indices (VIs) at 500-m resolution for the period from 2000 to 2007 were seasonally averaged on the basis of the estimated distribution of 16 potential natural terrestrial ecosystems (NTEs) across Turkey. Graphical and statistical analyses of the time-series VIs for the NTEs, spatially disaggregated in terms of biogeoclimate zones and land cover types, included descriptive statistics, correlations, discrete Fourier transform (DFT), time-series decomposition, and simple linear regression (SLR) models. Our spatio-temporal analyses revealed that both MODIS VIs, on average, depicted similar seasonal variations for the NTEs, with the NDVI having higher mean and SD values. The seasonal VIs were most correlated, in decreasing order, for: barren/sparsely vegetated land > grassland > shrubland/woodland > forest; (sub)nival > warm temperate > alpine > cool temperate > boreal = Mediterranean; and summer > spring > autumn > winter. The most pronounced differences between the MODIS VI responses over Turkey occurred in the boreal and Mediterranean climate zones and forests, and in winter (the senescence phase of the growing season). Our results showed the potential of the time-series MODIS VI datasets for the estimation and monitoring of seasonal and interannual ecosystem dynamics over Turkey, which needs to be further improved and refined through systematic and extensive field measurements and validations across various biomes.
Generalization of Entropy Based Divergence Measures for Symbolic Sequence Analysis
Ré, Miguel A.; Azad, Rajeev K.
2014-01-01
Entropy based measures have been frequently used in symbolic sequence analysis. A symmetrized and smoothed form of Kullback-Leibler divergence or relative entropy, the Jensen-Shannon divergence (JSD), is of particular interest because of its sharing properties with families of other divergence measures and its interpretability in different domains, including statistical physics, information theory and mathematical statistics. The uniqueness and versatility of this measure arise from a number of attributes, including generalization to any number of probability distributions and association of weights with the distributions. Furthermore, its entropic formulation allows its generalization in different statistical frameworks, such as non-extensive Tsallis statistics and higher order Markovian statistics. We revisit these generalizations and propose a new generalization of JSD in the integrated Tsallis and Markovian statistical framework. We show that this generalization can be interpreted in terms of mutual information. We also investigate the performance of different JSD generalizations in deconstructing chimeric DNA sequences assembled from bacterial genomes including those of E. coli, S. enterica typhi, Y. pestis and H. influenzae. Our results show that the JSD generalizations bring in more pronounced improvements when the sequences being compared are from phylogenetically proximal organisms, which are often difficult to distinguish because of their compositional similarity. While small but noticeable improvements were observed with the Tsallis statistical JSD generalization, relatively large improvements were observed with the Markovian generalization. In contrast, the proposed Tsallis-Markovian generalization yielded more pronounced improvements relative to the Tsallis and Markovian generalizations, specifically when the sequences being compared arose from phylogenetically proximal organisms. PMID:24728338
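For reference, the base (Shannon) form of the weighted JSD for m distributions is JSD(p_1, ..., p_m) = H(sum_i w_i p_i) - sum_i w_i H(p_i), where H is Shannon entropy. The Tsallis and Markovian generalizations discussed above replace H with the Tsallis entropy or condition on k-mer context; only the base form is sketched below.

```python
# Sketch: weighted Jensen-Shannon divergence for any number of
# probability distributions, applied to nucleotide compositions.
import numpy as np

def shannon_entropy(p: np.ndarray) -> float:
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def jsd(dists, weights) -> float:
    dists = np.asarray(dists, dtype=float)
    w = np.asarray(weights, dtype=float)
    mixture = (w[:, None] * dists).sum(axis=0)
    return shannon_entropy(mixture) - float(
        (w * [shannon_entropy(p) for p in dists]).sum())

# Compositions of two sequence segments (A, C, G, T), equally weighted.
p = np.array([0.30, 0.20, 0.20, 0.30])
q = np.array([0.15, 0.35, 0.35, 0.15])
print(jsd([p, q], [0.5, 0.5]))
```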
Application Exercises Improve Transfer of Statistical Knowledge in Real-World Situations
ERIC Educational Resources Information Center
Daniel, Frances; Braasch, Jason L. G.
2013-01-01
The present research investigated whether real-world application exercises promoted students' abilities to spontaneously transfer statistical knowledge and to recognize the use of statistics in real-world contexts. Over the course of a semester of psychological statistics, two classes completed multiple application exercises designed to mimic…
Feinn, Richard; Chui, Kevin; Cheng, M. Samuel
2015-01-01
Purpose To assess the effects of virtual reality using the Nintendo™ Wii Fit on balance, gait, and quality of life in ambulatory individuals with incomplete spinal cord injury (iSCI). Relevance There is a need for continued research to support effective treatment techniques in individuals with iSCI to maximize each individual's potential functional performance. Subjects Five males with a mean age of 58.6 years who had an iSCI and were greater than one-year post injury. Methods An interrupted time series design with three pre-tests over three weeks, a post-test within one week of the intervention, and a four-week follow up. Outcome measures: gait speed, timed up and go (TUG), forward functional reach test (FFRT) and lateral functional reach test (LFRT), RAND SF-36. Intervention consisted of one-hour sessions with varied games using the Nintendo™ Wii Fit twice per week for seven weeks. Survey data were also collected at post-test. Results There were statistically significant changes in gait speed and functional reach. The changes were also maintained at the four-week follow-up post-test. Survey reports suggested improvements in balance, endurance, and mobility with daily tasks at home. Conclusion All subjects who participated in training with the Nintendo™ Wii Fit demonstrated statistically significant improvements in gait speed and functional reach after seven weeks of training. Given the potential positive impact that the Nintendo™ Wii Fit has on functional reach and gait speed in patients with iSCI, physical therapists may want to incorporate these activities as part of a rehabilitation program. PMID:25613853
Predicting radiotherapy outcomes using statistical learning techniques
NASA Astrophysics Data System (ADS)
El Naqa, Issam; Bradley, Jeffrey D.; Lindsay, Patricia E.; Hope, Andrew J.; Deasy, Joseph O.
2009-09-01
Radiotherapy outcomes are determined by complex interactions between treatment, anatomical and patient-related variables. A common obstacle to building maximally predictive outcome models for clinical practice is the failure to capture the potential complexity of heterogeneous variable interactions and applicability beyond institutional data. We describe a statistical learning methodology that can automatically screen for nonlinear relations among prognostic variables and generalize to previously unseen data. In this work, several types of linear and nonlinear kernels for generating interaction terms and approximating the treatment-response function are evaluated. Examples of institutional datasets of esophagitis, pneumonitis and xerostomia endpoints were used. Furthermore, an independent RTOG dataset was used for 'generalizability' validation. We formulated the discrimination between risk groups as a supervised learning problem. The distribution of patient groups was initially analyzed using principal components analysis (PCA) to uncover potential nonlinear behavior. The performance of the different methods was evaluated using bivariate correlations and actuarial analysis. Over-fitting was controlled via cross-validation resampling. Our results suggest that a modified support vector machine (SVM) kernel method provided superior performance on leave-one-out testing compared to logistic regression and neural networks in cases where the data exhibited nonlinear behavior on PCA. For instance, in prediction of esophagitis and pneumonitis endpoints, which exhibited nonlinear behavior on PCA, the method provided 21% and 60% improvements, respectively. Furthermore, evaluation on the independent pneumonitis RTOG dataset demonstrated good generalizability beyond institutional data, in contrast with the other models. This indicates that the prediction of treatment response can be improved by utilizing nonlinear kernel methods to discover important nonlinear interactions among model variables. These models have the capacity to predict on unseen data. Part of this work was first presented at the Seventh International Conference on Machine Learning and Applications, San Diego, CA, USA, 11-13 December 2008.
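The evaluation loop described here, leave-one-out testing of a kernel SVM against a linear baseline, is easy to sketch with generic scikit-learn components. The snippet below uses a stock RBF kernel on synthetic nonlinear data, not the authors' modified kernel or clinical variables.

```python
# Sketch: leave-one-out comparison of an RBF-kernel SVM and logistic
# regression on a synthetic nonlinear outcome dataset.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(9)
n = 120
X = rng.normal(size=(n, 4))       # stand-ins for dose/volume + clinical variables
y = (X[:, 0] * X[:, 1] + 0.5 * X[:, 2] ** 2
     + rng.normal(0, 0.5, n)) > 0.5   # nonlinear "complication" outcome

loo = LeaveOneOut()
models = [
    ("SVM (RBF)", make_pipeline(StandardScaler(), SVC(kernel="rbf"))),
    ("logistic", make_pipeline(StandardScaler(), LogisticRegression())),
]
for name, model in models:
    acc = cross_val_score(model, X, y.astype(int), cv=loo).mean()
    print(f"{name}: LOO accuracy = {acc:.3f}")
```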
Luna-Oliva, Laura; Ortiz-Gutiérrez, Rosa María; Cano-de la Cuerda, Roberto; Piédrola, Rosa Martínez; Alguacil-Diego, Isabel M; Sánchez-Camarero, Carlos; Martínez Culebras, María Del Carmen
2013-01-01
Limited evidence is available about the effectiveness of virtual reality using low-cost commercial consoles for children with developmental delay. The aim of this preliminary study was to evaluate the usefulness of a videogame system based on non-immersive virtual reality technology (Xbox 360 Kinect™) to support the conventional rehabilitation treatment of children with cerebral palsy, and secondarily to objectify changes in the psychomotor status of children with cerebral palsy after receiving rehabilitation treatment supplemented with this last-generation game console. Eleven children with cerebral palsy were included in the study. Baseline, post-treatment and follow-up assessments were performed of motor and process skills, balance, gait speed, running and jumping, and fine and manual finger dexterity. All the participants completed 8 weeks of videogame treatment, added to their conventional physiotherapy treatment, with the Xbox 360 Kinect™ (Microsoft) game console. The Friedman test showed significant differences among the three assessments for each variable: GMFM (p = 0.001), AMPS motor (p = 0.001), AMPS process (p = 0.010), PRT (p = 0.005) and 10 MW (p = 0.029). The Wilcoxon test showed statistically significant differences between pre- and post-treatment in all the values. Similarly, results revealed significant differences between the baseline and follow-up assessments. There were no statistical differences between the post-treatment and follow-up evaluations, indicating long-term maintenance of the improvements achieved after treatment. Low-cost video games based on motion capture are potential tools in the rehabilitation of children with CP. Our Kinect Xbox 360 protocol showed improvements in balance and ADL in CP participants in a school environment, but further studies are needed to validate the potential benefits of these video game systems as a supplement to rehabilitation in children with CP.
A Resilience Intervention in African American Adults with Type 2 Diabetes: A Pilot Study of Efficacy
Steinhardt, Mary A.; Mamerow, Madonna M.; Brown, Sharon A.; Jolly, Christopher A.
2010-01-01
Purpose The purpose of this pilot study was to determine the feasibility of offering our Diabetes Coaching Program (DCP), adapted for African Americans, in a sample of African American adults with type 2 diabetes. Methods The study utilized a one-group, pretest-posttest design to test the acceptance and potential effectiveness of the DCP. Subjects were a convenience sample of 16 African Americans (8 females; 8 males) with type 2 diabetes; twelve subjects (6 females; 6 males) completed the program. The DCP included four weekly class sessions devoted to resilience education and diabetes self-management, followed by eight biweekly support group meetings. Psychosocial process variables (resilience, coping strategies, diabetes empowerment), and proximal (perceived stress, depressive symptoms, diabetes self-management) and distal outcomes (BMI, fasting blood glucose, HbA1c, lipidemia, blood pressure) were assessed at baseline and six months post study entry. Qualitative data were collected at eight months via a focus group conducted to examine the acceptability of the DCP. Results Preliminary paired t-tests indicated statistically significant improvements in diabetes empowerment, diabetes self-management, BMI, HbA1c, total cholesterol, LDL-cholesterol, and systolic and diastolic blood pressure. Medium to large effect sizes were reported. Resilience, perceived stress, fasting blood glucose, and HDL-cholesterol improved, but changes were not statistically significant. Focus group data confirmed that participants held positive opinions regarding the DCP and follow-up support group sessions, although they suggested an increase in program length from 4 to 8 weeks. Conclusions The pilot study documented the feasibility and potential effectiveness of the DCP to enhance diabetes empowerment, diabetes self-management, and reductions in the progression of obesity, type 2 diabetes, and CVD in the African American community. Randomized experimental designs are needed to confirm these findings. PMID:19204102
Atomoxetine and stroop task performance in adult attention-deficit/hyperactivity disorder.
Faraone, Stephen V; Biederman, Joseph; Spencer, Thomas; Michelson, David; Adler, Lenard; Reimherr, Fred; Seidman, Larry
2005-08-01
The aim of this study was to assess the efficacy of atomoxetine, a new and highly selective inhibitor of the norepinephrine transporter, for executive functioning in adults with attention-deficit/hyperactivity disorder (ADHD). Two identical studies using a double-blind, placebo-controlled, parallel design were conducted. Patients were adults (Study 1, n = 280; Study 2, n = 256) with Diagnostic and Statistical Manual of Mental Disorders, 4th edition (DSM-IV)-defined ADHD recruited by referral and advertising. They were randomized to 10 weeks of treatment with atomoxetine or placebo. Executive functions were measured by the Stroop task. There was no evidence of cognitive deterioration associated with atomoxetine treatment. Atomoxetine treatment was associated with an improvement of the Stroop color-word score. Our results provide further support for Spencer et al.'s (1998) report that atomoxetine improves inhibitory capacity, as measured by the Stroop task. The absence of cognitive deterioration from atomoxetine, along with improved performance in a subgroup of patients in this large study, supports the safety of atomoxetine in this regard and its potential for improving a significant source of impairment for adults with ADHD.
Document image improvement for OCR as a classification problem
NASA Astrophysics Data System (ADS)
Summers, Kristen M.
2003-01-01
In support of the goal of automatically selecting methods of enhancing an image to improve the accuracy of OCR on that image, we consider the problem of determining whether to apply each of a set of methods as a supervised classification problem for machine learning. We characterize each image according to a combination of two sets of measures: a set intended to reflect the degree of particular types of noise present in documents in a single font of Roman or similar script, and a more general set based on connected component statistics. We consider several potential methods of image improvement, each of which constitutes its own 2-class classification problem according to whether transforming the image with this method improves the accuracy of OCR. In our experiments, the results varied for the different image transformation methods, but the system made the correct choice in 77% of the cases in which the decision affected the OCR score (in the range [0,1]) by at least 0.01, and it made the correct choice 64% of the time overall.
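The framing above maps directly onto one binary classifier per candidate transform, with features drawn from the image and a label recording whether OCR accuracy improved. The sketch below uses synthetic features, hypothetical transform names, and a random forest as a generic stand-in for whatever learner the paper actually used; real feature extraction from connected components is elided.

```python
# Sketch: one "apply this enhancement?" classifier per transform,
# trained on image features with the label "OCR score improved >= 0.01".
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(10)
n_images = 400
features = rng.normal(size=(n_images, 12))   # noise + CC statistics (synthetic)

transforms = ["despeckle", "deskew", "binarize"]   # hypothetical methods
for name in transforms:
    # Synthetic label: whether this transform raised the OCR score.
    improved = (features @ rng.normal(size=12)
                + rng.normal(0, 1, n_images)) > 0
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    acc = cross_val_score(clf, features, improved.astype(int), cv=5).mean()
    print(f"{name}: 5-fold accuracy = {acc:.2f}")
```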
A retrospective outcomes study examining the effect of interactive metronome on hand function.
Shank, Tracy M; Harron, Wendy
2015-01-01
Interactive Metronome (IM, The Interactive Metronome Company, Sunrise, Florida, USA) is a computer-based modality marketed to rehabilitation professionals who want to improve outcomes in areas of coordination, motor skills, self-regulation behaviors, and cognitive skills. This retrospective study examined the efficacy of IM training in improving timing skills, hand function, and parental report of self-regulatory behaviors. Forty-eight children with mixed motor and cognitive diagnoses completed an average of 14 one-hour training sessions over an average of 8.5 weeks in an outpatient setting. Each child was assessed before and after training with the Interactive Metronome Long Form Assessment, the Jebsen-Taylor Test of Hand Function, and a parent questionnaire. All three measures improved with statistical significance despite participants having received no direct skill training. These results suggest an intimate relationship between cognition and motor skills that has potential therapeutic value. Level 4, Retrospective Case Series. Copyright © 2015 Hanley & Belfus. Published by Elsevier Inc. All rights reserved.
Camargo, Lucila Basto; Raggio, Daniela Prócida; Bonacina, Carlos Felipe; Wen, Chao Lung; Mendes, Fausto Medeiros; Bönecker, Marcelo José Strazzeri; Haddad, Ana Estela
2014-07-17
The aim of this study was to evaluate an e-learning strategy for teaching Atraumatic Restorative Treatment (ART) to undergraduate and graduate students. The sample comprised 76 participants: 38 undergraduate dental students and 38 pediatric dentistry students in a specialization course. To evaluate knowledge improvement, participants were subjected to a test performed before and after the course. A single researcher corrected the tests, and intraexaminer reproducibility was high (ICC = 0.991; 95% CI = 0.975-0.996). All students improved their performance after the e-learning course (paired t-tests, p < 0.001). The mean scores of undergraduate students were 4.7 (initial) and 6.4 (final), and those of graduate students were 6.8 (initial) and 8.2 (final). The comparison of the final evaluation means showed a statistically significant difference (t-test, p < 0.0001). The e-learning strategy has the potential to improve students' knowledge of ART. Mature students perform better in this teaching modality when it is applied exclusively via distance learning.
The role of haemorrhage and exudate detection in automated grading of diabetic retinopathy.
Fleming, Alan D; Goatman, Keith A; Philip, Sam; Williams, Graeme J; Prescott, Gordon J; Scotland, Graham S; McNamee, Paul; Leese, Graham P; Wykes, William N; Sharp, Peter F; Olson, John A
2010-06-01
Automated grading has the potential to improve the efficiency of diabetic retinopathy screening services. While disease/no disease grading can be performed using only microaneurysm detection and image-quality assessment, automated recognition of other types of lesions may be advantageous. This study investigated whether inclusion of automated recognition of exudates and haemorrhages improves the detection of observable/referable diabetic retinopathy. Images from 1253 patients with observable/referable retinopathy and 6333 patients with non-referable retinopathy were obtained from three grading centres. All images were reference-graded, and automated disease/no disease assessments were made based on microaneurysm detection and combined microaneurysm, exudate and haemorrhage detection. Introduction of algorithms for exudates and haemorrhages resulted in a statistically significant increase in the sensitivity for detection of observable/referable retinopathy from 94.9% (95% CI 93.5 to 96.0) to 96.6% (95.4 to 97.4) without affecting manual grading workload. Automated detection of exudates and haemorrhages improved the detection of observable/referable retinopathy.
Front-Line Physicians' Satisfaction with Information Systems in Hospitals.
Peltonen, Laura-Maria; Junttila, Kristiina; Salanterä, Sanna
2018-01-01
Day-to-day operations management in hospital units is difficult due to continuously varying situations, the many actors involved, and the vast number of information systems in use. The aim of this study was to describe front-line physicians' satisfaction with the existing information systems needed to support day-to-day operations management in hospitals. A cross-sectional survey was used, and data were collected in nine hospitals using stratified random sampling. Data were analyzed with descriptive and inferential statistical methods. The response rate was 65% (n = 111). The physicians reported that information systems support their decision making to some extent, but they do not improve access to information, nor are they tailored for physicians. The respondents also reported that they need to use several information systems to support decision making and that they would prefer a single information system for accessing important information. Improved information access would better support physicians' decision making and has the potential to improve the quality of decisions and speed up the decision-making process.
A systematic review of health literacy interventions for people living with HIV
Perazzo, Joseph; Reyes, Darcel; Webel, Allison
2017-01-01
Health literacy significantly impacts health-related outcomes among people living with HIV. Our aim was to systematically review current literature on health literacy interventions for people living with HIV. The authors conducted a thorough literature search following the PRISMA statement and the AMSTAR checklist as a guide, and found six studies that met inclusion/exclusion criteria. The majority of these interventions were designed to improve HIV treatment adherence as well as HIV knowledge and treatment-related skills, with one study focusing on e-Health literacy. Several of the studies demonstrated trends toward improvement in medication adherence, but most did not achieve statistical significance primarily due to methodological limitations. Significant improvements in knowledge, behavioral skills, and e-Health literacy were found following interventions (p = 0.001-0.05). Health literacy interventions have the potential to promote HIV-related knowledge, behavioral skills, and self-management practices. More research is needed to assess the efficacy of interventions to promote a variety of self-management practices. PMID:26864691
An assessment of the skid resistance effect on traffic safety under wet-pavement conditions.
Pardillo Mayora, José M; Jurado Piña, Rafael
2009-07-01
Pavement-tire friction provides the grip that is required for maintaining vehicle control and for stopping in emergency situations. Statistically significant negative correlations of skid resistance values and wet-pavement accident rates have been found in previous research. Skid resistance measured with SCRIM and crash data from over 1750 km of two-lane rural roads in the Spanish National Road System were analyzed to determine the influence of pavement conditions on safety and to assess the effects of improving pavement friction on safety. Both wet- and dry-pavement crash rates presented a decreasing trend as skid resistance values increased. Thresholds in SCRIM coefficient values associated with significant decreases in wet-pavement crash rates were determined. Pavement friction improvement schemes were found to yield significant reductions in wet-pavement crash rates averaging 68%. The results confirm the importance of maintaining adequate levels of pavement friction to safeguard traffic safety as well as the potential of pavement friction improvement schemes to achieve significant crash reductions.
Artificial Intelligence in Precision Cardiovascular Medicine.
Krittanawong, Chayakrit; Zhang, HongJu; Wang, Zhen; Aydar, Mehmet; Kitai, Takeshi
2017-05-30
Artificial intelligence (AI) is a field of computer science that aims to mimic human thought processes, learning capacity, and knowledge storage. AI techniques have been applied in cardiovascular medicine to explore novel genotypes and phenotypes in existing diseases, improve the quality of patient care, enable cost-effectiveness, and reduce readmission and mortality rates. Over the past decade, several machine-learning techniques have been used for cardiovascular disease diagnosis and prediction. Each problem requires some understanding of both cardiovascular medicine and statistics to select and apply the optimal machine-learning algorithm. In the near future, AI will result in a paradigm shift toward precision cardiovascular medicine. The potential of AI in cardiovascular medicine is tremendous; however, ignorance of the challenges may overshadow its potential clinical impact. This paper gives a glimpse of AI's application in cardiovascular clinical care and discusses its potential role in facilitating precision cardiovascular medicine. Copyright © 2017 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.
Contextualization of drug-mediator relations using evidence networks.
Tran, Hai Joey; Speyer, Gil; Kiefer, Jeff; Kim, Seungchan
2017-05-31
Genomic analysis of drug response can provide unique insights into therapies that can be used to match the "right drug to the right patient." However, the process of discovering such therapeutic insights using genomic data is not straightforward and represents an area of active investigation. EDDY (Evaluation of Differential DependencY) is a statistical test that leverages genomic data to detect differential statistical dependencies. EDDY has been used in conjunction with the Cancer Therapeutics Response Portal (CTRP), a dataset with drug-response measurements for more than 400 small molecules, and RNAseq data of cell lines in the Cancer Cell Line Encyclopedia (CCLE) to find potential drug-mediator pairs. Mediators were identified as genes that showed significant change in genetic statistical dependencies within annotated pathways between drug-sensitive and drug-non-sensitive cell lines, and the results are presented as a public web-portal (EDDY-CTRP). However, the interpretability of drug-mediator pairs currently hinders further exploration of these potentially valuable results. In this study, we address this challenge by constructing evidence networks built with protein and drug interactions from the STITCH and STRING interaction databases. STITCH and STRING are sister databases that catalog known and predicted drug-protein interactions and protein-protein interactions, respectively. Using these two databases, we have developed a method to construct evidence networks to "explain" the relation between a drug and a mediator. We applied this approach to drug-mediator relations discovered in EDDY-CTRP analysis and identified evidence networks for ~70% of drug-mediator pairs, where most mediators were not known direct targets for the drug. Constructed evidence networks enable researchers to contextualize the drug-mediator pair with current research and knowledge. Using evidence networks, we were able to improve the interpretability of the EDDY-CTRP results by linking the drugs and mediators with genes associated with both the drug and the mediator. We anticipate that these evidence networks will help inform EDDY-CTRP results and enhance the generation of important insights into drug sensitivity that will lead to improved precision medicine applications.
ERIC Educational Resources Information Center
Dunn, Karee
2014-01-01
Online graduate education programs are expanding rapidly. Many of these programs require a statistics course, resulting in an increasing need for online statistics courses. The study reported here grew from experiences teaching online, graduate statistics courses. In seeking answers on how to improve this class, I discovered that research has yet…
Pearson, Glen J; Olson, Kari L; Panich, Nicole E; Majumdar, Sumit R; Tsuyuki, Ross T; Gilchrist, Dawna M; Damani, Ali; Francis, Gordon A
2008-01-01
Background: Specialty cardiovascular risk reduction clinics (CRRC) increase the proportion of patients attaining recommended lipid targets; however, it is not known if the benefits are sustained after discharge. We evaluated the impact of a CRRC on lipid levels and assessed the long-term effect of a CRRC in maintaining improved lipid levels following discharge. Methods: The medical records of consecutive dyslipidemic patients discharged at least 6 months previously from a tertiary hospital CRRC, between January 1991 and January 2001, were retrospectively reviewed. The primary outcome was the change in patients' lipid levels between the final CRRC visit and the most recent primary care follow-up. A worst-case analysis was conducted to evaluate the potential impact of the patients in whom follow-up lipid profiles post-discharge from the CRRC were not obtained. Results: Within the CRRC (median follow-up = 1.28 years in 1064 patients), we observed statistically significant improvements in all lipid parameters. In the 411 patients for whom post-discharge lipid profiles were available (median follow-up = 2.41 years), there were no significant differences observed in low-density lipoprotein-cholesterol, total cholesterol (TC), or triglycerides since CRRC discharge; however, there were small improvements in high-density lipoprotein-cholesterol (HDL-C) and the TC:HDL-C ratio (p < 0.05 for both). The unadjusted worst-case analysis (653 patients with no follow-up lipid profiles) demonstrated statistically significant worsening of all lipid parameters between CRRC discharge and the most recent follow-up. However, when the change in lipid parameters between baseline and the most recent follow-up was assessed in this analysis, all lipid parameters were significantly improved (p < 0.05). Conclusions: This study demonstrates that a CRRC can improve lipid levels and suggests that these benefits are sustained once patients are returned to the care of their primary physician. PMID:19183763
Clancy, J P; Rowe, Steven M; Accurso, Frank J; Aitken, Moira L; Amin, Raouf S; Ashlock, Melissa A; Ballmann, Manfred; Boyle, Michael P; Bronsveld, Inez; Campbell, Preston W; De Boeck, Kris; Donaldson, Scott H; Dorkin, Henry L; Dunitz, Jordan M; Durie, Peter R; Jain, Manu; Leonard, Anissa; McCoy, Karen S; Moss, Richard B; Pilewski, Joseph M; Rosenbluth, Daniel B; Rubenstein, Ronald C; Schechter, Michael S; Botfield, Martyn; Ordoñez, Claudia L; Spencer-Green, George T; Vernillet, Laurent; Wisseh, Steve; Yen, Karl; Konstan, Michael W
2012-01-01
VX-809, a cystic fibrosis transmembrane conductance regulator (CFTR) modulator, has been shown to increase the cell surface density of functional F508del-CFTR in vitro. A randomised, double-blind, placebo-controlled study evaluated the safety, tolerability and pharmacodynamics of VX-809 in adult patients with cystic fibrosis (n=89) who were homozygous for the F508del-CFTR mutation. Subjects were randomised to one of four 28-day VX-809 dose groups (25, 50, 100 and 200 mg) or matching placebo. The type and incidence of adverse events were similar among VX-809- and placebo-treated subjects. Respiratory events were the most commonly reported and led to discontinuation by one subject in each active treatment arm. Pharmacokinetic data supported a once-daily oral dosing regimen. Pharmacodynamic data suggested that VX-809 improved CFTR function in at least one organ (sweat gland). VX-809 reduced elevated sweat chloride values in a dose-dependent manner (p=0.0013) that was statistically significant in the 100 and 200 mg dose groups. There was no statistically significant improvement in CFTR function in the nasal epithelium as measured by nasal potential difference, nor were there statistically significant changes in lung function or patient-reported outcomes. No maturation of immature F508del-CFTR was detected in the subgroup that provided rectal biopsy specimens. In this study, VX-809 had a similar adverse event profile to placebo for 28 days in F508del-CFTR homozygous patients, and demonstrated biological activity with positive impact on CFTR function in the sweat gland. Additional data are needed to determine how improvements detected in CFTR function secondary to VX-809 in the sweat gland relate to those measurable in the respiratory tract and to long-term measures of clinical benefit. NCT00865904.
Short-term Outcomes After Open and Laparoscopic Colostomy Creation.
Ivatury, Srinivas Joga; Bostock Rosenzweig, Ian C; Holubar, Stefan D
2016-06-01
Colostomy creation is a common procedure performed in colon and rectal surgery. Outcomes by technique have not been well studied. This study evaluated outcomes related to open versus laparoscopic colostomy creation. This was a retrospective review of patients undergoing colostomy creation using univariate and multivariate propensity score analyses. Hospitals participating in the American College of Surgeons National Surgical Quality Improvement Program database were included. Data on patients were obtained from the American College of Surgeons National Surgical Quality Improvement Program 2005-2011 Participant Use Data Files. We measured 30-day mortality, 30-day complications, and predictors of 30-day mortality. A total of 2179 subjects were in the open group and 1132 in the laparoscopic group. The open group had increased age (open, 64 years vs laparoscopic, 60 years), more admissions from a facility (17.0% vs 14.9%), and more disseminated cancer (26.1% vs 21.4%); all differences were statistically significant. The open group had a significantly higher percentage of emergency operations (24.9% vs 7.9%). Operative time also differed statistically (81 vs 86 minutes). Thirty-day mortality was significantly higher in the open group (8.7% vs 3.5%), as was any 30-day complication (25.4% vs 17.0%). Propensity-matching analysis on elective patients only revealed that postoperative length of stay and the rate of any wound complication were statistically higher in the open group. Multivariate analysis for mortality was performed on the full, elective, and propensity-matched cohorts; age >65 years and dependent functional status were associated with an increased risk of mortality in all of the models. This study has the potential for selection bias and limited generalizability. Colostomy creation at American College of Surgeons National Surgical Quality Improvement Program hospitals is more commonly performed open rather than laparoscopically. Patient age >65 years and dependent functional status are associated with an increased risk of 30-day mortality.
Survival Regression Modeling Strategies in CVD Prediction.
Barkhordari, Mahnaz; Padyab, Mojgan; Sardarinia, Mahsa; Hadaegh, Farzad; Azizi, Fereidoun; Bozorgmanesh, Mohammadreza
2016-04-01
A fundamental part of prevention is prediction. Potential predictors are the sine qua non of prediction models. However, whether incorporating novel predictors into prediction models can be directly translated into added predictive value remains an area of dispute. The difference between the predictive power of a predictive model with (enhanced model) and without (baseline model) a certain predictor is generally regarded as an indicator of the predictive value added by that predictor. Indices such as discrimination and calibration have long been used in this regard. Recently, the use of added predictive value has been suggested when comparing the predictive performances of models with and without novel biomarkers. User-friendly statistical software capable of implementing novel statistical procedures is conspicuously lacking. This shortcoming has restricted implementation of such novel model assessment methods. We aimed to construct Stata commands to help researchers obtain the aforementioned statistical indices. We have written Stata commands intended to help researchers obtain the following: (1) the Nam-D'Agostino χ² goodness-of-fit test; and (2) cut point-free and cut point-based net reclassification improvement index (NRI), relative and absolute integrated discrimination improvement index (IDI), and survival-based regression analyses. We applied the commands to real data on women participating in the Tehran lipid and glucose study (TLGS) to examine whether information relating to a family history of premature cardiovascular disease (CVD), waist circumference, and fasting plasma glucose can improve the predictive performance of Framingham's general CVD risk algorithm. The command for survival models is adpredsurv. Herein we have described the Stata package "adpredsurv" for calculation of the Nam-D'Agostino χ² goodness-of-fit test as well as cut point-free and cut point-based NRI, relative and absolute IDI, and survival-based regression analyses. We hope this work encourages the use of novel methods in examining the predictive capacity of the emerging plethora of novel biomarkers.
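To illustrate one of the indices involved, here is a minimal Python sketch of a cut point-based NRI for a simple binary outcome; it is not the adpredsurv implementation, and unlike the survival-based version it ignores censoring entirely:

```python
import numpy as np

def categorical_nri(p_base, p_enh, event, cuts=(0.1, 0.2)):
    """Cut point-based NRI for a binary outcome (illustrative; the
    survival version must additionally account for censoring)."""
    cat_base, cat_enh = np.digitize(p_base, cuts), np.digitize(p_enh, cuts)
    up, down = cat_enh > cat_base, cat_enh < cat_base
    e, ne = event == 1, event == 0
    # Events should move up in risk category; non-events should move down.
    return (up[e].mean() - down[e].mean()) + (down[ne].mean() - up[ne].mean())

rng = np.random.default_rng(1)
event = rng.integers(0, 2, 500)
p_base = np.clip(0.15 + 0.1 * event + 0.3 * rng.random(500), 0, 1)
p_enh = np.clip(p_base + 0.04 * (2 * event - 1), 0, 1)  # slightly better model
print(categorical_nri(p_base, p_enh, event))            # > 0: net improvement
```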
Cronin, Katherine A; Jacobson, Sarah L; Bonnie, Kristin E; Hopper, Lydia M
2017-01-01
Studying animal cognition in a social setting is associated with practical and statistical challenges. However, conducting cognitive research without disturbing species-typical social groups can increase ecological validity, minimize distress, and improve animal welfare. Here, we review the existing literature on cognitive research run with primates in a social setting in order to determine how widespread such testing is and highlight approaches that may guide future research planning. Using Google Scholar to search the terms "primate" "cognition" "experiment" and "social group," we conducted a systematic literature search covering 16 years (2000-2015 inclusive). We then conducted two supplemental searches within each journal that contained a publication meeting our criteria in the original search, using the terms "primate" and "playback" in one search and the terms "primate" "cognition" and "social group" in the second. The results were used to assess how frequently nonhuman primate cognition has been studied in a social setting (>3 individuals), to gain perspective on the species and topics that have been studied, and to extract successful approaches for social testing. Our search revealed 248 unique publications in 43 journals encompassing 71 species. The absolute number of publications has increased over years, suggesting viable strategies for studying cognition in social settings. While a wide range of species were studied they were not equally represented, with 19% of the publications reporting data for chimpanzees. Field sites were the most common environment for experiments run in social groups of primates, accounting for more than half of the results. Approaches to mitigating the practical and statistical challenges were identified. This analysis has revealed that the study of primate cognition in a social setting is increasing and taking place across a range of environments. This literature review calls attention to examples that may provide valuable models for researchers wishing to overcome potential practical and statistical challenges to studying cognition in a social setting, ultimately increasing validity and improving the welfare of the primates we study.
Saliba, Georges; Saleh, Rawad; Zhao, Yunliang; Presto, Albert A; Lambe, Andrew T; Frodin, Bruce; Sardar, Satya; Maldonado, Hector; Maddox, Christine; May, Andrew A; Drozd, Greg T; Goldstein, Allen H; Russell, Lynn M; Hagen, Fabian; Robinson, Allen L
2017-06-06
Recent increases in the Corporate Average Fuel Economy standards have led to widespread adoption of vehicles equipped with gasoline direct-injection (GDI) engines. Changes in engine technologies can alter emissions. To quantify these effects, we measured gas- and particle-phase emissions from 82 light-duty gasoline vehicles recruited from the California in-use fleet tested on a chassis dynamometer using the cold-start unified cycle. The fleet included 15 GDI vehicles, including 8 GDIs certified to the most-stringent emissions standard, super-ultra-low-emission vehicles (SULEV). We quantified the effects of engine technology, emission certification standards, and cold-start on emissions. For vehicles certified to the same emissions standard, there is no statistical difference in regulated gas-phase pollutant emissions between port fuel injection (PFI) vehicles and GDIs. However, GDIs had, on average, a factor of 2 higher particulate matter (PM) mass emissions than PFIs due to higher elemental carbon (EC) emissions. SULEV-certified GDIs have a factor of 2 lower PM mass emissions than GDIs certified as ultralow-emission vehicles (3.0 ± 1.1 versus 6.3 ± 1.1 mg/mi), suggesting improvements in engine design and calibration. Comprehensive organic speciation revealed no statistically significant differences in the composition of the volatile organic compound emissions between PFIs and GDIs, including benzene, toluene, ethylbenzene, and xylenes (BTEX). Therefore, the secondary organic aerosol and ozone formation potential of the exhaust does not depend on engine technology. Cold-start contributes a larger fraction of the total unified cycle emissions for vehicles meeting more-stringent emission standards. Organic gas emissions were the most sensitive to cold-start compared to the other pollutants tested here. There were no statistically significant differences in the effects of cold-start on GDIs and PFIs. For our test fleet, the measured 14.5% decrease in CO2 emissions from GDIs was much greater than the potential climate forcing associated with higher black carbon emissions. Thus, switching from PFI to GDI vehicles will likely lead to a reduction in net global warming.
Tonkin, Matthew J.; Tiedeman, Claire; Ely, D. Matthew; Hill, Mary C.
2007-01-01
The OPR-PPR program calculates the Observation-Prediction (OPR) and Parameter-Prediction (PPR) statistics that can be used to evaluate the relative importance of various kinds of data to simulated predictions. The data considered fall into three categories: (1) existing observations, (2) potential observations, and (3) potential information about parameters. The first two are addressed by the OPR statistic; the third is addressed by the PPR statistic. The statistics are based on linear theory and measure the leverage of the data, which depends on the location, the type, and possibly the time of the data being considered. For example, in a ground-water system the type of data might be a head measurement at a particular location and time. As a measure of leverage, the statistics do not take into account the value of the measurement. As linear measures, the OPR and PPR statistics require minimal computational effort once sensitivities have been calculated. Sensitivities need to be calculated for only one set of parameter values; commonly these are the values estimated through model calibration. OPR-PPR can calculate the OPR and PPR statistics for any mathematical model that produces the necessary OPR-PPR input files. In this report, OPR-PPR capabilities are presented in the context of using the ground-water model MODFLOW-2000 and the universal inverse program UCODE_2005. The method used to calculate the OPR and PPR statistics is based on the linear equation for prediction standard deviation. Using sensitivities and other information, OPR-PPR calculates (a) the percent increase in the prediction standard deviation that results when one or more existing observations are omitted from the calibration data set; (b) the percent decrease in the prediction standard deviation that results when one or more potential observations are added to the calibration data set; or (c) the percent decrease in the prediction standard deviation that results when potential information on one or more parameters is added.
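The flavor of the OPR calculation can be conveyed with a small linear-theory sketch (synthetic sensitivity matrix; illustrative only, not the OPR-PPR program itself): the prediction standard deviation, proportional to the square root of z'(X'WX)^(-1)z, is recomputed with each observation left out, and the percent increase measures that observation's leverage on the prediction.

```python
import numpy as np

def pred_std(X, w, z):
    # Linear-theory prediction standard deviation, up to the common
    # error-variance factor: sqrt(z' (X' W X)^(-1) z).
    XtWX = X.T @ (w[:, None] * X)
    return float(np.sqrt(z @ np.linalg.solve(XtWX, z)))

rng = np.random.default_rng(2)
X = rng.random((30, 3))   # sensitivities of 30 observations to 3 parameters
w = np.ones(30)           # observation weights
z = rng.random(3)         # sensitivity of the prediction to the parameters

base = pred_std(X, w, z)
# OPR for omitting observation i: percent increase in prediction std.
opr = [100 * (pred_std(np.delete(X, i, 0), np.delete(w, i), z) / base - 1)
       for i in range(len(X))]
print(f"most influential observation: {int(np.argmax(opr))}")
```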
Magill, Stephen T; Wang, Doris D; Rutledge, W Caleb; Lau, Darryl; Berger, Mitchel S; Sankaran, Sujatha; Lau, Catherine Y; Imershein, Sarah G
2017-11-01
Patient safety is foundational to neurosurgical care. Postprocedural "debrief" checklists have been proposed to improve patient safety, but data about their use in neurosurgery are limited. Here, we implemented an initiative to routinely perform postoperative debriefs and evaluated the impact of debriefing on operating room (OR) safety culture. A 10-question safety attitude questionnaire (SAQ) was sent to neurosurgical OR staff at a major academic medical center before and 18 months after the implementation of a postoperative debriefing initiative. Rates of debrief compliance and changes in attitudes between the two surveys were evaluated. The survey used a Likert scale and was analyzed with standard statistical methods. After the debrief initiative, the rate of debriefing increased from 51% to 86% of cases for the neurosurgery service. Baseline SAQ responses found that neurosurgeons had a more favorable perception of OR safety than did anesthesiologists and nurses. After implementation of the postoperative debriefing process, perceptions of OR safety significantly improved for neurosurgeons, anesthesiologists, and nurses. Furthermore, the disparity between nurses and surgeons was no longer significant. After debrief implementation, neurosurgical OR staff had improved perceptions of patient safety compared with surgical services that did not commonly perform debriefing. Debriefing identified OR efficiency concerns in 26.9% of cases, and prevention of potential adverse events/near misses was reported in 8% of cases. Postoperative debriefing can be effectively introduced into the OR and improves the safety culture after implementation. Debriefing is an effective tool to identify OR inefficiencies and potential adverse events. Copyright © 2017 Elsevier Inc. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bai, T; UT Southwestern Medical Center, Dallas, TX; Yan, H
2014-06-15
Purpose: To develop a 3D dictionary learning based statistical reconstruction algorithm on graphics processing units (GPU) to improve the quality of low-dose cone beam CT (CBCT) imaging with high efficiency. Methods: A 3D dictionary containing 256 small volumes (atoms) of 3x3x3 voxels was trained from a high quality volume image. During reconstruction, we utilized a Cholesky decomposition based orthogonal matching pursuit algorithm to find a sparse representation of each patch in the reconstructed image on this dictionary basis, in order to regularize the image quality. To accelerate the time-consuming sparse coding in the 3D case, we implemented our algorithm in a parallel fashion by taking advantage of the tremendous computational power of the GPU. Evaluations were performed on a head-neck patient case. FDK reconstruction with the full dataset of 364 projections was used as the reference. We compared the proposed 3D dictionary learning based method with a tight frame (TF) based one using a subset of 121 projections. Image quality under different resolutions in the z-direction, with or without statistical weighting, was also studied. Results: Compared to TF-based CBCT reconstruction, our experiments indicated that 3D dictionary learning based CBCT reconstruction is able to recover finer structures and remove more streaking artifacts, and is less susceptible to blocky artifacts. It was also observed that the statistical reconstruction approach is sensitive to inconsistency between the forward and backward projection operations in parallel computing. Using a high spatial resolution along the z direction helps improve the algorithm's robustness. Conclusion: The 3D dictionary learning based CBCT reconstruction algorithm is able to sense structural information while suppressing noise, and hence achieves high quality reconstruction. The GPU realization of the whole algorithm offers a significant efficiency enhancement, making this algorithm more feasible for potential clinical application. A high z-resolution is preferred to stabilize statistical iterative reconstruction. This work was supported in part by NIH (1R01CA154747-01), NSFC (No. 61172163), the Research Fund for the Doctoral Program of Higher Education of China (No. 20110201110011), and the China Scholarship Council.
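As a minimal, CPU-only illustration of the sparse-coding step (using scikit-learn's generic orthogonal matching pursuit solver rather than the authors' Cholesky-decomposition-based GPU implementation, and a random dictionary in place of a trained one):

```python
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

rng = np.random.default_rng(3)
n_atoms, patch_len = 256, 27            # 256 atoms of 3x3x3 voxels, flattened
D = rng.standard_normal((patch_len, n_atoms))
D /= np.linalg.norm(D, axis=0)          # unit-norm dictionary atoms

patch = rng.standard_normal(patch_len)  # one noisy image patch, flattened
omp = OrthogonalMatchingPursuit(n_nonzero_coefs=5, fit_intercept=False)
omp.fit(D, patch)
regularized = D @ omp.coef_             # sparse approximation of the patch
```

In the full algorithm this step runs for every patch of the reconstructed volume at each iteration, which is what motivates the GPU parallelization.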
Statistical Primer on Biosimilar Clinical Development.
Isakov, Leah; Jin, Bo; Jacobs, Ira Allen
A biosimilar is highly similar to a licensed biological (reference, or originator) product, with no clinically meaningful differences in safety, purity, or potency, and is approved under specific regulatory approval processes. Because both the originator and the potential biosimilar are large and structurally complex proteins, biosimilars are not generic equivalents of the originator. Thus, the regulatory approach for a small-molecule generic is not appropriate for a potential biosimilar. As a result, different study designs and statistical approaches are used in the assessment of a potential biosimilar. This review covers concepts and terminology used in statistical analyses in the clinical development of biosimilars so that clinicians can understand how similarity is evaluated. This should allow the clinician to understand the statistical considerations in biosimilar clinical trials and make informed prescribing decisions when an approved biosimilar is available.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sevik, James; Wallner, Thomas; Pamminger, Michael
The efficiency improvement and emissions reduction potential of lean and exhaust gas recirculation (EGR)-dilute operation of spark-ignition gasoline engines is well understood and documented. However, dilute operation is generally limited by deteriorating combustion stability with increasing inert gas levels. The combustion stability decreases due to reduced mixture flame speeds, resulting in significantly increased combustion initiation periods and burn durations. A study was designed and executed to evaluate the potential to extend lean and EGR-dilute limits using a low-energy transient plasma ignition system. The low-energy transient plasma was generated by nanosecond pulses and its performance compared to a conventional transistorized coil ignition (TCI) system operated on an automotive, gasoline direct-injection (GDI) single-cylinder research engine. The experimental assessment was focused on steady-state experiments at the part load condition of 1500 rpm, 5.6 bar indicated mean effective pressure (IMEP), where dilution tolerance is particularly critical to improving efficiency and emission performance. Experimental results suggest that the energy delivery process of the low-energy transient plasma ignition system significantly improves part load dilution tolerance by reducing the early flame development period. Statistical analysis of relevant combustion metrics was performed in order to further investigate the effects of the advanced ignition system on combustion stability. Results confirm that at select operating conditions EGR tolerance and the lean limit could be improved by as much as 20% (from 22.7 to 27.1% EGR) and nearly 10% (from λ = 1.55 to 1.7) with the low-energy transient plasma ignition system.
Improving measurement of injection drug risk behavior using item response theory.
Janulis, Patrick
2014-03-01
Recent research highlights the multiple steps involved in preparing and injecting drugs and the resultant viral threats faced by drug users. This research suggests that more sensitive measurement of injection drug HIV risk behavior is required. In addition, growing evidence suggests there are gender differences in injection risk behavior. However, the potential for differential item functioning between genders has not been explored. The aims were to explore item response theory as an improved measurement modeling technique that provides empirically justified scaling of injection risk behavior and to test for potential gender-based differential item functioning. Data are drawn from three studies in the National Institute on Drug Abuse's Criminal Justice Drug Abuse Treatment Studies. A two-parameter item response theory model was used to scale injection risk behavior, and logistic regression was used to test for differential item functioning. Item fit statistics suggest that item response theory can be used to scale injection risk behavior and that these models can provide more sensitive estimates of risk behavior. Additionally, gender-based differential item functioning is present in the current data. Improved measurement of injection risk behavior using item response theory should be encouraged, as these models provide increased congruence between construct measurement and the complexity of injection-related HIV risk. Suggestions are made to further improve injection risk behavior measurement. Furthermore, results suggest direct comparisons of composite scores between males and females may be misleading, and future work should account for differential item functioning before comparing levels of injection risk behavior.
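A compact sketch of the two ingredients on synthetic data (a two-parameter logistic item model and a logistic-regression test for uniform DIF; illustrative only, not the CJ-DATS instruments or the authors' exact models):

```python
import numpy as np
import statsmodels.api as sm

def p_2pl(theta, a, b):
    # Two-parameter logistic IRT model: P(endorse item | trait level theta),
    # with discrimination a and difficulty b.
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

rng = np.random.default_rng(4)
n = 1000
theta = rng.standard_normal(n)          # latent injection-risk propensity
gender = rng.integers(0, 2, n)
# Build in uniform DIF: at equal theta, one gender endorses the item more.
item = (rng.random(n) < p_2pl(theta + 0.4 * gender, a=1.5, b=0.5)).astype(float)

# DIF test: does gender predict the item response after conditioning on
# the trait (here theta itself; in practice, an estimated score)?
X = sm.add_constant(np.column_stack([theta, gender]))
fit = sm.Logit(item, X).fit(disp=0)
print(fit.pvalues[2])                   # small p-value flags DIF on this item
```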
Sanchez-Lite, Alberto; Garcia, Manuel; Domingo, Rosario; Angel Sebastian, Miguel
2013-01-01
Musculoskeletal disorders (MSDs) that result from poor ergonomic design are one of the occupational disorders of greatest concern in the industrial sector. A key advantage in the primary design phase is to focus on a method of assessment that detects and evaluates the potential risks to the operative from these types of physical injury. The method of assessment improves the process design by identifying potential ergonomic improvements among various design alternatives or activities undertaken as part of the cycle of continuous improvement throughout the phases of the product life cycle. This paper presents a novel postural assessment method (NERPA) fit for product-process design, which was developed with the help of a digital human model together with a 3D CAD tool that is widely used in the aeronautic and automotive industries. The power of 3D visualization and the possibility of studying the actual assembly sequence in a virtual environment can allow the functional performance of the parts to be addressed. Such tools can also provide an ergonomic workstation design, together with a competitive advantage in the assembly process. The method developed was used in the design of six production lines, studying 240 manual assembly operations and improving 21 of them. This study demonstrated the proposed method's usefulness and found statistically significant differences between the evaluations of the proposed method and those of the widely used Rapid Upper Limb Assessment (RULA) method.
Local multiplicity adjustment for the spatial scan statistic using the Gumbel distribution.
Gangnon, Ronald E
2012-03-01
The spatial scan statistic is an important and widely used tool for cluster detection. It is based on the simultaneous evaluation of the statistical significance of the maximum likelihood ratio test statistic over a large collection of potential clusters. In most cluster detection problems, there is variation in the extent of local multiplicity across the study region. For example, using a fixed maximum geographic radius for clusters, urban areas typically have many overlapping potential clusters, whereas rural areas have relatively few. The spatial scan statistic does not account for local multiplicity variation. We describe a previously proposed local multiplicity adjustment based on a nested Bonferroni correction and propose a novel adjustment based on a Gumbel distribution approximation to the distribution of a local scan statistic. We compare the performance of all three statistics in terms of power and a novel unbiased cluster detection criterion. These methods are then applied to the well-known New York leukemia dataset and a Wisconsin breast cancer incidence dataset. © 2011, The International Biometric Society.
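The Gumbel-based adjustment can be sketched as follows (synthetic null replicates stand in for the Monte Carlo replicates that would be computed from the data under the null): fit a Gumbel distribution to replicates of a region's local scan statistic and read the locally adjusted p-value from its upper tail.

```python
import numpy as np
from scipy.stats import gumbel_r

rng = np.random.default_rng(5)

# Null replicates of the LOCAL scan statistic for one region: in practice,
# the max log-likelihood ratio over potential clusters containing the region,
# recomputed on each Monte Carlo randomization of the data.
null_local_max = rng.gumbel(loc=2.0, scale=0.8, size=999)

loc, scale = gumbel_r.fit(null_local_max)    # Gumbel approximation
observed = 5.1                               # observed local scan statistic
p_local = gumbel_r.sf(observed, loc, scale)  # locally adjusted p-value
print(p_local)
```

Because urban regions are contained in many more potential clusters than rural ones, their null Gumbel distributions sit further to the right, which is exactly the local multiplicity variation the adjustment absorbs.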
Efforts to improve international migration statistics: a historical perspective.
Kraly, E P; Gnanasekaran, K S
1987-01-01
During the past decade, the international statistical community has made several efforts to develop standards for the definition, collection and publication of statistics on international migration. This article surveys the history of official initiatives to standardize international migration statistics by reviewing the recommendations of the International Statistical Institute, International Labor Organization, and the UN, and reports a recently proposed agenda for moving toward comparability among national statistical systems. Heightening awareness of the benefits of exchange and creating motivation to implement international standards requires a three-pronged effort from the international statistical community. First, it is essential to continue discussion about the significance of improvement, specifically standardization, of international migration statistics. The move from theory to practice in this area requires ongoing focus by migration statisticians so that conformity to international standards itself becomes a criterion by which national statistical practices are examined and assessed. Second, countries should be provided with technical documentation to support and facilitate the implementation of the recommended statistical systems. Documentation should be developed with an understanding that conformity to international standards for migration and travel statistics must be achieved within existing national statistical programs. Third, the call for statistical research in this area requires more effort by the community of migration statisticians, beginning with the mobilization of bilateral and multilateral resources to undertake the preceding list of activities.
Shams, Ahmed; El-Sayed, Mohamed; Gamal, Osama; Ewes, Waled
2016-12-01
Rotator cuff tears are one of the most common causes of chronic shoulder pain and disability. They significantly affect the quality of life. Reduced pain and improved function are the goals of conventional therapy, which includes relative rest, pain therapy, physical therapy, corticosteroid injections and surgical intervention. Tendons have a relatively avascular nature; hence, their regenerative potential is limited. There is some clinical evidence that the application of autologous platelets may help to revascularize the area of injury in rotator cuff pathologies. This prospective randomized controlled study was done to evaluate the results of subacromial injection of platelet-rich plasma (PRP) versus corticosteroid injection therapy in 40 patients with symptomatic partial rotator cuff tears. All patients were assessed before injection and 6 weeks, 3 months and 6 months after injection, using the American Shoulder and Elbow Surgeons Standardized Shoulder Assessment Form (ASES), the Constant-Murley Score (CMS), the Simple Shoulder Test (SST) and a Visual Analog Scale (VAS) for pain. An MRI was performed before and 6 months after the injection for all the included patients and was graded on a 0-5 scale. Both injection groups showed statistically significantly better clinical outcomes over time compared with those before injection. There was a statistically significant difference between the PRP group and the corticosteroid group 12 weeks after injection regarding VAS, ASES, CMS and SST, in favor of the PRP group. MRI showed an overall slight nonsignificant improvement in grades of tendinopathy/tear in both groups, without statistically significant differences between the two groups. PRP injections showed better results earlier than corticosteroid injections, although statistically significantly better results after 6 months could not be found. Therefore, subacromial PRP injection could be considered a good alternative to corticosteroid injection, especially in patients with a contraindication to corticosteroid administration. Level of evidence: II.
Crown, William; Chang, Jessica; Olson, Melvin; Kahler, Kristijan; Swindle, Jason; Buzinec, Paul; Shah, Nilay; Borah, Bijan
2015-09-01
Missing data, particularly missing variables, can create serious analytic challenges in observational comparative effectiveness research studies. Statistical linkage of datasets is a potential method for incorporating missing variables. Prior studies have focused upon the bias introduced by imperfect linkage. This analysis uses a case study of hepatitis C patients to estimate the net effect of statistical linkage on bias, also accounting for the potential reduction in missing variable bias. The results show that statistical linkage can reduce bias while also enabling parameter estimates to be obtained for the formerly missing variables. The usefulness of statistical linkage will vary depending upon the strength of the correlations of the missing variables with the treatment variable, as well as the outcome variable of interest.
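A toy version of the idea (nearest-neighbor donation on shared covariates, with synthetic data; real applications use far more careful probabilistic linkage) shows how a formerly missing variable becomes available for analysis:

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(6)
# Dataset A lacks variable z; dataset B has it. Both share covariates x.
xA = rng.standard_normal((300, 3))
xB = rng.standard_normal((500, 3))
zB = xB @ np.array([1.0, -0.5, 0.2]) + 0.3 * rng.standard_normal(500)

# Statistical (non-exact) linkage: donate z from the closest B record.
nn = NearestNeighbors(n_neighbors=1).fit(xB)
idx = nn.kneighbors(xA, return_distance=False)[:, 0]
zA_imputed = zB[idx]  # formerly missing variable, now usable in A's model
```

Imperfect matches introduce error in the donated variable, which is the bias the study weighs against the missing-variable bias that linkage removes.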
NASA Astrophysics Data System (ADS)
Zink, Frank Edward
The detection and classification of pulmonary nodules is of great interest in chest radiography. Nodules are often indicative of primary cancer, and their detection is particularly important in asymptomatic patients. The ability to classify nodules as calcified or non-calcified is important because calcification is a positive indicator that the nodule is benign. Dual-energy methods offer the potential to improve both the detection and classification of nodules by allowing the formation of material-selective images. Tissue-selective images can improve detection by virtue of the elimination of obscuring rib structure. Bone-selective images are essentially calcium images, allowing classification of the nodule. A dual-energy technique is introduced which uses a computed radiography system to acquire dual-energy chest radiographs in a single exposure. All aspects of the dual-energy technique are described, with particular emphasis on scatter-correction, beam-hardening correction, and noise-reduction algorithms. The adaptive noise-reduction algorithm employed improves material-selective signal-to-noise ratio by up to a factor of seven with minimal sacrifice in selectivity. A clinical comparison study is described, undertaken to compare the dual-energy technique to conventional chest radiography for the tasks of nodule detection and classification. Observer performance data were collected using the free-response receiver operating characteristic (FROC) method and the bi-normal Alternative FROC (AFROC) performance model. Results of the comparison study, analyzed using two common multiple observer statistical models, showed that the dual-energy technique was superior to conventional chest radiography for detection of nodules at a statistically significant level (p < 0.05). Discussion of the comparison study emphasizes the unique combination of data collection and analysis techniques employed, as well as the limitations of comparison techniques in the larger context of technology assessment.
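The material-selective step rests on weighted subtraction of log signals; a schematic sketch (hypothetical calibration weights, ignoring the scatter and beam-hardening corrections described above) looks like this:

```python
import numpy as np

rng = np.random.default_rng(7)
low, high = rng.uniform(0.2, 1.0, (2, 64, 64))  # low-/high-energy transmission

# Log signals are approximately linear in tissue and bone thickness, so a
# weighted subtraction cancels one material; the weights come from
# calibration (the values below are hypothetical).
log_low, log_high = np.log(low), np.log(high)
bone_selective = log_low - 0.5 * log_high    # cancels soft tissue
tissue_selective = log_high - 0.4 * log_low  # cancels bone
```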
Bansal, Himanshu; Bansal, Anupama; Agrawal, Diwaker; Singh, Dhananjay; Deb, Kaushik
2014-01-01
To evaluate the therapeutic efficacy and safety of a naturally occurring mineral supplement in the treatment of symptomatic knee osteoarthritis (OA). A prospective, single-centre study of 50 patients aged 50 years and above with painful, radiologically evident osteoarthritis of the knees was carried out over one year. Patients received 40 drops of a commercially available, naturally occurring mineral supplement concentrate purportedly derived from the Great Salt Lake in Utah. Efficacy was confirmed objectively by evaluating changes in the thickness of articular cartilage, joint space width, and synovial fluid analysis, and subjectively by changes in WOMAC scores and the 6-minute pain-free walking distance. The composite WOMAC score improved significantly, by 17.2 points from a baseline mean of 52, by year end. Eighteen (41%) patients showed improvement of more than 100 feet in the pain-free distance covered during a 6-minute walk at one-year follow-up. Ultrasonographically, cartilage thickness at one year had improved by at least 0.01 mm in 9 (21%) patients. Although radiologically no patient showed an increase in joint space, only 2 (4.6%) patients had a decline in joint space width of more than 0.5 mm. The average synovial fluid cell count fell to 205/microlitre from 520/microlitre at the start of the study, suggesting that the mineral supplement used had structural efficacy. Clinically relevant, statistically significant symptomatic improvement and statistically insignificant structural improvement occurred over the 1-year period in patients receiving the naturally occurring mineral supplement. The protection of the joint cartilage from progressive degeneration during osteoarthritis by these supplements points to a chondrocyte-regenerative potential of this supplement. Such regeneration may occur through activation of tissue-specific adult chondrocyte precursors or stem cells.
Dunea, Daniel; Pohoata, Alin; Iordache, Stefania
2015-07-01
The paper presents the screening of various feedforward neural networks (FANN) and wavelet-feedforward neural networks (WFANN) applied to time series of ground-level ozone (O3), nitrogen dioxide (NO2), and particulate matter (PM10 and PM2.5 fractions) recorded at four monitoring stations located in various urban areas of Romania, to identify common configurations with optimal generalization performance. Two distinct model runs were performed: one processing hourly-recorded time series of airborne pollutants during cold months (O3, NO2, and PM10), when residential heating increases local emissions, and one processing 24-h daily averaged concentrations (PM2.5) recorded between 2009 and 2012. Dataset variability was assessed using statistical analysis. Time series were passed through various FANNs. Each time series was also decomposed into four time-scale components using three-level wavelets; the components were passed through FANNs and recomposed into a single output time series. The agreement between observed and modelled output was evaluated based on statistical significance (the r coefficient and the correlation between errors and data). A Daubechies db3 wavelet with an Rprop-trained FANN (6-4-1) gave positive results for the O3 time series, improving on the FANN used alone for hourly-recorded series. NO2 was difficult to model due to the specificity of its time series, but wavelet integration improved FANN performance. The Daubechies db3 wavelet did not improve the FANN outputs for the PM10 time series. Both models (FANN/WFANN) overestimated PM2.5 forecasted values in the last quarter of the time series. A potential improvement of the forecasted values could be the integration of a smoothing algorithm to adjust the PM2.5 model outputs.
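A minimal sketch of the WFANN pipeline (PyWavelets plus a small scikit-learn network, synthetic data, and a simplified one-step recomposition; not the authors' exact configuration):

```python
import numpy as np
import pywt
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(8)
series = np.sin(np.linspace(0, 60, 512)) + 0.2 * rng.standard_normal(512)

# Three-level wavelet decomposition -> four components (cA3, cD3, cD2, cD1).
coeffs = pywt.wavedec(series, "db3", level=3)

def forecast_next(c, lags=6):
    # Fit a small FANN on lagged values of one component (one step ahead).
    X = np.column_stack([c[i:len(c) - lags + i] for i in range(lags)])
    y = c[lags:]
    net = MLPRegressor(hidden_layer_sizes=(4,), max_iter=3000,
                       random_state=0).fit(X, y)
    return net.predict(X[-1:])[0]

# Forecast each component separately, then recompose a single forecast.
next_coeffs = [np.append(c[1:], forecast_next(c)) for c in coeffs]
forecast = pywt.waverec(next_coeffs, "db3")[-1]
print(forecast)
```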
Fichera, Eleonora; Gray, Ewan; Sutton, Matt
2016-06-01
The efficacy of the management of long-term conditions depends in part on whether healthcare and health behaviours are complements or substitutes in the health production function. On the one hand, individuals might believe that improved health care can raise the marginal productivity of their own health behaviour and decide to complement health care with additional effort in healthier behaviours. On the other hand, health care can lower the cost of unhealthy behaviours by compensating for their negative effects. Individuals may therefore reduce their effort in healthier lifestyles. Identifying which of these effects prevails is complicated by the endogenous nature of treatment decisions and individuals' behavioural responses. We explore whether the introduction in 2004 of the Quality and Outcomes Framework (QOF), a financial incentive for family doctors to improve the quality of healthcare, affected the population's weight, smoking and drinking behaviours by applying a sharp regression discontinuity design to a sample of 32,102 individuals in the Health Survey for England (1997-2009). We find that individuals with the targeted health conditions improved their lifestyle behaviours. This complementarity was only statistically significant for smoking, which reduced by 0.7 cigarettes per person per day, equal to 18% of the mean. We investigate whether this change was attributable to the QOF by testing for other discontinuity points, including the introduction of a smoking ban in 2007 and changes to the QOF in 2006. We also examine whether medication and smoking cessation advice are potential mechanisms and find no statistically significant discontinuities for these aspects of health care supply. Our results suggest that a general improvement in healthcare generated by provider incentives can have positive unplanned effects on patients' behaviours. Copyright © 2016 The Authors. Published by Elsevier Ltd.. All rights reserved.
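A sharp regression discontinuity estimate of this kind can be sketched on synthetic data as a local linear fit with separate slopes on each side of the cutoff (illustrative variables and bandwidth, not the Health Survey for England data):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(9)
n = 4000
running = rng.uniform(-1, 1, n)     # e.g. interview date relative to QOF start
exposed = (running >= 0).astype(float)
# True effect: a -0.7 jump in the outcome (say, cigarettes/day) at the cutoff.
y = 10 + 1.0 * running - 0.7 * exposed + rng.standard_normal(n)

# Sharp RD: jump at the cutoff with separate linear trends on each side.
X = sm.add_constant(np.column_stack([exposed, running, exposed * running]))
h = 0.5                             # bandwidth around the cutoff
mask = np.abs(running) <= h
fit = sm.OLS(y[mask], X[mask]).fit()
print(fit.params[1])                # estimated discontinuity, ~ -0.7
```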
Schroeder, Krista; Jia, Haomiao; Smaldone, Arlene
Propensity score (PS) methods are increasingly being employed by researchers to reduce bias arising from confounder imbalance when using observational data to examine intervention effects. The purpose of this study was to examine PS theory and methodology and compare application of three PS methods (matching, stratification, weighting) to determine which best improves confounder balance. Baseline characteristics of a sample of 20,518 school-aged children with severe obesity (of whom 1,054 received an obesity intervention) were assessed prior to PS application. Three PS methods were then applied to the data to determine which showed the greatest improvement in confounder balance between the intervention and control group. The effect of each PS method on the outcome variable (body mass index percentile change at one year) was also examined. SAS 9.4 and Comprehensive Meta-analysis statistical software were used for analyses. Prior to PS adjustment, the intervention and control groups differed significantly on seven of 11 potential confounders. PS matching removed all differences. PS stratification and weighting both removed one difference but created two new differences. Sensitivity analyses did not change these results. Body mass index percentile at 1 year decreased in both groups. The size of the decrease was smaller in the intervention group, and the estimate of the decrease varied by PS method. Selection of a PS method should be guided by insight from statistical theory and simulation experiments, in addition to observed improvement in confounder balance. For this data set, PS matching worked best to correct confounder imbalance. Because each method varied in correcting confounder imbalance, we recommend that multiple PS methods be compared for ability to improve confounder balance before implementation in evaluating treatment effects in observational data.
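For the weighting method, for instance, a minimal sketch (synthetic data; the study used SAS, but the logic is the same) computes inverse-probability-of-treatment weights from a logistic propensity model and checks balance with weighted standardized mean differences:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(10)
n = 5000
X = rng.standard_normal((n, 4))                      # potential confounders
treat = rng.random(n) < 1 / (1 + np.exp(-X[:, 0]))   # confounded assignment

ps = LogisticRegression().fit(X, treat).predict_proba(X)[:, 1]
w = np.where(treat, 1 / ps, 1 / (1 - ps))            # IPTW weights

def smd(x, t, w):
    # Weighted standardized mean difference for one confounder.
    m1 = np.average(x[t], weights=w[t])
    m0 = np.average(x[~t], weights=w[~t])
    s = np.sqrt((x[t].var() + x[~t].var()) / 2)
    return (m1 - m0) / s

print([round(smd(X[:, j], treat, w), 3) for j in range(4)])  # ~0 => balanced
```

A conventional rule of thumb treats absolute standardized differences below about 0.1 as acceptable balance; the same diagnostic can be computed after matching or within strata to compare the three methods.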
Falagas, M E; Zarkadoulia, E; Rafailidis, P I
2009-07-01
Systematic review. There is widespread popular belief that balneotherapy is effective in the treatment of various ailments. We searched PubMed (1950-2006), Scopus and the Cochrane library for randomised controlled trials (RCTs) examining the clinical effect of balneotherapy (both as a solitary approach and in the context of spa) on various diseases. A total of 203 potentially relevant articles were identified. In all, 29 RCTs were further evaluated; 22 of them (75.8%) investigated the use of balneotherapy in rheumatological diseases (eight in osteoarthritis, six in fibromyalgia, four in ankylosing spondylitis, and four in rheumatoid arthritis) and three RCTs (10.3%) in other musculoskeletal diseases (chronic low back pain). In addition, three relevant studies focused on psoriasis and one on Parkinson's disease. A total of 1720 patients with rheumatological and other musculoskeletal diseases were evaluated in these studies. Balneotherapy resulted in statistically greater pain improvement in patients with rheumatological diseases and chronic low back pain, in comparison to the control group, in 17 (68%) of the 25 RCTs examined. In the remaining eight studies, pain improved in the balneotherapy treatment arm, but this improvement was not statistically different from that of the comparator treatment arm(s). This beneficial effect lasted for different periods of time: 10 days in one study, 2 weeks in one study, 3 weeks in one study, 12 weeks in two studies, 3 months in 11 studies, 16-20 weeks in one study, 24 weeks in three studies, 6 months in three studies, 40 weeks in one study, and 1 year in one study. The available data suggest that balneotherapy may be truly associated with improvement in several rheumatological diseases. However, existing research is not sufficiently strong to draw firm conclusions.
Schwitzer, Jonathan A.; Albino, Frank P.; Mathis, Ryan K.; Scott, Amie M.; Gamble, Laurie; Baker, Stephen B.
2015-01-01
Background: As rhinoplasty patient demographics evolve, surgeons must consider the impact of demographics on patient satisfaction. Objectives: The objective of this study was to identify independent demographic predictors of differences in satisfaction with appearance and quality of life following rhinoplasty, utilizing the FACE-Q patient-reported outcome instrument. Methods: Patients presenting for rhinoplasty completed the following FACE-Q scales: Satisfaction with Facial Appearance, Satisfaction with Nose, Social Function, and Psychological Well-being. Higher FACE-Q scores indicate greater satisfaction with appearance or superior quality of life. Pre- and post-treatment scores were compared in the context of patient demographics. Results: The scales were completed by 59 patients. Women demonstrated statistically significant improvements in both Satisfaction with Facial Appearance and quality of life, while men experienced significant improvement only in Satisfaction with Facial Appearance. Caucasians demonstrated statistically significant improvement in Satisfaction with Facial Appearance and quality of life, while non-Caucasians did not. Patients younger than 35 years old were more likely to experience enhanced Satisfaction with Facial Appearance and quality of life than patients older than 35 years old. Patients with income ≥$100,000 were more likely to experience significant increases in Satisfaction with Facial Appearance and quality of life than patients with incomes <$100,000. Conclusions: In an objective study using a validated patient-reported outcome instrument, the authors were able to quantify differences in the clinically meaningful change in perception of appearance and quality of life that rhinoplasty patients gain, based on demographic variables. The authors also demonstrated that these variables are potential predictors of differences in satisfaction. Level of Evidence: 3 (Therapeutic). PMID:26063837
Schwitzer, Jonathan A; Albino, Frank P; Mathis, Ryan K; Scott, Amie M; Gamble, Laurie; Baker, Stephen B
2015-09-01
As rhinoplasty patient demographics evolve, surgeons must consider the impact of demographics on patient satisfaction. The objective of this study was to identify independent demographic predictors of differences in satisfaction with appearance and quality of life following rhinoplasty, utilizing the FACE-Q patient-reported outcome instrument. Patients presenting for rhinoplasty completed the following FACE-Q scales: Satisfaction with Facial Appearance, Satisfaction with Nose, Social Function, and Psychological Well-being. Higher FACE-Q scores indicate greater satisfaction with appearance or superior quality of life. Pre- and post-treatment scores were compared in the context of patient demographics. The scales were completed by 59 patients. Women demonstrated statistically significant improvements in both Satisfaction with Facial Appearance and quality of life, while men experienced significant improvement only in Satisfaction with Facial Appearance. Caucasians demonstrated statistically significant improvement in Satisfaction with Facial Appearance and quality of life, while non-Caucasians did not. Patients younger than 35 years old were more likely to experience enhanced Satisfaction with Facial Appearance and quality of life than patients older than 35 years old. Patients with income ≥$100,000 were more likely to experience significant increases in Satisfaction with Facial Appearance and quality of life than patients with incomes <$100,000. In an objective study using a validated patient-reported outcome instrument, the authors were able to quantify differences in the clinically meaningful change in perception of appearance and quality of life that rhinoplasty patients gain, based on demographic variables. The authors also demonstrated that these variables are potential predictors of differences in satisfaction. © 2015 The American Society for Aesthetic Plastic Surgery, Inc. Reprints and permission: journals.permissions@oup.com.
McLean, Andrew; Lawlor, Jenine; Mitchell, Rob; Kault, David; O'Kane, Carl; Lees, Michelle
2015-02-01
To evaluate the impact of More Learning for Interns in Emergency (MoLIE) on clinical documentation in the ED of a large regional hospital. MoLIE was implemented at The Townsville Hospital (TTH) in 2010, and has since provided ED interns with structured off-floor teaching and a dedicated clinical supervisor. A pre- and post-intervention study was conducted using retrospective medical record review methodology. Charts were selected by identifying all TTH ED patients seen by interns in the period 2008-2011. Two hundred pre-intervention records (2008-2009) and 200 post-intervention records (2010-2011) were reviewed. These were randomly selected following an initial screen by an ED staff specialist. The quality of clinical documentation for five common ED presentations (asthma, chest pain, lacerations, abdominal pain and upper limb fractures) was assessed. For each presentation, documentation quality was scored out of 10 using predefined criteria. An improvement of two or more was thought to be clinically significant. Mean scores for each group were compared using a Student's t-test for independent samples. Mean documentation scores (and 95% confidence intervals) were 5.55 (5.17-5.93) in 2008, 5.42 (4.98-5.86) in 2009, 6.37 (5.99-6.75) in 2010 and 6.08 (5.71-6.45) in 2011. There was a statistically but not clinically significant improvement in scores pre- and post-intervention (P ≤ 0.001). The introduction of MoLIE was associated with a small but statistically significant improvement in documentation, despite an 80% increase in intern placements. These results suggest that structured training programmes have potential to improve intern performance while simultaneously enhancing training capacity. The impact on quality of care requires further evaluation. © 2015 Australasian College for Emergency Medicine and Australasian Society for Emergency Medicine.
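A minimal sketch of the reported comparison, assuming Python with SciPy; the score arrays below are simulated stand-ins for the chart-review scores, chosen only to echo the reported group means.

```python
# Student's t-test for independent samples on pre- vs post-intervention
# documentation scores; data are illustrative, not the study's records.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
pre = rng.normal(5.5, 1.9, 200)   # 2008-2009 charts
post = rng.normal(6.2, 1.9, 200)  # 2010-2011 charts

t, p = stats.ttest_ind(post, pre, equal_var=True)
print(f"mean difference={post.mean() - pre.mean():.2f}, t={t:.2f}, P={p:.4f}")
# A difference can be statistically significant (small P) yet fall short of
# the pre-specified clinical threshold (here, 2 points out of 10).
```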
Peer-to-Peer Mentoring for African American Women With Lupus: A Feasibility Pilot.
Williams, Edith M; Hyer, J Madison; Viswanathan, Ramakrishnan; Faith, Trevor D; Voronca, Delia; Gebregzaibher, Mulugeta; Oates, Jim C; Egede, Leonard
2018-06-01
To examine the feasibility and potential benefits of peer mentoring to improve the disease self-management and quality of life of individuals with systemic lupus erythematosus (SLE). Peer mentors were trained and paired with up to 3 mentees to receive self-management education and support by telephone over 12 weeks. This study took place at an academic teaching hospital in Charleston, South Carolina. Seven quads consisting of 1 peer mentor and 3 mentees were matched, based on factors such as age, area of residence, and marital and work status. Mentee outcomes of self-management, health-related quality of life, and disease activity were measured using validated tools at baseline, mid-intervention, and post-intervention. Descriptive statistics and effect sizes were calculated to determine clinically important (>0.3) changes from baseline. Mentees showed trends toward lower disease activity (P = 0.004) and improved health-related quality of life, in the form of decreased anxiety (P = 0.018) and decreased depression (P = 0.057). Other improvements in health-related quality of life were observed with effect sizes >0.3, but did not reach statistical significance. In addition, both mentees and mentors gave very high scores for perceived treatment credibility and service delivery. The intervention was well received. Training, the peer-mentoring program, and outcome measures were demonstrated to be feasible with modifications. This result provides preliminary support for the efficacy, acceptability, and perceived credibility of a peer-mentoring approach to improve disease self-management and health-related quality of life in African American women with SLE. Peer mentoring may augment current rheumatologic care. © 2017, American College of Rheumatology.
Promoting dietary diversity to improve child growth in less-resourced rural settings in Uganda.
Kabahenda, M K; Andress, E L; Nickols, S Y; Kabonesa, C; Mullis, R M
2014-04-01
Analyses of global trends indicate that childhood undernutrition is more prevalent in rural areas, and also that maternal education and decision-making power are among the key factors significantly associated with child growth. The present study comprised a controlled longitudinal study aiming to assess the effectiveness of nutrition education with respect to improving growth patterns of young children of less-literate, low-income caregivers in a rural subsistence farming community. Caregivers in the intervention group (n = 52) attended a structured nutrition education programme, whereas the control group (n = 45) participated in sewing classes. Weights and lengths/heights were measured for children in the intervention and control groups every month for 1 year to assess changes in growth patterns. Repeated measures analysis of covariance was used to assess differences between the two groups over time and across age groups. Variability in growth patterns of individual children and clustering of caregiver effects were controlled for during the statistical analysis. After 12 months, children in the intervention group had significant improvements in weight-for-age compared to the controls [mean (SD): 0.61 (0.15) versus -0.99 (0.16), P = 0.038]. Changes in height-for-age, weight-for-height and mid-upper arm circumference-for-age showed a positive trend for children in the intervention group. Changes in weight-for-height were statistically significant across age groups and negatively related to caregiver's age. Educating caregivers has the potential to improve young children's nutritional status and growth, especially among less literate populations where households subsist on what they produce. © 2013 The Authors Journal of Human Nutrition and Dietetics © 2013 The British Dietetic Association Ltd.
Fortuna, Karen L; DiMilia, Peter R; Lohman, Matthew C; Bruce, Martha L; Zubritsky, Cynthia D; Halaby, Mitch R; Walker, Robert M; Brooks, Jessica M; Bartels, Stephen J
2018-06-01
To assess the feasibility, acceptability, and preliminary effectiveness of a peer-delivered and technology-supported integrated medical and psychiatric self-management intervention for older adults with serious mental illness. Ten older adults with serious mental illness (i.e., schizophrenia, schizoaffective disorder, bipolar disorder, or major depressive disorder) and medical comorbidity (i.e., cardiovascular disease, obesity, diabetes, chronic obstructive pulmonary disease, hypertension, and/or high cholesterol) aged 60 years and older received the PeerTECH intervention in their homes. Three certified peer specialists were trained to deliver PeerTECH. Data were collected at baseline, one month, and three months. The pilot study demonstrated that a three-month, peer-delivered and technology-supported integrated medical and psychiatric self-management intervention ("PeerTECH") was experienced by peer specialists and participants as feasible and acceptable. PeerTECH was associated with statistically significant improvements in psychiatric self-management. In addition, pre/post, non-statistically significant improvements were observed in self-efficacy for managing chronic health conditions, hope, quality of life, medical self-management skills, and empowerment. This pre/post pilot study demonstrated that it is possible to train peers to use technology to deliver an integrated psychiatric and medical self-management intervention in a home-based setting to older adults with serious mental illness with fidelity. These findings provide preliminary evidence that a peer-delivered and technology-supported intervention designed to improve medical and psychiatric self-management is feasible, acceptable, and potentially associated with improvements in psychiatric self-management, self-efficacy for managing chronic health conditions, hope, quality of life, medical self-management skills, and empowerment among older adults with serious mental illness and chronic health conditions.
The impact of obesity surgery on musculoskeletal disease.
El-Khani, Ussamah; Ahmed, Ahmed; Hakky, Sherif; Nehme, Jean; Cousins, Jonathan; Chahal, Harvinder; Purkayastha, Sanjay
2014-12-01
Obesity is an important modifiable risk factor for musculoskeletal disease. A Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA)-compliant systematic review of the effect of bariatric surgery on musculoskeletal disease symptoms was performed. One thousand nineteen papers were identified, of which 43 were eligible for data synthesis. There were 79 results across 24 studies pertaining to physical capacity, of which 53 (67%) demonstrated statistically significant post-operative improvement. There were 75 results across 33 studies pertaining to musculoskeletal pain, of which 42 (56%) demonstrated a statistically significant post-operative improvement. There were 13 results across 6 studies pertaining to arthritis, of which 5 (38%) demonstrated a statistically significant post-operative improvement. Bariatric surgery significantly improved musculoskeletal disease symptoms in 39 of the 43 studies. These changes were evident at follow-up periods ranging from 1 month to 10 years.
Using High Resolution Model Data to Improve Lightning Forecasts across Southern California
NASA Astrophysics Data System (ADS)
Capps, S. B.; Rolinski, T.
2014-12-01
Dry lightning often results in a significant number of fire starts in areas where the vegetation is dry and continuous. Meteorologists from the USDA Forest Service Predictive Services program in Riverside, California are tasked with providing southern and central California's fire agencies with fire potential outlooks. Logistic regression equations were developed by these meteorologists several years ago, which forecast probabilities of lightning, as well as lightning amounts, out to seven days across southern California. These regression equations were developed using ten years of historical gridded data from the Global Forecast System (GFS) model on a coarse scale (0.5 degree resolution), correlated with historical lightning strike data. These equations do a reasonably good job of capturing a lightning episode (3-5 consecutive days or more of lightning), but perform poorly on finer details such as exact location and amounts. It is postulated that the inadequacies in resolving the finer details of episodic lightning events are due to the coarse resolution of the GFS data, along with the limited set of predictors. Stability parameters, such as the Lifted Index (LI), the Total Totals index (TT) and Convective Available Potential Energy (CAPE), along with Precipitable Water (PW), are the only parameters currently considered as predictors. It is hypothesized that the statistical forecasts will benefit from higher resolution data, both in training and in implementing the statistical model. We have dynamically downscaled NCEP FNL (Final) reanalysis data using the Weather Research and Forecasting (WRF) model to 3 km spatial and hourly temporal resolution across a decade. This dataset will be used to evaluate the contribution of additional predictors at higher vertical, spatial and temporal resolution to the success of the statistical model. If successful, we will implement an operational dynamically downscaled GFS forecast product to generate predictors for the resulting statistical lightning model. These data will help fire agencies be better prepared to pre-deploy resources in advance of these events. Specific information regarding duration, amount, and location will be especially valuable.
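The following is a hedged sketch of the kind of logistic regression model the abstract describes, assuming Python with scikit-learn; the predictor and label arrays are random placeholders for the historical gridded predictors and lightning observations.

```python
# Illustrative logistic regression for lightning probability from the four
# predictors named in the abstract (LI, TT, CAPE, PW); data are simulated.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
X = np.column_stack([
    rng.normal(0, 3, 5000),    # Lifted Index (LI)
    rng.normal(45, 5, 5000),   # Total Totals (TT)
    rng.gamma(2, 300, 5000),   # CAPE (J/kg)
    rng.gamma(3, 8, 5000),     # Precipitable Water (mm)
])
y = rng.integers(0, 2, 5000)   # 1 = lightning observed for the grid cell/day

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
p_lightning = model.predict_proba(X_te)[:, 1]  # forecast probabilities
print(p_lightning[:5].round(3))
```

Replacing the coarse GFS predictors with the 3 km WRF-derived fields would, on the abstract's hypothesis, sharpen the location and amount information these probabilities carry.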
Statistics in the Speed Cameras Debate
ERIC Educational Resources Information Center
Girard, Jean Claude
2013-01-01
This article illustrates how statistical arguments can be used to influence public policy... for better or for worse. Road safety has improved in a significant way since the 1970s in developed countries. If road casualties and the number of people killed have decreased, there are many reasons for this, including improvements in roads, the building of motorways,…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fisk, William J.
The evidence of health benefits of particle filtration in homes and commercial buildings is reviewed. Prior reviews of papers published before 2000 are summarized. The results of 16 more recent intervention studies are compiled and analyzed. Also reviewed are four studies that modeled health benefits of using filtration to reduce indoor exposures to particles from outdoors. Prior reviews generally concluded that particle filtration is, at best, a source of small improvements in allergy and asthma health effects; however, many early studies had weak designs. A majority of recent intervention studies employed strong designs, and more of these studies report statistically significant improvements in health symptoms or objective health outcomes, particularly for subjects with allergies or asthma. The percentage improvement in health outcomes is typically modest, for example, 7% to 25%. Delivery of filtered air to the breathing zone of sleeping allergic or asthmatic persons may be more consistently effective in improving health than room air filtration. Notable are two studies that report statistically significant improvements, with filtration, in markers that predict future adverse coronary events. From modeling, the largest potential benefits of indoor particle filtration may be reductions in morbidity and mortality from reducing indoor exposures to particles from outdoor air.
Effect of video self-observations vs. observations of others on insight in psychotic disorders.
David, Anthony S; Chis Ster, Irina; Zavarei, Hooman
2012-04-01
Improving insight in patients with schizophrenia and related disorders is a worthwhile goal. Previous work has suggested that patients' insight may improve if they see videos of themselves taken when ill. Our aim was to test the hypothesis that schizophrenia patients improve their insight after viewing videos of themselves when unwell more so than after viewing an actor. Forty patients admitted with an acute psychotic disorder underwent a videotaped recording of a clinical interview. The patients were then randomized to viewing this or a "control" video of a same-sex actor displaying psychotic symptoms approximately 3 weeks later. Insight, psychopathology, and mood were assessed before and 24 to 48 hours after viewing the videos. All participants showed general improvement across all measures. There was a trend for scores on the Schedule for the Assessment of Insight to improve more in those who viewed themselves when ill, but there were no clear statistically significant differences between the "self" and "other" video groups. In conclusion, video self-confrontation seems to be a safe and potentially effective means of enhancing insight, but evidence for a specific effect is lacking.
Brody, Abraham A; Guan, Carrie; Cortes, Tara; Galvin, James E
2016-01-01
Home health care agencies are increasingly taking care of sicker, older patients with greater comorbidities. However, they are unequipped to appropriately manage these older adults, particularly persons living with dementia (PLWD). We therefore developed the Dementia Symptom Management at Home (DSM-H) Program, a bundled interprofessional intervention, to improve the care confidence of providers and the quality of care delivered to PLWD and their caregivers. We implemented the DSM-H with 83 registered nurses, physical therapists, and occupational therapists. Overall, there was significant improvement in pain knowledge (5.9%) and confidence (26.5%), depression knowledge (14.8%) and confidence (36.1%), and neuropsychiatric symptom general knowledge (16.8%), intervention knowledge (20.9%), attitudes (3.4%) and confidence (27.1%), all at a statistical significance of P < .0001. We also found significant differences between disciplines. Overall, this disseminable program proved to be implementable and improved clinicians' knowledge and confidence in caring for PLWD, with the potential to improve quality of care and quality of life, and decrease costs. Published by Elsevier Inc.
Raymond, Mark R; Clauser, Brian E; Furman, Gail E
2010-10-01
The use of standardized patients to assess communication skills is now an essential part of assessing a physician's readiness for practice. To improve the reliability of communication scores, it has become increasingly common in recent years to use statistical models to adjust ratings provided by standardized patients. This study employed ordinary least squares regression to adjust ratings, and then used generalizability theory to evaluate the impact of these adjustments on score reliability and the overall standard error of measurement. In addition, conditional standard errors of measurement were computed for both observed and adjusted scores to determine whether the improvements in measurement precision were uniform across the score distribution. Results indicated that measurement was generally less precise for communication ratings toward the lower end of the score distribution; and the improvement in measurement precision afforded by statistical modeling varied slightly across the score distribution such that the most improvement occurred in the upper-middle range of the score scale. Possible reasons for these patterns in measurement precision are discussed, as are the limitations of the statistical models used for adjusting performance ratings.
Statistical process control methods allow the analysis and improvement of anesthesia care.
Fasting, Sigurd; Gisvold, Sven E
2003-10-01
Quality aspects of the anesthetic process are reflected in the rate of intraoperative adverse events. The purpose of this report is to illustrate how the quality of the anesthesia process can be analyzed using statistical process control methods, and exemplify how this analysis can be used for quality improvement. We prospectively recorded anesthesia-related data from all anesthetics for five years. The data included intraoperative adverse events, which were graded into four levels, according to severity. We selected four adverse events, representing important quality and safety aspects, for statistical process control analysis. These were: inadequate regional anesthesia, difficult emergence from general anesthesia, intubation difficulties and drug errors. We analyzed the underlying process using 'p-charts' for statistical process control. In 65,170 anesthetics we recorded adverse events in 18.3%; mostly of lesser severity. Control charts were used to define statistically the predictable normal variation in problem rate, and then used as a basis for analysis of the selected problems with the following results: Inadequate plexus anesthesia: stable process, but unacceptably high failure rate; Difficult emergence: unstable process, because of quality improvement efforts; Intubation difficulties: stable process, rate acceptable; Medication errors: methodology not suited because of low rate of errors. By applying statistical process control methods to the analysis of adverse events, we have exemplified how this allows us to determine if a process is stable, whether an intervention is required, and if quality improvement efforts have the desired effect.
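A minimal p-chart sketch in Python, assuming NumPy; the monthly counts are invented for illustration, but the centerline and three-sigma control limits follow the standard p-chart construction the authors describe.

```python
# p-chart for a monthly adverse-event rate: the centerline is the overall
# rate and the limits are +/- 3 sigma for each month's sample size.
import numpy as np

events = np.array([95, 102, 88, 120, 99, 140, 97, 91])      # adverse events/month
cases = np.array([540, 560, 520, 555, 530, 545, 525, 535])  # anesthetics/month

p_bar = events.sum() / cases.sum()           # centerline: overall rate
p = events / cases                           # monthly proportions
sigma = np.sqrt(p_bar * (1 - p_bar) / cases)
ucl = p_bar + 3 * sigma
lcl = np.clip(p_bar - 3 * sigma, 0, None)

for i, (pi, lo, hi) in enumerate(zip(p, lcl, ucl)):
    flag = "out of control" if (pi > hi or pi < lo) else "in control"
    print(f"month {i+1}: p={pi:.3f} (limits {lo:.3f}-{hi:.3f}) -> {flag}")
```

Points inside the limits reflect the predictable normal variation the authors mention; points outside them signal a process change, whether deterioration or the effect of a quality improvement effort.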
Skinner, Carl G; Patel, Manish M; Thomas, Jerry D; Miller, Michael A
2011-01-01
Statistical methods are pervasive in medical research and general medical literature. Understanding general statistical concepts will enhance our ability to critically appraise the current literature and ultimately improve the delivery of patient care. This article intends to provide an overview of the common statistical methods relevant to medicine.
Code of Federal Regulations, 2011 CFR
2011-07-01
..., and Statistics Procedures Relating to the Implementation of the National Environmental Policy Act D... Assistance, Research, and Statistics Procedures Relating to the Implementation of the National Environmental... Statistics (OJARS) assists State and local units of government in strengthening and improving law enforcement...
Code of Federal Regulations, 2010 CFR
2010-07-01
..., and Statistics Procedures Relating to the Implementation of the National Environmental Policy Act D... Assistance, Research, and Statistics Procedures Relating to the Implementation of the National Environmental... Statistics (OJARS) assists State and local units of government in strengthening and improving law enforcement...
Tissue Chips to aid drug development and modeling for rare diseases
Low, Lucie A.; Tagle, Danilo A.
2016-01-01
Introduction: The technologies used to design, create and use microphysiological systems (MPS, "tissue chips" or "organs-on-chips") have progressed rapidly in the last 5 years, and validation studies of the functional relevance of these platforms to human physiology, and of their response to drugs for individual model organ systems, are well underway. These studies are paving the way for integrated multi-organ systems that can model diseases and predict drug efficacy and toxicology across multiple organs in real time, improving the potential for diagnostics and the development of novel treatments for rare diseases in the future. Areas covered: This review briefly summarizes the current state of tissue chip research and highlights model systems where these microfabricated (or bioengineered) devices are already being used to screen therapeutics, model disease states, and provide potential treatments, in addition to helping elucidate the basic molecular and cellular phenotypes of rare diseases. Expert opinion: Microphysiological systems hold great promise for modeling rare disorders, as well as for enhancing the predictive power of new drug therapeutics and potentially increasing the statistical power of clinical trials while removing the inherent risks of these trials in rare disease populations. PMID:28626620
Gray, Alistair; Veale, Jaimie F.; Binson, Diane; Sell, Randell L.
2013-01-01
Objective. Effectively addressing health disparities experienced by sexual minority populations requires high-quality official data on sexual orientation. We developed a conceptual framework of sexual orientation to improve the quality of sexual orientation data in New Zealand's Official Statistics System. Methods. We reviewed conceptual and methodological literature, culminating in a draft framework. To improve the framework, we held focus groups and key-informant interviews with sexual minority stakeholders and producers and consumers of official statistics. An advisory board of experts provided additional guidance. Results. The framework proposes working definitions of the sexual orientation topic and measurement concepts, describes dimensions of the measurement concepts, discusses variables framing the measurement concepts, and outlines conceptual grey areas. Conclusion. The framework proposes standard definitions and concepts for the collection of official sexual orientation data in New Zealand. It presents a model for producers of official statistics in other countries, who wish to improve the quality of health data on their citizens. PMID:23840231
Issues with data and analyses: Errors, underlying themes, and potential solutions
Allison, David B.
2018-01-01
Some aspects of science, taken at the broadest level, are universal in empirical research. These include collecting, analyzing, and reporting data. In each of these aspects, errors can and do occur. In this work, we first discuss the importance of focusing on statistical and data errors to continually improve the practice of science. We then describe underlying themes of the types of errors and postulate contributing factors. To do so, we describe a case series of relatively severe data and statistical errors coupled with surveys of some types of errors to better characterize the magnitude, frequency, and trends. Having examined these errors, we then discuss the consequences of specific errors or classes of errors. Finally, given the extracted themes, we discuss methodological, cultural, and system-level approaches to reducing the frequency of commonly observed errors. These approaches will plausibly contribute to the self-critical, self-correcting, ever-evolving practice of science, and ultimately to furthering knowledge. PMID:29531079
Hannigan, Ailish; Bargary, Norma; Kinsella, Anthony; Clarke, Mary
2017-06-14
Although the relationships between duration of untreated psychosis (DUP) and outcomes are often assumed to be linear, few studies have explored the functional form of these relationships. The aim of this study is to demonstrate the potential of recent advances in curve fitting approaches (splines) to explore the form of the relationship between DUP and global assessment of functioning (GAF). Curve fitting approaches were used in models to predict change in GAF at long-term follow-up using DUP for a sample of 83 individuals with schizophrenia. The form of the relationship between DUP and GAF was non-linear. Accounting for non-linearity increased the percentage of variance in GAF explained by the model, resulting in better prediction and understanding of the relationship. The relationship between DUP and outcomes may be complex and model fit may be improved by accounting for the form of the relationship. This should be routinely assessed and new statistical approaches for non-linear relationships exploited, if appropriate. © 2017 John Wiley & Sons Australia, Ltd.
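A hedged sketch of the curve-fitting idea, assuming Python with SciPy: a straight line and a cubic smoothing spline are both fit to simulated DUP and GAF-change data, and their explained variance is compared. The data-generating function and sample size are illustrative assumptions, not the study's data.

```python
# Compare a linear fit with a smoothing spline for a non-linear
# DUP -> GAF-change relationship; data are simulated.
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(7)
dup = rng.gamma(2.0, 20.0, 83)                              # weeks untreated
gaf_change = 30 * np.exp(-dup / 40) + rng.normal(0, 5, 83)  # non-linear truth

order = np.argsort(dup)                  # spline fitting needs sorted x
x, y = dup[order], gaf_change[order]

linear = np.polyfit(x, y, 1)                         # linear baseline
spline = UnivariateSpline(x, y, k=3, s=len(x) * 25)  # cubic smoothing spline

resid_lin = y - np.polyval(linear, x)
resid_spl = y - spline(x)
print("R^2 linear:", round(1 - resid_lin.var() / y.var(), 3))
print("R^2 spline:", round(1 - resid_spl.var() / y.var(), 3))
```

The gap between the two R^2 values is the kind of model-fit improvement the abstract reports when non-linearity is accounted for.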
NASA Astrophysics Data System (ADS)
Moron, Vincent; Navarra, Antonio
2000-05-01
This study assesses the skill of seasonal rainfall prediction over tropical America using an ensemble of three 34-year general circulation model (ECHAM4) simulations forced with observed sea surface temperatures between 1961 and 1994. The skill gives a first idea of the amount of potential predictability if the sea surface temperatures are perfectly known some time in advance. We use statistical post-processing based on the leading modes (extracted from Singular Value Decomposition of the covariance matrix between observed and simulated rainfall fields) to improve the raw skill obtained by simple comparison between observations and simulations. It is shown that 36-55% of the observed seasonal variability is explained by the simulations on a regional basis. Skill is greatest for the Brazilian Nordeste (March-May), but is also substantial, for example, for northern South America and the Caribbean basin in June-September and for northern Amazonia in September-November.
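A rough sketch of SVD-based post-processing of this kind, assuming Python with NumPy; the array shapes, the per-mode regression step, and the number of retained modes are illustrative assumptions rather than the paper's exact procedure.

```python
# Coupled-pattern correction: SVD of the observed/simulated cross-covariance
# gives paired spatial patterns; simulated expansion coefficients are then
# regressed onto observed ones and used to reconstruct a corrected field.
import numpy as np

nt, npts = 34, 500                       # years, grid points (illustrative)
rng = np.random.default_rng(3)
obs = rng.normal(size=(nt, npts))        # observed rainfall anomalies
sim = 0.6 * obs + rng.normal(scale=0.8, size=(nt, npts))  # model anomalies

C = obs.T @ sim / (nt - 1)               # cross-covariance matrix
U, s, Vt = np.linalg.svd(C, full_matrices=False)

k = 5                                    # retain leading coupled modes
sim_pc = sim @ Vt[:k].T                  # simulated expansion coefficients
obs_pc = obs @ U[:, :k]                  # observed expansion coefficients
coef = (obs_pc * sim_pc).sum(0) / (sim_pc ** 2).sum(0)  # per-mode regression
recon = (sim_pc * coef) @ U[:, :k].T     # statistically corrected field
```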
Model-based reconstruction of synthetic promoter library in Corynebacterium glutamicum.
Zhang, Shuanghong; Liu, Dingyu; Mao, Zhitao; Mao, Yufeng; Ma, Hongwu; Chen, Tao; Zhao, Xueming; Wang, Zhiwen
2018-05-01
To develop an efficient synthetic promoter library for fine-tuned expression of target genes in Corynebacterium glutamicum. A synthetic promoter library for C. glutamicum was developed based on conserved sequences of the -10 and -35 regions. The synthetic promoter library covered a wide range of strengths, ranging from 1% to 193% of tac promoter strength. 68 promoters were selected and sequenced for correlation analysis between promoter sequence and strength using a statistical model. A new promoter library was then reconstructed with improved strength and coverage based on the results of the correlation analysis. The tandem promoter P70 was finally constructed, with strength increased by 121% over the tac promoter. The promoter library developed in this study shows great potential for applications in metabolic engineering and synthetic biology for the optimization of metabolic networks. To the best of our knowledge, this is the first reconstruction of a synthetic promoter library based on statistical analysis in C. glutamicum.
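The paper's statistical model is not reproduced here, so the following is only a hedged sketch of one plausible sequence-to-strength regression, assuming Python with scikit-learn: promoter sequences are one-hot encoded and ridge regression relates each position and base to measured strength. The sequence length, simulated data, and ridge penalty are all assumptions.

```python
# Illustrative sequence-strength correlation analysis: regress measured
# strength on one-hot-encoded bases of the variable promoter positions.
import numpy as np
from sklearn.linear_model import Ridge

BASES = "ACGT"
rng = np.random.default_rng(11)
seqs = ["".join(rng.choice(list(BASES), 12)) for _ in range(68)]  # stand-ins
strength = rng.uniform(1, 193, 68)       # % of tac promoter strength

def one_hot(seq):
    """Flatten a sequence into a positions-by-bases indicator vector."""
    return np.array([[b == base for base in BASES] for b in seq], float).ravel()

X = np.array([one_hot(s) for s in seqs])
model = Ridge(alpha=1.0).fit(X, strength)

# Per-position, per-base coefficients suggest which substitutions are
# associated with higher or lower promoter strength.
weights = model.coef_.reshape(len(seqs[0]), len(BASES))
print(weights.round(2))
```

Coefficients of this kind are what would guide the reconstruction of a second-generation library with improved strength and coverage.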
Certification Strategies using Run-Time Safety Assurance for Part 23 Autopilot Systems
NASA Technical Reports Server (NTRS)
Hook, Loyd R.; Clark, Matthew; Sizoo, David; Skoog, Mark A.; Brady, James
2016-01-01
Part 23 aircraft operation, and in particular general aviation, is relatively unsafe when compared to other common forms of vehicle travel. Technologies exist today that could improve safety statistics for these aircraft; however, the high burden and cost of performing the requisite safety-critical certification processes for these systems limit their proliferation. For this reason, many entities, including the Federal Aviation Administration, NASA, and the US Air Force, are considering new certification options for technologies that will improve aircraft safety. Of particular interest are low-cost autopilot systems for general aviation aircraft, as these systems have the potential to positively and significantly affect safety statistics. This paper proposes new systems and techniques, leveraging run-time verification, for the assurance of general aviation autopilot systems; these would supplement the current certification process and provide a viable path for near-term, low-cost implementation. In addition, preliminary experimentation and building the assurance case for a system based on these principles are discussed.
Similar Estimates of Temperature Impacts on Global Wheat Yield by Three Independent Methods
NASA Technical Reports Server (NTRS)
Liu, Bing; Asseng, Senthold; Muller, Christoph; Ewart, Frank; Elliott, Joshua; Lobell, David B.; Martre, Pierre; Ruane, Alex C.; Wallach, Daniel; Jones, James W.;
2016-01-01
The potential impact of global temperature change on global crop yield has recently been assessed with different methods. Here we show that grid-based and point-based simulations and statistical regressions (from historic records), without deliberate adaptation or CO2 fertilization effects, produce similar estimates of temperature impact on wheat yields at global and national scales. With a 1 °C global temperature increase, global wheat yield is projected to decline between 4.1% and 6.4%. Projected relative temperature impacts from different methods were similar for the major wheat-producing countries China, India, USA and France, but less so for Russia. Point-based and grid-based simulations, and to some extent the statistical regressions, were consistent in projecting that warmer regions are likely to suffer more yield loss with increasing temperature than cooler regions. By forming a multi-method ensemble, it was possible to quantify 'method uncertainty' in addition to model uncertainty. This significantly improves confidence in estimates of climate impacts on global food security.
Similar estimates of temperature impacts on global wheat yield by three independent methods
NASA Astrophysics Data System (ADS)
Liu, Bing; Asseng, Senthold; Müller, Christoph; Ewert, Frank; Elliott, Joshua; Lobell, David B.; Martre, Pierre; Ruane, Alex C.; Wallach, Daniel; Jones, James W.; Rosenzweig, Cynthia; Aggarwal, Pramod K.; Alderman, Phillip D.; Anothai, Jakarat; Basso, Bruno; Biernath, Christian; Cammarano, Davide; Challinor, Andy; Deryng, Delphine; Sanctis, Giacomo De; Doltra, Jordi; Fereres, Elias; Folberth, Christian; Garcia-Vila, Margarita; Gayler, Sebastian; Hoogenboom, Gerrit; Hunt, Leslie A.; Izaurralde, Roberto C.; Jabloun, Mohamed; Jones, Curtis D.; Kersebaum, Kurt C.; Kimball, Bruce A.; Koehler, Ann-Kristin; Kumar, Soora Naresh; Nendel, Claas; O'Leary, Garry J.; Olesen, Jørgen E.; Ottman, Michael J.; Palosuo, Taru; Prasad, P. V. Vara; Priesack, Eckart; Pugh, Thomas A. M.; Reynolds, Matthew; Rezaei, Ehsan E.; Rötter, Reimund P.; Schmid, Erwin; Semenov, Mikhail A.; Shcherbak, Iurii; Stehfest, Elke; Stöckle, Claudio O.; Stratonovitch, Pierre; Streck, Thilo; Supit, Iwan; Tao, Fulu; Thorburn, Peter; Waha, Katharina; Wall, Gerard W.; Wang, Enli; White, Jeffrey W.; Wolf, Joost; Zhao, Zhigan; Zhu, Yan
2016-12-01
The potential impact of global temperature change on global crop yield has recently been assessed with different methods. Here we show that grid-based and point-based simulations and statistical regressions (from historic records), without deliberate adaptation or CO2 fertilization effects, produce similar estimates of temperature impact on wheat yields at global and national scales. With a 1 °C global temperature increase, global wheat yield is projected to decline between 4.1% and 6.4%. Projected relative temperature impacts from different methods were similar for major wheat-producing countries China, India, USA and France, but less so for Russia. Point-based and grid-based simulations, and to some extent the statistical regressions, were consistent in projecting that warmer regions are likely to suffer more yield loss with increasing temperature than cooler regions. By forming a multi-method ensemble, it was possible to quantify 'method uncertainty' in addition to model uncertainty. This significantly improves confidence in estimates of climate impacts on global food security.
Saini, Parmesh K; Marks, Harry M; Dreyfuss, Moshe S; Evans, Peter; Cook, L Victor; Dessai, Uday
2011-08-01
Measuring commonly occurring, nonpathogenic organisms on poultry products may be used for designing statistical process control systems that could result in reductions of pathogen levels. The extent of pathogen level reduction obtainable from actions prompted by monitoring these measurements over time depends upon how well the cause-effect relationships between processing variables, selected output variables, and pathogens are understood. For such measurements to be effective for controlling or improving processing to some capability level within the statistical process control context, sufficiently frequent measurements would be needed to help identify processing deficiencies. Ultimately, the correct balance of sampling and resources is determined by those characteristics of deficient processing that are important to identify. We recommend strategies that emphasize flexibility, depending upon sampling objectives. Coupling the measurement of indicator organism levels with practical emerging technologies and suitable on-site platforms that decrease the time between sample collection and the interpretation of results would enhance the monitoring of process control.
[Immunological and clinical study on therapeutic efficacy of inosine pranobex].
Gołebiowska-Wawrzyniak, Maria; Markiewicz, Katarzyna; Kozar, Agata; Derentowicz, Piotr; Czerwińska-Kartowicz, Iwona; Jastrzebska-Janas, Krystyna; Wacławek, Jolanta; Wawrzyniak, Zbigniew M; Siwińska-Gołebiowska, Henryka
2005-09-01
Many studies in vitro and in vivo have shown immunomodulating and antiviral activities of inosine pranobex. The object of this research was to examine the potential beneficial effects of inosine pranobex (Groprinosin) on the immune system in children with cellular immunodeficiency, as a prophylaxis against recurrent infections, mainly of viral origin. Inosine pranobex at 50 mg/kg body weight/day in divided doses was given to a group of 30 children aged 3-15 years for 10 days in each of 3 consecutive months. Clinical and immunological investigations were performed before and after the treatment. A statistically significant rise in the number of CD3 T lymphocytes (p = 0.02), including CD4 T lymphocytes (p = 0.02), as well as a statistically significant improvement in their function (p = 0.005), evaluated with the blastic transformation method, were found. These laboratory findings paralleled clinical benefits. A control study was performed in a group of children selected by randomization and treated in the same way with garlic (Alliofil).
Estimating weak ratiometric signals in imaging data. I. Dual-channel data.
Broder, Josef; Majumder, Anirban; Porter, Erika; Srinivasamoorthy, Ganesh; Keith, Charles; Lauderdale, James; Sornborger, Andrew
2007-09-01
Ratiometric fluorescent indicators are becoming increasingly prevalent in many areas of biology. They are used for making quantitative measurements of intracellular free calcium both in vitro and in vivo, as well as measuring membrane potentials, pH, and other important physiological variables of interest to researchers in many subfields. Often, functional changes in the fluorescent yield of ratiometric indicators are small, and the signal-to-noise ratio (SNR) is of order unity or less. In particular, variability in the denominator of the ratio can lead to very poor ratio estimates. We present a statistical optimization method for objectively detecting and estimating ratiometric signals in dual-wavelength measurements of fluorescent, ratiometric indicators that improves on standard methods. With the use of an appropriate statistical model for ratiometric signals and by taking the pixel-pixel covariance of an imaging dataset into account, we are able to extract user-independent spatiotemporal information that retains high resolution in both space and time.
Effects of temporal variability in ground data collection on classification accuracy
Hoch, G.A.; Cully, J.F.
1999-01-01
This research tested whether the timing of ground data collection can significantly impact the accuracy of land cover classification. Ft. Riley Military Reservation, Kansas, USA was used to test this hypothesis. The U.S. Army's Land Condition Trend Analysis (LCTA) data annually collected at military bases were used to ground truth disturbance patterns. Ground data collected over an entire growing season, and one year after the imagery, produced a kappa statistic of 0.33. When using only ground data from within two weeks of image acquisition, the kappa statistic improved to 0.55. Potential sources of this discrepancy are identified. These data demonstrate that there can be significant amounts of land cover change within a narrow time window on military reservations. To accurately conduct land cover classification at military reservations, ground data need to be collected in as narrow a window of time as possible and be closely synchronized with the date of the satellite imagery.
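A minimal illustration of the agreement statistic used above, assuming Python with scikit-learn; the label lists are invented examples, not the LCTA data.

```python
# Cohen's kappa measures classification agreement corrected for chance;
# it is the statistic that improved from 0.33 to 0.55 in the study.
from sklearn.metrics import cohen_kappa_score

classified = ["grass", "shrub", "grass", "bare", "grass", "shrub", "bare", "grass"]
observed = ["grass", "grass", "grass", "bare", "shrub", "shrub", "bare", "grass"]

kappa = cohen_kappa_score(classified, observed)
print(f"kappa = {kappa:.2f}")
```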
NETWORK ASSISTED ANALYSIS TO REVEAL THE GENETIC BASIS OF AUTISM
Liu, Li; Lei, Jing; Roeder, Kathryn
2016-01-01
While studies show that autism is highly heritable, the nature of the genetic basis of this disorder remains elusive. Based on the idea that highly correlated genes are functionally interrelated and more likely to affect risk, we develop a novel statistical tool to identify more potential autism risk genes by combining genetic association scores with gene co-expression in specific brain regions and periods of development. The gene dependence network is estimated using a novel partial neighborhood selection (PNS) algorithm, in which node-specific properties are incorporated into network estimation for improved statistical and computational efficiency. We then adopt a hidden Markov random field (HMRF) model to combine the estimated network and the genetic association scores in a systematic manner. The proposed modeling framework can be naturally extended to incorporate additional structural information concerning the dependence between genes. Using currently available genetic association data from whole exome sequencing studies and brain gene expression levels, the proposed algorithm successfully identified 333 genes that plausibly affect autism risk. PMID:27134692
NASA Astrophysics Data System (ADS)
Zack, J. W.
2015-12-01
Predictions from Numerical Weather Prediction (NWP) models are the foundation for wind power forecasts for day-ahead and longer forecast horizons. The NWP models directly produce three-dimensional wind forecasts on their respective computational grids. These can be interpolated to the location and time of interest. However, these direct predictions typically contain significant systematic errors ("biases"). This is due to a variety of factors, including the limited space-time resolution of the NWP models and shortcomings in the models' representation of physical processes. It has become common practice to attempt to improve the raw NWP forecasts by statistically adjusting them through a procedure that is widely known as Model Output Statistics (MOS). The challenge is to identify complex patterns of systematic errors and then use this knowledge to adjust the NWP predictions. The MOS-based improvements are the basis for much of the value added by commercial wind power forecast providers. There is an enormous number of statistical approaches that can be used to generate the MOS adjustments to the raw NWP forecasts. In order to obtain insight into the potential value of some of the newer and more sophisticated statistical techniques, often referred to as "machine learning methods", a MOS-method comparison experiment has been performed for wind power generation facilities in 6 wind resource areas of California. The underlying NWP models that provided the raw forecasts were the two primary operational models of the US National Weather Service: the GFS and NAM models. The focus was on 1- and 2-day-ahead forecasts of hourly wind-based generation. The statistical methods evaluated included: (1) screening multiple linear regression, which served as a baseline method, (2) artificial neural networks, (3) a decision-tree approach called random forests, (4) gradient boosted regression based upon a decision-tree algorithm, (5) support vector regression and (6) analog ensemble, which is a case-matching scheme. The presentation will provide (1) an overview of each method and the experimental design, (2) performance comparisons based on standard metrics such as bias, MAE and RMSE, (3) a summary of the performance characteristics of each approach and (4) a preview of further experiments to be conducted.
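A hedged sketch of such a MOS-method comparison, assuming Python with scikit-learn; predictors, targets, and hyperparameters are simulated placeholders, and the analog ensemble method is omitted because it has no stock scikit-learn implementation.

```python
# Fit several regressors mapping raw NWP-derived predictors to observed
# generation, then score each on held-out data with MAE and RMSE.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor
from sklearn.svm import SVR
from sklearn.metrics import mean_absolute_error, mean_squared_error

rng = np.random.default_rng(0)
X = rng.normal(size=(3000, 8))                                 # NWP predictors
y = X[:, 0] * 3 + np.sin(X[:, 1]) + rng.normal(0, 0.5, 3000)   # generation

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=1)
models = {
    "linear (baseline)": LinearRegression(),
    "neural net": MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000),
    "random forest": RandomForestRegressor(n_estimators=200, random_state=1),
    "gradient boosting": GradientBoostingRegressor(random_state=1),
    "support vector": SVR(C=10.0),
}
for name, m in models.items():
    pred = m.fit(X_tr, y_tr).predict(X_te)
    mae = mean_absolute_error(y_te, pred)
    rmse = np.sqrt(mean_squared_error(y_te, pred))
    print(f"{name:20s} MAE={mae:.3f} RMSE={rmse:.3f}")
```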
Paradigms for adaptive statistical information designs: practical experiences and strategies.
Wang, Sue-Jane; Hung, H M James; O'Neill, Robert
2012-11-10
In the last decade or so, interest in adaptive design clinical trials has gradually been directed towards their use in regulatory submissions by pharmaceutical drug sponsors to evaluate investigational new drugs. Methodological advances in adaptive designs have been abundant in the statistical literature since the 1970s. The adaptive design paradigm has been enthusiastically perceived to increase efficiency and to be more cost-effective than the fixed design paradigm for drug development. Much of the interest in adaptive designs is in two-stage studies, where stage 1 is exploratory and stage 2 depends upon stage 1 results, but where the data of both stages will be combined to yield statistical evidence for use as that of a pivotal registration trial. It was not until the recent release of the US Food and Drug Administration's Draft Guidance for Industry on Adaptive Design Clinical Trials for Drugs and Biologics (2010) that the boundaries of flexibility for adaptive designs were specifically considered for regulatory purposes, including what are exploratory goals and what are the goals of adequate and well-controlled (A&WC) trials (2002). The guidance carefully described these distinctions in an attempt to minimize the confusion between the goals of preliminary learning phases of drug development, which are inherently substantially uncertain, and the definitive inference-based phases of drug development. In this paper, in addition to discussing some aspects of adaptive designs in a confirmatory study setting, we underscore the value of adaptive designs when used in exploratory trials to improve the planning of subsequent A&WC trials. One type of adaptation that is receiving attention is the re-estimation of the sample size during the course of the trial. We refer to this type of adaptation as an adaptive statistical information design. Specifically, a case example is used to illustrate how challenging it is to plan a confirmatory adaptive statistical information design. We highlight the substantial risk of planning the sample size for confirmatory trials when the available information is very uninformative, and stipulate the advantages of adaptive statistical information designs for planning exploratory trials. Practical experiences and strategies, as lessons learned from more recent adaptive design proposals, will be discussed to pinpoint the improved utilities of adaptive design clinical trials and their potential to increase the chance of a successful drug development. Published 2012. This article is a US Government work and is in the public domain in the USA.
NASA Astrophysics Data System (ADS)
Salvi, Kaustubh; Villarini, Gabriele; Vecchi, Gabriel A.
2017-10-01
Unprecedented alterations in precipitation characteristics over the last century, and especially in the last two decades, have posed serious socio-economic problems in terms of hydro-meteorological extremes, in particular flooding and droughts. The origin of these alterations lies in changing climatic conditions; however, their threatening implications can only be dealt with through meticulous planning based on realistic and skillful decadal precipitation predictions (DPPs). Skillful DPPs represent a very challenging prospect because of the complexities associated with precipitation prediction. Because of limited skill and coarse spatial resolution, the DPPs provided by General Circulation Models (GCMs) are not directly applicable for impact assessment. Here, we focus on nine GCMs and quantify the seasonally and regionally averaged skill of DPPs over the continental United States. We address the problems pertaining to limited skill and resolution by applying linear and kernel regression-based statistical downscaling approaches. For both approaches, statistical relationships established over the calibration period (1961-1990) are applied to the retrospective and near-future decadal predictions of the GCMs to obtain DPPs at ∼4 km resolution. The skill is quantified across different metrics that evaluate potential skill, biases, long-term statistical properties, and uncertainty. Both statistical approaches show improvements with respect to the raw GCM data, particularly in terms of long-term statistical properties and uncertainty, irrespective of lead time. The outcome of the study is monthly DPPs from nine GCMs at 4-km spatial resolution, which can be used as a key input for impact assessments.
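A minimal sketch of the two downscaling approaches named above, assuming Python with scikit-learn: ordinary linear regression and RBF kernel ridge regression are calibrated on one period and verified on another. The predictor construction and kernel settings are illustrative assumptions.

```python
# Calibrate linear and kernel (ridge) regression mappings from coarse GCM
# predictors to a fine-grid target, then check held-out skill; data simulated.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(5)
gcm = rng.normal(size=(360, 4))          # coarse-cell monthly predictors
fine = np.tanh(gcm[:, 0]) + 0.3 * gcm[:, 1] + rng.normal(0, 0.2, 360)

train = slice(0, 240)                    # calibration period
test = slice(240, 360)                   # held-out verification

lin = LinearRegression().fit(gcm[train], fine[train])
krr = KernelRidge(kernel="rbf", alpha=1.0, gamma=0.5).fit(gcm[train], fine[train])

for name, m in [("linear", lin), ("kernel", krr)]:
    r = np.corrcoef(m.predict(gcm[test]), fine[test])[0, 1]
    print(f"{name} downscaling skill (r) = {r:.2f}")
```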
A Sorting Statistic with Application in Neurological Magnetic Resonance Imaging of Autism.
Levman, Jacob; Takahashi, Emi; Forgeron, Cynthia; MacDonald, Patrick; Stewart, Natalie; Lim, Ashley; Martel, Anne
2018-01-01
Effect size refers to the assessment of the extent of differences between two groups of samples on a single measurement. Assessing effect size in medical research is typically accomplished with Cohen's d statistic. Cohen's d statistic assumes that average values are good estimators of the position of a distribution of numbers and also assumes Gaussian (or bell-shaped) underlying data distributions. In this paper, we present an alternative evaluative statistic that can quantify differences between two data distributions in a manner that is similar to traditional effect size calculations; however, the proposed approach avoids making assumptions regarding the shape of the underlying data distribution. The proposed sorting statistic is compared with Cohen's d statistic and is demonstrated to be capable of identifying feature measurements of potential interest for which Cohen's d statistic implies the measurement would be of little use. This proposed sorting statistic has been evaluated on a large clinical autism dataset from Boston Children's Hospital, Harvard Medical School, demonstrating that it can potentially play a constructive role in future healthcare technologies.
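To make the limitation of Cohen's d concrete, here is a hedged Python sketch (not the paper's sorting statistic, which is not reproduced here): two simulated groups share a mean, so d is near zero, while a rank-based effect size still registers the distributional difference.

```python
# Cohen's d vs a distribution-free, rank-based effect size (the
# common-language effect size derived from the Mann-Whitney U statistic).
import numpy as np
from scipy.stats import mannwhitneyu

def cohens_d(a, b):
    pooled = np.sqrt(((len(a) - 1) * a.var(ddof=1) + (len(b) - 1) * b.var(ddof=1))
                     / (len(a) + len(b) - 2))
    return (a.mean() - b.mean()) / pooled

rng = np.random.default_rng(2)
grp1 = rng.normal(0, 1, 500)
# Same mean as grp1 (bulk shifted down, heavy right tail balances it out):
grp2 = np.concatenate([rng.normal(-0.5, 0.2, 450), rng.normal(4.5, 0.5, 50)])

u, p = mannwhitneyu(grp1, grp2)
cles = u / (len(grp1) * len(grp2))   # P(random grp1 value > random grp2 value)
print(f"Cohen's d = {cohens_d(grp1, grp2):+.3f}")    # near zero
print(f"rank-based effect (CLES) = {cles:.3f}, P = {p:.3g}")  # clearly nonzero
```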
A Sorting Statistic with Application in Neurological Magnetic Resonance Imaging of Autism
Takahashi, Emi; Lim, Ashley; Martel, Anne
2018-01-01
Effect size refers to the assessment of the extent of differences between two groups of samples on a single measurement. Assessing effect size in medical research is typically accomplished with Cohen's d statistic. Cohen's d statistic assumes that average values are good estimators of the position of a distribution of numbers and also assumes Gaussian (or bell-shaped) underlying data distributions. In this paper, we present an alternative evaluative statistic that can quantify differences between two data distributions in a manner that is similar to traditional effect size calculations; however, the proposed approach avoids making assumptions regarding the shape of the underlying data distribution. The proposed sorting statistic is compared with Cohen's d statistic and is demonstrated to be capable of identifying feature measurements of potential interest for which Cohen's d statistic implies the measurement would be of little use. This proposed sorting statistic has been evaluated on a large clinical autism dataset from Boston Children's Hospital, Harvard Medical School, demonstrating that it can potentially play a constructive role in future healthcare technologies. PMID:29796236
Inferring Demographic History Using Two-Locus Statistics.
Ragsdale, Aaron P; Gutenkunst, Ryan N
2017-06-01
Population demographic history may be learned from contemporary genetic variation data. Methods based on aggregating the statistics of many single loci into an allele frequency spectrum (AFS) have proven powerful, but such methods ignore potentially informative patterns of linkage disequilibrium (LD) between neighboring loci. To leverage such patterns, we developed a composite-likelihood framework for inferring demographic history from aggregated statistics of pairs of loci. Using this framework, we show that two-locus statistics are more sensitive to demographic history than single-locus statistics such as the AFS. In particular, two-locus statistics escape the notorious confounding of depth and duration of a bottleneck, and they provide a means to estimate effective population size based on the recombination rather than the mutation rate. We applied our approach to a Zambian population of Drosophila melanogaster. Notably, using both single- and two-locus statistics, we inferred a substantially lower ancestral effective population size than previous works and did not infer a bottleneck history. Together, our results demonstrate the broad potential for two-locus statistics to enable powerful population genetic inference. Copyright © 2017 by the Genetics Society of America.
NASA Astrophysics Data System (ADS)
Grewer, Uwe; Nash, Julie; Gurwick, Noel; Bockel, Louis; Galford, Gillian; Richards, Meryl; Costa Junior, Ciniro; White, Julianna; Pirolli, Gillian; Wollenberg, Eva
2018-04-01
This article analyses the greenhouse gas (GHG) impact potential of improved management practices and technologies for smallholder agriculture promoted under a global food security development program. Under 'business-as-usual' development, global studies on the future of agriculture to 2050 project considerable increases in total food production and cultivated area. Conventional cropland intensification and conversion of natural vegetation typically result in increased GHG emissions and loss of carbon stocks. There is a strong need to understand the potential greenhouse gas impacts of agricultural development programs intended to achieve large-scale change, and to identify pathways of smallholder agricultural development that can achieve food security and agricultural production growth without drastic increases in GHG emissions. In an analysis of 134 crop and livestock production systems in 15 countries with reported impacts on 4.8 million ha, improved management practices and technologies by smallholder farmers significantly reduce the GHG emission intensity of agricultural production, increase yields and reduce post-harvest losses, while either decreasing or only moderately increasing net GHG emissions per area. Investments in both production and post-harvest stages meaningfully reduced GHG emission intensity, contributing to low emission development. We present average impacts on net GHG emissions per hectare and GHG emission intensity, while not providing detailed statistics of GHG impacts at scale, which are associated with additional uncertainties. While the reported improvements in smallholder systems effectively reduce future GHG emissions compared to business-as-usual development, these contributions are insufficient to significantly reduce net GHG emissions in agriculture below current levels, particularly if future agricultural production grows at projected rates.
Potentials for Platooning in U.S. Highway Freight Transport
DOE Office of Scientific and Technical Information (OSTI.GOV)
Muratori, Matteo; Holden, Jacob; Lammert, Michael
2017-03-28
Smart technologies enabling connection among vehicles and between vehicles and infrastructure as well as vehicle automation to assist human operators are receiving significant attention as a means for improving road transportation systems by reducing fuel consumption - and related emissions - while also providing additional benefits through improving overall traffic safety and efficiency. For truck applications, which are currently responsible for nearly three-quarters of the total U.S. freight energy use and greenhouse gas (GHG) emissions, platooning has been identified as an early feature for connected and automated vehicles (CAVs) that could provide significant fuel savings and improved traffic safety and efficiency without radical design or technology changes compared to existing vehicles. A statistical analysis was performed based on a large collection of real-world U.S. truck usage data to estimate the fraction of total miles that are technically suitable for platooning. In particular, our analysis focuses on estimating 'platoonable' mileage based on overall highway vehicle use and prolonged high-velocity traveling, and established that about 65% of the total miles driven by combination trucks from this data sample could be driven in platoon formation, leading to a 4% reduction in total truck fuel consumption. This technical potential for 'platoonable' miles in the United States provides an upper bound for scenario analysis considering fleet willingness and convenience to platoon as an estimate of overall benefits of early adoption of connected and automated vehicle technologies. A benefit analysis is proposed to assess the overall potential for energy savings and emissions mitigation by widespread implementation of highway platooning for trucks.
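The two headline numbers imply a per-mile saving that the abstract leaves implicit; a hedged back-of-envelope check (assuming the savings accrue only on platooned miles):

    \frac{0.04}{0.65} \approx 6.2\%

i.e. an average saving of roughly 6% on each platooned mile, which is broadly in line with per-vehicle platooning savings reported elsewhere in the literature.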
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sigeti, David E.; Pelak, Robert A.
We present a Bayesian statistical methodology for identifying improvement in predictive simulations, including an analysis of the number of (presumably expensive) simulations that will need to be made in order to establish with a given level of confidence that an improvement has been observed. Our analysis assumes the ability to predict (or postdict) the same experiments with legacy and new simulation codes and uses a simple binomial model for the probability, θ, that, in an experiment chosen at random, the new code will provide a better prediction than the old. This model makes it possible to do statistical analysis with an absolute minimum of assumptions about the statistics of the quantities involved, at the price of discarding some potentially important information in the data. In particular, the analysis depends only on whether or not the new code predicts better than the old in any given experiment, and not on the magnitude of the improvement. We show how the posterior distribution for θ may be used, in a kind of Bayesian hypothesis testing, both to decide if an improvement has been observed and to quantify our confidence in that decision. We quantify the predictive probability that should be assigned, prior to taking any data, to the possibility of achieving a given level of confidence, as a function of sample size. We show how this predictive probability depends on the true value of θ and, in particular, how there will always be a region around θ = 1/2 where it is highly improbable that we will be able to identify an improvement in predictive capability, although the width of this region will shrink to zero as the sample size goes to infinity. We show how the posterior standard deviation may be used, as a kind of 'plan B metric', in the case that the analysis shows that θ is close to 1/2, and argue that such a plan B should generally be part of hypothesis testing. All the analysis presented in the paper is done with a general beta-function prior for θ, enabling sequential analysis in which a small number of new simulations may be done and the resulting posterior for θ used as a prior to inform the next stage of power analysis.
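A minimal sketch of the beta-binomial update the abstract describes (the function name and the worked numbers are illustrative, not from the paper):

    from scipy import stats

    def posterior_confidence(k, n, alpha0=1.0, beta0=1.0, threshold=0.5):
        # Posterior Pr(theta > threshold) after observing k "new code wins"
        # in n paired comparisons, under a Beta(alpha0, beta0) prior.
        posterior = stats.beta(alpha0 + k, beta0 + n - k)
        return posterior.sf(threshold)

    # e.g. 14 wins in 20 comparisons with a uniform prior:
    # posterior_confidence(14, 20) is roughly 0.97

Because the posterior is again a beta distribution, it can be fed back in as the prior for the next batch of simulations, which is exactly the sequential analysis the abstract mentions.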
ERIC Educational Resources Information Center
Martin, Justin D.
2017-01-01
This essay presents data from a census of statistics requirements and offerings at all 4-year journalism programs in the United States (N = 369) and proposes a model of a potential course in statistics for journalism majors. The author proposes that three philosophies underlie a statistics course for journalism students. Such a course should (a)…
NASA Astrophysics Data System (ADS)
Lin, Shu; Wang, Rui; Xia, Ning; Li, Yongdong; Liu, Chunliang
2018-01-01
Statistical multipactor theories are critical prediction approaches for multipactor breakdown determination. However, these approaches still require a trade-off between calculation efficiency and accuracy. This paper presents an improved stationary statistical theory for efficient threshold analysis of two-surface multipactor. A general integral equation over the distribution function of the electron emission phase with both the single-sided and double-sided impacts considered is formulated. The modeling results indicate that the improved stationary statistical theory can not only obtain equally good accuracy of multipactor threshold calculation as the nonstationary statistical theory, but also achieve high calculation efficiency concurrently. By using this improved stationary statistical theory, the total time consumption in calculating full multipactor susceptibility zones of parallel plates can be decreased by as much as a factor of four relative to the nonstationary statistical theory. It also shows that the effect of single-sided impacts is indispensable for accurate multipactor prediction of coaxial lines, and is even more significant for high order multipactor. Finally, the influence of secondary emission yield (SEY) properties on the multipactor threshold is further investigated. It is observed that the first cross energy and the energy range between the first cross and the SEY maximum both play a significant role in determining the multipactor threshold, which agrees with the numerical simulation results in the literature.
Pauli structures arising from confined particles interacting via a statistical potential
NASA Astrophysics Data System (ADS)
Batle, Josep; Ciftja, Orion; Farouk, Ahmed; Alkhambashi, Majid; Abdalla, Soliman
2017-09-01
There have been suggestions that the Pauli exclusion principle alone can lead a non-interacting (free) system of identical fermions to form crystalline structures dubbed Pauli crystals. Single-shot imaging experiments for the case of ultra-cold systems of free spin-polarized fermionic atoms in a two-dimensional harmonic trap appear to show geometric arrangements that cannot be characterized as Wigner crystals. This work explores this idea and considers a well-known approach that enables one to treat a quantum system of free fermions as a system of classical particles interacting with a statistical interaction potential. The model under consideration, though classical in nature, incorporates the quantum statistics by endowing the classical particles with an effective interaction potential. The reasonable expectation is that possible Pauli crystal features seen in experiments may manifest in this model, which captures the correct quantum statistics as a first order correction. We use the Monte Carlo simulated annealing method to obtain the most stable configurations of finite two-dimensional systems of confined particles that interact with an appropriate statistical repulsion potential. We consider both an isotropic harmonic and a hard-wall confinement potential. Despite minor differences, the most stable configurations observed in our model correspond to the reported Pauli crystals in single-shot imaging experiments of free spin-polarized fermions in a harmonic trap. The crystalline configurations observed appear to be different from the expected classical Wigner crystal structures that would emerge if the confined classical particles interacted via a pair-wise Coulomb repulsion.
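A minimal sketch of the described procedure: simulated annealing of classical particles in an isotropic harmonic trap with a fermionic statistical repulsion (the Uhlenbeck-Gropper form is assumed here for concreteness; the paper's exact potential and parameters may differ):

    import numpy as np

    rng = np.random.default_rng(0)

    def energy(pos, lam=1.0, k_trap=1.0):
        # Harmonic confinement plus a pairwise statistical repulsion
        # v(r) = -ln(1 - exp(-2*pi*r^2/lam^2)) (thermal units absorbed).
        e = 0.5 * k_trap * np.sum(pos ** 2)
        n = len(pos)
        for i in range(n):
            for j in range(i + 1, n):
                r2 = np.sum((pos[i] - pos[j]) ** 2)
                e -= np.log(1.0 - np.exp(-2.0 * np.pi * r2 / lam ** 2) + 1e-12)
        return e

    def anneal(n=6, steps=20000, t0=1.0, t1=1e-3):
        pos = rng.normal(size=(n, 2))
        e = energy(pos)
        for s in range(steps):
            t = t0 * (t1 / t0) ** (s / steps)  # geometric cooling schedule
            trial = pos.copy()
            trial[rng.integers(n)] += rng.normal(scale=0.1, size=2)
            de = energy(trial) - e
            if de < 0 or rng.random() < np.exp(-de / t):
                pos, e = trial, e + de
        return pos  # candidate most-stable configuration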
Qiao, Zhi; Li, Xiang; Liu, Haifeng; Zhang, Lei; Cao, Junyang; Xie, Guotong; Qin, Nan; Jiang, Hui; Lin, Haocheng
2017-01-01
The prevalence of erectile dysfunction (ED) has been extensively studied worldwide. Erectile dysfunction drugs have shown great efficacy in treating male erectile dysfunction. To help doctors understand patients' drug-taking preferences and prescribe more effectively, it is crucial to analyze who actually takes erectile dysfunction drugs and how sexual behaviors relate to drug use. Existing clinical studies have usually applied descriptive statistics and regression analysis to small volumes of data. In this paper, based on a large volume of data (48,630 questionnaires), we use data mining approaches in addition to statistics and regression analysis to comprehensively analyze the relation between male sexual behaviors and the use of erectile dysfunction drugs, in order to characterize the patients who take them. We first analyze the impact of multiple sexual behavior factors on whether erectile dysfunction drugs are used. We then mine Decision Rules for Stratification to discover patients who are more likely to take the drugs. Based on the decision rules, patients can be partitioned into four potential groups for use of erectile dysfunction drugs: a high potential group, an intermediate potential-1 group, an intermediate potential-2 group and a low potential group. Experimental results show that 1) the sexual behavior factors erectile hardness and time length to prepare (how long patients prepare for sexual activity ahead of time) have larger impacts both in the correlation analysis and in the discovery of potential drug-taking patients; and 2) the odds ratio between patients identified as low potential and high potential was 6.098 (95% confidence interval, 5.159-7.209), with statistically significant differences in drug-taking potential detected between all groups.
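For reference, an odds ratio with a Wald confidence interval of the kind quoted above can be computed from a 2x2 table; a minimal sketch (the abstract does not give the underlying counts, so the inputs are placeholders):

    import numpy as np

    def odds_ratio_ci(a, b, c, d, z=1.96):
        # 2x2 table: a = exposed cases, b = exposed non-cases,
        #            c = unexposed cases, d = unexposed non-cases.
        or_ = (a * d) / (b * c)
        se = np.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of the log odds ratio
        lo = np.exp(np.log(or_) - z * se)
        hi = np.exp(np.log(or_) + z * se)
        return or_, (lo, hi)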
Preventability and Causes of Readmissions in a National Cohort of General Medicine Patients.
Auerbach, Andrew D; Kripalani, Sunil; Vasilevskis, Eduard E; Sehgal, Neil; Lindenauer, Peter K; Metlay, Joshua P; Fletcher, Grant; Ruhnke, Gregory W; Flanders, Scott A; Kim, Christopher; Williams, Mark V; Thomas, Larissa; Giang, Vernon; Herzig, Shoshana J; Patel, Kanan; Boscardin, W John; Robinson, Edmondo J; Schnipper, Jeffrey L
2016-04-01
Readmission penalties have catalyzed efforts to improve care transitions, but few programs have incorporated viewpoints of patients and health care professionals to determine readmission preventability or to prioritize opportunities for care improvement. To determine preventability of readmissions and to use these estimates to prioritize areas for improvement. An observational study was conducted of 1000 general medicine patients readmitted within 30 days of discharge to 12 US academic medical centers between April 1, 2012, and March 31, 2013. We surveyed patients and physicians, reviewed documentation, and performed 2-physician case review to determine preventability of and factors contributing to readmission. We used bivariable statistics to compare preventable and nonpreventable readmissions, multivariable models to identify factors associated with potential preventability, and baseline risk factor prevalence and adjusted odds ratios (aORs) to determine the proportion of readmissions affected by individual risk factors. Likelihood that a readmission could have been prevented. The study cohort comprised 1000 patients (median age was 55 years). Of these, 269 (26.9%) were considered potentially preventable. In multivariable models, factors most strongly associated with potential preventability included emergency department decision making regarding the readmission (aOR, 9.13; 95% CI, 5.23-15.95), failure to relay important information to outpatient health care professionals (aOR, 4.19; 95% CI, 2.17-8.09), discharge of patients too soon (aOR, 3.88; 95% CI, 2.44-6.17), and lack of discussions about care goals among patients with serious illnesses (aOR, 3.84; 95% CI, 1.39-10.64). The most common factors associated with potentially preventable readmissions included emergency department decision making (affecting 9.0%; 95% CI, 7.1%-10.3%), inability to keep appointments after discharge (affecting 8.3%; 95% CI, 4.1%-12.0%), premature discharge from the hospital (affecting 8.7%; 95% CI, 5.8%-11.3%), and patient lack of awareness of whom to contact after discharge (affecting 6.2%; 95% CI, 3.5%-8.7%). Approximately one-quarter of readmissions are potentially preventable when assessed using multiple perspectives. High-priority areas for improvement efforts include improved communication among health care teams and between health care professionals and patients, greater attention to patients' readiness for discharge, enhanced disease monitoring, and better support for patient self-management.
Behavioral biometrics for verification and recognition of malicious software agents
NASA Astrophysics Data System (ADS)
Yampolskiy, Roman V.; Govindaraju, Venu
2008-04-01
Homeland security requires technologies capable of positive and reliable identification of humans for law enforcement, government, and commercial applications. As artificially intelligent agents improve in their abilities and become a part of our everyday life, the possibility of using such programs for undermining homeland security increases. Virtual assistants, shopping bots, and game playing programs are used daily by millions of people. We propose applying statistical behavior modeling techniques, developed by us for the recognition of humans, to the identification and verification of intelligent and potentially malicious software agents. Our experimental results demonstrate the feasibility of such methods for both artificial agent verification and even for recognition purposes.
Increased Surface Fatigue Lives of Spur Gears by Application of a Coating
NASA Technical Reports Server (NTRS)
Krantz, Timothy L.; Cooper, Clark V.; Townsend, Dennis P.; Hansen, Bruce D.
2003-01-01
Hard coatings have potential for increasing gear surface fatigue lives. Experiments were conducted using gears both with and without a metal-containing, carbon-based coating. The gears were case-carburized AISI 9310 steel spur gears. Some gears were provided with the coating by magnetron sputtering. Lives were evaluated by accelerated life tests. For uncoated gears, all fifteen tests resulted in fatigue failure before completing 275 million revolutions. For coated gears, eleven of the fourteen tests were suspended with no fatigue failure after 275 million revolutions. The improved life owing to the coating, approximately a six-fold increase, was a statistically significant result.
Convergence of Mayer and Virial expansions and the Penrose tree-graph identity
NASA Astrophysics Data System (ADS)
Procacci, Aldo; Yuhjtman, Sergio A.
2017-01-01
We establish new lower bounds for the convergence radius of the Mayer series and the Virial series of a continuous particle system interacting via a stable and tempered pair potential. Our bounds considerably improve those given by Penrose (J Math Phys 4:1312, 1963) and Ruelle (Ann Phys 5:109-120, 1963) for the Mayer series and by Lebowitz and Penrose (J Math Phys 7:841-847, 1964) for the Virial series. To get our results, we exploit the tree-graph identity given by Penrose (Statistical mechanics: foundations and applications. Benjamin, New York, 1967) using a new partition scheme based on minimum spanning trees.
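For orientation, the two hypotheses named above have standard meanings in this literature (stated here in Ruelle's conventions; the paper may use slight variants). Stability requires a constant B >= 0 such that, for all n and all configurations x_1, ..., x_n,

    \sum_{1 \le i < j \le n} v(x_i - x_j) \ge -nB,

while temperedness requires that v decay integrably at infinity, for example

    |v(x)| \le C\,|x|^{-(d+\varepsilon)} \quad \text{for } |x| \ge R

for some constants C, R, \varepsilon > 0. Together these guarantee that the Mayer coefficients are finite and that the series has a positive radius of convergence.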
Antenna systems for base station diversity in urban small and micro cells
NASA Astrophysics Data System (ADS)
Eggers, Patrick C. F.; Toftgard, Jorn; Oprea, Alex M.
1993-09-01
This paper describes cross-correlation properties for compact urban base station antenna configurations, nearly all resulting in very low envelope cross-correlation coefficients of about 0.1 to 0.3. Particular focus is placed on polarization diversity systems for their potential in improving link quality when hand-held terminals are involved. An expression is given for the correlation function of compound space and polarization diversity systems. Dispersion and envelope dynamic statistics are presented for the measured environments. For microcell applications, it is found that systems such as GSM having a channel bandwidth of 200 kHz or less can use narrowband cross-correlation analysis directly.
Results of a real-time irradiation of lithium P/N and conventional N/P silicon solar cells.
NASA Technical Reports Server (NTRS)
Reynard, D. L.; Peterson, D. G.
1972-01-01
Eight types of lithium-diffused P/N and three types of conventional 10 ohm-cm N/P silicon solar cells were irradiated at four different temperatures with a strontium-90 radioisotope at a rate typical of that expected in earth orbit. The six-month irradiation confirmed earlier accelerator results, showed that certain cell types outperform others at the various temperatures, and, in general, verified the recent improvements and potential usefulness of lithium solar cells. The experimental approach and statistical methods and analyses employed yielded increased confidence in the validity of the results. Injection level effects were observed to be significant.
Space Weather in the Machine Learning Era: A Multidisciplinary Approach
NASA Astrophysics Data System (ADS)
Camporeale, E.; Wing, S.; Johnson, J.; Jackman, C. M.; McGranaghan, R.
2018-01-01
The workshop entitled Space Weather: A Multidisciplinary Approach took place at the Lorentz Center, University of Leiden, Netherlands, on 25-29 September 2017. The aim of this workshop was to bring together members of the Space Weather, Mathematics, Statistics, and Computer Science communities to address the use of advanced techniques such as Machine Learning, Information Theory, and Deep Learning, to better understand the Sun-Earth system and to improve space weather forecasting. Although individual efforts have been made toward this goal, the community consensus is that establishing interdisciplinary collaborations is the most promising strategy for fully utilizing the potential of these advanced techniques in solving Space Weather-related problems.
ERIC Educational Resources Information Center
Garfield, Joan; delMas, Robert
2010-01-01
The Assessment Resource Tools for Improving Statistical Thinking (ARTIST) Web site was developed to provide high-quality assessment resources for faculty who teach statistics at the tertiary level, but the resources are also useful to statistics teachers at the secondary level. This article describes some of the numerous ARTIST resources and suggests…
2016-01-29
[Extraction residue from a DoD Inspector General report (DODIG-2016-043); only fragments are recoverable: Appendix B, "Improvement in PAR Completion Statistics"; agencies must perform frequent evaluation of compliance with reporting requirements so they can readily identify delinquent past performance efforts; a "Reporting Program" reference dated August 13, 2011; and a mention of the Senate Armed Services Committee.]
Arribas-Bel, Daniel; Patino, Jorge E; Duque, Juan C
2017-01-01
This paper provides evidence on the usefulness of very high spatial resolution (VHR) imagery in gathering socioeconomic information in urban settlements. We use land cover, spectral, structure and texture features extracted from a Google Earth image of Liverpool (UK) to evaluate their potential to predict Living Environment Deprivation at a small statistical area level. We also contribute to the methodological literature on the estimation of socioeconomic indices with remote-sensing data by introducing elements from modern machine learning. In addition to classical approaches such as Ordinary Least Squares (OLS) regression and a spatial lag model, we explore the potential of the Gradient Boost Regressor and Random Forests to improve predictive performance and accuracy. In addition to novel predicting methods, we also introduce tools for model interpretation and evaluation such as feature importance and partial dependence plots, or cross-validation. Our results show that Random Forest proved to be the best model with an R2 of around 0.54, followed by Gradient Boost Regressor with 0.5. Both the spatial lag model and the OLS fall behind with significantly lower performances of 0.43 and 0.3, respectively. PMID:28464010
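A minimal sketch of the model comparison described (the variable names and data are placeholders; the authors' features come from the VHR image, not random numbers):

    import numpy as np
    from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import cross_val_score

    # Placeholder feature matrix (land cover, spectral, structure, texture)
    # and Living Environment Deprivation index per small statistical area.
    X, y = np.random.rand(200, 12), np.random.rand(200)

    for name, model in [("OLS", LinearRegression()),
                        ("GBR", GradientBoostingRegressor()),
                        ("RF", RandomForestRegressor(n_estimators=500))]:
        r2 = cross_val_score(model, X, y, cv=5, scoring="r2").mean()
        print(name, round(r2, 2))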
[Repetitive transcranial magnetic stimulation: A potential therapy for cognitive disorders?]
Nouhaud, C; Sherrard, R M; Belmin, J
2017-03-01
Considering the limited effectiveness of drug treatments in cognitive disorders, the emergence of noninvasive techniques to modify brain function is very interesting. Among these techniques, repetitive transcranial magnetic stimulation (rTMS) can modulate cortical excitability and have potential therapeutic effects on cognition and behaviour. These effects are due to physiological modifications in the stimulated cortical tissue and their associated circuits, which depend on the parameters of stimulation. The objective of this article is to summarize current knowledge on the efficacy of rTMS in cognitive disorders. Previous studies found very encouraging results, with significant improvement of higher brain functions. Nevertheless, these few studies have limitations: small numbers of enrolled patients, lack of verification of the mechanisms of action by brain imaging, insufficiently standardized technique, and variability of the cognitive tests used. It is therefore necessary to perform further studies that identify statistically significant improvement, and to specify the underlying mechanisms of action and the parameters of use of rTMS, before rTMS can be offered as a routine therapy for cognitive dysfunction. Copyright © 2016 Société Nationale Française de Médecine Interne (SNFMI). Published by Elsevier SAS. All rights reserved.
Recruitment of Older Adults: Success May Be in the Details
McHenry, Judith C.; Insel, Kathleen C.; Einstein, Gilles O.; Vidrine, Amy N.; Koerner, Kari M.; Morrow, Daniel G.
2015-01-01
Purpose: Describe recruitment strategies used in a randomized clinical trial of a behavioral prospective memory intervention to improve medication adherence for older adults taking antihypertensive medication. Results: Recruitment strategies represent 4 themes: accessing an appropriate population, communication and trust-building, providing comfort and security, and expressing gratitude. Recruitment activities resulted in 276 participants with a mean age of 76.32 years, and study enrollment included 207 women, 69 men, and 54 persons representing ethnic minorities. Recruitment success was linked to cultivating relationships with community-based organizations, face-to-face contact with potential study participants, and providing service (e.g., blood pressure checks) as an access point to eligible participants. Seventy-two percent of potential participants who completed a follow-up call and met eligibility criteria were enrolled in the study. The attrition rate was 14.34%. Implications: The projected increase in the number of older adults intensifies the need to study interventions that improve health outcomes. The challenge is to recruit sufficient numbers of participants who are also representative of older adults to test these interventions. Failing to recruit a sufficient and representative sample can compromise statistical power and the generalizability of study findings. PMID:22899424
Factors relating to professional self-concept among nurse managers.
Kantek, Filiz; Şimşek, Belkıs
2017-12-01
To investigate professional self-concept among nurse managers in Turkey and the effects of certain variables on it. Professional self-concept plays a significant role in improving certain professional behaviours. Nursing managers have the potential to influence other members of the profession with their attitudes and behaviours. The study was designed as a cross-sectional descriptive study and was conducted with 159 nurse managers in nine different hospitals. The study data were collected with a Personal Information Form and the Professional Self-concept Nursing Inventory, and the data analysis was accomplished with descriptive statistics, Cronbach's alpha coefficients and Chi-squared Automatic Interaction Detector analyses. The mean professional self-concept score of the nurse managers was 3.33 (SD = 0.308). The professional competence subdimension had the highest scores, while the professional satisfaction subdimension had the lowest. The type of hospital was found to influence the professional self-concept of nurses. Nursing managers are visionaries who can potentially influence nursing practices and decisions. Nursing leaders must monitor and administer strategies to improve their professional self-concept. © 2017 John Wiley & Sons Ltd.
NASA Astrophysics Data System (ADS)
Silva-Rodríguez, J.; Cortés, J.; Rodríguez-Osorio, X.; López-Urdaneta, J.; Pardo-Montero, J.; Aguiar, P.; Tsoumpas, C.
2016-10-01
Structural Functional Synergistic Resolution Recovery (SFS-RR) is a technique that uses supplementary structural information from MR or CT to improve the spatial resolution of PET or SPECT images. This wavelet-based method may have a potential impact on the clinical decision-making of brain focal disorders such as refractory epilepsy, since it can produce images with better quantitative accuracy and enhanced detectability. In this work, a method for the iterative application of SFS-RR (iSFS-RR) was first developed and optimized in terms of convergence and input voxel size, and the corrected images were used for the diagnosis of 18 patients with refractory epilepsy. To this end, PET/MR images were clinically evaluated through visual inspection, atlas-based asymmetry indices (AIs) and SPM (Statistical Parametric Mapping) analysis, using uncorrected images and images corrected with SFS-RR and iSFS-RR. Our results showed that the sensitivity can be increased from 78% for uncorrected images, to 84% for SFS-RR and 94% for the proposed iSFS-RR. Thus, the proposed methodology has demonstrated the potential to improve the management of refractory epilepsy patients in the clinical routine.
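For orientation, an atlas-based asymmetry index is commonly computed per region as a normalized left-right difference; a minimal sketch of one common definition (assumed here, as the abstract does not spell out its formula):

    def asymmetry_index(left_uptake, right_uptake):
        # Percent asymmetry between homologous left/right regions;
        # a large |AI| flags a candidate epileptogenic focus.
        return 100.0 * 2.0 * (left_uptake - right_uptake) / (left_uptake + right_uptake)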
The review and results of different methods for facial recognition
NASA Astrophysics Data System (ADS)
Le, Yifan
2017-09-01
In recent years, facial recognition has drawn much attention due to its wide range of potential applications. As a unique technology in biometric identification, facial recognition represents a significant improvement because it can operate without the cooperation of the people under detection. Hence, facial recognition is expected to be adopted in defense systems, medical detection, human behavior understanding, and other domains. Several theories and methods have been established to make progress in facial recognition: (1) a novel two-stage facial landmark localization method is proposed which achieves more accurate facial localization on a specific database; (2) a statistical face frontalization method is proposed which outperforms state-of-the-art methods for face landmark localization; (3) a general facial landmark detection algorithm is proposed to handle images with severe occlusion and images with large head poses; (4) three methods are proposed for face alignment, including a shape-augmented regression method, a pose-indexed multi-view method, and a learning-based method that regresses local binary features. The aim of this paper is to analyze previous work on different aspects of facial recognition, focusing on concrete methods and their performance on various databases. In addition, some improvement measures and suggestions for potential applications are put forward.
System level modeling and component level control of fuel cells
NASA Astrophysics Data System (ADS)
Xue, Xingjian
This dissertation investigates fuel cell systems and the related technologies in three aspects: (1) system-level dynamic modeling of both the PEM fuel cell (PEMFC) and the solid oxide fuel cell (SOFC); (2) development of a condition monitoring scheme for PEM fuel cell systems using a model-based statistical method; and (3) development of strategies and algorithms for precision control with potential application in energy systems. The dissertation first presents a system-level dynamic modeling strategy for PEM fuel cells. It is well known that water plays a critical role in PEM fuel cell operations. It makes the membrane function appropriately and improves the durability. The low temperature operating conditions, however, impose modeling difficulties in characterizing the liquid-vapor two-phase change phenomenon, which becomes even more complex under dynamic operating conditions. This dissertation proposes an innovative method to characterize this phenomenon, and builds a comprehensive model for the PEM fuel cell at the system level. The model features the complete characterization of multi-physics dynamic coupling effects with the inclusion of dynamic phase change. The model is validated using Ballard stack experimental results from the open literature. The system behavior and the internal coupling effects are also investigated using this model under various operating conditions. Anode-supported tubular SOFC is also investigated in the dissertation. While the Nernst potential plays a central role in characterizing the electrochemical performance, the traditional Nernst equation may lead to incorrect analysis results under dynamic operating conditions due to the current reverse flow phenomenon. This dissertation presents a systematic study in this regard to incorporate a modified Nernst potential expression and the heat/mass transfer into the analysis. The model is used to investigate the limitations and optimal results of various operating conditions; it can also be utilized to perform the optimal design of tubular SOFCs. With the system-level dynamic model as a basis, a framework for the robust, online monitoring of PEM fuel cells is developed in the dissertation. The monitoring scheme employs a Hotelling T2-based statistical scheme to handle the measurement noise and system uncertainties, and identifies fault conditions through a series of self-checking and conformal tests. A statistical sampling strategy is also utilized to improve the computational efficiency. Fuel/gas flow control is the fundamental operation for fuel cell energy systems. In the final part of the dissertation, a high-precision and robust tracking control scheme using a piezoelectric actuator circuit with direct hysteresis compensation is developed. The key characteristics of the developed control algorithm include nonlinear continuous control action with an adaptive boundary layer strategy.
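The Hotelling T2 monitoring idea mentioned above reduces to a Mahalanobis-type distance of a new measurement vector from a healthy-operation reference; a minimal sketch (illustrative only, not the dissertation's implementation):

    import numpy as np

    def hotelling_t2(x, mean_ref, cov_ref):
        # T^2 = (x - mu)' S^{-1} (x - mu), computed against the mean and
        # covariance estimated from healthy-operation training data.
        diff = x - mean_ref
        return float(diff @ np.linalg.solve(cov_ref, diff))

    # A fault is flagged when T^2 exceeds a control limit, e.g. a
    # chi-square quantile when the covariance is well estimated:
    # from scipy.stats import chi2; limit = chi2.ppf(0.99, df=len(x))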
Statistics usage in the American Journal of Obstetrics and Gynecology: has anything changed?
Welch, Gerald E; Gabbe, Steven G
2002-03-01
Our purpose was to compare statistical listing and usage in articles published in the American Journal of Obstetrics and Gynecology in 1994 with those published in 1999. All papers included in the obstetrics, fetus-placenta-newborn, and gynecology sections and the transactions of societies sections of the January through June 1999 issues of the American Journal of Obstetrics and Gynecology (volume 180, numbers 1 to 6) were reviewed for statistical usage. Each paper was given a rating for the cataloging of applied statistics and a rating for the appropriateness of statistical usage, when possible. These results were compared with the data collected in a similar review of articles published in 1994. Of the 238 available articles, 195 contained statistics and were reviewed. In comparison to the articles published in 1994, there were significantly more articles that completely cataloged applied statistics (74.3% vs 47.4%) (P <.0001), and there was a significant improvement in appropriateness of statistical usage (56.4% vs 30.3%) (P <.0001). Changes in the Instructions to Authors regarding the description of applied statistics, and probable changes in the behavior of researchers and Editors, have led to an improvement in the quality of statistics in papers published in the American Journal of Obstetrics and Gynecology.
Evaluation of a New Mean Scaled and Moment Adjusted Test Statistic for SEM
ERIC Educational Resources Information Center
Tong, Xiaoxiao; Bentler, Peter M.
2013-01-01
Recently a new mean scaled and skewness adjusted test statistic was developed for evaluating structural equation models in small samples and with potentially nonnormal data, but this statistic has received only limited evaluation. The performance of this statistic is compared to normal theory maximum likelihood and 2 well-known robust test…
Two strategies to engineer flexible loops for improved enzyme thermostability
Yu, Haoran; Yan, Yihan; Zhang, Cheng; Dalby, Paul A.
2017-01-01
Flexible sites are potential targets for engineering the stability of enzymes. Nevertheless, the success rate of the rigidifying flexible sites (RFS) strategy is still low due to a limited understanding of how to determine the best mutation candidates. In this study, two parallel strategies were applied to identify mutation candidates within the flexible loops of Escherichia coli transketolase (TK). The first was a “back to consensus mutations” approach, and the second was computational design based on ΔΔG calculations in Rosetta. Forty-nine single variants were generated and characterised experimentally. From these, three single variants (I189H, A282P, D143K) were found to be more thermostable than wild-type TK. The combination of A282P with H192P, a variant constructed previously, resulted in the best all-round variant with a 3-fold improved half-life at 60 °C, 5-fold increased specific activity at 65 °C, 1.3-fold improved kcat and a Tm increased by 5 °C above that of wild type. Based on a statistical analysis of the stability changes for all variants, the qualitative prediction accuracy of the Rosetta program reached 65.3%. Both of the strategies investigated were useful in guiding mutation candidates to flexible loops, and have the potential to be used for other enzymes. PMID:28145457
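As a consistency check on the quoted accuracy (an inference from the reported numbers, not a tally from the paper):

    32/49 \approx 0.653

i.e. roughly 32 of the 49 characterised single variants would have had their stability change classified in the correct direction by Rosetta.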