Sample records for current statistics show

  1. Statistical learning and language acquisition

    PubMed Central

    Romberg, Alexa R.; Saffran, Jenny R.

    2011-01-01

    Human learners, including infants, are highly sensitive to structure in their environment. Statistical learning refers to the process of extracting this structure. A major question in language acquisition in the past few decades has been the extent to which infants use statistical learning mechanisms to acquire their native language. There have been many demonstrations showing infants’ ability to extract structures in linguistic input, such as the transitional probability between adjacent elements. This paper reviews current research on how statistical learning contributes to language acquisition. Current research is extending the initial findings of infants’ sensitivity to basic statistical information in many different directions, including investigating how infants represent regularities, learn about different levels of language, and integrate information across situations. These current directions emphasize studying statistical language learning in context: within language, within the infant learner, and within the environment as a whole. PMID:21666883
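
    The transitional-probability statistic mentioned above is easy to make concrete. The sketch below, a minimal illustration rather than a reconstruction of any reviewed study, counts adjacent-syllable co-occurrences in an artificial syllable stream and estimates P(next | current); the three-syllable "words" and the stream are invented for the example.

    ```python
    import random
    from collections import Counter

    def transitional_probabilities(stream):
        """Estimate P(next | current) for adjacent elements in a sequence."""
        pair_counts = Counter(zip(stream, stream[1:]))
        first_counts = Counter(stream[:-1])
        return {(a, b): n / first_counts[a] for (a, b), n in pair_counts.items()}

    # Toy syllable stream built from three artificial "words" (hypothetical example).
    words = ["bi", "da", "ku"], ["pa", "do", "ti"], ["go", "la", "bu"]
    random.seed(0)
    stream = [s for _ in range(200) for s in random.choice(words)]

    tp = transitional_probabilities(stream)
    print(tp[("bi", "da")])            # within-word transition, close to 1.0
    print(tp.get(("ku", "pa"), 0.0))   # across a word boundary, much lower
    ```

    Within-word transitions come out near 1.0 while transitions across word boundaries come out near 1/3, which is the contrast infants are thought to exploit when segmenting words.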

  2. The timber industries of Kentucky, 1986

    Treesearch

    Eric H. Wharton; Stephen C. Kayse; Robert L. Nevel, Jr.; Robert L. Nevel

    1992-01-01

    A statistical report based on a survey of primary wood manufacturers using wood from Kentucky. Contains statistics on production and consumption of industrial forest products by species, geographic units, and state; and production and disposition of manufacturing residues. Includes graphics and statistical tables showing current and historical data.

  3. Statistical Knowledge and the Over-Interpretation of Student Evaluations of Teaching

    ERIC Educational Resources Information Center

    Boysen, Guy A.

    2017-01-01

    Research shows that teachers interpret small differences in student evaluations of teaching as meaningful even when available statistical information indicates that the differences are not reliable. The current research explored the effect of statistical training on college teachers' tendency to over-interpret student evaluation differences. A…

  4. What can we learn from noise? — Mesoscopic nonequilibrium statistical physics —

    PubMed Central

    KOBAYASHI, Kensuke

    2016-01-01

    Mesoscopic systems — small electric circuits working in the quantum regime — offer us a unique experimental stage to explore quantum transport in a tunable and precise way. The purpose of this Review is to show how they can contribute to statistical physics. We introduce the significance of fluctuation, or equivalently noise, as noise measurement enables us to address the fundamental aspects of a physical system. The significance of the fluctuation theorem (FT) in statistical physics is noted. We explain what information can be deduced from the current noise measurement in mesoscopic systems. As an important application of the noise measurement to statistical physics, we describe our experimental work on the current and current noise in an electron interferometer, which is the first experimental test of FT in the quantum regime. Our attempt will shed new light on the research field of mesoscopic quantum statistical physics. PMID:27477456

  5. What can we learn from noise? - Mesoscopic nonequilibrium statistical physics.

    PubMed

    Kobayashi, Kensuke

    2016-01-01

    Mesoscopic systems - small electric circuits working in the quantum regime - offer us a unique experimental stage to explore quantum transport in a tunable and precise way. The purpose of this Review is to show how they can contribute to statistical physics. We introduce the significance of fluctuation, or equivalently noise, as noise measurement enables us to address the fundamental aspects of a physical system. The significance of the fluctuation theorem (FT) in statistical physics is noted. We explain what information can be deduced from the current noise measurement in mesoscopic systems. As an important application of the noise measurement to statistical physics, we describe our experimental work on the current and current noise in an electron interferometer, which is the first experimental test of FT in the quantum regime. Our attempt will shed new light on the research field of mesoscopic quantum statistical physics.

  6. Sparse approximation of currents for statistics on curves and surfaces.

    PubMed

    Durrleman, Stanley; Pennec, Xavier; Trouvé, Alain; Ayache, Nicholas

    2008-01-01

    Computing, processing, and visualizing statistics on shapes such as curves or surfaces is a real challenge, with applications ranging from medical image analysis to computational geometry. Modelling such geometrical primitives with currents avoids both feature-based approaches and point-correspondence methods. This framework has proved powerful for registering brain surfaces and for measuring geometrical invariants. However, while state-of-the-art methods perform pairwise registrations efficiently, new numerical schemes are required to process groupwise statistics, because the complexity increases as the size of the database grows. Statistics such as the mean and principal modes of a set of shapes often have a heavy and highly redundant representation. We therefore propose to find an adapted basis on which the mean and principal modes have a sparse decomposition. Besides the computational improvement, this sparse representation offers a way to visualize and interpret statistics on currents. Experiments show the relevance of the approach on 34 sets of 70 sulcal lines and on 50 sets of 10 meshes of deep brain structures.

  7. Distribution, orientation and scales of the field-aligned currents measured by Swarm

    NASA Astrophysics Data System (ADS)

    Yang, J.; Dunlop, M. W.

    2016-12-01

    We have statistically studied the R1, R2 and net field-aligned currents using the FAC data of the Swarm satellites. We have also investigated the statistical, dual-spacecraft correlations of field-aligned current signatures between two Swarm spacecraft (A and C). For the first time we have inferred the orientations of the FAC current sheets directly, using the maximum correlations obtained from sliding data segments, which show clear trends in magnetic local time (MLT). For comparison, we also apply the MVAB method. To explore the scale and variability of the current sheet superposition, we investigate the MLT dependence of the maximum correlations in different time-shift or longitude-shift bins.

  8. Making statistical inferences about software reliability

    NASA Technical Reports Server (NTRS)

    Miller, Douglas R.

    1988-01-01

    Failure times of software undergoing random debugging can be modelled as order statistics of independent but nonidentically distributed exponential random variables. Using this model inferences can be made about current reliability and, if debugging continues, future reliability. This model also shows the difficulty inherent in statistical verification of very highly reliable software such as that used by digital avionics in commercial aircraft.
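
    As a rough illustration of the model class described (failure times as order statistics of independent, non-identically distributed exponentials), the sketch below simulates debugging of a hypothetical program and estimates current reliability from the residual failure rate. The fault count, rates, and times are invented for the example and are not taken from the report.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical program with N latent faults; fault i is detected after an
    # Exponential(rate_i) amount of debugging time (rates differ across faults).
    N = 50
    rates = rng.uniform(0.01, 1.0, size=N)          # per-fault detection rates
    detection_times = rng.exponential(1.0 / rates)  # independent, non-identical exponentials

    # Observed failure times during debugging are the order statistics.
    failure_times = np.sort(detection_times)

    # "Current reliability" after debugging for time T: only undetected faults
    # still contribute, so the residual rate is the sum of the rates of faults
    # whose detection time exceeds T (memorylessness of the exponential).
    T = 5.0
    residual_rate = rates[detection_times > T].sum()
    mission = 1.0
    print(f"estimated P(no failure in next {mission} h) = {np.exp(-mission * residual_rate):.3f}")
    print(f"faults remaining after T={T}: {(detection_times > T).sum()} of {N}")
    ```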

  9. 31 CFR 9.5 - Applications for investigation.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., both past and current. (c) Statistical material presented should be on a calendar-year basis for... domestic industry concerned with the article in question. (4) Pertinent statistics showing the quantities... competition created by imports of the article in question. (6) The effect, if any, of imports of the article...

  10. Hydrophobicity and leakage current statistics of polymeric insulators long-term exposed to coastal contamination

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Soerqvist, T.; Vlastos, A.E.

    1996-12-31

    The hydrophobicity of polymeric insulators is crucial for their performance. This paper reports the hydrophobicity and the peak leakage current statistics of one porcelain, two ethylene-propylene-diene monomer (EPDM) and four silicone rubber (SIR) commercially available insulators. The insulators have been energized with 130 kV rms phase-to-ground AC voltage under identical outdoor conditions for more than seven years. The results presented show that under wet and polluted conditions the hydrophilic EPDM rubber insulators develop high leakage currents and substantial arcing. During a typical salt-storm the arcing amplitude of the EPDM rubber insulators is at least twice as high as that of the porcelain insulator. The SIR insulators, on the other hand, preserve a high degree of hydrophobicity after more than seven years in service and maintain very low leakage currents. However, the results show that during heavy salt contaminated conditions a highly stressed SIR insulator can temporarily lose its hydrophobicity and thereby develop considerable surface arcing.

  11. Analysis of ground-water data for selected wells near Holloman Air Force Base, New Mexico, 1950-95

    USGS Publications Warehouse

    Huff, G.F.

    1996-01-01

    Ground-water-level, ground-water-withdrawal, and ground-water-quality data were evaluated for trends. Holloman Air Force Base is located in the west-central part of Otero County, New Mexico. Ground-water-data analyses include assembly and inspection of U.S. Geological Survey and Holloman Air Force Base data, including ground-water-level data for public-supply and observation wells and withdrawal and water-quality data for public-supply wells in the area. Well Douglas 4 shows a statistically significant decreasing trend in water levels for 1972-86 and a statistically significant increasing trend in water levels for 1986-90. Water levels in wells San Andres 5 and San Andres 6 show statistically significant decreasing trends for 1972-93 and 1981-89, respectively. A mixture of statistically significant increasing trends, statistically significant decreasing trends, and lack of statistically significant trends over periods ranging from the early 1970's to the early 1990's is indicated for the Boles wells and wells near the Boles wells. Well Boles 5 shows a statistically significant increasing trend in water levels for 1981-90. Well Boles 5 and well 17S.09E.25.343 show no statistically significant trends in water levels for 1990-93 and 1988-93, respectively. For 1986-93, well Frenchy 1 shows a statistically significant decreasing trend in water levels. Ground-water withdrawal from the San Andres and Douglas wells regularly exceeded estimated ground-water recharge from San Andres Canyon for 1963-87. For 1951-57 and 1960-86, ground-water withdrawal from the Boles wells regularly exceeded total estimated ground-water recharge from Mule, Arrow, and Lead Canyons. Ground-water withdrawal from the San Andres and Douglas wells and from the Boles wells nearly equaled estimated ground-water recharge for 1989-93 and 1986-93, respectively. For 1987-93, ground-water withdrawal from the Escondido well regularly exceeded estimated ground-water recharge from Escondido Canyon, and ground-water withdrawal from the Frenchy wells regularly exceeded total estimated ground-water recharge from Dog and Deadman Canyons. Water-quality samples were collected from selected Douglas, San Andres, and Boles public-supply wells from December 1994 to February 1995. Concentrations of dissolved nitrate show the most consistent increases between current and historical data. Current concentrations of dissolved nitrate are greater than historical concentrations in 7 of 10 wells.
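
    The report refers repeatedly to statistically significant trends without reproducing the test here. As one common choice for water-level series, the sketch below applies a Mann-Kendall-style trend test (Kendall's tau against time) to synthetic annual water levels; both the data and the choice of test are illustrative assumptions, not the USGS analysis itself.

    ```python
    import numpy as np
    from scipy.stats import kendalltau

    rng = np.random.default_rng(42)

    # Synthetic annual water levels (hypothetical, not USGS data): slow decline plus noise.
    years = np.arange(1972, 1987)
    levels = 100.0 - 0.3 * (years - years[0]) + rng.normal(0.0, 0.8, size=years.size)

    # Mann-Kendall-style trend test: Kendall's tau between time and the series.
    tau, p_value = kendalltau(years, levels)
    trend = "decreasing" if tau < 0 else "increasing"
    print(f"tau = {tau:.2f}, p = {p_value:.4f}")
    if p_value < 0.05:
        print(f"statistically significant {trend} trend")
    else:
        print("no statistically significant trend")
    ```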

  12. Money Income and Poverty Status of Families and Persons in the United States: 1985. (Advance Data from the March 1986 Current Population Survey).

    ERIC Educational Resources Information Center

    Current Population Reports, 1986

    1986-01-01

    Analysis of information gained from the March 1986 Current Population Survey (CPS) conducted by the Bureau of the Census shows the following results for the year 1985: (1) median family money income continued to move ahead of inflation; (2) the median earnings of men showed no statistically significant change from 1984, but the earnings of women…

  13. I Am a Chameleon in Pearls: How Three Select Female Superintendents Perceive Their Professional Lives

    ERIC Educational Resources Information Center

    Ryan, Catherine Agnes

    2012-01-01

    The public school superintendent is the least progressive position in education at integrating women and balancing the scales of equitable representation. Statistical data indicates there are far fewer females than males serving as superintendents. Current statistics show women make up: 1) over 70 percent of all public school educators; 2) nearly…

  14. Statistical Prediction of Sea Ice Concentration over Arctic

    NASA Astrophysics Data System (ADS)

    Kim, Jongho; Jeong, Jee-Hoon; Kim, Baek-Min

    2017-04-01

    In this study, a statistical method that predicts sea ice concentration (SIC) over the Arctic is developed. We first calculate the Season-reliant Empirical Orthogonal Functions (S-EOFs) of monthly Arctic SIC from Nimbus-7 SMMR and DMSP SSM/I-SSMIS passive microwave data, which contain the seasonal cycles (12 months long) of dominant SIC anomaly patterns. Then, the current SIC state index is determined by projecting the observed SIC anomalies for the latest 12 months onto the S-EOFs. Assuming the current SIC anomalies follow the spatio-temporal evolution in the S-EOFs, we project the future (up to 12 months) SIC anomalies by multiplying the state index by the corresponding S-EOF and summing over modes. The predictive skill is assessed by hindcast experiments initialized at all months for 1980-2010. When the predictive skill of the statistical model is compared with that of NCEP CFS v2, the statistical model shows higher skill in predicting sea ice concentration and extent.
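
    A minimal sketch of the projection step described above, using random numbers in place of the SMMR/SSM/I fields: yearly 12-month anomaly sequences are decomposed with an SVD to obtain S-EOF-like patterns, the latest 12 months are projected onto them to get a state index, and the index-weighted patterns serve as the forecast. Array shapes, the mode count, and the data are invented for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic stand-in for monthly SIC anomalies: (years, 12 months, grid points).
    n_years, n_months, n_grid = 30, 12, 200
    X = rng.normal(size=(n_years, n_months, n_grid))

    # Season-reliant EOFs: treat each year's full 12-month evolution as one sample
    # and decompose the (years x 12*grid) anomaly matrix with an SVD.
    A = X.reshape(n_years, n_months * n_grid)
    A = A - A.mean(axis=0)
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    n_modes = 5
    seofs = Vt[:n_modes].reshape(n_modes, n_months, n_grid)     # 12-month-long patterns

    # Current state index: project the latest observed 12 months onto each S-EOF.
    latest = X[-1] - X.mean(axis=0)                             # (12, grid) anomaly
    index = np.tensordot(seofs, latest, axes=([1, 2], [0, 1]))  # one coefficient per mode

    # Forecast: assume the anomaly keeps evolving along the S-EOF seasonal cycle,
    # so the next 12 months are the index-weighted sum of the S-EOF patterns.
    forecast = np.tensordot(index, seofs, axes=(0, 0))          # (12, grid)
    print(forecast.shape)
    ```

    In the real method the projection is aligned with the calendar so that the forecast continues the seasonal evolution encoded in each S-EOF; that bookkeeping is omitted here.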

  15. Why Current Statistics of Complementary Alternative Medicine Clinical Trials is Invalid.

    PubMed

    Pandolfi, Maurizio; Carreras, Giulia

    2018-06-07

    It is not sufficiently known that frequentist statistics cannot provide direct information on the probability that the research hypothesis tested is correct. The error resulting from this misunderstanding is compounded when the hypotheses under scrutiny have precarious scientific bases, as those of complementary alternative medicine (CAM) generally do. In such cases, it is mandatory to use inferential statistics that take into account the prior probability that the hypothesis tested is true, such as Bayesian statistics. The authors show that, under such circumstances, no real statistical significance can be achieved in CAM clinical trials. In this respect, CAM trials involving human material are also hardly defensible from an ethical viewpoint.
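
    The paper's central point, that a "significant" frequentist result says little when the prior probability of the hypothesis is low, can be made concrete with Bayes' theorem. The prior, alpha, and power values below are illustrative assumptions, not numbers from the paper.

    ```python
    def posterior_given_significant(prior, alpha=0.05, power=0.8):
        """P(hypothesis true | p < alpha) via Bayes' theorem."""
        true_pos = prior * power          # true effect and the trial detects it
        false_pos = (1 - prior) * alpha   # no effect, but the trial is "significant" anyway
        return true_pos / (true_pos + false_pos)

    # Hypothetical priors: a well-grounded hypothesis vs. an implausible CAM-style one.
    for prior in (0.5, 0.1, 0.01):
        print(f"prior={prior:.2f} -> P(true | significant) = {posterior_given_significant(prior):.2f}")
    ```

    With a prior of 0.01, roughly the situation for a scientifically implausible hypothesis, fewer than one in seven "significant" results would correspond to a true effect under these assumptions.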

  16. Attitudes toward statistics in medical postgraduates: measuring, evaluating and monitoring.

    PubMed

    Zhang, Yuhai; Shang, Lei; Wang, Rui; Zhao, Qinbo; Li, Chanjuan; Xu, Yongyong; Su, Haixia

    2012-11-23

    In medical training, statistics is considered a very difficult course to learn and teach. Current studies have found that students' attitudes toward statistics can influence their learning process. Measuring, evaluating and monitoring the changes of students' attitudes toward statistics are important. Few studies have focused on the attitudes of postgraduates, especially medical postgraduates. Our purpose was to understand current attitudes regarding statistics held by medical postgraduates and explore their effects on students' achievement. We also wanted to explore the influencing factors and the sources of these attitudes and monitor their changes after a systematic statistics course. A total of 539 medical postgraduates enrolled in a systematic statistics course completed the pre-form of the Survey of Attitudes Toward Statistics -28 scale, and 83 postgraduates were selected randomly from among them to complete the post-form scale after the course. Most medical postgraduates held positive attitudes toward statistics, but they thought statistics was a very difficult subject. The attitudes mainly came from experiences in a former statistical or mathematical class. Age, level of statistical education, research experience, specialty and mathematics basis may influence postgraduate attitudes toward statistics. There were significant positive correlations between course achievement and attitudes toward statistics. In general, student attitudes showed negative changes after completing a statistics course. The importance of student attitudes toward statistics must be recognized in medical postgraduate training. To make sure all students have a positive learning environment, statistics teachers should measure their students' attitudes and monitor their change of status during a course. Some necessary assistance should be offered for those students who develop negative attitudes.

  17. The unrealized promise of infant statistical word-referent learning

    PubMed Central

    Smith, Linda B.; Suanda, Sumarga H.; Yu, Chen

    2014-01-01

    Recent theory and experiments offer a new solution as to how infant learners may break into word learning, by using cross-situational statistics to find the underlying word-referent mappings. Computational models demonstrate the in-principle plausibility of this statistical learning solution and experimental evidence shows that infants can aggregate and make statistically appropriate decisions from word-referent co-occurrence data. We review these contributions and then identify the gaps in current knowledge that prevent a confident conclusion about whether cross-situational learning is the mechanism through which infants break into word learning. We propose an agenda to address that gap that focuses on detailing the statistics in the learning environment and the cognitive processes that make use of those statistics. PMID:24637154
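
    A minimal sketch of cross-situational word-referent learning: each scene is ambiguous on its own, but co-occurrence counts accumulated across scenes single out the correct mappings. The toy vocabulary and scene generator are invented for illustration and do not model the infant experiments.

    ```python
    import random
    from collections import defaultdict

    random.seed(3)
    vocab = {"dog": "DOG", "ball": "BALL", "cup": "CUP", "shoe": "SHOE"}

    # Each "scene" shows two referents and the speaker names both, in random order,
    # so any single scene is ambiguous about which word maps to which referent.
    cooc = defaultdict(lambda: defaultdict(int))
    for _ in range(100):
        words = random.sample(list(vocab), 2)
        referents = [vocab[w] for w in words]
        random.shuffle(referents)
        for w in words:
            for r in referents:
                cooc[w][r] += 1

    # Aggregated across scenes, the correct referent co-occurs with its word every
    # time, while spurious pairings occur only when the two items happen to co-occur.
    for w in vocab:
        best = max(cooc[w], key=cooc[w].get)
        print(w, "->", best, dict(cooc[w]))
    ```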

  18. Statistical Forecasting of Current and Future Circum-Arctic Ground Temperatures and Active Layer Thickness

    NASA Astrophysics Data System (ADS)

    Aalto, J.; Karjalainen, O.; Hjort, J.; Luoto, M.

    2018-05-01

    Mean annual ground temperature (MAGT) and active layer thickness (ALT) are key to understanding the evolution of the ground thermal state across the Arctic under climate change. Here a statistical modeling approach is presented to forecast current and future circum-Arctic MAGT and ALT in relation to climatic and local environmental factors, at spatial scales unreachable with contemporary transient modeling. After deploying an ensemble of multiple statistical techniques, distance-blocked cross validation between observations and predictions suggested excellent and reasonable transferability of the MAGT and ALT models, respectively. The MAGT forecasts indicated currently suitable conditions for permafrost to prevail over an area of 15.1 ± 2.8 × 10⁶ km². This extent is likely to dramatically contract in the future, as the results showed consistent, but region-specific, changes in ground thermal regime due to climate change. The forecasts provide new opportunities to assess future Arctic changes in ground thermal state and biogeochemical feedback.

  19. Stay Smart: Lost Weight--Childhood Obesity and Health Education

    ERIC Educational Resources Information Center

    Kosa-Postl, Linda

    2006-01-01

    Prevention is the key strategy for controlling the current epidemic levels of childhood obesity. Current statistics show that obesity has more than doubled for preschool children aged 2-5 years and adolescents aged 12-19 years, and it has more than tripled for children aged 6-11 years. It is generally recognized that nutrition education for the…

  20. Electron Heating at Kinetic Scales in Magnetosheath Turbulence

    NASA Technical Reports Server (NTRS)

    Chasapis, Alexandros; Matthaeus, W. H.; Parashar, T. N.; Lecontel, O.; Retino, A.; Breuillard, H.; Khotyaintsev, Y.; Vaivads, A.; Lavraud, B.; Eriksson, E.

    2017-01-01

    We present a statistical study of coherent structures at kinetic scales, using data from the Magnetospheric Multiscale mission in the Earths magnetosheath. We implemented the multi-spacecraft partial variance of increments (PVI) technique to detect these structures, which are associated with intermittency at kinetic scales. We examine the properties of the electron heating occurring within such structures. We find that, statistically, structures with a high PVI index are regions of significant electron heating. We also focus on one such structure, a current sheet, which shows some signatures consistent with magnetic reconnection. Strong parallel electron heating coincides with whistler emissions at the edges of the current sheet.
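
    A single-spacecraft sketch of the PVI index on a synthetic magnetic-field series is shown below: PVI is the magnitude of the field increment at a given lag, normalized by its root-mean-square value. The multi-spacecraft variant used with MMS combines increments across spacecraft and is not reproduced here; the series, lag, and threshold are illustrative assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    # Synthetic 3-component magnetic-field series (stand-in for MMS data).
    n = 20000
    B = np.cumsum(rng.normal(size=(n, 3)), axis=0) * 0.05

    def pvi(B, lag):
        """Partial variance of increments: |dB| normalized by its rms over the interval."""
        dB = B[lag:] - B[:-lag]
        mag = np.linalg.norm(dB, axis=1)
        return mag / np.sqrt(np.mean(mag ** 2))

    index = pvi(B, lag=10)
    # High-PVI points flag intermittent, coherent structures (e.g. current sheets).
    threshold = 3.0
    print(f"fraction of samples with PVI > {threshold}: {(index > threshold).mean():.4f}")
    ```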

  1. Fluctuation Relations for Currents

    NASA Astrophysics Data System (ADS)

    Sinitsyn, Nikolai; Akimov, Alexei; Chernyak, Vladimir; Chertkov, Michael

    2011-03-01

    We consider a non-equilibrium statistical system on a graph or a network. Identical particles are injected, interact with each other, traverse, and leave the graph in a stochastic manner described in terms of Poisson rates, possibly strongly dependent on time and instantaneous occupation numbers at the nodes of the graph. We show that the system demonstrates a profound statistical symmetry, leading to new Fluctuation Relations that originate from the supersymmetry and the principle of the geometric universality of currents rather than from the relations between probabilities of forward and reverse trajectories. NSF/ECCS-0925618, NSF/CHE-0808910 and DOE at LANL under Contract No. DE-AC52-06NA25396.

  2. Attitudes toward statistics in medical postgraduates: measuring, evaluating and monitoring

    PubMed Central

    2012-01-01

    Background In medical training, statistics is considered a very difficult course to learn and teach. Current studies have found that students’ attitudes toward statistics can influence their learning process. Measuring, evaluating and monitoring the changes of students’ attitudes toward statistics are important. Few studies have focused on the attitudes of postgraduates, especially medical postgraduates. Our purpose was to understand current attitudes regarding statistics held by medical postgraduates and explore their effects on students’ achievement. We also wanted to explore the influencing factors and the sources of these attitudes and monitor their changes after a systematic statistics course. Methods A total of 539 medical postgraduates enrolled in a systematic statistics course completed the pre-form of the Survey of Attitudes Toward Statistics −28 scale, and 83 postgraduates were selected randomly from among them to complete the post-form scale after the course. Results Most medical postgraduates held positive attitudes toward statistics, but they thought statistics was a very difficult subject. The attitudes mainly came from experiences in a former statistical or mathematical class. Age, level of statistical education, research experience, specialty and mathematics basis may influence postgraduate attitudes toward statistics. There were significant positive correlations between course achievement and attitudes toward statistics. In general, student attitudes showed negative changes after completing a statistics course. Conclusions The importance of student attitudes toward statistics must be recognized in medical postgraduate training. To make sure all students have a positive learning environment, statistics teachers should measure their students’ attitudes and monitor their change of status during a course. Some necessary assistance should be offered for those students who develop negative attitudes. PMID:23173770

  3. LORETA imaging of P300 in schizophrenia with individual MRI and 128-channel EEG.

    PubMed

    Pae, Ji Soo; Kwon, Jun Soo; Youn, Tak; Park, Hae-Jeong; Kim, Myung Sun; Lee, Boreom; Park, Kwang Suk

    2003-11-01

    We investigated the characteristics of P300 generators in schizophrenics by using voxel-based statistical parametric mapping of current density images. P300 generators, produced by a rare target tone of 1500 Hz (15%) under a frequent nontarget tone of 1000 Hz (85%), were measured in 20 right-handed schizophrenics and 21 controls. Low-resolution electromagnetic tomography (LORETA), using a realistic head model of the boundary element method based on individual MRI, was applied to the 128-channel EEG. Three-dimensional current density images were reconstructed from the LORETA intensity maps that covered the whole cortical gray matter. Spatial normalization and intensity normalization of the smoothed current density images were used to reduce anatomical variance and subject-specific global activity, and statistical parametric mapping (SPM) was applied for the statistical analysis. We found that the sources of P300 were consistently localized at the left superior parietal area in normal subjects, while those of schizophrenics were diversely distributed. Upon statistical comparison, schizophrenics, with globally reduced current densities, showed a significant P300 current density reduction in the left medial temporal area and in the left inferior parietal area, while both left prefrontal and right orbitofrontal areas were relatively activated. The left parietotemporal area was found to correlate negatively with Positive and Negative Syndrome Scale total scores of schizophrenic patients. In conclusion, the reduced and increased areas of current density in schizophrenic patients suggest that the medial temporal and frontal areas contribute to the pathophysiology of schizophrenia, in particular to an abnormality of the frontotemporal circuitry.

  4. Non-invasive brain stimulation and computational models in post-stroke aphasic patients: single session of transcranial magnetic stimulation and transcranial direct current stimulation. A randomized clinical trial.

    PubMed

    Santos, Michele Devido Dos; Cavenaghi, Vitor Breseghello; Mac-Kay, Ana Paula Machado Goyano; Serafim, Vitor; Venturi, Alexandre; Truong, Dennis Quangvinh; Huang, Yu; Boggio, Paulo Sérgio; Fregni, Felipe; Simis, Marcel; Bikson, Marom; Gagliardi, Rubens José

    2017-01-01

    Patients undergoing the same neuromodulation protocol may present different responses. Computational models may help in understanding such differences. The aims of this study were, firstly, to compare the performance of aphasic patients in naming tasks before and after one session of transcranial direct current stimulation (tDCS), transcranial magnetic stimulation (TMS) and sham, and analyze the results between these neuromodulation techniques; and secondly, through a computational model of the cortex and surrounding tissues, to assess current flow distribution and responses among patients who received tDCS and presented different levels of results from naming tasks. Prospective, descriptive, qualitative and quantitative, double blind, randomized and placebo-controlled study conducted at Faculdade de Ciências Médicas da Santa Casa de São Paulo. Patients with aphasia received one session of tDCS, TMS or sham stimulation. The time taken to name pictures and the response time were evaluated before and after neuromodulation. Selected patients from the first intervention underwent a computational model stimulation procedure that simulated tDCS. The results did not indicate any statistically significant differences from before to after the stimulation. The computational models showed different current flow distributions. The present study did not show any statistically significant difference between tDCS, TMS and sham stimulation regarding naming tasks. The patients' responses to the computational model showed different patterns of current distribution.

  5. Current and future health care professionals attitudes toward and knowledge of statistics: How confidence influences learning.

    PubMed

    Baghi, Heibatollah; Kornides, Melanie L

    2013-01-01

    Health care professionals require some understanding of statistics to successfully implement evidence-based practice. Developing competency in statistical reasoning is necessary for students training in health care administration, research, and clinical care. Recently, the interest in healthcare professionals' attitudes toward statistics has increased substantially due to evidence that these attitudes can hinder professionals in developing an understanding of statistical concepts. In this study, we analyzed pre- and post-instruction attitudes towards and knowledge of statistics obtained from health science graduate students, including nurses and nurse practitioners, enrolled in an introductory graduate course in statistics (n = 165). Results show that the students already held generally positive attitudes toward statistics at the beginning of the course. However, these attitudes, along with the students' statistical proficiency, improved after 10 weeks of instruction. The results have implications for curriculum design and delivery methods as well as for health professionals' effective use of statistics in critically evaluating and utilizing research in their practices.

  6. Current and future health care professionals attitudes toward and knowledge of statistics: How confidence influences learning

    PubMed Central

    Baghi, Heibatollah; Kornides, Melanie L.

    2014-01-01

    Background Health care professionals require some understanding of statistics to successfully implement evidence-based practice. Developing competency in statistical reasoning is necessary for students training in health care administration, research, and clinical care. Recently, the interest in healthcare professionals' attitudes toward statistics has increased substantially due to evidence that these attitudes can hinder professionals in developing an understanding of statistical concepts. Methods In this study, we analyzed pre- and post-instruction attitudes towards and knowledge of statistics obtained from health science graduate students, including nurses and nurse practitioners, enrolled in an introductory graduate course in statistics (n = 165). Results and Conclusions Results show that the students already held generally positive attitudes toward statistics at the beginning of the course. However, these attitudes, along with the students' statistical proficiency, improved after 10 weeks of instruction. The results have implications for curriculum design and delivery methods as well as for health professionals' effective use of statistics in critically evaluating and utilizing research in their practices. PMID:25419256

  7. Rhythms at the bottom of the deep sea: Cyclic current flow changes and melatonin patterns in two species of demersal fish

    NASA Astrophysics Data System (ADS)

    Wagner, H.-J.; Kemp, K.; Mattheus, U.; Priede, I. G.

    2007-11-01

    We have studied physical and biological rhythms in the deep demersal habitat of the Northeastern Atlantic. Current velocity and direction changes occurred at intervals of 12.4 h, indicating an influence of tidal activity, and also showed indications of longer-term seasonal changes. As an indicator of biological rhythms, we measured the content of pineal and retinal melatonin in the grenadier Coryphaenoides armatus and the deep-sea eel Synaphobranchus kaupii, and determined the spontaneous release of melatonin in long-term (52 h minimum) cultures of isolated pineal organs and retinae in S. kaupii. The results of the release experiments show statistically significant signs of synchronicity and periodicity suggesting the presence of an endogenous clock. The melatonin content data show large error bars typical of cross-sectional population studies. When the data are plotted according to a lunar cycle, taken as indication of a tidal rhythm, both species show peak values at the beginning of the lunar day and night and lower values during the second half of lunar day and night and during moonrise and moonset. Statistical analysis, however, shows that the periodicity of the melatonin content is not significant. Taken together, these observations strongly suggest that (1) biological rhythms are present in demersal fish, (2) the melatonin metabolism shows signs of periodicity, and (3) tidal currents may act as zeitgeber at the bottom of the deep sea.

  8. Electron Heating at Kinetic Scales in Magnetosheath Turbulence

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chasapis, Alexandros; Matthaeus, W. H.; Parashar, T. N.

    2017-02-20

    We present a statistical study of coherent structures at kinetic scales, using data from the Magnetospheric Multiscale mission in the Earth’s magnetosheath. We implemented the multi-spacecraft partial variance of increments (PVI) technique to detect these structures, which are associated with intermittency at kinetic scales. We examine the properties of the electron heating occurring within such structures. We find that, statistically, structures with a high PVI index are regions of significant electron heating. We also focus on one such structure, a current sheet, which shows some signatures consistent with magnetic reconnection. Strong parallel electron heating coincides with whistler emissions at the edges of the current sheet.

  9. Statistical lamb wave localization based on extreme value theory

    NASA Astrophysics Data System (ADS)

    Harley, Joel B.

    2018-04-01

    Guided wave localization methods based on delay-and-sum imaging, matched field processing, and other techniques have been designed and researched to create images that locate and describe structural damage. The maximum value of these images typically represents an estimated damage location. Yet, it is often unclear if this maximum value, or any other value in the image, is a statistically significant indicator of damage. Furthermore, there are currently few, if any, approaches to assess the statistical significance of guided wave localization images. As a result, we present statistical delay-and-sum and statistical matched field processing localization methods to create statistically significant images of damage. Our framework uses constant false alarm rate statistics and extreme value theory to detect damage with little prior information. We demonstrate our methods with in situ guided wave data from an aluminum plate to detect two 0.75 cm diameter holes. Our results show an expected improvement in statistical significance as the number of sensors increases. With seventeen sensors, both methods successfully detect damage with statistical significance.
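
    The sketch below illustrates the two ingredients named above on synthetic data: a delay-and-sum image formed from baseline-subtracted pair signals, and an extreme-value (Gumbel) threshold fitted to maxima of damage-free images to decide whether the image peak is statistically significant. The plate size, sensor layout, wave speed, and noise level are invented for illustration, not the paper's experiment.

    ```python
    import numpy as np
    from scipy.stats import gumbel_r

    rng = np.random.default_rng(0)

    # Illustrative setup (assumed values): a plate with four sensors, one scatterer,
    # guided-wave group speed c, and sample rate fs.
    c, fs = 5.0, 10.0                                  # mm/us and samples per us
    sensors = np.array([[0.0, 0.0], [300.0, 0.0], [300.0, 300.0], [0.0, 300.0]])
    defect = np.array([120.0, 210.0])
    pairs = [(i, j) for i in range(4) for j in range(4) if i < j]
    t = np.arange(0.0, 400.0, 1.0 / fs)                # time axis in microseconds

    # Imaging grid and, for every sensor pair, the predicted scattering delay per pixel.
    xs = ys = np.arange(0.0, 301.0, 5.0)
    gx, gy = np.meshgrid(xs, ys)
    pix = np.stack([gx.ravel(), gy.ravel()], axis=1)
    delays = {(i, j): (np.linalg.norm(pix - sensors[i], axis=1)
                       + np.linalg.norm(pix - sensors[j], axis=1)) / c
              for i, j in pairs}

    def simulate(noise_only=False):
        """Baseline-subtracted pair signals: a scattered pulse from the defect plus noise."""
        sig = {}
        for i, j in pairs:
            t0 = (np.linalg.norm(defect - sensors[i]) + np.linalg.norm(defect - sensors[j])) / c
            clean = 0.0 if noise_only else np.exp(-((t - t0) / 2.0) ** 2)
            sig[i, j] = clean + 0.2 * rng.normal(size=t.size)
        return sig

    def das_image(signals):
        """Delay-and-sum: sum each pair's signal sampled at the pixel's predicted delay."""
        img = np.zeros(pix.shape[0])
        for key, s in signals.items():
            img += s[np.round(delays[key] * fs).astype(int)]
        return img.reshape(gy.shape)

    image = das_image(simulate())
    iy, ix = np.unravel_index(image.argmax(), image.shape)
    print("image peak at (mm):", xs[ix], ys[iy], " true defect:", defect)

    # Extreme-value significance test: fit a Gumbel law to the maxima of damage-free
    # images and call the peak significant if it exceeds a high quantile of that fit.
    maxima = [das_image(simulate(noise_only=True)).max() for _ in range(50)]
    loc, scale = gumbel_r.fit(maxima)
    print("peak statistically significant:", bool(image.max() > gumbel_r.ppf(0.99, loc, scale)))
    ```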

  10. Brush head composition, wear profile, and cleaning efficacy: an assessment of three electric brush heads using in vitro methods.

    PubMed

    Kaiser, Eva; Meyners, Michael; Markgraf, Dirk; Stoerkel, Ulrich; von Koppenfels, Roxana; Adam, Ralf; Soukup, Martin; Wehrbein, Heinrich; Erbe, Christina

    2014-01-01

    The objective of this research was to evaluate a current store brand (SB) brush head for composition/physical characteristics, Wear Index (WI), and cleaning efficacy versus the previous SB brush head refill design (SB control) and the Oral-B Precision Clean brush head (positive control, PC). This research consisted of three parts: 1) Analytical analysis using Fourier Transform Infrared (FT-IR) spectrometry to evaluate the chemical composition of the current SB brush head bristles relative to the SB control. In addition, physical parameters such as bristle count and diameter were determined. 2) Wear Index (WI) investigation to determine the Wear Index scores of in vitro-aged brush heads at four weeks (one month) and 13 weeks (three months) by a trained investigator. To "age" the brush heads, a robot system was used as a new alternative in vitro method to simulate aging by consumer use. 3) Robot testing to determine the cleaning performance of in vitro-aged brush heads, comparing one month-aged current SB brush heads with the SB control (one and three months-aged) and the PC brush heads (three months-aged) in a standardized fashion. 1) FT-IR analysis revealed that the chemical composition of the current and control SB refill brush heads is identical. In terms of physical parameters, the current SB brush head has 12% more bristles and a slightly oval brush head compared to the round brush head of the SB control. 2) Wear Index analysis showed there was no difference in the one month-aged current SB brush head versus the one month-aged SB control (1.67 vs. 1.50, p = 0.65) or versus the three months-aged PC brush head (1.67 vs. 1.50, p = 0.65). The one month-aged current SB brush head demonstrated statistically significantly less wear than the three months-aged SB control (1.67 vs. 2.67, p = 0.01). 3) Analysis of cleaning efficacy shows that the one month-aged current SB brush head had improved cleaning performance over the one month-aged SB control brush head (p < 0.05), despite no statistically significant difference in wear. Both the one month-aged current and control SB brush heads showed statistically significantly lower cleaning performance compared to the three months-aged PC brush heads (p < 0.01). While the current SB brush head showed improved cleaning over the SB control, it demonstrated significantly lower durability and cleaning in comparison to the PC brush head. Dental professionals should be aware of these differences, both in durability and in cleaning performance, when recommending brush heads to their patients.

  11. The Freight Transportation Services Index as a leading economic indicator

    DOT National Transportation Integrated Search

    2009-09-01

    The Bureau of Transportation Statistics (BTS) freight Transportation Services Index (TSI) showed a decline a full year and a half prior to the start of the current recession. This downturn suggests the TSI may prove particularly useful as ...

  12. Results of module electrical measurement of the DOE 46-kilowatt procurement

    NASA Technical Reports Server (NTRS)

    Curtis, H. B.

    1978-01-01

    Current-voltage measurements have been made on terrestrial solar cell modules of the DOE/JPL Low Cost Silicon Solar Array procurement. Data on short circuit current, open circuit voltage, and maximum power for the four types of modules are presented in normalized form, showing distribution of the measured values. Standard deviations from the mean values are also given. Tests of the statistical significance of the data are discussed.

  13. An evaluation of GTAW-P versus GTA welding of alloy 718

    NASA Technical Reports Server (NTRS)

    Gamwell, W. R.; Kurgan, C.; Malone, T. W.

    1991-01-01

    Mechanical properties were evaluated to determine statistically whether the pulsed current gas tungsten arc welding (GTAW-P) process produces welds in alloy 718 with room temperature structural performance equivalent to current Space Shuttle Main Engine (SSME) welds manufactured by the constant current GTAW process. Evaluations were conducted on two base metal lots, two filler metal lots, two heat input levels, and two welding processes. The material form was 0.125-inch (3.175-mm) alloy 718 sheet. Prior to welding, sheets were treated to either the ST or STA-1 condition. After welding, panels were left as welded or heat treated to the STA-1 condition, and weld beads were left intact or machined flush. Statistical analyses were performed on yield strength, ultimate tensile strength (UTS), and high cycle fatigue (HCF) properties for all the post welded material conditions. Analyses of variance were performed on the data to determine if there were any significant effects on UTS or HCF life due to variations in base metal, filler metal, heat input level, or welding process. Statistical analyses showed that the GTAW-P process does produce welds with room temperature structural performance equivalent to current SSME welds manufactured by the GTAW process, regardless of prior material condition or post welding condition.

  14. Measurement and statistical analysis of single-molecule current-voltage characteristics, transition voltage spectroscopy, and tunneling barrier height.

    PubMed

    Guo, Shaoyin; Hihath, Joshua; Díez-Pérez, Ismael; Tao, Nongjian

    2011-11-30

    We report on the measurement and statistical study of thousands of current-voltage characteristics and transition voltage spectra (TVS) of single-molecule junctions with different contact geometries that are rapidly acquired using a new break junction method at room temperature. This capability allows one to obtain current-voltage, conductance voltage, and transition voltage histograms, thus adding a new dimension to the previous conductance histogram analysis at a fixed low-bias voltage for single molecules. This method confirms the low-bias conductance values of alkanedithiols and biphenyldithiol reported in literature. However, at high biases the current shows large nonlinearity and asymmetry, and TVS allows for the determination of a critically important parameter, the tunneling barrier height or energy level alignment between the molecule and the electrodes of single-molecule junctions. The energy level alignment is found to depend on the molecule and also on the contact geometry, revealing the role of contact geometry in both the contact resistance and energy level alignment of a molecular junction. Detailed statistical analysis further reveals that, despite the dependence of the energy level alignment on contact geometry, the variation in single-molecule conductance is primarily due to contact resistance rather than variations in the energy level alignment.

  15. Full Counting Statistics for Interacting Fermions with Determinantal Quantum Monte Carlo Simulations.

    PubMed

    Humeniuk, Stephan; Büchler, Hans Peter

    2017-12-08

    We present a method for computing the full probability distribution function of quadratic observables such as particle number or magnetization for the Fermi-Hubbard model within the framework of determinantal quantum Monte Carlo calculations. Especially in cold atom experiments with single-site resolution, such a full counting statistics can be obtained from repeated projective measurements. We demonstrate that the full counting statistics can provide important information on the size of preformed pairs. Furthermore, we compute the full counting statistics of the staggered magnetization in the repulsive Hubbard model at half filling and find excellent agreement with recent experimental results. We show that current experiments are capable of probing the difference between the Hubbard model and the limiting Heisenberg model.
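
    In experiments, the full counting statistics of a quadratic observable is assembled from repeated projective snapshots. The sketch below does the analogous histogramming for the particle number in a subregion, using independently occupied sites as a non-interacting stand-in rather than the determinantal QMC samples of the paper.

    ```python
    import numpy as np
    from collections import Counter

    rng = np.random.default_rng(5)

    # Hypothetical snapshots: occupation (0/1) of 16 sites in a subregion, measured
    # in 2000 repeated projective measurements. Here sites are filled independently
    # with probability p; real data would come from DQMC or a quantum-gas microscope.
    n_sites, n_shots, p = 16, 2000, 0.5
    snapshots = rng.random((n_shots, n_sites)) < p

    # Full counting statistics of the subregion particle number: P(N).
    N = snapshots.sum(axis=1)
    fcs = {n: c / n_shots for n, c in sorted(Counter(N).items())}
    print({n: round(prob, 3) for n, prob in fcs.items()})
    print("mean =", N.mean(), "variance =", N.var())
    # Interactions would show up as deviations of these cumulants (and of the full
    # distribution) from this independent-site benchmark.
    ```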

  16. Changes in American Family Life

    ERIC Educational Resources Information Center

    Norton, Arthur J.; Glick, Paul C.

    1976-01-01

    This article attempts to provide a factual, historical perspective on the current family situation of American children. Demographic statistics from recent decades are given which show trends toward small family size, nuclear families, one-parent families, and a higher level of education among parents. (MS)

  17. Statistical Relational Learning to Predict Primary Myocardial Infarction from Electronic Health Records

    PubMed Central

    Weiss, Jeremy C; Page, David; Peissig, Peggy L; Natarajan, Sriraam; McCarty, Catherine

    2013-01-01

    Electronic health records (EHRs) are an emerging relational domain with large potential to improve clinical outcomes. We apply two statistical relational learning (SRL) algorithms to the task of predicting primary myocardial infarction. We show that one SRL algorithm, relational functional gradient boosting, outperforms propositional learners particularly in the medically-relevant high recall region. We observe that both SRL algorithms predict outcomes better than their propositional analogs and suggest how our methods can augment current epidemiological practices. PMID:25360347

  18. A model for characterizing residential ground current and magnetic field fluctuations.

    PubMed

    Mader, D L; Peralta, S B; Sherar, M D

    1994-01-01

    The current through the residential grounding circuit is an important source for magnetic fields; field variations near the grounding circuit accurately track fluctuations in this ground current. In this paper, a model is presented which permits calculation of the range of these fluctuations. A discrete network model is used to simulate a local distribution system for a single street, and a statistical model to simulate unbalanced currents in the system. Simulations of three-house and ten-house networks show that random appliance operation leads to ground current fluctuations which can be quite large, on the order of 600%. This is consistent with measured fluctuations in an actual house.
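
    A minimal Monte Carlo in the spirit of the described model: houses on a shared network switch appliances at random, and a fixed fraction of the resulting unbalanced current returns through the grounding circuit, producing sizeable fluctuations. The house count, appliance currents, duty cycle, and ground-return fraction are invented parameters, not values from the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(11)

    # Hypothetical parameters: 10 houses, each with a handful of appliances that are
    # on with some probability and draw a random current.
    n_houses, n_appliances, steps = 10, 8, 5000
    duty = 0.3                      # probability an appliance is on at any instant
    ground_fraction = 0.2           # share of each house's unbalanced current that
                                    # returns via the grounding circuit

    on = rng.random((steps, n_houses, n_appliances)) < duty
    draw = rng.uniform(0.5, 10.0, size=(n_houses, n_appliances))   # amps per appliance

    # Net unbalanced current per house and the resulting total ground current.
    house_unbalance = (on * draw).sum(axis=2)                      # (steps, houses)
    ground_current = ground_fraction * house_unbalance.sum(axis=1)

    fluctuation = 100.0 * (ground_current.max() - ground_current.min()) / ground_current.mean()
    print(f"ground current mean = {ground_current.mean():.1f} A, "
          f"peak-to-trough fluctuation = {fluctuation:.0f}%")
    ```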

  19. Comparison of the dynamics of neural interactions between current-based and conductance-based integrate-and-fire recurrent networks

    PubMed Central

    Cavallari, Stefano; Panzeri, Stefano; Mazzoni, Alberto

    2014-01-01

    Models of networks of Leaky Integrate-and-Fire (LIF) neurons are a widely used tool for theoretical investigations of brain function. These models have been used both with current- and conductance-based synapses. However, the differences in the dynamics expressed by these two approaches have been so far mainly studied at the single neuron level. To investigate how these synaptic models affect network activity, we compared the single neuron and neural population dynamics of conductance-based networks (COBNs) and current-based networks (CUBNs) of LIF neurons. These networks were endowed with sparse excitatory and inhibitory recurrent connections, and were tested in conditions including both low- and high-conductance states. We developed a novel procedure to obtain comparable networks by properly tuning the synaptic parameters not shared by the models. The comparable networks defined in this way displayed an excellent and robust match of first order statistics (average single neuron firing rates and average frequency spectrum of network activity). However, these comparable networks showed profound differences in the second order statistics of neural population interactions and in the modulation of these properties by external inputs. The correlation between inhibitory and excitatory synaptic currents and the cross-neuron correlation between synaptic inputs, membrane potentials and spike trains were stronger and more stimulus-modulated in the COBN. Because of these properties, the spike train correlation carried more information about the strength of the input in the COBN, although the firing rates were equally informative in both network models. Moreover, the network activity of the COBN showed stronger synchronization in the gamma band, and the spectral information about the input was higher and spread over a broader range of frequencies. These results suggest that the second order statistics of network dynamics depend strongly on the choice of synaptic model. PMID:24634645

  20. Comparison of the dynamics of neural interactions between current-based and conductance-based integrate-and-fire recurrent networks.

    PubMed

    Cavallari, Stefano; Panzeri, Stefano; Mazzoni, Alberto

    2014-01-01

    Models of networks of Leaky Integrate-and-Fire (LIF) neurons are a widely used tool for theoretical investigations of brain function. These models have been used both with current- and conductance-based synapses. However, the differences in the dynamics expressed by these two approaches have been so far mainly studied at the single neuron level. To investigate how these synaptic models affect network activity, we compared the single neuron and neural population dynamics of conductance-based networks (COBNs) and current-based networks (CUBNs) of LIF neurons. These networks were endowed with sparse excitatory and inhibitory recurrent connections, and were tested in conditions including both low- and high-conductance states. We developed a novel procedure to obtain comparable networks by properly tuning the synaptic parameters not shared by the models. The comparable networks defined in this way displayed an excellent and robust match of first order statistics (average single neuron firing rates and average frequency spectrum of network activity). However, these comparable networks showed profound differences in the second order statistics of neural population interactions and in the modulation of these properties by external inputs. The correlation between inhibitory and excitatory synaptic currents and the cross-neuron correlation between synaptic inputs, membrane potentials and spike trains were stronger and more stimulus-modulated in the COBN. Because of these properties, the spike train correlation carried more information about the strength of the input in the COBN, although the firing rates were equally informative in both network models. Moreover, the network activity of the COBN showed stronger synchronization in the gamma band, and the spectral information about the input was higher and spread over a broader range of frequencies. These results suggest that the second order statistics of network dynamics depend strongly on the choice of synaptic model.

  1. 1987 Population Trends for Washington State.

    ERIC Educational Resources Information Center

    Washington State Office of Financial Management, Olympia.

    This statistical profile provides current demographic data for Washington State and is also broken down by counties, incorporated cities, and towns. Fifteen tables show population figures; components of population change; housing units by structure type; annexations, incorporations and municipal boundary changes; growth of households; estimates of…

  2. Status of Indonesian women in physics

    NASA Astrophysics Data System (ADS)

    Raharti, Monika; Kartini, Evvy

    2015-12-01

    This paper reports on the current situation of women in physics in Indonesia. Statistics show that there is an imbalance in the number of male and female physicists in Indonesia. An overview by one of the very few female professors in physics in Indonesia also shows how women struggle in their careers. A Women in Physics organization will be established under the Indonesian Physical Society in October 2014.

  3. Is fertility falling in Zimbabwe?

    PubMed

    Udjo, E O

    1996-01-01

    With an unequalled contraceptive prevalence rate in sub-Saharan Africa, of 43% among currently married women in Zimbabwe, the Central Statistical Office (1989) observed that fertility has declined sharply in recent years. Using data from several surveys on Zimbabwe, especially the birth histories of the Zimbabwe Demographic and Health Survey, this study examines fertility trends in Zimbabwe. The results show that the fertility decline in Zimbabwe is modest and that the decline is concentrated among high order births. Multivariate analysis did not show a statistically significant effect of contraception on fertility, partly because a high proportion of Zimbabwean women in the reproductive age group never use contraception due to prevailing pronatalist attitudes in the country.

  4. 77 FR 68731 - Notice of Intent To Request Revision and Extension of a Currently Approved Information Collection

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-16

    ... Statistics Service Notice of Intent To Request Revision and Extension of a Currently Approved Information Collection AGENCY: National Agricultural Statistics Service, USDA. ACTION: Notice and request for comments... National Agricultural Statistics Service (NASS) to request revision and extension of a currently approved...

  5. Health, United States, 1980, With Prevention Profile.

    ERIC Educational Resources Information Center

    National Center for Health Statistics (DHEW/PHS), Hyattsville, MD.

    The first part of this report consolidates the most current data on health care trends and presents analytical discussions on four interrelated themes: health status and determinants; utilization of health resources; health care resources; and health care expenditures. Detailed tables present statistics showing comparisons over time for such…

  6. Large fluctuations of the macroscopic current in diffusive systems: a numerical test of the additivity principle.

    PubMed

    Hurtado, Pablo I; Garrido, Pedro L

    2010-04-01

    Most systems, when pushed out of equilibrium, respond by building up currents of locally conserved observables. Understanding how microscopic dynamics determines the averages and fluctuations of these currents is one of the main open problems in nonequilibrium statistical physics. The additivity principle is a theoretical proposal that allows one to compute the current distribution in many one-dimensional nonequilibrium systems. Using simulations, we validate this conjecture in a simple and general model of energy transport, both in the presence of a temperature gradient and in canonical equilibrium. In particular, we show that the current distribution displays a Gaussian regime for small current fluctuations, as prescribed by the central limit theorem, and non-Gaussian (exponential) tails for large current deviations, obeying in all cases the Gallavotti-Cohen fluctuation theorem. In order to facilitate a given current fluctuation, the system adopts a well-defined temperature profile different from that of the steady state and in accordance with the additivity hypothesis predictions. The system statistics during a large current fluctuation are independent of the sign of the current, which implies that the optimal profile (as well as higher-order profiles and spatial correlations) are invariant upon current inversion. We also demonstrate that finite-time joint fluctuations of the current and the profile are well described by the additivity functional. These results suggest the additivity hypothesis as a general and powerful tool to compute current distributions in many nonequilibrium systems.
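
    The Gaussian centre and the Gallavotti-Cohen symmetry mentioned above can be checked in a few seconds on a toy current: a biased random walker whose time-averaged current q satisfies ln P(q) - ln P(-q) = t ln[p/(1-p)] q. This stand-in is far simpler than the energy-transport model simulated in the paper and is offered only as an illustration of the symmetry; all parameters are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # Toy nonequilibrium current: a walker hops +1 with probability p, -1 otherwise.
    # The empirical (time-averaged) current over t steps is q = (n_plus - n_minus)/t.
    p, t, samples = 0.6, 100, 100_000
    steps = np.where(rng.random((samples, t)) < p, 1, -1)
    k = steps.sum(axis=1)                                  # integer net displacement in [-t, t]

    # Empirical distribution over the possible displacements (index m <-> k = 2m - t).
    counts = np.bincount((k + t) // 2, minlength=t + 1)
    prob = counts / counts.sum()
    m = np.arange(t + 1)
    qv = (2 * m - t) / t

    # Gallavotti-Cohen check: ln P(q)/P(-q) should be linear in q with slope
    # t * ln(p/(1-p)) for this model. Use only well-populated bins on both sides.
    mask = (counts >= 5) & (counts[::-1] >= 5) & (qv > 0)
    ratio = np.log(prob[mask] / prob[::-1][mask])
    slope = np.polyfit(qv[mask], ratio, 1)[0]
    print("measured slope:", round(slope, 1), " predicted:", round(t * np.log(p / (1 - p)), 1))
    ```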

  7. Angular velocity estimation based on star vector with improved current statistical model Kalman filter.

    PubMed

    Zhang, Hao; Niu, Yanxiong; Lu, Jiazhen; Zhang, He

    2016-11-20

    Angular velocity information is a requisite for a spacecraft guidance, navigation, and control system. In this paper, an approach for angular velocity estimation based merely on star vector measurement with an improved current statistical model Kalman filter is proposed. High-precision angular velocity estimation can be achieved under dynamic conditions. The amount of calculation is also reduced compared to a Kalman filter. Different trajectories are simulated to test this approach, and experiments with real starry sky observation are implemented for further confirmation. The estimation accuracy is proved to be better than 10⁻⁴ rad/s under various conditions. Both the simulation and the experiment demonstrate that the described approach is effective and shows an excellent performance under both static and dynamic conditions.
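
    The measurement model underlying star-vector-based rate estimation can be sketched without the filter: for an inertially fixed star direction s expressed in the body frame, ds/dt = -omega x s, which is linear in omega and can be solved by least squares from consecutive star tracker frames. The improved current statistical model Kalman filter of the paper, which smooths such raw estimates, is not reproduced here; all numbers below are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    def skew(v):
        """Cross-product matrix [v]x such that skew(v) @ w == np.cross(v, w)."""
        return np.array([[0, -v[2], v[1]],
                         [v[2], 0, -v[0]],
                         [-v[1], v[0], 0]])

    # True body rate (rad/s) and a few tracked star unit vectors in the body frame.
    omega_true = np.array([0.002, -0.001, 0.0005])
    stars = rng.normal(size=(6, 3))
    stars /= np.linalg.norm(stars, axis=1, keepdims=True)

    dt = 0.25                                    # star tracker sampling interval, s
    angle = np.linalg.norm(omega_true) * dt
    axis = omega_true / np.linalg.norm(omega_true)
    K = skew(axis)
    R = np.eye(3) + np.sin(angle) * K + (1 - np.cos(angle)) * K @ K   # Rodrigues rotation

    # The body rotates by omega*dt, so body-frame star vectors rotate the opposite
    # way (apply R transpose); add a little measurement noise to both frames.
    noise = 1e-5
    s_prev = stars + noise * rng.normal(size=stars.shape)
    s_next = stars @ R + noise * rng.normal(size=stars.shape)

    # ds/dt = -omega x s = skew(s) @ omega: stack one 3-row block per star and
    # solve the overdetermined linear system by least squares.
    A = np.vstack([skew(s) for s in s_prev])
    b = ((s_next - s_prev) / dt).ravel()
    omega_est, *_ = np.linalg.lstsq(A, b, rcond=None)
    print("true :", omega_true)
    print("est. :", omega_est)
    ```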

  8. Powerful Inference with the D-Statistic on Low-Coverage Whole-Genome Data

    PubMed Central

    Soraggi, Samuele; Wiuf, Carsten; Albrechtsen, Anders

    2017-01-01

    The detection of ancient gene flow between human populations is an important issue in population genetics. A common tool for detecting ancient admixture events is the D-statistic. The D-statistic is based on the hypothesis of a genetic relationship that involves four populations, whose correctness is assessed by evaluating specific coincidences of alleles between the groups. When working with high-throughput sequencing data, calling genotypes accurately is not always possible; therefore, the D-statistic currently samples a single base from the reads of one individual per population. This implies ignoring much of the information in the data, an issue especially striking in the case of ancient genomes. We provide a significant improvement to overcome the problems of the D-statistic by considering all reads from multiple individuals in each population. We also apply type-specific error correction to combat the problems of sequencing errors, and show a way to correct for introgression from an external population that is not part of the supposed genetic relationship, and how this leads to an estimate of the admixture rate. We prove that the D-statistic is approximated by a standard normal distribution. Furthermore, we show that our method outperforms the traditional D-statistic in detecting admixtures. The power gain is most pronounced for low and medium sequencing depth (1–10×), and performances are as good as with perfectly called genotypes at a sequencing depth of 2×. We show the reliability of error correction in scenarios with simulated errors and ancient data, and correct for introgression in known scenarios to estimate the admixture rates. PMID:29196497
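
    The classical single-base D-statistic that the paper extends is a simple count contrast with a block-jackknife standard error. The sketch below computes it on simulated biallelic sites with no admixture, so D should be near zero; the simulation and block size are illustrative assumptions, not the authors' multi-read, error-corrected estimator.

    ```python
    import numpy as np

    rng = np.random.default_rng(9)

    # Simulated derived-allele indicators (0/1) at biallelic sites for populations
    # H1, H2, H3 and outgroup H4, in the tree (((H1,H2),H3),H4). Data are invented.
    n_sites = 200_000
    h1 = rng.random(n_sites) < 0.3
    h2 = rng.random(n_sites) < 0.3
    h3 = rng.random(n_sites) < 0.3
    h4 = np.zeros(n_sites, dtype=bool)           # outgroup carries the ancestral allele

    # ABBA: H2 and H3 share the derived allele; BABA: H1 and H3 share it.
    abba = ~h1 & h2 & h3 & ~h4
    baba = h1 & ~h2 & h3 & ~h4
    D = (abba.sum() - baba.sum()) / (abba.sum() + baba.sum())

    # Block jackknife over contiguous blocks of sites to get a Z-score.
    n_blocks = 100
    abba_b = abba.reshape(n_blocks, -1).sum(axis=1)
    baba_b = baba.reshape(n_blocks, -1).sum(axis=1)
    D_jack = ((abba.sum() - abba_b) - (baba.sum() - baba_b)) / \
             ((abba.sum() - abba_b) + (baba.sum() - baba_b))
    se = np.sqrt((n_blocks - 1) / n_blocks * np.sum((D_jack - D_jack.mean()) ** 2))
    print(f"D = {D:.4f}, Z = {D / se:.2f}")      # |Z| > 3 is the usual admixture call
    ```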

  9. The landscape of W± and Z bosons produced in pp collisions up to LHC energies

    NASA Astrophysics Data System (ADS)

    Basso, Eduardo; Bourrely, Claude; Pasechnik, Roman; Soffer, Jacques

    2017-10-01

    We consider a selection of recent experimental results on electroweak W±, Z gauge boson production in pp collisions at BNL RHIC and CERN LHC energies in comparison to predictions of perturbative QCD calculations based on different sets of NLO parton distribution functions, including the statistical PDF model known from fits to the DIS data. We show that the current statistical PDF parametrization (fitted to the DIS data only) underestimates the LHC data on W±, Z gauge boson production cross sections at NLO by about 20%. This suggests that there is a need to refit the parameters of the statistical PDF including the latest LHC data.

  10. A simulation-based assessment approach to increase safety among senior drivers.

    DOT National Transportation Integrated Search

    2013-04-01

    Statistics show that in the U.S., there are about 38 million licensed drivers over age 65, about 1/8 of our population. By 2024, this figure will double to 25%. The current research is intended to address the driving capabilities of our older pop...

  11. Gender and Employment. Current Statistics and Their Implications.

    ERIC Educational Resources Information Center

    Equity Issues, 1996

    1996-01-01

    This publication contains three fact sheets on gender and employment statistics and their implications. The fact sheets are divided into two sections--statistics and implications. The statistics present the current situation of men and women workers as they relate to occupations, education, and earnings. The implications express suggestions for…

  12. Generalized Hurst exponent estimates differentiate EEG signals of healthy and epileptic patients

    NASA Astrophysics Data System (ADS)

    Lahmiri, Salim

    2018-01-01

    The aim of our current study is to check whether the multifractal patterns of the electroencephalographic (EEG) signals of normal and epileptic patients are statistically similar or different. In this regard, the generalized Hurst exponent (GHE) method is used for robust estimation of the multifractals in each type of EEG signal, and three powerful statistical tests are performed to check for differences between the GHEs estimated from healthy control subjects and from epileptic patients. The obtained results show that multifractals exist in both types of EEG signals. In particular, the degree of multifractality is more pronounced in short variations of normal EEG signals than in short variations of EEG signals with seizure-free intervals. By contrast, it is more pronounced in long variations of EEG signals with seizure-free intervals than in normal EEG signals. Importantly, both parametric and nonparametric statistical tests show strong evidence that the estimated GHEs of normal EEG signals are statistically significantly different from those of signals with seizure-free intervals. Therefore, GHEs can be efficiently used to distinguish between healthy subjects and patients suffering from epilepsy.
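    A minimal sketch of the structure-function estimator commonly used for the generalized Hurst exponent; the EEG data and the specific statistical tests of the study are not reproduced, and the Brownian-motion call below is only a sanity check (H(q) should come out near 0.5):

```python
import numpy as np

def generalized_hurst(x, q=2, taus=range(1, 20)):
    """Estimate the generalized Hurst exponent H(q) of a 1-D series.

    Uses the scaling of the q-th order structure function
    K_q(tau) = <|x(t+tau) - x(t)|^q> ~ tau^(q * H(q)).
    """
    x = np.asarray(x, dtype=float)
    log_k, log_t = [], []
    for tau in taus:
        inc = np.abs(x[tau:] - x[:-tau])
        log_k.append(np.log(np.mean(inc ** q)))
        log_t.append(np.log(tau))
    slope = np.polyfit(log_t, log_k, 1)[0]
    return slope / q

rng = np.random.default_rng(0)
bm = np.cumsum(rng.normal(size=20_000))      # Brownian motion as a sanity check
print(generalized_hurst(bm, q=1), generalized_hurst(bm, q=3))
```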

  13. Solar Energetic Particle Transport Near a Heliospheric Current Sheet

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Battarbee, Markus; Dalla, Silvia; Marsh, Mike S., E-mail: mbattarbee@uclan.ac.uk

    2017-02-10

    Solar energetic particles (SEPs), a major component of space weather, propagate through the interplanetary medium strongly guided by the interplanetary magnetic field (IMF). In this work, we analyze the implications that a flat Heliospheric Current Sheet (HCS) has on proton propagation from SEP release sites to the Earth. We simulate proton propagation by integrating fully 3D trajectories near an analytically defined flat current sheet, collecting comprehensive statistics into histograms, fluence maps, and virtual observer time profiles within an energy range of 1–800 MeV. We show that protons experience significant current sheet drift to distant longitudes, causing time profiles to exhibit multiple components, which are a potential source of confusing interpretations of observations. We find that variation of the current sheet thickness within a realistic parameter range has little effect on particle propagation. We show that the IMF configuration strongly affects the deceleration of protons. We show that in our model, the presence of a flat equatorial HCS in the inner heliosphere limits the crossing of protons into the opposite hemisphere.

  14. Mode-Stirred Method Implementation for HIRF Susceptibility Testing and Results Comparison with Anechoic Method

    NASA Technical Reports Server (NTRS)

    Nguyen, Truong X.; Ely, Jay J.; Koppen, Sandra V.

    2001-01-01

    This paper describes the implementation of the mode-stirred method for susceptibility testing according to the current DO-160D standard. Test results on an Engine Data Processor using the implemented procedure and comparisons with the standard anechoic test results are presented. The comparison experimentally shows that the susceptibility thresholds found with the mode-stirred method are consistently higher than those found with the anechoic method. This is consistent with the recent statistical analysis finding by NIST that the current calibration procedure overstates field strength by a fixed amount. Once the test results are adjusted for this value, the comparisons with the anechoic results are excellent. The results also show that the test method has excellent chamber-to-chamber repeatability. Several areas for improvements to the current procedure are also identified and implemented.

  15. Fault Detection and Diagnosis In Hall-Héroult Cells Based on Individual Anode Current Measurements Using Dynamic Kernel PCA

    NASA Astrophysics Data System (ADS)

    Yao, Yuchen; Bao, Jie; Skyllas-Kazacos, Maria; Welch, Barry J.; Akhmetov, Sergey

    2018-04-01

    Individual anode current signals in aluminum reduction cells provide localized cell conditions in the vicinity of each anode, which contain more information than the conventionally measured cell voltage and line current. One common use of this measurement is to identify process faults that can cause significant changes in the anode current signals. While this method is simple and direct, it ignores the interactions between anode currents and other important process variables. This paper presents an approach that applies multivariate statistical analysis techniques to individual anode currents and other process operating data, for the detection and diagnosis of local process abnormalities in aluminum reduction cells. Specifically, since the Hall-Héroult process is time-varying with its process variables dynamically and nonlinearly correlated, dynamic kernel principal component analysis with moving windows is used. The cell is discretized into a number of subsystems, with each subsystem representing one anode and cell conditions in its vicinity. The fault associated with each subsystem is identified based on multivariate statistical control charts. The results show that the proposed approach is able to not only effectively pinpoint the problematic areas in the cell, but also assess the effect of the fault on different parts of the cell.
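    A minimal sketch of the monitoring idea described above (not the authors' implementation): each sample is augmented with lagged copies (the "dynamic" part), kernel PCA is fitted on normal-operation data, and a Hotelling-type T² statistic in the kernel score space flags abnormal samples. All data, kernel settings, and thresholds below are synthetic and hypothetical.

```python
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.preprocessing import StandardScaler

def add_lags(X, n_lags=2):
    """Augment each sample with its n_lags previous samples (dynamic extension)."""
    return np.array([X[i - n_lags:i + 1].ravel() for i in range(n_lags, len(X))])

# Hypothetical process data: rows = time, columns = anode currents / other signals.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(500, 6))          # normal operation
X_test = rng.normal(size=(100, 6))
X_test[50:] += 3.0                           # injected local fault

scaler = StandardScaler().fit(add_lags(X_train))
Z_train = scaler.transform(add_lags(X_train))
Z_test = scaler.transform(add_lags(X_test))

kpca = KernelPCA(n_components=5, kernel="rbf", gamma=0.1).fit(Z_train)
T_train = kpca.transform(Z_train)
T_test = kpca.transform(Z_test)

# Hotelling-type T^2 monitoring statistic in the kernel score space.
var = T_train.var(axis=0)
t2_train = np.sum(T_train ** 2 / var, axis=1)
t2_test = np.sum(T_test ** 2 / var, axis=1)
limit = np.percentile(t2_train, 99)          # control limit from normal data
print(np.where(t2_test > limit)[0])          # indices flagged as faulty
```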

  16. The physicist's companion to current fluctuations: one-dimensional bulk-driven lattice gases

    NASA Astrophysics Data System (ADS)

    Lazarescu, Alexandre

    2015-12-01

    One of the main features of statistical systems out of equilibrium is the currents they exhibit in their stationary state: microscopic currents of probability between configurations, which translate into macroscopic currents of mass, charge, etc. Understanding the general behaviour of these currents is an important step towards building a universal framework for non-equilibrium steady states akin to the Gibbs-Boltzmann distribution for equilibrium systems. In this review, we consider one-dimensional bulk-driven particle gases, and in particular the asymmetric simple exclusion process (ASEP) with open boundaries, which is one of the most popular models of one-dimensional transport. We focus, in particular, on the current of particles flowing through the system in its steady state, and on its fluctuations. We show how one can obtain the complete statistics of that current, through its large deviation function, by combining results from various methods: exact calculation of the cumulants of the current, using the integrability of the model; direct diagonalization of a biased process in the limits of very high or low current; hydrodynamic description of the model in the continuous limit using the macroscopic fluctuation theory. We give a pedagogical account of these techniques, starting with a quick introduction to the necessary mathematical tools, as well as a short overview of the existing works relating to the ASEP. We conclude by drawing the complete dynamical phase diagram of the current. We also remark on a few possible generalizations of these results.
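    The exact and hydrodynamic methods reviewed above are not reproduced here; as a point of reference, the following is a brute-force Monte Carlo sketch of the mean steady-state current in an open TASEP (the totally asymmetric special case) with random-sequential updates. For entry and exit rates alpha, beta > 1/2 the measured current should approach the known maximal-current value of 1/4.

```python
import numpy as np

def tasep_current(L=50, alpha=0.6, beta=0.6, sweeps=4000, burn_in=1000, seed=0):
    """Monte Carlo estimate of the steady-state particle current in an open
    TASEP with entry rate alpha, exit rate beta, and bulk hop rate 1.

    One sweep = L + 1 random-sequential update attempts; the current is the
    number of injected particles per sweep, counted after a burn-in period.
    """
    rng = np.random.default_rng(seed)
    site = np.zeros(L, dtype=int)
    injected = 0
    for sweep in range(burn_in + sweeps):
        for _ in range(L + 1):
            i = rng.integers(-1, L)          # -1 = entry attempt, L-1 = exit site
            if i == -1:
                if site[0] == 0 and rng.random() < alpha:
                    site[0] = 1
                    if sweep >= burn_in:
                        injected += 1
            elif i == L - 1:
                if site[i] == 1 and rng.random() < beta:
                    site[i] = 0
            elif site[i] == 1 and site[i + 1] == 0:
                site[i], site[i + 1] = 0, 1  # bulk hop to the right
    return injected / sweeps

print(tasep_current())   # ~0.25 in the maximal-current phase (alpha, beta > 1/2)
```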

  17. Statistical analysis of the electrocatalytic activity of Pt nanoparticles supported on novel functionalized reduced graphene oxide-chitosan for methanol electrooxidation

    NASA Astrophysics Data System (ADS)

    Ekrami-Kakhki, Mehri-Saddat; Abbasi, Sedigheh; Farzaneh, Nahid

    2018-01-01

    The purpose of this study is to statistically analyze the anodic current density and peak potential of methanol oxidation at Pt nanoparticles supported on functionalized reduced graphene oxide (RGO), using design of experiments methodology. RGO is functionalized with methyl viologen (MV) and chitosan (CH). The novel Pt/MV-RGO-CH catalyst is successfully prepared and characterized by transmission electron microscopy (TEM). The electrocatalytic activity of the Pt/MV-RGO-CH catalyst is experimentally evaluated for methanol oxidation. The effects of the methanol concentration and scan rate factors are also investigated experimentally and statistically. The effects of these two main factors and their interactions are investigated using an analysis of variance test, Duncan's multiple range test, and the response surface method. The results of the analysis of variance show that all the main factors and their interactions have a significant effect on the anodic current density and peak potential of methanol oxidation at α = 0.05. The suggested models, which encompass the significant factors, can predict the variation of the anodic current density and peak potential of methanol oxidation. The results of Duncan's multiple range test confirm that there is a significant difference between the studied levels of the main factors.

  18. Current temporal asymmetry and the role of tides: Nan-Wan Bay vs. the Gulf of Elat

    NASA Astrophysics Data System (ADS)

    Ashkenazy, Yosef; Fredj, Erick; Gildor, Hezi; Gong, Gwo-Ching; Lee, Hung-Jen

    2016-05-01

    Nan-Wan Bay in Taiwan and the Gulf of Elat in Israel are two different coastal environments, and as such, their currents are expected to have different statistical properties. While Nan-Wan Bay is shallow, has three open boundaries, and is directly connected to the open ocean, the Gulf of Elat is deep, semi-enclosed, and connected to the Red Sea via the Straits of Tiran. Surface currents have been continuously measured with fine temporal (less than or equal to 1 h) and spatial resolution (less than or equal to 1 km) for more than a year in both environments using coastal radars (CODARs) that cover a domain of roughly 10 × 10 km. These measurements show that the currents in Nan-Wan Bay are much stronger than those in the Gulf of Elat and that the mean current field in Nan-Wan Bay exhibits cyclonic circulation, which is stronger in the summer; in the Gulf of Elat, the mean current field is directed southward and is also stronger during the summer. We have compared the statistical properties of the current speeds in both environments and found that both exhibit large spatial and seasonal variations in the shape parameter of the Weibull distribution. However, we have found fundamental and significant differences when comparing the temporal asymmetry of the current speed (i.e., the ratio between the time during which the current speed increases and the total time). While the Nan-Wan Bay currents are significantly asymmetric, those of the Gulf of Elat are not. We then extracted the tidal component of the Nan-Wan Bay currents and found that it is strongly asymmetric, while the asymmetry of tidally filtered currents is much weaker. We thus conclude that the temporal asymmetry of the Nan-Wan Bay currents reported here is due to the strong tides in the region. We show that the asymmetry ratio in Nan-Wan Bay varies spatially and seasonally: (i) the currents increase rapidly and decay slowly in the northern part of the domain and vice versa in the southern part, and (ii) the asymmetry is stronger during summer.
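    A minimal sketch of the two diagnostics used above, applied to a synthetic, tide-like speed series (the CODAR data themselves are not reproduced): the Weibull shape parameter of the speed distribution and the temporal asymmetry ratio, i.e., the fraction of time steps during which the speed increases.

```python
import numpy as np
from scipy import signal, stats

def speed_statistics(speed):
    """Weibull shape/scale of the speed distribution and the temporal
    asymmetry ratio (fraction of non-zero increments that are positive)."""
    shape, loc, scale = stats.weibull_min.fit(speed, floc=0)
    d = np.diff(speed)
    asymmetry = np.sum(d > 0) / np.sum(d != 0)
    return shape, scale, asymmetry

# Hypothetical speed series: fast rise, slow decay (tide-like asymmetry),
# plus a little noise; units are arbitrary.
rng = np.random.default_rng(0)
t = np.linspace(0, 40 * np.pi, 8000)
speed = 0.5 + 0.3 * signal.sawtooth(t, width=0.2) + 0.002 * rng.random(8000)
print(speed_statistics(speed))   # asymmetry ratio well below 0.5
```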

  19. Current temporal asymmetry and the role of tides: Nan-Wan Bay vs. the Gulf of Elat

    NASA Astrophysics Data System (ADS)

    Ashkenazy, Y.; Fredj, E.; Gildor, H.; Gong, G. C.; Lee, H. J.; Wu, C. R.

    2016-02-01

    Nan-Wan Bay in Taiwan and the Gulf of Elat in Israel are two different coastal environments, and as such their currents are expected to have different statistical properties. While Nan-Wan Bay is shallow, has three open boundaries, and is directly connected to the open ocean, the Gulf of Elat is deep, semi-enclosed, and connected to the Red Sea via the Straits of Tiran. Surface currents have been measured continuously for more than a year in both environments at high temporal (≤1 h) and spatial (≤1 km) resolution using Coastal Radars (CODARs) that cover a domain of roughly 10 × 10 km. These measurements show that the currents in Nan-Wan Bay are much stronger than those in the Gulf of Elat and that the mean current field in Nan-Wan Bay exhibits cyclonic circulation, which is stronger in the summer; in the Gulf of Elat the mean current field is directed southward and is also stronger during the summer. We have compared the statistical properties of the CODAR current speeds in both environments and found that both exhibit large spatial and seasonal variations in the shape parameter of the Weibull distribution. However, we have found fundamental and significant differences when comparing the temporal asymmetry of the current speed (i.e., the ratio of the time during which the current speed increases to the total time). While the Nan-Wan Bay currents are significantly asymmetric, those of the Gulf of Elat are not. We then extracted the tidal component of the Nan-Wan Bay currents and found that it is strongly asymmetric, while the asymmetry of tidally filtered currents is much weaker. We thus conclude that the temporal asymmetry of the Nan-Wan Bay currents reported here is due to the strong tides in the region. We show that the asymmetry ratio in Nan-Wan Bay varies spatially and seasonally: (i) currents increase rapidly and decay slowly in the northern part of the domain and vice versa in the southern part, and (ii) the asymmetry is stronger during summer.

  20. Powerful Inference with the D-Statistic on Low-Coverage Whole-Genome Data.

    PubMed

    Soraggi, Samuele; Wiuf, Carsten; Albrechtsen, Anders

    2018-02-02

    The detection of ancient gene flow between human populations is an important issue in population genetics. A common tool for detecting ancient admixture events is the D-statistic. The D-statistic is based on the hypothesis of a genetic relationship that involves four populations, whose correctness is assessed by evaluating specific coincidences of alleles between the groups. When working with high-throughput sequencing data, calling genotypes accurately is not always possible; therefore, the D-statistic currently samples a single base from the reads of one individual per population. This implies ignoring much of the information in the data, an issue especially striking in the case of ancient genomes. We provide a significant improvement to overcome the problems of the D-statistic by considering all reads from multiple individuals in each population. We also apply type-specific error correction to combat the problems of sequencing errors, and show a way to correct for introgression from an external population that is not part of the supposed genetic relationship, and how this leads to an estimate of the admixture rate. We prove that the D-statistic is approximated by a standard normal distribution. Furthermore, we show that our method outperforms the traditional D-statistic in detecting admixtures. The power gain is most pronounced for low and medium sequencing depth (1-10×), and performances are as good as with perfectly called genotypes at a sequencing depth of 2×. We show the reliability of error correction in scenarios with simulated errors and ancient data, and correct for introgression in known scenarios to estimate the admixture rates.

  1. Demystifying the Millennial Student: A Reassessment in Measures of Character and Engagement in Professional Education

    ERIC Educational Resources Information Center

    DiLullo, Camille; McGee, Patricia; Kriebel, Richard M.

    2011-01-01

    The characteristic profile of Millennial Generation students, driving many educational reforms, can be challenged by research in a number of fields including cognition, learning style, neurology, and psychology. This evidence suggests that the current aggregate view of the Millennial student may be less than accurate. Statistics show that…

  2. Show the Data, Don't Conceal Them

    ERIC Educational Resources Information Center

    Drummond, Gordon B.; Vowler, Sarah L.

    2011-01-01

    Current standards of data presentation and analysis in biological journals often fall short of ideal. This is the first of a planned series of short articles, to be published in a number of journals, aiming to highlight the principles of clear data presentation and appropriate statistical analysis. This article considers the methods used to show…

  3. Models of Reference Services in Australian Academic Libraries

    ERIC Educational Resources Information Center

    Burke, Liz

    2008-01-01

    This article reports on a project which was undertaken in 2006 to investigate the current modes and methods for delivering reference services in Australian academic libraries. The project included a literature review to assist in providing a definition of reference services as well as a snapshot of statistics showing staff and patron numbers from…

  4. Women, University and Science in Twentieth-Century Spain

    ERIC Educational Resources Information Center

    Canales, Antonio Fco.

    2018-01-01

    This article aims to question the widely accepted idea that female university students in Spain have, in the past, tended to opt for degrees in the field of humanities. Based on an analysis of the official statistics that are currently available, the paper demonstrates that Spanish female university students showed a clear preference for…

  5. Passing the Baton: The Last 100 Days of the College Presidency

    ERIC Educational Resources Information Center

    Johnson, Sandra Swanson

    2012-01-01

    Over the past half-century, the college president's job and its associated expectations have grown increasingly complex. At the same time, colleges and universities across the United States are facing an unprecedented rate of anticipated turnover among college presidents (King & Gomez, 2008). Current statistics show that approximately 70%…

  6. Investigating Students' Beliefs about Arabic Language Programs at Kuwait University

    ERIC Educational Resources Information Center

    Al-Shaye, Shaye S.

    2009-01-01

    The current study attempted to identify the beliefs of students in Arabic programs about their chosen programs. To achieve this purpose, a survey was developed to collect the data from randomly selected students in liberal-arts and education-based programs at Kuwait University. The results showed that students were statistically differentiated as a…

  7. Exact Large-Deviation Statistics for a Nonequilibrium Quantum Spin Chain

    NASA Astrophysics Data System (ADS)

    Žnidarič, Marko

    2014-01-01

    We consider a one-dimensional XX spin chain in a nonequilibrium setting with a Lindblad-type boundary driving. By calculating the large-deviation rate function in the thermodynamic limit, a generalization of the free energy to a nonequilibrium setting, we obtain the complete distribution of the current, including closed expressions for lower-order cumulants. We also identify two phase-transition-like behaviors: one in the thermodynamic limit, at which the current probability distribution becomes discontinuous, and one at maximal driving, when the range of possible current values changes discontinuously. In the thermodynamic limit the current has a finite upper and lower bound. We also explicitly confirm the nonequilibrium fluctuation relation and show that the current distribution is the same under the mapping of the coupling strength Γ→1/Γ.

  8. Image Processing Diagnostics: Emphysema

    NASA Astrophysics Data System (ADS)

    McKenzie, Alex

    2009-10-01

    Currently the computerized tomography (CT) scan can detect emphysema sooner than traditional x-rays, but other tests are required to measure more accurately the amount of affected lung. CT scan images show clearly whether a patient has emphysema, but visual inspection alone cannot quantify the degree of the disease, as it appears merely as subtle, barely distinct, dark spots on the lung. Our goal is to create a software plug-in to interface with existing open source medical imaging software, to automate the process of accurately diagnosing and determining emphysema severity levels in patients. This will be accomplished by performing a number of statistical calculations using data taken from CT scan images of several patients representing a wide range of severity of the disease. These analyses include an examination of the deviation from a normal distribution curve to determine skewness, a commonly used statistical parameter. Our preliminary results show that this method of assessment appears to be more accurate and robust than currently utilized methods, which involve looking at percentages of radiodensities in air passages of the lung.
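    A minimal sketch of the kind of histogram statistics described above, applied to hypothetical Hounsfield-unit samples; the -950 HU low-attenuation threshold and all numbers are illustrative and not taken from the abstract:

```python
import numpy as np
from scipy.stats import skew

def emphysema_indices(hu_values, threshold=-950):
    """Histogram skewness and fraction of lung voxels below a low-attenuation
    threshold (a commonly used emphysema index); purely illustrative."""
    hu = np.asarray(hu_values, dtype=float)
    return skew(hu), np.mean(hu < threshold)

# Hypothetical lung-voxel samples (HU): healthy-like vs. emphysema-like mixtures.
rng = np.random.default_rng(0)
healthy = rng.normal(-860, 40, 100_000)
emphysema = np.concatenate([rng.normal(-860, 40, 70_000),
                            rng.normal(-970, 15, 30_000)])
print(emphysema_indices(healthy))
print(emphysema_indices(emphysema))   # more negative skew, larger low-attenuation fraction
```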

  9. Are weather models better than gridded observations for precipitation in the mountains? (Invited)

    NASA Astrophysics Data System (ADS)

    Gutmann, E. D.; Rasmussen, R.; Liu, C.; Ikeda, K.; Clark, M. P.; Brekke, L. D.; Arnold, J.; Raff, D. A.

    2013-12-01

    Mountain snowpack is a critical storage component in the water cycle, and it provides drinking water for tens of millions of people in the Western US alone. This water store is susceptible to climate change both because warming temperatures are likely to lead to earlier melt and a temporal shift of the hydrograph, and because changing atmospheric conditions are likely to change the precipitation patterns that produce the snowpack. Current measurements of snowfall in complex terrain are limited in number, due in part to the logistics of installing equipment in such terrain. We show that this limitation leads to statistical artifacts in gridded observations of the current climate, including errors in precipitation season totals of a factor of two or more, increases in wet-day fraction, and decreases in storm intensity. In contrast, a high-resolution numerical weather model (WRF) is able to reproduce observed precipitation patterns, leading to confidence in its predictions for areas without measurements, and new observations support this. Running WRF for a future climate scenario shows substantial changes in the spatial patterns of precipitation in the mountains, related to the physics of hydrometeor production and detrainment, that are not captured by statistical downscaling products. The stationarity assumption in statistical downscaling products is likely to lead to important errors in our estimation of future precipitation in complex terrain.

  10. Statistical algorithms improve accuracy of gene fusion detection

    PubMed Central

    Hsieh, Gillian; Bierman, Rob; Szabo, Linda; Lee, Alex Gia; Freeman, Donald E.; Watson, Nathaniel; Sweet-Cordero, E. Alejandro

    2017-01-01

    Abstract Gene fusions are known to play critical roles in tumor pathogenesis. Yet, sensitive and specific algorithms to detect gene fusions in cancer do not currently exist. In this paper, we present a new statistical algorithm, MACHETE (Mismatched Alignment CHimEra Tracking Engine), which achieves highly sensitive and specific detection of gene fusions from RNA-Seq data, including the highest Positive Predictive Value (PPV) compared to the current state-of-the-art, as assessed in simulated data. We show that the best performing published algorithms either find large numbers of fusions in negative control data or suffer from low sensitivity detecting known driving fusions in gold standard settings, such as EWSR1-FLI1. As proof of principle that MACHETE discovers novel gene fusions with high accuracy in vivo, we mined public data to discover and subsequently PCR validate novel gene fusions missed by other algorithms in the ovarian cancer cell line OVCAR3. These results highlight the gains in accuracy achieved by introducing statistical models into fusion detection, and pave the way for unbiased discovery of potentially driving and druggable gene fusions in primary tumors. PMID:28541529

  11. 76 FR 45505 - Notice of Intent To Revise a Currently Approved Information Collection

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-29

    ... cooperative agreement between the Center for Disease Control (CDC) and the National Agricultural Statistics... DEPARTMENT OF AGRICULTURE National Agricultural Statistics Service Notice of Intent To Revise a Currently Approved Information Collection AGENCY: National Agricultural Statistics Service, USDA. ACTION...

  12. A Prototype System for Retrieval of Gene Functional Information

    PubMed Central

    Folk, Lillian C.; Patrick, Timothy B.; Pattison, James S.; Wolfinger, Russell D.; Mitchell, Joyce A.

    2003-01-01

    Microarrays allow researchers to gather data about the expression patterns of thousands of genes simultaneously. Statistical analysis can reveal which genes show statistically significant results. Making biological sense of those results requires the retrieval of functional information about the genes thus identified, typically a manual gene-by-gene retrieval of information from various on-line databases. For experiments generating thousands of genes of interest, retrieval of functional information can become a significant bottleneck. To address this issue, we are currently developing a prototype system to automate the process of retrieval of functional information from multiple on-line sources. PMID:14728346

  13. Statistical tests for detecting associations with groups of genetic variants: generalization, evaluation, and implementation

    PubMed Central

    Ferguson, John; Wheeler, William; Fu, YiPing; Prokunina-Olsson, Ludmila; Zhao, Hongyu; Sampson, Joshua

    2013-01-01

    With recent advances in sequencing, genotyping arrays, and imputation, GWAS now aim to identify associations with rare and uncommon genetic variants. Here, we describe and evaluate a class of statistics, generalized score statistics (GSS), that can test for an association between a group of genetic variants and a phenotype. GSS are a simple weighted sum of single-variant statistics and their cross-products. We show that the majority of statistics currently used to detect associations with rare variants are equivalent to choosing a specific set of weights within this framework. We then evaluate the power of various weighting schemes as a function of variant characteristics, such as MAF, the proportion associated with the phenotype, and the direction of effect. Ultimately, we find that two classical tests are robust and powerful, but details are provided as to when other GSS may perform favorably. The software package CRaVe is available at our website (http://dceg.cancer.gov/bb/tools/crave). PMID:23092956
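    A much-simplified sketch of the weighted-sum idea behind GSS: single-variant score statistics are combined either as a squared weighted sum (burden-style) or as a weighted sum of squares (variance-component-style), both of which are special cases of the quadratic forms described above. Null distributions and covariance adjustments are omitted, and the data are simulated.

```python
import numpy as np

def single_variant_scores(G, y):
    """Score statistics U_j = sum_i G_ij (y_i - ybar) for each variant."""
    return G.T @ (y - y.mean())

def burden_stat(U, w):
    """Burden-type statistic: weighted sum of scores, then squared."""
    return (w @ U) ** 2

def skat_like_stat(U, w):
    """Variance-component-type statistic: weighted sum of squared scores."""
    return np.sum((w * U) ** 2)

# Hypothetical genotype matrix (individuals x rare variants) and phenotype.
rng = np.random.default_rng(0)
G = rng.binomial(2, 0.02, size=(1000, 20)).astype(float)
y = rng.normal(size=1000) + 0.4 * G[:, :5].sum(axis=1)   # first 5 variants causal
U = single_variant_scores(G, y)
w = np.ones(20)                                          # flat weights for simplicity
print(burden_stat(U, w), skat_like_stat(U, w))
```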

  14. Analytical Estimation of the Scale of Earth-Like Planetary Magnetic Fields

    NASA Astrophysics Data System (ADS)

    Bologna, Mauro; Tellini, Bernardo

    2014-10-01

    In this paper we analytically estimate the magnetic field scale of planets with physical core conditions similar to those of Earth, from a statistical physics point of view. We evaluate the magnetic field on the basis of the physical parameters of the center of the planet, such as density, temperature, and core size. We look at the contribution of the Seebeck effect to the magnetic field, showing that a thermally induced electrical current can exist in a rotating fluid sphere. We apply our calculations to Earth, where the currents would be driven by the temperature difference at the outer-inner core boundary, to Jupiter, and to Jupiter's satellite Ganymede. In each case we show that the thermal generation of currents leads to a magnetic field scale comparable to the observed fields of the considered celestial bodies.

  15. Biased relevance filtering in the auditory system: A test of confidence-weighted first-impressions.

    PubMed

    Mullens, D; Winkler, I; Damaso, K; Heathcote, A; Whitson, L; Provost, A; Todd, J

    2016-03-01

    Although first-impressions are known to impact decision-making and to have prolonged effects on reasoning, it is less well known that the same type of rapidly formed assumptions can explain biases in automatic relevance filtering outside of deliberate behavior. This paper features two studies in which participants have been asked to ignore sequences of sound while focusing attention on a silent movie. The sequences consisted of blocks, each with a high-probability repetition interrupted by rare acoustic deviations (i.e., a sound of different pitch or duration). The probabilities of the two different sounds alternated across the concatenated blocks within the sequence (i.e., short-to-long and long-to-short). The sound probabilities are rapidly and automatically learned for each block and a perceptual inference is formed predicting the most likely characteristics of the upcoming sound. Deviations elicit a prediction-error signal known as mismatch negativity (MMN). Computational models of MMN generally assume that its elicitation is governed by transition statistics that define what sound attributes are most likely to follow the current sound. MMN amplitude reflects prediction confidence, which is derived from the stability of the current transition statistics. However, our prior research showed that MMN amplitude is modulated by a strong first-impression bias that outweighs transition statistics. Here we test the hypothesis that this bias can be attributed to assumptions about the predictable vs. unpredictable nature of each tone within the first encountered context, which is weighted by the stability of that context. The results of Study 1 show that this bias is initially prevented if there is no 1:1 mapping between sound attributes and probability, but it returns once the auditory system determines which properties provide the highest predictive value. The results of Study 2 show that confidence in the first-impression bias drops if assumptions about the temporal stability of the transition statistics are violated. Both studies provide compelling evidence that the auditory system extrapolates patterns on multiple timescales to adjust its response to prediction-errors, while profoundly distorting the effects of transition statistics by the assumptions formed on the basis of first-impressions.

  16. 77 FR 36477 - Notice of Intent To Revise and Extend a Currently Approved Information Collection

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-19

    ... DEPARTMENT OF AGRICULTURE National Agricultural Statistics Service Notice of Intent To Revise and Extend a Currently Approved Information Collection AGENCY: National Agricultural Statistics Service, USDA... Reduction Act of 1995 this notice announces the intention of the National Agricultural Statistics Service...

  17. The Necessity of the Hippocampus for Statistical Learning

    PubMed Central

    Covington, Natalie V.; Brown-Schmidt, Sarah; Duff, Melissa C.

    2018-01-01

    Converging evidence points to a role for the hippocampus in statistical learning, but open questions about its necessity remain. Evidence for necessity comes from Schapiro and colleagues, who report that a single patient with damage to the hippocampus and broader medial temporal lobe cortex was unable to discriminate new from old sequences in several statistical learning tasks. The aim of the current study was to replicate these methods in a larger group of patients who have either damage localized to the hippocampus or broader medial temporal lobe damage, to ascertain the necessity of the hippocampus in statistical learning. Patients with hippocampal damage consistently showed less learning overall compared with healthy comparison participants, consistent with an emerging consensus for hippocampal contributions to statistical learning. Interestingly, lesion size did not reliably predict performance. However, patients with hippocampal damage were not uniformly at chance and demonstrated above-chance performance in some task variants. These results suggest that the hippocampus is necessary for the statistical learning levels achieved by most healthy comparison participants, but that significant hippocampal pathology alone does not abolish such learning. PMID:29308986

  18. Walking through the statistical black boxes of plant breeding.

    PubMed

    Xavier, Alencar; Muir, William M; Craig, Bruce; Rainey, Katy Martin

    2016-10-01

    The main statistical procedures in plant breeding are based on Gaussian processes and can be computed through mixed linear models. Intelligent decision making relies on our ability to extract useful information from data to help us achieve our goals more efficiently. Many plant breeders and geneticists perform statistical analyses without understanding the underlying assumptions of the methods or their strengths and pitfalls. In other words, they treat these statistical methods (software and programs) like black boxes. Black boxes represent complex pieces of machinery with contents that are not fully understood by the user. The user sees the inputs and outputs without knowing how the outputs are generated. By providing a general background on statistical methodologies, this review aims (1) to introduce basic concepts of machine learning and its applications to plant breeding; (2) to link classical selection theory to current statistical approaches; (3) to show how to solve mixed models and extend their application to pedigree-based and genomic-based prediction; and (4) to clarify how the algorithms of genome-wide association studies work, including their assumptions and limitations.
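    As an illustration of point (3), a minimal ridge-regression BLUP sketch (equivalent, up to scaling, to GBLUP) on simulated marker data; the shrinkage parameter and all data below are hypothetical, and in practice the variance components would be estimated (e.g., by REML) rather than fixed by hand.

```python
import numpy as np

def rrblup(M, y, lam):
    """Ridge-regression BLUP of marker effects.

    Solves (M'M + lambda I) beta = M'(y - mean(y)), where
    lambda = sigma_e^2 / sigma_marker^2 is supplied directly here for illustration.
    """
    yc = y - y.mean()
    p = M.shape[1]
    return np.linalg.solve(M.T @ M + lam * np.eye(p), M.T @ yc)

# Hypothetical marker matrix (individuals x markers, coded -1/0/1) and phenotype.
rng = np.random.default_rng(0)
M = rng.integers(-1, 2, size=(300, 1000)).astype(float)
true_effects = np.zeros(1000)
true_effects[:20] = rng.normal(0, 0.5, 20)           # 20 causal markers
y = M @ true_effects + rng.normal(0, 1.0, 300)

beta_hat = rrblup(M, y, lam=100.0)
gebv = M @ beta_hat                                  # genomic estimated breeding values
print(np.corrcoef(gebv, M @ true_effects)[0, 1])     # correlation with true genetic values
```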

  19. A Ratio Test of Interrater Agreement with High Specificity

    ERIC Educational Resources Information Center

    Cousineau, Denis; Laurencelle, Louis

    2015-01-01

    Existing tests of interrater agreement have high statistical power; however, they lack specificity. If the ratings of the two raters do not show agreement but are not random, the current tests, some of which are based on Cohen's kappa, will often reject the null hypothesis, leading to the wrong conclusion that agreement is present. A new test of…

  20. Rethinking Attachment: Fostering Positive Relationships between Infants, Toddlers and Their Primary Caregivers

    ERIC Educational Resources Information Center

    Ebbeck, Marjory; Yim, Hoi Yin Bonnie

    2009-01-01

    This article provides a synthesis of current theory and research in relation to attachment between infants/toddlers and their caregivers. Worldwide statistics show that there are a significant number of women working in the global labour market. In Australia, recent research also found that over 300,000 children aged 0-5 years are currently…

  1. Status and Trends in the Education of Racial and Ethnic Groups 2017. NCES 2017-051

    ERIC Educational Resources Information Center

    Musu-Gillette, Lauren; de Brey, Cristobal; McFarland, Joel; Hussar, William; Sonnenberg, William; Wilkinson-Flicker, Sidney

    2017-01-01

    This report uses statistics to examine current conditions and changes over time in education activities and outcomes for different racial/ethnic groups in the United States. This report shows that over time, students in the racial/ethnic groups of White, Black, Hispanic, Asian, Native Hawaiian or Other Pacific Islander, American Indian/Alaska…

  2. Exploring the practicing-connections hypothesis: using gesture to support coordination of ideas in understanding a complex statistical concept.

    PubMed

    Son, Ji Y; Ramos, Priscilla; DeWolf, Melissa; Loftus, William; Stigler, James W

    2018-01-01

    In this article, we begin to lay out a framework and approach for studying how students come to understand complex concepts in rich domains. Grounded in theories of embodied cognition, we advance the view that understanding of complex concepts requires students to practice, over time, the coordination of multiple concepts, and the connection of this system of concepts to situations in the world. Specifically, we explore the role that a teacher's gesture might play in supporting students' coordination of two concepts central to understanding in the domain of statistics: mean and standard deviation. In Study 1 we show that university students who have just taken a statistics course nevertheless have difficulty taking both mean and standard deviation into account when thinking about a statistical scenario. In Study 2 we show that presenting the same scenario with an accompanying gesture to represent variation significantly impacts students' interpretation of the scenario. Finally, in Study 3 we present evidence that instructional videos on the internet fail to leverage gesture as a means of facilitating understanding of complex concepts. Taken together, these studies illustrate an approach to translating current theories of cognition into principles that can guide instructional design.

  3. Commissioning of the NPDGamma Detector Array: Counting Statistics in Current Mode Operation and Parity Violation in the Capture of Cold Neutrons on B4C and 27Al.

    PubMed

    Gericke, M T; Bowman, J D; Carlini, R D; Chupp, T E; Coulter, K P; Dabaghyan, M; Desai, D; Freedman, S J; Gentile, T R; Gillis, R C; Greene, G L; Hersman, F W; Ino, T; Ishimoto, S; Jones, G L; Lauss, B; Leuschner, M B; Losowski, B; Mahurin, R; Masuda, Y; Mitchell, G S; Muto, S; Nann, H; Page, S A; Penttila, S I; Ramsay, W D; Santra, S; Seo, P-N; Sharapov, E I; Smith, T B; Snow, W M; Wilburn, W S; Yuan, V; Zhu, H

    2005-01-01

    The NPDGamma γ-ray detector has been built to measure, with high accuracy, the size of the small parity-violating asymmetry in the angular distribution of gamma rays from the capture of polarized cold neutrons by protons. The high cold neutron flux at the Los Alamos Neutron Scattering Center (LANSCE) spallation neutron source and control of systematic errors require the use of current mode detection with vacuum photodiodes and low-noise solid-state preamplifiers. We show that the detector array operates at counting statistics and that the asymmetries due to B4C and 27Al are zero to within 2 × 10⁻⁶ and 7 × 10⁻⁷, respectively. Boron and aluminum are used throughout the experiment. The results presented here are preliminary.
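    A toy illustration, not the experiment's analysis, of what "operating at counting statistics" means: for a Poisson-distributed number of detected quanta per gate, the variance of the integrated signal equals its mean, so the variance-to-mean ratio stays near one unless excess instrumental noise is present. All numbers are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
mean_counts = 1e6                      # hypothetical quanta per integration gate
signal = rng.poisson(mean_counts, size=10_000).astype(float)
signal_noisy = signal + rng.normal(0, 3_000, size=10_000)   # added instrumental noise

# Variance-to-mean ratio: ~1 when counting-statistics limited, >1 with excess noise.
for s, label in [(signal, "counting-statistics limited"),
                 (signal_noisy, "excess instrumental noise")]:
    print(label, s.var() / s.mean())
```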

  4. The influence of non-Gaussian distribution functions on the time-dependent perpendicular transport of energetic particles

    NASA Astrophysics Data System (ADS)

    Lasuik, J.; Shalchi, A.

    2018-06-01

    In the current paper we explore the influence of the assumed particle statistics on the transport of energetic particles across a mean magnetic field. In previous work the assumption of a Gaussian distribution function was standard, although there have been known cases for which the transport is non-Gaussian. In the present work we combine a kappa distribution with the ordinary differential equation provided by the so-called unified non-linear transport theory. We then compute running perpendicular diffusion coefficients for different values of κ and turbulence configurations. We show that changing the parameter κ slightly increases or decreases the perpendicular diffusion coefficient, depending on the considered turbulence configuration. Since these changes are small, we conclude that the assumed statistics is of minor significance in particle transport theory. The results obtained in the current paper support the use of a Gaussian distribution function, as is usually done in particle transport theory.

  5. 76 FR 44960 - Agency Information Collection Activities; Submission for OMB Review; Comment Request; Report on...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-27

    ... for OMB Review; Comment Request; Report on Current Employment Statistics ACTION: Notice. SUMMARY: The Department of Labor (DOL) is submitting the revised Bureau of Labor Statistics (BLS) sponsored information collection request (ICR) titled, ``Report on Current Employment Statistics,'' to the Office of Management and...

  6. Clustering, randomness and regularity in cloud fields. I - Theoretical considerations. II - Cumulus cloud fields

    NASA Technical Reports Server (NTRS)

    Weger, R. C.; Lee, J.; Zhu, Tianri; Welch, R. M.

    1992-01-01

    The current controversy regarding regularity vs. clustering in cloud fields is examined by means of analysis and simulation studies based upon nearest-neighbor cumulative distribution statistics. It is shown that the Poisson representation of random point processes is superior to pseudorandom-number-generated models and that the latter bias the observed nearest-neighbor statistics towards regularity. The interpretation of these nearest-neighbor statistics is discussed for many cases of superpositions of clustering, randomness, and regularity. A detailed analysis of cumulus cloud field spatial distributions is carried out based upon Landsat, AVHRR, and Skylab data, showing that, when both large and small clouds are included in the cloud field distributions, the cloud field always has a strong clustering signal.
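    A minimal sketch of a nearest-neighbor diagnostic in the same spirit: the Clark-Evans ratio compares the observed mean nearest-neighbor distance with the value expected for a Poisson (completely random) point process. Edge corrections and the cumulative-distribution analysis of the paper are omitted, and the point sets below are synthetic.

```python
import numpy as np
from scipy.spatial import cKDTree

def clark_evans_ratio(points, area):
    """Observed mean nearest-neighbour distance / expectation under CSR.

    R ~ 1 for complete spatial randomness, R < 1 for clustering,
    R > 1 for regularity; edge effects are ignored in this sketch.
    """
    tree = cKDTree(points)
    d, _ = tree.query(points, k=2)           # k=2: nearest neighbour other than self
    observed = d[:, 1].mean()
    density = len(points) / area
    expected = 0.5 / np.sqrt(density)
    return observed / expected

rng = np.random.default_rng(0)
random_pts = rng.uniform(0, 100, size=(500, 2))
clustered = np.concatenate([rng.normal(c, 2.0, size=(50, 2))
                            for c in rng.uniform(0, 100, size=(10, 2))])
print(clark_evans_ratio(random_pts, 100 * 100))   # ~1 (random)
print(clark_evans_ratio(clustered, 100 * 100))    # <1 (clustered)
```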

  7. Fluctuating hydrodynamics, current fluctuations, and hyperuniformity in boundary-driven open quantum chains

    NASA Astrophysics Data System (ADS)

    Carollo, Federico; Garrahan, Juan P.; Lesanovsky, Igor; Pérez-Espigares, Carlos

    2017-11-01

    We consider a class of either fermionic or bosonic noninteracting open quantum chains driven by dissipative interactions at the boundaries and study the interplay of coherent transport and dissipative processes, such as bulk dephasing and diffusion. Starting from the microscopic formulation, we show that the dynamics on large scales can be described in terms of fluctuating hydrodynamics. This is an important simplification as it allows us to apply the methods of macroscopic fluctuation theory to compute the large deviation (LD) statistics of time-integrated currents. In particular, this permits us to show that fermionic open chains display a third-order dynamical phase transition in LD functions. We show that this transition is manifested in a singular change in the structure of trajectories: while typical trajectories are diffusive, rare trajectories associated with atypical currents are ballistic and hyperuniform in their spatial structure. We confirm these results by numerically simulating ensembles of rare trajectories via the cloning method, and by exact numerical diagonalization of the microscopic quantum generator.
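    The cloning method mentioned at the end is a population-dynamics algorithm for sampling rare current fluctuations: clones are weighted by exp(-s × current increment) and resampled, and the growth rate of the total weight estimates the scaled cumulant generating function. A toy sketch for i.i.d. ±1 current increments (not the boundary-driven quantum chain of the paper), chosen so the result can be checked against the exact answer:

```python
import numpy as np

def cloning_estimate(s, p=0.6, n_clones=5000, t_steps=400, seed=0):
    """Toy cloning estimate of psi(s) = (1/T) ln <exp(-s Q_T)> for a current
    built from i.i.d. +/-1 increments (probability p of +1).

    Each step, clones are weighted by exp(-s * increment) and resampled in
    proportion to the weights; the running log of the mean weight gives psi(s),
    and the surviving clones carry the atypical, s-tilted current.
    """
    rng = np.random.default_rng(seed)
    log_growth, q_clone = 0.0, np.zeros(n_clones)
    for _ in range(t_steps):
        inc = rng.choice([1, -1], size=n_clones, p=[p, 1 - p])
        w = np.exp(-s * inc)
        log_growth += np.log(w.mean())
        idx = rng.choice(n_clones, size=n_clones, p=w / w.sum())   # resample clones
        q_clone = (q_clone + inc)[idx]
    return log_growth / t_steps, q_clone.mean() / t_steps

s, p = 0.3, 0.6
psi, tilted_current = cloning_estimate(s, p)
print(psi, np.log(p * np.exp(-s) + (1 - p) * np.exp(s)))   # estimate vs exact psi(s)
print(tilted_current)   # atypical current carried by the clones (below 2p - 1 = 0.2)
```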

  8. Fluctuating hydrodynamics, current fluctuations, and hyperuniformity in boundary-driven open quantum chains.

    PubMed

    Carollo, Federico; Garrahan, Juan P; Lesanovsky, Igor; Pérez-Espigares, Carlos

    2017-11-01

    We consider a class of either fermionic or bosonic noninteracting open quantum chains driven by dissipative interactions at the boundaries and study the interplay of coherent transport and dissipative processes, such as bulk dephasing and diffusion. Starting from the microscopic formulation, we show that the dynamics on large scales can be described in terms of fluctuating hydrodynamics. This is an important simplification as it allows us to apply the methods of macroscopic fluctuation theory to compute the large deviation (LD) statistics of time-integrated currents. In particular, this permits us to show that fermionic open chains display a third-order dynamical phase transition in LD functions. We show that this transition is manifested in a singular change in the structure of trajectories: while typical trajectories are diffusive, rare trajectories associated with atypical currents are ballistic and hyperuniform in their spatial structure. We confirm these results by numerically simulating ensembles of rare trajectories via the cloning method, and by exact numerical diagonalization of the microscopic quantum generator.

  9. Correnti atmosferiche su Giove (Atmospheric Currents on Jupiter)

    NASA Astrophysics Data System (ADS)

    Adamoli, Gianluigi

    2006-06-01

    UAI observations are presented concerning the surveillance of Jupiter's atmospheric currents by means of digital images. General statistics are derived about the latitude and speed of individual spots and currents in the period 2000-04, compared with the Voyager wind profile. Attention is drawn to the wind shear present at distinct latitudes, namely on the South edge of the SEB, across the NTB and across the NEB. Especially interesting were the 2003 remnants of the disappearing NTB, which showed a motion intermediate between the NTC and the NTBs jet streams. Vorticity was derived in all cases.

  10. Chemical potential of quasi-equilibrium magnon gas driven by pure spin current.

    PubMed

    Demidov, V E; Urazhdin, S; Divinskiy, B; Bessonov, V D; Rinkevich, A B; Ustinov, V V; Demokritov, S O

    2017-11-17

    Pure spin currents provide the possibility to control the magnetization state of conducting and insulating magnetic materials. They allow one to increase or reduce the density of magnons, and achieve coherent dynamic states of magnetization reminiscent of the Bose-Einstein condensation. However, until now there was no direct evidence that the state of the magnon gas subjected to spin current can be treated thermodynamically. Here, we show experimentally that the spin current generated by the spin-Hall effect drives the magnon gas into a quasi-equilibrium state that can be described by the Bose-Einstein statistics. The magnon population function is characterized either by an increased effective chemical potential or by a reduced effective temperature, depending on the spin current polarization. In the former case, the chemical potential can closely approach, at large driving currents, the lowest-energy magnon state, indicating the possibility of spin current-driven Bose-Einstein condensation.
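    For reference, the Bose-Einstein occupation underlying the quasi-equilibrium description above is n(E) = 1/(exp((E - μ)/kBT) - 1); as the effective chemical potential μ approaches the lowest magnon energy, the population of the low-energy states grows sharply. A small numerical illustration with purely hypothetical magnon energies:

```python
import numpy as np

def bose_einstein(E, mu, T, kB=1.380649e-23):
    """Bose-Einstein occupation n(E) = 1 / (exp((E - mu)/(kB*T)) - 1), valid for mu < E."""
    return 1.0 / np.expm1((E - mu) / (kB * T))

# Illustrative numbers only: lowest magnon energy in the GHz range, room temperature.
E_min = 2.0e-24                              # J (hypothetical band bottom, ~3 GHz)
E = np.linspace(E_min, 40e-24, 5)            # a few magnon energies
T = 300.0
for mu in [0.0, 0.5 * E_min, 0.99 * E_min]:
    print(f"mu/E_min = {mu / E_min:.2f}:", bose_einstein(E, mu, T))
```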

  11. Anisotropic quantum quench in the presence of frustration or background gauge fields: A probe of bulk currents and topological chiral edge modes

    NASA Astrophysics Data System (ADS)

    Killi, Matthew; Trotzky, Stefan; Paramekanti, Arun

    2012-12-01

    Bosons and fermions, in the presence of frustration or background gauge fields, can form many-body ground states that support equilibrium charge or spin currents. Motivated by the experimental creation of frustration or synthetic gauge fields in ultracold atomic systems, we propose a general scheme by which making a sudden anisotropic quench of the atom tunneling across the lattice and tracking the ensuing density modulations provides a powerful and gauge-invariant route to probing diverse equilibrium current patterns. Using illustrative examples of trapped superfluid Bose and normal Fermi systems in the presence of artificial magnetic fluxes on square lattices, and frustrated bosons in a triangular lattice, we show that this scheme to probe equilibrium bulk current order works independent of particle statistics. We also show that such quenches can detect chiral edge modes in gapped topological states, such as quantum Hall or quantum spin Hall insulators.

  12. Higher Education Financial Statistics, 1981-82.

    ERIC Educational Resources Information Center

    Hottinger, Gerald W.

    Statistical data on Pennsylvania higher education finance for 1981-1982 are presented. Tables provide the following information: current-funds revenues by institutional control, 1972-1973 through 1981-1982; percent of current-funds revenues by source, 1972-1973 through 1981-1982; current-funds expenditures by institutional control, 1972-1973…

  13. A quadratically regularized functional canonical correlation analysis for identifying the global structure of pleiotropy with NGS data

    PubMed Central

    Zhu, Yun; Fan, Ruzong; Xiong, Momiao

    2017-01-01

    Investigating the pleiotropic effects of genetic variants can increase statistical power, provide important information for achieving a deep understanding of the complex genetic structures of disease, and offer powerful tools for designing effective treatments with fewer side effects. However, the current multiple phenotype association analysis paradigm lacks breadth (number of phenotypes and genetic variants jointly analyzed at the same time) and depth (hierarchical structure of phenotypes and genotypes). A key issue for high dimensional pleiotropic analysis is to effectively extract informative internal representations and features from high dimensional genotype and phenotype data. To explore the correlation information of genetic variants, effectively reduce data dimensions, and overcome critical barriers in advancing the development of novel statistical methods and computational algorithms for genetic pleiotropic analysis, we propose a new statistical method, referred to as quadratically regularized functional CCA (QRFCCA), for association analysis, which combines three approaches: (1) quadratically regularized matrix factorization, (2) functional data analysis, and (3) canonical correlation analysis (CCA). Large-scale simulations show that QRFCCA has much higher power than the ten competing statistics while retaining the appropriate type 1 error. To further evaluate performance, QRFCCA and the ten other statistics are applied to the whole genome sequencing dataset from the TwinsUK study. We identify a total of 79 genes with rare variants and 67 genes with common variants significantly associated with the 46 traits using QRFCCA. The results show that QRFCCA substantially outperforms the ten other statistics. PMID:29040274
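    The authors' functional QRFCCA is not reproduced here; the sketch below shows only the generic building block it extends, a ridge-regularized CCA between a genotype block and a set of phenotypes, solved through the standard whitened-SVD formulation. All data are simulated and hypothetical.

```python
import numpy as np

def regularized_cca(X, Y, lam=0.1):
    """Ridge-regularized canonical correlations between column-centred X and Y.

    Generic regularized CCA, not the functional QRFCCA of the paper.
    """
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    n = X.shape[0]
    Sxx = X.T @ X / n + lam * np.eye(X.shape[1])
    Syy = Y.T @ Y / n + lam * np.eye(Y.shape[1])
    Sxy = X.T @ Y / n

    def inv_sqrt(S):
        vals, vecs = np.linalg.eigh(S)
        return vecs @ np.diag(1.0 / np.sqrt(vals)) @ vecs.T

    M = inv_sqrt(Sxx) @ Sxy @ inv_sqrt(Syy)
    return np.linalg.svd(M, compute_uv=False)     # canonical correlations

# Hypothetical data: genotypes for one gene region and several correlated traits.
rng = np.random.default_rng(0)
G = rng.binomial(2, 0.3, size=(500, 30)).astype(float)
P = rng.normal(size=(500, 5))
P[:, 0] += 0.3 * G[:, :3].sum(axis=1)             # one trait linked to a few variants
print(regularized_cca(G, P)[:3])
```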

  14. Wavelet analysis in ecology and epidemiology: impact of statistical tests

    PubMed Central

    Cazelles, Bernard; Cazelles, Kévin; Chavez, Mario

    2014-01-01

    Wavelet analysis is now frequently used to extract information from ecological and epidemiological time series. Statistical hypothesis tests are conducted on associated wavelet quantities to assess the likelihood that they are due to a random process. Such random processes represent null models and are generally based on synthetic data that share some statistical characteristics with the original time series. This allows the comparison of null statistics with those obtained from original time series. When creating synthetic datasets, different techniques of resampling result in different characteristics shared by the synthetic time series. Therefore, it becomes crucial to consider the impact of the resampling method on the results. We have addressed this point by comparing seven different statistical testing methods applied with different real and simulated data. Our results show that statistical assessment of periodic patterns is strongly affected by the choice of the resampling method, so two different resampling techniques could lead to two different conclusions about the same time series. Moreover, our results clearly show the inadequacy of resampling series generated by white noise and red noise that are nevertheless the methods currently used in the wide majority of wavelets applications. Our results highlight that the characteristics of a time series, namely its Fourier spectrum and autocorrelation, are important to consider when choosing the resampling technique. Results suggest that data-driven resampling methods should be used such as the hidden Markov model algorithm and the ‘beta-surrogate’ method. PMID:24284892
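    A minimal sketch of why the choice of null model matters, using low-frequency periodogram power instead of wavelet power for brevity (the same surrogate logic applies): a strongly autocorrelated but aperiodic series typically looks "significant" against a white-noise null, while an AR(1) red-noise null typically gives a non-significant result.

```python
import numpy as np
from scipy.signal import periodogram

rng = np.random.default_rng(0)
n, phi = 512, 0.8

# Test series: strongly autocorrelated "red" noise with no genuine periodicity.
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + rng.normal()

def low_freq_power(series):
    """Summed periodogram power in the lowest few frequency bins."""
    _, p = periodogram(series)
    return p[1:6].sum()

observed = low_freq_power(x)

def p_value(make_surrogate, n_surr=500):
    null = np.array([low_freq_power(make_surrogate()) for _ in range(n_surr)])
    return np.mean(null >= observed)

def white_surrogate():
    return rng.normal(0.0, x.std(), n)

def red_surrogate():
    phi_hat = np.corrcoef(x[:-1], x[1:])[0, 1]        # AR(1) fitted to the data
    y = np.zeros(n)
    for t in range(1, n):
        y[t] = phi_hat * y[t - 1] + rng.normal(0.0, x.std() * np.sqrt(1 - phi_hat**2))
    return y

print("p-value vs white-noise null:", p_value(white_surrogate))   # typically ~0 (spurious)
print("p-value vs red-noise null:  ", p_value(red_surrogate))     # typically not significant
```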

  15. Wavelet analysis in ecology and epidemiology: impact of statistical tests.

    PubMed

    Cazelles, Bernard; Cazelles, Kévin; Chavez, Mario

    2014-02-06

    Wavelet analysis is now frequently used to extract information from ecological and epidemiological time series. Statistical hypothesis tests are conducted on associated wavelet quantities to assess the likelihood that they are due to a random process. Such random processes represent null models and are generally based on synthetic data that share some statistical characteristics with the original time series. This allows the comparison of null statistics with those obtained from original time series. When creating synthetic datasets, different techniques of resampling result in different characteristics shared by the synthetic time series. Therefore, it becomes crucial to consider the impact of the resampling method on the results. We have addressed this point by comparing seven different statistical testing methods applied with different real and simulated data. Our results show that statistical assessment of periodic patterns is strongly affected by the choice of the resampling method, so two different resampling techniques could lead to two different conclusions about the same time series. Moreover, our results clearly show the inadequacy of resampling series generated by white noise and red noise that are nevertheless the methods currently used in the wide majority of wavelets applications. Our results highlight that the characteristics of a time series, namely its Fourier spectrum and autocorrelation, are important to consider when choosing the resampling technique. Results suggest that data-driven resampling methods should be used such as the hidden Markov model algorithm and the 'beta-surrogate' method.

  16. Infants' statistical learning: 2- and 5-month-olds' segmentation of continuous visual sequences.

    PubMed

    Slone, Lauren Krogh; Johnson, Scott P

    2015-05-01

    Past research suggests that infants have powerful statistical learning abilities; however, studies of infants' visual statistical learning offer differing accounts of the developmental trajectory of and constraints on this learning. To elucidate this issue, the current study tested the hypothesis that young infants' segmentation of visual sequences depends on redundant statistical cues to segmentation. A sample of 20 2-month-olds and 20 5-month-olds observed a continuous sequence of looming shapes in which unit boundaries were defined by both transitional probability and co-occurrence frequency. Following habituation, only 5-month-olds showed evidence of statistically segmenting the sequence, looking longer to a statistically improbable shape pair than to a probable pair. These results reaffirm the power of statistical learning in infants as young as 5 months but also suggest considerable development of statistical segmentation ability between 2 and 5 months of age. Moreover, the results do not support the idea that infants' ability to segment visual sequences based on transitional probabilities and/or co-occurrence frequencies is functional at the onset of visual experience, as has been suggested previously. Rather, this type of statistical segmentation appears to be constrained by the developmental state of the learner. Factors contributing to the development of statistical segmentation ability during early infancy, including memory and attention, are discussed.
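    For reference, the transitional probabilities used as segmentation cues above are simply conditional pair frequencies. A minimal sketch on a synthetic shape stream built from two fixed pairs, where within-pair transitions have probability 1.0 and between-pair transitions about 0.5 (the actual stimuli of the study are not reproduced):

```python
import numpy as np
from collections import Counter

def transitional_probabilities(sequence):
    """P(next = B | current = A) for every adjacent pair (A, B) in the sequence."""
    pair_counts = Counter(zip(sequence[:-1], sequence[1:]))
    first_counts = Counter(sequence[:-1])
    return {pair: n / first_counts[pair[0]] for pair, n in pair_counts.items()}

# Hypothetical stream built from two fixed shape pairs presented in random order.
rng = np.random.default_rng(0)
pairs = [("A", "B"), ("C", "D")]
stream = [shape for _ in range(500) for shape in pairs[rng.integers(2)]]

tp = transitional_probabilities(stream)
print(tp[("A", "B")])                     # within-pair transition: 1.0
print(tp[("B", "A")], tp[("B", "C")])     # between-pair transitions: ~0.5 each
```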

  17. Quantifying the statistical complexity of low-frequency fluctuations in semiconductor lasers with optical feedback

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tiana-Alsina, J.; Torrent, M. C.; Masoller, C.

    Low-frequency fluctuations (LFFs) represent a dynamical instability that occurs in semiconductor lasers when they are operated near the lasing threshold and subject to moderate optical feedback. LFFs consist of sudden power dropouts followed by gradual, stepwise recoveries. We analyze experimental time series of intensity dropouts and quantify the complexity of the underlying dynamics employing two tools from information theory, namely, Shannon's entropy and the Martin, Plastino, and Rosso statistical complexity measure. These measures are computed using a method based on ordinal patterns, by which the relative length and ordering of consecutive interdropout intervals (i.e., the time intervals between consecutive intensity dropouts) are analyzed, disregarding the precise timing of the dropouts and the absolute durations of the interdropout intervals. We show that this methodology is suitable for quantifying subtle characteristics of the LFFs, and in particular the transition to fully developed chaos that takes place when the laser's pump current is increased. Our method shows that the statistical complexity of the laser does not increase continuously with the pump current, but levels off before reaching the coherence collapse regime. This behavior coincides with that of the first- and second-order correlations of the interdropout intervals, suggesting that these correlations, and not the chaotic behavior, are what determine the level of complexity of the laser's dynamics. These results hold for two different dynamical regimes, namely, sustained LFFs and coexistence between LFFs and steady-state emission.
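
    A minimal sketch of the ordinal-pattern step underlying both measures is given below: each window of consecutive interdropout intervals is mapped to the permutation that sorts it, and the Shannon entropy of the resulting pattern distribution gives the normalized permutation entropy. The full Martin-Plastino-Rosso complexity additionally weights this entropy by a Jensen-Shannon disequilibrium term, which is not reproduced here; the interval data are simulated, not the experimental dropout series.

        import numpy as np
        from itertools import permutations
        from math import factorial, log

        def ordinal_pattern_distribution(x, d=3):
            # Map each window of d consecutive values to the permutation that sorts it.
            patterns = list(permutations(range(d)))
            counts = dict.fromkeys(patterns, 0)
            for i in range(len(x) - d + 1):
                counts[tuple(np.argsort(x[i:i + d]))] += 1
            p = np.array(list(counts.values()), dtype=float)
            return p / p.sum()

        def normalized_permutation_entropy(x, d=3):
            p = ordinal_pattern_distribution(x, d)
            p = p[p > 0]
            return -(p * np.log(p)).sum() / log(factorial(d))

        # Hypothetical interdropout intervals extracted from an intensity time series.
        intervals = np.random.default_rng(1).exponential(1.0, size=2000)
        print(normalized_permutation_entropy(intervals, d=3))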

  18. Learner Autonomy and Selected Demographic Characteristics as They Relate to Life Satisfaction among Older Adults in Malaysia

    ERIC Educational Resources Information Center

    Ng, Siew Foen; Confessore, Gary J.

    2015-01-01

    Malaysia currently has about three million senior citizens. United Nations statistics show that Malaysia is likely to reach "aging nation" status by the year 2035. It is important to address the issues that may have an impact on the needs and concerns of this growing population. This study examined the relationships of life satisfaction,…

  19. A Model Comparison for Count Data with a Positively Skewed Distribution with an Application to the Number of University Mathematics Courses Completed

    ERIC Educational Resources Information Center

    Liou, Pey-Yan

    2009-01-01

    The current study examines three regression models: OLS (ordinary least square) linear regression, Poisson regression, and negative binomial regression for analyzing count data. Simulation results show that the OLS regression model performed better than the others, since it did not produce more false statistically significant relationships than…
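
    A minimal sketch of the three-model comparison on simulated overdispersed count data, using statsmodels (assumed variable names, not the study's mathematics-course data), is shown below.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        n = 500
        x = rng.normal(size=n)
        X = sm.add_constant(x)
        # Simulated counts with overdispersion (negative binomial-like data).
        mu = np.exp(0.3 + 0.5 * x)
        y = rng.negative_binomial(n=2, p=2.0 / (2.0 + mu))

        ols = sm.OLS(y, X).fit()
        poisson = sm.Poisson(y, X).fit(disp=0)
        negbin = sm.NegativeBinomial(y, X).fit(disp=0)

        for name, res in [("OLS", ols), ("Poisson", poisson), ("NegBin", negbin)]:
            print(name, res.params.round(3), "AIC:", round(res.aic, 1))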

  20. Computer Use in the United States: 1989. Current Population Reports, Special Studies.

    ERIC Educational Resources Information Center

    Kominski, Robert

    1991-01-01

    This report provides statistical information on computer use in the United States in 1989, including home, work, and school use, and use according to socioeconomic status, race, and sex. The data show that between 1984 and 1989 there was a substantial increase in the levels of computer ownership and use. Fifteen percent of all U.S. households…

  1. The Research of Feature Extraction Method of Liver Pathological Image Based on Multispatial Mapping and Statistical Properties

    PubMed Central

    Liu, Huiling; Xia, Bingbing; Yi, Dehui

    2016-01-01

    We propose a new feature extraction method for liver pathological images based on multispatial mapping and statistical properties. For liver pathological images with Hematein-Eosin staining, the R and B channels reflect the characteristics of liver pathological images more sensitively, while the entropy space and Local Binary Pattern (LBP) space better reflect the texture features of the image. To obtain more comprehensive information, we map liver pathological images to the entropy space, LBP space, R space, and B space. The traditional Higher Order Local Autocorrelation Coefficients (HLAC) cannot reflect the overall information of the image, so we propose an average-corrected HLAC feature. We calculate the statistical properties and the average gray value of the pathological image and then update the current pixel value to the absolute value of the difference between the current pixel gray value and the average gray value, which makes the feature more sensitive to gray-value changes in pathological images. Lastly, the HLAC template is used to calculate the features of the updated image. The experimental results show that the improved multispatial-mapping features achieve better classification performance for liver cancer. PMID:27022407
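
    The average-correction step described above can be sketched directly: each pixel is replaced by the absolute difference from the image's mean gray value before autocorrelation-type features are computed. The feature function below is a toy stand-in for the full HLAC template set, and the patch is random rather than a stained liver image.

        import numpy as np

        def average_corrected(image):
            # Replace each pixel by |pixel - mean gray value|, as described in the abstract.
            img = image.astype(float)
            return np.abs(img - img.mean())

        def simple_autocorrelation_features(img, offsets=((0, 1), (1, 0), (1, 1))):
            # A toy stand-in for HLAC: sums of products of the image with shifted copies.
            feats = [float((img * img).sum())]      # zeroth-order term
            for dy, dx in offsets:
                shifted = np.roll(np.roll(img, dy, axis=0), dx, axis=1)
                feats.append(float((img * shifted).sum()))
            return feats

        patch = np.random.default_rng(0).integers(0, 256, size=(64, 64))
        print(simple_autocorrelation_features(average_corrected(patch)))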

  2. Statistics: Number of Cancer Survivors

    MedlinePlus

  3. An analysis of quantum coherent solar photovoltaic cells

    NASA Astrophysics Data System (ADS)

    Kirk, A. P.

    2012-02-01

    A new hypothesis (Scully et al., Proc. Natl. Acad. Sci. USA 108 (2011) 15097) suggests that it is possible to exceed the detailed-balance limit on power conversion efficiency set by statistical physics and to increase the power output of a solar photovoltaic cell by using “noise-induced quantum coherence” to increase the current. The fundamental errors of this hypothesis are explained here. As part of this analysis, we show that the maximum photogenerated current density for a practical solar cell is a function of the incident spectrum, sunlight concentration factor, and solar cell energy bandgap, and thus the presence of quantum coherence is irrelevant, as it cannot lead to increased current output from a solar cell.
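
    The detailed-balance bound invoked here can be illustrated with a short calculation: the maximum photocurrent density equals the elementary charge times the above-bandgap photon flux of the incident spectrum, here approximated by a 5800 K blackbody diluted by the Sun's solid angle; the bandgap, temperature and dilution factor below are illustrative assumptions rather than the paper's inputs.

        import numpy as np
        from scipy import constants as c
        from scipy.integrate import quad

        def above_gap_photon_flux(eg_ev, t_sun=5800.0, dilution=2.16e-5, concentration=1.0):
            # Photon flux (m^-2 s^-1) above the bandgap for a blackbody approximation of sunlight.
            kT = c.k * t_sun
            def spectral_flux(e_joule):
                # Blackbody photon flux per unit energy, diluted by the Sun's solid-angle factor.
                return (2 * np.pi / (c.h ** 3 * c.c ** 2)) * e_joule ** 2 / np.expm1(e_joule / kT)
            flux, _ = quad(spectral_flux, eg_ev * c.e, 20 * kT)
            return dilution * concentration * flux

        eg = 1.1                                     # bandgap in eV (illustrative, roughly silicon)
        jmax = c.e * above_gap_photon_flux(eg)       # A per m^2
        print(f"max photocurrent ~ {jmax / 10:.1f} mA/cm^2")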

  4. Empirical evidence for acceleration-dependent amplification factors

    USGS Publications Warehouse

    Borcherdt, R.D.

    2002-01-01

    Site-specific amplification factors, Fa and Fv, used in current U.S. building codes decrease with increasing base acceleration level as implied by the Loma Prieta earthquake at 0.1g and extrapolated using numerical models and laboratory results. The Northridge earthquake recordings of 17 January 1994 and subsequent geotechnical data permit empirical estimates of amplification at base acceleration levels up to 0.5g. Distance measures and normalization procedures used to infer amplification ratios from soil-rock pairs in predetermined azimuth-distance bins significantly influence the dependence of amplification estimates on base acceleration. Factors inferred using a hypocentral distance norm do not show a statistically significant dependence on base acceleration. Factors inferred using norms implied by the attenuation functions of Abrahamson and Silva show a statistically significant decrease with increasing base acceleration. The decrease is statistically more significant for stiff clay and sandy soil (site class D) sites than for stiffer sites underlain by gravely soils and soft rock (site class C). The decrease in amplification with increasing base acceleration is more pronounced for the short-period amplification factor, Fa, than for the midperiod factor, Fv.

  5. Considering whether Medicaid is worth the cost: revisiting the Oregon Health Study.

    PubMed

    Muennig, Peter A; Quan, Ryan; Chiuzan, Codruta; Glied, Sherry

    2015-05-01

    The Oregon Health Study was a groundbreaking experiment in which uninsured participants were randomized to either apply for Medicaid or stay with their current care. The study showed that Medicaid produced numerous important socioeconomic and health benefits but had no statistically significant impact on hypertension, hypercholesterolemia, or diabetes. Medicaid opponents interpreted the findings to mean that Medicaid is not a worthwhile investment. Medicaid proponents viewed the experiment as statistically underpowered and, irrespective of the laboratory values, suggestive that Medicaid is a good investment. We tested these competing claims and, using a sensitive joint test and statistical power analysis, confirmed that the Oregon Health Study did not improve laboratory values. However, we also found that Medicaid is a good value, with a cost of just $62 000 per quality-adjusted life-year gained.
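
    A minimal sketch of the two calculations referenced above, with purely illustrative numbers (not the study's actual sample sizes or costs): the smallest standardized effect detectable at 80% power for a given group size, and the cost-per-QALY arithmetic.

        from statsmodels.stats.power import TTestIndPower

        analysis = TTestIndPower()
        # Hypothetical per-group sample size; the real Oregon Health Study subsamples differ.
        n_per_group = 2000
        detectable = analysis.solve_power(nobs1=n_per_group, alpha=0.05, power=0.8, ratio=1.0)
        print(f"smallest detectable standardized effect: d = {detectable:.3f}")

        # Cost-effectiveness arithmetic: cost per QALY = incremental cost / incremental QALYs.
        incremental_cost, incremental_qalys = 6200.0, 0.1   # illustrative values giving $62,000/QALY
        print(f"cost per QALY gained: ${incremental_cost / incremental_qalys:,.0f}")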

  6. Graphical Presentation of Patient-Treatment Interaction Elucidated by Continuous Biomarkers. Current Practice and Scope for Improvement.

    PubMed

    Shen, Yu-Ming; Le, Lien D; Wilson, Rory; Mansmann, Ulrich

    2017-01-09

    Biomarkers providing evidence for patient-treatment interaction are key in the development and practice of personalized medicine. Knowledge that a patient with a specific feature - as demonstrated through a biomarker - would have an advantage under a given treatment vs. a competing treatment can aid immensely in medical decision-making. Statistical strategies to establish such evidence for continuous biomarkers are complex, and their formal results are thus not easy to communicate. Good graphical representations would help to translate such findings for use in the clinical community. Although general guidelines on how to present figures in clinical reports are available, there remains little guidance for figures elucidating the role of continuous biomarkers in patient-treatment interaction (CBPTI). To address the current lack of comprehensive reviews or adequate guides on graphical presentation within this topic, our study proposes presentation principles for CBPTI plots. In order to understand current practice, we review the development of CBPTI methodology and how CBPTI plots are currently used in clinical research. The quality of a CBPTI plot is determined by how well the presentation provides key information for clinical decision-making. Several criteria for a good CBPTI plot are proposed, including general principles of visual display, use of units presenting absolute outcome measures, appropriate quantification of statistical uncertainty, correct display of benchmarks, and informative content for answering clinical questions, especially on the quantitative advantage for an individual patient with regard to a specific treatment. We examined the development of CBPTI methodology from 2000 to 2014, and reviewed how CBPTI plots were currently used in clinical research in six major clinical journals from 2013 to 2014 using the principle of theoretical saturation. Each CBPTI plot found was assessed for the appropriateness of its presentation and its clinical utility. In our review, a total of seven methodological papers and five clinical reports used CBPTI plots, which we categorized into four types: those that distinguish the outcome effect for each treatment group; those that show the outcome differences between treatment groups (by either partitioning all individuals into subpopulations or modelling the functional form of the interaction); those that evaluate the proportion of population impact of the biomarker; and those that show the classification accuracy of the biomarker. The current practice of utilizing CBPTI plots in clinical reports suffers from methodological shortcomings: lack of presentation of statistical uncertainty, outcome measures scaled in relative rather than absolute units, incorrect use of benchmarks, and figures that are uninformative for answering clinical questions. There is considerable scope for improvement in the graphical representation of CBPTI in clinical reports. The current challenge is to develop instruments for high-quality graphical plots which not only convey quantitative concepts to readers with limited statistical knowledge, but also facilitate medical decision-making.
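
    As a hedged illustration of the kind of figure the review argues for, the sketch below plots a simulated absolute outcome difference between two treatments as a function of a continuous biomarker, with a pointwise 95% confidence band and a zero-benefit benchmark line; the data, model and standard errors are invented, not taken from the reviewed studies.

        import numpy as np
        import matplotlib.pyplot as plt

        biomarker = np.linspace(0, 10, 200)
        # Simulated treatment effect (absolute risk difference) that changes sign with the biomarker.
        effect = 0.04 * (biomarker - 4)
        se = 0.05 + 0.01 * np.abs(biomarker - 5)          # illustrative pointwise standard error

        plt.plot(biomarker, effect, label="estimated risk difference (trt A - trt B)")
        plt.fill_between(biomarker, effect - 1.96 * se, effect + 1.96 * se,
                         alpha=0.3, label="95% pointwise CI")
        plt.axhline(0.0, linestyle="--", label="no-benefit benchmark")
        plt.xlabel("biomarker value")
        plt.ylabel("absolute outcome difference")
        plt.legend()
        plt.show()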

  7. Ocean dynamics studies. [of current-wave interactions

    NASA Technical Reports Server (NTRS)

    1974-01-01

    Both the theoretical and experimental investigations into current-wave interactions are discussed. The following three problems were studied: (1) the dispersion relation of a random gravity-capillary wave field; (2) the changes in the statistical properties of surface waves under the influence of currents; and (3) the interaction of capillary-gravity waves with nonuniform currents. Wave-current interaction was measured, and the feasibility of using such measurements for remote sensing of surface currents was considered. A laser probe was developed to measure the surface statistics, and the possibility of using current-wave interaction as a means of current measurement was demonstrated.

  8. Statistical polarization in greenhouse gas emissions: Theory and evidence.

    PubMed

    Remuzgo, Lorena; Trueba, Carmen

    2017-11-01

    The current debate on climate change is over whether global warming can be limited in order to lessen its impacts. In this sense, evidence of a decrease in the statistical polarization in greenhouse gas (GHG) emissions could encourage countries to establish a stronger multilateral climate change agreement. Based on the interregional and intraregional components of the multivariate generalised entropy measures (Maasoumi, 1986), Gigliarano and Mosler (2009) proposed to study the statistical polarization concept from a multivariate view. In this paper, we apply this approach to study the evolution of this phenomenon in the global distribution of the main GHGs. The empirical analysis has been carried out for the period 1990-2011, considering an endogenous grouping of countries (Aghevli and Mehran, 1981; Davies and Shorrocks, 1989). Most of the statistical polarization indices showed a slightly increasing pattern that was similar regardless of the number of groups considered. Finally, some policy implications are discussed. Copyright © 2017 Elsevier Ltd. All rights reserved.
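
    The between-group versus within-group logic behind such polarization indices can be illustrated with the ordinary (univariate) Theil T decomposition; this is not the exact multivariate Gigliarano-Mosler construction used in the paper, and the emissions values and groups below are hypothetical.

        import numpy as np

        def theil_decomposition(values, groups):
            # Decompose the Theil T index into between-group and within-group components.
            values = np.asarray(values, dtype=float)
            groups = np.asarray(groups)
            mu = values.mean()
            total = np.mean(values / mu * np.log(values / mu))
            between, within = 0.0, 0.0
            for g in np.unique(groups):
                v = values[groups == g]
                share = v.sum() / values.sum()           # group's share of total emissions
                between += share * np.log(v.mean() / mu)
                within += share * np.mean(v / v.mean() * np.log(v / v.mean()))
            return total, between, within

        emissions = [1.2, 0.9, 5.4, 6.1, 0.3, 0.4]        # hypothetical per-capita GHG emissions
        region = ["A", "A", "B", "B", "C", "C"]
        print(theil_decomposition(emissions, region))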

  9. Quantile regression for the statistical analysis of immunological data with many non-detects.

    PubMed

    Eilers, Paul H C; Röder, Esther; Savelkoul, Huub F J; van Wijk, Roy Gerth

    2012-07-07

    Immunological parameters are hard to measure. A well-known problem is the occurrence of values below the detection limit, the non-detects. Non-detects are a nuisance, because classical statistical analyses, like ANOVA and regression, cannot be applied. The more advanced statistical techniques currently available for the analysis of datasets with non-detects can only be used if a small percentage of the data are non-detects. Quantile regression, a generalization of percentiles to regression models, models the median or higher percentiles and tolerates very high numbers of non-detects. We present a non-technical introduction and illustrate it with an application to real data from a clinical trial. We show that by using quantile regression, groups can be compared and that meaningful linear trends can be computed, even if more than half of the data consists of non-detects. Quantile regression is a valuable addition to the statistical methods that can be used for the analysis of immunological datasets with non-detects.
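
    A minimal sketch of median (and upper-quantile) regression with statsmodels on simulated data containing non-detects follows; variable names are made up. The key point is that the fitted quantile is insensitive to the exact value assigned to non-detects as long as they remain below the estimated quantile.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(0)
        n = 200
        dose = rng.uniform(0, 10, n)
        ige = np.exp(0.2 * dose + rng.normal(0, 1, n))     # simulated immunological marker
        detection_limit = 1.5
        # Non-detects: left-censored values are simply set to the detection limit here.
        ige_obs = np.maximum(ige, detection_limit)
        df = pd.DataFrame({"dose": dose, "ige": ige_obs})

        median_fit = smf.quantreg("ige ~ dose", df).fit(q=0.5)
        p75_fit = smf.quantreg("ige ~ dose", df).fit(q=0.75)
        print(median_fit.params, p75_fit.params, sep="\n")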

  10. The response of cortical neurons to in vivo-like input current: theory and experiment: II. Time-varying and spatially distributed inputs.

    PubMed

    Giugliano, Michele; La Camera, Giancarlo; Fusi, Stefano; Senn, Walter

    2008-11-01

    The response of a population of neurons to time-varying synaptic inputs can show a rich phenomenology, hardly predictable from the dynamical properties of the membrane's inherent time constants. For example, a network of neurons in a state of spontaneous activity can respond significantly more rapidly than each single neuron taken individually. Under the assumption that the statistics of the synaptic input is the same for a population of similarly behaving neurons (mean field approximation), it is possible to greatly simplify the study of neural circuits, both in the case in which the statistics of the input are stationary (reviewed in La Camera et al. in Biol Cybern, 2008) and in the case in which they are time varying and unevenly distributed over the dendritic tree. Here, we review theoretical and experimental results on the single-neuron properties that are relevant for the dynamical collective behavior of a population of neurons. We focus on the response of integrate-and-fire neurons and real cortical neurons to long-lasting, noisy, in vivo-like stationary inputs and show how the theory can predict the observed rhythmic activity of cultures of neurons. We then show how cortical neurons adapt on multiple time scales in response to input with stationary statistics in vitro. Next, we review how it is possible to study the general response properties of a neural circuit to time-varying inputs by estimating the response of single neurons to noisy sinusoidal currents. Finally, we address the dendrite-soma interactions in cortical neurons leading to gain modulation and spike bursts, and show how these effects can be captured by a two-compartment integrate-and-fire neuron. Most of the experimental results reviewed in this article have been successfully reproduced by simple integrate-and-fire model neurons.
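
    A minimal sketch of the building block discussed throughout, a leaky integrate-and-fire neuron driven by a noisy, in vivo-like current with prescribed mean and standard deviation, is given below with illustrative parameters; it is not a reproduction of the reviewed mean-field calculations.

        import numpy as np

        rng = np.random.default_rng(0)

        # Leaky integrate-and-fire neuron driven by a noisy current (illustrative parameters).
        dt, T = 1e-4, 2.0                 # time step and duration (s)
        tau_m, R = 20e-3, 100e6           # membrane time constant (s) and resistance (ohm)
        v_rest, v_thresh, v_reset = -70e-3, -50e-3, -65e-3
        mu_I, sigma_I = 0.25e-9, 0.1e-9   # mean and std of input current (A)

        v = v_rest
        spike_times = []
        for step in range(int(T / dt)):
            I = mu_I + sigma_I * rng.normal() / np.sqrt(dt)   # white-noise current
            v += dt / tau_m * (-(v - v_rest) + R * I)
            if v >= v_thresh:
                spike_times.append(step * dt)
                v = v_reset

        print(f"mean firing rate: {len(spike_times) / T:.1f} Hz")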

  11. Double-row vs single-row rotator cuff repair: a review of the biomechanical evidence.

    PubMed

    Wall, Lindley B; Keener, Jay D; Brophy, Robert H

    2009-01-01

    A review of the current literature will show a difference between the biomechanical properties of double-row and single-row rotator cuff repairs. Rotator cuff tears commonly necessitate surgical repair; however, the optimal technique for repair continues to be investigated. Recently, double-row repairs have been considered an alternative to single-row repair, allowing a greater coverage area for healing and a possibly stronger repair. We reviewed the literature of all biomechanical studies comparing double-row vs single-row repair techniques. Inclusion criteria included studies using cadaveric, animal, or human models that directly compared double-row vs single-row repair techniques, written in the English language, and published in peer-reviewed journals. Identified articles were reviewed to provide a comprehensive conclusion on the biomechanical strength and integrity of the repair techniques. Fifteen studies were identified and reviewed. Nine studies showed a statistically significant advantage to a double-row repair with regard to biomechanical strength, failure, and gap formation. Three studies produced results that did not show any statistical advantage. Five studies that directly compared footprint reconstruction all demonstrated that the double-row repair was superior to a single-row repair in restoring anatomy. The current literature reveals that the biomechanical properties of a double-row rotator cuff repair are superior to a single-row repair. Basic Science Study, SRH = Single vs. Double Row RCR.

  12. What Should Researchers Expect When They Replicate Studies? A Statistical View of Replicability in Psychological Science.

    PubMed

    Patil, Prasad; Peng, Roger D; Leek, Jeffrey T

    2016-07-01

    A recent study of the replicability of key psychological findings is a major contribution toward understanding the human side of the scientific process. Despite the careful and nuanced analysis reported, the simple narrative disseminated by the mass, social, and scientific media was that in only 36% of the studies were the original results replicated. In the current study, however, we showed that 77% of the replication effect sizes reported were within a 95% prediction interval calculated using the original effect size. Our analysis suggests two critical issues in understanding replication of psychological studies. First, researchers' intuitive expectations for what a replication should show do not always match with statistical estimates of replication. Second, when the results of original studies are very imprecise, they create wide prediction intervals and a broad range of replication effects that are consistent with the original estimates. This may lead to effects that replicate successfully, in that replication results are consistent with statistical expectations, but do not provide much information about the size (or existence) of the true effect. In this light, the results of the Reproducibility Project: Psychology can be viewed as statistically consistent with what one might expect when performing a large-scale replication experiment. © The Author(s) 2016.
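
    The prediction-interval check can be sketched as follows: a replication estimate is deemed consistent with the original if it lies within original ± 1.96·sqrt(SE_orig^2 + SE_rep^2); for correlation effect sizes this is commonly evaluated on the Fisher-z scale, as assumed below with made-up numbers rather than the Reproducibility Project data.

        import numpy as np

        def fisher_z(r):
            return np.arctanh(r)

        def replication_within_prediction_interval(r_orig, n_orig, r_rep, n_rep, z_crit=1.96):
            # Check whether a replication correlation falls in the 95% prediction interval
            # implied by the original estimate (computed on the Fisher-z scale).
            se_orig = 1.0 / np.sqrt(n_orig - 3)
            se_rep = 1.0 / np.sqrt(n_rep - 3)
            half_width = z_crit * np.sqrt(se_orig ** 2 + se_rep ** 2)
            return abs(fisher_z(r_rep) - fisher_z(r_orig)) <= half_width

        # Illustrative numbers, not taken from the Reproducibility Project data.
        print(replication_within_prediction_interval(r_orig=0.35, n_orig=40, r_rep=0.12, n_rep=120))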

  13. The statistical reporting quality of articles published in 2010 in five dental journals.

    PubMed

    Vähänikkilä, Hannu; Tjäderhane, Leo; Nieminen, Pentti

    2015-01-01

    Statistical methods play an important role in medical and dental research. Earlier studies have observed that the current use of statistical methods and the reporting of statistics are responsible for some of the errors in the interpretation of results. The aim of this study was to investigate the quality of statistical reporting in dental research articles. A total of 200 articles published in 2010 were analysed, covering five dental journals: Journal of Dental Research, Caries Research, Community Dentistry and Oral Epidemiology, Journal of Dentistry and Acta Odontologica Scandinavica. Each paper underwent careful scrutiny for the use of statistical methods and reporting. A paper with at least one poor reporting item was classified as 'problems with reporting statistics' and a paper without any poor reporting item as 'acceptable'. The investigation showed that 18 (9%) papers were acceptable and 182 (91%) papers contained at least one poor reporting item. The proportion of papers with at least one poor reporting item in this survey was high (91%). The authors of dental journals should be encouraged to improve the statistical sections of their research articles and to present the results in line with the policy and presentation of the leading dental journals.

  14. The Status of Male Teachers in Public Education Today. Education Policy Brief. Volume 6, Number 4, Winter 2008

    ERIC Educational Resources Information Center

    Johnson, Shaun P.

    2008-01-01

    Current statistics show that roughly one quarter of all classroom teachers are male: this proportion sinks to approximately ten percent in the elementary grades. A scarcity of men in teaching is not a new phenomenon and has remained relatively constant through more than a century of various educational reforms. This brief discusses the historical…

  15. 1979-80 Financial Statistics for Current Cost of Education... Showing San Joaquin Delta College Position...

    ERIC Educational Resources Information Center

    DeRicco, Lawrence A.

    The costs of education per unit of Average Daily Attendance (ADA) are detailed in this two-part report for 70 California community college districts for the academic year 1979-80. Both Part I, which presents data excluding non-resident ADA, and Part II, which presents figures including non-resident ADA, begin with tables which rank order the…

  16. The imprint of f(R) gravity on weak gravitational lensing - II. Information content in cosmic shear statistics

    NASA Astrophysics Data System (ADS)

    Shirasaki, Masato; Nishimichi, Takahiro; Li, Baojiu; Higuchi, Yuichi

    2017-04-01

    We investigate the information content of various cosmic shear statistics on the theory of gravity. Focusing on the Hu-Sawicki-type f(R) model, we perform a set of ray-tracing simulations and measure the convergence bispectrum, peak counts and Minkowski functionals. We first show that while the convergence power spectrum does have sensitivity to the current value of the extra scalar degree of freedom |fR0|, it is largely compensated by a change in the present density amplitude parameter σ8 and the matter density parameter Ωm0. With accurate covariance matrices obtained from 1000 lensing simulations, we then examine the constraining power of the three additional statistics. We find that these probes are indeed helpful to break the parameter degeneracy, which cannot be resolved from the power spectrum alone. We show that especially the peak counts and Minkowski functionals have the potential to rigorously (marginally) detect the signature of modified gravity with the parameter |fR0| as small as 10^-5 (10^-6) if we can properly model them on small (~1 arcmin) scales in a future survey with a sky coverage of 1500 deg^2. We also show that the signal level is similar among the additional three statistics and all of them provide complementary information to the power spectrum. These findings indicate the importance of combining multiple probes beyond the standard power spectrum analysis to detect possible modifications to general relativity.

  17. On the radiated EMI current extraction of dc transmission line based on corona current statistical measurements

    NASA Astrophysics Data System (ADS)

    Yi, Yong; Chen, Zhengying; Wang, Liming

    2018-05-01

    Corona discharge on DC transmission lines is the main cause of the radiated electromagnetic interference (EMI) field in the vicinity of the lines. A joint time-frequency analysis technique was proposed to extract the radiated EMI current (excitation current) of DC corona from corona current statistical measurements. A reduced-scale experimental platform was set up to measure the statistical distributions of the current waveform parameters of aluminum conductor steel reinforced (ACSR) conductors. Based on the measured results, the peak, root-mean-square, and average values of the 0.5 MHz radiated EMI current were calculated with 9 kHz and 200 Hz bandwidths using the proposed technique and validated against the conventional excitation function method. Radio interference (RI) was calculated from the radiated EMI current, and a wire-to-plate platform was built to validate the RI computation results. The reasons for the deviation between the computations and measurements are analyzed in detail.

  18. What are the implications of rapid global warming for landslide-triggered turbidity current activity?

    NASA Astrophysics Data System (ADS)

    Clare, Michael; Peter, Talling; James, Hunt

    2014-05-01

    A geologically short-lived (~170 kyr) episode of global warming occurred at ~55 Ma, termed the Initial Eocene Thermal Maximum (IETM). Global temperatures rose by up to 8 °C over only ~10 kyr and a massive perturbation of the global carbon cycle occurred, creating a negative carbon isotopic (~ -4‰ δ13C) excursion in sedimentary records. This interval has relevance to the study of future climate change and its influence on geohazards, including submarine landslides and turbidity currents. We analyse the recurrence frequency of turbidity currents, potentially initiated from large-volume slope failures. The study focuses on two sedimentary intervals that straddle the IETM, and we discuss implications for turbidity current triggering. We present the results of statistical analyses (regression, generalised linear model, and proportional hazards model) for extensive turbidite records from an outcrop at Zumaia in NE Spain (N=285; 54.0 to 56.5 Ma) and from ODP site 1068 on the Iberian Margin (N=1571; 48.2 to 67.6 Ma). The sedimentary sequences provide clear differentiation between hemipelagic and turbiditic mud with only negligible evidence of erosion. We infer dates for turbidites by converting hemipelagic bed thicknesses to time using interval-averaged accumulation rates. Multi-proxy dating techniques provide good age constraint. The background trend for the Zumaia record shows a near-exponential distribution of turbidite recurrence intervals, while the Iberian Margin shows a log-normal response. This is interpreted to be related to regional time-independence (exponential) and the effects of additive processes (log-normal). We discuss how a log-normal response may actually be generated over geological timescales from multiple shorter periods of random turbidite recurrence. The IETM interval shows a dramatic departure from both these background trends, however. This is marked by prolonged hiatuses (0.1 and 0.6 Myr duration) in turbidity current activity, in contrast to the arithmetic mean recurrence, λ, for the full records (λ=0.007 and 0.0125 Myr). This period of inactivity is coincident with a dramatic carbon isotopic excursion (i.e. the warmest part of the IETM) and heavily skews statistical analyses for both records. Dramatic global warming appears to exert a strong control on inhibiting turbidity current activity, whereas the effects of sea level change are not shown to be statistically significant. Rapid global warming is often implicated as a potential landslide trigger, due to dissociation of gas hydrates in response to elevated ocean temperatures. Other studies have suggested that intense global warming may actually be attributed to the atmospheric release of gas hydrates following catastrophic failure of large parts of a continental slope. Either way, a greater intensity of landslide and resultant turbidity current activity would be expected during the IETM; however, our findings are to the contrary. We offer some explanations in relation to potential triggers. Our work suggests that previous rapid global warming at the IETM did not trigger more frequent turbidity currents. This has direct relevance to future assessments relating to landslide-triggered tsunami hazard, and breakage of subsea cables by turbidity currents.
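
    The exponential-versus-log-normal comparison described above can be sketched by fitting both distributions to a set of recurrence intervals by maximum likelihood and comparing AIC values; the intervals below are simulated, not the Zumaia or ODP 1068 data.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        # Hypothetical recurrence intervals (kyr); in practice these would come from
        # hemipelagic bed thicknesses converted to time.
        intervals = rng.lognormal(mean=2.0, sigma=0.8, size=300)

        # Exponential fit (rate-only; loc fixed at 0) vs log-normal fit.
        loc_e, scale_e = stats.expon.fit(intervals, floc=0)
        shape_l, loc_l, scale_l = stats.lognorm.fit(intervals, floc=0)

        def aic(loglik, k):
            return 2 * k - 2 * loglik

        ll_expon = stats.expon.logpdf(intervals, loc_e, scale_e).sum()
        ll_lognorm = stats.lognorm.logpdf(intervals, shape_l, loc_l, scale_l).sum()
        print("AIC exponential:", round(aic(ll_expon, 1), 1))
        print("AIC log-normal :", round(aic(ll_lognorm, 2), 1))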

  19. Intercomparison of four regional climate models for the German State of Saxonia

    NASA Astrophysics Data System (ADS)

    Kreienkamp, F.; Spekat, A.; Enke, W.

    2009-09-01

    Results from four regional climate models which focus on Central Europe are presented: CCLM, the climate version of the German Weather Service's Local Model; REMO, the regional dynamical model from the Max Planck Institute for Meteorology in Hamburg; STAR, the statistical model developed at the Potsdam Institute for Climate Impact Research (PIK); and WETTREG, the statistical-dynamical model developed by the company CEC Potsdam. For the area of the German State of Saxonia, a host of properties and indicators were analyzed, aiming to show the models' ability to reconstruct the current climate and to compare climate model scenarios. These include a group of thermal indicators, such as the number of ice, frost, summer and hot days and the number of tropical nights; hydrometeorological indicators, such as the exceedance of low and high precipitation thresholds; and humidity, cloudiness and wind indicators that complement the array. A selection of them, showing similarities and differences between the models investigated, will be presented.
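
    Several of the thermal indicators listed above have simple definitional implementations; the sketch below counts frost, ice, summer and hot days and tropical nights from a simulated daily temperature series, using commonly used thresholds (0, 25, 30 and 20 °C) that are assumed here rather than taken from the study.

        import numpy as np
        import pandas as pd

        rng = np.random.default_rng(0)
        days = pd.date_range("2001-01-01", "2001-12-31", freq="D")
        seasonal = 10 + 12 * np.sin(2 * np.pi * (days.dayofyear - 100) / 365.25)
        tmax = seasonal + 5 + rng.normal(0, 3, len(days))   # simulated daily maximum (deg C)
        tmin = seasonal - 5 + rng.normal(0, 3, len(days))   # simulated daily minimum (deg C)

        indicators = {
            "frost days (Tmin < 0)":        int((tmin < 0).sum()),
            "ice days (Tmax < 0)":          int((tmax < 0).sum()),
            "summer days (Tmax > 25)":      int((tmax > 25).sum()),
            "hot days (Tmax > 30)":         int((tmax > 30).sum()),
            "tropical nights (Tmin > 20)":  int((tmin > 20).sum()),
        }
        print(pd.Series(indicators))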

  20. Why Flash Type Matters: A Statistical Analysis

    NASA Astrophysics Data System (ADS)

    Mecikalski, Retha M.; Bitzer, Phillip M.; Carey, Lawrence D.

    2017-09-01

    While the majority of research only differentiates between intracloud (IC) and cloud-to-ground (CG) flashes, there exists a third flash type, known as hybrid flashes. These flashes have extensive IC components as well as return strokes to ground but are misclassified as CG flashes in current flash type analyses due to the presence of a return stroke. In an effort to show that IC, CG, and hybrid flashes should be separately classified, the two-sample Kolmogorov-Smirnov (KS) test was applied to the flash sizes, flash initiation, and flash propagation altitudes for each of the three flash types. The KS test statistically showed that IC, CG, and hybrid flashes do not have the same parent distributions and thus should be separately classified. Separate classification of hybrid flashes will lead to improved lightning-related research, because unambiguously classified hybrid flashes occur on the same order of magnitude as CG flashes for multicellular storms.
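
    The two-sample KS comparison used in the study can be sketched directly with scipy; the altitude samples below are simulated stand-ins for the flash-type populations, not the observed lightning data.

        import numpy as np
        from scipy.stats import ks_2samp

        rng = np.random.default_rng(0)
        # Hypothetical flash initiation altitudes (km) for two flash types.
        ic_altitudes = rng.normal(9.0, 1.5, 400)
        hybrid_altitudes = rng.normal(8.2, 1.8, 250)

        stat, p_value = ks_2samp(ic_altitudes, hybrid_altitudes)
        print(f"KS statistic = {stat:.3f}, p = {p_value:.2e}")
        if p_value < 0.05:
            print("Reject the hypothesis that both samples share the same parent distribution.")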

  1. Statistical analysis of electroconvection near an ion-selective membrane in the highly chaotic regime

    NASA Astrophysics Data System (ADS)

    Druzgalski, Clara; Mani, Ali

    2016-11-01

    We investigate electroconvection and its impact on ion transport in a model system comprised of an ion-selective membrane, an aqueous electrolyte, and an external electric field applied normal to the membrane. We develop a direct numerical simulation code to solve the governing Poisson-Nernst-Planck and Navier-Stokes equations in three dimensions using a specialized parallel numerical algorithm and sufficient resolution to capture the high frequency and high wavenumber physics. We show a comprehensive statistical analysis of the transport phenomena in the highly chaotic regime. Qualitative and quantitative comparisons of two-dimensional (2D) and 3D simulations include prediction of the mean concentration fields as well as the spectra of concentration, charge density, and velocity signals. Our analyses reveal a significant quantitative difference between 2D and 3D electroconvection. Furthermore, we show that high-intensity yet short-lived current density hot spots appear randomly on the membrane surface, contributing significantly to the mean current density. By examining cross correlations between current density on the membrane and other field quantities we explore the physical mechanisms leading to current hot spots. We also present analysis of transport fluxes in the context of ensemble-averaged equations. Our analysis reveals that in the highly chaotic regime the mixing layer (ML), which spans the majority of the domain extent, is governed by advective fluctuations. Furthermore, we show that in the ML the mean electromigration fluxes cancel out for positive and negative ions, indicating that the mean transport of total salt content within the ML can be represented via the electroneutral approximation. Finally, we present an assessment of the importance of different length scales in enhancing transport by computing the cross covariance of concentration and velocity fluctuations in the wavenumber space. Our analysis indicates that in the majority of the domain the large scales contribute most significantly to transport, while the effects of small scales become more appreciable in regions very near the membrane.

  2. A new approach for land degradation and desertification assessment using geospatial techniques

    NASA Astrophysics Data System (ADS)

    Masoudi, Masoud; Jokar, Parviz; Pradhan, Biswajeet

    2018-04-01

    Land degradation reduces the production of biomass and vegetation cover for all forms of land use. The lack of specific data related to degradation is a severe limitation for its monitoring. Assessment of the current state of land degradation or desertification is very difficult because this phenomenon includes several complex processes; for that reason, no common agreement has been reached among the scientific community on its assessment. This study was carried out as an attempt to develop a new approach for land degradation assessment based on its current state, by modifying the Food and Agriculture Organization (FAO)-United Nations Environment Programme (UNEP) index and the normalized difference vegetation index (NDVI) in Khuzestan province, southwestern Iran. The proposed evaluation method makes it easy to understand the degree of degradation while keeping costs low and saving time. Results showed that, based on the percentage of hazard classes in the current state of land degradation, the most and least widespread hazard classes are the moderate (38.6 %) and no hazard (0.65 %) classes, respectively. Results for the desert component of the study area showed that the severe class is much more widespread than the other hazard classes, which could indicate an environmentally dangerous situation. Statistical results indicated that degradation is highest in deserts and rangeland areas compared to dry cultivated areas and forests. Statistical tests also showed that the average degradation in the arid region is higher than in other climates. It is hoped that this study's use of geospatial techniques will be found applicable in other regions of the world and can also contribute to better planning and management of land.
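
    The NDVI component of the modified index is a simple band ratio, NDVI = (NIR - Red) / (NIR + Red); the sketch below computes it for hypothetical reflectance rasters and bins the result into illustrative hazard classes whose breakpoints are assumptions, not the paper's calibration.

        import numpy as np

        def ndvi(nir, red):
            nir, red = nir.astype(float), red.astype(float)
            return (nir - red) / np.clip(nir + red, 1e-6, None)

        # Hypothetical reflectance rasters (values 0-1).
        rng = np.random.default_rng(0)
        nir = rng.uniform(0.1, 0.6, size=(100, 100))
        red = rng.uniform(0.05, 0.4, size=(100, 100))
        v = ndvi(nir, red)

        # Illustrative hazard classes from low vegetation cover (high degradation) to none.
        classes = np.digitize(v, bins=[0.1, 0.2, 0.35, 0.5])   # 0 = severe ... 4 = no hazard
        labels = ["severe", "high", "moderate", "low", "no hazard"]
        for k, name in enumerate(labels):
            print(f"{name:>9}: {np.mean(classes == k) * 100:.1f} % of area")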

  3. Climate sensitivity to the lower stratospheric ozone variations

    NASA Astrophysics Data System (ADS)

    Kilifarska, N. A.

    2012-12-01

    The strong sensitivity of the Earth's radiation balance to variations in the lower stratospheric ozone—reported previously—is analysed here by the use of non-linear statistical methods. Our non-linear model of the land air temperature (T)—driven by the measured Arosa total ozone (TOZ)—explains 75% of total variability of Earth's T variations during the period 1926-2011. We have analysed also the factors which could influence the TOZ variability and found that the strongest impact belongs to the multi-decadal variations of galactic cosmic rays. Constructing a statistical model of the ozone variability, we have been able to predict the tendency in the land air T evolution till the end of the current decade. Results show that Earth is facing a weak cooling of the surface T by 0.05-0.25 K (depending on the ozone model) until the end of the current solar cycle. A new mechanism for O3 influence on climate is proposed.

  4. A supervised learning approach for Crohn's disease detection using higher-order image statistics and a novel shape asymmetry measure.

    PubMed

    Mahapatra, Dwarikanath; Schueffler, Peter; Tielbeek, Jeroen A W; Buhmann, Joachim M; Vos, Franciscus M

    2013-10-01

    Increasing incidence of Crohn's disease (CD) in the Western world has made its accurate diagnosis an important medical challenge. The current reference standard for diagnosis, colonoscopy, is time-consuming and invasive while magnetic resonance imaging (MRI) has emerged as the preferred noninvasive procedure over colonoscopy. Current MRI approaches assess rate of contrast enhancement and bowel wall thickness, and rely on extensive manual segmentation for accurate analysis. We propose a supervised learning method for the identification and localization of regions in abdominal magnetic resonance images that have been affected by CD. Low-level features like intensity and texture are used with shape asymmetry information to distinguish between diseased and normal regions. Particular emphasis is laid on a novel entropy-based shape asymmetry method and higher-order statistics like skewness and kurtosis. Multi-scale feature extraction renders the method robust. Experiments on real patient data show that our features achieve a high level of accuracy and perform better than two competing methods.
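
    The higher-order statistical features mentioned above (skewness and kurtosis, alongside basic intensity statistics, at more than one scale) can be sketched as follows; the patch is random noise rather than abdominal MRI, and the function is a simplified stand-in for the authors' multi-scale pipeline.

        import numpy as np
        from scipy.stats import skew, kurtosis
        from scipy.ndimage import zoom

        def patch_features(patch):
            # Mean, std, skewness and kurtosis at the original and a coarser scale.
            feats = []
            for scale in (patch, zoom(patch, 0.5)):
                x = scale.ravel().astype(float)
                feats += [x.mean(), x.std(), skew(x), kurtosis(x)]
            return np.array(feats)

        rng = np.random.default_rng(0)
        patch = rng.normal(100, 20, size=(32, 32))     # hypothetical intensity patch
        print(patch_features(patch).round(3))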

  5. Measurement and analysis of time-domain characteristics of corona-generated radio interference from a single positive corona source

    NASA Astrophysics Data System (ADS)

    Li, Xuebao; Li, Dayong; Chen, Bo; Cui, Xiang; Lu, Tiebing; Li, Yinfei

    2018-04-01

    Corona-generated electromagnetic interference, commonly known as radio interference (RI), has become a limiting factor in the design of high voltage direct current transmission lines. In this paper, a time-domain measurement system is developed to measure the time-domain characteristics of corona-generated RI from a single positive corona source. In the experiments, the corona current pulses are synchronously measured through coupling capacitors. A one-to-one relationship between the corona current pulse and the measured RI voltage pulse is observed. The statistical characteristics of the pulse parameters are analyzed, and the correlations between the corona current pulse and the RI voltage pulse in the time domain and frequency domain are analyzed. From the measured corona current pulses, the time-domain waveform of corona-generated RI is calculated on the basis of the propagation model of corona current on the conductor, the dipole model for electric field calculation, and the antenna model for induced voltage calculation. The good agreement between the measured and simulated waveforms of the RI voltage demonstrates the validity of the measurement and calculation method presented in this paper, and further shows the close correlation between corona current and corona-generated RI.

  6. Single-electron thermal noise

    NASA Astrophysics Data System (ADS)

    Nishiguchi, Katsuhiko; Ono, Yukinori; Fujiwara, Akira

    2014-07-01

    We report the observation of thermal noise in the motion of single electrons in an ultimately small dynamic random access memory (DRAM). The nanometer-scale transistors that compose the DRAM resolve the thermal noise in single-electron motion. A complete set of fundamental tests conducted on this single-electron thermal noise shows that the noise perfectly follows all the aspects predicted by statistical mechanics, which include the occupation probability, the law of equipartition, a detailed balance, and the law of kT/C. In addition, the counting statistics on the directional motion (i.e., the current) of the single-electron thermal noise indicate that the individual electron motion follows the Poisson process, as it does in shot noise.
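
    Two of the statistical-mechanics checks listed above have compact expressions: the kT/C law gives an rms charge fluctuation of sqrt(kTC) on a node of capacitance C, and Poissonian counting statistics imply that the variance of the counted electron transfers equals their mean. The sketch below uses illustrative values (room temperature and an assumed attofarad-scale capacitance), not the experimental parameters.

        import numpy as np
        from scipy import constants as c

        # kT/C law: rms charge fluctuation on a capacitance C at temperature T is sqrt(kTC).
        T = 300.0           # K (illustrative; the experiment itself is at low temperature)
        C = 1e-18           # F, an assumed attofarad-scale storage node
        q_rms_electrons = np.sqrt(c.k * T * C) / c.e
        print(f"rms charge noise: {q_rms_electrons:.2f} electrons")

        # Poisson counting statistics: for independent electron transfers, variance == mean.
        rng = np.random.default_rng(0)
        counts = rng.poisson(lam=5.0, size=100000)    # electrons counted per observation window
        print(f"mean = {counts.mean():.3f}, variance = {counts.var():.3f}  (Fano factor ~ 1)")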

  7. Single-electron thermal noise.

    PubMed

    Nishiguchi, Katsuhiko; Ono, Yukinori; Fujiwara, Akira

    2014-07-11

    We report the observation of thermal noise in the motion of single electrons in an ultimately small dynamic random access memory (DRAM). The nanometer-scale transistors that compose the DRAM resolve the thermal noise in single-electron motion. A complete set of fundamental tests conducted on this single-electron thermal noise shows that the noise perfectly follows all the aspects predicted by statistical mechanics, which include the occupation probability, the law of equipartition, a detailed balance, and the law of kT/C. In addition, the counting statistics on the directional motion (i.e., the current) of the single-electron thermal noise indicate that the individual electron motion follows the Poisson process, as it does in shot noise.

  8. Visualizing water

    NASA Astrophysics Data System (ADS)

    Baart, F.; van Gils, A.; Hagenaars, G.; Donchyts, G.; Eisemann, E.; van Velzen, J. W.

    2016-12-01

    A compelling visualization is captivating, beautiful and narrative. Here we show how melding the skills of computer graphics, art, statistics, and environmental modeling can be used to generate innovative, attractive and very informative visualizations. We focus on the topic of visualizing forecasts and measurements of water (water level, waves, currents, density, and salinity). For the field of computer graphics and the arts, water is an important topic because it occurs in many natural scenes. For environmental modeling and statistics, water is an important topic because water is essential for transport, a healthy environment, fruitful agriculture, and a safe environment. The different disciplines take different approaches to visualizing water. In computer graphics, one focuses on making water look as realistic as possible. The focus on realistic perception (versus the focus on physical balance pursued by environmental scientists) has resulted in fascinating renderings, as seen in recent games and movies. Visualization techniques for statistical results have benefited from advances in design and journalism, resulting in enthralling infographics. The field of environmental modeling has absorbed advances in contemporary cartography, as seen in the latest interactive data-driven maps. We systematically review the emerging types of water visualization designs. The examples that we analyze range from dynamically animated forecasts, interactive paintings, infographics and modern cartography to web-based photorealistic rendering. By characterizing the intended audience, the design choices, the scales (e.g. time, space), and the explorability, we provide a set of guidelines and genres. The unique contributions of the different fields show how innovations in the current state of the art of water visualization have benefited from interdisciplinary collaborations.

  9. Which sociodemographic factors are important on smoking behaviour of high school students? The contribution of classification and regression tree methodology in a broad epidemiological survey.

    PubMed

    Ozge, C; Toros, F; Bayramkaya, E; Camdeviren, H; Sasmaz, T

    2006-08-01

    The purpose of this study was to evaluate the most important sociodemographic factors affecting the smoking status of high school students, using a broad randomised epidemiological survey. Using an in-class, self-administered questionnaire about their sociodemographic variables and smoking behaviour, a representative sample of 3304 students of the preparatory, 9th, 10th, and 11th grades, from 22 randomly selected schools in Mersin, was evaluated, and discriminative factors were determined using appropriate statistics. In addition to binary logistic regression analysis, the study evaluated the combined effects of these factors using classification and regression tree methodology as a newer statistical method. The data showed that 38% of the students reported lifetime smoking and 16.9% reported current smoking, with a male predominance and increasing prevalence with age. Second hand smoking was reported at a frequency of 74.3%, with fathers being the main source (56.6%). The factors that significantly affected current smoking in these age groups were increased household size, late birth rank, certain school types, low academic performance, increased second hand smoking, and stress (especially reported as separation from a close friend or violence at home). Classification and regression tree methodology showed the importance of some neglected sociodemographic factors with a good classification capacity. It was concluded that, being closely related to sociocultural factors, smoking was a common problem in this young population, generating an important academic and social burden in youth life, and that with increasing data about this behaviour and the use of new statistical methods, effective coping strategies could be composed.
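
    A minimal sketch of the classification-tree idea (CART) on simulated sociodemographic predictors of smoking is given below; the feature names, data-generating model and the scikit-learn implementation are stand-ins, not the study's survey data or software.

        import numpy as np
        import pandas as pd
        from sklearn.tree import DecisionTreeClassifier, export_text
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)
        n = 3000
        df = pd.DataFrame({
            "household_size": rng.integers(2, 9, n),
            "birth_rank": rng.integers(1, 6, n),
            "academic_performance": rng.normal(0, 1, n),
            "secondhand_smoke": rng.integers(0, 2, n),
        })
        # Simulated outcome loosely following the reported risk factors.
        logit = -2 + 0.2 * df.household_size + 0.8 * df.secondhand_smoke - 0.5 * df.academic_performance
        smoker = rng.uniform(size=n) < 1 / (1 + np.exp(-logit))

        X_train, X_test, y_train, y_test = train_test_split(df, smoker, random_state=0)
        tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=50).fit(X_train, y_train)
        print(export_text(tree, feature_names=list(df.columns)))
        print("test accuracy:", round(tree.score(X_test, y_test), 3))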

  10. Lagrangian statistics of mesoscale turbulence in a natural environment: The Agulhas return current.

    PubMed

    Carbone, Francesco; Gencarelli, Christian N; Hedgecock, Ian M

    2016-12-01

    The properties of mesoscale geophysical turbulence in an oceanic environment have been investigated through the Lagrangian statistics of sea surface temperature measured by a drifting buoy within the Agulhas return current, where strong temperature mixing produces locally sharp temperature gradients. By disentangling the large-scale forcing which affects the small-scale statistics, we found that the statistical properties of intermittency are identical to those obtained from the multifractal prediction in the Lagrangian frame for the velocity trajectory. The results suggest a possible universality of turbulence scaling.

  11. Fundamental quantum noise mapping with tunnelling microscopes tested at surface structures of subatomic lateral size.

    PubMed

    Herz, Markus; Bouvron, Samuel; Ćavar, Elizabeta; Fonin, Mikhail; Belzig, Wolfgang; Scheer, Elke

    2013-10-21

    We present a measurement scheme that enables quantitative detection of the shot noise in a scanning tunnelling microscope while scanning the sample. As test objects we study defect structures produced on an iridium single crystal at low temperatures. The defect structures appear in the constant current images as protrusions with curvature radii well below the atomic diameter. The measured power spectral density of the noise is very near to the quantum limit with Fano factor F = 1. While the constant current images show detailed structures expected for tunnelling involving d-atomic orbitals of Ir, we find the current noise to be without pronounced spatial variation as expected for shot noise arising from statistically independent events.
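
    The Fano factor quoted above is the measured current-noise power spectral density divided by the full shot-noise value 2eI; a minimal sketch with illustrative numbers (not the paper's measurements) follows.

        from scipy import constants as c

        def fano_factor(measured_psd, mean_current):
            # Ratio of measured current-noise power spectral density (A^2/Hz)
            # to the Poissonian value 2 e I.
            return measured_psd / (2 * c.e * mean_current)

        # Illustrative tunnelling-current values, not the paper's measurements.
        I = 100e-12                       # 100 pA mean tunnelling current
        S_measured = 3.2e-29              # A^2/Hz
        print(f"Fano factor F = {fano_factor(S_measured, I):.2f}")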

  12. Ten Years of Cloud Properties from MODIS: Global Statistics and Use in Climate Model Evaluation

    NASA Technical Reports Server (NTRS)

    Platnick, Steven E.

    2011-01-01

    The NASA Moderate Resolution Imaging Spectroradiometer (MODIS), launched onboard the Terra and Aqua spacecrafts, began Earth observations on February 24, 2000 and June 24,2002, respectively. Among the algorithms developed and applied to this sensor, a suite of cloud products includes cloud masking/detection, cloud-top properties (temperature, pressure), and optical properties (optical thickness, effective particle radius, water path, and thermodynamic phase). All cloud algorithms underwent numerous changes and enhancements between for the latest Collection 5 production version; this process continues with the current Collection 6 development. We will show example MODIS Collection 5 cloud climatologies derived from global spatial . and temporal aggregations provided in the archived gridded Level-3 MODIS atmosphere team product (product names MOD08 and MYD08 for MODIS Terra and Aqua, respectively). Data sets in this Level-3 product include scalar statistics as well as 1- and 2-D histograms of many cloud properties, allowing for higher order information and correlation studies. In addition to these statistics, we will show trends and statistical significance in annual and seasonal means for a variety of the MODIS cloud properties, as well as the time required for detection given assumed trends. To assist in climate model evaluation, we have developed a MODIS cloud simulator with an accompanying netCDF file containing subsetted monthly Level-3 statistical data sets that correspond to the simulator output. Correlations of cloud properties with ENSO offer the potential to evaluate model cloud sensitivity; initial results will be discussed.

  13. Influence of Tube Current Settings on Diagnostic Detection of Root Fractures Using Cone-beam Computed Tomography: An In Vitro Study.

    PubMed

    Tangari-Meira, Ricardo; Vancetto, José Ricardo; Dovigo, Lívia Nordi; Tosoni, Guilherme Monteiro

    2017-10-01

    This study assessed the influence of tube current settings (milliamperes [mA]) on the diagnostic detection of root fractures (RFs) using cone-beam computed tomographic (CBCT) imaging. Sixty-eight human anterior and posterior teeth were submitted to root canal preparation, and 34 root canals were filled. The teeth were divided into 2 groups: the control group and the fractured group. RFs were induced using a universal mechanical testing machine; afterward, the teeth were placed in a phantom. Images were acquired using a Scanora 3DX unit (Soredex, Tuusula, Finland) with 5 different mA settings: 4.0, 5.0, 6.3, 8.0, and 10.0. Two examiners (E1 and E2) classified the images according to a 5-point confidence scale. Intra- and interexaminer reproducibility was assessed using the kappa statistic; diagnostic performance was assessed using the area under the receiver operating characteristic curve (AUROC). Intra- and interexaminer reproducibility showed substantial (κE1 = 0.791 and κE2 = 0.695) and moderate (κE1 × E2 = 0.545) agreement, respectively. AUROC was significantly higher (P ≤ .0389) at 8.0 and 10.0 mA and showed no statistical difference between the 2 tube current settings. Tube current has a significant influence on the diagnostic detection of RFs in CBCT images. Despite the acceptable diagnosis of RFs using 4.0 and 5.0 mA, those settings had lower discrimination abilities when compared with settings of 8.0 and 10.0 mA. Copyright © 2017 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  14. The Intensity, Directionality, and Statistics of Underwater Noise From Melting Icebergs

    NASA Astrophysics Data System (ADS)

    Glowacki, Oskar; Deane, Grant B.; Moskalik, Mateusz

    2018-05-01

    Freshwater fluxes from melting icebergs and glaciers are important contributors to both sea level rise and anomalies of seawater salinity in polar regions. However, the hazards encountered close to icebergs and glaciers make it difficult to quantify their melt rates directly, motivating the development of cryoacoustics as a remote sensing technique. Recent studies have shown a qualitative link between ice melting and the accompanying underwater noise, but the properties of this signal remain poorly understood. Here we examine the intensity, directionality, and temporal statistics of the underwater noise radiated by melting icebergs in Hornsund Fjord, Svalbard, using a three-element acoustic array. We present the first estimate of noise energy per unit area associated with iceberg melt and demonstrate its qualitative dependence on exposure to surface current. Finally, we show that the analysis of noise directionality and statistics makes it possible to distinguish iceberg melt from the glacier terminus melt.

  15. Simulation and assimilation of satellite altimeter data at the oceanic mesoscale

    NASA Technical Reports Server (NTRS)

    Demay, P.; Robinson, A. R.

    1984-01-01

    An improved "objective analysis" technique is used along with an altimeter signal statistical model, an altimeter noise statistical model, an orbital model, and synoptic surface current maps in the POLYMODE-SDE area to evaluate the performance of various observational strategies in capturing the mesoscale variability at mid-latitudes. In particular, simulated repetitive nominal orbits of ERS-1, TOPEX, and SPOT/POSEIDON are examined. Results show the critical importance of the existence of a subcycle, scanning in either direction. Moreover, long repeat cycles (~20 days) and short cross-track distances (~300 km) seem preferable, since they match mesoscale statistics. Another goal of the study is to prepare and discuss sea-surface height (SSH) assimilation in quasigeostrophic models. Restored SSH maps are shown to meet that purpose, if an efficient extrapolation method or deep in-situ data (floats) are used on the vertical to start and update the model.

  16. Robust Statistical Detection of Power-Law Cross-Correlation.

    PubMed

    Blythe, Duncan A J; Nikulin, Vadim V; Müller, Klaus-Robert

    2016-06-02

    We show that widely used approaches in statistical physics incorrectly indicate the existence of power-law cross-correlations between financial stock market fluctuations measured over several years and the neuronal activity of the human brain lasting for only a few minutes. While such cross-correlations are nonsensical, no current methodology allows them to be reliably discarded, leaving researchers at greater risk when the spurious nature of cross-correlations is not clear from the unrelated origin of the time series and rather requires careful statistical estimation. Here we propose a theory and method (PLCC-test) which allows us to rigorously and robustly test for power-law cross-correlations, correctly detecting genuine and discarding spurious cross-correlations, thus establishing meaningful relationships between processes in complex physical systems. Our method reveals for the first time the presence of power-law cross-correlations between amplitudes of the alpha and beta frequency ranges of the human electroencephalogram.

  17. Robust Statistical Detection of Power-Law Cross-Correlation

    PubMed Central

    Blythe, Duncan A. J.; Nikulin, Vadim V.; Müller, Klaus-Robert

    2016-01-01

    We show that widely used approaches in statistical physics incorrectly indicate the existence of power-law cross-correlations between financial stock market fluctuations measured over several years and the neuronal activity of the human brain lasting for only a few minutes. While such cross-correlations are nonsensical, no current methodology allows them to be reliably discarded, leaving researchers at greater risk when the spurious nature of cross-correlations is not clear from the unrelated origin of the time series and rather requires careful statistical estimation. Here we propose a theory and method (PLCC-test) which allows us to rigorously and robustly test for power-law cross-correlations, correctly detecting genuine and discarding spurious cross-correlations, thus establishing meaningful relationships between processes in complex physical systems. Our method reveals for the first time the presence of power-law cross-correlations between amplitudes of the alpha and beta frequency ranges of the human electroencephalogram. PMID:27250630

  18. Rigorous force field optimization principles based on statistical distance minimization

    DOE PAGES

    Vlcek, Lukas; Chialvo, Ariel A.

    2015-10-12

    We use the concept of statistical distance to define a measure of distinguishability between a pair of statistical mechanical systems, i.e., a model and its target, and show that its minimization leads to general convergence of the model’s static measurable properties to those of the target. Here we exploit this feature to define a rigorous basis for the development of accurate and robust effective molecular force fields that are inherently compatible with coarse-grained experimental data. The new model optimization principles and their efficient implementation are illustrated through selected examples, whose outcome demonstrates the higher robustness and predictive accuracy of the approach compared to other currently used methods, such as force matching and relative entropy minimization. We also discuss relations between the newly developed principles and established thermodynamic concepts, which include the Gibbs-Bogoliubov inequality and the thermodynamic length.

  19. Compositional Solution Space Quantification for Probabilistic Software Analysis

    NASA Technical Reports Server (NTRS)

    Borges, Mateus; Pasareanu, Corina S.; Filieri, Antonio; d'Amorim, Marcelo; Visser, Willem

    2014-01-01

    Probabilistic software analysis aims at quantifying how likely a target event is to occur during program execution. Current approaches rely on symbolic execution to identify the conditions to reach the target event and try to quantify the fraction of the input domain satisfying these conditions. Precise quantification is usually limited to linear constraints, while only approximate solutions can be provided in general through statistical approaches. However, statistical approaches may fail to converge to an acceptable accuracy within a reasonable time. We present a compositional statistical approach for the efficient quantification of solution spaces for arbitrarily complex constraints over bounded floating-point domains. The approach leverages interval constraint propagation to improve the accuracy of the estimation by focusing the sampling on the regions of the input domain containing the sought solutions. Preliminary experiments show significant improvement on previous approaches both in results accuracy and analysis time.
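
    To make the idea above concrete, the following is a minimal sketch (not the authors' tool) of quantifying a solution space over a bounded floating-point domain: a hand-derived interval box stands in for interval constraint propagation, sampling is focused on that box, and the estimate is rescaled to the original domain. The path condition and bounds are invented for illustration.

      import random

      def estimate_fraction(constraint, box, n_samples=100_000, seed=1):
          # Monte Carlo estimate of the fraction of `box` satisfying `constraint`
          rng = random.Random(seed)
          hits = 0
          for _ in range(n_samples):
              sample = {v: rng.uniform(lo, hi) for v, (lo, hi) in box.items()}
              hits += constraint(sample)
          return hits / n_samples

      def volume(box):
          v = 1.0
          for lo, hi in box.values():
              v *= hi - lo
          return v

      # Hypothetical path condition from symbolic execution: x > 0 and x**2 + y < 1
      cond = lambda s: s["x"] > 0 and s["x"] ** 2 + s["y"] < 1
      full_box = {"x": (-2.0, 2.0), "y": (-2.0, 2.0)}
      # Tightened box (playing the role of interval constraint propagation):
      # the condition forces 0 < x < sqrt(3) and y < 1
      tight_box = {"x": (0.0, 3 ** 0.5), "y": (-2.0, 1.0)}

      p_tight = estimate_fraction(cond, tight_box)
      print(p_tight * volume(tight_box) / volume(full_box))   # probability over the full domain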

  20. Assessing land-use history for reporting on cropland dynamics - A case study using the Land-Parcel Identification System in Ireland

    NASA Astrophysics Data System (ADS)

    Zimmermann, Jesko; González, Ainhoa; Jones, Michael; O'Brien, Phillip; Stout, Jane C.; Green, Stuart

    2016-04-01

    In developed countries, cropland and grassland conversions and management can be a major factor in Land Use and Land Use Change (LULUC) related Greenhouse Gas (GHG) dynamics. Depending on land use, management and factors such as soil properties, land can act as either a source or a sink for GHGs. Currently, many countries depend on national statistics combined with socio-economic modelling to assess current land use as well as inter-annual changes. This potentially introduces a bias, as it provides neither information on direct land-use change trajectories nor spatially explicit information to assess the environmental context. In order to improve reporting, countries are shifting towards high-resolution spatial datasets. In this case study, we used the Land Parcel Identification System (LPIS), a pan-European geographical database developed to assist farmers and authorities with agricultural subsidies, to analyse cropland dynamics in Ireland. The database offers high spatial resolution and is updated annually. Generally, Ireland is considered grassland dominated, with 90% of its agricultural area under permanent grassland and only a small area dedicated to cropland. However, an in-depth analysis of the LPIS for the years 2000 to 2012 showed strong underlying dynamics. While the annual area reported as cropland remained relatively constant at 3752.3 ± 542.3 km^2, the area of permanent cropland was only 1251.9 km^2. Conversely, the area that was reported as cropland for at least one year during the timeframe was 7373.4 km^2, revealing a significantly larger area with cropland history than annual statistics would suggest. Furthermore, the analysis showed that one quarter of the land converting from or to cropland returns to the previous land use within a year. To demonstrate the potential policy impact, we assessed cropland/grassland dynamics for the 2008 to 2012 commitment period using (a) annual statistics, and (b) data including land-use history derived from the LPIS. Under current reporting standards, temporary grassland is considered cropland for reporting purposes. Taking land-use history into account therefore increases the area reported as cropland in 2008 by 45.7% and the area remaining cropland in 2012 by 17.5% compared with using annual statistics. In conclusion, we showed that high-resolution spatial datasets are an important tool for better understanding land-use dynamics and can directly improve national GHG accounting efforts. Furthermore, knowledge of land-use history is important for assessing local GHG dynamics and can therefore ultimately help progress reporting to higher Tier levels.
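
    The parcel-level bookkeeping described above can be illustrated with a small, entirely hypothetical example: given annual land-use records per parcel, compare the annually reported cropland area with the permanent-cropland area and with the area that was cropland in at least one year. The record layout is invented and is not the actual LPIS schema.

      from collections import defaultdict

      # Hypothetical parcel records: (parcel_id, year, land_use, area_km2)
      records = [
          ("A", 2000, "cropland", 1.0), ("A", 2001, "grassland", 1.0),
          ("B", 2000, "cropland", 2.0), ("B", 2001, "cropland", 2.0),
          ("C", 2000, "grassland", 0.5), ("C", 2001, "cropland", 0.5),
      ]

      years = sorted({y for _, y, _, _ in records})
      use_by_parcel = defaultdict(dict)
      area = {}
      for pid, year, use, a in records:
          use_by_parcel[pid][year] = use
          area[pid] = a

      # Annual cropland area (what annual statistics report)
      annual = {y: sum(area[p] for p, u in use_by_parcel.items() if u.get(y) == "cropland")
                for y in years}
      # Permanent cropland: cropland in every year of the record
      permanent = sum(area[p] for p, u in use_by_parcel.items()
                      if all(u.get(y) == "cropland" for y in years))
      # Cropland history: cropland in at least one year
      ever = sum(area[p] for p, u in use_by_parcel.items()
                 if any(u.get(y) == "cropland" for y in years))

      print(annual)     # {2000: 3.0, 2001: 2.5}
      print(permanent)  # 2.0
      print(ever)       # 3.5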

  1. Antibiotics for exacerbations of chronic obstructive pulmonary disease.

    PubMed

    Vollenweider, Daniela J; Jarrett, Harish; Steurer-Stey, Claudia A; Garcia-Aymerich, Judith; Puhan, Milo A

    2012-12-12

    Many patients with an exacerbation of chronic obstructive pulmonary disease (COPD) are treated with antibiotics. However, the value of antibiotics remains uncertain as systematic reviews and clinical trials have shown conflicting results. To assess the effects of antibiotics in the management of acute COPD exacerbations on treatment failure as observed between seven days and one month after treatment initiation (primary outcome) and on other patient-important outcomes (mortality, adverse events, length of hospital stay). We searched the Cochrane Central Register of Controlled Trials (CENTRAL), MEDLINE, EMBASE and other electronically available databases up to September 2012. Randomised controlled trials (RCTs) in people with acute COPD exacerbations comparing antibiotic therapy and placebo with a follow-up of at least seven days. Two review authors independently screened references and extracted data from trial reports. We kept the three groups of outpatients, inpatients and patients admitted to the intensive care unit (ICU) separate for benefit outcomes and mortality because we considered them to be clinically too different to be summarised in one group. We considered outpatients to have a mild to moderate exacerbation, inpatients to have a severe exacerbation and ICU patients to have a very severe exacerbation. Where outcomes or study details were not reported we requested missing data from the authors of the primary studies. We calculated pooled risk ratios (RR) for treatment failure, Peto odds ratios (OR) for rare events (mortality and adverse events) and weighted mean differences (MD) for continuous outcomes using fixed-effect models. We used GRADE to assess the quality of the evidence. Sixteen trials with 2068 participants were included. In outpatients (mild to moderate exacerbations), there was evidence of low quality that antibiotics did statistically significantly reduce the risk for treatment failure between seven days and one month after treatment initiation (RR 0.75; 95% CI 0.60 to 0.94; I^2 = 35%) but they did not significantly reduce the risk when the meta-analysis was restricted to currently available drugs (RR 0.80; 95% CI 0.63 to 1.01; I^2 = 33%). Evidence of high quality showed that antibiotics statistically significantly reduced the risk of treatment failure in inpatients with severe exacerbations (ICU not included) (RR 0.77; 95% CI 0.65 to 0.91; I^2 = 47%) regardless of whether restricted to current drugs. The only trial with 93 patients admitted to the ICU showed a large and statistically significant effect on treatment failure (RR 0.19; 95% CI 0.08 to 0.45; high-quality evidence). Low-quality evidence from four trials in inpatients showed no effect of antibiotics on mortality (Peto OR 1.02; 95% CI 0.37 to 2.79). High-quality evidence from one trial showed a statistically significant effect on mortality in ICU patients (Peto OR 0.21; 95% CI 0.06 to 0.72). Length of hospital stay (in days) was similar in the antibiotics and placebo groups except for the ICU study where antibiotics statistically significantly reduced length of hospital stay (mean difference -9.60 days; 95% CI -12.84 to -6.36 days). One trial showed no effect of antibiotics on re-exacerbations between two and six weeks after treatment initiation. Only one trial (N = 35) reported health-related quality of life but did not show a statistically significant difference between the treatment and control group. Evidence of moderate quality showed that the overall incidence of adverse events was higher in the antibiotics groups (Peto OR 1.53; 95% CI 1.03 to 2.27). Patients treated with antibiotics experienced statistically significantly more diarrhoea based on three trials (Peto OR 2.62; 95% CI 1.11 to 6.17; high-quality evidence). Antibiotics for COPD exacerbations showed large and consistent beneficial effects across outcomes of patients admitted to an ICU. However, for outpatients and inpatients the results were inconsistent. The risk for treatment failure was significantly reduced in both inpatients and outpatients when all trials (1957 to 2012) were included but not when the analysis for outpatients was restricted to currently used antibiotics. Also, antibiotics had no statistically significant effect on mortality and length of hospital stay in inpatients and almost no data on patient-reported outcomes exist. These inconsistent effects call for research into clinical signs and biomarkers that help identify patients who benefit from antibiotics and patients who experience no effect, and in whom downsides of antibiotics (side effects, costs and multi-resistance) could be avoided.
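
    For readers unfamiliar with the pooling behind risk ratios such as those above, the sketch below shows generic inverse-variance fixed-effect pooling of log risk ratios. The trial counts are made up; this is a textbook-style calculation, not a re-analysis of the review's data.

      import math

      # Hypothetical trial counts: (events_treatment, n_treatment, events_control, n_control)
      trials = [(10, 50, 18, 50), (8, 40, 15, 42), (20, 100, 30, 98)]

      log_rr, weights = [], []
      for a, n1, c, n2 in trials:
          rr = (a / n1) / (c / n2)
          var = 1 / a - 1 / n1 + 1 / c - 1 / n2   # variance of log RR (delta method)
          log_rr.append(math.log(rr))
          weights.append(1 / var)

      pooled = sum(w * l for w, l in zip(weights, log_rr)) / sum(weights)
      se = math.sqrt(1 / sum(weights))
      lo, hi = pooled - 1.96 * se, pooled + 1.96 * se
      print(f"Pooled RR {math.exp(pooled):.2f} (95% CI {math.exp(lo):.2f} to {math.exp(hi):.2f})")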

  2. Relationships between Employment Quality and Intention to Quit: Focus on PhD Candidates as Traditional Workers

    ERIC Educational Resources Information Center

    Travaglianti, F.; Babic, A.; Hansez, I.

    2018-01-01

    Current statistics show that the attrition rate among PhD candidates is high (i.e. from 30% to 40% depending on the discipline and the country). This high attrition rate has an impact at both the economic level (e.g. a negative impact on the return on investment in doctoral education) and the human level (e.g. negative consequences for candidates' self-esteem and…

  3. E-Learning Works--Exactly How Well Depends on Its Unique Features and Barriers: CAHRS ResearchLink No. 1

    ERIC Educational Resources Information Center

    Bell, Bradford; Federman, Jessica E.

    2013-01-01

    E-learning has grown at a considerable rate, and current projections show no slowdown in the near future. The National Center for Education Statistics estimates that between 2000 and 2008 the share of undergraduates enrolled in at least one online course grew from 8 percent to 20 percent. This study refers to e-learning as all forms of…

  4. Food consumption and the actual statistics of cardiovascular diseases: an epidemiological comparison of 42 European countries

    PubMed Central

    Grasgruber, Pavel; Sebera, Martin; Hrazdira, Eduard; Hrebickova, Sylva; Cacek, Jan

    2016-01-01

    Background The aim of this ecological study was to identify the main nutritional factors related to the prevalence of cardiovascular diseases (CVDs) in Europe, based on a comparison of international statistics. Design The mean consumption of 62 food items from the FAOSTAT database (1993–2008) was compared with the actual statistics of five CVD indicators in 42 European countries. Several other exogenous factors (health expenditure, smoking, body mass index) and the historical stability of results were also examined. Results We found exceptionally strong relationships between some of the examined factors, the highest being a correlation between raised cholesterol in men and the combined consumption of animal fat and animal protein (r=0.92, p<0.001). The most significant dietary correlate of low CVD risk was high total fat and animal protein consumption. Additional statistical analyses further highlighted citrus fruits, high-fat dairy (cheese) and tree nuts. Among other non-dietary factors, health expenditure showed by far the highest correlation coefficients. The major correlate of high CVD risk was the proportion of energy from carbohydrates and alcohol, or from potato and cereal carbohydrates. Similar patterns were observed between food consumption and CVD statistics from the period 1980–2000, which shows that these relationships are stable over time. However, we found striking discrepancies in men's CVD statistics from 1980 and 1990, which can probably explain the origin of the ‘saturated fat hypothesis’ that influenced public health policies in the following decades. Conclusion Our results do not support the association between CVDs and saturated fat, which is still contained in official dietary guidelines. Instead, they agree with data accumulated from recent studies that link CVD risk with the high glycaemic index/load of carbohydrate-based diets. In the absence of any scientific evidence connecting saturated fat with CVDs, these findings show that current dietary recommendations regarding CVDs should be seriously reconsidered. PMID:27680091

  5. Statistical modelling predicts almost complete loss of major periglacial processes in Northern Europe by 2100.

    PubMed

    Aalto, Juha; Harrison, Stephan; Luoto, Miska

    2017-09-11

    The periglacial realm is a major part of the cryosphere, covering a quarter of Earth's land surface. Cryogenic land surface processes (LSPs) control landscape development, ecosystem functioning and climate through biogeochemical feedbacks, but their response to contemporary climate change is unclear. Here, by statistically modelling the current and future distributions of four major LSPs unique to periglacial regions at fine scale, we show fundamental changes in the periglacial climate realm are inevitable with future climate change. Even with the most optimistic CO2 emissions scenario (Representative Concentration Pathway (RCP) 2.6) we predict a 72% reduction in the current periglacial climate realm by 2050 in our climatically sensitive northern Europe study area. These impacts are projected to be especially severe in high-latitude continental interiors. We further predict that by the end of the twenty-first century active periglacial LSPs will exist only at high elevations. These results forecast a future tipping point in the operation of cold-region LSP, and predict fundamental landscape-level modifications in ground conditions and related atmospheric feedbacks. Cryogenic land surface processes characterise the periglacial realm and control landscape development and ecosystem functioning. Here, via statistical modelling, the authors predict a 72% reduction of the periglacial realm in Northern Europe by 2050, and almost complete disappearance by 2100.

  6. [The main directions of reforming the service of medical statistics in Ukraine].

    PubMed

    Golubchykov, Mykhailo V; Orlova, Nataliia M; Bielikova, Inna V

    2018-01-01

    Introduction: The implementation of new methods of information support for managerial decision-making should ensure effective health system reform and create conditions for improving the quality of operational management, sound planning of medical care and more efficient use of system resources. Reform of the Medical Statistics Service of Ukraine should be considered only in the context of the reform of the entire health system. The aim: This work analyses the current situation and justifies the main directions of reform of the Medical Statistics Service of Ukraine. Material and methods: A range of methods was used: content analysis, bibliosemantic analysis and a systematic approach. The information base of the research comprised WHO strategic and programme documents and data of the Medical Statistics Center of the Ministry of Health of Ukraine. Review: The Medical Statistics Service of Ukraine has a complete and effective structure, headed by the State Institution "Medical Statistics Center of the Ministry of Health of Ukraine." This institution reports on behalf of the Ministry of Health of Ukraine to the State Statistical Service of Ukraine, the WHO European Office and other international organizations. An analysis of the current situation showed that, to achieve this goal, it is necessary: to improve the system of statistical indicators for an adequate assessment of the performance of health institutions, including in the economic aspect; to create a well-developed medical and statistical base for the administrative territories; to change the existing technologies for the formation of information resources; to strengthen the material and technical base of the structural units of the Medical Statistics Service; to improve the system of training and retraining of personnel for the medical statistics service; to develop international cooperation in the methodology and practice of medical statistics and implement internationally accepted methods for collecting, processing, analysing and disseminating medical and statistical information; and to create a medical and statistical service that is adapted to the specifics of market relations in health care, and is flexible and sensitive to changes in international methodologies and standards. Conclusions: The data of medical statistics are the basis for managerial decisions at all levels of health care. Reform of the Medical Statistics Service of Ukraine should be considered only in the context of the reform of the entire health system. The main directions of the reform of the medical statistics service in Ukraine are: the introduction of information technologies, improved training of personnel for the service, improved material and technical equipment, and the maximum reuse of the data obtained, which requires the unification of primary data and of the system of indicators. The most difficult area is the formation of information funds and the introduction of modern information technologies.

  8. A two-question method for assessing gender categories in the social and medical sciences.

    PubMed

    Tate, Charlotte Chuck; Ledbetter, Jay N; Youssef, Cris P

    2013-01-01

    Three studies (N = 990) assessed the statistical reliability of two methods of determining gender identity that can capture transgender spectrum identities (i.e., current gender identities different from birth-assigned gender categories). Study 1 evaluated a single question with four response options (female, male, transgender, other) on university students. The missing data rate was higher than the valid response rates for transgender and other options using this method. Study 2 evaluated a method of asking two separate questions (i.e., one for current identity and another for birth-assigned category), with response options specific to each. Results showed no missing data and two times the transgender spectrum response rate compared to Study 1. Study 3 showed that the two-question method also worked in community samples, producing near-zero missing data. The two-question method also identified cisgender identities (same birth-assigned and current gender identity), making it a dynamic and desirable measurement tool for the social and medical sciences.

  9. Quality of Death Rates by Race and Hispanic Origin: A Summary of Current Research, 1999. Vital and Health Statistics. Series 2: Data Evaluation and Methods Research. No. 128.

    ERIC Educational Resources Information Center

    National Center for Health Statistics (DHHS/PHS), Hyattsville, MD.

    This report summarizes current knowledge and research on the quality and reliability of death rates by race and Hispanic origin in official mortality statistics of the United States produced by the National Center for Health Statistics (NCHS). It provides a quantitative assessment of bias in death rates by race and Hispanic origin and identifies…

  10. Research on the correlation between corona current spectrum and audible noise spectrum of HVDC transmission line

    NASA Astrophysics Data System (ADS)

    Liu, Yingyi; Zhou, Lijuan; Liu, Yuanqing; Yuan, Haiwen; Ji, Liang

    2017-11-01

    Audible noise is closely related to corona current on a high voltage direct current (HVDC) transmission line. In this paper, we measured a large number of audible noise and corona current waveforms simultaneously on the largest outdoor HVDC corona cage in the world. By analyzing the experimental data, we obtained the statistical regularities relating the corona current spectrum to the audible noise spectrum. Furthermore, the generation mechanism of audible noise was analyzed theoretically, and a mathematical expression relating the audible noise spectrum to the corona current spectrum, valid for all of the measuring points in space, was established based on electro-acoustic conversion theory. Finally, combined with the obtained mathematical relation, the underlying reasons for the statistical regularities appearing in the measured corona current and audible noise data were explained. The results of this paper not only present the statistical association between the corona current spectrum and the audible noise spectrum on an HVDC transmission line, but also reveal the inherent reasons for these associations.
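
    A minimal sketch of the kind of spectral comparison described above: estimate the power spectra of simultaneously recorded corona current and audible noise with Welch's method and correlate them across frequency bins. The waveforms below are synthetic stand-ins, and the sampling rate and segment length are arbitrary assumptions.

      import numpy as np
      from scipy.signal import welch

      fs = 50_000                      # Hz, assumed sampling rate
      t = np.arange(0, 1.0, 1 / fs)
      rng = np.random.default_rng(42)

      # Stand-ins for measured waveforms: pulse-like corona current and broadband audible noise
      corona_current = rng.standard_normal(t.size) * (rng.random(t.size) > 0.995)
      audible_noise = (np.convolve(corona_current, np.hanning(64), mode="same")
                       + 0.1 * rng.standard_normal(t.size))

      f_i, psd_i = welch(corona_current, fs=fs, nperseg=4096)
      f_a, psd_a = welch(audible_noise, fs=fs, nperseg=4096)

      # Simple frequency-bin correlation between the two log-spectra
      corr = np.corrcoef(np.log(psd_i + 1e-20), np.log(psd_a + 1e-20))[0, 1]
      print(f"log-spectrum correlation: {corr:.2f}")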

  11. 2009 Canadian Radiation Oncology Resident Survey

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Debenham, Brock, E-mail: debenham@ualberta.net; Banerjee, Robyn; Fairchild, Alysa

    2012-03-15

    Purpose: Statistics from the Canadian post-MD education registry show that numbers of Canadian radiation oncology (RO) trainees have risen from 62 in 1999 to approximately 150 per year between 2003 and 2009, contributing to the current perceived downturn in employment opportunities for radiation oncologists in Canada. When last surveyed in 2003, Canadian RO residents identified job availability as their main concern. Our objective was to survey current Canadian RO residents on their training and career plans. Methods and Materials: Trainees from the 13 Canadian residency programs using the national matching service were sought. Potential respondents were identified through individual program directors or chief residents and were e-mailed a secure link to an online survey. Descriptive statistics were used to report responses. Results: The eligible response rate was 53% (83/156). Similar to the 2003 survey, respondents generally expressed high satisfaction with their programs and specialty. The most frequently expressed perceived weakness in their training differed from 2003, with 46.5% of current respondents feeling unprepared to enter the job market. 72% plan on pursuing a postresidency fellowship. Most respondents intend to practice in Canada. Fewer than 20% of respondents believe that there is a strong demand for radiation oncologists in Canada. Conclusions: Respondents to the current survey expressed significant satisfaction with their career choice and training program. However, differences exist compared with the 2003 survey, including the current perceived lack of demand for radiation oncologists in Canada.

  12. Feature selection from a facial image for distinction of sasang constitution.

    PubMed

    Koo, Imhoi; Kim, Jong Yeol; Kim, Myoung Geun; Kim, Keun Ho

    2009-09-01

    Recently, oriental medicine has received attention for providing personalized medicine through consideration of the unique nature and constitution of individual patients. With the eventual goal of globalization, the current trend in oriental medicine research is the standardization by adopting western scientific methods, which could represent a scientific revolution. The purpose of this study is to establish methods for finding statistically significant features in a facial image with respect to distinguishing constitution and to show the meaning of those features. From facial photo images, facial elements are analyzed in terms of the distance, angle and the distance ratios, for which there are 1225, 61 250 and 749 700 features, respectively. Due to the very large number of facial features, it is quite difficult to determine truly meaningful features. We suggest a process for the efficient analysis of facial features including the removal of outliers, control for missing data to guarantee data confidence and calculation of statistical significance by applying ANOVA. We show the statistical properties of selected features according to different constitutions using the nine distances, 10 angles and 10 rates of distance features that are finally established. Additionally, the Sasang constitutional meaning of the selected features is shown here.
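
    The screening step described above can be illustrated with a minimal sketch: derive a pairwise-distance feature from landmark coordinates and test it across constitution groups with one-way ANOVA. The landmark data, group labels and chosen landmark pair are all invented for illustration.

      import numpy as np
      from scipy.stats import f_oneway

      rng = np.random.default_rng(0)

      def distance_feature(landmarks, i, j):
          # Euclidean distance between landmark i and landmark j for each subject
          return np.linalg.norm(landmarks[:, i, :] - landmarks[:, j, :], axis=1)

      # Hypothetical data: 90 subjects, 50 (x, y) landmarks each, three constitution groups
      landmarks = rng.normal(size=(90, 50, 2))
      groups = np.repeat(["TaeEum", "SoYang", "SoEum"], 30)

      feature = distance_feature(landmarks, 3, 17)   # e.g. distance between landmarks 3 and 17
      samples = [feature[groups == g] for g in np.unique(groups)]
      F, p = f_oneway(*samples)
      print(f"F = {F:.2f}, p = {p:.3f}")             # keep the feature if p is below the chosen threshold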

  13. Enhancing image classification models with multi-modal biomarkers

    NASA Astrophysics Data System (ADS)

    Caban, Jesus J.; Liao, David; Yao, Jianhua; Mollura, Daniel J.; Gochuico, Bernadette; Yoo, Terry

    2011-03-01

    Currently, most computer-aided diagnosis (CAD) systems rely on image analysis and statistical models to diagnose, quantify, and monitor the progression of a particular disease. In general, CAD systems have proven to be effective at providing quantitative measurements and assisting physicians during the decision-making process. As the need for more flexible and effective CADs continues to grow, questions about how to enhance their accuracy have surged. In this paper, we show how statistical image models can be augmented with multi-modal physiological values to create more robust, stable, and accurate CAD systems. In particular, this paper demonstrates how highly correlated blood and EKG features can be treated as biomarkers and used to enhance image classification models designed to automatically score subjects with pulmonary fibrosis. In our results, a 3-5% improvement was observed when comparing the accuracy of CADs that use multi-modal biomarkers with those that only used image features. Our results show that lab values such as Erythrocyte Sedimentation Rate and Fibrinogen, as well as EKG measurements such as QRS and I:40, are statistically significant and can provide valuable insights about the severity of the pulmonary fibrosis disease.
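
    A minimal sketch of the general idea of augmenting image features with physiological biomarkers in one classifier, using synthetic data and a generic scikit-learn pipeline rather than the authors' CAD system; the feature names in the comments are only placeholders.

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import cross_val_score
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(1)
      n = 200

      image_features = rng.normal(size=(n, 20))   # e.g. texture/intensity descriptors
      biomarkers = rng.normal(size=(n, 4))        # e.g. ESR, fibrinogen, QRS, I:40
      y = (image_features[:, 0] + 0.5 * biomarkers[:, 0]
           + rng.normal(scale=0.5, size=n) > 0).astype(int)

      clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))

      acc_img = cross_val_score(clf, image_features, y, cv=5).mean()
      acc_all = cross_val_score(clf, np.hstack([image_features, biomarkers]), y, cv=5).mean()
      print(f"image only: {acc_img:.3f}  image + biomarkers: {acc_all:.3f}")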

  14. Feature Selection from a Facial Image for Distinction of Sasang Constitution

    PubMed Central

    Koo, Imhoi; Kim, Jong Yeol; Kim, Myoung Geun

    2009-01-01

    Recently, oriental medicine has received attention for providing personalized medicine through consideration of the unique nature and constitution of individual patients. With the eventual goal of globalization, the current trend in oriental medicine research is the standardization by adopting western scientific methods, which could represent a scientific revolution. The purpose of this study is to establish methods for finding statistically significant features in a facial image with respect to distinguishing constitution and to show the meaning of those features. From facial photo images, facial elements are analyzed in terms of the distance, angle and the distance ratios, for which there are 1225, 61 250 and 749 700 features, respectively. Due to the very large number of facial features, it is quite difficult to determine truly meaningful features. We suggest a process for the efficient analysis of facial features including the removal of outliers, control for missing data to guarantee data confidence and calculation of statistical significance by applying ANOVA. We show the statistical properties of selected features according to different constitutions using the nine distances, 10 angles and 10 rates of distance features that are finally established. Additionally, the Sasang constitutional meaning of the selected features is shown here. PMID:19745013

  15. Occupational noise exposure, psychosocial working conditions and the risk of tinnitus.

    PubMed

    Frederiksen, Thomas Winther; Ramlau-Hansen, Cecilia Høst; Stokholm, Zara Ann; Grynderup, Matias Brødsgaard; Hansen, Åse Marie; Lund, Søren Peter; Kristiansen, Jesper; Vestergaard, Jesper Medom; Bonde, Jens Peter; Kolstad, Henrik Albert

    2017-02-01

    The purpose of this study was to evaluate the influence of occupational noise (current and cumulative doses) and psychosocial work factors (psychological demands and decision latitude) on tinnitus occurrence among workers, using objective and non-self-reported exposure measures to prevent reporting bias. In a cross-sectional study, we analyzed data from a Danish survey from 2009 to 2010 that included 534 workers from children day care units and 10 manufacturing trades. Associations between risk factors (current noise exposure, cumulative noise exposure and psychosocial working conditions) and tinnitus were analyzed with logistic regression. We found no statistically significant associations between either current [OR 0.95 (95% CI 0.89; 1.01)] or cumulative [OR 0.93 (95% CI 0.81; 1.06)] occupational noise exposure and tinnitus. Likewise, results for psychosocial working conditions showed no statistically significant association between work place decision latitude [OR 1.06 (95% CI 0.94; 1.13)] or psychological demands [OR 1.07 (95% CI 0.90; 1.26)] and tinnitus. Our results suggest that current Danish occupational noise levels (in combination with relevant noise protection) are not associated with tinnitus. Also, results indicated that the psychosocial working conditions we observed in this cohort of mainly industrial workers were not associated with tinnitus. Therefore, psychosocial working conditions comparable to those observed in this study are probably not relevant to take into account in the evaluation of workers presenting with tinnitus.

  16. Diagnostic Factors of Odontogenic Cysts in Iranian Population: A Retrospective Study Over the Past Two Decades.

    PubMed

    Mohajerani, Hassan; Esmaeelinejad, Mohammad; Sabour, Siamak; Aghdashi, Farzad; Dehghani, Nima

    2015-06-01

    Early diagnosis of odontogenic cysts, given their silent progression, is always a challenge for clinicians. The current study aimed to evaluate the frequency of odontogenic cysts and related factors in a selected Iranian population. This cross-sectional study was conducted on the recorded data of 312 patients at Taleghani Hospital, Tehran, Iran, from April 1993 to December 2013. All related data were extracted from the records and categorized in tables. The correlation between the variables was analyzed by either chi-square or multinomial logistic regression tests. P values < 0.05 were considered significant. Evaluation of the 312 patients' records (185 males and 127 females), with a mean age of 27.6 years, showed that the odontogenic keratocyst (OKC) was the most common odontogenic cyst, followed by the dentigerous cyst as the second most common lesion. Most of the patients were in the second or third decade of life, although the age distribution was not statistically significant. The findings of the current study showed that calcifying odontogenic cyst (COC) occurrence was significantly related to a history of trauma. Enucleation and curettage were the most common treatments for the odontogenic cysts. The current study showed that clinicians should consider the many factors associated with the occurrence of odontogenic cysts.

  17. Pathway analysis with next-generation sequencing data.

    PubMed

    Zhao, Jinying; Zhu, Yun; Boerwinkle, Eric; Xiong, Momiao

    2015-04-01

    Although pathway analysis methods have been developed and successfully applied to association studies of common variants, the statistical methods for pathway-based association analysis of rare variants have not been well developed. Many investigators observed highly inflated false-positive rates and low power in pathway-based tests of association of rare variants. The inflated false-positive rates and low true-positive rates of the current methods are mainly due to their lack of ability to account for gametic phase disequilibrium. To overcome these serious limitations, we develop a novel statistic that is based on the smoothed functional principal component analysis (SFPCA) for pathway association tests with next-generation sequencing data. The developed statistic has the ability to capture position-level variant information and account for gametic phase disequilibrium. By intensive simulations, we demonstrate that the SFPCA-based statistic for testing pathway association with either rare or common or both rare and common variants has the correct type 1 error rates. Also the power of the SFPCA-based statistic and 22 additional existing statistics are evaluated. We found that the SFPCA-based statistic has a much higher power than other existing statistics in all the scenarios considered. To further evaluate its performance, the SFPCA-based statistic is applied to pathway analysis of exome sequencing data in the early-onset myocardial infarction (EOMI) project. We identify three pathways significantly associated with EOMI after the Bonferroni correction. In addition, our preliminary results show that the SFPCA-based statistic has much smaller P-values to identify pathway association than other existing methods.

  18. Black Truffle Harvesting in Spanish Forests: Trends, Current Policies and Practices, and Implications on its Sustainability

    NASA Astrophysics Data System (ADS)

    Garcia-Barreda, Sergi; Forcadell, Ricardo; Sánchez, Sergio; Martín-Santafé, María; Marco, Pedro; Camarero, J. Julio; Reyna, Santiago

    2018-04-01

    The European black truffle is a mycorrhizal fungus native to Spanish Mediterranean forests. In most Spanish regions it was originally commercially harvested in the second half of the 20th century. Experts agree that wild truffle yields suffered a sharp decline during the 1970s and 1980s. However, official statistics for Spanish harvest are scarce and seemingly conflicting, and little attention has been paid to the regime for the exploitation of truffle-producing forests and its implications on the sustainability of this resource. Trends in harvest from 1969 to 2013 and current harvesting practices were analyzed as a case study, taking into account that Spain is a major truffle producer worldwide, but at the same time truffles have only recently been exploited. The available statistical sources, which include an increasing proportion of cultivated truffles since the mid-1990s, were explored, with estimates from Truffle Harvesters Federation showing higher consistency. Statistical sources were then compared with proxies for wild harvest (rents from truffle leases in public forests) to corroborate time trends in wild harvesting. Results suggest that black truffle production is recovering in recent years thanks to plantations, whereas wild harvest is still declining. The implications of Spanish legal and institutional framework on sustainability of wild truffle use are reviewed. In the current scenario, the decline of wild harvest is likely to continue and eventually make commercial harvesting economically unattractive, thus aggravating sustainability issues. Strengthening of property rights, rationalization of harvesting pressure, forest planning and involvement of public stakeholders are proposed as corrective measures.

  19. Hydrometeorological application of an extratropical cyclone classification scheme in the southern United States

    NASA Astrophysics Data System (ADS)

    Senkbeil, J. C.; Brommer, D. M.; Comstock, I. J.; Loyd, T.

    2012-07-01

    Extratropical cyclones (ETCs) in the southern United States are often overlooked when compared with tropical cyclones in the region and ETCs in the northern United States. Although southern ETCs are significant weather events, there is currently no operational scheme for identifying and discussing these nameless storms. In this research, we classified 84 ETCs (1970-2009). We manually identified five distinct formation regions and seven unique ETC types using statistical classification. Statistical classification employed the use of principal components analysis and two methods of cluster analysis. Both manual and statistical storm types generally showed positive (negative) relationships with El Niño (La Niña). Manual storm types displayed precipitation swaths consistent with discrete storm tracks, which further legitimizes the existence of multiple modes of southern ETCs. Statistical storm types also displayed unique precipitation intensity swaths, but these swaths were less indicative of track location. It is hoped that by classifying southern ETCs into types, forecasters, hydrologists, and broadcast meteorologists might be able to better anticipate projected amounts of precipitation at their locations.
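
    A minimal sketch of the statistical classification step described above (standardisation, principal components analysis, then clustering) on invented storm descriptors; the number of descriptors and the choice of k-means with seven clusters are assumptions for illustration.

      import numpy as np
      from sklearn.cluster import KMeans
      from sklearn.decomposition import PCA
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(7)
      # Hypothetical descriptors per cyclone: genesis lon/lat, central pressure, deepening rate, ...
      storms = rng.normal(size=(84, 6))

      X = StandardScaler().fit_transform(storms)
      scores = PCA(n_components=3).fit_transform(X)                       # retain leading components
      labels = KMeans(n_clusters=7, n_init=10, random_state=0).fit_predict(scores)

      for k in range(7):
          print(f"type {k}: {np.sum(labels == k)} storms")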

  20. Two-mode mazer injected with V-type three-level atoms

    NASA Astrophysics Data System (ADS)

    Liang, Wen-Qing; Zhang, Zhi-Ming; Xie, Sheng-Wu

    2003-12-01

    The properties of the two-mode mazer operating on V-type three-level atoms are studied. The effect of the one-atom pumping on the two modes of the cavity field in number-state is asymmetric, that is, the atom emits a photon into one mode with some probability and absorbs a photon from the other mode with some other probability. This effect makes the steady-state photon distribution and the steady-state photon statistics asymmetric for the two modes. The diagram of the probability currents for the photon distribution, given by the analysis of the master equation, reveals that there is no detailed balance solution for the master equation. The computations show that the photon statistics of one mode or both modes can be sub-Poissonian, that the two modes can have anticorrelation or correlation, that the photon statistics increases with the increase of thermal photons and that the resonant position and strength of the photon statistics are influenced by the ratio of the two coupling strengths of the two modes. These properties are also discussed physically.

  1. Student Mobility Rate: A Moving Target.

    ERIC Educational Resources Information Center

    Ligon, Glynn; Paredes, Vicente

    One of the most elusive statistics in education today is student mobility. Current mobility statistics are based on available rather than appropriate data, resulting in the best available mobility index, rather than one that would serve real information needs. This study documents methods currently being used by school districts and other entities…

  2. 77 FR 58510 - Proposed Information Collection; Comment Request; Current Population Survey (CPS), Annual Social...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-21

    ... various population groups. A prime statistic of interest is the classification of people in poverty and... Information Collection; Comment Request; Current Population Survey (CPS), Annual Social and Economic... conducted this supplement annually for over 50 years. The Census Bureau and the Bureau of Labor Statistics...

  3. 77 FR 47029 - Notice of Intent To Request Revision and Extension of a Currently Approved Information Collection

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-07

    ... DEPARTMENT OF AGRICULTURE National Agricultural Statistics Service Notice of Intent To Request Revision and Extension of a Currently Approved Information Collection AGENCY: National Agricultural... Reduction Act of 1995 this notice announces the intention of the National Agricultural Statistics Service...

  4. Collective behavior of networks with linear (VLSI) integrate-and-fire neurons.

    PubMed

    Fusi, S; Mattia, M

    1999-04-01

    We analyze in detail the statistical properties of the spike emission process of a canonical integrate-and-fire neuron, with a linear integrator and a lower bound for the depolarization, as often used in VLSI implementations (Mead, 1989). The spike statistics of such neurons appear to be qualitatively similar to conventional (exponential) integrate-and-fire neurons, which exhibit a wide variety of characteristics observed in cortical recordings. We also show that, contrary to current opinion, the dynamics of a network composed of such neurons has two stable fixed points, even in the purely excitatory network, corresponding to two different states of reverberating activity. The analytical results are compared with numerical simulations and are found to be in good agreement.
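
    A minimal simulation sketch of the neuron described above: a linear integrator with constant leak, a reflecting lower bound on the depolarization, and a firing threshold with reset. The parameter values and the diffusion approximation of the input are assumptions, not those of the paper.

      import numpy as np

      def linear_lif(mu, beta, sigma, theta=1.0, dt=1e-4, t_max=50.0, seed=0):
          # dV = (mu - beta) dt + sigma dW, with V kept >= 0 and a spike/reset at V >= theta
          rng = np.random.default_rng(seed)
          v, spikes = 0.0, []
          for step in range(int(t_max / dt)):
              v += (mu - beta) * dt + sigma * np.sqrt(dt) * rng.standard_normal()
              v = max(v, 0.0)                # lower bound for the depolarization
              if v >= theta:
                  spikes.append(step * dt)   # emit a spike and reset
                  v = 0.0
          return np.array(spikes)

      spikes = linear_lif(mu=2.0, beta=0.5, sigma=0.5)
      isi = np.diff(spikes)
      print(f"rate ~ {spikes.size / 50.0:.1f} spikes/s, ISI CV ~ {isi.std() / isi.mean():.2f}")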

  5. Frequentist Model Averaging in Structural Equation Modelling.

    PubMed

    Jin, Shaobo; Ankargren, Sebastian

    2018-06-04

    Model selection from a set of candidate models plays an important role in many structural equation modelling applications. However, traditional model selection methods introduce extra randomness that is not accounted for by post-model selection inference. In the current study, we propose a model averaging technique within the frequentist statistical framework. Instead of selecting an optimal model, the contributions of all candidate models are acknowledged. Valid confidence intervals and a [Formula: see text] test statistic are proposed. A simulation study shows that the proposed method is able to produce a robust mean-squared error, a better coverage probability, and a better goodness-of-fit test compared to model selection. It is an interesting compromise between model selection and the full model.
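
    For orientation, a frequentist model-averaged estimator generally takes the weighted form below; the smoothed-AIC weights on the right are shown only as one familiar choice and are an assumption here, not necessarily the weighting used in the study.

      \hat{\theta}_{\mathrm{avg}} = \sum_{m=1}^{M} w_m \hat{\theta}_m,
      \qquad w_m \ge 0,\ \sum_{m=1}^{M} w_m = 1,
      \qquad \text{e.g. } w_m = \frac{\exp(-\mathrm{AIC}_m/2)}{\sum_{k=1}^{M} \exp(-\mathrm{AIC}_k/2)}.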

  6. Results of a joint NOAA/NASA sounder simulation study

    NASA Technical Reports Server (NTRS)

    Phillips, N.; Susskind, Joel; Mcmillin, L.

    1988-01-01

    This paper presents the results of a joint NOAA and NASA sounder simulation study in which the accuracies of atmospheric temperature profiles and surface skin temperature measurements retrieved from two sounders were compared: (1) the currently used IR temperature sounder HIRS2 (High-resolution Infrared Radiation Sounder 2); and (2) the recently proposed high-spectral-resolution IR sounder AMTS (Advanced Moisture and Temperature Sounder). Simulations were conducted for both clear and partial cloud conditions. Data were analyzed at NASA using a physical inversion technique and at NOAA using a statistical technique. Results show significant improvement of AMTS compared to HIRS2 for both clear and cloudy conditions. The improvements are indicated by both methods of data analysis, but the physical retrievals outperform the statistical retrievals.

  7. Probing the Fluctuations of Optical Properties in Time-Resolved Spectroscopy

    NASA Astrophysics Data System (ADS)

    Randi, Francesco; Esposito, Martina; Giusti, Francesca; Misochko, Oleg; Parmigiani, Fulvio; Fausti, Daniele; Eckstein, Martin

    2017-11-01

    We show that, in optical pump-probe experiments on bulk samples, the statistical distribution of the intensity of ultrashort light pulses after interaction with a nonequilibrium complex material can be used to measure the time-dependent noise of the current in the system. We illustrate the general arguments for a photoexcited Peierls material. The transient noise spectroscopy allows us to measure to what extent electronic degrees of freedom dynamically obey the fluctuation-dissipation theorem, and how well they thermalize during the coherent lattice vibrations. The proposed statistical measurement developed here provides a new general framework to retrieve dynamical information on the excited distributions in nonequilibrium experiments, which could be extended to other degrees of freedom of magnetic or vibrational origin.

  8. THE COMMON MARKET AND EUROPEAN UNIFICATION,

    DTIC Science & Technology

    A study of the Common Market is presented, covering its past problems, current difficulties, and future possibilities. The study consists of seven sections...each of which may be read independently: (1) an introduction to the Common Market; (2) the Common Market and internal trade; (3) external economic...European Economic Community agriculture; and (7) the Common Market and European political unification. Statistical tables showing import and export data of the Common Market countries are appended. (Author)

  9. Communications Magnetospheric Substorms.

    DTIC Science & Technology

    1983-01-17

    Magnetospheric Study, edited by K. Knott and B. Battrick, D. Reidel Publ. Co., 345-364, 1976. 26. Bossen, M., R.L. McPherron, and C.T. Russell, A statistical...DURING MAGNETIC SUBSTORMS. THE FORMATION OF PARTIAL RING CURRENTS AND ITS RELATIONSHIP TO SOLAR WIND PARAMETERS AND THE RELATIONSHIP BETWEEN...noise amplified by the K-H instability which then couples to a resonance. Power spectra of Pc 3 pulsations at synchronous orbit often show multiple

  10. First Marriages in the United States: Data from the 2006-2010 National Survey of Family Growth. National Health Statistics Reports. Number 49

    ERIC Educational Resources Information Center

    Copen, Casey E.; Daniels, Kimberly; Vespa, Jonathan; Mosher, William D.

    2012-01-01

    Objectives: This report shows trends and group differences in current marital status, with a focus on first marriages among women and men aged 15-44 years in the United States. Trends and group differences in the timing and duration of first marriages are also discussed. These data are based on the 2006-2010 National Survey of Family Growth…

  11. Stress-induced electric current fluctuations in rocks: a superstatistical model

    NASA Astrophysics Data System (ADS)

    Cartwright-Taylor, Alexis; Vallianatos, Filippos; Sammonds, Peter

    2017-04-01

    We recorded spontaneous electric current flow in non-piezoelectric Carrara marble samples during triaxial deformation. Mechanical data, ultrasonic velocities and acoustic emissions were acquired simultaneously with electric current to constrain the relationship between electric current flow, differential stress and damage. Under strain-controlled loading, spontaneous electric current signals (nA) were generated and sustained under all conditions tested. In dry samples, a detectable electric current arises only during dilatancy and the overall signal is correlated with the damage induced by microcracking. Our results show that fracture plays a key role in the generation of electric currents in deforming rocks (Cartwright-Taylor et al., in prep). We also analysed the high-frequency fluctuations of these electric current signals and found that they are not normally distributed - they exhibit power-law tails (Cartwright-Taylor et al., 2014). We modelled these distributions with q-Gaussian statistics, derived by maximising the Tsallis entropy. This definition of entropy is particularly applicable to systems which are strongly correlated and far from equilibrium. Good agreement, at all experimental conditions, between the distributions of electric current fluctuations and the q-Gaussian function with q-values far from one, illustrates the highly correlated, fractal nature of the electric source network within the samples and provides further evidence that the source of the electric signals is the developing fractal network of cracks. It has been shown (Beck, 2001) that q-Gaussian distributions can arise from the superposition of local relaxations in the presence of a slowly varying driving force, thus providing a dynamic reason for the appearance of Tsallis statistics in systems with a fluctuating energy dissipation rate. So, the probability distribution for a dynamic variable, u under some external slow forcing, β, can be obtained as a superposition of temporary local equilibrium processes whose variance fluctuates over time. The appearance of q-Gaussian statistics are caused by the fluctuating β parameter, which effectively models the fluctuating energy dissipation rate in the system. This concept is known as superstatistics and is physically relevant for modelling driven non-equilibrium systems where the environmental conditions fluctuate on a large scale. The idea is that the environmental variable, such as temperature or pressure, changes so slowly that a rapidly fluctuating variable within that environment has time to relax back to equilibrium between each change in the environment. The application of superstatistical techniques to our experimental electric current fluctuations show that they can indeed be described, to good approximation, by the superposition of local Gaussian processes with fluctuating variance. We conclude, then, that the measured electric current fluctuates in response to intermittent energy dissipation and is driven to varying temporary local equilibria during deformation by the variations in stress intensity. The advantage of this technique is that, once the model has been established to be a good description of the system in question, the average β parameter (a measure of the average energy dissipation rate) for the system can be obtained simply from the macroscopic q-Gaussian distribution parameters.
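
    The superstatistical construction referenced above (Beck, 2001) can be written compactly as follows, with normalisation constants omitted: the observed distribution is a mixture of local Gaussians over a slowly fluctuating inverse-temperature-like parameter beta, and a chi-squared (Gamma) distributed beta yields the q-Gaussian.

      p(u) = \int_0^{\infty} f(\beta)\, \sqrt{\frac{\beta}{2\pi}}\, e^{-\beta u^{2}/2}\, \mathrm{d}\beta,
      \qquad
      f(\beta)\ \chi^{2}\text{-distributed} \;\Rightarrow\;
      p(u) \propto \Bigl[\,1 + (q-1)\,\tilde{\beta}\,\tfrac{u^{2}}{2}\Bigr]^{-\frac{1}{q-1}},

    where q is fixed by the number of degrees of freedom of f(beta) and the effective parameter beta-tilde plays the role of an average inverse variance.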

  12. Lithuanian women physicists: Current situation and involvement in gender projects

    NASA Astrophysics Data System (ADS)

    Šatkovskienė, Dalia; Ruželė, Živilė; Rutkūnienė, Živilė; Kupliauskienė, Alicija

    2015-12-01

    The changes in the situation of women in physics since the last Lithuanian country report are discussed on the basis of available statistics. The overall percentage of women physicists in research is 28%. Results show that there is a noticeable increase in female scientists in most phases of the academic career progression except in the highest positions. The results also show a permanent change in the awareness of gender-related issues in research. We also discuss the initiatives taken by Lithuanian women scientists to change the situation during the last three years and their outcomes.

  13. Statistical downscaling rainfall using artificial neural network: significantly wetter Bangkok?

    NASA Astrophysics Data System (ADS)

    Vu, Minh Tue; Aribarg, Thannob; Supratid, Siriporn; Raghavan, Srivatsan V.; Liong, Shie-Yui

    2016-11-01

    Artificial neural network (ANN) is an established technique with a flexible mathematical structure that is capable of identifying complex nonlinear relationships between input and output data. The present study utilizes ANN as a method of statistically downscaling global climate models (GCMs) during the rainy season at meteorological site locations in Bangkok, Thailand. The study illustrates the applications of the feed forward back propagation using large-scale predictor variables derived from both the ERA-Interim reanalyses data and present day/future GCM data. The predictors are first selected over different grid boxes surrounding Bangkok region and then screened by using principal component analysis (PCA) to filter the best correlated predictors for ANN training. The reanalyses downscaled results of the present day climate show good agreement against station precipitation with a correlation coefficient of 0.8 and a Nash-Sutcliffe efficiency of 0.65. The final downscaled results for four GCMs show an increasing trend of precipitation for rainy season over Bangkok by the end of the twenty-first century. The extreme values of precipitation determined using statistical indices show strong increases of wetness. These findings will be useful for policy makers in pondering adaptation measures due to flooding such as whether the current drainage network system is sufficient to meet the changing climate and to plan for a range of related adaptation/mitigation measures.
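
    A minimal sketch of the workflow described above: standardise large-scale predictors, screen them with PCA, and fit a feed-forward network to station rainfall. The synthetic data, number of retained components and network size are illustrative assumptions, not the study's configuration.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.model_selection import train_test_split
      from sklearn.neural_network import MLPRegressor
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(3)
      n_days, n_predictors = 3000, 60   # e.g. reanalysis fields over grid boxes around the station

      X = rng.normal(size=(n_days, n_predictors))
      rain = np.maximum(0, X[:, :5].sum(axis=1) + rng.normal(scale=1.0, size=n_days))  # synthetic target

      model = make_pipeline(
          StandardScaler(),
          PCA(n_components=10),                     # predictor screening step
          MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0),
      )

      X_tr, X_te, y_tr, y_te = train_test_split(X, rain, test_size=0.3, random_state=0)
      model.fit(X_tr, y_tr)
      pred = model.predict(X_te)
      print(f"hold-out correlation: {np.corrcoef(pred, y_te)[0, 1]:.2f}")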

  14. Cosmological Constraints from Galaxy Cluster Velocity Statistics

    NASA Astrophysics Data System (ADS)

    Bhattacharya, Suman; Kosowsky, Arthur

    2007-04-01

    Future microwave sky surveys will have the sensitivity to detect the kinematic Sunyaev-Zeldovich signal from moving galaxy clusters, thus providing a direct measurement of their line-of-sight peculiar velocity. We show that cluster peculiar velocity statistics applied to foreseeable surveys will put significant constraints on fundamental cosmological parameters. We consider three statistical quantities that can be constructed from a cluster peculiar velocity catalog: the probability density function, the mean pairwise streaming velocity, and the pairwise velocity dispersion. These quantities are applied to an envisioned data set that measures line-of-sight cluster velocities with normal errors of 100 km s^-1 for all clusters with masses larger than 10^14 M_solar over a sky area of up to 5000 deg^2. A simple Fisher matrix analysis of this survey shows that the normalization of the matter power spectrum and the dark energy equation of state can be constrained to better than 10%, and that the Hubble constant and the primordial power spectrum index can be constrained to a few percent, independent of any other cosmological observations. We also find that the current constraint on the power spectrum normalization can be improved by more than a factor of 2 using data from a 400 deg^2 survey and WMAP third-year priors. We also show how the constraints on cosmological parameters change if cluster velocities are measured with normal errors of 300 km s^-1.

  15. Long-term impact of sewage sludge application on soil microbial biomass: An evaluation using meta-analysis.

    PubMed

    Charlton, Alex; Sakrabani, Ruben; Tyrrel, Sean; Rivas Casado, Monica; McGrath, Steve P; Crooks, Bill; Cooper, Pat; Campbell, Colin D

    2016-12-01

    The Long-Term Sludge Experiments (LTSE) began in 1994 as part of continuing research into the effects of sludge-borne heavy metals on soil fertility. The long-term effects of Zn, Cu, and Cd on soil microbial biomass carbon (C_mic) were monitored for 8 years (1997-2005) in sludge-amended soils at nine UK field sites. To assess the statutory limits set by the UK Sludge (Use in Agriculture) Regulations, the experimental data have been reviewed using the statistical methods of meta-analysis. Previous LTSE studies have focused predominantly on statistical significance rather than effect size, whereas meta-analysis focuses on the magnitude and direction of an effect, i.e. the practical significance, rather than its statistical significance. The results presented here show that significant decreases in C_mic have occurred in soils where the total concentrations of Zn and Cu fall below the current UK statutory limits. For soils receiving sewage sludge predominantly contaminated with Zn, decreases of approximately 7-11% were observed at concentrations below the UK statutory limit. The effect of Zn appeared to increase over time, with progressively greater decreases in C_mic observed over a period of 8 years. This may be due to an interactive effect between Zn and confounding Cu contamination, which has augmented the bioavailability of these metals over time. Similar decreases (7-12%) in C_mic were observed in soils receiving sewage sludge predominantly contaminated with Cu; however, C_mic appeared to show signs of recovery after a period of 6 years. Application of sewage sludge predominantly contaminated with Cd appeared to have no effect on C_mic at concentrations below the current UK statutory limit. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
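
    The effect-size emphasis of the meta-analysis can be illustrated with a small sketch: each site's sludge-versus-control contrast in microbial biomass C is expressed as a log response ratio and the sites are pooled with inverse-variance weights. The per-site summaries below are invented for illustration, and the weighting is a simple fixed-effect scheme, not the study's exact model.

```python
# Hedged sketch: log response ratios pooled by inverse-variance weighting (synthetic site summaries).
import numpy as np

# columns: mean_sludge, sd_sludge, n_sludge, mean_control, sd_control, n_control (hypothetical)
sites = np.array([
    [310.0, 40.0, 4, 345.0, 38.0, 4],
    [280.0, 35.0, 4, 305.0, 30.0, 4],
    [255.0, 28.0, 4, 290.0, 33.0, 4],
])
mt, st, nt, mc, sc, nc = sites.T

lnrr = np.log(mt / mc)                                   # per-site log response ratio
var = st**2 / (nt * mt**2) + sc**2 / (nc * mc**2)        # its large-sample variance
w = 1.0 / var                                            # inverse-variance weights
pooled, se = np.sum(w * lnrr) / np.sum(w), np.sqrt(1.0 / np.sum(w))

pct = lambda x: 100.0 * (np.exp(x) - 1.0)                # back-transform to % change in C_mic
print(f"pooled effect: {pct(pooled):.1f}% "
      f"(95% CI {pct(pooled - 1.96*se):.1f}% to {pct(pooled + 1.96*se):.1f}%)")
```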

  16. Which sociodemographic factors are important on smoking behaviour of high school students? The contribution of classification and regression tree methodology in a broad epidemiological survey

    PubMed Central

    Özge, C; Toros, F; Bayramkaya, E; Çamdeviren, H; Şaşmaz, T

    2006-01-01

    Background The purpose of this study is to evaluate the most important sociodemographic factors affecting the smoking status of high school students, using a broad randomised epidemiological survey. Methods Using an in-class, self-administered questionnaire about their sociodemographic variables and smoking behaviour, a representative sample of 3304 students in the preparatory, 9th, 10th, and 11th grades, from 22 randomly selected schools in Mersin, was evaluated, and discriminative factors were determined using appropriate statistics. In addition to binary logistic regression analysis, the study evaluated the combined effects of these factors using classification and regression tree (CART) methodology, a relatively new statistical method. Results The data showed that 38% of the students reported lifetime smoking and 16.9% reported current smoking, with male predominance and increasing prevalence by age. Second-hand smoking was reported at a frequency of 74.3%, with fathers the most frequent source (56.6%). The significant factors affecting current smoking in these age groups were increased household size, late birth rank, certain school types, low academic performance, increased second-hand smoking, and stress (especially reported as separation from a close friend or violence at home). Classification and regression tree methodology highlighted the importance of some neglected sociodemographic factors with good classification capacity. Conclusions It was concluded that, being closely related to sociocultural factors, smoking is a common problem in this young population, generating an important academic and social burden in youth life, and that with more data on this behaviour and the use of new statistical methods, effective coping strategies could be developed. PMID:16891446
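
    A compact sketch of the classification-tree idea used in the study, on synthetic data with assumed predictors (household size, birth rank, second-hand smoke, academic score); it only illustrates how a CART model exposes interpretable splits, not the survey's actual variables or results.

```python
# Illustrative CART sketch on synthetic data (assumed predictors and effect sizes).
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(1)
n = 1000
X = np.column_stack([
    rng.integers(3, 9, n),        # household size
    rng.integers(1, 5, n),        # birth rank
    rng.integers(0, 2, n),        # second-hand smoke at home (0/1)
    rng.normal(70.0, 10.0, n),    # academic performance score
])
logit = -4.0 + 0.25 * X[:, 0] + 0.3 * X[:, 1] + 1.0 * X[:, 2] - 0.02 * X[:, 3]
y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))   # synthetic "current smoker" label

tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=50, random_state=0).fit(X, y)
print(export_text(tree, feature_names=[
    "household_size", "birth_rank", "second_hand_smoke", "academic_score"]))
```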

  17. Efficient statistical tests to compare Youden index: accounting for contingency correlation.

    PubMed

    Chen, Fangyao; Xue, Yuqiang; Tan, Ming T; Chen, Pingyan

    2015-04-30

    The Youden index is widely utilized in studies evaluating the accuracy of diagnostic tests and the performance of predictive, prognostic, or risk models. However, both one-sample and two-independent-sample tests on the Youden index have been derived ignoring the dependence (association) between sensitivity and specificity, resulting in potentially misleading findings. In addition, a paired-sample test on the Youden index is currently unavailable. This article develops efficient statistical inference procedures for one-sample, independent-sample, and paired-sample tests on the Youden index by accounting for contingency correlation, namely the associations between sensitivity and specificity and between paired samples typically represented in contingency tables. For the one-sample and two-independent-sample tests, the variances are estimated by the Delta method, and the statistical inference is based on central limit theory, which is then verified by bootstrap estimates. For the paired-sample test, we show that the estimated covariance of the two sensitivities and specificities can be represented as a function of the kappa statistic, so the test can be readily carried out. We then show the remarkable accuracy of the estimated variance using a constrained optimization approach. Simulation is performed to evaluate the statistical properties of the derived tests. The proposed approaches yield more stable type I errors at the nominal level and substantially higher power (efficiency) than the original Youden approach. Therefore, the simple explicit large-sample solution performs very well. Because the asymptotic and exact bootstrap computations can be readily implemented with common software such as R, the method is broadly applicable to the evaluation of diagnostic tests and model performance. Copyright © 2015 John Wiley & Sons, Ltd.
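
    For context, the baseline large-sample test that the paper improves upon can be sketched from a single 2x2 table: the Youden index is estimated as sensitivity plus specificity minus one, and a Wald test uses the classical Delta-method variance that treats sensitivity and specificity as independent binomial proportions. The counts below are hypothetical; the paper's contribution is the contingency-corrected variance, which this sketch does not implement.

```python
# Baseline (uncorrected) one-sample Wald test on the Youden index; hypothetical counts.
import numpy as np
from scipy import stats

tp, fn, fp, tn = 80, 20, 15, 85            # diseased: tp+fn, non-diseased: fp+tn
se_hat = tp / (tp + fn)                    # sensitivity
sp_hat = tn / (tn + fp)                    # specificity
J = se_hat + sp_hat - 1.0                  # Youden index

var_J = se_hat * (1 - se_hat) / (tp + fn) + sp_hat * (1 - sp_hat) / (tn + fp)  # Delta-method variance
z = J / np.sqrt(var_J)                     # test of H0: J = 0 (no diagnostic value)
p = 2.0 * stats.norm.sf(abs(z))
print(f"J = {J:.3f}, z = {z:.2f}, p = {p:.2g}")
```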

  18. Preservice Secondary Mathematics Teachers' Statistical Knowledge: A Snapshot of Strengths and Weaknesses

    ERIC Educational Resources Information Center

    Lovett, Jennifer N.; Lee, Hollylynne S.

    2017-01-01

    Amid the implementation of new curriculum standards regarding statistics and new recommendations for preservice secondary mathematics teachers [PSMTs] to teach statistics, there is a need to examine the current state of PSMTs' common statistical knowledge. This study reports on the statistical knowledge of 217 PSMTs from a purposeful sample of 18…

  19. Wave Propagation in Non-Stationary Statistical Mantle Models at the Global Scale

    NASA Astrophysics Data System (ADS)

    Meschede, M.; Romanowicz, B. A.

    2014-12-01

    We study the effect of statistically distributed heterogeneities that are smaller than the resolution of current tomographic models on seismic waves that propagate through the Earth's mantle at teleseismic distances. Current global tomographic models are missing small-scale structure as evidenced by the failure of even accurate numerical synthetics to explain enhanced coda in observed body and surface waveforms. One way to characterize small scale heterogeneity is to construct random models and confront observed coda waveforms with predictions from these models. Statistical studies of the coda typically rely on models with simplified isotropic and stationary correlation functions in Cartesian geometries. We show how to construct more complex random models for the mantle that can account for arbitrary non-stationary and anisotropic correlation functions as well as for complex geometries. Although this method is computationally heavy, model characteristics such as translational, cylindrical or spherical symmetries can be used to greatly reduce the complexity such that this method becomes practical. With this approach, we can create 3D models of the full spherical Earth that can be radially anisotropic, i.e. with different horizontal and radial correlation functions, and radially non-stationary, i.e. with radially varying model power and correlation functions. Both of these features are crucial for a statistical description of the mantle in which structure depends to first order on the spherical geometry of the Earth. We combine different random model realizations of S velocity with current global tomographic models that are robust at long wavelengths (e.g. Meschede and Romanowicz, 2014, GJI submitted), and compute the effects of these hybrid models on the wavefield with a spectral element code (SPECFEM3D_GLOBE). We finally analyze the resulting coda waves for our model selection and compare our computations with observations. Based on these observations, we make predictions about the strength of unresolved small-scale structure and extrinsic attenuation.

  20. Climate change over Leh (Ladakh), India

    NASA Astrophysics Data System (ADS)

    Chevuturi, A.; Dimri, A. P.; Thayyen, R. J.

    2018-01-01

    Mountains over the world are considered indicators of climate change. The Himalayas comprise five ranges, viz., Pir Panjal, Great Himalayas, Zanskar, Ladakh, and Karakorum. The Ladakh region lies in the northernmost state of India, Jammu and Kashmir, in the Ladakh range. It has a unique cold-arid climate and lies immediately south of the Karakorum range. With scarce water resources, such regions show high sensitivity and vulnerability to climate change and need urgent attention. The objective of this study is to understand the climate of the Ladakh region and to characterize its changing climate. Using different temperature and precipitation datasets over Leh and surrounding regions, we statistically analyze the current trends of climatic patterns over the region. The analysis shows a warming trend over Leh with reduced precipitation in the current decade. The reduced average seasonal precipitation may also be associated with indications of a decreasing number of days with higher precipitation amounts over the region.

  1. Combined Dextroamphetamine and Transcranial Direct Current Stimulation in Poststroke Aphasia.

    PubMed

    Keser, Zafer; Dehgan, Michelle Weber; Shadravan, Shaparak; Yozbatiran, Nuray; Maher, Lynn M; Francisco, Gerard E

    2017-10-01

    There is a growing need for various effective adjunctive treatment options for speech recovery after stroke. A pharmacological agent combined with noninvasive brain stimulation has not been previously reported for poststroke aphasia recovery. In this "proof of concept" study, we aimed to test the safety of a combined intervention consisting of dextroamphetamine, transcranial direct current stimulation, and speech and language therapy in subjects with nonfluent aphasia. Ten subjects with chronic nonfluent aphasia underwent two experiments in which they received dextroamphetamine or placebo along with transcranial direct current stimulation and speech and language therapy on two separate days. The Western Aphasia Battery-Revised was used to monitor changes in speech performance. No serious adverse events were observed. There was no significant increase in blood pressure with amphetamine and no deterioration in speech and language performance. The Western Aphasia Battery-Revised aphasia quotient and language quotient showed a statistically significant increase in the active experiment. Comparison of the proportional changes of the aphasia quotient and language quotient in the active experiment with those in the placebo experiment showed a significant difference. We showed that the triple combination therapy is safe and implementable, and it seems to induce positive changes in speech and language performance in patients with chronic nonfluent aphasia due to stroke.

  2. Use of Electrochemical Noise (EN) Technique to Study the Effect of sulfate and Chloride Ions on Passivation and Pitting Corrosion Behavior of 316 Stainless Steel

    NASA Astrophysics Data System (ADS)

    Pujar, M. G.; Anita, T.; Shaikh, H.; Dayal, R. K.; Khatak, H. S.

    2007-08-01

    In the present paper, studies were conducted on AISI Type 316 stainless steel (SS) in deaerated solutions of sodium sulfate as well as sodium chloride to establish the effect of sulfate and chloride ions on the electrochemical corrosion behavior of the material. The experiments were conducted in deaerated solutions of 0.5 M sodium sulfate and 0.5 M sodium chloride using the electrochemical noise (EN) technique at open circuit potential (OCP) to collect correlated current and potential signals. Visual records of the current and potential, analysis of the data to obtain statistical parameters, and spectral density estimation using the maximum entropy method (MEM) showed that sulfate ions were incorporated into the passive film, strengthening it. However, the adsorption of chloride ions resulted in pitting corrosion, thereby adversely affecting the noise resistance (R_N). Distinct current and potential signals were observed for metastable pitting, stable pitting, and passive film build-up. Distinct changes in the values of statistical parameters such as R_N and the spectral noise resistance at zero frequency (R°_SN) revealed adsorption and incorporation of sulfate and chloride ions at the passive film/solution interface.

  3. On a magnetic reconnection in the Venusian wake. The experimental evidences.

    NASA Astrophysics Data System (ADS)

    Fedorov, Andrei; Jarvinen, Riku; Volwerk, Martin; Barabash, Stas; Zhang, Tielong; Sauvaud, Jean-Andre

    2010-05-01

    The Venusian magnetotail is formed by solar wind magnetic flux tubes draping around the planet and stretched antisunward. The magnetotail topology consists of two magnetic lobes separated by a thin current sheet. Such a configuration is a reservoir of free energy. The accumulated energy is generally released by antisunward acceleration of planetary ions. However, if magnetic reconnection occurs somewhere in the equatorial current sheet, some of the planetary ions filling the tail should be accelerated toward the planet. To check this hypothesis we have performed statistical and case studies based on data from the IMA mass spectrometer and the magnetometer onboard the ESA Venus Express mission. We found that the distribution function of planetary ions in the equatorial plane of the wake, near midnight and at distances less than 1.7 Rv from the center of the planet, contains a significant population moving toward the planet. At the same time, the magnetic field statistics and numerical simulation show a magnetic field minimum, similar to an X-line, in the current sheet at a distance of about 1.7 Rv from the planet center. This could be evidence for quasi-permanent reconnection in the Venusian wake.

  4. NCES Handbook of Survey Methods: Technical Report.

    ERIC Educational Resources Information Center

    Thurgood, Lori; Walter, Elizabeth; Carter, George; Henn, Susan; Huang, Gary; Nooter, Daniel; Smith, Wray; Cash, R. William; Salvucci, Sameena; Seastrom, Marilyn; Phan, Tai; Cohen, Michael

    This handbook presents current explanations of how each survey program of the National Center for Education Statistics (NCES) obtains and prepares the data it publishes. The handbook aims to provide users of NCES data with the most current information necessary to evaluate the suitability of the statistics for their needs, with a focus on the…

  5. A Tablet-PC Software Application for Statistics Classes

    ERIC Educational Resources Information Center

    Probst, Alexandre C.

    2014-01-01

    A significant deficiency in the area of introductory statistics education exists: Student performance on standardized assessments after a full semester statistics course is poor and students report a very low desire to learn statistics. Research on the current generation of students indicates an affinity for technology and for multitasking.…

  6. Quantitative evaluation of ASiR image quality: an adaptive statistical iterative reconstruction technique

    NASA Astrophysics Data System (ADS)

    Van de Casteele, Elke; Parizel, Paul; Sijbers, Jan

    2012-03-01

    Adaptive statistical iterative reconstruction (ASiR) is a new reconstruction algorithm used in the field of medical X-ray imaging. This new reconstruction method combines the idealized system representation, as we know it from the standard Filtered Back Projection (FBP) algorithm, and the strength of iterative reconstruction by including a noise model in the reconstruction scheme. It studies how noise propagates through the reconstruction steps, feeds this model back into the loop and iteratively reduces noise in the reconstructed image without affecting spatial resolution. In this paper the effect of ASiR on the contrast to noise ratio is studied using the low contrast module of the Catphan phantom. The experiments were done on a GE LightSpeed VCT system at different voltages and currents. The results show reduced noise and increased contrast for the ASiR reconstructions compared to the standard FBP method. For the same contrast to noise ratio the images from ASiR can be obtained using 60% less current, leading to a reduction in dose of the same amount.

  7. Plasma current collection of Z-93 thermal control paint as measured in the Lewis Research Center's plasma interaction facility

    NASA Technical Reports Server (NTRS)

    Hillard, G. Barry

    1993-01-01

    A sample of Z-93 thermal control paint was exposed to a simulated space environment in a plasma chamber. The sample was biased through a series of voltages ranging from -100 volts to +300 volts, and electron and ion currents were measured. Currents were found to be in the micro-ampere range, indicating that the material remains a reasonably good insulator under plasma conditions. As a second step, the sample was left in the chamber for six days and retested. Collected currents were reduced by a factor of two to five from the previous values, indicating a substantial loss of conductivity. As a final test, the sample was removed, exposed to room conditions for two days, and returned to the chamber. Current measurements showed that the sample had partially recovered the lost conductivity. In addition to presenting these results, this report documents all of the experimental data as well as the statistical analyses performed.

  8. Statistical survey of day-side magnetospheric current flow using Cluster observations: magnetopause

    NASA Astrophysics Data System (ADS)

    Liebert, Evelyn; Nabert, Christian; Perschke, Christopher; Fornaçon, Karl-Heinz; Glassmeier, Karl-Heinz

    2017-05-01

    We present a statistical survey of current structures observed by the Cluster spacecraft at high-latitude day-side magnetopause encounters in the close vicinity of the polar cusps. Making use of the curlometer technique and the fluxgate magnetometer data, we calculate the 3-D current densities and investigate the magnetopause current direction, location, and magnitude during varying solar wind conditions. We find that the orientation of the day-side current structures is in accordance with existing magnetopause current models. Based on the ambient plasma properties, we distinguish five different transition regions at the magnetopause surface and observe distinctive current properties for each region. Additionally, we find that the location of currents varies with respect to the onset of the changes in the plasma environment during magnetopause crossings.
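
    The four-point current estimate behind such surveys can be sketched generically: fit a linear magnetic field model to the four spacecraft measurements and take the current density from the curl of the fitted gradient tensor. The positions, field values, and least-squares gradient estimate below are illustrative stand-ins, not the curlometer implementation used for Cluster.

```python
# Illustrative four-point current-density estimate (linear gradient fit; synthetic inputs).
import numpy as np

mu0 = 4e-7 * np.pi
r = np.array([[0, 0, 0], [100, 0, 0], [0, 100, 0], [0, 0, 100]], float) * 1e3     # positions (m), assumed
B = np.array([[20, 5, 30], [20, 8, 30], [17, 5, 30], [20, 5, 31]], float) * 1e-9  # fields (T), assumed

dr = r - r.mean(axis=0)
dB = B - B.mean(axis=0)
G, *_ = np.linalg.lstsq(dr, dB, rcond=None)    # G[j, i] ~ dB_i/dx_j from a least-squares linear fit

curl_B = np.array([G[1, 2] - G[2, 1],          # dBz/dy - dBy/dz
                   G[2, 0] - G[0, 2],          # dBx/dz - dBz/dx
                   G[0, 1] - G[1, 0]])         # dBy/dx - dBx/dy
J = curl_B / mu0                               # current density (A/m^2)
print(J * 1e9, "nA/m^2")
```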

  9. Statistical Challenges in Military Research

    DTIC Science & Technology

    2016-07-30

    University of Tennessee Health Science Center currently has five NIH/DOD funded grant projects addressing tobacco, alcohol abuse, and obesity prevention in... American Statistical Association (Section on Defense and National Security), Joint Statistical Meetings, Chicago, IL, August 2016 The opinions

  10. Cathodal transcranial direct current stimulation in children with dystonia: a pilot open-label trial.

    PubMed

    Young, Scott J; Bertucco, Matteo; Sheehan-Stross, Rebecca; Sanger, Terence D

    2013-10-01

    Studies suggest that dystonia is associated with increased motor cortex excitability. Cathodal transcranial direct current stimulation can temporarily reduce motor cortex excitability. To test whether stimulation of the motor cortex can reduce dystonic symptoms in children, we measured tracking performance and muscle overflow using an electromyogram tracking task before and after stimulation. Of 10 participants, 3 showed a significant reduction in overflow, and a fourth showed a significant reduction in tracking error. Overflow decreased more when the hand contralateral to the cathode performed the task than when the hand ipsilateral to the cathode performed the task. Averaged over all participants, the results did not reach statistical significance. These results suggest that cathodal stimulation may allow a subset of children to control muscles or reduce involuntary overflow activity. Further testing is needed to confirm these results in a blinded trial and identify the subset of children who are likely to respond.

  11. Current and high-β sheets in CIR streams: statistics and interaction with the HCS and the magnetosphere

    NASA Astrophysics Data System (ADS)

    Potapov, A. S.

    2018-04-01

    Thirty events of CIR streams (corotating interaction regions between fast and slow solar wind) were analyzed in order to study statistically the plasma structure within the CIR shear zones and to examine the interaction of the CIRs with the heliospheric current sheet (HCS) and the Earth's magnetosphere. The occurrence of current layers and high-beta plasma sheets in the CIR structure has been estimated. It was found that, on average, each of the CIR streams had four current layers in its structure with a current density of more than 0.12 A/m², and about one and a half high-beta plasma regions with a beta value of more than five. We then traced how, and how often, the high-speed stream associated with the CIR can catch up with the heliospheric current sheet and connect to it. The interface of every fourth CIR stream coincided in time, within an hour, with the HCS, but in two-thirds of cases the CIR connection with the HCS was completely absent. One event of simultaneous observation of a CIR stream in front of the magnetosphere by the ACE satellite in the vicinity of the L1 libration point and by the Wind satellite in the remote geomagnetic tail was considered in detail. Measurements of the components of the interplanetary magnetic field and plasma parameters showed that the overall structure of the stream is conserved. Moreover, some details of the fine structure are also transferred through the magnetosphere. In particular, the so-called "magnetic hole" hardly changes its shape when moving from the L1 point to the neighborhood of the L2 point.

  12. Asymmetry of magnetic motor evoked potentials recorded in calf muscles of the dominant and non-dominant lower extremity.

    PubMed

    Olex-Zarychta, Dorota; Koprowski, Robert; Sobota, Grzegorz; Wróbel, Zygmunt

    2009-08-07

    The aim of the study was to determine the applicability of magnetic stimulation and magnetic motor evoked potentials (MEPs) in motor asymmetry studies by obtaining quantitative and qualitative measures of efferent activity during low-intensity magnetic stimulation of the dominant and non-dominant lower extremities. Magnetic stimulation of the tibial nerve in the popliteal fossa was performed in 10 healthy male right-handed and right-footed young adults. Responses were recorded from the lateral head of the gastrocnemius muscle of the right and left lower extremities. Response characteristics (duration, onset latency, amplitude) were analyzed in relation to the functional dominance of the limbs and to the direction of the current in the magnetic coil, using the Wilcoxon pair sequence test. The CCW direction of coil current was related to reduced amplitudes of the recorded MEPs. Greater amplitudes of evoked potentials were recorded in the non-dominant extremity for both the CW and CCW coil current directions, and this effect was statistically significant (p=0.005). No differences in response duration were found for the CW current direction, while for CCW the left-side response was prolonged (p=0.01). In the non-dominant extremity, longer onset latencies were recorded for both current directions, but only for the CW direction did the side asymmetry reach statistical significance (p=0.005). In the dominant extremity the stimulation was associated with stronger paresthesias, especially with the CCW direction of coil current. The results indicate that low-intensity magnetic stimulation may be useful in quantitative and qualitative research into motor asymmetry.

  13. SparRec: An effective matrix completion framework of missing data imputation for GWAS

    NASA Astrophysics Data System (ADS)

    Jiang, Bo; Ma, Shiqian; Causey, Jason; Qiao, Linbo; Hardin, Matthew Price; Bitts, Ian; Johnson, Daniel; Zhang, Shuzhong; Huang, Xiuzhen

    2016-10-01

    Genome-wide association studies present computational challenges for missing data imputation, as advances in genotyping technologies generate datasets of large sample sizes with sample sets genotyped on multiple SNP chips. We present a new imputation framework, SparRec (Sparse Recovery), with the following properties: (1) The optimization models of SparRec, based on low rank and a low number of co-clusters of matrices, differ from current statistical methods. While our low-rank matrix completion (LRMC) model is similar to Mendel-Impute, our matrix co-clustering factorization (MCCF) model is completely new. (2) SparRec, like other matrix completion methods, can be flexibly applied to missing data imputation for large meta-analyses with different cohorts genotyped on different sets of SNPs, even when there is no reference panel. This kind of meta-analysis is very challenging for current statistics-based methods. (3) SparRec has consistent performance and achieves high recovery accuracy even when the missing data rate is as high as 90%. Compared with Mendel-Impute, our low-rank based method achieves similar accuracy and efficiency, while the co-clustering based method has advantages in running time. The testing results show that SparRec has significant advantages and competitive performance over other state-of-the-art statistical methods, including Beagle and fastPhase.
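
    The LRMC ingredient can be illustrated independently of SparRec with a SoftImpute-style iteration: fill the missing entries, shrink the singular values, and re-impose the observed genotypes until the completion stabilizes. The matrix, missing rate, and shrinkage level below are assumptions; this is a generic low-rank completion sketch, not the SparRec algorithm.

```python
# Generic low-rank matrix completion (SoftImpute-style) on a synthetic genotype-like matrix.
import numpy as np

rng = np.random.default_rng(0)
true = rng.integers(0, 3, size=(60, 5)) @ rng.integers(0, 2, size=(5, 40))   # low-rank ground truth
mask = rng.random(true.shape) > 0.3                 # ~70% of entries observed
Z = np.where(mask, true.astype(float), 0.0)

for _ in range(200):
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    Z = (U * np.maximum(s - 1.0, 0.0)) @ Vt         # soft-threshold the spectrum (nuclear-norm shrinkage)
    Z[mask] = true[mask]                            # keep observed entries fixed

err = np.abs(np.rint(Z[~mask]) - true[~mask]).mean()
print(f"mean absolute imputation error on missing entries: {err:.3f}")
```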

  14. Communication Systems through Artificial Earth Satellites (Selected Pages)

    DTIC Science & Technology

    1987-02-05

    A. The speaking currents of this subscriber from equipment AP come through AC not only into the two-wire circuit, but also are branched/shunted to AY, and... distribution of cloud cover. The evaluation, based on the statistical study of clouds [3.3] and rains of South England, at A=50 at the frequency of 4 GHz... Studies of conditions for the passage of radio waves through the disturbed ionosphere showed [3.16] that aurorae polares increase the speed of fadings and are

  15. Drought Persistence Errors in Global Climate Models

    NASA Astrophysics Data System (ADS)

    Moon, H.; Gudmundsson, L.; Seneviratne, S. I.

    2018-04-01

    The persistence of drought events largely determines the severity of socioeconomic and ecological impacts, but the capability of current global climate models (GCMs) to simulate such events is subject to large uncertainties. In this study, the representation of drought persistence in GCMs is assessed by comparing state-of-the-art GCM simulations to observation-based data sets. To do so, we consider dry-to-dry transition probabilities at monthly and annual scales as estimates of drought persistence, where a dry status is defined as a negative precipitation anomaly. Though there is a substantial spread in the drought persistence bias, most of the simulations show systematic underestimation of drought persistence at the global scale. Subsequently, we analyzed to which degree (i) inaccurate observations, (ii) differences among models, (iii) internal climate variability, and (iv) uncertainty of the employed statistical methods contribute to the spread in drought persistence errors, using an analysis of variance approach. The results show that at the monthly scale, model uncertainty and observational uncertainty dominate, while the contribution from internal variability is small in most cases. At the annual scale, the spread of the drought persistence error is dominated by the statistical estimation error of drought persistence, indicating that the partitioning of the error is impaired by the limited number of considered time steps. These findings reveal systematic errors in the representation of drought persistence in current GCMs and suggest directions for further model improvement.
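
    The persistence metric itself is simple enough to sketch: with monthly precipitation reduced to a dry/wet indicator (negative anomaly = dry), drought persistence is the probability that a dry month is followed by another dry month. The synthetic series and the crude anomaly definition below (no seasonal cycle removed) are assumptions for illustration only.

```python
# Dry-to-dry transition probability on a synthetic monthly precipitation series (illustrative).
import numpy as np

rng = np.random.default_rng(0)
precip = rng.gamma(shape=2.0, scale=50.0, size=600)   # 50 years of monthly precipitation (mm)
dry = (precip - precip.mean()) < 0.0                  # crude anomaly: dry = below long-term mean

p_dd = np.sum(dry[1:] & dry[:-1]) / np.sum(dry[:-1])  # P(dry at t+1 | dry at t)
print(f"dry-to-dry transition probability: {p_dd:.2f}")
```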

  16. Effects of Inductively Coupled Plasma Hydrogen on Long-Wavelength Infrared HgCdTe Photodiodes

    NASA Astrophysics Data System (ADS)

    Boieriu, P.; Buurma, C.; Bommena, R.; Blissett, C.; Grein, C.; Sivananthan, S.

    2013-12-01

    Bulk passivation of semiconductors with hydrogen continues to be investigated for its potential to improve device performance. In this work, hydrogen-only inductively coupled plasma (ICP) was used to incorporate hydrogen into long-wavelength infrared HgCdTe photodiodes grown by molecular-beam epitaxy. Fully fabricated devices exposed to ICP showed statistically significant increases in zero-bias impedance values, improved uniformity, and decreased dark currents. HgCdTe photodiodes on Si substrates passivated with amorphous ZnS exhibited reductions in shunt currents, whereas devices on CdZnTe substrates passivated with polycrystalline CdTe exhibited reduced surface leakage, suggesting that hydrogen passivates defects in bulk HgCdTe and in CdTe.

  17. Probing Majorana bound states via counting statistics of a single electron transistor

    NASA Astrophysics Data System (ADS)

    Li, Zeng-Zhao; Lam, Chi-Hang; You, J. Q.

    2015-06-01

    We propose an approach for probing Majorana bound states (MBSs) in a nanowire via counting statistics of a nearby charge detector in the form of a single-electron transistor (SET). We consider the impacts on the counting statistics by both the local coupling between the detector and an adjacent MBS at one end of a nanowire and the nonlocal coupling to the MBS at the other end. We show that the Fano factor and the skewness of the SET current are minimized for a symmetric SET configuration in the absence of the MBSs or when coupled to a fermionic state. However, the minimum points of operation are shifted appreciably in the presence of the MBSs to asymmetric SET configurations with a higher tunnel rate at the drain than at the source. This feature persists even when varying the nonlocal coupling and the pairing energy between the two MBSs. We expect that these MBS-induced shifts can be measured experimentally with available technologies and can serve as important signatures of the MBSs.
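
    The counting-statistics quantities referred to above can be illustrated with a toy transport model: simulate the number of electrons transferred in fixed time windows and form the Fano factor (second cumulant over the mean) and the normalized third cumulant. The binomial transfer model below is only a stand-in chosen because it is sub-Poissonian; it is not the SET-plus-MBS model of the paper.

```python
# Toy counting statistics: Fano factor and normalized third cumulant of simulated counts.
import numpy as np

rng = np.random.default_rng(0)
counts = rng.binomial(n=200, p=0.4, size=100_000)    # sub-Poissonian stand-in for transferred charges

mean = counts.mean()
c2 = counts.var()                                    # second cumulant
c3 = np.mean((counts - mean) ** 3)                   # third cumulant
print(f"Fano factor c2/c1 = {c2 / mean:.3f}, normalized skewness c3/c1 = {c3 / mean:.3f}")
```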

  18. Sub-Shot Noise Power Source for Microelectronics

    NASA Technical Reports Server (NTRS)

    Strekalov, Dmitry V.; Yu, Nan; Mansour, Kamjou

    2011-01-01

    Low-current, high-impedance microelectronic devices can be affected by electric current shot noise more than by Nyquist noise, even at room temperature. An approach to implementing a sub-shot-noise current source for powering such devices is based on direct conversion of amplitude-squeezed light to photocurrent. The phenomenon of optical squeezing allows for optical measurements below the fundamental shot noise limit, which would be impossible in the domain of classical optics. This becomes possible by modifying the statistical properties of photons in an optical mode, which can be considered a form of information encoding. Once encoded, the information describing the photon (or any other elementary excitation) statistics can also be transmitted. In fact, it is such information transduction from optics to an electronic circuit, via the photoelectric effect, that has allowed the observation of optical squeezing. It is very difficult, if not technically impossible, to directly measure the statistical distribution of optical photons except at extremely low light levels. The photoelectric current, on the other hand, can be easily analyzed using RF spectrum analyzers. Once it was observed that the photocurrent noise generated by a light source under test was below the shot noise limit (e.g. that produced by a coherent light beam), it was concluded that the source possesses the property of amplitude squeezing. The main novelty of this technology is to turn this well-known information transduction approach around. Instead of studying the statistical properties of an optical mode by measuring the photoelectron statistics, an amplitude-squeezed light source and a high-efficiency linear photodiode are used to generate photocurrent with sub-Poissonian electron statistics. By powering microelectronic devices with this current source, their performance can be improved, especially their noise parameters. Therefore, a room-temperature sub-shot-noise current source can be built that will be beneficial for a very broad range of low-power, low-noise electronic instruments and applications, both cryogenic and room-temperature. Taking advantage of recent demonstrations of squeezed light sources based on optical micro-disks, this sub-shot-noise current source can be made compatible with the size and power requirements of the electronic devices it will support.

  19. Initial phantom study comparing image quality in computed tomography using adaptive statistical iterative reconstruction and new adaptive statistical iterative reconstruction v.

    PubMed

    Lim, Kyungjae; Kwon, Heejin; Cho, Jinhan; Oh, Jongyoung; Yoon, Seongkuk; Kang, Myungjin; Ha, Dongho; Lee, Jinhwa; Kang, Eunju

    2015-01-01

    The purpose of this study was to assess the image quality of a novel advanced iterative reconstruction (IR) method called "adaptive statistical IR V" (ASIR-V) by comparing its image noise, contrast-to-noise ratio (CNR), and spatial resolution with those of filtered back projection (FBP) and adaptive statistical IR (ASIR) on computed tomography (CT) phantom images. We performed CT scans at 5 different tube currents (50, 70, 100, 150, and 200 mA) using 3 types of CT phantoms. Scanned images were subsequently reconstructed with 7 different settings: FBP, and 3 levels each of ASIR and ASIR-V (30%, 50%, and 70%). The image noise was measured in the first study using a body phantom, the CNR was measured in the second study using a contrast phantom, and the spatial resolution was measured in the third study using a high-resolution phantom. We compared the image noise, CNR, and spatial resolution among the 7 reconstruction settings to determine whether noise reduction, high CNR, and high spatial resolution could be achieved with ASIR-V. Quantitative analysis of the first and second studies showed that the images reconstructed using ASIR-V had reduced image noise and improved CNR compared with those of FBP and ASIR (P < 0.001). Qualitative analysis of the third study also showed that the images reconstructed using ASIR-V had significantly better spatial resolution than those of FBP and ASIR (P < 0.001). Our phantom studies showed that ASIR-V provides a significant reduction in image noise and a significant improvement in CNR as well as spatial resolution. Therefore, this technique has the potential to further reduce the radiation dose without compromising image quality.

  20. Debating Curricular Strategies for Teaching Statistics and Research Methods: What Does the Current Evidence Suggest?

    ERIC Educational Resources Information Center

    Barron, Kenneth E.; Apple, Kevin J.

    2014-01-01

    Coursework in statistics and research methods is a core requirement in most undergraduate psychology programs. However, is there an optimal way to structure and sequence methodology courses to facilitate student learning? For example, should statistics be required before research methods, should research methods be required before statistics, or…

  1. Understanding common statistical methods, Part I: descriptive methods, probability, and continuous data.

    PubMed

    Skinner, Carl G; Patel, Manish M; Thomas, Jerry D; Miller, Michael A

    2011-01-01

    Statistical methods are pervasive in medical research and general medical literature. Understanding general statistical concepts will enhance our ability to critically appraise the current literature and ultimately improve the delivery of patient care. This article intends to provide an overview of the common statistical methods relevant to medicine.

  2. The Preparedness of Preservice Secondary Mathematics Teachers to Teach Statistics: A Cross-Institutional Mixed Methods Study

    ERIC Educational Resources Information Center

    Lovett, Jennifer Nickell

    2016-01-01

    The purpose of this study is to provide researchers, mathematics educators, and statistics educators information about the current state of preservice secondary mathematics teachers' preparedness to teach statistics. To do so, this study employed an explanatory mixed methods design to quantitatively examine the statistical knowledge and statistics…

  3. National Institute of Statistical Sciences Data Confidentiality Technical Panel: Final Report. NCES 2011-608

    ERIC Educational Resources Information Center

    Karr, Alan

    2011-01-01

    NCES asked the National Institute of Statistical Sciences (NISS) to convene a technical panel of survey and policy experts to examine the NCES current and planned data dissemination strategies for confidential data with respect to: mandates and directives that NCES make data available; current and prospective technologies for protecting and…

  4. How often should we expect to be wrong? Statistical power, P values, and the expected prevalence of false discoveries.

    PubMed

    Marino, Michael J

    2018-05-01

    There is a clear perception in the literature that there is a crisis in reproducibility in the biomedical sciences. Many underlying factors contributing to the prevalence of irreproducible results have been highlighted with a focus on poor design and execution of experiments along with the misuse of statistics. While these factors certainly contribute to irreproducibility, relatively little attention outside of the specialized statistical literature has focused on the expected prevalence of false discoveries under idealized circumstances. In other words, when everything is done correctly, how often should we expect to be wrong? Using a simple simulation of an idealized experiment, it is possible to show the central role of sample size and the related quantity of statistical power in determining the false discovery rate, and in accurate estimation of effect size. According to our calculations, based on current practice many subfields of biomedical science may expect their discoveries to be false at least 25% of the time, and the only viable course to correct this is to require the reporting of statistical power and a minimum of 80% power (1 - β = 0.80) for all studies. Copyright © 2017 Elsevier Inc. All rights reserved.
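
    The core arithmetic of the argument fits in a few lines: at a significance threshold alpha, the expected false discovery rate depends on statistical power and on the prior fraction of tested hypotheses that are actually true. The prior fractions below are assumed values for illustration, not estimates from the paper.

```python
# Expected false discovery rate as a function of power and prior odds (idealized testing, alpha = 0.05).
def expected_fdr(power, prior_true, alpha=0.05):
    """Fraction of 'significant' results that are false positives."""
    true_pos = prior_true * power
    false_pos = (1.0 - prior_true) * alpha
    return false_pos / (true_pos + false_pos)

for prior_true in (0.1, 0.5):
    for power in (0.2, 0.8):
        print(f"prior P(true) = {prior_true:.1f}, power = {power:.1f}: "
              f"expected FDR = {expected_fdr(power, prior_true):.2f}")
```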

  5. Energetic Particles of keV–MeV Energies Observed near Reconnecting Current Sheets at 1 au

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khabarova, Olga V.; Zank, Gary P.

    2017-07-01

    We provide evidence for particle acceleration up to ∼5 MeV at reconnecting current sheets in the solar wind based on both case studies and a statistical analysis of the energetic ion and electron flux data from the five Advanced Composition Explorer Electron, Proton, and Alpha Monitor (EPAM) detectors. The case study of a typical reconnection exhaust event reveals (i) a small-scale peak of the energetic ion flux observed in the vicinity of the reconnection exhaust and (ii) a long-timescale atypical energetic particle event (AEPE) encompassing the reconnection exhaust. AEPEs associated with reconnecting strong current sheets last for many hours, even days, as confirmed by statistical studies. The case study shows that time-intensity profiles of the ion flux may vary significantly from one EPAM detector to another, partially because of the local topology of magnetic fields, but mainly because of the impact of upstream magnetospheric events; therefore, the occurrence of particle acceleration can be hidden. The finding of significant particle energization within a time interval of ±30 hr around reconnection exhausts is supported by a superposed epoch analysis of 126 reconnection exhaust events. We suggest that energetic particles initially accelerated via prolonged magnetic reconnection are trapped and reaccelerated in small- or medium-scale magnetic islands surrounding the reconnecting current sheet, as predicted by the transport theory of Zank et al. Other mechanisms of initial particle acceleration can also contribute.

  6. How to inhibit a distractor location? Statistical learning versus active, top-down suppression.

    PubMed

    Wang, Benchi; Theeuwes, Jan

    2018-05-01

    Recently, Wang and Theeuwes (Journal of Experimental Psychology: Human Perception and Performance, 44(1), 13-17, 2018a) demonstrated the role of lingering selection biases in an additional singleton search task in which the distractor singleton appeared much more often in one location than in all other locations. For this location, there was less capture and selection efficiency was reduced. It was argued that statistical learning induces plasticity within the spatial priority map such that particular locations that are highly likely to contain a distractor are suppressed relative to all other locations. The current study replicated these findings regarding statistical learning (Experiment 1) and investigated whether similar effects can be obtained by cueing the distractor location in a top-down way on a trial-by-trial basis. The results show that top-down cueing of the distractor location with long (1,500 ms; Experiment 2) and short (600 ms; Experiment 3) stimulus-onset asynchronies (SOAs) does not result in suppression: neither the amount of capture nor the efficiency of selection was affected by the cue. If anything, we found an attentional benefit (instead of suppression) at the short SOA. We argue that through statistical learning, weights within the attentional priority map are changed such that one location containing a salient distractor is suppressed relative to all other locations. Our cueing experiments show that this effect cannot be accomplished by active, top-down suppression. Consequences for recent theories of distractor suppression are discussed.

  7. Discrete-element modeling of nacre-like materials: Effects of random microstructures on strain localization and mechanical performance

    NASA Astrophysics Data System (ADS)

    Abid, Najmul; Mirkhalaf, Mohammad; Barthelat, Francois

    2018-03-01

    Natural materials such as nacre, collagen, and spider silk are composed of staggered stiff and strong inclusions in a softer matrix. This type of hybrid microstructure results in remarkable combinations of stiffness, strength, and toughness and it now inspires novel classes of high-performance composites. However, the analytical and numerical approaches used to predict and optimize the mechanics of staggered composites often neglect statistical variations and inhomogeneities, which may have significant impacts on modulus, strength, and toughness. Here we present an analysis of localization using small representative volume elements (RVEs) and large scale statistical volume elements (SVEs) based on the discrete element method (DEM). DEM is an efficient numerical method which enabled the evaluation of more than 10,000 microstructures in this study, each including about 5,000 inclusions. The models explore the combined effects of statistics, inclusion arrangement, and interface properties. We find that statistical variations have a negative effect on all properties, in particular on the ductility and energy absorption because randomness precipitates the localization of deformations. However, the results also show that the negative effects of random microstructures can be offset by interfaces with large strain at failure accompanied by strain hardening. More specifically, this quantitative study reveals an optimal range of interface properties where the interfaces are the most effective at delaying localization. These findings show how carefully designed interfaces in bioinspired staggered composites can offset the negative effects of microstructural randomness, which is inherent to most current fabrication methods.

  8. The Marburg-Münster Affective Disorders Cohort Study (MACS): A quality assurance protocol for MR neuroimaging data.

    PubMed

    Vogelbacher, Christoph; Möbius, Thomas W D; Sommer, Jens; Schuster, Verena; Dannlowski, Udo; Kircher, Tilo; Dempfle, Astrid; Jansen, Andreas; Bopp, Miriam H A

    2018-05-15

    Large, longitudinal, multi-center MR neuroimaging studies require comprehensive quality assurance (QA) protocols for assessing the general quality of the compiled data, indicating potential malfunctions in the scanning equipment, and evaluating inter-site differences that need to be accounted for in subsequent analyses. We describe the implementation of a QA protocol for functional magnetic resonance imaging (fMRI) data based on the regular measurement of an MRI phantom and an extensive variety of currently published QA statistics. The protocol is implemented in the MACS (Marburg-Münster Affective Disorders Cohort Study, http://for2107.de/), a two-center research consortium studying the neurobiological foundations of affective disorders. Between February 2015 and October 2016, 1214 phantom measurements were acquired using a standard fMRI protocol. Using 444 healthy control subjects measured between 2014 and 2016 in the cohort, we investigate the extent of between-site differences in contrast to the dependence on subject-specific covariates (age and sex) for structural MRI, fMRI, and diffusion tensor imaging (DTI) data. We show that most of the presented QA statistics differ markedly not only between the two scanners used for the cohort but also between experimental settings (e.g. hardware and software changes), demonstrate that some of these statistics depend on external variables (e.g. time of day, temperature), highlight their strong dependence on proper handling of the MRI phantom, and show how the use of a phantom holder may balance this dependence. Site effects, however, exist not only for the phantom data but also for human MRI data. Using T1-weighted structural images, we show that total intracranial (TIV), grey matter (GMV), and white matter (WMV) volumes differ significantly between the MR scanners, with large effect sizes. Voxel-based morphometry (VBM) analyses show that these structural differences between scanners are most pronounced in the bilateral basal ganglia, thalamus, and posterior regions. Using DTI data, we also show that fractional anisotropy (FA) differs between sites in almost all regions assessed. When pooling data from multiple centers, it is therefore necessary to account not only for inter-site differences but also for hardware and software changes of the scanning equipment. Moreover, the strong dependence of the QA statistics on reliable placement of the MRI phantom shows that the use of a phantom holder is recommended to reduce the variance of the QA statistics and thus to increase the probability of detecting potential scanner malfunctions. Copyright © 2018 Elsevier Inc. All rights reserved.

  9. Evaluation of the Kinetic Property of Single-Molecule Junctions by Tunneling Current Measurements.

    PubMed

    Harashima, Takanori; Hasegawa, Yusuke; Kiguchi, Manabu; Nishino, Tomoaki

    2018-01-01

    We investigated the formation and breaking of single-molecule junctions of two kinds of dithiol molecules by time-resolved tunneling current measurements in a metal nanogap. The resulting current trajectory was statistically analyzed to determine the single-molecule conductance and, more importantly, to reveal the kinetic property of the single-molecular junction. These results suggested that combining a measurement of the single-molecule conductance and statistical analysis is a promising method to uncover the kinetic properties of the single-molecule junction.

  10. Automatic detection of health changes using statistical process control techniques on measured transfer times of elderly.

    PubMed

    Baldewijns, Greet; Luca, Stijn; Nagels, William; Vanrumste, Bart; Croonenborghs, Tom

    2015-01-01

    It has been shown that gait speed and transfer times are good measures of functional ability in the elderly. However, data currently acquired by systems that measure either gait speed or transfer times in the homes of elderly people require manual reviewing by healthcare workers. This reviewing process is time-consuming. To alleviate this burden, this paper proposes the use of statistical process control (SPC) methods to automatically detect both positive and negative changes in transfer times. Three SPC techniques (tabular CUSUM, standardized CUSUM, and EWMA), known for their ability to detect small shifts in the data, are evaluated on simulated transfer times. This analysis shows that EWMA is the best-suited method, with a detection accuracy of 82% and an average detection time of 9.64 days.
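
    A minimal EWMA control-chart sketch in the spirit of the paper (parameters assumed, not the authors' tuned settings): smooth the daily transfer times with an exponentially weighted moving average and flag the first day the statistic leaves the control limits derived from an in-control baseline.

```python
# EWMA change detection on simulated transfer times (assumed lambda, limit width, and drift).
import numpy as np

rng = np.random.default_rng(0)
baseline = rng.normal(20.0, 2.0, 60)                              # 60 in-control days (seconds)
decline = rng.normal(20.0, 2.0, 60) + np.linspace(0.0, 4.0, 60)   # gradual slowdown
x = np.concatenate([baseline, decline])

lam, L = 0.2, 3.0                                    # EWMA weight and control-limit width (assumed)
mu0, sigma0 = baseline.mean(), baseline.std(ddof=1)  # in-control parameters estimated from baseline

z = mu0
for t, xt in enumerate(x):
    z = lam * xt + (1.0 - lam) * z                   # EWMA update
    sig_z = sigma0 * np.sqrt(lam / (2.0 - lam) * (1.0 - (1.0 - lam) ** (2 * (t + 1))))
    if abs(z - mu0) > L * sig_z:
        print(f"change flagged on day {t}")
        break
```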

  11. Diagnostic Factors of Odontogenic Cysts in Iranian Population: A Retrospective Study Over the Past Two Decades

    PubMed Central

    Mohajerani, Hassan; Esmaeelinejad, Mohammad; Sabour, Siamak; Aghdashi, Farzad; Dehghani, Nima

    2015-01-01

    Background: Early diagnosis of odontogenic cysts due to their silent progression is always a challenging problem for clinicians. Objectives: The current study aimed to evaluate the frequency of odontogenic cysts and related factors in a selected Iranian population. Patients and Methods: The current cross-sectional study was conducted on 312 patients’ recorded data in Taleghani Hospital, Tehran, Iran, from April 1993 to December 2013. All related data were extracted from the records and categorized in tables. The correlation between the variables was analyzed by either chi-square or multinominal logistic regression tests. The P values < 0.05 were considered significant. Results: Evaluation of 312 patients’ records (185 males and 127 females) with the mean age of 27.6 showed that Odontogenic Keratocyst (OKC) was the most common odontogenic cyst of all followed by the dentigerous cyst as the second most common lesion. Most of the patients were in the second or third decades of their lives, although there was no statistically significant age distribution. The finding of the current study showed that calcifying odontogenic cyst (COC) occurrence was significantly related to the history of trauma. Enucleation and curettage of the odontogenic cysts were the most common treatment plans of all. Conclusions: The current study showed that clinicians should consider the many factors associated with the occurrence of odontogenic cysts. PMID:26357548

  12. Brain serotonin transporter density and aggression in abstinent methamphetamine abusers.

    PubMed

    Sekine, Yoshimoto; Ouchi, Yasuomi; Takei, Nori; Yoshikawa, Etsuji; Nakamura, Kazuhiko; Futatsubashi, Masami; Okada, Hiroyuki; Minabe, Yoshio; Suzuki, Katsuaki; Iwata, Yasuhide; Tsuchiya, Kenji J; Tsukada, Hideo; Iyo, Masaomi; Mori, Norio

    2006-01-01

    In animals, methamphetamine is known to have a neurotoxic effect on serotonin neurons, which have been implicated in the regulation of mood, anxiety, and aggression. It remains unknown whether methamphetamine damages serotonin neurons in humans. To investigate the status of brain serotonin neurons and their possible relationship with clinical characteristics in currently abstinent methamphetamine abusers. Case-control analysis. A hospital research center. Twelve currently abstinent former methamphetamine abusers (5 women and 7 men) and 12 age-, sex-, and education-matched control subjects recruited from the community. The brain regional density of the serotonin transporter, a structural component of serotonin neurons, was estimated using positron emission tomography and trans-1,2,3,5,6,10-beta-hexahydro-6-[4-(methylthio)phenyl]pyrrolo-[2,1-a]isoquinoline ([(11)C](+)McN-5652). Estimates were derived from region-of-interest and statistical parametric mapping methods, followed by within-case analysis using the measures of clinical variables. The duration of methamphetamine use, the magnitude of aggression and depressive symptoms, and changes in serotonin transporter density represented by the [(11)C](+)McN-5652 distribution volume. Methamphetamine abusers showed increased levels of aggression compared with controls. Region-of-interest and statistical parametric mapping analyses revealed that the serotonin transporter density in global brain regions (eg, the midbrain, thalamus, caudate, putamen, cerebral cortex, and cerebellum) was significantly lower in methamphetamine abusers than in control subjects, and this reduction was significantly inversely correlated with the duration of methamphetamine use. Furthermore, statistical parametric mapping analyses indicated that the density in the orbitofrontal, temporal, and anterior cingulate areas was closely associated with the magnitude of aggression in methamphetamine abusers. Protracted abuse of methamphetamine may reduce the density of the serotonin transporter in the brain, leading to elevated aggression, even in currently abstinent abusers.

  13. Ring Current He Ion Control by Bounce Resonant ULF Waves

    NASA Astrophysics Data System (ADS)

    Kim, Hyomin; Gerrard, Andrew J.; Lanzerotti, Louis J.; Soto-Chavez, Rualdo; Cohen, Ross J.; Manweiler, Jerry W.

    2017-12-01

    Ring current energy He ion (˜65 keV to ˜520 keV) differential flux data from the Radiation Belt Storm Probe Ion Composition Experiment (RBSPICE) instrument aboard the Van Allen Probes spacecraft show considerable variability during quiet solar wind and geomagnetic time periods. Such variability is apparent from orbit to orbit (˜9 h) of the spacecraft and is observed to be ˜50-100% of the nominal flux. Using data from the Electric and Magnetic Field Instrument Suite and Integrated Science (EMFISIS) instrument, also aboard the Van Allen Probes spacecraft, we identify that a dominant source of this variability is ULF waveforms with periods of tens of seconds. These periods correspond to the bounce-resonant timescales of the ring current He ions measured by RBSPICE. A statistical survey using the particle and field data for one full spacecraft precession period (approximately 2 years) shows that the wave and He ion flux variations are generally anticorrelated, suggesting bounce-resonant pitch angle scattering as a major component in the scattering of He ions.

  14. Field-aligned current and auroral Hall current characteristics derived from the Swarm constellation

    NASA Astrophysics Data System (ADS)

    Huang, Tao; Wang, Hui; Luehr, Hermann

    2017-04-01

    On the basis of field-aligned currents (FACs) and Hall currents derived from high-resolution magnetic field data of the Swarm constellation the average characteristics of these two current systems in the auroral regions are comprehensively investigated by statistical methods. This is the first study considering both current types simultaneously and for both hemispheres. The FAC distribution, derived from the Swarm dual-spacecraft approach, reveals the well-known features of Region 1 (R1) and Region 2 (R2) FACs. At high latitudes, Region 0 (R0) FACs appear on the dayside. Their direction depends on the orientation of the interplanetary magnetic field (IMF) By component. Of particular interest is the distribution of auroral Hall currents. The most prominent auroral electrojets are found to be closely controlled by the solar wind input. But there is no dependence on the IMF By orientation. The eastward electrojet is about twice as strong in summer as in winter. Conversely, the westward electrojet shows less dependence on season. Part of the electrojet current is closed over the polar cap. Here the seasonal variation of conductivity mainly controls the current density. There is a clear channeling of return currents over the polar cap. Depending on IMF By orientation most of the current is flowing either on the dawn or dusk side. The direction of Hall currents in the noon sector depends directly on the orientation of the IMF By. This is true for both signs of the IMF Bz component. But largest differences between summer and winter seasons are found for northward IMF Bz. Around the midnight sector the westward substorm electrojet is dominating. As expected, it is highly dependent on magnetic activity, but shows only little response to the IMF By polarity.

  15. STATISTICAL ANALYSIS OF TANK 18F FLOOR SAMPLE RESULTS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harris, S.

    2010-09-02

    Representative sampling has been completed for characterization of the residual material on the floor of Tank 18F as per the statistical sampling plan developed by Shine [1]. Samples from eight locations have been obtained from the tank floor and two of the samples were archived as a contingency. Six samples, referred to in this report as the current scrape samples, have been submitted to and analyzed by SRNL [2]. This report contains the statistical analysis of the floor sample analytical results to determine if further data are needed to reduce uncertainty. Included are comparisons with the prior Mantis sample results [3] to determine if they can be pooled with the current scrape samples to estimate the upper 95% confidence limits (UCL95%) for concentration. Statistical analysis revealed that the Mantis and current scrape sample results are not compatible. Therefore, the Mantis sample results were not used to support the quantification of analytes in the residual material. Significant spatial variability among the current sample results was not found. Constituent concentrations were similar between the North and South hemispheres as well as between the inner and outer regions of the tank floor. The current scrape sample results from all six samples fall within their 3-sigma limits. In view of the results from numerous statistical tests, the data were pooled from all six current scrape samples. As such, an adequate sample size was provided for quantification of the residual material on the floor of Tank 18F. The uncertainty is quantified in this report by an upper 95% confidence limit (UCL95%) on each analyte concentration. The uncertainty in analyte concentration was calculated as a function of the number of samples, the average, and the standard deviation of the analytical results. The UCL95% was based entirely on the six current scrape sample results (each averaged across three analytical determinations).
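
    The report's exact formula is not reproduced in this abstract; assuming the UCL95% is the usual one-sided Student-t upper confidence limit on the mean of the n = 6 scrape-sample results, a minimal sketch (with hypothetical concentrations) is:

    ```python
    import numpy as np
    from scipy import stats

    def ucl95(values):
        """One-sided 95% upper confidence limit on the mean concentration,
        computed from the sample size, average, and standard deviation."""
        x = np.asarray(values, dtype=float)
        n = x.size
        t = stats.t.ppf(0.95, df=n - 1)            # one-sided 95% t quantile
        return x.mean() + t * x.std(ddof=1) / np.sqrt(n)

    # Hypothetical analyte concentrations from six scrape samples (not report data)
    print(ucl95([1.2, 0.9, 1.4, 1.1, 1.0, 1.3]))
    ```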

  16. STATISTICAL ANALYSIS OF TANK 19F FLOOR SAMPLE RESULTS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harris, S.

    2010-09-02

    Representative sampling has been completed for characterization of the residual material on the floor of Tank 19F as per the statistical sampling plan developed by Harris and Shine. Samples from eight locations have been obtained from the tank floor and two of the samples were archived as a contingency. Six samples, referred to in this report as the current scrape samples, have been submitted to and analyzed by SRNL. This report contains the statistical analysis of the floor sample analytical results to determine if further data are needed to reduce uncertainty. Included are comparisons with the prior Mantis sample results to determine if they can be pooled with the current scrape samples to estimate the upper 95% confidence limits (UCL95%) for concentration. Statistical analysis revealed that the Mantis and current scrape sample results are not compatible. Therefore, the Mantis sample results were not used to support the quantification of analytes in the residual material. Significant spatial variability among the current scrape sample results was not found. Constituent concentrations were similar between the North and South hemispheres as well as between the inner and outer regions of the tank floor. The current scrape sample results from all six samples fall within their 3-sigma limits. In view of the results from numerous statistical tests, the data were pooled from all six current scrape samples. As such, an adequate sample size was provided for quantification of the residual material on the floor of Tank 19F. The uncertainty is quantified in this report by a UCL95% on each analyte concentration. The uncertainty in analyte concentration was calculated as a function of the number of samples, the average, and the standard deviation of the analytical results. The UCL95% was based entirely on the six current scrape sample results (each averaged across three analytical determinations).

  17. Statistical parametric mapping of LORETA using high density EEG and individual MRI: application to mismatch negativities in schizophrenia.

    PubMed

    Park, Hae-Jeong; Kwon, Jun Soo; Youn, Tak; Pae, Ji Soo; Kim, Jae-Jin; Kim, Myung-Sun; Ha, Kyoo-Seob

    2002-11-01

    We describe a method for the statistical parametric mapping of low resolution electromagnetic tomography (LORETA) using high-density electroencephalography (EEG) and individual magnetic resonance images (MRI) to investigate the characteristics of the mismatch negativity (MMN) generators in schizophrenia. LORETA, using a realistic head model of the boundary element method derived from the individual anatomy, estimated the current density maps from the scalp topography of the 128-channel EEG. From the current density maps that covered the whole cortical gray matter (up to 20,000 points), volumetric current density images were reconstructed. Intensity normalization of the smoothed current density images was used to reduce the confounding effect of subject specific global activity. After transforming each image into a standard stereotaxic space, we carried out statistical parametric mapping of the normalized current density images. We applied this method to the source localization of MMN in schizophrenia. The MMN generators, produced by a deviant tone of 1,200 Hz (5% of 1,600 trials) under the standard tone of 1,000 Hz, 80 dB binaural stimuli with 300 msec of inter-stimulus interval, were measured in 14 right-handed schizophrenic subjects and 14 age-, gender-, and handedness-matched controls. We found that the schizophrenic group exhibited significant current density reductions of MMN in the left superior temporal gyrus and the left inferior parietal gyrus (P < 0.0005). This study is the first voxel-by-voxel statistical mapping of current density using individual MRI and high-density EEG. Copyright 2002 Wiley-Liss, Inc.

  18. High Impact = High Statistical Standards? Not Necessarily So

    PubMed Central

    Tressoldi, Patrizio E.; Giofré, David; Sella, Francesco; Cumming, Geoff

    2013-01-01

    What are the statistical practices of articles published in journals with a high impact factor? Are there differences compared with articles published in journals with a somewhat lower impact factor that have adopted editorial policies to reduce the impact of limitations of Null Hypothesis Significance Testing? To investigate these questions, the current study analyzed all articles related to psychological, neuropsychological and medical issues, published in 2011 in four journals with high impact factors: Science, Nature, The New England Journal of Medicine and The Lancet, and three journals with relatively lower impact factors: Neuropsychology, Journal of Experimental Psychology-Applied and the American Journal of Public Health. Results show that Null Hypothesis Significance Testing without any use of confidence intervals, effect size, prospective power and model estimation, is the prevalent statistical practice used in articles published in Nature, 89%, followed by articles published in Science, 42%. By contrast, in all other journals, both with high and lower impact factors, most articles report confidence intervals and/or effect size measures. We interpreted these differences as consequences of the editorial policies adopted by the journal editors, which are probably the most effective means to improve the statistical practices in journals with high or low impact factors. PMID:23418533

  19. High impact  =  high statistical standards? Not necessarily so.

    PubMed

    Tressoldi, Patrizio E; Giofré, David; Sella, Francesco; Cumming, Geoff

    2013-01-01

    What are the statistical practices of articles published in journals with a high impact factor? Are there differences compared with articles published in journals with a somewhat lower impact factor that have adopted editorial policies to reduce the impact of limitations of Null Hypothesis Significance Testing? To investigate these questions, the current study analyzed all articles related to psychological, neuropsychological and medical issues, published in 2011 in four journals with high impact factors: Science, Nature, The New England Journal of Medicine and The Lancet, and three journals with relatively lower impact factors: Neuropsychology, Journal of Experimental Psychology-Applied and the American Journal of Public Health. Results show that Null Hypothesis Significance Testing without any use of confidence intervals, effect size, prospective power and model estimation, is the prevalent statistical practice used in articles published in Nature, 89%, followed by articles published in Science, 42%. By contrast, in all other journals, both with high and lower impact factors, most articles report confidence intervals and/or effect size measures. We interpreted these differences as consequences of the editorial policies adopted by the journal editors, which are probably the most effective means to improve the statistical practices in journals with high or low impact factors.

  20. Can We Spin Straw Into Gold? An Evaluation of Immigrant Legal Status Imputation Approaches

    PubMed Central

    Van Hook, Jennifer; Bachmeier, James D.; Coffman, Donna; Harel, Ofer

    2014-01-01

    Researchers have developed logical, demographic, and statistical strategies for imputing immigrants’ legal status, but these methods have never been empirically assessed. We used Monte Carlo simulations to test whether, and under what conditions, legal status imputation approaches yield unbiased estimates of the association of unauthorized status with health insurance coverage. We tested five methods under a range of missing data scenarios. Logical and demographic imputation methods yielded biased estimates across all missing data scenarios. Statistical imputation approaches yielded unbiased estimates only when unauthorized status was jointly observed with insurance coverage; when this condition was not met, these methods overestimated insurance coverage for unauthorized relative to legal immigrants. We next showed how bias can be reduced by incorporating prior information about unauthorized immigrants. Finally, we demonstrated the utility of the best-performing statistical method for increasing power. We used it to produce state/regional estimates of insurance coverage among unauthorized immigrants in the Current Population Survey, a data source that contains no direct measures of immigrants’ legal status. We conclude that commonly employed legal status imputation approaches are likely to produce biased estimates, but data and statistical methods exist that could substantially reduce these biases. PMID:25511332

  1. Method for simulating atmospheric turbulence phase effects for multiple time slices and anisoplanatic conditions.

    PubMed

    Roggemann, M C; Welsh, B M; Montera, D; Rhoadarmer, T A

    1995-07-10

    Simulating the effects of atmospheric turbulence on optical imaging systems is an important aspect of understanding the performance of these systems. Simulations are particularly important for understanding the statistics of some adaptive-optics system performance measures, such as the mean and variance of the compensated optical transfer function, and for understanding the statistics of estimators used to reconstruct intensity distributions from turbulence-corrupted image measurements. Current methods of simulating the performance of these systems typically make use of random phase screens placed in the system pupil. Methods exist for making random draws of phase screens that have the correct spatial statistics. However, simulating temporal effects and anisoplanatism requires one or more phase screens at different distances from the aperture, possibly moving with different velocities. We describe and demonstrate a method for creating random draws of phase screens with the correct space-time statistics for arbitrary turbulence and wind-velocity profiles, which can be placed in the telescope pupil in simulations. Results are provided for both the von Kármán and the Kolmogorov turbulence spectra. We also show how to simulate anisoplanatic effects with this technique.
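
    The space-time algorithm itself is not reproduced in the abstract; for orientation, a minimal single-screen sketch of the standard FFT spectral-synthesis idea (von Kármán spectrum, no temporal evolution, and normalization conventions that vary between implementations) might look like:

    ```python
    import numpy as np

    def von_karman_phase_screen(n, dx, r0, outer_scale=100.0, seed=None):
        """Draw one random n-by-n phase screen (grid spacing dx, metres) whose
        spatial statistics follow a von Karman spectrum with Fried parameter r0.
        A very large outer_scale approximates the Kolmogorov spectrum."""
        rng = np.random.default_rng(seed)
        f = np.fft.fftfreq(n, d=dx)                  # spatial frequencies [1/m]
        fx, fy = np.meshgrid(f, f)
        psd = 0.023 * r0**(-5.0 / 3.0) * (fx**2 + fy**2 + outer_scale**-2.0)**(-11.0 / 6.0)
        psd[0, 0] = 0.0                              # drop the undefined piston term
        df = 1.0 / (n * dx)
        # complex Gaussian spectral coefficients shaped by the phase PSD
        c = (rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))) \
            * np.sqrt(psd / 2.0) * df
        return np.real(np.fft.ifft2(c)) * n * n      # phase in radians

    screen = von_karman_phase_screen(256, dx=0.02, r0=0.1, seed=1)
    ```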

  2. Statistical Methods Applied to Gamma-ray Spectroscopy Algorithms in Nuclear Security Missions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fagan, Deborah K.; Robinson, Sean M.; Runkle, Robert C.

    2012-10-01

    In a wide range of nuclear security missions, gamma-ray spectroscopy is a critical research and development priority. One particularly relevant challenge is the interdiction of special nuclear material for which gamma-ray spectroscopy supports the goals of detecting and identifying gamma-ray sources. This manuscript examines the existing set of spectroscopy methods, attempts to categorize them by the statistical methods on which they rely, and identifies methods that have yet to be considered. Our examination shows that current methods effectively estimate the effect of counting uncertainty but in many cases do not address larger sources of decision uncertainty—ones that are significantly more complex. We thus explore the premise that significantly improving algorithm performance requires greater coupling between the problem physics that drives data acquisition and statistical methods that analyze such data. Untapped statistical methods, such as Bayesian Model Averaging and hierarchical and empirical Bayes methods, have the potential to reduce decision uncertainty by more rigorously and comprehensively incorporating all sources of uncertainty. We expect that application of such methods will demonstrate progress in meeting the needs of nuclear security missions by improving on the existing numerical infrastructure for which these analyses have not been conducted.
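
    For orientation (not specific to the gamma-ray setting), Bayesian model averaging replaces inference conditional on a single model with a posterior-weighted mixture over candidate models M_k:

        p(\Delta \mid D) = \sum_k p(\Delta \mid M_k, D)\, p(M_k \mid D), \qquad p(M_k \mid D) \propto p(D \mid M_k)\, p(M_k),

    so the decision uncertainty carried by model choice is propagated into the final inference on the quantity of interest \Delta rather than discarded.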

  3. School Enrollment--Social and Economic Characteristics of Students: October 1977 (Advance Report). Current Population Reports. Population Characteristics. Series P-20, No. 321.

    ERIC Educational Resources Information Center

    Bureau of the Census (DOC), Suitland, MD.

    This report presents a summary of recent trends in school and college enrollment based on the October 1977 Current Population Survey (CPS) and earlier surveys. Enrollment statistics representing growth and decline at various educational levels are evaluated in written summaries. Comparative and distributive enrollment statistics of the population…

  4. Seasonal dependence of large-scale Birkeland currents

    NASA Technical Reports Server (NTRS)

    Fujii, R.; Iijima, T.; Potemra, T. A.; Sugiura, M.

    1981-01-01

    Seasonal variations of large-scale Birkeland currents are examined in a study of the source mechanisms and the closure of the three-dimensional current systems in the ionosphere. Vector magnetic field data acquired by the TRIAD satellite in the Northern Hemisphere were analyzed for the statistics of single sheet and double sheet Birkeland currents during 555 passes during the summer and 408 passes during the winter. The single sheet currents are observed more frequently in the dayside of the auroral zone, and more often in summer than in winter. The intensities of both the single and double dayside currents are found to be greater in the summer than in the winter by a factor of two, while the intensities of the double sheet Birkeland currents on the nightside do not show a significant difference from summer to winter. Both the single and double sheet currents are found at higher latitudes in the summer than in the winter on the dayside. Results suggest that the Birkeland current intensities are controlled by the ionospheric conductivity in the polar region, and that the currents close via the polar cap when the conductivity there is sufficiently high. It is also concluded that an important source of these currents must be a voltage generator in the magnetosphere.

  5. The role of remote wind forcing in the subinertial current variability in the central and northern parts of the South Brazil Bight

    NASA Astrophysics Data System (ADS)

    Dottori, Marcelo; Castro, Belmiro Mendes

    2018-06-01

    Data analysis of continental shelf currents and coastal sea level, together with the application of a semi-analytical model, are used to estimate the importance of remote wind forcing on the subinertial variability of the current in the central and northern areas of the South Brazil Bight. Results from both the data analysis and from the semi-analytical model are robust in showing subinertial variability that propagates along-shelf leaving the coast to the left in accordance with theoretical studies of Continental Shelf Waves (CSW). Both the subinertial variability observed in along-shelf currents and sea level oscillations present different propagation speeds for the narrow northern part of the SBB (~6-7 m/s) and the wide central SBB region (~11 m/s), those estimates being in agreement with the modeled CSW propagation speed. On the inner and middle shelf, observed along-shelf subinertial currents show higher correlation coefficients with the winds located southward and earlier in time than with the local wind at the current meter mooring position and at the time of measurement. The inclusion of the remote (located southwestward) wind forcing improves the prediction of the subinertial currents when compared to the currents forced only by the local wind, since the along-shelf-modeled currents present correlation coefficients with observed along-shelf currents up to 20% higher on the inner and middle shelf when the remote wind is included. For most of the outer shelf, on the other hand, this is not observed, since the correlation between the currents and the synoptic winds is usually not statistically significant.

  6. The role of remote wind forcing in the subinertial current variability in the central and northern parts of the South Brazil Bight

    NASA Astrophysics Data System (ADS)

    Dottori, Marcelo; Castro, Belmiro Mendes

    2018-05-01

    Data analysis of continental shelf currents and coastal sea level, together with the application of a semi-analytical model, are used to estimate the importance of remote wind forcing on the subinertial variability of the current in the central and northern areas of the South Brazil Bight. Results from both the data analysis and from the semi-analytical model are robust in showing subinertial variability that propagates along-shelf leaving the coast to the left in accordance with theoretical studies of Continental Shelf Waves (CSW). Both the subinertial variability observed in along-shelf currents and sea level oscillations present different propagation speeds for the narrow northern part of the SBB (~6-7 m/s) and the wide central SBB region (~11 m/s), those estimates being in agreement with the modeled CSW propagation speed. On the inner and middle shelf, observed along-shelf subinertial currents show higher correlation coefficients with the winds located southward and earlier in time than with the local wind at the current meter mooring position and at the time of measurement. The inclusion of the remote (located southwestward) wind forcing improves the prediction of the subinertial currents when compared to the currents forced only by the local wind, since the along-shelf-modeled currents present correlation coefficients with observed along-shelf currents up to 20% higher on the inner and middle shelf when the remote wind is included. For most of the outer shelf, on the other hand, this is not observed, since the correlation between the currents and the synoptic winds is usually not statistically significant.

  7. Digital signal processing methods for biosequence comparison.

    PubMed Central

    Benson, D C

    1990-01-01

    A method is discussed for DNA or protein sequence comparison using a finite field fast Fourier transform, a digital signal processing technique; and statistical methods are discussed for analyzing the output of this algorithm. This method compares two sequences of length N in computing time proportional to N log N compared to N² for methods currently used. This method makes it feasible to compare very long sequences. An example is given to show that the method correctly identifies sites of known homology. PMID:2349096
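
    The paper works over a finite field; as an illustration of why FFTs give N log N sequence comparison at all, the sketch below uses an ordinary complex FFT to count exact matches at every alignment offset of two sequences:

    ```python
    import numpy as np

    def match_counts(a, b, alphabet="ACGT"):
        """Number of exact character matches between sequences a and b at every
        relative offset, in O(N log N) time via FFT-based correlation."""
        n = len(a) + len(b) - 1
        nfft = 1 << (n - 1).bit_length()             # next power of two
        counts = np.zeros(n)
        for ch in alphabet:
            ia = np.array([c == ch for c in a], dtype=float)
            ib = np.array([c == ch for c in b], dtype=float)
            # convolving ia with the reversed ib correlates the two indicator tracks
            counts += np.fft.irfft(np.fft.rfft(ia, nfft) * np.fft.rfft(ib[::-1], nfft), nfft)[:n]
        # counts[k] is the number of matches when b is shifted by k - (len(b) - 1)
        return np.rint(counts).astype(int)

    print(match_counts("ACGTACGT", "GTAC"))
    ```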

  8. Downregulation of Checkpoint Protein Kinase 2 in the Urothelium of Healthy Male Tobacco Smokers.

    PubMed

    Breyer, Johannes; Denzinger, Stefan; Hartmann, Arndt; Otto, Wolfgang

    2016-01-01

    With this letter to the editor we present for the first time a study on CHEK2 expression in normal urothelium of healthy male smokers, former smokers and non-smokers. We could show a statistically significant downregulation of this DNA repair gene in current smokers compared to non-smokers, suggesting that smoking downregulates CHEK2 in normal urothelium, probably associated with an early step in carcinogenesis of urothelial bladder carcinoma. © 2016 S. Karger AG, Basel.

  9. 7 CFR 295.5 - Program statistical reports.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 4 2011-01-01 2011-01-01 false Program statistical reports. 295.5 Section 295.5 Agriculture Regulations of the Department of Agriculture (Continued) FOOD AND NUTRITION SERVICE, DEPARTMENT OF... statistical reports. Current and historical information on FNS food assistance program size, monetary outlays...

  10. 75 FR 27221 - Fisheries of the Northeastern United States; Atlantic Bluefish Fishery; 2010 Atlantic Bluefish...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-14

    ... the Council's Bluefish Monitoring Committee (Monitoring Committee) and Scientific and Statistical... to, commercial and recreational catch/landing statistics, current estimates of fishing mortality... Marine Recreational Fisheries Statistics Survey (MRFSS) data through Wave 2 were available for 2009, and...

  11. 7 CFR 295.5 - Program statistical reports.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 4 2010-01-01 2010-01-01 false Program statistical reports. 295.5 Section 295.5 Agriculture Regulations of the Department of Agriculture (Continued) FOOD AND NUTRITION SERVICE, DEPARTMENT OF... statistical reports. Current and historical information on FNS food assistance program size, monetary outlays...

  12. A Statistical Portrait of Women in the United States. Current Population Reports, Special Studies Series P-32, No. 58.

    ERIC Educational Resources Information Center

    Bureau of the Census (DOC), Suitland, MD.

    This report presents a statistical portrait of the changing role of women in the United States during the 20th century. Data are from United States Government sources--from surveys, decennial censuses, vital statistics, and administrative records. The majority of the statistics have been published previously, either in government documents or…

  13. Study of magnetic fields from power-frequency current on water lines.

    PubMed

    Lanera, D; Zapotosky, J E; Colby, J A

    1997-01-01

    The magnetic fields from power-frequency current flowing on water lines were investigated in a new approach that involved an area-wide survey in a small town. Magnetic fields were measured outside the residence under power cables and over water lines, and each residence was characterized as to whether it received water from a private well or the municipal water system. The magnetic field data revealed two statistical modes when they were related to water supply type. The data also showed that in the case of the high mode, the magnetic field remained constant along the line formed by power drop wires, at the back of the house, and the water hookup service, in front of the house, all the way to the street. The patterns are explained by the coincidence of measurement points and the presence of net current flowing on power mains, power drop conductors, residential plumbing, water service hookups, and water mains. These patterns, together with other characteristics of this magnetic field source, such as the gradual spatial fall-off of this field and the presence of a constant component in the time sequence, portray a magnetic field more uniform and constant in the residential environment than has been thought to exist. Such characteristics make up for the weakness of the source and make net current a significant source of exposure in the lives of individuals around the house, when human exposure to magnetic fields is assumed to be a cumulative effect over time. This, together with the bimodal statistical distribution of the residential magnetic field (related to water supply type), presents opportunities for retrospective epidemiological analysis. Water line type and its ability to conduct power-frequency current can be used as the historical marker for a bimodal exposure inference, as Wertheimer et al. have shown.

  14. Dynamic Statistical Models for Pyroclastic Density Current Generation at Soufrière Hills Volcano

    NASA Astrophysics Data System (ADS)

    Wolpert, Robert L.; Spiller, Elaine T.; Calder, Eliza S.

    2018-05-01

    To mitigate volcanic hazards from pyroclastic density currents, volcanologists generate hazard maps that provide long-term forecasts of areas of potential impact. Several recent efforts in the field develop new statistical methods for application of flow models to generate fully probabilistic hazard maps that both account for, and quantify, uncertainty. However, a limitation to the use of most statistical hazard models, and a key source of uncertainty within them, is the time-averaged nature of the datasets by which the volcanic activity is statistically characterized. Where the level, or directionality, of volcanic activity frequently changes, e.g. during protracted eruptive episodes, or at volcanoes that are classified as persistently active, it is not appropriate to make short term forecasts based on longer time-averaged metrics of the activity. Thus, here we build, fit and explore dynamic statistical models for the generation of pyroclastic density currents from Soufrière Hills Volcano (SHV) on Montserrat, including their respective collapse direction and flow volumes, based on 1996-2008 flow datasets. The development of this approach allows for short-term behavioral changes to be taken into account in probabilistic volcanic hazard assessments. We show that collapses from the SHV lava dome follow a clear pattern, and that a series of smaller flows in a given direction often culminates in a larger collapse, after which the directionality of the flows changes. Such models enable short term forecasting (weeks to months) that can reflect evolving conditions such as dome and crater morphology changes and non-stationary eruptive behavior such as extrusion rate variations. For example, the probability of inundation of the Belham Valley in the first 180 days of a forecast period is about twice as high for lava domes facing Northwest toward that valley as it is for domes pointing East toward the Tar River Valley. As rich multi-parametric volcano monitoring datasets become increasingly available, eruption forecasting is becoming an increasingly viable and important research field. We demonstrate an approach to utilize such data in order to appropriately 'tune' probabilistic hazard assessments for pyroclastic flows. Our broader objective with development of this method is to help advance time-dependent volcanic hazard assessment, by bridging the

  15. Information Measures for Statistical Orbit Determination

    ERIC Educational Resources Information Center

    Mashiku, Alinda K.

    2013-01-01

    The current Space Situational Awareness (SSA) enterprise is faced with the huge task of tracking the increasing number of space objects. The tracking of space objects requires frequent and accurate monitoring for orbit maintenance and collision avoidance using methods for statistical orbit determination. Statistical orbit determination enables us to obtain…

  16. Advances in statistics

    Treesearch

    Howard Stauffer; Nadav Nur

    2005-01-01

    The papers included in the Advances in Statistics section of the Partners in Flight (PIF) 2002 Proceedings represent a small sample of statistical topics of current importance to Partners In Flight research scientists: hierarchical modeling, estimation of detection probabilities, and Bayesian applications. Sauer et al. (this volume) examines a hierarchical model...

  17. Global limits and interference patterns in dark matter direct detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Catena, Riccardo; Gondolo, Paolo

    2015-08-13

    We compare the general effective theory of one-body dark matter nucleon interactions to current direct detection experiments in a global multidimensional statistical analysis. We derive exclusion limits on the 28 isoscalar and isovector coupling constants of the theory, and show that current data place interesting constraints on dark matter-nucleon interaction operators usually neglected in this context. We characterize the interference patterns that can arise in dark matter direct detection from pairs of dark matter-nucleon interaction operators, or from isoscalar and isovector components of the same operator. We find that commonly neglected destructive interference effects weaken standard direct detection exclusion limits by up to one order of magnitude in the coupling constants.

  18. Global limits and interference patterns in dark matter direct detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Catena, Riccardo; Gondolo, Paolo, E-mail: riccardo.catena@theorie.physik.uni-goettingen.de, E-mail: paolo.gondolo@utah.edu

    2015-08-01

    We compare the general effective theory of one-body dark matter nucleon interactions to current direct detection experiments in a global multidimensional statistical analysis. We derive exclusion limits on the 28 isoscalar and isovector coupling constants of the theory, and show that current data place interesting constraints on dark matter-nucleon interaction operators usually neglected in this context. We characterize the interference patterns that can arise in dark matter direct detection from pairs of dark matter-nucleon interaction operators, or from isoscalar and isovector components of the same operator. We find that commonly neglected destructive interference effects weaken standard direct detection exclusion limits by up to one order of magnitude in the coupling constants.

  19. Conducting-insulating transition in adiabatic memristive networks

    NASA Astrophysics Data System (ADS)

    Sheldon, Forrest C.; Di Ventra, Massimiliano

    2017-01-01

    The development of neuromorphic systems based on memristive elements—resistors with memory—requires a fundamental understanding of their collective dynamics when organized in networks. Here, we study an experimentally inspired model of two-dimensional disordered memristive networks subject to a slowly ramped voltage and show that they undergo a discontinuous transition in the conductivity for sufficiently high values of memory, as quantified by the memristive ON-OFF ratio. We investigate the consequences of this transition for the memristive current-voltage characteristics both through simulation and theory, and demonstrate the role of current-voltage duality in relating forward and reverse switching processes. Our work sheds considerable light on the statistical properties of memristive networks that are presently studied both for unconventional computing and as models of neural networks.

  20. Analytic drain current model for III-V cylindrical nanowire transistors

    NASA Astrophysics Data System (ADS)

    Marin, E. G.; Ruiz, F. G.; Schmidt, V.; Godoy, A.; Riel, H.; Gámiz, F.

    2015-07-01

    An analytical model is proposed to determine the drain current of III-V cylindrical nanowires (NWs). The model uses the gradual channel approximation and takes into account the complete analytical solution of the Poisson and Schrödinger equations for the Γ-valley and for an arbitrary number of subbands. Fermi-Dirac statistics are used to describe the 1D electron gas in the NWs, and the resulting recursive Fermi-Dirac integral of order -1/2 is integrated successfully under reasonable assumptions. The model has been validated against numerical simulations showing excellent agreement for different semiconductor materials, diameters up to 40 nm, gate overdrive biases up to 0.7 V, and densities of interface states up to 10¹³ eV⁻¹ cm⁻².
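
    For reference (the paper's closed-form result is not reproduced in the abstract), the order -1/2 Fermi-Dirac integral that sets the occupancy of each 1D subband i is

        F_{-1/2}(\eta_i) = \frac{1}{\Gamma(1/2)} \int_0^{\infty} \frac{\varepsilon^{-1/2}}{1 + e^{\varepsilon - \eta_i}}\, d\varepsilon, \qquad \eta_i = \frac{E_F - E_i}{k_B T},

    with the linear electron density contributed by subband i proportional to \sqrt{k_B T}\, F_{-1/2}(\eta_i) through the usual 1D density of states.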

  1. An Introduction of Surveying and Geomatics Education with E-Platform in Nctu, Taiwan

    NASA Astrophysics Data System (ADS)

    Teo, T.-A.; Shih, P. T.-Y.

    2011-09-01

    This article presents the current status of Surveying and Geomatics education provided by the Department of Civil Engineering at National Chiao Tung University, Taiwan. The Surveying and Geomatics education at NCTU is introduced first, followed by the current status of using an e-learning platform for Surveying and Geomatics courses. This platform, also known as E-Campus, is designed and implemented by the NCTU Digital Content Production Center. The paper also reports usage statistics for Surveying and Geomatics courses on E-Campus. The practical results indicate that the average number of logins per undergraduate student is 38 to 60 for each course, while the average for graduate students is 69 to 105.

  2. Electron Waiting Times of a Cooper Pair Splitter

    NASA Astrophysics Data System (ADS)

    Walldorf, Nicklas; Padurariu, Ciprian; Jauho, Antti-Pekka; Flindt, Christian

    2018-02-01

    Electron waiting times are an important concept in the analysis of quantum transport in nanoscale conductors. Here we show that the statistics of electron waiting times can be used to characterize Cooper pair splitters that create spatially separated spin-entangled electrons. A short waiting time between electrons tunneling into different leads is associated with the fast emission of a split Cooper pair, while long waiting times are governed by the slow injection of Cooper pairs from a superconductor. Experimentally, the waiting time distributions can be measured using real-time single-electron detectors in the regime of slow tunneling, where conventional current measurements are demanding. Our work is important for understanding the fundamental transport processes in Cooper pair splitters and the predictions may be verified using current technology.

  3. Parametric study of variation in cargo-airplane performance related to progression from current to spanloader designs

    NASA Technical Reports Server (NTRS)

    Toll, T. A.

    1980-01-01

    A parametric analysis was made to investigate the relationship between current cargo airplanes and possible future designs that may differ greatly in both size and configuration. The method makes use of empirical scaling laws developed from statistical studies of data from current and advanced airplanes and, in addition, accounts for payload density, effects of span distributed load, and variations in tail area ratio. The method is believed to be particularly useful for exploratory studies of design and technology options for large airplanes. The analysis predicts somewhat more favorable variations of the ratios of payload to gross weight and block fuel to payload as the airplane size is increased than has been generally understood from interpretations of the cube-square law. In terms of these same ratios, large all wing (spanloader) designs show an advantage over wing-fuselage designs.

  4. graph-GPA: A graphical model for prioritizing GWAS results and investigating pleiotropic architecture.

    PubMed

    Chung, Dongjun; Kim, Hang J; Zhao, Hongyu

    2017-02-01

    Genome-wide association studies (GWAS) have identified tens of thousands of genetic variants associated with hundreds of phenotypes and diseases, which have provided clinical and medical benefits to patients with novel biomarkers and therapeutic targets. However, identification of risk variants associated with complex diseases remains challenging as they are often affected by many genetic variants with small or moderate effects. There has been accumulating evidence suggesting that different complex traits share common risk basis, namely pleiotropy. Recently, several statistical methods have been developed to improve statistical power to identify risk variants for complex traits through a joint analysis of multiple GWAS datasets by leveraging pleiotropy. While these methods were shown to improve statistical power for association mapping compared to separate analyses, they are still limited in the number of phenotypes that can be integrated. In order to address this challenge, in this paper, we propose a novel statistical framework, graph-GPA, to integrate a large number of GWAS datasets for multiple phenotypes using a hidden Markov random field approach. Application of graph-GPA to a joint analysis of GWAS datasets for 12 phenotypes shows that graph-GPA improves statistical power to identify risk variants compared to statistical methods based on smaller number of GWAS datasets. In addition, graph-GPA also promotes better understanding of genetic mechanisms shared among phenotypes, which can potentially be useful for the development of improved diagnosis and therapeutics. The R implementation of graph-GPA is currently available at https://dongjunchung.github.io/GGPA/.

  5. Tsallis non-extensive statistics and solar wind plasma complexity

    NASA Astrophysics Data System (ADS)

    Pavlos, G. P.; Iliopoulos, A. C.; Zastenker, G. N.; Zelenyi, L. M.; Karakatsanis, L. P.; Riazantseva, M. O.; Xenakis, M. N.; Pavlos, E. G.

    2015-03-01

    This article presents novel results revealing non-equilibrium phase transition processes in the solar wind plasma during a strong shock event, which took place on 26th September 2011. Solar wind plasma is a typical case of stochastic spatiotemporal distribution of physical state variables such as force fields (B, E) and matter fields (particle and current densities or bulk plasma distributions). This study shows clearly the non-extensive and non-Gaussian character of the solar wind plasma and the existence of multi-scale strong correlations from the microscopic to the macroscopic level. It also underlines the inefficiency of classical magneto-hydro-dynamic (MHD) or plasma statistical theories, based on the classical central limit theorem (CLT), to explain the complexity of the solar wind dynamics, since these theories include smooth and differentiable spatial-temporal functions (MHD theory) or Gaussian statistics (Boltzmann-Maxwell statistical mechanics). On the contrary, the results of this study indicate the presence of non-Gaussian non-extensive statistics with heavy tails probability distribution functions, which are related to the q-extension of CLT. Finally, the results of this study can be understood in the framework of modern theoretical concepts such as non-extensive statistical mechanics (Tsallis, 2009), fractal topology (Zelenyi and Milovanov, 2004), turbulence theory (Frisch, 1996), strange dynamics (Zaslavsky, 2002), percolation theory (Milovanov, 1997), anomalous diffusion theory and anomalous transport theory (Milovanov, 2001), fractional dynamics (Tarasov, 2013) and non-equilibrium phase transition theory (Chang, 1992).
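
    For reference, the non-extensive entropy underlying the statistics invoked here is the Tsallis form

        S_q = k\,\frac{1 - \sum_i p_i^{\,q}}{q - 1},

    which recovers the Boltzmann-Gibbs entropy -k \sum_i p_i \ln p_i as q \to 1; the associated q-Gaussian distributions, p(x) \propto [1 - (1 - q)\beta x^2]^{1/(1-q)}, carry the heavy tails referred to in the abstract when q > 1.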

  6. How the IMF By induces a By component in the closed magnetosphere and how it leads to asymmetric currents and convection patterns in the two hemispheres

    NASA Astrophysics Data System (ADS)

    Tenfjord, Paul; Østgaard, Nikolai; Snekvik, Kristian; Reistad, Jone; Magnus Laundal, Karl; Haaland, Stein; Milan, Steve

    2016-04-01

    We describe the effects of the interplanetary magnetic field (IMF) By component on the coupling between the solar wind and magnetosphere-ionosphere system using AMPERE observations and MHD simulations. We show how By is induced on closed magnetospheric field lines on both the dayside and nightside. The magnetosphere imposes asymmetric forces on the ionosphere, and the effects on the ionospheric flow are characterized by distorted convection cell patterns, often referred to as "banana" and "orange" cell patterns. The flux asymmetrically added to the lobes results in a nonuniform induced By in the closed magnetosphere. We present a mechanism that predicts asymmetric Birkeland currents at conjugate foot points. Asymmetric Birkeland currents are created as a consequence of y directed tension contained in the return flow. Associated with these currents, we expect aurora and fast localized ionospheric azimuthal flows present in one hemisphere but not necessarily in the other. We present a statistical study where we show that these processes should occur on timescales of about 30 minutes after the IMF By has arrived at the magnetopause. We also present an event with simultaneous global imaging of the aurora and SuperDARN measurements from both hemispheres. The event is interpreted as an example of the proposed asymmetric current mechanism.

  7. Combined Dust Detection Algorithm by Using MODIS Infrared Channels over East Asia

    NASA Technical Reports Server (NTRS)

    Park, Sang Seo; Kim, Jhoon; Lee, Jaehwa; Lee, Sukjo; Kim, Jeong Soo; Chang, Lim Seok; Ou, Steve

    2014-01-01

    A new dust detection algorithm is developed by combining the results of multiple dust detection methods using IR channels onboard the MODerate resolution Imaging Spectroradiometer (MODIS). Brightness Temperature Difference (BTD) between two wavelength channels has been used widely in previous dust detection methods. However, BTD methods have limitations in identifying the offset values of the BTD to discriminate clear-sky areas. The current algorithm overcomes the disadvantages of previous dust detection methods by considering the Brightness Temperature Ratio (BTR) values of the dual wavelength channels with 30-day composite, the optical properties of the dust particles, the variability of surface properties, and the cloud contamination. Therefore, the current algorithm shows improvements in detecting the dust-loaded region over land during daytime. Finally, the confidence index of the current dust algorithm is shown in 10 × 10 pixels of the MODIS observations. From January to June, 2006, the results of the current algorithm are within 64 to 81% of those found using the fine mode fraction (FMF) and aerosol index (AI) from the MODIS and Ozone Monitoring Instrument (OMI). The agreement between the results of the current algorithm and the OMI AI over the non-polluted land also ranges from 60 to 67% to avoid errors due to the anthropogenic aerosol. In addition, the developed algorithm shows statistically significant results at four AErosol RObotic NETwork (AERONET) sites in East Asia.
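
    As a toy illustration of the BTD/BTR tests described (the channel choice and both thresholds below are illustrative only; the actual algorithm combines several such tests with 30-day composites and cloud screening):

    ```python
    import numpy as np

    def dust_mask(bt11, bt12, btd_threshold=0.0, btr_threshold=0.997):
        """Toy dust flag from two thermal-infrared brightness temperatures [K].
        A negative 11-12 um BTD and a depressed BT ratio are treated as dusty."""
        btd = bt11 - bt12                  # brightness temperature difference
        btr = bt11 / bt12                  # brightness temperature ratio
        return (btd < btd_threshold) & (btr < btr_threshold)

    bt11 = np.array([[285.0, 290.0], [288.0, 295.0]])   # hypothetical scene
    bt12 = np.array([[286.5, 289.0], [289.0, 294.0]])
    print(dust_mask(bt11, bt12))
    ```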

  8. Evaluation of Lightning Incidence to Elements of a Complex Structure: A Monte Carlo Approach

    NASA Technical Reports Server (NTRS)

    Mata, Carlos T.; Rakov, V. A.

    2008-01-01

    There are complex structures for which the installation and positioning of the lightning protection system (LPS) cannot be done using the lightning protection standard guidelines. As a result, there are some "unprotected" or "exposed" areas. In an effort to quantify the lightning threat to these areas, a Monte Carlo statistical tool has been developed. This statistical tool uses two random number generators: a uniform distribution to generate origins of downward propagating leaders and a lognormal distribution to generate return stroke peak currents. Downward leaders propagate vertically downward and their striking distances are defined by the polarity and peak current. Following the electrogeometrical concept, we assume that the leader attaches to the closest object within its striking distance. The statistical analysis is run for 10,000 years with an assumed ground flash density and peak current distributions, and the output of the program is the probability of direct attachment to objects of interest with its corresponding peak current distribution.
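
    A stripped-down sketch of such a Monte Carlo loop is given below. The layout, the lognormal parameters (median ~31 kA), and the electrogeometric relation r = 10 I^0.65 are common textbook assumptions rather than the paper's values, and object heights and ground attachment are ignored for brevity:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Objects of interest: (x, y) positions in metres -- hypothetical layout
    objects = np.array([[0.0, 0.0],      # lightning protection tower
                        [30.0, 10.0],    # "exposed" structure element
                        [60.0, -20.0]])  # ground-level equipment

    def striking_distance(peak_kA):
        """Electrogeometric striking distance in metres for a peak current in kA."""
        return 10.0 * peak_kA**0.65

    n_flashes = 100_000                  # simulated downward leaders
    half_width = 200.0                   # square simulation domain [m]
    hits = np.zeros(len(objects), dtype=int)

    for _ in range(n_flashes):
        x, y = rng.uniform(-half_width, half_width, size=2)   # leader origin
        peak = rng.lognormal(mean=np.log(31.0), sigma=0.48)   # peak current [kA]
        r = striking_distance(peak)
        d = np.hypot(objects[:, 0] - x, objects[:, 1] - y)
        reachable = np.where(d <= r)[0]
        if reachable.size:               # attach to the closest object within r
            hits[reachable[np.argmin(d[reachable])]] += 1

    print(hits / n_flashes)              # fraction of simulated flashes per object
    ```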

  9. When can social media lead financial markets?

    PubMed

    Zheludev, Ilya; Smith, Robert; Aste, Tomaso

    2014-02-27

    Social media analytics is showing promise for the prediction of financial markets. However, the true value of such data for trading is unclear due to a lack of consensus on which instruments can be predicted and how. Current approaches are based on the evaluation of message volumes and are typically assessed via retrospective (ex-post facto) evaluation of trading strategy returns. In this paper, we present instead a sentiment analysis methodology to quantify and statistically validate which assets could qualify for trading from social media analytics in an ex-ante configuration. We use sentiment analysis techniques and Information Theory measures to demonstrate that social media message sentiment can contain statistically-significant ex-ante information on the future prices of the S&P500 index and a limited set of stocks, in excess of what is achievable using solely message volumes.
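
    The estimator used in the paper is not specified in the abstract; as a toy illustration of the Information-Theoretic idea, here is a plug-in estimate of the mutual information between a binarized daily sentiment signal and the next day's price direction (all data below are synthetic):

    ```python
    import numpy as np

    def mutual_information(x, y):
        """Plug-in mutual information (bits) between two binary series."""
        joint = np.zeros((2, 2))
        for xi, yi in zip(x, y):
            joint[xi, yi] += 1
        joint /= joint.sum()
        px, py = joint.sum(axis=1), joint.sum(axis=0)
        mi = 0.0
        for i in range(2):
            for j in range(2):
                if joint[i, j] > 0:
                    mi += joint[i, j] * np.log2(joint[i, j] / (px[i] * py[j]))
        return mi

    rng = np.random.default_rng(42)
    sentiment = rng.integers(0, 2, size=500)                  # 1 = bullish day (synthetic)
    flips = rng.random(500) < 0.4                             # noise weakening the link
    next_day_up = (sentiment ^ flips).astype(int)             # synthetic next-day direction
    print(mutual_information(sentiment, next_day_up))         # positive if sentiment is informative
    ```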

  10. When Can Social Media Lead Financial Markets?

    NASA Astrophysics Data System (ADS)

    Zheludev, Ilya; Smith, Robert; Aste, Tomaso

    2014-02-01

    Social media analytics is showing promise for the prediction of financial markets. However, the true value of such data for trading is unclear due to a lack of consensus on which instruments can be predicted and how. Current approaches are based on the evaluation of message volumes and are typically assessed via retrospective (ex-post facto) evaluation of trading strategy returns. In this paper, we present instead a sentiment analysis methodology to quantify and statistically validate which assets could qualify for trading from social media analytics in an ex-ante configuration. We use sentiment analysis techniques and Information Theory measures to demonstrate that social media message sentiment can contain statistically-significant ex-ante information on the future prices of the S&P500 index and a limited set of stocks, in excess of what is achievable using solely message volumes.

  11. When Can Social Media Lead Financial Markets?

    PubMed Central

    Zheludev, Ilya; Smith, Robert; Aste, Tomaso

    2014-01-01

    Social media analytics is showing promise for the prediction of financial markets. However, the true value of such data for trading is unclear due to a lack of consensus on which instruments can be predicted and how. Current approaches are based on the evaluation of message volumes and are typically assessed via retrospective (ex-post facto) evaluation of trading strategy returns. In this paper, we present instead a sentiment analysis methodology to quantify and statistically validate which assets could qualify for trading from social media analytics in an ex-ante configuration. We use sentiment analysis techniques and Information Theory measures to demonstrate that social media message sentiment can contain statistically-significant ex-ante information on the future prices of the S&P500 index and a limited set of stocks, in excess of what is achievable using solely message volumes. PMID:24572909

  12. How much to trust the senses: Likelihood learning

    PubMed Central

    Sato, Yoshiyuki; Kording, Konrad P.

    2014-01-01

    Our brain often needs to estimate unknown variables from imperfect information. Our knowledge about the statistical distributions of quantities in our environment (called priors) and currently available information from sensory inputs (called likelihood) are the basis of all Bayesian models of perception and action. While we know that priors are learned, most studies of prior-likelihood integration simply assume that subjects know about the likelihood. However, as the quality of sensory inputs change over time, we also need to learn about new likelihoods. Here, we show that human subjects readily learn the distribution of visual cues (likelihood function) in a way that can be predicted by models of statistically optimal learning. Using a likelihood that depended on color context, we found that a learned likelihood generalized to new priors. Thus, we conclude that subjects learn about likelihood. PMID:25398975
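
    As a concrete (textbook) instance of the integration these models assume, a Gaussian prior N(\mu_p, \sigma_p^2) over the stimulus and a Gaussian likelihood centred on the sensory cue x with variance \sigma_l^2 combine into a posterior with

        \hat{s} = \frac{\sigma_l^2\,\mu_p + \sigma_p^2\,x}{\sigma_p^2 + \sigma_l^2}, \qquad \sigma_{\mathrm{post}}^2 = \frac{\sigma_p^2\,\sigma_l^2}{\sigma_p^2 + \sigma_l^2},

    so learning the likelihood amounts to updating the estimate of \sigma_l^2 (here, separately for each colour context) from experience.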

  13. A Pilot Study Assessing Performance and Visual Attention of Teenagers with ASD in a Novel Adaptive Driving Simulator.

    PubMed

    Wade, Joshua; Weitlauf, Amy; Broderick, Neill; Swanson, Amy; Zhang, Lian; Bian, Dayi; Sarkar, Medha; Warren, Zachary; Sarkar, Nilanjan

    2017-11-01

    Individuals with Autism Spectrum Disorder (ASD), compared to typically developing peers, may demonstrate behaviors that are counter to safe driving. The current work examines the use of a novel simulator in two separate studies. Study 1 demonstrates statistically significant performance differences between individuals with (N = 7) and without ASD (N = 7) with regards to the number of turning-related driving errors (p < 0.01). Study 2 shows that both the performance-based feedback group (N = 9) and combined performance- and gaze-sensitive feedback group (N = 8) achieved statistically significant reductions in driving errors following training (p < 0.05). These studies are the first to present results of fine-grained measures of visual attention of drivers and an adaptive driving intervention for individuals with ASD.

  14. A statistical framework for biomedical literature mining.

    PubMed

    Chung, Dongjun; Lawson, Andrew; Zheng, W Jim

    2017-09-30

    In systems biology, it is of great interest to identify new genes that were not previously reported to be associated with biological pathways related to various functions and diseases. Identification of these new pathway-modulating genes does not only promote understanding of pathway regulation mechanisms but also allow identification of novel targets for therapeutics. Recently, biomedical literature has been considered as a valuable resource to investigate pathway-modulating genes. While the majority of currently available approaches are based on the co-occurrence of genes within an abstract, it has been reported that these approaches show only suboptimal performance because 70% of abstracts contain information only for a single gene. To overcome such limitation, we propose a novel statistical framework based on the concept of ontology fingerprint that uses gene ontology to extract information from large biomedical literature data. The proposed framework simultaneously identifies pathway-modulating genes and facilitates interpreting functions of these new genes. We also propose a computationally efficient posterior inference procedure based on Metropolis-Hastings within Gibbs sampler for parameter updates and the poor man's reversible jump Markov chain Monte Carlo approach for model selection. We evaluate the proposed statistical framework with simulation studies, experimental validation, and an application to studies of pathway-modulating genes in yeast. The R implementation of the proposed model is currently available at https://dongjunchung.github.io/bayesGO/. Copyright © 2017 John Wiley & Sons, Ltd.
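
    The ontology-fingerprint model itself is not reproduced here; a generic Metropolis-Hastings-within-Gibbs skeleton of the kind described (target density, proposal scale, and parameter names all hypothetical) looks like:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def log_target(theta, phi):
        """Placeholder joint log-density standing in for the full model."""
        return -0.5 * (theta**2 + (phi - theta)**2)

    theta, phi = 0.0, 0.0
    samples = []
    for it in range(5000):
        # Gibbs step: the full conditional of phi given theta is Gaussian here
        phi = rng.normal(loc=theta, scale=1.0)
        # Metropolis-Hastings step for theta, whose conditional has no closed form
        proposal = theta + rng.normal(scale=0.5)
        if np.log(rng.random()) < log_target(proposal, phi) - log_target(theta, phi):
            theta = proposal
        samples.append((theta, phi))
    ```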

  15. Quadriceps Tendon Autograft in Anterior Cruciate Ligament Reconstruction: A Systematic Review.

    PubMed

    Hurley, Eoghan T; Calvo-Gurry, Manuel; Withers, Dan; Farrington, Shane K; Moran, Ray; Moran, Cathal J

    2018-05-01

    To systematically review the current evidence to ascertain whether quadriceps tendon autograft (QT) is a viable option in anterior cruciate ligament reconstruction. A literature review was conducted in accordance with Preferred Reporting Items for Systematic Reviews and Meta-analyses (PRISMA) guidelines. Cohort studies comparing QT with bone-patellar tendon-bone autograft (BPTB) or hamstring tendon autograft (HT) were included. Clinical outcomes were compared, with all statistical analyses performed using IBM SPSS Statistics for Windows, version 22.0, with P < .05 being considered statistically significant. We identified 15 clinical trials with 1,910 patients. In all included studies, QT resulted in lower rates of anterior knee pain than BPTB. There was no difference in the rate of graft rupture between QT and BPTB or HT in any of the studies reporting this. One study found that QT resulted in greater knee stability than BPTB, and another study found increased stability compared with HT. One study found that QT resulted in improved functional outcomes compared with BPTB, and another found improved outcomes compared with HT, but one study found worse outcomes compared with BPTB. Current literature suggests QT is a viable option in anterior cruciate ligament reconstruction, with published literature showing comparable knee stability, functional outcomes, donor-site morbidity, and rerupture rates compared with BPTB and HT. Level III, systematic review of Level I, II, and III studies. Copyright © 2018 Arthroscopy Association of North America. Published by Elsevier Inc. All rights reserved.

  16. Localized N20 Component of Somatosensory Evoked Magnetic Fields in Frontoparietal Brain Tumor Patients Using Noise-Normalized Approaches.

    PubMed

    Elaina, Nor Safira; Malik, Aamir Saeed; Shams, Wafaa Khazaal; Badruddin, Nasreen; Abdullah, Jafri Malin; Reza, Mohammad Faruque

    2018-06-01

    To localize sensorimotor cortical activation in 10 patients with frontoparietal tumors using quantitative magnetoencephalography (MEG) with noise-normalized approaches. Somatosensory evoked magnetic fields (SEFs) were elicited in 10 patients with somatosensory tumors and in 10 control participants using electrical stimulation of the median nerve via the right and left wrists. We localized the N20m component of the SEFs using dynamic statistical parametric mapping (dSPM) and standardized low-resolution brain electromagnetic tomography (sLORETA) combined with 3D magnetic resonance imaging (MRI). The obtained coordinates were compared between groups. Finally, we statistically evaluated the N20m parameters across hemispheres using non-parametric statistical tests. The N20m sources were accurately localized to Brodmann area 3b in all members of the control group and in seven of the patients; however, in three patients the sources were shifted to locations outside the primary somatosensory cortex (SI). Compared with the affected (tumor) hemispheres in the patient group, N20m amplitudes and the strengths of the current sources were significantly lower in the unaffected hemispheres and in both hemispheres of the control group. These results were consistent for both dSPM and sLORETA approaches. Tumors in the sensorimotor cortex lead to cortical functional reorganization and an increase in N20m amplitude and current-source strengths. Noise-normalized approaches for MEG analysis that are integrated with MRI show accurate and reliable localization of sensorimotor function.

  17. The T(ea) Test: Scripted Stories Increase Statistical Method Selection Skills

    ERIC Educational Resources Information Center

    Hackathorn, Jana; Ashdown, Brien

    2015-01-01

    To teach statistics, teachers must attempt to overcome pedagogical obstacles, such as dread, anxiety, and boredom. There are many options available to teachers that facilitate a pedagogically conducive environment in the classroom. The current study examined the effectiveness of incorporating scripted stories and humor into statistical method…

  18. 75 FR 34971 - Notice of Intent To Request Revision and Extension of a Currently Approved Information Collection

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-21

    ... DEPARTMENT OF AGRICULTURE National Agricultural Statistics Service Notice of Intent To Request... Statistics Service, USDA. ACTION: Notice and request for comments. SUMMARY: In accordance with the Paperwork Reduction Act of 1995, this notice announces the intention of the National Agricultural Statistics Service...

  19. 78 FR 63961 - Notice of Intent To Request Revision and Extension of a Currently Approved Information Collection

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-25

    ... DEPARTMENT OF AGRICULTURE National Agricultural Statistics Service Notice of Intent To Request... Statistics Service, USDA. ACTION: Notice and request for comments. SUMMARY: In accordance with the Paperwork Reduction Act of 1995, this notice announces the intention of the National Agricultural Statistics Service...

  20. Mobile Digest of Education Statistics, 2013. NCES 2014-086

    ERIC Educational Resources Information Center

    Snyder, Thomas D.

    2014-01-01

    This is the first edition of the "Mobile Digest of Education Statistics." This compact compilation of statistical information covers prekindergarten through graduate school to describe the current American education scene. The "Mobile Digest" is designed as an easy mobile reference for materials found in detail in the…

  1. The Utility of Robust Means in Statistics

    ERIC Educational Resources Information Center

    Goodwyn, Fara

    2012-01-01

    Location estimates calculated from heuristic data were examined using traditional and robust statistical methods. The current paper demonstrates the impact outliers have on the sample mean and proposes robust methods to control for outliers in sample data. Traditional methods fail because they rely on the statistical assumptions of normality and…

  2. Handbook of Labor Statistics. Bulletin 2175.

    ERIC Educational Resources Information Center

    Springsteen, Rosalind, Comp.; Epstein, Rosalie, Comp.

    This publication makes available in one volume the major series produced by the Bureau of Labor Statistics. Technical notes preceding each major section contain information on data changes and explain the services. Forty-four tables derived from the Current Population Survey (CPS) provide statistics on labor force and employment status,…

  3. 77 FR 36995 - Notice of Intent To Request Revision and Extension of a Currently Approved Information Collection

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-20

    ... DEPARTMENT OF AGRICULTURE National Agricultural Statistics Service Notice of Intent To Request... Statistics Service, USDA. ACTION: Notice and request for comments. SUMMARY: In accordance with the Paperwork Reduction Act of 1995, this notice announces the intention of the National Agricultural Statistics Service...

  4. 77 FR 18689 - Changes to Standard Numbering System, Vessel Identification System, and Boating Accident Report...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-28

    ... requires States to compile and send us reports, information, and statistics on casualties reported to them... data and statistical information received from the current collection to establish National... accident prevention programs; and publish accident statistics in accordance with Title 46 U.S.C. 6102...

  5. 75 FR 7412 - Reporting Information Regarding Falsification of Data

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-19

    ... concomitant medications or treatments; omitting data so that a statistical analysis yields a result that would..., results, statistics, items of information, or statements made by individuals. This proposed rule would..., Bureau of Labor Statistics ( www.bls.gov/oes/current/naics4_325400.htm ); compliance officer wage rate...

  6. 76 FR 59998 - Notice of Intent To Suspend the Postharvest Chemical Use Survey and All Associated Reports

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-28

    ... DEPARTMENT OF AGRICULTURE National Agricultural Statistics Service Notice of Intent To Suspend the Postharvest Chemical Use Survey and All Associated Reports AGENCY: National Agricultural Statistics Service... the intention of the National Agricultural Statistics Service (NASS) to suspend a currently approved...

  7. 78 FR 10596 - Notice of Intent To Request Revision and Extension of a Currently Approved Information Collection

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-14

    ... DEPARTMENT OF AGRICULTURE National Agricultural Statistics Service Notice of Intent To Request... Statistics Service, USDA. ACTION: Notice and request for comments. SUMMARY: In accordance with the Paperwork Reduction Act of 1995 this notice announces the intention of the National Agricultural Statistics Service...

  8. 78 FR 10597 - Notice of Intent to Request Revision and Extension of a Currently Approved Information Collection

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-14

    ... DEPARTMENT OF AGRICULTURE National Agricultural Statistics Service Notice of Intent to Request... Statistics Service, USDA. ACTION: Notice and request for comments. SUMMARY: In accordance with the Paperwork Reduction Act of 1995, this notice announces the intention of the National Agricultural Statistics Service...

  9. 76 FR 64299 - Notice of Intent to Request Revision and Extension of a Currently Approved Information Collection

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-18

    ... DEPARTMENT OF AGRICULTURE National Agricultural Statistics Service Notice of Intent to Request... Statistics Service, USDA. ACTION: Notice and request for comments. SUMMARY: In accordance with the Paperwork Reduction Act of 1995, this notice announces the intention of the National Agricultural Statistics Service...

  10. 77 FR 5759 - Notice of Intent To Request Revision and Extension of a Currently Approved Information Collection.

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-06

    ... DEPARTMENT OF AGRICULTURE National Agricultural Statistics Service Notice of Intent To Request... Statistics Service. ACTION: Notice and request for comments. SUMMARY: In accordance with the Paperwork Reduction Act of 1995, this notice announces the intention of the National Agricultural Statistics Service...

  11. Multivariate Relationships between Statistics Anxiety and Motivational Beliefs

    ERIC Educational Resources Information Center

    Baloglu, Mustafa; Abbassi, Amir; Kesici, Sahin

    2017-01-01

    In general, anxiety has been found to be associated with motivational beliefs and the current study investigated multivariate relationships between statistics anxiety and motivational beliefs among 305 college students (60.0% women). The Statistical Anxiety Rating Scale, the Motivated Strategies for Learning Questionnaire, and a set of demographic…

  12. 78 FR 64910 - Notice of Intent To Request Revision and Extension of a Currently Approved Information Collection

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-30

    ... DEPARTMENT OF AGRICULTURE National Agricultural Statistics Service Notice of Intent To Request... Statistics Service, USDA. ACTION: Notice and request for comments. SUMMARY: In accordance with the Paperwork Reduction Act of 1995, this notice announces the intention of the National Agricultural Statistics Service...

  13. 78 FR 10596 - Notice of Intent To Request Revision and Extension of a Currently Approved Information Collection

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-14

    ... DEPARTMENT OF AGRICULTURE National Agricultural Statistics Service Notice of Intent To Request... Statistics Service, USDA. ACTION: Notice and request for comments. SUMMARY: In accordance with the Paperwork Reduction Act of 1995, this notice announces the intention of the National Agricultural Statistics Service...

  14. Statistical Literacy among Applied Linguists and Second Language Acquisition Researchers

    ERIC Educational Resources Information Center

    Loewen, Shawn; Lavolette, Elizabeth; Spino, Le Anne; Papi, Mostafa; Schmidtke, Jens; Sterling, Scott; Wolff, Dominik

    2014-01-01

    The importance of statistical knowledge in applied linguistics and second language acquisition (SLA) research has been emphasized in recent publications. However, the last investigation of the statistical literacy of applied linguists occurred more than 25 years ago (Lazaraton, Riggenbach, & Ediger, 1987). The current study undertook a partial…

  15. 77 FR 68732 - Notice of Intent To Request Revision and Extension of a Currently Approved Information Collection

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-16

    ... DEPARTMENT OF AGRICULTURE National Agricultural Statistics Service Notice of Intent To Request... Statistics Service, USDA. ACTION: Notice and request for comments. SUMMARY: In accordance with the Paperwork Reduction Act of 1995, this notice announces the intention of the National Agricultural Statistics Service...

  16. Towards the definition of AMS facies in the deposits of pyroclastic density currents

    USGS Publications Warehouse

    Ort, M.H.; Newkirk, T.T.; Vilas, J.F.; Vazquez, J.A.; Ort, M.H.; Porreca, Massimiliano; Geissman, J.W.

    2014-01-01

    Anisotropy of magnetic susceptibility (AMS) provides a statistically robust technique to characterize the fabrics of deposits of pyroclastic density currents (PDCs). AMS fabrics in two types of pyroclastic deposits (small-volume phreatomagmatic currents in the Hopi Buttes volcanic field, Arizona, USA, and large-volume caldera-forming currents, Caviahue Caldera, Neuquén, Argentina) show similar patterns. Near the vent and in areas of high topographical roughness, AMS depositional fabrics are poorly grouped, with weak lineations and foliations. In a densely welded proximal ignimbrite, this fabric is overprinted by a foliation formed as the rock compacted and deformed. Medial deposits have moderate–strong AMS lineations and foliations. The most distal deposits have strong foliations but weak lineations. Based on these facies and existing models for pyroclastic density currents, deposition in the medial areas occurs from the strongly sheared, high-particle-concentration base of a density-stratified current. In proximal areas and where topography mixes this denser base upwards into the current, deposition occurs rapidly from a current with little uniformity in the shear, in which particles fall and collide in a chaotic fashion. Distal deposits are emplaced by a slowing or stalled current so that the dominant particle motion is vertical, leading to weak lineation and strong foliation.

  17. Current trends in treatment of hypertension in Karachi and cost minimization possibilities.

    PubMed

    Hussain, Izhar M; Naqvi, Baqir S; Qasim, Rao M; Ali, Nasir

    2015-01-01

    This study examines drug usage trends in Stage I hypertensive patients without any compelling indications in Karachi, deviations of current practices from evidence-based antihypertensive therapeutic guidelines, and opportunities for cost minimization. In the present study, conducted from June 2012 to August 2012, two sets of surveys were used. Randomized stratified independent surveys were conducted among doctors and the general population, including patients, using pretested questionnaires. Sample sizes for doctors and the general population were 100 and 400, respectively. Statistical analysis was conducted with the Statistical Package for Social Science (SPSS), and the financial impact was also analyzed. On the basis of patients' and doctors' feedback, beta blockers and angiotensin converting enzyme inhibitors were used more frequently than other drugs. Thiazides and low-priced generics were hardly prescribed. Beta blockers were prescribed widely and considered cost effective, although this prescribing pattern increases cost by two to ten times. Feedback showed that therapeutic guidelines were not followed by doctors practicing in the community and hospitals in Karachi. Thiazide diuretics were hardly used, beta blockers were widely prescribed, and high-priced market leaders or expensive branded generics were commonly prescribed. Therefore, there are great opportunities for cost minimization by using evidence-based, clinically effective and safe medicines.

  18. Predicting the performance of linear optical detectors in free space laser communication links

    NASA Astrophysics Data System (ADS)

    Farrell, Thomas C.

    2018-05-01

    While the fundamental performance limit for optical communications is set by the quantum nature of light, in practical systems background light, dark current, and thermal noise of the electronics also degrade performance. In this paper, we derive a set of equations predicting the performance of PIN diodes and linear mode avalanche photo diodes (APDs) in the presence of such noise sources. Electrons generated by signal, background, and dark current shot noise are well modeled in PIN diodes as Poissonian statistical processes. In APDs, on the other hand, the amplifying effects of the device result in statistics that are distinctly non-Poissonian. Thermal noise is well modeled as Gaussian. In this paper, we appeal to the central limit theorem and treat both the variability of the signal and the sum of noise sources as Gaussian. Comparison against Monte-Carlo simulation of PIN diode performance (where we do model shot noise with draws from a Poissonian distribution) validates the legitimacy of this approximation. On-off keying, M-ary pulse position, and binary differential phase shift keying modulation are modeled. We conclude with examples showing how the equations may be used in a link budget to estimate the performance of optical links using linear receivers.
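
    As a hedged illustration of the modeling idea in this record, the sketch below simulates on-off keying detection with Poisson-distributed signal, background, and dark-current electrons plus Gaussian thermal noise, and compares the resulting bit error rate with the all-Gaussian approximation; all counts and noise levels are arbitrary assumptions, not values from the paper.

      # Toy Monte Carlo for OOK detection with a PIN photodiode: photo-electrons from
      # signal + background + dark current are drawn from Poisson distributions and
      # thermal noise is added as Gaussian; the same link is then evaluated with the
      # all-Gaussian approximation discussed above. All numbers are illustrative.
      import numpy as np

      rng = np.random.default_rng(0)
      n_bits = 200_000
      n_sig, n_bg, n_dark = 30.0, 5.0, 2.0   # mean electrons per slot (assumed)
      sigma_th = 10.0                         # thermal-noise std in electrons (assumed)

      bits = rng.integers(0, 2, n_bits)
      mean_e = bits * n_sig + n_bg + n_dark

      # "Exact" model: Poisson shot noise plus Gaussian thermal noise
      rx_poisson = rng.poisson(mean_e) + rng.normal(0.0, sigma_th, n_bits)

      # Gaussian approximation: shot noise replaced by a Gaussian of equal variance
      rx_gauss = rng.normal(mean_e, np.sqrt(mean_e + sigma_th**2))

      # Simple mid-point threshold between the two symbol means
      thr = 0.5 * n_sig + n_bg + n_dark
      for name, rx in [("Poisson+Gaussian", rx_poisson), ("all-Gaussian", rx_gauss)]:
          ber = np.mean((rx > thr) != bits.astype(bool))
          print(f"{name:18s} BER ~ {ber:.2e}")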

  19. Degrees of separation as a statistical tool for evaluating candidate genes.

    PubMed

    Nelson, Ronald M; Pettersson, Mats E

    2014-12-01

    Selection of candidate genes is an important step in the exploration of complex genetic architecture. The number of gene networks available is increasing and these can provide information to help with candidate gene selection. It is currently common to use the degree of connectedness in gene networks as validation in Genome Wide Association (GWA) and Quantitative Trait Locus (QTL) mapping studies. However, this practice can produce misleading results if not validated properly. Here we present a method and tool for validating the gene pairs from GWA studies given the context of the network in which they co-occur. It ensures that proposed interactions and gene associations are not statistical artefacts inherent to the specific gene network architecture. The CandidateBacon package provides an easy and efficient method to calculate the average degree of separation (DoS) between pairs of genes in currently available gene networks. We show how these empirical estimates of average connectedness are used to validate candidate gene pairs. Validation of interacting genes by comparing their connectedness with the average connectedness in the gene network will provide support for said interactions by utilising the growing amount of gene network information available. Copyright © 2014 Elsevier Ltd. All rights reserved.
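
    A minimal sketch of the underlying degrees-of-separation idea: compare the average shortest-path distance between candidate gene pairs with that of random pairs in the same network. The toy network, gene indices, and use of networkx are illustrative assumptions; this is not the CandidateBacon implementation.

      # Compare the average degree of separation (shortest-path distance) of
      # candidate gene pairs against random pairs drawn from the same network.
      import random
      import networkx as nx

      G = nx.erdos_renyi_graph(200, 0.03, seed=1)          # stand-in gene network
      candidate_pairs = [(3, 17), (42, 99)]                 # hypothetical GWA hits

      def avg_distance(graph, pairs):
          dists = [nx.shortest_path_length(graph, u, v)
                   for u, v in pairs if nx.has_path(graph, u, v)]
          return sum(dists) / len(dists)

      random.seed(2)
      random_pairs = [tuple(random.sample(list(G.nodes), 2)) for _ in range(500)]
      print("candidate pairs avg DoS:", avg_distance(G, candidate_pairs))
      print("random pairs    avg DoS:", avg_distance(G, random_pairs))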

  20. Investigating the Relationship between Ocean Surface Currents and Seasonal Precipitation in the Western United States

    NASA Astrophysics Data System (ADS)

    Chiang, F.; AghaKouchak, A.

    2017-12-01

    While many studies have explored the predictive capabilities of teleconnections associated with North American climate, currently established teleconnections offer limited predictability for rainfall in the Western United States. A recent example was the 2015-16 California drought in which a strong ENSO signal did not lead to above average precipitation as was expected. From an exploration of climate and ocean variables available from satellite data, we hypothesize that ocean currents can provide additional information to explain precipitation variability and improve seasonal predictability on the West Coast. Since ocean currents are influenced by surface wind and temperatures, characterizing connections between currents and precipitation patterns has the potential to further our understanding of coastal weather patterns. For the study, we generated gridded point correlation maps to identify ocean areas with high correlation to precipitation time series corresponding to climate regions in the West Coast region. We also used other statistical measures to evaluate ocean 'hot spot' regions with significant correlation to West Coast precipitation. Preliminary results show that strong correlations can be found in the tropical regions of the globe.

  1. Superstatistics analysis of the ion current distribution function: Met3PbCl influence study.

    PubMed

    Miśkiewicz, Janusz; Trela, Zenon; Przestalski, Stanisław; Karcz, Waldemar

    2010-09-01

    A novel analysis of ion current time series is proposed. It is shown that higher (second, third and fourth) statistical moments of the ion current probability distribution function (PDF) can yield new information about ion channel properties. The method is illustrated on a two-state model where the PDF of the compound states are given by normal distributions. The proposed method was applied to the analysis of the SV cation channels of vacuolar membrane of Beta vulgaris and the influence of trimethyllead chloride (Met(3)PbCl) on the ion current probability distribution. Ion currents were measured by patch-clamp technique. It was shown that Met(3)PbCl influences the variance of the open-state ion current but does not alter the PDF of the closed-state ion current. Incorporation of higher statistical moments into the standard investigation of ion channel properties is proposed.
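
    The sketch below illustrates the kind of analysis described: a two-state (open/closed) channel with normally distributed compound states is simulated, and the mean, variance, skewness, and kurtosis of the resulting current distribution are computed. The state levels, noise widths, and open probability are arbitrary assumptions.

      # Higher statistical moments of a simulated two-state ion-current record.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(0)
      n = 100_000
      open_prob = 0.3
      state = rng.random(n) < open_prob                    # True = open
      current = np.where(state,
                         rng.normal(5.0, 0.8, n),          # open-state current (pA)
                         rng.normal(0.0, 0.3, n))          # closed-state current (pA)

      print("mean    :", current.mean())
      print("variance:", current.var())
      print("skewness:", stats.skew(current))
      print("kurtosis:", stats.kurtosis(current))          # excess kurtosis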

  2. A pilot investigation of the hypoalgesic effects of transcutaneous electrical nerve stimulation upon low back pain in people with multiple sclerosis.

    PubMed

    Al-Smadi, J; Warke, K; Wilson, I; Cramp, A F L; Noble, G; Walsh, D M; Lowe-Strong, A S

    2003-11-01

    To investigate the hypoalgesic effects of transcutaneous electrical nerve stimulation (TENS) upon low back pain (LBP) in people with multiple sclerosis (MS). A randomized double-blind placebo controlled clinical pilot study. Fifteen people with MS were recruited and randomly allocated to one of the following groups under double blind conditions (n = 5 per group): TENS 1 (4 Hz, 200 µs), TENS 2 (110 Hz, 200 µs), placebo TENS. Treatment was applied for 45 minutes three times a week for six weeks with a four-week follow-up. The following outcome measures were taken at weeks 1, 6, and 10: visual analogue scale (VAS) (for current LBP, right leg pain, left leg pain); Leeds Multiple Sclerosis Quality of Life Questionnaire; Roland Morris Disability Questionnaire; Short Form-36 (SF-36) Version 1; and the McGill Pain Questionnaire (MPQ). VAS for current LBP, right and left leg pain were also taken before and after treatment, and once a week during the follow-up period. Analysis showed no statistically significant effects for any of the data. However, both active treatment groups showed a trend of improvement in the majority of the outcome measures. Active TENS was more effective than placebo TENS in decreasing VAS scores following each treatment although results were not statistically significant. Further work in this area is warranted and should include a larger number of participants in the form of a randomized controlled clinical trial to determine the efficacy of this modality.

  3. Problem of Auroral Oval Mapping and Multiscale Auroral Structures

    NASA Astrophysics Data System (ADS)

    Antonova, Elizaveta; Stepanova, Marina; Kirpichev, Igor; Vovchenko, Vadim; Vorobjev, Viachislav; Yagodkina, Oksana

    The problem of mapping the auroral oval to the equatorial plane is reanalyzed taking into account the latest results of the analysis of plasma pressure distribution at low altitudes and at the equatorial plane. Statistical pictures of the pressure distribution at low altitudes are obtained using data from DMSP observations, and statistical pictures of the pressure distribution at the equatorial plane are obtained using data from the THEMIS mission. Results of THEMIS observations demonstrate the existence of a plasma ring surrounding the Earth at geocentric distances from ~6 to ~12 Re. Plasma pressure in the ring is nearly isotropic and its averaged values are larger than 0.2 nPa. We take into account that isotropic plasma pressure is constant along a field line and that the existence of field-aligned potential drops in the region of auroral electron acceleration leads to a pressure decrease at low altitudes. We show that most of the quiet-time auroral oval does not map to the real plasma sheet; it maps to the plasma ring surrounding the Earth. We also show that transverse currents in the plasma ring are closed inside the magnetosphere, forming the high-latitude continuation of the ordinary ring current. The obtained results are used to explain the ring-like form of the auroral oval. We also analyze the processes of formation of multiscale auroral structures, including thin auroral arcs, and discuss the difficulties of theories of Alfvénic acceleration of auroral electrons.

  4. Usefulness of Leukocyte Esterase Test Versus Rapid Strep Test for Diagnosis of Acute Strep Pharyngitis

    PubMed Central

    2015-01-01

    Objective: To compare throat swab testing for leukocyte esterase on a test strip (urine dipstick/multistick) with the rapid strep test for rapid diagnosis of Group A beta-hemolytic streptococci in cases of acute pharyngitis in children. Hypothesis: Testing a throat swab for leukocyte esterase on a test strip currently used for urine testing may detect throat infection and might be as useful as the rapid strep test. Methods: All patients presenting with a complaint of sore throat and fever were examined clinically for erythema of the pharynx and tonsils and for any exudates. Informed consent was obtained from the parents and assent from the subjects. Three swabs were taken from the pharyngo-tonsillar region for culture, rapid strep testing, and leukocyte esterase (LE) testing. Results: Of 100 patients, 9 were culture positive; the rapid strep test was negative in 84 and positive in 16; the LE test was negative in 80 and positive in 20. Statistics: The rapid strep and LE results were not independent but strongly aligned, showing very good agreement. The calculated chi-squared value exceeded the tabulated value at 1 degree of freedom (P < .0001), so the null hypothesis was rejected in favor of the alternative. Conclusions: Leukocyte esterase on a throat swab, using a test strip currently used for urine dipstick testing, is as useful as the rapid strep test for rapid diagnosis of streptococcal pharyngitis in children. PMID:27335975
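
    A sketch of the reported chi-squared comparison is shown below; since the record gives only the marginal counts, the 2x2 cross-tabulation of rapid strep versus leukocyte esterase results is hypothetical (chosen only to be consistent with those marginals).

      # Chi-squared test of agreement between two rapid tests on a hypothetical
      # 2x2 cross-tabulation (marginals match the record: 16/84 and 20/80).
      from scipy.stats import chi2_contingency

      #                LE +   LE -
      table = [[15,      1],   # rapid strep +
               [ 5,     79]]   # rapid strep -

      chi2, p, dof, expected = chi2_contingency(table, correction=False)
      print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.2e}")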

  5. Psychophysical Map Stability in Bilateral Sequential Cochlear Implantation: Comparing Current Audiology Methods to a New Statistical Definition.

    PubMed

    Domville-Lewis, Chloe; Santa Maria, Peter L; Upson, Gemma; Chester-Browne, Ronel; Atlas, Marcus D

    2015-01-01

    The purpose of this study was to establish a statistical definition for stability in cochlear implant maps. Once defined, this study aimed to compare the duration taken to achieve a stable map in first and second implants in patients who underwent sequential bilateral cochlear implantation. This article also sought to evaluate a number of factors that potentially affect map stability. This was a retrospective cohort study of 33 patients with sensorineural hearing loss who received sequential bilateral cochlear implantation (Cochlear, Sydney, Australia), performed by the senior author. Psychophysical parameters of hearing threshold scores, comfort scores, and the dynamic range were measured for the apical, medial, and basal portions of the cochlear implant electrode at a range of intervals postimplantation. Stability was defined statistically as a less than 10% difference in threshold, comfort, and dynamic range scores over three consecutive mapping sessions. A senior cochlear implant audiologist, blinded to implant order and the statistical results, separately analyzed these psychophysical map parameters using current assessment methods. First and second implants were compared for the duration to achieve stability; age, gender, duration of deafness, etiology of deafness, time between insertion of the first and second implants, and the presence or absence of preoperative hearing aids were evaluated for their relationship to stability. Statistical analysis included performing two-tailed Student's t tests and least squares regression analysis, with statistical significance set at p ≤ 0.05. There was a significant positive correlation between the devised statistical definition and the current audiology methods for assessing stability, with a Pearson correlation coefficient r = 0.36 and a least squares regression slope (b) of 0.41, df(58), 95% confidence interval 0.07 to 0.55 (p = 0.004). The average duration from device switch on to stability in the first implant was 87 days using current audiology methods and 81 days using the statistical definition, with no statistically significant difference between assessment methods (p = 0.2). The duration to achieve stability in the second implant was 51 days using current audiology methods and 60 days using the statistical method, and again no difference between the two assessment methods (p = 0.13). There was a significant reduction in the time to achieve stability in second implants for both audiology and statistical methods (p < 0.001 and p = 0.02, respectively). There was a difference in duration to achieve stability based on electrode array region, with basal portions taking longer to stabilize than apical in the first implant (p = 0.02) and both apical and medial segments in second implants (p = 0.004 and p = 0.01, respectively). No factors that were evaluated in this study, including gender, age, etiology of deafness, duration of deafness, time between implant insertion, and the preoperative hearing aid status, were correlated with stability duration in either stability assessment method. Our statistical definition can accurately predict cochlear implant map stability when compared with current audiology practices. Cochlear implants that are implanted second tend to stabilize sooner than the first, which has a significant impact on counseling before a second implant. No factors evaluated affected the duration required to achieve stability in this study.
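
    A minimal sketch of the stability definition quoted above (a parameter is stable once it varies by less than 10% across three consecutive mapping sessions), under the assumption that the 10% is measured relative to the smallest value in the window; the session values are hypothetical.

      # Find the first session that completes a run of three consecutive sessions
      # whose values differ by less than 10% (relative), i.e. a "stable" map.
      def first_stable_session(values, tol=0.10, window=3):
          """Return the index closing the first stable run, or None if none exists."""
          for i in range(window - 1, len(values)):
              run = values[i - window + 1: i + 1]
              lo, hi = min(run), max(run)
              if lo > 0 and (hi - lo) / lo < tol:
                  return i
          return None

      threshold_scores = [120, 138, 150, 155, 158, 156, 157]   # hypothetical levels
      print("stable from session index:", first_stable_session(threshold_scores))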

  6. Properties of permutation-based gene tests and controlling type 1 error using a summary statistic based gene test

    PubMed Central

    2013-01-01

    Background The advent of genome-wide association studies has led to many novel disease-SNP associations, opening the door to focused study on their biological underpinnings. Because of the importance of analyzing these associations, numerous statistical methods have been devoted to them. However, fewer methods have attempted to associate entire genes or genomic regions with outcomes, which is potentially more useful knowledge from a biological perspective and those methods currently implemented are often permutation-based. Results One property of some permutation-based tests is that their power varies as a function of whether significant markers are in regions of linkage disequilibrium (LD) or not, which we show from a theoretical perspective. We therefore develop two methods for quantifying the degree of association between a genomic region and outcome, both of whose power does not vary as a function of LD structure. One method uses dimension reduction to “filter” redundant information when significant LD exists in the region, while the other, called the summary-statistic test, controls for LD by scaling marker Z-statistics using knowledge of the correlation matrix of markers. An advantage of this latter test is that it does not require the original data, but only their Z-statistics from univariate regressions and an estimate of the correlation structure of markers, and we show how to modify the test to protect the type 1 error rate when the correlation structure of markers is misspecified. We apply these methods to sequence data of oral cleft and compare our results to previously proposed gene tests, in particular permutation-based ones. We evaluate the versatility of the modification of the summary-statistic test since the specification of correlation structure between markers can be inaccurate. Conclusion We find a significant association in the sequence data between the 8q24 region and oral cleft using our dimension reduction approach and a borderline significant association using the summary-statistic based approach. We also implement the summary-statistic test using Z-statistics from an already-published GWAS of Chronic Obstructive Pulmonary Disorder (COPD) and correlation structure obtained from HapMap. We experiment with the modification of this test because the correlation structure is assumed imperfectly known. PMID:24199751

  7. Properties of permutation-based gene tests and controlling type 1 error using a summary statistic based gene test.

    PubMed

    Swanson, David M; Blacker, Deborah; Alchawa, Taofik; Ludwig, Kerstin U; Mangold, Elisabeth; Lange, Christoph

    2013-11-07

    The advent of genome-wide association studies has led to many novel disease-SNP associations, opening the door to focused study on their biological underpinnings. Because of the importance of analyzing these associations, numerous statistical methods have been devoted to them. However, fewer methods have attempted to associate entire genes or genomic regions with outcomes, which is potentially more useful knowledge from a biological perspective and those methods currently implemented are often permutation-based. One property of some permutation-based tests is that their power varies as a function of whether significant markers are in regions of linkage disequilibrium (LD) or not, which we show from a theoretical perspective. We therefore develop two methods for quantifying the degree of association between a genomic region and outcome, both of whose power does not vary as a function of LD structure. One method uses dimension reduction to "filter" redundant information when significant LD exists in the region, while the other, called the summary-statistic test, controls for LD by scaling marker Z-statistics using knowledge of the correlation matrix of markers. An advantage of this latter test is that it does not require the original data, but only their Z-statistics from univariate regressions and an estimate of the correlation structure of markers, and we show how to modify the test to protect the type 1 error rate when the correlation structure of markers is misspecified. We apply these methods to sequence data of oral cleft and compare our results to previously proposed gene tests, in particular permutation-based ones. We evaluate the versatility of the modification of the summary-statistic test since the specification of correlation structure between markers can be inaccurate. We find a significant association in the sequence data between the 8q24 region and oral cleft using our dimension reduction approach and a borderline significant association using the summary-statistic based approach. We also implement the summary-statistic test using Z-statistics from an already-published GWAS of Chronic Obstructive Pulmonary Disorder (COPD) and correlation structure obtained from HapMap. We experiment with the modification of this test because the correlation structure is assumed imperfectly known.
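
    For illustration, one common way to combine marker Z-statistics while accounting for their correlation (LD) matrix is a quadratic-form test, sketched below; this conveys the general idea of an LD-aware summary-statistic test but is not necessarily the exact statistic proposed by the authors, and the Z-scores and correlation matrix are hypothetical.

      # LD-aware quadratic-form region test: Q = z' R^{-1} z is chi-squared with
      # k degrees of freedom under the null when R is the true marker correlation.
      import numpy as np
      from scipy import stats

      z = np.array([1.8, 2.1, 0.4, -0.9])          # marker Z-scores (hypothetical)
      R = np.array([[1.0, 0.6, 0.2, 0.1],           # marker correlation / LD matrix
                    [0.6, 1.0, 0.3, 0.2],           # (hypothetical, e.g. from HapMap)
                    [0.2, 0.3, 1.0, 0.5],
                    [0.1, 0.2, 0.5, 1.0]])

      Q = z @ np.linalg.solve(R, z)
      p_value = stats.chi2.sf(Q, df=len(z))
      print(f"Q = {Q:.2f}, p = {p_value:.3f}")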

  8. An Exploration of Preference for Numerical Information in Relation to Math Self-Concept and Statistics Anxiety in a Graduate Statistics Course

    ERIC Educational Resources Information Center

    Williams, Amanda

    2014-01-01

    The purpose of the current research was to investigate the relationship between preference for numerical information (PNI), math self-concept, and six types of statistics anxiety in an attempt to establish support for the nomological validity of the PNI. Correlations indicate that four types of statistics anxiety were strongly related to PNI, and…

  9. metaCCA: summary statistics-based multivariate meta-analysis of genome-wide association studies using canonical correlation analysis.

    PubMed

    Cichonska, Anna; Rousu, Juho; Marttinen, Pekka; Kangas, Antti J; Soininen, Pasi; Lehtimäki, Terho; Raitakari, Olli T; Järvelin, Marjo-Riitta; Salomaa, Veikko; Ala-Korpela, Mika; Ripatti, Samuli; Pirinen, Matti

    2016-07-01

    A dominant approach to genetic association studies is to perform univariate tests between genotype-phenotype pairs. However, analyzing related traits together increases statistical power, and certain complex associations become detectable only when several variants are tested jointly. Currently, modest sample sizes of individual cohorts, and restricted availability of individual-level genotype-phenotype data across the cohorts limit conducting multivariate tests. We introduce metaCCA, a computational framework for summary statistics-based analysis of a single or multiple studies that allows multivariate representation of both genotype and phenotype. It extends the statistical technique of canonical correlation analysis to the setting where original individual-level records are not available, and employs a covariance shrinkage algorithm to achieve robustness. Multivariate meta-analysis of two Finnish studies of nuclear magnetic resonance metabolomics by metaCCA, using standard univariate output from the program SNPTEST, shows an excellent agreement with the pooled individual-level analysis of original data. Motivated by strong multivariate signals in the lipid genes tested, we envision that multivariate association testing using metaCCA has a great potential to provide novel insights from already published summary statistics from high-throughput phenotyping technologies. Code is available at https://github.com/aalto-ics-kepaco. Contacts: anna.cichonska@helsinki.fi or matti.pirinen@helsinki.fi. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.

  10. metaCCA: summary statistics-based multivariate meta-analysis of genome-wide association studies using canonical correlation analysis

    PubMed Central

    Cichonska, Anna; Rousu, Juho; Marttinen, Pekka; Kangas, Antti J.; Soininen, Pasi; Lehtimäki, Terho; Raitakari, Olli T.; Järvelin, Marjo-Riitta; Salomaa, Veikko; Ala-Korpela, Mika; Ripatti, Samuli; Pirinen, Matti

    2016-01-01

    Motivation: A dominant approach to genetic association studies is to perform univariate tests between genotype-phenotype pairs. However, analyzing related traits together increases statistical power, and certain complex associations become detectable only when several variants are tested jointly. Currently, modest sample sizes of individual cohorts, and restricted availability of individual-level genotype-phenotype data across the cohorts limit conducting multivariate tests. Results: We introduce metaCCA, a computational framework for summary statistics-based analysis of a single or multiple studies that allows multivariate representation of both genotype and phenotype. It extends the statistical technique of canonical correlation analysis to the setting where original individual-level records are not available, and employs a covariance shrinkage algorithm to achieve robustness. Multivariate meta-analysis of two Finnish studies of nuclear magnetic resonance metabolomics by metaCCA, using standard univariate output from the program SNPTEST, shows an excellent agreement with the pooled individual-level analysis of original data. Motivated by strong multivariate signals in the lipid genes tested, we envision that multivariate association testing using metaCCA has a great potential to provide novel insights from already published summary statistics from high-throughput phenotyping technologies. Availability and implementation: Code is available at https://github.com/aalto-ics-kepaco Contacts: anna.cichonska@helsinki.fi or matti.pirinen@helsinki.fi Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27153689
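
    For intuition only, the sketch below runs ordinary individual-level canonical correlation analysis between a block of synthetic genotypes and several correlated phenotypes using scikit-learn; metaCCA itself works from summary statistics and a shrunken correlation matrix rather than from individual-level records like these.

      # Plain CCA between a genotype block and several phenotypes on synthetic data,
      # illustrating the statistical technique that metaCCA extends.
      import numpy as np
      from sklearn.cross_decomposition import CCA

      rng = np.random.default_rng(0)
      n = 1_000
      genotypes = rng.integers(0, 3, size=(n, 5)).astype(float)   # 5 SNPs coded 0/1/2
      effect = genotypes[:, 0] * 0.4 + genotypes[:, 1] * 0.2
      phenotypes = np.column_stack([effect + rng.normal(0, 1, n),
                                    0.5 * effect + rng.normal(0, 1, n),
                                    rng.normal(0, 1, n)])

      cca = CCA(n_components=1).fit(genotypes, phenotypes)
      u, v = cca.transform(genotypes, phenotypes)
      r = np.corrcoef(u[:, 0], v[:, 0])[0, 1]
      print(f"leading canonical correlation ~ {r:.2f}")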

  11. The Ontology of Biological and Clinical Statistics (OBCS) for standardized and reproducible statistical analysis.

    PubMed

    Zheng, Jie; Harris, Marcelline R; Masci, Anna Maria; Lin, Yu; Hero, Alfred; Smith, Barry; He, Yongqun

    2016-09-14

    Statistics play a critical role in biological and clinical research. However, most reports of scientific results in the published literature make it difficult for the reader to reproduce the statistical analyses performed in achieving those results because they provide inadequate documentation of the statistical tests and algorithms applied. The Ontology of Biological and Clinical Statistics (OBCS) is put forward here as a step towards solving this problem. The terms in OBCS including 'data collection', 'data transformation in statistics', 'data visualization', 'statistical data analysis', and 'drawing a conclusion based on data', cover the major types of statistical processes used in basic biological research and clinical outcome studies. OBCS is aligned with the Basic Formal Ontology (BFO) and extends the Ontology of Biomedical Investigations (OBI), an OBO (Open Biological and Biomedical Ontologies) Foundry ontology supported by over 20 research communities. Currently, OBCS comprehends 878 terms, representing 20 BFO classes, 403 OBI classes, 229 OBCS specific classes, and 122 classes imported from ten other OBO ontologies. We discuss two examples illustrating how the ontology is being applied. In the first (biological) use case, we describe how OBCS was applied to represent the high throughput microarray data analysis of immunological transcriptional profiles in human subjects vaccinated with an influenza vaccine. In the second (clinical outcomes) use case, we applied OBCS to represent the processing of electronic health care data to determine the associations between hospital staffing levels and patient mortality. Our case studies were designed to show how OBCS can be used for the consistent representation of statistical analysis pipelines under two different research paradigms. Other ongoing projects using OBCS for statistical data processing are also discussed. The OBCS source code and documentation are available at: https://github.com/obcs/obcs . The Ontology of Biological and Clinical Statistics (OBCS) is a community-based open source ontology in the domain of biological and clinical statistics. OBCS is a timely ontology that represents statistics-related terms and their relations in a rigorous fashion, facilitates standard data analysis and integration, and supports reproducible biological and clinical research.

  12. A SIGNIFICANCE TEST FOR THE LASSO

    PubMed Central

    Lockhart, Richard; Taylor, Jonathan; Tibshirani, Ryan J.; Tibshirani, Robert

    2014-01-01

    In the sparse linear regression setting, we consider testing the significance of the predictor variable that enters the current lasso model, in the sequence of models visited along the lasso solution path. We propose a simple test statistic based on lasso fitted values, called the covariance test statistic, and show that when the true model is linear, this statistic has an Exp(1) asymptotic distribution under the null hypothesis (the null being that all truly active variables are contained in the current lasso model). Our proof of this result for the special case of the first predictor to enter the model (i.e., testing for a single significant predictor variable against the global null) requires only weak assumptions on the predictor matrix X. On the other hand, our proof for a general step in the lasso path places further technical assumptions on X and the generative model, but still allows for the important high-dimensional case p > n, and does not necessarily require that the current lasso model achieves perfect recovery of the truly active variables. Of course, for testing the significance of an additional variable between two nested linear models, one typically uses the chi-squared test, comparing the drop in residual sum of squares (RSS) to a χ₁² distribution. But when this additional variable is not fixed, and has been chosen adaptively or greedily, this test is no longer appropriate: adaptivity makes the drop in RSS stochastically much larger than χ₁² under the null hypothesis. Our analysis explicitly accounts for adaptivity, as it must, since the lasso builds an adaptive sequence of linear models as the tuning parameter λ decreases. In this analysis, shrinkage plays a key role: though additional variables are chosen adaptively, the coefficients of lasso active variables are shrunken due to the ℓ1 penalty. Therefore, the test statistic (which is based on lasso fitted values) is in a sense balanced by these two opposing properties—adaptivity and shrinkage—and its null distribution is tractable and asymptotically Exp(1). PMID:25574062
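
    The simulation below illustrates the abstract's central claim for the first step of the path, in the simplified orthogonal-design case where the lasso knots equal the sorted |X'y| values, so that T1 = λ1(λ1 − λ2)/σ² should be approximately Exp(1) under the global null; this is a hedged sketch of that special case, not the general-step test, which needs the full lasso path.

      # First-step covariance test statistic under the global null with an
      # orthogonal design: T1 = lambda_1 * (lambda_1 - lambda_2) / sigma^2.
      import numpy as np

      rng = np.random.default_rng(0)
      n, p, sigma, reps = 100, 50, 1.0, 5_000

      T1 = np.empty(reps)
      for r in range(reps):
          X, _ = np.linalg.qr(rng.normal(size=(n, p)))      # orthonormal columns
          y = rng.normal(0.0, sigma, n)                      # global null: no signal
          knots = np.sort(np.abs(X.T @ y))[::-1]             # lasso knots (orthogonal X)
          T1[r] = knots[0] * (knots[0] - knots[1]) / sigma**2

      # Under the null, T1 is approximately Exp(1): mean ~ 1, P(T1 > 3) ~ exp(-3)
      print("mean(T1)  ~", T1.mean())
      print("P(T1 > 3) ~", np.mean(T1 > 3), " vs exp(-3) =", np.exp(-3))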

  13. Current State and Development Trends of Education Policy Research in China in the Last Decade (2004-2013): A Statistical Analysis of Papers from Eight Core Chinese Journals

    ERIC Educational Resources Information Center

    Ling, Guo

    2017-01-01

    The author conducted sampling and statistical analysis of papers on education policy research collected by the China National Knowledge Infrastructure in the period from the years 2004--2013. Under the current state of education policy research in China, the number of papers correlates positively with the year; the papers are concentrated in…

  14. Interhemispheric currents in the ring current region as seen by the Cluster spacecraft

    NASA Astrophysics Data System (ADS)

    Tenfjord, P.; Ostgaard, N.; Haaland, S.; Laundal, K.; Reistad, J. P.

    2013-12-01

    The existence of interhemispheric currents has been predicted by several authors, but their extent in the ring current has, to our knowledge, never been studied systematically using in-situ measurements. These currents have been suggested to be associated with observed asymmetries of the aurora. We perform a statistical study of current density and direction during ring current crossings using the Cluster spacecraft. We analyse the extent of the interhemispheric field-aligned currents for a wide range of solar wind conditions. Direct estimations of equatorial current direction and density are achieved through the curlometer technique. The curlometer technique is based on Ampere's law and requires magnetic field measurements from all four spacecraft. The use of this method requires careful study of factors that limit the accuracy, such as tetrahedron shape and configuration. This significantly limits our dataset, but is a necessity for accurate current calculations. Our goal is to statistically investigate the occurrence of interhemispheric currents, and determine whether there are parameters or magnetospheric states on which the current magnitude and direction depend.
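
    A minimal sketch of a curlometer-style estimate is given below: the tetrahedron's reciprocal vectors are built from the four spacecraft positions and the current density is taken from the linear estimate of curl(B) divided by μ0. The positions and fields are made up, and this is the standard linear (barycentric) estimator rather than the exact Cluster processing chain.

      # Curlometer-style current estimate from four-point magnetic field data using
      # reciprocal vectors: curl(B) ~ sum_a k_a x B_a, J = curl(B) / mu0.
      import numpy as np

      MU0 = 4e-7 * np.pi

      def reciprocal_vectors(r):
          """r: (4,3) spacecraft positions in metres. Returns (4,3) reciprocal vectors."""
          k = np.empty_like(r, dtype=float)
          for a in range(4):
              b, c, d = (a + 1) % 4, (a + 2) % 4, (a + 3) % 4
              cross = np.cross(r[c] - r[b], r[d] - r[b])
              k[a] = cross / np.dot(r[a] - r[b], cross)
          return k

      def current_density(r_km, b_nT):
          k = reciprocal_vectors(r_km * 1e3)                              # km -> m
          curl_b = sum(np.cross(k[a], b_nT[a] * 1e-9) for a in range(4))  # nT -> T
          div_b = sum(np.dot(k[a], b_nT[a] * 1e-9) for a in range(4))     # quality check
          return curl_b / MU0, div_b / MU0

      r = np.array([[0, 0, 0], [100, 10, 0], [10, 100, 0], [0, 10, 100]], float)  # km
      b = np.array([[20, 5, -3], [21, 4, -3], [20, 6, -2], [19, 5, -4]], float)   # nT
      J, divB_over_mu0 = current_density(r, b)
      print("J (A/m^2):", J, " |div B|/mu0:", abs(divB_over_mu0))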

  15. Shallow plumbing systems inferred from spatial analysis of pockmark arrays

    NASA Astrophysics Data System (ADS)

    Maia, A.; Cartwright, J. A.; Andersen, E.

    2016-12-01

    This study describes and analyses an extraordinary array of pockmarks at the modern seabed of the Lower Congo Basin (offshore Angola), in order to understand the fluid migration routes and shallow plumbing system of the area. The 3D seismic visualization of feeding conduits (pipes) allowed the identification of the source interval for the fluids expelled during pockmark formation. Spatial statistics are used to show the relationship between the underlying polarised polygonal fault (PPF) patterns and seabed pockmark distributions. Our results show that PPFs control the linear arrangement of pockmarks and feeder pipes along fault strike, but faults do not act as conduits. Spatial statistics also revealed that pockmark occurrence is not random, especially at short distances to nearest neighbours (<200 m), where anti-clustering distributions suggest the presence of an exclusion zone around each pockmark in which no other pockmark will form. The results of this study are relevant for the understanding of shallow fluid plumbing systems in offshore settings, with implications for our current knowledge of overall fluid flow systems in hydrocarbon-rich continental margins.

  16. Crossing statistic: reconstructing the expansion history of the universe

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shafieloo, Arman, E-mail: arman@ewha.ac.kr

    2012-08-01

    We show that by combining the Crossing Statistic [1,2] and the Smoothing method [3-5] one can reconstruct the expansion history of the universe with very high precision without assuming any prior on cosmological quantities such as the equation of state of dark energy. We show that the presented method performs very well in reconstructing the expansion history of the universe independent of the underlying models, and it works well even for non-trivial dark energy models with fast or slow changes in the equation of state of dark energy. The accuracy of the reconstructed quantities, along with the independence of the method from any prior or assumption, gives the proposed method advantages over other non-parametric methods previously proposed in the literature. Applying it to the Union 2.1 supernovae combined with WiggleZ BAO data, we present the reconstructed results and test the consistency of the two data sets in a model-independent manner. Results show that the latest available supernovae and BAO data are in good agreement with each other, and the spatially flat ΛCDM model is in concordance with the current data.

  17. Multi-scales region segmentation for ROI separation in digital mammograms

    NASA Astrophysics Data System (ADS)

    Zhang, Dapeng; Zhang, Di; Li, Yue; Wang, Wei

    2017-02-01

    Mammography is currently the most effective imaging modality used by radiologists for the screening of breast cancer. Segmentation is one of the key steps in the process of developing anatomical models for calculation of a safe medical dose of radiation. This paper explores the potential of the statistical region merging (SRM) segmentation technique for breast segmentation in digital mammograms. First, the mammograms are pre-processed for region enhancement; then the enhanced images are segmented using SRM at multiple scales; finally, these segmentations are combined for region of interest (ROI) separation and edge detection. The proposed algorithm uses multi-scale region segmentation to separate the breast region from the background, detect region edges, and separate ROIs. The experiments are performed using a data set of mammograms from different patients, demonstrating the validity of the proposed criterion. Results show that the statistical region merging segmentation algorithm can indeed segment medical images and is more accurate than other methods, and the outcome shows that the technique has great potential to become a method of choice for segmentation of mammograms.
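
    As an illustrative stand-in (statistical region merging is not available in scikit-image), the sketch below segments an enhanced mammogram-like image at several scales with Felzenszwalb's graph-based method and combines the labelings into a rough breast ROI; the input file name, scales, and thresholds are hypothetical.

      # Multi-scale segmentation combined into a rough ROI mask; Felzenszwalb's
      # method is used only as a stand-in for SRM.
      import numpy as np
      from skimage import io, exposure
      from skimage.segmentation import felzenszwalb

      image = io.imread("mammogram.png", as_gray=True)      # hypothetical file
      image = exposure.equalize_adapthist(image)            # pre-processing / enhancement

      scales = [50, 200, 800]                               # coarse-to-fine scales
      labelings = [felzenszwalb(image, scale=s, sigma=1.0, min_size=50) for s in scales]

      # Keep pixels whose region is brighter than the global mean at every scale.
      foreground = np.ones_like(image, dtype=bool)
      for labels in labelings:
          bright = np.array([image[labels == lab].mean() for lab in range(labels.max() + 1)])
          foreground &= bright[labels] > image.mean()
      print("ROI fraction of image:", foreground.mean())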

  18. Effect of Random Circuit Fabrication Errors on Small Signal Gain and Phase in Helix Traveling Wave Tubes

    NASA Astrophysics Data System (ADS)

    Pengvanich, P.; Chernin, D. P.; Lau, Y. Y.; Luginsland, J. W.; Gilgenbach, R. M.

    2007-11-01

    Motivated by the current interest in mm-wave and THz sources, which use miniature, difficult-to-fabricate circuit components, we evaluate the statistical effects of random fabrication errors on a helix traveling wave tube amplifier's small signal characteristics. The small signal theory is treated in a continuum model in which the electron beam is assumed to be monoenergetic, and axially symmetric about the helix axis. Perturbations that vary randomly along the beam axis are introduced in the dimensionless Pierce parameters b, the beam-wave velocity mismatch, C, the gain parameter, and d, the cold tube circuit loss. Our study shows, as expected, that perturbation in b dominates the other two. The extensive numerical data have been confirmed by our analytic theory. They show in particular that the standard deviation of the output phase is linearly proportional to standard deviation of the individual perturbations in b, C, and d. Simple formulas have been derived which yield the output phase variations in terms of the statistical random manufacturing errors. This work was supported by AFOSR and by ONR.

  19. Examination of Solar Cycle Statistical Model and New Prediction of Solar Cycle 23

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee Y.; Wilson, John W.

    2000-01-01

    Sunspot numbers in the current solar cycle 23 were estimated by using a statistical model with the accumulating cycle sunspot data based on the odd-even behavior of historical sunspot cycles 1 to 22. Since cycle 23 has progressed and the solar minimum occurrence has been accurately defined, the statistical model is validated by comparing the previous prediction with the newly measured sunspot numbers, and an improved short-range sunspot projection is made accordingly. The current cycle is expected to have a moderate level of activity. Errors of this model are shown to be self-correcting as cycle observations become available.

  20. Automatic identification of bacterial types using statistical imaging methods

    NASA Astrophysics Data System (ADS)

    Trattner, Sigal; Greenspan, Hayit; Tepper, Gapi; Abboud, Shimon

    2003-05-01

    The objective of the current study is to develop an automatic tool to identify bacterial types using computer-vision and statistical modeling techniques. Bacteriophage (phage)-typing methods are used to identify and extract representative profiles of bacterial types, such as Staphylococcus aureus. Current systems rely on the subjective reading of plaque profiles by a human expert. This process is time-consuming and prone to errors, especially as technology is enabling the increase in the number of phages used for typing. The statistical methodology presented in this work provides an automated, objective, and robust analysis of visual data, along with the ability to cope with increasing data volumes.

  1. Complete integrability of information processing by biochemical reactions

    PubMed Central

    Agliari, Elena; Barra, Adriano; Dello Schiavo, Lorenzo; Moro, Antonio

    2016-01-01

    Statistical mechanics provides an effective framework to investigate information processing in biochemical reactions. Within such framework far-reaching analogies are established among (anti-) cooperative collective behaviors in chemical kinetics, (anti-)ferromagnetic spin models in statistical mechanics and operational amplifiers/flip-flops in cybernetics. The underlying modeling – based on spin systems – has been proved to be accurate for a wide class of systems matching classical (e.g. Michaelis–Menten, Hill, Adair) scenarios in the infinite-size approximation. However, the current research in biochemical information processing has been focusing on systems involving a relatively small number of units, where this approximation is no longer valid. Here we show that the whole statistical mechanical description of reaction kinetics can be re-formulated via a mechanical analogy – based on completely integrable hydrodynamic-type systems of PDEs – which provides explicit finite-size solutions, matching recently investigated phenomena (e.g. noise-induced cooperativity, stochastic bi-stability, quorum sensing). The resulting picture, successfully tested against a broad spectrum of data, constitutes a neat rationale for a numerically effective and theoretically consistent description of collective behaviors in biochemical reactions. PMID:27812018

  2. Selecting a Classification Ensemble and Detecting Process Drift in an Evolving Data Stream

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heredia-Langner, Alejandro; Rodriguez, Luke R.; Lin, Andy

    2015-09-30

    We characterize the commercial behavior of a group of companies in a common line of business using a small ensemble of classifiers on a stream of records containing commercial activity information. This approach is able to effectively find a subset of classifiers that can be used to predict company labels with reasonable accuracy. The performance of the ensemble (its error rate under stable conditions) can be characterized using an exponentially weighted moving average (EWMA) statistic. The behavior of the EWMA statistic can be used to monitor a record stream from the commercial network and determine when significant changes have occurred. Results indicate that larger classification ensembles may not necessarily be optimal, pointing to the need to search the combinatorial classifier space in a systematic way. Results also show that current and past performance of an ensemble can be used to detect when statistically significant changes in the activity of the network have occurred. The dataset used in this work contains tens of thousands of high-level commercial activity records with continuous and categorical variables and hundreds of labels, making classification challenging.
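
    A minimal sketch of the EWMA monitoring idea described above: batched classification error rates are smoothed with an exponentially weighted moving average, and a drift alarm is raised when the statistic exceeds a control limit derived from the stable error rate; the error stream, rates, batch size, and smoothing weight are synthetic assumptions.

      # EWMA control chart on batched error rates of a classifier ensemble.
      import numpy as np

      rng = np.random.default_rng(0)
      p0, lam, batch = 0.10, 0.2, 50            # in-control error rate, EWMA weight
      stream = np.concatenate([rng.random(600) < p0,          # stable period
                               rng.random(400) < 0.25])       # drifted period
      rates = stream.reshape(-1, batch).mean(axis=1)          # error rate per batch

      sigma_batch = np.sqrt(p0 * (1 - p0) / batch)
      ucl = p0 + 3 * np.sqrt(lam / (2 - lam)) * sigma_batch   # 3-sigma upper limit

      z = p0
      for t, r in enumerate(rates):
          z = lam * r + (1 - lam) * z
          if z > ucl:
              print(f"drift signalled at batch {t} (EWMA = {z:.3f}, UCL = {ucl:.3f})")
              break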

  3. Complete integrability of information processing by biochemical reactions

    NASA Astrophysics Data System (ADS)

    Agliari, Elena; Barra, Adriano; Dello Schiavo, Lorenzo; Moro, Antonio

    2016-11-01

    Statistical mechanics provides an effective framework to investigate information processing in biochemical reactions. Within such framework far-reaching analogies are established among (anti-) cooperative collective behaviors in chemical kinetics, (anti-)ferromagnetic spin models in statistical mechanics and operational amplifiers/flip-flops in cybernetics. The underlying modeling - based on spin systems - has been proved to be accurate for a wide class of systems matching classical (e.g. Michaelis-Menten, Hill, Adair) scenarios in the infinite-size approximation. However, the current research in biochemical information processing has been focusing on systems involving a relatively small number of units, where this approximation is no longer valid. Here we show that the whole statistical mechanical description of reaction kinetics can be re-formulated via a mechanical analogy - based on completely integrable hydrodynamic-type systems of PDEs - which provides explicit finite-size solutions, matching recently investigated phenomena (e.g. noise-induced cooperativity, stochastic bi-stability, quorum sensing). The resulting picture, successfully tested against a broad spectrum of data, constitutes a neat rationale for a numerically effective and theoretically consistent description of collective behaviors in biochemical reactions.

  4. Probing Majorana bound states via counting statistics of a single electron transistor

    PubMed Central

    Li, Zeng-Zhao; Lam, Chi-Hang; You, J. Q.

    2015-01-01

    We propose an approach for probing Majorana bound states (MBSs) in a nanowire via counting statistics of a nearby charge detector in the form of a single-electron transistor (SET). We consider the impacts on the counting statistics by both the local coupling between the detector and an adjacent MBS at one end of a nanowire and the nonlocal coupling to the MBS at the other end. We show that the Fano factor and the skewness of the SET current are minimized for a symmetric SET configuration in the absence of the MBSs or when coupled to a fermionic state. However, the minimum points of operation are shifted appreciably in the presence of the MBSs to asymmetric SET configurations with a higher tunnel rate at the drain than at the source. This feature persists even when varying the nonlocal coupling and the pairing energy between the two MBSs. We expect that these MBS-induced shifts can be measured experimentally with available technologies and can serve as important signatures of the MBSs. PMID:26098973
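
    As a toy illustration of the counting-statistics quantities mentioned above, the sketch below computes the Fano factor (variance over mean) and the skewness from a record of electron counts per time bin; the simulated stream is a generic Poisson process, not a model of the SET-MBS device.

      # Fano factor and skewness of a simulated electron-counting record.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(0)
      counts = rng.poisson(lam=40.0, size=50_000)            # electrons per time bin

      fano = counts.var() / counts.mean()                    # = 1 for pure Poisson
      skewness = stats.skew(counts)
      print(f"Fano factor ~ {fano:.3f}, skewness ~ {skewness:.3f}")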

  5. Complete integrability of information processing by biochemical reactions.

    PubMed

    Agliari, Elena; Barra, Adriano; Dello Schiavo, Lorenzo; Moro, Antonio

    2016-11-04

    Statistical mechanics provides an effective framework to investigate information processing in biochemical reactions. Within such framework far-reaching analogies are established among (anti-) cooperative collective behaviors in chemical kinetics, (anti-)ferromagnetic spin models in statistical mechanics and operational amplifiers/flip-flops in cybernetics. The underlying modeling - based on spin systems - has been proved to be accurate for a wide class of systems matching classical (e.g. Michaelis-Menten, Hill, Adair) scenarios in the infinite-size approximation. However, the current research in biochemical information processing has been focusing on systems involving a relatively small number of units, where this approximation is no longer valid. Here we show that the whole statistical mechanical description of reaction kinetics can be re-formulated via a mechanical analogy - based on completely integrable hydrodynamic-type systems of PDEs - which provides explicit finite-size solutions, matching recently investigated phenomena (e.g. noise-induced cooperativity, stochastic bi-stability, quorum sensing). The resulting picture, successfully tested against a broad spectrum of data, constitutes a neat rationale for a numerically effective and theoretically consistent description of collective behaviors in biochemical reactions.

  6. Diamagnetic currents

    NASA Astrophysics Data System (ADS)

    Macris, N.; Martin, Ph. A.; Pulé, J. V.

    1988-06-01

    We study the diamagnetic surface currents of particles in thermal equilibrium submitted to a constant magnetic field. The current density of independent electrons with Boltzmann (respectively Fermi) statistics has a Gaussian (respectively exponential) bound for its fall-off into the bulk. For a system of interacting particles at low activity with Boltzmann statistics, the current density is localized near the boundary and integrable when the two-body potential decays as |x|^{-α}, α > 4, in three dimensions. In all cases, the integral of the current density is independent of the nature of the confining wall and correctly related to the bulk magnetisation. The results hold for hard and soft walls and all field strengths. The analysis relies on the Feynman-Kac-Ito representation of the Gibbs state and on specific properties of the Brownian bridge process.

  7. Robust Statistical Approaches for RSS-Based Floor Detection in Indoor Localization.

    PubMed

    Razavi, Alireza; Valkama, Mikko; Lohan, Elena Simona

    2016-05-31

    Floor detection for indoor 3D localization of mobile devices is currently an important challenge in the wireless world. Many approaches currently exist, but usually the robustness of such approaches is not addressed or investigated. The goal of this paper is to show how to robustify the floor estimation when probabilistic approaches with a low number of parameters are employed. Indeed, such an approach would allow a building-independent estimation and a lower computing power at the mobile side. Four robustified algorithms are presented: a robust weighted centroid localization method, a robust linear trilateration method, a robust nonlinear trilateration method, and a robust deconvolution method. The proposed approaches use the received signal strengths (RSS) measured by the Mobile Station (MS) from various heard WiFi access points (APs) and provide an estimate of the vertical position of the MS, which can be used for floor detection. We will show that robustification can indeed increase the performance of the RSS-based floor detection algorithms.
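
    To make the first of the four methods concrete, the sketch below implements a simple robustified weighted-centroid estimate of the vertical position. It is not the authors' algorithm: the square-root tempering of the weights, the RSS clipping floor, and all AP coordinates and RSS values are illustrative assumptions.

```python
import numpy as np

def robust_weighted_centroid(ap_positions, rss_dbm, clip_dbm=-90.0):
    """Estimate device position (including height) from AP positions and RSS.

    ap_positions : (N, 3) array of known access-point coordinates [m].
    rss_dbm      : (N,) received signal strengths [dBm].
    clip_dbm     : floor applied to very weak, outlier-prone readings.
    """
    rss = np.maximum(np.asarray(rss_dbm, float), clip_dbm)
    # Convert dBm to linear power so stronger APs dominate, then temper the
    # influence of any single AP with a square root (a simple robustification).
    w = np.sqrt(10.0 ** (rss / 10.0))
    w /= w.sum()
    return w @ np.asarray(ap_positions, float)

# Toy example: four APs on two floors (z = 3 m and z = 6 m).
aps = np.array([[0, 0, 3], [10, 0, 3], [0, 10, 6], [10, 10, 6]], float)
rss = np.array([-55, -60, -80, -85], float)        # dBm, illustrative
est = robust_weighted_centroid(aps, rss)
print("estimated (x, y, z):", np.round(est, 2))    # z closer to 3 m -> lower floor
```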

  8. Observing Inflationary Reheating

    NASA Astrophysics Data System (ADS)

    Martin, Jérôme; Ringeval, Christophe; Vennin, Vincent

    2015-02-01

    Reheating is the epoch which connects inflation to the subsequent hot big-bang phase. Conceptually very important, this era is, however, observationally poorly known. We show that the current Planck satellite measurements of the cosmic microwave background (CMB) anisotropies constrain the kinematic properties of the reheating era for most of the inflationary models. This result is obtained by deriving the marginalized posterior distributions of the reheating parameter for about 200 models of slow-roll inflation. Weighted by the statistical evidence of each model to explain the data, we show that the Planck 2013 measurements induce an average reduction of the posterior-to-prior volume by 40%. Making some additional assumptions on reheating, such as specifying a mean equation of state parameter, or focusing the analysis on peculiar scenarios, can enhance or reduce this constraint. Our study also indicates that the Bayesian evidence of a model can substantially be affected by the reheating properties. The precision of the current CMB data is therefore such that estimating the observational performance of a model now requires incorporating information about its reheating history.

  9. Introduction of High Throughput Magnetic Resonance T2-Weighted Image Texture Analysis for WHO Grade 2 and 3 Gliomas.

    PubMed

    Kinoshita, Manabu; Sakai, Mio; Arita, Hideyuki; Shofuda, Tomoko; Chiba, Yasuyoshi; Kagawa, Naoki; Watanabe, Yoshiyuki; Hashimoto, Naoya; Fujimoto, Yasunori; Yoshimine, Toshiki; Nakanishi, Katsuyuki; Kanemura, Yonehiro

    2016-01-01

    Reports have suggested that tumor textures presented on T2-weighted images correlate with the genetic status of glioma. Therefore, development of an image analysis framework capable of objective, high-throughput image texture analysis for large-scale image data collections is needed. The current study aimed to address the development of such a framework by introducing two novel parameters for image textures on T2-weighted images, i.e., Shannon entropy and Prewitt filtering. Data were collected from 22 WHO grade 2 and 28 grade 3 glioma patients whose pre-surgical MRI and IDH1 mutation status were available. Heterogeneous lesions showed statistically higher Shannon entropy than homogeneous lesions (p = 0.006), and ROC curve analysis showed that Shannon entropy on T2WI was a reliable indicator for discrimination of homogeneous and heterogeneous lesions (p = 0.015, AUC = 0.73). Lesions with well-defined borders exhibited statistically higher Edge mean and Edge median values using Prewitt filtering than those with vague lesion borders (p = 0.0003 and p = 0.0005, respectively). ROC curve analysis also showed that both Edge mean and median values were promising indicators for discrimination of lesions with vague and well-defined borders, and both performed in a comparable manner (p = 0.0002, AUC = 0.81 and p < 0.0001, AUC = 0.83, respectively). Finally, IDH1 wild-type gliomas showed statistically lower Shannon entropy on T2WI than IDH1-mutated gliomas (p = 0.007), but no difference was observed between IDH1 wild-type and mutated gliomas in Edge median values using Prewitt filtering. The current study introduced two image metrics that reflect lesion texture described on T2WI. These two metrics were validated against the readings of a neuro-radiologist who was blinded to the results. This observation will facilitate further use of this technique in future large-scale image analyses of glioma.
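
    To make the two texture metrics concrete, the sketch below computes a histogram-based Shannon entropy and Prewitt edge mean/median values for a 2D region of interest. It is a schematic re-implementation under stated assumptions (whole-ROI statistics, a synthetic ROI), not the authors' validated pipeline.

```python
import numpy as np
from scipy.ndimage import prewitt

def shannon_entropy(roi, bins=256):
    """Shannon entropy (bits) of the grey-level histogram of a 2D ROI."""
    hist, _ = np.histogram(roi.ravel(), bins=bins)
    p = hist[hist > 0].astype(float)
    p /= p.sum()
    return float(-(p * np.log2(p)).sum())

def prewitt_edge_stats(roi):
    """Edge-magnitude mean and median after Prewitt filtering."""
    gx = prewitt(roi.astype(float), axis=0)
    gy = prewitt(roi.astype(float), axis=1)
    mag = np.hypot(gx, gy)
    return float(mag.mean()), float(np.median(mag))

# Synthetic "lesion" ROI, purely for demonstration.
rng = np.random.default_rng(1)
roi = rng.normal(100.0, 20.0, size=(64, 64))
print("Shannon entropy:", round(shannon_entropy(roi), 2))
print("Edge mean / median:", [round(v, 2) for v in prewitt_edge_stats(roi)])
```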

  10. [Sacral neuromodulation as second-line treatment strategy for lower urinary tract symptoms of various aetiologies: experience of a German high-volume clinic].

    PubMed

    Otto, W; Nowrotek, A; Burger, M; Wieland, W F; Rößler, W; Denzinger, S

    2012-05-01

    Lower urinary tract symptoms (LUTS) are a common and multiform micturition disorder of various possible origins. Several second-line techniques are available in the event of first-line medicinal treatment failure. These include the intravesical injection of Botulinum toxin, bladder augmentation and sacral neuromodulation (SNM). This study presents current data and results from a prospective study of patients with LUTS of various aetiologies. Clinical success was investigated for all patients who underwent SNM for LUTS with or without urge incontinence caused by chronic pelvic pain syndrome, multiple sclerosis and idiopathic disease between May 2007 and December 2010. The preoperatively determined symptoms were compared with current follow-up data. Median follow-up time was 11 months (1 - 43). A total of 47 patients were indicated for SNM over the investigated period. Of these, 80.9% were female; median patient age was 67 years (19 - 84). The testing phase was successful in 38 cases (80.9%), with 9 electrodes being explanted (19.1%). In the case of idiopathic LUTS we could show a statistically significant increase of micturition volume and reduction of incontinence pad use. There was no statistically significant improvement of any micturition parameter for patients with multiple sclerosis, whereas patients with chronic pelvic pain syndrome showed a statistically significant reduction of micturition frequency and a subjective improvement of symptoms in 75%. In the selected patient groups SNM is a promising and, in experienced hands, a low-complication second-line therapy for the treatment of LUTS of idiopathic aetiology. However, a general recommendation of SNM for multiple sclerosis and chronic pelvic pain syndrome patients cannot be given on the basis of our results. Further prospective, randomised multicentre studies are needed to further refine the indications for SNM in LUTS of neurogenic and non-neurogenic origins. © Georg Thieme Verlag KG Stuttgart · New York.

  11. The persistent signature of tropical cyclones in ambient seismic noise

    NASA Astrophysics Data System (ADS)

    Gualtieri, Lucia; Camargo, Suzana J.; Pascale, Salvatore; Pons, Flavio M. E.; Ekström, Göran

    2018-02-01

    The spectrum of ambient seismic noise shows strong signals associated with tropical cyclones, yet a detailed understanding of these signals and the relationship between them and the storms is currently lacking. Through the analysis of more than a decade of seismic data recorded at several stations located in and adjacent to the northwest Pacific Ocean, here we show that there is a persistent and frequency-dependent signature of tropical cyclones in ambient seismic noise that depends on characteristics of the storm and on the detailed location of the station relative to the storm. An adaptive statistical model shows that the spectral amplitude of ambient seismic noise, and notably of the short-period secondary microseisms, has a strong relationship with tropical cyclone intensity and can be employed to extract information on the tropical cyclones.

  12. Sociological Paradoxes and Graduate Statistics Classes. A Response to "The Sociology of Teaching Graduate Statistics"

    ERIC Educational Resources Information Center

    Hardy, Melissa

    2005-01-01

    This article presents a response to Timothy Patrick Moran's article "The Sociology of Teaching Graduate Statistics." In his essay, Moran argues that exciting developments in techniques of quantitative analysis are currently coupled with a much less exciting formulaic approach to teaching sociology graduate students about quantitative analysis. The…

  13. 12 CFR Appendix A to Subpart B of... - Risk-Based Capital Test Methodology and Specifications

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Statistical Rating Organization (NRSRO) for this security, as of the reporting date Wt Avg Gross Margin Gross... Nationally Recognized Statistical Rating Organization (NRSRO) for this security, as of the reporting date [c... The most current rating issued by any Nationally Recognized Statistical Rating Organization (NRSRO...

  14. Forest statistics for New Hampshire

    Treesearch

    Thomas S. Frieswyk; Anne M. Malley

    1985-01-01

    This is a statistical report on the fourth forest survey of New Hampshire conducted in 1982-83 by the Forest Inventory and Analysis Unit, Northeastern Forest Experiment Station. Statistics for forest area, numbers of trees, timber volume, tree biomass, and timber products output are displayed at the state, unit, and county levels. The current inventory indicates that...

  15. Forest Statistics for Pennsylvania - 1978

    Treesearch

    Thomas J. Considine; Douglas S. Powell

    1980-01-01

    A statistical report on the third forest survey of Pennsylvania conducted in 1977 and 1978. Statistical findings are based on data from remeasured 115-acre plots and both remeasured and new 10-point variable-radius plots. The current status of forestland area, timber volume, and annual growth and removals is presented. Timber products output by timber industries, based...

  16. Using eddy currents for noninvasive in vivo pH monitoring for bone tissue engineering.

    PubMed

    Beck-Broichsitter, Benedicta E; Daschner, Frank; Christofzik, David W; Knöchel, Reinhard; Wiltfang, Jörg; Becker, Stephan T

    2015-03-01

    The metabolic processes that regulate bone healing and bone induction in tissue engineering models are not fully understood. Eddy current excitation is widely used in technical approaches and in the food industry. The aim of this study was to establish eddy current excitation for monitoring metabolic processes during heterotopic osteoinduction in vivo. Hydroxyapatite scaffolds were implanted into the musculus latissimus dorsi of six rats. Bone morphogenetic protein 2 (BMP-2) was applied 1 and 2 weeks after implantation. Weekly eddy current excitation measurements were performed. Additionally, invasive pH measurements were obtained from the scaffolds using fiber optic detection devices. Correlations between the eddy current measurements and the metabolic values were calculated. The eddy current measurements and pH values decreased significantly in the first 2 weeks of the study, followed by a steady increase and stabilization at higher levels towards the end of the study. The measurement curves and statistical evaluations indicated a significant correlation between the resonance frequency values of the eddy current excitation measurements and the observed pH levels (p = 0.0041). This innovative technique was capable of noninvasively monitoring metabolic processes in living tissues according to pH values, showing a direct correlation between eddy current excitation and pH in an in vivo tissue engineering model.

  17. Enhanced Component Performance Study: Emergency Diesel Generators 1998–2014

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schroeder, John Alton

    2015-11-01

    This report presents an enhanced performance evaluation of emergency diesel generators (EDGs) at U.S. commercial nuclear power plants. It evaluates component performance over time using (1) Institute of Nuclear Power Operations (INPO) Consolidated Events Database (ICES) data from 1998 through 2014 and (2) maintenance unavailability (UA) performance data from Mitigating Systems Performance Index (MSPI) Basis Document data from 2002 through 2014. The objective is to show estimates of current failure probabilities and rates related to EDGs, trend these data on an annual basis, determine if the current data are consistent with the probability distributions currently recommended for use in NRC probabilistic risk assessments, show how the reliability data differ for different EDG manufacturers and for EDGs with different ratings, and summarize the subcomponents, causes, detection methods, and recovery associated with each EDG failure mode. Engineering analyses were performed with respect to time period and failure mode without regard to the actual number of EDGs at each plant. The factors analyzed are: sub-component, failure cause, detection method, recovery, manufacturer, and EDG rating. Six trends with varying degrees of statistical significance were identified in the data.

  18. Randomized, double-blind, comparative study on efficacy and safety of itraconazole pulse therapy and terbinafine pulse therapy on nondermatophyte mold onychomycosis: A study with 90 patients.

    PubMed

    Ranawaka, Ranthilaka R; Nagahawatte, Ajith; Gunasekara, Thusitha Aravinda; Weerakoon, Hema S; de Silva, S H Padmal

    2016-08-01

    Nondermatophyte mold (NDM) onychomycosis shows poor response to current topical, oral or device-related antifungal therapies. This study was aimed to determine the efficacy and safety of itraconazole and terbinafine pulse therapy on NDM onychomycosis. Mycologically proven subjects were treated with itraconazole 400 mg daily or terbinafine 500 mg daily for 7 days/month; two pulses for fingernails and three pulses for toenails (SLCTR/2013/013). One-hundred seventy-eight patients underwent mycological studies and 148 had positive fungal isolates. NDM were the prevailing fungi (68.2%), followed by Candida species (21.6%); dermatophytes made up only 10.1%. Among the NDM, Aspergillus spp. (75.1%) predominated, followed by Fusarium spp. (8.9%) and Penicillium spp. (4.95%). The clinical cure rate at completion of pulse therapy was significantly higher in the itraconazole group (9.2% versus 2.0%, p < 0.05). However, no statistically significant difference was detected between the two regimens at the end of 12 months (65.1% versus 54.64%). Recurrences observed in both groups (6.5% vs. 4.1%) were not statistically significant. With itraconazole pulses, clinical cure was achieved in 68.22% of Aspergillus spp., 50.0% of Fusarium spp. and 84.6% of Penicillium spp. infections, while terbinafine pulses cured 55.0% of Aspergillus spp. and 50.0% of Fusarium spp. infections. NDM were the prevailing fungi in onychomycosis in Sri Lanka. Both itraconazole and terbinafine were partially effective on NDM onychomycosis, showing a clinical cure of 54-65%. Future research should focus on finding more effective antifungals for NDM onychomycosis.

  19. Integration of modern statistical tools for the analysis of climate extremes into the web-GIS “CLIMATE”

    NASA Astrophysics Data System (ADS)

    Ryazanova, A. A.; Okladnikov, I. G.; Gordov, E. P.

    2017-11-01

    The frequency of occurrence and magnitude of precipitation and temperature extreme events show positive trends in several geographical regions. These events must be analyzed and studied in order to better understand their impact on the environment, predict their occurrences, and mitigate their effects. For this purpose, we augmented the web-GIS called “CLIMATE” to include a dedicated statistical package developed in the R language. The web-GIS “CLIMATE” is a software platform for cloud storage, processing and visualization of distributed archives of spatial datasets. It is based on a combined use of web and GIS technologies with reliable procedures for searching, extracting, processing, and visualizing the spatial data archives. The system provides a set of thematic online tools for the complex analysis of current and future climate changes and their effects on the environment. The package includes new powerful methods: time-dependent statistics of extremes, quantile regression, and a copula approach for the detailed analysis of various climate extreme events. Specifically, the very promising copula approach allows one to obtain the structural connections between the extremes and the various environmental characteristics. The new statistical methods integrated into the web-GIS “CLIMATE” can significantly facilitate and accelerate the complex analysis of climate extremes using only a desktop PC connected to the Internet.
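
    The statistical package described here is written in R; purely as a language-neutral illustration of one of the listed techniques (statistics of extremes), the Python sketch below fits a generalized extreme value (GEV) distribution to annual block maxima and derives a return level. The synthetic data and the stationarity assumption are illustrative only.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(42)

# Synthetic daily temperature anomalies for 50 "years" of 365 days each,
# used only to produce annual block maxima for the illustration.
daily = rng.normal(0.0, 3.0, size=(50, 365))
annual_maxima = daily.max(axis=1)

# Fit a stationary GEV (shape parameter c follows the scipy sign convention).
c, loc, scale = genextreme.fit(annual_maxima)

# 20-year return level: the value exceeded on average once every 20 years.
return_level_20 = genextreme.ppf(1.0 - 1.0 / 20.0, c, loc=loc, scale=scale)
print(f"GEV shape={c:.2f}, loc={loc:.2f}, scale={scale:.2f}")
print(f"20-year return level ~ {return_level_20:.2f}")
```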

  20. Statistics of Magnetic Reconnection X-Lines in Kinetic Turbulence

    NASA Astrophysics Data System (ADS)

    Haggerty, C. C.; Parashar, T.; Matthaeus, W. H.; Shay, M. A.; Wan, M.; Servidio, S.; Wu, P.

    2016-12-01

    In this work we examine the statistics of magnetic reconnection (x-lines) and their associated reconnection rates in intermittent current sheets generated in turbulent plasmas. Although such statistics have been studied previously for fluid simulations (e.g. [1]), they have not yet been generalized to fully kinetic particle-in-cell (PIC) simulations. A significant problem with PIC simulations, however, is the electrostatic fluctuations generated by numerical particle counting statistics. We find that analyzing gradients of the magnetic vector potential from the raw PIC field data identifies numerous artificial (or non-physical) x-points. Using small Orszag-Tang vortex PIC simulations, we analyze x-line identification and show that these artificial x-lines can be removed using sub-Debye length filtering of the data. We examine how turbulent properties such as the magnetic spectrum and scale-dependent kurtosis are affected by particle noise and sub-Debye length filtering. We subsequently apply these analysis methods to a large-scale kinetic PIC turbulence simulation. Consistent with previous fluid models, we find a range of normalized reconnection rates as large as ½, but with the bulk of the rates below approximately 0.1. [1] Servidio, S., W. H. Matthaeus, M. A. Shay, P. A. Cassak, and P. Dmitruk (2009), Magnetic reconnection and two-dimensional magnetohydrodynamic turbulence, Phys. Rev. Lett., 102, 115003.
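
    As a schematic of the x-point identification step mentioned above (locating critical points of the magnetic vector potential), the sketch below finds near-zero-gradient cells of a 2D potential A(x, y) and classifies saddles by the sign of the Hessian determinant. The test potential, the gradient threshold, and the absence of any sub-Debye filtering are all simplifying assumptions; this is not the authors' analysis code.

```python
import numpy as np

def find_x_points(A, dx=1.0, dy=1.0, tol=0.05):
    """Return grid indices of saddle (X-type) critical points of a 2D potential A."""
    Ay, Ax = np.gradient(A, dy, dx)           # first derivatives (d/dy, d/dx)
    Axy, Axx = np.gradient(Ax, dy, dx)        # second derivatives of Ax
    Ayy, Ayx = np.gradient(Ay, dy, dx)        # second derivatives of Ay
    hess_det = Axx * Ayy - Axy * Ayx          # Hessian determinant

    grad_mag = np.hypot(Ax, Ay)
    near_critical = grad_mag < tol * grad_mag.max()
    return np.argwhere(near_critical & (hess_det < 0))   # saddle => X-point

# Orszag-Tang-like test potential (illustrative only, not PIC field data).
x = np.linspace(0.0, 2.0 * np.pi, 256)
X, Y = np.meshgrid(x, x)
A = np.cos(X) + 0.5 * np.cos(2.0 * Y)
print("candidate X-point cells:", len(find_x_points(A)))
```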

  1. Statistical analysis on the concordance of the radiological evaluation of fractures of the distal radius subjected to traction

    PubMed Central

    Machado, Daniel Gonçalves; da Cruz Cerqueira, Sergio Auto; de Lima, Alexandre Fernandes; de Mathias, Marcelo Bezerra; Aramburu, José Paulo Gabbi; Rodarte, Rodrigo Ribeiro Pinho

    2016-01-01

    Objective: The objective of this study was to evaluate the current classifications for fractures of the distal extremity of the radius, since the classifications made using traditional radiographs in anteroposterior and lateral views have been questioned regarding their reproducibility. In the literature, it has been suggested that other options are needed, such as use of preoperative radiographs on fractures of the distal radius subjected to traction, with stratification by the evaluators. The aim was to demonstrate which classification systems present better statistical reliability. Results: In the Universal classification, the results from the third-year resident group (R3) and from the group of more experienced evaluators (Staff) presented excellent correlation, with a statistically significant p-value (p < 0.05). Neither of the groups presented a statistically significant result through the Frykman classification. In the AO classification, there were high correlations in the R3 and Staff groups (respectively 0.950 and 0.800), with p-values lower than 0.05 (respectively <0.001 and 0.003). Conclusion: It can be concluded that radiographs performed under traction showed good concordance in the Staff group and in the R3 group, and that this is a good tactic for radiographic evaluations of fractures of the distal extremity of the radius. PMID:26962498
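
    Concordance analyses of this kind typically rely on chance-corrected agreement statistics. As a hedged illustration only (the study's own correlation measure may differ), the sketch below computes Cohen's kappa between two raters with scikit-learn on invented labels.

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical classifications of 12 radiographs by two evaluator groups
# (e.g., AO main groups A/B/C); the labels are invented for illustration.
rater_r3    = ["A", "A", "B", "C", "B", "A", "C", "C", "B", "A", "B", "C"]
rater_staff = ["A", "A", "B", "C", "B", "A", "C", "B", "B", "A", "B", "C"]

kappa = cohen_kappa_score(rater_r3, rater_staff)
print(f"Cohen's kappa ~ {kappa:.2f}")   # 1.0 would indicate perfect agreement
```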

  2. Amplitude analysis and the nature of the Z c(3900)

    DOE PAGES

    Pilloni, A.; Fernandez-Ramirez, C.; Jackura, A.; ...

    2017-06-21

    The microscopic nature of the XYZ states remains an unsettled topic. We show how a thorough amplitude analysis of the data can help constrain models of these states. Specifically, we consider the case of the Z c(3900) peak and discuss possible scenarios of a QCD state, virtual state, or a kinematical enhancement. Here, we conclude that current data are not precise enough to distinguish between these hypotheses; however, the method we propose, when applied to the forthcoming high-statistics measurements, should shed light on the nature of these exotic enhancements.

  3. Amplitude analysis and the nature of the Z c(3900)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pilloni, A.; Fernandez-Ramirez, C.; Jackura, A.

    The microscopic nature of the XYZ states remains an unsettled topic. We show how a thorough amplitude analysis of the data can help constrain models of these states. Specifically, we consider the case of the Z c(3900) peak and discuss possible scenarios of a QCD state, virtual state, or a kinematical enhancement. Here, we conclude that current data are not precise enough to distinguish between these hypotheses; however, the method we propose, when applied to the forthcoming high-statistics measurements, should shed light on the nature of these exotic enhancements.

  4. Vesicle Motion during Sustained Exocytosis in Chromaffin Cells: Numerical Model Based on Amperometric Measurements.

    PubMed

    Jarukanont, Daungruthai; Bonifas Arredondo, Imelda; Femat, Ricardo; Garcia, Martin E

    2015-01-01

    Chromaffin cells release catecholamines by exocytosis, a process that includes vesicle docking, priming and fusion. Although all these steps have been intensively studied, some aspects of their mechanisms, particularly those regarding vesicle transport to the active sites situated at the membrane, are still unclear. In this work, we show that it is possible to extract information on vesicle motion in Chromaffin cells from the combination of Langevin simulations and amperometric measurements. We developed a numerical model based on Langevin simulations of vesicle motion towards the cell membrane and on the statistical analysis of vesicle arrival times. We also performed amperometric experiments in bovine-adrenal Chromaffin cells under Ba2+ stimulation to capture neurotransmitter releases during sustained exocytosis. In the sustained phase, each amperometric peak can be related to a single release from a new vesicle arriving at the active site. The amperometric signal can then be mapped into a spike-series of release events. We normalized the spike-series resulting from the current peaks using a time-rescaling transformation, thus making signals coming from different cells comparable. We discuss why the obtained spike-series may contain information about the motion of all vesicles leading to release of catecholamines. We show that the release statistics in our experiments considerably deviate from Poisson processes. Moreover, the interspike-time probability is reasonably well described by two-parameter gamma distributions. In order to interpret this result we computed the vesicles' arrival statistics from our Langevin simulations. As expected, assuming purely diffusive vesicle motion we obtain Poisson statistics. However, if we assume that all vesicles are guided toward the membrane by an attractive harmonic potential, simulations also lead to gamma distributions of the interspike-time probability, in remarkably good agreement with experiment. We also show that including the fusion-time statistics in our model does not produce any significant changes on the results. These findings indicate that the motion of the whole ensemble of vesicles towards the membrane is directed and reflected in the amperometric signals. Our results confirm the conclusions of previous imaging studies performed on single vesicles that vesicles' motion underneath plasma membranes is not purely random, but biased towards the membrane.
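
    To illustrate the interspike-time analysis described above, the sketch below fits a two-parameter gamma distribution to a set of intervals and compares it against the exponential (Poisson) expectation. The intervals here are synthetic stand-ins; in the paper they come from time-rescaled amperometric spike trains.

```python
import numpy as np
from scipy.stats import gamma, kstest

rng = np.random.default_rng(7)

# Illustrative "interspike" intervals, drawn from a gamma law with shape 2 to
# mimic directed (non-Poissonian) vesicle arrivals; real data would replace this.
isi = rng.gamma(shape=2.0, scale=0.5, size=5000)

# Fit a two-parameter gamma distribution (location fixed at zero).
shape_hat, _, scale_hat = gamma.fit(isi, floc=0)
print(f"fitted shape ~ {shape_hat:.2f}, scale ~ {scale_hat:.2f}")

# A pure Poisson process would give shape ~ 1 (exponential intervals);
# shape > 1 reflects the deviation from Poisson statistics reported above.
print(kstest(isi, "gamma", args=(shape_hat, 0, scale_hat)))
```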

  5. Vesicle Motion during Sustained Exocytosis in Chromaffin Cells: Numerical Model Based on Amperometric Measurements

    PubMed Central

    Jarukanont, Daungruthai; Bonifas Arredondo, Imelda; Femat, Ricardo; Garcia, Martin E.

    2015-01-01

    Chromaffin cells release catecholamines by exocytosis, a process that includes vesicle docking, priming and fusion. Although all these steps have been intensively studied, some aspects of their mechanisms, particularly those regarding vesicle transport to the active sites situated at the membrane, are still unclear. In this work, we show that it is possible to extract information on vesicle motion in Chromaffin cells from the combination of Langevin simulations and amperometric measurements. We developed a numerical model based on Langevin simulations of vesicle motion towards the cell membrane and on the statistical analysis of vesicle arrival times. We also performed amperometric experiments in bovine-adrenal Chromaffin cells under Ba2+ stimulation to capture neurotransmitter releases during sustained exocytosis. In the sustained phase, each amperometric peak can be related to a single release from a new vesicle arriving at the active site. The amperometric signal can then be mapped into a spike-series of release events. We normalized the spike-series resulting from the current peaks using a time-rescaling transformation, thus making signals coming from different cells comparable. We discuss why the obtained spike-series may contain information about the motion of all vesicles leading to release of catecholamines. We show that the release statistics in our experiments considerably deviate from Poisson processes. Moreover, the interspike-time probability is reasonably well described by two-parameter gamma distributions. In order to interpret this result we computed the vesicles’ arrival statistics from our Langevin simulations. As expected, assuming purely diffusive vesicle motion we obtain Poisson statistics. However, if we assume that all vesicles are guided toward the membrane by an attractive harmonic potential, simulations also lead to gamma distributions of the interspike-time probability, in remarkably good agreement with experiment. We also show that including the fusion-time statistics in our model does not produce any significant changes on the results. These findings indicate that the motion of the whole ensemble of vesicles towards the membrane is directed and reflected in the amperometric signals. Our results confirm the conclusions of previous imaging studies performed on single vesicles that vesicles’ motion underneath plasma membranes is not purely random, but biased towards the membrane. PMID:26675312

  6. Post-exposure treatments for Ebola and Marburg virus infections.

    PubMed

    Cross, Robert W; Mire, Chad E; Feldmann, Heinz; Geisbert, Thomas W

    2018-06-01

    The filoviruses - Ebola virus and Marburg virus - cause lethal haemorrhagic fever in humans and non-human primates (NHPs). Filoviruses present a global health threat both as naturally acquired diseases and as potential agents of bioterrorism. In the recent 2013-2016 outbreak of Ebola virus, the most promising therapies for post-exposure use with demonstrated efficacy in the gold-standard NHP models of filovirus disease were unable to show statistically significant protection in patients infected with Ebola virus. This Review briefly discusses these failures and what has been learned from these experiences, and summarizes the current status of post-exposure medical countermeasures in development, including antibodies, small interfering RNA and small molecules. We outline how our current knowledge could be applied to the identification of novel interventions and ways to use interventions more effectively.

  7. Variability and reliability analysis in self-assembled multichannel carbon nanotube field-effect transistors

    NASA Astrophysics Data System (ADS)

    Hu, Zhaoying; Tulevski, George S.; Hannon, James B.; Afzali, Ali; Liehr, Michael; Park, Hongsik

    2015-06-01

    Carbon nanotubes (CNTs) have been widely studied as a channel material of scaled transistors for high-speed and low-power logic applications. In order to have sufficient drive current, it is widely assumed that CNT-based logic devices will have multiple CNTs in each channel. Understanding the effects of the number of CNTs on device performance can aid in the design of CNT field-effect transistors (CNTFETs). We have fabricated multi-CNT-channel CNTFETs with an 80-nm channel length using precise self-assembly methods. We describe compact statistical models and Monte Carlo simulations to analyze failure probability and the variability of the on-state current and threshold voltage. The results show that multichannel CNTFETs are more resilient to process variation and random environmental fluctuations than single-CNT devices.
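
    As a hedged illustration of why multi-CNT channels are more resilient, the toy Monte Carlo below draws a per-tube on-current and an independent failure event for each CNT, then tracks how the relative spread of the device current and the probability of a fully open channel change with the number of tubes. The distributions and parameter values are assumptions for illustration, not the authors' compact models.

```python
import numpy as np

rng = np.random.default_rng(3)

def channel_current_stats(n_cnt, n_devices=100_000,
                          mean_uA=5.0, sigma_uA=1.5, p_fail=0.05):
    """Toy Monte Carlo of device on-current for n_cnt tubes per channel."""
    currents = rng.normal(mean_uA, sigma_uA, size=(n_devices, n_cnt))
    alive = rng.random((n_devices, n_cnt)) > p_fail      # tube conducts or not
    device_current = (currents * alive).sum(axis=1)
    open_fail = (alive.sum(axis=1) == 0).mean()          # no conducting tube at all
    return device_current.mean(), device_current.std(), open_fail

for n in (1, 4, 16):
    mu, sd, pf = channel_current_stats(n)
    print(f"{n:2d} CNTs: mean={mu:6.1f} uA, relative spread={sd / mu:.2f}, open-fail prob={pf:.3g}")
```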

  8. Critical Current Statistics of a Graphene-Based Josephson Junction Infrared Single Photon Detector

    NASA Astrophysics Data System (ADS)

    Walsh, Evan D.; Lee, Gil-Ho; Efetov, Dmitri K.; Heuck, Mikkel; Crossno, Jesse; Taniguchi, Takashi; Watanabe, Kenji; Ohki, Thomas A.; Kim, Philip; Englund, Dirk; Fong, Kin Chung

    Graphene is a promising material for single photon detection due to its broadband absorption and exceptionally low specific heat. We present a photon detector using a graphene sheet as the weak link in a Josephson junction (JJ) to form a threshold detector for single infrared photons. Calculations show that such a device could experience temperature changes of a few hundred percent leading to sub-Hz dark count rates and internal efficiencies approaching unity. We have fabricated the graphene-based JJ (gJJ) detector and measure switching events that are consistent with single photon detection under illumination by an attenuated laser. We study the physical mechanism for these events through the critical current behavior of the gJJ as a function of incident photon flux.

  9. Temperature Dependent Electron Transport Properties of Gold Nanoparticles and Composites: Scanning Tunneling Spectroscopy Investigations.

    PubMed

    Patil, Sumati; Datar, Suwarna; Dharmadhikari, C V

    2018-03-01

    Scanning tunneling spectroscopy (STS) is used for investigating variations in the electronic properties of gold nanoparticles (AuNPs) and their composite with a urethane-methacrylate comb polymer (UMCP) as a function of temperature. Films are prepared by drop casting AuNPs and UMCP in the desired manner on silicon substrates. Samples are further analyzed for morphology under scanning electron microscopy (SEM) and atomic force microscopy (AFM). STS measurements performed in the temperature range of 33 °C to 142 °C show systematic variation in current versus voltage (I-V) curves, exhibiting a semiconducting-to-metallic transition/Schottky behavior for different samples, depending upon preparation method and as a function of temperature. During current versus time (I-t) measurements for AuNPs, random telegraphic noise is observed at room temperature. Random switching of the tunneling current between two discrete levels is observed for this sample. Power spectra derived from I-t show a 1/f^2 dependence. Statistical analysis of fluctuations shows exponential behavior with time width τ ≈ 7 ms. Local density of states (LDOS) plots derived from I-V curves of each sample show a systematic shift in the valence/conduction band edge towards/away from the Fermi level with increasing temperature. Schottky emission is the best-fitting electron emission mechanism for all samples over a certain range of bias voltage. Schottky plots are used to calculate barrier heights, and temperature-dependent measurements helped in determining activation energies for electron transport in all samples.
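
    Barrier heights of the kind mentioned above are commonly extracted from a Richardson-type (Schottky) plot, i.e., a linear fit of ln(I/T^2) versus 1/T. The sketch below performs such a fit on synthetic data; the barrier value, prefactor, and temperature grid are illustrative assumptions, not the study's measurements.

```python
import numpy as np

K_B = 8.617e-5          # Boltzmann constant in eV/K

# Synthetic temperature-dependent currents obeying Schottky-type emission,
# I ~ T^2 exp(-phi_eff / kT); phi_eff and the prefactor are illustrative.
T = np.linspace(306.0, 415.0, 12)                  # roughly 33 C to 142 C, in kelvin
phi_true = 0.45                                    # effective barrier height, eV
I = 1e-3 * T**2 * np.exp(-phi_true / (K_B * T))    # arbitrary current units

# Richardson/Schottky plot: ln(I/T^2) vs 1/T is linear with slope -phi/k_B.
slope, _ = np.polyfit(1.0 / T, np.log(I / T**2), 1)
phi_fit = -slope * K_B
print(f"extracted barrier height ~ {phi_fit:.3f} eV")   # ~0.45 eV by construction
```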

  10. The effects of academic grouping on student performance in science

    NASA Astrophysics Data System (ADS)

    Scoggins, Sally Smykla

    The current action research study explored how student placement in heterogeneous or homogeneous classes in seventh-grade science affected students' eighth-grade Science State of Texas Assessment of Academic Readiness (STAAR) scores, and how ability grouping affected students' scores based on race and socioeconomic status. The population included all eighth-grade students in the target district who took the regular eighth-grade science STAAR over four academic school years. The researcher ran three statistical tests: a t-test for independent samples, a one-way between subjects analysis of variance (ANOVA) and a two-way between subjects ANOVA. The results showed no statistically significant difference between eighth-grade Pre-AP students from seventh-grade Pre-AP classes and eighth-grade Pre-AP students from heterogeneous seventh-grade classes and no statistically significant difference between Pre-AP students' scores based on socioeconomic status. There was no statistically significant interaction between socioeconomic status and the seventh-grade science classes. The scores between regular eighth-grade students who were in heterogeneous seventh-grade classes were statistically significantly higher than the scores of regular eighth-grade students who were in regular seventh-grade classes. The results also revealed that the scores of students who were White were statistically significantly higher than the scores of students who were Black and Hispanic. Black and Hispanic scores did not differ significantly. Further results indicated that the STAAR Level II and Level III scores were statistically significantly higher for the Pre-AP eighth-grade students who were in heterogeneous seventh-grade classes than the STAAR Level II and Level III scores of Pre-AP eighth-grade students who were in Pre-AP seventh-grade classes.

  11. Current algebra, statistical mechanics and quantum models

    NASA Astrophysics Data System (ADS)

    Vilela Mendes, R.

    2017-11-01

    Results obtained in the past for free boson systems at zero and nonzero temperatures are revisited to clarify the physical meaning of current algebra reducible functionals which are associated to systems with density fluctuations, leading to observable effects on phase transitions. To use current algebra as a tool for the formulation of quantum statistical mechanics amounts to the construction of unitary representations of diffeomorphism groups. Two mathematical equivalent procedures exist for this purpose. One searches for quasi-invariant measures on configuration spaces, the other for a cyclic vector in Hilbert space. Here, one argues that the second approach is closer to the physical intuition when modelling complex systems. An example of application of the current algebra methodology to the pairing phenomenon in two-dimensional fermion systems is discussed.

  12. Disparities in Minority Promotion Rates: A Total Quality Approach

    DTIC Science & Technology

    1992-01-01

    UCL - p + 3 x.’ { p ( I - p) / n data, The statistical theory of logistic regression is beyond the scope of this report. Several computer statistical ... Statistics . Richard D. Irwin, Inc., Homewood IL: 1986. Feagin, J. R., Discrimination 4merican style: Institutional racism and sexism . Englewood Cliffs...current year data and the previous three years. Data for fiscal year One purpose of this project is to provide a statistical 1987, 1988, 1989, 1990, and

  13. Evaluation of the Williams-type spring wheat model in North Dakota and Minnesota

    NASA Technical Reports Server (NTRS)

    Leduc, S. (Principal Investigator)

    1982-01-01

    The Williams-type model, developed similarly to previous models of C.V.D. Williams, uses monthly temperature and precipitation data as well as soil and topological variables to predict the yield of the spring wheat crop. The models are statistically developed using the regression technique. Eight model characteristics are examined in the evaluation of the model. Evaluation is at the crop reporting district level, the state level, and for the entire region. A ten-year bootstrap test was the basis of the statistical evaluation. The accuracy and the current indication of modeled yield reliability could be improved. There is great variability in the bias measured over the districts, but there is a slight overall positive bias. The model estimates for the east central crop reporting district in Minnesota are not accurate. The estimates of yield for 1974 were inaccurate for all of the models.

  14. ON INTERMITTENT TURBULENCE HEATING OF THE SOLAR WIND: DIFFERENCES BETWEEN TANGENTIAL AND ROTATIONAL DISCONTINUITIES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang Xin; Tu Chuanyi; He Jiansen

    The intermittent structures in solar wind turbulence, studied by using measurements from the WIND spacecraft, are identified as being mostly rotational discontinuities (RDs) and rarely tangential discontinuities (TDs) based on the technique described by Smith. Only TD-associated current sheets (TCSs) are found to be accompanied by strong local heating of the solar wind plasma. Statistical results show that the TCSs have a distinct tendency to be associated with local enhancements of the proton temperature, density, and plasma beta, and a local decrease of magnetic field magnitude. Conversely, for RDs, our statistical results do not reveal convincing heating effects. These results confirm the notion that dissipation of solar wind turbulence can take place in intermittent or locally isolated small-scale regions which correspond to TCSs. The possibility of heating associated with RDs is discussed.

  15. Taking Ockham's razor to enzyme dynamics and catalysis.

    PubMed

    Glowacki, David R; Harvey, Jeremy N; Mulholland, Adrian J

    2012-01-29

    The role of protein dynamics in enzyme catalysis is a matter of intense current debate. Enzyme-catalysed reactions that involve significant quantum tunnelling can give rise to experimental kinetic isotope effects with complex temperature dependences, and it has been suggested that standard statistical rate theories, such as transition-state theory, are inadequate for their explanation. Here we introduce aspects of transition-state theory relevant to the study of enzyme reactivity, taking cues from chemical kinetics and dynamics studies of small molecules in the gas phase and in solution--where breakdowns of statistical theories have received significant attention and their origins are relatively better understood. We discuss recent theoretical approaches to understanding enzyme activity and then show how experimental observations for a number of enzymes may be reproduced using a transition-state-theory framework with physically reasonable parameters. Essential to this simple model is the inclusion of multiple conformations with different reactivity.
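
    For reference, the transition-state-theory rate expression underlying the framework discussed above can be written as follows, with a transmission coefficient that can absorb tunnelling and recrossing corrections; this is the standard Eyring form rather than anything specific to the authors' model.

```latex
% Eyring / transition-state-theory rate constant; kappa(T) is a transmission
% coefficient that can absorb tunnelling and recrossing corrections.
\begin{equation}
  k(T) \;=\; \kappa(T)\,\frac{k_B T}{h}\,
             \exp\!\left(-\frac{\Delta G^{\ddagger}}{R T}\right).
\end{equation}
```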

  16. Nonequilibrium critical behavior of model statistical systems and methods for the description of its features

    NASA Astrophysics Data System (ADS)

    Prudnikov, V. V.; Prudnikov, P. V.; Mamonova, M. V.

    2017-11-01

    This paper reviews features in critical behavior of far-from-equilibrium macroscopic systems and presents current methods of describing them by referring to some model statistical systems such as the three-dimensional Ising model and the two-dimensional XY model. The paper examines the critical relaxation of homogeneous and structurally disordered systems subjected to abnormally strong fluctuation effects involved in ordering processes in solids at second-order phase transitions. Interest in such systems is due to the aging properties and fluctuation-dissipation theorem violations predicted for and observed in systems slowly evolving from a nonequilibrium initial state. It is shown that these features of nonequilibrium behavior show up in the magnetic properties of magnetic superstructures consisting of alternating nanoscale-thick magnetic and nonmagnetic layers and can be observed not only near the film’s critical ferromagnetic ordering temperature Tc, but also over the wide temperature range T ⩽ Tc.

  17. Earthquake forecasting during the complex Amatrice-Norcia seismic sequence

    PubMed Central

    Marzocchi, Warner; Taroni, Matteo; Falcone, Giuseppe

    2017-01-01

    Earthquake forecasting is the ultimate challenge for seismologists, because it condenses the scientific knowledge about the earthquake occurrence process, and it is an essential component of any sound risk mitigation planning. It is commonly assumed that, in the short term, trustworthy earthquake forecasts are possible only for typical aftershock sequences, where the largest shock is followed by many smaller earthquakes that decay with time according to the Omori power law. We show that the current Italian operational earthquake forecasting system issued statistically reliable and skillful space-time-magnitude forecasts of the largest earthquakes during the complex 2016–2017 Amatrice-Norcia sequence, which is characterized by several bursts of seismicity and a significant deviation from the Omori law. This capability to deliver statistically reliable forecasts is an essential component of any program to assist public decision-makers and citizens in the challenging risk management of complex seismic sequences. PMID:28924610

  18. Earthquake forecasting during the complex Amatrice-Norcia seismic sequence.

    PubMed

    Marzocchi, Warner; Taroni, Matteo; Falcone, Giuseppe

    2017-09-01

    Earthquake forecasting is the ultimate challenge for seismologists, because it condenses the scientific knowledge about the earthquake occurrence process, and it is an essential component of any sound risk mitigation planning. It is commonly assumed that, in the short term, trustworthy earthquake forecasts are possible only for typical aftershock sequences, where the largest shock is followed by many smaller earthquakes that decay with time according to the Omori power law. We show that the current Italian operational earthquake forecasting system issued statistically reliable and skillful space-time-magnitude forecasts of the largest earthquakes during the complex 2016-2017 Amatrice-Norcia sequence, which is characterized by several bursts of seismicity and a significant deviation from the Omori law. This capability to deliver statistically reliable forecasts is an essential component of any program to assist public decision-makers and citizens in the challenging risk management of complex seismic sequences.

  19. Geographically Sourcing Cocaine's Origin - Delineation of the Nineteen Major Coca Growing Regions in South America.

    PubMed

    Mallette, Jennifer R; Casale, John F; Jordan, James; Morello, David R; Beyer, Paul M

    2016-03-23

    Previously, geo-sourcing to five major coca growing regions within South America was accomplished. However, the expansion of coca cultivation throughout South America made sub-regional origin determinations increasingly difficult. The former methodology was recently enhanced with additional stable isotope analyses ((2)H and (18)O) to fully characterize cocaine due to the varying environmental conditions in which the coca was grown. An improved data analysis method was implemented with the combination of machine learning and multivariate statistical analysis methods to provide further partitioning between growing regions. Here, we show how the combination of trace cocaine alkaloids, stable isotopes, and multivariate statistical analyses can be used to classify illicit cocaine as originating from one of 19 growing regions within South America. The data obtained through this approach can be used to describe current coca cultivation and production trends, highlight trafficking routes, as well as identify new coca growing regions.
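
    As a schematic of how machine learning and multivariate statistics can combine such features for regional classification, the sketch below trains a random-forest classifier on a synthetic table of alkaloid and stable-isotope features for 19 made-up regions. The classifier choice, feature set, and data are illustrative assumptions; the authors' actual model and chemometric features are not reproduced here.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(11)

# Hypothetical feature matrix: a few trace-alkaloid ratios plus d2H and d18O
# isotope values for samples from 19 synthetic growing regions.
n_regions, n_per_region, n_features = 19, 30, 6
X = np.vstack([rng.normal(loc=r, scale=2.0, size=(n_per_region, n_features))
               for r in range(n_regions)])
y = np.repeat(np.arange(n_regions), n_per_region)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
print(f"cross-validated accuracy ~ {scores.mean():.2f}")
```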

  20. Geographically Sourcing Cocaine’s Origin - Delineation of the Nineteen Major Coca Growing Regions in South America

    NASA Astrophysics Data System (ADS)

    Mallette, Jennifer R.; Casale, John F.; Jordan, James; Morello, David R.; Beyer, Paul M.

    2016-03-01

    Previously, geo-sourcing to five major coca growing regions within South America was accomplished. However, the expansion of coca cultivation throughout South America made sub-regional origin determinations increasingly difficult. The former methodology was recently enhanced with additional stable isotope analyses (2H and 18O) to fully characterize cocaine due to the varying environmental conditions in which the coca was grown. An improved data analysis method was implemented with the combination of machine learning and multivariate statistical analysis methods to provide further partitioning between growing regions. Here, we show how the combination of trace cocaine alkaloids, stable isotopes, and multivariate statistical analyses can be used to classify illicit cocaine as originating from one of 19 growing regions within South America. The data obtained through this approach can be used to describe current coca cultivation and production trends, highlight trafficking routes, as well as identify new coca growing regions.

  1. Scaling and universality in the human voice.

    PubMed

    Luque, Jordi; Luque, Bartolo; Lacasa, Lucas

    2015-04-06

    Speech is a distinctive complex feature of human capabilities. In order to understand the physics underlying speech production, in this work, we empirically analyse the statistics of large human speech datasets spanning several languages. We first show that during speech, the energy is unevenly released and power-law distributed, reporting a universal robust Gutenberg-Richter-like law in speech. We further show that such 'earthquakes in speech' show temporal correlations, as the interevent statistics are again power-law distributed. As this feature takes place in the intraphoneme range, we conjecture that the process responsible for this complex phenomenon is not cognitive, but resides in the physiological (mechanical) mechanisms of speech production. Moreover, we show that these waiting time distributions are scale invariant under a renormalization group transformation, suggesting that the process of speech generation is indeed operating close to a critical point. These results are put in contrast with current paradigms in speech processing, which point towards low-dimensional deterministic chaos as the origin of nonlinear traits in speech fluctuations. As these latter fluctuations are indeed the aspects that humanize synthetic speech, these findings may have an impact on future speech synthesis technologies. Results are robust and independent of the communication language or the number of speakers, pointing towards a universal pattern and yet another hint of complexity in human speech. © 2015 The Author(s) Published by the Royal Society. All rights reserved.
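
    To illustrate the kind of power-law (Gutenberg-Richter-like) analysis referred to above, the sketch below applies the standard continuous maximum-likelihood estimator of a power-law exponent to synthetic "energy release" events; the data, threshold, and exponent are assumptions, not the speech corpora analysed in the paper.

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic energy-release events: Pareto-distributed above a threshold e_min,
# standing in for acoustic energies extracted from speech recordings.
alpha_true, e_min = 2.0, 1.0
energies = e_min * (1.0 + rng.pareto(alpha_true - 1.0, size=20_000))

# Continuous maximum-likelihood (Hill-type) estimator of the exponent of
# p(E) ~ E^(-alpha):  alpha = 1 + n / sum(ln(E / e_min)).
tail = energies[energies >= e_min]
alpha_hat = 1.0 + tail.size / np.log(tail / e_min).sum()
print(f"estimated exponent ~ {alpha_hat:.2f} (true value {alpha_true})")
```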

  2. Health Resources Statistics; Health Manpower and Health Facilities, 1968. Public Health Service Publication No. 1509.

    ERIC Educational Resources Information Center

    National Center for Health Statistics (DHEW/PHS), Hyattsville, MD.

    This report is a part of the program of the National Center for Health Statistics to provide current statistics as baseline data for the evaluation, planning, and administration of health programs. Part I presents data concerning the occupational fields: (1) administration, (2) anthropology and sociology, (3) data processing, (4) basic sciences,…

  3. Against Inferential Statistics: How and Why Current Statistics Teaching Gets It Wrong

    ERIC Educational Resources Information Center

    White, Patrick; Gorard, Stephen

    2017-01-01

    Recent concerns about a shortage of capacity for statistical and numerical analysis skills among social science students and researchers have prompted a range of initiatives aiming to improve teaching in this area. However, these projects have rarely re-evaluated the content of what is taught to students and have instead focussed primarily on…

  4. 75 FR 79320 - Animal Drugs, Feeds, and Related Products; Regulation of Carcinogenic Compounds in Food-Producing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-20

    ... is calculated from tumor data of the cancer bioassays using a statistical extrapolation procedure... carcinogenic concern currently set forth in Sec. 500.84 utilizes a statistical extrapolation procedure that... procedures did not rely on a statistical extrapolation of the data to a 1 in 1 million risk of cancer to test...

  5. Forest Statistics for Ohio--1979

    Treesearch

    Donald F. Dennis; Thomas W. Birch; Thomas W. Birch

    1981-01-01

    A statistical report on the third forest survey of Ohio conducted in 1978 and 1979. Statistical findings are based on data from remeasured and new 10-point variable radius plots. The current status of forest-land area, timber volume, and annual growth and removals is presented. Timber products output by timber industries, based on a 1978 updated canvass of...

  6. Forest statistics for Vermont: 1973 and 1983

    Treesearch

    Thomas S. Frieswyk; Anne M. Malley

    1985-01-01

    A statistical report on the fourth forest survey of Vermont conducted in 1982-1983 by the Forest Inventory and Analysis Unit, Northeastern Forest Experiment Station. Statistics for forest area, numbers of trees, timber volume, tree biomass, and timber products output are displayed at the state, unit, and county levels. The current inventory indicates that the state has...

  7. Forest statistics for New York--1980

    Treesearch

    Thomas J., Jr. Considine; Thomas S. Frieswyk; Thomas S. Frieswyk

    1982-01-01

    A statistical report on the third forest survey of New York conducted in 1978 and 1979. Statistical findings are based on data from remeasured and new 10-point variable-radius plots. The current status of forest-land area, timber volume, and annual growth and removals is presented. Timber products output by timber industries, based on a 1979 updated canvass of...

  8. Forest statistics for Delaware: 1986 and 1999

    Treesearch

    Douglas M. Griffith; Richard H. Widmann; Richard H. Widmann

    2001-01-01

    A statistical report on the fourth forest inventory of Delaware conducted in 1999 by the Forest Inventory and Analysis Unit of the Northeastern Research Station. Statistics for forest area, numbers of trees, tree biomass, timber volume, growth, and change are displayed at the state and, where appropriate, the county level. The current inventory indicates that there are...

  9. Forest statistics for West Virginia: 1989 and 2000

    Treesearch

    Douglas M. Griffith; Richard H. Widmann

    2003-01-01

    A statistical report on the fifth forest inventory of West Virginia conducted in 2000 by the Forest Inventory and Analysis unit of the Northeastern Research Station. Statistics for forest area, numbers of trees, tree biomass, timber volume, growth, and change are displayed at the state and, where appropriate, the county level. The current inventory indicates that there...

  10. Counting statistics of tunneling current

    NASA Astrophysics Data System (ADS)

    Levitov, L. S.; Reznikov, M.

    2004-09-01

    The form of electron counting statistics of the tunneling current noise in a generic many-body interacting electron system is obtained and universal relations between its different moments are derived. A generalized fluctuation-dissipation theorem providing a relation between current and noise at arbitrary bias-to-temperature ratio eV/kBT is established in the tunneling Hamiltonian approximation. The third correlator of current fluctuations S3 (the skewness of the charge counting distribution) has a universal Schottky-type relation with the current and quasiparticle charge that holds in a wide bias voltage range, both at large and small eV/kBT . The insensitivity of S3 to the Nyquist-Schottky crossover represents an advantage compared to the Schottky formula for the noise power. We discuss the possibility of using the correlator S3 for detecting quasiparticle charge at high temperatures.
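
    For orientation, the universal relations referred to above can be written, up to convention-dependent numerical prefactors that depend on how the spectral densities are normalized, as:

```latex
% Tunneling-limit relations between the current and its first noise cumulants
% (numerical prefactors depend on the normalization of the spectral densities):
%   - S_2 interpolates between Nyquist noise (eV << k_B T) and Schottky shot
%     noise (eV >> k_B T);
%   - S_3 keeps its Schottky-like form at all eV / k_B T, which is why it is
%     insensitive to the Nyquist-Schottky crossover mentioned above.
\begin{align}
  S_2 &\propto e\,I\,\coth\!\left(\frac{eV}{2 k_B T}\right), \\
  S_3 &\propto e^{2}\,I .
\end{align}
```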

  11. Match statistics related to winning in the group stage of 2014 Brazil FIFA World Cup.

    PubMed

    Liu, Hongyou; Gomez, Miguel-Ángel; Lago-Peñas, Carlos; Sampaio, Jaime

    2015-01-01

    Identifying match statistics that strongly contribute to winning in football matches is a very important step towards a more predictive and prescriptive performance analysis. The current study aimed to determine relationships between 24 match statistics and the match outcome (win, loss and draw) in all games and close games of the group stage of the FIFA World Cup (2014, Brazil) by employing the generalised linear model. A cumulative logistic regression was run, taking the value of each match statistic as the independent variable to predict the logarithm of the odds of winning. Relationships were assessed as effects of a two-standard-deviation increase in the value of each variable on the change in the probability of a team winning a match. Non-clinical magnitude-based inferences were employed and were evaluated by using the smallest worthwhile change. Results showed that for all the games, nine match statistics had clearly positive effects on the probability of winning (Shot, Shot on Target, Shot from Counter Attack, Shot from Inside Area, Ball Possession, Short Pass, Average Pass Streak, Aerial Advantage and Tackle), four had clearly negative effects (Shot Blocked, Cross, Dribble and Red Card), and the other 12 statistics had either trivial or unclear effects. For the close games, however, the effects of Aerial Advantage and Yellow Card became trivial and clearly negative, respectively. Information from the tactical modelling can provide a more thorough and objective match understanding to coaches and performance analysts for evaluating post-match performances and for scouting upcoming oppositions.
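
    The cumulative logistic model described above can be sketched as follows; the synthetic match statistics, the coefficients, and the use of statsmodels' OrderedModel (available in statsmodels 0.12 or later) are assumptions standing in for the authors' actual data and software.

```python
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(2014)

# Synthetic match data: two illustrative predictors and an ordered outcome
# (loss < draw < win); a real analysis would use all 24 match statistics.
n = 400
shots_on_target = rng.poisson(5, n)
possession = rng.normal(50.0, 8.0, n)
latent = 0.5 * shots_on_target + 0.05 * possession + rng.logistic(size=n)
outcome = pd.Series(pd.cut(latent, bins=[-np.inf, 3.5, 5.0, np.inf],
                           labels=["loss", "draw", "win"], ordered=True))

X = pd.DataFrame({"shots_on_target": shots_on_target,
                  "possession": possession})
res = OrderedModel(outcome, X, distr="logit").fit(method="bfgs", disp=False)

# Effect of a two-standard-deviation increase in shots on target on the
# log-odds of a better outcome, mirroring the effect-size convention above.
print(res.params["shots_on_target"] * 2 * X["shots_on_target"].std())
```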

  12. The SPARC Intercomparison of Middle Atmosphere Climatologies

    NASA Technical Reports Server (NTRS)

    Randel, William; Fleming, Eric; Geller, Marvin; Gelman, Mel; Hamilton, Kevin; Karoly, David; Ortland, Dave; Pawson, Steve; Swinbank, Richard; Udelhofen, Petra

    2003-01-01

    Our current confidence in 'observed' climatological winds and temperatures in the middle atmosphere (over altitudes approx. 10-80 km) is assessed by detailed intercomparisons of contemporary and historic data sets. These data sets include global meteorological analyses and assimilations, climatologies derived from research satellite measurements, and historical reference atmosphere circulation statistics. We also include comparisons with historical rocketsonde wind and temperature data, and with more recent lidar temperature measurements. The comparisons focus on a few basic circulation statistics, such as temperature, zonal wind, and eddy flux statistics. Special attention is focused on tropical winds and temperatures, where large differences exist among separate analyses. Assimilated data sets provide the most realistic tropical variability, but substantial differences exist among current schemes.

  13. Energy Cascade Analysis: from Subscale Eddies to Mean Flow

    NASA Astrophysics Data System (ADS)

    Cheikh, Mohamad Ibrahim; Wonnell, Louis; Chen, James

    2017-11-01

    Understanding the energy transfer between eddies and mean flow can provide insights into the energy cascade process. Much work has been done to investigate the energy cascade at the level of the smallest eddies using different numerical techniques derived from the Navier-Stokes equations. These methodologies, however, prove to be computationally inefficient when producing energy spectra for a wide range of length scales. In this regard, Morphing Continuum Theory (MCT) resolves the length-scale issues by assuming the fluid continuum to be composed of inner structures that play the role of subscale eddies. The current study showcases the capabilities of MCT in capturing the dynamics of the energy cascade at the level of subscale eddies, through a supersonic turbulent flow of Mach 2.93 over an 8× compression ramp. Analysis of the results using a statistical averaging procedure shows the existence of a statistical coupling of the internal and translational kinetic energy fluctuations with the corresponding rotational kinetic energy of the subscale eddies, indicating a multiscale transfer of energy. The results show that MCT gives a new characterization of the energy cascade within compressible turbulence without the use of excessive computational resources. This material is based upon work supported by the Air Force Office of Scientific Research under Award Number FA9550-17-1-0154.

  14. The Effects of CO2 Laser with or without Nanohydroxyapatite Paste in the Occlusion of Dentinal Tubules

    PubMed Central

    Al-maliky, Mohammed Abbood; Mahmood, Ali Shukur; Al-karadaghi, Tamara Sardar; Kurzmann, Christoph; Laky, Markus; Franz, Alexander; Moritz, Andreas

    2014-01-01

    The aim of this study was to evaluate a new treatment modality for the occlusion of dentinal tubules (DTs) via the combination of a 10.6 µm carbon dioxide (CO2) laser and nanoparticle hydroxyapatite paste (n-HAp). Forty-six sound human molars were used in the current experiment. Ten of the molars were used to assess the temperature elevation during lasing. Thirty were evaluated for the dentinal permeability test, subdivided into 3 groups: the control group (C), laser only (L−), and laser plus n-HAp (L+). Six samples, two per group, were used for surface and cross-section morphology, evaluated through scanning electron microscopy (SEM). The temperature measurement results showed that the maximum temperature increase was 3.2°C. Morphologically, groups (L−) and (L+) presented narrower DTs, and an almost complete occlusion of the dentinal tubules was found for group (L+). The Kruskal-Wallis nonparametric test for the permeability test data showed statistical differences between the groups (P < 0.05). For intergroup comparisons, all groups were statistically different from each other, with group (L+) showing significantly less dye penetration than the control group. We concluded that CO2 laser at moderate power density combined with n-HAp seems to be a good treatment modality for reducing the permeability of dentin. PMID:25386616
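    A minimal sketch of the nonparametric comparison described above; the dye-penetration scores are invented placeholders, and the pairwise follow-up test and correction are an assumption about a typical workflow rather than the paper's exact procedure.

```python
# Hedged sketch: Kruskal-Wallis test across the three groups (C, L-, L+),
# followed by pairwise Mann-Whitney U tests, on hypothetical dye-penetration scores.
from scipy.stats import kruskal, mannwhitneyu

control = [3.1, 2.8, 3.5, 2.9, 3.3, 3.0, 2.7, 3.4, 3.2, 2.9]
laser_only = [2.1, 1.9, 2.4, 2.0, 2.2, 1.8, 2.3, 2.1, 2.0, 2.2]
laser_nhap = [0.8, 0.6, 1.0, 0.7, 0.9, 0.5, 0.8, 0.7, 0.9, 0.6]

h, p = kruskal(control, laser_only, laser_nhap)
print(f"Kruskal-Wallis: H = {h:.2f}, p = {p:.4f}")

# Pairwise follow-up; a multiple-comparison correction (e.g. Bonferroni)
# would normally be applied to these p-values.
for name, grp in [("L-", laser_only), ("L+", laser_nhap)]:
    u, p_pair = mannwhitneyu(control, grp, alternative="two-sided")
    print(f"C vs {name}: U = {u:.1f}, p = {p_pair:.4f}")
```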

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Benahmed, A.; Elkarch, H.

    This new portable radiological environmental monitor consists of two main components: a gamma ionization chamber and an FPGA-based electronic enclosure linked to user-friendly software for data processing and analysis. The HPIC ion chamber is the heart of the radiation measurement system; it operates over a range of 0 to 100 mR/h with an output sensitivity of 20 mV/μR/h and a nearly flat energy response from 0.07 to 10 MeV. This paper presents a contribution to the development of a new nuclear measurement data acquisition system based on the ALTERA Cyclone III FPGA Starter Kit, together with user-friendly software for real-time control and data processing. It was developed to replace the older RSS-112 PIC radiation monitor installed in CNESTEN's laboratory and to improve some of its functionalities related to acquisition time and data memory capacity. The associated acquisition software was developed on the LabVIEW platform from National Instruments and offers a variety of system setups for radiation environmental monitoring. It can display both statistical data and the dose rate: the statistical view shows a summary of current data, the current time/date and dose integrator values, while the dose-rate view shows the current dose rate in large numbers for viewing from a distance, together with the date and time. The prototype of this new instrument and its data processing software has been successfully tested and validated for viewing and monitoring the environmental radiation of the Moroccan nuclear center. (authors)

  16. Empirical retrocausality: Testing physics hypotheses with parapsychological experiments

    NASA Astrophysics Data System (ADS)

    Dobyns, York

    2017-05-01

    In 2011, Daryl Bem published a report of nine parapsychological experiments showing evidence of retrocausal information transfer. Early in 2016, the team of Bem, Tressoldi, Rabeyron, and Duggan published the results of a meta-analysis containing 81 independent replications of the original Bem experiments (a total of 90 with the originals).[1] This much larger database continues to show positive results of generally comparable effect size, thus demonstrating that the effects claimed by Bem can be replicated by independent researchers and greatly strengthening the case for empirically observed retrocausation. Earlier (2011) work by this author showed how a modification of one of Bem's original experiments could be used to test the mechanism implicitly proposed by Echeverria, Klinkhammer, and Thorne to explain how retrocausal phenomena can exist without any risk of self-contradictory event sequences (time paradoxes). In light of the new publication and new evidence, the current work generalizes the previous analysis, which was restricted to only one of Bem's experimental genres (precognitive approach and avoidance). The current analysis shows how minor modifications can be made in Bem's other experimental genres of retroactive priming, retroactive habituation, and retroactive facilitation of recall to test the EKT anti-paradox mechanism. If the EKT hypothesis is correct, the modified experiments, while continuing to show replicable retrocausal phenomena, will also show a characteristic pattern of distortion in the statistics of the random selections used to drive the experiments.

  17. Preferred prenatal counselling at the limits of viability: a survey among Dutch perinatal professionals.

    PubMed

    Geurtzen, R; Van Heijst, Arno; Hermens, Rosella; Scheepers, Hubertina; Woiski, Mallory; Draaisma, Jos; Hogeveen, Marije

    2018-01-03

    Since 2010, intensive care can be offered in the Netherlands at 24+0 weeks gestation (with parental consent), but the Dutch guideline lacks recommendations on the organization, content and preferred decision-making of the counselling. Our aim is to explore preferred prenatal counselling at the limits of viability by Dutch perinatal professionals and compare this to current care. Online nationwide survey as part of the PreCo study (2013) amongst obstetricians and neonatologists in all Dutch level III perinatal care centers (n = 205). The survey regarded prenatal counselling at the limits of viability and focused on the domains of organization, content and decision-making in both current and preferred practice. One hundred twenty-two surveys were returned out of 205 eligible professionals (response rate 60%). Organization-wise, more than 80% of all professionals preferred (but currently missed) having protocols for several aspects of counselling, joint counselling by both neonatologist and obstetrician, and the use of supportive materials. Most professionals preferred using national or local data (70%) on outcome statistics for the counselling content, in contrast to the international statistics currently used (74%). Current decisions on the initiation of care were mostly made jointly by parents and doctor (in 99% of cases). This shared decision model was preferred by 95% of the professionals. Dutch perinatal professionals would prefer more protocolized counselling, joint counselling, supportive material and local outcome statistics. Further studies on barriers to performing adequate counselling, as well as on Dutch outcome statistics and parents' opinions, are needed in order to develop a national framework. Clinicaltrials.gov, NCT02782650, retrospectively registered May 2016.

  18. Teaching Statistics--Despite Its Applications

    ERIC Educational Resources Information Center

    Ridgway, Jim; Nicholson, James; McCusker, Sean

    2007-01-01

    Evidence-based policy requires sophisticated modelling and reasoning about complex social data. The current UK statistics curricula do not equip tomorrow's citizens to understand such reasoning. We advocate radical curriculum reform, designed to require students to reason from complex data.

  19. UNITY: Confronting Supernova Cosmology's Statistical and Systematic Uncertainties in a Unified Bayesian Framework

    NASA Astrophysics Data System (ADS)

    Rubin, D.; Aldering, G.; Barbary, K.; Boone, K.; Chappell, G.; Currie, M.; Deustua, S.; Fagrelius, P.; Fruchter, A.; Hayden, B.; Lidman, C.; Nordin, J.; Perlmutter, S.; Saunders, C.; Sofiatti, C.; Supernova Cosmology Project, The

    2015-11-01

    While recent supernova (SN) cosmology research has benefited from improved measurements, current analysis approaches are not statistically optimal and will prove insufficient for future surveys. This paper discusses the limitations of current SN cosmological analyses in treating outliers, selection effects, shape- and color-standardization relations, unexplained dispersion, and heterogeneous observations. We present a new Bayesian framework, called UNITY (Unified Nonlinear Inference for Type-Ia cosmologY), that incorporates significant improvements in our ability to confront these effects. We apply the framework to real SN observations and demonstrate smaller statistical and systematic uncertainties. We verify earlier results that SNe Ia require nonlinear shape and color standardizations, but we now include these nonlinear relations in a statistically well-justified way. This analysis was primarily performed blinded, in that the basic framework was first validated on simulated data before transitioning to real data. We also discuss possible extensions of the method.

  20. Towards a statistical mechanical theory of active fluids.

    PubMed

    Marini Bettolo Marconi, Umberto; Maggi, Claudio

    2015-12-07

    We present a stochastic description of a model of N mutually interacting active particles in the presence of external fields and characterize its steady state behavior in the absence of currents. To reproduce the effects of the experimentally observed persistence of the trajectories of the active particles we consider a Gaussian force having a non-vanishing correlation time τ, whose finiteness is a measure of the activity of the system. With these ingredients we show that it is possible to develop a statistical mechanical approach similar to the one employed in the study of equilibrium liquids and to obtain the explicit form of the many-particle distribution function by means of the multidimensional unified colored noise approximation. Such a distribution plays a role analogous to the Gibbs distribution in equilibrium statistical mechanics and provides complete information about the microscopic state of the system. From here we develop a method to determine the one- and two-particle distribution functions in the spirit of the Born-Green-Yvon (BGY) equations of equilibrium statistical mechanics. The resulting equations which contain extra-correlations induced by the activity allow us to determine the stationary density profiles in the presence of external fields, the pair correlations and the pressure of active fluids. In the low density regime we obtained the effective pair potential ϕ(r) acting between two isolated particles separated by a distance, r, showing the existence of an effective attraction between them induced by activity. Based on these results, in the second half of the paper we propose a mean field theory as an approach simpler than the BGY hierarchy and use it to derive a van der Waals expression of the equation of state.
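    As a pointer to the kind of result described, the single-particle, one-dimensional version of the colored-noise dynamics and its unified colored noise approximation (UCNA) steady state are often written as below; the paper's many-particle expressions are more general, so treat this as an orientation sketch.

```latex
% Overdamped active particle with potential \phi(x), driven by an
% Ornstein-Uhlenbeck (colored) noise of correlation time \tau and strength D:
%   \dot{x} = -\phi'(x) + v, \qquad
%   \tau\,\dot{v} = -v + \sqrt{2D}\,\eta(t), \qquad
%   \langle \eta(t)\,\eta(t') \rangle = \delta(t-t').
% UCNA stationary distribution (one particle, one dimension):
P_{\mathrm{st}}(x) \;\propto\; \bigl|\,1 + \tau\,\phi''(x)\,\bigr|\;
  \exp\!\left[-\frac{1}{D}\left(\phi(x) + \frac{\tau}{2}\,\phi'(x)^{2}\right)\right].
```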

  1. Medicare payment data for spine reimbursement; important but flawed data for evaluating utilization of resources.

    PubMed

    Menger, Richard P; Wolf, Michael E; Kukreja, Sunil; Sin, Anthony; Nanda, Anil

    2015-01-01

    Medicare data showing physician-specific reimbursement for 2012 were recently made public in the mainstream media. Given the ongoing interest in containing healthcare costs, we analyze these data in the context of the delivery of spinal surgery. Demographics of 206 leading surgeons were extracted, including state, geographic area, residency training program, fellowship training, and academic affiliation. Using current procedural terminology (CPT) codes, information was evaluated regarding the number of lumbar laminectomies, lumbar fusions, add-on laminectomy levels, and anterior cervical fusions reimbursed by Medicare in 2012. In 2012 Medicare reimbursed the average neurosurgeon slightly more than an orthopedic surgeon for all procedures ($142,075 vs. $110,920), but this was not found to be statistically significant (P = 0.218). Orthopedic surgeons had a statistical trend illustrating increased reimbursement for lumbar fusions specifically, $1187 versus $1073 (P = 0.07). Fellowship-trained spinal surgeons also, on average, received more from Medicare ($125,407 vs. $76,551), but again this was not statistically significant (P = 0.112). A surgeon in private practice, on average, was reimbursed $137,495 while their academic counterparts were reimbursed $103,144 (P = 0.127). Surgeons performing cervical fusions in the Centers for Disease Control West Region did receive statistically significantly less reimbursement for that procedure than those surgeons in other parts of the country (P = 0.015). Surgeons in the West were reimbursed on average $849 for CPT code 22551 while those in the Midwest received $1475 per procedure. Medicare reimbursement data are fundamentally flawed in determining healthcare expenditure as they show a bias toward delivery of care in specific patient demographics. However, neurosurgeons, not just policy makers, must take ownership to analyze, investigate, and interpret these data as they will affect healthcare reimbursement and delivery moving forward.

  2. The Effect of Folate and Folate Plus Zinc Supplementation on Endocrine Parameters and Sperm Characteristics in Sub-Fertile Men: A Systematic Review and Meta-Analysis.

    PubMed

    Irani, Morvarid; Amirian, Malihe; Sadeghi, Ramin; Lez, Justine Le; Latifnejad Roudsari, Robab

    2017-08-29

    To evaluate the effect of folate and folate plus zinc supplementation on endocrine parameters and sperm characteristics in sub-fertile men, we conducted a systematic review and meta-analysis. Electronic databases of Medline, Scopus, Google Scholar and Persian databases (SID, Iran medex, Magiran, Medlib, Iran doc) were searched from 1966 to December 2016 using a set of relevant keywords including "folate or folic acid AND (infertility, infertile, sterility)". All available randomized controlled trials (RCTs), conducted on a sample of sub-fertile men with semen analyses, who took oral folic acid or folate plus zinc, were included. Data collected included endocrine parameters and sperm characteristics. Statistical analyses were done with Comprehensive Meta-analysis Version 2. In total, seven studies were included. Six studies had sufficient data for meta-analysis. Sperm concentration was statistically higher in men supplemented with folate than with placebo (P < .001). However, folate supplementation alone did not seem to be more effective than the placebo on the morphology (P = .056) and motility of the sperm (P = .652). Folate plus zinc supplementation did not show any statistically different effect on serum testosterone (P = .86), inhibin B (P = .84), FSH (P = .054), and sperm motility (P = .169) as compared to the placebo. Yet, folate plus zinc showed a statistically higher effect on sperm concentration (P < .001), morphology (P < .001), and serum folate level (P < .001) as compared to placebo. Folate plus zinc supplementation has a positive effect on sperm characteristics in sub-fertile men. However, these results should be interpreted with caution due to the important heterogeneity of the studies included in this meta-analysis. Further trials are still needed to confirm the current findings.
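    The pooling step behind a meta-analysis of this kind can be illustrated with a generic inverse-variance random-effects (DerSimonian-Laird) calculation; the effect sizes and variances below are invented, and the actual analysis used Comprehensive Meta-analysis Version 2 rather than this sketch.

```python
# Hedged sketch: DerSimonian-Laird random-effects pooling of per-study effect
# sizes (e.g. mean differences in sperm concentration); the numbers are invented.
import numpy as np

effects = np.array([4.1, 2.5, 3.8, 1.9, 5.0, 2.2])      # per-study effect estimates
variances = np.array([1.2, 0.8, 2.0, 0.9, 2.5, 1.1])    # per-study variances

w_fixed = 1.0 / variances
q = np.sum(w_fixed * (effects - np.average(effects, weights=w_fixed)) ** 2)
df = len(effects) - 1
c = np.sum(w_fixed) - np.sum(w_fixed ** 2) / np.sum(w_fixed)
tau2 = max(0.0, (q - df) / c)                            # between-study variance

w_random = 1.0 / (variances + tau2)
pooled = np.average(effects, weights=w_random)
se = np.sqrt(1.0 / np.sum(w_random))
print(f"pooled effect = {pooled:.2f} "
      f"(95% CI {pooled - 1.96*se:.2f} to {pooled + 1.96*se:.2f}), tau^2 = {tau2:.2f}")
```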

  3. Feature maps driven no-reference image quality prediction of authentically distorted images

    NASA Astrophysics Data System (ADS)

    Ghadiyaram, Deepti; Bovik, Alan C.

    2015-03-01

    Current blind image quality prediction models rely on benchmark databases comprised of singly and synthetically distorted images, thereby learning image features that are only adequate to predict human perceived visual quality on such inauthentic distortions. However, real world images often contain complex mixtures of multiple distortions. Rather than a) discounting the effect of these mixtures of distortions on an image's perceptual quality and considering only the dominant distortion or b) using features that are only proven to be efficient for singly distorted images, we deeply study the natural scene statistics of authentically distorted images, in different color spaces and transform domains. We propose a feature-maps-driven statistical approach which avoids any latent assumptions about the type of distortion(s) contained in an image, and focuses instead on modeling the remarkable consistencies in the scene statistics of real world images in the absence of distortions. We design a deep belief network that takes model-based statistical image features derived from a very large database of authentically distorted images as input and discovers good feature representations by generalizing over different distortion types, mixtures, and severities, which are later used to learn a regressor for quality prediction. We demonstrate the remarkable competence of our features for improving automatic perceptual quality prediction on a benchmark database and on the newly designed LIVE Authentic Image Quality Challenge Database and show that our approach of combining robust statistical features and the deep belief network dramatically outperforms the state-of-the-art.

  4. Notes on numerical reliability of several statistical analysis programs

    USGS Publications Warehouse

    Landwehr, J.M.; Tasker, Gary D.

    1999-01-01

    This report presents a benchmark analysis of several statistical analysis programs currently in use in the USGS. The benchmark consists of a comparison between the values provided by a statistical analysis program for variables in the reference data set ANASTY and their known or calculated theoretical values. The ANASTY data set is an amendment of the Wilkinson NASTY data set that has been used in the statistical literature to assess the reliability (computational correctness) of calculated analytical results.

  5. Nanocellulose patents trends: a comprehensive review on patents on cellulose nanocrystals, microfibrillated and bacterial cellulose.

    PubMed

    Charreau, Hernan; Foresti, Maria L; Vazquez, Analia

    2013-01-01

    Cellulose nanoparticles (i.e. cellulose elements having at least one dimension in the 1-100 nm range) have received increasing attention during the last decade. This is not only evident in academic articles, but it is also manifested by the increasing number of nanocellulose patents that are published every year. In the current review, nanocellulose patents are reviewed using specific software which provides valuable information on the annual number of patents that have been published throughout the years, the main patent owners, the most prolific inventors, and the patents in the field that have received the most citations. Patent statistics on rod-like cellulose nanoparticles extracted from plants by acid hydrolysis (nanocrystals), mechanical treatment leading to microfibrillated cellulose (MFC), and microbially produced nanofibrils (bacterial cellulose, BC) are analyzed in detail. The aim of the current review is to provide researchers with patent information which may help them in visualizing the evolution of nanocellulose technology, both as a whole and also divided among the different nanosized particles that are currently the subject of outstanding scientific attention. Patents are thus analyzed not only by their content, but also by global statistics which reveal the moment at which different cellulose nanoparticle technologies achieved a breakthrough, the relative interest received by different nanocellulose particles throughout the years, the companies that have been most interested in this technology, the most prolific inventors, and the patents that have had the most influence on further developments. It is expected that the results, which show the explosion that nanocellulose technology is currently experiencing, will bring still more research on the topic and contribute to the expansion of nanocellulose applications.

  6. Investigation of rough surfaces on Cu2ZnSn(SxSe1-x)4 monograin layers using light beam induced current measurements

    NASA Astrophysics Data System (ADS)

    Neubauer, Christian; Babatas, Ertug; Meissner, Dieter

    2017-11-01

    Monograin technology has proven to be a successful way of manufacturing low-cost photovoltaic applications using the pentanary Cu2ZnSn(SxSe1-x)4 (CZTSSe) as an absorber material in an industrial roll-to-roll process. For highly efficient CZTSSe monograin device fabrication, a thorough understanding of the impact of the device characteristics and surface structure is important. A new evaluation method for Light Beam Induced Current (LBIC) images had to be developed to distinguish between different effects resulting from different surface orientations, grain sizes, packing densities and contacting areas. In this work we show that LBIC measurements make it possible to evaluate the quality of and differences between produced CZTSSe monograin cells in a post-production and non-destructive step. The high-spatial-resolution evaluation allows investigating the homogeneity of single crystalline grains as well as certain areas of a CZTSSe device. By introducing a statistical method, the active area, a major factor in the current density of a device, is calculated and evaluated. The results show that with LBIC measurements the active area can be quantified, and it differs by up to 9% among the investigated cells. Additionally, the homogeneity of the short-circuit current densities of the monograins and also of certain areas of a cell can be detected and quantified.
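    A minimal sketch of the kind of statistical active-area estimate described above: threshold a hypothetical LBIC current map and report the responding-area fraction plus a simple homogeneity measure. The map, threshold choice, and metrics are illustrative assumptions, not the authors' method.

```python
# Hedged sketch: estimate the active-area fraction of a (synthetic) LBIC map by
# thresholding, and quantify current homogeneity over the responding pixels.
import numpy as np

rng = np.random.default_rng(1)
lbic_map = rng.normal(1.0, 0.1, size=(256, 256))           # responding grain areas
lbic_map[rng.random((256, 256)) < 0.08] = 0.02              # inactive/contact areas

threshold = 0.5 * lbic_map.max()                            # simple relative threshold
active = lbic_map > threshold
active_area_fraction = active.mean()
homogeneity = lbic_map[active].std() / lbic_map[active].mean()  # coefficient of variation

print(f"active area: {100 * active_area_fraction:.1f}% of scanned area")
print(f"current homogeneity (CV over active pixels): {homogeneity:.3f}")
```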

  7. A statistical study of current-sheet formation above solar active regions based on selforganized criticality

    NASA Astrophysics Data System (ADS)

    Dimitropoulou, M.; Isliker, H.; Vlahos, L.; Georgoulis, M.; Anastasiadis, A.; Toutountzi, A.

    2013-09-01

    We treat flaring solar active regions as physical systems having reached the self-organized critical state. Their evolving magnetic configurations in the low corona may satisfy an instability criterion, related to the exceedance of a specific threshold in the curl of the magnetic field. This imposed instability criterion implies an almost zero resistivity everywhere in the solar corona, except in regions where magnetic-field discontinuities and, hence, local currents reach the critical value. In these areas, current-driven instabilities enhance the resistivity by many orders of magnitude, forming structures which efficiently accelerate charged particles. Simulating the formation of such structures (thought of as current sheets) via a refined SOC cellular-automaton model provides interesting information regarding their statistical properties. It is shown that the current density in such unstable regions follows power-law scaling. Furthermore, the size distribution of the produced current sheets is best fitted by power laws, whereas their formation probability is investigated against the photospheric magnetic configuration (e.g. polarity inversion lines, plage). The average fractal dimension of the produced current sheets is deduced depending on the selected critical threshold. The above-mentioned statistical description of intermittent electric field structures can be used by collisional relativistic test particle simulations, aiming to interpret particle acceleration in flaring active regions and in strongly turbulent media in astrophysical plasmas. The above work is supported by the Hellenic National Space Weather Research Network (HNSWRN) via the THALIS Programme.
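    For intuition only, the classic Bak-Tang-Wiesenfeld sandpile below stands in for the refined SOC cellular automaton described above: slow driving, a local threshold rule, and avalanches whose size distribution tends toward a power law. It is not the study's model and uses a simple height threshold rather than a curl-of-B criterion.

```python
# Classic 2-D Bak-Tang-Wiesenfeld sandpile: slow random driving, toppling when a
# site exceeds the threshold, and avalanche-size statistics as the SOC signature.
import numpy as np

rng = np.random.default_rng(0)
n, zc, steps = 50, 4, 20000
grid = np.zeros((n, n), dtype=int)
avalanche_sizes = []

for _ in range(steps):
    i, j = rng.integers(0, n, 2)
    grid[i, j] += 1                        # slow driving
    size = 0
    while True:
        over = np.argwhere(grid >= zc)     # sites exceeding the threshold
        if over.size == 0:
            break
        for x, y in over:
            grid[x, y] -= zc               # topple: redistribute to neighbours
            for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                xn, yn = x + dx, y + dy
                if 0 <= xn < n and 0 <= yn < n:
                    grid[xn, yn] += 1      # grains leaving the grid are lost
            size += 1
    if size:
        avalanche_sizes.append(size)

hist, edges = np.histogram(avalanche_sizes, bins=np.logspace(0, 3.5, 15))
print("avalanche-size histogram (logarithmic bins):", hist)
```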

  8. Stochastic IMT (Insulator-Metal-Transition) Neurons: An Interplay of Thermal and Threshold Noise at Bifurcation

    PubMed Central

    Parihar, Abhinav; Jerry, Matthew; Datta, Suman; Raychowdhury, Arijit

    2018-01-01

    Artificial neural networks can harness stochasticity in multiple ways to enable a vast class of computationally powerful models. Boltzmann machines and other stochastic neural networks have been shown to outperform their deterministic counterparts by allowing dynamical systems to escape local energy minima. Electronic implementation of such stochastic networks is currently limited to addition of algorithmic noise to digital machines which is inherently inefficient, although recent efforts to harness physical noise in devices for stochasticity have shown promise. To succeed in fabricating electronic neuromorphic networks we need experimental evidence of devices with measurable and controllable stochasticity, complemented by the development of reliable statistical models of such observed stochasticity. Current research literature has sparse evidence of the former and a complete lack of the latter. This motivates the current article where we demonstrate a stochastic neuron using an insulator-metal-transition (IMT) device, based on electrically induced phase-transition, in series with a tunable resistance. We show that an IMT neuron has dynamics similar to a piecewise linear FitzHugh-Nagumo (FHN) neuron and incorporates all characteristics of a spiking neuron in the device phenomena. We experimentally demonstrate spontaneous stochastic spiking along with electrically controllable firing probabilities using Vanadium Dioxide (VO2) based IMT neurons which show a sigmoid-like transfer function. The stochastic spiking is explained by two noise sources - thermal noise and threshold fluctuations, which act as precursors of bifurcation. As such, the IMT neuron is modeled as an Ornstein-Uhlenbeck (OU) process with a fluctuating boundary resulting in transfer curves that closely match experiments. The moments of interspike intervals are calculated analytically by extending the first-passage-time (FPT) models for the Ornstein-Uhlenbeck (OU) process to include a fluctuating boundary. We find that the coefficient of variation of interspike intervals depends on the relative proportion of thermal and threshold noise, where threshold noise is the dominant source in the current experimental demonstrations. As one of the first comprehensive studies of stochastic neuron hardware and its statistical properties, this article would enable efficient implementation of a large class of neuro-mimetic networks and algorithms. PMID:29670508
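    A minimal simulation sketch of the modeling idea described above: an Ornstein-Uhlenbeck membrane variable (thermal noise) crossing a threshold that itself fluctuates (threshold noise), with the coefficient of variation of interspike intervals as the summary statistic. All parameter values are illustrative assumptions, not fitted device values.

```python
# Hedged sketch: OU membrane variable with a fluctuating (OU) threshold;
# spikes occur on threshold crossing, followed by a reset. We report the
# coefficient of variation (CV) of the interspike intervals (ISIs).
import numpy as np

rng = np.random.default_rng(0)
dt, t_max = 1e-4, 20.0
mu, tau_m, sigma_thermal = 1.2, 0.02, 0.15        # drift, membrane time constant, thermal noise
theta0, sigma_theta, tau_theta = 1.0, 0.05, 0.01  # mean threshold, threshold noise, its time constant

v, theta = 0.0, theta0
last_spike, isis = 0.0, []
for step in range(int(t_max / dt)):
    t = step * dt
    # OU membrane variable relaxing toward mu, driven by thermal noise
    v += (mu - v) * dt / tau_m + sigma_thermal * np.sqrt(2 * dt / tau_m) * rng.normal()
    # fluctuating (OU) threshold around theta0
    theta += (theta0 - theta) * dt / tau_theta + sigma_theta * np.sqrt(2 * dt / tau_theta) * rng.normal()
    if v >= theta:                                # spike and reset
        isis.append(t - last_spike)
        last_spike, v = t, 0.0

isis = np.array(isis)
cv = isis.std() / isis.mean()
print(f"{len(isis)} spikes, mean ISI = {isis.mean() * 1e3:.2f} ms, CV = {cv:.3f}")
```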

  9. Stochastic IMT (Insulator-Metal-Transition) Neurons: An Interplay of Thermal and Threshold Noise at Bifurcation.

    PubMed

    Parihar, Abhinav; Jerry, Matthew; Datta, Suman; Raychowdhury, Arijit

    2018-01-01

    Artificial neural networks can harness stochasticity in multiple ways to enable a vast class of computationally powerful models. Boltzmann machines and other stochastic neural networks have been shown to outperform their deterministic counterparts by allowing dynamical systems to escape local energy minima. Electronic implementation of such stochastic networks is currently limited to addition of algorithmic noise to digital machines which is inherently inefficient, although recent efforts to harness physical noise in devices for stochasticity have shown promise. To succeed in fabricating electronic neuromorphic networks we need experimental evidence of devices with measurable and controllable stochasticity, complemented by the development of reliable statistical models of such observed stochasticity. Current research literature has sparse evidence of the former and a complete lack of the latter. This motivates the current article where we demonstrate a stochastic neuron using an insulator-metal-transition (IMT) device, based on electrically induced phase-transition, in series with a tunable resistance. We show that an IMT neuron has dynamics similar to a piecewise linear FitzHugh-Nagumo (FHN) neuron and incorporates all characteristics of a spiking neuron in the device phenomena. We experimentally demonstrate spontaneous stochastic spiking along with electrically controllable firing probabilities using Vanadium Dioxide (VO2) based IMT neurons which show a sigmoid-like transfer function. The stochastic spiking is explained by two noise sources - thermal noise and threshold fluctuations, which act as precursors of bifurcation. As such, the IMT neuron is modeled as an Ornstein-Uhlenbeck (OU) process with a fluctuating boundary resulting in transfer curves that closely match experiments. The moments of interspike intervals are calculated analytically by extending the first-passage-time (FPT) models for the Ornstein-Uhlenbeck (OU) process to include a fluctuating boundary. We find that the coefficient of variation of interspike intervals depends on the relative proportion of thermal and threshold noise, where threshold noise is the dominant source in the current experimental demonstrations. As one of the first comprehensive studies of stochastic neuron hardware and its statistical properties, this article would enable efficient implementation of a large class of neuro-mimetic networks and algorithms.

  10. Energy Current Cumulants in One-Dimensional Systems in Equilibrium

    NASA Astrophysics Data System (ADS)

    Dhar, Abhishek; Saito, Keiji; Roy, Anjan

    2018-06-01

    A recent theory based on fluctuating hydrodynamics predicts that one-dimensional interacting systems with particle, momentum, and energy conservation exhibit anomalous transport that falls into two main universality classes. The classification is based on behavior of equilibrium dynamical correlations of the conserved quantities. One class is characterized by sound modes with Kardar-Parisi-Zhang scaling, while the second class has diffusive sound modes. The heat mode follows Lévy statistics, with different exponents for the two classes. Here we consider heat current fluctuations in two specific systems, which are expected to be in the above two universality classes, namely, a hard particle gas with Hamiltonian dynamics and a harmonic chain with momentum conserving stochastic dynamics. Numerical simulations show completely different system-size dependence of current cumulants in these two systems. We explain this numerical observation using a phenomenological model of Lévy walkers with inputs from fluctuating hydrodynamics. This consistently explains the system-size dependence of heat current fluctuations. For the latter system, we derive the cumulant-generating function from a more microscopic theory, which also gives the same system-size dependence of cumulants.

  11. Patterns of faecal nematode egg shedding after treatment of sheep with a long-acting formulation of moxidectin.

    PubMed

    Crilly, James Patrick; Jennings, Amy; Sargison, Neil

    2015-09-15

    Much of the current information on the effects of long-acting anthelmintics on nematode populations derives either from research farms or mathematical models. A survey was performed with the aim of establishing how moxidectin is currently being used on sheep farms in the south-east of Scotland. A study was undertaken on a subsection of the surveyed farms to examine the effects of long-acting moxidectin treatments in both spring and autumn on faecal nematode egg output. The survey showed that whole-flock treatments of injectable 2% moxidectin were used to control sheep scab on 21% of farms. Injectable 2% moxidectin and oral moxidectin were used to control the periparturient rise in faecal nematode egg shedding by ewes on 13% and 55% of farms respectively. The effects of injectable 2% moxidectin treatment on faecal nematode egg shedding post-treatment in both the autumn and spring were investigated by faecal nematode egg counts at the time of treatment and at 2-weekly intervals thereafter on eight and six farms in the autumn and spring, respectively. Faecal egg shedding recommenced at 8 weeks (autumn) and 4 weeks (spring) post-treatment. Counts increased to a peak and then declined. The mean (95% confidence interval) peak counts post-treatment were 2.8 (0.6, 5.1), 3.6 (1.7, 5.5) and 53.5 (25.1, 82.0) eggs per gram (EPG) for autumn-treated ewes, autumn-treated lambs and spring-treated ewes respectively. The spring-treated sheep showed a statistically significantly earlier return to faecal egg shedding (p=0.0125, p=0.0342) compared to both other groups, a statistically significantly higher peak in egg counts than the autumn-treated sheep (p<0.001) and a statistically significantly longer period of positive egg counts (p=0.0148). There was no statistically significant difference in the timing of the peak FECs between autumn and spring (p=0.211). The FECs of all groups of sheep treated with an injectable long-acting formulation of moxidectin became positive earlier than would be expected from the period of persistence given on the datasheet, but post-treatment FECs were very low compared to pre-treatment counts. Copyright © 2015 Elsevier B.V. All rights reserved.

  12. Enhancing the Equating of Item Difficulty Metrics: Estimation of Reference Distribution. Research Report. ETS RR-14-07

    ERIC Educational Resources Information Center

    Ali, Usama S.; Walker, Michael E.

    2014-01-01

    Two methods are currently in use at Educational Testing Service (ETS) for equating observed item difficulty statistics. The first method involves the linear equating of item statistics in an observed sample to reference statistics on the same items. The second method, or the item response curve (IRC) method, involves the summation of conditional…

  13. A Comparative Analysis of the Minuteman Education Programs as Currently Offered at Six SAC Bases.

    DTIC Science & Technology

    1980-06-01

    Course listing fragment: Principles of Marketing (3), Business Statistics (3), Business Law (3), Management ...; Principles of Marketing (3), Mathematics Methods I; total prerequisite hours: 26. Required graduate courses: Policy Formulation and Administration (3), Management ...; Business and Economic Statistics (3), Intermediate Business and Economic Statistics (3), Principles of Management (3), Corporation Finance (3), Principles of Marketing (3).

  14. The Shock and Vibration Bulletin. Part 2. Invited Papers, Structural Dynamics

    DTIC Science & Technology

    1974-08-01

    Contents fragment: VIKING LANDER DYNAMICS, 41 (Mr. Joseph C. Pohlen, Martin Marietta Aerospace, Denver, Colorado); Structural Dynamics: PERFORMANCE OF STATISTICAL ENERGY ANALYSIS, 47. ... aerospace structures. Analytical prediction of these environments is beyond the current scope of classical modal techniques. Statistical energy analysis methods ... have been developed that circumvent the difficulties of high-frequency modal analysis. These statistical energy analysis methods are evaluated

  15. Indicators of School Crime and Safety: 2009. NCES 2010-012/NCJ 228478

    ERIC Educational Resources Information Center

    Dinkes, Rachel; Kemp, Jana; Baum, Katrina

    2009-01-01

    A joint effort by the Bureau of Justice Statistics and National Center for Education Statistics, this annual report examines crime occurring in school as well as on the way to and from school. It provides the most current detailed statistical information to inform the Nation on the nature of crime in schools. This report presents data on crime at…

  16. Metabolite profiling of fish skin mucus: a novel approach for minimally-invasive environmental exposure monitoring and surveillance.

    PubMed

    Ekman, D R; Skelton, D M; Davis, J M; Villeneuve, D L; Cavallin, J E; Schroeder, A; Jensen, K M; Ankley, G T; Collette, T W

    2015-03-03

    The application of 'omics tools to biologically based monitoring and surveillance of aquatic environments shows considerable promise for complementing chemical monitoring in ecological risk assessments. However, few of the current approaches offer the ability to sample ecologically relevant species (e.g., fish) in a way that produces minimal impact on the health of the organism(s) under study. In the current study we employ liquid chromatography tandem mass spectrometry (LC-MS/MS) to assess the potential for skin mucus-based metabolomics for minimally invasive sampling of the fathead minnow (FHM; Pimephales promelas). Using this approach we were able to detect 204 distinct metabolites in the FHM skin mucus metabolome representing a large number of metabolite classes. An analysis of the sex specificity of the skin mucus metabolome showed it to be highly sexually dimorphic with 72 of the detected metabolites showing a statistically significant bias with regard to sex. Finally, in a proof-of-concept fashion we report on the use of skin mucus-based metabolomics to assess exposures in male and female fathead minnows to an environmentally relevant concentration of bisphenol A, a nearly ubiquitous environmental contaminant and an established endocrine active chemical.
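    The sex-specificity screen described above can be illustrated by a per-metabolite two-sample test with a false-discovery-rate correction; the simulated data, test choice, and FDR method below are assumptions standing in for the study's actual statistical workflow.

```python
# Hedged sketch: per-metabolite two-sample tests across male and female fish,
# with Benjamini-Hochberg FDR correction over all detected metabolites.
import numpy as np
from scipy.stats import ttest_ind
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(0)
n_metabolites, n_per_sex = 204, 10
male = rng.normal(0.0, 1.0, size=(n_per_sex, n_metabolites))
female = rng.normal(0.0, 1.0, size=(n_per_sex, n_metabolites))
female[:, :72] += 1.5            # give 72 metabolites a sex-biased abundance (illustrative)

pvals = np.array([ttest_ind(male[:, k], female[:, k]).pvalue for k in range(n_metabolites)])
reject, p_adj, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")
print(f"{reject.sum()} of {n_metabolites} metabolites sex-biased after FDR correction")
```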

  17. Predictors of severe trunk postures among short-haul truck drivers during non-driving tasks: an exploratory investigation involving video-assessment and driver behavioural self-monitoring.

    PubMed

    Olson, R; Hahn, D I; Buckert, A

    2009-06-01

    Short-haul truck (lorry) drivers are particularly vulnerable to back pain and injury due to exposure to whole-body vibration, prolonged sitting and demanding material handling tasks. The current project reports the results of video-based assessments (711 stops) and driver behavioural self-monitoring (BSM) (385 stops) of injury hazards during non-driving work. Participants (n = 3) worked in a trailer fitted with a camera system during baseline and BSM phases. Descriptive analyses showed that challenging customer environments and non-standard ingress/egress were prevalent. Statistical modelling of the video-assessment results showed that each instance of manual material handling increased the predicted mean for severe trunk postures by 7%, while customer use of a forklift, moving standard pallets and moving non-standard pallets decreased predicted means by 12%, 20% and 22% respectively. Video and BSM comparisons showed that drivers were accurate at self-monitoring frequent environmental conditions, but less accurate at monitoring trunk postures and rare work events. The current study identified four predictors of severe trunk postures that can be modified to reduce the risk of injury among truck drivers and showed that workers can produce reliable self-assessment data with BSM methods for frequent and easily discriminated environmental events.

  18. Prevalence of psychiatric disorders in patients with diabetes types 1 and 2.

    PubMed

    Maia, Ana Claudia C de Ornelas; Braga, Arthur de Azevedo; Brouwers, Amanda; Nardi, Antonio Egidio; Oliveira e Silva, Adriana Cardoso de

    2012-11-01

    Diabetes mellitus, classified into types 1 and 2, is a chronic disease that shows high comorbidity with psychiatric disorders. Insulin-dependent patients show a higher prevalence of psychiatric disorders than do patients with type 2 diabetes. This research involved the participation of 200 subjects divided into 2 groups: 100 patients with diabetes type 1 and 100 patients with diabetes type 2. This study used the Mini International Neuropsychiatric Interview for the identification of psychiatric disorders. Of the 200 participants, 85 (42.5%) were found to have at least 1 psychiatric disorder. The most prevalent disorders were generalized anxiety disorder (21%), dysthymia (15%), social phobia (7%), current depression (5.5%), lifelong depression (3.5%), panic disorder (2.5%), and risk of suicide (2%). Other disorders with lower prevalence were also identified. The groups showed a statistically significant difference in the presence of dysthymia, current depression, and panic disorder, which were more prevalent in patients with diabetes type 1. The high prevalence of psychiatric disorders in diabetic patients points to the need for greater investment in appropriate diagnostic evaluation of patients that considers mental issues. The difference identified between the groups shows that preventive measures and therapeutic projects should consider the specific demands of each type of diabetes. Crown Copyright © 2012. Published by Elsevier Inc. All rights reserved.

  19. Uncertainties in Estimates of Fleet Average Fuel Economy : A Statistical Evaluation

    DOT National Transportation Integrated Search

    1977-01-01

    Research was performed to assess the current Federal procedure for estimating the average fuel economy of each automobile manufacturer's new car fleet. Test vehicle selection and fuel economy estimation methods were characterized statistically and so...

  20. 7 CFR 2201.13 - Lender.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... guaranteed loans; (4) The performance of the Lender's loan portfolio, including its current delinquency rate... nationally recognized statistical rating organization, as evidenced by written confirmation from the nationally recognized statistical rating organization, subject to updating upon request of the Board; and (ii...

  1. Measurement of positive direct current corona pulse in coaxial wire-cylinder gap

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yin, Han, E-mail: hanyin1986@gmail.com; Zhang, Bo, E-mail: shizbcn@mail.tsinghua.edu.cn; He, Jinliang, E-mail: hejl@tsinghua.edu.cn

    In this paper, a system is designed and developed to measure the positive corona current in coaxial wire-cylinder gaps. The characteristic parameters of corona current pulses, such as the amplitude, rise time, half-wave time, and repetition frequency, are statistically analyzed, and a new set of empirical formulas is derived by numerical fitting. The influence of space charges on corona currents is tested by using three corona cages with different radii. A numerical method is used to solve a simplified ion-flow model to explain the influence of space charges. Based on the statistical results, a stochastic model is developed to simulate the corona pulse trains. This model is verified by comparing the simulated frequency-domain responses with the measured ones.

  2. Mutual interference between statistical summary perception and statistical learning.

    PubMed

    Zhao, Jiaying; Ngo, Nhi; McKendrick, Ryan; Turk-Browne, Nicholas B

    2011-09-01

    The visual system is an efficient statistician, extracting statistical summaries over sets of objects (statistical summary perception) and statistical regularities among individual objects (statistical learning). Although these two kinds of statistical processing have been studied extensively in isolation, their relationship is not yet understood. We first examined how statistical summary perception influences statistical learning by manipulating the task that participants performed over sets of objects containing statistical regularities (Experiment 1). Participants who performed a summary task showed no statistical learning of the regularities, whereas those who performed control tasks showed robust learning. We then examined how statistical learning influences statistical summary perception by manipulating whether the sets being summarized contained regularities (Experiment 2) and whether such regularities had already been learned (Experiment 3). The accuracy of summary judgments improved when regularities were removed and when learning had occurred in advance. In sum, calculating summary statistics impeded statistical learning, and extracting statistical regularities impeded statistical summary perception. This mutual interference suggests that statistical summary perception and statistical learning are fundamentally related.

  3. 77 FR 17404 - Notice of Intent To Seek Approval To Revise and Extend a Currently Approved Information Collection

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-26

    ... DEPARTMENT OF AGRICULTURE National Agricultural Statistics Service Notice of Intent To Seek Approval To Revise and Extend a Currently Approved Information Collection AGENCY: National Agricultural... Date of Current Approval: December 31, 2012. Type of Request: Intent to revise and extend a currently...

  4. Statistics used in current nursing research.

    PubMed

    Zellner, Kathleen; Boerst, Connie J; Tabb, Wil

    2007-02-01

    Undergraduate nursing research courses should emphasize the statistics most commonly used in the nursing literature to strengthen students' and beginning researchers' understanding of them. To determine the most commonly used statistics, we reviewed all quantitative research articles published in 13 nursing journals in 2000. The findings supported Beitz's categorization of kinds of statistics. Ten primary statistics used in 80% of nursing research published in 2000 were identified. We recommend that the appropriate use of those top 10 statistics be emphasized in undergraduate nursing education and that the nursing profession continue to advocate for the use of methods (e.g., power analysis, odds ratio) that may contribute to the advancement of nursing research.

  5. Hematocrit levels as cardiovascular risk among taxi drivers in Bangkok, Thailand

    PubMed Central

    ISHIMARU, Tomohiro; ARPHORN, Sara; JIRAPONGSUWAN, Ann

    2016-01-01

    In Thailand, taxi drivers employed in the informal sector often experience hazardous working conditions. Previous studies revealed that elevated Hematocrit (HCT) is a predictor of cardiovascular disease (CVD) risk. This study assessed factors associated with HCT in taxi drivers to predict their occupational CVD risk factors. A cross-sectional study was conducted on 298 male taxi drivers who joined a health check-up campaign in Bangkok, Thailand. HCT and body mass index were retrieved from participant health check-up files. Self-administered questionnaires assessed demographics, driving mileage, working hours, and lifestyle. Statistical associations were analyzed using stepwise linear regression. Our results showed that obesity (p=0.007), daily alcohol drinking (p=0.003), and current or past smoking (p=0.016) were associated with higher HCT levels. While working hours were not directly associated with HCT levels in the current study, the effect of overworking is statistically arguable because most participants worked substantially longer hours. Our findings suggest that taxi drivers’ CVD risk may be increased by their unhealthy work styles. Initiatives to improve general working conditions for taxi drivers should take into account health promotion and CVD prevention. The policy of providing periodic health check-ups is important to make workers in the informal sector aware of their health status. PMID:27151439

  6. Subcellular distributions of metals and metal induced stress: A field study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jenkins, K.D.; Howe, S.; Sanders, B.M.

    This paper reports the results of a field study which took place around an exploratory well located in the Santa Barbara Channel. This study was designed to test for significant temporal and spatial differences in the concentrations of a number of drilling-fluid-associated metals in both the sediments and biota. Temporal changes in the distribution of Ba, Cd, Cr, Cu, Hg, Ni, Pb, and Zn were examined in the sediments, and the bioaccumulation and subcellular distribution of these metals were examined in three benthic invertebrate species before and after drilling. Statistically significant increases in the accumulation of several of the metals were found in the surface sediments down current from the site after drilling, with Ba showing the most pronounced increase. Statistically significant increases in the bioaccumulation of Ba were also observed in two of the three species examined, Cyclocardia ventricosa and Pectinaria californiensis. Within these organisms the majority of the Ba was localized in the granular pellets (>97%) and less than 0.1% accumulated in the cytosol. These data indicate that although bioaccumulation of Ba occurs in some species immediately down current from the well, most of it remains in an insoluble form, presumably as BaSO4.

  7. Hematocrit levels as cardiovascular risk among taxi drivers in Bangkok, Thailand.

    PubMed

    Ishimaru, Tomohiro; Arphorn, Sara; Jirapongsuwan, Ann

    2016-10-08

    In Thailand, taxi drivers employed in the informal sector often experience hazardous working conditions. Previous studies revealed that elevated Hematocrit (HCT) is a predictor of cardiovascular disease (CVD) risk. This study assessed factors associated with HCT in taxi drivers to predict their occupational CVD risk factors. A cross-sectional study was conducted on 298 male taxi drivers who joined a health check-up campaign in Bangkok, Thailand. HCT and body mass index were retrieved from participant health check-up files. Self-administered questionnaires assessed demographics, driving mileage, working hours, and lifestyle. Statistical associations were analyzed using stepwise linear regression. Our results showed that obesity (p=0.007), daily alcohol drinking (p=0.003), and current or past smoking (p=0.016) were associated with higher HCT levels. While working hours were not directly associated with HCT levels in the current study, the effect of overworking is statistically arguable because most participants worked substantially longer hours. Our findings suggest that taxi drivers' CVD risk may be increased by their unhealthy work styles. Initiatives to improve general working conditions for taxi drivers should take into account health promotion and CVD prevention. The policy of providing periodic health check-ups is important to make workers in the informal sector aware of their health status.

  8. Socioeconomic inequality in smoking in low-income and middle-income countries: results from the World Health Survey.

    PubMed

    Hosseinpoor, Ahmad Reza; Parker, Lucy Anne; Tursan d'Espaignet, Edouard; Chatterji, Somnath

    2012-01-01

    To assess the magnitude and pattern of socioeconomic inequality in current smoking in low and middle income countries. We used data from the World Health Survey [WHS] in 48 low-income and middle-income countries to estimate the crude prevalence of current smoking according to household wealth quintile. A Poisson regression model with a robust variance was used to generate the Relative Index of Inequality [RII] according to wealth within each of the countries studied. In males, smoking was disproportionately prevalent in the poor in the majority of countries. In numerous countries the poorest men were over 2.5 times more likely to smoke than the richest men. Socioeconomic inequality in women was more varied showing patterns of both pro-rich and pro-poor inequality. In 20 countries pro-rich relative socioeconomic inequality was statistically significant: the poorest women had a higher prevalence of smoking compared to the richest women. Conversely, in 9 countries women in the richest population groups had a statistically significant greater risk of smoking compared to the poorest groups. Both the pattern and magnitude of relative inequality may vary greatly between countries. Prevention measures should address the specific pattern of smoking inequality observed within a population.
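    A minimal sketch of the relative index of inequality (RII) estimation described above: a Poisson regression of current smoking on a relative wealth rank (ridit score) with a robust sandwich variance. The simulated data, quintile coding, and the direction of the ridit scale are illustrative assumptions rather than the study's exact specification.

```python
# Hedged sketch: RII via Poisson regression with robust variance on simulated data.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 5000
wealth_quintile = rng.integers(1, 6, n)            # 1 = poorest ... 5 = richest
# Ridit score: midpoint of the cumulative population share, oriented so that
# 0 ~ richest and 1 ~ poorest; exp(beta) then compares poorest with richest.
ridit = 1.0 - (wealth_quintile - 0.5) / 5.0
smoker = rng.binomial(1, 0.15 + 0.20 * ridit)      # smoking concentrated among the poor

X = sm.add_constant(pd.DataFrame({"ridit": ridit}))
res = sm.GLM(smoker, X, family=sm.families.Poisson()).fit(cov_type="HC1")
rii = np.exp(res.params["ridit"])
lo, hi = np.exp(res.conf_int().loc["ridit"])
print(f"RII (poorest vs richest) = {rii:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```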

  9. Socioeconomic Inequality in Smoking in Low-Income and Middle-Income Countries: Results from the World Health Survey

    PubMed Central

    Hosseinpoor, Ahmad Reza; Parker, Lucy Anne; Tursan d'Espaignet, Edouard; Chatterji, Somnath

    2012-01-01

    Objectives To assess the magnitude and pattern of socioeconomic inequality in current smoking in low and middle income countries. Methods We used data from the World Health Survey [WHS] in 48 low-income and middle-income countries to estimate the crude prevalence of current smoking according to household wealth quintile. A Poisson regression model with a robust variance was used to generate the Relative Index of Inequality [RII] according to wealth within each of the countries studied. Results In males, smoking was disproportionately prevalent in the poor in the majority of countries. In numerous countries the poorest men were over 2.5 times more likely to smoke than the richest men. Socioeconomic inequality in women was more varied showing patterns of both pro-rich and pro-poor inequality. In 20 countries pro-rich relative socioeconomic inequality was statistically significant: the poorest women had a higher prevalence of smoking compared to the richest women. Conversely, in 9 countries women in the richest population groups had a statistically significant greater risk of smoking compared to the poorest groups. Conclusion Both the pattern and magnitude of relative inequality may vary greatly between countries. Prevention measures should address the specific pattern of smoking inequality observed within a population. PMID:22952617

  10. Comparative Evaluation of Four Risk Scores for Predicting Mortality in Patients With Implantable Cardioverter-defibrillator for Primary Prevention.

    PubMed

    Rodríguez-Mañero, Moisés; Abu Assi, Emad; Sánchez-Gómez, Juan Miguel; Fernández-Armenta, Juan; Díaz-Infante, Ernesto; García-Bolao, Ignacio; Benezet-Mazuecos, Juan; Andrés Lahuerta, Ana; Expósito-García, Víctor; Bertomeu-González, Vicente; Arce-León, Álvaro; Barrio-López, María Teresa; Peinado, Rafael; Martínez-Sande, Luis; Arias, Miguel A

    2016-11-01

    Several clinical risk scores have been developed to identify patients at high risk of all-cause mortality despite implantation of an implantable cardioverter-defibrillator. We aimed to examine and compare the predictive capacity of 4 simple scoring systems (MADIT-II, FADES, PACE and SHOCKED) for predicting mortality after defibrillator implantation for primary prevention of sudden cardiac death in a Mediterranean country. A multicenter retrospective study was performed in 15 Spanish hospitals. Consecutive patients referred for defibrillator implantation between January 2010 and December 2011 were included. A total of 916 patients with ischemic and nonischemic heart disease were included (mean age, 62 ± 11 years, 81.4% male). Over 33.4 ± 12.9 months, 113 (12.3%) patients died (cardiovascular origin in 86 [9.4%] patients). At 12, 24, 36, and 48 months, mortality rates were 4.5%, 7.6%, 10.8%, and 12.3% respectively. All the risk scores showed a stepwise increase in the risk of death throughout the scoring system of each of the scores and all 4 scores identified patients at greater risk of mortality. The scores were significantly associated with all-cause mortality throughout the follow-up period. PACE displayed the lowest c-index value regardless of whether the population had heart disease of ischemic (c-statistic = 0.61) or nonischemic origin (c-statistic = 0.61), whereas MADIT-II (c-statistic = 0.67 and 0.65 in ischemic and nonischemic cardiomyopathy, respectively), SHOCKED (c-statistic = 0.68 and 0.66, respectively), and FADES (c-statistic = 0.66 and 0.60) provided similar c-statistic values (P ≥ .09). In this nontrial-based cohort of Mediterranean patients, the 4 evaluated risk scores showed a significant stepwise increase in the risk of death. Among the currently available risk scores, MADIT-II, FADES, and SHOCKED provide slightly better performance than PACE. Copyright © 2016 Sociedad Española de Cardiología. Published by Elsevier España, S.L.U. All rights reserved.
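    The discrimination measure used to compare the scores above, the c-statistic, reduces to the area under the ROC curve for a binary mortality outcome; the sketch below uses invented scores and outcomes purely to show the computation.

```python
# Minimal sketch: c-statistic (AUC) of an integer risk score against mortality.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 916
risk_score = rng.integers(0, 8, n)                 # e.g. points on a MADIT-II-like scale
p_death = 0.04 + 0.03 * risk_score                 # higher score, higher mortality (illustrative)
died = rng.binomial(1, np.clip(p_death, 0, 1))

c_statistic = roc_auc_score(died, risk_score)
print(f"c-statistic = {c_statistic:.2f}")
```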

  11. Variance of discharge estimates sampled using acoustic Doppler current profilers from moving boats

    USGS Publications Warehouse

    Garcia, Carlos M.; Tarrab, Leticia; Oberg, Kevin; Szupiany, Ricardo; Cantero, Mariano I.

    2012-01-01

    This paper presents a model for quantifying the random errors (i.e., variance) of acoustic Doppler current profiler (ADCP) discharge measurements from moving boats for different sampling times. The model focuses on the random processes in the sampled flow field and has been developed using statistical methods currently available for uncertainty analysis of velocity time series. Analysis of field data collected using ADCPs from moving boats on three natural rivers of varying sizes and flow conditions shows that, even though the estimated integral time scale of the actual turbulent flow field is larger than the sampling interval, the integral time scale of the sampled flow field is on the order of the sampling interval. Thus, an equation for computing the variance error in discharge measurements associated with different sampling times, assuming uncorrelated flow fields, is appropriate. The approach is used to help define optimal sampling strategies by choosing the exposure time required for ADCPs to accurately measure flow discharge.
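
    The practical consequence of the uncorrelated-sample result, namely that the random error of the discharge estimate shrinks with the square root of the exposure time, can be sketched as below. The symbols (sigma_q for the scatter of instantaneous discharge samples, dt for the sampling interval) are illustrative and not the paper's notation or formula.

      # Sketch: random (sampling) error of a discharge estimate versus exposure time,
      # assuming the sampled flow field is uncorrelated between pings. Inputs are assumed.
      import numpy as np

      def discharge_std_error(sigma_q, dt, exposure_time):
          n_samples = exposure_time / dt          # independent samples collected in the transect
          return sigma_q / np.sqrt(n_samples)     # classic 1/sqrt(N) decay of the mean's error

      # Example: 5% instantaneous scatter, 1 s pings, exposure times of 1-16 minutes.
      for minutes in (1, 2, 4, 8, 16):
          err = discharge_std_error(sigma_q=0.05, dt=1.0, exposure_time=60 * minutes)
          print(f"{minutes:2d} min exposure -> relative error {100 * err:.2f}%")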

  12. The most intense electric currents in turbulent high speed solar wind

    NASA Astrophysics Data System (ADS)

    Podesta, J. J.

    2017-12-01

    Theory and simulations suggest that dissipation of turbulent energy in collisionless astrophysical plasmas occurs most rapidly in spatial regions where the current density is most intense. To advance understanding of plasma heating by turbulent dissipation in the solar corona and solar wind, it is of interest to characterize the properties of plasma regions where the current density takes exceptionally large values and to identify the operative dissipation processes. In the solar wind, the curl of the magnetic field cannot be measured using data from a single spacecraft; however, a suitable proxy for this quantity can be constructed from the spatial derivative of the magnetic field along the flow direction of the plasma. This new approach is used to study the properties of the most intense current-carrying structures in a high speed solar wind stream near 1 AU. In this study, based on 11 Hz magnetometer data from the WIND spacecraft, the spatial resolution of the proxy technique is approximately equal to the proton inertial length. Intense current sheets or current-carrying structures were identified as events where the magnitude of the current density exceeds μ+5σ, where μ and σ are the mean and standard deviation of the magnitude of the current density (or its proxy), respectively. Statistical studies show (1) the average size of these 5σ events is close to the smallest resolvable scale in the data set, the proton inertial length; (2) the linear distance between neighboring events follows a power law distribution; and (3) the average peak current density of 5σ events is around 1 pA/cm2. The analysis techniques used in these studies have been validated using simulated spacecraft data from three dimensional hybrid simulations, which show that results based on the analysis of the proxy are qualitatively and quantitatively similar to results based on the analysis of the true current density.
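
    A schematic of the event-identification step is sketched below: form a proxy current density from the along-flow field increment (using the solar wind speed to convert time to distance), threshold it at the mean plus five standard deviations, and group contiguous samples into events. The array names, units and the exact form of the proxy are assumptions for illustration rather than the paper's implementation.

      # Sketch: "5-sigma" current events from single-spacecraft data via a proxy
      # J ~ (1/mu0) |dB/ds|, with ds = V_sw * dt. B is an (N, 3) array in nT, v_sw in km/s,
      # dt in s; all inputs are assumed, not WIND values.
      import numpy as np

      MU0 = 4e-7 * np.pi  # H/m

      def proxy_current_events(B_nT, v_sw_kms, dt, n_sigma=5.0):
          ds = v_sw_kms * 1e3 * dt                          # along-flow distance per sample, m
          dB = np.diff(B_nT * 1e-9, axis=0)                 # field increments, T
          J = np.linalg.norm(dB, axis=1) / (MU0 * ds)       # proxy current density, A/m^2
          threshold = J.mean() + n_sigma * J.std()
          above = J > threshold
          # group contiguous above-threshold samples into events
          edges = np.diff(above.astype(int))
          starts = np.where(edges == 1)[0] + 1
          ends = np.where(edges == -1)[0] + 1
          if above[0]:
              starts = np.r_[0, starts]
          if above[-1]:
              ends = np.r_[ends, above.size]
          sizes_km = (ends - starts) * ds / 1e3             # event widths, km
          peaks = np.array([J[s:e].max() for s, e in zip(starts, ends)])
          return sizes_km, peaks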

  13. Advanced electrical current measurements of microdischarges: evidence of sub-critical pulses and ion currents in barrier discharge in air

    NASA Astrophysics Data System (ADS)

    Synek, Petr; Zemánek, Miroslav; Kudrle, Vít; Hoder, Tomáš

    2018-04-01

    Electrical current measurements in corona or barrier microdischarges are a challenge as they require both high temporal resolution and a large dynamic range of the current probe used. In this article, we apply a simple self-assembled current probe and compare it to commercial ones. An analysis in the time and frequency domain is carried out. Moreover, an improved methodology is presented, enabling both temporal resolution in sub-nanosecond times and current sensitivity in the order of tens of micro-amperes. Combining this methodology with a high-tech oscilloscope and self-developed software, a unique statistical analysis of currents in volume barrier discharge driven in atmospheric-pressure air is made for over 80 consecutive periods of a 15 kHz applied voltage. We reveal the presence of repetitive sub-critical current pulses and conclude that these can be identified with the discharging of surface charge microdomains. Moreover, extremely low, long-lasting microsecond currents were detected which are caused by ion flow, and are analysed in detail. The statistical behaviour presented gives deeper insight into the discharge physics of these usually undetectable current signals.

  14. [Hippocampal subfield volume alteration in post-traumatic stress disorder: a magnetic resonance imaging study].

    PubMed

    Lu, Lu; Zhang, Lianqing; Hu, Xinyu; Hu, Xiaoxiao; Li, Lingjiang; Gong, Qiyong; Huang, Xiaoqi

    2018-04-01

    In the current study, we aimed to investigate whether post-traumatic stress disorder (PTSD) is associated with structural alterations in specific subfields of the hippocampus compared with trauma-exposed controls (TC) in a relatively large sample. We included 67 PTSD patients who were diagnosed under Diagnostic and Statistical Manual of Mental Disorders (4th Edition) (DSM-Ⅳ) criteria and 78 age- and sex-matched non-PTSD adult survivors who experienced similar stressors. High resolution T1-weighted images were obtained via a GE 3.0 T scanner. The structural data were automatically segmented using FreeSurfer software, and volumes of the whole hippocampus and subfields including CA1, CA2-3, CA4-DG, fimbria, presubiculum, subiculum and fissure were extracted. Volume differences between the two groups were statistically compared with age, years of education, time since the traumatic event and intracranial volume (ICV) as covariates. Hemisphere, sex and diagnosis were entered as fixed factors. Relationships between morphometric measurements and Clinician-Administered PTSD Scale (CAPS) score or illness duration were assessed using Pearson's correlation in SPSS. Compared with TC, PTSD patients showed no statistically significant alteration in volumes of the whole hippocampus or any of the subfields (P > 0.05). In male patients, there were significant correlations between CAPS score and volume of the right CA2-3 (R² = 0.197, P = 0.034) and right subiculum (R² = 0.245, P = 0.016), and illness duration correlated with the right fissure (R² = 0.247, P = 0.016). In female patients, CAPS scores significantly correlated with volumes of the left presubiculum (R² = 0.095, P = 0.042), left subiculum (R² = 0.090, P = 0.048), and left CA4-DG (R² = 0.099, P = 0.037). The main findings of the current study suggest that stressful events cause non-selective damage to the hippocampus in both PTSD patients and TC, and that gender-specific lateralization may underlie PTSD pathology.

  15. Study Designs and Statistical Analyses for Biomarker Research

    PubMed Central

    Gosho, Masahiko; Nagashima, Kengo; Sato, Yasunori

    2012-01-01

    Biomarkers are becoming increasingly important for streamlining drug discovery and development. In addition, biomarkers are widely expected to be used as a tool for disease diagnosis, personalized medication, and surrogate endpoints in clinical research. In this paper, we highlight several important aspects related to study design and statistical analysis for clinical research incorporating biomarkers. We describe the typical and current study designs for exploring, detecting, and utilizing biomarkers. Furthermore, we introduce statistical issues such as confounding and multiplicity for statistical tests in biomarker research. PMID:23012528

  16. Generic dynamical phase transition in one-dimensional bulk-driven lattice gases with exclusion

    NASA Astrophysics Data System (ADS)

    Lazarescu, Alexandre

    2017-06-01

    Dynamical phase transitions are crucial features of the fluctuations of statistical systems, corresponding to boundaries between qualitatively different mechanisms of maintaining unlikely values of dynamical observables over long periods of time. They manifest themselves in the form of non-analyticities in the large deviation function of those observables. In this paper, we look at bulk-driven exclusion processes with open boundaries. It is known that the standard asymmetric simple exclusion process exhibits a dynamical phase transition in the large deviations of the current of particles flowing through it. That phase transition has been described thanks to specific calculation methods relying on the model being exactly solvable, but more general methods have also been used to describe the extreme large deviations of that current, far from the phase transition. We extend those methods to a large class of models based on the ASEP, where we add arbitrary spatial inhomogeneities in the rates and short-range potentials between the particles. We show that, as for the regular ASEP, the large deviation function of the current scales differently with the size of the system if one considers very high or very low currents, pointing to the existence of a dynamical phase transition between those two regimes: high-current large deviations are extensive in the system size, and the typical states associated with them are Coulomb gases, which are highly correlated; low-current large deviations do not depend on the system size, and the typical states associated with them are anti-shocks, consistent with a hydrodynamic behaviour. Finally, we illustrate our results numerically on a simple example, and we interpret the transition in terms of the current pushing beyond its maximal hydrodynamic value, as well as relate it to the appearance of Tracy-Widom distributions in the relaxation statistics of such models.

  17. The forecasting of menstruation based on a state-space modeling of basal body temperature time series.

    PubMed

    Fukaya, Keiichi; Kawamori, Ai; Osada, Yutaka; Kitazawa, Masumi; Ishiguro, Makio

    2017-09-20

    Women's basal body temperature (BBT) shows a periodic pattern that is associated with the menstrual cycle. Although this fact suggests that daily BBT time series could be useful for estimating the underlying phase state as well as for predicting the length of the current menstrual cycle, little attention has been paid to modelling BBT time series. In this study, we propose a state-space model that involves the menstrual phase as a latent state variable to explain the daily fluctuation of BBT and the menstrual cycle length. Conditional distributions of the phase are obtained by using sequential Bayesian filtering techniques. A predictive distribution of the next menstruation day can be derived based on this conditional distribution and the model, leading to a novel statistical framework that provides a sequentially updated prediction of the upcoming menstruation day. We applied this framework to a real data set of women's BBT and menstruation days and compared the prediction accuracy of the proposed method with that of previous methods, showing that the proposed method generally provides a better prediction. Because BBT can be obtained with relatively small cost and effort, the proposed method can be useful for women's health management. Potential extensions of this framework as the basis of modeling and predicting events that are associated with the menstrual cycle are discussed. © 2017 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.
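
    The filtering idea can be illustrated with a toy bootstrap particle filter over a latent phase. Everything below, including the biphasic BBT-versus-phase curve, the lognormal day-to-day phase advance, the noise levels and the 28-day mean cycle, is invented for illustration; it is not the published state-space model, only a sketch of sequential Bayesian filtering and of predicting the next menstruation day as the time until the phase next completes a cycle.

      # Toy sketch of sequential Bayesian filtering for a latent menstrual phase driving BBT.
      import numpy as np

      rng = np.random.default_rng(1)

      def bbt_mean(phase):
          # assumed biphasic curve: higher BBT in the luteal half of the cycle (invented shape)
          return 36.4 + 0.3 / (1.0 + np.exp(-8.0 * np.sin(phase - np.pi / 2)))

      def particle_filter(bbt_obs, n_particles=5000, mean_cycle_days=28.0):
          omega = 2 * np.pi / mean_cycle_days           # mean daily phase advance
          phase = rng.uniform(0, 2 * np.pi, n_particles)
          for y in bbt_obs:
              # propagate: advance the phase with some day-to-day variability
              phase = (phase + omega * rng.lognormal(0.0, 0.2, n_particles)) % (2 * np.pi)
              # weight by the likelihood of today's BBT reading (Gaussian noise assumed)
              w = np.exp(-0.5 * ((y - bbt_mean(phase)) / 0.15) ** 2) + 1e-12
              w /= w.sum()
              # resample (bootstrap filter)
              phase = phase[rng.choice(n_particles, n_particles, p=w)]
          # predictive distribution of days until the phase next completes a cycle (menstruation onset)
          days_left = ((2 * np.pi - phase) % (2 * np.pi)) / omega
          return np.median(days_left), np.percentile(days_left, [10, 90])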

  18. Critical Values for Yen’s Q3: Identification of Local Dependence in the Rasch Model Using Residual Correlations

    PubMed Central

    Christensen, Karl Bang; Makransky, Guido; Horton, Mike

    2016-01-01

    The assumption of local independence is central to all item response theory (IRT) models. Violations can lead to inflated estimates of reliability and problems with construct validity. For the most widely used fit statistic, Q3, there are currently no well-documented suggestions of the critical values which should be used to indicate local dependence (LD), and for this reason, a variety of arbitrary rules of thumb are used. In this study, an empirical data example and Monte Carlo simulation were used to investigate the different factors that can influence the null distribution of residual correlations, with the objective of proposing guidelines that researchers and practitioners can follow when making decisions about LD during scale development and validation. We propose that a parametric bootstrapping procedure be implemented in each separate situation to obtain the critical value of LD applicable to the data set, and we provide example critical values for a number of data structure situations. The results show that for the Q3 fit statistic, no single critical value is appropriate for all situations, as the percentiles in the empirical null distribution are influenced by the number of items, the sample size, and the number of response categories. Furthermore, the results show that LD should be considered relative to the average observed residual correlation, rather than to a uniform value, as this results in more stable percentiles for the null distribution of an adjusted fit statistic. PMID:29881087
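
    The bootstrapping idea can be condensed as below: simulate Rasch-conforming data of the same dimensions as the real data, compute the matrix of residual correlations (Yen's Q3), and read off an upper percentile of its largest off-diagonal element as the critical value. For brevity the sketch re-uses the generating person and item parameters instead of re-estimating the model for every replicate, and it omits the recommended adjustment relative to the average residual correlation; both would be part of the full procedure.

      # Sketch: parametric bootstrap of the null distribution of Yen's Q3 under the Rasch model.
      import numpy as np

      rng = np.random.default_rng(2)

      def rasch_prob(theta, b):
          return 1.0 / (1.0 + np.exp(-(theta[:, None] - b[None, :])))

      def q3_matrix(X, theta, b):
          resid = X - rasch_prob(theta, b)              # observed minus model-expected responses
          return np.corrcoef(resid, rowvar=False)       # item-by-item residual correlations

      def q3_critical_value(n_persons, item_difficulties, n_reps=200, percentile=95):
          b = np.asarray(item_difficulties)
          max_q3 = []
          for _ in range(n_reps):
              theta = rng.normal(0.0, 1.0, n_persons)   # simulated person abilities
              P = rasch_prob(theta, b)
              X = (rng.uniform(size=P.shape) < P).astype(float)
              Q3 = q3_matrix(X, theta, b)
              iu = np.triu_indices_from(Q3, k=1)
              max_q3.append(Q3[iu].max())               # largest off-diagonal Q3 in this replicate
          return np.percentile(max_q3, percentile)

      # e.g. a critical value for 500 persons and 10 dichotomous items spread over -2..2 logits
      print(round(q3_critical_value(500, np.linspace(-2, 2, 10)), 3))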

  19. The Relationship Between Organizational Culture and Organizational Commitment in Zahedan University of Medical Sciences.

    PubMed

    Azizollah, Arbabisarjou; Abolghasem, Farhang; Mohammad Amin, Dadgar

    2015-12-14

    Organizations strive to achieve common goals, and many constructs are needed for organizations to function. Organizational culture and organizational commitment are key concepts in management. The objective of the current research is to study the relationship between organizational culture and organizational commitment among the personnel of Zahedan University of Medical Sciences. This is a descriptive-correlational study. The statistical population was the whole tenured staff of Zahedan University of Medical Sciences who worked for this organization in 2012-2013. Random sampling was used and 165 participants were chosen. Two standardized questionnaires, on organizational culture (Schein, 1984) and organizational commitment (Meyer & Allen, 2002), were applied. The face and construct validity of the questionnaires were approved by lecturers in management and other experts. The reliability of the organizational culture and organizational commitment questionnaires, estimated by Cronbach's alpha, was 0.89 and 0.88, respectively. All statistical calculations were performed using the Statistical Package for the Social Sciences version 21.0 (SPSS Inc., Chicago, IL, USA). The level of significance was set at P<0.05. The findings of the study showed that there was a significant relationship between organizational culture and organizational commitment (P-value=0.027). The results also showed significant relationships between organizational culture and affective commitment (P-value=0.009), organizational culture and continuance commitment (P-value=0.009), and organizational culture and normative commitment (P-value=0.009).

  20. The Relationship Between Organizational Culture and Organizational Commitment in Zahedan University of Medical Sciences

    PubMed Central

    Azizollah, Arbabisarjou; Abolghasem, Farhang; Amin, Dadgar Mohammad

    2016-01-01

    Background and Objective: Organizations strive to achieve common goals, and many constructs are needed for organizations to function. Organizational culture and organizational commitment are key concepts in management. The objective of the current research is to study the relationship between organizational culture and organizational commitment among the personnel of Zahedan University of Medical Sciences. Materials and Methods: This is a descriptive-correlational study. The statistical population was the whole tenured staff of Zahedan University of Medical Sciences who worked for this organization in 2012-2013. Random sampling was used and 165 participants were chosen. Two standardized questionnaires, on organizational culture (Schein, 1984) and organizational commitment (Meyer & Allen, 2002), were applied. The face and construct validity of the questionnaires were approved by lecturers in management and other experts. The reliability of the organizational culture and organizational commitment questionnaires, estimated by Cronbach's alpha, was 0.89 and 0.88, respectively. All statistical calculations were performed using the Statistical Package for the Social Sciences version 21.0 (SPSS Inc., Chicago, IL, USA). The level of significance was set at P<0.05. Findings: The findings of the study showed that there was a significant relationship between organizational culture and organizational commitment (P-value=0.027). The results also showed significant relationships between organizational culture and affective commitment (P-value=0.009), organizational culture and continuance commitment (P-value=0.009), and organizational culture and normative commitment (P-value=0.009). PMID:26925884

  1. Statistical and dynamical forecast of regional precipitation after mature phase of ENSO

    NASA Astrophysics Data System (ADS)

    Sohn, S.; Min, Y.; Lee, J.; Tam, C.; Ahn, J.

    2010-12-01

    While the seasonal predictability of general circulation models (GCMs) has improved, the current model atmosphere in the mid-latitudes does not respond correctly to external forcing such as tropical sea surface temperature (SST), particularly over the East Asia and western North Pacific summer monsoon regions. In addition, the time-scale of the prediction scope is considerably limited and model forecast skill is still very poor beyond two weeks. Although recent studies indicate that coupled-model-based multi-model ensemble (MME) forecasts show better performance, long-lead forecasts exceeding 9 months still show a dramatic decrease in seasonal predictability. This study aims at diagnosing dynamical MME forecasts comprised of state-of-the-art 1-tier models as well as comparing them with statistical model forecasts, focusing on East Asian summer precipitation predictions after the mature phase of ENSO. The lagged impact of El Nino, as a major climate contributor, on the summer monsoon in model environments is also evaluated in terms of conditional probabilities. To evaluate the probability forecast skills, the reliability (attributes) diagram and the relative operating characteristic, following the recommendations of the World Meteorological Organization (WMO) Standardized Verification System for Long-Range Forecasts, are used in this study. The results should shed light on the prediction skill of the dynamical models and also of the statistical model in forecasting East Asian summer monsoon rainfall at a long lead time.
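
    The two verification tools named here can be sketched compactly; y_obs (1 when, say, above-normal seasonal rainfall occurred) and p_fcst (the forecast probability of that category) are assumed inputs, and the use of scikit-learn is a convenience for illustration, not part of the WMO verification system itself.

      # Sketch: reliability (attributes) diagram points and ROC for probabilistic category forecasts.
      import numpy as np
      from sklearn.calibration import calibration_curve
      from sklearn.metrics import roc_curve, roc_auc_score

      def verify_probability_forecasts(y_obs, p_fcst, n_bins=10):
          obs_freq, mean_fcst = calibration_curve(y_obs, p_fcst, n_bins=n_bins)  # reliability diagram
          fpr, tpr, _ = roc_curve(y_obs, p_fcst)                                 # relative operating characteristic
          return (mean_fcst, obs_freq), (fpr, tpr), roc_auc_score(y_obs, p_fcst)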

  2. Assessment of the effect of Allium sativum on serum nitric oxide level and hepatic histopathology in experimental cystic echinococcosis in mice.

    PubMed

    Ali, Nehad Mahmoud; Ibrahim, Ayman Nabil; Ahmed, Naglaa Samier

    2016-09-01

    The current study was carried out to evaluate the prophylactic and therapeutic effects of Allium sativum on experimental cystic echinococcosis by measuring the serum nitric oxide level and studying hepatic histopathological changes. The experimental animals were divided into five groups of ten mice each: group (I): prophylactic; group (II): therapeutic; group (III): prophylactic and therapeutic; group (IV): infected, non-treated; group (V): non-infected, non-treated. The results showed that serum nitric oxide was significantly increased as a result of infection in all infected groups compared to group V. A statistically significant difference in serum nitrate level was noted in group I at the 1st and 8th weeks post infection compared to the same time intervals in group IV. In group II, statistical significance was observed only at the 1st week post infection. A statistically significant difference in serum nitrate level was noted in group III at the 1st, 4th, 6th and 8th weeks post infection compared to the same time intervals in group IV. Hydatid cysts developed in the livers of group IV mice as early as 4 weeks after infection, while no cysts were found in groups I, II and III. Histopathologically, there were moderate pathological changes in groups I and II: hepatocytes showed moderate steatosis, moderate venous congestion and inflammatory cellular infiltrate with foci of degeneration and necrosis. The livers of group III mice showed mild steatosis, mild venous congestion, mild inflammatory cellular infiltrate, no necrosis and no biliary hyperplasia. Accordingly, garlic (Allium sativum) may be a promising phytotherapeutic agent for cystic echinococcosis.

  3. [Influence of diet and behavior related factors on the peripheral blood triglyceride levels in adults: a cross-sectional study].

    PubMed

    Liang, M B; Wang, H; Zhang, J; He, Q F; Fang, L; Wang, L X; Su, D T; Zhao, M; Zhang, X W; Hu, R Y; Cong, L M; Ding, G G; Ye, Z; Yu, M

    2017-12-10

    Objective: To study the influence of diet and behavior related factors on peripheral blood triglyceride (TG) levels in adults, through a cross-sectional survey. Methods: The current study included 13 434 subjects without histories of major chronic diseases from a population-based cross-sectional survey, the 2010 Metabolic Syndrome Survey in Zhejiang Province. A generalized linear model was used to investigate the influence of diet/behavior-related factors on peripheral blood TG levels. Results: The mean TG of the sample population was (1.36±1.18) mmol/L. The proportions of elevated TG and marginally elevated TG were 10.3% and 11.0% respectively, with a statistically significant difference between males and females (χ²=44.135, P<0.001). In this sampled population, the daily intake of cooking oil exceeded the recommended level by over 50%, while the intake of fruit, milk and nuts, as well as physical exercise, fell well below the recommendations. In the male population, statistically significant differences were found for smoking, alcohol intake, and meat, fruit and water intake. In females, aquatic product intake and physical exercise showed statistically significant differences. After controlling for other variables, age, drinking, staple food and aquatic products showed a positive influence on TG, while milk showed a negative influence on TG. In the interaction analysis, fruit and meat intake in males and staple food in females showed a positive influence on TG compared to the reference group. Conclusion: Hypertriglyceridemia was one of the major metabolic abnormalities in Zhejiang province. Community intervention programs should prioritize monitoring alcohol, staple food and meat intake.

  4. Statistically Assessing Time-Averaged and Paleosecular Variation Field Models Against Paleomagnetic Directional Data Sets. Can Likely non-Zonal Features be Detected in a Robust way ?

    NASA Astrophysics Data System (ADS)

    Hulot, G.; Khokhlov, A.

    2007-12-01

    We recently introduced a method to rigorously test the statistical compatibility of combined time-averaged (TAF) and paleosecular variation (PSV) field models against any lava flow paleomagnetic database (Khokhlov et al., 2001, 2006). Applying this method to test (TAF+PSV) models against synthetic data produced from those models shows that the method is very efficient at discriminating models, and very sensitive, provided data errors are properly taken into account. This prompted us to test a variety of published combined (TAF+PSV) models against a test Brunhes stable-polarity data set extracted from the Quidelleur et al. (1994) database. Not surprisingly, ignoring data errors leads all models to be rejected. But taking data errors into account leads to the stimulating conclusion that at least one (TAF+PSV) model appears to be compatible with the selected data set, this model being purely axisymmetric. This result shows that in practice also, with the databases currently available, the method can discriminate various candidate models and decide which actually best fits a given data set. But it also shows that likely non-zonal signatures of non-homogeneous boundary conditions imposed by the mantle are difficult to identify as statistically robust from paleomagnetic directional data sets. In the present paper, we discuss the possibility that such signatures could eventually be identified as robust with the help of more recent data sets (such as the one put together under the collaborative "TAFI" effort, see e.g. Johnson et al. abstract #GP21A-0013, AGU Fall Meeting, 2005) or by taking additional information into account (such as the possible coincidence of non-zonal time-averaged field patterns with analogous patterns in the modern field).

  5. Evaluation of Lightning Incidence to Elements of a Complex Structure: A Monte Carlo Approach

    NASA Technical Reports Server (NTRS)

    Mata, Carlos T.; Rakov, V. A.

    2008-01-01

    There are complex structures for which the installation and positioning of the lightning protection system (LPS) cannot be done using the lightning protection standard guidelines. As a result, there are some "unprotected" or "exposed" areas. In an effort to quantify the lightning threat to these areas, a Monte Carlo statistical tool has been developed. This statistical tool uses two random number generators: a uniform distribution to generate the origins of downward propagating leaders and a lognormal distribution to generate the corresponding return stroke peak currents. Downward leaders propagate vertically downward and their striking distances are defined by the polarity and peak current. Following the electrogeometrical concept, we assume that the leader attaches to the closest object within its striking distance. The statistical analysis is run for N years with an assumed ground flash density, and the output of the program is the probability of direct attachment to objects of interest with its corresponding peak current distribution.
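
    A compact sketch of such a sampling loop is given below. The lognormal peak-current parameters and the electrogeometric relation r = 10·I^0.65 (r in metres, I in kA) are commonly used values assumed here for illustration; the object list, area and flash density are placeholders, and the actual tool may differ in its attachment details.

      # Sketch of a Monte Carlo loop: uniform leader origins over a square analysis area,
      # lognormal return stroke peak currents, and attachment to whatever the vertically
      # descending leader first comes within striking distance of -- an object or flat ground.
      import numpy as np

      rng = np.random.default_rng(3)

      def simulate_strikes(objects_xyh, area_km2, flashes_per_km2_yr, years,
                           median_kA=31.0, sigma_ln=0.66):
          """objects_xyh: (n, 3) array of object x, y (m) and height h (m); assumed inputs."""
          side = np.sqrt(area_km2) * 1e3
          n_flashes = rng.poisson(flashes_per_km2_yr * area_km2 * years)
          hits = np.zeros(len(objects_xyh), dtype=int)
          currents = [[] for _ in objects_xyh]
          for _ in range(n_flashes):
              x, y = rng.uniform(0.0, side, 2)                    # uniform leader origin
              i_kA = rng.lognormal(np.log(median_kA), sigma_ln)   # peak current
              r = 10.0 * i_kA ** 0.65                             # striking distance, m (assumed relation)
              dh = np.hypot(objects_xyh[:, 0] - x, objects_xyh[:, 1] - y)
              # altitude at which the descending leader tip first comes within r of each object top
              z_obj = np.where(dh <= r,
                               objects_xyh[:, 2] + np.sqrt(np.clip(r**2 - dh**2, 0.0, None)),
                               -np.inf)
              k = int(np.argmax(z_obj))
              if z_obj[k] > r:            # object reached before flat ground (ground is reached at altitude r)
                  hits[k] += 1
                  currents[k].append(i_kA)
          return hits / years, currents   # strikes per year per object and their peak currents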

  6. Multimodal Image Analysis in Alzheimer’s Disease via Statistical Modelling of Non-local Intensity Correlations

    NASA Astrophysics Data System (ADS)

    Lorenzi, Marco; Simpson, Ivor J.; Mendelson, Alex F.; Vos, Sjoerd B.; Cardoso, M. Jorge; Modat, Marc; Schott, Jonathan M.; Ourselin, Sebastien

    2016-04-01

    The joint analysis of brain atrophy measured with magnetic resonance imaging (MRI) and hypometabolism measured with positron emission tomography with fluorodeoxyglucose (FDG-PET) is of primary importance in developing models of pathological changes in Alzheimer’s disease (AD). Most of the current multimodal analyses in AD assume a local (spatially overlapping) relationship between MR and FDG-PET intensities. However, it is well known that atrophy and hypometabolism are prominent in different anatomical areas. The aim of this work is to describe the relationship between atrophy and hypometabolism by means of a data-driven statistical model of non-overlapping intensity correlations. For this purpose, FDG-PET and MRI signals are jointly analyzed through a computationally tractable formulation of partial least squares regression (PLSR). The PLSR model is estimated and validated on a large clinical cohort of 1049 individuals from the ADNI dataset. Results show that the proposed non-local analysis outperforms classical local approaches in terms of predictive accuracy while providing a plausible description of disease dynamics: early AD is characterised by non-overlapping temporal atrophy and temporo-parietal hypometabolism, while the later disease stages show overlapping brain atrophy and hypometabolism spread in temporal, parietal and cortical areas.
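
    The core regression step, linking one modality's intensities to the other through a small number of latent components, can be sketched with scikit-learn's PLS regression. The array names, component count and cross-validation scheme are illustrative assumptions; the paper's own computationally tractable PLSR formulation for full images is not reproduced here.

      # Sketch: PLS regression from MRI-derived features to FDG-PET intensities across subjects.
      # X_mri and Y_pet are (subjects x features) arrays, assumed inputs.
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.model_selection import cross_val_score

      def fit_plsr(X_mri, Y_pet, n_components=10):
          pls = PLSRegression(n_components=n_components)
          # cross-validated R^2 as a simple measure of predictive accuracy
          r2 = cross_val_score(pls, X_mri, Y_pet, cv=5, scoring='r2').mean()
          pls.fit(X_mri, Y_pet)
          # pls.x_loadings_ and pls.y_loadings_ describe the joint (possibly non-overlapping) patterns
          return pls, r2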

  7. Fast Radio Bursts’ Recipes for the Distributions of Dispersion Measures, Flux Densities, and Fluences

    NASA Astrophysics Data System (ADS)

    Niino, Yuu

    2018-05-01

    We investigate how the statistical properties of the dispersion measure (DM) and apparent flux density/fluence of (nonrepeating) fast radio bursts (FRBs) are determined by the unknown cosmic rate density history ρ_FRB(z) and luminosity function (LF) of the transient events. We predict the distributions of DMs, flux densities, and fluences of FRBs taking account of the variation of the receiver efficiency within its beam, using analytical models of ρ_FRB(z) and the LF. Comparing the predictions with the observations, we show that the cumulative distribution of apparent fluences suggests that FRBs originate at cosmological distances and that ρ_FRB increases with redshift, resembling the cosmic star formation history (CSFH). We also show that an LF model with a bright-end cutoff at log10 Lν (erg s-1 Hz-1) ~ 34 is favored to reproduce the observed DM distribution if ρ_FRB(z) ∝ CSFH, although the statistical significance of the constraints obtained with the current size of the observed sample is not high. Finally, we find that the correlation between DM and flux density of FRBs is potentially a powerful tool to distinguish whether FRBs are at cosmological distances or in the local universe more robustly with future observations.

  8. Effect of Robotics on Elementary Preservice Teachers' Self-Efficacy, Science Learning, and Computational Thinking

    NASA Astrophysics Data System (ADS)

    Jaipal-Jamani, Kamini; Angeli, Charoula

    2017-04-01

    The current impetus for increasing STEM in K-12 education calls for an examination of how preservice teachers are being prepared to teach STEM. This paper reports on a study that examined elementary preservice teachers' ( n = 21) self-efficacy, understanding of science concepts, and computational thinking as they engaged with robotics in a science methods course. Data collection methods included pretests and posttests on science content, prequestionnaires and postquestionnaires for interest and self-efficacy, and four programming assignments. Statistical results showed that preservice teachers' interest and self-efficacy with robotics increased. There was a statistically significant difference between preknowledge and postknowledge scores, and preservice teachers did show gains in learning how to write algorithms and debug programs over repeated programming tasks. The findings suggest that the robotics activity was an effective instructional strategy to enhance interest in robotics, increase self-efficacy to teach with robotics, develop understandings of science concepts, and promote the development of computational thinking skills. Study findings contribute quantitative evidence to the STEM literature on how robotics develops preservice teachers' self-efficacy, science knowledge, and computational thinking skills in higher education science classroom contexts.

  9. Fish-Eye Observing with Phased Array Radio Telescopes

    NASA Astrophysics Data System (ADS)

    Wijnholds, S. J.

    The radio astronomical community is currently developing and building several new radio telescopes based on phased array technology. These telescopes provide a large field of view that may in principle span a full hemisphere. This makes calibration and imaging very challenging tasks due to the complex source structures and direction-dependent radio wave propagation effects. In this thesis, calibration and imaging methods are developed based on least squares estimation of instrument and source parameters. Monte Carlo simulations and actual observations with several prototypes show that this model-based approach provides statistically and computationally efficient solutions. The error analysis provides a rigorous mathematical framework to assess the imaging performance of current and future radio telescopes in terms of the effective noise, which is the combined effect of propagated calibration errors, noise in the data and source confusion.

  10. Sequencing proteins with transverse ionic transport in nanochannels.

    PubMed

    Boynton, Paul; Di Ventra, Massimiliano

    2016-05-03

    De novo protein sequencing is essential for understanding cellular processes that govern the function of living organisms and all sequence modifications that occur after a protein has been constructed from its corresponding DNA code. By obtaining the order of the amino acids that compose a given protein one can then determine both its secondary and tertiary structures through structure prediction, which is used to create models for protein aggregation diseases such as Alzheimer's Disease. Here, we propose a new technique for de novo protein sequencing that involves translocating a polypeptide through a synthetic nanochannel and measuring the ionic current of each amino acid through an intersecting perpendicular nanochannel. We find that the distribution of ionic currents for each of the 20 proteinogenic amino acids encoded by eukaryotic genes is statistically distinct, showing this technique's potential for de novo protein sequencing.

  11. Voltage-Gated Lipid Ion Channels

    PubMed Central

    Blicher, Andreas; Heimburg, Thomas

    2013-01-01

    Synthetic lipid membranes can display channel-like ion conduction events even in the absence of proteins. We show here that these events are voltage-gated with a quadratic voltage dependence as expected from electrostatic theory of capacitors. To this end, we recorded channel traces and current histograms in patch-experiments on lipid membranes. We derived a theoretical current-voltage relationship for pores in lipid membranes that describes the experimental data very well when assuming an asymmetric membrane. We determined the equilibrium constant between closed and open state and the open probability as a function of voltage. The voltage-dependence of the lipid pores is found comparable to that of protein channels. Lifetime distributions of open and closed events indicate that the channel open distribution does not follow exponential statistics but rather power law behavior for long open times. PMID:23823188

  12. On magnetic reconnection in the Venusian wake. The experimental evidences

    NASA Astrophysics Data System (ADS)

    Fedorov, A.; Volwerk, M.; Zhang, T.; Barabash, S.; Sauvaud, J.

    2009-12-01

    The Venusian magnetotail is formed by solar wind magnetic flux tubes draping around the planet and stretched antisunward. The magnetotail topology consists of two magnetic lobes separated by a thin current sheet. Such a configuration is a free energy reservoir. The accumulated energy is generally released by acceleration of planetary ions antisunward. But in the case of magnetic reconnection, hypothesized to occur somewhere in the equatorial current sheet, some of the planetary ions filling the tail should instead be accelerated toward the planet. The present paper is devoted to the study of such sunward flows observed by the IMA mass spectrometer onboard the Venus Express orbiter. The case study shows rare, incidentally observed precipitations of heavy ions on the nightside of the planet. The statistical study gives us the spatial distribution of such precipitations and the conditions of their appearance.

  13. Current management of myomas: the place of medical therapy with the advent of selective progesterone receptor modulators.

    PubMed

    Donnez, Jacques; Arriagada, Pablo; Donnez, Olivier; Dolmans, Marie-Madeleine

    2015-12-01

    To review the current management of myomas with the advent of selective progesterone receptor modulators. Selective progesterone receptor modulators have proved effective and recent publications on the use of ulipristal acetate (UPA) have analyzed the performance of long-term intermittent utilization of 10 mg UPA given in repeated courses of 3 months. This long-term intermittent therapy maximizes the efficacy of UPA. Indeed, control of bleeding is achieved sooner after each course. With each subsequent course, a statistically greater number of patients show a fibroid volume reduction of more than 50%. The choice of therapy is influenced by different factors, such as the severity of symptoms, tumor characteristics, age, and wish to preserve the uterus (and fertility). Use of UPA will undoubtedly modify the surgical approach.

  14. Current-voltage characteristics and transition voltage spectroscopy of individual redox proteins.

    PubMed

    Artés, Juan M; López-Martínez, Montserrat; Giraudet, Arnaud; Díez-Pérez, Ismael; Sanz, Fausto; Gorostiza, Pau

    2012-12-19

    Understanding how molecular conductance depends on voltage is essential for characterizing molecular electronics devices. We reproducibly measured current-voltage characteristics of individual redox-active proteins by scanning tunneling microscopy under potentiostatic control in both tunneling and wired configurations. From these results, transition voltage spectroscopy (TVS) data for individual redox molecules can be calculated and analyzed statistically, adding a new dimension to conductance measurements. The transition voltage (TV) is discussed in terms of the two-step electron transfer (ET) mechanism. Azurin displays the lowest TV measured to date (0.4 V), consistent with the previously reported distance decay factor. This low TV may be advantageous for fabricating and operating molecular electronic devices for different applications. Our measurements show that TVS is a helpful tool for single-molecule ET measurements and suggest a mechanism for gating of ET between partner redox proteins.

  15. The Evolution of Random Number Generation in MUVES

    DTIC Science & Technology

    2017-01-01

    ... MUVES, including the mathematical basis and statistical justification for algorithms used in the code. The working code provided produces results identical to the current ... questionable numerical and statistical properties. The development of the modern system is traced through software change requests, resulting in a random number ...

  16. A Response to White and Gorard: Against Inferential Statistics: How and Why Current Statistics Teaching Gets It Wrong

    ERIC Educational Resources Information Center

    Nicholson, James; Ridgway, Jim

    2017-01-01

    White and Gorard make important and relevant criticisms of some of the methods commonly used in social science research, but go further by criticising the logical basis for inferential statistical tests. This paper comments briefly on matters on which we broadly agree with them, and more fully on matters where we disagree. We agree that too little…

  17. Log-Normality and Multifractal Analysis of Flame Surface Statistics

    NASA Astrophysics Data System (ADS)

    Saha, Abhishek; Chaudhuri, Swetaprovo; Law, Chung K.

    2013-11-01

    The turbulent flame surface is typically highly wrinkled and folded at a multitude of scales controlled by various flame properties. It is useful if the information contained in this complex geometry can be projected onto a simpler regular geometry for the use of spectral, wavelet or multifractal analyses. Here we investigate local flame surface statistics of a turbulent flame expanding under constant pressure. First, the statistics of the local length ratio are experimentally obtained from high-speed Mie scattering images. For a spherically expanding flame, the length ratio on the measurement plane, at predefined equiangular sectors, is defined as the ratio of the actual flame length to the length of a circular arc of radius equal to the average radius of the flame. Assuming isotropic distribution of such flame segments, we convolute suitable forms of the length-ratio probability distribution functions (pdfs) to arrive at corresponding area-ratio pdfs. Both pdfs are found to be near log-normally distributed and show self-similar behavior with increasing radius. The near log-normality and rather intermittent behavior of the flame length ratio suggest similarity with dissipation-rate quantities, which motivates multifractal analysis.
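
    The log-normality check itself is straightforward to sketch; `length_ratio` is an assumed 1-D array of per-sector length ratios, and the zero-location constraint and Kolmogorov-Smirnov test are illustrative choices, not necessarily those used in the study.

      # Sketch: fit a lognormal distribution to flame length-ratio samples and test the fit.
      import numpy as np
      from scipy import stats

      def lognormality_check(length_ratio):
          shape, loc, scale = stats.lognorm.fit(length_ratio, floc=0.0)   # fit with zero location
          ks = stats.kstest(length_ratio, 'lognorm', args=(shape, loc, scale))
          return {'sigma': shape, 'mu': np.log(scale), 'ks_pvalue': ks.pvalue}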

  18. Parabens abatement from surface waters by electrochemical advanced oxidation with boron doped diamond anodes.

    PubMed

    Domínguez, Joaquín R; Muñoz-Peña, Maria J; González, Teresa; Palo, Patricia; Cuerda-Correa, Eduardo M

    2016-10-01

    The removal efficiency of four commonly-used parabens by electrochemical advanced oxidation with boron-doped diamond anodes in two different aqueous matrices, namely ultrapure water and surface water from the Guadiana River, has been analyzed. Response surface methodology and a factorial, composite, central, orthogonal, and rotatable (FCCOR) statistical design of experiments have been used to optimize the process. The experimental results clearly show that the initial concentration of pollutants is the factor that influences the removal efficiency in a more remarkable manner in both aqueous matrices. As a rule, as the initial concentration of parabens increases, the removal efficiency decreases. The current density also affects the removal efficiency in a statistically significant manner in both aqueous matrices. In the water river aqueous matrix, a noticeable synergistic effect on the removal efficiency has been observed, probably due to the presence of chloride ions that increase the conductivity of the solution and contribute to the generation of strong secondary oxidant species such as chlorine or HClO/ClO - . The use of a statistical design of experiments made it possible to determine the optimal conditions necessary to achieve total removal of the four parabens in ultrapure and river water aqueous matrices.

  19. Revising the lower statistical limit of x-ray grating-based phase-contrast computed tomography.

    PubMed

    Marschner, Mathias; Birnbacher, Lorenz; Willner, Marian; Chabior, Michael; Herzen, Julia; Noël, Peter B; Pfeiffer, Franz

    2017-01-01

    Phase-contrast x-ray computed tomography (PCCT) is currently investigated as an interesting extension of conventional CT, providing high soft-tissue contrast even when examining weakly absorbing specimens. Until now, the potential for dose reduction was thought to be limited compared to attenuation CT, since meaningful phase retrieval fails for scans with very low photon counts when using the conventional phase retrieval method via phase stepping. In this work, we examine the statistical behaviour of the reverse projection method, an alternative phase retrieval approach, and compare the results to the conventional phase retrieval technique. We investigate the noise levels in the projections as well as the image quality and quantitative accuracy of the reconstructed tomographic volumes. The results of our study show that this method performs better in a low-dose scenario than the conventional phase retrieval approach, resulting in lower noise levels, enhanced image quality and more accurate quantitative values. Overall, we demonstrate that the lower statistical limit of the phase stepping procedure as proposed by recent literature does not apply to this alternative phase retrieval technique. However, further development is necessary to overcome experimental challenges posed by this method, which would enable mainstream or even clinical application of PCCT.

  20. Prospective Study of Maternal Alcohol Intake During Pregnancy or Lactation and Risk of Childhood Asthma: The Norwegian Mother and Child Cohort Study

    PubMed Central

    Magnus, Maria C.; DeRoo, Lisa A.; Håberg, Siri E.; Magnus, Per; Nafstad, Per; Nystad, Wenche; London, Stephanie J.

    2014-01-01

    Background Many women drink during pregnancy and lactation despite recommendations to abstain. In animals, alcohol exposure during pregnancy and lactation influences lung and immune development, plausibly increasing the risk of asthma and lower respiratory tract infections (LRTIs). Studies in humans are few. Methods In the Norwegian Mother and Child Cohort Study, we examined maternal alcohol intake during pregnancy and lactation in relation to the risk of current asthma at 36 months (49,138 children), recurrent LRTIs by 36 months (39,791 children) and current asthma at seven years (13,253 children). Mothers reported the frequency and amount of alcohol intake each trimester and in the first three months following delivery. We calculated adjusted relative risks (aRR), comparing children of drinkers to non-drinkers, using generalized linear models. Results A total of 31.8% of mothers consumed alcohol during the first trimester, 9.7% during the second trimester and 15.6% during the third trimester. Infrequent and low-dose prenatal alcohol exposure showed a modest statistically significant inverse association with current asthma at 36 months (aRRs ~0.85). No association was seen with the highest alcohol intakes during the first trimester, when alcohol consumption was most common. Relative risks of maternal alcohol intake during pregnancy with recurrent LRTIs were ~1, with sporadic differences in risk for some metrics of intake, but without any consistent pattern. For current asthma at seven years, similar inverse associations were seen as for current asthma at 36 months but were not statistically significant. Among children breastfed throughout the first three months of life, maternal alcohol intake during this time was not significantly associated with any of the three outcomes. Conclusion The low levels of alcohol exposure during pregnancy or lactation observed in this cohort were not associated with increased risk of asthma or recurrent LRTIs. The slight inverse associations of infrequent or low-dose prenatal alcohol exposure with asthma may not be causal. PMID:24460824

  1. Global characteristics of auroral Hall currents derived from the Swarm constellation: dependences on season and IMF orientation

    NASA Astrophysics Data System (ADS)

    Huang, Tao; Lühr, Hermann; Wang, Hui

    2017-11-01

    On the basis of field-aligned currents (FACs) and Hall currents derived from high-resolution magnetic field data of the Swarm constellation, the average characteristics of these two current systems in the auroral regions are comprehensively investigated by statistical methods. This is the first study considering both current types determined simultaneously by the same spacecraft in both hemispheres. The FAC distribution, derived from the novel Swarm dual-spacecraft approach, reveals the well-known features of Region 1 (R1) and Region 2 (R2) FACs. At high latitudes, Region 0 (R0) FACs appear on the dayside. Their flow direction, up or down, depends on the orientation of the interplanetary magnetic field (IMF) By component. Of particular interest is the distribution of auroral Hall currents. The prominent auroral electrojets are found to be closely controlled by the solar wind input, but we find no dependence of their intensity on the IMF By orientation. The eastward electrojet is about 1.5 times stronger in local summer than in winter. Conversely, the westward electrojet shows less dependence on season. As to higher latitudes, part of the electrojet current is closed over the polar cap. Here the seasonal variation of conductivity mainly controls the current density. During local summer of the Northern Hemisphere, there is a clear channeling of return currents over the polar cap. For positive (negative) IMF By a dominant eastward (westward) Hall current circuit is formed from the afternoon (morning) electrojet towards the dawn side (dusk side) polar cap return current. The direction of polar cap Hall currents in the noon sector depends directly on the orientation of the IMF By. This is true for both signs of the IMF Bz component. Comparable Hall current distributions can be observed in the Southern Hemisphere but for opposite IMF By signs. Around the midnight sector the westward substorm electrojet is dominating. As expected, it is highly dependent on magnetic activity, but it shows only little response to season and IMF By polarity. An important finding is that all the IMF By dependences of FACs and Hall currents practically disappear in the dark winter hemisphere.

  2. Statistical patterns in the location of natural lightning

    NASA Astrophysics Data System (ADS)

    Zoghzoghy, F. G.; Cohen, M. B.; Said, R. K.; Inan, U. S.

    2013-01-01

    Lightning discharges are nature's way of neutralizing the electrical buildup in thunderclouds. Thus, if an individual discharge destroys a substantial fraction of the cloud charge, the probability of a subsequent flash is reduced until the cloud charge separation rebuilds. The temporal pattern of lightning activity in a localized region may thus inherently be a proxy measure of the corresponding timescales for charge separation and electric field buildup processes. We present a statistical technique to bring out this effect (as well as the subsequent recovery) using lightning geo-location data, in this case with data from the National Lightning Detection Network (NLDN) and from the GLD360 Network. We use this statistical method to show that a lightning flash can remove an appreciable fraction of the built up charge, affecting the neighboring lightning activity for tens of seconds within a ˜ 10 km radius. We find that our results correlate with timescales of electric field buildup in storms and suggest that the proposed statistical tool could be used to study the electrification of storms on a global scale. We find that this flash suppression effect is a strong function of flash type, flash polarity, cloud-to-ground flash multiplicity, the geographic location of lightning, and is proportional to NLDN model-derived peak stroke current. We characterize the spatial and temporal extent of the suppression effect as a function of these parameters and discuss various applications of our findings.
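
    One way to realize the kind of statistic described here is sketched below: for every flash, count subsequent flashes within a fixed radius as a function of time lag and normalise by the long-lag level, so that values below one at short lags indicate suppression. The inputs (times in seconds, locally projected x, y in km) and the brute-force O(N²) loop are illustrative assumptions, suitable for a regional subset rather than a full continental data set.

      # Sketch: local flash-rate suppression and recovery after a discharge.
      import numpy as np

      def suppression_curve(times, x, y, radius_km=10.0, lag_bins_s=np.arange(0, 120, 5)):
          order = np.argsort(times)
          t, xs, ys = times[order], x[order], y[order]
          counts = np.zeros(len(lag_bins_s) - 1)
          for i in range(len(t)):
              dt = t[i + 1:] - t[i]                                         # lags to later flashes
              near = np.hypot(xs[i + 1:] - xs[i], ys[i + 1:] - ys[i]) <= radius_km
              counts += np.histogram(dt[near], bins=lag_bins_s)[0]
          rate = counts / np.diff(lag_bins_s) / len(t)    # nearby flashes per second per reference flash
          return rate / rate[-1]                          # ratio to the longest-lag (recovered) level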

  3. Observational limitations of Bose-Einstein photon statistics and radiation noise in thermal emission

    NASA Astrophysics Data System (ADS)

    Lee, Y.-J.; Talghader, J. J.

    2018-01-01

    For many decades, theory has predicted that Bose-Einstein statistics are a fundamental feature of thermal emission into one or a few optical modes; however, the resulting Bose-Einstein-like photon noise has never been experimentally observed. There are at least two reasons for this: (1) relationships to describe the thermal radiation noise for an arbitrary mode structure have yet to be set forth, and (2) the mode and detector constraints necessary for the detection of such light are extremely hard to fulfill. Herein, photon statistics and radiation noise relationships are developed for systems with any number of modes and couplings to an observing space. The results are shown to reproduce existing special cases of thermal emission and are then applied to resonator systems to discuss physically realizable conditions under which Bose-Einstein-like thermal statistics might be observed. Examples include a single isolated cavity and an emitter cavity coupled to a small detector space. Low-mode-number noise theory shows major deviations from solely Bose-Einstein or Poisson treatments and has particular significance because of recent advances in perfect absorption and subwavelength structures both in the long-wave infrared and terahertz regimes. These microresonator devices tend to utilize a small volume with few modes, a regime where the current theory of thermal emission fluctuations and background noise, which was developed decades ago for free-space or single-mode cavities, has no derived solutions.
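
    For reference, the standard single-mode and M-mode Bose-Einstein results that such relationships generalize are the textbook expressions below (they are not the paper's new arbitrary-mode formulas). For a single thermal mode at frequency ν and temperature T,

      \[
      P(n) \;=\; \frac{\bar{n}^{\,n}}{(1+\bar{n})^{\,n+1}},
      \qquad
      \bar{n} \;=\; \frac{1}{e^{h\nu/k_{B}T}-1},
      \qquad
      \langle \Delta n^{2} \rangle \;=\; \bar{n}\,(1+\bar{n}),
      \]

    while thermal light divided equally among M independent modes gives, for the total photon number N,

      \[
      \langle \Delta N^{2} \rangle \;=\; \bar{N}\left(1 + \frac{\bar{N}}{M}\right),
      \]

    so the excess (Bose-Einstein-like) noise above the Poisson term is appreciable only when few modes reach the detector, which is why low-mode-number geometries matter for its observation.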

  4. MRMC analysis of agreement studies

    NASA Astrophysics Data System (ADS)

    Gallas, Brandon D.; Anam, Amrita; Chen, Weijie; Wunderlich, Adam; Zhang, Zhiwei

    2016-03-01

    The purpose of this work is to present and evaluate methods based on U-statistics to compare intra- or inter-reader agreement across different imaging modalities. We apply these methods to multi-reader multi-case (MRMC) studies. We measure reader-averaged agreement and estimate its variance accounting for the variability from readers and cases (an MRMC analysis). In our application, pathologists (readers) evaluate patient tissue mounted on glass slides (cases) in two ways. They evaluate the slides on a microscope (reference modality) and they evaluate digital scans of the slides on a computer display (new modality). In the current work, we consider concordance as the agreement measure, but many of the concepts outlined here apply to other agreement measures. Concordance is the probability that two readers rank two cases in the same order. Concordance can be estimated with a U-statistic and thus it has some nice properties: it is unbiased, asymptotically normal, and its variance is given by an explicit formula. Another property of a U-statistic is that it is symmetric in its inputs; it doesn't matter which reader is listed first or which case is listed first, the result is the same. Using this property and a few tricks while building the U-statistic kernel for concordance, we get a mathematically tractable problem and efficient software. Simulations show that our variance and covariance estimates are unbiased.
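
    A small U-statistic estimator of concordance between two readers scoring the same cases is sketched below; the half-credit handling of ties and the variable names are illustrative assumptions, and the MRMC variance machinery described above is not included.

      # Sketch: U-statistic estimate of concordance between two readers over the same cases.
      # Concordance = probability that the two readers order a random pair of cases the same way.
      import itertools
      import numpy as np

      def concordance(scores_a, scores_b):
          total = 0.0
          pairs = 0
          for i, j in itertools.combinations(range(len(scores_a)), 2):   # every unordered case pair
              da = np.sign(scores_a[i] - scores_a[j])
              db = np.sign(scores_b[i] - scores_b[j])
              if da == 0 or db == 0:
                  total += 0.5          # tie on either reading: half credit (one common convention)
              else:
                  total += float(da == db)
              pairs += 1
          return total / pairs          # average of a symmetric kernel over case pairs: a U-statistic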

  5. On the Relationship between Molecular Hit Rates in High-Throughput Screening and Molecular Descriptors.

    PubMed

    Hansson, Mari; Pemberton, John; Engkvist, Ola; Feierberg, Isabella; Brive, Lars; Jarvis, Philip; Zander-Balderud, Linda; Chen, Hongming

    2014-06-01

    High-throughput screening (HTS) is widely used in the pharmaceutical industry to identify novel chemical starting points for drug discovery projects. The current study focuses on the relationship between the molecular hit rate in recent in-house HTS and four common molecular descriptors: lipophilicity (ClogP), size (heavy atom count, HEV), fraction of sp(3)-hybridized carbons (Fsp3), and fraction of molecular framework (f(MF)). The molecular hit rate is defined as the fraction of times the molecule has been assigned as active in the HTS campaigns where it has been screened. Beta-binomial statistical models were built to model the molecular hit rate as a function of these descriptors. The advantage of the beta-binomial statistical models is that the correlation between the descriptors is taken into account. Higher-degree polynomial terms of the descriptors were also added to the beta-binomial statistical model to improve the model quality. By applying beta-binomial statistical modeling, the relative influence of the different molecular descriptors on the molecular hit rate has been estimated while taking into account that the descriptors are correlated with each other. The results show that ClogP has the largest influence on the molecular hit rate, followed by Fsp3 and HEV. f(MF) has only a minor influence besides its correlation with the other molecular descriptors. © 2013 Society for Laboratory Automation and Screening.
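
    A pared-down sketch of the distributional core of such a model, a maximum-likelihood beta-binomial fit to per-compound hit counts without the descriptor covariates or polynomial terms of the full model, is given below; the starting values and optimizer choice are illustrative.

      # Sketch: maximum-likelihood beta-binomial fit to molecular hit counts.
      # hits[i] = number of campaigns in which compound i was active; trials[i] = number of
      # campaigns in which it was screened. Both are assumed numpy arrays.
      import numpy as np
      from scipy.optimize import minimize
      from scipy.special import betaln, gammaln

      def neg_loglik(log_params, hits, trials):
          a, b = np.exp(log_params)                         # keep alpha, beta positive
          log_comb = gammaln(trials + 1) - gammaln(hits + 1) - gammaln(trials - hits + 1)
          ll = log_comb + betaln(hits + a, trials - hits + b) - betaln(a, b)
          return -ll.sum()

      def fit_beta_binomial(hits, trials):
          res = minimize(neg_loglik, x0=np.log([1.0, 10.0]), args=(hits, trials), method='Nelder-Mead')
          a, b = np.exp(res.x)
          return a, b, a / (a + b)                          # mean hit rate implied by the fit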

  6. Hemispheric Patterns in Electric Current Helicity of Solar Magnetic Fields During Solar Cycle 24: Results from SOLIS, SDO and Hinode

    NASA Astrophysics Data System (ADS)

    Gusain, S.

    2017-12-01

    We study the hemispheric patterns in the electric current helicity distribution on the Sun. The magnetic field vector in the photosphere is now routinely measured by a variety of instruments. SOLIS/VSM of NSO observes full-disk Stokes spectra in photospheric lines, which are used to derive vector magnetograms. Hinode SP is a space-based spectropolarimeter with the same observables as SOLIS, albeit with a limited field-of-view (FOV) but high spatial resolution. SDO/HMI derives vector magnetograms from full-disk Stokes measurements from space, with rather limited spectral resolution, in a different photospheric line. Further, these datasets now span several years: SOLIS/VSM from 2003, Hinode SP from 2006, and SDO/HMI since 2010. Using these time series of vector magnetograms, we compute the electric current density in active regions during solar cycle 24 and study the hemispheric distributions. Many studies show that helicity parameters and proxies exhibit a strong hemispheric bias, such that the northern hemisphere has preferentially negative and the southern hemisphere preferentially positive helicity. We will confirm these results for cycle 24 from three different datasets and evaluate the statistical significance of the hemispheric bias. Further, we discuss the solar cycle variation in the hemispheric helicity pattern during cycle 24 and discuss its implications in terms of solar dynamo models.

  7. The Network Structure of Symptoms of the Diagnostic and Statistical Manual of Mental Disorders.

    PubMed

    Boschloo, Lynn; van Borkulo, Claudia D; Rhemtulla, Mijke; Keyes, Katherine M; Borsboom, Denny; Schoevers, Robert A

    2015-01-01

    Although current classification systems have greatly contributed to the reliability of psychiatric diagnoses, they ignore the unique role of individual symptoms and, consequently, potentially important information is lost. The network approach, in contrast, assumes that psychopathology results from the causal interplay between psychiatric symptoms and focuses specifically on these symptoms and their complex associations. By using a sophisticated network analysis technique, this study constructed an empirically based network structure of 120 psychiatric symptoms of twelve major DSM-IV diagnoses using cross-sectional data of the National Epidemiologic Survey on Alcohol and Related Conditions (NESARC, second wave; N = 34,653). The resulting network demonstrated that symptoms within the same diagnosis showed differential associations and indicated that the strategy of summing symptoms, as in current classification systems, leads to loss of information. In addition, some symptoms showed strong connections with symptoms of other diagnoses, and these specific symptom pairs, involving both overlapping and non-overlapping symptoms, may help to explain the comorbidity across diagnoses. Taken together, our findings indicated that psychopathology is very complex and can be more adequately captured by sophisticated network models than by current classification systems. The network approach is, therefore, promising in improving our understanding of psychopathology and moving our field forward.

  8. 2-D inner-shelf current observations from a single VHF WEllen RAdar (WERA) station

    USGS Publications Warehouse

    Voulgaris, G.; Kumar, N.; Gurgel, K.-W.; Warner, J.C.; List, J.H.

    2011-01-01

    The majority of High Frequency (HF) radars used worldwide operate at medium to high frequencies (8 to 30 MHz) providing spatial resolutions ranging from 3 to 1.5 km and ranges from 150 to 50 km. This paper presents results from the deployment of a single Very High Frequency (VHF, 48 MHz) WEllen RAdar (WERA) system with a spatial resolution of 150 m and a range of 10-15 km, used in the nearshore off Cape Hatteras, NC, USA. It consisted of a linear array of 12 antennas operating in beam-forming mode. Radial velocities were estimated from radar backscatter for a variety of wind and nearshore wave conditions. A methodology similar to that used for converting acoustically derived beam velocities to an orthogonal system is presented for obtaining 2-D current fields from a single station. The accuracy of the VHF radar-derived radial velocities is examined using a new statistical technique that evaluates the system over the range of measured velocities. The VHF radar velocities showed a bias of 3 to 7 cm/s over the experimental period, explainable by the differences in radar penetration and in-situ measurement height. The 2-D current field shows good agreement with the in-situ measurements. Deviations and inaccuracies are well explained by the geometric dilution analysis. © 2011 IEEE.
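
    A minimal sketch of the geometric idea of recovering a 2-D current vector from radial (beam) velocities by least squares, analogous to converting beam velocities to an orthogonal frame; the beam angles and velocities below are invented for illustration:

    # Hedged sketch: least-squares 2-D current (u, v) from radial velocities
    # measured along several beam directions from a single station.
    import numpy as np

    def uv_from_radials(beam_angles_deg, radial_velocities):
        """Solve v_r,k = u*cos(theta_k) + v*sin(theta_k) for (u, v)."""
        theta = np.radians(beam_angles_deg)
        A = np.column_stack([np.cos(theta), np.sin(theta)])
        (u, v), *_ = np.linalg.lstsq(A, np.asarray(radial_velocities, float), rcond=None)
        # Geometric dilution of precision: blows up when beams are nearly parallel
        gdop = np.sqrt(np.trace(np.linalg.inv(A.T @ A)))
        return u, v, gdop

    u, v, gdop = uv_from_radials([20.0, 60.0, 110.0], [0.18, 0.25, 0.12])   # m/s
    print(round(u, 3), round(v, 3), round(gdop, 2))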

  9. Statistics Anxiety Update: Refining the Construct and Recommendations for a New Research Agenda.

    PubMed

    Chew, Peter K H; Dillon, Denise B

    2014-03-01

    Appreciation of the importance of statistics literacy for citizens of a democracy has resulted in an increasing number of degree programs making statistics courses mandatory for university students. Unfortunately, empirical evidence suggests that students in nonmathematical disciplines (e.g., social sciences) regard statistics courses as the most anxiety-inducing course in their degree programs. Although a literature review exists for statistics anxiety, it was done more than a decade ago, and newer studies have since added findings for consideration. In this article, we provide a current review of the statistics anxiety literature. Specifically, related variables, definitions, and measures of statistics anxiety are reviewed with the goal of refining the statistics anxiety construct. Antecedents, effects, and interventions of statistics anxiety are also reviewed to provide recommendations for statistics instructors and for a new research agenda. © The Author(s) 2014.

  10. The promise of air cargo: System aspects and vehicle design

    NASA Technical Reports Server (NTRS)

    Whitehead, A. H., Jr.

    1976-01-01

    The current operation of the air cargo system is reviewed. An assessment of the future of air cargo is provided by (1) analyzing statistics and trends, (2) noting system problems and inefficiencies, (3) analyzing characteristics of 'air eligible' commodities, and (4) showing the promise of new technology for future cargo aircraft with significant improvements in costs and efficiency. The following topics are discussed: (1) air cargo demand forecasts; (2) economics of air cargo transport; (3) the integrated air cargo system; (4) evolution of airfreighter design; and (5) the span distributed load concept.

  11. Influence of audio triggered emotional attention on video perception

    NASA Astrophysics Data System (ADS)

    Torres, Freddy; Kalva, Hari

    2014-02-01

    Perceptual video coding methods attempt to improve compression efficiency by discarding visual information not perceived by end users. Most of the current approaches for perceptual video coding use only visual features, ignoring the auditory component. Many psychophysical studies have demonstrated that auditory stimuli affect our visual perception. In this paper we present our study of audio-triggered emotional attention and its applicability to perceptual video coding. Experiments with movie clips show that the reaction time to detect video compression artifacts was longer when the video was presented with its audio. The reported results are statistically significant (p=0.024).

  12. Detection and Evaluation of Spatio-Temporal Spike Patterns in Massively Parallel Spike Train Data with SPADE.

    PubMed

    Quaglio, Pietro; Yegenoglu, Alper; Torre, Emiliano; Endres, Dominik M; Grün, Sonja

    2017-01-01

    Repeated, precise sequences of spikes are largely considered a signature of activation of cell assemblies. These repeated sequences are commonly known under the name of spatio-temporal patterns (STPs). STPs are hypothesized to play a role in the communication of information in the computational process operated by the cerebral cortex. A variety of statistical methods for the detection of STPs have been developed and applied to electrophysiological recordings, but such methods scale poorly with the current size of available parallel spike train recordings (more than 100 neurons). In this work, we introduce a novel method capable of overcoming the computational and statistical limits of existing analysis techniques in detecting repeating STPs within massively parallel spike trains (MPST). We employ advanced data mining techniques to efficiently extract repeating sequences of spikes from the data. Then, we introduce and compare two alternative approaches to distinguish statistically significant patterns from chance sequences. The first approach uses a measure known as conceptual stability, of which we investigate a computationally cheap approximation for applications to such large data sets. The second approach is based on the evaluation of pattern statistical significance. In particular, we provide an extension to STPs of a method we recently introduced for the evaluation of statistical significance of synchronous spike patterns. The performance of the two approaches is evaluated in terms of computational load and statistical power on a variety of artificial data sets that replicate specific features of experimental data. Both methods provide an effective and robust procedure for detection of STPs in MPST data. The method based on significance evaluation shows the best overall performance, although at a higher computational cost. We name the novel procedure the spatio-temporal Spike PAttern Detection and Evaluation (SPADE) analysis.
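
    A minimal sketch of the underlying counting idea only (this is not the SPADE algorithm): discretise the spike trains into bins and count how often the same spatio-temporal window of activity recurs; the mining optimisations and surrogate-based significance testing described above are omitted, and all data are synthetic.

    # Hedged sketch: count repeated spatio-temporal activity windows in binned
    # spike trains (illustration of the idea, not the SPADE implementation).
    import numpy as np
    from collections import Counter

    def count_repeated_windows(spike_times, n_neurons, bin_size, window_bins, t_stop):
        """Count identical binary activity windows across the recording."""
        n_bins = int(np.ceil(t_stop / bin_size))
        binned = np.zeros((n_neurons, n_bins), dtype=np.uint8)
        for neuron, times in enumerate(spike_times):
            binned[neuron, np.round(np.asarray(times) / bin_size).astype(int)] = 1
        counts = Counter()
        for start in range(n_bins - window_bins + 1):
            window = binned[:, start:start + window_bins]
            if window.any():                      # skip empty windows
                counts[window.tobytes()] += 1
        return counts

    # Three neurons firing in a 10 ms sequence (0 -> 1 -> 2), repeated three times
    spikes = [[0.01, 0.21, 0.41], [0.02, 0.22, 0.42], [0.03, 0.23, 0.43]]
    counts = count_repeated_windows(spikes, n_neurons=3, bin_size=0.005,
                                    window_bins=8, t_stop=0.5)
    print(max(counts.values()))   # the repeated sequence shows up as a recurring window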

  13. A high-resolution open biomass burning emission inventory based on statistical data and MODIS observations in mainland China

    NASA Astrophysics Data System (ADS)

    Xu, Y.; Fan, M.; Huang, Z.; Zheng, J.; Chen, L.

    2017-12-01

    Open biomass burning, which has adverse effects on air quality and human health, is an important source of gases and particulate matter (PM) in China. Current emission estimates of open biomass burning are generally based on a single source (either statistical data or satellite-derived data) and thus carry large uncertainty due to the limitations of each dataset. In this study, to quantify open biomass burning for the base year 2015, we established a new estimation method for open biomass burning activity levels by combining bottom-up statistical data and top-down MODIS observations. Three sub-category sources, each using different activity data, were considered. For open crop residue burning, the "best estimate" of activity data was obtained by averaging the statistical data from China statistical yearbooks and satellite observations from the MODIS burned area product MCD64A1, weighted by their uncertainties. For forest and grassland fires, activity levels were represented by the combination of statistical data and the MODIS active fire product MCD14ML. Using fire radiative power (FRP), which is considered a better indicator of active fire level, as the spatial allocation surrogate, coarse gridded emissions were reallocated onto 3 km × 3 km grids to produce a high-resolution emission inventory. Our results showed that emissions of CO, NOx, SO2, NH3, VOCs, PM2.5, PM10, BC and OC in mainland China were 6607, 427, 84, 79, 1262, 1198, 1222, 159 and 686 Gg/yr, respectively. Among all provinces of China, Henan, Shandong and Heilongjiang were the top three contributors to the total emissions. The resulting high-resolution open biomass burning emission inventory can support air quality modeling and policy-making for pollution control.
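
    A minimal sketch of the two combination steps described, with invented numbers (the uncertainty weights, emission factor, and FRP values are illustrative assumptions, not values from the inventory):

    # Hedged sketch: combine statistical and satellite-derived activity estimates
    # with inverse-variance (uncertainty) weights, then allocate a provincial
    # emission onto grid cells in proportion to fire radiative power (FRP).
    import numpy as np

    def weighted_activity(stat_value, stat_sigma, sat_value, sat_sigma):
        """'Best estimate' as an inverse-variance weighted average of the two sources."""
        w_stat, w_sat = 1.0 / stat_sigma**2, 1.0 / sat_sigma**2
        return (w_stat * stat_value + w_sat * sat_value) / (w_stat + w_sat)

    def allocate_by_frp(total_emission, frp_per_cell):
        """Spread a provincial total over grid cells proportionally to FRP."""
        frp = np.asarray(frp_per_cell, dtype=float)
        return total_emission * frp / frp.sum()

    activity = weighted_activity(stat_value=120.0, stat_sigma=30.0,   # Tg residue burned (illustrative)
                                 sat_value=90.0, sat_sigma=20.0)
    emission = activity * 0.092                                       # times an illustrative CO emission factor (g/g)
    print(allocate_by_frp(emission, frp_per_cell=[5.0, 1.0, 0.0, 4.0]))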

  14. Detection and Evaluation of Spatio-Temporal Spike Patterns in Massively Parallel Spike Train Data with SPADE

    PubMed Central

    Quaglio, Pietro; Yegenoglu, Alper; Torre, Emiliano; Endres, Dominik M.; Grün, Sonja

    2017-01-01

    Repeated, precise sequences of spikes are largely considered a signature of activation of cell assemblies. These repeated sequences are commonly known under the name of spatio-temporal patterns (STPs). STPs are hypothesized to play a role in the communication of information in the computational process operated by the cerebral cortex. A variety of statistical methods for the detection of STPs have been developed and applied to electrophysiological recordings, but such methods scale poorly with the current size of available parallel spike train recordings (more than 100 neurons). In this work, we introduce a novel method capable of overcoming the computational and statistical limits of existing analysis techniques in detecting repeating STPs within massively parallel spike trains (MPST). We employ advanced data mining techniques to efficiently extract repeating sequences of spikes from the data. Then, we introduce and compare two alternative approaches to distinguish statistically significant patterns from chance sequences. The first approach uses a measure known as conceptual stability, of which we investigate a computationally cheap approximation for applications to such large data sets. The second approach is based on the evaluation of pattern statistical significance. In particular, we provide an extension to STPs of a method we recently introduced for the evaluation of statistical significance of synchronous spike patterns. The performance of the two approaches is evaluated in terms of computational load and statistical power on a variety of artificial data sets that replicate specific features of experimental data. Both methods provide an effective and robust procedure for detection of STPs in MPST data. The method based on significance evaluation shows the best overall performance, although at a higher computational cost. We name the novel procedure the spatio-temporal Spike PAttern Detection and Evaluation (SPADE) analysis. PMID:28596729

  15. Tunneling Statistics for Analysis of Spin-Readout Fidelity

    NASA Astrophysics Data System (ADS)

    Gorman, S. K.; He, Y.; House, M. G.; Keizer, J. G.; Keith, D.; Fricke, L.; Hile, S. J.; Broome, M. A.; Simmons, M. Y.

    2017-09-01

    We investigate spin and charge dynamics of a quantum dot of phosphorus atoms coupled to a radio-frequency single-electron transistor (SET) using full counting statistics. We show how the magnetic field plays a role in determining the bunching or antibunching tunneling statistics of the donor dot and SET system. Using the counting statistics, we show how to determine the lowest magnetic field where spin readout is possible. We then show how such a measurement can be used to investigate and optimize single-electron spin-readout fidelity.
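
    A minimal sketch of how bunching or antibunching can be read off counting statistics via the Fano factor of tunnelling-event counts in fixed time windows; the event times below are synthetic, and the full counting statistics formalism of the paper is not reproduced.

    # Hedged sketch: Fano factor of tunnelling-event counts per time window
    # (>1 suggests bunching, <1 antibunching, ~1 Poissonian tunnelling).
    import numpy as np

    def fano_factor(event_times, window, t_stop):
        """Var(N)/<N> for counts of events in consecutive windows."""
        n_windows = int(round(t_stop / window))
        edges = np.linspace(0.0, t_stop, n_windows + 1)
        counts, _ = np.histogram(event_times, bins=edges)
        return counts.var() / counts.mean()

    rng = np.random.default_rng(1)
    uncorrelated = rng.uniform(0.0, 1.0, size=2000)   # Poisson-like tunnelling times
    print(round(fano_factor(uncorrelated, window=0.01, t_stop=1.0), 2))   # close to 1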

  16. 76 FR 17710 - Proposed Collection, Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-30

    ... Rowan, BLS Clearance Officer, Division of Management Systems, Bureau of Labor Statistics, Room 4080, 2.... The statistics are fundamental inputs in economic decision processes at all levels of government... the State mandatory reporting authority. II. Current Action Office of Management and Budget clearance...

  17. Advances in Bayesian Modeling in Educational Research

    ERIC Educational Resources Information Center

    Levy, Roy

    2016-01-01

    In this article, I provide a conceptually oriented overview of Bayesian approaches to statistical inference and contrast them with frequentist approaches that currently dominate conventional practice in educational research. The features and advantages of Bayesian approaches are illustrated with examples spanning several statistical modeling…

  18. 78 FR 11135 - Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-15

    ..., electronic, mechanical or other technological collection techniques or other forms of information technology... unless it displays a currently valid OMB control number. National Agricultural Statistics Service Title... National Agricultural Statistics Service (NASS) is to prepare and issue State and national estimates of...

  19. A systematic review of Bayesian articles in psychology: The last 25 years.

    PubMed

    van de Schoot, Rens; Winter, Sonja D; Ryan, Oisín; Zondervan-Zwijnenburg, Mariëlle; Depaoli, Sarah

    2017-06-01

    Although the statistical tools most often used by researchers in the field of psychology over the last 25 years are based on frequentist statistics, it is often claimed that the alternative Bayesian approach to statistics is gaining in popularity. In the current article, we investigated this claim by performing the very first systematic review of Bayesian psychological articles published between 1990 and 2015 (n = 1,579). We aim to provide a thorough presentation of the role Bayesian statistics plays in psychology. This historical assessment allows us to identify trends and see how Bayesian methods have been integrated into psychological research in the context of different statistical frameworks (e.g., hypothesis testing, cognitive models, IRT, SEM, etc.). We also describe take-home messages and provide "big-picture" recommendations to the field as Bayesian statistics becomes more popular. Our review indicated that Bayesian statistics is used in a variety of contexts across subfields of psychology and related disciplines. There are many different reasons why one might choose to use Bayes (e.g., the use of priors, estimating otherwise intractable models, modeling uncertainty, etc.). We found in this review that the use of Bayes has increased and broadened in the sense that this methodology can be used in a flexible manner to tackle many different forms of questions. We hope this presentation opens the door for a larger discussion regarding the current state of Bayesian statistics, as well as future trends. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  20. Distribution of Reynolds stress carried by mesoscale variability in the Antarctic Circumpolar Current

    NASA Technical Reports Server (NTRS)

    Johnson, Thomas J.; Stewart, Robert H.; Shum, C. K.; Tapley, Byron D.

    1992-01-01

    Satellite altimeter data collected by the Geosat Exact Repeat Mission were used to investigate turbulent stress resulting from the variability of surface geostrophic currents in the Antarctic Circumpolar Current. The altimeter measured sea level along the subsatellite track. The variability of the along-track slope of sea level is directly proportional to the variability of surface geostrophic currents in the cross-track direction. Because the grid of crossover points is dense at high latitudes, the satellite data could be used for mapping the temporal and spatial variability of the current. Two and a half years of data were used to compute the statistical structure of the variability. The statistics included the probability distribution functions for each component of the current, the time-lagged autocorrelation functions of the variability, and the Reynolds stress produced by the variability. The results demonstrate that stress is correlated with bathymetry. In some areas the distribution of negative stress indicates that eddies contribute to an acceleration of the mean flow, strengthening the hypothesis that baroclinic instability makes important contributions to strong oceanic currents.
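
    For reference, a sketch of the two relations implied above, in assumed notation (not copied from the paper): the cross-track surface geostrophic velocity follows from the along-track sea-level slope, and the Reynolds stress carried by the variability is the covariance of the velocity fluctuations,

    v_{\perp} = \frac{g}{f}\,\frac{\partial \eta}{\partial s},
    \qquad
    \tau_{uv} = \overline{u'\,v'}, \quad u' = u - \overline{u},\ v' = v - \overline{v},

    where η is the sea-surface height, s the along-track coordinate, g gravity, and f the Coriolis parameter; sign and density-factor conventions for the stress vary between studies.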

  1. Cut-off characterisation of energy spectra of bright Fermi sources: Current instrument limits and future possibilities

    NASA Astrophysics Data System (ADS)

    Romoli, Carlo; Taylor, Andrew M.; Aharonian, Felix

    2017-01-01

    The cut-off region of the gamma-ray spectrum of astrophysical sources encodes important information about the acceleration processes producing the parent particle population. For bright AGNs the cut-off happens in an energy range around a few tens of GeV, a region where satellites are limited by their effective area and current ground-based telescopes by their energy threshold. In an attempt to maximise the statistics, we have looked at two of the brightest AGNs seen by the Fermi-LAT (3C 454.3 and 3C 279) during extremely luminous flares. Our analysis showed the difficulty of obtaining good constraints on the cut-off parameters when a power law with a modified exponential cut-off was assumed to fit the SEDs. We discuss the potential of future low-threshold Cherenkov telescope arrays, in particular CTA, showing the impact that a much larger effective area can have on the determination of spectral parameters in the cut-off region. This preliminary study serves as an example, demonstrating the importance of having good, broad energy coverage around 10 GeV.
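
    The spectral form usually meant by a power law with a (modified) exponential cut-off can be written as below; the symbols (normalisation N_0, photon index Γ, cut-off energy E_cut, sharpness β) are the conventional ones, and the exact parametrisation used by the authors may differ:

    \frac{dN}{dE} = N_0 \left(\frac{E}{E_0}\right)^{-\Gamma} \exp\!\left[-\left(\frac{E}{E_{\mathrm{cut}}}\right)^{\beta}\right]

    Here β = 1 gives the simple exponential cut-off; constraining E_cut and β simultaneously is what demands the photon statistics discussed above.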

  2. Cavity-coupled double-quantum dot at finite bias: Analogy with lasers and beyond

    NASA Astrophysics Data System (ADS)

    Kulkarni, Manas; Cotlet, Ovidiu; Türeci, Hakan E.

    2014-09-01

    We present a theoretical and experimental study of photonic and electronic transport properties of a voltage biased InAs semiconductor double quantum dot (DQD) that is dipole coupled to a superconducting transmission line resonator. We obtain the master equation for the reduced density matrix of the coupled system of cavity photons and DQD electrons accounting systematically for both the presence of phonons and the effect of leads at finite voltage bias. We subsequently derive analytical expressions for transmission, phase response, photon number, and the nonequilibrium steady-state electron current. We show that the coupled system under finite bias realizes an unconventional version of a single-atom laser and analyze the spectrum and the statistics of the photon flux leaving the cavity. In the transmission mode, the system behaves as a saturable single-atom amplifier for the incoming photon flux. Finally, we show that the back action of the photon emission on the steady-state current can be substantial. Our analytical results are compared to exact master equation results establishing regimes of validity of various analytical models. We compare our findings to available experimental measurements.

  3. Distribution of Region 1 and 2 currents in the quiet and substorm time plasma sheet from THEMIS observations

    NASA Astrophysics Data System (ADS)

    Liu, J.; Angelopoulos, V.; Chu, X.; McPherron, R. L.

    2016-12-01

    Although Earth's Region 1 and 2 currents are related to activities such as substorm initiation, their magnetospheric origin remains unclear. Utilizing the triangular configuration of THEMIS probes at 8-12 RE downtail, we seek the origin of nightside Region 1 and 2 currents. The triangular configuration allows a curlometer-like technique which does not rely on active-time boundary crossings, so we can examine the current distribution in quiet times as well as active times. Our statistical study reveals that both Region 1 and 2 currents exist in the plasma sheet during quiet and active times. Notably, this is the first unequivocal in-situ evidence of the existence of Region 2 currents in the plasma sheet. Farther away from the neutral sheet than the Region 2 currents lie the Region 1 currents, which extend at least to the plasma sheet boundary layer. During geomagnetically quiet times, the separation between the two currents is located 2.5 RE from the neutral sheet. These findings suggest that the plasma sheet is a source of Region 1 and 2 currents regardless of geomagnetic activity level. During substorms, the separation between Region 1 and 2 currents migrates toward (away from) the neutral sheet as the plasma sheet thins (thickens). This migration indicates that the deformation of Region 1 and 2 currents is associated with redistribution of FAC sources in the magnetotail. In some substorms when the THEMIS probes encounter a dipolarization, a substorm current wedge (SCW) can be inferred from our technique, and it shows a distinctly larger current density than the pre-existing Region 1 currents. This difference suggests that the SCW is not just an enhancement of the pre-existing Region 1 current; the SCW and the Region 1 currents have different sources.

  4. A Novel Genome-Information Content-Based Statistic for Genome-Wide Association Analysis Designed for Next-Generation Sequencing Data

    PubMed Central

    Luo, Li; Zhu, Yun

    2012-01-01

    The genome-wide association studies (GWAS) designed for next-generation sequencing data involve testing association of genomic variants, including common, low frequency, and rare variants. The current strategies for association studies are well developed for identifying association of common variants with the common diseases, but may be ill-suited when large amounts of allelic heterogeneity are present in sequence data. Recently, group tests that analyze the collective frequency differences of multiple variants between cases and controls have shifted the variant-by-variant analysis paradigm used for GWAS of common variants toward collective tests of multiple variants for the association analysis of rare variants. However, group tests ignore differences in genetic effects among SNPs at different genomic locations. As an alternative to group tests, we developed a novel genome-information content-based statistic for testing association of the entire allele frequency spectrum of genomic variation with the diseases. To evaluate the performance of the proposed statistic, we use large-scale simulations based on whole genome low coverage pilot data in the 1000 Genomes Project to calculate the type 1 error rates and power of seven alternative statistics: a genome-information content-based statistic, the generalized T2, the collapsing method, the combined multivariate and collapsing (CMC) method, the individual χ2 test, the weighted-sum statistic, and the variable threshold statistic. Finally, we apply the seven statistics to a published resequencing dataset from the ANGPTL3, ANGPTL4, ANGPTL5, and ANGPTL6 genes in the Dallas Heart Study. We report that the genome-information content-based statistic shows better type 1 error rates and higher power than the other six statistics in both simulated and empirical datasets. PMID:22651812
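
    A minimal sketch of the simple collapsing (burden-style) idea named among the comparison methods, not the authors' genome-information content-based statistic; the genotype matrices below are synthetic 0/1 carrier indicators:

    # Hedged sketch: collapse rare variants into a per-individual carrier indicator
    # and compare carrier counts between cases and controls (a burden-style test).
    import numpy as np
    from scipy.stats import fisher_exact

    def collapsing_test(genotypes_cases, genotypes_controls):
        """Collapse rare variants to a carrier indicator and compare carrier counts."""
        carriers_cases = np.asarray(genotypes_cases).sum(axis=1) > 0
        carriers_controls = np.asarray(genotypes_controls).sum(axis=1) > 0
        table = [[carriers_cases.sum(), (~carriers_cases).sum()],
                 [carriers_controls.sum(), (~carriers_controls).sum()]]
        odds_ratio, p_value = fisher_exact(table)
        return odds_ratio, p_value

    rng = np.random.default_rng(2)
    cases = rng.binomial(1, 0.02, size=(500, 30))      # 500 cases, 30 rare variants
    controls = rng.binomial(1, 0.01, size=(500, 30))   # carrier rate enriched in cases
    print(collapsing_test(cases, controls))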

  5. A novel genome-information content-based statistic for genome-wide association analysis designed for next-generation sequencing data.

    PubMed

    Luo, Li; Zhu, Yun; Xiong, Momiao

    2012-06-01

    The genome-wide association studies (GWAS) designed for next-generation sequencing data involve testing association of genomic variants, including common, low frequency, and rare variants. The current strategies for association studies are well developed for identifying association of common variants with the common diseases, but may be ill-suited when large amounts of allelic heterogeneity are present in sequence data. Recently, group tests that analyze the collective frequency differences of multiple variants between cases and controls have shifted the variant-by-variant analysis paradigm used for GWAS of common variants toward collective tests of multiple variants for the association analysis of rare variants. However, group tests ignore differences in genetic effects among SNPs at different genomic locations. As an alternative to group tests, we developed a novel genome-information content-based statistic for testing association of the entire allele frequency spectrum of genomic variation with the diseases. To evaluate the performance of the proposed statistic, we use large-scale simulations based on whole genome low coverage pilot data in the 1000 Genomes Project to calculate the type 1 error rates and power of seven alternative statistics: a genome-information content-based statistic, the generalized T2, the collapsing method, the combined multivariate and collapsing (CMC) method, the individual χ2 test, the weighted-sum statistic, and the variable threshold statistic. Finally, we apply the seven statistics to a published resequencing dataset from the ANGPTL3, ANGPTL4, ANGPTL5, and ANGPTL6 genes in the Dallas Heart Study. We report that the genome-information content-based statistic shows better type 1 error rates and higher power than the other six statistics in both simulated and empirical datasets.

  6. The relationship between chiropractor required and current level of business knowledge.

    PubMed

    Ciolfi, Michael Anthony; Kasen, Patsy Anne

    2017-01-01

    Chiropractors frequently practice within health care systems requiring the business acumen of an entrepreneur. However, some chiropractors do not know the relationship between the level of business knowledge required for practice success and their current level of business knowledge. The purpose of this quantitative study was to examine the relationship between chiropractors' perceived level of business knowledge required and their perceived level of current business knowledge. Two hundred and seventy-four participants completed an online survey (Health Care Training and Education Needs Survey) which included eight key business items. Participants rated the level of perceived business knowledge required (Part I) and their current perceived level of knowledge (Part II) for the same eight items. Data were collected from November 27, 2013 to December 18, 2013. Data were analyzed using Spearman's rank correlation to determine the statistically significant relationships between the perceived level of knowledge required and the perceived current level of knowledge for each of the eight paired items from Parts I and II of the survey. Wilcoxon Signed Ranks Tests were performed to determine the statistical difference between the paired items. The results of Spearman's correlation testing indicated a statistically significant (p < 0.01) positive correlation between the perceived level of knowledge required and the perceived current level of knowledge for six variables: (a) organizational behavior, (b) strategic management, (c) marketing, (d) legal and ethical, (e) managerial decisions, and (f) operations. Wilcoxon Signed Ranks testing indicated a significant difference for three paired items: strategic management; marketing; and legal and ethical. The results suggest that relationships exist for the majority of business items (6 of 8); however, a statistically significant difference was demonstrated in only three of the paired business items. The implications of this study for social change include the potential to improve chiropractors' business knowledge and skills, enable practice success, enhance health services delivery and positively influence the profession as a viable career.
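
    A minimal sketch of the paired analysis described (Spearman correlation plus a Wilcoxon signed-rank test on matched "required" vs "current" ratings); the Likert-style scores below are synthetic, not the survey data:

    # Hedged sketch: Spearman correlation and Wilcoxon signed-rank test on
    # paired "required" vs "current" business-knowledge ratings (synthetic).
    import numpy as np
    from scipy.stats import spearmanr, wilcoxon

    rng = np.random.default_rng(3)
    required = rng.integers(3, 6, size=50)                            # perceived knowledge required (1-5 scale)
    current = np.clip(required - rng.integers(0, 3, size=50), 1, 5)   # perceived current knowledge

    rho, p_corr = spearmanr(required, current)
    stat, p_diff = wilcoxon(required, current)
    print(f"Spearman rho={rho:.2f} (p={p_corr:.3f}); Wilcoxon p={p_diff:.3f}")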

  7. Grain size statistics and depositional pattern of the Ecca Group sandstones, Karoo Supergroup in the Eastern Cape Province, South Africa

    NASA Astrophysics Data System (ADS)

    Baiyegunhi, Christopher; Liu, Kuiwu; Gwavava, Oswald

    2017-11-01

    Grain size analysis is a vital sedimentological tool used to unravel the hydrodynamic conditions, mode of transportation and deposition of detrital sediments. In this study, detailed grain-size analysis was carried out on thirty-five sandstone samples from the Ecca Group in the Eastern Cape Province of South Africa. Grain-size statistical parameters, bivariate analysis, linear discriminant functions, Passega diagrams and log-probability curves were used to reveal the depositional processes, sedimentation mechanisms, hydrodynamic energy conditions and to discriminate different depositional environments. The grain-size parameters show that most of the sandstones are very fine to fine grained, moderately well sorted, mostly near-symmetrical and mesokurtic in nature. The abundance of very fine to fine-grained sandstones indicates the dominance of a low-energy environment. The bivariate plots show that the samples are mostly grouped, except for the Prince Albert samples, which show a scattered trend due either to a mixture of two modes in equal proportion in bimodal sediments or to good sorting in unimodal sediments. The linear discriminant function analysis is dominantly indicative of turbidity current deposits under shallow marine environments for samples from the Prince Albert, Collingham and Ripon Formations, while the samples from the Fort Brown Formation are lacustrine or deltaic deposits. The C-M plots indicated that the sediments were deposited mainly by suspension and saltation, and by graded suspension. Visher diagrams show that saltation is the major process of transportation, followed by suspension.
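
    A minimal sketch of the standard Folk and Ward (1957) graphic grain-size parameters computed from a cumulative distribution in phi units; the percentile data below are invented, not measurements from the Ecca Group samples:

    # Hedged sketch: Folk and Ward graphic mean, sorting, skewness and kurtosis
    # from a cumulative grain-size curve expressed in phi units.
    import numpy as np

    def folk_ward(phi, cumulative_percent):
        """Graphic grain-size statistics from interpolated phi percentiles."""
        p5, p16, p25, p50, p75, p84, p95 = np.interp(
            [5, 16, 25, 50, 75, 84, 95], cumulative_percent, phi)
        mean = (p16 + p50 + p84) / 3.0
        sorting = (p84 - p16) / 4.0 + (p95 - p5) / 6.6
        skewness = ((p16 + p84 - 2 * p50) / (2 * (p84 - p16))
                    + (p5 + p95 - 2 * p50) / (2 * (p95 - p5)))
        kurtosis = (p95 - p5) / (2.44 * (p75 - p25))
        return mean, sorting, skewness, kurtosis

    phi = np.array([1.0, 2.0, 2.5, 3.0, 3.5, 4.0])        # grain-size classes (phi)
    cum = np.array([2.0, 15.0, 35.0, 70.0, 92.0, 99.0])   # cumulative weight %
    print([round(v, 2) for v in folk_ward(phi, cum)])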

  8. 76 FR 52304 - Notice of Intent To Seek Approval To Revise and Extend a Currently Approved Information Collection

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-22

    ... DEPARTMENT OF AGRICULTURE National Agricultural Statistics Service Notice of Intent To Seek Approval To Revise and Extend a Currently Approved Information Collection AGENCY: National Agricultural...: December 31, 2011. Type of Request: Intent to revise and extend a currently approved information collection...

  9. Current Status of Endovascular Treatment for Vasospasm following Subarachnoid Hemorrhage: Analysis of JR-NET2

    PubMed Central

    HAYASHI, Kentaro; HIRAO, Tomohito; SAKAI, Nobuyuki; NAGATA, Izumi

    2014-01-01

    Endovascular treatments are employed for cerebral vasospasm following subarachnoid hemorrhage that does not respond to medical treatment. However, the effects and complications of these treatments are not well known. Here, we analyzed data from the Japanese Registry of Neuroendovascular Therapy 2 (JR-NET2) to reveal the current status of endovascular treatment for cerebral vasospasm. JR-NET2 was conducted from January 1, 2007 to December 31, 2009. Information on the clinical status, imaging studies, treatment methods, the results of treatment, and status 30 days later was recorded. In total, 645 treatments for 480 patients (mean age, 59.4 years; 72.7% women) were included. Factors related to neurological improvement and treatment-related complications were statistically analyzed. Treatments for ruptured cerebral aneurysm were direct surgery in 366 cases and endovascular treatment in 253 cases. The timing of the endovascular treatment for cerebral vasospasm was within 3 hours in 209 cases, 3–6 hours in 158 cases, and more than 6 hours in 158 cases. An intra-arterial vasodilator was employed in 495 cases and percutaneous transluminal angioplasty in 140 cases. Neurological improvement was observed in 372 cases and radiological improvement was seen in 623 cases. Treatment-related complications occurred in 20 cases (3.1%), including 6 cases of intracranial hemorrhage, 5 cases of cerebral ischemia, a case of puncture site trouble, and 8 other cases. Statistical analysis showed that early treatment was related to neurological improvement. This analysis revealed the current status of endovascular treatment for cerebral vasospasm; the treatment was effective, especially when performed early. PMID:24257541

  10. Current status of endovascular treatment for vasospasm following subarachnoid hemorrhage: analysis of JR-NET2.

    PubMed

    Hayashi, Kentaro; Hirao, Tomohito; Sakai, Nobuyuki; Nagata, Izumi

    2014-01-01

    Endovascular treatments are employed for cerebral vasospasm following subarachnoid hemorrhage that does not respond to medical treatment. However, the effects and complications of these treatments are not well known. Here, we analyzed data from the Japanese Registry of Neuroendovascular Therapy 2 (JR-NET2) to reveal the current status of endovascular treatment for cerebral vasospasm. JR-NET2 was conducted from January 1, 2007 to December 31, 2009. Information on the clinical status, imaging studies, treatment methods, the results of treatment, and status 30 days later was recorded. In total, 645 treatments for 480 patients (mean age, 59.4 years; 72.7% women) were included. Factors related to neurological improvement and treatment-related complications were statistically analyzed. Treatments for ruptured cerebral aneurysm were direct surgery in 366 cases and endovascular treatment in 253 cases. The timing of the endovascular treatment for cerebral vasospasm was within 3 hours in 209 cases, 3-6 hours in 158 cases, and more than 6 hours in 158 cases. An intra-arterial vasodilator was employed in 495 cases and percutaneous transluminal angioplasty in 140 cases. Neurological improvement was observed in 372 cases and radiological improvement was seen in 623 cases. Treatment-related complications occurred in 20 cases (3.1%), including 6 cases of intracranial hemorrhage, 5 cases of cerebral ischemia, a case of puncture site trouble, and 8 other cases. Statistical analysis showed that early treatment was related to neurological improvement. This analysis revealed the current status of endovascular treatment for cerebral vasospasm; the treatment was effective, especially when performed early.

  11. Current Status of Endovascular Treatment for Vasospasm following Subarachnoid Hemorrhage: Analysis of JR-NET2.

    PubMed

    Hayashi, Kentaro; Hirao, Tomohito; Sakai, Nobuyuki; Nagata, Izumi

    2014-01-01

    Endovascular treatments are employed for cerebral vasospasm following subarachnoid hemorrhage that does not respond to medical treatment. However, the effects and complications of these treatments are not well known. Here, we analyzed data from the Japanese Registry of Neuroendovascular Therapy 2 (JR-NET2) to reveal the current status of endovascular treatment for cerebral vasospasm. JR-NET2 was conducted from January 1, 2007 to December 31, 2009. Information on the clinical status, imaging studies, treatment methods, the results of treatment, and status 30 days later was recorded. In total, 645 treatments for 480 patients (mean age, 59.4 years; 72.7% women) were included. Factors related to neurological improvement and treatment-related complications were statistically analyzed. Treatments for ruptured cerebral aneurysm were direct surgery in 366 cases and endovascular treatment in 253 cases. The timing of the endovascular treatment for cerebral vasospasm was within 3 hours in 209 cases, 3–6 hours in 158 cases, and more than 6 hours in 158 cases. An intra-arterial vasodilator was employed in 495 cases and percutaneous transluminal angioplasty in 140 cases. Neurological improvement was observed in 372 cases and radiological improvement was seen in 623 cases. Treatment-related complications occurred in 20 cases (3.1%), including 6 cases of intracranial hemorrhage, 5 cases of cerebral ischemia, a case of puncture site trouble, and 8 other cases. Statistical analysis showed that early treatment was related to neurological improvement. This analysis revealed the current status of endovascular treatment for cerebral vasospasm; the treatment was effective, especially when performed early.

  12. Variations in intensity statistics for representational and abstract art, and for art from the Eastern and Western hemispheres.

    PubMed

    Graham, Daniel J; Field, David J

    2008-01-01

    Two recent studies suggest that natural scenes and paintings show similar statistical properties. But does the content or region of origin of an artwork affect its statistical properties? We addressed this question by having judges place paintings from a large, diverse collection of paintings into one of three subject-matter categories using a forced-choice paradigm. Basic statistics for images whose categorization was agreed on by all judges showed no significant differences between those judged to be 'landscape' and 'portrait/still-life', but these two classes differed from paintings judged to be 'abstract'. All categories showed basic spatial statistical regularities similar to those typical of natural scenes. A test of the full painting collection (140 images) with respect to the works' place of origin (provenance) showed significant differences between Eastern works and Western ones, differences which we find are likely related to the materials and the choice of background color. Although artists deviate slightly from reproducing natural statistics in abstract art (compared to representational art), the great majority of human art likely shares basic statistical limitations. We argue that statistical regularities in art are rooted in the need to make art visible to the eye, not in the inherent aesthetic value of natural-scene statistics, and we suggest that variability in spatial statistics may be generally imposed by manufacture.
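
    One common "natural scene statistic" is the slope of the radially averaged power spectrum on log-log axes (roughly 1/f amplitude spectra); the sketch below is offered as an illustration of that kind of measure, not the authors' exact analysis, and is run on synthetic noise rather than paintings.

    # Hedged sketch: spectral slope of a grayscale image via the radially
    # averaged 2-D power spectrum (white noise gives a slope near 0).
    import numpy as np

    def spectral_slope(image):
        """Fit log(power) vs log(spatial frequency) for a square grayscale image."""
        n = min(image.shape)
        img = image[:n, :n] - image[:n, :n].mean()
        power = np.abs(np.fft.fftshift(np.fft.fft2(img))) ** 2
        yy, xx = np.indices((n, n)) - n // 2
        radius = np.sqrt(xx**2 + yy**2).astype(int)
        counts = np.bincount(radius.ravel())
        sums = np.bincount(radius.ravel(), weights=power.ravel())
        radial = sums / np.maximum(counts, 1)          # radially averaged power
        freqs = np.arange(1, n // 2)
        slope, _ = np.polyfit(np.log(freqs), np.log(radial[1:n // 2]), 1)
        return slope

    rng = np.random.default_rng(4)
    white_noise = rng.normal(size=(256, 256))
    print(round(spectral_slope(white_noise), 2))   # near 0; natural images are strongly negative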

  13. Domain General Constraints on Statistical Learning

    ERIC Educational Resources Information Center

    Thiessen, Erik D.

    2011-01-01

    All theories of language development suggest that learning is constrained. However, theories differ on whether these constraints arise from language-specific processes or have domain-general origins such as the characteristics of human perception and information processing. The current experiments explored constraints on statistical learning of…

  14. Alternative Statistical Frameworks for Student Growth Percentile Estimation

    ERIC Educational Resources Information Center

    Lockwood, J. R.; Castellano, Katherine E.

    2015-01-01

    This article suggests two alternative statistical approaches for estimating student growth percentiles (SGP). The first is to estimate percentile ranks of current test scores conditional on past test scores directly, by modeling the conditional cumulative distribution functions, rather than indirectly through quantile regressions. This would…

  15. Statistics for Learning Genetics

    ERIC Educational Resources Information Center

    Charles, Abigail Sheena

    2012-01-01

    This study investigated the knowledge and skills that biology students may need to help them understand statistics/mathematics as it applies to genetics. The data are based on analyses of current representative genetics texts, practicing genetics professors' perspectives, and more directly, students' perceptions of, and performance in, doing…

  16. 75 FR 60497 - Proposed Agency Information Collection Activities; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-30

    ... Board Clearance Officer, (202) 452-3829, Division of Research and Statistics, Board of Governors of the... most current statistical data available for evaluating institutions' corporate applications, for... securitized auto loans outstanding as well as securitized auto loan delinquencies and charge-offs. The...

  17. 75 FR 53346 - Submission for OMB Emergency Review: Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-31

    ... DEPARTMENT OF LABOR Bureau of Labor Statistics Submission for OMB Emergency Review: Comment... submissions of responses. Agency: Bureau of Labor Statistics. Type of Review: Revision of currently approved collection. Title of Collection: National Compensation Survey. OMB Control Number: 1220-0164. Affected Public...

  18. Impaired Statistical Learning in Developmental Dyslexia

    ERIC Educational Resources Information Center

    Gabay, Yafit; Thiessen, Erik D.; Holt, Lori L.

    2015-01-01

    Purpose: Developmental dyslexia (DD) is commonly thought to arise from phonological impairments. However, an emerging perspective is that a more general procedural learning deficit, not specific to phonological processing, may underlie DD. The current study examined if individuals with DD are capable of extracting statistical regularities across…

  19. Bangladesh.

    PubMed

    Ahmed, K S

    1979-01-01

    In Bangladesh the Population Control and Family Planning Division of the Ministry of Health and Population Control has decided to delegate increased financial and administrative powers to the officers of the family planning program at the district level and below. Currently, about 20,000 family planning workers and officials are at work in rural areas. The government believes that the success of the entire family planning program depends on the performance of workers in rural areas, because that is where about 90% of the population lives. Awareness of the need to improve statistical data in Bangladesh has been increasing, particularly in regard to the development of rural areas. An accurate statistical profile of rural Bangladesh is crucial to the formation, implementation and evaluation of rural development programs. A Seminar on Statistics for Rural Development will be held from June 18-20, 1980. The primary objectives of the Seminar are to make an exhaustive analysis of the current availability of statistics required for rural development programs and to consider methodological and operational improvements toward building up an adequate data base.

  20. Structure of High Latitude Currents in Magnetosphere-Ionosphere Models

    NASA Astrophysics Data System (ADS)

    Wiltberger, M.; Rigler, E. J.; Merkin, V.; Lyon, J. G.

    2017-03-01

    Using three resolutions of the Lyon-Fedder-Mobarry global magnetosphere-ionosphere model (LFM) and the Weimer 2005 empirical model we examine the structure of the high latitude field-aligned current patterns. Each resolution was run for the entire Whole Heliosphere Interval, which contained two high speed solar wind streams and modest interplanetary magnetic field strengths. Average states of the field-aligned current (FAC) patterns for 8 interplanetary magnetic field clock angle directions are computed using data from these runs. Generally speaking, the patterns obtained agree well with results from the Weimer 2005 model computed using the solar wind and IMF conditions that correspond to each bin. As the simulation resolution increases, the currents become more intense and narrower. A machine learning analysis of the FAC patterns shows that the ratio of Region 1 (R1) to Region 2 (R2) currents decreases as the simulation resolution increases. This brings the simulation results into better agreement with observational predictions and the Weimer 2005 model results. The increase in R2 current strengths also results in the cross polar cap potential (CPCP) pattern being concentrated at higher latitudes. Current-voltage relationships between the R1 currents and the CPCP are quite similar at the higher resolutions, indicating that the simulation is converging on a common solution. We conclude that LFM simulations are capable of reproducing the statistical features of FAC patterns.

  1. Structure of high latitude currents in global magnetospheric-ionospheric models

    USGS Publications Warehouse

    Wiltberger, M; Rigler, E. J.; Merkin, V; Lyon, J. G

    2016-01-01

    Using three resolutions of the Lyon-Fedder-Mobarry global magnetosphere-ionosphere model (LFM) and the Weimer 2005 empirical model we examine the structure of the high latitude field-aligned current patterns. Each resolution was run for the entire Whole Heliosphere Interval, which contained two high speed solar wind streams and modest interplanetary magnetic field strengths. Average states of the field-aligned current (FAC) patterns for 8 interplanetary magnetic field clock angle directions are computed using data from these runs. Generally speaking, the patterns obtained agree well with results from the Weimer 2005 model computed using the solar wind and IMF conditions that correspond to each bin. As the simulation resolution increases, the currents become more intense and narrower. A machine learning analysis of the FAC patterns shows that the ratio of Region 1 (R1) to Region 2 (R2) currents decreases as the simulation resolution increases. This brings the simulation results into better agreement with observational predictions and the Weimer 2005 model results. The increase in R2 current strengths also results in the cross polar cap potential (CPCP) pattern being concentrated at higher latitudes. Current-voltage relationships between the R1 currents and the CPCP are quite similar at the higher resolutions, indicating that the simulation is converging on a common solution. We conclude that LFM simulations are capable of reproducing the statistical features of FAC patterns.

  2. Forcing and variability of nonstationary rip currents

    USGS Publications Warehouse

    Long, Joseph W.; H.T. Özkan-Haller,

    2016-01-01

    Surface wave transformation and the resulting nearshore circulation along a section of coast with strong alongshore bathymetric gradients outside the surf zone are modeled for a consecutive 4-week period. The modeled hydrodynamics are compared to in situ measurements of waves and currents collected during the Nearshore Canyon Experiment and indicate that for the entire range of observed conditions, the model performance is similar to other studies along this stretch of coast. Strong alongshore wave height gradients generate rip currents that are observed by remote sensing data and predicted qualitatively well by the numerical model. Previous studies at this site have used idealized scenarios to link the rip current locations to undulations in the offshore bathymetry but do not explain the dichotomy between permanent offshore bathymetric features and intermittent rip current development. Model results from the month‐long simulation are used to track the formation and location of rip currents using hourly statistics, and results show that the direction of the incoming wave energy strongly controls whether rip currents form. In particular, most of the offshore wave spectra were bimodal and we find that the ratio of energy contained in each mode dictates rip current development, and the alongshore rip current position is controlled by the incident wave period. Additionally, model simulations performed with and without updating the nearshore morphology yield no significant change in the accuracy of the predicted surf zone hydrodynamics, indicating that the large‐scale offshore features (e.g., submarine canyon) predominantly control the nearshore wave‐circulation system.

  3. Gender cognition in transgender children.

    PubMed

    Olson, Kristina R; Key, Aidan C; Eaton, Nicholas R

    2015-04-01

    A visible and growing cohort of transgender children in North America live according to their expressed gender rather than their natal sex, yet scientific research has largely ignored this population. In the current study, we adopted methodological advances from social-cognition research to investigate whether 5- to 12-year-old prepubescent transgender children (N = 32), who were presenting themselves according to their gender identity in everyday life, showed patterns of gender cognition more consistent with their expressed gender or their natal sex, or instead appeared to be confused about their gender identity. Using implicit and explicit measures, we found that transgender children showed a clear pattern: They viewed themselves in terms of their expressed gender and showed preferences for their expressed gender, with response patterns mirroring those of two cisgender (nontransgender) control groups. These results provide evidence that, early in development, transgender youth are statistically indistinguishable from cisgender children of the same gender identity. © The Author(s) 2015.

  4. Practice versus knowledge when it comes to pressure ulcer prevention.

    PubMed

    Provo, B; Piacentine, L; Dean-Baar, S

    1997-09-01

    This study was completed to determine the current knowledge and documentation patterns of nursing staff in the prevention of pressure ulcers and to identify the prevalence of pressure ulcers. This pre-post intervention study was carried out in three phases. In phase 1, 67 nursing staff members completed a modified version of Bostrom's Patient Skin Integrity Survey. A Braden Scale score, the presence of actual skin breakdown, and the presence of nursing documentation were collected for each patient (n = 43). Phase II consisted of a 20-minute educational session to all staff. In phase III, 51 nursing staff completed a second questionnaire similar to that completed in phase I. Patient data (n = 49) were again collected using the same procedure as phase I. Twenty-seven staff members completed questionnaires in both phase I and phase III of the study. No statistically significant differences were found in the knowledge of the staff before or after the educational session. The number of patients with a documented plan of care showed a statistically significant difference from phase I to phase III. The number of patients with pressure ulcers or at risk for pressure ulcer development (determined by a Braden Scale score of 16 or less) did not differ statistically from phase I to phase III. Knowledge about pressure ulcers in this sample of staff nurses was for the most part current and consistent with the recommendations in the Agency for Health Care Policy and Research guideline. Documentation of pressure ulcer prevention and treatment improved after the educational session. Although a significant change was noted in documentation, it is unclear whether it reflected an actual change in practice.

  5. Determination the validity of the new developed Sport Experts® hand grip dynamometer, measuring continuity of force, and comparison with current Takei and Baseline® dynamometers.

    PubMed

    Güçlüöver, A; Kutlu, M; Ciğerci, A E; Esen, H T; Demirkan, E; Erdoğdu, M

    2015-11-01

    In this study, the Sport Experts™ hand grip dynamometer, which measures the continuity of force with newly developed load cell technology, was compared with the Takei and Baseline® dynamometers currently in use, in order to determine the correlation between them. For clinical, orthopedic, and rehabilitative use in athlete and patient populations, the newly developed dynamometer can provide useful data by tracking the continuity of force. The study sample included 60 badminton players in 2010-2011, consisting of Turkish Junior National male players (N=16, age: 16.8±1.5), Junior National female players (N=14, age: 16.9±1.6), amateur-level male players (N=15, age: 16.3±0.8), and amateur-level female players (N=15, age: 16.1±0.6). ANOVA was used to compare hand grip strength measured by the different brands; Pearson's correlation coefficient was used to determine the level of relationship between dynamometers. Furthermore, a test-retest reliability analysis was completed for the newly developed dynamometer. There was no statistically significant difference between the dynamometers (P>0.05). In addition, a highly significant relationship (r=0.95 to 0.96) was found among all three dynamometers. The reliability coefficients for the newly developed dynamometer were high (Cronbach's α=0.989, ICC=0.97, r=0.97; P<0.01). The comparisons between the dynamometers and the correlation results indicate that the dynamometers are interchangeable. As a result, observing the continuity (progression) of force in athlete and patient populations is considered important.

  6. A new statistical framework to assess structural alignment quality using information compression

    PubMed Central

    Collier, James H.; Allison, Lloyd; Lesk, Arthur M.; Garcia de la Banda, Maria; Konagurthu, Arun S.

    2014-01-01

    Motivation: Progress in protein biology depends on the reliability of results from a handful of computational techniques, structural alignments being one. Recent reviews have highlighted substantial inconsistencies and differences between alignment results generated by the ever-growing stock of structural alignment programs. The lack of consensus on how the quality of structural alignments must be assessed has been identified as the main cause for the observed differences. Current methods assess structural alignment quality by constructing a scoring function that attempts to balance conflicting criteria, mainly alignment coverage and fidelity of structures under superposition. This traditional approach to measuring alignment quality, the subject of considerable literature, has failed to solve the problem. Further development along the same lines is unlikely to rectify the current deficiencies in the field. Results: This paper proposes a new statistical framework to assess structural alignment quality and significance based on lossless information compression. This is a radical departure from the traditional approach of formulating scoring functions. It links the structural alignment problem to the general class of statistical inductive inference problems, solved using the information-theoretic criterion of minimum message length. Based on this, we developed an efficient and reliable measure of structural alignment quality, I-value. The performance of I-value is demonstrated in comparison with a number of popular scoring functions, on a large collection of competing alignments. Our analysis shows that I-value provides a rigorous and reliable quantification of structural alignment quality, addressing a major gap in the field. Availability: http://lcb.infotech.monash.edu.au/I-value Contact: arun.konagurthu@monash.edu Supplementary information: Online supplementary data are available at http://lcb.infotech.monash.edu.au/I-value/suppl.html PMID:25161241

  7. Structures in magnetohydrodynamic turbulence: Detection and scaling

    NASA Astrophysics Data System (ADS)

    Uritsky, V. M.; Pouquet, A.; Rosenberg, D.; Mininni, P. D.; Donovan, E. F.

    2010-11-01

    We present a systematic analysis of statistical properties of turbulent current and vorticity structures at a given time using cluster analysis. The data stem from numerical simulations of decaying three-dimensional magnetohydrodynamic turbulence in the absence of an imposed uniform magnetic field; the magnetic Prandtl number is taken equal to unity, and we use a periodic box with grids of up to 1536³ points and with Taylor Reynolds numbers up to 1100. The initial conditions are either an X-point configuration embedded in three dimensions, the so-called Orszag-Tang vortex, or an Arnol'd-Beltrami-Childress configuration with a fully helical velocity and magnetic field. In each case two snapshots are analyzed, separated by one turn-over time, starting just after the peak of dissipation. We show that the algorithm is able to select a large number of structures (in excess of 8000) for each snapshot and that the statistical properties of these clusters are remarkably similar for the two snapshots as well as for the two flows under study in terms of scaling laws for the cluster characteristics, with the structures in the vorticity and in the current behaving in the same way. We also study the effect of Reynolds number on cluster statistics, and we finally analyze the properties of these clusters in terms of their velocity-magnetic-field correlation. Self-organized criticality features have been identified in the dissipative range of scales. A different scaling arises in the inertial range, which cannot be identified for the moment with a known self-organized criticality class consistent with magnetohydrodynamics. We suggest that this range can be governed by turbulence dynamics as opposed to criticality and propose an interpretation of intermittency in terms of propagation of local instabilities.

  8. Validation of cone beam computed tomography-based tooth printing using different three-dimensional printing technologies.

    PubMed

    Khalil, Wael; EzEldeen, Mostafa; Van De Casteele, Elke; Shaheen, Eman; Sun, Yi; Shahbazian, Maryam; Olszewski, Raphael; Politis, Constantinus; Jacobs, Reinhilde

    2016-03-01

    Our aim was to determine the accuracy of 3-dimensional reconstructed models of teeth compared with the natural teeth by using 4 different 3-dimensional printers. This in vitro study was carried out using 2 intact, dry adult human mandibles, which were scanned with cone beam computed tomography. Premolars were selected for this study. Dimensional differences between natural teeth and the printed models were evaluated directly by using volumetric differences and indirectly through optical scanning. Analysis of variance, Pearson correlation, and Bland-Altman plots were applied for statistical analysis. Volumetric measurements from natural teeth and fabricated models, either by the direct method (the Archimedes principle) or by the indirect method (optical scanning), showed no statistical differences. The mean volume difference ranged between 3.1 mm³ (0.7%) and 4.4 mm³ (1.9%) for the direct measurement, and between -1.3 mm³ (-0.6%) and 11.9 mm³ (+5.9%) for the optical scan. A surface part comparison analysis showed that 90% of the values revealed a distance deviation within the interval 0 to 0.25 mm. Current results showed a high accuracy of all printed models of teeth compared with natural teeth. This outcome opens perspectives for clinical use of cost-effective 3-dimensional printed teeth for surgical procedures, such as tooth autotransplantation. Copyright © 2016 Elsevier Inc. All rights reserved.
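    The Bland-Altman analysis mentioned above can be illustrated with the short sketch below; the volume values are invented placeholders, and only the procedure (bias and limits of agreement on method differences) mirrors the abstract.

```python
# Sketch of a Bland-Altman agreement analysis between natural-tooth volumes and
# printed-model volumes. The numbers are made up for illustration; the study's
# own measurements are not reproduced here.
import numpy as np
import matplotlib.pyplot as plt

natural = np.array([210.0, 198.5, 225.3, 204.1, 219.8])   # mm^3, hypothetical
printed = np.array([212.2, 197.1, 228.9, 205.0, 222.4])   # mm^3, hypothetical

mean_vals = (natural + printed) / 2
diff_vals = printed - natural
bias = diff_vals.mean()
loa = 1.96 * diff_vals.std(ddof=1)                         # limits of agreement

plt.scatter(mean_vals, diff_vals)
plt.axhline(bias, color="k", label=f"bias = {bias:.2f} mm^3")
plt.axhline(bias + loa, color="k", linestyle="--")
plt.axhline(bias - loa, color="k", linestyle="--")
plt.xlabel("Mean of methods (mm^3)")
plt.ylabel("Printed - natural (mm^3)")
plt.legend()
plt.show()
```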

  9. Knowledge of general dentists in the current guidelines for emergency treatment of avulsed teeth and dental trauma prevention.

    PubMed

    de Vasconcellos, Luis Gustavo Oliveira; Brentel, Aline Scalone; Vanderlei, Aleska Dias; de Vasconcellos, Luana Marotta Reis; Valera, Márcia Carneiro; de Araújo, Maria Amélia Máximo

    2009-12-01

    A high prevalence of dental trauma exists and its effects on function and esthetics deserve the attention of general dentists. The aim of this study was to assess the level of general dental practitioners' (GDPs) knowledge about guidelines for dental avulsion and its prevention using a questionnaire. The 21-item questionnaire was distributed among 264 GDPs and the survey was conducted between August and November 2006. The data obtained were statistically analyzed using descriptive analysis and Pearson's chi-square test to determine associations between knowledge regarding emergency treatment and dentists from public or private dental schools and years of experience. The results showed that the participants exhibited appropriate knowledge concerning procedures in cases of tooth avulsion and its prevention. The number of correct answers was low in relation to recommended treatment at the site of injury. Storage medium, preparation of the alveolus and splint time for receiving the avulsed tooth received a high number of correct answers. One statistically significant association was found between years of experience and recommended treatment at the site of the injury in the case of an avulsed tooth (χ² = 9.384, P = 0.009). In conclusion, this survey showed appropriate knowledge of dental avulsion management and its prevention among the surveyed dentists. The findings also showed that communication between dentists and the population is deficient, especially concerning practitioners of high-risk and contact sports.
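    A minimal sketch of the Pearson chi-square test of independence used for such questionnaire data is given below; the contingency table is hypothetical and only the test procedure follows the abstract.

```python
# Sketch of the Pearson chi-square test of independence relating questionnaire
# answers to years of experience. The contingency table is hypothetical.
from scipy.stats import chi2_contingency

# rows: experience bands (<10 y, 10-20 y, >20 y); columns: correct vs incorrect
table = [[30, 58],
         [41, 62],
         [22, 51]]

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.3f}, dof = {dof}, p = {p:.3f}")
```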

  10. Relationship of children's salivary microbiota with their caries status: a pyrosequencing study.

    PubMed

    Gomar-Vercher, S; Cabrera-Rubio, R; Mira, A; Montiel-Company, J M; Almerich-Silla, J M

    2014-12-01

    Differences in dental caries status could be related to alterations in the oral microbiota. Previous studies have collected saliva as a representative medium of the oral ecosystem. The purpose of this study was to assess the composition of the oral microbiota and its relation to the presence of dental caries at different degrees of severity. One hundred ten saliva samples from 12-year-old children were taken and divided into six groups defined in strict accordance with their dental caries prevalence according to the International Caries Detection and Assessment System II criteria. These samples were studied by pyrosequencing PCR products of the 16S ribosomal RNA gene. The results showed statistically significant intergroup differences at the class and genus taxonomic levels. Streptococcus was the most frequent genus in all groups, although it did not show statistically significant intergroup differences. In patients with cavities, Porphyromonas and Prevotella showed an increased percentage compared with healthy individuals. Bacterial diversity diminished as the severity of the disease increased, so patients with more advanced stages of caries presented less bacterial diversity than healthy subjects. Although microbial composition tended to differ, the intragroup variation was large, as evidenced by the lack of clear intragroup clustering in principal component analyses. Thus, no clear differences were found, indicating that using bacterial composition as the sole source of biomarkers for dental caries may not be reliable in the unstimulated saliva samples used in the current study.
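    The kind of diversity and ordination analysis described can be sketched as follows; the abundance matrix is randomly generated, and the genus-level table, diversity index, and PCA settings are illustrative assumptions rather than the study's pipeline.

```python
# Sketch: Shannon diversity per sample and a principal component analysis of
# genus-level relative abundances. The abundance matrix is random; real 16S
# count data would replace it.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
counts = rng.poisson(lam=20, size=(110, 40)).astype(float)   # samples x genera
rel = counts / counts.sum(axis=1, keepdims=True)

# Shannon index H' = -sum p_i ln p_i (0 * ln 0 treated as 0)
safe = np.where(rel > 0, rel, 1.0)
shannon = -(rel * np.log(safe)).sum(axis=1)

pca = PCA(n_components=2)
scores = pca.fit_transform(rel - rel.mean(axis=0))
print("mean Shannon diversity:", shannon.mean().round(3))
print("variance explained:", pca.explained_variance_ratio_.round(3))
```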

  11. The Spiral Arm Segments of the Galaxy within 3 kpc from the Sun: A Statistical Approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Griv, Evgeny; Jiang, Ing-Guey; Hou, Li-Gang, E-mail: griv@bgu.ac.il

    As can be reasonably expected, upcoming large-scale APOGEE, GAIA, GALAH, LAMOST, and WEAVE stellar spectroscopic surveys will yield rather noisy Galactic distributions of stars. In view of the possibility of employing these surveys, our aim is to present a statistical method to extract information about the spiral structure of the Galaxy from currently available data, and to demonstrate the effectiveness of this method. The model differs from previous works studying how objects are distributed in space in its calculation of the statistical significance of the hypothesis that some of the objects are actually concentrated in a spiral. A statistical analysis of the distribution of cold dust clumps within molecular clouds, H II regions, Cepheid stars, and open clusters in the nearby Galactic disk within 3 kpc from the Sun is carried out. As an application of the method, we obtain distances between the Sun and the centers of the neighboring Sagittarius arm segment, the Orion arm segment in which the Sun is located, and the Perseus arm segment. Pitch angles of the logarithmic spiral segments and their widths are also estimated. The hypothesis that the collected objects accidentally form spirals is refuted with almost 100% statistical confidence. We show that these four independent distributions of young objects lead to essentially the same results. We also demonstrate that our newly deduced values of the mean distances and pitch angles for the segments are not too far from those found recently by Reid et al. using VLBI-based trigonometric parallaxes of massive star-forming regions.

  12. Quantification of heterogeneity observed in medical images.

    PubMed

    Brooks, Frank J; Grigsby, Perry W

    2013-03-02

    There has been much recent interest in the quantification of visually evident heterogeneity within functional grayscale medical images, such as those obtained via magnetic resonance or positron emission tomography. In the case of images of cancerous tumors, variations in grayscale intensity imply variations in crucial tumor biology. Despite these considerable clinical implications, there is as yet no standardized method for measuring the heterogeneity observed via these imaging modalities. In this work, we motivate and derive a statistical measure of image heterogeneity. This statistic measures the distance-dependent average deviation from the smoothest intensity gradation feasible. We show how this statistic may be used to automatically rank images of in vivo human tumors in order of increasing heterogeneity. We test this method against the current practice of ranking images via expert visual inspection. We find that this statistic provides a means of heterogeneity quantification beyond that given by other statistics traditionally used for the same purpose. We demonstrate the effect of tumor shape upon our ranking method and find the method applicable to a wide variety of clinically relevant tumor images. We find that the automated heterogeneity rankings agree very closely with those performed visually by experts. These results indicate that our automated method may be used reliably to rank, in order of increasing heterogeneity, tumor images whether or not object shape is considered to contribute to that heterogeneity. Automated heterogeneity ranking yields objective results which are more consistent than visual rankings. Reducing variability in image interpretation will enable more researchers to better study potential clinical implications of observed tumor heterogeneity.

  13. Online incidental statistical learning of audiovisual word sequences in adults: a registered report.

    PubMed

    Kuppuraj, Sengottuvel; Duta, Mihaela; Thompson, Paul; Bishop, Dorothy

    2018-02-01

    Statistical learning has been proposed as a key mechanism in language learning. Our main goal was to examine whether adults are capable of simultaneously extracting statistical dependencies in a task where stimuli include a range of structures amenable to statistical learning within a single paradigm. We devised an online statistical learning task using real word auditory-picture sequences that vary in two dimensions: (i) predictability and (ii) adjacency of dependent elements. This task was followed by an offline recall task to probe learning of each sequence type. We registered three hypotheses with specific predictions. First, adults would extract regular patterns from a continuous stream (effect of grammaticality). Second, within grammatical conditions, they would show differential speeding up for each condition as a factor of the statistical complexity of the condition and exposure. Third, our novel approach to measuring online statistical learning would be reliable in showing individual differences in statistical learning ability. Further, we explored the relation between statistical learning and a measure of verbal short-term memory (STM). Forty-two participants were tested and retested after an interval of at least 3 days on our novel statistical learning task. We analysed the reaction time data using a novel regression discontinuity approach. Consistent with prediction, participants showed a grammaticality effect, agreeing with the predicted order of difficulty for learning different statistical structures. Furthermore, a learning index from the task showed acceptable test-retest reliability (r = 0.67). However, STM did not correlate with statistical learning. We discuss the findings noting the benefits of online measures in tracking the learning process.

  14. Online incidental statistical learning of audiovisual word sequences in adults: a registered report

    PubMed Central

    Duta, Mihaela; Thompson, Paul

    2018-01-01

    Statistical learning has been proposed as a key mechanism in language learning. Our main goal was to examine whether adults are capable of simultaneously extracting statistical dependencies in a task where stimuli include a range of structures amenable to statistical learning within a single paradigm. We devised an online statistical learning task using real word auditory–picture sequences that vary in two dimensions: (i) predictability and (ii) adjacency of dependent elements. This task was followed by an offline recall task to probe learning of each sequence type. We registered three hypotheses with specific predictions. First, adults would extract regular patterns from a continuous stream (effect of grammaticality). Second, within grammatical conditions, they would show differential speeding up for each condition as a factor of the statistical complexity of the condition and exposure. Third, our novel approach to measuring online statistical learning would be reliable in showing individual differences in statistical learning ability. Further, we explored the relation between statistical learning and a measure of verbal short-term memory (STM). Forty-two participants were tested and retested after an interval of at least 3 days on our novel statistical learning task. We analysed the reaction time data using a novel regression discontinuity approach. Consistent with prediction, participants showed a grammaticality effect, agreeing with the predicted order of difficulty for learning different statistical structures. Furthermore, a learning index from the task showed acceptable test–retest reliability (r = 0.67). However, STM did not correlate with statistical learning. We discuss the findings noting the benefits of online measures in tracking the learning process. PMID:29515876

  15. Statistical methods for nuclear material management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bowen W.M.; Bennett, C.A.

    1988-12-01

    This book is intended as a reference manual of statistical methodology for nuclear material management practitioners. It describes statistical methods currently or potentially important in nuclear material management, explains the choice of methods for specific applications, and provides examples of practical applications to nuclear material management problems. Together with the accompanying training manual, which contains fully worked out problems keyed to each chapter, this book can also be used as a textbook for courses in statistical methods for nuclear material management. It should provide increased understanding and guidance to help improve the application of statistical methods to nuclear material management problems.

  16. Aspects Topologiques de la Theorie des Champs et leurs Applications

    NASA Astrophysics Data System (ADS)

    Caenepeel, Didier

    This thesis is dedicated to the study of various topological aspects of field theory, and is divided into three parts. In two space dimensions the possibility of fractional statistics can be implemented by adding an appropriate "fictitious" electric charge and magnetic flux to each particle (after which they are known as anyons). Since the statistical interaction is rather difficult to handle, a mean-field approximation is used in order to describe a gas of anyons. We derive a criterion for the validity of this approximation using the inherent feature of parity violation in the scattering of anyons. We use this new method in various examples of anyons and show both analytically and numerically that the approximation is justified if the statistical interaction is weak, and that it must be weaker for boson-based than for fermion-based anyons. Chern-Simons theories give an elegant implementation of anyonic properties in field theories, which permits the emergence of new mechanisms for anyon superconductivity. Since it is reasonable to think that superconductivity is a low energy phenomenon, we have been interested in non-relativistic C-S systems. We present the scalar field effective potential for non-relativistic matter coupled to both Abelian and non-Abelian C-S gauge fields. We perform the calculations using functional methods in background fields. Finally, we compute the scalar effective potential in various gauges and treat divergences with various regularization schemes. In three space dimensions, a generalization of Chern-Simons theory may be achieved by introducing an antisymmetric tensor gauge field. We use these theories, called B∧F theories, to present an alternative to the Higgs mechanism to generate masses for non-Abelian gauge fields. The initial Lagrangian is composed of a fermion with current-current and dipole-dipole type self-interactions minimally coupled to non-Abelian gauge fields. The mass generation occurs upon the fermionic functional integration. We show that by suitably adjusting the coupling constants the effective theory contains massive non-Abelian gauge fields without any residual scalars or other degrees of freedom.

  17. Birkeland currents during substorms: Statistical evidence for intensification of Regions 1 and 2 currents after onset and a localized signature of auroral dimming

    NASA Astrophysics Data System (ADS)

    Coxon, John C.; Rae, I. Jonathan; Forsyth, Colin; Jackman, Caitriona M.; Fear, Robert C.; Anderson, Brian J.

    2017-06-01

    We conduct a superposed epoch analysis of Birkeland current densities from AMPERE (Active Magnetosphere and Planetary Electrodynamics Response Experiment) using isolated substorm expansion phase onsets identified by an independently derived data set. In order to evaluate whether R1 and R2 currents contribute to the substorm current wedge, we rotate global maps of Birkeland currents into a common coordinate system centered on the magnetic local time of substorm onset. When the latitude of substorm onset is taken into account, it is clear that both R1 and R2 current systems play a role in substorm onset, contrary to previous studies which found that the R2 current did not contribute. The latitude of substorm onset is colocated with the interface between R1 and R2 currents, allowing us to infer that R1 current closes just tailward and R2 current closes just earthward of the associated current disruption in the tail. AMPERE is the first data set to give near-instantaneous measurements of Birkeland current across the whole polar cap, and this study addresses apparent discrepancies in previous studies which have used AMPERE to examine the morphology of the substorm current wedge. Finally, we present evidence for an extremely localized reduction in current density immediately prior to substorm onset, and we interpret this as the first statistical signature of auroral dimming in the Birkeland currents.
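    A superposed epoch analysis of the type described reduces, in its simplest form, to aligning time series on a list of onset times and averaging epoch by epoch; the sketch below uses synthetic data and arbitrary window lengths, not the AMPERE current densities or the onset list used in the study.

```python
# Sketch of a superposed epoch analysis: time series of some quantity (here a
# stand-in for integrated Birkeland current) are aligned on onset times,
# baseline-subtracted, and averaged epoch-by-epoch. Data and onsets are synthetic.
import numpy as np

rng = np.random.default_rng(2)
dt = 60.0                                    # seconds per sample
series = rng.normal(size=20_000).cumsum()    # synthetic "current" time series
onsets = rng.integers(500, 19_000, size=50)  # synthetic onset indices

window = 120                                 # +/- 120 samples around each onset
epochs = np.array([series[i - window:i + window] for i in onsets])
epochs -= epochs[:, :window].mean(axis=1, keepdims=True)   # pre-onset baseline

superposed_mean = epochs.mean(axis=0)
epoch_time = np.arange(-window, window) * dt / 60.0        # minutes from onset
print("value at onset (mean):", superposed_mean[window].round(3))
```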

  18. Effective Lagrangians and Current Algebra in Three Dimensions

    NASA Astrophysics Data System (ADS)

    Ferretti, Gabriele

    In this thesis we study three dimensional field theories that arise as effective Lagrangians of quantum chromodynamics in Minkowski space with signature (2,1) (QCD3). In the first chapter, we explain the method of effective Langrangians and the relevance of current algebra techniques to field theory. We also provide the physical motivations for the study of QCD3 as a toy model for confinement and as a theory of quantum antiferromagnets (QAF). In chapter two, we derive the relevant effective Lagrangian by studying the low energy behavior of QCD3, paying particular attention to how the global symmetries are realized at the quantum level. In chapter three, we show how baryons arise as topological solitons of the effective Lagrangian and also show that their statistics depends on the number of colors as predicted by the quark model. We calculate mass splitting and magnetic moments of the soliton and find logarithmic corrections to the naive quark model predictions. In chapter four, we drive the current algebra of the theory. We find that the current algebra is a co -homologically non-trivial generalization of Kac-Moody algebras to three dimensions. This fact may provide a new, non -perturbative way to quantize the theory. In chapter five, we discuss the renormalizability of the model in the large-N expansion. We prove the validity of the non-renormalization theorem and compute the critical exponents in a specific limiting case, the CP^ {N-1} model with a Chern-Simons term. Finally, chapter six contains some brief concluding remarks.

  19. Listening through Voices: Infant Statistical Word Segmentation across Multiple Speakers

    ERIC Educational Resources Information Center

    Graf Estes, Katharine; Lew-Williams, Casey

    2015-01-01

    To learn from their environments, infants must detect structure behind pervasive variation. This presents substantial and largely untested learning challenges in early language acquisition. The current experiments address whether infants can use statistical learning mechanisms to segment words when the speech signal contains acoustic variation…

  20. Methodological difficulties of conducting agroecological studies from a statistical perspective

    USDA-ARS?s Scientific Manuscript database

    Statistical methods for analysing agroecological data might not be able to help agroecologists to solve all of the current problems concerning crop and animal husbandry, but such methods could well help agroecologists to assess, tackle, and resolve several agroecological issues in a more reliable an...

  1. Supply and Characteristics of Selected Health Personnel.

    ERIC Educational Resources Information Center

    Ake, James N.; Johnson, Donald W.

    Detailed statistics on trends in the U.S. supply and geographic distribution of personnel in eight health occupations, along with current data on selected professional characteristics, are presented. Statistical tables include combined data for the eight occupations, and groups of tables for the individual health occupations: physicians (both…

  2. MODEL ANALYSIS OF RIPARIAN BUFFER EFFECTIVENESS FOR REDUCING NUTRIENT INPUTS TO STREAMS IN AGRICULTURAL LANDSCAPES

    EPA Science Inventory

    Federal and state agencies responsible for protecting water quality rely mainly on statistically-based methods to assess and manage risks to the nation's streams, lakes and estuaries. Although statistical approaches provide valuable information on current trends in water quality...

  3. OCCUPATIONS IN COLORADO. PART I, OUTLOOK BY INDUSTRIES.

    ERIC Educational Resources Information Center

    1966

    CURRENT AND PROJECTED EMPLOYMENT STATISTICS ARE GIVEN FOR THE STATE AND FOR THE DENVER STANDARD METROPOLITAN STATISTICAL AREA WHICH INCLUDES ADAMS, ARAPAHOE, BOULDER, DENVER, AND JEFFERSON COUNTIES. DATA WERE OBTAINED FROM THE COLORADO DEPARTMENT OF EMPLOYMENT, DENVER RESEARCH INSTITUTE, U.S. CENSUS, UNIVERSITY OF COLORADO, MOUNTAIN STATES…

  4. 75 FR 1415 - Submission for OMB Review: Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-11

    ... Department of Labor--Bureau of Labor Statistics (BLS), Office of Management and Budget, Room 10235... Statistics. Type of Review: Revision of a currently approved collection. Title of Collection: The Consumer... sector. The data are collected from a national probability sample of households designed to represent the...

  5. Sensitivity to volcanic field boundary

    NASA Astrophysics Data System (ADS)

    Runge, Melody; Bebbington, Mark; Cronin, Shane; Lindsay, Jan; Rashad Moufti, Mohammed

    2016-04-01

    Volcanic hazard analyses are desirable where there is potential for future volcanic activity to affect a proximal population. This is frequently the case for volcanic fields (regions of distributed volcanism), where low eruption rates, fertile soil, and attractive landscapes draw populations to live close by. Forecasting future activity in volcanic fields almost invariably uses spatial or spatio-temporal point processes, with model selection and development based on exploratory analyses of previous eruption data. For identifiability reasons, spatio-temporal point processes (and, in practice, spatial processes as well) require the definition of a spatial region to which volcanism is confined. However, due to the complex and predominantly unknown sub-surface processes driving volcanic eruptions, definition of a region based solely on geological information is currently impossible. Thus, the current approach is to fit a shape to the known previous eruption sites. The class of boundary shape is an unavoidable subjective decision taken by the forecaster that is often overlooked during subsequent analysis of results. This study shows the substantial effect that this choice may have on even the simplest exploratory methods for hazard forecasting, illustrated using four commonly used exploratory statistical methods and two very different regions: the Auckland Volcanic Field, New Zealand, and Harrat Rahat, Kingdom of Saudi Arabia. For Harrat Rahat, the sensitivity of results to boundary definition is substantial. For the Auckland Volcanic Field, the range of options resulted in similar shapes; nevertheless, some of the statistical tests still showed substantial variation in results. This work highlights the fact that when carrying out any hazard analysis on volcanic fields, it is vital to specify how the volcanic field boundary has been defined, to assess the sensitivity of results to that boundary choice, and to carry these assumptions and related uncertainties through to estimates of future activity and hazard analyses.
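    The sensitivity to boundary definition can be illustrated by fitting two different boundary shapes to the same set of vent locations and comparing the enclosed areas, as in the rough sketch below; the vent coordinates are synthetic, and the two shape classes (convex hull, covariance ellipse) are examples of the kind of choice discussed rather than the specific boundaries evaluated in the paper.

```python
# Sketch of boundary sensitivity: two plausible "field boundaries" fitted to the
# same vent locations can enclose very different areas, shifting any density or
# hazard estimate normalised by that area. Vent coordinates are synthetic.
import numpy as np
from scipy.spatial import ConvexHull

rng = np.random.default_rng(3)
vents = rng.multivariate_normal([0, 0], [[4.0, 1.5], [1.5, 2.0]], size=60)  # km

hull = ConvexHull(vents)
hull_area = hull.volume            # for 2-D input, .volume is the enclosed area

# 2-sigma ellipse from the sample covariance (one common alternative choice)
cov = np.cov(vents.T)
eigvals = np.linalg.eigvalsh(cov)
ellipse_area = np.pi * np.prod(2.0 * np.sqrt(eigvals))

print(f"convex hull area: {hull_area:8.2f} km^2")
print(f"2-sigma ellipse:  {ellipse_area:8.2f} km^2")
```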

  6. An Investigation of the Variety and Complexity of Statistical Methods Used in Current Internal Medicine Literature.

    PubMed

    Narayanan, Roshni; Nugent, Rebecca; Nugent, Kenneth

    2015-10-01

    Accreditation Council for Graduate Medical Education guidelines require internal medicine residents to develop skills in the interpretation of medical literature and to understand the principles of research. A necessary component is the ability to understand the statistical methods used and their results, material that is not an in-depth focus of most medical school curricula and residency programs. Given the breadth and depth of the current medical literature and an increasing emphasis on complex, sophisticated statistical analyses, the statistical foundation and education necessary for residents are uncertain. We reviewed the statistical methods and terms used in 49 articles discussed at the journal club in the Department of Internal Medicine residency program at Texas Tech University between January 1, 2013 and June 30, 2013. We collected information on the study type and on the statistical methods used for summarizing and comparing samples, determining the relations between independent variables and dependent variables, and estimating models. We then identified the typical statistics education level at which each term or method is learned. A total of 14 articles came from the Journal of the American Medical Association Internal Medicine, 11 from the New England Journal of Medicine, 6 from the Annals of Internal Medicine, 5 from the Journal of the American Medical Association, and 13 from other journals. Twenty reported randomized controlled trials. Summary statistics included mean values (39 articles), category counts (38), and medians (28). Group comparisons were based on t tests (14 articles), χ2 tests (21), and nonparametric ranking tests (10). The relations between dependent and independent variables were analyzed with simple regression (6 articles), multivariate regression (11), and logistic regression (8). Nine studies reported odds ratios with 95% confidence intervals, and seven analyzed test performance using sensitivity and specificity calculations. These papers used 128 statistical terms and context-defined concepts, including some from data analysis (56), epidemiology-biostatistics (31), modeling (24), data collection (12), and meta-analysis (5). Ten different software programs were used in these articles. Based on usual undergraduate and graduate statistics curricula, 64.3% of the concepts and methods used in these papers required at least a master's degree-level statistics education. The interpretation of the current medical literature can require an extensive background in statistical methods at an education level exceeding the material and resources provided to most medical students and residents. Given the complexity and time pressure of medical education, these deficiencies will be hard to correct, but this project can serve as a basis for developing a curriculum in study design and statistical methods needed by physicians-in-training.

  7. Developing a Campaign Plan to Target Centers of Gravity Within Economic Systems

    DTIC Science & Technology

    1995-05-01

    [Only front-matter fragments of this report survive extraction: table-of-contents entries for Chapter 7, Current and Future Concerns (Decision Making and Planning); Chapter 8, Conclusion; Appendix A, Statistics (Terminology and Statistical Tests; Country Analysis); Appendix B; Bibliography; and a list of figures beginning with Figure 1, Air Campaign...] This project furthers the original statistical effort and adds to this a campaign planning approach (including both systems and operational level

  8. Extreme current fluctuations in lattice gases: Beyond nonequilibrium steady states

    NASA Astrophysics Data System (ADS)

    Meerson, Baruch; Sasorov, Pavel V.

    2014-01-01

    We use the macroscopic fluctuation theory (MFT) to study large current fluctuations in nonstationary diffusive lattice gases. We identify two universality classes of these fluctuations, which we call elliptic and hyperbolic. They emerge in the limit when the deterministic mass flux is small compared to the mass flux due to the shot noise. The two classes are determined by the sign of the compressibility of the effective fluid obtained by mapping the MFT onto an inviscid hydrodynamics. An example of the elliptic class is the symmetric simple exclusion process, where, for some initial conditions, we can solve the effective hydrodynamics exactly. This leads to the super-Gaussian extreme current statistics conjectured by Derrida and Gerschenfeld [J. Stat. Phys. 137, 978 (2009), 10.1007/s10955-009-9830-1] and yields the optimal path of the system. For models of the hyperbolic class, the deterministic mass flux cannot be neglected, leading to different extreme current statistics.
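    For background, the macroscopic fluctuation theory referenced here weighs density and current histories with a variational action of the standard form below, where D(ρ) is the diffusivity and σ(ρ) the mobility of the lattice gas; this is the general MFT setup, not the paper's specific solution.

```latex
-\ln P[\rho, j] \;\simeq\;
\min_{\{\rho,\, j\,:\;\partial_t \rho + \partial_x j \,=\, 0\}}
\int_0^{T}\! dt \int dx\;
\frac{\bigl[\, j + D(\rho)\,\partial_x \rho \,\bigr]^{2}}{2\,\sigma(\rho)}
```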

  9. Distribution of tunnelling times for quantum electron transport.

    PubMed

    Rudge, Samuel L; Kosov, Daniel S

    2016-03-28

    In electron transport, the tunnelling time is the time taken for an electron to tunnel out of a system after it has tunnelled in. We define the tunnelling time distribution for quantum processes in a dissipative environment and develop a practical approach for calculating it, where the environment is described by the general Markovian master equation. We illustrate the theory by using the rate equation to compute the tunnelling time distribution for electron transport through a molecular junction. The tunnelling time distribution is exponential, which indicates that Markovian quantum tunnelling is a Poissonian statistical process. The tunnelling time distribution is used not only to study the quantum statistics of tunnelling along the average electric current but also to analyse extreme quantum events where an electron jumps against the applied voltage bias. The average tunnelling time shows distinctly different temperature dependence for p- and n-type molecular junctions and therefore provides a sensitive tool to probe the alignment of molecular orbitals relative to the electrode Fermi energy.
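    As a point of reference, in the simplest Markovian limit with a single constant escape rate (written here as Γ, our notation), the tunnelling-time density is exponential and hence Poissonian, consistent with the result quoted above:

```latex
w(\tau) \;=\; \Gamma\, e^{-\Gamma \tau}, \qquad
\langle \tau \rangle \;=\; \int_0^{\infty} \tau\, w(\tau)\, d\tau \;=\; \frac{1}{\Gamma}.
```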

  10. Associations between state-level soda taxes and adolescent body mass index.

    PubMed

    Powell, Lisa M; Chriqui, Jamie; Chaloupka, Frank J

    2009-09-01

    Soft drink consumption has been linked with higher energy intake, obesity, and poorer health. Fiscal pricing policies such as soda taxes may lower soda consumption and, in turn, reduce weight among U.S. adolescents. This study used multivariate linear regression analyses to examine the associations between state-level grocery store and vending machine soda taxes and adolescent body mass index (BMI). We used repeated cross-sections of individual-level data on adolescents drawn from the Monitoring the Future surveys combined with state-level tax data and local area contextual measures for the years 1997 through 2006. The results showed no statistically significant associations between state-level soda taxes and adolescent BMI. Only an economically weak, though statistically significant, association was found between vending machine soda tax rates and BMI among teens at risk for overweight. Current state-level tax rates were not found to be significantly associated with adolescent weight outcomes. It is likely that taxes would need to be raised substantially to detect significant associations between taxes and adolescent weight.
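    The kind of multivariate linear regression described can be sketched as follows; the data frame is simulated and the variable names are illustrative, not the survey's actual fields or the authors' full set of controls.

```python
# Sketch: adolescent BMI regressed on state soda-tax rates plus simple controls,
# using a simulated data frame with illustrative variable names.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 2_000
df = pd.DataFrame({
    "bmi": rng.normal(22.5, 3.5, n),
    "grocery_tax": rng.uniform(0, 7, n),      # percent, hypothetical
    "vending_tax": rng.uniform(0, 7, n),      # percent, hypothetical
    "age": rng.integers(13, 19, n),
    "female": rng.integers(0, 2, n),
    "year": rng.integers(1997, 2007, n),
})

model = smf.ols("bmi ~ grocery_tax + vending_tax + age + female + C(year)", data=df).fit()
print(model.summary().tables[1])   # coefficient table; the tax terms are of interest
```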

  11. Interlaboratory round robin study on axial tensile properties of SiC-SiC CMC tubular test specimens [Interlaboratory round robin study on axial tensile properties of SiC/SiC tubes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Singh, Gyanender P.; Gonczy, Steve T.; Deck, Christian P.

    An interlaboratory round robin study was conducted on the tensile strength of SiC–SiC ceramic matrix composite (CMC) tubular test specimens at room temperature with the objective of expanding the database of mechanical properties of nuclear grade SiC–SiC and establishing the precision and bias statement for standard test method ASTM C1773. The mechanical properties statistics from the round robin study and the precision statistics and precision statement are presented herein. The data show reasonable consistency across the laboratories, indicating that the current C1773-13 ASTM standard is adequate for testing ceramic fiber reinforced ceramic matrix composite tubular test specimens. Furthermore, it was found that the distribution of ultimate tensile strength data was best described with a two-parameter Weibull distribution, while a lognormal distribution provided a good description of the distribution of proportional limit stress data.
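    The distribution fitting mentioned above can be sketched with scipy as below; the strength values are simulated, and the model-comparison criterion (AIC) is our illustrative choice rather than the study's statistical procedure.

```python
# Sketch: fit a two-parameter Weibull and a lognormal distribution to synthetic
# strength data and compare them. Values are MPa-scale placeholders, not the
# round-robin measurements.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
uts = stats.weibull_min.rvs(c=8.0, scale=250.0, size=60, random_state=rng)  # MPa

# Two-parameter Weibull: fix the location at zero so only shape and scale are fit.
shape, loc, scale = stats.weibull_min.fit(uts, floc=0)
print(f"Weibull modulus (shape) = {shape:.2f}, characteristic strength = {scale:.1f} MPa")

# Lognormal fit (location also fixed at zero) for comparison.
s, loc_ln, scale_ln = stats.lognorm.fit(uts, floc=0)
aic_weibull = 2 * 2 - 2 * np.sum(stats.weibull_min.logpdf(uts, shape, 0, scale))
aic_lognorm = 2 * 2 - 2 * np.sum(stats.lognorm.logpdf(uts, s, 0, scale_ln))
print(f"AIC Weibull {aic_weibull:.1f} vs lognormal {aic_lognorm:.1f} (lower is better)")
```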

  12. Modeling Traffic on the Web Graph

    NASA Astrophysics Data System (ADS)

    Meiss, Mark R.; Gonçalves, Bruno; Ramasco, José J.; Flammini, Alessandro; Menczer, Filippo

    Analysis of aggregate and individual Web requests shows that PageRank is a poor predictor of traffic. We use empirical data to characterize properties of Web traffic not reproduced by Markovian models, including both aggregate statistics such as page and link traffic, and individual statistics such as entropy and session size. As no current model reconciles all of these observations, we present an agent-based model that explains them through realistic browsing behaviors: (1) revisiting bookmarked pages; (2) backtracking; and (3) seeking out novel pages of topical interest. The resulting model can reproduce the behaviors we observe in empirical data, especially heterogeneous session lengths, reconciling the narrowly focused browsing patterns of individual users with the extreme variance in aggregate traffic measurements. We can thereby identify a few salient features that are necessary and sufficient to interpret Web traffic data. Beyond the descriptive and explanatory power of our model, these results may lead to improvements in Web applications such as search and crawling.

  13. Monitoring Satellite Data Ingest and Processing for the Atmosphere Science Investigator-led Processing Systems (SIPS)

    NASA Astrophysics Data System (ADS)

    Witt, J.; Gumley, L.; Braun, J.; Dutcher, S.; Flynn, B.

    2017-12-01

    The Atmosphere SIPS (Science Investigator-led Processing Systems) team at the Space Science and Engineering Center (SSEC), which is funded through a NASA contract, creates Level 2 cloud and aerosol products from the VIIRS instrument aboard the S-NPP satellite. In order to monitor the ingest and processing of files, we have developed an extensive monitoring system to observe every step in the process. The status grid is used for real time monitoring, and shows the current state of the system, including what files we have and whether or not we are meeting our latency requirements. Our snapshot tool displays the state of the system in the past. It displays which files were available at a given hour and is used for historical and backtracking purposes. In addition to these grid like tools we have created histograms and other statistical graphs for tracking processing and ingest metrics, such as total processing time, job queue time, and latency statistics.

  14. Cannabis, motivation, and life satisfaction in an internet sample

    PubMed Central

    Barnwell, Sara Smucker; Earleywine, Mitch; Wilcox, Rand

    2006-01-01

    Although little evidence supports cannabis-induced amotivational syndrome, sources continue to assert that the drug saps motivation [1], which may guide current prohibitions. Few studies report low motivation in chronic users; another reveals that they have higher subjective wellbeing. To assess differences in motivation and subjective wellbeing, we used a large sample (N = 487) and strict definitions of cannabis use (7 days/week) and abstinence (never). Standard statistical techniques showed no differences. Robust statistical methods controlling for heteroscedasticity, non-normality and extreme values found no differences in motivation but a small difference in subjective wellbeing. Medical users of cannabis reporting health problems tended to account for a significant portion of subjective wellbeing differences, suggesting that illness decreased wellbeing. All p-values were above p = .05. Thus, daily use of cannabis does not impair motivation. Its impact on subjective wellbeing is small and may actually reflect lower wellbeing due to medical symptoms rather than actual consumption of the plant. PMID:16722561

  15. For the Love of the Game: Game- Versus Lecture-Based Learning With Generation Z Patients.

    PubMed

    Adamson, Mary A; Chen, Hengyi; Kackley, Russell; Micheal, Alicia

    2018-02-01

    The current study evaluated adolescent patients' enjoyment of and knowledge gained from game-based learning compared with an interactive lecture format on the topic of mood disorders. It was hypothesized that game-based learning would be statistically more effective than a lecture in knowledge acquisition and satisfaction scores. A pre-post design was implemented in which a convenience sample of 160 adolescent patients were randomized to either a lecture (n = 80) or game-based (n = 80) group. Both groups completed a pretest/posttest and satisfaction survey. Results showed that both groups had significant improvement in knowledge from pretest compared to posttest. Game-based learning was statistically more effective than the interactive lecture in knowledge achievement and satisfaction scores. This finding supports the contention that game-based learning is an active technique that may be used with patient education. [Journal of Psychosocial Nursing and Mental Health Services, 56(2), 29-36.]. Copyright 2018, SLACK Incorporated.
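    A pre/post design of this kind is commonly analysed with paired tests within groups and an independent-samples test on gain scores; the sketch below uses simulated scores with the abstract's group sizes, and the specific tests are our illustrative assumptions, not necessarily those reported in the paper.

```python
# Sketch of a pre/post comparison: paired t-tests within each arm on knowledge
# scores, plus an independent-samples test on the gain scores. Scores are
# simulated; group sizes follow the abstract (n = 80 per arm).
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
pre_game, post_game = rng.normal(60, 10, 80), rng.normal(75, 10, 80)
pre_lect, post_lect = rng.normal(60, 10, 80), rng.normal(68, 10, 80)

print(stats.ttest_rel(post_game, pre_game))        # within-group improvement
print(stats.ttest_rel(post_lect, pre_lect))
gain_game, gain_lect = post_game - pre_game, post_lect - pre_lect
print(stats.ttest_ind(gain_game, gain_lect))       # game vs. lecture gains
```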

  16. A Complex Network Approach to Stylometry

    PubMed Central

    Amancio, Diego Raphael

    2015-01-01

    Statistical methods have been widely employed to study the fundamental properties of language. In recent years, methods from complex and dynamical systems proved useful to create several language models. Despite the large amount of studies devoted to represent texts with physical models, only a limited number of studies have shown how the properties of the underlying physical systems can be employed to improve the performance of natural language processing tasks. In this paper, I address this problem by devising complex networks methods that are able to improve the performance of current statistical methods. Using a fuzzy classification strategy, I show that the topological properties extracted from texts complement the traditional textual description. In several cases, the performance obtained with hybrid approaches outperformed the results obtained when only traditional or networked methods were used. Because the proposed model is generic, the framework devised here could be straightforwardly used to study similar textual applications where the topology plays a pivotal role in the description of the interacting agents. PMID:26313921
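    One simple instance of the networked representation described is a word-adjacency graph with a handful of topological features; the text, feature set, and construction below are illustrative choices, not the author's model.

```python
# Sketch: build a word co-occurrence (adjacency) network from a text and compute
# a few topological features of the kind used alongside traditional statistics.
import networkx as nx

text = ("statistical methods have been widely employed to study the fundamental "
        "properties of language and to model texts as networks of words").split()

G = nx.Graph()
for w1, w2 in zip(text, text[1:]):          # link words that appear adjacently
    if w1 != w2:
        G.add_edge(w1, w2)

features = {
    "nodes": G.number_of_nodes(),
    "edges": G.number_of_edges(),
    "avg_clustering": nx.average_clustering(G),
    "avg_degree": sum(dict(G.degree()).values()) / G.number_of_nodes(),
}
print(features)
# In a hybrid classifier, features like these would be concatenated with
# traditional textual statistics (e.g., word frequencies) before classification.
```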

  17. Geographically Sourcing Cocaine’s Origin – Delineation of the Nineteen Major Coca Growing Regions in South America

    PubMed Central

    Mallette, Jennifer R.; Casale, John F.; Jordan, James; Morello, David R.; Beyer, Paul M.

    2016-01-01

    Previously, geo-sourcing to five major coca growing regions within South America was accomplished. However, the expansion of coca cultivation throughout South America made sub-regional origin determinations increasingly difficult. The former methodology was recently enhanced with additional stable isotope analyses (2H and 18O) to fully characterize cocaine due to the varying environmental conditions in which the coca was grown. An improved data analysis method was implemented with the combination of machine learning and multivariate statistical analysis methods to provide further partitioning between growing regions. Here, we show how the combination of trace cocaine alkaloids, stable isotopes, and multivariate statistical analyses can be used to classify illicit cocaine as originating from one of 19 growing regions within South America. The data obtained through this approach can be used to describe current coca cultivation and production trends, highlight trafficking routes, as well as identify new coca growing regions. PMID:27006288
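    A rough sketch of a combined machine-learning / multivariate classification workflow is given below; the feature set, simulated data, classifier, and validation scheme are illustrative assumptions, not the laboratory's actual method or data.

```python
# Sketch: classify samples into growing regions from trace-alkaloid and
# stable-isotope features. Features, labels, and data are simulated.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(6)
n_regions, per_region, n_features = 19, 30, 8     # e.g., d2H, d18O, alkaloid ratios
X = np.vstack([rng.normal(loc=r, scale=2.0, size=(per_region, n_features))
               for r in range(n_regions)])
y = np.repeat(np.arange(n_regions), per_region)

clf = RandomForestClassifier(n_estimators=300, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```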

  18. Interlaboratory round robin study on axial tensile properties of SiC-SiC CMC tubular test specimens [Interlaboratory round robin study on axial tensile properties of SiC/SiC tubes

    DOE PAGES

    Singh, Gyanender P.; Gonczy, Steve T.; Deck, Christian P.; ...

    2018-04-19

    An interlaboratory round robin study was conducted on the tensile strength of SiC–SiC ceramic matrix composite (CMC) tubular test specimens at room temperature with the objective of expanding the database of mechanical properties of nuclear grade SiC–SiC and establishing the precision and bias statement for standard test method ASTM C1773. The mechanical properties statistics from the round robin study and the precision statistics and precision statement are presented herein. The data show reasonable consistency across the laboratories, indicating that the current C1773-13 ASTM standard is adequate for testing ceramic fiber reinforced ceramic matrix composite tubular test specimens. Furthermore, it was found that the distribution of ultimate tensile strength data was best described with a two-parameter Weibull distribution, while a lognormal distribution provided a good description of the distribution of proportional limit stress data.

  19. Grain size analysis and depositional environment of shallow marine to basin floor, Kelantan River Delta

    NASA Astrophysics Data System (ADS)

    Afifah, M. R. Nurul; Aziz, A. Che; Roslan, M. Kamal

    2015-09-01

    Sediment samples were collected from the shallow marine area off Kuala Besar, Kelantan, outwards to the basin floor of the South China Sea, and consisted of Quaternary bottom sediments. Sixty-five samples were analysed for their grain size distribution and statistical relationships. Basic statistical parameters such as mean, standard deviation, skewness and kurtosis were calculated and used to differentiate the depositional environment of the sediments and to determine whether the deposits were derived from a beach or a river environment. The sediments of all areas varied in their sorting, ranging from very well sorted to poorly sorted, strongly negatively skewed to strongly positively skewed, and extremely leptokurtic to very platykurtic in nature. Bivariate plots between the grain-size parameters were then interpreted, and the Coarsest-Median (CM) pattern showed a trend suggesting that the sediments are influenced by three ongoing hydrodynamic factors, namely turbidity currents, littoral drift and wave dynamics, which control the sediment distribution pattern in various ways.
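    The graphic grain-size statistics named above are conventionally computed from percentiles of the cumulative curve in phi units (Folk and Ward's formulas); the sketch below uses invented percentile values and assumes that convention, which the abstract does not state explicitly.

```python
# Sketch: Folk & Ward graphic grain-size statistics from cumulative-curve
# percentiles in phi units. Percentile values are invented; a real analysis
# would interpolate them from sieve or laser-diffraction data.
phi = {5: 0.8, 16: 1.2, 25: 1.5, 50: 2.1, 75: 2.7, 84: 3.0, 95: 3.6}

mean_phi = (phi[16] + phi[50] + phi[84]) / 3
sorting = (phi[84] - phi[16]) / 4 + (phi[95] - phi[5]) / 6.6
skewness = ((phi[16] + phi[84] - 2 * phi[50]) / (2 * (phi[84] - phi[16]))
            + (phi[5] + phi[95] - 2 * phi[50]) / (2 * (phi[95] - phi[5])))
kurtosis = (phi[95] - phi[5]) / (2.44 * (phi[75] - phi[25]))

print(f"graphic mean      : {mean_phi:.2f} phi")
print(f"inclusive sorting : {sorting:.2f} phi")
print(f"inclusive skewness: {skewness:.2f}")
print(f"graphic kurtosis  : {kurtosis:.2f}")
```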

  20. On the effectiveness of noise masks: naturalistic vs. un-naturalistic image statistics.

    PubMed

    Hansen, Bruce C; Hess, Robert F

    2012-05-01

    It has been argued that the human visual system is optimized for identification of broadband objects embedded in stimuli possessing orientation-averaged power spectrum fall-offs that obey the 1/f^β relationship typically observed in natural scene imagery (i.e., β = 2.0 on logarithmic axes). Here, we were interested in whether individual spatial channels leading to recognition are functionally optimized for narrowband targets when masked by noise possessing naturalistic image statistics (β = 2.0). The current study therefore explores the impact of variable-β noise masks on the identification of narrowband target stimuli ranging in spatial complexity, while simultaneously controlling for physical or perceived differences between the masks. The results show that β = 2.0 noise masks produce the largest identification thresholds regardless of target complexity, and thus do not seem to yield functionally optimized channel processing. The differential masking effects are discussed in the context of contrast gain control. Copyright © 2012 Elsevier Ltd. All rights reserved.
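    Noise masks with a prescribed spectral slope of this kind are typically synthesized in the Fourier domain; the sketch below is a generic construction (random phases, amplitude proportional to f^(-β/2) so that power falls as 1/f^β) and is not the study's exact stimulus-generation code.

```python
# Sketch: generate a 2-D noise mask whose orientation-averaged power spectrum
# falls off as 1/f^beta (beta = 2.0 approximates natural-scene statistics).
# Image size and beta are arbitrary choices here.
import numpy as np

def noise_mask(size=256, beta=2.0, rng=None):
    rng = np.random.default_rng() if rng is None else rng
    fx = np.fft.fftfreq(size)[:, None]
    fy = np.fft.fftfreq(size)[None, :]
    f = np.sqrt(fx**2 + fy**2)
    f[0, 0] = 1.0                               # avoid division by zero at DC
    amplitude = f ** (-beta / 2.0)              # power ~ 1/f^beta => amplitude ~ f^(-beta/2)
    phase = np.exp(2j * np.pi * rng.random((size, size)))
    img = np.fft.ifft2(amplitude * phase).real
    img -= img.mean()
    return img / img.std()                      # zero mean, unit RMS contrast

mask = noise_mask(beta=2.0)
print(mask.shape, round(mask.std(), 3))
```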
