NASA Astrophysics Data System (ADS)
Schwartz, M. A.; Hall, A. D.; Sun, F.; Walton, D.; Berg, N.
2015-12-01
Hybrid dynamical-statistical downscaling is used to produce surface runoff timing projections for California's Sierra Nevada, a high-elevation mountain range with significant seasonal snow cover. First, future climate change projections (RCP8.5 forcing scenario, 2081-2100 period) from five CMIP5 global climate models (GCMs) are dynamically downscaled. These projections reveal that future warming leads to a shift toward earlier snowmelt and surface runoff timing throughout the Sierra Nevada region. Relationships between warming and surface runoff timing in the dynamical simulations are used to build a simple statistical model that mimics the dynamical model's projected surface runoff timing changes given GCM input or other statistically downscaled input. This statistical model can be used to produce surface runoff timing projections for other GCMs, periods, and forcing scenarios in order to quantify ensemble-mean changes, uncertainty due to intermodel variability, and consequences stemming from the choice of forcing scenario. For all CMIP5 GCMs and forcing scenarios, significant trends toward earlier surface runoff timing occur at elevations below 2500 m. Thus, we conclude that trends toward earlier surface runoff timing by the end of the 21st century are inevitable. The changes to surface runoff timing diagnosed in this study have implications for many dimensions of climate change, including impacts on surface hydrology, water resources, and ecosystems.
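A minimal sketch of the kind of statistical emulator described above, assuming the dynamical simulations supply paired samples of regional warming and runoff-timing shift and that a linear fit is adequate; the numbers and the linear form are illustrative assumptions, not values from the study:

```python
import numpy as np

# Hypothetical training data from the dynamically downscaled simulations:
# projected warming (deg C) and runoff-timing shift (days, negative = earlier).
warming_dyn = np.array([2.1, 2.8, 3.4, 3.9, 4.5])     # regional warming per GCM
shift_dyn = np.array([-12., -18., -24., -29., -35.])  # change in runoff-timing date

# Fit the simple statistical relation (here a linear fit) that emulates the
# dynamical model's timing response to warming.
slope, intercept = np.polyfit(warming_dyn, shift_dyn, 1)

def project_timing_shift(gcm_warming):
    """Apply the emulator to warming from any GCM, period, or scenario."""
    return slope * np.asarray(gcm_warming) + intercept

# Example: warming values from an ensemble of CMIP5 GCMs under another scenario.
ensemble_warming = np.array([1.5, 2.0, 2.6, 3.1])
print(project_timing_shift(ensemble_warming))         # projected shifts (days)
```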
Seasonal Atmospheric and Oceanic Predictions
NASA Technical Reports Server (NTRS)
Roads, John; Rienecker, Michele (Technical Monitor)
2003-01-01
Several projects associated with dynamical, statistical, single column, and ocean models are presented. The projects include: 1) Regional Climate Modeling; 2) Statistical Downscaling; 3) Evaluation of SCM and NSIPP AGCM Results at the ARM Program Sites; and 4) Ocean Forecasts.
Downscaled projections of Caribbean coral bleaching that can inform conservation planning.
van Hooidonk, Ruben; Maynard, Jeffrey Allen; Liu, Yanyun; Lee, Sang-Ki
2015-09-01
Projections of climate change impacts on coral reefs produced at the coarse resolution (~1°) of Global Climate Models (GCMs) have informed debate but have not helped target local management actions. Here, projections of the onset of annual coral bleaching conditions in the Caribbean under Representative Concentration Pathway (RCP) 8.5 are produced using an ensemble of 33 Coupled Model Intercomparison Project phase-5 models and via dynamical and statistical downscaling. A high-resolution (~11 km) regional ocean model (MOM4.1) is used for the dynamical downscaling. For statistical downscaling, sea surface temperature (SST) means and annual cycles in all the GCMs are replaced with observed data from the ~4-km NOAA Pathfinder SST dataset. Spatial patterns in all three projections are broadly similar; the average year for the onset of annual severe bleaching is 2040-2043 for all projections. However, downscaled projections show many locations where the onset of annual severe bleaching (ASB) varies by 10 or more years within a single GCM grid cell. Managers in locations where this applies (e.g., Florida, Turks and Caicos, Puerto Rico, and the Dominican Republic, among others) can identify locations that represent relative, albeit temporary, refugia. For the Bahamas, both downscaled projections differ from the GCM projections. The dynamically downscaled projections suggest an earlier onset of ASB linked to projected changes in regional currents, a feature not resolved in GCMs. This result demonstrates the value of dynamical downscaling for this application and means that statistically downscaled projections have to be interpreted with caution. However, aside from west of Andros Island, the projections for the two types of downscaling are mostly aligned; the projected onset of ASB is within ±10 years for 72% of the reef locations. © 2015 The Authors. Global Change Biology Published by John Wiley & Sons Ltd.
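The statistical downscaling step described above (swapping each GCM's SST mean and annual cycle for an observed high-resolution climatology while retaining the GCM's projected anomalies) can be illustrated with a minimal sketch; the array shapes, baseline choice, and synthetic numbers are assumptions for illustration, not the paper's actual data handling:

```python
import numpy as np

def replace_mean_and_annual_cycle(gcm_sst, obs_clim, baseline):
    """
    gcm_sst  : array (years, 12) of monthly GCM SST for one coarse grid cell
    obs_clim : array (12,) observed monthly climatology at a high-res location
    baseline : slice of years defining the climatological baseline
    Returns GCM SST with its mean annual cycle swapped for the observed one.
    """
    gcm_cycle = gcm_sst[baseline].mean(axis=0)   # GCM monthly climatology
    anomalies = gcm_sst - gcm_cycle              # GCM-projected anomalies
    return anomalies + obs_clim                  # anomalies on the observed cycle

# Example with synthetic numbers (illustrative only).
rng = np.random.default_rng(0)
months = np.arange(12)
gcm = 27.0 + 1.5 * np.sin(2 * np.pi * months / 12) + rng.normal(0, 0.3, (50, 12))
obs = 28.2 + 2.0 * np.sin(2 * np.pi * months / 12)
downscaled = replace_mean_and_annual_cycle(gcm, obs, slice(0, 20))
```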
NASA Astrophysics Data System (ADS)
Terando, A. J.; Grade, S.; Bowden, J.; Henareh Khalyani, A.; Wootten, A.; Misra, V.; Collazo, J.; Gould, W. A.; Boyles, R.
2016-12-01
Sub-tropical island nations may be particularly vulnerable to anthropogenic climate change because of predicted changes in the hydrologic cycle that would lead to significant drying in the future. However, decision makers in these regions have seen their adaptation planning efforts frustrated by the lack of island-resolving climate model information. Recently, two investigations have used statistical and dynamical downscaling techniques to develop climate change projections for the U.S. Caribbean region (Puerto Rico and U.S. Virgin Islands). We compare the results from these two studies with respect to three commonly downscaled CMIP5 global climate models (GCMs). The GCMs were dynamically downscaled at a convection-permitting scale using two different regional climate models. The statistical downscaling approach was conducted at locations with long-term climate observations and then further post-processed using climatologically aided interpolation (yielding two sets of projections). Overall, both approaches face unique challenges. The statistical approach suffers from a lack of observations necessary to constrain the model, particularly at the land-ocean boundary and in complex terrain. The dynamically downscaled model output has a systematic dry bias over the island despite ample availability of moisture in the atmospheric column. Notwithstanding these differences, both approaches are consistent in projecting a drier climate that is driven by the strong global-scale anthropogenic forcing.
Nonnormality and Divergence in Posttreatment Alcohol Use
Witkiewitz, Katie; van der Maas, Han L. J.; Hufford, Michael R.; Marlatt, G. Alan
2007-01-01
Alcohol lapses are the modal outcome following treatment for alcohol use disorders, yet many alcohol researchers have encountered limited success in the prediction and prevention of relapse. One hypothesis is that lapses are unpredictable, but another possibility is that the complexity of the relapse process is not captured by traditional statistical methods. Data from Project Matching Alcohol Treatments to Client Heterogeneity (Project MATCH), a multisite alcohol treatment study, were reanalyzed with 2 statistical methodologies: catastrophe and 2-part growth mixture modeling. Drawing on previous investigations of self-efficacy as a dynamic predictor of relapse, the current study revisits the self-efficacy matching hypothesis, which was not statistically supported in Project MATCH. Results from both the catastrophe and growth mixture analyses demonstrated a dynamic relationship between self-efficacy and drinking outcomes. The growth mixture analyses provided evidence in support of the original matching hypothesis: Individuals with lower self-efficacy who received cognitive behavior therapy drank far less frequently than did those with low self-efficacy who received motivational therapy. These results highlight the dynamical nature of the relapse process and the importance of using methodologies that accommodate this complexity when evaluating treatment outcomes. PMID:17516769
Enhancing seasonal climate prediction capacity for the Pacific countries
NASA Astrophysics Data System (ADS)
Kuleshov, Y.; Jones, D.; Hendon, H.; Charles, A.; Cottrill, A.; Lim, E.-P.; Langford, S.; de Wit, R.; Shelton, K.
2012-04-01
Seasonal and inter-annual climate variability is a major factor in determining the vulnerability of many Pacific Island Countries to climate change, and there is a need to improve weekly to seasonal-range climate prediction capabilities beyond what is currently available from statistical models. Here we describe the seasonal climate prediction project under the Australian Government's Pacific Adaptation Strategy Assistance Program (PASAP), a comprehensive effort to strengthen the climate prediction capacities of the National Meteorological Services in 14 Pacific Island Countries and East Timor. The intent is particularly to reduce the vulnerability of current services to a changing climate and to improve the overall level of information available to assist with managing climate variability. Statistical models cannot account for aspects of climate variability and change that are not represented in the historical record. In contrast, dynamical physics-based models implicitly include the effects of a changing climate, whatever its character or cause, and can predict outcomes not seen previously. The transition from a statistical to a dynamical prediction system provides more valuable and applicable climate information to a wide range of climate-sensitive sectors throughout the countries of the Pacific region. In this project, we have developed seasonal climate outlooks based upon the current dynamical model POAMA (Predictive Ocean-Atmosphere Model for Australia) seasonal forecast system. At present, the meteorological services of the Pacific Island Countries largely employ statistical models for seasonal outlooks. Outcomes of the PASAP project have enhanced the seasonal prediction capabilities of the Pacific Island Countries, providing National Meteorological Services with an additional tool to analyse meteorological variables such as sea surface temperature, air temperature, pressure and rainfall using POAMA outputs and to prepare more accurate seasonal climate outlooks.
Interference in the classical probabilistic model and its representation in complex Hilbert space
NASA Astrophysics Data System (ADS)
Khrennikov, Andrei Yu.
2005-10-01
The notion of a context (a complex of physical conditions, that is to say, a specification of the measurement setup) is basic in this paper. We show that the main structures of quantum theory (interference of probabilities, Born's rule, complex probabilistic amplitudes, Hilbert state space, representation of observables by operators) are already present in a latent form in the classical Kolmogorov probability model. However, this model should be considered as a calculus of contextual probabilities. In our approach it is forbidden to consider abstract context-independent probabilities: “first context and only then probability”. We construct the representation of the general contextual probabilistic dynamics in the complex Hilbert space. Thus the dynamics of the wave function (in particular, Schrödinger's dynamics) can be considered as a Hilbert space projection of a realistic dynamics in a “prespace”. The basic condition for representing the prespace dynamics is the law of statistical conservation of energy: conservation of probabilities. In general the Hilbert space projection of the “prespace” dynamics can be nonlinear and even irreversible (but it is always unitary). The methods developed in this paper can be applied not only to quantum mechanics, but also to classical statistical mechanics. The main quantum-like structures (e.g., interference of probabilities) might be found in some models of classical statistical mechanics. Quantum-like probabilistic behavior can be demonstrated by biological systems. In particular, it was recently found in some psychological experiments.
Non-Markovian generalization of the Lindblad theory of open quantum systems
NASA Astrophysics Data System (ADS)
Breuer, Heinz-Peter
2007-02-01
A systematic approach to the non-Markovian quantum dynamics of open systems is given by the projection operator techniques of nonequilibrium statistical mechanics. Combining these methods with concepts from quantum information theory and from the theory of positive maps, we derive a class of correlated projection superoperators that take into account in an efficient way statistical correlations between the open system and its environment. The result is used to develop a generalization of the Lindblad theory to the regime of highly non-Markovian quantum processes in structured environments.
NASA Technical Reports Server (NTRS)
Manning, Robert M.
1987-01-01
A dynamic rain attenuation prediction model is developed for use in obtaining the temporal characteristics, on time scales of minutes or hours, of satellite communication link availability. Analogous to the associated static rain attenuation model, which yields yearly attenuation predictions, this dynamic model is applicable at any location in the world that is characterized by the static rain attenuation statistics peculiar to the geometry of the satellite link and the rain statistics of the location. Such statistics are calculated by employing the formalism of Part I of this report. In fact, the dynamic model presented here is an extension of the static model and reduces to the static model in the appropriate limit. By assuming that rain attenuation is dynamically described by a first-order stochastic differential equation in time and that this random attenuation process is a Markov process, an expression for the associated transition probability is obtained by solving the related forward Kolmogorov equation. This transition probability is then used to obtain such temporal rain attenuation statistics as attenuation durations and allowable attenuation margins versus control system delay.
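As a rough illustration of such a first-order Markov fade model, the sketch below assumes log-attenuation follows an Ornstein-Uhlenbeck process (one common choice consistent with lognormal static statistics) and extracts a fade-duration statistic; the parameter values and the 5 dB margin are arbitrary assumptions, not those of the report:

```python
import numpy as np

# Sketch of a first-order Markov fade model: log-attenuation follows an
# Ornstein-Uhlenbeck process, so the attenuation itself stays lognormal,
# consistent with static lognormal rain-attenuation statistics.
beta = 2e-4          # 1/s, dynamic (relaxation) parameter -- assumed value
m, s = 0.5, 1.0      # mean and std of ln(attenuation in dB) -- assumed values
dt = 1.0             # s
n = 24 * 3600        # one day of samples

rng = np.random.default_rng(1)
x = np.empty(n)
x[0] = m
for k in range(1, n):
    x[k] = x[k-1] + beta * (m - x[k-1]) * dt + s * np.sqrt(2 * beta * dt) * rng.normal()
atten_db = np.exp(x)

# Temporal statistic of interest: durations of fades exceeding a 5 dB margin.
exceed = np.concatenate(([False], atten_db > 5.0, [False]))
changes = np.diff(exceed.astype(int))
starts = np.where(changes == 1)[0]
ends = np.where(changes == -1)[0]
durations = (ends - starts) * dt
print("mean fade duration (s):", durations.mean() if durations.size else 0.0)
```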
NASA Technical Reports Server (NTRS)
Manning, Robert M.
1991-01-01
The dynamic and composite nature of propagation impairments that are incurred on Earth-space communications links at frequencies in and above the 30/20 GHz Ka band, i.e., rain attenuation, cloud and/or clear air scintillation, etc., combined with the need to counter such degradations after the small link margins have been exceeded, necessitates the use of dynamic statistical identification and prediction processing of the fading signal in order to optimally estimate and predict the levels of each of the deleterious attenuation components. Such requirements are being met in NASA's Advanced Communications Technology Satellite (ACTS) Project by the implementation of optimal processing schemes derived through the use of the Rain Attenuation Prediction Model and nonlinear Markov filtering theory.
Synthesis of Systemic Functional Theory & Dynamical Systems Theory for Socio-Cultural Modeling
2012-05-07
Kevin Judd (Email: Kevin.Judd@uwa.edu.au). Institution: University of Western Australia. Mailing Address: School of Mathematics and Statistics... Performance: 6 May 2010 - 5 May 2012. Note: Kevin Judd was on extended medical leave in 2011. He did not contribute to the project during 2011-2012...the project. Marissa E Kwan Lin was the Research Associate for the project.
Orr, Lindsay; Hernández de la Peña, Lisandro; Roy, Pierre-Nicholas
2017-06-07
A derivation of quantum statistical mechanics based on the concept of a Feynman path centroid is presented for the case of generalized density operators using the projected density operator formalism of Blinov and Roy [J. Chem. Phys. 115, 7822-7831 (2001)]. The resulting centroid densities, centroid symbols, and centroid correlation functions are formulated and analyzed in the context of the canonical equilibrium picture of Jang and Voth [J. Chem. Phys. 111, 2357-2370 (1999)]. The case where the density operator projects onto a particular energy eigenstate of the system is discussed, and it is shown that one can extract microcanonical dynamical information from double Kubo transformed correlation functions. It is also shown that the proposed projection operator approach can be used to formally connect the centroid and Wigner phase-space distributions in the zero reciprocal temperature β limit. A Centroid Molecular Dynamics (CMD) approximation to the state-projected exact quantum dynamics is proposed and proven to be exact in the harmonic limit. The state projected CMD method is also tested numerically for a quartic oscillator and a double-well potential and found to be more accurate than canonical CMD. In the case of a ground state projection, this method can resolve tunnelling splittings of the double well problem in the higher barrier regime where canonical CMD fails. Finally, the state-projected CMD framework is cast in a path integral form.
Accounting for Global Climate Model Projection Uncertainty in Modern Statistical Downscaling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johannesson, G
2010-03-17
Future climate change has emerged as a national and a global security threat. To carry out the needed adaptation and mitigation steps, a quantification of the expected level of climate change is needed, both at the global and the regional scale; in the end, the impact of climate change is felt at the local/regional level. An important part of such climate change assessment is uncertainty quantification. Decision and policy makers are not only interested in 'best guesses' of expected climate change, but rather in probabilistic quantification (e.g., Rougier, 2007). For example, consider the following question: What is the probability that the average summer temperature will increase by at least 4 C in region R if global CO2 emission increases by P% from current levels by time T? It is a simple question, but one that remains very difficult to answer, and it is answering these kinds of questions that is the focus of this effort. The uncertainty associated with future climate change can be attributed to three major factors: (1) uncertainty about future emission of greenhouse gases (GHG); (2) given a future GHG emission scenario, what is its impact on the global climate?; and (3) given a particular evolution of the global climate, what does it mean for a particular location/region? In what follows, we assume a particular GHG emission scenario has been selected. Given the GHG emission scenario, the current batch of state-of-the-art global climate models (GCMs) is used to simulate future climate under this scenario, yielding an ensemble of future climate projections (which reflect, to some degree, our uncertainty in being able to simulate future climate given a particular GHG scenario). Due to the coarse-resolution nature of the GCM projections, they need to be spatially downscaled for regional impact assessments. To downscale a given GCM projection, two methods have emerged: dynamical downscaling and statistical (empirical) downscaling (SDS). Dynamical downscaling involves configuring and running a regional climate model (RCM) nested within a given GCM projection (i.e., the GCM provides boundary conditions for the RCM). On the other hand, statistical downscaling aims at establishing a statistical relationship between observed local/regional climate variables of interest and synoptic (GCM-scale) climate predictors. The resulting empirical relationship is then applied to future GCM projections. A comparison of the pros and cons of dynamical versus statistical downscaling is outside the scope of this effort, but it has been extensively studied and the reader is referred to Wilby et al. (1998); Murphy (1999); Wood et al. (2004); Benestad et al. (2007); Fowler et al. (2007), and references within those. The scope of this effort is to study methodology, a statistical framework, to propagate and account for GCM uncertainty in a regional statistical downscaling assessment. In particular, we explore how to leverage an ensemble of GCM projections to quantify the impact of the GCM uncertainty in such an assessment. There are three main components to this effort: (1) gather the necessary climate-related data for a regional SDS study, including multiple GCM projections, (2) carry out SDS, and (3) assess the uncertainty. The first step is carried out using tools written in the Python programming language, while analysis tools were developed in the statistical programming language R; see Figure 1.
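A toy sketch of the workflow outlined above: calibrate a statistical downscaling relation on historical data, apply it to every member of a GCM ensemble, and report a probabilistic answer; the linear relation, synthetic data, and ensemble values are placeholders, not the project's actual framework:

```python
import numpy as np

rng = np.random.default_rng(2)

# 1. Calibrate a simple SDS relation: local observation vs. GCM-scale predictor.
#    (Synthetic stand-ins; a real study would use station data and GCM fields.)
predictor_hist = rng.normal(0.0, 1.0, 600)              # regional temperature anomaly
local_obs = 1.3 * predictor_hist + rng.normal(0.0, 0.4, 600)
a, b = np.polyfit(predictor_hist, local_obs, 1)

# 2. Apply the relation to each member of a GCM ensemble (same scenario).
gcm_ensemble = np.array([3.2, 3.8, 4.4, 2.9, 5.1, 3.6]) # assumed projected changes

downscaled_changes = a * gcm_ensemble + b

# 3. Treat the ensemble spread as (part of) the GCM uncertainty and report a
#    probabilistic statement rather than a single best guess.
p_at_least_4C = np.mean(downscaled_changes >= 4.0)
print(f"P(local summer warming >= 4 C) ~ {p_at_least_4C:.2f}")
```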
Sohl, Terry L.; Sayler, Kristi L.; Drummond, Mark A.; Loveland, Thomas R.
2007-01-01
A wide variety of ecological applications require spatially explicit, historic, current, and projected land use and land cover data. The U.S. Land Cover Trends project is analyzing contemporary (1973–2000) land-cover change in the conterminous United States. The newly developed FORE-SCE model used Land Cover Trends data and theoretical, statistical, and deterministic modeling techniques to project future land cover change through 2020 for multiple plausible scenarios. Projected proportions of future land use were initially developed, and then sited on the lands with the highest potential for supporting that land use and land cover using a statistically based stochastic allocation procedure. Three scenarios of 2020 land cover were mapped for the western Great Plains in the US. The model provided realistic, high-resolution, scenario-based land-cover products suitable for multiple applications, including studies of climate and weather variability, carbon dynamics, and regional hydrology.
Bigdely-Shamlo, Nima; Mullen, Tim; Kreutz-Delgado, Kenneth; Makeig, Scott
2013-01-01
A crucial question for the analysis of multi-subject and/or multi-session electroencephalographic (EEG) data is how to combine information across multiple recordings from different subjects and/or sessions, each associated with its own set of source processes and scalp projections. Here we introduce a novel statistical method for characterizing the spatial consistency of EEG dynamics across a set of data records. Measure Projection Analysis (MPA) first finds voxels in a common template brain space at which a given dynamic measure is consistent across nearby source locations, then computes local-mean EEG measure values for this voxel subspace using a statistical model of source localization error and between-subject anatomical variation. Finally, clustering the mean measure voxel values in this locally consistent brain subspace finds brain spatial domains exhibiting distinguishable measure features and provides 3-D maps plus statistical significance estimates for each EEG measure of interest. Applied to sufficient high-quality data, the scalp projections of many maximally independent component (IC) processes contributing to recorded high-density EEG data closely match the projection of a single equivalent dipole located in or near brain cortex. We demonstrate the application of MPA to a multi-subject EEG study decomposed using independent component analysis (ICA), compare the results to k-means IC clustering in EEGLAB (sccn.ucsd.edu/eeglab), and use surrogate data to test MPA robustness. A Measure Projection Toolbox (MPT) plug-in for EEGLAB is available for download (sccn.ucsd.edu/wiki/MPT). Together, MPA and ICA allow use of EEG as a 3-D cortical imaging modality with near-cm scale spatial resolution. PMID:23370059
Statistical deprojection of galaxy pairs
NASA Astrophysics Data System (ADS)
Nottale, Laurent; Chamaraux, Pierre
2018-06-01
Aims: The purpose of the present paper is to provide methods of statistical analysis of the physical properties of galaxy pairs. We perform this study in order to apply it later to catalogs of isolated pairs of galaxies, especially two new catalogs we recently constructed that contain ≈1000 and ≈13 000 pairs, respectively. We are particularly interested in the dynamics of those pairs, including the determination of their masses. Methods: We could not compute the dynamical parameters directly since the necessary data are incomplete. Indeed, we only have at our disposal one component of the intervelocity between the members, namely along the line of sight, and two components of their interdistance, i.e., the projection on the sky plane. Moreover, we know only one point of each galaxy orbit. Hence we need statistical methods to find the probability distribution of 3D interdistances and 3D intervelocities from their projections; we designate these methods by the term deprojection. Results: We proceed in two steps to determine and use the deprojection methods. First we derive the probability distributions expected for the various relevant projected quantities, namely the intervelocity v_z, the interdistance r_p, their ratio, and the product r_p v_z^2, which is involved in mass determination. In a second step, we propose various methods of deprojection of those parameters based on the previous analysis. We start from a histogram of the projected data and apply inversion formulae to obtain the deprojected distributions; lastly, we test the methods by numerical simulations, which also allow us to determine the uncertainties involved.
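The numerical-simulation test mentioned in the Results can be sketched as follows: draw isotropic orientations for the 3D interdistance and intervelocity vectors, project them, and compare the projected moments with their analytic values; this is an illustrative check, not the authors' deprojection code:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000
r_true, v_true = 1.0, 1.0   # fixed 3D interdistance and intervelocity (arbitrary units)

# Isotropic orientations: cos(theta) uniform on [-1, 1].
cos_t_r = rng.uniform(-1.0, 1.0, n)   # separation vector vs. line of sight
cos_t_v = rng.uniform(-1.0, 1.0, n)   # velocity vector vs. line of sight

r_p = r_true * np.sqrt(1.0 - cos_t_r**2)   # sky-plane projected separation
v_z = v_true * np.abs(cos_t_v)             # line-of-sight velocity component

# Analytic expectations for isotropic projection (compare with the Monte Carlo):
#   <r_p> = (pi/4) r,   <|v_z|> = v / 2,   <v_z^2> = v^2 / 3
print(r_p.mean(), np.pi / 4 * r_true)
print(v_z.mean(), v_true / 2)
print((v_z**2).mean(), v_true**2 / 3)
```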
Computer architecture evaluation for structural dynamics computations: Project summary
NASA Technical Reports Server (NTRS)
Standley, Hilda M.
1989-01-01
The intent of the proposed effort is the examination of the impact of the elements of parallel architectures on the performance realized in a parallel computation. To this end, three major projects are developed: a language for the expression of high level parallelism, a statistical technique for the synthesis of multicomputer interconnection networks based upon performance prediction, and a queueing model for the analysis of shared memory hierarchies.
Short-time dynamics of molecular junctions after projective measurement
NASA Astrophysics Data System (ADS)
Tang, Gaomin; Xing, Yanxia; Wang, Jian
2017-08-01
In this work, we study the short-time dynamics of a molecular junction described by the Anderson-Holstein model using full-counting statistics after a projective measurement. The coupling between the central quantum dot (QD) and two leads was turned on in the remote past and the system has evolved to a steady state at time t = 0, when we perform the projective measurement in one of the leads. The generating function for the charge transfer is expressed as a Fredholm determinant in terms of the Keldysh nonequilibrium Green's function in the time domain. It is found that the current is not constant at short times, indicating that the measurement does perturb the system. We numerically compare the current behaviors after the projective measurement with those in the transient regime where the subsystems are connected at t = 0. The universal scaling for high-order cumulants is observed for the case of zero QD occupation, due to the unidirectional transport at short times. The influences of electron-phonon interaction on the short-time dynamics of the electric current, shot noise, and differential conductance are analyzed.
Employing Deceptive Dynamic Network Topology Through Software-Defined Networking
2014-03-01
manage economies, banking, and businesses, to the way we gather intelligence and militaries wage war. With computer networks and the Internet, we have seen...space, along with other generated statistics, similar to that performed by the Ant Census project. As we have shown, there is an extensive and diverse...calculated RTT for each probe. In the ping statistics, we are presented the details of probes sent and responses received, and the calculated packet loss
HIERARCHICAL STATISTICAL MODELLING OF INFLUENZA EPIDEMIC DYNAMICS IN SPACE AND TIME. (R827257)
The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications convey the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...
NASA Astrophysics Data System (ADS)
Muhling, B.; Gaitan, C. F.; Tommasi, D.; Saba, V. S.; Stock, C. A.; Dixon, K. W.
2016-02-01
Estuaries of the northeastern United States provide essential habitat for many anadromous fishes, across a range of life stages. Climate change is likely to impact estuarine environments and habitats through multiple pathways. Increasing air temperatures will result in a warming water column, and potentially increased stratification. In addition, changes to precipitation patterns may alter freshwater inflow dynamics, leading to altered seasonal salinity regimes. However, the spatial resolution of global climate models is generally insufficient to resolve these processes at the scale of individual estuaries. Global models can be downscaled to a regional resolution using a variety of dynamical and statistical methods. In this study, we examined projections of estuarine conditions, and future habitat extent, for several anadromous fishes in the Chesapeake Bay using different statistical downscaling methods. Sources of error from physical and biological models were quantified, and key areas of uncertainty were highlighted. Results suggested that future projections of the distribution and recruitment of species most strongly linked to freshwater flow dynamics had the highest levels of uncertainty. The sensitivity of different life stages to environmental conditions, and the population-level responses of anadromous species to climate change, were identified as important areas for further research.
Climate Change Projection for the Department of Energy's Savannah River Site
NASA Astrophysics Data System (ADS)
Werth, D. W.
2014-12-01
As per recent Department of Energy (DOE) sustainability requirements, the Savannah River National Laboratory (SRNL) is developing a climate projection for the DOE's Savannah River Site (SRS) near Aiken, SC. This will comprise data from both a statistical and a dynamical downscaling process, each interpolated to the SRS. We require the variables most relevant to operational activities at the site (such as the US Forest Service's forest management program), and select temperature, precipitation, wind, and humidity as being most relevant to energy and water resource requirements, fire and forest ecology, and facility and worker safety. We then develop projections of the means and extremes of these variables, estimate the effect on site operations, and develop long-term mitigation strategies. For example, given that outdoor work while wearing protective gear is a daily facet of site operations, heat stress is of primary importance to work planning, and we use the downscaled data to estimate changes in the occurrence of high temperatures. For the statistical downscaling, we use global climate model (GCM) data from the Coupled Model Intercomparison Project, phase 5 (CMIP5), which was used in the IPCC Fifth Assessment Report (AR5). GCM data from five research groups were selected, and two climate change scenarios, RCP 4.5 and RCP 8.5, are used with observed data from site instruments and other databases to produce the downscaled projections. We apply a quantile regression downscaling method, which involves the use of the observed cumulative distribution function to correct that of the GCM. This produces a downscaled projection with an interannual variability closer to that of the observed data and allows for more extreme values in the projections, which are often absent in GCM data. The statistically downscaled data are complemented with dynamically downscaled data from the NARCCAP database, which comprises output from regional climate models forced with GCM output from the CMIP3 database of GCM simulations. Applications of the downscaled climate projections to some of the unique operational needs of a large DOE weapons complex site are described.
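The CDF-based correction described above is closely related to empirical quantile mapping; the sketch below shows that generic idea (map each GCM value through the GCM historical CDF and the observed inverse CDF), with synthetic temperatures standing in for site data, and is not the SRNL implementation:

```python
import numpy as np

def quantile_map(gcm_hist, obs_hist, gcm_future):
    """
    Empirical quantile mapping: map each future GCM value to the observed value
    at the same quantile of the historical-period distributions.
    """
    q = np.interp(gcm_future,
                  np.sort(gcm_hist),
                  np.linspace(0.0, 1.0, len(gcm_hist)))  # quantile in GCM climate
    return np.quantile(obs_hist, q)                       # observed value at that quantile

# Illustrative synthetic daily maximum temperatures (deg C).
rng = np.random.default_rng(4)
obs = rng.normal(33.0, 4.0, 3000)        # site observations
gcm_h = rng.normal(31.0, 3.0, 3000)      # GCM, historical period (biased, too smooth)
gcm_f = rng.normal(34.0, 3.0, 3000)      # GCM, future period

corrected_future = quantile_map(gcm_h, obs, gcm_f)
print(np.mean(corrected_future > 38.0))  # e.g. change in heat-stress exceedance frequency
```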
Molecular dynamics study on splitting of hydrogen-implanted silicon in Smart-Cut® technology
NASA Astrophysics Data System (ADS)
Bing, Wang; Bin, Gu; Rongying, Pan; Sijia, Zhang; Jianhua, Shen
2015-03-01
Defect evolution in single-crystal silicon which is implanted with hydrogen atoms and then annealed is investigated in the present paper by means of molecular dynamics simulation. By introducing a defect density based on statistical averaging, this work aims to quantitatively examine defect nucleation and growth at the nanoscale during annealing in Smart-Cut® technology. The research focuses on the effects of the implantation energy, hydrogen implantation dose and annealing temperature on the defect density in the statistical region. It is found that most defects nucleate and grow at the annealing stage, and that the defect density increases with increasing annealing temperature and decreasing hydrogen implantation dose. In addition, the enhancement and impediment effects of the stress field on the defect density in the annealing process are discussed. Project supported by the National Natural Science Foundation of China (No. 11372261), the Excellent Young Scientists Supporting Project of Science and Technology Department of Sichuan Province (No. 2013JQ0030), the Supporting Project of Department of Education of Sichuan Province (No. 2014zd3132), the Opening Project of Key Laboratory of Testing Technology for Manufacturing Process, Southwest University of Science and Technology-Ministry of Education (No. 12zxzk02), the Fund of Doctoral Research of Southwest University of Science and Technology (No. 12zx7106), and the Postgraduate Innovation Fund Project of Southwest University of Science and Technology (No. 14ycxjj0121).
Dynamic PET Image reconstruction for parametric imaging using the HYPR kernel method
NASA Astrophysics Data System (ADS)
Spencer, Benjamin; Qi, Jinyi; Badawi, Ramsey D.; Wang, Guobao
2017-03-01
Dynamic PET image reconstruction is a challenging problem because of the ill-conditioned nature of PET and the low counting statistics resulting from the short time frames used in dynamic imaging. The kernel method for image reconstruction has been developed to improve image reconstruction of low-count PET data by incorporating prior information derived from high-count composite data. In contrast to most of the existing regularization-based methods, the kernel method embeds image prior information in the forward projection model and does not require an explicit regularization term in the reconstruction formula. Inspired by the existing highly constrained back-projection (HYPR) algorithm for dynamic PET image denoising, we propose in this work a new type of kernel that is simpler to implement and further improves kernel-based dynamic PET image reconstruction. Our evaluation study, using a physical phantom scan with synthetic FDG tracer kinetics, has demonstrated that the new HYPR kernel-based reconstruction can achieve a better region-of-interest (ROI) bias versus standard deviation trade-off for dynamic PET parametric imaging than the post-reconstruction HYPR denoising method and the previously used nonlocal-means kernel.
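A generic sketch of the kernel idea described above (prior information entering the forward model rather than a regularizer): represent the image as x = K·alpha with K built from a composite image, and run MLEM on the kernel coefficients. The toy system matrix, composite image, and kernel width are assumptions; the HYPR kernel itself differs in how K is constructed:

```python
import numpy as np

rng = np.random.default_rng(5)

n_pix, n_det, n_iter = 64, 96, 50
P = rng.uniform(0.0, 1.0, (n_det, n_pix))       # toy forward-projection matrix
x_true = np.zeros(n_pix); x_true[20:40] = 5.0   # toy activity image
y = rng.poisson(P @ x_true)                     # low-count measured data

# Kernel matrix built from a high-count composite image (here: the smoothed truth
# as a stand-in); row-normalized neighborhood similarities.
composite = np.convolve(x_true, np.ones(5) / 5, mode="same")
K = np.exp(-np.subtract.outer(composite, composite) ** 2 / 0.5)
K /= K.sum(axis=1, keepdims=True)

# MLEM on the kernel coefficients: the image is represented as x = K @ alpha,
# so the effective system matrix is P @ K.
PK = P @ K
alpha = np.ones(n_pix)
sens = PK.sum(axis=0)
for _ in range(n_iter):
    ratio = y / np.clip(PK @ alpha, 1e-12, None)
    alpha *= (PK.T @ ratio) / sens
x_hat = K @ alpha                               # reconstructed image
```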
Climate and dengue transmission: evidence and implications.
Morin, Cory W; Comrie, Andrew C; Ernst, Kacey
2013-01-01
Climate influences dengue ecology by affecting vector dynamics, agent development, and mosquito/human interactions. Although these relationships are known, the impact climate change will have on transmission is unclear. Climate-driven statistical and process-based models are being used to refine our knowledge of these relationships and predict the effects of projected climate change on dengue fever occurrence, but results have been inconsistent. We sought to identify major climatic influences on dengue virus ecology and to evaluate the ability of climate-based dengue models to describe associations between climate and dengue, simulate outbreaks, and project the impacts of climate change. We reviewed the evidence for direct and indirect relationships between climate and dengue generated from laboratory studies, field studies, and statistical analyses of associations between vectors, dengue fever incidence, and climate conditions. We assessed the potential contribution of climate-driven, process-based dengue models and provide suggestions to improve their performance. Relationships between climate variables and factors that influence dengue transmission are complex. A climate variable may increase dengue transmission potential through one aspect of the system while simultaneously decreasing transmission potential through another. This complexity may at least partly explain inconsistencies in statistical associations between dengue and climate. Process-based models can account for the complex dynamics but often omit important aspects of dengue ecology, notably virus development and host-species interactions. Synthesizing and applying current knowledge of climatic effects on all aspects of dengue virus ecology will help direct future research and enable better projections of climate change effects on dengue incidence.
Forecasting of Radiation Belts: Results From the PROGRESS Project.
NASA Astrophysics Data System (ADS)
Balikhin, M. A.; Arber, T. D.; Ganushkina, N. Y.; Walker, S. N.
2017-12-01
The overall goal of the PROGRESS project, funded within the EU Horizon 2020 programme, is to combine first-principles-based models with systems-science methodologies to achieve reliable forecasts of the geo-space particle radiation environment. PROGRESS incorporates three themes: the propagation of the solar wind to L1, the forecast of geomagnetic indices, and the forecast of fluxes of energetic electrons within the magnetosphere. One of the important aspects of the PROGRESS project is the development of statistical wave models for magnetospheric waves that affect the dynamics of energetic electrons, such as lower-band chorus, hiss and equatorial noise. The error reduction ratio (ERR) concept has been used to optimise the set of solar wind and geomagnetic parameters used to organise the statistical wave models for these emissions. The resulting sets of parameters and statistical wave models will be presented and discussed. However, the ERR analysis also indicates that the combination of solar wind and geomagnetic parameters accounts for only part of the variance of the emissions under investigation (lower-band chorus, hiss and equatorial noise). In addition, advances in the forecast of fluxes of energetic electrons achieved by the PROGRESS project, exploiting empirical models and the first-principles IMPTAM model, are presented.
NASA Astrophysics Data System (ADS)
Donges, J. F.; Schleussner, C.-F.; Siegmund, J. F.; Donner, R. V.
2016-05-01
Studying event time series is a powerful approach for analyzing the dynamics of complex dynamical systems in many fields of science. In this paper, we describe the method of event coincidence analysis, which provides a framework for quantifying the strength, directionality and time lag of statistical interrelationships between event series. Event coincidence analysis allows one to formulate and test null hypotheses on the origin of the observed interrelationships, including tests based on Poisson processes or, more generally, stochastic point processes with a prescribed inter-event time distribution and other higher-order properties. Applying the framework to country-level observational data yields evidence that flood events have acted as triggers of epidemic outbreaks globally since the 1950s. Facing projected future changes in the statistics of climatic extreme events, statistical techniques such as event coincidence analysis will be relevant for investigating the impacts of anthropogenic climate change on human societies and ecosystems worldwide.
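A minimal sketch of event coincidence analysis with a Poisson-process null hypothesis: count the fraction of A-events preceded by a B-event within a time window, then compare against surrogates with uniformly redistributed B-events; the window length, lag, and the toy flood/epidemic series are illustrative assumptions:

```python
import numpy as np

def coincidence_rate(a_times, b_times, delta_t=7.0, lag=0.0):
    """Fraction of A-events preceded by at least one B-event within (lag, lag+delta_t]."""
    hits = 0
    for t in a_times:
        dt = t - b_times
        if np.any((dt > lag) & (dt <= lag + delta_t)):
            hits += 1
    return hits / len(a_times)

def poisson_pvalue(a_times, b_times, t_max, n_surr=2000, **kw):
    """Null hypothesis: B-events form a homogeneous Poisson process on [0, t_max]."""
    rng = np.random.default_rng(6)
    observed = coincidence_rate(a_times, b_times, **kw)
    surrogate = [coincidence_rate(a_times, rng.uniform(0, t_max, len(b_times)), **kw)
                 for _ in range(n_surr)]
    return observed, np.mean(np.asarray(surrogate) >= observed)

# Toy example: epidemic onsets (A) that tend to follow floods (B) by a few days.
floods = np.sort(np.random.default_rng(7).uniform(0, 3650, 40))
epidemics = np.sort(np.concatenate([floods[:15] + 3.0,
                                    np.random.default_rng(8).uniform(0, 3650, 10)]))
print(poisson_pvalue(epidemics, floods, t_max=3650))   # (coincidence rate, p-value)
```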
In most transportation studies, computer models that forecast travel behavior statistics for a future year use static projections of the spatial distribution of future population and employment growth as inputs. As a result, they are unable to account for the temporally dynamic a...
1985-02-01
Energy Analysis, a branch of dynamic modal analysis developed for analyzing acoustic vibration problems, its present stage of development embodies a...Maximum Entropy Stochastic Modelling and Reduced-Order Design Synthesis is a rigorous new approach to this class of problems. Inspired by Statistical
NASA Astrophysics Data System (ADS)
Kang, S.; IM, E. S.; Eltahir, E. A. B.
2016-12-01
In this study, the future change in precipitation due to global warming is investigated over the Maritime Continent using the MIT Regional Climate Model (MRCM). A total of nine 30-year projections (three GCMs: CCSM, MPI, ACCESS; three emissions scenarios: Control, RCP4.5, RCP8.5) are dynamically downscaled using the MRCM at 12 km horizontal resolution. Since the downscaled results tend to systematically overestimate precipitation regardless of the GCM used for lateral boundary conditions, Parametric Quantile Mapping (PQM) is applied to reduce this wet bias. Cross validation for the control simulation shows that the PQM method retains the spatial pattern and temporal variability of the raw simulation while effectively reducing the wet bias. Based on the ensemble projections produced by dynamical downscaling and statistical bias correction, a reduction in future precipitation is discernible, in particular during the dry season (June-July-August). For example, intense precipitation in Singapore is expected to be reduced in the RCP8.5 projection compared to the control simulation. However, the geographical patterns and magnitude of the changes remain uncertain, suffering from statistical insignificance and a lack of model agreement. Acknowledgements: This research is supported by the National Research Foundation Singapore under its Campus for Research Excellence and Technological Enterprise programme. The Center for Environmental Sensing and Modeling is an interdisciplinary research group of the Singapore-MIT Alliance for Research and Technology.
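One common parametric quantile-mapping variant fits gamma distributions to wet-day precipitation and transfers simulated values through the fitted CDFs; the sketch below shows that generic form with synthetic data and is only an assumption about how PQM might look, not the authors' exact formulation:

```python
import numpy as np
from scipy import stats

def parametric_qm_gamma(model_hist, obs_hist, model_sim):
    """
    Parametric quantile mapping for wet-day precipitation: fit gamma distributions
    to model and observed historical wet-day amounts, then transfer each simulated
    value through F_model -> F_obs^{-1}.
    """
    a_m, _, b_m = stats.gamma.fit(model_hist, floc=0)   # shape, loc(=0), scale
    a_o, _, b_o = stats.gamma.fit(obs_hist, floc=0)
    u = stats.gamma.cdf(model_sim, a_m, scale=b_m)
    return stats.gamma.ppf(u, a_o, scale=b_o)

# Synthetic wet-day precipitation (mm/day): model too wet relative to observations.
rng = np.random.default_rng(9)
obs = rng.gamma(shape=0.8, scale=10.0, size=5000)
mod_hist = rng.gamma(shape=0.9, scale=16.0, size=5000)
mod_rcp85 = rng.gamma(shape=0.9, scale=18.0, size=5000)

corrected = parametric_qm_gamma(mod_hist, obs, mod_rcp85)
print(mod_rcp85.mean(), corrected.mean(), obs.mean())
```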
Inferring Characteristics of Sensorimotor Behavior by Quantifying Dynamics of Animal Locomotion
NASA Astrophysics Data System (ADS)
Leung, KaWai
Locomotion is one of the most well-studied topics in animal behavioral studies. Many fundamental and clinical studies make use of the locomotion of an animal model to explore various aspects of sensorimotor behavior. In the past, most of these studies focused on the population average of a specific trait due to limitations in data collection and processing power. With recent advances in computer vision and statistical modeling techniques, it is now possible to track and analyze large amounts of behavioral data. In this thesis, I present two projects that aim to infer the characteristics of sensorimotor behavior by quantifying the dynamics of locomotion of the nematode Caenorhabditis elegans and the fruit fly Drosophila melanogaster, shedding light on the statistical dependence between sensing and behavior. In the first project, I investigate the possibility of inferring noxious sensory information from the behavior of Caenorhabditis elegans. I develop a statistical model to infer the heat stimulus level perceived by individual animals from their stereotyped escape responses after stimulation by an IR laser. The model allows quantification of analgesic-like effects of chemical agents or genetic mutations in the worm. At the same time, the method is able to differentiate perturbations of locomotion behavior that go beyond affecting the sensory system. With this model I propose experimental designs that allow statistically significant identification of analgesic-like effects. In the second project, I investigate the roles of energy budget and stability of locomotion in determining the walking speed distribution of Drosophila melanogaster during aging. The locomotion stability at different ages is estimated from video recordings using Floquet theory. I calculate the power consumption at different locomotion speeds using a biomechanics model. In conclusion, power consumption, not stability, predicts the locomotion speed distribution at different ages.
NASA Astrophysics Data System (ADS)
Olson, R.; Evans, J. P.; Fan, Y.
2015-12-01
NARCliM (NSW/ACT Regional Climate Modelling Project) is a regional climate project for Australia and the surrounding region. It dynamically downscales four General Circulation Models (GCMs) using three Regional Climate Models (RCMs) to provide climate projections for the CORDEX-AustralAsia region at 50 km resolution, and for south-east Australia at 10 km resolution. The project differs from previous work in the level of sophistication of its model selection. Specifically, the selection process for GCMs included (i) conducting a literature review to evaluate model performance, (ii) analysing model independence, and (iii) selecting models that span the future temperature and precipitation change space. RCMs for downscaling the GCMs were chosen based on their performance for several precipitation events over south-east Australia, and on model independence. Bayesian Model Averaging (BMA) provides a statistically consistent framework for weighting the models based on their likelihood given the available observations. These weights are used to provide probability distribution functions (pdfs) for model projections. We develop a BMA framework for constructing probabilistic climate projections for spatially averaged variables from the NARCliM project. The first step in the procedure is smoothing the model output in order to exclude the influence of internal climate variability. Our statistical model for the model-observation residuals is a homoskedastic iid process. Model weights are determined through Monte Carlo integration by comparing RCM output with Australian Water Availability Project (AWAP) observations. Posterior pdfs of the statistical parameters of the model-data residuals are obtained using Markov chain Monte Carlo. The uncertainty in the properties of the model-data residuals is fully accounted for when constructing the projections. We present preliminary results of the BMA analysis for yearly maximum temperature for New South Wales state planning regions for the period 2060-2079.
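A simplified sketch of the BMA weighting step: with iid homoskedastic Gaussian residuals, each model's marginal likelihood is approximated by Monte Carlo integration over prior draws of the residual standard deviation and then normalized into weights; the toy observations, models, and prior are assumptions, and the full NARCliM analysis (smoothing, MCMC over parameters) is not reproduced:

```python
import numpy as np
from scipy.special import logsumexp

def bma_weights(model_means, obs, sigma_prior_draws):
    """
    Approximate each model's (log) marginal likelihood by Monte Carlo integration
    over the residual standard deviation, assuming iid homoskedastic Gaussian
    residuals, then normalize into BMA weights (equal prior model probabilities).
    """
    n = len(obs)
    log_marglik = []
    for mean in model_means:
        resid = obs - mean                                 # model-observation residuals
        ll = (-0.5 * np.sum(resid**2) / sigma_prior_draws**2
              - n * np.log(sigma_prior_draws))             # log-likelihood per sigma draw
        log_marglik.append(logsumexp(ll) - np.log(len(sigma_prior_draws)))
    log_marglik = np.asarray(log_marglik)
    w = np.exp(log_marglik - log_marglik.max())
    return w / w.sum()

# Toy example: three downscaled models' mean Tmax vs. AWAP-like observations.
rng = np.random.default_rng(10)
obs = 30.0 + rng.normal(0, 0.3, 12)                        # e.g. 12 planning regions
models = np.stack([obs + rng.normal(0.2, 0.4, 12),
                   obs + rng.normal(-1.0, 0.6, 12),
                   obs + rng.normal(0.1, 0.8, 12)])
sigma_draws = rng.uniform(0.1, 2.0, 5000)                  # prior draws for residual sd
print(bma_weights(models, obs, sigma_draws))
```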
Statistical model of exotic rotational correlations in emergent space-time
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hogan, Craig; Kwon, Ohkyung; Richardson, Jonathan
2017-06-06
A statistical model is formulated to compute exotic rotational correlations that arise as inertial frames and causal structure emerge on large scales from entangled Planck scale quantum systems. Noncommutative quantum dynamics are represented by random transverse displacements that respect causal symmetry. Entanglement is represented by covariance of these displacements in Planck scale intervals defined by future null cones of events on an observer's world line. Light that propagates in a nonradial direction inherits a projected component of the exotic rotational correlation that accumulates as a random walk in phase. A calculation of the projection and accumulation leads to exact predictions for statistical properties of exotic Planck scale correlations in an interferometer of any configuration. The cross-covariance for two nearly co-located interferometers is shown to depart only slightly from the autocovariance. Specific examples are computed for configurations that approximate realistic experiments, and show that the model can be rigorously tested.
eSACP - a new Nordic initiative towards developing statistical climate services
NASA Astrophysics Data System (ADS)
Thorarinsdottir, Thordis; Thejll, Peter; Drews, Martin; Guttorp, Peter; Venälainen, Ari; Uotila, Petteri; Benestad, Rasmus; Mesquita, Michel d. S.; Madsen, Henrik; Fox Maule, Cathrine
2015-04-01
The Nordic research council NordForsk has recently announced its support for a new 3-year research initiative on "statistical analysis of climate projections" (eSACP). eSACP will focus on developing e-science tools and services based on statistical analysis of climate projections for the purpose of helping decision-makers and planners in the face of expected future challenges of regional climate change. The motivation behind the project is the growing recognition in our society that forecasts of future climate change are associated with various sources of uncertainty, and that any long-term planning and decision-making dependent on a changing climate must account for this. At the same time, there is an obvious gap between scientists from different fields, and between practitioners, in terms of understanding how climate information relates to different parts of the "uncertainty cascade". In eSACP we will develop generic e-science tools and statistical climate services to facilitate the use of climate projections by decision-makers and scientists from all fields for climate impact analyses and for the development of robust adaptation strategies that properly (in a statistical sense) account for the inherent uncertainty. The new tool will be publicly available and will include functionality to utilize the extensive and dynamically growing repositories of data, state-of-the-art statistical techniques to quantify the uncertainty, and innovative approaches to visualize the results. Such a tool will not only be valuable for future assessments and underpin the development of dedicated climate services, but will also assist the scientific community in making its case on the consequences of our changing climate more clearly to policy makers and the general public. The eSACP project is led by Thordis Thorarinsdottir, Norwegian Computing Center, and also includes the Finnish Meteorological Institute, the Norwegian Meteorological Institute, the Technical University of Denmark and the Bjerknes Centre for Climate Research, Norway. This poster will present details of the focus areas in the project and show some examples of the expected analysis tools.
Pacific Decadal Variability and Central Pacific Warming El Niño in a Changing Climate
DOE Office of Scientific and Technical Information (OSTI.GOV)
Di Lorenzo, Emanuele
This research aimed at understanding the dynamics controlling decadal variability in the Pacific Ocean and its interactions with global-scale climate change. The first goal was to assess how the dynamics and statistics of the El Niño Southern Oscillation and the modes of Pacific decadal variability are represented in global climate models used in the IPCC. The second goal was to quantify how decadal dynamics are projected to change under continued greenhouse forcing, and determine their significance in the context of paleo-proxy reconstruction of long-term climate.
2013-12-01
Demonstration Project beginning in the 2011 North Atlantic hurricane season (WG/HWSOR 2011). The primary objectives of the first year of the demonstration...after Atlantic hurricanes from WP-3D hurricane research flights conducted jointly by the NOAA Aircraft Operations Center (AOC), the NOAA/Hurricane... Atlantic hurricane season; 3) to present an initial set of results from the inclusion of AXBT data in both statistical and dynamical numerical prediction
NASA Astrophysics Data System (ADS)
Jaworski, Leszek; Swiatek, Anna; Zdunek, Ryszard
2013-09-01
The problem of the insufficient accuracy of the EGNOS corrections for the territory of Poland, located at the edge of the EGNOS coverage area, is well known. The EEI PECS project (EGNOS EUPOS Integration) aims to improve the EGNOS corrections by using GPS observations from the Polish ASG-EUPOS stations. One of the EEI project tasks was the identification of EGNOS performance limitations over Poland and of services for the EGNOS-EUPOS combination. Two sets of data were used for these goals: statistical, theoretical data obtained using SBAS simulator software, and real data obtained during measurements. The real measurements were of two types: static and dynamic. Static measurements are carried out continuously using a Septentrio PolaRx2 receiver; the SRC permanent station operates within the IMAGE/PERFECT project. Dynamic measurements were carried out using the Mobile GPS Laboratory (MGL). Receivers (geodetic and navigation) worked in two modes: determining the navigation position from standalone GPS, and determining the navigation position from GPS plus the EGNOS corrections. The paper presents the results of the measurement analyses and the conclusions on which the subsequent EEI project tasks are based.
Quantum fluctuation theorems and power measurements
NASA Astrophysics Data System (ADS)
Prasanna Venkatesh, B.; Watanabe, Gentaro; Talkner, Peter
2015-07-01
Work in the paradigm of the quantum fluctuation theorems of Crooks and Jarzynski is determined by projective measurements of energy at the beginning and end of the force protocol. In analogy to classical systems, we consider an alternative definition of work given by the integral of the supplied power, determined by integrating up the results of repeated measurements of the instantaneous power during the force protocol. We observe that such a definition of work, in spite of taking account of the process dependence, has different possible values and statistics from the work determined by the conventional two-energy-measurement approach (TEMA). In the limit of many projective measurements of power, the system's dynamics is frozen in the power measurement basis due to the quantum Zeno effect, leading to statistics only trivially dependent on the force protocol. In general the Jarzynski relation is not satisfied, except for the case when the instantaneous power operator commutes with the total Hamiltonian at all times. We also consider properties of the joint statistics of the power-based definition of work and the TEMA work in protocols where both values are determined. This allows us to quantify their correlations. Relaxing the projective measurement condition, weak continuous measurements of power are considered within the stochastic master equation formalism. Even in this scenario the power-based work statistics is in general not able to reproduce qualitative features of the TEMA work statistics.
Uniting statistical and individual-based approaches for animal movement modelling.
Latombe, Guillaume; Parrott, Lael; Basille, Mathieu; Fortin, Daniel
2014-01-01
The dynamic nature of their internal states and the environment directly shape animals' spatial behaviours and give rise to emergent properties at broader scales in natural systems. However, integrating these dynamic features into habitat selection studies remains challenging, due to the practical impossibility of field work to access internal states and the inability of current statistical models to produce dynamic outputs. To address these issues, we developed a robust method which combines statistical and individual-based modelling. Using a statistical technique for forward modelling of the individual-based model (IBM) has the advantage of being faster for parameterization than a pure inverse modelling technique and allows for robust selection of parameters. Using GPS locations from caribou monitored in Québec, caribou movements were modelled based on generative mechanisms accounting for dynamic variables at a low level of emergence. These variables were accessed by replicating real individuals' movements in parallel sub-models, and movement parameters were then empirically parameterized using Step Selection Functions. The final IBM was validated using both k-fold cross-validation and emergent-pattern validation, and was tested for two different scenarios with varying hardwood encroachment. Our results highlighted a functional response in habitat selection, which suggests that our method was able to capture the complexity of the natural system and adequately provide projections on future possible states of the system in response to different management plans. This is especially relevant for testing the long-term impact of scenarios corresponding to environmental configurations that have yet to be observed in real systems.
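A minimal sketch of how Step Selection Function parameters can be estimated with a conditional-logit likelihood (each observed step compared against its matched available steps); the simulated covariates and coefficient values are illustrative assumptions, not the caribou analysis:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import logsumexp

def ssf_neg_loglik(beta, strata):
    """Conditional-logit likelihood for step selection: in each stratum the
    observed step (row 0) competes against its matched available steps."""
    nll = 0.0
    for X in strata:              # X: (n_candidates, n_covariates), row 0 = observed
        u = X @ beta
        nll -= u[0] - logsumexp(u)
    return nll

# Simulate strata: 1 used + 10 available steps per stratum, 2 covariates
# (e.g. forage availability and distance to disturbance).
rng = np.random.default_rng(11)
beta_true = np.array([1.5, -0.8])
strata = []
for _ in range(300):
    X = rng.normal(size=(11, 2))
    p = np.exp(X @ beta_true); p /= p.sum()
    used = rng.choice(11, p=p)
    X[[0, used]] = X[[used, 0]]   # put the chosen ("used") step in row 0
    strata.append(X)

fit = minimize(ssf_neg_loglik, x0=np.zeros(2), args=(strata,), method="BFGS")
print(fit.x)                      # should be close to beta_true
```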
Uniting Statistical and Individual-Based Approaches for Animal Movement Modelling
Latombe, Guillaume; Parrott, Lael; Basille, Mathieu; Fortin, Daniel
2014-01-01
The dynamic nature of their internal states and the environment directly shape animals' spatial behaviours and give rise to emergent properties at broader scales in natural systems. However, integrating these dynamic features into habitat selection studies remains challenging, due to the practical impossibility of accessing internal states in the field and the inability of current statistical models to produce dynamic outputs. To address these issues, we developed a robust method that combines statistical and individual-based modelling (IBM). Using a statistical technique for forward modelling of the IBM has the advantage of being faster to parameterize than a pure inverse modelling technique and allows for robust selection of parameters. Using GPS locations from caribou monitored in Québec, caribou movements were modelled from generative mechanisms accounting for dynamic variables at a low level of emergence. These variables were accessed by replicating real individuals' movements in parallel sub-models, and movement parameters were then parameterized empirically using Step Selection Functions. The final IBM was validated using both k-fold cross-validation and emergent-pattern validation, and was tested for two scenarios with varying hardwood encroachment. Our results highlighted a functional response in habitat selection, which suggests that our method was able to capture the complexity of the natural system and adequately provided projections of possible future states of the system in response to different management plans. This is especially relevant for testing the long-term impact of scenarios corresponding to environmental configurations that have yet to be observed in real systems. PMID:24979047
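The step-selection-function parameterization mentioned in the two records above is, in essence, a conditional logistic regression in which each observed step is compared with a set of random available steps. The minimal sketch below illustrates that idea on synthetic data; the array layout, covariates, and optimizer choice are assumptions for illustration, not the authors' actual pipeline.

```python
# Hedged sketch: fitting a step-selection function (SSF) by conditional logistic
# regression. Synthetic data; index 0 of each choice set holds the observed step.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

n_steps, K, p = 200, 10, 2          # observed steps, random alternatives, covariates
X = rng.normal(size=(n_steps, K + 1, p))
beta_true = np.array([1.0, -0.5])

# Simulate which alternative is "chosen" with probability proportional to exp(X @ beta).
w = np.exp(X @ beta_true)
chosen = np.array([rng.choice(K + 1, p=wi / wi.sum()) for wi in w])
for i, c in enumerate(chosen):      # move the chosen step into slot 0
    X[i, [0, c]] = X[i, [c, 0]]

def neg_log_lik(beta):
    s = X @ beta                    # selection scores, shape (n_steps, K+1)
    # Conditional-logit likelihood: observed step (index 0) against its choice set.
    return -np.sum(s[:, 0] - np.log(np.sum(np.exp(s), axis=1)))

fit = minimize(neg_log_lik, x0=np.zeros(p), method="BFGS")
print("estimated selection coefficients:", fit.x)
```

In an individual-based model of the kind described above, coefficients like these would typically drive the simulated movement kernel.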
Flows of Wet Foams and Concentrated Emulsions
NASA Technical Reports Server (NTRS)
Nemer, Martin B.
2005-01-01
The aim of this project was to advance a microstructural understanding of foam and emulsion flows. The dynamics of individual surfactant-covered drops, as well as the collective behavior of dilute and concentrated systems, were explored using numerical simulations. The long-range goal of this work is the formulation of reliable, microphysically based statistical models of emulsion flows.
2006-05-31
dynamics (MD) and kinetic Monte Carlo (KMC) procedures. In 2D surface modeling our calculations project speedups of 9 orders of magnitude at 300 degrees... programming is used to perform customized statistical mechanics by bridging the different time scales of MD and KMC quickly and well. Speedups in
The COSMIC-DANCE project: Unravelling the origin of the mass function
NASA Astrophysics Data System (ADS)
Bouy, H.; Bertin, E.; Sarro, L. M.; Barrado, D.; Berihuete, A.; Olivares, J.; Moraux, E.; Bouvier, J.; Tamura, M.; Cuillandre, J.-C.; Beletsky, Y.; Wright, N.; Huelamo, N.; Allen, L.; Solano, E.; Brandner, B.
2017-03-01
The COSMIC-DANCE project is an observational program aimed at understanding the origin and evolution of ultracool objects by measuring the mass function and internal dynamics of young nearby associations down to the fragmentation limit. The least massive members of young nearby associations are identified using modern statistical methods in a multi-dimensional space made of optical and infrared luminosities and colors and proper motions. The photometry and astrometry are obtained by combining ground-based and, in some cases, space-based archival observations with new observations, covering between one and two decades.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hopkins, Matthew Morgan; DeChant, Lawrence Justin.; Piekos, Edward Stanley
2009-02-01
This report summarizes the work completed during FY2007 and FY2008 for the LDRD project ''Hybrid Plasma Modeling''. The goal of this project was to develop hybrid methods to model plasmas across the non-continuum-to-continuum collisionality spectrum. The primary methodology to span these regimes was to couple a kinetic method (e.g., Particle-In-Cell) in the non-continuum regions to a continuum PDE-based method (e.g., finite differences) in continuum regions. The interface between the two would be adjusted dynamically based on statistical sampling of the kinetic results. Although originally a three-year project, it became clear during the second year (FY2008) that there were not sufficient resources to complete the project, and it was terminated mid-year.
Dynamics of essential collective motions in proteins: Theory
NASA Astrophysics Data System (ADS)
Stepanova, Maria
2007-11-01
A general theoretical background is introduced for characterization of conformational motions in protein molecules, and for building reduced coarse-grained models of proteins, based on the statistical analysis of their phase trajectories. Using the projection operator technique, a system of coupled generalized Langevin equations is derived for essential collective coordinates, which are generated by principal component analysis of molecular dynamic trajectories. The number of essential degrees of freedom is not limited in the theory. An explicit analytic relation is established between the generalized Langevin equation for essential collective coordinates and that for the all-atom phase trajectory projected onto the subspace of essential collective degrees of freedom. The theory introduced is applied to identify correlated dynamic domains in a macromolecule and to construct coarse-grained models representing the conformational motions in a protein through a few interacting domains embedded in a dissipative medium. A rigorous theoretical background is provided for identification of dynamic correlated domains in a macromolecule. Examples of domain identification in protein G are given and employed to interpret NMR experiments. Challenges and potential outcomes of the theory are discussed.
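The essential collective coordinates referred to above are obtained, in the standard essential-dynamics workflow, from a principal component analysis of the atomic fluctuation covariance matrix. Below is a minimal sketch with a synthetic trajectory; the array shapes and the number of retained modes are illustrative assumptions.

```python
# Hedged sketch: principal component analysis of an (aligned) MD trajectory to
# extract a few essential collective coordinates.
import numpy as np

rng = np.random.default_rng(1)
n_frames, n_atoms = 5000, 50
traj = rng.normal(size=(n_frames, 3 * n_atoms))     # flattened Cartesian coordinates

mean = traj.mean(axis=0)
fluct = traj - mean                                  # remove the average structure
cov = fluct.T @ fluct / (n_frames - 1)               # 3N x 3N covariance matrix

evals, evecs = np.linalg.eigh(cov)                   # eigenvalues in ascending order
order = np.argsort(evals)[::-1]
evals, evecs = evals[order], evecs[:, order]

n_essential = 5                                      # keep a few dominant modes
essential_coords = fluct @ evecs[:, :n_essential]    # projections c_k(t)

frac = evals[:n_essential].sum() / evals.sum()
print(f"variance captured by {n_essential} essential modes: {frac:.2%}")
```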
Empirical analysis of storm-time energetic electron enhancements
NASA Astrophysics Data System (ADS)
O'Brien, Thomas Paul, III
This Ph.D. thesis documents a program for studying the appearance of energetic electrons in the Earth's outer radiation belts that is associated with many geomagnetic storms. The dynamic evolution of the electron radiation belts is an outstanding empirical problem in both theoretical space physics and its applied sibling, space weather. The project emphasizes the development of empirical tools and their use in testing several theoretical models of the energization of the electron belts. First, I develop the Statistical Asynchronous Regression technique to provide proxy electron fluxes throughout the parts of the radiation belts explored by geosynchronous and GPS spacecraft. Next, I show that a theoretical adiabatic model can relate the local time asymmetry of the proxy geosynchronous fluxes to the asymmetry of the geomagnetic field. Then, I perform a superposed epoch analysis on the proxy fluxes at local noon to identify magnetospheric and interplanetary precursors of relativistic electron enhancements. Finally, I use statistical and neural network phase space analyses to determine the hourly evolution of flux at a virtual stationary monitor. The dynamic equation quantitatively identifies the importance of different drivers of the electron belts. This project provides empirical constraints on theoretical models of electron acceleration.
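The Statistical Asynchronous Regression technique mentioned above rests on matching the cumulative distributions of two quantities that are not sampled simultaneously. A minimal sketch of that distribution-mapping idea on synthetic flux series follows; the variable names and distributions are assumptions, not the thesis data.

```python
# Hedged sketch of the distribution-mapping idea behind Statistical Asynchronous
# Regression (SAR): build a proxy by equating the empirical CDFs of two
# asynchronously sampled quantities.
import numpy as np

rng = np.random.default_rng(2)
log_flux_gps = rng.normal(0.0, 1.0, size=4000)              # asynchronous sample A
log_flux_geo = 2.0 + 0.5 * rng.normal(0.0, 1.0, size=6000)  # asynchronous sample B

def sar_map(x, sample_from, sample_to):
    """Map value(s) x from distribution A onto distribution B by matching CDFs."""
    quantiles = np.searchsorted(np.sort(sample_from), x) / len(sample_from)
    return np.quantile(sample_to, np.clip(quantiles, 0.0, 1.0))

proxy = sar_map(log_flux_gps[:10], log_flux_gps, log_flux_geo)
print(proxy)
```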
Molecular Dynamics of Hot Dense Plasmas: New Horizons
NASA Astrophysics Data System (ADS)
Graziani, Frank
2011-10-01
We describe the status of a new time-dependent simulation capability for hot dense plasmas. The backbone of this multi-institutional computational and experimental effort--the Cimarron Project--is the massively parallel molecular dynamics (MD) code ``ddcMD''. The project's focus is material conditions such as exist in inertial confinement fusion experiments, and in many stellar interiors: high temperatures, high densities, significant electromagnetic fields, mixtures of high- and low-Z elements, and non-Maxwellian particle distributions. Of particular importance is our ability to incorporate into this classical MD code key atomic, radiative, and nuclear processes, so that their interacting effects under non-ideal plasma conditions can be investigated. This talk summarizes progress in computational methodology, discusses strengths and weaknesses of quantum statistical potentials as effective interactions for MD, explains the model used for quantum events possibly occurring in a collision and highlights some significant results obtained to date. This work is performed under the auspices of the U. S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
The finite state projection approach to analyze dynamics of heterogeneous populations
NASA Astrophysics Data System (ADS)
Johnson, Rob; Munsky, Brian
2017-06-01
Population modeling aims to capture and predict the dynamics of cell populations in constant or fluctuating environments. At the elementary level, population growth proceeds through sequential divisions of individual cells. Due to stochastic effects, populations of cells are inherently heterogeneous in phenotype, and some phenotypic variables have an effect on division or survival rates, as can be seen in partial drug resistance. Therefore, when modeling population dynamics where the control of growth and division is phenotype dependent, the corresponding model must take account of the underlying cellular heterogeneity. The finite state projection (FSP) approach has often been used to analyze the statistics of independent cells. Here, we extend the FSP analysis to explore the coupling of cell dynamics and biomolecule dynamics within a population. This extension allows a general framework with which to model the state occupations of a heterogeneous, isogenic population of dividing and expiring cells. The method is demonstrated with a simple model of cell-cycle progression, which we use to explore possible dynamics of drug resistance phenotypes in dividing cells. We use this method to show how stochastic single-cell behaviors affect population level efficacy of drug treatments, and we illustrate how slight modifications to treatment regimens may have dramatic effects on drug efficacy.
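A minimal sketch of a finite state projection calculation for a simple birth-death process follows, illustrating the truncation idea described above. The rates, truncation size, and observation time are assumptions; the probability mass that leaks out of the truncated space is the usual FSP error estimate.

```python
# Hedged sketch: FSP solution of a birth-death process (production rate k,
# per-molecule degradation rate g) on a truncated state space 0..N-1.
import numpy as np
from scipy.linalg import expm

k, g = 10.0, 1.0
N = 60                                # FSP truncation

A = np.zeros((N, N))                  # truncated generator, A[m, n] = rate n -> m
for n in range(N):
    if n + 1 < N:
        A[n + 1, n] += k              # birth: n -> n+1
    A[n, n] -= k                      # probability leaving state n by birth
    if n > 0:
        A[n - 1, n] += n * g          # death: n -> n-1
        A[n, n] -= n * g

p0 = np.zeros(N); p0[0] = 1.0         # start with zero molecules
t = 2.0
p_t = expm(A * t) @ p0                # FSP solution at time t

print("probability mass kept by the projection:", p_t.sum())
print("mean copy number at t =", t, ":", np.arange(N) @ p_t)
```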
Factors Influencing Energy Use and Carbon Emissions in China
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fisher-Vanden, Karen; Jefferson, Gary
This research project was designed to fill a critical void in our understanding of the state of energy research and innovation in China. It seeks to provide a comprehensive review and accounting of the various elements of the Chinese government and non-governmental sectors (commercial, university, research institutes) that are engaged in energy-related R&D and various aspects of energy innovation, including specific programs and projects designed to promote renewable energy innovation and energy conservation. The project provides an interrelated descriptive, statistical, and econometric account of China's overall energy innovation activities and capabilities, spanning the full economy with a particular focus on the dynamic industrial sector.
Statistical and dynamical modeling of heavy-ion fusion-fission reactions
NASA Astrophysics Data System (ADS)
Eslamizadeh, H.; Razazzadeh, H.
2018-02-01
A modified statistical model and a four-dimensional dynamical model based on Langevin equations have been used to simulate the fission process of the excited compound nuclei 207At and 216Ra produced in the 19F + 188Os and 19F + 197Au fusion reactions. The evaporation residue cross section, the fission cross section, the pre-scission neutron, proton and alpha multiplicities, and the anisotropy of the fission fragment angular distribution have been calculated for the excited compound nuclei 207At and 216Ra. In the modified statistical model the effects of the spin K about the symmetry axis and of temperature have been considered in the calculations of the fission widths and the potential energy surfaces. It was shown that the modified statistical model can reproduce the above-mentioned experimental data by using appropriate values of the temperature coefficient of the effective potential, λ = 0.0180 ± 0.0055 and 0.0080 ± 0.0030 MeV-2, and of the scaling factor of the fission barrier height, rs = 1.0015 ± 0.0025 and 1.0040 ± 0.0020, for the compound nuclei 207At and 216Ra, respectively. Three collective shape coordinates plus the projection of the total spin of the compound nucleus on the symmetry axis, K, were considered in the four-dimensional dynamical model. In the dynamical calculations, dissipation was generated through the chaos-weighted wall and window friction formula. Comparison of the theoretical results with the experimental data showed that the two models make it possible to reproduce satisfactorily the above-mentioned experimental data for the excited compound nuclei 207At and 216Ra.
NASA Astrophysics Data System (ADS)
Flores-Marquez, Leticia Elsa; Ramirez Rojaz, Alejandro; Telesca, Luciano
2015-04-01
Two statistical approaches are analyzed for two different types of data sets: one is the seismicity generated by the subduction processes that occurred at the south Pacific coast of Mexico between 2005 and 2012, and the other corresponds to synthetic seismic data generated by a stick-slip experimental model. The statistical methods used in the present study are the visibility graph, to investigate the time dynamics of the series, and the scaled probability density function in the natural time domain, to investigate the critical order of the system. The purpose of this comparison is to show the similarities between the dynamical behaviors of both types of data sets from the point of view of critical systems. The observed behaviors allow us to conclude that the experimental setup globally reproduces the behavior observed in the statistical approaches used to analyse the seismicity of the subduction zone. The present study was supported by the Bilateral Project Italy-Mexico "Experimental stick-slip models of tectonic faults: innovative statistical approaches applied to synthetic seismic sequences", jointly funded by MAECI (Italy) and AMEXCID (Mexico) in the framework of the Bilateral Agreement for Scientific and Technological Cooperation PE 2014-2016.
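The natural visibility graph used above maps a time series onto a network: every sample is a node, and two samples are linked if the straight line between them passes above all intermediate samples. Below is a minimal sketch on a synthetic series; the quadratic loop is written for clarity, not efficiency.

```python
# Hedged sketch: natural visibility graph of a time series (Lacasa-style
# visibility criterion), applied to a synthetic random series.
import numpy as np

def visibility_graph(y):
    """Return the edge list of the natural visibility graph of series y."""
    n = len(y)
    edges = []
    for a in range(n):
        for b in range(a + 1, n):
            t = np.arange(a + 1, b)
            # Visibility: y[c] must lie below the line joining (a, y[a]) and (b, y[b]).
            line = y[b] + (y[a] - y[b]) * (b - t) / (b - a)
            if np.all(y[t] < line):
                edges.append((a, b))
    return edges

rng = np.random.default_rng(3)
series = rng.random(200)
edges = visibility_graph(series)
degrees = np.bincount(np.ravel(edges), minlength=len(series))
print("mean degree:", degrees.mean())
```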
Designing high speed diagnostics
NASA Astrophysics Data System (ADS)
Veliz Carrillo, Gerardo; Martinez, Adam; Mula, Swathi; Prestridge, Kathy; Extreme Fluids Team Team
2017-11-01
Timing and firing for shock-driven flows is complex because of jitter in the shock tube mechanical drivers. Consequently, experiments require dynamic triggering of diagnostics from pressure transducers. We explain the design process and criteria for setting up re-shock experiments at the Los Alamos Vertical Shock Tube facility, and the requirements for particle image velocimetry and planar laser-induced fluorescence measurements necessary for calculating Richtmyer-Meshkov variable-density turbulent statistics. Dynamic triggering of diagnostics allows for further investigation of the development of the Richtmyer-Meshkov instability at both initial shock and re-shock. Thanks to the Los Alamos National Laboratory for funding our project.
Tipping point analysis of atmospheric oxygen concentration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Livina, V. N.; Forbes, A. B.; Vaz Martins, T. M.
2015-03-15
We apply tipping point analysis to nine observational oxygen concentration records around the globe, analyse their dynamics, and perform projections under possible future scenarios leading to oxygen deficiency in the atmosphere. The analysis is based on a statistical physics framework with stochastic modelling, in which we represent the observed data as a composition of deterministic and stochastic components estimated from the observations using Bayesian and wavelet techniques.
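Tipping point analyses of this kind commonly track early-warning indicators such as rising variance and rising lag-1 autocorrelation in a sliding window. The sketch below is a generic illustration of those two indicators on a synthetic record with slowly increasing memory; it is not the authors' Bayesian/wavelet implementation.

```python
# Hedged sketch: rolling variance and lag-1 autocorrelation as generic
# tipping-point (loss-of-resilience) indicators on a synthetic AR(1)-like record.
import numpy as np

rng = np.random.default_rng(4)
n = 2000
x = np.zeros(n)
for t in range(1, n):
    phi = 0.2 + 0.7 * t / n                    # AR(1) coefficient drifting upward
    x[t] = phi * x[t - 1] + rng.normal(scale=0.1)

window = 300
var_ind, ac1_ind = [], []
for start in range(0, n - window):
    w = x[start:start + window]
    w = w - w.mean()                           # crude detrending
    var_ind.append(w.var())
    ac1_ind.append(np.corrcoef(w[:-1], w[1:])[0, 1])

print("variance trend:", var_ind[0], "->", var_ind[-1])
print("lag-1 autocorrelation trend:", ac1_ind[0], "->", ac1_ind[-1])
```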
ERIC Educational Resources Information Center
Baltezore, Joan M.; Newbrey, Michael G.
2007-01-01
The purpose of this paper is to provide background information about the spread of viruses in a population, to introduce an adaptable procedure to further the understanding of epidemiology in the high school setting, and to show how hypothesis testing and statistics can be incorporated into a high school lab exercise. It describes a project which…
W. Keith Moser; Renate Bush; John D. Shaw; Mark H. Hansen; Mark D. Nelson
2010-01-01
A major challenge for today's resource managers is the linking of stand- and landscape-scale dynamics. The U.S. Forest Service has made major investments in programs at both the stand- (national forest project) and landscape/regional (Forest Inventory and Analysis [FIA] program) levels. FIA produces the only comprehensive and consistent statistical information on the...
2009-05-27
should not be interpreted as representing the official policies, either expressed or implied, of the Defence Advanced Research Project Agency or the... neurobiological interpretation of E-states? ... The statistical molecular dynamics of E-states ... The C++ program EPMM ... Promising... have an intuitive link between E-machines and neural networks. Such a link provides a source of neurobiological heuristic considerations for the design
Displaying R spatial statistics on Google dynamic maps with web applications created by Rwui
2012-01-01
Background The R project includes a large variety of packages designed for spatial statistics. Google dynamic maps provide web based access to global maps and satellite imagery. We describe a method for displaying directly the spatial output from an R script on to a Google dynamic map. Methods This is achieved by creating a Java based web application which runs the R script and then displays the results on the dynamic map. In order to make this method easy to implement by those unfamiliar with programming Java based web applications, we have added the method to the options available in the R Web User Interface (Rwui) application. Rwui is an established web application for creating web applications for running R scripts. A feature of Rwui is that all the code for the web application being created is generated automatically so that someone with no knowledge of web programming can make a fully functional web application for running an R script in a matter of minutes. Results Rwui can now be used to create web applications that will display the results from an R script on a Google dynamic map. Results may be displayed as discrete markers and/or as continuous overlays. In addition, users of the web application may select regions of interest on the dynamic map with mouse clicks and the coordinates of the region of interest will automatically be made available for use by the R script. Conclusions This method of displaying R output on dynamic maps is designed to be of use in a number of areas. Firstly it allows statisticians, working in R and developing methods in spatial statistics, to easily visualise the results of applying their methods to real world data. Secondly, it allows researchers who are using R to study health geographics data, to display their results directly onto dynamic maps. Thirdly, by creating a web application for running an R script, a statistician can enable users entirely unfamiliar with R to run R coded statistical analyses of health geographics data. Fourthly, we envisage an educational role for such applications. PMID:22998945
Displaying R spatial statistics on Google dynamic maps with web applications created by Rwui.
Newton, Richard; Deonarine, Andrew; Wernisch, Lorenz
2012-09-24
The R project includes a large variety of packages designed for spatial statistics. Google dynamic maps provide web based access to global maps and satellite imagery. We describe a method for displaying directly the spatial output from an R script on to a Google dynamic map. This is achieved by creating a Java based web application which runs the R script and then displays the results on the dynamic map. In order to make this method easy to implement by those unfamiliar with programming Java based web applications, we have added the method to the options available in the R Web User Interface (Rwui) application. Rwui is an established web application for creating web applications for running R scripts. A feature of Rwui is that all the code for the web application being created is generated automatically so that someone with no knowledge of web programming can make a fully functional web application for running an R script in a matter of minutes. Rwui can now be used to create web applications that will display the results from an R script on a Google dynamic map. Results may be displayed as discrete markers and/or as continuous overlays. In addition, users of the web application may select regions of interest on the dynamic map with mouse clicks and the coordinates of the region of interest will automatically be made available for use by the R script. This method of displaying R output on dynamic maps is designed to be of use in a number of areas. Firstly it allows statisticians, working in R and developing methods in spatial statistics, to easily visualise the results of applying their methods to real world data. Secondly, it allows researchers who are using R to study health geographics data, to display their results directly onto dynamic maps. Thirdly, by creating a web application for running an R script, a statistician can enable users entirely unfamiliar with R to run R coded statistical analyses of health geographics data. Fourthly, we envisage an educational role for such applications.
NASA Astrophysics Data System (ADS)
Held, H.; Gerstengarbe, F.-W.; Hattermann, F.; Pinto, J. G.; Ulbrich, U.; Böhm, U.; Born, K.; Büchner, M.; Donat, M. G.; Kücken, M.; Leckebusch, G. C.; Nissen, K.; Nocke, T.; Österle, H.; Pardowitz, T.; Werner, P. C.; Burghoff, O.; Broecker, U.; Kubik, A.
2012-04-01
We present an overview of an impact project, based on complementary approaches, dealing with the consequences of climate change for the natural hazard branch of the insurance industry in Germany. The project was conducted by four academic institutions together with the German Insurance Association (GDV) and finalized in autumn 2011. A causal chain is modeled that runs from global warming projections, via regional meteorological impacts, to regional economic losses for private buildings, thereby fully covering the area of Germany. This presentation focuses on wind storm related losses, although the method developed had also been applied in part to hail and flood impact losses. For the first time, the GDV supplied its collected set of insurance cases, dating back decades, for such an impact study. These data were used to calibrate and validate event-based damage functions, which in turn were driven by three different types of regional climate models to generate storm loss projections. The regional models were driven by a triplet of ECHAM5 experiments following the A1B scenario which were found representative in the recent ENSEMBLES intercomparison study. In our multi-modeling approach we used two conceptually very different types of regional climate models: a dynamical model (CCLM) and a statistical model based on the idea of biased bootstrapping (STARS). As a third option we pursued a hybrid approach (statistical-dynamical downscaling). For the assessment of climate change impacts, the buildings' infrastructure and their economic value are kept at current values. For all three approaches, a significant increase of average storm losses and extreme event return levels in the German private building sector is found for future decades under an A1B scenario. However, the three projections differ somewhat in terms of magnitude and regional differentiation. We have developed a formalism that allows us to express the combined effect of multi-source uncertainty on return levels within the framework of a generalized Pareto distribution.
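The generalized Pareto framework mentioned at the end of the abstract converts threshold exceedances into return levels. A minimal sketch with synthetic losses follows; the threshold choice, event rate, and the peaks-over-threshold return-level formula are generic assumptions, not the project's calibrated damage model.

```python
# Hedged sketch: fit a generalized Pareto distribution (GPD) to loss exceedances
# over a threshold and convert it to return levels (peaks-over-threshold form).
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(5)
losses = rng.lognormal(mean=1.0, sigma=1.2, size=3000)    # synthetic per-event losses
u = np.quantile(losses, 0.95)                             # threshold
exceedances = losses[losses > u] - u

shape, loc, scale = genpareto.fit(exceedances, floc=0.0)
events_per_year = 30.0                                    # assumed event frequency
rate = events_per_year * len(exceedances) / len(losses)   # threshold exceedances per year

def return_level(T):
    """Loss exceeded on average once every T years."""
    m = rate * T
    if abs(shape) < 1e-6:
        return u + scale * np.log(m)
    return u + scale / shape * (m ** shape - 1.0)

for T in (10, 50, 100):
    print(f"{T}-year return level: {return_level(T):.1f}")
```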
Dynamic and thermodynamic processes driving the January 2014 precipitation record in southern UK
NASA Astrophysics Data System (ADS)
Oueslati, B.; Yiou, P.; Jezequel, A.
2017-12-01
Regional extreme precipitation is projected to intensify in response to planetary climate change, with important impacts on societies. Understanding and anticipating such events remains a major challenge. In this study, we revisit the mechanisms of the winter precipitation record that occurred in southern United Kingdom in January 2014. The physical drivers of this event are analyzed using the water vapor budget. Precipitation changes are decomposed into dynamic contributions, related to changes in atmospheric circulation, and thermodynamic contributions, related to changes in water vapor. We attempt to quantify the relative importance of the two contributions during this event and examine the applicability of Clausius-Clapeyron scaling. This work provides a physical interpretation of the mechanisms associated with southern UK's wettest event, complementary to other studies based on statistical approaches (Schaller et al., 2016; Yiou et al., 2017). The analysis is carried out using the ERA-Interim reanalysis, motivated by the horizontal resolution of this dataset. It is then applied to present-day simulations and future projections of CMIP5 models for selected extreme precipitation events in southern UK that are comparable to January 2014 in terms of atmospheric circulation. References: Schaller, N. et al., Human influence on climate in the 2014 southern England winter floods and their impacts, Nature Clim. Change, 2016, 6, 627-634; Yiou, P., et al., A statistical framework for conditional extreme event attribution, Advances in Statistical Climatology, Meteorology and Oceanography, 2017, 3, 17-31.
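A commonly used schematic form of the dynamic/thermodynamic decomposition described above is the vertically integrated moisture-budget split (generic notation, not necessarily the exact budget used by the authors):

```latex
% Precipitation change split into circulation (dynamic) and moisture
% (thermodynamic) contributions; \omega is pressure velocity, q specific
% humidity, \langle\cdot\rangle a mass-weighted vertical integral, and \delta
% the change between two climate states.
\delta P \;\approx\;
\underbrace{-\,\bigl\langle \delta\omega \,\partial_p q \bigr\rangle}_{\text{dynamic}}
\;-\;
\underbrace{\bigl\langle \omega \,\partial_p (\delta q) \bigr\rangle}_{\text{thermodynamic}}.
```

If relative humidity stays roughly unchanged, the thermodynamic term scales with the Clausius-Clapeyron rate of about 7% per kelvin of warming, which is the scaling the abstract refers to.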
The role of fluctuations and interactions in pedestrian dynamics
NASA Astrophysics Data System (ADS)
Corbetta, Alessandro; Meeusen, Jasper; Benzi, Roberto; Lee, Chung-Min; Toschi, Federico
Understanding quantitatively the statistical behaviour of pedestrians walking in crowds is a major scientific challenge of paramount societal relevance. Walking humans exhibit rich (stochastic) dynamics whose small and large deviations are driven, among other factors, by their own will as well as by environmental conditions. Via 24/7 automatic pedestrian tracking from multiple overhead Microsoft Kinect depth sensors, we collected large ensembles of pedestrian trajectories (on the order of tens of millions) in different real-life scenarios. These scenarios include both narrow corridors and large urban hallways, enabling us to cover and compare a wide spectrum of typical pedestrian dynamics. We investigate the pedestrian motion by measuring PDFs, e.g. those of position, velocity and acceleration, at unprecedentedly high statistical resolution. We consider the dependence of the PDFs on flow conditions, focusing on diluted dynamics and pair-wise interactions (''collisions'') for mutual avoidance. By means of Langevin-like models we describe the measured data, including typical fluctuations and rare events. This work is part of the JSTP research programme ``Vision driven visitor behaviour analysis and crowd management'' with Project Number 341-10-001, which is financed by the Netherlands Organisation for Scientific Research (NWO).
Dynamic rain fade compensation techniques for the advanced communications technology satellite
NASA Technical Reports Server (NTRS)
Manning, Robert M.
1992-01-01
The dynamic and composite nature of propagation impairments that are incurred on earth-space communications links at frequencies in and above the 30/20 GHz Ka band necessitate the use of dynamic statistical identification and prediction processing of the fading signal in order to optimally estimate and predict the levels of each of the deleterious attenuation components. Such requirements are being met in NASA's Advanced Communications Technology Satellite (ACTS) project by the implementation of optimal processing schemes derived through the use of the ACTS Rain Attenuation Prediction Model and nonlinear Markov filtering theory. The ACTS Rain Attenuation Prediction Model discerns climatological variations on the order of 0.5 deg in latitude and longitude in the continental U.S. The time-dependent portion of the model gives precise availability predictions for the 'spot beam' links of ACTS. However, the structure of the dynamic portion of the model, which yields performance parameters such as fade duration probabilities, is isomorphic to the state-variable approach of stochastic control theory and is amenable to the design of such statistical fade processing schemes which can be made specific to the particular climatological location at which they are employed.
Chaos and Forecasting - Proceedings of the Royal Society Discussion Meeting
NASA Astrophysics Data System (ADS)
Tong, Howell
1995-04-01
The Table of Contents for the full book PDF is as follows: * Preface * Orthogonal Projection, Embedding Dimension and Sample Size in Chaotic Time Series from a Statistical Perspective * A Theory of Correlation Dimension for Stationary Time Series * On Prediction and Chaos in Stochastic Systems * Locally Optimized Prediction of Nonlinear Systems: Stochastic and Deterministic * A Poisson Distribution for the BDS Test Statistic for Independence in a Time Series * Chaos and Nonlinear Forecastability in Economics and Finance * Paradigm Change in Prediction * Predicting Nonuniform Chaotic Attractors in an Enzyme Reaction * Chaos in Geophysical Fluids * Chaotic Modulation of the Solar Cycle * Fractal Nature in Earthquake Phenomena and its Simple Models * Singular Vectors and the Predictability of Weather and Climate * Prediction as a Criterion for Classifying Natural Time Series * Measuring and Characterising Spatial Patterns, Dynamics and Chaos in Spatially-Extended Dynamical Systems and Ecologies * Non-Linear Forecasting and Chaos in Ecology and Epidemiology: Measles as a Case Study
Extreme events in optics: Challenges of the MANUREVA project
NASA Astrophysics Data System (ADS)
Dudley, J. M.; Finot, C.; Millot, G.; Garnier, J.; Genty, G.; Agafontsev, D.; Dias, F.
2010-07-01
In this contribution we describe and discuss a series of challenges and questions relating to understanding extreme wave phenomena in optics. Many aspects of these questions are being studied in the framework of the MANUREVA project: a multidisciplinary consortium aiming to carry out mathematical, numerical and experimental studies in this field. The central motivation of this work is the 2007 results from optical physics [D. Solli et al., Nature 450, 1054 (2007)] that showed how a fibre-optical system can generate large amplitude extreme wave events with similar statistical properties to the infamous hydrodynamic rogue waves on the surface of the ocean. We review our recent work in this area, and discuss how this observation may open the possibility for an optical system to be used to directly study both the dynamics and statistics of extreme-value processes, a potential advance comparable to the introduction of optical systems to study chaos in the 1970s.
Identifying misbehaving models using baseline climate variance
NASA Astrophysics Data System (ADS)
Schultz, Colin
2011-06-01
The majority of projections made using general circulation models (GCMs) are conducted to help tease out the effects on a region, or on the climate system as a whole, of changing climate dynamics. Sun et al., however, used model runs from 20 different coupled atmosphere-ocean GCMs to try to understand a different aspect of climate projections: how bias correction, model selection, and other statistical techniques might affect the estimated outcomes. As a case study, the authors focused on predicting the potential change in precipitation for the Murray-Darling Basin (MDB), a 1-million-square-kilometer area in southeastern Australia that suffered a recent decade of drought that left many wondering about the potential impacts of climate change on this important agricultural region. The authors first compared the precipitation predictions made by the models with 107 years of observations, and they then made bias corrections to adjust the model projections to have the same statistical properties as the observations. They found that while the spread of the projected values was reduced, the average precipitation projection for the end of the 21st century barely changed. Further, the authors determined that interannual variations in precipitation for the MDB could be explained by random chance, where the precipitation in a given year was independent of that in previous years.
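Bias correction of the kind discussed above can be as simple as rescaling the model output so that its historical mean and variance match the observations, then applying the same transfer function to the future run. A minimal sketch with synthetic series follows; the study's own correction method may differ.

```python
# Hedged sketch: mean-and-variance bias correction of model output against
# observations, applied to a future run via the historical transfer function.
import numpy as np

rng = np.random.default_rng(6)
obs = 480 + 120 * rng.normal(size=107)           # observed annual precipitation (mm)
model_hist = 560 + 90 * rng.normal(size=107)     # model, historical period
model_fut = 540 + 95 * rng.normal(size=94)       # model, future period

def bias_correct(x):
    # Match the historical model mean and standard deviation to the observations.
    return obs.mean() + (x - model_hist.mean()) * obs.std() / model_hist.std()

fut_corrected = bias_correct(model_fut)
print("raw future change:      ", model_fut.mean() - model_hist.mean())
print("corrected future change:", fut_corrected.mean() - bias_correct(model_hist).mean())
```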
Statistical Decoupling of a Lagrangian Fluid Parcel in Newtonian Cosmology
NASA Astrophysics Data System (ADS)
Wang, Xin; Szalay, Alex
2016-03-01
The Lagrangian dynamics of a single fluid element within a self-gravitational matter field is intrinsically non-local due to the presence of the tidal force. This complicates the theoretical investigation of the nonlinear evolution of various cosmic objects, e.g., dark matter halos, in the context of Lagrangian fluid dynamics, since fluid parcels with given initial density and shape may evolve differently depending on their environments. In this paper, we provide a statistical solution that could decouple this environmental dependence. After deriving the evolution equation for the probability distribution of the matter field, our method produces a set of closed ordinary differential equations whose solution is uniquely determined by the initial condition of the fluid element. Mathematically, it corresponds to the projected characteristic curve of the transport equation of the density-weighted probability density function (ρPDF). Consequently it is guaranteed that the one-point ρPDF would be preserved by evolving these local, yet nonlinear, curves with the same set of initial data as the real system. Physically, these trajectories describe the mean evolution averaged over all environments by substituting the tidal tensor with its conditional average. For Gaussian distributed dynamical variables, this mean tidal tensor is simply proportional to the velocity shear tensor, and the dynamical system would recover the prediction of the Zel’dovich approximation (ZA) with the further assumption of the linearized continuity equation. For a weakly non-Gaussian field, the averaged tidal tensor could be expanded perturbatively as a function of all relevant dynamical variables whose coefficients are determined by the statistics of the field.
STATISTICAL DECOUPLING OF A LAGRANGIAN FLUID PARCEL IN NEWTONIAN COSMOLOGY
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Xin; Szalay, Alex, E-mail: xwang@cita.utoronto.ca
The Lagrangian dynamics of a single fluid element within a self-gravitational matter field is intrinsically non-local due to the presence of the tidal force. This complicates the theoretical investigation of the nonlinear evolution of various cosmic objects, e.g., dark matter halos, in the context of Lagrangian fluid dynamics, since fluid parcels with given initial density and shape may evolve differently depending on their environments. In this paper, we provide a statistical solution that could decouple this environmental dependence. After deriving the evolution equation for the probability distribution of the matter field, our method produces a set of closed ordinary differential equations whose solution is uniquely determined by the initial condition of the fluid element. Mathematically, it corresponds to the projected characteristic curve of the transport equation of the density-weighted probability density function (ρPDF). Consequently it is guaranteed that the one-point ρPDF would be preserved by evolving these local, yet nonlinear, curves with the same set of initial data as the real system. Physically, these trajectories describe the mean evolution averaged over all environments by substituting the tidal tensor with its conditional average. For Gaussian distributed dynamical variables, this mean tidal tensor is simply proportional to the velocity shear tensor, and the dynamical system would recover the prediction of the Zel’dovich approximation (ZA) with the further assumption of the linearized continuity equation. For a weakly non-Gaussian field, the averaged tidal tensor could be expanded perturbatively as a function of all relevant dynamical variables whose coefficients are determined by the statistics of the field.
Statistical and dynamical remastering of classic exoplanet systems
NASA Astrophysics Data System (ADS)
Nelson, Benjamin Earl
The most powerful constraints on planet formation will come from characterizing the dynamical state of complex multi-planet systems. Unfortunately, with that complexity comes a number of factors that make analyzing these systems a computationally challenging endeavor: the sheer number of model parameters, a wonky shaped posterior distribution, and hundreds to thousands of time series measurements. In this dissertation, I will review our efforts to improve the statistical analyses of radial velocity (RV) data and their applications to some renowned, dynamically complex exoplanet systems. In the first project (Chapters 2 and 4), we develop a differential evolution Markov chain Monte Carlo (RUN DMC) algorithm to tackle the aforementioned difficult aspects of data analysis. We test the robustness of the algorithm in regards to the number of modeled planets (model dimensionality) and increasing dynamical strength. We apply RUN DMC to a couple of classic multi-planet systems and one highly debated system from radial velocity surveys. In the second project (Chapter 5), we analyze RV data of 55 Cancri, a wide binary system known to harbor five planets orbiting the primary. We find the inner-most planet "e" must be coplanar to within 40 degrees of the outer planets, otherwise Kozai-like perturbations will cause the planet to enter the stellar photosphere through its periastron passage. We find the orbits of planets "b" and "c" are apsidally aligned and librating with low to median amplitude (50+/-6 10 degrees), but they are not orbiting in a mean-motion resonance. In the third project (Chapters 3, 4, 6), we analyze RV data of Gliese 876, a four-planet system with three participating in a multi-body resonance, i.e. a Laplace resonance. From a combined observational and statistical analysis computing Bayes factors, we find a four-planet model is favored over one with three planets. Conditioned on this preferred model, we meaningfully constrain the three-dimensional orbital architecture of all the planets orbiting Gliese 876 based on the radial velocity data alone. By demanding orbital stability, we find the resonant planets have low mutual inclinations phi so they must be roughly coplanar (phicb = 1.41(+/-0.62/0.57) degrees and phibe = 3.87(+/-1.99/1.86) degrees). The three-dimensional Laplace argument librates chaotically with an amplitude of 50.5(+/-7.9/10.0) degrees, indicating significant past disk migration and ensuring long-term stability. In the final project (Chapter 7), we analyze the RV data for nu Octantis, a closely separated binary with an alleged planet orbiting interior and retrograde to the binary. Preliminary results place very tight constraints on the planet-binary mutual inclination but no model is dynamically stable beyond 10^5 years. These empirically derived models motivate the need for more sophisticated algorithms to analyze exoplanet data and will provide new challenges for planet formation models.
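RUN DMC belongs to the family of differential evolution Markov chain Monte Carlo samplers, in which each chain proposes a jump along the difference of two other randomly chosen chains. The sketch below shows that core move on a toy two-parameter target; the target density, tuning constants, and chain count are illustrative assumptions, not the dissertation's radial velocity model.

```python
# Hedged sketch: the basic differential-evolution MCMC (DE-MC) update applied
# to a toy correlated 2-D Gaussian target.
import numpy as np

rng = np.random.default_rng(7)

def log_post(theta):
    a, b = theta
    return -0.5 * (a**2 + (b - 0.9 * a) ** 2 / 0.19)

n_chains, n_dim, n_steps = 10, 2, 4000
gamma = 2.38 / np.sqrt(2 * n_dim)          # standard DE-MC scaling factor
chains = rng.normal(size=(n_chains, n_dim))
logp = np.array([log_post(c) for c in chains])

for _ in range(n_steps):
    for i in range(n_chains):
        a, b = rng.choice([j for j in range(n_chains) if j != i], size=2, replace=False)
        prop = chains[i] + gamma * (chains[a] - chains[b]) + rng.normal(scale=1e-4, size=n_dim)
        lp = log_post(prop)
        if np.log(rng.random()) < lp - logp[i]:    # Metropolis acceptance
            chains[i], logp[i] = prop, lp

print("posterior mean estimate:", chains.mean(axis=0))
```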
NASA Astrophysics Data System (ADS)
Krestyannikov, E.; Tohka, J.; Ruotsalainen, U.
2008-06-01
This paper presents a novel statistical approach for joint estimation of regions-of-interest (ROIs) and the corresponding time-activity curves (TACs) from dynamic positron emission tomography (PET) brain projection data. It is based on optimizing the joint objective function that consists of a data log-likelihood term and two penalty terms reflecting the available a priori information about the human brain anatomy. The developed local optimization strategy iteratively updates both the ROI and TAC parameters and is guaranteed to monotonically increase the objective function. The quantitative evaluation of the algorithm is performed with numerically and Monte Carlo-simulated dynamic PET brain data of the 11C-Raclopride and 18F-FDG tracers. The results demonstrate that the method outperforms the existing sequential ROI quantification approaches in terms of accuracy, and can noticeably reduce the errors in TACs arising due to the finite spatial resolution and ROI delineation.
Ring polymer dynamics in curved spaces
NASA Astrophysics Data System (ADS)
Wolf, S.; Curotto, E.
2012-07-01
We formulate an extension of the ring polymer dynamics approach to curved spaces using stereographic projection coordinates. We test the theory by simulating the particle in a ring, T^1, mapped by a stereographic projection using three potentials. Two of these are quadratic, and one is a nonconfining sinusoidal model. We propose a new class of algorithms for the integration of the ring polymer Hamilton equations in curved spaces. These are designed to improve the energy conservation of symplectic integrators based on the split operator approach. For manifolds, the position-position autocorrelation function can be formulated in numerous ways. We find that the position-position autocorrelation function computed from configurations in the Euclidean space R^2 that contains T^1 as a submanifold has the best statistical properties. The agreement with exact results obtained with vector space methods is excellent for all three potentials, for all values of time in the interval simulated, and for a relatively broad range of temperatures.
Keys, Yolanda; Silverman, Susan R; Evans, Jennie
2017-10-01
The purpose of this study was to collect the perceptions of design professionals and clinicians regarding design process success strategies and elements of interprofessional engagement and communication during healthcare design and construction projects. Additional objectives were to gather best practices to maximize clinician engagement and provide tools and techniques to improve interdisciplinary collaboration for future projects. Strategies are needed to enhance the design and construction process and create interactions that benefit not only the project but the individuals working to see its completion. Meaningful interprofessional collaboration is essential to any healthcare design project and making sure the various players communicate is a critical element. This was a qualitative study conducted via an online survey. Respondents included architects, construction managers, interior designers, and healthcare personnel who had recently been involved in a building renovation or new construction project for a healthcare facility. Responses to open-ended questions were analyzed for themes, and descriptive statistics were used to provide insight into participant demographics. Information on the impressions, perceptions, and opportunities related to clinician involvement in design projects was collected from nurses, architects, interior designers, and construction managers. Qualitative analysis revealed themes of clinician input, organizational dynamics, and a variety of communication strategies to be the most frequently mentioned elements of successful interprofessional collaboration. This study validates the need to include clinician input in the design process, to consider the importance of organizational dynamics on design team functioning, and to incorporate effective communication strategies during design and construction projects.
NASA Astrophysics Data System (ADS)
Wu, Qing; Luu, Quang-Hung; Tkalich, Pavel; Chen, Ge
2018-04-01
Having great impacts on human lives, global warming and the associated sea level rise are believed to be strongly linked to anthropogenic causes. A statistical approach offers a simple and yet conceptually verifiable combination of remotely connected climate variables and indices, including sea level and surface temperature. We propose an improved statistical reconstruction model based on an empirical dynamic control system, taking into account climate variability and deriving parameters from Monte Carlo cross-validation random experiments. For the historical data from 1880 to 2001, we obtained higher correlations than other dynamic empirical models. The averaged root mean square errors are reduced in both reconstructed fields, namely the global mean surface temperature (by 24-37%) and the global mean sea level (by 5-25%). Our model is also more robust, as it notably reduces the instability associated with varying initial values. These results suggest that the model not only significantly improves the global mean reconstructions of temperature and sea level but also has the potential to improve future projections.
NASA Astrophysics Data System (ADS)
Dairaku, K.
2017-12-01
The Asia-Pacific regions are increasingly threatened by large-scale natural disasters, and there are growing concerns that losses and damages from natural disasters will be further exacerbated by climate change and socio-economic change. Climate information and services for risk assessments are therefore of great concern. Fundamental regional climate information is indispensable for understanding the changing climate and making decisions on when and how to act. To meet the needs of stakeholders such as national and local governments, spatio-temporally comprehensive and consistent information is necessary and useful for decision making. Multi-model ensemble regional climate scenarios with 1-km horizontal grid spacing over Japan are developed by using 37 CMIP5 GCMs (RCP8.5) and statistical downscaling (Bias Corrected Spatial Disaggregation, BCSD) to investigate the uncertainty of projected change associated with structural differences among the GCMs for the historical climate (1950-2005) and near-future climate (2026-2050) periods. The statistically downscaled regional climate scenarios show good performance for annual and seasonal averages of precipitation and temperature, but they systematically underestimate extreme events, such as hot days over 35 Celsius and annual maximum daily precipitation, because of the interpolation processes in the BCSD method. Each model projects different responses in the near-future climate because of structural differences, although most of the 37 CMIP5 models show a qualitatively consistent increase of average and extreme temperature and precipitation. The added value of statistical and dynamical downscaling methods is also investigated for locally forced nonlinear phenomena and extreme events.
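Bias Corrected Spatial Disaggregation, as referenced above, proceeds in two steps: quantile-map the coarse GCM value onto the observed coarse-scale distribution, then spread the resulting coarse anomaly over a fine-resolution observed climatology. Below is a minimal single-cell, single-month sketch with synthetic data; the grid sizes and the multiplicative anomaly form are assumptions for illustration.

```python
# Hedged sketch of the two BCSD steps for one coarse cell and one month:
# (1) quantile-mapping bias correction, (2) spatial disaggregation by scaling
# a fine-grid climatology with the coarse-cell anomaly.
import numpy as np

rng = np.random.default_rng(8)
obs_coarse = 5.0 + rng.gamma(2.0, 1.5, size=50)     # observed coarse-cell monthly means
gcm_hist   = 7.0 + rng.gamma(2.0, 1.5, size=50)     # GCM historical values, same cell
gcm_future = 8.5                                    # one future GCM value to downscale

# Step 1: bias correction by quantile mapping against the observed record.
q = np.searchsorted(np.sort(gcm_hist), gcm_future) / len(gcm_hist)
bc_value = np.quantile(obs_coarse, np.clip(q, 0.0, 1.0))

# Step 2: spatial disaggregation -- apply the coarse-cell ratio anomaly to a
# fine-grid observed climatology (multiplicative form, as commonly used for precipitation).
fine_clim = 3.0 + rng.random((10, 10)) * 6.0        # fine-resolution climatology inside the cell
anomaly = bc_value / obs_coarse.mean()
downscaled = fine_clim * anomaly

print("bias-corrected coarse value:", round(bc_value, 2))
print("downscaled field mean:", round(downscaled.mean(), 2))
```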
Symposium on Turbulence (13th) Held at Rolla, Missouri on September 21- 23, 1992
1992-09-01
this article is part of a project aimed at increasing the role of computational fluid dynamics (CFD) in the process of developing more efficient gas... techniques in and fluid physics of high speed compressible or reacting flows undergoing significant changes of indices of refraction. Possible topics... in experimental fluid mechanics; homogeneous turbulence, including closures and statistical properties; turbulence in compressible fluids; fine scale
The future distribution of the savannah biome: model-based and biogeographic contingency
Scheiter, Simon; Langan, Liam; Trabucco, Antonio; Higgins, Steven I.
2016-01-01
The extent of the savannah biome is expected to be profoundly altered by climatic change and increasing atmospheric CO2 concentrations. Contrasting projections are given when using different modelling approaches to estimate future distributions. Furthermore, biogeographic variation within savannahs in plant function and structure is expected to lead to divergent responses to global change. Hence the use of a single model with a single savannah tree type will likely lead to biased projections. Here we compare and contrast projections of South American, African and Australian savannah distributions from the physiologically based Thornley transport resistance statistical distribution model (TTR-SDM) and three versions of a dynamic vegetation model (DVM) designed and parametrized separately for specific continents. We show that attempting to extrapolate any continent-specific model globally biases projections. By 2070, all DVMs generally project a decrease in the extent of savannahs at their boundary with forests, whereas the TTR-SDM projects a decrease in savannahs at their boundary with aridlands and grasslands. This difference is driven by forest and woodland expansion in response to rising atmospheric CO2 concentrations in DVMs, unaccounted for by the TTR-SDM. We suggest that the most suitable models of the savannah biome for future development are individual-based dynamic vegetation models designed for specific biogeographic regions. This article is part of the themed issue ‘Tropical grassy biomes: linking ecology, human use and conservation’. PMID:27502376
The future distribution of the savannah biome: model-based and biogeographic contingency.
Moncrieff, Glenn R; Scheiter, Simon; Langan, Liam; Trabucco, Antonio; Higgins, Steven I
2016-09-19
The extent of the savannah biome is expected to be profoundly altered by climatic change and increasing atmospheric CO2 concentrations. Contrasting projections are given when using different modelling approaches to estimate future distributions. Furthermore, biogeographic variation within savannahs in plant function and structure is expected to lead to divergent responses to global change. Hence the use of a single model with a single savannah tree type will likely lead to biased projections. Here we compare and contrast projections of South American, African and Australian savannah distributions from the physiologically based Thornley transport resistance statistical distribution model (TTR-SDM) and three versions of a dynamic vegetation model (DVM) designed and parametrized separately for specific continents. We show that attempting to extrapolate any continent-specific model globally biases projections. By 2070, all DVMs generally project a decrease in the extent of savannahs at their boundary with forests, whereas the TTR-SDM projects a decrease in savannahs at their boundary with aridlands and grasslands. This difference is driven by forest and woodland expansion in response to rising atmospheric CO2 concentrations in DVMs, unaccounted for by the TTR-SDM. We suggest that the most suitable models of the savannah biome for future development are individual-based dynamic vegetation models designed for specific biogeographic regions. This article is part of the themed issue 'Tropical grassy biomes: linking ecology, human use and conservation'. © 2016 The Author(s).
NASA Astrophysics Data System (ADS)
Keener, V. W.; Brewington, L.; Jaspers, K.
2016-12-01
To build an effective bridge from the climate modeling community to natural resource managers, we assessed the existing landscape to see where different groups diverge in their perceptions of climate data and needs. An understanding of a given community's shared knowledge and differences can help design more actionable science. Resource managers in Hawaii are eager to have future climate projections at spatial scales relevant to the islands. National initiatives to downscale climate data often exclude US insular regions, so researchers in Hawaii have generated regional dynamically and statistically downscaled projections. Projections of precipitation diverge, however, leading to difficulties in communication and use. Recently, a two day workshop was held with scientists and managers to evaluate available models and determine a set of best practices for moving forward with decision-relevant downscaling in Hawaii. To seed the discussion, the Pacific Regional Integrated Sciences and Assessments (RISA) program conducted a pre-workshop survey (N=65) of climate modelers and freshwater, ecosystem, and wildfire managers working in Hawaii. Scientists reported spending less than half of their time on operational research, although the majority was eager to partner with managers on specific projects. Resource managers had varying levels of familiarity with downscaled climate projections, but reported needing more information about uncertainty for decision making, and were less interested in the technical model details. There were large differences between groups of managers, with 41.7% of freshwater managers reporting that they used climate projections regularly, while a majority of ecosystem and wildfire managers reported having "no familiarity". Scientists and managers rated which spatial and temporal scales were most relevant to decision making. Finally, when asked to compare how confident they were in projections of specific climate variables between the dynamical and statistical data, 80-90% of managers responded that they had no opinion. Workshop attendees were very interested in the survey results, adding to evidence of a need for sustained engagement between modeler and user groups, as well as different strategies for working with different types of resource managers.
NASA Astrophysics Data System (ADS)
Pavlos, George; Malandraki, Olga; Pavlos, Evgenios; Iliopoulos, Aggelos; Karakatsanis, Leonidas
2017-04-01
As the solar plasma lives far from equilibrium, it is an excellent laboratory for testing non-equilibrium statistical mechanics. In this study, we present the highlights of Tsallis non-extensive statistical mechanics as concerns its applications to solar plasma dynamics, especially solar wind phenomena and the magnetosphere. We present some new and significant results concerning the dynamics of interplanetary coronal mass ejections (ICMEs) observed in the near-Earth solar wind environment at L1, as well as their effect on Earth's magnetosphere. The results refer to Tsallis non-extensive statistics and in particular to the estimation of the Tsallis q-triplet (qstat, qsen, qrel) of SEP time series observed in interplanetary space and of magnetic field time series of the ICME observed at Earth, resulting from the solar eruptive activity of March 7, 2012 at the Sun. For the magnetic field, we used a multi-spacecraft approach based on data from the ACE, CLUSTER 4, THEMIS-E and THEMIS-C spacecraft. For the data analysis, different time periods were considered, sorted as "quiet", "shock" and "aftershock", while different space domains, such as interplanetary space (near Earth at L1 and upstream of the Earth's bow shock), the Earth's magnetosheath and the magnetotail, were also taken into account. Our results reveal significant differences in statistical and dynamical features, indicating important variations of the SEP profile in time and of the magnetic field dynamics in both time and space domains during the shock event, in terms of rate of entropy production, relaxation dynamics and non-equilibrium metastable stationary states. Tsallis non-extensive statistical theory and the Tsallis extension of the Boltzmann-Gibbs entropy principle to the q-entropy principle (Tsallis, 1988, 2009) reveal a strongly universal character concerning non-equilibrium dynamics (Pavlos et al. 2012a,b, 2014, 2015, 2016; Karakatsanis et al. 2013). The Tsallis q-entropy principle can explain the emergence of a series of new and significant physical characteristics in distributed systems as well as in space plasmas. Such characteristics are: non-Gaussian statistics and anomalous diffusion processes, strange and fractional dynamics, multifractal, percolating and intermittent turbulence structures, multiscale and long spatio-temporal correlations, fractional acceleration and non-equilibrium stationary states (NESS) or non-equilibrium self-organization processes, and non-equilibrium phase transition and topological phase transition processes according to Zelenyi and Milovanov (2004). In this direction, our results clearly reveal strong self-organization and the development of macroscopic ordering of the plasma system, related to a strengthening of non-extensivity, multifractality and intermittency everywhere in the space plasma region during the CME event. Acknowledgements: This project has received funding from the European Union's Horizon 2020 research and innovation program under grant agreement No 637324.
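For reference, the q-entropy underlying the q-triplet estimates above has the standard Tsallis form (textbook notation, not a result of this study):

```latex
% Tsallis q-entropy; the limit q -> 1 recovers the Boltzmann-Gibbs entropy.
S_q \;=\; k\,\frac{1-\sum_i p_i^{\,q}}{q-1},
\qquad
\lim_{q\to 1} S_q \;=\; -\,k\sum_i p_i \ln p_i .
```

The triplet (qstat, qsen, qrel) then characterizes, respectively, the statistics of the stationary state, the sensitivity to initial conditions, and the relaxation of correlations.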
NASA Astrophysics Data System (ADS)
Wakazuki, Yasutaka; Hara, Masayuki; Fujita, Mikiko; Ma, Xieyao; Kimura, Fujio
2013-04-01
Regional-scale climate change projections play an important role in assessments of the influences of global warming and include statistical (SD) and dynamical (DD) downscaling approaches. In this study, a DD method is developed based on the pseudo-global-warming (PGW) method of Kimura and Kitoh (2007). In general, DD uses a regional climate model (RCM) with lateral boundary data. In the PGW method, the climatological mean differences estimated by GCMs are added to the objective analysis data (ANAL), and the resulting data are used as the lateral boundary data in the future climate simulations. The ANAL is also used as the lateral boundary condition of the present climate simulation. One merit of the PGW method is that the influence of GCM biases on the RCM simulations is reduced. However, the PGW method does not treat climate changes in relative humidity, year-to-year variation, or short-term disturbances. The new downscaling method developed here is named the incremental dynamical downscaling and analysis system (InDDAS). InDDAS treats climate changes in relative humidity and year-to-year variations. On the other hand, the uncertainties of climate change projections estimated by many GCMs are large and not negligible. Thus, stochastic regional-scale climate change projections are needed for assessments of the influences of global warming. Many RCM runs must be performed to produce stochastic information, but the computational cost is huge because the grid size of RCM runs should be small enough to resolve heavy rainfall. Therefore, the number of runs needed to produce stochastic information must be reduced. In InDDAS, the climatological differences added to ANAL are statistically pre-analyzed: the climatological differences of many GCMs are divided into a mean climatological difference (MD) and departures from MD. The departures are analyzed by principal component analysis, and positive and negative perturbations (positive and negative standard deviations multiplied by the departure patterns, i.e., eigenvectors) for multiple modes are added to MD. Consequently, the most likely future state is calculated with the climatological difference MD, while, for example, future states in which the temperature increase is large or small are calculated with MD plus the positive or negative perturbation of the first mode.
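A minimal sketch of the statistical pre-analysis step described above, assuming the GCM climatological differences are available as a (models x grid) array; the function and variable names are illustrative and not taken from InDDAS.

```python
import numpy as np

def inddas_perturbations(deltas, n_modes=1):
    """Decompose GCM climatological differences (n_models x n_grid) into a
    mean difference (MD) plus leading principal-component perturbations."""
    md = deltas.mean(axis=0)                      # mean climatological difference
    departures = deltas - md                      # departures from MD
    # Principal component analysis via SVD of the departure matrix
    u, s, vt = np.linalg.svd(departures, full_matrices=False)
    n = deltas.shape[0]
    sigma = s / np.sqrt(n - 1)                    # std dev of each PC score
    forcings = {"MD": md}
    for k in range(n_modes):
        pert = sigma[k] * vt[k]                   # +/- one std dev along mode k
        forcings[f"MD+mode{k+1}"] = md + pert
        forcings[f"MD-mode{k+1}"] = md - pert
    return forcings

# Example: 10 GCMs, 500 grid points of projected temperature change
rng = np.random.default_rng(0)
deltas = 3.0 + rng.normal(0, 0.5, size=(10, 500))
lbc_increments = inddas_perturbations(deltas, n_modes=1)
```

Each entry of the returned dictionary would then be added to ANAL to form one lateral-boundary forcing for an RCM run, so that a few runs span the inter-GCM spread.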
Haas, Kevin R; Yang, Haw; Chu, Jhih-Wei
2013-12-12
The dynamics of a protein along a well-defined coordinate can be formally projected onto the form of an overdamped Langevin equation. Here, we present a comprehensive statistical-learning framework for simultaneously quantifying the deterministic force (the potential of mean force, PMF) and the stochastic force (characterized by the diffusion coefficient, D) from single-molecule Förster-type resonance energy transfer (smFRET) experiments. The likelihood functional of the Langevin parameters, PMF and D, is expressed by a path integral of the latent smFRET distance that follows Langevin dynamics and realized by the donor and the acceptor photon emissions. The solution is made possible by an eigendecomposition of the time-symmetrized form of the corresponding Fokker-Planck equation coupled with photon statistics. To extract the Langevin parameters from photon arrival time data, we advance the expectation-maximization algorithm in statistical learning, originally developed for and mostly used in discrete-state systems, to a general form in the continuous space that allows for a variational calculus on the continuous PMF function. We also introduce the regularization of the solution space in this Bayesian inference based on a maximum trajectory-entropy principle. We use a highly nontrivial example with realistically simulated smFRET data to illustrate the application of this new method.
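As a point of reference (not taken from the abstract itself), the overdamped Langevin equation along a scalar coordinate x, with PMF V(x) and, for simplicity, a position-independent diffusion coefficient D, can be written as

\[
\dot{x}(t) = -\beta D \,\frac{\partial V(x)}{\partial x} + \sqrt{2D}\,\xi(t), \qquad \beta = \frac{1}{k_B T},
\]

where \(\xi(t)\) is zero-mean Gaussian white noise with \(\langle\xi(t)\,\xi(t')\rangle = \delta(t-t')\). The framework described above infers both V(x) and D from the photon arrival data, with the smFRET distance as the latent coordinate x.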
HINDERED DIFFUSION OF COAL LIQUIDS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Theodore T. Tsotsis; Muhammad Sahimi; Ian A. Webster
1996-01-01
It was the purpose of the project described here to carry out careful and detailed investigations of petroleum and coal asphaltene transport through model porous systems under a broad range of temperature conditions. The experimental studies were to be coupled with detailed, in-depth statistical and molecular dynamics models intended to provide a fundamental understanding of the overall transport mechanisms and a more accurate concept of the asphaltene structure. The following discussion describes some of our accomplishments.
Projections of Education Statistics to 2022. Forty-First Edition. NCES 2014-051
ERIC Educational Resources Information Center
Hussar, William J.; Bailey, Tabitha M.
2014-01-01
"Projections of Education Statistics to 2022" is the 41st report in a series begun in 1964. It includes statistics on elementary and secondary schools and postsecondary degree-granting institutions. This report provides revisions of projections shown in "Projections of Education Statistics to 2021" and projections of…
Projections of Education Statistics to 2021. Fortieth Edition. NCES 2013-008
ERIC Educational Resources Information Center
Hussar, William J.; Bailey, Tabitha M.
2013-01-01
"Projections of Education Statistics to 2021" is the 40th report in a series begun in 1964. It includes statistics on elementary and secondary schools and postsecondary degree-granting institutions. This report provides revisions of projections shown in "Projections of Education Statistics to 2020" and projections of…
Adapting regional watershed management to climate change in Bavaria and Québec
NASA Astrophysics Data System (ADS)
Ludwig, Ralf; Muerth, Markus; Schmid, Josef; Jobst, Andreas; Caya, Daniel; Gauvin St-Denis, Blaise; Chaumont, Diane; Velazquez, Juan-Alberto; Turcotte, Richard; Ricard, Simon
2013-04-01
The international research project QBic3 (Quebec-Bavarian Collaboration on Climate Change) aims at investigating the potential impacts of climate change on the hydrology of regional-scale catchments in Southern Quebec (Canada) and Bavaria (Germany). For this purpose, a hydro-meteorological modeling chain has been established, applying climatic forcing from both dynamical and statistical climate model data to an ensemble of hydrological models of varying complexity. The selection of input data, process descriptions and scenarios allows for the inter-comparison of the uncertainty ranges on selected runoff indicators; a methodology to display the relative importance of each source of uncertainty is developed, and results for past runoff (1971-2000) and potential future changes (2041-2070) are obtained. Finally, the impact of hydrological changes on the operational management of dams, reservoirs and transfer systems is investigated and shown for the Bavarian case studies, namely the potential change in i) hydro-power production for the Upper Isar watershed and ii) low-flow augmentation and water transfer rates at the Donau-Main transfer system in Central Franconia. Two overall findings will be presented and discussed in detail: a) the climate change response of selected hydrological indicators, especially those related to low flows, is strongly affected by the choice of the hydrological model. While an assessment of changes in the hydrological cycle is best represented by a complex, physically based hydrological model, computationally less demanding models (usually simple, lumped and conceptual) can still give a significant level of trust for selected indicators. b) Major differences arise in the projected climate forcing between the ensemble of dynamical climate models (GCM/RCM) and the statistical-stochastic WETTREG2010 approach. While the dynamical ensemble reveals a moderate modification of the hydrological processes in the investigated catchments, the WETTREG2010-driven runs show a severe detriment to all water operations, mainly related to a strong decline in projected precipitation in all seasons except winter.
Perspective: chemical dynamics simulations of non-statistical reaction dynamics
Ma, Xinyou; Hase, William L.
2017-01-01
Non-statistical chemical dynamics are exemplified by disagreements with the transition state (TS), RRKM and phase space theories of chemical kinetics and dynamics. The intrinsic reaction coordinate (IRC) is often used for the former two theories, and non-statistical dynamics arising from non-IRC dynamics are often important. In this perspective, non-statistical dynamics are discussed for chemical reactions, with results primarily obtained from chemical dynamics simulations and to a lesser extent from experiment. The non-statistical dynamical properties discussed are: post-TS dynamics, including potential energy surface bifurcations, product energy partitioning in unimolecular dissociation and avoiding exit-channel potential energy minima; non-RRKM unimolecular decomposition; non-IRC dynamics; direct mechanisms for bimolecular reactions with pre- and/or post-reaction potential energy minima; non-TS theory barrier recrossings; and roaming dynamics. This article is part of the themed issue ‘Theoretical and computational studies of non-equilibrium and non-statistical dynamics in the gas phase, in the condensed phase and at interfaces’. PMID:28320906
Climate Change Impacts at Department of Defense
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kotamarthi, Rao; Wang, Jiali; Zoebel, Zach
This project is aimed at providing the U.S. Department of Defense (DoD) with a comprehensive analysis of the uncertainty associated with generating climate projections at the regional scale that can be used by stakeholders and decision makers to quantify and plan for the impacts of future climate change at specific locations. The merits and limitations of commonly used downscaling models, ranging from simple to complex, are compared, and their appropriateness for application at installation scales is evaluated. Downscaled climate projections are generated at selected DoD installations using dynamical and statistical methods, with an emphasis on generating probability distributions of climate variables and their associated uncertainties. The selection of sites, variables, and parameters for downscaling was based on a comprehensive understanding of the current and projected roles that weather and climate play in operating, maintaining, and planning DoD facilities and installations.
Projections of Education Statistics to 2024. Forty-Third Edition. NCES 2016-013
ERIC Educational Resources Information Center
Hussar, William J.; Bailey, Tabitha M.
2016-01-01
"Projections of Education Statistics to 2024" is the 43rd report in a series begun in 1964. It includes statistics on elementary and secondary schools and degree-granting postsecondary institutions. This report provides revisions of projections shown in Projections of Education Statistics to 2023 and projections of enrollment, graduates,…
Projections of Education Statistics to 2019. Thirty-Eighth Edition. NCES 2011-017
ERIC Educational Resources Information Center
Hussar, William J.; Bailey, Tabitha M.
2011-01-01
"Projections of Education Statistics to 2019" is the 38th report in a series begun in 1964. It includes statistics on elementary and secondary schools and degree-granting institutions. This report provides revisions of projections shown in "Projections of Education Statistics to 2018." Included are projections of enrollment,…
Projections of Education Statistics to 2020. Thirty-Ninth Edition. NCES 2011-026
ERIC Educational Resources Information Center
Hussar, William J.; Bailey, Tabitha M.
2011-01-01
"Projections of Education Statistics to 2020" is the 39th report in a series begun in 1964. It includes statistics on elementary and secondary schools and postsecondary degree-granting institutions. This report provides revisions of projections shown in "Projections of Education Statistics to 2019". Included are projections of…
Projections of Education Statistics to 2025. Forty-Fourth Edition. NCES 2017-019
ERIC Educational Resources Information Center
Hussar, William J.; Bailey, Tabitha M.
2017-01-01
"Projections of Education Statistics to 2025" is the 44th report in a series begun in 1964. It includes statistics on elementary and secondary schools and degree-granting postsecondary institutions. This report provides revisions of projections shown in Projections of Education Statistics to 2024 and projections of enrollment, graduates,…
Projections of Education Statistics to 2023. Forty-Second Edition. NCES 2015-073
ERIC Educational Resources Information Center
Hussar, William J.; Bailey, Tabitha M.
2016-01-01
"Projections of Education Statistics to 2023" is the 42nd report in a series begun in 1964. It includes statistics on elementary and secondary schools and postsecondary degree-granting institutions. This report provides revisions of projections shown in Projections of Education Statistics to 2022 and projections of enrollment, graduates,…
Far-from-Equilibrium Route to Superthermal Light in Bimodal Nanolasers
NASA Astrophysics Data System (ADS)
Marconi, Mathias; Javaloyes, Julien; Hamel, Philippe; Raineri, Fabrice; Levenson, Ariel; Yacomotti, Alejandro M.
2018-02-01
Microscale and nanoscale lasers inherently exhibit rich photon statistics due to complex light-matter interaction in a strong spontaneous emission noise background. It is well known that they may display superthermal fluctuations—photon superbunching—in specific situations due to either gain competition, leading to mode-switching instabilities, or carrier-carrier coupling in superradiant microcavities. Here we show a generic route to superbunching in bimodal nanolasers by preparing the system far from equilibrium through a parameter quench. We demonstrate, both theoretically and experimentally, that transient dynamics after a short-pump-pulse-induced quench leads to heavy-tailed superthermal statistics when projected onto the weak mode. We implement a simple experimental technique to access the probability density functions that further enables quantifying the distance from thermal equilibrium via the thermodynamic entropy. The universality of this mechanism relies on the far-from-equilibrium dynamical scenario, which can be mapped to a fast cooling process of a suspension of Brownian particles in a liquid. Our results open up new avenues to mold photon statistics in multimode optical systems and may constitute a test bed to investigate out-of-equilibrium thermodynamics using micro or nanocavity arrays.
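For context (a standard definition, not quoted from this abstract), "superthermal" statistics refer to a zero-delay second-order correlation that exceeds the thermal-light value:

\[
g^{(2)}(0) = \frac{\langle n(n-1)\rangle}{\langle n\rangle^{2}},
\]

with \(g^{(2)}(0)=1\) for coherent light, \(g^{(2)}(0)=2\) for thermal light, and \(g^{(2)}(0)>2\) for the superbunched (superthermal) photon statistics produced by the quench protocol described above.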
JIGSAW: Preference-directed, co-operative scheduling
NASA Technical Reports Server (NTRS)
Linden, Theodore A.; Gaw, David
1992-01-01
Techniques that enable humans and machines to cooperate in the solution of complex scheduling problems have evolved out of work on the daily allocation and scheduling of Tactical Air Force resources. A generalized, formal model of these applied techniques is being developed. It is called JIGSAW by analogy with the multi-agent, constructive process used when solving jigsaw puzzles. JIGSAW begins from this analogy and extends it by propagating local preferences into global statistics that dynamically influence the value and variable ordering decisions. The statistical projections also apply to abstract resources and time periods--allowing more opportunities to find a successful variable ordering by reserving abstract resources and deferring the choice of a specific resource or time period.
NASA Astrophysics Data System (ADS)
Ahn, J. B.; Hur, J.
2015-12-01
Seasonal predictions of both the surface air temperature and the first-flowering date (FFD) over South Korea are produced using dynamical downscaling (Hur and Ahn, 2015). Dynamical downscaling is performed using the Weather Research and Forecasting (WRF) model v3.0 with lateral forcing from hourly outputs of the Pusan National University (PNU) coupled general circulation model (CGCM) v1.1. Gridded surface air temperature data with high spatial (3 km) and temporal (daily) resolution are obtained using the physically based dynamical models. To reduce systematic bias, a simple statistical correction method is then applied to the model output. The FFDs of cherry, peach and pear in South Korea are predicted for the decade 1999-2008 by applying the corrected daily temperature predictions to the phenological thermal-time model. The WRF v3.0 results capture the detailed topographical effect, despite having cold and warm biases in the warm and cold seasons, respectively. After applying the correction, the mean temperature for early spring (February to April) represents the general pattern of the observations well, while preserving the advantages of dynamical downscaling. The FFD predictabilities for the three tree species are evaluated in terms of qualitative, quantitative and categorical estimations. Although the FFDs derived from the corrected WRF results reproduce the observed spatial distribution and variation well, the prediction performance does not reach statistical significance or an adequate level of predictability. The approach used in the study may be helpful in obtaining detailed and useful information about FFD and regional temperature by accounting for physically based atmospheric dynamics, although the seasonal predictability of flowering phenology is not yet high enough. Acknowledgements: This work was carried out with the support of the Rural Development Administration Cooperative Research Program for Agriculture Science and Technology Development under Grant Project No. PJ009953 and Project No. PJ009353, Republic of Korea. Reference: Hur, J., J.-B. Ahn, 2015. Seasonal Prediction of Regional Surface Air Temperature and First-flowering Date over South Korea, Int. J. Climatol., DOI: 10.1002/joc.4323.
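A minimal sketch of a thermal-time (growing-degree-day) phenology model of the kind applied above; the base temperature, heat-sum threshold, and start date used here are illustrative placeholders, not values from the study.

```python
import numpy as np

def first_flowering_doy(daily_tmean, t_base=5.0, heat_requirement=150.0, start_doy=32):
    """Return the day of year on which degree-days above t_base, accumulated
    from start_doy (here Feb 1), first reach the heat requirement."""
    gdd = 0.0
    for doy in range(start_doy, len(daily_tmean) + 1):
        gdd += max(daily_tmean[doy - 1] - t_base, 0.0)
        if gdd >= heat_requirement:
            return doy
    return None  # requirement not met within the record

# Example with synthetic bias-corrected daily mean temperatures for one year
rng = np.random.default_rng(1)
doys = np.arange(1, 366)
tmean = 12.0 - 10.0 * np.cos(2 * np.pi * doys / 365) + rng.normal(0, 2, 365)
print(first_flowering_doy(tmean))
```

In the workflow above, the corrected 3 km daily WRF temperatures would replace the synthetic series at each grid point, yielding a gridded FFD prediction per species.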
Towards estimates of future rainfall erosivity in Europe based on REDES and WorldClim datasets
NASA Astrophysics Data System (ADS)
Panagos, Panos; Ballabio, Cristiano; Meusburger, Katrin; Spinoni, Jonathan; Alewell, Christine; Borrelli, Pasquale
2017-05-01
The policy request to develop trends in soil erosion change can be addressed by developing modelling scenarios of the two most dynamic factors in soil erosion, i.e. rainfall erosivity and land cover change. The recently developed Rainfall Erosivity Database at European Scale (REDES) and the statistical approach used to spatially interpolate rainfall erosivity data provide the basis for predicting future rainfall erosivity under climate scenarios. A thorough statistical modelling approach (Gaussian Process Regression), with the selection of the most appropriate covariates (monthly precipitation, temperature datasets and bioclimatic layers), allowed rainfall erosivity to be predicted under climate change scenarios. The mean rainfall erosivity for the European Union and Switzerland is projected to reach 857 MJ mm ha-1 h-1 yr-1 by 2050, a relative increase of 18% compared to the baseline (2010). The changes are heterogeneous across the European continent, depending on the future projections for the most erosive months (hot period: April-September). The results provide a pan-European projection of future rainfall erosivity that takes into account the uncertainties of the climate models.
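A hedged illustration of the kind of Gaussian Process Regression used to map erosivity from climatic covariates; the covariate set, kernel, and placeholder data below are ours and not those used for REDES.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# X: covariates at station locations (e.g., monthly precipitation, temperature,
# bioclimatic layers); y: observed rainfall erosivity (MJ mm ha-1 h-1 yr-1).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))                    # placeholder covariates
y = 700 + 50 * X[:, 0] - 30 * X[:, 1] + rng.normal(0, 20, 200)

kernel = 1.0 * RBF(length_scale=np.ones(5)) + WhiteKernel(noise_level=1.0)
gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

# Predict erosivity (with uncertainty) from future-scenario covariates
X_future = rng.normal(size=(10, 5))
mean, std = gpr.predict(X_future, return_std=True)
```

The same fitted relationship can be re-evaluated with covariates taken from different climate scenarios, which is what allows the scenario-dependent erosivity maps described above.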
A Multi-Class, Interdisciplinary Project Using Elementary Statistics
ERIC Educational Resources Information Center
Reese, Margaret
2012-01-01
This article describes a multi-class project that employs statistical computing and writing in a statistics class. Three courses, General Ecology, Meteorology, and Introductory Statistics, cooperated on a project for the EPA's Student Design Competition. The continuing investigation has also spawned several undergraduate research projects in…
The GBT Dynamic Scheduling System: Development and Testing
NASA Astrophysics Data System (ADS)
McCarty, M.; Clark, M.; Marganian, P.; O'Neil, K.; Shelton, A.; Sessoms, E.
2009-09-01
During the summer trimester of 2008, all observations on the Robert C. Byrd Green Bank Telescope (GBT) were scheduled using the new Dynamic Scheduling System (DSS). Beta testing exercised the policies, algorithms, and software developed for the DSS project. Since observers are located all over the world, the DSS was implemented as a web application. Technologies such as iCalendar, Really Simple Syndication (RSS) feeds, email, and instant messaging are used to transfer as much or as little information to observers as they request. We discuss the software engineering challenges leading to our implementation such as information distribution and building rich user interfaces in the web browser. We also relate our adaptation of agile development practices to design and develop the DSS. Additionally, we describe handling differences in expected versus actual initial conditions in the pool of project proposals for the 08B trimester. We then identify lessons learned from beta testing and present statistics on how the DSS was used during the trimester.
Development of hi-resolution regional climate scenarios in Japan by statistical downscaling
NASA Astrophysics Data System (ADS)
Dairaku, K.
2016-12-01
Climate information and services for Impacts, Adaptation and Vulnerability (IAV) assessments are of great concern. To meet the needs of stakeholders such as local governments, a Japanese national project, the Social Implementation Program on Climate Change Adaptation Technology (SI-CAT), was launched in December 2015. It develops reliable technologies for near-term climate change prediction. Multi-model ensemble regional climate scenarios with 1 km horizontal grid spacing over Japan are developed using CMIP5 GCMs and a statistical downscaling method to support municipal adaptation measures appropriate for possible regional climate changes. A statistical downscaling method, Bias Correction Spatial Disaggregation (BCSD), is employed to develop regional climate scenarios based on five CMIP5 RCP8.5 GCMs (MIROC5, MRI-CGCM3, GFDL-CM3, CSIRO-Mk3-6-0, HadGEM2-ES) for the historical climate (1970-2005) and the near-future climate (2020-2055). The downscaled variables are monthly/daily precipitation and temperature. The file format is NetCDF4 (conforming to CF-1.6, with HDF5 compression). The developed regional climate scenarios will be expanded to meet stakeholder needs, and interface applications to access and download the data are under development. A statistical downscaling method does not necessarily represent locally forced nonlinear phenomena and extreme events, such as heavy rain and heavy snow, well. To complement the statistical method, a dynamical downscaling approach is also applied to specific regions where stakeholder needs exist. The added value of the statistical/dynamical downscaling methods compared with the parent GCMs is investigated.
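A compact sketch of the spatial-disaggregation step of BCSD, under the simplifying assumption that the coarse GCM field has already been bias-corrected; the grid sizes, interpolation order, and synthetic fields are illustrative only.

```python
import numpy as np
from scipy.ndimage import zoom

def bcsd_disaggregate(coarse_future, coarse_clim, fine_clim, additive=True):
    """Spatially disaggregate a bias-corrected coarse monthly field:
    form the coarse anomaly, interpolate it to the fine grid, and
    recombine it with the fine-scale observed climatology."""
    anomaly = coarse_future - coarse_clim if additive else coarse_future / coarse_clim
    factor = (fine_clim.shape[0] / coarse_future.shape[0],
              fine_clim.shape[1] / coarse_future.shape[1])
    anomaly_fine = zoom(anomaly, factor, order=1)        # first-order interpolation
    return fine_clim + anomaly_fine if additive else fine_clim * anomaly_fine

# Example: 10x10 GCM cells disaggregated onto a 100x100 observational grid
coarse_future = np.full((10, 10), 16.5)       # bias-corrected future temperature
coarse_clim = np.full((10, 10), 15.0)         # coarse-grid historical climatology
fine_clim = 15.0 + np.random.default_rng(2).normal(0, 1, (100, 100))
t_downscaled = bcsd_disaggregate(coarse_future, coarse_clim, fine_clim)
```

The additive form is the usual choice for temperature; a multiplicative anomaly (additive=False) is commonly used for precipitation.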
Characterizing Sub-Daily Flow Regimes: Implications of Hydrologic Resolution on Ecohydrology Studies
Bevelhimer, Mark S.; McManamay, Ryan A.; O'Connor, B.
2014-05-26
Natural variability in flow is a primary factor controlling geomorphic and ecological processes in riverine ecosystems. Within the hydropower industry, there is growing pressure from environmental groups and natural resource managers to change reservoir releases from daily peaking to run-of-river operations on the basis of the assumption that downstream biological communities will improve under a more natural flow regime. In this paper, we discuss the importance of assessing sub-daily flows for understanding the physical and ecological dynamics within river systems. We present a variety of metrics for characterizing sub-daily flow variation and use these metrics to evaluate general trends among streams affected by peaking hydroelectric projects, run-of-river projects, and streams that are largely unaffected by flow-altering activities. Univariate and multivariate techniques were used to assess similarity among different stream types on the basis of these sub-daily metrics. For comparison, similar analyses were performed using analogous metrics calculated with mean daily flow values. Our results confirm that sub-daily flow metrics reveal variation among and within streams that is not captured by daily flow statistics. Using sub-daily flow statistics, we were able to quantify the degree of difference between unaltered and peaking streams and the amount of similarity between unaltered and run-of-river streams. The sub-daily statistics were largely uncorrelated with daily statistics of similar scope. Furthermore, on short temporal scales, sub-daily statistics reveal the relatively constant nature of unaltered stream reaches and the highly variable nature of hydropower-affected streams, whereas daily statistics show just the opposite over longer temporal scales.
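As one concrete example of the kind of sub-daily metric discussed above, a Richards-Baker-style flashiness index can be computed on hourly discharge; the choice of this particular metric and the synthetic records below are ours, not necessarily those used in the paper.

```python
import numpy as np

def flashiness_index(q):
    """Richards-Baker style flashiness: the sum of absolute hour-to-hour
    changes in discharge divided by the total discharge over the record."""
    q = np.asarray(q, dtype=float)
    return np.abs(np.diff(q)).sum() / q[1:].sum()

# Hourly discharge: a peaking-hydropower reach vs. a quasi-natural reach
peaking = np.tile(np.r_[np.full(8, 5.0), np.full(8, 60.0), np.full(8, 5.0)], 7)
natural = 20.0 + np.random.default_rng(3).normal(0, 0.5, peaking.size)
print(flashiness_index(peaking), flashiness_index(natural))
```

Computed on hourly data, the peaking record scores far higher than the quasi-natural one, whereas the same index computed on daily means would largely erase that contrast, which is the point made above about hydrologic resolution.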
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schmieder, R.W.
The author presents a new approach for modeling the dynamics of collections of objects with internal structure. Based on the fact that the behavior of an individual in a population is modified by its knowledge of other individuals, a procedure for accounting for knowledge in a population of interacting objects is presented. It is assumed that each object has partial (or complete) knowledge of some (or all) other objects in the population. The dynamical equations for the objects are then modified to include the effects of this pairwise knowledge. This procedure has the effect of projecting out what the population will do from the much larger space of what it could do, i.e., filtering or smoothing the dynamics by replacing the complex detailed physical model with an effective model that produces the behavior of interest. The procedure therefore provides a minimalist approach for obtaining emergent collective behavior. The use of knowledge as a dynamical quantity, and its relationship to statistical mechanics, thermodynamics, information theory, and cognition microstructure are discussed.
Häggström, Ida; Beattie, Bradley J; Schmidtlein, C Ross
2016-06-01
To develop and evaluate a fast and simple tool called dPETSTEP (Dynamic PET Simulator of Tracers via Emission Projection), for dynamic PET simulations as an alternative to Monte Carlo (MC), useful for educational purposes and evaluation of the effects of the clinical environment, postprocessing choices, etc., on dynamic and parametric images. The tool was developed in MATLAB using both new and previously reported modules of PETSTEP (PET Simulator of Tracers via Emission Projection). Time activity curves are generated for each voxel of the input parametric image, whereby effects of imaging system blurring, counting noise, scatters, randoms, and attenuation are simulated for each frame. Each frame is then reconstructed into images according to the user-specified method, settings, and corrections. Reconstructed images were compared to MC data, and simple Gaussian noised time activity curves (GAUSS). dPETSTEP was 8000 times faster than MC. Dynamic images from dPETSTEP had a root mean square error that was within 4% on average of that of MC images, whereas the GAUSS images were within 11%. The average bias in dPETSTEP and MC images was the same, while GAUSS differed by 3% points. Noise profiles in dPETSTEP images conformed well to MC images, confirmed visually by scatter plot histograms, and statistically by tumor region of interest histogram comparisons that showed no significant differences (p < 0.01). Compared to GAUSS, dPETSTEP images and noise properties agreed better with MC. The authors have developed a fast and easy one-stop solution for simulations of dynamic PET and parametric images, and demonstrated that it generates both images and subsequent parametric images with very similar noise properties to those of MC images, in a fraction of the time. They believe dPETSTEP to be very useful for generating fast, simple, and realistic results, however since it uses simple scatter and random models it may not be suitable for studies investigating these phenomena. dPETSTEP can be downloaded free of cost from https://github.com/CRossSchmidtlein/dPETSTEP.
Navigating Earthquake Physics with High-Resolution Array Back-Projection
NASA Astrophysics Data System (ADS)
Meng, Lingsen
Understanding earthquake source dynamics is a fundamental goal of geophysics. Progress toward this goal has been slow due to the gap between state-of-the-art earthquake simulations and the limited source imaging techniques based on conventional low-frequency finite fault inversions. Seismic array processing is an alternative source imaging technique that employs the higher-frequency content of the earthquakes and provides finer detail of the source process with few prior assumptions. While back-projection provides key observations of previous large earthquakes, standard beamforming back-projection suffers from low resolution and severe artifacts. This thesis introduces the MUSIC technique, a high-resolution array processing method that aims to narrow the gap between seismic observations and earthquake simulations. MUSIC is a high-resolution method that takes advantage of higher-order signal statistics. The method has not yet been widely used in seismology because of the nonstationary and incoherent nature of the seismic signal. We adapt MUSIC to transient seismic signals by incorporating multitaper cross-spectrum estimates. We also adopt a "reference window" strategy that mitigates the "swimming artifact," a systematic drift effect in back-projection. The improved MUSIC back-projections allow the imaging of recent large earthquakes in finer detail, which gives rise to new perspectives on dynamic simulations. In the 2011 Tohoku-Oki earthquake, we observe frequency-dependent rupture behaviors which relate to the material variation along the dip of the subduction interface. In the 2012 off-Sumatra earthquake, we image complicated ruptures involving an orthogonal fault system and an unusual branching direction. This result, along with our complementary dynamic simulations, probes the pressure-insensitive strength of the deep oceanic lithosphere. In another example, back-projection is applied to the 2010 M7 Haiti earthquake recorded at regional distance. The high-frequency subevents are located at the edges of the geodetic slip regions and correlate with the stopping phases associated with rupture speed reduction as the earthquake arrests.
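To make the MUSIC idea concrete, here is a generic narrowband MUSIC pseudospectrum for a uniform linear array, a textbook form rather than the multitaper, reference-window variant developed in the thesis; array geometry and parameters are illustrative.

```python
import numpy as np

def music_spectrum(snapshots, n_sources, d=0.5, angles=np.linspace(-90, 90, 361)):
    """snapshots: (n_sensors, n_snapshots) complex array data.
    Returns candidate angles (degrees) and the MUSIC pseudospectrum."""
    m = snapshots.shape[0]
    R = snapshots @ snapshots.conj().T / snapshots.shape[1]   # sample covariance
    eigvals, eigvecs = np.linalg.eigh(R)                      # ascending eigenvalues
    En = eigvecs[:, : m - n_sources]                          # noise subspace
    spectrum = []
    for theta in np.deg2rad(angles):
        a = np.exp(-2j * np.pi * d * np.arange(m) * np.sin(theta))  # steering vector
        denom = np.abs(a.conj() @ En @ En.conj().T @ a)
        spectrum.append(1.0 / denom)
    return angles, np.array(spectrum)
```

Peaks of the pseudospectrum mark the arrival directions of coherent energy; the thesis further replaces the sample covariance with multitaper cross-spectral estimates so the method can handle transient, incoherent seismic signals.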
NASA Astrophysics Data System (ADS)
Hagemann, Stefan; Chen, Cui; Haerter, Jan O.; Gerten, Dieter; Heinke, Jens; Piani, Claudio
2010-05-01
Future climate model scenarios depend crucially on an adequate representation of the hydrological cycle. Within the European project "Water and Global Change" (WATCH), special care is taken to couple state-of-the-art climate model output to a suite of hydrological models. This coupling is expected to lead to a better assessment of changes in the hydrological cycle. However, due to the systematic errors of climate models, their output is often not directly applicable as input for hydrological models. Thus, a statistical bias correction methodology has been developed that can be used to correct climate model output so as to produce internally consistent fields with the same statistical intensity distribution as the observations. As observations, globally re-analysed daily precipitation and temperature data obtained within the WATCH project are used. We will apply the bias correction to global climate model precipitation and temperature data from the GCMs ECHAM5/MPIOM, CNRM-CM3 and LMDZ-4, and intercompare the bias-corrected data with the original GCM data and the observations. Then, the original and the bias-corrected GCM data will be used to force two global hydrology models: (1) the hydrological model of the Max Planck Institute for Meteorology (MPI-HM), consisting of the Simplified Land surface (SL) scheme and the Hydrological Discharge (HD) model, and (2) the dynamic vegetation model LPJmL operated by the Potsdam Institute for Climate Impact Research. The impact of the bias correction on the projected simulated hydrological changes will be analysed, and the resulting behaviour of the two hydrology models will be compared.
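A minimal empirical quantile-mapping sketch of the general bias-correction idea described here (matching the simulated intensity distribution to observations); the actual WATCH methodology is more elaborate, and all data below are synthetic placeholders.

```python
import numpy as np

def quantile_map(model_hist, obs_hist, model_future):
    """Empirical quantile mapping: replace each future model value with the
    observed value at the same quantile of the historical model distribution."""
    quantiles = np.searchsorted(np.sort(model_hist), model_future) / len(model_hist)
    quantiles = np.clip(quantiles, 0.0, 1.0)
    return np.quantile(obs_hist, quantiles)

# Example: a model whose daily precipitation is too light relative to observations
rng = np.random.default_rng(4)
obs = rng.gamma(shape=2.0, scale=3.0, size=5000)        # observed daily precip
mod_hist = rng.gamma(shape=2.0, scale=2.0, size=5000)   # biased model, historical
mod_fut = rng.gamma(shape=2.0, scale=2.4, size=5000)    # biased model, future
corrected = quantile_map(mod_hist, obs, mod_fut)
```

By construction the corrected historical values reproduce the observed intensity distribution, while the future values retain the model's simulated change signal relative to its own climatology.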
Statistical Literacy: Developing a Youth and Adult Education Statistical Project
ERIC Educational Resources Information Center
Conti, Keli Cristina; Lucchesi de Carvalho, Dione
2014-01-01
This article focuses on the notion of literacy--general and statistical--in the analysis of data from a fieldwork research project carried out as part of a master's degree that investigated the teaching and learning of statistics in adult education mathematics classes. We describe the statistical context of the project that involved the…
Comparative Longterm Mortality Trends in Cancer vs. Ischemic Heart Disease in Puerto Rico.
Torres, David; Pericchi, Luis R; Mattei, Hernando; Zevallos, Juan C
2017-06-01
Although contemporary mortality data are important for health assessment and planning purposes, their availability lags several years behind. Statistical projection techniques can be employed to obtain current estimates. This study aimed to assess annual trends of mortality in Puerto Rico due to cancer and ischemic heart disease (IHD), and to predict short-term and long-term cancer and IHD mortality figures. Projections of age-adjusted mortality per 100,000 population, with 50% probability intervals, were calculated using a Bayesian Age-Period-Cohort (APC) dynamic model. Multiple-cause-of-death annual files for Puerto Rico for the years 1994-2010 were used to calculate short-term (2011-2012) predictions. Long-term (2013-2022) predictions were based on quinquennial data. We also calculated gender differences in rates (men minus women) for each study period. Mortality rates for women were similar for cancer and IHD in the 1994-1998 period, but changed substantially in the projected 2018-2022 period. Cancer mortality rates declined gradually over time, and the gender difference remained constant throughout the historical and projected trends. A consistent declining trend in the historical annual IHD mortality rate was observed for both genders, with a substantial changepoint around 2004-2005 for men. The initial difference of 33% (80/100,000 vs. 60/100,000) between cancer and IHD mortality rates observed in the 1994-1998 period increased to 300% (60/100,000 vs. 20/100,000) for the 2018-2022 period. The APC projection model accurately projects short-term and long-term mortality trends for cancer and IHD in this population: the steady historical and projected cancer mortality rates contrast with the substantial decline in IHD mortality rates, especially in men.
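For reference, the directly age-adjusted rate underlying these projections has the standard form shown below (our notation, not the paper's):

\[
R_{\text{adj}} = 100{,}000 \times \sum_{a} w_a \,\frac{d_a}{n_a}, \qquad \sum_a w_a = 1,
\]

where \(d_a\) and \(n_a\) are the deaths and population in age group a and \(w_a\) are standard-population weights; the Age-Period-Cohort model then projects the age-specific rates \(d_a/n_a\) forward in time before adjustment.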
Visible and infrared investigations of planet-crossing asteroids and outer solar system objects
NASA Technical Reports Server (NTRS)
Tholen, David J.
1991-01-01
The project is supporting lightcurve photometry, colorimetry, thermal radiometry, and astrometry of selected asteroids. Targets include the planet-crossing population, particularly Earth approachers, which are believed to be the immediate source of terrestrial meteorites, future spacecraft targets, and those objects in the outer belt, primarily the Hilda and Trojan populations, that are dynamically isolated from the main asteroid belt. Goals include the determination of population statistics for the planet-crossing objects, the characterization of spacecraft targets to assist in encounter planning and subsequent interpretation of the data, a comparison of the collisional evolution of dynamically isolated Hilda and Trojan populations with the main belt, and the determination of the mechanism driving the activity of the distant object 2060 Chiron.
Using LabView for real-time monitoring and tracking of multiple biological objects
NASA Astrophysics Data System (ADS)
Nikolskyy, Aleksandr I.; Krasilenko, Vladimir G.; Bilynsky, Yosyp Y.; Starovier, Anzhelika
2017-04-01
Real-time study and tracking of the movement dynamics of various biological objects is important and widely researched today. The features of the objects, the conditions of their visualization, and the model parameters strongly influence the choice of optimal methods and algorithms for a specific task. Therefore, to automate the adaptation of recognition and tracking algorithms, several LabVIEW project trackers are considered in this article. The projects allow templates for training and retraining the system to be changed quickly, and they adapt to the speed of the objects and the statistical characteristics of noise in the images. New functions for comparing images or their features, descriptors, and pre-processing methods will be discussed. The experiments carried out to test the trackers on real video files will be presented and analyzed.
Multi-criterion model ensemble of CMIP5 surface air temperature over China
NASA Astrophysics Data System (ADS)
Yang, Tiantian; Tao, Yumeng; Li, Jingjing; Zhu, Qian; Su, Lu; He, Xiaojia; Zhang, Xiaoming
2018-05-01
Global circulation models (GCMs) are useful tools for simulating climate change, projecting future temperature changes, and therefore supporting the preparation of national climate adaptation plans. However, different GCMs are not always in agreement with each other over various regions, because GCM configurations, module characteristics, and dynamic forcings vary from one model to another. Model ensemble techniques are extensively used to post-process the outputs from GCMs and improve the variability of model outputs. Root-mean-square error (RMSE), the correlation coefficient (CC, or R) and uncertainty are commonly used statistics for evaluating the performance of GCMs. However, simultaneously achieving satisfactory values of all of these statistics cannot be guaranteed with many model ensemble techniques. In this paper, we propose a multi-model ensemble framework, using a state-of-the-art evolutionary multi-objective optimization algorithm (termed MOSPD), to evaluate different characteristics of ensemble candidates and to provide comprehensive trade-off information for different model ensemble solutions. A case study of optimizing the surface air temperature (SAT) ensemble solutions over different geographical regions of China is carried out. The data cover the period 1900 to 2100, and the projections of SAT are analyzed with regard to three statistical indices (RMSE, CC, and uncertainty). Among the derived ensemble solutions, the trade-off information is further analyzed with a robust Pareto front with respect to the different statistics. The comparison over the historical period (1900-2005) shows that the optimized solutions are superior to those obtained from a simple model average, as well as to any single GCM output. The improvements in the statistics vary across the climatic regions of China. Future projections (2006-2100) with the proposed ensemble method identify that the largest (smallest) temperature changes will occur in South Central China (Inner Mongolia), North Eastern China (South Central China), and North Western China (South Central China) under the RCP 2.6, RCP 4.5, and RCP 8.5 scenarios, respectively.
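A small sketch of the evaluation side of such a multi-criterion ensemble: candidate ensemble weightings are scored by RMSE and correlation against observations, and only the non-dominated (Pareto) set is retained. The optimizer itself (MOSPD) is not reproduced here, and all data and weights below are synthetic.

```python
import numpy as np

def scores(weights, members, obs):
    """RMSE and correlation of a weighted ensemble mean against observations.
    members: (n_models, n_time); weights sum to one."""
    ens = weights @ members
    rmse = np.sqrt(np.mean((ens - obs) ** 2))
    cc = np.corrcoef(ens, obs)[0, 1]
    return rmse, cc

def pareto_front(candidates):
    """Indices of candidates not dominated by any other (lower RMSE, higher CC)."""
    front = []
    for i, (r_i, c_i) in enumerate(candidates):
        dominated = any(r_j <= r_i and c_j >= c_i and (r_j < r_i or c_j > c_i)
                        for j, (r_j, c_j) in enumerate(candidates) if j != i)
        if not dominated:
            front.append(i)
    return front

rng = np.random.default_rng(5)
obs = rng.normal(size=200)
members = obs + rng.normal(0, [[0.5], [1.0], [1.5]], size=(3, 200))
w_sets = rng.dirichlet(np.ones(3), size=50)              # random candidate weights
cand = [scores(w, members, obs) for w in w_sets]
best = pareto_front(cand)
```

The Pareto front makes the trade-off explicit: no retained weighting can be improved in one statistic without degrading another, which is the information the framework above reports to users.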
From immunology to MRI data analysis: Problems in mathematical biology
NASA Astrophysics Data System (ADS)
Waters, Ryan Samuel
This thesis represents a collection of four distinct biological projects, arising from immunology and metabolomics, that required unique and creative mathematical approaches. One project focuses on understanding the role IL-2 plays in immune response regulation and exploring how these effects can be altered. We developed several dynamic models of the receptor signaling network, which we analyze analytically and numerically. In a second project, also focused on multiple sclerosis (MS), we sought to create a system for grading magnetic resonance images (MRI) that correlates well with disability. The goal is for these MRI scores to provide a better standard for large-scale clinical drug trials, limiting the bias associated with differences in available MRI technology and general grader/participant variability. The third project involves the study of the CRISPR adaptive immune system in bacteria. Bacterial cells recognize and acquire snippets of exogenous genetic material, which they incorporate into their DNA. In this project we explore the optimal design of the CRISPR system, given a viral distribution, to maximize the probability of survival. The final project involves the study of the benefits of colocalization of coupled enzymes in metabolic pathways. The hypothesized kinetic advantage, known as `channeling', of putting coupled enzymes closer together has been used as justification for the colocalization of coupled enzymes in biological systems. We developed and analyzed a simple partial differential equation for the diffusion of the intermediate substrate between coupled enzymes to explore the phenomenon of channeling. The four projects of my thesis represent very distinct biological problems that required a variety of techniques from diverse areas of mathematics, ranging from dynamical modeling to statistics, Fourier series and the calculus of variations. In each case, quantitative techniques were used to address biological questions from a mathematical perspective, ultimately providing insight back to the biological problems that motivated them.
High-Performance First-Principles Molecular Dynamics for Predictive Theory and Modeling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gygi, Francois; Galli, Giulia; Schwegler, Eric
This project focused on developing high-performance software tools for First-Principles Molecular Dynamics (FPMD) simulations, and applying them in investigations of materials relevant to energy conversion processes. FPMD is an atomistic simulation method that combines a quantum-mechanical description of electronic structure with the statistical description provided by molecular dynamics (MD) simulations. This reliance on fundamental principles allows FPMD simulations to provide a consistent description of structural, dynamical and electronic properties of a material. This is particularly useful in systems for which reliable empirical models are lacking. FPMD simulations are increasingly used as a predictive tool for applications such as batteries, solar energy conversion, light-emitting devices, electro-chemical energy conversion devices and other materials. During the course of the project, several new features were developed and added to the open-source Qbox FPMD code. The code was further optimized for scalable operation of large-scale, Leadership-Class DOE computers. When combined with Many-Body Perturbation Theory (MBPT) calculations, this infrastructure was used to investigate structural and electronic properties of liquid water, ice, aqueous solutions, nanoparticles and solid-liquid interfaces. Computing both ionic trajectories and electronic structure in a consistent manner enabled the simulation of several spectroscopic properties, such as Raman spectra, infrared spectra, and sum-frequency generation spectra. The accuracy of the approximations used allowed for direct comparisons of results with experimental data such as optical spectra, X-ray and neutron diffraction spectra. The software infrastructure developed in this project, as applied to various investigations of solids, liquids and interfaces, demonstrates that FPMD simulations can provide a detailed, atomic-scale picture of structural, vibrational and electronic properties of complex systems relevant to energy conversion devices.
Development of a new family of normalized modulus reduction and material damping curves
NASA Astrophysics Data System (ADS)
Darendeli, Mehmet Baris
2001-12-01
As part of various research projects [including the SRS (Savannah River Site) Project AA891070, EPRI (Electric Power Research Institute) Project 3302, and ROSRINE (Resolution of Site Response Issues from the Northridge Earthquake) Project], numerous geotechnical sites were drilled and sampled. Intact soil samples over a depth range of several hundred meters were recovered from 20 of these sites. These soil samples were tested in the laboratory at The University of Texas at Austin (UTA) to characterize the materials dynamically. The presence of a database accumulated from testing these intact specimens motivated a re-evaluation of empirical curves employed in the state of practice. The weaknesses of empirical curves reported in the literature were identified and the necessity of developing an improved set of empirical curves was recognized. This study focused on developing the empirical framework that can be used to generate normalized modulus reduction and material damping curves. This framework is composed of simple equations, which incorporate the key parameters that control nonlinear soil behavior. The data collected over the past decade at The University of Texas at Austin are statistically analyzed using First-order, Second-moment Bayesian Method (FSBM). The effects of various parameters (such as confining pressure and soil plasticity) on dynamic soil properties are evaluated and quantified within this framework. One of the most important aspects of this study is estimating not only the mean values of the empirical curves but also estimating the uncertainty associated with these values. This study provides the opportunity to handle uncertainty in the empirical estimates of dynamic soil properties within the probabilistic seismic hazard analysis framework. A refinement in site-specific probabilistic seismic hazard assessment is expected to materialize in the near future by incorporating the results of this study into state of practice.
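Families of curves of this type are commonly cast in a hyperbolic normalized form, reproduced below for orientation only; the exact functional form and the fitted coefficients are those reported in the dissertation itself.

\[
\frac{G}{G_{\max}} = \frac{1}{1 + \left(\gamma / \gamma_r\right)^{a}},
\]

where \(\gamma\) is the shear strain, \(\gamma_r\) is a reference strain that in such frameworks typically depends on confining pressure and plasticity index, and a is a curvature coefficient; material damping is then expressed as a companion function of \(G/G_{\max}\), and the Bayesian analysis supplies both mean values and uncertainties for the fitted parameters.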
The Impact of Student-Directed Projects in Introductory Statistics
ERIC Educational Resources Information Center
Spence, Dianna J.; Bailey, Brad; Sharp, Julia L.
2017-01-01
A multi-year study investigated the impact of incorporating student-directed discovery projects into introductory statistics courses. Pilot instructors at institutions across the United States taught statistics implementing student-directed projects with the help of a common set of instructional materials designed to facilitate such projects.…
NASA Astrophysics Data System (ADS)
Verrucci, Enrica; Bevington, John; Vicini, Alessandro
2014-05-01
A set of open-source tools to create building exposure datasets for seismic risk assessment was developed from 2010 to 2013 by the Inventory Data Capture Tools (IDCT) Risk Global Component of the Global Earthquake Model (GEM). The tools were designed to integrate data derived from remotely sensed imagery with statistically sampled in-situ field data on buildings to generate per-building and regional exposure data. A number of software tools were created to aid the development of these data, including mobile data capture tools for in-field structural assessment, and the Spatial Inventory Data Developer (SIDD) for creating "mapping schemes" - statistically inferred distributions of building stock applied to areas of homogeneous urban land use. These tools were made publicly available in January 2014. Exemplar implementations in Europe and Central Asia during the IDCT project highlighted several potential application areas beyond the original scope of the project. These are investigated here. We describe and demonstrate how the GEM-IDCT suite can be used extensively within the framework proposed by the EC-FP7 project SENSUM (Framework to integrate Space-based and in-situ sENSing for dynamic vUlnerability and recovery Monitoring). Specifically, applications in the areas of 1) dynamic vulnerability assessment (pre-event) and 2) recovery monitoring and evaluation (post-event) are discussed, along with strategies for using the IDC Tools for these purposes. The results demonstrate the benefits of using advanced technology tools for data capture, especially in a systematic fashion using the taxonomic standards set by GEM. Although originally designed for seismic risk assessment, the IDCT tools clearly have relevance for multi-hazard risk assessment. When combined with a suitable sampling framework and applied to multi-temporal recovery monitoring, data generated from the tools can reveal spatio-temporal patterns in the quality of recovery activities, and resilience trends can be inferred. Lastly, this work draws attention to the use of the IDCT suite as an educational resource for inspiring and training new students and engineers in the field of disaster risk reduction.
NASA Astrophysics Data System (ADS)
Jing, R.; Lin, N.; Emanuel, K.; Vecchi, G. A.; Knutson, T. R.
2017-12-01
A Markov environment-dependent hurricane intensity model (MeHiM) is developed to simulate the climatology of hurricane intensity given the surrounding large-scale environment. The model considers three unobserved discrete states representing, respectively, the storm's slow, moderate, and rapid intensification (and deintensification). Each state is associated with a probability distribution of intensity change. The storm's movement from one state to another, regarded as a Markov chain, is described by a transition probability matrix. The initial state is estimated with a Bayesian approach. All three model components (initial intensity, state transition, and intensity change) depend on environmental variables including potential intensity, vertical wind shear, midlevel relative humidity, and ocean mixing characteristics. This environment-dependent Markov model of hurricane intensity shows a significant improvement over previous statistical models (e.g., linear, nonlinear, and finite mixture models) in estimating the distributions of 6-h and 24-h intensity change, lifetime maximum intensity, and landfall intensity, among other quantities. Here we compare MeHiM with various dynamical models, including a global climate model [the High-Resolution Forecast-Oriented Low Ocean Resolution model (HiFLOR)], a regional hurricane model [the Geophysical Fluid Dynamics Laboratory (GFDL) hurricane model], and a simplified hurricane dynamical model [the Coupled Hurricane Intensity Prediction System (CHIPS)] together with its newly developed fast simulator. The MeHiM, developed from reanalysis data, is applied to estimate the intensity of simulated storms for comparison with the dynamical-model predictions under the current climate. The dependence of hurricanes on the environment under the current and projected future climates in the various models will also be compared statistically.
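A toy sketch of the Markov backbone of such a model: three hidden intensification states, an environment-dependent transition matrix, and a state-conditional distribution of 6-h intensity change. All numbers and functional forms below are placeholders, not MeHiM parameters.

```python
import numpy as np

rng = np.random.default_rng(6)
STATES = ("slow", "moderate", "rapid")

def transition_matrix(shear, pot_intensity):
    """Placeholder environment dependence: high shear favors the slow state,
    high potential intensity favors the rapid state. Rows sum to one."""
    base = np.array([[0.7, 0.2, 0.1],
                     [0.3, 0.5, 0.2],
                     [0.2, 0.3, 0.5]])
    tilt = 0.05 * (pot_intensity / 70.0 - shear / 10.0)
    P = base + np.outer(np.ones(3), [-tilt, 0.0, tilt])
    P = np.clip(P, 0.01, None)
    return P / P.sum(axis=1, keepdims=True)

def simulate_track(n_steps, v0=25.0, shear=8.0, pot_intensity=65.0):
    """Simulate 6-h intensities (kt) with state-conditional intensity changes."""
    dv_params = {"slow": (0.0, 2.0), "moderate": (3.0, 4.0), "rapid": (8.0, 5.0)}
    state, v, track = 0, v0, [v0]
    for _ in range(n_steps):
        P = transition_matrix(shear, pot_intensity)
        state = rng.choice(3, p=P[state])                 # Markov state update
        mu, sd = dv_params[STATES[state]]
        v = min(max(v + rng.normal(mu, sd), 15.0), pot_intensity)
        track.append(v)
    return np.array(track)

print(simulate_track(20))
```

In the full model the transition probabilities, initial state, and intensity-change distributions are all fitted to data and conditioned on the evolving environment along each track, which is what allows the climatological comparisons with the dynamical models described above.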
Simulating the Interactions Among Land Use, Transportation ...
In most transportation studies, computer models that forecast travel behavior statistics for a future year use static projections of the spatial distribution of future population and employment growth as inputs. As a result, they are unable to account for the temporally dynamic and non-linear interactions among transportation, land use, and socioeconomic systems. System dynamics (SD) provides a common framework for modeling the complex interactions among transportation and other related systems. This study uses a SD model to simulate the cascading impacts of a proposed light rail transit (LRT) system in central North Carolina, USA. The Durham-Orange Light Rail Project (D-O LRP) SD model incorporates relationships among the land use, transportation, and economy sectors to simulate the complex feedbacks that give rise to the travel behavior changes forecasted by the region’s transportation model. This paper demonstrates the sensitivity of changes in travel behavior to the proposed LRT system and the assumptions that went into the transportation modeling, and compares those results to the impacts of an alternative fare-free transit system. SD models such as the D-O LRP SD model can complement transportation studies by providing valuable insight into the interdependent community systems that collectively contribute to travel behavior changes. Presented at the 35th International Conference of the System Dynamics Society in Cambridge, MA, July 18th, 2017
Non-commutative methods in quantum mechanics
NASA Astrophysics Data System (ADS)
Millard, Andrew Clive
1997-09-01
Non-commutativity appears in physics almost hand in hand with quantum mechanics. Non-commuting operators corresponding to observables lead to Heisenberg's Uncertainty Principle, which is often used as a prime example of how quantum mechanics transcends 'common sense', while the operators that generate a symmetry group are usually given in terms of their commutation relations. This thesis discusses a number of new developments which go beyond the usual stopping point of non-commuting quantities as matrices with complex elements. Chapter 2 shows how certain generalisations of quantum mechanics, from using complex numbers to using other (often non-commutative) algebras, can still be written as linear systems with symplectic phase flows. Chapter 3 deals with Adler's trace dynamics, a non-linear graded generalisation of Hamiltonian dynamics with supersymmetry applications, where the phase space coordinates are (generally non-commuting) operators, and reports on aspects of a demonstration that the statistical averages of the dynamical variables obey the rules of complex quantum field theory. The last two chapters discuss specific aspects of quaternionic quantum mechanics. Chapter 4 reports a generalised projective representation theory and presents a structure theorem that categorises quaternionic projective representations. Chapter 5 deals with a generalisation of the coherent states formalism and examines how it may be applied to two commonly used groups.
Data mining on long-term barometric data within the ARISE2 project
NASA Astrophysics Data System (ADS)
Hupe, Patrick; Ceranna, Lars; Pilger, Christoph
2016-04-01
The Comprehensive Nuclear-Test-Ban Treaty (CTBT) led to the implementation of an international infrasound array network. The International Monitoring System (IMS) network includes 48 certified stations, each providing data for up to 15 years. As part of work package 3 of the ARISE2 project (Atmospheric dynamics Research InfraStructure in Europe, phase 2), these data sets will be statistically evaluated with regard to atmospheric dynamics. The current study focusses on fluctuations of absolute air pressure. Time series have been analysed for 17 monitoring stations located all over the world between Greenland and Antarctica, spanning the latitudes so as to represent different climate zones and characteristic atmospheric conditions, which enables quantitative comparisons between those regions. The analyses include wavelet power spectra, multi-annual time series of average variances on long-wave scales, and spectral densities used to derive characteristics and identify special events. The evaluations reveal periodicities in the average variances on the 2- to 20-day scale, with a maximum in the winter months and a minimum in summer of the respective hemisphere. This applies mainly to the time series of IMS stations outside the tropics, where the dominance of cyclones and anticyclones changes with the seasons. Furthermore, spectral density analyses illustrate striking signals from several dynamic processes within one day, e.g., the semidiurnal tide.
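A brief sketch of one such spectral-density analysis, here using Welch's method on an hourly absolute-pressure record to bring out the diurnal and semidiurnal tidal lines; the data are synthetic, not IMS station data.

```python
import numpy as np
from scipy.signal import welch

fs = 24.0                                   # samples per day (hourly data)
t = np.arange(0, 365, 1 / fs)               # one year, in days
rng = np.random.default_rng(7)
pressure = (1013.0
            + 0.5 * np.sin(2 * np.pi * 1.0 * t)            # diurnal tide S1
            + 1.0 * np.sin(2 * np.pi * 2.0 * t)            # semidiurnal tide S2
            + np.cumsum(rng.normal(0, 0.05, t.size)))      # slow synoptic drift

freqs, psd = welch(pressure, fs=fs, nperseg=30 * 24)       # 30-day segments
mask = freqs > 1.5
peak = freqs[mask][np.argmax(psd[mask])]                   # expect ~2 cycles/day (S2)
```

The same kind of spectrum, computed per station and season, is what allows the comparison of tidal and synoptic signatures across climate zones described above.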
NASA Astrophysics Data System (ADS)
Laugel, Amélie; Menendez, Melisa; Benoit, Michel; Mattarolo, Giovanni; Mendez, Fernando
2013-04-01
Wave climate forecasting is a major issue for numerous marine and coastal related activities, such as offshore industries, flood risk assessment and wave energy resource evaluation, among others. Generally, there are two main ways to predict the impacts of climate change on the wave climate at regional scale: the dynamical and the statistical downscaling of GCMs (Global Climate Models). In this study, both methods have been applied on the French coast (Atlantic, English Channel and North Sea shoreline) under three climate change scenarios (A1B, A2, B1) simulated with the GCM ARPEGE-CLIMAT, from Météo-France (AR4, IPCC). The aim of the work is to characterise the wave climatology of the 21st century and to compare the statistical and dynamical methods, pointing out the advantages and disadvantages of each approach. The statistical downscaling method proposed by the Environmental Hydraulics Institute of Cantabria (Spain) has been applied (Menendez et al., 2011). At a particular location, the sea-state climate (predictand Y) is defined as a function, Y=f(X), of several atmospheric circulation patterns (predictor X). Assuming these climate associations between predictor and predictand are stationary, the statistical approach has been used to project the future wave conditions with reference to the GCM. The statistical relations between predictor and predictand have been established over 31 years, from 1979 to 2009. The predictor is built as the 3-day-averaged squared sea level pressure gradient from the hourly CFSR database (Climate Forecast System Reanalysis, http://cfs.ncep.noaa.gov/cfsr/). The predictand has been extracted from the 31-year hindcast sea-state database ANEMOC-2, performed with the 3G spectral wave model TOMAWAC (Benoit et al., 1996), developed at EDF R&D LNHE and the Saint-Venant Laboratory for Hydraulics and forced by the CFSR 10 m wind field. Significant wave height, peak period and mean wave direction have been extracted with hourly resolution at 110 coastal locations along the French coast. The model, based on the BAJ parameterization of the source terms (Bidlot et al., 2007), was calibrated against ten years of GlobWave altimeter observations (2000-2009) and validated through deep and shallow water buoy observations. The dynamical downscaling method has been performed with the same numerical wave model TOMAWAC used for building ANEMOC-2. Forecast simulations are forced by the 10 m wind fields of ARPEGE-CLIMAT (A1B, A2, B1) from 2010 to 2100. The model covers the Atlantic Ocean and uses a spatial resolution along the French and European coasts of 10 and 20 km, respectively. The results of the model are stored with a time resolution of one hour. References: Benoit M., Marcos F., and Becq F. (1996). Development of a third generation shallow-water wave model with unstructured spatial meshing. Proc. 25th Int. Conf. on Coastal Eng. (ICCE'1996), Orlando (Florida, USA), pp 465-478. Bidlot J.-R., Janssen P. and Adballa S. (2007). A revised formulation of ocean wave dissipation and its model impact, Technical Memorandum ECMWF No. 509. Menendez, M., Mendez, F.J., Izaguirre, C., Camus, P., Espejo, A., Canovas, V., Minguez, R., Losada, I.J., Medina, R. (2011). Statistical Downscaling of Multivariate Wave Climate Using a Weather Type Approach, 12th International Workshop on Wave Hindcasting and Forecasting and 3rd Coastal Hazard Symposium, Kona (Hawaii).
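A toy version of the weather-type statistical downscaling idea, under assumptions that simplify the cited method: cluster a pressure-gradient predictor into weather types, then regress significant wave height on the predictor within each type. All data below are synthetic placeholders, not the Menendez et al. implementation.

```python
# Toy weather-type statistical downscaling: cluster a sea-level-pressure-gradient
# predictor into weather types, then fit a regression of significant wave height
# (predictand) within each type. Data are synthetic placeholders.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
n_days = 3000
predictor = rng.normal(size=(n_days, 4))            # e.g., squared SLP-gradient features
hs = 2.0 + predictor @ np.array([0.5, 0.2, -0.3, 0.1]) + rng.normal(0, 0.3, n_days)

types = KMeans(n_clusters=5, n_init=10, random_state=0).fit(predictor)
models = {k: LinearRegression().fit(predictor[types.labels_ == k],
                                    hs[types.labels_ == k])
          for k in range(5)}

# Projection step: assign new predictor fields (e.g., from a GCM) to a type, then predict Hs.
new_predictor = rng.normal(size=(10, 4))
new_types = types.predict(new_predictor)
hs_proj = np.array([models[k].predict(x[None, :])[0] for k, x in zip(new_types, new_predictor)])
print(hs_proj.round(2))
```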
NASA Astrophysics Data System (ADS)
Stroeve, J. C.
2014-12-01
The last four decades have seen a remarkable decline in the spatial extent of the Arctic sea ice cover, presenting both challenges and opportunities to Arctic residents, government agencies and industry. After the record low extent in September 2007, efforts have increased to improve seasonal, decadal-scale and longer-term predictions of the sea ice cover. Coupled global climate models (GCMs) consistently project that if greenhouse gas concentrations continue to rise, the eventual outcome will be a complete loss of the multiyear ice cover. However, confidence in these projections depends on the models' ability to reproduce features of the present-day climate. Comparisons between models participating in the World Climate Research Programme Coupled Model Intercomparison Project Phase 5 (CMIP5) and observations of sea ice extent and thickness show that (1) historical trends from 85% of the model ensemble members remain smaller than observed, and (2) spatial patterns of sea ice thickness are poorly represented in most models. Part of the explanation lies with a failure of models to represent details of the mean atmospheric circulation pattern that governs the transport and spatial distribution of sea ice. These results raise concerns regarding the ability of CMIP5 models to realistically represent the processes driving the decline of Arctic sea ice and to project the timing of when a seasonally ice-free Arctic may be realized. On shorter time-scales, seasonal sea ice prediction faces the challenge of predicting the sea ice extent from Arctic conditions a few months to a year in advance. Efforts such as the Sea Ice Outlook (SIO) project, originally organized through the Study of Environmental Arctic Change (SEARCH) and now managed by the Sea Ice Prediction Network project (SIPN), synthesize predictions of the September sea ice extent based on a variety of approaches, including heuristic, statistical and dynamical modeling. Analysis of SIO contributions reveals that when the September sea ice extent is near the long-term trend, contributions tend to be accurate. Years when the observed extent departs from the trend have proven harder to predict. Prediction skill does not appear to be higher for dynamical models than for statistical ones, nor is there a measurable improvement in skill as the summer progresses.
Projections of Education Statistics to 2018. Thirty-Seventh Edition. NCES 2009-062
ERIC Educational Resources Information Center
Hussar, William J.; Bailey, Tabitha M.
2009-01-01
"Projections of Education Statistics to 2018" is the 37th report in a series begun in 1964. It includes statistics on elementary and secondary schools and degree-granting institutions. Included are projections of enrollment, graduates, teachers, and expenditures to the year 2018. This is the first edition of the "Projections of…
A Realistic Experimental Design and Statistical Analysis Project
ERIC Educational Resources Information Center
Muske, Kenneth R.; Myers, John A.
2007-01-01
A realistic applied chemical engineering experimental design and statistical analysis project is documented in this article. This project has been implemented as part of the professional development and applied statistics courses at Villanova University over the past five years. The novel aspects of this project are that the students are given a…
The Cognitive Visualization System with the Dynamic Projection of Multidimensional Data
NASA Astrophysics Data System (ADS)
Gorohov, V.; Vitkovskiy, V.
2008-08-01
The phenomenon of cognitive computer graphics consists in generating on the screen special graphic representations that create visual images in the mind of the human operator. These images appear aesthetically attractive and thereby stimulate the operator's visual imagination, which is closely related to the intuitive mechanisms of thinking. The essence of the cognitive effect is that the operator perceives the moving projection as a pseudo-three-dimensional object characterizing the multidimensional data in multidimensional space. After a thorough qualitative study of the visual aspects of the multidimensional data with the aid of the algorithms listed, it becomes possible, using standard computer graphics algorithms, to colour the individual objects or groups of objects of interest to the user. One can then return to the dynamic rotation of the data in order to check the user's intuitive ideas about clusters and connections in the multidimensional data. The methods of cognitive computer graphics can be developed further in combination with other information technologies, first of all with packages for digital image processing and multidimensional statistical analysis.
Karamzadeh, Razieh; Karimi-Jafari, Mohammad Hossein; Sharifi-Zarchi, Ali; Chitsaz, Hamidreza; Salekdeh, Ghasem Hosseini; Moosavi-Movahedi, Ali Akbar
2017-06-16
The human protein disulfide isomerase (hPDI) is an essential four-domain multifunctional enzyme. As a result of disulfide shuffling in its terminal domains, hPDI exists in two oxidation states with different conformational preferences, which are important for substrate binding and functional activities. Here, we address the redox-dependent conformational dynamics of hPDI through molecular dynamics (MD) simulations. Collective domain motions are identified by principal component analysis of the MD trajectories, and redox-dependent opening-closing structure variations are highlighted on projected free energy landscapes. Then, important structural features that exhibit considerable differences in the dynamics of the redox states are extracted by statistical machine learning methods. Mapping the structural variations to time series of residue interaction networks also provides a holistic representation of the dynamical redox differences. By emphasizing persistent, long-lasting interactions, an approach is proposed that compiles these time series networks into a single dynamic residue interaction network (DRIN). Differential comparison of the DRIN in the oxidized and reduced states reveals chains of residue interactions that represent potential allosteric paths between the catalytic and ligand binding sites of hPDI.
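As an illustration of the principal component analysis step described above, the following is a minimal sketch applied to synthetic trajectory coordinates; frame alignment, mass-weighting, and the actual hPDI trajectories are not included.

```python
# PCA of an MD trajectory: flatten per-frame Cartesian coordinates, remove the
# mean structure, and diagonalize the covariance to get collective motions.
# The trajectory here is random data standing in for aligned protein frames.
import numpy as np

n_frames, n_atoms = 500, 300
traj = np.random.default_rng(2).normal(size=(n_frames, n_atoms, 3))
X = traj.reshape(n_frames, -1)
X -= X.mean(axis=0)                       # center on the average structure

cov = np.cov(X, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)    # returned in ascending order
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

projection = X @ eigvecs[:, :2]           # project frames onto the first two PCs
explained = eigvals[:2] / eigvals.sum()
print("variance explained by PC1, PC2:", explained.round(3), "| projection shape:", projection.shape)
```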
NASA Technical Reports Server (NTRS)
Arthur, Jarvis J., III; Prinzel, Lawrence J., III; Kramer, Lynda J.; Bailey, Randall E.
2006-01-01
A usability study evaluating dynamic tunnel concepts has been completed under the Aviation Safety and Security Program, Synthetic Vision Systems Project. The usability study was conducted in the Visual Imaging Simulator for Transport Aircraft Systems (VISTAS) III simulator in the form of questionnaires and pilot-in-the-loop simulation sessions. Twelve commercial pilots participated in the study to determine their preferences via paired comparisons and subjective rankings regarding the color, line thickness and sensitivity of the dynamic tunnel. The results of the study showed that color was not significant in pilot preference paired comparisons or in pilot rankings. Line thickness was significant for both pilot preference paired comparisons and in pilot rankings. The preferred line/halo thickness combination was a line width of 3 pixels and a halo of 4 pixels. Finally, pilots were asked their preference for the current dynamic tunnel compared to a less sensitive dynamic tunnel. The current dynamic tunnel constantly gives feedback to the pilot with regard to path error while the less sensitive tunnel only changes as the path error approaches the edges of the tunnel. The tunnel sensitivity comparison results were not statistically significant.
Effects of future climate conditions on terrestrial export from coastal southern California
NASA Astrophysics Data System (ADS)
Feng, D.; Zhao, Y.; Raoufi, R.; Beighley, E.; Melack, J. M.
2015-12-01
The Santa Barbara Coastal - Long Term Ecological Research Project (SBC-LTER) is focused on investigating the relative importance of land and ocean processes in structuring giant kelp forest ecosystems. Understanding how current and future climate conditions influence terrestrial export is a central theme for the project. Here we combine the Hillslope River Routing (HRR) model and daily precipitation and temperature downscaled using statistical downscaling based on Localized Constructed Analogs (LOCA) to estimate recent streamflow dynamics (2000 to 2014) and future conditions (2015 to 2100). The HRR model covers the SBC-LTER watersheds from just west of the Ventura River to Point Conception: a land area of roughly 800 km² with 179 watersheds ranging from 0.1 to 123 km². The downscaled climate conditions have a spatial resolution of 6 km by 6 km. Here, we use the Penman-Monteith method with the Food and Agriculture Organization of the United Nations (FAO) limited climate data approximations and land surface conditions (albedo, leaf area index, land cover) measured from NASA's Moderate Resolution Imaging Spectroradiometer (MODIS) on the Terra and Aqua satellites to estimate potential evapotranspiration (PET). The HRR model is calibrated for the period 2000 to 2014 using USGS and LTER streamflow. An automated calibration technique is used. For future climate scenarios, we use mean 8-day land cover conditions. Future streamflow, ET and soil moisture statistics are presented, based on downscaled P and T from ten climate model projections from the Coupled Model Intercomparison Project Phase 5 (CMIP5).
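For reference, a condensed Python sketch of the FAO-56 Penman-Monteith reference evapotranspiration calculation that PET estimates of this kind typically build on; the daily input values below are illustrative, and the FAO-56 sub-calculations for net radiation and humidity from limited data are not reproduced.

```python
# FAO-56 Penman-Monteith reference evapotranspiration (ET0, mm/day) in condensed
# form. Inputs are illustrative daily values; deriving Rn, ea, etc. from limited
# climate data follows additional FAO-56 approximations not shown here.
import math

def et0_fao56(t_mean_c, rn, g, u2, ea, elevation_m):
    """t_mean_c: air temperature [C]; rn: net radiation [MJ m-2 d-1];
    g: soil heat flux [MJ m-2 d-1]; u2: wind at 2 m [m s-1];
    ea: actual vapour pressure [kPa]; elevation_m: station elevation [m]."""
    es = 0.6108 * math.exp(17.27 * t_mean_c / (t_mean_c + 237.3))      # saturation vapour pressure [kPa]
    delta = 4098.0 * es / (t_mean_c + 237.3) ** 2                      # slope of the vapour pressure curve
    pressure = 101.3 * ((293.0 - 0.0065 * elevation_m) / 293.0) ** 5.26
    gamma = 0.000665 * pressure                                        # psychrometric constant [kPa/C]
    num = 0.408 * delta * (rn - g) + gamma * (900.0 / (t_mean_c + 273.0)) * u2 * (es - ea)
    return num / (delta + gamma * (1.0 + 0.34 * u2))

print(round(et0_fao56(t_mean_c=20.0, rn=15.0, g=0.0, u2=2.0, ea=1.4, elevation_m=100.0), 2), "mm/day")
```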
NASA Technical Reports Server (NTRS)
Haefner, L. E.
1975-01-01
Mathematical and philosophical approaches are presented for the evaluation and implementation of ground and air transportation systems. Basic decision processes that are used for cost analyses and planning (i.e., statistical decision theory, linear and dynamic programming, optimization, game theory) are examined. The effects that a transportation system may have on the environment and the community are discussed and modelled. Algorithmic structures are examined and selected bibliographic annotations are included. Transportation dynamic models were developed. Citizen participation in transportation projects (i.e., in Maryland and Massachusetts) is discussed. The relevance of the modelling and evaluation approaches to air transportation (i.e., airport planning) is examined in a case study in St. Louis, Missouri.
Communication Dynamics of Blog Networks
NASA Astrophysics Data System (ADS)
Goldberg, Mark; Kelley, Stephen; Magdon-Ismail, Malik; Mertsalov, Konstantin; Wallace, William (Al)
We study the communication dynamics of Blog networks, focusing on the Russian section of LiveJournal as a case study. Communication (blogger-to-blogger links) in such online communication networks is very dynamic: over 60% of the links in the network are new from one week to the next, though the set of bloggers remains approximately constant. Two fundamental questions are: (i) what models adequately describe such dynamic communication behavior; and (ii) how does one detect the phase transitions, i.e. the changes that go beyond the standard high-level dynamics? We approach these questions through the notion of stable statistics. We give strong experimental evidence to the fact that, despite the extreme amount of communication dynamics, several aggregate statistics are remarkably stable. We use stable statistics to test our models of communication dynamics postulating that any good model should produce values for these statistics which are both stable and close to the observed ones. Stable statistics can also be used to identify phase transitions, since any change in a normally stable statistic indicates a substantial change in the nature of the communication dynamics. We describe models of the communication dynamics in large social networks based on the principle of locality of communication: a node's communication energy is spent mostly within its own "social area," the locality of the node.
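A toy illustration of the stable-statistics idea under stated assumptions (independent random weekly graphs stand in for the LiveJournal snapshots): most links turn over each week while aggregate statistics barely move.

```python
# Sketch of the "stable statistics" idea: even when most edges turn over from one
# weekly snapshot to the next, aggregate statistics (here, mean degree and
# clustering) stay nearly constant. Snapshots are random graphs used as
# placeholders for weekly blogger-to-blogger link networks.
import networkx as nx

snapshots = [nx.gnm_random_graph(2000, 12000, seed=week) for week in range(8)]

for week in range(1, len(snapshots)):
    prev, curr = snapshots[week - 1], snapshots[week]
    new_links = sum(1 for e in curr.edges() if not prev.has_edge(*e)) / curr.number_of_edges()
    mean_deg = 2.0 * curr.number_of_edges() / curr.number_of_nodes()
    print(f"week {week}: {new_links:.0%} new links, "
          f"mean degree {mean_deg:.2f}, clustering {nx.average_clustering(curr):.4f}")
```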
The Impact of Climate Projection Method on the Analysis of Climate Change in Semi-arid Basins
NASA Astrophysics Data System (ADS)
Halper, E.; Shamir, E.
2016-12-01
In small basins with arid climates, rainfall characteristics are highly variable and stream flow is tightly coupled with the nuances of rainfall events (e.g., hourly precipitation patterns). Climate change assessments in these basins typically employ CMIP5 projections downscaled with Bias Corrected Statistical Downscaling and Bias Correction/Constructed Analogs (BCSD-BCCA) methods, but these products have drawbacks. Specifically, these projections do not explicitly account for localized physical precipitation mechanisms (e.g., monsoon and snowfall) that are essential to many hydrological systems in the U.S. Southwest. An investigation of the impact of different types of precipitation projections for two kinds of hydrologic studies is being conducted under the U.S. Bureau of Reclamation's Science and Technology Grant Program. An innovative modeling framework consisting of a weather generator of likely hourly precipitation scenarios, coupled with rainfall-runoff, river routing and groundwater models, has been developed for the Nogales, Arizona area. This framework can simulate the impact of future climate on municipal water operations. It allows a rigorous comparison of the BCSD-BCCA methods with alternative approaches, including rainfall output from dynamically downscaled Regional Climate Models (RCMs), a stochastic rainfall generator forced by either Global Climate Models (GCMs) or RCMs, and projections using historical records conditioned on either GCMs or RCMs. The results will provide guidance on incorporating climate change projections into hydrologic studies of semi-arid areas. The project extends this comparison to analyses of flood control. Large flows on the Bill Williams River are a concern for the operation of dams along the Lower Colorado River. After adapting the weather generator for this region, we will evaluate the model performance for rainfall and stream flow, with emphasis on statistical features important to the specific needs of flood management. The end product of the research is a test to guide the selection of a precipitation projection method (including the downscaling procedure) for a given region and objective.
ERIC Educational Resources Information Center
Phelps, Amy L.; Dostilio, Lina
2008-01-01
The present study addresses the efficacy of using service-learning methods to meet the GAISE guidelines (http://www.amstat.org/education/gaise/GAISECollege.htm) in a second business statistics course and further explores potential advantages of assigning a service-learning (SL) project as compared to the traditional statistics project assignment.…
Feng, Cun-Fang; Xu, Xin-Jian; Wang, Sheng-Jun; Wang, Ying-Hai
2008-06-01
We study projective-anticipating, projective, and projective-lag synchronization of time-delayed chaotic systems on random networks. We relax some limitations of previous work, where projective-anticipating and projective-lag synchronization can be achieved only on two coupled chaotic systems. In this paper, we realize projective-anticipating and projective-lag synchronization on complex dynamical networks composed of a large number of interconnected components. At the same time, although previous work studied projective synchronization on complex dynamical networks, the dynamics of the nodes are coupled partially linear chaotic systems. In this paper, the dynamics of the nodes of the complex networks are time-delayed chaotic systems without the limitation of the partial linearity. Based on the Lyapunov stability theory, we suggest a generic method to achieve the projective-anticipating, projective, and projective-lag synchronization of time-delayed chaotic systems on random dynamical networks, and we find both its existence and sufficient stability conditions. The validity of the proposed method is demonstrated and verified by examining specific examples using Ikeda and Mackey-Glass systems on Erdos-Renyi networks.
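As a concrete illustration of one of the node dynamics mentioned above, here is a minimal fixed-step integration of the Mackey-Glass delay system with a constant history; network coupling and the synchronization scheme itself are not included.

```python
# Simple Euler integration of the Mackey-Glass time-delayed system
#   dx/dt = beta * x(t - tau) / (1 + x(t - tau)**n) - gamma * x(t),
# one of the node dynamics used in studies like the one above. Parameters are
# the commonly used chaotic set; coupling to a network is not included here.
import numpy as np

beta, gamma, n, tau = 0.2, 0.1, 10, 17.0
dt, t_max = 0.1, 500.0
delay_steps = int(tau / dt)

steps = int(t_max / dt)
x = np.empty(steps + delay_steps)
x[:delay_steps] = 1.2          # constant history on [-tau, 0]

for i in range(delay_steps, steps + delay_steps - 1):
    x_delayed = x[i - delay_steps]
    dxdt = beta * x_delayed / (1.0 + x_delayed ** n) - gamma * x[i]
    x[i + 1] = x[i] + dt * dxdt

print("last few values:", x[-5:].round(4))
```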
SU-F-I-10: Spatially Local Statistics for Adaptive Image Filtering
DOE Office of Scientific and Technical Information (OSTI.GOV)
Iliopoulos, AS; Sun, X; Floros, D
Purpose: To facilitate adaptive image filtering operations, addressing spatial variations in both noise and signal. Such issues are prevalent in cone-beam projections, where physical effects such as X-ray scattering result in spatially variant noise, violating common assumptions of homogeneous noise and challenging conventional filtering approaches to signal extraction and noise suppression. Methods: We present a computational mechanism for probing into and quantifying the spatial variance of noise throughout an image. The mechanism builds a pyramid of local statistics at multiple spatial scales; local statistical information at each scale includes (weighted) mean, median, standard deviation, median absolute deviation, as well as histogram or dynamic range after local mean/median shifting. Based on inter-scale differences of local statistics, the spatial scope of distinguishable noise variation is detected in a semi- or un-supervised manner. Additionally, we propose and demonstrate the incorporation of such information in globally parametrized (i.e., non-adaptive) filters, effectively transforming the latter into spatially adaptive filters. The multi-scale mechanism is materialized by efficient algorithms and implemented in parallel CPU/GPU architectures. Results: We demonstrate the impact of local statistics for adaptive image processing and analysis using cone-beam projections of a Catphan phantom, fitted within an annulus to increase X-ray scattering. The effective spatial scope of local statistics calculations is shown to vary throughout the image domain, necessitating multi-scale noise and signal structure analysis. Filtering results with and without spatial filter adaptation are compared visually, illustrating improvements in imaging signal extraction and noise suppression, and in preserving information in low-contrast regions. Conclusion: Local image statistics can be incorporated in filtering operations to equip them with spatial adaptivity to spatial signal/noise variations. An efficient multi-scale computational mechanism is developed to curtail processing latency. Spatially adaptive filtering may impact subsequent processing tasks such as reconstruction and numerical gradient computations for deformable registration. NIH Grant No. R01-184173.
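A simplified sketch of the multi-scale local-statistics idea on a synthetic 2-D image (not the actual cone-beam pipeline or its GPU implementation): local mean and standard deviation are computed at several window sizes, and their variation across the image exposes spatially variant noise.

```python
# Sketch of a multi-scale pyramid of local statistics for a 2-D projection image:
# local mean and standard deviation at several window sizes, whose variation
# across the image can flag spatially varying noise. The image is synthetic;
# the cone-beam processing described above is far more elaborate.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(3)
image = rng.normal(100.0, 5.0, size=(256, 256))
image[:, 128:] += rng.normal(0.0, 20.0, size=(256, 128))   # spatially variant noise

pyramid = {}
for window in (3, 9, 27):
    local_mean = ndimage.uniform_filter(image, size=window)
    local_sq = ndimage.uniform_filter(image ** 2, size=window)
    local_std = np.sqrt(np.clip(local_sq - local_mean ** 2, 0.0, None))
    pyramid[window] = (local_mean, local_std)
    print(f"window {window:2d}: median local std, left half "
          f"{np.median(local_std[:, :128]):5.1f}, right half {np.median(local_std[:, 128:]):5.1f}")
```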
NASA Astrophysics Data System (ADS)
Lee, Seungjoon; Kevrekidis, Ioannis G.; Karniadakis, George Em
2017-09-01
Exascale-level simulations require fault-resilient algorithms that are robust against repeated and expected software and/or hardware failures during computations, which may render the simulation results unsatisfactory. If each processor can share some global information about the simulation from a coarse, limited accuracy but relatively costless auxiliary simulator, we can effectively fill in the missing spatial data at the required times, on the fly, by a statistical learning technique - multi-level Gaussian process regression; this has been demonstrated in previous work [1]. Based on the previous work, we also employ another (nonlinear) statistical learning technique, Diffusion Maps, that detects computational redundancy in time and hence accelerates the simulation by projective time integration, giving the overall computation a "patch dynamics" flavor. Furthermore, we are now able to perform information fusion with multi-fidelity and heterogeneous data (including stochastic data). Finally, we set the foundations of a new framework in CFD, called patch simulation, that combines information fusion techniques from, in principle, multiple fidelity and resolution simulations (and even experiments) with a new adaptive timestep refinement technique. We present two benchmark problems (the heat equation and the Navier-Stokes equations) to demonstrate the new capability that statistical learning tools can bring to traditional scientific computing algorithms. For each problem, we rely on heterogeneous and multi-fidelity data, either from a coarse simulation of the same equation or from a stochastic, particle-based, more "microscopic" simulation. We consider, as such "auxiliary" models, a Monte Carlo random walk for the heat equation and a dissipative particle dynamics (DPD) model for the Navier-Stokes equations. More broadly, in this paper we demonstrate the symbiotic and synergistic combination of statistical learning, domain decomposition, and scientific computing in exascale simulations.
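A bare-bones sketch of the Diffusion Maps ingredient mentioned above, applied to synthetic snapshot data; the multi-level Gaussian process regression, domain decomposition, and CFD coupling of the actual framework are not included.

```python
# Bare-bones diffusion maps: Gaussian kernel on pairwise distances, row
# normalization to a Markov matrix, and eigendecomposition; the leading
# nontrivial eigenvectors give low-dimensional diffusion coordinates that can
# expose redundancy in a collection of simulation snapshots. Data are synthetic.
import numpy as np

rng = np.random.default_rng(4)
theta = rng.uniform(0, 2 * np.pi, 300)
snapshots = np.c_[np.cos(theta), np.sin(theta)] + rng.normal(0, 0.05, (300, 2))

d2 = ((snapshots[:, None, :] - snapshots[None, :, :]) ** 2).sum(-1)
eps = np.median(d2)                       # kernel bandwidth heuristic
K = np.exp(-d2 / eps)
P = K / K.sum(axis=1, keepdims=True)      # row-stochastic Markov matrix

eigvals, eigvecs = np.linalg.eig(P)
order = np.argsort(-eigvals.real)
diff_coords = eigvecs.real[:, order[1:3]] * eigvals.real[order[1:3]]
print("leading nontrivial eigenvalues:", eigvals.real[order[1:3]].round(3),
      "| diffusion coordinates shape:", diff_coords.shape)
```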
Quantum walks: The first detected passage time problem
NASA Astrophysics Data System (ADS)
Friedman, H.; Kessler, D. A.; Barkai, E.
2017-03-01
Even after decades of research, the problem of first passage time statistics for quantum dynamics remains a challenging topic of fundamental and practical importance. Using a projective measurement approach, with a sampling time τ, we obtain the statistics of first detection events for quantum dynamics on a lattice, with the detector located at the origin. A quantum renewal equation for a first detection wave function, in terms of which the first detection probability can be calculated, is derived. This formula gives the relation between first detection statistics and the solution of the corresponding Schrödinger equation in the absence of measurement. We illustrate our results with tight-binding quantum walk models. We examine a closed system, i.e., a ring, and reveal the intricate influence of the sampling time τ on the statistics of detection, discussing the quantum Zeno effect, half dark states, revivals, and optimal detection. The initial condition modifies the statistics of a quantum walk on a finite ring in surprising ways. In some cases, the average detection time is independent of the sampling time while in others the average exhibits multiple divergences as the sampling time is modified. For an unbounded one-dimensional quantum walk, the probability of first detection decays like (time)^(-3) with superimposed oscillations, with exceptional behavior when the sampling period τ times the tunneling rate γ is a multiple of π/2. The amplitude of the power-law decay is suppressed as τ → 0 due to the Zeno effect. Our work, an extended version of our previously published paper, predicts rich physical behaviors compared with classical Brownian motion, for which the first passage probability density decays monotonically like (time)^(-3/2), as elucidated by Schrödinger in 1915.
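The stroboscopic measurement protocol described above can be sketched numerically for a small tight-binding ring; lattice size, sampling time, and site indices below are illustrative choices, not those analyzed in the paper.

```python
# First-detection statistics for a tight-binding quantum walk on a ring:
# evolve freely for a sampling time tau, project onto the detector site, and
# repeat, recording the detection probability at each attempt.
import numpy as np
from scipy.linalg import expm

L, gamma, tau = 6, 1.0, 0.25
H = np.zeros((L, L), dtype=complex)
for j in range(L):                       # nearest-neighbour hopping on a ring
    H[j, (j + 1) % L] = H[(j + 1) % L, j] = -gamma

U = expm(-1j * H * tau)                  # free evolution between measurements
detector = 0
psi = np.zeros(L, dtype=complex)
psi[3] = 1.0                             # initial site opposite the detector

first_detection = []
for attempt in range(200):
    psi = U @ psi
    p_n = abs(psi[detector]) ** 2        # unconditional first-detection probability at this attempt
    first_detection.append(p_n)
    psi[detector] = 0.0                  # projective null measurement removes the detected amplitude
    # the surviving state is deliberately not renormalized, so p_n stay unconditional

print("total detection probability after 200 attempts:", round(sum(first_detection), 4))
```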
DOE Office of Scientific and Technical Information (OSTI.GOV)
Häggström, Ida, E-mail: haeggsti@mskcc.org; Beattie, Bradley J.; Schmidtlein, C. Ross
2016-06-15
Purpose: To develop and evaluate a fast and simple tool called dPETSTEP (Dynamic PET Simulator of Tracers via Emission Projection), for dynamic PET simulations as an alternative to Monte Carlo (MC), useful for educational purposes and evaluation of the effects of the clinical environment, postprocessing choices, etc., on dynamic and parametric images. Methods: The tool was developed in MATLAB using both new and previously reported modules of PETSTEP (PET Simulator of Tracers via Emission Projection). Time activity curves are generated for each voxel of the input parametric image, whereby effects of imaging system blurring, counting noise, scatters, randoms, and attenuation are simulated for each frame. Each frame is then reconstructed into images according to the user specified method, settings, and corrections. Reconstructed images were compared to MC data, and simple Gaussian noised time activity curves (GAUSS). Results: dPETSTEP was 8000 times faster than MC. Dynamic images from dPETSTEP had a root mean square error that was within 4% on average of that of MC images, whereas the GAUSS images were within 11%. The average bias in dPETSTEP and MC images was the same, while GAUSS differed by 3% points. Noise profiles in dPETSTEP images conformed well to MC images, confirmed visually by scatter plot histograms, and statistically by tumor region of interest histogram comparisons that showed no significant differences (p < 0.01). Compared to GAUSS, dPETSTEP images and noise properties agreed better with MC. Conclusions: The authors have developed a fast and easy one-stop solution for simulations of dynamic PET and parametric images, and demonstrated that it generates both images and subsequent parametric images with very similar noise properties to those of MC images, in a fraction of the time. They believe dPETSTEP to be very useful for generating fast, simple, and realistic results, however since it uses simple scatter and random models it may not be suitable for studies investigating these phenomena. dPETSTEP can be downloaded free of cost from https://github.com/CRossSchmidtlein/dPETSTEP.
Regionalisation of statistical model outputs creating gridded data sets for Germany
NASA Astrophysics Data System (ADS)
Höpp, Simona Andrea; Rauthe, Monika; Deutschländer, Thomas
2016-04-01
The goal of the German research program ReKliEs-De (regional climate projection ensembles for Germany, http://.reklies.hlug.de) is to distribute robust information about the range and the extremes of future climate for Germany and its neighbouring river catchment areas. This joint research project is supported by the German Federal Ministry of Education and Research (BMBF) and was initiated by the German Federal States. The project results are meant to support the development of adaptation strategies to mitigate the impacts of future climate change. The aim of our part of the project is to adapt and transfer the regionalisation methods of the gridded hydrological data set (HYRAS) from daily station data to the station-based statistical regional climate model output of WETTREG (a regionalisation method based on weather patterns). The WETTREG model output covers the period of 1951 to 2100 with a daily temporal resolution. For this, we generate a gridded data set of the WETTREG output for precipitation, air temperature and relative humidity with a spatial resolution of 12.5 km x 12.5 km, which is common for regional climate models. Thus, this regionalisation allows comparing statistical to dynamical climate model outputs. The HYRAS data set was developed by the German Meteorological Service within the German research program KLIWAS (www.kliwas.de) and consists of daily gridded data for Germany and its neighbouring river catchment areas. It has a spatial resolution of 5 km x 5 km for the entire domain for the hydro-meteorological elements precipitation, air temperature and relative humidity and covers the period of 1951 to 2006. After conservative remapping, the HYRAS data set is also suitable for the validation of climate models. The presentation consists of two parts, reflecting the current state of the adaptation of the HYRAS regionalisation methods to the statistical regional climate model WETTREG: first, an overview of the HYRAS data set and the regionalisation methods for precipitation (the REGNIE method, based on a combination of multiple linear regression with 5 predictors and inverse distance weighting), air temperature and relative humidity (optimal interpolation) will be given. Finally, results of the regionalisation of the WETTREG model output will be shown.
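As an illustration of the inverse-distance-weighting ingredient of a REGNIE-type regionalisation, here is a minimal sketch with synthetic stations and a coarse grid; the multiple-linear-regression background field and the actual HYRAS/WETTREG processing are not included.

```python
# Inverse-distance-weighting step of a REGNIE-like regionalisation: interpolate
# station anomalies onto a regular grid. Coordinates, anomalies, and the grid
# are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(5)
station_xy = rng.uniform(0, 100, size=(40, 2))       # station coordinates [km]
station_anom = rng.normal(0, 1.0, size=40)           # e.g., precipitation anomalies

gx, gy = np.meshgrid(np.arange(0, 100, 12.5), np.arange(0, 100, 12.5))
grid_xy = np.c_[gx.ravel(), gy.ravel()]

d = np.linalg.norm(grid_xy[:, None, :] - station_xy[None, :, :], axis=-1)
w = 1.0 / np.maximum(d, 1e-6) ** 2                   # inverse squared distance weights
grid_anom = (w * station_anom).sum(axis=1) / w.sum(axis=1)

print(grid_anom.reshape(gx.shape).round(2))
```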
On-Line Analysis of Physiologic and Neurobehavioral Variables During Long-Duration Space Missions
NASA Technical Reports Server (NTRS)
Brown, Emery N.
1999-01-01
The goal of this project is to develop reliable statistical algorithms for on-line analysis of physiologic and neurobehavioral variables monitored during long-duration space missions. Maintenance of physiologic and neurobehavioral homeostasis during long-duration space missions is crucial for ensuring optimal crew performance. If countermeasures are not applied, alterations in homeostasis will occur in nearly all physiologic systems. During such missions, data from most of these systems will be continually and/or continuously monitored. Therefore, if these data can be analyzed as they are acquired and the status of these systems can be continually assessed, then once alterations are detected, appropriate countermeasures can be applied to correct them. One of the most important physiologic systems in which to maintain homeostasis during long-duration missions is the circadian system. To detect and treat alterations in circadian physiology during long-duration space missions requires the development of: 1) a ground-based protocol to assess the status of the circadian system under the light-dark environment in which crews in space will typically work; and 2) appropriate statistical methods to make this assessment. The protocol in Project 1, Circadian Entrainment, Sleep-Wake Regulation and Neurobehavioral, will study human volunteers under the simulated light-dark environment of long-duration space missions. Therefore, we propose to develop statistical models to characterize in near real time circadian and neurobehavioral physiology under these conditions. The specific aims of this project are to test the hypotheses that: 1) Dynamic statistical methods based on the Kronauer model of the human circadian system can be developed to estimate circadian phase, period, and amplitude from core-temperature data collected under the simulated light-dark conditions of long-duration space missions. 2) Analytic formulae and numerical algorithms can be developed to compute the error in the estimates of circadian phase, period and amplitude determined from the data in Specific Aim 1. 3) Statistical models can detect reliably in near real-time (daily) significant alterations in the circadian physiology of individual subjects by analyzing the circadian and neurobehavioral data collected in Project 1. 4) Criteria can be developed using the Kronauer model and the recently developed Jewett model of cognitive performance and subjective alertness to define altered circadian and neurobehavioral physiology and to set conditions for immediate administration of countermeasures.
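As a greatly simplified stand-in for the proposed estimators, the following cosinor-style harmonic regression recovers circadian mesor, amplitude, and phase from a synthetic core-temperature record; the Kronauer-model-based dynamic methods described above are far more elaborate.

```python
# Simplified stand-in for circadian state estimation: cosinor-style harmonic
# regression of core body temperature on a 24-h sinusoid to estimate mesor,
# amplitude, and phase. Data are synthetic.
import numpy as np

rng = np.random.default_rng(6)
t_hours = np.arange(0, 72, 0.25)                      # 3 days, 15-min samples
true_phase = 4.5                                      # hour of the temperature minimum
temp = 37.0 - 0.4 * np.cos(2 * np.pi * (t_hours - true_phase) / 24.0) \
       + rng.normal(0, 0.05, t_hours.size)

omega = 2 * np.pi / 24.0
X = np.c_[np.ones_like(t_hours), np.cos(omega * t_hours), np.sin(omega * t_hours)]
mesor, a, b = np.linalg.lstsq(X, temp, rcond=None)[0]

amplitude = np.hypot(a, b)
phase_hours = (np.arctan2(-b, -a) / omega) % 24.0     # time of the fitted temperature minimum
print(f"mesor {mesor:.2f} C, amplitude {amplitude:.2f} C, minimum near {phase_hours:.1f} h")
```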
An ex ante control chart for project monitoring using earned duration management observations
NASA Astrophysics Data System (ADS)
Mortaji, Seyed Taha Hossein; Noori, Siamak; Noorossana, Rassoul; Bagherpour, Morteza
2017-12-01
In the past few years, there has been an increasing interest in developing project control systems. The primary purpose of such systems is to indicate whether the actual performance is consistent with the baseline and to produce a signal in the case of non-compliance. Recently, researchers have shown an increased interest in monitoring a project's performance indicators by plotting them on Shewhart-type control charts over time. However, these control charts are fundamentally designed for processes and ignore project-specific dynamics, which can lead to weak results and misleading interpretations. By paying close attention to the project baseline schedule and using statistical foundations, this paper proposes a new ex ante control chart which discriminates between acceptable (as-planned) and non-acceptable (not-as-planned) variations of the project's schedule performance. Such a control chart enables project managers to set more realistic thresholds, leading to better decision making about corrective and/or preventive actions. For the sake of clarity, an illustrative example is presented to show how the ex ante control chart is constructed in practice. Furthermore, an experimental investigation has been set up to analyze the performance of the proposed control chart. As expected, the results confirm that, when a project starts to deviate significantly from its baseline schedule, the ex ante control chart shows a respectable ability to detect and report correct signals while avoiding false alarms.
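One rough way to picture the ex ante idea, under simple assumed distributions that are not those of the paper, is to simulate as-planned variability of a schedule index before execution and take its percentiles as period-by-period control limits.

```python
# Rough sketch of the ex ante idea under assumed distributions (not the
# construction proposed in the paper): before execution, simulate many
# "as planned" realizations of a cumulative schedule performance index and use
# its percentiles at each tracking period as period-specific control limits.
import numpy as np

rng = np.random.default_rng(7)
n_periods, n_sims = 20, 5000
planned = np.full(n_periods, 10.0)                    # planned duration earned per period
earned = rng.normal(loc=planned, scale=1.0, size=(n_sims, n_periods))
index = earned.cumsum(axis=1) / planned.cumsum()      # cumulative earned-vs-planned index

lower = np.percentile(index, 2.5, axis=0)             # ex ante (pre-execution) limits
upper = np.percentile(index, 97.5, axis=0)

observed = rng.normal(loc=9.3, scale=1.0, size=n_periods).cumsum() / planned.cumsum()
signals = np.flatnonzero((observed < lower) | (observed > upper)) + 1
print("out-of-control signals at periods:", signals)
```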
Grotjahn, Richard; Black, Robert; Leung, Ruby; ...
2015-05-22
This paper reviews research approaches and open questions regarding data, statistical analyses, dynamics, modeling efforts, and trends in relation to temperature extremes. Our specific focus is upon extreme events of short duration (roughly less than 5 days) that affect parts of North America. These events are associated with large scale meteorological patterns (LSMPs). Methods used to define extreme event statistics and to identify and connect LSMPs to extreme temperatures are presented. Recent advances in statistical techniques can connect LSMPs to extreme temperatures through appropriately defined covariates that supplement more straightforward analyses. A wide array of LSMPs, ranging from synoptic to planetary scale phenomena, have been implicated as contributors to extreme temperature events. Current knowledge about the physical nature of these contributions and the dynamical mechanisms leading to the implicated LSMPs is incomplete. There is a pressing need for (a) systematic study of the physics of LSMP life cycles and (b) comprehensive model assessment of LSMP-extreme temperature event linkages and LSMP behavior. Generally, climate models capture the observed heat waves and cold air outbreaks with some fidelity. However, they overestimate warm wave frequency and underestimate cold air outbreak frequency, and underestimate the collective influence of low-frequency modes on temperature extremes. Climate models have been used to investigate past changes and project future trends in extreme temperatures. Overall, modeling studies have identified important mechanisms such as the effects of large-scale circulation anomalies and land-atmosphere interactions on changes in extreme temperatures. However, few studies have examined changes in LSMPs more specifically to understand the role of LSMPs on past and future extreme temperature changes. Even though LSMPs are resolvable by global and regional climate models, they are not necessarily well simulated, so more research is needed to understand the limitations of climate models and improve model skill in simulating extreme temperatures and their associated LSMPs. The paper concludes with unresolved issues and research questions.
NASA Astrophysics Data System (ADS)
Karakatsanis, L. P.; Pavlos, G. P.; Iliopoulos, A. C.; Pavlos, E. G.; Clark, P. M.; Duke, J. L.; Monos, D. S.
2018-09-01
This study combines two independent domains of science, the high throughput DNA sequencing capabilities of Genomics and complexity theory from Physics, to assess the information encoded by the different genomic segments of exonic, intronic and intergenic regions of the Major Histocompatibility Complex (MHC) and identify possible interactive relationships. The dynamic and non-extensive statistical characteristics of two well characterized MHC sequences from the homozygous cell lines, PGF and COX, in addition to two other genomic regions of comparable size, used as controls, have been studied using the reconstructed phase space theorem and the non-extensive statistical theory of Tsallis. The results reveal similar non-linear dynamical behavior as far as complexity and self-organization features are concerned. In particular, the low-dimensional deterministic nonlinear chaotic and non-extensive statistical character of the DNA sequences was verified, with strong multifractal characteristics and long-range correlations. The nonlinear indices repeatedly verified that MHC sequences, whether exonic, intronic or intergenic, include varying levels of information and reveal an interaction of the genes with intergenic regions, whereby the lower the number of genes in a region, the less the complexity and information content of the intergenic region. Finally, we show the significance of the intergenic regions in shaping the DNA dynamics. The findings reveal interesting information content in all three genomic elements and interactive relationships of the genes with the intergenic regions. The results are most likely relevant to the whole genome and not only to the MHC. These findings are consistent with the ENCODE project, which has now established that the non-coding regions of the genome remain relevant, as they are functionally important and play a significant role in the regulation of expression of genes and coordination of the many biological processes of the cell.
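The phase-space reconstruction step invoked above can be illustrated with a standard time-delay (Takens) embedding; the scalar series here is a chaotic logistic map standing in for a numerically encoded genomic sequence.

```python
# Time-delay (Takens) embedding used in phase-space reconstruction: map a scalar
# series x(t) to vectors [x(t), x(t+d), ..., x(t+(m-1)d)]. The series here is a
# logistic-map surrogate, not an encoded MHC sequence.
import numpy as np

def delay_embed(x, dim, delay):
    n = len(x) - (dim - 1) * delay
    return np.column_stack([x[i * delay: i * delay + n] for i in range(dim)])

x = np.empty(5000)
x[0] = 0.4
for i in range(1, x.size):                 # chaotic logistic map as a stand-in signal
    x[i] = 4.0 * x[i - 1] * (1.0 - x[i - 1])

embedded = delay_embed(x, dim=3, delay=2)
print("reconstructed phase-space points:", embedded.shape)
```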
A Statistical Project Control Tool for Engineering Managers
NASA Technical Reports Server (NTRS)
Bauch, Garland T.
2001-01-01
This slide presentation reviews the use of a Statistical Project Control Tool (SPCT) for managing engineering projects. A literature review pointed to a definition of project success (i.e., a project is successful when the cost, schedule, technical performance, and quality satisfy the customer). The literature review also pointed to project success factors, traditional project control tools, and performance measures, which are detailed in the report. The essential problem is that, with resources becoming more limited and the number of projects increasing, project failures are becoming more common; existing methods are limited, and more systematic methods are required. The objective of the work is to provide a new statistical project control tool for project managers. Graphs produced with the SPCT method, plotting the results of 3 successful projects and 3 failed projects, are reviewed, with success and failure being defined by the owner.
Inter-model variability in hydrological extremes projections for Amazonian sub-basins
NASA Astrophysics Data System (ADS)
Andres Rodriguez, Daniel; Garofolo, Lucas; Lázaro de Siqueira Júnior, José; Samprogna Mohor, Guilherme; Tomasella, Javier
2014-05-01
Irreducible uncertainties arising from the limits of knowledge, the chaotic nature of the climate system, and the human decision-making process drive uncertainties in climate change projections. Such uncertainties affect impact studies, especially those concerned with extreme events, and hinder decision-making aimed at mitigation and adaptation. At the same time, these uncertainties open the possibility of exploratory analyses of a system's vulnerability to different scenarios. Using projections from several climate models makes it possible to address uncertainty issues, since multiple runs can be used to explore a wide range of potential impacts and their implications for potential vulnerabilities. Statistical approaches for the analysis of extreme values are usually based on stationarity assumptions. However, nonstationarity is relevant at the time scales considered for extreme value analyses and can have great implications in dynamic complex systems, especially under climate change. Because of this, nonstationarity must be considered in the statistical distribution parameters. We carried out a study of the dispersion in hydrological extremes projections, using climate change projections from several climate models to feed the Distributed Hydrological Model of the National Institute for Space Research, MHD-INPE, applied to Amazonian sub-basins. This large-scale hydrological model uses a TopModel approach to solve runoff generation processes at the grid-cell scale. The MHD-INPE model was calibrated for 1970-1990 using observed meteorological data, comparing observed and simulated discharges with several performance coefficients. Hydrological model integrations were performed for the historical period (1970-1990) and for the future period (2010-2100). Because climate models simulate the variability of the climate system in statistical terms rather than reproducing the historical behavior of climate variables, the performance of the model runs during the historical period, when fed with climate model data, was assessed using descriptors of the flow duration curves. The analyses of projected extreme values were carried out considering the nonstationarity of the GEV distribution parameters and compared with extreme events in the present climate. Results show broad inter-model dispersion in the projected extreme values. This dispersion implies different degrees of socio-economic impact associated with extreme hydrological events. Although no single optimal result exists, this variability allows the analysis of adaptation strategies and of their potential vulnerabilities.
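A minimal sketch of a nonstationary GEV fit of the kind described above, with a linear trend in the location parameter and synthetic annual maxima; the hydrological model chain and actual discharge data are not involved.

```python
# Sketch of a nonstationary GEV fit: the location parameter is allowed to drift
# linearly in time, and all parameters are estimated by maximizing the
# log-likelihood with scipy. Annual-maximum discharges here are synthetic.
import numpy as np
from scipy.stats import genextreme
from scipy.optimize import minimize

rng = np.random.default_rng(8)
years = np.arange(90)
true_loc = 1000.0 + 5.0 * years                       # trend in the location parameter
annual_max = genextreme.rvs(c=-0.1, loc=true_loc, scale=200.0, random_state=rng)

def neg_log_lik(params):
    mu0, mu1, log_scale, shape = params
    loc = mu0 + mu1 * years
    return -genextreme.logpdf(annual_max, c=shape, loc=loc, scale=np.exp(log_scale)).sum()

res = minimize(neg_log_lik,
               x0=[annual_max.mean(), 0.0, np.log(annual_max.std()), 0.0],
               method="Nelder-Mead", options={"maxiter": 20000, "maxfev": 20000})
mu0, mu1, log_scale, shape = res.x
print(f"location trend: {mu1:.2f} per year, scale: {np.exp(log_scale):.1f}, shape (scipy c): {shape:.2f}")
```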
Remote sensing for urban planning
NASA Technical Reports Server (NTRS)
Davis, Bruce A.; Schmidt, Nicholas; Jensen, John R.; Cowen, Dave J.; Halls, Joanne; Narumalani, Sunil; Burgess, Bryan
1994-01-01
Utility companies are challenged to provide services to a highly dynamic customer base. With factory closures and shifts in employment becoming a routine occurrence, the utility industry must develop new techniques to maintain records and plan for expected growth. BellSouth Telecommunications, the largest of the Bell telephone companies, currently serves over 13 million residences and 2 million commercial customers. Tracking the movement of customers and scheduling the delivery of service are major tasks for BellSouth that require intensive manpower and sophisticated information management techniques. Through NASA's Commercial Remote Sensing Program Office, BellSouth is investigating the utility of remote sensing and geographic information system techniques to forecast residential development. This paper highlights the initial results of this project, which indicate a high correlation between the U.S. Bureau of Census block group statistics and statistics derived from remote sensing data.
Inverse stochastic-dynamic models for high-resolution Greenland ice core records
NASA Astrophysics Data System (ADS)
Boers, Niklas; Chekroun, Mickael D.; Liu, Honghu; Kondrashov, Dmitri; Rousseau, Denis-Didier; Svensson, Anders; Bigler, Matthias; Ghil, Michael
2017-12-01
Proxy records from Greenland ice cores have been studied for several decades, yet many open questions remain regarding the climate variability encoded therein. Here, we use a Bayesian framework for inferring inverse, stochastic-dynamic models from δ18O and dust records of unprecedented, subdecadal temporal resolution. The records stem from the North Greenland Ice Core Project (NGRIP), and we focus on the time interval 59-22 ka b2k. Our model reproduces the dynamical characteristics of both the δ18O and dust proxy records, including the millennial-scale Dansgaard-Oeschger variability, as well as statistical properties such as probability density functions, waiting times and power spectra, with no need for any external forcing. The crucial ingredients for capturing these properties are (i) high-resolution training data, (ii) cubic drift terms, (iii) nonlinear coupling terms between the δ18O and dust time series, and (iv) non-Markovian contributions that represent short-term memory effects.
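The forward (simulation) side of such an inverse model can be sketched as a coupled stochastic system with cubic drift integrated by Euler-Maruyama; the coefficients below are arbitrary illustrative values, not those inferred from the NGRIP records.

```python
# Euler-Maruyama simulation of a two-variable stochastic model with cubic drift
# and cross-coupling, structurally similar to (but not parameterized as) the
# inverse model described above for the delta-18O and dust series.
import numpy as np

rng = np.random.default_rng(9)
dt, n_steps = 0.01, 100_000
x = np.empty((n_steps, 2))
x[0] = [0.5, -0.5]

a = np.array([1.0, 1.2])        # linear coefficients (illustrative)
c = np.array([1.0, 0.8])        # cubic coefficients (illustrative)
k = 0.3                         # cross-coupling strength
sigma = np.array([0.4, 0.5])    # noise amplitudes

for i in range(n_steps - 1):
    drift = a * x[i] - c * x[i] ** 3 + k * x[i][::-1]     # double-well drift + coupling
    x[i + 1] = x[i] + drift * dt + sigma * np.sqrt(dt) * rng.normal(size=2)

print("fraction of time the first variable is in the positive regime:", (x[:, 0] > 0).mean().round(3))
```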
The statistical distribution of aerosol properties in southern West Africa
NASA Astrophysics Data System (ADS)
Haslett, Sophie; Taylor, Jonathan; Flynn, Michael; Bower, Keith; Dorsey, James; Crawford, Ian; Brito, Joel; Denjean, Cyrielle; Bourrianne, Thierry; Burnet, Frederic; Batenburg, Anneke; Schulz, Christiane; Schneider, Johannes; Borrmann, Stephan; Sauer, Daniel; Duplissy, Jonathan; Lee, James; Vaughan, Adam; Coe, Hugh
2017-04-01
The population and economy in southern West Africa have been growing at an exceptional rate in recent years and this trend is expected to continue, with the population projected to more than double to 800 million by 2050. This will result in a dramatic increase in anthropogenic pollutants, already estimated to have tripled between 1950 and 2000 (Lamarque et al., 2010). It is known that aerosols can modify the radiative properties of clouds. As such, the entrainment of anthropogenic aerosol into the large banks of clouds forming during the onset of the West African Monsoon could have a substantial impact on the region's response to climate change. Such projections, however, are greatly limited by the scarcity of observations in this part of the world. As part of the Dynamics-Aerosol-Chemistry-Cloud Interactions in West Africa (DACCIWA) project, three research aircraft were deployed, each carrying equipment capable of measuring aerosol properties in-situ. Instrumentation included Aerosol Mass Spectrometers (AMS), Single Particle Soot Photometers (SP2), Condensation Particle Counters (CPC) and Scanning Mobility Particle Sizers (SMPS). Throughout the intensive aircraft campaign, 155 hours of scientific flights covered an area including large parts of Benin, Togo, Ghana and parts of Côte D'Ivoire. Approximately 70 hours were dedicated to the measurement of cloud-aerosol interactions, with many other flights producing data contributing towards this objective. Using datasets collected during this campaign period, it is possible to build a robust statistical understanding of aerosol properties in this region for the first time, including size distributions and optical and chemical properties. Here, we describe preliminary results from aerosol measurements on board the three aircraft. These have been used to describe aerosol properties throughout the region and time period encompassed by the DACCIWA aircraft campaign. Such statistics will be invaluable for improving future projections of cloud properties and radiative effects in the region.
Simulation of the National Aerospace System for Safety Analysis
NASA Technical Reports Server (NTRS)
Pritchett, Amy; Goldsman, Dave; Statler, Irv (Technical Monitor)
2002-01-01
Work started on this project on January 1, 1999, the first year of the grant. Following the outline of the grant proposal, a simulator architecture has been established which can incorporate the variety of types of models needed to accurately simulate national airspace dynamics. For the sake of efficiency, this architecture was based on an established single-aircraft flight simulator, the Reconfigurable Flight Simulator (RFS), already developed at Georgia Tech. Likewise, in the first year substantive changes and additions were made to the RFS to convert it into a simulation of the National Airspace System, with the flexibility to incorporate many types of models: aircraft models; controller models; airspace configuration generators; discrete event generators; embedded statistical functions; and display and data outputs. The architecture has been developed with the capability to accept any models of these types; due to its object-oriented structure, individual simulator components can be added and removed during run-time, and can be compiled separately. Simulation objects from other projects should be easy to convert to meet architecture requirements, with the intent that both this project may now be able to incorporate established simulation components from other projects, and that other projects may easily use this simulation without significant time investment.
NASA Astrophysics Data System (ADS)
Sun, F.; Hall, A. D.; Walton, D.; Capps, S. B.; Qu, X.; Huang, H. J.; Berg, N.; Jousse, A.; Schwartz, M.; Nakamura, M.; Cerezo-Mota, R.
2012-12-01
Using a combination of dynamical and statistical downscaling techniques, we projected mid-21st century warming in the Los Angeles region at 2-km resolution. To account for uncertainty associated with the trajectory of future greenhouse gas emissions, we examined projections for both "business-as-usual" (RCP8.5) and "mitigation" (RCP2.6) emissions scenarios from the Fifth Coupled Model Intercomparison Project (CMIP5). To account for the considerable uncertainty associated with choice of global climate model, we downscaled results for all available global climate models in CMIP5. For the business-as-usual scenario, we find that by the mid-21st century, the most likely warming is roughly 2.6°C averaged over the region's land areas, with a 95% confidence that the warming lies between 0.9 and 4.2°C. The high resolution of the projections reveals a pronounced spatial pattern in the warming: High elevations and inland areas separated from the coast by at least one mountain complex warm 20 to 50% more than the areas near the coast or within the Los Angeles basin. This warming pattern is especially apparent in summertime. The summertime warming contrast between the inland and coastal zones has a large effect on the most likely expected number of extremely hot days per year. Coastal locations and areas within the Los Angeles basin see roughly two to three times the number of extremely hot days, while high elevations and inland areas typically experience approximately three to five times the number of extremely hot days. Under the mitigation emissions scenario, the most likely warming and increase in heat extremes are somewhat smaller. However, the majority of the warming seen in the business-as-usual scenario still occurs at all locations in the most likely case under the mitigation scenario, and heat extremes still increase significantly. This warming study is the first part of a series of studies in our project. Further analyses of climate change impacts on the Santa Ana winds, rainfall, snowfall and snowmelt, clouds and surface hydrology are forthcoming and can be found at www.atmos.ucla.edu/csrl. [Figure: the ensemble-mean, annual-mean surface air temperature change and its uncertainty from the available CMIP5 GCMs under the RCP8.5 and RCP2.6 emissions scenarios; unit: °C.]
Using R-Project for Free Statistical Analysis in Extension Research
ERIC Educational Resources Information Center
Mangiafico, Salvatore S.
2013-01-01
One option for Extension professionals wishing to use free statistical software is to use online calculators, which are useful for common, simple analyses. A second option is to use a free computing environment capable of performing statistical analyses, like R-project. R-project is free, cross-platform, powerful, and respected, but may be…
ERIC Educational Resources Information Center
Romeu, Jorge Luis
2008-01-01
This article discusses our teaching approach in graduate level Engineering Statistics. It is based on the use of modern technology, learning groups, contextual projects, simulation models, and statistical and simulation software to entice student motivation. The use of technology to facilitate group projects and presentations, and to generate,…
The Effect of a Student-Designed Data Collection: Project on Attitudes toward Statistics
ERIC Educational Resources Information Center
Carnell, Lisa J.
2008-01-01
Students often enter an introductory statistics class with less than positive attitudes about the subject. They tend to believe statistics is difficult and irrelevant to their lives. Observational evidence from previous studies suggests including projects in a statistics course may enhance students' attitudes toward statistics. This study examines…
Molecular Dynamics of Hot Dense Plasmas: New Horizons
NASA Astrophysics Data System (ADS)
Graziani, Frank
2011-06-01
We describe the status of a new time-dependent simulation capability for hot dense plasmas. The backbone of this multi-institutional computational and experimental effort--the Cimarron Project--is the massively parallel molecular dynamics (MD) code "ddcMD". The project's focus is material conditions such as exist in inertial confinement fusion experiments, and in many stellar interiors: high temperatures, high densities, significant electromagnetic fields, mixtures of high- and low-Z elements, and non-Maxwellian particle distributions. Of particular importance is our ability to incorporate into this classical MD code key atomic, radiative, and nuclear processes, so that their interacting effects under non-ideal plasma conditions can be investigated. This talk summarizes progress in computational methodology, discusses strengths and weaknesses of quantum statistical potentials as effective interactions for MD, explains the model used for quantum events possibly occurring in a collision, and highlights some significant results obtained to date. We will also discuss a new idea called kinetic theory MD, which is now being explored to deal more efficiently with the very disparate dynamical timescales that arise in fusion plasmas. We discuss how this approach can be derived rigorously from the n-body quantum Wigner equation and illustrate the approach with an example. This work is performed under the auspices of the U. S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
A Systematic Analysis of Caustic Methods for Galaxy Cluster Masses
NASA Astrophysics Data System (ADS)
Gifford, Daniel; Miller, Christopher; Kern, Nicholas
2013-08-01
We quantify the expected observed statistical and systematic uncertainties of the escape velocity as a measure of the gravitational potential and total mass of galaxy clusters. We focus our attention on low redshift (z ≤ 0.15) clusters, where large and deep spectroscopic datasets currently exist. Utilizing a suite of Millennium Simulation semi-analytic galaxy catalogs, we find that the dynamical mass, as traced by either the virial relation or the escape velocity, is robust to variations in how dynamical friction is applied to "orphan" galaxies in the mock catalogs (i.e., those galaxies whose dark matter halos have fallen below the resolution limit). We find that the caustic technique recovers the known halo masses (M200) with a third less scatter compared to the virial masses. The bias we measure increases quickly as the number of galaxies used decreases. For Ngal > 25, the scatter in the escape velocity mass is dominated by projections along the line-of-sight. Algorithmic uncertainties from the determination of the projected escape velocity profile are negligible. We quantify how target selection based on magnitude, color, and projected radial separation can induce small additional biases into the escape velocity masses. Using Ngal = 150 (25), the caustic technique has a per-cluster scatter in ln(M|M200) of 0.3 (0.5) and a bias of 1% ± 3% (16% ± 5%) for clusters with masses > 10^14 M⊙ at z < 0.15.
Estimating topological properties of weighted networks from limited information
NASA Astrophysics Data System (ADS)
Gabrielli, Andrea; Cimini, Giulio; Garlaschelli, Diego; Squartini, Angelo
A typical problem met when studying complex systems is the limited information available on their topology, which hinders our understanding of their structural and dynamical properties. A paramount example is provided by financial networks, whose data are privacy protected. Yet, the estimation of systemic risk strongly depends on the detailed structure of the interbank network. The resulting challenge is that of using aggregate information to statistically reconstruct a network and correctly predict its higher-order properties. Standard approaches either generate unrealistically dense networks, or fail to reproduce the observed topology by assigning homogeneous link weights. Here we develop a reconstruction method, based on statistical mechanics concepts, that exploits the empirical link density in a highly non-trivial way. Technically, our approach consists of the preliminary estimation of node degrees from empirical node strengths and link density, followed by a maximum-entropy inference based on a combination of empirical strengths and estimated degrees. Our method is successfully tested on the international trade network and the interbank money market, and represents a valuable tool for gaining insights on privacy-protected or partially accessible systems. Acknowledgement: the ``Growthcom'' ICT-EC project (Grant No. 611272) and the ``Crisislab'' Italian project.
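The two-step procedure described above (degrees estimated from strengths and link density, followed by strength-constrained inference) can be illustrated schematically. The sketch below is not the authors' exact estimator: it assumes a fitness-model form p_ij = z s_i s_j / (1 + z s_i s_j) for the connection probabilities, calibrates z to the observed link density, and then spreads weights over a sampled topology; the input strengths are synthetic.

```python
# Schematic sketch of strength-plus-density network reconstruction (not the authors'
# exact estimator): node strengths s_i and the observed link density fix a fitness
# parameter z, which yields connection probabilities and expected degrees; weights are
# then assigned proportionally to s_i * s_j on the realized links.
import numpy as np
from scipy.optimize import brentq

rng = np.random.default_rng(1)
n = 50
s = rng.lognormal(mean=0.0, sigma=1.0, size=n)      # empirical node strengths (synthetic)
target_density = 0.15                               # empirical link density

iu = np.triu_indices(n, k=1)

def expected_density(z):
    x = z * s[iu[0]] * s[iu[1]]
    return (x / (1.0 + x)).mean()

# Step 1: calibrate z so the model reproduces the observed link density.
z = brentq(lambda z: expected_density(z) - target_density, 1e-12, 1e6)
p = np.zeros((n, n))
p[iu] = z * s[iu[0]] * s[iu[1]] / (1.0 + z * s[iu[0]] * s[iu[1]])
p = p + p.T
expected_degrees = p.sum(axis=1)

# Step 2: sample a sparse binary topology and spread strengths over realized links,
# avoiding the unrealistically dense networks mentioned in the abstract.
A = rng.uniform(size=(n, n)) < p
A = np.triu(A, k=1); A = A + A.T
W = A * np.outer(s, s)
W = W * (s.sum() / W.sum())                         # crude rescaling of the total weight

print(f"calibrated z = {z:.3g}")
print(f"model density = {A[iu].mean():.3f} (target {target_density})")
print(f"mean expected degree = {expected_degrees.mean():.2f}")
```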
Population models for passerine birds: structure, parameterization, and analysis
Noon, B.R.; Sauer, J.R.; McCullough, D.R.; Barrett, R.H.
1992-01-01
Population models have great potential as management tools, as they use information about the life history of a species to summarize estimates of fecundity and survival into a description of population change. Models provide a framework for projecting future populations, determining the effects of management decisions on future population dynamics, evaluating extinction probabilities, and addressing a variety of questions of ecological and evolutionary interest. Even when insufficient information exists to allow complete identification of the model, the modelling procedure is useful because it forces the investigator to consider the life history of the species when determining what parameters should be estimated from field studies and provides a context for evaluating the relative importance of demographic parameters. Models have been little used in the study of the population dynamics of passerine birds because of: (1) widespread misunderstandings of the model structures and parameterizations, (2) a lack of knowledge of life histories of many species, (3) difficulties in obtaining statistically reliable estimates of demographic parameters for most passerine species, and (4) confusion about functional relationships among demographic parameters. As a result, studies of passerine demography are often designed inappropriately and fail to provide essential data. We review appropriate models for passerine bird populations and illustrate their possible uses in evaluating the effects of management or other environmental influences on population dynamics. We identify parameters that must be estimated from field data, briefly review existing statistical methods for obtaining valid estimates, and evaluate the present status of knowledge of these parameters.
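For readers unfamiliar with such models, a stage-structured projection matrix is the simplest concrete example of the framework discussed above. The sketch below uses a hypothetical two-stage passerine with made-up fecundity and survival values; the dominant eigenvalue of the matrix gives the asymptotic population growth rate and its leading eigenvector the stable stage distribution.

```python
# Minimal stage-structured projection model for a hypothetical passerine
# (illustrative parameter values, not estimates for any real species).
import numpy as np

f_juv, f_ad = 1.2, 2.0     # female fledglings per female (first-year vs. adult breeders)
s_juv, s_ad = 0.30, 0.55   # first-year and adult annual survival probabilities

# Two-stage projection matrix: rows/columns = [first-year birds, adults].
A = np.array([[f_juv * s_juv, f_ad * s_juv],
              [s_ad,          s_ad        ]])

eigvals, eigvecs = np.linalg.eig(A)
lam = eigvals.real.max()                       # dominant eigenvalue = asymptotic growth rate
w = np.abs(eigvecs[:, eigvals.real.argmax()])  # stable stage distribution (up to scale)

print(f"lambda = {lam:.3f} ({'increasing' if lam > 1 else 'declining'} population)")
print("stable stage distribution:", w / w.sum())

# Projecting a population vector forward 10 years:
n = np.array([100.0, 200.0])
for _ in range(10):
    n = A @ n
print("population after 10 years:", n.round(1))
```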
Climate change impacts in Zhuoshui watershed, Taiwan
NASA Astrophysics Data System (ADS)
Chao, Yi-Chiung; Liu, Pei-Ling; Cheng, Chao-Tzuen; Li, Hsin-Chi; Wu, Tingyeh; Chen, Wei-Bo; Shih, Hung-Ju
2017-04-01
On average, 5.3 typhoons hit Taiwan per year over the last decade. Typhoon Morakot in 2009, the most severe of them, caused huge damage in Taiwan, including 677 casualties and roughly NT$110 billion (3.3 billion USD) in economic losses. Several studies have documented that typhoon frequency will decrease but typhoon intensity will increase in the western North Pacific region. High-resolution dynamical models are usually preferred for projecting extreme events, because coarse-resolution models cannot simulate intense extremes. Under that consideration, dynamically downscaled climate data were chosen to describe typhoons satisfactorily. One of the aims of the Taiwan Climate Change Projection and Information Platform (TCCIP) is to demonstrate the linkage between climate change data and watershed impact models. The purpose is to understand the disasters induced by extreme rainfall (typhoons) under climate change in watersheds, including landslides, debris flows, channel erosion and deposition, floods, and economic losses. The study applied a dynamical downscaling approach to generate climate-change-projected typhoon events under RCP 8.5, the worst-case scenario. The Transient Rainfall Infiltration and Grid-Based Regional Slope-Stability (TRIGRS) and FLO-2D models were then used to simulate hillslope disaster impacts in the upstream Zhuoshui River. The CCHE1D model was used to evaluate sediment erosion and deposition in the channel, and the FVCOM model was used to assess flood impacts in downstream urban areas. Finally, the total potential loss associated with these typhoon events under the climate change scenario was evaluated with the Taiwan Typhoon Loss Assessment System (TLAS). Results showed that the total loss in the Zhuoshui watershed in Taiwan will increase by roughly NT$49.7 billion (1.6 billion USD) in the future. These results help in understanding future impacts; however, model biases still exist. Typhoon track is a critical factor for regional disaster risk, typhoon projections remain highly uncertain, and the number of typhoons in a single model simulation is very limited. Since Taiwan is a small island, different typhoon tracks induce different levels of disaster impact in its watersheds. Therefore, more dynamically downscaled typhoon event samples are needed to improve the reliability of the analysis. Because dynamical downscaling consumes massive computing power, developing a new statistical downscaling approach, together with a method to disaggregate daily climate change data to hourly resolution, could be a short-term solution.
NASA Astrophysics Data System (ADS)
Jacquemin, Ingrid; Henrot, Alexandra-Jane; Fontaine, Corentin M.; Dendoncker, Nicolas; Beckers, Veronique; Debusscher, Bos; Tychon, Bernard; Hambuckers, Alain; François, Louis
2016-04-01
Dynamic vegetation models (DVMs) were initially designed to describe the dynamics of natural ecosystems as a function of climate and soil, in order to study the role of vegetation in the carbon cycle. These models are now directly coupled with climate models to evaluate feedbacks between vegetation and climate. But the characteristics of DVMs allow numerous other applications, leading to improvements of some of their modules (e.g., evaluating the sensitivity of the hydrological module to land surface changes) and to new developments (e.g., coupling with other models such as agent-based models), so that they can be used in ecosystem management and land use planning studies. It is in this dynamic context that we have adapted the CARAIB (CARbon Assimilation In the Biosphere) model. One of the main improvements is the implementation of a crop module, allowing the assessment of climate change impacts on crop yields. We are validating this module at different scales: from the plot level, with the use of eddy-covariance data from agricultural sites in the FLUXNET network, such as Lonzée (Belgium) or other Western European sites (Grignon, Dijkgraaf,…), to the country level, for which we compare the crop yield calculated by CARAIB to crop yield statistics for Belgium and for different agricultural regions of the country. Another challenge for the CARAIB DVM was to deal with landscape dynamics, which is not directly possible because anthropogenic factors are not represented in the system. In the framework of the VOTES and MASC projects, CARAIB is coupled with an agent-based model (ABM) representing the societal component of the system. This coupled module allows the use of climate and socio-economic scenarios, which is particularly interesting for studies aiming at a sustainable approach. The module has been exploited in particular in the VOTES project, where the objective was to provide a social, biophysical and economic assessment of the ecosystem services in four municipalities under urban pressure in the center of Belgium. The biophysical valuation was carried out with the coupled module, allowing a quantitative evaluation of key ecosystem services as a function of three climatic and socio-economic scenarios.
Hamilton, Matthew; Mahiane, Guy; Werst, Elric; Sanders, Rachel; Briët, Olivier; Smith, Thomas; Cibulskis, Richard; Cameron, Ewan; Bhatt, Samir; Weiss, Daniel J; Gething, Peter W; Pretorius, Carel; Korenromp, Eline L
2017-02-10
Scale-up of malaria prevention and treatment needs to continue but national strategies and budget allocations are not always evidence-based. This article presents a new modelling tool projecting malaria infection, cases and deaths to support impact evaluation, target setting and strategic planning. Nested in the Spectrum suite of programme planning tools, the model includes historic estimates of case incidence and deaths in groups aged up to 4, 5-14, and 15+ years, and prevalence of Plasmodium falciparum infection (PfPR) among children 2-9 years, for 43 sub-Saharan African countries and their 602 provinces, from the WHO and malaria atlas project. Impacts over 2016-2030 are projected for insecticide-treated nets (ITNs), indoor residual spraying (IRS), seasonal malaria chemoprevention (SMC), and effective management of uncomplicated cases (CMU) and severe cases (CMS), using statistical functions fitted to proportional burden reductions simulated in the P. falciparum dynamic transmission model OpenMalaria. In projections for Nigeria, ITNs, IRS, CMU, and CMS scale-up reduced health burdens in all age groups, with largest proportional and especially absolute reductions in children up to 4 years old. Impacts increased from 8 to 10 years following scale-up, reflecting dynamic effects. For scale-up of each intervention to 80% effective coverage, CMU had the largest impacts across all health outcomes, followed by ITNs and IRS; CMS and SMC conferred additional small but rapid mortality impacts. Spectrum-Malaria's user-friendly interface and intuitive display of baseline data and scenario projections holds promise to facilitate capacity building and policy dialogue in malaria programme prioritization. The module's linking to the OneHealth Tool for costing will support use of the software for strategic budget allocation. In settings with moderately low coverage levels, such as Nigeria, improving case management and achieving universal coverage with ITNs could achieve considerable burden reductions. Projections remain to be refined and validated with local expert input data and actual policy scenarios.
NASA Astrophysics Data System (ADS)
Amin, Asad; Nasim, Wajid; Mubeen, Muhammad; Kazmi, Dildar Hussain; Lin, Zhaohui; Wahid, Abdul; Sultana, Syeda Refat; Gibbs, Jim; Fahad, Shah
2017-09-01
Unpredictable precipitation trends, largely influenced by climate change, have prolonged droughts and floods in South Asia. Statistical analysis of monthly, seasonal, and annual precipitation trends was carried out for different temporal (1996-2015 and 2041-2060) and spatial scales (39 meteorological stations) in Pakistan. The statistical downscaling model SimCLIM was used for the future precipitation projection (2041-2060), which was analyzed with statistical approaches. An ensemble approach combined with representative concentration pathways (RCPs) at the medium level was used for the future projections. The magnitude and slope of trends were derived by applying the Mann-Kendall and Sen's slope statistical approaches. Geostatistical applications were used to generate precipitation trend maps. The comparison of base and projected precipitation by statistical analysis is represented by maps and graphical visualization, which facilitates trend detection. Results of this study show that precipitation trends increased at more than 70% of the weather stations in February, March, April, August, and September in the base years. In the projected years, precipitation trends decrease from February to April but increase from July to October. The strongest decreasing trend was reported in January for the base years, and it decreased further in the projected years. The greatest variation in precipitation trends between projected and base years was reported from February to April. Variations in the projected precipitation trend for Punjab and Baluchistan are most pronounced in March and April. Seasonal analysis shows large variation in winter, with an increasing trend at more than 30% of the weather stations in the base period, rising to 40% for the projected precipitation. High risk was reported in the base-year pre-monsoon season, where 90% of the weather stations show an increasing trend, but in the projected years this figure falls to about 33%. Finally, the annual precipitation trend increased at more than 90% of the meteorological stations in the base period (1996-2015), whereas in the projected period (2041-2060) it decreases for up to 76% of the stations. These results reveal that the overall precipitation trend is decreasing in future years, which may prolong drought at 14% of the weather stations under study.
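The Mann-Kendall test and Sen's slope estimator named above are straightforward to compute. The sketch below is a minimal implementation (no tie or autocorrelation corrections) applied to a synthetic annual precipitation series, not the SimCLIM workflow used in the study.

```python
# Minimal Mann-Kendall trend test and Sen's slope estimator, applied to a synthetic
# annual precipitation series (illustrative only; not the study's SimCLIM workflow).
import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    """Return the Mann-Kendall S statistic, standardized Z, and two-sided p-value
    (no tie correction, for brevity)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / np.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / np.sqrt(var_s)
    else:
        z = 0.0
    p = 2 * (1 - norm.cdf(abs(z)))
    return s, z, p

def sens_slope(x):
    """Median of all pairwise slopes (units of x per time step)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    slopes = [(x[j] - x[i]) / (j - i) for i in range(n - 1) for j in range(i + 1, n)]
    return np.median(slopes)

rng = np.random.default_rng(0)
years = np.arange(1996, 2016)
precip = 450 + 2.5 * (years - years[0]) + rng.normal(0, 30, len(years))  # mm/yr, synthetic

s, z, p = mann_kendall(precip)
print(f"Mann-Kendall: S = {s:.0f}, Z = {z:.2f}, p = {p:.3f}")
print(f"Sen's slope  = {sens_slope(precip):.2f} mm per year")
```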
ERIC Educational Resources Information Center
Ossai, Peter Agbadobi Uloku
2016-01-01
This study examined the relationship between students' scores on Research Methods and Statistics and the undergraduate project in the final year. The purpose was to find out whether students matched knowledge of research with project-writing skill. The study adopted an ex post facto correlational design. Scores on Research Methods and Statistics for…
ERIC Educational Resources Information Center
Thebaud, Schiller
This report examines four UNESCO pilot projects undertaken in 1972 in Brazil, Colombia, Peru, and Uruguay to study the methods used for national statistical surveys of science and technology. The projects specifically addressed the problems of comparing statistics gathered by different methods in different countries. Surveys carried out in Latin…
Statistical inference for noisy nonlinear ecological dynamic systems.
Wood, Simon N
2010-08-26
Chaotic ecological dynamic systems defy conventional statistical analysis. Systems with near-chaotic dynamics are little better. Such systems are almost invariably driven by endogenous dynamic processes plus demographic and environmental process noise, and are only observable with error. Their sensitivity to history means that minute changes in the driving noise realization, or the system parameters, will cause drastic changes in the system trajectory. This sensitivity is inherited and amplified by the joint probability density of the observable data and the process noise, rendering it useless as the basis for obtaining measures of statistical fit. Because the joint density is the basis for the fit measures used by all conventional statistical methods, this is a major theoretical shortcoming. The inability to make well-founded statistical inferences about biological dynamic models in the chaotic and near-chaotic regimes, other than on an ad hoc basis, leaves dynamic theory without the methods of quantitative validation that are essential tools in the rest of biological science. Here I show that this impasse can be resolved in a simple and general manner, using a method that requires only the ability to simulate the observed data on a system from the dynamic model about which inferences are required. The raw data series are reduced to phase-insensitive summary statistics, quantifying local dynamic structure and the distribution of observations. Simulation is used to obtain the mean and the covariance matrix of the statistics, given model parameters, allowing the construction of a 'synthetic likelihood' that assesses model fit. This likelihood can be explored using a straightforward Markov chain Monte Carlo sampler, but one further post-processing step returns pure likelihood-based inference. I apply the method to establish the dynamic nature of the fluctuations in Nicholson's classic blowfly experiments.
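The synthetic likelihood recipe sketched in the abstract (simulate, summarize, fit a multivariate normal to the summaries, evaluate the observed summaries under it) can be illustrated in a few lines. The example below uses a noisy Ricker-type map and a deliberately small, ad hoc set of summary statistics; it is a toy version of the procedure, not the paper's implementation.

```python
# Bare-bones illustration of the synthetic likelihood idea: simulate the model many times
# at a candidate parameter, reduce each simulated series to summary statistics, fit a
# multivariate normal to those statistics, and evaluate the observed statistics under it.
import numpy as np

rng = np.random.default_rng(0)

def simulate(log_r, n=100, sigma_e=0.3, phi=10.0):
    """Noisy Ricker-type map with Poisson observation error."""
    N = np.empty(n)
    N[0] = 1.0
    for t in range(1, n):
        N[t] = N[t - 1] * np.exp(log_r - N[t - 1] + sigma_e * rng.normal())
    return rng.poisson(phi * N)

def summaries(y):
    """Phase-insensitive summaries: mean, quartiles, lag-1/2 autocovariances, zero count."""
    yc = y - y.mean()
    return np.array([y.mean(), *np.percentile(y, [25, 75]),
                     np.mean(yc[:-1] * yc[1:]), np.mean(yc[:-2] * yc[2:]),
                     np.mean(y == 0)])

def synthetic_loglik(log_r, s_obs, n_rep=200):
    S = np.array([summaries(simulate(log_r)) for _ in range(n_rep)])
    mu, cov = S.mean(axis=0), np.cov(S.T) + 1e-8 * np.eye(S.shape[1])
    d = s_obs - mu
    _, logdet = np.linalg.slogdet(cov)
    return -0.5 * (d @ np.linalg.solve(cov, d) + logdet)

y_obs = simulate(log_r=3.8)                 # pretend these are the observed data
s_obs = summaries(y_obs)
for log_r in [3.0, 3.4, 3.8, 4.2]:
    print(f"log r = {log_r:.1f}  synthetic log-likelihood = {synthetic_loglik(log_r, s_obs):7.2f}")
```

In a full analysis this synthetic log-likelihood would be explored with an MCMC sampler over the model parameters, as described above.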
A New High Resolution Climate Dataset for Climate Change Impacts Assessments in New England
NASA Astrophysics Data System (ADS)
Komurcu, M.; Huber, M.
2016-12-01
Assessing regional impacts of climate change (such as changes in extreme events, land surface hydrology, water resources, energy, ecosystems and economy) requires much higher resolution climate variables than those available from global model projections. While it is possible to run global models at higher resolution, the high computational cost associated with these simulations prevents their use in this manner. To alleviate this problem, dynamical downscaling offers a method to deliver higher resolution climate variables. As part of an NSF EPSCoR funded interdisciplinary effort to assess climate change impacts on New Hampshire ecosystems, hydrology and economy (the New Hampshire Ecosystems and Society project), we create a unique high-resolution climate dataset for New England. We dynamically downscale global model projections under a high impact emissions scenario using the Weather Research and Forecasting model (WRF) with three nested grids of 27, 9 and 3 km horizontal resolution, with the highest resolution innermost grid focused over New England. We prefer dynamical downscaling over other methods such as statistical downscaling because it employs physical equations to progressively simulate climate variables as atmospheric processes interact with surface processes, emissions, radiation, clouds, precipitation and other model components, hence eliminating fixed relationships between variables. In addition to simulating mean changes in regional climate, dynamical downscaling also allows for the simulation of climate extremes that significantly alter climate change impacts. We simulate three time slices: 2006-2015, 2040-2060 and 2080-2100. This new high-resolution climate dataset (with more than 200 variables saved at hourly intervals for the highest resolution domain and six-hourly intervals for the outer two domains), along with the model input and restart files used in our WRF simulations, will be made publicly available to the broader scientific community to support in-depth climate change impacts assessments for New England. We present results focusing on future changes in New England extreme events.
The Shock and Vibration Bulletin. Part 2. Invited Papers, Structural Dynamics
1974-08-01
Contents include: VIKING LANDER DYNAMICS (p. 41), Mr. Joseph C. Pohlen, Martin Marietta Aerospace, Denver, Colorado; Structural Dynamics; PERFORMANCE OF STATISTICAL ENERGY ANALYSIS (p. 47). …aerospace structures. Analytical prediction of these environments is beyond the current scope of classical modal techniques. Statistical energy analysis methods have been developed that circumvent the difficulties of high-frequency modal analysis. These statistical energy analysis methods are evaluated…
NASA Astrophysics Data System (ADS)
Graham, L. Phil; Andersson, Lotta; Horan, Mark; Kunz, Richard; Lumsden, Trevor; Schulze, Roland; Warburton, Michele; Wilk, Julie; Yang, Wei
This study used climate change projections from different regional approaches to assess hydrological effects on the Thukela River Basin in KwaZulu-Natal, South Africa. Projecting impacts of future climate change onto hydrological systems can be undertaken in different ways and a variety of effects can be expected. Although simulation results from global climate models (GCMs) are typically used to project future climate, different outcomes from these projections may be obtained depending on the GCMs themselves and how they are applied, including different ways of downscaling from global to regional scales. Projections of climate change from different downscaling methods, different global climate models and different future emissions scenarios were used as input to simulations in a hydrological model to assess climate change impacts on hydrology. A total of 10 hydrological change simulations were made, resulting in a matrix of hydrological response results. This matrix included results from dynamically downscaled climate change projections from the same regional climate model (RCM) using an ensemble of three GCMs and three global emissions scenarios, and from statistically downscaled projections using results from five GCMs with the same emissions scenario. Although the matrix of results does not provide complete and consistent coverage of potential uncertainties from the different methods, some robust results were identified. In some regards, the results were in agreement and consistent for the different simulations. For others, particularly rainfall, the simulations showed divergence. For example, all of the statistically downscaled simulations showed an annual increase in precipitation and corresponding increase in river runoff, while the RCM downscaled simulations showed both increases and decreases in runoff. According to the two projections that best represent runoff for the observed climate, increased runoff would generally be expected for this basin in the future. Dealing with such variability in results is not atypical for assessing climate change impacts in Africa and practitioners are faced with how to interpret them. This work highlights the need for additional, well-coordinated regional climate downscaling for the region to further define the range of uncertainties involved.
Regojo Zapata, O; Lamata Hernández, F; Sánchez Zalabardo, J M; Elizalde Benito, A; Navarro Gil, J; Valdivia Uría, J G
2004-09-01
Studies of the quality of theses and research projects in the biomedical sciences are unusual, but they are very important in university teaching because it is necessary to improve the quality of thesis preparation. The objectives of the study were to determine the quality of thesis projects in our department, according to their fulfillment of scientific methodology, and to establish whether a relation exists between the overall quality of a project and the statistical resources used. Descriptive study of 273 thesis projects carried out between 1995 and 2002 in the surgery department of the University of Zaragoza. The review was performed by 15 observers, who analyzed 28 indicators for every project. By assigning a value to each indicator, the projects were rated on a scale from 1 to 10 according to how well they fulfilled scientific methodology. The mean project quality score was 5.53 (SD: 1.77). In 13.9% of cases the thesis project was concluded with the reading of the work. The three indicators of statistical resource use were significantly associated with the quality score of the projects. The quality of the statistical resources is very important when a thesis project is to be carried out with good methodology, because it helps ensure sound conclusions. In our study, we estimate that more than a third of the variability in thesis project quality is explained by the three statistical indicators mentioned above.
DBMS as a Tool for Project Management
NASA Technical Reports Server (NTRS)
Linder, H.
1984-01-01
Scientific objectives of crustal dynamics are listed as well as the contents of the centralized data information system for the crustal dynamics project. The system provides for project observation schedules, gives project configuration control information and project site information.
Features of statistical dynamics in a finite system
NASA Astrophysics Data System (ADS)
Yan, Shiwei; Sakata, Fumihiko; Zhuo, Yizhong
2002-03-01
We study features of statistical dynamics in a finite Hamiltonian system composed of a relevant one-degree-of-freedom system coupled to an irrelevant multi-degree-of-freedom system through a weak interaction. Special attention is paid to how the statistical dynamics changes depending on the number of degrees of freedom in the irrelevant system. It is found that the macrolevel statistical aspects are strongly related to the appearance of microlevel chaotic motion, and that dissipation of the relevant motion is realized by passing through three distinct stages: dephasing, statistical relaxation, and equilibrium regimes. It is clarified that the dynamical description and the conventional transport approach provide us with almost the same macrolevel and microlevel mechanisms only for the system with a very large number of irrelevant degrees of freedom. It is also shown that the statistical relaxation in the finite system is an anomalous diffusion and that the fluctuation effects have a finite correlation time.
Implementation of Discovery Projects in Statistics
ERIC Educational Resources Information Center
Bailey, Brad; Spence, Dianna J.; Sinn, Robb
2013-01-01
Researchers and statistics educators consistently suggest that students will learn statistics more effectively by conducting projects through which they actively engage in a broad spectrum of tasks integral to statistical inquiry, in the authentic context of a real-world application. In keeping with these findings, we share an implementation of…
ERIC Educational Resources Information Center
Fawcett, Lee
2017-01-01
The CASE project (Case-based Approaches to Statistics Education; see www.mas.ncl.ac.uk/~nlf8/innovation) was established to investigate how the use of real-life, discipline-specific case study material in Statistics service courses could improve student engagement, motivation, and confidence. Ultimately, the project aims to promote deep learning…
The Effect of Project Based Learning on the Statistical Literacy Levels of Student 8th Grade
ERIC Educational Resources Information Center
Koparan, Timur; Güven, Bülent
2014-01-01
This study examines the effect of project-based learning on 8th grade students' statistical literacy levels. A performance test was developed for this aim. A quasi-experimental research model was used. In this context, statistics was taught with the traditional method in the control group and was taught using project based…
ERIC Educational Resources Information Center
Ramler, Ivan P.; Chapman, Jessica L.
2011-01-01
In this article we describe a semester-long project, based on the popular video game series Guitar Hero, designed to introduce upper-level undergraduate statistics students to statistical research. Some of the goals of this project are to help students develop statistical thinking that allows them to approach and answer open-ended research…
NASA Astrophysics Data System (ADS)
Asadollahi, Parisa; Li, Jian
2016-04-01
Understanding the dynamic behavior of complex structures such as long-span bridges requires dense deployment of sensors. Traditional wired sensor systems are generally expensive and time-consuming to install due to cabling. With wireless communication and on-board computation capabilities, wireless smart sensor networks have the advantages of being low cost, easy to deploy and maintain and therefore facilitate dense instrumentation for structural health monitoring. A long-term monitoring project was recently carried out for a cable-stayed bridge in South Korea with a dense array of 113 smart sensors, which feature the world's largest wireless smart sensor network for civil structural monitoring. This paper presents a comprehensive statistical analysis of the modal properties including natural frequencies, damping ratios and mode shapes of the monitored cable-stayed bridge. Data analyzed in this paper is composed of structural vibration signals monitored during a 12-month period under ambient excitations. The correlation between environmental temperature and the modal frequencies is also investigated. The results showed the long-term statistical structural behavior of the bridge, which serves as the basis for Bayesian statistical updating for the numerical model.
Gershunov, A.; Barnett, T.P.; Cayan, D.R.; Tubbs, T.; Goddard, L.
2000-01-01
Three long-range forecasting methods have been evaluated for prediction and downscaling of seasonal and intraseasonal precipitation statistics in California. Full-statistical, hybrid-dynamical-statistical and full-dynamical approaches have been used to forecast El Niño-Southern Oscillation (ENSO)-related total precipitation, daily precipitation frequency, and average intensity anomalies during the January-March season. For El Niño winters, the hybrid approach emerges as the best performer, while La Niña forecasting skill is poor. The full-statistical forecasting method features reasonable forecasting skill for both La Niña and El Niño winters. The performance of the full-dynamical approach could not be evaluated as rigorously as that of the other two forecasting schemes. Although the full-dynamical forecasting approach is expected to outperform simpler forecasting schemes in the long run, evidence is presented to conclude that, at present, the full-dynamical forecasting approach is the least viable of the three, at least in California. The authors suggest that operational forecasting of any intraseasonal temperature, precipitation, or streamflow statistic derivable from the available records is possible now for ENSO-extreme years.
Korenromp, Eline; Hamilton, Matthew; Sanders, Rachel; Mahiané, Guy; Briët, Olivier J T; Smith, Thomas; Winfrey, William; Walker, Neff; Stover, John
2017-11-07
In malaria-endemic countries, malaria prevention and treatment are critical for child health. In the context of intervention scale-up and rapid changes in endemicity, projections of intervention impact and optimized program scale-up strategies need to take into account the consequent dynamics of transmission and immunity. The new Spectrum-Malaria program planning tool was used to project health impacts of Insecticide-Treated mosquito Nets (ITNs) and effective management of uncomplicated malaria cases (CMU), among other interventions, on malaria infection prevalence, case incidence and mortality in children 0-4 years, 5-14 years of age and adults. Spectrum-Malaria uses statistical models fitted to simulations of the dynamic effects of increasing intervention coverage on these burdens as a function of baseline malaria endemicity, seasonality in transmission and malaria intervention coverage levels (estimated for years 2000 to 2015 by the World Health Organization and Malaria Atlas Project). Spectrum-Malaria projections of proportional reductions in under-five malaria mortality were compared with those of the Lives Saved Tool (LiST) for the Democratic Republic of the Congo and Zambia, for given (standardized) scenarios of ITN and/or CMU scale-up over 2016-2030. Proportional mortality reductions over the first two years following scale-up of ITNs from near-zero baselines to moderately higher coverages align well between LiST and Spectrum-Malaria -as expected since both models were fitted to cluster-randomized ITN trials in moderate-to-high-endemic settings with 2-year durations. For further scale-up from moderately high ITN coverage to near-universal coverage (as currently relevant for strategic planning for many countries), Spectrum-Malaria predicts smaller additional ITN impacts than LiST, reflecting progressive saturation. For CMU, especially in the longer term (over 2022-2030) and for lower-endemic settings (like Zambia), Spectrum-Malaria projects larger proportional impacts, reflecting onward dynamic effects not fully captured by LiST. Spectrum-Malaria complements LiST by extending the scope of malaria interventions, program packages and health outcomes that can be evaluated for policy making and strategic planning within and beyond the perspective of child survival.
Hybrid function projective synchronization in complex dynamical networks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wei, Qiang; Wang, Xing-yuan, E-mail: wangxy@dlut.edu.cn; Hu, Xiao-peng
2014-02-15
This paper investigates hybrid function projective synchronization (HFPS) in complex dynamical networks. When the complex dynamical network can be synchronized up to an equilibrium or a periodic orbit, a hybrid feedback controller is designed so that different components of a node's state vector can be synchronized up to different desired scaling functions in complex dynamical networks with time delay. HFPS in complex dynamical networks with constant coupling delay and HFPS in complex dynamical networks with time-varying coupling delay are studied, respectively. Finally, numerical simulations show the effectiveness of the theoretical analysis.
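Function projective synchronization is easiest to see in a single drive-response pair before adding network coupling and delays. The sketch below uses an active-control-style controller (an assumption for illustration, not the paper's hybrid feedback design) to drive a response Lorenz system onto alpha(t) times the drive state.

```python
# Single-node, active-control-style sketch of function projective synchronization:
# the response state y(t) is driven to alpha(t) * x(t) for a time-varying scaling
# function alpha. Toy illustration only, not the paper's delayed-network controller.
import numpy as np

def lorenz(s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = s
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

alpha = lambda t: 2.0 + 0.5 * np.sin(0.5 * t)        # desired scaling function
dalpha = lambda t: 0.25 * np.cos(0.5 * t)            # its time derivative
k = 5.0                                              # feedback gain

dt, T = 1e-3, 20.0
x = np.array([1.0, 1.0, 1.0])                        # drive state
y = np.array([-5.0, 7.0, 2.0])                       # response state
for step in range(int(T / dt)):
    t = step * dt
    e = y - alpha(t) * x                             # synchronization error
    # Active controller: cancel the response dynamics, inject the scaled drive
    # dynamics and the scaling-rate term, plus linear feedback, so that de/dt = -k e.
    u = -lorenz(y) + alpha(t) * lorenz(x) + dalpha(t) * x - k * e
    x = x + dt * lorenz(x)                           # forward Euler (adequate for a sketch)
    y = y + dt * (lorenz(y) + u)
    if step % 5000 == 0:
        print(f"t = {t:5.1f}  |error| = {np.linalg.norm(e):.3e}")
```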
A platform to integrate climate information and rural telemedicine in Malawi
NASA Astrophysics Data System (ADS)
Lowe, R.; Chadza, T.; Chirombo, J.; Fonda, C.; Muyepa, A.; Nkoloma, M.; Pietrosemoli, E.; Radicella, S. M.; Tompkins, A. M.; Zennaro, M.
2012-04-01
It is commonly accepted that climate plays a role in the transmission of many infectious diseases, particularly those transmitted by mosquitoes such as malaria, which is one of the most important causes of mortality and morbidity in developing countries. Due to time lags involved in the climate-disease transmission system, lagged observed climate variables could provide some predictive lead for forecasting disease epidemics. This lead time could be extended by using forecasts of the climate in disease prediction models. This project aims to implement a platform for the dissemination of climate-driven disease risk forecasts, using a telemedicine approach. A pilot project has been established in Malawi, where a 162 km wireless link has been installed, spanning from Blantyre City to remote health facilities in the district of Mangochi in the Southern region, bordering Lake Malawi. This long Wi-Fi technology allows rural health facilities to upload real-time disease cases as they occur to an online health information system (DHIS2); a national medical database repository administered by the Ministry of Health. This technology provides a real-time data logging system for disease incidence monitoring and facilitates the flow of information between local and national levels. This platform allows statistical and dynamical disease prediction models to be rapidly updated with real-time climate and epidemiological information. This permits health authorities to target timely interventions ahead of an imminent increase in malaria incidence. By integrating meteorological and health information systems in a statistical-dynamical prediction model, we show that a long-distance Wi-Fi link is a practical and inexpensive means to enable the rapid analysis of real-time information in order to target disease prevention and control measures and mobilise resources at the local level.
Electrical engineering research support for FDOT Traffic Statistics Office
DOT National Transportation Integrated Search
2010-03-01
The aim of this project was to provide electrical engineering support for the telemetered traffic monitoring sites (TTMSs) operated by the Statistics Office of the Florida Department of Transportation. This project was a continuation of project BD-54...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Fuyao; Yu, Yan; Notaro, Michael
This study advances the practicality and stability of the traditional multivariate statistical method, generalized equilibrium feedback assessment (GEFA), for decomposing the key oceanic drivers of regional atmospheric variability, especially when available data records are short. An advanced stepwise GEFA methodology is introduced, in which unimportant forcings within the forcing matrix are eliminated through stepwise selection. Method validation of stepwise GEFA is performed using the CESM, with a focused application to northern and tropical Africa (NTA). First, a statistical assessment of the atmospheric response to each primary oceanic forcing is carried out by applying stepwise GEFA to a fully coupled control run. Then, a dynamical assessment of the atmospheric response to individual oceanic forcings is performed through ensemble experiments by imposing sea surface temperature anomalies over focal ocean basins. Finally, to quantify the reliability of stepwise GEFA, the statistical assessment is evaluated against the dynamical assessment in terms of four metrics: the percentage of grid cells with consistent response sign, the spatial correlation of atmospheric response patterns, the area-averaged seasonal cycle of response magnitude, and consistency in associated mechanisms between assessments. In CESM, tropical modes, namely El Niño–Southern Oscillation and the tropical Indian Ocean Basin, tropical Indian Ocean dipole, and tropical Atlantic Niño modes, are the dominant oceanic controls of NTA climate. In complementary studies, stepwise GEFA is validated in terms of isolating terrestrial forcings on the atmosphere, and observed oceanic and terrestrial drivers of NTA climate are extracted to establish an observational benchmark for subsequent coupled model evaluation and development of process-based weights for regional climate projections.
ERIC Educational Resources Information Center
Spence, Dianna J.; Sharp, Julia L.; Sinn, Robb
2011-01-01
Four instructors used authentic research projects and related curriculum materials when teaching elementary statistics in secondary and undergraduate settings. Projects were authentic in that students selected their own variables, defined their own research questions, and collected and analyzed their own data. Classes using these projects were…
Code of Federal Regulations, 2010 CFR
2010-07-01
... Administration DEPARTMENT OF JUSTICE CONFIDENTIALITY OF IDENTIFIABLE RESEARCH AND STATISTICAL INFORMATION § 22.2... capacity. (c) Research or statistical project means any program, project, or component thereof which is... statistical information means any information which is collected during the conduct of a research or...
Code of Federal Regulations, 2011 CFR
2011-07-01
... Administration DEPARTMENT OF JUSTICE CONFIDENTIALITY OF IDENTIFIABLE RESEARCH AND STATISTICAL INFORMATION § 22.2... capacity. (c) Research or statistical project means any program, project, or component thereof which is... statistical information means any information which is collected during the conduct of a research or...
A Conway-Maxwell-Poisson (CMP) model to address data dispersion on positron emission tomography.
Santarelli, Maria Filomena; Della Latta, Daniele; Scipioni, Michele; Positano, Vincenzo; Landini, Luigi
2016-10-01
Positron emission tomography (PET) in medicine exploits the properties of positron-emitting unstable nuclei. The pairs of γ-rays emitted after annihilation are revealed by coincidence detectors and stored as projections in a sinogram. It is well known that radioactive decay follows a Poisson distribution; however, deviation from Poisson statistics occurs on PET projection data prior to reconstruction due to physical effects, measurement errors, and corrections for dead time, scatter, and random coincidences. A model that describes the statistical behavior of measured and corrected PET data can aid in understanding the statistical nature of the data: it is a prerequisite to develop efficient reconstruction and processing methods and to reduce noise. The deviation from Poisson statistics in PET data can be described by the Conway-Maxwell-Poisson (CMP) distribution model, which is characterized by the centring parameter λ and the dispersion parameter ν, the latter quantifying the deviation from a Poisson distribution model. In particular, the parameter ν allows quantifying over-dispersion (ν<1) or under-dispersion (ν>1) of data. A simple and efficient method for λ and ν parameter estimation is introduced and assessed using Monte Carlo simulation for a wide range of activity values. The application of the method to simulated and experimental PET phantom data demonstrated that the CMP distribution parameters can detect deviation from the Poisson distribution both in raw and corrected PET data. It may be usefully implemented in image reconstruction algorithms and quantitative PET data analysis, especially in low-count emission data, as in dynamic PET data, where the method demonstrated the best accuracy. Copyright © 2016 Elsevier Ltd. All rights reserved.
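A generic way to recover the CMP parameters from count data is direct maximum likelihood with a series-truncated normalizing constant. The sketch below takes that route on synthetic Poisson counts (so the fitted ν should come out near 1); it is not necessarily the simple and efficient estimation method introduced in the paper.

```python
# Minimal maximum-likelihood fit of Conway-Maxwell-Poisson (CMP) parameters (lambda, nu)
# on synthetic count data; the normalizing constant Z(lambda, nu) = sum_j lambda^j / (j!)^nu
# is evaluated by series truncation. Generic ML sketch, not the paper's estimator.
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln

def cmp_logpmf(k, lam, nu, j_max=200):
    j = np.arange(j_max)
    log_terms = j * np.log(lam) - nu * gammaln(j + 1)
    log_z = np.logaddexp.reduce(log_terms)               # log normalizing constant
    return k * np.log(lam) - nu * gammaln(k + 1) - log_z

def neg_loglik(params, counts):
    log_lam, log_nu = params                              # optimize on the log scale
    return -cmp_logpmf(counts, np.exp(log_lam), np.exp(log_nu)).sum()

rng = np.random.default_rng(0)
counts = rng.poisson(8.0, size=5000)                      # stand-in for (corrected) sinogram bins

res = minimize(neg_loglik, x0=[np.log(counts.mean()), 0.0], args=(counts,), method="Nelder-Mead")
lam_hat, nu_hat = np.exp(res.x)
print(f"lambda = {lam_hat:.2f}, nu = {nu_hat:.2f}")
print("nu < 1 indicates over-dispersion, nu > 1 under-dispersion relative to Poisson")
```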
How will climate change affect watershed mercury export in a representative Coastal Plain watershed?
NASA Astrophysics Data System (ADS)
Golden, H. E.; Knightes, C. D.; Conrads, P. A.; Feaster, T.; Davis, G. M.; Benedict, S. T.; Bradley, P. M.
2012-12-01
Future climate change is expected to drive variations in watershed hydrological processes and water quality across a wide range of physiographic provinces, ecosystems, and spatial scales. How such shifts in climatic conditions will impact watershed mercury (Hg) dynamics and hydrologically-driven Hg transport is a significant concern. We simulate the responses of watershed hydrological and total Hg (HgT) fluxes and concentrations to a unified set of past and future climate change projections in a Coastal Plain basin using multiple watershed models. We use two statistically downscaled global precipitation and temperature models, ECHO, a hybrid of the ECHAM4 and HOPE-G models, and the Community Climate System Model (CCSM3), across two thirty-year simulations (1980 to 2010 and 2040 to 2070). We apply three watershed models to quantify and bracket potential changes in hydrologic and HgT fluxes, including the Visualizing Ecosystems for Land Management Assessment Model for Hg (VELMA-Hg), the Grid Based Mercury Model (GBMM), and TOPLOAD, a water quality constituent model linked to TOPMODEL hydrological simulations. We estimate a decrease in average annual HgT fluxes in response to climate change using the ECHO projections and an increase with the CCSM3 projections in the study watershed. Average monthly HgT fluxes increase under both climate change projections in the late spring (March through May), when HgT concentrations and flow are high. Results suggest that hydrological transport associated with changes in precipitation and temperature is the primary mechanism driving the HgT flux response to climate change. Our multiple model/multiple projection approach allows us to bracket the relative response of HgT fluxes to climate change, thereby illustrating the uncertainty associated with the projections. In addition, our approach allows us to examine potential variations in climate change-driven water and HgT export based on different conceptualizations of watershed HgT dynamics and the representative mathematical structures underpinning existing watershed Hg models.
Population growth rates: issues and an application.
Godfray, H Charles J; Rees, Mark
2002-01-01
Current issues in population dynamics are discussed in the context of The Royal Society Discussion Meeting 'Population growth rate: determining factors and role in population regulation'. In particular, different views on the centrality of population growth rates to the study of population dynamics and the role of experiments and theory are explored. Major themes emerging include the role of modern statistical techniques in bringing together experimental and theoretical studies, the importance of long-term experimentation and the need for ecology to have model systems, and the value of population growth rate as a means of understanding and predicting population change. The last point is illustrated by the application of a recently introduced technique, integral projection modelling, to study the population growth rate of a monocarpic perennial plant, its elasticities to different life-history components and the evolution of an evolutionarily stable strategy size at flowering. PMID:12396521
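The integral projection modelling technique mentioned above discretizes a size-structured kernel and reads the population growth rate off its dominant eigenvalue. The sketch below builds a toy IPM for a monocarpic plant with invented vital-rate functions and parameter values; only the overall structure, not the numbers, reflects the approach.

```python
# Toy integral projection model (IPM) for a size-structured monocarpic plant with
# made-up vital-rate functions (logistic survival and flowering, Gaussian growth,
# size-dependent fecundity). The kernel is discretized with the midpoint rule and
# its dominant eigenvalue approximates the population growth rate.
import numpy as np
from scipy.stats import norm

def survival(x):      return 1.0 / (1.0 + np.exp(-(x - 2.0)))          # prob. of surviving
def flowering(x):     return 1.0 / (1.0 + np.exp(-(x - 5.0)))          # prob. of flowering (then dying)
def growth(y, x):     return norm.pdf(y, loc=0.8 * x + 1.0, scale=0.8) # size next year given size now
def fecundity(x):     return 20.0 * np.exp(0.5 * (x - 5.0))            # seeds produced if flowering
def recruit_size(y):  return norm.pdf(y, loc=1.0, scale=0.5)           # offspring size distribution
p_estab = 0.02                                                         # seed establishment prob.

# Midpoint-rule mesh over the size range [L, U].
m, L, U = 200, 0.0, 10.0
h = (U - L) / m
x = L + h * (np.arange(m) + 0.5)
X, Y = np.meshgrid(x, x)                       # X = current size, Y = size next year

P = h * survival(X) * (1.0 - flowering(X)) * growth(Y, X)              # survival/growth part
F = h * survival(X) * flowering(X) * fecundity(X) * p_estab * recruit_size(Y)
K = P + F                                                              # full projection kernel

lam = np.max(np.real(np.linalg.eigvals(K)))
print(f"population growth rate lambda ~ {lam:.3f}")
```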
NASA Technical Reports Server (NTRS)
Cole, Stanley R.; Johnson, R. Keith; Piatak, David J.; Florance, Jennifer P.; Rivera, Jose A., Jr.
2003-01-01
The Langley Transonic Dynamics Tunnel (TDT) has provided a unique capability for aeroelastic testing for over forty years. The facility has a rich history of significant contributions to the design of many United States commercial transports, military aircraft, launch vehicles, and spacecraft. The facility has many features that contribute to its uniqueness for aeroelasticity testing, perhaps the most important feature being the use of a heavy gas test medium to achieve higher test densities compared to testing in air. Higher test medium densities substantially improve model-building requirements and therefore simplify the fabrication process for building aeroelastically scaled wind tunnel models. This paper describes TDT capabilities that make it particularly suited for aeroelasticity testing. The paper also discusses the nature of recent test activities in the TDT, including summaries of several specific tests. Finally, the paper documents recent facility improvement projects and the continuous statistical quality assessment effort for the TDT.
Study on a new chaotic bitwise dynamical system and its FPGA implementation
NASA Astrophysics Data System (ADS)
Wang, Qian-Xue; Yu, Si-Min; Guyeux, C.; Bahi, J.; Fang, Xiao-Le
2015-06-01
In this paper, the structure of a new chaotic bitwise dynamical system (CBDS) is described. Compared to our previous work, it uses various random bitwise operations instead of only one. The chaotic behavior of the CBDS is mathematically proven according to Devaney's definition, and its statistical properties are verified both for uniformity and with the comprehensive, reputable and stringent battery of tests TestU01. Furthermore, a systematic methodology for developing the parallel computations is proposed for an FPGA platform-based realization of this CBDS. Experiments finally validate the proposed systematic methodology. Project supported by the China Postdoctoral Science Foundation (Grant No. 2014M552175), the Scientific Research Foundation for the Returned Overseas Chinese Scholars, Chinese Education Ministry, the National Natural Science Foundation of China (Grant No. 61172023), and the Specialized Research Foundation of Doctoral Subjects of Chinese Education Ministry (Grant No. 20114420110003).
Growing Land-Sea Temperature Contrast and the Intensification of Arctic Cyclones
NASA Astrophysics Data System (ADS)
Day, Jonathan J.; Hodges, Kevin I.
2018-04-01
Cyclones play an important role in the coupled dynamics of the Arctic climate system on a range of time scales. Modeling studies suggest that storminess will increase in Arctic summer due to enhanced land-sea thermal contrast along the Arctic coastline, in a region known as the Arctic Frontal Zone (AFZ). However, the climate models used in these studies are poor at reproducing the present-day Arctic summer cyclone climatology and so their projections of Arctic cyclones and related quantities, such as sea ice, may not be reliable. In this study we perform composite analysis of Arctic cyclone statistics using AFZ variability as an analog for climate change. High AFZ years are characterized both by increased cyclone frequency and dynamical intensity, compared to low years. Importantly, the size of the response in this analog suggests that General Circulation Models may underestimate the response of Arctic cyclones to climate change, given a similar change in baroclinicity.
NASA Astrophysics Data System (ADS)
Passerini, Tiziano; Veneziani, Alessandro; Sangalli, Laura; Secchi, Piercesare; Vantini, Simone
2010-11-01
In cerebral blood circulation, the interplay of arterial geometrical features and flow dynamics is thought to play a significant role in the development of aneurysms. In the framework of the Aneurisk project, patient-specific morphology reconstructions were conducted with the open-source software VMTK (www.vmtk.org) on a set of computational angiography images provided by Ospedale Niguarda (Milano, Italy). Computational fluid dynamics (CFD) simulations were performed with a software based on the library LifeV (www.lifev.org). The joint statistical analysis of geometries and simulations highlights the possible association of certain spatial patterns of radius, curvature and shear load along the Internal Carotid Artery (ICA) with the presence, position and previous event of rupture of an aneurysm in the entire cerebral vasculature. Moreover, some possible landmarks are identified to be monitored for the assessment of a Potential Rupture Risk Index.
Report to DHS on Summer Internship 2006
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beckwith, R H
2006-07-26
This summer I worked at Lawrence Livermore National Laboratory in a bioforensics collection and extraction research group under David Camp. The group is involved with researching efficiencies of various methods for collecting bioforensic evidence from crime scenes. The different methods under examination are a wipe, swab, HVAC filter and a vacuum. The vacuum is something that has particularly gone uncharacterized. My time was spent mostly on modeling and calculations work, but at the end of the summer I completed my internship with a few experiments to supplement my calculations. I had two major projects this summer. My first major project this summer involved fluid mechanics modeling of collection and extraction situations. This work examines different fluid dynamic models for the case of a micron spore attached to a fiber. The second project I was involved with was a statistical analysis of the different sampling techniques.
OBSERVING LYAPUNOV EXPONENTS OF INFINITE-DIMENSIONAL DYNAMICAL SYSTEMS
OTT, WILLIAM; RIVAS, MAURICIO A.; WEST, JAMES
2016-01-01
Can Lyapunov exponents of infinite-dimensional dynamical systems be observed by projecting the dynamics into ℝN using a ‘typical’ nonlinear projection map? We answer this question affirmatively by developing embedding theorems for compact invariant sets associated with C1 maps on Hilbert spaces. Examples of such discrete-time dynamical systems include time-T maps and Poincaré return maps generated by the solution semigroups of evolution partial differential equations. We make every effort to place hypotheses on the projected dynamics rather than on the underlying infinite-dimensional dynamical system. In so doing, we adopt an empirical approach and formulate checkable conditions under which a Lyapunov exponent computed from experimental data will be a Lyapunov exponent of the infinite-dimensional dynamical system under study (provided the nonlinear projection map producing the data is typical in the sense of prevalence). PMID:28066028
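In the empirical spirit of the question above, a Lyapunov exponent can be estimated directly from projected (delay-embedded) scalar data. The sketch below applies a Rosenstein-style nearest-neighbour divergence estimate to the Henon map, a finite-dimensional stand-in chosen only to keep the example self-contained; the embedding parameters and fitting horizon are assumptions.

```python
# Empirical sketch: estimate the largest Lyapunov exponent from a scalar observable via
# delay embedding (a simple "projection" of the dynamics) and nearest-neighbour divergence,
# in the spirit of Rosenstein et al. Applied here to the Henon map purely for illustration.
import numpy as np

# Generate a scalar time series from the Henon map.
n = 2000
x = np.empty(n); y = np.empty(n)
x[0], y[0] = 0.1, 0.1
for t in range(n - 1):
    x[t + 1] = 1.0 - 1.4 * x[t] ** 2 + y[t]
    y[t + 1] = 0.3 * x[t]

# Delay-embed the scalar observable x with dimension 2 and delay 1.
dim, tau = 2, 1
M = n - (dim - 1) * tau
emb = np.column_stack([x[i * tau:i * tau + M] for i in range(dim)])

# For each point, find its nearest neighbour (excluding temporally close points),
# then track the average log separation as both trajectories evolve forward.
theiler, horizon = 10, 5
d2 = ((emb[:, None, :] - emb[None, :, :]) ** 2).sum(-1)
for i in range(M):
    lo, hi = max(0, i - theiler), min(M, i + theiler + 1)
    d2[i, lo:hi] = np.inf
nn = d2.argmin(axis=1)

valid = np.where((np.arange(M) < M - horizon) & (nn < M - horizon))[0]
mean_log_div = []
for k in range(1, horizon + 1):
    sep = np.linalg.norm(emb[valid + k] - emb[nn[valid] + k], axis=1)
    mean_log_div.append(np.log(sep[sep > 0]).mean())

# The slope of mean log-divergence vs. step number estimates the largest Lyapunov exponent.
slope = np.polyfit(np.arange(1, horizon + 1), mean_log_div, 1)[0]
print(f"estimated largest Lyapunov exponent ~ {slope:.3f} (Henon reference value ~ 0.42)")
```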
Material Phase Causality or a Dynamics-Statistical Interpretation of Quantum Mechanics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Koprinkov, I. G.
2010-11-25
The internal phase dynamics of a quantum system interacting with an electromagnetic field is revealed in detail. Theoretical and experimental evidence of a causal relation of the phase of the wave function to the dynamics of the quantum system is presented systematically for the first time. A dynamics-statistical interpretation of quantum mechanics is introduced.
Nonequilibrium thermodynamics and information theory: basic concepts and relaxing dynamics
NASA Astrophysics Data System (ADS)
Altaner, Bernhard
2017-11-01
Thermodynamics is based on the notions of energy and entropy. While energy is the elementary quantity governing physical dynamics, entropy is the fundamental concept in information theory. In this work, starting from first principles, we give a detailed didactic account of the relations between energy and entropy and thus between physics and information theory. We show that thermodynamic process inequalities, like the second law, are equivalent to the requirement that an effective description for physical dynamics is strongly relaxing. From the perspective of information theory, strongly relaxing dynamics govern the irreversible convergence of a statistical ensemble towards the maximally non-committal probability distribution that is compatible with thermodynamic equilibrium parameters. In particular, Markov processes that converge to a thermodynamic equilibrium state are strongly relaxing. Our framework generalizes previous results to arbitrary open and driven systems, yielding novel thermodynamic bounds for idealized and real processes.
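The "maximally non-committal probability distribution compatible with thermodynamic equilibrium parameters" is the maximum-entropy (Gibbs) distribution. The sketch below solves the simplest version numerically: given an illustrative discrete energy spectrum and a prescribed mean energy, it finds the inverse temperature beta for which the Gibbs distribution matches that constraint.

```python
# Numerical illustration of the maximum-entropy (Gibbs) distribution: maximizing Shannon
# entropy over a discrete energy spectrum subject to a fixed mean energy yields
# p_i proportional to exp(-beta * E_i); here we solve for the Lagrange multiplier beta.
# Spectrum and target mean energy are illustrative.
import numpy as np
from scipy.optimize import brentq

E = np.array([0.0, 1.0, 2.0, 3.0, 5.0])    # illustrative energy levels
U_target = 1.2                              # prescribed mean energy (equilibrium parameter)

def mean_energy(beta):
    w = np.exp(-beta * E)
    p = w / w.sum()
    return p @ E

# Mean energy decreases monotonically in beta, so a bracketed root-find suffices.
beta = brentq(lambda b: mean_energy(b) - U_target, -10.0, 10.0)
p = np.exp(-beta * E); p /= p.sum()
entropy = -(p * np.log(p)).sum()

print(f"beta = {beta:.3f}")
print("max-entropy distribution:", p.round(4))
print(f"mean energy = {p @ E:.3f} (target {U_target}), entropy = {entropy:.3f}")
```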
Machine Learning-based discovery of closures for reduced models of dynamical systems
NASA Astrophysics Data System (ADS)
Pan, Shaowu; Duraisamy, Karthik
2017-11-01
Despite the successful application of machine learning (ML) in fields such as image processing and speech recognition, only a few attempts have been made toward employing ML to represent the dynamics of complex physical systems. Previous attempts mostly focus on parameter calibration or data-driven augmentation of existing models. In this work we present an ML framework to discover closure terms in reduced models of dynamical systems and provide insights into potential problems associated with data-driven modeling. Based on exact closure models for linear systems, we propose a general linear closure framework from the viewpoint of optimization. The framework is based on a trapezoidal approximation of the convolution (memory) term. Hyperparameters that need to be determined include the temporal length of the memory effect, the number of sampling points, and the dimensions of the hidden states. To circumvent the explicit specification of the memory effect, a general framework inspired by neural networks is also proposed. We conduct both a priori and a posteriori evaluations of the resulting model on a number of non-linear dynamical systems. This work was supported in part by AFOSR under the project ``LES Modeling of Non-local effects using Statistical Coarse-graining'' with Dr. Jean-Luc Cambier as the technical monitor.
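A sketch of the trapezoidal approximation of the convolution (memory) term mentioned above, under assumed choices for the kernel, memory length, and resolved history (none of these come from the paper):

import numpy as np

# Trapezoidal-rule approximation of a memory closure term
# m(t) = integral_0^{t_mem} K(s) a(t - s) ds.
dt, t_mem = 0.01, 0.5                     # time step and memory length (hyperparameters)
n_mem = int(t_mem / dt)                   # number of sampling points in the memory window
s = np.arange(n_mem + 1) * dt
K = np.exp(-10.0 * s)                     # assumed memory kernel (illustrative only)

t = np.arange(0, 2.0, dt)
a = np.sin(2 * np.pi * t)                 # stand-in for the resolved/reduced state history

def memory_term(i):
    """Trapezoidal approximation of the convolution at time index i."""
    hist = a[i - n_mem:i + 1][::-1]       # a(t - s) for s = 0 .. t_mem
    return np.trapz(K * hist, dx=dt)

closure = [memory_term(i) for i in range(n_mem, t.size)]
print("closure term at final time:", closure[-1])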
Quantifying Climate Change Hydrologic Risk at NASA Ames Research Center
NASA Astrophysics Data System (ADS)
Mills, W. B.; Bromirski, P. D.; Coats, R. N.; Costa-Cabral, M.; Fong, J.; Loewenstein, M.; Milesi, C.; Miller, N.; Murphy, N.; Roy, S.
2013-12-01
In response to the 2009 Executive Order 13514 mandating that U.S. federal agencies evaluate infrastructure vulnerabilities due to climate variability and change, we provide an analysis of future climate flood risk at NASA Ames Research Center (Ames) along South S.F. Bay. This includes likelihood analysis of large-scale water vapor transport, statistical analysis of intense precipitation, high winds, sea level rise, storm surge, estuary dynamics, saturated overland flooding, and likely impacts to wetlands and habitat loss near Ames. We use IPCC CMIP5 data from three Atmosphere-Ocean General Circulation Models with Representative Concentration Pathways of 8.5 W m-2 and 4.5 W m-2 and provide an analysis of climate variability and change associated with flooding and impacts at Ames. Intense storms impacting Ames are due to two large-scale processes, sub-tropical atmospheric rivers (AR) and north Pacific Aleutian low-pressure (AL) storm systems. Both are analyzed here in terms of the Integrated Water Vapor (IWV) exceeding a critical threshold within a search domain, with the wind vector transporting the IWV from southerly to westerly to northwesterly for ARs and from northwesterly to northerly for ALs, within the Ames impact area during 1970-1999, 2040-2069, and 2070-2099. We also include a statistical model of extreme precipitation at Ames based on large-scale climatic predictors, and characterize changes using CMIP5 projections. Requirements for levee height to protect Ames are projected to increase and continually accelerate throughout this century as sea level rises. We use empirical statistical and analytical methods to determine the likelihood, in each year from present through 2099, of water level surpassing different threshold values in SF Bay near NASA Ames. We study the sensitivity of the water level corresponding to a 1-in-10 and 1-in-100 likelihood of exceedance to changes in the statistical distribution of storm surge height and ENSO height, in addition to increasing mean sea level. We examine the implications in the face of the CMIP5 projections. Storm intensification may result in increased flooding hazards at Ames. We analyze how the changes in precipitation intensity will impact the storm drainage system at Ames through continuous stormwater modeling of runoff with the EPA model SWMM 5 and projected downscaled daily precipitation data. Although extreme events will not adversely affect wetland habitats, adaptation projects--especially levee construction and improvement--will require filling of wetlands. Federal law mandates mitigation for fill placed in wetlands. We are currently calculating the potential mitigation burden by habitat type.
Probabilistic projections of 21st century climate change over Northern Eurasia
NASA Astrophysics Data System (ADS)
Monier, E.; Sokolov, A. P.; Schlosser, C. A.; Scott, J. R.; Gao, X.
2013-12-01
We present probabilistic projections of 21st century climate change over Northern Eurasia using the Massachusetts Institute of Technology (MIT) Integrated Global System Model (IGSM), an integrated assessment model that couples an earth system model of intermediate complexity, with a two-dimensional zonal-mean atmosphere, to a human activity model. Regional climate change is obtained by two downscaling methods: a dynamical downscaling, where the IGSM is linked to a three-dimensional atmospheric model; and a statistical downscaling, where a pattern scaling algorithm uses climate-change patterns from 17 climate models. This framework allows for key sources of uncertainty in future projections of regional climate change to be accounted for: emissions projections; climate system parameters (climate sensitivity, strength of aerosol forcing and ocean heat uptake rate); natural variability; and structural uncertainty. Results show that the choice of climate policy and the climate parameters are the largest drivers of uncertainty. We also find that different initial conditions lead to differences in patterns of change as large as when using different climate models. Finally, this analysis reveals the wide range of possible climate change over Northern Eurasia, emphasizing the need to consider all sources of uncertainty when modeling climate impacts over Northern Eurasia.
Probabilistic projections of 21st century climate change over Northern Eurasia
NASA Astrophysics Data System (ADS)
Monier, Erwan; Sokolov, Andrei; Schlosser, Adam; Scott, Jeffery; Gao, Xiang
2013-12-01
We present probabilistic projections of 21st century climate change over Northern Eurasia using the Massachusetts Institute of Technology (MIT) Integrated Global System Model (IGSM), an integrated assessment model that couples an Earth system model of intermediate complexity with a two-dimensional zonal-mean atmosphere to a human activity model. Regional climate change is obtained by two downscaling methods: a dynamical downscaling, where the IGSM is linked to a three-dimensional atmospheric model, and a statistical downscaling, where a pattern scaling algorithm uses climate change patterns from 17 climate models. This framework allows for four major sources of uncertainty in future projections of regional climate change to be accounted for: emissions projections, climate system parameters (climate sensitivity, strength of aerosol forcing and ocean heat uptake rate), natural variability, and structural uncertainty. The results show that the choice of climate policy and the climate parameters are the largest drivers of uncertainty. We also find that different initial conditions lead to differences in patterns of change as large as when using different climate models. Finally, this analysis reveals the wide range of possible climate change over Northern Eurasia, emphasizing the need to consider these sources of uncertainty when modeling climate impacts over Northern Eurasia.
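A schematic of the pattern-scaling step described in both records above; the per-degree pattern and global-mean warming below are invented stand-ins for the IGSM output and the 17-model patterns.

import numpy as np

# Pattern scaling: regional change = (per-degree spatial pattern) x (global-mean warming).
lat, lon = 46, 90                                       # coarse regional grid (arbitrary)
rng = np.random.default_rng(2)

# Per-degree-of-global-warming pattern, e.g. obtained for one climate model by
# regressing its regional change on its global-mean temperature change.
pattern = 1.0 + 0.5 * rng.standard_normal((lat, lon))   # K of local change per K global

global_mean_dT = 3.2                                    # K, from the simple/zonal-mean model
regional_dT = pattern * global_mean_dT                  # downscaled regional warming field

print("projected regional warming: min %.1f K, max %.1f K"
      % (regional_dT.min(), regional_dT.max()))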
NASA Astrophysics Data System (ADS)
Tu, Weichao; Cunningham, G. S.; Chen, Y.; Henderson, M. G.; Camporeale, E.; Reeves, G. D.
2013-10-01
As a response to the Geospace Environment Modeling (GEM) "Global Radiation Belt Modeling Challenge," a 3D diffusion model is used to simulate the radiation belt electron dynamics during two intervals of the Combined Release and Radiation Effects Satellite (CRRES) mission, 15 August to 15 October 1990 and 1 February to 31 July 1991. The 3D diffusion model, developed as part of the Dynamic Radiation Environment Assimilation Model (DREAM) project, includes radial, pitch angle, and momentum diffusion and mixed pitch angle-momentum diffusion, which are driven by dynamic wave databases from the statistical CRRES wave data, including plasmaspheric hiss, lower-band, and upper-band chorus. By comparing the DREAM3D model outputs to the CRRES electron phase space density (PSD) data, we find that, with a data-driven boundary condition at Lmax = 5.5, the electron enhancements can generally be explained by radial diffusion, though additional local heating from chorus waves is required. Because the PSD reductions are included in the boundary condition at Lmax = 5.5, our model captures the fast electron dropouts over a large L range, producing better model performance compared to previously published results. Plasmaspheric hiss produces electron losses inside the plasmasphere, but the model still sometimes overestimates the PSD there. Test simulations using reduced radial diffusion coefficients or increased pitch angle diffusion coefficients inside the plasmasphere suggest that better wave models and more realistic radial diffusion coefficients, both inside and outside the plasmasphere, are needed to improve the model performance. Statistically, the results show that, with the data-driven outer boundary condition, including radial diffusion and plasmaspheric hiss is sufficient to model the electrons during geomagnetically quiet times, but to best capture the radiation belt variations during active times, pitch angle and momentum diffusion from chorus waves are required.
A web-based relational database for monitoring and analyzing mosquito population dynamics.
Sucaet, Yves; Van Hemert, John; Tucker, Brad; Bartholomay, Lyric
2008-07-01
Mosquito population dynamics have been monitored on an annual basis in the state of Iowa since 1969. The primary goal of this project was to integrate light trap data from these efforts into a centralized back-end database and interactive website that is available through the internet at http://iowa-mosquito.ent.iastate.edu. For comparative purposes, all data were categorized according to the week of the year and normalized according to the number of traps running. Users can readily view current, weekly mosquito abundance compared with data from previous years. Additional interactive capabilities facilitate analyses of the data based on mosquito species, distribution, or a time frame of interest. All data can be viewed in graphical and tabular format and can be downloaded to a comma separated value (CSV) file for import into a spreadsheet or more specialized statistical software package. Having this long-term dataset in a centralized database/website is useful for informing mosquito and mosquito-borne disease control and for exploring the ecology of the species represented therein. In addition to mosquito population dynamics, this database is available as a standardized platform that could be modified and applied to a multitude of projects that involve repeated collection of observational data. The development and implementation of this tool provides capacity for the user to mine data from standard spreadsheets into a relational database and then view and query the data in an interactive website.
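A sketch of the comparability step described above, assuming a hypothetical table layout (not the actual Iowa database schema): counts are grouped by week of year and normalized by the number of traps running.

import pandas as pd

records = pd.DataFrame({
    "date":    pd.to_datetime(["2008-06-02", "2008-06-03", "2008-06-09", "2008-06-10"]),
    "species": ["Culex pipiens", "Culex pipiens", "Culex pipiens", "Aedes vexans"],
    "count":   [40, 25, 60, 10],
    "trap_id": ["T1", "T2", "T1", "T3"],
})
records["week"] = records["date"].dt.isocalendar().week

weekly = (records.groupby(["week", "species"])
                 .agg(total=("count", "sum"),
                      traps=("trap_id", "nunique"))
                 .reset_index())
weekly["per_trap"] = weekly["total"] / weekly["traps"]    # normalized weekly abundance
print(weekly)
weekly.to_csv("weekly_abundance.csv", index=False)        # CSV export, as on the website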
Mills, Paul C; Woodall, Peter F; Bellingham, Mark; Noad, Michael; Lloyd, Shan
2007-01-01
There is a tendency for students from different nationalities to remain within groups of similar cultural backgrounds. The study reported here used group project work to encourage integration and cooperative learning between Australian students and Asian (Southeast Asian) international students in the second year of a veterinary science program. The group project involved an oral presentation during a second-year course (Structure and Function), with group formation engineered to include very high, high, moderate, and low achievers (based on previous grades). One Asian student and three Australian students were placed in each group. Student perceptions of group dynamics were analyzed through a self-report survey completed at the end of the presentations and through group student interviews. Results from the survey were analyzed by chi-square to compare the responses between Asian and Australian students, with statistical significance accepted at p < 0.05. There were too few Asian students for statistical analysis from a single year; therefore, the results from two successive years, 2004 (N = 104; 26% Asian) and 2005 (N = 105; 20% Asian), were analyzed. All participating students indicated in the interviews that the project was worthwhile and a good learning experience. Asian students expressed a greater preference for working in a group than for working alone (p = 0.001) and reported more frequently than Australian students that teamwork produces better results (p = 0.01). Australian students were more likely than Asian students to voice their opinion in a team setting (p = 0.001), while Asian students were more likely to depend on the lecturer for directions (p = 0.001). The results also showed that group project work appeared to create an environment that supported learning and was a successful strategy to achieve acceptance of cultural differences.
A SYSTEMATIC ANALYSIS OF CAUSTIC METHODS FOR GALAXY CLUSTER MASSES
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gifford, Daniel; Miller, Christopher; Kern, Nicholas
We quantify the expected observed statistical and systematic uncertainties of the escape velocity as a measure of the gravitational potential and total mass of galaxy clusters. We focus our attention on low redshift (z ≤ 0.15) clusters, where large and deep spectroscopic datasets currently exist. Utilizing a suite of Millennium Simulation semi-analytic galaxy catalogs, we find that the dynamical mass, as traced by either the virial relation or the escape velocity, is robust to variations in how dynamical friction is applied to 'orphan' galaxies in the mock catalogs (i.e., those galaxies whose dark matter halos have fallen below the resolution limit). We find that the caustic technique recovers the known halo masses (M200) with a third less scatter compared to the virial masses. The bias we measure increases quickly as the number of galaxies used decreases. For Ngal > 25, the scatter in the escape velocity mass is dominated by projections along the line-of-sight. Algorithmic uncertainties from the determination of the projected escape velocity profile are negligible. We quantify how target selection based on magnitude, color, and projected radial separation can induce small additional biases into the escape velocity masses. Using Ngal = 150 (25), the caustic technique has a per cluster scatter in ln(M|M200) of 0.3 (0.5) and bias 1% ± 3% (16% ± 5%) for clusters with masses >10^14 M_Sun at z < 0.15.
A perturbation method to the tent map based on Lyapunov exponent and its application
NASA Astrophysics Data System (ADS)
Cao, Lv-Chen; Luo, Yu-Ling; Qiu, Sen-Hui; Liu, Jun-Xiu
2015-10-01
Perturbation imposed on a chaotic system is an effective way to maintain its chaotic features. A novel parameter perturbation method for the tent map based on the Lyapunov exponent is proposed in this paper. The pseudo-random sequence generated by the tent map is sent to another chaotic map, the Chebyshev map, for post-processing. If the output value of the Chebyshev map falls into a certain range, it will be sent back to replace the parameter of the tent map. As a result, the parameter of the tent map keeps changing dynamically. The statistical analysis and experimental results prove that the disturbed tent map has a highly random distribution and achieves good cryptographic properties of a pseudo-random sequence. In this way, the method weakens the strong correlations caused by finite precision and effectively compensates for the dynamics degradation of the digital chaotic system. Project supported by the Guangxi Provincial Natural Science Foundation, China (Grant No. 2014GXNSFBA118271), the Research Project of Guangxi University, China (Grant No. ZD2014022), the Fund from Guangxi Provincial Key Laboratory of Multi-source Information Mining & Security, China (Grant No. MIMS14-04), the Fund from the Guangxi Provincial Key Laboratory of Wireless Wideband Communication & Signal Processing, China (Grant No. GXKL0614205), the Education Development Foundation and the Doctoral Research Foundation of Guangxi Normal University, the State Scholarship Fund of China Scholarship Council (Grant No. [2014]3012), and the Innovation Project of Guangxi Graduate Education, China (Grant No. YCSZ2015102).
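A sketch of the perturbation loop described above; the feedback range and the mapping of the Chebyshev output onto the tent-map parameter are illustrative assumptions, not the authors' exact settings.

import numpy as np

def tent(x, mu):
    return mu * x if x < 0.5 else mu * (1.0 - x)

def chebyshev(x, k=4):
    return np.cos(k * np.arccos(x))       # Chebyshev map on [-1, 1]

x, mu = 0.37, 1.99                        # tent-map state and parameter (chaotic regime)
seq = []
for _ in range(1000):
    x = tent(x, mu)
    seq.append(x)
    y = chebyshev(2.0 * x - 1.0)          # post-process the tent output
    if 0.0 < y < 0.5:                     # output falls in the assumed feedback range
        mu = 1.9 + 0.05 * (y + 1.0)       # replace the parameter, keeping mu in (1.95, 1.975)
print("last values of the perturbed sequence:", np.round(seq[-5:], 4))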
DOT National Transportation Integrated Search
2010-12-01
This report presents the results for the national evaluation of the FY 2003 Earmarked ITS Integration Project: Southern Wyoming, I-80 Dynamic Message Signs. The I-80 Dynamic Message Signs project is a rural infrastructure deployment of ITS devices th...
Acquisition and production of skilled behavior in dynamic decision-making tasks
NASA Technical Reports Server (NTRS)
Kirlik, Alex
1993-01-01
Summaries of the four projects completed during the performance of this research are included. The four projects described are: Perceptual Augmentation Aiding for Situation Assessment, Perceptual Augmentation Aiding for Dynamic Decision-Making and Control, Action Advisory Aiding for Dynamic Decision-Making and Control, and Display Design to Support Time-Constrained Route Optimization. Papers based on each of these projects are currently in preparation. The theoretical framework upon which the first three projects are based, Ecological Task Analysis, was also developed during the performance of this research, and is described in a previous report. A project concerned with modeling strategies in human control of a dynamic system was also completed during the performance of this research.
A microscopic model of the Stokes-Einstein relation in arbitrary dimension.
Charbonneau, Benoit; Charbonneau, Patrick; Szamel, Grzegorz
2018-06-14
The Stokes-Einstein relation (SER) is one of the most robust and widely employed results from the theory of liquids. Yet sizable deviations can be observed for self-solvation, which cannot be explained by the standard hydrodynamic derivation. Here, we revisit the work of Masters and Madden [J. Chem. Phys. 74, 2450-2459 (1981)], who first solved a statistical mechanics model of the SER using the projection operator formalism. By generalizing their analysis to all spatial dimensions and to partially structured solvents, we identify a potential microscopic origin of some of these deviations. We also reproduce the SER-like result from the exact dynamics of infinite-dimensional fluids.
Long-term noise statistics from the Gulf of Mexico
NASA Astrophysics Data System (ADS)
Eller, Anthony I.; Ioup, George E.; Ioup, Juliette W.; Larue, James P.
2003-04-01
Long-term, omnidirectional acoustic noise measurements were conducted in the northeastern Gulf of Mexico during the summer of 2001. These efforts were a part of the Littoral Acoustic Demonstration Center project, Phase I. Initial looks at the noise time series, processed in standard one-third-octave bands from 10 to 5000 Hz, show noise levels that differ substantially from customary deep-water noise spectra. Contributing factors to this highly dynamic noise environment are an abundance of marine mammal emissions and various industrial noises. Results presented here address long-term temporal variability, temporal coherence times, the fluctuation spectrum, and coherence of fluctuations across the frequency spectrum. [Research supported by ONR.]
SU-E-J-261: Statistical Analysis and Chaotic Dynamics of Respiratory Signal of Patients in BodyFix
DOE Office of Scientific and Technical Information (OSTI.GOV)
Michalski, D; Huq, M; Bednarz, G
Purpose: To quantify the respiratory signal of patients in BodyFix undergoing 4DCT scans with and without the immobilization cover. Methods: 20 pairs of respiratory tracks recorded with the RPM system during 4DCT scans were analyzed. Descriptive statistics were applied to selected parameters of the exhale-inhale decomposition. Standardized signals were used with the delay method to build orbits in embedded space. Nonlinear behavior was tested with surrogate data. Sample entropy (SE), Lempel-Ziv complexity (LZC) and the largest Lyapunov exponents (LLE) were compared. Results: Statistical tests show a difference between scans for inspiration time and its variability, which is bigger for scans without the cover. The same is true for the variability of the end of exhalation and inhalation. Other parameters fail to show a difference. For both scans the respiratory signals show determinism and nonlinear stationarity. Statistical tests on surrogate data reveal their nonlinearity. LLEs show the signals' chaotic nature and its correlation with the breathing period and its embedding delay time. SE, LZC and LLE measure respiratory signal complexity. Nonlinear characteristics do not differ between scans. Conclusion: Contrary to expectation, the cover applied to patients in BodyFix appears to have a limited effect on signal parameters. Analysis based on trajectories of delay vectors shows the respiratory system's nonlinear character and its sensitive dependence on initial conditions. Reproducibility of the respiratory signal can be evaluated with measures of signal complexity and its predictability window. A longer respiratory period is conducive to signal reproducibility, as shown by these gauges. Statistical independence of the exhale and inhale times is also supported by the magnitude of the LLE. The nonlinear parameters seem more appropriate for gauging respiratory signal complexity given its deterministic chaotic nature. This contrasts with measures based on harmonic analysis, which are blind to nonlinear features. The dynamics of breathing, so crucial for 4D-based clinical technologies, can be better controlled if a nonlinearity-based methodology, which reflects the respiration characteristics, is applied. Funding provided by Varian Medical Systems via an Investigator Initiated Research Project.
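One of the complexity measures named above, sample entropy, can be sketched as follows for a standardized breathing-like trace; the embedding length m, tolerance r, and the synthetic signal are illustrative choices, not the study's settings.

import numpy as np

def sample_entropy(x, m=2, r=0.2):
    x = (x - x.mean()) / x.std()          # standardize, so r is in units of SD
    def matches(mm):
        templates = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        count = 0
        for i in range(len(templates) - 1):
            dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += np.count_nonzero(dist <= r)
        return count
    B, A = matches(m), matches(m + 1)     # template matches of length m and m + 1
    return -np.log(A / B) if A > 0 and B > 0 else np.inf

rng = np.random.default_rng(3)
t = np.arange(0, 60, 0.1)                 # roughly 60 s sampled at 10 Hz
signal = np.sin(2 * np.pi * t / 4.0) + 0.1 * rng.standard_normal(t.size)
print("SampEn of a noisy quasi-periodic trace:", round(sample_entropy(signal), 3))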
NASA Astrophysics Data System (ADS)
Tibaduiza, D.-A.; Torres-Arredondo, M.-A.; Mujica, L. E.; Rodellar, J.; Fritzen, C.-P.
2013-12-01
This article is concerned with the practical use of Multiway Principal Component Analysis (MPCA), Discrete Wavelet Transform (DWT), Squared Prediction Error (SPE) measures and Self-Organizing Maps (SOM) to detect and classify damages in mechanical structures. The formalism is based on a distributed piezoelectric active sensor network for the excitation and detection of structural dynamic responses. Statistical models are built using PCA when the structure is known to be healthy, either directly from the dynamic responses or from wavelet coefficients at different scales representing time-frequency information. Different damages on the tested structures are simulated by adding masses at different positions. The data from the structure in different states (damaged or not) are then projected into the different principal component models by each actuator in order to obtain the input feature vectors for a SOM from the scores and the SPE measures. An aircraft fuselage from an Airbus A320 and a multi-layered carbon fiber reinforced plastic (CFRP) plate are used as examples to test the approaches. Results are presented, compared and discussed in order to determine their potential in structural health monitoring. These results showed that all the simulated damages were detectable and the selected features proved capable of separating all damage conditions from the undamaged state for both approaches.
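A stripped-down sketch of the PCA/SPE step described above, with synthetic features standing in for the piezoelectric responses or wavelet coefficients; the scores and SPE values computed this way would then feed the SOM.

import numpy as np

# Synthetic "healthy" features: a low-rank structure plus noise (all values invented).
rng = np.random.default_rng(4)
loadings = rng.standard_normal((5, 20))
healthy = rng.standard_normal((200, 5)) @ loadings + 0.1 * rng.standard_normal((200, 20))

# Build the statistical (PCA) model from the healthy state only.
mean = healthy.mean(axis=0)
U, S, Vt = np.linalg.svd(healthy - mean, full_matrices=False)
k = 5
P = Vt[:k].T                              # retained loading matrix

def scores_and_spe(X):
    Xc = X - mean
    T = Xc @ P                            # scores (projection onto the PCA model)
    spe = np.sum((Xc - T @ P.T) ** 2, axis=1)   # Squared Prediction Error per observation
    return T, spe

# A "damaged" case: the same data plus a fixed shift, mimicking an added mass.
_, spe_healthy = scores_and_spe(healthy)
_, spe_damaged = scores_and_spe(healthy + 0.5 * rng.standard_normal(20))
print("median SPE, healthy vs damaged:", np.median(spe_healthy), np.median(spe_damaged))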
Building integral projection models: a user's guide
Rees, Mark; Childs, Dylan Z; Ellner, Stephen P; Coulson, Tim
2014-01-01
In order to understand how changes in individual performance (growth, survival or reproduction) influence population dynamics and evolution, ecologists are increasingly using parameterized mathematical models. For continuously structured populations, where some continuous measure of individual state influences growth, survival or reproduction, integral projection models (IPMs) are commonly used. We provide a detailed description of the steps involved in constructing an IPM, explaining how to: (i) translate your study system into an IPM; (ii) implement your IPM; and (iii) diagnose potential problems with your IPM. We emphasize how the study organism's life cycle, and the timing of censuses, together determine the structure of the IPM kernel and important aspects of the statistical analysis used to parameterize an IPM using data on marked individuals. An IPM based on population studies of Soay sheep is used to illustrate the complete process of constructing, implementing and evaluating an IPM fitted to sample data. We then look at very general approaches to parameterizing an IPM, using a wide range of statistical techniques (e.g. maximum likelihood methods, generalized additive models, nonparametric kernel density estimators). Methods for selecting models for parameterizing IPMs are briefly discussed. We conclude with key recommendations and a brief overview of applications that extend the basic model. The online Supporting Information provides commented R code for all our analyses. PMID:24219157
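A toy discretization of an IPM kernel using the midpoint rule, with invented vital-rate functions (not the Soay sheep fits); the dominant eigenvalue of the resulting matrix gives the asymptotic population growth rate.

import numpy as np
from scipy.stats import norm

n, lower, upper = 100, 0.0, 10.0
h = (upper - lower) / n
z = lower + (np.arange(n) + 0.5) * h                 # midpoints of the size mesh

surv = 1.0 / (1.0 + np.exp(-(z - 3.0)))              # survival s(z)
fec = 0.5 * np.exp(0.1 * z)                          # offspring number b(z)

Zn, Zo = np.meshgrid(z, z, indexing="ij")            # rows: next size z', columns: current size z
growth = norm.pdf(Zn, loc=0.9 * Zo + 1.0, scale=0.8) # growth kernel g(z'|z)
offspring = norm.pdf(z, loc=2.0, scale=0.5)          # offspring size distribution c(z')

P = growth * surv[None, :]                           # survival-growth kernel
F = offspring[:, None] * fec[None, :]                # fecundity kernel
K = h * (P + F)                                      # discretized IPM kernel

lam = np.max(np.real(np.linalg.eigvals(K)))          # asymptotic population growth rate
print("lambda:", round(lam, 3))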
Effects of household dynamics on resource consumption and biodiversity.
Liu, Jianguo; Daily, Gretchen C; Ehrlich, Paul R; Luck, Gary W
2003-01-30
Human population size and growth rate are often considered important drivers of biodiversity loss, whereas household dynamics are usually neglected. Aggregate demographic statistics may mask substantial changes in the size and number of households, and their effects on biodiversity. Household dynamics influence per capita consumption and thus biodiversity through, for example, consumption of wood for fuel, habitat alteration for home building and associated activities, and greenhouse gas emissions. Here we report that growth in household numbers globally, and particularly in countries with biodiversity hotspots (areas rich in endemic species and threatened by human activities), was more rapid than aggregate population growth between 1985 and 2000. Even when population size declined, the number of households increased substantially. Had the average household size (that is, the number of occupants) remained static, there would have been 155 million fewer households in hotspot countries in 2000. Reduction in average household size alone will add a projected 233 million additional households to hotspot countries during the period 2000-15. Rapid increase in household numbers, often manifested as urban sprawl, and resultant higher per capita resource consumption in smaller households pose serious challenges to biodiversity conservation.
Undergraduate Statistics Education and the National Science Foundation
ERIC Educational Resources Information Center
Hall, Megan R.; Rowell, Ginger Holmes
2008-01-01
This paper describes 25 National Science Foundation supported projects that have innovations designed to improve education for students majoring or minoring in statistics. The characteristics of these projects and the common themes which emerge are compared with the American Statistical Association's (ASA) guidelines for developing statistics…
Introductory Statistics Education and the National Science Foundation
ERIC Educational Resources Information Center
Hall, Megan R.; Rowell, Ginger Holmes
2008-01-01
This paper describes 27 National Science Foundation supported grant projects that have innovations designed to improve teaching and learning in introductory statistics courses. The characteristics of these projects are compared with the six recommendations given in the "Guidelines for Assessment and Instruction in Statistics Education (GAISE)…
Temporal scaling and spatial statistical analyses of groundwater level fluctuations
NASA Astrophysics Data System (ADS)
Sun, H.; Yuan, L., Sr.; Zhang, Y.
2017-12-01
Natural dynamics such as groundwater level fluctuations can exhibit multifractionality and/or multifractality, likely due to multi-scale aquifer heterogeneity and controlling factors, whose statistics require efficient quantification methods. This study explores multifractionality and non-Gaussian properties in groundwater dynamics, expressed by time series of daily level fluctuations at three wells located in the lower Mississippi valley, after removing the seasonal cycle, using temporal scaling and spatial statistical analysis. First, using the time-scale multifractional analysis, a systematic statistical method is developed to analyze groundwater level fluctuations quantified by the time-scale local Hurst exponent (TS-LHE). Results show that the TS-LHE does not remain constant, implying fractal-scaling behavior that changes with time and location. Hence, we can distinguish the potentially location-dependent scaling feature, which may characterize the hydrologic dynamic system. Second, spatial statistical analysis shows that the increment of groundwater level fluctuations exhibits a heavy-tailed, non-Gaussian distribution, which can be better quantified by a Lévy stable distribution. Monte Carlo simulations of the fluctuation process also show that the linear fractional stable motion model can well depict the transient dynamics (i.e., the fractal non-Gaussian property) of groundwater level, while fractional Brownian motion is inadequate to describe natural processes with anomalous dynamics. Analysis of temporal scaling and spatial statistics therefore may provide useful information and quantification to further understand the nature of complex dynamics in hydrology.
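As a rough companion to the TS-LHE analysis above (not the paper's method), a classical rescaled-range (R/S) estimate of a single Hurst exponent for an increment series can be sketched as follows; synthetic white noise is used, for which H should come out near 0.5.

import numpy as np

def hurst_rs(x, window_sizes=(16, 32, 64, 128, 256)):
    rs = []
    for w in window_sizes:
        chunks = x[: (len(x) // w) * w].reshape(-1, w)
        z = np.cumsum(chunks - chunks.mean(axis=1, keepdims=True), axis=1)
        R = z.max(axis=1) - z.min(axis=1)          # range of the cumulative deviations
        S = chunks.std(axis=1)                     # standard deviation per window
        rs.append(np.mean(R / S))
    slope, _ = np.polyfit(np.log(window_sizes), np.log(rs), 1)
    return slope

rng = np.random.default_rng(5)
increments = rng.standard_normal(4096)             # stand-in for de-seasonalized level increments
print("estimated Hurst exponent:", round(hurst_rs(increments), 2))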
An Examination of Statistical Power in Multigroup Dynamic Structural Equation Models
ERIC Educational Resources Information Center
Prindle, John J.; McArdle, John J.
2012-01-01
This study used statistical simulation to calculate differential statistical power in dynamic structural equation models with groups (as in McArdle & Prindle, 2008). Patterns of between-group differences were simulated to provide insight into how model parameters influence power approximations. Chi-square and root mean square error of…
A microprocessor card software server to support the Quebec health microprocessor card project.
Durant, P; Bérubé, J; Lavoie, G; Gamache, A; Ardouin, P; Papillon, M J; Fortin, J P
1995-01-01
The Quebec Health Smart Card Project is advocating the use of a memory card software server[1] (SCAM) to implement a portable medical record (PMR) on a smart card. The PMR is viewed as an object that can be manipulated by SCAM's services. In fact, we can talk about a pseudo-object-oriented approach. This software architecture provides a flexible and evolutive way to manage and optimize the PMR. SCAM is a generic software server; it can manage smart cards as well as optical (laser) cards or other types of memory cards. But, in the specific case of the Quebec Health Card Project, SCAM is used to provide services between physicians' or pharmacists' software and IBM smart card technology. We propose to expose the concepts and techniques used to provide a generic environment to deal with smart cards (and more generally with memory cards), to obtain a dynamic and evolutive PMR, to raise the system's global security level and data integrity, to significantly optimize the management of the PMR, and to provide statistical information about the use of the PMR.
Zooming in on vibronic structure by lowest-value projection reconstructed 4D coherent spectroscopy
NASA Astrophysics Data System (ADS)
Harel, Elad
2018-05-01
A fundamental goal of chemical physics is an understanding of microscopic interactions in liquids at and away from equilibrium. In principle, this microscopic information is accessible by high-order and high-dimensionality nonlinear optical measurements. Unfortunately, the time required to execute such experiments increases exponentially with the dimensionality, while the signal decreases exponentially with the order of the nonlinearity. Recently, we demonstrated a non-uniform acquisition method based on radial sampling of the time-domain signal [W. O. Hutson et al., J. Phys. Chem. Lett. 9, 1034 (2018)]. The four-dimensional spectrum was then reconstructed by filtered back-projection using an inverse Radon transform. Here, we demonstrate an alternative reconstruction method based on the statistical analysis of different back-projected spectra which results in a dramatic increase in sensitivity and at least a 100-fold increase in dynamic range compared to conventional uniform sampling and Fourier reconstruction. These results demonstrate that alternative sampling and reconstruction methods enable applications of increasingly high-order and high-dimensionality methods toward deeper insights into the vibronic structure of liquids.
Luedeling, Eike; Zhang, Minghua; Girvetz, Evan H
2009-07-16
Winter chill is one of the defining characteristics of a location's suitability for the production of many tree crops. We mapped and investigated observed historic and projected future changes in winter chill in California, quantified with two different chilling models (Chilling Hours, Dynamic Model). Based on hourly and daily temperature records, winter chill was modeled for two past temperature scenarios (1950 and 2000), and 18 future scenarios (average conditions during 2041-2060 and 2080-2099 under each of the B1, A1B and A2 IPCC greenhouse gas emissions scenarios, for the CSIRO-MK3, HadCM3 and MIROC climate models). For each scenario, 100 replications of the yearly temperature record were produced, using a stochastic weather generator. We then introduced and mapped a novel climatic statistic, "safe winter chill", the 10% quantile of the resulting chilling distributions. This metric can be interpreted as the amount of chilling that growers can safely expect under each scenario. Winter chill declined substantially for all emissions scenarios, with the area of safe winter chill for many tree species or cultivars decreasing 50-75% by mid-21st century, and 90-100% by late century. Both chilling models consistently projected climatic conditions by the middle to end of the 21st century that will no longer support some of the main tree crops currently grown in California, with the Chilling Hours Model projecting greater changes than the Dynamic Model. The tree crop industry in California will likely need to develop agricultural adaptation measures (e.g. low-chill varieties and dormancy-breaking chemicals) to cope with these projected changes. For some crops, production might no longer be possible.
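The "safe winter chill" statistic introduced above is simply the 10% quantile of the chilling distribution across the stochastic weather-generator replications; a sketch with invented chill totals:

import numpy as np

rng = np.random.default_rng(6)
chill_replications = rng.normal(loc=45.0, scale=8.0, size=100)   # 100 simulated winters (units would
                                                                 # be Chill Portions or Chilling Hours)
safe_winter_chill = np.percentile(chill_replications, 10)
print("safe winter chill (10% quantile):", round(safe_winter_chill, 1))
# A cultivar needing more chill than this value would fail in more than 10% of
# winters under the scenario represented by these replications.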
An ocean dynamical thermostat—dominant in observations, absent in climate models
NASA Astrophysics Data System (ADS)
Coats, S.; Karnauskas, K. B.
2016-12-01
The pattern of sea surface temperature (SST) in the tropical Pacific Ocean is coupled to the Walker circulation, necessitating an understanding of how this pattern will change in response to anthropogenic radiative forcing. State-of-the-art climate models from the Coupled Model Intercomparison Project phase 5 (CMIP5) overwhelmingly project a decrease in the tropical Pacific zonal SST gradient over the coming century. This decrease in the zonal SST gradient is a response of the ocean to a weakening Walker circulation in the CMIP5 models, a consequence of the mass and energy balances of the hydrologic cycle identified by Held and Soden (2006). CMIP5 models, however, are not able to reproduce the observed increase in the zonal SST gradient between 1900-2013 C.E., which we argue to be robust using advanced statistical techniques and new observational datasets. While the observed increase in the zonal SST gradient is suggestive of the ocean dynamical thermostat mechanism of Clement et al. (1996), a strengthening Equatorial Undercurrent (EUC) also contributes to eastern equatorial Pacific cooling. Importantly, the strengthening EUC is a response of the ocean to a seasonal weakening of the Walker circulation and thus can reconcile disparate observations of changes to the atmosphere and ocean in the equatorial Pacific. CMIP5 models do not capture the magnitude of this response of the EUC to anthropogenic radiative forcing potentially because of biases in the sensitivity of the EUC to changes in zonal wind stress, like the weakening Walker circulation. Consequently, they project a continuation of the opposite to what has been observed in the real world, with potentially serious consequences for projected climate impacts that are influenced by the tropical Pacific.
NASA Astrophysics Data System (ADS)
Coats, Sloan; Karnauskas, Kristopher
2017-04-01
The pattern of sea surface temperature (SST) in the tropical Pacific Ocean provides an important control on global climate, necessitating an understanding of how this pattern will change in response to anthropogenic radiative forcing. State-of-the-art climate models from the Coupled Model Intercomparison Project phase 5 (CMIP5) overwhelmingly project a decrease in the tropical Pacific zonal SST gradient over the coming century. This decrease is, in part, a response of the ocean to a weakening Walker circulation in the CMIP5 models, a consequence of the mass and energy balances of the hydrologic cycle identified by Held and Soden (2006). CMIP5 models, however, are not able to reproduce the observed increase in the zonal SST gradient between 1900-2013 C.E., which we argue to be robust using advanced statistical techniques and new observational datasets. While this increase is suggestive of the ocean dynamical thermostat mechanism of Clement et al. (1996), we provide evidence that a strengthening Equatorial Undercurrent (EUC) also contributes to eastern equatorial Pacific cooling. Importantly, the strengthening EUC is a response of the ocean to a weakening Walker circulation and thus can help to reconcile the range of opposing theories and observations of anthropogenic climate change in the tropical Pacific Ocean. Because of a newly identified bias in their simulation of equatorial coupled atmosphere-ocean dynamics, however, CMIP5 models do not capture the magnitude of the response of the EUC to anthropogenic radiative forcing. Consequently, they project a continuation of the opposite to what has been observed in the real world, with potentially serious consequences for projected climate impacts that are influenced by the tropical Pacific Ocean.
Effective control of complex turbulent dynamical systems through statistical functionals.
Majda, Andrew J; Qi, Di
2017-05-30
Turbulent dynamical systems characterized by both a high-dimensional phase space and a large number of instabilities are ubiquitous among complex systems in science and engineering, including climate, material, and neural science. Control of these complex systems is a grand challenge, for example, in mitigating the effects of climate change or safe design of technology with fully developed shear turbulence. Control of flows in the transition to turbulence, where there is a small dimension of instabilities about a basic mean state, is an important and successful discipline. In complex turbulent dynamical systems, it is impossible to track and control the large dimension of instabilities, which strongly interact and exchange energy, and new control strategies are needed. The goal of this paper is to propose an effective statistical control strategy for complex turbulent dynamical systems based on a recent statistical energy principle and statistical linear response theory. We illustrate the potential practical efficiency and verify this effective statistical control strategy on the 40D Lorenz 1996 model in forcing regimes with various types of fully turbulent dynamics with nearly one-half of the phase space unstable.
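The 40-dimensional Lorenz 96 model used as a test bed above can be integrated with a standard fourth-order Runge-Kutta scheme; the forcing and step size below are common illustrative choices, not necessarily the paper's settings.

import numpy as np

def l96_rhs(x, F=8.0):
    # dx_i/dt = (x_{i+1} - x_{i-2}) x_{i-1} - x_i + F, with periodic indices
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + F

def rk4_step(x, dt, F=8.0):
    k1 = l96_rhs(x, F)
    k2 = l96_rhs(x + 0.5 * dt * k1, F)
    k3 = l96_rhs(x + 0.5 * dt * k2, F)
    k4 = l96_rhs(x + dt * k3, F)
    return x + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

x = 8.0 + 0.01 * np.random.default_rng(7).standard_normal(40)   # 40D state near the fixed point
dt = 0.01
states = []
for _ in range(5000):
    x = rk4_step(x, dt)
    states.append(x.copy())
states = np.array(states)
print("time-mean energy (0.5 * sum x_i^2):", 0.5 * np.mean(np.sum(states**2, axis=1)))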
Projections of Education Statistics to 2001: An Update.
ERIC Educational Resources Information Center
Gerald, Debra E.; Hussar, William J.
Statistical projections for elementary and secondary schools and institutions of higher education are provided at the national and state levels through the year 2001. National projection tables cover enrollment, high school graduates, earned degrees conferred, classroom teachers, and expenditures of public elementary and secondary schools.…
Improving Statistical Skills through Students' Participation in the Development of Resources
ERIC Educational Resources Information Center
Biza, Irene; Vande Hey, Eugénie
2015-01-01
This paper summarizes the evaluation of a project that involved undergraduate mathematics students in the development of teaching and learning resources for statistics modules taught in various departments of a university. This evaluation regards students' participation in the project and its impact on their learning of statistics, as…
Making Statistics "Real" for Social Work Students
ERIC Educational Resources Information Center
Wells, Melissa
2006-01-01
This article presents results from an evaluation of service learning in statistics courses for master of social work students. The article provides an overview of the application of a community-based statistics project, describes student feedback regarding the project, and illustrates some strengths and limitations of using this pedagogy with…
Martin, Daniel R; Matyushov, Dmitry V
2012-08-30
We show that electrostatic fluctuations of the protein-water interface are globally non-Gaussian. The electrostatic component of the optical transition energy (energy gap) in a hydrated green fluorescent protein is studied here by classical molecular dynamics simulations. The distribution of the energy gap displays a high excess in the breadth of electrostatic fluctuations over the prediction of the Gaussian statistics. The energy gap dynamics include a nanosecond component. When simulations are repeated with frozen protein motions, the statistics shifts to the expectations of linear response and the slow dynamics disappear. We therefore suggest that both the non-Gaussian statistics and the nanosecond dynamics originate largely from global, low-frequency motions of the protein coupled to the interfacial water. The non-Gaussian statistics can be experimentally verified from the temperature dependence of the first two spectral moments measured at constant-volume conditions. Simulations at different temperatures are consistent with other indicators of the non-Gaussian statistics. In particular, the high-temperature part of the energy gap variance (second spectral moment) scales linearly with temperature and extrapolates to zero at a temperature characteristic of the protein glass transition. This result, violating the classical limit of the fluctuation-dissipation theorem, leads to a non-Boltzmann statistics of the energy gap and corresponding non-Arrhenius kinetics of radiationless electronic transitions, empirically described by the Vogel-Fulcher-Tammann law.
NASA Technical Reports Server (NTRS)
Schweikhard, W. G.; Chen, Y. S.
1986-01-01
The Melick method of inlet flow dynamic distortion prediction by statistical means is outlined. A hypothetical vortex model is used as the basis for the mathematical formulations. The main variables are identified by matching the theoretical total pressure rms ratio with the measured total pressure rms ratio. Data comparisons, using the HiMAT inlet test data set, indicate satisfactory prediction of the dynamic peak distortion for cases with boundary layer control device vortex generators. A method for the dynamic probe selection was developed. Validity of the probe selection criteria is demonstrated by comparing the reduced-probe predictions with the 40-probe predictions. It is indicated that the number of dynamic probes can be reduced to as few as two and still retain good accuracy.
A functional-dynamic reflection on participatory processes in modeling projects.
Seidl, Roman
2015-12-01
The participation of nonscientists in modeling projects/studies is increasingly employed to fulfill different functions. However, it is not well investigated whether, and how explicitly, these functions and the dynamics of a participatory process are reflected by modeling projects in particular. In this review study, I explore participatory modeling projects from a functional-dynamic process perspective. The main differences among projects relate to the functions of participation; most often, more than one function per project can be identified, along with the degree of explicit reflection (i.e., awareness and anticipation) on the dynamic process perspective. Moreover, two main approaches are revealed: participatory modeling, covering diverse approaches, and companion modeling. It becomes apparent that the degree of reflection on the participatory process itself is not always explicit and perfectly visible in the descriptions of the modeling projects. Thus, the use of common protocols or templates is discussed to facilitate project planning, as well as the publication of project results. A generic template may help, not in providing details of a project or model development, but in explicitly reflecting on the participatory process. It can serve to systematize the particular project's approach to stakeholder collaboration, and thus quality management.
Sapsis, Themistoklis P; Majda, Andrew J
2013-08-20
A framework for low-order predictive statistical modeling and uncertainty quantification in turbulent dynamical systems is developed here. These reduced-order, modified quasilinear Gaussian (ROMQG) algorithms apply to turbulent dynamical systems in which there is significant linear instability or linear nonnormal dynamics in the unperturbed system and energy-conserving nonlinear interactions that transfer energy from the unstable modes to the stable modes where dissipation occurs, resulting in a statistical steady state; such turbulent dynamical systems are ubiquitous in geophysical and engineering turbulence. The ROMQG method involves constructing a low-order, nonlinear, dynamical system for the mean and covariance statistics in the reduced subspace that has the unperturbed statistics as a stable fixed point and optimally incorporates the indirect effect of non-Gaussian third-order statistics for the unperturbed system in a systematic calibration stage. This calibration procedure is achieved through information involving only the mean and covariance statistics for the unperturbed equilibrium. The performance of the ROMQG algorithm is assessed on two stringent test cases: the 40-mode Lorenz 96 model mimicking midlatitude atmospheric turbulence and two-layer baroclinic models for high-latitude ocean turbulence with over 125,000 degrees of freedom. In the Lorenz 96 model, the ROMQG algorithm with just a single mode captures the transient response to random or deterministic forcing. For the baroclinic ocean turbulence models, the inexpensive ROMQG algorithm with 252 modes, less than 0.2% of the total, captures the nonlinear response of the energy, the heat flux, and even the one-dimensional energy and heat flux spectra.
A Survey of Probabilistic Methods for Dynamical Systems with Uncertain Parameters.
1986-05-01
J., "An Approach to the Theoretical Background of Statistical Energy Analysis Applied to Structural Vibration," Journ. Acoust. Soc. Amer., Vol. 69...1973, Sect. 8.3. 80. Lyon, R.H., " Statistical Energy Analysis of Dynamical Systems," M.I.T. Press, 1975. e) Late References added in Proofreading !! 81...Dowell, E.H., and Kubota, Y., "Asymptotic Modal Analysis and ’~ y C-" -165- Statistical Energy Analysis of Dynamical Systems," Journ. Appi. - Mech
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-07
... statistical and other methodological consultation to this collaborative project. Discussion: Grantees under... and technical assistance must be designed to contribute to the following outcomes: (a) Maintenance of... methodological consultation available for research projects that use the BMS Database, as well as site- specific...
ERIC Educational Resources Information Center
Williamson, Bob; And Others
1985-01-01
Describes Statistical Work Analysis Teams (S.W.A.T.), which marry the two factors necessary for successful statistical analysis with the personal nature of attribute data into a single effort. Discusses S.W.A.T. project guidelines, implementation of the first S.W.A.T. projects, team training, and project completion. (CT)
Development of a funding, cost, and spending model for satellite projects
NASA Technical Reports Server (NTRS)
Johnson, Jesse P.
1989-01-01
The need for a predictive budget/funding model is obvious. The current models used by the Resource Analysis Office (RAO) are used to predict the total costs of satellite projects. An effort to extend the modeling capabilities from total budget analysis to analysis of total budget and budget outlays over time was conducted. A statistically based and data-driven methodology was used to derive and develop the model. The budget data for the last 18 GSFC-sponsored satellite projects were analyzed and used to build a funding model which would describe the historical spending patterns. This raw data consisted of dollars spent in each specific year and their 1989-dollar equivalent. This data was converted to the standard format used by the RAO group and placed in a database. A simple statistical analysis was performed to calculate the gross statistics associated with project length and project cost and the conditional statistics on project length and project cost. The modeling approach used is derived from the theory of embedded statistics, which states that properly analyzed data will produce the underlying generating function. The process of funding large-scale projects over extended periods of time is described by Life Cycle Cost Models (LCCM). The data was analyzed to find a model in the generic form of a LCCM. The model developed is based on a Weibull function whose parameters are found by both nonlinear optimization and nonlinear regression. In order to use this model it is necessary to transform the problem from a dollar/time space to a percentage-of-total-budget/time space. This transformation is equivalent to moving to a probability space. By using the basic rules of probability, the validity of both the optimization and the regression steps is ensured. This statistically significant model is then integrated and inverted. The resulting output represents a project schedule which relates the amount of money spent to the percentage of project completion.
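A sketch of fitting a Weibull-shaped cumulative spending curve in the transformed percentage-of-budget versus fraction-of-schedule space; the sample spending profile below is invented, not GSFC data.

import numpy as np
from scipy.optimize import curve_fit

def weibull_cdf(t, k, lam):
    # Cumulative fraction of the budget spent by schedule fraction t.
    return 1.0 - np.exp(-(t / lam) ** k)

t_frac = np.linspace(0.05, 1.0, 20)                        # fraction of project duration
spend_frac = weibull_cdf(t_frac, 2.2, 0.55) + 0.02 * np.random.default_rng(8).standard_normal(20)

(k, lam), _ = curve_fit(weibull_cdf, t_frac, spend_frac, p0=[2.0, 0.5])
print(f"fitted shape k={k:.2f}, scale lambda={lam:.2f}")
print("predicted cumulative spend at 50% of schedule:", round(weibull_cdf(0.5, k, lam), 2))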
Statistical-Dynamical Seasonal Forecasts of Central-Southwest Asian Winter Precipitation.
NASA Astrophysics Data System (ADS)
Tippett, Michael K.; Goddard, Lisa; Barnston, Anthony G.
2005-06-01
Interannual precipitation variability in central-southwest (CSW) Asia has been associated with East Asian jet stream variability and western Pacific tropical convection. However, atmospheric general circulation models (AGCMs) forced by observed sea surface temperature (SST) poorly simulate the region's interannual precipitation variability. The statistical-dynamical approach uses statistical methods to correct systematic deficiencies in the response of AGCMs to SST forcing. Statistical correction methods linking model-simulated Indo-west Pacific precipitation and observed CSW Asia precipitation result in modest, but statistically significant, cross-validated simulation skill in the northeast part of the domain for the period from 1951 to 1998. The statistical-dynamical method is also applied to recent (winter 1998/99 to 2002/03) multimodel, two-tier December-March precipitation forecasts initiated in October. This period includes 4 yr (winter of 1998/99 to 2001/02) of severe drought. Tercile probability forecasts are produced using ensemble-mean forecasts and forecast error estimates. The statistical-dynamical forecasts show enhanced probability of below-normal precipitation for the four drought years and capture the return to normal conditions in part of the region during the winter of 2002/03.May Kabul be without gold, but not without snow.—Traditional Afghan proverb
Utilizing NASA EOS Data for Fire Management in el Departmento del Valle del Cauco, Colombia
NASA Astrophysics Data System (ADS)
Brenton, J. C.; Bledsoe, N.; Alabdouli, K.
2012-12-01
In the last few years, fire incidence in Colombian wild areas has increased, degrading pristine forests into savannas and sterile lands. Fire poses a significant threat to biodiversity, rural communities and established infrastructure. These events create an urgent need to address this problem. NASA's Earth Observing System (EOS) can play a significant role in the monitoring of fires and natural disasters. SERVIR, the Regional Visualization and Monitoring Network, constitutes a platform for the observation, forecasting and modeling of environmental processes in Central America. A project called "The GIS for fire management in Guatemala (SIGMA-I)" has already been conducted to address the same problem in another Latin American country, Guatemala. SIGMA-I was developed through inter-agency work among the National Protected Areas Council (CONAP), the National Forestry Institution (INAB), the National Coordinator for Disaster Reduction / National Forest Fire Prevention and Control System (CONRED/SIPECIF), and the Ministry of the Environment and Natural Resources (MARN) in Guatemala, under the guidance and assistance of SERVIR. With SIGMA-I as an example, we propose to conduct a similar project for the country of Colombia. First, a pilot study in the area of the watershed of the Cali River, Colombia was conducted to ensure that the data were available and that the maps and models were accurate. The proposed study will investigate the technical resources required: 1) a fire map with a compilation of ignition data (hot spots) utilizing the Fire Information for Resource Management System (FIRMS) derived from MODIS (Moderate Resolution Imaging Spectroradiometer) products MOD14 and MYD14; 2) a map of fire scars derived from medium-resolution satellite data (ASTER) during the period 2003-2011 for the entire country, and a map of fire scar recurrence and statistics derived from the datasets produced; 3) a pattern analysis and ignition cause model derived from a matrix of variables statistically exploring the demographic and environmental factors of fire risk, such as land surface temperature, precipitation, and NDVI; and 4) a dynamic fire risk evaluation able to generate a dynamic map of ignition risk based on statistical analysis factors. This study aims to integrate MODIS, Landsat and ASTER data along with in-situ data on environmental parameters from the Corporation of the Cauca Valley River (CVC), together with other data on social, economic and cultural variables obtained by researchers of the Wild Fire Observatory (OCIF) from the "Universidad Autónoma de Occidente", in order to create an ignition cause model and a dynamic fire risk evaluation system, and to compile all geospatial data generated for the region. In this way the research will help predict and forecast fire vulnerabilities in the region. The team undertook this project through SERVIR with the guidance of the scientist Victor Hugo Ramos, who was the leader and principal investigator on SIGMA-I.
NASA Technical Reports Server (NTRS)
Bales, K. S.
1983-01-01
The objectives, expected results, approach, and milestones for research projects of the IPAD Project Office and the impact dynamics, structural mechanics, and structural dynamics branches of the Structures and Dynamics Division are presented. Research facilities are described. Topics covered include computer-aided design; general aviation/transport crash dynamics; aircraft ground performance; composite structures; failure analysis; space vehicle dynamics; and large space structures.
Quinoa - Adaptive Computational Fluid Dynamics, 0.2
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bakosi, Jozsef; Gonzalez, Francisco; Rogers, Brandon
Quinoa is a set of computational tools that enables research and numerical analysis in fluid dynamics. At this time it remains a test-bed to experiment with various algorithms using fully asynchronous runtime systems. Currently, Quinoa consists of the following tools: (1) Walker, a numerical integrator for systems of stochastic differential equations in time. It is a mathematical tool to analyze and design the behavior of stochastic differential equations. It allows the estimation of arbitrary coupled statistics and probability density functions and is currently used for the design of statistical moment approximations for multiple mixing materials in variable-density turbulence. (2) Inciter, an overdecomposition-aware finite element field solver for partial differential equations using 3D unstructured grids. Inciter is used to research asynchronous mesh-based algorithms and to experiment with coupling asynchronous to bulk-synchronous parallel code. Two planned new features of Inciter, compared to the previous release (LA-CC-16-015), to be implemented in 2017, are (a) a simple Navier-Stokes solver for ideal single-material compressible gases, and (b) solution-adaptive mesh refinement (AMR), which enables dynamically concentrating compute resources to regions with interesting physics. Using the NS-AMR problem we plan to explore how to scale such high-load-imbalance simulations, representative of large production multiphysics codes, to very large problems on very large computers using an asynchronous runtime system. (3) RNGTest, a test harness to subject random number generators to stringent statistical tests enabling quantitative ranking with respect to their quality and computational cost. (4) UnitTest, a unit test harness, running hundreds of tests per second, capable of testing serial, synchronous, and asynchronous functions. (5) MeshConv, a mesh file converter that can be used to convert 3D tetrahedron meshes from and to either of the following formats: Gmsh (http://www.geuz.org/gmsh), Netgen (http://sourceforge.net/apps/mediawiki/netgen-mesher), ExodusII (http://sourceforge.net/projects/exodusii), HyperMesh (http://www.altairhyperworks.com/product/HyperMesh).
1987-06-15
GENERAL DYNAMICS, Fort Worth Division, Industrial Technology Modernization Program, Phase 2 Final Project Report, Project 28: Automation of Receiving. Contents include project assumptions; preliminary/final design and findings; system/equipment/machining specifications; and vendor/industry analysis.
The Application of Statistics Education Research in My Classroom
ERIC Educational Resources Information Center
Jordan, Joy
2007-01-01
A collaborative, statistics education research project (Lovett, 2001) is discussed. Some results of the project were applied in the computer lab sessions of my elementary statistics course. I detail the process of applying these research results, as well as the use of knowledge surveys. Furthermore, I give general suggestions to teachers who want…
Prototype development and demonstration for integrated dynamic transit operations.
DOT National Transportation Integrated Search
2016-01-01
This document serves as the Final Report specific to the Integrated Dynamic Transit Operations (IDTO) Prototype Development and Deployment Project, hereafter referred to as IDTO Prototype Deployment or IDTO PD project. This project was performed unde...
ERIC Educational Resources Information Center
Arantes do Amaral, Joao Alberto
2017-01-01
In this case study we discuss the dynamics that drive a free-of-charge project-based learning extension course. We discuss the lessons learned in the course, "Laboratory of Social Projects." The course aimed to teach project management skills to the participants. It was conducted from August to November of 2015, at Federal University of…
The Problem of Auto-Correlation in Parasitology
Pollitt, Laura C.; Reece, Sarah E.; Mideo, Nicole; Nussey, Daniel H.; Colegrave, Nick
2012-01-01
Explaining the contribution of host and pathogen factors in driving infection dynamics is a major ambition in parasitology. There is increasing recognition that analyses based on single summary measures of an infection (e.g., peak parasitaemia) do not adequately capture infection dynamics and so, the appropriate use of statistical techniques to analyse dynamics is necessary to understand infections and, ultimately, control parasites. However, the complexities of within-host environments mean that tracking and analysing pathogen dynamics within infections and among hosts poses considerable statistical challenges. Simple statistical models make assumptions that will rarely be satisfied in data collected on host and parasite parameters. In particular, model residuals (unexplained variance in the data) should not be correlated in time or space. Here we demonstrate how failure to account for such correlations can result in incorrect biological inference from statistical analysis. We then show how mixed effects models can be used as a powerful tool to analyse such repeated measures data in the hope that this will encourage better statistical practices in parasitology. PMID:22511865
The Pelagics Habitat Analysis Module (PHAM): Decision Support Tools for Pelagic Fisheries
NASA Astrophysics Data System (ADS)
Armstrong, E. M.; Harrison, D. P.; Kiefer, D.; O'Brien, F.; Hinton, M.; Kohin, S.; Snyder, S.
2009-12-01
PHAM is a project funded by NASA to integrate satellite imagery and circulation models into the management of commercial and threatened pelagic species. Specifically, the project merges data from fishery surveys, and fisheries catch and effort data with satellite imagery and circulation models to define the habitat of each species. This new information on habitat will then be used to inform population distribution and models of population dynamics that are used for management. During the first year of the project, we created two prototype modules. One module, which was developed for the Inter-American Tropical Tuna Commission, is designed to help improve information available to manage the tuna fisheries of the eastern Pacific Ocean. The other module, which was developed for the Coastal Pelagics Division of the Southwest Fishery Science Center, assists management of by-catch of mako, blue, and thresher sharks along the Californian coast. Both modules were built with the EASy marine geographic information system, which provides a 4 dimensional (latitude, longitude, depth, and time) home for integration of the data. The projects currently provide tools for automated downloading and geo-referencing of satellite imagery of sea surface temperature, height, and chlorophyll concentrations; output from JPL’s ECCO2 global circulation model and its ROM California current model; and gridded data from fisheries and fishery surveys. It also provides statistical tools for defining species habitat from these and other types of environmental data. These tools include unbalanced ANOVA, EOF analysis of satellite imagery, and multivariate search routines for fitting fishery data to transforms of the environmental data. Output from the projects consists of dynamic maps of the distribution of the species that are driven by the time series of satellite imagery and output from the circulation models. It also includes relationships between environmental variables and recruitment. During the talk, we will briefly demonstrate features of the software and present the results of our analyses of habitats.
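The abstract above mentions EOF analysis of satellite imagery as one of PHAM's statistical tools. The following is a minimal sketch, assuming a hypothetical stack of gridded SST anomaly maps, of how such an EOF decomposition could be set up with NumPy; the array shapes and variable names are illustrative assumptions, not the PHAM or EASy implementation.

```python
import numpy as np

# Hypothetical stack of gridded SST anomalies: (n_times, n_lat, n_lon).
rng = np.random.default_rng(0)
sst = rng.normal(size=(120, 40, 60))          # placeholder monthly maps

n_t, n_y, n_x = sst.shape
X = sst.reshape(n_t, n_y * n_x)               # flatten each map to a row
X = X - X.mean(axis=0)                        # remove the time mean per grid cell

# EOFs are the right singular vectors; PCs are the projections onto them.
U, S, Vt = np.linalg.svd(X, full_matrices=False)
variance_fraction = S**2 / np.sum(S**2)
eofs = Vt.reshape(-1, n_y, n_x)               # spatial patterns
pcs = U * S                                   # time series of each mode

print("Variance explained by first 3 EOFs:", variance_fraction[:3])
```

In practice the leading EOF patterns and their principal-component time series would then be used as candidate environmental predictors of species habitat.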
A Study of Arizona Labor Market Demand Data for Vocational Education Planning.
ERIC Educational Resources Information Center
Gould, Albert W.; Manning, Doris E.
A study examined the projection methodology used by the Bureau of Labor Statistics and the related projections made by the state employment security agencies. Findings from a literature review indicated that the system has steadily improved since 1979. Projections made from the Occupational Employment Statistics Surveys were remarkably accurate.…
Library Impact Data Project: Looking for the Link between Library Usage and Student Attainment
ERIC Educational Resources Information Center
Stone, Graham; Ramsden, Bryony
2013-01-01
The Library Impact Data Project was a six-month project funded by Jisc and managed by the University of Huddersfield to investigate this hypothesis: "There is a statistically significant correlation across a number of universities between library activity data and student attainment." E-resources usage, library borrowing statistics, and…
NASA Astrophysics Data System (ADS)
Xu, Lei; Chen, Nengcheng; Zhang, Xiang
2018-02-01
Drought is an extreme natural disaster that can lead to huge socioeconomic losses. Drought prediction months ahead is helpful for early drought warning and preparations. In this study, we developed a statistical model, two weighted dynamic models and a statistical-dynamic (hybrid) model for 1-6 month lead drought prediction in China. Specifically, the statistical component refers to climate signals weighted by support vector regression (SVR), the dynamic components consist of the ensemble mean (EM) and Bayesian model averaging (BMA) of the North American Multi-Model Ensemble (NMME) climate models, and the hybrid part denotes a combination of the statistical and dynamic components with weights assigned according to their historical performance. The results indicate that the statistical and hybrid models show better rainfall predictions than the NMME-EM and NMME-BMA models, which have good predictability only in southern China. In the 2011 China winter-spring drought event, the statistical model predicted the spatial extent and severity of drought nationwide well, although the severity was underestimated in the mid-lower reaches of the Yangtze River (MLRYR) region. The NMME-EM and NMME-BMA models largely overestimated rainfall in northern and western China in the 2011 drought. In the 2013 China summer drought, the NMME-EM model forecast the drought extent and severity in eastern China well, while the statistical and hybrid models falsely detected a negative precipitation anomaly (NPA) in some areas. Ensembles of multiple statistical approaches, multiple dynamic models or multiple hybrid models are highlighted as promising for drought prediction. These conclusions may be helpful for drought prediction and early drought warning in China.
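As a rough illustration of the hybrid idea described above, the sketch below combines a statistical SVR forecast with a dynamic multi-model ensemble mean using inverse-RMSE weights. All data, the number of models, and the specific weighting rule are assumptions for the sake of the example; the paper's exact predictors and weighting scheme may differ.

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(1)

# Hypothetical training data: climate-signal predictors -> monthly rainfall anomaly.
X_train = rng.normal(size=(200, 5))           # e.g., SST and circulation indices
y_train = X_train @ rng.normal(size=5) + 0.3 * rng.normal(size=200)
X_new = rng.normal(size=(12, 5))              # predictors for the forecast months

# Statistical component: support vector regression on climate signals.
stat_model = SVR(kernel="rbf", C=1.0).fit(X_train, y_train)
stat_fcst = stat_model.predict(X_new)

# Dynamic component: mean of NMME-like model forecasts (placeholder values).
dyn_members = rng.normal(size=(7, 12))        # 7 models x 12 lead months
dyn_fcst = dyn_members.mean(axis=0)

# Hybrid: weight the two components by historical skill (inverse RMSE here,
# one plausible choice; the study's actual weights are based on hindcasts).
rmse_stat, rmse_dyn = 0.8, 1.1                # assumed hindcast errors
w_stat = (1 / rmse_stat) / (1 / rmse_stat + 1 / rmse_dyn)
hybrid_fcst = w_stat * stat_fcst + (1 - w_stat) * dyn_fcst
print(hybrid_fcst)
```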
NASA Astrophysics Data System (ADS)
Izvekov, Sergei
2017-03-01
We consider the generalized Langevin equations of motion describing exactly the particle-based coarse-grained dynamics in the classical microscopic ensemble that were derived recently within the Mori-Zwanzig formalism based on new projection operators [S. Izvekov, J. Chem. Phys. 138(13), 134106 (2013)]. The fundamental difference between the new family of projection operators and the standard Zwanzig projection operator used in the past to derive the coarse-grained equations of motion is that the new operators average out the explicit irrelevant trajectories leading to the possibility of solving the projected dynamics exactly. We clarify the definition of the projection operators and revisit the formalism to compute the projected dynamics exactly for the microscopic system in equilibrium. The resulting expression for the projected force is in the form of a "generalized additive fluctuating force" describing the departure of the generalized microscopic force associated with the coarse-grained coordinate from its projection. Starting with this key expression, we formulate a new exact formula for the memory function in terms of microscopic and coarse-grained conservative forces. We conclude by studying two independent limiting cases of practical importance: the Markov limit (vanishing correlations of projected force) and the limit of weak dependence of the memory function on the particle momenta. We present computationally affordable expressions which can be efficiently evaluated from standard molecular dynamics simulations.
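For readers unfamiliar with the formalism, the generic Mori form of the generalized Langevin equation referred to above can be written as follows; this is the standard textbook form, not the specific expressions derived in the cited paper.

```latex
\frac{\mathrm{d}A(t)}{\mathrm{d}t} = \mathrm{i}\Omega\,A(t)
  - \int_{0}^{t} K(t-s)\,A(s)\,\mathrm{d}s + F(t),
\qquad
K(t) = \frac{\langle F(t)\,F(0)^{*}\rangle}{\langle A(0)\,A(0)^{*}\rangle},
```

where A is the coarse-grained (relevant) variable, Ω the frequency matrix, K the memory kernel, and F the projected fluctuating force. The Markov limit discussed in the abstract corresponds to a memory kernel that decays effectively instantaneously, K(t) ≈ 2γδ(t).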
a Process-Based Drought Early Warning Indicator for Supporting State Drought Mitigation Decision
NASA Astrophysics Data System (ADS)
Fu, R.; Fernando, D. N.; Pu, B.
2014-12-01
Drought-prone states such as Texas require credible and actionable drought early warning on scales ranging from seasonal to multi-decadal. Such information cannot simply be extracted from available climate predictions and projections because of their large uncertainties at regional scales and unclear connections to the needs of decision makers. In particular, current dynamic seasonal predictions and climate projections, such as those produced by the NOAA North American Multi-Model Ensemble (NMME) experiment and the IPCC AR5 (CMIP5) models, are much more reliable for winter and spring than for the summer season over the US Southern Plains. They also show little connection between droughts in winter/spring and those in summer, in contrast to the observed dry memory from spring to summer over that region. To mitigate the weaknesses of dynamic predictions/projections, we have identified three key processes behind the spring-to-summer dry memory through observational studies. Based on these key processes and related fields, we have developed a multivariate principal component statistical model to provide a probabilistic summer drought early warning indicator, using the observed or predicted climate conditions in winter and spring on the seasonal scale and climate projections for the mid-21st century. The summer drought early warning indicator is constructed in a similar way to the NOAA probabilistic predictions that are familiar to water resource managers. The indicator skill is assessed using the standard NOAA climate prediction assessment tools, i.e., the two-alternative forced choice (2AFC) score and the Receiver Operating Characteristic (ROC). Comparison with long-term observations suggests that this summer drought early warning indicator is able to capture nearly all the strong summer droughts and outperforms the dynamic predictions in this regard over the US Southern Plains. This early warning indicator was used by the state water agency in May 2014 in briefing the state drought preparedness council and will be provided to stakeholders through the website of the Texas state water planning agency. We will also present the results of our ongoing work on using NASA satellite-based soil moisture and vegetation stress measurements to further improve the reliability of the summer drought early warning indicator.
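The sketch below shows one way the ROC-based skill assessment mentioned above could be computed for a probabilistic drought indicator against a binary drought/no-drought record. The hindcast data are entirely hypothetical, and the use of scikit-learn here is only an illustration, not the NOAA assessment tools themselves.

```python
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(2)

# Hypothetical hindcast: probabilistic summer-drought indicator vs. observed droughts.
observed_drought = rng.integers(0, 2, size=40)             # 1 = drought summer
indicator_prob = np.clip(
    0.4 * observed_drought + 0.6 * rng.random(40), 0, 1)   # skillful but imperfect

auc = roc_auc_score(observed_drought, indicator_prob)
fpr, tpr, thresholds = roc_curve(observed_drought, indicator_prob)
print(f"ROC area (closely related to the 2AFC score for two categories): {auc:.2f}")
```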
NASA Astrophysics Data System (ADS)
Castro, C. L.; Dominguez, F.; Chang, H.
2010-12-01
Current seasonal climate forecasts and climate change projections of the North American monsoon are based on the use of coarse-scale information from general circulation models. The global models, however, have substantial difficulty in resolving the regional-scale forcing mechanisms of precipitation. This is especially true during the period of the North American monsoon in the warm season. Precipitation is driven primarily by the diurnal cycle of convection, and this process cannot be resolved in coarse-resolution global models that have a relatively poor representation of terrain. Though statistical downscaling may offer a relatively expedient method to generate information more appropriate for the regional scale, and is already being used in resource decision making in the Southwest U.S., its main drawback is that it cannot account for a non-stationary climate. Here we demonstrate the use of a regional climate model, specifically the Weather Research and Forecasting (WRF) model, for dynamical downscaling of the North American monsoon. To drive the WRF simulations, we use retrospective reforecasts from the Climate Forecast System (CFS) model, the operational model used at the U.S. National Centers for Environmental Prediction, and three select “well performing” IPCC AR4 models for the A2 emission scenario. Though relatively computationally expensive, the use of WRF as a regional climate model in this way adds substantial value in the representation of the North American monsoon. In both cases, the regional climate model captures a fairly realistic and reasonable monsoon, where none exists in the driving global model, and captures the dominant modes of precipitation anomalies associated with ENSO and the Pacific Decadal Oscillation (PDO). Long-term precipitation variability and trends in these simulations are considered via the standardized precipitation index (SPI), a commonly used metric to characterize long-term drought. Dynamically downscaled climate projection data will be integrated into future water resource projections in the state of Arizona through a cooperative effort involving numerous water resource stakeholders.
NASA Astrophysics Data System (ADS)
Sangelantoni, Lorenzo; Coluccelli, Alessandro; Russo, Aniello
2014-05-01
The climate dynamics of the Marche region (central Italy, facing the Adriatic Sea) are connected to the Mediterranean basin, identified as one of the most sensitive areas to ongoing climate change. Given the difficulty of carrying out an overarching assessment across the heterogeneous frame of Mediterranean climate-change issues, we opted for a consistent, regionally bounded study. Projected changes in mean seasonal temperature are presented here, together with an introductory multi-model statistical performance evaluation and a characterization of future heat wave intensity and duration. Multi-model projections of daily mean, minimum and maximum temperature over the Marche region have been extracted from the outputs of a set of 7 Regional Climate Models (RCMs) run over Europe by several research institutes participating in the EU ENSEMBLES project. These climate simulations, covering 1961 to 2100, use boundary conditions from the IPCC A1B emission scenario and have a horizontal resolution of 25 km × 25 km. Furthermore, two RCM outputs from the Med-CORDEX project, with a higher horizontal resolution (12 km × 12 km) and boundary conditions provided by the new Representative Concentration Pathways (RCP) 4.5 and 8.5, are considered. Observed daily mean, minimum and maximum temperature over the Marche region domain have been extracted from the E-OBS gridded data set (version 9.0) for the period 1970-2004. This twofold work first provides a concise statistical summary of how well the employed RCMs reproduce observed (1970-2004) mean temperature over the Marche region in terms of correlation, root-mean-square difference, and the ratio of their variances, graphically displayed on a 2D Taylor diagram. This multi-model statistical performance evaluation makes it easy to: compare the agreement with observations of the 9 individual RCMs; compare RCMs with different horizontal resolutions (12 km and 25 km); and evaluate the improvement provided by the RCM ensemble. Results indicate that the 9-RCM ensemble provides the statistically best reproduction of the observed interannual mean temperature distribution. Secondly, we assessed the projected seasonal ensemble-average change in mean temperature for the end of the 21st century, obtained by comparing the 2071-2100 and 1961-1990 time-slice modeled mean values over the Marche region. Results emphasize summer as the season most affected by the projected temperature increase (+4.5°C / +5.0°C), followed by spring (+3.5°C / +4.0°C). Finally, considering that some of the most severe health hazards arise from multi-day heat waves associated with both hot day-time and warm night-time temperatures, we assessed the modeled trend (1961-2100) of heat wave intensity and duration: intensity through the temporal evolution of the summer (JJA) 90th percentiles of maximum and minimum temperature, and heat wave length through the temporal evolution of two threshold-based indices (the annual maximum number of consecutive days with Tmin >= 24°C and the annual maximum number of consecutive days with Tmax >= 32°C). The same analysis has been conducted for both coastal and mountainous areas. Future research plans aim to use bias-corrected ensemble RCM simulations to force climate change impact models, in order to provide a detailed regional heat wave impact scenario, mainly for the agriculture and health sectors.
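The threshold-based heat wave duration indices described above reduce to counting the longest run of days above a temperature threshold in each year. A minimal sketch of that calculation is given below; the daily temperature series is synthetic and purely illustrative.

```python
import numpy as np

def max_consecutive_days(values, threshold):
    """Longest run of consecutive days with values >= threshold."""
    longest = current = 0
    for v in values:
        current = current + 1 if v >= threshold else 0
        longest = max(longest, current)
    return longest

rng = np.random.default_rng(3)
# Hypothetical daily summer minimum temperatures (JJA, 92 days) for one year.
tmin_jja = 20 + 5 * rng.random(92)

print("Max consecutive tropical nights (Tmin >= 24 C):",
      max_consecutive_days(tmin_jja, 24.0))
```

Applying the same function year by year (and with Tmax and the 32°C threshold) yields the time series whose trend is assessed in the study.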
Measuring radiation damage dynamics by pulsed ion beam irradiation: 2016 project annual report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kucheyev, Sergei O.
2017-01-04
The major goal of this project is to develop and demonstrate a novel experimental approach to access the dynamic regime of radiation damage formation in nuclear materials. In particular, the project exploits a pulsed-ion-beam method in order to gain insight into defect interaction dynamics by measuring effective defect interaction time constants and defect diffusion lengths. For Year 3, this project had the following two major milestones: (i) the demonstration of the measurement of thermally activated defect-interaction processes by pulsed ion beam techniques and (ii) the demonstration of alternative characterization techniques to study defect dynamics. As we describe below, both of these milestones have been met.
Research a Novel Integrated and Dynamic Multi-object Trade-Off Mechanism in Software Project
NASA Astrophysics Data System (ADS)
Jiang, Weijin; Xu, Yuhui
Aiming at the practical requirements of present-day software project management and control, this paper proposes the construction of an integrated multi-object trade-off model based on software project process management, so as to realize integrated and dynamic trade-off of the project's multi-object system. Based on an analysis of the basic principles of dynamic control and of the integrated multi-object trade-off system process, the paper integrates methods from cybernetics and network technology and, by monitoring selected critical reference points according to the control objects, discusses in detail the integrated and dynamic multi-object trade-off model and the corresponding rules and mechanism, in order to realize the integration of process management with the trade-off of the multi-object system.
ON THE DYNAMICAL DERIVATION OF EQUILIBRIUM STATISTICAL MECHANICS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prigogine, I.; Balescu, R.; Henin, F.
1960-12-01
Work on nonequilibrium statistical mechanics, which allows an extension of the kinetic proof to all results of equilibrium statistical mechanics involving a finite number of degrees of freedom, is summarized. As an introduction to the general N-body problem, the scattering theory in classical mechanics is considered. The general N-body problem is considered for the case of classical mechanics, quantum mechanics with Boltzmann statistics, and quantum mechanics including quantum statistics. Six basic diagrams, which describe the elementary processes of the dynamics of correlations, were obtained. (M.C.G.)
ERIC Educational Resources Information Center
Kravchuk, Olena; Elliott, Antony; Bhandari, Bhesh
2005-01-01
A simple laboratory experiment, based on the Maillard reaction, served as a project in Introductory Statistics for undergraduates in Food Science and Technology. By using the principles of randomization and replication and reflecting on the sources of variation in the experimental data, students reinforced the statistical concepts and techniques…
ERIC Educational Resources Information Center
Hassan, Mahamood M.; Schwartz, Bill N.
2014-01-01
This paper discusses a student research project that is part of an advanced cost accounting class. The project emphasizes active learning, integrates cost accounting with macroeconomics and statistics by "learning by doing" using real world data. Students analyze sales data for a publicly listed company by focusing on the company's…
Doing Research That Matters: A Success Story from Statistics Education
ERIC Educational Resources Information Center
Hipkins, Rosemary
2014-01-01
This is the first report from a new initiative called TLRI Project Plus. It aims to add value to the Teaching and Learning Research Initiative (TLRI), which NZCER manages on behalf of the government, by synthesising findings across multiple projects. This report focuses on two projects in statistics education and explores the factors that…
Statistical Process Control. A Summary. FEU/PICKUP Project Report.
ERIC Educational Resources Information Center
Owen, M.; Clark, I.
A project was conducted to develop a curriculum and training materials to be used in training industrial operatives in statistical process control (SPC) techniques. During the first phase of the project, questionnaires were sent to 685 companies (215 of which responded) to determine where SPC was being used, what type of SPC firms needed, and how…
Statistics in Action: The Story of a Successful Service-Learning Project
ERIC Educational Resources Information Center
DeHart, Mary; Ham, Jim
2011-01-01
The purpose of this article is to share the stories of an Introductory Statistics service-learning project in which students from both New Jersey and Michigan design and conduct phone surveys that lead to publication in local newspapers; to discuss the pedagogical benefits and challenges of the project; and to provide information for those who…
NASA Astrophysics Data System (ADS)
Kunwar, S.; Bowden, J.; Milly, G.; Previdi, M. J.; Fiore, A. M.; West, J. J.
2017-12-01
In the coming decades, anthropogenically induced climate change will likely impact PM2.5 through both changing meteorology and feedbacks in natural emissions. A major goal of our project is to assess changes in PM2.5 levels over the continental US due to climate variability and change for the period 2005-2065. We will achieve this by using regional models to dynamically downscale coarse-resolution (~2° × 2°) meteorology and air chemistry from a global model to a finer spatial resolution (12 km), improving air quality projections for regions and subregions of the US (NE, SE, SW, NW, Midwest, Intermountain West). We downscale from GFDL CM3 simulations of the RCP8.5 scenario for the years 2006-2100 with aerosol and ozone precursor emissions fixed at 2005 levels. We carefully select model years from the global simulations that sample the range of PM2.5 distributions for different US regions at the mid 21st century (2050-2065). Here we will show results from the meteorological downscaling (using WRF version 3.8.1) for this project, including a performance evaluation of meteorological variables with respect to the global model. In the future, the downscaled meteorology presented here will be used to drive air quality downscaling in CMAQ (version 5.2). Analysis of the resulting PM2.5 statistics for US regions, as well as the drivers of PM2.5 changes, will be important in supporting informed policies for air quality (also health and visibility) planning for different US regions over the next five decades.
Incremental dynamical downscaling for probabilistic analysis based on multiple GCM projections
NASA Astrophysics Data System (ADS)
Wakazuki, Y.
2015-12-01
A dynamical downscaling method for probabilistic regional-scale climate change projections was developed to cover the uncertainty of multiple general circulation model (GCM) climate simulations. The climatological increments (future minus present climate states) estimated from GCM simulation results were statistically analyzed using singular value decomposition. Both positive and negative perturbations from the ensemble mean, with magnitudes of one standard deviation, were extracted and added to the ensemble mean of the climatological increments. The resulting multiple modal increments were used to create multiple modal lateral boundary conditions for future-climate regional climate model (RCM) simulations by adding them to an objective analysis data set. This treatment can be regarded as an advanced version of the pseudo-global-warming (PGW) method previously developed by Kimura and Kitoh (2007). The incremental handling of GCM simulations yields approximate probabilistic climate change projections with a smaller number of RCM simulations. Three values of a climatological variable simulated by the RCMs for each mode were used to estimate the response to that mode's perturbation. For the probabilistic analysis, climatological variables of the RCMs were assumed to respond linearly to the multiple modal perturbations, although non-linearity was seen for local-scale rainfall. Probabilities for temperature could be estimated with perturbation simulations of only two modes, for which the number of RCM simulations for the future climate is five. Local-scale rainfall, on the other hand, needed simulations of four modes, for which the number of RCM simulations is nine. The probabilistic method is expected to be used for regional-scale climate change impact assessment in the future.
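The core data handling described above amounts to decomposing the inter-model spread of climatological increments into modes and building perturbed boundary-condition increments around the ensemble mean. The sketch below illustrates that step with NumPy on hypothetical increment fields; array sizes and the scaling convention are assumptions, not the paper's exact procedure.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical climatological increments (future minus present) for 10 GCMs,
# flattened to (n_models, n_gridpoints).
increments = rng.normal(size=(10, 500))

ens_mean = increments.mean(axis=0)
anomalies = increments - ens_mean

# Leading modes of inter-model spread via singular value decomposition.
U, S, Vt = np.linalg.svd(anomalies, full_matrices=False)
n_models = increments.shape[0]
mode_std = S / np.sqrt(n_models - 1)          # standard deviation of each mode

# Boundary-condition increments for mode 1: mean plus/minus one-sigma pattern.
plus_mode1 = ens_mean + mode_std[0] * Vt[0]
minus_mode1 = ens_mean - mode_std[0] * Vt[0]
```

Each perturbed increment field would then be added to an objective analysis to form the lateral boundary conditions of one RCM run.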
NASA Technical Reports Server (NTRS)
Zimmerman, G. A.; Olsen, E. T.
1992-01-01
Noise power estimation in the High-Resolution Microwave Survey (HRMS) sky survey element is considered as an example of a constant false alarm rate (CFAR) signal detection problem. Order-statistic-based noise power estimators for CFAR detection are considered in terms of required estimator accuracy and estimator dynamic range. By limiting the dynamic range of the value to be estimated, the performance of an order-statistic estimator can be achieved by simpler techniques requiring only a single pass of the data. Simple threshold-and-count techniques are examined, and it is shown how several parallel threshold-and-count estimation devices can be used to expand the dynamic range to meet HRMS system requirements with minimal hardware complexity. An input/output (I/O) efficient limited-precision order-statistic estimator with wide but limited dynamic range is also examined.
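To make the contrast between the two estimator families concrete, the sketch below compares an order-statistic (quantile-based) noise power estimate with a single-pass threshold-and-count estimate for exponentially distributed spectral noise. The data, threshold, and quantile are placeholders, and this is only an illustration of the idea, not the HRMS hardware design.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical spectral power samples: exponential noise plus a few strong signals.
noise = rng.exponential(scale=1.0, size=10_000)
noise[:5] += 50.0                              # injected "signals" the estimator must ignore

# Order-statistic estimate: a low/median quantile is robust to signal outliers.
q = 0.5
order_stat_estimate = np.quantile(noise, q) / np.log(1 / (1 - q))   # median / ln 2

# Single-pass threshold-and-count: count samples below a trial threshold and
# invert the exponential CDF; running several thresholds in parallel is what
# widens the usable dynamic range in the scheme described above.
threshold = 1.0
frac_below = np.count_nonzero(noise < threshold) / noise.size
count_estimate = -threshold / np.log(1.0 - frac_below)

print(order_stat_estimate, count_estimate)
```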
Higher order statistical moment application for solar PV potential analysis
NASA Astrophysics Data System (ADS)
Basri, Mohd Juhari Mat; Abdullah, Samizee; Azrulhisham, Engku Ahmad; Harun, Khairulezuan
2016-10-01
Solar photovoltaic energy could serve as an alternative to fossil fuels, which are being depleted and pose a global warming problem. However, this renewable energy is too variable and intermittent to be relied on without careful assessment, so knowledge of the energy potential of a site is very important before building a solar photovoltaic power generation system there. Here, the application of a higher-order statistical moment model is analyzed using data collected from a 5 MW grid-connected photovoltaic system. Because the skewness and kurtosis of the solar farm's AC power and solar irradiance distributions change dynamically, the Pearson system, in which a probability distribution is selected by matching its theoretical moments to the empirical moments of the data, could be suitable for this purpose. Taking advantage of the Pearson system implementation in MATLAB, software has been developed to assist with data processing for distribution fitting and potential analysis, for future projection of the amount of AC power and solar irradiance available.
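The quantities the Pearson system matches are simply the first four empirical moments. A minimal sketch of computing them for a synthetic AC power sample is shown below (in Python with SciPy rather than the MATLAB tooling used in the paper); the data and distribution family selection step are assumptions for illustration only.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)

# Hypothetical AC power output samples from a PV plant (kW), skewed and bounded.
ac_power = 5000 * rng.beta(a=2.0, b=5.0, size=2000)

mean = np.mean(ac_power)
var = np.var(ac_power, ddof=1)
skew = stats.skew(ac_power)
kurt = stats.kurtosis(ac_power, fisher=False)   # "plain" kurtosis, equal to 3 for a Gaussian

# The Pearson system selects a distribution family (types I-VII) whose theoretical
# first four moments match these empirical values; only the moment inputs are shown here.
print(f"mean={mean:.1f}, var={var:.1f}, skewness={skew:.2f}, kurtosis={kurt:.2f}")
```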
2017-09-01
This dissertation explores the efficacy of statistical post-processing methods downstream of dynamical model components, using a hierarchical multivariate Bayesian approach. Subject terms: Bayesian hierarchical modeling, Markov chain Monte Carlo methods, Metropolis algorithm, machine learning, atmospheric prediction.
Do You Need to See It to Believe It? Let's See Statistics and Geometry Dynamically Together!
ERIC Educational Resources Information Center
Martins, José Alexandre; Roca, Assumpta Estrada; Nascimento, Maria Manuel
2014-01-01
Statistical graphs, measures of central tendency and measures of spread are key concepts in the statistics curriculum, so we present here a dynamic method (software) that may be used in the classroom. In this work we begin with an introductory approach. This is done to emphasize the importance of stimulating the visualization of statistical…
Are groups of galaxies virialized systems?
NASA Technical Reports Server (NTRS)
Diaferio, Antonaldo; Ramella, Massimo; Geller, Margaret J.; Ferrari, Attilio
1993-01-01
Groups are systems of galaxies with crossing times t(cr) much smaller than the Hubble time. Most of them have t(cr) less than 0.1/H0. The usual interpretation is that they are in virial equilibrium. We compare the data of the group catalog selected from the CfA redshift survey extension with different N-body models. We show that the distributions of kinematic and dynamical quantities of the groups in the CfA catalog can be reproduced by a single collapsing group observed along different lines of sight. This result shows that (1) projection effects dominate the statistics of these systems, and (2) observed groups of galaxies are probably still in the collapse phase.
HDX Workbench: Software for the Analysis of H/D Exchange MS Data
NASA Astrophysics Data System (ADS)
Pascal, Bruce D.; Willis, Scooter; Lauer, Janelle L.; Landgraf, Rachelle R.; West, Graham M.; Marciano, David; Novick, Scott; Goswami, Devrishi; Chalmers, Michael J.; Griffin, Patrick R.
2012-09-01
Hydrogen/deuterium exchange mass spectrometry (HDX-MS) is an established method for the interrogation of protein conformation and dynamics. While the data analysis challenge of HDX-MS has been addressed by a number of software packages, new computational tools are needed to keep pace with the improved methods and throughput of this technique. To address these needs, we report an integrated desktop program titled HDX Workbench, which facilitates automation, management, visualization, and statistical cross-comparison of large HDX data sets. Using the software, validated data analysis can be achieved at the rate of generation. The application is available at the project home page http://hdx.florida.scripps.edu.
Dynamics and Statistical Mechanics of Rotating and non-Rotating Vortical Flows
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lim, Chjan
Three projects were analyzed with the overall aim of developing a computational/analytical model for estimating values of the energy, angular momentum, enstrophy and total variation of fluid height at phase transitions between disordered and self-organized flow states in planetary atmospheres. It is believed that these transitions in equilibrium statistical mechanics models play a role in the construction of large-scale, stable structures including super-rotation in the Venusian atmosphere and the formation of the Great Red Spot on Jupiter. Exact solutions of the spherical energy-enstrophy models for rotating planetary atmospheres by Kac's method of steepest descent predicted phase transitions to super-rotating solid-body flows at high energy to enstrophy ratio for all planetary spins and to sub-rotating modes if the planetary spin is large enough. These canonical statistical ensembles are well-defined for the long-range energy interactions that arise from 2D fluid flows on compact oriented manifolds such as the surface of the sphere and torus. This is because in Fourier space, available through Hodge theory, the energy terms are exactly diagonalizable and hence have zero range, leading to well-defined heat baths.
Crustal Dynamics Project: Catalogue of site information
NASA Technical Reports Server (NTRS)
1985-01-01
This document represents a catalogue of site information for the Crustal Dynamics Project. It contains information and descriptions of those sites used by the Project as observing stations for making the precise geodetic measurements useful for studies of the Earth's crustal movements and deformation.
Gassman, Philip W.; Tisl, J.A.; Palas, E.A.; Fields, C.L.; Isenhart, T.M.; Schilling, K.E.; Wolter, C.F.; Seigley, L.S.; Helmers, M.J.
2010-01-01
Coldwater trout streams are important natural resources in northeast Iowa. Extensive efforts have been made by state and federal agencies to protect and improve water quality in northeast Iowa streams that include Sny Magill Creek and Bloody Run Creek, which are located in Clayton County. A series of three water quality projects were implemented in Sny Magill Creek watershed during 1988 to 1999, which were supported by multiple agencies and focused on best management practice (BMP) adoption. Water quality monitoring was performed during 1992 to 2001 to assess the impact of these installed BMPs in the Sny Magill Creek watershed using a paired watershed approach, where the Bloody Run Creek watershed served as the control. Conservation practice adoption still occurred in the Bloody Run Creek watershed during the 10-year monitoring project and accelerated after the project ended, when a multiagency supported water quality project was implemented during 2002 to 2007. Statistical analysis of the paired watershed results using a pre/post model indicated that discharge increased 8% in Sny Magill Creek watershed relative to the Bloody Run Creek watershed, turbidity declined 41%, total suspended sediment declined 7%, and NOx-N (nitrate-nitrogen plus nitrite-nitrogen) increased 15%. Similar results were obtained with a gradual change statistical model. The weak sediment reductions and increased NOx-N levels were both unexpected and indicate that dynamics between adopted BMPs and stream systems need to be better understood. Fish surveys indicate that conditions for supporting trout fisheries have improved in both streams. Important lessons to be taken from the overall study include (1) committed project coordinators, agency collaborators, and landowners/producers are all needed for successful water quality projects; (2) smaller watershed areas should be used in paired studies; (3) reductions in stream discharge may be required in these systems in order for significant sediment load decreases to occur; (4) long-term monitoring on the order of decades can be required to detect meaningful changes in water quality in response to BMP implementation; and (5) all consequences of specific BMPs need to be considered when considering strategies for watershed protection.
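The paired-watershed pre/post approach referred to above boils down to calibrating a regression between treatment and control watersheds before BMP adoption and then testing whether post-period observations depart from that calibration. The sketch below illustrates the idea with fabricated log-transformed loads; it is not the study's data or its full statistical model.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical paired storm-event sediment loads (log-transformed):
# control = Bloody Run Creek, treatment = Sny Magill Creek.
control_pre = rng.normal(3.0, 0.5, 60)
treat_pre = 0.9 * control_pre + 0.20 + rng.normal(0, 0.2, 60)    # calibration period
control_post = rng.normal(3.0, 0.5, 60)
treat_post = 0.9 * control_post + 0.13 + rng.normal(0, 0.2, 60)  # small assumed BMP effect

# Calibration regression from the pre-BMP period.
slope, intercept = np.polyfit(control_pre, treat_pre, 1)

# Predicted post-period treatment loads if nothing had changed, vs. observed.
predicted_post = slope * control_post + intercept
mean_change = np.mean(treat_post - predicted_post)
print(f"Mean post-period departure (log units): {mean_change:+.3f}")
```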
Applications of Earth Observations for Fisheries Management: An analysis of socioeconomic benefits
NASA Astrophysics Data System (ADS)
Friedl, L.; Kiefer, D. A.; Turner, W.
2013-12-01
This paper will discuss the socioeconomic impacts of a project applying Earth observations and models to support management and conservation of tuna and other marine resources in the eastern Pacific Ocean. A project team created a software package that produces statistical analyses and dynamic maps of habitat for pelagic ocean biota. The tool integrates sea surface temperature and chlorophyll imagery from MODIS, ocean circulation models, and other data products. The project worked with the Inter-American Tropical Tuna Commission, which issues fishery management information, such as stock assessments, for the eastern Pacific region. The Commission uses the tool and broader habitat information to produce better estimates of stock and thus improve their ability to identify species that could be at risk of overfishing. The socioeconomic analysis quantified the relative value that Earth observations contributed to accurate stock size assessments through improvements in calculating population size. The analysis team calculated the first-order economic costs of a fishery collapse (or shutdown), and they calculated the benefits of improved estimates that reduce the uncertainty of stock size and thus reduce the risk of fishery collapse. The team estimated that the project reduced the probability of collapse of different fisheries, and the analysis generated net present values of risk mitigation. USC led the project with sponsorship from the NASA Earth Science Division's Applied Sciences Program, which conducted the socioeconomic impact analysis. The paper will discuss the project and focus primarily on the analytic methods, impact metrics, and the results of the socioeconomic benefits analysis.
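The net-present-value piece of the benefits analysis described above is straightforward arithmetic: discount the annual avoided expected losses over a planning horizon. The sketch below shows the calculation with entirely hypothetical numbers; the actual probabilities, costs, and discount rate used in the study are not reproduced here.

```python
# Hypothetical net-present-value calculation for reduced fishery-collapse risk.
annual_collapse_cost = 400e6      # assumed first-order economic cost of a collapse (USD)
baseline_prob = 0.020             # assumed annual collapse probability without the tool
improved_prob = 0.015             # assumed probability with improved stock estimates
discount_rate = 0.03
horizon_years = 20

avoided_expected_loss = (baseline_prob - improved_prob) * annual_collapse_cost
npv = sum(avoided_expected_loss / (1 + discount_rate) ** t
          for t in range(1, horizon_years + 1))
print(f"NPV of risk mitigation over {horizon_years} years: ${npv / 1e6:.1f} million")
```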
Uncertainty Analysis for DAM Projects.
1987-09-01
The overwhelming majority of articles published on the use of statistical methodology for geotechnical engineering focus on performance predictions and design. Results of the present study do not support the adoption of more esoteric statistical procedures except on a special-case basis or in research. The study also considers the influence that recommended statistical procedures might have had on the Carters Project, had they been applied during the planning and design phases.
ERIC Educational Resources Information Center
Mascaró, Maite; Sacristán, Ana Isabel; Rufino, Marta M.
2016-01-01
For the past 4 years, we have been involved in a project that aims to enhance the teaching and learning of experimental analysis and statistics, of environmental and biological sciences students, through computational programming activities (using R code). In this project, through an iterative design, we have developed sequences of R-code-based…
The Effect on the 8th Grade Students' Attitude towards Statistics of Project Based Learning
ERIC Educational Resources Information Center
Koparan, Timur; Güven, Bülent
2014-01-01
This study investigates the effect of the project based learning approach on 8th grade students' attitude towards statistics. With this aim, an attitude scale towards statistics was developed. A quasi-experimental research model was used in this study. Following this model, in the control group the traditional method was applied to teach statistics…
NASA Astrophysics Data System (ADS)
Gädeke, Anne; Koch, Hagen; Pohle, Ina; Grünewald, Uwe
2014-05-01
In anthropogenically heavily impacted river catchments, such as the Lusatian river catchments of the Spree and Schwarze Elster (Germany), a robust assessment of the possible impacts of climate change on regional water resources is highly relevant for the development and implementation of suitable climate change adaptation strategies. Large uncertainties inherent in future climate projections may, however, reduce the willingness of regional stakeholders to develop and implement such strategies. This study provides an overview of different possibilities for considering uncertainties in climate change impact assessments by means of (1) an ensemble-based modelling approach and (2) the incorporation of measured and simulated meteorological trends. The ensemble-based modelling approach consists of the meteorological output of four climate downscaling approaches (DAs) (two dynamical and two statistical DAs, 113 realisations in total), which drive different model configurations of two conceptually different hydrological models (HBV-light and WaSiM-ETH). Three near-natural subcatchments of the Spree and Schwarze Elster river catchments serve as the study area. The objective of incorporating measured meteorological trends into the analysis was twofold: measured trends can (i) serve as a means to validate the results of the DAs and (ii) be regarded as a harbinger of the future direction of change. Moreover, regional stakeholders seem to have more trust in measurements than in modelling results. In order to evaluate the nature of the trends, both gradual changes (Mann-Kendall test) and step changes (Pettitt test) are considered, as well as temporal and spatial correlations in the data. The results of the ensemble-based modelling chain show that, depending on the type of DA used (dynamical or statistical), opposing trends in precipitation, actual evapotranspiration and discharge are simulated in the scenario period (2031-2060). While the statistical DAs simulate a strong decrease in future long-term annual precipitation, the dynamical DAs simulate a tendency towards increasing precipitation. The trend analysis suggests that precipitation has not changed significantly during the period 1961-2006. Therefore, the decrease simulated by the statistical DAs should be interpreted as a rather dry future projection. Concerning air temperature, measured and simulated trends agree on a positive trend. The uncertainty related to the hydrological model within the climate change modelling chain is also comparatively low when long-term averages are considered, but increases significantly during extreme events. The proposed framework of combining an ensemble-based modelling approach with an analysis of measured trends is a promising way for regional stakeholders to gain more confidence in the final results of climate change impact assessments. Climate change impact assessments will nevertheless remain highly uncertain. Thus, flexible adaptation strategies need to be developed which consider not only climate but also other aspects of global change.
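For readers unfamiliar with the Mann-Kendall test used above for gradual trends, the sketch below implements its basic form (without tie or autocorrelation corrections, which a serially correlated precipitation record would require) and applies it to a synthetic annual precipitation series; the data are illustrative only.

```python
import numpy as np
from scipy import stats

def mann_kendall(series):
    """Mann-Kendall S statistic and two-sided p-value (no tie/autocorrelation correction)."""
    x = np.asarray(series, dtype=float)
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
    p = 2 * (1 - stats.norm.cdf(abs(z)))
    return s, p

rng = np.random.default_rng(8)
annual_precip = 600 + rng.normal(0, 60, size=46)   # hypothetical 1961-2006 totals, mm
s, p = mann_kendall(annual_precip)
print(f"Mann-Kendall S = {s:.0f}, p = {p:.2f}")
```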
Dynamic Open Inquiry Performances of High-School Biology Students
ERIC Educational Resources Information Center
Zion, Michal; Sadeh, Irit
2010-01-01
In examining open inquiry projects among high-school biology students, we found dynamic inquiry performances expressed in two criteria: "changes occurring during inquiry" and "procedural understanding". Characterizing performances in a dynamic open inquiry project can shed light on both the procedural and epistemological…
Crustal Dynamics Project: Catalogue of site information
NASA Technical Reports Server (NTRS)
Noll, Carey E. (Editor)
1988-01-01
This document represents a catalog of site information for the Crustal Dynamics Project. It contains information on and descriptions of those sites used by the Project as observing stations for making the precise geodetic measurements necessary for studies of the Earth's crustal movements and deformation.
van Mantgem, P.J.; Stephenson, N.L.
2005-01-01
1 We assess the use of simple, size-based matrix population models for projecting population trends for six coniferous tree species in the Sierra Nevada, California. We used demographic data from 16 673 trees in 15 permanent plots to create 17 separate time-invariant, density-independent population projection models, and determined differences between trends projected from initial surveys with a 5-year interval and observed data during two subsequent 5-year time steps. 2 We detected departures from the assumptions of the matrix modelling approach in terms of strong growth autocorrelations. We also found evidence of observation errors for measurements of tree growth and, to a more limited degree, recruitment. Loglinear analysis provided evidence of significant temporal variation in demographic rates for only two of the 17 populations. 3 Total population sizes were strongly predicted by model projections, although population dynamics were dominated by carryover from the previous 5-year time step (i.e. there were few cases of recruitment or death). Fractional changes to overall population sizes were less well predicted. Compared with a null model and a simple demographic model lacking size structure, matrix model projections were better able to predict total population sizes, although the differences were not statistically significant. Matrix model projections were also able to predict short-term rates of survival, growth and recruitment. Mortality frequencies were not well predicted. 4 Our results suggest that simple size-structured models can accurately project future short-term changes for some tree populations. However, not all populations were well predicted and these simple models would probably become more inaccurate over longer projection intervals. The predictive ability of these models would also be limited by disturbance or other events that destabilize demographic rates. © 2005 British Ecological Society.
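A size-based matrix projection of the kind described above repeatedly multiplies a stage-classified population vector by a transition matrix. The sketch below shows the mechanics for a hypothetical three-stage tree population over two 5-year steps; the matrix entries and counts are invented for illustration and do not correspond to the study's 17 fitted models.

```python
import numpy as np

# Hypothetical 3-stage (small/medium/large) transition matrix for a 5-year step:
# each column gives the fate of individuals in that size class; the top row also
# includes recruitment contributed by large trees.
A = np.array([
    [0.90, 0.00, 0.08],   # remain small, plus recruits from large trees
    [0.05, 0.92, 0.00],   # small -> medium; medium survives in place
    [0.00, 0.04, 0.95],   # medium -> large; large survives in place
])

n0 = np.array([400.0, 250.0, 120.0])   # initial counts per size class

# Project two 5-year time steps, matching the observed survey intervals.
n5 = A @ n0
n10 = A @ n5
print("Totals:", n0.sum(), round(n5.sum(), 1), round(n10.sum(), 1))
```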
NASA Astrophysics Data System (ADS)
Pioldi, Fabio; Rizzi, Egidio
2016-08-01
This paper proposes a new output-only element-level system identification and input estimation technique, towards the simultaneous identification of modal parameters, input excitation time history and structural features at the element-level by adopting earthquake-induced structural response signals. The method, named Full Dynamic Compound Inverse Method (FDCIM), releases strong assumptions of earlier element-level techniques, by working with a two-stage iterative algorithm. Jointly, a Statistical Average technique, a modification process and a parameter projection strategy are adopted at each stage to achieve stronger convergence for the identified estimates. The proposed method works in a deterministic way and is completely developed in State-Space form. Further, it does not require continuous- to discrete-time transformations and does not depend on initialization conditions. Synthetic earthquake-induced response signals from different shear-type buildings are generated to validate the implemented procedure, also with noise-corrupted cases. The achieved results provide a necessary condition to demonstrate the effectiveness of the proposed identification method.
Ruiz, Daniel; Cerón, Viviana; Molina, Adriana M.; Quiñónes, Martha L.; Jiménez, Mónica M.; Ahumada, Martha; Gutiérrez, Patricia; Osorio, Salua; Mantilla, Gilma; Connor, Stephen J.; Thomson, Madeleine C.
2014-01-01
As part of the Integrated National Adaptation Pilot project and the Integrated Surveillance and Control System, the Colombian National Institute of Health is working on the design and implementation of a Malaria Early Warning System framework, supported by seasonal climate forecasting capabilities, weather and environmental monitoring, and malaria statistical and dynamic models. In this report, we provide an overview of the local ecoepidemiologic settings where four malaria process-based mathematical models are currently being implemented at a municipal level. The description includes general characteristics, malaria situation (predominant type of infection, malaria-positive cases data, malaria incidence, and seasonality), entomologic conditions (primary and secondary vectors, mosquito densities, and feeding frequencies), climatic conditions (climatology and long-term trends), key drivers of epidemic outbreaks, and non-climatic factors (populations at risk, control campaigns, and socioeconomic conditions). Selected pilot sites exhibit different ecoepidemiologic settings that must be taken into account in the development of the integrated surveillance and control system. PMID:24891460
Development and Implementation of Dynamic Scripts to Execute Cycled GSI/WRF Forecasts
NASA Technical Reports Server (NTRS)
Zavodsky, Bradley; Srikishen, Jayanthi; Berndt, Emily; Li, Xuanli; Watson, Leela
2014-01-01
The Weather Research and Forecasting (WRF) numerical weather prediction (NWP) model and Gridpoint Statistical Interpolation (GSI) data assimilation (DA) are the operational systems that make up the North American Mesoscale (NAM) model and the NAM Data Assimilation System (NDAS) analysis used by National Weather Service forecasters. The Developmental Testbed Center (DTC) manages and distributes the code for the WRF and GSI, but it is up to individual researchers to link the systems together and write scripts to run the systems, which can take considerable time for those not familiar with the code. The objective of this project is to develop and disseminate a set of dynamic scripts that mimic the unique cycling configuration of the operational NAM to enable researchers to develop new modeling and data assimilation techniques that can be easily transferred to operations. The current version of the SPoRT GSI/WRF Scripts (v3.0.1) is compatible with WRF v3.3 and GSI v3.0.
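To make the cycling workflow concrete, the sketch below outlines the kind of analysis-forecast loop the scripts automate. The executable names (gsi.exe, real.exe, wrf.exe) are the standard GSI/WRF binaries, but the directory layout, namelist handling, and all paths here are placeholders and do not reproduce the SPoRT scripts or the operational NAM/NDAS configuration.

```python
import subprocess
from datetime import datetime, timedelta

# Placeholder settings; the real scripts also stage observations, boundary
# conditions, and namelists, all omitted from this sketch.
cycle_start = datetime(2014, 1, 1, 0)
cycle_hours = 6
n_cycles = 4

def run(cmd, workdir):
    """Run one executable of the cycle (paths and arguments are illustrative)."""
    print(f"{workdir}: {' '.join(cmd)}")
    # subprocess.run(cmd, cwd=workdir, check=True)   # disabled in this sketch

current = cycle_start
for _ in range(n_cycles):
    stamp = current.strftime("%Y%m%d%H")
    run(["./gsi.exe"], f"gsi_{stamp}")       # assimilate observations into the background
    run(["./real.exe"], f"wrf_{stamp}")      # regenerate initial/boundary files
    run(["./wrf.exe"], f"wrf_{stamp}")       # forecast to the next analysis time
    current += timedelta(hours=cycle_hours)  # the forecast becomes the next cycle's background
```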
Simultaneous measurement of two noncommuting quantum variables: Solution of a dynamical model
NASA Astrophysics Data System (ADS)
Perarnau-Llobet, Martí; Nieuwenhuizen, Theodorus Maria
2017-05-01
The possibility of performing simultaneous measurements in quantum mechanics is investigated in the context of the Curie-Weiss model for a projective measurement. Concretely, we consider a spin-1/2 system simultaneously interacting with two magnets, which act as measuring apparatuses of two different spin components. We work out the dynamics of this process and determine the final state of the measuring apparatuses, from which we can find the probabilities of the four possible outcomes of the measurements. The measurement is found to be nonideal, as (i) the joint statistics do not coincide with the one obtained by separately measuring each spin component, and (ii) the density matrix of the spin does not collapse in either of the measured observables. However, we give an operational interpretation of the process as a generalized quantum measurement, and show that it is fully informative: The expected value of the measured spin components can be found with arbitrary precision for sufficiently many runs of the experiment.
Liu, Xiao; Levine, Naomi M
2016-02-28
Subtropical gyres contribute significantly to global ocean productivity. As the climate warms, the strength of these gyres as a biological carbon pump is predicted to diminish due to increased stratification and depleted surface nutrients. We present results suggesting that the impact of submesoscale physics on phytoplankton in the oligotrophic ocean is substantial and may either compensate or exacerbate future changes in carbon cycling. A new statistical tool was developed to quantify surface patchiness from sea surface temperatures. Chlorophyll concentrations in the North Pacific Subtropical Gyre were shown to be enhanced by submesoscale frontal dynamics with an average increase of 38% (maximum of 83%) during late winter. The magnitude of this enhancement is comparable to the observed decline in chlorophyll due to a warming of ~1.1°C. These results highlight the need for an improved understanding of fine-scale physical variability in order to predict the response of marine ecosystems to projected climate changes.
Dynamic system simulation of small satellite projects
NASA Astrophysics Data System (ADS)
Raif, Matthias; Walter, Ulrich; Bouwmeester, Jasper
2010-11-01
A prerequisite to accomplish a system simulation is to have a system model holding all necessary project information in a centralized repository that can be accessed and edited by all parties involved. At the Institute of Astronautics of the Technische Universitaet Muenchen a modular approach for modeling and dynamic simulation of satellite systems has been developed called dynamic system simulation (DySyS). DySyS is based on the platform independent description language SysML to model a small satellite project with respect to the system composition and dynamic behavior. A library of specific building blocks and possible relations between these blocks have been developed. From this library a system model of the satellite of interest can be created. A mapping into a C++ simulation allows the creation of an executable system model with which simulations are performed to observe the dynamic behavior of the satellite. In this paper DySyS is used to model and simulate the dynamic behavior of small satellites, because small satellite projects can act as a precursor to demonstrate the feasibility of a system model since they are less complex compared to a large scale satellite project.
NASA Astrophysics Data System (ADS)
Ramirez Cuesta, Timmy
Incoherent inelastic neutron scattering spectroscopy is a very powerful technique that requires the use of ab-initio models to interpret the experimental data. Albeit not exact, the information obtained from the models gives very valuable insight into the dynamics of atoms in solids and molecules, which in turn provides unique access to the vibrational density of states. The technique is extremely sensitive to hydrogen, since the neutron cross section of hydrogen is the largest of all chemical elements; hydrogen, being the lightest element, also exhibits quantum effects more pronounced than those of the other elements. In the case of non-crystalline or disordered materials, the models provide partial information, and only a reduced sampling of possible configurations can be done at present. With the very large computing power that exascale computing will provide, a new opportunity arises to study these systems and introduce a description of statistical configurations, including an energetic and dynamical characterization of configurational entropy. As part of the ICE-MAN project, we are developing the tools to manage the workflows and to visualize and analyze the results, so that state-of-the-art computational methods can be used to make the most of neutron scattering experiments that rely on atomistic models for the interpretation of experimental data. This work is supported by the Laboratory Directed Research and Development (LDRD 8237) program of UT-Battelle, LLC under Contract No. DE-AC05-00OR22725 with the U.S. Department of Energy.
Sub-grid scale models for discontinuous Galerkin methods based on the Mori-Zwanzig formalism
NASA Astrophysics Data System (ADS)
Parish, Eric; Duraisamy, Karthik
2017-11-01
The optimal prediction framework of Chorin et al., which is a reformulation of the Mori-Zwanzig (M-Z) formalism of non-equilibrium statistical mechanics, provides a framework for the development of mathematically-derived closure models. The M-Z formalism provides a methodology to reformulate a high-dimensional Markovian dynamical system as a lower-dimensional, non-Markovian (non-local) system. In this lower-dimensional system, the effects of the unresolved scales on the resolved scales are non-local and appear as a convolution integral. The non-Markovian system is an exact statement of the original dynamics and is used as a starting point for model development. In this work, we investigate the development of M-Z-based closures model within the context of the Variational Multiscale Method (VMS). The method relies on a decomposition of the solution space into two orthogonal subspaces. The impact of the unresolved subspace on the resolved subspace is shown to be non-local in time and is modeled through the M-Z-formalism. The models are applied to hierarchical discontinuous Galerkin discretizations. Commonalities between the M-Z closures and conventional flux schemes are explored. This work was supported in part by AFOSR under the project ''LES Modeling of Non-local effects using Statistical Coarse-graining'' with Dr. Jean-Luc Cambier as the technical monitor.
Sarkodie, Samuel Asumadu; Strezov, Vladimir
2018-10-15
Energy production remains the major emitter of atmospheric emissions, thus, in accordance with Australia's Emissions Projections by 2030, this study analyzed the impact of Australia's energy portfolio on environmental degradation and CO2 emissions using locally compiled data on disaggregate energy production, energy imports and exports spanning from 1974 to 2013. This study employed the fully modified ordinary least squares, dynamic ordinary least squares, and canonical cointegrating regression estimators, as well as a statistically inspired modification of partial least squares regression analysis, with a subsequent sustainability sensitivity analysis. The validity of the environmental Kuznets curve hypothesis proposes a paradigm shift from energy-intensive and carbon-intensive industries to less-energy-intensive and green energy industries and their related services, leading to a structural change in the economy. Thus, decoupling energy services provides a better interpretation of the role of the energy sector portfolio in environmental degradation and CO2 emissions assessment. The sensitivity analysis revealed that nonrenewable energy production above 10% and energy imports above 5% will dampen the goals for the 2030 emission reduction target. Increasing the share of renewable energy penetration in the energy portfolio decreases the level of CO2 emissions, while increasing the share of non-renewable energy sources in the energy mix increases the level of atmospheric emissions, thus exacerbating climate change and its impacts. Copyright © 2018 Elsevier B.V. All rights reserved.
Effects of Ensemble Configuration on Estimates of Regional Climate Uncertainties
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goldenson, N.; Mauger, G.; Leung, L. R.
Internal variability in the climate system can contribute substantial uncertainty in climate projections, particularly at regional scales. Internal variability can be quantified using large ensembles of simulations that are identical but for perturbed initial conditions. Here we compare methods for quantifying internal variability. Our study region spans the west coast of North America, which is strongly influenced by El Niño and other large-scale dynamics through their contribution to large-scale internal variability. Using a statistical framework to simultaneously account for multiple sources of uncertainty, we find that internal variability can be quantified consistently using a large ensemble or an ensemble of opportunity that includes small ensembles from multiple models and climate scenarios. The latter also produce estimates of uncertainty due to model differences. We conclude that projection uncertainties are best assessed using small single-model ensembles from as many model-scenario pairings as computationally feasible, which has implications for ensemble design in large modeling efforts.
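A minimal sketch of the comparison described above, on synthetic regional anomalies with hypothetical model names: internal variability from one large initial-condition ensemble is contrasted with a pooled within-model estimate from an ensemble of opportunity, which additionally yields an inter-model spread.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic regional-mean anomalies (e.g. a precipitation change index); purely illustrative.
large_ensemble = rng.normal(loc=0.3, scale=0.5, size=40)          # 40 members, one model
ensemble_of_opportunity = {                                        # small ensembles, several models
    "model_A": rng.normal(0.2, 0.5, size=5),
    "model_B": rng.normal(0.4, 0.5, size=4),
    "model_C": rng.normal(0.1, 0.5, size=6),
}

# Internal variability from the large ensemble: spread across identically forced members.
iv_large = np.std(large_ensemble, ddof=1)

# Internal variability from the ensemble of opportunity: pool the *within-model* variances,
# so differences between model means (structural uncertainty) are not mixed in.
within = [np.var(m, ddof=1) for m in ensemble_of_opportunity.values()]
iv_pooled = np.sqrt(np.mean(within))

# Model (structural) uncertainty: spread of the model means themselves.
model_means = [np.mean(m) for m in ensemble_of_opportunity.values()]
model_spread = np.std(model_means, ddof=1)

print(f"internal variability, large ensemble  : {iv_large:.2f}")
print(f"internal variability, pooled small ens: {iv_pooled:.2f}")
print(f"inter-model spread                    : {model_spread:.2f}")
```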
sPHENIX: The next generation heavy ion detector at RHIC
NASA Astrophysics Data System (ADS)
Campbell, Sarah;
2017-04-01
sPHENIX is a new collaboration and future detector project at Brookhaven National Laboratory’s Relativistic Heavy Ion Collider (RHIC). It seeks to answer fundamental questions on the nature of the quark gluon plasma (QGP), including its coupling strength and temperature dependence, by using a suite of precision jet and upsilon measurements that probe different length scales of the QGP. This is made possible by full-acceptance (|η| < 1 and 0-2π in φ) electromagnetic and hadronic calorimeters and precision tracking enabled by a 1.5 T superconducting magnet. With the increased luminosity afforded by accelerator upgrades, sPHENIX is going to perform high statistics measurements extending the kinematic reach at RHIC to overlap the LHC’s. This overlap is going to facilitate a better understanding of the role of temperature, density and parton virtuality in QGP dynamics and, specifically, jet quenching. This paper focuses on key future measurements and the current state of the sPHENIX project.
imDEV: a graphical user interface to R multivariate analysis tools in Microsoft Excel
Grapov, Dmitry; Newman, John W.
2012-01-01
Summary: Interactive modules for Data Exploration and Visualization (imDEV) is a Microsoft Excel spreadsheet embedded application providing an integrated environment for the analysis of omics data through a user-friendly interface. Individual modules enable interactive and dynamic analyses of large datasets by interfacing R's multivariate statistics and highly customizable visualizations with the spreadsheet environment, aiding robust inferences and generating information-rich data visualizations. This tool provides access to multiple comparisons with false discovery correction, hierarchical clustering, principal and independent component analyses, partial least squares regression and discriminant analysis, through an intuitive interface for creating high-quality two- and three-dimensional visualizations including scatter plot matrices, distribution plots, dendrograms, heat maps, biplots, trellis biplots and correlation networks. Availability and implementation: Freely available for download at http://sourceforge.net/projects/imdev/. Implemented in R and VBA and supported by Microsoft Excel (2003, 2007 and 2010). Contact: John.Newman@ars.usda.gov Supplementary Information: Installation instructions, tutorials and users manual are available at http://sourceforge.net/projects/imdev/. PMID:22815358
Final project report for NEET pulsed ion beam project
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kucheyev, S. O.
The major goal of this project was to develop and demonstrate a novel experimental approach to access the dynamic regime of radiation damage formation in nuclear materials. In particular, the project exploited a pulsed-ion-beam method in order to gain insight into defect interaction dynamics by measuring effective defect interaction time constants and defect diffusion lengths. This project had the following four major objectives: (i) the demonstration of the pulsed ion beam method for a prototypical nuclear ceramic material, SiC; (ii) the evaluation of the robustness of the pulsed beam method from studies of defect generation rate effects; (iii) the measurement of the temperature dependence of defect dynamics and thermally activated defect-interaction processes by pulsed ion beam techniques; and (iv) the demonstration of alternative characterization techniques to study defect dynamics. As we describe below, all these objectives have been met.
Building integral projection models: a user's guide.
Rees, Mark; Childs, Dylan Z; Ellner, Stephen P
2014-05-01
In order to understand how changes in individual performance (growth, survival or reproduction) influence population dynamics and evolution, ecologists are increasingly using parameterized mathematical models. For continuously structured populations, where some continuous measure of individual state influences growth, survival or reproduction, integral projection models (IPMs) are commonly used. We provide a detailed description of the steps involved in constructing an IPM, explaining how to: (i) translate your study system into an IPM; (ii) implement your IPM; and (iii) diagnose potential problems with your IPM. We emphasize how the study organism's life cycle, and the timing of censuses, together determine the structure of the IPM kernel and important aspects of the statistical analysis used to parameterize an IPM using data on marked individuals. An IPM based on population studies of Soay sheep is used to illustrate the complete process of constructing, implementing and evaluating an IPM fitted to sample data. We then look at very general approaches to parameterizing an IPM, using a wide range of statistical techniques (e.g. maximum likelihood methods, generalized additive models, nonparametric kernel density estimators). Methods for selecting models for parameterizing IPMs are briefly discussed. We conclude with key recommendations and a brief overview of applications that extend the basic model. The online Supporting Information provides commented R code for all our analyses. © 2014 The Authors. Journal of Animal Ecology published by John Wiley & Sons Ltd on behalf of British Ecological Society.
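The kernel-discretisation step described above can be illustrated with a minimal numerical sketch. The vital-rate functions and parameters below are hypothetical (they are not the Soay sheep model or the paper's R code); the example only shows the midpoint-rule construction of the kernel and the dominant eigenvalue giving the asymptotic growth rate.

```python
import numpy as np
from scipy.stats import norm

# Hypothetical vital-rate functions on a continuous size variable z (illustrative only).
def survival(z):            # probability of surviving one census interval
    return 1.0 / (1.0 + np.exp(-(-1.0 + 1.5 * z)))

def growth_pdf(z_new, z):   # size next year given size now
    return norm.pdf(z_new, loc=0.3 + 0.9 * z, scale=0.3)

def fecundity(z_new, z):    # recruits produced, with an offspring-size distribution
    recruits = 0.5 * np.exp(0.8 * z)
    return recruits * norm.pdf(z_new, loc=0.8, scale=0.25)

# Midpoint-rule discretisation of the kernel K(z', z) = s(z) g(z'|z) + f(z', z)
n, L, U = 100, 0.0, 4.0
h = (U - L) / n
mesh = L + h * (np.arange(n) + 0.5)
Znew, Z = np.meshgrid(mesh, mesh, indexing="ij")   # rows: next size z', columns: current size z

K = h * (survival(Z) * growth_pdf(Znew, Z) + fecundity(Znew, Z))

# Asymptotic population growth rate = dominant eigenvalue of the discretised kernel.
lam = np.max(np.real(np.linalg.eigvals(K)))
print(f"lambda = {lam:.3f}")
```

Population projection is then simply `n_next = K @ n_now`, and the same matrix supports the sensitivity and elasticity analyses discussed in the IPM literature.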
Statistical Content in Middle Grades Mathematics Textbooks
ERIC Educational Resources Information Center
Pickle, Maria Consuelo Capiral
2012-01-01
This study analyzed the treatment and scope of statistical concepts in four, widely-used, contemporary, middle grades mathematics textbook series: "Glencoe Math Connects," "Prentice Hall Mathematics," "Connected Mathematics Project," and "University of Chicago School Mathematics Project." There were three…
Static Methods in the Design of Nonlinear Automatic Control Systems,
1984-06-27
Chapter VI. Ways of Decrease of the Number of Statistical Nodes During the Research of Nonlinear Systems... at present occupies the central place. This region of research was called the statistical dynamics of nonlinear automatic control systems... receives further development in the numerous research of Soviet and foreign scientists. Special role in the development of the statistical dynamics of
Vieira, Rute; McDonald, Suzanne; Araújo-Soares, Vera; Sniehotta, Falko F; Henderson, Robin
2017-09-01
N-of-1 studies are based on repeated observations within an individual or unit over time and are acknowledged as an important research method for generating scientific evidence about the health or behaviour of an individual. Statistical analyses of n-of-1 data require accurate modelling of the outcome while accounting for its distribution, time-related trend and error structures (e.g., autocorrelation) as well as reporting readily usable contextualised effect sizes for decision-making. A number of statistical approaches have been documented but no consensus exists on which method is most appropriate for which type of n-of-1 design. We discuss the statistical considerations for analysing n-of-1 studies and briefly review some currently used methodologies. We describe dynamic regression modelling as a flexible and powerful approach, adaptable to different types of outcomes and capable of dealing with the different challenges inherent to n-of-1 statistical modelling. Dynamic modelling borrows ideas from longitudinal and event history methodologies which explicitly incorporate the role of time and the influence of past on future. We also present an illustrative example of the use of dynamic regression on monitoring physical activity during the retirement transition. Dynamic modelling has the potential to expand researchers' access to robust and user-friendly statistical methods for individualised studies.
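A minimal sketch of a dynamic regression of the kind described above, on a simulated single-case physical-activity series with a simple lag-1 autoregressive structure. The variable names, data, and the long-run effect calculation are illustrative assumptions, not the authors' specification.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)

# Simulated daily step counts for one person across a retirement-style transition
# (purely illustrative; the paper's own dataset is not reproduced here).
T = 120
intervention = (np.arange(T) >= 60).astype(float)   # 0 before, 1 after the transition
y = np.empty(T)
y[0] = 8000
for t in range(1, T):
    y[t] = 2000 + 0.7 * y[t - 1] - 800 * intervention[t] + rng.normal(0, 300)

df = pd.DataFrame({"y": y, "intervention": intervention, "time": np.arange(T)})
df["y_lag1"] = df["y"].shift(1)
df = df.dropna()

# Dynamic regression: the lagged outcome carries the influence of the past on the future
# and absorbs autocorrelation that would otherwise bias the error structure.
X = sm.add_constant(df[["y_lag1", "intervention", "time"]])
fit = sm.OLS(df["y"], X).fit()
print(fit.params)

# Long-run (steady-state) effect of the intervention, given the autoregressive term.
phi = fit.params["y_lag1"]
print("long-run intervention effect:", fit.params["intervention"] / (1.0 - phi))
```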
Software-Engineering Process Simulation (SEPS) model
NASA Technical Reports Server (NTRS)
Lin, C. Y.; Abdel-Hamid, T.; Sherif, J. S.
1992-01-01
The Software Engineering Process Simulation (SEPS) model, developed at JPL, is described. SEPS is a dynamic simulation model of the software project development process. It uses the feedback principles of system dynamics to simulate the dynamic interactions among various software life cycle development activities and management decision making processes. The model is designed to be a planning tool to examine tradeoffs of cost, schedule, and functionality, and to test the implications of different managerial policies on a project's outcome. Furthermore, SEPS will enable software managers to gain a better understanding of the dynamics of software project development and perform post-mortem assessments.
Statistical Ensemble of Large Eddy Simulations
NASA Technical Reports Server (NTRS)
Carati, Daniele; Rogers, Michael M.; Wray, Alan A.; Mansour, Nagi N. (Technical Monitor)
2001-01-01
A statistical ensemble of large eddy simulations (LES) is run simultaneously for the same flow. The information provided by the different large scale velocity fields is used to propose an ensemble averaged version of the dynamic model. This produces local model parameters that only depend on the statistical properties of the flow. An important property of the ensemble averaged dynamic procedure is that it does not require any spatial averaging and can thus be used in fully inhomogeneous flows. Also, the ensemble of LES's provides statistics of the large scale velocity that can be used for building new models for the subgrid-scale stress tensor. The ensemble averaged dynamic procedure has been implemented with various models for three flows: decaying isotropic turbulence, forced isotropic turbulence, and the time developing plane wake. It is found that the results are almost independent of the number of LES's in the statistical ensemble provided that the ensemble contains at least 16 realizations.
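A toy sketch of the ensemble-averaged dynamic procedure, using synthetic stand-ins for the Germano-identity tensors of a dynamic Smagorinsky-type model. The only point illustrated is that the least-squares numerator and denominator are averaged over ensemble members rather than over space, so the coefficient remains a local field.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic stand-ins for the Germano-identity tensors:
# L[e, x, y, z, i, j] and M[e, x, y, z, i, j] for ensemble member e at each grid point.
n_ens, nx, ny, nz = 16, 8, 8, 8
L = rng.normal(size=(n_ens, nx, ny, nz, 3, 3))
M = rng.normal(size=(n_ens, nx, ny, nz, 3, 3))

# Ensemble-averaged dynamic procedure (sketch): contract over the tensor indices,
# then average over the ensemble only -- no spatial averaging is required, which is
# what makes the procedure usable in fully inhomogeneous flows.
numerator = np.mean(np.einsum("exyzij,exyzij->exyz", L, M), axis=0)
denominator = np.mean(np.einsum("exyzij,exyzij->exyz", M, M), axis=0)

C_dyn = numerator / denominator        # one local model coefficient per grid point
print(C_dyn.shape)                     # (8, 8, 8)
```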
Hunting statistics: what data for what use? An account of an international workshop
Nichols, J.D.; Lancia, R.A.; Lebreton, J.D.
2001-01-01
Hunting interacts with the underlying dynamics of game species in several different ways and is, at the same time, a source of valuable information not easily obtained from populations that are not subjected to hunting. Specific questions, including the sustainability of hunting activities, can be addressed using hunting statistics. Such investigations will frequently require that hunting statistics be combined with data from other sources of population-level information. Such reflections served as a basis for the meeting, "Hunting Statistics: What Data for What Use?", held on January 15-18, 2001 in Saint-Benoist, France. We review here the 20 talks held during the workshop and the contribution of hunting statistics to our knowledge of the population dynamics of game species. Three specific topics (adaptive management, catch-effort models, and dynamics of exploited populations) were highlighted as important themes and are more extensively presented as boxes.
On initial Brain Activity Mapping of episodic and semantic memory code in the hippocampus.
Tsien, Joe Z; Li, Meng; Osan, Remus; Chen, Guifen; Lin, Longian; Wang, Phillip Lei; Frey, Sabine; Frey, Julietta; Zhu, Dajiang; Liu, Tianming; Zhao, Fang; Kuang, Hui
2013-10-01
It has been widely recognized that the understanding of the brain code would require large-scale recording and decoding of brain activity patterns. In 2007 with support from Georgia Research Alliance, we have launched the Brain Decoding Project Initiative with the basic idea which is now similarly advocated by BRAIN project or Brain Activity Map proposal. As the planning of the BRAIN project is currently underway, we share our insights and lessons from our efforts in mapping real-time episodic memory traces in the hippocampus of freely behaving mice. We show that appropriate large-scale statistical methods are essential to decipher and measure real-time memory traces and neural dynamics. We also provide an example of how the carefully designed, sometime thinking-outside-the-box, behavioral paradigms can be highly instrumental to the unraveling of memory-coding cell assembly organizing principle in the hippocampus. Our observations to date have led us to conclude that the specific-to-general categorical and combinatorial feature-coding cell assembly mechanism represents an emergent property for enabling the neural networks to generate and organize not only episodic memory, but also semantic knowledge and imagination. Copyright © 2013 The Authors. Published by Elsevier Inc. All rights reserved.
On Initial Brain Activity Mapping of Associative Memory Code in the Hippocampus
Tsien, Joe Z.; Li, Meng; Osan, Remus; Chen, Guifen; Lin, Longian; Lei Wang, Phillip; Frey, Sabine; Frey, Julietta; Zhu, Dajiang; Liu, Tianming; Zhao, Fang; Kuang, Hui
2013-01-01
It has been widely recognized that the understanding of the brain code would require large-scale recording and decoding of brain activity patterns. In 2007 with support from Georgia Research Alliance, we have launched the Brain Decoding Project Initiative with the basic idea which is now similarly advocated by BRAIN project or Brain Activity Map proposal. As the planning of the BRAIN project is currently underway, we share our insights and lessons from our efforts in mapping real-time episodic memory traces in the hippocampus of freely behaving mice. We show that appropriate large-scale statistical methods are essential to decipher and measure real-time memory traces and neural dynamics. We also provide an example of how the carefully designed, sometime thinking-outside-the-box, behavioral paradigms can be highly instrumental to the unraveling of memory-coding cell assembly organizing principle in the hippocampus. Our observations to date have led us to conclude that the specific-to-general categorical and combinatorial feature-coding cell assembly mechanism represents an emergent property for enabling the neural networks to generate and organize not only episodic memory, but also semantic knowledge and imagination. PMID:23838072
Helweg, David A.; Keener, Victoria; Burgett, Jeff M.
2016-07-14
In the subtropical and tropical Pacific islands, changing climate is predicted to influence precipitation and freshwater availability, and thus to impact the ecosystem goods and services available to ecosystems and human communities. The small size of the high Hawaiian Islands, plus their complex microlandscapes, requires downscaling of global climate models to provide future projections of greater skill and spatial resolution. Two different climate modeling approaches (physics-based dynamical downscaling and statistics-based downscaling) have produced dissimilar projections. Because of these disparities, natural resource managers and decision makers have low confidence in using the modeling results and are therefore unwilling to include climate-related projections in their decisions. In September 2015, the Pacific Islands Climate Science Center (PICSC), the Pacific Islands Climate Change Cooperative (PICCC), and the Pacific Regional Integrated Sciences and Assessments (Pacific RISA) program convened a 2-day facilitated workshop in which the two modeling teams, plus key model users and resource managers, were brought together for a comparison of the two approaches, culminating with a discussion of how to provide predictions that are useable by resource managers. The proceedings, discussions, and outcomes of this workshop are summarized in this Open-File Report.
Statistical and dynamical assessment of land-ocean-atmosphere interactions across North Africa
NASA Astrophysics Data System (ADS)
Yu, Yan
North Africa is highly vulnerable to hydrologic variability and extremes, including impacts of climate change. The current understanding of oceanic versus terrestrial drivers of North African droughts and pluvials is largely model-based, with vast disagreement among models in terms of the simulated oceanic impacts and vegetation feedbacks. Regarding oceanic impacts, the relative importance of the tropical Pacific, tropical Indian, and tropical Atlantic Oceans in regulating the North African rainfall variability, as well as the underlying mechanism, remains debated among different modeling studies. Classic theory of land-atmosphere interactions across the Sahel ecotone, largely based on climate modeling experiments, has promoted positive vegetation-rainfall feedbacks associated with a dominant surface albedo mechanism. However, neither the proposed positive vegetation-rainfall feedback with its underlying albedo mechanism, nor its relative importance compared with oceanic drivers, has been convincingly demonstrated up to now using observational data. Here, the multivariate Generalized Equilibrium Feedback Assessment (GEFA) is applied in order to identify the observed oceanic and terrestrial drivers of North African climate and quantify their impacts. The reliability of the statistical GEFA method is first evaluated against dynamical experiments within the Community Earth System Model (CESM). In order to reduce the sampling error caused by short data records, the traditional GEFA approach is refined through stepwise GEFA, in which unimportant forcings are dropped through stepwise selection. In order to evaluate GEFA's reliability in capturing oceanic impacts, the atmospheric response to a sea-surface temperature (SST) forcing across the tropical Pacific, tropical Indian, and tropical Atlantic Ocean is estimated independently through ensembles of dynamical experiments and compared with GEFA-based assessments. Furthermore, GEFA's performance in capturing terrestrial impacts is evaluated through ensembles of fully coupled CESM dynamical experiments, with modified leaf area index (LAI) and soil moisture across the Sahel or West African Monsoon (WAM) region. The atmospheric responses to oceanic and terrestrial forcings are generally consistent between the dynamical experiments and statistical GEFA, confirming GEFA's capability of isolating the individual impacts of oceanic and terrestrial forcings on North African climate. Furthermore, with the incorporation of stepwise selection, GEFA can now provide reliable estimates of the oceanic and terrestrial impacts on the North African climate with the typical length of observational datasets, thereby enhancing the method's applicability. After the successful validation of GEFA, the key observed oceanic and terrestrial drivers of North African climate are identified through the application of GEFA to gridded observations, remote sensing products, and reanalyses. According to GEFA, oceanic drivers dominate over terrestrial drivers in terms of their observed impacts on North African climate in most seasons. Terrestrial impacts are comparable to, or more important than, oceanic impacts on rainfall during the post-monsoon across the Sahel and WAM region, and after the short rain across the Horn of Africa (HOA). The key ocean basins that regulate North African rainfall are typically located in the tropics. 
While the observed impacts of SST variability across the tropical Pacific and tropical Atlantic Oceans on the Sahel rainfall are largely consistent with previous model-based findings, minimal impacts from tropical Indian Ocean variability on Sahel rainfall are identified in observations, in contrast to previous modeling studies. The current observational analysis verifies model-hypothesized positive vegetation-rainfall feedback across the Sahel and HOA, which is confined to the post-monsoon and post-short rains season, respectively. However, the observed positive vegetation feedback to rainfall in the semi-arid Sahel and HOA is largely due to moisture recycling, rather than the classic albedo mechanism. Future projections of Sahel rainfall remain highly uncertain in terms of both sign and magnitude within phases three and five of the Coupled Model Intercomparison Project (CMIP3 and CMIP5). The GEFA-based observational analyses will provide a benchmark for evaluating climate models, which will facilitate effective process-based model weighting for more reliable projections of regional climate, as well as model development.
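The core GEFA estimator referred to above can be sketched as a lagged-covariance regression: the atmospheric noise is assumed uncorrelated with the slowly varying forcings one time step earlier, which isolates the feedback matrix. The implementation below is a simplified illustration with synthetic forcing and response fields; it omits the seasonal stratification and the stepwise forcing selection described in the abstract.

```python
import numpy as np

def gefa_feedback(x, f, lag=1):
    """Estimate a GEFA-style feedback matrix B in x(t) ~ B f(t) + noise.

    x : (T, p) atmospheric response field (e.g. rainfall anomalies at p points)
    f : (T, q) slowly varying forcings (e.g. SST / LAI / soil-moisture indices)
    lag : lead of the forcing over the response; the atmospheric noise is
          assumed uncorrelated with the forcing at this lead.
    """
    x = x - x.mean(axis=0)
    f = f - f.mean(axis=0)
    x_t, f_lag = x[lag:], f[:-lag]
    C_xf = x_t.T @ f_lag / (x_t.shape[0] - 1)       # lagged response-forcing covariance
    C_ff = f[lag:].T @ f_lag / (x_t.shape[0] - 1)   # lagged forcing auto-covariance
    return C_xf @ np.linalg.pinv(C_ff)              # (p, q) feedback matrix

# Tiny synthetic check: two slow forcing indices driving three response points.
rng = np.random.default_rng(3)
T = 600
B_true = np.array([[1.0, -0.5], [0.2, 0.8], [-0.3, 0.1]])
f = rng.normal(size=(T, 2)).cumsum(axis=0) * 0.1        # red-ish "slow" forcings
x = f @ B_true.T + rng.normal(scale=0.5, size=(T, 3))   # response + atmospheric noise
print(np.round(gefa_feedback(x, f), 2))                 # should approximate B_true
```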
Statistical wave climate projections for coastal impact assessments
NASA Astrophysics Data System (ADS)
Camus, P.; Losada, I. J.; Izaguirre, C.; Espejo, A.; Menéndez, M.; Pérez, J.
2017-09-01
Global multimodel wave climate projections are obtained at 1.0° × 1.0° scale from 30 Coupled Model Intercomparison Project Phase 5 (CMIP5) global circulation model (GCM) realizations. A semi-supervised weather-typing approach based on a characterization of the ocean wave generation areas and the historical wave information from the recent GOW2 database are used to train the statistical model. This framework is also applied to obtain high resolution projections of coastal wave climate and coastal impacts such as port operability and coastal flooding. Regional projections are estimated using the collection of weather types at a spacing of 1.0°. This assumption is feasible because the predictor is defined based on the wave generation area and the classification is guided by the local wave climate. The assessment of future changes in coastal impacts is based on direct downscaling of indicators defined by empirical formulations (total water level for coastal flooding and number of hours per year with overtopping for port operability). Global multimodel projections of the significant wave height and peak period are consistent with changes obtained in previous studies. Statistical confidence in the expected changes is obtained due to the large number of GCMs used to construct the ensemble. The proposed methodology proves to be flexible enough to project wave climate at different spatial scales. Regional changes of additional variables such as wave direction or other statistics can be estimated from the future empirical distribution, with extreme values restricted to high percentiles (i.e., 95th, 99th percentiles). The statistical framework can also be applied to evaluate regional coastal impacts integrating changes in storminess and sea level rise.
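A hedged sketch of the weather-typing idea described above, on synthetic data: plain K-means stands in for the paper's semi-supervised classification, and the projected statistic is a simple frequency-weighted mean of the per-type historical wave climate.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(4)

# Synthetic stand-ins: daily predictor fields (e.g. sea-level pressure over the wave
# generation area, flattened to vectors) and co-located significant wave height (Hs).
n_hist, n_fut, n_grid, n_types = 3000, 3000, 50, 8
slp_hist = rng.normal(size=(n_hist, n_grid))
hs_hist = np.abs(rng.normal(loc=2.0, scale=0.8, size=n_hist))
slp_future = rng.normal(loc=0.3, size=(n_fut, n_grid))     # GCM-projected predictor fields

# 1) Weather types from the historical predictor (plain K-means here; the paper's
#    classification is semi-supervised and guided by the local wave climate).
km = KMeans(n_clusters=n_types, n_init=10, random_state=0).fit(slp_hist)
hist_labels = km.labels_

# 2) Empirical Hs statistics attached to each weather type from the historical record.
hs_mean_by_type = np.array([hs_hist[hist_labels == k].mean() for k in range(n_types)])

# 3) Future type frequencies from the GCM predictor fields, then a frequency-weighted
#    estimate of the projected mean wave climate.
future_freq = np.bincount(km.predict(slp_future), minlength=n_types) / n_fut
print("projected mean Hs:", float(future_freq @ hs_mean_by_type))
```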
Vortex dynamics and Lagrangian statistics in a model for active turbulence.
James, Martin; Wilczek, Michael
2018-02-14
Cellular suspensions such as dense bacterial flows exhibit a turbulence-like phase under certain conditions. We study this phenomenon of "active turbulence" statistically by using numerical tools. Following Wensink et al. (Proc. Natl. Acad. Sci. U.S.A. 109, 14308 (2012)), we model active turbulence by means of a generalized Navier-Stokes equation. Two-point velocity statistics of active turbulence, both in the Eulerian and the Lagrangian frame, is explored. We characterize the scale-dependent features of two-point statistics in this system. Furthermore, we extend this statistical study with measurements of vortex dynamics in this system. Our observations suggest that the large-scale statistics of active turbulence is close to Gaussian with sub-Gaussian tails.
Teo, Guoshou; Kim, Sinae; Tsou, Chih-Chiang; Collins, Ben; Gingras, Anne-Claude; Nesvizhskii, Alexey I; Choi, Hyungwon
2015-11-03
Data independent acquisition (DIA) mass spectrometry is an emerging technique that offers more complete detection and quantification of peptides and proteins across multiple samples. DIA allows fragment-level quantification, which can be considered as repeated measurements of the abundance of the corresponding peptides and proteins in the downstream statistical analysis. However, few statistical approaches are available for aggregating these complex fragment-level data into peptide- or protein-level statistical summaries. In this work, we describe a software package, mapDIA, for statistical analysis of differential protein expression using DIA fragment-level intensities. The workflow consists of three major steps: intensity normalization, peptide/fragment selection, and statistical analysis. First, mapDIA offers normalization of fragment-level intensities by total intensity sums as well as a novel alternative normalization by local intensity sums in retention time space. Second, mapDIA removes outlier observations and selects peptides/fragments that preserve the major quantitative patterns across all samples for each protein. Last, using the selected fragments and peptides, mapDIA performs model-based statistical significance analysis of protein-level differential expression between specified groups of samples. Using a comprehensive set of simulation datasets, we show that mapDIA detects differentially expressed proteins with accurate control of the false discovery rates. We also describe the analysis procedure in detail using two recently published DIA datasets generated for the 14-3-3β dynamic interaction network and the prostate cancer glycoproteome. The software was written in C++ and the source code is available for free through the SourceForge website http://sourceforge.net/projects/mapdia/. This article is part of a Special Issue entitled: Computational Proteomics. Copyright © 2015 Elsevier B.V. All rights reserved.
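mapDIA itself is written in C++; the Python sketch below only illustrates the first two workflow steps on synthetic data: total-intensity-sum normalization, and a simplified correlation-based stand-in for the fragment selection step (not mapDIA's actual selection procedure).

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(5)

# Synthetic fragment-level DIA intensities: one protein, 6 fragments, 8 samples.
samples = [f"S{i}" for i in range(8)]
true_profile = np.array([1, 1, 1, 1, 2, 2, 2, 2], dtype=float)       # 2-fold change
X = pd.DataFrame(
    np.exp(rng.normal(0, 0.1, size=(6, 8))) * true_profile * 1e5,
    index=[f"frag{i}" for i in range(6)], columns=samples)
X.iloc[5] = rng.uniform(5e4, 6e4, size=8)                             # one noisy fragment

# Step 1 -- normalization by total intensity sums per sample (mapDIA also offers a
# local, retention-time-based variant not shown here).
X_norm = X / X.sum(axis=0) * X.sum(axis=0).median()

# Step 2 -- fragment selection: keep fragments whose profile tracks the protein-level
# consensus pattern (a simplified stand-in for mapDIA's own selection rules).
log_x = np.log2(X_norm)
consensus = log_x.median(axis=0)
corr = log_x.apply(lambda row: np.corrcoef(row, consensus)[0, 1], axis=1)
selected = X_norm.loc[corr > 0.5]
print("fragments kept:", list(selected.index))
```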
NASA Astrophysics Data System (ADS)
Jacquemin, Ingrid; Henrot, Alexandra-Jane; Beckers, Veronique; Berckmans, Julie; Debusscher, Bos; Dury, Marie; Minet, Julien; Hamdi, Rafiq; Dendoncker, Nicolas; Tychon, Bernard; Hambuckers, Alain; François, Louis
2016-04-01
The interactions between land surface and climate are complex. Climate changes can affect ecosystem structure and functions, by altering photosynthesis and productivity or inducing thermal and hydric stresses on plant species. These changes then impact socio-economic systems, through e.g., lower farming or forestry incomes. Ultimately, it can lead to permanent changes in land use structure, especially when associated with other non-climatic factors, such as urbanization pressure. These interactions and changes have feedbacks on the climate systems, in terms of changing: (1) surface properties (albedo, roughness, evapotranspiration, etc.) and (2) greenhouse gas emissions (mainly CO2, CH4, N2O). In the framework of the MASC project (« Modelling and Assessing Surface Change impacts on Belgian and Western European climate »), we aim at improving regional climate model projections at the decennial scale over Belgium and Western Europe by combining high-resolution models of climate, land surface dynamics and socio-economic processes. The land surface dynamics (LSD) module is composed of a dynamic vegetation model (CARAIB) calculating the productivity and growth of natural and managed vegetation, and an agent-based model (CRAFTY), determining the shifts in land use and land cover. This up-scaled LSD module is made consistent with the surface scheme of the regional climate model (RCM: ALARO) to allow simulations of the RCM with a fully dynamic land surface for the recent past and the period 2000-2030. In this contribution, we analyze the results of the first simulations performed with the CARAIB dynamic vegetation model over Belgium at a resolution of 1km. This analysis is performed at the species level, using a set of 17 species for natural vegetation (trees and grasses) and 10 crops, especially designed to represent the Belgian vegetation. The CARAIB model is forced with surface atmospheric variables derived from the monthly global CRU climatology or ALARO outputs (from a 4 km resolution simulation) for the recent past and the decennial projections. Evidently, these simulations lead to a first analysis of the impact of climate change on carbon stocks (e.g., biomass, soil carbon) and fluxes (e.g., gross and net primary productivities (GPP and NPP) and net ecosystem production (NEP)). The surface scheme is based on two land use/land cover databases, ECOPLAN for the Flemish region and, for the Walloon region, the COS-Wallonia database and the Belgian agricultural statistics for agricultural land. Land use and land cover are fixed through time (reference year: 2007) in these simulations, but a first attempt of coupling between CARAIB and CRAFTY will be made to establish dynamic land use change scenarios for the next decades. A simulation with variable land use would allow an analysis of land use change impacts not only on crop yields and the land carbon budget, but also on climate relevant parameters, such as surface albedo, roughness length and evapotranspiration towards a coupling with the RCM.
Projected continent-wide declines of the emperor penguin under climate change
NASA Astrophysics Data System (ADS)
Jenouvrier, Stéphanie; Holland, Marika; Stroeve, Julienne; Serreze, Mark; Barbraud, Christophe; Weimerskirch, Henri; Caswell, Hal
2014-08-01
Climate change has been projected to affect species distribution and future trends of local populations, but projections of global population trends are rare. We analyse global population trends of the emperor penguin (Aptenodytes forsteri), an iconic Antarctic top predator, under the influence of sea ice conditions projected by coupled climate models assessed in the Intergovernmental Panel on Climate Change (IPCC) effort. We project the dynamics of all 45 known emperor penguin colonies by forcing a sea-ice-dependent demographic model with local, colony-specific, sea ice conditions projected through to the end of the twenty-first century. Dynamics differ among colonies, but by 2100 all populations are projected to be declining. At least two-thirds are projected to have declined by >50% from their current size. The global population is projected to have declined by at least 19%. Because criteria to classify species by their extinction risk are based on the global population dynamics, global analyses are critical for conservation. We discuss uncertainties arising in such global projections and the problems of defining conservation criteria for species endangered by future climate change.
Contingency and statistical laws in replicate microbial closed ecosystems.
Hekstra, Doeke R; Leibler, Stanislas
2012-05-25
Contingency, the persistent influence of past random events, pervades biology. To what extent, then, is each course of ecological or evolutionary dynamics unique, and to what extent are these dynamics subject to a common statistical structure? Addressing this question requires replicate measurements to search for emergent statistical laws. We establish a readily replicated microbial closed ecosystem (CES), sustaining its three species for years. We precisely measure the local population density of each species in many CES replicates, started from the same initial conditions and kept under constant light and temperature. The covariation among replicates of the three species densities acquires a stable structure, which could be decomposed into discrete eigenvectors, or "ecomodes." The largest ecomode dominates population density fluctuations around the replicate-average dynamics. These fluctuations follow simple power laws consistent with a geometric random walk. Thus, variability in ecological dynamics can be studied with CES replicates and described by simple statistical laws. Copyright © 2012 Elsevier Inc. All rights reserved.
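A minimal sketch of the "ecomode" decomposition on synthetic replicate trajectories: ordinary PCA of the three species' density deviations from the replicate-average dynamics stands in for the paper's analysis, and the leading eigenvector plays the role of the dominant ecomode.

```python
import numpy as np

rng = np.random.default_rng(6)

# Synthetic replicate measurements: R replicates, T time points, 3 species densities.
R, T = 30, 50
t = np.linspace(0, 1, T)
mean_traj = np.stack([10 + 2 * t, 5 - t, 1 + 0.5 * t], axis=1)          # (T, 3)
shared_mode = np.array([0.7, -0.5, 0.5])                                 # correlated fluctuation
densities = (mean_traj[None]
             + rng.normal(size=(R, T, 1)) * shared_mode * 0.8            # replicate-level mode
             + rng.normal(scale=0.2, size=(R, T, 3)))                    # measurement noise

# Deviations of each replicate from the replicate-average dynamics, pooled over time.
deviations = (densities - densities.mean(axis=0)).reshape(-1, 3)

# "Ecomodes" as eigenvectors of the 3x3 covariance of those deviations.
cov = np.cov(deviations.T)
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]
print("variance explained:", np.round(eigvals[order] / eigvals.sum(), 2))
print("leading ecomode   :", np.round(eigvecs[:, order[0]], 2))
```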
Soft x-ray speckle from rough surfaces
NASA Astrophysics Data System (ADS)
Porter, Matthew Stanton
Dynamic light scattering has been of great use in determining diffusion times for polymer solutions. At the same time, polymer thin films are becoming of increasing importance, especially in the semiconductor industry where they are used as photoresists and interlevel dielectrics. As the dimensions of these devices decrease we will reach a point where lasers will no longer be able to probe the length scales of interest. Current laser wavelengths limit the size of observable diffusion lengths to 180-700 nm. This dissertation will discuss attempts at pushing dynamic light scattering experiments into the soft x-ray region so that we can examine fluctuations in polymer thin films on the molecular length scale. The dissertation explores the possibility of carrying out a dynamic light scattering experiment in the soft x-ray regime. A detailed account of how to meet the basic requirements for a coherent scattering experiment in the soft x-ray regime will be given. In addition, a complete description of the chamber design will be discussed. We used our custom designed scattering chamber to collect reproducible coherent soft x-ray scattering data from etched silicon wafers and from polystyrene coated silicon wafers. The data from the silicon wafers followed the statistics for a well-developed speckle pattern while the data from the polystyrene films exhibited Poisson statistics. We used the data from both the etched wafers and the polystyrene coated wafers to place a lower limit of ~20 Å on the RMS surface roughness of samples which will produce well defined speckle patterns for the current detector setup. Future experiments which use the criteria set forth in this dissertation have the opportunity to be even more successful than this dissertation project.
Lohmann, Gabriele; Stelzer, Johannes; Zuber, Verena; Buschmann, Tilo; Margulies, Daniel; Bartels, Andreas; Scheffler, Klaus
2016-01-01
The formation of transient networks in response to external stimuli or as a reflection of internal cognitive processes is a hallmark of human brain function. However, its identification in fMRI data of the human brain is notoriously difficult. Here we propose a new method of fMRI data analysis that tackles this problem by considering large-scale, task-related synchronisation networks. Networks consist of nodes and edges connecting them, where nodes correspond to voxels in fMRI data, and the weight of an edge is determined via task-related changes in dynamic synchronisation between their respective times series. Based on these definitions, we developed a new data analysis algorithm that identifies edges that show differing levels of synchrony between two distinct task conditions and that occur in dense packs with similar characteristics. Hence, we call this approach “Task-related Edge Density” (TED). TED proved to be a very strong marker for dynamic network formation that easily lends itself to statistical analysis using large scale statistical inference. A major advantage of TED compared to other methods is that it does not depend on any specific hemodynamic response model, and it also does not require a presegmentation of the data for dimensionality reduction as it can handle large networks consisting of tens of thousands of voxels. We applied TED to fMRI data of a fingertapping and an emotion processing task provided by the Human Connectome Project. TED revealed network-based involvement of a large number of brain areas that evaded detection using traditional GLM-based analysis. We show that our proposed method provides an entirely new window into the immense complexity of human brain function. PMID:27341204
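A toy sketch of the edge-weight idea behind TED, on synthetic voxel time series: a plain Pearson-correlation difference between two task conditions and an arbitrary threshold stand in for TED's synchrony measure and its large-scale statistical inference.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic time series for a handful of "voxels" under two task conditions A and B.
n_vox, T = 20, 200
cond_A = rng.normal(size=(n_vox, T))
cond_B = rng.normal(size=(n_vox, T))
driver = rng.normal(size=T)
cond_A[:5] += 1.5 * driver          # voxels 0-4 synchronise only in condition A

def synchrony(x):
    """Pairwise Pearson correlation matrix as a simple synchrony measure."""
    return np.corrcoef(x)

# Task-related edge weights: change in synchrony between the two conditions.
edge_weight = np.abs(synchrony(cond_A) - synchrony(cond_B))
np.fill_diagonal(edge_weight, 0.0)

# Keep the edges with the largest task-related change (the threshold is arbitrary here;
# TED itself assesses the density of such edges with large-scale statistical inference).
ii, jj = np.where(edge_weight > 0.5)
edges = {(int(i), int(j)) for i, j in zip(ii, jj) if i < j}
print("edges with strong task-related synchrony change:", sorted(edges)[:10])
```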
Lohmann, Gabriele; Stelzer, Johannes; Zuber, Verena; Buschmann, Tilo; Margulies, Daniel; Bartels, Andreas; Scheffler, Klaus
2016-01-01
The formation of transient networks in response to external stimuli or as a reflection of internal cognitive processes is a hallmark of human brain function. However, its identification in fMRI data of the human brain is notoriously difficult. Here we propose a new method of fMRI data analysis that tackles this problem by considering large-scale, task-related synchronisation networks. Networks consist of nodes and edges connecting them, where nodes correspond to voxels in fMRI data, and the weight of an edge is determined via task-related changes in dynamic synchronisation between their respective times series. Based on these definitions, we developed a new data analysis algorithm that identifies edges that show differing levels of synchrony between two distinct task conditions and that occur in dense packs with similar characteristics. Hence, we call this approach "Task-related Edge Density" (TED). TED proved to be a very strong marker for dynamic network formation that easily lends itself to statistical analysis using large scale statistical inference. A major advantage of TED compared to other methods is that it does not depend on any specific hemodynamic response model, and it also does not require a presegmentation of the data for dimensionality reduction as it can handle large networks consisting of tens of thousands of voxels. We applied TED to fMRI data of a fingertapping and an emotion processing task provided by the Human Connectome Project. TED revealed network-based involvement of a large number of brain areas that evaded detection using traditional GLM-based analysis. We show that our proposed method provides an entirely new window into the immense complexity of human brain function.
NASA Astrophysics Data System (ADS)
Tong, Xiaowei; Wang, Kelin; Yue, Yuemin; Brandt, Martin; Liu, Bo; Zhang, Chunhua; Liao, Chujie; Fensholt, Rasmus
2017-02-01
To alleviate the severe rocky desertification and improve the ecological degradation conditions in Southwest China, the national and local Chinese governments have implemented a series of Ecological Restoration Projects (ERPs) since the late 1990s. This study proposed a remote sensing based approach to evaluate the long-term efforts of the ERPs started in 2000. The method applies a time-series trend analysis of satellite based vegetation data corrected for climatic influences to reveal human induced vegetation changes. The improved residual method is combined with statistics on the invested project funds to derive an index, Project Effectiveness Index (PEI), measuring the project effectiveness at county scale. High effectiveness is detected in the Guangxi Province, moderate effectiveness in the Guizhou Province, and low and no effectiveness in the Yunnan Province. Successful implementations are closely related to the combined influences from climatic conditions and human management. The landforms of Peak Forest Plain and Peak Cluster Depression regions in the Guangxi Province are characterized by temperate climate with sufficient rainfall generally leading to a high effectiveness. For the karst regions of the Yunnan and Guizhou Provinces with rough terrain and lower rainfall combined with poor management practices (unsuitable species selection, low compensation rate for peasants), only low or even no effect of project implementations can be observed. However, the effectiveness distribution is not homogeneous and counties with high project effectiveness in spite of complex natural conditions were identified, while counties with negative vegetation trends despite relatively favorable conditions and high investments were also distinguished. The proposed framework is expected to be of high relevance in general monitoring of the successfulness of ecological conservation projects in relation to invested funds.
Climate change impacts on the Lehman-Baker Creek drainage in the Great Basin National Park
NASA Astrophysics Data System (ADS)
Volk, J. M.
2013-12-01
Global climate models (GCMs) forced by increased CO2 emissions forecast anomalously dry and warm trends over the southwestern U.S. for the 21st century. Warmer conditions may result in decreased surface water resources within the Great Basin physiographic region that are critical for ecology, irrigation and municipal water supply. Here we use downscaled GCM output from the A2 and B1 greenhouse gas emission scenarios to force a Precipitation-Runoff Modeling System (PRMS) watershed model developed for the Lehman and Baker Creeks Drainage (LBCD) in the Great Basin National Park, NV for a century-long time period. The goal is to quantify the effects of rising temperature on the water budget of the LBCD at monthly and annual timescales. Dynamically downscaled GCM projections are attained from the NSF EPSCoR Nevada Infrastructure for Climate Change Science, Education, and Outreach project and statistically downscaled output is retrieved from the "U.S. Bias Corrected and Downscaled WCRP CMIP3 Climate Projections". Historical daily climate and streamflow data have been collected simultaneously for periods extending 20 years or longer. Mann-Kendall trend test results showed a statistically significant (α = 0.05) long-term rising trend from 1895 to 2012 in annual and monthly average temperatures for the study area. A grid-based, PRMS watershed model of the LBCD has been created within ArcGIS 10, and physical parameters have been estimated at a spatial resolution of 100 m. Simulation results will be available soon. Snow cover is expected to decrease and peak runoff to occur earlier in the spring, resulting in increased runoff, decreased infiltration/recharge, decreased baseflows, and decreased evapotranspiration.
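A minimal implementation of the Mann-Kendall trend test of the kind applied to the temperature series above (normal approximation, no tie correction), run here on a synthetic annual temperature series rather than the study's data.

```python
import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    """Mann-Kendall trend test (normal approximation, no tie correction)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    s = 0.0
    for i in range(n - 1):
        s += np.sign(x[i + 1:] - x[i]).sum()      # S statistic: sum of pairwise signs
    var_s = n * (n - 1) * (2 * n + 5) / 18.0      # variance of S in the absence of ties
    if s > 0:
        z = (s - 1) / np.sqrt(var_s)              # continuity-corrected Z score
    elif s < 0:
        z = (s + 1) / np.sqrt(var_s)
    else:
        z = 0.0
    p = 2 * (1 - norm.cdf(abs(z)))                # two-sided p-value
    return s, z, p

# Example: annual mean temperatures with a weak warming trend plus noise (synthetic).
rng = np.random.default_rng(8)
years = np.arange(1895, 2013)
temps = 10.0 + 0.01 * (years - years[0]) + rng.normal(scale=0.4, size=years.size)
s, z, p = mann_kendall(temps)
print(f"S = {s:.0f}, Z = {z:.2f}, p = {p:.4f} (compare with alpha = 0.05)")
```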
ERIC Educational Resources Information Center
Dierker, Lisa; Alexander, Jalen; Cooper, Jennifer L.; Selya, Arielle; Rose, Jennifer; Dasgupta, Nilanjana
2016-01-01
Introductory statistics needs innovative, evidence-based teaching practices that support and engage diverse students. To evaluate the success of a multidisciplinary, project-based course, we compared experiences of under-represented (URM) and non-underrepresented students in 4 years of the course. While URM students considered the material more…
Monitoring software development through dynamic variables
NASA Technical Reports Server (NTRS)
Doerflinger, Carl W.; Basili, Victor R.
1983-01-01
Research conducted by the Software Engineering Laboratory (SEL) on the use of dynamic variables as a tool to monitor software development is described. Project independent measures which may be used in a management tool for monitoring software development are identified. Several FORTRAN projects with similar profiles are examined. The staff was experienced in developing these types of projects. The projects developed serve similar functions. Because these projects are similar some underlying relationships exist that are invariant between projects. These relationships, once well defined, may be used to compare the development of different projects to determine whether they are evolving the same way previous projects in this environment evolved.
Visual exploration of high-dimensional data through subspace analysis and dynamic projections
Liu, S.; Wang, B.; Thiagarajan, J. J.; ...
2015-06-01
Here, we introduce a novel interactive framework for visualizing and exploring high-dimensional datasets based on subspace analysis and dynamic projections. We assume the high-dimensional dataset can be represented by a mixture of low-dimensional linear subspaces with mixed dimensions, and provide a method to reliably estimate the intrinsic dimension and linear basis of each subspace extracted from the subspace clustering. Subsequently, we use these bases to define unique 2D linear projections as viewpoints from which to visualize the data. To understand the relationships among the different projections and to discover hidden patterns, we connect these projections through dynamic projections that create smooth animated transitions between pairs of projections. We introduce the view transition graph, which provides flexible navigation among these projections to facilitate an intuitive exploration. Finally, we provide detailed comparisons with related systems, and use real-world examples to demonstrate the novelty and usability of our proposed framework.
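A small sketch of the dynamic-projection idea, with synthetic data and random orthonormal viewpoints: linear blending of two projection bases followed by QR re-orthonormalization stands in for whatever interpolation scheme the authors' system actually uses.

```python
import numpy as np

rng = np.random.default_rng(9)

def random_projection_basis(dim, k=2):
    """Orthonormal basis of a random k-dimensional viewpoint in `dim` dimensions."""
    q, _ = np.linalg.qr(rng.normal(size=(dim, k)))
    return q

def dynamic_projection(X, P_start, P_end, n_frames=30):
    """Yield 2D views of X while morphing from one projection to the other.

    Each intermediate frame linearly blends the two bases and re-orthonormalizes
    with a QR factorization, so every frame is still a valid linear projection.
    (A Grassmannian geodesic would give a smoother path; this is the simple version.)
    """
    for t in np.linspace(0.0, 1.0, n_frames):
        blend = (1.0 - t) * P_start + t * P_end
        q, _ = np.linalg.qr(blend)
        yield X @ q[:, :2]

# High-dimensional data and two subspace-derived viewpoints (both synthetic here).
X = rng.normal(size=(500, 10))
P_a, P_b = random_projection_basis(10), random_projection_basis(10)
frames = list(dynamic_projection(X, P_a, P_b))
print(len(frames), frames[0].shape)      # 30 frames of (500, 2) scatter coordinates
```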
Visual Exploration of High-Dimensional Data through Subspace Analysis and Dynamic Projections
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, S.; Wang, B.; Thiagarajan, Jayaraman J.
2015-06-01
We introduce a novel interactive framework for visualizing and exploring high-dimensional datasets based on subspace analysis and dynamic projections. We assume the high-dimensional dataset can be represented by a mixture of low-dimensional linear subspaces with mixed dimensions, and provide a method to reliably estimate the intrinsic dimension and linear basis of each subspace extracted from the subspace clustering. Subsequently, we use these bases to define unique 2D linear projections as viewpoints from which to visualize the data. To understand the relationships among the different projections and to discover hidden patterns, we connect these projections through dynamic projections that create smooth animated transitions between pairs of projections. We introduce the view transition graph, which provides flexible navigation among these projections to facilitate an intuitive exploration. Finally, we provide detailed comparisons with related systems, and use real-world examples to demonstrate the novelty and usability of our proposed framework.
NASA Technical Reports Server (NTRS)
Murphy, Kyle R.; Mann, Ian R.; Rae, I. Jonathan; Sibeck, David G.; Watt, Clare E. J.
2016-01-01
Wave-particle interactions play a crucial role in energetic particle dynamics in the Earth's radiation belts. However, the relative importance of different wave modes in these dynamics is poorly understood. Typically, this is assessed during geomagnetic storms using statistically averaged empirical wave models as a function of geomagnetic activity in advanced radiation belt simulations. However, statistical averages poorly characterize extreme events such as geomagnetic storms, in that storm-time ultralow frequency wave power is typically larger than that derived over a solar cycle and Kp is a poor proxy for storm-time wave power.
Evolution of Ada technology in the flight dynamics area: Implementation/testing phase analysis
NASA Technical Reports Server (NTRS)
Quimby, Kelvin L.; Esker, Linda; Miller, John; Smith, Laurie; Stark, Mike; Mcgarry, Frank
1989-01-01
An analysis is presented of the software engineering issues related to the use of Ada for the implementation and system testing phases of four Ada projects developed in the flight dynamics area. These projects reflect an evolving understanding of more effective use of Ada features. In addition, the testing methodology used on these projects has changed substantially from that used on previous FORTRAN projects.
Dynamic systems and the role of evaluation: The case of the Green Communities project.
Anzoise, Valentina; Sardo, Stefania
2016-02-01
The crucial role evaluation can play in the co-development of project design and its implementation will be addressed through the analysis of a case study, the Green Communities (GC) project, funded by the Italian Ministry of Environment within the EU Interregional Operational Program (2007-2013) "Renewable Energy and Energy Efficiency". The project's broader goals included an attempt to trigger a change in Italian local development strategies, especially for mountain and inland areas, which would be tailored to the real needs of communities, and based on a sustainable exploitation and management of the territorial assets. The goal was not achieved, and this paper addresses the issues of how GC could have been more effective in fostering a vision of change, and which design adaptations and evaluation procedures would have allowed the project to better cope with the unexpected consequences and resistances it encountered. The conclusions drawn are that projects should be conceived, designed and carried out as dynamic systems, inclusive of a dynamic and engaged evaluation enabling the generation of feedbacks loops, iteratively interpreting the narratives and dynamics unfolding within the project, and actively monitoring the potential of various relationships among project participants for generating positive social change. Copyright © 2015 Elsevier Ltd. All rights reserved.
Probing the dynamical and X-ray mass proxies of the cluster of galaxies Abell S1101
NASA Astrophysics Data System (ADS)
Rabitz, Andreas; Zhang, Yu-Ying; Schwope, Axel; Verdugo, Miguel; Reiprich, Thomas H.; Klein, Matthias
2017-01-01
Context. The galaxy cluster Abell S1101 (S1101 hereafter) deviates significantly from the X-ray luminosity versus velocity dispersion relation (L-σ) of galaxy clusters in our previous study. Given reliable X-ray luminosity measurement combining XMM-Newton and ROSAT, this could most likely be caused by the bias in the velocity dispersion due to interlopers and low member statistic in the previous sample of member galaxies, which was solely based on 20 galaxy redshifts drawn from the literature. Aims: We intend to increase the galaxy member statistics to perform precision measurements of the velocity dispersion and dynamical mass of S1101. We aim for a detailed substructure and dynamical state characterization of this cluster, and a comparison of mass estimates derived from (I) the velocity dispersion (Mvir), (II) the caustic mass computation (Mcaustic), and (III) mass proxies from X-ray observations and the Sunyaev-Zel'dovich (SZ) effect. Methods: We carried out new optical spectroscopic observations of the galaxies in this cluster field with VIMOS, obtaining a sample of 60 member galaxies for S1101. We revised the cluster redshift and velocity dispersion measurements based on this sample and also applied the Dressler-Shectman substructure test. Results: The completeness of cluster members within r200 was significantly improved for this cluster. Tests for dynamical substructure do not show evidence of major disturbances or merging activities in S1101. We find good agreement between the dynamical cluster mass measurements and X-ray mass estimates, which confirms the relaxed state of the cluster displayed in the 2D substructure test. The SZ mass proxy is slightly higher than the other estimates. The updated measurement of σ erased the deviation of S1101 in the L-σ relation. We also noticed a background structure in the cluster field of S1101. This structure is a galaxy group that is very close to the cluster S1101 in projection but at almost twice its redshift. However the mass of this structure is too low to significantly bias the observed bolometric X-ray luminosity of S1101. Hence, we can conclude that the deviation of S1101 in the L-σ relation in our previous study can be explained by low member statistics and galaxy interlopers, which are known to introduce biases in the estimated velocity dispersion. We have made use of VLT/VIMOS observations taken with the ESO Telescope at the Paranal Observatory under programme 087.A-0096.
Dynamic Graphics in Excel for Teaching Statistics: Understanding the Probability Density Function
ERIC Educational Resources Information Center
Coll-Serrano, Vicente; Blasco-Blasco, Olga; Alvarez-Jareno, Jose A.
2011-01-01
In this article, we show a dynamic graphic in Excel that is used to introduce an important concept in our subject, Statistics I: the probability density function. This interactive graphic seeks to facilitate conceptual understanding of the main aspects analysed by the learners.
NASA Astrophysics Data System (ADS)
Abe, Steffen; Krieger, Lars; Deckert, Hagen
2017-04-01
The changes of fluid pressures related to the injection of fluids into the deep underground, for example during geothermal energy production, can potentially reactivate faults and thus cause induced seismic events. Therefore, an important aspect in the planning and operation of such projects, in particular in densely populated regions such as the Upper Rhine Graben in Germany, is the estimation and mitigation of the induced seismic risk. The occurrence of induced seismicity depends on a combination of hydraulic properties of the underground, mechanical and geometric parameters of the fault, and the fluid injection regime. In this study we are therefore employing a numerical model to investigate the impact of fluid pressure changes on the dynamics of the faults and the resulting seismicity. The approach combines a model of the fluid flow around a geothermal well based on a 3D finite difference discretisation of the Darcy-equation with a 2D block-slider model of a fault. The models are coupled so that the evolving pore pressure at the relevant locations of the hydraulic model is taken into account in the calculation of the stick-slip dynamics of the fault model. Our modelling approach uses two subsequent modelling steps. Initially, the fault model is run by applying a fixed deformation rate for a given duration and without the influence of the hydraulic model in order to generate the background event statistics. Initial tests have shown that the response of the fault to hydraulic loading depends on the timing of the fluid injection relative to the seismic cycle of the fault. Therefore, multiple snapshots of the fault's stress- and displacement state are generated from the fault model. In a second step, these snapshots are then used as initial conditions in a set of coupled hydro-mechanical model runs including the effects of the fluid injection. This set of models is then compared with the background event statistics to evaluate the change in the probability of seismic events. The event data such as location, magnitude, and source characteristics can be used as input for numerical wave propagation models. This allows the translation of seismic event statistics generated by the model into ground shaking probabilities.
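A toy version of the coupling described above: a 1D explicit finite-difference pressure diffusion stands in for the 3D Darcy solver, and a single Coulomb fault patch with a fixed stress drop stands in for the 2D block-slider model. All parameter values are assumed for illustration only.

```python
import numpy as np

# --- toy pore-pressure diffusion: 1D explicit finite differences stand in for the 3D Darcy model ---
nx, dx, dt, nsteps = 200, 10.0, 50.0, 2000        # grid cells, cell size (m), time step (s), steps
D = 0.5                                           # hydraulic diffusivity, m^2/s (assumed)
p = np.zeros(nx)                                  # overpressure above hydrostatic, Pa
p_well = 5e6                                      # constant injection overpressure at the well, Pa

# --- toy block-slider fault patch 200 m from the well, with Coulomb friction ---
mu, sigma_n = 0.6, 30e6                           # friction coefficient, normal stress (Pa)
tau = 17.2e6                                      # initial shear stress, close to failure (Pa)
load_rate, stress_drop = 1.0, 2e5                 # tectonic loading (Pa/s), stress drop per event (Pa)
fault_cell, events = 20, []

for step in range(nsteps):
    # Darcy/diffusion update (explicit FTCS; stable since D*dt/dx**2 = 0.25 < 0.5).
    p[0] = p_well
    p[1:-1] += D * dt / dx**2 * (p[2:] - 2 * p[1:-1] + p[:-2])

    # Slow tectonic loading, then a Coulomb failure check in which the local pore
    # pressure reduces the effective normal stress on the fault patch.
    tau += load_rate * dt
    strength = mu * (sigma_n - p[fault_cell])
    if tau >= strength:
        events.append(step * dt)
        tau -= stress_drop                        # simplified constant stress drop per event

print(f"{len(events)} induced events; first at t = {events[0]:.0f} s" if events else "no induced events")
```

Running such a toy model from different points of the background stress cycle (different initial `tau`) mimics the snapshot-based workflow described in the abstract, and the resulting event times and counts can be compared with a no-injection run to estimate the change in event probability.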
Using SMAP data to improve drought early warning over the US Great Plains
NASA Astrophysics Data System (ADS)
Fu, R.; Fernando, N.; Tang, W.
2015-12-01
A drought-prone region such as the Great Plains of the United States (US GP) requires credible and actionable drought early warning. Such information cannot simply be extracted from available climate forecasts because of their large uncertainties at regional scales, and unclear connections to the needs of the decision makers. In particular, current dynamic seasonal predictions and climate projections, such as those produced by the NOAA North American Multi-Model Ensemble experiment (NMME), are much more reliable for winter and spring than for the summer season for the US GP. To mitigate the weaknesses of dynamic predictions/projections, we have identified three key processes behind the spring-to-summer dry memory through observational studies, as the scientific basis for a statistical drought early warning system. This system uses percentile soil moisture anomalies in spring as a key input to provide a probabilistic summer drought early warning. The latter outperforms the dynamic prediction over the US Southern Plains and has been used by the Texas state water agency to support state drought preparedness. A main source of uncertainty for this drought early warning system is the soil moisture input obtained from the NOAA Climate Forecasting System (CFS). We are testing use of the beta version of NASA Soil Moisture Active Passive (SMAP) soil moisture data, along with the Soil Moisture and Ocean Salinity (SMOS) and the long-term Essential Climate Variable Soil Moisture (ECV-SM) data, to reduce this uncertainty. Preliminary results based on ECV-SM suggest that satellite-based soil moisture data could improve early warning of rainfall anomalies over the western US GP with less dense vegetation. The skill degrades over the eastern US GP where denser vegetation is found. We evaluate our SMAP-based drought early warning for the 2015 summer against observations.
Zi, Tan; Schmidt, Michelle; Johnson, Thomas E.; Nover, Daniel M.; Clark, Christopher M.
2017-01-01
A warming climate increases thermal inputs to lakes with potential implications for water quality and aquatic ecosystems. In a previous study, we used a dynamic water column temperature and mixing simulation model to simulate chronic (7-day average) maximum temperatures under a range of potential future climate projections at selected sites representative of different U.S. regions. Here, to extend results to lakes where dynamic models have not been developed, we apply a novel machine learning approach that uses Gaussian Process regression to describe the model response surface as a function of simplified lake characteristics (depth, surface area, water clarity) and climate forcing (winter and summer air temperatures and potential evapotranspiration). We use this approach to extrapolate predictions from the simulation model to the statistical sample of U.S. lakes in the National Lakes Assessment (NLA) database. Results provide a national-scale scoping assessment of the potential thermal risk to lake water quality and ecosystems across the U.S. We suggest a small fraction of lakes will experience less risk of summer thermal stress events due to changes in stratification and mixing dynamics, but most will experience increases. The percentage of lakes in the NLA with simulated 7-day average maximum water temperatures in excess of 30°C is projected to increase from less than 2% to approximately 22% by the end of the 21st century, which could significantly reduce the number of lakes that can support cold water fisheries. Site-specific analysis of the full range of factors that influence thermal profiles in individual lakes is needed to develop appropriate adaptation strategies. PMID:29121058
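A hedged sketch of the general emulation idea follows: fit a Gaussian Process regression to a response surface defined over simplified lake characteristics and climate forcing, then predict for lakes without dynamic simulations. The predictor set, kernel choice, and the synthetic response below are illustrative assumptions, not the study's configuration or output.

```python
# Emulate a process model's response surface with GP regression, then apply it
# to lakes that were never simulated.  All data here are synthetic stand-ins.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(1)

# Training predictors: [depth (m), surface area (km2), clarity (m), summer Tair (C)]
X_train = np.column_stack([
    rng.uniform(2, 50, 200),
    rng.uniform(0.1, 100, 200),
    rng.uniform(0.5, 8, 200),
    rng.uniform(18, 32, 200),
])
# Synthetic "simulated" 7-day maximum water temperature (stand-in for model output)
y_train = 0.8 * X_train[:, 3] + 4.0 - 0.05 * X_train[:, 0] + rng.normal(0, 0.3, 200)

kernel = 1.0 * RBF(length_scale=np.ones(4)) + WhiteKernel(noise_level=0.1)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X_train, y_train)

# "NLA-like" lakes where no dynamic simulation exists
X_new = np.array([[10.0, 2.5, 3.0, 30.0],
                  [35.0, 60.0, 1.5, 26.0]])
mean, std = gp.predict(X_new, return_std=True)
for m, s in zip(mean, std):
    print(f"predicted 7-day max water temperature: {m:.1f} +/- {s:.1f} C")
```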
DOE Office of Scientific and Technical Information (OSTI.GOV)
CAP,JEROME S.
2000-08-24
Sandia has recently completed the flight certification test series for the Multi-Spectral Thermal Imaging satellite (MTI), a small satellite for which Sandia was the system integrator. A paper was presented at the 16th Aerospace Testing Seminar discussing plans for performing the structural dynamics certification program for that satellite. The testing philosophy was originally based on a combination of system-level vibroacoustic tests and component-level shock and vibration tests. However, the plans evolved to include computational analyses using both Finite Element Analysis and Statistical Energy Analysis techniques. This paper outlines the final certification process and discusses lessons learned, including both things that went well and things that should or could have been done differently.
Mathematical Optimization Techniques
NASA Technical Reports Server (NTRS)
Bellman, R. (Editor)
1963-01-01
The papers collected in this volume were presented at the Symposium on Mathematical Optimization Techniques held in the Santa Monica Civic Auditorium, Santa Monica, California, on October 18-20, 1960. The objective of the symposium was to bring together, for the purpose of mutual education, mathematicians, scientists, and engineers interested in modern optimization techniques. Some 250 persons attended. The techniques discussed included recent developments in linear, integer, convex, and dynamic programming as well as the variational processes surrounding optimal guidance, flight trajectories, statistical decisions, structural configurations, and adaptive control systems. The symposium was sponsored jointly by the University of California, with assistance from the National Science Foundation, the Office of Naval Research, the National Aeronautics and Space Administration, and The RAND Corporation, through Air Force Project RAND.
NASA Astrophysics Data System (ADS)
González, J. F.; Levato, H.; Grosso, M.
We present preliminary results of a long-term project devoted to the observational study of the binary star population in open clusters and its connection with the dynamical and evolutionary properties of the clusters. We report the discovery of 17 double-lined spectroscopic binaries, 30 radial velocity variables, and about 30 suspected variables. In the 17 clusters of our sample, the binary frequency ranges between 20% and 40%, and typically reaches 60% if all suspected binaries are included. We study the spatial distribution of the binary stars with respect to the cluster center and discuss the statistical correlation of the mass-ratio distribution with the cluster age.
NASA Astrophysics Data System (ADS)
Kohut, J. T.; Manderson, J.; Palamara, L. J.; Saba, V. S.; Saba, G.; Hare, J. A.; Curchitser, E. N.; Moore, P.; Seibel, B.; DiDomenico, G.
2016-12-01
Through a multidisciplinary study group of experts in marine ecology, physical oceanography and stock assessment from the fishing industry, government and academia, we developed a method to explicitly account for shifting habitat distributions in fish population assessments. We used data from field surveys throughout the Northwest Atlantic Ocean to develop a parametric thermal niche model for an important short-lived pelagic forage fish, Atlantic Butterfish. This niche model was coupled to a hindcast of daily bottom water temperature derived from a regional numerical ocean model in order to project daily thermal habitat suitability over the last 40 years. This ecological hindcast was used to estimate the proportion of thermal habitat suitability available on the U.S. Northeast Shelf that was sampled on fishery-independent surveys, accounting for the relative motions of thermal habitat and the trajectory of sampling on the survey. The method and the habitat-based estimates of availability were integrated into the catchability estimate used to scale population size in the butterfish stock assessment model accepted by the reviewers of the 59th NEFSC stock assessment review, as well as by the Mid-Atlantic Council's Scientific and Statistical Committee. The contribution of the availability estimate (along with an estimate of detectability) allowed for the development of fishery reference points, a change in stock status from unknown to known, and the establishment of a directed fishery with an allocation of 20,000 metric tons of quota. This presentation will describe how a community-based workgroup utilized ocean observing technologies combined with ocean models to better understand the physical ocean that structures marine ecosystems. Using these approaches, we will discuss opportunities to inform ecological hindcasts and climate projections with mechanistic models that link species-specific physiology to climate-based thermal scenarios.
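The sketch below illustrates the flavour of the habitat-availability calculation under stated assumptions: a Gaussian thermal suitability curve stands in for the parametric niche model, and "availability" is the share of grid-cell suitability falling inside the survey footprint. The grid, temperatures, and niche parameters are invented for illustration.

```python
# Illustrative sketch (not the assessment's niche model): Gaussian thermal
# suitability applied to a hindcast bottom-temperature grid, and availability
# = suitable habitat inside the survey footprint / suitable habitat overall.
import numpy as np

def thermal_suitability(temp_c, t_opt=12.0, t_width=4.0):
    """Parametric niche stand-in: suitability in [0, 1], peaking at t_opt."""
    return np.exp(-0.5 * ((temp_c - t_opt) / t_width) ** 2)

rng = np.random.default_rng(2)
bottom_temp = rng.normal(11.0, 5.0, size=(200, 300))   # hindcast bottom T grid (degC)
surveyed = np.zeros_like(bottom_temp, dtype=bool)
surveyed[:, :120] = True                               # grid cells covered by the survey

suit = thermal_suitability(bottom_temp)
availability = suit[surveyed].sum() / suit.sum()
print(f"fraction of thermal habitat available to the survey: {availability:.2f}")
```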
Dynamics of statistical distance: Quantum limits for two-level clocks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Braunstein, S.L.; Milburn, G.J.
1995-03-01
We study the evolution of statistical distance on the Bloch sphere under unitary and nonunitary dynamics. This corresponds to studying the limits to clock precision for a clock constructed from a two-state system. We find that the initial motion away from pure states under nonunitary dynamics yields the greatest accuracy for a "one-tick" clock; in this case the clock's precision is not limited by the largest frequency of the system.
The use and misuse of aircraft and missile RCS statistics
NASA Astrophysics Data System (ADS)
Bishop, Lee R.
1991-07-01
Both static and dynamic radar cross section (RCS) measurements are used for RCS predictions, but the static data are less complete than the dynamic data. Integrated dynamic RCS data also have limitations for predicting radar detection performance. When raw static data are properly used, good first-order detection estimates are possible. The research to develop more usable RCS statistics is reviewed, and windowing techniques for creating probability density functions from static RCS data are discussed.
Generic dynamical phase transition in one-dimensional bulk-driven lattice gases with exclusion
NASA Astrophysics Data System (ADS)
Lazarescu, Alexandre
2017-06-01
Dynamical phase transitions are crucial features of the fluctuations of statistical systems, corresponding to boundaries between qualitatively different mechanisms of maintaining unlikely values of dynamical observables over long periods of time. They manifest themselves in the form of non-analyticities in the large deviation function of those observables. In this paper, we look at bulk-driven exclusion processes with open boundaries. It is known that the standard asymmetric simple exclusion process exhibits a dynamical phase transition in the large deviations of the current of particles flowing through it. That phase transition has been described thanks to specific calculation methods relying on the model being exactly solvable, but more general methods have also been used to describe the extreme large deviations of that current, far from the phase transition. We extend those methods to a large class of models based on the ASEP, where we add arbitrary spatial inhomogeneities in the rates and short-range potentials between the particles. We show that, as for the regular ASEP, the large deviation function of the current scales differently with the size of the system if one considers very high or very low currents, pointing to the existence of a dynamical phase transition between those two regimes: high-current large deviations are extensive in the system size, and the typical states associated with them are Coulomb gases, which are highly correlated; low-current large deviations do not depend on the system size, and the typical states associated with them are anti-shocks, consistent with hydrodynamic behaviour. Finally, we illustrate our results numerically on a simple example, and we interpret the transition in terms of the current pushing beyond its maximal hydrodynamic value, as well as relate it to the appearance of Tracy-Widom distributions in the relaxation statistics of such models.
A "Sweet 16" of Rules About Teamwork
NASA Technical Reports Server (NTRS)
Laufer, Alexander (Editor)
2002-01-01
The following "Sweet 16" rules included in this paper derive from a longer paper by APPL Director Dr. Edward Hoffman and myself entitled " 99 Rules for Managing Faster, Better, Cheaper Projects." Our sources consisted mainly of "war stories" told by master project managers in my book Simultaneous Management: Managing Projects in a Dynamic Environment (AMACOM, The American Management Association, 1996). The Simultaneous Management model was a result of 10 years of intensive research and testing conducted with the active participation of master project managers from leading private organizations such as AT&T, DuPont, Exxon, General Motors, IBM, Motorola and Procter & Gamble. In a more recent study, led by Dr. Hoffman, we learned that master project managers in leading public organizations employ most of these rules as well. Both studies, in private and public organizations, found that a dynamic environment calls for dynamic management, and that is especially clear in how successful project managers think about their teams.
36 CFR 64.8 - Project selection criteria.
Code of Federal Regulations, 2010 CFR
2010-07-01
... after project approval. (b) Projects which are located or originate in Standard Metropolitan Statistical..., Federal, or local plans. (g) The degree to which the project advances new ideas in recreation/conservation...
Software conversion history of the Flight Dynamics System (FDS)
NASA Technical Reports Server (NTRS)
Liu, K.
1984-01-01
This report summarizes the overall history of the Flight Dynamics System (FDS) applications software conversion project. It describes the background and nature of the project; traces the actual course of conversion; assesses the process, product, and personnel involved; and offers suggestions for future projects. It also contains lists of pertinent reference material and examples of supporting data.
What to expect from dynamical modelling of galactic haloes - II. The spherical Jeans equation
NASA Astrophysics Data System (ADS)
Wang, Wenting; Han, Jiaxin; Cole, Shaun; More, Surhud; Frenk, Carlos; Schaller, Matthieu
2018-06-01
The spherical Jeans equation (SJE) is widely used in dynamical modelling of the Milky Way (MW) halo potential. We use haloes and galaxies from the cosmological Millennium-II simulation and hydrodynamical APOSTLE (A Project of Simulations of The Local Environment) simulations to investigate the performance of the SJE in recovering the underlying mass profiles of MW mass haloes. The best-fitting halo mass and concentration parameters scatter by 25 per cent and 40 per cent around their input values, respectively, when dark matter particles are used as tracers. This scatter becomes as large as a factor of 3 when star particles are used instead. This is significantly larger than the estimated statistical uncertainty associated with the use of the SJE. The existence of correlated phase-space structures that violate the steady-state assumption of the SJE, as well as non-spherical geometries, is the principal source of the scatter. Binary haloes show larger scatter because they are more aspherical in shape and have a more perturbed dynamical state. Our results confirm that the number of independent phase-space structures sets an intrinsic limiting precision on dynamical inferences based on the steady-state assumption. Modelling with a radius-independent velocity anisotropy, or using tracers within a limited outer radius, results in significantly larger scatter, but the ensemble-averaged measurement over the whole halo sample is approximately unbiased.
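For reference, the textbook form of the spherical Jeans equation and the enclosed-mass estimator derived from it (standard expressions, not quoted from the paper) are:

```latex
% Standard spherical Jeans equation and the enclosed-mass estimator it implies
% (textbook form, given here for reference only).
\begin{align}
  \frac{1}{\nu}\frac{\mathrm{d}\!\left(\nu\,\sigma_r^{2}\right)}{\mathrm{d}r}
    + \frac{2\beta(r)\,\sigma_r^{2}}{r} &= -\frac{G\,M(<r)}{r^{2}}, \\
  M(<r) &= -\frac{r\,\sigma_r^{2}}{G}
    \left(\frac{\mathrm{d}\ln\nu}{\mathrm{d}\ln r}
        + \frac{\mathrm{d}\ln\sigma_r^{2}}{\mathrm{d}\ln r}
        + 2\beta(r)\right),
\end{align}
% where \nu(r) is the tracer number density, \sigma_r(r) the radial velocity
% dispersion, and \beta(r) = 1 - (\sigma_\theta^2 + \sigma_\phi^2)/(2\sigma_r^2)
% the velocity anisotropy profile.
```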
Mori-Zwanzig theory for dissipative forces in coarse-grained dynamics in the Markov limit
NASA Astrophysics Data System (ADS)
Izvekov, Sergei
2017-01-01
We derive alternative Markov approximations for the projected (stochastic) force and memory function in the coarse-grained (CG) generalized Langevin equation, which describes the time evolution of the center-of-mass coordinates of clusters of particles in the microscopic ensemble. This is done with the aid of the Mori-Zwanzig projection operator method based on the recently introduced projection operator [S. Izvekov, J. Chem. Phys. 138, 134106 (2013), 10.1063/1.4795091]. The derivation exploits the "generalized additive fluctuating force" representation to which the projected force reduces in the adopted projection operator formalism. For the projected force, we present a first-order time expansion which correctly extends the static fluctuating force ansatz with the terms necessary to maintain the required orthogonality of the projected dynamics in the Markov limit to the space of CG phase variables. The approximant of the memory function correctly accounts for the momentum dependence in the lowest (second) order and indicates that such a dependence may be important in the CG dynamics approaching the Markov limit. In the case of CG dynamics with a weak dependence of the memory effects on the particle momenta, the expression for the memory function presented in this work is applicable to non-Markov systems. The approximations are formulated in a propagator-free form allowing their efficient evaluation from the microscopic data sampled by standard molecular dynamics simulations. A numerical application is presented for a molecular liquid (nitromethane). With our formalism we do not observe the "plateau-value problem" if the friction tensors for dissipative particle dynamics (DPD) are computed using the Green-Kubo relation. Our formalism provides a consistent bottom-up route for hierarchical parametrization of DPD models from atomistic simulations.
From Weakly Chaotic Dynamics to Deterministic Subdiffusion via Copula Modeling
NASA Astrophysics Data System (ADS)
Nazé, Pierre
2018-03-01
Copula modeling consists of finding a probabilistic distribution, called a copula, whereby its coupling with the marginal distributions of a set of random variables produces their joint distribution. The present work aims to use this technique to connect the statistical distributions of weakly chaotic dynamics and deterministic subdiffusion. More precisely, we decompose the jumps distribution of the Geisel-Thomae map into a bivariate one and determine the marginal and copula distributions respectively by infinite ergodic theory and statistical inference techniques. We therefore verify that the characteristic tail distribution of subdiffusion is an extreme value copula coupling Mittag-Leffler distributions. We also present a method to calculate the exact copula and joint distributions in the case where the weakly chaotic dynamics and deterministic subdiffusion statistical distributions are already known. Numerical simulations and consistency with the dynamical aspects of the map support our results.
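A minimal numerical illustration of the copula idea (Sklar-type coupling), with a Gaussian copula and Pareto marginals standing in for the extreme value copula and Mittag-Leffler marginals of the paper; it is not the paper's analysis of the Geisel-Thomae map.

```python
# Sklar-type coupling: draw samples from a copula, then push them through
# heavy-tailed marginal quantile functions.  Because Kendall's tau is invariant
# under monotone marginal transforms, the joint sample inherits its dependence
# entirely from the copula.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
rho = 0.6                                   # Gaussian-copula dependence parameter
cov = np.array([[1.0, rho], [rho, 1.0]])

z = rng.multivariate_normal([0.0, 0.0], cov, size=20_000)
u = stats.norm.cdf(z)                       # copula samples with uniform marginals

x = stats.pareto.ppf(u[:, 0], b=1.5)        # heavy-tailed marginal 1 (stand-in)
y = stats.pareto.ppf(u[:, 1], b=2.5)        # heavy-tailed marginal 2 (stand-in)

tau_copula, _ = stats.kendalltau(u[:, 0], u[:, 1])
tau_joint, _ = stats.kendalltau(x, y)
print(f"Kendall tau (copula): {tau_copula:.3f}")
print(f"Kendall tau (joint):  {tau_joint:.3f}")
```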
Analysis of reference transactions using packaged computer programs.
Calabretta, N; Ross, R
1984-01-01
Motivated by a continuing education class attended by the authors on the measurement of reference desk activities, the reference department at Scott Memorial Library initiated a project to gather data on reference desk transactions and to analyze the data by using packaged computer programs. The programs utilized for the project were SPSS (Statistical Package for the Social Sciences) and SAS (Statistical Analysis System). The planning, implementation and development of the project are described.
Brands, H; Maassen, S R; Clercx, H J
1999-09-01
In this paper the applicability of a statistical-mechanical theory to freely decaying two-dimensional (2D) turbulence on a bounded domain is investigated. We consider an ensemble of direct numerical simulations in a square box with stress-free boundaries, with a Reynolds number that is of the same order as in experiments on 2D decaying Navier-Stokes turbulence. The results of these simulations are compared with the corresponding statistical equilibria, calculated from different stages of the evolution. It is shown that the statistical equilibria calculated from early times of the Navier-Stokes evolution do not correspond to the dynamical quasistationary states. At best, the global topological structure is correctly predicted from a relatively late time in the Navier-Stokes evolution, when the quasistationary state has almost been reached. This failure of the (basically inviscid) statistical-mechanical theory is related to viscous dissipation and net leakage of vorticity in the Navier-Stokes dynamics at moderate values of the Reynolds number.
Sensitivity of Regional Hydropower Generation to the Projected Changes in Future Watershed Hydrology
NASA Astrophysics Data System (ADS)
Kao, S. C.; Naz, B. S.; Gangrade, S.
2015-12-01
Hydropower is a key contributor to the renewable energy portfolio due to its established development history and the diverse benefits it provides to the electric power systems. With the projected changes in future watershed hydrology, including shifts in snowmelt timing, increasing occurrence of extreme precipitation, and changes in drought frequency, there is a need to investigate how regional hydropower generation may change correspondingly. To evaluate the sensitivity of watershed storage and hydropower generation to future climate change, a lumped Watershed Runoff-Energy Storage (WRES) model is developed to simulate the annual and seasonal hydropower generation at various hydropower areas in the United States. For each hydropower study area, the WRES model uses the monthly precipitation and naturalized (unregulated) runoff as inputs to perform a runoff mass balance calculation for the total monthly runoff storage in all reservoirs and retention facilities in the watershed, and simulates the monthly regulated runoff release and hydropower generation through the system. The WRES model is developed and calibrated using the historic (1980-2009) monthly precipitation, runoff, and generation data, and then driven by a large set of dynamically- and statistically-downscaled Coupled Model Intercomparison Project Phase 5 climate projections to simulate the change of watershed storage and hydropower generation under different future climate scenarios. The results among different hydropower regions, storage capacities, emission scenarios, and timescales are compared and discussed in this study.
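An illustrative lumped monthly storage-release balance with a simple hydropower conversion is sketched below; it is in the spirit of the calculation described above, but the release rule, head, efficiency, and inflows are invented placeholders rather than the WRES formulation or calibration.

```python
# Illustrative lumped monthly water balance with a simple hydropower term.
import numpy as np

S_MAX = 2.0e9              # total reservoir storage capacity, m^3
TARGET_RELEASE = 4.0e8     # desired monthly release, m^3
RHO, G, HEAD, EFF = 1000.0, 9.81, 80.0, 0.85   # density, gravity, head (m), efficiency

def simulate(monthly_inflow_m3, s0=1.0e9):
    storage, energy_gwh = s0, []
    for q_in in monthly_inflow_m3:
        storage += q_in
        release = min(TARGET_RELEASE, storage)   # cannot release more than stored
        storage -= release
        spill = max(0.0, storage - S_MAX)        # excess above capacity spills
        storage -= spill
        energy_j = EFF * RHO * G * HEAD * release
        energy_gwh.append(energy_j / 3.6e12)     # J -> GWh
    return np.array(energy_gwh), storage

inflow = np.full(12, 3.5e8) * (1.0 + 0.4 * np.sin(np.linspace(0, 2 * np.pi, 12)))
gen, end_storage = simulate(inflow)
print(f"annual generation: {gen.sum():.0f} GWh, end-of-year storage: {end_storage:.2e} m^3")
```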
2014-12-01
[Abstract fragment recovered from a report form: fish schools moving relative to the water in which they are immersed, reflecting the true school movement dynamics, were measured with multi-beam sonars and quantified in terms of important aspects of fish dynamics, and predictions were made of the echo statistics.]
NASA Astrophysics Data System (ADS)
Xu, Jin; Li, Zheng; Li, Shuliang; Zhang, Yanyan
2015-07-01
There is still a lack of effective paradigms and tools for analysing and discovering the contents and relationships of project knowledge contexts in the field of project management. In this paper, a new framework for extracting and representing project knowledge contexts using topic models and dynamic knowledge maps under big data environments is proposed and developed. The conceptual paradigm, theoretical underpinning, extended topic model, and illustration examples of the ontology model for project knowledge maps are presented, with further research work envisaged.
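A toy sketch of the topic-modelling step, using scikit-learn's standard LDA on a few invented project documents; this is not the paper's extended topic model or its dynamic knowledge maps.

```python
# Extract simple "knowledge context" topics from toy project documents with LDA.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

docs = [
    "schedule risk register milestone baseline schedule",
    "stakeholder communication plan meeting stakeholder report",
    "budget cost estimate contingency cost baseline",
    "risk mitigation contingency schedule risk",
]
vec = CountVectorizer()
X = vec.fit_transform(docs)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)

terms = vec.get_feature_names_out()
for k, topic in enumerate(lda.components_):
    top = [terms[i] for i in topic.argsort()[-4:][::-1]]
    print(f"topic {k}: {', '.join(top)}")
```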
Statistical mechanics of influence maximization with thermal noise
NASA Astrophysics Data System (ADS)
Lynn, Christopher W.; Lee, Daniel D.
2017-03-01
The problem of optimally distributing a budget of influence among individuals in a social network, known as influence maximization, has typically been studied in the context of contagion models and deterministic processes, which fail to capture stochastic interactions inherent in real-world settings. Here, we show that by introducing thermal noise into influence models, the dynamics exactly resemble spins in a heterogeneous Ising system. In this way, influence maximization in the presence of thermal noise has a natural physical interpretation as maximizing the magnetization of an Ising system given a budget of external magnetic field. Using this statistical mechanical formulation, we demonstrate analytically that for small external-field budgets, the optimal influence solutions exhibit a highly non-trivial temperature dependence, focusing on high-degree hub nodes at high temperatures and on easily influenced peripheral nodes at low temperatures. For the general problem, we present a projected gradient ascent algorithm that uses the magnetic susceptibility to calculate locally optimal external-field distributions. We apply our algorithm to synthetic and real-world networks, demonstrating that our analytic results generalize qualitatively. Our work establishes a fruitful connection with statistical mechanics and demonstrates that influence maximization depends crucially on the temperature of the system, a fact that has not been appreciated by existing research.
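A sketch of the idea in a mean-field setting, under assumptions of our own (a small ferromagnetic hub network, a naive fixed-point solver, and a standard simplex projection); it is not the paper's exact algorithm, but it follows the same logic of using the susceptibility as the gradient for projected gradient ascent on the field budget.

```python
# Mean-field toy version: maximize total magnetization over external fields
# h >= 0 with sum(h) = budget, using the mean-field susceptibility as gradient.
import numpy as np

def mean_field_m(J, h, beta, iters=500):
    """Fixed-point iteration for m_i = tanh(beta * (sum_j J_ij m_j + h_i))."""
    m = np.zeros(len(h))
    for _ in range(iters):
        m = np.tanh(beta * (J @ m + h))
    return m

def magnetization_gradient(J, m, beta):
    """d(sum_i m_i)/dh from chi = beta*(I - beta*D*J)^(-1)*D, D = diag(1 - m^2)."""
    D = np.diag(1.0 - m**2)
    chi = beta * np.linalg.solve(np.eye(len(m)) - beta * D @ J, D)
    return chi.sum(axis=0)

def project_to_simplex(v, budget):
    """Euclidean projection onto {x >= 0, sum x = budget} (Duchi et al., 2008)."""
    u = np.sort(v)[::-1]
    css = np.cumsum(u)
    rho = np.nonzero(u + (budget - css) / np.arange(1, len(v) + 1) > 0)[0][-1]
    theta = (css[rho] - budget) / (rho + 1)
    return np.maximum(v - theta, 0.0)

def optimal_fields(J, beta, budget, steps=200, lr=0.5):
    h = np.full(J.shape[0], budget / J.shape[0])    # start from a uniform allocation
    for _ in range(steps):
        m = mean_field_m(J, h, beta)
        h = project_to_simplex(h + lr * magnetization_gradient(J, m, beta), budget)
    return h

# Example: a hub connected to four peripheral spins.
J = np.zeros((5, 5)); J[0, 1:] = J[1:, 0] = 1.0
for beta in (0.2, 2.0):                             # high vs low temperature
    h = optimal_fields(J, beta, budget=1.0)
    print(f"beta={beta}: field on hub {h[0]:.2f}, on periphery {h[1:].round(2)}")
```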
I-39 dynamic message sign project summary
DOT National Transportation Integrated Search
2005-01-01
The Illinois Department of Transportation (IDOT) sought to deploy a message sign system consisting of permanently-mounted dynamic message signs around the Rockford metropolitan area. The project goal was to begin building a system of remotely-activat...
Experimental Determination of Dynamical Lee-Yang Zeros
NASA Astrophysics Data System (ADS)
Brandner, Kay; Maisi, Ville F.; Pekola, Jukka P.; Garrahan, Juan P.; Flindt, Christian
2017-05-01
Statistical physics provides the concepts and methods to explain the phase behavior of interacting many-body systems. Investigations of Lee-Yang zeros—complex singularities of the free energy in systems of finite size—have led to a unified understanding of equilibrium phase transitions. The ideas of Lee and Yang, however, are not restricted to equilibrium phenomena. Recently, Lee-Yang zeros have been used to characterize nonequilibrium processes such as dynamical phase transitions in quantum systems after a quench or dynamic order-disorder transitions in glasses. Here, we experimentally realize a scheme for determining Lee-Yang zeros in such nonequilibrium settings. We extract the dynamical Lee-Yang zeros of a stochastic process involving Andreev tunneling between a normal-state island and two superconducting leads from measurements of the dynamical activity along a trajectory. From the short-time behavior of the Lee-Yang zeros, we predict the large-deviation statistics of the activity which is typically difficult to measure. Our method paves the way for further experiments on the statistical mechanics of many-body systems out of equilibrium.
Radiation from quantum weakly dynamical horizons in loop quantum gravity.
Pranzetti, Daniele
2012-07-06
We provide a statistical mechanical analysis of quantum horizons near equilibrium in the grand canonical ensemble. By matching the description of the nonequilibrium phase in terms of weakly dynamical horizons with a local statistical framework, we implement loop quantum gravity dynamics near the boundary. The resulting radiation process provides a quantum gravity description of the horizon evaporation. For large black holes, the spectrum we derive presents a discrete structure which could be potentially observable.
Application of Tube Dynamics to Non-Statistical Reaction Processes
NASA Astrophysics Data System (ADS)
Gabern, F.; Koon, W. S.; Marsden, J. E.; Ross, S. D.; Yanao, T.
2006-06-01
A technique based on dynamical systems theory is introduced for the computation of lifetime distributions and rates of chemical reactions and scattering phenomena, even in systems that exhibit non-statistical behavior. In particular, we merge invariant manifold tube dynamics with Monte Carlo volume determination for accurate rate calculations. This methodology is applied to a three-degree-of-freedom model problem and some ideas on how it might be extended to higher-degree-of-freedom systems are presented.
Statistical Reference Datasets
National Institute of Standards and Technology Data Gateway
Statistical Reference Datasets (Web, free access) The Statistical Reference Datasets is also supported by the Standard Reference Data Program. The purpose of this project is to improve the accuracy of statistical software by providing reference datasets with certified computational results that enable the objective evaluation of statistical software.
Dynamic Projection Mapping onto Deforming Non-Rigid Surface Using Deformable Dot Cluster Marker.
Narita, Gaku; Watanabe, Yoshihiro; Ishikawa, Masatoshi
2017-03-01
Dynamic projection mapping for moving objects has attracted much attention in recent years. However, conventional approaches have faced some issues, such as the target objects being limited to rigid objects, and the limited moving speed of the targets. In this paper, we focus on dynamic projection mapping onto rapidly deforming non-rigid surfaces with a speed sufficiently high that a human does not perceive any misalignment between the target object and the projected images. In order to achieve such projection mapping, we need a high-speed technique for tracking non-rigid surfaces, which is still a challenging problem in the field of computer vision. We propose the Deformable Dot Cluster Marker (DDCM), a novel fiducial marker for high-speed tracking of non-rigid surfaces using a high-frame-rate camera. The DDCM has three performance advantages. First, it can be detected even when it is strongly deformed. Second, it realizes robust tracking even in the presence of external and self occlusions. Third, it allows millisecond-order computational speed. Using DDCM and a high-speed projector, we realized dynamic projection mapping onto a deformed sheet of paper and a T-shirt with a speed sufficiently high that the projected images appeared to be printed on the objects.
NASA Astrophysics Data System (ADS)
Rosendahl, D. H.; Ćwik, P.; Martin, E. R.; Basara, J. B.; Brooks, H. E.; Furtado, J. C.; Homeyer, C. R.; Lazrus, H.; Mcpherson, R. A.; Mullens, E.; Richman, M. B.; Robinson-Cook, A.
2017-12-01
Extreme precipitation events cause significant damage to homes, businesses, infrastructure, and agriculture, as well as many injuries and fatalities as a result of fast-moving water or waterborne diseases. In the USA, these natural hazard events claimed the lives of more than 300 people during 2015-2016 alone, with total damage reaching $24.4 billion. Prior studies of extreme precipitation events have focused on the sub-daily to sub-weekly timeframes. However, many decisions for planning, preparing and resilience-building require sub-seasonal to seasonal timeframes (S2S; 14 to 90 days), but adequate forecasting tools for prediction do not exist. Therefore, the goal of this newly funded project is an enhanced understanding of the large-scale forcing and dynamics of S2S extreme precipitation events in the United States, and improved capability for modeling and predicting such events. Here, we describe the project goals, objectives, and research activities that will take place over the next 5 years. In this project, a unique team of scientists and stakeholders will identify and understand weather and climate processes connected with the prediction of S2S extreme precipitation events by answering these research questions: 1) What are the synoptic patterns associated with, and characteristic of, S2S extreme precipitation events in the contiguous U.S.? 2) What role, if any, do large-scale modes of climate variability play in modulating these events? 3) How predictable are S2S extreme precipitation events across temporal scales? 4) How do we create an informative prediction of S2S extreme precipitation events for policymaking and planning? This project will use observational data, high-resolution radar composites, dynamical climate models, and workshops that engage stakeholders (water resource managers, emergency managers and tribal environmental professionals) in co-production of knowledge. The overarching result of this project will be predictive models that reduce the societal and economic impacts of extreme precipitation events. Another outcome will be statistical and co-production frameworks, which could be applied across other meteorological extremes, all time scales, and in other parts of the world to increase resilience to extreme meteorological events.
Ecological Research Division Theoretical Ecology Program. [Contains abstracts
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1990-10-01
This report presents the goals of the Theoretical Ecology Program and abstracts of research in progress. Abstracts cover both theoretical research that began as part of the terrestrial ecology core program and new projects funded by the theoretical program begun in 1988. Projects have been clustered into four major categories: Ecosystem dynamics; landscape/scaling dynamics; population dynamics; and experiment/sample design.
Exploring Foundation Concepts in Introductory Statistics Using Dynamic Data Points
ERIC Educational Resources Information Center
Ekol, George
2015-01-01
This paper analyses introductory statistics students' verbal and gestural expressions as they interacted with a dynamic sketch (DS) designed using "Sketchpad" software. The DS involved numeric data points built on the number line whose values changed as the points were dragged along the number line. The study is framed on aggregate…
Waiting time distribution revealing the internal spin dynamics in a double quantum dot
NASA Astrophysics Data System (ADS)
Ptaszyński, Krzysztof
2017-07-01
Waiting time distribution and the zero-frequency full counting statistics of unidirectional electron transport through a double quantum dot molecule attached to spin-polarized leads are analyzed using the quantum master equation. The waiting time distribution exhibits a nontrivial dependence on the value of the exchange coupling between the dots and the gradient of the applied magnetic field, which reveals the oscillations between the spin states of the molecule. The zero-frequency full counting statistics, on the other hand, is independent of the aforementioned quantities, thus giving no insight into the internal dynamics. The fact that the waiting time distribution and the zero-frequency full counting statistics provide nonequivalent information is associated with two factors. Firstly, the two quantities are sensitive to different timescales of the system's dynamics. Secondly, it is associated with the presence of correlations between subsequent waiting times, which make the renewal theory, relating the full counting statistics and the waiting time distribution, no longer applicable. The study highlights the particular usefulness of the waiting time distribution for the analysis of the internal dynamics of mesoscopic systems.
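As a hedged, purely classical analogue of the waiting-time calculation (a generic three-state rate equation, not the double-dot quantum master equation of the paper), one can split the generator into jump and no-jump parts and evaluate w(tau) = 1^T J exp(L0 tau) J p_ss / (1^T J p_ss):

```python
# Classical rate-equation analogue of a waiting-time distribution between
# detected jumps; the rates below are arbitrary illustrative numbers.
import numpy as np
from scipy.linalg import expm, null_space

# Full generator of a 3-state Markov jump process (columns sum to zero).
L = np.array([[-1.0,  0.5,  0.2],
              [ 0.7, -0.9,  0.3],
              [ 0.3,  0.4, -0.5]])

J = np.zeros_like(L)
J[2, 1] = L[2, 1]          # count only the 2 -> 3 transitions
L0 = L - J                 # evolution between detected jumps (probability leaks out)

p_ss = null_space(L)[:, 0]
p_ss /= p_ss.sum()         # steady state: L @ p_ss = 0, normalised

one = np.ones(3)
den = one @ J @ p_ss

def wtd(tau):
    return one @ J @ expm(L0 * tau) @ J @ p_ss / den

print("w(tau) at tau = 0.5, 2, 5:", [round(float(wtd(t)), 4) for t in (0.5, 2.0, 5.0)])
# Analytic normalisation check: the integral of w over [0, inf) equals 1.
norm = one @ J @ (-np.linalg.inv(L0)) @ J @ p_ss / den
print("normalisation:", round(float(norm), 6))
```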
Atmospheric Rivers in the Mid-latitudes: A Modeling Study for Current and Future Climates
NASA Astrophysics Data System (ADS)
Shields, C. A.; Kiehl, J. T.
2015-12-01
Atmospheric rivers (ARs) are dynamically-driven, narrow, intense bands of moisture that transport significant amounts of moisture from the tropics to the mid-latitudes and are thus an important aspect of the Earth's hydrological cycle. They are often associated with extratropical cyclones whose low-level circulation is able to tap into tropical moisture and transport it northward. The "Pineapple Express" is an example of an AR that impacts the west coast of California predominately in the winter months and can produce heavy amounts of precipitation in a short period of time (hours up to several days). This work will focus on three mid-latitude AR regions, including the west coast of California, the Pacific Northwest, and the United Kingdom, as modeled by a suite of high-resolution CESM (Community Earth System Model) simulations for 20th century and RCP8.5 future climate scenarios. The CESM version employed utilizes half-degree resolution atmosphere/land components (~0.5o) coupled to the standard (1o) ocean/ice components. We use the high-resolution atmosphere because it is able to more accurately represent extreme, regional precipitation. CESM realistically captures ARs, as spatial and temporal statistics show. Projections of future climate statistics for all three regions, as well as analysis of the dynamical and thermodynamical mechanisms driving ARs, such as vorticity, jets and the steering flow, and water vapor transport, will be presented. Finally, teleconnections to climate variability processes, such as ENSO, will be explored.
Littell, Jeremy S.; Mauger, Guillaume S.; Salathe, Eric P.; Hamlet, Alan F.; Lee, Se-Yeun; Stumbaugh, Matt R.; Elsner, Marketa; Norheim, Robert; Lutz, Eric R.; Mantua, Nathan J.
2014-01-01
The purpose of this project was to (1) provide an internally-consistent set of downscaled projections across the Western U.S., (2) include information about projection uncertainty, and (3) assess projected changes of hydrologic extremes. These objectives were designed to address decision support needs for climate adaptation and resource management actions. Specifically, understanding of uncertainty in climate projections – in particular for extreme events – is currently a key scientific and management barrier to adaptation planning and vulnerability assessment. The new dataset fills in the Northwest domain to cover a key gap in the previous dataset, adds additional projections (both from other global climate models and a comparison with dynamical downscaling) and includes an assessment of changes to flow and soil moisture extremes. This new information can be used to assess variations in impacts across the landscape, uncertainty in projections, and how these differ as a function of region, variable, and time period. In this project, existing University of Washington Climate Impacts Group (UW CIG) products were extended to develop a comprehensive data archive that accounts (in a rigorous and physically based way) for climate model uncertainty in future climate and hydrologic scenarios. These products can be used to determine likely impacts on vegetation and aquatic habitat in the Pacific Northwest (PNW) region, including WA, OR, ID, northwest MT to the continental divide, northern CA, NV, UT, and the Columbia Basin portion of western WY. New data series and summaries produced for this project include: 1) extreme statistics for surface hydrology (e.g. frequency of soil moisture and summer water deficit) and streamflow (e.g. the 100-year flood, extreme 7-day low flows with a 10-year recurrence interval); 2) snowpack vulnerability as indicated by the ratio of April 1 snow water to cool-season precipitation; and 3) uncertainty analyses for multiple climate scenarios.
Project implementation plan : variable dynamic testbed vehicle
DOT National Transportation Integrated Search
1997-02-01
This document is the project implementation plan for the Variable Dynamic Testbed Vehicle (VDTV) program, sponsored by the Jet Propulsion Laboratory for the Office of Crash Avoidance Research (OCAR) programs in support of Thrust One of the National H...
An Improved Incremental Learning Approach for KPI Prognosis of Dynamic Fuel Cell System.
Yin, Shen; Xie, Xiaochen; Lam, James; Cheung, Kie Chung; Gao, Huijun
2016-12-01
The key performance indicator (KPI) has important practical value with respect to product quality and economic benefits in modern industry. To cope with the KPI prognosis issue under nonlinear conditions, this paper presents an improved incremental learning approach based on available process measurements. The proposed approach takes advantage of the algorithmic overlap of locally weighted projection regression (LWPR) and partial least squares (PLS), implementing the PLS-based prognosis in each locally linear model produced by the incremental learning process of LWPR. The global prognosis results, including KPI prediction and process monitoring, are obtained from the corresponding normalized weighted means of all the local models. The statistical indicators for prognosis are enhanced as well by the design of novel KPI-related and KPI-unrelated statistics with suitable control limits for non-Gaussian data. For application-oriented purposes, process measurements from real datasets of a proton exchange membrane fuel cell system are employed to demonstrate the effectiveness of KPI prognosis. The proposed approach is finally extended to long-term voltage prediction for potential reference in further fuel cell applications.
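A plain (non-incremental) PLS sketch of the KPI-prediction step with a simple prediction-residual control limit; the locally weighted LWPR machinery and the paper's KPI-related/unrelated statistics are deliberately omitted, and all data below are synthetic.

```python
# Plain PLS KPI prediction plus a residual-based alarm threshold (illustrative).
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(4)
X_train = rng.normal(size=(500, 8))                       # process measurements
true_w = np.array([0.8, -0.5, 0.3])
kpi_train = X_train[:, :3] @ true_w + rng.normal(0, 0.1, 500)

pls = PLSRegression(n_components=3).fit(X_train, kpi_train)

# Control limit from training prediction residuals (95th percentile of squared error).
resid = kpi_train - pls.predict(X_train).ravel()
limit = np.percentile(resid**2, 95)

X_new = rng.normal(size=(5, 8))
kpi_new_true = X_new[:, :3] @ true_w
kpi_pred = pls.predict(X_new).ravel()
alarms = (kpi_new_true - kpi_pred) ** 2 > limit
print("predicted KPI:", kpi_pred.round(2))
print("alarm flags:  ", alarms)
```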
FIT: statistical modeling tool for transcriptome dynamics under fluctuating field conditions
Iwayama, Koji; Aisaka, Yuri; Kutsuna, Natsumaro
2017-01-01
Motivation: Considerable attention has been given to the quantification of environmental effects on organisms. In natural conditions, environmental factors are continuously changing in a complex manner. To reveal the effects of such environmental variations on organisms, transcriptome data in field environments have been collected and analyzed. Nagano et al. proposed a model that describes the relationship between transcriptomic variation and environmental conditions and demonstrated the capability to predict transcriptome variation in rice plants. However, the computational cost of parameter optimization has prevented its wide application. Results: We propose a new statistical model and efficient parameter optimization based on the previous study. We developed and released FIT, an R package that offers functions for parameter optimization and transcriptome prediction. The proposed method achieves comparable or better prediction performance within a shorter computational time than the previous method. The package will facilitate the study of the environmental effects on transcriptomic variation in field conditions. Availability and Implementation: Freely available from CRAN (https://cran.r-project.org/web/packages/FIT/). Contact: anagano@agr.ryukoku.ac.jp Supplementary information: Supplementary data are available at Bioinformatics online. PMID:28158396
Effects of the magnetic field direction on the Tsallis statistic
NASA Astrophysics Data System (ADS)
González-Casanova, Diego F.; Lazarian, A.; Cho, J.
2018-04-01
We extend the use of the Tsallis statistic to measure the differences in gas dynamics relative to the mean magnetic field arising from the natural eddy-type motions present in magnetohydrodynamical (MHD) turbulence. The variation in gas dynamics was estimated using the Tsallis parameters on the incremental probability distribution function of the observables (intensity and velocity centroid) obtained from compressible MHD simulations. We find that the Tsallis statistic is susceptible to the anisotropy produced by the magnetic field; even when anisotropy is present, the Tsallis statistic can be used to determine MHD parameters such as the sonic Mach number. We quantify the goodness of the Tsallis parameters using the coefficient of determination to measure the differences in the gas dynamics. These parameters also determine the level of magnetization and compressibility of the medium. To further simulate realistic spectroscopic observational data, we introduced smoothing, noise, and cloud boundaries to the MHD simulations.
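One building block of such an analysis, sketched under synthetic-data assumptions: fitting a Tsallis q-Gaussian to the increment PDF of an observable in order to estimate q (Student-t data with 4 degrees of freedom correspond to q = 7/5).

```python
# Fit a Tsallis q-Gaussian to a synthetic heavy-tailed increment PDF (not the
# MHD simulation data) and recover the q parameter.
import numpy as np
from scipy.optimize import curve_fit

def q_gaussian(x, a, beta, q):
    """Unnormalised Tsallis q-Gaussian; the limit q -> 1 recovers a Gaussian."""
    return a * np.power(1.0 + (q - 1.0) * beta * x**2, -1.0 / (q - 1.0))

rng = np.random.default_rng(5)
increments = rng.standard_t(df=4, size=100_000)     # stand-in for observable increments

hist, edges = np.histogram(increments, bins=200, range=(-10, 10), density=True)
centres = 0.5 * (edges[:-1] + edges[1:])
mask = hist > 0

popt, _ = curve_fit(q_gaussian, centres[mask], hist[mask],
                    p0=(0.4, 0.5, 1.3),
                    bounds=([0.0, 1e-6, 1.001], [np.inf, np.inf, 3.0]))
print(f"fitted q: {popt[2]:.2f} (roughly 1.4 expected for this synthetic sample)")
```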
NASA Astrophysics Data System (ADS)
Kuleshov, Yuriy; Jones, David; Hendon, Harry; Charles, Andrew; Shelton, Kay; de Wit, Roald; Cottrill, Andrew; Nakaegawa, Toshiyuki; Atalifo, Terry; Prakash, Bipendra; Seuseu, Sunny; Kaniaha, Salesa
2013-04-01
Over the past few years, significant progress in developing climate science for the Pacific has been achieved through a number of research projects undertaken under the Australian government International Climate Change Adaptation Initiative (ICCAI). Climate change has a major impact on Pacific Island Countries, and advancement in understanding past, present and future climate in the region is vital for island nations to develop adaptation strategies for their rapidly changing environment. This new science is now supporting new services for a wide range of stakeholders in the Pacific through the National Meteorological Agencies of the region. Seasonal climate prediction is particularly important for planning in agriculture, tourism and other weather-sensitive industries, with operational services provided by all National Meteorological Services in the region. The interaction between climate variability and climate change, for example during droughts or very warm seasons, means that much of the early impacts of climate change are being felt through seasonal variability. A means to reduce these impacts is to improve forecasts to support decision making. Historically, seasonal climate prediction has been based on past statistical relationships. Statistical methods relate meteorological variables (e.g. temperature and rainfall) to indices which describe the large-scale environment (e.g. ENSO indices) using historical data. However, with observed climate change, statistical approaches based on historical data are becoming less accurate and less reliable. Recognising the value of seasonal forecasts, we have used outputs of a dynamical model, POAMA (Predictive Ocean Atmosphere Model for Australia), to develop web-based information tools (http://poama.bom.gov.au/experimental/pasap/index.shtml) which are now used by climate services in 15 partner countries in the Pacific for preparing seasonal climate outlooks. An initial comparison conducted during 2012 has shown that the predictive skill of POAMA is consistently higher than the skill of the statistical method. Presently, under the Pacific-Australia Climate Change Science and Adaptation Planning (PACCSAP) program, we are developing dynamical model-based seasonal climate prediction for climate extremes. Of particular concern are tropical cyclones, which are the most destructive weather systems impacting coastal areas of Australia and Pacific Island Countries. To analyse historical cyclone data, we developed a consolidated archive for the Southern Hemisphere and North-Western Pacific (http://www.bom.gov.au/cyclone/history/tracks/). Using dynamical climate models (POAMA and the Japan Meteorological Agency's model), we are working on improving the accuracy of seasonal forecasts of tropical cyclone activity for the regions of the Western Pacific. Improved seasonal climate prediction based on dynamical models will further enhance climate services in Australia and Pacific Island Countries.
Code of Federal Regulations, 2014 CFR
2014-07-01
... of the NEPA process and policies of the agencies can be obtained from: Policy and Management Planning... funded efforts; training programs, court improvement projects, research, and gathering statistical data. (2) Minor renovation projects or remodeling. (c) Actions which normally require environmental...
Code of Federal Regulations, 2013 CFR
2013-07-01
... of the NEPA process and policies of the agencies can be obtained from: Policy and Management Planning... funded efforts; training programs, court improvement projects, research, and gathering statistical data. (2) Minor renovation projects or remodeling. (c) Actions which normally require environmental...
Code of Federal Regulations, 2012 CFR
2012-07-01
... of the NEPA process and policies of the agencies can be obtained from: Policy and Management Planning... funded efforts; training programs, court improvement projects, research, and gathering statistical data. (2) Minor renovation projects or remodeling. (c) Actions which normally require environmental...
Using Microsoft Excel[R] to Calculate Descriptive Statistics and Create Graphs
ERIC Educational Resources Information Center
Carr, Nathan T.
2008-01-01
Descriptive statistics and appropriate visual representations of scores are important for all test developers, whether they are experienced testers working on large-scale projects, or novices working on small-scale local tests. Many teachers put in charge of testing projects do not know "why" they are important, however, and are utterly convinced…
Effectiveness of Project Based Learning in Statistics for Lower Secondary Schools
ERIC Educational Resources Information Center
Siswono, Tatag Yuli Eko; Hartono, Sugi; Kohar, Ahmad Wachidul
2018-01-01
Purpose: This study aimed at investigating the effectiveness of implementing Project Based Learning (PBL) on the topic of statistics at a lower secondary school in Surabaya city, Indonesia, indicated by examining student learning outcomes, student responses, and student activity. Research Methods: A quasi experimental method was conducted over two…
Project T.E.A.M. (Technical Education Advancement Modules). Advanced Statistical Process Control.
ERIC Educational Resources Information Center
Dunlap, Dale
This instructional guide, one of a series developed by the Technical Education Advancement Modules (TEAM) project, is a 20-hour advanced statistical process control (SPC) and quality improvement course designed to develop the following competencies: (1) understanding quality systems; (2) knowing the process; (3) solving quality problems; and (4)…
Representing Micro-Macro Linkages by Actor-Based Dynamic Network Models
Snijders, Tom A.B.; Steglich, Christian E.G.
2014-01-01
Stochastic actor-based models for network dynamics have the primary aim of statistical inference about processes of network change, but may be regarded as a kind of agent-based models. Similar to many other agent-based models, they are based on local rules for actor behavior. Different from many other agent-based models, by including elements of generalized linear statistical models they aim to be realistic detailed representations of network dynamics in empirical data sets. Statistical parallels to micro-macro considerations can be found in the estimation of parameters determining local actor behavior from empirical data, and the assessment of goodness of fit from the correspondence with network-level descriptives. This article studies several network-level consequences of dynamic actor-based models applied to represent cross-sectional network data. Two examples illustrate how network-level characteristics can be obtained as emergent features implied by micro-specifications of actor-based models. PMID:25960578
Singular spectrum analysis in nonlinear dynamics, with applications to paleoclimatic time series
NASA Technical Reports Server (NTRS)
Vautard, R.; Ghil, M.
1989-01-01
Two dimensions of a dynamical system given by experimental time series are distinguished. Statistical dimension gives a theoretical upper bound for the minimal number of degrees of freedom required to describe the attractor up to the accuracy of the data, taking into account sampling and noise problems. The dynamical dimension is the intrinsic dimension of the attractor and does not depend on the quality of the data. Singular Spectrum Analysis (SSA) provides estimates of the statistical dimension. SSA also describes the main physical phenomena reflected by the data. It gives adaptive spectral filters associated with the dominant oscillations of the system and clarifies the noise characteristics of the data. SSA is applied to four paleoclimatic records. The principal climatic oscillations and the regime changes in their amplitude are detected. About 10 degrees of freedom are statistically significant in the data. Large noise and insufficient sample length do not allow reliable estimates of the dynamical dimension.
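A minimal SSA sketch on a synthetic series (not the paleoclimatic records analysed above): embed the series in a trajectory matrix, take its SVD, and reconstruct the leading oscillatory pair by diagonal averaging.

```python
# Minimal singular spectrum analysis: trajectory matrix -> SVD -> reconstruction.
import numpy as np

def ssa(x, window):
    k = len(x) - window + 1
    traj = np.column_stack([x[i:i + window] for i in range(k)])   # trajectory matrix
    return np.linalg.svd(traj, full_matrices=False)

def reconstruct(u, s, vt, components):
    """Diagonal averaging (Hankelisation) of the selected elementary matrices."""
    window, k = u.shape[0], vt.shape[1]
    elem = sum(s[c] * np.outer(u[:, c], vt[c]) for c in components)
    rec = np.zeros(window + k - 1)
    counts = np.zeros(window + k - 1)
    for i in range(window):
        for j in range(k):
            rec[i + j] += elem[i, j]
            counts[i + j] += 1
    return rec / counts

t = np.arange(500)
x = np.sin(2 * np.pi * t / 40) + 0.5 * np.random.default_rng(6).normal(size=500)
u, s, vt = ssa(x, window=60)
print("leading singular values:", s[:5].round(1))     # an oscillation appears as a pair
osc = reconstruct(u, s, vt, components=[0, 1])
print("variance captured by the leading pair:", round(1 - np.var(x - osc) / np.var(x), 2))
```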
Ea, Vuthy; Sexton, Tom; Gostan, Thierry; Herviou, Laurie; Baudement, Marie-Odile; Zhang, Yunzhe; Berlivet, Soizik; Le Lay-Taha, Marie-Noëlle; Cathala, Guy; Lesne, Annick; Victor, Jean-Marc; Fan, Yuhong; Cavalli, Giacomo; Forné, Thierry
2015-08-15
In higher eukaryotes, the genome is partitioned into large "Topologically Associating Domains" (TADs) in which the chromatin displays favoured long-range contacts. While a crumpled/fractal globule organization has received experimental support at higher-order levels, the organization principles that govern chromatin dynamics within these TADs remain unclear. Using simple polymer models, we previously showed that, in mouse liver cells, gene-rich domains tend to adopt a statistical helix shape when no significant locus-specific interaction takes place. Here, we use data from diverse 3C-derived methods to explore chromatin dynamics within mouse and Drosophila TADs. In mouse Embryonic Stem Cells (mESC), which possess large TADs (median size of 840 kb), we show that the statistical helix model, but not globule models, is relevant not only in gene-rich TADs, but also in gene-poor and gene-desert TADs. Interestingly, this statistical helix organization is considerably relaxed in mESC compared to liver cells, indicating that the impact of the constraints responsible for this organization is weaker in pluripotent cells. Finally, depletion of histone H1 in mESC alters local chromatin flexibility but not the statistical helix organization. In Drosophila, which possesses TADs of smaller sizes (median size of 70 kb), we show that, while chromatin compaction and flexibility are finely tuned according to the epigenetic landscape, chromatin dynamics within TADs is generally compatible with an unconstrained polymer configuration. Models from polymer physics can accurately describe the organization principles governing chromatin dynamics in both mouse and Drosophila TADs. However, the constraints applied on this dynamics within mammalian TADs have a peculiar impact, resulting in a statistical helix organization.
Lu, Zhao; Sun, Jing; Butts, Kenneth
2016-02-03
A giant leap has been made in the past couple of decades with the introduction of kernel-based learning as a mainstay for designing effective nonlinear computational learning algorithms. In view of the geometric interpretation of conditional expectation and the ubiquity of multiscale characteristics in highly complex nonlinear dynamic systems [1]-[3], this paper presents a new orthogonal projection operator wavelet kernel, aiming at developing an efficient computational learning approach for nonlinear dynamical system identification. In the framework of multiresolution analysis, the proposed projection operator wavelet kernel can fulfill the multiscale, multidimensional learning to estimate complex dependencies. The special advantage of the projection operator wavelet kernel developed in this paper lies in the fact that it has a closed-form expression, which greatly facilitates its application in kernel learning. To the best of our knowledge, it is the first closed-form orthogonal projection wavelet kernel reported in the literature. It provides a link between grid-based wavelets and mesh-free kernel-based methods. Simulation studies for identifying the parallel models of two benchmark nonlinear dynamical systems confirm its superiority in model accuracy and sparsity.
Quick-look guide to the crustal dynamics project's data information system
NASA Technical Reports Server (NTRS)
Noll, Carey E.; Behnke, Jeanne M.; Linder, Henry G.
1987-01-01
Described are the contents of the Crustal Dynamics Project Data Information System (DIS) and instructions on the use of this facility. The main purpose of the DIS is to store all geodetic data products acquired by the Project in a central data bank and to maintain information about the archive of all Project-related data. Access and use of the DIS menu-driven system is described as well as procedures for contacting DIS staff and submitting data requests.
Comparing estimates of climate change impacts from process-based and statistical crop models
NASA Astrophysics Data System (ADS)
Lobell, David B.; Asseng, Senthold
2017-01-01
The potential impacts of climate change on crop productivity are of widespread interest to those concerned with addressing climate change and improving global food security. Two common approaches to assess these impacts are process-based simulation models, which attempt to represent key dynamic processes affecting crop yields, and statistical models, which estimate functional relationships between historical observations of weather and yields. Examples of both approaches are increasingly found in the scientific literature, although often published in different disciplinary journals. Here we compare published sensitivities to changes in temperature, precipitation, carbon dioxide (CO2), and ozone from each approach for the subset of crops, locations, and climate scenarios for which both have been applied. Despite a common perception that statistical models are more pessimistic, we find no systematic differences between the predicted sensitivities to warming from process-based and statistical models up to +2 °C, with limited evidence at higher levels of warming. For precipitation, there are many reasons why estimates could be expected to differ, but few estimates exist to develop robust comparisons, and precipitation changes are rarely the dominant factor for predicting impacts given the prominent role of temperature, CO2, and ozone changes. A common difference between process-based and statistical studies is that the former tend to include the effects of CO2 increases that accompany warming, whereas statistical models typically do not. Major needs moving forward include incorporating CO2 effects into statistical studies, improving both approaches’ treatment of ozone, and increasing the use of both methods within the same study. At the same time, those who fund or use crop model projections should understand that in the short-term, both approaches when done well are likely to provide similar estimates of warming impacts, with statistical models generally requiring fewer resources to produce robust estimates, especially when applied to crops beyond the major grains.
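A bare-bones sketch of the statistical-model side of such a comparison, with entirely synthetic data: regress log yields on growing-season temperature and precipitation, then apply the fitted temperature coefficient to a +2 °C scenario.

```python
# Illustrative statistical crop model (synthetic data, not a published dataset).
import numpy as np

rng = np.random.default_rng(7)
years = 40
temp = 22.0 + 0.03 * np.arange(years) + rng.normal(0, 0.8, years)    # deg C
precip = rng.normal(450.0, 80.0, years)                               # mm
log_yield = 1.5 - 0.06 * temp + 0.0008 * precip + rng.normal(0, 0.05, years)

X = np.column_stack([np.ones(years), temp, precip])
coef, *_ = np.linalg.lstsq(X, log_yield, rcond=None)

warming = 2.0                                                 # +2 C scenario
impact_pct = 100.0 * (np.exp(coef[1] * warming) - 1.0)
print(f"estimated temperature sensitivity: {coef[1]:.3f} per deg C")
print(f"projected yield change for +2 C: {impact_pct:.1f}%")
```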
Tsallis q-triplet, intermittent turbulence and Portevin-Le Chatelier effect
NASA Astrophysics Data System (ADS)
Iliopoulos, A. C.; Aifantis, E. C.
2018-05-01
In this paper, we extend a previous study concerning the Portevin-Le Chatelier (PLC) effect and Tsallis statistics (Iliopoulos et al., 2015). In particular, we estimate Tsallis' q-triplet, namely {qstat, qsens, qrel}, for two sets of stress serration time series concerning the deformation of a Cu-15%Al alloy corresponding to different deformation temperatures and thus types (A and B) of PLC bands. The results concerning the stress serrations analysis reveal that the Tsallis q-triplet attains values different from unity ({qstat, qsens, qrel} ≠ {1,1,1}). In particular, PLC type A bands' serrations were found to follow Tsallis super-q-Gaussian, non-extensive, sub-additive, multifractal statistics, indicating that the underlying dynamics are at the edge of chaos, characterized by global long-range correlations and power-law scaling. For PLC type B bands' serrations, the results revealed a Tsallis sub-q-Gaussian, non-extensive, super-additive, multifractal statistical profile. In addition, our results also reveal significant differences in statistical and dynamical features, indicating important variations of the stress field dynamics in terms of rate of entropy production, relaxation dynamics and non-equilibrium meta-stable stationary states. We also estimate parameters commonly used for characterizing fully developed turbulence, such as structure functions and the flatness coefficient (F), in order to provide further information about the dynamics underlying jerky flow. Finally, we use two multifractal models developed to describe turbulence, namely the Arimitsu and Arimitsu (A&A) [2000, 2001] theoretical model, which is based on Tsallis statistics, and the p-model, to estimate theoretical multifractal spectra f(a). Furthermore, we estimate the flatness coefficient (F) using a theoretical formula based on Tsallis statistics. The theoretical results are compared with the experimental ones, showing a remarkable agreement between modeling and experiment. Finally, the results of this study verify, as well as extend, previous studies which stated that the dynamics underlying type B and type A PLC bands are connected with distinct dynamical behavior, namely chaotic behavior for the former and self-organized critical (SOC) behavior for the latter, while they shed new light on the turbulent character of the PLC jerky flow.
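The qstat member of the q-triplet is typically obtained by fitting a q-Gaussian to the PDF of the (standardised) serration increments. The sketch below illustrates that step on a synthetic heavy-tailed series; the functional form is the standard Tsallis q-Gaussian, but the data, bins and fitting bounds are assumptions, and qsens and qrel would require the multifractal and relaxation analyses not shown here.

```python
import numpy as np
from scipy.optimize import curve_fit

def q_gaussian(x, q, beta, A):
    """Tsallis q-Gaussian: p(x) = A * [1 - (1 - q) * beta * x**2]_+^(1/(1-q))."""
    base = 1.0 - (1.0 - q) * beta * x**2
    return A * np.power(np.clip(base, 1e-12, None), 1.0 / (1.0 - q))

# Surrogate "stress serration increments": heavy-tailed, standardised
rng = np.random.default_rng(2)
series = rng.standard_t(df=4, size=20000)
dx = np.diff(series)
dx = (dx - dx.mean()) / dx.std()

# Empirical PDF and q-Gaussian fit -> qstat estimate
hist, edges = np.histogram(dx, bins=101, range=(-8, 8), density=True)
centres = 0.5 * (edges[:-1] + edges[1:])
popt, _ = curve_fit(q_gaussian, centres, hist, p0=(1.3, 1.0, 0.4),
                    bounds=([1.01, 1e-3, 1e-3], [2.9, 10.0, 5.0]))
print(f"fitted qstat ~ {popt[0]:.2f}  (qstat = 1 recovers a Gaussian)")
```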
28 CFR 22.25 - Final disposition of identifiable materials.
Code of Federal Regulations, 2011 CFR
2011-07-01
... RESEARCH AND STATISTICAL INFORMATION § 22.25 Final disposition of identifiable materials. Upon completion of a research or statistical project the security of identifiable research or statistical information...
28 CFR 22.25 - Final disposition of identifiable materials.
Code of Federal Regulations, 2010 CFR
2010-07-01
... RESEARCH AND STATISTICAL INFORMATION § 22.25 Final disposition of identifiable materials. Upon completion of a research or statistical project the security of identifiable research or statistical information...
NASA Astrophysics Data System (ADS)
Cailleret, Maxime; Snell, Rebecca; von Waldow, Harald; Kotlarski, Sven; Bugmann, Harald
2015-04-01
Different levels of uncertainty should be considered in climate impact projections by Dynamic Vegetation Models (DVMs), particularly when it comes to managing climate risks. Such information is useful to detect the key processes and uncertainties in the climate model - impact model chain and may be used to support recommendations for future improvements in the simulation of both climate and biological systems. In addition, determining which uncertainty source is dominant is an important aspect to recognize the limitations of climate impact projections by a multi-model ensemble mean approach. However, to date, few studies have clarified how each uncertainty source (baseline climate data, greenhouse gas emission scenario, climate model, and DVM) affects the projection of ecosystem properties. Focusing on one greenhouse gas emission scenario, we assessed the uncertainty in the projections of a forest landscape model (LANDCLIM) and a stand-scale forest gap model (FORCLIM) that is caused by linking climate data with an impact model. LANDCLIM was used to assess the uncertainty in future landscape properties of the Visp valley in Switzerland that is due to (i) the use of different 'baseline' climate data (gridded data vs. data from weather stations), and (ii) differences in climate projections among 10 GCM-RCM chains. This latter point was also considered for the projections of future forest properties by FORCLIM at several sites along an environmental gradient in Switzerland (14 GCM-RCM chains), for which we also quantified the uncertainty caused by (iii) the model chain specific statistical properties of the climate time-series, and (iv) the stochasticity of the demographic processes included in the model, e.g., the annual number of saplings that establish, or tree mortality. Using methods of variance decomposition analysis, we found that (i) The use of different baseline climate data strongly impacts the prediction of forest properties at the lowest and highest, but not so much at medium elevations. (ii) Considering climate change, the variability that is due to the GCM-RCM chains is much greater than the variability induced by the uncertainty in the initial climatic conditions. (iii) The uncertainties caused by the intrinsic stochasticity in the DVMs and by the random generation of the climate time-series are negligible. Overall, our results indicate that DVMs are quite sensitive to the climate data, highlighting particularly (1) the limitations of using one single multi-model average climate change scenario in climate impact studies and (2) the need to better consider the uncertainty in climate model outputs for projecting future vegetation changes.
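A toy version of the variance-decomposition step, assuming a hypothetical array of impact-model outputs indexed by baseline dataset, GCM-RCM chain and stochastic replicate (not the LANDCLIM/FORCLIM outputs themselves):

```python
import numpy as np

# Hypothetical forest-property projections indexed by
# (baseline climate dataset, GCM-RCM chain, stochastic replicate)
rng = np.random.default_rng(3)
n_base, n_chain, n_rep = 2, 10, 5
base_eff  = rng.normal(0, 2.0, n_base)           # baseline-data effect
chain_eff = rng.normal(0, 6.0, n_chain)          # climate-model-chain effect
sims = (base_eff[:, None, None] + chain_eff[None, :, None]
        + rng.normal(0, 1.0, (n_base, n_chain, n_rep)))   # + stochastic noise

# Simple main-effects variance decomposition
total_var = sims.var()
var_base  = sims.mean(axis=(1, 2)).var()   # variance of baseline main effects
var_chain = sims.mean(axis=(0, 2)).var()   # variance of GCM-RCM main effects
var_resid = total_var - var_base - var_chain

for name, v in [("baseline data", var_base), ("GCM-RCM chain", var_chain),
                ("residual / stochastic", var_resid)]:
    print(f"{name:>22s}: {100 * v / total_var:5.1f}% of total variance")
```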
Teaching the principles of statistical dynamics
Ghosh, Kingshuk; Dill, Ken A.; Inamdar, Mandar M.; Seitaridou, Effrosyni; Phillips, Rob
2012-01-01
We describe a simple framework for teaching the principles that underlie the dynamical laws of transport: Fick’s law of diffusion, Fourier’s law of heat flow, the Newtonian viscosity law, and the mass-action laws of chemical kinetics. In analogy with the way that the maximization of entropy over microstates leads to the Boltzmann distribution and predictions about equilibria, maximizing a quantity that E. T. Jaynes called “caliber” over all the possible microtrajectories leads to these dynamical laws. The principle of maximum caliber also leads to dynamical distribution functions that characterize the relative probabilities of different microtrajectories. A great source of recent interest in statistical dynamics has resulted from a new generation of single-particle and single-molecule experiments that make it possible to observe dynamics one trajectory at a time. PMID:23585693
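A small worked example of the maximum-caliber recipe described above, for a two-state system: path entropy is maximised subject to a constrained mean number of jumps, which yields trajectory weights exponential in that dynamical observable. The trajectory length, target jump number and use of scipy's root finder are illustrative choices, not taken from the article.

```python
import numpy as np
from itertools import product
from scipy.optimize import brentq

# All trajectories of a two-state system over T steps
T = 8
trajs = np.array(list(product([0, 1], repeat=T)))
n_jumps = np.abs(np.diff(trajs, axis=1)).sum(axis=1)   # dynamical observable

def mean_jumps(lam):
    """Average jumps under the max-caliber distribution P ~ exp(-lam * jumps)."""
    w = np.exp(-lam * n_jumps)
    return (w * n_jumps).sum() / w.sum()

target = 1.5                                   # constrained average jump number
lam = brentq(lambda l: mean_jumps(l) - target, -10, 10)
P = np.exp(-lam * n_jumps); P /= P.sum()

print(f"Lagrange multiplier lambda = {lam:.3f}")
print(f"check: <jumps> = {(P * n_jumps).sum():.3f} (target {target})")
# The resulting trajectory weights describe Markovian two-state kinetics, the same
# logic that leads to transport laws such as Fick's and Fourier's in the continuum limit.
```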
2 kWe Solar Dynamic Ground Test Demonstration Project. Volume 3; Fabrication and Test Report
NASA Technical Reports Server (NTRS)
Alexander, Dennis
1997-01-01
The Solar Dynamic Ground Test Demonstration (SDGTD) project has successfully designed and fabricated a complete solar-powered closed Brayton electrical power generation system and tested it in a relevant thermal vacuum facility at NASA Lewis Research Center (LeRC). In addition to completing technical objectives, the project was completed 3-l/2 months early, and under budget.
The applications of Complexity Theory and Tsallis Non-extensive Statistics at Solar Plasma Dynamics
NASA Astrophysics Data System (ADS)
Pavlos, George
2015-04-01
As the solar plasma lives far from equilibrium it is an excellent laboratory for testing complexity theory and non-equilibrium statistical mechanics. In this study, we present the highlights of complexity theory and Tsallis non extensive statistical mechanics as concerns their applications at solar plasma dynamics, especially at sunspot, solar flare and solar wind phenomena. Generally, when a physical system is driven far from equilibrium states some novel characteristics can be observed related to the nonlinear character of dynamics. Generally, the nonlinearity in space plasma dynamics can generate intermittent turbulence with the typical characteristics of the anomalous diffusion process and strange topologies of stochastic space plasma fields (velocity and magnetic fields) caused by the strange dynamics and strange kinetics (Zaslavsky, 2002). In addition, according to Zelenyi and Milovanov (2004) the complex character of the space plasma system includes the existence of non-equilibrium (quasi)-stationary states (NESS) having the topology of a percolating fractal set. The stabilization of a system near the NESS is perceived as a transition into a turbulent state determined by self-organization processes. The long-range correlation effects manifest themselves as a strange non-Gaussian behavior of kinetic processes near the NESS plasma state. The complex character of space plasma can also be described by the non-extensive statistical thermodynamics pioneered by Tsallis, which offers a consistent and effective theoretical framework, based on a generalization of Boltzmann - Gibbs (BG) entropy, to describe far from equilibrium nonlinear complex dynamics (Tsallis, 2009). In a series of recent papers, the hypothesis of Tsallis non-extensive statistics in magnetosphere, sunspot dynamics, solar flares, solar wind and space plasma in general, was tested and verified (Karakatsanis et al., 2013; Pavlos et al., 2014; 2015). Our study includes the analysis of solar plasma time series at three cases: sunspot index, solar flare and solar wind data. The non-linear analysis of the sunspot index is embedded in the non-extensive statistical theory of Tsallis (1988; 2004; 2009). The q-triplet of Tsallis, as well as the correlation dimension and the Lyapunov exponent spectrum were estimated for the SVD components of the sunspot index timeseries. Also the multifractal scaling exponent spectrum f(a), the generalized Renyi dimension spectrum D(q) and the spectrum J(p) of the structure function exponents were estimated experimentally and theoretically by using the q-entropy principle included in Tsallis non-extensive statistical theory, following Arimitsu and Arimitsu (2000, 2001). 
Our analysis showed clearly the following: (a) a phase transition process in the solar dynamics from a high dimensional non-Gaussian SOC state to a low dimensional non-Gaussian chaotic state, (b) strong intermittent solar turbulence and an anomalous (multifractal) diffusion solar process, which is strengthened as the solar dynamics makes a phase transition to low dimensional chaos, in accordance with the studies of Ruzmaikin, Zelenyi and Milovanov (Zelenyi and Milovanov, 1991; Milovanov and Zelenyi, 1993; Ruzmaikin et al., 1996), (c) faithful agreement of Tsallis non-equilibrium statistical theory with the experimental estimations of: (i) the non-Gaussian probability distribution function P(x), (ii) the multifractal scaling exponent spectrum f(a) and generalized Renyi dimension spectrum Dq, (iii) the exponent spectrum J(p) of the structure functions estimated for the sunspot index and its underlying non-equilibrium solar dynamics. Also, the q-triplet of Tsallis as well as the correlation dimension and the Lyapunov exponent spectrum were estimated for the singular value decomposition (SVD) components of the solar flares timeseries. Also the multifractal scaling exponent spectrum f(a), the generalized Renyi dimension spectrum D(q) and the spectrum J(p) of the structure function exponents were estimated experimentally and theoretically by using the q-entropy principle included in Tsallis non-extensive statistical theory, following Arimitsu and Arimitsu (2000). Our analysis showed clearly the following: (a) a phase transition process in the solar flare dynamics from a high dimensional non-Gaussian self-organized critical (SOC) state to a low dimensional, also non-Gaussian, chaotic state, (b) strong intermittent solar corona turbulence and an anomalous (multifractal) diffusion solar corona process, which is strengthened as the solar corona dynamics makes a phase transition to low dimensional chaos, (c) faithful agreement of Tsallis non-equilibrium statistical theory with the experimental estimations of the functions: (i) the non-Gaussian probability distribution function P(x), (ii) f(a) and D(q), and (iii) J(p) for the solar flares timeseries and its underlying non-equilibrium solar dynamics, and (d) the solar flare dynamical profile is revealed to be similar to the dynamical profile of the solar corona zone as far as the phase transition process from self-organized criticality (SOC) to a chaotic state is concerned. However, the solar low corona (solar flare) dynamical characteristics can be clearly discriminated from the dynamical characteristics of the solar convection zone. Finally, we present novel results revealing non-equilibrium phase transition processes in the solar wind plasma during a strong shock event, which can take place in the solar wind plasma system. The solar wind plasma, as well as the entire solar plasma system, is a typical case of stochastic spatiotemporal distribution of physical state variables such as force fields (B→, E→) and matter fields (particle and current densities or bulk plasma distributions). This study shows clearly the non-extensive and non-Gaussian character of the solar wind plasma and the existence of multi-scale strong correlations from the microscopic to the macroscopic level.
It also underlines the inefficiency of classical magneto-hydro-dynamic (MHD) or plasma statistical theories, based on the classical central limit theorem (CLT), to explain the complexity of the solar wind dynamics, since these theories include smooth and differentiable spatial-temporal functions (MHD theory) or Gaussian statistics (Boltzmann-Maxwell statistical mechanics). On the contrary, the results of this study indicate the presence of non-Gaussian non-extensive statistics with heavy tails probability distribution functions, which are related to the q-extension of CLT. Finally, the results of this study can be understood in the framework of modern theoretical concepts such as non-extensive statistical mechanics (Tsallis, 2009), fractal topology (Zelenyi and Milovanov, 2004), turbulence theory (Frisch, 1996), strange dynamics (Zaslavsky, 2002), percolation theory (Milovanov, 1997), anomalous diffusion theory and anomalous transport theory (Milovanov, 2001), fractional dynamics (Tarasov, 2013) and non-equilibrium phase transition theory (Chang, 1992). References 1. T. Arimitsu, N. Arimitsu, Tsallis statistics and fully developed turbulence, J. Phys. A: Math. Gen. 33 (2000) L235. 2. T. Arimitsu, N. Arimitsu, Analysis of turbulence by statistics based on generalized entropies, Physica A 295 (2001) 177-194. 3. T. Chang, Low-dimensional behavior and symmetry breaking of stochastic systems near criticality: can these effects be observed in space and in the laboratory?, IEEE 20 (6) (1992) 691-694. 4. U. Frisch, Turbulence, Cambridge University Press, Cambridge, UK, 1996, p. 310. 5. L.P. Karakatsanis, G.P. Pavlos, M.N. Xenakis, Tsallis non-extensive statistics, intermittent turbulence, SOC and chaos in the solar plasma. Part two: Solar flares dynamics, Physica A 392 (2013) 3920-3944. 6. A.V. Milovanov, Topological proof for the Alexander-Orbach conjecture, Phys. Rev. E 56 (3) (1997) 2437-2446. 7. A.V. Milovanov, L.M. Zelenyi, Fracton excitations as a driving mechanism for the self-organized dynamical structuring in the solar wind, Astrophys. Space Sci. 264 (1-4) (1999) 317-345. 8. A.V. Milovanov, Stochastic dynamics from the fractional Fokker-Planck-Kolmogorov equation: large-scale behavior of the turbulent transport coefficient, Phys. Rev. E 63 (2001) 047301. 9. G.P. Pavlos, et al., Universality of non-extensive Tsallis statistics and time series analysis: Theory and applications, Physica A 395 (2014) 58-95. 10. G.P. Pavlos, et al., Tsallis non-extensive statistics and solar wind plasma complexity, Physica A 422 (2015) 113-135. 11. A.A. Ruzmaikin, et al., Spectral properties of solar convection and diffusion, ApJ 471 (1996) 1022. 12. V.E. Tarasov, Review of some promising fractional physical models, Internat. J. Modern Phys. B 27 (9) (2013) 1330005. 13. C. Tsallis, Possible generalization of BG statistics, J. Stat. Phys. 52 (1-2) (1988) 479-487. 14. C. Tsallis, Nonextensive statistical mechanics: construction and physical interpretation, in: G.M. Murray, C. Tsallis (Eds.), Nonextensive Entropy-Interdisciplinary Applications, Oxford Univ. Press, 2004, pp. 1-53. 15. C. Tsallis, Introduction to Non-Extensive Statistical Mechanics, Springer, 2009. 16. G.M. Zaslavsky, Chaos, fractional kinetics, and anomalous transport, Physics Reports 371 (2002) 461-580. 17. L.M. Zelenyi, A.V. Milovanov, Fractal properties of sunspots, Sov. Astron. Lett. 17 (6) (1991) 425. 18. L.M. Zelenyi, A.V. Milovanov, Fractal topology and strange kinetics: from percolation theory to problems in cosmic electrodynamics, Phys.-Usp. 47 (8) (2004) 749-788.
NASA Astrophysics Data System (ADS)
Shekoyan, V.; Dehipawala, S.; Liu, Ernest; Tulsee, Vivek; Armendariz, R.; Tremberger, G.; Holden, T.; Marchese, P.; Cheung, T.
2012-10-01
Digital solar image data is available to users with access to standard, mass-market software. Many scientific projects utilize the Flexible Image Transport System (FITS) format, which requires specialized software typically used in astrophysical research. Data in the FITS format includes photometric and spatial calibration information, which may not be useful to researchers working with self-calibrated, comparative approaches. This project examines the advantages of using mass-market software with readily downloadable image data from the Solar Dynamics Observatory for comparative analysis over the use of specialized software capable of reading data in the FITS format. Comparative analyses of brightness statistics that describe the solar disk in the study of magnetic energy using algorithms included in mass-market software have been shown to give results similar to analyses using FITS data. The entanglement of magnetic energy associated with solar eruptions, as well as the development of such eruptions, has been characterized successfully using mass-market software. The proposed algorithm would help to establish a publicly accessible computing network that could assist in exploratory studies of all FITS data. The advances in computer, cell phone and tablet technology could incorporate such an approach readily for the enhancement of high school and first-year college space weather education on a global scale. Application to ground based data such as that contained in the Baryon Oscillation Spectroscopic Survey is discussed.
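A minimal sketch of the kind of self-calibrated, comparative brightness statistics described, using a synthetic stand-in for a solar-disk image rather than actual SDO data or any particular mass-market package:

```python
import numpy as np

# Stand-in for a solar image: bright disk on a dark background (illustrative only)
N = 512
yy, xx = np.mgrid[:N, :N]
r = np.hypot(xx - N / 2, yy - N / 2)
rng = np.random.default_rng(4)
img = np.where(r < 200, 1000 + 50 * rng.standard_normal((N, N)), 5.0)
img[r < 30] += 400 * rng.random((N, N))[r < 30]          # a bright "active region"

# Self-calibrated, comparative brightness statistics of the solar disk
disk = img[r < 200].astype(float)
mean, std = disk.mean(), disk.std()
skew = ((disk - mean) ** 3).mean() / std**3
kurt = ((disk - mean) ** 4).mean() / std**4 - 3.0
print(f"disk mean={mean:.1f}  std={std:.1f}  skewness={skew:.2f}  excess kurtosis={kurt:.2f}")
```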
Towards bridging the gap between climate change projections and maize producers in South Africa
NASA Astrophysics Data System (ADS)
Landman, Willem A.; Engelbrecht, Francois; Hewitson, Bruce; Malherbe, Johan; van der Merwe, Jacobus
2018-05-01
Multi-decadal regional projections of future climate change are introduced into a linear statistical model in order to produce an ensemble of austral mid-summer maximum temperature simulations for southern Africa. The statistical model uses atmospheric thickness fields from a high-resolution (0.5° × 0.5°) reanalysis-forced simulation as predictors in order to develop a linear recalibration model which represents the relationship between atmospheric thickness fields and gridded maximum temperatures across the region. The regional climate model, the conformal-cubic atmospheric model (CCAM), projects maximum temperature increases over southern Africa to be of the order of 4 °C, or even higher, under low mitigation towards the end of the century. The statistical recalibration model is able to replicate these increasing temperatures, and the atmospheric thickness-maximum temperature relationship is shown to be stable under future climate conditions. Since dry land crop yields are not explicitly simulated by climate models but are sensitive to maximum temperature extremes, the effect of projected maximum temperature change on dry land crops of the Witbank maize production district of South Africa, assuming other factors remain unchanged, is then assessed by employing a statistical approach similar to the one used for maximum temperature projections.
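A minimal sketch of the linear recalibration idea, assuming synthetic thickness and maximum-temperature values for a single grid cell and an arbitrary future thickness increase (the real model is fitted to CCAM fields across the region):

```python
import numpy as np

# Synthetic calibration sample: 1000-500 hPa thickness (m) vs. Tmax (deg C)
rng = np.random.default_rng(5)
thickness = rng.normal(5700, 40, 900)                  # reanalysis-era thickness
tmax = 0.07 * (thickness - 5700) + 30 + rng.normal(0, 1.2, 900)

# Linear recalibration model (fitted per grid cell in the real application)
A = np.column_stack([np.ones_like(thickness), thickness])
coef, *_ = np.linalg.lstsq(A, tmax, rcond=None)

# Apply to a projected end-of-century thickness field from the regional model
thickness_future = thickness + 60.0                    # illustrative warming signal
tmax_future = coef[0] + coef[1] * thickness_future
print(f"projected mid-summer Tmax change: {tmax_future.mean() - tmax.mean():+.1f} deg C")
```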
Probabilistic framework for assessing the ice sheet contribution to sea level change.
Little, Christopher M; Urban, Nathan M; Oppenheimer, Michael
2013-02-26
Previous sea level rise (SLR) assessments have excluded the potential for dynamic ice loss over much of Greenland and Antarctica, and recently proposed "upper bounds" on Antarctica's 21st-century SLR contribution are derived principally from regions where present-day mass loss is concentrated (basin 15, or B15, drained largely by Pine Island, Thwaites, and Smith glaciers). Here, we present a probabilistic framework for assessing the ice sheet contribution to sea level change that explicitly accounts for mass balance uncertainty over an entire ice sheet. Applying this framework to Antarctica, we find that ongoing mass imbalances in non-B15 basins give an SLR contribution by 2100 that: (i) is comparable to projected changes in B15 discharge and Antarctica's surface mass balance, and (ii) varies widely depending on the subset of basins and observational dataset used in projections. Increases in discharge uncertainty, or decreases in the exceedance probability used to define an upper bound, increase the fractional contribution of non-B15 basins; even weak spatial correlations in future discharge growth rates markedly enhance this sensitivity. Although these projections rely on poorly constrained statistical parameters, they may be updated with observations and/or models at many spatial scales, facilitating a more comprehensive account of uncertainty that, if implemented, will improve future assessments.
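A toy Monte Carlo version of the framework's central calculation, summing basin contributions with uncertain, spatially correlated discharge growth rates and reading off an exceedance-probability "upper bound"; every number below is hypothetical rather than taken from the assessment.

```python
import numpy as np

rng = np.random.default_rng(6)
n_basins, n_draws, years = 20, 20000, 90

# Hypothetical present-day basin imbalances (mm SLR per year) and uncertain growth rates
base_rate = np.abs(rng.normal(0.01, 0.01, n_basins))
mean_growth, sd_growth, rho = 0.01, 0.01, 0.5          # per-year growth rate and correlation

# Correlated growth-rate draws across basins (weak correlations already widen the tail)
cov = sd_growth**2 * ((1 - rho) * np.eye(n_basins) + rho * np.ones((n_basins, n_basins)))
growth = rng.multivariate_normal(mean_growth * np.ones(n_basins), cov, size=n_draws)

# Accumulate each basin's contribution to 2100 and sum over basins
t = np.arange(years)
total = np.zeros(n_draws)
for b in range(n_basins):                              # loop keeps memory modest
    total += base_rate[b] * np.exp(np.outer(growth[:, b], t)).sum(axis=1)

print(f"median 2100 SLR contribution: {np.median(total):.0f} mm")
print(f"95th-percentile 'upper bound': {np.percentile(total, 95):.0f} mm")
```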
Excerpts from Managing CQI in Radiology and Diagnostic Imaging Services: A CQI Handbook.
Joseph, E D; Lesher, C; Zage, R
1994-01-01
Continuous quality improvement (CQI) is currently the most popular and influential quality management program used in healthcare organizations. It is an effective methodology for identifying and acting on opportunities to improve the efficiency, effectiveness and value of services provided to customers. CQI implementation can be broken down into four components: (1) achievement objectives and goal identification, (2) system process analysis, (3) action planning and implementation, and (4) performance measurement and follow-up. As the project team establishes goals, it should consider customer and staff needs, what constitutes "quality," existing guidelines and regulations, and how results will be measured. Many techniques can be used to analyze the procedure or function targeted for improvement, including charts and diagrams, formal monitoring, data collection and statistical analysis. After the project team has identified potential service improvements, they develop an action plan, which may include education, recruitment, reassignment or equipment acquisition. The team must consider the impact of proposed changes and the financial and logistical feasibility of various proposals. The dynamic challenges of radiology and diagnostic imaging cannot be addressed through single, isolated actions; efforts to improve quality should be continuous. Accordingly, the project team should measure and analyze results of the action plan, reappraise goals and look for opportunities to further improve service.
Simulating the impact of long-term care policy on family eldercare hours.
Ansah, John P; Matchar, David B; Love, Sean R; Malhotra, Rahul; Do, Young Kyung; Chan, Angelique; Eberlein, Robert
2013-04-01
To understand the effect of current and future long-term care (LTC) policies on family eldercare hours for older adults (60 years of age and older) in Singapore. The Social Isolation Health and Lifestyles Survey, the Survey on Informal Caregiving, and the Singapore Government's Ministry of Health and Department of Statistics. An LTC Model was created using system dynamics methodology and parameterized using available reports and data as well as informal consultation with LTC experts. In the absence of policy change, among the elderly living at home with limitations in their activities of daily living (ADLs), the proportion of those with greater ADL limitations will increase. In addition, by 2030, average family eldercare hours per week are projected to increase by 41 percent from 29 to 41 hours. All policy levers considered would moderate or significantly reduce family eldercare hours. System dynamics modeling was useful in providing policy makers with an overview of the levers available to them and in demonstrating the interdependence of policies and system components. © Health Research and Educational Trust.
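A minimal stock-and-flow sketch in the spirit of the system dynamics approach described, with two stocks of community-dwelling elders (mild vs. severe ADL limitation) and entirely hypothetical rates and caregiving hours:

```python
import numpy as np

# Stocks: elders at home with mild vs. severe ADL limitations (hypothetical, thousands)
mild, severe = 60.0, 20.0
inflow, worsen, exit_mild, exit_severe = 4.0, 0.06, 0.04, 0.10   # per-year flows/rates
hours_mild, hours_severe = 15.0, 45.0                            # weekly eldercare hours per person

dt, years = 0.25, np.arange(2013, 2031, 0.25)
trajectory = []
for _ in years:                                   # Euler integration of the stock-flow system
    d_mild = inflow - (worsen + exit_mild) * mild
    d_severe = worsen * mild - exit_severe * severe
    mild, severe = mild + dt * d_mild, severe + dt * d_severe
    avg_hours = (hours_mild * mild + hours_severe * severe) / (mild + severe)
    trajectory.append(avg_hours)

print(f"average weekly family eldercare hours, 2013: {trajectory[0]:.1f}")
print(f"average weekly family eldercare hours, 2030: {trajectory[-1]:.1f}")
```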
Ruiz, Daniel; Cerón, Viviana; Molina, Adriana M; Quiñónes, Martha L; Jiménez, Mónica M; Ahumada, Martha; Gutiérrez, Patricia; Osorio, Salua; Mantilla, Gilma; Connor, Stephen J; Thomson, Madeleine C
2014-07-01
As part of the Integrated National Adaptation Pilot project and the Integrated Surveillance and Control System, the Colombian National Institute of Health is working on the design and implementation of a Malaria Early Warning System framework, supported by seasonal climate forecasting capabilities, weather and environmental monitoring, and malaria statistical and dynamic models. In this report, we provide an overview of the local ecoepidemiologic settings where four malaria process-based mathematical models are currently being implemented at a municipal level. The description includes general characteristics, malaria situation (predominant type of infection, malaria-positive cases data, malaria incidence, and seasonality), entomologic conditions (primary and secondary vectors, mosquito densities, and feeding frequencies), climatic conditions (climatology and long-term trends), key drivers of epidemic outbreaks, and non-climatic factors (populations at risk, control campaigns, and socioeconomic conditions). Selected pilot sites exhibit different ecoepidemiologic settings that must be taken into account in the development of the integrated surveillance and control system. © The American Society of Tropical Medicine and Hygiene.
Tsallis non-extensive statistics and solar wind plasma complexity
NASA Astrophysics Data System (ADS)
Pavlos, G. P.; Iliopoulos, A. C.; Zastenker, G. N.; Zelenyi, L. M.; Karakatsanis, L. P.; Riazantseva, M. O.; Xenakis, M. N.; Pavlos, E. G.
2015-03-01
This article presents novel results revealing non-equilibrium phase transition processes in the solar wind plasma during a strong shock event, which took place on 26th September 2011. Solar wind plasma is a typical case of stochastic spatiotemporal distribution of physical state variables such as force fields (B → , E →) and matter fields (particle and current densities or bulk plasma distributions). This study shows clearly the non-extensive and non-Gaussian character of the solar wind plasma and the existence of multi-scale strong correlations from the microscopic to the macroscopic level. It also underlines the inefficiency of classical magneto-hydro-dynamic (MHD) or plasma statistical theories, based on the classical central limit theorem (CLT), to explain the complexity of the solar wind dynamics, since these theories include smooth and differentiable spatial-temporal functions (MHD theory) or Gaussian statistics (Boltzmann-Maxwell statistical mechanics). On the contrary, the results of this study indicate the presence of non-Gaussian non-extensive statistics with heavy tails probability distribution functions, which are related to the q-extension of CLT. Finally, the results of this study can be understood in the framework of modern theoretical concepts such as non-extensive statistical mechanics (Tsallis, 2009), fractal topology (Zelenyi and Milovanov, 2004), turbulence theory (Frisch, 1996), strange dynamics (Zaslavsky, 2002), percolation theory (Milovanov, 1997), anomalous diffusion theory and anomalous transport theory (Milovanov, 2001), fractional dynamics (Tarasov, 2013) and non-equilibrium phase transition theory (Chang, 1992).
Problems with the North American Monsoon in CMIP/IPCC GCM Precipitation
NASA Astrophysics Data System (ADS)
Schiffer, N. J.; Nesbitt, S. W.
2011-12-01
Successful water management in the Desert Southwest and surrounding areas hinges on anticipating the timing and distribution of precipitation. IPCC AR4 models predict a more arid climate, more extreme precipitation events, and an earlier peak in springtime streamflow in the North American Monsoon region as the area warms. This study aims to assess the summertime skill with which general circulation models (GCMs) simulate precipitation and related dynamics over this region, a necessary precursor to reliable hydroclimate projections. Thirty-year climatologies of several GCMs in the third and fifth Climate Model Intercomparison Projects (CMIP) are statistically evaluated against each other and observed climatology for their skill in representing the location, timing, variability, character, and large-scale forcing of precipitation over the southwestern United States and northwestern Mexico. The results of this study will lend greater credence to more detailed, higher resolution studies, based on the CMIP and IPCC models, of the region's future hydrology. Our ultimate goal is to provide guidance such that decision-makers can plan future water management with more confidence.
Orbit Alignment in Triple Stars
NASA Astrophysics Data System (ADS)
Tokovinin, Andrei
2017-08-01
The statistics of the angle Φ between orbital angular momenta in hierarchical triple systems with known inner visual or astrometric orbits are studied. A correlation between apparent revolution directions proves the partial orbit alignment known from earlier works. The alignment is strong in triples with outer projected separation less than ~50 au, where the average Φ is about 20°. In contrast, outer orbits wider than 1000 au are not aligned with the inner orbits. It is established that the orbit alignment decreases with the increasing mass of the primary component. The average eccentricity of inner orbits in well-aligned triples is smaller than in randomly aligned ones. These findings highlight the role of dissipative interactions with gas in defining the orbital architecture of low-mass triple systems. On the other hand, chaotic dynamics apparently played a role in shaping more massive hierarchies. The analysis of projected configurations and triples with known inner and outer orbits indicates that the distribution of Φ is likely bimodal, where 80% of triples have Φ < 70° and the remaining ones are randomly aligned.
Galaxy Zoo: Infrared and Optical Morphology
NASA Astrophysics Data System (ADS)
Carla Shanahan, Jesse; Lintott, Chris; Zoo, Galaxy
2018-01-01
We present the detailed, visual morphologies of approximately 60,000 galaxies observed by the UKIRT Infrared Deep Sky Survey and then classified by participants in the Galaxy Zoo project. Our sample is composed entirely of nearby objects with redshifts of z ≤ 0.3, which enables us to robustly analyze their morphological characteristics including smoothness, bulge properties, spiral structure, and evidence of bars or rings. The determination of these features is made via a consensus-based analysis of the Galaxy Zoo project data in which inconsistent and outlying classifications are statistically down-weighted. We then compare these classifications of infrared morphology to the objects’ optical classifications in the Galaxy Zoo 2 release (Willett et al. 2013). It is already known that morphology is an effective tool for uncovering a galaxy’s dynamical past, and previous studies have shown significant correlations with physical characteristics such as stellar mass distribution and star formation history. We show that the majority of the sample has agreement or expected differences between the optical and infrared classifications, but we also present a preliminary analysis of a subsample of objects with striking discrepancies.
Statistics for clinical nursing practice: an introduction.
Rickard, Claire M
2008-11-01
Difficulty in understanding statistics is one of the most frequently reported barriers to nurses applying research results in their practice. Yet the amount of nursing research published each year continues to grow, as does the expectation that nurses will undertake practice based on this evidence. Critical care nurses do not need to be statisticians, but they do need to develop a working knowledge of statistics so they can be informed consumers of research and so practice can evolve and improve. For those undertaking a research project, statistical literacy is required to interact with other researchers and statisticians, so as to best design and undertake the project. This article is the first in a series that guides critical care nurses through statistical terms and concepts relevant to their practice.
Natural neural projection dynamics underlying social behavior
Gunaydin, Lisa A.; Grosenick, Logan; Finkelstein, Joel C.; Kauvar, Isaac V.; Fenno, Lief E.; Adhikari, Avishek; Lammel, Stephan; Mirzabekov, Julie J.; Airan, Raag D.; Zalocusky, Kelly A.; Tye, Kay M.; Anikeeva, Polina; Malenka, Robert C.; Deisseroth, Karl
2014-01-01
Social interaction is a complex behavior essential for many species, and is impaired in major neuropsychiatric disorders. Pharmacological studies have implicated certain neurotransmitter systems in social behavior, but circuit-level understanding of endogenous neural activity during social interaction is lacking. We therefore developed and applied a new methodology, termed fiber photometry, to optically record natural neural activity in genetically- and connectivity-defined projections to elucidate the real-time role of specified pathways in mammalian behavior. Fiber photometry revealed that activity dynamics of a ventral tegmental area (VTA)-to-nucleus accumbens (NAc) projection could encode and predict key features of social but not novel-object interaction. Consistent with this observation, optogenetic control of cells specifically contributing to this projection was sufficient to modulate social behavior, which was mediated by type-1 dopamine receptor signaling downstream in the NAc. Direct observation of projection-specific activity in this way captures a fundamental and previously inaccessible dimension of circuit dynamics. PMID:24949967
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ramamurthy, Byravamurthy
2014-05-05
In this project, we developed scheduling frameworks for dynamic bandwidth demands for large-scale science applications. In particular, we developed scheduling algorithms for dynamic bandwidth demands. Apart from theoretical approaches such as Integer Linear Programming, Tabu Search and Genetic Algorithm heuristics, we have utilized practical data from the ESnet OSCARS project (from our DOE lab partners) to conduct realistic simulations of our approaches. We have disseminated our work through conference paper presentations, journal papers, and a book chapter. In this project we addressed the problem of scheduling of lightpaths over optical wavelength division multiplexed (WDM) networks. We published several conference papers and journal papers on this topic. We also addressed the problems of joint allocation of computing, storage and networking resources in Grid/Cloud networks and proposed energy-efficient mechanisms for operating optical WDM networks.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Daffron, James Y.
2003-02-27
Unexploded Ordnance (UXO) removal and investigation projects typically involve multiple organizations including Government entities, private contractors, and technical experts. Resources are split into functional "teams" who perform the work and interface with the clients. The projects typically generate large amounts of data that must be shared among the project team members, the clients, and the public. The ability to efficiently communicate and control information is essential to project success. Web-based project collaboration is an effective management and communication tool when applied to ordnance and explosives (OE) projects. During a recent UXO/OE removal project at the Jefferson Proving Ground (JPG) in Madison, IN, American Technologies, Inc. (ATI) successfully used the Project Commander® (www.ProCommander.com) project collaboration website as a dynamic project and information management tool.
NASA Technical Reports Server (NTRS)
Null, Cynthia H.
2009-01-01
In June 2004, the Space Flight Leadership Council (SFLC) assigned an action to the NASA Engineering and Safety Center (NESC) and External Tank (ET) project jointly to characterize the available dataset [of defect sizes from dissections of foam], identify resultant limitations to statistical treatment of ET as-built foam as part of the overall thermal protection system (TPS) certification, and report to the Program Requirements Change Board (PRCB) and SFLC in September 2004. The NESC statistics team was formed to assist the ET statistics group in August 2004. The NESC's conclusions are presented in this report.
Inferring monopartite projections of bipartite networks: an entropy-based approach
NASA Astrophysics Data System (ADS)
Saracco, Fabio; Straka, Mika J.; Di Clemente, Riccardo; Gabrielli, Andrea; Caldarelli, Guido; Squartini, Tiziano
2017-05-01
Bipartite networks are currently regarded as providing a major insight into the organization of many real-world systems, unveiling the mechanisms driving the interactions occurring between distinct groups of nodes. One of the most important issues encountered when modeling bipartite networks is devising a way to obtain a (monopartite) projection on the layer of interest, which preserves as much as possible the information encoded into the original bipartite structure. In the present paper we propose an algorithm to obtain statistically-validated projections of bipartite networks, according to which any two nodes sharing a statistically-significant number of neighbors are linked. Since assessing the statistical significance of nodes similarity requires a proper statistical benchmark, here we consider a set of four null models, defined within the exponential random graph framework. Our algorithm outputs a matrix of link-specific p-values, from which a validated projection is straightforwardly obtainable, upon running a multiple hypothesis testing procedure. Finally, we test our method on an economic network (i.e. the countries-products World Trade Web representation) and a social network (i.e. MovieLens, collecting the users’ ratings of a list of movies). In both cases non-trivial communities are detected: while projecting the World Trade Web on the countries layer reveals modules of similarly-industrialized nations, projecting it on the products layer allows communities characterized by an increasing level of complexity to be detected; in the second case, projecting MovieLens on the films layer allows clusters of movies whose affinity cannot be fully accounted for by genre similarity to be individuated.
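The validation logic can be sketched with a simpler null than the paper's exponential random graph models: a hypergeometric test on the number of shared neighbours for every node pair on the layer of interest, followed by a Benjamini-Hochberg correction. The biadjacency matrix below is random and the null model is a deliberate simplification.

```python
import numpy as np
from itertools import combinations
from scipy.stats import hypergeom

# Random bipartite biadjacency matrix: rows = layer of interest, columns = other layer
rng = np.random.default_rng(7)
B = (rng.random((30, 200)) < 0.1).astype(int)
n_cols = B.shape[1]
deg = B.sum(axis=1)

# p-value for each pair: is the number of common neighbours surprisingly large
# under a hypergeometric null (simpler than the paper's entropy-based null models)?
pairs, pvals = [], []
for i, j in combinations(range(B.shape[0]), 2):
    shared = int(B[i] @ B[j])
    p = hypergeom.sf(shared - 1, n_cols, deg[i], deg[j])
    pairs.append((i, j)); pvals.append(p)

# Benjamini-Hochberg multiple-hypothesis correction at FDR = 0.05
pvals = np.array(pvals); order = np.argsort(pvals); m = len(pvals)
thresh = 0.05 * np.arange(1, m + 1) / m
passed = np.where(pvals[order] <= thresh)[0]
validated = [pairs[order[k]] for k in range(passed.max() + 1)] if passed.size else []
print(f"{len(validated)} validated links out of {m} candidate pairs")
```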
Statistical-dynamical modeling of the cloud-to-ground lightning activity in Portugal
NASA Astrophysics Data System (ADS)
Sousa, J. F.; Fragoso, M.; Mendes, S.; Corte-Real, J.; Santos, J. A.
2013-10-01
The present study employs a dataset of cloud-to-ground discharges over Portugal, collected by the Portuguese lightning detection network in the period of 2003-2009, to identify dynamically coherent lightning regimes in Portugal and to implement a statistical-dynamical modeling of the daily discharges over the country. For this purpose, the high-resolution MERRA reanalysis is used. Three lightning regimes are then identified for Portugal: WREG, WREM and SREG. WREG is a typical cold-core cut-off low. WREM is connected to strong frontal systems driven by remote low pressure systems at higher latitudes over the North Atlantic. SREG is a combination of an inverted trough and a mid-tropospheric cold-core nearby Portugal. The statistical-dynamical modeling is based on logistic regressions (statistical component) developed for each regime separately (dynamical component). It is shown that the strength of the lightning activity (either strong or weak) for each regime is consistently modeled by a set of suitable dynamical predictors (65-70% of efficiency). The difference of the equivalent potential temperature in the 700-500 hPa layer is the best predictor for the three regimes, while the best 4-layer lifted index is still important for all regimes, but with much weaker significance. Six other predictors are more suitable for a specific regime. For the purpose of validating the modeling approach, a regional-scale climate model simulation is carried out under a very intense lightning episode.
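A minimal sketch of the statistical component for one regime: a logistic regression that separates strong from weak lightning days using dynamical predictors such as the 700-500 hPa equivalent potential temperature difference and a lifted index. The predictor values and coefficients are synthetic assumptions, not the study's MERRA-based ones.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Synthetic daily predictors for one lightning regime (e.g. WREG)
rng = np.random.default_rng(8)
n = 600
d_theta_e = rng.normal(-5.0, 4.0, n)     # theta_e(700 hPa) - theta_e(500 hPa), K
lifted_idx = rng.normal(0.0, 3.0, n)     # best 4-layer lifted index, K

# "Strong activity" days become more likely with larger instability
logit = 0.5 * d_theta_e - 0.2 * lifted_idx
strong = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

X = np.column_stack([d_theta_e, lifted_idx])
model = LogisticRegression().fit(X, strong)
acc = cross_val_score(LogisticRegression(), X, strong, cv=5).mean()
print(f"cross-validated efficiency: {100 * acc:.0f}%  coefficients: {model.coef_.round(2)}")
```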
Statistical benchmark for BosonSampling
NASA Astrophysics Data System (ADS)
Walschaers, Mattia; Kuipers, Jack; Urbina, Juan-Diego; Mayer, Klaus; Tichy, Malte Christopher; Richter, Klaus; Buchleitner, Andreas
2016-03-01
Boson samplers—set-ups that generate complex many-particle output states through the transmission of elementary many-particle input states across a multitude of mutually coupled modes—promise the efficient quantum simulation of a classically intractable computational task, and challenge the extended Church-Turing thesis, one of the fundamental dogmas of computer science. However, as in all experimental quantum simulations of truly complex systems, one crucial problem remains: how to certify that a given experimental measurement record unambiguously results from enforcing the claimed dynamics, on bosons, fermions or distinguishable particles? Here we offer a statistical solution to the certification problem, identifying an unambiguous statistical signature of many-body quantum interference upon transmission across a multimode, random scattering device. We show that statistical analysis of only partial information on the output state allows to characterise the imparted dynamics through particle type-specific features of the emerging interference patterns. The relevant statistical quantifiers are classically computable, define a falsifiable benchmark for BosonSampling, and reveal distinctive features of many-particle quantum dynamics, which go much beyond mere bunching or anti-bunching effects.
Mainstreaming Modeling and Simulation to Accelerate Public Health Innovation
Sepulveda, Martin-J.; Mabry, Patricia L.
2014-01-01
Dynamic modeling and simulation are systems science tools that examine behaviors and outcomes resulting from interactions among multiple system components over time. Although there are excellent examples of their application, they have not been adopted as mainstream tools in population health planning and policymaking. Impediments to their use include the legacy and ease of use of statistical approaches that produce estimates with confidence intervals, the difficulty of multidisciplinary collaboration for modeling and simulation, systems scientists’ inability to communicate effectively the added value of the tools, and low funding for population health systems science. Proposed remedies include aggregation of diverse data sets, systems science training for public health and other health professionals, changing research incentives toward collaboration, and increased funding for population health systems science projects. PMID:24832426
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sylvester, Linda; Omitaomu, Olufemi A.; Parish, Esther S.
2016-09-01
Oak Ridge National Laboratory (ORNL) and the City of Knoxville, Tennessee have partnered to work on a Laboratory Directed Research and Development (LDRD) project towards investigating climate change, mitigation, and adaptation measures in mid-sized cities. ORNL has statistically and dynamically downscaled ten Global Climate Models (GCMs) to both 1 km and 4 km resolutions. The processing and summary of those ten gridded datasets for use in a web-based tool is described. The summaries of each model are shown individually to assist in determining the similarities and differences between the model scenarios. The variables of minimum and maximum daily temperature and total monthly precipitation are summarized for the area of Knoxville, Tennessee for the periods of 1980-2005 and 2025-2050.
Immediate effects of cryotherapy on static and dynamic balance.
Douglas, Matthew; Bivens, Serena; Pesterfield, Jennifer; Clemson, Nathan; Castle, Whitney; Sole, Gisela; Wassinger, Craig A
2013-02-01
Cryotherapy is commonly used in physical therapy with many known benefits; however several investigations have reported decreased functional performance following therapeutic application thereof. The purpose of this study was to determine the effect of cryotherapy applied to the ankle on static and dynamic standing balance. It was hypothesized that balance would be decreased after cryotherapy application. Twenty individuals (aged 18 to 40 years) participated in this research project. Each participant was tested under two conditions: an experimental condition where subjects received ice water immersion of the foot and ankle for 15 minutes immediately before balance testing and a control condition completed at room temperature. A Biodex® Balance System was used to quantify balance using anterior/posterior (AP), medial/lateral (ML), and overall balance indices. Paired t-tests were used to compare the balance indices for the two conditions with alpha set at 0.05 a priori. Effect size was also calculated to account for the multiple comparisons made. The static balance indices did not display statistically significant differences between the post-cryotherapy and the control conditions with low effect sizes. Dynamic ML indices significantly increased following the cryotherapy application compared to the control exhibiting a moderate effect size indicating decreased balance following cryotherapy application. No differences were noted between experimental and control conditions for the dynamic AP or overall balance indices while a small effect size was noted for both. The results suggest that cryotherapy to the ankle has a negative effect on the ML component of dynamic balance following ice water immersion. Immediate return to play following cryotherapy application is cautioned given the decreased dynamic ML balance and potential for increased injury risk. 3b Case-control study.
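The statistical comparison reported here (paired t-tests plus effect sizes) can be sketched as follows, assuming synthetic balance-index values rather than the Biodex measurements:

```python
import numpy as np
from scipy.stats import ttest_rel

# Synthetic dynamic medial/lateral stability indices (higher = worse balance)
rng = np.random.default_rng(9)
control = rng.normal(1.2, 0.4, 20)
cryo = control + rng.normal(0.25, 0.3, 20)     # post-cryotherapy condition

t, p = ttest_rel(cryo, control)
diff = cryo - control
cohens_d = diff.mean() / diff.std(ddof=1)      # effect size for paired samples
print(f"paired t = {t:.2f}, p = {p:.3f}, Cohen's d = {cohens_d:.2f} (alpha = 0.05)")
```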
Old, L.; Wojtak, R.; Pearce, F. R.; ...
2017-12-20
With the advent of wide-field cosmological surveys, we are approaching samples of hundreds of thousands of galaxy clusters. While such large numbers will help reduce statistical uncertainties, the control of systematics in cluster masses is crucial. Here we examine the effects of an important source of systematic uncertainty in galaxy-based cluster mass estimation techniques: the presence of significant dynamical substructure. Dynamical substructure manifests as dynamically distinct subgroups in phase-space, indicating an ‘unrelaxed’ state. This issue affects around a quarter of clusters in a generally selected sample. We employ a set of mock clusters whose masses have been measured homogeneously with commonly used galaxy-based mass estimation techniques (kinematic, richness, caustic, radial methods). We use these to study how the relation between observationally estimated and true cluster mass depends on the presence of substructure, as identified by various popular diagnostics. We find that the scatter for an ensemble of clusters does not increase dramatically for clusters with dynamical substructure. However, we find a systematic bias for all methods, such that clusters with significant substructure have higher measured masses than their relaxed counterparts. This bias depends on cluster mass: the most massive clusters are largely unaffected by the presence of significant substructure, but masses are significantly overestimated for lower mass clusters, by ~10 percent at 10^14 and ≳20 percent for ≲10^13.5. Finally, the use of cluster samples with different levels of substructure can therefore bias certain cosmological parameters up to a level comparable to the typical uncertainties in current cosmological studies.
2016-05-01
Algorithm for Overcoming the Curse of Dimensionality for Certain Non-convex Hamilton-Jacobi Equations, Projections and Differential Games. Yat Tin... subproblems. Our approach is expected to have wide applications in continuous dynamic games, control theory problems, and elsewhere. Mathematics... differential dynamic games, control theory problems, and dynamical systems coming from the physical world, e.g. [11]. An important application is to...
Developing Statistical Literacy with Year 9 Students: A Collaborative Research Project
ERIC Educational Resources Information Center
Sharma, Sashi
2013-01-01
Advances in technology and communication have increased the amount of statistical information delivered through everyday media. The importance of statistics in everyday life has led to calls for increased attention to statistical literacy in the mathematics curriculum (Watson 2006). Gal (2004) sees statistical literacy as the need for students to…
Whole Frog Project and Virtual Frog Dissection Statistics: wwwstats web-access output (access counts exclude duplicate or extraneous requests and under-represent the bytes requested).
Hamman, Josheph J; Hamlet, Alan F.; Fuller, Roger; Grossman, Eric E.
2016-01-01
Current understanding of the combined effects of sea level rise (SLR), storm surge, and changes in river flooding on near-coastal environments is very limited. This project uses a suite of numerical models to examine the combined effects of projected future climate change on flooding in the Skagit floodplain and estuary. Statistically and dynamically downscaled global climate model scenarios from the ECHAM-5 GCM were used as the climate forcings. Unregulated daily river flows were simulated using the VIC hydrology model, and regulated river flows were simulated using the SkagitSim reservoir operations model. Daily tidal anomalies (TA) were calculated using a regression approach based on ENSO and atmospheric pressure forcing simulated by the WRF regional climate model. A 2-D hydrodynamic model was used to estimate water surface elevations in the Skagit floodplain using resampled hourly hydrographs keyed to regulated daily flood flows produced by the reservoir simulation model, and tide predictions adjusted for SLR and TA. Combining peak annual TA with projected sea level rise, the historical (1970–1999) 100-yr peak high water level is exceeded essentially every year by the 2050s. The combination of projected sea level rise and larger floods by the 2080s yields both increased flood inundation area (+ 74%), and increased average water depth (+ 25 cm) in the Skagit floodplain during a 100-year flood. Adding sea level rise to the historical FEMA 100-year flood resulted in a 35% increase in inundation area by the 2040's, compared to a 57% increase when both SLR and projected changes in river flow were combined.
A statistical physics viewpoint on the dynamics of the bouncing ball
NASA Astrophysics Data System (ADS)
Chastaing, Jean-Yonnel; Géminard, Jean-Christophe; Bertin, Eric
2016-06-01
We compute, from a statistical physics perspective, the dynamics of a bouncing ball maintained in a chaotic regime thanks to collisions with a plate experiencing an aperiodic vibration. We analyze in detail the energy exchanges between the bead and the vibrating plate, and show that the coupling between the bead and the plate can be modeled in terms of both a dissipative process and an injection mechanism by an energy reservoir. An analysis of the injection statistics in terms of a fluctuation relation is also provided.
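A minimal caricature of the set-up, assuming that the plate velocity seen by the bead at each impact is an independent random draw and ignoring flight-time details; the restitution coefficient and velocity scale are arbitrary, but the bookkeeping illustrates the split into injected and dissipated energy discussed above.

```python
import numpy as np

rng = np.random.default_rng(10)
e, V_rms = 0.9, 0.2                 # restitution coefficient, plate velocity scale (m/s)
n_bounces = 50000

v_up = 1.0                          # bead take-off speed after a collision (m/s)
injected, dissipated = [], []
for _ in range(n_bounces):
    v_down = -v_up                          # arrival velocity at next impact (symmetric flight)
    V = V_rms * rng.standard_normal()       # aperiodic plate velocity at impact
    v_new = -e * (v_down - V) + V           # inelastic collision with a moving plate
    dE = 0.5 * (v_new**2 - v_down**2)       # kinetic-energy change per unit mass
    (injected if dE > 0 else dissipated).append(dE)
    v_up = max(v_new, 1e-3)                 # keep the bead bouncing upward

print(f"mean energy change, gaining collisions: {np.mean(injected):+.4f} J/kg")
print(f"mean energy change, losing collisions:  {np.mean(dissipated):+.4f} J/kg")
print(f"fraction of energy-gaining collisions:  {len(injected) / n_bounces:.2f}")
```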
NASA Astrophysics Data System (ADS)
Palomino-Lemus, Reiner; Córdoba-Machado, Samir; Quishpe-Vásquez, César; García-Valdecasas-Ojeda, Matilde; Raquel Gámiz-Fortis, Sonia; Castro-Díez, Yolanda; Jesús Esteban-Parra, María
2017-04-01
In this study the Principal Component Regression (PCR) method has been used as a statistical downscaling technique for simulating boreal winter precipitation in Tropical America during the period 1950-2010, and then for generating climate change projections for the 2071-2100 period. The study uses the Global Precipitation Climatology Centre (GPCC, version 6) data set over the Tropical America region [30°N-30°S, 120°W-30°W] as the predictand variable in the downscaling model. The mean monthly sea level pressure (SLP) from the National Center for Environmental Prediction - National Center for Atmospheric Research (NCEP-NCAR reanalysis project) has been used as the predictor variable, covering a more extended area [30°N-30°S, 180°W-30°W]. Also, the SLP outputs from 20 GCMs, taken from the Coupled Model Intercomparison Project (CMIP5), have been used. The model data include simulations with historical atmospheric concentrations and future projections for the representative concentration pathways RCP2.6, RCP4.5, and RCP8.5. The ability of the different GCMs to simulate the winter precipitation in the study area for present climate (1971-2000) was analyzed by calculating the differences between the simulated and observed precipitation values. Additionally, the statistical significance at the 95% confidence level of these differences has been estimated by means of the bilateral rank sum test of Wilcoxon-Mann-Whitney. Finally, to project winter precipitation in the area for the period 2071-2100, the downscaling model, recalibrated for the total period 1950-2010, was applied to the SLP outputs of the GCMs under the RCP2.6, RCP4.5, and RCP8.5 scenarios. The results show that, generally, for present climate the statistical downscaling shows a high ability to faithfully reproduce the precipitation field, while the simulations performed directly with the non-downscaled outputs of the GCMs strongly distort the precipitation field. For future climate, the projections under the RCP4.5 and RCP8.5 scenarios show large areas with significant changes. For the RCP2.6 scenario, the projections show a predominance of very moderate decreases in rainfall, although significant in some models. Keywords: climate change projections, precipitation, Tropical America, statistical downscaling. Acknowledgements: This work has been financed by the projects P11-RNM-7941 (Junta de Andalucía-Spain) and CGL2013-48539-R (MINECO-Spain, FEDER).
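A minimal sketch of the PCR downscaling chain (PCA of the SLP predictor field, regression of gridded precipitation on the leading PCs, application to a perturbed SLP field), using synthetic fields in place of the GPCC, NCEP-NCAR and CMIP5 data:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

# Synthetic winter-mean fields: SLP predictor (n_years x n_gridpoints_slp) and precipitation
rng = np.random.default_rng(11)
n_years, n_slp, n_prec = 61, 300, 150
modes = rng.standard_normal((n_years, 3))                 # shared large-scale modes
slp = modes @ rng.standard_normal((3, n_slp)) + 0.3 * rng.standard_normal((n_years, n_slp))
prec = modes @ rng.standard_normal((3, n_prec)) + 0.3 * rng.standard_normal((n_years, n_prec))

# Principal Component Regression: leading SLP PCs -> gridded precipitation
pca = PCA(n_components=3).fit(slp)
pcs = pca.transform(slp)
reg = LinearRegression().fit(pcs, prec)

# Apply the calibrated model to a (hypothetical) GCM SLP projection for 2071-2100
slp_future = slp + rng.normal(0.5, 0.1, n_slp)            # illustrative SLP change pattern
prec_future = reg.predict(pca.transform(slp_future))
print(f"projected mean precipitation anomaly: {(prec_future - prec).mean():.3f}")
```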
Genetic Programming as Alternative for Predicting Development Effort of Individual Software Projects
Chavoya, Arturo; Lopez-Martin, Cuauhtemoc; Andalon-Garcia, Irma R.; Meda-Campaña, M. E.
2012-01-01
Statistical and genetic programming techniques have been used to predict the software development effort of large software projects. In this paper, a genetic programming model was used for predicting the effort required in individually developed projects. Accuracy obtained from a genetic programming model was compared against one generated from the application of a statistical regression model. A sample of 219 projects developed by 71 practitioners was used for generating the two models, whereas another sample of 130 projects developed by 38 practitioners was used for validating them. The models used two kinds of lines of code as well as programming language experience as independent variables. Accuracy results from the model obtained with genetic programming suggest that it could be used to predict the software development effort of individual projects when these projects have been developed in a disciplined manner within a development-controlled environment. PMID:23226305
NASA Astrophysics Data System (ADS)
Lanorte, R.; Lasaponara, R.; De Santis, F.; Aromando, A.; Nole, G.
2012-04-01
Daily estimates of fire danger using multitemporal satellite MODIS data: the experience of FIRE-SAT in the Basilicata Region (Italy). A. Lanorte, F. De Santis, A. Aromando, G. Nolè, R. Lasaponara, CNR-IMAA, Potenza, Italy. In recent years the Basilicata Region (Southern Italy) has been characterized by an increasing incidence of fire disturbance, which also tends to affect protected (regional and national parks) and natural vegetated areas. The FIRE_SAT project has been funded by the Civil Protection of the Basilicata Region in order to set up a low-cost methodology for fire danger/risk monitoring based on satellite Earth Observation techniques. To this aim, NASA Moderate Resolution Imaging Spectroradiometer (MODIS) data were used. Their spectral capability and daily availability make MODIS products especially suitable for estimating the variations of fuel characteristics. This work presents new significant results obtained in the context of the FIRE-SAT project. In order to obtain a dynamical indicator of fire susceptibility based on multitemporal MODIS satellite data, updatable over short time periods (daily), we used the spatial/temporal variations of the following parameters: (1) Relative Greenness Index, (2) live and dead fuel moisture content, and (3) temperature. In particular, the dead fuel moisture content is a key factor in fire ignition. Dead fuel moisture dynamics are significantly faster than those observed for live fuel. Dead fine vegetation exhibits moisture and density values dependent on rapid atmospheric changes and strictly linked to local meteorological conditions. For this reason, the estimation of dead fuel moisture content is commonly based on meteorological variables. In this study we propose to use MODIS data to estimate meteorological data (specifically relative humidity) at an adequate spatial and temporal resolution. The assessment of dead fuel moisture content plays a decisive role in determining a dynamic fire danger index in combination with other factors. This greatly improves the reliability of fire danger maps obtained on the basis of an integrated approach combining the dynamic factors mentioned above and the static factors (fuel physical properties, morphological parameters and social-historical factors). The validation of the fire danger indices was carried out using statistics of past forest fires. The validation results show satisfactory agreement with the fire danger maps, taking into account that fire events are only an indirect indicator of fire danger; indeed, many factors influence fire ignition and spread, such as human pressure, fire-fighting conditions, wind, etc. Therefore, in this study we have defined and used several fire statistics useful for the validation of the fire danger maps in order to create the basic elements for the design of a validation protocol.
Computer-aided software development process design
NASA Technical Reports Server (NTRS)
Lin, Chi Y.; Levary, Reuven R.
1989-01-01
The authors describe an intelligent tool designed to aid managers of software development projects in planning, managing, and controlling the development process of medium- to large-scale software projects. Its purpose is to reduce uncertainties in the budget, personnel, and schedule planning of software development projects. It is based on a dynamic model of the software development and maintenance life-cycle process. This dynamic process is composed of a number of time-varying, interacting developmental phases, each characterized by its intended functions and requirements. System dynamics is used as the modeling methodology. The resulting Software Life-Cycle Simulator (SLICS) and the hybrid expert simulation system of which it is a subsystem are described.
Activity statistics in a colloidal glass former: Experimental evidence for a dynamical transition
NASA Astrophysics Data System (ADS)
Abou, Bérengère; Colin, Rémy; Lecomte, Vivien; Pitard, Estelle; van Wijland, Frédéric
2018-04-01
In a dense colloidal suspension at a volume fraction below the glass transition, we follow the trajectories of an assembly of tracers over a large time window. We define a local activity, which quantifies the local tendency of the system to rearrange. We determine the statistics of the time integrated activity, and we argue that it develops a low activity tail that comes together with the onset of glassy-like behavior and heterogeneous dynamics. These rare events may be interpreted as the reflection of an underlying dynamic phase transition.
Mathematical Representation Ability by Using Project Based Learning on the Topic of Statistics
NASA Astrophysics Data System (ADS)
Widakdo, W. A.
2017-09-01
Seeing the importance of the role of mathematics in everyday life, mastery of the subject areas of mathematics is a must. Representation ability is one of the fundamental abilities used in mathematics to make connections between abstract ideas and logical thinking in understanding mathematics. The researcher observed a lack of mathematical representation ability and sought an alternative solution to address it by using project-based learning. This research uses a literature study of books and journal articles to examine the importance of mathematical representation ability in mathematics learning and how project-based learning is able to increase this mathematical representation ability on the topic of statistics. The indicators for mathematical representation ability in this research are classified as visual representation (picture, diagram, graph, or table); symbolic representation (mathematical statements, mathematical notation, numerical/algebraic symbols); and verbal representation (written text). This article explains why project-based learning is able to influence students' mathematical representation by using some theories in cognitive psychology, and also shows an example of project-based learning that can be used in teaching statistics, one of the mathematics topics that is very useful for analyzing data.
Dose fractionation theorem in 3-D reconstruction (tomography)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Glaeser, R.M.
It is commonly assumed that the large number of projections required for single-axis tomography precludes its application to most beam-labile specimens. However, Hegerl and Hoppe have pointed out that the total dose required to achieve statistical significance for each voxel of a computed 3-D reconstruction is the same as that required to obtain a single 2-D image of that isolated voxel, at the same level of statistical significance. Thus a statistically significant 3-D image can be computed from statistically insignificant projections, as long as the total dose that is distributed among these projections is high enough that it would have resulted in a statistically significant projection, if applied to only one image. We have tested this critical theorem by simulating the tomographic reconstruction of a realistic 3-D model created from an electron micrograph. The simulations verify the basic conclusions of the theorem under conditions of high absorption, signal-dependent noise, varying specimen contrast and missing angular range. Furthermore, the simulations demonstrate that individual projections in the series of fractionated-dose images can be aligned by cross-correlation because they contain significant information derived from the summation of features from different depths in the structure. This latter information is generally not useful for structural interpretation prior to 3-D reconstruction, owing to the complexity of most specimens investigated by single-axis tomography. These results, in combination with dose estimates for imaging single voxels and measurements of radiation damage in the electron microscope, demonstrate that it is feasible to use single-axis tomography with soft X-ray microscopy of frozen-hydrated specimens.
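The theorem's core claim, that a fixed total dose split over many statistically insignificant projections carries the same per-voxel statistics as a single full-dose image, can be illustrated numerically. The toy below is a hedged sketch using Poisson counting noise only; the signal level and number of projections are arbitrary assumptions, and none of the paper's imaging conditions (absorption, contrast, missing wedge) are modeled.

```python
# Minimal numerical illustration of the dose-fractionation idea (a toy under
# stated assumptions, not the authors' tomographic simulation): distributing a
# fixed total dose over many noisy projections preserves the counting statistics
# obtained by spending the whole dose on a single image.
import numpy as np

rng = np.random.default_rng(1)
signal = 5.0            # mean counts a feature would produce at the full dose
n_projections = 100     # the full dose is split evenly over these images

single_shot = rng.poisson(signal, size=50000)                          # full dose at once
fractionated = rng.poisson(signal / n_projections,
                           size=(50000, n_projections)).sum(axis=1)    # summed low-dose images

for name, counts in [("single exposure", single_shot), ("fractionated sum", fractionated)]:
    print(name, "mean:", counts.mean(), "std:", counts.std())          # both ~5 and ~sqrt(5)
```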
1984-12-01
AD-A159 367 / DOT/FAA/PM-84/32: Statistics from the Operation of the Low-Level Wind Shear Alert System (LLWAS) During the JAWS Project: An Interim Report. National Center for Atmospheric Research, Boulder, CO, December 1984.
NASA Astrophysics Data System (ADS)
Song, Seok Goo; Kwak, Sangmin; Lee, Kyungbook; Park, Donghee
2017-04-01
Predicting the intensity and variability of strong ground motions is a critical element of seismic hazard assessment. The characteristics and variability of the earthquake rupture process may be a dominant factor in determining the intensity and variability of near-source strong ground motions. Song et al. (2014) demonstrated that the variability of earthquake rupture scenarios could be effectively quantified in the framework of 1-point and 2-point statistics of earthquake source parameters, constrained by rupture dynamics and past events. The developed pseudo-dynamic source modeling schemes were also validated against the recorded ground motion data of past events and empirical ground motion prediction equations (GMPEs) on the broadband platform (BBP) developed by the Southern California Earthquake Center (SCEC). Recently we improved the computational efficiency of the developed pseudo-dynamic source-modeling scheme by adopting the nonparametric co-regionalization algorithm, originally introduced and applied in geostatistics. We also investigated the effect of the earthquake rupture process on near-source ground motion characteristics in the framework of 1-point and 2-point statistics, particularly focusing on the forward directivity region. Finally, we will discuss whether pseudo-dynamic source modeling can reproduce the variability (standard deviation) of empirical GMPEs, and the efficiency of 1-point and 2-point statistics in addressing the variability of ground motions.
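For readers unfamiliar with the terminology, the 1-point statistics of a source parameter are its marginal moments or distribution, while the 2-point statistics describe its spatial correlation structure. The snippet below is a minimal, assumption-laden sketch of both quantities for a synthetic along-strike slip profile with an exponential correlation model; it is not the authors' pseudo-dynamic source generator, and all parameter values are illustrative.

```python
# Hedged sketch of 1-point and 2-point statistics for a synthetic slip profile.
# The exponential correlation length and discretization are assumed values.
import numpy as np

rng = np.random.default_rng(2)
n, dx, corr_len = 256, 1.0, 10.0                     # subfaults, spacing (km), correlation length (km)

# Build a spatially correlated Gaussian slip field from its covariance matrix.
lags = np.abs(np.subtract.outer(np.arange(n), np.arange(n))) * dx
cov = np.exp(-lags / corr_len)
slip = 1.0 + np.linalg.cholesky(cov + 1e-10 * np.eye(n)) @ rng.standard_normal(n)

# 1-point statistics: marginal mean and standard deviation of slip.
print("mean slip:", slip.mean(), "std:", slip.std())

# 2-point statistics: empirical autocorrelation versus lag distance.
anom = slip - slip.mean()
acf = np.correlate(anom, anom, mode="full")[n - 1:] / (anom @ anom)
print("correlation at one correlation length:", acf[int(corr_len / dx)])  # ~exp(-1) in expectation
```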
A solar cycle dependence of nonlinearity in magnetospheric activity
NASA Astrophysics Data System (ADS)
Johnson, Jay R.; Wing, Simon
2005-04-01
The nonlinear dependencies inherent to the historical Kp data stream (1932-2003) are examined using mutual information and cumulant-based cost as discriminating statistics. The discriminating statistics are compared with surrogate data streams that are constructed using the corrected amplitude adjusted Fourier transform (CAAFT) method and capture the linear properties of the original Kp data. Differences are regularly seen in the discriminating statistics a few years prior to solar minima, while no differences are apparent at the time of solar maxima. These results suggest that the dynamics of the magnetosphere tend to be more linear at solar maximum than at solar minimum. The strong nonlinear dependencies tend to peak on a timescale of around 40-50 hours and are statistically significant up to 1 week. Because the solar wind driver variables, VBs and dynamic pressure, exhibit a much shorter decorrelation time for nonlinearities, the results seem to indicate that the nonlinearity is related to internal magnetospheric dynamics. Moreover, the timescales for the nonlinearity seem to be on the same order as that for storm/ring current relaxation. We suggest that the strong solar wind driving that occurs around solar maximum dominates the magnetospheric dynamics, suppressing the internal magnetospheric nonlinearity. On the other hand, in the descending phase of the solar cycle just prior to solar minimum, when magnetospheric activity is weaker, the dynamics exhibit a significant nonlinear internal magnetospheric response that may be related to increased solar wind speed.
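The surrogate-data logic used here, computing a nonlinear discriminating statistic on the data and comparing it with the distribution of the same statistic over linear surrogates, can be sketched compactly. The example below is a hedged stand-in: it uses a binned delayed mutual-information estimate and ordinary phase-randomized Fourier surrogates rather than the CAAFT construction, and the weakly nonlinear test series, the lag, and the bin count are arbitrary assumptions.

```python
# Hedged illustration of a surrogate-data test for nonlinearity, using a simple
# binned delayed mutual information and phase-randomized (FT) surrogates as a
# stand-in for the CAAFT method; the test signal is a synthetic placeholder.
import numpy as np

rng = np.random.default_rng(3)

def mutual_information(x, y, bins=16):
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    nz = pxy > 0
    return np.sum(pxy[nz] * np.log(pxy[nz] / (px[:, None] * py[None, :])[nz]))

def ft_surrogate(x):
    # Keep the amplitude spectrum (linear properties), randomize the phases.
    spec = np.fft.rfft(x)
    phases = rng.uniform(0, 2 * np.pi, spec.size)
    phases[0] = phases[-1] = 0.0
    return np.fft.irfft(np.abs(spec) * np.exp(1j * phases), n=x.size)

# Weakly nonlinear autoregressive test series (a placeholder, not Kp data).
n, lag = 5000, 40
x = np.zeros(n)
for t in range(2, n):
    x[t] = 0.8 * x[t - 1] - 0.2 * x[t - 2] + 0.3 * np.tanh(x[t - 1]) ** 2 + rng.standard_normal()

mi_data = mutual_information(x[:-lag], x[lag:])
mi_surr = [mutual_information(s[:-lag], s[lag:]) for s in (ft_surrogate(x) for _ in range(50))]
print("delayed MI of data:", mi_data)
print("surrogate 95th percentile:", np.percentile(mi_surr, 95))
```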
Emergency Airway Response Team Simulation Training: A Nursing Perspective.
Crimlisk, Janet T; Krisciunas, Gintas P; Grillone, Gregory A; Gonzalez, R Mauricio; Winter, Michael R; Griever, Susan C; Fernandes, Eduarda; Medzon, Ron; Blansfield, Joseph S; Blumenthal, Adam
Simulation-based education is an important tool in the training of professionals in the medical field, especially for low-frequency, high-risk events. An interprofessional simulation-based training program was developed to enhance Emergency Airway Response Team (EART) knowledge, team dynamics, and personnel confidence. This quality improvement study evaluated the EART simulation training results of nurse participants. Twenty-four simulation-based classes of 4-hour sessions were conducted during a 12-week period. Sixty-three nurses from the emergency department (ED) and the intensive care units (ICUs) completed the simulation. Participants were evaluated before and after the simulation program with a knowledge-based test and a team dynamics and confidence questionnaire. Additional comparisons were made between ED and ICU nurses and between nurses with previous EART experience and those without previous EART experience. Comparison of presimulation (presim) and postsimulation (postsim) results indicated a statistically significant gain in both team dynamics and confidence and Knowledge Test scores (P < .01). There were no differences in scores between ED and ICU groups in presim or postsim scores; nurses with previous EART experience demonstrated significantly higher presim scores than nurses without EART experience, but there were no differences between these nurse groups at postsim. This project supports the use of simulation training to increase nurses' knowledge, confidence, and team dynamics in an EART response. Importantly, nurses with no previous experience achieved outcome scores similar to nurses who had experience, suggesting that emergency airway simulation is an effective way to train both new and experienced nurses.
Crookes, D J; Blignaut, J N; de Wit, M P; Esler, K J; Le Maitre, D C; Milton, S J; Mitchell, S A; Cloete, J; de Abreu, P; Fourie nee Vlok, H; Gull, K; Marx, D; Mugido, W; Ndhlovu, T; Nowell, M; Pauw, M; Rebelo, A
2013-05-15
Can markets assist by providing support for ecological restoration, and if so, under what conditions? The first step in addressing this question is to develop a consistent methodology for economic evaluation of ecological restoration projects. A risk analysis process was followed in which a system dynamics model was constructed for eight diverse case study sites where ecological restoration is currently being pursued. Restoration costs vary across each of these sites, as do the benefits associated with restored ecosystem functioning. The system dynamics model simulates the ecological, hydrological and economic benefits of ecological restoration and informs a portfolio mapping exercise where payoffs are matched against the likelihood of success of a project, as well as a number of other factors (such as project costs and risk measures). This is the first known application that couples ecological restoration with system dynamics and portfolio mapping. The results suggest an approach that is able to move beyond traditional indicators of project success, since the effect of discounting is virtually eliminated. We conclude that system dynamics modelling with portfolio mapping can guide decisions on when markets for restoration activities may be feasible. Copyright © 2013 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Billings, Paul H.
This instructional guide, one of a series developed by the Technical Education Advancement Modules (TEAM) project, is a 6-hour introductory module on statistical process control (SPC), designed to develop competencies in the following skill areas: (1) identification of the three classes of SPC use; (2) understanding a process and how it works; (3)…
Semiconductor Laser Complex Dynamics: From Optical Neurons to Optical Rogue Waves
2017-02-11
The aims of the project were i) to study stochastic phenomena and ii) to exploit the laser dynamics for innovative applications. The results of the project were published in 5 articles in high-impact journals in the fields of photonics and nonlinear physics and were presented at conferences, including as invited contributions.
The GenABEL Project for statistical genomics.
Karssen, Lennart C; van Duijn, Cornelia M; Aulchenko, Yurii S
2016-01-01
Development of free/libre open source software is usually done by a community of people with an interest in the tool. For scientific software, however, this is less often the case. Most scientific software is written by only a few authors, often a student working on a thesis. Once the paper describing the tool has been published, the tool is no longer developed further and is left to its own devices. Here we describe the broad, multidisciplinary community we formed around a set of tools for statistical genomics. The GenABEL project for statistical omics actively promotes open interdisciplinary development of statistical methodology and its implementation in efficient and user-friendly software under an open source licence. The software tools developed within the project collectively make up the GenABEL suite, which currently consists of eleven tools. The open framework of the project actively encourages involvement of the community in all stages, from formulation of methodological ideas to application of software to specific data sets. A web forum is used to channel user questions and discussions, further promoting the use of the GenABEL suite. Developer discussions take place on a dedicated mailing list, and development is further supported by robust development practices including use of public version control, code review and continuous integration. Use of this open science model attracts contributions from users and developers outside the "core team", facilitating agile statistical omics methodology development and fast dissemination.
Statistical Analysis of Large Simulated Yield Datasets for Studying Climate Effects
NASA Technical Reports Server (NTRS)
Makowski, David; Asseng, Senthold; Ewert, Frank; Bassu, Simona; Durand, Jean-Louis; Martre, Pierre; Adam, Myriam; Aggarwal, Pramod K.; Angulo, Carlos; Baron, Chritian;
2015-01-01
Many studies have been carried out during the last decade to study the effect of climate change on crop yields and other key crop characteristics. In these studies, one or several crop models were used to simulate crop growth and development for different climate scenarios that correspond to different projections of atmospheric CO2 concentration, temperature, and rainfall changes (Semenov et al., 1996; Tubiello and Ewert, 2002; White et al., 2011). The Agricultural Model Intercomparison and Improvement Project (AgMIP; Rosenzweig et al., 2013) builds on these studies with the goal of using an ensemble of multiple crop models in order to assess effects of climate change scenarios for several crops in contrasting environments. These studies generate large datasets, including thousands of simulated crop yield data. They include series of yield values obtained by combining several crop models with different climate scenarios that are defined by several climatic variables (temperature, CO2, rainfall, etc.). Such datasets potentially provide useful information on the possible effects of different climate change scenarios on crop yields. However, it is sometimes difficult to analyze these datasets and to summarize them in a useful way due to their structural complexity; simulated yield data can differ among contrasting climate scenarios, sites, and crop models. Another issue is that it is not straightforward to extrapolate the results obtained for the scenarios to alternative climate change scenarios not initially included in the simulation protocols. Additional dynamic crop model simulations for new climate change scenarios are an option but this approach is costly, especially when a large number of crop models are used to generate the simulated data, as in AgMIP. Statistical models have been used to analyze responses of measured yield data to climate variables in past studies (Lobell et al., 2011), but the use of a statistical model to analyze yields simulated by complex process-based crop models is a rather new idea. We demonstrate herewith that statistical methods can play an important role in analyzing simulated yield data sets obtained from the ensembles of process-based crop models. Formal statistical analysis is helpful to estimate the effects of different climatic variables on yield, and to describe the between-model variability of these effects.
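The role the authors propose for statistics, fitting a response model to ensembles of simulated yields so that the effects of climatic variables can be summarized and extrapolated to scenarios not in the protocol, can be illustrated with a toy emulator. Everything below (the response surface, variable ranges, and coefficient values) is synthetic and assumed for illustration; it is not AgMIP data or the authors' model.

```python
# Hedged sketch of a statistical emulator fitted to simulated yields as a
# function of climate drivers, then applied to a scenario that was not simulated.
import numpy as np

rng = np.random.default_rng(4)
n_runs = 300
d_temp = rng.uniform(0, 6, n_runs)          # warming relative to baseline (K)
co2 = rng.uniform(360, 800, n_runs)         # atmospheric CO2 (ppm)
d_rain = rng.uniform(-30, 30, n_runs)       # rainfall change (%)

# "Simulated" yields: an assumed response surface plus crop-model spread.
yield_sim = (8.0 - 0.5 * d_temp + 0.004 * (co2 - 360) + 0.03 * d_rain
             + rng.normal(0, 0.5, n_runs))

X = np.column_stack([np.ones(n_runs), d_temp, co2 - 360, d_rain])
beta, *_ = np.linalg.lstsq(X, yield_sim, rcond=None)
print("estimated effects (intercept, per K, per ppm, per % rain):", beta.round(3))

# Interpolate to a scenario not in the original protocol: +3.5 K, 550 ppm, -10% rain.
new_scenario = np.array([1.0, 3.5, 550 - 360, -10.0])
print("emulated yield for the new scenario:", new_scenario @ beta)
```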
Statistical Challenges in Military Research
2016-07-30
The University of Tennessee Health Science Center currently has five NIH/DOD funded grant projects addressing tobacco, alcohol abuse, and obesity prevention in... Presented to the American Statistical Association (Section on Defense and National Security), Joint Statistical Meetings, Chicago, IL, August 2016.
Response statistics of rotating shaft with non-linear elastic restoring forces by path integration
NASA Astrophysics Data System (ADS)
Gaidai, Oleg; Naess, Arvid; Dimentberg, Michael
2017-07-01
Extreme statistics of random vibrations are studied for a Jeffcott rotor under uniaxial white noise excitation. The restoring force is modelled as non-linear elastic; a comparison is made with a linearized restoring force to assess the effect of the non-linearity on the response statistics. While analytical solutions and stability conditions are available for the linear model, this is generally not the case for the non-linear system, except in some special cases. The statistics of the non-linear case are studied by applying the path integration (PI) method, which is based on the Markov property of the coupled dynamic system. The Jeffcott rotor response statistics can be obtained by solving the Fokker-Planck (FP) equation of the 4D dynamic system. An efficient implementation of the PI algorithm is applied, in which the fast Fourier transform (FFT) is used to simulate the dynamic system's additive noise; this allows a significant reduction in computational time compared to the classical PI. The excitation is modelled as Gaussian white noise; however, white noise with any distribution can be implemented with the same PI technique. Multidirectional Markov noise can also be modelled with PI in the same way as unidirectional noise. PI is accelerated by using a Monte Carlo (MC) estimated joint probability density function (PDF) as the initial input. The symmetry of the dynamic system was utilized to afford higher mesh resolution. Both internal (rotating) and external damping are included in the mechanical model of the rotor. The main advantage of using PI rather than MC is that PI offers high accuracy in the probability distribution tail. The latter is of critical importance for, e.g., extreme value statistics, system reliability, and first passage probability.
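As context for the Monte Carlo seeding step mentioned above, the sketch below runs a plain Euler-Maruyama Monte Carlo simulation of a simplified planar rotor-like oscillator with a cubic (non-linear elastic) restoring force under independent white-noise excitation and reports response statistics. It is a hedged stand-in, not the paper's path-integration algorithm or its full Jeffcott rotor model (rotating damping, for instance, is omitted), and all parameter values are assumptions.

```python
# Hedged Monte Carlo sketch: Euler-Maruyama simulation of a planar oscillator
# with a cubic radial restoring force under 2-D white-noise excitation.
import numpy as np

rng = np.random.default_rng(5)
zeta, eps, sigma = 0.05, 0.1, 0.5       # damping ratio, cubic stiffness, noise intensity
dt, n_steps, n_paths = 2e-3, 50000, 2000

x = np.zeros((n_paths, 2))              # displacements (x, y)
v = np.zeros((n_paths, 2))              # velocities
for _ in range(n_steps):
    r2 = (x ** 2).sum(axis=1, keepdims=True)
    restoring = -(1.0 + eps * r2) * x   # linear + cubic radial restoring force
    dW = rng.standard_normal((n_paths, 2)) * np.sqrt(dt)
    v += (restoring - 2.0 * zeta * v) * dt + sigma * dW
    x += v * dt

radius = np.linalg.norm(x, axis=1)
print("mean radius:", radius.mean(), "99.9th percentile:", np.quantile(radius, 0.999))
```

Note that a brute-force estimate like this becomes expensive precisely in the distribution tail, which is the regime where the paper argues path integration is preferable.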
Monitoring and Evaluation: Statistical Support for Life-cycle Studies, Annual Report 2003.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Skalski, John
2003-11-01
The ongoing mission of this project is the development of statistical tools for analyzing fisheries tagging data in the most precise and appropriate manner possible. This mission also includes providing statistical guidance on the best ways to design large-scale tagging studies. This mission continues because the technologies for conducting fish tagging studies continuously evolve. In just the last decade, fisheries biologists have seen the evolution from freeze-brands and coded wire tags (CWT) to passive integrated transponder (PIT) tags, balloon-tags, radiotelemetry, and now, acoustic-tags. With each advance, the technology holds the promise of more detailed and precise information. However, the technology for analyzing and interpreting the data also becomes more complex as the tagging techniques become more sophisticated. The goal of the project is to develop the analytical tools in parallel with the technical advances in tagging studies, so that maximum information can be extracted on a timely basis. Associated with this mission is the transfer of these analytical capabilities to the field investigators to assure consistency and the highest levels of design and analysis throughout the fisheries community. Consequently, this project provides detailed technical assistance on the design and analysis of tagging studies to groups requesting assistance throughout the fisheries community. Ideally, each project and each investigator would invest in the statistical support needed for the successful completion of their study. However, this is an ideal that is rarely if ever attained. Furthermore, there is only a small pool of highly trained scientists in this specialized area of tag analysis here in the Northwest. Project 198910700 provides the financial support to sustain this local expertise on the statistical theory of tag analysis at the University of Washington and make it available to the fisheries community. Piecemeal and fragmented support from various agencies and organizations would be incapable of maintaining a center of expertise. The mission of the project is to help assure tagging studies are designed and analyzed from the onset to extract the best available information using state-of-the-art statistical methods. The overarching goal of the project is to assure statistically sound survival studies so that fish managers can focus on the management implications of their findings and not be distracted by concerns about whether the studies are statistically reliable or not. Specific goals and objectives of the study include the following: (1) Provide consistent application of statistical methodologies for survival estimation across all salmon life cycle stages to assure comparable performance measures and assessment of results through time, to maximize learning and adaptive management opportunities, and to improve and maintain the ability to responsibly evaluate the success of implemented Columbia River FWP salmonid mitigation programs and identify future mitigation options. (2) Improve analytical capabilities to conduct research on survival processes of wild and hatchery chinook and steelhead during smolt outmigration, to improve monitoring and evaluation capabilities and assist in-season river management to optimize operational and fish passage strategies to maximize survival. (3) Extend statistical support to estimate ocean survival and in-river survival of returning adults. Provide statistical guidance in implementing a river-wide adult PIT-tag detection capability.
(4) Develop statistical methods for survival estimation for all potential users and make this information available through peer-reviewed publications, statistical software, and technology transfers to organizations such as NOAA Fisheries, the Fish Passage Center, US Fish and Wildlife Service, US Geological Survey (USGS), US Army Corps of Engineers (USACE), Public Utility Districts (PUDs), the Independent Scientific Advisory Board (ISAB), and other members of the Northwest fisheries community. (5) Provide and maintain statistical software for tag analysis and user support. (6) Provide improvements in statistical theory and software as requested by user groups. These improvements include extending software capabilities to address new research issues, adapting tagging techniques to new study designs, and extending the analysis capabilities to new technologies such as radio-tags and acoustic-tags.
Testing for significance of phase synchronisation dynamics in the EEG.
Daly, Ian; Sweeney-Reed, Catherine M; Nasuto, Slawomir J
2013-06-01
A number of tests exist to check for statistical significance of phase synchronisation within the Electroencephalogram (EEG); however, the majority suffer from a lack of generality and applicability. They may also fail to account for temporal dynamics in the phase synchronisation, regarding synchronisation as a constant state instead of a dynamical process. Therefore, a novel test is developed for identifying the statistical significance of phase synchronisation based upon a combination of work characterising temporal dynamics of multivariate time-series and Markov modelling. We show how this method is better able to assess the significance of phase synchronisation than a range of commonly used significance tests. We also show how the method may be applied to identify and classify significantly different phase synchronisation dynamics in both univariate and multivariate datasets.
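As a point of comparison for the dynamical test proposed in the paper, the snippet below shows the common baseline it improves on: a static phase-locking value (PLV) computed from Hilbert phases and tested against time-shift surrogates. The synthetic 10 Hz "channels", sampling rate, and surrogate count are assumptions chosen for illustration only.

```python
# Hedged baseline: static phase-locking value (PLV) between two synthetic signals,
# with significance assessed against circular time-shift surrogates.
import numpy as np
from scipy.signal import hilbert

rng = np.random.default_rng(6)
fs = 250                                  # sampling rate (Hz), assumed
t = np.arange(0, 20, 1 / fs)              # 20 s of data

# Two noisy, partially phase-locked 10 Hz signals (stand-ins for EEG channels).
x = np.sin(2 * np.pi * 10 * t) + 0.8 * rng.standard_normal(t.size)
y = np.sin(2 * np.pi * 10 * t + 0.5) + 0.8 * rng.standard_normal(t.size)

def plv(a, b):
    dphi = np.angle(hilbert(a)) - np.angle(hilbert(b))
    return np.abs(np.mean(np.exp(1j * dphi)))

observed = plv(x, y)
# Surrogates: circularly shift one channel to destroy any consistent phase relation.
surr = [plv(x, np.roll(y, rng.integers(fs, t.size - fs))) for _ in range(200)]
print("PLV:", observed, "surrogate 95th percentile:", np.percentile(surr, 95))
```

Unlike the Markov-based approach described above, this baseline treats synchronisation as a single constant level over the whole recording, which is exactly the limitation the paper addresses.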
Learning predictive statistics from temporal sequences: Dynamics and strategies.
Wang, Rui; Shen, Yuan; Tino, Peter; Welchman, Andrew E; Kourtzi, Zoe
2017-10-01
Human behavior is guided by our expectations about the future. Often, we make predictions by monitoring how event sequences unfold, even though such sequences may appear incomprehensible. Event structures in the natural environment typically vary in complexity, from simple repetition to complex probabilistic combinations. How do we learn these structures? Here we investigate the dynamics of structure learning by tracking human responses to temporal sequences that change in structure unbeknownst to the participants. Participants were asked to predict the upcoming item following a probabilistic sequence of symbols. Using a Markov process, we created a family of sequences, from simple frequency statistics (e.g., some symbols are more probable than others) to context-based statistics (e.g., symbol probability is contingent on preceding symbols). We demonstrate the dynamics with which individuals adapt to changes in the environment's statistics-that is, they extract the behaviorally relevant structures to make predictions about upcoming events. Further, we show that this structure learning relates to individual decision strategy; faster learning of complex structures relates to selection of the most probable outcome in a given context (maximizing) rather than matching of the exact sequence statistics. Our findings provide evidence for alternate routes to learning of behaviorally relevant statistics that facilitate our ability to predict future events in variable environments.
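The sequence design and the two response strategies discussed above (probability maximising versus matching) can be reproduced in miniature. The transition matrix, sequence length, and strategy implementations below are illustrative assumptions, not the experiment's actual materials.

```python
# Hedged toy version of context-based sequence statistics and two response
# strategies: always choose the most probable next symbol ("maximising") versus
# sampling responses in proportion to the conditional probabilities ("matching").
import numpy as np

rng = np.random.default_rng(7)
P = np.array([[0.8, 0.2],      # P(next symbol | current symbol), assumed values
              [0.3, 0.7]])

n = 20000
seq = np.zeros(n, dtype=int)
for t in range(1, n):
    seq[t] = rng.choice(2, p=P[seq[t - 1]])

context, target = seq[:-1], seq[1:]
maximiser = P[context].argmax(axis=1)                              # deterministic best guess
matcher = (rng.random(context.size) < P[context, 1]).astype(int)   # probability matching

print("maximising accuracy:", (maximiser == target).mean())        # higher on average
print("matching accuracy:  ", (matcher == target).mean())
```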
ERIC Educational Resources Information Center
Tuck, Sarah
2014-01-01
This article reflects critically on "The Social Dynamics of Art Research: Contemporary Photography in Belfast", an engaged research project conducted with photographers, community activists, academics and visual artists in Belfast. Through a critical examination of the project's theoretical architecture and methodological framework this…
Space-time least-squares Petrov-Galerkin projection in nonlinear model reduction.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Choi, Youngsoo; Carlberg, Kevin Thomas
Our work proposes a space-time least-squares Petrov-Galerkin (ST-LSPG) projection method for model reduction of nonlinear dynamical systems. In contrast to typical nonlinear model-reduction methods that first apply Petrov-Galerkin projection in the spatial dimension and subsequently apply time integration to numerically resolve the resulting low-dimensional dynamical system, the proposed method applies projection in space and time simultaneously. To accomplish this, the method first introduces a low-dimensional space-time trial subspace, which can be obtained by computing tensor decompositions of state-snapshot data. The method then computes discrete-optimal approximations in this space-time trial subspace by minimizing the residual arising after time discretization over all space and time in a weighted ℓ2-norm. This norm can be defined to enable complexity reduction (i.e., hyper-reduction) in time, which leads to space-time collocation and space-time GNAT variants of the ST-LSPG method. Advantages of the approach relative to typical spatial-projection-based nonlinear model reduction methods such as Galerkin projection and least-squares Petrov-Galerkin projection include: (1) a reduction of both the spatial and temporal dimensions of the dynamical system, (2) the removal of spurious temporal modes (e.g., unstable growth) from the state space, and (3) error bounds that exhibit slower growth in time. Numerical examples performed on model problems in fluid dynamics demonstrate the ability of the method to generate orders-of-magnitude computational savings relative to spatial-projection-based reduced-order models without sacrificing accuracy.
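To make the "space-time trial subspace" idea concrete, the sketch below builds such a subspace from training trajectories using a plain SVD of vectorized space-time snapshots, a simplified stand-in for the tensor decompositions mentioned in the abstract; the synthetic data and dimensions are assumptions, and the residual-minimization (LSPG) step itself is not shown.

```python
# Hedged sketch of building a low-dimensional *space-time* trial subspace from
# training data; the basis reduces an entire trajectory, not just a spatial state.
import numpy as np

rng = np.random.default_rng(12)
n_space, n_time, n_train = 200, 50, 15           # spatial DOFs, time steps, training runs

# Each training simulation contributes one space-time snapshot, flattened to a vector.
training = rng.standard_normal((n_space * n_time, n_train))

# Space-time trial basis: leading left singular vectors of the snapshot matrix.
U, s, _ = np.linalg.svd(training, full_matrices=False)
r = 5
Phi = U[:, :r]                                    # columns span the space-time trial subspace

# A new trajectory is represented by r coefficients instead of n_space*n_time values.
x_new = rng.standard_normal(n_space * n_time)
coeffs = Phi.T @ x_new                            # least-squares coefficients (Phi is orthonormal)
x_approx = (Phi @ coeffs).reshape(n_space, n_time)
print("space-time reduction:", n_space * n_time, "->", r, "coefficients")
```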
77 FR 62517 - Proposed Data Collections Submitted for Public Comment and Recommendations
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-15
...-based vital statistics at the national level, referred to as the U.S. National Vital Statistics System... days of this notice. Proposed Project Vital Statistics Training Application, OMB No. 0920-0217--Revision exp. 5/31/2013--National Center for Health Statistics (NCHS), Centers for Disease Control and...
75 FR 15709 - Agency Forms Undergoing Paperwork Reduction Act Review
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-30
... statistics at the national level, referred to as the U.S. National Vital Statistics System (NVSS), depends on.... Proposed Project Vital Statistics Training Application (OMB No. 0920-0217 exp. 7/31/ 2010)--Extension--National Center for Health Statistics (NCHS), Centers for Disease Control and Prevention (CDC). Background...
Play It Again: Teaching Statistics with Monte Carlo Simulation
ERIC Educational Resources Information Center
Sigal, Matthew J.; Chalmers, R. Philip
2016-01-01
Monte Carlo simulations (MCSs) provide important information about statistical phenomena that would be impossible to assess otherwise. This article introduces MCS methods and their applications to research and statistical pedagogy using a novel software package for the R Project for Statistical Computing constructed to lessen the often steep…
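As a flavour of the kind of exercise the article advocates (shown here in Python rather than the article's R package), the sketch below uses Monte Carlo simulation to estimate the empirical coverage of a nominal 95% t-interval when the data are actually skewed; the sample size, distribution, and replication count are arbitrary teaching choices.

```python
# Hedged teaching example: Monte Carlo estimate of confidence-interval coverage
# for a nominal 95% t-interval applied to skewed (exponential) data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
n, reps, covered = 15, 5000, 0

for _ in range(reps):
    sample = rng.exponential(scale=1.0, size=n)   # true mean is 1.0
    lo, hi = stats.t.interval(0.95, df=n - 1,
                              loc=sample.mean(),
                              scale=stats.sem(sample))
    covered += (lo <= 1.0 <= hi)

print("empirical coverage:", covered / reps)      # typically noticeably below 0.95 here
```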
NASA Astrophysics Data System (ADS)
Oktavia, Y.
2018-03-01
This research aims to: (1) analyze the level of socio-cultural dynamics of aquaculture agribusiness actors, and (2) analyze the influence of socio-cultural dynamics on convergence communication for capacity development of aquaculture agribusiness actors. Data were collected by questionnaire and interviews of group members in agribusiness. Data analysis was done with descriptive and inferential statistics using the SEM method. The results of descriptive statistics on 284 agribusiness members showed that the socio-cultural dynamics of aquaculture agribusiness actors were in the low category, as shown by the weak role of customary institutions and the quality of local leadership. The communication convergence is significantly and positively influenced by the communication behavior of agribusiness actors in accessing information.
NASA Astrophysics Data System (ADS)
Koparan, Timur; Güven, Bülent
2015-07-01
The point of this study is to define the effect of the project-based learning approach on 8th grade secondary-school students' statistical literacy levels for data representation. To achieve this goal, a test consisting of 12 open-ended questions was developed in accordance with the views of experts. Seventy 8th grade secondary-school students, 35 in the experimental group and 35 in the control group, took this test twice, once before the application and once after the application. All the raw scores were converted into linear scores using the Winsteps 3.72 modelling program, which performs Rasch analysis, and t-tests and an ANCOVA were carried out on the linear scores. Based on the findings, it was concluded that the project-based learning approach increases students' level of statistical literacy for data representation. Students' levels of statistical literacy before and after the application were shown through the obtained person-item maps.
Use of System Dynamics Modeling in Medical Education and Research Projects.
Bozikov, Jadranka; Relic, Danko; Dezelic, Gjuro
2018-01-01
The paper reviews experiences and accomplishments in the application of system dynamics modeling in education, training and research projects at the Andrija Stampar School of Public Health, a branch of the Zagreb University School of Medicine, Croatia. A number of simulation models developed over the past 40 years are briefly described with regard to the real problems concerned, the objectives, and the modeling methods and techniques used. Many of them have been developed as individual students' projects as part of their graduation, MSc or PhD theses and subsequently published in journals or conference proceedings. Some of them were later used in teaching and simulation training. System dynamics modeling proved to be not only a powerful method for research and decision making but also a useful tool in medical and nursing education, enabling better understanding of dynamic systems' behavior.
Appendix A, Plan Projects as amended for Financial Constraint
DOT National Transportation Integrated Search
1995-04-13
The Ohio-Kentucky-Indiana Regional Council of Governments (OKI) provides coordinated regional transportation planning for an eight-county area. This document contains tables showing, by county, statistical data on road project projections for o...
Kopp-Schneider, Annette; Prieto, Pilar; Kinsner-Ovaskainen, Agnieszka; Stanzel, Sven
2013-06-01
In the framework of toxicology, a testing strategy can be viewed as a series of steps which are taken to come to a final prediction about a characteristic of a compound under study. The testing strategy is performed as a single-step procedure, usually called a test battery, using simultaneously all information collected on different endpoints, or as a tiered approach in which a decision tree is followed. Design of a testing strategy involves statistical considerations, such as the development of a statistical prediction model. During the EU FP6 ACuteTox project, several prediction models were proposed on the basis of statistical classification algorithms, which we illustrate here. The final choice of testing strategies was not based on statistical considerations alone. However, without thorough statistical evaluations a testing strategy cannot be identified. We present here a number of observations made from the statistical viewpoint which relate to the development of testing strategies. The points we make were derived from problems we had to deal with during the evaluation of this large research project. A central issue during the development of a prediction model is the danger of overfitting. Procedures are presented to deal with this challenge. Copyright © 2012 Elsevier Ltd. All rights reserved.
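The overfitting danger mentioned at the end is easy to demonstrate: a flexible classifier evaluated on its own training data can look nearly perfect even when the endpoints carry no signal. The sketch below is a generic illustration with simulated endpoint data and cross-validation as the safeguard; it does not reproduce the ACuteTox models or data.

```python
# Hedged demonstration of overfitting: resubstitution accuracy versus
# cross-validated accuracy on pure-noise "endpoint" data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(9)
n_compounds, n_endpoints = 80, 30
X = rng.standard_normal((n_compounds, n_endpoints))   # simulated in vitro endpoint results
y = rng.integers(0, 2, n_compounds)                   # toxicity class labels (pure noise here)

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
print("apparent (resubstitution) accuracy:", clf.score(X, y))                # close to 1.0
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())  # near chance (~0.5)
```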
VALUE - Validating and Integrating Downscaling Methods for Climate Change Research
NASA Astrophysics Data System (ADS)
Maraun, Douglas; Widmann, Martin; Benestad, Rasmus; Kotlarski, Sven; Huth, Radan; Hertig, Elke; Wibig, Joanna; Gutierrez, Jose
2013-04-01
Our understanding of global climate change is mainly based on General Circulation Models (GCMs) with a relatively coarse resolution. Since climate change impacts are mainly experienced on regional scales, high-resolution climate change scenarios need to be derived from GCM simulations by downscaling. Several projects have been carried out over recent years to validate the performance of statistical and dynamical downscaling, yet several aspects have not been systematically addressed: variability on sub-daily, decadal and longer time-scales, extreme events, spatial variability and inter-variable relationships. Different downscaling approaches such as dynamical downscaling, statistical downscaling and bias correction approaches have not been systematically compared. Furthermore, collaboration between different communities, in particular regional climate modellers, statistical downscalers and statisticians, has been limited. To address these gaps, the EU Cooperation in Science and Technology (COST) action VALUE (www.value-cost.eu) has been brought to life. VALUE is a research network with participants from currently 23 European countries, running from 2012 to 2015. Its main aim is to systematically validate and develop downscaling methods for climate change research in order to improve regional climate change scenarios for use in climate impact studies. Inspired by the co-design idea of the international research initiative "future earth", stakeholders of climate change information have been involved in the definition of research questions to be addressed and are actively participating in the network. The key idea of VALUE is to identify the relevant weather and climate characteristics required as input for a wide range of impact models and to define an open framework to systematically validate these characteristics. Based on a range of benchmark data sets, in principle every downscaling method can be validated and compared with competing methods. The results of this exercise will directly provide end users with important information about the uncertainty of regional climate scenarios, and will furthermore provide the basis for further developing downscaling methods. This presentation will provide background information on VALUE and discuss the identified characteristics and the validation framework.
Novel electrolytes for use in new and improved batteries: An NMR study
NASA Astrophysics Data System (ADS)
Berman, Marc B.
This thesis focuses on the use of nuclear magnetic resonance (NMR) spectroscopy to study materials for use as electrolytes in batteries. The details of four projects are described in this thesis, as well as a brief theoretical background of NMR. Structural and dynamic properties were determined using several NMR techniques, such as static, MAS, PFG diffusion, and relaxation measurements, to understand microscopic and macroscopic properties of the materials described within. The nuclei investigated were 1H, 2H, 7Li, 13C, 19F, 23Na, and 27Al. The first project focuses on an exciting new material to be used as a solid electrolyte membrane. The second project focuses on the dynamics of ionic liquid-solvent mixtures and their comparison to molecular dynamics computer simulations. The third project involves a solvent-free film containing NaTFSI salt mixed into PEO for use in sodium-ion batteries. The final project focuses on a composite electrolyte consisting of a ceramic and solid: LiI:PEO:LiAlO2.
MulVAL Extensions for Dynamic Asset Protection
2006-04-01
A commercial product called Skybox Security and an AI-based project called CycSecure were identified as interesting and relatively mature projects which deserve closer examination with respect to the requirements of a dynamic asset protection solution. A critique of the Skybox Security and CycSecure solutions, with respect to the requirements of dynamic asset protection, is provided.
1987-06-15
General Dynamics Fort Worth Division Industrial Technology Modernization Program, Phase 2 Final Project Report (Honeywell, June 15, 1987). Table-of-contents entries visible in the record include: System/Equipment/Machining Specifications; Vendor/Industry Analysis Findings; MIS Requirements/Improvements; Cost Benefit Analysis; Implementation.
Measuring contemporary crustal motions; NASA’s Crustal Dynamics Project
Frey, H. V.; Bosworth, J. M.
1988-01-01
In this article we describe briefly the two space geodetic techniques and how they are used by the Crustal Dynamics Project, show some of the very exciting results that have emerged at the halfway point in the project's life, describe the availability and utilization of the data being collected, and consider what the future may hold when measurement accuracies eventually exceed even those now available and when other international groups become more heavily involved.
Johnson, Christine M; Sullivan, Jess; Buck, Cara L; Trexel, Julie; Scarpuzzi, Mike
2015-01-01
Anticipating the location of a temporarily obscured target-what Piaget (the construction of reality in the child. Basic Books, New York, 1954) called "object permanence"-is a critical skill, especially in hunters of mobile prey. Previous research with bottlenose dolphins found they could predict the location of a target that had been visibly displaced into an opaque container, but not one that was first placed in an opaque container and then invisibly displaced to another container. We tested whether, by altering the task to involve occlusion rather than containment, these animals could show more advanced object permanence skills. We projected dynamic visual displays at an underwater-viewing window and videotaped the animals' head moves while observing these displays. In Experiment 1, the animals observed a small black disk moving behind occluders that shifted in size, ultimately forming one large occluder. Nine out of ten subjects "tracked" the presumed movement of the disk behind this occluder on their first trial-and in a statistically significant number of subsequent trials-confirming their visible displacement abilities. In Experiment 2, we tested their invisible displacement abilities. The disk first disappeared behind a pair of moving occluders, which then moved behind a stationary occluder. The moving occluders then reappeared and separated, revealing that the disk was no longer behind them. The subjects subsequently looked to the correct stationary occluder on eight of their ten first trials, and in a statistically significant number of subsequent trials. Thus, by altering the stimuli to be more ecologically valid, we were able to show that the dolphins could indeed succeed at an invisible displacement task.
Wu, Junsong; Du, Junhua; Jiang, Xiangyun; Wang, Quan; Li, Xigong; Du, Jingyu; Lin, Xiangjin
2014-06-17
The objective was to explore the changes of range of motion (ROM) in patients with degenerative lumbar disease treated with the WavefleX dynamic stabilization system and to examine the postoperative regularity and tendency of lumbar ROM. Nine patients with degenerative lumbar disease treated with the WavefleX dynamic stabilization system were followed up with respect to ROMs at 5 timepoints within 12 months. ROM was recorded for instrumented segments, adjacent segments and the total lumbar spine. Compared with preoperative values, ROMs in non-fusion segments with the WavefleX dynamic stabilization system decreased statistically significantly (P < 0.05 or P < 0.01) at different timepoints; ROMs in adjacent segments increased at some levels without broad statistical significance, the exception being L3/4 alone at Month 12 (P < 0.05). Versus the control group, ROMs at the levels of L3/4, L4/5 and L5/S1 decreased at Months 6 and 12 with broad statistical significance (P < 0.05 or P < 0.01). ROMs of the total lumbar spine showed statistically significant decreases (P < 0.01) in both the group of non-fusion segments and the hybrid non-fusion and fusion group. Trends of continuous increase were observed during follow-up, and statistically significant increases were also found at 4 timepoints compared to the control group (P < 0.01). The treatment of degenerative lumbar diseases with the WavefleX dynamic stabilization system may limit excessive extension/inflexion and preserve some motor function. Moreover, it can sustain physiological lordosis and decrease and transfer disc load in adjacent segments to prevent early degeneration of adjacent segments. Trends of motor function increase in the total lumbar spine need to be confirmed in future long-term follow-up.
Samuel A. Cushman; Kevin S. McKelvey
2006-01-01
The primary weakness in our current ability to evaluate future landscapes in terms of wildlife lies in the lack of quantitative models linking wildlife to forest stand conditions, including fuels treatments. This project focuses on 1) developing statistical wildlife habitat relationships models (WHR) utilizing Forest Inventory and Analysis (FIA) and National Vegetation...
ERIC Educational Resources Information Center
Koparan, Timur; Güven, Bülent
2015-01-01
The point of this study is to define the effect of project-based learning approach on 8th Grade secondary-school students' statistical literacy levels for data representation. To achieve this goal, a test which consists of 12 open-ended questions in accordance with the views of experts was developed. Seventy 8th grade secondary-school students, 35…
Tsallis statistics and neurodegenerative disorders
NASA Astrophysics Data System (ADS)
Iliopoulos, Aggelos C.; Tsolaki, Magdalini; Aifantis, Elias C.
2016-08-01
In this paper, we perform statistical analysis of time series deriving from four neurodegenerative disorders, namely epilepsy, amyotrophic lateral sclerosis (ALS), Parkinson's disease (PD), Huntington's disease (HD). The time series are concerned with electroencephalograms (EEGs) of healthy and epileptic states, as well as gait dynamics (in particular stride intervals) of the ALS, PD and HDs. We study data concerning one subject for each neurodegenerative disorder and one healthy control. The analysis is based on Tsallis non-extensive statistical mechanics and in particular on the estimation of Tsallis q-triplet, namely {qstat, qsen, qrel}. The deviation of Tsallis q-triplet from unity indicates non-Gaussian statistics and long-range dependencies for all time series considered. In addition, the results reveal the efficiency of Tsallis statistics in capturing differences in brain dynamics between healthy and epileptic states, as well as differences between ALS, PD, HDs from healthy control subjects. The results indicate that estimations of Tsallis q-indices could be used as possible biomarkers, along with others, for improving classification and prediction of epileptic seizures, as well as for studying the gait complex dynamics of various diseases providing new insights into severity, medications and fall risk, improving therapeutic interventions.
The dynamics of software development project management: An integrative systems dynamic perspective
NASA Technical Reports Server (NTRS)
Vandervelde, W. E.; Abdel-Hamid, T.
1984-01-01
Rather than continuing to focus on software development projects per se, the system dynamics modeling approach outlined is extended to investigate a broader set of issues pertaining to the software development organization. Rather than trace the life cycle(s) of one or more software projects, the focus is on the operations of a software development department as a continuous stream of software products is developed, placed into operation, and maintained. A number of research questions are "ripe" for investigation, including: (1) the efficacy of different organizational structures in different software development environments, (2) personnel turnover, (3) the impact of management approaches such as management by objectives, and (4) the organizational/environmental determinants of productivity.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barnes, Cris William; Barber, John L.; Kober, Edward Martin
The Matter-Radiation Interactions in Extremes project will build the experimental facility for the time-dependent control of dynamic material performance. An x-ray free electron laser at up to 42-keV fundamental energy and with photon pulses down to sub-nanosecond spacing, MaRIE 1.0 is designed to meet the challenges of time-dependent mesoscale materials science. Those challenges will be outlined, the techniques of coherent diffractive imaging and dynamic polycrystalline diffraction described, and the resulting requirements defined for a coherent x-ray source. The talk concludes with the role of the MaRIE project and science in the future.
Dislocation dynamics: simulation of plastic flow of bcc metals
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lassila, D H
This is the final report for the LDRD strategic initiative entitled "Dislocation Dynamics: Simulation of Plastic Flow of bcc Metals" (tracking code: 00-SI-011). This report comprises 6 individual sections. The first is an executive summary of the project and describes the overall project goal, which is to establish an experimentally validated 3D dislocation dynamics simulation. This first section also gives some information on LLNL's multi-scale modeling efforts associated with the plasticity of bcc metals, and the role of this LDRD project in the multiscale modeling program. The last five sections of this report are journal articles that were produced during the course of the FY-2000 efforts.
ERIC Educational Resources Information Center
Wagler, Amy E.; Lesser, Lawrence M.
2018-01-01
The interaction between language and the learning of statistical concepts has been receiving increased attention. The Communication, Language, And Statistics Survey (CLASS) was developed in response to the need to focus on dynamics of language in light of the culturally and linguistically diverse environments of introductory statistics classrooms.…
Han, Sanguk; Saba, Farzaneh; Lee, Sanghyun; Mohamed, Yasser; Peña-Mora, Feniosky
2014-07-01
It is not unusual to observe that actual schedule and quality performances are different from planned performances (e.g., schedule delay and rework) during a construction project. Such differences often result in production pressure (e.g., being pressed to work faster). Previous studies demonstrated that such production pressure negatively affects safety performance. However, the process by which production pressure influences safety performance, and to what extent, has not been fully investigated. As a result, the impact of production pressure has not been incorporated much into safety management in practice. In an effort to address this issue, this paper examines how production pressure relates to safety performance over time by identifying their feedback processes. A conceptual causal loop diagram is created to identify the relationship between schedule and quality performances (e.g., schedule delays and rework) and the components related to a safety program (e.g., workers' perceptions of safety, safety training, safety supervision, and crew size). A case study is then experimentally undertaken to investigate this relationship with accident occurrence with the use of data collected from a construction site; the case study is used to build a System Dynamics (SD) model. The SD model, then, is validated through inequality statistics analysis. Sensitivity analysis and statistical screening techniques further permit an evaluation of the impact of the managerial components on accident occurrence. The results of the case study indicate that schedule delays and rework are the critical factors affecting accident occurrence for the monitored project. Copyright © 2013 Elsevier Ltd. All rights reserved.
Dynamics of EEG functional connectivity during statistical learning.
Tóth, Brigitta; Janacsek, Karolina; Takács, Ádám; Kóbor, Andrea; Zavecz, Zsófia; Nemeth, Dezso
2017-10-01
Statistical learning is a fundamental mechanism of the brain, which extracts and represents regularities of our environment. Statistical learning is crucial in predictive processing, and in the acquisition of perceptual, motor, cognitive, and social skills. Although previous studies have revealed competitive neurocognitive processes underlying statistical learning, the neural communication of the related brain regions (functional connectivity, FC) has not yet been investigated. The present study aimed to fill this gap by investigating FC networks that promote statistical learning in humans. Young adults (N=28) performed a statistical learning task while 128-channel EEG was acquired. The task involved probabilistic sequences, which enabled measurement of incidental/implicit learning of conditional probabilities. Phase synchronization in seven frequency bands was used to quantify FC between cortical regions during the first, second, and third periods of the learning task, respectively. Here we show that statistical learning is negatively correlated with FC of the anterior brain regions in slow (theta) and fast (beta) oscillations. These negative correlations increased as the learning progressed. Our findings provide evidence that dynamic antagonist brain networks serve as a hallmark of statistical learning. Copyright © 2017 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Potirakis, Stelios M.; Zitis, Pavlos I.; Eftaxias, Konstantinos
2013-07-01
The field of study of complex systems considers that the dynamics of complex systems are founded on universal principles that may be used to describe a great variety of scientific and technological approaches of different types of natural, artificial, and social systems. Several authors have suggested that earthquake dynamics and the dynamics of economic (financial) systems can be analyzed within similar mathematical frameworks. We apply concepts of nonextensive statistical physics to time-series data of observable manifestations of the underlying complex processes culminating in these different extreme events, in order to support the suggestion that a dynamical analogy exists between a financial crisis (in the form of share or index price collapse) and a single earthquake. We also investigate the existence of such an analogy by means of scale-free statistics (the Gutenberg-Richter distribution of event sizes). We show that the populations of: (i) fracto-electromagnetic events rooted in the activation of a single fault, emerging prior to a significant earthquake, (ii) the trade volume events of different shares/economic indices, prior to a collapse, and (iii) the price fluctuation (considered as the difference of maximum minus minimum price within a day) events of different shares/economic indices, prior to a collapse, follow both the traditional Gutenberg-Richter law and a nonextensive model for earthquake dynamics, with similar parameter values. The obtained results imply the existence of a dynamic analogy between earthquakes and economic crises, which moreover follow the dynamics of seizures, magnetic storms and solar flares.
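The Gutenberg-Richter comparison above rests on fitting the exponent of the frequency-size law. A minimal sketch of the standard Aki maximum-likelihood b-value estimate is given below; the synthetic catalogue and completeness threshold are assumptions, and the same fit can in principle be applied to "magnitudes" defined from EM-event or trade-volume amplitudes.

```python
# Aki maximum-likelihood estimate of the Gutenberg-Richter b-value.
import numpy as np

def b_value(magnitudes, mc):
    """b-value for continuous magnitudes >= completeness mc
    (for binned catalogues, replace mc with mc - dm/2)."""
    m = np.asarray(magnitudes, dtype=float)
    m = m[m >= mc]
    return np.log10(np.e) / (m.mean() - mc)

rng = np.random.default_rng(0)
# synthetic catalogue consistent with b = 1 above magnitude 2.0
mags = rng.exponential(scale=1.0 / np.log(10), size=5000) + 2.0
print("estimated b-value:", round(b_value(mags, mc=2.0), 2))
```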
Bogren, Sara; Fornara, Andrea; Ludwig, Frank; del Puerto Morales, Maria; Steinhoff, Uwe; Fougt Hansen, Mikkel; Kazakova, Olga; Johansson, Christer
2015-01-01
This study presents classification of different magnetic single- and multi-core particle systems using their measured dynamic magnetic properties together with their nanocrystal and particle sizes. The dynamic magnetic properties are measured with AC (dynamical) susceptometry and magnetorelaxometry, and the size parameters are determined from electron microscopy and dynamic light scattering. Using these methods, we also show that the nanocrystal size and particle morphology determine the dynamic magnetic properties for both single- and multi-core particles. The presented results are obtained from the four-year EU NMP FP7 project, NanoMag, which is focused on standardization of analysis methods for magnetic nanoparticles. PMID:26343639
The non-equilibrium statistical mechanics of a simple geophysical fluid dynamics model
NASA Astrophysics Data System (ADS)
Verkley, Wim; Severijns, Camiel
2014-05-01
Lorenz [1] has devised a dynamical system that has proved to be very useful as a benchmark system in geophysical fluid dynamics. The system in its simplest form consists of a periodic array of variables that can be associated with an atmospheric field on a latitude circle. The system is driven by a constant forcing, is damped by linear friction and has a simple advection term that causes the model to behave chaotically if the forcing is large enough. Our aim is to predict the statistics of Lorenz' model on the basis of a given average value of its total energy - obtained from a numerical integration - and the assumption of statistical stationarity. Our method is the principle of maximum entropy [2] which in this case reads: the information entropy of the system's probability density function shall be maximal under the constraints of normalization, a given value of the average total energy and statistical stationarity. Statistical stationarity is incorporated approximately by using `stationarity constraints', i.e., by requiring that the average first and possibly higher-order time-derivatives of the energy are zero in the maximization of entropy. The analysis [3] reveals that, if the first stationarity constraint is used, the resulting probability density function rather accurately reproduces the statistics of the individual variables. If the second stationarity constraint is used as well, the correlations between the variables are also reproduced quite adequately. The method can be generalized straightforwardly and holds the promise of a viable non-equilibrium statistical mechanics of the forced-dissipative systems of geophysical fluid dynamics. [1] E.N. Lorenz, 1996: Predictability - A problem partly solved, in Proc. Seminar on Predictability (ECMWF, Reading, Berkshire, UK), Vol. 1, pp. 1-18. [2] E.T. Jaynes, 2003: Probability Theory - The Logic of Science (Cambridge University Press, Cambridge). [3] W.T.M. Verkley and C.A. Severijns, 2014: The maximum entropy principle applied to a dynamical system proposed by Lorenz, Eur. Phys. J. B, 87:7, http://dx.doi.org/10.1140/epjb/e2013-40681-2 (open access).
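For readers unfamiliar with the benchmark, the sketch below integrates the Lorenz '96 system with fourth-order Runge-Kutta and computes the average total energy that serves as the constraint in the maximum-entropy analysis. The forcing, dimension and step sizes are conventional illustrative choices, not necessarily those of the paper.

```python
# Lorenz '96: dx_k/dt = (x_{k+1} - x_{k-2}) x_{k-1} - x_k + F on a periodic ring.
import numpy as np

def l96_rhs(x, forcing=8.0):
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + forcing

def rk4_step(x, dt, forcing=8.0):
    k1 = l96_rhs(x, forcing)
    k2 = l96_rhs(x + 0.5 * dt * k1, forcing)
    k3 = l96_rhs(x + 0.5 * dt * k2, forcing)
    k4 = l96_rhs(x + dt * k3, forcing)
    return x + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6.0

n, dt = 40, 0.01
x = 8.0 + 0.01 * np.random.randn(n)          # small perturbation of the fixed point
energies = []
for step in range(50000):
    x = rk4_step(x, dt)
    if step > 10000:                          # discard spin-up
        energies.append(0.5 * np.sum(x ** 2))
print("average total energy E =", round(float(np.mean(energies)), 2))
```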
Seasonal prediction of East Asian summer rainfall using a multi-model ensemble system
NASA Astrophysics Data System (ADS)
Ahn, Joong-Bae; Lee, Doo-Young; Yoo, Jin-Ho
2015-04-01
Using the retrospective forecasts of seven state-of-the-art coupled models and their multi-model ensemble (MME) for boreal summers, the prediction skills of climate models in the western tropical Pacific (WTP) and East Asian region are assessed. The prediction of summer rainfall anomalies in East Asia is difficult, while the WTP shows a strong correlation between model prediction and observation. We focus on developing a new approach to further enhance the seasonal prediction skill for summer rainfall in East Asia and investigate the influence of convective activity in the WTP on East Asian summer rainfall. By analyzing the characteristics of the WTP convection, two distinct patterns associated with El Niño-Southern Oscillation developing and decaying modes are identified. Based on the multiple linear regression method, the East Asia Rainfall Index (EARI) is developed from the interannual variability of the normalized Maritime Continent-WTP Indices (MPIs), obtained from the above two main patterns, which serve as potentially useful predictors for rainfall over East Asia. For East Asian summer rainfall, the EARI has superior performance to the East Asia summer monsoon index or each MPI. Therefore, the rainfall regressed from the EARI also shows a strong relationship with the observed East Asian summer rainfall pattern. In addition, we evaluate the prediction skill of the East Asia reconstructed rainfall obtained by a hybrid dynamical-statistical approach using the cross-validated EARI from the individual models and their MME. The results show that the rainfall reconstructed from the simulations captures the general features of observed precipitation in East Asia quite well. This study convincingly demonstrates that rainfall prediction skill is considerably improved by using a hybrid dynamical-statistical approach compared to the dynamical forecast alone. Acknowledgements: This work was carried out with the support of the Rural Development Administration Cooperative Research Program for Agriculture Science and Technology Development under grant project PJ009353 and the Korea Meteorological Administration Research and Development Program under grant CATER 2012-3100, Republic of Korea.
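The core statistical step is a multiple linear regression of rainfall on a small set of predictor indices, evaluated in cross-validation. The sketch below uses two invented predictor series (named mpi1 and mpi2 purely for illustration; the actual EARI/MPI definitions are in the paper) and leave-one-year-out cross-validation.

```python
# Hybrid statistical step: regress a rainfall series on two predictor indices.
import numpy as np

def fit_index(predictors, target):
    X = np.column_stack([np.ones(len(target))] + list(predictors))
    coef, *_ = np.linalg.lstsq(X, target, rcond=None)
    return coef

rng = np.random.default_rng(1)
years = 30
mpi1, mpi2 = rng.standard_normal(years), rng.standard_normal(years)
rain = 0.8 * mpi1 - 0.5 * mpi2 + 0.3 * rng.standard_normal(years)   # synthetic target

cv_pred = np.empty(years)
for y in range(years):                      # leave-one-year-out cross-validation
    keep = np.arange(years) != y
    c = fit_index((mpi1[keep], mpi2[keep]), rain[keep])
    cv_pred[y] = c[0] + c[1] * mpi1[y] + c[2] * mpi2[y]
print("cross-validated correlation:", round(np.corrcoef(cv_pred, rain)[0, 1], 2))
```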
Project Physics Reader 1, Concepts of Motion.
ERIC Educational Resources Information Center
Harvard Univ., Cambridge, MA. Harvard Project Physics.
As a supplement to Project Physics Unit 1, 21 articles are presented in this reader. Concepts of motion are discussed under headings: motion, motion in words, representation of movement, introducing vectors, Galileo's discussion of projectile motion, Newton's laws of dynamics, the dynamics of a golf club, report on Tait's lecture on force, and bad…
Exploring Protein Structure and Dynamics through a Project-Oriented Biochemistry Laboratory Module
ERIC Educational Resources Information Center
Lipchock, James M.; Ginther, Patrick S.; Douglas, Bonnie B.; Bird, Kelly E.; Loria, J. Patrick
2017-01-01
Here, we present a 10-week project-oriented laboratory module designed to provide a course-based undergraduate research experience in biochemistry that emphasizes the importance of biomolecular structure and dynamics in enzyme function. This module explores the impact of mutagenesis on an important active site loop for a biomedically-relevant…
Mahmoud, Shereif H; Gan, Thian Y
2018-08-15
The implications of anthropogenic climate change, human activities and land use change (LUC) on the environment and ecosystem services in the coastal regions of Saudi Arabia were analyzed. Earth observation data were used to derive land use categories between 1970 and 2014. Next, a Markov-CA model was developed to characterize the dynamics of LUC between 2014 and 2100 and their impacts on the regions' climate and environment. Non-parametric change point and trend detection algorithms were applied to temperature, precipitation and greenhouse gas data to investigate the presence of anthropogenic climate change. Lastly, climate models were used to project future climate change between 2014 and 2100. The analysis of LUC revealed that built-up areas experienced the greatest growth between 1970 and 2014, leading to a significant monotonic trend. Urban areas increased by 2349.61 km² between 1970 and 2014, an average increase of >53.4 km²/yr. The projected LUC between 2014 and 2100 indicates a continued increase in urban areas and irrigated cropland. Human alteration of land use from natural vegetation and forests to other uses after 1970 resulted in loss, degradation, and fragmentation, all of which usually have devastating effects on the biodiversity of the region. This is reflected in a statistically significant change point in the temperature anomaly after 1968, with a warming trend of 0.24 °C/decade, and a downward trend in the precipitation anomaly of 12.2 mm/decade. Total greenhouse gas emissions including all anthropogenic sources showed a statistically significant positive trend of 78,090 kt/decade after 1991. This is also reflected in the future projection of the temperature anomaly between 1900 and 2100, with a future warming trend of 0.19 °C/decade. In conclusion, human activities, the industrial revolution, deforestation, land use transformation and the increase in greenhouse gases have had significant implications for the environment and ecosystem services of the study area. Copyright © 2018 Elsevier B.V. All rights reserved.
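A non-parametric trend test of the kind referred to above is the Mann-Kendall test. The sketch below applies it to a synthetic temperature-anomaly series with a trend of roughly the magnitude quoted; the data, and the omission of tie and autocorrelation corrections, are simplifying assumptions.

```python
# Mann-Kendall trend test (no-ties variance, two-sided p-value).
import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    x = np.asarray(x, dtype=float)
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / np.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / np.sqrt(var_s)
    else:
        z = 0.0
    return z, 2 * (1 - norm.cdf(abs(z)))

rng = np.random.default_rng(2)
temp_anom = 0.024 * np.arange(50) + 0.1 * rng.standard_normal(50)   # ~0.24 degC/decade trend
z, p = mann_kendall(temp_anom)
print("Mann-Kendall z = %.2f, p = %.4f" % (z, p))
```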
A heuristic method for consumable resource allocation in multi-class dynamic PERT networks
NASA Astrophysics Data System (ADS)
Yaghoubi, Saeed; Noori, Siamak; Mazdeh, Mohammad Mahdavi
2013-06-01
This investigation presents a heuristic method for the consumable resource allocation problem in multi-class dynamic Project Evaluation and Review Technique (PERT) networks, where new projects from different classes (types) arrive at the system according to independent Poisson processes with different arrival rates. Each activity of any project is performed at a dedicated service station located in a node of the network, with an exponential service-time distribution according to its class. Indeed, each project arrives at the first service station and continues its routing according to the precedence network of its class. Such a system can be represented as a queuing network, where the discipline of the queues is first come, first served. In the presented method, the multi-class system is decomposed into several single-class dynamic PERT networks, each class being considered separately as a minisystem. In modeling the single-class dynamic PERT network, we use a Markov process and a multi-objective model investigated by Azaron and Tavakkoli-Moghaddam in 2007. Then, after obtaining the resources allocated to service stations in every minisystem, the final resources allocated to activities are calculated by the proposed method.
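The building block of such a network is a single service station with Poisson arrivals and exponential service. The sketch below simulates one such station under first-come-first-served discipline and checks the simulated mean time in the station against the M/M/1 value 1/(mu - lam); the rates are illustrative assumptions.

```python
# One service station of a dynamic PERT network: Poisson arrivals, exponential
# service, FCFS (Lindley recursion), compared with the M/M/1 sojourn time.
import numpy as np

rng = np.random.default_rng(3)
lam, mu, n = 0.6, 1.0, 200000
arrivals = np.cumsum(rng.exponential(1 / lam, n))
service = rng.exponential(1 / mu, n)

finish = np.empty(n)
finish[0] = arrivals[0] + service[0]
for i in range(1, n):                       # start when the server is free and the project has arrived
    finish[i] = max(arrivals[i], finish[i - 1]) + service[i]

print("simulated mean time in station:", round(float(np.mean(finish - arrivals)), 3))
print("M/M/1 theory 1/(mu-lam):       ", round(1 / (mu - lam), 3))
```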
Code of Federal Regulations, 2011 CFR
2011-07-01
... Judicial Administration DEPARTMENT OF JUSTICE CONFIDENTIALITY OF IDENTIFIABLE RESEARCH AND STATISTICAL...: (1) That the information will only be used or revealed for research or statistical purposes; and (2... or statistical purposes; and (3) That participation in the project in question is voluntary and may...
Code of Federal Regulations, 2010 CFR
2010-07-01
... Judicial Administration DEPARTMENT OF JUSTICE CONFIDENTIALITY OF IDENTIFIABLE RESEARCH AND STATISTICAL...: (1) That the information will only be used or revealed for research or statistical purposes; and (2... or statistical purposes; and (3) That participation in the project in question is voluntary and may...
The Effect of Student-Driven Projects on the Development of Statistical Reasoning
ERIC Educational Resources Information Center
Sovak, Melissa M.
2010-01-01
Research has shown that even if students pass a standard introductory statistics course, they often still lack the ability to reason statistically. Many instructional techniques for enhancing the development of statistical reasoning have been discussed, although there is often little to no experimental evidence that they produce effective results…
ERIC Educational Resources Information Center
Ramirez-Faghih, Caroline Ann
2012-01-01
The goal of this study was to examine the reciprocal relationship between statistical investigation and motivation of college students in a Mathematical Reasoning course (Math 1). Unlike previous studies in which students' projects or statistical investigations have been examined as the final product that shows evidence of statistical…
Statistical projection effects in a hydrodynamic pilot-wave system
NASA Astrophysics Data System (ADS)
Sáenz, Pedro J.; Cristea-Platon, Tudor; Bush, John W. M.
2018-03-01
Millimetric liquid droplets can walk across the surface of a vibrating fluid bath, self-propelled through a resonant interaction with their own guiding or `pilot' wave fields. These walking droplets, or `walkers', exhibit several features previously thought to be peculiar to the microscopic, quantum realm. In particular, walkers confined to circular corrals manifest a wave-like statistical behaviour reminiscent of that of electrons in quantum corrals. Here we demonstrate that localized topological inhomogeneities in an elliptical corral may lead to resonant projection effects in the walker's statistics similar to those reported in quantum corrals. Specifically, we show that a submerged circular well may drive the walker to excite specific eigenmodes in the bath that result in drastic changes in the particle's statistical behaviour. The well tends to attract the walker, leading to a local peak in the walker's position histogram. By placing the well at one of the foci, a mode with maxima near the foci is preferentially excited, leading to a projection effect in the walker's position histogram towards the empty focus, an effect strongly reminiscent of the quantum mirage. Finally, we demonstrate that the mean pilot-wave field has the same form as the histogram describing the walker's statistics.
Emergent dynamic structures and statistical law in spherical lattice gas automata.
Yao, Zhenwei
2017-12-01
Various lattice gas automata have been proposed in the past decades to simulate physics and address a host of problems on collective dynamics arising in diverse fields. In this work, we employ the lattice gas model defined on the sphere to investigate the curvature-driven dynamic structures and analyze the statistical behaviors in equilibrium. Under the simple propagation and collision rules, we show that the uniform collective movement of the particles on the sphere is geometrically frustrated, leading to several nonequilibrium dynamic structures not found in the planar lattice, such as the emergent bubble and vortex structures. With the accumulation of the collision effect, the system ultimately reaches equilibrium in the sense that the distribution of the coarse-grained speed approaches the two-dimensional Maxwell-Boltzmann distribution despite the population fluctuations in the coarse-grained cells. The emergent regularity in the statistical behavior of the system is rationalized by mapping our system to a generalized random walk model. This work demonstrates the capability of the spherical lattice gas automaton in revealing the lattice-guided dynamic structures and simulating the equilibrium physics. It suggests the promising possibility of using lattice gas automata defined on various curved surfaces to explore geometrically driven nonequilibrium physics.
NASA Astrophysics Data System (ADS)
Lawler, Samantha M.; Kavelaars, J. J.; Alexandersen, Mike; Bannister, Michele T.; Gladman, Brett; Petit, Jean-Marc; Shankman, Cory
2018-05-01
All surveys include observational biases, which makes it impossible to directly compare properties of discovered trans-Neptunian Objects (TNOs) with dynamical models. However, by carefully keeping track of survey pointings on the sky, detection limits, tracking fractions, and rate cuts, the biases from a survey can be modelled in Survey Simulator software. A Survey Simulator takes an intrinsic orbital model (from, for example, the output of a dynamical Kuiper belt emplacement simulation) and applies the survey biases, so that the biased simulated objects can be directly compared with real discoveries. This methodology has been used with great success in the Outer Solar System Origins Survey (OSSOS) and its predecessor surveys. In this chapter, we give four examples of ways to use the OSSOS Survey Simulator to gain knowledge about the true structure of the Kuiper Belt. We demonstrate how to statistically compare different dynamical model outputs with real TNO discoveries, how to quantify detection biases within a TNO population, how to measure intrinsic population sizes, and how to use upper limits from non-detections. We hope this will provide a framework for dynamical modellers to statistically test the validity of their models.
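The sketch below illustrates the survey-simulator concept only (it is not the OSSOS code): draw objects from an intrinsic model, pass them through an assumed detection-efficiency curve, and compare the biased sample with the intrinsic one. The magnitude distribution, distances and efficiency function are invented for illustration.

```python
# Conceptual survey-simulator step: intrinsic model -> apply detection bias -> biased sample.
import numpy as np

rng = np.random.default_rng(4)

def intrinsic_model(n):
    h = rng.uniform(5.0, 10.0, n)                    # toy absolute magnitudes
    r = rng.uniform(30.0, 50.0, n)                   # toy heliocentric distances (AU)
    m_apparent = h + 2.5 * np.log10(r ** 2 * (r - 1.0) ** 2)   # approximate for distant objects
    return h, r, m_apparent

def detection_efficiency(m, m50=24.5, width=0.4):
    return 1.0 / (1.0 + np.exp((m - m50) / width))   # assumed efficiency curve

h, r, m = intrinsic_model(100000)
detected = rng.random(m.size) < detection_efficiency(m)
print("intrinsic median distance: %.1f AU" % np.median(r))
print("detected  median distance: %.1f AU (biased toward nearby objects)" % np.median(r[detected]))
```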
The evolution of labile traits in sex- and age-structured populations.
Childs, Dylan Z; Sheldon, Ben C; Rees, Mark
2016-03-01
Many quantitative traits are labile (e.g. somatic growth rate, reproductive timing and investment), varying over the life cycle as a result of behavioural adaptation, developmental processes and plastic responses to the environment. At the population level, selection can alter the distribution of such traits across age classes and among generations. Despite a growing body of theoretical research exploring the evolutionary dynamics of labile traits, a data-driven framework for incorporating such traits into demographic models has not yet been developed. Integral projection models (IPMs) are increasingly being used to understand the interplay between changes in labile characters, life histories and population dynamics. One limitation of the IPM approach is that it relies on phenotypic associations between parents and offspring traits to capture inheritance. However, it is well-established that many different processes may drive these associations, and currently, no clear consensus has emerged on how to model micro-evolutionary dynamics in an IPM framework. We show how to embed quantitative genetic models of inheritance of labile traits into age-structured, two-sex models that resemble standard IPMs. Commonly used statistical tools such as GLMs and their mixed model counterparts can then be used for model parameterization. We illustrate the methodology through development of a simple model of egg-laying date evolution, parameterized using data from a population of Great tits (Parus major). We demonstrate how our framework can be used to project the joint dynamics of species' traits and population density. We then develop a simple extension of the age-structured Price equation (ASPE) for two-sex populations, and apply this to examine the age-specific contributions of different processes to change in the mean phenotype and breeding value. The data-driven framework we outline here has the potential to facilitate greater insight into the nature of selection and its consequences in settings where focal traits vary over the lifetime through ontogeny, behavioural adaptation and phenotypic plasticity, as well as providing a potential bridge between theoretical and empirical studies of labile trait variation. © 2016 The Authors Journal of Animal Ecology published by John Wiley & Sons Ltd on behalf of British Ecological Society.
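For readers unfamiliar with IPMs, the sketch below discretises a simple size-structured projection kernel with the midpoint rule and extracts the asymptotic growth rate. The vital-rate functions are invented placeholders, not the great tit model parameterised in the paper, and the quantitative-genetic inheritance layer is omitted.

```python
# Midpoint-rule discretisation of a toy integral projection model kernel.
import numpy as np
from scipy.stats import norm

def survival(z):      return 1 / (1 + np.exp(-(-1.0 + 0.5 * z)))          # logistic survival
def growth(z1, z):    return norm.pdf(z1, loc=0.7 * z + 1.0, scale=0.5)    # growth to size z1
def fecundity(z1, z): return np.exp(-0.5 + 0.3 * z) * norm.pdf(z1, loc=1.0, scale=0.3)

n, lower, upper = 100, 0.0, 10.0
edges = np.linspace(lower, upper, n + 1)
mesh = 0.5 * (edges[:-1] + edges[1:])
h = edges[1] - edges[0]

Z1, Z = np.meshgrid(mesh, mesh, indexing="ij")
K = h * (survival(Z) * growth(Z1, Z) + fecundity(Z1, Z))   # projection kernel, n(t+1) = K n(t)

lam = np.max(np.abs(np.linalg.eigvals(K)))                 # asymptotic growth rate
print("projected population growth rate lambda = %.3f" % lam)
```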
Advanced Hydraulic Fracturing Technology for Unconventional Tight Gas Reservoirs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stephen Holditch; A. Daniel Hill; D. Zhu
2007-06-19
The objectives of this project are to develop and test new techniques for creating extensive, conductive hydraulic fractures in unconventional tight gas reservoirs by statistically assessing the productivity achieved in hundreds of field treatments with a variety of current fracturing practices ranging from 'water fracs' to conventional gel fracture treatments; by laboratory measurements of the conductivity created with high rate proppant fracturing using an entirely new conductivity test - the 'dynamic fracture conductivity test'; and by developing design models to implement the optimal fracture treatments determined from the field assessment and the laboratory measurements. One of the tasks of thismore » project is to create an 'advisor' or expert system for completion, production and stimulation of tight gas reservoirs. A central part of this study is an extensive survey of the productivity of hundreds of tight gas wells that have been hydraulically fractured. We have been doing an extensive literature search of the SPE eLibrary, DOE, Gas Technology Institute (GTI), Bureau of Economic Geology and IHS Energy, for publicly available technical reports about procedures of drilling, completion and production of the tight gas wells. We have downloaded numerous papers and read and summarized the information to build a database that will contain field treatment data, organized by geographic location, and hydraulic fracture treatment design data, organized by the treatment type. We have conducted experimental study on 'dynamic fracture conductivity' created when proppant slurries are pumped into hydraulic fractures in tight gas sands. Unlike conventional fracture conductivity tests in which proppant is loaded into the fracture artificially; we pump proppant/frac fluid slurries into a fracture cell, dynamically placing the proppant just as it occurs in the field. From such tests, we expect to gain new insights into some of the critical issues in tight gas fracturing, in particular the roles of gel damage, polymer loading (water-frac versus gel frac), and proppant concentration on the created fracture conductivity. To achieve this objective, we have designed the experimental apparatus to conduct the dynamic fracture conductivity tests. The experimental apparatus has been built and some preliminary tests have been conducted to test the apparatus.« less
Predictive accuracy of particle filtering in dynamic models supporting outbreak projections.
Safarishahrbijari, Anahita; Teyhouee, Aydin; Waldner, Cheryl; Liu, Juxin; Osgood, Nathaniel D
2017-09-26
While a new generation of computational statistics algorithms and availability of data streams raises the potential for recurrently regrounding dynamic models with incoming observations, the effectiveness of such arrangements can be highly subject to specifics of the configuration (e.g., frequency of sampling and representation of behaviour change), and there has been little attempt to identify effective configurations. Combining dynamic models with particle filtering, we explored a solution focusing on creating quickly formulated models regrounded automatically and recurrently as new data becomes available. Given a latent underlying case count, we assumed that observed incident case counts followed a negative binomial distribution. In accordance with the condensation algorithm, each such observation led to updating of particle weights. We evaluated the effectiveness of various particle filtering configurations against each other and against an approach without particle filtering according to the accuracy of the model in predicting future prevalence, given data to a certain point and a norm-based discrepancy metric. We examined the effectiveness of particle filtering under varying times between observations, negative binomial dispersion parameters, and rates with which the contact rate could evolve. We observed that more frequent observations of empirical data yielded super-linearly improved accuracy in model predictions. We further found that for the data studied here, the most favourable assumptions to make regarding the parameters associated with the negative binomial distribution and changes in contact rate were robust across observation frequency and the observation point in the outbreak. Combining dynamic models with particle filtering can perform well in projecting future evolution of an outbreak. Most importantly, the remarkable improvements in predictive accuracy resulting from more frequent sampling suggest that investments to achieve efficient reporting mechanisms may be more than paid back by improved planning capacity. The robustness of the results on particle filter configuration in this case study suggests that it may be possible to formulate effective standard guidelines and regularized approaches for such techniques in particular epidemiological contexts. Most importantly, the work tentatively suggests potential for health decision makers to secure strong guidance when anticipating outbreak evolution for emerging infectious diseases by combining even very rough models with particle filtering method.
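A bootstrap (condensation-style) particle filter with a negative binomial observation model can be sketched as below. The latent dynamics here are a toy stochastic growth model for incident counts, not the paper's transmission model, and the observation series, dispersion and noise scales are assumptions.

```python
# Bootstrap particle filter with a negative binomial observation likelihood.
import numpy as np
from scipy.stats import nbinom

rng = np.random.default_rng(5)

def nb_logpmf(y, mean, k):
    # negative binomial with mean `mean` and dispersion k (variance = mean + mean^2/k)
    p = k / (k + mean)
    return nbinom.logpmf(y, k, p)

def particle_filter(observations, n_particles=2000, k=10.0):
    counts = rng.uniform(1, 20, n_particles)              # latent incident counts
    growth = rng.normal(0.1, 0.2, n_particles)            # per-step growth rates
    means = []
    for y in observations:
        # propagate: stochastic exponential growth with a slowly evolving rate
        growth += rng.normal(0, 0.05, n_particles)
        counts *= np.exp(growth)
        # update weights with the negative binomial likelihood, then resample
        logw = nb_logpmf(y, np.maximum(counts, 1e-6), k)
        w = np.exp(logw - logw.max()); w /= w.sum()
        idx = rng.choice(n_particles, n_particles, p=w)
        counts, growth = counts[idx], growth[idx]
        means.append(counts.mean())
    return np.array(means)

obs = np.array([5, 7, 9, 12, 17, 22, 30, 41, 55, 75])     # synthetic outbreak counts
print(np.round(particle_filter(obs), 1))
```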
NASA Astrophysics Data System (ADS)
Potirakis, Stelios M.; Contoyiannis, Yiannis; Kopanas, John; Kalimeris, Anastasios; Antonopoulos, George; Peratzakis, Athanasios; Eftaxias, Konstantinos; Nomicos, Costantinos
2014-05-01
A phenomenon is considered "complex" when the phenomenological laws that describe the global behavior of the system are not necessarily directly related to the "microscopic" laws that regulate the evolution of its elementary parts. The field of study of complex systems considers that the dynamics of complex systems are founded on universal principles that may be used to describe disparate problems ranging from particle physics to the economies of societies. Several authors have suggested that earthquake (EQ) dynamics can be analyzed within mathematical frameworks similar to those of economic dynamics and neurodynamics. A central property of the EQ preparation process is the occurrence of coherent large-scale collective behavior with a very rich structure, resulting from repeated nonlinear interactions among the constituents of the system. As a result, nonextensive statistics is an appropriate, physically meaningful tool for the study of EQ dynamics. Since fracture-induced electromagnetic (EM) precursors are observable manifestations of the underlying EQ preparation process, the analysis of a fracture-induced EM precursor observed prior to the occurrence of a large EQ can also be conducted within the nonextensive statistics framework. Within the frame of the investigation for universal principles that may hold for different dynamical systems related to the genesis of extreme events, we present here statistical similarities of the pre-earthquake EM emissions related to an EQ, with the pre-ictal electrical brain activity related to an epileptic seizure, and with the pre-crisis economic observables related to the collapse of a share. It is demonstrated that the observables of all three dynamical systems can be analyzed in the frame of nonextensive statistical mechanics, while the frequency-size relations of appropriately defined "events" preceding the extreme event related to each of these different systems present striking quantitative similarities. It is also demonstrated that, for the considered systems, the nonextensive parameter q increases as the extreme event approaches, which indicates that the strength of the long-memory / long-range interactions between the constituents of the system increases, characterizing the dynamics of the system.
NASA Astrophysics Data System (ADS)
Koparan, Timur
2016-02-01
In this study, the effect of dynamic statistics software on the achievement and attitudes of prospective teachers is examined. To this end, an achievement test, an attitude scale for statistics and interviews were used as data collection tools. The achievement test comprises 8 problems based on statistical data, and the attitude scale comprises 13 Likert-type items. The study was carried out in the fall semester of the 2014-2015 academic year at a university in Turkey. The study, which employed the pre-test-post-test control group design of the quasi-experimental research method, was carried out on a group of 80 prospective teachers, 40 in the control group and 40 in the experimental group. Both groups had four-hour classes about descriptive statistics. The classes with the control group were carried out through traditional methods, while dynamic statistics software was used in the experimental group. Five prospective teachers from the experimental group were interviewed clinically after the application for a deeper examination of their views about the application. The qualitative data gained are presented under various themes. At the end of the study, it was found that there is a significant difference in favour of the experimental group in terms of achievement and attitudes, that the prospective teachers have an affirmative approach to the use of dynamic software, and that they see it as an effective tool to enrich maths classes. In accordance with the findings of the study, it is suggested that dynamic software, which offers unique opportunities, be used in classes by teachers and students.
NASA Astrophysics Data System (ADS)
Qi, Di
Turbulent dynamical systems are ubiquitous in science and engineering. Uncertainty quantification (UQ) in turbulent dynamical systems is a grand challenge where the goal is to obtain statistical estimates for key physical quantities. In the development of a proper UQ scheme for systems characterized by both a high-dimensional phase space and a large number of instabilities, significant model errors compared with the true natural signal are always unavoidable due to both the imperfect understanding of the underlying physical processes and the limited computational resources available. One central issue in contemporary research is the development of a systematic methodology for reduced order models that can recover the crucial features both with model fidelity in statistical equilibrium and with model sensitivity in response to perturbations. In the first part, we discuss a general mathematical framework to construct statistically accurate reduced-order models that have skill in capturing the statistical variability in the principal directions of a general class of complex systems with quadratic nonlinearity. A systematic hierarchy of simple statistical closure schemes, which are built through new global statistical energy conservation principles combined with statistical equilibrium fidelity, are designed and tested for UQ of these problems. Second, the capacity of imperfect low-order stochastic approximations to model extreme events in a passive scalar field advected by turbulent flows is investigated. The effects in complicated flow systems are considered including strong nonlinear and non-Gaussian interactions, and much simpler and cheaper imperfect models with model error are constructed to capture the crucial statistical features in the stationary tracer field. Several mathematical ideas are introduced to improve the prediction skill of the imperfect reduced-order models. Most importantly, empirical information theory and statistical linear response theory are applied in the training phase for calibrating model errors to achieve optimal imperfect model parameters; and total statistical energy dynamics are introduced to improve the model sensitivity in the prediction phase especially when strong external perturbations are exerted. The validity of reduced-order models for predicting statistical responses and intermittency is demonstrated on a series of instructive models with increasing complexity, including the stochastic triad model, the Lorenz '96 model, and models for barotropic and baroclinic turbulence. The skillful low-order modeling methods developed here should also be useful for other applications such as efficient algorithms for data assimilation.
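One of the simplest test beds with quadratic, energy-conserving nonlinearity is a stochastic triad model. The sketch below (with invented coefficients, not those used in the thesis) integrates such a model with Euler-Maruyama and reports the mean statistical energy tracked by the total statistical energy dynamics mentioned above.

```python
# Stochastic triad model with energy-conserving quadratic nonlinearity (B1+B2+B3 = 0).
import numpy as np

rng = np.random.default_rng(11)
B = np.array([2.0, -1.0, -1.0])            # sums to zero: nonlinear terms conserve energy
gamma = np.array([1.0, 1.2, 0.8])          # linear damping
sigma = np.array([0.5, 0.5, 0.5])          # stochastic forcing amplitudes
F = np.array([1.0, 0.0, 0.0])              # deterministic forcing

dt, nsteps = 1e-3, 200000
u = np.zeros(3)
energy = []
for step in range(nsteps):                 # Euler-Maruyama integration
    nonlin = B * np.array([u[1] * u[2], u[2] * u[0], u[0] * u[1]])
    u = u + dt * (-gamma * u + nonlin + F) + np.sqrt(dt) * sigma * rng.standard_normal(3)
    if step > 20000:                       # discard spin-up
        energy.append(0.5 * np.sum(u ** 2))
print("mean statistical energy: %.3f" % np.mean(energy))
```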
Malaria vectors in South America: current and future scenarios.
Laporta, Gabriel Zorello; Linton, Yvonne-Marie; Wilkerson, Richard C; Bergo, Eduardo Sterlino; Nagaki, Sandra Sayuri; Sant'Ana, Denise Cristina; Sallum, Maria Anice Mureb
2015-08-19
Malaria remains a significant public health issue in South America. Future climate change may influence the distribution of the disease, which is dependent on the distribution of those Anopheles mosquitoes competent to transmit Plasmodium falciparum. Herein, predictive niche models of the habitat suitability for P. falciparum, the current primary vector Anopheles darlingi and nine other known and/or potential vector species of the Neotropical Albitarsis Complex, were used to document the current situation and project future scenarios under climate changes in South America in 2070. To build each ecological niche model, we employed topography, climate and biome, and the currently defined distribution of P. falciparum, An. darlingi and nine species comprising the Albitarsis Complex in South America. Current and future (i.e., 2070) distributions were forecast by projecting the fitted ecological niche model onto the current environmental situation and two scenarios of simulated climate change. Statistical analyses were performed between the parasite and each vector in both the present and future scenarios to address potential vector roles in the dynamics of malaria transmission. Current distributions of malaria vector species were associated with that of P. falciparum, confirming their role in transmission, especially An. darlingi, An. marajoara and An. deaneorum. Projected climate changes included higher temperatures, lower water availability and biome modifications. Regardless of future scenarios considered, the geographic distribution of P. falciparum was exacerbated in 2070 South America, with the distribution of the pathogen covering 35-46% of the continent. As the current primary vector An. darlingi showed low tolerance for drier environments, the projected climate change would significantly reduce suitable habitat, impacting both its distribution and abundance. Conversely, climate generalist members of the Albitarsis Complex showed significant spatial and temporal expansion potential in 2070, and we conclude these species will become more important in the dynamics of malaria transmission in South America. Our data suggest that climate and landscape effects will elevate the importance of members of the Albitarsis Complex in malaria transmission in South America in 2070, highlighting the need for further studies addressing the bionomics, ecology and behaviours of the species comprising the Albitarsis Complex.
Major results of the MAARBLE project
NASA Astrophysics Data System (ADS)
Daglis, Ioannis A.; Bourdarie, Sebastien; Horne, Richard B.; Khotyaintsev, Yuri; Mann, Ian R.; Santolik, Ondrej; Turner, Drew L.; Balasis, Georgios
2016-04-01
The goal of the MAARBLE (Monitoring, Analyzing and Assessing Radiation Belt Loss and Energization) project was to shed light on the ways the dynamic evolution of the Van Allen belts is influenced by low-frequency electromagnetic waves. MAARBLE was implemented by a consortium of seven institutions (five European, one Canadian and one US) with support from the European Community's Seventh Framework Programme. The MAARBLE project employed multi-spacecraft monitoring of the geospace environment, complemented by ground-based monitoring, in order to analyze and assess the physical mechanisms leading to radiation belt particle energisation and loss. Particular attention was paid to the role of ULF/VLF waves. Within MAARBLE we created a database containing properties of ULF and VLF waves, based on measurements from the Cluster, THEMIS and CHAMP missions and from the CARISMA and IMAGE ground magnetometer networks. The database is now available to the scientific community through the Cluster Science Archive as auxiliary content. Based on the wave database, a statistical model of the wave activity dependent on the level of geomagnetic activity, solar wind forcing, and magnetospheric region has been developed. Multi-spacecraft particle measurements have been incorporated into data assimilation tools, leading to a more accurate estimate of the state of the radiation belts. The synergy of wave and particle observations is in the core of MAARBLE research studies of radiation belt dynamics. Results and conclusions from these studies will be presented in this paper. The MAARBLE (Monitoring, Analyzing and Assessing Radiation Belt Energization and Loss) collaborative research project has received funding from the European Union's Seventh Framework Programme (FP7-SPACE 2011-1) under grant agreement no. 284520. The complete MAARBLE Team: Ioannis A. Daglis, Sebastien Bourdarie, Richard B. Horne, Yuri Khotyaintsev, Ian R. Mann, Ondrej Santolik, Drew L. Turner, Georgios Balasis, Anastasios Anastasiadis, Vassilis Angelopoulos, David Barona, Eleni Chatzichristou, Stavros Dimitrakoudis, Marina Georgiou, Omiros Giannakis, Sarah Glauert, Benjamin Grison, Zuzana Hrbackova, Andy Kale, Christos Katsavrias, Tobias Kersten, Ivana Kolmasova, Didier Lazaro, Eva Macusova, Vincent Maget, Meghan Mella, Nigel Meredith, Fiori-Anastasia Metallinou, David Milling, Louis Ozeke, Constantinos Papadimitriou, George Ropokis, Ingmar Sandberg, Maria Usanova, Iannis Dandouras, David Sibeck, Eftyhia Zesta.
Manpower Projection Model Project, Ventura County.
ERIC Educational Resources Information Center
Van Zant, John L.; Lawson, William H.
The final report on Phase 1 of the Manpower Projection Model (MPM) Project provides a guide for implementation of the model system by area Vocational Education Practitioners within any Standard Metropolitan Statistical Area (SMSA). A cooperative effort between Ventura County Superintendent of Schools Office and the Community College District, the…
A Solar Cycle Dependence of Nonlinearity in Magnetospheric Activity
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, Jay R; Wing, Simon
2005-03-08
The nonlinear dependencies inherent to the historical K(sub)p data stream (1932-2003) are examined using mutual information and cumulant-based cost as discriminating statistics. The discriminating statistics are compared with surrogate data streams that are constructed using the corrected amplitude adjusted Fourier transform (CAAFT) method and capture the linear properties of the original K(sub)p data. Differences are regularly seen in the discriminating statistics a few years prior to solar minima, while no differences are apparent at the time of solar maximum. These results suggest that the dynamics of the magnetosphere tend to be more linear at solar maximum than at solar minimum. The strong nonlinear dependencies tend to peak on a timescale around 40-50 hours and are statistically significant up to one week. Because the solar wind driver variables, VB(sub)s and dynamical pressure, exhibit a much shorter decorrelation time for nonlinearities, the results seem to indicate that the nonlinearity is related to internal magnetospheric dynamics. Moreover, the timescales for the nonlinearity seem to be on the same order as that for storm/ring current relaxation. We suggest that the strong solar wind driving that occurs around solar maximum dominates the magnetospheric dynamics, suppressing the internal magnetospheric nonlinearity. On the other hand, in the descending phase of the solar cycle just prior to solar minimum, when magnetospheric activity is weaker, the dynamics exhibit a significant nonlinear internal magnetospheric response that may be related to increased solar wind speed.
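The general surrogate-test idea can be sketched as follows: compute a nonlinear discriminating statistic (here a time-delayed mutual information from a histogram estimator) on the data and on linearized surrogates, and flag nonlinearity when the data value lies outside the surrogate distribution. For simplicity this uses plain Fourier phase-randomised surrogates rather than the CAAFT surrogates of the study, and a chaotic logistic map stands in for the Kp series; all settings are illustrative.

```python
# Delayed mutual information versus phase-randomised surrogates.
import numpy as np

def mutual_information(x, y, bins=16):
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px, py = pxy.sum(axis=1, keepdims=True), pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

def phase_randomised(x, rng):
    spec = np.fft.rfft(x - x.mean())
    phases = rng.uniform(0, 2 * np.pi, spec.size)
    phases[0] = 0.0
    phases[-1] = 0.0                      # keep the Nyquist component real
    return np.fft.irfft(np.abs(spec) * np.exp(1j * phases), n=x.size) + x.mean()

rng = np.random.default_rng(6)
n = 4000
x = np.empty(n); x[0] = 0.4
for i in range(1, n):                     # chaotic logistic map as a nonlinear stand-in series
    x[i] = 3.8 * x[i - 1] * (1.0 - x[i - 1])

lag = 1
mi_data = mutual_information(x[:-lag], x[lag:])
mi_surr = [mutual_information(s[:-lag], s[lag:])
           for s in (phase_randomised(x, rng) for _ in range(50))]
print("MI(data) = %.3f, surrogate 95th pct = %.3f" % (mi_data, np.percentile(mi_surr, 95)))
```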
2013-04-11
…vehicle dynamics. Contractor: Computational Dynamics Inc.; Technical Representative: Dr. Paramsothy Jayakumar, TARDEC. Project Summary: This project aims at addressing and…applications. This literature review is being summarized and incorporated into the paper. The commentary provided by Dr. Jayakumar was addressed and…
2013-11-12
Contractor: Computational Dynamics Inc. (CDI); Technical Representative: Dr. Paramsothy Jayakumar, TARDEC. Project Summary: This project aims at addressing and remedying the serious… Reference: Shabana, A.A., Jayakumar, P., and Letherwood, M., "Soil Models and Vehicle System Dynamics", Applied Mechanics Reviews, Vol. 65(4), 2013, doi…
Component model reduction via the projection and assembly method
NASA Technical Reports Server (NTRS)
Bernard, Douglas E.
1989-01-01
The problem of acquiring a simple but sufficiently accurate model of a dynamic system is made more difficult when the dynamic system of interest is a multibody system comprised of several components. A low order system model may be created by reducing the order of the component models and making use of various available multibody dynamics programs to assemble them into a system model. The difficulty is in choosing the reduced order component models to meet system level requirements. The projection and assembly method, proposed originally by Eke, solves this difficulty by forming the full order system model, performing model reduction at the system level using system level requirements, and then projecting the desired modes onto the components for component level model reduction. The projection and assembly method is analyzed to show the conditions under which the desired modes are captured exactly, to the numerical precision of the algorithm.
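A hedged toy illustration of the projection-and-assembly idea is sketched below on a spring-mass chain split into two components: system-level modes are computed from the assembled model, the retained mode shapes are partitioned onto each component's degrees of freedom to form component reduction bases, and the reduced, reassembled model reproduces the retained system frequencies. The chain parameters are arbitrary, not a real multibody model.

```python
# Projection-and-assembly style reduction on a toy two-component spring-mass chain.
import numpy as np
from scipy.linalg import eigh, block_diag

n_a, n_b = 6, 6                       # DOFs in components A and B
n = n_a + n_b
k, m = 100.0, 1.0
K = 2 * k * np.eye(n) - k * np.eye(n, k=1) - k * np.eye(n, k=-1)
M = m * np.eye(n)

w2, phi = eigh(K, M)                  # full (assembled) system eigenproblem
keep = 4                              # system-level requirement: keep the 4 lowest modes
phi_keep = phi[:, :keep]

# project the retained system modes onto the component DOF sets
T = block_diag(phi_keep[:n_a, :], phi_keep[n_a:, :])

# reassemble the reduced model from the reduced components
Kr, Mr = T.T @ K @ T, T.T @ M @ T
w2_r, _ = eigh(Kr, Mr)

print("full    lowest freqs:", np.round(np.sqrt(w2[:keep]), 3))
print("reduced lowest freqs:", np.round(np.sqrt(np.sort(w2_r)[:keep]), 3))
```

Because the retained system modes lie in the span of the component bases, the lowest reduced frequencies match the full-model frequencies to numerical precision, which is the "captured exactly" property noted above.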
Advanced structures technology and aircraft safety
NASA Technical Reports Server (NTRS)
Mccomb, H. G., Jr.
1983-01-01
NASA research and development on advanced aeronautical structures technology related to flight safety is reviewed. The effort is categorized as research in the technology base and projects sponsored by the Aircraft Energy Efficiency (ACEE) Project Office. Base technology research includes mechanics of composite structures, crash dynamics, and landing dynamics. The ACEE projects involve development and fabrication of selected composite structural components for existing commercial transport aircraft. Technology emanating from this research is intended to result in airframe structures with improved efficiency and safety.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Haglund, R.F.; Tolk, N.H.
The Medical Free Electron Laser Program was awarded to develop, construct and operate a free-electron laser facility dedicated to biomedical and materials studies, with particular emphases on: fundamental studies of absorption and localization of electromagnetic energy on and near material surfaces, especially through electronic and other selective, non-statistical processes; non-thermal photon-materials interactions (e.g., electronic bond-breaking or vibrational energy transfer) in physical and biological materials as well as in long-wavelength biopolymer dynamics; development of FEL-based methods to study drug action and to characterize biomolecular properties and metabolic processes in biomembranes; clinical applications in otolaryngology, neurosurgery, ophthalmology and radiology stressing the use of the laser for selective laser-tissue, laser-cellular and laser-molecule interactions in both therapeutic and diagnostic modalities.
Specialized data analysis of SSME and advanced propulsion system vibration measurements
NASA Technical Reports Server (NTRS)
Coffin, Thomas; Swanson, Wayne L.; Jong, Yen-Yi
1993-01-01
The basic objectives of this contract were to perform detailed analysis and evaluation of dynamic data obtained during Space Shuttle Main Engine (SSME) test and flight operations, including analytical/statistical assessment of component dynamic performance, and to continue the development and implementation of analytical/statistical models to effectively define nominal component dynamic characteristics, detect anomalous behavior, and assess machinery operational conditions. This study was to provide timely assessment of engine component operational status, identify probable causes of malfunction, and define feasible engineering solutions. The work was performed under three broad tasks: (1) Analysis, Evaluation, and Documentation of SSME Dynamic Test Results; (2) Data Base and Analytical Model Development and Application; and (3) Development and Application of Vibration Signature Analysis Techniques.
Statistical errors in molecular dynamics averages
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schiferl, S.K.; Wallace, D.C.
1985-11-15
A molecular dynamics calculation produces a time-dependent fluctuating signal whose average is a thermodynamic quantity of interest. The average of the kinetic energy, for example, is proportional to the temperature. A procedure is described for determining when the molecular dynamics system is in equilibrium with respect to a given variable, according to the condition that the mean and the bandwidth of the signal should be sensibly constant in time. Confidence limits for the mean are obtained from an analysis of a finite length of the equilibrium signal. The role of serial correlation in this analysis is discussed. The occurrence of unstable behavior in molecular dynamics data is noted, and a statistical test for a level shift is described.
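A standard way to account for serial correlation when putting a confidence limit on such an average is block averaging. The sketch below uses a synthetic AR(1) series standing in for, e.g., the kinetic energy; it is an illustration of the general idea, not the paper's specific procedure.

```python
# Block averaging: standard error of the mean for a serially correlated signal.
import numpy as np

def block_error(signal, block_size):
    nblocks = len(signal) // block_size
    blocks = signal[:nblocks * block_size].reshape(nblocks, block_size).mean(axis=1)
    return blocks.std(ddof=1) / np.sqrt(nblocks)

rng = np.random.default_rng(7)
n, phi = 100000, 0.95                                # strongly correlated synthetic signal
x = np.empty(n); x[0] = 0.0
for i in range(1, n):
    x[i] = phi * x[i - 1] + rng.standard_normal()

naive = x.std(ddof=1) / np.sqrt(n)                   # ignores serial correlation
print("naive standard error:              %.4f" % naive)
for b in (10, 100, 1000):
    print("block-averaged error (block %4d): %.4f" % (b, block_error(x, b)))
```

The blocked estimates grow with block size and plateau once blocks are longer than the correlation time, giving the honest error bar.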
NASA Astrophysics Data System (ADS)
Faranda, D.; Yiou, P.; Alvarez-Castro, M. C. M.
2015-12-01
A combination of dynamical systems and statistical techniques allows for a robust assessment of the dynamical properties of the mid-latitude atmospheric circulation. Extremes at different spatial and time scales are not only associated with exceptionally intense weather structures (e.g. extra-tropical cyclones) but also with rapid changes of circulation regimes (thunderstorms, supercells) or the extreme persistence of weather structures (heat waves, cold spells). We will show how the dynamical systems theory of recurrence, combined with extreme value theory, can take into account the spatial and temporal dependence structure of the mid-latitude circulation structures and provide information on the statistics of extreme events.
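One concrete form of the recurrence/extreme-value link is the estimate of a local attractor dimension from exceedances of the negative log-distance to a reference state. The sketch below demonstrates the idea on the Lorenz 1963 attractor with a crude Euler integration and an exponential-tail approximation; the system, thresholds and integration settings are illustrative assumptions, and estimates should fall near the attractor dimension of roughly 2.

```python
# Local dimension estimate from recurrences, via extreme value statistics.
import numpy as np

def lorenz63(n, dt=0.005, s=10.0, r=28.0, b=8.0 / 3.0):
    x = np.empty((n, 3)); x[0] = (1.0, 1.0, 1.05)
    for i in range(1, n):                      # simple Euler integration
        xx, yy, zz = x[i - 1]
        x[i] = x[i - 1] + dt * np.array([s * (yy - xx), xx * (r - zz) - yy, xx * yy - b * zz])
    return x[5000:]                            # drop the transient

traj = lorenz63(80000)
dims = []
for ref in traj[::5000]:                       # a few reference states along the trajectory
    dist = np.linalg.norm(traj - ref, axis=1)
    g = -np.log(dist[dist > 0])                # exclude the reference point itself
    thr = np.quantile(g, 0.98)
    exceed = g[g > thr] - thr
    dims.append(1.0 / exceed.mean())           # exponential-tail (GPD shape ~ 0) estimate
print("mean local dimension estimate: %.2f" % np.mean(dims))
```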
Pincus, Steven M; Schmidt, Peter J; Palladino-Negro, Paula; Rubinow, David R
2008-04-01
Enhanced statistical characterization of mood-rating data holds the potential to more precisely classify and sub-classify recurrent mood disorders like premenstrual dysphoric disorder (PMDD) and recurrent brief depressive disorder (RBD). We applied several complementary statistical methods to differentiate mood rating dynamics among women with PMDD, RBD, and normal controls (NC). We compared three subgroups of women, NC (n=8), PMDD (n=15), and RBD (n=9), on the basis of daily self-ratings of sadness, with study lengths between 50 and 120 days. We analyzed mean levels; overall variability, SD; sequential irregularity, approximate entropy (ApEn); and a quantification of the extent of brief and staccato dynamics, denoted 'Spikiness'. For each of SD, irregularity (ApEn), and Spikiness, we showed highly significant subgroup differences, ANOVA P<0.001 for each statistic; additionally, many paired subgroup comparisons showed highly significant differences. In contrast, mean levels were indistinct among the subgroups. For SD, normal controls had much smaller levels than the other subgroups, with RBD intermediate. ApEn showed PMDD to be significantly more regular than the other subgroups. Spikiness showed NC and RBD data sets to be much more staccato than their PMDD counterparts, and appears to suitably characterize the defining feature of RBD dynamics. Compound criteria based on these statistical measures discriminated diagnostic subgroups with high sensitivity and specificity. Taken together, the statistical suite provides well-defined specifications of each subgroup. This can facilitate accurate diagnosis, and augment the prediction and evaluation of response to treatment. The statistical methodologies have broad and direct applicability to behavioral studies for many psychiatric disorders, and indeed to similar analyses of associated biological signals across multiple axes.
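Approximate entropy, the irregularity statistic referred to above, can be computed as sketched below. The parameter choices m = 2 and r = 0.2 SD follow common conventions and the input series are synthetic; the study's actual rating series and 'Spikiness' statistic are not reproduced here.

```python
# Approximate entropy (ApEn) of a daily rating series.
import numpy as np

def approximate_entropy(x, m=2, r=None):
    x = np.asarray(x, dtype=float)
    n = len(x)
    if r is None:
        r = 0.2 * x.std(ddof=0)

    def phi(m):
        templates = np.array([x[i:i + m] for i in range(n - m + 1)])
        # for each template, fraction of templates within tolerance r (Chebyshev distance)
        counts = [(np.max(np.abs(templates - t), axis=1) <= r).mean() for t in templates]
        return np.mean(np.log(counts))

    return phi(m) - phi(m + 1)

rng = np.random.default_rng(8)
regular = np.sin(np.arange(120) * 2 * np.pi / 28)      # cyclic, highly regular series
irregular = rng.standard_normal(120)                    # noisy, staccato series
print("ApEn regular series:   %.3f" % approximate_entropy(regular))
print("ApEn irregular series: %.3f" % approximate_entropy(irregular))
```

Lower ApEn indicates a more regular, predictable series, consistent with the finding that PMDD ratings are more regular than those of the other subgroups.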
Seasonal drought predictability in Portugal using statistical-dynamical techniques
NASA Astrophysics Data System (ADS)
Ribeiro, A. F. S.; Pires, C. A. L.
2016-08-01
Atmospheric forecasting and predictability are important to promote adaptation and mitigation measures in order to minimize drought impacts. This study estimates hybrid (statistical-dynamical) long-range forecasts of the regional drought index SPI (3-months) over homogeneous regions of mainland Portugal, based on forecasts from the UKMO operational forecasting system, with lead-times up to 6 months. ERA-Interim reanalysis data are used for the purpose of building a set of SPI predictors integrating recent past information prior to the forecast launching. Then, the advantage of combining predictors with both dynamical and statistical background in the prediction of drought conditions at different lags is evaluated. A two-step hybridization procedure is performed, in which both forecasted and observed 500 hPa geopotential height fields are subjected to a PCA in order to use forecasted PCs and persistent PCs as predictors. A second hybridization step consists of a statistical/hybrid downscaling to the regional SPI, based on regression techniques, after the pre-selection of the statistically significant predictors. The SPI forecasts and the added value of combining dynamical and statistical methods are evaluated in cross-validation mode, using the R2 and binary event scores. Results are obtained for the four seasons, and it was found that winter is the most predictable season and that most of the predictive power is in the large-scale fields from past observations. The hybridization improves the downscaling based on the forecasted PCs, since they provide complementary information (though modest) beyond that of the persistent PCs. These findings provide clues about the predictability of the SPI, particularly in Portugal, and may contribute to the predictability of crop yields and to some guidance on users' (such as farmers') decision-making processes.
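The two-step idea (PCA of geopotential-height fields, then a regression of the regional SPI on the leading PCs, evaluated in cross-validation) can be sketched as below. The field and SPI data are synthetic stand-ins, and the statistical screening of predictors used in the paper is omitted.

```python
# PCA of Z500 fields followed by regression downscaling of a drought index.
import numpy as np

rng = np.random.default_rng(9)
nyears, ngrid, npcs = 35, 400, 3

z500 = rng.standard_normal((nyears, ngrid))             # synthetic Z500 anomaly fields
anom = z500 - z500.mean(axis=0)
u, s, vt = np.linalg.svd(anom, full_matrices=False)      # PCA via SVD
pcs = u[:, :npcs] * s[:npcs]                             # leading principal components

spi = 0.6 * pcs[:, 0] - 0.4 * pcs[:, 1] + 0.5 * rng.standard_normal(nyears)   # synthetic SPI

pred = np.empty(nyears)
for y in range(nyears):                                   # leave-one-out cross-validation
    keep = np.arange(nyears) != y
    X = np.column_stack([np.ones(keep.sum()), pcs[keep]])
    coef, *_ = np.linalg.lstsq(X, spi[keep], rcond=None)
    pred[y] = coef[0] + pcs[y] @ coef[1:]
print("cross-validated R2: %.2f" % np.corrcoef(pred, spi)[0, 1] ** 2)
```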
The GenABEL Project for statistical genomics
Karssen, Lennart C.; van Duijn, Cornelia M.; Aulchenko, Yurii S.
2016-01-01
Development of free/libre open source software is usually done by a community of people with an interest in the tool. For scientific software, however, this is less often the case. Most scientific software is written by only a few authors, often a student working on a thesis. Once the paper describing the tool has been published, the tool is no longer developed further and is left to its own devices. Here we describe the broad, multidisciplinary community we formed around a set of tools for statistical genomics. The GenABEL project for statistical omics actively promotes open interdisciplinary development of statistical methodology and its implementation in efficient and user-friendly software under an open source licence. The software tools developed within the project collectively make up the GenABEL suite, which currently consists of eleven tools. The open framework of the project actively encourages involvement of the community in all stages, from formulation of methodological ideas to application of the software to specific data sets. A web forum is used to channel user questions and discussions, further promoting the use of the GenABEL suite. Developer discussions take place on a dedicated mailing list, and development is further supported by robust development practices including the use of public version control, code review and continuous integration. Use of this open science model attracts contributions from users and developers outside the "core team", facilitating agile statistical omics methodology development and fast dissemination. PMID:27347381
1981-01-01
Data are included on fertility and mortality projections for Czechoslovakia, 1981-2000; population projections, 1981-2000; population of reproductive age, 1981-2000; and natural growth of population, 1975-1980
Forecasting climate change impacts on plant populations over large spatial extents
Tredennick, Andrew T.; Hooten, Mevin B.; Aldridge, Cameron L.; ...
2016-10-24
Plant population models are powerful tools for predicting climate change impacts in one location, but are difficult to apply at landscape scales. Here, we overcome this limitation by taking advantage of two recent advances: remotely sensed, species-specific estimates of plant cover and statistical models developed for spatiotemporal dynamics of animal populations. Using computationally efficient model reparameterizations, we fit a spatiotemporal population model to a 28-year time series of sagebrush (Artemisia spp.) percent cover over a 2.5 × 5 km landscape in southwestern Wyoming while formally accounting for spatial autocorrelation. We include interannual variation in precipitation and temperature as covariates in the model to investigate how climate affects the cover of sagebrush. We then use the model to forecast the future abundance of sagebrush at the landscape scale under projected climate change, generating spatially explicit estimates of sagebrush population trajectories that have, until now, been impossible to produce at this scale. Our broadscale and long-term predictions are rooted in small-scale and short-term population dynamics and provide an alternative to predictions offered by species distribution models that do not include population dynamics. Finally, our approach, which combines several existing techniques in a novel way, demonstrates the use of remote sensing data to model population responses to environmental change that play out at spatial scales far greater than the traditional field study plot.
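A toy version of the modelling idea, not the authors' spatiotemporal model, is sketched below: a Gompertz-type model for log percent cover with precipitation and temperature covariates, fit by least squares on a synthetic 28-year series and then iterated forward under an assumed warming scenario. Spatial autocorrelation, which the real model handles formally, is ignored here and all coefficients are invented.

```python
# Gompertz-style population model for log cover with climate covariates.
import numpy as np

rng = np.random.default_rng(10)
years = 28
precip = rng.normal(0, 1, years)
temp = rng.normal(0, 1, years)

logcov = np.empty(years); logcov[0] = np.log(15.0)
for t in range(1, years):   # synthetic "observed" dynamics
    logcov[t] = (0.5 + 0.8 * logcov[t - 1] + 0.10 * precip[t] - 0.15 * temp[t]
                 + 0.05 * rng.standard_normal())

# fit log cover(t) on log cover(t-1) and the climate covariates
X = np.column_stack([np.ones(years - 1), logcov[:-1], precip[1:], temp[1:]])
coef, *_ = np.linalg.lstsq(X, logcov[1:], rcond=None)

# forecast 50 years under an assumed warming of +0.04 sd per year, average precipitation
state, traj = logcov[-1], []
for t in range(50):
    state = coef[0] + coef[1] * state + coef[2] * 0.0 + coef[3] * (0.04 * t)
    traj.append(np.exp(state))
print("projected cover after 50 yr: %.1f%%" % traj[-1])
```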