The Universal Plausibility Metric (UPM) & Principle (UPP).
Abel, David L
2009-12-03
Mere possibility is not an adequate basis for asserting scientific plausibility. A precisely defined universal bound is needed beyond which the assertion of plausibility, particularly in life-origin models, can be considered operationally falsified. But can something so seemingly relative and subjective as plausibility ever be quantified? Amazingly, the answer is, "Yes." A method of objectively measuring the plausibility of any chance hypothesis (The Universal Plausibility Metric [UPM]) is presented. A numerical inequality is also provided whereby any chance hypothesis can be definitively falsified when its UPM metric ξ is < 1 (The Universal Plausibility Principle [UPP]). Both UPM and UPP pre-exist and are independent of any experimental design and data set. No low-probability hypothetical plausibility assertion should survive peer review without subjection to the UPP inequality standard of formal falsification (ξ < 1).
PMID:19958539
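The UPP falsification test lends itself to a direct computational check. A minimal sketch under an assumed reading of the abstract: ξ is taken here as the expected number of successes, i.e. the product of the available probabilistic resources Ω and the per-trial probability p of the chance event; the default Ω ≈ 10^140 (a commonly cited bound on quantum-level operations in the observable universe) and the function names are illustrative assumptions, not taken from the paper.

```python
def upm(p: float, omega: float = 1e140) -> float:
    """Universal Plausibility Metric under the stated assumption:
    expected number of successes given per-trial probability p and
    probabilistic resources omega (assumed value, not from the paper)."""
    return p * omega

def upp_falsified(p: float, omega: float = 1e140) -> bool:
    """UPP inequality: the chance hypothesis is operationally falsified
    when its metric falls below 1."""
    return upm(p, omega) < 1

# A hypothesis with per-trial probability 10^-150 fails the bound;
# one at 10^-120 survives it.
print(upp_falsified(1e-150))  # True
print(upp_falsified(1e-120))  # False
```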
On Rosen's theory of gravity and cosmology
NASA Technical Reports Server (NTRS)
Barnes, R. C.
1980-01-01
Formal similarities between general relativity and Rosen's bimetric theory of gravity were used to analyze various bimetric cosmologies. The following results were found: (1) Physically plausible model universes which have a flat static background metric, have a Robertson-Walker fundamental metric, and which allow co-moving coordinates do not exist in bimetric cosmology. (2) It is difficult to use the Robertson-Walker metric for both the background metric (gamma mu nu) and the fundamental metric tensor of Riemannian geometry (g mu nu) while requiring that g mu nu and gamma mu nu have different time dependences. (3) A consistency relation for using co-moving coordinates in bimetric cosmology was derived. (4) Certain spatially flat bimetric cosmologies of Babala were tested for the presence of particle horizons. (5) An analytic solution for Rosen's k = +1 model was found. (6) Rosen's singularity-free k = +1 model arises from what appears to be an arbitrary choice for the time-dependent part of gamma mu nu.
The cosmological model with a wormhole and Hawking temperature near apparent horizon
NASA Astrophysics Data System (ADS)
Kim, Sung-Won
2018-05-01
In this paper, a cosmological model with an isotropic form of the Morris-Thorne type wormhole was derived in a similar way to the McVittie solution to the black hole in the expanding universe. By solving Einstein's field equation with plausible matter distribution, we found the exact solution of the wormhole embedded in Friedmann-Lemaître-Robertson-Walker universe. We also found the apparent cosmological horizons from the redefined metric and analyzed the geometric natures, including causal and dynamic structures. The Hawking temperature for thermal radiation was obtained by the WKB approximation using the Hamilton-Jacobi equation and Hamilton's equation, near the apparent cosmological horizon.
Enhanced timing abilities in percussionists generalize to rhythms without a musical beat.
Cameron, Daniel J; Grahn, Jessica A
2014-01-01
The ability to entrain movements to music is arguably universal, but it is unclear how specialized training may influence this. Previous research suggests that percussionists have superior temporal precision in perception and production tasks. Such superiority may be limited to temporal sequences that resemble real music or, alternatively, may generalize to musically implausible sequences. To test this, percussionists and nonpercussionists completed two tasks that used rhythmic sequences varying in musical plausibility. In the beat tapping task, participants tapped with the beat of a rhythmic sequence over 3 stages: finding the beat (as an initial sequence played), continuation of the beat (as a second sequence was introduced and played simultaneously), and switching to a second beat (the initial sequence finished, leaving only the second). The meters of the two sequences were either congruent or incongruent, as were their tempi (minimum inter-onset intervals). In the rhythm reproduction task, participants reproduced rhythms of four types, ranging from high to low musical plausibility: Metric simple rhythms induced a strong sense of the beat, metric complex rhythms induced a weaker sense of the beat, nonmetric rhythms had no beat, and jittered nonmetric rhythms also had no beat as well as low temporal predictability. For both tasks, percussionists performed more accurately than nonpercussionists. In addition, both groups were better with musically plausible than implausible conditions. Overall, the percussionists' superior abilities to entrain to, and reproduce, rhythms generalized to musically implausible sequences.
Flavour fields in steady state: stress tensor and free energy
NASA Astrophysics Data System (ADS)
Banerjee, Avik; Kundu, Arnab; Kundu, Sandipan
2016-02-01
The dynamics of a probe brane in a given gravitational background is governed by the Dirac-Born-Infeld action. The corresponding open string metric arises naturally in studying the fluctuations on the probe. In Gauge-String duality, it is known that in the presence of a constant electric field on the worldvolume of the probe, the open string metric acquires an event horizon and therefore the fluctuation modes on the probe experience an effective temperature. In this article, we bring together various properties of such a system into a formal definition and a subsequent narration of the effective thermodynamics and the stress tensor of the corresponding flavour fields, also including a non-vanishing chemical potential. In doing so, we point out a potentially infinitely degenerate scheme-dependence of regularizing the free energy, which nevertheless yields a universal contribution in certain cases. This universal piece appears as the coefficient of a log-divergence in the free energy when a space-filling probe brane is embedded in an AdS_{d+1} background, for d = 2, 4, and is related to the conformal anomaly. For the special case of d = 2, the universal factor has a striking resemblance to the well-known heat current formula in (1 + 1)-dimensional conformal field theory in steady state, which endows it with a plausible physical interpretation. Interestingly, we observe a vanishing conformal anomaly in d = 6.
NASA Astrophysics Data System (ADS)
Kwakkel, Jan; Haasnoot, Marjolijn
2015-04-01
In response to climate and socio-economic change, in various policy domains there is increasingly a call for robust plans or policies. That is, plans or policies that perform well in a very large range of plausible futures. In the literature, a wide range of alternative robustness metrics can be found. The relative merit of these alternative conceptualizations of robustness has, however, received less attention. Evidently, different robustness metrics can result in different plans or policies being adopted. This paper investigates the consequences of several robustness metrics on decision making, illustrated here by the design of a flood risk management plan. A fictitious case, inspired by a river reach in the Netherlands, is used. The performance of this system in terms of casualties, damages, and costs for flood and damage mitigation actions is explored using a time horizon of 100 years, and accounting for uncertainties pertaining to climate change and land use change. A set of candidate policy options is specified up front. This set of options includes dike raising, dike strengthening, creating more space for the river, and flood-proof building and evacuation options. The overarching aim is to design an effective flood risk mitigation strategy that is designed from the outset to be adapted over time in response to how the future actually unfolds. To this end, the plan will be based on the dynamic adaptive policy pathway approach (Haasnoot, Kwakkel et al. 2013) being used in the Dutch Delta Program. The policy problem is formulated as a multi-objective robust optimization problem (Kwakkel, Haasnoot et al. 2014). We solve the multi-objective robust optimization problem using several alternative robustness metrics, including both satisficing robustness metrics and regret-based robustness metrics. Satisficing robustness metrics focus on the performance of candidate plans across a large ensemble of plausible futures.
Regret-based robustness metrics compare the performance of a candidate plan with the performance of other candidate plans across a large ensemble of plausible futures. Initial results suggest that the simplest satisficing metric, inspired by the signal-to-noise ratio, results in very risk-averse solutions. Other satisficing metrics, which handle the average performance and the dispersion around the average separately, provide substantial additional insight into the trade-off between the average performance and the dispersion around this average. In contrast, the regret-based metrics enhance insight into the relative merits of candidate plans, while being less clear on the average performance or the dispersion around this performance. These results suggest that it is beneficial to use multiple robustness metrics when doing a robust decision analysis study. Haasnoot, M., J. H. Kwakkel, W. E. Walker and J. Ter Maat (2013). "Dynamic Adaptive Policy Pathways: A New Method for Crafting Robust Decisions for a Deeply Uncertain World." Global Environmental Change 23(2): 485-498. Kwakkel, J. H., M. Haasnoot and W. E. Walker (2014). "Developing Dynamic Adaptive Policy Pathways: A computer-assisted approach for developing adaptive strategies for a deeply uncertain world." Climatic Change.
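The two families of metrics contrasted above can be sketched in a few lines. This is an illustrative sketch, not the study's implementation: the higher-is-better performance convention, the mean/standard-deviation form of the signal-to-noise metric, and the plan names are assumptions.

```python
from statistics import mean, stdev

def satisficing_sn(perf):
    """Signal-to-noise style satisficing metric: reward high average
    performance across plausible futures, penalize dispersion."""
    return mean(perf) / stdev(perf)

def max_regret(plans):
    """Regret-based metric (lower is better): for each plan, the worst
    shortfall relative to the best plan in each plausible future.
    plans maps a plan name to its performance in each future."""
    names = list(plans)
    n_futures = len(next(iter(plans.values())))
    best = [max(plans[p][i] for p in names) for i in range(n_futures)]
    return {p: max(best[i] - plans[p][i] for i in range(n_futures))
            for p in names}

# A risky plan with a high best case versus a steadier plan:
plans = {"raise_dikes": [10, 2], "room_for_river": [7, 6]}
print(max_regret(plans))  # {'raise_dikes': 4, 'room_for_river': 3}
print(satisficing_sn(plans["room_for_river"])
      > satisficing_sn(plans["raise_dikes"]))  # True: steady plan wins
```

Note how the two metrics already disagree in spirit: regret compares plans against each other per future, while the satisficing metric evaluates each plan in isolation.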
When does the future begin? Time metrics matter, connecting present and future selves.
Lewis, Neil A; Oyserman, Daphna
2015-06-01
People assume they should attend to the present; their future self can handle the future. This seemingly plausible rule of thumb can lead people astray, in part because some future events require current action. In order for the future to energize and motivate current action, it must feel imminent. To create this sense of imminence, we manipulated time metric--the units (e.g., days, years) in which time is considered. People interpret accessible time metrics in two ways: If preparation for the future is under way (Studies 1 and 2), people interpret metrics as implying when a future event will occur. If preparation is not under way (Studies 3-5), they interpret metrics as implying when preparation should start (e.g., planning to start saving 4 times sooner for a retirement in 10,950 days instead of 30 years). Time metrics mattered not because they changed how distal or important future events felt (Study 6), but because they changed how connected and congruent their current and future selves felt (Study 7). © The Author(s) 2015.
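The day/year equivalence in the retirement example above is plain arithmetic under the 365-day-year convention the figure implies:

```python
# The abstract's worked example: a retirement 30 years away, re-expressed
# in days (the 10,950-day figure implies 365-day years).
years = 30
days_per_year = 365
print(years * days_per_year)  # 10950
```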
Matott, L Shawn; Jiang, Zhengzheng; Rabideau, Alan J; Allen-King, Richelle M
2015-01-01
Numerous isotherm expressions have been developed for describing sorption of hydrophobic organic compounds (HOCs), including "dual-mode" approaches that combine nonlinear behavior with a linear partitioning component. Choosing among these alternative expressions for describing a given dataset is an important task that can significantly influence subsequent transport modeling and/or mechanistic interpretation. In this study, a series of numerical experiments were undertaken to identify "best-in-class" isotherms by refitting 10 alternative models to a suite of 13 previously published literature datasets. The corrected Akaike Information Criterion (AICc) was used for ranking these alternative fits and distinguishing between plausible and implausible isotherms for each dataset. The occurrence of multiple plausible isotherms was inversely correlated with dataset "richness", such that datasets with fewer observations and/or a narrow range of aqueous concentrations resulted in a greater number of plausible isotherms. Overall, only the Polanyi-partition dual-mode isotherm was classified as "plausible" across all 13 of the considered datasets, indicating substantial statistical support consistent with current advances in sorption theory. However, these findings are predicated on the use of the AICc measure as an unbiased ranking metric and the adoption of a subjective, but defensible, threshold for separating plausible and implausible isotherms. Copyright © 2015 Elsevier B.V. All rights reserved.
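The AICc-based ranking described above can be sketched as follows. The least-squares form of AIC, the small-sample correction term, and the Δ-AICc cutoff of 2.0 for "plausible" are standard conventions assumed here (the abstract calls its threshold subjective but does not state it); the model names and fit statistics are hypothetical.

```python
import math

def aicc(rss, n, k):
    """Corrected Akaike Information Criterion for a least-squares fit:
    rss = residual sum of squares, n = observations, k = parameters."""
    aic = n * math.log(rss / n) + 2 * k
    return aic + (2 * k * (k + 1)) / (n - k - 1)

def classify(models, n, threshold=2.0):
    """Split candidate isotherms into plausible/implausible by delta-AICc.
    models maps a name to (rss, k); threshold is an assumed rule of thumb."""
    scores = {m: aicc(rss, n, k) for m, (rss, k) in models.items()}
    best = min(scores.values())
    return {m: "plausible" if s - best < threshold else "implausible"
            for m, s in scores.items()}

# Hypothetical fits of two isotherm models to a 13-point dataset:
models = {"linear_partition": (20.0, 2), "polanyi_partition": (4.0, 5)}
print(classify(models, n=13))
```

The correction term 2k(k+1)/(n-k-1) is what penalizes parameter-rich dual-mode models on the small, narrow datasets the abstract describes.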
Formal matched asymptotics for degenerate Ricci flow neckpinches
NASA Astrophysics Data System (ADS)
Angenent, Sigurd B.; Isenberg, James; Knopf, Dan
2011-08-01
Gu and Zhu (2008 Commun. Anal. Geom. 16 467-94) have shown that type-II Ricci flow singularities develop from nongeneric rotationally symmetric Riemannian metrics on S^{n+1} (n ≥ 2). In this paper, we describe and provide plausibility arguments for a detailed asymptotic profile and rate of curvature blow-up that we predict such solutions exhibit.
A biologically plausible computational model for auditory object recognition.
Larson, Eric; Billimoria, Cyrus P; Sen, Kamal
2009-01-01
Object recognition is a task of fundamental importance for sensory systems. Although this problem has been intensively investigated in the visual system, relatively little is known about the recognition of complex auditory objects. Recent work has shown that spike trains from individual sensory neurons can be used to discriminate between and recognize stimuli. Multiple groups have developed spike similarity or dissimilarity metrics to quantify the differences between spike trains. Using a nearest-neighbor approach, the spike similarity metrics can be used to classify the stimuli into the groups used to evoke the spike trains. The nearest prototype spike train to the tested spike train can then be used to identify the stimulus. However, how biological circuits might perform such computations remains unclear. Elucidating this question would facilitate the experimental search for such circuits in biological systems, as well as the design of artificial circuits that can perform such computations. Here we present a biologically plausible model for discrimination inspired by a spike distance metric, using a network of integrate-and-fire model neurons coupled to a decision network. We then apply this model to the birdsong system in the context of song discrimination and recognition. We show that the model circuit is effective at recognizing individual songs, based on experimental input data from field L, the avian primary auditory cortex analog. We also compare the performance and robustness of this model to two alternative models of song discrimination: a model based on coincidence detection and a model based on firing rate.
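The distance-metric core of such a classifier is compact, though the paper's contribution is a spiking-network implementation of it. A sketch assuming a Victor-Purpura-style edit distance (cost 1 to insert or delete a spike, q·|Δt| to shift one) and nearest-prototype labeling; the song names, spike times, and cost parameter q are hypothetical.

```python
def spike_distance(a, b, q=10.0):
    """Victor-Purpura-style edit distance between two sorted lists of
    spike times: a dynamic program over delete (1), insert (1), and
    shift (q * |dt|) operations."""
    na, nb = len(a), len(b)
    d = [[0.0] * (nb + 1) for _ in range(na + 1)]
    for i in range(na + 1):
        d[i][0] = float(i)          # delete all remaining spikes of a
    for j in range(nb + 1):
        d[0][j] = float(j)          # insert all remaining spikes of b
    for i in range(1, na + 1):
        for j in range(1, nb + 1):
            d[i][j] = min(d[i - 1][j] + 1,                      # delete
                          d[i][j - 1] + 1,                      # insert
                          d[i - 1][j - 1] + q * abs(a[i - 1] - b[j - 1]))
    return d[na][nb]

def nearest_prototype(test_train, prototypes):
    """Label a spike train with the name of its nearest prototype train."""
    return min(prototypes, key=lambda s: spike_distance(test_train, prototypes[s]))

prototypes = {"song_A": [0.10, 0.20, 0.30], "song_B": [0.50, 0.90]}
print(nearest_prototype([0.12, 0.21, 0.29], prototypes))  # song_A
```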
NASA Astrophysics Data System (ADS)
Hervik, S.; Málek, T.; Pravda, V.; Pravdová, A.
2015-12-01
We study type II universal metrics of the Lorentzian signature. These metrics simultaneously solve vacuum field equations of all theories of gravitation with the Lagrangian being a polynomial curvature invariant constructed from the metric, the Riemann tensor and its covariant derivatives of an arbitrary order. We provide examples of type II universal metrics for all composite number dimensions. On the other hand, we have no examples for prime number dimensions, and we prove the non-existence of type II universal spacetimes in five dimensions. We also present type II vacuum solutions of selected classes of gravitational theories, such as Lovelock, quadratic and L(Riemann) gravities.
Quantum-metric contribution to the pair mass in spin-orbit-coupled Fermi superfluids
NASA Astrophysics Data System (ADS)
Iskin, M.
2018-03-01
As a measure of the quantum distance between Bloch states in the Hilbert space, the quantum metric was introduced to solid-state physics through the real part of the so-called geometric Fubini-Study tensor, the imaginary part of which corresponds to the Berry curvature measuring the emergent gauge field in momentum space. Here, we first derive the Ginzburg-Landau theory near the critical superfluid transition temperature and then identify and analyze the geometric effects on the effective mass tensor of the Cooper pairs. By showing that the quantum-metric contribution accounts for a sizable fraction of the pair mass in a surprisingly large parameter regime throughout the BCS-Bose-Einstein condensate crossover, we not only reveal the physical origin of its governing role in the superfluid density tensor but also hint at its plausible roles in many other observables.
Is thermodynamic irreversibility a consequence of the expansion of the Universe?
NASA Astrophysics Data System (ADS)
Osváth, Szabolcs
2018-02-01
This paper explains thermodynamic irreversibility by applying the expansion of the Universe to thermodynamic systems. The effect of metric expansion is immeasurably small on scales shorter than intergalactic distances. Multi-particle systems, however, are chaotic, and amplify any small disturbance exponentially. Metric expansion gives rise to time-asymmetric behaviour in thermodynamic systems in a short time (a few nanoseconds in air, a few tens of picoseconds in water). In contrast to existing publications, this paper explains, without any additional assumptions, the rise of thermodynamic irreversibility from the underlying reversible mechanics of particles. Calculations for the special case which assumes an FLRW metric, slow motions (v ≪ c), and approximates space locally by Euclidean space show that metric expansion causes entropy increase in isolated systems. The rise of time-asymmetry, however, is not affected by these assumptions. Any influence of the expansion of the Universe on the local metric causes a coupling between local mechanics and the evolution of the Universe.
Perceptual Real-Time 2D-to-3D Conversion Using Cue Fusion.
Leimkuhler, Thomas; Kellnhofer, Petr; Ritschel, Tobias; Myszkowski, Karol; Seidel, Hans-Peter
2018-06-01
We propose a system to infer binocular disparity from a monocular video stream in real time. Different from classic reconstruction of physical depth in computer vision, we compute perceptually plausible disparity that is numerically inaccurate but results in a very similar overall depth impression, with plausible overall layout, sharp edges, fine details, and agreement between luminance and disparity. We use several simple monocular cues to estimate disparity maps and confidence maps of low spatial and temporal resolution in real time. These are complemented by spatially varying, appearance-dependent and class-specific disparity prior maps, learned from example stereo images. Scene classification selects this prior at runtime. Fusion of prior and cues is done by means of robust MAP inference on a dense spatio-temporal conditional random field with high spatial and temporal resolution. Using normal distributions allows this in constant-time, parallel per-pixel work. We compare our approach to previous 2D-to-3D conversion systems in terms of different metrics, as well as a user study, and validate our notion of perceptually plausible disparity.
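The constant-time per-pixel work that normal distributions permit can be illustrated directly: the product of two Gaussians is again Gaussian, with a precision-weighted mean, so cue and prior can be fused in closed form. This is a sketch of that general idea only; the exact per-pixel update inside the paper's CRF inference is not given in the abstract, and the function and variable names are assumptions.

```python
def fuse_gaussians(mu_cue, var_cue, mu_prior, var_prior):
    """Fuse a per-pixel disparity cue with a learned prior, both modeled
    as Gaussians: the posterior mean is the precision-weighted average,
    and the precisions add."""
    precision = 1.0 / var_cue + 1.0 / var_prior
    mu = (mu_cue / var_cue + mu_prior / var_prior) / precision
    return mu, 1.0 / precision

# A confident cue (small variance) pulls the fused disparity toward itself:
print(fuse_gaussians(1.0, 0.1, 3.0, 1.0))
```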
Grading the Metrics: Performance-Based Funding in the Florida State University System
ERIC Educational Resources Information Center
Cornelius, Luke M.; Cavanaugh, Terence W.
2016-01-01
A policy analysis of Florida's 10-factor Performance-Based Funding system for state universities. The focus of the article is on the system of performance metrics developed by the state Board of Governors and their impact on institutions and their missions. The paper also discusses problems and issues with the metrics, their ongoing evolution, and…
Stites, Steven; Vansaghi, Lisa; Pingleton, Susan; Cox, Glendon; Paolo, Anthony
2005-12-01
The authors report the development of a new metric for distributing university funds to support faculty efforts in education in the department of internal medicine at the University of Kansas School of Medicine. In 2003, a committee defined the educational value unit (EVU), which describes and measures the specific types of educational work done by faculty members, such as core education, clinical teaching, and administration of educational programs. The specific work profile of each faculty member was delineated. A dollar value was calculated for each 0.1 EVU. The metric was prospectively applied and a faculty survey was performed to evaluate the faculty's perception of the metric. Application of the metric resulted in a decrease in university support for 34 faculty and an increase in funding for 23 faculty. Total realignment of funding was US$1.6 million, or an absolute value of US$29,072 +/- 38,320.00 in average shift of university salary support per faculty member. Survey results showed that understanding of the purpose of university funding was enhanced, and that faculty members perceived a more equitable alignment of teaching effort with funding. The EVU metric resulted in a dramatic realignment of university funding for educational efforts in the department of internal medicine. The metric was easily understood, quickly implemented, and perceived to be fair by the faculty. By aligning specific salary support with faculty's educational responsibilities, a foundation was created for applying mission-based incentive programs.
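The unit-to-dollars mapping described above is straightforward to sketch. The per-0.1-EVU dollar value and the faculty EVU profiles below are hypothetical; the abstract does not state the actual rate used at Kansas.

```python
def evu_salary_support(profiles, dollars_per_tenth_evu):
    """Convert each faculty member's total EVUs (summed across core
    education, clinical teaching, and administrative roles) into
    university salary support, at a fixed dollar value per 0.1 EVU."""
    return {name: evu * 10 * dollars_per_tenth_evu
            for name, evu in profiles.items()}

# Hypothetical profiles and rate:
profiles = {"faculty_a": 0.5, "faculty_b": 1.2}
print(evu_salary_support(profiles, dollars_per_tenth_evu=2500))
```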
Language Games: University Responses to Ranking Metrics
ERIC Educational Resources Information Center
Heffernan, Troy A.; Heffernan, Amanda
2018-01-01
League tables of universities that measure performance in various ways are now commonplace, with numerous bodies providing their own rankings of how institutions throughout the world are seen to be performing on a range of metrics. This paper uses Lyotard's notion of language games to theorise that universities are regaining some power over being…
NASA Astrophysics Data System (ADS)
Pitts, J. Brian
2016-02-01
What if gravity satisfied the Klein-Gordon equation? Both particle physics from the 1920s-30s and the 1890s Neumann-Seeliger modification of Newtonian gravity with exponential decay suggest considering a "graviton mass term" for gravity, which is algebraic in the potential. Unlike Nordström's "massless" theory, massive scalar gravity is strictly special relativistic in the sense of being invariant under the Poincaré group but not the 15-parameter Bateman-Cunningham conformal group. It therefore exhibits the whole of Minkowski space-time structure, albeit only indirectly concerning volumes. Massive scalar gravity is plausible in terms of relativistic field theory, while violating most interesting versions of Einstein's principles of general covariance, general relativity, equivalence, and Mach. Geometry is a poor guide to understanding massive scalar gravity(s): matter sees a conformally flat metric due to universal coupling, but gravity also sees the rest of the flat metric (barely, or on long distances) in the mass term. What is the 'true' geometry, one might wonder, in line with Poincaré's modal conventionality argument? Infinitely many theories exhibit this bimetric 'geometry,' all with the total stress-energy's trace as source; thus geometry does not explain the field equations. The irrelevance of the Ehlers-Pirani-Schild construction to a critique of conventionalism becomes evident when multi-geometry theories are contemplated. Much as Seeliger envisaged, the smooth massless limit indicates underdetermination of theories by data between massless and massive scalar gravities; indeed, an unconceived alternative.
At least one version easily could have been developed before General Relativity; it then would have motivated thinking of Einstein's equations along the lines of Einstein's newly re-appreciated "physical strategy" and particle physics and would have suggested a rivalry from massive spin 2 variants of General Relativity (massless spin 2, Pauli and Fierz found in 1939). The Putnam-Grünbaum debate on conventionality is revisited with an emphasis on the broad modal scope of conventionalist views. Massive scalar gravity thus contributes to a historically plausible rational reconstruction of much of 20th-21st century space-time philosophy in the light of particle physics. An appendix reconsiders the Malament-Weatherall-Manchak conformal restriction of conventionality and constructs the 'universal force' influencing the causal structure. Subsequent works will discuss how massive gravity could have provided a template for a more Kant-friendly space-time theory that would have blocked Moritz Schlick's supposed refutation of synthetic a priori knowledge, and how Einstein's false analogy between the Neumann-Seeliger-Einstein modification of Newtonian gravity and the cosmological constant Λ generated lasting confusion that obscured massive gravity as a conceptual possibility.
NASA Astrophysics Data System (ADS)
Austin, Rickey W.
In Einstein's theory of Special Relativity (SR), one method to derive relativistic kinetic energy is to apply the classical work-energy theorem to relativistic momentum. This approach starts with a classically based work-energy theorem and applies SR's momentum to the derivation. One outcome of this derivation is relativistic kinetic energy. From this derivation, it is rather straightforward to form a kinetic-energy-based time dilation function. In the derivation of General Relativity, a common approach is to bypass classical laws as a starting point. Instead, a rigorous development of differential geometry and Riemannian space is constructed, from which classically based laws are derived. This is in contrast to SR's approach of starting with classical laws and applying the consequences of the universal speed of light for all observers. A possible method to derive time dilation due to Newtonian gravitational potential energy (NGPE) is to apply SR's approach to deriving relativistic kinetic energy. It will be shown that this method gives first-order accuracy compared to Schwarzschild's metric. The SR kinetic energy and the newly derived NGPE term are combined to form a Riemannian metric based on these two energies. A geodesic is derived and calculations are compared to Schwarzschild's geodesic for an orbiting test mass about a central, non-rotating, non-charged massive body. The new metric results in high-accuracy calculations when compared to the predictions of Einstein's General Relativity. The new method provides a candidate approach for starting with classical laws and deriving General Relativity effects. This approach mimics SR's method of starting with classical mechanics when deriving relativistic equations. As a complement to introducing General Relativity, it provides a plausible scaffolding method from classical physics when teaching introductory General Relativity. A straightforward path from classical laws to General Relativity is derived. This derivation provides at least first-order accuracy relative to Schwarzschild's solution of Einstein's field equations.
Scholarly Metrics Baseline: A Survey of Faculty Knowledge, Use, and Opinion about Scholarly Metrics
ERIC Educational Resources Information Center
DeSanto, Dan; Nichols, Aaron
2017-01-01
This article presents the results of a faculty survey conducted at the University of Vermont during academic year 2014-2015. The survey asked faculty about: familiarity with scholarly metrics, metric-seeking habits, help-seeking habits, and the role of metrics in their department's tenure and promotion process. The survey also gathered faculty…
Carlisle, Daren M.; Bryant, Wade L.
2011-01-01
Many physicochemical factors potentially impair stream ecosystems in urbanizing basins, but few studies have evaluated their relative importance simultaneously, especially in different environmental settings. We used data collected in 25 to 30 streams along a gradient of urbanization in each of 6 metropolitan areas (MAs) to evaluate the relative importance of 11 physicochemical factors on the condition of algal, macroinvertebrate, and fish assemblages. For each assemblage, biological condition was quantified using 2 separate metrics, nonmetric multidimensional scaling ordination site scores and the ratio of observed/expected taxa, both derived in previous studies. Separate linear regression models with 1 or 2 factors as predictors were developed for each MA and assemblage metric. Model parsimony was evaluated based on Akaike’s Information Criterion for small sample size (AICc) and Akaike weights, and variable importance was estimated by summing the Akaike weights across models containing each stressor variable. Few of the factors were strongly correlated (Pearson |r| > 0.7) within MAs. Physicochemical factors explained 17 to 81% of variance in biological condition. Most (92 of 118) of the most plausible models contained 2 predictors, and generally more variance could be explained by the additive effects of 2 factors than by any single factor alone. None of the factors evaluated was universally important for all MAs or biological assemblages. The relative importance of factors varied for different measures of biological condition, biological assemblages, and MA. Our results suggest that the suite of physicochemical factors affecting urban stream ecosystems varies across broad geographic areas, along gradients of urban intensity, and among basins within single MAs.
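The weighting scheme described above (Akaike weights summed across the models containing each factor) can be sketched as follows. The model AICc scores and factor sets below are hypothetical, and the standard relative-likelihood form of the Akaike weight is assumed.

```python
import math

def akaike_weights(aicc_scores):
    """Turn AICc scores into normalized model weights: each model's
    relative likelihood exp(-delta/2), divided by the total."""
    best = min(aicc_scores.values())
    rel = {m: math.exp(-0.5 * (a - best)) for m, a in aicc_scores.items()}
    total = sum(rel.values())
    return {m: r / total for m, r in rel.items()}

def factor_importance(aicc_scores, model_factors):
    """Importance of each physicochemical factor: the sum of the Akaike
    weights of the models that contain it."""
    w = akaike_weights(aicc_scores)
    importance = {}
    for model, factors in model_factors.items():
        for f in factors:
            importance[f] = importance.get(f, 0.0) + w[model]
    return importance

# Hypothetical 1- and 2-predictor models for one assemblage metric:
scores = {"m1": 100.0, "m2": 102.0, "m3": 106.0}
factors = {"m1": ["urban"], "m2": ["urban", "flow"], "m3": ["flow"]}
print(factor_importance(scores, factors))
```

A factor appearing only in poorly ranked models accumulates little weight, which is how the study separates important from unimportant stressors.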
Foundations for a theory of gravitation theories
NASA Technical Reports Server (NTRS)
Thorne, K. S.; Lee, D. L.; Lightman, A. P.
1972-01-01
A foundation is laid for future analyses of gravitation theories. This foundation is applicable to any theory formulated in terms of geometric objects defined on a 4-dimensional spacetime manifold. The foundation consists of (1) a glossary of fundamental concepts; (2) a theorem that delineates the overlap between Lagrangian-based theories and metric theories; (3) a conjecture (due to Schiff) that the Weak Equivalence Principle implies the Einstein Equivalence Principle; and (4) a plausibility argument supporting this conjecture for the special case of relativistic, Lagrangian-based theories.
Defining Sustainability Metric Targets in an Institutional Setting
ERIC Educational Resources Information Center
Rauch, Jason N.; Newman, Julie
2009-01-01
Purpose: The purpose of this paper is to expand on the development of university and college sustainability metrics by implementing an adaptable metric target strategy. Design/methodology/approach: A combined qualitative and quantitative methodology is derived that both defines what a sustainable metric target might be and describes the path a…
Plausibility Arguments and Universal Gravitation
ERIC Educational Resources Information Center
Cunha, Ricardo F. F.; Tort, A. C.
2017-01-01
Newton's law of universal gravitation underpins our understanding of the dynamics of the Solar System and of a good portion of the observable universe. Generally, in the classroom or in textbooks, the law is presented initially in a qualitative way and at some point during the exposition its mathematical formulation is written on the blackboard…
The metric system: An introduction
NASA Astrophysics Data System (ADS)
Lumley, Susan M.
On 13 Jul. 1992, Deputy Director Duane Sewell restated the Laboratory's policy on conversion to the metric system which was established in 1974. Sewell's memo announced the Laboratory's intention to continue metric conversion on a reasonable and cost-effective basis. Copies of the 1974 and 1992 Administrative Memos are contained in the Appendix. There are three primary reasons behind the Laboratory's conversion to the metric system. First, Public Law 100-418, passed in 1988, states that by the end of fiscal year 1992 the Federal Government must begin using metric units in grants, procurements, and other business transactions. Second, on 25 Jul. 1991, President George Bush signed Executive Order 12770 which urged Federal agencies to expedite conversion to metric units. Third, the contract between the University of California and the Department of Energy calls for the Laboratory to convert to the metric system. Thus, conversion to the metric system is a legal requirement and a contractual mandate with the University of California. Public Law 100-418 and Executive Order 12770 are discussed in more detail later in this section, but first we examine the reasons behind the nation's conversion to the metric system. The second part of this report is on applying the metric system.

Inflaton and metric fluctuations in the early universe from a 5D vacuum state
NASA Astrophysics Data System (ADS)
Membiela, Agustin; Bellini, Mauricio
2006-04-01
In this Letter we complete a previously introduced formalism for studying the gauge-invariant metric fluctuations from a noncompact Kaluza-Klein theory of gravity, in order to study the evolution of the early universe. The evolutions of the metric and inflaton field fluctuations are reciprocally related. We obtain that <δρ>/ρ depends on the coupling of Φ with δφ and that the spectral index of its spectrum is 0.9483
Reconstructing the metric of the local Universe from number counts observations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vallejo, Sergio Andres; Romano, Antonio Enea, E-mail: antonio.enea.romano@cern.ch
Number counts observations available with new surveys such as the Euclid mission will be an important source of information about the metric of the Universe. We compute the low red-shift expansion for the energy density and the density contrast using an exact spherically symmetric solution in the presence of a cosmological constant. At low red-shift the expansion is more precise than the linear perturbation theory prediction. We then use the local expansion to reconstruct the metric from the monopole of the density contrast. We test the inversion method using numerical calculations and find good agreement within the regime of validity of the red-shift expansion. The method could be applied to observational data to reconstruct the metric of the local Universe with a level of precision higher than the one achievable using perturbation theory.
Tide or Tsunami? The Impact of Metrics on Scholarly Research
ERIC Educational Resources Information Center
Bonnell, Andrew G.
2016-01-01
Australian universities are increasingly resorting to the use of journal metrics such as impact factors and ranking lists in appraisal and promotion processes, and are starting to set quantitative "performance expectations" which make use of such journal-based metrics. The widespread use and misuse of research metrics is leading to…
Negotiating plausibility: intervening in the future of nanotechnology.
Selin, Cynthia
2011-12-01
The national-level scenarios project NanoFutures focuses on the social, political, economic, and ethical implications of nanotechnology, and is initiated by the Center for Nanotechnology in Society at Arizona State University (CNS-ASU). The project involves novel methods for the development of plausible visions of nanotechnology-enabled futures, elucidates public preferences for various alternatives, and, using such preferences, helps refine future visions for research and outreach. In doing so, the NanoFutures project aims to address a central question: how to deliberate the social implications of an emergent technology whose outcomes are not known. The solution pursued by the NanoFutures project is twofold. First, NanoFutures limits speculation about the technology to plausible visions. This ambition introduces a host of concerns about the limits of prediction, the nature of plausibility, and how to establish plausibility. Second, it subjects these visions to democratic assessment by a range of stakeholders, thus raising methodological questions as to who are relevant stakeholders and how to activate different communities so as to engage the far future. This article makes the dilemmas posed by decisions about such methodological issues transparent and therefore articulates the role of plausibility in anticipatory governance.
da Silva, Lucas Goulart; Ribeiro, Milton Cezar; Hasui, Érica; da Costa, Carla Aparecida; da Cunha, Rogério Grassetto Teixeira
2015-01-01
Forest fragmentation and habitat loss are among the major current extinction causes. Remaining fragments are mostly small, isolated and of poor quality. Being primarily arboreal, Neotropical primates are generally sensitive to fragmentation effects. Furthermore, primates are involved in complex ecological processes. Thus, landscape changes that negatively interfere with primate population dynamics affect the structure, composition, and ultimately the viability of the whole community. We evaluated whether fragment size, isolation, visibility, and matrix permeability are important for explaining the occurrence of three Neotropical primate species. Employing playback, we verified the presence of Callicebus nigrifrons, Callithrix aurita and Sapajus nigritus at 45 forest fragments around the municipality of Alfenas, Brazil. We classified the landscape and evaluated the metrics through predictive models of occurrence. We selected the best models through the Akaike information criterion. To validate our results, we applied the plausible models to another region (20 fragments in the neighboring municipality of Poço Fundo, Brazil). Twelve models were plausible, and three were validated, two for Sapajus nigritus (Area and Area+Visibility) and one for Callicebus nigrifrons (Area+Matrix). Our results reinforce the contribution of fragment size to maintaining biodiversity within highly degraded habitats. At the same time, they stress the importance of including novel, biologically relevant metrics in landscape studies, such as visibility and matrix permeability, which can provide invaluable help for similar studies in the future and for conservation practices in the long run. PMID:25658108
Small business activity does not measure entrepreneurship.
Henrekson, Magnus; Sanandaji, Tino
2014-02-04
Entrepreneurship policy mainly aims to promote innovative Schumpeterian entrepreneurship. However, the rate of entrepreneurship is commonly proxied using quantity-based metrics, such as small business activity, the self-employment rate, or the number of startups. We argue that those metrics give rise to misleading inferences regarding high-impact Schumpeterian entrepreneurship. To unambiguously identify high-impact entrepreneurs we focus on self-made billionaires (in US dollars) who appear on Forbes Magazine's list and who became wealthy by founding new firms. We identify 996 such billionaire entrepreneurs in 50 countries in 1996-2010, a systematic cross-country study of billionaire entrepreneurs. The rate of billionaire entrepreneurs correlates negatively with self-employment, small business ownership, and firm startup rates. Countries with higher income, higher trust, lower taxes, more venture capital investment, and lower regulatory burdens have higher billionaire entrepreneurship rates but less self-employment. Despite its limitations, the number of billionaire entrepreneurs appears to be a plausible cross-country measure of Schumpeterian entrepreneurship.
Supernovae, an accelerating universe and the cosmological constant
Kirshner, Robert P.
1999-01-01
Observations of supernova explosions halfway back to the Big Bang give plausible evidence that the expansion of the universe has been accelerating since that epoch, approximately 8 billion years ago and suggest that energy associated with the vacuum itself may be responsible for the acceleration. PMID:10200242
"Ex Corde Universitatis": From the Heart of the University
ERIC Educational Resources Information Center
O'Brien, George Dennis
2004-01-01
This paper explores the place of religion within the assumptions of the modern research university. The issue for Christianity is essentially epistemic: Given the criteria for truth or plausibility that prevail in advanced academic communities, what are the warrants for Christian belief? Are the prevailing criteria defined such that Christian…
Extraterrestrial intelligent beings do not exist
NASA Astrophysics Data System (ADS)
Tipler, F. J.
1980-09-01
The singularity vs. the plurality of inhabited worlds in the universe is debated. Attention is given to astrophysical constraints on the evolution of intelligent species and to motivations for interstellar communication and exploration. It is argued that it is plausible that there is only one inhabited planet in the universe.
Numerical Solution of the Problem of the Expansion of the Universe in the Schwarzschild Metric
NASA Astrophysics Data System (ADS)
Vasenin, I. M.; Goiko, V. L.
2018-02-01
The statement and solution of the problem of the expansion of the Universe in nonstationary spherically-symmetric coordinates in the Schwarzschild metric are considered without pressure taken into account. The observational data of astronomers investigating the rates of recession of distant stars are explained on the basis of the obtained solutions.
ERIC Educational Resources Information Center
Oonyu, Joseph C.; Wamala, Robert
2012-01-01
This paper investigates the influence of the examination stage of student theses on the completion time of graduate studies at Makerere University, Uganda. The assessment is based on the administrative data of 504 Master's degree students in the 2000 to 2008 enrollment cohorts at the School of Education, Makerere University. The total elapsed time…
Generalized Gödel universes in higher dimensions and pure Lovelock gravity
NASA Astrophysics Data System (ADS)
Dadhich, Naresh; Molina, Alfred; Pons, Josep M.
2017-10-01
The Gödel universe is a homogeneous rotating dust with negative Λ which is a direct product of a three-dimensional pure rotation metric with a line. We generalize it to higher dimensions for Einstein and pure Lovelock gravity with only one Nth-order term. For the higher-dimensional generalization, we have to include more rotations in the metric, and hence we begin with the corresponding pure rotation odd (d = 2n + 1)-dimensional metric involving n rotations, which eventually can be extended by a direct product with a line or a space of constant curvature to yield a higher-dimensional Gödel universe. The inclusion of n rotations and of constant-curvature spaces is a new line of generalization, considered here for the first time.
A general relativistic rotating evolutionary universe—Part II
NASA Astrophysics Data System (ADS)
Berman, Marcelo Samuel
2008-06-01
As a sequel to Berman (Astrophys. Space Sci., 2008b), we show that the rotation of the Universe can be dealt with by the generalised Gaussian metrics defined in this paper. Robertson-Walker's metric has been employed with proper time in its standard applications; the generalised Gaussian metric implies the use of a non-constant temporal metric coefficient, modifying Robertson-Walker's standard form. Experimental predictions are made.
Bhave, Ajay Gajanan; Conway, Declan; Dessai, Suraje; Stainforth, David A.
2018-01-01
Decision-Making Under Uncertainty (DMUU) approaches have been less utilized in developing countries than developed countries for water resources contexts. High climate vulnerability and rapid socioeconomic change often characterize developing country contexts, making DMUU approaches relevant. We develop an iterative multi-method DMUU approach, including scenario generation, coproduction with stakeholders and water resources modeling. We apply this approach to explore the robustness of adaptation options and pathways against future climate and socioeconomic uncertainties in the Cauvery River Basin in Karnataka, India. A water resources model is calibrated and validated satisfactorily using observed streamflow. Plausible future changes in Indian Summer Monsoon (ISM) precipitation and water demand are used to drive simulations of water resources from 2021 to 2055. Two stakeholder-identified decision-critical metrics are examined: a basin-wide metric comprising legal instream flow requirements for the downstream state of Tamil Nadu, and a local metric comprising water supply reliability to Bangalore city. In model simulations, the ability to satisfy these performance metrics without adaptation is reduced under almost all scenarios. Implementing adaptation options can partially offset the negative impacts of change. Sequencing of options according to stakeholder priorities into Adaptation Pathways affects metric satisfaction. Early focus on agricultural demand management improves the robustness of pathways but trade-offs emerge between intrabasin and basin-wide water availability. We demonstrate that the fine balance between water availability and demand is vulnerable to future changes and uncertainty. Despite current and long-term planning challenges, stakeholders in developing countries may engage meaningfully in coproduction approaches for adaptation decision-making under deep uncertainty. PMID:29706676
Gouda, Hebe N; Critchley, Julia; Powles, John; Capewell, Simon
2012-01-28
Reasons for the widespread declines in coronary heart disease (CHD) mortality in high income countries are controversial. Here we explore how the type of metric chosen for the analyses of these declines affects the answer obtained. The analyses we reviewed were performed using IMPACT, a large Excel-based model of the determinants of temporal change in mortality from CHD. Assessments of the decline in CHD mortality in the USA between 1980 and 2000 served as the central case study. Analyses based on the metric of number of deaths prevented attributed about half the decline to treatments (including preventive medications) and half to favourable shifts in risk factors. However, when mortality change was expressed in the metric of life-years gained, the share attributed to risk factor change rose to 65%. This happened because risk factor changes were modelled as slowing disease progression, such that the hypothetical deaths averted resulted in longer average remaining lifetimes gained than the deaths averted by better treatments. This result was robust to a range of plausible assumptions on the relative effect sizes of changes in treatments and risk factors. Time-based metrics (such as life-years) are generally preferable because they direct attention to the changes in the natural history of disease that are produced by changes in key health determinants. The life-years attached to each death averted will also weight deaths in a way that better reflects social preferences.
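The metric effect described above reduces to simple arithmetic: if the deaths averted by risk-factor change carry more remaining life-years each than those averted by treatment, an equal split by death counts becomes an unequal split by life-years. The numbers below are made up purely to reproduce the qualitative shift, not taken from the IMPACT model.

```python
# Toy arithmetic: equal attribution by deaths averted can become unequal
# when re-expressed in life-years. All figures are hypothetical.
deaths_averted = {"treatments": 500, "risk_factors": 500}   # equal by count
years_per_death = {"treatments": 8, "risk_factors": 15}     # assumed values

life_years = {k: deaths_averted[k] * years_per_death[k] for k in deaths_averted}
total = sum(life_years.values())
share = {k: v / total for k, v in life_years.items()}
print(share)  # the risk-factor share exceeds 50% under the life-year metric
```

With these assumed values the risk-factor share comes out near 65%, which happens to match the figure reported in the abstract, but only because the illustration numbers were chosen that way.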
Teixeira, Andreia Sofia; Monteiro, Pedro T; Carriço, João A; Ramirez, Mário; Francisco, Alexandre P
2015-01-01
Trees, including minimum spanning trees (MSTs), are commonly used in phylogenetic studies. But, for the research community, it may be unclear that the presented tree is just a hypothesis, chosen from among many possible alternatives. In this scenario, it is important to quantify our confidence in both the trees and the branches/edges included in such trees. In this paper, we address this problem for MSTs by introducing a new edge betweenness metric for undirected and weighted graphs. This spanning edge betweenness metric is defined as the fraction of equivalent MSTs where a given edge is present. The metric provides a per edge statistic that is similar to that of the bootstrap approach frequently used in phylogenetics to support the grouping of taxa. We provide methods for the exact computation of this metric based on the well known Kirchhoff's matrix tree theorem. Moreover, we implement and make available a module for the PHYLOViZ software and evaluate the proposed metric concerning both effectiveness and computational performance. Analysis of trees generated using multilocus sequence typing data (MLST) and the goeBURST algorithm revealed that the space of possible MSTs in real data sets is extremely large. Selection of the edge to be represented using bootstrap could lead to unreliable results since alternative edges are present in the same fraction of equivalent MSTs. The choice of the MST to be presented results from criteria implemented in the algorithm, which must be based on biologically plausible models.
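The definition above (fraction of equivalent MSTs containing a given edge) can be illustrated by brute force on a tiny graph. The paper computes the metric exactly via Kirchhoff's matrix-tree theorem; the sketch below instead enumerates all spanning trees, which is feasible only for very small inputs and is meant purely to make the definition concrete.

```python
# Brute-force illustration of spanning edge betweenness: for each edge,
# the fraction of minimum spanning trees (MSTs) that contain it.
from itertools import combinations

def is_spanning_tree(nodes, edge_subset):
    """True if edge_subset connects all nodes acyclically (|E| = |V| - 1)."""
    if len(edge_subset) != len(nodes) - 1:
        return False
    parent = {v: v for v in nodes}
    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]  # path halving
            v = parent[v]
        return v
    for u, v, _w in edge_subset:
        ru, rv = find(u), find(v)
        if ru == rv:                        # adding this edge makes a cycle
            return False
        parent[ru] = rv
    return True

def spanning_edge_betweenness(nodes, edges):
    trees = [t for t in combinations(edges, len(nodes) - 1)
             if is_spanning_tree(nodes, t)]
    weight = lambda t: sum(w for _u, _v, w in t)
    min_w = min(weight(t) for t in trees)
    msts = [t for t in trees if weight(t) == min_w]
    return {e: sum(e in t for t in msts) / len(msts) for e in edges}

# A 4-cycle with equal weights has four equivalent MSTs (drop any one edge);
# every edge appears in 3 of the 4, so each edge scores 0.75.
nodes = [0, 1, 2, 3]
edges = [(0, 1, 1), (1, 2, 1), (2, 3, 1), (3, 0, 1)]
print(spanning_edge_betweenness(nodes, edges))
```

The equal-weight cycle is exactly the degenerate case the abstract warns about: no single MST is privileged, and the per-edge fractions expose that ambiguity where a single displayed tree would hide it.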
Parsimony in landscape metrics: Strength, universality, and consistency
Samuel A. Cushman; Kevin McGarigal; Maile C. Neel
2008-01-01
Ecologists can be overwhelmed by the number of metrics available to quantify landscape structure. Clarification of interrelationships and redundancy is needed to guide metric selection and interpretation for the purpose of landscape monitoring. In this study we identified independent components of class- and landscape-level structure in multiple landscapes in each of...
Particle dynamics in the original Schwarzschild metric
NASA Astrophysics Data System (ADS)
Fimin, N. N.; Chechetkin, V. M.
2016-04-01
The properties of the original Schwarzschild metric for a point gravitating mass are considered. The laws of motion in the corresponding space-time are established, and the transition from the Schwarzschild metric to the metric of a "dusty universe" is studied. The dynamics of a system of particles in the post-Newtonian approximation are analyzed.
Author Impact Metrics in Communication Sciences and Disorder Research
ERIC Educational Resources Information Center
Stuart, Andrew; Faucette, Sarah P.; Thomas, William Joseph
2017-01-01
Purpose: The purpose was to examine author-level impact metrics for faculty in the communication sciences and disorder research field across a variety of databases. Method: Author-level impact metrics were collected for faculty from 257 accredited universities in the United States and Canada. Three databases (i.e., Google Scholar, ResearchGate,…
Metric Education in Mathematics Methods Classes.
ERIC Educational Resources Information Center
Trent, John H.
A pre-test on knowledge of the metric system was administered to elementary mathematics methods classes at the University of Nevada at the beginning of the 1975 Spring Semester. A one-hour lesson was prepared and taught regarding metric length, weight, volume, and temperature. At the end of the semester the original test was given as the…
Equilibrium thermodynamics and neutrino decoupling in quasi-metric cosmology
NASA Astrophysics Data System (ADS)
Østvang, Dag
2018-05-01
The laws of thermodynamics in the expanding universe are formulated within the quasi-metric framework. The quasi-metric cosmic expansion does not directly influence momenta of material particles, so the expansion directly cools null particles only (e.g., photons). Therefore, said laws differ substantially from their counterparts in standard cosmology. Consequently, all non-null neutrino mass eigenstates are predicted to have the same energy today as they had just after neutrino decoupling in the early universe. This indicates that the predicted relic neutrino background is strongly inconsistent with detection rates measured in solar neutrino detectors (Borexino in particular). Thus quasi-metric cosmology is in violent conflict with experiment unless some exotic property of neutrinos makes the relic neutrino background essentially undetectable (e.g., if all massive mass eigenstates decay into "invisible" particles over cosmic time scales). But in absence of hard evidence in favour of the necessary exotic neutrino physics needed to resolve said conflict, the current status of quasi-metric relativity has been changed to non-viable.
NASA Astrophysics Data System (ADS)
Pope, C. N.; Sohnius, M. F.; Stelle, K. S.
We show that, contrary to previous conjectures, there exist acceptable counterterms for Ricci-flat N = 1 and N = 2 supersymmetric σ-models. In the N = 1 case we present infinite sequences of counterterms, starting from the 7-loop order, that do not vanish for general Riemannian Ricci-flat metrics but do vanish when the metric is also Kähler. We then investigate the counterterms for theories with Ricci-flat Kähler metrics (i.e. N = 2 models). Acceptable counterterms must vanish for hyper-Kähler metrics (the N = 4 case), and must respect the principle of universality; i.e. that counterterms to the metric can be expressed without the use of complex structures or other special tensors, which do not exist for general Riemannian spaces. We show that a recently proposed 4-loop counterterm for the N = 2 models does indeed satisfy these two conditions, despite the apparent stringency of the universality principle. Hence the finiteness of Ricci-flat N = 1 and N = 2 supersymmetric σ-models seems unlikely to persist beyond the 3-loop order.
Institutional Nitrogen Footprint: A Case Study at Oregon State ...
Many institutions of higher education are measuring and consciously managing their impact on the environment, using metrics of energy use, recycling, alternative transportation or local foods. While the carbon footprint is a more widely known metric of sustainability, the nitrogen footprint is also an important measure of human environmental impact that comes from food, energy, transportation and waste demands of a university. Oregon State University is a large, western land-grant university that has supported a Sustainability Office for more than 10 years, and joined the institutional Nitrogen Footprint Network in 2015. This poster presentation will demonstrate the Nitrogen Footprint Tool calculations for a large land-grant institution using existing data. Goals to reduce nitrogen will be explored in relation to existing efforts within the university that aim to reduce their carbon footprint, support alternative transportation, reduce waste and increase local foods. EPA's Sustainable and Healthy Communities Research Program has been working with the University of Virginia to grow the Nitrogen Footprint Tool (NFT) network from one institution to over 16 institutions since 2014. This poster will present the nitrogen footprint results from Oregon State University. The university has been actively involved in sustainability efforts for many years, and this presentation will share how much of the data that OSU collects for their existing sustainability metrics
A Simple Principled Approach for Modeling and Understanding Uniform Color Metrics
Smet, Kevin A.G.; Webster, Michael A.; Whitehead, Lorne A.
2016-01-01
An important goal in characterizing human color vision is to order color percepts in a way that captures their similarities and differences. This has resulted in the continuing evolution of “uniform color spaces,” in which the distances within the space represent the perceptual differences between the stimuli. While these metrics are now very successful in predicting how color percepts are scaled, they do so in largely empirical, ad hoc ways, with limited reference to actual mechanisms of color vision. In this article our aim is to instead begin with general and plausible assumptions about color coding, and then develop a model of color appearance that explicitly incorporates them. We show that many of the features of empirically-defined color order systems (such as those of Munsell, Pantone, NCS, and others) as well as many of the basic phenomena of color perception, emerge naturally from fairly simple principles of color information encoding in the visual system and how it can be optimized for the spectral characteristics of the environment. PMID:26974939
On metrics and super-Riemann surfaces
NASA Astrophysics Data System (ADS)
Hodgkin, Luke
1987-08-01
It is shown that any super-Riemann surface M admits a large space of metrics (in a rather basic sense); while if M is of compact genus g type, g > 1, M admits a unique metric whose lift to the universal cover is superconformally equivalent to the standard (Baranov-Shvarts) metric on the super-half plane. This explains the relation between the different methods of calculation of the super-Teichmüller space by the author (using arbitrary superconformal transformations) and Crane and Rabin (using only isometries).
Report on metric study tour to Republic of South Africa
DOE Office of Scientific and Technical Information (OSTI.GOV)
Laner, F. J.
1978-01-01
The modernized metric system, known universally as the International System of Units (abbreviated SI under the French name) was renamed in 1960 by the world body on standards. A map shows 98 percent of the world using or moving toward adoption of SI units. Only the countries of Burma, Liberia, Brunei, and Southern Yemen are nonmetric. The author describes a two-week session in Pretoria and Johannesburg on metrication, followed by additional meetings on metrication in Rhodesia. (MCW)
On the hyperbolicity and stability of 3+1 formulations of metric f( R) gravity
NASA Astrophysics Data System (ADS)
Mongwane, Bishop
2016-11-01
3+1 formulations of the Einstein field equations have become an invaluable tool in numerical relativity, having been used successfully in modeling spacetimes of black hole collisions, stellar collapse and other complex systems. It is plausible that similar considerations could prove fruitful for modified gravity theories. In this article, we pursue from a numerical relativistic viewpoint the 3+1 formulation of metric f(R) gravity as it arises from the fourth order equations of motion, without invoking the dynamical equivalence with Brans-Dicke theories. We present the resulting system of evolution and constraint equations for a generic function f(R), subject to the usual viability conditions. We confirm that the time propagation of the f(R) Hamiltonian and momentum constraints takes the same mathematical form as in general relativity, irrespective of the f(R) model. We further recast the 3+1 system in a form akin to the BSSNOK formulation of numerical relativity. Without assuming any specific model, we show that the ADM version of f(R) is weakly hyperbolic and is plagued by similar zero speed modes as in the general relativity case. On the other hand the BSSNOK version is strongly hyperbolic and hence a promising formulation for numerical simulations in metric f(R) theories.
Snow removal performance metrics : final report.
DOT National Transportation Integrated Search
2017-05-01
This document is the final report for the Clear Roads project entitled Snow Removal Performance Metrics. The project team was led by researchers at Washington State University on behalf of Clear Roads, an ongoing pooled fund research effort focused o...
Automated Metrics in a Virtual-Reality Myringotomy Simulator: Development and Construct Validity.
Huang, Caiwen; Cheng, Horace; Bureau, Yves; Ladak, Hanif M; Agrawal, Sumit K
2018-06-15
The objectives of this study were: 1) to develop and implement a set of automated performance metrics into the Western myringotomy simulator, and 2) to establish construct validity. Prospective simulator-based assessment study. The Auditory Biophysics Laboratory at Western University, London, Ontario, Canada. Eleven participants were recruited from the Department of Otolaryngology-Head & Neck Surgery at Western University: four senior otolaryngology consultants and seven junior otolaryngology residents. Educational simulation. Discrimination between expert and novice participants on five primary automated performance metrics: 1) time to completion, 2) surgical errors, 3) incision angle, 4) incision length, and 5) the magnification of the microscope. Automated performance metrics were developed, programmed, and implemented into the simulator. Participants were given a standardized simulator orientation and instructions on myringotomy and tube placement. Each participant then performed 10 procedures and automated metrics were collected. The metrics were analyzed using the Mann-Whitney U test with Bonferroni correction. All metrics discriminated senior otolaryngologists from junior residents with a significance of p < 0.002. Junior residents had 2.8 times more errors compared with the senior otolaryngologists. Senior otolaryngologists took significantly less time to completion compared with junior residents. The senior group also had significantly longer incision lengths, more accurate incision angles, and lower magnification keeping both the umbo and annulus in view. Automated quantitative performance metrics were successfully developed and implemented, and construct validity was established by discriminating between expert and novice participants.
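As a hedged sketch of the statistical comparison described (the Mann-Whitney U test with Bonferroni correction across five metrics), the following uses only the standard library; the sample values are illustrative, not data from the study:

```python
# Sketch: Mann-Whitney U statistic with a Bonferroni-adjusted threshold.
# The completion-time samples below are hypothetical, not the study's data.

def mann_whitney_u(x, y):
    """Return the U statistic for sample x versus sample y (ties get average ranks)."""
    combined = sorted((v, i) for i, v in enumerate(x + y))
    ranks = [0.0] * len(combined)
    i = 0
    while i < len(combined):
        j = i
        while j + 1 < len(combined) and combined[j + 1][0] == combined[i][0]:
            j += 1
        avg_rank = (i + j) / 2 + 1  # average of tied positions, 1-based
        for k in range(i, j + 1):
            ranks[combined[k][1]] = avg_rank
        i = j + 1
    r1 = sum(ranks[:len(x)])                   # rank sum of the first sample
    return r1 - len(x) * (len(x) + 1) / 2      # U statistic for x

# Five metrics tested on the same two groups -> Bonferroni-adjusted alpha.
alpha, n_metrics = 0.05, 5
alpha_adj = alpha / n_metrics                  # per-test threshold of 0.01

expert_times = [4.1, 3.8, 4.5, 3.9]            # hypothetical expert completion times
novice_times = [7.2, 6.9, 8.1, 7.5, 6.4, 7.8, 8.3]
u = mann_whitney_u(expert_times, novice_times) # 0.0: complete group separation
```

A U of zero (or of n1*n2) indicates no overlap between the groups; in practice one would convert U to a p-value and compare it against `alpha_adj`.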
Performance Benchmarks for Scholarly Metrics Associated with Fisheries and Wildlife Faculty
Swihart, Robert K.; Sundaram, Mekala; Höök, Tomas O.; DeWoody, J. Andrew; Kellner, Kenneth F.
2016-01-01
Research productivity and impact are often considered in professional evaluations of academics, and performance metrics based on publications and citations increasingly are used in such evaluations. To promote evidence-based and informed use of these metrics, we collected publication and citation data for 437 tenure-track faculty members at 33 research-extensive universities in the United States belonging to the National Association of University Fisheries and Wildlife Programs. For each faculty member, we computed 8 commonly used performance metrics based on numbers of publications and citations, and recorded covariates including academic age (time since Ph.D.), sex, percentage of appointment devoted to research, and the sub-disciplinary research focus. Standardized deviance residuals from regression models were used to compare faculty after accounting for variation in performance due to these covariates. We also aggregated residuals to enable comparison across universities. Finally, we tested for temporal trends in citation practices to assess whether the “law of constant ratios”, used to enable comparison of performance metrics between disciplines that differ in citation and publication practices, applied to fisheries and wildlife sub-disciplines when mapped to Web of Science Journal Citation Report categories. Our regression models reduced deviance by ¼ to ½. Standardized residuals for each faculty member, when combined across metrics as a simple average or weighted via factor analysis, produced similar results in terms of performance based on percentile rankings. Significant variation was observed in scholarly performance across universities, after accounting for the influence of covariates. In contrast to findings for other disciplines, normalized citation ratios for fisheries and wildlife sub-disciplines increased across years. Increases were comparable for all sub-disciplines except ecology. 
We discuss the advantages and limitations of our methods, illustrate their use when applied to new data, and suggest future improvements. Our benchmarking approach may provide a useful tool to augment detailed, qualitative assessment of performance. PMID:27152838
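A minimal sketch of the covariate-adjusted benchmarking idea described above: regress a (log-transformed) performance metric on academic age, then rank faculty by their residuals as percentiles. All names and numbers here are hypothetical, and a single covariate stands in for the study's full regression models:

```python
# Sketch: residual-based benchmarking after adjusting for academic age.
# The ages and citation counts are illustrative, not the study's data.
import math

def ols_fit(x, y):
    """Least-squares intercept and slope for y = a + b*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b

def percentile_ranks(values):
    """Percentile rank (0-100) of each value within the cohort."""
    s = sorted(values)
    return [100.0 * s.index(v) / (len(values) - 1) for v in values]

ages  = [5, 10, 15, 20, 25]            # years since Ph.D. (hypothetical)
cites = [120, 300, 420, 700, 800]      # citation counts (hypothetical)
log_c = [math.log(c) for c in cites]

a, b = ols_fit(ages, log_c)
residuals = [yc - (a + b * xa) for xa, yc in zip(ages, log_c)]
ranks = percentile_ranks(residuals)    # higher = above expectation for age
```

A positive residual means the individual outperforms the cohort trend for their academic age; averaging residuals within a university gives the cross-institution comparison the abstract describes.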
Gödel metrics with chronology protection in Horndeski gravities
NASA Astrophysics Data System (ADS)
Geng, Wei-Jian; Li, Shou-Long; Lü, H.; Wei, Hao
2018-05-01
The Gödel universe, one of the most interesting exact solutions predicted by General Relativity, describes a homogeneous rotating universe containing naked closed time-like curves (CTCs). It was shown that such CTCs are a consequence of the null energy condition in General Relativity. In this paper, we show that Gödel-type metrics with chronology protection can emerge in Einstein-Horndeski gravity. We construct such exact solutions also in Einstein-Horndeski-Maxwell and Einstein-Horndeski-Proca theories.
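For reference, one standard form of the Gödel line element, as given in the general relativity literature (not quoted from this abstract; sign and coordinate conventions vary), is:

```latex
% Gödel line element (signature -,+,+,+), with constant a and
% rotation parameter \omega related by \omega^2 = 1/(2a^2):
ds^2 = a^2\left[-\left(dt + e^{x}\,dz\right)^2 + dx^2 + dy^2
       + \tfrac{1}{2}\,e^{2x}\,dz^2\right]
```

The cross term dt dz encodes the global rotation; circles of sufficiently large radius about any point become closed time-like curves, which is the pathology the chronology-protected Gödel-type solutions above avoid.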
Pop, Mihai
2018-04-27
University of Maryland's Mihai Pop on Genome Assembly Forensics: Metrics for Assessing Assembly Correctness at the Metagenomics Informatics Challenges Workshop held at the DOE JGI on October 12-13, 2011.
Of paradox and plausibility: the dynamic of change in medical law.
Harrington, John
2014-01-01
This article develops a model of change in medical law. Drawing on systems theory, it argues that medical law participates in a dynamic of 'deparadoxification' and 'reparadoxification' whereby the underlying contingency of the law is variously concealed through plausible argumentation, or revealed by critical challenge. Medical law is, thus, thoroughly rhetorical. An examination of the development of the law on abortion and on the sterilization of incompetent adults shows that plausibility is achieved through the deployment of substantive common sense and formal stylistic devices. It is undermined where these elements are shown to be arbitrary and constructed. In conclusion, it is argued that the politics of medical law are constituted by this antagonistic process of establishing and challenging provisionally stable normative regimes. © The Author [2014]. Published by Oxford University Press; all rights reserved. For Permissions, please email: journals.permissions@oup.com.
NASA Astrophysics Data System (ADS)
Malykh, A. A.; Nutku, Y.; Sheftel, M. B.
2003-11-01
Explicit Riemannian metrics with Euclidean signature and anti-self-dual curvature that do not admit any Killing vectors are presented. The metric and the Riemann curvature scalars are homogeneous functions of degree zero in a single real potential and its derivatives. The solution for the potential is a sum of exponential functions which suggests that for the choice of a suitable domain of coordinates and parameters it can be the metric on a compact manifold. Then, by the theorem of Hitchin, it could be a class of metrics on K3, or on surfaces whose universal covering is K3.
Using Publication Metrics to Highlight Academic Productivity and Research Impact
Carpenter, Christopher R.; Cone, David C.; Sarli, Cathy C.
2016-01-01
This article provides a broad overview of widely available measures of academic productivity and impact using publication data and highlights uses of these metrics for various purposes. Metrics based on publication data include measures such as number of publications, number of citations, the journal impact factor score, and the h-index, as well as emerging metrics based on document-level metrics. Publication metrics can be used for a variety of purposes for tenure and promotion, grant applications and renewal reports, benchmarking, recruiting efforts, and administrative purposes for departmental or university performance reports. The authors also highlight practical applications of measuring and reporting academic productivity and impact to emphasize and promote individual investigators, grant applications, or department output. PMID:25308141
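Among the metrics mentioned, the h-index has a particularly simple operational definition: the largest h such that the researcher has h papers with at least h citations each. A minimal sketch:

```python
# Minimal h-index computation: largest h such that h papers
# each have at least h citations.

def h_index(citations):
    counts = sorted(citations, reverse=True)
    h = 0
    for i, c in enumerate(counts, start=1):
        if c >= i:
            h = i
        else:
            break
    return h

print(h_index([10, 8, 5, 4, 3]))  # 4: four papers with >= 4 citations each
```

The same sorted-threshold pattern underlies several related indices (e.g. the g-index), differing only in the cutoff condition applied to the sorted citation counts.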
Tiyarattanachai, Ronnachai; Hollmann, Nicholas M
2016-01-01
In 2010, Universitas Indonesia (UI) developed the UI GreenMetric World University Ranking for universities to share information about their sustainability practices. This ranking system was well aligned with the basis of Sustainability for Higher Education. The scoring system can also be used as a guideline for universities to achieve sustainability in their campuses. Since its first launch, more universities around the world have increasingly participated in the ranking system including many universities in Thailand. This study compared perception of stakeholders in Green Campus and Non-Green Campus universities in Thailand regarding stakeholders' satisfaction on sustainability practices and perceived quality of life at their campuses. The results showed that stakeholders at the studied Green Campus University were more satisfied and had significantly better perceived quality of life compared to stakeholders from the studied Non-Green Campus university. The results suggested that universities should adopt the criteria set in the UI GreenMetric World University Ranking to achieve better sustainability in their campuses and improve quality of life of their stakeholders.
Institutional Nitrogen Footprint: A Case Study at Oregon State University
Many institutions of higher education are measuring and consciously managing their impact on the environment, using metrics of energy use, recycling, alternative transportation or local foods. While the carbon footprint is a more widely known metric of sustainability, the nitrog...
Evaluating Metrics of Drainage Divide Mobility
NASA Astrophysics Data System (ADS)
Forte, A. M.; Whipple, K. X.; DiBiase, R.; Gasparini, N. M.; Ouimet, W. B.
2016-12-01
Watersheds are the fundamental organizing units in landscapes, and thus the controls on drainage divide location and mobility are an essential facet of landscape evolution. Additionally, many common topographic analyses fundamentally assume that river network topology and divide locations are largely static, allowing channel profile form to be interpreted in terms of spatio-temporal patterns of rock uplift rate relative to baselevel, climate, or rock properties. Recently, however, it has been suggested that drainage divides are more mobile than previously thought and that divide mobility, and resulting changes in drainage area, can potentially induce changes to fluvial topography comparable to spatio-temporal variation in rock uplift, climate, or rock properties. Ultimately, reliable metrics are needed to diagnose the mobility of divides. One such recently proposed metric is the cross-divide contrast in 'chi', a measure of the current topology of the drainage network, but cross-divide contrasts in a number of other topographic metrics show promise. Here we use a series of landscape evolution modeling scenarios in which we induce divide mobility under different conditions to test the utility of a suite of plausible topographic metrics of divide mobility, and compare these to natural examples. Specifically, we test cross-divide contrasts in mean slope, mean local relief, channel bed elevation at a reference drainage area, and chi. Our results highlight that cross-divide contrasts in chi can only be accurately interpreted in terms of divide mobility when uplift, rock erodibility, climate, and base-level are uniform across both river networks on either side of the divide. This is problematic for application of this metric to natural landscapes because (1) uniformity of all of these parameters is exceedingly unlikely, and (2) quantifying the spatial patterns of these parameters is difficult.
Consequently, as shown here for both simulated and natural landscapes, simple measures of cross-divide contrasts in mean slope, mean local relief, and channel bed elevation at a reference drainage area are more robust metrics of divide mobility, correctly identifying stable or mobile divides independent of cross-divide differences in rock uplift, climate, erodibility or baselevel.
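The chi metric discussed above is conventionally defined as an upstream integral of drainage area, χ(x) = ∫ (A0/A(x'))^(m/n) dx'. A sketch under that usual definition (the step lengths, areas, and m/n value below are illustrative, not taken from this study):

```python
# Sketch: chi computed by upstream integration along a channel,
# chi(x) = integral of (A0 / A(x'))**(m/n) dx'.
# Step lengths, drainage areas, and m/n are illustrative values.

def chi_profile(dx_steps, areas, a0=1.0, mn=0.45):
    """Cumulative chi at each node, integrating upstream.
    dx_steps[i]: along-channel step length; areas[i]: drainage area there."""
    chi, out = 0.0, []
    for dx, a in zip(dx_steps, areas):
        chi += (a0 / a) ** mn * dx
        out.append(chi)
    return out

# Drainage area shrinks upstream, so the integrand grows toward the divide.
profile = chi_profile([100.0] * 4, [1e6, 5e5, 1e5, 1e4], a0=1e4)
# A cross-divide contrast compares chi at channel heads on either side:
# unequal values are taken (with the caveats above) to suggest divide motion.
```

Because chi depends only on network geometry and area, it inherits exactly the uniformity assumptions (uplift, erodibility, climate, baselevel) that the abstract identifies as its main limitation.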
Imprints from the global cosmological expansion to the local spacetime dynamics.
Fahr, Hans J; Siewert, Mark
2008-05-01
We study the general relativistic spacetime metrics surrounding massive cosmological objects, such as suns, stars, galaxies or galaxy clusters. The question addressed here is the transition of local, object-related spacetime metrics into the global, cosmological Robertson-Walker metric. We demonstrate that the answer often quoted for this problem in the literature, the so-called Einstein-Straus vacuole, which connects a static outer Schwarzschild solution with the time-dependent Robertson-Walker universe, is inadequate to describe the local spacetime of a gravitationally bound system. We therefore derive an alternative model describing such bound systems by a metric more closely tied to the fundamental problem of structure formation in the early universe, and obtain a multitude of solutions characterising the time-dependence of a local scale parameter. As we show, a specific solution out of this multitude is able, surprisingly, to explain as a by-product the presently much-discussed PIONEER anomaly.
A bibliometric analysis of digestive health research in Canada.
Tuitt, Desiree; Knight, Frank; Lipman, Tara
2011-11-01
Measurement of the impact and influence of medical/scientific journals, and of individual researchers has become more widely practiced in recent decades. This is driven, in part, by the increased availability of data regarding citations of research articles, and by increased competition for research funding. Digestive disease research has been identified as a particularly strong discipline in Canada. The authors collected quantitative data on the impact and influence of Canadian digestive health research. The present study involved an analysis of the research impact (Hirsch factor) and research influence (Influence factor) of 106 digestive health researchers in Canada. Rankings of the top 25 researchers on the basis of the two metrics were dominated by the larger research groups at the University of Toronto (Toronto, Ontario), McMaster University (Hamilton, Ontario), and the Universities of Calgary (Calgary, Alberta) and Alberta (Edmonton, Alberta), but with representation by other research groups at the Universities of Manitoba (Winnipeg, Manitoba), Western Ontario (London, Ontario) and McGill University (Montreal, Quebec). Female and male researchers had similar scores for the two metrics, as did basic scientists versus clinical investigators. Strategic recruitment, particularly of established investigators, can have a major impact on the ranking of research groups. Comparing these metrics over different time frames can provide insights into the vulnerabilities and strengths of research groups.
2013-01-01
and levels of corruption, as well as more ephemeral soft power considerations like national reputation, moral clout, and cultural influence. ... of carefully calibrated issues that balance underlying national interests and plausible opportunities for exerting influence. Middle power diplomacy ... World (University Park: Pennsylvania State University Press, 1997); Björn Hettne, András Inotai and Osvaldo Sunkel, eds., Globalism and the New
Gravitational Radiation - a New Window Onto the Universe. (Karl Schwarzschild Lecture 1996)
NASA Astrophysics Data System (ADS)
Thorne, K. S.
A summary is given of the current status and plans for gravitational-wave searches at all plausible wavelengths, from the size of the observable universe to a few kilometers. The anticipated scientific payoff from these searches is described, including expectations for detailed studies of black holes and neutron stars, high-accuracy tests of general relativity, and hopes for the discovery of exotic new kinds of objects.
Guidelines for evaluating performance of oyster habitat restoration
Baggett, Lesley P.; Powers, Sean P.; Brumbaugh, Robert D.; Coen, Loren D.; DeAngelis, Bryan M.; Greene, Jennifer K.; Hancock, Boze T.; Morlock, Summer M.; Allen, Brian L.; Breitburg, Denise L.; Bushek, David; Grabowski, Jonathan H.; Grizzle, Raymond E.; Grosholz, Edwin D.; LaPeyre, Megan K.; Luckenbach, Mark W.; McGraw, Kay A.; Piehler, Michael F.; Westby, Stephanie R.; zu Ermgassen, Philine S. E.
2015-01-01
Restoration of degraded ecosystems is an important societal goal, yet inadequate monitoring and the absence of clear performance metrics are common criticisms of many habitat restoration projects. Funding limitations can prevent adequate monitoring, but we suggest that the lack of accepted metrics to address the diversity of restoration objectives also presents a serious challenge to the monitoring of restoration projects. A working group with experience in designing and monitoring oyster reef projects was used to develop standardized monitoring metrics, units, and performance criteria that would allow for comparison among restoration sites and projects of various construction types. A set of four universal metrics (reef areal dimensions, reef height, oyster density, and oyster size–frequency distribution) and a set of three universal environmental variables (water temperature, salinity, and dissolved oxygen) are recommended to be monitored for all oyster habitat restoration projects regardless of their goal(s). In addition, restoration goal-based metrics specific to four commonly cited ecosystem service-based restoration goals are recommended, along with an optional set of seven supplemental ancillary metrics that could provide information useful to the interpretation of prerestoration and postrestoration monitoring data. Widespread adoption of a common set of metrics with standardized techniques and units to assess well-defined goals not only allows practitioners to gauge the performance of their own projects but also allows for comparison among projects, which is both essential to the advancement of the field of oyster restoration and can provide new knowledge about the structure and ecological function of oyster reef ecosystems.
A Tale of Three District Energy Systems: Metrics and Future Opportunities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pass, Rebecca Zarin; Wetter, Michael; Piette, Mary Ann
Improving the sustainability of cities is crucial for meeting climate goals in the next several decades. One way this is being tackled is through innovation in district energy systems, which can take advantage of local resources and economies of scale to improve the performance of whole neighborhoods in ways infeasible for individual buildings. These systems vary in physical size, end use services, primary energy resources, and sophistication of control. They also vary enormously in their choice of optimization metrics, while all falling under the umbrella goal of improved sustainability. This paper explores the implications of choice of metric on district energy systems using three case studies: Stanford University, the University of California at Merced, and the Richmond Bay campus of the University of California at Berkeley. They each have a centralized authority to implement large-scale projects quickly, while maintaining data records, which makes them relatively effective at achieving their respective goals. Comparing the systems using several common energy metrics reveals significant differences in relative system merit. Additionally, a novel bidirectional heating and cooling system is presented. This system is highly energy-efficient, and while more analysis is required, may be the basis of the next generation of district energy systems.
Making Metrics Matter: How to Use Indicators to Govern Effectively
ERIC Educational Resources Information Center
Allen, Clyde; Bacow, Lawrence S.; Trombley, Laura Skandera
2011-01-01
Many institutions develop specific measures or indicators--often called "dashboards"--to inform boards and top administrators about the college or university's current situation and performance and assist them in moving the institution ahead strategically. And, increasingly, institutions are using metrics not only to assess internal…
A common visual metric for approximate number and density
Dakin, Steven C.; Tibber, Marc S.; Greenwood, John A.; Kingdom, Frederick A. A.; Morgan, Michael J.
2011-01-01
There is considerable interest in how humans estimate the number of objects in a scene in the context of an extensive literature on how we estimate the density (i.e., spacing) of objects. Here, we show that our sense of number and our sense of density are intertwined. Presented with two patches, observers found it more difficult to spot differences in either density or numerosity when those patches were mismatched in overall size, and their errors were consistent with larger patches appearing both denser and more numerous. We propose that density is estimated using the relative response of mechanisms tuned to low and high spatial frequencies (SFs), because energy at high SFs is largely determined by the number of objects, whereas low SF energy depends more on the area occupied by elements. This measure is biased by overall stimulus size in the same way as human observers, and by estimating number using the same measure scaled by relative stimulus size, we can explain all of our results. This model is a simple, biologically plausible common metric for perceptual number and density. PMID:22106276
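The mechanism proposed above (density estimated from the relative response of low- versus high-spatial-frequency channels) can be illustrated with a toy one-dimensional spectral analysis. The frequency-band choices and stimulus below are illustrative, not the paper's actual model parameters:

```python
# Toy 1-D illustration: compare spectral energy at low versus high
# spatial frequencies of an "object" signal via a plain DFT.
# Band edges and the stimulus are illustrative, not the paper's model.
import cmath

def dft_power(signal):
    """Squared DFT magnitude at each frequency bin (O(n^2), fine for a toy)."""
    n = len(signal)
    return [abs(sum(signal[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t in range(n))) ** 2 for k in range(n)]

def sf_energies(signal, low_band=(1, 4), high_band=(8, 16)):
    """Summed spectral power in a low and a high frequency band."""
    p = dft_power(signal)
    low = sum(p[low_band[0]:low_band[1] + 1])
    high = sum(p[high_band[0]:high_band[1] + 1])
    return low, high

# A patch with 8 evenly spaced "objects" in 64 samples: the object
# spacing puts most energy at high spatial frequencies.
stimulus = [1.0 if t % 8 == 0 else 0.0 for t in range(64)]
low, high = sf_energies(stimulus)
```

In the paper's account, the high-band energy tracks the number of objects while the low-band energy tracks the occupied area, so the ratio behaves like a density signal biased by overall patch size.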
Building a Metrics-Enabled Marketing Curriculum: The Cornerstone Course
ERIC Educational Resources Information Center
Pilling, Bruce K.; Rigdon, Edward E.; Brightman, Harvey J.
2012-01-01
The lack of analytical preparation of marketing students was a key concern at a large, public university in southeastern United States, leading to the decision to create a new required undergraduate marketing metrics course. This article describes the development of that course, designed specifically to strengthen analytical skills across the…
NASA Astrophysics Data System (ADS)
Lombardi, D.; Sinatra, G. M.
2013-12-01
Critical evaluation and plausibility reappraisal of scientific explanations have been underemphasized in many science classrooms (NRC, 2012). Deep science learning demands that students increase their ability to critically evaluate the quality of scientific knowledge, weigh alternative explanations, and explicitly reappraise their plausibility judgments. This lack of instruction about critical evaluation and plausibility reappraisal has therefore contributed, in part, to diminished understanding of complex and controversial topics such as global climate change. The Model-Evidence Link (MEL) diagram (originally developed by researchers at Rutgers University under an NSF-supported project; Chinn & Buckland, 2012) is an instructional scaffold that prompts students to critically evaluate alternative explanations. We recently developed a climate change MEL and found that students who used the MEL experienced a significant shift in their plausibility judgments toward the scientifically accepted model of human-induced climate change. Using the MEL for instruction also resulted in conceptual change about the causes of global warming that reflected greater understanding of fundamental scientific principles. Furthermore, students sustained this conceptual change six months after MEL instruction (Lombardi, Sinatra, & Nussbaum, 2013). This presentation will discuss recent educational research that supports use of the MEL to promote critical evaluation, plausibility reappraisal, and conceptual change, and also how the MEL may be particularly effective for learning about global climate change and other socio-scientific topics.
Such instruction to develop these fundamental thinking skills (e.g., critical evaluation and plausibility reappraisal) is demanded by both the Next Generation Science Standards (Achieve, 2013) and the Common Core State Standards for English Language Arts and Mathematics (CCSS Initiative-ELA, 2010; CCSS Initiative-Math, 2010), as well as a society that is equipped to deal with challenges in a way that is beneficial to our national and global community.
Testing backreaction effects with observational Hubble parameter data
NASA Astrophysics Data System (ADS)
Cao, Shu-Lei; Teng, Huan-Yu; Wan, Hao-Yi; Yu, Hao-Ran; Zhang, Tong-Jie
2018-02-01
The spatially averaged inhomogeneous Universe includes a kinematical backreaction term Q_D that is related to the averaged spatial Ricci scalar.
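In Buchert's averaging formalism, which is the standard setting for such backreaction studies (stated here from the general literature, not from this abstract), Q_D enters the averaged equations as:

```latex
% Averaged Hamiltonian constraint over a spatial domain D, with
% effective scale factor a_D:
3\left(\frac{\dot{a}_D}{a_D}\right)^2
  = 8\pi G\,\langle\rho\rangle_D
  - \frac{1}{2}\langle \mathcal{R}\rangle_D
  - \frac{1}{2}\mathcal{Q}_D
% Kinematical backreaction built from expansion and shear scalars:
\mathcal{Q}_D = \frac{2}{3}\left(\langle\theta^2\rangle_D
  - \langle\theta\rangle_D^2\right) - 2\,\langle\sigma^2\rangle_D
```

An integrability condition couples Q_D to the averaged spatial Ricci scalar ⟨R⟩_D, which is why constraints on one translate into constraints on the other when testing backreaction against Hubble parameter data.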
Talent Management for Universities
ERIC Educational Resources Information Center
Bradley, Andrew P.
2016-01-01
This paper explores human resource management practices in the university sector with a specific focus on talent pools and talent management more generally. The paper defines talent management in the context of the university sector and then explores its interdependence with organisational strategy, the metrics used to measure academic performance…
A Framework to Integrate Public, Dynamic Metrics into an OER Platform
ERIC Educational Resources Information Center
Cohen, Jaclyn Zetta; Omollo, Kathleen Ludewig; Malicke, Dave
2014-01-01
The usage metrics for open educational resources (OER) are often either hidden behind an authentication system or shared intermittently in static, aggregated format at the repository level. This paper discusses the first year of University of Michigan's project to share its OER usage data dynamically, publicly, to synthesize it across different…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Copeland, Alex; Brown, C. Titus
2011-10-13
DOE JGI's Alex Copeland on "DOE JGI Quality Metrics" and Michigan State University's C. Titus Brown on "Approaches to Scaling and Improving Metagenome Assembly" at the Metagenomics Informatics Challenges Workshop held at the DOE JGI on October 12-13, 2011.
The Death of Socrates: Managerialism, Metrics and Bureaucratisation in Universities
ERIC Educational Resources Information Center
Orr, Yancey; Orr, Raymond
2016-01-01
Neoliberalism exults the ability of unregulated markets to optimise human relations. Yet, as David Graeber has recently illustrated, it is paradoxically built on rigorous systems of rules, metrics and managers. The potential transition to a market-based tuition and research-funding model for higher education in Australia has, not surprisingly,…
Supporting Research Impact Metrics in Academic Libraries: A Case Study
ERIC Educational Resources Information Center
Braun, Steven
2017-01-01
Measuring research impact has become a nearly ubiquitous facet of scholarly communication. At the University of Minnesota Medical School, new administrative directives have directly tied impact metrics to faculty assessment, promotion, and tenure. In this paper, I describe a platform for the analysis and visualization of research impact that was…
Extended DBI massive gravity with generalized fiducial metric
NASA Astrophysics Data System (ADS)
Chullaphan, Tossaporn; Tannukij, Lunchakorn; Wongjun, Pitayuth
2015-06-01
We consider an extended model of DBI massive gravity by generalizing the fiducial metric to be an induced metric on the brane corresponding to a domain wall moving in five-dimensional Schwarzschild-Anti-de Sitter spacetime. The model admits all solutions of the FLRW metric, including flat, closed and open geometries, while the original one does not. The background solutions can be divided into two branches, namely a self-accelerating branch and a normal branch. For the self-accelerating branch, the graviton mass plays the role of a cosmological constant to drive the late-time acceleration of the universe. It is found that the number of degrees of freedom in the gravitational sector is not correct, similar to the original DBI massive gravity: there are only two propagating degrees of freedom, from the tensor modes. For the normal branch, we restrict our attention to a particular class of solutions which provides an accelerated expansion of the universe. It is found that the number of degrees of freedom in the model is correct. However, at least one of them is a ghost degree of freedom which is always present at small scales, implying that the theory is not stable.
Universality of isothermal fluid spheres in Lovelock gravity
NASA Astrophysics Data System (ADS)
Dadhich, Naresh; Hansraj, Sudan; Maharaj, Sunil D.
2016-02-01
We show universality of isothermal fluid spheres in pure Lovelock gravity, where the equation of motion has only one Nth order term coming from the corresponding Lovelock polynomial action of degree N. Isothermality is characterized by the equation of state p = αρ and the property ρ ∼ 1/r^(2N). The solution describing isothermal spheres, which exists only for the pure Lovelock equation, then takes the same form for general Lovelock degree N in all dimensions d ≥ 2N + 2. We further prove that the necessary and sufficient condition for the isothermal sphere is that its metric is conformal to the massless global monopole or the solid angle deficit metric, and this feature is also universal.
Shedding light on baryonic dark matter.
Silk, J
1991-02-01
Halo dark matter, if it is baryonic, may plausibly consist of compact stellar remnants. Jeans mass clouds containing 10^6 to 10^8 solar masses could have efficiently formed stars in the early universe and could plausibly have generated, for a suitably top-heavy stellar initial mass function, a high abundance of neutron stars as well as a small admixture of long-lived low mass stars. Within the resulting clusters of dark remnants, which are eventually tidally disrupted when halos form, captures of neutron stars by non-degenerate stars resulted in the formation of close binaries. These evolve to produce, by the present epoch, an observable x-ray signal associated with dark matter aggregations in galaxy halos and galaxy cluster cores.
NASA Astrophysics Data System (ADS)
Kastor, David; Ray, Sourya; Traschen, Jennie
2017-10-01
We study the problem of finding brane-like solutions to Lovelock gravity, adopting a general approach to establish conditions that a lower dimensional base metric must satisfy in order that a solution to a given Lovelock theory can be constructed in one higher dimension. We find that for Lovelock theories with generic values of the coupling constants, the Lovelock tensors (higher curvature generalizations of the Einstein tensor) of the base metric must all be proportional to the metric. Hence, allowed base metrics form a subclass of Einstein metrics. This subclass includes so-called ‘universal metrics’, which have been previously investigated as solutions to quantum-corrected field equations. For specially tuned values of the Lovelock couplings, we find that the Lovelock tensors of the base metric need to satisfy fewer constraints. For example, for Lovelock theories with a unique vacuum there is only a single such constraint, a case previously identified in the literature, and brane solutions can be straightforwardly constructed.
Resilience Metrics for the Electric Power System: A Performance-Based Approach.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vugrin, Eric D.; Castillo, Andrea R; Silva-Monroy, Cesar Augusto
Grid resilience is a concept related to a power system's ability to continue operating and delivering power even in the event that low probability, high-consequence disruptions such as hurricanes, earthquakes, and cyber-attacks occur. Grid resilience objectives focus on managing and, ideally, minimizing potential consequences that occur as a result of these disruptions. Currently, no formal grid resilience definitions, metrics, or analysis methods have been universally accepted. This document describes an effort to develop and describe grid resilience metrics and analysis methods. The metrics and methods described herein extend upon the Resilience Analysis Process (RAP) developed by Watson et al. for the 2015 Quadrennial Energy Review. The extension allows for both outputs from system models and for historical data to serve as the basis for creating grid resilience metrics and informing grid resilience planning and response decision-making. This document describes the grid resilience metrics and analysis methods. Demonstration of the metrics and methods is shown through a set of illustrative use cases.
Inflationary Cosmology: Is Our Universe Part of a Multiverse?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Guth, Alan
2008-11-06
In this talk, Guth explains the inflationary theory and reviews the features that make it scientifically plausible. In addition, he discusses the biggest mystery in cosmology: Why is the value of the cosmological constant, sometimes called the "anti-gravity" effect, so remarkably small compared to theoretical expectations?
Universal moduli spaces of Riemann surfaces
NASA Astrophysics Data System (ADS)
Ji, Lizhen; Jost, Jürgen
2017-04-01
We construct a moduli space for Riemann surfaces that is universal in the sense that it represents compact Riemann surfaces of any finite genus. This moduli space is a connected complex subspace of an infinite dimensional complex space, and is stratified according to genus such that each stratum has a compact closure, and it carries a metric and a measure that induce a Riemannian metric and a finite volume measure on each stratum. Applications to the Plateau-Douglas problem for minimal surfaces of varying genus and to the partition function of Bosonic string theory are outlined. The construction starts with a universal moduli space of Abelian varieties. This space carries a structure of an infinite dimensional locally symmetric space which is of interest in its own right. The key to our construction of the universal moduli space then is the Torelli map that assigns to every Riemann surface its Jacobian and its extension to the Satake-Baily-Borel compactifications.
ERIC Educational Resources Information Center
Tran, Tam; Bowman-Carpio, LeeAnna; Buscher, Nate; Davidson, Pamela; Ford, Jennifer J.; Jenkins, Erick; Kalay, Hillary Noll; Nakazono, Terry; Orescan, Helene; Sak, Rachael; Shin, Irene
2017-01-01
In 2013, the University of California, Biomedical Research, Acceleration, Integration, and Development (UC BRAID) convened a regional network of contracting directors from the five University of California (UC) health campuses to: (i) increase collaboration, (ii) operationalize and measure common metrics as a basis for performance improvement…
Noncommutative spherically symmetric spacetimes at semiclassical order
NASA Astrophysics Data System (ADS)
Fritz, Christopher; Majid, Shahn
2017-07-01
Working within the recent formalism of Poisson-Riemannian geometry, we completely solve the case of a generic spherically symmetric metric and spherically symmetric Poisson bracket to find a unique answer for the quantum differential calculus, quantum metric and quantum Levi-Civita connection at semiclassical order O(λ). Here λ is the deformation parameter, plausibly the Planck scale. We find that r, t, dr, dt are all forced to be central, i.e. undeformed at order λ, while for each value of r, t we are forced to have a fuzzy sphere of radius r with a unique differential calculus which is necessarily nonassociative at order λ². We give the spherically symmetric quantisation of the FLRW cosmology in detail and also recover a previous analysis for the Schwarzschild black hole, now showing that the quantum Ricci tensor for the latter vanishes at order λ. The quantum Laplace-Beltrami operator for spherically symmetric models turns out to be undeformed at order λ, while more generally in Poisson-Riemannian geometry we show that it deforms to □f + (λ/2) ω^{αβ} (Ric^γ_α − S^γ_α)(∇̂_β df)_γ + O(λ²), in terms of the classical Levi-Civita connection ∇̂.
Huang, Hung-Chung; Jupiter, Daniel; VanBuren, Vincent
2010-01-01
Background Identification of genes with switch-like properties will facilitate discovery of regulatory mechanisms that underlie these properties, and will provide knowledge for the appropriate application of Boolean networks in gene regulatory models. As switch-like behavior is likely associated with tissue-specific expression, these gene products are expected to be plausible candidates as tissue-specific biomarkers. Methodology/Principal Findings In a systematic classification of genes and search for biomarkers, gene expression profiles (GEPs) of more than 16,000 genes from 2,145 mouse array samples were analyzed. Four distribution metrics (mean, standard deviation, kurtosis and skewness) were used to classify GEPs into four categories: predominantly-off, predominantly-on, graded (rheostatic), and switch-like genes. The arrays under study were also grouped and examined by tissue type. For example, arrays were categorized as ‘brain group’ and ‘non-brain group’; the Kolmogorov-Smirnov distance and Pearson correlation coefficient were then used to compare GEPs between brain and non-brain for each gene. We were thus able to identify tissue-specific biomarker candidate genes. Conclusions/Significance The methodology employed here may be used to facilitate disease-specific biomarker discovery. PMID:20140228
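The four-moment classification described in this record can be sketched as follows. This is an illustrative sketch only: the category cutoffs below are hypothetical placeholders, not the thresholds used by Huang et al.

```python
import statistics

def profile_moments(values):
    """Return (mean, stdev, excess kurtosis, skewness) of one gene's profile."""
    n = len(values)
    mu = statistics.fmean(values)
    sd = statistics.pstdev(values)
    skew = sum(((x - mu) / sd) ** 3 for x in values) / n
    kurt = sum(((x - mu) / sd) ** 4 for x in values) / n - 3.0
    return mu, sd, kurt, skew

def classify_profile(values, on_cut=8.0, off_cut=4.0, bimodal_kurt=-1.0):
    """Assign one of the abstract's four categories.
    All cutoff values here are invented for illustration, not the paper's."""
    mu, sd, kurt, skew = profile_moments(values)
    if mu >= on_cut and sd < 1.0:
        return "predominantly-on"
    if mu <= off_cut and sd < 1.0:
        return "predominantly-off"
    # strongly negative excess kurtosis suggests a bimodal, switch-like profile
    if kurt <= bimodal_kurt:
        return "switch-like"
    return "graded"

switchy = [2.0, 2.1, 1.9, 2.0, 9.0, 9.1, 8.9, 9.0]  # two well-separated modes
print(classify_profile(switchy))  # switch-like
```

A profile hovering tightly around a high (or low) value lands in the predominantly-on (or -off) class before the kurtosis test is reached.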
Implementation of Quality Assurance and Quality Control Measures in the National Phenology Database
NASA Astrophysics Data System (ADS)
Gerst, K.; Rosemartin, A.; Denny, E. G.; Marsh, L.; Barnett, L.
2015-12-01
The USA National Phenology Network (USA-NPN; www.usanpn.org) serves science and society by promoting a broad understanding of plant and animal phenology and the relationships among phenological patterns and environmental change. The National Phenology Database has over 5.5 million observation records for plants and animals for the period 1954-2015. These data have been used in a number of science, conservation and resource management applications, including national assessments of historical and potential future trends in phenology, regional assessments of spatio-temporal variation in organismal activity, and local monitoring for invasive species detection. Customizable data downloads are freely available, and data are accompanied by FGDC-compliant metadata, data-use and data-attribution policies, and vetted documented methodologies and protocols. The USA-NPN has implemented a number of measures to ensure both quality assurance and quality control. Here we describe the resources that have been developed so that incoming data submitted by both citizen and professional scientists are reliable; these include training materials, such as a botanical primer and species profiles. We also describe a number of automated quality control processes applied to incoming data streams to optimize data output quality. Existing and planned quality control measures for output of raw and derived data include: (1) Validation of site locations, including latitude, longitude, and elevation; (2) Flagging of records that conflict for a given date for an individual plant; (3) Flagging where species occur outside known ranges; (4) Flagging of records when phenophases occur outside of the plausible order for a species; (5) Flagging of records when intensity measures do not follow a plausible progression for a phenophase; (6) Flagging of records when a phenophase occurs outside of the plausible season, and (7) Quantification of precision and uncertainty for estimation of phenological metrics. 
Finally, we will describe preliminary work to develop methods for outlier detection that will inform plausibility checks. Ultimately we aim to maximize data quality of USA-NPN data and data products to ensure that this database can continue to be reliably applied for science and decision-making for multiple scales and applications.
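A minimal sketch of the order- and season-plausibility flags (checks 4 and 6 above). The phenophase ordering and season windows below are invented for illustration; USA-NPN's actual rules are species-specific and far more elaborate.

```python
from datetime import date

# Hypothetical plausibility rules, for illustration only.
PHENOPHASE_ORDER = ["breaking leaf buds", "leaves", "flowers", "ripe fruits"]
SEASON_WINDOWS = {"flowers": (3, 7)}  # plausible months (March-July), invented

def qc_flags(observations):
    """observations: list of (date, phenophase) for one plant in one year.
    Returns a list of flag strings; empty if nothing looks implausible."""
    flags = []
    obs = sorted(observations)  # chronological order
    for (d1, p1), (d2, p2) in zip(obs, obs[1:]):
        if PHENOPHASE_ORDER.index(p2) < PHENOPHASE_ORDER.index(p1):
            flags.append(f"out-of-order: '{p2}' recorded after '{p1}'")
    for d, p in obs:
        lo, hi = SEASON_WINDOWS.get(p, (1, 12))
        if not lo <= d.month <= hi:
            flags.append(f"'{p}' in month {d.month}, outside plausible season")
    return flags

suspect = [(date(2020, 1, 15), "flowers"), (date(2020, 4, 1), "leaves")]
print(qc_flags(suspect))
```

A January flowering record followed by April leaf-out trips both the order and the season check.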
Shedding light on baryonic dark matter
NASA Technical Reports Server (NTRS)
Silk, Joseph
1991-01-01
Halo dark matter, if it is baryonic, may plausibly consist of compact stellar remnants. Jeans mass clouds containing 10^6 to 10^8 solar masses could have efficiently formed stars in the early universe and could plausibly have generated, for a suitably top-heavy stellar initial mass function, a high abundance of neutron stars as well as a small admixture of long-lived low mass stars. Within the resulting clusters of dark remnants, which are eventually tidally disrupted when halos form, captures of neutron stars by nondegenerate stars resulted in the formation of close binaries. These evolve to produce, by the present epoch, an observable X-ray signal associated with dark matter aggregations in galaxy cluster cores.
Gaining Control and Predictability of Software-Intensive Systems Development and Sustainment
2015-02-04
implementation of the baselines, audits, and technical reviews within an overarching systems engineering process (SEP; Defense Acquisition University...warfighters' needs. This management and metrics effort supplements and supports the system's technical development through the baselines, audits and...other areas that could be researched and added into the nine-tier model. Areas including software metrics, quality assurance, software-oriented
Lorentz violation with a universal minimum speed as foundation of de Sitter relativity
NASA Astrophysics Data System (ADS)
Cruz, Cláudio Nassif; Dos Santos, Rodrigo Francisco; Amaro de Faria, A. C.
We aim to investigate the theory of Lorentz violation with an invariant minimum speed, called Symmetrical Special Relativity (SSR), from the viewpoint of its metric. We thus explore the nature of the SSR metric in order to understand the origin of the conformal factor that appears when the Minkowski metric is deformed by an invariant minimum speed that breaks Lorentz symmetry. We find a similarity between SSR and a new space with variable negative curvature (-∞ < ℛ < 0) connected to a set of infinite cosmological constants (0 < Λ < ∞), working like an extended de Sitter (dS) relativity whose curvature and cosmological "constant" vary in time. We obtain a scenario more similar to dS-relativity in the approximation of a slightly negative curvature, representing the current universe with its tiny cosmological constant. Finally, we show that the invariant minimum speed provides the foundation for understanding the kinematic origin of the extra dimension considered in dS-relativity in order to represent the dS-length.
Towards a more plausible dragon
NASA Astrophysics Data System (ADS)
Efthimiou, Costas
2014-08-01
Wizards, mermaids, dragons and aliens. Walking, running, flying and space travel. A hi-tech elevator, a computer, a propulsion engine and a black hole. What do all of these things have in common? This might seem like a really hard brainteaser but the answer is simple: they all obey the fundamental laws of our universe.
Beyond Empathy to Building a Plausible Economic Future.
ERIC Educational Resources Information Center
Albertus, Alvin D.; Bright, Larry K.
The complexity of the global society and economy, and the resulting fracturing of social classes across the Midwest, the nation, and the world demand a significant expansion of the importance of human relations training courses for counselor education and for general teacher education. At the University of South Dakota at Vermillion the School of…
A Validation of Object-Oriented Design Metrics
NASA Technical Reports Server (NTRS)
Basili, Victor R.; Briand, Lionel; Melo, Walcelio L.
1995-01-01
This paper presents the results of a study conducted at the University of Maryland in which we experimentally investigated the suite of Object-Oriented (OO) design metrics introduced by [Chidamber and Kemerer, 1994]. In order to do this, we assessed these metrics as predictors of fault-prone classes. This study is complementary to [Li and Henry, 1993], where the same suite of metrics had been used to assess frequencies of maintenance changes to classes. To perform our validation accurately, we collected data on the development of eight medium-sized information management systems based on identical requirements. All eight projects were developed using a sequential life cycle model, a well-known OO analysis/design method and the C++ programming language. Based on experimental results, the advantages and drawbacks of these OO metrics are discussed and suggestions for improvement are provided. Several of Chidamber and Kemerer's OO metrics appear to be adequate to predict class fault-proneness during the early phases of the life-cycle. We also showed that they are, on our data set, better predictors than "traditional" code metrics, which can only be collected at a later phase of the software development processes.
The Entrepreneurial University: Vision and Metrics
ERIC Educational Resources Information Center
Etzkowitz, Henry
2016-01-01
Forged in different academic and national traditions, the university is arriving at a common entrepreneurial format that incorporates and transcends its traditional missions. The academic entrepreneurial transition arises from the confluence of the internal development of higher education institutions and external influences on academic structures…
Quality assessment for color reproduction using a blind metric
NASA Astrophysics Data System (ADS)
Bringier, B.; Quintard, L.; Larabi, M.-C.
2007-01-01
This paper deals with image quality assessment, a field that nowadays plays an important role in various image processing applications. A number of objective image quality metrics, correlating to varying degrees with subjective quality, have been developed during the last decade. Two categories of metrics can be distinguished: full-reference and no-reference. A full-reference metric tries to evaluate the distortion introduced to an image with regard to a reference. A no-reference approach attempts to model the judgment of image quality blindly. Unfortunately, a universal image quality model is not on the horizon, and empirical models established through psychophysical experimentation are generally used. In this paper, we focus only on the second category to evaluate the quality of color reproduction, introducing a blind metric based on human visual system modeling. The objective results are validated by single-media and cross-media subjective tests.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aguilar, José Edgar Madriz; Bellini, Mauricio, E-mail: jemadriz@fisica.ugto.mx, E-mail: mbellini@mdp.edu.ar
2010-11-01
We study scalar field fluctuations of the inflaton field in an early inflationary universe on an effective 4D Schwarzschild-de Sitter (SdS) metric, which is obtained after making a planar coordinate transformation on a 5D Ricci-flat SdS static metric. We obtain the important result that the spectrum of fluctuations at zeroth order is independent of the scalar field mass M on Schwarzschild scales, while on cosmological scales it exhibits a mass dependence. However, in the first-order expansion, the spectrum depends on the inflaton mass and the amplitude is linear in the black-hole (BH) mass m.
ERIC Educational Resources Information Center
Dekydtspotter, Laurent; Sprouse, Rex A.
2001-01-01
Addresses the issue of second language (L2) epistemology assuming Chomsky's (1995) discussion of the place of universal grammar in mental design. Discusses the interaction of adjectival restriction in interrogative expressions, contrasts the plausibility of nativist and non-nativist approaches to the etiology of such grammatical knowledge, and reports…
The Big Bang and Cosmic Inflation
NASA Astrophysics Data System (ADS)
Guth, Alan H.
2014-03-01
A summary is given of the key developments of cosmology in the 20th century, from the work of Albert Einstein to the emergence of the generally accepted hot big bang model. The successes of this model are reviewed, but emphasis is placed on the questions that the model leaves unanswered. The remainder of the paper describes the inflationary universe model, which provides plausible answers to a number of these questions. It also offers a possible explanation for the origin of essentially all the matter and energy in the observed universe.
Brooks, Frank J; Grigsby, Perry W
2013-12-23
Many types of cancer are located and assessed via positron emission tomography (PET) using the 18F-fluorodeoxyglucose (FDG) radiotracer of glucose uptake. There is rapidly increasing interest in exploiting the intra-tumor heterogeneity observed in these FDG-PET images as an indicator of disease outcome. If this image heterogeneity is of genuine prognostic value, then it either correlates to known prognostic factors, such as tumor stage, or it indicates some as yet unknown tumor quality. Therefore, the first step in demonstrating the clinical usefulness of image heterogeneity is to explore the dependence of image heterogeneity metrics upon established prognostic indicators and other clinically interesting factors. If it is shown that image heterogeneity is merely a surrogate for other important tumor properties or variations in patient populations, then the theoretical value of quantified biological heterogeneity may not yet translate into the clinic given current imaging technology. We explore the relation between pelvic lymph node status at diagnosis and the visually evident uptake heterogeneity often observed in 18F-fluorodeoxyglucose positron emission tomography (FDG-PET) images of cervical carcinomas. We retrospectively studied the FDG-PET images of 47 node negative and 38 node positive patients, each having FIGO stage IIb tumors with squamous cell histology. Imaged tumors were segmented using 40% of the maximum tumor uptake as the tumor-defining threshold and then converted into sets of three-dimensional coordinates. We employed the sphericity, extent, Shannon entropy (S) and the accrued deviation from smoothest gradients (ζ) as image heterogeneity metrics. We analyze these metrics within tumor volume strata via: the Kolmogorov-Smirnov test, principal component analysis and contingency tables. We found no statistically significant difference between the positive and negative lymph node groups for any one metric or plausible combinations thereof. 
Additionally, we observed that S is strongly dependent upon tumor volume and that ζ moderately correlates with mean FDG uptake. FDG uptake heterogeneity did not indicate patients with differing prognoses. Apparent heterogeneity differences between clinical groups may be an artifact arising from either the dependence of some image metrics upon other factors such as tumor volume or upon the underlying variations in the patient populations compared.
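The 40%-of-maximum segmentation and the Shannon entropy metric S used in this study can be sketched on a flat list of voxel uptake values. The histogram bin count below is a free choice for illustration, not the study's discretization.

```python
import math

def segment_tumor(uptake):
    """Keep voxels at or above 40% of maximum uptake (the study's threshold)."""
    cutoff = 0.40 * max(uptake)
    return [v for v in uptake if v >= cutoff]

def shannon_entropy(values, n_bins=8):
    """Shannon entropy (bits) of the uptake histogram inside the tumor.
    n_bins is an illustrative choice; the paper's binning may differ."""
    lo, hi = min(values), max(values)
    width = (hi - lo) / n_bins or 1.0  # guard against a constant-valued tumor
    counts = [0] * n_bins
    for v in values:
        counts[min(int((v - lo) / width), n_bins - 1)] += 1
    total = len(values)
    return -sum(c / total * math.log2(c / total) for c in counts if c)

voxels = [1.0, 2.0, 3.5, 4.0, 7.9, 8.0, 6.5, 0.5]
tumor = segment_tumor(voxels)  # max is 8.0, so the cutoff is 3.2
print(len(tumor), round(shannon_entropy(tumor), 3))
```

A perfectly homogeneous tumor gives entropy 0; spreading uptake across bins raises S toward log2(n_bins).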
Perdurance of multiply connected de Sitter space
NASA Astrophysics Data System (ADS)
González-Díaz, Pedro F.
1999-06-01
This paper deals with a study of the effects that spherically symmetric first-order metric perturbations and vacuum quantum fluctuations have on the stability of the multiply connected de Sitter spacetime recently proposed by Gott and Li. It is the main conclusion of this study that although such a spacetime is stable to the classical metric perturbations for any size of the nonchronal region, it is only stable against the quantum fluctuations of vacuum if the size of the multiply connected region is of the order of the Planck scale. Therefore, boundary conditions for the state of the universe based on the notion that the universe created itself in a regime where closed timelike curves were active and stable still appear to be physically and philosophically well supported as are those boundary conditions relying on the notion that the universe was created out of nothing.
NASA Astrophysics Data System (ADS)
Singh, S. Surendra
2018-05-01
Considering the locally rotationally symmetric (LRS) Bianchi type-I metric with cosmological constant Λ, Einstein's field equations are discussed on the background of an anisotropic fluid. We assume the condition A = B^{1/m} for the metric potentials A and B, where m is a positive constant, to obtain a viable model of the Universe. It is found that Λ(t) is positive and inversely proportional to time. The values of the matter-energy density Ωm, dark energy density ΩΛ and deceleration parameter q are found to be consistent with the WMAP observations. Statefinder parameters and the anisotropic deviation parameter are also investigated. It is also observed that the derived model is an accelerating, shearing and non-rotating Universe. Some of the asymptotic and geometrical behaviors of the derived models are investigated along with the age of the Universe.
2016-03-01
Performance Metrics University of Waterloo Permanganate Treatment of an Emplaced DNAPL Source (Thomson et al., 2007) Table 5.6 Remediation Performance Data... permanganate vs. peroxide/Fenton's for chemical oxidation). Poorer performance was generally observed when the Total CVOC was the contaminant metric... using a soluble carbon substrate (lactate), chemical oxidation using Fenton's reagent, and chemical oxidation using potassium permanganate. At
Preserved Network Metrics across Translated Texts
NASA Astrophysics Data System (ADS)
Cabatbat, Josephine Jill T.; Monsanto, Jica P.; Tapang, Giovanni A.
2014-09-01
Co-occurrence language networks based on Bible translations and the Universal Declaration of Human Rights (UDHR) translations in different languages were constructed and compared with random text networks. Among the considered network metrics, the network size, N, the normalized betweenness centrality (BC), and the average k-nearest neighbors, knn, were found to be the most preserved across translations. Moreover, similar frequency distributions of co-occurring network motifs were observed for translated texts networks.
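Two of the preserved metrics, network size N and the average k-nearest-neighbor degree knn, can be sketched on an adjacent-word co-occurrence graph built with the standard library. The windowing and preprocessing below are simplified assumptions; the underlying papers' construction may differ.

```python
from collections import defaultdict

def cooccurrence_graph(text, window=2):
    """Undirected word co-occurrence graph: words within `window` tokens link.
    window=2 links only adjacent words - a simplifying assumption."""
    tokens = text.lower().split()
    adj = defaultdict(set)
    for i, w in enumerate(tokens):
        for u in tokens[i + 1 : i + window]:
            if u != w:
                adj[w].add(u)
                adj[u].add(w)
    return adj

def network_size(adj):
    return len(adj)

def avg_knn(adj):
    """Mean over nodes of the average degree of each node's neighbors."""
    per_node = [sum(len(adj[u]) for u in nbrs) / len(nbrs)
                for nbrs in adj.values() if nbrs]
    return sum(per_node) / len(per_node)

g = cooccurrence_graph("all human beings are born free and equal")
print(network_size(g), avg_knn(g))
```

On this eight-word UDHR fragment the graph is a simple chain, so N = 8 and knn averages the interior-versus-endpoint degrees.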
Guerrero, Lourdes; Jones, Lisa B.; Tong, Greg; Ireland, Christine; Dumbauld, Jill; Rainwater, Julie
2015-01-01
Abstract Purpose This pilot study describes the career development programs (i.e., NIH KL2 awards) across five Clinical and Translational Science Award (CTSA) institutions within the University of California (UC) system, and examines the feasibility of a set of common metrics for evaluating early outcomes. Methods A survey of program administrators provided data related to the institutional environment within which each KL2 program was implemented. Application and progress report data yielded a combined data set that characterized KL2 awardees, their initial productivity, and early career outcomes. Results The pilot project demonstrated the feasibility of aggregating common metrics data across multiple institutions. The data indicated that KL2 awardees were an accomplished set of investigators, both before and after the award period, representing a wide variety of disciplines. Awardees that had completed their trainee period overwhelmingly remained active in translational research conducted within an academic setting. Early indications also suggest high rates of success with obtaining research funding subsequent to the KL2 award. Conclusion This project offers a model for how to collect and analyze common metrics related to the education and training function of the CTSA Consortium. Next steps call for expanding participation to other CTSA sites outside of the University of California system. PMID:26602332
ERIC Educational Resources Information Center
Morris, Tracy L.; Laipple, Joseph S.
2015-01-01
A national sample of 1515 university administrators (academic deans, directors, associate deans, and department chairs) completed a survey of leadership skills, preparedness for administrative role, and job satisfaction. Overall, participants felt least well prepared in the areas of developing entrepreneurial revenue, developing metrics to…
A Metric on Phylogenetic Tree Shapes.
Colijn, C; Plazzotta, G
2018-01-01
The shapes of evolutionary trees are influenced by the nature of the evolutionary process but comparisons of trees from different processes are hindered by the challenge of completely describing tree shape. We present a full characterization of the shapes of rooted branching trees in a form that lends itself to natural tree comparisons. We use this characterization to define a metric, in the sense of a true distance function, on tree shapes. The metric distinguishes trees from random models known to produce different tree shapes. It separates trees derived from tropical versus USA influenza A sequences, which reflect the differing epidemiology of tropical and seasonal flu. We describe several metrics based on the same core characterization, and illustrate how to extend the metric to incorporate trees' branch lengths or other features such as overall imbalance. Our approach allows us to construct addition and multiplication on trees, and to create a convex metric on tree shapes which formally allows computation of average tree shapes. © The Author(s) 2017. Published by Oxford University Press, on behalf of the Society of Systematic Biologists.
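The abstract does not spell out the characterization itself; the sketch below follows a Colijn–Plazzotta-style labeling of rooted full binary tree shapes, under the assumption that the recursion is: a leaf gets label 1, and a node whose children carry labels a ≥ b gets a(a−1)/2 + b + 1. Equal labels then correspond to identical shapes, and the absolute difference of labels gives one simple (if crude) distance.

```python
def shape_label(tree):
    """tree: a nested 2-tuple for each internal node; any non-tuple is a tip.
    Returns the integer shape label; equal labels <=> identical shapes."""
    if not isinstance(tree, tuple):
        return 1
    a, b = sorted((shape_label(tree[0]), shape_label(tree[1])), reverse=True)
    return a * (a - 1) // 2 + b + 1

cherry = ("tip", "tip")
print(shape_label(cherry))             # 2
print(shape_label((cherry, "tip")))    # 3: the 3-tip caterpillar
print(shape_label((cherry, cherry)))   # 4: the balanced 4-tip tree
```

Because the pairing (a, b) ↦ a(a−1)/2 + b + 1 is a bijection on ordered pairs with a ≥ b, distinct shapes never collide, which is what makes a metric on the labels possible.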
Anisotropic deformations of spatially open cosmology in massive gravity theory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mazuet, Charles; Volkov, Mikhail S.; Mukohyama, Shinji, E-mail: charles.mazuet@lmpt.univ-tours.fr, E-mail: shinji.mukohyama@yukawa.kyoto-u.ac.jp, E-mail: volkov@lmpt.univ-tours.fr
We combine analytical and numerical methods to study anisotropic deformations of the spatially open homogeneous and isotropic cosmology in the ghost-free massive gravity theory with flat reference metric. We find that if the initial perturbations are not too strong then the physical metric relaxes back to the isotropic de Sitter state. However, the damping of the anisotropies is achieved at the expense of exciting the Stueckelberg fields in such a way that the reference metric changes and no longer shares with the physical metric the same rotational and translational symmetries. As a result, the universe evolves towards a fixed point which does not coincide with the original solution, but for which the physical metric is still de Sitter. If the initial perturbation is strong, then its evolution generically leads to a singular anisotropic state or, for some parameter values, to a decay into flat spacetime. We also present an infinite-dimensional family of new homogeneous and isotropic cosmologies in the theory.
Mass Function of Galaxy Clusters in Relativistic Inhomogeneous Cosmology
NASA Astrophysics Data System (ADS)
Ostrowski, Jan J.; Buchert, Thomas; Roukema, Boudewijn F.
The current cosmological model (ΛCDM) with the underlying FLRW metric relies on the assumption of local isotropy, hence homogeneity of the Universe. Difficulties arise when one attempts to justify this model as an average description of the Universe from first principles of general relativity, since in general, the Einstein tensor built from the averaged metric is not equal to the averaged stress-energy tensor. In this context, the discrepancy between these quantities is called "cosmological backreaction" and has been the subject of scientific debate among cosmologists and relativists for more than 20 years. Here we present one of the methods to tackle this problem, i.e. averaging the scalar parts of the Einstein equations, together with its application, the cosmological mass function of galaxy clusters.
An index of reservoir habitat impairment
Miranda, L.E.; Hunt, K.M.
2011-01-01
Fish habitat impairment resulting from natural and anthropogenic watershed and in-lake processes has in many cases reduced the ability of reservoirs to sustain native fish assemblages and fisheries quality. Rehabilitation of impaired reservoirs is hindered by the lack of a method suitable for scoring impairment status. To address this limitation, an index of reservoir habitat impairment (IRHI) was developed by merging 14 metrics descriptive of common impairment sources, with each metric scored from 0 (no impairment) to 5 (high impairment) by fisheries scientists with local knowledge. With a plausible range of 5 to 25, distribution of the IRHI scores ranged from 5 to 23 over 482 randomly selected reservoirs dispersed throughout the USA. The IRHI reflected five impairment factors including siltation, structural habitat, eutrophication, water regime, and aquatic plants. The factors were weakly related to key reservoir characteristics including reservoir area, depth, age, and use type, suggesting that common reservoir descriptors are poor predictors of fish habitat impairment. The IRHI is rapid and inexpensive to calculate, provides an easily understood measure of the overall habitat impairment, allows comparison of reservoirs and therefore prioritization of restoration activities, and may be used to track restoration progress. The major limitation of the IRHI is its reliance on unstandardized professional judgment rather than standardized empirical measurements. © 2010 US Government.
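The abstract leaves the aggregation unspecified; a hypothetical sketch consistent with the stated 5-25 range is the sum of the five factor scores, each on a 1-5 scale. The factor names come from the abstract, but the arithmetic is an assumption, not Miranda and Hunt's published formula.

```python
IRHI_FACTORS = {"siltation", "structural habitat", "eutrophication",
                "water regime", "aquatic plants"}

def irhi(factor_scores):
    """factor_scores: dict mapping each of the five factors to a 1-5 score.
    The summation and the 1-5 factor scale are assumptions consistent with
    the stated 5-25 range, not the published IRHI formula."""
    if set(factor_scores) != IRHI_FACTORS:
        raise ValueError("need exactly the five IRHI impairment factors")
    if any(not 1 <= s <= 5 for s in factor_scores.values()):
        raise ValueError("factor scores must be 1-5")
    return sum(factor_scores.values())

scores = {"siltation": 4, "structural habitat": 3, "eutrophication": 5,
          "water regime": 2, "aquatic plants": 1}
print(irhi(scores))  # 15
```

All-minimum scores give the floor of 5 and all-maximum scores the ceiling of 25, matching the plausible range quoted above.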
Bose–Einstein graviton condensate in a Schwarzschild black hole
NASA Astrophysics Data System (ADS)
Alfaro, Jorge; Espriu, Domènec; Gabbanelli, Luciano
2018-01-01
We analyze in detail a previous proposal by Dvali and Gómez that black holes could be treated as consisting of a Bose–Einstein condensate of gravitons. In order to do so we extend the Einstein–Hilbert action with a chemical potential-like term, thus placing ourselves in a grand-canonical ensemble. The form and characteristics of this chemical potential-like piece are discussed in some detail. We argue that the resulting equations of motion derived from the action could be interpreted as the Gross–Pitaevskii equation describing a graviton Bose–Einstein condensate trapped by the black hole gravitational field. After this, we proceed to expand the ensuing equations of motion up to second order around the classical Schwarzschild metric so that some non-linear terms in the metric fluctuation are kept. Next we search for solutions and, modulo some very plausible assumptions, we find that the condensate vanishes outside the horizon but is non-zero in its interior. Inspired by a linearized approximation around the horizon we are able to find an exact solution for the mean-field wave function describing the graviton Bose–Einstein condensate in the black hole interior. After this, we can rederive some of the relations involving the number of gravitons N and the black hole characteristics along the lines suggested by Dvali and Gómez.
Dullet, Navjit W; Geraghty, Estella M; Kaufman, Taylor; Kissee, Jamie L; King, Jesse; Dharmar, Madan; Smith, Anthony C; Marcin, James P
2017-04-01
The objective of this study was to estimate travel-related and environmental savings resulting from the use of telemedicine for outpatient specialty consultations with a university telemedicine program. The study was designed to retrospectively analyze the telemedicine consultation database at the University of California Davis Health System (UCDHS) between July 1996 and December 2013. Travel distances and travel times were calculated between the patient home, the telemedicine clinic, and the UCDHS in-person clinic. Travel cost savings and environmental impact were calculated by determining differences in mileage reimbursement rate and emissions between those incurred in attending telemedicine appointments and those that would have been incurred if a visit to the hub site had been necessary. There were 19,246 consultations identified among 11,281 unique patients. Telemedicine visits resulted in a total travel distance savings of 5,345,602 miles, a total travel time savings of 4,708,891 minutes or 8.96 years, and a total direct travel cost savings of $2,882,056. The mean per-consultation round-trip distance savings were 278 miles, average travel time savings were 245 minutes, and average cost savings were $156. Telemedicine consultations resulted in a total emissions savings of 1969 metric tons of CO2, 50 metric tons of CO, 3.7 metric tons of NOx, and 5.5 metric tons of volatile organic compounds. This study demonstrates the positive impact of a health system's outpatient telemedicine program on patient travel time, patient travel costs, and environmental pollutants. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
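The per-consultation savings arithmetic described above is straightforward to sketch. The mileage reimbursement rate and per-mile emission factors below are illustrative placeholders, not the values used in the study.

```python
# Sketch of the per-consultation savings calculation: miles saved are the
# difference between the round trip to the in-person hub clinic and the
# round trip to the telemedicine clinic; cost and emissions scale with
# miles saved. Rate and emission factors are hypothetical.

MILEAGE_RATE = 0.54          # $/mile, illustrative placeholder
EMISSIONS_G_PER_MILE = {     # grams per vehicle-mile, illustrative
    "CO2": 368.0, "CO": 9.4, "NOx": 0.7, "VOC": 1.0,
}

def consultation_savings(miles_to_hub_rt, miles_to_telemed_rt):
    """Savings from attending the telemedicine clinic instead of the hub."""
    miles_saved = miles_to_hub_rt - miles_to_telemed_rt
    cost_saved = miles_saved * MILEAGE_RATE
    emissions_saved = {gas: miles_saved * g
                       for gas, g in EMISSIONS_G_PER_MILE.items()}
    return miles_saved, cost_saved, emissions_saved

miles, cost, emis = consultation_savings(300.0, 22.0)
print(miles, round(cost, 2))  # 278.0 150.12
```

Summing these per-consultation values over all 19,246 visits yields the study-level totals reported in the abstract.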
ERIC Educational Resources Information Center
Ford, Jeremy W.; Missall, Kristen N.; Hosp, John L.; Kuhle, Jennifer L.
2016-01-01
Advances in maze selection curriculum-based measurement have led to several published tools with technical information for interpretation (e.g., norms, benchmarks, cut-scores, classification accuracy) that have increased their usefulness for universal screening. A range of scoring practices have emerged for evaluating student performance on maze…
The Plan for Donations or "Don'tations": The Case of the Alumni-Funded Entrepreneurship Center
ERIC Educational Resources Information Center
Olsen, Mona Anita Kristiansen
2012-01-01
This case was developed for use in a course on entrepreneurial education that focuses on leadership. Background from Blue Stone University, including information on its mission, organizational structure, metrics, entrepreneurial programs, and stakeholders, is presented. This case explores how a school within a university approached the creation of…
ERIC Educational Resources Information Center
Okpara, Gazie S.; Agu, Agu G.
2017-01-01
Nonregular higher education in Nigeria became an integral part of the university manpower development since 1960, when the Ashby Commission recommended establishing evening degree programs. These ubiquitous programs have contributed to national capacity-building and remain relatively unmonitored by the National Universities Commission. This…
Can Human Capital Metrics Effectively Benchmark Higher Education with For-Profit Companies?
ERIC Educational Resources Information Center
Hagedorn, Kathy; Forlaw, Blair
2007-01-01
Last fall, Saint Louis University participated in St. Louis, Missouri's, first Human Capital Performance Study alongside several of the region's largest for-profit employers. The university also participated this year in the benchmarking of employee engagement factors conducted by the St. Louis Business Journal in its effort to quantify and select…
NASA Technical Reports Server (NTRS)
Strybel, Thomas Z.; Vu, Kim-Phuong L.; Battiste, Vernol; Dao, Arik-Quang; Dwyer, John P.; Landry, Steven; Johnson, Walter; Ho, Nhut
2011-01-01
A research consortium of scientists and engineers from California State University Long Beach (CSULB), San Jose State University Foundation (SJSUF), California State University Northridge (CSUN), Purdue University, and The Boeing Company was assembled to evaluate the impact of changes in roles and responsibilities and new automated technologies, being introduced in the Next Generation Air Transportation System (NextGen), on operator situation awareness (SA) and workload. To meet these goals, consortium members performed systems analyses of NextGen concepts and airspace scenarios, and concurrently evaluated SA, workload, and performance measures to assess their appropriateness for evaluations of NextGen concepts and tools. The following activities and accomplishments were supported by the NRA: a distributed simulation, metric development, systems analysis, part-task simulations, and large-scale simulations. As a result of this NRA, we have gained a greater understanding of situation awareness and its measurement, and have shared our knowledge with the scientific community. This network provides a mechanism for consortium members, colleagues, and students to pursue research on other topics in air traffic management and aviation, thus enabling them to make greater contributions to the field.
Applying the Goal-Question-Indicator-Metric (GQIM) Method to Perform Military Situational Analysis
2016-05-11
CMU/SEI-2016-TN-003 | Software Engineering Institute | Carnegie Mellon University | www.sei.cmu.edu. Distribution Statement A: Approved for Public Release; Distribution is Unlimited. Copyright 2016 Carnegie Mellon University. This material is based upon work funded and supported by the Department of Defense under Contract No. FA8721-05-C-0003 with Carnegie Mellon University for the operation of the Software Engineering Institute, a federally funded research and development center.
An accurate metric for the spacetime around rotating neutron stars
NASA Astrophysics Data System (ADS)
Pappas, George
2017-04-01
The problem of having an accurate description of the spacetime around rotating neutron stars is of great astrophysical interest. For astrophysical applications, one needs to have a metric that captures all the properties of the spacetime around a rotating neutron star. Furthermore, an accurate, appropriately parametrized metric, i.e., a metric that is given in terms of parameters that are directly related to the physical structure of the neutron star, could be used to solve the inverse problem, which is to infer the properties of the structure of a neutron star from astrophysical observations. In this work, we present such an approximate stationary and axisymmetric metric for the exterior of rotating neutron stars, which is constructed using the Ernst formalism and is parametrized by the relativistic multipole moments of the central object. This metric is given in terms of an expansion on the Weyl-Papapetrou coordinates with the multipole moments as free parameters and is shown to be extremely accurate in capturing the physical properties of a neutron star spacetime as they are calculated numerically in general relativity. Because the metric is given in terms of an expansion, the expressions are much simpler and easier to implement, in contrast to previous approaches. For the parametrization of the metric in general relativity, the recently discovered universal 3-hair relations are used to produce a three-parameter metric. Finally, a straightforward extension of this metric is given for scalar-tensor theories with a massless scalar field, which also admit a formulation in terms of an Ernst potential.
NASA Astrophysics Data System (ADS)
Santa Vélez, Camilo; Enea Romano, Antonio
2018-05-01
Static coordinates can be convenient to solve the vacuum Einstein's equations in the presence of spherical symmetry, but for cosmological applications comoving coordinates are more suitable to describe an expanding Universe, especially in the framework of cosmological perturbation theory (CPT). Using CPT we develop a method to transform static spherically symmetric (SSS) modifications of the de Sitter solution from static coordinates to the Newton gauge. We test the method with the Schwarzschild de Sitter (SDS) metric and then derive general expressions for the Bardeen's potentials for a class of SSS metrics obtained by adding to the de Sitter metric a term linear in the mass and proportional to a general function of the radius. Using the gauge invariance of the Bardeen's potentials we then obtain a gauge invariant definition of the turn around radius. We apply the method to an SSS solution of the Brans-Dicke theory, confirming the results obtained independently by solving the perturbation equations in the Newton gauge. The Bardeen's potentials are then derived for new SSS metrics involving logarithmic, power law and exponential modifications of the de Sitter metric. We also apply the method to SSS metrics which give flat rotation curves, computing the radial energy density profile in comoving coordinates in the presence of a cosmological constant.
A novel critical infrastructure resilience assessment approach using dynamic Bayesian networks
NASA Astrophysics Data System (ADS)
Cai, Baoping; Xie, Min; Liu, Yonghong; Liu, Yiliu; Ji, Renjie; Feng, Qiang
2017-10-01
The word resilience originates from the Latin "resiliere", which means to "bounce back". The concept has been used, with different definitions, in various fields such as ecology, economics, psychology, and sociology. In the field of critical infrastructure, several resilience metrics have been proposed, but they differ substantially from one another because each is tied to the performance characteristics of the particular object being evaluated. Here we bridge the gap by developing a universal critical infrastructure resilience metric from the perspective of reliability engineering. A dynamic Bayesian networks-based assessment approach is proposed to calculate the resilience value. A series, parallel and voting system is used to demonstrate the application of the developed resilience metric and assessment approach.
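The reliability-engineering view of resilience sketched above can be illustrated with a minimal performance-recovery metric: the system's actual performance relative to its target, averaged over a disruption-and-recovery window. The dynamic Bayesian network machinery of the paper is not reproduced here; the trajectory below is synthetic and the averaging formula is a generic assumption, not the paper's metric.

```python
# Minimal performance-based resilience sketch: average the ratio of actual
# to target performance over discrete time steps spanning a disruption and
# its recovery (1.0 = unaffected, 0.0 = total collapse). Synthetic data.

def resilience(performance, target=1.0):
    """Mean of performance/target over the observation window."""
    return sum(p / target for p in performance) / len(performance)

# Nominal operation, a disruption at step 2, then gradual bounce-back.
trajectory = [1.0, 1.0, 0.4, 0.5, 0.7, 0.9, 1.0, 1.0]
print(resilience(trajectory))  # approximately 0.8125
```

A system that degrades less, or bounces back faster, scores closer to 1; a dynamic Bayesian network would replace the fixed trajectory with a probabilistic performance forecast.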
Energy distributions of Bianchi type-VI h Universe in general relativity and teleparallel gravity
NASA Astrophysics Data System (ADS)
Özkurt, Şeref; Aygün, Sezgin
2017-04-01
In this paper, we have investigated the energy and momentum density distributions for the inhomogeneous generalizations of the homogeneous Bianchi type-VI h metric with the Einstein, Bergmann-Thomson, Landau-Lifshitz, Papapetrou, Tolman and Møller prescriptions in general relativity (GR) and teleparallel gravity (TG). We have found exactly the same results for the Einstein, Bergmann-Thomson and Landau-Lifshitz energy-momentum distributions in the Bianchi type-VI h metric for the different gravitation theories. The energy-momentum distributions of the Bianchi type-VI h metric are found to be zero for h = -1 in GR and TG. Moreover, our results agree with those of Tripathy et al., Tryon, Rosen and Aygün et al.
Conditions for defocusing around more general metrics in infinite derivative gravity
NASA Astrophysics Data System (ADS)
Edholm, James
2018-04-01
Infinite derivative gravity is able to resolve the big bang curvature singularity present in general relativity by using a simplifying ansatz. We show that it can also avoid the Hawking-Penrose singularity, by allowing defocusing of null rays through the Raychaudhuri equation. This occurs not only in the minimal case where we ignore the matter contribution but also in the case where matter plays a key role. We investigate the conditions for defocusing for the general case where this ansatz applies and also for more specific metrics, including a general Friedmann-Robertson-Walker metric and three specific choices of the scale factor which produce a bouncing Friedmann-Robertson-Walker universe.
The Ultimate Question of Origins: God and the Beginning of the Universe
NASA Astrophysics Data System (ADS)
Craig, William Lane
1999-12-01
Both cosmology and philosophy trace their roots to the wonder felt by the ancient Greeks as they contemplated the universe. The ultimate question remains why the universe exists rather than nothing. This question led Leibniz to postulate the existence of a metaphysically necessary being, which he identified as God. Leibniz's critics, however, disputed this identification, claiming that the space-time universe itself may be the metaphysically necessary being. The discovery during this century that the universe began to exist, however, calls into question the universe's status as metaphysically necessary, since any necessary being must be eternal in its existence. Although various cosmogonic models claiming to avert the beginning of the universe predicted by the standard model have been and continue to be offered, no model involving an eternal universe has proved as plausible as the standard model. Unless we are to assert that the universe simply sprang into being uncaused out of nothing, we are thus led to Leibniz's conclusion. Several objections to inferring a supernatural cause of the origin of the universe are considered and found to be unsound.
Cross-evaluation of metrics to estimate the significance of creative works
Wasserman, Max; Zeng, Xiao Han T.; Amaral, Luís A. Nunes
2015-01-01
In a world overflowing with creative works, it is useful to be able to filter out the unimportant works so that the significant ones can be identified and thereby absorbed. An automated method could provide an objective approach for evaluating the significance of works on a universal scale. However, there have been few attempts at creating such a measure, and there are few “ground truths” for validating the effectiveness of potential metrics for significance. For movies, the US Library of Congress’s National Film Registry (NFR) contains American films that are “culturally, historically, or aesthetically significant” as chosen through a careful evaluation and deliberation process. By analyzing a network of citations between 15,425 United States-produced films procured from the Internet Movie Database (IMDb), we obtain several automated metrics for significance. The best of these metrics is able to indicate a film’s presence in the NFR at least as well or better than metrics based on aggregated expert opinions or large population surveys. Importantly, automated metrics can easily be applied to older films for which no other rating may be available. Our results may have implications for the evaluation of other creative works such as scientific research. PMID:25605881
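The citation-network approach described above can be illustrated with a toy example. The paper's best-performing metric is more refined (it exploits, among other things, the structure and timing of citations); the raw times-cited count below is just the simplest automated baseline, and the film network is hypothetical.

```python
# Toy illustration of an automated citation-based significance score:
# count how often each film is cited by later films. The network below is
# hypothetical; each edge (citing, cited) means a later film references an
# earlier one.

citations = [
    ("Film B", "Film A"), ("Film C", "Film A"),
    ("Film D", "Film A"), ("Film D", "Film B"),
]

def times_cited(edges):
    """Raw in-degree of each cited film in the citation network."""
    counts = {}
    for _citing, cited in edges:
        counts[cited] = counts.get(cited, 0) + 1
    return counts

print(times_cited(citations))  # {'Film A': 3, 'Film B': 1}
```

On the real IMDb-derived network, such scores can then be validated against the National Film Registry as the "ground truth" for significance.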
Technical Interchange Meeting Guidelines Breakout
NASA Technical Reports Server (NTRS)
Fong, Rob
2002-01-01
Along with concept developers, the Systems Evaluation and Assessment (SEA) sub-element of VAMS will develop those scenarios and metrics required for testing the new concepts that reside within the System-Level Integrated Concepts (SLIC) sub-element in the VAMS project. These concepts will come from the NRA process, space act agreements, a university group, and other NASA researchers. The emphasis of those concepts is to increase capacity while at least maintaining the current safety level. The concept providers will initially develop their own scenarios and metrics for self-evaluation. In about a year, the SEA sub-element will become responsible for conducting initial evaluations of the concepts using a common scenario and metric set. This set may derive many components from the scenarios and metrics used by the concept providers. Ultimately, the common scenario/metric set will be used to help determine the most feasible and beneficial concepts. A set of 15 questions and issues, discussed below, pertaining to the scenario and metric set, and its use for assessing concepts, was submitted by the SEA sub-element for consideration during the breakout session. The questions were divided among the three breakout groups. Each breakout group deliberated on its set of questions and provided a report on its discussion.
Quantitative evaluation of muscle synergy models: a single-trial task decoding approach
Delis, Ioannis; Berret, Bastien; Pozzo, Thierry; Panzeri, Stefano
2013-01-01
Muscle synergies, i.e., invariant coordinated activations of groups of muscles, have been proposed as building blocks that the central nervous system (CNS) uses to construct the patterns of muscle activity utilized for executing movements. Several efficient dimensionality reduction algorithms that extract putative synergies from electromyographic (EMG) signals have been developed. Typically, the quality of synergy decompositions is assessed by computing the Variance Accounted For (VAF). Yet, little is known about the extent to which the combination of those synergies encodes task-discriminating variations of muscle activity in individual trials. To address this question, here we conceive and develop a novel computational framework to evaluate muscle synergy decompositions in task space. Unlike previous methods considering the total variance of muscle patterns (VAF based metrics), our approach focuses on variance discriminating execution of different tasks. The procedure is based on single-trial task decoding from muscle synergy activation features. The task decoding based metric evaluates quantitatively the mapping between synergy recruitment and task identification and automatically determines the minimal number of synergies that captures all the task-discriminating variability in the synergy activations. In this paper, we first validate the method on plausibly simulated EMG datasets. We then show that it can be applied to different types of muscle synergy decomposition and illustrate its applicability to real data by using it for the analysis of EMG recordings during an arm pointing task. We find that time-varying and synchronous synergies with similar number of parameters are equally efficient in task decoding, suggesting that in this experimental paradigm they are equally valid representations of muscle synergies. Overall, these findings stress the effectiveness of the decoding metric in systematically assessing muscle synergy decompositions in task space. 
PMID:23471195
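The decoding-based evaluation described above can be sketched schematically: extract synergy activation coefficients from (here, simulated) EMG via non-negative matrix factorization, then score how well a simple classifier decodes task identity from those single-trial coefficients. This is not the paper's exact pipeline (which also determines the minimal synergy number and handles time-varying synergies); all data below are synthetic.

```python
# Schematic sketch: NMF synergy extraction + single-trial task decoding.
# Two fixed synthetic synergies; two tasks recruit them with different
# weights. Decoding accuracy on the NMF activation coefficients measures
# how task-discriminating the synergy representation is.

import numpy as np
from sklearn.decomposition import NMF
from sklearn.neighbors import NearestCentroid
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_muscles, n_trials = 8, 60
synergies = rng.random((2, n_muscles))        # two fixed muscle synergies
tasks = np.repeat([0, 1], n_trials // 2)      # task label per trial
# Each task recruits the two synergies with different weights, plus noise.
weights = np.where(tasks[:, None] == 0, [3.0, 0.5], [0.5, 3.0])
emg = weights @ synergies + 0.05 * rng.random((n_trials, n_muscles))

# Single-trial activation coefficients from the NMF decomposition.
activations = NMF(n_components=2, random_state=0,
                  max_iter=500).fit_transform(emg)
acc = cross_val_score(NearestCentroid(), activations, tasks, cv=5).mean()
print(0.0 <= acc <= 1.0)  # True
```

High cross-validated accuracy indicates the synergy activations carry the task-discriminating variability; repeating this while varying `n_components` would identify the minimal number of synergies needed for decoding.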
Exploration Analysis of Carbon Dioxide Levels and Ultrasound Measures of the Eye During ISS Missions
NASA Technical Reports Server (NTRS)
Young, M.; Mason, S.; Schaefer, C.; Wear, M. L.; Sargsyan, A.; Garcia, K.; Coble, C.; Gruschkus, S.; Law, J.; Alexander, D.;
2016-01-01
Enhanced screening for the Visual Impairment/Intracranial Pressure (VIIP) Syndrome, including in-flight ultrasound, was implemented in 2010 to better characterize the changes in vision observed in some long-duration crewmembers. Suggested possible risk factors for VIIP include cardiovascular changes, diet, anatomical and genetic factors, and environmental conditions. As a potent vasodilator, carbon dioxide (CO2), which is chronically elevated on the International Space Station (ISS) relative to typical indoor and outdoor ambient levels on Earth, seems a plausible contributor to VIIP. In an effort to understand the possible associations between CO2 and VIIP, this study analyzes the relationship between ambient CO2 levels on ISS and ultrasound measures of the eye obtained from ISS fliers. CO2 measurements will be pulled directly from Operational Data Reduction Complex for the Lab and Node 3 major constituent analyzers (MCAs) on ISS or from sensors located in the European Columbus module, as available. CO2 measures between ultrasound sessions will be summarized using standard time series class metrics in MATLAB including time-weighted means and variances. Cumulative CO2 exposure metrics will also be developed. Regression analyses will be used to quantify the relationships between the CO2 metrics and specific ultrasound measures. Generalized estimating equations will adjust for the repeated measures within individuals. Multiple imputation techniques will be used to adjust for any possible biases in missing data for either CO2 or ultrasound measures. These analyses will elucidate the possible relationship between CO2 and changes in vision and also inform future analysis of inflight VIIP data.
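The time-weighted mean mentioned above (the study computes it in MATLAB; this sketch uses Python) weights each CO2 reading by the duration it represents, so sparse and irregular sampling does not bias the summary. The timestamps and CO2 values below are illustrative, not ISS data.

```python
# Sketch of a time-weighted mean over irregularly sampled readings:
# values[i] is taken to hold over the interval [times[i], times[i+1]).
# Times are in hours; CO2 values are illustrative, not ISS measurements.

def time_weighted_mean(times, values):
    """Weight each reading by the duration of the interval it covers."""
    total = 0.0
    duration = times[-1] - times[0]
    for i in range(len(values) - 1):
        total += values[i] * (times[i + 1] - times[i])
    return total / duration

times = [0.0, 6.0, 18.0, 24.0]   # sample timestamps, hours
co2 = [2.0, 4.0, 3.0, 3.0]       # level held over each interval
print(time_weighted_mean(times, co2))  # (2*6 + 4*12 + 3*6)/24 = 3.25
```

A naive unweighted mean of the same readings would be 3.0; the time-weighted version correctly emphasizes the long middle interval at level 4.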
Robustness of Reconstructed Ancestral Protein Functions to Statistical Uncertainty.
Eick, Geeta N; Bridgham, Jamie T; Anderson, Douglas P; Harms, Michael J; Thornton, Joseph W
2017-02-01
Hypotheses about the functions of ancient proteins and the effects of historical mutations on them are often tested using ancestral protein reconstruction (APR)-phylogenetic inference of ancestral sequences followed by synthesis and experimental characterization. Usually, some sequence sites are ambiguously reconstructed, with two or more statistically plausible states. The extent to which the inferred functions and mutational effects are robust to uncertainty about the ancestral sequence has not been studied systematically. To address this issue, we reconstructed ancestral proteins in three domain families that have different functions, architectures, and degrees of uncertainty; we then experimentally characterized the functional robustness of these proteins when uncertainty was incorporated using several approaches, including sampling amino acid states from the posterior distribution at each site and incorporating the alternative amino acid state at every ambiguous site in the sequence into a single "worst plausible case" protein. In every case, qualitative conclusions about the ancestral proteins' functions and the effects of key historical mutations were robust to sequence uncertainty, with similar functions observed even when scores of alternate amino acids were incorporated. There was some variation in quantitative descriptors of function among plausible sequences, suggesting that experimentally characterizing robustness is particularly important when quantitative estimates of ancient biochemical parameters are desired. The worst plausible case method appears to provide an efficient strategy for characterizing the functional robustness of ancestral proteins to large amounts of sequence uncertainty. Sampling from the posterior distribution sometimes produced artifactually nonfunctional proteins for sequences reconstructed with substantial ambiguity. © The Author 2016. 
Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.
NASA Astrophysics Data System (ADS)
Chacón-Cardona, C. A.; Casas-Miranda, R. A.
2014-10-01
Recent works on large-scale structure in the universe cast doubt on the almost universally accepted homogeneity transition (Joyce et al. 2005; Gaite 2007; Chacón-Cardona & Casas-Miranda 2012). In the present work we develop theoretically the density contrast for the spherical collapse of a dark-matter over-density evolving in an inhomogeneous universe within a fractal cosmology, based on the metric presented by Bondi (1947).
Analysis of Dependencies and Impacts of Metroplex Operations
NASA Technical Reports Server (NTRS)
DeLaurentis, Daniel A.; Ayyalasomayajula, Sricharan
2010-01-01
This report documents research performed by Purdue University under subcontract to the George Mason University (GMU) for the Metroplex Operations effort sponsored by NASA's Airportal Project. Purdue University conducted two tasks in support of the larger efforts led by GMU: a) a literature review on metroplex operations followed by identification and analysis of metroplex dependencies, and b) the analysis of impacts of metroplex operations on the larger U.S. domestic airline service network. The tasks are linked in that the ultimate goal is an understanding of the role of dependencies among airports in a metroplex in causing delays both locally and network-wide. The Purdue team has formulated a system-of-systems framework to analyze metroplex dependencies (including simple metrics to quantify them) and develop compact models to predict delays based on network structure. These metrics and models were developed to provide insights for planners to formulate tailored policies and operational strategies that streamline metroplex operations and mitigate delays and congestion.
Florida Atlantic University Work Plan Presentation for 2012-13 Board of Governors Review
ERIC Educational Resources Information Center
Board of Governors, State University System of Florida, 2012
2012-01-01
The State University System of Florida has developed three tools that aid in guiding the System's future: (1) The Board of Governors' new "Strategic Plan 2012-2025" is driven by goals and associated metrics that stake out where the System is headed; (2) The Board's "Annual Accountability Report" provides yearly tracking for how…
University of North Florida Work Plan Presentation for 2012-13 Board of Governors Review
ERIC Educational Resources Information Center
Board of Governors, State University System of Florida, 2012
2012-01-01
The State University System of Florida has developed three tools that aid in guiding the System's future: (1) The Board of Governors' new "Strategic Plan 2012-2025" is driven by goals and associated metrics that stake out where the System is headed; (2) The Board's "Annual Accountability Report" provides yearly tracking for how…
Florida State University Work Plan Presentation for 2013-14 Board of Governors Review
ERIC Educational Resources Information Center
Board of Governors, State University System of Florida, 2013
2013-01-01
The State University System of Florida has developed three tools that aid in guiding the System's future: (1) The Board of Governors' new "Strategic Plan 2012-2025" is driven by goals and associated metrics that stake out where the System is headed; (2) The Board's "Annual Accountability Report" provides yearly tracking for how…
Florida International University Work Plan Presentation for 2014-15 Board of Governors Review
ERIC Educational Resources Information Center
Board of Governors, State University System of Florida, 2014
2014-01-01
The State University System of Florida has developed three tools that aid in guiding the System's future: (1) The Board of Governors' new "Strategic Plan 2012-2025" is driven by goals and associated metrics that stake out where the System is headed; (2) The Board's "Annual Accountability Report" provides yearly tracking for how…
University of Central Florida Work Plan Presentation for 2012-13 Board of Governors Review
ERIC Educational Resources Information Center
Board of Governors, State University System of Florida, 2012
2012-01-01
The State University System of Florida has developed three tools that aid in guiding the System's future: (1) The Board of Governors' new "Strategic Plan 2012-2025" is driven by goals and associated metrics that stake out where the System is headed; (2) The Board's "Annual Accountability Report" provides yearly tracking for how…
Florida Gulf Coast University Work Plan Presentation for 2014-15 Board of Governors Review
ERIC Educational Resources Information Center
Board of Governors, State University System of Florida, 2014
2014-01-01
The State University System of Florida has developed three tools that aid in guiding the System's future: (1) The Board of Governors' new "Strategic Plan 2012-2025" is driven by goals and associated metrics that stake out where the System is headed; (2) The Board's "Annual Accountability Report" provides yearly tracking for how…
Florida Polytechnic University Work Plan Presentation for 2014-15 Board of Governors Review
ERIC Educational Resources Information Center
Board of Governors, State University System of Florida, 2014
2014-01-01
The State University System of Florida has developed three tools that aid in guiding the System's future: (1) The Board of Governors' new "Strategic Plan 2012-2025" is driven by goals and associated metrics that stake out where the System is headed; (2) The Board's "Annual Accountability Report" provides yearly tracking for how…
University of North Florida Work Plan Presentation for 2014-15 Board of Governors Review
ERIC Educational Resources Information Center
Board of Governors, State University System of Florida, 2014
2014-01-01
The State University System of Florida has developed three tools that aid in guiding the System's future: (1) The Board of Governors' new "Strategic Plan 2012-2025" is driven by goals and associated metrics that stake out where the System is headed; (2) The Board's "Annual Accountability Report" provides yearly tracking for how…
Florida International University Work Plan Presentation for 2012-13 Board of Governors Review
ERIC Educational Resources Information Center
Board of Governors, State University System of Florida, 2012
2012-01-01
The State University System of Florida has developed three tools that aid in guiding the System's future: (1) The Board of Governors' new "Strategic Plan 2012-2025" is driven by goals and associated metrics that stake out where the System is headed; (2) The Board's "Annual Accountability Report" provides yearly tracking for how…
University of West Florida Work Plan, 2013-2014
ERIC Educational Resources Information Center
Board of Governors, State University System of Florida, 2013
2013-01-01
The State University System of Florida has developed three tools that aid in guiding the System's future: (1) The Board of Governors' new Strategic Plan 2012-2025 is driven by goals and associated metrics that stake out where the System is headed; (2) The Board's Annual Accountability Report provides yearly tracking for how the System is…
University of North Florida Work Plan Presentation for 2013-14 Board of Governors Review
ERIC Educational Resources Information Center
Board of Governors, State University System of Florida, 2013
2013-01-01
The State University System of Florida has developed three tools that aid in guiding the System's future: (1) The Board of Governors' new "Strategic Plan 2012-2025" is driven by goals and associated metrics that stake out where the System is headed; (2) The Board's "Annual Accountability Report" provides yearly tracking for how…
Florida Gulf Coast University Work Plan Presentation for 2012-13 Board of Governors Review
ERIC Educational Resources Information Center
Board of Governors, State University System of Florida, 2012
2012-01-01
The State University System of Florida has developed three tools that aid in guiding the System's future: (1) The Board of Governors' new "Strategic Plan 2012-2025" is driven by goals and associated metrics that stake out where the System is headed; (2) The Board's "Annual Accountability Report" provides yearly tracking for how…
Florida Polytechnic University Work Plan Presentation for 2013-14 Board of Governors Review
ERIC Educational Resources Information Center
Board of Governors, State University System of Florida, 2013
2013-01-01
The State University System of Florida has developed three tools that aid in guiding the System's future: (1) The Board of Governors' new "Strategic Plan 2012-2025" is driven by goals and associated metrics that stake out where the System is headed; (2) The Board's "Annual Accountability Report" provides yearly tracking for how…
University of West Florida Work Plan Presentation for 2012-13 Board of Governors Review
ERIC Educational Resources Information Center
Board of Governors, State University System of Florida, 2012
2012-01-01
The State University System of Florida has developed three tools that aid in guiding the System's future: (1) The Board of Governors' new "Strategic Plan 2012-2025" is driven by goals and associated metrics that stake out where the System is headed; (2) The Board's "Annual Accountability Report" provides yearly tracking for how…
Florida A&M University Work Plan Presentation for 2014-15 Board of Governors Review
ERIC Educational Resources Information Center
Board of Governors, State University System of Florida, 2014
2014-01-01
The State University System of Florida has developed three tools that aid in guiding the System's future: (1) The Board of Governors' new "Strategic Plan 2012-2025" is driven by goals and associated metrics that stake out where the System is headed; (2) The Board's "Annual Accountability Report" provides yearly tracking for how…
Florida Gulf Coast University Work Plan Presentation for 2013-14 Board of Governors Review
ERIC Educational Resources Information Center
Board of Governors, State University System of Florida, 2013
2013-01-01
The State University System of Florida has developed three tools that aid in guiding the System's future: (1) The Board of Governors' new "Strategic Plan 2012-2025" is driven by goals and associated metrics that stake out where the System is headed; (2) The Board's "Annual Accountability Report" provides yearly tracking for how…
Florida Atlantic University Work Plan Presentation for 2013-14 Board of Governors Review
ERIC Educational Resources Information Center
Board of Governors, State University System of Florida, 2013
2013-01-01
The State University System of Florida has developed three tools that aid in guiding the System's future: (1) The Board of Governors' new "Strategic Plan 2012-2025" is driven by goals and associated metrics that stake out where the System is headed; (2) The Board's "Annual Accountability Report" provides yearly tracking for how…
Florida A&M University Work Plan Presentation for 2013-14 Board of Governors Review
ERIC Educational Resources Information Center
Board of Governors, State University System of Florida, 2014
2014-01-01
The State University System of Florida has developed three tools that aid in guiding the System's future: (1) The Board of Governors' new "Strategic Plan 2012-2025" is driven by goals and associated metrics that stake out where the System is headed; (2) The Board's "Annual Accountability Report" provides yearly tracking for how…
Florida Atlantic University Work Plan Presentation for 2014-15 Board of Governors Review
ERIC Educational Resources Information Center
Board of Governors, State University System of Florida, 2014
2014-01-01
The State University System of Florida has developed three tools that aid in guiding the System's future: (1) The Board of Governors' new "Strategic Plan 2012-2025" is driven by goals and associated metrics that stake out where the System is headed; (2) The Board's "Annual Accountability Report" provides yearly tracking for how…
Florida State University Work Plan Presentation for 2014-15 Board of Governors Review
ERIC Educational Resources Information Center
Board of Governors, State University System of Florida, 2014
2014-01-01
The State University System of Florida has developed three tools that aid in guiding the System's future: (1) The Board of Governors' new "Strategic Plan 2012-2025" is driven by goals and associated metrics that stake out where the System is headed; (2) The Board's "Annual Accountability Report" provides yearly tracking for how…
University of Florida Work Plan Presentation for 2014-15 Board of Governors Review
ERIC Educational Resources Information Center
Board of Governors, State University System of Florida, 2014
2014-01-01
The State University System of Florida has developed three tools that aid in guiding the System's future: (1) The Board of Governors' new "Strategic Plan 2012-2025" is driven by goals and associated metrics that stake out where the System is headed; (2) The Board's "Annual Accountability Report" provides yearly tracking for how…
Florida International University Work Plan Presentation for 2013-14 Board of Governors Review
ERIC Educational Resources Information Center
Board of Governors, State University System of Florida, 2013
2013-01-01
The State University System of Florida has developed three tools that aid in guiding the System's future: (1) The Board of Governors' new "Strategic Plan 2012-2025" is driven by goals and associated metrics that stake out where the System is headed; (2) The Board's "Annual Accountability Report" provides yearly tracking for how…
University of Central Florida Work Plan Presentation for 2014-15 Board of Governors Review
ERIC Educational Resources Information Center
Board of Governors, State University System of Florida, 2014
2014-01-01
The State University System of Florida has developed three tools that aid in guiding the System's future: (1) The Board of Governors' new "Strategic Plan 2012-2025" is driven by goals and associated metrics that stake out where the System is headed; (2) The Board's "Annual Accountability Report" provides yearly tracking for how…
University of Florida Work Plan Presentation for 2013-14 Board of Governors Review
ERIC Educational Resources Information Center
Board of Governors, State University System of Florida, 2013
2013-01-01
The State University System of Florida has developed three tools that aid in guiding the System's future: (1) The Board of Governors' new "Strategic Plan 2012-2025" is driven by goals and associated metrics that stake out where the System is headed; (2) The Board's "Annual Accountability Report" provides yearly tracking for how…
Florida State University Work Plan Presentation for 2012-13 Board of Governors Review
ERIC Educational Resources Information Center
Board of Governors, State University System of Florida, 2012
2012-01-01
The State University System of Florida has developed three tools that aid in guiding the System's future: (1) The Board of Governors' new "Strategic Plan 2012-2025" is driven by goals and associated metrics that stake out where the System is headed; (2) The Board's "Annual Accountability Report" provides yearly tracking for how…
University of Florida Work Plan Presentation for 2012-13 Board of Governors Review
ERIC Educational Resources Information Center
Board of Governors, State University System of Florida, 2012
2012-01-01
The State University System of Florida has developed three tools that aid in guiding the System's future: (1) The Board of Governors' new "Strategic Plan 2012-2025" is driven by goals and associated metrics that stake out where the System is headed; (2) The Board's "Annual Accountability Report" provides yearly tracking for how…
University of Central Florida Work Plan Presentation for 2013-14 Board of Governors Review
ERIC Educational Resources Information Center
Board of Governors, State University System of Florida, 2013
2013-01-01
The State University System of Florida has developed three tools that aid in guiding the System's future: (1) The Board of Governors' new "Strategic Plan 2012-2025" is driven by goals and associated metrics that stake out where the System is headed; (2) The Board's "Annual Accountability Report" provides yearly tracking for how…
University of West Florida Work Plan Presentation for 2014-15 Board of Governors Review
ERIC Educational Resources Information Center
Board of Governors, State University System of Florida, 2014
2014-01-01
The State University System of Florida has developed three tools that aid in guiding the System's future: (1) The Board of Governors' new "Strategic Plan 2012-2025" is driven by goals and associated metrics that stake out where the System is headed; (2) The Board's "Annual Accountability Report" provides yearly tracking for how…
System Summary of University Annual Work Plans, 2014-15
ERIC Educational Resources Information Center
Board of Governors, State University System of Florida, 2014
2014-01-01
The State University System of Florida has developed three tools that aid in guiding the System's future: (1) The Board of Governors' new Strategic Plan 2012-2025 is driven by goals and associated metrics that stake out where the System is headed; (2) The Board's Annual Accountability Report provides yearly tracking for how the System is…
Let Them See It: A Project to Build Capacity by Raising Awareness of Teaching-Development Pathways
ERIC Educational Resources Information Center
Bolt, Susan; Fenn, Jody; Ohly, Christian
2016-01-01
In an ideal world, university teaching and research would be valued equally; however, this is not currently the case. The notion that university reputation should be judged predominantly by research metrics has been challenged by a global trend towards a demand-driven system that encourages widening participation, student choice and social…
ERIC Educational Resources Information Center
Mimura, Carol
2007-01-01
In the years since the passage of the Bayh-Dole Act of 1980, university technology transfer success has been measured primarily by traditional metrics such as numbers of patents filed, revenue obtained from licensed patents and numbers of start-up companies founded to commercialize university intellectual property. Intellectual property (IP)…
2016 System Summary of University Work Plans. Revised
ERIC Educational Resources Information Center
Board of Governors, State University System of Florida, 2016
2016-01-01
The State University System of Florida has developed three tools that aid in guiding the System's future: (1) The Board of Governors' 2025 System Strategic Plan is driven by goals and associated metrics that stake out where the System is headed; (2) The Board's Annual Accountability Report provides yearly tracking for how the System is progressing…
ERIC Educational Resources Information Center
Observatory on Borderless Higher Education, 2010
2010-01-01
Two interesting initiatives have started to develop more balanced international league tables of universities. The first initiative is being taken by UK magazine the Times Higher Education (THE), which recently entered into a collaboration with both the international academic community and research-metrics company Thomsons Reuters to create an…
Self-acceleration in scalar-bimetric theories
NASA Astrophysics Data System (ADS)
Brax, Philippe; Valageas, Patrick
2018-05-01
We describe scalar-bimetric theories where the dynamics of the Universe are governed by two separate metrics, each with an Einstein-Hilbert term. In this setting, the baryonic and dark matter components of the Universe couple to metrics which are constructed as functions of these two gravitational metrics. More precisely, the two metrics coupled to matter are obtained by a linear combination of their vierbeins, with scalar-dependent coefficients. The scalar field, contrary to dark-energy models, does not have a potential whose role is to mimic a late-time cosmological constant. The late-time acceleration of the expansion of the Universe can be easily obtained at the background level in these models by appropriately choosing the coupling functions appearing in the decomposition of the vierbeins for the baryonic and dark matter metrics. We explicitly show how the concordance model can be retrieved with negligible scalar kinetic energy. This requires the scalar coupling functions to show variations of order unity during the accelerated expansion era. This leads in turn to deviations of order unity for the effective Newton constants and a fifth force that is of the same order as Newtonian gravity, with peculiar features. The baryonic and dark matter self-gravities are amplified although the gravitational force between baryons and dark matter is reduced and even becomes repulsive at low redshift. This slows down the growth of baryonic density perturbations on cosmological scales, while dark matter perturbations are enhanced. These scalar-bimetric theories have a perturbative cutoff scale of the order of 1 AU, which prevents a precise comparison with Solar System data. On the other hand, we can deduce strong requirements on putative UV completions by analyzing the stringent constraints in the Solar System.
Hence, in our local environment, the upper bound on the time evolution of Newton's constant requires an efficient screening mechanism that both damps the fifth force on small scales and decouples the local value of Newton's constant from its cosmological value. This cannot be achieved by a quasistatic chameleon mechanism and requires going beyond the quasistatic regime, probably using derivative screenings, such as K-mouflage or Vainshtein screening, on small scales.
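As an illustrative sketch of the construction described in this abstract (the coefficient names α and β are assumptions, not taken from the paper), the matter metric is built from a scalar-weighted linear combination of the two gravitational vierbeins:

```latex
% Illustrative sketch; \alpha(\varphi), \beta(\varphi) are assumed names
% for the scalar-dependent coefficients.
e^{a}_{\mu}(\mathrm{m}) \;=\; \alpha(\varphi)\, e^{(1)a}_{\mu} \;+\; \beta(\varphi)\, e^{(2)a}_{\mu},
\qquad
g^{\mathrm{m}}_{\mu\nu} \;=\; \eta_{ab}\, e^{a}_{\mu}(\mathrm{m})\, e^{b}_{\nu}(\mathrm{m}) .
```

Choosing different coefficient pairs for the baryonic and dark matter sectors is what allows the two components to feel different effective gravitational couplings.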
Evolution of the auditory ossicles in extant hominids: metric variation in African apes and humans.
Quam, Rolf M; Coleman, Mark N; Martínez, Ignacio
2014-08-01
The auditory ossicles in primates have proven to be a reliable source of phylogenetic information. Nevertheless, to date, very little data have been published on the metric dimensions of the ear ossicles in African apes and humans. The present study relies on the largest samples of African ape ear ossicles studied to date to address questions of taxonomic differences and the evolutionary transformation of the ossicles in gorillas, chimpanzees and humans. Both African ape taxa show a malleus that is characterized by a long and slender manubrium and relatively short corpus, whereas humans show the opposite constellation of a short and thick manubrium and relatively long corpus. These changes in the manubrium are plausibly linked with changes in the size of the tympanic membrane. The main difference between the incus in African apes and humans seems to be related to changes in the functional length. Compared with chimpanzees, human incudes are larger in nearly all dimensions, except articular facet height, and show a more open angle between the axes. The gorilla incus resembles humans more closely in its metric dimensions, including functional length, perhaps as a result of the dramatically larger body size compared with chimpanzees. The differences between the stapedes of humans and African apes are primarily size-related, with humans being larger in nearly all dimensions. Nevertheless, some distinctions between the African apes were found in the obturator foramen and head height. Although correlations between metric variables in different ossicles were generally lower than those between variables in the same bone, variables of the malleus/incus complex appear to be more strongly correlated than those of the incus/stapes complex, perhaps reflecting the different embryological and evolutionary origins of the ossicles. The middle ear lever ratio for the African apes is similar to other haplorhines, but humans show the lowest lever ratio within primates. 
Very low levels of sexual dimorphism were found in the ossicles within each taxon, but some relationship with body size and several dimensions of the ear bones was found. Several of the metric distinctions in the incus and stapes imply a slightly different articulation of the ossicular chain within the tympanic cavity in African apes compared with humans. The limited auditory implications of these metric differences in the ossicles are also discussed. Finally, the results of this study suggest that several plesiomorphic features for apes may be retained in the ear bones of the early hominin taxa Australopithecus and Paranthropus as well as in the Neandertals. © 2014 Anatomical Society.
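For context, the middle-ear lever ratio discussed in this abstract is conventionally defined as the ratio of the malleus (manubrium) lever arm to the incus (long-process) lever arm; a minimal statement, with the symbol names assumed for illustration:

```latex
% L_{\text{malleus}}, L_{\text{incus}}: assumed names for the two lever-arm lengths.
\text{lever ratio} \;=\; \frac{L_{\text{malleus}}}{L_{\text{incus}}} .
```

A lower ratio, as reported here for humans, means less mechanical force amplification from this component of the ossicular chain.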
ERIC Educational Resources Information Center
Granito, Dolores
These guidelines for in-service and preservice teacher education related to the conversion to the metric system were developed from a survey of published materials, university faculty, and mathematics supervisors. The eleven guidelines fall into three major categories: (1) design of teacher training programs, (2) teacher training, and (3)…
2012-12-01
Additionally, incorporation of NF2-specific QoL metrics is progressing in all aspects of NF care at NYU. The baseline QoL data will … utilization of mental health providers to provide both counseling and, when necessary, treatment of this currently under-recognized aspect of NF2…
Ancient Chinese medical ethics and the four principles of biomedical ethics.
Tsai, D F
1999-01-01
The four principles approach to biomedical ethics (4PBE) has, since the 1970s, been increasingly developed as a universal bioethics method. Despite its wide acceptance and popularity, the 4PBE has received many challenges to its cross-cultural plausibility. This paper first specifies the principles and characteristics of ancient Chinese medical ethics (ACME), then makes a comparison between ACME and the 4PBE with a view to testing out the 4PBE's cross-cultural plausibility when applied to one particular but very extensive and prominent cultural context. The result shows that the concepts of respect for autonomy, non-maleficence, beneficence and justice are clearly identifiable in ACME. Yet, being influenced by certain socio-cultural factors, those applying the 4PBE in Chinese society may tend to adopt a "beneficence-oriented", rather than an "autonomy-oriented" approach, which, in general, is dissimilar to the practice of contemporary Western bioethics, where "autonomy often triumphs". PMID:10461594
Quantifying MCMC exploration of phylogenetic tree space.
Whidden, Chris; Matsen, Frederick A
2015-05-01
In order to gain an understanding of the effectiveness of phylogenetic Markov chain Monte Carlo (MCMC), it is important to understand how quickly the empirical distribution of the MCMC converges to the posterior distribution. In this article, we investigate this problem on phylogenetic tree topologies with a metric that is especially well suited to the task: the subtree prune-and-regraft (SPR) metric. This metric directly corresponds to the minimum number of MCMC rearrangements required to move between trees in common phylogenetic MCMC implementations. We develop a novel graph-based approach to analyze tree posteriors and find that the SPR metric is much more informative than simpler metrics that are unrelated to MCMC moves. In doing so, we show conclusively that topological peaks do occur in Bayesian phylogenetic posteriors from real data sets as sampled with standard MCMC approaches, investigate the efficiency of Metropolis-coupled MCMC (MCMCMC) in traversing the valleys between peaks, and show that conditional clade distribution (CCD) can have systematic problems when there are multiple peaks. © The Author(s) 2015. Published by Oxford University Press, on behalf of the Society of Systematic Biologists.
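The convergence question this abstract poses (how quickly the empirical MCMC distribution approaches the posterior) can be illustrated on a toy discrete state space. This is a generic Metropolis sketch under stated assumptions, not the authors' SPR-based method; the function name and weights are illustrative:

```python
import random

def metropolis_tv(weights, steps, seed=0):
    """Run Metropolis sampling on a discrete space with unnormalized
    `weights` and return the total-variation distance between the
    empirical distribution of visited states and the true posterior."""
    rng = random.Random(seed)
    n = len(weights)
    post = [w / sum(weights) for w in weights]
    state = 0
    counts = [0] * n
    for _ in range(steps):
        prop = rng.randrange(n)  # uniform proposal over all states
        # Metropolis acceptance rule for an unnormalized target
        if rng.random() < min(1.0, weights[prop] / weights[state]):
            state = prop
        counts[state] += 1
    emp = [c / steps for c in counts]
    return 0.5 * sum(abs(e - p) for e, p in zip(emp, post))

# The empirical distribution drifts toward the posterior as the chain runs longer.
short = metropolis_tv([1, 2, 7], steps=100)
long = metropolis_tv([1, 2, 7], steps=100_000)
print(short, long)
```

On tree topologies the same question is harder because the state space is enormous and multi-peaked, which is exactly why the SPR metric (counting minimal rearrangements between trees) is the informative distance here.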
Holographic Spherically Symmetric Metrics
NASA Astrophysics Data System (ADS)
Petri, Michael
The holographic principle (HP) conjectures that the maximum number of degrees of freedom of any realistic physical system is proportional to the system's boundary area. The HP has its roots in the study of black holes. It has recently been applied to cosmological solutions. In this article we apply the HP to spherically symmetric static space-times. We find that any regular spherically symmetric object saturating the HP is subject to tight constraints on the (interior) metric, energy-density, temperature and entropy-density. Whenever gravity can be described by a metric theory, gravity is macroscopically scale invariant and the laws of thermodynamics hold locally and globally, the (interior) metric of a regular holographic object is uniquely determined up to a constant factor and the interior matter-state must follow well defined scaling relations. When the metric theory of gravity is general relativity, the interior matter has an overall string equation of state (EOS) and a unique total energy-density. Thus the holographic metric derived in this article can serve as a simple interior 4D realization of Mathur's string fuzzball proposal. Some properties of the holographic metric and its possible experimental verification are discussed. The geodesics of the holographic metric describe an isotropically expanding (or contracting) universe with a nearly homogeneous matter-distribution within the local Hubble volume. Due to the overall string EOS the active gravitational mass-density is zero, resulting in a coasting expansion with Ht = 1, which is compatible with the recent GRB data.
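A one-line check of the coasting behavior cited in this abstract: a scale factor growing linearly in time gives Ht = 1 identically.

```latex
a(t) \propto t
\;\;\Longrightarrow\;\;
H \equiv \frac{\dot a}{a} = \frac{1}{t}
\;\;\Longrightarrow\;\;
Ht = 1 .
```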
Yiadom, Maame Yaa A B; Scheulen, James; McWade, Conor M; Augustine, James J
2016-07-01
The objective was to obtain a commitment to adopt a common set of definitions for emergency department (ED) demographic, clinical process, and performance metrics among the ED Benchmarking Alliance (EDBA), ED Operations Study Group (EDOSG), and Academy of Academic Administrators of Emergency Medicine (AAAEM) by 2017. A retrospective cross-sectional analysis of available data from three ED operations benchmarking organizations supported a negotiation to use a set of common metrics with identical definitions. During a 1.5-day meeting, structured according to social change theories of information exchange, self-interest, and interdependence, common definitions were identified and negotiated using the EDBA's published definitions as a start for discussion. Methods of process analysis theory were used in the 8 weeks following the meeting to achieve official consensus on definitions. These two lists were submitted to the organizations' leadership for implementation approval. A total of 374 unique measures were identified, of which 57 (15%) were shared by at least two organizations. Fourteen (4%) were common to all three organizations. In addition to agreement on definitions for the 14 measures used by all three organizations, agreement was reached on universal definitions for 17 of the 57 measures shared by at least two organizations. The negotiation outcome was a list of 31 measures with universal definitions to be adopted by each organization by 2017. The use of negotiation, social change, and process analysis theories achieved the adoption of universal definitions among the EDBA, EDOSG, and AAAEM. This will impact performance benchmarking for nearly half of US EDs. It initiates a formal commitment to utilize standardized metrics, and it transitions consistency in reporting ED operations metrics from consensus to implementation. This work advances our ability to more accurately characterize variation in ED care delivery models, resource utilization, and performance.
In addition, it permits future aggregation of these three data sets, thus facilitating the creation of more robust ED operations research data sets unified by a universal language. Negotiation, social change, and process analysis principles can be used to advance the adoption of additional definitions. © 2016 by the Society for Academic Emergency Medicine.
1989-06-01
…to facilitate in-depth communication of research results in a multi-disciplinary gathering led to a decision to have long presentations and limit the… learning subfields such as computational learning theory and explanation-based learning? Second, as the machine learning field increases its emphasis… Architecture, Pat Langley, University of California, Irvine; A Theory of Human Plausible Reasoning
ERIC Educational Resources Information Center
Board of Governors, State University System of Florida, 2013
2013-01-01
The State University System of Florida has developed three tools that aid in guiding the System's future: (1) The Board of Governors' new "Strategic Plan 2012-2025" is driven by goals and associated metrics that stake out where the System is headed; (2) The Board's "Annual Accountability Report" provides yearly tracking for how…
University of South Florida--System Work Plan Presentation for 2012-13 Board of Governors Review
ERIC Educational Resources Information Center
Board of Governors, State University System of Florida, 2012
2012-01-01
The State University System of Florida has developed three tools that aid in guiding the System's future: (1) The Board of Governors' new "Strategic Plan 2012-2025" is driven by goals and associated metrics that stake out where the System is headed; (2) The Board's "Annual Accountability Report" provides yearly tracking for how…
ERIC Educational Resources Information Center
Board of Governors, State University System of Florida, 2013
2013-01-01
The State University System of Florida has developed three tools that aid in guiding the System's future: (1) The Board of Governors' new "Strategic Plan 2012-2025" is driven by goals and associated metrics that stake out where the System is headed; (2) The Board's "Annual Accountability Report" provides yearly tracking for how…
University of South Florida Tampa Work Plan Presentation for 2013-14 Board of Governors Review
ERIC Educational Resources Information Center
Board of Governors, State University System of Florida, 2013
2013-01-01
The State University System of Florida has developed three tools that aid in guiding the System's future: (1) The Board of Governors' new "Strategic Plan 2012-2025" is driven by goals and associated metrics that stake out where the System is headed; (2) The Board's "Annual Accountability Report" provides yearly tracking for how…
University of South Florida System Work Plan Presentation for 2014-15 Board of Governors Review
ERIC Educational Resources Information Center
Board of Governors, State University System of Florida, 2014
2014-01-01
The State University System of Florida has developed three tools that aid in guiding the System's future: (1) The Board of Governors' new "Strategic Plan 2012-2025" is driven by goals and associated metrics that stake out where the System is headed; (2) The Board's "Annual Accountability Report" provides yearly tracking for how…
ERIC Educational Resources Information Center
Board of Governors, State University System of Florida, 2014
2014-01-01
The State University System of Florida has developed three tools that aid in guiding the System's future: (1) The Board of Governors' new "Strategic Plan 2012-2025" is driven by goals and associated metrics that stake out where the System is headed; (2) The Board's "Annual Accountability Report" provides yearly tracking for how…
ERIC Educational Resources Information Center
Sterman, Leila Belle; Clark, Jason A.
2017-01-01
Many research libraries are looking for new ways to demonstrate value for their parent institutions. Metrics, assessment, and promotion of research continue to grow in importance, but they have not always fallen into the scope of services for the research library. Montana State University (MSU) Library recognized a need and interest to quantify…
Fermionic Tunneling Effect and Hawking Radiation in a Non Commutative FRW Universe
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bouhalouf, H.; Aissaoui, H.; Mebarki, N.
2010-10-31
The formalism of a non commutative gauge gravity is applied to an FRW universe and the corresponding modified metric, vierbein and spin connection components are obtained. Moreover, using the Hamilton-Jacobi method and as a pure space-time deformation effect, the NCG Hawking radiation via a fermionic tunneling transition through the dynamical NCG horizon is also studied.
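The Hamilton-Jacobi tunneling method mentioned in this abstract identifies the Hawking temperature by comparing the semiclassical emission probability with a Boltzmann factor; schematically (this is the standard relation for the method, not a result specific to the noncommutative case):

```latex
\Gamma \;\sim\; e^{-2\,\mathrm{Im}\,S/\hbar}
\;\equiv\; e^{-E/T_{H}}
\quad\Longrightarrow\quad
T_{H} \;=\; \frac{\hbar\,E}{2\,\mathrm{Im}\,S},
```

where Im S is the imaginary part of the classical action accumulated across the horizon by the tunneling particle of energy E.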
ERIC Educational Resources Information Center
Xiao, Jing; Bai, Yu; He, Yini; McWhinnie, Chad M.; Ling, Yu; Smith, Hannah; Huebner, E. Scott
2016-01-01
The aim of this study was to test the gender invariance of the Chinese version of the Achievement Goal Questionnaire (AGQ-C) utilizing a sample of 1,115 Chinese university students. Multi-group confirmatory factor analysis supported the configural, metric, and scalar invariance of the AGQ-C across genders. Analyses also revealed that the latent…
Test of the FLRW Metric and Curvature with Strong Lens Time Delays
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liao, Kai; Li, Zhengxiang; Wang, Guo-Jian
We present a new model-independent strategy for testing the Friedmann–Lemaître–Robertson–Walker (FLRW) metric and constraining cosmic curvature, based on future time-delay measurements of strongly lensed quasar-elliptical galaxy systems from the Large Synoptic Survey Telescope and supernova observations from the Dark Energy Survey. The test only relies on geometric optics. It is independent of the energy contents of the universe and the validity of the Einstein equation on cosmological scales. The study comprises two levels: testing the FLRW metric through the distance sum rule (DSR) and determining/constraining cosmic curvature. We propose an effective and efficient (redshift) evolution model for performing the former test, which allows us to concretely specify the violation criterion for the FLRW DSR. If the FLRW metric is consistent with the observations, then on the second level the cosmic curvature parameter will be constrained to ∼0.057 or ∼0.041 (1σ), depending on the availability of high-redshift supernovae, which is much more stringent than current model-independent techniques. We also show that the bias in the time-delay method might be well controlled, leading to robust results. The proposed method is a new independent tool for both testing the fundamental assumptions of homogeneity and isotropy in cosmology and for determining cosmic curvature. It is complementary to cosmic microwave background plus baryon acoustic oscillation analyses, which normally assume a cosmological model with dark energy domination in the late-time universe.
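The distance sum rule referred to in the abstract can be made concrete. In terms of dimensionless comoving distances d(z_l), d(z_s), d(z_l, z_s) (notation assumed here for illustration, not taken from the paper itself), the FLRW metric implies:

```latex
% FLRW distance sum rule (DSR): holds for a curvature parameter
% \Omega_k exactly when the distances derive from an FLRW metric.
d(z_l, z_s) \;=\; d(z_s)\sqrt{1 + \Omega_k\, d(z_l)^2}
            \;-\; d(z_l)\sqrt{1 + \Omega_k\, d(z_s)^2}
```

Time-delay lenses constrain all three distances of each system, so a measured violation of this identity falsifies the FLRW assumption independently of the universe's energy content.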
NASA Astrophysics Data System (ADS)
Amaral, Marcelo M.; Aschheim, Raymond; Bubuianu, Laurenţiu; Irwin, Klee; Vacaru, Sergiu I.; Woolridge, Daniel
2017-09-01
The goal of this work is to elaborate on new geometric methods of constructing exact and parametric quasiperiodic solutions for anamorphic cosmology models in modified gravity theories, MGTs, and general relativity, GR. There exist previously studied generic off-diagonal and diagonalizable cosmological metrics encoding gravitational and matter fields with quasicrystal like structures, QC, and holonomy corrections from loop quantum gravity, LQG. We apply the anholonomic frame deformation method, AFDM, in order to decouple the (modified) gravitational and matter field equations in general form. This allows us to find integral varieties of cosmological solutions determined by generating functions, effective sources, integration functions and constants. The coefficients of metrics and connections for such cosmological configurations depend, in general, on all spacetime coordinates and can be chosen to generate observable (quasi)-periodic/aperiodic/fractal/stochastic/(super) cluster/filament/polymer like (continuous, stochastic, fractal and/or discrete structures) in MGTs and/or GR. In this work, we study new classes of solutions for anamorphic cosmology with LQG holonomy corrections. Such solutions are characterized by nonlinear symmetries of generating functions for generic off-diagonal cosmological metrics and generalized connections, with possible nonholonomic constraints to Levi-Civita configurations and diagonalizable metrics depending only on a time like coordinate. We argue that anamorphic quasiperiodic cosmological models integrate the concept of quantum discrete spacetime, with certain gravitational QC-like vacuum and nonvacuum structures. And, that of a contracting universe that homogenizes, isotropizes and flattens without introducing initial conditions or multiverse problems.
Universal health coverage in Rwanda: dream or reality.
Nyandekwe, Médard; Nzayirambaho, Manassé; Baptiste Kakoma, Jean
2014-01-01
Universal Health Coverage (UHC) has been a global concern for a long time and even more so nowadays. While a number of publications are almost unanimous that Rwanda is not far from UHC, very few have focused on its financial sustainability and on its extreme external financial dependency. The objectives of this study are: (i) to assess Rwanda UHC based mainly on Community-Based Health Insurance (CBHI) from 2000 to 2012; (ii) to inform policy makers about observed gaps for a better way forward. A retrospective (2000-2012) SWOT analysis was applied to six metrics as key indicators of UHC achievement related to the WHO definition, i.e. (i) health insurance and access to care, (ii) equity, (iii) package of services, (iv) rights-based approach, (v) quality of health care, and (vi) financial-risk protection; (vii) CBHI self-financing capacity (SFC) was added by the authors. For the first metric, overall health insurance coverage reached 96.15%, with 1.07 visits per capita per year versus the 1 visit recommended by WHO; for the second, 24.8% of indigent people were subsidized versus 24.1% living in extreme poverty; the third, fourth, and fifth metrics performed excellently; the sixth showed 10.80% versus the ≤40% limit considered acceptable for catastrophic health spending; and CBHI SFC, i.e. proper cost recovery, was estimated at 82.55% in 2011/2012. Rwanda's UHC achievements are thus objectively convincing. Rwanda UHC is not a dream but a reality if we consider all the convincing results from the seven metrics.
No-Reference Video Quality Assessment Based on Statistical Analysis in 3D-DCT Domain.
Li, Xuelong; Guo, Qun; Lu, Xiaoqiang
2016-05-13
It is an important task to design models for universal no-reference video quality assessment (NR-VQA) in multiple video processing and computer vision applications. However, most existing NR-VQA metrics are designed for specific distortion types, which are often not known in practical applications. A further deficiency is that the spatial and temporal information of videos is hardly considered simultaneously. In this paper, we propose a new NR-VQA metric based on spatiotemporal natural video statistics (NVS) in the 3D discrete cosine transform (3D-DCT) domain. In the proposed method, a set of features is first extracted based on the statistical analysis of 3D-DCT coefficients to characterize the spatiotemporal statistics of videos in different views. These features are then used to predict the perceived video quality via an efficient linear support vector regression (SVR) model. The contributions of this paper are: 1) we explore the spatiotemporal statistics of videos in the 3D-DCT domain, which has an inherent spatiotemporal encoding advantage over other widely used 2D transformations; 2) we extract a small set of simple but effective statistical features for video visual quality prediction; and 3) the proposed method is universal for multiple types of distortions and robust across different databases. The proposed method is tested on four widely used video databases. Extensive experimental results demonstrate that the proposed method is competitive with state-of-the-art NR-VQA metrics and the top-performing FR-VQA and RR-VQA metrics.
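A minimal sketch of the pipeline the abstract describes (3D-DCT coefficient statistics feeding a linear SVR), assuming synthetic data and toy features; the actual NVS features and parameters of the paper are not reproduced here:

```python
import numpy as np
from scipy.fft import dctn
from sklearn.svm import LinearSVR

def block_features(video, bs=4):
    # Split the video (T, H, W) into bs^3 cubes, take the 3D-DCT of each,
    # and summarize AC-coefficient statistics (a stand-in for NVS features).
    T, H, W = (d - d % bs for d in video.shape)
    feats = []
    for t in range(0, T, bs):
        for y in range(0, H, bs):
            for x in range(0, W, bs):
                c = dctn(video[t:t+bs, y:y+bs, x:x+bs], norm="ortho").ravel()
                ac = c[1:]                       # drop the DC coefficient
                feats.append([np.mean(np.abs(ac)), np.std(ac)])
    return np.asarray(feats).mean(axis=0)        # pool over cubes

rng = np.random.default_rng(0)
X = np.stack([block_features(rng.normal(size=(8, 16, 16))) for _ in range(20)])
y = rng.uniform(0, 100, size=20)                 # synthetic quality scores
model = LinearSVR(max_iter=10000).fit(X, y)      # linear SVR regressor
pred = model.predict(X)
print(pred.shape)                                 # (20,)
```

The real metric would extract a richer statistical feature set per frequency band, but the structure (block 3D-DCT → feature pooling → linear SVR) is the same.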
Extending the ΛCDM model through shear-free anisotropies
NASA Astrophysics Data System (ADS)
Pereira, Thiago S.; Pabon, Davincy T.
2016-07-01
If the spacetime metric has anisotropic spatial curvature, one can still expand the universe as if it were isotropic, provided that the energy-momentum tensor satisfies a certain constraint. This leads to the so-called shear-free (SF) metrics, which have the interesting property of violating the cosmological principle while still preserving the isotropy of the cosmic microwave background (CMB) radiation. In this work, we show that SF cosmologies correspond to an attractor solution in the space of models with anisotropic spatial curvature. Through a rigorous definition of linear perturbation theory in these spacetimes, we show that SF models represent a viable alternative to explain the large-scale evolution of the universe, leading, in particular, to a kinematically equivalent Sachs-Wolfe (SW) effect. Alternatively, we discuss some specific signatures that SF models would imprint on the temperature spectrum of CMB.
Improving Staffing and Nurse Engagement in a Neuroscience Intermediate Unit.
Nadolski, Charles; Britt, Pheraby; Ramos, Leah C
2017-06-01
The neuroscience intermediate unit is a 23-bed unit that was initially staffed with a nurse-to-patient ratio of 1:4 to 1:5. In time, the unit's capacity to care for a growing number of progressively acute patients fell short of the desired goals, affecting nurse satisfaction. The clinical nurses desired a lower nurse-patient ratio. The purpose of this project was to justify a staffing increase through a return on investment and increased quality metrics. This initiative used mixed methodology to determine the ideal staffing for a neuroscience intermediate unit. The quantitative section focused on a review of the acuity of the patients. The qualitative section was based on descriptive interviews with University Healthcare Consortium nurse managers from similar units. The study reviewed the acuity of 9,832 patient days to determine the accurate acuity of neuroscience intermediate unit patients. Nurse managers at 12 University Healthcare Consortium hospitals and 8 units at the Medical University of South Carolina were contacted to compare staffing levels. The increase in nurse staffing contributed to an increase in many quality metrics. There was an 80% decrease in controllable nurse turnover and a 75% reduction in falls with injury after the nurse-patient ratio was lowered. These 2 metrics established a return on investment for the staffing increase. In addition, the staffing satisfaction question on the Press Ganey employee engagement survey increased from 2.44 in 2013 to 3.72 in 2015 in response to the advocacy of the bedside nurses.
A comprehensive quality control workflow for paired tumor-normal NGS experiments.
Schroeder, Christopher M; Hilke, Franz J; Löffler, Markus W; Bitzer, Michael; Lenz, Florian; Sturm, Marc
2017-06-01
Quality control (QC) is an important part of all NGS data analysis stages. Many available tools calculate QC metrics from different analysis steps of single-sample experiments (raw reads, mapped reads and variant lists). Multi-sample experiments, such as sequencing of tumor-normal pairs, require additional QC metrics to ensure the validity of results. These multi-sample QC metrics still lack standardization. We therefore suggest a new workflow for QC of DNA sequencing of tumor-normal pairs. With this workflow, well-known single-sample QC metrics and additional metrics specific to tumor-normal pairs can be calculated. The segmentation into different tools offers high flexibility and allows reuse for other purposes. All tools produce qcML, a generic XML format for QC of -omics experiments. qcML uses quality metrics defined in an ontology, which was adapted for NGS. All QC tools are implemented in C++ and run both under Linux and Windows. Plotting requires python 2.7 and matplotlib. The software is available under the 'GNU General Public License version 2' as part of the ngs-bits project: https://github.com/imgag/ngs-bits. christopher.schroeder@med.uni-tuebingen.de. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
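One simple tumor-normal QC metric of the kind the abstract alludes to is a sample-identity check: correlating variant allele frequencies (VAFs) at germline SNP sites between the two samples of a pair. This is an illustrative sketch with simulated VAFs, not the ngs-bits implementation:

```python
import numpy as np

def sample_correlation(vaf_a, vaf_b):
    # Pearson correlation of variant allele frequencies at shared SNP
    # sites; a low value suggests the pair comes from different donors.
    return float(np.corrcoef(np.asarray(vaf_a), np.asarray(vaf_b))[0, 1])

rng = np.random.default_rng(1)
germline = rng.choice([0.0, 0.5, 1.0], size=200)            # hom-ref/het/hom-alt
matched = np.clip(germline + rng.normal(0, 0.05, 200), 0, 1)  # same donor + noise
unrelated = rng.choice([0.0, 0.5, 1.0], size=200)            # different donor
print(sample_correlation(germline, matched) > 0.9)            # True
```

In a real workflow the VAFs would come from the variant lists of the normal and tumor BAMs, and the threshold would be calibrated against known matched pairs.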
Alkozei, Anna; Smith, Ryan; Kotzin, Megan D; Waugaman, Debby L; Killgore, William D S
2017-01-27
It has been shown that higher levels of trait gratitude are associated with better self-reported sleep quality, possibly due to differences in presleep cognitions. However previous studies have not taken into account the role of depressive symptoms in this relationship. In this study, 88 nonclinical 18-29-year-olds completed the Gratitude Resentment and Appreciation Test (GRAT) as a measure of trait gratitude. The Glasgow Content of Thought Inventory (GCTI) was used to measure the intrusiveness of cognitions prior to sleep onset, the Motivation and Energy Inventory (MEI) assessed daytime fatigue, and the Pittsburgh Sleep Quality Index (PSQI) was used to assess self-reported sleep quality. The BDI-II assessed self-reported depressive symptoms. Consistent with previous work, GRAT scores were positively associated with higher daytime energy and greater number of hours of sleep per night. Importantly, however, we further observed that depressive symptoms mediated the relationships between gratitude scores and sleep metrics. Depressive mood state appears to mediate the association between gratitude and self-reported sleep quality metrics. We suggest, as one plausible model of these phenomena, that highly grateful individuals have lower symptoms of depression, which in turn leads to fewer presleep worries, resulting in better perceived sleep quality. Future work should aim to disentangle the causal nature of these relationships in order to better understand how these important variables interact.
The impact of variation in scaling factors on the estimation of ...
Many physiologically based pharmacokinetic (PBPK) models include values for metabolic rate parameters extrapolated from in vitro metabolism studies using scaling factors such as mg of microsomal protein per gram of liver (MPPGL) and liver mass (FVL). Variation in scaling factor values impacts metabolic rate parameter estimates (Vmax) and hence estimates of internal dose used in dose response analysis. The impacts of adult human variation in MPPGL and FVL on estimates of internal dose were assessed using a human PBPK model for BDCM for several internal dose metrics for two exposure scenarios (a single 0.25-liter drink of water or a 10-minute shower) under plausible (5 micrograms/L) and high-level (20 micrograms/L) water concentrations. For both concentrations, all internal dose metrics were changed less than 5% for the showering scenario (combined inhalation and dermal exposure). In contrast, a 27-fold variation in area under the curve for BDCM in venous blood was observed at both oral exposure concentrations, whereas the total amount of BDCM metabolized in liver was relatively unchanged. This analysis demonstrates that variability in the scaling factors used for in vitro to in vivo extrapolation (IVIVE) for metabolic rate parameters can have a significant route-dependent impact on estimates of internal dose under environmentally relevant exposure scenarios. This indicates the need to evaluate both uncertainty and variability for scaling factors used for IVIVE.
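The scaling step the abstract describes can be sketched as a one-line extrapolation; the numeric values below are hypothetical placeholders for plausible adult ranges, not the study's actual MPPGL/FVL distributions:

```python
def scale_vmax(vmax_invitro, mppgl, fvl, body_weight_kg):
    # In vitro to in vivo extrapolation (IVIVE) of metabolic capacity:
    # (amount/min/mg microsomal protein) x (mg protein/g liver) x (g liver)
    liver_mass_g = fvl * body_weight_kg * 1000.0  # FVL = liver fraction of body weight
    return vmax_invitro * mppgl * liver_mass_g

# Hypothetical low- and high-end adult scaling factors for a 70 kg adult
low = scale_vmax(0.1, 21.0, 0.021, 70.0)
high = scale_vmax(0.1, 45.0, 0.031, 70.0)
print(round(high / low, 2))  # ~3.16: scaling-factor variation alone shifts Vmax ~3-fold
```

This illustrates why the same in vitro measurement can yield very different whole-body Vmax estimates, which then propagate route-dependently into internal dose metrics.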
Interdisciplinary research has consistently lower funding success.
Bromham, Lindell; Dinnage, Russell; Hua, Xia
2016-06-30
Interdisciplinary research is widely considered a hothouse for innovation, and the only plausible approach to complex problems such as climate change. One barrier to interdisciplinary research is the widespread perception that interdisciplinary projects are less likely to be funded than those with a narrower focus. However, this commonly held belief has been difficult to evaluate objectively, partly because of lack of a comparable, quantitative measure of degree of interdisciplinarity that can be applied to funding application data. Here we compare the degree to which research proposals span disparate fields by using a biodiversity metric that captures the relative representation of different fields (balance) and their degree of difference (disparity). The Australian Research Council's Discovery Programme provides an ideal test case, because a single annual nationwide competitive grants scheme covers fundamental research in all disciplines, including arts, humanities and sciences. Using data on all 18,476 proposals submitted to the scheme over 5 consecutive years, including successful and unsuccessful applications, we show that the greater the degree of interdisciplinarity, the lower the probability of being funded. The negative impact of interdisciplinarity is significant even when number of collaborators, primary research field and type of institution are taken into account. This is the first broad-scale quantitative assessment of success rates of interdisciplinary research proposals. The interdisciplinary distance metric allows efficient evaluation of trends in research funding, and could be used to identify proposals that require assessment strategies appropriate to interdisciplinary research.
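The balance-plus-disparity measure described is in the family of Rao's quadratic entropy, Q = Σᵢⱼ pᵢ pⱼ dᵢⱼ. A hedged sketch (the field proportions and distance matrix below are hypothetical, not the Australian Research Council data):

```python
import numpy as np

def rao_q(proportions, disparity):
    # Rao's quadratic entropy: sum_ij p_i p_j d_ij combines balance
    # (relative representation of fields) with disparity (pairwise
    # distances between fields).
    p = np.asarray(proportions, dtype=float)
    d = np.asarray(disparity, dtype=float)
    return float(p @ d @ p)

d = np.array([[0.0, 0.2, 0.9],    # hypothetical pairwise field distances
              [0.2, 0.0, 0.8],
              [0.9, 0.8, 0.0]])
narrow = rao_q([1.0, 0.0, 0.0], d)  # single-field proposal
broad = rao_q([0.4, 0.3, 0.3], d)   # proposal spanning distant fields
print(narrow < broad)                # True: spanning distant fields scores higher
```

A single-field proposal scores zero, and the score grows both with more even representation and with more distant fields, matching the balance/disparity decomposition in the abstract.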
Application of hazard models for patients with breast cancer in Cuba
Alfonso, Anet Garcia; de Oca, Néstor Arcia Montes
2011-01-01
There has been a rapid development in hazard models and survival analysis in the last decade. This article aims to assess the overall survival time of breast cancer in Cuba, as well as to determine plausible factors that may have a significant impact in the survival time. The data are obtained from the National Cancer Register of Cuba. The data set used in this study relates to 6381 patients diagnosed with breast cancer between January 2000 and December 2002. Follow-up data are available until the end of December 2007, by which time 2167 (33.9%) had died and 4214 (66.1%) were still alive. The adequacy of six parametric models is assessed by using their Akaike information criterion values. Five of the six parametric models (Exponential, Weibull, Log-logistic, Lognormal, and Generalized Gamma) are parameterized by using the accelerated failure-time metric, and the Gompertz model is parameterized by using the proportional hazard metric. The main result in terms of survival is found for the different categories of the clinical stage covariate. The survival time among patients who have been diagnosed at early stage of breast cancer is about 60% higher than the one among patients diagnosed at more advanced stage of the disease. Differences among provinces have not been found. The age is another significant factor, but there is no important difference between patient ages. PMID:21686138
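The two parameterizations the abstract contrasts can be written explicitly in their standard textbook forms (the covariate vector x and coefficients β here are generic, not the study's fitted values):

```latex
% Accelerated failure-time (AFT) metric: covariates rescale time,
% used for the Exponential, Weibull, Log-logistic, Lognormal and
% Generalized Gamma models
S(t \mid x) \;=\; S_0\!\left(t\, e^{-x^{\top}\beta}\right)

% Proportional hazards (PH) metric, used for the Gompertz model:
h(t \mid x) \;=\; h_0(t)\, e^{x^{\top}\beta}
```

Under the AFT metric a positive coefficient stretches survival time multiplicatively, which is why the paper can report survival among early-stage patients as "about 60% higher" than among late-stage patients.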
Fischer, H Felix; Wahl, Inka; Nolte, Sandra; Liegl, Gregor; Brähler, Elmar; Löwe, Bernd; Rose, Matthias
2017-12-01
To investigate differential item functioning (DIF) of PROMIS Depression items between US and German samples, we compared data from the US PROMIS calibration sample (n = 780), a German general population survey (n = 2,500) and a German clinical sample (n = 621). DIF was assessed in an ordinal logistic regression framework, with 0.02 as the criterion for R²-change and 0.096 for Raju's non-compensatory DIF. Item parameters were initially fixed to the PROMIS Depression metric; we used plausible values to account for uncertainty in depression estimates. Only four items showed DIF. Accounting for DIF led to negligible effects for the full item bank as well as a post hoc simulated computer-adaptive test (< 0.1 point on the PROMIS metric [mean = 50, standard deviation = 10]), while the effect on the short forms was small (< 1 point). The mean depression severity (43.6) in the German general population sample was considerably lower than the US reference value of 50. Overall, we found little evidence for language DIF between US and German samples, which could be addressed either by replacing the DIF items with items not showing DIF or by scoring the short form in German samples with the corrected item parameters reported. Copyright © 2016 John Wiley & Sons, Ltd.
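A simplified sketch of DIF detection by R²-change, using a binary item and McFadden's pseudo-R² in place of the full ordinal model (the 0.02 criterion is from the abstract; the data and effect sizes are simulated for illustration):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def log_lik(model, X, y):
    # Log-likelihood of a fitted binary logistic model.
    p = model.predict_proba(X)[:, 1]
    return float(np.sum(y * np.log(p) + (1 - y) * np.log(1 - p)))

rng = np.random.default_rng(2)
n = 2000
severity = rng.normal(size=n)            # latent depression level
group = rng.integers(0, 2, size=n)       # 0 = US sample, 1 = German sample
logit = 1.5 * severity + 1.5 * group     # group shifts the item threshold: uniform DIF
y = (rng.uniform(size=n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

p_bar = y.mean()                         # intercept-only (null) log-likelihood
ll_null = float(n * (p_bar * np.log(p_bar) + (1 - p_bar) * np.log(1 - p_bar)))
X1, X2 = severity.reshape(-1, 1), np.column_stack([severity, group])
m1 = LogisticRegression().fit(X1, y)     # trait only
m2 = LogisticRegression().fit(X2, y)     # trait + group membership
delta_r2 = (1 - log_lik(m2, X2, y) / ll_null) - (1 - log_lik(m1, X1, y) / ll_null)
print(delta_r2 > 0.02)                   # True: flag DIF when criterion exceeded
```

The real analysis uses ordinal logistic regression over all item categories and tests both uniform and non-uniform DIF terms, but the nested-model R²-change logic is the same.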
Endoscopic Full-Field Swept-Source Optical Coherence Tomography Neuroimaging System
NASA Astrophysics Data System (ADS)
Felts Almog, Ilan
Optical Coherence Tomography (OCT) has the capability to differentiate brain elements with intrinsic contrast and at a resolution an order of magnitude higher than other imaging modalities. This thesis investigates the feasibility of OCT for neuroimaging applied to neurosurgical guidance. We present, to our knowledge, the first Full-Field Swept-Source OCT system operating near a wavelength of 1310 nm, achieving a transverse imaging resolution of 6.5 μm, an axial resolution of 14 μm in tissue, and a field of view of 270 μm × 180 μm × 400 μm. Imaging experiments were performed on rat brain tissues ex vivo, human cortical tissue ex vivo, and rats in vivo. A multi-level threshold metric applied to the intensity of the images led to a plausible correlation between the observed density and location in the brain. The proof-of-concept OCT system can be improved and miniaturized for clinical use.
Metrics for quantifying antimicrobial use in beef feedlots.
Benedict, Katharine M; Gow, Sheryl P; Reid-Smith, Richard J; Booker, Calvin W; Morley, Paul S
2012-08-01
Accurate antimicrobial drug use data are needed to enlighten discussions regarding the impact of antimicrobial drug use in agriculture. The primary objective of this study was to investigate the perceived accuracy and clarity of different methods for reporting antimicrobial drug use information collected regarding beef feedlots. Producers, veterinarians, industry representatives, public health officials, and other knowledgeable beef industry leaders were invited to complete a web-based survey. A total of 156 participants in 33 US states, 4 Canadian provinces, and 8 other countries completed the survey. No single metric was considered universally optimal for all use circumstances or for all audiences. To effectively communicate antimicrobial drug use data, evaluation of the target audience is critical to presenting the information. Metrics that are most accurate need to be carefully and repeatedly explained to the audience.
New universal attractor in nonminimally coupled gravity: Linear inflation
NASA Astrophysics Data System (ADS)
Racioppi, Antonio
2018-06-01
Once quantum corrections are taken into account, the strong coupling limit of the ξ-attractor models (in metric gravity) might depart from the usual Starobinsky solution and move into linear inflation. Furthermore, it is well known that the metric and Palatini formulations of gravity lead to different inflationary predictions in the presence of nonminimal couplings between gravity and the inflaton. In this paper, we show that for a certain class of nonminimally coupled models, loop corrections will lead to a linear inflation attractor regardless of the adopted gravity formulation.
ERIC Educational Resources Information Center
Bartik, Timothy J.; Gormley, William; Adelstein, Shirley
2011-01-01
This paper estimates future adult earnings effects associated with a universal pre-K program in Tulsa, Oklahoma. These informed projections help to compensate for the lack of long-term data on universal pre-K programs, while using metrics that relate test scores to valued social benefits. Combining test-score data from the fall of 2006 and recent…
ERIC Educational Resources Information Center
Grigson, Anna
2009-01-01
The University of Westminster began collecting and analyzing vendor usage reports since starting their e-book collections in 2004. They have used the results both to monitor the use of the collections and to calculate basic cost-per-use metrics that have informed decisions on whether to renew particular resources. In 2008, they sought to extend…
Cosmological models with homogeneous and isotropic spatial sections
NASA Astrophysics Data System (ADS)
Katanaev, M. O.
2017-05-01
The assumption that the universe is homogeneous and isotropic is the basis for the majority of modern cosmological models. We give an example of a metric all of whose spatial sections are spaces of constant curvature but for which the space-time is nevertheless not homogeneous and isotropic as a whole. We give an equivalent definition of a homogeneous and isotropic universe in terms of embedded manifolds.
Systems engineering technology for networks
NASA Technical Reports Server (NTRS)
1994-01-01
The report summarizes research pursued within the Systems Engineering Design Laboratory at Virginia Polytechnic Institute and State University between May 16, 1993 and January 31, 1994. The project was proposed in cooperation with the Computational Science and Engineering Research Center at Howard University. Its purpose was to investigate emerging systems engineering tools and their applicability in analyzing the NASA Network Control Center (NCC) on the basis of metrics and measures.
ERIC Educational Resources Information Center
Tran, Ngoc-Yen; Chan, Emily K.
2018-01-01
With limited campus resources for faculty scholarship, the College of Science (CoS) at San José State University (SJSU) developed scholarly output metrics as a way to add a quantitative component to the distribution of funds, to ensure objectivity, and to reward proven researchers. To support CoS's efforts to identify and quantify science faculty…
ERIC Educational Resources Information Center
Nur, Yusuf Ahmed; Grabner-Hagen, Melissa; Saam, Julie Reinhardt
2013-01-01
The metric for assessing the quality of a university within a traditional Western setting is well established. Evaluation of higher education institutions within developing countries, however, is not as clear-cut. In this paper the efficacy of quality assessment measures are examined through the case study of a university in the Somali Republic,…
Handbook of Forecasting Techniques. Part 2. Description of 31 Techniques
1977-08-01
…a discipline, or some other coherent group. Panels have often produced good results, but care must be taken to avoid bandwagon effects, blockage of… A "bandwagon" effect often occurs in panels, so that one person's viewpoint overwhelms the opinions of others and/or plausible alternatives never get proper… …as an ancient one, however. Since Newton, the western world has increasingly acknowledged the universality of cause-effect explanations, with cause…
The influence of MCAT and GPA preadmission academic metrics on interview scores.
Gay, Steven E; Santen, Sally A; Mangrulkar, Rajesh S; Sisson, Thomas H; Ross, Paula T; Zaidi, Nikki L Bibler
2018-03-01
Medical school admissions interviews are used to assess applicants' nonacademic characteristics as advocated by the Association of American Medical Colleges' Advancing Holistic Review Initiative. The objective of this study is to determine whether academic metrics continue to significantly influence interviewers' scores in holistic processes by blinding interviewers to applicants' undergraduate grade point averages (uGPA) and Medical College Admission Test (MCAT). This study examines academic and demographic predictors of interview scores for two applicant cohorts at the University of Michigan Medical School. In 2012, interviewers were provided applicants' uGPA and MCAT scores; in 2013, these academic metrics were withheld from interviewers' files. Hierarchical regression analysis was conducted to examine the influence of academic and demographic variables on overall cohort interview scores. When interviewers were provided uGPA and MCAT scores, academic metrics explained more variation in interview scores (7.9%) than when interviewers were blinded to these metrics (4.1%). Further analysis showed a statistically significant interaction between cohort and uGPA, indicating that the association between uGPA and interview scores was significantly stronger for the 2012 unblinded cohort compared to the 2013 blinded cohort (β = .573, P < .05). By contrast, MCAT scores had no interactive effects on interviewer scores. While MCAT scores accounted for some variation in interview scores for both cohorts, only access to uGPA significantly influenced interviewers' scores when looking at interaction effects. Withholding academic metrics from interviewers' files may promote assessment of nonacademic characteristics independently from academic metrics.
Global Surgery System Strengthening: It Is All About the Right Metrics.
Watters, David A; Guest, Glenn D; Tangi, Viliami; Shrime, Mark G; Meara, John G
2018-04-01
Progress in achieving "universal access to safe, affordable surgery, and anesthesia care when needed" is dependent on consensus not only about the key messages but also on what metrics should be used to set goals and measure progress. The Lancet Commission on Global Surgery not only achieved consensus on key messages but also recommended 6 key metrics to inform national surgical plans and monitor scale-up toward 2030. These metrics measure access to surgery, as well as its timeliness, safety, and affordability: (1) Two-hour access to the 3 Bellwether procedures (cesarean delivery, emergency laparotomy, and management of an open fracture); (2) Surgeon, Anesthetist, and Obstetrician workforce >20/100,000; (3) Surgical volume of 5000 procedures/100,000; (4) Reporting of perioperative mortality rate; and (5 and 6) Risk rates of catastrophic expenditure and impoverishment when requiring surgery. This article discusses the definition, validity, feasibility, relevance, and progress with each of these metrics. The authors share their experience of introducing the metrics in the Pacific and sub-Saharan Africa. We identify appropriate messages for each potential stakeholder-the patients, practitioners, providers (health services and hospitals), public (community), politicians, policymakers, and payers. We discuss progress toward the metrics being included in core indicator lists by the World Health Organization and the World Bank and how they have been, or may be, used to inform National Surgical Plans in low- and middle-income countries to scale-up the delivery of safe, affordable, and timely surgical and anesthesia care to all who need it.
GREENPLEX -- A SUSTAINABLE URBAN FORM FOR THE 21ST CENTURY
Outputs include images of architecture, space usage, social design, elevators, skybridges, ETFE envelope, structures, construction process, HVAC system, and water system. Outputs include performance metrics for the University Community Greenplex and traditional univer...
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
2000-10-02
Scientific progress reports submitted by university researchers conducting projects funded through CPBR and metrics reports submitted by industry sponsors that provided matching funds to the projects.
New geometric and field theoretic aspects of a radiation dominated universe
NASA Astrophysics Data System (ADS)
Modak, Sujoy K.
2018-05-01
The homogeneous and isotropic radiation dominated universe, following the inflationary stage, is expressed as a spherically symmetric and inhomogeneous spacetime upon a power-law-type conformal transformation of the null (cosmological) coordinates. This new spacetime metric has many interesting properties. While the static observers, at a fixed position in this new spacetime, do not see any horizon, some nonstatic observers encounter a horizon due to their motion which is analogous to the situation of Rindler observers in Minkowski spacetime. The symmetry of the new metric offers a unitarily inequivalent quantization of the massless scalar field and provides a new example of particle creation. We calculate the particle content of the cosmological vacuum state with respect to the static observer in this new spacetime who, with respect to cosmological time, is freely falling in the asymptotic past and future but accelerated in between.
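The two standard ingredients behind the "unitarily inequivalent quantization" and particle creation described above can be stated compactly. This is a generic sketch of quantum field theory in curved spacetime (conventions vary by author), not the paper's specific metric:

```latex
% Conformal relation between the two charts (generic form)
\tilde{g}_{\mu\nu} = \Omega^{2}(x)\, g_{\mu\nu}
% Two inequivalent mode expansions are related by a Bogoliubov transformation
\tilde{a}_{\omega} = \int d\omega' \left( \alpha^{*}_{\omega\omega'}\, a_{\omega'}
                     - \beta^{*}_{\omega\omega'}\, a^{\dagger}_{\omega'} \right)
% Particle content of the original vacuum as seen by the new observer
\langle 0 | \tilde{N}_{\omega} | 0 \rangle = \int d\omega' \, |\beta_{\omega\omega'}|^{2}
```

A nonzero &beta; coefficient is precisely what makes the two quantizations inequivalent and the new vacuum populated with particles.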
Constants of the motion, universal time and the Hamilton-Jacobi function in general relativity
NASA Astrophysics Data System (ADS)
O'Hara, Paul
2013-04-01
In most textbooks of mechanics, Newton's laws or Hamilton's equations of motion are first written down and then solved, given initial conditions, to determine the constants of the motion and describe the trajectories of the particles. In this essay, we take a different starting point. We begin with the metrics of general relativity and show how they can be used to construct, by inspection, constants of motion, which can then be used to write down the equations of the trajectories. This is achieved by deriving a Hamilton-Jacobi function from the metric and showing that its existence requires all of the above-mentioned properties. The article concludes by showing that a consistent theory of such functions also requires a universal measure of time, which can be identified with the "worldtime" parameter first introduced by Stueckelberg and later developed by Horwitz and Piron.
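The core observation can be sketched in standard notation (a generic illustration consistent with the abstract, not the essay's own derivation): a coordinate absent from the metric yields a conserved momentum by inspection, and such constants separate the Hamilton-Jacobi equation.

```latex
% If the metric is independent of x^k (a cyclic coordinate), then along a geodesic
\frac{\partial g_{\mu\nu}}{\partial x^{k}} = 0
\quad\Longrightarrow\quad
p_{k} = g_{k\nu}\,\frac{dx^{\nu}}{ds} = \text{const.}
% The relativistic Hamilton--Jacobi equation for a particle of mass m
g^{\mu\nu}\,\frac{\partial S}{\partial x^{\mu}}\,\frac{\partial S}{\partial x^{\nu}} = m^{2}c^{2}
% then admits a separable solution in which the constants enter linearly:
S = p_{k}\,x^{k} + S'(\text{remaining coordinates})
```

Sign conventions depend on the metric signature; the form above follows the common (+,-,-,-) choice.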
Bunting, Alexandra C; Alavifard, Sepand; Walker, Benjamin; Miller, Donald R; Ramsay, Tim; Boet, Sylvain
2018-03-05
To evaluate the relative research productivity and ranking of anesthesiology departments in Canada and the United States, using the Hirsch index (h-index) and 4 other previously validated metrics. We identified 150 anesthesiology departments in Canada and the United States with an accredited residency program. Publications for each of the 150 departments were identified using Thomson's Institute for Scientific Information Web of Science, and the citation report for each department was exported. The bibliometric data were used to calculate publication metrics for 3 time periods: cumulative (1945-2014), 10 years (2005-2014), and 5 years (2010-2014). The following group metrics were then used to determine the publication impact and relative ranking of all 150 departments: h-index, m-index, total number of publications, sum of citations, and average number of citations per article. Rankings for each metric were also stratified using a proxy for departmental size. The most common journals in which US and Canadian anesthesiology departments publish their work were identified. The majority (23 of the top 25) of top-ranked anesthesiology departments are in the United States, and 2 of the top 25 departments (University of Toronto; McGill University) are in Canada. There was a strong positive relationship between each of h-index, total number of publications, and the sum of citations (0.91-0.97; P < .0001). Departmental size correlates with increased academic productivity on most metrics. The most frequent journals in which US and Canadian anesthesiology departments publish are Anesthesiology, Anesthesia and Analgesia, and the Canadian Journal of Anesthesia. Our study ranked the Canadian and US anesthesiology departmental research productivity using the h-index applied to each department, total number of publications, total number of citations, and average number of citations.
The strong relationship between the h-index and both the number of publications and number of citations of anesthesiology departments shows that the departments with the highest number of publications are also producing research with the most highly cited articles (ie, most impact), as demonstrated by the h-index.
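For reference, the departmental h-index used here is computed exactly as the individual one: a group has index h if h of its publications have at least h citations each. A minimal sketch:

```python
def h_index(citations):
    """h-index: the largest h such that h papers have >= h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank  # the rank-th paper still has at least `rank` citations
        else:
            break
    return h
```

The m-index reported alongside it is commonly defined as h divided by the number of years since the group's first publication, which is why the study recomputes metrics over the cumulative, 10-year, and 5-year windows.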
Channel change and bed-material transport in the Umpqua River basin, Oregon
Wallick, J. Rose; O'Connor, Jim E.; Anderson, Scott; Keith, Mackenzie K.; Cannon, Charles; Risley, John C.
2011-01-01
The Umpqua River drains 12,103 square kilometers of western Oregon; with headwaters in the Cascade Range, the river flows through portions of the Klamath Mountains and Oregon Coast Range before entering the Pacific Ocean. Above the head of tide, the Umpqua River, along with its major tributaries, the North and South Umpqua Rivers, flows on a mixed bedrock and alluvium bed, alternating between bedrock rapids and intermittent, shallow gravel bars composed of gravel to cobble-sized clasts. These bars have been a source of commercial aggregate since the mid-twentieth century. Below the head of tide, the Umpqua River contains large bars composed of mud and sand. Motivated by ongoing permitting and aquatic habitat concerns related to in-stream gravel mining on the fluvial reaches, this study evaluated spatial and temporal trends in channel change and bed-material transport for 350 kilometers of river channel along the Umpqua, North Umpqua, and South Umpqua Rivers. The assessment produced (1) detailed mapping of the active channel, using aerial photographs and repeat surveys, and (2) a quantitative estimation of bed-material flux that drew upon detailed measurements of particle size and lithology, equations of transport capacity, and a sediment yield analysis. Bed-material transport capacity estimates at 45 sites throughout the South Umpqua and main stem Umpqua Rivers for the period 1951-2008 result in wide-ranging transport capacity estimates, reflecting the difficulty of applying equations of bed-material transport to a supply-limited river. Median transport capacity values calculated from surface-based equations of bedload transport for each of the study reaches provide indications of maximum possible transport rates and range from 8,000 to 27,000 metric tons per year (tons/yr) for the South Umpqua River and 20,000 to 82,000 metric tons/yr for the main stem Umpqua River upstream of the head of tide; the North Umpqua River probably contributes little bed material. 
A plausible range of average annual transport rates for the South and main stem Umpqua Rivers, based on bedload transport capacity estimates for bars with reasonable values for reference shear stress, is between 500 and 20,000 metric tons/yr. An empirical bed-material yield analysis predicts 20,000-50,000 metric tons/yr on the South Umpqua River and main stem Umpqua River through the Oregon Coast Range, decreasing to approximately 30,000 metric tons/yr at the head of tide. Surveys of individual mining sites in the South Umpqua River indicate minimum local bed-material flux rates that are typically less than 10,000 metric tons/yr but range up to 30,600 metric tons/yr in high-flow years. On the basis of all of these analyses, actual bedload flux in most years is probably less than 25,000 metric tons/yr in the South Umpqua and main stem Umpqua Rivers, with the North Umpqua River probably contributing negligible amounts. For comparison, the estimated annual volume of commercial gravel extraction from the South Umpqua River between 2001 and 2004 ranged from 610 to 36,570 metric tons, indicating that historical in-stream gravel extraction may have been a substantial fraction of the overall bedload flux.
Universal health coverage in Rwanda: dream or reality
Nyandekwe, Médard; Nzayirambaho, Manassé; Baptiste Kakoma, Jean
2014-01-01
Introduction Universal Health Coverage (UHC) has been a global concern for a long time, and even more so nowadays. While a number of publications are almost unanimous that Rwanda is not far from UHC, very few have focused on its financial sustainability and on its extreme external financial dependency. The objectives of this study are: (i) to assess Rwanda's UHC, based mainly on Community-Based Health Insurance (CBHI), from 2000 to 2012; (ii) to inform policy makers about observed gaps for a better way forward. Methods A retrospective (2000-2012) SWOT analysis was applied to six metrics as key indicators of UHC achievement related to the WHO definition, i.e. (i) health insurance and access to care, (ii) equity, (iii) package of services, (iv) rights-based approach, (v) quality of health care, and (vi) financial-risk protection; (vii) CBHI self-financing capacity (SFC) was added by the authors. Results With the first metric at 96.15% overall health insurance coverage and 1.07 visits per capita per year versus the 1 visit recommended by WHO, the second at 24.8% of indigent people subsidized versus 24.1% living in extreme poverty, the third, fourth, and fifth metrics performing excellently, the sixth at 10.80% versus the ≤40% acceptable limit of catastrophic health spending, and lastly the CBHI SFC, i.e. proper cost recovery, estimated at 82.55% in 2011/2012, Rwanda's UHC achievements are objectively convincing. Conclusion Rwanda's UHC is not a dream but a reality, considering the convincing results of all seven metrics. PMID:25170376
Making Strategic Sense of Cyber Power: Why the Sky is Not Falling
2013-04-01
identified very plausibly in the early 19th century. In that regard, it is probably no exaggeration to argue that the electric telegraph in the 1840s...including: The Sheriff: America's Defense of the New World Order (University Press of Kentucky, 2004); Another Bloody Century: Future Warfare (Weidenfeld...provenance of the better part of a century prior to 1945, that of our contemporary IT revolution centered around the computer and its exploitation
Holographic Dark Energy Density
NASA Astrophysics Data System (ADS)
Saadat, Hassan
2011-06-01
In this article we consider a cosmological model based on holographic dark energy. We study the dark energy density in a Universe with arbitrary spatial curvature described by the Friedmann-Robertson-Walker metric. We use the Chevallier-Polarski-Linder parametrization to specify the dark energy density.
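The two ingredients named in the abstract have standard textbook forms (notation varies between papers; c here is the dimensionless holographic parameter, not the speed of light):

```latex
% Holographic dark energy density with infrared cutoff L (M_p = reduced Planck mass)
\rho_{\Lambda} = 3\,c^{2} M_{p}^{2}\, L^{-2}
% Chevallier--Polarski--Linder parametrization of the dark-energy equation of state
w(a) = w_{0} + w_{a}\,(1 - a), \qquad a = \frac{1}{1+z}
```

The CPL form reduces to a cosmological constant for w<sub>0</sub> = -1, w<sub>a</sub> = 0, and its single extra parameter captures the leading redshift evolution of w.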
Automatic intersection map generation task 10 report.
DOT National Transportation Integrated Search
2016-02-29
This report describes the work conducted in Task 10 of the V2I Safety Applications Development Project. The work was performed by the University of Michigan Transportation Research Institute (UMTRI) under contract to the Crash Avoidance Metrics Partn...
Beyond the Turing Test: Performance Metrics for Evaluating a Computer Simulation of the Human Mind
2002-08-01
Tomasello, 2001. Perceiving intentions and learning words in the second year of life. In M. Bowerman and S. Levinson (Eds.), Language Acquisition and Conceptual Development. Cambridge University Press, New York, NY.
The Optimizer Topology Characteristics in Seismic Hazards
NASA Astrophysics Data System (ADS)
Sengor, T.
2015-12-01
The characteristic data of natural phenomena are examined in a topological space approach, to illuminate whether there is an algorithm behind them that brings the physics of the phenomena to optimized states even if they are hazards. The optimized code designing the hazard on a topological structure meshes with the metric of the phenomena. Deviations in the metric of different phenomena push and/or pull the folds of other suitable phenomena; for example, the metric of a specific phenomenon A may come to fit the metric of another phenomenon B after variation processes generated by the deviation of the metric of phenomenon A. Defining manifold processes covering the metric characteristics of each and every phenomenon is possible for all physical events, i.e., natural hazards. There are suitable folds in those manifold groups such that each subfold fits the metric characteristics of at least one natural hazard category. Variation algorithms on those metric structures prepare a gauge effect underlying the long-term stability of the Earth over large time scales. The realization of that stability depends on specific conditions, which are called optimized codes. The analytical basics of processes in topological structures are developed in [1]. The codes are generated according to the structures in [2]. Optimized codes are derived for the seismicity of the NAF (North Anatolian Fault), beginning from the quakes of the year 1999. References 1. Taner SENGOR, "Topological theory and analytical configuration for a universal community model," Procedia - Social and Behavioral Sciences, Vol. 81, pp. 188-194, 28 June 2013. 2. Taner SENGOR, "Seismic-Climatic-Hazardous Events Estimation Processes via the Coupling Structures in Conserving Energy Topologies of the Earth," The 2014 AGU Fall Meeting, Abstract no. 31374, USA.
NASA Astrophysics Data System (ADS)
Monjo, R.
2017-11-01
Most current cosmological theories are built by combining an isotropic and homogeneous manifold with a scale factor that depends on time. If one supposes a hyperconical universe with linear expansion, an inhomogeneous metric can be obtained by an appropriate transformation that preserves the proper time. This model locally tends to a flat Friedmann-Robertson-Walker metric with linear expansion. The objective of this work is to analyze the observational compatibility of the inhomogeneous metric considered. For this purpose, the corresponding luminosity distance was obtained and compared with the observations of 580 SNe Ia taken from the Supernova Cosmology Project. The best fit of the hyperconical model obtains χ₀² = 562, the same value as the standard ΛCDM model. Finally, a possible relationship is found between both theories.
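A comparison of this kind reduces to evaluating a χ² sum of observed distance moduli against a model curve. The sketch below uses a toy linear-expansion distance law purely for illustration; the functional form and the parameter values are assumptions, not the paper's model:

```python
import math

C_KM_S = 299792.458  # speed of light, km/s

def mu_model(z, h0=70.0):
    """Toy distance modulus for a linearly expanding model:
    d_L = (c/H0) * (1+z) * ln(1+z)  [Mpc] -- illustrative form only."""
    d_l = (C_KM_S / h0) * (1.0 + z) * math.log(1.0 + z)
    return 5.0 * math.log10(d_l) + 25.0

def chi_squared(zs, mus, sigmas, model=mu_model):
    """Chi-square of observed distance moduli mus (with errors sigmas)
    against the model curve evaluated at redshifts zs."""
    return sum(((mu - model(z)) / s) ** 2 for z, mu, s in zip(zs, mus, sigmas))
```

With 580 supernovae and a handful of parameters, comparable χ² values (562 for both models here) mean the data alone cannot distinguish the two fits.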
Artificial General Intelligence: Concept, State of the Art, and Future Prospects
NASA Astrophysics Data System (ADS)
Goertzel, Ben
2014-12-01
In recent years a broad community of researchers has emerged, focusing on the original ambitious goals of the AI field - the creation and study of software or hardware systems with general intelligence comparable to, and ultimately perhaps greater than, that of human beings. This paper surveys this diverse community and its progress. Approaches to defining the concept of Artificial General Intelligence (AGI) are reviewed, including mathematical formalisms, engineering, and biology inspired perspectives. The spectrum of designs for AGI systems includes systems with symbolic, emergentist, hybrid and universalist characteristics. Metrics for general intelligence are evaluated, with a conclusion that, although metrics for assessing the achievement of human-level AGI may be relatively straightforward (e.g. the Turing Test, or a robot that can graduate from elementary school or university), metrics for assessing partial progress remain more controversial and problematic.
Metrics for quantifying antimicrobial use in beef feedlots
Benedict, Katharine M.; Gow, Sheryl P.; Reid-Smith, Richard J.; Booker, Calvin W.; Morley, Paul S.
2012-01-01
Accurate antimicrobial drug use data are needed to enlighten discussions regarding the impact of antimicrobial drug use in agriculture. The primary objective of this study was to investigate the perceived accuracy and clarity of different methods for reporting antimicrobial drug use information collected regarding beef feedlots. Producers, veterinarians, industry representatives, public health officials, and other knowledgeable beef industry leaders were invited to complete a web-based survey. A total of 156 participants in 33 US states, 4 Canadian provinces, and 8 other countries completed the survey. No single metric was considered universally optimal for all use circumstances or for all audiences. To effectively communicate antimicrobial drug use data, evaluation of the target audience is critical to presenting the information. Metrics that are most accurate need to be carefully and repeatedly explained to the audience. PMID:23372190
Beyeler, Michael; Dutt, Nikil D; Krichmar, Jeffrey L
2013-12-01
Understanding how the human brain is able to efficiently perceive and understand a visual scene is still a field of ongoing research. Although many studies have focused on the design and optimization of neural networks to solve visual recognition tasks, most of them either lack neurobiologically plausible learning rules or decision-making processes. Here we present a large-scale model of a hierarchical spiking neural network (SNN) that integrates a low-level memory encoding mechanism with a higher-level decision process to perform a visual classification task in real-time. The model consists of Izhikevich neurons and conductance-based synapses for realistic approximation of neuronal dynamics, a spike-timing-dependent plasticity (STDP) synaptic learning rule with additional synaptic dynamics for memory encoding, and an accumulator model for memory retrieval and categorization. The full network, which comprised 71,026 neurons and approximately 133 million synapses, ran in real-time on a single off-the-shelf graphics processing unit (GPU). The network was constructed on a publicly available SNN simulator that supports general-purpose neuromorphic computer chips. The network achieved 92% correct classifications on MNIST in 100 rounds of random sub-sampling, which is comparable to other SNN approaches and provides a conservative and reliable performance metric. Additionally, the model correctly predicted reaction times from psychophysical experiments. Because of the scalability of the approach and its neurobiological fidelity, the current model can be extended to an efficient neuromorphic implementation that supports more generalized object recognition and decision-making architectures found in the brain. Copyright © 2013 Elsevier Ltd. All rights reserved.
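The final categorization stage described above is an accumulator (race) model. A minimal sketch of that idea, stripped of the spiking front end; the rates, threshold, and time step here are illustrative, not the paper's parameters:

```python
import random

def accumulate(rates, threshold=20, dt=1.0, seed=0):
    """Race of evidence accumulators: class i gains one count with
    probability rates[i] per time step; the first accumulator to reach
    `threshold` wins. Returns (winning class index, decision time)."""
    rng = random.Random(seed)  # seeded for reproducibility
    counts = [0] * len(rates)
    t = 0.0
    while True:
        t += dt
        for i, r in enumerate(rates):
            if rng.random() < r:
                counts[i] += 1
        for i, c in enumerate(counts):
            if c >= threshold:
                return i, t
```

Because the threshold must be reached by integrating noisy evidence, the decision time varies with evidence strength, which is how such models predict reaction-time distributions like those matched in the study.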
NASA Astrophysics Data System (ADS)
Graham, Peter W.; Kaplan, David E.; Rajendran, Surjeet
2018-02-01
We present a class of nonsingular, bouncing cosmologies that evade singularity theorems through the use of vorticity in compact extra dimensions. The vorticity combats the focusing of geodesics during the contracting phase. The construction requires fluids that violate the null energy condition (NEC) in the compact dimensions, where they can be provided by known stable NEC violating sources such as Casimir energy. The four dimensional effective theory contains an NEC violating fluid of Kaluza-Klein excitations of the higher dimensional metric. These spacetime metrics could potentially allow dynamical relaxation to solve the cosmological constant problem. These ideas can also be used to support traversable Lorentzian wormholes.
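The null energy condition referred to here, and the reason its violation is needed for a bounce, can be stated compactly (a standard summary, not the paper's specific construction):

```latex
% Null energy condition: for every null vector k^mu
T_{\mu\nu}\, k^{\mu} k^{\nu} \ge 0
% Via Einstein's equations this implies R_{\mu\nu} k^{\mu} k^{\nu} \ge 0, so the
% Raychaudhuri equation for a null congruence forces geodesics to focus:
\frac{d\theta}{d\lambda} = -\tfrac{1}{2}\theta^{2}
  - \sigma_{\mu\nu}\sigma^{\mu\nu}
  + \omega_{\mu\nu}\omega^{\mu\nu}
  - R_{\mu\nu} k^{\mu} k^{\nu}
```

The twist term ω<sub>μν</sub>ω<sup>μν</sup> enters with a positive sign, which is the sense in which vorticity can combat the focusing of geodesics during the contracting phase.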
The story of fake impact factor companies and how we detected them.
Jalalian, Mehrdad
2015-01-01
Beginning about three years ago, the world of academic publishing became infected by fake impact factors and misleading metrics launched by bogus companies. These misleading metrics and fake impact factors have damaged the prestige and reliability of scientific research and scholarly journals. This article presents the in-depth story of some of the main bogus impact factors, how they approached the academic world, and how the author identified them. Some of the names they use are Universal Impact Factor (UIF), Global Impact Factor (GIF), and Citefactor, and there is even a fake Thomson Reuters company.
Effective monitoring of agriculture: a response.
Sachs, Jeffrey D; Remans, Roseline; Smukler, Sean M; Winowiecki, Leigh; Andelman, Sandy J; Cassman, Kenneth G; Castle, David; DeFries, Ruth; Denning, Glenn; Fanzo, Jessica; Jackson, Louise E; Leemans, Rik; Lehmann, Johannes; Milder, Jeffrey C; Naeem, Shahid; Nziguheba, Generose; Palm, Cheryl A; Pingali, Prabhu L; Reganold, John P; Richter, Daniel D; Scherr, Sara J; Sircely, Jason; Sullivan, Clare; Tomich, Thomas P; Sanchez, Pedro A
2012-03-01
The development of effective agricultural monitoring networks is essential to track, anticipate and manage changes in the social, economic and environmental aspects of agriculture. We welcome the perspective of Lindenmayer and Likens (J. Environ. Monit., 2011, 13, 1559) as published in the Journal of Environmental Monitoring on our earlier paper, "Monitoring the World's Agriculture" (Sachs et al., Nature, 2010, 466, 558-560). In this response, we address their three main critiques labeled as 'the passive approach', 'the problem with uniform metrics' and 'the problem with composite metrics'. We expand on specific research questions at the core of the network design, on the distinction between key universal and site-specific metrics to detect change over time and across scales, and on the need for composite metrics in decision-making. We believe that simultaneously measuring indicators of the three pillars of sustainability (environmentally sound, social responsible and economically viable) in an effectively integrated monitoring system will ultimately allow scientists and land managers alike to find solutions to the most pressing problems facing global food security. This journal is © The Royal Society of Chemistry 2012
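Mechanically, the distinction between uniform indicator metrics and a composite metric is normalization followed by weighted aggregation. A minimal sketch; the indicator names, ranges, and equal weights are invented for illustration:

```python
def min_max_normalize(value, lo, hi):
    """Rescale a raw indicator onto [0, 1] so different units are comparable."""
    return (value - lo) / (hi - lo)

def composite_index(scores, weights):
    """Weighted mean of normalized indicator scores; weights must sum to 1."""
    assert abs(sum(weights) - 1.0) < 1e-9
    return sum(s * w for s, w in zip(scores, weights))

# Hypothetical site: one indicator per sustainability pillar
pillars = [min_max_normalize(42.0, 0.0, 100.0),     # environmental: soil-carbon percentile
           min_max_normalize(0.8, 0.0, 1.0),        # social: school attendance rate
           min_max_normalize(1500.0, 0.0, 3000.0)]  # economic: net income, USD/ha
index = composite_index(pillars, [1/3, 1/3, 1/3])
```

The choice of weights is exactly where the "problem with composite metrics" critique bites: the single number is only as defensible as the weighting scheme, which is why the response argues for reporting the underlying indicators alongside any composite.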
DOE Office of Scientific and Technical Information (OSTI.GOV)
Desai, V; Labby, Z; Culberson, W
Purpose: To determine whether body site-specific treatment plans form unique “plan class” clusters in a multi-dimensional analysis of plan complexity metrics such that a single beam quality correction determined for a representative plan could be universally applied within the “plan class”, thereby increasing the dosimetric accuracy of a detector’s response within a subset of similarly modulated nonstandard deliveries. Methods: We collected 95 clinical volumetric modulated arc therapy (VMAT) plans from four body sites (brain, lung, prostate, and spine). The lung data was further subdivided into SBRT and non-SBRT data for a total of five plan classes. For each control point in each plan, a variety of aperture-based complexity metrics were calculated and stored as unique characteristics of each patient plan. A multiple comparison of means analysis was performed such that every plan class was compared to every other plan class for every complexity metric in order to determine which groups could be considered different from one another. Statistical significance was assessed after correcting for multiple hypothesis testing. Results: Six out of a possible 10 pairwise plan class comparisons were uniquely distinguished based on at least nine out of 14 of the proposed metrics (Brain/Lung, Brain/SBRT lung, Lung/Prostate, Lung/SBRT Lung, Lung/Spine, Prostate/SBRT Lung). Eight out of 14 of the complexity metrics could distinguish at least six out of the possible 10 pairwise plan class comparisons. Conclusion: Aperture-based complexity metrics could prove to be useful tools to quantitatively describe a distinct class of treatment plans. Certain plan-averaged complexity metrics could be considered unique characteristics of a particular plan.
A new approach to generating plan-class specific reference (PCSR) fields could be established through a targeted preservation of select complexity metrics or a clustering algorithm that identifies plans exhibiting similar modulation characteristics. Measurements and simulations will better elucidate potential plan-class specific dosimetry correction factors.
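The "multiple comparison of means ... corrected for multiple hypothesis testing" step can be illustrated with a standard step-down correction. Holm's procedure is shown here as one common choice; the abstract does not state which correction the authors actually used:

```python
from itertools import combinations

PLAN_CLASSES = ["brain", "lung", "SBRT lung", "prostate", "spine"]
PAIRS = list(combinations(PLAN_CLASSES, 2))  # 5 classes -> 10 pairwise comparisons

def holm_bonferroni(pvals, alpha=0.05):
    """Holm step-down correction: test p-values in ascending order against
    alpha/(m - rank); stop at the first failure. Returns one flag per test."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    significant = [False] * m
    for rank, i in enumerate(order):
        if pvals[i] <= alpha / (m - rank):
            significant[i] = True
        else:
            break  # all larger p-values also fail
    return significant
```

Applied per complexity metric across the 10 class pairs, this controls the family-wise error rate while remaining uniformly more powerful than a plain Bonferroni correction.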
Banks, Peter; Brown, Richard; Laslowski, Alex; Daniels, Yvonne; Branton, Phil; Carpenter, John; Zarbo, Richard; Forsyth, Ramses; Liu, Yan-hui; Kohl, Shane; Diebold, Joachim; Masuda, Shinobu; Plummer, Tim
2017-01-01
Background: Anatomic pathology laboratory workflow consists of 3 major specimen handling processes. Among the workflow are preanalytic, analytic, and postanalytic phases that contain multistep subprocesses with great impact on patient care. A worldwide representation of experts came together to create a system of metrics, as a basis for laboratories worldwide, to help them evaluate and improve specimen handling to reduce patient safety risk. Method: Members of the Initiative for Anatomic Pathology Laboratory Patient Safety (IAPLPS) pooled their extensive expertise to generate a list of metrics highlighting processes with high and low risk for adverse patient outcomes. Results: Our group developed a universal, comprehensive list of 47 metrics for patient specimen handling in the anatomic pathology laboratory. Steps within the specimen workflow sequence are categorized as high or low risk. In general, steps associated with the potential for specimen misidentification correspond to the high-risk grouping and merit greater focus within quality management systems. Primarily workflow measures related to operational efficiency can be considered low risk. Conclusion: Our group intends to advance the widespread use of these metrics in anatomic pathology laboratories to reduce patient safety risk and improve patient care with development of best practices and interlaboratory error reporting programs. PMID:28340232
Banks, Peter; Brown, Richard; Laslowski, Alex; Daniels, Yvonne; Branton, Phil; Carpenter, John; Zarbo, Richard; Forsyth, Ramses; Liu, Yan-Hui; Kohl, Shane; Diebold, Joachim; Masuda, Shinobu; Plummer, Tim; Dennis, Eslie
2017-05-01
Anatomic pathology laboratory workflow consists of 3 major specimen handling processes. Among the workflow are preanalytic, analytic, and postanalytic phases that contain multistep subprocesses with great impact on patient care. A worldwide representation of experts came together to create a system of metrics, as a basis for laboratories worldwide, to help them evaluate and improve specimen handling to reduce patient safety risk. Members of the Initiative for Anatomic Pathology Laboratory Patient Safety (IAPLPS) pooled their extensive expertise to generate a list of metrics highlighting processes with high and low risk for adverse patient outcomes. Our group developed a universal, comprehensive list of 47 metrics for patient specimen handling in the anatomic pathology laboratory. Steps within the specimen workflow sequence are categorized as high or low risk. In general, steps associated with the potential for specimen misidentification correspond to the high-risk grouping and merit greater focus within quality management systems. Primarily workflow measures related to operational efficiency can be considered low risk. Our group intends to advance the widespread use of these metrics in anatomic pathology laboratories to reduce patient safety risk and improve patient care with development of best practices and interlaboratory error reporting programs. © American Society for Clinical Pathology 2017.
Spherically symmetric conformal gravity and ''gravitational bubbles''
DOE Office of Scientific and Technical Information (OSTI.GOV)
Berezin, V.A.; Dokuchaev, V.I.; Eroshenko, Yu.N., E-mail: berezin@inr.ac.ru, E-mail: dokuchaev@inr.ac.ru, E-mail: eroshenko@inr.ac.ru
2016-01-01
The general structure of the spherically symmetric solutions in the Weyl conformal gravity is described. The corresponding Bach equations are derived for a special type of metric, which can be considered as representative of the general class. The complete set of pure vacuum solutions is found. It consists of two classes. The first one contains the solutions with constant two-dimensional curvature scalar of our specific metrics, and its representatives are the famous Robertson-Walker metrics. One of them we called the ''gravitational bubble'', which is compact and with zero Weyl tensor. Thus, we obtained pure vacuum curved space-times (without any material sources, including the cosmological constant), which is absolutely impossible in General Relativity. Such a phenomenon makes it easier to create the universe from ''nothing''. The second class consists of the solutions with varying curvature scalar. We found its representative as a one-parameter family. It appears that it can be conformally covered by the three-parameter Mannheim-Kazanas solution. We also investigated the general structure of the energy-momentum tensor in spherical conformal gravity and constructed the vectorial equation that reveals clearly some features of the non-vacuum solutions. Two of them are explicitly written: namely, the metrics à la Vaidya, and the electrovacuum space-time metrics.
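For orientation, the field equations of Weyl conformal gravity follow from the conformally invariant action built from the Weyl tensor (signs and the coupling α<sub>g</sub> vary between papers; this is the commonly quoted form, not the paper's own derivation):

```latex
% Conformally invariant Weyl action (C = Weyl conformal tensor)
I_{W} = -\alpha_{g} \int d^{4}x \, \sqrt{-g}\;
        C_{\lambda\mu\nu\kappa} C^{\lambda\mu\nu\kappa}
% Variation yields the Bach equations; in pure vacuum
B_{\mu\nu} = 0
% Conformal invariance means any rescaled metric g -> Omega^2(x) g
% of a solution is again a solution.
```

The last property is why a one-parameter family of solutions can be "conformally covered" by the three-parameter Mannheim-Kazanas solution.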
Manganese nodule resources in the northeastern equatorial Pacific
McKelvey, V.E.; Wright, Nancy A.; Rowland, Robert W.
1979-01-01
Recent publication of maps at scale 1:1,000,000 of the northeastern equatorial Pacific region showing publicly available information on the nickel plus copper content of manganese nodules has made it possible to outline the prime area between the Clarion and Clipperton fracture zones which has been the focus of several recent scientific and commercial studies. The area, defined as that in which the nodules contain more than 1.8 percent nickel plus copper, is about 2.5 million km2. The available evidence suggests that about half of it contains nodules in concentration (reported in wet weight units) greater than 5 kg/m2 and averaging 11.9 kg/m2. If we assume that 20 percent of the nodules in this area of 1.25 million km2 are recoverable, its potential recoverable resources are about 2.1 billion dry metric tons of nodules averaging about 25 percent Mn, 1.3 percent Ni, 1.0 percent Cu, 0.22 percent Co, and 0.05 percent Mo—enough to support about 27 mining operations each producing an average of 75 million metric tons of nodules over their lifetimes. Estimates based on other plausible assumptions would be higher or lower, but of the same order of magnitude. Thus it seems probable that the magnitude of the potentially recoverable nodule resources of the Clarion-Clipperton prime area—the most promising now known—is at most in the range of several tens of the average-size operations postulated.
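The headline figure can be reproduced with back-of-the-envelope arithmetic. The wet-to-dry conversion factor below is an assumption (the abstract reports only the dry-ton result and the wet-weight abundance), chosen to match the reported ~2.1 billion dry metric tons:

```python
area_m2 = 1.25e6 * 1e6            # 1.25 million km^2 expressed in m^2
mean_abundance = 11.9             # kg/m^2, wet weight
recoverable_fraction = 0.20       # 20 percent of nodules assumed recoverable

wet_tons = area_m2 * mean_abundance * recoverable_fraction / 1000.0  # ~2.98e9 wet t
dry_tons = wet_tons * 0.7         # ASSUMED ~30 percent moisture content
operations = dry_tons / 75e6      # at 75 million tons per operation lifetime
```

This recovers both quoted numbers: about 2.1 billion dry metric tons and roughly 27-28 average-size mining operations.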
NASA Technical Reports Server (NTRS)
Schaefer, C.; Young, M.; Mason, S.; Coble, C.; Wear, M. L.; Sargsyan, A.; Garcia, K.; Patel, N.; Gibson, C.; Alexander, D.;
2017-01-01
Enhanced screening for the Visual Impairment/Intracranial Pressure (VIIP) syndrome has been implemented to better characterize the ocular and vision changes observed in some long-duration crewmembers. This includes implementation of in-flight ultrasound in 2010 and optical coherence tomography (OCT) in 2013. Potential risk factors for VIIP include cardiovascular health, diet, anatomical and genetic factors, and environmental conditions. Carbon dioxide (CO2), a potent vasodilator, is chronically elevated on the International Space Station (ISS) relative to ambient levels on Earth, and is a plausible risk factor for VIIP. In an effort to understand the possible associations between CO2 and VIIP, this study explores the relationship of ambient CO2 levels on ISS compared to in-flight ultrasound and OCT measures of the eye obtained from ISS crewmembers. CO2 measurements were aggregated from Operational Data Reduction Complex and Node 3 major constituent analyzers (MCAs) on ISS or from sensors located in the European Columbus module, as available. CO2 levels in the periods between each ultrasound and OCT session are summarized using time-series metrics, including time-weighted means and variances. Partial least squares regression analyses are used to quantify the complex relationship between specific ultrasound and OCT measures and the CO2 metrics simultaneously. These analyses will enhance our understanding of the possible associations between CO2 levels and structural changes to the eye, which will in turn inform future analysis of in-flight VIIP data.
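The time-weighted mean used to summarize irregularly sampled CO2 readings can be sketched simply. This assumes each reading holds until the next sample (a zero-order hold, one of several defensible conventions; the study's exact interpolation is not specified here):

```python
def time_weighted_mean(times, values):
    """Mean of a piecewise-constant series: each value holds until the next
    sample time, so long-lived readings weigh more than transient ones."""
    if len(times) < 2:
        return values[0]
    total = 0.0
    for i in range(len(times) - 1):
        total += values[i] * (times[i + 1] - times[i])
    return total / (times[-1] - times[0])
```

For readings of 2000 and 4000 ppm held for 1 and 2 hours respectively, this gives about 3333 ppm rather than the unweighted mean of the samples, which is the point of time-weighting sparse sensor data.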
Leos-Urbel, Jacob; Schwartz, Amy Ellen; Weinstein, Meryle; Corcoran, Sean
2013-01-01
This paper examines the impact of the implementation of a universal free school breakfast policy on meals program participation, attendance, and academic achievement. In 2003, New York City made school breakfast free for all students regardless of income, while increasing the price of lunch for those ineligible for meal subsidies. Using a difference-in-differences estimation strategy, we derive plausibly causal estimates of the policy’s impact by exploiting within and between group variation in school meal pricing before and after the policy change. Our estimates suggest that the policy resulted in small increases in breakfast participation both for students who experienced a decrease in the price of breakfast and for free-lunch eligible students who experienced no price change. The latter suggests that universal provision may alter behavior through mechanisms other than price, highlighting the potential merits of universal provision over targeted services. We find limited evidence of policy impacts on academic outcomes. PMID:24465073
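The estimation strategy named above reduces, in its simplest form, to comparing pre/post changes across a treated and a control group. A toy sketch (the group means are hypothetical, not the paper's NYC data):

```python
# Minimal difference-in-differences sketch on invented data.
# Outcome: breakfast participation rate; "treated" stands for students
# whose breakfast price fell in 2003, "control" for those unaffected.

means = {
    ("treated", "pre"): 0.20, ("treated", "post"): 0.27,
    ("control", "pre"): 0.22, ("control", "post"): 0.24,
}

def did(means):
    """DiD = (treated post - treated pre) - (control post - control pre)."""
    return ((means[("treated", "post")] - means[("treated", "pre")])
            - (means[("control", "post")] - means[("control", "pre")]))

effect = did(means)
print(f"estimated policy effect: {effect:+.2f}")
```

Subtracting the control group's change nets out the common time trend, which is what lets the within/between variation deliver a plausibly causal estimate.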
Employing heat maps to mine associations in structured routine care data.
Toddenroth, Dennis; Ganslandt, Thomas; Castellanos, Ixchel; Prokosch, Hans-Ulrich; Bürkle, Thomas
2014-02-01
Mining the electronic medical record (EMR) has the potential to deliver new medical knowledge about causal effects, which are hidden in statistical associations between different patient attributes. It is our goal to detect such causal mechanisms within current research projects which include e.g. the detection of determinants of imminent ICU readmission. An iterative statistical approach to examine each set of considered attribute pairs delivers potential answers but is difficult to interpret. Therefore, we aimed to improve the interpretation of the resulting matrices by the use of heat maps. We propose strategies to adapt heat maps for the search for associations and causal effects within routine EMR data. Heat maps visualize tabulated metric datasets as grid-like choropleth maps, and thus present measures of association between numerous attribute pairs clearly arranged. Basic assumptions about plausible exposures and outcomes are used to allocate distinct attribute sets to both matrix dimensions. The image then avoids certain redundant graphical elements and provides a clearer picture of the supposed associations. Specific color schemes have been chosen to incorporate preexisting information about similarities between attributes. The use of measures of association as a clustering input has been taken as a trigger to apply transformations which ensure that distance metrics always assume finite values and treat positive and negative associations in the same way. To evaluate the general capability of the approach, we conducted analyses of simulated datasets and assessed diagnostic and procedural codes in a large routine care dataset. Simulation results demonstrate that the proposed clustering procedure rearranges attributes similar to simulated statistical associations. 
Thus, heat maps are an excellent tool to indicate whether associations concern the same attributes or different ones, and whether affected attribute sets conform to any preexisting relationship between attributes. The dendrograms help in deciding if contiguous sequences of attributes effectively correspond to homogeneous attribute associations. The exemplary analysis of a routine care dataset revealed patterns of associations that follow plausible medical constellations for several diseases and the associated medical procedures and activities. Cases with breast cancer (ICD C50), for example, appeared to be associated with radiation therapy (8-52). In cross check, approximately 60 percent of the attribute pairs in this dataset showed a strong negative association, which can be explained by diseases treated in a medical specialty which routinely does not perform the respective procedures in these cases. The corresponding diagram clearly reflects these relationships in the shape of coherent subareas. We could demonstrate that heat maps of measures of association are effective for the visualization of patterns in routine care EMRs. The adjustable method for the assignment of attributes to image dimensions permits a balance between the display of ample information and a favorable level of graphical complexity. The scope of the search can be adapted by the use of pre-existing assumptions about plausible effects to select exposure and outcome attributes. Thus, the proposed method promises to simplify the detection of undiscovered causal effects within routine EMR data. Copyright © 2013 Elsevier B.V. All rights reserved.
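The association-to-distance transformation described above can be sketched as follows; plain Pearson correlation stands in for the paper's measures of association, and the data are synthetic:

```python
import numpy as np

# Sketch: map pairwise associations to finite distances that treat
# positive and negative associations alike, so they can feed a
# hierarchical clustering and be rendered as a heat map.

rng = np.random.default_rng(0)
x = rng.normal(size=200)
data = np.column_stack([
    x,
    2 * x + rng.normal(size=200),   # positively associated with x
    -x + rng.normal(size=200),      # negatively associated with x
    rng.normal(size=200),           # independent of x
])

r = np.corrcoef(data, rowvar=False)   # association matrix in [-1, 1]
dist = 1.0 - np.abs(r)                # finite, sign-symmetric distance
```

Strongly associated attribute pairs end up close regardless of the sign of the association, while unrelated pairs stay far apart, which is the clustering behavior the heat maps rely on.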
Academic research groups: evaluation of their quality and quality of their evaluation
NASA Astrophysics Data System (ADS)
Berche, Bertrand; Holovatch, Yuri; Kenna, Ralph; Mryglod, Olesya
2016-02-01
In recent years, evaluation of the quality of academic research has become an increasingly important and influential business. It determines, often to a large extent, the amount of research funding flowing into universities and similar institutes from governmental agencies and it impacts upon academic careers. Policy makers are becoming increasingly reliant upon, and influenced by, the outcomes of such evaluations. In response, university managers are increasingly attracted to simple metrics as guides to the dynamics of the positions of their various institutions in league tables. However, these league tables are invariably drawn up by inexpert bodies such as newspapers and magazines, using arbitrary measures and criteria. Terms such as “critical mass” and “h-index” are bandied about without understanding of what they actually mean. Rather than accepting the rise and fall of universities, departments and individuals on a turbulent sea of arbitrary measures, we suggest it is incumbent upon the scientific community itself to clarify their nature. Here we report on recent attempts to do that by properly defining critical mass and showing how group size influences research quality. We also examine currently predominant metrics and show that these fail as reliable indicators of group research quality.
Neutral signature Walker-CSI metrics
NASA Astrophysics Data System (ADS)
Coley, A.; Musoke, N.
2015-03-01
We will construct explicit examples of four-dimensional neutral signature Einstein Walker spaces for which all of the polynomial scalar curvature invariants are constant. We show that these Einstein Walker spaces are Kundt. We then investigate the mathematical properties of the spaces, including holonomy and universality.
ERIC Educational Resources Information Center
Harris, Watson
2011-01-01
There are many articles about space management, including those that discuss space calculations, metrics, and categories. Fewer articles discuss the space budgeting processes used by administrators to allocate space. The author attempts to fill this void by discussing her administrative experiences with Middle Tennessee State University's (MTSU)…
A topology of mineralization and its meaning for prospecting
Neuerburg, G.J.
1982-01-01
Epigenetic mineral deposits are universal members of an orderly spatial and temporal arrangement of igneous rocks, endomorphic rocks, and hydrothermally altered rocks. The association and sequence of these rocks is invariant whereas the metric relations and configurations of the properties of these rocks are unlimited in variety. This characterization satisfies the doctrines of topology. Metric relations are statistical, and their modes are among the better guides to optimal areas for exploration. Metric configurations are graphically irregular and unpredictable mathematical surfaces like mountain topography. Each mineral edifice must be mapped to locate its mineral deposits. All measurements and observations are only positive or neutral for the occurrence of a mineral deposit. Effective prospecting is based on an increasing density of positive data with proximity to the mineral deposit. This means sampling for maximal numbers of positive data, pragmatically the highest ore-element assays at each site, by selecting rock showing maximal development of lode attributes.
Gravity induced wave function collapse
NASA Astrophysics Data System (ADS)
Gasbarri, G.; Toroš, M.; Donadi, S.; Bassi, A.
2017-11-01
Starting from an idea of S. L. Adler [in Quantum Nonlocality and Reality: 50 Years of Bell's Theorem, edited by M. Bell and S. Gao (Cambridge University Press, Cambridge, England 2016)], we develop a novel model of gravity induced spontaneous wave function collapse. The collapse is driven by complex stochastic fluctuations of the spacetime metric. After deriving the fundamental equations, we prove the collapse and amplification mechanism, the two most important features of a consistent collapse model. Under reasonable simplifying assumptions, we constrain the strength ξ of the complex metric fluctuations with available experimental data. We show that ξ ≥ 10^-26 in order for the model to guarantee classicality of macro-objects, and at the same time ξ ≤ 10^-20 in order not to contradict experimental evidence. As a comparison, in the recent discovery of gravitational waves in the frequency range 35 to 250 Hz, the (real) metric fluctuations reach a peak of ξ ~ 10^-21.
Yang-Mills instantons in Kähler spaces with one holomorphic isometry
NASA Astrophysics Data System (ADS)
Chimento, Samuele; Ortín, Tomás; Ruipérez, Alejandro
2018-03-01
We consider self-dual Yang-Mills instantons in 4-dimensional Kähler spaces with one holomorphic isometry and show that they satisfy a generalization of the Bogomol'nyi equation for magnetic monopoles on certain 3-dimensional metrics. We then search for solutions of this equation in 3-dimensional metrics foliated by 2-dimensional spheres, hyperboloids or planes in the case in which the gauge group coincides with the isometry group of the metric (SO(3), SO(1,2), and ISO(2), respectively). Using a generalized hedgehog ansatz the Bogomol'nyi equations reduce to a simple differential equation in the radial variable which admits a universal solution and, in some cases, a particular one, from which one finally recovers instanton solutions in the original Kähler space. We work out completely a few explicit examples for some Kähler spaces of interest.
Expanding space-time and variable vacuum energy
NASA Astrophysics Data System (ADS)
Parmeggiani, Claudio
2017-08-01
The paper describes a cosmological model which contemplates the presence of a vacuum energy varying, very slightly (now), with time. The constant part of the vacuum energy generated, some 6 Gyr ago, a deceleration/acceleration transition of the metric expansion; so now, in an aged Universe, the expansion is inexorably accelerating. The vacuum energy varying part is instead assumed to be eventually responsible of an acceleration/deceleration transition, which occurred about 14 Gyr ago; this transition has a dynamic origin: it is a consequence of the general relativistic Einstein-Friedmann equations. Moreover, the vacuum energy (constant and variable) is here related to the zero-point energy of some quantum fields (scalar, vector, or spinor); these fields are necessarily described in a general relativistic way: their structure depends on the space-time metric, typically non-flat. More precisely, the commutators of the (quantum field) creation/annihilation operators are here assumed to depend on the local value of the space-time metric tensor (and eventually of its curvature); furthermore, these commutators rapidly decrease for high momentum values and they reduce to the standard ones for a flat metric. In this way, the theory is "gravitationally" regularized; in particular, the zero-point (vacuum) energy density has a well defined value and, for a non static metric, depends on the (cosmic) time. Note that this varying vacuum energy can be negative (Fermi fields) and that a change of its sign typically leads to a minimum for the metric expansion factor (a "bounce").
NASA Astrophysics Data System (ADS)
Padmanabhan, T.
2010-11-01
The purpose of the paper is fivefold: (a) Argue that the question in the title can be presented in a meaningful manner and that it requires an answer. (b) Discuss the conventional answers and explain why they are unsatisfactory. (c) Suggest that a key ingredient in the answer could be the instability arising due to the ‘wrong’ sign in the Hilbert action for the kinetic energy term corresponding to the expansion factor. (d) Describe how this idea connects up with another peculiar feature of our universe, viz. it spontaneously became more and more classical in the course of evolution. (e) Provide a speculative but plausible scenario, based on the thermodynamic perspective of gravity, in which one has the hope of relating the thermodynamic and cosmological arrows of time.
Frequency conversion by the transformation-optical analogue of the cosmological redshift
NASA Astrophysics Data System (ADS)
Ginis, Vincent; Tassin, Philippe; Craps, Ben; Veretennicoff, Irina
2011-10-01
Recently, there has been a lot of interest in electromagnetic analogues of general relativistic effects. Using the techniques of transformation optics, the material parameters of table-top devices have been calculated such that they implement several effects that occur in outer space, e.g., the implementation of an artificial event horizon inside an optical fiber, an inhomogeneous refractive index profile to mimic celestial mechanics, or an omnidirectional absorber based on an equivalence with black holes. In this communication, we show how we have extended the framework of transformation optics to a time-dependent metric: the Robertson-Walker metric, a popular model for our universe describing the cosmological redshift. This redshift occurs due to the expansion of the universe, whereby a photon of frequency ω_em emitted at time t_em will be measured at a different frequency ω_obs at time t_obs. The relation between these two frequencies is given by ω_obs a(t_obs) = ω_em a(t_em), where a(t) is the time-dependent scale factor of the expanding universe. Our results show that the transformation-optical analogue of the Robertson-Walker metric is a medium with linear, isotropic, and homogeneous material parameters that evolve as a given function of time. The electromagnetic solutions inside such a medium are frequency shifted according to the cosmological redshift formula. Furthermore, we have demonstrated that a finite slab of such a material allows for the frequency conversion of an optical signal without the creation of unwanted sidebands. Because the medium is linear, the superposition principle remains applicable and arbitrary wavepackets can be converted [V. Ginis, P. Tassin, B. Craps, and I. Veretennicoff, Opt. Express 18, 5350-5355 (2010)].
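The redshift relation ω_obs a(t_obs) = ω_em a(t_em) quoted in the abstract is easy to apply directly; the scale-factor and frequency values below are purely illustrative:

```python
# Applying the cosmological redshift relation: solving
# omega_obs * a(t_obs) = omega_em * a(t_em) for the observed frequency.

def redshifted_frequency(omega_em, a_em, a_obs):
    """Observed frequency after expansion from scale factor a_em to a_obs."""
    return omega_em * a_em / a_obs

# A photon emitted when the universe was half its present size arrives
# at half its emitted frequency (values are illustrative):
omega_obs = redshifted_frequency(omega_em=1.0e15, a_em=0.5, a_obs=1.0)
```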
Effect of gender on communication of health information to older adults.
Dearborn, Jennifer L; Panzer, Victoria P; Burleson, Joseph A; Hornung, Frederick E; Waite, Harrison; Into, Frances H
2006-04-01
To examine the effect of gender on three key elements of communication with elderly individuals: effectiveness of the communication, perceived relevance to the individual, and effect of gender-stereotyped content. Survey. University of Connecticut Health Center. Thirty-three subjects (17 female), aged 69 to 91 (mean ± standard deviation 82 ± 5.4). Older adults listened to 16 brief narratives randomized in order and by the sex of the speaker (Narrator Voice). Effectiveness was measured according to ability to identify key features (Risks), and subjects were asked to rate the relevance (Plausibility). Number of Risks detected and determinations of plausibility were analyzed according to Subject Gender and Narrator Voice. Narratives were written for either sex or included male or female bias (Neutral or Stereotyped). Female subjects identified a significantly higher number of Risks across all narratives (P=.01). Subjects perceived a significantly higher number of Risks with a female Narrator Voice (P=.03). A significant Voice-by-Stereotype interaction was present for female-stereotyped narratives (P=.009). In narratives rated as Plausible, subjects detected more Risks (P=.02). Subject Gender influenced communication effectiveness. A female speaker resulted in identification of more Risks for subjects of both sexes, particularly for Stereotyped narratives. There was no significant effect of matching Subject Gender and Narrator Voice. This study suggests that the sex of the speaker influences the effectiveness of communication with older adults. These findings should motivate future research into the means by which medical providers can improve communication with their patients.
NASA Astrophysics Data System (ADS)
Mio, Matthew J.
2017-02-01
Many logistic and instructional changes followed the incorporation of the 12 principles of green chemistry into organic chemistry laboratory courses at the University of Detroit Mercy. Over the last decade, institutional limitations have been turned into green chemical strengths in many areas, including integration of atom economy metrics into learning outcomes, replacing overly toxic equipment and reagents, and modifying matters of reaction scale and type.
Gravitational particle production in braneworld cosmology.
Bambi, C; Urban, F R
2007-11-09
Gravitational particle production in a time variable metric of an expanding universe is efficient only when the Hubble parameter H is not too small in comparison with the particle mass. In standard cosmology, the huge value of the Planck mass M_Pl makes the mechanism phenomenologically irrelevant. On the other hand, in braneworld cosmology, the expansion rate of the early Universe can be much faster, and many weakly interacting particles can be abundantly created. Cosmological implications are discussed.
Plane Symmetric Solutions in f(G) Gravity
NASA Astrophysics Data System (ADS)
Shamir, M. Farasat; Saeed, Atrooba
2017-12-01
The purpose of this document is to investigate the universe in f(G) gravity. A general static plane symmetric space-time is chosen and exact solutions are explored using a viable f(G) gravity model. In particular, power and exponential law solutions are discussed. In addition, the physical relevance of the solutions with Taub's metric and anti-de Sitter space-time is shown. Graphical analysis of energy density and pressure of the universe is done to substantiate the study.
Dissipative universe-inflation with soft singularity
NASA Astrophysics Data System (ADS)
Brevik, Iver; Timoshkin, Alexander V.
We investigate the early-time accelerated universe after the Big Bang. We pay attention to the dissipative properties of the inflationary universe in the presence of a soft-type singularity, making use of the parameters of the generalized equation of state of the fluid. A flat Friedmann-Robertson-Walker metric is used. We consider cosmological models leading to the so-called type IV singular inflation. Our theoretical results are compared with observational data from the Planck satellite. The theoretical predictions for the spectral index turn out to be in agreement with the data, while for the tensor-to-scalar ratio there are minor deviations.
Scalar-Tensor Black Holes Embedded in an Expanding Universe
NASA Astrophysics Data System (ADS)
Tretyakova, Daria; Latosh, Boris
2018-02-01
In this review we focus our attention on scalar-tensor gravity models and their empirical verification in terms of black hole and wormhole physics. We focus on a black hole, embedded in an expanding universe, describing both cosmological and astrophysical scales. We show that in scalar-tensor gravity it is quite common that the local geometry is isolated from the cosmological expansion, so that it does not backreact on the black hole metric. We try to extract common features of scalar-tensor black holes in an expanding universe and point out the gaps that must be filled.
Evaluating Research Administration: Methods and Utility
ERIC Educational Resources Information Center
Marina, Sarah; Davis-Hamilton, Zoya; Charmanski, Kara E.
2015-01-01
Three studies were jointly conducted by the Office of Research Administration and Office of Proposal Development at Tufts University to evaluate the services within each respective office. The studies featured assessments that used, respectively, (1) quantitative metrics; (2) a quantitative satisfaction survey with limited qualitative questions;…
DOT National Transportation Integrated Search
2016-12-16
Transportation plans and projects are typically evaluated, both prospectively and retrospectively, with metrics of mobility, notably highway level of service. This practice implicitly treats mobility improvements as desirable. Yet mobility improvemen...
EPA Examines Schools' Handling of Toxic Waste.
ERIC Educational Resources Information Center
Hanson, David
1989-01-01
Estimates that about 30,000 universities, colleges, and high schools produce a total of 4000 metric tons of hazardous waste annually. Discusses the difficulties that academic institutions have in disposing of small amounts of waste. Lists college courses with the potentially hazardous wastes usually produced. (MVL)
The Latest in Corporate-College Partnerships.
ERIC Educational Resources Information Center
Meister, Jeanne C.
2003-01-01
Success factors in establishing corporate-college partnerships include communicating a shared vision for success, defining the degree of customization and flexibility from a university, and mutually devising a marketing and recruitment program. The metrics for success must be defined early and managed throughout the partnership. (JOW)
Turbulence of Weak Gravitational Waves in the Early Universe.
Galtier, Sébastien; Nazarenko, Sergey V
2017-12-01
We study the statistical properties of an ensemble of weak gravitational waves interacting nonlinearly in a flat space-time. We show that the resonant three-wave interactions are absent and develop a theory for four-wave interactions in the reduced case of a 2.5+1 diagonal metric tensor. In this limit, where only plus-polarized gravitational waves are present, we derive the interaction Hamiltonian and consider the asymptotic regime of weak gravitational wave turbulence. Both direct and inverse cascades are found for the energy and the wave action, respectively, and the corresponding wave spectra are derived. The inverse cascade is characterized by a finite-time propagation of the metric excitations, a process similar to an explosive nonequilibrium Bose-Einstein condensation, which provides an efficient mechanism for ironing out small-scale inhomogeneities. The direct cascade leads to an accumulation of the radiation energy in the system. These processes might be important for understanding the early Universe, where a background of weak nonlinear gravitational waves is expected.
The Hubble IR cutoff in holographic ellipsoidal cosmologies
NASA Astrophysics Data System (ADS)
Cataldo, Mauricio; Cruz, Norman
2018-01-01
It is well known that for spatially flat FRW cosmologies, the holographic dark energy disfavors the Hubble parameter as a candidate for the IR cutoff. To overcome this problem, we explore the use of this cutoff in holographic ellipsoidal cosmological models, and derive the general ellipsoidal metric induced by such a holographic energy density. Despite the drawbacks that this cutoff presents in homogeneous and isotropic universes, based on this general metric we develop a suitable ellipsoidal holographic cosmological model, filled with dark matter and dark energy components. At late times, the cosmic evolution is dominated by a holographic anisotropic dark energy with barotropic equations of state. The cosmologies expand in all directions in an accelerated manner. Since the ellipsoidal cosmologies given here are not asymptotically FRW, the deviation from homogeneity and isotropy of the universe on large cosmological scales remains constant during the entire cosmic evolution. This feature allows the studied holographic ellipsoidal cosmologies to be ruled by an equation of state ω = p/ρ whose range belongs to quintessence or even phantom matter.
Effective Lagrangian in de Sitter spacetime
NASA Astrophysics Data System (ADS)
Kitamoto, Hiroyuki; Kitazawa, Yoshihisa
2017-01-01
Scale invariant fluctuations of the metric are a universal feature of quantum gravity in de Sitter spacetime. We construct an effective Lagrangian which summarizes their implications for local physics by integrating out superhorizon metric fluctuations. It shows that infrared quantum effects are local and render fundamental couplings time dependent. We impose Lorentz invariance on the effective Lagrangian, as required by the principle of general covariance. We show that such a requirement leads to unique physical predictions by fixing the quantization ambiguities. We explain how the gauge parameter dependence of observables is canceled. In particular, the relative evolution speeds of the couplings are shown to be gauge invariant.
Natural entropy production in an inflationary model for a polarized vacuum
NASA Astrophysics Data System (ADS)
Berman, Marcelo Samuel; Som, Murari M.
2007-08-01
Though entropy production is forbidden in standard FRW cosmology, Berman and Som presented a simple inflationary model in which entropy production by bulk viscosity during standard inflation, without ad hoc pressure terms, can be accommodated within the Robertson-Walker metric; thus the requirement that the early Universe be anisotropic is not essential for entropy growth during the inflationary phase, as we show. Entropy also grows due to shear viscosity in the anisotropic case. The intrinsically inflationary metric that we propose can be thought of as defining a polarized vacuum, and it leads directly to the desired effects without the need to introduce extra pressure terms.
Stochastic HKMDHE: A multi-objective contrast enhancement algorithm
NASA Astrophysics Data System (ADS)
Pratiher, Sawon; Mukhopadhyay, Sabyasachi; Maity, Srideep; Pradhan, Asima; Ghosh, Nirmalya; Panigrahi, Prasanta K.
2018-02-01
This contribution proposes a novel extension of the existing `Hyper Kurtosis based Modified Duo-Histogram Equalization' (HKMDHE) algorithm, for multi-objective contrast enhancement of biomedical images. A novel modified objective function has been formulated by joint optimization of the individual histogram equalization objectives. The optimal adequacy of the proposed methodology with respect to image quality metrics such as brightness preserving abilities, peak signal-to-noise ratio (PSNR), Structural Similarity Index (SSIM) and universal image quality metric has been experimentally validated. The performance analysis of the proposed Stochastic HKMDHE with existing histogram equalization methodologies like Global Histogram Equalization (GHE) and Contrast Limited Adaptive Histogram Equalization (CLAHE) has been given for comparative evaluation.
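Of the fidelity metrics listed above, PSNR is the simplest to state: 10·log10(MAX²/MSE). A minimal sketch for 8-bit images, using toy data rather than the paper's biomedical images:

```python
import numpy as np

# Peak signal-to-noise ratio for 8-bit images (toy data, not the
# paper's biomedical images).

def psnr(reference, test, max_val=255.0):
    """PSNR in dB: 10 * log10(MAX^2 / MSE); infinite for identical images."""
    mse = np.mean((reference.astype(float) - test.astype(float)) ** 2)
    if mse == 0:
        return float("inf")
    return 10.0 * np.log10(max_val ** 2 / mse)

ref = np.full((64, 64), 100, dtype=np.uint8)
deg = np.full((64, 64), 110, dtype=np.uint8)  # uniform error of 10 -> MSE = 100
value = psnr(ref, deg)
```

Higher values indicate an enhanced image closer to the reference; SSIM and the universal image quality metric additionally account for structural and perceptual similarity, which plain PSNR ignores.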
Geodesics in nonexpanding impulsive gravitational waves with Λ. II
NASA Astrophysics Data System (ADS)
Sämann, Clemens; Steinbauer, Roland
2017-11-01
We investigate all geodesics in the entire class of nonexpanding impulsive gravitational waves propagating in an (anti-)de Sitter universe using the distributional metric. We extend the regularization approach of part I [Sämann, C. et al., Classical Quantum Gravity 33(11), 115002 (2016)] to a full nonlinear distributional analysis within the geometric theory of generalized functions. We prove global existence and uniqueness of geodesics that cross the impulsive wave, and hence geodesic completeness in full generality for this class of low regularity spacetimes. This, in particular, prepares the ground for a mathematically rigorous account of the "physical equivalence" of the continuous form with the distributional "form" of the metric.
Spacetime emergence of the Robertson-Walker universe from a matrix model.
Erdmenger, Johanna; Meyer, René; Park, Jeong-Hyuck
2007-06-29
Using a novel, string theory-inspired formalism based on a Hamiltonian constraint, we obtain a conformal mechanical system for the spatially flat four-dimensional Robertson-Walker Universe. Depending on parameter choices, this system describes either a relativistic particle in the Robertson-Walker background or metric fluctuations of the Robertson-Walker geometry. Moreover, we derive a tree-level M theory matrix model in this time-dependent background. Imposing the Hamiltonian constraint forces the spacetime geometry to be fuzzy near the big bang, while the classical Robertson-Walker geometry emerges as the Universe expands. From our approach, we also derive the temperature of the Universe interpolating between the radiation and matter dominated eras.
An investigation of fighter aircraft agility
NASA Technical Reports Server (NTRS)
Valasek, John; Downing, David R.
1993-01-01
This report attempts to unify in a single document the results of a series of studies on fighter aircraft agility funded by the NASA Ames Research Center, Dryden Flight Research Facility and conducted at the University of Kansas Flight Research Laboratory during the period January 1989 through December 1993. New metrics proposed by pilots and the research community to assess fighter aircraft agility are collected and analyzed. The report develops a framework for understanding the context into which the various proposed fighter agility metrics fit in terms of application and testing. Since new metrics continue to be proposed, this report does not claim to contain every proposed fighter agility metric. Flight test procedures, test constraints, and related criteria are developed. Instrumentation required to quantify agility via flight test is considered, as is the sensitivity of the candidate metrics to deviations from nominal pilot command inputs, which is studied in detail. Instead of supplying specific, detailed conclusions about the relevance or utility of one candidate metric versus another, the authors have attempted to provide sufficient data and analyses for readers to formulate their own conclusions. Readers are therefore ultimately responsible for judging exactly which metrics are 'best' for their particular needs. Additionally, it is not the intent of the authors to suggest combat tactics or other actual operational uses of the results and data in this report. This has been left up to the user community. Twenty of the candidate agility metrics were selected for evaluation with high fidelity, nonlinear, non real-time flight simulation computer programs of the F-5A Freedom Fighter, F-16A Fighting Falcon, F-18A Hornet, and X-29A. The information and data presented on the 20 candidate metrics which were evaluated will assist interested readers in conducting their own extensive investigations. 
The report provides a definition and analysis of each metric; details of how to test and measure the metric, including any special data reduction requirements; typical values for the metric obtained using one or more aircraft types; and a sensitivity analysis if applicable. The report is organized as follows. The first chapter in the report presents a historical review of air combat trends which demonstrate the need for agility metrics in assessing the combat performance of fighter aircraft in a modern, all-aspect missile environment. The second chapter presents a framework for classifying each candidate metric according to time scale (transient, functional, instantaneous), further subdivided by axis (pitch, lateral, axial). The report is then broadly divided into two parts, with the transient agility metrics (pitch, lateral, axial) covered in chapters three, four, and five, and the functional agility metrics covered in chapter six. Conclusions, recommendations, and an extensive reference list and bibliography are also included. Five appendices contain a comprehensive list of the definitions of all the candidate metrics; a description of the aircraft models and flight simulation programs used for testing the metrics; several relations and concepts which are fundamental to the study of lateral agility; an in-depth analysis of the axial agility metrics; and a derivation of the relations for the instantaneous agility and their approximations.
DOT National Transportation Integrated Search
2016-12-01
This report seeks opportunities for standardization of these data and explains findings on three principal tasks. First, it assesses the current state of standardized transportation data. By studying documentation of other programs of standardized da...
Palatini formulation of f( R, T) gravity theory, and its cosmological implications
NASA Astrophysics Data System (ADS)
Wu, Jimin; Li, Guangjie; Harko, Tiberiu; Liang, Shi-Dong
2018-05-01
We consider the Palatini formulation of f( R, T) gravity theory, in which a non-minimal coupling between the Ricci scalar and the trace of the energy-momentum tensor is introduced, by considering the metric and the affine connection as independent field variables. The field equations and the equations of motion for massive test particles are derived, and we show that the independent connection can be expressed as the Levi-Civita connection of an auxiliary, energy-momentum trace dependent metric, related to the physical metric by a conformal transformation. Similar to the metric case, the field equations impose the non-conservation of the energy-momentum tensor. We obtain the explicit form of the equations of motion for massive test particles in the case of a perfect fluid, and the expression of the extra force, which is identical to the one obtained in the metric case. The thermodynamic interpretation of the theory is also briefly discussed. We investigate in detail the cosmological implications of the theory, and we obtain the generalized Friedmann equations of the f( R, T) gravity in the Palatini formulation. Cosmological models with Lagrangians of the type f=R-α ^2/R+g(T) and f=R+α ^2R^2+g(T) are investigated. These models lead to evolution equations whose solutions describe accelerating Universes at late times.
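The conformal relation stated in the abstract can be written out explicitly. The sketch below uses our own notation and follows the standard Palatini construction (with f_R denoting the partial derivative of f with respect to R), so the exact conventions of the paper may differ:

```latex
% Auxiliary metric conformally related to the physical metric (notation ours):
h_{\mu\nu} = f_R(R,T)\, g_{\mu\nu}, \qquad f_R \equiv \frac{\partial f(R,T)}{\partial R}
% The independent connection is then the Levi-Civita connection of h:
\tilde{\Gamma}^{\lambda}{}_{\mu\nu} = \frac{1}{2}\, h^{\lambda\sigma}
  \left( \partial_{\mu} h_{\sigma\nu} + \partial_{\nu} h_{\sigma\mu} - \partial_{\sigma} h_{\mu\nu} \right)
```

In the pure Palatini f(R) case the trace dependence drops out and this reduces to the familiar h = f'(R) g rescaling.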
EnviroAtlas - Biodiversity Metrics by 12-digit HUC for the Southwestern United States
This EnviroAtlas dataset was produced by a joint effort of New Mexico State University, US EPA, and the US Geological Survey (USGS) to support research and online mapping activities related to EnviroAtlas. Ecosystem services, i.e., services provided to humans from ecological systems, have become a key issue of this century in resource management, conservation planning, and environmental decision analysis. Mapping and quantifying ecosystem services have become strategic national interests for integrating ecology with economics to help understand the effects of human policies and actions and their subsequent impacts on both ecosystem function and human well-being. Some aspects of biodiversity are valued by humans in varied ways, and thus are important to include in any assessment that seeks to identify and quantify the benefits of ecosystems to humans. Some biodiversity metrics clearly reflect ecosystem services (e.g., abundance and diversity of harvestable species), whereas others may reflect indirect and difficult to quantify relationships to services (e.g., relevance of species diversity to ecosystem resilience, or cultural and aesthetic values). Wildlife habitat has been modeled at broad spatial scales and can be used to map a number of biodiversity metrics. We map 15 biodiversity metrics reflecting ecosystem services or other aspects of biodiversity for all vertebrate species except fish. Metrics include species richness for all vertebrates, specific taxon gr
This dataset was produced by a joint effort of New Mexico State University (NMSU), the U.S. Environmental Protection Agency (EPA), and the U.S. Geological Survey (USGS) to support research and online mapping activities related to EnviroAtlas. Ecosystem services, i.e., services provided to humans from ecological systems, have become a key issue of this century in resource management, conservation planning, and environmental decision analysis. Mapping and quantifying ecosystem services have become strategic national interests for integrating ecology with economics to help understand the effects of human policies and actions and their subsequent impacts on both ecosystem function and human well-being. Some aspects of biodiversity are valued by humans in varied ways, and thus are important to include in any assessment that seeks to identify and quantify the benefits of ecosystems to humans. Some biodiversity metrics clearly reflect ecosystem services (e.g., abundance and diversity of harvestable species), whereas others may reflect indirect and difficult to quantify relationships to services (e.g., relevance of species diversity to ecosystem resilience, or cultural and aesthetic values). Wildlife habitat has been modeled at broad spatial scales and can be used to map a number of biodiversity metrics. We map 15 biodiversity metrics reflecting ecosystem services or other aspects of biodiversity for bird species. Metrics include all bird species richness, lists identif
Aging and efficiency in living systems: Complexity, adaptation and self-organization.
Chatterjee, Atanu; Georgiev, Georgi; Iannacchione, Germano
2017-04-01
Living systems are open, out-of-equilibrium thermodynamic entities that maintain order by locally reducing their entropy. Aging is a process by which these systems gradually lose their ability to maintain their out-of-equilibrium state, as measured by their free-energy rate density, and hence their order. Thus, the process of aging reduces the efficiency of these systems, making them fragile and less adaptive to environmental fluctuations, gradually driving them towards the state of thermodynamic equilibrium. In this paper, we discuss the various metrics that can be used to understand the process of aging from a complexity science perspective. Among all the metrics that we propose, action efficiency is observed to be of key interest, as it can be used to quantify order and self-organization in any physical system. Based upon our arguments, we present the dependency of other metrics on the action efficiency of a system, and also argue how each of the metrics influences all the other system variables. In order to support our claims, we draw parallels between technological progress and biological growth. Such parallels are used to support the universal applicability of the metrics and the methodology presented in this paper. Therefore, the results and arguments presented in this paper throw light on the finer nuances of the science of aging. Copyright © 2017 Elsevier B.V. All rights reserved.
EnviroAtlas - Biodiversity Metrics by 12-digit HUC for the Southeastern United States
This EnviroAtlas dataset was produced by a joint effort of New Mexico State University, US EPA, and the US Geological Survey (USGS) to support research and online mapping activities related to EnviroAtlas. Ecosystem services, i.e., services provided to humans from ecological systems, have become a key issue of this century in resource management, conservation planning, and environmental decision analysis. Mapping and quantifying ecosystem services have become strategic national interests for integrating ecology with economics to help understand the effects of human policies and actions and their subsequent impacts on both ecosystem function and human well-being. Some aspects of biodiversity are valued by humans in varied ways, and thus are important to include in any assessment that seeks to identify and quantify the benefits of ecosystems to humans. Some biodiversity metrics clearly reflect ecosystem services (e.g., abundance and diversity of harvestable species), whereas others may reflect indirect and difficult to quantify relationships to services (e.g., relevance of species diversity to ecosystem resilience, or cultural and aesthetic values). Wildlife habitat has been modeled at broad spatial scales and can be used to map a number of biodiversity metrics. We map 14 biodiversity metrics reflecting ecosystem services or other aspects of biodiversity for all vertebrate species except fish. Metrics include species richness for all vertebrates, specific taxon gr
NASA Astrophysics Data System (ADS)
Fu, Chengjie; Wu, Puxun; Yu, Hongwei
2017-11-01
This paper studies the inflationary dynamics and preheating in a model with a nonminimally coupled inflaton field in the metric and Palatini formalisms. We find that in both formalisms, irrespective of the initial conditions, our Universe evolves into a slow-roll inflationary era and the scalar field then rolls into an oscillating phase. The value of the scalar field at the end of inflation in the Palatini formalism is always larger than that in the metric one, and the difference grows with the absolute value of the coupling parameter |ξ|. During preheating, we find that inflaton quanta are produced explosively due to parametric resonance, and their growth is terminated by backreaction. As |ξ| increases, the resonance bands move closer to zero momentum (k = 0); the resonance structure changes and becomes broader and broader in the metric formalism, while it remains narrow in the Palatini formalism. The energy transfer from the inflaton field to the fluctuations becomes more efficient as |ξ| increases, and in the metric formalism the efficiency of energy transfer grows much faster than in the Palatini formalism. Therefore, inflation and preheating show different characteristics in the two formalisms.
Is there vacuum when there is mass? Vacuum and non-vacuum solutions for massive gravity
NASA Astrophysics Data System (ADS)
Martín-Moruno, Prado; Visser, Matt
2013-08-01
Massive gravity is a theory which has a tremendous amount of freedom to describe different cosmologies, but at the same time, the various solutions one encounters must fulfil some rather nontrivial constraints. Most of the freedom comes not from the Lagrangian, which contains only a small number of free parameters (typically three depending on counting conventions), but from the fact that one is in principle free to choose the reference metric almost arbitrarily—which effectively introduces a non-denumerable infinity of free parameters. In the current paper, we stress that although changing the reference metric would lead to a different cosmological model, this does not mean that the dynamics of the universe can be entirely divorced from its matter content. That is, while the choice of reference metric certainly influences the evolution of the physically observable foreground metric, the effect of matter cannot be neglected. Indeed, the interplay between matter and geometry can be significantly changed in some specific models, effectively because the graviton would be able to curve the spacetime by itself, without the need for matter. Thus, even the set of vacuum solutions for massive gravity can have significant structure. In some cases, the effect of the reference metric could be so strong that no conceivable material content would be able to drastically affect the cosmological evolution. Dedicated to the memory of Professor Pedro F González-Díaz
Numerical relativity and the early Universe
NASA Astrophysics Data System (ADS)
Mironov, Sergey
2016-10-01
We consider numerical simulations in general relativity in the ADM formalism with a cosmological ansatz for the metric. This ansatz is convenient for investigations of the creation of a Universe in the laboratory with Galileons. Here we consider a toy model for the software: a spherically symmetric scalar field minimally coupled to gravity with an asymmetric double-well potential. We studied the dependence of the radius of the critical bubble on the parameters of the theory. It demonstrates the wide applicability of the thin-wall approximation. We did not find any kind of stable bubble solution.
Evolution of the equations of dynamics of the Universe: From Friedmann to the present day
NASA Astrophysics Data System (ADS)
Soloviev, V. O.
2017-05-01
Celebrating the centenary of general relativity theory, we must recall that Friedmann's discovery of the equations of evolution of the Universe became the strongest prediction of this theory. These equations currently remain the foundation of modern cosmology. Nevertheless, data from new observations stimulate a search for modified theories of gravitation. We discuss cosmological aspects of theories with two dynamical metrics and theories of massive gravity, one of which was developed by Logunov and his coworkers.
Mauderly, J L; Barrett, E G; Day, K C; Gigliotti, A P; McDonald, J D; Harrod, K S; Lund, A K; Reed, M D; Seagrave, J C; Campen, M J; Seilkop, S K
2014-09-01
The NERC Program conducted identically designed exposure-response studies of the respiratory and cardiovascular responses of rodents exposed by inhalation for up to 6 months to diesel and gasoline exhausts (DE, GE), wood smoke (WS) and simulated downwind coal emissions (CE). Concentrations of the four combustion-derived mixtures ranged from near the plausible upper bound down to common occupational and environmental hotspot levels. An "exposure effect" statistic was created to compare the strengths of exposure-response relationships, and adjustments were made to minimize false positives among the large number of comparisons. All four exposures caused statistically significant effects. No exposure caused overt illness, neutrophilic lung inflammation, increased circulating micronuclei or histopathology of major organs visible by light microscopy. DE and GE caused the greatest lung cytotoxicity. WS elicited the most responses in lung lavage fluid. All exposures reduced oxidant production by unstimulated alveolar macrophages, but only GE suppressed stimulated macrophages. Only DE retarded clearance of bacteria from the lung. DE before antigen challenge suppressed responses of allergic mice. CE tended to amplify allergic responses regardless of exposure order. GE and DE induced oxidant stress and pro-atherosclerotic responses in aorta; WS and CE had no such effects. No overall ranking of toxicity was plausible. The ranking of exposures by number of significant responses varied among the response models, with each of the four causing the most responses for at least one model. Each exposure could also be deemed most or least toxic depending on the exposure metric used for comparison. The database is available for additional analyses.
Crupi, Vincenzo; Nelson, Jonathan D; Meder, Björn; Cevolani, Gustavo; Tentori, Katya
2018-06-17
Searching for information is critical in many situations. In medicine, for instance, careful choice of a diagnostic test can help narrow down the range of plausible diseases that the patient might have. In a probabilistic framework, test selection is often modeled by assuming that people's goal is to reduce uncertainty about possible states of the world. In cognitive science, psychology, and medical decision making, Shannon entropy is the most prominent and most widely used model to formalize probabilistic uncertainty and the reduction thereof. However, a variety of alternative entropy metrics (Hartley, Quadratic, Tsallis, Rényi, and more) are popular in the social and the natural sciences, computer science, and philosophy of science. Particular entropy measures have been predominant in particular research areas, and it is often an open issue whether these divergences emerge from different theoretical and practical goals or are merely due to historical accident. Cutting across disciplinary boundaries, we show that several entropy and entropy reduction measures arise as special cases in a unified formalism, the Sharma-Mittal framework. Using mathematical results, computer simulations, and analyses of published behavioral data, we discuss four key questions: How do various entropy models relate to each other? What insights can be obtained by considering diverse entropy models within a unified framework? What is the psychological plausibility of different entropy models? What new questions and insights for research on human information acquisition follow? Our work provides several new pathways for theoretical and empirical research, reconciling apparently conflicting approaches and empirical findings within a comprehensive and unified information-theoretic formalism. Copyright © 2018 Cognitive Science Society, Inc.
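The unification claimed above can be illustrated with a short sketch of the Sharma-Mittal family. This is not code from the paper; the two-parameter form below (order q, degree r, in nats) is one common convention, and the special-case limits are handled by their closed forms:

```python
import math

def sharma_mittal(p, q, r, eps=1e-12):
    """Sharma-Mittal entropy of order q and degree r, in nats.

    One common two-parameter convention (ours, not necessarily the paper's).
    Special cases are recovered as limits, handled here by closed forms:
      r -> 1     : Renyi entropy of order q
      r -> q     : Tsallis entropy of order q
      q, r -> 1  : Shannon entropy
    """
    support = [pi for pi in p if pi > 0]
    s = sum(pi ** q for pi in support)
    shannon = -sum(pi * math.log(pi) for pi in support)
    if abs(q - 1.0) < eps and abs(r - 1.0) < eps:
        return shannon
    if abs(r - 1.0) < eps:                       # Renyi limit
        return math.log(s) / (1.0 - q)
    if abs(q - 1.0) < eps:                       # q -> 1 limit of the general form
        return (math.exp((1.0 - r) * shannon) - 1.0) / (1.0 - r)
    return (s ** ((1.0 - r) / (1.0 - q)) - 1.0) / (1.0 - r)

uniform = [0.25] * 4
print(round(sharma_mittal(uniform, 1.0, 1.0), 4))  # Shannon: ln 4 = 1.3863
print(round(sharma_mittal(uniform, 2.0, 1.0), 4))  # Renyi-2: also ln 4 for a uniform p
print(round(sharma_mittal(uniform, 2.0, 2.0), 4))  # Tsallis-2: 1 - sum p^2 = 0.75
```

An uncertainty-reduction model of test selection would then score a query by its expected entropy drop under whichever (q, r) pair is adopted.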
Whale song analyses using bioinformatics sequence analysis approaches
NASA Astrophysics Data System (ADS)
Chen, Yian A.; Almeida, Jonas S.; Chou, Lien-Siang
2005-04-01
Animal songs are frequently analyzed using discrete hierarchical units, such as units, themes, and songs. Because animal songs and bio-sequences may be understood as analogous, bioinformatics analysis tools (DNA/protein sequence alignment and alignment-free methods) are proposed to quantify the theme similarities of the songs of false killer whales recorded off northeast Taiwan. The eighteen themes with discrete units that were identified in an earlier study [Y. A. Chen, master's thesis, University of Charleston, 2001] were compared quantitatively using several distance metrics. These metrics included the scores calculated using the Smith-Waterman algorithm with the repeated procedure, the standardized Euclidean distance, and the angle metrics based on word frequencies. The theme classifications based on different metrics were summarized and compared in dendrograms using cluster analyses. The results agree qualitatively with earlier classifications derived by human observation. These methods further quantify the similarities among themes and could be applied to the analyses of other animal songs on a larger scale. For instance, these techniques could be used to investigate song evolution and cultural transmission by quantifying the dissimilarities of humpback whale songs across different seasons, years, populations, and geographic regions. [Work supported by SC Sea Grant, and Ilan County Government, Taiwan.]
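The Smith-Waterman comparison of themes can be sketched as a plain dynamic program over sequences of discrete unit labels. The scoring parameters and the two themes below are illustrative, not those of the study (and the "repeated procedure" variant is omitted):

```python
def smith_waterman(a, b, match=2, mismatch=-1, gap=-1):
    """Smith-Waterman local alignment score between two unit sequences.

    A minimal sketch: illustrative scoring parameters, score only (no traceback).
    """
    rows, cols = len(a) + 1, len(b) + 1
    H = [[0] * cols for _ in range(rows)]
    best = 0
    for i in range(1, rows):
        for j in range(1, cols):
            diag = H[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            # Local alignment: scores are floored at zero.
            H[i][j] = max(0, diag, H[i - 1][j] + gap, H[i][j - 1] + gap)
            best = max(best, H[i][j])
    return best

# Themes encoded as sequences of discrete unit labels (hypothetical data).
theme1 = ["A", "B", "B", "C", "D"]
theme2 = ["A", "B", "C", "D", "E"]
print(smith_waterman(theme1, theme2))  # 7: aligns A-B-C-D with one gap penalty
```

Converting such scores into distances (for example, via normalization against self-alignment scores) is what allows them to feed the same clustering and dendrogram pipeline as the word-frequency metrics.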
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pitts, J. Brian, E-mail: jbp25@cam.ac.uk
2016-02-15
Einstein’s equations were derived for a free massless spin-2 field using universal coupling in the 1950–1970s by various authors; total stress–energy including gravity’s served as a source for linear free field equations. A massive variant was likewise derived in the late 1960s by Freund, Maheshwari and Schonberg, and thought to be unique. How broad is universal coupling? In the last decade four 1-parameter families of massive spin-2 theories (contravariant, covariant, tetrad, and cotetrad of almost any density weights) have been derived using universal coupling. The (co)tetrad derivations included 2 of the 3 pure spin-2 theories due to de Rham, Gabadadze, and Tolley; those two theories first appeared in the 2-parameter Ogievetsky–Polubarinov family (1965), which developed the symmetric square root of the metric as a nonlinear group realization. One of the two theories was identified as pure spin-2 by Maheshwari in 1971–1972, thus evading the Boulware–Deser–Tyutin–Fradkin ghost by the time it was announced. Unlike the previous 4 families, this paper permits nonlinear field redefinitions to build the effective metric. By not insisting in advance on knowing the observable significance of the graviton potential to all orders, one finds that an arbitrary graviton mass term can be derived using universal coupling. The arbitrariness of a universally coupled mass/self-interaction term contrasts sharply with the uniqueness of the Einstein kinetic term. One might have hoped to use universal coupling as a tie-breaking criterion for choosing among theories that are equally satisfactory on more crucial grounds (such as lacking ghosts and having a smooth massless limit). But the ubiquity of universal coupling implies that the criterion does not favor any particular theories among those with the Einstein kinetic term.
NASA Astrophysics Data System (ADS)
Soriano-Hernández, P.; del Castillo-Mussot, M.; Campirán-Chávez, I.; Montemayor-Aldrete, J. A.
2017-04-01
Forbes Magazine published its list of the two thousand leading or strongest publicly traded companies in the world (G-2000) based on four independent metrics: sales or revenues, profits, assets and market value. Every one of these wealth metrics yields particular information on the corporate size or wealth of each firm. The G-2000 cumulative probability wealth distribution per employee (per capita) for all four metrics exhibits a two-class structure: quasi-exponential in the lower part, and a Pareto power-law in the higher part. These two-class per capita distributions are qualitatively similar to income and wealth distributions in many countries of the world, but the fraction of firms per employee within the high-class Pareto zone is about 49% in sales per employee, and 33% after averaging over the four metrics, whereas in countries the fraction of rich agents in the Pareto zone is less than 10%. The quasi-exponential zone can be fitted by Gamma or Log-normal distributions. On the other hand, Forbes classifies the G-2000 firms into 82 different industries or economic activities. Within each industry, the wealth distribution per employee also follows a two-class structure, but when the aggregate wealth of firms in each industry for the four metrics is divided by the total number of employees in that industry, the 82 points of the aggregate wealth distribution by industry per employee can be well fitted by quasi-exponential curves for the four metrics.
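The Pareto (power-law) upper class described above is usually characterized by its tail exponent. A standard, if simple, way to estimate it is the Hill estimator; the sketch below runs it on synthetic Pareto samples rather than the Forbes G-2000 data, and the tail fraction is an illustrative choice:

```python
import math
import random

def hill_estimator(sample, tail_fraction=0.2):
    """Hill estimator of the Pareto tail exponent alpha.

    Uses the largest `tail_fraction` of the sample, with the k-th largest
    value as the threshold. A rough sketch, not a careful tail analysis.
    """
    xs = sorted(sample, reverse=True)
    k = max(2, int(len(xs) * tail_fraction))
    x_k = xs[k - 1]                      # threshold: k-th largest value
    return k / sum(math.log(x / x_k) for x in xs[:k])

# Synthetic Pareto(alpha = 1.5) data via inverse-transform sampling:
# x = x_min * u**(-1/alpha) with u uniform on (0, 1].
random.seed(0)
alpha_true = 1.5
data = [(1.0 - random.random()) ** (-1.0 / alpha_true) for _ in range(20000)]

print(round(hill_estimator(data, tail_fraction=0.1), 2))  # should land near 1.5
```

On mixed exponential-plus-Pareto data like the per-capita distributions described above, the estimate is only meaningful when the chosen tail fraction stays inside the Pareto zone.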
Resolving Conflicts Between Syntax and Plausibility in Sentence Comprehension
Andrews, Glenda; Ogden, Jessica E.; Halford, Graeme S.
2017-01-01
Comprehension of plausible and implausible object- and subject-relative clause sentences with and without prepositional phrases was examined. Undergraduates read each sentence then evaluated a statement as consistent or inconsistent with the sentence. Higher acceptance of consistent than inconsistent statements indicated reliance on syntactic analysis. Higher acceptance of plausible than implausible statements reflected reliance on semantic plausibility. There was greater reliance on semantic plausibility and lesser reliance on syntactic analysis for more complex object-relatives and sentences with prepositional phrases than for less complex subject-relatives and sentences without prepositional phrases. Comprehension accuracy and confidence were lower when syntactic analysis and semantic plausibility yielded conflicting interpretations. The conflict effect on comprehension was significant for complex sentences but not for less complex sentences. Working memory capacity predicted resolution of the syntax-plausibility conflict in more and less complex items only when sentences and statements were presented sequentially. Fluid intelligence predicted resolution of the conflict in more and less complex items under sequential and simultaneous presentation. Domain-general processes appear to be involved in resolving syntax-plausibility conflicts in sentence comprehension. PMID:28458748
Counterfactual Plausibility and Comparative Similarity.
Stanley, Matthew L; Stewart, Gregory W; Brigard, Felipe De
2017-05-01
Counterfactual thinking involves imagining hypothetical alternatives to reality. Philosopher David Lewis (1973, 1979) argued that people estimate the subjective plausibility that a counterfactual event might have occurred by comparing an imagined possible world in which the counterfactual statement is true against the current, actual world in which the counterfactual statement is false. Accordingly, counterfactuals considered to be true in possible worlds comparatively more similar to ours are judged as more plausible than counterfactuals deemed true in possible worlds comparatively less similar. Although Lewis did not originally develop his notion of comparative similarity to be investigated as a psychological construct, this study builds upon his idea to empirically investigate comparative similarity as a possible psychological strategy for evaluating the perceived plausibility of counterfactual events. More specifically, we evaluate judgments of comparative similarity between episodic memories and episodic counterfactual events as a factor influencing people's judgments of plausibility in counterfactual simulations, and we also compare it against other factors thought to influence judgments of counterfactual plausibility, such as ease of simulation and prior simulation. Our results suggest that the greater the perceived similarity between the original memory and the episodic counterfactual event, the greater the perceived plausibility that the counterfactual event might have occurred. While similarity between actual and counterfactual events, ease of imagining, and prior simulation of the counterfactual event were all significantly related to counterfactual plausibility, comparative similarity best captured the variance in ratings of counterfactual plausibility. Implications for existing theories on the determinants of counterfactual plausibility are discussed. Copyright © 2016 Cognitive Science Society, Inc.
Partially supervised speaker clustering.
Tang, Hao; Chu, Stephen Mingyu; Hasegawa-Johnson, Mark; Huang, Thomas S
2012-05-01
Content-based multimedia indexing, retrieval, and processing as well as multimedia databases demand the structuring of the media content (image, audio, video, text, etc.), one significant goal being to associate the identity of the content to the individual segments of the signals. In this paper, we specifically address the problem of speaker clustering, the task of assigning every speech utterance in an audio stream to its speaker. We offer a complete treatment of the idea of partially supervised speaker clustering, which refers to the use of our prior knowledge of speakers in general to assist the unsupervised speaker clustering process. By means of an independent training data set, we encode the prior knowledge at the various stages of the speaker clustering pipeline via 1) learning a speaker-discriminative acoustic feature transformation, 2) learning a universal speaker prior model, and 3) learning a discriminative speaker subspace, or equivalently, a speaker-discriminative distance metric. We study the directional scattering property of the Gaussian mixture model (GMM) mean supervector representation of utterances in the high-dimensional space, and advocate exploiting this property by using the cosine distance metric instead of the Euclidean distance metric for speaker clustering in the GMM mean supervector space. We propose to perform discriminant analysis based on the cosine distance metric, which leads to a novel distance metric learning algorithm—linear spherical discriminant analysis (LSDA). We show that the proposed LSDA formulation can be systematically solved within the elegant graph embedding general dimensionality reduction framework.
Our speaker clustering experiments on the GALE database clearly indicate that 1) our speaker clustering methods based on the GMM mean supervector representation and vector-based distance metrics outperform traditional speaker clustering methods based on the “bag of acoustic features” representation and statistical model-based distance metrics, 2) our advocated use of the cosine distance metric yields consistent increases in the speaker clustering performance as compared to the commonly used Euclidean distance metric, 3) our partially supervised speaker clustering concept and strategies significantly improve the speaker clustering performance over the baselines, and 4) our proposed LSDA algorithm further leads to state-of-the-art speaker clustering performance.
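The advocated switch from the Euclidean to the cosine distance metric can be illustrated in a few lines. The vectors below are toy stand-ins, not GMM mean supervectors from real utterances:

```python
import math

def euclidean_distance(u, v):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

def cosine_distance(u, v):
    """1 - cosine similarity: sensitive to direction only, not magnitude."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return 1.0 - dot / (norm_u * norm_v)

v1 = [1.0, 2.0, 3.0]
v2 = [2.0, 4.0, 6.0]   # same direction as v1, twice the magnitude
v3 = [3.0, -1.0, 0.5]  # different direction

print(round(cosine_distance(v1, v2), 12))    # 0.0: pure scaling is invisible to the angle
print(round(euclidean_distance(v1, v2), 2))  # 3.74: scaling dominates the Euclidean metric
```

Normalizing the vectors to unit length and then using the Euclidean distance would behave similarly; the cosine metric simply makes that normalization implicit, which is why it suits a representation whose informative property is directional.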
A Ten Year Citation Analysis of Major Australian Research Institutions
ERIC Educational Resources Information Center
Batterham, Robin J.
2011-01-01
The introduction of the Excellence in Research for Australia scheme has heightened debate amongst research institutions over the use of metrics such as citations, especially given the ready availability of citation data. An analysis is presented of the citation performance of nine Australian universities and the Commonwealth Scientific, Industrial…
The Relationship between Student Engagement and Alumni Involvement
ERIC Educational Resources Information Center
Randazza, Paula T.
2017-01-01
As issues of access, affordability, and accountability continue to make the national agenda, students and families are evaluating colleges and universities based on metrics that help determine a return on investment. While measures such as student loan debt and graduation rates are important factors in evaluating any institution, it is also…
Benchmarking Usage Statistics in Collection Management Decisions for Serials
ERIC Educational Resources Information Center
Tucker, Cory
2009-01-01
Usage statistics are an important metric for making decisions on serials. Although the University of Nevada, Las Vegas (UNLV) Libraries have been collecting usage statistics, the statistics had not frequently been used to make decisions and had not been included in collection development policy. After undergoing a collection assessment, the…
A Challenge to Metrics as Evidence of Scholarity
ERIC Educational Resources Information Center
Stelmach, Bonnie L.; von Wolff, Stuart D.
2011-01-01
Now that universities have shifted their priorities to those of the "cash nexus", they increasingly articulate their accomplishments and validate their existence in business terms for a globally competitive academic market. But corporatizing trends and the use of bibliometric tools that rank publication and quantify scholarity impact a…
Applying a Metrics Report Card
ERIC Educational Resources Information Center
Klubeck, Martin; Langthorne, Michael
2008-01-01
In this article, the authors suggest that providing a report card enables an IT department to check its progress and overall performance; communicate the department's effectiveness to university leadership, IT membership, and customers; and make any necessary adjustments. A report card will not show how efficiently the IT department functions, but…
Inhomogeneous generalization of some Bianchi models
NASA Astrophysics Data System (ADS)
Carmeli, M.; Charach, Ch.
1980-02-01
Vacuum Bianchi models which can be transformed to the Einstein-Rosen metric are considered. The models are used in order to construct new inhomogeneous universes, which are generalizations of Bianchi cosmologies of types III, V and VIh. Recent generalizations of these Bianchi models, considered by Wainwright et al., are also discussed.
The Paradox of Collaboration: A Moral Continuum
ERIC Educational Resources Information Center
Macfarlane, Bruce
2017-01-01
Collaboration is a modern mantra of the neoliberal university and part of a discourse allied to research performativity quantitatively measured via co-authorship. Yet, beyond the metrics and the positive rhetoric collaboration is a complex and paradoxical concept. Academic staff are exhorted to collaborate, particularly in respect to research…
An Examination of Advisor Concerns in the Era of Academic Analytics
ERIC Educational Resources Information Center
Daughtry, Jeremy J.
2017-01-01
Performance-based funding models are increasingly becoming the norm for many institutions of higher learning. Such models place greater emphasis on student retention and success metrics, for example, as requirements for receiving state appropriations. To stay competitive, universities have adopted academic analytics technologies capable of…
IT Metrics and Money: One Approach to Public Accountability
ERIC Educational Resources Information Center
Daigle, Stephen L.
2004-01-01
Performance measurement can be a difficult political as well as technical challenge for educational institutions at all levels. Performance-based budgeting can raise the stakes still higher by linking resource allocation to a public "report card." The 23-campus system of the California State University (CSU) accepted each of these…
2012-01-01
Microorganisms are ubiquitous on earth and have diverse metabolic transformative capabilities important for the environmental biodegradation of chemicals that helps maintain ecosystem and human health. Microbial biodegradative metabolism is the main focus of the University of Minnesota Biocatalysis/Biodegradation Database (UM-BBD). UM-BBD data have also been used to develop a computational metabolic pathway prediction system that can be applied to chemicals for which biodegradation data are currently lacking. The UM-Pathway Prediction System (UM-PPS) relies on metabolic rules that are based on organic functional groups and predicts plausible biodegradative metabolism. The predictions are useful to environmental chemists who look for metabolic intermediates, to regulators looking for potential toxic products, to microbiologists seeking to understand microbial biodegradation, and to others with a wide range of interests. PMID:22587916
Adding a nitrogen footprint to Colorado State University’s sustainability plan
Kimiecik, Jacob; Baron, Jill S.; Weinmann, Timothy; Taylor, Emily
2017-01-01
As a large land grant university with more than 32,000 students, Colorado State University has both on-campus non-agricultural and agricultural sources of nitrogen (N) released to the environment. We used the Nitrogen Footprint Tool to estimate the amount of N released from different sectors of the university for the CSU 2014 academic year. The largest on campus sources were food production, utilities (heating, cooling, electricity), and research animals. The total on-campus N footprint in 2014 was 287 metric tons. This value was equivalent to the nitrogen footprint of agricultural experiment stations and other agricultural facilities, whose nitrogen footprint was 273 metric tons. CSU has opportunities to reduce its on-campus footprint through educational programs promoting low-meat diets and commuting by bicycle or bus. There is also an opportunity to advance ideas of agricultural best management practices, including precision farming and better livestock management. This article describes the planned and ongoing efforts to educate CSU about how societal activities release nitrogen to the environment, contributing to global change. It offers personal and institutional options for taking action, which would ultimately reduce CSU’s excess reactive nitrogen loss to the environment. The N-footprint for CSU, including scenarios of possible future nitrogen reductions, is also discussed.
Recruitment Processes in Academia: Does the Emperor Have Any Clothes?
Ataie-Ashtiani, Behzad
2016-10-01
The final outcome of promotion and recruitment processes in universities should be seen as conventional and plausible by members of the relevant scientific community, to affirm that the processes have been competitive and fair. The objective of this opinion letter is to make a plea for the importance of post-auditing and quantitative assessment of the selection criteria. It is shown that, for an example case, the outcome of the post-audit does not look reasonable from an external point of view, at least regarding research competency.
Primordial perturbations in a rainbow universe with running Newton constant
NASA Astrophysics Data System (ADS)
Brighenti, Francesco; Gubitosi, Giulia; Magueijo, Joao
2017-03-01
We compute the spectral index of primordial perturbations in a rainbow universe. We allow the Newton constant G to run at (super-) Planckian energies and we consider both vacuum and thermal perturbations. If the rainbow metric is the one associated to a generalized Horava-Lifshitz dispersion relation, we find that only when G tends asymptotically to 0 can one match the observed value of the spectral index and solve the horizon problem, both for vacuum and thermal perturbations. For vacuum fluctuations the observational constraints imply that the primordial universe expansion can be both accelerating or decelerating, while in the case of thermal perturbations only decelerating expansion is allowed.
Evaluating community and campus environmental public health programs.
Pettibone, Kristianna G; Parras, Juan; Croisant, Sharon Petronella; Drew, Christina H
2014-01-01
The National Institute of Environmental Health Sciences' (NIEHS) Partnerships for Environmental Public Health (PEPH) program created the Evaluation Metrics Manual as a tool to help grantees understand how to map out their programs using a logic model, and to identify measures for documenting their achievements in environmental public health research. This article provides an overview of the manual, describing how grantees and community partners contributed to the manual, and how the basic components of a logic model can be used to identify metrics. We illustrate how the approach can be implemented, using a real-world case study from the University of Texas Medical Branch, where researchers worked with community partners to develop a network to address environmental justice issues.
Anti-de Sitter-space/conformal-field-theory Casimir energy for rotating black holes.
Gibbons, G W; Perry, M J; Pope, C N
2005-12-02
We show that, if one chooses the Einstein static universe as the metric on the conformal boundary of Kerr-anti-de Sitter spacetime, then the Casimir energy of the boundary conformal field theory can easily be determined. The result is independent of the rotation parameters, and the total boundary energy then straightforwardly obeys the first law of thermodynamics. Other choices for the metric on the conformal boundary will give different, more complicated, results. As an application, we calculate the Casimir energy for free self-dual tensor multiplets in six dimensions and compare it with that of the seven-dimensional supergravity dual. They differ by a factor of 5/4.
Primordial cosmology in mimetic Born-Infeld gravity
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bouhmadi-Lopez, Mariam; Chen, Che-Yu; Chen, Pisin
2017-11-29
Here, the Eddington-inspired Born-Infeld (EiBI) model is reformulated within the mimetic approach. In the presence of a mimetic field, the model contains non-trivial vacuum solutions which could be free of spacetime singularity because of the Born-Infeld nature of the theory. We study a realistic primordial vacuum universe and prove the existence of regular solutions, such as primordial inflationary solutions of de Sitter type or bouncing solutions. Besides, the linear instabilities present in the EiBI model are found to be avoidable for some interesting bouncing solutions in which the physical metric as well as the auxiliary metric are regular at the background level.
Volume weighting the measure of the universe from classical slow-roll expansion
NASA Astrophysics Data System (ADS)
Sloan, David; Silk, Joseph
2016-05-01
One of the most frustrating issues in early universe cosmology centers on how to reconcile the vast choice of universes generated by string theory and by its most plausible high-energy sibling, eternal inflation (which jointly produce the string landscape), with the fine-tuned and hence relatively small number of universes that have undergone a large expansion and can accommodate observers and, in particular, galaxies. We show that such observations are highly favored for any system whereby physical parameters are distributed at a high energy scale, due to the conservation of the Liouville measure and the gauge nature of volume, asymptotically approaching a period of large isotropic expansion characterized by w = -1. Our interpretation predicts that all observational probes for deviations from w = -1 in the foreseeable future are doomed to failure. The purpose of this paper is not to introduce a new measure for the multiverse, but rather to show how what is perhaps the most natural and well-known measure, volume weighting, arises as a consequence of the conservation of the Liouville measure on phase space during the classical slow-roll expansion.
Qualitative analysis of Kantowski-Sachs metric in a generic class of f(R) models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Leon, Genly; Roque, Armando A., E-mail: genly.leon@ucv.cl, E-mail: arestrada@ucf.edu.cu
2014-05-01
In this paper we investigate, from the dynamical systems perspective, the evolution of a Kantowski-Sachs metric in a generic class of f(R) models. We present conditions (i.e., differentiability conditions, existence of minima, monotony intervals, etc.) for a free input function related to the f(R) that guarantee the asymptotic stability of well-motivated physical solutions, especially self-accelerated solutions, allowing one to describe both inflationary and late-time acceleration stages of the cosmic evolution. We discuss which f(R) theories allow for a cosmic evolution with an acceptable matter era, in correspondence to the modern cosmological paradigm. We find a very rich behavior, and, amongst others, the universe can result in isotropized solutions with observables in agreement with observations, such as de Sitter, quintessence-like, or phantom solutions. Additionally, we find that a cosmological bounce and turnaround are realized in a part of the parameter space as a consequence of the metric choice.
Can rodents conceive hyperbolic spaces?
Urdapilleta, Eugenio; Troiani, Francesca; Stella, Federico; Treves, Alessandro
2015-01-01
The grid cells discovered in the rodent medial entorhinal cortex have been proposed to provide a metric for Euclidean space, possibly even hardwired in the embryo. Yet, one class of models describing the formation of grid unit selectivity is entirely based on developmental self-organization, and as such it predicts that the metric it expresses should reflect the environment to which the animal has adapted. We show that, according to self-organizing models, if raised in a non-Euclidean hyperbolic cage rats should be able to form hyperbolic grids. For a given range of grid spacing relative to the radius of negative curvature of the hyperbolic surface, such grids are predicted to appear as multi-peaked firing maps, in which each peak has seven neighbours instead of the Euclidean six, a prediction that can be tested in experiments. We thus demonstrate that a useful universal neuronal metric, in the sense of a multi-scale ruler and compass that remain unaltered when changing environments, can be extended to other than the standard Euclidean plane. PMID:25948611
Conformally non-flat spacetime representing dense compact objects
NASA Astrophysics Data System (ADS)
Singh, Ksh. Newton; Bhar, Piyali; Rahaman, Farook; Pant, Neeraj; Rahaman, Mansur
2017-06-01
A new conformally non-flat interior spacetime embedded in five-dimensional (5D) pseudo-Euclidean space is explored in this paper. We proceed with our calculation under the assumption of a spherically symmetric anisotropic matter distribution and the Karmarkar condition (the necessary condition for class one). This solution is free from geometrical singularity and well-behaved in all respects. We adopt an ansatz for a new type of metric potential g11 and solve for the metric potential g00 via the Karmarkar condition. Further, all the physical parameters are determined from Einstein’s field equations using the two metric potentials. All the constants of integration are determined using boundary conditions. Due to its conformally non-flat character, it can represent bounded configurations. Therefore, we have used it to model two compact stars, Vela X-1 and Cyg X-2. Indeed, the masses and radii of these two objects obtained from our solution are well matched with the observed values given in [T. Gangopadhyay et al., Mon. Not. R. Astron. Soc. 431, 3216 (2013)] and [J. Casares et al., Mon. Not. R. Astron. Soc. 401, 2517 (2010)]. The equilibrium of the models is investigated from the generalized TOV equation. We have adopted the method of [L. Herrera, Phys. Lett. A 165, 206 (1992)] and the static stability criterion of Harrison-Zeldovich-Novikov [B. K. Harrison et al., Gravitational Theory and Gravitational Collapse (University of Chicago Press, 1965); Ya. B. Zeldovich and I. D. Novikov, Relativistic Astrophysics, Vol. 1, Stars and Relativity (University of Chicago Press, 1971)] to analyze the stability of the models.
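The Karmarkar (class-one embedding) condition invoked in this abstract is a purely algebraic relation among Riemann tensor components. The form below is the one conventionally quoted in compact-star papers of this kind; it is reproduced here for orientation, not extracted from the paper itself:

```latex
R_{1414}\,R_{2323} = R_{1212}\,R_{3434} + R_{1224}\,R_{1334},
\qquad R_{2323} \neq 0 .
```

For a static spherically symmetric line element $ds^{2} = e^{\nu(r)}\,dt^{2} - e^{\lambda(r)}\,dr^{2} - r^{2}\,d\Omega^{2}$, this condition reduces to the ordinary differential equation

```latex
\frac{2\nu''}{\nu'} + \nu' = \frac{\lambda' e^{\lambda}}{e^{\lambda}-1},
\qquad e^{\lambda} \neq 1,
```

which is what lets $g_{00} = e^{\nu}$ be solved for once an ansatz for $g_{11} = e^{\lambda}$ is chosen, as the authors describe.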
Newton's absolute time and space in general relativity
NASA Astrophysics Data System (ADS)
Gautreau, Ronald
2000-04-01
I describe a reference system in a spherically symmetric gravitational field that is built around times recorded by radially moving geodesic clocks. The geodesic time coordinate t and the curvature spatial radial coordinate R result in spacetime descriptions of the motion of the geodesic clocks that are exactly identical with equations following from Newton's absolute time and space used with his inverse square law. I show how to use the resulting Newtonian/general-relativistic equations for geodesic clocks to generate exact relativistic metric forms in terms of the coordinates (R,t). Newtonian theory does not describe light. However, the motion of light can be determined from the (R,t) general-relativistic metric forms obtained from Newtonian theory by setting ds2(R,t)=0. In this sense, a theory of light can be related to absolute time and space of Newtonian gravitational theory. I illustrate the (R,t) methodology by first solving the equations that result from a Newtonian picture and then examining the exact metric forms for the general-relativistic problems of the Schwarzschild field, gravitational collapse and expansion of a zero-pressure perfect fluid, and zero-pressure big-bang cosmology. I also briefly describe other applications of the Newtonian/general-relativistic formulation to: embedding a Schwarzschild mass into cosmology; continuously following an expanding universe from radiation to matter domination; Dirac's Large Numbers hypothesis; the incompleteness of Kruskal-Szekeres spacetime; double valuedness in cosmology; and the de Sitter universe.
Bays, Rebecca B; Zabrucky, Karen M; Gagne, Phill
2012-01-01
In the current study we examined whether prevalence information and imagery encoding influence participants' general plausibility, personal plausibility, belief, and memory ratings for suggested childhood events. Results showed decreases in general and personal plausibility ratings for low prevalence events when encoding instructions were not elaborate; however, instructions to repeatedly imagine suggested events elicited personal plausibility increases for low-prevalence events, evidence that elaborate imagery negated the effect of our prevalence manipulation. We found no evidence of imagination inflation or false memory construction. We discuss critical differences in researchers' manipulations of plausibility and imagery that may influence results of false memory studies in the literature. In future research investigators should focus on the specific nature of encoding instructions when examining the development of false memories.
Influences of Social Capital on Health and Well-Being from Qualitative Approach
Yamaguchi, Ayano
2013-01-01
Social capital surrounding health and well-being, the way in which these function as multi-dimensional constructs, and the potential stability of relationships among social capital factors were examined across universities in Hawaii and Japan. Maintaining or strengthening social factors of collective and individual health and well-being is a core factor of social capital and is instrumental in reducing worry and increasing trust. Qualitative in-depth interviews with 64 male and female college students (32 college students at the University of Hawaii at Manoa; 32 college students at Reitaku University in Japan) were used to collect information on the social capital of health and well-being and associated concepts; students’ perceptions were grouped under 11 themes. The data indicate that social capital has an impact on college students’ health and well-being. They also suggest that differences in health status and well-being can be plausibly attributed to processes associated with socio-environmental circumstances and situations. PMID:23985117
Samatey, Fadel A; Matsunami, Hideyuki; Imada, Katsumi; Nagashima, Shigehiro; Shaikh, Tanvir R; Thomas, Dennis R; Chen, James Z; Derosier, David J; Kitao, Akio; Namba, Keiichi
2004-10-28
The bacterial flagellum is a motile organelle, and the flagellar hook is a short, highly curved tubular structure that connects the flagellar motor to the long filament acting as a helical propeller. The hook is made of about 120 copies of a single protein, FlgE, and its function as a nano-sized universal joint is essential for dynamic and efficient bacterial motility and taxis. It transmits the motor torque to the helical propeller over a wide range of its orientation for swimming and tumbling. Here we report a partial atomic model of the hook obtained by X-ray crystallography of FlgE31, a major proteolytic fragment of FlgE lacking unfolded terminal regions, and by electron cryomicroscopy and three-dimensional helical image reconstruction of the hook. The model reveals the intricate molecular interactions and a plausible switching mechanism for the hook to be flexible in bending but rigid against twisting for its universal joint function.
The metaphysics of quantum mechanics: Modal interpretations
NASA Astrophysics Data System (ADS)
Gluck, Stuart Murray
2004-11-01
This dissertation begins with the argument that a preferred way of doing metaphysics is through philosophy of physics. An understanding of quantum physics is vital to answering questions such as: What counts as an individual object in physical ontology? Is the universe fundamentally indeterministic? Are indiscernibles identical? This study explores how the various modal interpretations of quantum mechanics answer these sorts of questions; modal accounts are one of the two classes of interpretations along with so-called collapse accounts. This study suggests a new alternative within the class of modal views that yields a more plausible ontology, one in which the Principle of the Identity of Indiscernibles is necessarily true. Next, it shows that modal interpretations can consistently deny that the universe must be fundamentally indeterministic so long as they accept certain other metaphysical commitments: either a perfect initial distribution of states in the universe or some form of primitive dispositional properties. Finally, the study sketches out a future research project for modal interpretations based on developing quantified quantum logic.
Justifying group-specific common morality.
Strong, Carson
2008-01-01
Some defenders of the view that there is a common morality have conceived such morality as being universal, in the sense of extending across all cultures and times. Those who deny the existence of such a common morality often argue that the universality claim is implausible. Defense of common morality must take account of the distinction between descriptive and normative claims that there is a common morality. This essay considers these claims separately and identifies the nature of the arguments for each claim. It argues that the claim that there is a universal common morality in the descriptive sense has not been successfully defended to date. It maintains that the claim that there is a common morality in the normative sense need not be understood as universalist. This paper advocates the concept of group specific common morality, including country-specific versions. It suggests that both the descriptive and the normative claims that there are country-specific common moralities are plausible, and that a country-specific normative common morality could provide the basis for a country's bioethics.
Key Financial Metrics on Australia's Higher Education Sector. Selected Insights--April 2016
ERIC Educational Resources Information Center
Australian Government Tertiary Education Quality and Standards Agency, 2016
2016-01-01
The Tertiary Education Quality and Standards Agency (TEQSA) is committed to ensuring that stakeholders in Australia's higher education sector have access to relevant information to enable and better inform decision making. TEQSA recognises that there is little publicly available information on Australia's higher education sector beyond the university sector.…
Purdue Extended Campus: Transparency, Accountability, and Assessment in Strategic Planning
ERIC Educational Resources Information Center
Cunningham, Robin; Eddy, Michael; Pagano, Mark; Ncube, Lisa
2011-01-01
In 2002 President Martin Jischke initiated a new era in strategic planning at Purdue. Under his leadership, strategic planning became a centralized activity with unit plans aligned to the university plan. Strategic goals were designed to have maximum impact, which would be measurable through metrics. Strategic planning at Purdue would be an…
Short-Term Field Study Programs: A Holistic and Experiential Approach to Learning
ERIC Educational Resources Information Center
Long, Mary M.; Sandler, Dennis M.; Topol, Martin T.
2017-01-01
For business schools, AACSB and Middle States' call for more experiential learning is one reason to provide study abroad programs. Universities must attend to the demand for continuous improvement and employ metrics to benchmark and evaluate their relative standing among peer institutions. One such benchmark is the National Survey of Student…
Infants Prefer the Musical Meter of Their Own Culture: A Cross-Cultural Comparison
ERIC Educational Resources Information Center
Soley, Gaye; Hannon, Erin E.
2010-01-01
Infants prefer native structures such as familiar faces and languages. Music is a universal human activity containing structures that vary cross-culturally. For example, Western music has temporally regular metric structures, whereas music of the Balkans (e.g., Bulgaria, Macedonia, Turkey) can have both regular and irregular structures. We…
Using Enrollment Data to Predict Retention Rate
ERIC Educational Resources Information Center
Bingham, Melissa A.; Solverson, Natalie Walleser
2016-01-01
First- to second-year retention rates are one metric reported by colleges and universities to convey institutional success to a variety of external constituents. But how much of a retention rate is institutional inputs, and how much can be understood by examining student inputs? The authors utilize multi-year, multi-institutional data to examine…
Getting a Tenure-Track Faculty Position at a Teaching-Centered Research University
ERIC Educational Resources Information Center
Wilkens, Robert; Comfort, Kristen
2016-01-01
The goal of this article is to provide critical information to chemical engineers seeking a tenure-track faculty position within academia. We outline the application and submission process from start to finish, including a discussion on critical evaluation metrics sought by search committees. In addition, we highlight frequent mistakes made by…
Coherence between Text Comments and the Quantitative Ratings in the UK's National Student Survey
ERIC Educational Resources Information Center
Langan, A. Mark; Scott, Nick; Partington, Shobana; Oczujda, Agnieszka
2017-01-01
Institutions are understandably interested in the profile of their own reputations based upon publicly available data about student experiences. The UK's National Student Survey (NSS) metrics are integrated into several "Good University" calculations, whereas teaching teams most often use the survey's text comments to change practices,…
Too Little and Too Much Trust: Performance Measurement in Australian Higher Education
ERIC Educational Resources Information Center
Woelert, Peter; Yates, Lyn
2015-01-01
A striking feature of contemporary Australian higher education governance is the strong emphasis on centralized, template style, metric-based, and consequential forms of performance measurement. Such emphasis is indicative of a low degree of political trust among the central authorities in Australia in the intrinsic capacity of universities and…
Metrics and Science Monograph Collections at the Marston Science Library, University of Florida
ERIC Educational Resources Information Center
Leonard, Michelle F.; Haas, Stephanie C.; Kisling, Vernon N.
2010-01-01
As academic libraries are increasingly supported by a matrix of database functions, the use of data mining and visualization techniques offer significant potential for future collection development and service initiatives based on quantifiable data. While data collection techniques are still not standardized and results may be skewed because of…
The implausibility of Mendel's theory before 1900.
Orel, V
Attention is paid to the category of the plausibility of Mendel's terminology in formulating the research problem, in describing the experimental model and research method, and in explaining his theory in the historical context of the long-lasting enigma of generation, hybridization and heredity. The new research problem of heredity, derived from the enigma of generation, was plausible to the sheep breeders in Brno in 1836-1837, who also formulated the research question: what and how is inherited? But they did not find an approach to its experimental investigation. Later, in 1852, the research problem of heredity was formulated by the Göttingen University physiologist R. Wagner, who also outlined the method of crossing animals or artificial fertilization of plants for the investigation of the enigma of generation and heredity. But he could not carry out the recommended experiments at the University, and his proposal remained without echo. Mendel first mentioned the motivation for his research arising from plant breeding experience and then from the experiments with plant crossing by botanists. He delivered his lectures in Brno to a community of naturalists who paid attention to the appearance of hybrids in nature but were not interested in plant breeding. After describing the research model and experimental method, Mendel presented the sequence of hypotheses proved in experiments and explained the origin and development of hybrids and, at the same time, the mechanism of fertilization and of transmission of traits, which was heredity, though without using the term. The listeners of his lectures, and later the readers of his paper, did not understand his explanation. ...
NASA Astrophysics Data System (ADS)
Desmond, Timothy
In this dissertation I discern what Carl Jung calls the mandala image of the ultimate archetype of unity underlying and structuring cosmos and psyche by pointing out parallels between his transpersonal psychology and Stanford physicist Leonard Susskind's string theory. Despite his atheistic, materialistically reductionist interpretation of it, I demonstrate how Susskind's string theory of holographic information conservation at the event horizons of black holes, and the cosmic horizon of the universe, corroborates the following four topics about which Jung wrote: (1) his near-death experience of the cosmic horizon after a heart attack in 1944; (2) his equation relating psychic energy to mass, "Psyche=highest intensity in the smallest space" (1997, 162), which I translate into the equation, Psyche=Singularity; (3) his theory that the mandala, a circle or sphere with a central point, is the symbolic image of the ultimate archetype of unity through the union of opposites, which structures both cosmos and psyche, and which rises spontaneously from the collective unconscious to compensate a conscious mind torn by irreconcilable demands (1989, 334-335, 396-397); and (4) his theory of synchronicity. I argue that Susskind's inside-out black hole model of our Big Bang universe forms a geometrically perfect mandala: a central Singularity encompassed by a two-dimensional sphere which serves as a universal memory bank. Moreover, in precise fulfillment of Jung's theory, Susskind used that mandala to reconcile the notoriously incommensurable paradigms of general relativity and quantum mechanics, providing in the process a mathematically plausible explanation for Jung's near-death experience of his past, present, and future life simultaneously at the cosmic horizon. Finally, Susskind's theory also provides a plausible cosmological model to explain Jung's theory of synchronicity: meaningful coincidences may be tied together by strings at the cosmic horizon, from which they radiate inward as the holographic "movie" of our three-dimensional world.
SU-E-I-71: Quality Assessment of Surrogate Metrics in Multi-Atlas-Based Image Segmentation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhao, T; Ruan, D
Purpose: With the ever-growing data of heterogeneous quality, relevance assessment of atlases becomes increasingly critical for multi-atlas-based image segmentation. However, there is no universally recognized best relevance metric, and even a standard to compare amongst candidates remains elusive. This study, for the first time, designs a quantification to assess relevance metrics’ quality, based on a novel perspective of the metric as a surrogate for inferring the inaccessible oracle geometric agreement. Methods: We first develop an inference model to relate surrogate metrics in image space to the underlying oracle relevance metric in segmentation label space, with a monotonically non-decreasing function subject to random perturbations. Subsequently, we investigate model parameters to reveal key contributing factors to surrogates’ ability in prognosticating the oracle relevance value, for the specific task of atlas selection. Finally, we design an effective contrast-to-noise ratio (eCNR) to quantify surrogates’ quality based on insights from these analyses and empirical observations. Results: The inference model was specialized to a linear function with normally distributed perturbations, with the surrogate metric exemplified by several widely-used image similarity metrics, i.e., MSD/NCC/(N)MI. Surrogates’ behaviors in selecting the most relevant atlases were assessed under varying eCNR, showing that surrogates with high eCNR dominated those with low eCNR in retaining the most relevant atlases. In an end-to-end validation, NCC/(N)MI with eCNR of 0.12 resulted in statistically better segmentation, with mean DSC of about 0.85 and first and third quartiles of (0.83, 0.89), compared to MSD with eCNR of 0.10, mean DSC of 0.84, and first and third quartiles of (0.81, 0.89). Conclusion: The designed eCNR is capable of characterizing surrogate metrics’ quality in prognosticating the oracle relevance value. It has been demonstrated to be correlated with the performance of relevant atlas selection and ultimate label fusion.
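The inference model sketched in this abstract (a surrogate metric that is a monotone, here linear, function of the oracle relevance plus Gaussian perturbation) lends itself to a toy simulation. The sketch below is not the paper's eCNR, whose exact definition the abstract does not give; `cnr` is a generic slope-over-noise stand-in, and all function names are hypothetical.

```python
import random

random.seed(0)

def cnr(slope, noise_sd, signal_range=1.0):
    # Generic contrast-to-noise stand-in: how far the surrogate separates
    # the best and worst atlases, relative to the perturbation scale.
    return slope * signal_range / noise_sd

def top_decile_hit_rate(slope, noise_sd, n_atlases=200, n_trials=500):
    # Fraction of trials in which the atlas ranked best by the noisy
    # surrogate truly lies in the oracle's top 10%.
    hits = 0
    for _ in range(n_trials):
        oracle = [random.random() for _ in range(n_atlases)]
        surrogate = [slope * o + random.gauss(0.0, noise_sd) for o in oracle]
        best = max(range(n_atlases), key=surrogate.__getitem__)
        cutoff = sorted(oracle, reverse=True)[n_atlases // 10 - 1]
        hits += oracle[best] >= cutoff
    return hits / n_trials

# A surrogate with higher CNR retains the most relevant atlases more often,
# mirroring the abstract's finding that high-eCNR metrics dominate low-eCNR ones.
high_cnr_rate = top_decile_hit_rate(slope=1.0, noise_sd=0.05)
low_cnr_rate = top_decile_hit_rate(slope=1.0, noise_sd=1.0)
```

Under this toy model, shrinking the noise relative to the signal spread makes surrogate-based atlas selection markedly more reliable, which is the qualitative behavior the study quantifies.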
NASA Astrophysics Data System (ADS)
Membiela, Federico Agustín; Bellini, Mauricio
2010-02-01
Using a semiclassical approach to Gravitoelectromagnetic Inflation (GEMI), we study the origin and evolution of seminal inflaton and electromagnetic fields in the early inflationary universe from a 5D vacuum state. The difference from other previous works is that here we use a Lorentz gauge. Our formalism is naturally not conformally invariant on the effective 4D de Sitter metric, which makes possible the superadiabatic amplification of magnetic field modes on cosmological scales during the early inflationary epoch of the universe.
Bianchi type-VIh string cloud cosmological models with bulk viscosity
NASA Astrophysics Data System (ADS)
Tripathy, Sunil K.; Behera, Dipanjali
2010-11-01
String cloud cosmological models are studied using the spatially homogeneous and anisotropic Bianchi type VIh metric in the framework of general relativity. The field equations are solved for a massive string cloud in the presence of bulk viscosity. A general linear equation of state relating the cosmic string tension density to the proper energy density of the universe is considered. The physical and kinematical properties of the models are discussed in detail, and the limits of the anisotropy parameter responsible for different phases of the universe are explored.
Statistical estimation of ozone exposure metrics
NASA Astrophysics Data System (ADS)
Blankenship, Erin E.; Stefanski, L. A.
Data from recent experiments at North Carolina State University and other locations provide a unique opportunity to study the effect of ambient ozone on the growth of clover. The data consist of hourly ozone measurements over a 140-day growing season at eight sites in the US, coupled with clover growth response data measured every 28 days. The objective is to model an indicator of clover growth as a function of ozone exposure. A common strategy for dealing with the numerous hourly ozone measurements is to reduce them to a single summary measurement, a so-called exposure metric, for the growth period of interest. However, the mean ozone value is not necessarily the best summarization, as it is widely believed that low levels of ozone have a negligible effect on growth, whereas peak ozone values are deleterious to plant growth. There are also suspected interactions with available sunlight, temperature, and humidity. A number of exposure metrics have been proposed that reflect these beliefs by assigning different weights to ozone values according to magnitude, time of day, temperature, and humidity. These weighting schemes generally depend on parameters that have, to date, been subjectively determined. We propose a statistical approach based on profile likelihoods to estimate the parameters in these exposure metrics.
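One widely used weighting scheme of the kind described, which down-weights low ozone concentrations and gives peak values full weight, is the sigmoidal W126-style index. The sketch below is illustrative only; the parameter values are the standard W126 constants (for concentrations in ppm), not estimates from this study.

```python
import math

def sigmoid_weight(ozone_ppm, m=4403.0, a=126.0):
    # Sigmoidal weight (W126 form): near 0 for low concentrations,
    # near 1 for peak concentrations. ozone_ppm is in parts per million.
    return 1.0 / (1.0 + m * math.exp(-a * ozone_ppm))

def weighted_exposure(hourly_ozone_ppm):
    # Exposure metric: sum of weighted hourly concentrations over the
    # growth period, so peaks dominate the summary.
    return sum(c * sigmoid_weight(c) for c in hourly_ozone_ppm)
```

The profile-likelihood approach in the abstract would treat the constants m and a as free parameters to be estimated from the growth data rather than fixed a priori.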
Off-diagonal ekpyrotic scenarios and equivalence of modified, massive and/or Einstein gravity
NASA Astrophysics Data System (ADS)
Vacaru, Sergiu I.
2016-01-01
Using our anholonomic frame deformation method, we show how generic off-diagonal cosmological solutions, depending in general on all spacetime coordinates and undergoing a phase of ultra-slow contraction, can be constructed in massive gravity. In this paper, we find and study new classes of locally anisotropic and (in)homogeneous cosmological metrics with open and closed spatial geometries. The late-time acceleration is present due to effective cosmological terms induced by nonlinear off-diagonal interactions and the graviton mass. The off-diagonal cosmological metrics and related Stückelberg fields are constructed in explicit form up to nonholonomic frame transforms of the Friedmann-Lemaître-Robertson-Walker (FLRW) coordinates. We show that the solutions include matter, graviton mass, and other effective sources modeling nonlinear gravitational and matter field interactions in modified and/or massive gravity, with polarization of physical constants and deformations of metrics, which may explain certain dark energy and dark matter effects. We state and analyze the conditions under which such configurations mimic interesting solutions in general relativity and its modifications, and recast the general Painlevé-Gullstrand and FLRW metrics. Finally, we elaborate on a reconstruction procedure for a subclass of off-diagonal cosmological solutions which describe cyclic and ekpyrotic universes, with an emphasis on open issues and observable signatures.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Riyahi, S; Choi, W; Bhooshan, N
2016-06-15
Purpose: To compare linear and deformable registration methods for evaluation of tumor response to chemoradiation therapy (CRT) in patients with esophageal cancer. Methods: Linear and multi-resolution BSpline deformable registration were performed on pre- and post-CRT CT/PET images of 20 patients with esophageal cancer. For both registration methods, we registered CT using the Mean Square Error (MSE) metric; however, to register PET we used the transformation obtained using Mutual Information (MI) from the same CT, since that registration is multi-modality. Similarity of the warped CT/PET was quantitatively evaluated using Normalized Mutual Information (NMI), and plausibility of the deformation field (DF) was assessed using inverse consistency error. To evaluate tumor response, four groups of tumor features were examined: (1) conventional PET/CT features, e.g., SUV and diameter; (2) clinical parameters, e.g., TNM stage and histology; (3) spatial-temporal PET features that describe intensity, texture, and geometry of the tumor; (4) all features combined. Dominant features were identified using 10-fold cross-validation, and a Support Vector Machine (SVM) was deployed for tumor response prediction, with accuracy evaluated by ROC Area Under the Curve (AUC). Results: The mean and standard deviation of NMI for deformable registration using the MSE metric were 0.2±0.054, versus 0.1±0.026 for linear registration, showing higher NMI for deformable registration. Likewise for the MI metric, deformable registration had 0.13±0.035 compared to its linear counterpart with 0.12±0.037. Inverse consistency error for deformable registration with the MSE metric was 4.65±2.49, versus 1.32±2.3 for linear registration, showing a smaller value for linear registration. The same conclusion was obtained for MI in terms of inverse consistency error. AUC for both linear and deformable registration was 1, showing no difference in terms of response evaluation. Conclusion: Deformable registration showed better NMI than linear registration; however, the inverse consistency error was lower for linear registration. We do not expect to see a significant difference when warping PET images using deformable or linear registration. This work was supported in part by National Cancer Institute Grant R01CA172638.
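The NMI score used above to compare warped images has a standard definition, (H(A) + H(B)) / H(A, B) over the joint intensity histogram (Studholme's formulation). The sketch below is a generic pure-Python illustration over binned intensity lists, not the study's implementation.

```python
import math
from collections import Counter

def entropy(counts, total):
    # Shannon entropy (in nats) of a histogram given as bin counts.
    return -sum((n / total) * math.log(n / total) for n in counts if n)

def normalized_mutual_information(a, b):
    # NMI = (H(A) + H(B)) / H(A, B); a and b are equally sized lists
    # of binned intensities. Ranges from 1.0 (independent images) to
    # 2.0 (one image fully predicts the other).
    n = len(a)
    ha = entropy(Counter(a).values(), n)
    hb = entropy(Counter(b).values(), n)
    hab = entropy(Counter(zip(a, b)).values(), n)
    return (ha + hb) / hab if hab else 2.0
```

Unlike MSE, this score is insensitive to a relabeling of intensities, which is why an MI-family metric is the usual choice for multi-modality (CT-to-PET) registration.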
Filatov, Gleb; Bauwens, Bruno; Kertész-Farkas, Attila
2018-05-07
Bioinformatics studies often rely on similarity measures between sequence pairs, which often pose a bottleneck in large-scale sequence analysis. Here, we present a new convolutional kernel function for protein sequences called the LZW-Kernel. It is based on code words identified with the Lempel-Ziv-Welch (LZW) universal text compressor. The LZW-Kernel is an alignment-free method; it is symmetric, positive, always yields 1.0 for self-similarity, and can be used directly with Support Vector Machines (SVMs) in classification problems, contrary to the normalized compression distance (NCD), which often violates the distance metric properties in practice and requires further techniques to be used with SVMs. The LZW-Kernel is a one-pass algorithm, which makes it particularly suitable for big data applications. Our experimental studies on remote protein homology detection and protein classification tasks reveal that the LZW-Kernel closely approaches the performance of the Local Alignment Kernel (LAK) and the SVM-pairwise method combined with Smith-Waterman (SW) scoring at a fraction of the time. Moreover, the LZW-Kernel outperforms the SVM-pairwise method when combined with BLAST scores, which indicates that the LZW code words might be a better basis for similarity measures than local alignment approximations found with BLAST. In addition, the LZW-Kernel outperforms n-gram based mismatch kernels, hidden Markov model based SAM and Fisher kernel, and protein family based PSI-BLAST, among others. Further advantages include the LZW-Kernel's reliance on a simple idea, its ease of implementation, and its high speed: three times faster than BLAST and several orders of magnitude faster than SW or LAK in our tests. LZW-Kernel is implemented as standalone C code, is a free open-source program distributed under the GPLv3 license, and can be downloaded from https://github.com/kfattila/LZW-Kernel. akerteszfarkas@hse.ru. Supplementary data are available at Bioinformatics Online.
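The core idea, extracting LZW code words in a single pass and comparing the resulting sets, can be sketched as follows. This is an illustrative set-overlap similarity with the properties the abstract lists (symmetric, non-negative, 1.0 for self-similarity), not the authors' exact kernel formula.

```python
def lzw_codewords(seq):
    # One pass of LZW dictionary building: start from the single
    # symbols of seq and add each new phrase the first time it is seen.
    dictionary = set(seq)
    w = ""
    for c in seq:
        wc = w + c
        if wc in dictionary:
            w = wc          # extend the current phrase
        else:
            dictionary.add(wc)  # new code word discovered
            w = c
    return dictionary

def lzw_kernel(x, y):
    # Jaccard-style overlap of LZW code word sets: symmetric,
    # in [0, 1], and exactly 1.0 for self-similarity.
    cx, cy = lzw_codewords(x), lzw_codewords(y)
    return len(cx & cy) / len(cx | cy)
```

Because the code word sets are built in one pass per sequence, pairwise similarity reduces to cheap set operations, which is the source of the speed advantage over alignment-based scoring.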
The performance of disk arrays in shared-memory database machines
NASA Technical Reports Server (NTRS)
Katz, Randy H.; Hong, Wei
1993-01-01
In this paper, we examine how disk arrays and shared-memory multiprocessors lead to an effective method for constructing database machines for general-purpose complex query processing. We show that disk arrays can lead to cost-effective storage systems if they are configured from suitably small form-factor disk drives. We introduce the storage system metric "data temperature" as a way to evaluate how well a disk configuration can sustain its workload, and we show that disk arrays can sustain the same data temperature as a more expensive mirrored-disk configuration. We use the metric to evaluate the performance of disk arrays in XPRS, an operational shared-memory multiprocessor database system being developed at the University of California, Berkeley.
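Data temperature is conventionally the sustained I/O rate per unit of storage capacity. The sketch below, with entirely hypothetical drive counts, I/O rates, and capacities, shows why an array of many small drives sustains a higher temperature than a mirrored pair of large drives.

```python
def data_temperature(ios_per_second, capacity_gb):
    # Data temperature: sustained I/Os per second per gigabyte stored.
    return ios_per_second / capacity_gb

# Hypothetical comparison at 70 I/Os per second per spindle:
# eight 1 GB small-form-factor drives vs. a mirrored pair of 4 GB drives.
array_temp = data_temperature(8 * 70, 8 * 1.0)
mirror_temp = data_temperature(2 * 70, 2 * 4.0)
```

With the same per-spindle I/O rate, temperature scales with spindles per gigabyte, so many small drives dominate few large ones for I/O-bound workloads.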
Initial conditions of inhomogeneous universe and the cosmological constant problem
NASA Astrophysics Data System (ADS)
Totani, Tomonori
2016-06-01
Deriving the Einstein field equations (EFE) with matter fluid from the action principle is not straightforward, because mass conservation must be added as an additional constraint to make rest-frame mass density variable in reaction to metric variation. This can be avoided by introducing a constraint δ(√−g) = 0 on metric variations δg_μν, and then the cosmological constant Λ emerges as an integration constant. This is a removal of one of the four constraints on initial conditions forced by EFE at the birth of the universe, and it may imply that EFE are unnecessarily restrictive about initial conditions. I then adopt a principle that the theory of gravity should be able to solve time evolution starting from arbitrary inhomogeneous initial conditions about spacetime and matter. The equations of gravitational fields satisfying this principle are obtained, by setting four auxiliary constraints on δg_μν to extract six degrees of freedom for gravity. The cost of achieving this is a loss of general covariance, but these equations constitute a consistent theory if they hold in the special coordinate systems that can be uniquely specified with respect to the initial space-like hypersurface when the universe was born. This theory predicts that gravity is described by EFE with non-zero Λ in a homogeneous patch of the universe created by inflation, but Λ changes continuously across different patches. Then both the smallness and coincidence problems of the cosmological constant are solved by the anthropic argument. This is just a result of inhomogeneous initial conditions, not requiring any change of the fundamental physical laws in different patches.
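How Λ emerges as an integration constant under the constraint δ(√−g) = 0 can be sketched with the standard unimodular-gravity argument; this is a generic derivation consistent with the abstract, not the paper's own notation.

```latex
% Varying the Einstein-Hilbert action under the constraint
% \delta(\sqrt{-g}) = 0 restricts the metric variations to be
% trace-free, so only the trace-free field equations follow:
R_{\mu\nu} - \tfrac{1}{4} g_{\mu\nu} R
  = 8\pi G \left( T_{\mu\nu} - \tfrac{1}{4} g_{\mu\nu} T \right) .
% Taking the divergence, with \nabla^\mu (R_{\mu\nu} - \tfrac{1}{2} g_{\mu\nu} R) = 0
% and \nabla^\mu T_{\mu\nu} = 0, gives \partial_\nu ( R + 8\pi G\, T ) = 0,
% so  R + 8\pi G\, T = 4\Lambda  for some integration constant \Lambda.
% Substituting back recovers the full EFE with a cosmological constant:
R_{\mu\nu} - \tfrac{1}{2} g_{\mu\nu} R + \Lambda g_{\mu\nu}
  = 8\pi G\, T_{\mu\nu} .
```

Because Λ enters only as a constant of integration fixed by initial data, it can take different values in causally separate patches, which is the mechanism the abstract appeals to.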
Wave computation on the Poincaré dodecahedral space
NASA Astrophysics Data System (ADS)
Bachelot-Motet, Agnès
2013-12-01
We compute the waves propagating on a compact 3-manifold of constant positive curvature with a non-trivial topology: the Poincaré dodecahedral space, a plausible model of a multi-connected universe. We transform the Cauchy problem to a mixed problem posed on a fundamental domain determined by the quaternionic calculus. We adopt a variational approach using a space of finite elements that is invariant under the action of the binary icosahedral group. The computation of the transient waves is validated against their spectral analysis by computing a large number of eigenvalues of the Laplace-Beltrami operator.
Introducing the Balanced Scorecard: Creating Metrics to Measure Performance
ERIC Educational Resources Information Center
Gumbus, Andra
2005-01-01
This experiential exercise presents the concept of the Balanced Scorecard (BSC) and applies it in a university setting. Developed 12 years ago, the Balanced Scorecard has grown in popularity and is used by more than 50% of the Fortune 500 companies as a performance measurement and strategic management tool. The BSC expands the traditional…
ERIC Educational Resources Information Center
Campbell, Hilary L.; Barry, Carol L.; Joe, Jilliam N.; Finney, Sara J.
2008-01-01
There has been growing interest in comparing achievement goal orientations across ethnic groups. Such comparisons, however, cannot be made until validity evidence has been collected to support the use of an achievement goal orientation instrument for that purpose. Therefore, this study investigates the measurement invariance of a particular…
USF Sarasota-Manatee Work Plan Presentation for 2014-15 Board of Governors Review
ERIC Educational Resources Information Center
Board of Governors, State University System of Florida, 2014
2014-01-01
The State University System of Florida has developed three tools that aid in guiding the System's future: (1) The Board of Governors' new "Strategic Plan 2012-2025" is driven by goals and associated metrics that stake out where the System is headed; (2) The Board's "Annual Accountability Report" provides yearly tracking for how…
New College of Florida Work Plan Presentation for 2014-15 Board of Governors Review
ERIC Educational Resources Information Center
Board of Governors, State University System of Florida, 2014
2014-01-01
The State University System of Florida has developed three tools that aid in guiding the System's future: (1) The Board of Governors' new "Strategic Plan 2012-2025" is driven by goals and associated metrics that stake out where the System is headed; (2) The Board's "Annual Accountability Report" provides yearly tracking for how…
New College of Florida Work Plan Presentation for 2013-14 Board of Governors Review
ERIC Educational Resources Information Center
Board of Governors, State University System of Florida, 2013
2013-01-01
The State University System of Florida has developed three tools that aid in guiding the System's future: (1) The Board of Governors' new "Strategic Plan 2012-2025" is driven by goals and associated metrics that stake out where the System is headed; (2) The Board's "Annual Accountability Report" provides yearly tracking for how…
New College of Florida Work Plan Presentation for 2012-13 Board of Governors Review
ERIC Educational Resources Information Center
Board of Governors, State University System of Florida, 2012
2012-01-01
The State University System of Florida has developed three tools that aid in guiding the System's future: (1) The Board of Governors' new "Strategic Plan 2012-2025" is driven by goals and associated metrics that stake out where the System is headed; (2) The Board's "Annual Accountability Report" provides yearly tracking for how…
ERIC Educational Resources Information Center
Saunders, Daniel B.; Blanco Ramírez, Gerardo
2017-01-01
In this article, the notion of excellence in relation to teaching is removed from its privileged place in order to render it, and its implications, for analysis. We argue that teaching excellence needs to be understood in the larger context of the neoliberal university in which competition is taken for granted, and therefore, metrics for…
An Instructional System in Physical Science, Teacher's Guide and Keys.
ERIC Educational Resources Information Center
Washington State Univ., Pullman.
This manual is a teacher's guide to a self-instructional program in basic physical science, designed for high school students who have not had a course in chemistry or physics. There are six units in the manual relating to these areas: problem solving and experimental procedures; universal standards, metric system and conversion; mechanics; the…
Information Operations & Security
2012-03-05
Fred B. Schneider, Cornell: The Promise of Security Metrics. • Users: purchasing decisions; which system is the better value? • Builders ... Engineering, University of Maryland, College Park. DISTRIBUTION A: Approved for public release; distribution is unlimited. Digital Multimedia Anti...fingerprints for multimedia content: • determine the time and place of recordings • detect tampering in the multimedia content; bind video and
Academic Sell-Out: How an Obsession with Metrics and Rankings Is Damaging Academia
ERIC Educational Resources Information Center
Gruber, Thorsten
2014-01-01
Increasingly, academics have to demonstrate that their research has academic impact. Universities normally use journal rankings and journal impact factors to assess the research impact of individual academics. More recently, citation counts for individual articles and the h-index have also been used to measure the academic impact of academics.…
ERIC Educational Resources Information Center
Stephen, Timothy D.
2011-01-01
The problem of how to rank academic journals in the communication field (human interaction, mass communication, speech, and rhetoric) is one of practical importance to scholars, university administrators, and librarians, yet there is no methodology that covers the field's journals comprehensively and objectively. This article reports a new ranking…
ERIC Educational Resources Information Center
Collins, Francis L.; Park, Gil-Sung
2016-01-01
Over the last two decades, enumeration has become a critical force in crafting the governmentalities of globalizing higher education. Whether in the glossy Web sites and documentation of the world's "top universities" or in more fine-tuned regional and subject guides, accreditation schemes, journal metrics or h-indexes, technologies for…
Local sensitivities of the gulf stream separation
Schoonover, Joseph; Dewar, William K.; Wienders, Nicolas; ...
2016-12-05
Robust and accurate Gulf Stream separation remains an unsolved problem in general circulation modeling whose resolution will positively impact the ocean and climate modeling communities. Oceanographic literature does not face a shortage of plausible hypotheses that attempt to explain the dynamics of the Gulf Stream separation, yet a single theory that the community agrees on is missing. We investigate the impact of the Deep Western Boundary Current, coastline curvature, and continental shelf steepening on the Gulf Stream separation within regional configurations of the MIT General Circulation Model. Artificial modifications to the regional bathymetry are introduced to investigate the sensitivity of the separation to each of these factors. Metrics for subsurface separation detection confirm the direct link between flow separation and the surface expression of the Gulf Stream in the Mid-Atlantic Bight. Conversely, the Gulf Stream separation exhibits minimal sensitivity to the presence of the DWBC and coastline curvature. The implications of these results to the development of a “separation recipe” for ocean modeling are discussed. Furthermore, we conclude adequate topographic resolution is a necessary, but not sufficient, condition for proper Gulf Stream separation.
Moore, Ashlee A; Neale, Michael C; Silberg, Judy L; Verhulst, Brad
2016-01-01
Depression is a highly heterogeneous condition, and identifying how symptoms present in various groups may greatly increase our understanding of its etiology. Importantly, Major Depressive Disorder is strongly linked with Substance Use Disorders, which may ameliorate or exacerbate specific depression symptoms. It is therefore quite plausible that depression may present with different symptom profiles depending on an individual's substance use status. Given these observations, it is important to examine the underlying construct of depression in groups of substance users compared to non-users. In this study we use a non-clinical sample to examine the measurement structure of the Beck Depression Inventory (BDI-II) in non-users and frequent-users of various substances. Specifically, measurement invariance was examined across those who do vs. do not use alcohol, nicotine, and cannabis. Results indicate strict factorial invariance across non-users and frequent-users of alcohol and cannabis, and metric invariance across non-users and frequent-users of nicotine. This implies that the factor structure of the BDI-II is similar across all substance use groups.
Ren, Jiaping; Wang, Xinjie; Manocha, Dinesh
2016-01-01
We present a biologically plausible dynamics model to simulate swarms of flying insects. Our formulation, which is based on biological conclusions and experimental observations, is designed to simulate large insect swarms of varying densities. We use a force-based model that captures different interactions between the insects and the environment and computes collision-free trajectories for each individual insect. Furthermore, we model the noise as a constructive force at the collective level and present a technique to generate noise-induced insect movements in a large swarm that are similar to those observed in real-world trajectories. We use a data-driven formulation that is based on pre-recorded insect trajectories. We also present a novel evaluation metric and a statistical validation approach that takes into account various characteristics of insect motions. In practice, the combination of a curl noise function with our dynamics model is used to generate realistic swarm simulations and emergent behaviors. We highlight its performance for simulating large flying swarms of midges, fruit flies, locusts, and moths and demonstrate many collective behaviors, including aggregation, migration, phase transition, and escape responses. PMID:27187068
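A force-based swarm update of the kind described accumulates per-agent forces and integrates them each timestep. The toy sketch below uses centroid attraction plus uniform random noise as a stand-in for the constructive noise term (the paper uses a curl-noise function, not reproduced here); all parameter values are hypothetical.

```python
import random

def swarm_step(positions, velocities, dt=0.1, k_center=0.5, noise=0.05):
    # One Euler step of a toy force-based swarm model: each agent feels
    # an attraction toward the swarm centroid plus a small random
    # "noise force" acting as a constructive perturbation.
    n = len(positions)
    cx = sum(p[0] for p in positions) / n
    cy = sum(p[1] for p in positions) / n
    for i, ((x, y), (vx, vy)) in enumerate(zip(positions, velocities)):
        fx = k_center * (cx - x) + random.uniform(-noise, noise)
        fy = k_center * (cy - y) + random.uniform(-noise, noise)
        vx, vy = vx + dt * fx, vy + dt * fy
        velocities[i] = (vx, vy)
        positions[i] = (x + dt * vx, y + dt * vy)
    return positions, velocities
```

A full model would add pairwise repulsion for collision avoidance and fit the force terms to the pre-recorded trajectories mentioned in the abstract.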
Identifying collaborative care teams through electronic medical record utilization patterns.
Chen, You; Lorenzi, Nancy M; Sandberg, Warren S; Wolgast, Kelly; Malin, Bradley A
2017-04-01
The goal of this investigation was to determine whether automated approaches can learn patient-oriented care teams via utilization of an electronic medical record (EMR) system. To perform this investigation, we designed a data-mining framework that relies on a combination of latent topic modeling and network analysis to infer patterns of collaborative teams. We applied the framework to the EMR utilization records of over 10 000 employees and 17 000 inpatients at a large academic medical center during a 4-month window in 2010. Next, we conducted an extrinsic evaluation of the patterns to determine the plausibility of the inferred care teams via surveys with knowledgeable experts. Finally, we conducted an intrinsic evaluation to contextualize each team in terms of collaboration strength (via a cluster coefficient) and clinical credibility (via associations between teams and patient comorbidities). The framework discovered 34 collaborative care teams, 27 (79.4%) of which were confirmed as administratively plausible. Of those, 26 teams depicted strong collaborations, with a cluster coefficient > 0.5. There were 119 diagnostic conditions associated with 34 care teams. Additionally, to provide clarity on how the survey respondents arrived at their determinations, we worked with several oncologists to develop an illustrative example of how a certain team functions in cancer care. Inferred collaborative teams are plausible; translating such patterns into optimized collaborative care will require administrative review and integration with management practices. EMR utilization records can be mined for collaborative care patterns in large complex medical centers.
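The collaboration-strength score used above, the local clustering coefficient, measures how many of a node's neighbor pairs are themselves connected. The sketch below is a generic graph-theory illustration (adjacency given as a dict of neighbor sets), not the study's pipeline.

```python
def clustering_coefficient(adj, node):
    # Local clustering coefficient: fraction of pairs of node's
    # neighbors that are themselves linked; adj maps node -> set of
    # neighbors. Returns 0.0 for nodes with fewer than 2 neighbors.
    nbrs = adj[node]
    k = len(nbrs)
    if k < 2:
        return 0.0
    links = sum(1 for u in nbrs for v in nbrs if u < v and v in adj[u])
    return 2.0 * links / (k * (k - 1))
```

A value above 0.5 (the paper's threshold for "strong collaboration") means most colleagues of a team member also work with each other, i.e., the inferred team is cohesive rather than hub-and-spoke.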
Pilgrims sailing the Titanic: plausibility effects on memory for misinformation.
Hinze, Scott R; Slaten, Daniel G; Horton, William S; Jenkins, Ryan; Rapp, David N
2014-02-01
People rely on information they read even when it is inaccurate (Marsh, Meade, & Roediger, Journal of Memory and Language 49:519-536, 2003), but how ubiquitous is this phenomenon? In two experiments, we investigated whether this tendency to encode and rely on inaccuracies from text might be influenced by the plausibility of misinformation. In Experiment 1, we presented stories containing inaccurate plausible statements (e.g., "The Pilgrims' ship was the Godspeed"), inaccurate implausible statements (e.g., . . . the Titanic), or accurate statements (e.g., . . . the Mayflower). On a subsequent test of general knowledge, participants relied significantly less on implausible than on plausible inaccuracies from the texts but continued to rely on accurate information. In Experiment 2, we replicated these results with the addition of a think-aloud procedure to elicit information about readers' noticing and evaluative processes for plausible and implausible misinformation. Participants indicated more skepticism and less acceptance of implausible than of plausible inaccuracies. In contrast, they often failed to notice, completely ignored, and at times even explicitly accepted the misinformation provided by plausible lures. These results offer insight into the conditions under which reliance on inaccurate information occurs and suggest potential mechanisms that may underlie reported misinformation effects.
Phillips, Lawrence; Pearl, Lisa
2015-11-01
The informativity of a computational model of language acquisition is directly related to how closely it approximates the actual acquisition task, sometimes referred to as the model's cognitive plausibility. We suggest that though every computational model necessarily idealizes the modeled task, an informative language acquisition model can aim to be cognitively plausible in multiple ways. We discuss these cognitive plausibility checkpoints generally and then apply them to a case study in word segmentation, investigating a promising Bayesian segmentation strategy. We incorporate cognitive plausibility by using an age-appropriate unit of perceptual representation, evaluating the model output in terms of its utility, and incorporating cognitive constraints into the inference process. Our more cognitively plausible model shows a beneficial effect of cognitive constraints on segmentation performance. One interpretation of this effect is as a synergy between the naive theories of language structure that infants may have and the cognitive constraints that limit the fidelity of their inference processes, where less accurate inference approximations are better when the underlying assumptions about how words are generated are less accurate. More generally, these results highlight the utility of incorporating cognitive plausibility more fully into computational models of language acquisition.
Kellman, Philip J; Mnookin, Jennifer L; Erlikhman, Gennady; Garrigan, Patrick; Ghose, Tandra; Mettler, Everett; Charlton, David; Dror, Itiel E
2014-01-01
Latent fingerprint examination is a complex task that, despite advances in image processing, still fundamentally depends on the visual judgments of highly trained human examiners. Fingerprints collected from crime scenes typically contain less information than fingerprints collected under controlled conditions. Specifically, they are often noisy and distorted and may contain only a portion of the total fingerprint area. Expertise in fingerprint comparison, like other forms of perceptual expertise, such as face recognition or aircraft identification, depends on perceptual learning processes that lead to the discovery of features and relations that matter in comparing prints. Relatively little is known about the perceptual processes involved in making comparisons, and even less is known about what characteristics of fingerprint pairs make particular comparisons easy or difficult. We measured expert examiner performance and judgments of difficulty and confidence on a new fingerprint database. We developed a number of quantitative measures of image characteristics and used multiple regression techniques to discover objective predictors of error as well as perceived difficulty and confidence. A number of useful predictors emerged, and these included variables related to image quality metrics, such as intensity and contrast information, as well as measures of information quantity, such as the total fingerprint area. Also included were configural features that fingerprint experts have noted, such as the presence and clarity of global features and fingerprint ridges. Within the constraints of the overall low error rates of experts, a regression model incorporating the derived predictors demonstrated reasonable success in predicting objective difficulty for print pairs, as shown both in goodness of fit measures to the original data set and in a cross validation test. 
The results indicate the plausibility of using objective image metrics to predict expert performance and subjective assessment of difficulty in fingerprint comparisons.
Health in the Arab world: a view from within
Batniji, Rajaie; Khatib, Lina; Cammett, Melani; Sweet, Jeffrey; Basu, Sanjay; Jamal, Amaney; Wise, Paul; Giacaman, Rita
2014-01-01
Since late 2010, the Arab world has entered a tumultuous period of change, with populations demanding more inclusive and accountable government. The region is characterised by weak political institutions, which exclude large proportions of their populations from political representation and government services. Building on work in political science and economics, we assess the extent to which the quality of governance, or the extent of electoral democracy, relates to adult, infant, and maternal mortality, and to the perceived accessibility and improvement of health services. We compiled a dataset from the World Bank, WHO, Institute for Health Metrics and Evaluation, Arab Barometer Survey, and other sources to measure changes in demographics, health status, and governance in the Arab World from 1980 to 2010. We suggest an association between more effective government and average reductions in mortality in this period; however, there does not seem to be any relation between the extent of democracy and mortality reductions. The movements for changing governance in the region threaten access to services in the short term, forcing migration and increasing the vulnerability of some populations. In view of the patterns observed in the available data, and the published literature, we suggest that efforts to improve government effectiveness and to reduce corruption are more plausibly linked to population health improvements than are efforts to democratise. However, these patterns are based on restricted mortality data, leaving out subjective health metrics, quality of life, and disease-specific data. To better guide efforts to transform political and economic institutions, more data are needed for health-care access, health-care quality, health status, and access to services of marginalised groups. PMID:24452043
Quantifying species recovery and conservation success to develop an IUCN Green List of Species.
Akçakaya, H Resit; Bennett, Elizabeth L; Brooks, Thomas M; Grace, Molly K; Heath, Anna; Hedges, Simon; Hilton-Taylor, Craig; Hoffmann, Michael; Keith, David A; Long, Barney; Mallon, David P; Meijaard, Erik; Milner-Gulland, E J; Rodrigues, Ana S L; Rodriguez, Jon Paul; Stephenson, P J; Stuart, Simon N; Young, Richard P
2018-03-26
Stopping declines in biodiversity is critically important, but it is only a first step toward achieving more ambitious conservation goals. The absence of an objective and practical definition of species recovery that is applicable across taxonomic groups leads to inconsistent targets in recovery plans and frustrates reporting and maximization of conservation impact. We devised a framework for comprehensively assessing species recovery and conservation success. We propose a definition of a fully recovered species that emphasizes viability, ecological functionality, and representation; and use counterfactual approaches to quantify degree of recovery. This allowed us to calculate a set of 4 conservation metrics that demonstrate impacts of conservation efforts to date (conservation legacy); identify dependence of a species on conservation actions (conservation dependence); quantify expected gains resulting from conservation action in the medium term (conservation gain); and specify requirements to achieve maximum plausible recovery over the long term (recovery potential). These metrics can incentivize the establishment and achievement of ambitious conservation targets. We illustrate their use by applying the framework to a vertebrate, an invertebrate, and a woody and an herbaceous plant. Our approach is a preliminary framework for an International Union for Conservation of Nature (IUCN) Green List of Species, which was mandated by a resolution of IUCN members in 2012. Although there are several challenges in applying our proposed framework to a wide range of species, we believe its further development, implementation, and integration with the IUCN Red List of Threatened Species will help catalyze a positive and ambitious vision for conservation that will drive sustained conservation action. © 2018 The Authors. Conservation Biology published by Wiley Periodicals, Inc. on behalf of Society for Conservation Biology.
Governance and health in the Arab world.
Batniji, Rajaie; Khatib, Lina; Cammett, Melani; Sweet, Jeffrey; Basu, Sanjay; Jamal, Amaney; Wise, Paul; Giacaman, Rita
2014-01-25
Since late 2010, the Arab world has entered a tumultuous period of change, with populations demanding more inclusive and accountable government. The region is characterised by weak political institutions, which exclude large proportions of their populations from political representation and government services. Building on work in political science and economics, we assess the extent to which the quality of governance, or the extent of electoral democracy, relates to adult, infant, and maternal mortality, and to the perceived accessibility and improvement of health services. We compiled a dataset from the World Bank, WHO, Institute for Health Metrics and Evaluation, Arab Barometer Survey, and other sources to measure changes in demographics, health status, and governance in the Arab World from 1980 to 2010. We suggest an association between more effective government and average reductions in mortality in this period; however, there does not seem to be any relation between the extent of democracy and mortality reductions. The movements for changing governance in the region threaten access to services in the short term, forcing migration and increasing the vulnerability of some populations. In view of the patterns observed in the available data, and the published literature, we suggest that efforts to improve government effectiveness and to reduce corruption are more plausibly linked to population health improvements than are efforts to democratise. However, these patterns are based on restricted mortality data, leaving out subjective health metrics, quality of life, and disease-specific data. To better guide efforts to transform political and economic institutions, more data are needed for health-care access, health-care quality, health status, and access to services of marginalised groups. Copyright © 2014 Elsevier Ltd. All rights reserved.
IDEAL characterization of isometry classes of FLRW and inflationary spacetimes
NASA Astrophysics Data System (ADS)
Canepa, Giovanni; Dappiaggi, Claudio; Khavkine, Igor
2018-02-01
In general relativity, an IDEAL (Intrinsic, Deductive, Explicit, ALgorithmic) characterization of a reference spacetime metric g_0 consists of a set of tensorial equations T[g] = 0, constructed covariantly out of the metric g, its Riemann curvature and their derivatives, that are satisfied if and only if g is locally isometric to the reference spacetime metric g_0. The same notion can be extended to also include scalar or tensor fields, where the equations T[g, φ] = 0 are allowed to also depend on the extra fields φ. We give the first IDEAL characterization of cosmological FLRW spacetimes, with and without a dynamical scalar (inflaton) field. We restrict our attention to what we call regular geometries, which uniformly satisfy certain identities or inequalities. They roughly split into the following natural special cases: constant curvature spacetime, Einstein static universe, and flat or curved spatial slices. We also briefly comment on how the solution of this problem has implications, in general relativity and inflation theory, for the construction of local gauge invariant observables for linear cosmological perturbations and for stability analysis.
How likely are constituent quanta to initiate inflation?
Berezhiani, Lasha; Trodden, Mark
2015-08-06
In this study, we propose an intuitive framework for studying the problem of initial conditions in slow-roll inflation. In particular, we consider a universe at high, but sub-Planckian energy density and analyze the circumstances under which it is plausible for it to become dominated by inflated patches at late times, without appealing to the idea of self-reproduction. Our approach is based on defining a prior probability distribution for the constituent quanta of the pre-inflationary universe. To test the idea that inflation can begin under very generic circumstances, we make specific – yet quite general and well-grounded – assumptions on the prior distribution. As a result, we are led to the conclusion that the probability for a given region to ignite inflation at sub-Planckian densities is extremely small. Furthermore, if one chooses to use the enormous volume factor that inflation yields as an appropriate measure, we find that the regions of the universe which started inflating at densities below the self-reproductive threshold nevertheless occupy a negligible physical volume in the present universe as compared to those domains that have never inflated.
Plausibility Judgments in Conceptual Change and Epistemic Cognition
ERIC Educational Resources Information Center
Lombardi, Doug; Nussbaum, E. Michael; Sinatra, Gale M.
2016-01-01
Plausibility judgments rarely have been addressed empirically in conceptual change research. Recent research, however, suggests that these judgments may be pivotal to conceptual change about certain topics where a gap exists between what scientists and laypersons find plausible. Based on a philosophical and empirical foundation, this article…
Can scholarship in nursing/midwifery education result in a successful research career?
Cooper, Simon; Absalom, Irene; Cant, Robyn; Bogossian, Fiona; Kelly, Michelle; Levett-Jones, Tracy; McKenna, Lisa
2018-05-07
In a recent editorial we examined the research outputs of 150 Australian nursing and midwifery professors (McKenna, Cooper, Cant, Bogossian, 2017) identifying publication metrics on par with, and sometimes above those of professors in the UK (Watson, McDonagh & Thompson, 2016). Because global university rankings are heavily weighted towards research, there has been pressure on universities and on academics to maximise research performance (Nguyen, Rambaldi & Tang, 2017). Although many Australian universities have increasingly focused on education delivery, and despite the need for a strong evidence base for learning and teaching, academics are often cautioned against focusing too heavily on educational research. This article is protected by copyright. All rights reserved.
Did Martian Meteorites Come From These Sources?
NASA Astrophysics Data System (ADS)
Martel, L. M. V.
2007-01-01
Large rayed craters on Mars, not immediately obvious in visible light, have been identified in thermal infrared data obtained from the Thermal Emission Imaging System (THEMIS) onboard Mars Odyssey. Livio Tornabene (previously at the University of Tennessee, Knoxville and now at the University of Arizona, Tucson) and colleagues have mapped rayed craters primarily within young (Amazonian) volcanic plains in or near Elysium Planitia. They found that rays consist of numerous chains of secondary craters, their overlapping ejecta, and possibly primary ejecta from the source crater. Their work also suggests rayed craters may have formed preferentially in volatile-rich targets by oblique impacts. The physical details of the rayed craters and the target surfaces combined with current models of Martian meteorite delivery and cosmochemical analyses of Martian meteorites lead Tornabene and coauthors to conclude that these large rayed craters are plausible source regions for Martian meteorites.
Universal properties of mythological networks
NASA Astrophysics Data System (ADS)
Mac Carron, Pádraig; Kenna, Ralph
2012-07-01
As in statistical physics, the concept of universality plays an important, albeit qualitative, role in the field of comparative mythology. Here we apply statistical mechanical tools to analyse the networks underlying three iconic mythological narratives with a view to identifying common and distinguishing quantitative features. Of the three narratives, an Anglo-Saxon and a Greek text are mostly believed by antiquarians to be partly historically based, while the third, an Irish epic, is often considered to be fictional. Here we use network analysis in an attempt to discriminate real from imaginary social networks and place mythological narratives on the spectrum between them. This suggests that the perceived artificiality of the Irish narrative can be traced back to anomalous features associated with six characters. Speculating that these are amalgams of several entities or proxies renders the plausibility of the Irish text comparable to the others from a network-theoretic point of view.
Sedlmeier, Peter; Jaeger, Sonia
2007-01-01
We explored how well common theories about the impact of post-event information on memories explain recollections that occur naturally in university students' study routines. Instead of starting from a familiar research paradigm, such as those used in hindsight-bias research, the present study used a situation common to university students, and examined how well three candidate explanations--judgemental anchoring, implicit theories of change, and motivational influences--could explain the results we obtained in a long-term memory study that included three sessions, six months apart. We found that about two thirds of the memories of study-related issues were indeed biased, and that the impact of post-event information being used as an anchor is the most plausible explanation for the results. There were also some indications that memory biases might have been due, at least in part, to motivational factors.
Source Effects and Plausibility Judgments When Reading about Climate Change
ERIC Educational Resources Information Center
Lombardi, Doug; Seyranian, Viviane; Sinatra, Gale M.
2014-01-01
Gaps between what scientists and laypeople find plausible may act as a barrier to learning complex and/or controversial socioscientific concepts. For example, individuals may consider scientific explanations that human activities are causing current climate change as implausible. This plausibility judgment may be due-in part-to individuals'…
Plausibility and Perspective Influence the Processing of Counterfactual Narratives
ERIC Educational Resources Information Center
Ferguson, Heather J.; Jayes, Lewis T.
2018-01-01
Previous research has established that readers' eye movements are sensitive to the difficulty with which a word is processed. One important factor that influences processing is the fit of a word within the wider context, including its plausibility. Here we explore the influence of plausibility in counterfactual language processing. Counterfactuals…
Spatial interpolation of river channel topography using the shortest temporal distance
NASA Astrophysics Data System (ADS)
Zhang, Yanjun; Xian, Cuiling; Chen, Huajin; Grieneisen, Michael L.; Liu, Jiaming; Zhang, Minghua
2016-11-01
It is difficult to interpolate river channel topography due to complex anisotropy. Because this anisotropy is largely caused by river flow, in particular the hydrodynamic and transport mechanisms, it is reasonable to incorporate flow velocity into the topography interpolator to reduce its effect. In this study, two new distance metrics, defined as the time taken by water flow to travel between two locations, are developed to replace the spatial distance metric, or Euclidean distance, currently used to interpolate topography. One is the shortest temporal distance (STD) metric: the temporal distance (TD) of a path between two nodes is calculated as spatial distance divided by the tangent component of flow velocity along the path, and the STD is found with the Dijkstra algorithm over all possible paths between the two nodes. The other is a modified shortest temporal distance (MSTD) metric, in which both the tangent and normal components of flow velocity are combined. Both are used to construct methods for interpolating river channel topography. The proposed methods are applied to the Wuhan section of the Changjiang River and compared with Universal Kriging (UK) and Inverse Distance Weighting (IDW). The results clearly show that the STD and MSTD based on flow velocity are reliable spatial interpolators, with the MSTD, followed by the STD, improving prediction accuracy relative to both UK and IDW.
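The core idea, a Dijkstra search over edge weights equal to spatial length divided by the tangential flow component, can be sketched as follows. This is a minimal illustration with hypothetical node coordinates and flow vectors; the paper's actual channel graph, velocity field, and treatment of near-zero tangential flow are not reproduced here.

```python
import heapq
import math

def temporal_distance(p, q, velocity):
    """Temporal distance of edge p -> q: spatial length divided by the
    tangential component of the flow velocity (vx, vy) along the edge."""
    dx, dy = q[0] - p[0], q[1] - p[1]
    length = math.hypot(dx, dy)
    # Tangential component of the flow velocity along the edge direction.
    vt = abs(velocity[0] * dx + velocity[1] * dy) / length
    eps = 1e-6  # floor to avoid division by zero where flow is normal to the edge
    return length / max(vt, eps)

def shortest_temporal_distance(nodes, edges, flow, src):
    """Dijkstra over temporal (not spatial) edge weights.
    nodes: {id: (x, y)}, edges: {id: [neighbor ids]}, flow: {(u, v): (vx, vy)}."""
    dist = {n: math.inf for n in nodes}
    dist[src] = 0.0
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist[u]:
            continue
        for v in edges.get(u, []):
            w = temporal_distance(nodes[u], nodes[v], flow[(u, v)])
            if d + w < dist[v]:
                dist[v] = d + w
                heapq.heappush(pq, (d + w, v))
    return dist
```

A slow direct edge can thus lose to a detour through faster-flowing reaches, which is exactly the anisotropy the STD metric is meant to capture.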
Skidmore, Jessica R.; Murphy, James G.; Martens, Matthew P.
2014-01-01
The aims of the current study were to examine the associations among behavioral economic measures of alcohol value derived from three distinct measurement approaches, and to evaluate their respective relations with traditional indicators of alcohol problem severity in college drinkers. Five behavioral economic metrics were derived from hypothetical demand curves that quantify reward value by plotting consumption and expenditures as a function of price, another metric measured proportional behavioral allocation and enjoyment related to alcohol versus other activities, and a final metric measured relative discretionary expenditures on alcohol. The sample included 207 heavy drinking college students (53% female) who were recruited through an on-campus health center or university courses. Factor analysis revealed that the alcohol valuation construct comprises two factors: one factor that reflects participants’ levels of alcohol price sensitivity (demand persistence), and a second factor that reflects participants’ maximum consumption and monetary and behavioral allocation towards alcohol (amplitude of demand). The demand persistence and behavioral allocation metrics demonstrated the strongest and most consistent multivariate relations with alcohol-related problems, even when controlling for other well-established predictors. The results suggest that behavioral economic indices of reward value show meaningful relations with alcohol problem severity in young adults. Despite the presence of some gender differences, these measures appear to be useful problem indicators for men and women. PMID:24749779
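For illustration, standard demand-curve indices of the kind used in such studies (intensity, Omax, Pmax, breakpoint) can be computed from hypothetical purchase-task responses. The prices and quantities below are invented for the sketch; they are not the study's data or its exact five metrics.

```python
# Hypothetical alcohol purchase-task responses: drinks a respondent reports
# buying at each price. Illustrative values only.
prices = [0.0, 0.5, 1.0, 2.0, 4.0, 8.0]
drinks = [10, 9, 8, 6, 3, 0]

expenditure = [p * q for p, q in zip(prices, drinks)]  # spending at each price

intensity = drinks[0]                          # consumption when drinks are free
o_max = max(expenditure)                       # peak expenditure (Omax)
p_max = prices[expenditure.index(o_max)]       # price at peak expenditure (Pmax)
breakpoint_price = next(p for p, q in zip(prices, drinks) if q == 0)
```

Price sensitivity (the "demand persistence" factor) shows up in how quickly `drinks` falls as `prices` rises; `intensity` and `o_max` correspond to the amplitude-of-demand factor.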
EnviroAtlas - Total reptile species by 12-digit HUC for the conterminous United States
This EnviroAtlas dataset was produced by a joint effort of New Mexico State University, US Environmental Protection Agency (US EPA,) and the U.S. Geological Survey (USGS) to support research and online mapping activities related to EnviroAtlas. Ecosystem services, i.e., services provided to humans from ecological systems have become a key issue of this century in resource management, conservation planning, and environmental decision analysis. Mapping and quantifying ecosystem services have become strategic national interests for integrating ecology with economics to help understand the effects of human policies and actions and their subsequent impacts on both ecosystem function and human well-being. Some aspects of biodiversity are valued by humans in varied ways, and thus are important to include in any assessment that seeks to identify and quantify the benefits of ecosystems to humans. Some biodiversity metrics clearly reflect ecosystem services (e.g., abundance and diversity of harvestable species), whereas others may reflect indirect and difficult to quantify relationships to services (e.g., relevance of species diversity to ecosystem resilience, cultural and aesthetic values). Wildlife habitat has been modeled at broad spatial scales and can be used to map a number of biodiversity metrics. We map 15 biodiversity metrics reflecting ecosystem services or other aspects of biodiversity for all vertebrate species except fish. Metrics include species richness fo
This EnviroAtlas dataset contains biodiversity metrics reflecting ecosystem services or other aspects of biodiversity for reptile species, based on the number of reptile species as measured by predicted habitat present within a pixel. These metrics were created from grouping national level single species habitat models created by the USGS Gap Analysis Program into smaller ecologically based, phylogeny based, or stakeholder suggested composites. The dataset includes reptile species richness metrics for all reptile species, lizards, snakes, turtles, poisonous reptiles, Natureserve-listed G1,G2, and G3 reptile species, and reptile species listed by IUCN (International Union for Conservation of Nature), PARC (Partners in Amphibian and Reptile Conservation) and SWPARC (Southwest Partners in Amphibian and Reptile Conservation). This dataset was produced by a joint effort of New Mexico State University, US EPA, and USGS to support research and online mapping activities related to EnviroAtlas. EnviroAtlas (https://www.epa.gov/enviroatlas) allows the user to interact with a web-based, easy-to-use, mapping application to view and analyze multiple ecosystem services for the contiguous United States. The dataset is available as downloadable data (https://edg.epa.gov/data/Public/ORD/EnviroAtlas) or as an EnviroAtlas map service. Additional descriptive information about each attribute in this dataset can be found in its associated EnviroAtlas Fact Sheet (https://www.epa
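The richness metrics described, counts of species whose predicted habitat overlaps a pixel, amount to summing binary habitat grids. A toy sketch with made-up 2x2 grids (not actual GAP habitat models):

```python
import numpy as np

# Hypothetical per-species binary habitat grids (1 = predicted habitat present).
# In EnviroAtlas these come from USGS GAP single-species models; here we fake three.
lizard = np.array([[1, 0], [1, 1]])
snake  = np.array([[0, 0], [1, 1]])
turtle = np.array([[1, 1], [0, 1]])

# Species richness per pixel is the count of species whose habitat overlaps it.
richness = np.stack([lizard, snake, turtle]).sum(axis=0)
# richness == [[2, 1], [2, 3]]
```

Composite metrics (e.g., snakes only, or IUCN-listed species) are the same sum restricted to the relevant subset of species layers.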
Measuring the Performance and Intelligence of Systems: Proceedings of the 2002 PerMIS Workshop
NASA Technical Reports Server (NTRS)
Messina, E. R.; Meystel, A. M.
2002-01-01
Contents include the following: Performance Metrics; Performance of Multiple Agents; Performance of Mobility Systems; Performance of Planning Systems; General Discussion Panel 1; Uncertainty of Representation I; Performance of Robots in Hazardous Domains; Modeling Intelligence; Modeling of Mind; Measuring Intelligence; Grouping: A Core Procedure of Intelligence; Uncertainty in Representation II; Towards Universal Planning/Control Systems.
ERIC Educational Resources Information Center
Alem, Jaouad; Boudreau-Lariviere, Celine
2012-01-01
The objective of the present study is to analyze four metric qualities of an assessment grid for internship placements used by professionals to evaluate a sample of 110 Franco-Ontarian student interns registered between 2006 and 2009 at Laurentian University in the School of Human Kinetics. The evaluation grid was composed of 26 criteria. The four…
Peer Review in Class: Metrics and Variations in a Senior Course
ERIC Educational Resources Information Center
Yankulov, Krassimir; Couto, Richard
2012-01-01
Peer reviews are the generally accepted mode of quality assessment in scholarly communities; however, they are rarely used for evaluation at college levels. Over a period of 5 years, we have performed a peer review simulation at a senior level course in molecular genetics at the University of Guelph and have accumulated 393 student peer reviews.…
ERIC Educational Resources Information Center
Gunn, Andrew
2018-01-01
The creation of the Teaching Excellence Framework (TEF) represents a significant development concerning the teaching mission of the university in the UK. This paper considers the background to, and the development of, the TEF. It explains the context from which the TEF emerged and unpacks a series of rationales which illustrate the need for, and…
2012-01-01
University of Maryland, Office of Research Administration & Advancement, College Park MD 20742-5100. Armed with these metrics, the Undns ruleset is better revised: vestigial rules are removed or demoted for maintenance, and redundant locations are distinguished.

Characterizing granular networks using topological metrics
NASA Astrophysics Data System (ADS)
Dijksman, Joshua A.; Kovalcinova, Lenka; Ren, Jie; Behringer, Robert P.; Kramar, Miroslav; Mischaikow, Konstantin; Kondic, Lou
2018-04-01
We carry out a direct comparison of experimental and numerical realizations of the exact same granular system as it undergoes shear jamming. We adjust the numerical methods used to optimally represent the experimental settings and outcomes up to microscopic contact force dynamics. Measures presented here range from microscopic through mesoscopic to systemwide characteristics of the system. Topological properties of the mesoscopic force networks provide a key link between microscales and macroscales. We report two main findings: (1) The number of particles in the packing that have at least two contacts is a good predictor for the mechanical state of the system, regardless of strain history and packing density. All measures explored in both experiments and numerics, including stress-tensor-derived measures and contact numbers depend in a universal manner on the fraction of nonrattler particles, fNR. (2) The force network topology also tends to show this universality, yet the shape of the master curve depends much more on the details of the numerical simulations. In particular we show that adding force noise to the numerical data set can significantly alter the topological features in the data. We conclude that both fNR and topological metrics are useful measures to consider when quantifying the state of a granular system.
FLRW Cosmology from Yang-Mills Gravity
NASA Astrophysics Data System (ADS)
Katz, Daniel
2013-04-01
We extend to basic cosmology the subject of Yang-Mills gravity - a theory of gravity based on local translational gauge invariance in flat spacetime. It has been shown that this particular gauge invariance leads to tensor factors in the macroscopic limit of the equations of motion of particles which play the same role as the metric tensor of General Relativity. The assumption that this "effective metric" tensor takes on the standard FLRW form is our starting point. Equations analogous to the Friedman equations are derived and then solved in closed form for the three special cases of a universe dominated by 1) matter, 2) radiation, and 3) dark energy. We find that the solutions for the scale factor are similar to, but distinct from, those found in the corresponding GR based treatment.
Mapping Phylogenetic Trees to Reveal Distinct Patterns of Evolution.
Kendall, Michelle; Colijn, Caroline
2016-10-01
Evolutionary relationships are frequently described by phylogenetic trees, but a central barrier in many fields is the difficulty of interpreting data containing conflicting phylogenetic signals. We present a metric-based method for comparing trees which extracts distinct alternative evolutionary relationships embedded in data. We demonstrate detection and resolution of phylogenetic uncertainty in a recent study of anole lizards, leading to alternate hypotheses about their evolutionary relationships. We use our approach to compare trees derived from different genes of Ebolavirus and find that the VP30 gene has a distinct phylogenetic signature composed of three alternatives that differ in the deep branching structure. phylogenetics, evolution, tree metrics, genetics, sequencing. © The Author 2016. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.
Some Early Ideas on the Metric Geometry of Thermodynamics
NASA Astrophysics Data System (ADS)
Ruppeiner, George
2016-11-01
It is a pleasure to write for this 90th anniversary volume of Journal of Low Temperature Physics dedicated to Horst Meyer at Duke University. I was a PhD student with Horst in the period 1975-1980, working in experimental low temperature physics. While in Horst's group, I also did a theoretical physics project on the side. This project in the metric geometry of thermodynamics was motivated by my work in Horst's lab, and helped me to understand the theory of critical phenomena, very much in play in Horst's lab. In this paper, I explain the essence of my theory project and give a few accounts of its future development, focussing on topics where I interacted with Horst. I pay particular attention to the pure fluid critical point.
Semantic and Plausibility Preview Benefit Effects in English: Evidence from Eye Movements
Schotter, Elizabeth R.; Jia, Annie
2016-01-01
Theories of preview benefit in reading hinge on integration across saccades and the idea that preview benefit is greater the more similar the preview and target are. Schotter (2013) reported preview benefit from a synonymous preview, but it is unclear whether this effect occurs because of similarity between the preview and target (integration), or because of contextual fit of the preview—synonyms satisfy both accounts. Studies in Chinese have found evidence for preview benefit for words that are unrelated to the target, but are contextually plausible (Yang, Li, Wang, Slattery, & Rayner, 2014; Yang, Wang, Tong, & Rayner, 2012), which is incompatible with an integration account but supports a contextual fit account. Here, we used plausible and implausible unrelated previews in addition to plausible synonym, antonym, and identical previews to further investigate these accounts for readers of English. Early reading measures were shorter for all plausible preview conditions compared to the implausible preview condition. In later reading measures, a benefit for the plausible unrelated preview condition was not observed. In a second experiment, we asked questions that probed whether the reader encoded the preview or target. Readers were more likely to report the preview when they had skipped the word and not regressed to it, and when the preview was plausible. Thus, under certain circumstances, the preview word is processed to a high level of representation (i.e., semantic plausibility) regardless of its relationship to the target, but its influence on reading is relatively short-lived, being replaced by the target word, when fixated. PMID:27123754
Günther, Fritz; Marelli, Marco
2016-01-01
Noun compounds, consisting of two nouns (the head and the modifier) that are combined into a single concept, differ in terms of their plausibility: school bus is a more plausible compound than saddle olive. The present study investigates which factors influence the plausibility of attested and novel noun compounds. Distributional Semantic Models (DSMs) are used to obtain formal (vector) representations of word meanings, and compositional methods in DSMs are employed to obtain such representations for noun compounds. From these representations, different plausibility measures are computed. Three of those measures contribute in predicting the plausibility of noun compounds: The relatedness between the meaning of the head noun and the compound (Head Proximity), the relatedness between the meaning of modifier noun and the compound (Modifier Proximity), and the similarity between the head noun and the modifier noun (Constituent Similarity). We find non-linear interactions between Head Proximity and Modifier Proximity, as well as between Modifier Proximity and Constituent Similarity. Furthermore, Constituent Similarity interacts non-linearly with the familiarity with the compound. These results suggest that a compound is perceived as more plausible if it can be categorized as an instance of the category denoted by the head noun, if the contribution of the modifier to the compound meaning is clear but not redundant, and if the constituents are sufficiently similar in cases where this contribution is not clear. Furthermore, compounds are perceived to be more plausible if they are more familiar, but mostly for cases where the relation between the constituents is less clear. PMID:27732599
Raykar, Neha; Nigam, Aditi; Chisholm, Dan
2016-01-01
Schizophrenia remains a priority condition in mental health policy and service development because of its early onset, severity and consequences for affected individuals and households. This paper reports on an 'extended' cost-effectiveness analysis (ECEA) for schizophrenia treatment in India, which seeks to evaluate through a modeling approach not only the costs and health effects of intervention but also the consequences of a policy of universal public finance (UPF) on health and financial outcomes across income quintiles. Using plausible values for input parameters, we conclude that health gains from UPF are concentrated among the poorest, whereas the non-health gains in the form of out-of-pocket private expenditures averted due to UPF are concentrated among the richest income quintiles. Value of insurance is the highest for the poorest quintile and declines with income. Universal public finance can play a crucial role in ameliorating the adverse economic and social consequences of schizophrenia and its treatment in resource-constrained settings where health insurance coverage is generally poor. This paper shows the potential distributional and financial risk protection effects of treating schizophrenia.
Waite, Ian R.; Brown, Larry R.; Kennen, Jonathan G.; May, Jason T.; Cuffney, Thomas F.; Orlando, James L.; Jones, Kimberly A.
2010-01-01
The successful use of macroinvertebrates as indicators of stream condition in bioassessments has led to heightened interest throughout the scientific community in the prediction of stream condition. For example, predictive models are increasingly being developed that use measures of watershed disturbance, including urban and agricultural land-use, as explanatory variables to predict various metrics of biological condition such as richness, tolerance, percent predators, index of biotic integrity, functional species traits, or even ordination axes scores. Our primary intent was to determine if effective models could be developed using watershed characteristics of disturbance to predict macroinvertebrate metrics among disparate and widely separated ecoregions. We aggregated macroinvertebrate data from universities and state and federal agencies in order to assemble stream data sets of high enough density appropriate for modeling in three distinct ecoregions in Oregon and California. Extensive review and quality assurance of macroinvertebrate sampling protocols, laboratory subsample counts and taxonomic resolution was completed to assure data comparability. We used widely available digital coverages of land-use and land-cover data summarized at the watershed and riparian scale as explanatory variables to predict macroinvertebrate metrics commonly used by state resource managers to assess stream condition. The “best” multiple linear regression models from each region required only two or three explanatory variables to model macroinvertebrate metrics and explained 41–74% of the variation. In each region the best model contained some measure of urban and/or agricultural land-use, yet often the model was improved by including a natural explanatory variable such as mean annual precipitation or mean watershed slope. 
Two macroinvertebrate metrics were common among all three regions, the metric that summarizes the richness of tolerant macroinvertebrates (RICHTOL) and some form of EPT (Ephemeroptera, Plecoptera, and Trichoptera) richness. Best models were developed for the same two invertebrate metrics even though the geographic regions reflect distinct differences in precipitation, geology, elevation, slope, population density, and land-use. With further development, models like these can be used to elicit better causal linkages to stream biological attributes or condition and can be used by researchers or managers to predict biological indicators of stream condition at unsampled sites.
ERIC Educational Resources Information Center
Gauld, Colin
1998-01-01
Reports that many students do not believe Newton's law of action and reaction and suggests ways in which its plausibility might be enhanced. Reviews how this law has been made more plausible over time by Newton and those who succeeded him. Contains 25 references. (DDR)
Plausibility Reappraisals and Shifts in Middle School Students' Climate Change Conceptions
ERIC Educational Resources Information Center
Lombardi, Doug; Sinatra, Gale M.; Nussbaum, E. Michael
2013-01-01
Plausibility is a central but under-examined topic in conceptual change research. Climate change is an important socio-scientific topic; however, many view human-induced climate change as implausible. When learning about climate change, students need to make plausibility judgments but they may not be sufficiently critical or reflective. The…
Initial conditions of inhomogeneous universe and the cosmological constant problem
DOE Office of Scientific and Technical Information (OSTI.GOV)
Totani, Tomonori, E-mail: totani@astron.s.u-tokyo.ac.jp
Deriving the Einstein field equations (EFE) with matter fluid from the action principle is not straightforward, because mass conservation must be added as an additional constraint to make rest-frame mass density variable in reaction to metric variation. This can be avoided by introducing a constraint δ(√−g) = 0 on metric variations δg^μν, and then the cosmological constant Λ emerges as an integration constant. This is a removal of one of the four constraints on initial conditions forced by EFE at the birth of the universe, and it may imply that EFE are unnecessarily restrictive about initial conditions. I then adopt a principle that the theory of gravity should be able to solve time evolution starting from arbitrary inhomogeneous initial conditions about spacetime and matter. The equations of gravitational fields satisfying this principle are obtained by setting four auxiliary constraints on δg^μν to extract six degrees of freedom for gravity. The cost of achieving this is a loss of general covariance, but these equations constitute a consistent theory if they hold in the special coordinate systems that can be uniquely specified with respect to the initial space-like hypersurface when the universe was born. This theory predicts that gravity is described by EFE with non-zero Λ in a homogeneous patch of the universe created by inflation, but Λ changes continuously across different patches. Then both the smallness and coincidence problems of the cosmological constant are solved by the anthropic argument. This is just a result of inhomogeneous initial conditions, not requiring any change of the fundamental physical laws in different patches.
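The constraint δ(√−g) = 0 is the defining move of unimodular gravity, in which Λ does appear as an integration constant. A sketch of the standard textbook argument (the general mechanism only, not this paper's own derivation, which modifies the constraint structure further):

```latex
% Varying S = \frac{1}{2\kappa}\int R\sqrt{-g}\,d^4x + S_m subject to
% \delta(\sqrt{-g}) = 0, i.e. g_{\mu\nu}\,\delta g^{\mu\nu} = 0,
% yields only the trace-free part of the field equations:
R_{\mu\nu} - \tfrac{1}{4}R\,g_{\mu\nu}
  = \kappa\left(T_{\mu\nu} - \tfrac{1}{4}T\,g_{\mu\nu}\right).
% Taking the divergence, the Bianchi identity
% \nabla^{\mu}(R_{\mu\nu} - \tfrac{1}{2}R\,g_{\mu\nu}) = 0 together with
% \nabla^{\mu}T_{\mu\nu} = 0 gives \partial_{\nu}(R + \kappa T) = 0, hence
R + \kappa T = 4\Lambda \qquad \text{(constant of integration)},
% and substituting back recovers the full Einstein equations with \Lambda:
R_{\mu\nu} - \tfrac{1}{2}R\,g_{\mu\nu} + \Lambda\,g_{\mu\nu} = \kappa\,T_{\mu\nu}.
```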
Standard rulers, candles, and clocks from the low-redshift universe.
Heavens, Alan; Jimenez, Raul; Verde, Licia
2014-12-12
We measure the length of the baryon acoustic oscillation (BAO) feature, and the expansion rate of the recent Universe, from low-redshift data only, almost model independently. We make only the following minimal assumptions: homogeneity and isotropy, a metric theory of gravity, a smooth expansion history, and the existence of standard candles (supernovæ) and a standard BAO ruler. The rest is determined by the data, which are compilations of recent BAO and type IA supernova results. Making only these assumptions, we find for the first time that the standard ruler has a length of 103.9±2.3h⁻¹ Mpc. The value is a measurement, in contrast to the model-dependent theoretical prediction determined with model parameters set by Planck data (99.3±2.1h⁻¹ Mpc). The latter assumes the cold dark matter model with a cosmological constant, and that the ruler is the sound horizon at radiation drag. Adding passive galaxies as standard clocks or a local Hubble constant measurement allows the absolute BAO scale to be determined (142.8±3.7 Mpc), and in the former case the additional information makes the BAO length determination more precise (101.9±1.9h⁻¹ Mpc). The inverse curvature radius of the Universe is weakly constrained and consistent with zero, independently of the gravity model, provided it is metric. We find the effective number of relativistic species to be N(eff)=3.53±0.32, independent of late-time dark energy or gravity physics.
Creating a mission-based reporting system at an academic health center.
Howell, Lydia Pleotis; Hogarth, Michael; Anders, Thomas F
2002-02-01
The authors developed a Web-based mission-based reporting (MBR) system for their university's (UC Davis's) health system to report faculty members' activities in research and creative work, clinical service, education, and community/university service. They developed the system over several years (1998-2001) in response to a perceived need to better define faculty members' productivity for faculty development, financial management, and program assessment. The goal was to create a measurement tool that could be used by department chairs to counsel faculty on their performances. The MBR system provides measures of effort for each of the university's four missions. Departments or the school can use the output to better define expenditures and allocations of resources. The system provides both a quantitative metric of times spent on various activities within each mission, and a qualitative metric for the effort expended. The authors report the process of developing the MBR system and making it applicable for both clinical and basic science departments, and the mixed success experienced in its implementation. The system appears to depict the activities of most faculty fairly accurately, and chairs of test departments have been generally enthusiastic. However, resistance to general implementation remains, chiefly due to concerns about reliability, validity, and time required for completing the report. The authors conclude that MBR can be useful but will require some streamlining and the elimination of other redundant reporting instruments. A well-defined purpose is required to motivate its use.
Schloneger, Matthew; Hunter, Eric
2016-01-01
The multiple social and performance demands placed on college/university singers could put their still developing voices at risk. Previous ambulatory monitoring studies have analyzed the duration, intensity, and frequency (in Hz) of voice use among such students. Nevertheless, no studies to date have incorporated the simultaneous acoustic voice quality measures into the acquisition of these measures to allow for direct comparison during the same voicing period. Such data could provide greater insight into how young singers use their voices, as well as identify potential correlations between vocal dose and acoustic changes in voice quality. The purpose of this study was to assess the voice use and estimated voice quality of college/university singing students (18–24 y/o, N = 19). Ambulatory monitoring was conducted over three full, consecutive weekdays measuring voice from an unprocessed accelerometer signal measured at the neck. From this signal were analyzed traditional vocal dose metrics such as phonation percentage, dose time, cycle dose, and distance dose. Additional acoustic measures included perceived pitch, pitch strength, LTAS slope, alpha ratio, dB SPL 1–3 kHz, and harmonic-to-noise ratio. Major findings from more than 800 hours of recording indicated that among these students (a) higher vocal doses correlated significantly with greater voice intensity, more vocal clarity and less perturbation; and (b) there were significant differences in some acoustic voice quality metrics between non-singing, solo singing and choral singing. PMID:26897545
NASA Astrophysics Data System (ADS)
Horecka, Hannah M.; Thomas, Andrew C.; Weatherbee, Ryan A.
2014-05-01
The Gulf of Maine experiences annual closures of shellfish harvesting due to the accumulation of toxins produced by dinoflagellates of the genus Alexandrium. Factors controlling the timing, location, and magnitude of these events in eastern Maine remain poorly understood. Previous work identified possible linkages between interannual variability of oceanographic variables and shellfish toxicity along the western Maine coastline but no such linkages were evident along the eastern Maine coast in the vicinity of Cobscook Bay, where strong tidal mixing tends to reduce seasonal variability in oceanographic properties. Using 21 years (1985-2005) of shellfish toxicity data, interannual variability in two metrics of annual toxicity, maximum magnitude and total annual toxicity, from stations in the Cobscook Bay region are examined for relationships to a suite of available environmental variables. Consistent with earlier work, no (or only weak) correlations were found between toxicity and oceanographic variables, even those very proximate to the stations such as local sea surface temperature. Similarly no correlations were evident between toxicity and air temperature, precipitation or relative humidity. The data suggest possible connections to local river discharge, but plausible mechanisms are not obvious. Correlations between toxicity and two variables indicative of local meteorological conditions, dew point and atmospheric pressure, both suggest a link between increased toxicity in these eastern Maine stations and weather conditions characterized by clearer skies/drier air (or less stormy/humid conditions). As no correlation of opposite sign was evident between toxicity and local precipitation, one plausible link is through light availability and its positive impact on phytoplankton production in this persistently foggy section of coast. 
These preliminary findings point to both the value of maintaining long-term shellfish toxicity sampling and a need for inclusion of weather variability in future modeling studies aimed at development of a more mechanistic understanding of factors controlling interannual differences in eastern Gulf of Maine shellfish toxicity.
Wattanapisit, Apichai; Vijitpongjinda, Surasak; Saengow, Udomsak; Amaek, Waluka; Thanamee, Sanhapan; Petchuay, Prachyapan
2017-01-01
Introduction Physical activity (PA) is important in promoting health, as well as in the treatment and prevention of diseases. However, insufficient PA is still a global health problem and it is also a problem in medical schools. PA training in medical curricula is still sparse or non-existent. There is a need for a comprehensive understanding of the extent of PA in medical schools through several indicators, including people, places and policies. This study includes a survey of the PA prevalence in a medical school and development of a tool, the Medical School Physical Activity Report Card (MSPARC), which will contain concise and understandable infographics and information for exploring, monitoring and reporting information relating to PA prevalence. Methods and analysis This mixed methods study will run from January to September 2017. We will involve the School of Medicine, Walailak University, Thailand, and its medical students (n=285). Data collection will consist of both primary and secondary data, divided into four parts: general information, people, places and policies. We will investigate the PA metrics about (1) people: the prevalence of PA and sedentary behaviours; (2) place: the quality and accessibility of walkable neighbourhoods, bicycle facilities and recreational areas; and (3) policy: PA promotion programmes for medical students, education metrics and investments related to PA. The MSPARC will be developed using simple symbols, infographics and short texts to evaluate the PA metrics of the medical school. Ethics and dissemination This study has been approved by the Human Research Ethics Committee of Walailak University (protocol number: WUEC-16-005-01). Findings will be published in peer-reviewed journals and presented at national or international conferences. The MSPARC and full report will be disseminated to relevant stakeholders, policymakers, staff and clients. PMID:28963299
Kim, Miyong; Han, Hae-Ra; Phillips, Linda
2003-01-01
Metric equivalence is a quantitative way to assess cross-cultural equivalence of translated instruments by examining the patterns of psychometric properties based on cross-cultural data derived from both versions of the instrument. Metric equivalence checks at the item and instrument levels can be used as a valuable tool to refine cross-cultural instruments. Korean and English versions of the Center for Epidemiological Studies-Depression Scale (CES-D) were administered to 154 Korean Americans and 151 Anglo Americans to illustrate approaches to assessing their metric equivalence. Inter-item and item-total correlations, Cronbach's alpha coefficients, and factor analysis were used for the metric equivalence checks. The alpha coefficient was 0.85 for the Korean American sample and 0.92 for the Anglo American sample. Although all items of the CES-D surpassed the desirable minimum of 0.30 in the Anglo American sample, four items did not meet the standard in the Korean American sample. Differences in average inter-item correlations were also noted between the two groups (0.25 for Korean Americans and 0.37 for Anglo Americans). Factor analysis identified two factors for both groups, and factor loadings showed similar patterns and congruence coefficients. Results of the item analysis procedures suggest the possibility of bias in certain items that may influence the sensitivity of the Korean version of the CES-D. These item biases also provide a possible explanation for the alpha differences. Although factor loadings showed similar patterns for the Korean and English versions of the CES-D, factorial similarity alone is not sufficient for testing the universality of the structure underlying an instrument.
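The item-level checks described here, Cronbach's alpha and the mean inter-item correlation, are standard closed-form statistics. A minimal sketch on synthetic data (not the CES-D data themselves; sample size and factor structure are illustrative):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: (n_respondents, n_items). Standard formula:
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

def mean_interitem_r(items: np.ndarray) -> float:
    """Average of the off-diagonal entries of the item correlation matrix."""
    r = np.corrcoef(items, rowvar=False)
    mask = ~np.eye(r.shape[0], dtype=bool)
    return float(r[mask].mean())

# Synthetic 4-item scale: one shared latent factor plus item noise.
rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 1))
items = latent + rng.normal(scale=1.0, size=(200, 4))
print(round(cronbach_alpha(items), 2))
print(round(mean_interitem_r(items), 2))
```

With these settings the theoretical inter-item correlation is about 0.5, which puts alpha near 0.8; comparing the two statistics across translated versions of an instrument is the kind of metric equivalence check the abstract describes.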
Scientist impact factor (SIF): a new metric for improving scientists' evaluation?
Lippi, Giuseppe; Mattiuzzi, Camilla
2017-08-01
The publication of scientific research is the mainstay for knowledge dissemination, but is also an essential criterion of scientists' evaluation for recruiting funds and career progression. Although the most widespread approach for evaluating scientists is currently based on the H-index, the total impact factor (IF) and the overall number of citations, these metrics are plagued by some well-known drawbacks. Therefore, with the aim to improve the process of scientists' evaluation, we developed a new and potentially useful indicator of recent scientific output. The new metric, scientist impact factor (SIF), was calculated as all citations of articles published in the two years following the publication year of the articles, divided by the overall number of articles published in that year. The metric was then tested by analyzing data of the 40 top scientists of the local university. No correlation was found between SIF and H-index (r=0.15; P=0.367) or 2-year H-index (r=-0.01; P=0.933), whereas the H-index and 2-year H-index values were found to be highly correlated (r=0.57; P<0.001). A highly significant correlation was also observed between the articles published in one year and the total number of citations to these articles in the two following years (r=0.62; P<0.001). According to our data, the SIF may be a useful measure to complement current metrics for evaluating scientific output. Its use may be especially helpful for young scientists, since the SIF reflects the scientific output of the past two years, thus increasing their chances to apply for and obtain competitive funding.
Scientist impact factor (SIF): a new metric for improving scientists’ evaluation?
Mattiuzzi, Camilla
2017-01-01
Background The publication of scientific research is the mainstay for knowledge dissemination, but is also an essential criterion of scientists' evaluation for recruiting funds and career progression. Although the most widespread approach for evaluating scientists is currently based on the H-index, the total impact factor (IF) and the overall number of citations, these metrics are plagued by some well-known drawbacks. Therefore, with the aim to improve the process of scientists' evaluation, we developed a new and potentially useful indicator of recent scientific output. Methods The new metric, scientist impact factor (SIF), was calculated as all citations of articles published in the two years following the publication year of the articles, divided by the overall number of articles published in that year. The metric was then tested by analyzing data of the 40 top scientists of the local university. Results No correlation was found between SIF and H-index (r=0.15; P=0.367) or 2-year H-index (r=−0.01; P=0.933), whereas the H-index and 2-year H-index values were found to be highly correlated (r=0.57; P<0.001). A highly significant correlation was also observed between the articles published in one year and the total number of citations to these articles in the two following years (r=0.62; P<0.001). Conclusions According to our data, the SIF may be a useful measure to complement current metrics for evaluating scientific output. Its use may be especially helpful for young scientists, since the SIF reflects the scientific output of the past two years, thus increasing their chances to apply for and obtain competitive funding. PMID:28856143
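The SIF definition above is simple enough to compute directly from per-article citation counts. A minimal sketch with a hypothetical publication record (the field names and numbers are illustrative, not any real bibliometric API or the study's data):

```python
from typing import Dict, List

def scientist_impact_factor(pub_year: int, articles: List[Dict]) -> float:
    """SIF as described in the abstract: total citations received in the
    two years following `pub_year` by articles published in `pub_year`,
    divided by the number of those articles."""
    cohort = [a for a in articles if a["year"] == pub_year]
    if not cohort:
        return 0.0
    cites = sum(a["citations_by_year"].get(pub_year + 1, 0) +
                a["citations_by_year"].get(pub_year + 2, 0)
                for a in cohort)
    return cites / len(cohort)

# Hypothetical record: two articles published in 2014, one in 2013.
articles = [
    {"year": 2014, "citations_by_year": {2015: 3, 2016: 5}},
    {"year": 2014, "citations_by_year": {2015: 0, 2016: 2}},
    {"year": 2013, "citations_by_year": {2014: 7}},  # outside the 2014 cohort
]
print(scientist_impact_factor(2014, articles))  # 10 citations / 2 articles = 5.0
```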
Principal semantic components of language and the measurement of meaning.
Samsonovich, Alexei V; Samsonovic, Alexei V; Ascoli, Giorgio A
2010-06-11
Metric systems for semantics, or semantic cognitive maps, are allocations of words or other representations in a metric space based on their meaning. Existing methods for semantic mapping, such as Latent Semantic Analysis and Latent Dirichlet Allocation, are based on paradigms involving dissimilarity metrics. They typically do not take into account relations of antonymy and yield a large number of domain-specific semantic dimensions. Here, using a novel self-organization approach, we construct a low-dimensional, context-independent semantic map of natural language that represents simultaneously synonymy and antonymy. Emergent semantics of the map principal components are clearly identifiable: the first three correspond to the meanings of "good/bad" (valence), "calm/excited" (arousal), and "open/closed" (freedom), respectively. The semantic map is sufficiently robust to allow the automated extraction of synonyms and antonyms not originally in the dictionaries used to construct the map and to predict connotation from their coordinates. The map geometric characteristics include a limited number (approximately 4) of statistically significant dimensions, a bimodal distribution of the first component, increasing kurtosis of subsequent (unimodal) components, and a U-shaped maximum-spread planar projection. Both the semantic content and the main geometric features of the map are consistent between dictionaries (Microsoft Word and Princeton's WordNet), among Western languages (English, French, German, and Spanish), and with previously established psychometric measures. By defining the semantics of its dimensions, the constructed map provides a foundational metric system for the quantitative analysis of word meaning. Language can be viewed as a cumulative product of human experiences. Therefore, the extracted principal semantic dimensions may be useful to characterize the general semantic dimensions of the content of mental states.
This is a fundamental step toward a universal metric system for semantics of human experiences, which is necessary for developing a rigorous science of the mind.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Graber, Harry L., E-mail: harry.graber@downstate.edu; Xu, Yong; Barbour, Randall L.
Purpose: The work presented here demonstrates an application of diffuse optical tomography (DOT) to the problem of breast-cancer diagnosis. The potential for using spatial and temporal variability measures of the hemoglobin signal to identify useful biomarkers was studied. Methods: DOT imaging data were collected using two instrumentation platforms the authors developed, which were suitable for exploring tissue dynamics while performing a simultaneous bilateral exam. For each component of the hemoglobin signal (e.g., total, oxygenated), the image time series was reduced to eight scalar metrics that were affected by one or more dynamic properties of the breast microvasculature (e.g., average amplitude, amplitude heterogeneity, strength of spatial coordination). Receiver-operator characteristic (ROC) analyses, comparing groups of subjects with breast cancer to various control groups (i.e., all noncancer subjects, only those with diagnosed benign breast pathology, and only those with no known breast pathology), were performed to evaluate the effect of cancer on the magnitudes of the metrics and of their interbreast differences and ratios. Results: For women with known breast cancer, simultaneous bilateral DOT breast measures reveal a marked increase in the resting-state amplitude of the vasomotor response in the hemoglobin signal for the affected breast, compared to the contralateral, noncancer breast. Reconstructed 3D spatial maps of observed dynamics also show that this behavior extends well beyond the tumor border. In an effort to identify biomarkers that have the potential to support clinical aims, a group of scalar quantities extracted from the time series measures was systematically examined.
This analysis showed that many of the quantities obtained by computing paired responses from the bilateral scans (e.g., interbreast differences, ratios) reveal statistically significant differences between the cancer-positive and -negative subject groups, while the corresponding measures derived from individual breast scans do not. ROC analyses yield area-under-curve values in the 77%–87% range, depending on the metric, with sensitivity and specificity values ranging from 66% to 91%. An interesting result is the initially unexpected finding that the hemodynamic-image metrics are only weakly dependent on the tumor burden, implying that the DOT technique employed is sensitive to tumor-induced changes in the vascular dynamics of the surrounding breast tissue as well. Computational modeling studies serve to identify which properties of the vasomotor response (e.g., average amplitude, amplitude heterogeneity, and phase heterogeneity) principally determine the values of the metrics and their codependences. Findings from the modeling studies also serve to clarify the influence of spatial-response heterogeneity and of system-design limitations, and they reveal the impact that a complex dependence of metric values on the modeled behaviors has on the success in distinguishing between cancer-positive and -negative subjects. Conclusions: The authors identified promising hemoglobin-based biomarkers for breast cancer from measures of the resting-state dynamics of the vascular bed. A notable feature of these biomarkers is that their spatial extent encompasses a large fraction of the breast volume, which is mainly independent of tumor size. Tumor-induced induction of nitric oxide synthesis, a well-established concomitant of many breast cancers, is offered as a plausible biological causal factor for the reported findings.
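The area-under-curve values quoted in such ROC analyses can be computed without tracing the curve, via the Mann-Whitney identity AUC = P(score of a positive case exceeds score of a negative case), counting ties as half. A minimal sketch with hypothetical metric values (not the study's data):

```python
def roc_auc(pos, neg):
    """ROC area via the Mann-Whitney U statistic:
    the fraction of (positive, negative) pairs ranked correctly,
    with ties counted as half a win."""
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical interbreast-difference metric values for two groups.
cancer = [0.9, 0.8, 0.75, 0.6, 0.55]
noncancer = [0.5, 0.45, 0.4, 0.6, 0.3]
print(roc_auc(cancer, noncancer))  # 23.5 correct pairs / 25 = 0.94
```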
NASA Astrophysics Data System (ADS)
Lombardi, D.
2011-12-01
Plausibility judgments, although well represented in conceptual change theories (see, for example, Chi, 2005; diSessa, 1993; Dole & Sinatra, 1998; Posner et al., 1982), have received little empirical attention until our recent work investigating teachers' and students' understanding of and perceptions about human-induced climate change (Lombardi & Sinatra, 2010, 2011). In our first study with undergraduate students, we found that greater plausibility perceptions of human-induced climate change accounted for significantly greater understanding of weather and climate distinctions after instruction, even after accounting for students' prior knowledge (Lombardi & Sinatra, 2010). In a follow-up study with inservice science and preservice elementary teachers, we showed that anger about the topic of climate change and teaching about climate change was significantly related to implausible perceptions about human-induced climate change (Lombardi & Sinatra, 2011). Results from our recent studies helped to inform our development of a model of the role of plausibility judgments in conceptual change situations. The model applies to situations involving cognitive dissonance, where background knowledge conflicts with an incoming message. In such situations, we define plausibility as a judgment on the relative potential truthfulness of incoming information compared to one's existing mental representations (Rescher, 1976). Students may not consciously think when making plausibility judgments, expending only minimal mental effort in what is referred to as an automatic cognitive process (Stanovich, 2009). However, well-designed instruction could facilitate students' reappraisal of plausibility judgments in more effortful and conscious cognitive processing. Critical evaluation specifically may be one effective method to promote plausibility reappraisal in a classroom setting (Lombardi & Sinatra, in progress).
In science education, critical evaluation involves the analysis of how evidentiary data support a hypothesis and its alternatives. The presentation will focus on how instruction promoting critical evaluation can encourage individuals to reappraise their plausibility judgments and initiate knowledge reconstruction. In a recent pilot study, teachers experienced an instructional scaffold promoting critical evaluation of two competing climate change theories (i.e., human-induced and increasing solar irradiance) and significantly changed both their plausibility judgments and perceptions of correctness toward the scientifically-accepted model of human-induced climate change. A comparison group of teachers who did not experience the critical evaluation activity showed no significant change. The implications of these studies for future research and instruction will be discussed in the presentation, including effective ways to increase students' and teachers' ability to be critically evaluative and reappraise their plausibility judgments. With controversial science issues, such as climate change, such abilities may be necessary to facilitate conceptual change.
Metric Tests for Curvature from Weak Lensing and Baryon Acoustic Oscillations
NASA Astrophysics Data System (ADS)
Bernstein, G.
2006-02-01
We describe a practical measurement of the curvature of the universe which, unlike current constraints, relies purely on the properties of the Robertson-Walker metric rather than any assumed model for the dynamics and content of the universe. The observable quantity is the cross-correlation between foreground mass and gravitational shear of background galaxies, which depends on the angular diameter distances dA(zl), dA(zs), and dA(zl,zs) on the degenerate triangle formed by observer, source, and lens. In a flat universe, dA(zl,zs)=dA(zs)-dA(zl), but in curved universes an additional term ~Ωk appears and alters the lensing observables even if dA(z) is fixed. We describe a method whereby weak-lensing data can be used to solve simultaneously for dA and the curvature. This method is completely insensitive to the equation of state of the contents of the universe, or amendments to general relativity that alter the gravitational deflection of light or the growth of structure. The curvature estimate is also independent of biases in the photometric redshift scale. This measurement is shown to be subject to a degeneracy among dA, Ωk, and the galaxy bias factors that may be broken by using the same imaging data to measure the angular scale of baryon acoustic oscillations. Simplified estimates of the accuracy attainable by this method indicate that ambitious weak-lensing + baryon-oscillation surveys would measure Ωk to an accuracy ~0.04 f_sky^(-1/2) (σ_ln z/0.04)^(1/2), where σ_ln z is the photometric redshift error. The Fisher-matrix formalism developed here is also useful for predicting bounds on curvature and other characteristics of parametric dark energy models. We forecast some representative error levels and compare ours to other analyses of the weak-lensing cross-correlation method. We find both curvature and parametric constraints to be surprisingly insensitive to systematic shear calibration errors.
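For comoving angular-diameter distances in units of the Hubble length, the flat-space subtraction rule generalizes to the standard closure relation D_ls = D_s*sqrt(1 + Ωk*D_l^2) - D_l*sqrt(1 + Ωk*D_s^2). That textbook relation (assumed here, not quoted from the paper) can be checked numerically with illustrative cosmological parameters:

```python
import numpy as np

def comoving_distance(z, om=0.3, ok=0.0, n=4096):
    """Dimensionless comoving distance (units of c/H0) in a
    matter + curvature + Lambda universe; illustrative parameters."""
    zs = np.linspace(0.0, z, n)
    ez = np.sqrt(om * (1 + zs)**3 + ok * (1 + zs)**2 + (1.0 - om - ok))
    f = 1.0 / ez
    # trapezoidal integration of dz / E(z)
    return float(np.sum((f[:-1] + f[1:]) * np.diff(zs)) / 2.0)

def d_ls(dl, ds, ok=0.0):
    """Lens-source comoving angular-diameter distance from the
    standard closure relation in Hubble-length units."""
    return ds * np.sqrt(1 + ok * dl**2) - dl * np.sqrt(1 + ok * ds**2)

dl = comoving_distance(0.5)
ds = comoving_distance(1.5)
# Flat case (ok = 0): the relation reduces to simple subtraction.
assert abs(d_ls(dl, ds, ok=0.0) - (ds - dl)) < 1e-12
# A nonzero curvature term shifts D_ls even with D_l and D_s held fixed,
# which is the extra Omega_k-dependent signal the abstract describes.
print(d_ls(dl, ds, ok=0.0), d_ls(dl, ds, ok=0.05))
```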
Acquisition Management for System of Systems: Affordability through Effective Portfolio Management
2013-04-01
the management of strategic “portfolios of systems” in military acquisitions; this includes application of Real Options (RO) theory and metrics such... Navindran Davendralingam and Daniel DeLaurentis, Purdue University. Published April 1, 2013.
2003-04-15
the monitors, the authors are confident that the color fidelity is accurate. The primary physical difference of field versus lab tests is the level... Creelman, C. Douglas, Detection theory: A user's guide, Cambridge University Press, Cambridge, U.K., 1991, pp. 189-190. *For more information, contact Dr. Thomas Meitzler at (586) 574-5405, email: meitzlet@tacom.army.mil
Screen Fingerprints as a Novel Modality for Active Authentication
2014-03-01
and mouse dynamics [9]. Some other examples of the computational behavior metrics of the cognitive fingerprint include eye tracking, how... University of Maryland, March 2014, final technical report, approved for public release. Period covered: May 2012 – Oct 2013.
2011-07-01
joined the project team in the statistical and research coordination role. Dr. Collin is an employee of the University of Pittsburgh. A successful... 3. Submit to Ft. Detrick. Completed milestone: statistical analysis planning. 1. Review planned data metrics and data-gathering tools... approach to performance assessment for continuous quality improvement. Analyzing data with modern statistical techniques to determine the
2016-10-01
total of 52 subjects have enrolled in the study: Veteran subjects n=34 and University of Utah subjects n=18. Preliminary analysis indicates that daily... relevant change.
Spiroski, Mirko
2014-01-01
To analyse the current ranking (2013) of institutions, journals and researchers in the Republic of Macedonia, the country rankings of R. Macedonia were analysed with SCImago Country & Journal Rank (SJR) for the subject area Medicine in the years 1996-2013, ordered by H-index. SCImago Institutions Rankings for 2013 was used for the scientific impact of biomedical institutions in the Republic of Macedonia. Journal metrics from Elsevier for the Macedonian scholarly journals for the period 2009-2013 were analysed: Source Normalized Impact per Paper (SNIP), the Impact per Publication (IPP), and SCImago Journal Rank (SJR). Macedonian scholarly biomedical journals included in Google Scholar Metrics (2013, 2012) were analysed with the h5-index and h5-median (June 2014). A semantic analysis of the PubMed database was performed with GoPubMed on November 2, 2014 in order to identify published papers from the field of biomedical sciences affiliated with the country of Macedonia. Harzing's Publish or Perish software was used for author impact analysis and calculation of the Hirsch index based on Google Scholar queries. The rank of the subject area Medicine of R. Macedonia according to the SCImago Journal & Country Rank (SJR) is 110th in the world and 17th in Eastern Europe. Of 20 universities in Macedonia, only Ss Cyril and Methodius University, Skopje, and the University St Clement of Ohrid, Bitola, are listed in the SCImago Institutions Rankings (SIR) for 2013. A very small number of Macedonian scholarly journals is included in Web of Science (2), PubMed (1), PubMed Central (1), SCOPUS (6), SCImago (6), and Google Scholar Metrics (6). The rank by Hirsch index (h-index) differed from the rank by number of abstracts indexed in PubMed for the top 20 authors from R. Macedonia. The current biomedical scientific impact (2013) of institutions, academic journals and researchers in R. Macedonia is very low.
There is an urgent need for organized measures to improve the quality and output of institutions, scholarly journals, and researchers in R. Macedonia in order to achieve higher international standards.
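The h-index comparisons in this entry follow Hirsch's standard definition, which is compact enough to state in code. A minimal sketch with illustrative citation counts (not data from the study):

```python
def h_index(citations):
    """Hirsch index: the largest h such that h papers have >= h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank   # this paper still supports a larger h
        else:
            break      # remaining papers are cited too rarely to raise h
    return h

# Two hypothetical authors with equal paper counts but different profiles:
print(h_index([10, 8, 5, 4, 3]))   # → 4
print(h_index([25, 8, 5, 3, 3]))   # → 3
```

The second author's single highly cited paper does not raise the h-index, which is why h-index rankings can diverge from rankings by publication or citation counts, as the entry reports.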
Ultrabaric relativistic superfluids
NASA Astrophysics Data System (ADS)
Papini, G.; Weiss, M.
1985-09-01
Ultrabaric superfluid solutions of Einstein's equations are obtained to examine the possibility of superluminal sound speeds. The discussion is restricted only by requiring that the energy-momentum tensor and the equation of state of matter be represented by fully relativistic equations. Only a few universes are known to satisfy the conditions, and those exhibit tension and are inflationary. Superluminal sound velocities are therefore shown to be possible for the interior Schwarzschild metric, which has been used to explain the redshift of quasars, and the Stephani solution (1967). The latter indicates repeated transitions between superluminal and subluminal sound velocities in the hyperbaric superfluid of the early universe.
Anisotropy effects on baryogenesis in f(R) theories of gravity
NASA Astrophysics Data System (ADS)
Aghamohammadi, A.; Hossienkhani, H.; Saaidi, Kh.
2018-04-01
We study f(R) gravity in an anisotropic metric and its effect on the baryon-number-to-entropy ratio. The mechanism of gravitational baryogenesis, based on the CPT-violating gravitational interaction between the derivative of the Ricci scalar curvature and the baryon-number current, is investigated in the context of f(R) gravity. Gravitational baryogenesis in the Bianchi type I (BI) Universe is examined. We survey the effect of the anisotropy of the Universe on the baryon asymmetry from the point of view of f(R) theories of gravity, and its effect on n_b/s in the radiation-dominated regime.
Archimedes force on Casimir apparatus
NASA Astrophysics Data System (ADS)
Shevchenko, Vladimir; Shevrin, Efim
2016-08-01
This paper addresses the problem of a Casimir apparatus in a dense medium, placed in a weak gravitational field. The falling of the apparatus must be governed by the equivalence principle, with proper account of the contributions to the weight of the apparatus from its material part and from the distorted quantum fields. We discuss the general expression for the corresponding force in a metric with cylindrical symmetry. By way of example, we compute an explicit expression for the Archimedes force acting on a Casimir apparatus of finite size immersed in a thermal bath of a free scalar field. It is shown that besides a universal term, proportional to the volume of the apparatus, there are non-universal quantum corrections that depend on the boundary conditions.
Nonsingular universes in Gauss–Bonnet gravity's rainbow
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hendi, Seyed Hossein; Momennia, Mehrab; Panah, Behzad Eslam
In this paper, we study the rainbow deformation of Friedmann-Robertson-Walker (FRW) cosmology in both Einstein gravity and Gauss–Bonnet (GB) gravity. We demonstrate that the singularity in FRW cosmology can be removed by the rainbow deformation of the FRW metric. We obtain the general constraints required for FRW cosmology to be free of singularities, and observe that the inclusion of GB gravity can significantly change the constraints required to obtain nonsingular universes. We use rainbow functions motivated by the hard spectra of gamma-ray bursts to deform FRW cosmology and explicitly demonstrate that such a deformation removes the singularity in FRW cosmology.
Gödel universes in string theory
NASA Astrophysics Data System (ADS)
Barrow, John D.; Dabrowski, Mariusz P.
1998-11-01
We show that homogeneous Gödel spacetimes need not contain closed timelike curves in low-energy-effective string theories. We find exact solutions for the Gödel metric in string theory for the full O(α') action including both dilaton and axion fields. The results are valid for bosonic, heterotic and superstrings. To first order in the inverse string tension α', these solutions display a simple relation between the angular velocity of the Gödel universe, Ω, and the inverse string tension, of the form α' = 1/Ω² in the absence of the axion field. The generalization of this relationship is also found when the axion field is present.
FLRW Cosmology from Yang-Mills Gravity with Translational Gauge Symmetry
NASA Astrophysics Data System (ADS)
Katz, Daniel
2013-03-01
We extend the subject of Yang-Mills gravity — a theory of gravity based on local translational gauge invariance in flat space-time — to basic cosmology. It has been shown that this particular gauge invariance leads to tensor factors in the macroscopic limit of the equations of motion of particles which play the same role as the metric tensor of general relativity (GR). The assumption that this "effective metric" tensor takes on the standard FLRW form is our starting point. Equations analogous to the Friedmann equations are derived and then solved in closed form for the three special cases of a universe dominated by (1) matter, (2) radiation, and (3) dark energy. We find that the solutions for the scale factor are similar to, but distinct from, those found in the corresponding GR-based treatment.
Han, Aaron L-F; Wong, Derek F; Chao, Lidia S; He, Liangye; Lu, Yi
2014-01-01
With the rapid development of machine translation (MT), MT evaluation has become very important for telling us promptly whether an MT system is making progress. Conventional MT evaluation methods calculate the similarity between hypothesis translations produced by automatic translation systems and reference translations produced by professional translators. Existing evaluation metrics have several weaknesses. First, incomplete design factors lead to a language-bias problem: the metrics perform well on some language pairs but poorly on others. Second, they tend to use either no linguistic features or too many; using none draws criticism from linguists, while using too many makes the model hard to reproduce. Third, reference translations are very expensive and sometimes unavailable in practice. In this paper, the authors propose an unsupervised MT evaluation metric that uses a universal part-of-speech tagset and does not rely on reference translations. The authors also explore the performance of the designed metric on traditional supervised evaluation tasks. Both the supervised and unsupervised experiments show that the designed methods yield higher correlation with human judgments.
NASA Astrophysics Data System (ADS)
Bonnefoy, M.; Currie, T.; Marleau, G.-D.; Schlieder, J. E.; Wisniewski, J.; Carson, J.; Covey, K. R.; Henning, T.; Biller, B.; Hinz, P.; Klahr, H.; Marsh Boyer, A. N.; Zimmerman, N.; Janson, M.; McElwain, M.; Mordasini, C.; Skemer, A.; Bailey, V.; Defrère, D.; Thalmann, C.; Skrutskie, M.; Allard, F.; Homeier, D.; Tamura, M.; Feldt, M.; Cumming, A.; Grady, C.; Brandner, W.; Helling, C.; Witte, S.; Hauschildt, P.; Kandori, R.; Kuzuhara, M.; Fukagawa, M.; Kwon, J.; Kudo, T.; Hashimoto, J.; Kusakabe, N.; Abe, L.; Brandt, T.; Egner, S.; Guyon, O.; Hayano, Y.; Hayashi, M.; Hayashi, S.; Hodapp, K.; Ishii, M.; Iye, M.; Knapp, G.; Matsuo, T.; Mede, K.; Miyama, M.; Morino, J.-I.; Moro-Martin, A.; Nishimura, T.; Pyo, T.; Serabyn, E.; Suenaga, T.; Suto, H.; Suzuki, R.; Takahashi; Takami, M.; Takato, N.; Terada, H.; Tomono, D.; Turner, E.; Watanabe, M.; Yamada, T.; Takami, H.; Usuda, T.
2014-02-01
Context. We previously reported the direct detection of a low-mass companion at a projected separation of 55 ± 2 AU around the B9-type star κ Andromedae. The properties of the system (mass ratio, separation) make it a benchmark for understanding the formation and evolution of gas giant planets and brown dwarfs on wide orbits. Aims: We present new angular differential imaging (ADI) images of the system at 2.146 (Ks), 3.776 (L'), 4.052 (NB_4.05), and 4.78 μm (M') obtained with Keck/NIRC2 and LBTI/LMIRCam, as well as more accurate near-infrared photometry of the star with the MIMIR instrument. We aim to determine the near-infrared spectral energy distribution of the companion and use it to characterize the object. Methods: We used analysis methods adapted to ADI to extract the companion flux. We compared the photometry of the object to that of reference young and old objects, and to a set of seven PHOENIX-based atmospheric models of cool objects accounting for the formation of dust. We used evolutionary models to derive mass estimates considering a wide range of plausible initial conditions. Finally, we used dedicated formation models to discuss the possible origin of the companion. Results: We derive more accurate J = 15.86 ± 0.21, H = 14.95 ± 0.13, and Ks = 14.32 ± 0.09 mag for κ And b. We detect the companion in all our high-contrast observations. We confirm previous contrasts obtained in the Ks and L' bands. We derive NB_4.05 = 13.0 ± 0.2 and M' = 13.3 ± 0.3 mag and estimate log10(L/L⊙) = -3.76 ± 0.06. Atmospheric models yield Teff = 1900 (+100/-200) K. They do not set any constraint on the surface gravity. "Hot-start" evolutionary models predict masses of 14 (+25/-2) MJup based on the luminosity and temperature estimates. When considering a conservative age range for the system (30 (+120/-10) Myr), "warm-start" evolutionary tracks constrain the mass to M ≥ 10 MJup.
Conclusions: The mass of κ Andromedae b mostly falls in the brown-dwarf regime, owing to remaining uncertainties in age and in mass-luminosity models. According to the formation models, disk instability in a primordial disk may account for the position and a wide range of plausible masses of κ And b.
NASA Astrophysics Data System (ADS)
Totani, Tomonori
2017-10-01
In standard general relativity the universe cannot be started with arbitrary initial conditions, because four of the ten components of Einstein's field equations (EFE) are constraints on initial conditions. In previous work it was proposed to extend the gravity theory to allow free initial conditions, with a motivation to solve the cosmological constant problem. This was done by setting four constraints on metric variations in the action principle, which is reasonable because gravity's physical degrees of freedom are at most six. However, there are two problems with this theory: the three constraints in addition to the unimodular condition were introduced without clear physical meanings, and flat Minkowski spacetime is unstable against perturbations. Here a new set of gravitational field equations is derived by replacing the three constraints with new ones requiring that geodesic paths remain geodesic under metric variations. The instability problem is then naturally solved. Implications for the cosmological constant Λ are unchanged: the theory converges to the EFE with nonzero Λ through inflation, but Λ varies on scales much larger than the present Hubble horizon. Galaxies then form only in small-Λ regions, and the cosmological constant problem is solved by the anthropic argument. Because of the increased degrees of freedom in metric dynamics, the theory predicts new non-oscillatory modes of metric anisotropy generated by quantum fluctuations during inflation, and CMB B-mode polarization would be observed to differ from the standard predictions of general relativity.
Universe without dark energy: Cosmic acceleration from dark matter-baryon interactions
NASA Astrophysics Data System (ADS)
Berezhiani, Lasha; Khoury, Justin; Wang, Junpu
2017-06-01
Cosmic acceleration is widely believed to require either a source of negative pressure (i.e., dark energy) or a modification of gravity, which necessarily implies new degrees of freedom beyond those of Einstein gravity. In this paper we present a third possibility, using only dark matter (DM) and ordinary matter. The mechanism relies on the coupling between dark matter and ordinary matter through an effective metric. Dark matter couples to an Einstein-frame metric and experiences a matter-dominated, decelerating cosmology up to the present time. Ordinary matter couples to an effective metric that depends also on the DM density, in such a way that it experiences late-time acceleration. Linear density perturbations are stable and propagate with arbitrarily small sound speed, at least in the case of "pressure" coupling. Assuming a simple parametrization of the effective metric, we show that our model can successfully match a set of basic cosmological observables, including luminosity distance, baryon acoustic oscillation measurements, the angular-diameter distance to last scattering, etc. For the growth history of density perturbations, we find an intriguing connection between the growth factor and the Hubble constant. To get a growth history similar to the ΛCDM prediction, our model predicts a higher H0, closer to the value preferred by direct estimates. On the flip side, we tend to overpredict the growth of structures whenever H0 is comparable to the Planck-preferred value. The model also tends to predict larger redshift-space distortions at low redshift than ΛCDM.
NASA Astrophysics Data System (ADS)
Kaewkhao, Narakorn; Gumjudpai, Burin
2018-06-01
We consider, in the Palatini formalism, a modified gravity in which the scalar-field derivative couples to the Einstein tensor. In this scenario the Ricci scalar, Ricci tensor and Einstein tensor are functions of the connection field. As a result, the connection field gives rise to the relation h_μν = f g_μν between the effective metric h_μν and the usual metric g_μν, where f = 1 - κ ϕ_{,α}ϕ^{,α}/2. In the FLRW universe, the NMDC coupling constant is limited to the range -2/ϕ̇² < κ ≤ ∞, preserving the Lorentz signature of the effective metric. The slowly-rolling regime provides κ < 0, forbidding the graviton from traveling at superluminal speed. The effective gravitational coupling and the entropy of the black hole's apparent horizon are derived. In the case of negative coupling, acceleration can happen even with w_eff > -1/3. Power-law potentials of chaotic inflation are considered. For V ∝ ϕ² and V ∝ ϕ⁴, it is possible to obtain a tensor-to-scalar ratio lower than that of GR, so that it satisfies r < 0.12 as constrained by Planck 2015 (Ade et al., 2016). The V ∝ ϕ² case yields an acceptable range of spectral index and r values. The quartic potential's spectral index is disfavored by the Planck results. The viable range of κ for the V ∝ ϕ² case lies in the positive region, resulting in less black hole entropy, a superluminal metric, a larger amount of inflation, avoidance of a super-Planckian initial field value, and a stronger gravitational constant.
NASA Astrophysics Data System (ADS)
Vacaru, Sergiu I.
2015-04-01
We reinvestigate how generic off-diagonal cosmological solutions depending, in general, on all spacetime coordinates can be constructed in massive and -modified gravity using the anholonomic frame deformation method. New classes of locally anisotropic and (in-)homogeneous cosmological metrics are constructed with open and closed spatial geometries. By resorting to such solutions, we show that they describe late-time acceleration due to effective cosmological terms induced by nonlinear off-diagonal interactions, possible modifications of the gravitational action, and the graviton mass. The cosmological metrics and related Stückelberg fields are constructed in explicit form up to nonholonomic frame transforms of the Friedmann-Lemaître-Robertson-Walker (FLRW) coordinates. The solutions include matter, graviton mass, and other effective sources modeling nonlinear gravitational and matter-field interactions with polarization of physical constants and deformations of metrics, which may explain dark energy and dark matter effects. However, we argue that it is not always necessary to modify gravity if we consider the effective generalized Einstein equations with a nontrivial vacuum and/or non-minimal coupling with matter. Indeed, we state certain conditions under which such configurations mimic interesting solutions in general relativity and its modifications, for instance, when we can extract the general Painlevé-Gullstrand and FLRW metrics. In a more general context, we elaborate a reconstruction procedure for off-diagonal cosmological solutions which describe cyclic and ekpyrotic universes. Finally, open issues and further perspectives are discussed.
Vassilev, Angel; Murzac, Adrian; Zlatkova, Margarita B; Anderson, Roger S
2009-03-01
Weber contrast, ΔL/L, is a widely used contrast metric for aperiodic stimuli. Zele, Cao & Pokorny [Zele, A. J., Cao, D., & Pokorny, J. (2007). Threshold units: A correct metric for reaction time? Vision Research, 47, 608-611] found that neither Weber contrast nor its transform to detection-threshold units equates human reaction times in response to luminance increments and decrements under selective rod stimulation. Here we show that their rod reaction times are equated when plotted against the spatial luminance ratio between the stimulus and its background (Lmax/Lmin, the larger and smaller of the background and stimulus luminances). Similarly, reaction times to parafoveal S-cone selective increments and decrements from our previous studies [Murzac, A. (2004). A comparative study of the temporal characteristics of processing of S-cone incremental and decremental signals. PhD thesis, New Bulgarian University, Sofia; Murzac, A., & Vassilev, A. (2004). Reaction time to S-cone increments and decrements. In: 7th European conference on visual perception, Budapest, August 22-26. Perception, 33, 180 (Abstract)] are better described by the spatial luminance ratio than by Weber contrast. We assume that the type of stimulus detection, by temporal (successive) luminance discrimination, by spatial (simultaneous) luminance discrimination, or by both [Sperling, G., & Sondhi, M. M. (1968). Model for visual luminance discrimination and flicker detection. Journal of the Optical Society of America, 58, 1133-1145], determines the appropriateness of one or the other contrast metric for reaction time.
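The two contrast metrics compared in this entry are simple enough to state side by side. A minimal sketch with illustrative luminance values (not data from the cited experiments):

```python
def weber_contrast(l_stim, l_bg):
    """Weber contrast ΔL/L: signed, negative for decrements."""
    return (l_stim - l_bg) / l_bg

def luminance_ratio(l_stim, l_bg):
    """Spatial luminance ratio Lmax/Lmin: unsigned, always >= 1."""
    return max(l_stim, l_bg) / min(l_stim, l_bg)

# An increment and a decrement matched in |Weber contrast| differ in ratio:
bg = 100.0
print(weber_contrast(150.0, bg), weber_contrast(50.0, bg))    # → 0.5 -0.5
print(luminance_ratio(150.0, bg), luminance_ratio(50.0, bg))  # → 1.5 2.0
```

The asymmetry is the point: an increment and decrement of equal |ΔL/L| magnitude have different Lmax/Lmin, so data equated by one metric need not be equated by the other.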
The concept of surgical operating list 'efficiency': a formula to describe the term.
Pandit, J J; Westbury, S; Pandit, M
2007-09-01
While numerous reports have sought ways of improving the efficiency of surgical operating lists, none has defined 'efficiency'. We describe a formula that defines efficiency as incorporating three elements: maximising utilisation, minimising over-running and minimising cancellations on a list. We applied this formula to hypothetical (but realistic) scenarios, and our formula yielded plausible descriptions of these. We also applied the formula to 16 consecutive elective surgical lists from three gynaecology teams (two at a university hospital and one at a non-university hospital). Again, the formula gave useful insights into problems faced by the teams in improving their performance, and it also guided possible solutions. The formula confirmed that a team that schedules cases according to the predicted durations of the operations listed (i.e. the non-university hospital team) suffered fewer cancellations (median 5% vs 8% and 13%) and fewer list over-runs (6% vs 38% and 50%), and performed considerably more efficiently (90% vs 79% and 72%; p = 0.038) than teams that did not do so (i.e. those from the university hospital). We suggest that surgical list performance is more completely described by our formula for efficiency than it is by other conventional measures such as list utilisation or cancellation rate alone.
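The abstract names the three ingredients of the efficiency formula but not its exact form. The sketch below combines utilisation, over-running and cancellations in one plausible way, purely as an assumed illustration; the function and its weighting are hypothetical, not the formula from the paper:

```python
def list_efficiency(scheduled_min, used_min, overrun_min, n_listed, n_cancelled):
    """Hypothetical stand-in for an operating-list efficiency score:
    reward utilisation, penalise over-running, and scale by the fraction
    of listed cases actually completed (all modelling choices assumed)."""
    utilisation = min(used_min, scheduled_min) / scheduled_min
    overrun_penalty = overrun_min / scheduled_min
    completed = (n_listed - n_cancelled) / n_listed
    return (utilisation - overrun_penalty) * completed

# A list filling its 480 scheduled minutes, over-running by 30 minutes,
# and cancelling 1 of 10 booked cases:
print(round(list_efficiency(480, 480, 30, 10, 1), 3))  # → 0.844
```

Even this crude stand-in reproduces the qualitative behaviour described: a list can be "fully utilised" yet score poorly once over-runs and cancellations are counted.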
ERIC Educational Resources Information Center
Staub, Adrian; Rayner, Keith; Pollatsek, Alexander; Hyona, Jukka; Majewski, Helen
2007-01-01
Readers' eye movements were monitored as they read sentences containing noun-noun compounds that varied in frequency (e.g., elevator mechanic, mountain lion). The left constituent of the compound was either plausible or implausible as a head noun at the point at which it appeared, whereas the compound as a whole was always plausible. When the head…
NASA Astrophysics Data System (ADS)
Wallick, R.; Anderson, S.; Keith, M.; Cannon, C.; O'Connor, J. E.
2010-12-01
Gravel bed rivers in the Pacific Northwest and elsewhere provide an important source of commercial aggregate. Mining in-stream gravel, however, can alter channel and bar morphology, resulting in habitat degradation for aquatic species. In order to sustainably manage rivers subject to in-stream gravel extraction, regulatory agencies in Oregon have requested that the USGS complete a series of comprehensive geomorphic and sediment transport studies to provide context for regulatory-agency management of in-stream gravel extraction in Oregon streams. The Umpqua River in western Oregon poses special challenges to this type of assessment. Whereas most rivers subject to gravel extraction are relatively rich in bed-material sediment, the Umpqua River is a mixed bedrock-alluvium system draining a large (1,804 km²) basin; hence typical bed-material transport analyses, and the ecological and geomorphic lessons of in-stream gravel extraction on more gravel-rich rivers, have limited applicability. Consequently, we have relied upon multiple analyses, including comprehensive historical mapping, bedload transport modeling, and a GIS-based sediment yield analysis, to assess patterns of bed-material transport and annual rates of bed-material flux. These analyses, combined with numerous historical accounts, indicate that since at least the 1840s the Umpqua River planform has been stable, as bar geometry is largely fixed by valley physiography and the channel itself is underlain mainly by bedrock. Preliminary estimates of annual bedload transport rates, calculated for the period 1951-2008 from bed-material transport capacity relations at 42 bars along the South Umpqua and mainstem Umpqua Rivers, vary from 0 to 600,000 metric tons per year, with this large spread reflecting variability in bar geometry and grain size.
Large stable bars are activated only during exceptionally large floods and have negligible transport during most years whereas smaller, low elevation bars serve as transient storage for gravel transported during typical flood events. A more plausible range of average annual transport rates, based on bedload transport capacity estimates for bars with reasonable values for reference shear stress, is 500-50,000 metric tons/year. Our sediment yield and mapping analyses support these more conservative estimates, providing annual transport rates of 13,000-50,000 metric tons per year for the South Umpqua River and mainstem Umpqua River through the Coast Range. Downstream, predicted flux rates decrease as attrition exceeds input of bed material, gradually diminishing to 30,000-40,000 metric tons at the head of tide. Because bed-material transport along the supply-limited Umpqua River is highly variable in time and space, the range of predicted flux values is thought to characterize the upper bounds of annual gravel transport.
NASA Astrophysics Data System (ADS)
Akhtar, Taimoor; Shoemaker, Christine
2016-04-01
Watershed model calibration is inherently a multi-criteria problem. Conflicting trade-offs exist between different quantifiable calibration criteria, indicating that no single optimal parameterization exists. Hence, many experts prefer a manual approach to calibration, where the inherent multi-objective nature of the calibration problem is addressed through an interactive, subjective, time-intensive and complex decision-making process. Multi-objective optimization can be used to efficiently identify multiple plausible calibration alternatives and assist calibration experts during the parameter estimation process. However, there are key challenges to the use of multi-objective optimization in the parameter estimation process: 1) multi-objective optimization usually requires many model simulations, which is difficult for complex simulation models that are computationally expensive; and 2) selecting one from the numerous calibration alternatives provided by multi-objective optimization is non-trivial. This study proposes a "Hybrid Automatic Manual Strategy" (HAMS) for watershed model calibration to specifically address the above-mentioned challenges. HAMS employs a 3-stage framework for parameter estimation. Stage 1 incorporates the use of an efficient surrogate multi-objective algorithm, GOMORS, for identification of numerous calibration alternatives within a limited simulation evaluation budget. The novelty of HAMS is embedded in Stages 2 and 3, where an interactive visual and metric-based analytics framework is available as a decision support tool to choose a single calibration from the numerous alternatives identified in Stage 1. Stage 2 of HAMS provides an interactive framework, based on goodness-of-fit metrics, for identification of a small subset (typically fewer than 10) of meaningful and diverse calibration alternatives from the numerous alternatives obtained in Stage 1.
Stage 3 incorporates the use of an interactive visual analytics framework for decision support in selection of one parameter combination from the alternatives identified in Stage 2. HAMS is applied for calibration of flow parameters of a SWAT model, (Soil and Water Assessment Tool) designed to simulate flow in the Cannonsville watershed in upstate New York. Results from the application of HAMS to Cannonsville indicate that efficient multi-objective optimization and interactive visual and metric based analytics can bridge the gap between the effective use of both automatic and manual strategies for parameter estimation of computationally expensive watershed models.
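The "calibration alternatives" that Stage 1 passes forward are the non-dominated trade-offs among the objectives. A minimal non-domination filter for two minimised error objectives (GOMORS itself is a surrogate-assisted algorithm not reproduced here; the objective values below are invented for illustration):

```python
def pareto_front(points):
    """Return the non-dominated subset of objective vectors (minimisation).

    A point p is dominated if some other point q is no worse in every
    objective and differs from p (assumes no duplicate points)."""
    front = []
    for p in points:
        dominated = any(
            q != p and all(q[i] <= p[i] for i in range(len(p)))
            for q in points
        )
        if not dominated:
            front.append(p)
    return front

# Five hypothetical (flow-error, volume-error) calibration alternatives:
alts = [(0.8, 0.3), (0.6, 0.6), (0.5, 0.9), (0.7, 0.4), (0.9, 0.5)]
print(pareto_front(alts))  # → [(0.8, 0.3), (0.6, 0.6), (0.5, 0.9), (0.7, 0.4)]
```

Only (0.9, 0.5) is discarded, since (0.7, 0.4) beats it on both objectives; every survivor represents a distinct trade-off, which is exactly why a second, interactive selection stage is needed.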
DOE Office of Scientific and Technical Information (OSTI.GOV)
Saez, D.
1987-03-15
In this work the metric is coupled to a scalar field φ in a simple way. Although this coupling becomes problematic because the energy density of φ appears to be unbounded from below, it is exhibited as a very simple coupling leading to a nonsingular cosmological model with an early antigravity regime. A basic study of the inflationary period and various suggestions are presented.
Transition of Army Ground Systems from Production to Sustainment
2016-04-03
Management in partial fulfillment of the degree of Master of Global Leadership and Management Submitted to Defense Acquisition University in...gathered from the survey was analyzed using descriptive statistics for correlations among the data collected, and to analyze items that were grouped...that is used as a detailed plan to describe how the system will be sustained. LCSP’s include how sustainment activity metrics will be determined
What Are Altmetrics? Are They Important? | Poster
Altmetrics, also known as alternative metrics, offer a way to gauge the attention that research output (such as articles, data sets, program code, slide decks, and posters) is generating on social media, reference-sharing sites (such as Mendeley), blogs, and other online media. The term "altmetrics" was coined on Twitter in 2010 by Jason Priem, then a graduate student at the University of North Carolina's School of Information and Library...
General relativistic considerations of the field shedding model of fast radio bursts
NASA Astrophysics Data System (ADS)
Punsly, Brian; Bini, Donato
2016-06-01
Popular models of fast radio bursts (FRBs) involve the gravitational collapse of neutron star progenitors to black holes. It has been proposed that the shedding of the strong neutron star magnetic field (B) during the collapse is the power source for the radio emission. Previously, these models have utilized the simplicity of the Schwarzschild metric, which has the restriction that the magnetic flux is magnetic `hair' that must be shed before final collapse. But neutron stars have angular momentum and charge, and a fully relativistic Kerr-Newman solution exists in which B has its source inside of the event horizon. In this Letter, we consider the magnetic flux to be shed as a consequence of the electric discharge of a metastable collapsed state of a Kerr-Newman black hole. It has also been argued that the shedding model will not operate due to pair creation. By considering the pulsar death line, we find that for a neutron star with B = 10^11-10^13 G and a long rotation period (>1 s), this is not a concern. We also discuss the observational evidence supporting the plausibility of magnetic flux shedding models of FRBs that are spawned from rapidly rotating progenitors.
Spatial Learning and Action Planning in a Prefrontal Cortical Network Model
Martinet, Louis-Emmanuel; Sheynikhovich, Denis; Benchenane, Karim; Arleo, Angelo
2011-01-01
The interplay between hippocampus and prefrontal cortex (PFC) is fundamental to spatial cognition. Complementing hippocampal place coding, prefrontal representations provide more abstract and hierarchically organized memories suitable for decision making. We model a prefrontal network mediating distributed information processing for spatial learning and action planning. Specific connectivity and synaptic adaptation principles shape the recurrent dynamics of the network arranged in cortical minicolumns. We show how the PFC columnar organization is suitable for learning sparse topological-metrical representations from redundant hippocampal inputs. The recurrent nature of the network supports multilevel spatial processing, allowing structural features of the environment to be encoded. An activation diffusion mechanism spreads the neural activity through the column population leading to trajectory planning. The model provides a functional framework for interpreting the activity of PFC neurons recorded during navigation tasks. We illustrate the link from single unit activity to behavioral responses. The results suggest plausible neural mechanisms subserving the cognitive “insight” capability originally attributed to rodents by Tolman & Honzik. Our time course analysis of neural responses shows how the interaction between hippocampus and PFC can yield the encoding of manifold information pertinent to spatial planning, including prospective coding and distance-to-goal correlates. PMID:21625569
Interactive mixture of inhomogeneous dark fluids driven by dark energy: a dynamical system analysis
NASA Astrophysics Data System (ADS)
Izquierdo, Germán; Blanquet-Jaramillo, Roberto C.; Sussman, Roberto A.
2018-03-01
We examine the evolution of an inhomogeneous mixture of non-relativistic pressureless cold dark matter (CDM) coupled to dark energy (DE) characterised by the equation of state parameter w < -1/3, with the interaction term proportional to the DE density. This coupled mixture is the source of a spherically symmetric Lemaître-Tolman-Bondi (LTB) metric admitting an asymptotic Friedmann-Lemaître-Robertson-Walker (FLRW) background. Einstein's equations reduce to a 5-dimensional autonomous dynamical system involving quasi-local variables related to suitable averages of covariant scalars and their fluctuations. The phase space evolution around the critical points (past/future attractors and five saddles) is examined in detail. For all parameter values and both directions of energy flow (CDM to DE and DE to CDM), the phase space trajectories are compatible with a physically plausible behaviour at early cosmic times near the past attractor. This result compares favourably with mixtures whose interaction is driven by the CDM density, whose past evolution is unphysical for DE-to-CDM energy flow. Numerical examples are provided describing the evolution of an initial profile that can be associated with idealised structure formation scenarios.
A case study of evolutionary computation of biochemical adaptation
NASA Astrophysics Data System (ADS)
François, Paul; Siggia, Eric D.
2008-06-01
Simulations of evolution have a long history, but their relation to biology is questioned because of the perceived contingency of evolution. Here we provide an example of a biological process, adaptation, where simulations are argued to approach closer to biology. Adaptation is a common feature of sensory systems, and a plausible component of other biochemical networks because it rescales upstream signals to facilitate downstream processing. We create random gene networks numerically, by linking genes with interactions that model transcription, phosphorylation and protein-protein association. We define a fitness function for adaptation in terms of two functional metrics, and show that any reasonable combination of them will yield the same adaptive networks after repeated rounds of mutation and selection. Convergence to these networks is driven by positive selection and thus fast. There is always a path in parameter space of continuously improving fitness that leads to perfect adaptation, implying that the actual mutation rates we use in the simulation do not bias the results. Our results imply a kinetic view of evolution, i.e., it favors gene networks that can be learned quickly from the random examples supplied by mutation. This formulation allows for deductive predictions of the networks realized in nature.
New Challenges for Intervertebral Disc Treatment Using Regenerative Medicine
Masuda, Koichi
2010-01-01
The development of tissue engineering therapies for the intervertebral disc is challenging due to ambiguities of disease and pain mechanisms in patients, and lack of consensus on preclinical models for safety and efficacy testing. Although the issues associated with model selection for studying orthopedic diseases or treatments have been discussed often, the multifaceted challenges associated with developing intervertebral disc tissue engineering therapies require special discussion. This review covers topics relevant to the clinical translation of tissue-engineered technologies: (1) the unmet clinical need, (2) appropriate models for safety and efficacy testing, (3) the need for standardized model systems, and (4) the translational pathways leading to a clinical trial. For preclinical evaluation of new therapies, we recommend establishing biologic plausibility of efficacy and safety using models of increasing complexity, starting with cell culture, small animals (rats and rabbits), and then large animals (goat and minipig) that more closely mimic nutritional, biomechanical, and surgical realities of human application. The use of standardized and reproducible experimental procedures and outcome measures is critical for judging relative efficacy. Finally, success will hinge on carefully designed clinical trials with well-defined patient selection criteria, gold-standard controls, and objective outcome metrics to assess performance in the early postoperative period. PMID:19903086
Effects of plausibility on structural priming.
Christianson, Kiel; Luke, Steven G; Ferreira, Fernanda
2010-03-01
We report a replication and extension of Ferreira (2003), in which it was observed that native adult English speakers misinterpret passive sentences that relate implausible but not impossible semantic relationships (e.g., The angler was caught by the fish) significantly more often than they do plausible passives or plausible or implausible active sentences. In the experiment reported here, participants listened to the same plausible and implausible passive and active sentences as in Ferreira (2003), answered comprehension questions, and then orally described line drawings of simple transitive actions. The descriptions were analyzed as a measure of structural priming (Bock, 1986). Question accuracy data replicated Ferreira (2003). Production data yielded an interaction: Passive descriptions were produced more often after plausible passives and implausible actives. We interpret these results as indicative of a language processor that proceeds along differentiated morphosyntactic and semantic routes. The processor may end up adjudicating between conflicting outputs from these routes by settling on a "good enough" representation that is not completely faithful to the input.
Disconnection of network hubs and cognitive impairment after traumatic brain injury.
Fagerholm, Erik D; Hellyer, Peter J; Scott, Gregory; Leech, Robert; Sharp, David J
2015-06-01
Traumatic brain injury affects brain connectivity by producing traumatic axonal injury. This disrupts the function of large-scale networks that support cognition. The best way to describe this relationship is unclear, but one elegant approach is to view networks as graphs. Brain regions become nodes in the graph, and white matter tracts the connections. The overall effect of an injury can then be estimated by calculating graph metrics of network structure and function. Here we test which graph metrics best predict the presence of traumatic axonal injury, as well as which are most highly associated with cognitive impairment. A comprehensive range of graph metrics was calculated from structural connectivity measures for 52 patients with traumatic brain injury, 21 of whom had microbleed evidence of traumatic axonal injury, and 25 age-matched controls. White matter connections between 165 grey matter brain regions were defined using tractography, and structural connectivity matrices calculated from skeletonized diffusion tensor imaging data. This technique estimates injury at the centre of tracts, but is insensitive to damage at tract edges. Graph metrics were calculated from the resulting connectivity matrices and machine-learning techniques were used to select the metrics that best predicted the presence of traumatic brain injury. In addition, we used regularization and variable selection via the elastic net to predict patient behaviour on tests of information processing speed, executive function and associative memory. Support vector machines trained with graph metrics of white matter connectivity matrices from the microbleed group were able to identify patients with a history of traumatic brain injury with 93.4% accuracy, a result robust to different ways of sampling the data. Graph metrics were significantly associated with cognitive performance: information processing speed (R(2) = 0.64), executive function (R(2) = 0.56) and associative memory (R(2) = 0.25). 
These results were then replicated in a separate group of patients without microbleeds. The most influential graph metrics were betweenness centrality and eigenvector centrality, which provide measures of the extent to which a given brain region connects other regions in the network. Reductions in betweenness centrality and eigenvector centrality were particularly evident within hub regions including the cingulate cortex and caudate. Our results demonstrate that betweenness centrality and eigenvector centrality are reduced within network hubs, due to the impact of traumatic axonal injury on network connections. The dominance of betweenness centrality and eigenvector centrality suggests that cognitive impairment after traumatic brain injury results from the disconnection of network hubs by traumatic axonal injury. © The Author (2015). Published by Oxford University Press on behalf of the Guarantors of Brain.
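As a toy illustration of the centrality measures this study highlights (this is not the authors' pipeline; the region names and connection weights below are hypothetical), eigenvector centrality over a weighted connectivity matrix can be computed by power iteration, and the hub is the region with the highest score:

```python
# Minimal sketch: eigenvector centrality by power iteration on a toy
# 5-region "connectivity matrix". Names and weights are illustrative only.
regions = ["cingulate", "caudate", "parietal", "frontal", "occipital"]
W = [  # symmetric connection strengths (hypothetical)
    [0.0, 0.9, 0.7, 0.0, 0.0],
    [0.9, 0.0, 0.0, 0.6, 0.0],
    [0.7, 0.0, 0.0, 0.0, 0.5],
    [0.0, 0.6, 0.0, 0.0, 0.3],
    [0.0, 0.0, 0.5, 0.3, 0.0],
]

def eigenvector_centrality(W, iters=200):
    """Power iteration: a node is central if its neighbours are central."""
    n = len(W)
    x = [1.0] * n
    for _ in range(iters):
        y = [sum(W[i][j] * x[j] for j in range(n)) for i in range(n)]
        norm = max(y) or 1.0   # normalise so the largest score is 1.0
        x = [v / norm for v in y]
    return x

scores = eigenvector_centrality(W)
hub = regions[scores.index(max(scores))]   # most central region
```

In this toy matrix the "cingulate" node, which carries the strongest connections, emerges as the hub; a drop in such scores after injury is what the abstract describes for real hub regions.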
The Plausibility of a String Quartet Performance in Virtual Reality.
Bergstrom, Ilias; Azevedo, Sergio; Papiotis, Panos; Saldanha, Nuno; Slater, Mel
2017-04-01
We describe an experiment that explores the contribution of auditory and other features to the illusion of plausibility in a virtual environment that depicts the performance of a string quartet. 'Plausibility' refers to the component of presence that is the illusion that the perceived events in the virtual environment are really happening. The features studied were: Gaze (the musicians ignored the participant, the musicians sometimes looked towards and followed the participant's movements), Sound Spatialization (Mono, Stereo, Spatial), Auralization (no sound reflections, reflections corresponding to a room larger than the one perceived, reflections that exactly matched the virtual room), and Environment (no sound from outside of the room, birdsong and wind corresponding to the outside scene). We adopted the methodology based on color matching theory, where 20 participants were first able to assess their feeling of plausibility in the environment with each of the four features at their highest setting. Then five times participants started from a low setting on all features and were able to make transitions from one system configuration to another until they matched their original feeling of plausibility. From these transitions a Markov transition matrix was constructed, and also probabilities of a match conditional on feature configuration. The results show that Environment and Gaze were individually the most important factors influencing the level of plausibility. The highest probability transitions were to improve Environment and Gaze, and then Auralization and Spatialization. We present this work as both a contribution to the methodology of assessing presence without questionnaires, and showing how various aspects of a musical performance can influence plausibility.
What if? Neural activity underlying semantic and episodic counterfactual thinking.
Parikh, Natasha; Ruzic, Luka; Stewart, Gregory W; Spreng, R Nathan; De Brigard, Felipe
2018-05-25
Counterfactual thinking (CFT) is the process of mentally simulating alternative versions of known facts. In the past decade, cognitive neuroscientists have begun to uncover the neural underpinnings of CFT, particularly episodic CFT (eCFT), which activates regions in the default network (DN) also activated by episodic memory (eM) recall. However, the engagement of DN regions is different for distinct kinds of eCFT. More plausible counterfactuals and counterfactuals about oneself show stronger activity in DN regions compared to implausible and other- or object-focused counterfactuals. The current study sought to identify a source for this difference in DN activity. Specifically, self-focused counterfactuals may also be more plausible, suggesting that DN core regions are sensitive to the plausibility of a simulation. On the other hand, plausible and self-focused counterfactuals may involve more episodic information than implausible and other-focused counterfactuals, which would imply DN sensitivity to episodic information. In the current study, we compared episodic and semantic counterfactuals generated to be plausible or implausible against episodic and semantic memory reactivation using fMRI. Taking multivariate and univariate approaches, we found that the DN is engaged more during episodic simulations, including eM and all eCFT, than during semantic simulations. Semantic simulations engaged more inferior temporal and lateral occipital regions. The only region that showed strong plausibility effects was the hippocampus, which was significantly engaged for implausible CFT but not for plausible CFT, suggestive of binding more disparate information. Consequences of these findings for the cognitive neuroscience of mental simulation are discussed. Published by Elsevier Inc.
Schmid, Annina B; Coppieters, Michel W
2011-12-01
A high prevalence of dual nerve disorders is frequently reported. How a secondary nerve disorder may develop following a primary nerve disorder remains largely unknown. Although still frequently cited, most explanatory theories were formulated many years ago. Considering recent advances in neuroscience, it is uncertain whether these theories still reflect current expert opinion. A Delphi study was conducted to update views on potential mechanisms underlying dual nerve disorders. In three rounds, seventeen international experts in the field of peripheral nerve disorders were asked to list possible mechanisms and rate their plausibility. Mechanisms with a median plausibility rating of ≥7 out of 10 were considered highly plausible. The experts identified fourteen mechanisms associated with a first nerve disorder that may predispose to the development of another nerve disorder. Of these fourteen mechanisms, nine have not previously been linked to double crush. Four mechanisms were considered highly plausible (impaired axonal transport, ion channel up or downregulation, inflammation in the dorsal root ganglia and neuroma-in-continuity). Eight additional mechanisms were listed which are not triggered by a primary nerve disorder, but may render the nervous system more vulnerable to multiple nerve disorders, such as systemic diseases and neurotoxic medication. Even though many mechanisms were classified as plausible or highly plausible, overall plausibility ratings varied widely. Experts indicated that a wide range of mechanisms has to be considered to better understand dual nerve disorders. Previously listed theories cannot be discarded, but may be insufficient to explain the high prevalence of dual nerve disorders. Copyright © 2011 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
L'Huillier, Benjamin; Shafieloo, Arman, E-mail: benjamin@kasi.re.kr, E-mail: shafieloo@kasi.re.kr
Using measurements of H(z) and d_A(z) from the Baryon Oscillation Spectroscopic Survey (BOSS) DR12 and luminosity distances from the Joint Lightcurve Analysis (JLA) compilation of supernovae (SN), we measure H_0 r_d without any model assumption. Our measurement of H_0 r_d = (10033.20^{+333.10}_{-371.81} (SN) ± 128.19 (BAO)) km s^{-1} is consistent with Planck constraints for the flat ΛCDM model. We also report that higher expansion history rates h(z) (among the possibilities) as well as lower-bound values of H_0 r_d result in better internal consistency among the independent data (H(z) r_d and d_A(z)/r_d from BAO at z = 0.32 and z = 0.57, and d_L from JLA) used in this work. This can be interpreted as an interesting and independent support of Planck cosmology without using any cosmic microwave background data. We then combine these observables to test the Friedmann-Lemaître-Robertson-Walker (FLRW) metric and the flatness of the Universe in a model-independent way at two redshifts, namely 0.32 and 0.57, by introducing a new diagnostic for flat-FLRW, Θ(z), which depends only on observables of BAO and SN data. Our results are consistent with a flat-FLRW Universe within 2σ.
Schloneger, Matthew J; Hunter, Eric J
2017-01-01
The multiple social and performance demands placed on college/university singers could put their still-developing voices at risk. Previous ambulatory monitoring studies have analyzed the duration, intensity, and frequency (in Hertz) of voice use among such students. Nevertheless, no studies to date have incorporated the simultaneous acoustic voice quality measures into the acquisition of these measures to allow for direct comparison during the same voicing period. Such data could provide greater insight into how young singers use their voices, as well as identify potential correlations between vocal dose and acoustic changes in voice quality. The purpose of this study was to assess the voice use and the estimated voice quality of college/university singing students (18-24 years old, N = 19). Ambulatory monitoring was conducted over three full, consecutive weekdays measuring voice from an unprocessed accelerometer signal measured at the neck. From this signal, traditional vocal dose metrics such as phonation percentage, dose time, cycle dose, and distance dose were analyzed. Additional acoustic measures included perceived pitch, pitch strength, long-term average spectrum slope, alpha ratio, dB sound pressure level 1-3 kHz, and harmonic-to-noise ratio. Major findings from more than 800 hours of recording indicated that among these students (a) higher vocal doses correlated significantly with greater voice intensity, more vocal clarity and less perturbation; and (b) there were significant differences in some acoustic voice quality metrics between nonsinging, solo singing, and choral singing. Copyright © 2017 The Voice Foundation. Published by Elsevier Inc. All rights reserved.
Cosmological models constructed by van der Waals fluid approximation and volumetric expansion
NASA Astrophysics Data System (ADS)
Samanta, G. C.; Myrzakulov, R.
The universe is modeled with a van der Waals fluid approximation, in which the van der Waals equation of state contains a single parameter ω_v. Analytical solutions to Einstein's field equations are obtained by assuming that the mean scale factor of the metric follows volumetric exponential and power-law expansions. The model describes a rapid expansion in which the acceleration grows exponentially and the van der Waals fluid behaves like inflation during an initial epoch of the universe. The model also describes how, as time progresses, the acceleration remains positive but decreases to zero, so that the van der Waals fluid approximation behaves like the present accelerated phase of the universe. Finally, it is observed that the model contains a type-III future singularity for volumetric power-law expansion.
Discrete hierarchy of sizes and performances in the exchange-traded fund universe
NASA Astrophysics Data System (ADS)
Vandermarliere, B.; Ryckebusch, J.; Schoors, K.; Cauwels, P.; Sornette, D.
2017-03-01
Using detailed statistical analyses of the size distribution of a universe of equity exchange-traded funds (ETFs), we discover a discrete hierarchy of sizes, which imprints a log-periodic structure on the probability distribution of ETF sizes that dominates the details of the asymptotic tail. This allows us to propose a classification of the studied universe of ETFs into seven size layers approximately organized according to a multiplicative ratio of 3.5 in their total market capitalization. Introducing a similarity metric generalizing the Herfindahl index, we find that the largest ETFs exhibit a significantly stronger intra-layer and inter-layer similarity compared with the smaller ETFs. Comparing the performance across the seven discerned ETF size layers, we find an inverse size effect, namely large ETFs perform significantly better than the small ones both in 2014 and 2015.
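For readers unfamiliar with the index the authors generalise: the classical Herfindahl index is the sum of squared market shares (1/n for n equal-sized funds, approaching 1 under full concentration). A minimal sketch, with entirely hypothetical capitalizations spaced by the multiplicative ratio ~3.5 mentioned above:

```python
# Sketch of the classical Herfindahl index (the paper generalises this
# into a similarity metric; the generalisation itself is not shown here).
def herfindahl(caps):
    """Sum of squared market shares; 1/n for n equal funds, -> 1 if concentrated."""
    total = sum(caps)
    return sum((c / total) ** 2 for c in caps)

# Seven illustrative "size layers" separated by a multiplicative factor ~3.5
layer_caps = [100 * 3.5 ** k for k in range(7)]
h = herfindahl(layer_caps)   # dominated by the largest layer
```

With a geometric size hierarchy like this, the index is dominated by the top layer's share, which is the kind of concentration structure the abstract's log-periodic analysis quantifies.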
The effects of spatial dynamics on a wormhole throat
NASA Astrophysics Data System (ADS)
Alias, Anuar; Wan Abdullah, Wan Ahmad Tajuddin
2018-02-01
Previous studies on dynamic wormholes were focused on the dynamics of the wormhole itself, be it either rotating or evolutionary in character, and in various frameworks from classical to braneworld cosmological models. In this work, we modeled a dynamic factor that represents the spatial dynamics, in terms of spacetime expansion and contraction, surrounding the wormhole itself. Using an RS2-based braneworld cosmological model, we modified the spacetime metric of Wong and subsequently employed the method of Bronnikov, where it is observed that a traversable wormhole can exist more easily in an expanding brane universe, but with difficulty in a contracting brane universe, owing to the stress-energy tensor requirements. This model of a spatial dynamic factor affecting the wormhole throat can also be applied to the cyclic or the bounce universe model.
Mystery of the Hidden Cosmos [Complex Dark Matter]
Dobrescu, Bogdan A.; Lincoln, Don
2015-06-16
Scientists know there must be more matter in the universe than what is visible. Searches for this dark matter have focused on a single unseen particle, but decades of experiments have been unsuccessful at finding it. Exotic possibilities for dark matter are looking increasingly plausible. Rather than just one particle, dark matter could contain an entire world of particles and forces that barely interact with normal matter. Complex dark matter could form dark atoms and molecules and even clump together to make hidden galactic disks that overlap with the spiral arms of the Milky Way and other galaxies. Experiments are under way to search for evidence of such a dark sector.
Closed-loop, pilot/vehicle analysis of the approach and landing task
NASA Technical Reports Server (NTRS)
Anderson, M. R.; Schmidt, D. K.
1986-01-01
In the case of approach and landing, it is universally accepted that the pilot uses more than one vehicle response, or output, to close his control loops. Therefore, to model this task, a multi-loop analysis technique is required. The analysis problem has been in obtaining reasonable analytic estimates of the describing functions representing the pilot's loop compensation. Once these pilot describing functions are obtained, appropriate performance and workload metrics must then be developed for the landing task. The optimal control approach provides a powerful technique for obtaining the necessary describing functions, once the appropriate task objective is defined in terms of a quadratic objective function. An approach is presented through the use of a simple, reasonable objective function and model-based metrics to evaluate loop performance and pilot workload. The results of an analysis of the LAHOS (Landing and Approach of Higher Order Systems) study performed by R.E. Smith is also presented.
NASA Astrophysics Data System (ADS)
Zierenberg, Johannes; Fricke, Niklas; Marenz, Martin; Spitzner, F. P.; Blavatska, Viktoria; Janke, Wolfhard
2017-12-01
We study long-range power-law correlated disorder on square and cubic lattices. In particular, we present high-precision results for the percolation thresholds and the fractal dimension of the largest clusters as a function of the correlation strength. The correlations are generated using a discrete version of the Fourier filtering method. We consider two different metrics to set the length scales over which the correlations decay, showing that the percolation thresholds are highly sensitive to such system details. By contrast, we verify that the fractal dimension d_f is a universal quantity and unaffected by the choice of metric. We also show that for weak correlations, its value coincides with that for the uncorrelated system. In two dimensions we observe a clear increase of the fractal dimension with increasing correlation strength, approaching d_f → 2. The onset of this change does not seem to be determined by the extended Harris criterion.
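The Fourier filtering method mentioned above has a standard one-dimensional form: white Gaussian noise is transformed to Fourier space, its spectrum is reshaped to a power law S(q) ~ q^(-gamma), and the result is transformed back, yielding disorder with power-law correlations. A minimal sketch (the exponent and size are illustrative, and this 1D version only indicates the idea behind the paper's square/cubic-lattice construction):

```python
import numpy as np

def fourier_filtered_noise(n=4096, gamma=0.5, seed=0):
    """1D Fourier filtering: impose S(q) ~ q^(-gamma) on white noise."""
    rng = np.random.default_rng(seed)
    eta = rng.standard_normal(n)        # uncorrelated input noise
    eta_q = np.fft.rfft(eta)            # to Fourier space
    q = np.fft.rfftfreq(n)
    q[0] = q[1]                         # avoid dividing by zero at q = 0
    eta_q *= q ** (-gamma / 2.0)        # filter: amplitude ~ S(q)^(1/2)
    u = np.fft.irfft(eta_q, n)          # back to real space
    return (u - u.mean()) / u.std()     # zero mean, unit variance

u = fourier_filtered_noise()
```

Thresholding such a field (e.g. occupying sites where u exceeds some level) produces the correlated occupation patterns whose percolation properties the paper measures.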
Near horizon symmetry and entropy formula for Kerr-Newman (A)dS black holes
NASA Astrophysics Data System (ADS)
Setare, Mohammad Reza; Adami, Hamed
2018-04-01
In this paper we provide the first non-trivial evidence for universality of the entropy formula 4π J_0^+ J_0^- beyond pure Einstein gravity in 4 dimensions. We consider the Einstein-Maxwell theory in the presence of a cosmological constant, then write the near horizon metric of the Kerr-Newman (A)dS black hole in the Gaussian null coordinate system. We consider near horizon fall-off conditions for the metric and the U(1) gauge field. We find the asymptotic combined symmetry generator, consisting of a diffeomorphism and a U(1) gauge transformation, such that it preserves the fall-off conditions. Consequently, we find supertranslation, superrotation and multiple-charge modes, and then show that the entropy formula holds for the Kerr-Newman (A)dS black hole. Superrotation modes suffer from a problem; by introducing a new combined symmetry generator, we cure that problem.
AlZhrani, Gmaan; Alotaibi, Fahad; Azarnoush, Hamed; Winkler-Schwartz, Alexander; Sabbagh, Abdulrahman; Bajunaid, Khalid; Lajoie, Susanne P; Del Maestro, Rolando F
2015-01-01
Assessment of neurosurgical technical skills involved in the resection of cerebral tumors in operative environments is complex. Educators emphasize the need to develop and use objective and meaningful assessment tools that are reliable and valid for assessing trainees' progress in acquiring surgical skills. The purpose of this study was to develop proficiency performance benchmarks for a newly proposed set of objective measures (metrics) of neurosurgical technical skills performance during simulated brain tumor resection using a new virtual reality simulator (NeuroTouch). Each participant performed the resection of 18 simulated brain tumors of different complexity using the NeuroTouch platform. Surgical performance was computed using Tier 1 and Tier 2 metrics derived from NeuroTouch simulator data consisting of (1) safety metrics, including (a) volume of surrounding simulated normal brain tissue removed, (b) sum of forces utilized, and (c) maximum force applied during tumor resection; (2) quality of operation metric, which involved the percentage of tumor removed; and (3) efficiency metrics, including (a) instrument total tip path lengths and (b) frequency of pedal activation. All studies were conducted in the Neurosurgical Simulation Research Centre, Montreal Neurological Institute and Hospital, McGill University, Montreal, Canada. A total of 33 participants were recruited, including 17 experts (board-certified neurosurgeons) and 16 novices (7 senior and 9 junior neurosurgery residents). The results demonstrated that "expert" neurosurgeons resected less surrounding simulated normal brain tissue and less tumor tissue than residents. These data are consistent with the concept that "experts" focused more on safety of the surgical procedure compared with novices. By analyzing experts' neurosurgical technical skills performance on these different metrics, we were able to establish benchmarks for goal proficiency performance training of neurosurgery residents. 
This study furthers our understanding of expert neurosurgical performance during the resection of simulated virtual reality tumors and provides neurosurgical trainees with predefined proficiency performance benchmarks designed to maximize the learning of specific surgical technical skills. Copyright © 2015 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
Mohammadhassanzadeh, Hossein; Van Woensel, William; Abidi, Samina Raza; Abidi, Syed Sibte Raza
2017-01-01
Capturing complete medical knowledge is challenging-often due to incomplete patient Electronic Health Records (EHR), but also because of valuable, tacit medical knowledge hidden away in physicians' experiences. To extend the coverage of incomplete medical knowledge-based systems beyond their deductive closure, and thus enhance their decision-support capabilities, we argue that innovative, multi-strategy reasoning approaches should be applied. In particular, plausible reasoning mechanisms apply patterns from human thought processes, such as generalization, similarity and interpolation, based on attributional, hierarchical, and relational knowledge. Plausible reasoning mechanisms include inductive reasoning, which generalizes the commonalities among the data to induce new rules, and analogical reasoning, which is guided by data similarities to infer new facts. By further leveraging rich, biomedical Semantic Web ontologies to represent medical knowledge, both known and tentative, we increase the accuracy and expressivity of plausible reasoning, and cope with issues such as data heterogeneity, inconsistency and interoperability. In this paper, we present a Semantic Web-based, multi-strategy reasoning approach, which integrates deductive and plausible reasoning and exploits Semantic Web technology to solve complex clinical decision support queries. We evaluated our system using a real-world medical dataset of patients with hepatitis, from which we randomly removed different percentages of data (5%, 10%, 15%, and 20%) to reflect scenarios with increasing amounts of incomplete medical knowledge. To increase the reliability of the results, we generated 5 independent datasets for each percentage of missing values, which resulted in 20 experimental datasets (in addition to the original dataset). 
The results show that plausibly inferred knowledge extends the coverage of the knowledge base by, on average, 2%, 7%, 12%, and 16% for datasets with, respectively, 5%, 10%, 15%, and 20% missing values. This expansion in KB coverage allowed solving complex disease diagnostic queries that were previously unresolvable, without losing the correctness of the answers. However, compared to deductive reasoning, data-intensive plausible reasoning mechanisms incur a significant performance overhead. First, we observed that plausible reasoning approaches, by generating tentative inferences and leveraging the domain knowledge of experts, allow us to extend the coverage of medical knowledge bases, resulting in improved clinical decision support. Second, by leveraging OWL ontological knowledge, we are able to increase the expressivity and accuracy of plausible reasoning methods. Third, our approach is applicable to clinical decision support systems for a range of chronic diseases.
University Engineering Design Challenge
2015-01-29
The solution was to design a friction winch that grips the rope tightly enough to wind itself up the rope without slipping. Analysis determined that the friction pulley method would be the better solution: the more the rope is squeezed in the groove, the greater the friction, which keeps the rope from slipping in the pulley as it climbs.
1980-11-01
[Only OCR fragments of the report's reference list survive, citing: "…Systems: A Raytheon Project History", RADC-TR-77-188, Final Technical Report, June 1977; IBM Federal Systems Division, "Statistical Prediction of …", June 1979; W. D. Brooks and R. W. Motley, "Analysis of Discrete Software Reliability Models", IBM Corp., RADC-TR-80-84, RADC, New York, April 1980; and work by J. C. King of IBM and Lori A. Clark of the University of Massachusetts on program exercising and augmentation.]
Induction for Radiology Patients
NASA Astrophysics Data System (ADS)
Yıldırım, Pınar; Tolun, Mehmet R.
This paper presents the implementation of an inductive learning algorithm (ILA) for patients of the Radiology Department at Hacettepe University hospitals, to discover the relationship between patient demographic information and the time patients spend during a specific radiology exam. ILA generates rules, and the results are evaluated with standard evaluation metrics. According to the generated rules, some patients in different age groups or with different birthplaces may spend more time on the same radiology exam than others.
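The rule-generation idea can be sketched in a few lines (this is an illustrative toy, not Tolun's ILA; the attribute names, values, and records below are hypothetical): an attribute value yields a rule only when it always co-occurs with a single target class.

```python
from collections import defaultdict

# Toy records of the kind the abstract describes: demographics -> exam duration.
records = [
    {"age_group": "child", "birthplace": "Ankara", "duration": "long"},
    {"age_group": "child", "birthplace": "Izmir",  "duration": "long"},
    {"age_group": "adult", "birthplace": "Ankara", "duration": "short"},
    {"age_group": "adult", "birthplace": "Izmir",  "duration": "short"},
]

def induce_rules(records, attribute, target):
    """Emit rules 'attribute=value => target=class' for unambiguous values."""
    seen = defaultdict(set)
    for r in records:
        seen[r[attribute]].add(r[target])
    # keep only values that map to exactly one class
    return {v: classes.pop() for v, classes in seen.items() if len(classes) == 1}

rules = induce_rules(records, "age_group", "duration")
```

Here `age_group` cleanly predicts duration while `birthplace` yields no rule, mirroring how an inductive learner singles out the discriminating attributes.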
Maintaining a Distributed File System by Collection and Analysis of Metrics
NASA Technical Reports Server (NTRS)
Bromberg, Daniel
1997-01-01
AFS (originally, Andrew File System) is a widely-deployed distributed file system product used by companies, universities, and laboratories world-wide. However, it is not trivial to operate: running an AFS cell is a formidable task. It requires a team of dedicated and experienced system administrators who must manage a user base numbering in the thousands, rather than the smaller range of 10 to 500 faced by the typical system administrator.
Universal properties of knotted polymer rings.
Baiesi, M; Orlandini, E
2012-09-01
By performing Monte Carlo sampling of N-step self-avoiding polygons embedded on different Bravais lattices we explore the robustness of universality in the entropic, metric, and geometrical properties of knotted polymer rings. In particular, by simulating polygons with N up to 10^5 we furnish a sharp estimate of the asymptotic values of the knot probability ratios and show their independence of the lattice type. This universal feature was previously suggested, although with different estimates of the asymptotic values. In addition, we show that the scaling behavior of the mean-squared radius of gyration of polygons depends on their knot type only through its correction to scaling. Finally, as a measure of the geometrical self-entanglement of the self-avoiding polygons we consider the standard deviation of the writhe distribution and estimate its power-law behavior in the large-N limit. The estimates of the power exponent depend neither on the lattice nor on the knot type, strongly supporting an extension of the universality property to some features of the geometrical entanglement.
An Inverse Square Law Variation for Hubble's Constant
NASA Astrophysics Data System (ADS)
Day, Orville W., Jr.
1999-11-01
The solution to Einstein's gravitational field equations is examined, using a Robertson-Walker metric with positive curvature, when Hubble's parameter, H_0, is taken to be a constant divided by R^2. R is the cosmic scale factor for the universe treated as a three-dimensional hypersphere in a four-dimensional Euclidean space. This solution produces a self-energy of the universe, W^(0)_self, proportional to the square of the total mass, times the universal gravitational constant divided by the cosmic scale factor, R. This result is totally analogous to the self-energy of the electromagnetic field of a charged particle, W^(0)_self = ke^2/2r, where the total charge e is squared, k is the universal electric constant and r is the scale factor, usually identified as the radius of the particle. It is shown that this choice for H_0 leads to physically meaningful results for the average mass density and pressure, and a deceleration parameter q_0 = 1.
Curvature from Strong Gravitational Lensing: A Spatially Closed Universe or Systematics?
NASA Astrophysics Data System (ADS)
Li, Zhengxiang; Ding, Xuheng; Wang, Guo-Jian; Liao, Kai; Zhu, Zong-Hong
2018-02-01
Model-independent constraints on the spatial curvature are not only closely related to important problems, such as the evolution of the universe and properties of dark energy, but also provide a test of the validity of the fundamental Copernican principle. In this paper, with the distance sum rule in the Friedmann–Lemaître–Robertson–Walker metric, we achieve model-independent measurements of the spatial curvature from the latest type Ia supernovae and strong gravitational lensing (SGL) observations. We find that a spatially closed universe is preferred. Moreover, by considering different kinds of velocity dispersion and subsamples, we study possible factors that might affect model-independent estimations for the spatial curvature from SGL observations. It is suggested that the combination of observational data from different surveys might cause a systematic bias, and the tension between the spatially flat universe and SGL observations is alleviated when the subsample only from the Sloan Lens ACS Survey is used or a more complex treatment for the density profile of lenses is considered.
The Janus Cosmological Model (JCM) : An answer to the missing cosmological antimatter
NASA Astrophysics Data System (ADS)
D'Agostini, Gilles; Petit, Jean-Pierre
2017-01-01
The absence of cosmological antimatter remains unexplained. Sakharov's 1967 twin-universe model suggests an answer: the excess of matter and antiquarks produced in our universe is balanced by an equivalent excess of antimatter and quarks in the twin universe. The JCM provides a geometrical framework, with a single manifold and two metrics that are solutions of two coupled field equations, to describe two populations of particles: one with positive energy-mass and the other with negative energy-mass, the 'twin matter'. From a quantum point of view, it is a copy of standard matter but with negative mass and energy. The matter-antimatter duality holds in both sectors. Standard and twin matter do not interact except through the gravitational coupling expressed in the field equations; twin matter is unobservable with matter-made apparatus. The field equations show that matter and twin matter repel each other. Twin matter surrounding galaxies explains their confinement (the role of dark matter) and, in the dust-universe era, mainly drives the expansion of the positive sector, responsible for the observed acceleration (the role of dark energy).
Application of plausible reasoning to AI-based control systems
NASA Technical Reports Server (NTRS)
Berenji, Hamid; Lum, Henry, Jr.
1987-01-01
Some current approaches to plausible reasoning in artificial intelligence are reviewed and discussed. Some of the most significant recent advances in plausible and approximate reasoning are examined. A synergism among the techniques of uncertainty management is advocated, and brief discussions on the certainty factor approach, probabilistic approach, Dempster-Shafer theory of evidence, possibility theory, linguistic variables, and fuzzy control are presented. Some extensions to these methods are described, and the applications of the methods are considered.
Methods for measuring the citations and productivity of scientists across time and discipline.
Petersen, Alexander M; Wang, Fengzhong; Stanley, H Eugene
2010-03-01
Publication statistics are ubiquitous in the ratings of scientific achievement, with citation counts and paper tallies factoring into an individual's consideration for postdoctoral positions, junior faculty, and tenure. Citation statistics are designed to quantify individual career achievement, both at the level of a single publication, and over an individual's entire career. While some academic careers are defined by a few significant papers (possibly out of many), other academic careers are defined by the cumulative contribution made by the author's publications to the body of science. Several metrics have been formulated to quantify an individual's publication career, yet none of these metrics account for the collaboration group size, and the time dependence of citation counts. In this paper we normalize publication metrics in order to achieve a universal framework for analyzing and comparing scientific achievement across both time and discipline. We study the publication careers of individual authors over the 50-year period 1958-2008 within six high-impact journals: CELL, the New England Journal of Medicine (NEJM), Nature, the Proceedings of the National Academy of Science (PNAS), Physical Review Letters (PRL), and Science. Using the normalized metrics (i) "citation shares" to quantify scientific success, and (ii) "paper shares" to quantify scientific productivity, we compare the career achievement of individual authors within each journal, where each journal represents a local arena for competition. We uncover quantifiable statistical regularity in the probability density function of scientific achievement in all journals analyzed, which suggests that a fundamental driving force underlying scientific achievement is the competitive nature of scientific advancement.
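The normalization idea behind "citation shares" can be illustrated by rescaling each paper's citation count by the average for its journal-year cohort, so that papers from different eras and venues become comparable. This is a sketch of the general idea only; the paper's exact normalization may differ in detail, and the numbers below are invented:

```python
# Hedged sketch: normalize citations within each (journal, year) cohort.
# A citation_share of 1.0 means "average for its journal-year".
from collections import defaultdict

papers = [
    {"journal": "PRL", "year": 1970, "cites": 30},
    {"journal": "PRL", "year": 1970, "cites": 10},
    {"journal": "PRL", "year": 2000, "cites": 200},
    {"journal": "PRL", "year": 2000, "cites": 100},
]

cohorts = defaultdict(list)
for p in papers:
    cohorts[(p["journal"], p["year"])].append(p["cites"])

for p in papers:
    group = cohorts[(p["journal"], p["year"])]
    p["citation_share"] = p["cites"] / (sum(group) / len(group))

print([p["citation_share"] for p in papers])  # [1.5, 0.5, ~1.33, ~0.67]
```

Note how the 1970 paper with 30 citations and the 2000 paper with 200 citations both land above 1.0, i.e. above their respective cohort averages, even though their raw counts differ by almost an order of magnitude.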
Methods for measuring the citations and productivity of scientists across time and discipline
NASA Astrophysics Data System (ADS)
Petersen, Alexander M.; Wang, Fengzhong; Stanley, H. Eugene
2010-03-01
Publication statistics are ubiquitous in the ratings of scientific achievement, with citation counts and paper tallies factoring into an individual’s consideration for postdoctoral positions, junior faculty, and tenure. Citation statistics are designed to quantify individual career achievement, both at the level of a single publication, and over an individual’s entire career. While some academic careers are defined by a few significant papers (possibly out of many), other academic careers are defined by the cumulative contribution made by the author’s publications to the body of science. Several metrics have been formulated to quantify an individual’s publication career, yet none of these metrics account for the collaboration group size, and the time dependence of citation counts. In this paper we normalize publication metrics in order to achieve a universal framework for analyzing and comparing scientific achievement across both time and discipline. We study the publication careers of individual authors over the 50-year period 1958-2008 within six high-impact journals: CELL, the New England Journal of Medicine (NEJM), Nature, the Proceedings of the National Academy of Science (PNAS), Physical Review Letters (PRL), and Science. Using the normalized metrics (i) “citation shares” to quantify scientific success, and (ii) “paper shares” to quantify scientific productivity, we compare the career achievement of individual authors within each journal, where each journal represents a local arena for competition. We uncover quantifiable statistical regularity in the probability density function of scientific achievement in all journals analyzed, which suggests that a fundamental driving force underlying scientific achievement is the competitive nature of scientific advancement.
The Invisibility of Diffeomorphisms
NASA Astrophysics Data System (ADS)
De Haro, Sebastian
2017-11-01
I examine the relationship between (d+1)-dimensional Poincaré metrics and d-dimensional conformal manifolds, from both mathematical and physical perspectives. The results have a bearing on several conceptual issues relating to asymptotic symmetries in general relativity and in gauge-gravity duality, as follows: (1: Ambient Construction) I draw from the remarkable work by Fefferman and Graham (Elie Cartan et les Mathématiques d'aujourd'hui, Astérisque, 1985; The Ambient Metric. Annals of Mathematics Studies, Princeton University Press, Princeton, 2012) on conformal geometry, in order to prove two propositions and a theorem that characterise which classes of diffeomorphisms qualify as gravity-invisible. I define natural notions of gravity-invisibility (strong, weak, and simpliciter) that apply to the diffeomorphisms of Poincaré metrics in any dimension. (2: Dualities) I apply the notions of invisibility, developed in (1), to gauge-gravity dualities: which, roughly, relate Poincaré metrics in d+1 dimensions to QFTs in d dimensions. I contrast QFT-visible versus QFT-invisible diffeomorphisms: those gravity diffeomorphisms that can, respectively cannot, be seen from the QFT. The QFT-invisible diffeomorphisms are the ones which are relevant to the hole argument in Einstein spaces. The results on dualities are surprising, because the class of QFT-visible diffeomorphisms is larger than expected, and the class of QFT-invisible ones is smaller than expected, or usually believed, i.e. larger than the PBH diffeomorphisms in Imbimbo et al. (Class Quantum Gravity 17(5):1129, 2000, Eq. 2.6). I also give a general derivation of the asymptotic conformal Killing equation, which has not appeared in the literature before.
Cosmological signature change in Cartan gravity with dynamical symmetry breaking
NASA Astrophysics Data System (ADS)
Magueijo, João; Rodríguez-Vázquez, Matías; Westman, Hans; Złośnik, Tom
2014-03-01
We investigate the possibility for classical metric signature change in a straightforward generalization of the first-order formulation of gravity, dubbed "Cartan gravity." The mathematical structure of this theory mimics the electroweak theory in that the basic ingredients are an SO(1,4) Yang-Mills gauge field A^{ab}_μ and a symmetry-breaking Higgs field V^a, with no metric or affine structure of spacetime presupposed. However, these structures can be recovered, with the predictions of general relativity exactly reproduced, whenever the Higgs field breaking the symmetry to SO(1,3) is forced to have a constant (positive) norm V_a V^a. This restriction is usually imposed "by hand," but in analogy with the electroweak theory we promote the gravitational Higgs field V^a to a genuine dynamical field, subject to nontrivial equations of motion. Even though we limit ourselves to actions polynomial in these variables, we discover a rich phenomenology. Most notably we derive classical cosmological solutions exhibiting a smooth transition between Euclidean and Lorentzian signature in the four-metric. These solutions are nonsingular and arise whenever the SO(1,4) norm of the Higgs field changes sign; i.e. the signature of the metric of spacetime is determined dynamically by the gravitational Higgs field. It is possible to find a plethora of such solutions and in some of them this dramatic behavior is confined to the early Universe, with the theory asymptotically tending to Einstein gravity at late times. Curiously the theory can also naturally embody a well-known dark energy model: Peebles-Ratra quintessence.
Varshney, Rickul; Frenkiel, Saul; Nguyen, Lily H P; Young, Meredith; Del Maestro, Rolando; Zeitouni, Anthony; Tewfik, Marc A
2014-01-01
The technical challenges of endoscopic sinus surgery (ESS) and the high risk of complications support the development of alternative modalities to train residents in these procedures. Virtual reality simulation is becoming a useful tool for training the skills necessary for minimally invasive surgery; however, there are currently no ESS virtual reality simulators available with valid evidence supporting their use in resident education. Our aim was to develop a new rhinology simulator, as well as to define potential performance metrics for trainee assessment. The McGill simulator for endoscopic sinus surgery (MSESS), a new sinus surgery virtual reality simulator with haptic feedback, was developed (a collaboration between the McGill University Department of Otolaryngology-Head and Neck Surgery, the Montreal Neurologic Institute Simulation Lab, and the National Research Council of Canada). A panel of experts in education, performance assessment, rhinology, and skull base surgery convened to identify core technical abilities that would need to be taught by the simulator, as well as performance metrics to be developed and captured. The MSESS allows the user to perform basic sinus surgery skills, such as an ethmoidectomy and sphenoidotomy, through the use of endoscopic tools in a virtual nasal model. The performance metrics were developed by an expert panel and include measurements of safety, quality, and efficiency of the procedure. The MSESS incorporates novel technological advancements to create a realistic platform for trainees. To our knowledge, this is the first simulator to combine novel tools such as the endonasal wash and elaborate anatomic deformity with advanced performance metrics for ESS.
Commissioning for COPD care: a new, recordable metric that supports the patient interest.
Walker, Paul Phillip; Thompson, E; Hill, S L; Holton, K; Bodger, K; Pearson, M G
2016-06-01
Healthcare metrics have been used to drive improvement in outcome and delivery in UK hospital stroke and cardiac care. This model is attractive for chronic obstructive pulmonary disease (COPD) care because of disease frequency and the burden it places on primary, secondary and integrated care services. Using 'hospital episode statistics' (UK 'coding'), we examined hospital 'bed days/1000 population' in 150 UK Primary Care Trusts (PCTs) during 2006-07 and 2007-08. Data were adjusted for COPD prevalence. We looked at year-on-year consistency and factors which influenced variation. There were 248 996 COPD admissions during 2006-08. 'Bed days/1000 PCT population' was consistent between years (r = 0.87; P < 0.001). There was a >2-fold difference in bed days between the best and worst performing PCTs which was primarily a consequence of variation in emergency admission rate (P < 0.001) and proportion of emergency admissions due to COPD (P < 0.001) and to only a lesser extent length of hospital stay (P < 0.001). Bed days/1000 population appears a useful annual metric of COPD care quality. Good COPD care keeps patients active and out of hospital and requires co-ordinated action from both hospital and community services, with an important role for integrated care. This metric demonstrates that current care is highly variable and offers a measurable target to commission against.
Modeled streamflow metrics on small, ungaged stream reaches in the Upper Colorado River Basin
Reynolds, Lindsay V.; Shafroth, Patrick B.
2016-01-20
Modeling streamflow is an important approach for understanding landscape-scale drivers of flow and estimating flows where there are no streamgage records. In this study conducted by the U.S. Geological Survey in cooperation with Colorado State University, the objectives were to model streamflow metrics on small, ungaged streams in the Upper Colorado River Basin and identify streams that are potentially threatened with becoming intermittent under drier climate conditions. The Upper Colorado River Basin is a region that is critical for water resources and also projected to experience large future climate shifts toward a drying climate. A random forest modeling approach was used to model the relationship between streamflow metrics and environmental variables. Flow metrics were then projected to ungaged reaches in the Upper Colorado River Basin using environmental variables for each stream, represented as raster cells, in the basin. Last, the projected random forest models of minimum flow coefficient of variation and specific mean daily flow were used to highlight streams that had greater than 61.84 percent minimum flow coefficient of variation and less than 0.096 specific mean daily flow and suggested that these streams will be most threatened to shift to intermittent flow regimes under drier climate conditions. Map projection products can help scientists, land managers, and policymakers understand current hydrology in the Upper Colorado River Basin and make informed decisions regarding water resources. With knowledge of which streams are likely to undergo significant drying in the future, managers and scientists can plan for stream-dependent ecosystems and human water users.
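The abstract's final classification step is a simple pair of thresholds applied to the modeled flow metrics. A minimal sketch, using the two thresholds quoted in the abstract (61.84 percent and 0.096); the reach records and field names below are hypothetical:

```python
# Hedged sketch: flag potentially threatened stream reaches using the
# thresholds reported in the abstract (minimum-flow CV > 61.84% AND
# specific mean daily flow < 0.096). The reach data are invented.

def threatened(min_flow_cv_pct, specific_mean_daily_flow):
    """True if the modeled metrics suggest a shift toward intermittency."""
    return min_flow_cv_pct > 61.84 and specific_mean_daily_flow < 0.096

reaches = [
    {"id": "A", "cv": 70.0, "q": 0.05},   # high CV, low flow -> flagged
    {"id": "B", "cv": 55.0, "q": 0.05},   # CV below threshold
    {"id": "C", "cv": 70.0, "q": 0.20},   # flow above threshold
]
flags = [r["id"] for r in reaches if threatened(r["cv"], r["q"])]
print(flags)  # ['A']
```

In the study itself these metrics come from random forest models projected onto raster cells; the sketch only shows the final thresholding logic.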
Development and implementation of a balanced scorecard in an academic hospitalist group.
Hwa, Michael; Sharpe, Bradley A; Wachter, Robert M
2013-03-01
Academic hospitalist groups (AHGs) are often expected to excel in multiple domains: quality improvement, patient safety, education, research, administration, and clinical care. To be successful, AHGs must develop strategies to balance their energies, resources, and performance. The balanced scorecard (BSC) is a strategic management system that enables organizations to translate their mission and vision into specific objectives and metrics across multiple domains. To date, no hospitalist group has reported on BSC implementation. We set out to develop a BSC as part of a strategic planning initiative. Based on a needs assessment of the University of California, San Francisco, Division of Hospital Medicine, mission and vision statements were developed. We engaged representative faculty to develop strategic objectives and determine performance metrics across 4 BSC perspectives. There were 41 metrics identified, and 16 were chosen for the initial BSC. It allowed us to achieve several goals: 1) present a broad view of performance, 2) create transparency and accountability, 3) communicate goals and engage faculty, and 4) ensure we use data to guide strategic decisions. Several lessons were learned, including the need to build faculty consensus, establish metrics with reliable measurable data, and the power of the BSC to drive goals across the division. We successfully developed and implemented a BSC in an AHG as part of a strategic planning initiative. The BSC has been instrumental in allowing us to achieve balanced success in multiple domains. Academic groups should consider employing the BSC as it allows for a data-driven strategic planning and assessment process.
NASA Astrophysics Data System (ADS)
Christ, John A.; Lemke, Lawrence D.; Abriola, Linda M.
2005-01-01
The influence of reduced dimensionality (two-dimensional (2-D) versus 3-D) on predictions of dense nonaqueous phase liquid (DNAPL) infiltration and entrapment in statistically homogeneous, nonuniform permeability fields was investigated using the University of Texas Chemical Compositional Simulator (UTCHEM), a 3-D numerical multiphase simulator. Hysteretic capillary pressure-saturation and relative permeability relationships implemented in UTCHEM were benchmarked against those of another lab-tested simulator, the Michigan-Vertical and Lateral Organic Redistribution (M-VALOR). Simulation of a tetrachloroethene spill in 16 field-scale aquifer realizations generated DNAPL saturation distributions with approximately equivalent distribution metrics in two and three dimensions, with 2-D simulations generally resulting in slightly higher maximum saturations and increased vertical spreading. Variability in 2-D and 3-D distribution metrics across the set of realizations was shown to be correlated at a significance level of 95-99%. Neither spill volume nor release rate appeared to affect these conclusions. Variability in the permeability field did affect spreading metrics by increasing the horizontal spreading in 3-D more than in 2-D in more heterogeneous media simulations. The assumption of isotropic horizontal spatial statistics resulted, on average, in symmetric 3-D saturation distribution metrics in the horizontal directions. The practical implication of this study is that for statistically homogeneous, nonuniform aquifers, 2-D simulations of saturation distributions are good approximations to those obtained in 3-D. However, additional work will be needed to explore the influence of dimensionality on simulated DNAPL dissolution.
Cosmological perturbations in the entangled inflationary universe
NASA Astrophysics Data System (ADS)
Robles-Pérez, Salvador J.
2018-03-01
In this paper, the model of a multiverse made up of universes that are created in entangled pairs conserving the total momentum conjugate to the scale factor is presented. For the background spacetime, a Friedmann-Robertson-Walker metric is assumed, with a scalar field of mass m minimally coupled to gravity. For the fields that propagate in the entangled spacetimes, the perturbations of the spacetime and the scalar field, whose quantum states become entangled too, are considered. They turn out to be in a quasithermal state, and the corresponding thermodynamic magnitudes are computed. Three observables are expected to be caused by the creation of the universes in entangled pairs: a modification of the Friedmann equation because of the entanglement of the spacetimes, a modification of the effective value of the potential of the scalar field by the backreaction of the perturbation modes, and a modification of the spectrum of fluctuations because the thermal distribution is induced by the entanglement of the partner universes. The latter would be a distinctive feature of the creation of universes in entangled pairs.
Yavari, Arash; Goriely, Alain
2016-12-01
The elastic Ericksen problem consists of finding deformations in isotropic hyperelastic solids that can be maintained for arbitrary strain-energy density functions. In the compressible case, Ericksen showed that only homogeneous deformations are possible. Here, we solve the anelastic version of the same problem, that is, we determine both the deformations and the eigenstrains such that a solution to the anelastic problem exists for arbitrary strain-energy density functions. Anelasticity is described by finite eigenstrains. In a nonlinear solid, these eigenstrains can be modelled by a Riemannian material manifold whose metric depends on their distribution. In this framework, we show that the natural generalization of the concept of homogeneous deformations is the notion of covariantly homogeneous deformations, i.e., deformations with covariantly constant deformation gradients. We prove that these deformations are the only universal deformations and that they put severe restrictions on possible universal eigenstrains. We show that, in a simply-connected body, for any distribution of universal eigenstrains the material manifold is a symmetric Riemannian manifold and that in dimensions 2 and 3 the universal eigenstrains are zero-stress.
2016-01-01
The elastic Ericksen problem consists of finding deformations in isotropic hyperelastic solids that can be maintained for arbitrary strain-energy density functions. In the compressible case, Ericksen showed that only homogeneous deformations are possible. Here, we solve the anelastic version of the same problem, that is, we determine both the deformations and the eigenstrains such that a solution to the anelastic problem exists for arbitrary strain-energy density functions. Anelasticity is described by finite eigenstrains. In a nonlinear solid, these eigenstrains can be modelled by a Riemannian material manifold whose metric depends on their distribution. In this framework, we show that the natural generalization of the concept of homogeneous deformations is the notion of covariantly homogeneous deformations—deformations with covariantly constant deformation gradients. We prove that these deformations are the only universal deformations and that they put severe restrictions on possible universal eigenstrains. We show that, in a simply-connected body, for any distribution of universal eigenstrains the material manifold is a symmetric Riemannian manifold and that in dimensions 2 and 3 the universal eigenstrains are zero-stress. PMID:28119554
NASA Astrophysics Data System (ADS)
Prince, N. H. E.
2005-10-01
Meaning and purpose can be given to life, consciousness, the laws of physics, etc., if one assumes that the Universe is endowed with some form of (strong) anthropic principle. In particular, the final anthropic principle (FAP) of Barrow and Tipler postulates that intelligent life will continue in the Universe until the far future when the computational power of descendent civilizations will be sufficient to run simulations of enormous scale and power. Tipler has claimed that it will be possible to create simulations with rendered environments and inhabitants, i.e. intelligent software constructs, which are effectively 'people'. Proponents of this FAP claim that if both substrate independence and the pattern identity postulate hold, then these simulations would be able to contain reanimated individuals that once lived. These claims have been heavily criticized, but the growing study of physical eschatology, initiated by Freeman Dyson in a seminal work, and the developments in computational theory have made some progress in showing that simulations containing intelligent information-processing software constructs, which may be conscious, are not only feasible but may be a reality within the next few centuries. In this work, arguments and conservative calculations are given which concur with these latter, more minimal claims. FAP-type simulations inevitably rely on the type of cosmology, and current observations would seem to rule the appropriate models out. However, it is argued that dark energy, described in the recent forms of 'quintessence' cosmological models, may show the current conclusions from observations to be too presumptive.
In this paper some relevant physical and cosmological aspects are reviewed in the light of the recent propositions regarding the plausibility of certain simulations given by Bostrom, and the longer held postulate of finite nature due to Fredkin which has grown in credibility, following advances in quantum mechanics and the computational theory of cellular automata. This latter postulate supports the conclusions of Bostrom, which, under certain plausible assumptions, can imply that our Universe is itself already a simulated entity. It is demonstrated in this paper how atemporal memory connections could make efficient ancestor simulations possible, solving many of the objections faced by the FAP of Barrow and Tipler. Also, if finite nature is true then it can offer a similar vindication to this FAP. Indeed the conclusions of this postulate can be realized more easily, but only if the existence of life within the simulation/Universe is not merely incidental to the (currently unknown) purpose for which it was generated to fulfil.
Scholarometer: a social framework for analyzing impact across disciplines.
Kaur, Jasleen; Hoang, Diep Thi; Sun, Xiaoling; Possamai, Lino; Jafariasbagh, Mohsen; Patil, Snehal; Menczer, Filippo
2012-01-01
The use of quantitative metrics to gauge the impact of scholarly publications, authors, and disciplines is predicated on the availability of reliable usage and annotation data. Citation and download counts are widely available from digital libraries. However, current annotation systems rely on proprietary labels, refer to journals but not articles or authors, and are manually curated. To address these limitations, we propose a social framework based on crowdsourced annotations of scholars, designed to keep up with the rapidly evolving disciplinary and interdisciplinary landscape. We describe a system called Scholarometer, which provides a service to scholars by computing citation-based impact measures. This creates an incentive for users to provide disciplinary annotations of authors, which in turn can be used to compute disciplinary metrics. We first present the system architecture and several heuristics to deal with noisy bibliographic and annotation data. We report on data sharing and interactive visualization services enabled by Scholarometer. Usage statistics, illustrating the data collected and shared through the framework, suggest that the proposed crowdsourcing approach can be successful. Secondly, we illustrate how the disciplinary bibliometric indicators elicited by Scholarometer allow us to implement for the first time a universal impact measure proposed in the literature. Our evaluation suggests that this metric provides an effective means for comparing scholarly impact across disciplinary boundaries.
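The citation-based impact measures a service like Scholarometer computes start from simple statistics such as the h-index, which cross-disciplinary ("universal") variants then rescale by a discipline baseline derived from the crowdsourced annotations. A minimal sketch of that pipeline; the discipline average below is a made-up placeholder, and the paper's exact universal metric may differ:

```python
# Hedged sketch: the h-index plus a simple discipline normalization in
# the spirit of a "universal" impact measure. The discipline baseline
# here (mean_h = 20.0) is a hypothetical value for illustration.

def h_index(citations):
    """Largest h such that the author has h papers with >= h citations."""
    cites = sorted(citations, reverse=True)
    return sum(1 for rank, c in enumerate(cites, start=1) if c >= rank)

print(h_index([10, 8, 5, 4, 3]))  # 4

# Cross-discipline comparison: divide by the average h of the author's
# (crowdsource-annotated) discipline, so values are comparable across fields.
mean_h = 20.0  # hypothetical discipline average
print(h_index([25, 8, 5, 3, 3]) / mean_h)  # 0.15
```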
Preheating after multifield inflation with nonminimal couplings. III. Dynamical spacetime results
NASA Astrophysics Data System (ADS)
DeCross, Matthew P.; Kaiser, David I.; Prabhu, Anirudh; Prescod-Weinstein, Chanda; Sfakianakis, Evangelos I.
2018-01-01
This paper concludes our semianalytic study of preheating in inflationary models comprised of multiple scalar fields coupled nonminimally to gravity. Using the covariant framework of paper I in this series, we extend the rigid-spacetime results of paper II by considering both the expansion of the Universe during preheating, as well as the effect of the coupled metric perturbations on particle production. The adiabatic and isocurvature perturbations are governed by different effective masses that scale differently with the nonminimal couplings and evolve differently in time. The effective mass for the adiabatic modes is dominated by contributions from the coupled metric perturbations immediately after inflation. The metric perturbations contribute an oscillating tachyonic term that enhances an early period of significant particle production for the adiabatic modes, which ceases on a time scale governed by the nonminimal couplings ξ_I. The effective mass of the isocurvature perturbations, on the other hand, is dominated by contributions from the fields' potential and from the curvature of the field-space manifold (in the Einstein frame), the balance between which shifts on a time scale governed by ξ_I. As in papers I and II, we identify distinct behavior depending on whether the nonminimal couplings are small [ξ_I ≲ O(1)], intermediate [ξ_I ~ O(1-10)], or large (ξ_I ≥ 100).
Grid cells form a global representation of connected environments.
Carpenter, Francis; Manson, Daniel; Jeffery, Kate; Burgess, Neil; Barry, Caswell
2015-05-04
The firing patterns of grid cells in medial entorhinal cortex (mEC) and associated brain areas form triangular arrays that tessellate the environment [1, 2] and maintain constant spatial offsets to each other between environments [3, 4]. These cells are thought to provide an efficient metric for navigation in large-scale space [5-8]. However, an accurate and universal metric requires grid cell firing patterns to uniformly cover the space to be navigated, in contrast to recent demonstrations that environmental features such as boundaries can distort [9-11] and fragment [12] grid patterns. To establish whether grid firing is determined by local environmental cues, or provides a coherent global representation, we recorded mEC grid cells in rats foraging in an environment containing two perceptually identical compartments connected via a corridor. During initial exposures to the multicompartment environment, grid firing patterns were dominated by local environmental cues, replicating between the two compartments. However, with prolonged experience, grid cell firing patterns formed a single, continuous representation that spanned both compartments. Thus, we provide the first evidence that in a complex environment, grid cell firing can form the coherent global pattern necessary for them to act as a metric capable of supporting large-scale spatial navigation.
Grid Cells Form a Global Representation of Connected Environments
Carpenter, Francis; Manson, Daniel; Jeffery, Kate; Burgess, Neil; Barry, Caswell
2015-01-01
Summary The firing patterns of grid cells in medial entorhinal cortex (mEC) and associated brain areas form triangular arrays that tessellate the environment [1, 2] and maintain constant spatial offsets to each other between environments [3, 4]. These cells are thought to provide an efficient metric for navigation in large-scale space [5–8]. However, an accurate and universal metric requires grid cell firing patterns to uniformly cover the space to be navigated, in contrast to recent demonstrations that environmental features such as boundaries can distort [9–11] and fragment [12] grid patterns. To establish whether grid firing is determined by local environmental cues, or provides a coherent global representation, we recorded mEC grid cells in rats foraging in an environment containing two perceptually identical compartments connected via a corridor. During initial exposures to the multicompartment environment, grid firing patterns were dominated by local environmental cues, replicating between the two compartments. However, with prolonged experience, grid cell firing patterns formed a single, continuous representation that spanned both compartments. Thus, we provide the first evidence that in a complex environment, grid cell firing can form the coherent global pattern necessary for them to act as a metric capable of supporting large-scale spatial navigation. PMID:25913404
NASA Astrophysics Data System (ADS)
Ivanov, A. N.; Wellenzohn, M.
2016-09-01
We analyse the Einstein-Cartan gravity in its standard form ℛ = R + 𝒦², where ℛ and R are the Ricci scalar curvatures in the Einstein-Cartan and Einstein gravity, respectively, and 𝒦² is the quadratic contribution of torsion in terms of the contorsion tensor 𝒦. We treat torsion as an external (or background) field and show that its contribution to the Einstein equations can be interpreted in terms of the torsion energy-momentum tensor, local conservation of which in a curved spacetime with an arbitrary metric or an arbitrary gravitational field demands proportionality of the torsion energy-momentum tensor to the metric tensor, whose covariant derivative vanishes owing to the metricity condition. This allows us to claim that torsion can serve as an origin for the vacuum energy density, given by the cosmological constant or dark energy density in the universe. This is a model-independent result that may explain the small value of the cosmological constant, which is a long-standing problem in cosmology. We show that the obtained result is valid also in the Poincaré gauge gravitational theory of Kibble, where the Einstein-Hilbert action can be represented in the same form: ℛ = R + 𝒦².
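The abstract's chain of reasoning can be written schematically as follows; the notation is simplified and the sign and factor conventions are the usual ones, not checked against the paper:

```latex
% Einstein-Cartan curvature split into Einstein curvature plus torsion:
\mathcal{R} = R + \mathcal{K}^2

% Torsion contributes an effective energy-momentum tensor; demanding its local
% conservation for an arbitrary metric forces proportionality to g_{\mu\nu},
% which is consistent because of the metricity condition:
\nabla^{\mu} T^{(\mathrm{tors})}_{\mu\nu} = 0
\ \text{for arbitrary}\ g_{\mu\nu}
\;\Longrightarrow\;
T^{(\mathrm{tors})}_{\mu\nu} \propto g_{\mu\nu}
\quad (\nabla_{\alpha} g_{\mu\nu} = 0)

% Identifying the proportionality constant with a vacuum energy density:
T^{(\mathrm{tors})}_{\mu\nu} = -\frac{\Lambda}{8\pi G}\, g_{\mu\nu}
```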
Stites, Steven; Steffen, Patrick; Turner, Scott; Pingleton, Susan
2013-07-01
A new metric was developed and implemented at the University of Kansas School of Medicine Department of Internal Medicine, the financial value unit (FVU). This metric analyzes faculty clinical compensation compared with clinical work productivity as a transparent means to decrease the physician compensation variability and compensate faculty equitably for clinical work.The FVU is the ratio of individual faculty clinical compensation compared with their total work relative value units (wRVUs) generated divided by Medical Group Management Association (MGMA) salary to wRVUs of a similar MGMA physician.The closer the FVU ratio is to 1.0, the closer clinical compensation is to that of an MGMA physician with similar clinical productivity. Using FVU metrics to calculate a faculty salary gap compared with MGMA median salary and wRVU productivity, a divisional production payment was established annually.From FY 2006 to FY 2011, both total faculty numbers and overall clinical activity increased. With the implementation of the FVU, both clinical productivity and compensation increased while, at the same time, physician retention rates remained high. Variability in physician compensation decreased. Dramatic clinical growth was associated with the alignment of clinical work and clinical compensation in a transparent and equable process.
Feldman, Betsy J.; Crane, Heidi M.; Mugavero, Michael; Willig, James H.; Patrick, Donald; Schumacher, Joseph; Saag, Michael; Kitahata, Mari M.; Crane, Paul K.
2011-01-01
Purpose We provide detailed instructions for analyzing patient-reported outcome (PRO) data collected with an existing (legacy) instrument so that scores can be calibrated to the PRO Measurement Information System (PROMIS) metric. This calibration facilitates migration to computerized adaptive test (CAT) PROMIS data collection, while facilitating research using historical legacy data alongside new PROMIS data. Methods A cross-sectional convenience sample (n = 2,178) from the Universities of Washington and Alabama at Birmingham HIV clinics completed the PROMIS short form and Patient Health Questionnaire (PHQ-9) depression symptom measures between August 2008 and December 2009. We calibrated the tests using item response theory. We compared measurement precision of the PHQ-9, the PROMIS short form, and simulated PROMIS CAT. Results Dimensionality analyses confirmed the PHQ-9 could be calibrated to the PROMIS metric. We provide code used to score the PHQ-9 on the PROMIS metric. The mean standard errors of measurement were 0.49 for the PHQ-9, 0.35 for the PROMIS short form, and 0.37, 0.28, and 0.27 for 3-, 8-, and 9-item-simulated CATs. Conclusions The strategy described here facilitated migration from a fixed-format legacy scale to PROMIS CAT administration and may be useful in other settings. PMID:21409516
Gibbons, Laura E; Feldman, Betsy J; Crane, Heidi M; Mugavero, Michael; Willig, James H; Patrick, Donald; Schumacher, Joseph; Saag, Michael; Kitahata, Mari M; Crane, Paul K
2011-11-01
We provide detailed instructions for analyzing patient-reported outcome (PRO) data collected with an existing (legacy) instrument so that scores can be calibrated to the PRO Measurement Information System (PROMIS) metric. This calibration facilitates migration to computerized adaptive test (CAT) PROMIS data collection, while facilitating research using historical legacy data alongside new PROMIS data. A cross-sectional convenience sample (n = 2,178) from the Universities of Washington and Alabama at Birmingham HIV clinics completed the PROMIS short form and Patient Health Questionnaire (PHQ-9) depression symptom measures between August 2008 and December 2009. We calibrated the tests using item response theory. We compared measurement precision of the PHQ-9, the PROMIS short form, and simulated PROMIS CAT. Dimensionality analyses confirmed the PHQ-9 could be calibrated to the PROMIS metric. We provide code used to score the PHQ-9 on the PROMIS metric. The mean standard errors of measurement were 0.49 for the PHQ-9, 0.35 for the PROMIS short form, and 0.37, 0.28, and 0.27 for 3-, 8-, and 9-item-simulated CATs. The strategy described here facilitated migration from a fixed-format legacy scale to PROMIS CAT administration and may be useful in other settings.
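The paper's actual scoring code is not reproduced here, but as general context for "scoring on the PROMIS metric": PROMIS instruments report T-scores, a linear rescaling of the IRT latent-trait estimate. A minimal sketch (the function name is ours; the transformation is the standard PROMIS convention):

```python
def promis_t_score(theta):
    """Convert an IRT latent-trait estimate (theta, mean 0, SD 1 in the
    calibration sample) to the PROMIS T-score metric (mean 50, SD 10)."""
    return 50 + 10 * theta

print(promis_t_score(0.0))  # 50.0 -> calibration-sample mean
print(promis_t_score(1.5))  # 65.0 -> 1.5 SD above the mean
```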
Scholarometer: A Social Framework for Analyzing Impact across Disciplines
Sun, Xiaoling; Possamai, Lino; JafariAsbagh, Mohsen; Patil, Snehal; Menczer, Filippo
2012-01-01
The use of quantitative metrics to gauge the impact of scholarly publications, authors, and disciplines is predicated on the availability of reliable usage and annotation data. Citation and download counts are widely available from digital libraries. However, current annotation systems rely on proprietary labels, refer to journals but not articles or authors, and are manually curated. To address these limitations, we propose a social framework based on crowdsourced annotations of scholars, designed to keep up with the rapidly evolving disciplinary and interdisciplinary landscape. We describe a system called Scholarometer, which provides a service to scholars by computing citation-based impact measures. This creates an incentive for users to provide disciplinary annotations of authors, which in turn can be used to compute disciplinary metrics. We first present the system architecture and several heuristics to deal with noisy bibliographic and annotation data. We report on data sharing and interactive visualization services enabled by Scholarometer. Usage statistics, illustrating the data collected and shared through the framework, suggest that the proposed crowdsourcing approach can be successful. Secondly, we illustrate how the disciplinary bibliometric indicators elicited by Scholarometer allow us to implement for the first time a universal impact measure proposed in the literature. Our evaluation suggests that this metric provides an effective means for comparing scholarly impact across disciplinary boundaries. PMID:22984414
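The "universal impact measure proposed in the literature" normalizes a scholar's h-index by a discipline-level average so that impact is comparable across fields. A minimal sketch, assuming a Radicchi-style rescaling h/⟨h⟩; the exact measure and inputs used by Scholarometer may differ:

```python
def universal_h(h, discipline_mean_h):
    """Rescale an author's h-index by the mean h-index of their discipline,
    so scholars in fields with different citation norms become comparable."""
    return h / discipline_mean_h

# Hypothetical: h=20 in a field averaging h=10 vs. h=30 in a field averaging h=30.
print(universal_h(20, 10))  # 2.0
print(universal_h(30, 30))  # 1.0 -> the first author has higher field-relative impact
```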
Ghadie, Mohamed A; Japkowicz, Nathalie; Perkins, Theodore J
2015-08-15
Stem cell differentiation is largely guided by master transcriptional regulators, but it also depends on the expression of other types of genes, such as cell cycle genes, signaling genes, metabolic genes, trafficking genes, etc. Traditional approaches to understanding gene expression patterns across multiple conditions, such as principal components analysis or K-means clustering, can group cell types based on gene expression, but they do so without knowledge of the differentiation hierarchy. Hierarchical clustering can organize cell types into a tree, but in general this tree is different from the differentiation hierarchy itself. Given the differentiation hierarchy and gene expression data at each node, we construct a weighted Euclidean distance metric such that the minimum spanning tree with respect to that metric is precisely the given differentiation hierarchy. We provide a set of linear constraints that are provably sufficient for the desired construction and a linear programming approach to identify sparse sets of weights, effectively identifying genes that are most relevant for discriminating different parts of the tree. We apply our method to microarray gene expression data describing 38 cell types in the hematopoiesis hierarchy, constructing a weighted Euclidean metric that uses just 175 genes. However, we find that there are many alternative sets of weights that satisfy the linear constraints. Thus, in the style of random-forest training, we also construct metrics based on random subsets of the genes and compare them to the metric of 175 genes. We then report on the selected genes and their biological functions. Our approach offers a new way to identify genes that may have important roles in stem cell differentiation. Contact: tperkins@ohri.ca. Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved.
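The construction above (a weighted Euclidean metric whose minimum spanning tree reproduces a known hierarchy) can be illustrated with a toy example. The cell types, gene values, and hand-picked weights below are hypothetical, and the paper's linear-programming formulation is not reproduced; this only shows how reweighting genes changes the MST:

```python
import math

def weighted_dist(x, y, w):
    """Weighted Euclidean distance: sqrt(sum_k w_k * (x_k - y_k)^2)."""
    return math.sqrt(sum(wk * (xk - yk) ** 2 for wk, xk, yk in zip(w, x, y)))

def mst_edges(points, w):
    """Prim's algorithm; returns the MST as a set of (name, name) edges."""
    names = sorted(points)
    in_tree = {names[0]}
    edges = set()
    while len(in_tree) < len(names):
        _, a, b = min((weighted_dist(points[a], points[b], w), a, b)
                      for a in in_tree for b in names if b not in in_tree)
        in_tree.add(b)
        edges.add(tuple(sorted((a, b))))
    return edges

# Hypothetical 2-gene expression profiles for three cell types, where the known
# hierarchy is A -> B and A -> C.  Gene 2 tracks a shared program in B and C
# that is unrelated to lineage.
expr = {"A": (1, 0), "B": (0, 10), "C": (2, 10)}

print(mst_edges(expr, (1.0, 1.0)))   # unweighted MST links B-C, not the hierarchy
print(mst_edges(expr, (1.0, 0.01)))  # down-weighting gene 2 recovers {A-B, A-C}
```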
Metric 3-Leibniz algebras and M2-branes
NASA Astrophysics Data System (ADS)
Méndez-Escobar, Elena
2010-08-01
This thesis is concerned with superconformal Chern-Simons theories with matter in 3 dimensions. The interest in these theories is two-fold. On the one hand, they are a new family of theories in which to test the AdS/CFT correspondence, and on the other, they are important for studying one of the main objects of M-theory (M2-branes). All these theories have something in common: they can be written in terms of 3-Leibniz algebras. Here we study the structure theory of such algebras, paying special attention to a subclass of them that gives rise to maximal supersymmetry and that was the first to appear in this context: 3-Lie algebras. In chapter 2, we review the structure theory of metric Lie algebras and their unitary representations. In chapter 3, we study metric 3-Leibniz algebras and show, by specialising a construction originally due to Faulkner, that they are in one-to-one correspondence with pairs of real metric Lie algebras and unitary representations of them. We also show a third characterisation for six extreme cases of 3-Leibniz algebras as graded Lie (super)algebras. In chapter 4, we study metric 3-Lie algebras in detail. We prove a structural result and also classify those with a maximally isotropic centre, which is the requirement that ensures unitarity of the corresponding conformal field theory. Finally, in chapter 5, we study the universal structure of superpotentials in this class of superconformal Chern-Simons theories with matter in three dimensions. We provide a uniform formulation for all these theories and establish the connection between the amount of supersymmetry preserved and the gauge Lie algebra and the appropriate unitary representation to be used to write down the Lagrangian. The conditions for supersymmetry enhancement are then expressed equivalently in the language of representation theory of Lie algebras or the language of 3-Leibniz algebras.
An evaluation of non-metric cranial traits used to estimate ancestry in a South African sample.
L'Abbé, E N; Van Rooyen, C; Nawrocki, S P; Becker, P J
2011-06-15
Establishing ancestry from a skeleton for forensic purposes has been shown to be difficult. The purpose of this paper is to address the application of thirteen non-metric traits to estimate ancestry in three South African groups, namely White, Black and "Coloured". In doing so, the frequency distributions of thirteen non-metric traits among South Africans are presented; the relationships of these non-metric traits with ancestry, sex, and age at death are evaluated; and Kappa statistics are utilized to assess inter- and intra-rater reliability. Crania of 520 known individuals were obtained from four skeletal samples in South Africa: the Pretoria Bone Collection, the Raymond A. Dart Collection, the Kirsten Collection and the Student Bone Collection from the University of the Free State. Average age at death was 51, with an age range between 18 and 90. Thirteen commonly used non-metric traits from the face and jaw were scored; definitions and illustrations were taken from Hefner, Bass and Hauser and De Stephano. Frequency distributions, ordinal regression and Cohen's Kappa statistics were performed as a means to assess population variation and repeatability. Frequency distributions were highly variable among South Africans. Twelve of the 13 variables had a statistically significant relationship with ancestry. Sex significantly affected only one variable, inter-orbital breadth, and age at death affected two (anterior nasal spine and alveolar prognathism). The interaction of ancestry and sex independently affected three variables (nasal bone contour, nasal breadth, and interorbital breadth). Seven traits had moderate to excellent repeatability, while poor scoring consistency was noted for six variables. Difficulties in repeating several of the trait scores may require either a refinement of the definitions, or these character states may not adequately describe the observable morphology in the population.
The application of the traditional experience-based approach for estimating ancestry in forensic case work is problematic. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
Koral, Korgün; Mathis, Derek; Gimi, Barjor; Gargan, Lynn; Weprin, Bradley; Bowers, Daniel C; Margraf, Linda
2013-08-01
To test whether there is correlation between cell densities and apparent diffusion coefficient (ADC) metrics of common pediatric cerebellar tumors. This study was reviewed for issues of patient safety and confidentiality and was approved by the Institutional Review Board of the University of Texas Southwestern Medical Center and was compliant with HIPAA. The need for informed consent was waived. Ninety-five patients who had preoperative magnetic resonance imaging and surgical pathologic findings available between January 2003 and June 2011 were included. There were 37 pilocytic astrocytomas, 34 medulloblastomas (23 classic, eight desmoplastic-nodular, two large cell, one anaplastic), 17 ependymomas (13 World Health Organization [WHO] grade II, four WHO grade III), and seven atypical teratoid rhabdoid tumors. ADCs of solid tumor components and normal cerebellum were measured. Tumor-to-normal brain ADC ratios (hereafter, ADC ratio) were calculated. The medulloblastomas and ependymomas were subcategorized according to the latest WHO classification, and tumor cellularity was calculated. Correlation was sought between cell densities and mean tumor ADCs, minimum tumor ADCs, and ADC ratio. When all tumors were considered together, negative correlation was found between cellularity and mean tumor ADCs (ρ = -0.737, P < .05) and minimum tumor ADCs (ρ = -0.736, P < .05) of common pediatric cerebellar tumors. There was no correlation between cellularity and ADC ratio. Negative correlation was found between cellularity and minimum tumor ADC in atypical teratoid rhabdoid tumors (ρ = -0.786, P < .05). In atypical teratoid rhabdoid tumors, no correlation was found between cellularity and mean tumor ADC and ADC ratio. There was no correlation between the ADC metrics and cellularity of the pilocytic astrocytomas, medulloblastomas, and ependymomas. Negative correlation was found between cellularity and ADC metrics of common pediatric cerebellar tumors. 
Although ADC metrics are useful in the preoperative diagnosis of common pediatric cerebellar tumors and this utility is generally attributed to differences in cellularity of tumors, tumor cellularity may not be the sole determinant of the differences in diffusivity.
Dynamics in a Maximally Symmetric Universe
NASA Astrophysics Data System (ADS)
Bewketu, Asnakew
2016-03-01
Our present understanding of the evolution of the universe relies upon the Friedmann-Robertson-Walker cosmological models. This model is so successful that it is now considered the Standard Model of Cosmology. In this work we derive the Friedmann equations using the Friedmann-Robertson-Walker metric together with the Einstein field equations, and then we give a simple method to reduce the Friedmann equations to a second-order linear differential equation when they are supplemented with a time-dependent equation of state. Furthermore, as illustrative examples, we solve this equation for some specific time-dependent equations of state. Also, using the Friedmann equations with a time-dependent equation of state, we determine the cosmic scale factor (the rate at which the universe expands) and the age of the Friedmann universe for the matter-dominated era, the radiation-dominated era, and the era dominated by both matter and radiation, considering different cases. We finally discuss the observable quantities that can serve as evidence for the accelerated expansion of the Friedmann universe. I would like to acknowledge Addis Ababa University for its financial and material support of my work on the title mentioned above.
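For reference, the standard results summarized in this abstract (textbook forms, not transcribed from the thesis itself) are:

```latex
% FRW metric (c = 1):
ds^2 = -dt^2 + a^2(t)\left[\frac{dr^2}{1-kr^2} + r^2\,d\Omega^2\right]

% Friedmann equations from the Einstein field equations:
\left(\frac{\dot a}{a}\right)^2 = \frac{8\pi G}{3}\,\rho - \frac{k}{a^2},
\qquad
\frac{\ddot a}{a} = -\frac{4\pi G}{3}\,(\rho + 3p)

% With a constant equation of state p = w\rho (and k = 0), the scale factor obeys
a(t) \propto t^{2/[3(1+w)]}:
\quad a \propto t^{2/3}\ \text{(matter, } w=0\text{)},
\qquad a \propto t^{1/2}\ \text{(radiation, } w=\tfrac{1}{3}\text{)}
```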
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gagne, MC; Archambault, L; CHU de Quebec, Quebec, Quebec
2014-06-15
Purpose: Intensity modulated radiation therapy always requires compromises between PTV coverage and organs at risk (OAR) sparing. We previously developed metrics that correlate doses to OAR to specific patients' morphology using stochastic frontier analysis (SFA). Here, we aim to examine the validity of this approach using a large set of realistically simulated dosimetric and geometric data. Methods: SFA describes a set of treatment plans as an asymmetric distribution with respect to a frontier defining optimal plans. Eighty head and neck IMRT plans were used to establish a metric predicting the mean dose to parotids as a function of simple geometric parameters. A database of 140 parotids was used as a basis distribution to simulate physically plausible data of geometry and dose. Distributions comprising between 20 and 5000 organs were simulated and the SFA was applied to obtain new frontiers, which were compared to the original frontier. Results: It was possible to simulate distributions consistent with the original dataset. Below 160 organs, the SFA could not always describe distributions as asymmetric: a few cases showed a Gaussian or half-Gaussian distribution. In order to converge to a stable solution, the number of organs in a distribution must ideally be above 100, but in many cases stable parameters could be achieved with as few as 60 samples of organ data. The mean RMS error of the new frontiers was significantly reduced when additional organs were used. Conclusion: The number of organs in a distribution was shown to have an impact on the effectiveness of the model. It is always possible to obtain a frontier, but if the number of organs in the distribution is small (< 160), it may not represent the lowest dose achievable. These results will be used to determine the number of cases necessary to adapt the model to other organs.
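The simulation step described in Methods (physically plausible geometry-dose pairs scattered asymmetrically above an optimal frontier) can be sketched as follows. The frontier shape, parameter values, and noise levels here are hypothetical, not the paper's:

```python
import random

random.seed(0)

def simulate_plans(n, frontier, sigma_u=2.0, sigma_v=0.5):
    """Simulate (geometry, dose) pairs in the SFA picture: each plan's mean
    organ dose sits above an optimal frontier f(x) by a one-sided
    inefficiency term u >= 0, plus symmetric noise v."""
    data = []
    for _ in range(n):
        x = random.uniform(0.0, 1.0)         # simple geometric parameter
        u = abs(random.gauss(0.0, sigma_u))  # one-sided: suboptimal planning
        v = random.gauss(0.0, sigma_v)       # symmetric measurement noise
        data.append((x, frontier(x) + u + v))
    return data

# Hypothetical linear frontier: minimum achievable mean parotid dose vs. geometry.
frontier = lambda x: 10.0 + 30.0 * x
plans = simulate_plans(200, frontier)
excess = [dose - frontier(x) for x, dose in plans]
print(sum(e > 0 for e in excess))  # most plans lie above the frontier
```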
Kellman, Philip J.; Mnookin, Jennifer L.; Erlikhman, Gennady; Garrigan, Patrick; Ghose, Tandra; Mettler, Everett; Charlton, David; Dror, Itiel E.
2014-01-01
Latent fingerprint examination is a complex task that, despite advances in image processing, still fundamentally depends on the visual judgments of highly trained human examiners. Fingerprints collected from crime scenes typically contain less information than fingerprints collected under controlled conditions. Specifically, they are often noisy and distorted and may contain only a portion of the total fingerprint area. Expertise in fingerprint comparison, like other forms of perceptual expertise, such as face recognition or aircraft identification, depends on perceptual learning processes that lead to the discovery of features and relations that matter in comparing prints. Relatively little is known about the perceptual processes involved in making comparisons, and even less is known about what characteristics of fingerprint pairs make particular comparisons easy or difficult. We measured expert examiner performance and judgments of difficulty and confidence on a new fingerprint database. We developed a number of quantitative measures of image characteristics and used multiple regression techniques to discover objective predictors of error as well as perceived difficulty and confidence. A number of useful predictors emerged, and these included variables related to image quality metrics, such as intensity and contrast information, as well as measures of information quantity, such as the total fingerprint area. Also included were configural features that fingerprint experts have noted, such as the presence and clarity of global features and fingerprint ridges. Within the constraints of the overall low error rates of experts, a regression model incorporating the derived predictors demonstrated reasonable success in predicting objective difficulty for print pairs, as shown both in goodness of fit measures to the original data set and in a cross validation test. 
The results indicate the plausibility of using objective image metrics to predict expert performance and subjective assessment of difficulty in fingerprint comparisons. PMID:24788812
Assessment of the Relationship Between Flexibility and Adaptive Capacity in Flood Management Systems
NASA Astrophysics Data System (ADS)
DiFrancesco, K.; Tullos, D. D.
2013-12-01
Discussions around adapting water management systems to future changes often state the need to increase system flexibility. Intuitively, a flexible, easily modifiable system seems desirable when faced with a wide range of uncertain, but plausible future conditions. Yet, despite the frequent use of the term flexibility, very little work has examined what exactly it means to have a flexible water management system, what makes one system more flexible than another, or the extent to which flexibility increases adaptive capacity. This study applies a methodology for assessing the inherent flexibility of the structural and non-structural components of flood management systems using original flexibility metrics in the categories of: slack, intensity, connectivity, adjustability, and coordination. We use these metrics to assess the flexibility of three sub-systems within the Sacramento Valley flood management system in California, USA under current system conditions as well as with proposed management actions in place. We then assess the range of hydrologic conditions under which each sub-system can meet flood risk targets in order to determine whether more flexible systems are also more robust and able to perform over a wider range of hydrologic conditions. In doing so, we identify flexible characteristics of flood management systems that enhance the ability of the system to perform over a wide range of conditions, making them better suited to adapt to an uncertain hydrologic future. We find that the flexibility characteristics that increase the range of conditions under which the system can meet performance goals vary depending on whether the region is considered urban, rural, or a small community. In some cases, a decrease in certain flexibility characteristics is associated with an increase in robustness, indicating that more flexibility is not always desirable. Future work will assess the transferability of these results to other regions and systems.
NASA Technical Reports Server (NTRS)
Trevino, Robert C.
2009-01-01
The Texas Space Grant Consortium (TSGC) and the Exploration Systems Mission Directorate (ESMD) both have programs that present design challenges for university senior design classes that offer great opportunities for educational outreach and workforce development. These design challenges have been identified by NASA engineers and researchers as real design problems faced by the Constellation Program in its exploration missions and architecture. Student teams formed in their senior design class select and then work on a design challenge for one or two semesters. The senior design class follows the requirements set by their university, but it must also comply with the Accreditation Board for Engineering and Technology (ABET) in order to meet the class academic requirements. Based on a one year fellowship at a TSGC university under the NASA Administrator's Fellowship Program (NAFP) and several years of experience, results and metrics are presented on the NASA Design Challenge Program.
Emerging universe from scale invariance
DOE Office of Scientific and Technical Information (OSTI.GOV)
Del Campo, Sergio; Herrera, Ramón; Guendelman, Eduardo I.
2010-06-01
We consider a scale invariant model which includes an R² term in the action and show that a stable "emerging universe" scenario is possible. The model belongs to the general class of theories where an integration measure independent of the metric is introduced. To implement scale invariance (S.I.), a dilaton field is introduced. The integration of the equations of motion associated with the new measure gives rise to the spontaneous symmetry breaking (S.S.B.) of S.I. After S.S.B. of S.I. in the model with the R² term (and first order formalism applied), it is found that a nontrivial potential for the dilaton is generated. The dynamics of the scalar field becomes nonlinear, and these nonlinearities are instrumental in the stability of some of the emerging universe solutions, which exist for a parameter range of the theory.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shamir, M. F., E-mail: farasat.shamir@nu.edu.pk
Modified theories of gravity have attracted much attention of researchers in recent years. In particular, the f(R) theory has been investigated extensively due to important f(R) gravity models in cosmological contexts. This paper is devoted to exploring an anisotropic universe in metric f(R) gravity. A locally rotationally symmetric Bianchi type I cosmological model is considered for this purpose. Exact solutions of the modified field equations are obtained for a well-known f(R) gravity model. The energy conditions are also discussed for the model under consideration. The viability of the model is investigated via graphical analysis using the present-day values of cosmological parameters. The model satisfies the null energy, weak energy, and dominant energy conditions for a particular range of the anisotropy parameter, while the strong energy condition is violated, which shows that the anisotropic universe in f(R) gravity supports the crucial issue of accelerated expansion of the universe.
Wattanapisit, Apichai; Vijitpongjinda, Surasak; Saengow, Udomsak; Amaek, Waluka; Thanamee, Sanhapan; Petchuay, Prachyapan
2017-09-27
Physical activity (PA) is important in promoting health, as well as in the treatment and prevention of diseases. However, insufficient PA is still a global health problem and it is also a problem in medical schools. PA training in medical curricula is still sparse or non-existent. There is a need for a comprehensive understanding of the extent of PA in medical schools through several indicators, including people, places and policies. This study includes a survey of the PA prevalence in a medical school and development of a tool, the Medical School Physical Activity Report Card (MSPARC), which will contain concise and understandable infographics and information for exploring, monitoring and reporting information relating to PA prevalence. This mixed methods study will run from January to September 2017. We will involve the School of Medicine, Walailak University, Thailand, and its medical students (n=285). Data collection will consist of both primary and secondary data, divided into four parts: general information, people, places and policies. We will investigate the PA metrics about (1) people: the prevalence of PA and sedentary behaviours; (2) place: the quality and accessibility of walkable neighbourhoods, bicycle facilities and recreational areas; and (3) policy: PA promotion programmes for medical students, education metrics and investments related to PA. The MSPARC will be developed using simple symbols, infographics and short texts to evaluate the PA metrics of the medical school. This study has been approved by the Human Research Ethics Committee of Walailak University (protocol number: WUEC-16-005-01). Findings will be published in peer-reviewed journals and presented at national or international conferences. The MSPARC and full report will be disseminated to relevant stakeholders, policymakers, staff and clients. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. 
No commercial use is permitted unless otherwise expressly granted.
Foy, Jeffrey E; LoCasto, Paul C; Briner, Stephen W; Dyar, Samantha
2017-02-01
Readers rapidly check new information against prior knowledge during validation, but research is inconsistent as to whether source credibility affects validation. We argue that readers are likely to accept highly plausible assertions regardless of source, but that high source credibility may boost acceptance of claims that are less plausible based on general world knowledge. In Experiment 1, participants read narratives with assertions for which the plausibility varied depending on the source. For high credibility sources, we found that readers were faster to read information confirming these assertions relative to contradictory information. We found the opposite patterns for low credibility characters. In Experiment 2, readers read claims from the same high or low credibility sources, but the claims were always plausible based on general world knowledge. Readers consistently took longer to read contradictory information, regardless of source. In Experiment 3, participants read modified versions of "The Tell-Tale Heart," which was narrated entirely by an unreliable source. We manipulated the plausibility of a target event, as well as whether high credibility characters within the story provided confirmatory or contradictory information about the narrator's description of the target event. Though readers rated the narrator as being insane, they were more likely to believe the narrator's assertions about the target event when it was plausible and corroborated by other characters. We argue that sourcing research would benefit from focusing on the relationship between source credibility, message credibility, and multiple sources within a text.
Measuring Progress in Conflict Environments (MPICE): A Metrics Framework
2010-06-04
offered by attendees at several peer review workshops held at the LBJ School of Public Affairs at the University of Texas, the Carr Center at Harvard...at the Carr Center, Katherine Gorove at the Center for Law and Military Operations, and Karen Guttieri at the Naval Postgraduate School for their...in violent crime. QD, EK – Incidence of attacks or intimidation or discrimination against ex-combatants. QD, CA – Level of participation in the
Visibility-Based Goal Oriented Metrics and Application to Navigation and Path Planning Problems
2017-12-14
[Report form residue; recoverable details:] Proposal 62381MA, University of Texas at Austin, 101 East 27th Street, Austin, TX 78712-1532, USA. Research supported by NSF and ARO. Sponsoring/monitoring agency: U.S. Army Research Office, P.O. Box 12211, Research Triangle Park.
Fiber Based Optical Amplifier for High Energy Laser Pulses Final Report CRADA No. TC02100.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Messerly, M.; Cunningham, P.
This was a collaborative effort between Lawrence Livermore National Security, LLC (formerly The Regents of the University of California)/Lawrence Livermore National Laboratory (LLNL), and The Boeing Company to develop an optical fiber-based laser amplifier capable of producing and sustaining very high-energy, nanosecond-scale optical pulses. The overall technical objective of this CRADA was to research, design, and develop an optical fiber-based amplifier that would meet specific metrics.
Kittelson, Sheri; Pierce, Read; Youngwerth, Jeanie
2017-05-01
In response to poor healthcare quality outcomes and rising costs, the healthcare reform Triple Aim has increased requirements for providers to demonstrate value to payers, partners, and the public. Electronically automating measurement of the meaningful impact of palliative care (PC) programs on clinical, operational, and financial systems over time is imperative to the success of the field and was the goal of developing this automated PC scorecard. The scorecard was organized around quality measures identified by the Measuring What Matters (MWM) project that are important to the team, automatically extractable from the electronic health record, valid, and able to be impacted over time. The scorecard was initially created using University of Florida Health (UF) data from a new PC program, then successfully applied and implemented at the University of Colorado Anschutz Medical Campus (CU), a second institution with a mature PC program. Clinical metrics are organized in the scorecard based on MWM and described in terms of the metric definition, the rationale for its selection, the measure type (structure, process, or outcome), and whether it represents a direct or proxy measure. Constructing the scorecard helped identify areas within both systems for potential improvement in team structure, clinical processes, and outcomes. In addition, by automating data extraction, the scorecard decreases the costs associated with manual data entry and extraction, freeing clinical staff to care for patients and increasing the value of PC delivered to patients.
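The per-metric description above (definition, rationale, measure type, direct vs. proxy) can be modelled as a small record type. This is a hypothetical sketch of such a schema; the field names follow the abstract's wording, and the example metric is invented, not taken from the UF or CU scorecards:

```python
from dataclasses import dataclass

# Hypothetical sketch of one scorecard entry, mirroring how the abstract says
# each clinical metric is described. The example values below are illustrative.
@dataclass
class ScorecardMetric:
    name: str
    definition: str
    rationale: str
    measure_type: str   # "structure", "process", or "outcome"
    direct: bool        # True for a direct measure, False for a proxy

example = ScorecardMetric(
    name="Pain screening within 24h",
    definition="Share of PC patients screened for pain within 24h of consult",
    rationale="Aligned with an MWM symptom-assessment quality measure",
    measure_type="process",
    direct=True,
)
print(example.measure_type)
```

Typing each entry this way is what makes fully automated extraction from the electronic health record feasible: every metric carries its own definition and classification alongside its value.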
Time delay in Swiss cheese gravitational lensing
NASA Astrophysics Data System (ADS)
Chen, B.; Kantowski, R.; Dai, X.
2010-08-01
We compute time delays for gravitational lensing in a flat Λ-dominated cold dark matter Swiss cheese universe. We assume a primary and secondary pair of light rays are deflected by a single point mass condensation described by a Kottler metric (Schwarzschild with Λ) embedded in an otherwise homogeneous cosmology. We find that the cosmological constant's effect on the difference in arrival times is nonlinear and at most around 0.002% for a large cluster lens; however, we find differences from time delays predicted by conventional linear lensing theory that can reach ~4% for these large lenses. The differences in predicted delay times are due to the failure of conventional lensing to incorporate the lensing mass into the mean mass density of the universe.
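For reference, the conventional (linear) lensing time delay against which the Swiss cheese result is compared takes the standard form below. Standard notation is assumed here (not taken from the paper): $z_d$ is the deflector redshift, $\boldsymbol{\theta}$ and $\boldsymbol{\beta}$ the image and source positions, $\psi$ the lensing potential, and $D_d$, $D_s$, $D_{ds}$ the angular diameter distances to the deflector, to the source, and between them:

```latex
\Delta t(\boldsymbol{\theta}) \;=\;
\frac{1+z_d}{c}\,\frac{D_d D_s}{D_{ds}}
\left[ \tfrac{1}{2}\,\lvert \boldsymbol{\theta}-\boldsymbol{\beta} \rvert^{2}
       - \psi(\boldsymbol{\theta}) \right]
```

The ~4% discrepancy reported above arises because this expression treats the deflector as added on top of the homogeneous background, rather than carved out of the mean mass density as in the Swiss cheese construction.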
Role and reality: technology transfer at Canadian universities.
Bubela, Tania M; Caulfield, Timothy
2010-09-01
Technology transfer offices (TTOs) play a central role in the knowledge translation and commercialization agenda of Canadian universities. Despite this presumed mandate, there is a disconnect between the expectations of government and research institutions (which view TTOs' primary role as the promotion of profitable commercialization activities) and the reality of what TTOs do. Interviews with professionals at Canadian TTOs have revealed that, at their best, TTOs support the social and academic missions of their institutions by facilitating knowledge mobilization and research relationships with other sectors, including industry; however, this does not always produce obvious or traditional commercial outputs. Thus, the existing metrics used to measure the success of TTOs do not capture this reality and, as such, realignment is needed.
Chittleborough, Catherine R; Mittinty, Murthy N; Lawlor, Debbie A; Lynch, John W
2014-01-01
Randomized controlled trial evidence shows that interventions before age 5 can improve skills necessary for educational success; the effect of these interventions on socioeconomic inequalities is unknown. Using trial effect estimates and marginal structural models with data from the Avon Longitudinal Study of Parents and Children (n = 11,764, imputed), we examined the simulated effects of plausible interventions to improve school entry academic skills on socioeconomic inequality in educational achievement at age 16. Progressive universal interventions (i.e., more intense intervention for those with greater need) to improve school entry academic skills could raise population levels of educational achievement by 5% and reduce absolute socioeconomic inequality in poor educational achievement by 15%. PMID:25327718
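The "15% reduction in absolute inequality" above can be made concrete with a little arithmetic. The prevalences below are invented for illustration, not ALSPAC estimates; "absolute socioeconomic inequality" is read here as the risk difference in poor achievement between the most and least disadvantaged groups:

```python
# Illustrative arithmetic only: the prevalences are hypothetical, not study data.
low_ses_before, high_ses_before = 0.40, 0.10   # prevalence of poor achievement
gap_before = low_ses_before - high_ses_before  # absolute inequality (risk difference)

# A 15% relative reduction in that absolute gap, as reported in the abstract:
gap_after = gap_before * (1 - 0.15)
print(round(gap_after, 3))
```

With these made-up numbers, a 0.30 risk difference would shrink to roughly 0.255, i.e. the gap narrows even as overall achievement rises.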