An Empirical Human Controller Model for Preview Tracking Tasks.
van der El, Kasper; Pool, Daan M; Damveld, Herman J; van Paassen, Marinus Rene M; Mulder, Max
2016-11-01
Real-life tracking tasks often show preview information to the human controller about the future track to follow. The effect of preview on manual control behavior is still relatively unknown. This paper proposes a generic operator model for preview tracking, empirically derived from experimental measurements. Conditions included pursuit tracking, i.e., without preview information, and tracking with 1 s of preview. Controlled element dynamics varied between gain, single integrator, and double integrator. The model is derived in the frequency domain, after application of a black-box system identification method based on Fourier coefficients. Parameter estimates are obtained to assess the validity of the model in both the time domain and frequency domain. Measured behavior in all evaluated conditions can be captured with the commonly used quasi-linear operator model for compensatory tracking, extended with two viewpoints of the previewed target. The derived model provides new insights into how human operators use preview information in tracking tasks.
Statistical mechanics of neocortical interactions. Derivation of short-term-memory capacity
NASA Astrophysics Data System (ADS)
Ingber, Lester
1984-06-01
A theory developed by the author to describe macroscopic neocortical interactions demonstrates that empirical values of chemical and electrical parameters of synaptic interactions establish several minima of the path-integral Lagrangian as a function of excitatory and inhibitory columnar firings. The number of possible minima, their time scales of hysteresis and probable reverberations, and their nearest-neighbor columnar interactions are all consistent with well-established empirical rules of human short-term memory. Thus, aspects of conscious experience are derived from neuronal firing patterns, using modern methods of nonlinear nonequilibrium statistical mechanics to develop realistic explicit synaptic interactions.
Probabilistic analysis of tsunami hazards
Geist, E.L.; Parsons, T.
2006-01-01
Determining the likelihood of a disaster is a key component of any comprehensive hazard assessment. This is particularly true for tsunamis, even though most tsunami hazard assessments have in the past relied on scenario-based or deterministic models. We discuss probabilistic tsunami hazard analysis (PTHA) from the standpoint of integrating computational methods with empirical analysis of past tsunami runup. PTHA is derived from probabilistic seismic hazard analysis (PSHA), with the main difference being that PTHA must account for far-field sources. The computational methods rely on numerical tsunami propagation models, rather than on the empirical attenuation relationships used in PSHA, to determine ground motions. Because a number of source parameters affect local tsunami runup height, PTHA can become complex and computationally intensive. Empirical analysis can function in one of two ways, depending on the length and completeness of the tsunami catalog. For site-specific studies where sufficient tsunami runup data are available, hazard curves can primarily be derived from empirical analysis, with computational methods used to highlight deficiencies in the tsunami catalog. For region-wide analyses and sites with little to no tsunami data, a computationally based method such as Monte Carlo simulation is the primary means to establish tsunami hazards. Two case studies that describe how computational and empirical methods can be integrated are presented for Acapulco, Mexico (site-specific) and the U.S. Pacific Northwest coastline (region-wide analysis).
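As a toy illustration of the Monte Carlo approach the abstract mentions, the sketch below builds a hazard curve (exceedance probability of runup height over a fixed exposure time) from a hypothetical two-zone source catalog. All rates and runup distributions are invented for illustration, not taken from the study:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical source catalog: annual event rate and lognormal runup
# parameters (mu, sigma of log runup in meters) per zone. Illustrative only.
sources = [
    {"rate": 0.01, "mu": 1.0, "sigma": 0.6},   # far-field zone
    {"rate": 0.002, "mu": 2.0, "sigma": 0.5},  # near-field zone
]

def simulate_max_runup(years, sources, rng):
    """Largest simulated runup (m) over a synthetic catalog of `years` years."""
    best = 0.0
    for s in sources:
        n = rng.poisson(s["rate"] * years)          # number of events
        if n:
            runups = rng.lognormal(s["mu"], s["sigma"], n)
            best = max(best, runups.max())
    return best

# Hazard curve: probability that runup exceeds a threshold within 50 years
thresholds = [1.0, 3.0, 5.0, 10.0]
trials = np.array([simulate_max_runup(50, sources, rng) for _ in range(20000)])
exceedance = [(trials > h).mean() for h in thresholds]
for h, p in zip(thresholds, exceedance):
    print(f"P(runup > {h:4.1f} m in 50 yr) ~ {p:.3f}")
```

The exceedance probabilities are by construction non-increasing in the threshold, which is the defining shape of a hazard curve.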
Low temperature heat capacities and thermodynamic functions described by Debye-Einstein integrals.
Gamsjäger, Ernst; Wiessner, Manfred
2018-01-01
Thermodynamic data of various crystalline solids are assessed from low-temperature heat capacity measurements, i.e., from almost absolute zero to 300 K, by means of semi-empirical models. Previous studies frequently present fit functions with a large number of coefficients, resulting in almost perfect agreement with experimental data. It is, however, pointed out in this work that special care is required to avoid overfitting. Apart from anomalies like phase transformations, it is likely that data from calorimetric measurements can be fitted by a relatively simple Debye-Einstein integral with sufficient precision. Thereby, reliable values for the heat capacities, standard enthalpies, and standard entropies at T = 298.15 K are obtained. Standard thermodynamic functions of various compounds strongly differing in the number of atoms in the formula unit can be derived from this fitting procedure and are compared to the results of previous fitting procedures. The residuals are of course larger when the Debye-Einstein integral is applied instead of a high number of fit coefficients or connected splines, but the semi-empirical fit coefficients retain their physical meaning. It is suggested that the Debye-Einstein integral fit be used as a standard method to describe heat capacities in the range between 0 and 300 K, so that the derived thermodynamic functions are obtained on the same theory-related semi-empirical basis. Additional fitting is recommended when a precise description of data at ultra-low temperatures (0-20 K) is required.
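The fitting procedure described above can be sketched with a single Debye term (a full Debye-Einstein model would add one or more Einstein terms). The Debye temperature, noise level, and synthetic "measurements" below are assumptions for illustration, not data from the paper:

```python
import numpy as np
from scipy.integrate import quad
from scipy.optimize import curve_fit

R = 8.314462618  # gas constant, J/(mol K)

def debye_cv(T, theta_D, n=1.0):
    """Debye molar heat capacity; n = atoms per formula unit."""
    def one(t):
        u = theta_D / t
        # integrand decays like x^4 e^{-x}; cap the limit to avoid overflow
        integral, _ = quad(lambda x: x**4 * np.exp(x) / np.expm1(x)**2,
                           0.0, min(u, 50.0))
        return 9.0 * n * R * (t / theta_D) ** 3 * integral
    return np.array([one(t) for t in np.atleast_1d(T)])

# Synthetic low-temperature data with a known Debye temperature of 300 K
T = np.linspace(10.0, 300.0, 40)
cp_obs = debye_cv(T, 300.0) + np.random.default_rng(1).normal(0.0, 0.05, T.size)

# Fit the single free parameter theta_D
popt, _ = curve_fit(lambda t, th: debye_cv(t, th), T, cp_obs, p0=[250.0])
print(f"fitted theta_D = {popt[0]:.1f} K")

# Standard entropy at 298.15 K from the fitted model: S = int Cp/T dT
S_298, _ = quad(lambda t: debye_cv(t, popt[0])[0] / t, 1e-3, 298.15)
print(f"S(298.15 K) ~ {S_298:.1f} J/(mol K)")
```

Because the model has a single physically meaningful parameter, overfitting of the kind criticized in the abstract cannot occur, at the cost of larger residuals.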
Integrated empirical ethics: loss of normativity?
van der Scheer, Lieke; Widdershoven, Guy
2004-01-01
An important discussion in contemporary ethics concerns the relevance of empirical research for ethics. Specifically, two crucial questions pertain, respectively, to the possibility of inferring normative statements from descriptive statements, and to the danger of a loss of normativity if normative statements should be based on empirical research. Here we take part in the debate and defend integrated empirical ethical research: research in which normative guidelines are established on the basis of empirical research and in which the guidelines are empirically evaluated by focusing on observable consequences. We argue that in our concrete example normative statements are not derived from descriptive statements, but are developed within a process of reflection and dialogue that goes on within a specific praxis. Moreover, we show that the distinction in experience between the desirable and the undesirable precludes relativism. The normative guidelines so developed are both critical and normative: they help in choosing the right action and in evaluating that action. Finally, following Aristotle, we plead for a return to the view that morality and ethics are inherently related to one another, and for an acknowledgment of the fact that moral judgments have their origin in experience which is always related to historical and cultural circumstances.
NASA Technical Reports Server (NTRS)
Varanasi, P.; Cess, R. D.; Bangaru, B. R. P.
1974-01-01
Measurements of the absolute intensity and integrated band absorption have been performed for the nu sub 9 fundamental band of ethane. The intensity is found to be about 34 per sq cm per atm at STP, and this is significantly higher than previous estimates. It is shown that a Gaussian profile provides an empirical representation of the apparent spectral absorption coefficient. Employing this empirical profile, a simple expression is derived for the integrated band absorption, which is in excellent agreement with experimental values. The band model is then employed to investigate the possible role of ethane as a source of thermal infrared opacity within the atmospheres of Jupiter and Saturn, and to interpret qualitatively observed brightness temperatures for Saturn.
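The band-absorption construction described above, a Gaussian profile for the spectral absorption coefficient integrated across the band, can be sketched numerically. Except for the quoted intensity of 34 cm^-2 atm^-1, all parameters (band width, exact center) are illustrative assumptions:

```python
import numpy as np

# Gaussian band profile; sigma is an assumed width, nu0 an approximate center
nu0 = 822.0      # band center, cm^-1 (nu_9 of ethane lies near 822 cm^-1)
S = 34.0         # integrated band intensity, cm^-2 atm^-1 (quoted above)
sigma = 15.0     # Gaussian width, cm^-1 (assumption)
k0 = S / (sigma * np.sqrt(2.0 * np.pi))  # peak absorption coefficient

def band_absorption(u):
    """Integrated band absorption A(u) = int (1 - exp(-k(nu) u)) d nu."""
    nu = np.linspace(nu0 - 8 * sigma, nu0 + 8 * sigma, 4001)
    k = k0 * np.exp(-((nu - nu0) ** 2) / (2.0 * sigma ** 2))
    f = 1.0 - np.exp(-k * u)
    # trapezoidal rule over the band
    return float(np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(nu)))

for u in (0.01, 0.1, 1.0, 10.0):   # absorber amounts, atm cm
    print(f"u = {u:6.2f}  A = {band_absorption(u):8.3f} cm^-1")
```

In the optically thin limit A(u) approaches S*u, while at large u the band saturates and A grows only slowly, which is the qualitative behavior any band model must reproduce.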
Using LANDSAT to provide potato production estimates to Columbia Basin farmers and processors
NASA Technical Reports Server (NTRS)
1991-01-01
The estimation of potato yields in the Columbia basin is described. The fundamental objective is to provide CROPIX with working models of potato production. A two-pronged approach to yield estimation was used: (1) simulation models, and (2) purely empirical models. The simulation modeling approach used satellite observations to determine certain key dates in the development of the crop for each field identified as potatoes; in particular, planting dates, emergence dates, and harvest dates. These critical dates are fed into simulation models of crop growth and development to derive yield forecasts. Purely empirical models were developed to relate yield to some spectrally derived measure of crop development. Two empirical approaches are presented: one relates tuber yield to estimates of cumulative intercepted solar radiation, the other relates tuber yield to the integral under the GVI (Global Vegetation Index) curve.
Empirical calibration of the near-infrared Ca II triplet - III. Fitting functions
NASA Astrophysics Data System (ADS)
Cenarro, A. J.; Gorgas, J.; Cardiel, N.; Vazdekis, A.; Peletier, R. F.
2002-02-01
Using a near-infrared stellar library of 706 stars with a wide coverage of atmospheric parameters, we study the behaviour of the CaII triplet strength in terms of effective temperature, surface gravity and metallicity. Empirical fitting functions for recently defined line-strength indices, namely CaT*, CaT and PaT, are provided. These functions can be easily implemented into stellar population models to provide accurate predictions for integrated CaII strengths. We also present a thorough study of the various error sources and their relation to the residuals of the derived fitting functions. Finally, the derived functional forms and the behaviour of the predicted CaII are compared with those of previous works in the field.
NASA Technical Reports Server (NTRS)
Goldhirsh, Julius; Krichevsky, Vladimir; Gebo, Norman
1992-01-01
Five years of rain rate and modeled slant-path attenuation distributions at 20 GHz and 30 GHz, derived from a network of 10 tipping-bucket rain gages, were examined. The rain gage network is located within a grid 70 km north-south and 47 km east-west on the Mid-Atlantic coast of the United States in the vicinity of Wallops Island, Virginia. Distributions were derived from the variable-integration-time data and from one-minute averages. It was demonstrated that, for realistic fade margins, the variable-integration-time results are adequate to estimate slant-path attenuations at frequencies above 20 GHz using models which require one-minute averages. An accurate empirical formula was developed to convert the variable-integration-time rain rates to one-minute averages. Fade distributions at 20 GHz and 30 GHz were derived employing Crane's Global model because it was demonstrated to exhibit excellent accuracy against measured COMSTAR fades at 28.56 GHz.
NASA Astrophysics Data System (ADS)
Baaquie, Belal E.
2007-09-01
Foreword; Preface; Acknowledgements; 1. Synopsis; Part I. Fundamental Concepts of Finance: 2. Introduction to finance; 3. Derivative securities; Part II. Systems with Finite Number of Degrees of Freedom: 4. Hamiltonians and stock options; 5. Path integrals and stock options; 6. Stochastic interest rates' Hamiltonians and path integrals; Part III. Quantum Field Theory of Interest Rates Models: 7. Quantum field theory of forward interest rates; 8. Empirical forward interest rates and field theory models; 9. Field theory of Treasury Bonds' derivatives and hedging; 10. Field theory Hamiltonian of forward interest rates; 11. Conclusions; Appendix A: mathematical background; Brief glossary of financial terms; Brief glossary of physics terms; List of main symbols; References; Index.
ERIC Educational Resources Information Center
Maestripieri, Dario; Roney, James R.
2006-01-01
Evolutionary developmental psychology is a discipline that has the potential to integrate conceptual approaches to the study of behavioral development derived from psychology and biology as well as empirical data from humans and animals. Comparative research with animals, and especially with nonhuman primates, can provide evidence of adaptation in…
Students' Acceptance of Tablet PCs in the Classroom
ERIC Educational Resources Information Center
Ifenthaler, Dirk; Schweinbenz, Volker
2016-01-01
In recent years digital technologies, such as tablet personal computers (TPCs), have become an integral part of a school's infrastructure and are seen as a promising way to facilitate students' learning processes. This study empirically tested a theoretical model derived from the technology acceptance model containing key constructs developed in…
Meta-Regression Approximations to Reduce Publication Selection Bias
ERIC Educational Resources Information Center
Stanley, T. D.; Doucouliagos, Hristos
2014-01-01
Publication selection bias is a serious challenge to the integrity of all empirical sciences. We derive meta-regression approximations to reduce this bias. Our approach employs Taylor polynomial approximations to the conditional mean of a truncated distribution. A quadratic approximation without a linear term, precision-effect estimate with…
Evolution, Biology, and Society: A Conversation for the 21st-Century Sociology Classroom
ERIC Educational Resources Information Center
Machalek, Richard; Martin, Michael W.
2010-01-01
Recently, a growing contingent of "evolutionary sociologists" has begun to integrate theoretical ideas and empirical findings derived from evolutionary biology, especially sociobiology, into a variety of sociological inquiries. Without capitulating to a naive version of either biological reductionism or genetic determinism, these researchers and…
Comparison of modelled and empirical atmospheric propagation data
NASA Technical Reports Server (NTRS)
Schott, J. R.; Biegel, J. D.
1983-01-01
The radiometric integrity of TM thermal infrared channel data was evaluated and monitored to develop improved radiometric preprocessing calibration techniques for removal of atmospheric effects. Modelled atmospheric transmittance and path radiance were compared with empirical values derived from aircraft underflight data. Aircraft thermal infrared imagery and calibration data were available on two dates, as were corresponding atmospheric radiosonde data. The radiosonde data were used as input to the LOWTRAN 5A code, which was modified to output atmospheric path radiance in addition to transmittance. The aircraft data were calibrated and used to generate analogous measurements. These data indicate that there is a tendency for the LOWTRAN model to underestimate atmospheric path radiance and transmittance as compared to empirical data. A plot of transmittance versus altitude for both LOWTRAN and empirical data is presented.
Path integral for equities: Dynamic correlation and empirical analysis
NASA Astrophysics Data System (ADS)
Baaquie, Belal E.; Cao, Yang; Lau, Ada; Tang, Pan
2012-02-01
This paper develops a model to describe the unequal time correlation between rate of returns of different stocks. A non-trivial fourth order derivative Lagrangian is defined to provide an unequal time propagator, which can be fitted to the market data. A calibration algorithm is designed to find the empirical parameters for this model and different de-noising methods are used to capture the signals concealed in the rate of return. The detailed results of this Gaussian model show that the different stocks can have strong correlation and the empirical unequal time correlator can be described by the model's propagator. This preliminary study provides a novel model for the correlator of different instruments at different times.
Surface albedo from bidirectional reflectance
NASA Technical Reports Server (NTRS)
Ranson, K. J.; Irons, J. R.; Daughtry, C. S. T.
1991-01-01
The validity of integrating over discrete wavelength bands is examined to estimate total shortwave bidirectional reflectance of vegetated and bare soil surfaces. Methods for estimating albedo from multiple angle, discrete wavelength band radiometer measurements are studied. These methods include a numerical integration technique and the integration of an empirically derived equation for bidirectional reflectance. It is concluded that shortwave albedos estimated through both techniques agree favorably with the independent pyranometer measurements. Absolute rms errors are found to be 0.5 percent or less for both grass sod and bare soil surfaces.
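The discrete-band integration idea can be illustrated with a minimal weighted-sum sketch: each band's reflectance is weighted by the fraction of shortwave solar irradiance falling in that band. The band reflectances and irradiance weights below are hypothetical, not the paper's measurements:

```python
import numpy as np

# Hypothetical band reflectances and solar-irradiance weights; each weight is
# the assumed fraction of shortwave solar irradiance in that band.
bands = {            # (reflectance, irradiance weight) -- illustrative values
    "blue":  (0.04, 0.18),
    "green": (0.08, 0.20),
    "red":   (0.07, 0.22),
    "nir":   (0.45, 0.40),
}

def shortwave_albedo(bands):
    """Weighted-sum estimate of broadband albedo from discrete bands."""
    refl = np.array([r for r, _ in bands.values()])
    wts = np.array([w for _, w in bands.values()])
    return float(np.sum(refl * wts) / np.sum(wts))

print(f"estimated albedo = {shortwave_albedo(bands):.3f}")
```

A bidirectional-reflectance version would additionally integrate each band's reflectance over view and illumination angles before applying the same spectral weighting.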
NASA Astrophysics Data System (ADS)
Moon, Joon-Young; Kim, Junhyeok; Ko, Tae-Wook; Kim, Minkyung; Iturria-Medina, Yasser; Choi, Jee-Hyun; Lee, Joseph; Mashour, George A.; Lee, Uncheol
2017-04-01
Identifying how spatially distributed information becomes integrated in the brain is essential to understanding higher cognitive functions. Previous computational and empirical studies suggest a significant influence of brain network structure on brain network function. However, there have been few analytical approaches to explain the role of network structure in shaping regional activities and directionality patterns. In this study, analytical methods are applied to a coupled oscillator model implemented in inhomogeneous networks. We first derive a mathematical principle that explains the emergence of directionality from the underlying brain network structure. We then apply the analytical methods to the anatomical brain networks of human, macaque, and mouse, successfully predicting simulation and empirical electroencephalographic data. The results demonstrate that the global directionality patterns in resting state brain networks can be predicted solely by their unique network structures. This study forms a foundation for a more comprehensive understanding of how neural information is directed and integrated in complex brain networks.
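The coupled-oscillator-on-a-network setup the abstract describes can be sketched with a Kuramoto model on a small inhomogeneous graph. The network (one hub plus a ring), frequencies, and coupling below are illustrative assumptions, not the study's anatomical networks or its analytical derivation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Small inhomogeneous network: node 0 is a hub connected to all others,
# the remaining nodes form a ring. Purely illustrative.
N = 10
A = np.zeros((N, N))
A[0, 1:] = A[1:, 0] = 1.0
for i in range(1, N):
    j = 1 + (i % (N - 1))
    A[i, j] = A[j, i] = 1.0

omega = rng.normal(10.0, 0.5, N)   # natural frequencies (rad/s)
K, dt, steps = 2.0, 0.001, 20000
theta = rng.uniform(0.0, 2.0 * np.pi, N)

# Euler integration of dtheta_i/dt = omega_i + K * sum_j A_ij sin(theta_j - theta_i)
for _ in range(steps):
    coupling = (A * np.sin(theta[None, :] - theta[:, None])).sum(axis=1)
    theta += dt * (omega + K * coupling)

# Kuramoto order parameter: 1 means full phase synchrony
r = abs(np.exp(1j * theta).mean())
print(f"order parameter r = {r:.3f}")
```

Directionality analyses of the kind described above typically examine the steady-state phase lead/lag of each node relative to its neighbors, which in such models is shaped by node degree.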
Integration of GRACE and GNET GPS in modeling the deglaciation of Greenland
NASA Astrophysics Data System (ADS)
Knudsen, P.; Madsen, F. B.; Khan, S. A.; Bevis, M. G.; van Dam, T. M.
2017-12-01
The use of the monthly gravity fields from the Gravity Recovery and Climate Experiment (GRACE) has become essential when assessing and modeling the mass changes of the ice sheets. The recent degradation of the current mission, however, has hampered the continuous monitoring of ice sheet masses, at least until the GRACE Follow-On mission becomes operational. In recent years it has been demonstrated that mass changes can be observed by GPS receivers mounted on the adjacent bedrock. In particular, the Greenland GPS Network (GNET) has proven that GPS is a valuable technique for detecting mass changes through the Earth's elastic response. An integration of GNET with other observations of the Greenland ice sheet, e.g., satellite altimetry and GRACE, has advanced studies of GIA significantly. In this study, we aim at improving the monitoring of the ice sheet mass by utilizing the redundancy to reduce the influence of errors, to fill in at data voids and, not least, to bridge the gap between GRACE and GRACE Follow-On. Initial analyses are carried out to link GRACE and GNET time series empirically. EOF analyses are carried out to extract the main part of the variability and to isolate errors. Subsequently, empirical covariance functions are derived and used in the integration. Preliminary results are derived and inter-compared.
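EOF analysis of station time series, as used above to extract common variability and isolate errors, reduces to an SVD of the mean-removed data matrix. The synthetic seasonal data below are a stand-in for actual GRACE/GNET series:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic "mass change" series: 60 months x 8 stations sharing one common
# annual mode plus noise. Illustrative stand-in for GRACE/GNET observations.
t = np.arange(60)
mode = np.sin(2.0 * np.pi * t / 12.0)              # common annual signal
loadings = rng.uniform(0.5, 2.0, 8)                # station sensitivities
X = np.outer(mode, loadings) + 0.2 * rng.standard_normal((60, 8))

# EOF analysis = SVD of the anomaly (mean-removed) data matrix
Xa = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xa, full_matrices=False)
var_explained = s**2 / np.sum(s**2)

print(f"leading EOF explains {100 * var_explained[0]:.1f}% of variance")
pc1 = U[:, 0] * s[0]     # leading principal component (time series)
```

Truncating to the leading EOFs keeps the shared geophysical signal while discarding station-level noise, which is the basis for the error isolation mentioned in the abstract.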
Schwindt, Adam R; Winkelman, Dana L
2016-09-01
Urban freshwater streams in arid climates are wastewater effluent dominated ecosystems particularly impacted by bioactive chemicals including steroid estrogens that disrupt vertebrate reproduction. However, more understanding of the population and ecological consequences of exposure to wastewater effluent is needed. We used empirically derived vital rate estimates from a mesocosm study to develop a stochastic stage-structured population model and evaluated the effect of 17α-ethinylestradiol (EE2), the estrogen in human contraceptive pills, on fathead minnow Pimephales promelas stochastic population growth rate. Tested EE2 concentrations ranged from 3.2 to 10.9 ng L⁻¹ and produced stochastic population growth rates (λS) below 1 at the lowest concentration, indicating potential for population decline. Declines in λS compared to controls were evident in treatments that were lethal to adult males despite statistically insignificant effects on egg production and juvenile recruitment. In fact, results indicated that λS was most sensitive to the survival of juveniles and female egg production. More broadly, our results document that population model results may differ even when empirically derived estimates of vital rates are similar among experimental treatments, and demonstrate how population models integrate and project the effects of stressors throughout the life cycle. Thus, stochastic population models can more effectively evaluate the ecological consequences of experimentally derived vital rates.
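A stochastic stage-structured projection of the kind described can be sketched as follows. The three-stage life cycle and all vital rates are hypothetical placeholders, not the mesocosm estimates; λS is estimated as the exponential of the long-run mean log growth rate:

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical 3-stage (egg / juvenile / adult) minnow-like life cycle.
# Mean vital rates are illustrative; rates are not the paper's estimates.
mean_rates = {"egg_surv": 0.1, "juv_surv": 0.3, "adult_surv": 0.5,
              "fecundity": 40.0}
cv = 0.2  # environmental stochasticity (lognormal log-sd)

def random_matrix(rng):
    """Draw a projection matrix with lognormally perturbed vital rates."""
    d = {k: v * rng.lognormal(0.0, cv) for k, v in mean_rates.items()}
    return np.array([
        [0.0,            0.0,            d["fecundity"]],
        [d["egg_surv"],  0.0,            0.0],
        [0.0,            d["juv_surv"],  d["adult_surv"]],
    ])

def stochastic_lambda(years=5000):
    """Estimate lambda_S = exp(mean log annual growth) by simulation."""
    n = np.array([100.0, 10.0, 5.0])
    n /= n.sum()
    log_growth = 0.0
    for _ in range(years):
        n = random_matrix(rng) @ n
        total = n.sum()
        log_growth += np.log(total)
        n /= total           # renormalize to avoid overflow
    return float(np.exp(log_growth / years))

print(f"lambda_S ~ {stochastic_lambda():.3f}")
```

A treatment effect would be modeled by lowering the relevant mean rate (e.g. adult survival) and comparing the resulting λS against the control, exactly the comparison the abstract describes.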
An economic analysis of harvest behavior: integrating forest and ownership characteristics
Donald F. Dennis
1989-01-01
This study provides insight into the determinants of timber supply from private forests through development of both theoretical and empirical models of harvest behavior. A microeconomic model encompasses the multiple objective nature of private ownership by examining the harvest decision for landowners who derive utility from forest amenities and from income used for...
Bijou, Sidney W.; Peterson, Robert F.; Ault, Marion H.
1968-01-01
It is the thesis of this paper that data from descriptive and experimental field studies can be interrelated at the level of data and empirical concepts if both sets are derived from frequency-of-occurrence measures. The methodology proposed for a descriptive field study is predicated on three assumptions: (1) The primary data of psychology are the observable interactions of a biological organism and environmental events, past and present. (2) Theoretical concepts and laws are derived from empirical concepts and laws, which in turn are derived from the raw data. (3) Descriptive field studies describe interactions between behavioral and environmental events; experimental field studies provide information on their functional relationships. The ingredients of a descriptive field investigation using frequency measures consist of: (1) specifying in objective terms the situation in which the study is conducted, (2) defining and recording behavioral and environmental events in observable terms, and (3) measuring observer reliability. Field descriptive studies following the procedures suggested here would reveal interesting new relationships in the usual ecological settings and would also provide provocative cues for experimental studies. On the other hand, field-experimental studies using frequency measures would probably yield findings that would suggest the need for describing new interactions in specific natural situations. PMID:16795175
NASA Technical Reports Server (NTRS)
Berman, A. L.
1976-01-01
In the last two decades, increasingly sophisticated deep space missions have placed correspondingly stringent requirements on navigational accuracy. As part of the effort to increase navigational accuracy, and hence the quality of radiometric data, much effort has been expended in an attempt to understand and compute the tropospheric effect on range (and hence range rate) data. The general approach adopted has been that of computing a zenith range refraction, and then mapping this refraction to any arbitrary elevation angle via an empirically derived function of elevation. The prediction of zenith range refraction derived from surface measurements of meteorological parameters is presented. Refractivity is separated into wet (water vapor pressure) and dry (atmospheric pressure) components. The integration of dry refractivity is shown to be exact. Attempts to integrate wet refractivity directly prove ineffective; however, several empirical models developed by the author and other researchers at JPL are discussed. The best current wet refraction model is here considered to be a separate day/night model, which is proportional to surface water vapor pressure and inversely proportional to surface temperature. Methods are suggested that might improve the accuracy of the wet range refraction model.
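The zenith-refraction-plus-elevation-mapping approach can be illustrated with the standard Saastamoinen-style dry zenith delay and the simplest 1/sin(E) obliquity mapping. The paper's own empirically derived mapping function and wet models are more elaborate; this is only the skeleton of the method:

```python
import numpy as np

def dry_zenith_delay(P_hPa):
    """Zenith hydrostatic (dry) delay in meters from surface pressure in hPa.

    Standard Saastamoinen-style coefficient, mid-latitude approximation.
    """
    return 0.0022768 * P_hPa

def mapped_delay(P_hPa, elev_deg):
    """Map the zenith delay to elevation E with a simple 1/sin(E) factor."""
    return dry_zenith_delay(P_hPa) / np.sin(np.radians(elev_deg))

for e in (90, 30, 10):
    print(f"E = {e:2d} deg: dry delay ~ {mapped_delay(1013.25, e):.3f} m")
```

At standard pressure the dry zenith delay is about 2.3 m, and the rapid growth of the mapped delay at low elevation is why accurate mapping functions matter for deep-space navigation.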
Kanematsu, Nobuyuki
2009-03-07
Dose calculation for radiotherapy with protons and heavier ions deals with a large volume of path integrals involving a scattering power of body tissue. This work provides a simple model for such demanding applications. There is an approximate linearity between RMS end-point displacement and range of incident particles in water, empirically found in measurements and detailed calculations. This fact was translated into a simple linear formula, from which the scattering power that is only inversely proportional to the residual range was derived. The simplicity enabled the analytical formulation for ions stopping in water, which was designed to be equivalent with the extended Highland model and agreed with measurements within 2% or 0.02 cm in RMS displacement. The simplicity will also improve the efficiency of numerical path integrals in the presence of heterogeneity.
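The linearity noted in the abstract follows directly from a scattering power inversely proportional to the residual range: integrating T(R) = c/R against the squared lever arm gives an RMS end-point displacement exactly linear in range. A numerical check, with an arbitrary illustrative constant c:

```python
import numpy as np
from scipy.integrate import quad

# Scattering power inversely proportional to residual range: T(R) = c / R.
# c is an illustrative constant, not a fitted value from the paper.
c = 1.0e-3

def rms_displacement(R0):
    """RMS end-point displacement for initial range R0 (same length units).

    sigma_y^2 = int_0^{R0} T(z) (R0 - z)^2 dz with residual range R = R0 - z.
    """
    # T(z) * (R0 - z)^2 with T = c / (R0 - z) simplifies to c * (R0 - z)
    var, _ = quad(lambda z: c * (R0 - z), 0.0, R0)
    return float(np.sqrt(var))

# Analytically sigma_y = sqrt(c/2) * R0, i.e. exactly linear in range
for R0 in (5.0, 10.0, 20.0, 30.0):
    print(f"R0 = {R0:5.1f}  sigma_y = {rms_displacement(R0):.4f}  "
          f"(analytic {np.sqrt(c / 2.0) * R0:.4f})")
```

This closed-form behavior is what makes the model cheap inside the numerical path integrals mentioned at the end of the abstract.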
Mertz, Marcel; Schildmann, Jan
2018-06-01
Empirical bioethics is commonly understood as integrating empirical research with normative-ethical research in order to address an ethical issue. Methodological analyses in empirical bioethics mainly focus on the integration of socio-empirical sciences (e.g. sociology or psychology) and normative ethics. But while there are numerous multidisciplinary research projects combining life sciences and normative ethics, there is little explicit methodological reflection on how to integrate the two fields, or on the goals and rationales of such interdisciplinary cooperation. In this paper we review some drivers of the tendency of empirical bioethics methodologies to focus on the collaboration of normative ethics with the social sciences in particular. Subsequently, we argue that the ends of empirical bioethics, not the empirical methods, are decisive for the question of which empirical disciplines can contribute to empirical bioethics in a meaningful way. Using already existing types of research integration as a springboard, five possible types of research encompassing life sciences and normative analysis illustrate how such cooperation can be conceptualized from a methodological perspective within empirical bioethics. We conclude with a reflection on the limitations and challenges of empirical bioethics research that integrates life sciences.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sorin Zaharia; C.Z. Cheng
In this paper, we study whether the magnetic field of the T96 empirical model can be in force balance with an isotropic plasma pressure distribution. Using the field of T96, we obtain values for the pressure P by solving a Poisson-type equation ∇²P = ∇ · (J × B) in the equatorial plane, and 1-D profiles on the Sun-Earth axis by integrating ∇P = J × B. We work in a flux coordinate system in which the magnetic field is expressed in terms of Euler potentials. Our results lead to the conclusion that the T96 model field cannot be in equilibrium with an isotropic pressure. We also analyze in detail the computation of Birkeland currents using the Vasyliunas relation and the T96 field, which yields unphysical results, again indicating the lack of force balance in the empirical model. The underlying reason for the force imbalance is likely the fact that the derivatives of the least-square fitted model B are not accurate predictions of the actual magnetospheric field derivatives. Finally, we discuss a possible solution to the problem of lack of force balance in empirical field models.
Lin, B Y; Wan, T T
1999-12-01
Few empirical analyses have been done in organizational research on integrated healthcare networks (IHNs) or integrated healthcare delivery systems. Using a contingency-derived contact-process-performance model, this study attempts to explore the relationships among an IHN's strategic direction, structural design, and performance. A cross-sectional analysis of 100 IHNs suggests that certain contextual factors, such as market competition and network age and tax status, have statistically significant effects on the implementation of an IHN's service differentiation strategy, which addresses coordination and control in the market. An IHN's service differentiation strategy is positively related to its integrated structural design, which is characterized as integration of administration, patient care, and information systems across different settings. However, no evidence supports the claim that the development of an integrated structural design benefits an IHN's performance in terms of clinical efficiency and financial viability.
NASA Technical Reports Server (NTRS)
Blum, P. W.; Harris, I.
1973-01-01
The equations of horizontal motion of the neutral atmosphere between 120 and 500 km are integrated with the inclusion of all the nonlinear terms of the convective derivative and the viscous forces due to vertical and horizontal velocity gradients. Empirical models of the distribution of neutral and charged particles are assumed to be known. The model of velocities developed is a steady state model. In part 1 the mathematical method used in the integration of the Navier-Stokes equations is described and the various forces are analysed.
ERIC Educational Resources Information Center
Moore, Stephanie L.
2009-01-01
Although ethics are commonly regarded as an important characteristic and performance attribute, they are also regarded as a slippery or ill-defined topic leaving practitioners and faculty flat-footed in how to teach and assess ethics. This article reports part of the findings from an investigation on deriving an empirical definition of ethics,…
Empirical data and moral theory. A plea for integrated empirical ethics.
Molewijk, Bert; Stiggelbout, Anne M; Otten, Wilma; Dupuis, Heleen M; Kievit, Job
2004-01-01
Ethicists differ considerably in their reasons for using empirical data. This paper presents a brief overview of four traditional approaches to the use of empirical data: "the prescriptive applied ethicists," "the theorists," "the critical applied ethicists," and "the particularists." The main aim of this paper is to introduce a fifth approach of more recent date (i.e. "integrated empirical ethics") and to offer some methodological directives for research in integrated empirical ethics. All five approaches are presented in a table for heuristic purposes. The table consists of eight columns: "view on distinction descriptive-prescriptive sciences," "location of moral authority," "central goal(s)," "types of normativity," "use of empirical data," "method," "interaction empirical data and moral theory," and "cooperation with descriptive sciences." Ethicists can use the table in order to identify their own approach. Reflection on these issues prior to starting research in empirical ethics should lead to harmonization of the different scientific disciplines and effective planning of the final research design. Integrated empirical ethics (IEE) refers to studies in which ethicists and descriptive scientists cooperate continuously and intensively. Both disciplines try to integrate moral theory and empirical data in order to reach a normative conclusion with respect to a specific social practice. IEE is not wholly prescriptive or wholly descriptive, since IEE assumes an interdependence between facts and values and between the empirical and the normative. The paper ends with three suggestions for consideration on some of the future challenges of integrated empirical ethics.
Dimensions of integration, continuity and longitudinality in clinical clerkships.
Ellaway, Rachel H; Graves, Lisa; Cummings, Beth-Ann
2016-09-01
Over the past few decades, longitudinal integrated clerkships (LICs) have been proposed to address many perceived shortcomings of traditional block clerkships. This growing interest in LICs has raised broader questions regarding the role of integration, continuity and longitudinality in medical education. A study with complementary theoretical and empirical dimensions was conducted to derive a more precise way of defining these three underlying concepts within the design of medical education curricula. The theoretical dimension involved a thematic review of the literature on integration, continuity and longitudinality in medical education. The empirical dimension surveyed all 17 Canadian medical schools on how they have operationalised integration, continuity and longitudinality in their undergraduate programmes. The two dimensions were iteratively synthesised to explore the meaning and expression of integration, continuity and longitudinality in medical education curriculum design. Integration, continuity and longitudinality were expressed in many ways and forms, including: integration of clinical disciplines, combined horizontal integration and vertical integration, and programme-level integration. Types of continuity included: continuity of patients, continuity of teaching, continuity of location and peer continuity. Longitudinality focused on connected or repeating episodes of training or on connecting activities, such as encounter logging across educational episodes. Twelve of the 17 schools were running an LIC of some kind, although only one school had a mandatory LIC experience. An ordinal scale of uses of integration, continuity and longitudinality during clerkships was developed, and new definitions of these concepts in the clerkship context were generated. Different clerkship designs embodied different forms and levels of integration, continuity and longitudinality.
A dichotomous view of LICs and rotation-based clerkships was found not to represent current practices in Canada, which instead tended to fall along a continuum of integration, continuity and longitudinality. © 2016 John Wiley & Sons Ltd and The Association for the Study of Medical Education.
Improving Marine Ecosystem Models with Biochemical Tracers
NASA Astrophysics Data System (ADS)
Pethybridge, Heidi R.; Choy, C. Anela; Polovina, Jeffrey J.; Fulton, Elizabeth A.
2018-01-01
Empirical data on food web dynamics and predator-prey interactions underpin ecosystem models, which are increasingly used to support strategic management of marine resources. These data have traditionally derived from stomach content analysis, but new and complementary forms of ecological data are increasingly available from biochemical tracer techniques. Extensive opportunities exist to improve the empirical robustness of ecosystem models through the incorporation of biochemical tracer data and derived indices, an area that is rapidly expanding because of advances in analytical developments and sophisticated statistical techniques. Here, we explore the trophic information required by ecosystem model frameworks (species, individual, and size based) and match them to the most commonly used biochemical tracers (bulk tissue and compound-specific stable isotopes, fatty acids, and trace elements). Key quantitative parameters derived from biochemical tracers include estimates of diet composition, niche width, and trophic position. Biochemical tracers also provide powerful insight into the spatial and temporal variability of food web structure and the characterization of dominant basal and microbial food web groups. A major challenge in incorporating biochemical tracer data into ecosystem models is scale and data type mismatches, which can be overcome with greater knowledge exchange and numerical approaches that transform, integrate, and visualize data.
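The abstract names diet composition, niche width, and trophic position as key tracer-derived parameters. As a concrete illustration, trophic position is commonly estimated from bulk nitrogen stable isotopes with the standard baseline formula sketched below; the default values (baseline trophic level 2.0, enrichment of ~3.4‰ per level) are conventional assumptions, not parameters taken from any specific model in the review.

```python
def trophic_position(d15n_consumer, d15n_base, base_tp=2.0, tef=3.4):
    """Trophic position from bulk nitrogen stable isotopes.

    d15n_consumer, d15n_base: delta-15N of the consumer and of the isotopic
    baseline organism (permil).
    base_tp: trophic level of the baseline (2.0 for a primary consumer).
    tef: trophic enrichment factor per level (~3.4 permil is a common default).
    """
    return base_tp + (d15n_consumer - d15n_base) / tef

# A predator at 13.8 permil over a primary-consumer baseline at 7.0 permil:
print(round(trophic_position(13.8, 7.0), 2))  # → 4.0
```

Quantities like this, computed per species or size class, are what ecosystem model frameworks would ingest as empirical constraints.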
Using Landsat to provide potato production estimates to Columbia Basin farmers and processors
NASA Technical Reports Server (NTRS)
1990-01-01
A summary of project activities relative to the estimation of potato yields in the Columbia Basin is given. Oregon State University is using a two-pronged approach to yield estimation, one using simulation models and the other using purely empirical models. The simulation modeling approach has used satellite observations to determine key dates in the development of the crop for each field identified as potatoes. In particular, these include planting dates, emergence dates, and harvest dates. These critical dates are fed into simulation models of crop growth and development to derive yield forecasts. Two empirical modeling approaches are illustrated. One relates tuber yield to estimates of cumulative intercepted solar radiation; the other relates tuber yield to the integral under the GVI curve.
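The second empirical approach above relates tuber yield to cumulative intercepted solar radiation. A minimal sketch of that kind of model, assuming a simple radiation-use-efficiency form with illustrative coefficients (not values derived by the project):

```python
def tuber_yield(daily_ipar, rue=1.8, harvest_index=0.75):
    """Toy radiation-driven yield model.

    daily_ipar: daily intercepted PAR in MJ/m^2 (e.g. inferred from
                satellite-derived canopy cover between emergence and harvest).
    rue: radiation-use efficiency, g dry matter per MJ PAR (illustrative).
    harvest_index: fraction of dry matter allocated to tubers (illustrative).
    Returns tuber dry matter in g/m^2.
    """
    return harvest_index * rue * sum(daily_ipar)

season = [5.0] * 100  # 100 days at 5 MJ/m^2/day of intercepted PAR
print(round(tuber_yield(season), 1))  # → 675.0
```

In practice the regression coefficients would be fitted against ground-truth yields rather than assumed.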
ResearchMaps.org for integrating and planning research.
Matiasz, Nicholas J; Wood, Justin; Doshi, Pranay; Speier, William; Beckemeyer, Barry; Wang, Wei; Hsu, William; Silva, Alcino J
2018-01-01
To plan experiments, a biologist needs to evaluate a growing set of empirical findings and hypothetical assertions from diverse fields that use increasingly complex techniques. To address this problem, we operationalized principles (e.g., convergence and consistency) that biologists use to test causal relations and evaluate experimental evidence. With the framework we derived, we then created a free, open-source web application that allows biologists to create research maps, graph-based representations of empirical evidence and hypothetical assertions found in research articles, reviews, and other sources. With our ResearchMaps web application, biologists can systematically reason through the research that is most important to them, as well as evaluate and plan experiments with a breadth and precision that are unlikely without such a tool.
ResearchMaps.org for integrating and planning research
Speier, William; Beckemeyer, Barry; Wang, Wei; Hsu, William; Silva, Alcino J.
2018-01-01
To plan experiments, a biologist needs to evaluate a growing set of empirical findings and hypothetical assertions from diverse fields that use increasingly complex techniques. To address this problem, we operationalized principles (e.g., convergence and consistency) that biologists use to test causal relations and evaluate experimental evidence. With the framework we derived, we then created a free, open-source web application that allows biologists to create research maps, graph-based representations of empirical evidence and hypothetical assertions found in research articles, reviews, and other sources. With our ResearchMaps web application, biologists can systematically reason through the research that is most important to them, as well as evaluate and plan experiments with a breadth and precision that are unlikely without such a tool. PMID:29723213
Mapping a research agenda for the science of team science
Falk-Krzesinski, Holly J; Contractor, Noshir; Fiore, Stephen M; Hall, Kara L; Kane, Cathleen; Keyton, Joann; Klein, Julie Thompson; Spring, Bonnie; Stokols, Daniel; Trochim, William
2012-01-01
An increase in cross-disciplinary, collaborative team science initiatives over the last few decades has spurred interest by multiple stakeholder groups in empirical research on scientific teams, giving rise to an emergent field referred to as the science of team science (SciTS). This study employed a collaborative team science concept-mapping evaluation methodology to develop a comprehensive research agenda for the SciTS field. Its integrative mixed-methods approach combined group process with statistical analysis to derive a conceptual framework that identifies research areas of team science and their relative importance to the emerging SciTS field. The findings from this concept-mapping project constitute a lever for moving SciTS forward at theoretical, empirical, and translational levels. PMID:23223093
NASA Astrophysics Data System (ADS)
Strauch, R. L.; Istanbulluoglu, E.
2017-12-01
We develop a landslide hazard modeling approach that integrates a data-driven statistical model and a probabilistic process-based shallow landslide model for mapping probability of landslide initiation, transport, and deposition at regional scales. The empirical model integrates the influence of seven site-attribute (SA) classes: elevation, slope, curvature, aspect, land use-land cover, lithology, and topographic wetness index, on over 1,600 observed landslides using a frequency ratio (FR) approach. A susceptibility index is calculated by adding FRs for each SA on a grid-cell basis. Using landslide observations, we relate susceptibility index to an empirically-derived probability of landslide impact. This probability is combined with results from a physically-based model to produce an integrated probabilistic map. Slope was key in landslide initiation, while deposition was linked to lithology and elevation. Vegetation transition from forest to alpine vegetation and barren land cover with lower root cohesion leads to higher frequency of initiation. Aspect effects are likely linked to differences in root cohesion and moisture controlled by solar insolation and snow. We demonstrate the model in the North Cascades of Washington, USA, and identify locations of high and low probability of landslide impacts that can be used by land managers in their design, planning, and maintenance.
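The frequency-ratio susceptibility index described above (an FR per site-attribute class, summed over classes on a grid-cell basis) can be sketched as follows; the class labels and counts are hypothetical, chosen only to make the arithmetic concrete.

```python
def frequency_ratio(ls_in_class, cells_in_class, ls_total, cells_total):
    """FR of one site-attribute class: the share of landslides falling in the
    class divided by the share of grid cells in the class. FR > 1 means the
    class is over-represented among observed landslides."""
    return (ls_in_class / ls_total) / (cells_in_class / cells_total)

# Hypothetical inventory: 1600 landslides over 1,000,000 grid cells.
fr_by_class = {
    ("slope", "30-45 deg"): frequency_ratio(800, 200_000, 1600, 1_000_000),
    ("lithology", "glacial till"): frequency_ratio(400, 100_000, 1600, 1_000_000),
}

# Susceptibility index of one cell = sum of the FRs of the classes it falls in.
cell_classes = [("slope", "30-45 deg"), ("lithology", "glacial till")]
susceptibility = sum(fr_by_class[c] for c in cell_classes)
print(susceptibility)  # → 5.0
```

The paper then maps this index to an empirical probability of landslide impact; that calibration step depends on the observed inventory and is not reproduced here.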
Sinabro: A Smartphone-Integrated Opportunistic Electrocardiogram Monitoring System
Kwon, Sungjun; Lee, Dongseok; Kim, Jeehoon; Lee, Youngki; Kang, Seungwoo; Seo, Sangwon; Park, Kwangsuk
2016-01-01
In our preliminary study, we proposed a smartphone-integrated, unobtrusive electrocardiogram (ECG) monitoring system, Sinabro, which monitors a user’s ECG opportunistically during daily smartphone use without explicit user intervention. The proposed system also monitors ECG-derived features, such as heart rate (HR) and heart rate variability (HRV), to support the pervasive healthcare apps for smartphones based on the user’s high-level contexts, such as stress and affective state levels. In this study, we have extended the Sinabro system by: (1) upgrading the sensor device; (2) improving the feature extraction process; and (3) evaluating extensions of the system. We evaluated these extensions with a good set of algorithm parameters that were suggested based on empirical analyses. The results showed that the system could capture ECG reliably and extract highly accurate ECG-derived features with a reasonable rate of data drop during the user’s daily smartphone use. PMID:26978364
Sinabro: A Smartphone-Integrated Opportunistic Electrocardiogram Monitoring System.
Kwon, Sungjun; Lee, Dongseok; Kim, Jeehoon; Lee, Youngki; Kang, Seungwoo; Seo, Sangwon; Park, Kwangsuk
2016-03-11
In our preliminary study, we proposed a smartphone-integrated, unobtrusive electrocardiogram (ECG) monitoring system, Sinabro, which monitors a user's ECG opportunistically during daily smartphone use without explicit user intervention. The proposed system also monitors ECG-derived features, such as heart rate (HR) and heart rate variability (HRV), to support the pervasive healthcare apps for smartphones based on the user's high-level contexts, such as stress and affective state levels. In this study, we have extended the Sinabro system by: (1) upgrading the sensor device; (2) improving the feature extraction process; and (3) evaluating extensions of the system. We evaluated these extensions with a good set of algorithm parameters that were suggested based on empirical analyses. The results showed that the system could capture ECG reliably and extract highly accurate ECG-derived features with a reasonable rate of data drop during the user's daily smartphone use.
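The abstract does not specify which HRV features Sinabro extracts beyond HR and HRV in general; two standard time-domain measures (SDNN and RMSSD) computed from RR intervals give a minimal, hedged sketch of what such a feature extractor computes.

```python
import math
import statistics

def hrv_features(rr_ms):
    """Standard time-domain HRV features from RR intervals in milliseconds.

    mean_hr: beats per minute derived from the mean RR interval.
    sdnn:    standard deviation of the RR intervals.
    rmssd:   root mean square of successive RR differences.
    """
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    return {
        "mean_hr": 60_000 / statistics.mean(rr_ms),
        "sdnn": statistics.stdev(rr_ms),
        "rmssd": math.sqrt(statistics.mean([d * d for d in diffs])),
    }

beats = [800, 810, 790, 805, 795, 800]  # RR intervals near 800 ms (~75 bpm)
f = hrv_features(beats)
print(round(f["mean_hr"], 1))  # → 75.0
```

A real opportunistic system would additionally have to reject segments corrupted by motion artifacts before computing these features, which is part of what the paper's data-drop rate reflects.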
Renormalization of the fragmentation equation: exact self-similar solutions and turbulent cascades.
Saveliev, V L; Gorokhovski, M A
2012-12-01
Using an approach developed earlier for renormalization of the Boltzmann collision integral [Saveliev and Nanbu, Phys. Rev. E 65, 051205 (2002)], we derive an exact divergence form for the fragmentation operator. Then we reduce the fragmentation equation to the continuity equation in size space, with the flux given explicitly. This allows us to obtain self-similar solutions and to find the integral of motion for these solutions (we call it the bare flux). We show how these solutions can be applied as a description of cascade processes in three- and two-dimensional turbulence. We also suggest an empirical cascade model of impact fragmentation of brittle materials.
Construct Validation of Wenger's Support Network Typology.
Szabo, Agnes; Stephens, Christine; Allen, Joanne; Alpass, Fiona
2016-10-07
The study aimed to validate Wenger's empirically derived support network typology, using responses to the Practitioner Assessment of Network Type (PANT) in an older New Zealand population. The configuration of network types was tested across ethnic groups and in the total sample. Data (N = 872, M_age = 67 years, SD_age = 1.56 years) from the 2006 wave of the New Zealand Health, Work and Retirement study were analyzed using latent profile analysis. In addition, demographic differences among the emerging profiles were tested. Competing models were evaluated based on a range of fit criteria, which supported a five-profile solution. The "locally integrated," "community-focused," "local self-contained," "private-restricted," and "friend- and family-dependent" network types were identified as latent profiles underlying the data. There were no differences between Māori and non-Māori in final profile configurations. However, Māori were more likely to report integrated network types. Findings confirm the validity of Wenger's network types. However, the level to which participants endorse accessibility of family, frequency of interactions, and community engagement can be influenced by sample and contextual characteristics. Future research using the PANT items should empirically verify and derive the social support network types, rather than use a predefined scoring system. © The Author 2016. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Integrating Empirical-Modeling Approaches to Improve Understanding of Terrestrial Ecology Processes
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCarthy, Heather; Luo, Yiqi; Wullschleger, Stan D
Recent decades have seen tremendous increases in the quantity of empirical ecological data collected by individual investigators, as well as through research networks such as FLUXNET (Baldocchi et al., 2001). At the same time, advances in computer technology have facilitated the development and implementation of large and complex land surface and ecological process models. Separately, each of these information streams provides useful but imperfect information about ecosystems. To develop the best scientific understanding of ecological processes, and most accurately predict how ecosystems may cope with global change, integration of empirical and modeling approaches is necessary. However, true integration - in which models inform empirical research, which in turn informs models (Fig. 1) - is not yet common in ecological research (Luo et al., 2011). The goal of this workshop, sponsored by the Department of Energy, Office of Science, Biological and Environmental Research (BER) program, was to bring together members of the empirical and modeling communities to exchange ideas and discuss scientific practices for increasing empirical-model integration, and to explore infrastructure and/or virtual network needs for institutionalizing empirical-model integration (Yiqi Luo, University of Oklahoma, Norman, OK, USA). The workshop included presentations and small group discussions that covered topics ranging from model-assisted experimental design to data-driven modeling (e.g. benchmarking and data assimilation) to infrastructure needs for empirical-model integration. Ultimately, three central questions emerged. How can models be used to inform experiments and observations? How can experimental and observational results be used to inform models? What are effective strategies to promote empirical-model integration?
NASA Astrophysics Data System (ADS)
Pelowski, Matthew; Markey, Patrick S.; Forster, Michael; Gerger, Gernot; Leder, Helmut
2017-07-01
This paper has a rather audacious purpose: to present a comprehensive theory explaining, and further providing hypotheses for the empirical study of, the multiple ways by which people respond to art. Despite common agreement that interaction with art can be based on a compelling, and occasionally profound, psychological experience, the nature of these interactions is still under debate. We propose a model, The Vienna Integrated Model of Art Perception (VIMAP), with the goal of resolving the multifarious processes that can occur when we perceive and interact with visual art. Specifically, we focus on the need to integrate bottom-up, artwork-derived processes, which have formed the bulk of previous theoretical and empirical assessments, with top-down mechanisms which can describe how individuals adapt or change within their processing experience, and thus how individuals may come to particularly moving, disturbing, transformative, as well as mundane, results. This is achieved by combining several recent lines of theoretical research into a new integrated approach built around three processing checks, which we argue can be used to systematically delineate the possible outcomes in art experience. We also connect our model's processing stages to specific hypotheses for emotional, evaluative, and physiological factors, and address main topics in psychological aesthetics including provocative reactions (chills, awe, thrills, the sublime) and the difference between "aesthetic" and "everyday" emotional response. Finally, we take the needed step of connecting stages to functional regions in the brain, as well as broader core networks that may coincide with the proposed cognitive checks, and which taken together can serve as a basis for future empirical and theoretical art research.
Pelowski, Matthew; Markey, Patrick S; Forster, Michael; Gerger, Gernot; Leder, Helmut
2017-07-01
This paper has a rather audacious purpose: to present a comprehensive theory explaining, and further providing hypotheses for the empirical study of, the multiple ways by which people respond to art. Despite common agreement that interaction with art can be based on a compelling, and occasionally profound, psychological experience, the nature of these interactions is still under debate. We propose a model, The Vienna Integrated Model of Art Perception (VIMAP), with the goal of resolving the multifarious processes that can occur when we perceive and interact with visual art. Specifically, we focus on the need to integrate bottom-up, artwork-derived processes, which have formed the bulk of previous theoretical and empirical assessments, with top-down mechanisms which can describe how individuals adapt or change within their processing experience, and thus how individuals may come to particularly moving, disturbing, transformative, as well as mundane, results. This is achieved by combining several recent lines of theoretical research into a new integrated approach built around three processing checks, which we argue can be used to systematically delineate the possible outcomes in art experience. We also connect our model's processing stages to specific hypotheses for emotional, evaluative, and physiological factors, and address main topics in psychological aesthetics including provocative reactions-chills, awe, thrills, sublime-and difference between "aesthetic" and "everyday" emotional response. Finally, we take the needed step of connecting stages to functional regions in the brain, as well as broader core networks that may coincide with the proposed cognitive checks, and which taken together can serve as a basis for future empirical and theoretical art research. Copyright © 2017 Elsevier B.V. All rights reserved.
EFFECTIVE USE OF SEDIMENT QUALITY GUIDELINES: WHICH GUIDELINE IS RIGHT FOR ME?
A bewildering array of sediment quality guidelines has been developed, but fortunately they mostly fall into two families: empirically-derived and theoretically-derived. The empirically-derived guidelines use large databases of concurrent sediment chemistry and biological effe...
NASA Astrophysics Data System (ADS)
Hu, Y.; Vaughan, M.; McClain, C.; Behrenfeld, M.; Maring, H.; Anderson, D.; Sun-Mack, S.; Flittner, D.; Huang, J.; Wielicki, B.; Minnis, P.; Weimer, C.; Trepte, C.; Kuehn, R.
2007-03-01
This study presents an empirical relation that links layer integrated depolarization ratios, the extinction coefficients, and effective radii of water clouds, based on Monte Carlo simulations of CALIPSO lidar observations. Combined with cloud effective radius retrieved from MODIS, cloud liquid water content and effective number density of water clouds are estimated from CALIPSO lidar depolarization measurements in this study. Global statistics of the cloud liquid water content and effective number density are presented.
NASA Astrophysics Data System (ADS)
Tu, Rui; Wang, Rongjiang; Zhang, Yong; Walter, Thomas R.
2014-06-01
The description of static displacements associated with earthquakes is traditionally achieved using GPS, EDM or InSAR data. In addition, displacement histories can be derived from strong-motion records, allowing an improvement of geodetic networks at a high sampling rate and a better physical understanding of earthquake processes. Strong-motion records require a correction procedure appropriate for baseline shifts that may be caused by rotational motion, tilting and other instrumental effects. Common methods use an empirical bilinear correction on the velocity seismograms integrated from the strong-motion records. In this study, we overcome the weaknesses of an empirically based bilinear baseline correction scheme by using a net-based criterion to select the timing parameters. This idea is based on the physical principle that low-frequency seismic waveforms at neighbouring stations are coherent if the interstation distance is much smaller than the distance to the seismic source. For a dense strong-motion network, it is plausible to select the timing parameters so that the correlation coefficient between the velocity seismograms of two neighbouring stations is maximized after the baseline correction. We applied this new concept to the KiK-Net and K-Net strong-motion data available for the 2011 Mw 9.0 Tohoku earthquake. We compared the derived coseismic static displacement with high-quality GPS data, and with the results obtained using empirical methods. The results show that the proposed net-based approach is feasible and more robust than the individual empirical approaches. The outliers caused by unknown problems in the measurement system can be easily detected and quantified.
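The net-based criterion above — choose the baseline-correction timing parameter that maximizes the correlation between corrected velocity seismograms at neighbouring stations — can be sketched as follows. For brevity the bilinear correction is reduced here to removal of a single ramp starting at a candidate hinge sample, an illustrative simplification of the scheme described in the abstract.

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

def correct(vel, t1):
    """One-parameter stand-in for a bilinear baseline correction: fit the
    post-t1 slope by least squares and subtract that ramp from the tail."""
    tail = vel[t1:]
    n = len(tail)
    idx = list(range(n))
    mt, mv = sum(idx) / n, sum(tail) / n
    slope = (sum((i - mt) * (v - mv) for i, v in zip(idx, tail))
             / sum((i - mt) ** 2 for i in idx))
    return vel[:t1] + [v - slope * i for i, v in zip(idx, tail)]

def best_t1(vel, neighbor, candidates):
    """Net-based criterion: pick the timing parameter whose correction makes
    the trace most coherent with a neighbouring station's record."""
    return max(candidates, key=lambda t1: pearson(correct(vel, t1), neighbor))

# Synthetic check: a clean neighbour trace, and a copy contaminated by a
# baseline ramp starting at sample 50; the criterion should recover the hinge.
signal = [math.sin(2 * math.pi * i / 40) for i in range(100)]
shifted = [s + 0.05 * max(0, i - 50) for i, s in enumerate(signal)]
print(best_t1(shifted, signal, [30, 50, 70]))  # should recover 50
```

The real method jointly optimizes the bilinear correction's two timing parameters over a dense network, but the selection principle — maximizing inter-station waveform coherence — is the same.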
Integrating animal movement with habitat suitability for estimating dynamic landscape connectivity
van Toor, Mariëlle L.; Kranstauber, Bart; Newman, Scott H.; Prosser, Diann J.; Takekawa, John Y.; Technitis, Georgios; Weibel, Robert; Wikelski, Martin; Safi, Kamran
2018-01-01
Context: High-resolution animal movement data are becoming increasingly available, yet having a multitude of empirical trajectories alone does not allow us to easily predict animal movement. To answer ecological and evolutionary questions at a population level, quantitative estimates of a species’ potential to link patches or populations are of importance. Objectives: We introduce an approach that combines movement-informed simulated trajectories with an environment-informed estimate of the trajectories’ plausibility to derive connectivity. Using the example of bar-headed geese we estimated migratory connectivity at a landscape level throughout the annual cycle in their native range. Methods: We used tracking data of bar-headed geese to develop a multi-state movement model and to estimate temporally explicit habitat suitability within the species’ range. We simulated migratory movements between range fragments, and calculated a measure we called route viability. The results are compared to expectations derived from published literature. Results: Simulated migrations matched empirical trajectories in key characteristics such as stopover duration. The viability of the simulated trajectories was similar to that of the empirical trajectories. We found that, overall, the migratory connectivity was higher within the breeding than in wintering areas, corroborating previous findings for this species. Conclusions: We show how empirical tracking data and environmental information can be fused for meaningful predictions of animal movements throughout the year and even outside the spatial range of the available data. Beyond predicting migratory connectivity, our framework will prove useful for modelling ecological processes facilitated by animal movement, such as seed dispersal or disease ecology.
Renormalization of the fragmentation equation: Exact self-similar solutions and turbulent cascades
NASA Astrophysics Data System (ADS)
Saveliev, V. L.; Gorokhovski, M. A.
2012-12-01
Using an approach developed earlier for renormalization of the Boltzmann collision integral [Saveliev and Nanbu, Phys. Rev. E 65, 051205 (2002)], we derive an exact divergence form for the fragmentation operator. Then we reduce the fragmentation equation to the continuity equation in size space, with the flux given explicitly. This allows us to obtain self-similar solutions and to find the integral of motion for these solutions (we call it the bare flux). We show how these solutions can be applied as a description of cascade processes in three- and two-dimensional turbulence. We also suggest an empirical cascade model of impact fragmentation of brittle materials.
Cramer, Robert J.; Johnson, Shara M.; McLaughlin, Jennifer; Rausch, Emilie M.; Conroy, Mary Alice
2014-01-01
Clinical and counseling psychology programs currently lack adequate evidence-based competency goals and training in suicide risk assessment. To begin to address this problem, this article proposes core competencies and an integrated training framework that can form the basis for training and research in this area. First, we evaluate the extent to which current training is effective in preparing trainees for suicide risk assessment. Within this discussion, sample and methodological issues are reviewed. Second, as an extension of these methodological training issues, we integrate empirically- and expert-derived suicide risk assessment competencies from several sources with the goal of streamlining core competencies for training purposes. Finally, a framework for suicide risk assessment training is outlined. The approach employs Objective Structured Clinical Examination (OSCE) methodology, an approach commonly utilized in medical competency training. The training modality also proposes the Suicide Competency Assessment Form (SCAF), a training tool evaluating self- and observer-ratings of trainee core competencies. The training framework and SCAF are ripe for empirical evaluation and potential training implementation. PMID:24672588
Cramer, Robert J; Johnson, Shara M; McLaughlin, Jennifer; Rausch, Emilie M; Conroy, Mary Alice
2013-02-01
Clinical and counseling psychology programs currently lack adequate evidence-based competency goals and training in suicide risk assessment. To begin to address this problem, this article proposes core competencies and an integrated training framework that can form the basis for training and research in this area. First, we evaluate the extent to which current training is effective in preparing trainees for suicide risk assessment. Within this discussion, sample and methodological issues are reviewed. Second, as an extension of these methodological training issues, we integrate empirically- and expert-derived suicide risk assessment competencies from several sources with the goal of streamlining core competencies for training purposes. Finally, a framework for suicide risk assessment training is outlined. The approach employs Objective Structured Clinical Examination (OSCE) methodology, an approach commonly utilized in medical competency training. The training modality also proposes the Suicide Competency Assessment Form (SCAF), a training tool evaluating self- and observer-ratings of trainee core competencies. The training framework and SCAF are ripe for empirical evaluation and potential training implementation.
Measuring Integrated Information from the Decoding Perspective
Oizumi, Masafumi; Amari, Shun-ichi; Yanagawa, Toru; Fujii, Naotaka; Tsuchiya, Naotsugu
2016-01-01
Accumulating evidence indicates that the capacity to integrate information in the brain is a prerequisite for consciousness. Integrated Information Theory (IIT) of consciousness provides a mathematical approach to quantifying the information integrated in a system, called integrated information, Φ. Integrated information is defined theoretically as the amount of information a system generates as a whole, above and beyond the amount of information its parts independently generate. IIT predicts that the amount of integrated information in the brain should reflect levels of consciousness. Empirical evaluation of this theory requires computing integrated information from neural data acquired from experiments, although difficulties with using the original measure Φ preclude such computations. Although some practical measures have been previously proposed, we found that these measures fail to satisfy the theoretical requirements as a measure of integrated information. Measures of integrated information should satisfy the lower and upper bounds as follows: The lower bound of integrated information should be 0 and is equal to 0 when the system does not generate information (no information) or when the system comprises independent parts (no integration). The upper bound of integrated information is the amount of information generated by the whole system. Here we derive the novel practical measure Φ* by introducing a concept of mismatched decoding developed from information theory. We show that Φ* is properly bounded from below and above, as required, as a measure of integrated information. We derive the analytical expression of Φ* under the Gaussian assumption, which makes it readily applicable to experimental data. Our novel measure Φ* can generally be used as a measure of integrated information in research on consciousness, and also as a tool for network analysis in diverse areas of biology. PMID:26796119
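The lower-bound requirement (zero integrated information when the system comprises independent parts) can be illustrated with a toy Gaussian quantity. The total correlation below is a simple stand-in that shares this bound behaviour; it is not the paper's mismatched-decoding measure Φ*.

```python
import math

def total_correlation_2d(var1, var2, cov):
    """Multi-information of a bivariate Gaussian, in nats:
    0.5 * ln(var1 * var2 / det(Sigma)). It is 0 iff the two parts are
    independent (no integration) and grows with their coupling. This is a
    stand-in illustration of the lower bound, not the Phi* of the paper."""
    det = var1 * var2 - cov * cov
    return 0.5 * math.log(var1 * var2 / det)

print(total_correlation_2d(1.0, 1.0, 0.0))  # → 0.0 (independent parts)
print(total_correlation_2d(1.0, 1.0, 0.8) > 0.0)  # → True (coupled parts)
```

Φ* itself additionally subtracts the information the disconnected parts can still decode, which is what keeps it below the whole-system information (the upper bound).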
Evidence-based hypnotherapy for depression.
Alladin, Assen
2010-04-01
Cognitive hypnotherapy (CH) is a comprehensive evidence-based hypnotherapy for clinical depression. This article describes the major components of CH, which integrate hypnosis with cognitive-behavior therapy as the latter provides an effective host theory for the assimilation of empirically supported treatment techniques derived from various theoretical models of psychotherapy and psychopathology. CH meets criteria for an assimilative model of psychotherapy, which is considered to be an efficacious model of psychotherapy integration. The major components of CH for depression are described in sufficient detail to allow replication, verification, and validation of the techniques delineated. CH for depression provides a template that clinicians and investigators can utilize to study the additive effects of hypnosis in the management of other psychological or medical disorders. Evidence-based hypnotherapy and research are encouraged; such a movement is necessary if clinical hypnosis is to integrate into mainstream psychotherapy.
NASA Astrophysics Data System (ADS)
Hu, Y.; Vaughan, M.; McClain, C.; Behrenfeld, M.; Maring, H.; Anderson, D.; Sun-Mack, S.; Flittner, D.; Huang, J.; Wielicki, B.; Minnis, P.; Weimer, C.; Trepte, C.; Kuehn, R.
2007-06-01
This study presents an empirical relation that links the volume extinction coefficients of water clouds, the layer integrated depolarization ratios measured by lidar, and the effective radii of water clouds derived from collocated passive sensor observations. Based on Monte Carlo simulations of CALIPSO lidar observations, this method combines the cloud effective radius reported by MODIS with the lidar depolarization ratios measured by CALIPSO to estimate both the liquid water content and the effective number concentration of water clouds. The method is applied to collocated CALIPSO and MODIS measurements obtained during July and October of 2006, and January 2007. Global statistics of the cloud liquid water content and effective number concentration are presented.
Multi-scale predictive modeling of nano-material and realistic electron devices
NASA Astrophysics Data System (ADS)
Palaria, Amritanshu
Among the challenges faced in further miniaturization of electronic devices, the heavy influence of the detailed atomic configuration of the materials involved, which often differs significantly from that of the bulk materials, is prominent. Device design has therefore become highly interrelated with material engineering at the atomic level. This thesis aims at outlining, with examples, a multi-scale simulation procedure that allows one to integrate the material and device aspects of nano-electronic design to predict the behavior of novel devices made of novel materials. This is pursued in four parts: (1) An approach that combines a longer-time-scale reactive force field analysis with density functional theory to predict the structure of new materials is demonstrated for the first time for nanowires. Novel stable structures for very small diameter silicon nanowires are predicted. (2) Density functional theory is used to show that the new nanowire structures derived in (1) have properties different from diamond-core wires, even though the surface bonds in some may be similar to the surface of bulk silicon. (3) The electronic structure of relatively large-scale germanium sections of realistically strained Si/strained Ge/strained Si nanowire heterostructures is computed using empirical tight binding, and it is shown that the average non-homogeneous strain in these structures drives their non-conventional electronic characteristics, such as hole effective masses that decrease as the wire cross-section is reduced. (4) It is shown that tight binding, though empirical in nature, is not necessarily limited to the material and atomic structure for which the parameters have been empirically derived, and that simple changes may adapt the derived parameters to new bond environments. The Si (100) surface electronic structure is obtained from bulk Si parameters.
Landis, G.P.; Hofstra, A.H.
1991-01-01
Recent advances in instrumentation now permit quantitative analysis of gas species from individual fluid inclusions. Fluid inclusion gas data can be applied to minerals exploration empirically, to establish chemical (gas composition) signatures of the ore fluids, and conceptually, through the development of genetic models of ore formation from a framework of integrated geologic, geochemical, and isotopic investigations. Case studies of fluid inclusion gas chemistry from ore deposits representing a spectrum of ore-forming processes and environments are presented to illustrate both the empirical and conceptual approaches. We consider epithermal silver-gold deposits of Creede, Colorado, Carlin-type sediment-hosted disseminated gold deposits of Jerritt Canyon, Nevada, metamorphic silver-base-metal veins of the Coeur d'Alene district, Idaho and Montana, gold-quartz veins in accreted terranes of southern Alaska, and the mid-continent base-metal sulfide deposits of Mississippi Valley Type (MVTs). Variations in gas chemistry determine the redox state of the ore fluids, provide compositional input for gas geothermometers, characterize ore fluid chemistry (e.g., CH4/CO2, H2S/SO2, and CO2/H2S ratios, organic-rich fluids, gas-rich and gas-poor fluids), identify magmatic, meteoric, metamorphic, and shallow and deep basin fluids in ore systems, and locate upwelling plumes of magmatic-derived volatiles, zones of boiling and volatile separation, interfaces between contrasting fluids, and important zones of fluid mixing. Present techniques are immediately applicable to exploration programs as empirical studies that monitor fluid inclusion gas threshold concentration levels, presence or absence of certain gases, or changes in gas ratios.
We suggest that the greater contribution of fluid inclusion gas analysis is in the integrated and comprehensive chemical dimension that gas data impart to genetic models, and in the exploration concepts based on processes and environments of ore formation derived from these genetic models. ?? 1991.
Modeling of Inverted Annular Film Boiling using an integral method
NASA Astrophysics Data System (ADS)
Sridharan, Arunkumar
In modeling Inverted Annular Film Boiling (IAFB), several important phenomena, such as the interaction between the liquid and vapor phases and the irregular nature of the interface, which greatly influence the momentum and heat transfer at the interface, need to be accounted for. However, due to the complexity of these phenomena, they were not modeled in previous studies. Since two-phase heat transfer equations and relationships rely heavily on experimental data, many closure relationships that were used in previous studies to solve the problem are empirical in nature. Also, in deriving the relationships, the experimental data were often extrapolated beyond the intended range of conditions, causing errors in predictions. In some cases, empirical correlations that were derived from situations other than IAFB, and whose applicability to IAFB was questionable, were used. Moreover, arbitrary constants were introduced in the models developed in previous studies to provide a good fit to the experimental data. These constants have no physical basis, leading to questionable accuracy in the model predictions. In the present work, IAFB is modeled using an integral method. A two-dimensional formulation of IAFB is presented. Separate equations for the conservation of mass, momentum and energy are derived from first principles for the vapor film and the liquid core. Turbulence is incorporated in the formulation. The system of second-order partial differential equations is integrated over the radial direction to obtain a system of integro-differential equations. In order to solve the system of equations, second-order polynomial profiles are used to describe the nondimensional velocity and temperatures. The unknown coefficients in the profiles are functions of the axial direction alone.
Using the boundary conditions that govern the physical problem, equations for the unknown coefficients are derived in terms of the primary dependent variables: wall shear stress, interfacial shear stress, film thickness, pressure, wall temperature and the mass transfer rate due to evaporation. A system of non-linear first order coupled ordinary differential equations is obtained. Due to the inherent mathematical complexity of the system of equations, simplifying assumptions are made to obtain a numerical solution. The system of equations is solved numerically to obtain values of the unknown quantities at each subsequent axial location. Derived quantities like void fraction and heat transfer coefficient are calculated at each axial location. The calculation is terminated when the void fraction reaches a value of 0.6, the upper limit of IAFB. The results obtained agree with the experimental trends observed. Void fraction increases along the heated length, while the heat transfer coefficient drops due to the increased resistance of the vapor film as expected.
Calculation and Identification of the Aerodynamic Parameters for Small-Scaled Fixed-Wing UAVs.
Shen, Jieliang; Su, Yan; Liang, Qing; Zhu, Xinhua
2018-01-13
The establishment of the Aircraft Dynamic Model (ADM) constitutes the prerequisite for the design of the navigation and control system, but the aerodynamic parameters in the model cannot be readily obtained, especially for small-scaled fixed-wing UAVs. In this paper, a procedure for computing the aerodynamic parameters is developed. All the longitudinal and lateral aerodynamic derivatives are first calculated through a semi-empirical method based on aerodynamics, rather than wind tunnel tests or fluid dynamics software analysis. Second, the residuals of each derivative are identified or estimated further via an Extended Kalman Filter (EKF), with observations of the attitude and velocity from the airborne integrated navigation system. Meanwhile, the observability of the targeted parameters is analyzed and strengthened through multiple maneuvers. Based on a small-scaled fixed-wing aircraft driven by a propeller, the airborne sensors are chosen and the models of the actuators are constructed. Then, real flight tests are implemented to verify the calculation and identification process. Test results confirm the soundness of the semi-empirical method and show the improved accuracy of the ADM after compensation of the parameters.
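The identification step described above, estimating parameter residuals with an EKF, can be sketched on a toy problem. The scalar dynamics v' = -(c0 + dc)·v stand in for the full longitudinal/lateral model, with c0 playing the role of the semi-empirical estimate and dc the residual to identify; all names, noise levels, and tuning values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Hedged sketch: augment the state with an unknown parameter residual dc
# and estimate it with an EKF from noisy airspeed measurements.

rng = np.random.default_rng(1)
dt, c0, dc_true = 0.01, 0.50, 0.15      # true coefficient is c0 + dc_true

# simulate noisy airspeed measurements from the true dynamics
v, zs = 20.0, []
for _ in range(500):
    v += -(c0 + dc_true) * v * dt
    zs.append(v + 0.01 * rng.standard_normal())

# EKF over the augmented state x = [v, dc]; dc is modeled as a random walk
x = np.array([zs[0], 0.0])
P = np.diag([1.0, 1.0])
Q = np.diag([1e-6, 1e-8])               # process noise (tuning assumption)
R = 0.01 ** 2                           # measurement noise variance
H = np.array([[1.0, 0.0]])              # only airspeed is observed

for z in zs[1:]:
    # predict: f(x) = [v - dt*(c0 + dc)*v, dc], with Jacobian F
    F = np.array([[1.0 - dt * (c0 + x[1]), -dt * x[0]],
                  [0.0, 1.0]])
    x = np.array([x[0] - dt * (c0 + x[1]) * x[0], x[1]])
    P = F @ P @ F.T + Q
    # update with the airspeed measurement
    S = H @ P @ H.T + R
    K = (P @ H.T) / S
    x = x + K[:, 0] * (z - x[0])
    P = (np.eye(2) - K @ H) @ P

print(x[1])   # estimated residual, approaching dc_true = 0.15
```

The abstract's point about observability shows up here too: once the airspeed has decayed toward zero, the measurements carry almost no information about dc, which is why the paper strengthens observability through multiple maneuvers.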
Ooms, Gorik
2015-06-16
Global health research is essentially a normative undertaking: we use it to propose policies that ought to be implemented. To arrive at a normative conclusion in a logical way requires at least one normative premise, one that cannot be derived from empirical evidence alone. But there is no widely accepted normative premise for global health, and the actors with the power to set policies may use a different normative premise than the scholars that propose policies - which may explain the 'implementation gap' in global health. If global health scholars shy away from the normative debate - because it requires normative premises that cannot be derived from empirical evidence alone - they not only mislead each other, they also prevent and stymie debate on the role of the powerhouses of global health, their normative premises, and the rights and wrongs of these premises. The humanities and social sciences are better equipped - and less reluctant - to approach the normative debate in a scientifically valid manner, and ought to be better integrated in the interdisciplinary research that global health research is, or should be. © 2015 by Kerman University of Medical Sciences.
The definition of community integration: perspectives of people with brain injuries.
McColl, M A; Carlson, P; Johnston, J; Minnes, P; Shue, K; Davies, D; Karlovits, T
1998-01-01
Despite considerable attention to community integration and related topics in the past decades, a clear definition of community integration continues to elude researchers and service providers. Common to most discussions of the topic, however, are three ideas: that integration involves relationships with others, independence in one's living situation and activities to fill one's time. The present study sought to expand this conceptualization of community integration by asking people with brain injuries for their own perspectives on community integration. This qualitative study resulted in a definition of community integration consisting of nine indicators: orientation, acceptance, conformity, close and diffuse relationships, living situation, independence, productivity and leisure. These indicators were empirically derived from the text of 116 interviews with people with moderate-severe brain injuries living in the community. Eighteen adults living in supported living programmes were followed for 1 year, to track their evolving definition of integration and the factors they felt were related to integration. The study also showed a general trend toward more positive evaluation over the year, and revealed that positive evaluation was frequently related to meeting new people and freedom from staff supervision. These findings are interpreted in the light of recommendations for community programmes.
Organic unity theory: an integrative mind-body theory for psychiatry.
Goodman, A
1997-12-01
The potential of psychiatry as an integrative science has been impeded by an internal schism that derives from the duality of mental and physical. Organic unity theory is proposed as a conceptual framework that brings together the terms of the mind-body duality in one coherent perspective. Organic unity theory is braided of three strands: identity, which describes the relationship between mentally described events and corresponding physically described events; continuity, which describes the linguistic-conceptual system that contains both mental and physical terms; and dialectic, which describes the relationship between the empirical way of knowing that is associated with the physical domain of the linguistic-conceptual system and the hermeneutic way of knowing that is associated with the mental domain. Each strand represents an integrative formulation that resolves an aspect of mental-physical dualism into an underlying unity. After the theory is presented, its implications for psychiatry are briefly considered.
Essays on pricing electricity and electricity derivatives in deregulated markets
NASA Astrophysics Data System (ADS)
Popova, Julia
2008-10-01
This dissertation is composed of four essays on the behavior of wholesale electricity prices and their derivatives. The first essay provides an empirical model that takes into account the spatial features of a transmission network on the electricity market. The spatial structure of the transmission grid plays a key role in determining electricity prices, but it has not been incorporated into previous empirical models. The econometric model in this essay incorporates a simple representation of the transmission system into a spatial panel data model of electricity prices, and also accounts for the effect of dynamic transmission system constraints on electricity market integration. Empirical results using PJM data confirm the existence of spatial patterns in electricity prices and show that spatial correlation diminishes as transmission lines become more congested. The second essay develops and empirically tests a model of the influence of natural gas storage inventories on the electricity forward premium. I link a model of the effect of gas storage constraints on the higher moments of the distribution of electricity prices to a model of the effect of those moments on the forward premium. Empirical results using PJM data support the model's predictions that gas storage inventories sharply reduce the electricity forward premium when demand for electricity is high and space-heating demand for gas is low. The third essay examines the efficiency of PJM electricity markets. A market is efficient if prices reflect all relevant information, so that prices follow a random walk. The random-walk hypothesis is examined using empirical tests, including the Portmanteau, Augmented Dickey-Fuller, KPSS, and multiple variance ratio tests. The results are mixed, though evidence of some degree of market efficiency is found.
The last essay investigates the possibility that previous researchers have drawn spurious conclusions based on classical unit root tests incorrectly applied to wholesale electricity prices. It is well known that electricity prices exhibit both cyclicity and high volatility which varies through time. Results indicate that heterogeneity in unconditional variance---which is not detected by classical unit root tests---may contribute to the appearance of non-stationarity.
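The variance-ratio idea behind the random-walk tests in the third essay can be shown in a few lines: if log prices follow a random walk, the variance of q-period returns is q times the variance of 1-period returns, so the ratio is near 1. This is only the bare statistic on synthetic data, not the full Lo-MacKinlay test with its heteroskedasticity-robust z-statistics, and not the PJM price data used in the dissertation.

```python
import numpy as np

# Hedged sketch of a variance-ratio statistic: VR(q) ~ 1 under a random
# walk, and well below 1 for strongly mean-reverting prices.

def variance_ratio(prices, q):
    logp = np.log(prices)
    r1 = np.diff(logp)                 # 1-period log returns
    rq = logp[q:] - logp[:-q]          # overlapping q-period log returns
    return rq.var(ddof=1) / (q * r1.var(ddof=1))

rng = np.random.default_rng(2)
walk = np.exp(np.cumsum(0.01 * rng.standard_normal(50_000)))   # random walk
noisy_level = np.exp(0.1 * rng.standard_normal(50_000))        # strong mean reversion

print(variance_ratio(walk, 10))         # close to 1
print(variance_ratio(noisy_level, 10))  # well below 1 (about 1/q)
```

The last essay's caveat applies here as well: time-varying unconditional variance can distort such statistics, so a value far from 1 is only suggestive without robustness checks.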
Simons, Jeffrey S.
2017-01-01
The purpose of this paper was to describe and appraise the research evidence on the effects of acute alcohol intoxication and sexual arousal on sexual risk behaviors in men who have sex with men (MSM) and to examine its implications for design of HIV prevention interventions that target MSM. Toward that end, the paper begins with a discussion of research on sexual arousal in men and alcohol and their acute effects on sexual behaviors. This is followed by a review of empirical evidence on the combined acute effects of alcohol and sexual arousal in heterosexual men (the large majority of studies) and then in MSM. The empirical evidence and related theoretical developments then are integrated to derive implications for developing effective HIV prevention interventions that target MSM. PMID:26459332
An, Sungbae; Kwon, Young-Kyun; Yoon, Sungroh
2013-01-01
The assessment of information transfer in the global economic network helps to understand the current environment and the outlook of an economy. Most approaches on global networks extract information transfer based mainly on a single variable. This paper establishes an entirely new bioinformatics-inspired approach to integrating information transfer derived from multiple variables and develops an international economic network accordingly. In the proposed methodology, we first construct the transfer entropies (TEs) between various intra- and inter-country pairs of economic time series variables, test their significances, and then use a weighted sum approach to aggregate information captured in each TE. Through a simulation study, the new method is shown to deliver better information integration compared to existing integration methods in that it can be applied even when intra-country variables are correlated. Empirical investigation with the real world data reveals that Western countries are more influential in the global economic network and that Japan has become less influential following the Asian currency crisis. PMID:23300959
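The building block of the network construction described above, a pairwise transfer entropy between economic time series, can be sketched under a Gaussian/linear assumption: TE(X→Y) = ½·ln(Var(y_t | y_past) / Var(y_t | y_past, x_past)). The paper aggregates many such TEs across variables with a weighted sum; this sketch only shows a single pair on synthetic data where X drives Y but not vice versa, and all names are illustrative.

```python
import numpy as np

# Hedged sketch of Gaussian transfer entropy via OLS residual variances.

def resid_var(y, regs):
    """Variance of OLS residuals of y on the given regressors (plus intercept)."""
    X = np.column_stack([np.ones(len(y))] + regs)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return np.var(y - X @ beta)

def transfer_entropy(x, y):
    yt, yp, xp = y[1:], y[:-1], x[:-1]
    return 0.5 * np.log(resid_var(yt, [yp]) / resid_var(yt, [yp, xp]))

rng = np.random.default_rng(3)
T = 50_000
x = rng.standard_normal(T)              # driver series
y = np.zeros(T)
for t in range(1, T):
    y[t] = 0.5 * y[t - 1] + 0.4 * x[t - 1] + 0.1 * rng.standard_normal()

print(transfer_entropy(x, y) > transfer_entropy(y, x))  # True: info flows X -> Y
```

The paper's significance testing and weighted-sum aggregation across multiple intra- and inter-country variable pairs would sit on top of this primitive.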
Tudela-Torras, M; Abad-Mas, L; Tudela-Torras, E
2017-02-24
Today, the fact that sensory integration difficulties with a neurological basis exist and that they seriously condition the development of those individuals who suffer from them is widely accepted and acknowledged as being obvious by the vast majority of professionals working in the field of community healthcare. However, less is known and there is more controversy about effective treatments that can be applied to them. This is because many professionals criticise the fact that there is not enough scientific evidence to prove, both quantitatively and empirically, the outcomes of the interventions implemented as alternatives to pharmacological therapy. Consequently, when the symptoms and repercussions on the quality of life deriving from a distorted sensory integration are really disabling for the person, pharmacological treatment is used as the only possible approach, with the side effects that this entails. The reason for this is largely the fact that little is known about other effective therapeutic approaches, such as occupational therapy based on sensory integration.
NASA Astrophysics Data System (ADS)
Chen, Yue; Cunningham, Gregory; Henderson, Michael
2016-09-01
This study aims to statistically estimate the errors in local magnetic field directions that are derived from electron directional distributions measured by Los Alamos National Laboratory geosynchronous (LANL GEO) satellites. First, by comparing derived and measured magnetic field directions along the GEO orbit to those calculated from three selected empirical global magnetic field models (including a static Olson and Pfitzer 1977 quiet magnetic field model, a simple dynamic Tsyganenko 1989 model, and a sophisticated dynamic Tsyganenko 2001 storm model), it is shown that the errors in both derived and modeled directions are at least comparable. Second, using a newly developed proxy method as well as comparing results from empirical models, we are able to provide for the first time circumstantial evidence showing that derived magnetic field directions should statistically match the real magnetic directions better, with average errors < ~2°, than those from the three empirical models, with average errors > ~5°. In addition, our results suggest that the errors in derived magnetic field directions do not depend much on magnetospheric activity, in contrast to the empirical field models. Finally, as applications of the above conclusions, we show examples of electron pitch angle distributions observed by LANL GEO and also take the derived magnetic field directions as the real ones so as to test the performance of empirical field models along the GEO orbits, with results suggesting dependence on solar cycles as well as satellite locations. This study demonstrates the validity and value of the method that infers local magnetic field directions from particle spin-resolved distributions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Parson, E.A.
This project undertook a preliminary investigation of the conduct and use of assessments, particularly integrated assessments, in international negotiation and policy-making. The research involved review of existing secondary literatures, including related theoretical literatures of negotiation analysis and multi-party bargaining; review of archival and documentary material on a few international assessment cases; and interviews in North America and Europe with assessment managers and users. The project sought to identify empirical regularities in the relationships between assessment characteristics and the manner and extent of their contribution to policy-making; to specify and critically assess a set of candidate mechanisms through which assessments influence and assist international policy-making; and to derive from these investigations preliminary practical guidance for assessment design.
NASA Astrophysics Data System (ADS)
Cheong, Chin Wen
2008-02-01
This article investigated the influence of structural breaks on the fractionally integrated time-varying volatility model in the Malaysian stock market, covering the Kuala Lumpur composite index and four major sectoral indices. A fractionally integrated time-varying volatility model combined with sudden changes is developed to study the possibility of structural change in the empirical data sets. Our empirical results showed a substantial reduction in the fractional differencing parameters after the inclusion of structural change during the Asian financial and currency crises. Moreover, the fractionally integrated model with sudden changes in volatility performed better in the estimation and specification evaluations.
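The fractional differencing at the heart of such long-memory ("fractionally integrated") volatility models can be made concrete: the filter (1-L)^d has binomial weights w_0 = 1, w_k = w_{k-1}·(k-1-d)/k, which decay hyperbolically for 0 < d < 1. This sketch only illustrates the filter itself on toy data; it does not reproduce the article's estimation of d or its sudden-change components.

```python
import numpy as np

# Hedged sketch of the (1 - L)^d fractional differencing filter.

def frac_diff_weights(d, k_max):
    """Binomial expansion weights of (1 - L)^d up to lag k_max."""
    w = [1.0]
    for k in range(1, k_max + 1):
        w.append(w[-1] * (k - 1 - d) / k)
    return np.array(w)

def frac_diff(x, d):
    """Apply (1 - L)^d with an expanding window (truncated at the sample start)."""
    w = frac_diff_weights(d, len(x) - 1)
    return np.array([w[:t + 1][::-1] @ x[:t + 1] for t in range(len(x))])

print(np.round(frac_diff_weights(0.4, 5), 3))  # weight magnitudes decay slowly
print(frac_diff(np.ones(6), 0.4))              # partial sums shrink slowly toward 0
```

The slow, hyperbolic decay of the weights is the long-memory signature; the article's point is that unmodeled structural breaks can masquerade as such long memory, inflating the estimated d.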
Pluvials, Droughts, Energetics, and the Mongol Empire
NASA Astrophysics Data System (ADS)
Hessl, A. E.; Pederson, N.; Baatarbileg, N.
2012-12-01
The success of the Mongol Empire, the largest contiguous land empire the world has ever known, is a historical enigma. At its peak in the late 13th century, the empire influenced areas from Hungary to southern Asia and Persia. Powered by domesticated herbivores, the Mongol Empire grew at the expense of agriculturalists in Eastern Europe, Persia, and China. What environmental factors contributed to the rise of the Mongols? What factors influenced the disintegration of the empire by 1300 CE? Until now, little high-resolution environmental data have been available to address these questions. We use tree-ring records of past temperature and water to illuminate the role of energy and water in the evolution of the Mongol Empire. The study of energetics has long been applied to biological and ecological systems but has only recently become a theme in understanding modern coupled natural and human systems (CNH). Because water and energy are tightly linked in human and natural systems, studying their synergies and interactions makes it possible to integrate knowledge across disciplines and human history, yielding important lessons for modern societies. We focus on the role of energy and water in the trajectory of an empire, including its rise, development, and demise. Our research is focused on the Orkhon Valley, seat of the Mongol Empire, where recent paleoenvironmental and archeological discoveries allow high-resolution reconstructions of past human and environmental conditions for the first time. Our preliminary records indicate that the period 1210-1230 CE, the height of Chinggis Khan's reign, is one of the longest and most consistent pluvials in our tree-ring reconstruction of interannual drought. Reconstructed temperature derived from five millennium-long records from subalpine forests in Mongolia documents warm temperatures beginning in the early 1200s and ending with a plunge into cold temperatures in 1260.
Abrupt cooling in central Mongolia at this time is consistent with a well-documented volcanic eruption that caused massive crop damage and famine throughout much of Europe. In Mongol history, this abrupt cooling also coincides with the move of the capital from Central Mongolia (Karakorum) to China (Beijing). In combination, the tree-ring records of water and temperature suggest that 1) the rise of the Mongol Empire occurred during an unusually consistent warm and wet climate and 2) the disintegration of the Empire occurred following a plunge into cold and dry conditions in Central Mongolia. These results represent the first step of a larger project integrating physical science and history to understand the role of energy in the evolution of the Mongol Empire. Using data from historic documents, ecological modeling, tree rings, and sediment cores, we will investigate whether the expansion and contraction of the empire was related to moisture and temperature availability and thus grassland productivity associated with climate change in the Orkhon Valley.
Faisal, Kamil; Shaker, Ahmed
2017-03-07
Urban Environmental Quality (UEQ) can be treated as a generic indicator that objectively represents the physical and socio-economic condition of the urban and built environment. The value of UEQ illustrates a sense of satisfaction to its population through assessing different environmental, urban and socio-economic parameters. This paper elucidates the use of the Geographic Information System (GIS), Principal Component Analysis (PCA) and Geographically-Weighted Regression (GWR) techniques to integrate various parameters and estimate the UEQ of two major cities in Ontario, Canada. Remote sensing, GIS and census data were first obtained to derive various environmental, urban and socio-economic parameters. The aforementioned techniques were used to integrate all of these environmental, urban and socio-economic parameters. Three key indicators, including family income, higher level of education and land value, were used as a reference to validate the outcomes derived from the integration techniques. The results were evaluated by assessing the relationship between the extracted UEQ results and the reference layers. Initial findings showed that GWR with the spatial lag model improves precision and accuracy by up to 20% with respect to results derived using the GIS overlay and PCA techniques for the City of Toronto and the City of Ottawa. The findings of the research can help authorities and decision makers to understand the empirical relationships among environmental factors, urban morphology and real estate, and to make decisions that promote environmental justice.
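The integration step this abstract describes, collapsing standardized environmental, urban and socio-economic indicators into a single index, is commonly done with the first principal component. A minimal sketch in Python (synthetic data and plain numpy; this is not the authors' GIS/GWR pipeline, and the latent "quality" signal is invented for illustration):

```python
import numpy as np

def pca_composite_index(X):
    """Collapse standardized indicator columns (zones x indicators) into one
    composite score per zone using the first principal component."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0)       # z-score each indicator
    eigvals, eigvecs = np.linalg.eigh(np.cov(Z, rowvar=False))
    pc1 = eigvecs[:, -1]                           # loadings of the leading component
    if pc1.sum() < 0:                              # fix the arbitrary sign convention
        pc1 = -pc1
    return Z @ pc1

# Synthetic "indicators" (stand-ins for income, education, land value, etc.)
rng = np.random.default_rng(0)
base = rng.normal(size=(50, 1))                    # shared latent quality signal
X = np.hstack([base + 0.1 * rng.normal(size=(50, 1)) for _ in range(4)])
scores = pca_composite_index(X)
print(scores.shape)  # (50,)
```

A spatial-lag GWR variant would additionally regress the composite on its spatially lagged neighbours; only the PCA weighting step is illustrated here.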
Aperture-free star formation rate of SDSS star-forming galaxies
NASA Astrophysics Data System (ADS)
Duarte Puertas, S.; Vilchez, J. M.; Iglesias-Páramo, J.; Kehrig, C.; Pérez-Montero, E.; Rosales-Ortega, F. F.
2017-03-01
Large-area surveys observing high numbers of galaxies have undoubtedly marked a milestone in the understanding of several properties of galaxies, such as star-formation history, morphology, and metallicity. However, in many cases, these surveys provide fluxes from fixed small apertures (e.g. fibre), which cover a scant fraction of the galaxy, compelling us to use aperture corrections to study the global properties of galaxies. In this work, we derive the current total star formation rate (SFR) of Sloan Digital Sky Survey (SDSS) star-forming galaxies, using an empirically based aperture correction of the measured Hα flux for the first time, thus minimising the uncertainties associated with reduced apertures. All the Hα fluxes have been extinction-corrected using the Hα/Hβ ratio free from aperture effects. The total SFR for 210 000 SDSS star-forming galaxies has been derived applying pure empirical Hα and Hα/Hβ aperture corrections based on the Calar Alto Legacy Integral Field Area (CALIFA) survey. We find that, on average, the aperture-corrected SFR is 0.65 dex higher than the SDSS fibre-based SFR. The relation between the SFR and stellar mass for SDSS star-forming galaxies (SFR-M⋆) has been obtained, together with its dependence on extinction and Hα equivalent width. We compare our results with those obtained in previous works and examine the behaviour of the derived SFR in six redshift bins, over the redshift range 0.005 ≤ z ≤ 0.22. The SFR-M⋆ sequence derived here is in agreement with selected observational studies based on integral field spectroscopy of individual galaxies as well as with the predictions of recent theoretical models of disc galaxies.
A table of the aperture-corrected fluxes and SFR for 210 000 SDSS star-forming galaxies and related relevant data is available only at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (http://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/599/A71
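For reference, the 0.65 dex offset quoted above is a logarithmic unit: x dex is a factor of 10^x, so the aperture-corrected SFRs exceed the fibre-based values by roughly a factor of 4.5:

```python
# 1 dex = one order of magnitude; an offset of 0.65 dex in log10(SFR)
# corresponds to a multiplicative factor of 10**0.65 in SFR itself.
factor = 10 ** 0.65
print(round(factor, 2))  # 4.47
```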
NASA Astrophysics Data System (ADS)
Varotsos, C. A.; Efstathiou, M. N.
2018-03-01
In this paper we investigate the evolution of the energy emitted by CO2 and NO from the Earth's thermosphere on a global scale using both observational and empirically derived data. We first analyze the daily power observations of CO2 and NO received from the Sounding of the Atmosphere using Broadband Emission Radiometry (SABER) equipment on the NASA Thermosphere-Ionosphere-Mesosphere Energetics and Dynamics (TIMED) satellite for the entire period 2002-2016. We then perform the same analysis on the empirical daily power emitted by CO2 and NO that was derived recently from the infrared energy budget of the thermosphere during 1947-2016. The tool used for the analysis of both datasets is detrended fluctuation analysis, applied to investigate whether the power emitted by CO2 and by NO from the thermosphere exhibits power-law behavior. The results obtained from both observational and empirical data do not support power-law behavior. This conclusion reveals that the empirically derived data are characterized by the same intrinsic properties as the observational ones, thus supporting their reliability.
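Detrended fluctuation analysis, the tool named above, tests for power-law scaling F(n) ∝ n^α of the detrended fluctuation function; α ≈ 0.5 indicates uncorrelated (white) noise, while values well away from 0.5 indicate long-range correlation. A minimal first-order DFA sketch in Python (plain numpy, synthetic input; not the authors' code):

```python
import numpy as np

def dfa_exponent(x, scales):
    """First-order DFA: fit F(n) ~ n**alpha over the given window sizes
    and return the scaling exponent alpha from a log-log fit."""
    y = np.cumsum(x - np.mean(x))                  # integrated profile
    F = []
    for n in scales:
        n_seg = len(y) // n
        segs = y[: n_seg * n].reshape(n_seg, n)
        t = np.arange(n)
        ms = [np.mean((s - np.polyval(np.polyfit(t, s, 1), t)) ** 2)
              for s in segs]                       # detrended variance per window
        F.append(np.sqrt(np.mean(ms)))             # fluctuation function F(n)
    return np.polyfit(np.log(scales), np.log(F), 1)[0]

rng = np.random.default_rng(1)
alpha = dfa_exponent(rng.normal(size=4096), [8, 16, 32, 64, 128])
print(round(alpha, 2))  # white noise should give alpha near 0.5
```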
Advancing Empirical Scholarship to Further Develop Evaluation Theory and Practice
ERIC Educational Resources Information Center
Christie, Christina A.
2011-01-01
Good theory development is grounded in empirical inquiry. In the context of educational evaluation, the development of empirically grounded theory has important benefits for the field and the practitioner. In particular, a shift to empirically derived theory will assist in advancing more systematic and contextually relevant evaluation practice, as…
Landslide Hazard Probability Derived from Inherent and Dynamic Determinants
NASA Astrophysics Data System (ADS)
Strauch, Ronda; Istanbulluoglu, Erkan
2016-04-01
Landslide hazard research has typically been conducted independently from hydroclimate research. We unify these two lines of research to provide regional scale landslide hazard information for risk assessments and resource management decision-making. Our approach combines an empirical inherent landslide probability with a numerical dynamic probability, generated by combining routed recharge from the Variable Infiltration Capacity (VIC) macro-scale land surface hydrologic model with a finer resolution probabilistic slope stability model run in a Monte Carlo simulation. Landslide hazard mapping is advanced by adjusting the dynamic model of stability with an empirically-based scalar representing the inherent stability of the landscape, creating a probabilistic quantitative measure of geohazard prediction at a 30-m resolution. Climatology, soil, and topography control the dynamic nature of hillslope stability and the empirical information further improves the discriminating ability of the integrated model. This work will aid resource management decision-making in current and future landscape and climatic conditions. The approach is applied as a case study in North Cascade National Park Complex, a rugged terrain with nearly 2,700 m (9,000 ft) of vertical relief, covering 2,757 sq km (1,064 sq mi) in northern Washington State, U.S.A.
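The dynamic component described above, a probabilistic slope-stability model run in Monte Carlo simulation, can be illustrated with the standard infinite-slope factor of safety: sample uncertain soil strength parameters and count the fraction of samples with FS < 1. A simplified sketch (all parameter ranges are assumed for illustration, not taken from the study):

```python
import numpy as np

def failure_probability(slope_deg, depth_m, wetness, n=100_000, seed=42):
    """Monte Carlo infinite-slope stability: fraction of parameter samples
    with factor of safety FS < 1. Soil property ranges are assumed."""
    rng = np.random.default_rng(seed)
    gamma_soil, gamma_w = 17e3, 9.81e3             # unit weights, N/m^3 (assumed)
    c = rng.uniform(2e3, 10e3, n)                  # cohesion, Pa (assumed range)
    phi = np.radians(rng.uniform(28.0, 40.0, n))   # friction angle (assumed range)
    beta = np.radians(slope_deg)
    resisting = c + (gamma_soil - wetness * gamma_w) * depth_m \
        * np.cos(beta) ** 2 * np.tan(phi)
    driving = gamma_soil * depth_m * np.cos(beta) * np.sin(beta)
    return np.mean(resisting / driving < 1.0)

p_gentle = failure_probability(slope_deg=20, depth_m=1.5, wetness=0.3)
p_steep = failure_probability(slope_deg=45, depth_m=1.5, wetness=0.9)
print(p_gentle, p_steep)  # steeper, wetter slopes fail more often
```

In the study's framework this dynamic probability is further scaled by the empirically-based inherent-stability term; only the Monte Carlo step is sketched here.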
ERIC Educational Resources Information Center
Coromaldi, Manuela; Zoli, Mariangela
2012-01-01
Theoretical and empirical studies have recently adopted a multidimensional concept of poverty. There is considerable debate about the most appropriate degree of multidimensionality to retain in the analysis. In this work we add to the received literature in two ways. First, we derive indicators of multiple deprivation by applying a particular…
Koopmeiners, Joseph S; Feng, Ziding
2011-01-01
The receiver operating characteristic (ROC) curve, the positive predictive value (PPV) curve and the negative predictive value (NPV) curve are three measures of performance for a continuous diagnostic biomarker. The ROC, PPV and NPV curves are often estimated empirically to avoid assumptions about the distributional form of the biomarkers. Recently, there has been a push to incorporate group sequential methods into the design of diagnostic biomarker studies. A thorough understanding of the asymptotic properties of the sequential empirical ROC, PPV and NPV curves will provide more flexibility when designing group sequential diagnostic biomarker studies. In this paper we derive asymptotic theory for the sequential empirical ROC, PPV and NPV curves under case-control sampling using sequential empirical process theory. We show that the sequential empirical ROC, PPV and NPV curves converge to the sum of independent Kiefer processes and show how these results can be used to derive asymptotic results for summaries of the sequential empirical ROC, PPV and NPV curves.
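The empirical (distribution-free) ROC curve mentioned above is simply the pair of exceedance fractions of cases and controls across thresholds. A small sketch in Python on synthetic biomarker data (a fixed-sample version, not the sequential estimator the paper analyses):

```python
import numpy as np

def empirical_roc(cases, controls, thresholds):
    """Empirical ROC curve: true-positive and false-positive rates at each
    threshold, with no assumption about the biomarker's distribution."""
    tpr = np.array([(cases >= c).mean() for c in thresholds])
    fpr = np.array([(controls >= c).mean() for c in thresholds])
    return fpr, tpr

rng = np.random.default_rng(0)
controls = rng.normal(0.0, 1.0, 500)   # biomarker in non-diseased subjects
cases = rng.normal(1.5, 1.0, 500)      # shifted distribution in diseased subjects
fpr, tpr = empirical_roc(cases, controls, np.linspace(-4.0, 6.0, 201))

order = np.argsort(fpr)                # trapezoidal area under the empirical curve
auc = np.sum(np.diff(fpr[order]) * (tpr[order][1:] + tpr[order][:-1]) / 2)
print(round(auc, 2))                   # theoretical AUC here is about 0.86
```

The sequential versions studied in the paper re-estimate such curves at interim analyses; their convergence to sums of independent Kiefer processes is what licenses the group-sequential designs.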
Bradbury, Steven P; Russom, Christine L; Ankley, Gerald T; Schultz, T Wayne; Walker, John D
2003-08-01
The use of quantitative structure-activity relationships (QSARs) in assessing potential toxic effects of organic chemicals on aquatic organisms continues to evolve as computational efficiency and toxicological understanding advance. With the ever-increasing production of new chemicals, and the need to optimize resources to assess thousands of existing chemicals in commerce, regulatory agencies have turned to QSARs as essential tools to help prioritize tiered risk assessments when empirical data are not available to evaluate toxicological effects. Progress in designing scientifically credible QSARs is intimately associated with the development of empirically derived databases of well-defined and quantified toxicity endpoints, which are based on a strategic evaluation of diverse sets of chemical structures, modes of toxic action, and species. This review provides a brief overview of four databases created for the purpose of developing QSARs for estimating toxicity of chemicals to aquatic organisms. The evolution of QSARs based initially on general chemical classification schemes, to models founded on modes of toxic action that range from nonspecific partitioning into hydrophobic cellular membranes to receptor-mediated mechanisms is summarized. Finally, an overview of expert systems that integrate chemical-specific mode of action classification and associated QSAR selection for estimating potential toxicological effects of organic chemicals is presented.
'Nobody tosses a dwarf!' The relation between the empirical and the normative reexamined.
Leget, Carlo; Borry, Pascal; de Vries, Raymond
2009-05-01
This article discusses the relation between empirical and normative approaches in bioethics. The issue of dwarf tossing, while admittedly unusual, is chosen as a point of departure because it challenges the reader to look with fresh eyes upon several central bioethical themes, including human dignity, autonomy, and the protection of vulnerable people. After an overview of current approaches to the integration of empirical and normative ethics, we consider five ways that the empirical and normative can be brought together to speak to the problem of dwarf tossing: prescriptive applied ethics, theoretical ethics, critical applied ethics, particularist ethics and integrated empirical ethics. We defend a position of critical applied ethics that allows for a two-way relation between empirical and normative theories. Against efforts fully to integrate the normative and the empirical into one synthesis, we propose that the two should stand in tension and relation to one another. The approach we endorse acknowledges that a social practice can and should be judged both by the gathering of empirical data and by normative ethics. Critical applied ethics uses a five-stage process that includes: (a) determination of the problem, (b) description of the problem, (c) empirical study of effects and alternatives, (d) normative weighing and (e) evaluation of the effects of a decision. In each stage, we explore the perspective from both the empirical (sociological) and the normative ethical point of view. We conclude by applying our five-stage critical applied ethics to the example of dwarf tossing.
[Psychoanalysis and psychoanalytically oriented psychotherapy: differences and similarities].
Rössler-Schülein, Hemma; Löffler-Stastka, Henriette
2013-01-01
Psychoanalysis, as well as the psychoanalytic psychotherapy derived from it, is an efficient method offered by the Austrian health care system for the treatment of anxiety, depression, personality disorders, and neurotic and somatoform disorders. Similar basic treatment techniques are applied in both methods. Differentiation between the two treatment options is therefore often made pragmatically, by the frequency of sessions or the use of the couch, and seems vague in the light of empirical studies. This overview focuses on a potential differentiation: the objective and subjective dimensions of the indication process. Concerning the latter, it must be investigated whether reflective functioning and ego-integration can be enhanced in the patient during the interaction between patient and psychoanalyst. Empirical data underline the necessity of investigating to what extent externalizing defence processes are used, and of integrating such factors into the decision and indication process. Differing treatment aims offer another way to differentiate psychoanalysis from psychoanalytic psychotherapy. Psychoanalytic psychotherapy, for example, aims more at circumscribed problem foci, whereas the capability for self-reflection is one of the most prominent treatment effects of psychoanalysis, resulting in ongoing symptom reduction and resilience. The most prominent differentiation lies in the use of technical neutrality: within psychoanalytic psychotherapy, neutrality sometimes has to be suspended in order to stop severe acting out. Concerning the differentiation between psychoanalysis and psychoanalytic psychotherapy, empirical evidence shows that treatment efficacy is correlated not with the duration of the treatment but with the frequency of sessions. These results support the assumption that the dosage of specific and appropriate psychoanalytic techniques facilitates sustained therapeutic change.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Masuda, Y.; Chiba, N.; Matsuo, Y.
This research proposes to investigate the impact behavior of the steel plate of BWR containment vessels against missiles, caused by the postulated catastrophic failure of components with a high kinetic energy. Although the probability of the occurrence of missiles inside and outside of containment vessels is extremely low, the following items are required to maintain the integrity of containment vessels: the probability of the occurrence of missiles, the weight and energy of missiles, and the impact behavior of containment vessel steel plate against postulated missiles. In connection with the third item, an actual-scale missile test was conducted. In addition, a computational analysis was performed to confirm the impact behavior against the missiles, in order to search for wide applicability to the various kinds of postulated missiles. This research tries to derive a new empirical formula for assessing the integrity of containment vessels.
Critical Realism and Empirical Bioethics: A Methodological Exposition.
McKeown, Alex
2017-09-01
This paper shows how critical realism can be used to integrate empirical data and philosophical analysis within 'empirical bioethics'. The term empirical bioethics, whilst appearing oxymoronic, simply refers to an interdisciplinary approach to the resolution of practical ethical issues within the biological and life sciences, integrating social scientific, empirical data with philosophical analysis. It seeks to achieve a balanced form of ethical deliberation that is both logically rigorous and sensitive to context, to generate normative conclusions that are practically applicable to the problem, challenge, or dilemma. Since it incorporates both philosophical and social scientific components, empirical bioethics is a field that is consistent with the use of critical realism as a research methodology. The integration of philosophical and social scientific approaches to ethics has been beset with difficulties, not least because of the irreducibly normative, rather than descriptive, nature of ethical analysis and the contested relation between fact and value. However, given that facts about states of affairs inform potential courses of action and their consequences, there is a need to overcome these difficulties and successfully integrate data with theory. Previous approaches have been formulated to overcome obstacles in combining philosophical and social scientific perspectives in bioethical analysis; however each has shortcomings. As a mature interdisciplinary approach critical realism is well suited to empirical bioethics, although it has hitherto not been widely used. Here I show how it can be applied to this kind of research and explain how it represents an improvement on previous approaches.
Increasing Functional Communication in Non-Speaking Preschool Children: Comparison of PECS and VOCA
ERIC Educational Resources Information Center
Bock, Stacey Jones; Stoner, Julia B.; Beck, Ann R.; Hanley, Laurie; Prochnow, Jessica
2005-01-01
For individuals who have complex communication needs and for the interventionists who work with them, the collection of empirically derived data that support the use of an intervention approach is critical. The purposes of this study were to continue building an empirically derived base of support for, and to compare the relative effectiveness of…
On the use of integrating FLUXNET eddy covariance and remote sensing data for model evaluation
NASA Astrophysics Data System (ADS)
Reichstein, Markus; Jung, Martin; Beer, Christian; Carvalhais, Nuno; Tomelleri, Enrico; Lasslop, Gitta; Baldocchi, Dennis; Papale, Dario
2010-05-01
The current FLUXNET database (www.fluxdata.org) of CO2, water and energy exchange between the terrestrial biosphere and the atmosphere contains almost 1000 site-years of data from more than 250 sites, encompassing all major biomes of the world and processed in a standardized way (1-3). In this presentation we show that the information in the data is sufficient to derive generalized empirical relationships between vegetation/respective remote sensing information, climate and the biosphere-atmosphere exchanges across global biomes. These empirical patterns are used to generate global grids of the respective fluxes and derived properties (e.g. radiation and water-use efficiencies or climate sensitivities in general, Bowen ratio, AET/PET ratio). For example, we revisit global 'text-book' numbers such as global Gross Primary Productivity (GPP), estimated since the 1970s as ca. 120 Pg C (4), or global evapotranspiration (ET), estimated at ca. 65 × 10³ km³ yr⁻¹ (5), for the first time with a more solid and direct empirical basis. Evaluation against independent data at regional to global scale (e.g. atmospheric CO2 inversions, runoff data) lends support to the validity of our almost purely empirical up-scaling approaches. Moreover, climate factors such as radiation, temperature and water balance are identified as driving factors for variations and trends of carbon and water fluxes, with distinctly different sensitivities between different vegetation types. Hence, these global fields of biosphere-atmosphere exchange and the inferred relations between climate, vegetation type and fluxes should be used for evaluation or benchmarking of climate models or their land-surface components, while overcoming scale issues with classical point-to-grid-cell comparisons. 1. M. Reichstein et al., Global Change Biology 11, 1424 (2005). 2. D. Baldocchi, Australian Journal of Botany 56, 1 (2008). 3. D. Papale et al., Biogeosciences 3, 571 (2006). 4. D. E. Alexander, R. W. Fairbridge, Encyclopedia of Environmental Science (Springer, Heidelberg, 1999), pp. 741. 5. T. Oki, S. Kanae, Science 313, 1068 (2006).
2009-04-01
Shelf, and into the Gulf of Mexico, empirically derived chl a increases were observed in the Tortugas Gyre circulation feature, and in adjacent waters. Analysis of the … hurricane interaction also influenced the Tortugas Gyre, a recognized circulation feature in the southern Gulf of Mexico induced by the flow of the …
Hattori, Masasi
2016-12-01
This paper presents a new theory of syllogistic reasoning. The proposed model assumes there are probabilistic representations of given signature situations. Instead of conducting an exhaustive search, the model constructs an individual-based "logical" mental representation that expresses the most probable state of affairs, and derives a necessary conclusion that is not inconsistent with the model using heuristics based on informativeness. The model is a unification of previous influential models. Its descriptive validity has been evaluated against existing empirical data and two new experiments, and by qualitative analyses based on previous empirical findings, all of which supported the theory. The model's behavior is also consistent with findings in other areas, including working memory capacity. The results indicate that people assume the probabilities of all target events mentioned in a syllogism to be almost equal, which suggests links between syllogistic reasoning and other areas of cognition.
Westen, Drew; Shedler, Jonathan; Bradley, Bekh; DeFife, Jared A.
2013-01-01
Objective The authors describe a system for diagnosing personality pathology that is empirically derived, clinically relevant, and practical for day-to-day use. Method A random national sample of psychiatrists and clinical psychologists (N=1,201) described a randomly selected current patient with any degree of personality dysfunction (from minimal to severe) using the descriptors in the Shedler-Westen Assessment Procedure–II and completed additional research forms. Results The authors applied factor analysis to identify naturally occurring diagnostic groupings within the patient sample. The analysis yielded 10 clinically coherent personality diagnoses organized into three higher-order clusters: internalizing, externalizing, and borderline-dysregulated. The authors selected the most highly rated descriptors to construct a diagnostic prototype for each personality syndrome. In a second, independent sample, research interviewers and patients’ treating clinicians were able to diagnose the personality syndromes with high agreement and minimal comorbidity among diagnoses. Conclusions The empirically derived personality prototypes described here provide a framework for personality diagnosis that is both empirically based and clinically relevant. PMID:22193534
NASA Astrophysics Data System (ADS)
Kamiyama, M.; O'Rourke, M. J.; Flores-Berrones, R.
1992-09-01
A new type of semi-empirical expression for scaling strong-motion peaks in terms of seismic source, propagation path, and local site conditions is derived. Peak acceleration, peak velocity, and peak displacement are analyzed in a similar fashion because they are interrelated. However, emphasis is placed on the peak velocity, which is a key ground motion parameter for lifeline earthquake engineering studies. With the help of seismic source theories, the semi-empirical model is derived using strong motions obtained in Japan. In the derivation, statistical considerations are used in the selection of the model itself and the model parameters. Earthquake magnitude M and hypocentral distance r are selected as independent variables, and dummy variables are introduced to identify the amplification factor due to individual local site conditions. The resulting semi-empirical expressions for the peak acceleration, velocity, and displacement are then compared with strong-motion data observed during three earthquakes in the U.S. and Mexico.
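A common functional form for such semi-empirical scaling, consistent with the description above, is log10(peak) = a + b·M + c·log10(r) + Σ dᵢSᵢ, with dummy site variables Sᵢ. A least-squares sketch on synthetic records (all coefficients and site terms are invented for illustration; this is not the paper's fitted model):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
M = rng.uniform(4.5, 7.5, n)                 # magnitude
r = rng.uniform(10.0, 200.0, n)              # hypocentral distance, km
site = rng.integers(0, 3, n)                 # local site class (dummy-coded below)
d_true = np.array([0.0, 0.15, 0.30])         # invented site amplification terms
log_pgv = (-1.0 + 0.6 * M - 1.2 * np.log10(r) + d_true[site]
           + rng.normal(0.0, 0.1, n))        # synthetic "observed" log10 peak velocity

# Design matrix: intercept, M, log10(r), and dummies for site classes 1 and 2
X = np.column_stack([np.ones(n), M, np.log10(r),
                     (site == 1).astype(float), (site == 2).astype(float)])
coef, *_ = np.linalg.lstsq(X, log_pgv, rcond=None)
print(np.round(coef, 2))  # close to the true values [-1.0, 0.6, -1.2, 0.15, 0.3]
```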
ERIC Educational Resources Information Center
Peterson, Carol B.; Crow, Scott J.; Swanson, Sonja A.; Crosby, Ross D.; Wonderlich, Stephen A.; Mitchell, James E.; Agras, W. Stewart; Halmi, Katherine A.
2011-01-01
Objective: The purpose of this investigation was to derive an empirical classification of eating disorder symptoms in a heterogeneous eating disorder sample using latent class analysis (LCA) and to examine the longitudinal stability of these latent classes (LCs) and the stability of DSM-IV eating disorder (ED) diagnoses. Method: A total of 429…
ERIC Educational Resources Information Center
Beitzel, Brian D.
2013-01-01
The Student Response to Faculty Instruction (SRFI) is an instrument designed to measure the student perspective on courses in higher education. The SRFI was derived from decades of empirical studies of student evaluations of teaching. This article describes the development of the SRFI and its psychometric attributes demonstrated in two pilot study…
Suh, Eunkook M
2007-12-01
The self becomes context sensitive in service of the need to belong. When it comes to achieving personal happiness, an identity system that derives its worth and meaning excessively from its social context puts itself in a significantly disadvantageous position. This article integrates empirical findings and ideas from the self, subjective well-being, and cross-cultural literatures and offers insights into why East Asian cultural members report surprisingly low levels of happiness. The various cognitive, motivational, behavioral, and affective characteristics of the overly relation-oriented self are discussed as potential explanations. Implications for the study of self and culture are offered.
Preparing Current and Future Practitioners to Integrate Research in Real Practice Settings
ERIC Educational Resources Information Center
Thyer, Bruce A.
2015-01-01
Past efforts aimed at promoting a better integration between research and practice are reviewed. These include the empirical clinical practice movement (ECP), originating within social work; the empirically supported treatment (EST) initiative of clinical psychology; and the evidence-based practice (EBP) model developed within medicine. The…
USDA-ARS?s Scientific Manuscript database
Multi-locus genome-wide association studies have become the state-of-the-art procedure to identify quantitative trait loci (QTL) associated with traits simultaneously. However, implementation of multi-locus models is still difficult. In this study, we integrated least angle regression with empirical B...
NASA Technical Reports Server (NTRS)
Mannino, A.; Dyda, R. Y.; Hernes, P. J.; Hooker, Stan; Hyde, Kim; Novak, Mike
2012-01-01
Estuaries and coastal ocean waters experience a high degree of variability in the composition and concentration of particulate and dissolved organic matter (DOM) as a consequence of riverine/estuarine fluxes of terrigenous DOM, sediments, detritus and nutrients into coastal waters and associated phytoplankton blooms. Our approach integrates biogeochemical measurements (elemental content, molecular analyses), optical properties (absorption) and remote sensing to examine terrestrial DOM contributions into the U.S. Middle Atlantic Bight (MAB). We measured lignin phenol composition, DOC and CDOM absorption within the Chesapeake and Delaware Bay mouths, plumes and adjacent coastal ocean waters to derive empirical relationships between CDOM and biogeochemical measurements for satellite remote sensing application. Lignin ranged from 0.03 to 6.6 µg/L between estuarine and outer shelf waters. Our results demonstrate that satellite-derived CDOM is useful as a tracer of terrigenous DOM in the coastal ocean.
An object programming based environment for protein secondary structure prediction.
Giacomini, M; Ruggiero, C; Sacile, R
1996-01-01
The most frequently used methods for protein secondary structure prediction are empirical statistical methods and rule-based methods. A consensus system based on object-oriented programming is presented, which integrates the two approaches with the aim of improving the prediction quality. This system uses an object-oriented knowledge representation based on the concepts of conformation, residue and protein, where the conformation class is the basis, the residue class derives from it and the protein class derives from the residue class. The system has been tested with satisfactory results on several proteins of the Brookhaven Protein Data Bank. Its results have been compared with the results of the most widely used prediction methods, and they show a higher prediction capability and greater stability. Moreover, the system itself provides an index of the reliability of its current prediction. This system can also be regarded as a basic structure for programs of this kind.
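The class hierarchy described above (conformation as the base class, residue deriving from it, protein deriving from residue), combined with per-residue consensus voting and a reliability index, might be sketched as follows. This is an illustrative Python reconstruction, not the original implementation; the state codes and voting scheme are assumptions:

```python
from collections import Counter

class Conformation:
    """Base class: a secondary-structure state (H = helix, E = strand, C = coil)."""
    STATES = ("H", "E", "C")

    def __init__(self, state):
        if state not in self.STATES:
            raise ValueError(state)
        self.state = state

class Residue(Conformation):
    """Derives from Conformation: adds the amino-acid code and consensus voting."""
    def __init__(self, code, votes):
        majority, count = Counter(votes).most_common(1)[0]
        super().__init__(majority)
        self.code = code
        self.reliability = count / len(votes)   # per-residue agreement index

class Protein(Residue):
    """Derives from Residue, mirroring the (unusual) hierarchy in the abstract."""
    def __init__(self, sequence, per_method_predictions):
        # one vote per method at each position; no super().__init__ call,
        # since a protein has no single conformation state of its own
        votes = zip(*per_method_predictions)
        self.residues = [Residue(aa, v) for aa, v in zip(sequence, votes)]
        self.reliability = (sum(r.reliability for r in self.residues)
                            / len(self.residues))

    def consensus(self):
        return "".join(r.state for r in self.residues)

statistical = "HHHECC"    # e.g. output of a statistical method
rule_based = "HHEECC"     # e.g. output of a rule-based method
p = Protein("MKTAYI", [statistical, rule_based, "HHHECC"])
print(p.consensus(), round(p.reliability, 2))  # HHHECC 0.94
```

The per-residue `reliability` plays the role of the "index of the reliability of its current prediction" mentioned in the abstract: positions where the statistical and rule-based predictions disagree score lower.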
Galbraith, Kevin; Ward, Alison; Heneghan, Carl
2017-05-03
Evidence-Based Medicine (EBM) skills have been included in general practice curricula and competency frameworks. However, GPs experience numerous barriers to developing and maintaining EBM skills, and some GPs feel the EBM movement misunderstands and threatens their traditional role. We therefore need a new approach that acknowledges the constraints encountered in real-world general practice. The aim of this study was to synthesise from empirical research a real-world EBM competency framework for general practice, which could be applied in training, in the individual pursuit of continuing professional development, and in routine care. We sought to integrate evidence from the literature with evidence derived from the opinions of experts in the fields of general practice and EBM. We synthesised two sets of themes describing the meaning of EBM in general practice. One set of themes was derived from a mixed-methods systematic review of the literature; the other set was derived from the further development of those themes using a Delphi process among a panel of EBM and general practice experts. From these two sets of themes we constructed a real-world EBM competency framework for general practice. A simple competency framework was constructed that acknowledges the constraints of real-world general practice: (1) mindfulness - in one's approach towards EBM itself, and to the influences on decision-making; (2) pragmatism - in one's approach to finding and evaluating evidence; and (3) knowledge of the patient - as the most useful resource in effective communication of evidence. We present a clinical scenario to illustrate how a GP might demonstrate these competencies in their routine daily work. We have proposed a real-world EBM competency framework for general practice, derived from empirical research, which acknowledges the constraints encountered in modern general practice.
Further validation of these competencies is required, both as an educational resource and as a strategy for actual practice.
Quantifying tolerance indicator values for common stream fish species of the United States
Meador, M.R.; Carlisle, D.M.
2007-01-01
The classification of fish species tolerance to environmental disturbance is often used as a means to assess ecosystem conditions. Its use, however, may be problematic because the approach to tolerance classification is based on subjective judgment. We analyzed fish and physicochemical data from 773 stream sites collected as part of the U.S. Geological Survey's National Water-Quality Assessment Program to calculate tolerance indicator values for 10 physicochemical variables using weighted averaging. Tolerance indicator values (TIVs) for ammonia, chloride, dissolved oxygen, nitrite plus nitrate, pH, phosphorus, specific conductance, sulfate, suspended sediment, and water temperature were calculated for 105 common fish species of the United States. Tolerance indicator values for specific conductance and sulfate were correlated (rho = 0.87), and thus, fish species may be co-tolerant to these water-quality variables. We integrated TIVs for each species into an overall tolerance classification for comparisons with judgment-based tolerance classifications. Principal components analysis indicated that the distinction between tolerant and intolerant classifications was determined largely by tolerance to suspended sediment, specific conductance, chloride, and total phosphorus. Factors such as water temperature, dissolved oxygen, and pH may not be as important in distinguishing between tolerant and intolerant classifications, but may help to segregate species classified as moderate. Empirically derived tolerance classifications were 58.8% in agreement with judgment-derived tolerance classifications. Canonical discriminant analysis revealed that few TIVs, primarily chloride, could discriminate among judgment-derived tolerance classifications of tolerant, moderate, and intolerant. To our knowledge, this is the first empirically based understanding of fish species tolerance for stream fishes in the United States.
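The weighted-averaging calculation behind a tolerance indicator value can be sketched as follows: a species' TIV for a variable is its abundance-weighted mean of that variable across sites. The species counts and specific-conductance values below are invented, and this standard formula may differ in detail from the survey's implementation.

```python
# Minimal sketch of weighted averaging for a tolerance indicator value
# (TIV). Data are invented for illustration.

def tiv(abundances, env_values):
    """Abundance-weighted average of an environmental variable."""
    total = sum(abundances)
    return sum(a * e for a, e in zip(abundances, env_values)) / total

# Specific conductance (uS/cm) at four hypothetical sites, with counts
# of one species observed at each site.
conductance = [100.0, 300.0, 500.0, 900.0]
counts = [2, 10, 5, 1]
print(round(tiv(counts, conductance), 1))
```

A species abundant mainly at high-conductance sites gets a high TIV for conductance, i.e., it is classified as more tolerant of that stressor.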
Haynos, Ann F.; Pearson, Carolyn M.; Utzinger, Linsey M.; Wonderlich, Stephen A.; Crosby, Ross D.; Mitchell, James E.; Crow, Scott J.; Peterson, Carol B.
2016-01-01
Objective Evidence suggests that eating disorder subtypes reflecting under-controlled, over-controlled, and low psychopathology personality traits constitute reliable phenotypes that differentiate treatment response. This study is the first to use statistical analyses to identify these subtypes within treatment-seeking individuals with bulimia nervosa (BN) and to use these statistically derived clusters to predict clinical outcomes. Methods Using variables from the Dimensional Assessment of Personality Pathology–Basic Questionnaire, K-means cluster analyses identified under-controlled, over-controlled, and low psychopathology subtypes within BN patients (n = 80) enrolled in a treatment trial. Generalized linear models examined the impact of personality subtypes on Eating Disorder Examination global score, binge eating frequency, and purging frequency cross-sectionally at baseline and longitudinally at end of treatment (EOT) and follow-up. In the longitudinal models, secondary analyses were conducted to examine personality subtype as a potential moderator of response to Cognitive Behavioral Therapy-Enhanced (CBT-E) or Integrative Cognitive-Affective Therapy for BN (ICAT-BN). Results There were no baseline clinical differences between groups. In the longitudinal models, personality subtype predicted binge eating (p = .03) and purging (p = .01) frequency at EOT and binge eating frequency at follow-up (p = .045). The over-controlled group demonstrated the best outcomes on these variables. In secondary analyses, there was a treatment by subtype interaction for purging at follow-up (p = .04), which indicated a superiority of CBT-E over ICAT-BN for reducing purging among the over-controlled group. Discussion Empirically derived personality subtyping appears to be a valid classification system with potential to guide eating disorder treatment decisions. PMID:27611235
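A minimal one-dimensional k-means illustrates the kind of cluster analysis used to separate such profiles; the "personality score" data, the number of clusters, and the initial centers are all invented for illustration and far simpler than clustering a multi-scale questionnaire.

```python
# Toy 1-D k-means: alternate assignment to the nearest center and
# recomputation of centers as cluster means. Data are invented.

def kmeans_1d(xs, centers, iters=20):
    for _ in range(iters):
        groups = {c: [] for c in range(len(centers))}
        for x in xs:
            nearest = min(range(len(centers)), key=lambda c: abs(x - centers[c]))
            groups[nearest].append(x)
        centers = [sum(g) / len(g) if g else centers[c]
                   for c, g in groups.items()]
    return centers

scores = [0.9, 1.1, 1.0, 3.9, 4.1, 4.0]  # two clearly separated clusters
print(sorted(round(c, 1) for c in kmeans_1d(scores, [0.0, 5.0])))
```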
A Method for Precision Closed-Loop Irrigation Using a Modified PID Control Algorithm
NASA Astrophysics Data System (ADS)
Goodchild, Martin; Kühn, Karl; Jenkins, Malcolm; Burek, Kazimierz; Dutton, Andrew
2016-04-01
The benefits of closed-loop irrigation control have been demonstrated in grower trials which show the potential for improved crop yields and resource usage. Managing water use by controlling irrigation in response to soil moisture changes to meet crop water demands is a popular approach but requires knowledge of closed-loop control practice. In theory, to obtain precise closed-loop control of a system it is necessary to characterise every component in the control loop to derive the appropriate controller parameters, i.e. proportional, integral & derivative (PID) parameters in a classic PID controller. In practice this is often difficult to achieve. Empirical methods are employed to estimate the PID parameters by observing how the system performs under open-loop conditions. In this paper we present a modified PID controller, with a constrained integral function, that delivers excellent regulation of soil moisture by supplying the appropriate amount of water to meet the needs of the plant during the diurnal cycle. Furthermore, the modified PID controller responds quickly to changes in environmental conditions, including rainfall events, which can otherwise result in controller windup, under-watering, and plant stress. The experimental work successfully demonstrates the functionality of a constrained integral PID controller that delivers robust and precise irrigation control. Trial data from strawberries grown in coir substrate are also presented, illustrating soil moisture control and the ability to match water delivery to solar radiation.
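A constrained (clamped) integral term is the standard anti-windup device this kind of controller builds on; a minimal sketch follows, with invented gains, clamp limits, and a toy first-order soil-moisture plant standing in for the real irrigation system.

```python
# Sketch of a PID controller with a clamped integral term (anti-windup).
# Gains, limits, and the toy plant are invented, not the paper's values.

class ClampedPID:
    def __init__(self, kp, ki, kd, i_min, i_max):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.i_min, self.i_max = i_min, i_max
        self.integral = 0.0
        self.prev_err = None

    def update(self, setpoint, measured, dt):
        err = setpoint - measured
        self.integral += err * dt
        # Constrain the integral so a disturbance such as rainfall cannot
        # wind it up and cause prolonged over- or under-watering.
        self.integral = max(self.i_min, min(self.i_max, self.integral))
        deriv = 0.0 if self.prev_err is None else (err - self.prev_err) / dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

# Toy first-order "soil moisture" plant: moisture decays (drainage and
# evapotranspiration) and rises with the irrigation input u.
pid = ClampedPID(kp=2.0, ki=0.5, kd=0.1, i_min=-1.0, i_max=1.0)
moisture = 0.2
for _ in range(200):
    u = max(0.0, pid.update(0.35, moisture, dt=0.1))  # valves cannot go negative
    moisture += 0.1 * (u - 0.5 * moisture)
print(round(moisture, 3))
```

After 200 steps the moisture settles near the 0.35 setpoint, with the steady-state irrigation supplied entirely by the (clamped) integral term.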
ERIC Educational Resources Information Center
Oetting, Janna B.; Cleveland, Lesli H.; Cope, Robert F., III
2008-01-01
Purpose: Using a sample of culturally/linguistically diverse children, we present data to illustrate the value of empirically derived combinations of tools and cutoffs for determining eligibility in child language impairment. Method: Data were from 95 4- and 6-year-olds (40 African American, 55 White; 18 with language impairment, 77 without) who…
ERIC Educational Resources Information Center
Bihagen, Erik; Ohls, Marita
2007-01-01
It has been claimed that women experience fewer career opportunities than men do mainly because they are over-represented in "Dead-end Jobs" (DEJs). Using Swedish panel data covering 1.1 million employees with the same employer in 1999 and 2003, measures of DEJ are empirically derived from analyses of wage mobility. The results indicate…
ERIC Educational Resources Information Center
Eddy, Kamryn T.; Le Grange, Daniel; Crosby, Ross D.; Hoste, Renee Rienecke; Doyle, Angela Celio; Smyth, Angela; Herzog, David B.
2010-01-01
Objective: The purpose of this study was to empirically derive eating disorder phenotypes in a clinical sample of children and adolescents using latent profile analysis (LPA), and to compare these latent profile (LP) groups to the DSM-IV-TR eating disorder categories. Method: Eating disorder symptom data collected from 401 youth (aged 7 through 19…
NASA Technical Reports Server (NTRS)
Blum, P. W.; Harris, I.
1975-01-01
The equations of horizontal motion of the neutral atmosphere between 120 and 500 km are integrated with the inclusion of all nonlinear terms of the convective derivative and the viscous forces due to vertical and horizontal velocity gradients. Empirical models of the distribution of neutral and charged particles are assumed to be known. The model of velocities developed is a steady state model. In Part I the mathematical method used in the integration of the Navier-Stokes equations is described and the various forces are analyzed. Results obtained with the method of Part I are presented and compared with previous calculations and observations of upper atmospheric winds. Conclusions are that nonlinear effects are significant only in the equatorial region, especially under solstice conditions, and that nonlinear effects do not produce any superrotation.
Semi-empirical airframe noise prediction model
NASA Technical Reports Server (NTRS)
Hersh, A. S.; Putnam, T. W.; Lasagna, P. L.; Burcham, F. W., Jr.
1976-01-01
A semi-empirical maximum overall sound pressure level (OASPL) airframe noise model was derived. The noise radiated from aircraft wings and flaps was modeled by using the trailing-edge diffracted quadrupole sound theory derived by Ffowcs Williams and Hall. The noise radiated from the landing gear was modeled by using the acoustic dipole sound theory derived by Curle. The model was successfully correlated with maximum OASPL flyover noise measurements obtained at the NASA Dryden Flight Research Center for three jet aircraft - the Lockheed JetStar, the Convair 990, and the Boeing 747 aircraft.
ERIC Educational Resources Information Center
O'Brien, Karen M.; Zamostny, Kathy P.
2003-01-01
Contrary to societal stereotypes about adoption, this integrative review of published empirical research on adoptive families noted several positive and few negative outcomes with regard to satisfaction with the adoption, familial functioning, and parent-child communication. The critical analysis of 38 studies on adoptive families revealed a…
NASA Astrophysics Data System (ADS)
Herman, M. W.; Furlong, K. P.; Hayes, G. P.; Benz, H.
2014-12-01
Strong motion accelerometers can record large amplitude shaking on-scale in the near-field of large earthquake ruptures; however, numerical integration of such records to determine displacement is typically unstable due to baseline changes (i.e., distortions in the zero value) that occur during strong shaking. We use datasets from the 2011 Mw 9.0 Tohoku earthquake to assess whether a relatively simple empirical correction scheme (Boore et al., 2002) can return accurate displacement waveforms useful for constraining details of the fault slip. The coseismic deformation resulting from the Tohoku earthquake was recorded by the Kiban Kyoshin network (KiK-net) of strong motion instruments as well as by a dense network of high-rate (1 Hz) GPS instruments. After baseline correcting the KiK-net records and integrating to displacement, over 85% of the KiK-net borehole instrument waveforms and over 75% of the KiK-net surface instrument waveforms match collocated 1 Hz GPS displacement time series. Most of the records that do not match the GPS-derived displacements following the baseline correction have large, systematic drifts that can be automatically identified by examining the slopes in the first 5-10 seconds of the velocity time series. We apply the same scheme to strong motion records from the 2014 Mw 8.2 Iquique earthquake. Close correspondence in both direction and amplitude between coseismic static offsets derived from the integrated strong motion time series and those predicted from a teleseismically-derived finite fault model, as well as displacement amplitudes consistent with InSAR-derived results, suggest that the correction scheme works successfully for the Iquique event. In the absence of GPS displacements, these strong motion-derived offsets provide constraints on the overall distribution of slip on the fault. 
In addition, the coseismic strong motion-derived displacement time series (50-100 s long) contain a near-field record of the temporal evolution of the rupture, supplementing teleseismic data and improving resolution of the location and timing of moment in finite fault models.
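The core of such processing, subtracting a pre-event baseline estimate before double integration, can be sketched on a synthetic record. This toy is a stand-in for, not a reimplementation of, the Boore et al. (2002) scheme; the pulse shape, baseline offset, and sample rate are invented.

```python
# Toy baseline correction: a synthetic acceleration record with a small
# constant zero-level distortion is integrated twice. Without correction
# the displacement drifts; with the pre-event mean removed, the true
# static offset is recovered. All signal values are invented.

def integrate(series, dt):
    """Cumulative trapezoidal integration, starting from zero."""
    out = [0.0]
    for a, b in zip(series, series[1:]):
        out.append(out[-1] + 0.5 * (a + b) * dt)
    return out

dt = 0.01
n = 2000
acc = [0.0] * n
for k in range(100, 150):
    acc[k] += 1.0                  # push (m/s^2)
for k in range(150, 200):
    acc[k] -= 1.0                  # pull back, so true velocity returns to zero
baseline = 0.005                   # small zero-level distortion of the record
raw = [a + baseline for a in acc]

pre_mean = sum(raw[:100]) / 100    # baseline estimated from the pre-event window
corrected = [x - pre_mean for x in raw]

disp_raw = integrate(integrate(raw, dt), dt)
disp_cor = integrate(integrate(corrected, dt), dt)
print(round(disp_raw[-1], 3), round(disp_cor[-1], 3))
```

The corrected trace ends at the static coseismic offset (0.25 m here), while the uncorrected one drifts quadratically because the baseline error integrates twice.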
Empirical Orthogonal Function (EOF) Analysis of Storm-Time GPS Total Electron Content Variations
NASA Astrophysics Data System (ADS)
Thomas, E. G.; Coster, A. J.; Zhang, S.; McGranaghan, R. M.; Shepherd, S. G.; Baker, J. B.; Ruohoniemi, J. M.
2016-12-01
Large perturbations in ionospheric density are known to occur during geomagnetic storms triggered by dynamic structures in the solar wind. These ionospheric storm effects have long attracted interest due to their impact on the propagation characteristics of radio wave communications. Over the last two decades, maps of vertically-integrated total electron content (TEC) based on data collected by worldwide networks of Global Positioning System (GPS) receivers have dramatically improved our ability to monitor the spatiotemporal dynamics of prominent storm-time features such as polar cap patches and storm enhanced density (SED) plumes. In this study, we use an empirical orthogonal function (EOF) decomposition technique to identify the primary modes of spatial and temporal variability in the storm-time GPS TEC response at midlatitudes over North America during more than 100 moderate geomagnetic storms from 2001-2013. We next examine the resulting time-varying principal components and their correlation with various geophysical indices and parameters in order to derive an analytical representation. Finally, we use a truncated reconstruction of the EOF basis functions and parameterization of the principal components to produce an empirical representation of the geomagnetic storm-time response of GPS TEC for all magnetic local times and seasons at midlatitudes in the North American sector.
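An EOF decomposition reduces to an eigenanalysis of the spatial covariance of the anomaly field; a minimal sketch using power iteration on synthetic data follows. Real TEC maps and a full SVD would replace the invented 4-point "grid" and the single-mode extraction used here.

```python
# Toy EOF analysis: recover the leading spatial mode (EOF 1) and its
# principal component from a small space-time matrix via power iteration
# on the covariance matrix. Data are synthetic.

import math

def leading_eof(data):
    """data[t][x]: a time series at each of nx grid points."""
    nt, nx = len(data), len(data[0])
    means = [sum(row[j] for row in data) / nt for j in range(nx)]
    anom = [[row[j] - means[j] for j in range(nx)] for row in data]
    cov = [[sum(anom[t][i] * anom[t][j] for t in range(nt)) / nt
            for j in range(nx)] for i in range(nx)]
    v = [1.0] * nx
    for _ in range(200):                       # power iteration
        w = [sum(cov[i][j] * v[j] for j in range(nx)) for i in range(nx)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    pcs = [sum(anom[t][j] * v[j] for j in range(nx)) for t in range(nt)]
    return v, pcs

# Synthetic "storm-time" data: one dominant spatial pattern modulated in
# time, plus small deterministic pseudo-noise.
pattern = [1.0, 2.0, 3.0, 2.0]
data = [[math.sin(0.3 * t) * p + 0.01 * ((t * 7 + j) % 5 - 2)
         for j, p in enumerate(pattern)] for t in range(100)]
eof1, pc1 = leading_eof(data)
```

The recovered EOF 1 is (up to sign) the normalized input pattern, and pc1 carries its time modulation, which is the quantity one would then regress against geophysical indices.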
Chen, Shih-Chih; Liu, Ming-Ling; Lin, Chieh-Peng
2013-08-01
The aim of this study was to integrate technology readiness into the expectation-confirmation model (ECM) for explaining individuals' continuance of mobile data service usage. After reviewing the ECM and technology readiness, an integrated model was demonstrated via empirical data. Compared with the original ECM, the findings of this study show that the integrated model may offer an improved way to clarify which factors influence the continuance intention toward mobile services, and how. Finally, the major findings are summarized, and future research directions are suggested.
NASA Astrophysics Data System (ADS)
Chamindu Deepagoda, T. K. K.; Chen Lopez, Jose Choc; Møldrup, Per; de Jonge, Lis Wollesen; Tuller, Markus
2013-10-01
Over the last decade there has been a significant shift in global agricultural practice. Because the rapid increase of human population poses unprecedented challenges to production of an adequate and economically feasible food supply for undernourished populations, soilless greenhouse production systems are regaining increased worldwide attention. The optimal control of water availability and aeration is an essential prerequisite to successfully operate plant growth systems with soilless substrates such as aggregated foamed glass, perlite, rockwool, coconut coir, or mixtures thereof. While there are considerable empirical and theoretical efforts devoted to characterize water retention and aeration substrate properties, a holistic, physically-based approach considering water retention and aeration concurrently is lacking. In this study, the previously developed concept of integral water storage and energy was expanded to dual-porosity substrates and an analog integral oxygen diffusivity parameter was introduced to simultaneously characterize aeration properties of four common soilless greenhouse growth media. Integral parameters were derived for greenhouse crops in general, as well as for tomatoes. The integral approach provided important insights for irrigation management and for potential optimization of substrate properties. Furthermore, an observed relationship between the integral parameters for water availability and oxygen diffusivity can be potentially applied for the design of advanced irrigation and management strategies to ensure stress-free growth conditions, while conserving water resources.
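One plausible form of such an "integral" parameter, a substrate property integrated over a water-content range, can be sketched with the trapezoidal rule. The relative diffusivity curve, the water-content limits, and this particular definition are invented for illustration and are not the authors' formulation.

```python
# Hedged sketch: a made-up relative oxygen diffusivity curve (Dp/D0)
# integrated over water content with the trapezoidal rule, as one
# plausible "integral" aeration parameter. All values are invented.

def trapz(ys, xs):
    return sum(0.5 * (y0 + y1) * (x1 - x0)
               for (x0, y0), (x1, y1) in zip(zip(xs, ys), zip(xs[1:], ys[1:])))

# Water content (cm3/cm3) and a synthetic Dp/D0 that falls as the
# substrate wets up and air-filled pores close.
theta = [0.10, 0.20, 0.30, 0.40, 0.50]
dp_d0 = [0.080, 0.050, 0.025, 0.010, 0.002]

integral_diffusivity = trapz(dp_d0, theta)
print(round(integral_diffusivity, 4))
```

A substrate whose diffusivity stays higher across the operating water-content range would score a larger integral, i.e., better aeration under the same irrigation regime.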
O'Brien, D J; León-Vintró, L; McClean, B
2016-01-01
The use of radiotherapy fields smaller than 3 cm in diameter has resulted in the need for accurate detector correction factors for small field dosimetry. However, published factors do not always agree and errors introduced by biased reference detectors, inaccurate Monte Carlo models, or experimental errors can be difficult to distinguish. The aim of this study was to provide a robust set of detector-correction factors for a range of detectors using numerical, empirical, and semiempirical techniques under the same conditions and to examine the consistency of these factors between techniques. Empirical detector correction factors were derived based on small field output factor measurements for circular field sizes from 3.1 to 0.3 cm in diameter performed with a 6 MV beam. A PTW 60019 microDiamond detector was used as the reference dosimeter. Numerical detector correction factors for the same fields were derived based on calculations from a geant4 Monte Carlo model of the detectors and the Linac treatment head. Semiempirical detector correction factors were derived from the empirical output factors and the numerical dose-to-water calculations. The PTW 60019 microDiamond was found to over-respond at small field sizes resulting in a bias in the empirical detector correction factors. The over-response was similar in magnitude to that of the unshielded diode. Good agreement was generally found between semiempirical and numerical detector correction factors except for the PTW 60016 Diode P, where the numerical values showed a greater over-response than the semiempirical values by a factor of 3.7% for a 1.1 cm diameter field and higher for smaller fields. Detector correction factors based solely on empirical measurement or numerical calculation are subject to potential bias. A semiempirical approach, combining both empirical and numerical data, provided the most reliable results.
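The small-field formalism underlying such correction factors computes a ratio of ratios: the Monte Carlo dose-to-water output factor divided by the measured detector reading ratio. A sketch with invented numbers follows; the specific values are not from the study.

```python
# Sketch of the semiempirical detector correction factor: combining a
# numerical (Monte Carlo) dose-to-water ratio with an empirical detector
# reading ratio. The numbers below are invented for illustration.

def correction_factor(dose_ratio_mc, reading_ratio_meas):
    """k = (dose-to-water ratio, small field / reference field)
    divided by (detector reading ratio for the same two fields)."""
    return dose_ratio_mc / reading_ratio_meas

# Hypothetical 1 cm field vs. 3 cm reference: the detector over-responds
# slightly in the small field, so its reading ratio exceeds the true
# dose ratio and the correction factor falls below unity.
dose_ratio = 0.85
reading_ratio = 0.88
k = correction_factor(dose_ratio, reading_ratio)
print(round(k, 3))
```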
Gucciardi, Daniel F; Jackson, Ben
2015-01-01
Fostering individuals' long-term participation in activities that promote positive development such as organised sport is an important agenda for research and practice. We integrated the theories of planned behaviour (TPB) and basic psychological needs (BPN) to identify factors associated with young adults' continuation in organised sport over a 12-month period. Prospective study, including an online psycho-social assessment at Time 1 and an assessment of continuation in sport approximately 12 months later. Participants (N=292) aged between 17 and 21 years (M=18.03; SD=1.29) completed an online survey assessing the theories of planned behaviour and basic psychological needs constructs. Bayesian structural equation modelling (BSEM) was employed to test the hypothesised theoretical sequence, using informative priors for structural relations based on empirical and theoretical expectations. The analyses revealed support for the robustness of the hypothesised theoretical model in terms of the pattern of relations as well as the direction and strength of associations among the constructs derived from quantitative summaries of existing research and theoretical expectations. The satisfaction of basic psychological needs was associated with more positive attitudes, higher levels of perceived behavioural control, and more favourable subjective norms; positive attitudes and perceived behavioural control were associated with higher behavioural intentions; and both intentions and perceived behavioural control predicted sport continuation. This study demonstrated the utility of Bayesian structural equation modelling for testing the robustness of an integrated theoretical model, which is informed by empirical evidence from meta-analyses and theoretical expectations, for understanding sport continuation. Crown Copyright © 2013. Published by Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Lamers, H. J. G. L. M.; Gathier, R.; Snow, T. P.
1980-01-01
From a study of the UV lines in the spectra of 25 stars from O4 to B1, the empirical relations between the mean density in the wind and the ionization fractions of O VI, N V, Si IV, and the excited C III (2p 3P0) level were derived. Using these empirical relations, a simple relation was derived between the mass-loss rate and the column density of any of these four ions. This relation can be used for a simple determination of the mass-loss rate from O4 to B1 stars.
Annual variations of monsoon and drought detected by GPS: A case study in Yunnan, China.
Jiang, Weiping; Yuan, Peng; Chen, Hua; Cai, Jianqing; Li, Zhao; Chao, Nengfang; Sneeuw, Nico
2017-07-19
The Global Positioning System (GPS) records monsoonal precipitable water vapor (PWV) and vertical crustal displacement (VCD) due to hydrological loading, and can thus be applied jointly to diagnose meteorological and hydrological droughts. We have analyzed the PWV and VCD observations during 2007.0-2015.0 at 26 continuous GPS stations located in Yunnan province, China. We also obtained equivalent water height (EWH) derived from the Gravity Recovery And Climate Experiment (GRACE) and precipitation at these stations over the same period. Then, we quantified the annual variations of PWV, precipitation, EWH and VCD and provided empirical relationships between them. We found that GPS-derived PWV and VCD (positive means downward movement) are in phase with precipitation and GRACE-derived EWH, respectively. The annual signals of VCD and PWV show linearly correlated amplitudes and a two-month phase lag. Furthermore, the results indicate that PWV and VCD anomalies can also be used to explore drought, such as the heavy drought during winter/spring 2010. Our analysis results verify the capability of GPS to monitor monsoon variations and drought in Yunnan and show that a more comprehensive understanding of the characteristics of regional monsoon and drought can be achieved by integrating GPS-derived PWV and VCD with precipitation and GRACE-derived EWH.
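Annual amplitudes and phases of the kind compared here can be estimated by least-squares fitting of a single annual harmonic. A sketch on synthetic daily PWV follows; the series, its mean level, and its spring peak are invented.

```python
# Sketch of annual-harmonic estimation: fit y = mean + a*cos(2*pi*t)
# + b*sin(2*pi*t) by least squares and report amplitude and phase.
# The synthetic PWV series below is invented.

import math

def annual_fit(t_years, ys):
    """Return (amplitude, phase) with y ~ amp*cos(2*pi*t - phase)."""
    n = len(ys)
    ybar = sum(ys) / n
    ys = [y - ybar for y in ys]
    c = [math.cos(2 * math.pi * t) for t in t_years]
    s = [math.sin(2 * math.pi * t) for t in t_years]
    scc = sum(x * x for x in c)
    sss = sum(x * x for x in s)
    scs = sum(x * y for x, y in zip(c, s))
    syc = sum(y * x for y, x in zip(ys, c))
    sys_ = sum(y * x for y, x in zip(ys, s))
    det = scc * sss - scs * scs          # normal equations, 2 unknowns
    a = (syc * sss - sys_ * scs) / det
    b = (sys_ * scc - syc * scs) / det
    return math.hypot(a, b), math.atan2(b, a)

# Two years of synthetic daily PWV (mm) peaking in spring (t = 0.25 yr).
t = [d / 365.0 for d in range(730)]
pwv = [30.0 + 10.0 * math.cos(2 * math.pi * (x - 0.25)) for x in t]
amp, phase = annual_fit(t, pwv)
```

Fitting two such series (e.g., PWV and VCD) and differencing their phases, phase/(2*pi) in years, is one way to quantify a lag like the two-month offset reported here.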
Stopping Distances: An Excellent Example of Empirical Modelling.
ERIC Educational Resources Information Center
Lawson, D. A.; Tabor, J. H.
2001-01-01
Explores the derivation of empirical models for the stopping distance of a car being driven at a range of speeds. Indicates that the calculation of stopping distances makes an excellent example of empirical modeling because it is a situation that is readily understood and particularly relevant to many first-year undergraduates who are learning or…
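The classic empirical model in question splits stopping distance into thinking distance (linear in speed) and braking distance (quadratic in speed). The reaction time and deceleration below are typical textbook values, not figures from the article.

```python
# Sketch of the standard empirical stopping-distance model:
# d = v * t_reaction + v^2 / (2 * a). Parameter values are illustrative.

def stopping_distance(v_mps, reaction_s=0.7, decel_mps2=6.0):
    thinking = v_mps * reaction_s            # distance covered while reacting
    braking = v_mps ** 2 / (2.0 * decel_mps2)  # constant-deceleration braking
    return thinking + braking

for mph in (30, 50, 70):
    v = mph * 0.44704                        # mph to m/s
    print(mph, "mph:", round(stopping_distance(v), 1), "m")
```

The quadratic braking term is what makes the model pedagogically useful: doubling the speed more than doubles the stopping distance, which students can check against published tables.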
Structural Patterns in Empirical Research Articles: A Cross-Disciplinary Study
ERIC Educational Resources Information Center
Lin, Ling; Evans, Stephen
2012-01-01
This paper presents an analysis of the major generic structures of empirical research articles (RAs), with a particular focus on disciplinary variation and the relationship between the adjacent sections in the introductory and concluding parts. The findings were derived from a close "manual" analysis of 433 recent empirical RAs from high-impact…
NASA Astrophysics Data System (ADS)
Yang, Xiaochen; Zhang, Qinghe; Hao, Linnan
2015-03-01
A water-fluid mud coupling model is developed based on the unstructured grid finite volume coastal ocean model (FVCOM) to investigate the fluid mud motion. The hydrodynamics and sediment transport of the overlying water column are solved using the original three-dimensional ocean model. A horizontal two-dimensional fluid mud model is integrated into the FVCOM model to simulate the underlying fluid mud flow. The fluid mud interacts with the water column through the sediment flux, current, and shear stress. The friction factor between the fluid mud and the bed, which is traditionally determined empirically, is derived with the assumption that the vertical distribution of shear stress below the yield surface of fluid mud is identical to that of uniform laminar flow of Newtonian fluid in the open channel. The model is validated by experimental data and reasonable agreement is found. Compared with numerical cases with fixed friction factors, the results simulated with the derived friction factor exhibit the best agreement with the experiment, which demonstrates the necessity of the derivation of the friction factor.
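The stated laminar open-channel assumption (shear stress decreasing linearly from the bed to zero at the surface, hence a parabolic velocity profile) gives a bed shear stress of 3*mu*U/h for depth-mean velocity U. A sketch follows with invented fluid-mud properties; the friction relation shown is derived within the sketch's own definitions, not taken from the paper.

```python
# Sketch of the uniform-laminar-flow assumption: linear shear stress
# profile and parabolic velocity profile give tau_b = 3*mu*U/h. The
# fluid-mud property values are invented.

def bed_shear_stress(mu, u_mean, h):
    """tau_b for uniform laminar Newtonian open-channel flow."""
    return 3.0 * mu * u_mean / h

def friction_factor(mu, rho, u_mean, h):
    """Darcy-Weisbach-style f = 8*tau_b/(rho*U^2); here 24/Re_h."""
    return 8.0 * bed_shear_stress(mu, u_mean, h) / (rho * u_mean ** 2)

mu = 0.5       # Pa*s, a viscous fluid-mud value (invented)
rho = 1300.0   # kg/m^3
u = 0.05       # m/s, depth-mean velocity of the mud layer
h = 0.1        # m, mud layer thickness
re_h = rho * u * h / mu
print(round(bed_shear_stress(mu, u, h), 3), round(friction_factor(mu, rho, u, h), 2))
```

The point of such a derivation is that the friction factor becomes a function of the Reynolds number of the mud layer rather than a tuning constant.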
Quantitative evaluation of simulated functional brain networks in graph theoretical analysis.
Lee, Won Hee; Bullmore, Ed; Frangou, Sophia
2017-02-01
There is increasing interest in the potential of whole-brain computational models to provide mechanistic insights into resting-state brain networks. It is therefore important to determine the degree to which computational models reproduce the topological features of empirical functional brain networks. We used empirical connectivity data derived from diffusion spectrum and resting-state functional magnetic resonance imaging data from healthy individuals. Empirical and simulated functional networks, constrained by structural connectivity, were defined based on 66 brain anatomical regions (nodes). Simulated functional data were generated using the Kuramoto model in which each anatomical region acts as a phase oscillator. Network topology was studied using graph theory in the empirical and simulated data. The difference (relative error) between graph theory measures derived from empirical and simulated data was then estimated. We found that simulated data can be used with confidence to model graph measures of global network organization at different dynamic states and highlight the sensitive dependence of the solutions obtained in simulated data on the specified connection densities. This study provides a method for the quantitative evaluation and external validation of graph theory metrics derived from simulated data that can be used to inform future study designs. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
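A minimal Kuramoto simulation of the kind described, with each region a phase oscillator coupled through a connectivity matrix, can be sketched as below. The 4-node network, natural frequencies, and coupling strength are invented; the study's 66-region model and graph-theoretic analysis are far richer.

```python
# Toy Kuramoto model: Euler-integrate coupled phase oscillators and
# measure synchrony with the order parameter. Parameters are invented.

import math

def kuramoto_step(phases, omega, K, C, dt):
    n = len(phases)
    new = []
    for i in range(n):
        coupling = sum(C[i][j] * math.sin(phases[j] - phases[i])
                       for j in range(n))
        new.append(phases[i] + dt * (omega[i] + K * coupling))
    return new

def order_parameter(phases):
    """|mean of exp(i*phase)|: 1 = full synchrony, 0 = incoherence."""
    n = len(phases)
    re = sum(math.cos(p) for p in phases) / n
    im = sum(math.sin(p) for p in phases) / n
    return math.hypot(re, im)

C = [[0, 1, 1, 0], [1, 0, 1, 1], [1, 1, 0, 1], [0, 1, 1, 0]]  # "structural" links
omega = [1.0, 1.1, 0.9, 1.05]      # natural frequencies (rad/s)
phases = [0.0, 2.0, 4.0, 1.0]
for _ in range(5000):
    phases = kuramoto_step(phases, omega, K=0.5, C=C, dt=0.01)
R = order_parameter(phases)
print(round(R, 3))
```

In the study's setting, pairwise phase coherence of such simulated signals defines a functional network whose graph measures are then compared with the empirical ones.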
Survival estimation and the effects of dependency among animals
Schmutz, Joel A.; Ward, David H.; Sedinger, James S.; Rexstad, Eric A.
1995-01-01
Survival models assume that fates of individuals are independent, yet the robustness of this assumption has been poorly quantified. We examine how empirically derived estimates of the variance of survival rates are affected by dependency in survival probability among individuals. We used Monte Carlo simulations to generate known amounts of dependency among pairs of individuals and analyzed these data with Kaplan-Meier and Cormack-Jolly-Seber models. Dependency significantly increased these empirical variances as compared to theoretically derived estimates of variance from the same populations. Using resighting data from 168 pairs of black brant, we used a resampling procedure and program RELEASE to estimate empirical and mean theoretical variances. We estimated that the relationship between paired individuals caused the empirical variance of the survival rate to be 155% larger than the empirical variance for unpaired individuals. Monte Carlo simulations and use of this resampling strategy can provide investigators with information on how robust their data are to this common assumption of independent survival probabilities.
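The Monte Carlo logic can be sketched as follows: pair members share one fate with a given probability, and the empirical variance of the survival estimate across replicates is compared with the binomial variance that assumes independence. All parameters are invented, and this is a stand-in for, not a reproduction of, the paper's Kaplan-Meier and Cormack-Jolly-Seber analyses.

```python
# Toy demonstration that within-pair dependency inflates the variance of
# a survival-rate estimate relative to the independence (binomial) case.

import random

def simulate_survival(n_pairs, p, dependency, rng):
    """Survival estimate for 2*n_pairs animals; pair mates share one
    fate with probability `dependency`, else fates are independent."""
    alive = 0
    for _ in range(n_pairs):
        a = rng.random() < p
        b = a if rng.random() < dependency else (rng.random() < p)
        alive += int(a) + int(b)
    return alive / (2 * n_pairs)

rng = random.Random(42)
n_pairs, p, reps = 100, 0.8, 2000
binomial_var = p * (1 - p) / (2 * n_pairs)   # variance if fates independent
ratios = []
for dep in (0.0, 1.0):
    est = [simulate_survival(n_pairs, p, dep, rng) for _ in range(reps)]
    m = sum(est) / reps
    var = sum((x - m) ** 2 for x in est) / (reps - 1)
    ratios.append(var / binomial_var)
print([round(r, 2) for r in ratios])
```

With fully shared fates the effective sample size halves, so the variance ratio approaches 2, which is the kind of inflation the brant resampling analysis quantified.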
NASA Astrophysics Data System (ADS)
Kuncarayakti, H.; Galbany, L.; Anderson, J. P.; Krühler, T.; Hamuy, M.
2016-09-01
Context. Stellar populations are the building blocks of galaxies, including the Milky Way. Most extragalactic studies, if not all, rely on stellar population models, given the unresolved nature of their observations. Extragalactic systems contain multiple stellar populations with complex star formation histories. However, studies of these systems are mainly based upon the principles of simple stellar populations (SSP). Hence, it is critical to examine the validity of SSP models. Aims: This work aims to empirically test the validity of SSP models. This is done by comparing SSP models against observations of a spatially resolved young stellar population in the determination of its physical properties, that is, age and metallicity. Methods: Integral field spectroscopy of a young stellar cluster in the Milky Way, NGC 3603, was used to study the properties of the cluster as both a resolved and unresolved stellar population. The unresolved stellar population was analysed using the Hα equivalent width as an age indicator and the ratio of strong emission lines to infer metallicity. In addition, spectral energy distribution (SED) fitting using STARLIGHT was used to infer these properties from the integrated spectrum. Independently, the resolved stellar population was analysed using the colour-magnitude diagram (CMD) to determine age and metallicity. As the SSP model represents the unresolved stellar population, the derived age and metallicity were tested to determine whether they agree with those derived from resolved stars. Results: The age and metallicity estimates of NGC 3603 derived from integrated spectroscopy are confirmed to be within the range of those derived from the CMD of the resolved stellar population, including other estimates found in the literature. The result from this pilot study supports the reliability of SSP models for studying unresolved young stellar populations. 
Based on observations collected at the European Organisation for Astronomical Research in the Southern Hemisphere under ESO programme 60.A-9344.
Organizational economics and health care markets.
Robinson, J C
2001-01-01
As health policy emphasizes the use of private sector mechanisms to pursue public sector goals, health services research needs to develop stronger conceptual frameworks for the interpretation of empirical studies of health care markets and organizations. Organizational relationships should not be interpreted exclusively in terms of competition among providers of similar services but also in terms of relationships among providers of substitute and complementary services and in terms of upstream suppliers and downstream distributors. This article illustrates the potential applicability of transactions cost economics, agency theory, and organizational economics more broadly to horizontal and vertical markets in health care. Examples are derived from organizational integration between physicians and hospitals and organizational conversions from nonprofit to for-profit ownership. PMID:11327173
Eolian Dust and the Origin of Sedimentary Chert
Cecil, C. Blaine
2004-01-01
This paper proposes an alternative model for the primary source of silica contained in bedded sedimentary chert. The proposed model is derived from three principal observations as follows: (1) eolian processes in warm-arid climates produce copious amounts of highly reactive fine-grained quartz particles (dust), (2) eolian processes in warm-arid climates export enormous quantities of quartzose dust to marine environments, and (3) bedded sedimentary cherts generally occur in marine strata that were deposited in warm-arid paleoclimates where dust was a potential source of silica. An empirical integration of these observations suggests that eolian dust best explains both the primary and predominant source of silica for most bedded sedimentary cherts.
Multiscale modelling and analysis of collective decision making in swarm robotics.
Vigelius, Matthias; Meyer, Bernd; Pascoe, Geoffrey
2014-01-01
We present a unified approach to describing certain types of collective decision making in swarm robotics that bridges from a microscopic individual-based description to aggregate properties. Our approach encompasses robot swarm experiments, microscopic and probabilistic macroscopic-discrete simulations as well as an analytic mathematical model. Following up on previous work, we identify the symmetry parameter, a measure of the progress of the swarm towards a decision, as a fundamental integrated swarm property and formulate its time evolution as a continuous-time Markov process. Contrary to previous work, which justified this approach only empirically and a posteriori, we justify it from first principles and derive hard limits on the parameter regime in which it is applicable.
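The stochastic evolution of such a symmetry parameter can be illustrated with a minimal individual-based sketch. The plain voter model below is an assumption for illustration only, not the authors' Markov-process formulation, and all parameters are arbitrary:

```python
import random

def simulate_consensus(n_agents=50, max_steps=200_000, seed=1):
    """Voter-model sketch: each step, a random agent copies another's choice.

    Returns the trajectory of the symmetry parameter s = (n_A - n_B) / N,
    a simple measure of the swarm's progress towards a collective decision.
    """
    rng = random.Random(seed)
    opinions = [rng.choice([0, 1]) for _ in range(n_agents)]
    trajectory = []
    for _ in range(max_steps):
        i, j = rng.randrange(n_agents), rng.randrange(n_agents)
        opinions[i] = opinions[j]            # imitate a random peer
        n_a = sum(opinions)
        s = (n_a - (n_agents - n_a)) / n_agents
        trajectory.append(s)
        if abs(s) == 1.0:                    # consensus reached
            break
    return trajectory

traj = simulate_consensus()
```

Under this neutral imitation rule the symmetry parameter performs a random walk that is eventually absorbed at s = ±1 (consensus), which is the kind of aggregate behaviour the macroscopic Markov description captures.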
Aslam, M N; Sudár, S; Hussain, M; Malik, A A; Shah, H A; Qaim, S M
2010-09-01
Cross-section data for the production of medically important radionuclide (124)I via five proton and deuteron induced reactions on enriched tellurium isotopes were evaluated. The nuclear model codes, STAPRE, EMPIRE and TALYS, were used for consistency checks of the experimental data. Recommended excitation functions were derived using a well-defined statistical procedure. Therefrom integral yields were calculated. The various production routes of (124)I were compared. Presently the (124)Te(p,n)(124)I reaction is the method of choice; however, the (125)Te(p,2n)(124)I reaction also appears to have great potential.
Modeling of outgassing and matrix decomposition in carbon-phenolic composites
NASA Technical Reports Server (NTRS)
Mcmanus, Hugh L.
1993-01-01
A new release rate equation to model the phase change of water to steam in composite materials was derived from the theory of molecular diffusion and equilibrium moisture concentration. The new model is dependent on internal pressure, the microstructure of the voids and channels in the composite materials, and the diffusion properties of the matrix material. Hence, it is more fundamental and accurate than the empirical Arrhenius rate equation currently in use. The model was mathematically formalized and integrated into the thermostructural analysis code CHAR. Parametric studies on variation of several parameters have been done. Comparisons to Arrhenius and straight-line models show that the new model produces physically realistic results under all conditions.
NASA Technical Reports Server (NTRS)
Sittler, Edward C., Jr.; Guhathakurta, Madhulika
1999-01-01
We have developed a two-dimensional semiempirical MHD model of the solar corona and solar wind. The model uses empirically derived electron density profiles from white-light coronagraph data measured during the Skylab period and an empirically derived model of the magnetic field which is fitted to observed streamer topologies, which also come from the white-light coronagraph data. The electron density model comes from that developed by Guhathakurta and coworkers. The electron density model is extended into interplanetary space by using electron densities derived from the Ulysses plasma instrument. The model also requires an estimate of the solar wind velocity as a function of heliographic latitude and radial component of the magnetic field at 1 AU, both of which can be provided by the Ulysses spacecraft. The model makes estimates as a function of radial distance and latitude of various fluid parameters of the plasma such as flow velocity V, effective temperature T(sub eff), and effective heat flux q(sub eff), which are derived from the equations of conservation of mass, momentum, and energy, respectively. The term effective indicates that wave contributions could be present. The model naturally provides the spiral pattern of the magnetic field far from the Sun and an estimate of the large-scale surface magnetic field at the Sun, which we estimate to be approx. 12 - 15 G. The magnetic field model shows that the large-scale surface magnetic field is dominated by an octupole term. The model is a steady state calculation which makes the assumption of azimuthal symmetry and solves the various conservation equations in the rotating frame of the Sun. The conservation equations are integrated along the magnetic field direction in the rotating frame of the Sun, thus providing a nearly self-consistent calculation of the fluid parameters.
The model makes a minimum number of assumptions about the physics of the solar corona and solar wind and should provide a very accurate empirical description of the solar corona and solar wind. Once estimates of mass density rho, flow velocity V, effective temperature T(sub eff), effective heat flux q(sub eff), and magnetic field B are computed from the model and waves are assumed unimportant, all other plasma parameters such as Mach number, Alfven speed, gyrofrequency, etc. can be derived as a function of radial distance and latitude from the Sun. The model can be used as a planning tool for such missions as Solar Probe and provide an empirical framework for theoretical models of the solar corona and solar wind. The model will be used to construct a semiempirical MHD description of the steady state solar corona and solar wind using the SOHO Large Angle Spectrometric Coronagraph (LASCO) polarized brightness white-light coronagraph data, SOHO Extreme Ultraviolet Imaging Telescope data, and Ulysses plasma data.
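The spiral pattern of the magnetic field far from the Sun follows from simple kinematics (the Parker spiral); a minimal sketch with illustrative values, not the paper's fitted field model:

```python
import math

# Parker spiral: the garden-hose angle psi between the interplanetary
# magnetic field and the radial direction satisfies
#   tan(psi) = Omega * r * sin(theta) / V_sw
OMEGA_SUN = 2.7e-6        # solar rotation rate, rad/s (approximate)
AU = 1.496e11             # metres

def spiral_angle_deg(r_m, v_sw_ms, colat_rad=math.pi / 2):
    """Angle of the field to the radial direction, in degrees."""
    return math.degrees(math.atan(OMEGA_SUN * r_m * math.sin(colat_rad) / v_sw_ms))

# Slow wind (~400 km/s) in the ecliptic at 1 AU gives psi near 45 degrees.
psi = spiral_angle_deg(AU, 400e3)
```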
Optimizing integrated luminosity of future hadron colliders
NASA Astrophysics Data System (ADS)
Benedikt, Michael; Schulte, Daniel; Zimmermann, Frank
2015-10-01
The integrated luminosity, a key figure of merit for any particle-physics collider, is closely linked to the peak luminosity and to the beam lifetime. The instantaneous peak luminosity of a collider is constrained by a number of boundary conditions, such as the available beam current, the maximum beam-beam tune shift with acceptable beam stability and reasonable luminosity lifetime (i.e., the empirical "beam-beam limit"), or the event pileup in the physics detectors. The beam lifetime at high-luminosity hadron colliders is largely determined by particle burn off in the collisions. In future highest-energy circular colliders synchrotron radiation provides a natural damping mechanism, which can be exploited for maximizing the integrated luminosity. In this article, we derive analytical expressions describing the optimized integrated luminosity, the corresponding optimum store length, and the time evolution of relevant beam parameters, without or with radiation damping, while respecting a fixed maximum value for the total beam-beam tune shift or for the event pileup in the detector. Our results are illustrated by examples for the proton-proton luminosity of the existing Large Hadron Collider (LHC) at its design parameters, of the High-Luminosity Large Hadron Collider (HL-LHC), and of the Future Circular Collider (FCC-hh).
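The burn-off-only case (no radiation damping) admits a simple closed form; the sketch below assumes L(t) = L0/(1 + t/tau)^2 and uses illustrative numbers, not LHC design parameters:

```python
import numpy as np

def avg_luminosity(T, L0, tau, t_turn):
    """Luminosity integrated over one store of length T, divided by the
    full cycle time T + t_turn (turnaround).  With pure burn-off,
    L(t) = L0 / (1 + t/tau)^2, so the integral has a closed form."""
    integral = L0 * tau * (1.0 - 1.0 / (1.0 + T / tau))
    return integral / (T + t_turn)

tau, t_turn = 15.0, 5.0                    # hours, illustrative values
T_grid = np.linspace(0.1, 40.0, 4000)      # candidate store lengths
avgs = avg_luminosity(T_grid, 1.0, tau, t_turn)
T_opt = float(T_grid[np.argmax(avgs)])
```

For pure burn-off the optimum store length works out analytically to sqrt(tau * t_turn), about 8.7 h with these numbers, so the numerical scan should agree with the closed-form optimum.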
ERIC Educational Resources Information Center
Huerta, Juan Carlos; Sperry, Rita
2013-01-01
This article outlines a systematic and manageable method for learning community program assessment based on collecting empirical direct measures of student learning. Developed at Texas A&M University--Corpus Christi where all full-time, first-year students are in learning communities, the approach ties integrative assignment design to a rubric…
Haynos, Ann F; Pearson, Carolyn M; Utzinger, Linsey M; Wonderlich, Stephen A; Crosby, Ross D; Mitchell, James E; Crow, Scott J; Peterson, Carol B
2017-05-01
Evidence suggests that eating disorder subtypes reflecting under-controlled, over-controlled, and low psychopathology personality traits constitute reliable phenotypes that differentiate treatment response. This study is the first to use statistical analyses to identify these subtypes within treatment-seeking individuals with bulimia nervosa (BN) and to use these statistically derived clusters to predict clinical outcomes. Using variables from the Dimensional Assessment of Personality Pathology-Basic Questionnaire, K-means cluster analyses identified under-controlled, over-controlled, and low psychopathology subtypes within BN patients (n = 80) enrolled in a treatment trial. Generalized linear models examined the impact of personality subtypes on Eating Disorder Examination global score, binge eating frequency, and purging frequency cross-sectionally at baseline and longitudinally at end of treatment (EOT) and follow-up. In the longitudinal models, secondary analyses were conducted to examine personality subtype as a potential moderator of response to Cognitive Behavioral Therapy-Enhanced (CBT-E) or Integrative Cognitive-Affective Therapy for BN (ICAT-BN). There were no baseline clinical differences between groups. In the longitudinal models, personality subtype predicted binge eating (p = 0.03) and purging (p = 0.01) frequency at EOT and binge eating frequency at follow-up (p = 0.045). The over-controlled group demonstrated the best outcomes on these variables. In secondary analyses, there was a treatment by subtype interaction for purging at follow-up (p = 0.04), which indicated a superiority of CBT-E over ICAT-BN for reducing purging among the over-controlled group. Empirically derived personality subtyping appears to be a valid classification system with potential to guide eating disorder treatment decisions. © 2016 Wiley Periodicals, Inc. (Int J Eat Disord 2017; 50:506-514).
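The K-means subtyping step can be sketched as follows; the synthetic two-dimensional "trait" scores and the minimal Lloyd's-algorithm implementation below are illustrative stand-ins, not the study's DAPP-BQ variables or software:

```python
import numpy as np

def kmeans(X, k, init_idx, n_iter=100):
    """Minimal Lloyd's algorithm; init_idx picks the starting centroids."""
    centroids = X[list(init_idx)].astype(float)
    labels = np.zeros(len(X), dtype=int)
    for _ in range(n_iter):
        # Assign every point to its nearest centroid ...
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # ... then move each centroid to the mean of its members.
        new = np.array([X[labels == j].mean(axis=0) for j in range(k)])
        if np.allclose(new, centroids):
            break
        centroids = new
    return centroids, labels

# Three synthetic "subtypes" on two trait axes, well separated for clarity.
rng = np.random.default_rng(42)
under = rng.normal([3.0, -3.0], 0.3, size=(30, 2))   # under-controlled
over = rng.normal([-3.0, 3.0], 0.3, size=(30, 2))    # over-controlled
low = rng.normal([0.0, 0.0], 0.3, size=(30, 2))      # low psychopathology
X = np.vstack([under, over, low])
_, labels = kmeans(X, k=3, init_idx=(0, 30, 60))     # one seed point per block
```

With well-separated groups the three clusters recover the three generating subtypes; in real data the choice of k and the stability of the solution have to be checked explicitly.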
Evidence-based ethics? On evidence-based practice and the "empirical turn" from normative bioethics
Goldenberg, Maya J
2005-01-01
Background The increase in empirical methods of research in bioethics over the last two decades is typically perceived as a welcomed broadening of the discipline, with increased integration of social and life scientists into the field and ethics consultants into the clinical setting; however, it also represents a loss of confidence in the typical normative and analytic methods of bioethics. Discussion The recent incipiency of "Evidence-Based Ethics" attests to this phenomenon and should be rejected as a solution to the current ambivalence toward the normative resolution of moral problems in a pluralistic society. While "evidence-based" is typically read in medicine and other life and social sciences as the empirically-adequate standard of reasonable practice and a means for increasing certainty, I propose that the evidence-based movement in fact gains consensus by displacing normative discourse with aggregate or statistically-derived empirical evidence as the "bottom line". Therefore, along with wavering on the fact/value distinction, evidence-based ethics threatens bioethics' normative mandate. The appeal of the evidence-based approach is that it offers a means of negotiating the demands of moral pluralism. Rather than appealing to explicit values that are likely not shared by all, "the evidence" is proposed to adjudicate between competing claims. Quantified measures are notably more "neutral" and democratic than liberal markers like "species normal functioning". Yet the positivist notion that claims stand or fall in light of the evidence is untenable; furthermore, the legacy of positivism entails the quieting of empirically non-verifiable (or at least non-falsifiable) considerations like moral claims and judgments. As a result, evidence-based ethics proposes to operate with the implicit normativity that accompanies the production and presentation of all biomedical and scientific facts unchecked.
Summary The "empirical turn" in bioethics signals a need for reconsideration of the methods used for moral evaluation and resolution; however, the options should not include obscuring normative content with seemingly neutral technical measures. PMID:16277663
NASA Astrophysics Data System (ADS)
Wen, Yi-Ying
2018-02-01
The 2014 ML 5.9 Fanglin earthquake occurred at the northern end of the aftershock distribution of the 2013 ML 6.4 Ruisui event and caused strong ground shaking and some damage in the northern part of the Longitudinal Valley. We carried out a strong-motion simulation of the 2014 Fanglin event in the broadband frequency range (0.4-10 Hz) using the empirical Green's function method and then integrated the source models to investigate the source characteristics of the 2013 Ruisui and 2014 Fanglin events. The results show that the dimension of the strong-motion generation area of the 2013 Ruisui event is smaller, whereas that of the 2014 Fanglin event is comparable with the empirical estimation for inland crustal earthquakes, which indicates different faulting behaviors. Furthermore, the localized high-PGV patch might be caused by radiation energy amplified by the local low-velocity structure in the northern Longitudinal Valley. Further study is required to build up knowledge of the potential seismic hazard related to moderate-to-large events for the various seismogenic areas in Taiwan.
Integrity in Biomedical Research: A Systematic Review of Studies in China.
Yi, Nannan; Nemery, Benoit; Dierickx, Kris
2018-05-02
Recent empirical evidence has demonstrated that research misconduct occurs to a substantial degree in biomedical research. It has been suggested that scientific integrity is also of concern in China, but this seems to be based largely on anecdotal evidence. We, therefore, sought to explore the Chinese situation, by making a systematic review of published empirical studies on biomedical research integrity in China. One of our purposes was also to summarize the existing body of research published in Chinese. We searched the China National Knowledge Infrastructure, Wanfang Data, PubMed and Web of Science for potentially relevant studies, and included studies meeting our inclusion criteria, i.e. mainly those presenting empirically obtained data about the practice of research in China. All the data was extracted and synthesized using an inductive approach. Twenty-one studies were included for review. Two studies used qualitative methods (interviews) and nineteen studies used quantitative methods (questionnaires). Studies involved mainly medical postgraduates and nurses and they investigated awareness, attitudes, perceptions and experiences of research integrity and misconduct. Most of the participants in these 21 studies reported that research integrity is of great importance and that they obey academic norms during their research. Nevertheless, the occurrence of research misbehaviors, such as fabrication, falsification, plagiarism, improper authorship and duplicate submission was also reported. Strengthening research integrity training, developing the governance system and improving the scientific evaluation system were areas of particular attention in several studies. Our review demonstrates that a substantial number of articles have been devoted to research integrity in China, but only a few studies provide empirical evidence. 
With more safeguard measures of research integrity being taken in China, it would be crucial to conduct more research to explore researchers' in-depth perceptions and evaluate the changes.
NASA Astrophysics Data System (ADS)
Kiafar, Hamed; Babazadeh, Hosssien; Marti, Pau; Kisi, Ozgur; Landeras, Gorka; Karimi, Sepideh; Shiri, Jalal
2017-10-01
Evapotranspiration estimation is of crucial importance in arid and hyper-arid regions, which suffer from water shortage, increasing dryness and heat. A modeling study on cross-station assessment between hyper-arid and humid conditions is reported here. The derived equations estimate ET0 values based on temperature-, radiation-, and mass-transfer-based configurations. Using data from two meteorological stations in a hyper-arid region of Iran and two meteorological stations in a humid region of Spain, different local and cross-station approaches are applied for developing and validating the derived equations. Comparison of the gene expression programming (GEP)-derived equations with corresponding empirical and semi-empirical ET0 estimation equations reveals the superiority of the new formulas. Therefore, the derived models can be successfully applied in these hyper-arid and humid regions, as well as in similar climatic contexts, especially in data-scarce situations. The results also show that, with proper input configurations, cross-station application might be a promising alternative to locally trained models for stations with data scarcity.
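One widely used temperature-based configuration against which such GEP-derived equations are typically benchmarked is the Hargreaves-Samani equation; a minimal sketch with illustrative inputs (Ra must already be expressed in equivalent evaporation, mm/day):

```python
import math

def et0_hargreaves(t_mean, t_max, t_min, ra_mm_day):
    """Hargreaves-Samani reference evapotranspiration (mm/day).

    t_mean, t_max, t_min: daily air temperatures (deg C)
    ra_mm_day: extraterrestrial radiation as equivalent evaporation (mm/day)
    """
    return 0.0023 * ra_mm_day * (t_mean + 17.8) * math.sqrt(t_max - t_min)

# Hot, dry summer day at a hyper-arid station (illustrative numbers):
et0 = et0_hargreaves(t_mean=35.0, t_max=44.0, t_min=26.0, ra_mm_day=16.5)
# roughly 8.5 mm/day here
```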
NASA Technical Reports Server (NTRS)
Lautenschlager, L.; Perry, C. R., Jr. (Principal Investigator)
1981-01-01
The development of formulae for the reduction of multispectral scanner measurements to a single value (vegetation index) for predicting and assessing vegetative characteristics is addressed. The origin, motivation, and derivation of some four dozen vegetation indices are summarized. Empirical, graphical, and analytical techniques are used to investigate the relationships among the various indices. It is concluded that many vegetative indices are very similar, some being simple algebraic transforms of others.
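The conclusion that many indices are simple algebraic transforms of others is easy to verify for two classic examples: with the ratio vegetation index RVI = NIR/R and the normalized difference vegetation index NDVI = (NIR - R)/(NIR + R), one has NDVI = (RVI - 1)/(RVI + 1). A short sketch with illustrative reflectances:

```python
import numpy as np

nir = np.array([0.45, 0.52, 0.60])   # near-infrared reflectance (illustrative)
red = np.array([0.08, 0.12, 0.20])   # red reflectance (illustrative)

rvi = nir / red                      # ratio vegetation index
ndvi = (nir - red) / (nir + red)     # normalized difference vegetation index

# NDVI is a simple algebraic transform of RVI:
same = np.allclose(ndvi, (rvi - 1) / (rvi + 1))
```

The two indices therefore carry the same information and are merely rescalings of each other, which is the kind of redundancy the survey documents across dozens of indices.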
An empirical Bayes approach for the Poisson life distribution.
NASA Technical Reports Server (NTRS)
Canavos, G. C.
1973-01-01
A smooth empirical Bayes estimator is derived for the intensity parameter (hazard rate) in the Poisson distribution as used in life testing. The reliability function is also estimated either by using the empirical Bayes estimate of the parameter, or by obtaining the expectation of the reliability function. The behavior of the empirical Bayes procedure is studied through Monte Carlo simulation in which estimates of mean-squared errors of the empirical Bayes estimators are compared with those of conventional estimators such as minimum variance unbiased or maximum likelihood. Results indicate a significant reduction in mean-squared error of the empirical Bayes estimators over the conventional variety.
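The flavor of the Monte Carlo comparison can be sketched with a parametric (gamma-prior) empirical Bayes estimator; this is a simplified stand-in for the paper's smooth estimator, and the data below are simulated:

```python
import numpy as np

def eb_poisson_estimates(x):
    """Parametric empirical Bayes for Poisson means with a gamma prior.

    The prior's shape and rate are estimated from the marginal mean and
    variance (method of moments); the posterior mean then shrinks each
    raw count x_i towards the pooled mean.
    """
    m, v = x.mean(), x.var()
    if v <= m:                        # no evidence of over-dispersion
        return np.full_like(x, m, dtype=float)
    rate = m / (v - m)                # gamma rate parameter
    shape = m * rate                  # gamma shape parameter
    return (x + shape) / (1.0 + rate)

rng = np.random.default_rng(0)
lam = rng.gamma(shape=2.0, scale=1.0, size=2000)   # true intensity parameters
x = rng.poisson(lam).astype(float)                 # one observed count each

mse_mle = np.mean((x - lam) ** 2)                  # conventional estimate: x itself
mse_eb = np.mean((eb_poisson_estimates(x) - lam) ** 2)
```

With over-dispersed data the shrinkage estimator shows the kind of mean-squared-error reduction over the conventional estimator that the abstract reports.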
A DEIM Induced CUR Factorization
2015-09-18
We derive a CUR approximate matrix factorization based on the Discrete Empirical Interpolation Method (DEIM). For a given matrix A, such a factorization provides a … CUR approximations based on leverage scores.
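A minimal sketch of a DEIM-induced CUR: the index selection below follows the standard DEIM recursion applied to the leading singular vectors, as an illustration rather than the report's implementation:

```python
import numpy as np

def deim_indices(V):
    """Standard DEIM point selection applied to the columns of V (n x k)."""
    idx = [int(np.argmax(np.abs(V[:, 0])))]
    for j in range(1, V.shape[1]):
        # Interpolate the next basis vector at the chosen points, then
        # pick the row where the interpolation residual is largest.
        c = np.linalg.solve(V[idx, :j], V[idx, j])
        r = V[:, j] - V[:, :j] @ c
        idx.append(int(np.argmax(np.abs(r))))
    return np.array(idx)

def deim_cur(A, k):
    """CUR factorization with rows/columns of A chosen by DEIM."""
    U_svd, _, Vt = np.linalg.svd(A, full_matrices=False)
    rows = deim_indices(U_svd[:, :k])
    cols = deim_indices(Vt.T[:, :k])
    C, R = A[:, cols], A[rows, :]
    U = np.linalg.pinv(C) @ A @ np.linalg.pinv(R)
    return C, U, R

# For an exactly rank-k matrix, C @ U @ R reproduces A.
rng = np.random.default_rng(0)
A = rng.standard_normal((60, 3)) @ rng.standard_normal((3, 40))
C, U, R = deim_cur(A, k=3)
err = np.linalg.norm(A - C @ U @ R) / np.linalg.norm(A)
```

Because C and R consist of actual columns and rows of A, the factors inherit properties such as sparsity and interpretability, which is the usual motivation for CUR over a plain truncated SVD.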
Werner, Jan; Sfakianakis, Nikolaos; Rendall, Alan D; Griebeler, Eva Maria
2018-05-07
Ectothermic and endothermic vertebrates differ not only in their source of body temperature (environment vs. metabolism), but also in growth patterns, in timing of sexual maturation within life, and in energy intake functions. Here, we present a mathematical model applicable to ectothermic and endothermic vertebrates. It is designed to test whether differences in the timing of sexual maturation within an animal's life (age at which sexual maturity is reached vs. longevity) together with its ontogenetic gain in body mass (growth curve) can predict the energy intake throughout the animal's life (food intake curve) and can explain differences in energy partitioning (between growth, reproduction, heat production and maintenance, with the latter subsuming any other additional task requiring energy) between ectothermic and endothermic vertebrates. With our model, we calculated energy intake functions and energy partitioning for five ectothermic and seven endothermic vertebrate species from their growth curves and the ages at which they reached sexual maturity. We show that our model produces energy intake patterns and distributions as observed in ectothermic and endothermic species. Our results are consistent with empirical studies indicating that in endothermic species, like birds and mammals, energy is used for heat production instead of growth, and with a hypothesis on the evolution of endothermy in amniotes that we published previously. Our model offers an explanation of known differences in absolute energy intake between ectothermic fish and reptiles and endothermic birds and mammals. From a mathematical perspective, the model comes in two equivalent formulations, a differential and an integral one. It is derived from a discrete-level approach, and it is shown to be well-posed and to attain a unique solution for (almost) every parameter set.
Numerically, the integral formulation of the model is considered as an inverse problem with unknown parameters that are estimated using a series of empirical data. Copyright © 2018 Elsevier Ltd. All rights reserved.
An algorithmic and information-theoretic approach to multimetric index construction
Schoolmaster, Donald R.; Grace, James B.; Schweiger, E. William; Guntenspergen, Glenn R.; Mitchell, Brian R.; Miller, Kathryn M.; Little, Amanda M.
2013-01-01
The use of multimetric indices (MMIs), such as the widely used index of biological integrity (IBI), to measure, track, summarize and infer the overall impact of human disturbance on biological communities has been steadily growing in recent years. Initially, MMIs were developed for aquatic communities using pre-selected biological metrics as indicators of system integrity. As interest in these bioassessment tools has grown, so have the types of biological systems to which they are applied. For many ecosystem types the appropriate biological metrics to use as measures of biological integrity are not known a priori. As a result, a variety of ad hoc protocols for selecting metrics empirically has emerged. However, the assumptions made by proposed protocols have not been explicitly described or justified, causing many investigators to call for a clear, repeatable methodology for developing empirically derived metrics and indices that can be applied to any biological system. An issue of particular importance that has not been sufficiently addressed is the way that individual metrics combine to produce an MMI that is a sensitive composite indicator of human disturbance. In this paper, we present and demonstrate an algorithm for constructing MMIs given a set of candidate metrics and a measure of human disturbance. The algorithm uses each metric to inform a candidate MMI, and then uses information-theoretic principles to select MMIs that capture the information in the multidimensional system response from among possible MMIs. Such an approach can be used to create purely empirical (data-based) MMIs or can, optionally, be influenced by expert opinion or biological theory through the use of a weighting vector to create value-weighted MMIs. We demonstrate the algorithm with simulated data to show the predictive capacity of the final MMIs and with real data from wetlands in Acadia and Rocky Mountain National Parks.
For the Acadia wetland data, the algorithm identified 4 metrics that combined to produce a -0.88 correlation with the human disturbance index. When compared to other methods, we find this algorithmic approach resulted in MMIs that were more predictive and comprised fewer metrics.
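The general shape of such an algorithm can be sketched with a greedy, correlation-driven variant; the paper's information-theoretic selection criterion is replaced here by a plain correlation score, and the data are synthetic:

```python
import numpy as np

def build_mmi(metrics, disturbance):
    """Greedily add z-scored metrics (sign-aligned with disturbance) while
    the composite's correlation with disturbance keeps improving."""
    z = (metrics - metrics.mean(axis=0)) / metrics.std(axis=0)
    # Align every metric so it increases with disturbance.
    signs = np.sign([np.corrcoef(z[:, j], disturbance)[0, 1]
                     for j in range(z.shape[1])])
    z = z * signs
    chosen, best = [], -np.inf
    while True:
        scores = []
        for j in range(z.shape[1]):
            if j in chosen:
                scores.append(-np.inf)
                continue
            mmi = z[:, chosen + [j]].mean(axis=1)   # candidate composite
            scores.append(np.corrcoef(mmi, disturbance)[0, 1])
        j_best = int(np.argmax(scores))
        if scores[j_best] <= best:                  # no further improvement
            break
        chosen.append(j_best)
        best = scores[j_best]
    return chosen, best

rng = np.random.default_rng(7)
d = np.linspace(0, 1, 120)                          # human disturbance gradient
metrics = np.column_stack([
    -d + rng.normal(0, 0.05, d.size),               # sensitive, decreasing
    0.5 * d + rng.normal(0, 0.05, d.size),          # sensitive, increasing
    rng.normal(0, 1, d.size),                       # uninformative noise
])
chosen, corr = build_mmi(metrics, d)
```

The greedy variant keeps only metrics that improve the composite's relationship with disturbance and leaves the pure-noise metric out, mirroring the goal of the authors' algorithm without its information-theoretic machinery.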
Constituents of Music and Visual-Art Related Pleasure - A Critical Integrative Literature Review.
Tiihonen, Marianne; Brattico, Elvira; Maksimainen, Johanna; Wikgren, Jan; Saarikallio, Suvi
2017-01-01
The present literature review investigated how pleasure induced by music and visual-art has been conceptually understood in empirical research over the past 20 years. After an initial selection of abstracts from seven databases (keywords: pleasure, reward, enjoyment, and hedonic), twenty music and eleven visual-art papers were systematically compared. The following questions were addressed: (1) What is the role of the keyword in the research question? (2) Is pleasure considered a result of variation in the perceiver's internal or external attributes? (3) What are the most commonly employed methods and main variables in empirical settings? Based on these questions, our critical integrative analysis aimed to identify which themes and processes emerged as key features for conceptualizing art-induced pleasure. The results demonstrated great variance in how pleasure has been approached: In the music studies pleasure was often a clear object of investigation, whereas in the visual-art studies the term was often embedded into the context of an aesthetic experience, or used otherwise in a descriptive, indirect sense. Music studies often targeted different emotions, their intensity or anhedonia. Biographical and background variables and personality traits of the perceiver were often measured. Next to behavioral methods, a common method was brain imaging, which often targeted the reward circuitry of the brain in response to music. Visual-art pleasure was also frequently addressed using brain imaging methods, but the research focused on sensory cortices rather than the reward circuit alone. Compared with music research, visual-art research more frequently investigated pleasure in relation to conscious, cognitive processing, where the variations of stimulus features and the changing of viewing modes were regarded as explanatory factors of the derived experience.
Despite valence being frequently applied in both domains, we conclude that in empirical music research pleasure seems to be part of core affect and hedonic tone modulated by stable personality variables, whereas in visual-art research pleasure is a result of the so-called conceptual act, depending on a chosen strategy to approach art. We encourage an integration of music and visual-art into a multi-modal framework to promote a more versatile understanding of pleasure in response to aesthetic artifacts.
An Attempt to Derive the epsilon Equation from a Two-Point Closure
NASA Technical Reports Server (NTRS)
Canuto, V. M.; Cheng, Y.; Howard, A. M.
2010-01-01
The goal of this paper is to derive the equation for the turbulence dissipation rate ε for a shear-driven flow. In 1961, Davydov used a one-point closure model to derive the ε equation from first principles, but the final result contained undetermined terms and thus lacked predictive power. Both in 1987 and in 2001, attempts were made to derive the ε equation from first principles using a two-point closure, but their methods relied on a phenomenological assumption. The standard practice has thus been to employ a heuristic form of the equation that contains three empirical ingredients: two constants, c_ε1 and c_ε2, and a diffusion term D_ε. In this work, a two-point closure is employed, yielding the following results: 1) the empirical constants get replaced by c_1, c_2, which are now functions of K and ε; 2) c_1 and c_2 are not independent, because a general relation between the two, valid for any K and ε, is derived; 3) c_1, c_2 become constant with values close to the empirical values c_ε1, c_ε2 (i.e., homogeneous flows); and 4) the empirical form of the diffusion term D_ε is no longer needed, because it gets substituted by the K-ε dependence of c_1, c_2, which plays the role of the diffusion, together with the diffusion of the turbulent kinetic energy D_K, which now enters the new equation (i.e., inhomogeneous flows). Thus, the three empirical ingredients c_ε1, c_ε2, D_ε are replaced by a single function c_1(K, ε) or c_2(K, ε), plus a D_K term. Three tests of the new equation for ε are presented: one concerning channel flow and two concerning the shear-driven planetary boundary layer (PBL).
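For reference, the heuristic form of the dissipation-rate equation discussed above is, in standard textbook notation (the paper's contribution is to replace the two constants by functions of K and ε):

```latex
\frac{\partial \varepsilon}{\partial t} + U_j \frac{\partial \varepsilon}{\partial x_j}
  = c_{\varepsilon 1}\,\frac{\varepsilon}{K}\,P
  - c_{\varepsilon 2}\,\frac{\varepsilon^{2}}{K}
  + D_{\varepsilon},
\qquad
D_{\varepsilon} = \frac{\partial}{\partial x_j}\!\left(
  \frac{\nu_t}{\sigma_{\varepsilon}}\,
  \frac{\partial \varepsilon}{\partial x_j}\right),
\qquad
c_{\varepsilon 1} \approx 1.44,\quad c_{\varepsilon 2} \approx 1.92,
```

where P is the shear production of the turbulent kinetic energy K and ν_t is the eddy viscosity; the quoted constants are the commonly used empirical values, not results of this paper.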
Derivation of the Freundlich Adsorption Isotherm from Kinetics
ERIC Educational Resources Information Center
Skopp, Joseph
2009-01-01
The Freundlich adsorption isotherm is a useful description of adsorption phenomena. It is frequently presented as an empirical equation with little theoretical basis. In fact, a variety of derivations exist. Here a new derivation is presented using the concepts of fractal reaction kinetics. This derivation provides an alternative basis for…
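The Freundlich form itself, q = K_F C^(1/n), is linear in log-log coordinates, which is how it is usually fitted in practice; a small sketch with synthetic, noise-free data (this illustrates the empirical equation, not the fractal-kinetics derivation):

```python
import numpy as np

# Freundlich isotherm: q = K_F * C**(1/n)  =>  log q = log K_F + (1/n) log C
K_F_true, n_true = 2.0, 2.5
C = np.logspace(-2, 2, 25)                 # equilibrium concentrations
q = K_F_true * C ** (1.0 / n_true)         # sorbed amounts (noise-free)

# Linear regression in log-log space recovers both parameters.
slope, intercept = np.polyfit(np.log(C), np.log(q), 1)
n_fit = 1.0 / slope
K_F_fit = float(np.exp(intercept))
```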
NASA Astrophysics Data System (ADS)
Fuchs, Richard; Prestele, Reinhard; Verburg, Peter H.
2018-05-01
The consideration of gross land changes, meaning all area gains and losses within a pixel or administrative unit (e.g. country), plays an essential role in the estimation of total land changes. Gross land changes affect the magnitude of total land changes, which feeds back to the attribution of biogeochemical and biophysical processes related to climate change in Earth system models. Global empirical studies on gross land changes are currently lacking. Whilst the relevance of gross changes for global change has been indicated in the literature, it is not accounted for in future land change scenarios. In this study, we extract gross and net land change dynamics from large-scale and high-resolution (30-100 m) remote sensing products to create a new global gross and net change dataset. Subsequently, we developed an approach to integrate our empirically derived gross and net changes with the results of future simulation models by accounting for the gross and net change addressed by the land use model and the gross and net change that is below the resolution of modelling. Based on our empirical data, we found that gross land change within 0.5° grid cells was substantially larger than net changes in all parts of the world. As 0.5° grid cells are a standard resolution of Earth system models, this leads to an underestimation of the amount of change. This finding contradicts earlier studies, which assumed gross land changes to appear in shifting cultivation areas only. Applied in a future scenario, the consideration of gross land changes led to approximately 50 % more land changes globally compared to a net land change representation. Gross land changes were most important in heterogeneous land systems with multiple land uses (e.g. shifting cultivation, smallholder farming, and agro-forestry systems). Moreover, the importance of gross changes decreased over time due to further polarization and intensification of land use. 
Our results serve as an empirical database for land change dynamics that can be applied in Earth system models and integrated assessment models.
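The gross-versus-net accounting at the heart of this abstract can be illustrated with a toy calculation (hypothetical code, not the authors'): per grid cell, gross change sums all area gains and losses, while net change keeps only the balance, so gross exceeds net whenever gains and losses co-occur within a cell.

```python
def gross_and_net(cell_changes):
    """cell_changes: list of (area_gained, area_lost) tuples, one per grid cell.

    Gross change counts every gain and loss; net change counts only the
    per-cell balance, which is what a coarse-resolution model would see.
    """
    gross = sum(g + l for g, l in cell_changes)
    net = sum(abs(g - l) for g, l in cell_changes)
    return gross, net

# A cell with both gains and losses inflates gross relative to net:
# [(3, 1), (0, 2)] -> gross = 6, net = 4
```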
Systems Toxicology: Real World Applications and Opportunities.
Hartung, Thomas; FitzGerald, Rex E; Jennings, Paul; Mirams, Gary R; Peitsch, Manuel C; Rostami-Hodjegan, Amin; Shah, Imran; Wilks, Martin F; Sturla, Shana J
2017-04-17
Systems Toxicology aims to change the basis of how adverse biological effects of xenobiotics are characterized from empirical end points to describing modes of action as adverse outcome pathways and perturbed networks. Toward this aim, Systems Toxicology entails the integration of in vitro and in vivo toxicity data with computational modeling. This evolving approach depends critically on data reliability and relevance, which in turn depends on the quality of experimental models and bioanalysis techniques used to generate toxicological data. Systems Toxicology involves the use of large-scale data streams ("big data"), such as those derived from omics measurements that require computational means for obtaining informative results. Thus, integrative analysis of multiple molecular measurements, particularly acquired by omics strategies, is a key approach in Systems Toxicology. In recent years, there have been significant advances centered on in vitro test systems and bioanalytical strategies, yet a frontier challenge concerns linking observed network perturbations to phenotypes, which will require understanding pathways and networks that give rise to adverse responses. This summary perspective from a 2016 Systems Toxicology meeting, an international conference held in the Alps of Switzerland, describes the limitations and opportunities of selected emerging applications in this rapidly advancing field. Systems Toxicology aims to change the basis of how adverse biological effects of xenobiotics are characterized, from empirical end points to pathways of toxicity. This requires the integration of in vitro and in vivo data with computational modeling. Test systems and bioanalytical technologies have made significant advances, but ensuring data reliability and relevance is an ongoing concern. The major challenge facing the new pathway approach is determining how to link observed network perturbations to phenotypic toxicity.
Teaching Integrity in Empirical Research: A Protocol for Documenting Data Management and Analysis
ERIC Educational Resources Information Center
Ball, Richard; Medeiros, Norm
2012-01-01
This article describes a protocol the authors developed for teaching undergraduates to document their statistical analyses for empirical research projects so that their results are completely reproducible and verifiable. The protocol is guided by the principle that the documentation prepared to accompany an empirical research project should be…
Galactic and solar radiation exposure to aircrew during a solar cycle.
Lewis, B J; Bennett, L G I; Green, A R; McCall, M J; Ellaschuk, B; Butler, A; Pierre, M
2002-01-01
An ongoing investigation using a tissue-equivalent proportional counter (TEPC) has been carried out to measure the ambient dose equivalent rate of the cosmic radiation exposure of aircrew during a solar cycle. A semi-empirical model has been derived from these data to allow for the interpolation of the dose rate for any global position. The model has been extended to an altitude of up to 32 km with further measurements made on board aircraft and several balloon flights. The effects of changing solar modulation during the solar cycle are characterised by correlating the dose rate data to different solar potential models. Through integration of the dose-rate function over a great circle flight path or between given waypoints, a Predictive Code for Aircrew Radiation Exposure (PCAIRE) has been further developed for estimation of the route dose from galactic cosmic radiation exposure. This estimate is provided in units of ambient dose equivalent as well as effective dose, based on E/H*(10) scaling functions as determined from transport code calculations with LUIN and FLUKA. This experimentally based treatment has also been compared with the CARI-6 and EPCARD codes, which are derived solely from theoretical transport calculations. Using TEPC measurements taken aboard the International Space Station, ground-based neutron monitoring, GOES satellite data and transport code analysis, an empirical model has been further proposed for estimation of aircrew exposure during solar particle events. This model has been compared to results obtained during recent solar flare events.
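The route-dose step described here (integrating a dose-rate function over a great-circle flight path) can be sketched as follows. This is an illustrative sketch only: the spherical-interpolation path construction and the constant dose rate in the example are assumptions, not PCAIRE's actual dose model.

```python
import math

def great_circle_points(lat1, lon1, lat2, lon2, n):
    """Return n+1 equally spaced (lat, lon) points along the great circle,
    via spherical linear interpolation on the unit sphere."""
    def to_vec(lat, lon):
        la, lo = math.radians(lat), math.radians(lon)
        return (math.cos(la) * math.cos(lo), math.cos(la) * math.sin(lo), math.sin(la))
    a, b = to_vec(lat1, lon1), to_vec(lat2, lon2)
    ang = math.acos(max(-1.0, min(1.0, sum(x * y for x, y in zip(a, b)))))
    pts = []
    for i in range(n + 1):
        t = i / n
        if ang == 0.0:
            v = a
        else:
            f = math.sin((1 - t) * ang) / math.sin(ang)
            g = math.sin(t * ang) / math.sin(ang)
            v = tuple(f * x + g * y for x, y in zip(a, b))
        pts.append((math.degrees(math.asin(v[2])), math.degrees(math.atan2(v[1], v[0]))))
    return pts

def route_dose(dose_rate, lat1, lon1, lat2, lon2, hours, n=100):
    """Trapezoidal integral of dose_rate(lat, lon) along the path,
    assuming constant ground speed so time maps linearly onto the path."""
    pts = great_circle_points(lat1, lon1, lat2, lon2, n)
    dt = hours / n
    total = 0.0
    for i, (lat, lon) in enumerate(pts):
        w = 0.5 if i in (0, n) else 1.0
        total += w * dose_rate(lat, lon) * dt
    return total
```

For a constant (hypothetical) rate of 5 uSv/h over an 8-hour flight, the integral reduces to 40 uSv, which is a convenient sanity check on the quadrature weights.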
Bayesian methods to estimate urban growth potential
Smith, Jordan W.; Smart, Lindsey S.; Dorning, Monica; Dupéy, Lauren Nicole; Méley, Andréanne; Meentemeyer, Ross K.
2017-01-01
Urban growth often influences the production of ecosystem services. The impacts of urbanization on landscapes can subsequently affect landowners’ perceptions, values and decisions regarding their land. Within land-use and land-change research, very few models of dynamic landscape-scale processes like urbanization incorporate empirically-grounded landowner decision-making processes. Very little attention has focused on the heterogeneous decision-making processes that aggregate to influence broader-scale patterns of urbanization. We examine the land-use tradeoffs faced by individual landowners in one of the United States’ most rapidly urbanizing regions – the urban area surrounding Charlotte, North Carolina. We focus on the land-use decisions of non-industrial private forest owners located across the region’s development gradient. A discrete choice experiment is used to determine the critical factors influencing individual forest owners’ intent to sell their undeveloped properties across a series of experimentally varied scenarios of urban growth. Data are analyzed using a hierarchical Bayesian approach. The estimates derived from the survey data are used to modify a spatially-explicit trend-based urban development potential model, derived from remotely-sensed imagery and observed changes in the region’s socioeconomic and infrastructural characteristics between 2000 and 2011. This modeling approach combines the theoretical underpinnings of behavioral economics with spatiotemporal data describing a region’s historical development patterns. By integrating empirical social preference data into spatially-explicit urban growth models, we begin to more realistically capture processes as well as patterns that drive the location, magnitude and rates of urban growth.
EPIC-Simulated and MODIS-Derived Leaf Area Index (LAI) ...
Leaf Area Index (LAI) is an important parameter in assessing vegetation structure for characterizing forest canopies over large areas at broad spatial scales using satellite remote sensing data. However, satellite-derived LAI products can be limited by obstructed atmospheric conditions yielding sub-optimal values, or complete non-returns. The United States Environmental Protection Agency’s Exposure Methods and Measurements and Computational Exposure Divisions are investigating the viability of supplemental modelled LAI inputs into satellite-derived data streams to support various regional and local scale air quality models for retrospective and future climate assessments. In this present study, one-year (2002) of plot level stand characteristics at four study sites located in Virginia and North Carolina are used to calibrate species-specific plant parameters in a semi-empirical biogeochemical model. The Environmental Policy Integrated Climate (EPIC) model was designed primarily for managed agricultural field crop ecosystems, but also includes managed woody species that span both xeric and mesic sites (e.g., mesquite, pine, oak, etc.). LAI was simulated using EPIC at a 4 km2 and 12 km2 grid coincident with the regional Community Multiscale Air Quality Model (CMAQ) grid. LAI comparisons were made between model-simulated and MODIS-derived LAI. Field/satellite-upscaled LAI was also compared to the corresponding MODIS LAI value. Preliminary results show field/satel
Violent Crime in Post-Civil War Guatemala: Causes and Policy Implications
2015-03-01
on field research and case studies in Honduras, Bolivia, and Argentina. Bailey’s Security Trap theory is comprehensive in nature and derived from... research question. The second phase uses empirical data and comparative case studies to validate or challenge selected arguments that potentially... [Figure 2: Sample Research Methodology]
Farmanbar, Amir; Firouzi, Sanaz; Park, Sung-Joon; Nakai, Kenta; Uchimaru, Kaoru; Watanabe, Toshiki
2017-01-31
Clonal expansion of leukemic cells leads to onset of adult T-cell leukemia (ATL), an aggressive lymphoid malignancy with a very poor prognosis. Infection with human T-cell leukemia virus type-1 (HTLV-1) is the direct cause of ATL onset, and integration of HTLV-1 into the human genome is essential for clonal expansion of leukemic cells. Therefore, monitoring clonal expansion of HTLV-1-infected cells via isolation of integration sites assists in analyzing infected individuals from early infection to the final stage of ATL development. However, because of the complex nature of clonal expansion, the underlying mechanisms have yet to be clarified. Combining computational/mathematical modeling with experimental and clinical data of integration site-based clonality analysis derived from next generation sequencing technologies provides an appropriate strategy to achieve a better understanding of ATL development. As a comprehensively interdisciplinary project, this study combined three main aspects: wet laboratory experiments, in silico analysis and empirical modeling. We analyzed clinical samples from HTLV-1-infected individuals with a broad range of proviral loads using a high-throughput methodology that enables isolation of HTLV-1 integration sites and accurate measurement of the size of infected clones. We categorized clones into four size groups, "very small", "small", "big", and "very big", based on the patterns of clonal growth and observed clone sizes. We propose an empirical formal model based on deterministic finite state automata (DFA) analysis of real clinical samples to illustrate patterns of clonal expansion. Through the developed model, we have translated biological data of clonal expansion into the formal language of mathematics and represented the observed clonality data with DFA. Our data suggest that combining experimental data (absolute size of clones) with DFA can describe the clonality status of patients. 
This kind of modeling provides a basic understanding as well as a unique perspective for clarifying the mechanisms of clonal expansion in ATL.
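The size-category and DFA ideas in this record can be illustrated with a toy sketch. The thresholds, states, and transition table below are invented purely for illustration; the paper's actual automaton is derived from clinical clonality data.

```python
def classify_clone(size, thresholds=(10, 100, 1000)):
    """Map an absolute clone size to one of four categories.
    The cutoff values here are hypothetical, not the paper's."""
    labels = ("very small", "small", "big", "very big")
    for label, t in zip(labels, thresholds):
        if size < t:
            return label
    return labels[-1]

# Toy DFA: states are size categories, input symbols are observed
# growth steps ("grow"/"shrink") between sampling time points.
DFA = {
    ("very small", "grow"): "small",   ("small", "grow"): "big",
    ("big", "grow"): "very big",       ("very big", "grow"): "very big",
    ("very big", "shrink"): "big",     ("big", "shrink"): "small",
    ("small", "shrink"): "very small", ("very small", "shrink"): "very small",
}

def run_dfa(start, inputs):
    """Run the automaton over a sequence of observed growth steps."""
    state = start
    for sym in inputs:
        state = DFA[(state, sym)]
    return state
```

The point of the formalism is that a patient's longitudinal clonality status becomes a word over a finite alphabet, and the DFA's final state summarizes the observed pattern of clonal expansion.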
Empirical mass-loss rates for 25 O and early B stars, derived from Copernicus observations
NASA Technical Reports Server (NTRS)
Gathier, R.; Lamers, H. J. G. L. M.; Snow, T. P.
1981-01-01
Ultraviolet line profiles are fitted with theoretical line profiles in the cases of 25 stars covering a spectral type range from O4 to B1, including all luminosity classes. Ion column densities are compared for the determination of wind ionization, and it is found that the O VI/N V ratio is dependent on the mean density of the wind and not on effective temperature value, while the Si IV/N V ratio is temperature-dependent. The column densities are used to derive a mass-loss rate parameter that is empirically correlated against the mass-loss rate by means of standard stars with well-determined rates from IR or radio data. The empirical mass-loss rates obtained are compared with those derived by others and found to vary by as much as a factor of 10, which is shown to be due to uncertainties or errors in the ionization fractions of models used for wind ionization balance prediction.
Geiger, Friedemann; Kasper, Jürgen
2012-01-01
Shared decision making (SDM) between patient and physician is an interpersonal process. Most SDM measures use the view of one party (patient, physician or observer) as a proxy to capture this process although these views typically diverge. This study suggests the compound measure SDM(MASS) (SDM Meeting its concept's ASSumptions) integrating these three perspectives in one single index. SDM(MASS) was derived theoretically and compared empirically to unilateral perspectives of patients, physicians and observers by application to a data set of 10 physicians (40 consultations) receiving an SDM training. The constituting parts of SDM(MASS) were highly reliable (Cronbach's alpha .94; interrater reliability .74-.87). Unilateral appraisal of training effects was divergent. SDM(MASS) revealed no effect. SDM(MASS) combines noteworthy information about SDM processes from different viewpoints and thereby delivers plausible assessments. It could overcome immanent shortcomings of unilateral approaches. However, it is a complex measure needing further validation. Copyright © 2012. Published by Elsevier GmbH.
Geerts, Hugo; Dacks, Penny A; Devanarayan, Viswanath; Haas, Magali; Khachaturian, Zaven S; Gordon, Mark Forrest; Maudsley, Stuart; Romero, Klaus; Stephenson, Diane
2016-09-01
Massive investment and technological advances in the collection of extensive and longitudinal information on thousands of Alzheimer patients results in large amounts of data. These "big-data" databases can potentially advance CNS research and drug development. However, although necessary, they are not sufficient, and we posit that they must be matched with analytical methods that go beyond retrospective data-driven associations with various clinical phenotypes. Although these empirically derived associations can generate novel and useful hypotheses, they need to be organically integrated in a quantitative understanding of the pathology that can be actionable for drug discovery and development. We argue that mechanism-based modeling and simulation approaches, where existing domain knowledge is formally integrated using complexity science and quantitative systems pharmacology can be combined with data-driven analytics to generate predictive actionable knowledge for drug discovery programs, target validation, and optimization of clinical development. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
Binford, Michael W.; Lee, Tae Jeong; Townsend, Robert M.
2004-01-01
Environmental variability is an important risk factor in rural agricultural communities. Testing models requires empirical sampling that generates data that are representative in both economic and ecological domains. Detrended correspondence analysis of satellite remote sensing data were used to design an effective low-cost sampling protocol for a field study to create an integrated socioeconomic and ecological database when no prior information on ecology of the survey area existed. We stratified the sample for the selection of tambons from various preselected provinces in Thailand based on factor analysis of spectral land-cover classes derived from satellite data. We conducted the survey for the sampled villages in the chosen tambons. The resulting data capture interesting variations in soil productivity and in the timing of good and bad years, which a purely random sample would likely have missed. Thus, this database will allow tests of hypotheses concerning the effect of credit on productivity, the sharing of idiosyncratic risks, and the economic influence of environmental variability. PMID:15254298
Simulating the value of electric-vehicle-grid integration using a behaviourally realistic model
NASA Astrophysics Data System (ADS)
Wolinetz, Michael; Axsen, Jonn; Peters, Jotham; Crawford, Curran
2018-02-01
Vehicle-grid integration (VGI) uses the interaction between electric vehicles and the electrical grid to provide benefits that may include reducing the cost of using intermittent renewable electricity or providing a financial incentive for electric vehicle ownership. However, studies that estimate the value of VGI benefits have largely ignored how consumer behaviour will affect the magnitude of the impact. Here, we simulate the long-term impact of VGI using behaviourally realistic and empirically derived models of vehicle adoption and charging combined with an electricity system model. We focus on the case where a central entity manages the charging rate and timing for participating electric vehicles. VGI is found not to increase the adoption of electric vehicles, but does have a small beneficial impact on electricity prices. By 2050, VGI reduces wholesale electricity prices by 0.6-0.7% (0.7 MWh-1, 2010 CAD) relative to an equivalent scenario without VGI. Excluding consumer behaviour from the analysis inflates the value of VGI.
Cognitive hypnotherapy for major depressive disorder.
Alladin, Assen
2012-04-01
Since the publication of the special issue on cognitive hypnotherapy in the Journal of Cognitive Psychotherapy: An International Quarterly (1994), there have been major developments in the application of hypnosis to the treatment of depression. However, there is no "one-size-fits-all" treatment for depressive disorders as the conditions represent a complex set of heterogeneous symptoms, involving multiple etiologies. It is thus important for therapists to promote a multimodal approach to treating depressive disorders. This article describes cognitive hypnotherapy (CH), an evidence-based multimodal psychological treatment that can be applied to a wide range of depressed patients. CH combines hypnosis with cognitive behavior therapy as the latter provides the best integrative lodestone for assimilating empirically supported treatment techniques derived from various psychotherapies.
Theory of Financial Risk and Derivative Pricing
NASA Astrophysics Data System (ADS)
Bouchaud, Jean-Philippe; Potters, Marc
2009-01-01
Foreword; Preface; 1. Probability theory: basic notions; 2. Maximum and addition of random variables; 3. Continuous time limit, Ito calculus and path integrals; 4. Analysis of empirical data; 5. Financial products and financial markets; 6. Statistics of real prices: basic results; 7. Non-linear correlations and volatility fluctuations; 8. Skewness and price-volatility correlations; 9. Cross-correlations; 10. Risk measures; 11. Extreme correlations and variety; 12. Optimal portfolios; 13. Futures and options: fundamental concepts; 14. Options: hedging and residual risk; 15. Options: the role of drift and correlations; 16. Options: the Black and Scholes model; 17. Options: some more specific problems; 18. Options: minimum variance Monte-Carlo; 19. The yield curve; 20. Simple mechanisms for anomalous price statistics; Index of most important symbols; Index.
Theory of Financial Risk and Derivative Pricing - 2nd Edition
NASA Astrophysics Data System (ADS)
Bouchaud, Jean-Philippe; Potters, Marc
2003-12-01
Foreword; Preface; 1. Probability theory: basic notions; 2. Maximum and addition of random variables; 3. Continuous time limit, Ito calculus and path integrals; 4. Analysis of empirical data; 5. Financial products and financial markets; 6. Statistics of real prices: basic results; 7. Non-linear correlations and volatility fluctuations; 8. Skewness and price-volatility correlations; 9. Cross-correlations; 10. Risk measures; 11. Extreme correlations and variety; 12. Optimal portfolios; 13. Futures and options: fundamental concepts; 14. Options: hedging and residual risk; 15. Options: the role of drift and correlations; 16. Options: the Black and Scholes model; 17. Options: some more specific problems; 18. Options: minimum variance Monte-Carlo; 19. The yield curve; 20. Simple mechanisms for anomalous price statistics; Index of most important symbols; Index.
Conservational PDF Equations of Turbulence
NASA Technical Reports Server (NTRS)
Shih, Tsan-Hsing; Liu, Nan-Suey
2010-01-01
Recently we have revisited the traditional probability density function (PDF) equations for the velocity and species in turbulent incompressible flows. They are all unclosed due to the appearance of various conditional means, which are modeled empirically. However, we have observed that it is possible to establish a closed velocity PDF equation and a closed joint velocity and species PDF equation through conditions derived from the integral form of the Navier-Stokes equations. Although, in theory, the resulting PDF equations are neither general nor unique, they nevertheless lead to the exact transport equations for the first moment as well as all higher-order moments. We refer to these PDF equations as the conservational PDF equations. This observation is worth further exploration for its validity and CFD application.
Reevaluating Old Stellar Populations
NASA Astrophysics Data System (ADS)
Stanway, E. R.; Eldridge, J. J.
2018-05-01
Determining the properties of old stellar populations (those with age >1 Gyr) has long involved the comparison of their integrated light, either in the form of photometry or spectroscopic indexes, with empirical or synthetic templates. Here we reevaluate the properties of old stellar populations using a new set of stellar population synthesis models, designed to incorporate the effects of binary stellar evolution pathways as a function of stellar mass and age. We find that single-aged stellar population models incorporating binary stars, as well as new stellar evolution and atmosphere models, can reproduce the colours and spectral indices observed in both globular clusters and quiescent galaxies. The best fitting model populations are often younger than those derived from older spectral synthesis models, and may also lie at slightly higher metallicities.
Multiscale Modelling and Analysis of Collective Decision Making in Swarm Robotics
Vigelius, Matthias; Meyer, Bernd; Pascoe, Geoffrey
2014-01-01
We present a unified approach to describing certain types of collective decision making in swarm robotics that bridges from a microscopic individual-based description to aggregate properties. Our approach encompasses robot swarm experiments, microscopic and probabilistic macroscopic-discrete simulations as well as an analytic mathematical model. Following up on previous work, we identify the symmetry parameter, a measure of the progress of the swarm towards a decision, as a fundamental integrated swarm property and formulate its time evolution as a continuous-time Markov process. Contrary to previous work, which justified this approach only empirically and a posteriori, we justify it from first principles and derive hard limits on the parameter regime in which it is applicable. PMID:25369026
Determining the non-inferiority margin for patient reported outcomes.
Gerlinger, Christoph; Schmelter, Thomas
2011-01-01
One of the cornerstones of any non-inferiority trial is the choice of the non-inferiority margin delta. This threshold of clinical relevance is very difficult to determine, and in practice, delta is often "negotiated" between the sponsor of the trial and the regulatory agencies. However, for patient-reported, or more precisely patient-observed, outcomes, the patients' minimal clinically important difference (MCID) can be determined empirically by relating the treatment effect, for example, a change on a 100-mm visual analogue scale, to the patient's satisfaction with the change. This MCID can then be used to define delta. We used an anchor-based approach with non-parametric discriminant analysis and ROC analysis, and a distribution-based approach with Norman's half standard deviation rule, to determine delta in three examples: endometriosis-related pelvic pain measured on a 100-mm visual analogue scale, facial acne measured by lesion counts, and hot flush counts. For each of these examples, all three methods yielded quite similar results. In two of the cases, the empirically derived MCIDs were smaller than or similar to the deltas used before in non-inferiority trials, and in the third case, the empirically derived MCID was used to derive a responder definition that was accepted by the FDA. In conclusion, for patient-observed endpoints, the delta can be derived empirically. In our view, this is a better approach than that of asking the clinician for a "nice round number" for delta, such as 10, 50%, π, e, or i. Copyright © 2011 John Wiley & Sons, Ltd.
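Both families of rules mentioned in this abstract can be sketched in a few lines. The data and cutoff search below are invented for illustration (a full anchor-based ROC analysis would also report sensitivity and specificity at the chosen cutoff); Norman's distribution-based rule is simply half the standard deviation of the scores.

```python
import statistics

def half_sd_mcid(baseline_scores):
    """Distribution-based rule (Norman): MCID ~ half the sample SD."""
    return 0.5 * statistics.stdev(baseline_scores)

def roc_mcid(changes, improved):
    """Anchor-based rule: choose the change-score cutoff maximizing
    Youden's J = sensitivity + specificity - 1 against a dichotomous
    patient-satisfaction anchor. Assumes both anchor groups are non-empty."""
    best_cut, best_j = None, -1.0
    for cut in sorted(set(changes)):
        tp = sum(1 for c, y in zip(changes, improved) if y and c >= cut)
        fn = sum(1 for c, y in zip(changes, improved) if y and c < cut)
        tn = sum(1 for c, y in zip(changes, improved) if not y and c < cut)
        fp = sum(1 for c, y in zip(changes, improved) if not y and c >= cut)
        j = tp / (tp + fn) + tn / (tn + fp) - 1
        if j > best_j:
            best_cut, best_j = cut, j
    return best_cut
```

On a toy data set where patients reporting satisfaction all improved by at least 40 mm and the rest by less, the ROC rule recovers 40 as the cutoff.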
NASA Astrophysics Data System (ADS)
Narvaez, C.; Mendillo, M.; Trovato, J.
2017-12-01
A semi-empirical model of the maximum electron density (Nmax) of the martian ionosphere [MIRI-mark-1](1) was derived from an initial set of radar observations by the MEX/MARSIS instrument. To extend the model to full electron density profiles, normalized shapes of Ne(h) from a theoretical model(2) were calibrated by MIRI's Nmax. Subsequent topside ionosphere observations from MAVEN indicated that topside shapes from MEX/MARSIS(3) offered improved morphology. The MEX topside shapes were then merged with the bottomside shapes from the theoretical model. Using a larger set of MEX/MARSIS observations (07/31/2005 - 05/24/2015), a new specification of Nmax as a function of solar zenith angle and solar flux is now used to calibrate the normalized Ne(h) profiles. The MIRI-mark-2 model includes the integral with height of Ne(h) to form total electron content (TEC) values. Validation of the MIRI TEC was accomplished using an independent set of TEC derived from the SHARAD(4) experiment on MRO. (1) M. Mendillo, A. Marusiak, P. Withers, D. Morgan and D. Gurnett, A New Semi-empirical Model of the Peak Electron Density of the Martian Ionosphere, Geophysical Research Letters, 40, 1-5, doi:10.1002/2013GL057631, 2013. (2) Mayyasi, M. and M. Mendillo (2015), Why the Viking descent probes found only one ionospheric layer at Mars, Geophys. Res. Lett., 42, 7359-7365, doi:10.1002/2015GL065575. (3) Němec, F., D. Morgan, D. Gurnett, and D. Andrews (2016), Empirical model of the Martian dayside ionosphere: Effects of crustal magnetic fields and solar ionizing flux at higher altitudes, J. Geophys. Res. Space Physics, 121, 1760-1771, doi:10.1002/2015/A022060. (4) Campbell, B., and T. Watters (2016), Phase compensation of MARSIS subsurface sounding and estimation of ionospheric properties: New insights from SHARAD results, J. Geophys. Res. Planets, 121, 180-193, doi:10.1002/2015JE004917.
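The TEC step in this abstract (integrating Ne(h) over height) can be sketched numerically. As a stand-in for the model's calibrated profiles, the sketch uses a generic Chapman-layer shape; the peak density, peak altitude, scale height, and integration limits in the example are illustrative values, not MIRI's.

```python
import math

def chapman_ne(h, nmax, hmax, H):
    """Chapman-layer electron density (m^-3): a common analytic
    ionospheric profile shape, peaking at nmax when h == hmax."""
    z = (h - hmax) / H
    return nmax * math.exp(0.5 * (1 - z - math.exp(-z)))

def tec(nmax, hmax, H, h0=80e3, h1=400e3, n=2000):
    """Total electron content: trapezoidal integral of Ne(h) over
    altitude, in electrons per m^2."""
    dh = (h1 - h0) / n
    total = 0.0
    for i in range(n + 1):
        w = 0.5 if i in (0, n) else 1.0
        total += w * chapman_ne(h0 + i * dh, nmax, hmax, H)
    return total * dh
```

With illustrative Mars-like values (nmax = 1e11 m^-3, hmax = 130 km, H = 25 km), the integral lands near nmax * H * sqrt(2*pi*e), the closed-form area under a full Chapman layer, which is about 1e16 electrons per m^2.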
NASA Astrophysics Data System (ADS)
Xie, Yanan; Zhou, Mingliang; Pan, Dengke
2017-10-01
The forward-scattering model is introduced to describe the response of the normalized radar cross section (NRCS) of precipitation with synthetic aperture radar (SAR). Since the distribution of near-surface rainfall is related to the near-surface rainfall rate and a horizontal distribution factor, a retrieval algorithm called modified regression empirical and model-oriented statistical (M-M), based on Volterra integration theory, is proposed. Compared with the model-oriented statistical and Volterra integration (MOSVI) algorithm, the biggest difference is that the M-M algorithm is based on the modified regression empirical algorithm rather than a linear regression formula to retrieve the near-surface rainfall rate. Half of the empirical parameters in the weighted integral work are eliminated, and a smaller average relative error is achieved when the rainfall rate is less than 100 mm/h. Therefore, the algorithm proposed in this paper can obtain high-precision rainfall information.
“Nobody tosses a dwarf!” The relation between the empirical and normative reexamined
Leget, C.; Borry, P.; De Vries, R.
2009-01-01
This article discusses the relation between empirical and normative approaches in bioethics. The issue of dwarf tossing, while admittedly unusual, is chosen as point of departure because it challenges the reader to look upon several central bioethical themes – including human dignity, autonomy, and the protection of vulnerable people – with fresh eyes. After an overview of current approaches to the integration of empirical and normative ethics, we consider five ways that the empirical and normative can be brought together to speak to the problem of dwarf tossing: prescriptive applied ethics, theorist ethics, critical applied ethics, particularist ethics and integrated empirical ethics. We defend a position of critical applied ethics that allows for a two-way relation between empirical and normative theories. The approach we endorse acknowledges that a social practice can and should be judged by both the gathering of empirical data and by the normative ethics. Critical applied ethics uses a five stage process that includes: (a) determination of the problem, (b) description of the problem, (c) empirical study of effects and alternatives, (d) normative weighing and (e) evaluation of the effects of a decision. In each stage, we explore the perspective from both the empirical (sociological) and the normative ethical poles that, in our view, should operate as two independent focuses of the ellipse that is called bioethics. We conclude by applying our five stage critical applied ethics to the example of dwarf tossing. PMID:19338523
Against the empirical viability of the Deutsch-Wallace-Everett approach to quantum mechanics
NASA Astrophysics Data System (ADS)
Dawid, Richard; Thébault, Karim P. Y.
2014-08-01
The subjective Everettian approach to quantum mechanics presented by Deutsch and Wallace fails to constitute an empirically viable theory of quantum phenomena. The decision theoretic implementation of the Born rule realized in this approach provides no basis for rejecting Everettian quantum mechanics in the face of empirical data that contradicts the Born rule. The approach of Greaves and Myrvold, which provides a subjective implementation of the Born rule as well but derives it from empirical data rather than decision theoretic arguments, avoids the problem faced by Deutsch and Wallace and is empirically viable. However, there is good reason to cast doubts on its scientific value.
NASA Astrophysics Data System (ADS)
Li, Huajiao; Fang, Wei; An, Haizhong; Gao, Xiangyun; Yan, Lili
2016-05-01
Economic networks in the real world are not homogeneous; therefore, it is important to study economic networks with heterogeneous nodes and edges to simulate a real network more precisely. In this paper, we present an empirical study of the one-mode derivative holding-based network constructed by the two-mode affiliation network of two sets of actors using the data of worldwide listed energy companies and their shareholders. First, we identify the primitive relationship in the two-mode affiliation network of the two sets of actors. Then, we present the method used to construct the derivative network based on the shareholding relationship between two sets of actors and the affiliation relationship between actors and events. After constructing the derivative network, we analyze different topological features on the node level, edge level and entire network level and explain the meanings of the different values of the topological features combining the empirical data. This study is helpful for expanding the usage of complex networks to heterogeneous economic networks. For empirical research on the worldwide listed energy stock market, this study is useful for discovering the inner relationships between the nations and regions from a new perspective.
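The one-mode derivative network described above can be sketched as a simple bipartite projection. The sketch below uses hypothetical shareholder-company data (the study itself uses worldwide listed energy companies and their shareholders):

```python
from itertools import combinations

# Two-mode affiliation data (hypothetical): shareholder -> companies held.
holdings = {
    "FundA": {"EnergyCo1", "EnergyCo2"},
    "FundB": {"EnergyCo2", "EnergyCo3"},
    "FundC": {"EnergyCo1", "EnergyCo2", "EnergyCo3"},
}

def project_companies(affiliations):
    """One-mode projection: two companies are linked when they share at
    least one shareholder; the edge weight counts the shared holders."""
    weights = {}
    for holder, companies in affiliations.items():
        for a, b in combinations(sorted(companies), 2):
            weights[(a, b)] = weights.get((a, b), 0) + 1
    return weights

edges = project_companies(holdings)
print(edges[("EnergyCo1", "EnergyCo2")])  # shared by FundA and FundC -> 2
```

Topological features on the node, edge and network level (degree, strength, density, and so on) can then be computed on the projected one-mode network.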
A Behavior-Analytic Account of Motivational Interviewing
ERIC Educational Resources Information Center
Christopher, Paulette J.; Dougher, Michael J.
2009-01-01
Several published reports have now documented the clinical effectiveness of motivational interviewing (MI). Despite its effectiveness, there are no generally accepted or empirically supported theoretical accounts of its effects. The theoretical accounts that do exist are mentalistic, descriptive, and not based on empirically derived behavioral…
Classification of Marital Relationships: An Empirical Approach.
ERIC Educational Resources Information Center
Snyder, Douglas K.; Smith, Gregory T.
1986-01-01
Derives an empirically based classification system of marital relationships, employing a multidimensional self-report measure of marital interaction. Spouses' profiles on the Marital Satisfaction Inventory for samples of clinic and nonclinic couples were subjected to cluster analysis, resulting in separate five-group typologies for husbands and…
Development of a detector model for generation of synthetic radiographs of cargo containers
NASA Astrophysics Data System (ADS)
White, Timothy A.; Bredt, Ofelia P.; Schweppe, John E.; Runkle, Robert C.
2008-05-01
Creation of synthetic cargo-container radiographs that possess attributes of their empirical counterparts requires accurate models of the imaging-system response. Synthetic radiographs serve as surrogate data in studies aimed at determining system effectiveness for detecting target objects when it is impractical to collect a large set of empirical radiographs. In the case where a detailed understanding of the detector system is available, an accurate detector model can be derived from first-principles. In the absence of this detail, it is necessary to derive empirical models of the imaging-system response from radiographs of well-characterized objects. Such a case is the topic of this work, where we demonstrate the development of an empirical model of a gamma-ray radiography system with the intent of creating a detector-response model that translates uncollided photon transport calculations into realistic synthetic radiographs. The detector-response model is calibrated to field measurements of well-characterized objects thus incorporating properties such as system sensitivity, spatial resolution, contrast and noise.
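A minimal sketch of such an empirical detector-response model might combine a linear sensitivity (gain), a short blur kernel for finite spatial resolution, and Poisson-like counting noise; all parameter values below are illustrative, not the calibrated model of the paper:

```python
import math
import random

random.seed(0)

def detector_response(signal, gain=100.0, blur=(0.25, 0.5, 0.25), noise=True):
    """Toy imaging-system response applied to a 1-D uncollided-transport
    profile: gain, edge-padded convolution with a blur kernel, and
    Gaussian noise approximating Poisson counting statistics."""
    padded = [signal[0]] + list(signal) + [signal[-1]]
    blurred = [
        sum(w * padded[i + k] for k, w in enumerate(blur))
        for i in range(len(signal))
    ]
    if not noise:
        return [gain * v for v in blurred]
    # Approximate Poisson counting noise by a Gaussian with variance = mean.
    return [gain * v + random.gauss(0.0, math.sqrt(max(gain * v, 0.0))) for v in blurred]

# Uncollided transmission across a step edge (e.g., a cargo-container wall):
ideal = [1.0, 1.0, 1.0, 0.2, 0.2, 0.2]
print([round(v, 6) for v in detector_response(ideal, noise=False)])
# -> [100.0, 100.0, 80.0, 40.0, 20.0, 20.0]: the blur softens the step
```

Calibration would then consist of tuning the gain, kernel and noise terms until synthetic radiographs of well-characterized objects match the field measurements.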
An Integrated Tone Mapping for High Dynamic Range Image Visualization
NASA Astrophysics Data System (ADS)
Liang, Lei; Pan, Jeng-Shyang; Zhuang, Yongjun
2018-01-01
There are two types of tone mapping operators for high dynamic range (HDR) image visualization. HDR images mapped by perceptual operators have a strong sense of realism but lose local details. Empirical operators can maximize the local detail information of an HDR image, but their realism is weaker. A common tone mapping operator suitable for all applications is not available. This paper proposes a novel integrated tone mapping framework which can achieve conversion between empirical operators and perceptual operators. In this framework, the empirical operator is rendered based on an improved saliency map, which simulates the visual attention mechanism of the human eye in natural scenes. The results of objective evaluation prove the effectiveness of the proposed solution.
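As an illustration of the perceptual side of this trade-off, a classic global operator in the style of Reinhard et al. (a sketch, not the integrated framework proposed here) normalizes by the log-average luminance and compresses with L/(1+L):

```python
import math

def reinhard_global(luminances, key=0.18):
    """Global perceptual-style tone mapping: scale each luminance by the
    scene's log-average, then compress with L/(1+L) into [0, 1)."""
    eps = 1e-6  # guard against log(0)
    log_avg = math.exp(sum(math.log(l + eps) for l in luminances) / len(luminances))
    out = []
    for l in luminances:
        scaled = key * l / log_avg
        out.append(scaled / (1.0 + scaled))  # maps [0, inf) into [0, 1)
    return out

hdr = [0.01, 0.5, 10.0, 5000.0]   # hypothetical HDR pixel luminances
ldr = reinhard_global(hdr)
print(all(0.0 <= v < 1.0 for v in ldr))  # True
```

The mapping is monotone, so pixel ordering is preserved; what such a global operator cannot do, and what the empirical side of the proposed framework targets, is boost local contrast.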
NASA Technical Reports Server (NTRS)
Habbal, Shadia Rifai; Esser, Ruth; Guhathakurta, Madhulika; Fisher, Richard
1995-01-01
Using the empirical constraints provided by observations in the inner corona and in interplanetary space, we derive the flow properties of the solar wind using a two-fluid model. Density and scale height temperatures are derived from white-light coronagraph observations on SPARTAN 201-1 and at Mauna Loa, from 1.16 to 5.5 R, in the two polar coronal holes on 11-12 Apr. 1993. Interplanetary measurements of the flow speed and proton mass flux are taken from the Ulysses south polar passage. By comparing the results of the model computations that fit the empirical constraints in the two coronal hole regions, we show how line-of-sight effects influence the empirical inferences and, subsequently, the corresponding numerical results.
Multiscale empirical modeling of the geomagnetic field: From storms to substorms
NASA Astrophysics Data System (ADS)
Stephens, G. K.; Sitnov, M. I.; Korth, H.; Gkioulidou, M.; Ukhorskiy, A. Y.; Merkin, V. G.
2017-12-01
An advanced version of the TS07D empirical geomagnetic field model, herein called SST17, is used to model the global picture of the geomagnetic field and its characteristic variations on both storm and substorm scales. The new SST17 model uses two regular expansions describing the equatorial currents, each having distinctly different scales: one corresponding to a thick and one to a thin current sheet relative to the thermal ion gyroradius. These expansions have an arbitrary distribution of currents in the equatorial plane that is constrained only by magnetometer data. This multiscale description allows one to reproduce the current sheet thinning during the growth phase. Additionally, the model uses a flexible description of field-aligned currents that reproduces their spiral structure at low altitudes and provides a continuous transition from region 1 to region 2 current systems. The empirical picture of substorms is obtained by combining magnetometer data from Geotail, THEMIS, Van Allen Probes, Cluster II, Polar, IMP-8, and GOES 8, 9, 10 and 12, and then binning these data by similar values of the auroral index AL, its time derivative, and the integral of the solar wind electric field parameter (from ACE, Wind, and IMP-8) over substorm time scales. The performance of the model is demonstrated for several events, including the 3 July 2012 substorm, which had multi-probe coverage, and a series of substorms during the March 2008 storm. It is shown that the AL binning helps reproduce dipolarization signatures in the northward magnetic field Bz, while the solar wind electric field integral allows one to capture the current sheet thinning during the growth phase. The model allows one to trace the substorm dipolarization from the tail to the inner magnetosphere, where the dipolarization of strongly stretched tail field lines causes a redistribution of the tail current, resulting in an enhancement of the partial ring current in the premidnight sector.
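The data-binning step can be illustrated along one axis: records are grouped by similar AL values (the full model also bins on the AL time derivative and the solar wind electric field integral; all numbers below are hypothetical):

```python
import bisect

def bin_by_state(records, al_edges):
    """Group magnetometer records into bins bounded by AL-index edges,
    one axis of the multi-dimensional binning described above."""
    bins = {i: [] for i in range(len(al_edges) + 1)}
    for al, b_field in records:
        bins[bisect.bisect(al_edges, al)].append(b_field)
    return bins

# Hypothetical (AL in nT, measured Bz in nT) pairs from several spacecraft.
records = [(-50, 12.0), (-300, 4.0), (-800, -2.0)]
bins = bin_by_state(records, al_edges=[-500, -100])
print(bins)  # {0: [-2.0], 1: [4.0], 2: [12.0]}
```

Each bin's accumulated measurements would then jointly constrain one fitted state of the empirical field model.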
Farrance, Ian; Frenkel, Robert
2014-01-01
The Guide to the Expression of Uncertainty in Measurement (usually referred to as the GUM) provides the basic framework for evaluating uncertainty in measurement. The GUM however does not always provide clearly identifiable procedures suitable for medical laboratory applications, particularly when internal quality control (IQC) is used to derive most of the uncertainty estimates. The GUM modelling approach requires advanced mathematical skills for many of its procedures, but Monte Carlo simulation (MCS) can be used as an alternative for many medical laboratory applications. In particular, calculations for determining how uncertainties in the input quantities to a functional relationship propagate through to the output can be accomplished using a readily available spreadsheet such as Microsoft Excel. The MCS procedure uses algorithmically generated pseudo-random numbers which are then forced to follow a prescribed probability distribution. When IQC data provide the uncertainty estimates the normal (Gaussian) distribution is generally considered appropriate, but MCS is by no means restricted to this particular case. With input variations simulated by random numbers, the functional relationship then provides the corresponding variations in the output in a manner which also provides its probability distribution. The MCS procedure thus provides output uncertainty estimates without the need for the differential equations associated with GUM modelling. The aim of this article is to demonstrate the ease with which Microsoft Excel (or a similar spreadsheet) can be used to provide an uncertainty estimate for measurands derived through a functional relationship. In addition, we also consider the relatively common situation where an empirically derived formula includes one or more ‘constants’, each of which has an empirically derived numerical value. 
Such empirically derived ‘constants’ must also have associated uncertainties which propagate through the functional relationship and contribute to the combined standard uncertainty of the measurand. PMID:24659835
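A minimal version of the MCS procedure, here in Python rather than a spreadsheet, for a hypothetical measurand y = k·x whose measured input x and empirically derived 'constant' k are both normally distributed:

```python
import random
import statistics

random.seed(42)

def mcs_uncertainty(func, means, sds, n=100_000):
    """Monte Carlo propagation: sample each input quantity from a normal
    distribution and evaluate the functional relationship many times;
    the spread of the outputs estimates the combined uncertainty."""
    outputs = [
        func(*[random.gauss(m, s) for m, s in zip(means, sds)])
        for _ in range(n)
    ]
    return statistics.mean(outputs), statistics.stdev(outputs)

# Hypothetical measurand y = k * x with k = 2.0 (u = 0.1), x = 10.0 (u = 0.5).
mean_y, u_y = mcs_uncertainty(lambda k, x: k * x, means=[2.0, 10.0], sds=[0.1, 0.5])
# First-order GUM propagation gives u = sqrt((x*u_k)^2 + (k*u_x)^2) ~ 1.414
print(round(mean_y, 1), round(u_y, 2))
```

With 10^5 trials the simulated standard uncertainty agrees with the first-order GUM estimate to within sampling error; swapping `random.gauss` for another generator handles non-Gaussian inputs, which is exactly where MCS outperforms the analytic approach.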
MDO and Cross-Disciplinary Practice in R&D: A Portrait of Principles and Current Practice
NASA Technical Reports Server (NTRS)
Rivas McGowan, Anna-Maria; Papalambros, Panos Y.; Baker, Wayne E.
2014-01-01
For several decades, Multidisciplinary Design Optimization (MDO) has played an important role in aerospace engineering by incorporating physics-based disciplinary models into integrated system or sub-system models for use in research, development (R&D), and design. This paper examines MDO's role in facilitating the integration of researchers from different single disciplines during R&D and early design of large-scale complex engineered systems (LaCES) such as aerospace systems. The findings in this paper are summarized from a larger study on interdisciplinary practices and perspectives that included considerable empirical data from surveys, interviews, and ethnography. The synthesized findings were derived by integrating the data with theories from organization science and engineering. The overarching finding is that issues related to cognition, organization, and social interrelations mostly dominate interactions across disciplines. Engineering issues, such as the integration of hardware or physics-based models, are not as significant. Correspondingly, the data showed that MDO is not the primary integrator of researchers working across disciplines during R&D and early design of LaCES. Cognitive focus such as analysis versus design, organizational challenges such as incentives, and social opportunities such as personal networks often drove the human interactive practices among researchers from different disciplines. Facilitation of the inherent confusion, argument, and learning in cross-disciplinary research was identified as one of several needed elements of enabling successful research across disciplines.
Advancing the integration of spatial data to map human and natural drivers on coral reefs
Gove, Jamison M.; Walecka, Hilary R.; Donovan, Mary K.; Williams, Gareth J.; Jouffray, Jean-Baptiste; Crowder, Larry B.; Erickson, Ashley; Falinski, Kim; Friedlander, Alan M.; Kappel, Carrie V.; Kittinger, John N.; McCoy, Kaylyn; Norström, Albert; Nyström, Magnus; Oleson, Kirsten L. L.; Stamoulis, Kostantinos A.; White, Crow; Selkoe, Kimberly A.
2018-01-01
A major challenge for coral reef conservation and management is understanding how a wide range of interacting human and natural drivers cumulatively impact and shape these ecosystems. Despite the importance of understanding these interactions, a methodological framework to synthesize spatially explicit data of such drivers is lacking. To fill this gap, we established a transferable data synthesis methodology to integrate spatial data on environmental and anthropogenic drivers of coral reefs, and applied this methodology to a case study location: the Main Hawaiian Islands (MHI). Environmental drivers were derived from time series (2002–2013) of climatological ranges and anomalies of remotely sensed sea surface temperature, chlorophyll-a, irradiance, and wave power. Anthropogenic drivers were characterized using empirically derived and modeled datasets of spatial fisheries catch, sedimentation, nutrient input, new development, habitat modification, and invasive species. Within our case study system, resulting driver maps showed high spatial heterogeneity across the MHI, with anthropogenic drivers generally greatest and most widespread on O‘ahu, where 70% of the state’s population resides, while sedimentation and nutrients were dominant in less populated islands. Together, the spatial integration of environmental and anthropogenic driver data described here provides a first-ever synthetic approach to visualize how the drivers of coral reef state vary in space and demonstrates a methodological framework for implementation of this approach in other regions of the world. By quantifying and synthesizing spatial drivers of change on coral reefs, we provide an avenue for further research to understand how drivers determine reef diversity and resilience, which can ultimately inform policies to protect coral reefs. PMID:29494613
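Deriving climatological means and anomalies from a time series, as done above for the environmental drivers, can be sketched as follows (the SST values are hypothetical):

```python
from collections import defaultdict

def monthly_anomalies(series):
    """series: list of (month, value) pairs. Returns (climatology, anomalies):
    the per-month long-term mean, and each value minus its month's mean."""
    by_month = defaultdict(list)
    for month, value in series:
        by_month[month].append(value)
    clim = {m: sum(v) / len(v) for m, v in by_month.items()}
    return clim, [(m, v - clim[m]) for m, v in series]

# Hypothetical monthly SST (deg C) for two Januaries and two Julys.
sst = [(1, 24.0), (7, 27.0), (1, 25.0), (7, 28.0)]
clim, anoms = monthly_anomalies(sst)
print(clim[1], anoms[0])  # 24.5 (1, -0.5)
```

Climatological ranges follow the same grouping with min/max instead of the mean; in the study these statistics are computed per satellite pixel rather than per station.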
NASA Astrophysics Data System (ADS)
Lopes, Sílvia R. C.; Prass, Taiane S.
2014-05-01
Here we present a theoretical study on the main properties of Fractionally Integrated Exponential Generalized Autoregressive Conditional Heteroskedastic (FIEGARCH) processes. We analyze the conditions for the existence, the invertibility, the stationarity and the ergodicity of these processes. We prove that, if {X_t} is a FIEGARCH(p,d,q) process then, under mild conditions, {ln(σ_t^2)} is an ARFIMA(q,d,0) process with correlated innovations, that is, an autoregressive fractionally integrated moving average process. The convergence order for the polynomial coefficients that describe the volatility is presented, and results related to the spectral representation and to the covariance structure of both processes {X_t} and {ln(σ_t^2)} are discussed. Expressions for the kurtosis and the asymmetry measures for any stationary FIEGARCH(p,d,q) process are also derived. The h-step-ahead forecasts for the processes {X_t}, {X_t^2} and {ln(σ_t^2)} are given, along with their respective mean square forecast errors. The work also presents a Monte Carlo simulation study showing how to generate, estimate and forecast based on six different FIEGARCH models. The forecasting performance of six models belonging to the class of autoregressive conditional heteroskedastic models (namely, ARCH-type models) and radial basis models is compared through an empirical application to the Brazilian stock market exchange index.
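The fractional integration shared by FIEGARCH and ARFIMA processes rests on the operator (1-B)^d, whose series coefficients obey a simple recursion; a sketch for d = 0.4:

```python
def frac_diff_weights(d, n_terms):
    """Coefficients of the fractional difference operator (1-B)^d expanded
    as a power series in the backshift operator B, via the recursion
    pi_0 = 1, pi_k = pi_{k-1} * (k - 1 - d) / k."""
    weights = [1.0]
    for k in range(1, n_terms):
        weights.append(weights[-1] * (k - 1 - d) / k)
    return weights

w = frac_diff_weights(0.4, 4)
print([round(x, 3) for x in w])  # [1.0, -0.4, -0.12, -0.064]
```

For 0 < d < 0.5 the weights decay hyperbolically rather than geometrically, which is the source of the long-memory behavior these processes model.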
Tree Guidelines for Inland Empire Communities
E.G. McPherson; J.R. Simpson; P.J. Peper; Q. Xiao; D.R. Pittenger; D.R. Hodel
2001-01-01
Communities in the Inland Empire region of California contain over 8 million people, or about 25% of the state's population. The region's inhabitants derive great benefit from trees because, compared to coastal areas, the summers are hotter and air pollution levels are higher. The region's climate is still mild enough to grow a diverse mix of trees. The Inland Empire's...
Dierssen, Heidi M
2010-10-05
Phytoplankton biomass and productivity have been continuously monitored from ocean color satellites for over a decade. Yet, the most widely used empirical approach for estimating chlorophyll a (Chl) from satellites can be in error by a factor of 5 or more. Such variability is due to differences in absorption and backscattering properties of phytoplankton and related concentrations of colored-dissolved organic matter (CDOM) and minerals. The empirical algorithms have built-in assumptions that follow the basic precept of biological oceanography--namely, oligotrophic regions with low phytoplankton biomass are populated with small phytoplankton, whereas more productive regions contain larger bloom-forming phytoplankton. With a changing world ocean, phytoplankton composition may shift in response to altered environmental forcing, and CDOM and mineral concentrations may become uncoupled from phytoplankton stocks, creating further uncertainty and error in the empirical approaches. Hence, caution is warranted when using empirically derived Chl to infer climate-related changes in ocean biology. The Southern Ocean is already experiencing climatic shifts and shows substantial errors in satellite-derived Chl for different phytoplankton assemblages. Accurate global assessments of phytoplankton will require improved technology and modeling, enhanced field observations, and ongoing validation of our "eyes in space."
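The widely used empirical approach referred to here is a band-ratio algorithm: a polynomial in the log of a blue-to-green reflectance ratio predicts log10(Chl). A sketch with hypothetical coefficients (operational products such as NASA's OC4 use carefully fitted ones):

```python
import math

def band_ratio_chl(rrs_blue, rrs_green, coeffs):
    """Generic empirical band-ratio chlorophyll algorithm:
    log10(Chl) = sum_i a_i * R**i, with R = log10(Rrs_blue / Rrs_green).
    The coefficients are illustrative, not an operational product."""
    r = math.log10(rrs_blue / rrs_green)
    log_chl = sum(a * r**i for i, a in enumerate(coeffs))
    return 10 ** log_chl

coeffs = [0.3, -3.0, 1.9]  # hypothetical fit coefficients
print(round(band_ratio_chl(0.004, 0.004, coeffs), 2))  # ratio 1 -> R = 0 -> ~2.0
```

The built-in assumption criticized in the abstract is visible here: the ratio responds to total blue absorption, so CDOM or minerals uncoupled from phytoplankton shift R and bias the retrieved Chl.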
Implementation of the AASHTO mechanistic-empirical pavement design guide for Colorado.
DOT National Transportation Integrated Search
2000-01-01
The objective of this project was to integrate the American Association of State Highway and Transportation Officials (AASHTO) Mechanistic-Empirical Pavement Design Guide, Interim Edition: A Manual of Practice and its accompanying software into the d...
NASA Technical Reports Server (NTRS)
Peddle, Derek R.; Huemmrich, K. Fred; Hall, Forrest G.; Masek, Jeffrey G.; Soenen, Scott A.; Jackson, Chris D.
2011-01-01
Canopy reflectance model inversion using look-up table approaches provides powerful and flexible options for deriving improved forest biophysical structural information (BSI) compared with traditional statistical empirical methods. The BIOPHYS algorithm is an improved, physically-based inversion approach for deriving BSI for independent use and validation and for monitoring, inventory and quantifying forest disturbance as well as input to ecosystem, climate and carbon models. Based on the multiple-forward mode (MFM) inversion approach, BIOPHYS results were summarized from different studies (Minnesota/NASA COVER; Virginia/LEDAPS; Saskatchewan/BOREAS), sensors (airborne MMR; Landsat; MODIS) and models (GeoSail; GOMS). Applications output included forest density, height, crown dimension, branch and green leaf area, canopy cover, disturbance estimates based on multi-temporal chronosequences, and structural change following recovery from forest fires over the last century. Good correspondences with validation field data were obtained. Integrated analyses of multiple solar and view angle imagery further improved retrievals compared with single pass data. Quantifying ecosystem dynamics such as the area and percent of forest disturbance, early regrowth and succession provide essential inputs to process-driven models of carbon flux. BIOPHYS is well suited for large-area, multi-temporal applications involving multiple image sets and mosaics for assessing vegetation disturbance and quantifying biophysical structural dynamics and change. It is also suitable for integration with forest inventory, monitoring, updating, and other programs.
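The look-up-table inversion underlying approaches like BIOPHYS can be sketched as a nearest-spectrum search over precomputed forward-model runs (all values below are hypothetical):

```python
def lut_invert(observed, lut):
    """Look-up-table inversion sketch: lut maps candidate biophysical
    parameter tuples to forward-modelled reflectance spectra; return the
    parameters whose spectrum is closest in sum-of-squares distance."""
    def sse(spectrum):
        return sum((o - s) ** 2 for o, s in zip(observed, spectrum))
    return min(lut, key=lambda params: sse(lut[params]))

# Hypothetical LUT: (canopy cover, height in m) -> reflectance in two bands.
lut = {
    (0.2, 5.0): [0.10, 0.30],
    (0.5, 10.0): [0.08, 0.40],
    (0.8, 15.0): [0.05, 0.50],
}
print(lut_invert([0.079, 0.41], lut))  # (0.5, 10.0)
```

In a multiple-forward-mode setup the table is populated by running a canopy reflectance model (e.g., GeoSail or GOMS) over a grid of structural parameters, and multiple view/solar angles simply add columns to each spectrum.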
NASA Astrophysics Data System (ADS)
Rudek, Benedikt; Bennett, Daniel; Bug, Marion U.; Wang, Mingjie; Baek, Woon Yong; Buhr, Ticia; Hilgers, Gerhard; Champion, Christophe; Rabus, Hans
2016-09-01
For track structure simulations in the Bragg peak region, measured electron emission cross sections of DNA constituents are required as input for developing parameterized model functions representing the scattering probabilities. In the present work, double differential cross sections were measured for the electron emission from vapor-phase pyrimidine, tetrahydrofuran, and trimethyl phosphate that are structural analogues to the base, the sugar, and the phosphate residue of the DNA, respectively. The range of proton energies was from 75 keV to 135 keV, the angles ranged from 15° to 135°, and the electron energies were measured from 10 eV to 200 eV. Single differential and total electron emission cross sections are derived by integration over angle and electron energy and compared to the semi-empirical Hansen-Kocbach-Stolterfoht (HKS) model and a quantum mechanical calculation employing the first Born approximation with corrected boundary conditions (CB1). The CB1 provides the best prediction of double and single differential cross section, while total cross sections can be fitted with semi-empirical models. The cross sections of the three samples are proportional to their total number of valence electrons.
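The integration over angle that reduces a double differential cross section to a single differential one can be sketched with the trapezoid rule; an isotropic test case must recover the full solid angle 4π:

```python
import math

def integrate_ddcs_over_angle(thetas_deg, ddcs):
    """Single differential cross section from a double differential one:
    sigma = 2*pi * integral of ddcs(theta) * sin(theta) dtheta,
    evaluated with the composite trapezoid rule on measured angles."""
    th = [math.radians(t) for t in thetas_deg]
    f = [d * math.sin(t) for d, t in zip(ddcs, th)]
    return 2 * math.pi * sum(
        0.5 * (f[i] + f[i + 1]) * (th[i + 1] - th[i]) for i in range(len(th) - 1)
    )

# Sanity check: an isotropic emission of 1 (arbitrary units) per steradian
# integrated over all angles gives the full solid angle, 4*pi.
thetas = list(range(0, 181, 5))
sigma = integrate_ddcs_over_angle(thetas, [1.0] * len(thetas))
print(abs(sigma - 4 * math.pi) < 0.05)  # True
```

In practice the measured range (15° to 135° here) does not cover all angles, so the missing angular ranges must be extrapolated with a model before such an integration; a second integration over electron energy then yields the total cross section.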
The role of U.S. states in facilitating effective water governance under stress and change
NASA Astrophysics Data System (ADS)
Kirchhoff, Christine J.; Dilling, Lisa
2016-04-01
Worldwide water governance failures undermine effective water management under uncertainty and change. Overcoming these failures requires employing more adaptive, resilient water management approaches; yet, while scholars have advanced theories of what adaptive, resilient approaches should be, there is little empirical evidence to support those normative propositions. To fill this gap, we reviewed the literature to derive theorized characteristics of adaptive, resilient water governance, including knowledge generation and use, participation, clear rules for water use, and incorporation of nonstationarity. Then, using interviews and documentary analysis focused on five U.S. states' allocation and planning approaches, we examined empirically whether embodying these characteristics made states more (or less) adaptive and resilient in practice. We found that adaptive, resilient water governance requires not just possessing these characteristics but combining and building on them. That is, adaptive, resilient water governance requires well-funded, transparent knowledge systems combined with broad, multilevel participatory processes that support learning, strong institutional arrangements that establish authorities and rules and that allow flexibility as conditions change, and resources for integrated planning and allocation. We also found that difficulty incorporating climate change or altering existing water governance paradigms and inadequate funding of water programs undermine adaptive, resilient governance.
Clinical decision support alert malfunctions: analysis and empirically derived taxonomy.
Wright, Adam; Ai, Angela; Ash, Joan; Wiesen, Jane F; Hickman, Thu-Trang T; Aaron, Skye; McEvoy, Dustin; Borkowsky, Shane; Dissanayake, Pavithra I; Embi, Peter; Galanter, William; Harper, Jeremy; Kassakian, Steve Z; Ramoni, Rachel; Schreiber, Richard; Sirajuddin, Anwar; Bates, David W; Sittig, Dean F
2018-05-01
To develop an empirically derived taxonomy of clinical decision support (CDS) alert malfunctions. We identified CDS alert malfunctions using a mix of qualitative and quantitative methods: (1) site visits with interviews of chief medical informatics officers, CDS developers, clinical leaders, and CDS end users; (2) surveys of chief medical informatics officers; (3) analysis of CDS firing rates; and (4) analysis of CDS overrides. We used a multi-round, manual, iterative card sort to develop a multi-axial, empirically derived taxonomy of CDS malfunctions. We analyzed 68 CDS alert malfunction cases from 14 sites across the United States with diverse electronic health record systems. Four primary axes emerged: the cause of the malfunction, its mode of discovery, when it began, and how it affected rule firing. Build errors, conceptualization errors, and the introduction of new concepts or terms were the most frequent causes. User reports were the predominant mode of discovery. Many malfunctions within our database caused rules to fire for patients for whom they should not have (false positives), but the reverse (false negatives) was also common. Across organizations and electronic health record systems, similar malfunction patterns recurred. Challenges included updates to code sets and values, software issues at the time of system upgrades, difficulties with migration of CDS content between computing environments, and the challenge of correctly conceptualizing and building CDS. CDS alert malfunctions are frequent. The empirically derived taxonomy formalizes the common recurring issues that cause these malfunctions, helping CDS developers anticipate and prevent CDS malfunctions before they occur or detect and resolve them expediently.
ERIC Educational Resources Information Center
Poitras, Eric; Trevors, Gregory
2012-01-01
Planning, conducting, and reporting leading-edge research requires professionals who are capable of highly skilled reading. This study reports the development of an empirically informed computer-based learning environment designed to foster the acquisition of reading comprehension strategies that mediate expertise in the social sciences. Empirical…
Asymmetrical Integration: Lessons from a Railway Empire.
McDonald, Kate
2015-01-01
This article reexamines railway imperialism in Manchuria from the perspective of global network building. Through a case study of the Japanese-owned South Manchuria Railway Company (SMR), I trace how one railway empire used through traffic agreements to integrate Northeast Asian railways into a global network while at the same time installing itself as the necessary intermediary between European and Asian overland traffic. I argue that the SMR's pursuit of global reach and local dominance compels us to reconsider the traditional division of border-crossing railways into international and imperialist types, and instead to examine how border-crossing railways contributed to the uneven or "asymmetrical" integration of the global transportation infrastructure.
Data-driven regions of interest for longitudinal change in frontotemporal lobar degeneration.
Pankov, Aleksandr; Binney, Richard J; Staffaroni, Adam M; Kornak, John; Attygalle, Suneth; Schuff, Norbert; Weiner, Michael W; Kramer, Joel H; Dickerson, Bradford C; Miller, Bruce L; Rosen, Howard J
2016-01-01
Current research is investigating the potential utility of longitudinal measurement of brain structure as a marker of drug effect in clinical trials for neurodegenerative disease. Recent studies in Alzheimer's disease (AD) have shown that measurement of change in empirically derived regions of interest (ROIs) allows more reliable measurement of change over time compared with regions chosen a priori based on known effects of AD on brain anatomy. Frontotemporal lobar degeneration (FTLD) is a devastating neurodegenerative disorder for which there are no approved treatments. The goal of this study was to identify an empirical ROI that maximizes the effect size for the annual rate of brain atrophy in FTLD compared with healthy age-matched controls, and to estimate the effect size and associated power estimates for a theoretical study that would use change within this ROI as an outcome measure. Eighty-six patients with FTLD were studied, including 43 who were imaged twice at 1.5 T and 43 at 3 T, along with 105 controls (37 imaged at 1.5 T and 67 at 3 T). Empirically derived maps of change were generated separately for each field strength and included the bilateral insula, dorsolateral, medial and orbital frontal, basal ganglia and lateral and inferior temporal regions. The extent of regions included in the 3 T map was larger than that in the 1.5 T map. At both field strengths, the effect sizes for imaging were larger than for any clinical measures. At 3 T, the effect size for longitudinal change measured within the empirically derived ROI was larger than the effect sizes derived from frontal lobe, temporal lobe or whole brain ROIs. The effect size derived from the data-driven 1.5 T map was smaller than at 3 T, and was not larger than the effect size derived from a priori ROIs. It was estimated that measurement of longitudinal change using 1.5 T MR systems requires approximately a 3-fold increase in sample size to obtain effect sizes equivalent to those seen at 3 T. 
While the results should be confirmed in additional datasets, these results indicate that empirically derived ROIs can reduce the number of subjects needed for a longitudinal study of drug effects in FTLD compared with a priori ROIs. Field strength may have a significant impact on the utility of imaging for measuring longitudinal change.
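The reported ~3-fold sample-size penalty follows from the usual normal-approximation power calculation: required n scales as 1/d², so it is consistent with a 1.5 T effect size roughly 1/√3 of the 3 T one (the effect sizes below are illustrative, not the study's estimates):

```python
import math

def n_per_group(effect_size, z_alpha=1.96, z_beta=0.84):
    """Two-sample normal-approximation sample size per group for 80% power
    at two-sided alpha = 0.05 (z values hard-coded for that case)."""
    return math.ceil(2 * ((z_alpha + z_beta) / effect_size) ** 2)

# n scales as 1/d^2, so an effect size smaller by sqrt(3) needs ~3x subjects.
d_3t = 1.0  # hypothetical effect size at 3 T
print(n_per_group(d_3t), n_per_group(d_3t / math.sqrt(3)))  # 16 48
```

The same calculation explains why the imaging ROIs, having larger effect sizes than the clinical measures, translate directly into smaller required trial enrollments.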
NASA Astrophysics Data System (ADS)
Claure, Yuri Navarro; Matsubara, Edson Takashi; Padovani, Carlos; Prati, Ronaldo Cristiano
2018-03-01
Traditional methods for estimating timing parameters in hydrological science require a rigorous study of the relations of flow resistance, slope, flow regime, watershed size, water velocity, and other local variables. These studies are mostly based on empirical observations, where the timing parameter is estimated using empirically derived formulas. The application of these studies to other locations is not always direct. The locations in which the equations are used should have characteristics comparable to those of the locations from which the equations were derived. To overcome this barrier, in this work we developed a data-driven approach to estimate timing parameters such as travel time. Our proposal estimates timing parameters using historical data from the location itself, without the need to adapt or use empirical formulas from other locations. The proposal uses only one variable measured at two different locations on the same river (for instance, two river-level measurements, one upstream and the other downstream). The data recorded at each location generate two time series. Our method aligns these two time series using derivative dynamic time warping (DDTW) and perceptually important points (PIP). From the resulting timing data, a polynomial function is induced, yielding a polynomial water travel time estimator called PolyWaTT. To evaluate the potential of our proposal, we applied PolyWaTT to three different watersheds: a floodplain ecosystem located in the part of Brazil known as the Pantanal, the world's largest tropical wetland area; and the Missouri River and the Pearl River, in the United States of America. We compared our proposal with empirical formulas and a data-driven state-of-the-art method. 
The experimental results demonstrate that PolyWaTT showed a lower mean absolute error than all other methods tested in this study, and for longer distances the mean absolute error achieved by PolyWaTT is three times smaller than empirical formulas.
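The alignment step named in the abstract rests on dynamic time warping; the sketch below implements classic DTW applied to first differences (the "derivative" variant), and uses the mean index offset of the optimal path as a crude travel-time proxy between an upstream and a downstream series. This is an illustrative reconstruction under stated assumptions, not the authors' PolyWaTT implementation, and it omits the PIP reduction step.

```python
import numpy as np

def dtw_path(a, b):
    """Classic dynamic time warping: return the optimal alignment path
    between two 1-D series a and b (squared-difference local cost)."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = (a[i - 1] - b[j - 1]) ** 2
            cost[i, j] = d + min(cost[i - 1, j],
                                 cost[i, j - 1],
                                 cost[i - 1, j - 1])
    # Backtrack from (n, m) to recover the warping path.
    path, i, j = [], n, m
    while i > 0 and j > 0:
        path.append((i - 1, j - 1))
        step = np.argmin([cost[i - 1, j - 1], cost[i - 1, j], cost[i, j - 1]])
        if step == 0:
            i, j = i - 1, j - 1
        elif step == 1:
            i -= 1
        else:
            j -= 1
    return path[::-1]

def mean_lag(upstream, downstream):
    """Align first differences (derivative DTW) and return the mean
    index lag of the downstream series, in sampling intervals."""
    path = dtw_path(np.diff(upstream), np.diff(downstream))
    return float(np.mean([j - i for i, j in path]))
```

The O(n·m) dynamic program is adequate for the short record windows implied here; a shifted copy of a signal should recover a lag close to the true shift, apart from the pinned path endpoints.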
Empirical algorithms for ocean optics parameters
NASA Astrophysics Data System (ADS)
Smart, Jeffrey H.
2007-06-01
As part of the Worldwide Ocean Optics Database (WOOD) Project, The Johns Hopkins University Applied Physics Laboratory has developed and evaluated a variety of empirical models that can predict ocean optical properties, such as profiles of the beam attenuation coefficient computed from profiles of the diffuse attenuation coefficient. In this paper, we briefly summarize published empirical optical algorithms and assess their accuracy for estimating derived profiles. We also provide new algorithms and discuss their applicability for deriving optical profiles based on data collected from a variety of locations, including the Yellow Sea, the Sea of Japan, and the North Atlantic Ocean. We show that the scattering coefficient (b) can be computed from the beam attenuation coefficient (c) to about 10% accuracy. The availability of such relatively accurate predictions is important in the many situations where the set of data is incomplete.
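A relation of the kind reported here (scattering coefficient b predicted from beam attenuation c to about 10%) is typically a simple linear empirical model. The sketch below uses a hypothetical slope K and pure-seawater offset C_W; these placeholder values are assumptions for illustration only, not the WOOD algorithm coefficients.

```python
import numpy as np

# Hypothetical calibration: once a water offset is removed, scattering
# often dominates beam attenuation in coastal waters.  K and C_W below
# are illustrative placeholders, not values from the WOOD project.
K, C_W = 0.85, 0.04  # dimensionless slope; water offset (1/m)

def scattering_from_attenuation(c):
    """Estimate the scattering coefficient b (1/m) from a profile of
    the beam attenuation coefficient c (1/m) with a linear model."""
    c = np.asarray(c, dtype=float)
    return np.clip(K * (c - C_W), 0.0, None)  # b cannot be negative

def relative_error(b_est, b_true):
    """Mean absolute relative error, the accuracy figure quoted (~10%)."""
    b_est, b_true = np.asarray(b_est), np.asarray(b_true)
    return float(np.mean(np.abs(b_est - b_true) / b_true))
```

Given a profile of measured c, the same one-line model can fill in a missing b profile, and relative_error quantifies how well it does against any co-located measurements.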
Cheng, Xuemin; Yang, Yikang; Hao, Qun
2016-01-01
The thermal environment is an important factor in the design of optical systems. This study investigated the thermal analysis technology of optical systems for navigation guidance and control in supersonic aircraft by developing empirical equations for the front temperature gradient and rear thermal diffusion distance, and for basic factors such as flying parameters and the structure of the optical system. Finite element analysis (FEA) was used to study the relationship between flying and front dome parameters and the system temperature field. Systematic deduction was then conducted based on the effects of the temperature field on the physical geometry and ray tracing performance of the front dome and rear optical lenses, by deriving the relational expressions between the system temperature field and the spot size and positioning precision of the rear optical lens. The optical systems used for navigation guidance and control in supersonic aircraft when the flight speed is in the range of 1–5 Ma were analysed using the derived equations. Using this new method it was possible to control the precision within 10% when considering the light spot received by the four-quadrant detector, and computation time was reduced compared with the traditional method of separately analysing the temperature field of the front dome and rear optical lens using FEA. Thus, the method can effectively increase the efficiency of parameter analysis and computation in an airborne optical system, facilitating the systematic, effective and integrated thermal analysis of airborne optical systems for navigation guidance and control. PMID:27763515
NASA Astrophysics Data System (ADS)
Baaquie, Belal E.; Liang, Cui
2007-01-01
The quantum finance pricing formulas for coupon bond options and swaptions derived by Baaquie [Phys. Rev. E 75, 016703 (2006)] are reviewed. We empirically study the swaption market and propose an efficient computational procedure for analyzing the data. Empirical results of the swaption price, volatility, and swaption correlation are compared with the predictions of quantum finance. The quantum finance model generates the market swaption price to over 90% accuracy.
NASA Astrophysics Data System (ADS)
Ghysels, M.; Mondelain, D.; Kassi, S.; Nikitin, A. V.; Rey, M.; Campargue, A.
2018-07-01
The methane absorption spectrum is studied at 297 K and 80 K in the center of the Tetradecad between 5695 and 5850 cm-1. The spectra are recorded by differential absorption spectroscopy (DAS) with a noise equivalent absorption of about αmin ≈ 1.5 × 10-7 cm-1. Two empirical line lists are constructed including about 4000 and 2300 lines at 297 K and 80 K, respectively. Lines due to 13CH4 present in natural abundance were identified by comparison with a spectrum of pure 13CH4 recorded under the same temperature conditions. About 1700 empirical values of the lower state energy level, Eemp, were derived from the ratios of the line intensities at 80 K and 296 K. They provide accurate temperature dependence for most of the absorption in the region (93% and 82% at 80 K and 296 K, respectively). The quality of the derived empirical values is illustrated by the clear propensity of the corresponding lower state rotational quantum number, Jemp, to be close to integer values. Using an effective Hamiltonian model derived from a previously published ab initio potential energy surface, about 2060 lines are rovibrationally assigned, adding about 1660 new assignments to those provided in the HITRAN database for 12CH4 in the region.
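The two-temperature method behind the Eemp values follows from the Boltzmann factor in the line intensity, S(T) ∝ exp(−c2·E″/T)/Q(T), so the 80 K/296 K intensity ratio can be inverted for E″. The sketch below neglects the stimulated-emission and frequency factors, and the partition-function ratio is passed in as a placeholder input rather than taken from any real CH4 partition sum.

```python
import math

C2 = 1.4388  # second radiation constant, cm*K

def intensity_ratio(e_lower, q_ratio_296_80):
    """Forward model: predicted S(80 K)/S(296 K) for a lower-state
    energy e_lower (cm^-1).  q_ratio_296_80 = Q(296 K)/Q(80 K) is a
    placeholder input, not a tabulated CH4 value."""
    delta = 1.0 / 80.0 - 1.0 / 296.0
    return q_ratio_296_80 * math.exp(-C2 * e_lower * delta)

def lower_state_energy(ratio_80_296, q_ratio_296_80):
    """Invert the measured intensity ratio for E'' (cm^-1)."""
    delta = 1.0 / 80.0 - 1.0 / 296.0
    return -math.log(ratio_80_296 / q_ratio_296_80) / (C2 * delta)
```

Because the forward and inverse expressions are exact mirrors of each other, a round trip through both recovers the input energy, which is the consistency the abstract checks via the near-integer Jemp propensity.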
Mohr, Philip; Golley, Sinéad
2016-01-25
This study examined community responses to use of genetically modified (GM) content in food in the context of responses to familiar food additives by testing an empirically and theoretically derived model of the predictors of responses to both GM content and food integrity issues generally. A nationwide sample of 849 adults, selected at random from the Australian Electoral Roll, responded to a postal Food and Health Survey. Structural equation modelling analyses confirmed that ratings of general concern about food integrity (related to the presence of preservatives and other additives) strongly predicted negativity towards GM content. Concern about food integrity was, in turn, predicted by environmental concern and health engagement. In addition, both concern about food integrity generally and responses to GM content specifically were weakly predicted by attitudes to benefits of science and an intuitive (i.e., emotionally-based) reasoning style. Data from a follow-up survey conducted under the same conditions (N=1184) revealed that ratings of concern were significantly lower for use of genetic engineering in food than for four other common food integrity issues examined. Whereas the question of community responses to GM is often treated as a special issue, these findings support the conclusion that responses to the concept of GM content in food in Australia are substantially a specific instance of a general sensitivity towards the integrity of the food supply. They indicate that the origins of responses to GM content may be largely indistinguishable from those of general responses to preservatives and other common food additives. Copyright © 2015 Elsevier B.V. All rights reserved.
ERIC Educational Resources Information Center
Keane, Terence M.; And Others
1984-01-01
Developed empirically based criteria for use of the Minnesota Multiphasic Personality Inventory (MMPI) to aid in the assessment and diagnosis of Posttraumatic Stress Disorder (PTSD) in patients (N=200). Analysis based on an empirically derived decision rule correctly classified 74 percent of the patients in each group. (LLL)
An Empirical Typology of Narcissism and Mental Health in Late Adolescence
ERIC Educational Resources Information Center
Lapsley, Daniel K.; Aalsma, Matthew C.
2006-01-01
A two-step cluster analytic strategy was used in two studies to identify an empirically derived typology of narcissism in late adolescence. In Study 1, late adolescents (N=204) responded to the profile of narcissistic dispositions and measures of grandiosity ("superiority") and idealization ("goal instability") inspired by Kohut's theory,…
ERIC Educational Resources Information Center
Gillespie, Ann
2014-01-01
Introduction: This research is the first to investigate the experiences of teacher-librarians as evidence-based practice. An empirically derived model is presented in this paper. Method: This qualitative study utilised the expanded critical incident approach, and investigated the real-life experiences of fifteen Australian teacher-librarians,…
Evaluating the intersection of a regional wildlife connectivity network with highways
Samuel A. Cushman; Jesse S. Lewis; Erin L. Landguth
2013-01-01
Reliable predictions of regional-scale population connectivity are needed to prioritize conservation actions. However, there have been few examples of regional connectivity models that are empirically derived and validated. The central goals of this paper were to (1) evaluate the effectiveness of factorial least cost path corridor mapping on an empirical...
Rayne, Sierra; Forest, Kaya; Friesen, Ken J
2009-08-01
A quantitative structure-activity model has been validated for estimating congener specific gas-phase hydroxyl radical reaction rates for perfluoroalkyl sulfonic acids (PFSAs), carboxylic acids (PFCAs), aldehydes (PFAls) and dihydrates, fluorotelomer olefins (FTOls), alcohols (FTOHs), aldehydes (FTAls), and acids (FTAcs), and sulfonamides (SAs), sulfonamidoethanols (SEs), and sulfonamido carboxylic acids (SAAs), and their alkylated derivatives based on calculated semi-empirical PM6 method ionization potentials. Corresponding gas-phase reaction rates with nitrate radicals and ozone have also been estimated using the computationally derived ionization potentials. Henry's law constants for these classes of perfluorinated compounds also appear to be reasonably approximated by the SPARC software program, thereby allowing estimation of wet and dry atmospheric deposition rates. Both congener specific gas-phase atmospheric and air-water interface fractionation of these compounds is expected, complicating current source apportionment perspectives and necessitating integration of such differential partitioning influences into future multimedia models. The findings will allow development and refinement of more accurate and detailed local through global scale atmospheric models for the atmospheric fate of perfluoroalkyl compounds.
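A one-descriptor QSAR of the kind described here is usually a linear fit of log rate constant against computed ionization potential. The sketch below fits that form to made-up (IP, log k) pairs purely to illustrate the shape of the model; the coefficients are not the paper's PM6-based values.

```python
import numpy as np

def fit_log_rate_vs_ip(ip_ev, log_k):
    """Least-squares fit of log10(k_OH) = a*IP + b, the usual
    one-descriptor QSAR form for gas-phase OH reaction rates."""
    a, b = np.polyfit(np.asarray(ip_ev, float),
                      np.asarray(log_k, float), 1)
    return a, b

def predict_log_rate(ip_ev, a, b):
    """Apply the fitted line to new ionization potentials (eV)."""
    return a * np.asarray(ip_ev, float) + b
```

Higher ionization potential generally means slower OH attack, so the fitted slope on real data would be negative; the same two functions would extend directly to the nitrate-radical and ozone rates the abstract also estimates.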
Constituents of Music and Visual-Art Related Pleasure – A Critical Integrative Literature Review
Tiihonen, Marianne; Brattico, Elvira; Maksimainen, Johanna; Wikgren, Jan; Saarikallio, Suvi
2017-01-01
The present literature review investigated how pleasure induced by music and visual-art has been conceptually understood in empirical research over the past 20 years. After an initial selection of abstracts from seven databases (keywords: pleasure, reward, enjoyment, and hedonic), twenty music and eleven visual-art papers were systematically compared. The following questions were addressed: (1) What is the role of the keyword in the research question? (2) Is pleasure considered a result of variation in the perceiver’s internal or external attributes? (3) What are the most commonly employed methods and main variables in empirical settings? Based on these questions, our critical integrative analysis aimed to identify which themes and processes emerged as key features for conceptualizing art-induced pleasure. The results demonstrated great variance in how pleasure has been approached: In the music studies pleasure was often a clear object of investigation, whereas in the visual-art studies the term was often embedded into the context of an aesthetic experience, or used otherwise in a descriptive, indirect sense. Music studies often targeted different emotions, their intensity or anhedonia. Biographical and background variables and personality traits of the perceiver were often measured. Next to behavioral methods, a common method was brain imaging which often targeted the reward circuitry of the brain in response to music. Visual-art pleasure was also frequently addressed using brain imaging methods, but the research focused on sensory cortices rather than the reward circuit alone. Compared with music research, visual-art research investigated more frequently pleasure in relation to conscious, cognitive processing, where the variations of stimulus features and the changing of viewing modes were regarded as explanatory factors of the derived experience. 
Despite valence being frequently applied in both domains, we conclude that in empirical music research pleasure seems to be part of core affect and hedonic tone modulated by stable personality variables, whereas in visual-art research pleasure is a result of the so-called conceptual act, depending on a chosen strategy to approach art. We encourage an integration of music and visual-art into a multi-modal framework to promote a more versatile understanding of pleasure in response to aesthetic artifacts. PMID:28775697
Post, Brady; Buchmueller, Tom; Ryan, Andrew M
2017-08-01
Hospital-physician vertical integration is on the rise. While increased efficiencies may be possible, emerging research raises concerns about anticompetitive behavior, spending increases, and uncertain effects on quality. In this review, we bring together several of the key theories of vertical integration that exist in the neoclassical and institutional economics literatures and apply these theories to the hospital-physician relationship. We also conduct a literature review of the effects of vertical integration on prices, spending, and quality in the growing body of evidence ( n = 15) to evaluate which of these frameworks have the strongest empirical support. We find some support for vertical foreclosure as a framework for explaining the observed results. We suggest a conceptual model and identify directions for future research. Based on our analysis, we conclude that vertical integration poses a threat to the affordability of health services and merits special attention from policymakers and antitrust authorities.
Enhancing the Impact of Family Justice Centers via Motivational Interviewing: An Integrated Review.
Simmons, Catherine A; Howell, Kathryn H; Duke, Michael R; Beck, J Gayle
2016-12-01
The Family Justice Center (FJC) model is an approach to assisting survivors of intimate partner violence (IPV) that focuses on integration of services under one roof and co-location of staff members from a range of multidisciplinary agencies. Even though the FJC model is touted as a best practice strategy to help IPV survivors, empirical support for the effectiveness of this approach is scarce. The current article consolidates this small yet promising body of empirically based literature in a clinically focused review. Findings point to the importance of integrating additional resources into the FJC model to engage IPV survivors who have ambivalent feelings about whether to accept help, leave the abusive relationship, and/or participate in criminal justice processes to hold the offender accountable. One such resource, motivational interviewing (MI), holds promise in aiding IPV survivors with these decisions, but empirical investigation into how MI can be incorporated into the FJC model has yet to be published. This article, therefore, also integrates the body of literature supporting the FJC model with the body of literature supporting MI with IPV survivors. Implications for practice, policy, and research are incorporated throughout this review. © The Author(s) 2015.
Fire risk in San Diego County, California: A weighted Bayesian model approach
Kolden, Crystal A.; Weigel, Timothy J.
2007-01-01
Fire risk models are widely utilized to mitigate wildfire hazards, but models are often based on expert opinions of less understood fire-ignition and spread processes. In this study, we used an empirically derived weights-of-evidence model to assess what factors produce fire ignitions east of San Diego, California. We created and validated a dynamic model of fire-ignition risk based on land characteristics and existing fire-ignition history data, and predicted ignition risk for a future urbanization scenario. We then combined our empirical ignition-risk model with a fuzzy fire behavior-risk model developed by wildfire experts to create a hybrid model of overall fire risk. We found that roads influence fire ignitions and that future growth will increase risk in new rural development areas. We conclude that empirically derived risk models and hybrid models offer an alternative method to assess current and future fire risk based on management actions.
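The weights-of-evidence step can be sketched as contrasting the conditional probabilities of a binary evidence layer (e.g., proximity to a road) given presence versus absence of ignitions. The W+ / W− formulas below are the standard form of the method applied to toy cell counts; they are not the San Diego model's fitted weights.

```python
import math

def weights_of_evidence(n_both, n_event, n_evidence, n_total):
    """Positive and negative weights for one binary evidence layer.
    n_both: cells with evidence AND an ignition; n_event: cells with
    an ignition; n_evidence: cells with evidence; n_total: all cells."""
    p_b_given_d = n_both / n_event                       # P(B | D)
    p_b_given_not_d = (n_evidence - n_both) / (n_total - n_event)
    w_plus = math.log(p_b_given_d / p_b_given_not_d)
    w_minus = math.log((1.0 - p_b_given_d) / (1.0 - p_b_given_not_d))
    return w_plus, w_minus
```

Summing the applicable weights over all layers (plus the prior log-odds) gives each cell's posterior ignition-risk score; a layer that is independent of ignitions contributes weights of zero.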
Krieger, Jonathan D
2014-08-01
I present a protocol for creating geometric leaf shape metrics to facilitate widespread application of geometric morphometric methods to leaf shape measurement. • To quantify circularity, I created a novel shape metric in the form of the vector between a circle and a line, termed geometric circularity. Using leaves from 17 fern taxa, I performed a coordinate-point eigenshape analysis to empirically identify patterns of shape covariation. I then compared the geometric circularity metric to the empirically derived shape space and the standard metric, circularity shape factor. • The geometric circularity metric was consistent with empirical patterns of shape covariation and appeared more biologically meaningful than the standard approach, the circularity shape factor. The protocol described here has the potential to make geometric morphometrics more accessible to plant biologists by generalizing the approach to developing synthetic shape metrics based on classic, qualitative shape descriptors.
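The baseline metric the geometric approach is compared against, the circularity shape factor, is the standard 4πA/P². A minimal sketch for a leaf outline given as a closed polygon, with the area from the shoelace formula; the function names are illustrative, not from the paper's protocol.

```python
import math

def polygon_area_perimeter(points):
    """Shoelace area and perimeter of a closed polygon given as a list
    of (x, y) vertices (the last vertex implicitly joins the first)."""
    area2, perim = 0.0, 0.0
    n = len(points)
    for k in range(n):
        x0, y0 = points[k]
        x1, y1 = points[(k + 1) % n]
        area2 += x0 * y1 - x1 * y0
        perim += math.hypot(x1 - x0, y1 - y0)
    return abs(area2) / 2.0, perim

def circularity_shape_factor(points):
    """4*pi*A / P**2: exactly 1.0 for a circle, smaller for elongated
    or lobed outlines such as dissected fern leaves."""
    area, perim = polygon_area_perimeter(points)
    return 4.0 * math.pi * area / perim ** 2
```

A unit square scores π/4 ≈ 0.785, and a many-sided regular polygon approaches 1 from below, which is the behavior the empirically derived geometric circularity metric is evaluated against.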
Evapotranspiration Calculations for an Alpine Marsh Meadow Site in Three-river Headwater Region
NASA Astrophysics Data System (ADS)
Zhou, B.; Xiao, H.
2016-12-01
Daily radiation and meteorological data were collected at an alpine marsh meadow site in the Three-river Headwater Region (THR). These data were used to assess radiation models, comparing the performance of the Zuo model with the model recommended by FAO56 P-M. Four methods, FAO56 P-M, Priestley-Taylor, Hargreaves, and Makkink, were applied to determine daily reference evapotranspiration (ETr) for the growing season, and empirical models were built for estimating daily actual evapotranspiration (ETa) from the ETr derived from the four methods and the evapotranspiration derived from the Bowen Ratio method on alpine marsh meadow in this region. Comparing the performance of the four empirical models by RMSE, MAE and AI showed that all models produced acceptable estimates of daily ETa on alpine marsh meadow in this region, and that the FAO56 P-M and Makkink empirical models performed better than the Priestley-Taylor and Hargreaves models.
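Of the four ETr methods compared, Hargreaves is the simplest to sketch: the FAO-56 form needs only air temperatures and extraterrestrial radiation Ra. In the sketch below Ra is supplied directly in equivalent-evaporation units rather than computed from latitude and day of year, which is an assumption made to keep the example self-contained.

```python
def hargreaves_et0(t_mean, t_max, t_min, ra_mm_day):
    """FAO-56 Hargreaves reference evapotranspiration (mm/day).
    Temperatures in deg C; ra_mm_day is extraterrestrial radiation
    already expressed in mm/day of equivalent evaporation."""
    trange = max(t_max - t_min, 0.0)  # guard against bad input
    return 0.0023 * (t_mean + 17.8) * trange ** 0.5 * ra_mm_day
```

An empirical ETa model of the kind the abstract describes would then regress Bowen-ratio ETa against this ETr, so the quality of the final estimate inherits the biases of whichever ETr method is used.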
ERIC Educational Resources Information Center
Beekhoven, S.; De Jong, U.; Van Hout, H.
2002-01-01
Compared elements of rational choice theory and integration theory on the basis of their power to explain variance in academic progress. Asserts that the concepts should be combined, and the distinction between social and academic integration abandoned. Empirical analysis showed that an extended model, comprising both integration and rational…
NASA Astrophysics Data System (ADS)
Perrier, C.; Breysacher, J.; Rauw, G.
2009-09-01
Aims: We present a technique to determine the orbital and physical parameters of eclipsing eccentric Wolf-Rayet + O-star binaries, where one eclipse is produced by the absorption of the O-star light by the stellar wind of the W-R star. Methods: Our method is based on the use of the empirical moments of the light curve that are integral transforms evaluated from the observed light curves. The optical depth along the line of sight and the limb darkening of the W-R star are modelled by simple mathematical functions, and we derive analytical expressions for the moments of the light curve as a function of the orbital parameters and the key parameters of the transparency and limb-darkening functions. These analytical expressions are then inverted in order to derive the values of the orbital inclination, the stellar radii, the fractional luminosities, and the parameters of the wind transparency and limb-darkening laws. Results: The method is applied to the SMC W-R eclipsing binary HD 5980, a remarkable object that underwent an LBV-like event in August 1994. The analysis refers to the pre-outburst observational data. A synthetic light curve based on the elements derived for the system allows a quality assessment of the results obtained.
Gini, Rosa; Schuemie, Martijn; Brown, Jeffrey; Ryan, Patrick; Vacchi, Edoardo; Coppola, Massimo; Cazzola, Walter; Coloma, Preciosa; Berni, Roberto; Diallo, Gayo; Oliveira, José Luis; Avillach, Paul; Trifirò, Gianluca; Rijnbeek, Peter; Bellentani, Mariadonata; van Der Lei, Johan; Klazinga, Niek; Sturkenboom, Miriam
2016-01-01
Introduction: We see increased use of existing observational data in order to achieve fast and transparent production of empirical evidence in health care research. Multiple databases are often used to increase power, to assess rare exposures or outcomes, or to study diverse populations. For privacy and sociological reasons, original data on individual subjects can’t be shared, requiring a distributed network approach where data processing is performed prior to data sharing. Case Descriptions and Variation Among Sites: We created a conceptual framework distinguishing three steps in local data processing: (1) data reorganization into a data structure common across the network; (2) derivation of study variables not present in original data; and (3) application of study design to transform longitudinal data into aggregated data sets for statistical analysis. We applied this framework to four case studies to identify similarities and differences in the United States and Europe: Exploring and Understanding Adverse Drug Reactions by Integrative Mining of Clinical Records and Biomedical Knowledge (EU-ADR), Observational Medical Outcomes Partnership (OMOP), the Food and Drug Administration’s (FDA’s) Mini-Sentinel, and the Italian network—the Integration of Content Management Information on the Territory of Patients with Complex Diseases or with Chronic Conditions (MATRICE). Findings: National networks (OMOP, Mini-Sentinel, MATRICE) all adopted shared procedures for local data reorganization. The multinational EU-ADR network needed locally defined procedures to reorganize its heterogeneous data into a common structure. Derivation of new data elements was centrally defined in all networks but the procedure was not shared in EU-ADR. Application of study design was a common and shared procedure in all the case studies. Computer procedures were embodied in different programming languages, including SAS, R, SQL, Java, and C++. 
Conclusion: Using our conceptual framework we found several areas that would benefit from research to identify optimal standards for production of empirical knowledge from existing databases. PMID:27014709
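Step (1) of the framework, reorganizing heterogeneous source data into a structure common across the network, can be sketched as a per-site mapping of local column names onto a shared schema; all field and site names below are invented for illustration and are not the OMOP, Mini-Sentinel, EU-ADR, or MATRICE schemas.

```python
# Each site declares how its local column names map onto a shared
# minimal schema; downstream steps then process identical structures.
COMMON_FIELDS = ("person_id", "event_code", "event_date")

SITE_MAPPINGS = {  # hypothetical local schemas
    "site_a": {"person_id": "pat_no", "event_code": "icd", "event_date": "dt"},
    "site_b": {"person_id": "subject", "event_code": "dx_code", "event_date": "when"},
}

def to_common_model(site, records):
    """Rename one site's record dicts into the shared structure
    (step 1); variable derivation (step 2) and study design (step 3)
    run on the reorganized output."""
    mapping = SITE_MAPPINGS[site]
    return [{field: rec[mapping[field]] for field in COMMON_FIELDS}
            for rec in records]
```

Because only aggregated results leave each site, this local-first transformation is what lets the network share procedures without sharing individual-level data.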
Empirically derived guidance for social scientists to influence environmental policy
Brown, Katrina; Crissman, Charles; De Young, Cassandra; Gooch, Margaret; James, Craig; Jessen, Sabine; Johnson, Dave; Marshall, Paul; Wachenfeld, Dave; Wrigley, Damian
2017-01-01
Failure to stem trends of ecological disruption and associated loss of ecosystem services worldwide is partly due to the inadequate integration of the human dimension into environmental decision-making. Decision-makers need knowledge of the human dimension of resource systems and of the social consequences of decision-making if environmental management is to be effective and adaptive. Social scientists have a central role to play, but little guidance exists to help them influence decision-making processes. We distil 348 years of cumulative experience shared by 31 environmental experts across three continents into advice for social scientists seeking to increase their influence in the environmental policy arena. Results focus on the importance of process, engagement, empathy and acumen and reveal the importance of understanding and actively participating in policy processes through co-producing knowledge and building trust. The insights gained during this research might empower a science-driven cultural change in science-policy relations for the routine integration of the human dimension in environmental decision making; ultimately for an improved outlook for earth’s ecosystems and the billions of people that depend on them. PMID:28278238
Market environment and Medicaid acceptance: What influences the access gap?
Bond, Amelia; Pajerowski, William; Polsky, Daniel; Richards, Michael R
2017-12-01
The U.S. health care system is undergoing significant changes. Two prominent shifts include millions added to Medicaid and greater integration and consolidation among firms. We empirically assess if these two industry trends may have implications for each other. Using experimentally derived ("secret shopper") data on primary care physicians' real-world behavior, we observe their willingness to accept new privately insured and Medicaid patients across 10 states. We combine this measure of patient acceptance with detailed information on physician and commercial insurer market structure and show that insurer and provider concentration are each positively associated with relative improvements in appointment availability for Medicaid patients. The former is consistent with a smaller price discrepancy between commercial and Medicaid patients and suggests a beneficial spillover from greater insurer market power. The findings for physician concentration do not align with a simple price bargaining explanation but do appear driven by physician firms that are not vertically integrated with a health system. These same firms also tend to rely more on nonphysician clinical staff. Copyright © 2017 John Wiley & Sons, Ltd.
Coherence of Personal Narratives across the Lifespan: A Multidimensional Model and Coding Method
Reese, Elaine; Haden, Catherine A.; Baker-Ward, Lynne; Bauer, Patricia; Fivush, Robyn; Ornstein, Peter A.
2012-01-01
Personal narratives are integral to autobiographical memory and to identity, with coherent personal narratives being linked to positive developmental outcomes across the lifespan. In this article, we review the theoretical and empirical literature that sets the stage for a new lifespan model of personal narrative coherence. This new model integrates context, chronology, and theme as essential dimensions of personal narrative coherence, each of which relies upon different developmental achievements and has a different developmental trajectory across the lifespan. A multidimensional method of coding narrative coherence (the Narrative Coherence Coding Scheme or NaCCS) was derived from the model and is described here. The utility of this approach is demonstrated by its application to 498 narratives that were collected in six laboratories from participants ranging in age from 3 years to adulthood. The value of the model is illustrated further by a discussion of its potential to guide future research on the developmental foundations of narrative coherence and on the benefits of personal narrative coherence for different aspects of psychological functioning. PMID:22754399
Necessity and approach to integrated nanomaterial legislation and governance.
Wang, Jiafan; Gerlach, John D; Savage, Nora; Cobb, George P
2013-01-01
Nanotechnology is one of the most promising technologies to emerge in recent decades. Materials that are specially engineered to have at least one dimension no larger than 100 nm are now continuously manufactured and incorporated as critical components of products that people use daily. While we take advantage of nanomaterials (NMs) and nano-products, they may pose a risk to humans and the broader environment. Some types of fibrous NMs, such as carbon nanotubes and nano-fibers, may present a risk similar to that of asbestos. Some carbon- or metal-based NMs may threaten the environment because they bioaccumulate within food webs. To prevent future adverse effects from products or byproducts of nanotechnology, we suggest an integrated, multi-faceted approach that includes regulation based upon life cycle assessment and empirically derived risk assessment. Advanced research that fills the knowledge gaps in the scientific and social understanding of NMs will support full life cycle assessment of NMs. Public education in nanotechnology, reinforced by media coverage, will increase understanding and participation and ultimately draw governments' attention toward instituting integrated legislation. Developing the optimal mix of these tools, including research, public education, media coverage, and integrated legislation, will be significant to proactively manage the complexity of nanotechnology and prevent undesirable effects of NM exposure. Copyright © 2012 Elsevier B.V. All rights reserved.
Navigating Instructional Dialectics: Empirical Exploration of Paradox in Teaching
ERIC Educational Resources Information Center
Thompson, Blair; Rudick, C. Kyle; Kerssen-Griep, Jeff; Golsan, Kathryn
2018-01-01
Navigating contradiction represents an integral part of the teaching process. While educational literature has discussed the paradoxes that teachers experience in the classroom, minimal empirical research has analyzed the strategies teachers employ to address these paradoxes. Using relational dialectics as a theoretical framework for understanding…
ERIC Educational Resources Information Center
Konold, Timothy R.; Pianta, Robert C.
2005-01-01
School readiness assessment is a prominent feature of early childhood education. Because the construct of readiness is multifaceted, we examined children's patterns on multiple indicators previously found to be both theoretically and empirically linked to school readiness: social skill, interactions with parents, problem behavior, and performance…
GPP in Loblolly Pine: A Monthly Comparison of Empirical and Process Models
Christopher Gough; John Seiler; Kurt Johnsen; David Arthur Sampson
2002-01-01
Monthly and yearly gross primary productivity (GPP) estimates derived from an empirical and two process-based models (3PG and BIOMASS) were compared. Spatial and temporal variation in foliar gas exchange was examined and used to develop GPP prediction models for fertilized nine-year-old loblolly pine (Pinus taeda) stands located in the North...
NASA Technical Reports Server (NTRS)
Huddleston, D.; Neugebauer, M.; Goldstein, B.
1994-01-01
The shape of the velocity distribution of water-group ions observed by the Giotto ion mass spectrometer on its approach to comet Halley is modeled to derive empirical values for the rates of ionization, energy diffusion, and loss in the mid-cometosheath.
Community Participation of People with an Intellectual Disability: A Review of Empirical Findings
ERIC Educational Resources Information Center
Verdonschot, M. M. L.; de Witte, L. P.; Reichrath, E.; Buntinx, W. H. E.; Curfs, L. M. G.
2009-01-01
Study design: A systematic review of the literature. Objectives: To investigate community participation of persons with an intellectual disability (ID) as reported in empirical research studies. Method: A systematic literature search was conducted for the period of 1996-2006 on PubMed, CINAHL and PSYCINFO. Search terms were derived from the…
ERIC Educational Resources Information Center
Stavrou, Sophia
2016-01-01
This paper aims at providing a theoretical and empirical discussion on the concept of pedagogisation which derives from the hypothesis of a new era of "totally pedagogised society" in Basil Bernstein's work. The article is based on empirical research on higher education policy, with a focus on the implementation of curriculum change…
An empirical InSAR-optical fusion approach to mapping vegetation canopy height
Wayne S. Walker; Josef M. Kellndorfer; Elizabeth LaPoint; Michael Hoppus; James Westfall
2007-01-01
Exploiting synergies afforded by a host of recently available national-scale data sets derived from interferometric synthetic aperture radar (InSAR) and passive optical remote sensing, this paper describes the development of a novel empirical approach for the provision of regional- to continental-scale estimates of vegetation canopy height. Supported by data from the...
Angus, Lynne E; Boritz, Tali; Bryntwick, Emily; Carpenter, Naomi; Macaulay, Christianne; Khattra, Jasmine
2017-05-01
Recent studies suggest that it is not simply the expression of emotion or emotional arousal in session that is important, but rather it is the reflective processing of emergent, adaptive emotions, arising in the context of personal storytelling and/or Emotion-Focused Therapy (EFT) interventions, that is associated with change. To enhance narrative-emotion integration specifically in EFT, Angus and Greenberg originally identified a set of eight clinically derived narrative-emotion integration markers for the implementation of process-guiding therapeutic responses. Further evaluation and testing by the Angus Narrative-Emotion Marker Lab resulted in the identification of 10 empirically validated Narrative-Emotion Process (N-EP) markers that are included in the Narrative-Emotion Process Coding System Version 2.0 (NEPCS 2.0). Based on empirical research findings, individual markers are clustered into Problem (e.g., stuckness in repetitive story patterns, over-controlled or dysregulated emotion, lack of reflectivity), Transition (e.g., reflective, access to adaptive emotions and new emotional plotlines, heightened narrative and emotion integration), and Change (e.g., new story outcomes and self-narrative discovery, and co-construction and re-conceptualization) subgroups. To date, research using the NEPCS 2.0 has investigated the proportion and pattern of narrative-emotion markers in Emotion-Focused, Client-Centered, and Cognitive Therapy for Major Depression, Motivational Interviewing plus Cognitive Behavioral Therapy for Generalized Anxiety Disorder, and EFT for Complex Trauma. Results have consistently identified significantly higher proportions of N-EP Transition and Change markers, and productive shifts, in mid- and late-phase sessions, for clients who achieved recovery by treatment termination.
Recovery is consistently associated with client storytelling that is emotionally engaged, reflective, and evidencing new story outcomes and self-narrative change. Implications for future research, practice and training are discussed.
Accurate Critical Stress Intensity Factor Griffith Crack Theory Measurements by Numerical Techniques
Petersen, Richard C.
2014-01-01
Critical stress intensity factor (KIc) has been an approximation for fracture toughness using only load-cell measurements. However, artificial man-made cracks several orders of magnitude longer and wider than natural flaws have required a correction factor term (Y) that can be up to about 3 times the recorded experimental value [1-3]. In fact, over 30 years ago a National Academy of Sciences advisory board stated that empirical KIc testing was of serious concern and further requested that an accurate bulk fracture toughness method be found [4]. Now that fracture toughness can be calculated accurately by numerical integration from the load/deflection curve as resilience, work of fracture (WOF) and strain energy release (SIc) [5, 6], KIc appears to be unnecessary. However, the large body of previous KIc experimental test results found in the literature offers the opportunity for continued meta-analysis with other more practical and accurate fracture toughness results using energy methods and numerical integration. Therefore, KIc is derived from the classical Griffith Crack Theory [6] to include SIc as a more accurate term for strain energy release rate (𝒢Ic), along with crack surface energy (γ), crack length (a), modulus (E), applied stress (σ), Y, crack-tip plastic zone defect region (rp) and yield strength (σys), all of which can be determined from load and deflection data. Polymer-matrix discontinuous quartz fiber-reinforced composites, designed to accentuate toughness differences, were prepared for flexural mechanical testing, comprising 3 mm fibers at different volume percentages from 0-54.0 vol% and, at 28.2 vol%, different fiber lengths from 0.0-6.0 mm. Results provided a new correction factor and regression analyses between several numerical integration fracture toughness test methods to support KIc results. Further, accurate bulk KIc experimental values are compared with empirical test results found in the literature.
Also, several fracture toughness mechanisms are discussed especially for fiber-reinforced composites. PMID:25620817
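For reference, the quantities named in this abstract are connected by the standard Griffith-type relations; these are generic textbook forms given as a sketch, not the paper's own derivation:

```latex
K_{Ic} = Y\,\sigma\sqrt{\pi a}, \qquad
\mathcal{G}_{Ic} = \frac{K_{Ic}^{2}}{E} \quad(\text{plane stress}), \qquad
\mathcal{G}_{Ic} = 2\gamma, \qquad
r_{p} \approx \frac{1}{2\pi}\left(\frac{K_{Ic}}{\sigma_{ys}}\right)^{2}
```

Each term here (Y, σ, a, E, γ, rp, σys) corresponds to a symbol listed in the abstract.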
The Social Classroom: Integrating Social Network Use in Education
ERIC Educational Resources Information Center
Mallia, Gorg, Ed.
2014-01-01
As technology is being integrated into educational processes, teachers are searching for new ways to enhance student motivation and learning. Through shared experiences and the results of empirical research, educators can ease social networking sites into instructional usage. "The Social Classroom: Integrating Social Network Use in…
Teaching Theory in an Empirically-Oriented Graduate Program.
ERIC Educational Resources Information Center
Warner, R. Stephen
1987-01-01
Stresses that the role of theory is to facilitate cognitive integration, which has a vertical dimension (abstract to concrete) and a horizontal one (across schools and substantive fields). The author emphasizes horizontal integration over upper-level vertical integration to help students communicate across specialities. (Author/DH)
Are We Correctly Measuring Star-Formation Rates?
NASA Astrophysics Data System (ADS)
McQuinn, Kristen B.; Skillman, Evan D.; Dolphin, Andrew E.; Mitchell, Noah P.
2017-01-01
Integrating our knowledge of star formation (SF) traced by observations at different wavelengths is essential for correctly interpreting and comparing SF activity in a variety of systems and environments. This study compares extinction-corrected, integrated ultraviolet (UV) emission from resolved galaxies with color-magnitude diagram (CMD) based star-formation rates (SFRs) derived from resolved stellar populations and CMD fitting techniques in 19 nearby starburst and post-starburst dwarf galaxies. The data sets are from the panchromatic Starburst Irregular Dwarf Survey (STARBIRDS) and include deep legacy GALEX UV imaging, Hubble Space Telescope optical imaging, and Spitzer MIPS imaging. For the majority of the sample, the integrated near-UV fluxes predicted from the CMD-based SFRs (using four different models) agree with the measured, extinction-corrected, integrated near-UV fluxes from GALEX images, but the far-UV (FUV) predicted fluxes do not. Furthermore, we find a systematic deviation between the SFRs based on integrated FUV luminosities and existing scaling relations, and the SFRs based on the resolved stellar populations. This offset is not driven by different SF timescales, variations in SFRs, UV attenuation, or stochastic effects. This first comparison between CMD-based SFRs and an integrated FUV emission SFR indicator suggests that the most likely cause of the discrepancy is the theoretical FUV-SFR calibration from stellar evolutionary libraries and/or stellar atmospheric models. We present an empirical calibration of the FUV-based SFR relation for dwarf galaxies, with uncertainties, which is ~53% larger than previous relations. These results have significant implications for measuring FUV-based SFRs of high-redshift galaxies.
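The kind of linear FUV-to-SFR scaling relation being recalibrated here can be sketched minimally as below. The 1.4e-28 coefficient is the widely cited Kennicutt (1998) value and the 1.53 factor simply illustrates a "~53% larger" recalibration; neither number is taken from this paper's actual fit.

```python
# Hedged sketch of a linear FUV-luminosity-to-SFR scaling relation.
# KENNICUTT_COEFF is the commonly used Kennicutt (1998) calibration,
# assumed here for illustration only.
KENNICUTT_COEFF = 1.4e-28  # M_sun/yr per (erg s^-1 Hz^-1)

def sfr_from_fuv(l_nu_fuv, coeff=KENNICUTT_COEFF):
    """SFR in M_sun/yr from FUV spectral luminosity L_nu in erg s^-1 Hz^-1."""
    return coeff * l_nu_fuv

sfr_classic = sfr_from_fuv(1.0e28)                              # ~1.4 M_sun/yr
sfr_revised = sfr_from_fuv(1.0e28, coeff=1.53 * KENNICUTT_COEFF)  # ~53% higher
```

A larger calibration coefficient raises every inferred SFR by the same factor, which is why such an offset matters for high-redshift samples measured only in the FUV.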
A Critical Review of Digital Storyline-Enhanced Learning
ERIC Educational Resources Information Center
Novak, Elena
2015-01-01
Storyline is one of the major motivators that lead people to play video games. However, little empirical evidence exists on the instructional effectiveness of integrating a storyline into digital learning materials. This systematic literature review presents current empirical findings on the effects of a storyline game design element for human…
An Empirical Taxonomy of Social-Psychological Risk Indicators in Youth Suicide
ERIC Educational Resources Information Center
Hyde, Toni; Kirkland, John; Bimler, David; Pechtel, Pia
2005-01-01
The current study integrates descriptive (though primarily social-psychological) statements about youth suicide into a coherent, empirically supported taxonomy. Drawing from relevant literature, a set of 107 items characterizing these contributions about youth suicide was created. Seventy-two participants sorted these statements according to their…
ERIC Educational Resources Information Center
Federici, Anita; Wisniewski, Lucene; Ben-Porath, Denise
2012-01-01
The authors describe an intensive outpatient dialectical behavior therapy (DBT) program for multidiagnostic clients with eating disorders who had not responded adequately to standard, empirically supported treatments for eating disorders. The program integrates DBT with empirically supported cognitive behavior therapy approaches that are well…
A Systematic Review of Strategies for Implementing Empirically Supported Mental Health Interventions
ERIC Educational Resources Information Center
Powell, Byron J.; Proctor, Enola K.; Glass, Joseph E.
2014-01-01
Objective: This systematic review examines experimental studies that test the effectiveness of strategies intended to integrate empirically supported mental health interventions into routine care settings. Our goal was to characterize the state of the literature and to provide direction for future implementation studies. Method: A literature…
Integrating social science into empirical models of coupled human and natural systems
Jeffrey D. Kline; Eric M. White; A Paige Fischer; Michelle M. Steen-Adams; Susan Charnley; Christine S. Olsen; Thomas A. Spies; John D. Bailey
2017-01-01
Coupled human and natural systems (CHANS) research highlights reciprocal interactions (or feedbacks) between biophysical and socioeconomic variables to explain system dynamics and resilience. Empirical models often are used to test hypotheses and apply theory that represent human behavior. Parameterizing reciprocal interactions presents two challenges for social...
Empirical relations between large wood transport and catchment characteristics
NASA Astrophysics Data System (ADS)
Steeb, Nicolas; Rickenmann, Dieter; Rickli, Christian; Badoux, Alexandre
2017-04-01
The transport of vast amounts of large wood (LW) in water courses can considerably aggravate hazardous situations during flood events, and often strongly affects resulting flood damage. Large wood recruitment and transport are controlled by various factors that are difficult to assess, making the prediction of transported LW volumes challenging. Such information is, however, important for engineers and river managers to adequately dimension retention structures or to identify critical stream cross-sections. In this context, empirical formulas have been developed to estimate the volume of transported LW during a flood event (Rickenmann, 1997; Steeb et al., 2017). The database of existing empirical wood load equations is, however, limited. The objective of the present study is to test and refine existing empirical equations, and to derive new relationships to reveal trends in wood loading. Data have been collected for flood events with LW occurrence in Swiss catchments of various sizes. This extended data set allows us to derive statistically more significant results. LW volumes were found to be related to catchment and transport characteristics, such as catchment size, forested area, forested stream length, water discharge, sediment load, or Melton ratio. Both the potential wood load and the fraction that is effectively mobilized during a flood event (effective wood load) are estimated. The difference between potential and effective wood load allows us to derive typical reduction coefficients that can be used to refine spatially explicit GIS models for potential LW recruitment.
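Empirical wood-load relations of this kind are often power laws in a catchment characteristic, e.g. V_LW = a * A^b with A the catchment area. The sketch below derives such a relationship by log-log least squares on synthetic data; the coefficients and areas are invented, not taken from the Swiss data set.

```python
import numpy as np

# Synthetic catchments following an assumed power law V = 2.0 * A**0.8
# with mild multiplicative scatter; the fitting method is the point here.
rng = np.random.default_rng(0)
area = np.array([5.0, 12.0, 30.0, 75.0, 160.0, 400.0])     # km^2 (invented)
wood = 2.0 * area**0.8 * rng.lognormal(0.0, 0.05, 6)       # m^3 of large wood

# Linear regression in log-log space recovers exponent b and prefactor a.
b, log_a = np.polyfit(np.log(area), np.log(wood), 1)
a = np.exp(log_a)
```

With real event data, the scatter is far larger, which is why the abstract distinguishes potential from effective wood load rather than relying on a single regression.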
ERIC Educational Resources Information Center
Ní Chróinín, Déirdre; Ní Mhurchú, Siobhán; Ó Ceallaigh, T. J.
2016-01-01
Increased attention to integrated approaches has resulted from demands to prioritise literacy learning while maintaining a balanced curriculum in primary schools. Limited empirical evidence to support integrated approaches to teaching physical education (PE) exists. This study explored the integration of PE content learning and the learning of…
Permeability Estimation Directly From Logging-While-Drilling Induced Polarization Data
NASA Astrophysics Data System (ADS)
Fiandaca, G.; Maurya, P. K.; Balbarini, N.; Hördt, A.; Christiansen, A. V.; Foged, N.; Bjerg, P. L.; Auken, E.
2018-04-01
In this study, we present the prediction of permeability from time domain spectral induced polarization (IP) data, measured in boreholes on undisturbed formations using the El-log logging-while-drilling technique. We collected El-log data and hydraulic properties on unconsolidated Quaternary and Miocene deposits in boreholes at three locations at a field site in Denmark, characterized by different electrical water conductivity and chemistry. The high vertical resolution of the El-log technique matches the lithological variability at the site, minimizing ambiguity in the interpretation originating from resolution issues. The permeability values were computed from IP data using a laboratory-derived empirical relationship presented in a recent study for saturated unconsolidated sediments, without any further calibration. A very good correlation, within 1 order of magnitude, was found between the IP-derived permeability estimates and those derived using grain size analyses and slug tests, with similar depth trends and permeability contrasts. Furthermore, the effect of water conductivity on the IP-derived permeability estimations was found negligible in comparison to the permeability uncertainties estimated from the inversion and the laboratory-derived empirical relationship.
Data layer integration for the national map of the united states
Usery, E.L.; Finn, M.P.; Starbuck, M.
2009-01-01
The integration of geographic data layers in multiple raster and vector formats, from many different organizations and at a variety of resolutions and scales, is a significant problem for The National Map of the United States being developed by the U.S. Geological Survey. Our research has examined data integration from a layer-based approach for five of The National Map data layers: digital orthoimages, elevation, land cover, hydrography, and transportation. An empirical approach has included visual assessment by a set of respondents with statistical analysis to establish the meaning of various types of integration. A separate theoretical approach with established hypotheses tested against actual data sets has resulted in an automated procedure for integration of specific layers and is being tested. The empirical analysis has established resolution bounds on meanings of integration with raster datasets and distance bounds for vector data. The theoretical approach has used a combination of theories on cartographic transformation and generalization, such as Töpfer's radical law, and additional research concerning optimum viewing scales for digital images to establish a set of guiding principles for integrating data of different resolutions.
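Töpfer's radical law, cited above, has a compact form: the number of features retained at a derived scale is the source count scaled by the square root of the scale ratio. A small sketch with illustrative feature counts:

```python
import math

def radical_law(n_source, scale_source, scale_derived):
    """Töpfer's radical law: features to retain when generalizing a map
    from scale 1:scale_source to the smaller scale 1:scale_derived."""
    return n_source * math.sqrt(scale_source / scale_derived)

# e.g. 1000 features mapped at 1:25,000, generalized to 1:100,000
n_kept = radical_law(1000, 25_000, 100_000)  # -> 500.0
```

The law gives a resolution-dependent budget for how much detail survives generalization, which is one ingredient in the "guiding principles" the abstract mentions for integrating layers of different resolutions.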
Code of Federal Regulations, 2010 CFR
2010-04-01
... charges. An OTC derivatives dealer shall provide a description of all statistical models used for pricing... controls over those models, and a statement regarding whether the firm has developed its own internal VAR models. If the OTC derivatives dealer's VAR model incorporates empirical correlations across risk...
Two-nucleon ¹S₀ amplitude zero in chiral effective field theory
NASA Astrophysics Data System (ADS)
Sánchez, M. Sánchez; Yang, C.-J.; Long, Bingwei; van Kolck, U.
2018-02-01
We present a new rearrangement of short-range interactions in the ¹S₀ nucleon-nucleon channel within chiral effective field theory. This is intended to address the slow convergence of Weinberg's scheme, which we attribute to its failure to reproduce the amplitude zero (scattering momentum ≃340 MeV) at leading order. After the power counting scheme is modified to accommodate the zero at leading order, it includes subleading corrections perturbatively in a way that is consistent with renormalization-group invariance. Systematic improvement is shown at next-to-leading order, and we obtain results that fit empirical phase shifts remarkably well all the way up to the pion-production threshold. An approach in which pions have been integrated out is included, which allows us to derive analytic results that also fit phenomenology surprisingly well.
NASA Technical Reports Server (NTRS)
McGowan, Anna-Maria Rivas; Papalambros, Panos Y.; Baker, Wayne E.
2015-01-01
This paper examines four primary methods of working across disciplines during R&D and early design of large-scale complex engineered systems such as aerospace systems. A conceptualized framework, called the Combining System Elements framework, is presented to delineate several aspects of cross-discipline and system integration practice. The framework is derived from a theoretical and empirical analysis of current work practices in actual operational settings and is informed by theories from organization science and engineering. The explanatory framework may be used by teams to clarify assumptions and associated work practices, which may reduce ambiguity in understanding diverse approaches to early systems research, development and design. The framework also highlights that very different engineering results may be obtained depending on work practices, even when the goals for the engineered system are the same.
NASA Technical Reports Server (NTRS)
Antoniadis, D. A.
1976-01-01
The time-dependent equations of neutral air motion are solved subject to three constraints: two of them are the usual upper and lower boundary conditions and the third is the value of the wind-induced ion drift at any given height. Using incoherent radar data, this procedure leads to a fast, direct numerical integration of the two coupled differential equations describing the horizontal wind components and yields time dependent wind profiles and meridional exospheric neutral temperature gradients. The diurnal behavior of the neutral wind system and of the exospheric temperature is presented for two solstice and two equinox days. The data used were obtained by the St. Santin and the Millstone Hill incoherent scatter radars. The derived geographic distributions of the exospheric temperatures are compared with those predicted by the OGO-6 empirical thermospheric model.
NASA Astrophysics Data System (ADS)
Xu, Jie; Wu, Tao; Peng, Chuang; Adegbite, Stephen
2017-09-01
The geometric Plateau border model for closed-cell polyurethane foam was developed based on volume integrations of an approximated 3D four-cusp hypocycloid structure. The tetrahedral structure of convex struts was orthogonally projected into a 2D three-cusp deltoid with three central cylinders. The idealized single unit strut was modeled by superposition. The volume of each component was calculated by geometric analyses. The strut solid fraction f_s and foam porosity coefficient δ were calculated based on representative elementary volumes of Kelvin and Weaire-Phelan structures. The specific surface area S_v derived respectively from the packing structures and the deltoid approximation model was compared against the strut dimensional ratio ɛ. The characteristic foam parameters obtained from this semi-empirical model were further employed to predict foam thermal conductivity.
The evolutionary basis of human social learning
Morgan, T. J. H.; Rendell, L. E.; Ehn, M.; Hoppitt, W.; Laland, K. N.
2012-01-01
Humans are characterized by an extreme dependence on culturally transmitted information. Such dependence requires the complex integration of social and asocial information to generate effective learning and decision making. Recent formal theory predicts that natural selection should favour adaptive learning strategies, but relevant empirical work is scarce and rarely examines multiple strategies or tasks. We tested nine hypotheses derived from theoretical models, running a series of experiments investigating factors affecting when and how humans use social information, and whether such behaviour is adaptive, across several computer-based tasks. The number of demonstrators, consensus among demonstrators, confidence of subjects, task difficulty, number of sessions, cost of asocial learning, subject performance and demonstrator performance all influenced subjects' use of social information, and did so adaptively. Our analysis provides strong support for the hypothesis that human social learning is regulated by adaptive learning rules. PMID:21795267
Incorporating Applied Behavior Analysis to Assess and Support Educators' Treatment Integrity
ERIC Educational Resources Information Center
Collier-Meek, Melissa A.; Sanetti, Lisa M. H.; Fallon, Lindsay M.
2017-01-01
For evidence-based interventions to be effective for students they must be consistently implemented, however, many teachers struggle with treatment integrity and require support. Although many implementation support strategies are research based, there is little empirical guidance about the types of treatment integrity, implementers, and contexts…
The Artful Teacher: A Conceptual Model for Arts Integration in Schools
ERIC Educational Resources Information Center
Chemi, Tatiana
2014-01-01
This article addresses specific issues within arts-integration experiences in schools. Focusing on the relationship between positive emotions, learning, and the Arts, the article discusses empirical data that has been drawn from a research study, Making the Ordinary Extraordinary: Adopting Artfulness in Danish Schools. When schools integrate the…
ERIC Educational Resources Information Center
Salajan, Florin D.; Chiper, Sorina
2013-01-01
This article conducts an exploration of Romania's European integration process through higher education. It contends that integration occurs at "formal" and "informal levels" through institutional norms and human agency, respectively. Through theoretical and empirical analysis, the authors discuss the modalities through which…
Statistical Measures of Integrity in Online Testing: Empirical Study
ERIC Educational Resources Information Center
Wielicki, Tom
2016-01-01
This paper reports on longitudinal study regarding integrity of testing in an online format as used by e-learning platforms. Specifically, this study explains whether online testing, which implies an open book format is compromising integrity of assessment by encouraging cheating among students. Statistical experiment designed for this study…
Empirical effective temperatures and bolometric corrections for early-type stars
NASA Technical Reports Server (NTRS)
Code, A. D.; Bless, R. C.; Davis, J.; Brown, R. H.
1976-01-01
An empirical effective temperature for a star can be found by measuring its apparent angular diameter and absolute flux distribution. The angular diameters of 32 bright stars in the spectral range O5f to F8 have recently been measured with the stellar interferometer at Narrabri Observatory, and their absolute flux distributions have been found by combining observations of ultraviolet flux from the Orbiting Astronomical Observatory (OAO-2) with ground-based photometry. In this paper, these data have been combined to derive empirical effective temperatures and bolometric corrections for these 32 stars.
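The empirical effective temperature described here follows directly from the flux received from a uniform disk of angular diameter θ: f_bol = (θ/2)² σ T⁴, inverted for T. A small sketch with illustrative (not observed) numbers:

```python
import math

SIGMA_SB = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def t_eff(f_bol, theta_rad):
    """Empirical effective temperature from the measured bolometric flux
    (W/m^2) and angular diameter (radians): f = (theta/2)^2 * sigma * T^4."""
    return (4.0 * f_bol / (SIGMA_SB * theta_rad**2)) ** 0.25

# Round-trip check with an assumed star (illustrative values only).
T_true, theta = 10_000.0, 1.0e-8        # K, radians
f = (theta / 2.0) ** 2 * SIGMA_SB * T_true**4
T_recovered = t_eff(f, theta)           # recovers T_true
```

This is why the combination of interferometric angular diameters (Narrabri) and absolute flux distributions (OAO-2 plus ground-based photometry) suffices: no model atmosphere is needed to define T_eff.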
Shriver, K A
1986-01-01
Realistic estimates of economic depreciation are required for analyses of tax policy, economic growth and production, and national income and wealth. The purpose of this paper is to examine the stability assumption underlying the econometric derivation of empirical estimates of economic depreciation for industrial machinery and equipment. The results suggest that a reasonable stability of economic depreciation rates of decline may exist over time. Thus, the assumption of a constant rate of economic depreciation may be a reasonable approximation for further empirical economic analyses.
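The stability assumption under examination, a constant rate of economic depreciation, corresponds to a simple geometric decline in asset value. A minimal sketch with invented numbers:

```python
def depreciated_value(v0, delta, t):
    """Value after t years under a constant (geometric) depreciation
    rate delta: V_t = V_0 * (1 - delta)**t."""
    return v0 * (1.0 - delta) ** t

# Illustrative machinery example: $100,000 asset, 15%/yr, after 5 years.
v5 = depreciated_value(100_000.0, 0.15, 5)  # -> 44370.53125
```

If the true rate drifts over time, this single-parameter form misstates later-year values, which is exactly what the paper's stability test probes.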
NASA Astrophysics Data System (ADS)
Lu, Xiao-Ping; Huang, Xiang-Jie; Ip, Wing-Huen; Hsia, Chi-Hao
2018-04-01
In the lightcurve inversion process, in which an asteroid's physical parameters such as rotational period, pole orientation and overall shape are searched for, numerical calculations of the synthetic photometric brightness based on different shape models are frequently implemented. Lebedev quadrature is an efficient method to numerically calculate a surface integral on the unit sphere. By transforming the surface integral on the Cellinoid shape model to one on the unit sphere, the lightcurve inversion process based on the Cellinoid shape model can be remarkably accelerated. Furthermore, MATLAB codes of the lightcurve inversion process based on the Cellinoid shape model are available on GitHub for free download. The photometric models, i.e., the scattering laws, also play an important role in the lightcurve inversion process, although the shape variations of asteroids dominate the morphologies of the lightcurves. Derived from radiative transfer theory, the Hapke model can describe light reflectance behavior from the viewpoint of physics, while there are also many empirical models in numerical applications. Numerical simulations are implemented to compare the Hapke model with three other numerical models: the Lommel-Seeliger, Minnaert, and Kaasalainen models. The results show that the numerical models with simple function expressions can fit well with the synthetic lightcurves generated based on the Hapke model; this good fit implies that they can be adopted in the lightcurve inversion process for asteroids to improve numerical efficiency and derive results similar to those of the Hapke model.
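Lebedev quadrature itself requires tabulated node sets, but the role it plays here, numerically evaluating a surface integral over the unit sphere, can be sketched with a simpler product rule. This is a stand-in for illustration, not the method used in the cited MATLAB codes:

```python
import numpy as np

def sphere_integral(f, n_theta=32, n_phi=64):
    """Integrate f(theta, phi) over the unit sphere using a Gauss-Legendre
    rule in cos(theta) times a uniform rule in phi (a simple stand-in for
    the more efficient Lebedev grids)."""
    x, w = np.polynomial.legendre.leggauss(n_theta)   # nodes/weights in cos(theta)
    phi = 2.0 * np.pi * np.arange(n_phi) / n_phi
    theta = np.arccos(x)
    tt, pp = np.meshgrid(theta, phi, indexing="ij")
    vals = f(tt, pp)
    # sin(theta) d(theta) = -d(cos theta), so the Jacobian is absorbed by w.
    return (2.0 * np.pi / n_phi) * np.sum(w[:, None] * vals)

# Sanity check: the surface area of the unit sphere is 4*pi.
area = sphere_integral(lambda t, p: np.ones_like(t))
```

In the inversion setting, f would be the surface brightness contribution per unit solid angle after mapping the Cellinoid surface onto the sphere; a Lebedev grid evaluates the same integral with far fewer nodes for a given accuracy.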
Essays on energy derivatives pricing and financial risk management =
NASA Astrophysics Data System (ADS)
Madaleno, Mara Teresa da Silva
This thesis consists of an introductory chapter (essay I) and five more empirical essays on electricity markets and CO2 spot price behaviour, derivatives pricing analysis and hedging. Essay I presents the structure of the thesis and electricity markets functioning and characteristics, as well as the type of products traded, to be analyzed on the following essays. In the second essay we conduct an empirical study on co-movements in electricity markets resorting to wavelet analysis, discussing long-term dynamics and markets integration. Essay three is about hedging performance and multiscale relationships in the German electricity spot and futures markets, also using wavelet analysis. We concentrate the investigation on the relationship between coherence evolution and hedge ratio analysis, on a time-frequency-scale approach, between spot and futures which conditions the effectiveness of the hedging strategy. Essays four, five and six are interrelated between them and with the other two previous essays given the nature of the commodity analyzed, CO2 emission allowances, traded in electricity markets. Relationships between electricity prices, primary energy fuel prices and carbon dioxide permits are analyzed on essay four. The efficiency of the European market for allowances is examined taking into account markets heterogeneity. Essay five analyzes stylized statistical properties of the recent traded asset CO2 emission allowances, for spot and futures returns, examining also the relation linking convenience yield and risk premium, for the German European Energy Exchange (EEX) between October 2005 and October 2009. The study was conducted through empirical estimations of CO2 allowances risk premium, convenience yield, and their relation. Future prices from an ex-post perspective are examined to show evidence for significant negative risk premium, or else a positive forward premium. 
Finally, essay six analyzes the hedging effectiveness of emission allowance futures, providing evidence that utility gains increase with the investor's risk preference. Deregulation of electricity markets has led to higher uncertainty in electricity prices, and by presenting these essays we try to shed new light on structuring, pricing, and hedging in this type of market.
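The hedging performance examined in essays three and six is usually benchmarked against the classical minimum-variance hedge ratio, h* = Cov(Δs, Δf) / Var(Δf). A minimal (non-wavelet) sketch on synthetic spot and futures returns:

```python
import numpy as np

# Synthetic returns: spot loads on futures with slope 0.7 plus noise.
# All numbers are invented for illustration.
rng = np.random.default_rng(1)
d_fut = rng.normal(0.0, 1.0, 500)                  # futures return changes
d_spot = 0.7 * d_fut + rng.normal(0.0, 0.3, 500)   # spot return changes

# Minimum-variance hedge ratio and the resulting hedged position.
h_star = np.cov(d_spot, d_fut)[0, 1] / np.var(d_fut, ddof=1)
hedged = d_spot - h_star * d_fut                   # variance-minimizing hedge
```

The wavelet approach in the thesis refines this by estimating h* scale by scale, since spot-futures coherence in electricity markets varies across time horizons.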
Decision-Making Under Risk: Integrating Perspectives From Biology, Economics, and Psychology.
Mishra, Sandeep
2014-08-01
Decision-making under risk has been variably characterized and examined in many different disciplines. However, interdisciplinary integration has not been forthcoming. Classic theories of decision-making have not been amply revised in light of greater empirical data on actual patterns of decision-making behavior. Furthermore, the meta-theoretical framework of evolution by natural selection has been largely ignored in theories of decision-making under risk in the human behavioral sciences. In this review, I critically examine four of the most influential theories of decision-making from economics, psychology, and biology: expected utility theory, prospect theory, risk-sensitivity theory, and heuristic approaches. I focus especially on risk-sensitivity theory, which offers a framework for understanding decision-making under risk that explicitly involves evolutionary considerations. I also review robust empirical evidence for individual differences and environmental/situational factors that predict actual risky decision-making that any general theory must account for. Finally, I offer steps toward integrating various theoretical perspectives and empirical findings on risky decision-making. © 2014 by the Society for Personality and Social Psychology, Inc.
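Of the four theories reviewed, prospect theory is the easiest to make concrete. A minimal sketch of its value function, using the commonly cited Tversky and Kahneman (1992) parameter estimates (assumed here for illustration, not taken from this review):

```python
# Prospect-theory value function: concave for gains, convex and steeper
# for losses. Parameters are the standard Tversky-Kahneman estimates.
ALPHA, BETA, LAMBDA = 0.88, 0.88, 2.25

def pt_value(x):
    """Subjective value of an outcome x relative to the reference point."""
    return x**ALPHA if x >= 0 else -LAMBDA * (-x)**BETA

# Loss aversion: a loss looms larger than an equal-sized gain.
gain = pt_value(100.0)    # ~57.5
loss = pt_value(-100.0)   # ~-129.4, larger in magnitude
```

This asymmetry is one of the empirically robust patterns the review argues any integrated theory of risky decision-making must accommodate.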
Davison, James A
2015-01-01
To present a cause of posterior capsule aspiration and a technique using optimized parameters to prevent it from happening when operating soft cataracts. A prospective list of posterior capsule aspiration cases was kept over 4,062 consecutive cases operated with the Alcon CENTURION machine and Balanced Tip. Video analysis of one case of posterior capsule aspiration was accomplished. A surgical technique was developed using empirically derived machine parameters and customized setting-selection procedure step toolbar to reduce the pace of aspiration of soft nuclear quadrants in order to prevent capsule aspiration. Two cases out of 3,238 experienced posterior capsule aspiration before use of the soft quadrant technique. Video analysis showed an attractive vortex effect with capsule aspiration occurring in 1/5 of a second. A soft quadrant removal setting was empirically derived which had a slower pace and seemed more controlled with no capsule aspiration occurring in the subsequent 824 cases. The setting featured simultaneous linear control from zero to preset maximums for: aspiration flow, 20 mL/min; and vacuum, 400 mmHg, with the addition of torsional tip amplitude up to 20% after the fluidic maximums were achieved. A new setting selection procedure step toolbar was created to increase intraoperative flexibility by providing instantaneous shifting between the soft and normal settings. A technique incorporating a reduced pace for soft quadrant acquisition and aspiration can be accomplished through the use of a dedicated setting of integrated machine parameters. Toolbar placement of the procedure button next to the normal setting procedure button provides the opportunity to instantaneously alternate between the two settings. Simultaneous surgeon control over vacuum, aspiration flow, and torsional tip motion may make removal of soft nuclear quadrants more efficient and safer.
ERIC Educational Resources Information Center
Biedron, Adriana; Pawlak, Miroslaw
2016-01-01
While a substantial body of empirical evidence has been accrued about the role of individual differences in second language acquisition, relatively little is still known about how factors of this kind can mediate the effects of instructional practices as well as how empirically-derived insights can inform foreign language pedagogy, both with…
ERIC Educational Resources Information Center
Chiu, Pit Ho Patrio; Cheng, Shuk Han
2017-01-01
Recent studies on active learning classrooms (ACLs) have demonstrated their positive influence on student learning. However, most of the research evidence is derived from a few subject-specific courses or limited student enrolment. Empirical studies on this topic involving large student populations are rare. The present work involved a large-scale…
ERIC Educational Resources Information Center
Green, Francis; Vignoles, Anna
2012-01-01
We present a method to compare different qualifications for entry to higher education by studying students' subsequent performance. Using this method for students holding either the International Baccalaureate (IB) or A-levels gaining their degrees in 2010, we estimate an "empirical" equivalence scale between IB grade points and UCAS…
Curran, Patrick J.
2009-01-01
The goal of any empirical science is to pursue the construction of a cumulative base of knowledge upon which the future of the science may be built. However, there is mixed evidence that the science of psychology can accurately be characterized by such a cumulative progression. Indeed, some argue that the development of a truly cumulative psychological science is not possible using the current paradigms of hypothesis testing in single-study designs. The author explores this controversy as a framework to introduce the six papers that make up this special issue that is focused on the integration of data and empirical findings across multiple studies. The author proposes that the methods and techniques described in this set of papers can significantly propel us forward in our ongoing quest to build a cumulative psychological science. PMID:19485622
Will sex selection reduce fertility?
Leung, S F
1994-01-01
Population control is one of the primary policies applied against poverty in many low income countries. The widespread prevalence of son preference in some countries such as China and India, however, works against any reduction of fertility. This is so because parents often continue to have children until they obtain the number of sons which they desire. The bias against girls has also led to higher abortion and mortality rates of female children. It is frequently argued that if sex selection methods are made available to parents so that they can control the gender of their children, population growth would be lowered and women's welfare improved. The author investigates both theoretically and numerically the impact of sex selection on fertility. A static quantity-quality model of fertility is used to compare fertility choices when parents cannot choose the gender of children versus a situation in which parents can choose gender. Empirical data are drawn from the 1976 Malaysian Family Life Survey. Analysis found that whether sex selection reduces fertility depends upon the second and third derivatives of the utility function and the child expenditure function. A numerical dynamic analysis is also presented. The simulation shows, using empirical dynamic models of fertility and the Monte Carlo integration technique, that sex selection on the firstborn child among the Chinese in Malaysia could reduce fertility by about 3%.
VizieR Online Data Catalog: 12um ISOCAM survey of the ESO-Sculptor field (Seymour+, 2007)
NASA Astrophysics Data System (ADS)
Seymour, N.; Rocca-Volmerange, B.; de Lapparent, V.
2007-11-01
We present a detailed reduction of a mid-infrared 12um (LW10 filter) ISOCAM open time observation performed on the ESO-Sculptor Survey field (Arnouts et al., 1997A&AS..124..163A). A complete catalogue of 142 sources (120 galaxies and 22 stars), detected with high significance (equivalent to 5{sigma}), is presented above an integrated flux density of 0.31mJy. Star/galaxy separation is performed by a detailed study of colour-colour diagrams. The catalogue is complete to 1mJy and, below this flux density, the incompleteness is corrected using two independent methods. The first method uses stars and the second uses optical counterparts of the ISOCAM galaxies; these methods yield consistent results. We also apply an empirical flux density calibration using stars in the field. For each star, the 12um flux density is derived by fitting optical colours from a multi-band {chi}2 to stellar templates (BaSel-2.0) and using empirical optical-IR colour-colour relations. This article is a companion analysis to our 2007 paper (Rocca-Volmerange et al. 2007A&A...475..801R) where the 12um faint galaxy counts are presented and analysed per galaxy type with the evolutionary code PEGASE.3. (1 data file).
Card, Noel A.
2011-01-01
The traditional psychological approach of studying aggression among schoolchildren in terms of individual differences in aggression and in victimization has been valuable in identifying prevalence rates, risk, and consequences of involvement in aggression. However, it is argued that a focus on aggressor-victim relationships is warranted based on both conceptual and empirical grounds. Such a shift in focus requires modification and integration of existing theories of aggression, and this paper integrates social cognitive theory and interdependence theory to suggest a new, interdependent social cognitive theory of aggression. Specifically, this paper identifies points of overlap and different foci between these theories, and it illustrates their integration through a proposed model of the emergence of aggressor-victim interactions and relationships. The paper concludes that expanding consideration to include aggressor-victim relationships among schoolchildren offers considerable theoretical, empirical, and intervention opportunities. PMID:26985397
ERIC Educational Resources Information Center
Glass, Gene V.; And Others
Integrative analysis, or what is coming to be known as meta-analysis, is the integration of the findings of many empirical research studies of a topic. Meta-analysis differs from traditional narrative forms of research reviewing in that it is more quantitative and statistical. Thus, the methods of meta-analysis are merely statistical methods,…
Empirical modeling of Single-Event Upset (SEU) in NMOS depletion-mode-load static RAM (SRAM) chips
NASA Technical Reports Server (NTRS)
Zoutendyk, J. A.; Smith, L. S.; Soli, G. A.; Smith, S. L.; Atwood, G. E.
1986-01-01
A detailed experimental investigation of single-event upset (SEU) in static RAM (SRAM) chips fabricated using a family of high-performance NMOS (HMOS) depletion-mode-load process technologies has been performed. Empirical SEU models have been developed with the aid of heavy-ion data obtained with a three-stage tandem Van de Graaff accelerator. The results of this work demonstrate a method by which SEU may be empirically modeled in NMOS integrated circuits.
Ohshiro, Tomokazu; Angelaki, Dora E; DeAngelis, Gregory C
2017-07-19
Studies of multisensory integration by single neurons have traditionally emphasized empirical principles that describe nonlinear interactions between inputs from two sensory modalities. We previously proposed that many of these empirical principles could be explained by a divisive normalization mechanism operating in brain regions where multisensory integration occurs. This normalization model makes a critical diagnostic prediction: a non-preferred sensory input from one modality, which activates the neuron on its own, should suppress the response to a preferred input from another modality. We tested this prediction by recording from neurons in macaque area MSTd that integrate visual and vestibular cues regarding self-motion. We show that many MSTd neurons exhibit the diagnostic form of cross-modal suppression, whereas unisensory neurons in area MT do not. The normalization model also fits population responses better than a model based on subtractive inhibition. These findings provide strong support for a divisive normalization mechanism in multisensory integration. Copyright © 2017 Elsevier Inc. All rights reserved.
Compassion: An Evolutionary Analysis and Empirical Review
ERIC Educational Resources Information Center
Goetz, Jennifer L.; Keltner, Dacher; Simon-Thomas, Emiliana
2010-01-01
What is compassion? And how did it evolve? In this review, we integrate 3 evolutionary arguments that converge on the hypothesis that compassion evolved as a distinct affective experience whose primary function is to facilitate cooperation and protection of the weak and those who suffer. Our empirical review reveals compassion to have distinct…
Assessment of rockfall susceptibility by integrating statistical and physically-based approaches
NASA Astrophysics Data System (ADS)
Frattini, Paolo; Crosta, Giovanni; Carrara, Alberto; Agliardi, Federico
In Val di Fassa (Dolomites, Eastern Italian Alps) rockfalls constitute the most significant gravity-induced natural hazard, threatening both the valley's few inhabitants and the thousands of tourists who populate the area in summer and winter. To assess rockfall susceptibility, we developed an integrated statistical and physically-based approach that aims to predict both the susceptibility to onset and the probability that rockfalls will attain specific reaches. Through field checks and multi-temporal aerial photo-interpretation, we prepared a detailed inventory of both rockfall source areas and associated scree-slope deposits. Using an innovative technique based on GIS tools and a 3D rockfall simulation code, grid cells pertaining to the rockfall source-area polygons were classified as active or inactive, based on the state of activity of the associated scree-slope deposits. The simulation code links each source grid cell with scree deposit polygons by calculating the trajectory of each simulated launch of blocks. By means of discriminant analysis, we then identified the mix of environmental variables that best identifies grid cells with low or high susceptibility to rockfalls. Among these variables, structural setting, land use, and morphology were the most important factors leading to the initiation of rockfalls. We developed 3D simulation models of the runout distance, intensity and frequency of rockfalls, whose source grid cells corresponded either to the geomorphologically-defined source polygons (geomorphological scenario) or to study-area grid cells with slope angle greater than an empirically-defined value of 37° (empirical scenario). For each scenario, we assigned to the source grid cells either a fixed or a variable onset susceptibility; the latter was derived from the discriminant model's group (active/inactive) membership probabilities.
Comparison of these four models indicates that the geomorphological scenario with variable onset susceptibility appears to be the most realistic model. Nevertheless, political and legal issues seem to guide local administrators, who tend to select the more conservative empirically-based scenario as a land-planning tool.
NASA Astrophysics Data System (ADS)
Nunnallee, Edmund Pierce, Jr.
1980-03-01
This dissertation consists of an investigation into the empirical scaling of a digital echo integrator for assessment of a population of juvenile sockeye salmon in Cultus Lake, British Columbia, Canada. The scaling technique was developed over the last ten years for use with totally uncalibrated but stabilized data collection and analysis equipment, and has been applied to populations of fish over a wide geographical range. This is, however, the first investigation into the sources of bias and the accuracy of the technique, and it constitutes a verification of the method. The initial section of the investigation describes hydroacoustic data analysis methods for estimation of effective sampling volume, which is necessary for estimation of fish density. The second section consists of a computer simulation of effective sample volume estimation by this empirical method and is used to investigate the degree of bias introduced by electronic and physical parameters such as boat speed-fish depth interaction effects, electronic thresholding and saturation, transducer beam angle, fish depth stratification by size, and spread of the target strength distribution of the fish. Comparisons of simulation predictions of sample volume estimation bias to actual survey results are given at the end of this section. A verification of the scaling method is then presented by comparison of a hydroacoustically derived estimate of the Cultus Lake smolt population to an independent and concurrent estimate made by counting the migrant fish as they passed through a weir in the outlet stream of the lake. Finally, the effects of several behavioral traits on the conduct and accuracy of hydroacoustic assessment of juvenile sockeye salmon are discussed. These traits include movements of presmolt fish in a lake just prior to their outmigration, daily vertical migrations, and the emergence and dispersal of sockeye fry in Cultus Lake.
In addition, a comparison of the summer depth preferences of the fish over their entire geographical distribution on the west coast of the U.S. and Canada are discussed in terms of hydroacoustic accessibility.
Stewart, Louis J; Trussel, John
2006-01-01
Although the use of derivatives, particularly interest rate swaps, has grown explosively over the past decade, derivative financial instrument use by nonprofits has received only limited attention in the research literature. Because little is known about the risk management activities of nonprofits, the impact of these instruments on the ability of nonprofits to raise capital may have significant public policy implications. The primary motivation of this study is to determine the types of derivatives used by nonprofits and estimate the frequency of their use among these organizations. Our study also extends contemporary finance theory by an empirical examination of the motivation for interest rate swap usage among nonprofits. Our empirical data came from 193 large nonprofit health care providers that issued debt to the public between 2000 and 2003. We used a univariate analysis and a multivariate analysis relying on logistic regression models to test alternative explanations of interest rate swaps usage by nonprofits, finding that more than 45 percent of our sample, 88 organizations, used interest rate swaps with an aggregate notional value in excess of $8.3 billion. Our empirical tests indicate the primary motive for nonprofits to use interest rate derivatives is to hedge their exposure to interest rate risk. Although these derivatives are a useful risk management tool, under conditions of falling bond market interest rates these derivatives may also expose a nonprofit swap user to the risk of a material unscheduled termination payment. Finally, we found considerable diversity in the informativeness of footnote disclosure among sample organizations that used interest rate swaps. Many nonprofits did not disclose these risks in their financial statements. 
In conclusion, we find financial managers in large nonprofits commonly use derivative financial instruments as risk management tools, but the use of interest rate swaps by nonprofits may expose them to other risks that are not adequately disclosed in their financial statements.
Deriving local demand for stumpage from estimates of regional supply and demand.
Kent P. Connaughton; Gerard A. Majerus; David H. Jackson
1989-01-01
The local (Forest-level or local-area) demand for stumpage can be derived from estimates of regional supply and demand. The derivation of local demand is justified when the local timber economy is similar to the regional timber economy; a simple regression of local on nonlocal prices can be used as an empirical test of similarity between local and regional economies....
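The similarity test mentioned above can be sketched as an ordinary least-squares regression of local stumpage prices on regional (nonlocal) prices: a slope near 1 with a high R² would indicate the local market tracks the regional one. A minimal illustration with synthetic data (the price series, units, and tolerances are hypothetical, not taken from the study):

```python
import numpy as np

def similarity_regression(local, regional):
    """OLS of local prices on regional prices: local = a + b * regional.

    Returns (intercept, slope, r_squared). A slope near 1 with high R^2
    suggests the local timber economy is similar to the regional one.
    """
    local = np.asarray(local, dtype=float)
    regional = np.asarray(regional, dtype=float)
    b, a = np.polyfit(regional, local, 1)  # np.polyfit returns slope, intercept
    fitted = a + b * regional
    ss_res = np.sum((local - fitted) ** 2)
    ss_tot = np.sum((local - local.mean()) ** 2)
    return a, b, 1.0 - ss_res / ss_tot

# Synthetic example: local prices follow regional prices plus noise.
rng = np.random.default_rng(0)
regional = np.linspace(100.0, 200.0, 40)            # hypothetical $/unit
local = 5.0 + 0.95 * regional + rng.normal(0, 3, 40)
a, b, r2 = similarity_regression(local, regional)
print(f"intercept={a:.1f} slope={b:.2f} R^2={r2:.2f}")
```

With data like these, the recovered slope is close to the true 0.95 and R² is near 1, which under the paper's logic would justify deriving local demand from the regional estimates.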
ERIC Educational Resources Information Center
Song, Ji Hoon; Chermack, Thomas J.; Kim, Hong Min
2008-01-01
This research examined the link between learning processes and knowledge formation through an integrated literature review from both academic and practical viewpoints. Individuals' learning processes and organizational knowledge creation were reviewed by means of theoretical and integrative analysis based on a lack of empirical research on the…
ERIC Educational Resources Information Center
Weinberg, Andrea Elizabeth; Sample McMeeking, Laura Beth
2017-01-01
Numerous national initiatives call for interdisciplinary mathematics and science education, but few empirical studies have examined practical considerations for integrated instruction in high school settings. The purpose of this qualitative study was twofold. First, the study sought to describe how and to what extent teachers integrate mathematics…
ERIC Educational Resources Information Center
Dahiyat, Samer E.
2015-01-01
The aim of this research is to empirically investigate the relationships among the three vital knowledge management processes of acquisition, integration and application, and their effects on organisational innovation in the pharmaceutical manufacturing industry in Jordan; a knowledge-intensive business service (KIBS) sector. Structural equation…
Integrated Moral Conviction Theory of Student Cheating: An Empirical Test
ERIC Educational Resources Information Center
Roberts, Foster; Thomas, Christopher H.; Novicevic, Milorad M.; Ammeter, Anthony; Garner, Bart; Johnson, Paul; Popoola, Ifeoluwa
2018-01-01
In this article, we develop an "integrated moral conviction theory of student cheating" by integrating moral conviction with (a) the dual-process model of Hunt-Vitell's theory that gives primacy to individual ethical philosophies when moral judgments are made and (b) the social cognitive conceptualization that gives primacy to moral…
ERIC Educational Resources Information Center
Wilson, Elizabeth; Kirby, Barbara; Flowers, Jim
2002-01-01
Recent legislation encourages the integration of academic content in agricultural education. In North Carolina, high school agricultural education programs can now choose to offer a state adopted integrated biotechnology curriculum. Empirical evidence was needed to identify and describe factors related to the intent of agricultural educators to…
Irrigation water demand: A meta-analysis of price elasticities
NASA Astrophysics Data System (ADS)
Scheierling, Susanne M.; Loomis, John B.; Young, Robert A.
2006-01-01
Metaregression models are estimated to investigate sources of variation in empirical estimates of the price elasticity of irrigation water demand. Elasticity estimates are drawn from 24 studies reported in the United States since 1963, including mathematical programming, field experiments, and econometric studies. The mean price elasticity is 0.48. Long-run elasticities, those that are most useful for policy purposes, are likely larger than the mean estimate. Empirical results suggest that estimates may be more elastic if they are derived from mathematical programming or econometric studies and calculated at a higher irrigation water price. Less elastic estimates are found to be derived from models based on field experiments and in the presence of high-valued crops.
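A metaregression of the kind described can be sketched as an OLS fit of reported elasticity magnitudes on study characteristics, here dummies for estimation method and the (log) water price at which each elasticity was evaluated. The data below are synthetic and purely illustrative, not the study's 24 estimates:

```python
import numpy as np

# Design matrix rows: [constant, programming dummy, econometric dummy, log(price)].
# Field-experiment studies are the omitted base category.
X = np.array([
    [1, 1, 0, np.log(20.0)],
    [1, 1, 0, np.log(35.0)],
    [1, 0, 1, np.log(30.0)],
    [1, 0, 1, np.log(50.0)],
    [1, 0, 0, np.log(15.0)],
    [1, 0, 0, np.log(25.0)],
])
y = np.array([0.55, 0.70, 0.50, 0.65, 0.20, 0.30])  # |price elasticity|

# Least-squares coefficients; positive method dummies and price coefficient
# would mirror the finding that programming/econometric studies and higher
# evaluation prices yield more elastic estimates.
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print("coefficients:", np.round(beta, 3))
```

The sign pattern of the fitted coefficients, rather than their magnitudes, is what carries the interpretation in such a metaregression.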
ERIC Educational Resources Information Center
Gaze, Eric C.
2005-01-01
We introduce a cooperative learning, group lab for a Calculus III course to facilitate comprehension of the gradient vector and directional derivative concepts. The lab is a hands-on experience allowing students to manipulate a tangent plane and empirically measure the effect of partial derivatives on the direction of optimal ascent. (Contains 7…
Learners with Dyslexia: Exploring Their Experiences with Different Online Reading Affordances
ERIC Educational Resources Information Center
Chen, Chwen Jen; Keong, Melissa Wei Yin; Teh, Chee Siong; Chuah, Kee Man
2015-01-01
To date, empirically derived guidelines for designing accessible online learning environments for learners with dyslexia are still scarce. This study aims to explore the learning experience of learners with dyslexia when reading passages using different online reading affordances to derive some guidelines for dyslexia-friendly online text. The…
Development and system identification of a light unmanned aircraft for flying qualities research
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peters, M.E.; Andrisani, D. II
This paper describes the design, construction, flight testing and system identification of a lightweight remotely piloted aircraft and its use in studying flying qualities in the longitudinal axis. The short-period approximation to the longitudinal dynamics of the aircraft was used. Parameters in this model were determined a priori using various empirical estimators. These parameters were then estimated from flight data using a maximum likelihood parameter identification method. A comparison of the parameter values revealed that the stability derivatives obtained from the empirical estimators were reasonably close to the flight test results. However, the control derivatives determined by the empirical estimators were too large by a factor of two. The aircraft was also flown to determine how the longitudinal flying qualities of lightweight remotely piloted aircraft compare to those of full-size manned aircraft. It was shown that lightweight remotely piloted aircraft require much faster short-period dynamics to achieve Level I flying qualities in an up-and-away flight task.
NASA Astrophysics Data System (ADS)
Poret, Matthieu; Corradini, Stefano; Merucci, Luca; Costa, Antonio; Andronico, Daniele; Montopoli, Mario; Vulpiani, Gianfranco; Freret-Lorgeril, Valentin
2018-04-01
Recent explosive volcanic eruptions recorded worldwide (e.g. Hekla in 2000, Eyjafjallajökull in 2010, Cordón-Caulle in 2011) demonstrated the necessity for a better assessment of the eruption source parameters (ESPs; e.g. column height, mass eruption rate, eruption duration, and total grain-size distribution - TGSD) to reduce the uncertainties associated with the far-travelling airborne ash mass. Volcanological studies started to integrate observations to use more realistic numerical inputs, crucial for taking robust volcanic risk mitigation actions. On 23 November 2013, Etna (Italy) erupted, producing a 10 km height plume, from which two volcanic clouds were observed at different altitudes from satellites (SEVIRI, MODIS). One was retrieved as mainly composed of very fine ash (i.e. PM20), and the second one as made of ice/SO2 droplets (i.e. not measurable in terms of ash mass). An atypical north-easterly wind direction transported the tephra from Etna towards the Calabria and Apulia regions (southern Italy), permitting tephra sampling in proximal (i.e. ˜ 5-25 km from the source) and medial areas (i.e. the Calabria region, ˜ 160 km). A primary TGSD was derived from the field measurement analysis, but the paucity of data (especially related to the fine ash fraction) prevented it from being entirely representative of the initial magma fragmentation. To better constrain the TGSD assessment, we also estimated the distribution from the X-band weather radar data. We integrated the field and radar-derived TGSDs by inverting the relative weighting averages to best fit the tephra loading measurements. The resulting TGSD is used as input for the FALL3D tephra dispersal model to reconstruct the whole tephra loading. Furthermore, we empirically modified the integrated TGSD by enriching the PM20 classes until the numerical results were able to reproduce the airborne ash mass retrieved from satellite data. 
The resulting TGSD is inverted by best-fitting the field, ground-based, and satellite-based measurements. The results indicate a total erupted mass of 1.2 × 109 kg, being similar to the field-derived value of 1.3 × 109 kg, and an initial PM20 fraction between 3.6 and 9.0 wt %, constituting the tail of the TGSD.
A Bayesian Analysis of Scale-Invariant Processes
2012-01-01
Earth Grid (EASE-Grid). The NED raster elevation data of one arc-second resolution (30 m) over the continental US are derived from multiple satellites… empirical and ME distributions, yet ensuring computational efficiency. Instead of computing empirical histograms from large amounts of data, only some…
Nonlinear bulging factor based on R-curve data
NASA Technical Reports Server (NTRS)
Jeong, David Y.; Tong, Pin
1994-01-01
In this paper, a nonlinear bulging factor is derived using a strain energy approach combined with dimensional analysis. The functional form of the bulging factor contains an empirical constant that is determined using R-curve data from unstiffened flat and curved panel tests. The determination of this empirical constant is based on the assumption that the R-curve is the same for both flat and curved panels.
A Comprehensive Theory of Integration.
Singer, Sara J; Kerrissey, Michaela; Friedberg, Mark; Phillips, Russell
2018-03-01
Efforts to transform health care delivery to improve care have increasingly focused on care integration. However, variation in how integration is defined has complicated efforts to design, synthesize, and compare studies of integration in health care. Evaluations of integration initiatives would be enhanced by describing them according to clear definitions of integration and specifying which empirical relationships they seek to test, whether among types of integration or between integration and outcomes of care. Drawing on previous work, we present a comprehensive theoretical model of relationships between types of integration and propose how to measure them.
NASA Astrophysics Data System (ADS)
Bora, Sanjay; Scherbaum, Frank; Kuehn, Nicolas; Stafford, Peter; Edwards, Benjamin
2016-04-01
The current practice of deriving empirical ground motion prediction equations (GMPEs) involves using ground motions recorded at multiple sites. However, in applications such as site-specific hazard analysis (e.g., for critical facilities), ground motions obtained from GMPEs need to be adjusted/corrected to the particular site or site condition under investigation. This study presents a complete framework for developing a response spectral GMPE within which the issue of adjusting ground motions is addressed in a manner consistent with the linear-system framework. The present approach is a two-step process: the first step consists of deriving two separate empirical models, one for Fourier amplitude spectra (FAS) and the other for a random vibration theory (RVT) optimized duration (Drvto) of ground motion. In the second step the two models are combined within the RVT framework to obtain full response spectral amplitudes. Additionally, the framework involves a stochastic-model-based extrapolation of individual Fourier spectra to extend the usable frequency limit of the empirically derived FAS model. The stochastic model parameters were determined by inverting the Fourier spectral data using an approach similar to that described in Edwards and Faeh (2013). Comparison of median predicted response spectra from the present approach with those from other regional GMPEs indicates that the present approach can also be used as a stand-alone model. The dataset used for the presented analysis is a subset of the recently compiled database RESORCE-2012, covering Europe, the Middle East and the Mediterranean region.
ERIC Educational Resources Information Center
Liew, Chern Li; Chennupati, K. R.; Foo, Schubert
2001-01-01
Explores the potential and impact of an innovative information environment in enhancing user activities in using electronic documents for various tasks, and to support the value-adding of these e-documents. Discusses the conceptual design and prototyping of a proposed environment, PROPIE. Presents an empirical and formative evaluation of the…
Trends of Empirical Research in South Korean Mental Health Social Work
ERIC Educational Resources Information Center
Song, In Han; Lee, Eun Jung
2017-01-01
Since the introduction of evidence-based practice in South Korea, it has gained significant attention for its potential to promote the efficacy of social work services and to integrate knowledge and practice in mental health social work. In order to see how empirical research in South Korean mental health social work has changed, we examined…
Empirical likelihood-based tests for stochastic ordering
BARMI, HAMMOU EL; MCKEAGUE, IAN W.
2013-01-01
This paper develops an empirical likelihood approach to testing for the presence of stochastic ordering among univariate distributions based on independent random samples from each distribution. The proposed test statistic is formed by integrating a localized empirical likelihood statistic with respect to the empirical distribution of the pooled sample. The asymptotic null distribution of this test statistic is found to have a simple distribution-free representation in terms of standard Brownian bridge processes. The approach is used to compare the lengths of rule of Roman Emperors over various historical periods, including the “decline and fall” phase of the empire. In a simulation study, the power of the proposed test is found to improve substantially upon that of a competing test due to El Barmi and Mukerjee. PMID:23874142
Mouzé-Amady, Marc; Raufaste, Eric; Prade, Henri; Meyer, Jean-Pierre
2013-01-01
The aim of this study was to assess mental workload in situations where various load sources must be integrated to derive reliable workload estimates. We report a new algorithm for computing weights from qualitative fuzzy integrals and apply it to the National Aeronautics and Space Administration Task Load Index (NASA-TLX) subscales in order to replace the standard pair-wise weighting technique (PWT). Two empirical studies are reported: (1) in a laboratory experiment, age- and task-related variables were investigated in 53 male volunteers, and (2) in a field study, task- and job-related variables were studied in aircrews during 48 commercial flights. The results were as follows: (i) in the experimental setting, fuzzy estimates were highly correlated with classical (PWT-based) estimates; (ii) in real work conditions, replacing PWT with automated fuzzy treatments simplified NASA-TLX completion; (iii) the algorithm for computing fuzzy estimates provides a new classification procedure sensitive to various variables of work environments; and (iv) subjective and objective measures can be used for the fuzzy aggregation of NASA-TLX subscales. NASA-TLX, a classical tool for mental workload assessment, is based on a weighted sum of ratings from six subscales. A new algorithm, which impacts input data collection and computes weights and indexes from qualitative fuzzy integrals, is evaluated through laboratory and field studies. Pros and cons are discussed.
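For context, the classical PWT-based NASA-TLX score that the fuzzy-integral algorithm replaces can be sketched as follows. Each subscale rating (0-100) is weighted by the number of times that subscale was chosen across the 15 pairwise comparisons; the subscale ratings and tallies below are invented example values:

```python
def tlx_weighted_score(ratings, tally):
    """Classical NASA-TLX overall workload: a weighted sum of the six
    subscale ratings, with weights given by the pairwise-comparison
    tallies (the PWT step the abstract's algorithm replaces)."""
    assert set(ratings) == set(tally)
    assert sum(tally.values()) == 15   # 6 subscales -> C(6,2) comparisons
    return sum(ratings[k] * tally[k] for k in ratings) / 15.0

# Hypothetical example data for one participant.
ratings = {"mental": 70, "physical": 20, "temporal": 55,
           "performance": 40, "effort": 65, "frustration": 30}
tally = {"mental": 5, "physical": 0, "temporal": 3,
         "performance": 2, "effort": 4, "frustration": 1}
overall = tlx_weighted_score(ratings, tally)   # 885 / 15 = 59.0
```

The fuzzy-integral approach described in the abstract replaces the `tally` elicitation step, which is the part participants find burdensome.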
DOE Office of Scientific and Technical Information (OSTI.GOV)
Foster, Caroline; Forbes, Duncan A.; Proctor, Robert N.
2010-04-15
The Ca II triplet (CaT) feature in the near-infrared has been employed as a metallicity indicator for individual stars as well as integrated light of Galactic globular clusters (GCs) and galaxies with varying degrees of success, and sometimes puzzling results. Using the DEIMOS multi-object spectrograph on Keck we obtain a sample of 144 integrated light spectra of GCs around the brightest group galaxy NGC 1407 to test whether the CaT index can be used as a metallicity indicator for extragalactic GCs. Different sets of single stellar population models make different predictions for the behavior of the CaT as a function of metallicity. In this work, the metallicities of the GCs around NGC 1407 are obtained from CaT index values using an empirical conversion. The measured CaT/metallicity distributions show unexpected features, the most remarkable being that the brightest red and blue GCs have similar CaT values despite their large difference in mean color. Suggested explanations for this behavior in the NGC 1407 GC system are (1) the CaT may be affected by a population of hot blue stars, (2) the CaT may saturate earlier than predicted by the models, and/or (3) color may not trace metallicity linearly. Until these possibilities are understood, the use of the CaT as a metallicity indicator for the integrated spectra of extragalactic GCs will remain problematic.
Study of galaxies in the Lynx-Cancer void - VII. New oxygen abundances
NASA Astrophysics Data System (ADS)
Pustilnik, S. A.; Perepelitsyna, Y. A.; Kniazev, A. Y.
2016-11-01
We present new or improved oxygen abundances (O/H) for the nearby Lynx-Cancer void updated galaxy sample. They are obtained via the SAO 6-m telescope spectroscopy (25 objects), or derived from the Sloan Digital Sky Survey spectra (14 galaxies, of which for seven objects O/H values were unknown). For eight galaxies with the detected [O III] λ4363 line, O/H values are derived via the direct (Te) method. For the remaining objects, O/H was estimated via semi-empirical and empirical methods. For all accumulated O/H data for 81 galaxies of this void (with 40 of them derived via the Te method), their relation `O/H versus MB' is compared with that for similar late-type galaxies from denser environments (the Local Volume `reference sample'). We confirm our previous conclusion derived for a subsample of 48 objects: void galaxies show systematically reduced O/H for the same luminosity with respect to the reference sample, on average by 0.2 dex, or by a factor of ˜1.6. Moreover, we confirm the fraction of ˜20 per cent of strong outliers, with O/H two to four times lower than the typical values for the `reference' sample. The new data are consistent with the conclusion on the slower evolution of the main void galaxy population. We obtained the Hα velocity for the faint optical counterpart of the most gas-rich (M(H I)/LB = 25) void object J0723+3624, confirming its connection with the respective H I blob. For the similarly extremely gas-rich dwarf J0706+3020, we give a tentative O/H ˜(O/H)⊙/45. In Appendix A, we present the results of the calibration of the semi-empirical method of Izotov & Thuan and of the empirical calibrators of Pilyugin & Thuan and Yin et al. on a sample of ˜150 galaxies from the literature with O/H measured by the Te method.
Mathematical Features of the Calculus
ERIC Educational Resources Information Center
Sauerheber, Richard D.
2010-01-01
The fundamental theorems of the calculus describe the relationships between derivatives and integrals of functions. The value of any function at a particular location is the definite derivative of its integral and the definite integral of its derivative. Thus, any value is the magnitude of the slope of the tangent of its integral at that position,…
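The first relationship described above, that differentiating the integral of a function returns the function itself, can be checked numerically. The integrator, step sizes, and test point below are illustrative choices:

```python
import math

def integral(f, a, b, n=10000):
    """Composite trapezoidal approximation of the definite integral
    of f over [a, b]."""
    h = (b - a) / n
    s = 0.5 * (f(a) + f(b)) + sum(f(a + i * h) for i in range(1, n))
    return s * h

f = math.sin
F = lambda x: integral(f, 0.0, x)    # antiderivative of f with F(0) = 0

# Fundamental theorem check: d/dx F(x) should equal f(x).
x = 1.3
h = 1e-4
derivative_of_F = (F(x + h) - F(x - h)) / (2 * h)   # central difference
```

The central-difference estimate of F'(1.3) agrees with sin(1.3) to several decimal places, a numerical illustration of the slope-of-the-integral statement in the abstract.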
Colonius, Hans; Diederich, Adele
2011-07-01
The concept of a "time window of integration" holds that information from different sensory modalities must not be perceived too far apart in time in order to be integrated into a multisensory perceptual event. Empirical estimates of window width differ widely, however, ranging from 40 to 600 ms depending on context and experimental paradigm. Searching for a theoretical derivation of window width, Colonius and Diederich (Front Integr Neurosci 2010) developed a decision-theoretic framework using a decision rule that is based on the prior probability of a common source, the likelihood of temporal disparities between the unimodal signals, and the payoff for making right or wrong decisions. Here, this framework is extended to the focused attention task where subjects are asked to respond to signals from a target modality only. Evoking the framework of the time-window-of-integration (TWIN) model, an explicit expression for optimal window width is obtained. The approach is probed on two published focused attention studies. The first is a saccadic reaction time study assessing the efficiency with which multisensory integration varies as a function of aging. Although the window widths for young and older adults differ by nearly 200 ms, presumably due to their different peripheral processing speeds, neither of them deviates significantly from the optimal values. In the second study, head saccadic reaction times to a perfectly aligned audiovisual stimulus pair had been shown to depend on the prior probability of spatial alignment. Intriguingly, they reflected the magnitude of the time-window widths predicted by our decision-theoretic framework, i.e., a larger time window is associated with a higher prior probability.
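The decision-theoretic flavor of the framework can be sketched with a toy Bayes rule. The likelihood shapes and all parameter values below are placeholders, chosen only to reproduce the qualitative prediction that a higher prior probability of a common source implies a wider integration window:

```python
import math

def posterior_common_source(d, prior, sigma_common=50.0, width_indep=600.0):
    """Posterior probability that two signals share a common source,
    given their temporal disparity d (ms). The likelihoods are
    placeholders: a Gaussian around zero disparity for a common source,
    and a broad uniform density for independent sources."""
    like_common = math.exp(-0.5 * (d / sigma_common) ** 2) / (
        sigma_common * math.sqrt(2.0 * math.pi))
    like_indep = 1.0 / width_indep
    num = prior * like_common
    return num / (num + (1.0 - prior) * like_indep)

def window_width(prior, threshold=0.5):
    """Integrate whenever the posterior exceeds the threshold; the
    implied window is the symmetric disparity range where that holds."""
    return 2 * max((d for d in range(601)
                    if posterior_common_source(d, prior) >= threshold),
                   default=0)
```

With these placeholder densities, raising the prior from 0.5 to 0.8 widens the window, mirroring the study's finding that a larger time window accompanies a higher prior probability.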
Math, Science, and Engineering Integration in a High School Engineering Course: A Qualitative Study
ERIC Educational Resources Information Center
Valtorta, Clara G.; Berland, Leema K.
2015-01-01
Engineering in K-12 classrooms has been receiving expanding emphasis in the United States. The integration of science, mathematics, and engineering is a benefit and goal of K-12 engineering; however, current empirical research on the efficacy of K-12 science, mathematics, and engineering integration is limited. This study adds to this growing…
ERIC Educational Resources Information Center
Raptis, Helen
2011-01-01
Little empirical research has investigated the integration of Canada's Aboriginal children into provincial school systems. Furthermore, the limited existing research has tended to focus on policymakers and government officials at the national level. Thus, the policy shift from segregation to integration has generally been attributed to Canada's…
The timing and sources of information for the adoption and implementation of production innovations
NASA Technical Reports Server (NTRS)
Ettlie, J. E.
1976-01-01
Two dimensions (personal-impersonal and internal-external) are used to characterize information sources as they become important during the interorganizational transfer of production innovations. The results of three studies are reviewed for the purpose of deriving a model of the timing and importance of different information sources and the utilization of new technology. Based on the findings of two retrospective studies, it was concluded that the pattern of information seeking behavior in user organizations during the awareness stage of adoption is not a reliable predictor of the eventual utilization rate. Using the additional findings of a real-time study, an empirical model of the relative importance of information sources for successful user organizations is presented. These results are extended and integrated into a theoretical model consisting of a time-profile of successful implementations and the relative importance of four types of information sources during seven stages of the adoption-implementation process.
Modification and testing of an engine and fuel control system for a hydrogen fuelled gas turbine
NASA Astrophysics Data System (ADS)
Funke, H. H.-W.; Börner, S.; Hendrick, P.; Recker, E.
2011-10-01
The control of pollutant emissions has become increasingly important in the development of new gas turbines. The use of hydrogen produced from renewable energy sources could be an alternative. Besides the reduction of NOx emissions produced during the combustion process, another major question is how a hydrogen-fuelled gas turbine, including the metering unit, can be controlled and operated. This paper presents a first insight into modifications of an Auxiliary Power Unit (APU) GTCP 36-300 for using gaseous hydrogen as a gas turbine fuel. For safe operation with hydrogen, the metering of hydrogen has to be fast, precise, and secure, so the quality of the metering unit's control loop has an important influence on this topic. The paper documents the empirical determination of the proportional-integral-derivative (PID) control parameters for the metering unit.
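A PID loop of the kind used to close a metering unit's flow-control loop can be sketched in a few lines. The gains and the first-order valve/flow model below are arbitrary illustrations, not the empirically determined parameters the paper reports:

```python
class PID:
    """Minimal discrete PID controller: output is the weighted sum of
    the error, its running integral, and its finite-difference
    derivative."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Close the loop around a toy first-order valve/flow model (tau = 0.1 s)
# and drive the flow to a setpoint of 1.0 over 20 simulated seconds.
pid = PID(kp=2.0, ki=1.0, kd=0.05, dt=0.01)
flow = 0.0
for _ in range(2000):
    u = pid.update(setpoint=1.0, measurement=flow)
    flow += (u - flow) * 0.01 / 0.1
```

The integral term removes the steady-state offset a purely proportional controller would leave, which is why the empirically tuned PID parameters matter for fast, precise hydrogen metering.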
Training quality job interviews with adults with developmental disabilities.
Mozingo, D; Ackley, G B; Bailey, J S
1994-01-01
Supported work models of vocational integration have increased the employability of individuals with developmental disabilities. Interview questions most frequently used and corresponding responses considered most beneficial to job applicants were derived from an empirical analysis of the "hiring community" and served as a basis for the development of the verbal job interview skills training package evaluated in this research. Dependent measures were objective, behavioral indices of the quality of job interview responses. One-to-one training by a direct training staff, job coach, and a trained behavior analyst resulted in improved responding by all subjects as indicated in a multiple baseline design across interview questions. Improved quality in responding to questions generalized to variations in interview questions, to a novel interviewer, and in an in vivo interview situation. Finally, global measures of social validity support the value of the quality-of-response training.
Rehm, Jürgen
2008-06-01
In summarizing the key themes and results of the second meeting of the German Addiction Research Network 'Understanding Addiction: Mediators and Moderators of Behaviour Change Process', the following concrete steps forward were laid out to improve knowledge. The steps included pleas to (1) redefine substance abuse disorders, especially redefine the concept of abuse and harmful use; (2) increase the use of longitudinal and life-course studies with more adequate statistical methods such as latent growth modelling; (3) empirically test more specific and theoretically derived common factors and mechanisms of behavioural change processes; and (4) better exploit cross-regional and cross-cultural differences. Funding agencies are urged to support these developments by specifically supporting interdisciplinary research along the lines specified above. This may include improved forms of international funding of groups of researchers from different countries, where each national group conducts a specific part of an integrated proposal.
Two-nucleon 1S0 amplitude zero in chiral effective field theory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sanchez, M. Sanchez; Yang, C. -J.; Long, Bingwei
We present a new rearrangement of short-range interactions in the 1S0 nucleon-nucleon channel within chiral effective field theory. This is intended to address the slow convergence of Weinberg’s scheme, which we attribute to its failure to reproduce the amplitude zero (scattering momentum ≃340 MeV) at leading order. After the power counting scheme is modified to accommodate the zero at leading order, it includes subleading corrections perturbatively in a way that is consistent with renormalization-group invariance. Systematic improvement is shown at next-to-leading order, and we obtain results that fit empirical phase shifts remarkably well all the way up to the pion-production threshold. As a result, an approach in which pions have been integrated out is included, which allows us to derive analytic results that also fit phenomenology surprisingly well.
Modeling Integrated Water-User Decisions with Intermittent Supplies
NASA Astrophysics Data System (ADS)
Lund, J. R.; Rosenberg, D.
2006-12-01
We present an economic-engineering method to estimate urban water use demands with intermittent water supplies. A two-stage, probabilistic optimization formulation includes a wide variety of water supply enhancement and conservation actions that individual households can adopt to meet multiple water quality uses with uncertain water availability. We embed the optimization in Monte-Carlo simulations to show aggregate effects at a utility (citywide) scale for a population of user conditions and decisions. Parametric analysis provides derivations of supply curves to subsidize conservation, demand responses to alternative pricing, and customer willingness-to-pay to avoid shortages. Results show a good empirical fit for the average and distribution of billed residential water use in Amman, Jordan. Additional outputs give likely market penetration rates for household conservation actions, associated water savings, and subsidies required to entice further adoption. We discuss new insights to size, target, market, and finance conservation programs and interpret a demand curve with block pricing.
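The two-stage structure described above (commit to supply-enhancement and conservation actions first, then absorb shortage costs once water availability is revealed) can be illustrated with a toy enumeration. All costs, scenarios, and action effects below are invented numbers, not the Amman data:

```python
import itertools

# Toy two-stage household decision: choose first-stage actions (buy a
# storage tank, install low-flow fixtures) before knowing how many days
# per week piped water is available; the second stage incurs a penalty
# for each day of unmet demand.
scenarios = [(2, 0.3), (4, 0.5), (7, 0.2)]   # (supply days/week, probability)
demand = 7.0                                  # days of demand to cover

def expected_cost(tank, lowflow):
    fixed = 40.0 * tank + 25.0 * lowflow          # hypothetical action costs
    eff_demand = demand * (0.8 if lowflow else 1.0)  # fixtures cut use 20%
    cost = fixed
    for days, p in scenarios:
        covered = days + (2.0 if tank else 0.0)   # tank bridges 2 dry days
        shortage = max(eff_demand - covered, 0.0)
        cost += p * 30.0 * shortage               # penalty per unmet day
    return cost

# First-stage optimization: pick the action bundle minimizing expected cost.
best = min(itertools.product([0, 1], repeat=2), key=lambda a: expected_cost(*a))
```

Wrapping this per-household choice in Monte Carlo draws over heterogeneous household parameters is what aggregates, in the paper's approach, to citywide demand and conservation-adoption curves.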
Nonparametric instrumental regression with non-convex constraints
NASA Astrophysics Data System (ADS)
Grasmair, M.; Scherzer, O.; Vanhems, A.
2013-03-01
This paper considers the nonparametric regression model with an additive error that is dependent on the explanatory variables. As is common in empirical studies in epidemiology and economics, it also supposes that valid instrumental variables are observed. A classical example in microeconomics considers the consumer demand function as a function of the price of goods and the income, both variables often considered as endogenous. In this framework, the economic theory also imposes shape restrictions on the demand function, such as integrability conditions. Motivated by this illustration in microeconomics, we study an estimator of a nonparametric constrained regression function using instrumental variables by means of Tikhonov regularization. We derive rates of convergence for the regularized model both in a deterministic and stochastic setting under the assumption that the true regression function satisfies a projected source condition including, because of the non-convexity of the imposed constraints, an additional smallness condition.
Orthorexia nervosa: An integrative literature review of a lifestyle syndrome.
Håman, Linn; Barker-Ruchti, Natalie; Patriksson, Göran; Lindgren, Eva-Carin
2015-01-01
Bratman first proposed orthorexia nervosa in the late 1990s, defining it as an obsession with eating healthy food to achieve, for instance, improved health. Today, in the Swedish media, excessive exercising plays a central role in relation to orthorexia. A few review articles on orthorexia have been conducted; however, these have not focused on aspects of food and eating, sport, exercise, or a societal perspective. The overall aim of this study was to provide an overview and synthesis of what philosophies of science approaches form the current academic framework of orthorexia. Key questions were: What aspects of food and eating are related to orthorexia? What role do exercise and sports play in relation to orthorexia? In what ways is orthorexia contextualized? Consequently, the concept of healthism was used to discuss and contextualize orthorexia. The method used was an integrative literature review; the material covered 19 empirical and theoretical articles published in peer-reviewed journals. This review demonstrates a multifaceted nature of orthorexia research; this field has been examined from four different philosophies of science approaches (i.e., empirical-atomistic, empirical-atomistic with elements of empirical-holistic, empirical-holistic, and rational-holistic) on individual, social, and societal levels. The majority of the articles followed an empirical-atomistic approach that focused on orthorexia as an individual issue, which was discussed using healthism. Our analysis indicates a need for (a) more empirical-holistic research that applies interpretive qualitative methods and uses a social perspective of health, e.g., healthism and (b) examining the role of sports and exercise in relation to orthorexia that takes the problematizing of "orthorexic behaviours" within the sports context into account.
Increasing Chemical Space Coverage by Combining Empirical and Computational Fragment Screens
2015-01-01
Most libraries for fragment-based drug discovery are restricted to 1,000–10,000 compounds, but over 500,000 fragments are commercially available and potentially accessible by virtual screening. Whether this larger set would increase chemotype coverage, and whether a computational screen can pragmatically prioritize them, is debated. To investigate this question, a 1281-fragment library was screened by nuclear magnetic resonance (NMR) against AmpC β-lactamase, and hits were confirmed by surface plasmon resonance (SPR). Nine hits with novel chemotypes were confirmed biochemically with KI values from 0.2 to low mM. We also computationally docked 290,000 purchasable fragments with chemotypes unrepresented in the empirical library, finding 10 that had KI values from 0.03 to low mM. Though less novel than those discovered by NMR, the docking-derived fragments filled chemotype holes from the empirical library. Crystal structures of nine of the fragments in complex with AmpC β-lactamase revealed new binding sites and explained the relatively high affinity of the docking-derived fragments. The existence of chemotype holes is likely a general feature of fragment libraries, as calculation suggests that to represent the fragment substructures of even known biogenic molecules would demand a library of minimally over 32,000 fragments. Combining computational and empirical fragment screens enables the discovery of unexpected chemotypes, here by the NMR screen, while capturing chemotypes missing from the empirical library and tailored to the target, with little extra cost in resources. PMID:24807704
The First Empirical Determination of the Fe10+ and Fe13+ Freeze-in Distances in the Solar Corona
NASA Astrophysics Data System (ADS)
Boe, Benjamin; Habbal, Shadia; Druckmüller, Miloslav; Landi, Enrico; Kourkchi, Ehsan; Ding, Adalbert; Starha, Pavel; Hutton, Joseph
2018-06-01
Heavy ions are markers of the physical processes responsible for the density and temperature distribution throughout the fine-scale magnetic structures that define the shape of the solar corona. One of their properties, whose empirical determination has remained elusive, is the “freeze-in” distance (Rf) where they reach fixed ionization states that are adhered to during their expansion with the solar wind. We present the first empirical inference of Rf for Fe10+ and Fe13+ derived from multi-wavelength imaging observations of the corresponding Fe XI (Fe10+) 789.2 nm and Fe XIV (Fe13+) 530.3 nm emission acquired during the 2015 March 20 total solar eclipse. We find that the two ions freeze in at different heliocentric distances. In polar coronal holes (CHs) Rf is around 1.45 R⊙ for Fe10+ and below 1.25 R⊙ for Fe13+. Along open field lines in streamer regions, Rf ranges from 1.4 to 2 R⊙ for Fe10+ and from 1.5 to 2.2 R⊙ for Fe13+. These first empirical Rf values: (1) reflect the differing plasma parameters between CHs and streamers and structures within them, including prominences and coronal mass ejections; (2) are well below the currently quoted values derived from empirical model studies; and (3) place doubt on the reliability of plasma diagnostics based on the assumption of ionization equilibrium beyond 1.2 R⊙.
An Empirical Derivation of the Run Time of the Bubble Sort Algorithm.
ERIC Educational Resources Information Center
Gonzales, Michael G.
1984-01-01
Suggests a moving pictorial tool to help teach principles in the bubble sort algorithm. Develops such a tool applied to an unsorted list of numbers and describes a method to derive the run time of the algorithm. The method can be modified to run the times of various other algorithms. (JN)
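The empirical derivation the abstract describes can be reproduced by instrumenting the algorithm itself. Counting comparisons, a deterministic proxy for run time, makes the quadratic growth visible without timing noise:

```python
import random

def bubble_sort(a, counter):
    """Classic bubble sort: repeatedly swap adjacent out-of-order pairs.
    counter[0] tallies comparisons so the growth in work can be read off
    empirically."""
    n = len(a)
    for i in range(n - 1):
        for j in range(n - 1 - i):
            counter[0] += 1
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
    return a

random.seed(1)
counts = {}
for n in (100, 200, 400):
    c = [0]
    bubble_sort([random.random() for _ in range(n)], c)
    counts[n] = c[0]   # this implementation always makes n*(n-1)/2 comparisons
```

Doubling the list length roughly quadruples the comparison count (counts[400]/counts[100] is about 16), the empirical signature of the O(n^2) run time the pictorial method derives. The same harness, swapped to wall-clock timing, extends to the other algorithms the abstract mentions.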
The Empirical Derivation of Equations for Predicting Subjective Textual Information. Final Report.
ERIC Educational Resources Information Center
Kauffman, Dan; And Others
A study was made to derive an equation for predicting the "subjective" textual information contained in a text of material written in the English language. Specifically, this investigation describes, by a mathematical equation, the relationship between the "subjective" information content of written textual material and the relative number of…
Cultural Accommodation of Substance Abuse Treatment for Latino Adolescents
Burrow-Sanchez, Jason; Martinez, Charles; Hops, Hyman; Wrona, Megan
2011-01-01
Collaborating with community stakeholders is an often suggested step when integrating cultural variables into psychological treatments for members of ethnic minority groups. However, there is a dearth of literature describing how to accomplish this process within the context of substance abuse treatment studies. This paper describes a qualitative study conducted through a series of focus groups with stakeholders in the Latino community. Data from focus groups were used by researchers to guide the integration of cultural variables into an empirically-supported substance abuse treatment for Latino adolescents currently being evaluated for efficacy. A model for culturally accommodating empirically-supported treatments for ethnic minority participants is also described. PMID:21888499
An integrated conceptual framework for evaluating and improving 'understanding' in informed consent.
Bossert, Sabine; Strech, Daniel
2017-10-17
The development of understandable informed consent (IC) documents has proven to be one of the most important challenges in research with humans as well as in healthcare settings. Therefore, evaluating and improving understanding has been of increasing interest for empirical research on IC. However, several conceptual and practical challenges for the development of understandable IC documents remain unresolved. In this paper, we will outline and systematize some of these challenges. On the basis of our own experiences in empirical user testing of IC documents as well as the relevant literature on understanding in IC, we propose an integrated conceptual model for the development of understandable IC documents. The proposed conceptual model integrates different methods for the participatory improvement of written information, including IC, as well as quantitative methods for measuring understanding in IC. In most IC processes, understandable written information is an important prerequisite for valid IC. To improve the quality of IC documents, a conceptual model for participatory procedures of testing, revising, and retesting can be applied. However, the model presented in this paper needs further theoretical and empirical elaboration and clarification of several conceptual and practical challenges.
NASA Astrophysics Data System (ADS)
Fortenberry, Ryan
The Spitzer Space Telescope observation of spectra most likely attributable to diverse and abundant populations of polycyclic aromatic hydrocarbons (PAHs) in space has led to tremendous interest in these molecules as tracers of the physical conditions in different astrophysical regions. A major challenge in using PAHs as molecular tracers is the complexity of the spectral features in the 3-20 μm region. The large number and vibrational similarity of the putative PAHs responsible for these spectra necessitate the determination of the most accurate basis spectra possible for comparison. It is essential that these spectra be established in order for the regions explored with the newest generation of observatories such as SOFIA and JWST to be understood. Current strategies to develop these spectra for individual PAHs involve either matrix-isolation IR measurements or quantum chemical calculations of harmonic vibrational frequencies. These strategies have been employed to develop the successful PAH IR spectral database as a repository of basis functions used to fit astronomically observed spectra, but they are limited in important ways. Both techniques provide an adequate description of the molecules in their electronic, vibrational, and rotational ground state, but these conditions do not represent energetically hot regions for PAHs near strong radiation fields of stars and are not direct representations of the gas phase. Some non-negligible matrix effects are known in condensed-phase studies, and the inclusion of anharmonicity in quantum chemical calculations is essential to generate physically relevant results, especially for hot bands. While scaling factors in either case can be useful, they are agnostic to the system studied and are not robustly predictive.
One strategy that has emerged to calculate the molecular vibrational structure uses vibrational perturbation theory along with a quartic force field (QFF) to account for higher-order derivatives of the potential energy surface. QFFs can regularly predict the fundamental vibrational frequencies to within 5 cm-1 of experimentally measured values. This level of accuracy represents a reduction in discrepancies by an order of magnitude compared with harmonic frequencies calculated with density functional theory (DFT). The major limitation of the QFF strategy is that the level of electronic-structure theory required to develop a predictive force field is prohibitively time consuming for molecular systems larger than 5 atoms. Recent advances in QFF techniques utilizing informed DFT approaches have pushed the size of the systems studied up to 24 heavy atoms, but relevant PAHs can have up to hundreds of atoms. We have developed alternative electronic-structure methods that maintain the accuracy of coupled-cluster calculations extrapolated to the complete basis set limit with relativistic and core correlation corrections applied: the CcCR QFF. These alternative methods are based on simplifications of Hartree-Fock theory in which the computationally intensive two-electron integrals are approximated using empirical parameters. These methods reduce computational time to orders of magnitude less than the CcCR calculations. We have derived a set of optimized empirical parameters to minimize the energy differences for molecular ions of astrochemical significance. We have shown that it is possible to derive a set of empirical parameters that will produce RMS energy differences of less than 2 cm-1 for our test systems. We are proposing to adopt this reparameterization strategy and some of the lessons learned from the informed DFT studies to create a semi-empirical method whose tremendous speed will allow us to study the rovibrational structure of large PAHs with up to hundreds of carbon atoms.
Integrating Professional Development across the Curriculum: An Effectiveness Study
ERIC Educational Resources Information Center
Ciarocco, Natalie J.; Dinella, Lisa M.; Hatchard, Christine J.; Valosin, Jayde
2016-01-01
The current study empirically tested the effectiveness of a modular approach to integrating professional development across an undergraduate psychology curriculum. Researchers conducted a two-group, between-subjects experiment on 269 undergraduate psychology students assessing perceptions of professional preparedness and learning. Analysis…
Are cross-cultural comparisons of norms on death anxiety valid?
Beshai, James A
2008-01-01
Cross-cultural comparisons of norms derived from research on Death Anxiety are valid as long as they provide existential validity. Existential validity is not empirically derived like construct validity. It is an understanding of being human unto death. It is the realization that death is imminent. It is the inner sense that provides a responder to death anxiety scales with a valid expression of his or her sense about the prospect of dying. It can be articulated in a life review by a disclosure of one's ontology. This article calls upon psychologists who develop death anxiety scales to disclose their presuppositions about death before administering a questionnaire. By disclosing his or her ontology, a psychologist provides a means of disclosing his or her intentionality in responding to the items. This humanistic paradigm allows for an interactive participation between investigator and subject. Lester, Templer, and Abdel-Khalek (2006-2007) enriched psychology with significant empirical data on several correlates of death anxiety. But all scientists, especially psychologists, will always have alternative interpretations of the same empirical fact pattern. Empirical data are limited by the logical fallacy of affirming the consequent. A phenomenology of language and communication makes existential validity a necessary step toward a broader understanding of the meaning of death anxiety.
Prediction of maximum earthquake intensities for the San Francisco Bay region
Borcherdt, Roger D.; Gibbs, James F.
1975-01-01
The intensity data for the California earthquake of April 18, 1906, are strongly dependent on distance from the zone of surface faulting and the geological character of the ground. Considering only those sites (approximately one square city block in size) for which there is good evidence for the degree of ascribed intensity, the empirical relation derived between 1906 intensities and distance perpendicular to the fault for 917 sites underlain by rocks of the Franciscan Formation is: Intensity = 2.69 - 1.90 log (Distance) (km). For sites on other geologic units, intensity increments, derived with respect to this empirical relation, correlate strongly with the Average Horizontal Spectral Amplifications (AHSA) determined from 99 three-component recordings of ground motion generated by nuclear explosions in Nevada. The resulting empirical relation is: Intensity Increment = 0.27 + 2.70 log (AHSA), and average intensity increments for the various geologic units are -0.29 for granite, 0.19 for Franciscan Formation, 0.64 for the Great Valley Sequence, 0.82 for Santa Clara Formation, 1.34 for alluvium, and 2.43 for bay mud. The maximum intensity map predicted from these empirical relations delineates areas in the San Francisco Bay region of potentially high intensity from future earthquakes on either the San Andreas fault or the Hayward fault.
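Both empirical relations are simple enough to evaluate directly; a minimal sketch, assuming base-10 logarithms as is conventional for such regressions:

```python
import math

def predicted_intensity(distance_km, intensity_increment=0.0):
    """1906 intensity vs. perpendicular fault distance for Franciscan
    Formation sites, shifted by a geologic-unit increment (e.g. +1.34
    for alluvium, +2.43 for bay mud)."""
    return 2.69 - 1.90 * math.log10(distance_km) + intensity_increment

def increment_from_ahsa(ahsa):
    """Intensity increment predicted from the Average Horizontal
    Spectral Amplification of a geologic unit."""
    return 0.27 + 2.70 * math.log10(ahsa)

# Example: a site on alluvium 5 km from the fault trace.
print(round(predicted_intensity(5.0, 1.34), 2))  # 2.7
```

The increments act as additive site corrections, so a maximum-intensity map follows from evaluating the first relation over a distance grid and adding the increment of the local geologic unit.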
Empirical Corrections to Nutation Amplitudes and Precession Computed from a Global VLBI Solution
NASA Astrophysics Data System (ADS)
Schuh, H.; Ferrandiz, J. M.; Belda-Palazón, S.; Heinkelmann, R.; Karbon, M.; Nilsson, T.
2017-12-01
The IAU2000A nutation and IAU2006 precession models were adopted to provide accurate estimations and predictions of the Celestial Intermediate Pole (CIP). However, they are not fully accurate, and VLBI (Very Long Baseline Interferometry) observations show that the CIP deviates from the position resulting from the application of the IAU2006/2000A model. Currently, those deviations or offsets of the CIP (Celestial Pole Offsets, CPO) can only be obtained by the VLBI technique. Accuracy of the order of 0.1 milliarcseconds (mas) makes it possible to compare the observed nutation with theoretical predictions for a rigid Earth and to constrain geophysical parameters describing the Earth's interior. In this study, we empirically evaluate the consistency, systematics and deviations of the IAU 2006/2000A precession-nutation model using several CPO time series derived from the global analysis of VLBI sessions. The final objective is the reassessment of the precession offset and rate, and of the amplitudes of the principal terms of nutation, attempting to empirically improve the conventional values derived from the precession/nutation theories. The statistical analysis of the residuals after re-fitting the main nutation terms demonstrates that our empirical corrections attain an error reduction of almost 15 microarcseconds.
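Re-fitting a precession offset, rate, and the amplitudes of a nutation term to a CPO series is, at heart, a linear least-squares problem. A minimal sketch on synthetic data (the series length, noise level, and amplitudes are invented for illustration, and only the 18.6-year principal term is fit):

```python
import numpy as np

# Synthetic daily celestial-pole-offset series (mas); hypothetical amplitudes.
t = np.arange(0.0, 25.0 * 365.25)            # time in days
omega = 2.0 * np.pi / (18.6 * 365.25)        # principal 18.6-yr nutation term
true = 0.05 + 1e-5 * t + 0.3 * np.sin(omega * t) + 0.2 * np.cos(omega * t)
obs = true + np.random.default_rng(0).normal(0.0, 0.1, t.size)

# Design matrix: offset, rate, and in/out-of-phase amplitudes of the term.
A = np.column_stack([np.ones_like(t), t, np.sin(omega * t), np.cos(omega * t)])
est, *_ = np.linalg.lstsq(A, obs, rcond=None)
print(est)  # recovers roughly [0.05, 1e-5, 0.3, 0.2]
```

The real analysis fits many nutation terms simultaneously and propagates the formal errors, but the structure of the estimator is the same.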
NASA Astrophysics Data System (ADS)
Hyer, E. J.; Reid, J. S.; Schmidt, C. C.; Giglio, L.; Prins, E.
2009-12-01
The diurnal cycle of fire activity is crucial for accurate simulation of atmospheric effects of fire emissions, especially at finer spatial and temporal scales. Estimating diurnal variability in emissions is also a critical problem for construction of emissions estimates from multiple sensors with variable coverage patterns. An optimal diurnal emissions estimate will use as much information as possible from satellite fire observations, compensate for known biases in those observations, and use detailed theoretical models of the diurnal cycle to fill in missing information. As part of ongoing improvements to the Fire Location and Monitoring of Burning Emissions (FLAMBE) fire monitoring system, we evaluated several different methods of integrating observations with different temporal sampling. We used geostationary fire detections from WF_ABBA, fire detection data from MODIS, empirical diurnal cycles from TRMM, and simple theoretical diurnal curves based on surface heating. Our experiments integrated these data in different combinations to estimate the diurnal cycles of emissions for each location and time. Hourly emissions estimates derived using these methods were tested using an aerosol transport model. We present results of this comparison, and discuss the implications of our results for the broader problem of multi-sensor data fusion in fire emissions modeling.
Spence Laschinger, Heather K; Gilbert, Stephanie; Smith, Lesley M; Leslie, Kate
2010-01-01
The purpose of this theoretical paper is to propose an integrated model of nurse/patient empowerment that could be used as a guide for creating high-quality nursing practice work environments that ensure positive outcomes for both nurses and their patients. There are few integrated theoretical approaches to nurse and patient empowerment in the literature, although nurse empowerment is assumed to positively affect patient outcomes. The constructs described in Kanter's (1993) work empowerment theory are conceptually consistent with the nursing care process and can be logically extended to nurses' interactions with their patients and the outcomes of nursing care. We propose a model of nurse/patient empowerment derived from Kanter's theory that suggests that empowering working conditions increase feelings of psychological empowerment in nurses, resulting in greater use of patient empowerment strategies by nurses, and, ultimately, greater patient empowerment and better health outcomes. Empirical testing of the model is recommended prior to use of the model in clinical practice. We argue that empowered nurses are more likely to empower their patients, which results in better patient and system outcomes. Strategies for managers to empower nurses and for nurses to empower patients are suggested.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rubin, E. S.; Hounshell, D. A.; Yeh, S.
2004-01-15
This project seeks to improve the ability of integrated assessment (IA) models to incorporate changes in the cost and performance of technologies, especially environmental technologies, over time. In this report, we present results of research that examines past experience in controlling other major power plant emissions that might serve as a reasonable guide to future rates of technological progress in carbon capture and sequestration (CCS) systems. In particular, we focus on U.S. and worldwide experience with sulfur dioxide (SO2) and nitrogen oxide (NOx) control technologies over the past 30 years, and derive empirical learning rates for these technologies. The patterns of technology innovation are captured by our analysis of patent activities and trends of cost reduction over time. Overall, we found learning rates of 11% for the capital costs of flue gas desulfurization (FGD) systems for SO2 control, and 13% for selective catalytic reduction (SCR) systems for NOx control. We explore the key factors responsible for the observed trends, especially the development of regulatory policies for SO2 and NOx control, and their implications for environmental control technology innovation.
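A learning rate in this sense is the fractional cost reduction per doubling of cumulative installed capacity; in the standard one-factor learning curve C(x) = C0 * x^(-b), the rate LR and exponent b are related by LR = 1 - 2^(-b). A small sketch (the initial cost is illustrative):

```python
import math

def progress_exponent(learning_rate):
    """Exponent b in C(x) = C0 * x**(-b) implied by a per-doubling learning rate."""
    return -math.log2(1.0 - learning_rate)

def cost_after_doublings(c0, learning_rate, doublings):
    """Unit cost after cumulative capacity doubles the given number of times."""
    return c0 * (1.0 - learning_rate) ** doublings

# 11% FGD learning rate: capital cost index after three capacity doublings.
print(round(cost_after_doublings(100.0, 0.11, 3), 1))  # 70.5
```

This is the functional form IA models typically use to endogenize technological change, with the empirically derived 11% and 13% rates supplying the exponent.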
NASA Technical Reports Server (NTRS)
Tilley, D. G.
1986-01-01
Directional ocean wave spectra were derived from Shuttle Imaging Radar (SIR-B) imagery in regions where nearly simultaneous aircraft-based measurements of the wave spectra were also available as part of the NASA Shuttle Mission 41G experiments. The SIR-B response to a coherently speckled scene is used to estimate the stationary system transfer function in the 15 even terms of an eighth-order two-dimensional polynomial. Surface elevation contours are assigned to SIR-B ocean scenes Fourier-filtered using an empirical model of the modulation transfer function calibrated with independent measurements of wave height. The empirical measurements of the wave height distribution are illustrated for a variety of sea states.
Forest canopy effects on snow accumulation and ablation: an integrative review of empirical results
Andres Varhola; Nicholas C. Coops; Markus Weiler; R. Dan Moore
2010-01-01
The past century has seen significant research comparing snow accumulation and ablation in forested and open sites. In this review we compile and standardize the results of previous empirical studies to generate statistical relations between changes in forest cover and the associated changes in snow accumulation and ablation rate. The analysis drew upon 33 articles...
NASA Technical Reports Server (NTRS)
Wahls, Richard A.
1990-01-01
The method presented is designed to improve the accuracy and computational efficiency of existing numerical methods for the solution of flows with compressible turbulent boundary layers. A compressible defect stream function formulation of the governing equations assuming an arbitrary turbulence model is derived. This formulation is advantageous because it has a constrained zero-order approximation with respect to the wall shear stress and the tangential momentum equation has a first integral. Previous problems with this type of formulation near the wall are eliminated by using empirically based analytic expressions to define the flow near the wall. The van Driest law of the wall for velocity and the modified Crocco temperature-velocity relationship are used. The associated compressible law of the wake is determined and it extends the valid range of the analytical expressions beyond the logarithmic region of the boundary layer. The need for an inner-region eddy viscosity model is completely avoided. The near-wall analytic expressions are patched to numerically computed outer region solutions at a point determined during the computation. A new boundary condition on the normal derivative of the tangential velocity at the surface is presented; this condition replaces the no-slip condition and enables numerical integration to the surface with a relatively coarse grid using only an outer region turbulence model. The method was evaluated for incompressible and compressible equilibrium flows and was implemented into an existing Navier-Stokes code using the assumption of local equilibrium flow with respect to the patching. The method has proven to be accurate and efficient.
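As an illustration of the near-wall analytic layer, the incompressible logarithmic law of the wall can be written down directly; the constants κ = 0.41 and B = 5.0 are conventional textbook values and the patch location is illustrative, while the compressible form adds the van Driest transformation, which is not reproduced here:

```python
import numpy as np

KAPPA, B = 0.41, 5.0  # conventional log-law constants

def u_plus_log(y_plus):
    """Mean velocity in wall units in the logarithmic region."""
    return np.log(y_plus) / KAPPA + B

# Analytic inner profile, to be patched to a numerically computed outer
# solution at a point inside the log layer (here, y+ = 100).
y_plus = np.geomspace(30.0, 300.0, 10)
u_plus = u_plus_log(y_plus)
print(round(float(u_plus_log(100.0)), 2))  # 16.23
```

Replacing the no-slip condition with a derivative condition at such a patch point is what lets the method use a relatively coarse grid and only an outer-region turbulence model.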
Chronic Fatigue Syndrome and Myalgic Encephalomyelitis: Toward An Empirical Case Definition
Jason, Leonard A.; Kot, Bobby; Sunnquist, Madison; Brown, Abigail; Evans, Meredyth; Jantke, Rachel; Williams, Yolonda; Furst, Jacob; Vernon, Suzanne D.
2015-01-01
Current case definitions of Myalgic Encephalomyelitis (ME) and chronic fatigue syndrome (CFS) have been based on consensus methods, but empirical methods could be used to identify core symptoms and thereby improve the reliability. In the present study, several methods (i.e., continuous scores of symptoms, theoretically and empirically derived cut off scores of symptoms) were used to identify core symptoms best differentiating patients from controls. In addition, data mining with decision trees was conducted. Our study found a small number of core symptoms that have good sensitivity and specificity, and these included fatigue, post-exertional malaise, a neurocognitive symptom, and unrefreshing sleep. Outcomes from these analyses suggest that using empirically selected symptoms can help guide the creation of a more reliable case definition. PMID:26029488
Uncertainties in Surface Layer Modeling
NASA Astrophysics Data System (ADS)
Pendergrass, W.
2015-12-01
A central problem for micrometeorologists has been the relationship of air-surface exchange rates of momentum and heat to quantities that can be predicted with confidence. The flux-gradient profile developed through Monin-Obukhov Similarity Theory (MOST) provides an integration of the dimensionless wind shear expression ϕ(ζ), an empirically derived function of stability for stable and unstable atmospheric conditions. Empirically derived expressions are far from universally accepted (Garratt, 1992, Table A5). Regardless of what form of these relationships might be used, their significance over any short period of time is questionable, since all of these relationships between fluxes and gradients apply to averages that might rarely occur. It is well accepted that the assumptions of stationarity and homogeneity do not reflect the true chaotic nature of the processes that control the variables considered in these relationships, with the net consequence that the levels of predictability theoretically attainable might never be realized in practice. This matter is of direct relevance to modern prognostic models, which construct forecasts by assuming the universal applicability of relationships among averages for the lower atmosphere, which rarely maintains an average state. Under a Cooperative Research and Development Agreement between NOAA and Duke Energy Generation, NOAA/ATDD conducted atmospheric boundary layer (ABL) research using Duke renewable energy sites as research testbeds. One aspect of this research has been the evaluation of legacy flux-gradient formulations (the ϕ functions, see Monin and Obukhov, 1954) for the exchange of heat and momentum. At the Duke Energy Ocotillo site, NOAA/ATDD installed sonic anemometers reporting wind and temperature fluctuations at 10 Hz at eight elevations. From these observations, ϕM and ϕH were derived from a two-year database of mean and turbulent wind and temperature observations.
From this extensive measurement database, using a methodology proposed by Kanemasu, Wesely and Hicks (1979), the overall dependence of ϕM and ϕH on ζ is characterized. Results indicate considerable scatter, with the familiar relationships, such as Paulson (1970), best describing the averages; however, it is the scatter that largely defines the attainable levels of predictability.
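For reference, the widely used Businger-Dyer forms of the ϕ functions can be sketched as follows; the coefficients 5 and 16 follow common textbook usage (e.g. Dyer's relations) and are not taken from this study:

```python
import numpy as np

def phi_m(zeta):
    """Dimensionless wind shear, Businger-Dyer form (zeta = z/L)."""
    zeta = np.asarray(zeta, dtype=float)
    unstable = (1.0 - 16.0 * np.minimum(zeta, 0.0)) ** -0.25
    return np.where(zeta >= 0.0, 1.0 + 5.0 * zeta, unstable)

def phi_h(zeta):
    """Dimensionless temperature gradient, Businger-Dyer form."""
    zeta = np.asarray(zeta, dtype=float)
    unstable = (1.0 - 16.0 * np.minimum(zeta, 0.0)) ** -0.5
    return np.where(zeta >= 0.0, 1.0 + 5.0 * zeta, unstable)

print(float(phi_m(0.1)), float(phi_m(-1.0)))  # 1.5 and ~0.49
```

Comparing two-year averages of observed ϕM and ϕH against curves of this form is exactly the kind of evaluation the study describes; the scatter about the curves is the quantity of interest.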
Spectral Classes for FAA's Integrated Noise Model Version 6.0.
DOT National Transportation Integrated Search
1999-12-07
The starting point in any empirical model such as the Federal Aviation Administration's (FAA) Integrated Noise Model (INM) is a reference data base. In Version 5.2 and in previous versions the reference data base consisted solely of a set of no...
The Mathematics of Medical Imaging in the Classroom.
ERIC Educational Resources Information Center
Funkhouser, Charles P.; Jafari, Farhad; Eubank, William B.
2002-01-01
Presents an integrated exposition of aspects of secondary school mathematics and a medical science specialty. Reviews clinical medical practice and theoretical and empirical literature in mathematics education and radiology to develop and pilot model integrative classroom topics and activities. Suggests mathematical applications in numeration and…
Miller, Benjamin F; Mendenhall, Tai J; Malik, Alan D
2009-03-01
Integrating behavioral health services within the primary care setting drives higher levels of collaborative care, and is proving to be an essential part of the solution for our struggling American healthcare system. However, justification for implementing and sustaining integrated and collaborative care has proven to be a formidable task. In an attempt to move beyond conflicting terminology found in the literature, we delineate terms and suggest a standardized nomenclature. Further, we maintain that addressing the three principal worlds of healthcare (clinical, operational, financial) is requisite in making sense of the spectrum of available implementations and ultimately transitioning collaborative care into the mainstream. Using a model that deconstructs process metrics into factors/barriers and generalizes behavioral health provider roles into major categories provides a framework to empirically discriminate between implementations across specific settings. This approach offers practical guidelines for care sites implementing integrated and collaborative care and defines a research framework to produce the evidence required for the aforementioned clinical, operational and financial worlds of this important movement.
ERIC Educational Resources Information Center
Ben-Eliyahu, Adar; Linnenbrink-Garcia, Lisa
2015-01-01
An integrative framework for investigating self-regulated learning situated in students' favorite and least favorite courses was empirically tested in a sample of 178 high school and 280 college students. Building on cognitive, clinical, social, and educational conceptions of self-regulation, the current paper integrated affective (e.g.,…
Empirical STORM-E Model. [I. Theoretical and Observational Basis
NASA Technical Reports Server (NTRS)
Mertens, Christopher J.; Xu, Xiaojing; Bilitza, Dieter; Mlynczak, Martin G.; Russell, James M., III
2013-01-01
Auroral nighttime infrared emission observed by the Sounding of the Atmosphere using Broadband Emission Radiometry (SABER) instrument onboard the Thermosphere-Ionosphere-Mesosphere Energetics and Dynamics (TIMED) satellite is used to develop an empirical model of geomagnetic storm enhancements to E-region peak electron densities. The empirical model is called STORM-E and will be incorporated into the 2012 release of the International Reference Ionosphere (IRI). The proxy for characterizing the E-region response to geomagnetic forcing is NO+(v) volume emission rates (VER) derived from the TIMED/SABER 4.3 μm channel limb radiance measurements. The storm-time response of the NO+(v) 4.3 μm VER is sensitive to auroral particle precipitation. A statistical database of storm-time to climatological quiet-time ratios of SABER-observed NO+(v) 4.3 μm VER are fit to widely available geomagnetic indices using the theoretical framework of linear impulse-response theory. The STORM-E model provides a dynamic storm-time correction factor to adjust a known quiescent E-region electron density peak concentration for geomagnetic enhancements due to auroral particle precipitation. Part II of this series describes the explicit development of the empirical storm-time correction factor for E-region peak electron densities, and shows comparisons of E-region electron densities between STORM-E predictions and incoherent scatter radar measurements. In this paper, Part I of the series, the efficacy of using SABER-derived NO+(v) VER as a proxy for the E-region response to solar-geomagnetic disturbances is presented. Furthermore, a detailed description of the algorithms and methodologies used to derive NO+(v) VER from SABER 4.3 μm limb emission measurements is given. Finally, an assessment of key uncertainties in retrieving NO+(v) VER is presented.
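The linear impulse-response framework amounts to convolving a geomagnetic index with a decaying response function to obtain a storm-time correction factor. The e-folding time, scaling coefficient, and synthetic index below are purely illustrative, not STORM-E's fitted values:

```python
import numpy as np

# Hypothetical impulse response: exponential decay with a ~12 h e-folding time.
dt_hours = 1.0
tau = 12.0
lags = np.arange(0, 72, dt_hours)
h = np.exp(-lags / tau)
h /= h.sum()                      # normalize so a sustained input passes through

ap = np.zeros(240)
ap[100:130] = 50.0                # a 30-hour synthetic geomagnetic disturbance

# Storm-time enhancement ratio: quiet-time baseline of 1 plus filtered forcing.
ratio = 1.0 + 0.01 * np.convolve(ap, h)[: ap.size]
print(ratio.max())                # peaks during the storm, decays afterwards
```

The convolution builds in the physically expected lag and recovery of the E-region response, which a pointwise regression against the index would miss.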
Generalizing Integrals Involving x^x and Series Involving n^n
ERIC Educational Resources Information Center
Osler, Thomas J.; Tsay, Jeffrey
2005-01-01
In this paper, the authors evaluate the series and integrals presented by P. Glaister. The authors show that the integrand has a Maclaurin series expansion. The authors derive the series from the integral in two ways. The first derivation uses the technique employed by Glaister. The second derivation uses a change of variable in the integral.
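A classic identity of this type, the so-called sophomore's dream, equates the integral of x^x on [0, 1] with an alternating series in n^(-n); it is easy to check numerically (the truncation depth and grid size are arbitrary):

```python
# Sophomore's dream: integral_0^1 x**x dx = sum_{n>=1} (-1)**(n+1) * n**(-n).
series = sum((-1) ** (n + 1) / n ** n for n in range(1, 20))

# Midpoint-rule estimate of the integral (note x**x -> 1 as x -> 0+).
N = 200_000
integral = sum(((k + 0.5) / N) ** ((k + 0.5) / N) for k in range(N)) / N

print(round(series, 5), round(integral, 5))  # both ≈ 0.78343
```

The term-by-term integration that proves the identity is the same expand-then-integrate technique the article's two derivations revolve around.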
Sigmoid function based integral-derivative observer and application to autopilot design
NASA Astrophysics Data System (ADS)
Shao, Xingling; Wang, Honglun; Liu, Jun; Tang, Jun; Li, Jie; Zhang, Xiaoming; Shen, Chong
2017-02-01
To handle the problems of accurate signal reconstruction and controller implementation with integral and derivative components in the presence of noisy measurements, and motivated by the design principles of the sigmoid-function-based tracking differentiator and the nonlinear continuous integral-derivative observer, a novel sigmoid-function-based integral-derivative observer (SIDO) is developed. The key merit of the proposed SIDO is that it can simultaneously provide continuous integral and differential estimates with almost no drift or chattering, as well as acceptable noise tolerance, from the output measurement; its stability is established based on exponential stability and singular perturbation theory. In addition, the effectiveness of SIDO in suppressing drift phenomena and high-frequency noise is first revealed using describing functions and confirmed through simulation comparisons. Finally, the theoretical results on SIDO are demonstrated with application to autopilot design: 1) the integral and tracking estimates are extracted from the sensed pitch angular rate contaminated by nonwhite noise in the feedback loop; 2) the PID (proportional-integral-derivative) attitude controller is realized by adopting the error estimates offered by SIDO instead of using the ideal integral and derivative operators, achieving satisfactory tracking performance under control constraints.
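The general idea, though not the paper's SIDO, can be sketched with a toy second-order observer whose correction term is a smooth sigmoid rather than a discontinuous sign function; the gains, time step, and test signal are all invented for illustration:

```python
import math

def sig(x, a=5.0):
    """Smooth sigmoid saturation used in place of a sign function."""
    return math.tanh(a * x)

# xhat tracks the noisy signal, dhat its derivative, ihat its running integral.
dt, L1, L2 = 1e-3, 20.0, 100.0
xhat = dhat = ihat = 0.0
for k in range(5000):
    t = k * dt
    y = t + 0.01 * math.sin(50.0 * t)   # ramp of slope 1 plus a fast disturbance
    e = y - xhat
    xhat += dt * (dhat + L1 * sig(e))
    dhat += dt * L2 * sig(e)
    ihat += dt * xhat
print(round(dhat, 1))  # ≈ 1.0: slope recovered despite the disturbance
```

The sigmoid keeps the injection bounded and smooth, which is the mechanism behind the chattering-free derivative estimates the abstract describes; the actual SIDO structure and gain selection are given in the paper.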
von Zerssen, Detlev
2002-04-01
A unidimensional model of the relationships between normal temperament, psychopathic variants of it and the two main forms of so-called endogenous psychoses (major affective disorders and schizophrenia) was derived from Kretschmer's constitutional typology. It was, however, not confirmed by means of a biometric approach nor was Kretschmer's broad concept of cyclothymia as a correlate of physical stoutness on the one hand and major affective disorders on the other supported by empirical data. Yet the concept of the 'melancholic type' of personality of patients with severe unipolar major depression (melancholia) which resembles descriptions by psychoanalysts could be corroborated. This was also true for the 'manic type' of personality as a (premorbid) correlate of predominantly manic forms of a bipolar I disorder. As predicted from a spectrum concept of major affective disorders, the ratio of traits of either type co-varied with the ratio of the depressive and the manic components in the long-term course of such a disorder. The two types of premorbid personality and a rare variant of the 'manic type', named 'relaxed, easy-going type', were conceived as 'affective types' dominating in major affective disorders. They are opposed to three 'neurotoid types' prevailing in so-called neurotic disorders as well as in schizophrenic psychoses. The similarity among the types can be visualized as spatial relationships in a circular, i.e. a two-dimensional, model (circumplex). Personality disorders as maladapted extreme variants of personality are, by definition, located outside the circle, mainly along its 'neurotoid' side. However, due to their transitional nature, axis I disorders cannot be represented adequately within the plane which represents (adapted as well as maladapted) forms of habitual behaviour (personality types and disorders, respectively). 
To integrate them into the spatial model of similarity interrelations, a dimension of actual psychopathology has to be added to the two-dimensional plane as a third (orthogonal) axis. The distance of a case from the 'ground level' of habitual behaviour corresponds to the severity of the actual psychopathological state. The specific form of that state (e.g. manic or depressive), however, varies along one of the axes which define the circumplex of habitual behaviour. This three-dimensional model is, by its very nature, more complex than the unidimensional one derived from Kretschmer's typological concept, but it is clearly more in accordance with empirical data.
The derivation of scenic utility functions and surfaces and their role in landscape management
John W. Hamilton; Gregory J. Buhyoff; J. Douglas Wellman
1979-01-01
This paper outlines a methodological approach for determining relevant physical landscape features which people use in formulating judgments about scenic utility. This information, coupled with either empirically derived or rationally stipulated regression techniques, may be used to produce scenic utility functions and surfaces. These functions can provide a means for...
TEACHING PHYSICS: Biking around a hollow sphere
NASA Astrophysics Data System (ADS)
Mak, Se-yuen; Yip, Din-yan
1999-11-01
The conditions required for a cyclist riding a motorbike in a horizontal circle on or above the equator of a hollow sphere are derived using concepts of equilibrium and the condition for uniform circular motion. The result is compared with an empirical analysis based on a video show. Some special cases of interest derived from the general solution are elaborated.
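The simplest special case, riding a horizontal circle exactly at the equator of the sphere, reduces to the classic "wall of death" condition: the normal force supplies the centripetal acceleration and friction must support the weight. A sketch (the radius and friction coefficient are illustrative):

```python
import math

def min_speed_at_equator(radius_m, mu, g=9.81):
    """Minimum speed for a horizontal circle at the equator of a sphere.
    The normal force N = m v^2 / R is centripetal, and friction f <= mu*N
    must balance the weight m g, giving v >= sqrt(g R / mu)."""
    return math.sqrt(g * radius_m / mu)

# A 10 m sphere with rubber-on-surface friction mu = 0.8:
print(round(min_speed_at_equator(10.0, 0.8), 2))  # ≈ 11.07 m/s
```

Above the equator the normal force tilts, so it contributes to supporting the weight as well as to the centripetal force; that general case is what the article derives.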
GPS-Derived Precipitable Water Compared with the Air Force Weather Agency’s MM5 Model Output
2002-03-26
...and less than 100 sensors are available throughout Europe. While the receiver density is currently comparable to the upper-air sounding network... profiles from 38 upper air sites throughout Europe. Based on these empirical formulae and simplifications, Bevis (1992) has determined that the error... Alaska using Bevis' (1992) empirical correlation based on 8718 radiosonde calculations over 2 years. Other studies have been conducted in Europe and
Promoting research on research integrity in Canada.
Master, Zubin; McDonald, Michael; Williams-Jones, Bryn
2012-01-01
Research on research integrity is an important element in building a strong national research integrity framework. There is a lack of empirical evidence and conceptual research on research integrity in Canada. To further strengthen and develop our system of research integrity, we believe that greater support is needed to promote research on research integrity. Research on research integrity is imperative in order to gain a richer understanding of the diversity of responsible conduct of research norms, practices, education and policies from a Canadian perspective. The knowledge gained would help in the development of an evidenced-based and responsive Canadian system of research integrity.
Where Are the People? The Human Viewpoint Approach for Architecting and Acquisition
2014-10-01
...however, systems engineers currently do not have sufficient tools to quantitatively integrate human considerations into systems development (Hardman... Engineering, 13(1), 72–79. Hardman, N., & Colombi, J. (2012). An empirical methodology for human integration in the SE technical processes. Journal of Systems
The Mathematics of Medical Imaging in the Classroom
ERIC Educational Resources Information Center
Funkhouser, Charles P.; Jafari, Farhad; Eubank, William B.
2002-01-01
The article presents an integrated exposition of aspects of secondary school mathematics and a medical science specialty together with related classroom activities. Clinical medical practice and theoretical and empirical literature in mathematics education and radiology were reviewed to develop and pilot model integrative classroom topics and…
Situational Favorability and Perceived Environmental Uncertainty: An Integrative Approach
ERIC Educational Resources Information Center
Nebeker, Delbert M.
1975-01-01
Presents the conceptual and empirical basis for a possible combining of Fiedler's contingency model of leadership effectiveness and Lawrence and Lorsch's contingency organization theory. Using perceived environmental uncertainty as the integrating concept, a measure of decision uncertainty was found to be significantly related to Fiedler's…
van Noort, Paul C M
2009-06-01
Fugacity ratios of organic compounds are used to calculate (subcooled) liquid properties, such as solubility or vapour pressure, from solid properties and vice versa. They can be calculated from the entropy of fusion, the melting temperature, and heat capacity data for the solid and the liquid. For many organic compounds, values for the fusion entropy are lacking. Heat capacity data are even scarcer. In the present study, semi-empirical compound class specific equations were derived to estimate fugacity ratios from molecular weight and melting temperature for polycyclic aromatic hydrocarbons and polychlorinated benzenes, biphenyls, dibenzo[p]dioxins and dibenzofurans. These equations estimate fugacity ratios with an average standard error of about 0.05 log units. In addition, for compounds with known fusion entropy values, a general semi-empirical correction equation based on molecular weight and melting temperature was derived for estimation of the contribution of heat capacity differences to the fugacity ratio. This equation estimates the heat capacity contribution correction factor with an average standard error of 0.02 log units for polycyclic aromatic hydrocarbons, polychlorinated benzenes, biphenyls, dibenzo[p]dioxins and dibenzofurans.
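The underlying thermodynamic relation, neglecting the heat-capacity term that the paper's correction addresses, is ln F = -ΔS_fus(T_m - T)/(RT). A sketch using the generic Walden-rule fusion entropy of about 56.5 J/(mol K) rather than the paper's compound-class fits:

```python
import math

R = 8.314  # gas constant, J/(mol K)

def log10_fugacity_ratio(entropy_fusion, t_melt, t=298.15):
    """log10 of the solid/liquid fugacity ratio from the standard
    approximation ln F = -dS_fus * (Tm - T) / (R * T), with the
    heat-capacity contribution neglected."""
    return -entropy_fusion * (t_melt - t) / (R * t * math.log(10.0))

# Walden-rule default dS_fus ~ 56.5 J/(mol K); e.g. a solid melting at 373.15 K.
print(round(log10_fugacity_ratio(56.5, 373.15), 3))  # -0.742
```

The paper's contribution is to replace the generic fusion entropy with compound-class regressions on molecular weight and melting temperature, and to add a correction for the heat-capacity term that this sketch drops.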
An Empirical Method for deriving RBE values associated with Electrons, Photons and Radionuclides
Bellamy, Michael B; Puskin, J.; Eckerman, Keith F.; ...
2015-01-01
There is substantial evidence to justify using relative biological effectiveness (RBE) values greater than one for low-energy electrons and photons. But, in the field of radiation protection, radiation associated with low linear energy transfer (LET) has been assigned a radiation weighting factor wR of one. This value may be suitable for radiation protection but, for risk considerations, it is important to evaluate the potentially elevated biological effectiveness of radiation to improve the quality of risk estimates. RBE values between 2 and 3 for tritium are implied by several experimental measurements. Additionally, elevated RBE values have been found for other similar low-energy radiation sources. In this work, RBE values are derived for electrons based upon the fractional deposition of absorbed dose at energies less than a few keV. Using this empirical method, RBE values were also derived for monoenergetic photons and 1070 radionuclides from ICRP Publication 107 for which photons and electrons are the primary emissions.
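One way to read "derived from the fractional deposition of absorbed dose" is as a dose-weighted mixture; a sketch under that assumption, with both the low-energy RBE value and the dose fraction invented for illustration (the paper's actual method and coefficients are not reproduced here):

```python
def rbe_from_fraction(f_low, rbe_low=2.5):
    """Hypothetical dose-weighted RBE: unit RBE for the high-energy share
    of absorbed dose, an elevated value for the share below a few keV."""
    return (1.0 - f_low) * 1.0 + f_low * rbe_low

# A hypothetical beta emitter depositing 80% of its dose below a few keV:
print(rbe_from_fraction(0.8))  # 2.2
```

A weighting of this kind would reproduce the observed pattern that tritium, with most of its dose deposited at very low electron energies, carries an RBE between 2 and 3.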
Domain walls and ferroelectric reversal in corundum derivatives
NASA Astrophysics Data System (ADS)
Ye, Meng; Vanderbilt, David
2017-01-01
Domain walls are the topological defects that mediate polarization reversal in ferroelectrics, and they may exhibit quite different geometric and electronic structures compared to the bulk. Therefore, a detailed atomic-scale understanding of the static and dynamic properties of domain walls is of pressing interest. In this work, we use first-principles methods to study the structures of 180∘ domain walls, both in their relaxed state and along the ferroelectric reversal pathway, in ferroelectrics belonging to the family of corundum derivatives. Our calculations predict their orientation, formation energy, and migration energy and also identify important couplings between polarization, magnetization, and chirality at the domain walls. Finally, we point out a strong empirical correlation between the height of the domain-wall-mediated polarization reversal barrier and the local bonding environment of the mobile A cations as measured by bond-valence sums. Our results thus provide both theoretical and empirical guidance for future searches for ferroelectric candidates in materials of the corundum derivative family.
Boritz, Tali Z; Bryntwick, Emily; Angus, Lynne; Greenberg, Leslie S; Constantino, Michael J
2014-01-01
While the individual contributions of narrative and emotion processes to psychotherapy outcome have been the focus of recent interest in the psychotherapy research literature, the empirical analysis of narrative and emotion integration has rarely been addressed. The Narrative-Emotion Processes Coding System (NEPCS) was developed to provide researchers with a systematic method for identifying specific narrative and emotion process markers, for application to therapy session videos. The present study examined the relationship between NEPCS-derived problem markers (same old storytelling, empty storytelling, unstoried emotion, abstract storytelling) and change markers (competing plotlines storytelling, inchoate storytelling, unexpected outcome storytelling, and discovery storytelling), and treatment outcome (recovered versus unchanged at therapy termination) and stage of therapy (early, middle, late) in brief emotion-focused (EFT), client-centred (CCT), and cognitive (CT) therapies for depression. Hierarchical linear modelling analyses demonstrated a significant Outcome effect for inchoate storytelling (p = .037) and discovery storytelling (p = .002), a Stage × Outcome effect for abstract storytelling (p = .05), and a Stage × Outcome × Treatment effect for competing plotlines storytelling (p = .001). There was also a significant Stage × Outcome effect for NEPCS problem markers (p = .007) and change markers (p = .03). The results provide preliminary support for the importance of assessing the contribution of narrative-emotion processes to efficacious treatment outcomes in EFT, CCT, and CT treatments of depression.
ERIC Educational Resources Information Center
Roeder, Peter Martin
1999-01-01
Critiques two normative premises that guide the researchers' interpretation of results from the Hamburg School Experiment, an empirical study that focused on mainstreaming elementary students diagnosed as needing special education: (1) integrating these children in normal classrooms is legitimated; and (2) social integration should not preclude…
Empirical microeconomics action functionals
NASA Astrophysics Data System (ADS)
Baaquie, Belal E.; Du, Xin; Tanputraman, Winson
2015-06-01
A statistical generalization of microeconomics has been made in Baaquie (2013), where the market price of every traded commodity, at each instant of time, is considered to be an independent random variable. The dynamics of commodity market prices is modeled by an action functional, and the focus of this paper is to empirically determine the action functionals for different commodities. The correlation functions of the model are defined using a Feynman path integral. The model is calibrated using the unequal time correlation of the market commodity prices as well as their cubic and quartic moments using a perturbation expansion. The consistency of the perturbation expansion is verified by a numerical evaluation of the path integral. Nine commodities drawn from the energy, metal and grain sectors are studied and their market behavior is described by the model to an accuracy of over 90% using only six parameters. The paper empirically establishes the existence of the action functional for commodity prices that was postulated to exist in Baaquie (2013).
2013-02-01
outcomes related to leader performance. Another significant area of interest within the empirical literature is emotional intelligence (EI), which...officers’ overall emotional intelligence and effectiveness as a leader. More specifically, when analyzing the subscales, the researchers detected...commonly associated qualities like mental abilities or emotional intelligence.17 Similar results have been obtained in other studies with a variety
Modeling the Role of Priming in Executive Control: Cognitive and Neural Constraints
2012-01-24
theoretical and empirical advances in our understanding of cognitive control. We discovered new phenomena and developed theories to account for them. We...developed theories of cognitive control and visual attention that integrated mathematical psychology with cognitive science and with neuroscience. We...significant theoretical and empirical advances in our understanding of cognitive control. We discovered new phenomena and developed theories to account
VizieR Online Data Catalog: A framework for empirical galaxy phenomenology (Munoz+, 2015)
NASA Astrophysics Data System (ADS)
Munoz, J. A.; Peeples, M. S.
2017-11-01
In this study, we develop a cohesive theoretical formalism for translating empirical relations into an understanding of the variations in galactic star formation histories. We achieve this goal by incorporating into the Main Sequence Integration (MSI) method the scatter suggested by the evolving fraction of quiescent galaxies and the spread in the observed stellar mass-star formation rate relation. (2 data files).
Students' communication, argumentation and knowledge in a citizens' conference on global warming
NASA Astrophysics Data System (ADS)
Albe, Virginie; Gombert, Marie-José
2012-09-01
An empirical study on 12th-grade students' engagement on a global warming debate as a citizens' conference is reported. Within the design-based research methodology, an interdisciplinary teaching sequence integrating an initiation to non-violent communication was developed. Students' debates were analyzed according to three dimensions: communication, argumentation, and knowledge. Students regulated their oral contributions to the debate by identifying judgments in their discussions. Rhetorical processes developed by students were mainly related to the identity of debate protagonists with interest attributions, authority, and positions. Students' arguments also relied on empirical data. The students' knowledge focused on energy choices, economic, political, and science development issues. Implications for socioscientific issues integration in class are discussed.
NASA Astrophysics Data System (ADS)
Muzy, Jean-François; Baïle, Rachel; Bacry, Emmanuel
2013-04-01
In this paper we propose a new model for volatility fluctuations in financial time series. This model relies on a nonstationary Gaussian process that exhibits aging behavior. It turns out that its properties, over any finite time interval, are very close to continuous cascade models. These latter models are indeed well known to reproduce faithfully the main stylized facts of financial time series. However, they involve a large-scale parameter (the so-called “integral scale” where the cascade is initiated) that is hard to interpret in finance. Moreover, the empirical value of the integral scale is in general strongly correlated with the overall length of the sample. This feature is precisely predicted by our model, which, as illustrated by various examples from daily stock index data, quantitatively reproduces the empirical observations.
NASA Astrophysics Data System (ADS)
Ma, Chao; Ma, Qinghua; Yao, Haixiang; Hou, Tiancheng
2018-03-01
In this paper, we propose to use the Fractional Stable Process (FSP) for option pricing. The FSP is one of the few candidates to directly model a number of desired empirical properties of asset price risk-neutral dynamics. However, pricing the vanilla European option under the FSP is difficult and problematic. Building upon Feynman path integral techniques, we present a novel computational model for option pricing, the Fractional Stable Process Path Integral (FSPPI) model, under a general fractional stable distribution that tackles this problem. Numerical and empirical experiments show that the proposed pricing model provides a correction of the Black-Scholes pricing error (overpricing long-term options and underpricing short-term options; overpricing out-of-the-money options and underpricing in-the-money options) without any additional structures such as stochastic volatility or a jump process.
The birth of the empirical turn in bioethics.
Borry, Pascal; Schotsmans, Paul; Dierickx, Kris
2005-02-01
Since its origin, bioethics has attracted the collaboration of few social scientists, and social scientific methods of gathering empirical data have remained unfamiliar to ethicists. Recently, however, the clouded relations between the empirical and normative perspectives on bioethics appear to be changing. Three reasons explain why there was no easy and consistent input of empirical evidence in bioethics. Firstly, interdisciplinary dialogue runs the risk of communication problems and divergent objectives. Secondly, the social sciences were absent partners since the beginning of bioethics. Thirdly, the meta-ethical distinction between 'is' and 'ought' created a 'natural' border between the disciplines. Now, bioethics tends to accommodate more empirical research. Three hypotheses explain this emergence. Firstly, dissatisfaction with a foundationalist interpretation of applied ethics created a stimulus to incorporate empirical research in bioethics. Secondly, clinical ethicists became engaged in empirical research due to their strong integration in the medical setting. Thirdly, the rise of the evidence-based paradigm had an influence on the practice of bioethics. However, a problematic relationship cannot simply and easily evolve into a perfect interaction. A new and positive climate for empirical approaches has arisen, but the original difficulties have not disappeared.
Köster, Andreas; Spura, Thomas; Rutkai, Gábor; Kessler, Jan; Wiebeler, Hendrik; Vrabec, Jadran; Kühne, Thomas D
2016-07-15
The accuracy of water models derived from ab initio molecular dynamics simulations by means of an improved force-matching scheme is assessed for various thermodynamic, transport, and structural properties. It is found that although the resulting force-matched water models are typically less accurate than fully empirical force fields in predicting thermodynamic properties, they are nevertheless much more accurate than generally appreciated in reproducing the structure of liquid water, in fact superseding most of the commonly used empirical water models. This development demonstrates the feasibility of routinely parametrizing computationally efficient yet predictive potential energy functions based on accurate ab initio molecular dynamics simulations for a large variety of different systems. © 2016 Wiley Periodicals, Inc.
Common Factors: Where the Soul of Counseling and Psychotherapy Resides
ERIC Educational Resources Information Center
Ottens, Allen J.; Klein, James F.
2005-01-01
The authors show how theoretical and empirical findings from the common factors and psychotherapy integration literatures possess potential for infusing soul into psychotherapy. They describe the term soul, outline how the definition translates into soul-nurturing psychotherapy, examine the common factors and integration literatures, and discuss…
Moving toward a Mobile Learning Landscape: Presenting a M-Learning Integration Framework
ERIC Educational Resources Information Center
Crompton, Helen
2017-01-01
Purpose: Mobile devices transcend the educational affordances provided by conventional tethered electronic and traditional learning. However, empirical findings show that educators are not integrating technology effectively into the curriculum. This paper aims to discuss these issues. Design/Methodology/Approach: In this study, a thematic…
ERIC Educational Resources Information Center
Galliher, Renee V.; Rivas-Drake, Deborah; Dubow, Eric F.
2017-01-01
This introductory summary provides an overview of the content of the special issue entitled "Identity Development Process and Content: Toward an Integrated and Contextualized Science of Identity." The 16 theoretical and empirical articles that comprise this special issue were selected to highlight innovative methodologies, theoretical…
Precision Orbit Derived Atmospheric Density: Development and Performance
NASA Astrophysics Data System (ADS)
McLaughlin, C.; Hiatt, A.; Lechtenberg, T.; Fattig, E.; Mehta, P.
2012-09-01
Precision orbit ephemerides (POE) are used to estimate atmospheric density along the orbits of CHAMP (Challenging Minisatellite Payload) and GRACE (Gravity Recovery and Climate Experiment). The densities are calibrated against accelerometer derived densities, taking ballistic coefficient estimation results into account. The 14-hour density solutions are stitched together using a linear weighted blending technique to obtain continuous solutions over the entire mission life of CHAMP and through 2011 for GRACE. POE derived densities outperform the High Accuracy Satellite Drag Model (HASDM), Jacchia 71 model, and NRLMSISE-2000 model densities when comparing cross correlation and RMS with accelerometer derived densities. Drag is the largest error source for estimating and predicting orbits for low Earth orbit satellites. This is one of the major areas that should be addressed to improve overall space surveillance capabilities; in particular, catalog maintenance. Generally, density is the largest error source in satellite drag calculations, and current empirical density models such as Jacchia 71 and NRLMSISE-2000 have significant errors. Dynamic calibration of the atmosphere (DCA) has provided measurable improvements to the empirical density models, and accelerometer derived densities of extremely high precision are available for a few satellites. However, DCA generally relies on observations of limited accuracy, and accelerometer derived densities are extremely limited in terms of measurement coverage at any given time. The goal of this research is to provide an additional data source using satellites that have precision orbits available from Global Positioning System measurements and/or satellite laser ranging. These measurements strike a balance between the global coverage provided by DCA and the precise measurements of accelerometers.
The temporal resolution of the POE derived density estimates is around 20-30 minutes, which is significantly worse than that of accelerometer derived density estimates. However, major variations in density are observed in the POE derived densities. These POE derived densities in combination with other data sources can be assimilated into physics based general circulation models of the thermosphere and ionosphere with the possibility of providing improved density forecasts for satellite drag analysis. POE derived density estimates were initially developed using CHAMP and GRACE data so comparisons could be made with accelerometer derived density estimates. This paper presents the results of the most extensive calibration of POE derived densities compared to accelerometer derived densities and provides the reasoning for selecting certain parameters in the estimation process. The factors taken into account for these selections are the cross correlation and RMS performance compared to the accelerometer derived densities and the output of the ballistic coefficient estimation that occurs simultaneously with the density estimation. This paper also presents the complete data set of CHAMP and GRACE results and shows that the POE derived densities match the accelerometer densities better than empirical models or DCA. This paves the way to expand the POE derived densities to include other satellites with quality GPS and/or satellite laser ranging observations.
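The "linear weighted blending" used above to stitch successive 14-hour density solutions into a continuous series can be sketched as a simple cross-fade over the overlap region. This is a generic illustration under assumed 1-D array inputs; the paper does not specify its exact weighting scheme, so the ramp below is an assumption.

```python
import numpy as np

def blend_overlap(left, right, n_overlap):
    """Stitch two overlapping density segments with linear cross-fade weights.

    left, right : 1-D arrays of density estimates whose last / first
                  n_overlap samples cover the same time span.
    Returns one continuous array: over the overlap, the result is a
    weighted average ramping linearly from the left solution to the right.
    """
    left = np.asarray(left, dtype=float)
    right = np.asarray(right, dtype=float)
    w = np.linspace(0.0, 1.0, n_overlap)               # 0 -> 1 across overlap
    blended = (1.0 - w) * left[-n_overlap:] + w * right[:n_overlap]
    return np.concatenate([left[:-n_overlap], blended, right[n_overlap:]])
```

Chaining this pairwise over all consecutive solution windows would yield one continuous density time series per satellite.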
Empirical Model of Precipitating Ion Oval
NASA Astrophysics Data System (ADS)
Goldstein, Jerry
2017-10-01
In this brief technical report, published maps of ion integral flux are used to constrain an empirical model of the precipitating ion oval. The ion oval is modeled as a Gaussian function of ionospheric latitude that depends on local time and the Kp geomagnetic index. The three parameters defining this function are the centroid latitude, width, and amplitude. The local time dependences of these three parameters are approximated by Fourier series expansions whose coefficients are constrained by the published ion maps. The Kp dependence of each coefficient is modeled by a linear fit. Optimization of the number of terms in the expansion is achieved via minimization of the global standard deviation between the model and the published ion map at each Kp. The empirical model is valid near the peak flux of the auroral oval; inside its centroid region the model reproduces the published ion maps with standard deviations of less than 5% of the peak integral flux. On the subglobal scale, average local errors (measured as a fraction of the point-to-point integral flux) are below 30% in the centroid region. Outside its centroid region the model deviates significantly from the H89 integral flux maps. The model's performance is assessed by comparing it with both local and global data from a 17 April 2002 substorm event. The model can reproduce important features of the macroscale auroral region, but not its subglobal structure, and not the interval immediately following a substorm.
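The parameterization described above (a Gaussian in latitude whose centroid, width, and amplitude are Fourier series in local time, with Kp-linear coefficients) might be sketched as follows. The function names are hypothetical and the example values are placeholders, not the published coefficients.

```python
import numpy as np

def fourier_param(coeffs, mlt, period=24.0):
    """Evaluate a truncated Fourier series a0 + sum_n (an cos + bn sin).

    coeffs : [a0, (a1, b1), (a2, b2), ...]; in the published model each
             coefficient would itself be a linear function of Kp.
    mlt    : magnetic local time in hours.
    """
    a0, pairs = coeffs[0], coeffs[1:]
    phase = 2.0 * np.pi * mlt / period
    return a0 + sum(a * np.cos(n * phase) + b * np.sin(n * phase)
                    for n, (a, b) in enumerate(pairs, start=1))

def ion_oval_flux(mlat, mlt, centroid, width, amplitude):
    """Gaussian model of precipitating-ion integral flux versus latitude.

    centroid, width, amplitude : callables of MLT returning the three
    Gaussian parameters (e.g. built from fourier_param above).
    """
    mu, sigma, a = centroid(mlt), width(mlt), amplitude(mlt)
    return a * np.exp(-0.5 * ((mlat - mu) / sigma) ** 2)
```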
Loving-kindness brings loving-kindness: the impact of Buddhism on cognitive self-other integration.
Colzato, Lorenza S; Zech, Hilmar; Hommel, Bernhard; Verdonschot, Rinus; van den Wildenberg, Wery P M; Hsieh, Shulan
2012-06-01
Common wisdom has it that Buddhism enhances compassion and self-other integration. We put this assumption to empirical test by comparing practicing Taiwanese Buddhists with well-matched atheists. Buddhists showed more evidence of self-other integration in the social Simon task, which assesses the degree to which people co-represent the actions of a coactor. This suggests that self-other integration and task co-representation vary as a function of religious practice.
Godecharle, Simon; Nemery, Benoit; Dierickx, Kris
2017-09-14
Despite the ever-increasing collaboration between industry and universities, previous empirical studies on research integrity and misconduct have excluded participants from the biomedical industry. Hence, there is a lack of empirical data on how research managers and biomedical researchers active in industry perceive the issues of research integrity and misconduct, and whether or not their perspectives differ from those of researchers and research managers active in universities. If divergent standards concerning research integrity and misconduct are upheld in industry and universities, this might undermine research collaborations. We therefore performed a qualitative study, conducting 22 semi-structured interviews, in order to investigate and compare the perspectives and attitudes concerning research integrity and misconduct of research managers and biomedical researchers active in industry and universities. Our study showed clear discrepancies between both groups. Diverse strategies for managing research misconduct and stimulating research integrity were observed. Different definitions of research misconduct were given, indicating that similar actions are judged heterogeneously. There were also differences at an individual level, regardless of whether the interviewees were active in industry or universities. Overall, the management of research integrity proves to be a difficult exercise, owing to many diverse perspectives on several essential elements connected to research integrity and misconduct. A management policy that is not in line with the vision of the biomedical researchers and research managers is at risk of being inefficient.
Competency-Based Curriculum Development: A Pragmatic Approach
ERIC Educational Resources Information Center
Broski, David; And Others
1977-01-01
Examines the concept of competency-based education, describes an experience-based model for its development, and discusses some empirically derived rules-of-thumb for its application in allied health. (HD)
Systematic approach to developing empirical interatomic potentials for III-N semiconductors
NASA Astrophysics Data System (ADS)
Ito, Tomonori; Akiyama, Toru; Nakamura, Kohji
2016-05-01
A systematic approach to the derivation of empirical interatomic potentials is developed for III-N semiconductors with the aid of ab initio calculations. The parameter values of the empirical potential, which is based on a bond-order potential, are determined by reproducing the cohesive energy differences among 3-fold coordinated hexagonal, 4-fold coordinated zinc blende, wurtzite, and 6-fold coordinated rocksalt structures in BN, AlN, GaN, and InN. The bond order p is successfully introduced as a function of the coordination number Z in the form p = a exp(-bZ^n) if Z ≤ 4 and p = (4/Z)^α if Z ≥ 4 in the empirical interatomic potential. Moreover, the energy difference between wurtzite and zinc blende structures can be successfully evaluated by considering interactions beyond the second-nearest neighbors as a function of ionicity. This approach is feasible for developing empirical interatomic potentials applicable to systems consisting of poorly coordinated atoms at surfaces and interfaces, including nanostructures.
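The piecewise bond-order form quoted above can be written directly as a small function. The parameter values in the usage below are arbitrary illustrations (chosen only so the two branches meet at Z = 4), not the fitted values for any III-N compound.

```python
import numpy as np

def bond_order(Z, a, b, n, alpha):
    """Bond order p as a function of coordination number Z.

    Implements the piecewise form from the abstract:
        p = a * exp(-b * Z**n)   for Z <= 4
        p = (4 / Z)**alpha       for Z >= 4
    Continuity at Z = 4 requires a * exp(-b * 4**n) == 1.
    """
    Z = np.asarray(Z, dtype=float)
    return np.where(Z <= 4.0, a * np.exp(-b * Z**n), (4.0 / Z) ** alpha)
```

With a chosen to satisfy the continuity condition, both branches give p = 1 at the 4-fold coordinated (zinc blende/wurtzite) geometry, and p decays for over-coordinated environments such as rocksalt.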
Potential relative increment (PRI): a new method to empirically derive optimal tree diameter growth
Don C Bragg
2001-01-01
Potential relative increment (PRI) is a new method to derive optimal diameter growth equations using inventory information from a large public database. Optimal growth equations for 24 species were developed using plot and tree records from several states (Michigan, Minnesota, and Wisconsin) of the North Central US. Most species were represented by thousands of...
Predictive and mechanistic multivariate linear regression models for reaction development
Santiago, Celine B.; Guo, Jing-Yao
2018-01-01
Multivariate Linear Regression (MLR) models utilizing computationally-derived and empirically-derived physical organic molecular descriptors are described in this review. Several reports demonstrating the effectiveness of this methodological approach towards reaction optimization and mechanistic interrogation are discussed. A detailed protocol to access quantitative and predictive MLR models is provided as a guide for model development and parameter analysis. PMID:29719711
An Empirically-Derived Index of High School Academic Rigor. ACT Working Paper 2017-5
ERIC Educational Resources Information Center
Allen, Jeff; Ndum, Edwin; Mattern, Krista
2017-01-01
We derived an index of high school academic rigor by optimizing the prediction of first-year college GPA based on high school courses taken, grades, and indicators of advanced coursework. Using a large data set (n~108,000) and nominal parameterization of high school course outcomes, the high school academic rigor (HSAR) index capitalizes on…
A Review of Multivariate Distributions for Count Data Derived from the Poisson Distribution.
Inouye, David; Yang, Eunho; Allen, Genevera; Ravikumar, Pradeep
2017-01-01
The Poisson distribution has been widely studied and used for modeling univariate count-valued data. Multivariate generalizations of the Poisson distribution that permit dependencies, however, have been far less popular. Yet, real-world high-dimensional count-valued data found in word counts, genomics, and crime statistics, for example, exhibit rich dependencies, and motivate the need for multivariate distributions that can appropriately model this data. We review multivariate distributions derived from the univariate Poisson, categorizing these models into three main classes: 1) where the marginal distributions are Poisson, 2) where the joint distribution is a mixture of independent multivariate Poisson distributions, and 3) where the node-conditional distributions are derived from the Poisson. We discuss the development of multiple instances of these classes and compare the models in terms of interpretability and theory. Then, we empirically compare multiple models from each class on three real-world datasets that have varying data characteristics from different domains, namely traffic accident data, biological next generation sequencing data, and text data. These empirical experiments develop intuition about the comparative advantages and disadvantages of each class of multivariate distribution that was derived from the Poisson. Finally, we suggest new research directions as explored in the subsequent discussion section.
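As a minimal illustration of the second class described above (joint distributions that are mixtures of independent multivariate Poissons), the following sketch draws correlated count vectors by mixing over a shared latent component; the rates, scales, and weights are invented for demonstration.

```python
import numpy as np

def sample_poisson_mixture(n, base_rates, scales, weights, seed=None):
    """Draw n count vectors from a mixture of independent Poissons.

    Conditional on the latent mixture component k, the coordinates are
    independent Poisson with rates scales[k] * base_rates; marginally,
    the shared component induces positive dependence across coordinates.
    """
    rng = np.random.default_rng(seed)
    comps = rng.choice(len(weights), size=n, p=weights)    # latent component
    lam = np.outer(np.asarray(scales)[comps], base_rates)  # (n, d) rates
    return rng.poisson(lam)
```

Dependence appears even though each component is a product of independent Poissons, which is what keeps this class of models tractable.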
Mapping watershed integrity for the conterminous United States.
Thornbrugh, Darren J; Leibowitz, Scott G; Hill, Ryan A; Weber, Marc H; Johnson, Zachary C; Olsen, Anthony R; Flotemersch, Joseph E; Stoddard, John L; Peck, David V
2018-02-01
Watershed integrity is the capacity of a watershed to support and maintain the full range of ecological processes and functions essential to sustainability. Using information from EPA's StreamCat dataset, we calculated and mapped an Index of Watershed Integrity (IWI) for 2.6 million watersheds in the conterminous US with first-order approximations of relationships between stressors and six watershed functions: hydrologic regulation, regulation of water chemistry, sediment regulation, hydrologic connectivity, temperature regulation, and habitat provision. Results show high integrity in the western US, intermediate integrity in the southern and eastern US, and the lowest integrity in the temperate plains and lower Mississippi Valley. Correlation between the six functional components was high (r = 0.85-0.98). A related Index of Catchment Integrity (ICI) was developed using local drainages of individual stream segments (i.e., excluding upstream information). We evaluated the ability of the IWI and ICI to predict six continuous site-level indicators with regression analyses (three biological indicators and principal components derived from water quality, habitat, and combined water quality and habitat variables), using data from EPA's National Rivers and Streams Assessment. Relationships were highly significant, but the IWI only accounted for 1-12% of the variation in the four biological and habitat variables. The IWI accounted for over 25% of the variation in the water quality and combined principal components nationally, and 32-39% in the Northern and Southern Appalachians. We also used multinomial logistic regression to compare the IWI with the categorical forms of the three biological indicators. Results were consistent: we found positive associations but modest results. We compared how the IWI and ICI predicted the water quality PC relative to agricultural and urban land use.
The IWI or ICI are the best predictors of the water quality PC for the CONUS and six of the nine ecoregions, but they only perform marginally better than agriculture in most instances. However, results suggest that agriculture would not be appropriate in all parts of the country, and the index is meant to be responsive to all stressors. The IWI in its present form (available through the StreamCat website; https://www.epa.gov/national-aquatic-resource-surveys/streamcat) could be useful for management efforts at multiple scales, especially when combined with information on site condition. The IWI could be improved by incorporating empirical or literature-derived relationships between functional components and stressors. However, limitations concerning the absence of data for certain stressors should be considered.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Langenbrunner, James R.; Booker, Jane M.
We examine the derivatives with respect to temperature, for various deuterium-tritium (DT) and deuterium-deuterium (D-D) fusion-reactivity formulations. Langenbrunner and Makaruk [1] had studied this as a means of understanding the time and temperature domain of reaction history measured in dynamic fusion experiments. Presently, we consider the temperature derivative dependence of fusion reactivity as a means of exercising and verifying the consistency of the various reactivity formulations.
How rational should bioethics be? The value of empirical approaches.
Alvarez, A A
2001-10-01
Rational justification of claims with empirical content calls for empirical and not only normative philosophical investigation. Empirical approaches to bioethics are epistemically valuable, i.e., such methods may be necessary in providing and verifying basic knowledge about cultural values and norms. Our assumptions in moral reasoning can be verified or corrected using these methods. Moral arguments can be initiated or adjudicated by data drawn from empirical investigation. One may argue that individualistic informed consent, for example, is not compatible with the Asian communitarian orientation. But this normative claim uses an empirical assumption that may be contrary to the fact that some Asians do value and argue for informed consent. Is it necessary and factual to neatly characterize some cultures as individualistic and some as communitarian? Empirical investigation can provide a reasonable way to inform such generalizations. In a multi-cultural context, such as in the Philippines, there is a need to investigate the nature of the local ethos before making any appeal to authenticity. Otherwise we may succumb to the same ethical imperialism we are trying hard to resist. Normative claims that involve empirical premises cannot be reasonably verified or evaluated without utilizing empirical methods along with philosophical reflection. The integration of empirical methods into the standard normative approach to moral reasoning should be reasonably guided by the epistemic demands of claims arising from cross-cultural discourse in bioethics.
Highly Integrated Quality Assurance – An Empirical Case
DOE Office of Scientific and Technical Information (OSTI.GOV)
Drake Kirkham; Amy Powell; Lucas Rich
2011-02-01
Highly Integrated Quality Assurance – An Empirical Case Drake Kirkham1, Amy Powell2, Lucas Rich3 1Quality Manager, Radioisotope Power Systems (RPS) Program, Idaho National Laboratory, P.O. Box 1625 M/S 6122, Idaho Falls, ID 83415-6122 2Quality Engineer, RPS Program, Idaho National Laboratory 3Quality Engineer, RPS Program, Idaho National Laboratory Contact: Voice: (208) 533-7550 Email: Drake.Kirkham@inl.gov Abstract. The Radioisotope Power Systems Program of the Idaho National Laboratory makes an empirical case for a highly integrated Quality Assurance function pertaining to the preparation, assembly, testing, storage and transportation of 238Pu fueled radioisotope thermoelectric generators. Case data represents multiple campaigns including the Pluto/New Horizons mission, the Mars Science Laboratory mission in progress, and other related projects. Traditional Quality Assurance models would attempt to reduce cost by minimizing the role of dedicated Quality Assurance personnel in favor of either functional tasking or peer-based implementations. Highly integrated Quality Assurance adds value by placing trained quality inspectors on the production floor side-by-side with nuclear facility operators to enhance team dynamics, reduce inspection wait time, and provide for immediate, independent feedback. Value is also added by maintaining dedicated Quality Engineers to provide for rapid identification and resolution of corrective action, enhanced and expedited supply chain interfaces, improved bonded storage capabilities, and technical resources for requirements management including data package development and Certificates of Inspection. A broad examination of cost-benefit indicates highly integrated Quality Assurance can reduce cost through the mitigation of risk and reducing administrative burden thereby allowing engineers to be engineers, nuclear operators to be nuclear operators, and the cross-functional team to operate more efficiently.
Applicability of this case extends to any high-value, long-term project where traceability and accountability are determining factors.« less
[Mobbing: a meta-analysis and integrative model of its antecedents and consequences].
Topa Cantisano, Gabriela; Depolo, Marco; Morales Domínguez, J Francisco
2007-02-01
Although mobbing has been extensively studied, empirical research has not led to firm conclusions regarding its antecedents and consequences, both at personal and organizational levels. An extensive literature search yielded 86 empirical studies with 93 samples. The matrix correlation obtained through meta-analytic techniques was used to test a structural equation model. Results supported hypotheses regarding organizational environmental factors as main predictors of mobbing.
Eichengreen, Adva; Hoofien, Dan; Bachar, Eytan
2016-02-01
The concept of the false self has been used widely in psychoanalytic theory and practice but seldom in empirical research. In this empirically based study, elevated features of false-self defense were hypothetically associated with risk factors attendant on processes of rehabilitation and integration of children with disabilities, processes that encourage adaptation of the child to the able-bodied environment. Self-report questionnaires and in-depth interviews were conducted with 88 deaf and hard-of-hearing students and a comparison group of 88 hearing counterparts. Results demonstrate that despite the important contribution of rehabilitation and integration to the well-being of these children, these efforts may put the child at risk of increased use of the false-self defense. The empirical findings suggest two general theoretical conclusions: (1) The Winnicottian concept of the environment, usually confined to the parent-child relationship, can be understood more broadly as including cultural, social, and rehabilitational variables that both influence the parent-child relationship and operate independently of it. (2) The monolithic conceptualization of the false self may be more accurately unpacked to reveal two distinct subtypes: the compliant and the split false self. © 2016 by the American Psychoanalytic Association.
Observability, Visualizability and the Question of Metaphysical Neutrality
NASA Astrophysics Data System (ADS)
Wolff, Johanna
2015-09-01
Theories in fundamental physics are unlikely to be ontologically neutral, yet they may nonetheless fail to offer decisive empirical support for or against particular metaphysical positions. I illustrate this point by close examination of a particular objection raised by Wolfgang Pauli against Hermann Weyl. The exchange reveals that both parties to the dispute appeal to broader epistemological principles to defend their preferred metaphysical starting points. I suggest that this should make us hesitant to assume that in deriving metaphysical conclusions from physical theories we place our metaphysical theories on a purely empirical foundation. The metaphysics within a particular physical theory may well be the result of a priori assumptions in the background, not particular empirical findings.
Modelling nutritional mutualisms: challenges and opportunities for data integration.
Clark, Teresa J; Friel, Colleen A; Grman, Emily; Shachar-Hill, Yair; Friesen, Maren L
2017-09-01
Nutritional mutualisms are ancient, widespread, and profoundly influential in biological communities and ecosystems. Although much is known about these interactions, comprehensive answers to fundamental questions, such as how resource availability and structured interactions influence mutualism persistence, are still lacking. Mathematical modelling of nutritional mutualisms has great potential to facilitate the search for comprehensive answers to these and other fundamental questions by connecting the physiological and genomic underpinnings of mutualisms with ecological and evolutionary processes. In particular, when integrated with empirical data, models enable understanding of underlying mechanisms and generalisation of principles beyond the particulars of a given system. Here, we demonstrate how mathematical models can be integrated with data to address questions of mutualism persistence at four biological scales: cell, individual, population, and community. We highlight select studies where data has been or could be integrated with models to either inform model structure or test model predictions. We also point out opportunities to increase model rigour through tighter integration with data, and describe areas in which data is urgently needed. We focus on plant-microbe systems, for which a wealth of empirical data is available, but the principles and approaches can be generally applied to any nutritional mutualism. © 2017 John Wiley & Sons Ltd/CNRS.
Integrative genetic risk prediction using non-parametric empirical Bayes classification.
Zhao, Sihai Dave
2017-06-01
Genetic risk prediction is an important component of individualized medicine, but prediction accuracies remain low for many complex diseases. A fundamental limitation is the sample sizes of the studies on which the prediction algorithms are trained. One way to increase the effective sample size is to integrate information from previously existing studies. However, it can be difficult to find existing data that examine the target disease of interest, especially if that disease is rare or poorly studied. Furthermore, individual-level genotype data from these auxiliary studies are typically difficult to obtain. This article proposes a new approach to integrative genetic risk prediction of complex diseases with binary phenotypes. It accommodates possible heterogeneity in the genetic etiologies of the target and auxiliary diseases using a tuning parameter-free non-parametric empirical Bayes procedure, and can be trained using only auxiliary summary statistics. Simulation studies show that the proposed method can provide superior predictive accuracy relative to non-integrative as well as integrative classifiers. The method is applied to a recent study of pediatric autoimmune diseases, where it substantially reduces prediction error for certain target/auxiliary disease combinations. The proposed method is implemented in the R package ssa. © 2016, The International Biometric Society.
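The integrative idea in the abstract above — borrowing strength from auxiliary summary statistics when target-study data are scarce — can be illustrated with a toy parametric shrinkage rule. This is a stand-in sketch, not the non-parametric empirical Bayes procedure of the `ssa` package: the function name, the simulated data, and the James-Stein-style pooling weight are all illustrative assumptions.

```python
import numpy as np

def shrink_effects(z_target, z_aux):
    """Pull noisy per-variant z-scores from a small target study toward
    auxiliary summary statistics; the pooling weight is estimated from
    how much the target-vs-auxiliary residuals exceed unit noise."""
    z_target = np.asarray(z_target, float)
    z_aux = np.asarray(z_aux, float)
    resid = z_target - z_aux
    # Residual variance above the noise floor (variance 1) is treated
    # as genuine heterogeneity between target and auxiliary effects.
    signal_var = max(float(np.mean(resid ** 2)) - 1.0, 0.0)
    w = signal_var / (signal_var + 1.0)  # weight kept on the target data
    return z_aux + w * resid

rng = np.random.default_rng(0)
true = rng.normal(0.0, 0.5, 1000)            # true per-variant effects
z_t = true + rng.normal(0.0, 1.0, 1000)      # noisy target z-scores
z_a = true + rng.normal(0.0, 0.2, 1000)      # informative auxiliary scores
shrunk = shrink_effects(z_t, z_a)
err_raw = float(np.mean((z_t - true) ** 2))
err_shrunk = float(np.mean((shrunk - true) ** 2))
```

In this synthetic setting the shrunken estimates have markedly lower mean squared error than the raw target z-scores, which is the qualitative behavior the paper reports for its integrative classifier.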
NASA Astrophysics Data System (ADS)
Michel, Clotaire; Hobiger, Manuel; Edwards, Benjamin; Poggi, Valerio; Burjanek, Jan; Cauzzi, Carlo; Kästli, Philipp; Fäh, Donat
2016-04-01
The Swiss Seismological Service operates one of the densest national seismic networks in the world, still rapidly expanding (see http://www.seismo.ethz.ch/monitor/index_EN). Since 2009, every newly instrumented site is characterized following an established procedure to derive realistic 1D VS velocity profiles. In addition, empirical Fourier spectral modeling is performed on the whole network for each recorded event with sufficient signal-to-noise ratio. Besides the source characteristics of the earthquakes, real-time statistical analyses of the residuals of the spectral modeling provide a continuously updated amplification function with respect to Swiss rock conditions at every station. Our site characterization procedure is mainly based on the analysis of surface waves from passive experiments and includes cross-checks of the derived amplification functions with those obtained through spectral modeling. The systematic use of three-component surface-wave analysis, allowing the derivation of both Rayleigh- and Love-wave dispersion curves, also contributes to the improved quality of the retrieved profiles. The results of site characterisation activities at recently installed strong-motion stations illustrate the large variety of possible effects of surface geology on ground motion in the Alpine context. Such effects range from de-amplification at hard-rock sites to amplification up to a factor of 15 in lacustrine sediments with respect to the Swiss reference rock velocity model. The derived velocity profiles are shown to reproduce observed amplification functions from empirical spectral modeling. Although many sites are found to exhibit 1D behavior, our procedure allows the detection and qualification of 2D and 3D effects. All data collected during the site characterization procedures in the last 20 years are gathered in a database, implementing a data model proposed for community use at the European scale through NERA and EPOS (www.epos-eu.org).
A web stationbook derived from it can be accessed through the interface www.stations.seismo.ethz.ch.
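The station amplification functions described above are, in essence, event-averaged residuals between observed Fourier amplitude spectra and the spectra predicted for reference rock conditions. A minimal sketch of that bookkeeping (the function name, the geometric-mean averaging, and the synthetic numbers are assumptions for illustration, not the Swiss procedure itself):

```python
import numpy as np

def site_amplification(observed, predicted):
    """Per-frequency site amplification relative to reference rock,
    computed as the geometric mean over events of the ratio of observed
    to predicted Fourier amplitude spectra.
    observed, predicted: arrays of shape (n_events, n_frequencies)."""
    log_resid = np.log(np.asarray(observed, float)) \
              - np.log(np.asarray(predicted, float))
    return np.exp(log_resid.mean(axis=0))

# Two synthetic events over three frequency bands: a site that amplifies
# every event by a factor of 4 should recover an amplification of ~4.
pred = np.ones((2, 3))
obs = 4.0 * np.ones((2, 3))
amp = site_amplification(obs, pred)
```

Averaging in log space keeps the estimate unbiased for multiplicative residuals, which is why amplification studies typically work with geometric rather than arithmetic means.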
Preventing Suicide: A Mission Too Big to Fail
2013-03-01
Excerpt from full-text search results, with footnote fragments citing: "Reframing Suicide in the Military," 8; Emile Durkheim, Suicide: A Study in Sociology (New York: The Free Press, 1951), 298; and "Positive Psychology Progress: Empirical Validation of Interventions," American Psychology 60, no. 5 (Jul-Aug 2005): 410-21. Recoverable passage: "…As a result, Durkheim studied two kinds of regulation: social integration and moral integration. Social and moral integration are the foundation on…"
NASA Astrophysics Data System (ADS)
Mullen, Katharine M.
Human-technology integration is the replacement of human parts and extension of human capabilities with engineered devices and substrates. Its result is hybrid biological-artificial systems. We discuss here four categories of products furthering human-technology integration: wearable computers, pervasive computing environments, engineered tissues and organs, and prosthetics, and introduce examples of currently realized systems in each category. We then note that realization of a completely artificial system via the path of human-technology integration presents the prospect of empirical confirmation of an aware artificially embodied system.
Types of Faculty Scholars in Community Colleges
ERIC Educational Resources Information Center
Park, Toby J.; Braxton, John M.; Lyken-Segosebe, Dawn
2015-01-01
This chapter describes three empirically derived types of faculty scholars in community colleges: Immersed Scholars, Scholars of Dissemination, and Scholars of Pedagogical Knowledge. This chapter discusses these types and offers a recommendation.
Can Integrated Skills Tasks Change Students' Learning Strategies and Materials?
ERIC Educational Resources Information Center
Wei, Wei
2017-01-01
The use of integrated skills tasks in language tests has been debated for many years and international English test developers such as Educational Testing Service (ETS) and Pearson Tests of English (PTE) already use such tests to assess English as a foreign language (EFL) learners' language proficiency. Empirical research has rarely investigated…
An Integrative Conceptual Framework for Assessing and Treating Suicidal Behavior in Adolescents.
ERIC Educational Resources Information Center
Rudd, M. David; Joiner, Thomas E., Jr.
1998-01-01
An integrative conceptual framework is provided for ongoing assessment and day-to-day treatment of suicidal adolescents. Goals are to provide a summary of therapeutic and assessment tasks consistent with existing standards of care and supported by empirical findings and to emphasize the roles, tasks, demands, and limitations of psychotherapy with…
The Common Factors Discrimination Model: An Integrated Approach to Counselor Supervision
ERIC Educational Resources Information Center
Crunk, A. Elizabeth; Barden, Sejal M.
2017-01-01
Numerous models of clinical supervision have been developed; however, there is little empirical support indicating that any one model is superior. Therefore, common factors approaches to supervision integrate essential components that are shared among counseling and supervision models. The purpose of this paper is to present an innovative model of…
ERIC Educational Resources Information Center
Fürst, Guillaume; Ghisletta, Paolo; Lubart, Todd
2016-01-01
The present work proposes an integrative model of creativity that includes personality traits and cognitive processes. This model hypothesizes that three high-order personality factors predict two main process factors, which in turn predict intensity and achievement of creative activities. The personality factors are: "Plasticity" (high…
ERIC Educational Resources Information Center
Sanetti, Lisa M. Hagermoser; Gritter, Katie L.; Dobey, Lisa M.
2011-01-01
Increased accountability in education has resulted in a focus on implementing interventions with strong empirical support. Both student outcome and treatment integrity data are needed to draw valid conclusions about intervention effectiveness. Reviews of the literature in other fields (e.g., applied behavior analysis, prevention science) suggest…
Classroom Strategies Coaching Model: Integration of Formative Assessment and Instructional Coaching
ERIC Educational Resources Information Center
Reddy, Linda A.; Dudek, Christopher M.; Lekwa, Adam
2017-01-01
This article describes the theory, key components, and empirical support for the Classroom Strategies Coaching (CSC) Model, a data-driven coaching approach that systematically integrates data from multiple observations to identify teacher practice needs and goals, design practice plans, and evaluate progress towards goals. The primary aim of the…
Towards an Integration of Research on Teaching and Learning
ERIC Educational Resources Information Center
Svensson, Lennart
2016-01-01
The aim of this article is to present arguments for an integrated empirical research on teaching and learning based on previous research and the phenomenographic research tradition. From 1970 and for some years after, the main focus in phenomenographic research was on students' approaches to and understanding of subject matter. Later, based on…
Effects of Instruction-Supported Learning with Worked Examples in Quantitative Method Training
ERIC Educational Resources Information Center
Wagner, Kai; Klein, Martin; Klopp, Eric; Puhl, Thomas; Stark, Robin
2013-01-01
An experimental field study at a German university was conducted in order to test the effectiveness of an integrated learning environment to improve the acquisition of knowledge about empirical research methods. The integrated learning environment was based on the combination of instruction-oriented and problem-oriented design principles and…
The Effects of Ability Grouping: A Meta-Analysis of Research Findings.
ERIC Educational Resources Information Center
Noland, Theresa Koontz; Taylor, Bob L.
The study reported in this paper quantitatively integrated the recent research findings on ability grouping in order to generalize about these effects on student achievement and student self-concept. Meta-analysis was used to statistically integrate the empirical data. The relationships among various experimental variables including grade level,…
Coping, Regulation, and Development during Childhood and Adolescence
ERIC Educational Resources Information Center
Compas, Bruce E.
2009-01-01
This chapter identifies four challenges to the study of the development of coping and regulation and outlines specific theoretical and empirical strategies for addressing them. The challenges are (1) to integrate work on coping and processes of emotion regulation, (2) to use the integration of research on neuro-biology and context to inform the…
Integrated Sustainability Reporting at HNE Eberswalde--A Practice Report
ERIC Educational Resources Information Center
Kräusche, Kerstin; Pilz, Stefanie
2018-01-01
Purpose: The purpose of this paper is to present the development of an integrated sustainability reporting. In this paper, success criteria are named and practical lessons from dealing with specific challenges are formulated. The focus is on the development of criteria for reporting, the involvement of university members and quality assurance.…
Conceptual Integration of Chemical Equilibrium by Prospective Physical Sciences Teachers
ERIC Educational Resources Information Center
Ganaras, Kostas; Dumon, Alain; Larcher, Claudine
2008-01-01
This article describes an empirical study concerning the mastering of the chemical equilibrium concept by prospective physical sciences teachers. The main objective was to check whether the concept of chemical equilibrium had become an integrating and unifying concept for them, that is to say an operational and functional knowledge to explain and…
ERIC Educational Resources Information Center
Dwyer, Tomás
2017-01-01
Student-faculty interactions are a component of social integration, a key concept in Tinto's theory of student persistence which has received empirical support. However, the influence of social integration for commuting students has been questioned. Furthermore, student-faculty interactions in the classroom are under-researched and arguably…
Integrating Learning Outcome Typologies for HRD: Review and Current Status
ERIC Educational Resources Information Center
Lim, Doo Hun; Yoon, Seung Won; Park, Sunyoung
2013-01-01
This study reports the result of literature review in regards to learning outcome studies and presents a framework that integrates content types with learning outcomes. Analysis of learning outcome studies between 1992 and 2006 using the ERIC database indicated that most empirical studies have assessed the learning outcome at lower levels of…
Integration of Learning: A Grounded Theory Analysis of College Students' Learning
ERIC Educational Resources Information Center
Barber, James P.
2012-01-01
This article presents a grounded theory of "integration of learning" among traditional aged college students, which is characterized by the demonstrated ability to link various skills and knowledge learned in a variety of contexts. The author analyzed 194 interviews with students at liberal arts colleges to investigate empirically the ways…
ERIC Educational Resources Information Center
Stukalina, Yulia
2016-01-01
Purpose: The purpose of this paper is to explore some issues related to enhancing the quality of educational services provided by a university in the agenda of integrating quality assurance activities and strategic management procedures. Design/methodology/approach: Employing multiple regression analysis the author has examined some factors that…
A Research Synthesis of the Evaluation Capacity Building Literature
ERIC Educational Resources Information Center
Labin, Susan N.; Duffy, Jennifer L.; Meyers, Duncan C.; Wandersman, Abraham; Lesesne, Catherine A.
2012-01-01
The continuously growing demand for program results has produced an increased need for evaluation capacity building (ECB). The "Integrative ECB Model" was developed to integrate concepts from existing ECB theory literature and to structure a synthesis of the empirical ECB literature. The study used a broad-based research synthesis method with…
ERIC Educational Resources Information Center
Strang, Kenneth David
2010-01-01
The study examined 2500 business degree students from 21 countries, enrolled at an Australian university, using a survey to assess learning style, which was integrated into a global culture taxonomy. The research hypothesis was that academic outcome could be explained through an interdisciplinary model, by integrating proven theories from…
Characterization of Nanoscale Gas Transport in Shale Formations
NASA Astrophysics Data System (ADS)
Chai, D.; Li, X.
2017-12-01
Non-Darcy flow behavior is commonly observed in the nano-sized pores of shale matrix. Most existing gas flow models characterize non-Darcy flow by empirical or semi-empirical methods without considering the real gas effect. In this paper, a novel layered model with physical meaning is proposed for both ideal and real gas transport in nanopores. It can be further coupled with hydraulic fracturing models and consequently benefit storage evaluation and production prediction for shale gas recovery. It is hypothesized that a nanotube can be divided into a central circular zone, where viscous flow dominates due to intermolecular collisions, and an outer annular zone, where Knudsen diffusion dominates because of collisions between molecules and the wall. The flux is derived by integrating the two zones across this virtual boundary. Subsequently, the model is modified by incorporating slip effect, real gas effect, porosity distribution, and tortuosity. Meanwhile, a multi-objective optimization method (MOP) is applied to assist validation of the analytical model by searching for fitting parameters, which are highly localized and contain significant uncertainties. The apparent permeability is finally derived and analyzed with respect to various impact factors. The developed nanoscale gas transport model is well validated by flux data collected from both laboratory experiments and molecular simulations over the entire spectrum of flow regimes. Total molar flux decreases by as much as 43.8% when the real gas effect is considered in the model. Such an effect is found to be more significant as pore size shrinks. Knudsen diffusion accounts for more than 60% of the total gas flux when pressure is lower than 0.2 MPa and pore size is smaller than 50 nm. Overall, the apparent permeability is found to decrease with pressure, though it rarely changes when pressure is higher than 5.0 MPa and pore size is larger than 50 nm.
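The competition between viscous flow and Knudsen diffusion that the layered model resolves can be sketched, in the ideal-gas limit and without the paper's two-zone split, slip, or real-gas corrections, as a simple additive flux model. All parameter values below are illustrative assumptions (a methane-like gas), not the paper's fitted parameters:

```python
import math

R = 8.314  # universal gas constant, J/(mol K)

def knudsen_number(p, T, d, M, mu):
    """Kn = mean free path / pore diameter (hard-sphere gas estimate)."""
    mean_free_path = (mu / p) * math.sqrt(math.pi * R * T / (2.0 * M))
    return mean_free_path / d

def molar_fluxes(p, dpdx, T, d, M, mu):
    """Viscous (Hagen-Poiseuille) and Knudsen molar fluxes, mol/(m^2 s),
    for an ideal gas in a circular nanopore under pressure gradient dpdx."""
    J_viscous = (d ** 2 / (32.0 * mu)) * (p / (R * T)) * dpdx
    D_knudsen = (d / 3.0) * math.sqrt(8.0 * R * T / (math.pi * M))
    J_knudsen = (D_knudsen / (R * T)) * dpdx
    return J_viscous, J_knudsen

# Methane-like gas (M = 16 g/mol) at 0.2 MPa in a 50 nm pore: the flow
# is transitional (Kn ~ 0.1-10) and Knudsen diffusion carries most flux,
# consistent with the >60% figure quoted in the abstract.
p, T, d, M, mu = 0.2e6, 350.0, 50e-9, 0.016, 1.2e-5
Kn = knudsen_number(p, T, d, M, mu)
J_v, J_k = molar_fluxes(p, 1.0e6, T, d, M, mu)
knudsen_fraction = J_k / (J_v + J_k)
```

Because the viscous term scales with pressure while the Knudsen term does not, raising the pressure in this sketch shifts the flux back toward the viscous channel, mirroring the pressure dependence of apparent permeability described above.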
Davison, James A
2015-01-01
Purpose: To present a cause of posterior capsule aspiration and a technique using optimized parameters to prevent it from happening when operating on soft cataracts. Patients and methods: A prospective list of posterior capsule aspiration cases was kept over 4,062 consecutive cases operated with the Alcon CENTURION machine and Balanced Tip. Video analysis of one case of posterior capsule aspiration was accomplished. A surgical technique was developed using empirically derived machine parameters and a customized setting-selection procedure step toolbar to reduce the pace of aspiration of soft nuclear quadrants in order to prevent capsule aspiration. Results: Two cases out of 3,238 experienced posterior capsule aspiration before use of the soft quadrant technique. Video analysis showed an attractive vortex effect with capsule aspiration occurring in 1/5 of a second. A soft quadrant removal setting was empirically derived which had a slower pace and seemed more controlled, with no capsule aspiration occurring in the subsequent 824 cases. The setting featured simultaneous linear control from zero to preset maximums for: aspiration flow, 20 mL/min; and vacuum, 400 mmHg, with the addition of torsional tip amplitude up to 20% after the fluidic maximums were achieved. A new setting-selection procedure step toolbar was created to increase intraoperative flexibility by providing instantaneous shifting between the soft and normal settings. Conclusion: A technique incorporating a reduced pace for soft quadrant acquisition and aspiration can be accomplished through the use of a dedicated setting of integrated machine parameters. Toolbar placement of the procedure button next to the normal setting procedure button provides the opportunity to instantaneously alternate between the two settings. Simultaneous surgeon control over vacuum, aspiration flow, and torsional tip motion may make removal of soft nuclear quadrants more efficient and safer. PMID:26355695
Esdar, Moritz; Hübner, Ursula; Liebe, Jan-David; Hüsers, Jens; Thye, Johannes
2017-01-01
Clinical information logistics is a construct that aims to describe and explain various phenomena of information provision to drive clinical processes. It can be measured by the workflow composite score, an aggregated indicator of the degree of IT support in clinical processes. This study primarily aimed to investigate the yet unknown empirical patterns constituting this construct. The second goal was to derive a data-driven weighting scheme for the constituents of the workflow composite score and to contrast this scheme with a literature based, top-down procedure. This approach should finally test the validity and robustness of the workflow composite score. Based on secondary data from 183 German hospitals, a tiered factor analytic approach (confirmatory and subsequent exploratory factor analysis) was pursued. A weighting scheme, which was based on factor loadings obtained in the analyses, was put into practice. We were able to identify five statistically significant factors of clinical information logistics that accounted for 63% of the overall variance. These factors were "flow of data and information", "mobility", "clinical decision support and patient safety", "electronic patient record" and "integration and distribution". The system of weights derived from the factor loadings resulted in values for the workflow composite score that differed only slightly from the score values that had been previously published based on a top-down approach. Our findings give insight into the internal composition of clinical information logistics both in terms of factors and weights. They also allowed us to propose a coherent model of clinical information logistics from a technical perspective that joins empirical findings with theoretical knowledge. Despite the new scheme of weights applied to the calculation of the workflow composite score, the score behaved robustly, which is yet another hint of its validity and therefore its usefulness. 
Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
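The loading-based weighting scheme described in the abstract above can be sketched in a few lines: normalize the factor loadings into weights and take the weighted sum of the indicator values. The indicator values, loadings, and scaling below are hypothetical placeholders; the actual workflow composite score aggregates many more items.

```python
import numpy as np

def composite_score(indicators, loadings):
    """Composite score in the spirit of the workflow composite score:
    each IT-support indicator (scaled to [0, 1]) is weighted by its
    factor loading, with loadings normalized to sum to 1."""
    loadings = np.asarray(loadings, float)
    weights = loadings / loadings.sum()
    return float(np.dot(np.asarray(indicators, float), weights))

# With equal loadings the weighted score reduces to the plain mean of
# the indicators, which is the behavior a top-down equal-weight scheme
# would give; unequal loadings tilt the score toward dominant factors.
score = composite_score([1.0, 0.5, 0.0, 0.5], [0.7, 0.7, 0.7, 0.7])
```

This also makes the paper's robustness finding intuitive: when empirically estimated loadings are fairly uniform, the factor-weighted score differs little from the equal-weight, literature-based score.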
Terror management theory applied clinically: implications for existential-integrative psychotherapy.
Lewis, Adam M
2014-01-01
Existential psychotherapy and Terror Management Theory (TMT) offer explanations for the potential psychological effects of death awareness, although their respective literature bases differ in clarity, research, and implications for treating psychopathology. Existential therapy is often opaque to many therapists, in part due to the lack of consensus on what constitutes its practice, limited published practical examples, and few empirical studies examining its efficacy. By contrast, TMT has an extensive empirical literature base, both within social psychology and spanning multiple disciplines, although previously unexplored within clinical and counseling psychology. This article explores the implications of a proposed TMT-integrated existential therapy (TIE), bridging the gap between disciplines in order to meet the needs of the aging population and current challenges facing existential therapists.
Refractive Index of Alkali Halides and Its Wavelength and Temperature Derivatives.
1975-05-01
[Report excerpt; table-of-contents and scattered text fragments.] The fragments reference a comparison of dispersion equations proposed for CsBr and recommended values for the refractive index and its derivatives. The recoverable text notes the discovery of empirical relationships which enable calculation of dn/dT data at 293 K for some materials on which no data are available, and states that in the present work this problem was solved by empirical discoveries by which the unknown parameters of Eq. (19) were determined.
Bounds on quantum confinement effects in metal nanoparticles
NASA Astrophysics Data System (ADS)
Blackman, G. Neal; Genov, Dentcho A.
2018-03-01
Quantum size effects on the permittivity of metal nanoparticles are investigated using the quantum box model. Explicit upper and lower bounds are derived for the permittivity and relaxation rates due to quantum confinement effects. These bounds are verified numerically, and the size dependence and frequency dependence of the empirical Drude size parameter is extracted from the model. Results suggest that the common practice of empirically modifying the dielectric function can lead to inaccurate predictions for highly uniform distributions of finite-sized particles.
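The "empirical Drude size parameter" examined above is usually introduced by letting the Drude relaxation rate grow with confinement, γ(R) = γ_bulk + A·v_F/R, inside the Drude permittivity. A sketch with rough silver-like constants (all numerical values are illustrative assumptions, not the paper's derived bounds):

```python
def drude_permittivity(omega, R, eps_inf=5.0, omega_p=9.0e15,
                       gamma_bulk=2.5e13, A=1.0, v_fermi=1.39e6):
    """Drude permittivity of a metal nanoparticle of radius R (m), with
    the common empirical size correction gamma = gamma_bulk + A*vF/R.
    omega, omega_p, gamma_bulk in rad/s; v_fermi in m/s."""
    gamma = gamma_bulk + A * v_fermi / R
    return eps_inf - omega_p ** 2 / (omega * (omega + 1j * gamma))

# Shrinking the particle raises the relaxation rate and hence the
# optical loss (imaginary part of the permittivity).
eps_small = drude_permittivity(3.0e15, R=2e-9)    # 2 nm particle
eps_large = drude_permittivity(3.0e15, R=50e-9)   # 50 nm particle
```

The quantum box model scrutinized in the paper bounds how large the size parameter A can legitimately be; this classical formula only shows where A enters the dielectric function.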
Role of local network oscillations in resting-state functional connectivity.
Cabral, Joana; Hugues, Etienne; Sporns, Olaf; Deco, Gustavo
2011-07-01
Spatio-temporally organized low-frequency fluctuations (<0.1 Hz), observed in BOLD fMRI signal during rest, suggest the existence of underlying network dynamics that emerge spontaneously from intrinsic brain processes. Furthermore, significant correlations between distinct anatomical regions-or functional connectivity (FC)-have led to the identification of several widely distributed resting-state networks (RSNs). This slow dynamics seems to be highly structured by anatomical connectivity but the mechanism behind it and its relationship with neural activity, particularly in the gamma frequency range, remains largely unknown. Indeed, direct measurements of neuronal activity have revealed similar large-scale correlations, particularly in slow power fluctuations of local field potential gamma frequency range oscillations. To address these questions, we investigated neural dynamics in a large-scale model of the human brain's neural activity. A key ingredient of the model was a structural brain network defined by empirically derived long-range brain connectivity together with the corresponding conduction delays. A neural population, assumed to spontaneously oscillate in the gamma frequency range, was placed at each network node. When these oscillatory units are integrated in the network, they behave as weakly coupled oscillators. The time-delayed interaction between nodes is described by the Kuramoto model of phase oscillators, a biologically-based model of coupled oscillatory systems. For a realistic setting of axonal conduction speed, we show that time-delayed network interaction leads to the emergence of slow neural activity fluctuations, whose patterns correlate significantly with the empirically measured FC. The best agreement of the simulated FC with the empirically measured FC is found for a set of parameters where subsets of nodes tend to synchronize although the network is not globally synchronized. 
Inside such clusters, the simulated BOLD signal between nodes is found to be correlated, instantiating the empirically observed RSNs. Between clusters, patterns of positive and negative correlations are observed, as described in experimental studies. These results are found to be robust with respect to a biologically plausible range of model parameters. In conclusion, our model suggests how resting-state neural activity can originate from the interplay between the local neural dynamics and the large-scale structure of the brain. Copyright © 2011 Elsevier Inc. All rights reserved.
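The network model described above couples a gamma-band phase oscillator at each node through the Kuramoto equations. A minimal Euler-integration sketch of the delay-free Kuramoto model is below; the paper's model additionally evaluates each neighbor's phase at a conduction-delayed time t − τ_ij, which this sketch omits, and all parameter values here are illustrative.

```python
import numpy as np

def kuramoto_step(theta, omega, K, A, dt):
    """One Euler step of dtheta_i/dt = omega_i
       + K * sum_j A_ij * sin(theta_j - theta_i)."""
    diff = theta[None, :] - theta[:, None]        # diff[i, j] = theta_j - theta_i
    coupling = (A * np.sin(diff)).sum(axis=1)
    return theta + dt * (omega + K * coupling)

rng = np.random.default_rng(1)
n = 10
A = np.ones((n, n)) - np.eye(n)                   # all-to-all toy network
omega = rng.normal(60.0 * 2 * np.pi, 1.0, n)      # ~60 Hz, gamma band
theta = rng.uniform(0.0, 2.0 * np.pi, n)
for _ in range(20000):                            # 2 s at dt = 0.1 ms
    theta = kuramoto_step(theta, omega, K=5.0, A=A, dt=1e-4)
order = abs(np.exp(1j * theta).mean())            # Kuramoto order parameter
```

With coupling this strong the toy network phase-locks (order parameter near 1); the regime the paper finds relevant for resting-state fluctuations is weaker coupling with delays, where only transient clusters of nodes synchronize.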
EGG: hatching a mock Universe from empirical prescriptions⋆
NASA Astrophysics Data System (ADS)
Schreiber, C.; Elbaz, D.; Pannella, M.; Merlin, E.; Castellano, M.; Fontana, A.; Bourne, N.; Boutsia, K.; Cullen, F.; Dunlop, J.; Ferguson, H. C.; Michałowski, M. J.; Okumura, K.; Santini, P.; Shu, X. W.; Wang, T.; White, C.
2017-06-01
This paper introduces EGG, the Empirical Galaxy Generator, a tool designed within the ASTRODEEP collaboration to generate mock galaxy catalogs for deep fields with realistic fluxes and simple morphologies. The simulation procedure is based exclusively on empirical prescriptions - rather than first principles - to provide the most accurate match with current observations at 0
Prediction of the Dynamic Yield Strength of Metals Using Two Structural-Temporal Parameters
NASA Astrophysics Data System (ADS)
Selyutina, N. S.; Petrov, Yu. V.
2018-02-01
The behavior of the yield strength of steel and a number of aluminum alloys is investigated in a wide range of strain rates, based on the incubation time criterion of yield and the empirical models of Johnson-Cook and Cowper-Symonds. In this paper, expressions for the parameters of the empirical models are derived through the characteristics of the incubation time criterion; a satisfactory agreement of these data and experimental results is obtained. The parameters of the empirical models can depend on some strain rate. The independence of the characteristics of the incubation time criterion of yield from the loading history and their connection with the structural and temporal features of the plastic deformation process give advantage of the approach based on the concept of incubation time with respect to empirical models and an effective and convenient equation for determining the yield strength in a wider range of strain rates.
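The two empirical models named above have standard strain-rate terms: Johnson-Cook multiplies the static yield strength by 1 + C·ln(rate/reference rate), and Cowper-Symonds by 1 + (rate/D)^(1/p). A short sketch using the classic mild-steel Cowper-Symonds constants (D = 40.4 s⁻¹, p = 5); the 250 MPa static strength is an illustrative value:

```python
import math

def johnson_cook(sigma0, C, rate, ref_rate=1.0):
    """Johnson-Cook strain-rate hardening term:
    sigma_y = sigma0 * (1 + C * ln(rate / ref_rate))."""
    return sigma0 * (1.0 + C * math.log(rate / ref_rate))

def cowper_symonds(sigma0, D, p, rate):
    """Cowper-Symonds strain-rate hardening term:
    sigma_y = sigma0 * (1 + (rate / D) ** (1 / p))."""
    return sigma0 * (1.0 + (rate / D) ** (1.0 / p))

# At a strain rate equal to D, the Cowper-Symonds factor is exactly 2:
# the dynamic yield strength is double the static value.
sy = cowper_symonds(250.0, 40.4, 5.0, 40.4)      # MPa
```

The paper's contribution is to express the constants of both models (C, D, p) through the characteristics of the incubation time criterion, so that they no longer need refitting for each strain-rate range.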
A five-step procedure for the clinical use of the MPD in neuropsychological assessment of children.
Wallbrown, F H; Fuller, G B
1984-01-01
Describes a five-step procedure that can be used to detect organicity on the basis of children's performance on the Minnesota Percepto Diagnostic Test (MPD). The first step consists of examining the T score for rotations to determine whether it is below the cut-off score, which has been established empirically as an indicator of organicity. The second step consists of matching the examinee's configuration of error scores, separation of circle-diamond (SpCD), distortion of circle-diamond (DCD), and distortion of dots (DD), with empirically derived tables. The third step consists of considering the T score for rotations and the error configuration jointly. The fourth step consists of using empirically established discriminant equations, and the fifth step involves using data from limits testing and other data sources. The clinical and empirical bases for the five-step procedure are also discussed.
Empirical yield tables for Wisconsin.
Jerold T. Hahn; Joan M. Stelman
1989-01-01
Describes the tables derived from the 1983 Forest Survey of Wisconsin and presents ways the tables can be used. These tables are broken down according to Wisconsin's five Forest Survey Units and 14 forest types.
NASA Technical Reports Server (NTRS)
Bergrun, Norman R
1952-01-01
An empirically derived basis for predicting the area, rate, and distribution of water-drop impingement on airfoils of arbitrary section is presented. The concepts involved represent an initial step toward the development of a calculation technique which is generally applicable to the design of thermal ice-prevention equipment for airplane wing and tail surfaces. It is shown that sufficiently accurate estimates, for the purpose of heated-wing design, can be obtained by a few numerical computations once the velocity distribution over the airfoil has been determined. The calculation technique presented is based on results of extensive water-drop trajectory computations for five airfoil cases which consisted of 15-percent-thick airfoils encompassing a moderate lift-coefficient range. The differential equations pertaining to the paths of the drops were solved by a differential analyzer.
NASA Astrophysics Data System (ADS)
Liu, X.; Wang, M.
2016-02-01
For coastal and inland waters, spatially complete and frequent satellite measurements are important for monitoring and understanding coastal biological and ecological processes and phenomena, such as diurnal variations. High-frequency images of the water diffuse attenuation coefficient at 490 nm (Kd(490)) derived from the Korean Geostationary Ocean Color Imager (GOCI) provide a unique opportunity to study diurnal variation of water turbidity in coastal regions of the Bohai Sea, Yellow Sea, and East China Sea. However, many pixels are missing from the original GOCI-derived Kd(490) images due to clouds and various other reasons. Data Interpolating Empirical Orthogonal Functions (DINEOF) is a method for reconstructing missing data in geophysical datasets based on Empirical Orthogonal Functions (EOFs). In this study, DINEOF is applied to GOCI-derived Kd(490) data in the Yangtze River mouth and Yellow River mouth regions: the reconstructed Kd(490) data are used to fill in the missing pixels, and the spatial patterns and temporal functions of the first three EOF modes are used to investigate the sub-diurnal variation due to tidal forcing. In addition, the DINEOF method is applied to the Visible Infrared Imaging Radiometer Suite (VIIRS) on board the Suomi National Polar-orbiting Partnership (SNPP) satellite to reconstruct missing pixels in the daily Kd(490) and chlorophyll-a concentration images, and application examples in the Chesapeake Bay and the Gulf of Mexico are presented.
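The core DINEOF iteration, initialize the gaps, project onto a few leading EOF modes, refill the gaps from the reconstruction, and repeat until convergence, can be sketched in a few lines. This is a toy version on a 2-D space-by-time array; the operational algorithm additionally cross-validates the number of retained modes.

```python
import numpy as np

def dineof_fill(data, n_modes=2, tol=1e-8, max_iter=500):
    """DINEOF-style gap filling: iteratively reconstruct missing
    entries (NaNs) of a 2-D space x time array from a truncated
    EOF (SVD) decomposition with n_modes leading modes."""
    mask = np.isnan(data)
    if not mask.any():
        return data.copy()
    filled = np.where(mask, np.nanmean(data), data)  # first guess
    for _ in range(max_iter):
        # truncated SVD = leading EOF modes
        u, s, vt = np.linalg.svd(filled, full_matrices=False)
        recon = (u[:, :n_modes] * s[:n_modes]) @ vt[:n_modes]
        delta = np.max(np.abs(filled[mask] - recon[mask]))
        filled[mask] = recon[mask]  # update only the missing entries
        if delta < tol:
            break
    return filled
```

On an exactly low-rank field with a few gaps, the iteration recovers the missing values almost exactly; real Kd(490) fields are only approximately low-rank, so mode selection matters.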
Are There Subtypes of Panic Disorder? An Interpersonal Perspective
Zilcha-Mano, Sigal; McCarthy, Kevin S.; Dinger, Ulrike; Chambless, Dianne L.; Milrod, Barbara L.; Kunik, Lauren; Barber, Jacques P.
2015-01-01
Objective Panic disorder (PD) is associated with significant personal, social, and economic costs. However, little is known about specific interpersonal dysfunctions that characterize the PD population. The current study systematically examined these interpersonal dysfunctions. Method The present analyses included 194 patients with PD out of a sample of 201 who were randomized to cognitive-behavioral therapy, panic-focused psychodynamic psychotherapy, or applied relaxation training. Interpersonal dysfunction was measured using the Inventory of Interpersonal Problems–Circumplex (Horowitz, Alden, Wiggins, & Pincus, 2000). Results Individuals with PD reported greater levels of interpersonal distress than that of a normative cohort (especially when PD was accompanied by agoraphobia), but lower than that of a cohort of patients with major depression. There was no single interpersonal profile that characterized PD patients. Symptom-based clusters (with versus without agoraphobia) could not be discriminated on core or central interpersonal problems. Rather, as revealed by cluster analysis based on the pathoplasticity framework, there were two empirically derived interpersonal clusters among PD patients which were not accounted for by symptom severity and were opposite in nature: domineering-intrusive and nonassertive. The empirically derived interpersonal clusters appear to be of clinical utility in predicting alliance development throughout treatment: While the domineering-intrusive cluster did not show any changes in the alliance throughout treatment, the non-assertive cluster showed a process of significant strengthening of the alliance. Conclusions Empirically derived interpersonal clusters in PD provide clinically useful and non-redundant information about individuals with PD. PMID:26030762
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ye, Sheng; Li, Hongyi; Huang, Maoyi
2014-07-21
Subsurface stormflow is an important component of the rainfall–runoff response, especially in steep terrain. Its contribution to total runoff is, however, poorly represented in the current generation of land surface models. The lack of physical basis of these common parameterizations precludes a priori estimation of the stormflow (i.e., without calibration), which is a major drawback for prediction in ungauged basins, or for use in global land surface models. This paper is aimed at deriving regionalized parameterizations of the storage–discharge relationship relating to subsurface stormflow from a top–down empirical data analysis of streamflow recession curves extracted from 50 eastern United States catchments. Detailed regression analyses were performed between parameters of the empirical storage–discharge relationships and the controlling climate, soil and topographic characteristics. The regression analyses performed on empirical recession curves at catchment scale indicated that the coefficient of the power-law form storage–discharge relationship is closely related to the catchment hydrologic characteristics, which is consistent with the hydraulic theory derived mainly at the hillslope scale. As for the exponent, besides the role of field-scale soil hydraulic properties as suggested by hydraulic theory, it is found to be more strongly affected by climate (aridity) at the catchment scale. At a fundamental level these results point to the need for more detailed exploration of the co-dependence of soil, vegetation and topography with climate.
ERIC Educational Resources Information Center
Hashemi, Nourooz; Abu, Mohd Salleh; Kashefi, Hamidreza; Mokhtar, Mahani; Rahimi, Khadijeh
2015-01-01
Derivatives and integrals are two important concepts of calculus and are prerequisite topics for most mathematics courses and for courses in other fields of study. A majority of students at the undergraduate level have to master derivatives and integrals if they want to be successful in their studies. However, students encounter…
Cloud vertical profiles derived from CALIPSO and CloudSat and a comparison with MODIS derived clouds
NASA Astrophysics Data System (ADS)
Kato, S.; Sun-Mack, S.; Miller, W. F.; Rose, F. G.; Minnis, P.; Wielicki, B. A.; Winker, D. M.; Stephens, G. L.; Charlock, T. P.; Collins, W. D.; Loeb, N. G.; Stackhouse, P. W.; Xu, K.
2008-05-01
CALIPSO and CloudSat from the A-Train provide detailed information on the vertical distribution of clouds and aerosols. The vertical distribution of cloud occurrence is derived from one month of CALIPSO and CloudSat data as part of the effort of merging CALIPSO, CloudSat, and MODIS with CERES data. This newly derived cloud profile is compared with the distribution of cloud top height derived from MODIS on Aqua using the cloud algorithms of the CERES project. The cloud base from MODIS is also estimated using an empirical formula based on cloud top height and optical thickness, which is used in CERES processing. While MODIS detects mid- and low-level clouds over the Arctic in April fairly well when they are the topmost cloud layer, it underestimates high-level clouds. In addition, because the CERES-MODIS cloud algorithm is not able to detect multi-layer clouds and the empirical formula significantly underestimates the depth of high clouds, the occurrence of mid- and low-level clouds is underestimated. This comparison does not consider sensitivity differences to thin clouds, but we will impose an optical thickness threshold on CALIPSO-derived clouds for a further comparison. The effect of such differences in the cloud profile on flux computations will also be discussed. In addition, the effect of cloud cover on the top-of-atmosphere flux over the Arctic using CERES SSF and FLASHFLUX products will be discussed.
Ten key principles for successful health systems integration.
Suter, Esther; Oelke, Nelly D; Adair, Carol E; Armitage, Gail D
2009-01-01
Integrated health systems are considered part of the solution to the challenge of sustaining Canada's healthcare system. This systematic literature review was undertaken to guide decision-makers and others to plan for and implement integrated health systems. This review identified 10 universal principles of successfully integrated healthcare systems that may be used by decision-makers to assist with integration efforts. These principles define key areas for restructuring and allow organizational flexibility and adaptation to local context. The literature does not contain a one-size-fits-all model or process for successful integration, nor is there a firm empirical foundation for specific integration strategies and processes.
Imperial Policy and the Integration of Gaul into the Roman Empire
2015-06-12
ECONOMIC BENEFITS OF EMPIRE: From as early as the second century BC, Gaul had a taste for the material outputs of the Roman economy. Wine in particular... Rome tolerated the establishment of local Gallic production. This meant the growth in Gaul's wine consumption benefited Gallic producers and not... exporter of sought-after wines. Wine production was not the only industry that benefited from the Roman conquest. Complementing Gallic viticulture
Mealier, Anne-Laure; Pointeau, Gregoire; Mirliaz, Solène; Ogawa, Kenji; Finlayson, Mark; Dominey, Peter F.
2017-01-01
It has been proposed that starting from meaning that the child derives directly from shared experience with others, adult narrative enriches this meaning and its structure, providing causal links between unseen intentional states and actions. This would require a means for representing meaning from experience—a situation model—and a mechanism that allows information to be extracted from sentences and mapped onto the situation model that has been derived from experience, thus enriching that representation. We present a hypothesis and theory concerning how the language processing infrastructure for grammatical constructions can naturally be extended to narrative constructions to provide a mechanism for using language to enrich meaning derived from physical experience. Toward this aim, the grammatical construction models are augmented with additional structures for representing relations between events across sentences. Simulation results demonstrate proof of concept for how the narrative construction model supports multiple successive levels of meaning creation which allows the system to learn about the intentionality of mental states, and argument substitution which allows extensions to metaphorical language and analogical problem solving. Cross-linguistic validity of the system is demonstrated in Japanese. The narrative construction model is then integrated into the cognitive system of a humanoid robot that provides the memory systems and world-interaction required for representing meaning in a situation model. In this context proof of concept is demonstrated for how the system enriches meaning in the situation model that has been directly derived from experience. In terms of links to empirical data, the model predicts strong usage based effects: that is, that the narrative constructions used by children will be highly correlated with those that they experience. It also relies on the notion of narrative or discourse function words. 
Both of these are validated in the experimental literature. PMID:28861011
Reacting Chemistry Based Burn Model for Explosive Hydrocodes
NASA Astrophysics Data System (ADS)
Schwaab, Matthew; Greendyke, Robert; Steward, Bryan
2017-06-01
Currently, in hydrocodes designed to simulate explosive materials undergoing shock-induced ignition, the state of the art is to use one of numerous reaction burn rate models. These burn models are designed to estimate the bulk chemical reaction rate. Unfortunately, these models are largely based on empirical data and must be recalibrated for every new material being simulated. We propose that the use of an equilibrium Arrhenius rate reacting chemistry model in place of these empirically derived burn models will improve the accuracy of these computational codes. Such models have been successfully used in codes simulating the flow physics around hypersonic vehicles. A reacting chemistry model of this form was developed for the cyclic nitramine RDX by the Naval Research Laboratory (NRL). Initial implementation of this chemistry-based burn model has been conducted on the Air Force Research Laboratory's MPEXS multi-phase continuum hydrocode. In its present form, the burn rate is based on the destruction rate of RDX from NRL's chemistry model. Early results using the chemistry-based burn model show promise in capturing deflagration-to-detonation features more accurately in continuum hydrocodes than previously achieved using empirically derived burn models.
Very empirical treatment of solvation and entropy: a force field derived from Log Po/w
NASA Astrophysics Data System (ADS)
Kellogg, Glen Eugene; Burnett, James C.; Abraham, Donald J.
2001-04-01
A non-covalent interaction force field model derived from the partition coefficient of 1-octanol/water solubility is described. This model, HINT for Hydropathic INTeractions, is shown to include, in very empirical and approximate terms, all components of biomolecular associations, including hydrogen bonding, Coulombic interactions, hydrophobic interactions, entropy and solvation/desolvation. Particular emphasis is placed on: (1) demonstrating the relationship between the total empirical HINT score and the free energy of association, ΔG_interaction; (2) showing that the HINT hydrophobic-polar interaction sub-score represents the energy cost of desolvation upon binding for interacting biomolecules; and (3) a new methodology for treating constrained water molecules as discrete independent small ligands. An example calculation is reported for dihydrofolate reductase (DHFR) bound with methotrexate (MTX). In that case the observed very tight binding, ΔG_interaction ≤ -13.6 kcal/mol, is largely due to ten hydrogen bonds between the ligand and enzyme with estimated strength ranging between -0.4 and -2.3 kcal/mol. Four water molecules bridging between DHFR and MTX contribute an additional -1.7 kcal/mol stability to the complex. The HINT estimate of the cost of desolvation is +13.9 kcal/mol.
Building a taxonomy of integrated palliative care initiatives: results from a focus group
Ewert, Benjamin; Hodiamont, Farina; van Wijngaarden, Jeroen; Payne, Sheila; Groot, Marieke; Hasselaar, Jeroen; Menten, Johann; Radbruch, Lukas
2016-01-01
Background Empirical evidence suggests that integrated palliative care (IPC) increases the quality of care for palliative patients and supports professional caregivers. Existing IPC initiatives in Europe vary in their design and are hardly comparable. InSuP-C, a European Union research project, aimed to build a taxonomy of IPC initiatives applicable across diseases, healthcare sectors and systems. Methods The taxonomy of IPC initiatives was developed in cooperation with an international and multidisciplinary focus group of 18 experts. Subsequently, a consensus meeting of 10 experts revised a preliminary taxonomy and adopted the final classification system. Results Consisting of eight categories, with two to four items each, the taxonomy covers the process and structure of IPC initiatives. If two items in at least one category apply to an initiative, a minimum level of integration is assumed to have been reached. Categories range from the type of initiative (items: pathway, model or guideline) to patients’ key contact (items: non-pc specialist, pc specialist, general practitioner). Experts recommended the inclusion of two new categories: level of care (items: primary, secondary or tertiary) indicating at which stage palliative care is integrated and primary focus of intervention describing IPC givers’ different roles (items: treating function, advising/consulting or training) in the care process. Conclusions Empirical studies are required to investigate how the taxonomy is used in practice and whether it covers the reality of patients in need of palliative care. The InSuP-C project will test this taxonomy empirically in selected initiatives using IPC. PMID:26647043
Spatial Selection and Local Adaptation Jointly Shape Life-History Evolution during Range Expansion.
Van Petegem, Katrien H P; Boeye, Jeroen; Stoks, Robby; Bonte, Dries
2016-11-01
In the context of climate change and species invasions, range shifts increasingly gain attention because the rates at which they occur in the Anthropocene induce rapid changes in biological assemblages. During range shifts, species experience multiple selection pressures. For poleward expansions in particular, it is difficult to interpret observed evolutionary dynamics because of the joint action of evolutionary processes related to spatial selection and to adaptation toward local climatic conditions. To disentangle the effects of these two processes, we integrated stochastic modeling and data from a common garden experiment, using the spider mite Tetranychus urticae as a model species. By linking the empirical data with those derived from a highly parameterized individual-based model, we infer that both spatial selection and local adaptation contributed to the observed latitudinal life-history divergence. Spatial selection best described variation in dispersal behavior, while variation in development was best explained by adaptation to the local climate. Divergence in life-history traits in species shifting poleward could consequently be jointly determined by contemporary evolutionary dynamics resulting from adaptation to the environmental gradient and from spatial selection. The integration of modeling with common garden experiments provides a powerful tool to study the contribution of these evolutionary processes to life-history evolution during range expansion.
MetaMapR: pathway independent metabolomic network analysis incorporating unknowns.
Grapov, Dmitry; Wanichthanarak, Kwanjeera; Fiehn, Oliver
2015-08-15
Metabolic network mapping is a widely used approach for integration of metabolomic experimental results with biological domain knowledge. However, current approaches can be limited by biochemical domain or pathway knowledge which results in sparse disconnected graphs for real world metabolomic experiments. MetaMapR integrates enzymatic transformations with metabolite structural similarity, mass spectral similarity and empirical associations to generate richly connected metabolic networks. This open source, web-based or desktop software, written in the R programming language, leverages KEGG and PubChem databases to derive associations between metabolites even in cases where biochemical domain or molecular annotations are unknown. Network calculation is enhanced through an interface to the Chemical Translation System, which allows metabolite identifier translation between >200 common biochemical databases. Analysis results are presented as interactive visualizations or can be exported as high-quality graphics and numerical tables which can be imported into common network analysis and visualization tools. Freely available at http://dgrapov.github.io/MetaMapR/. Requires R and a modern web browser. Installation instructions, tutorials and application examples are available at http://dgrapov.github.io/MetaMapR/. ofiehn@ucdavis.edu. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Computational mate choice: theory and empirical evidence.
Castellano, Sergio; Cadeddu, Giorgia; Cermelli, Paolo
2012-06-01
The present review is based on the thesis that mate choice results from information-processing mechanisms governed by computational rules and that, to understand how females choose their mates, we should identify which are the sources of information and how they are used to make decisions. We describe mate choice as a three-step computational process and for each step we present theories and review empirical evidence. The first step is a perceptual process. It describes the acquisition of evidence, that is, how females use multiple cues and signals to assign an attractiveness value to prospective mates (the preference function hypothesis). The second step is a decisional process. It describes the construction of the decision variable (DV), which integrates evidence (private information by direct assessment), priors (public information), and value (perceived utility) of prospective mates into a quantity that is used by a decision rule (DR) to produce a choice. We make the assumption that females are optimal Bayesian decision makers and we derive a formal model of DV that can explain the effects of preference functions, mate copying, social context, and females' state and condition on the patterns of mate choice. The third step of mating decision is a deliberative process that depends on the DRs. We identify two main categories of DRs (absolute and comparative rules), and review the normative models of mate sampling tactics associated to them. We highlight the limits of the normative approach and present a class of computational models (sequential-sampling models) that are based on the assumption that DVs accumulate noisy evidence over time until a decision threshold is reached. These models force us to rethink the dichotomy between comparative and absolute decision rules, between discrimination and recognition, and even between rational and irrational choice. 
Since they have a robust biological basis, we think they may represent a useful theoretical tool for behavioural ecologists interested in integrating proximate and ultimate causes of mate choice. Copyright © 2012 Elsevier B.V. All rights reserved.
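The sequential-sampling idea in the final step above, a decision variable accumulating noisy evidence until a threshold is reached, reduces to a few lines of simulation. This is a generic drift-diffusion sketch; the parameter names and values are illustrative, not taken from the review.

```python
import numpy as np

def sequential_sampling(drift, noise, threshold, rng=None, max_steps=10_000):
    """Accumulate evidence dv += drift + noise*N(0,1) per step until
    |dv| reaches the threshold; return (decision, decision_time).
    decision is +1 (upper bound, e.g. accept) or -1 (lower bound)."""
    if rng is None:
        rng = np.random.default_rng()
    dv, t = 0.0, 0
    while abs(dv) < threshold and t < max_steps:
        dv += drift + noise * rng.standard_normal()
        t += 1
    return (1 if dv >= threshold else -1), t
```

With the noise set to zero the model degenerates to a deterministic race, which makes its threshold logic easy to verify; with noise it produces the speed-accuracy trade-offs these models are used to study.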
An empirical method for deriving RBE values associated with electrons, photons and radionuclides.
Bellamy, M; Puskin, J; Hertel, N; Eckerman, K
2015-12-01
There is substantial evidence to justify using relative biological effectiveness (RBE) values of >1 for low-energy electrons and photons. But, in the field of radiation protection, radiation associated with low linear energy transfer has been assigned a radiation weighting factor wR of 1. This value may be suitable for radiation protection but, for risk considerations, it is important to evaluate the potential elevated biological effectiveness of radiation to improve the quality of risk estimates. RBE values between 2 and 3 for tritium are implied by several experimental measurements. Additionally, elevated RBE values have been found for other similar low-energy radiation sources. In this work, RBE values are derived for electrons based upon the fractional deposition of absorbed dose of energies less than a few kiloelectron volts. Using this empirical method, RBE values were also derived for monoenergetic photons and 1070 radionuclides from ICRP Publication 107 for which photons and electrons are the primary emissions. Published by Oxford University Press 2015. This work is written by (a) US Government employee(s) and is in the public domain in the US.
Zhang, J; Feng, J-Y; Ni, Y-L; Wen, Y-J; Niu, Y; Tamba, C L; Yue, C; Song, Q; Zhang, Y-M
2017-06-01
Multilocus genome-wide association studies (GWAS) have become the state-of-the-art procedure to identify quantitative trait nucleotides (QTNs) associated with complex traits. However, implementation of multilocus model in GWAS is still difficult. In this study, we integrated least angle regression with empirical Bayes to perform multilocus GWAS under polygenic background control. We used an algorithm of model transformation that whitened the covariance matrix of the polygenic matrix K and environmental noise. Markers on one chromosome were included simultaneously in a multilocus model and least angle regression was used to select the most potentially associated single-nucleotide polymorphisms (SNPs), whereas the markers on the other chromosomes were used to calculate kinship matrix as polygenic background control. The selected SNPs in multilocus model were further detected for their association with the trait by empirical Bayes and likelihood ratio test. We herein refer to this method as the pLARmEB (polygenic-background-control-based least angle regression plus empirical Bayes). Results from simulation studies showed that pLARmEB was more powerful in QTN detection and more accurate in QTN effect estimation, had less false positive rate and required less computing time than Bayesian hierarchical generalized linear model, efficient mixed model association (EMMA) and least angle regression plus empirical Bayes. pLARmEB, multilocus random-SNP-effect mixed linear model and fast multilocus random-SNP-effect EMMA methods had almost equal power of QTN detection in simulation experiments. However, only pLARmEB identified 48 previously reported genes for 7 flowering time-related traits in Arabidopsis thaliana.
NASA Astrophysics Data System (ADS)
Alloui, Mebarka; Belaidi, Salah; Othmani, Hasna; Jaidane, Nejm-Eddine; Hochlaf, Majdi
2018-03-01
We performed benchmark studies on the molecular geometry, electronic properties and vibrational analysis of imidazole using semi-empirical, density functional theory and post-Hartree-Fock methods. These studies validated the use of AM1 for the treatment of larger systems. We then treated the structural, physical and chemical relationships for a series of imidazole derivatives acting as angiotensin II AT1 receptor blockers using AM1. QSAR studies were performed on these imidazole derivatives using a combination of various physicochemical descriptors. A multiple linear regression procedure was used to model the relationships between the molecular descriptors and the activity of the imidazole derivatives. The results validate the derived QSAR model.
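The multiple-linear-regression step of a QSAR workflow is generic and can be sketched with an ordinary least-squares fit. The descriptor matrix below is synthetic, standing in for physicochemical descriptors like those the study combines, not the paper's actual data.

```python
import numpy as np

def fit_qsar_mlr(X, y):
    """Ordinary least-squares fit of activity ~ descriptors with an
    intercept: returns (intercept, coefficient vector)."""
    A = np.column_stack([np.ones(len(X)), X])   # prepend intercept column
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    return beta[0], beta[1:]
```

On noiseless synthetic data the fit recovers the generating coefficients exactly; on real descriptor sets one would also report R², cross-validated error, and descriptor significance.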
Philosophy and the front line of science.
Pernu, Tuomas K
2008-03-01
According to one traditional view, empirical science is necessarily preceded by philosophical analysis. Yet the relevance of philosophy is often doubted by those engaged in the empirical sciences. I argue that these doubts can be substantiated by two theoretical problems that the traditional conception of philosophy is bound to face. First, there is a strong normative etiology to philosophical problems, theories, and notions that is difficult to reconcile with descriptive empirical study. Second, conceptual analysis (a role that is typically assigned to philosophy) seems to lose its object of study if it is granted that terms do not have purely conceptual meanings detached from their actual use in the empirical sciences. These problems are particularly acute for the current naturalistic philosophy of science. I suggest a more concrete integration of philosophy and the sciences as a possible way of making the philosophy of science have more impact.
Interest Rates and Coupon Bonds in Quantum Finance
NASA Astrophysics Data System (ADS)
Baaquie, Belal E.
2009-09-01
1. Synopsis; 2. Interest rates and coupon bonds; 3. Options and option theory; 4. Interest rate and coupon bond options; 5. Quantum field theory of bond forward interest rates; 6. Libor Market Model of interest rates; 7. Empirical analysis of forward interest rates; 8. Libor Market Model of interest rate options; 9. Numeraires for bond forward interest rates; 10. Empirical analysis of interest rate caps; 11. Coupon bond European and Asian options; 12. Empirical analysis of interest rate swaptions; 13. Correlation of coupon bond options; 14. Hedging interest rate options; 15. Interest rate Hamiltonian and option theory; 16. American options for coupon bonds and interest rates; 17. Hamiltonian derivation of coupon bond options; Appendixes; Glossaries; List of symbols; Reference; Index.
Jackknife variance of the partial area under the empirical receiver operating characteristic curve.
Bandos, Andriy I; Guo, Ben; Gur, David
2017-04-01
Receiver operating characteristic analysis provides an important methodology for assessing traditional (e.g., imaging technologies and clinical practices) and new (e.g., genomic studies, biomarker development) diagnostic problems. The area under the clinically/practically relevant part of the receiver operating characteristic curve (the partial area, or partial AUC) is an important performance index summarizing diagnostic accuracy at multiple operating points (decision thresholds) that are relevant to actual clinical practice. A robust estimate of the partial AUC is provided by the area under the corresponding part of the empirical receiver operating characteristic curve. We derive a closed-form expression for the jackknife variance of the partial area under the empirical receiver operating characteristic curve. Using the derived analytical expression, we investigate the differences between the jackknife variance and a conventional variance estimator. The relative properties in finite samples are demonstrated in a simulation study. The developed formula enables an easy way to estimate the variance of the empirical partial AUC, thereby substantially reducing the computation burden, and provides important insight into the structure of the variability. We demonstrate that, when compared with the conventional approach, the jackknife variance has substantially smaller bias and leads to a more appropriate type I error rate of the Wald-type test. The use of the jackknife variance is illustrated in the analysis of a data set from a diagnostic imaging study.
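The empirical partial AUC has a convenient placement-value form: for each diseased score, the placement s_i is the fraction of non-diseased scores at or above it, and pAUC(t0) = mean_i max(0, t0 - s_i). The sketch below implements this estimator together with a brute-force two-sample leave-one-out jackknife variance; it is illustrative only (function names are ours, ties between groups are ignored), and the paper's contribution is precisely a closed-form expression that avoids this recomputation:

```python
import numpy as np

def partial_auc(neg, pos, fpr_max):
    """Empirical partial AUC over FPR in [0, fpr_max] via placement values.

    neg: scores of non-diseased cases; pos: scores of diseased cases.
    Ties between the two groups are ignored for simplicity.
    """
    neg, pos = np.asarray(neg), np.asarray(pos)
    # s[i] = FPR of the decision threshold set at the i-th diseased score
    s = np.array([(neg >= y).mean() for y in pos])
    return np.maximum(0.0, fpr_max - s).mean()

def jackknife_var(neg, pos, fpr_max):
    """Two-sample (leave-one-out) jackknife variance of the partial AUC."""
    n, m = len(neg), len(pos)
    th_n = np.array([partial_auc(np.delete(neg, j), pos, fpr_max) for j in range(n)])
    th_p = np.array([partial_auc(neg, np.delete(pos, i), fpr_max) for i in range(m)])
    return ((n - 1) / n) * np.sum((th_n - th_n.mean()) ** 2) \
         + ((m - 1) / m) * np.sum((th_p - th_p.mean()) ** 2)
```

With fpr_max = 1 this reduces to the ordinary empirical AUC, which is a quick sanity check on the placement-value form.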
Empirical approaches to metacommunities: a review and comparison with theory.
Logue, Jürg B; Mouquet, Nicolas; Peter, Hannes; Hillebrand, Helmut
2011-09-01
Metacommunity theory has advanced understanding of how spatial dynamics and local interactions shape community structure and biodiversity. Here, we review empirical approaches to metacommunities, both observational and experimental, pertaining to how well they relate to and test theoretical metacommunity paradigms and how well they capture the realities of natural ecosystems. First, we show that the species-sorting and mass-effects paradigms are the most commonly tested and supported paradigms. Second, the dynamics observed can often be ascribed to two or more of the four non-exclusive paradigms. Third, empirical approaches relate only weakly to the concise assumptions and predictions made by the paradigms. Consequently, we suggest major avenues of improvement for empirical metacommunity approaches, including the integration across theoretical approaches and the incorporation of evolutionary and meta-ecosystem dynamics. We hope for metacommunity ecology to thereby bridge existing gaps between empirical and theoretical work, thus becoming a more powerful framework to understand dynamics across ecosystems. Copyright © 2011 Elsevier Ltd. All rights reserved.
Active Learning in the Online Environment: The Integration of Student-Generated Audio Files
ERIC Educational Resources Information Center
Bolliger, Doris U.; Armier, David Des, Jr.
2013-01-01
Educators have integrated instructor-produced audio files in a variety of settings and environments for purposes such as content presentation, lecture reviews, student feedback, and so forth. Few instructors, however, require students to produce audio files and share them with peers. The purpose of this study was to obtain empirical data on…
ERIC Educational Resources Information Center
Bialka, Christa S.; Morro, Danielle; Brown, Kara; Hannah, Gregory
2017-01-01
While scholars have indicated that social involvement is crucial to students' development and success in college life and beyond, very little empirical research investigates how students with disabilities become socially integrated in college settings. In response, this qualitative study examines the social experiences of five college students…
Integrating Mobile Multimedia into Textbooks: 2D Barcodes
ERIC Educational Resources Information Center
Uluyol, Celebi; Agca, R. Kagan
2012-01-01
The major goal of this study was to empirically compare text-plus-mobile phone learning using an integrated 2D barcode tag in a printed text with three other conditions described in multimedia learning theory. The method examined in the study involved modifications of the instructional material such that: a 2D barcode was used near the text, the…
ERIC Educational Resources Information Center
Atabekova, Atabekova; Gorbatenko, Rimma; Belousov, Aleksandr; Grebnev, Ruslan; Sheremetieva, Olga
2016-01-01
The paper explores the ways in which non-formal content and language integrated learning within university studies can affect students' academic progress. The research has included theoretical and empirical studies. The article focuses on the observation of students' learning process, draws attention to challenges and benefits students experienced…
ERIC Educational Resources Information Center
Doerann-George, Judith
The Integrated Moving Average (IMA) model of time series, and the analysis of intervention effects based on it, assume random shocks which are normally distributed. To determine the robustness of the analysis to violations of this assumption, empirical sampling methods were employed. Samples were generated from three populations: normal,…
ERIC Educational Resources Information Center
Kern, Margaret L.; Hampson, Sarah E.; Goldberg, Lewis R.; Friedman, Howard S.
2014-01-01
The present study used a collaborative framework to integrate 2 long-term prospective studies: the Terman Life Cycle Study and the Hawaii Personality and Health Longitudinal Study. Within a 5-factor personality-trait framework, teacher assessments of child personality were rationally and empirically aligned to establish similar factor structures…
Learner Behaviour in a MOOC Practice-Oriented Course: An Empirical Study Integrating TAM and TPB
ERIC Educational Resources Information Center
Yang, Hsi-Hsun; Su, Chung-Ho
2017-01-01
Few practice-oriented courses are currently integrated into online learning platforms, such as OpenCourseWare, Khan Academy, and Massive Open Online Courses (MOOCs). It is worthwhile to explore how learners respond to information technology and new teaching methods when practice-oriented courses are placed online. Therefore, this study probes…
Use, Updating and Integration of ICT in Higher Education: Linking Purpose, People and Pedagogy
ERIC Educational Resources Information Center
Stensaker, Bjorn; Maassen, Peter; Borgan, Monika; Oftebro, Mette; Karseth, Berit
2007-01-01
In this article the use, updating, and integration of Information and Communication Technology (ICT) for teaching and learning purposes is discussed. Based on an empirical study using interviews and document analysis of the implementation of ICT in five Norwegian universities and colleges, the article analyses a number of factors that are of…
ERIC Educational Resources Information Center
Everhart, Nancy; Mardis, Marcia A.; Johnston, Melissa
2011-01-01
In an effort to address the lack of empirical knowledge about the school librarians' role in technology, the Institute for Museum and Library Services funded Project Leadership-in-Action (LIA) to study leadership practices of school librarians. This current grant project includes a survey of the technology integration practices of school…
ERIC Educational Resources Information Center
Flanagan, Rosemary; Esquivel, Giselle B.
2006-01-01
School psychologists have a critical role in identifying social-emotional problems and psychopathology in youth based on a set of personality-assessment competencies. The development of competencies in assessing personality and psychopathology is complex, requiring a variety of integrated methods and approaches. Given the limited extent and scope…
A non-conventional procedure for the 3D modeling of WWI forts
NASA Astrophysics Data System (ADS)
Nocerino, E.; Fiorillo, F.; Minto, S.; Menna, F.; Remondino, F.
2014-06-01
2014 is the hundredth anniversary of the outbreak of the First World War (WWI) - or Great War - in Europe, and a number of initiatives have been planned to commemorate the tragic event. Until 1918, the Italian Trentino - Alto Adige region was under the Austro-Hungarian Empire and represented one of the most crucial and bloody war fronts between the Austrian and Italian territories. The region's borders were dotted with military fortresses, theatres of battles between the two opposing armies. Unfortunately, most of these military buildings are now ruined and their architecture can hardly be appreciated. The paper presents the initial results of the VAST project (VAlorizzazione Storia e Territorio - Valorization of History and Landscape), which aims to digitally reconstruct the forts located on the plateaus of Luserna, Lavarone and Folgaria. An integrated methodology has been adopted to collect and employ all possible sources of information in order to derive precise and photo-realistic 3D digital representations of the WWI forts.
Inspection of the Math Model Tools for On-Orbit Assessment of Impact Damage Report
NASA Technical Reports Server (NTRS)
Harris, Charles E.; Raju, Ivatury S.; Piascik, Robert S.
2007-01-01
In Spring 2005, the NASA Engineering and Safety Center (NESC) was engaged by the Space Shuttle Program (SSP) to peer review the suite of analytical tools being developed to support the determination of impact and damage tolerance of the Orbiter Thermal Protection Systems (TPS). The NESC formed an independent review team with the core disciplines of materials, flight sciences, structures, mechanical analysis, and thermal analysis. The Math Model Tools reviewed included damage prediction and stress analysis, aeroheating analysis, and thermal analysis tools. Some tools are physics-based and others are empirically derived. Each tool was created for a specific use and timeframe, including certification and real-time pre-launch assessments. In addition, the tools are used together in an integrated strategy for assessing the ramifications of impact damage to tile and RCC. The NESC team conducted a peer review of the engineering data package for each Math Model Tool. This report contains the summary of the team's observations and recommendations from these reviews.
NASA Technical Reports Server (NTRS)
Daiker, Ron; Schnell, Thomas
2010-01-01
A human motor model was developed on the basis of performance data that was collected in a flight simulator. The motor model is under consideration as one component of a virtual pilot model for the evaluation of NextGen crew alerting and notification systems in flight decks. This model may be used in a digital Monte Carlo simulation to compare flight deck layout design alternatives. The virtual pilot model is being developed as part of a NASA project to evaluate multiple crews alerting and notification flight deck configurations. Model parameters were derived from empirical distributions of pilot data collected in a flight simulator experiment. The goal of this model is to simulate pilot motor performance in the approach-to-landing task. The unique challenges associated with modeling the complex dynamics of humans interacting with the cockpit environment are discussed, along with the current state and future direction of the model.
Biome-specific scaling of ocean productivity, temperature, and carbon export efficiency
NASA Astrophysics Data System (ADS)
Britten, Gregory L.; Primeau, François W.
2016-05-01
Mass conservation and metabolic theory place constraints on how marine export production (EP) scales with net primary productivity (NPP) and sea surface temperature (SST); however, little is empirically known about how these relationships vary across ecologically distinct ocean biomes. Here we compiled in situ observations of EP, NPP, and SST and used statistical model selection theory to demonstrate significant biome-specific scaling relationships among these variables. Multiple statistically similar models yield a threefold variation in the globally integrated carbon flux (~4-12 Pg C yr-1) when applied to climatological satellite-derived NPP and SST. Simulated NPP and SST input variables from a 4×CO2 climate model experiment further show that biome-specific scaling alters the predicted response of EP to simulated increases of atmospheric CO2. These results highlight the need to better understand distinct pathways of carbon export across unique ecological biomes and may help guide proposed efforts for in situ observations of the ocean carbon cycle.
2012-01-01
PI3K, AKT, and mTOR are key kinases from PI3K signaling pathway being extensively pursued to treat a variety of cancers in oncology. To search for a structurally differentiated back-up candidate to PF-04691502, which is currently in phase I/II clinical trials for treating solid tumors, a lead optimization effort was carried out with a tricyclic imidazo[1,5]naphthyridine series. Integration of structure-based drug design and physical properties-based optimization yielded a potent and selective PI3K/mTOR dual kinase inhibitor PF-04979064. This manuscript discusses the lead optimization for the tricyclic series, which both improved the in vitro potency and addressed a number of ADMET issues including high metabolic clearance mediated by both P450 and aldehyde oxidase (AO), poor permeability, and poor solubility. An empirical scaling tool was developed to predict human clearance from in vitro human liver S9 assay data for tricyclic derivatives that were AO substrates. PMID:24900568
Weber, Juliane; Zachow, Christopher; Witthaut, Dirk
2018-03-01
Wind power generation exhibits a strong temporal variability, which is crucial for system integration in highly renewable power systems. Different methods exist to simulate wind power generation but they often cannot represent the crucial temporal fluctuations properly. We apply the concept of additive binary Markov chains to model a wind generation time series consisting of two states: periods of high and low wind generation. The only input parameter for this model is the empirical autocorrelation function. The two-state model is readily extended to stochastically reproduce the actual generation per period. To evaluate the additive binary Markov chain method, we introduce a coarse model of the electric power system to derive backup and storage needs. We find that the temporal correlations of wind power generation, the backup need as a function of the storage capacity, and the resting time distribution of high and low wind events for different shares of wind generation can be reconstructed.
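As a hedged illustration of the approach, the sketch below fits and simulates a first-order two-state chain from a binary high/low wind series. Note that this is a deliberate simplification: the paper's additive binary Markov chains are higher-order, with memory chosen to reproduce the empirical autocorrelation function, which a first-order chain cannot capture.

```python
import numpy as np

rng = np.random.default_rng(42)

def fit_first_order(states):
    """Maximum-likelihood transition matrix of a binary (0/1) series.

    A first-order simplification of the additive higher-order chains
    used in the paper; rows index the current state, columns the next.
    """
    p = np.zeros((2, 2))
    for a, b in zip(states[:-1], states[1:]):
        p[a, b] += 1
    return p / p.sum(axis=1, keepdims=True)

def simulate(p, n, start=0):
    """Generate n steps of the two-state chain from transition matrix p."""
    s = np.empty(n, dtype=int)
    s[0] = start
    for t in range(1, n):
        s[t] = rng.random() < p[s[t - 1], 1]  # 1 = high-generation period
    return s
```

The two-state output can then be dressed with an actual generation level per period, as the abstract describes, before feeding it into a backup/storage model.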
Harman, Christopher; Thomas, Kevin V; Tollefsen, Knut Erik; Meier, Sonnich; Bøyum, Olav; Grung, Merete
2009-11-01
In order to assess the environmental impact of aquatic discharges from the offshore oil industry, polar organic chemical integrative samplers (POCIS) and semipermeable membrane devices (SPMDs) were deployed around an oil platform and at reference locations in the North Sea. Exposure to polycyclic aromatic hydrocarbons (PAH) and alkylated phenols (AP) was determined from passive sampler accumulations using an empirical uptake model, the dissipation of performance reference compounds, and adjusted laboratory-derived sampling rates. Exposure was relatively similar within 1-2 km of the discharge point, with levels dominated by short-chain C1-C3 AP isomers (19-51 ng L(-1)) and alkylated naphthalenes, phenanthrenes and dibenzothiophenes (NPD, 29-45 ng L(-1)). Exposure stations showed significant differences from reference sites for NPD, but not always for the more hydrophobic PAH. These concentrations are several orders of magnitude lower than those reported to give both acute and sub-lethal effects, although their long-term consequences are unknown.
Leichsenring, Falk; Steinert, Christiane
2017-01-01
Obsessive-compulsive disorder (OCD) is a chronic disabling disorder characterized by recurrent obsessions and uncontrolled compulsions. Recent research on anxiety disorders suggests that manual-guided short-term psychodynamic therapy (STPP) may be a promising approach. Building on this, a model of STPP for OCD was developed based on Luborsky's supportive-expressive (SE) therapy. Treatment consists of 12 modules, which include the characteristic elements of SE therapy, that is, a focus on the Core Conflictual Relationship Theme (CCRT) associated with OCD symptoms and on establishing a secure alliance. Disorder-specific treatment elements were integrated, including addressing ambivalence, differentiating between thinking and acting, mitigating the superego, addressing existential issues, and, last but not least, implementing Freud's original recommendation to induce OCD patients to face the feared situation and to use the aroused experiences to work on the underlying conflict (i.e., CCRT). There are reasons to assume that the empirically derived model of STPP described here may be beneficial in OCD.
Non-Linear Steady State Vibrations of Beams Excited by Vortex Shedding
NASA Astrophysics Data System (ADS)
LEWANDOWSKI, R.
2002-05-01
In this paper the non-linear vibrations of beams excited by vortex-shedding are considered. In particular, the steady state responses of beams near the synchronization region are taken into account. The main aerodynamic properties of wind are described by using the semi-empirical model proposed by Hartlen and Currie. The finite element method and the strip method are used to formulate the equation of motion of the system treated. The harmonic balance method is adopted to derive the amplitude equations. These equations are solved with the help of the continuation method which is very convenient to perform the parametric studies of the problem and to determine the response curve in the synchronization region. Moreover, the equations of motion are also integrated using the Newmark method. The results of calculations of several example problems are also shown to confirm the efficiency and accuracy of the presented method. The results obtained by the harmonic balance method and by the Newmark methods are in good agreement with each other.
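The Newmark method used above for cross-checking the harmonic balance results can be sketched for a single-degree-of-freedom oscillator (rather than the paper's finite element beam model). This is the standard average-acceleration variant (gamma = 1/2, beta = 1/4); the function name and interface are illustrative:

```python
import numpy as np

def newmark_sdof(m, c, k, f, dt, x0=0.0, v0=0.0, beta=0.25, gamma=0.5):
    """Newmark time integration of m*x'' + c*x' + k*x = f(t).

    f is the force sampled at interval dt; returns displacement,
    velocity, and acceleration histories.
    """
    n = len(f)
    x, v, a = np.zeros(n), np.zeros(n), np.zeros(n)
    x[0], v[0] = x0, v0
    a[0] = (f[0] - c * v0 - k * x0) / m
    keff = k + gamma * c / (beta * dt) + m / (beta * dt**2)  # effective stiffness
    for i in range(n - 1):
        # effective load at step i+1 from the current state
        dp = (f[i + 1]
              + m * (x[i] / (beta * dt**2) + v[i] / (beta * dt)
                     + (1 / (2 * beta) - 1) * a[i])
              + c * (gamma * x[i] / (beta * dt) + (gamma / beta - 1) * v[i]
                     + dt * (gamma / (2 * beta) - 1) * a[i]))
        x[i + 1] = dp / keff
        v[i + 1] = (gamma / (beta * dt)) * (x[i + 1] - x[i]) \
                   + (1 - gamma / beta) * v[i] + dt * (1 - gamma / (2 * beta)) * a[i]
        a[i + 1] = (x[i + 1] - x[i]) / (beta * dt**2) - v[i] / (beta * dt) \
                   - (1 / (2 * beta) - 1) * a[i]
    return x, v, a
```

For the vortex-shedding problem the forcing term would itself depend on the wake oscillator state (Hartlen-Currie), so the right-hand side becomes state-dependent; the linear free-vibration case is still a useful check of the integrator.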
Simplified analysis about horizontal displacement of deep soil under tunnel excavation
NASA Astrophysics Data System (ADS)
Tian, Xiaoyan; Gu, Shuancheng; Huang, Rongbin
2017-11-01
Most domestic scholars have focused on the law of soil settlement caused by subway tunnel excavation, whereas studies on the law of horizontal displacement are lacking, and it is difficult to obtain horizontal displacement data at arbitrary depths in practice. At present, there are many formulas for calculating the settlement of soil layers. Compared with the integral solutions of Mindlin's classical elastic theory, stochastic medium theory, and source-sink theory, the empirical Peck formula is relatively simple and widely applicable in domestic practice. Considering the incompressibility of rock and soil mass, and based on the principle of plane strain, a formula for the horizontal displacement of the soil along the cross-section of the tunnel was derived from the Peck settlement formula. The applicability of the formula is verified by comparison with existing engineering cases, and a simple and rapid analytical method for predicting the horizontal displacement is presented.
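As an illustration of the ingredients, the sketch below combines the Peck trough S(x) = S_max exp(-x^2 / 2i^2) with the common assumption that displacement vectors point toward the tunnel axis at depth z0 (O'Reilly & New), giving a surface horizontal displacement u(x) = (x/z0) S(x). This is the textbook surface-level relation, not the paper's depth-dependent derivation; parameter names are illustrative:

```python
import numpy as np

def peck_settlement(x, s_max, i_width):
    """Peck's empirical transverse settlement trough.

    x: horizontal offset from the tunnel centerline;
    s_max: maximum settlement; i_width: trough width parameter i.
    """
    return s_max * np.exp(-x**2 / (2.0 * i_width**2))

def horizontal_displacement(x, s_max, i_width, z0):
    """Surface horizontal movement, assuming displacement vectors
    point toward the tunnel axis at depth z0."""
    return (x / z0) * peck_settlement(x, s_max, i_width)
```

A consequence of this form is that the horizontal movement peaks at x = i, the inflection point of the settlement trough, which is a quick consistency check against field data.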
Knight, Rod
2016-05-01
The field of population and public health ethics (PPHE) has yet to fully embrace the generation of evidence as an important project. This article reviews the philosophical debates related to the 'empirical turn' in clinical bioethics, and critically analyses how PPHE has and can engage with the philosophical implications of generating empirical data within the task of normative inquiry. A set of five conceptual and theoretical issues pertaining to population health that are unresolved and could potentially benefit from empirical PPHE approaches to normative inquiry are discussed. Each issue differs from traditional empirical bioethical approaches, in that they emphasize (1) concerns related to the population, (2) 'upstream' policy-relevant health interventions - within and outside of the health care system and (3) the prevention of illness and disease. Within each theoretical issue, a conceptual example from population and public health approaches to HIV prevention and health promotion is interrogated. Based on the review and critical analysis, this article concludes that empirical-normative approaches to population and public health ethics would be most usefully pursued as an iterative project (rather than as a linear project), in which the normative informs the empirical questions to be asked and new empirical evidence constantly directs conceptualizations of what constitutes morally robust public health practices. Finally, a conceptualization of an empirical population and public health ethics is advanced in order to open up new interdisciplinary 'spaces', in which empirical and normative approaches to ethical inquiry are transparently (and ethically) integrated. © The Author(s) 2015.
1986-02-01
Recoverable figure-list fragments: Ellipses Derived from Both MacAdam's Empirically Derived Color Matching Standard Deviation and Stiles' Line Element Predictions; CIELUV Color Coordinates; Derivation of CIE (L*, u*, v*) Coordinates; Three-Dimensional Representation of CIELUV Color Difference Estimates; Application of CIELUV for Estimating Color Difference on an Electronic Color Display; Color Performance Envelopes and Optimized…
Integration of culture and biology in human development.
Mistry, Jayanthi
2013-01-01
The challenge of integrating biology and culture is addressed in this chapter by emphasizing human development as involving mutually constitutive, embodied, and epigenetic processes. Heuristically rich constructs extrapolated from cultural psychology and developmental science, such as embodiment, action, and activity, are presented as promising approaches to the integration of culture and biology in human development. These theoretical notions are applied to frame the nascent field of cultural neuroscience as representing this integration of culture and biology. Current empirical research in cultural neuroscience is then synthesized to illustrate emerging trends in this body of literature that examine the integration of biology and culture.
Revisiting Organisational Learning in Integrated Care.
Nuño-Solinís, Roberto
2017-08-11
Progress in health care integration is largely linked to changes in processes and ways of doing. These changes have knowledge management and learning implications. For this reason, the use of the concept of organisational learning is explored in the field of integrated care. There are very limited contributions that have connected the fields of organisational learning and care integration in a systematic way, both at the theoretical and empirical level. For this reason, hybridization of both perspectives still provides opportunities for understanding care integration initiatives from a research perspective as well as potential applications in health care management and planning.
Spatial and temporal patterns of xylem sap pH derived from stems and twigs of Populus deltoides L.
Doug Aubrey; Justin Boyles; Laura Krysinsky; Robert Teskey
2011-01-01
Xylem sap pH (pHX) is critical in determining the quantity of inorganic carbon dissolved in xylem solution from gaseous [CO2] measurements. Studies of internal carbon transport have generally assumed that pHX derived from stems and twigs is similar and that pHX remains constant through time; however, no empirical studies have investigated these assumptions. If any of...
A Review of Multivariate Distributions for Count Data Derived from the Poisson Distribution
Inouye, David; Yang, Eunho; Allen, Genevera; Ravikumar, Pradeep
2017-01-01
The Poisson distribution has been widely studied and used for modeling univariate count-valued data. Multivariate generalizations of the Poisson distribution that permit dependencies, however, have been far less popular. Yet, real-world high-dimensional count-valued data found in word counts, genomics, and crime statistics, for example, exhibit rich dependencies, and motivate the need for multivariate distributions that can appropriately model this data. We review multivariate distributions derived from the univariate Poisson, categorizing these models into three main classes: 1) where the marginal distributions are Poisson, 2) where the joint distribution is a mixture of independent multivariate Poisson distributions, and 3) where the node-conditional distributions are derived from the Poisson. We discuss the development of multiple instances of these classes and compare the models in terms of interpretability and theory. Then, we empirically compare multiple models from each class on three real-world datasets that have varying data characteristics from different domains, namely traffic accident data, biological next generation sequencing data, and text data. These empirical experiments develop intuition about the comparative advantages and disadvantages of each class of multivariate distribution that was derived from the Poisson. Finally, we suggest new research directions as explored in the subsequent discussion section. PMID:28983398
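The second class above (mixtures of independent Poissons) can be illustrated with a minimal sketch: a shared latent scale multiplies all rates, which makes each marginal overdispersed and the counts positively dependent. The gamma choice for the mixing variable is ours for illustration, not prescribed by the review:

```python
import numpy as np

rng = np.random.default_rng(0)

def mixed_poisson_sample(n, base_rates, shape=2.0):
    """Draw n vectors from a mixture of independent Poissons.

    A gamma-distributed scale with mean 1 multiplies every rate in
    base_rates, inducing positive dependence between the coordinates.
    """
    scale = rng.gamma(shape, 1.0 / shape, size=n)  # mean-1 mixing variable
    return rng.poisson(scale[:, None] * np.asarray(base_rates)[None, :])
```

Conditionally on the latent scale the coordinates are independent Poissons, so the marginal means equal base_rates while the shared scale contributes all of the cross-covariance.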
A Moisture Function of Soil Heterotrophic Respiration Derived from Pore-scale Mechanisms
NASA Astrophysics Data System (ADS)
Yan, Z.; Todd-Brown, K. E.; Bond-Lamberty, B. P.; Bailey, V.; Liu, C.
2017-12-01
Soil heterotrophic respiration (HR) is an important process controlling carbon (C) flux, but its response to changes in soil water content (θ) is poorly understood. Earth system models (ESMs) use empirical moisture functions developed from specific sites to describe the HR-θ relationship in soils, introducing significant uncertainty. Generalized models derived from the mechanisms that control substrate availability and microbial respiration are thus urgently needed. Here we derive, present, and test a novel moisture function fp developed from pore-scale mechanisms. This fp encapsulates the primary physicochemical and biological processes controlling the HR response to moisture variation in soils. We tested fp against a wide range of published data for different soil types, and found that fp reliably predicted diverse HR-θ relationships. The mathematical relationship between the parameters in fp and macroscopic soil properties such as porosity and organic C content was also established, enabling fp to be estimated from soil properties. Compared with the empirical moisture functions used in ESMs, the derived fp could reduce uncertainty in predicting the response of soil organic C stocks to climate change. In addition, this work is one of the first studies to upscale a mechanistic soil HR model based on pore-scale processes, thus linking pore-scale mechanisms with macroscale observations.
Hellyer, Peter J; Scott, Gregory; Shanahan, Murray; Sharp, David J; Leech, Robert
2015-06-17
Current theory proposes that healthy neural dynamics operate in a metastable regime, where brain regions interact to simultaneously maximize integration and segregation. Metastability may confer important behavioral properties, such as cognitive flexibility. It is increasingly recognized that neural dynamics are constrained by the underlying structural connections between brain regions. An important challenge is, therefore, to relate structural connectivity, neural dynamics, and behavior. Traumatic brain injury (TBI) is a pre-eminent structural disconnection disorder whereby traumatic axonal injury damages large-scale connectivity, producing characteristic cognitive impairments, including slowed information processing speed and reduced cognitive flexibility, that may be a result of disrupted metastable dynamics. Therefore, TBI provides an experimental and theoretical model to examine how metastable dynamics relate to structural connectivity and cognition. Here, we use complementary empirical and computational approaches to investigate how metastability arises from the healthy structural connectome and relates to cognitive performance. We found reduced metastability in large-scale neural dynamics after TBI, measured with resting-state functional MRI. This reduction in metastability was associated with damage to the connectome, measured using diffusion MRI. Furthermore, decreased metastability was associated with reduced cognitive flexibility and information processing. A computational model, defined by empirically derived connectivity data, demonstrates how behaviorally relevant changes in neural dynamics result from structural disconnection. Our findings suggest how metastable dynamics are important for normal brain function and contingent on the structure of the human connectome. Copyright © 2015 the authors 0270-6474/15/359050-14$15.00/0.
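Metastability in this literature is commonly operationalized as the standard deviation over time of the Kuramoto order parameter computed across regional phase time series (instantaneous phases obtained, e.g., via the Hilbert transform of band-limited BOLD signals). A minimal sketch, assuming the phases have already been extracted:

```python
import numpy as np

def metastability(phases):
    """Std over time of the Kuramoto order parameter R(t) = |<exp(i*theta)>|.

    phases: array of shape (n_timepoints, n_regions) holding the
    instantaneous phase of each region at each timepoint.
    """
    r = np.abs(np.exp(1j * phases).mean(axis=1))  # synchrony at each timepoint
    return r.std()
```

Fully locked dynamics give R(t) = 1 at all times (metastability zero), as does complete, unchanging incoherence; intermittent switching between integration and segregation is what drives the index up.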
Is Directivity Still Effective in a PSHA Framework?
NASA Astrophysics Data System (ADS)
Spagnuolo, E.; Herrero, A.; Cultrera, G.
2008-12-01
Source rupture parameters, such as directivity, modulate the energy release and cause variations in the radiated signal amplitude. They therefore affect the empirical predictive equations and, as a consequence, the seismic hazard assessment. Classical probabilistic hazard evaluations, e.g. Cornell (1968), use very simple predictive equations based only on magnitude and distance, which do not account for variables describing the rupture process. Nowadays, however, a few predictive equations (e.g. Somerville 1997, Spudich and Chiou 2008) do account for rupture directivity, and a few implementations have been made in a PSHA framework (e.g. Convertito et al. 2006, Rowshandel 2006). In practice, these new empirical predictive models quantitatively incorporate rupture propagation effects through the introduction of variables such as rake, azimuth, rupture velocity, and laterality. The contribution of all these variables is summarized in corrective factors derived by measuring the differences between the real data and the predicted ones. It is therefore possible to keep the older computation, based on a simple predictive model, and to incorporate the directivity effect through the corrective factors. Each supplementary variable, however, introduces a new integral over the parametric space, and the main difficulty lies in constraining the parameter distribution functions. We present preliminary results for ad hoc distributions (Gaussian and uniform) in order to test the impact of incorporating directivity into PSHA models. We demonstrate that incorporating directivity in PSHA by means of the new predictive equations may lead to strong percentage variations in the hazard assessment.
Synoptic, Global Mhd Model For The Solar Corona
NASA Astrophysics Data System (ADS)
Cohen, Ofer; Sokolov, I. V.; Roussev, I. I.; Gombosi, T. I.
2007-05-01
The common techniques for mimicking solar corona heating and solar wind acceleration in global MHD models are as follows: 1) additional terms in the momentum and energy equations derived from the WKB approximation for Alfvén wave turbulence; 2) an empirical heat source in the energy equation; 3) a non-uniform distribution of the polytropic index, γ, used in the energy equation. In our model, we choose the latter approach. However, in order to get a more realistic distribution of γ, we use the empirical Wang-Sheeley-Arge (WSA) model to constrain the MHD solution. The WSA model provides the distribution of the asymptotic solar wind speed from the potential field approximation; therefore it also provides the distribution of the kinetic energy. Assuming that far from the Sun the total energy is dominated by the energy of the bulk motion, and assuming conservation of the Bernoulli integral, we can trace the total energy along a magnetic field line back to the solar surface. On the surface the gravity is known and the kinetic energy is negligible; therefore, we can obtain the surface distribution of γ as a function of the final speed of the wind originating from each point. By interpolating γ to a spherically uniform value at the source surface, we use this spatial distribution of γ in the energy equation to obtain a self-consistent, steady-state MHD solution for the solar corona. We present model results for different Carrington Rotations.
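The inversion of the Bernoulli integral for γ described above can be sketched numerically. The surface value of p/ρ and the resulting numbers below are hypothetical placeholders chosen for illustration, not values from the actual model:

```python
# Physical constants (SI)
G = 6.674e-11       # gravitational constant
M_sun = 1.989e30    # solar mass, kg
R_sun = 6.957e8     # solar radius, m

def surface_gamma(v_final_km_s, p_over_rho=2.5e10):
    """Invert the Bernoulli integral for the polytropic index gamma.
    Assumes kinetic energy is negligible at the surface and that far from
    the Sun the total energy is the bulk kinetic energy:
        (gamma/(gamma-1)) * p/rho - G*M/R = v_inf^2 / 2
    p_over_rho is a hypothetical surface value of p/rho in m^2/s^2
    (roughly a ~1.5 MK corona)."""
    h = 0.5 * (v_final_km_s * 1e3) ** 2 + G * M_sun / R_sun
    ratio = h / p_over_rho          # = gamma / (gamma - 1)
    return ratio / (ratio - 1.0)

# A faster asymptotic wind (coronal hole) maps to a smaller surface gamma
# (i.e., more effective heating) than a slow streamer-belt wind:
print(surface_gamma(750.0), surface_gamma(350.0))
```

This reproduces the qualitative behavior the abstract describes: the surface γ is a monotonic function of the final wind speed traced back along the field line.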
Fisher's geometrical model emerges as a property of complex integrated phenotypic networks.
Martin, Guillaume
2014-05-01
Models relating phenotype space to fitness (phenotype-fitness landscapes) have seen important developments recently. They can roughly be divided into mechanistic models (e.g., metabolic networks) and more heuristic models like Fisher's geometrical model. Each has its own drawbacks, but both yield testable predictions on how the context (genomic background or environment) affects the distribution of mutation effects on fitness and thus adaptation. Both have received some empirical validation. This article aims at bridging the gap between these approaches. A derivation of the Fisher model "from first principles" is proposed, where the basic assumptions emerge from a more general model, inspired by mechanistic networks. I start from a general phenotypic network relating unspecified phenotypic traits and fitness. A limited set of qualitative assumptions is then imposed, mostly corresponding to known features of phenotypic networks: a large set of traits is pleiotropically affected by mutations and determines a much smaller set of traits under optimizing selection. Otherwise, the model remains fairly general regarding the phenotypic processes involved or the distribution of mutation effects affecting the network. A statistical treatment and a local approximation close to a fitness optimum yield a landscape that is effectively the isotropic Fisher model or its extension with a single dominant phenotypic direction. The fit of the resulting alternative distributions is illustrated in an empirical data set. These results bear implications on the validity of Fisher's model's assumptions and on which features of mutation fitness effects may vary (or not) across genomic or environmental contexts.
Integrating mediation and moderation to advance theory development and testing.
Karazsia, Bryan T; Berlin, Kristoffer S; Armstrong, Bridget; Janicke, David M; Darling, Katherine E
2014-03-01
The concepts and associated analyses of mediation and moderation are important to the field of psychology. Although pediatric psychologists frequently incorporate mediation and moderation in their theories and empirical research, on few occasions have we integrated mediation and moderation. In this article, conceptual reasons for integrating mediation and moderation are offered. We illustrate a model that integrates mediation and moderation. In our illustration, the strength of an indirect or a mediating effect varied as a function of a moderating variable. Clinical implications of the integration of mediation and moderation are discussed, as is the potential of integrated models to advance research programs in pediatric psychology.
Statistical microeconomics and commodity prices: theory and empirical results.
Baaquie, Belal E
2016-01-13
A review is made of the statistical generalization of microeconomics by Baaquie (Baaquie 2013 Phys. A 392, 4400-4416. (doi:10.1016/j.physa.2013.05.008)), where the market price of every traded commodity, at each instant of time, is considered to be an independent random variable. The dynamics of commodity market prices is given by the unequal time correlation function and is modelled by the Feynman path integral based on an action functional. The correlation functions of the model are defined using the path integral. The existence of the action functional for commodity prices that was postulated in Baaquie (Baaquie 2013 Phys. A 392, 4400-4416. (doi:10.1016/j.physa.2013.05.008)) has been empirically ascertained in Baaquie et al. (Baaquie et al. 2015 Phys. A 428, 19-37. (doi:10.1016/j.physa.2015.02.030)). The model's action functionals for different commodities have been empirically determined and calibrated using the unequal time correlation functions of the market commodity prices via a perturbation expansion (Baaquie et al. 2015 Phys. A 428, 19-37. (doi:10.1016/j.physa.2015.02.030)). Nine commodities drawn from the energy, metal and grain sectors are empirically studied, and their auto-correlation for up to 300 days is described by the model to an accuracy of R² > 0.90, using only six parameters. © 2015 The Author(s).
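The idea of describing a 300-day market autocorrelation function with a low-parameter model and quoting an R² can be mimicked on synthetic data. The AR(1) series and the one-parameter exponential model below stand in for real price data and the path-integral model, and are purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)

def autocorr(x, max_lag):
    """Sample autocorrelation of a (demeaned) series up to max_lag."""
    x = x - x.mean()
    denom = np.dot(x, x)
    return np.array([np.dot(x[:-k], x[k:]) / denom for k in range(1, max_lag + 1)])

# Synthetic AR(1) series standing in for a daily commodity price factor
n, phi = 20_000, 0.95
x = np.empty(n)
x[0] = 0.0
for t in range(1, n):
    x[t] = phi * x[t - 1] + rng.normal()

lags = np.arange(1, 301)
acf = autocorr(x, 300)

# One-parameter model rho(k) = exp(-k / tau), fitted on the log scale
mask = acf > 0.05
tau = -1.0 / np.polyfit(lags[mask], np.log(acf[mask]), 1)[0]
fit = np.exp(-lags / tau)
r2 = 1.0 - np.sum((acf - fit) ** 2) / np.sum((acf - acf.mean()) ** 2)
print(f"tau = {tau:.1f} days, R^2 = {r2:.3f}")
```

The actual model uses six parameters and a perturbative calibration; this sketch only shows the general workflow of fitting an autocorrelation function and scoring the fit with R².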
Constitutional Reform for Conflict Management
2014-04-01
conflict. Empirical studies reveal that both accommodative and integrative constitutional design can produce political stability, if properly...reform, and the ability of various constitutional designs to promote democracy and political stability.
NASA Astrophysics Data System (ADS)
Hur, Y.-J.; McManus, I. C.
2017-07-01
This commentary considers the role of the sublime in the Vienna Integrated Model of Art Perception (VIMAP; Pelowski, Markey, Forster, Gerger, & Leder [17]), and suggests that it is not precisely conceptualised in the model. In part this reflects differing views and usages of the sublime in the literature; here it is recommended that Burke's [2] view of the sublime be used as the primary framework for empirical research on the sublime.
A Motivational Theory of Life-Span Development
Heckhausen, Jutta; Wrosch, Carsten; Schulz, Richard
2010-01-01
This article had four goals. First, the authors identified a set of general challenges and questions that a life-span theory of development should address. Second, they presented a comprehensive account of their Motivational Theory of Life-Span Development. They integrated the model of optimization in primary and secondary control and the action-phase model of developmental regulation with their original life-span theory of control to present a comprehensive theory of development. Third, they reviewed the relevant empirical literature testing key propositions of the Motivational Theory of Life-Span Development. Finally, because the conceptual reach of their theory goes far beyond the current empirical base, they pointed out areas that deserve further and more focused empirical inquiry. PMID:20063963
Theory, modeling, and integrated studies in the Arase (ERG) project
NASA Astrophysics Data System (ADS)
Seki, Kanako; Miyoshi, Yoshizumi; Ebihara, Yusuke; Katoh, Yuto; Amano, Takanobu; Saito, Shinji; Shoji, Masafumi; Nakamizo, Aoi; Keika, Kunihiro; Hori, Tomoaki; Nakano, Shin'ya; Watanabe, Shigeto; Kamiya, Kei; Takahashi, Naoko; Omura, Yoshiharu; Nose, Masahito; Fok, Mei-Ching; Tanaka, Takashi; Ieda, Akimasa; Yoshikawa, Akimasa
2018-02-01
Understanding the underlying mechanisms of drastic variations of the near-Earth space (geospace) is one of the current focuses of magnetospheric physics. The science target of the geospace research project Exploration of energization and Radiation in Geospace (ERG) is to understand geospace variations with a focus on relativistic electron acceleration and loss processes. In order to achieve this goal, the ERG project consists of three parts: the Arase (ERG) satellite, ground-based observations, and theory/modeling/integrated studies. The role of the theory/modeling/integrated studies part is to promote relevant theoretical and simulation studies as well as integrated data analysis to combine different kinds of observations and modeling. Here we provide technical reports on simulation and empirical models related to the ERG project, together with their roles in the integrated studies of dynamic geospace variations. The simulation and empirical models covered include the radial diffusion model of the radiation belt electrons, the GEMSIS-RB and RBW models, the CIMI model with the global MHD simulation REPPU, the GEMSIS-RC model, the plasmasphere thermosphere model, self-consistent wave-particle interaction simulations (electron hybrid code and ion hybrid code), the ionospheric electric potential (GEMSIS-POT) model, and SuperDARN electric field models with data assimilation. ERG (Arase) science center tools to support integrated studies with various kinds of data are also briefly introduced.
Empirical yield tables for Michigan.
Jerold T. Hahn; Joan M. Stelman
1984-01-01
Describes the tables derived from the 1980 Forest Survey of Michigan and presents ways the tables can be used. These tables are broken down according to Michigan's four Forest Survey Units, 14 forest types, and 5 site-index classes.
Leadership Development and Self-Development: An Empirical Study.
ERIC Educational Resources Information Center
McCollum, Bruce
1999-01-01
Describes a theory about consciousness and leadership practices derived from the Hindu Vedas. Shows how subjects who learned Transcendental Meditation as a self-development technique improved their leadership behaviors as measured by the Leadership Practices Inventory. (SK)
Estimation of two ordered mean residual lifetime functions.
Ebrahimi, N
1993-06-01
In many statistical studies involving failure data, biometric mortality data, and actuarial data, mean residual lifetime (MRL) function is of prime importance. In this paper we introduce the problem of nonparametric estimation of a MRL function on an interval when this function is bounded from below by another such function (known or unknown) on that interval, and derive the corresponding two functional estimators. The first is to be used when there is a known bound, and the second when the bound is another MRL function to be estimated independently. Both estimators are obtained by truncating the empirical estimator discussed by Yang (1978, Annals of Statistics 6, 112-117). In the first case, it is truncated at a known bound; in the second, at a point somewhere between the two empirical estimates. Consistency of both estimators is proved, and a pointwise large-sample distribution theory of the first estimator is derived.
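A minimal sketch of the common ingredient of both estimators, the empirical MRL, and of truncation at a known lower bound (the paper's first case) follows. The exponential data and the bound are illustrative, and the empirical estimator is the simple tail-average form rather than Yang's exact construction:

```python
import numpy as np

rng = np.random.default_rng(42)

def empirical_mrl(data, t):
    """Empirical mean residual lifetime at t: average of (X - t) over X > t."""
    tail = data[data > t]
    return tail.mean() - t if tail.size else 0.0

def bounded_mrl(data, t, lower_bound):
    """Estimator truncated at a known lower bound b(t): the empirical
    estimate is never allowed to fall below the bound."""
    return max(empirical_mrl(data, t), lower_bound)

# Exponential(mean=2) lifetimes: the true MRL is constant at 2 (memorylessness)
data = rng.exponential(scale=2.0, size=10_000)
for t in (0.0, 1.0, 3.0):
    print(t, empirical_mrl(data, t), bounded_mrl(data, t, lower_bound=1.5))
```

In the paper's second case the bound would itself be an independently estimated MRL function, with truncation at a point between the two empirical estimates.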
Statistical parameters of thermally driven turbulent anabatic flow
NASA Astrophysics Data System (ADS)
Hilel, Roni; Liberzon, Dan
2016-11-01
Field measurements of a thermally driven turbulent anabatic flow over a moderate slope are reported. A collocated hot-film/sonic anemometer (Combo) resolved the finer scales of the flow using a neural-network-based in-situ calibration technique. Eight days of continuous measurements of wind and temperature fluctuations revealed a diurnal pattern of unstable stratification that forced the development of a highly turbulent, unidirectional upslope flow. Empirical fits of important turbulence statistics were obtained from the velocity fluctuation time series, alongside fully resolved spectra of the velocity field components and characteristic length scales. TKE and TI showed a linear dependence on Re, while velocity derivative skewness and dissipation rates indicated the anisotropic nature of the flow. Empirical fits of the normalized velocity fluctuation power density spectra were derived, as the spectral shapes exhibited a high level of similarity. A bursting phenomenon was detected during 15% of the total time; its frequency of occurrence, spectral characteristics, and possible generation mechanism are discussed. BSF Grant #2014075.
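The single-point statistics named above (TKE, TI, and velocity derivative skewness) can be computed from fluctuation time series as sketched below. The synthetic Gaussian records are placeholders for the Combo-probe data, so the skewness here is near zero rather than the anisotropic value a real shear flow would show:

```python
import numpy as np

rng = np.random.default_rng(7)

def turbulence_stats(u, v, w, mean_speed, dt):
    """Basic single-point turbulence statistics from velocity fluctuations:
    turbulent kinetic energy (TKE), turbulence intensity (TI), and the
    skewness of the streamwise velocity derivative (an isotropy indicator)."""
    tke = 0.5 * (u.var() + v.var() + w.var())
    ti = np.sqrt((u.var() + v.var() + w.var()) / 3.0) / mean_speed
    du = np.diff(u) / dt                      # finite-difference time derivative
    skew = np.mean((du - du.mean()) ** 3) / du.std() ** 3
    return tke, ti, skew

# Synthetic fluctuation records (m/s) standing in for measured data
n, dt = 50_000, 1e-3
u = rng.normal(0.0, 0.8, n)
v = rng.normal(0.0, 0.6, n)
w = rng.normal(0.0, 0.4, n)
tke, ti, skew = turbulence_stats(u, v, w, mean_speed=3.0, dt=dt)
print(tke, ti, skew)
```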
Fanaian, Safa; Graas, Susan; Jiang, Yong; van der Zaag, Pieter
2015-02-01
The flow regime of rivers, being an integral part of aquatic ecosystems, provides many important services benefiting humans in catchments. Past water resource developments characterized by river embankments and dams, however, were often dominated by one (or few) economic use(s) of water. This results in a dramatically changed flow regime negatively affecting the provision of other ecosystem services sustained by the river flow. This study is intended to demonstrate the value of alternative flow regimes in a river that is highly modified by the presence of large hydropower dams and reservoirs, explicitly accounting for a broad range of flow-dependent ecosystem services. In this study, we propose a holistic approach for conducting an ecological economic assessment of a river's flow regime. This integrates recent advances in the conceptualization and classification of ecosystem services (UK NEA, 2011) with the flow regime evaluation technique developed by Korsgaard (2006). This integrated approach allows for a systematic comparison of the economic values of alternative flow regimes, including those that are considered beneficial for aquatic ecosystems. As an illustration, we applied this combined approach to the Lower Zambezi Basin, Mozambique. Empirical analysis shows that even though re-operating dams to create environmentally friendly flow regimes reduces hydropower benefits, the gains to goods derived from the aquatic ecosystem may offset the forgone hydropower benefits, thereby increasing the total economic value of river flow to society. The proposed integrated flow assessment approach can be a useful tool for welfare-improving decision-making in managing river basins. Copyright © 2014 Elsevier B.V. All rights reserved.
Alternative Derivations for the Poisson Integral Formula
ERIC Educational Resources Information Center
Chen, J. T.; Wu, C. S.
2006-01-01
Poisson integral formula is revisited. The kernel in the Poisson integral formula can be derived in a series form through the direct BEM free of the concept of image point by using the null-field integral equation in conjunction with the degenerate kernels. The degenerate kernels for the closed-form Green's function and the series form of Poisson…
Steichen, Clara; Luce, Eléanor; Maluenda, Jérôme; Tosca, Lucie; Moreno-Gimeno, Inmaculada; Desterke, Christophe; Dianat, Noushin; Goulinet-Mainot, Sylvie; Awan-Toor, Sarah; Burks, Deborah; Marie, Joëlle; Weber, Anne; Tachdjian, Gérard; Melki, Judith; Dubart-Kupperschmitt, Anne
2014-06-01
The use of synthetic messenger RNAs to generate human induced pluripotent stem cells (iPSCs) is particularly appealing for potential regenerative medicine applications, because it overcomes the common drawbacks of DNA-based or virus-based reprogramming strategies, including transgene integration in particular. We compared the genomic integrity of mRNA-derived iPSCs with that of retrovirus-derived iPSCs generated in strictly comparable conditions, by single-nucleotide polymorphism (SNP) and copy number variation (CNV) analyses. We showed that mRNA-derived iPSCs do not differ significantly from the parental fibroblasts in SNP analysis, whereas retrovirus-derived iPSCs do. We found that the number of CNVs seemed independent of the reprogramming method, instead appearing to be clone-dependent. Furthermore, differentiation studies indicated that mRNA-derived iPSCs differentiated efficiently into hepatoblasts and that these cells did not load additional CNVs during differentiation. The integration-free hepatoblasts that were generated constitute a new tool for the study of diseased hepatocytes derived from patients' iPSCs and their use in the context of stem cell-derived hepatocyte transplantation. Our findings also highlight the need to conduct careful studies on genome integrity for the selection of iPSC lines before using them for further applications. ©AlphaMed Press.
NASA Astrophysics Data System (ADS)
Shevtsova, Ekaterina
2011-10-01
For the general renormalizable N=1 supersymmetric Yang-Mills theory, regularized by higher covariant derivatives, the two-loop β-function is calculated. It is shown that all integrals needed to obtain it are integrals of total derivatives.
On the evaluation of derivatives of Gaussian integrals
NASA Technical Reports Server (NTRS)
Helgaker, Trygve; Taylor, Peter R.
1992-01-01
We show that by a suitable change of variables, the derivatives of molecular integrals over Gaussian-type functions required for analytic energy derivatives can be evaluated with significantly less computational effort than current formulations. The reduction in effort increases with the order of differentiation.
NASA Astrophysics Data System (ADS)
Ferber, Steven Dwight
2005-11-01
The Vibrational Circular Dichroism (VCD) of Nucleic Acids is a sensitive function of their conformation. DeVoe's classically derived polarizability theory allows the calculation of polymer absorption and circular dichroism spectra in any frequency range. Following the approach of Tinoco and Cech as modified by Moore and Self, calculations were done in the infrared (IR) region with theoretically derived monomer input parameters. Presented herein are calculated absorption and CD spectra for nucleic acid oligomers and polymers. These calculations improve upon earlier attempts, which utilized frequencies, intensities and normal modes from empirical analysis of the nitrogenous base of the monomers. These more complete input polarizability parameters include all contributions to specific vibrational normal modes for the entire nucleotide structure. They are derived from density functional theory (DFT) vibrational analysis on quasi-nucleotide monomers using the GAUSSIAN '98/'03 program. The normal modes are "integrated" for the first time into single virtual (DeVoe) oscillators by incorporating "fixed partial charges" in the manner of Schellman. The results include the complete set of monomer normal modes. All of these modes may be analyzed, in a manner similar to those demonstrated here (for the 1500-1800 cm-1 region). A model is utilized for the polymer/oligomer monomers which maintains the actual electrostatic charge on the adjacent protonated phosphoryl groups (hydrogen phosphate, a mono-anion). This deters the optimization from "collapsing" into a hydrogen-bonded "ball" and thereby maintains the extended (polymer-like) conformation. As well, the precise C2 "endo" conformation of the sugar ring is maintained in the DNA monomers. The analogous C3 "endo" conformation is also maintained for the RNA monomers, which are constrained by massive "anchors" at the phosphates. The complete IR absorbance spectra (0-4,000 cm-1) are calculated directly in Gaussian. 
Calculated VCD and Absorbance Spectra for the eight standard Ribonucleic and Deoxy-ribonucleic acid homo-polymers in the nitrogenous base absorbing region 1550-1750 cm-1 are presented. These spectra match measured spectra at least as well as spectra calculated from empirical parameters. These results demonstrate that the purely theoretical calculation, an example given herein, should serve to provide more transferable, universal parameters for the polarizability treatment of the optical properties of oligomers and polymers.
A root-mean-square pressure fluctuations model for internal flow applications
NASA Technical Reports Server (NTRS)
Chen, Y. S.
1985-01-01
A transport equation for the root-mean-square pressure fluctuations of turbulent flow is derived from the time-dependent momentum equation for incompressible flow. Approximate modeling of this transport equation is included to relate terms with higher order correlations to the mean quantities of turbulent flow. Three empirical constants are introduced in the model. Two of the empirical constants are estimated from homogeneous turbulence data and wall pressure fluctuations measurements. The third constant is determined by comparing the results of large eddy simulations for a plane channel flow and an annulus flow.
Provoost, Veerle
2015-03-01
This paper aims to provide a description of how authors publishing in medical ethics journals have made use of empirical research data in papers on the topic of gamete or embryo donation by means of references to studies conducted by others (secondary use). Rather than making a direct contribution to the theoretical methodological literature about the role empirical research data could play or should play in ethics studies, the focus is on the particular uses of these data and the problems that can be encountered with this use. In the selection of papers examined, apart from being used to describe the context, empirical evidence was mainly used to recount problems that needed solving. Few of the authors looked critically at the quality of the studies they quoted, and several instances were found of empirical data being used poorly or inappropriately. This study provides some initial baseline evidence that shows empirical data, in the form of references to studies, are sometimes being used in inappropriate ways. This suggests that medical ethicists should be more concerned about the quality of the empirical data selected, the appropriateness of the choice for a particular type of data (from a particular type of study) and the correct integration of this evidence in sound argumentation. Given that empirical data can be misused also when merely cited instead of reported, it may be worthwhile to explore good practice requirements for this type of use of empirical data in medical ethics.
Towards a purely data-driven view of the global carbon cycle and its spatiotemporal variability
NASA Astrophysics Data System (ADS)
Zscheischler, Jakob; Mahecha, Miguel; Reichstein, Markus; Avitabile, Valerio; Carvalhais, Nuno; Ciais, Philippe; Gans, Fabian; Gruber, Nicolas; Hartmann, Jens; Herold, Martin; Jung, Martin; Landschützer, Peter; Laruelle, Goulven; Lauerwald, Ronny; Papale, Dario; Peylin, Philippe; Regnier, Pierre; Rödenbeck, Christian; Cuesta, Rosa Maria Roman; Valentini, Ricardo
2015-04-01
Constraining carbon (C) fluxes between the Earth's surface and the atmosphere at regional scale via observations is essential for understanding the Earth's carbon budget and predicting future atmospheric C concentrations. Carbon budgets have often been derived based on merging observations, statistical models and process-based models, for example in the Global Carbon Project (GCP). However, it would be helpful to derive global C budgets and fluxes at global scale as independent as possible from model assumptions to obtain an independent reference. Long-term in-situ measurements of land and ocean C stocks and fluxes have enabled the derivation of a new generation of data driven upscaled data products. Here, we combine a wide range of in-situ derived estimates of terrestrial and aquatic C fluxes for one decade. The data were produced and/or collected during the FP7 project GEOCARBON and include surface-atmosphere C fluxes from the terrestrial biosphere, fossil fuels, fires, land use change, rivers, lakes, estuaries and open ocean. By including spatially explicit uncertainties in each dataset we are able to identify regions that are well constrained by observations and areas where more measurements are required. Although the budget cannot be closed at the global scale, we provide, for the first time, global time-varying maps of the most important C fluxes, which are all directly derived from observations. The resulting spatiotemporal patterns of C fluxes and their uncertainties inform us about the needs for intensifying global C observation activities. Likewise, we provide priors for inversion exercises or to identify regions of high (and low) uncertainty of integrated C fluxes. We discuss the reasons for regions of high observational uncertainties, and for biases in the budget. Our data synthesis might also be used as empirical reference for other local and global C budgeting exercises.
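The combination of independent flux estimates carrying explicit uncertainties, as described above, can be sketched with inverse-variance weighting. The numbers below are hypothetical, not GEOCARBON values:

```python
import numpy as np

def combine_fluxes(estimates, variances):
    """Inverse-variance weighted combination of independent estimates of the
    same regional C flux, returning the combined mean and its variance."""
    w = 1.0 / np.asarray(variances, dtype=float)
    mean = float(np.sum(w * np.asarray(estimates, dtype=float)) / np.sum(w))
    return mean, float(1.0 / np.sum(w))

# Hypothetical net flux estimates (PgC/yr) for one region: a bottom-up
# inventory and an atmospheric inversion, each with its own variance
m, v = combine_fluxes([-2.1, -1.7], [0.25, 0.49])
print(m, v)  # the combined value lies between the inputs, with smaller variance
```

The combined variance is always smaller than the smallest input variance, which is why spatially explicit uncertainties make it possible to identify which regions are well constrained by observations.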
Rigorous covariance propagation of geoid errors to geodetic MDT estimates
NASA Astrophysics Data System (ADS)
Pail, R.; Albertella, A.; Fecher, T.; Savcenko, R.
2012-04-01
The mean dynamic topography (MDT) is defined as the difference between the mean sea surface (MSS) derived from satellite altimetry, averaged over several years, and the static geoid. Assuming geostrophic conditions, the ocean surface velocities, an important component of the global ocean circulation, can be derived from the MDT. Due to the availability of GOCE gravity field models, for the very first time the MDT can now be derived solely from satellite observations (altimetry and gravity) down to spatial length scales of 100 km and even below. Global gravity field models, parameterized in terms of spherical harmonic coefficients, are complemented by the full variance-covariance matrix (VCM). Therefore, a realistic statistical error estimate is available for the geoid component, while the error description of the altimetric component is still an open issue and is, if at all, treated empirically. In this study we attempt to perform, based on the full gravity VCM, a rigorous error propagation to the derived geostrophic surface velocities, thus also considering all correlations. For the definition of the static geoid we use the third release of the time-wise GOCE model, as well as the satellite-only combination model GOCO03S. In detail, we investigate the velocity errors resulting from the geoid component as a function of harmonic degree, and the impact of using or omitting covariances on the MDT errors and their correlations. When an MDT is derived, it is spectrally filtered to a certain maximum degree, usually driven by the signal content of the geoid model, by applying isotropic or non-isotropic filters. Since this filtering also acts on the geoid component, the filter process is consistently integrated into the covariance propagation and its impact is quantified. The study is performed for MDT estimates in specific test areas of particular oceanographic interest.
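At its core, the rigorous propagation of the full variance-covariance matrix is the linear rule Cov(v) = J Cov(g) Jᵀ. A minimal sketch with hypothetical dimensions follows (real VCMs are for thousands of spherical harmonic coefficients, and the Jacobian encodes the geoid-to-velocity and filter operators):

```python
import numpy as np

rng = np.random.default_rng(3)

def propagate_covariance(jacobian, cov_geoid):
    """Rigorous linear covariance propagation: if v = J g, then
    Cov(v) = J Cov(g) J^T, keeping all correlations between coefficients."""
    return jacobian @ cov_geoid @ jacobian.T

# Hypothetical setup: 4 derived velocity values from 6 geoid coefficients
J = rng.normal(size=(4, 6))
A = rng.normal(size=(6, 6))
cov_g = A @ A.T + 1e-3 * np.eye(6)   # full VCM (symmetric positive definite)

cov_v = propagate_covariance(J, cov_g)

# Dropping the off-diagonal covariances changes the propagated errors,
# which is the "using or omitting covariances" comparison in the abstract:
cov_v_diag_only = propagate_covariance(J, np.diag(np.diag(cov_g)))
print(np.diag(cov_v), np.diag(cov_v_diag_only))
```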
Symbiotic empirical ethics: a practical methodology.
Frith, Lucy
2012-05-01
Like any discipline, bioethics is a developing field of academic inquiry; and recent trends in scholarship have been towards more engagement with empirical research. This 'empirical turn' has provoked extensive debate over how such 'descriptive' research carried out in the social sciences contributes to the distinctively normative aspect of bioethics. This paper will address this issue by developing a practical research methodology for the inclusion of data from social science studies into ethical deliberation. This methodology will be based on a naturalistic conception of ethical theory that sees practice as informing theory just as theory informs practice - the two are symbiotically related. From this engagement with practice, the ways that such theories need to be extended and developed can be determined. This is a practical methodology for integrating theory and practice that can be used in empirical studies, one that uses ethical theory both to explore the data and to draw normative conclusions. © 2010 Blackwell Publishing Ltd.
Qu, Haiyan; Shewchuk, Richard M; Alarcón, Graciela; Fraenkel, Liana; Leong, Amye; Dall'Era, Maria; Yazdany, Jinoos; Singh, Jasvinder A
2016-12-01
Numerous factors can impede or facilitate patients' medication decision-making and adherence to physicians' recommendations. Little is known about how patients and physicians jointly view issues that affect the decision-making process. Our objective was to derive an empirical framework of patient-identified facilitators to lupus medication decision-making from key stakeholders (including 15 physicians, 5 patients/patient advocates, and 8 medical professionals) using a patient-centered cognitive mapping approach. We used nominal group patient panels to identify facilitators to lupus treatment decision-making. Stakeholders independently sorted the identified facilitators (n = 98) based on their similarities and rated the importance of each facilitator in patient decision-making. Data were analyzed using multidimensional scaling and hierarchical cluster analysis. A cognitive map was derived that represents an empirical framework of facilitators for lupus treatment decisions from multiple stakeholders' perspectives. The facilitator clusters were 1) hope for a normal/healthy life, 2) understand benefits and effectiveness of taking medications, 3) desire to minimize side effects, 4) medication-related data, 5) medication effectiveness for "me," 6) family focus, 7) confidence in physician, 8) medication research, 9) reassurance about medication, and 10) medication economics. Consideration of how different stakeholders perceive the relative importance of lupus medication decision-making clusters is an important step toward improving patient-physician communication and effective shared decision-making. The empirically derived framework of medication decision-making facilitators can be used as a guide to develop a lupus decision aid that focuses on improving physician-patient communication. © 2016, American College of Rheumatology.
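The multidimensional scaling step that underlies such a cognitive map can be sketched with classical (Torgerson) MDS. The 4-item dissimilarity matrix below is a hypothetical stand-in for the stakeholders' co-sorting data, not the study's 98-facilitator matrix:

```python
import numpy as np

def classical_mds(distances, dim=2):
    """Classical (Torgerson) multidimensional scaling: embed items in
    `dim` dimensions so that pairwise distances are approximately preserved."""
    n = distances.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n          # centering matrix
    B = -0.5 * J @ (distances ** 2) @ J          # double-centered Gram matrix
    w, v = np.linalg.eigh(B)
    idx = np.argsort(w)[::-1][:dim]              # keep the largest eigenvalues
    return v[:, idx] * np.sqrt(np.maximum(w[idx], 0.0))

# Toy co-sorting dissimilarities for 4 facilitator items: items 0/1 were
# usually sorted together, as were items 2/3 (hypothetical numbers)
D = np.array([[0.0, 0.1, 0.9, 0.8],
              [0.1, 0.0, 0.8, 0.9],
              [0.9, 0.8, 0.0, 0.1],
              [0.8, 0.9, 0.1, 0.0]])
coords = classical_mds(D)
print(coords)  # items 0/1 land near each other, far from items 2/3
```

In the study, hierarchical cluster analysis is then run on such coordinates (or on the dissimilarities directly) to group the points into the ten facilitator clusters.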
Eddy, Kamryn T.; le Grange, Daniel; Crosby, Ross D.; Hoste, Renee Rienecke; Doyle, Angela Celio; Smyth, Angela; Herzog, David B.
2009-01-01
Objective The purpose of this study was to empirically derive eating disorder phenotypes in a clinical sample of children and adolescents using latent profile analysis (LPA) and compare these latent profile (LP) groups to the DSM-IV-TR eating disorder categories. Method Eating disorder symptom data collected from 401 youth (ages 7–19; mean 15.14 ± 2.35y) seeking eating disorder treatment were included in LPA; general linear models were used to compare LP groups to DSM-IV-TR eating disorder categories on pre-treatment and outcome indices. Results Three LP groups were identified: LP1 (n=144), characterized by binge eating and purging (“Binge/purge”); LP2 (n=126), characterized by excessive exercise and extreme eating disorder cognitions (“Exercise-extreme cognitions”); and LP3 (n=131), characterized by minimal eating disorder behaviors and cognitions (“Minimal behaviors/cognitions”). Identified LPs imperfectly resembled DSM-IV-TR eating disorders. LP1 resembled bulimia nervosa; LP2 and LP3 broadly resembled anorexia nervosa with a relaxed weight criterion, differentiated by excessive exercise and severity of eating disorder cognitions. LP groups were more differentiated than the DSM-IV-TR categories across pre-treatment eating disorder and general psychopathology indices, as well as weight change at follow-up. Neither LP nor DSM-IV-TR categories predicted change in binge/purge behaviors. Validation analyses suggest these empirically-derived groups improve upon the current DSM-IV-TR categories. Conclusions In children and adolescents, revisions for DSM-V should consider recognition of patients with minimal cognitive eating disorder symptoms. PMID:20410717
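Latent profile analysis is closely related to finite Gaussian mixture modeling. A minimal EM sketch on toy two-dimensional symptom scores (not the study's data, model, or software) illustrates how cases are soft-assigned to latent profiles:

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_profiles(X, k, iters=100, init=None):
    """Minimal EM for a spherical Gaussian mixture, as a stand-in for latent
    profile analysis. Real analyses use dedicated software, richer covariance
    structures, and many random starts; this is illustrative only."""
    n, d = X.shape
    idx = init if init is not None else rng.choice(n, k, replace=False)
    mu = X[idx].astype(float).copy()
    var = np.full(k, X.var())
    pi = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: responsibilities r[i, j] = P(profile j | case i)
        logp = np.stack([-0.5 * np.sum((X - mu[j]) ** 2, axis=1) / var[j]
                         - 0.5 * d * np.log(var[j]) + np.log(pi[j])
                         for j in range(k)], axis=1)
        logp -= logp.max(axis=1, keepdims=True)
        r = np.exp(logp)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: update mixing weights, profile means, and variances
        nk = r.sum(axis=0)
        pi = nk / n
        for j in range(k):
            mu[j] = (r[:, j, None] * X).sum(axis=0) / nk[j]
            var[j] = (r[:, j] * np.sum((X - mu[j]) ** 2, axis=1)).sum() / (nk[j] * d)
    return r.argmax(axis=1), mu

# Toy symptom scores (binge/purge score, exercise/cognition score) with three
# planted profiles; init picks one seed case from each planted block
X = np.vstack([rng.normal([4.0, 0.0], 0.5, (100, 2)),
               rng.normal([0.0, 4.0], 0.5, (100, 2)),
               rng.normal([0.0, 0.0], 0.5, (100, 2))])
labels, centers = fit_profiles(X, k=3, init=[0, 100, 200])
print(np.bincount(labels))  # roughly 100 cases per latent profile
```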
ERIC Educational Resources Information Center
Wink, Rudiger
2008-01-01
The article analyses the role of gatekeepers between regional and disciplinary innovation systems in stem cell research as a case of integrative technologies. Which kind of gatekeepers is needed and which function can be fulfilled, differs along the knowledge value chain. Empirical results are used to explain the rationality of stem cell policies…
ERIC Educational Resources Information Center
Hooper, Barbara R.; Greene, David; Sample, Pat L.
2014-01-01
The interconnected nature of knowledge in the health sciences is not always reflected in how curricula, courses, and learning activities are designed. Scholars have thus advocated for more explicit attention to connection-making, or integration, in teaching and learning. However, conceptual and empirical work to guide such efforts is limited. This…
ERIC Educational Resources Information Center
Phan, Huy Phuong
2010-01-01
The main aim of this study is to test a conceptualised framework that involved the integration of achievement goals, self-efficacy and self-esteem beliefs, and study-processing strategies. Two hundred and ninety (178 females, 112 males) first-year university students were administered a number of Likert-scale inventories in tutorial classes. Data…
ERIC Educational Resources Information Center
Kerkvliet, J.; Nowell, C.
2005-01-01
We develop and empirically implement a model of university student retention using opportunity cost, financial aid, academic and social integration, and students' background explanatory variables. For one year, we tracked students from Weber State University (WSU) and Oregon State University (OSU) to learn whether they remained enrolled for 0, 1,…
ERIC Educational Resources Information Center
Abdallah, Mahmoud Mohammad Sayed
2013-01-01
This paper reports on a design study conducted within the Egyptian context of pre-service EFL teacher education, which implemented a Community of Practice (CoP) design facilitated by Facebook, to integrate some new forms of online writing. Based on some preliminary empirical results triangulated with literature review, a preliminary design…
Salloch, Sabine; Wäscher, Sebastian; Vollmann, Jochen; Schildmann, Jan
2015-04-04
Empirical-ethical research constitutes a relatively new field which integrates socio-empirical research and normative analysis. As direct inferences from descriptive data to normative conclusions are problematic, an ethical framework is needed to determine the relevance of the empirical data for normative argument. While issues of normative-empirical collaboration and questions of empirical methodology have been widely discussed in the literature, the normative methodology of empirical-ethical research has seldom been addressed. Based on our own research experience, we discuss one aspect of this normative methodology, namely the selection of an ethical theory serving as a background for empirical-ethical research. Whereas criteria for a good ethical theory in philosophical ethics are usually related to inherent aspects, such as the theory's clarity or coherence, additional points have to be considered in the field of empirical-ethical research. Three of these additional criteria will be discussed in the article: (a) the adequacy of the ethical theory for the issue at stake, (b) the theory's suitability for the purposes and design of the empirical-ethical research project, and (c) the interrelation between the ethical theory selected and the theoretical backgrounds of the socio-empirical research. Using the example of our own study on the development of interventions which support clinical decision-making in oncology, we will show how the selection of an ethical theory as a normative background for empirical-ethical research can proceed. We will also discuss the limitations of the procedures chosen in our project. The article stresses that a systematic and reasoned approach towards theory selection in empirical-ethical research should be given priority over an accidental or implicit way of choosing the normative framework for one's own research.
It furthermore shows that the overall design of an empirical-ethical study is a multi-faceted endeavor which has to balance between theoretical and pragmatic considerations.
NASA Astrophysics Data System (ADS)
Manners, R.; Wilcox, A. C.; Merritt, D. M.
2016-12-01
The ecogeomorphic response of riparian ecosystems to a change in hydrologic properties is difficult to predict because of the interactions and feedbacks among plants, water, and sediment. Most riparian models of community dynamics assume a static channel, yet geomorphic processes strongly control the establishment and survival of riparian vegetation. Using a combination of approaches that includes empirical relationships and hydrodynamic models, we model the coupled vegetation-topographic response of three cross-sections on the Yampa and Green Rivers in Dinosaur National Monument to a shift in the flow regime. The locations represent the variable geomorphology and vegetation composition of these canyon-bound rivers. We account for the inundation and hydraulic properties of vegetation plots surveyed over three years within the International River Interface Cooperative (iRIC) Fastmech model, equipped with a vegetation module that accounts for flexible stems and plant reconfiguration. The presence of functional groupings of plants, i.e., plants that respond similarly to environmental factors such as water availability and disturbance, is determined from flow response curves developed for the Yampa River. Using field measurements of vegetation morphology, distance from the channel centerline, and dominant particle size, together with modeled inundation properties, we develop an empirical relationship between these variables and topographic change. We evaluate vegetation and channel form changes over decadal timescales, allowing for the integration of processes over time. From our analyses, we identify thresholds in the flow regime that alter the distribution of plants and reduce geomorphic complexity, predominantly through side-channel and backwater infilling.
Simplification of some processes (e.g., empirically-derived sedimentation) and detailed treatment of others (e.g., plant-flow interactions) allows us to model the coupled dynamics of riparian ecosystems and evaluate the impact of small to large shifts in the flow regime. This approach will be useful to river managers and scientists, as they try to understand the potential changes to riparian ecosystems with uncertain changes to hydrologic regimes as a result of a changing climate and human demands.
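The empirical relationship between site variables and topographic change described above can be sketched, in spirit, as a simple least-squares fit. The single predictor, the data values, and the linear form are all invented for illustration; the study's actual relationship involves multiple variables.

```python
# (hydraulic predictor, observed elevation change in m) -- hypothetical data
obs = [(0.2, 0.05), (0.5, 0.01), (0.8, -0.03), (1.1, -0.08), (1.4, -0.12)]

# Ordinary least squares by hand: slope and intercept from mean deviations.
n = len(obs)
mx = sum(x for x, _ in obs) / n
my = sum(y for _, y in obs) / n
slope = (sum((x - mx) * (y - my) for x, y in obs)
         / sum((x - mx) ** 2 for x, _ in obs))
intercept = my - slope * mx

def predicted_change(x):
    """Predicted topographic change for a given predictor value."""
    return intercept + slope * x

print(round(slope, 4), round(intercept, 4))
```

A negative slope here would correspond to erosion at higher hydraulic forcing; the decadal evaluation in the study would iterate such a relationship with updated topography each step.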
Pinichka, Chayut; Bundhamcharoen, Kanitta; Shibuya, Kenji
2015-05-14
Ambient ozone (O3) pollution has increased globally since preindustrial times. At present, O3 is one of the major air pollution concerns in Thailand, and is associated with health impacts such as chronic obstructive pulmonary disease (COPD). The objective of our study is to estimate the burden of disease attributed to O3 in 2009 in Thailand based on empirical evidence. We estimated disability-adjusted life years (DALYs) attributable to O3 using the comparative risk assessment framework in the Global Burden of Diseases (GBD) study. We quantified the population attributable fraction (PAF), integrated from Geographic Information Systems (GIS)-based spatial interpolation, the population distribution of exposure, and the exposure-response coefficient to spatially characterize exposure to ambient O3 pollution on a national scale. Exposure distribution was derived from a GIS-based spatial interpolation O3 exposure model using Pollution Control Department Thailand (PCD) surface air pollution monitor network sources. Relative risk (RR) and PAF were determined using health impact function estimates for O3. The PAF (%) of COPD attributable to O3 by region was approximately: Northern=2.1, Northeastern=7.1, Central=9.6, Eastern=1.75, Western=1.47, and Southern=1.74. The total COPD burden attributable to O3 for Thailand in 2009 was 61,577 DALYs, approximately 0.6% of the total DALYs in Thailand (males: 48,480 DALYs; females: 13,097 DALYs). This study provides the first empirical evidence on the health burden (DALYs) attributable to O3 pollution in Thailand. Varying across regions, the disease burden attributable to O3 was 0.6% of the total national burden in 2009. Better empirical data on local specific sites, e.g. urban and rural areas, alternative exposure assessment, e.g. land use regression (LUR), and a local concentration-response coefficient are required for future studies in Thailand.
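The comparative-risk-assessment arithmetic described above can be sketched as follows. The exposure-response coefficient, the O3 increments, and the exposure distribution are invented placeholders; only the standard PAF formula, PAF = Σp(RR−1) / (Σp(RR−1) + 1), and the attributable-DALY multiplication follow the GBD framework.

```python
import math

def relative_risk(beta, delta_conc):
    """Log-linear exposure-response: RR = exp(beta * (C - C0))."""
    return math.exp(beta * delta_conc)

def paf(p_exposed_rr):
    """Population attributable fraction for categorical exposure.
    p_exposed_rr: list of (population fraction, RR) pairs."""
    excess = sum(p * (rr - 1.0) for p, rr in p_exposed_rr)
    return excess / (excess + 1.0)

# Hypothetical exposure distribution: fractions of the population at
# different O3 increments above the counterfactual level (ppb).
beta = 0.0015  # invented exposure-response coefficient
dist = [(0.5, relative_risk(beta, 10)),
        (0.3, relative_risk(beta, 25)),
        (0.2, relative_risk(beta, 40))]

f = paf(dist)
attributable_dalys = f * 100_000  # against, e.g., 100,000 COPD DALYs
print(round(f, 4))
```

In the study this computation is carried out per region, with exposure fractions taken from the GIS-interpolated O3 surface and the population distribution.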
Empirical Characterization of Low-Altitude Ion Flux Derived from TWINS
NASA Astrophysics Data System (ADS)
Goldstein, J.; LLera, K.; McComas, D. J.; Redfern, J.; Valek, P. W.
2018-05-01
In this study we analyze ion differential flux from 10 events between 2008 and 2015. The ion fluxes are derived from low-altitude emissions (LAEs) in energetic neutral atom (ENA) images obtained by Two Wide-angle Imaging Neutral-atom Spectrometers (TWINS). The data set comprises 119.44 hr of observations, including 4,284 per-energy images with 128,277 values of differential ENA flux from pixels near Earth's limb. Limb pixel data are extracted and mapped to a common polar ionospheric grid and associated with values of the Dst index. Statistical analysis is restricted to pixels within 10% of the LAE emissivity peak. For weak Dst conditions we find a premidnight peak in the average ion precipitation, whose flux and location are relatively insensitive to energy. For moderate Dst, elevated flux levels appear over a wider magnetic local time (MLT) range, with a separation of peak locations by energy. Strong disturbances bring a dramatic flux increase across the entire nightside at all energies, strongest for low energies in the postmidnight sector. The arrival of low-energy ions can lower the average energy for strong Dst, even as it raises the total integral number flux. TWINS-derived ion fluxes provide a macroscale measurement of the average precipitating ion distribution and confirm that convection, either quasi-steady or bursty, is an important process controlling the spatial and spectral properties of precipitating ions. The premidnight peak (weak Dst), MLT widening and energy-versus-MLT dependence (moderate Dst), and postmidnight low-energy ion enhancement (strong Dst) are consistent with observations and models of steady or bursty convective transport.
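The statistical restriction described above, keeping only limb pixels within 10% of the LAE emissivity peak and then averaging flux by magnetic local time, can be sketched with invented numbers (the pixel values below are not TWINS data):

```python
from collections import defaultdict

# (MLT hour, normalized LAE emissivity, differential flux) -- hypothetical
pixels = [
    (21.5, 0.98, 3.2e4), (22.0, 1.00, 4.1e4), (23.0, 0.85, 2.0e4),
    (1.5, 0.95, 2.7e4), (3.0, 0.60, 1.1e4),
]

# Restrict to pixels within 10% of the emissivity peak.
peak = max(e for _, e, _ in pixels)
kept = [(mlt, flux) for mlt, e, flux in pixels if e >= 0.9 * peak]

# Average the surviving flux values in 2-hour MLT bins.
bins = defaultdict(list)
for mlt, flux in kept:
    bins[int(mlt // 2)].append(flux)
avg = {b: sum(v) / len(v) for b, v in bins.items()}
print(sorted(avg))
```

In the study this binning is done on a common polar ionospheric grid, sorted by Dst level, before the averages are compared across energy channels.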