Sample records for developing empirical correlations

  1. Path integral for equities: Dynamic correlation and empirical analysis

    NASA Astrophysics Data System (ADS)

    Baaquie, Belal E.; Cao, Yang; Lau, Ada; Tang, Pan

    2012-02-01

    This paper develops a model to describe the unequal-time correlation between the rates of return of different stocks. A non-trivial fourth-order derivative Lagrangian is defined to provide an unequal-time propagator, which can be fitted to market data. A calibration algorithm is designed to find the empirical parameters for this model, and different de-noising methods are used to capture the signals concealed in the rates of return. The detailed results of this Gaussian model show that different stocks can have strong correlation and that the empirical unequal-time correlator can be described by the model's propagator. This preliminary study provides a novel model for the correlator of different instruments at different times.

  2. Design of exchange-correlation functionals through the correlation factor approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pavlíková Přecechtělová, Jana (Institut für Chemie, Theoretische Chemie / Quantenchemie, Technische Universität Berlin); Bahmann, Hilke

    The correlation factor model is developed in which the spherically averaged exchange-correlation hole of Kohn-Sham theory is factorized into an exchange hole model and a correlation factor. The exchange hole model reproduces the exact exchange energy per particle. The correlation factor is constructed in such a manner that the exchange-correlation energy correctly reduces to exact exchange in the high-density and rapidly varying limits. Four different correlation factor models are presented which satisfy varying sets of physical constraints. Three models are free from empirical adjustments to experimental data, while one correlation factor model draws on one empirical parameter. The correlation factor models are derived in detail and the resulting exchange-correlation holes are analyzed. Furthermore, the exchange-correlation energies obtained from the correlation factor models are employed to calculate total energies, atomization energies, and barrier heights. It is shown that accurate, non-empirical functionals can be constructed building on exact exchange. Avenues for further improvements are outlined as well.

  3. Empirical correlations of the performance of vapor-anode PX-series AMTEC cells

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, L.; Merrill, J.M.; Mayberry, C.

    Power systems based on AMTEC technology will be used for future NASA missions, including a Pluto-Express (PX) or Europa mission planned for approximately 2004. AMTEC technology may also be used as an alternative to photovoltaic-based power systems for future Air Force missions. An extensive development program of Alkali-Metal Thermal-to-Electric Conversion (AMTEC) technology has been underway at the Vehicle Technologies Branch of the Air Force Research Laboratory (AFRL) in Albuquerque, New Mexico since 1992. Under this program, numerical modeling and experimental investigations of the performance of various multi-BASE tube, vapor-anode AMTEC cells have been and are being performed. Vacuum testing of AMTEC cells at AFRL determines the effects of changing the hot and cold end temperatures, T_hot and T_cold, and the applied external load, R_ext, on the cell electric power output, current-voltage characteristics, and conversion efficiency. Test results have traditionally been used to provide feedback to cell designers and to validate numerical models. The current work utilizes the test data to develop empirical correlations for cell output performance under various working conditions. Because the empirical correlations are developed directly from the experimental data, uncertainties arising from material properties that must be used in numerical modeling can be avoided. Empirical correlations of recent vapor-anode PX-series AMTEC cells have been developed. Based on AMTEC theory and the experimental data, the cell output power (as well as voltage and current) was correlated as a function of three parameters (T_hot, T_cold, and R_ext) for a given cell. Correlations were developed for different cells (PX-3C, PX-3A, PX-G3, and PX-5A), and were in good agreement with experimental data for these cells. Use of these correlations can greatly reduce the testing required to determine the electrical performance of a given type of AMTEC cell over a wide range of operating conditions.

  4. Piaget's epistemic subject and science education: Epistemological vs. psychological issues

    NASA Astrophysics Data System (ADS)

    Kitchener, Richard F.

    1993-06-01

    Many individuals claim that Piaget's theory of cognitive development is empirically false or substantially disconfirmed by empirical research. Although there is substance to such a claim, any such conclusion must address three increasingly problematic issues about the possibility of providing an empirical test of Piaget's genetic epistemology: (1) the empirical underdetermination of theory by empirical evidence, (2) the empirical difficulty of testing competence-type explanations, and (3) the difficulty of empirically testing epistemic norms. This is especially true of a central epistemic construct in Piaget's theory — the epistemic subject. To illustrate how similar problems of empirical testability arise in the physical sciences, I briefly examine the case of Galileo and the correlative difficulty of empirically testing Galileo's laws. I then point out some important epistemological similarities between Galileo and Piaget together with correlative changes needed in science studies methodology. I conclude that many psychologists and science educators have failed to appreciate the difficulty of falsifying Piaget's theory because they have tacitly adopted a philosophy of science at odds with the paradigm-case of Galileo.

  5. An Empirical Bayes Approach to Spatial Analysis

    NASA Technical Reports Server (NTRS)

    Morris, C. N.; Kostal, H.

    1983-01-01

    Multi-channel LANDSAT data are collected in several passes over agricultural areas during the growing season. How empirical Bayes modeling can be used to develop crop identification and discrimination techniques that account for spatial correlation in such data is considered. The approach models the unobservable parameters and the data separately, hoping to take advantage of the fact that the bulk of spatial correlation lies in the parameter process. The problem is then framed in terms of estimating posterior probabilities of crop types for each spatial area. Some empirical Bayes spatial estimation methods are used to estimate the logits of these probabilities.
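    The core empirical Bayes idea in this line of work is shrinkage: noisy per-area estimates are pulled toward a mean, with the amount of shrinkage estimated from the data themselves. A minimal sketch (the method-of-moments estimator and the toy logit values below are my illustration, not the authors' exact spatial model):

```python
# Minimal empirical Bayes shrinkage sketch: each area's observed logit is
# pulled toward the overall mean, with the shrinkage factor estimated from
# the data (illustrative; not the authors' exact spatial estimator).
def eb_shrink(observed, sampling_var):
    n = len(observed)
    grand_mean = sum(observed) / n
    total_var = sum((y - grand_mean) ** 2 for y in observed) / (n - 1)
    # Method-of-moments estimate of the between-area variance tau^2
    tau2 = max(total_var - sampling_var, 0.0)
    shrink = sampling_var / (sampling_var + tau2) if (sampling_var + tau2) else 0.0
    return [shrink * grand_mean + (1 - shrink) * y for y in observed]

logits = [0.2, -0.5, 1.1, 0.4, -0.1]   # hypothetical per-area crop-type logits
smoothed = eb_shrink(logits, sampling_var=0.3)
```

    Each smoothed value lies between the raw observation and the grand mean, and the overall mean is preserved; spatial versions replace the grand mean with a neighborhood mean.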

  6. Correlations by the entrainment theory of thermodynamic effects for developed cavitation in venturis and comparisons with ogive data

    NASA Technical Reports Server (NTRS)

    Billet, M. L.; Holl, J. W.; Weir, D. S.

    1975-01-01

    A semi-empirical entrainment theory was employed to correlate the measured temperature depression, Delta T, in a developed cavity for a venturi. The theory correlates Delta T in terms of the dimensionless Nusselt, Reynolds, Froude, Weber and Peclet numbers and the dimensionless cavity length, L/D. These correlations are then compared with similar correlations for zero and quarter caliber ogives. In addition, cavitation number data for both limited and developed cavitation in venturis are presented.

  7. Optimum wall impedance for spinning modes: A correlation with mode cut-off ratio

    NASA Technical Reports Server (NTRS)

    Rice, E. J.

    1978-01-01

    A correlating equation relating the optimum acoustic impedance for the wall lining of a circular duct to the acoustic mode cut-off ratio is presented. The optimum impedance was correlated with cut-off ratio because the cut-off ratio appears to be the fundamental parameter governing the propagation of sound in the duct: modes with similar cut-off ratios respond in a similar way to the acoustic liner. The correlation is a semi-empirical expression developed from an empirical modification of an equation originally derived from sound propagation theory in a thin boundary layer. This correlating equation forms part of a simplified liner design method, based upon modal cut-off ratio, for multimodal noise propagation.

  8. Empirical Correlations for the Solubility of Pressurant Gases in Cryogenic Propellants

    NASA Technical Reports Server (NTRS)

    Zimmerli, Gregory A.; Asipauskas, Marius; VanDresar, Neil T.

    2010-01-01

    We have analyzed data published by others reporting the solubility of helium in liquid hydrogen, oxygen, and methane, and of nitrogen in liquid oxygen, to develop empirical correlations for the mole fraction of these pressurant gases in the liquid phase as a function of temperature and pressure. The data, compiled and provided by NIST, are from a variety of sources and cover a large range of liquid temperatures and pressures. The correlations were developed to yield accurate estimates of the mole fraction of the pressurant gas in the cryogenic liquid at temperatures and pressures of interest to the propulsion community, yet they are applicable over a much wider range. The mole fraction solubility of helium in all these liquids is less than 0.3% at the temperatures and pressures used in propulsion systems. When nitrogen is used as a pressurant for liquid oxygen, substantial contamination can result, though the diffusion into the liquid is slow.

  9. High Speed Jet Noise Prediction Using Large Eddy Simulation

    NASA Technical Reports Server (NTRS)

    Lele, Sanjiva K.

    2002-01-01

    Current methods for predicting the noise of high speed jets are largely empirical. These empirical methods are based on jet noise data gathered by varying primarily the jet flow speed and jet temperature for a fixed nozzle geometry. Efforts have been made to correlate the noise data of co-annular (multi-stream) jets, and the changes associated with forward flight, within these empirical correlations. But ultimately these empirical methods fail to provide suitable guidance in the selection of new, low-noise nozzle designs. This motivates the development of a new class of prediction methods based on computational simulations, in an attempt to remove the empiricism of present-day noise predictions.

  10. Semi-empirical correlation for binary interaction parameters of the Peng-Robinson equation of state with the van der Waals mixing rules for the prediction of high-pressure vapor-liquid equilibrium.

    PubMed

    Fateen, Seif-Eddeen K; Khalil, Menna M; Elnabawy, Ahmed O

    2013-03-01

    The Peng-Robinson equation of state is widely used with the classical van der Waals mixing rules to predict vapor-liquid equilibria for systems containing hydrocarbons and related compounds. This model requires good values of the binary interaction parameter kij. In this work, we developed a semi-empirical correlation for kij partly based on the Huron-Vidal mixing rules. We obtained values for the adjustable parameters of the developed formula for over 60 binary systems and over 10 categories of components. The predictions of the new equation system were slightly better than the constant-kij model in most cases, except for 10 systems whose predictions were considerably improved with the new correlation.
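    The role of kij in the classical van der Waals mixing rules can be sketched briefly. The component values below are illustrative placeholders, not the paper's fitted correlation:

```python
import math

# van der Waals one-fluid mixing rule for the Peng-Robinson "a" parameter:
#   a_mix = sum_i sum_j x_i x_j * sqrt(a_i * a_j) * (1 - k_ij)
# The a values and k_ij entries below are illustrative, not fitted data.
def a_mix(x, a, k):
    n = len(x)
    return sum(
        x[i] * x[j] * math.sqrt(a[i] * a[j]) * (1.0 - k[i][j])
        for i in range(n)
        for j in range(n)
    )

x = [0.4, 0.6]                   # mole fractions
a = [2.5, 1.4]                   # pure-component attraction parameters (illustrative)
k = [[0.0, 0.05], [0.05, 0.0]]   # symmetric binary interaction parameters
am = a_mix(x, a, k)
```

    A positive kij weakens the cross-interaction term, so am is smaller than the kij = 0 value; the paper's contribution is predicting kij instead of treating it as a constant per pair.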

  11. The study and development of the empirical correlations equation of natural convection heat transfer on vertical rectangular sub-channels

    NASA Astrophysics Data System (ADS)

    Kamajaya, Ketut; Umar, Efrizon; Sudjatmi, K. S.

    2012-06-01

    This study focused on natural convection heat transfer in a vertical rectangular sub-channel with water as the coolant fluid. For this study, heater pipes fitted with thermocouples were fabricated; each heater carries five thermocouples along its length. Each heater is 2.54 cm in diameter and 45 cm long, and the center-to-center pitch between heaters is 29.5 cm. The test equipment includes a primary cooling system, a secondary cooling system and a heat exchanger. The purpose of this study is to obtain new empirical correlation equations for the vertical rectangular sub-channel, especially for natural convection heat transfer within a bundle of vertical cylinders in a rectangular-arrangement sub-channel. Such empirical correlations can support the thermo-hydraulic analysis of research nuclear reactors that utilize cylindrical fuel rods, and can also be used in the design of baffle-free vertical shell-and-tube heat exchangers. The resulting empirical correlation for the natural convection heat transfer coefficient in the rectangular arrangement is Nu = 6.3357 (Ra·Dh/x)^0.074.
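    The reported correlation can be applied directly once the Rayleigh number is known; the Ra value and fluid conductivity below are illustrative inputs, not measurements from the study:

```python
# Nusselt-number correlation reported in the abstract:
#   Nu = 6.3357 * (Ra * Dh / x) ** 0.074
# Dh is the hydraulic diameter and x the axial position; the convection
# coefficient then follows as h = Nu * k_fluid / Dh.
def nusselt(ra, dh, x):
    return 6.3357 * (ra * dh / x) ** 0.074

def h_coeff(ra, dh, x, k_fluid):
    return nusselt(ra, dh, x) * k_fluid / dh

nu = nusselt(ra=1.0e8, dh=0.0254, x=0.45)              # illustrative Ra; 2.54 cm, 45 cm geometry
h = h_coeff(ra=1.0e8, dh=0.0254, x=0.45, k_fluid=0.6)  # water, k ~ 0.6 W/(m.K)
```

    The small exponent (0.074) means Nu varies only weakly with Ra, as is typical of tightly confined natural-convection channels.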

  12. Success Avoidant Motivation and Behavior; Its Development Correlates and Situational Determinants. Final Report.

    ERIC Educational Resources Information Center

    Horner, Matina S.

    This paper reports on a successful attempt to understand success avoidant motivation and behavior by the development of an empirically sophisticated scoring system of success avoidant motivation and the observation of its behavioral correlates and situational determinants. Like most of the work on achievement motivation, the study was carried out…

  13. Climate Prediction for Brazil's Nordeste: Performance of Empirical and Numerical Modeling Methods.

    NASA Astrophysics Data System (ADS)

    Moura, Antonio Divino; Hastenrath, Stefan

    2004-07-01

    Comparisons of performance of climate forecast methods require consistency in the predictand and a long common reference period. For Brazil's Nordeste, empirical methods developed at the University of Wisconsin use preseason (October-January) rainfall and January indices of the fields of meridional wind component and sea surface temperature (SST) in the tropical Atlantic and the equatorial Pacific as input to stepwise multiple regression and neural networks. These are used to predict the March-June rainfall at a network of 27 stations. An experiment at the International Research Institute for Climate Prediction, Columbia University, with a numerical model (ECHAM4.5) used global SST information through February to predict the March-June rainfall at three grid points in the Nordeste. The predictands for the empirical and numerical model forecasts are correlated at +0.96, and the period common to the independent portion of record of the empirical prediction and the numerical modeling is 1968-99. Over this period, predicted versus observed rainfall are evaluated in terms of correlation, root-mean-square error, absolute error, and bias. Performance is high for both approaches. Numerical modeling produces a correlation of +0.68, moderate errors, and strong negative bias. For the empirical methods, errors and bias are small, and correlations of +0.73 and +0.82 are reached between predicted and observed rainfall.


  14. Data mining in forecasting PVT correlations of crude oil systems based on Type1 fuzzy logic inference systems

    NASA Astrophysics Data System (ADS)

    El-Sebakhy, Emad A.

    2009-09-01

    Pressure-volume-temperature (PVT) properties are very important in reservoir engineering computations. There are many empirical approaches for predicting various PVT properties based on empirical correlations and statistical regression models. Over the last decade, researchers have utilized neural networks to develop more accurate PVT correlations. These achievements of neural networks open the door for data mining techniques to play a major role in the oil and gas industry. Unfortunately, the developed neural network correlations are often limited, and global correlations are usually less accurate than local correlations. Recently, adaptive neuro-fuzzy inference systems have been proposed as a new intelligence framework for both prediction and classification based on a fuzzy clustering optimization criterion and ranking. This paper proposes neuro-fuzzy inference systems for estimating PVT properties of crude oil systems. This new framework is an efficient hybrid intelligence machine-learning scheme for modeling the kind of uncertainty associated with vagueness and imprecision. We briefly describe the learning steps and the use of the Takagi-Sugeno-Kang model and the Gustafson-Kessel clustering algorithm with K detected clusters from the given database. The approach has featured in a wide range of medical, power control system, and business applications, often with promising results. A comparative study is carried out to compare the performance of this new framework with the most popular modeling techniques, such as neural networks, nonlinear regression, and empirical correlation algorithms. The results show that the performance of neuro-fuzzy systems is accurate and reliable and outperforms most existing forecasting techniques. Future work could apply neuro-fuzzy systems to clustering 3D seismic data, identification of lithofacies types, and other reservoir characterization tasks.
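    A Takagi-Sugeno-Kang (TSK) system blends a few locally linear models using fuzzy membership weights. A minimal sketch with two hand-set rules (the Gaussian centers, widths, and linear consequents are placeholders, not a trained PVT model):

```python
import math

# Minimal Takagi-Sugeno-Kang fuzzy inference sketch: two rules, Gaussian
# membership functions, linear consequents. All rule parameters are
# illustrative placeholders, not the paper's trained system.
def gauss(x, center, sigma):
    return math.exp(-((x - center) ** 2) / (2.0 * sigma ** 2))

def tsk_predict(x, rules):
    # rules: list of (center, sigma, slope, intercept)
    weights = [gauss(x, c, s) for c, s, _, _ in rules]
    outputs = [m * x + b for _, _, m, b in rules]
    return sum(w * y for w, y in zip(weights, outputs)) / sum(weights)

rules = [
    (50.0, 20.0, 0.8, 10.0),   # rule for "low" input region
    (150.0, 30.0, 0.5, 60.0),  # rule for "high" input region
]
prediction = tsk_predict(100.0, rules)
```

    In a full ANFIS pipeline, clustering (e.g. Gustafson-Kessel) places the rule centers and regression fits the consequents; the prediction is always a weighted average of the rule outputs.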

  15. Physical Activity and Psychological Correlates during an After-School Running Club

    ERIC Educational Resources Information Center

    Kahan, David; McKenzie, Thomas L.

    2018-01-01

    Background: After-school programs (ASPs) have the potential to contribute to moderate-to-vigorous physical activity (MVPA), but there is limited empirical evidence to guide their development and implementation. Purpose: This study assessed the replication of an elementary school running program and identified psychological correlates of children's…

  16. Development of advanced acreage estimation methods

    NASA Technical Reports Server (NTRS)

    Guseman, L. F., Jr. (Principal Investigator)

    1980-01-01

    The use of the AMOEBA clustering/classification algorithm was investigated as a basis for both a color display generation technique and maximum likelihood proportion estimation procedure. An approach to analyzing large data reduction systems was formulated and an exploratory empirical study of spatial correlation in LANDSAT data was also carried out. Topics addressed include: (1) development of multiimage color images; (2) spectral spatial classification algorithm development; (3) spatial correlation studies; and (4) evaluation of data systems.

  17. Correlation of published data on the solubility of methane in H/sub 2/O-NaCl solutions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coco, L.T.; Johnson, A.E. Jr.; Bebout, D.G.

    1981-01-01

    A new correlation of the available published data for the solubility of methane in water was developed, based on fundamental thermodynamic relationships. An empirical relationship for the salting-out coefficient of NaCl for methane solubility in water was determined as a function of temperature. Root mean square and average deviations for the new correlation, the Haas correlation, and the revised Blount equation are compared.
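    Salting-out relationships of this kind are commonly written in Setschenow form, log10(S0/S) = ks·m, with the coefficient ks depending on temperature. A sketch with illustrative values (ks and the pure-water solubility below are placeholders, not the paper's fit):

```python
# Setschenow-type salting-out sketch: methane solubility in NaCl brine
# relative to pure water, log10(S0/S) = ks * m, where m is NaCl molality.
# The coefficient ks and pure-water solubility are illustrative values.
def brine_solubility(s_pure, ks, molality):
    return s_pure / (10.0 ** (ks * molality))

s0 = 2.5e-3   # methane mole fraction in pure water (illustrative)
ks = 0.13     # illustrative salting-out coefficient
s_brine = brine_solubility(s0, ks, molality=3.0)
```

    Solubility falls off exponentially with salinity, which is why a temperature-dependent ks captures the NaCl effect compactly.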

  18. Designing Large-Scale Multisite and Cluster-Randomized Studies of Professional Development

    ERIC Educational Resources Information Center

    Kelcey, Ben; Spybrook, Jessaca; Phelps, Geoffrey; Jones, Nathan; Zhang, Jiaqi

    2017-01-01

    We develop a theoretical and empirical basis for the design of teacher professional development studies. We build on previous work by (a) developing estimates of intraclass correlation coefficients for teacher outcomes using two- and three-level data structures, (b) developing estimates of the variance explained by covariates, and (c) modifying…

  19. Socio-demographic and academic correlates of clinical reasoning in a dental school in South Africa.

    PubMed

    Postma, T C; White, J G

    2017-02-01

    There are no empirical studies that describe factors that may influence the development of integrated clinical reasoning skills in dental education. Hence, this study examines the association between clinical reasoning outcomes and differences in instructional design and student factors. Progress test scores, including diagnostic and treatment planning scores, of fourth- and fifth-year dental students (2009-2011) at the University of Pretoria, South Africa served as the outcome measures in stepwise linear regression analyses. These scores were correlated with the instructional design (lecture-based teaching and learning (LBTL = 0) or case-based teaching and learning (CBTL = 1)), students' grades in Oral Biology, indicators of socio-economic status (SES) and gender. CBTL showed an independent association with progress test scores. Oral Biology scores correlated with diagnostic component scores. Diagnostic component scores correlated with treatment planning scores in the fourth year of study but not in the fifth year. SES correlated with progress test scores in year five only, while gender showed no correlation. The empirical evidence gathered in this study supports scaffolded inductive teaching and learning methods for developing clinical reasoning skills. Knowledge of Oral Biology and reading skills may be important attributes to develop to ensure that students are able to reason accurately in a clinical setting. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  20. Correlation of refrigerant mass flow rate through adiabatic capillary tubes using mixture refrigerant carbondioxide and ethane for low temperature applications

    NASA Astrophysics Data System (ADS)

    Nasruddin, Syaka, Darwin R. B.; Alhamid, M. Idrus

    2012-06-01

    Various binary mixtures of carbon dioxide and hydrocarbons, especially propane or ethane, are presented in this paper as alternative natural refrigerants to chlorofluorocarbons (CFCs) and hydrofluorocarbons (HFCs). Their environmental performance is friendly, with an ozone depletion potential (ODP) of zero and a global warming potential (GWP) smaller than 20. The capillary tube performance of alternative HFC, HC and mixed refrigerants has been widely studied. However, studies on capillary tube performance with mixtures of natural refrigerants, in particular the azeotropic mixture of carbon dioxide and ethane, are still lacking. An empirical correlation method for determining mass flow rate and pipe length plays an important role in capillary tube design for industrial refrigeration. Based on the variables that affect the refrigerant mass flow rate in the capillary tube, the Buckingham Pi theorem was used to formulate eight non-dimensional parameters, which were developed into an empirical correlation equation. Non-linear regression analysis was then used to determine the coefficients and exponents of this empirical correlation against a database of experimental results.

  1. Disorders without borders: current and future directions in the meta-structure of mental disorders.

    PubMed

    Carragher, Natacha; Krueger, Robert F; Eaton, Nicholas R; Slade, Tim

    2015-03-01

    Classification is the cornerstone of clinical diagnostic practice and research. However, the extant psychiatric classification systems are not well supported by research evidence. In particular, extensive comorbidity among putatively distinct disorders flags an urgent need for fundamental changes in how we conceptualize psychopathology. Over the past decade, research has coalesced on an empirically based model that suggests many common mental disorders are structured according to two correlated latent dimensions: internalizing and externalizing. We review and discuss the development of a dimensional-spectrum model which organizes mental disorders in an empirically based manner. We also touch upon changes in the DSM-5 and put forward recommendations for future research endeavors. Our review highlights substantial empirical support for the empirically based internalizing-externalizing model of psychopathology, which provides a parsimonious means of addressing comorbidity. As future research goals, we suggest that the field would benefit from: expanding the meta-structure of psychopathology to include additional disorders, development of empirically based thresholds, inclusion of a developmental perspective, and intertwining genomic and neuroscience dimensions with the empirical structure of psychopathology.

  2. Development and Validation of the Controller Acceptance Rating Scale (CARS): Results of Empirical Research

    NASA Technical Reports Server (NTRS)

    Lee, Katharine K.; Kerns, Karol; Bone, Randall

    2001-01-01

    The measurement of operational acceptability is important for the development, implementation, and evolution of air traffic management decision support tools. The Controller Acceptance Rating Scale was developed at NASA Ames Research Center for the development and evaluation of the Passive Final Approach Spacing Tool. CARS was modeled after a well-known pilot evaluation rating instrument, the Cooper-Harper Scale, and has since been used in the evaluation of the User Request Evaluation Tool, developed by MITRE's Center for Advanced Aviation System Development. In this paper, we provide a discussion of the development of CARS and an analysis of the empirical data collected with CARS to examine construct validity. Results of intraclass correlations indicated statistically significant reliability for the CARS. From the subjective workload data that were collected in conjunction with the CARS, it appears that the expected set of workload attributes was correlated with the CARS. As expected, the analysis also showed that CARS was a sensitive indicator of the impact of decision support tools on controller operations. Suggestions for future CARS development and its improvement are also provided.

  3. Estimating the volume of Alpine glacial lakes

    NASA Astrophysics Data System (ADS)

    Cook, S. J.; Quincey, D. J.

    2015-12-01

    Supraglacial, moraine-dammed and ice-dammed lakes represent a potential glacial lake outburst flood (GLOF) threat to downstream communities in many mountain regions. This has motivated the development of empirical relationships to predict lake volume given a measurement of lake surface area obtained from satellite imagery. Such relationships are based on the notion that lake depth, area and volume scale predictably. We critically evaluate the performance of these existing empirical relationships by examining a global database of glacial lake depths, areas and volumes. Results show that lake area and depth are not always well correlated (r2 = 0.38) and that although lake volume and area are well correlated (r2 = 0.91), and indeed are auto-correlated, there are distinct outliers in the data set. These outliers represent situations where it may not be appropriate to apply existing empirical relationships to predict lake volume: growing supraglacial lakes; glaciers that recede into basins with complex overdeepened morphologies or that have been deepened by intense erosion; and lakes formed where glaciers advance across and block a main trunk valley. We use the compiled data set to develop a conceptual model of how the volumes of supraglacial ponds and lakes, moraine-dammed lakes and ice-dammed lakes should be expected to evolve with increasing area. Although a large amount of bathymetric data exist for moraine-dammed and ice-dammed lakes, we suggest that further measurements of growing supraglacial ponds and lakes are needed to better understand their development.
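    The empirical relationships under evaluation are typically power laws of the form V = c·A^b. A sketch (the coefficients below are illustrative of this family of relations, not taken from this paper, whose point is precisely that such fits can mislead for outlier lakes):

```python
# Empirical area-volume scaling sketch of the form V = c * A**b,
# with A in m^2 and V in m^3. Coefficients are illustrative placeholders.
def lake_volume(area_m2, c=0.104, b=1.42):
    return c * area_m2 ** b

v_small = lake_volume(1.0e4)   # ~1 ha lake
v_large = lake_volume(1.0e6)   # ~100 ha lake
```

    Because b > 1, volume grows faster than area, so small errors in the exponent compound quickly for large lakes.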

  4. An Empirical Research on the Correlation between Human Capital and Career Success of Knowledge Workers in Enterprise

    NASA Astrophysics Data System (ADS)

    Guo, Wenchen; Xiao, Hongjun; Yang, Xi

    Human capital plays an important part in the employability of knowledge workers; it is also an important intangible asset of a company. This paper explores the correlation between human capital and the career success of knowledge workers. Based on a literature review, we identified a measuring tool for career success and modified it further; human capital was measured with a self-developed scale of high reliability and validity. Exploratory factor analysis suggests that human capital comprises four dimensions: education, work experience, learning ability and training; career success comprises three dimensions: perceived internal competitiveness within the organization, perceived external competitiveness of the organization and career satisfaction. The empirical analysis indicates a positive correlation between human capital and career success, and that human capital is an excellent predictor of career success beyond demographic variables.

  5. Motor Coordination and Executive Functions

    ERIC Educational Resources Information Center

    Michel, Eva

    2012-01-01

    Since Piaget, the view that motor and cognitive development are interrelated has gained wide acceptance. However, empirical research on this issue is still rare. Few studies show a correlation of performance in cognitive and motor tasks in typically developing children. More specifically, Diamond A. (2000) hypothesizes an involvement of executive…

  6. Prediction of shear wave velocity using empirical correlations and artificial intelligence methods

    NASA Astrophysics Data System (ADS)

    Maleki, Shahoo; Moradzadeh, Ali; Riabi, Reza Ghavami; Gholami, Raoof; Sadeghzadeh, Farhad

    2014-06-01

    A good understanding of the mechanical properties of rock formations is essential during the development and production phases of a hydrocarbon reservoir. Conventionally, these properties are estimated from petrophysical logs, with compressional and shear sonic data being the main inputs to the correlations. However, in many cases shear sonic data are not acquired during well logging, often for cost-saving reasons. In such cases, shear wave velocity is estimated using available empirical correlations or the artificial intelligence methods proposed during the last few decades. In this paper, petrophysical logs from a well drilled in the southern part of Iran were used to estimate shear wave velocity using empirical correlations as well as two robust artificial intelligence methods known as Support Vector Regression (SVR) and Back-Propagation Neural Network (BPNN). Although the results obtained by SVR seem to be reliable, the estimated values are not very precise; considering the importance of shear sonic data as input to different models, this study suggests acquiring shear sonic data during well logging. The benefits of having reliable shear sonic data for estimating rock formation mechanical properties will compensate for the possible additional cost of acquiring a shear log.
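    A well-known example of the empirical-correlation route is Castagna's "mudrock line" relating shear to compressional velocity; it is offered here as an illustration of the approach, not necessarily one of the correlations this paper tested:

```python
# Castagna mudrock-line empirical correlation (velocities in km/s):
#   Vs = 0.8621 * Vp - 1.1724
# A widely used Vp-Vs correlation for water-saturated clastics, shown as
# an example of the empirical methods the paper evaluates.
def vs_castagna(vp_km_s):
    return 0.8621 * vp_km_s - 1.1724

vs = vs_castagna(3.0)   # ~1.41 km/s for Vp = 3.0 km/s
```

    Such single-input linear fits are lithology-dependent, which is one reason data-driven methods like SVR and BPNN, trained on the full petrophysical log suite, can outperform them locally.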

  7. What Good Is Gratitude in Youth and Schools? A Systematic Review and Meta-Analysis of Correlates and Intervention Outcomes

    ERIC Educational Resources Information Center

    Renshaw, Tyler L.; Olinger Steeves, Rachel M.

    2016-01-01

    The development of gratitude in youth has received increasing attention during the past several years, and gratitude-based interventions have often been recommended for use in schools. Yet, the empirical status of the correlates of gratitude and the effects of gratitude-based interventions on youths' outcomes remains unclear. The present study…

  8. How Does Leadership Development Help Universities Become Learning Organisations?

    ERIC Educational Resources Information Center

    Gentle, Paul; Clifton, Louise

    2017-01-01

    Purpose: The purpose of this paper is to draw on empirical data to interrogate the correlation between participation in leadership development programmes by individual leaders and the ability of higher education institutions to learn organisationally from such participation. Design/methodology/approach: Applying a multi-stakeholder perspective,…

  9. Assessing Computer Literacy: A Validated Instrument and Empirical Results.

    ERIC Educational Resources Information Center

    Gabriel, Roy M.

    1985-01-01

    Describes development of a comprehensive computer literacy assessment battery for K-12 curriculum based on objectives of a curriculum implemented in the Worldwide Department of Defense Dependents Schools system. Test development and field test data are discussed and a correlational analysis which assists in interpretation of test results is…

  10. An empirical description of the dispersion of 5th and 95th percentiles in worldwide anthropometric data applied to estimating accommodation with unknown correlation values.

    PubMed

    Albin, Thomas J; Vink, Peter

    2015-01-01

    Anthropometric data are assumed to have a Gaussian (Normal) distribution, but if they are non-Gaussian, accommodation estimates are affected. When data are limited, users may choose to combine anthropometric elements by Combining Percentiles (CP) (adding or subtracting), despite known adverse effects. This study examined whether global anthropometric data are Gaussian distributed. It compared the Median Correlation Method (MCM) of combining anthropometric elements with unknown correlations to CP, to determine whether MCM provides better estimates of percentile values and accommodation. Percentile values of 604 male and female anthropometric data drawn from seven countries worldwide were expressed as standard scores. The standard scores were tested to determine whether they were consistent with a Gaussian distribution, and empirical multipliers for determining percentile values were developed. In a test case, five anthropometric elements descriptive of seating were combined in addition and subtraction models. Percentile values were estimated for each model by CP, by MCM with Gaussian distributed data, or by MCM with empirically distributed data. The 5th and 95th percentile values of a dataset of global anthropometric data are shown to be asymmetrically distributed. MCM with empirical multipliers gave more accurate estimates of 5th and 95th percentile values. Anthropometric data are not Gaussian distributed, and the MCM method is more accurate than adding or subtracting percentiles.
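    A minimal sketch (not the paper's MCM implementation) of why Combining Percentiles misestimates accommodation: for two independent Gaussian body dimensions, adding their 95th percentiles overstates the 95th percentile of the sum. All distribution parameters below are invented.

```python
# Monte Carlo demonstration that CP (adding percentiles) overestimates the
# combined 95th percentile when elements are not perfectly correlated.
import random

random.seed(0)
n = 200_000
a = [random.gauss(500.0, 30.0) for _ in range(n)]   # element A, mm (invented)
b = [random.gauss(400.0, 25.0) for _ in range(n)]   # element B, mm (invented)

def pctl(data, q):
    """Empirical q-quantile by sorting."""
    s = sorted(data)
    return s[int(q * len(s)) - 1]

cp_95 = pctl(a, 0.95) + pctl(b, 0.95)                 # CP: add the percentiles
true_95 = pctl([x + y for x, y in zip(a, b)], 0.95)   # percentile of the sum

print(f"CP estimate:   {cp_95:.1f} mm")
print(f"true 95th pct: {true_95:.1f} mm")
assert cp_95 > true_95  # CP overestimates whenever correlation < 1
```

    With perfectly correlated elements the two values would coincide; methods like MCM use an assumed or median correlation to interpolate between these extremes.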

  11. Jet Aeroacoustics: Noise Generation Mechanism and Prediction

    NASA Technical Reports Server (NTRS)

    Tam, Christopher

    1998-01-01

    This report covers the third-year research effort of the project. The research focused on the fine scale mixing noise of both subsonic and supersonic jets and the effects of nozzle geometry and tabs on subsonic jet noise. In publication 1, a new semi-empirical theory of jet mixing noise from fine scale turbulence is developed. By analogy to gas kinetic theory, it is shown that the source of noise is related to the time fluctuations of the turbulence kinetic energy. Starting from the Reynolds Averaged Navier-Stokes equations, a formula for the radiated noise is derived. An empirical model of the space-time correlation function of the turbulence kinetic energy is adopted. The form of the model is in good agreement with the space-time two-point velocity correlation function measured by Davies and coworkers. The parameters of the correlation are related to the parameters of the k-epsilon turbulence model, so the theory is self-contained. Extensive comparisons between the computed noise spectrum of the theory and experimental measurements have been carried out. The parameters include jet Mach number from 0.3 to 2.0 and temperature ratio from 1.0 to 4.8. Excellent agreement is found in spectrum shape, noise intensity, and directivity. It is envisaged that the theory would supersede all semi-empirical and totally empirical jet noise prediction methods in current use.

  12. Simple, empirical approach to predict neutron capture cross sections from nuclear masses

    NASA Astrophysics Data System (ADS)

    Couture, A.; Casten, R. F.; Cakirli, R. B.

    2017-12-01

    Background: Neutron capture cross sections are essential to understanding the astrophysical s and r processes, to the modeling of nuclear reactor design and performance, and to a wide variety of nuclear forensics applications. Often, cross sections are needed for nuclei where experimental measurements are difficult. Enormous effort, over many decades, has gone into developing sophisticated statistical reaction models to predict these cross sections. Such work has met with some success but is often unable to reproduce measured cross sections to better than 40%, and has limited predictive power, with predictions from different models rapidly differing by an order of magnitude a few nucleons from the last measurement. Purpose: To develop a new approach to predicting neutron capture cross sections over broad ranges of nuclei that accounts for their values where known and has reliable predictive power, with small uncertainties, for many nuclei where they are unknown. Methods: Experimental neutron capture cross sections were compared to empirical mass observables in regions of similar structure. Results: We present an extremely simple method, based solely on empirical mass observables, that correlates neutron capture cross sections in the critical energy range from a few keV to a couple hundred keV. We show that regional cross sections in medium and heavy mass nuclei are compactly correlated with the two-neutron separation energy. These correlations are easily amenable to predicting unknown cross sections, often converting the usual extrapolations into more reliable interpolations. The method almost always reproduces existing data to within 25%, and estimated uncertainties are below about 40% up to 10 nucleons beyond known data. Conclusions: Neutron capture cross sections display a surprisingly strong connection to the two-neutron separation energy, a nuclear structure property. The simple, empirical correlations uncovered provide model-independent predictions of neutron capture cross sections, extending far from stability, including for nuclei of the highest sensitivity to r-process nucleosynthesis.

  13. Empirical Bayes method for reducing false discovery rates of correlation matrices with block diagonal structure.

    PubMed

    Pacini, Clare; Ajioka, James W; Micklem, Gos

    2017-04-12

    Correlation matrices are important in inferring relationships and networks between regulatory or signalling elements in biological systems. With currently available technology, sample sizes for experiments are typically small, meaning that these correlations can be difficult to estimate. At a genome-wide scale, estimation of correlation matrices can also be computationally demanding. We develop an empirical Bayes approach to improve covariance estimates for gene expression, where we assume the covariance matrix takes a block diagonal form. Our method shows lower false discovery rates than existing methods on simulated data. Applied to a real data set from Bacillus subtilis, we demonstrate its ability to detect known regulatory units and interactions between them. We demonstrate that, compared to existing methods, our method is able to find significant covariances and also to control false discovery rates, even when the sample size is small (n = 10). The method can be used to find potential regulatory networks, and it may also be used as a pre-processing step for methods that calculate, for example, partial correlations, thereby enabling the inference of the causal and hierarchical structure of the networks.
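    This is not the authors' algorithm, but a minimal sketch of the shrinkage idea such empirical Bayes estimators build on: pull a noisy sample correlation matrix toward a structured (here block-diagonal) target, which stabilizes estimates when the sample size is small. The block assignment and shrinkage weight below are invented for illustration.

```python
# Linear shrinkage of a small-sample correlation matrix toward a
# block-diagonal target. Off-block entries are pulled toward zero;
# within-block entries and the unit diagonal are preserved.
import numpy as np

rng = np.random.default_rng(1)
n, p = 10, 6                       # few samples, echoing the paper's n = 10 case
X = rng.standard_normal((n, p))
R = np.corrcoef(X, rowvar=False)   # noisy sample correlation

blocks = [0, 0, 0, 1, 1, 1]        # hypothetical block membership
T = np.array([[1.0 if i == j else
               (R[i, j] if blocks[i] == blocks[j] else 0.0)
               for j in range(p)] for i in range(p)])

lam = 0.5                          # shrinkage intensity (in practice, estimated)
R_shrunk = lam * T + (1 - lam) * R

print(np.round(R_shrunk, 2))       # off-block entries halved, diagonal still 1
```

    In a full empirical Bayes treatment the weight `lam` would come from the data (e.g., a variance/bias trade-off), rather than being fixed by hand.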

  14. Prediction of Partition Coefficients of Organic Compounds between SPME/PDMS and Aqueous Solution

    PubMed Central

    Chao, Keh-Ping; Lu, Yu-Ting; Yang, Hsiu-Wen

    2014-01-01

    Polydimethylsiloxane (PDMS) is commonly used as the coating polymer in the solid phase microextraction (SPME) technique. In this study, the partition coefficients of organic compounds between SPME/PDMS and aqueous solution were compiled from literature sources. A correlation analysis of the partition coefficients was conducted to interpret the effect of physicochemical properties and descriptors on the partitioning process. The PDMS-water partition coefficients were significantly correlated with the polarizability of the organic compounds (r = 0.977, p < 0.05). An empirical model, consisting of the polarizability, the molecular connectivity index, and an indicator variable, was developed to predict the partition coefficients of 61 organic compounds in the training set. The predictive ability of the empirical model was demonstrated on a test set of 26 chemicals not included in the training set. The empirical model, which applies straightforwardly calculated molecular descriptors to estimate the PDMS-water partition coefficient, will contribute to the practical applications of the SPME technique. PMID:24534804
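    A hedged sketch of the kind of empirical model described: regress a partition coefficient on polarizability, a connectivity index, and an indicator variable by least squares. The descriptor values and "true" coefficients below are synthetic, not the paper's data.

```python
# Fit log K(PDMS-water) ~ polarizability + connectivity + indicator + const
# on synthetic data, then recover the coefficients with ordinary least squares.
import numpy as np

rng = np.random.default_rng(7)
n = 61                                     # same size as the paper's training set
polarizability = rng.uniform(5, 25, n)     # invented descriptor values
connectivity = rng.uniform(1, 6, n)
indicator = rng.integers(0, 2, n).astype(float)

# invented "true" coefficients used only to generate synthetic responses
logK = (0.12 * polarizability + 0.3 * connectivity - 0.5 * indicator
        + rng.normal(0.0, 0.05, n))

X = np.column_stack([polarizability, connectivity, indicator, np.ones(n)])
coef, *_ = np.linalg.lstsq(X, logK, rcond=None)
print("fitted coefficients:", np.round(coef, 3))
```

    Validating such a fit on a held-out test set, as the abstract describes, guards against the regression merely memorizing the training compounds.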

  15. Fine structure of spectral properties for random correlation matrices: An application to financial markets

    NASA Astrophysics Data System (ADS)

    Livan, Giacomo; Alfarano, Simone; Scalas, Enrico

    2011-07-01

    We study some properties of eigenvalue spectra of financial correlation matrices. In particular, we investigate the nature of the large eigenvalue bulks which are observed empirically, and which have often been regarded as a consequence of the supposedly large amount of noise contained in financial data. We challenge this common knowledge by acting on the empirical correlation matrices of two data sets with a filtering procedure which highlights some of the cluster structure they contain, and we analyze the consequences of such filtering on eigenvalue spectra. We show that empirically observed eigenvalue bulks emerge as superpositions of smaller structures, which in turn emerge as a consequence of cross correlations between stocks. We interpret and corroborate these findings in terms of factor models, and we compare empirical spectra to those predicted by random matrix theory for such models.

  16. Developing Reading Identities: Understanding Issues of Motivation within the Reading Workshop

    ERIC Educational Resources Information Center

    Miller, Leigh Ann

    2013-01-01

    Empirical evidence suggests a correlation between motivation and reading achievement as well as a decline in motivation as students progress through the grades. In order to address this issue, it is necessary to determine the instructional methods that promote motivation and identity development in reading. This study examines the motivation and…

  17. Consequence Assessment Methods for Incidents Involving Releases From Liquefied Natural Gas Carriers

    DTIC Science & Technology

    2004-05-13

    the downwind direction. The Thomas (1965) correlation is used to calculate flame length. Flame tilt is estimated using an empirical correlation from... follows: From TNO (1997) • Thomas (1963) correlation for flame length • For an experimental LNG pool fire of 16.8-m diameter, a mass burning flux of... m, flame length ranged from 50 to 78 m, and tilt angle from 27 to 35 degrees From Rew (1996) • Work included a review of recent developments in

  18. AN EMPIRICAL INVESTIGATION OF THE EFFECTS OF NONNORMALITY UPON THE SAMPLING DISTRIBUTION OF THE PRODUCT MOMENT CORRELATION COEFFICIENT.

    ERIC Educational Resources Information Center

    HJELM, HOWARD; NORRIS, RAYMOND C.

    The study empirically determined the effects of nonnormality upon some sampling distributions of the product moment correlation coefficient (PMCC). Sampling distributions of the PMCC were obtained by drawing numerous samples from control and experimental populations having various degrees of nonnormality and by calculating correlation coefficients…

  19. Evaluation of Phytoavailability of Heavy Metals to Chinese Cabbage (Brassica chinensis L.) in Rural Soils

    PubMed Central

    Hseu, Zeng-Yei; Zehetner, Franz

    2014-01-01

    This study compared the extractability of Cd, Cu, Ni, Pb, and Zn by 8 extraction protocols for 22 representative rural soils in Taiwan and correlated the extractable amounts of the metals with their uptake by Chinese cabbage for developing an empirical model to predict metal phytoavailability based on soil properties. Chemical agents in these protocols included dilute acids, neutral salts, and chelating agents, in addition to water and the Rhizon soil solution sampler. The highest concentrations of extractable metals were observed in the HCl extraction and the lowest in the Rhizon sampling method. The linear correlation coefficients between extractable metals in soil pools and metals in shoots were higher than those in roots. Correlations between extractable metal concentrations and soil properties were variable; soil pH, clay content, total metal content, and extractable metal concentration were considered together to simulate their combined effects on crop uptake by an empirical model. This combination improved the correlations to different extents for different extraction methods, particularly for Pb, for which the extractable amounts with any extraction protocol did not correlate with crop uptake by simple correlation analysis. PMID:25295297

  20. Cultural differences and economic development of 31 countries.

    PubMed

    Nadler, Scott; Zemanek, James E

    2006-08-01

    To update and extend the empirical research of Hofstede, the influence of culture on 31 nations' economic development was examined and support for modernization theory provided. Per capita gross domestic product, literacy rates, the negative of the population growth rate, and life expectancy development data were collected from 31 countries. The pattern of correlations among measures provided partial support for Hofstede's 1980 findings.

  1. Evaluation of Advanced Stirling Convertor Net Heat Input Correlation Methods Using a Thermal Standard

    NASA Technical Reports Server (NTRS)

    Briggs, Maxwell H.; Schifer, Nicholas A.

    2012-01-01

    The U.S. Department of Energy (DOE) and Lockheed Martin Space Systems Company (LMSSC) have been developing the Advanced Stirling Radioisotope Generator (ASRG) for use as a power system for space science missions. This generator would use two high-efficiency Advanced Stirling Convertors (ASCs), developed by Sunpower Inc. and NASA Glenn Research Center (GRC). The ASCs convert thermal energy from a radioisotope heat source into electricity. As part of ground testing of these ASCs, different operating conditions are used to simulate expected mission conditions. These conditions require achieving a particular operating frequency, hot end and cold end temperatures, and specified electrical power output for a given net heat input. In an effort to improve net heat input predictions, numerous tasks have been performed which provided a more accurate value for net heat input into the ASCs, including testing validation hardware, known as the Thermal Standard, to provide a direct comparison to numerical and empirical models used to predict convertor net heat input. This validation hardware provided a comparison for scrutinizing and improving empirical correlations and numerical models of ASC-E2 net heat input. This hardware simulated the characteristics of an ASC-E2 convertor in both an operating and non-operating mode. This paper describes the Thermal Standard testing and the conclusions of the validation effort applied to the empirical correlation methods used by the Radioisotope Power System (RPS) team at NASA Glenn.

  2. Development of an Empirical Method for Predicting Jet Mixing Noise of Cold Flow Rectangular Jets

    NASA Technical Reports Server (NTRS)

    Russell, James W.

    1999-01-01

    This report presents an empirical method for predicting the jet mixing noise levels of cold flow rectangular jets. The report presents a detailed analysis of the methodology used in development of the prediction method. The empirical correlations used are based on narrow band acoustic data for cold flow rectangular model nozzle tests conducted in the NASA Langley Jet Noise Laboratory. There were 20 separate nozzle test operating conditions. For each operating condition 60 Hz bandwidth microphone measurements were made over a frequency range from 0 to 60,000 Hz. Measurements were performed at 16 polar directivity angles ranging from 45 degrees to 157.5 degrees. At each polar directivity angle, measurements were made at 9 azimuth directivity angles. The report shows the methods employed to remove screech tones and shock noise from the data in order to obtain the jet mixing noise component. The jet mixing noise was defined in terms of one third octave band spectral content, polar and azimuth directivity, and overall power level. Empirical correlations were performed over the range of test conditions to define each of these jet mixing noise parameters as a function of aspect ratio, jet velocity, and polar and azimuth directivity angles. The report presents the method for predicting the overall power level, the average polar directivity, the azimuth directivity and the location and shape of the spectra for jet mixing noise of cold flow rectangular jets.

  3. Development and Validation of the Five-by-Five Resilience Scale.

    PubMed

    DeSimone, Justin A; Harms, P D; Vanhove, Adam J; Herian, Mitchel N

    2017-09-01

    This article introduces a new measure of resilience and five related protective factors. The Five-by-Five Resilience Scale (5×5RS) is developed on the basis of theoretical and empirical considerations. Two samples (N = 475 and N = 613) are used to assess the factor structure, reliability, convergent validity, and criterion-related validity of the 5×5RS. Confirmatory factor analysis supports a bifactor model. The 5×5RS demonstrates adequate internal consistency as evidenced by Cronbach's alpha and empirical reliability estimates. The 5×5RS correlates positively with the Connor-Davidson Resilience Scale (CD-RISC), a commonly used measure of resilience. The 5×5RS exhibits similar criterion-related validity to the CD-RISC as evidenced by positive correlations with satisfaction with life, meaning in life, and secure attachment style as well as negative correlations with rumination and anxious or avoidant attachment styles. 5×5RS scores are positively correlated with healthy behaviors such as exercise and negatively correlated with sleep difficulty and symptomology of anxiety and depression. The 5×5RS incrementally explains variance in some criteria above and beyond the CD-RISC. Item responses are modeled using the graded response model. Information estimates demonstrate the ability of the 5×5RS to assess individuals within at least one standard deviation of the mean on relevant latent traits.

  4. Differentiation of Self and Spirituality: Empirical Explorations

    ERIC Educational Resources Information Center

    Jankowski, Peter J.; Vaughn, Marsha

    2009-01-01

    This study explored the relationships between an individual's interpersonal functioning, perceived spirituality, and selected spiritual practices. Using Bowen's family systems theory, the authors proposed that an individual's level of spiritual development and level of differentiation are correlated and that certain spiritual practices are…

  5. Ignition behavior of live California chaparral leaves

    Treesearch

    J.D. Engstrom; J.K Butler; S.G. Smith; L.L. Baxter; T.H. Fletcher; D.R. Weise

    2004-01-01

    Current forest fire models are largely empirical correlations based on data from beds of dead vegetation. Improvement in model capabilities is sought by developing models of the combustion of live fuels. A facility was developed to determine the combustion behavior of small samples of live fuels, consisting of a flat-flame burner on a movable platform. Qualitative and...

  6. Estimated correlation matrices and portfolio optimization

    NASA Astrophysics Data System (ADS)

    Pafka, Szilárd; Kondor, Imre

    2004-11-01

    Correlations of returns on various assets play a central role in financial theory and also in many practical applications. From a theoretical point of view, the main interest lies in the proper description of the structure and dynamics of correlations, whereas for the practitioner the emphasis is on the ability of the models to provide adequate inputs for the numerous portfolio and risk management procedures used in the financial industry. The theory of portfolios, initiated by Markowitz, has suffered from the “curse of dimensions” from the very outset. Over the past decades a large number of different techniques have been developed to tackle this problem and reduce the effective dimension of large bank portfolios, but the efficiency and reliability of these procedures are extremely hard to assess or compare. In this paper, we propose a model (simulation)-based approach which can be used for the systematic testing of all these dimensional reduction techniques. To illustrate the usefulness of our framework, we develop several toy models that display some of the main characteristic features of empirical correlations and generate artificial time series from them. Then, we regard these time series as empirical data and reconstruct the corresponding correlation matrices, which will inevitably contain a certain amount of noise due to the finiteness of the time series. Next, we apply several correlation matrix estimators and dimension reduction techniques introduced in the literature and/or applied in practice. Since in our artificial world the only source of error is the finite length of the time series, and the “true” model, hence also the “true” correlation matrix, is precisely known, we can, in sharp contrast with empirical studies, precisely compare the performance of the various noise reduction techniques. One of our recurrent observations is that the recently introduced filtering technique based on random matrix theory performs consistently well in all the investigated cases. Based on this experience, we believe that our simulation-based approach can also be useful for the systematic investigation of several related problems of current interest in finance.
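    A sketch, on invented data rather than the paper's, of the random-matrix-theory filtering the authors find to perform well: eigenvalues of an empirical correlation matrix that fall below the Marchenko-Pastur upper edge are treated as noise and flattened to their average, keeping the trace fixed.

```python
# Marchenko-Pastur eigenvalue clipping of an empirical correlation matrix.
import numpy as np

rng = np.random.default_rng(42)
T, N = 500, 50                        # time series length, number of assets
returns = rng.standard_normal((T, N)) # pure-noise "returns" for illustration
C = np.corrcoef(returns, rowvar=False)

q = N / T
lambda_max = (1 + np.sqrt(q)) ** 2    # Marchenko-Pastur upper edge for noise

vals, vecs = np.linalg.eigh(C)
noise = vals < lambda_max
vals_filtered = vals.copy()
vals_filtered[noise] = vals[noise].mean()   # flatten the noise bulk (trace preserved)

C_filtered = vecs @ np.diag(vals_filtered) @ vecs.T
np.fill_diagonal(C_filtered, 1.0)

print(f"eigenvalues flagged as noise: {noise.sum()} of {N}")
```

    On real returns, eigenvalues above the edge (the market mode and sector factors) would survive the filter, while the bulk is averaged away.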

  7. Induced Innovation and Social Inequality: Evidence from Infant Medical Care.

    PubMed

    Cutler, David M; Meara, Ellen; Richards-Shubik, Seth

    2012-01-01

    We develop a model of induced innovation that applies to medical research. Our model yields three empirical predictions. First, initial death rates and subsequent research effort should be positively correlated. Second, research effort should be associated with more rapid mortality declines. Third, as a byproduct of targeting the most common conditions in the population as a whole, induced innovation leads to growth in mortality disparities between minority and majority groups. Using information on infant deaths in the U.S. between 1983 and 1998, we find support for all three empirical predictions.

  8. Non-empirical exchange-correlation parameterizations based on exact conditions from correlated orbital theory.

    PubMed

    Haiduke, Roberto Luiz A; Bartlett, Rodney J

    2018-05-14

    Some of the exact conditions provided by the correlated orbital theory are employed to propose new non-empirical parameterizations for exchange-correlation functionals from Density Functional Theory (DFT). This reparameterization process is based on range-separated functionals with 100% exact exchange for long-range interelectronic interactions. The functionals developed here, CAM-QTP-02 and LC-QTP, show mitigated self-interaction error, correctly predict vertical ionization potentials as the negative of eigenvalues for occupied orbitals, and provide nice excitation energies, even for challenging charge-transfer excited states. Moreover, some improvements are observed for reaction barrier heights with respect to the other functionals belonging to the quantum theory project (QTP) family. Finally, the most important achievement of these new functionals is an excellent description of vertical electron affinities (EAs) of atoms and molecules as the negative of appropriate virtual orbital eigenvalues. In this case, the mean absolute deviations for EAs in molecules are smaller than 0.10 eV, showing that physical interpretation can indeed be ascribed to some unoccupied orbitals from DFT.

  9. Non-empirical exchange-correlation parameterizations based on exact conditions from correlated orbital theory

    NASA Astrophysics Data System (ADS)

    Haiduke, Roberto Luiz A.; Bartlett, Rodney J.

    2018-05-01

    Some of the exact conditions provided by the correlated orbital theory are employed to propose new non-empirical parameterizations for exchange-correlation functionals from Density Functional Theory (DFT). This reparameterization process is based on range-separated functionals with 100% exact exchange for long-range interelectronic interactions. The functionals developed here, CAM-QTP-02 and LC-QTP, show mitigated self-interaction error, correctly predict vertical ionization potentials as the negative of eigenvalues for occupied orbitals, and provide nice excitation energies, even for challenging charge-transfer excited states. Moreover, some improvements are observed for reaction barrier heights with respect to the other functionals belonging to the quantum theory project (QTP) family. Finally, the most important achievement of these new functionals is an excellent description of vertical electron affinities (EAs) of atoms and molecules as the negative of appropriate virtual orbital eigenvalues. In this case, the mean absolute deviations for EAs in molecules are smaller than 0.10 eV, showing that physical interpretation can indeed be ascribed to some unoccupied orbitals from DFT.

  10. An Analytical-Numerical Model for Two-Phase Slug Flow through a Sudden Area Change in Microchannels

    DOE PAGES

    Momen, A. Mehdizadeh; Sherif, S. A.; Lear, W. E.

    2016-01-01

    In this article, two new analytical models have been developed to calculate two-phase slug flow pressure drop in microchannels through a sudden contraction. Even though many studies have been reported on two-phase flow in microchannels, considerable discrepancies still exist, mainly due to the difficulties in experimental setup and measurements. Numerical simulations were performed to support the new analytical models and to explore in more detail the physics of the flow in microchannels with a sudden contraction. Both analytical and numerical results were compared to the available experimental data and other empirical correlations. Results show that the models, which were developed based on the slug and semi-slug assumptions, agree well with experiments in microchannels. Moreover, in contrast to previous empirical correlations, which were tuned for a specific geometry, the new analytical models are capable of taking geometrical parameters as well as flow conditions into account.

  11. Parametric Study to Determine the Effect of Temperature on Oil Solidifier Performance and the Development of a New Empirical Correlation for Predicting Effectiveness

    EPA Science Inventory

    Temperature can play a significant role in the efficacy of solidifiers in removing oil slicks on water. We studied and quantified the effect of temperature on the performance of several solidifiers using 5 different types of oils under a newly developed testing protocol by condu...

  12. Parametric Study to Determine the Effect of Temperature on Oil Solidifier Performance and the Development of a New Empirical Correlation for Predicting Effectiveness

    EPA Science Inventory

    Temperature can play a significant role in the efficacy of solidifiers in removing oil slicks on water. We studied and quantified the effect of temperature on the performance of several solidifiers using 5 different types of oils under a newly developed testing protocol by conduc...

  13. Assessing the Basic Traits Associated with Psychopathy: Development and Validation of the Elemental Psychopathy Assessment

    ERIC Educational Resources Information Center

    Lynam, Donald R.; Gaughan, Eric T.; Miller, Joshua D.; Miller, Drew J.; Mullins-Sweatt, Stephanie; Widiger, Thomas A.

    2011-01-01

    A new self-report assessment of the basic traits of psychopathy was developed with a general trait model of personality (five-factor model [FFM]) as a framework. Scales were written to assess maladaptive variants of the 18 FFM traits that are robustly related to psychopathy across a variety of perspectives including empirical correlations, expert…

  14. Empirical source strength correlations for rans-based acoustic analogy methods

    NASA Astrophysics Data System (ADS)

    Kube-McDowell, Matthew Tyndall

    JeNo is a jet noise prediction code based on an acoustic analogy method developed by Mani, Gliebe, Balsa, and Khavaran. Using the flow predictions from a standard Reynolds-averaged Navier-Stokes computational fluid dynamics solver, JeNo predicts the overall sound pressure level and angular spectra for high-speed hot jets over a range of observer angles, with a processing time suitable for rapid design purposes. JeNo models the noise from hot jets as a combination of two types of noise sources: quadrupole sources dependent on velocity fluctuations, which represent the major noise of turbulent mixing, and dipole sources dependent on enthalpy fluctuations, which represent the effects of thermal variation. These two sources are modeled by JeNo as propagating independently into the far field, with no cross-correlation at the observer location. However, high-fidelity computational fluid dynamics solutions demonstrate that this assumption is false. In this thesis, the theory, assumptions, and limitations of the JeNo code are briefly discussed, and a modification to the acoustic analogy method is proposed in which the cross-correlation of the two primary noise sources is allowed to vary with the speed of the jet and the observer location. As a proof-of-concept implementation, an empirical correlation correction function is derived from comparisons between JeNo's noise predictions and a set of experimental measurements taken for the Air Force Aero-Propulsion Laboratory. The empirical correlation correction is then applied to JeNo's predictions for a separate data set of hot jets tested at NASA's Glenn Research Center. Metrics are derived to measure the qualitative and quantitative performance of JeNo's acoustic predictions, and the empirical correction is shown to provide a quantitative improvement in the noise prediction at low observer angles with no freestream flow, and a qualitative improvement in the presence of freestream flow. However, the results also demonstrate that there are underlying flaws in JeNo's ability to predict the behavior of a hot jet's acoustic signature at certain rear observer angles, and that this correlation correction is not able to correct these flaws.

  15. Estimating the volume of Alpine glacial lakes

    NASA Astrophysics Data System (ADS)

    Cook, S. J.; Quincey, D. J.

    2015-09-01

    Supraglacial, moraine-dammed and ice-dammed lakes represent a potential glacial lake outburst flood (GLOF) threat to downstream communities in many mountain regions. This has motivated the development of empirical relationships to predict lake volume given a measurement of lake surface area obtained from satellite imagery. Such relationships are based on the notion that lake depth, area and volume scale predictably. We critically evaluate the performance of these existing empirical relationships by examining a global database of measured glacial lake depths, areas and volumes. Results show that lake area and depth are not always well correlated (r2 = 0.38), and that although lake volume and area are well correlated (r2 = 0.91), there are distinct outliers in the dataset. These outliers represent situations where it may not be appropriate to apply existing empirical relationships to predict lake volume, and include growing supraglacial lakes, glaciers that recede into basins with complex overdeepened morphologies or that have been deepened by intense erosion, and lakes formed where glaciers advance across and block a main trunk valley. We use the compiled dataset to develop a conceptual model of how the volumes of supraglacial ponds and lakes, moraine-dammed lakes and ice-dammed lakes should be expected to evolve with increasing area. Although a large amount of bathymetric data exist for moraine-dammed and ice-dammed lakes, we suggest that further measurements of growing supraglacial ponds and lakes are needed to better understand their development.
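    A sketch of how such empirical volume-area relationships are typically built: fit V = k * A**b by linear regression in log-log space. The lake areas and volumes below are synthetic, not drawn from the authors' database.

```python
# Power-law fit of lake volume against surface area in log-log space.
import math

# synthetic (area km^2, volume km^3) pairs generated from V = 0.04 * A**1.5
data = [(0.1, 0.00126), (0.5, 0.0141), (1.0, 0.040), (2.0, 0.113), (5.0, 0.447)]

xs = [math.log(a) for a, _ in data]
ys = [math.log(v) for _, v in data]
n = len(data)
mx, my = sum(xs) / n, sum(ys) / n

# ordinary least squares on log V = log k + b * log A
b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
k = math.exp(my - b * mx)

print(f"fitted V = {k:.3f} * A^{b:.2f}")
```

    The outliers the abstract describes are exactly the lakes for which a single fitted (k, b) pair of this kind breaks down, which is why the authors argue for separate treatment of supraglacial, moraine-dammed, and ice-dammed lakes.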

  16. Characterization of subgrade resilient modulus for Virginia soils and its correlation with the results of other soil tests.

    DOT National Transportation Integrated Search

    2008-01-01

    In 2004, the Guide for the Mechanistic-Empirical Design of New & Rehabilitated Pavement Structures (MEPDG) was developed under NCHRP Project 1-37A to replace the currently used 1993 Guide for Design of Pavement Structures by the American Association ...

  17. Selected Sports Bras: Overall Comfort and Support.

    ERIC Educational Resources Information Center

    Lawson, LaJean; Lorentzen, Deana

    This study evaluated currently marketed sports bras on subjective measures of comfort and support both within an entire group of women and within cup sizes, correlated the subjective measures of comfort and support with previously reported biomechanical findings of support on the same bras, and further developed empirically based guidelines for…

  18. Bicultural Self-Efficacy among College Students: Initial Scale Development and Mental Health Correlates

    ERIC Educational Resources Information Center

    David, E. J. R.; Okazaki, Sumie; Saw, Anne

    2009-01-01

    Theory and empirical research suggest that perceived self-efficacy, or one's perceived ability to perform personally significant tasks, is related to individuals' psychological well-being and mental health. Thus, the authors hypothesized that bicultural individuals' perceived ability to function competently in 2 cultures, or perceived bicultural…

  19. Process Correlation Analysis Model for Process Improvement Identification

    PubMed Central

    Park, Sooyong

    2014-01-01

Software process improvement aims at improving the development process of software systems. It is initiated by process assessment, which identifies strengths and weaknesses; based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used as the base throughout software process improvement. CMMI defines a set of process areas involved in software development and what is to be carried out in those areas in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of the software development process. However, in current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to the significant effort involved and the lack of required expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data on improvement practices. We evaluate the model using industrial data. PMID:24977170

  20. Process correlation analysis model for process improvement identification.

    PubMed

    Choi, Su-jin; Kim, Dae-Kyoo; Park, Sooyong

    2014-01-01

Software process improvement aims at improving the development process of software systems. It is initiated by process assessment, which identifies strengths and weaknesses; based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used as the base throughout software process improvement. CMMI defines a set of process areas involved in software development and what is to be carried out in those areas in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of the software development process. However, in current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to the significant effort involved and the lack of required expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data on improvement practices. We evaluate the model using industrial data.

  1. Roles of Engineering Correlations in Hypersonic Entry Boundary Layer Transition Prediction

    NASA Technical Reports Server (NTRS)

    Campbell, Charles H.; King, Rudolph A.; Kergerise, Michael A.; Berry, Scott A.; Horvath, Thomas J.

    2010-01-01

Efforts to design and operate hypersonic entry vehicles are constrained by many considerations that involve all aspects of an entry vehicle system. One of the more significant physical phenomena affecting entry trajectory and thermal protection system design is the occurrence of boundary layer transition from a laminar to a turbulent state. During the Space Shuttle Return To Flight activity following the loss of Columbia and her crew of seven, NASA's entry aerothermodynamics community implemented an engineering-correlation-based framework for the prediction of boundary layer transition on the Orbiter. The methodology for this implementation relies upon the framework of correlation techniques that have been in use for several decades. What makes the Orbiter boundary layer transition correlation implementation unique is that a statistically significant data set was acquired in multiple ground test facilities, flight data exist to assist in establishing a better correlation, and the framework was founded upon state-of-the-art chemical nonequilibrium Navier-Stokes flow field simulations. The basic tenets that guided the formulation and implementation of the Orbiter Return To Flight boundary layer transition prediction capability will be reviewed as a recommended format for future empirical correlation efforts. The validity of this approach has since been demonstrated by very favorable comparison with recent entry flight testing performed with the Orbiter Discovery, which will be graphically summarized. These flight data can provide a means to validate discrete-protuberance engineering correlation approaches, as well as high-fidelity prediction methods, to higher confidence. The results of these Orbiter engineering and flight test activities only serve to reinforce the essential role that engineering correlations currently exercise in the design and operation of entry vehicles. 
The framework of information related to the Orbiter empirical boundary layer transition prediction capability will be utilized to establish a fresh perspective on this role, to illustrate how quantitative statistical evaluations of empirical correlations can and should be used to assess accuracy, and to discuss what the authors perceive as a recent heightened interest in the application of high-fidelity numerical modeling of boundary layer transition. Concrete results will also be developed related to empirical boundary layer transition onset correlations. This will include assessment of the discrete-protuberance boundary layer transition onset data assembled for the Orbiter configuration during post-Columbia Return To Flight. Assessment of these data will conclude that momentum thickness Reynolds number based correlations have superior correlation coefficients and uncertainties in comparison to roughness height based Reynolds numbers, aka Re(sub k) or Re(sub kk). In addition, linear regression results from roughness height Reynolds number based correlations will be evaluated, leading to a hypothesis that non-continuum effects play a role in the processes associated with incipient boundary layer transition on discrete protuberances.

  2. Non-separable time preferences, novelty consumption and body weight: Theory and evidence from the East German transition to capitalism.

    PubMed

    Dragone, Davide; Ziebarth, Nicolas R

    2017-01-01

    This paper develops a dynamic model to illustrate how diet and body weight change when novel food products become available to consumers. We propose a microfounded test to empirically discriminate between habit and taste formation in intertemporal preferences. Moreover, we show that 'novelty consumption' and endogenous preferences can explain the persistent correlation between economic development and obesity. By empirically studying the German reunification, we find that East Germans consumed more novel Western food and gained more weight than West Germans when a larger variety of food products became readily accessible after the fall of the Wall. The observed consumption patterns suggest that food consumption features habit formation. Copyright © 2016 Elsevier B.V. All rights reserved.

  3. Unidimensional factor models imply weaker partial correlations than zero-order correlations.

    PubMed

    van Bork, Riet; Grasman, Raoul P P P; Waldorp, Lourens J

    2018-06-01

    In this paper we present a new implication of the unidimensional factor model. We prove that the partial correlation between two observed variables that load on one factor given any subset of other observed variables that load on this factor lies between zero and the zero-order correlation between these two observed variables. We implement this result in an empirical bootstrap test that rejects the unidimensional factor model when partial correlations are identified that are either stronger than the zero-order correlation or have a different sign than the zero-order correlation. We demonstrate the use of the test in an empirical data example with data consisting of fourteen items that measure extraversion.
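The bound the paper proves can be checked directly at the population level: under a single-factor model the implied zero-order correlations are r_ij = λ_i·λ_j, and the first-order partial correlation follows from the standard recursion formula. This is an illustrative sketch, not the authors' bootstrap test; the loadings are made up.

```python
import math

def partial_corr(r12, r13, r23):
    """First-order partial correlation of variables 1 and 2 given variable 3."""
    return (r12 - r13 * r23) / math.sqrt((1 - r13**2) * (1 - r23**2))

# Illustrative loadings for three indicators of one factor
lam = [0.8, 0.7, 0.6]
# Model-implied zero-order correlations: r_ij = lambda_i * lambda_j
r12, r13, r23 = lam[0] * lam[1], lam[0] * lam[2], lam[1] * lam[2]

p = partial_corr(r12, r13, r23)
# The unidimensional model squeezes the partial correlation between
# zero and the zero-order correlation (same sign, weaker magnitude):
assert 0 < p < r12
```

A partial correlation falling outside this interval in sample data (beyond bootstrap variability) is exactly what the proposed test treats as evidence against unidimensionality.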

  4. Hard-Rock Stability Analysis for Span Design in Entry-Type Excavations with Learning Classifiers

    PubMed Central

    García-Gonzalo, Esperanza; Fernández-Muñiz, Zulima; García Nieto, Paulino José; Bernardo Sánchez, Antonio; Menéndez Fernández, Marta

    2016-01-01

    The mining industry relies heavily on empirical analysis for design and prediction. An empirical design method, called the critical span graph, was developed specifically for rock stability analysis in entry-type excavations, based on an extensive case-history database of cut and fill mining in Canada. This empirical span design chart plots the critical span against rock mass rating for the observed case histories and has been accepted by many mining operations for the initial span design of cut and fill stopes. Different types of analysis have been used to classify the observed cases into stable, potentially unstable and unstable groups. The main purpose of this paper is to present a new method for defining rock stability areas of the critical span graph, which applies machine learning classifiers (support vector machine and extreme learning machine). The results show a reasonable correlation with previous guidelines. These machine learning methods are good tools for developing empirical methods, since they make no assumptions about the regression function. With this software, it is easy to add new field observations to a previous database, improving prediction output with the addition of data that consider the local conditions for each mine. PMID:28773653

  5. Hard-Rock Stability Analysis for Span Design in Entry-Type Excavations with Learning Classifiers.

    PubMed

    García-Gonzalo, Esperanza; Fernández-Muñiz, Zulima; García Nieto, Paulino José; Bernardo Sánchez, Antonio; Menéndez Fernández, Marta

    2016-06-29

    The mining industry relies heavily on empirical analysis for design and prediction. An empirical design method, called the critical span graph, was developed specifically for rock stability analysis in entry-type excavations, based on an extensive case-history database of cut and fill mining in Canada. This empirical span design chart plots the critical span against rock mass rating for the observed case histories and has been accepted by many mining operations for the initial span design of cut and fill stopes. Different types of analysis have been used to classify the observed cases into stable, potentially unstable and unstable groups. The main purpose of this paper is to present a new method for defining rock stability areas of the critical span graph, which applies machine learning classifiers (support vector machine and extreme learning machine). The results show a reasonable correlation with previous guidelines. These machine learning methods are good tools for developing empirical methods, since they make no assumptions about the regression function. With this software, it is easy to add new field observations to a previous database, improving prediction output with the addition of data that consider the local conditions for each mine.

  6. Learning-by-doing, population pressure, and the theory of demographic transition.

    PubMed

    Strulik, H

    1997-01-01

The long-term effects of two interdependent relations between economic growth and population growth are discussed. The empirical work of Boserup (1981) was utilized, which focused on rural, sparsely populated economies with low income per capita. According to the formulation of the population-push hypothesis, learning-by-doing effects in production lead to increasing returns to scale and, therefore, to a positive correlation between economic and population growth. In accordance with the theory of demographic transition, the population growth rate initially increases with rising income levels and then declines. The approach originating from Cigno (1984) modified the economic model, which allowed the establishment of two different stable equilibria. Regarding this relationship, the existence and stability of low-income and high-income equilibria were shown in a neoclassical growth model. Under plausible conditions a demo-economic transition from the first to the second steady-state took place. The instability of the Malthusian steady-state is also possible when a country develops along a path of economic growth which is compatible with the demographic transition. In this context, learning means the application of new techniques of agrarian production. In developed economies with a stable population, the learning-or-doing decision leads to accumulation of human capital and the invention of new technologies and goods. The interdependence of income-determined population growth and learning-by-doing may serve as an explanation for the weak and partly controversial empirical support for an overall correlation between income and population growth. The result yielded a meaningful interpretation of the population-push hypothesis, which is consistent with the empirical findings on the correlation between economic and population growth.

  7. Fear as a Predictor of Life Satisfaction in Retirement in Canada

    ERIC Educational Resources Information Center

    Nguyen, Satoko; Tirrito, Teresa S.; Barkley, William M.

    2014-01-01

    In developed countries, healthy retirees can fulfill their life, but may fear growing old. Yet, there is little empirical data on the relationship between this fear and life satisfaction. This cross-sectional, correlational survey study tested whether a new, summated measure of Fears About Growing Old (FAGO)--derived from exemplifications of…

  8. Simulation of Plant Physiological Process Using Fuzzy Variables

    Treesearch

    Daniel L. Schmoldt

    1991-01-01

    Qualitative modelling can help us understand and project effects of multiple stresses on trees. It is not practical to collect and correlate empirical data for all combinations of plant/environments and human/climate stresses, especially for mature trees in natural settings. Therefore, a mechanistic model was developed to describe ecophysiological processes. This model...

  9. Induced Innovation and Social Inequality: Evidence from Infant Medical Care

    ERIC Educational Resources Information Center

    Cutler, David M.; Meara, Ellen; Richards-Shubik, Seth

    2012-01-01

    We develop a model of induced innovation that applies to medical research. Our model yields three empirical predictions. First, initial death rates and subsequent research effort should be positively correlated. Second, research effort should be associated with more rapid mortality declines. Third, as a byproduct of targeting the most common…

  10. Clinical and Experimental Research Utilizing the DACL.

    ERIC Educational Resources Information Center

    Strickland, Bonnie R.

    Since the development of the Lubin Depression Adjective Check Lists (DACL) in 1965, researchers have used this instrument in many empirical and clinical studies. Scores on the DACL have correlated with other measures of depression and have also been related to personal characteristics of depressed individuals. The DACL has been used in studies to…

  11. The Philosophy, Theoretical Bases, and Implementation of the AHAAH Model for Evaluation of Hazard from Exposure to Intense Sounds

    DTIC Science & Technology

    2018-04-01

    empirical, external energy-damage correlation methods for evaluating hearing damage risk associated with impulsive noise exposure. AHAAH applies the...is validated against the measured results of human exposures to impulsive sounds, and unlike wholly empirical correlation approaches, AHAAH’s...a measured level (LAEQ8 of 85 dB). The approach in MIL-STD-1474E is very different. Previous standards tried to find a correlation between some

  12. Transport property correlations for the niobium-1% zirconium alloy

    NASA Astrophysics Data System (ADS)

    Senor, David J.; Thomas, J. Kelly; Peddicord, K. L.

    1990-10-01

Correlations were developed for the electrical resistivity (ρ), thermal conductivity (k), and hemispherical total emittance (ε) of niobium-1% zirconium as functions of temperature. All three correlations were developed as empirical fits to experimental data: ρ = 5.571 + 4.160×10⁻²·T − 4.192×10⁻⁶·T² μΩ·cm; k = 13.16·T^0.2149 W/(m·K); ε = 6.39×10⁻² + 4.98×10⁻⁵·T + 3.62×10⁻⁸·T² − 7.28×10⁻¹²·T³. The relative standard deviation of the electrical resistivity correlation is 1.72%, and it is valid over the temperature range 273 to 2700 K. The thermal conductivity correlation has a relative standard deviation of 3.24% and is valid over the temperature range 379 to 1421 K. The hemispherical total emittance correlation was developed for smooth-surface materials only and represents a conservative estimate of the emittance of the alloy for space reactor fuel element modeling applications. It has a relative standard deviation of 9.50% and is valid over the temperature range 755 to 2670 K.
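The three fits transcribe directly into code. The sketch below simply evaluates them with guards on the stated validity ranges; the function names are mine, not from the paper.

```python
def resistivity(T):
    """Electrical resistivity of Nb-1%Zr in micro-ohm-cm (valid 273-2700 K)."""
    assert 273 <= T <= 2700, "outside fitted range"
    return 5.571 + 4.160e-2 * T - 4.192e-6 * T**2

def conductivity(T):
    """Thermal conductivity in W/(m K) (valid 379-1421 K)."""
    assert 379 <= T <= 1421, "outside fitted range"
    return 13.16 * T**0.2149

def emittance(T):
    """Hemispherical total emittance, smooth surface (valid 755-2670 K)."""
    assert 755 <= T <= 2670, "outside fitted range"
    return 6.39e-2 + 4.98e-5 * T + 3.62e-8 * T**2 - 7.28e-12 * T**3

rho = resistivity(1000.0)   # micro-ohm-cm at 1000 K
k = conductivity(1000.0)    # W/(m K) at 1000 K
eps = emittance(1000.0)     # dimensionless at 1000 K
```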

  13. Cultural Validity of the Minnesota Multiphasic Personality Inventory-2 Empirical Correlates: Is This the Best We Can Do?

    ERIC Educational Resources Information Center

    Hill, Jill S.; Robbins, Rockey R.; Pace, Terry M.

    2012-01-01

    This article critically reviews empirical correlates of the Minnesota Multiphasic Personality Inventory-2 (MMPI-2; Butcher, Dahlstrom, Graham, Tellegen, & Kaemmer, 1989), based on several validation studies conducted with different racial, ethnic, and cultural groups. A major critique of the reviewed MMPI-2 studies was focused on the use of…

  14. Numerical development of a new correlation between biaxial fracture strain and material fracture toughness for small punch test

    NASA Astrophysics Data System (ADS)

    Kumar, Pradeep; Dutta, B. K.; Chattopadhyay, J.

    2017-04-01

Miniaturized specimens are used to determine mechanical properties of materials, such as yield stress, ultimate stress, and fracture toughness. Use of such specimens is essential whenever only a limited quantity of material is available for testing, as with aged/irradiated materials. The miniaturized small punch test (SPT) is a technique widely used to determine changes in the mechanical properties of materials. Various empirical correlations have been proposed in the literature to determine the value of fracture toughness (JIC) using this technique: biaxial fracture strain is determined from SPT tests, and this parameter is then used to determine JIC via the available empirical correlations. The correlations between JIC and biaxial fracture strain quoted in the literature are based on experimental data acquired for a large number of materials. There are a number of such correlations available, which are generally not in agreement with each other. In the present work, an attempt has been made to determine the correlation between biaxial fracture strain (εqf) and crack initiation toughness (Ji) numerically. About one hundred materials are digitally generated by varying yield stress, ultimate stress, hardening coefficient and Gurson parameters. Each such material is then used to analyze an SPT specimen and a standard TPB specimen. Analysis of the SPT specimen yields the biaxial fracture strain (εqf), and analysis of the TPB specimen yields the value of Ji. A graph is then plotted between these two parameters for all the digitally generated materials, and the best-fit straight line determines the correlation. It has also been observed that Ji can vary, within a limit, for the same value of biaxial fracture strain (εqf); such variation in the value of Ji has also been ascertained using the graph. Experimental SPT data acquired earlier for three materials were then used to obtain Ji with the newly developed correlation. 
A reasonable comparison of calculated Ji with the values quoted in literature confirmed usefulness of the correlation.

  15. The Interest Checklist: a factor analysis.

    PubMed

    Klyczek, J P; Bauer-Yox, N; Fiedler, R C

    1997-01-01

    The purpose of this study was to determine whether the 80 items on the Interest Checklist empirically cluster into the five categories of interests described by Matsutsuyu, the developer of the tool. The Interest Checklist was administered to 367 subjects classified in three subgroups: students, working adults, and retired elderly persons. An 80-item correlation matrix was formed from the responses to the Interest Checklist for each subgroup and then used in a factor analysis model to identify the underlying structure or domains of interest. Results indicated that the Social Recreation theoretical category was empirically independent for all three subgroups; the Physical Sports and Cultural/Educational theoretical categories were empirically independent for only the college students and working adults; and the Manual Skills theoretical category was empirically independent for only the working adults. Although therapists should continue to be cautious in their interpretation of patients' Interest Checklist scores, the tool is useful for identifying patients' interests in order to choose meaningful activities for therapy.

  16. Dissolved Organic Carbon along the Louisiana coast from MODIS and MERIS satellite data

    NASA Astrophysics Data System (ADS)

    Chaichi Tehrani, N.; D'Sa, E. J.

    2012-12-01

Dissolved organic carbon (DOC) plays a critical role in the coastal and ocean carbon cycle. Hence, it is important to monitor and investigate its distribution and fate in coastal waters. Since DOC cannot be measured directly by satellite remote sensors, chromophoric dissolved organic matter (CDOM), an optically active fraction of DOC, can be used as a proxy to trace DOC concentrations. Here, satellite ocean color data from MODIS and MERIS, together with field measurements of CDOM and DOC, were used to develop and assess CDOM and DOC ocean color algorithms for coastal waters. To develop a CDOM retrieval algorithm, empirical relationships between the CDOM absorption coefficient at 412 nm (aCDOM(412)) and the reflectance ratios Rrs(488)/Rrs(555) for MODIS and Rrs(510)/Rrs(560) for MERIS were established. The performance of the two CDOM empirical algorithms was evaluated for retrieval of aCDOM(412) from MODIS and MERIS in the northern Gulf of Mexico. Further, empirical algorithms were developed to estimate DOC concentration using the relationship between in situ aCDOM(412) and DOC, as well as the newly developed CDOM empirical algorithms. Our results revealed that DOC concentration was strongly correlated with aCDOM(412) for the summer and spring-winter periods (r2 = 0.9 for both). Then, using the aCDOM(412)-Rrs and aCDOM(412)-DOC relationships derived from field measurements, a DOC-Rrs relationship was established for MODIS and MERIS data. The DOC empirical algorithms performed well, as indicated by match-up comparisons between satellite estimates and field data (R2 = 0.52 and 0.58 for MODIS and MERIS, respectively, for the summer period). These algorithms were then used to examine DOC distribution along the Louisiana coast.
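Band-ratio algorithms of this kind are commonly fit as power laws in log-log space. The sketch below shows only that fitting step, on noise-free synthetic data with made-up coefficients; the actual MODIS/MERIS coefficients are not given in the abstract.

```python
import numpy as np

# Hypothetical "true" power law: aCDOM(412) = A * (Rrs488/Rrs555)**B
A_true, B_true = 0.35, -1.5
ratio = np.linspace(0.4, 2.0, 30)   # synthetic band ratios
acdom = A_true * ratio ** B_true    # synthetic aCDOM(412) values, 1/m

# Fit in log-log space: ln(aCDOM) = ln(A) + B * ln(ratio)
B_fit, logA_fit = np.polyfit(np.log(ratio), np.log(acdom), 1)
A_fit = float(np.exp(logA_fit))
```

With real match-up data the same step would be preceded by quality control of the reflectances and followed by validation against held-out in situ samples.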

  17. A comparison of high-frequency cross-correlation measures

    NASA Astrophysics Data System (ADS)

    Precup, Ovidiu V.; Iori, Giulia

    2004-12-01

On a high-frequency scale the time series are not homogeneous, so standard correlation measures cannot be applied directly to the raw data. There are two ways to deal with this problem. The time series can be homogenized through an interpolation method, linear or previous-tick (An Introduction to High-Frequency Finance, Academic Press, NY, 2001), and the Pearson correlation statistic then computed. Alternatively, methods that can handle raw non-synchronous time series have recently been developed (Int. J. Theor. Appl. Finance 6(1) (2003) 87; J. Empirical Finance 4 (1997) 259). This paper compares two traditional methods that use interpolation with an alternative method applied directly to the actual time series.
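The interpolation route the paper compares can be sketched in a few lines: resample each irregular tick series onto a common grid with previous-tick interpolation, then compute the Pearson correlation of the gridded returns. Tick times and prices here are synthetic.

```python
import numpy as np

def previous_tick(times, prices, grid):
    """Sample each grid point with the most recent tick at or before it."""
    idx = np.searchsorted(times, grid, side="right") - 1
    return prices[np.clip(idx, 0, None)]

# Two non-synchronous tick series driven by the same underlying signal
ta = np.arange(0.0, 10.0, 0.37)
tb = np.arange(0.1, 10.0, 0.29)
pa = 2.0 + np.sin(ta)
pb = 2.0 + np.sin(tb)

grid = np.arange(0.5, 10.0, 0.5)           # homogeneous sampling grid
ra = np.diff(previous_tick(ta, pa, grid))  # returns on the grid
rb = np.diff(previous_tick(tb, pb, grid))
rho = float(np.corrcoef(ra, rb)[0, 1])     # Pearson correlation
```

The stale-price effect of previous-tick sampling is exactly what biases this estimate downward at fine grids, which is the motivation for the raw-data estimators the paper evaluates.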

  18. Public demand for preserving local open space.

    Treesearch

    Jeffrey D. Kline

    2006-01-01

Increased development results in the loss of forest, farm, range, and other open space lands that contribute to the quality of life of U.S. residents. I describe an economic rationale for growing public support for preserving local open space, based on the growing scarcity of open space lands. I test the rationale empirically by correlating the prevalence of open space...

  19. Prediction of Environmental Impact of High-Energy Materials with Atomistic Computer Simulations

    DTIC Science & Technology

    2010-11-01

    from a training set of compounds. Other methods include Quantitative Struc- ture-Activity Relationship ( QSAR ) and Quantitative Structure-Property...26 28 the development of QSPR/ QSAR models, in contrast to boiling points and critical parameters derived from empirical correlations, to improve...Quadratic Configuration Interaction Singles Doubles QSAR Quantitative Structure-Activity Relationship QSPR Quantitative Structure-Property

  20. A Qualitative Simulation Framework in Smalltalk Based on Fuzzy Arithmetic

    Treesearch

    Richard L. Olson; Daniel L. Schmoldt; David L. Peterson

    1996-01-01

    For many systems, it is not practical to collect and correlate empirical data necessary to formulate a mathematical model. However, it is often sufficient to predict qualitative dynamics effects (as opposed to system quantities), especially for research purposes. In this effort, an object-oriented application framework (AF) was developed for the qualitative modeling of...

  1. Lutheran Adolescent Spiritual Development: The Effect of School Attendance on Spiritual Transformation Inventory Test Scores

    ERIC Educational Resources Information Center

    Weider, Michael James

    2013-01-01

    Lutheran schools have been established to nurture and disciple children into the Christian faith. However, empirical evidence is lacking that Lutheran schools are accomplishing this goal. The purpose of this Causal comparative and Correlational study was to determine whether attendance at Lutheran or Public schools made a statistically significant…

  2. The Correlation of Human Capital on Costs of Air Force Acquisition Programs

    DTIC Science & Technology

    2009-03-01

6.78 so our model does not exhibit the presence of multi-collinearity. We empirically tested for heteroskedasticity using the Breusch-Pagan-Godfrey...inputs to outputs. The output in this study is the average cost overrun of Aeronautical Systems Center research, development, test, and evaluation...32 Pre-Estimation Specification Tests ............................................................................34 Post

  3. Long-range correlation and market segmentation in bond market

    NASA Astrophysics Data System (ADS)

    Wang, Zhongxing; Yan, Yan; Chen, Xiaosong

    2017-09-01

This paper investigates the long-range auto-correlations and cross-correlations in the bond market. Based on the Detrended Moving Average (DMA) method, empirical results present clear evidence of long-range persistence on a one-year scale. The degree of long-range correlation as a function of maturity has an upward tendency, with a peak at short maturities. These findings confirm the expectations of the fractal market hypothesis (FMH). Furthermore, we have developed a complex-network-based method to study the long-range cross-correlation structure, applied it to our data, and found a clear pattern of market segmentation in the long run. We also examined the nature of long-range correlation in the sub-periods 2007-2012 and 2011-2016. The results show that long-range auto-correlations have been decreasing in recent years while long-range cross-correlations have been strengthening.
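A compact sketch of the backward detrended moving average estimator that DMA-type analyses build on: the fluctuation F(n) around a moving average of window n scales as n^H, and the log-log slope estimates the Hurst-type exponent. The window set and test series here are illustrative, not the paper's data.

```python
import numpy as np

def dma_exponent(y, windows):
    """Estimate the DMA scaling exponent of a profile y."""
    logF = []
    for n in windows:
        ma = np.convolve(y, np.ones(n) / n, mode="valid")  # backward moving average
        resid = y[n - 1:] - ma                             # detrend against the MA
        logF.append(0.5 * np.log(np.mean(resid**2)))       # log F(n)
    # Slope of log F(n) versus log n gives the scaling exponent
    return float(np.polyfit(np.log(windows), logF, 1)[0])

rng = np.random.default_rng(0)
walk = np.cumsum(rng.standard_normal(4000))  # uncorrelated increments -> exponent near 0.5
h = dma_exponent(walk, [4, 8, 16, 32, 64])
```

An exponent significantly above 0.5, as the paper reports at the one-year scale, indicates long-range persistence.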

  4. Empirical correlations for axial dispersion coefficient and Peclet number in fixed-bed columns.

    PubMed

    Rastegar, Seyed Omid; Gu, Tingyue

    2017-03-24

In this work, a new correlation for the axial dispersion coefficient was obtained using experimental data in the literature for axial dispersion in fixed-bed columns packed with particles. The Chung and Wen correlation and the De Ligny correlation are two popular empirical correlations; however, the former lacks the molecular diffusion term and the latter does not consider bed voidage. The new axial dispersion coefficient correlation in this work was based on additional experimental data in the literature and considers both molecular diffusion and bed voidage, making it more comprehensive and accurate. The Peclet number correlation derived from the new axial dispersion coefficient correlation on average leads to 12% lower Peclet number values than the Chung and Wen correlation, and in many cases much lower values than the De Ligny correlation. Copyright © 2017 Elsevier B.V. All rights reserved.
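The quantities involved relate through a standard definition: a particle Peclet number Pe = u·d_p/D_ax (conventions vary between particle and column length scales). The sketch below uses a generic two-term form for D_ax (molecular diffusion plus mechanical dispersion) with placeholder coefficients as a stand-in for the paper's new fit, whose coefficients are not given in the abstract; all numerical values are illustrative.

```python
def axial_dispersion(u, d_p, D_m, g1=0.7, g2=0.5):
    """Generic two-term form: molecular diffusion + mechanical dispersion.
    g1, g2 are placeholder coefficients, not the paper's fitted values."""
    return g1 * D_m + g2 * u * d_p

def peclet(u, d_p, D_ax):
    """Particle Peclet number, Pe = u * d_p / D_ax (one common convention)."""
    return u * d_p / D_ax

u = 1.0e-3    # superficial velocity, m/s
d_p = 1.0e-4  # particle diameter, m
D_m = 1.0e-9  # molecular diffusivity, m^2/s

D_ax = axial_dispersion(u, d_p, D_m)
Pe = peclet(u, d_p, D_ax)
```

At these illustrative values the mechanical term dominates D_ax, which is why correlations that omit molecular diffusion (as noted for Chung and Wen) work tolerably at high velocity but degrade at low velocity.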

  5. An empirical model for prediction of geomagnetic storms using initially observed CME parameters at the Sun

    NASA Astrophysics Data System (ADS)

    Kim, R.-S.; Cho, K.-S.; Moon, Y.-J.; Dryer, M.; Lee, J.; Yi, Y.; Kim, K.-H.; Wang, H.; Park, Y.-D.; Kim, Yong Ha

    2010-12-01

In this study, we discuss the general behavior of geomagnetic storm strength in terms of observed coronal mass ejection (CME) parameters such as the speed (V) and earthward direction (D) of CMEs, as well as the longitude (L) and magnetic field orientation (M) of the overlying potential fields of the CME source region, and we develop an empirical model to predict geomagnetic storm occurrence and strength (gauged by the Dst index) from these CME parameters. For this we select 66 halo or partial-halo CMEs associated with M-class and X-class solar flares, with clearly identifiable source regions, from 1997 to 2003. After examining how each of these CME parameters correlates with the geoeffectiveness of the CMEs, we find the following: (1) parameter D best correlates with storm strength Dst; (2) the majority of geoeffective CMEs originated from solar longitude 15°W, and CMEs originating away from this longitude tend to produce weaker storms; (3) correlations between Dst and the CME parameters improve if CMEs are separated into two groups depending on whether their magnetic fields are oriented southward or northward in their source regions. Based on these observations, we present two empirical expressions for Dst in terms of L, V, and D for the two groups of CMEs, respectively. This is a new attempt to predict not only the occurrence of geomagnetic storms but also the storm strength (Dst) solely from CME parameters.

  6. Biomass viability: An experimental study and the development of an empirical mathematical model for submerged membrane bioreactor.

    PubMed

    Zuthi, M F R; Ngo, H H; Guo, W S; Nghiem, L D; Hai, F I; Xia, S Q; Zhang, Z Q; Li, J X

    2015-08-01

    This study investigates the influence of key biomass parameters on specific oxygen uptake rate (SOUR) in a sponge submerged membrane bioreactor (SSMBR) to develop mathematical models of biomass viability. Extra-cellular polymeric substances (EPS) were considered as a lumped parameter of bound EPS (bEPS) and soluble microbial products (SMP). Statistical analyses of experimental results indicate that the bEPS, SMP, mixed liquor suspended solids and volatile suspended solids (MLSS and MLVSS) have functional relationships with SOUR and their relative influence on SOUR was in the order of EPS>bEPS>SMP>MLVSS/MLSS. Based on correlations among biomass parameters and SOUR, two independent empirical models of biomass viability were developed. The models were validated using results of the SSMBR. However, further validation of the models for different operating conditions is suggested. Copyright © 2015 Elsevier Ltd. All rights reserved.

  7. Aeroacoustic Prediction Codes

    NASA Technical Reports Server (NTRS)

    Gliebe, P.; Mani, R.; Shin, H.; Mitchell, B.; Ashford, G.; Salamah, S.; Connell, S.; Huff, Dennis (Technical Monitor)

    2000-01-01

    This report describes work performed on Contract NAS3-27720, AoI 13, as part of the NASA Advanced Subsonic Transport (AST) Noise Reduction Technology effort. Computer codes were developed to provide quantitative prediction, design, and analysis capability for several aircraft engine noise sources. The objective was to provide improved, physics-based tools for exploring noise-reduction concepts and understanding experimental results. Methods and codes focused on fan broadband and 'buzz saw' noise and on low-emissions combustor noise, and complement work done by other contractors under the NASA AST program to develop methods and codes for fan harmonic tone noise and jet noise. The methods and codes developed and reported herein span a wide range of approaches, from strictly empirical to fully computational, with semi-empirical and analytical/computational methods in between. Emphasis was on capturing the essential physics while keeping each method or code practical as a design and analysis tool for everyday engineering use. Codes and prediction models were developed for: (1) an improved empirical correlation model for fan rotor exit flow mean and turbulence properties, for use in predicting broadband noise generated by rotor exit flow turbulence interacting with downstream stator vanes; (2) fan broadband noise models for rotor and stator/turbulence interaction sources, including 3D effects, noncompact-source effects, directivity modeling, and extensions to the rotor supersonic tip-speed regime; (3) fan multiple-pure-tone in-duct sound pressure prediction methodology based on computational fluid dynamics (CFD) analysis; and (4) low-emissions combustor prediction methodology and computer code based on CFD and actuator disk theory. In addition, the relative importance of dipole and quadrupole source mechanisms was studied using direct CFD source computation for a simple cascade/gust interaction problem, and an empirical combustor-noise correlation model was developed from engine acoustic test results. This work provided several insights on potential approaches to reducing aircraft engine noise. Code development is described in this report, and those insights are discussed.

  8. Hospital nurses' individual priorities, internal psychological states and work motivation.

    PubMed

    Toode, K; Routasalo, P; Helminen, M; Suominen, T

    2014-09-01

    This study describes the relationships between hospital nurses' individual priorities, internal psychological states and their work motivation. Connections between hospital nurses' work-related needs, values and work motivation are essential for providing safe and high-quality health care; however, empirical knowledge of these connections is insufficient for practice development. A cross-sectional empirical research study was undertaken. A total of 201 registered nurses from all types of Estonian hospitals filled out an electronic self-reported questionnaire. Descriptive statistics, Mann-Whitney, Kruskal-Wallis and Spearman's correlation were used for data analysis. Among individual priorities, higher-order need strength was negatively correlated with age and duration of service. Regarding nurses' internal psychological states, central hospital nurses experienced less meaningfulness of work. Nurses' individual priorities (i.e. their higher-order need strength and values shared with the organization) correlated with their work motivation. Their internal psychological states (i.e. their experienced meaningfulness of work, experienced responsibility for work outcomes and their knowledge of results) correlated with intrinsic work motivation. Nurses who prioritize their higher-order needs are more motivated to work. The more their own values are compatible with those of the organization, the more intrinsically motivated they are likely to be. Nurses' individual achievements, autonomy and training are key factors influencing their motivation to work. The small sample size and low response rate of the study limit the direct transferability of the findings to the wider nurse population, so further research is needed. This study highlights the importance of supporting nurses' professional development and self-determination in order to develop and retain motivated nurses. It also indicates a need to value both nurses and nursing in healthcare policy and management. © 2014 International Council of Nurses.

  9. Study of clad ballooning and rupture behaviour of Indian PHWR fuel pins under transient heating condition in steam environment

    NASA Astrophysics Data System (ADS)

    Sawarn, Tapan K.; Banerjee, Suparna; Sheelvantra, Smita S.; Singh, J. L.; Bhasin, Vivek

    2017-11-01

    This paper presents the results of an investigation into the deformation and rupture characteristics of Indian pressurized heavy water reactor (IPHWR) fuel pins under simulated loss of coolant accident (LOCA) conditions in a steam environment. Transient heating experiments were carried out on single fuel pins internally pressurized with argon gas in the range of 3-70 bar. The effect of internal pressure on burst temperature, and the influence of burst temperature on circumferential strain and rupture opening area, were also studied. Two circumferential strain maxima, at burst temperatures of 740 and ∼979 °C, and a minimum at a burst temperature of ∼868 °C were observed. It was found that oxidation had a considerable effect on the burst behavior. Test data were used to derive a direct empirical correlation for burst stress exclusively as a function of temperature. The ballooning and rupture behaviours in steam and argon environments were compared. Experimental data were examined against various correlations, including the Erbacher equation and the authors' previous correlation in argon. A second burst correlation has also been developed, combining the equation in argon from the authors' previous work with an exponential factor with oxygen content as a parameter, assuming the burst stress to be a function of both temperature and oxygen concentration. The burst temperatures predicted by this empirical correlation are in good agreement with the test data.

  10. Sparsity guided empirical wavelet transform for fault diagnosis of rolling element bearings

    NASA Astrophysics Data System (ADS)

    Wang, Dong; Zhao, Yang; Yi, Cai; Tsui, Kwok-Leung; Lin, Jianhui

    2018-02-01

    Rolling element bearings are widely used in various industrial machines, such as electric motors, generators, pumps, gearboxes, railway axles, turbines, and helicopter transmissions. Fault diagnosis of rolling element bearings is beneficial to preventing any unexpected accident and reducing economic loss. In the past years, many bearing fault detection methods have been developed. Recently, a new adaptive signal processing method called empirical wavelet transform attracts much attention from readers and engineers and its applications to bearing fault diagnosis have been reported. The main problem of empirical wavelet transform is that Fourier segments required in empirical wavelet transform are strongly dependent on the local maxima of the amplitudes of the Fourier spectrum of a signal, which connotes that Fourier segments are not always reliable and effective if the Fourier spectrum of the signal is complicated and overwhelmed by heavy noises and other strong vibration components. In this paper, sparsity guided empirical wavelet transform is proposed to automatically establish Fourier segments required in empirical wavelet transform for fault diagnosis of rolling element bearings. Industrial bearing fault signals caused by single and multiple railway axle bearing defects are used to verify the effectiveness of the proposed sparsity guided empirical wavelet transform. Results show that the proposed method can automatically discover Fourier segments required in empirical wavelet transform and reveal single and multiple railway axle bearing defects. Besides, some comparisons with three popular signal processing methods including ensemble empirical mode decomposition, the fast kurtogram and the fast spectral correlation are conducted to highlight the superiority of the proposed method.

  11. Effect on Gaseous Film Cooling of Coolant Injection Through Angled Slots and Normal Holes

    NASA Technical Reports Server (NTRS)

    Papell, S. Stephen

    1960-01-01

    A study was made to determine the effect of coolant injection angularity on gaseous film-cooling effectiveness. In the correlation of experimental data an effective injection angle was defined by a vector summation of the coolant and mainstream gas flows. The cosine of this angle was used as a parameter to empirically develop a corrective term to qualify a correlating equation presented in Technical Note D-130 that was limited to tangential injection of the coolant. Data were also obtained for coolant injection through rows of holes normal to the test plate. The slot correlating equation was adapted to fit these data by the definition of an effective slot height. An additional corrective term was then determined to correlate these data.
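
    The vector-summation idea described above can be sketched in a few lines. The function below is a hypothetical reconstruction for illustration only, not Papell's exact definition from TN D-130; the function name and the velocity values are assumptions.

```python
import math

def effective_injection_angle(v_coolant, angle_deg, v_mainstream):
    """Effective injection angle from a vector sum of the coolant and
    mainstream velocities (hypothetical reconstruction, for illustration).

    angle_deg: geometric injection angle measured from the wall, in degrees
    (0 = tangential injection, 90 = injection normal to the plate).
    """
    # Components of the summed velocity relative to the wall
    vx = v_coolant * math.cos(math.radians(angle_deg)) + v_mainstream
    vy = v_coolant * math.sin(math.radians(angle_deg))
    # Angle of the resulting velocity vector
    return math.degrees(math.atan2(vy, vx))

# The cosine of this effective angle would then serve as the correlating
# parameter; for tangential injection it reduces to 1 (no correction).
beta = effective_injection_angle(30.0, 90.0, 100.0)   # normal injection
cos_beta = math.cos(math.radians(beta))
```

    With a strong mainstream flow, even normal injection yields an effective angle close to tangential, which is consistent with using cos(angle) as a mild corrective term.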

  12. A semi-empirical model for the estimation of maximum horizontal displacement due to liquefaction-induced lateral spreading

    USGS Publications Warehouse

    Faris, Allison T.; Seed, Raymond B.; Kayen, Robert E.; Wu, Jiaer

    2006-01-01

    During the 1906 San Francisco Earthquake, liquefaction-induced lateral spreading and the resultant ground displacements damaged bridges, buried utilities and lifelines, conventional structures, and other developed works. This paper presents an improved engineering tool for predicting maximum displacement due to liquefaction-induced lateral spreading. A semi-empirical approach is employed, combining mechanistic understanding and data from laboratory testing with data and lessons from full-scale earthquake field case histories. The principle of strain potential index, based primarily on correlation of cyclic simple shear laboratory test results with in-situ Standard Penetration Test (SPT) results, is used as an index to characterize the deformation potential of soils after they liquefy. A Bayesian probabilistic approach is adopted for development of the final predictive model, in order to take fullest advantage of the available data and to deal with the uncertainties inherent in the back-analyses of field case histories. A case history from the 1906 San Francisco Earthquake is used to demonstrate the ability of the resulting semi-empirical model to estimate maximum horizontal displacement due to liquefaction-induced lateral spreading.

  13. Is the Critical Shields Stress for Incipient Sediment Motion Dependent on Bed Slope in Natural Channels? No.

    NASA Astrophysics Data System (ADS)

    Phillips, C. B.; Jerolmack, D. J.

    2017-12-01

    Understanding when coarse sediment begins to move in a river is essential for linking rivers to the evolution of mountainous landscapes. Unfortunately, the threshold of surface particle motion is notoriously difficult to measure in the field. However, recent studies have shown that the threshold of surface motion is empirically correlated with channel slope, a property that is easy to measure and readily available from the literature. These studies thoroughly examined the mechanistic underpinnings behind the observed correlation and produced suitably complex models. Because those models are difficult to implement for natural rivers using widely available data, others have treated the empirical regression between slope and the threshold of motion as a predictive model. We note that none of the authors of the original studies exploring this correlation suggested their empirical regressions be used in a predictive fashion; nevertheless, these regressions between slope and the threshold of motion have found their way into numerous recent studies, engendering potentially spurious conclusions. We demonstrate that there are two significant problems with using these empirical equations for prediction: (1) the empirical regressions are based on a limited sampling of the phase space of bed-load rivers, and (2) the empirical measurements of bankfull and critical shear stresses are paired. These problems limit the empirical relations' predictive capacity to field sites drawn from the same region of the bed-load river phase space, and the paired nature of the data introduces a spurious correlation when considering the ratio of bankfull to critical shear stress. Using a large compilation of bed-load river hydraulic geometry data, we demonstrate that the variation within independently measured values of the threshold of motion changes systematically with bankfull Shields stress, not channel slope. Additionally, using several recent datasets, we highlight the potential pitfalls one can encounter when using simplistic empirical regressions to predict the threshold of motion, showing that while these concerns may appear subtle, the resulting implications can be substantial.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xie, Wei; Lei, Wei-Hua; Wang, Ding-Xiong, E-mail: leiwh@hust.edu.cn

    Recently, two empirical correlations related to the minimum variability timescale (MTS) of the light curves have been discovered in gamma-ray bursts (GRBs). One is the anti-correlation between MTS and Lorentz factor Γ, and the other is the anti-correlation between the MTS and gamma-ray luminosity Lγ. Both correlations might be used to explore the activity of the central engine of GRBs. In this paper, we try to understand these empirical correlations by combining two popular black hole central engine models, namely the Blandford-Znajek (BZ) mechanism and the neutrino-dominated accretion flow (NDAF). By taking the MTS as the timescale of viscous instability of the NDAF, we find that these correlations favor the scenario in which the jet is driven by the BZ mechanism.

  15. Emirates Mars Ultraviolet Spectrometer's (EMUS) Prediction of Oxygen OI 135.6 nm and CO 4PG Emissions in the Martian Atmosphere

    NASA Astrophysics Data System (ADS)

    Almatroushi, H. R.; Lootah, F. H.; Deighan, J.; Fillingim, M. O.; Jain, S.; Bougher, S. W.; England, S.; Schneider, N. M.

    2017-12-01

    This research focuses on developing empirical and theoretical models for the OI 135.6 nm and CO fourth positive group (4PG) band system FUV dayglow emissions in the Martian thermosphere, as predicted to be seen by the Emirates Mars Ultraviolet Spectrometer (EMUS), one of the three scientific instruments aboard the Emirates Mars Mission (EMM) to be launched in 2020. These models will aid in simulating accurate disk radiances, which will be used as input to an EMUS instrument simulator. The zonally averaged empirical models are based on FUV data from the IUVS instrument onboard the MAVEN mission, while the theoretical models are based on a basic Chapman profile. The models calculate the brightness (B) of those emissions taking into consideration observation geometry parameters such as emission angle (EA), solar zenith angle (SZA) and planet distance from the Sun (Ds). Specifically, the empirical models take the general form Bn = A*cos(SZA)^n / cos(EA)^m, where Bn is the normalized brightness of an emission feature and A, n, and m are positive constants. This form shows that the brightness correlates positively with EA and negatively with SZA. A comparison of the two models is presented, examining full-disk and half-disk Mars images generated using geometry code specially developed for the EMUS instrument. Sensitivity analyses were also conducted for the theoretical modeling to assess the contributions of electron impact on atomic oxygen and CO2 to the brightness of OI 135.6 nm, in addition to the effect of electron temperature on the CO2+ dissociative recombination contribution to the CO 4PG band system.
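
    As a quick numerical illustration of the quoted model form (the constants A, n, m below are placeholders, not the fitted EMUS values):

```python
import math

def normalized_brightness(A, n, m, sza_deg, ea_deg):
    """Bn = A * cos(SZA)^n / cos(EA)^m, the general empirical form
    quoted in the abstract; A, n, m here are placeholder constants."""
    return (A * math.cos(math.radians(sza_deg)) ** n
              / math.cos(math.radians(ea_deg)) ** m)

# Brightness decreases with solar zenith angle and increases toward
# the limb (larger emission angle), as stated in the abstract.
b_nadir = normalized_brightness(1.0, 1.5, 0.5, sza_deg=0.0, ea_deg=0.0)
b_oblique = normalized_brightness(1.0, 1.5, 0.5, sza_deg=60.0, ea_deg=60.0)
```

    With these placeholder exponents, moving both angles from 0° to 60° halves the normalized brightness: the SZA dimming outweighs the limb brightening.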

  16. Communities of Practice for the Development of Adolescent Civic Engagement: An Empirical Study of Their Correlates in Australia and the United States

    ERIC Educational Resources Information Center

    Homana, Gary A.

    2009-01-01

    The relationships between a multidimensional model of school community and civic engagement were examined using survey data collected for the 1999 IEA Civic Education Study from large, nationally representative samples of adolescents in Australia and the United States. This study extends previous research by considering the extent to which…

  17. 17 CFR 240.15c3-1f - Optional market and credit risk requirements for OTC derivatives dealers (Appendix F to 17 CFR...

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... charges. An OTC derivatives dealer shall provide a description of all statistical models used for pricing... controls over those models, and a statement regarding whether the firm has developed its own internal VAR models. If the OTC derivatives dealer's VAR model incorporates empirical correlations across risk...

  18. Experimental study and empirical prediction of fuel flow parameters under air evolution conditions

    NASA Astrophysics Data System (ADS)

    Kitanina, E. E.; Kitanin, E. L.; Bondarenko, D. A.; Kravtsov, P. A.; Peganova, M. M.; Stepanov, S. G.; Zherebzov, V. L.

    2017-11-01

    Air evolution in kerosene under gravity-driven flow through various hydraulic resistances in the pipeline was studied experimentally. The study was conducted at pressures ranging from 0.2 to 1.0 bar and temperatures varying between -20°C and +20°C. Through these experiments, the oversaturation limit beyond which dissolved air starts evolving intensively from the fuel was established, and correlations for calculating pressure losses and air evolution at local loss elements were obtained. A method for calculating two-phase flow behaviour in a tilted pipeline segment with very low mass flow quality and fairly high volume flow quality was developed. The complete set of empirical correlations obtained from the experimental analysis was implemented in the engineering code. The software simulation results were repeatedly verified against our experimental findings and Airbus test data, showing that the two-phase flow simulation agrees quite well with the experimental results obtained in complex branched pipelines.

  19. Random matrix theory analysis of cross-correlations in the US stock market: Evidence from Pearson’s correlation coefficient and detrended cross-correlation coefficient

    NASA Astrophysics Data System (ADS)

    Wang, Gang-Jin; Xie, Chi; Chen, Shou; Yang, Jiao-Jiao; Yang, Ming-Yan

    2013-09-01

    In this study, we first build two empirical cross-correlation matrices for the US stock market by two different methods, namely the Pearson's correlation coefficient and the detrended cross-correlation coefficient (DCCA coefficient). Then, combining the two matrices with the methods of random matrix theory (RMT), we investigate the statistical properties of cross-correlations in the US stock market. We choose the daily closing prices of 462 constituent stocks of the S&P 500 index as the research objects and select the sample data from January 3, 2005 to August 31, 2012. In the empirical analysis, we examine the statistical properties of the cross-correlation coefficients, the distribution of eigenvalues, the distribution of eigenvector components, and the inverse participation ratio. From the two methods, we find some new results on the cross-correlations in the US stock market that differ from the conclusions reached by previous studies. The empirical cross-correlation matrices constructed from the DCCA coefficient show several interesting properties at different time scales in the US stock market, which are useful for risk management and optimal portfolio selection, especially for diversifying the asset portfolio. Finding the theoretical eigenvalue distribution of a completely random matrix R for the DCCA coefficient remains interesting and meaningful work, because it does not obey the Marčenko-Pastur distribution.
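
    The RMT benchmark for the Pearson-based matrix can be sketched as follows; this is a generic illustration of the Marčenko-Pastur comparison with simulated data and placeholder dimensions, not the paper's actual computation.

```python
import numpy as np

def marchenko_pastur_bounds(n_series, n_obs):
    """Support [(1 - sqrt(Q))^2, (1 + sqrt(Q))^2], Q = N/T, of the
    Marchenko-Pastur eigenvalue density for the correlation matrix of
    N i.i.d. random series of length T (T > N)."""
    q = n_series / n_obs
    return (1 - np.sqrt(q)) ** 2, (1 + np.sqrt(q)) ** 2

rng = np.random.default_rng(0)
N, T = 100, 1000                               # placeholder dimensions
returns = rng.standard_normal((T, N))          # purely random "returns"
corr = np.corrcoef(returns, rowvar=False)      # N x N Pearson matrix
eigvals = np.linalg.eigvalsh(corr)

lo, hi = marchenko_pastur_bounds(N, T)
# For random data nearly all eigenvalues fall inside [lo, hi]; in real
# market data, large eigenvalues above hi signal genuine correlations.
frac_inside = np.mean((eigvals > lo) & (eigvals < hi))
```

    As the abstract notes, no analogous closed-form benchmark is currently available for the DCCA-based matrix, which is why deriving one is flagged as future work.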

  20. Semi-empirical quantum evaluation of peptide - MHC class II binding

    NASA Astrophysics Data System (ADS)

    González, Ronald; Suárez, Carlos F.; Bohórquez, Hugo J.; Patarroyo, Manuel A.; Patarroyo, Manuel E.

    2017-01-01

    Peptide presentation by the major histocompatibility complex (MHC) is a key process for triggering a specific immune response. Studying peptide-MHC (pMHC) binding from a structure-based approach has potential for reducing the cost of research into vaccine development. This study used two semi-empirical quantum chemistry methods (PM7 and FMO-DFTB) to compute the binding energies of peptides bound to HLA-DR1 and HLA-DR2. We found that key stabilising water molecules involved in the peptide binding mechanism were required to obtain high correlation with experimental IC50 values. Our approach is computationally non-intensive and is a reliable alternative for studying pMHC binding interactions.

  1. Why Psychology Cannot be an Empirical Science.

    PubMed

    Smedslund, Jan

    2016-06-01

    The current empirical paradigm for psychological research is criticized because it ignores the irreversibility of psychological processes, the infinite number of influential factors, the pseudo-empirical nature of many hypotheses, and the methodological implications of social interactivity. An additional point is that the differences and correlations usually found are much too small to be useful in psychological practice and in daily life. Together, these criticisms imply that an objective, accumulative, empirical and theoretical science of psychology is an impossible project.

  2. Power-Laws and Scaling in Finance: Empirical Evidence and Simple Models

    NASA Astrophysics Data System (ADS)

    Bouchaud, Jean-Philippe

    We discuss several models that may explain the origin of power-law distributions and power-law correlations in financial time series. From an empirical point of view, the exponents describing the tails of the price increment distribution and the decay of the volatility correlations are rather robust and suggest universality. However, many of the models that appear naturally (for example, to account for the distribution of wealth) contain some multiplicative noise, which generically leads to non-universal exponents. Recent progress in the empirical study of volatility suggests that volatility results from some sort of multiplicative cascade. A convincing `microscopic' (i.e. trader-based) model that explains this observation is, however, not yet available. We discuss a rather generic mechanism for long-ranged volatility correlations based on the idea that agents constantly switch between active and inactive strategies depending on their relative performance.

  3. Dependency structure and scaling properties of financial time series are related

    PubMed Central

    Morales, Raffaello; Di Matteo, T.; Aste, Tomaso

    2014-01-01

    We report evidence of a deep interplay between the hierarchical properties of cross-correlations and the multifractality of New York Stock Exchange daily stock returns. The degree of multifractality displayed by different stocks is found to be positively correlated to their depth in the hierarchy of cross-correlations. We propose a dynamical model that reproduces this observation along with an array of other empirical properties. The structure of this model is such that the hierarchical structure of heterogeneous risks plays a crucial role in the time evolution of the correlation matrix, providing an interpretation of the mechanism behind the interplay between cross-correlation and multifractality in financial markets, where the degree of multifractality of stocks is associated with their hierarchical positioning in the cross-correlation structure. Empirical observations reported in this paper present a new perspective towards the merging of univariate multi-scaling and multivariate cross-correlation properties of financial time series. PMID:24699417

  4. Dependency structure and scaling properties of financial time series are related

    NASA Astrophysics Data System (ADS)

    Morales, Raffaello; Di Matteo, T.; Aste, Tomaso

    2014-04-01

    We report evidence of a deep interplay between the hierarchical properties of cross-correlations and the multifractality of New York Stock Exchange daily stock returns. The degree of multifractality displayed by different stocks is found to be positively correlated to their depth in the hierarchy of cross-correlations. We propose a dynamical model that reproduces this observation along with an array of other empirical properties. The structure of this model is such that the hierarchical structure of heterogeneous risks plays a crucial role in the time evolution of the correlation matrix, providing an interpretation of the mechanism behind the interplay between cross-correlation and multifractality in financial markets, where the degree of multifractality of stocks is associated with their hierarchical positioning in the cross-correlation structure. Empirical observations reported in this paper present a new perspective towards the merging of univariate multi-scaling and multivariate cross-correlation properties of financial time series.

  5. Modeling of Kerena Emergency Condenser

    NASA Astrophysics Data System (ADS)

    Bryk, Rafał; Schmidt, Holger; Mull, Thomas; Wagner, Thomas; Ganzmann, Ingo; Herbst, Oliver

    2017-12-01

    KERENA is an innovative boiling water reactor concept equipped with several passive safety systems. For experimental verification of the systems' performance and for code validation, the Integral Test Stand Karlstein (INKA) was built in Karlstein, Germany. The emergency condenser (EC) system transfers heat from the reactor pressure vessel (RPV) to the core flooding pool in case of a water level decrease in the RPV. The EC is composed of a large number of slightly inclined tubes. During accident conditions, steam enters the tubes and condenses due to the contact of the tubes with cold water on the secondary side. The condensed water then flows back to the RPV by gravity. In this paper, two approaches to modeling condensation in slightly inclined tubes are compared and verified against experiments. The first approach is based on a flow regime map: depending on the regime, the heat transfer coefficient is calculated according to a specific semi-empirical correlation. The second approach uses a general, fully empirical correlation. The models are developed using the object-oriented Modelica language and the open-source OpenModelica environment. The results are compared with data obtained during a large-scale integral test simulating a loss of coolant accident, performed at the Integral Test Stand Karlstein (INKA). The comparison shows good agreement. Due to the modularity of the models, both may be used in the future in systems incorporating condensation in horizontal or slightly inclined tubes. Depending on their preferences, modellers may choose the one-equation approach or the more sophisticated model composed of several exchangeable semi-empirical correlations.

  6. A structural-phenomenological typology of mind-matter correlations.

    PubMed

    Atmanspacher, Harald; Fach, Wolfgang

    2013-04-01

    We present a typology of mind-matter correlations embedded in a dual-aspect monist framework as proposed by Pauli and Jung. They conjectured a picture in which the mental and the material arise as two complementary aspects of one underlying psychophysically neutral reality to which they cannot be reduced and to which direct empirical access is impossible. This picture suggests structural, persistent, reproducible mind-matter correlations by splitting the underlying reality into aspects. In addition, it suggests induced, occasional, evasive mind-matter correlations above and below, respectively, those stable baseline correlations. Two significant roles for the concept of meaning in this framework are elucidated. Finally, it is shown that the obtained typology is in perfect agreement with an empirically based classification of the phenomenology of mind-matter correlations as observed in exceptional human experiences. © 2013, The Society of Analytical Psychology.

  7. Metaheuristic optimization approaches to predict shear-wave velocity from conventional well logs in sandstone and carbonate case studies

    NASA Astrophysics Data System (ADS)

    Emami Niri, Mohammad; Amiri Kolajoobi, Rasool; Khodaiy Arbat, Mohammad; Shahbazi Raz, Mahdi

    2018-06-01

    Seismic wave velocities, along with petrophysical data, provide valuable information during the exploration and development stages of oil and gas fields. The compressional-wave velocity (VP) is acquired using conventional acoustic logging tools in many drilled wells, but the shear-wave velocity (VS) is recorded using advanced logging tools in only a limited number of wells, mainly because of the high operational costs. In addition, laboratory measurements of seismic velocities on core samples are expensive and time consuming, so alternative methods are often used to estimate VS. To date, several empirical correlations that predict VS from well logging measurements and petrophysical data such as VP, porosity and density have been proposed. However, these empirical relations can only be used in limited cases. Intelligent systems and optimization algorithms are inexpensive, fast and efficient approaches for predicting VS. In this study, in addition to the widely used Greenberg–Castagna empirical method, we implement three relatively recently developed metaheuristic algorithms to construct linear and nonlinear models for predicting VS: teaching–learning based optimization, imperialist competitive and artificial bee colony algorithms. We demonstrate the applicability and performance of these algorithms for predicting VS using conventional well logs in two field data examples: a sandstone formation from an offshore oil field and a carbonate formation from an onshore oil field. We compared the VS estimated by each of the metaheuristic approaches with observed VS and with values predicted by the Greenberg–Castagna relations. The results indicate that, for both the sandstone and carbonate case studies, all three implemented metaheuristic algorithms predict VS more efficiently and reliably than the empirical correlation.
The results also demonstrate that, in both case studies, the artificial bee colony algorithm performs slightly better in VS prediction than the other two approaches.
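
The Castagna mudrock line underlying the Greenberg–Castagna approach is a simple linear VP-VS relation; a minimal sketch of the pure-lithology clastic case (coefficients are the standard published mudrock-line values from Castagna et al. 1985, not this paper's field-calibrated models):

```python
def castagna_vs(vp_km_s):
    """Mudrock-line estimate of shear-wave velocity (km/s) from
    compressional-wave velocity (km/s) for brine-saturated clastics."""
    return 0.8621 * vp_km_s - 1.1724
```

For VP = 4.0 km/s this gives VS of about 2.28 km/s; the paper's point is that metaheuristic models fitted to local log data outperform such fixed global coefficients.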

  8. Estimating and Identifying Unspecified Correlation Structure for Longitudinal Data

    PubMed Central

    Hu, Jianhua; Wang, Peng; Qu, Annie

    2014-01-01

    Identifying correlation structure is important to achieving estimation efficiency in analyzing longitudinal data, and is also crucial for drawing valid statistical inference for large size clustered data. In this paper, we propose a nonparametric method to estimate the correlation structure, which is applicable for discrete longitudinal data. We utilize eigenvector-based basis matrices to approximate the inverse of the empirical correlation matrix and determine the number of basis matrices via model selection. A penalized objective function based on the difference between the empirical and model approximation of the correlation matrices is adopted to select an informative structure for the correlation matrix. The eigenvector representation of the correlation estimation is capable of reducing the risk of model misspecification, and also provides useful information on the specific within-cluster correlation pattern of the data. We show that the proposed method possesses the oracle property and selects the true correlation structure consistently. The proposed method is illustrated through simulations and two data examples on air pollution and sonar signal studies. PMID:26361433
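
The eigenvector-based approximation of the inverse correlation matrix can be illustrated numerically; the sketch below keeps all basis matrices v_i v_i^T, which recovers the spectral inverse exactly, whereas the paper selects an informative subset of them via a penalized objective (synthetic data, illustrative only):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((500, 6))           # 500 clusters, 6 repeated measures
R = np.corrcoef(X, rowvar=False)            # empirical correlation matrix
lam, V = np.linalg.eigh(R)                  # eigenvalues/eigenvectors of R
# Basis matrices are the outer products v_i v_i^T; with weights 1/lambda_i and
# all 6 retained, the weighted sum reproduces the spectral inverse of R.
R_inv_approx = sum((1.0 / lam[i]) * np.outer(V[:, i], V[:, i]) for i in range(6))
err = np.linalg.norm(R_inv_approx - np.linalg.inv(R))
```

Dropping small-eigenvalue terms from the sum gives the reduced-rank approximations among which a model-selection criterion would choose.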

  9. A model of clutter for complex, multivariate geospatial displays.

    PubMed

    Lohrenz, Maura C; Trafton, J Gregory; Beck, R Melissa; Gendron, Marlin L

    2009-02-01

    A novel model of measuring clutter in complex geospatial displays was compared with human ratings of subjective clutter as a measure of convergent validity. The new model is called the color-clustering clutter (C3) model. Clutter is a known problem in displays of complex data and has been shown to affect target search performance. Previous clutter models are discussed and compared with the C3 model. Two experiments were performed. In Experiment 1, participants performed subjective clutter ratings on six classes of information visualizations. Empirical results were used to set two free parameters in the model. In Experiment 2, participants performed subjective clutter ratings on aeronautical charts. Both experiments compared and correlated empirical data to model predictions. The first experiment resulted in a .76 correlation between ratings and C3. The second experiment resulted in a .86 correlation, significantly better than results from a model developed by Rosenholtz et al. Outliers to our correlation suggest further improvements to C3. We suggest that (a) the C3 model is a good predictor of subjective impressions of clutter in geospatial displays, (b) geospatial clutter is a function of color density and saliency (primary C3 components), and (c) pattern analysis techniques could further improve C3. The C3 model could be used to improve the design of electronic geospatial displays by suggesting when a display will be too cluttered for its intended audience.

  10. Statistical microeconomics and commodity prices: theory and empirical results.

    PubMed

    Baaquie, Belal E

    2016-01-13

    A review is made of the statistical generalization of microeconomics by Baaquie (Baaquie 2013 Phys. A 392, 4400-4416. (doi:10.1016/j.physa.2013.05.008)), where the market price of every traded commodity, at each instant of time, is considered to be an independent random variable. The dynamics of commodity market prices is given by the unequal time correlation function and is modelled by the Feynman path integral based on an action functional. The correlation functions of the model are defined using the path integral. The existence of the action functional for commodity prices that was postulated in Baaquie (2013) has been empirically ascertained in Baaquie et al. (2015 Phys. A 428, 19-37. (doi:10.1016/j.physa.2015.02.030)). The model's action functionals for different commodities have been empirically determined and calibrated against the unequal time correlation functions of the market commodity prices using a perturbation expansion (Baaquie et al. 2015). Nine commodities drawn from the energy, metal and grain sectors are empirically studied, and their auto-correlation for up to 300 days is described by the model to an accuracy of R^2 > 0.90, using only six parameters. © 2015 The Author(s).

  11. Comment: Spurious Correlation and Other Observations on Experimental Design for Engineering Dimensional Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Piepel, Gregory F.

    2013-08-01

    This article discusses the paper "Experimental Design for Engineering Dimensional Analysis" by Albrecht et al. (2013, Technometrics). That paper provides an overview of engineering dimensional analysis (DA) for use in developing DA models, and proposes methods for generating model-robust experimental designs to support fitting DA models. The specific approach is to develop a design that maximizes the efficiency of a specified empirical model (EM) in the original independent variables, subject to a minimum efficiency for a DA model expressed in terms of dimensionless groups (DGs). This discussion article raises several issues and makes recommendations regarding the proposed approach. The concept of spurious correlation is also raised and discussed. Spurious correlation results from the response DG being calculated using several independent variables that are also used to calculate predictor DGs in the DA model.

  12. An empirical comparative study on biological age estimation algorithms with an application of Work Ability Index (WAI).

    PubMed

    Cho, Il Haeng; Park, Kyung S; Lim, Chang Joo

    2010-02-01

    In this study, we described the characteristics of five different biological age (BA) estimation algorithms: (i) multiple linear regression, (ii) principal component analysis, and the somewhat unique methods developed by (iii) Hochschild, (iv) Klemera and Doubal, and (v) a variant of Klemera and Doubal's method. The objective of this study is to find the most appropriate method of BA estimation by examining the association between the Work Ability Index (WAI) and the difference between each algorithm's estimate and chronological age (CA). The WAI was found to be a measure that reflects an individual's current health status rather than age-dependent deterioration. Experiments were conducted on 200 Korean male participants using a BA estimation system developed principally around the concepts of non-invasiveness, simple operation and human function-based measurement. Using the empirical data, BA estimation as well as various analyses, including correlation analysis and discriminant function analysis, were performed. As a result, the empirical data confirmed that Klemera and Doubal's method with uncorrelated variables from principal component analysis produces relatively reliable and acceptable BA estimates. 2009 Elsevier Ireland Ltd. All rights reserved.
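
The simplest of the five algorithms, BA estimation via multiple linear regression, amounts to regressing chronological age on biomarkers and taking the fitted value as biological age; a hedged sketch on synthetic data (the biomarkers and their coefficients are invented for illustration, not the study's measurements):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
ca = rng.uniform(30, 60, n)                      # chronological age
# two synthetic biomarkers that drift with age plus noise (illustrative only)
B = np.column_stack([0.5 * ca + rng.normal(0, 2, n),
                     -0.3 * ca + rng.normal(0, 2, n)])
A = np.column_stack([np.ones(n), B])             # design matrix with intercept
coef, *_ = np.linalg.lstsq(A, ca, rcond=None)    # regress CA on biomarkers
ba = A @ coef                                    # fitted values = BA estimates
r = np.corrcoef(ba, ca)[0, 1]
```

The known weakness of this method, which motivates the Klemera-Doubal refinements, is that the regression systematically pulls BA estimates toward the sample mean age.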

  13. Empirical Green's functions from small earthquakes: A waveform study of locally recorded aftershocks of the 1971 San Fernando earthquake

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hutchings, L.; Wu, F.

    1990-02-10

    Seismograms from 52 aftershocks of the 1971 San Fernando earthquake recorded at 25 stations distributed across the San Fernando Valley are examined to identify empirical Green's functions, and characterize the dependence of their waveforms on moment, focal mechanism, source and recording site spatial variations, recording site geology, and recorded frequency band. Recording distances ranged from 3.0 to 33.0 km, hypocentral separations ranged from 0.22 to 28.4 km, and recording site separations ranged from 0.185 to 24.2 km. The recording site geologies are diorite gneiss, marine and nonmarine sediments, and alluvium of varying thicknesses. Waveforms of events with moment below about 1.5 × 10^21 dyn cm are independent of the source-time function and are termed empirical Green's functions. Waveforms recorded at a particular station from events located within 1.0 to 3.0 km of each other, depending upon site geology, with very similar focal mechanism solutions are nearly identical for frequencies up to 10 Hz. There is no correlation to waveforms between recording sites at least 1.2 km apart, and waveforms are clearly distinctive for two sites 0.185 km apart. The geologic conditions of the recording site dominate the character of empirical Green's functions. Even for source separations of up to 20.0 km, the empirical Green's functions at a particular site are consistent in frequency content, amplification, and energy distribution. Therefore, it is shown that empirical Green's functions can be used to obtain site response functions. The observations of empirical Green's functions are used as a basis for developing the theory for using empirical Green's functions in deconvolution for source pulses and synthesis of seismograms of larger earthquakes.

  14. Electrochemical carbon dioxide concentrator: Math model

    NASA Technical Reports Server (NTRS)

    Marshall, R. D.; Schubert, F. H.; Carlson, J. N.

    1973-01-01

    A steady state computer simulation model of an Electrochemical Depolarized Carbon Dioxide Concentrator (EDC) has been developed. The mathematical model combines EDC heat and mass balance equations with empirical correlations derived from experimental data to describe EDC performance as a function of the operating parameters involved. The model is capable of accurately predicting performance over EDC operating ranges. Model simulation results agree with the experimental data obtained over the prediction range.

  15. An applicable method for efficiency estimation of operating tray distillation columns and its comparison with the methods utilized in HYSYS and Aspen Plus

    NASA Astrophysics Data System (ADS)

    Sadeghifar, Hamidreza

    2015-10-01

    Developing general methods that rely on column data for the efficiency estimation of operating (existing) distillation columns has been overlooked in the literature. Most of the available methods are based on empirical mass transfer and hydraulic relations correlated to laboratory data. Therefore, these methods may not be sufficiently accurate when applied to industrial columns. In this paper, an applicable and accurate method was developed for the efficiency estimation of distillation columns filled with trays. This method can calculate efficiency as well as mass and heat transfer coefficients without using any empirical mass transfer or hydraulic correlations and without the need to estimate operational or hydraulic parameters of the column. For example, the method does not need to estimate the tray interfacial area, which is perhaps its most important advantage over all the available methods. The method can be used for the efficiency prediction of any trays in distillation columns. For the efficiency calculation, the method employs the column data and uses the true rates of the mass and heat transfers occurring inside the operating column. It is highly emphasized that estimating the efficiency of an operating column has to be distinguished from that of a column being designed.

  16. Livestock Helminths in a Changing Climate: Approaches and Restrictions to Meaningful Predictions.

    PubMed

    Fox, Naomi J; Marion, Glenn; Davidson, Ross S; White, Piran C L; Hutchings, Michael R

    2012-03-06

    Climate change is a driving force for livestock parasite risk. This is especially true for helminths including the nematodes Haemonchus contortus, Teladorsagia circumcincta, Nematodirus battus, and the trematode Fasciola hepatica, since survival and development of free-living stages is chiefly affected by temperature and moisture. The paucity of long term predictions of helminth risk under climate change has driven us to explore optimal modelling approaches and identify current bottlenecks to generating meaningful predictions. We classify approaches as correlative or mechanistic, exploring their strengths and limitations. Climate is one aspect of a complex system and, at the farm level, husbandry has a dominant influence on helminth transmission. Continuing environmental change will necessitate the adoption of mitigation and adaptation strategies in husbandry. Long term predictive models need to have the architecture to incorporate these changes. Ultimately, an optimal modelling approach is likely to combine mechanistic processes and physiological thresholds with correlative bioclimatic modelling, incorporating changes in livestock husbandry and disease control. Irrespective of approach, the principal limitation to parasite predictions is the availability of active surveillance data and empirical data on physiological responses to climate variables. By combining improved empirical data and refined models with a broad view of the livestock system, robust projections of helminth risk can be developed.

  17. A mini-review on econophysics: Comparative study of Chinese and western financial markets

    NASA Astrophysics Data System (ADS)

    Zheng, Bo; Jiang, Xiong-Fei; Ni, Peng-Yun

    2014-07-01

    We present a review of our recent research in econophysics, and focus on the comparative study of Chinese and western financial markets. By virtue of concepts and methods in statistical physics, we investigate the time correlations and spatial structure of financial markets based on empirical high-frequency data. We discover that the Chinese stock market shares common basic properties with the western stock markets, such as the fat-tail probability distribution of price returns, the long-range auto-correlation of volatilities, and the persistence probability of volatilities, while it exhibits very different higher-order time correlations of price returns and volatilities, spatial correlations of individual stock prices, and large-fluctuation dynamic behaviors. Furthermore, multi-agent-based models are developed to simulate the microscopic interaction and dynamic evolution of the stock markets.

  18. Why and How. The Future of the Central Questions of Consciousness

    PubMed Central

    Havlík, Marek; Kozáková, Eva; Horáček, Jiří

    2017-01-01

    In this review, we deal with two central questions of consciousness, how and why, and we outline their possible future development. The question how refers to the empirical endeavor to reveal the neural correlates and mechanisms that form consciousness. The question why, on the other hand, generally refers to the "hard problem" of consciousness, which claims that empirical science will always fail to provide a satisfactory answer to the question of why there is conscious experience at all. Unfortunately, the hard problem of consciousness will probably never completely disappear because it will always have its most committed supporters. However, there is a good chance that its weight and importance will be greatly reduced by empirically tackling consciousness in the near future. We expect that future empirical study of consciousness will be based on a unifying brain theory and will answer the question of what the function of conscious experience is, which will in turn displace the implications of the hard problem. The candidate for such a unifying brain theory is predictive coding, which will have to explain both perceptual consciousness and conscious mind-wandering in order to become a truly unifying theory of brain functioning. PMID:29075226

  19. 40 CFR Appendix C to Part 75 - Missing Data Estimation Procedures

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... certification of a parametric, empirical, or process simulation method or model for calculating substitute data... available process simulation methods and models. 1.2Petition Requirements Continuously monitor, determine... desulfurization, a corresponding empirical correlation or process simulation parametric method using appropriate...

  20. Phase correlation of foreign exchange time series

    NASA Astrophysics Data System (ADS)

    Wu, Ming-Chya

    2007-03-01

    Correlation of foreign exchange rates in currency markets is investigated based on the empirical data of USD/DEM and USD/JPY exchange rates for the period from February 1, 1986 to December 31, 1996. The return of the exchange time series is first decomposed into a number of intrinsic mode functions (IMFs) by the empirical mode decomposition method. The instantaneous phases of the resultant IMFs, calculated by the Hilbert transform, are then used to characterize the behaviors of pricing transmissions, and the correlation is probed by measuring the phase differences between two IMFs of the same order. From the distribution of phase differences, our results show explicitly that the correlations are stronger on the daily time scale than on longer time scales. A comparison of the periods 1986-1989 and 1990-1993 indicates that the two exchange rates were more correlated in the former period than in the latter. This result is consistent with observations from the cross-correlation calculation.
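
The phase-difference measurement described above can be sketched with the Hilbert transform; here two noisy oscillations stand in for same-order IMFs (the actual study first applies empirical mode decomposition, which is omitted in this sketch):

```python
import numpy as np
from scipy.signal import hilbert

rng = np.random.default_rng(2)
t = np.arange(1000)
# two noisy oscillations sharing a common phase (stand-ins for same-order IMFs)
common = np.sin(2 * np.pi * t / 50)
x = common + 0.1 * rng.standard_normal(t.size)
y = common + 0.1 * rng.standard_normal(t.size)
# instantaneous phase = angle of the analytic signal, unwrapped to be continuous
phase_x = np.unwrap(np.angle(hilbert(x)))
phase_y = np.unwrap(np.angle(hilbert(y)))
dphi = phase_x - phase_y                          # phase-difference series
# wrap back to (-pi, pi]; a narrow spread indicates strong phase correlation
spread = np.std((dphi + np.pi) % (2 * np.pi) - np.pi)
```

In the paper, the concentration of this phase-difference distribution around zero is what quantifies the strength of the pricing-transmission correlation at each time scale.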

  1. Development of Advanced Methods of Structural and Trajectory Analysis for Transport Aircraft

    NASA Technical Reports Server (NTRS)

    Ardema, Mark D.

    1996-01-01

    In this report the author describes: (1) development of advanced methods of structural weight estimation, and (2) development of advanced methods of flight path optimization. A method of estimating the load-bearing fuselage weight and wing weight of transport aircraft based on fundamental structural principles has been developed. This method of weight estimation represents a compromise between the rapid assessment of component weight using empirical methods based on actual weights of existing aircraft and detailed, but time-consuming, analysis using the finite element method. The method was applied to eight existing subsonic transports for validation and correlation. The resulting computer program, PDCYL, has been integrated into the weights-calculating module of the AirCraft SYNThesis (ACSYNT) computer program. ACSYNT has traditionally used only empirical weight estimation methods; PDCYL adds to ACSYNT a rapid, accurate means of assessing the fuselage and wing weights of unconventional aircraft. PDCYL also allows flexibility in the choice of structural concept, as well as a direct means of determining the impact of advanced materials on structural weight.

  2. Seismic Site Classification and Empirical Correlation Between Standard Penetration Test N Value and Shear Wave Velocity for Guwahati Based on Thorough Subsoil Investigation Data

    NASA Astrophysics Data System (ADS)

    Kumar, Abhishek; Harinarayan, N. H.; Verma, Vishal; Anand, Saurabh; Borah, Uddipana; Bania, Mousumi

    2018-04-01

    Guwahati, the gateway of India in the northeast, is a large business and development center. Past seismic scenarios suggest moderate to significant effects of regional earthquakes (EQs) in Guwahati in terms of liquefaction as well as building damage. Considering the role of local soil in amplifying EQ-generated ground motions and controlling surface damage, the present study attempts a seismic site classification of the subsoil of Guwahati. The subsoil is explored based on 43 geophysical tests and 244 borelogs gathered from different resources. Based on the borehole data, four 2D cross-sections are developed from different parts of Guwahati, clearly indicating that a majority of the locations are composed of clay of intermediate to high plasticity, while layers of sand are found only at specific locations and selective depths. Further, seismic site classification based on the 30 m average SPT-N suggests that a major part of Guwahati, including Balaji Temple and the Airport, falls under seismic site class (SSC) D. However, Assam Zoo, Pan Bazaar, the IIT campus, Dhol Gobinda and Maligaon show SSC E, clearly indicating the presence of soft soil deposits at these locations. A similar site classification is also attempted from the MASW test-based 30 m average shear wave velocity (VS30). The VS30-based classification also categorizes most of Guwahati under SSC D; however, there are locations in the southern part of Guwahati which belong to SSC C as well. The mismatch in SSC between the two test types found here is consistent with previous studies of Indian soils. Further, three empirical correlations based on both SPT-N and VS profiles at 22 test locations are developed for (1) clayey, (2) sandy and (3) all soil types. The proposed correlation for all soil types is validated graphically and closely matches similar correlations for Turkey and Lucknow.
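
Empirical SPT-N to VS correlations of the kind developed above are conventionally of the power-law form VS = a·N^b, fitted by least squares in log-log space; a sketch on synthetic data (the constants 80 and 0.33 are illustrative, not the paper's fitted values):

```python
import numpy as np

# Synthetic (N, Vs) pairs following Vs = a * N^b with lognormal scatter
rng = np.random.default_rng(3)
N = rng.uniform(5, 50, 100)                       # SPT blow counts
vs = 80.0 * N**0.33 * np.exp(rng.normal(0, 0.05, 100))   # Vs in m/s
# Fit log Vs = log a + b log N by least squares, the usual form of such fits
b, log_a = np.polyfit(np.log(N), np.log(vs), 1)
a = np.exp(log_a)
```

Separate fits on the clayey and sandy subsets would give the soil-type-specific correlations the paper reports alongside the all-soil fit.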

  3. Statistical analysis on multifractal detrended cross-correlation coefficient for return interval by oriented percolation

    NASA Astrophysics Data System (ADS)

    Deng, Wei; Wang, Jun

    2015-06-01

    We investigate and quantify the multifractal detrended cross-correlation of return interval series for Chinese stock markets and a proposed price model; the price model is established by oriented percolation. The return interval describes the waiting time between two successive price volatilities above some threshold, and the present work is an attempt to quantify the level of multifractal detrended cross-correlation for the return intervals. Further, the concept of the MF-DCCA coefficient of return intervals is introduced, and the corresponding empirical research is performed. The empirical results show that the return intervals of SSE and SZSE are weakly positively multifractal power-law cross-correlated and exhibit the fluctuation patterns of MF-DCCA coefficients. Similar behaviors of the return intervals for the price model are also demonstrated.

  4. Cross-correlations between the US monetary policy, US dollar index and crude oil market

    NASA Astrophysics Data System (ADS)

    Sun, Xinxin; Lu, Xinsheng; Yue, Gongzheng; Li, Jianfeng

    2017-02-01

    This paper investigates the cross-correlations between the US monetary policy, US dollar index and WTI crude oil market, using a dataset covering a period from February 4, 1994 to February 29, 2016. Our study contributes to the literature by examining the effect of the US monetary policy on US dollar index and WTI crude oil through the MF-DCCA approach. The empirical results show that the cross-correlations between the three sets of time series exhibit strong multifractal features with the strength of multifractality increasing over the sample period. Employing a rolling window analysis, our empirical results show that the US monetary policy operations have clear influences on the cross-correlated behavior of the three time series covered by this study.
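
The DCCA coefficient at the core of the MF-DCCA approach used in the two studies above can be sketched as the ratio of the detrended covariance to the detrended variances (Zebende's rho_DCCA; the multifractal extension with q-order moments is omitted here, and the data are synthetic):

```python
import numpy as np

def dcca_rho(x, y, s):
    """Detrended cross-correlation coefficient at window size s:
    rho = F2_xy / sqrt(F2_xx * F2_yy) over linearly detrended profiles."""
    X = np.cumsum(x - np.mean(x))                 # integrated profiles
    Y = np.cumsum(y - np.mean(y))
    n = (len(X) // s) * s                         # drop the ragged tail
    t = np.arange(s)
    f_xx = f_yy = f_xy = 0.0
    for start in range(0, n, s):
        xw, yw = X[start:start + s], Y[start:start + s]
        rx = xw - np.polyval(np.polyfit(t, xw, 1), t)   # linear detrend
        ry = yw - np.polyval(np.polyfit(t, yw, 1), t)
        f_xx += np.mean(rx * rx)
        f_yy += np.mean(ry * ry)
        f_xy += np.mean(rx * ry)
    return f_xy / np.sqrt(f_xx * f_yy)

rng = np.random.default_rng(4)
z = rng.standard_normal(2000)
a = z + 0.3 * rng.standard_normal(2000)       # two series sharing a component
b = z + 0.3 * rng.standard_normal(2000)
rho = dcca_rho(a, b, 20)
```

Scanning s over a range of scales, and generalizing the averages to q-order moments, yields the scale- and q-dependent coefficients whose fluctuation patterns these papers analyze.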

  5. Movement patterns of Tenebrio beetles demonstrate empirically that correlated-random-walks have similitude with a Lévy walk.

    PubMed

    Reynolds, Andy M; Leprêtre, Lisa; Bohan, David A

    2013-11-07

    Correlated random walks are the dominant conceptual framework for modelling and interpreting organism movement patterns. Recent years have witnessed a stream of high profile publications reporting that many organisms perform Lévy walks; movement patterns that seemingly stand apart from the correlated random walk paradigm because they are discrete and scale-free rather than continuous and scale-finite. Our new study of the movement patterns of Tenebrio molitor beetles in unchanging, featureless arenas provides the first empirical support for a remarkable and deep theoretical synthesis that unites correlated random walks and Lévy walks. It demonstrates that the two models are complementary rather than competing descriptions of movement pattern data and shows that correlated random walks are a part of the Lévy walk family. It follows from this that vast numbers of Lévy walkers could be hiding in plain sight.
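
A correlated random walk of the kind discussed above can be simulated with normally distributed turning angles; a small turning-angle dispersion yields the directional persistence that, per the study, links this model to the Lévy walk family (parameters are illustrative):

```python
import numpy as np

def crw(n_steps, turn_sd, rng):
    """Correlated random walk: unit-length steps whose successive headings
    differ by a Normal(0, turn_sd) turning angle, giving persistence."""
    headings = np.cumsum(rng.normal(0.0, turn_sd, n_steps))
    return np.cumsum(np.cos(headings)), np.cumsum(np.sin(headings))

rng = np.random.default_rng(5)
x, y = crw(1000, 0.3, rng)                    # small turn_sd -> persistent path
net_displacement = np.hypot(x[-1], y[-1])
```

Over short segments such a walk looks ballistic and over long segments diffusive, which is the behavior the paper's synthesis relates to truncated Lévy-walk movement patterns.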

  6. Developing the Mid-Level Civilian Logistician: An Empirical Study of United States Air Force GS-12 to GM-13 Professional Development.

    DTIC Science & Technology

    1989-09-01

    [Abstract excerpt garbled in OCR. Recoverable fragments refer to professional military education (PME) programs (Squadron Officer School, Air Command and Staff College, Air War College, Industrial College of the Armed Forces, Defense Systems Management Course) and to a reported correlation between the number of existing PME programs and reported logistician weaknesses.]

  7. Empirical correlates for the Minnesota Multiphasic Personality Inventory-2-Restructured Form in a German inpatient sample.

    PubMed

    Moultrie, Josefine K; Engel, Rolf R

    2017-10-01

    We identified empirical correlates for the 42 substantive scales of the German language version of the Minnesota Multiphasic Personality Inventory (MMPI)-2-Restructured Form (MMPI-2-RF): Higher Order, Restructured Clinical, Specific Problem, Interest, and revised Personality Psychopathology Five scales. We collected external validity data by means of a 177-item chart review form in a sample of 488 psychiatric inpatients of a German university hospital. We structured our findings along the interpretational guidelines for the MMPI-2-RF and compared them with the validity data published in the tables of the MMPI-2-RF Technical Manual. Our results show significant correlations between MMPI-2-RF scales and conceptually relevant criteria. Most of the results were in line with U.S. validation studies. Some of the differences could be attributed to sample compositions. For most of the scales, construct validity coefficients were acceptable. Taken together, this study amplifies the enlarging body of research on empirical correlates of the MMPI-2-RF scales in a new sample. The study suggests that the interpretations given in the MMPI-2-RF manual may be generalizable to the German language MMPI-2-RF. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  8. Computer programs to predict induced effects of jets exhausting into a crossflow

    NASA Technical Reports Server (NTRS)

    Perkins, S. C., Jr.; Mendenhall, M. R.

    1984-01-01

    This user's manual describes two computer programs developed to predict the induced effects of jets exhausting into a crossflow. Program JETPLT predicts pressures induced on an infinite flat plate by a jet exhausting at angles to the plate, and Program JETBOD, in conjunction with a panel code, predicts pressures induced on a body of revolution by a jet exhausting normal to the surface. Both codes use a potential model of the jet and adjacent surface with empirical corrections for the viscous or nonpotential effects. This program manual contains a description of the use of both programs, instructions for preparation of input, descriptions of the output, limitations of the codes, and sample cases. In addition, procedures to extend both codes to include additional empirical correlations are described.

  9. The microwave propagation and backscattering characteristics of vegetation. [wheat, sorghum, soybeans and corn fields in Kansas

    NASA Technical Reports Server (NTRS)

    Ulaby, F. T. (Principal Investigator); Wilson, E. A.

    1984-01-01

    A semi-empirical model for microwave backscatter from vegetation was developed and a complete set of canopy attenuation measurements as a function of frequency, incidence angle and polarization was acquired. The semi-empirical model was tested on corn and sorghum data over the 8 to 35 GHz range. The model generally provided an excellent fit to the data as measured by the correlation and rms error between observed and predicted data. The model also predicted reasonable values of canopy attenuation. The attenuation data were acquired over the 1.6 to 10.2 GHz range for the linear polarizations at approximately 20 deg and 50 deg incidence angles for wheat and soybeans. An attenuation model is proposed which provides reasonable agreement with the measured data.

  10. An empirical investigation on different methods of economic growth rate forecast and its behavior from fifteen countries across five continents

    NASA Astrophysics Data System (ADS)

    Yin, Yip Chee; Hock-Eam, Lim

    2012-09-01

    Our empirical results show that GDP growth rates can be predicted more accurately in continents with fewer large economies than in smaller economies such as Malaysia. This difficulty is very likely positively correlated with subsidy or social security policies. The stage of economic development and the level of competitiveness also appear to have interactive effects on forecast stability. These results are generally independent of the forecasting procedure. For countries with highly stable economic growth, forecasting by model selection is better than model averaging. Overall, forecast weight averaging (FWA) is the better forecasting procedure in most countries. FWA also outperforms simple model averaging (SMA) and has the same forecasting ability as Bayesian model averaging (BMA) in almost all countries.

  11. Use of Empirical Estimates of Shrinkage in Multiple Regression: A Caution.

    ERIC Educational Resources Information Center

    Kromrey, Jeffrey D.; Hines, Constance V.

    1995-01-01

    The accuracy of four empirical techniques to estimate shrinkage in multiple regression was studied through Monte Carlo simulation. None of the techniques provided unbiased estimates of the population squared multiple correlation coefficient, but the normalized jackknife and bootstrap techniques demonstrated marginally acceptable performance with…
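
Formula-based corrections such as Wherry's adjusted R² are the usual non-empirical baseline against which the empirical (jackknife, bootstrap) shrinkage estimates studied above are compared; a sketch of the gap between the sample R² and its adjusted value on synthetic data:

```python
import numpy as np

rng = np.random.default_rng(6)
n, p = 50, 5
X = rng.standard_normal((n, p))
yv = X[:, 0] + rng.standard_normal(n)            # one real predictor, four noise
A = np.column_stack([np.ones(n), X])
beta, *_ = np.linalg.lstsq(A, yv, rcond=None)    # ordinary least squares fit
resid = yv - A @ beta
r2 = 1 - resid @ resid / ((yv - yv.mean()) @ (yv - yv.mean()))
# Wherry-type adjustment: shrink sample R^2 toward the population value
r2_adj = 1 - (1 - r2) * (n - 1) / (n - p - 1)
```

The study's point is that such formula-based shrinkage and the empirical resampling estimates can all remain biased for the population squared multiple correlation.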

  12. A Bayes linear Bayes method for estimation of correlated event rates.

    PubMed

    Quigley, John; Wilson, Kevin J; Walls, Lesley; Bedford, Tim

    2013-12-01

    Typically, full Bayesian estimation of correlated event rates can be computationally challenging since estimators are intractable. When estimation of event rates represents one activity within a larger modeling process, there is an incentive to develop more efficient inference than provided by a full Bayesian model. We develop a new subjective inference method for correlated event rates based on a Bayes linear Bayes model under the assumption that events are generated from a homogeneous Poisson process. To reduce the elicitation burden we introduce homogenization factors to the model and, as an alternative to a subjective prior, an empirical method using the method of moments is developed. Inference under the new method is compared against estimates obtained under a full Bayesian model, which takes a multivariate gamma prior, where the predictive and posterior distributions are derived in terms of well-known functions. The mathematical properties of both models are presented. A simulation study shows that the Bayes linear Bayes inference method and the full Bayesian model provide equally reliable estimates. An illustrative example, motivated by a problem of estimating correlated event rates across different users in a simple supply chain, shows how ignoring the correlation leads to biased estimation of event rates. © 2013 Society for Risk Analysis.
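
One ingredient of the approach described above, a conjugate gamma-Poisson update with a method-of-moments prior, can be sketched as follows (counts and exposure are invented; this is not the authors' full Bayes linear Bayes model with homogenization factors):

```python
import numpy as np

# Empirical-Bayes sketch: moment-match a gamma prior across units' raw rates,
# then apply the conjugate Poisson update for each unit.
counts = np.array([3, 5, 2, 8, 4, 6])         # events observed per unit
exposure = 10.0                                # observation time per unit
rates = counts / exposure
m, v = rates.mean(), rates.var(ddof=1)
alpha0, beta0 = m**2 / v, m / v                # method-of-moments gamma prior
post_mean = (alpha0 + counts) / (beta0 + exposure)
```

Each posterior mean is a convex combination of the unit's raw rate and the pooled prior mean, which is the borrowing-of-strength across correlated units that the full model formalizes.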

  13. Supersonic aerodynamics of delta wings

    NASA Technical Reports Server (NTRS)

    Wood, Richard M.

    1988-01-01

    Through the empirical correlation of experimental data and theoretical analysis, a set of graphs has been developed which summarize the inviscid aerodynamics of delta wings at supersonic speeds. The various graphs which detail the aerodynamic performance of delta wings at both zero-lift and lifting conditions were then employed to define a preliminary wing design approach in which both the low-lift and high-lift design criteria were combined to define a feasible design space.

  14. Analytical prediction of forced convective heat transfer of fluids embedded with nanostructured materials (nanofluids)

    NASA Astrophysics Data System (ADS)

    Vasu, V.; Rama Krishna, K.; Kumar, A. C. S.

    2007-09-01

    Nanofluids are a new class of heat transfer fluids developed by suspending nanosized solid particles in liquids. The larger thermal conductivity of the solid particles compared to base fluids such as water, ethylene glycol or engine oil significantly enhances the fluids' thermal properties. Several phenomenological models have been proposed to explain the anomalous heat transfer enhancement in nanofluids. This paper presents a systematic literature survey of the characteristics of nanofluids, viz., thermal conductivity, specific heat and other thermal properties. An empirical correlation for the thermal conductivity of Al_{2}O_{3} + water and Cu + water nanofluids, considering the effects of temperature, volume fraction and nanoparticle size, is developed and presented. A correlation for the evaluation of the Nusselt number is also developed, presented and compared in graphical form. These enhanced thermophysical and heat transfer characteristics make fluids embedded with nanomaterials excellent candidates for future applications.
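The paper's temperature- and size-dependent correlation is not reproduced in the abstract; as a hedged point of comparison, the classical Maxwell effective-medium model, a standard baseline for nanofluid conductivity, can be sketched as:

```python
def maxwell_k_eff(k_f, k_p, phi):
    """Classical Maxwell effective-medium thermal conductivity of a
    dilute suspension: k_f = base fluid conductivity, k_p = particle
    conductivity, phi = particle volume fraction (valid for small phi
    and spherical particles)."""
    num = k_p + 2.0 * k_f + 2.0 * phi * (k_p - k_f)
    den = k_p + 2.0 * k_f - phi * (k_p - k_f)
    return k_f * num / den

# Al2O3 (~40 W/m-K) in water (~0.613 W/m-K) at 1% volume fraction:
print(round(maxwell_k_eff(0.613, 40.0, 0.01), 4))
```

Correlations like the paper's add temperature and particle-size terms precisely because this static model underpredicts the enhancement reported for many nanofluids.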

  15. Dual-aspect monism à la Pauli and Jung perforates the completeness of physics

    NASA Astrophysics Data System (ADS)

    Atmanspacher, Harald

    2012-12-01

    In the mid 20th century, the physicist Wolfgang Pauli and the psychologist Carl Gustav Jung developed a philosophical position on the mind-matter problem that is today called dual-aspect monism. They conjectured a picture in which the mental and the material arise as two complementary aspects of one underlying, psychophysically neutral reality to which they cannot be reduced and to which direct empirical access is impossible. This picture suggests structural, persistent, reproducible mind-matter correlations arising from the splitting of the underlying reality into aspects. In addition, it suggests induced, occasional, evasive mind-matter correlations above and below, respectively, those stable baseline correlations. These correlations, and the way they arise, suggest that the domain of the physical is not completely independent of the domain of the mental, and that neither is independent of the assumed reality underlying them. Some ideas are presented of how these relationships might be conceived.

  16. Thermal Conductivity of Metallic Uranium

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hin, Celine

    This project developed modeling and simulation approaches to predict the thermal conductivity of metallic fuels and their alloys. We focus on two methods. The first was developed by the team at the University of Wisconsin Madison: a practical and general modeling approach for the thermal conductivity of metals and metal alloys that integrates ab-initio and semi-empirical physics-based models to maximize the strengths of both techniques. The second was developed by the team at Virginia Tech and consists of determining the thermal conductivity using only ab-initio methods without any fitting parameters. Both methods were complementary. The models incorporated both phonon and electron contributions. Good agreement with experimental data over a wide temperature range was found. The models also provided insight into the different physical factors that govern the thermal conductivity at different temperatures. The models were general enough to incorporate more complex effects like additional alloying species, defects, transmutation products and noble gas bubbles to predict the behavior of complex metallic alloys like U-alloy fuel systems under burnup. Introduction: Thermal conductivity is an important thermophysical property affecting the performance and efficiency of metallic fuels [1]. Some experimental measurements of thermal conductivity, and its correlation with composition and temperature from empirical fitting, are available for U, Zr and their alloys with Pu and other minor actinides. However, as reviewed by Kim, Cho and Sohn [2], due to the difficulty of doing experiments on actinide materials, thermal conductivities of metallic fuels have only been measured at limited alloy compositions and temperatures, some of the measurements even being negative and unphysical.
Furthermore, the correlations developed so far are empirical in nature and may not be accurate when used for prediction at conditions far from those used in the original fitting. Moreover, as fuels burn up in the reactor and fission products build up, thermal conductivity also changes significantly [3]. Unfortunately, fundamental understanding of the effect of fission products is also currently lacking. In this project, we probe the thermal conductivity of metallic fuels with ab initio calculations, a theoretical tool with the potential to yield better accuracy and predictive power than empirical fitting. This work will both complement experimental data by determining thermal conductivity over wider composition and temperature ranges than are available experimentally, and develop mechanistic understanding to guide better design of metallic fuels in the future. So far, we have focused on the α-U perfect crystal, the ground-state phase of U metal. We focus on two methods. The first was developed by the team at the University of Wisconsin Madison: a practical and general modeling approach for the thermal conductivity of metals and metal alloys that integrates ab-initio and semi-empirical physics-based models to maximize the strengths of both techniques. The second was developed by the team at Virginia Tech and consists of determining the thermal conductivity using only ab-initio methods without any fitting parameters. Both methods were complementary and very helpful for understanding the physics behind the thermal conductivity of metallic uranium and other materials with similar characteristics. In Section I, the combined model developed at UWM is explained. In Section II, the ab-initio method developed at VT is described along with the uranium pseudo-potential and its validation. Section III is devoted to the work done by Jianguo Yu at INL.
Finally, we will present the performance of the project in terms of milestones, publications, and presentations.

  17. Dynamic correlations at different time-scales with empirical mode decomposition

    NASA Astrophysics Data System (ADS)

    Nava, Noemi; Di Matteo, T.; Aste, Tomaso

    2018-07-01

    We introduce a simple approach which combines Empirical Mode Decomposition (EMD) and Pearson's cross-correlations over rolling windows to quantify dynamic dependency at different time scales. The EMD is a tool to separate time series into implicit components which oscillate at different time-scales. We apply this decomposition to intraday time series of the following three financial indices: the S&P 500 (USA), the IPC (Mexico) and the VIX (volatility index USA), obtaining time-varying multidimensional cross-correlations at different time-scales. The correlations computed over a rolling window are compared across the three indices, across the components at different time-scales and across different time lags. We uncover a rich heterogeneity of interactions, which depends on the time-scale and has important lead-lag relations that could have practical use for portfolio management, risk estimation and investment decisions.
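The EMD sifting step is beyond a short sketch, but the rolling-window Pearson correlation that the approach applies to each pair of decomposed components can be illustrated; the window length and the toy series below are illustrative:

```python
from statistics import mean

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

def rolling_correlation(x, y, window):
    """Pearson correlation over a sliding window; element i covers
    x[i:i+window] and y[i:i+window]."""
    return [pearson(x[i:i + window], y[i:i + window])
            for i in range(len(x) - window + 1)]

# x rises then falls while y rises throughout, so the windows move
# from perfect positive to perfect negative correlation:
x = [1, 2, 3, 4, 5, 6, 5, 4, 3, 2]
y = [2, 3, 4, 5, 6, 7, 8, 9, 10, 11]
corrs = rolling_correlation(x, y, 5)
print(round(corrs[0], 3), round(corrs[-1], 3))
```

In the paper's setting the inputs would be matched intrinsic mode components of two indices rather than raw series, giving one such correlation track per time-scale.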

  18. Price-volume multifractal analysis and its application in Chinese stock markets

    NASA Astrophysics Data System (ADS)

    Yuan, Ying; Zhuang, Xin-tian; Liu, Zhi-ying

    2012-06-01

    Empirical research on Chinese stock markets is conducted using statistical tools. First, the multifractality of the stock price return series, ri(ri=ln(Pt+1)-ln(Pt)), and the trading volume variation series, vi(vi=ln(Vt+1)-ln(Vt)), is confirmed using multifractal detrended fluctuation analysis. Furthermore, a multifractal detrended cross-correlation analysis between stock price return and trading volume variation in Chinese stock markets shows that the cross relationship between them is also multifractal. Second, the cross-correlation between stock price Pi and trading volume Vi is studied empirically using the cross-correlation function and detrended cross-correlation analysis. Both the Shanghai and Shenzhen stock markets show pronounced long-range cross-correlations between stock price and trading volume. Third, a composite index R based on price and trading volume is introduced. Compared with the return series ri and the volume variation series vi, the R variation series not only retains the characteristics of the original series but also captures the relative correlation between stock price and trading volume. Finally, we analyze the multifractal characteristics of the R variation series before and after three financial events in China (namely, Price Limits, the Reform of Non-tradable Shares, and the 2008 financial crisis) over the whole sample period to study changes in stock market fluctuation and financial risk. The empirical results verify the validity of R.
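A minimal sketch of the detrended cross-correlation fluctuation function underlying this kind of analysis (with x == y it reduces to ordinary DFA); the box sizes and test series are illustrative, and taking the absolute value in the final step is one common convention, since the detrended covariance can be negative:

```python
import random
from statistics import mean

def _detrend(seg):
    """Residuals of a least-squares line fitted over indices 0..len-1."""
    n = len(seg)
    t = list(range(n))
    mt, ms = mean(t), mean(seg)
    slope = (sum((a - mt) * (b - ms) for a, b in zip(t, seg))
             / sum((a - mt) ** 2 for a in t))
    return [b - (ms + slope * (a - mt)) for a, b in zip(t, seg)]

def dcca_fluctuation(x, y, box):
    """Detrended cross-correlation fluctuation F(box) for two series,
    using non-overlapping boxes; the log-log slope of F against box
    size estimates the cross-correlation scaling exponent."""
    mx, my = mean(x), mean(y)
    # Integrated "profiles" of the demeaned series.
    px, py = [], []
    sx = sy = 0.0
    for a, b in zip(x, y):
        sx += a - mx
        sy += b - my
        px.append(sx)
        py.append(sy)
    cov_sum, boxes = 0.0, len(x) // box
    for k in range(boxes):
        rx = _detrend(px[k * box:(k + 1) * box])
        ry = _detrend(py[k * box:(k + 1) * box])
        cov_sum += sum(a * b for a, b in zip(rx, ry)) / box
    return abs(cov_sum / boxes) ** 0.5

# For uncorrelated noise F(box) grows roughly like box**0.5:
random.seed(1)
z = [random.gauss(0, 1) for _ in range(1000)]
print(dcca_fluctuation(z, z, 8) < dcca_fluctuation(z, z, 64))
```

The multifractal variant used in the paper repeats this computation for a range of moment orders q, not just the q = 2 case sketched here.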

  19. Feynman perturbation expansion for the price of coupon bond options and swaptions in quantum finance. II. Empirical

    NASA Astrophysics Data System (ADS)

    Baaquie, Belal E.; Liang, Cui

    2007-01-01

    The quantum finance pricing formulas for coupon bond options and swaptions derived by Baaquie [Phys. Rev. E 75, 016703 (2006)] are reviewed. We empirically study the swaption market and propose an efficient computational procedure for analyzing the data. Empirical results of the swaption price, volatility, and swaption correlation are compared with the predictions of quantum finance. The quantum finance model generates the market swaption price to over 90% accuracy.

  20. Feynman perturbation expansion for the price of coupon bond options and swaptions in quantum finance. II. Empirical.

    PubMed

    Baaquie, Belal E; Liang, Cui

    2007-01-01

    The quantum finance pricing formulas for coupon bond options and swaptions derived by Baaquie [Phys. Rev. E 75, 016703 (2006)] are reviewed. We empirically study the swaption market and propose an efficient computational procedure for analyzing the data. Empirical results of the swaption price, volatility, and swaption correlation are compared with the predictions of quantum finance. The quantum finance model generates the market swaption price to over 90% accuracy.

  1. The conceptual and empirical relationship between gambling, investing, and speculation

    PubMed Central

    Arthur, Jennifer N.; Williams, Robert J.; Delfabbro, Paul H.

    2016-01-01

    Background and aims To review the conceptual and empirical relationship between gambling, investing, and speculation. Methods An analysis of the attributes differentiating these constructs as well as identification of all articles speaking to their empirical relationship. Results Gambling differs from investment on many different attributes and should be seen as conceptually distinct. On the other hand, speculation is conceptually intermediate between gambling and investment, with a few of its attributes being investment-like, some being gambling-like, and several being neither clearly gambling-like nor investment-like. Empirically, gamblers, investors, and speculators have similar cognitive, motivational, and personality attributes, with this relationship being particularly strong for gambling and speculation. Population levels of gambling activity also tend to be correlated with population levels of financial speculation. At an individual level, speculation has a particularly strong empirical relationship to gambling, as speculators appear to be heavily involved in traditional forms of gambling and problematic speculation is strongly correlated with problematic gambling. Discussion and conclusions Investment is distinct from gambling, but speculation and gambling have conceptual overlap and a strong empirical relationship. It is recommended that financial speculation be routinely included when assessing gambling involvement, and there needs to be greater recognition and study of financial speculation both as a contributor to problem gambling and as an additional form of behavioral addiction in its own right. PMID:27929350

  2. Empirical study of recent Chinese stock market

    NASA Astrophysics Data System (ADS)

    Jiang, J.; Li, W.; Cai, X.; Wang, Qiuping A.

    2009-05-01

    We investigate the statistical properties of empirical data taken from the Chinese stock market during the period from January 2006 to July 2007. Using detrended fluctuation analysis (DFA) and correlation coefficients, we find evidence of strong correlations among different stock types, the stock index, stock volume turnover, A share (B share) seat number, and GDP per capita. In addition, we study the behavior of “volatility”, here defined as the difference between the new account numbers on two consecutive days. The empirical power law for the number of aftershock events exceeding a selected threshold is analogous to the Omori law originally observed in geophysics. Furthermore, we find that the cumulative distributions of stock return, trade volume and trade number are all exponential-like, and do not belong to the universality class of such distributions found by Gabaix et al. [Xavier Gabaix, Parameswaran Gopikrishnan, Vasiliki Plerou, H. Eugene Stanley, Nature, 423 (2003)] for major western markets. From this comparison we conclude that, in both developed and emerging stock markets, the “cubic law of returns” is valid only for long-term absolute returns; for short-term returns, the distributions are exponential-like. Specifically, the distributions of both trade volume and trade number display distinct decaying behaviors in two separate regimes. Lastly, the scaling behavior of the relation between dispersion and the mean monthly trade value is analyzed for each administrative area in China.

  3. Correlation matrix renormalization theory for correlated-electron materials with application to the crystalline phases of atomic hydrogen

    DOE PAGES

    Zhao, Xin; Liu, Jun; Yao, Yong-Xin; ...

    2018-01-23

    Developing accurate and computationally efficient methods to calculate the electronic structure and total energy of correlated-electron materials has been a very challenging task in condensed matter physics and materials science. Recently, we have developed a correlation matrix renormalization (CMR) method which does not assume any empirical Coulomb interaction U parameters and does not have double-counting problems in the ground-state total energy calculation. The CMR method has been demonstrated to be accurate in describing both the bonding and bond-breaking behaviors of molecules. In this study, we extend the CMR method to the treatment of electron correlations in periodic solid systems. By using a linear hydrogen chain as a benchmark system, we show that the results from the CMR method compare very well with those obtained recently by accurate quantum Monte Carlo (QMC) calculations. We also study the equations of state of three-dimensional crystalline phases of atomic hydrogen. We show that the results from the CMR method agree much better with the available QMC data in comparison with those from density functional theory and Hartree-Fock calculations.

  4. Correlation matrix renormalization theory for correlated-electron materials with application to the crystalline phases of atomic hydrogen

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhao, Xin; Liu, Jun; Yao, Yong-Xin

    Developing accurate and computationally efficient methods to calculate the electronic structure and total energy of correlated-electron materials has been a very challenging task in condensed matter physics and materials science. Recently, we have developed a correlation matrix renormalization (CMR) method which does not assume any empirical Coulomb interaction U parameters and does not have double-counting problems in the ground-state total energy calculation. The CMR method has been demonstrated to be accurate in describing both the bonding and bond-breaking behaviors of molecules. In this study, we extend the CMR method to the treatment of electron correlations in periodic solid systems. By using a linear hydrogen chain as a benchmark system, we show that the results from the CMR method compare very well with those obtained recently by accurate quantum Monte Carlo (QMC) calculations. We also study the equations of state of three-dimensional crystalline phases of atomic hydrogen. We show that the results from the CMR method agree much better with the available QMC data in comparison with those from density functional theory and Hartree-Fock calculations.

  5. Multiscale Characterization of PM2.5 in Southern Taiwan based on Noise-assisted Multivariate Empirical Mode Decomposition and Time-dependent Intrinsic Correlation

    NASA Astrophysics Data System (ADS)

    Hsiao, Y. R.; Tsai, C.

    2017-12-01

    As the WHO Air Quality Guideline indicates, ambient air pollution exposes world populations to the threat of fatal diseases (e.g. heart disease, lung cancer, asthma), raising concerns about air pollution sources and related factors. This study presents a novel approach to investigating the multiscale variations of PM2.5 in southern Taiwan over the past decade, together with four meteorological influencing factors (temperature, relative humidity, precipitation and wind speed), based on the Noise-assisted Multivariate Empirical Mode Decomposition (NAMEMD) algorithm, Hilbert Spectral Analysis (HSA) and the Time-dependent Intrinsic Correlation (TDIC) method. The NAMEMD algorithm is a fully data-driven approach designed for nonlinear and nonstationary multivariate signals, and is performed to decompose multivariate signals into a collection of channels of Intrinsic Mode Functions (IMFs). The TDIC method is an EMD-based method using a set of sliding window sizes to quantify localized correlation coefficients for multiscale signals. With the alignment property and quasi-dyadic filter bank of the NAMEMD algorithm, one is able to produce the same number of IMFs for all variables and estimate the cross-correlation more accurately. The performance of the spectral representation of the NAMEMD-HSA method is compared with Complementary Ensemble Empirical Mode Decomposition / Hilbert Spectral Analysis (CEEMD-HSA) and wavelet analysis. The NAMEMD-based TDIC analysis is then compared with CEEMD-based TDIC analysis and traditional correlation analysis.

  6. Development of hypersonic engine seals: Flow effects of preload and engine pressures

    NASA Technical Reports Server (NTRS)

    Cai, Zhong; Mutharasan, Rajakkannu; Ko, Frank K.; Steinetz, Bruce M.

    1993-01-01

    A new type of engine seal is being developed to meet the needs of advanced hypersonic engines. A seal braided of emerging high-temperature ceramic fibers with a sheath-core construction was selected for study based on its low leakage rates. Flexible, low-leakage, high-temperature seals are required to seal the movable engine panels of advanced ramjet-scramjet engines, either preventing potentially dangerous leakage into backside engine cavities or limiting the purge coolant flow rates through the seals. To predict the leakage through these flexible, porous seal structures as a function of preload and engine pressure, new analytical flow models are required. An empirical leakage resistance/preload model is proposed to characterize the observed decrease in leakage with increasing preload. An empirically determined compression modulus and preload factor are used to correlate experimental leakage data for a wide range of seal architectures. Good agreement between measured and predicted values is observed over a range of engine pressures and seal preloads.

  7. A Divergence Statistics Extension to VTK for Performance Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pebay, Philippe Pierre; Bennett, Janine Camille

    This report follows the series of previous documents [PT08, BPRT09b, PT09, BPT09, PT10, PB13], in which we presented the parallel descriptive, correlative, multi-correlative, principal component analysis, contingency, k-means, order and auto-correlative statistics engines which we developed within the Visualization Tool Kit (VTK) as a scalable, parallel and versatile statistics package. We now report on a new engine which we developed for the calculation of divergence statistics, a concept which we hereafter explain and whose main goal is to quantify the discrepancy, in a statistical manner akin to measuring a distance, between an observed empirical distribution and a theoretical, "ideal" one. The ease of use of the new divergence statistics engine is illustrated by means of C++ code snippets. Although this new engine does not yet have a parallel implementation, it has already been applied to HPC performance analysis, of which we provide an example.
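The excerpt does not specify which divergence the engine computes; a hedged sketch using the Kullback-Leibler divergence between an empirical histogram and a theoretical distribution illustrates the general idea (the bin layout and data below are invented for illustration):

```python
from math import log

def kl_divergence(counts, theoretical):
    """D(P_emp || Q) in nats between an observed histogram and a
    theoretical pmf over the same bins. Q must be positive wherever
    the empirical distribution has mass."""
    total = sum(counts)
    d = 0.0
    for c, q in zip(counts, theoretical):
        if c == 0:
            continue  # 0 * log(0/q) -> 0 by convention
        p = c / total
        if q <= 0:
            raise ValueError("theoretical pmf is zero where data exist")
        d += p * log(p / q)
    return d

# Observed die rolls against the fair-die model:
obs = [18, 22, 20, 19, 21, 20]          # 120 rolls
fair = [1 / 6] * 6
print(kl_divergence(obs, fair) >= 0.0)  # KL divergence is non-negative
```

Like a distance, the divergence is zero exactly when the empirical histogram matches the model, but it is not symmetric in its arguments.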

  8. Communication as an ecological system.

    PubMed

    Borg, Erik; Bergkvist, Christina; Olsson, Inga-Stina; Wikström, Carina; Borg, Birgitta

    2008-11-01

    A conceptual framework for human communication, based on traditional biological ecology, is further developed. The difference between communication at the message and behavioural levels is emphasized. Empirical data are presented from various studies, showing that degree of satisfaction with communication is correlated with how close the outcome is to the memory of function prior to hearing impairment. We found no indication that hearing-impaired subjects overestimated their previous hearing or the hearing of normal-hearing people. Satisfaction was also correlated with the outcome and degree of fulfillment of expectations. It did not correlate with improvement of function. The concept of balance was presented and tested using a semi-quantitative approach. Several projects were presented in which the framework was applied: the hearing impaired as counsellor, choosing sides in unilateral deafness, a monitoring device for the deafblind, interaction between Swedish as a second language and hearing impairment, language development in hearing impaired children. By regarding hearing as a component of a communicative system, the perspective of audiological analysis and rehabilitation is broadened.

  9. Pluvials, droughts, the Mongol Empire, and modern Mongolia

    NASA Astrophysics Data System (ADS)

    Pederson, Neil; Hessl, Amy E.; Baatarbileg, Nachin; Anchukaitis, Kevin J.; Di Cosmo, Nicola

    2014-03-01

    Although many studies have associated the demise of complex societies with deteriorating climate, few have investigated the connection between an ameliorating environment, surplus resources, energy, and the rise of empires. The 13th-century Mongol Empire was the largest contiguous land empire in world history. Although drought has been proposed as one factor that spurred these conquests, no high-resolution moisture data are available during the rapid development of the Mongol Empire. Here we present a 1,112-y tree-ring reconstruction of warm-season water balance derived from Siberian pine (Pinus sibirica) trees in central Mongolia. Our reconstruction accounts for 56% of the variability in the regional water balance and is significantly correlated with steppe productivity across central Mongolia. In combination with a gridded temperature reconstruction, our results indicate that the regional climate during the conquests of Chinggis Khan's (Genghis Khan's) 13th-century Mongol Empire was warm and persistently wet. This period, characterized by 15 consecutive years of above-average moisture in central Mongolia and coinciding with the rise of Chinggis Khan, is unprecedented over the last 1,112 y. We propose that these climate conditions promoted high grassland productivity and favored the formation of Mongol political and military power. Tree-ring and meteorological data also suggest that the early 21st-century drought in central Mongolia was the hottest drought in the last 1,112 y, consistent with projections of warming over Inner Asia. Future warming may overwhelm increases in precipitation leading to similar heat droughts, with potentially severe consequences for modern Mongolia.

  10. Pluvials, droughts, the Mongol Empire, and modern Mongolia.

    PubMed

    Pederson, Neil; Hessl, Amy E; Baatarbileg, Nachin; Anchukaitis, Kevin J; Di Cosmo, Nicola

    2014-03-25

    Although many studies have associated the demise of complex societies with deteriorating climate, few have investigated the connection between an ameliorating environment, surplus resources, energy, and the rise of empires. The 13th-century Mongol Empire was the largest contiguous land empire in world history. Although drought has been proposed as one factor that spurred these conquests, no high-resolution moisture data are available during the rapid development of the Mongol Empire. Here we present a 1,112-y tree-ring reconstruction of warm-season water balance derived from Siberian pine (Pinus sibirica) trees in central Mongolia. Our reconstruction accounts for 56% of the variability in the regional water balance and is significantly correlated with steppe productivity across central Mongolia. In combination with a gridded temperature reconstruction, our results indicate that the regional climate during the conquests of Chinggis Khan's (Genghis Khan's) 13th-century Mongol Empire was warm and persistently wet. This period, characterized by 15 consecutive years of above-average moisture in central Mongolia and coinciding with the rise of Chinggis Khan, is unprecedented over the last 1,112 y. We propose that these climate conditions promoted high grassland productivity and favored the formation of Mongol political and military power. Tree-ring and meteorological data also suggest that the early 21st-century drought in central Mongolia was the hottest drought in the last 1,112 y, consistent with projections of warming over Inner Asia. Future warming may overwhelm increases in precipitation leading to similar heat droughts, with potentially severe consequences for modern Mongolia.

  11. Pluvials, droughts, the Mongol Empire, and modern Mongolia

    PubMed Central

    Pederson, Neil; Hessl, Amy E.; Baatarbileg, Nachin; Anchukaitis, Kevin J.; Di Cosmo, Nicola

    2014-01-01

    Although many studies have associated the demise of complex societies with deteriorating climate, few have investigated the connection between an ameliorating environment, surplus resources, energy, and the rise of empires. The 13th-century Mongol Empire was the largest contiguous land empire in world history. Although drought has been proposed as one factor that spurred these conquests, no high-resolution moisture data are available during the rapid development of the Mongol Empire. Here we present a 1,112-y tree-ring reconstruction of warm-season water balance derived from Siberian pine (Pinus sibirica) trees in central Mongolia. Our reconstruction accounts for 56% of the variability in the regional water balance and is significantly correlated with steppe productivity across central Mongolia. In combination with a gridded temperature reconstruction, our results indicate that the regional climate during the conquests of Chinggis Khan’s (Genghis Khan’s) 13th-century Mongol Empire was warm and persistently wet. This period, characterized by 15 consecutive years of above-average moisture in central Mongolia and coinciding with the rise of Chinggis Khan, is unprecedented over the last 1,112 y. We propose that these climate conditions promoted high grassland productivity and favored the formation of Mongol political and military power. Tree-ring and meteorological data also suggest that the early 21st-century drought in central Mongolia was the hottest drought in the last 1,112 y, consistent with projections of warming over Inner Asia. Future warming may overwhelm increases in precipitation leading to similar heat droughts, with potentially severe consequences for modern Mongolia. PMID:24616521

  12. Prediction of Very High Reynolds Number Compressible Skin Friction

    NASA Technical Reports Server (NTRS)

    Carlson, John R.

    1998-01-01

    Flat-plate skin friction calculations over a range of Mach numbers from 0.4 to 3.5 at Reynolds numbers from 16 million to 492 million, using a Navier-Stokes method with advanced turbulence modeling, are compared with incompressible skin friction coefficient correlations. The semi-empirical correlation theories of van Driest; Cope; Winkler and Cha; and Sommer and Short T' are used to transform the predicted skin friction coefficients of solutions using two algebraic Reynolds stress turbulence models in the Navier-Stokes method PAB3D. In general, the predicted skin friction coefficients scaled well with each reference temperature theory, though overall the theory of Sommer and Short appeared to best collapse the predicted coefficients. At the lower Reynolds numbers of 3 to 30 million, both the Girimaji and the Shih, Zhu and Lumley turbulence models predicted skin-friction coefficients within 2% of the semi-empirical correlation values. At the higher Reynolds numbers of 100 to 500 million, the turbulence models of Shih, Zhu and Lumley and of Girimaji predicted coefficients that were 6% less and 10% greater, respectively, than the semi-empirical coefficients.
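A hedged sketch of the Sommer and Short T' reference-temperature transformation mentioned above; the pairing with the 1/5-power incompressible correlation and Sutherland viscosity is an illustrative choice, not the paper's exact procedure:

```python
def sutherland_mu(T):
    """Sutherland's law for air viscosity (SI units, T in kelvin)."""
    return 1.458e-6 * T ** 1.5 / (T + 110.4)

def cf_sommer_short(M_e, Re_x, T_e, Tw_over_Te=None, gamma=1.4, r=0.89):
    """Turbulent flat-plate skin friction via the Sommer-Short T'
    reference-temperature method, paired (illustratively) with the
    1/5-power incompressible correlation Cf = 0.0592 / Re**0.2.

    An adiabatic wall is assumed when Tw_over_Te is None."""
    if Tw_over_Te is None:
        Tw_over_Te = 1.0 + r * (gamma - 1.0) / 2.0 * M_e ** 2
    # Sommer & Short: T'/Te = 1 + 0.035 M^2 + 0.45 (Tw/Te - 1)
    T_ratio = 1.0 + 0.035 * M_e ** 2 + 0.45 * (Tw_over_Te - 1.0)
    T_ref = T_ratio * T_e
    # Ideal gas at constant pressure: rho'/rho_e = Te/T'.
    rho_ratio = 1.0 / T_ratio
    # Reynolds number re-evaluated with reference-temperature properties.
    Re_ref = Re_x * rho_ratio * sutherland_mu(T_e) / sutherland_mu(T_ref)
    cf_inc = 0.0592 / Re_ref ** 0.2
    # Rescale wall shear from reference density to edge dynamic pressure.
    return cf_inc * rho_ratio

# Compressibility lowers Cf at a fixed Reynolds number:
print(cf_sommer_short(0.4, 1e8, 222.0) > cf_sommer_short(3.0, 1e8, 222.0))
```

The idea common to all the reference-temperature theories cited is the same: evaluate an incompressible correlation at fluid properties taken at T' rather than at edge conditions.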

  13. Statistical properties of correlated solar flares and coronal mass ejections in cycles 23 and 24

    NASA Astrophysics Data System (ADS)

    Aarnio, Alicia

    2018-01-01

    Outstanding problems in understanding early stellar systems include mass loss, angular momentum evolution, and the effects of energetic events on the surrounding environs. The latter of these drives much research into our own system's space weather and the development of predictive algorithms for geomagnetic storms. So dually motivated, we have leveraged a big-data approach to combine two decades of GOES and LASCO data to identify a large sample of spatially and temporally correlated solar flares and CMEs. In this presentation, we revisit the analysis of Aarnio et al. (2011), adding 10 years of data and further exploring the relationships between correlated flare and CME properties. We compare the updated data set results to those previously obtained, and discuss the effects of selecting smaller time windows within solar cycles 23 and 24 on the empirically defined relationships between correlated flare and CME properties. Finally, we discuss a newly identified large sample of potentially interesting correlated flares and CMEs perhaps erroneously excluded from previous searches.

  14. Are cross-cultural comparisons of norms on death anxiety valid?

    PubMed

    Beshai, James A

    2008-01-01

    Cross-cultural comparisons of norms derived from research on Death Anxiety are valid as long as they provide existential validity. Existential validity is not empirically derived like construct validity. It is an understanding of being human unto death. It is the realization that death is imminent. It is the inner sense that provides a responder to death anxiety scales with a valid expression of his or her sense about the prospect of dying. It can be articulated in a life review by a disclosure of one's ontology. This article calls upon psychologists who develop death anxiety scales to disclose their presuppositions about death before administering a questionnaire. By disclosing his or her ontology a psychologist provides a means of disclosing his or her intentionality in responding to the items. This humanistic paradigm allows for an interactive participation between investigator and subject. Lester, Templer, and Abdel-Khalek (2006-2007) enriched psychology with significant empirical data on several correlates of death anxiety. But all scientists, especially psychologists, will always have alternative interpretations of the same empirical fact pattern. Empirical data is limited by the affirmation of the consequent limitation. A phenomenology of language and communication makes existential validity a necessary step for a broader understanding of the meaning of death anxiety.

  15. Prediction of an Apparent Flame Length in a Co-Axial Jet Diffusion Flame Combustor.

    DTIC Science & Technology

    1983-04-01

    This report comprises two parts. In Part I a predictive model for an apparent flame length in a co-axial jet diffusion flame combustor is...Overall mass transfer coefficient, evaluated from an empirically developed correlation, is employed to predict total flame length. Comparison of the...experimental and predicted data on total flame length shows a reasonable agreement within sixteen percent over the investigated air and fuel flow rate

  16. The Play Experience Scale: development and validation of a measure of play.

    PubMed

    Pavlas, Davin; Jentsch, Florian; Salas, Eduardo; Fiore, Stephen M; Sims, Valerie

    2012-04-01

    A measure of play experience in video games was developed through literature review and two empirical validation studies. Despite the considerable attention given to games in the behavioral sciences, play experience remains empirically underexamined. One reason for this gap is the absence of a scale that measures play experience. In Study 1, the initial Play Experience Scale (PES) was tested through an online validation that featured three different games (N = 203). In Study 2, a revised PES was assessed with a serious game in the laboratory (N = 77). Through principal component analysis of the Study 1 data, the initial 20-item PES was revised, resulting in the 16-item PES-16. Study 2 showed the PES-16 to be a robust instrument with the same patterns of correlations as in Study 1 via (a) internal consistency estimates, (b) correlations with established scales of motivation, (c) distributions of PES-16 scores in different game conditions, and (d) examination of the average variance extracted of the PES and the Intrinsic Motivation Scale. We suggest that the PES is appropriate for use in further validation studies. Additional examinations of the scale are required to determine its applicability to other contexts and its relationship with other constructs. The PES is potentially relevant to human factors undertakings involving video games, including basic research into play, games, and learning; prototype testing; and exploratory learning studies.
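    Internal-consistency estimates like those in (a) are conventionally Cronbach's alpha. A minimal sketch, using an invented respondent-by-item score matrix rather than the PES data:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)        # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)    # variance of summed scores
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Illustrative data: 5 respondents x 4 items (hypothetical, not PES scores)
scores = np.array([
    [4, 5, 4, 5],
    [2, 2, 3, 2],
    [5, 4, 5, 4],
    [3, 3, 3, 4],
    [1, 2, 1, 2],
], dtype=float)
alpha = cronbach_alpha(scores)
```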

  17. Prediction of Agglomeration, Fouling, and Corrosion Tendency of Fuels in CFB Co-Combustion

    NASA Astrophysics Data System (ADS)

    Barišć, Vesna; Zabetta, Edgardo Coda; Sarkki, Juha

    Prediction of the agglomeration, fouling, and corrosion tendency of fuels is essential to the design of any CFB boiler. Over the years, tools have been successfully developed at Foster Wheeler to help with such predictions for the most common commercial fuels. However, changes in the fuel market and the ever-growing demand for co-combustion capabilities pose a continuous need for development. This paper presents results from recently upgraded models used at Foster Wheeler to predict the agglomeration, fouling, and corrosion tendency of a variety of fuels and mixtures. The models, the subject of this paper, are semi-empirical computer tools that combine the theoretical basics of agglomeration/fouling/corrosion phenomena with empirical correlations. Correlations are derived from Foster Wheeler's experience in fluidized beds, including nearly 10,000 fuel samples and over 1,000 tests in about 150 CFB units. In these models, fuels are evaluated based on their classification and their chemical and physical properties from standard analyses (proximate, ultimate, fuel ash composition, etc.) alongside Foster Wheeler's own characterization methods. Mixtures are then evaluated taking into account the component fuels. This paper presents the predictive capabilities of the agglomeration/fouling/corrosion probability models for selected fuels and mixtures fired at full scale. The selected fuels include coals and different types of biomass. The models are capable of predicting the behavior of most fuels and mixtures, but also offer possibilities for further improvement.

  18. Cognitive neuroenhancement: false assumptions in the ethical debate.

    PubMed

    Heinz, Andreas; Kipke, Roland; Heimann, Hannah; Wiesing, Urban

    2012-06-01

    The present work critically examines two assumptions frequently stated by supporters of cognitive neuroenhancement. The first, explicitly methodological, assumption is the supposition of effective and side effect-free neuroenhancers. However, there is an evidence-based concern that the most promising drugs currently used for cognitive enhancement can be addictive. Furthermore, this work describes why the neuronal correlates of key cognitive concepts, such as learning and memory, are so deeply connected with mechanisms implicated in the development and maintenance of addictive behaviour that modification of these systems may inevitably run the risk of addiction to the enhancing drugs. Such a potential risk of addiction could only be falsified by in-depth empirical research. The second, implicit, assumption is that research on neuroenhancement does not pose a serious moral problem. However, the potential for addiction, along with arguments related to research ethics and the potential social impact of neuroenhancement, could invalidate this assumption. It is suggested that ethical evaluation needs to consider the empirical data as well as the question of whether and how such empirical knowledge can be obtained.

  19. Retaining Early Childhood Education Workers: A Review of the Empirical Literature

    ERIC Educational Resources Information Center

    Totenhagen, Casey J.; Hawkins, Stacy Ann; Casper, Deborah M.; Bosch, Leslie A.; Hawkey, Kyle R.; Borden, Lynne M.

    2016-01-01

    Low retention in the child care workforce is a persistent challenge that has been associated with negative outcomes for children, staff, and centers. This article reviews the empirical literature, identifying common correlates or predictors of retention for child care workers. Searches were conducted using several databases, and articles that…

  20. Empirical data and the variance-covariance matrix for the 1969 Smithsonian Standard Earth (2)

    NASA Technical Reports Server (NTRS)

    Gaposchkin, E. M.

    1972-01-01

    The empirical data used in the 1969 Smithsonian Standard Earth (2) are presented. The variance-covariance matrix, or the normal equations, used for correlation analysis, are considered. The format and contents of the matrix, available on magnetic tape, are described and a sample printout is given.
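    For correlation analysis of this kind, a variance-covariance matrix is normalized by the square roots of its diagonal. A generic sketch with a made-up 3x3 matrix (not the Standard Earth data):

```python
import numpy as np

def cov_to_corr(cov: np.ndarray) -> np.ndarray:
    """Convert a variance-covariance matrix to a correlation matrix."""
    sd = np.sqrt(np.diag(cov))          # standard deviations
    return cov / np.outer(sd, sd)       # corr_ij = cov_ij / (sd_i * sd_j)

# Hypothetical symmetric positive-definite covariance matrix
cov = np.array([
    [4.0, 1.2, -0.8],
    [1.2, 9.0,  0.6],
    [-0.8, 0.6, 1.0],
])
corr = cov_to_corr(cov)
```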

  1. Tracer kinetics of forearm endothelial function: comparison of an empirical method and a quantitative modeling technique.

    PubMed

    Zhao, Xueli; Arsenault, Andre; Lavoie, Kim L; Meloche, Bernard; Bacon, Simon L

    2007-01-01

    Forearm Endothelial Function (FEF) is a marker that has been shown to discriminate patients with cardiovascular disease (CVD). FEF has been assessed using several parameters: the Rate of Uptake Ratio (RUR), EWUR (Elbow-to-Wrist Uptake Ratio) and EWRUR (Elbow-to-Wrist Relative Uptake Ratio). However, the modeling functions of FEF require more robust models. The present study was designed to compare an empirical method with quantitative modeling techniques to better estimate the physiological parameters and understand the complex dynamic processes. The fitted time activity curves of the forearms, estimating blood and muscle components, were assessed using both an empirical method and a two-compartment model. Although correlational analyses suggested a good correlation between the methods for RUR (r=.90) and EWUR (r=.79), but not EWRUR (r=.34), Bland-Altman plots found poor agreement between the methods for all 3 parameters. These results indicate that there is a large discrepancy between the empirical and computational methods for FEF. Further work is needed to establish the physiological and mathematical validity of the 2 modeling methods.
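    The contrast drawn here, high correlation but poor agreement, is exactly what a Bland-Altman analysis exposes. A minimal sketch with synthetic paired measurements (invented values, not the FEF data):

```python
import numpy as np

def bland_altman(a: np.ndarray, b: np.ndarray):
    """Mean difference (bias) and 95% limits of agreement between two methods."""
    diff = a - b
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical paired measurements from two methods
rng = np.random.default_rng(0)
empirical = rng.normal(10.0, 1.0, 50)
modelled = empirical + rng.normal(0.5, 0.3, 50)   # systematic offset + noise

r = np.corrcoef(empirical, modelled)[0, 1]        # correlation stays high...
bias, lo, hi = bland_altman(empirical, modelled)  # ...yet a clear bias remains
```

    Correlation rewards any linear relationship; the limits of agreement reveal whether the two methods can actually substitute for one another.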

  2. Wind-chill-equivalent temperatures: regarding the impact due to the variability of the environmental convective heat transfer coefficient.

    PubMed

    Shitzer, Avraham

    2006-03-01

    The wind-chill index (WCI), developed in Antarctica in the 1940s and recently updated by the weather services in the USA and Canada, expresses the enhancement of heat loss in cold climates from exposed body parts, e.g., face, due to wind. The index provides a simple and practical means for assessing the thermal effects of wind on humans outdoors. It is also used for indicating weather conditions that may pose adverse risks of freezing at subfreezing environmental temperatures. Values of the WCI depend on a number of parameters, e.g., temperatures, physical properties of the air, wind speed, etc., and on insolation and evaporation. This paper focuses on the effects of various empirical correlations used in the literature for calculating the convective heat transfer coefficients between humans and their environment. Insolation and evaporation are not included in the presentation. Large differences in calculated values among these correlations are demonstrated and quantified. Steady-state wind-chill-equivalent temperatures (WCETs) are estimated by a simple, one-dimensional heat-conducting hollow-cylindrical model using these empirical correlations. Partial comparison of these values with the published "new" WCETs is presented. The variability of the estimated WCETs, due to different correlations employed to calculate them, is clearly demonstrated. The results of this study clearly suggest the need for establishing a "gold standard" for estimating convective heat exchange between exposed body elements and the cold and windy environment. This should be done prior to the introduction and adoption of further modifications to WCETs and indices. Correlations to estimate the convective heat transfer coefficients between exposed body parts of humans in windy and cold environments influence the WCETs and need to be standardized.
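    For reference, the published "new" (2001) wind-chill regression adopted by the US and Canadian weather services can be evaluated directly; the convective-coefficient choices behind it are precisely what the paper scrutinizes. Metric form, temperature in degrees C and 10-m wind speed in km/h:

```python
def wind_chill_c(temp_c: float, wind_kmh: float) -> float:
    """2001 wind-chill-equivalent temperature (JAG/TI form), valid roughly
    for temp <= 10 C and wind > 4.8 km/h."""
    v = wind_kmh ** 0.16
    return 13.12 + 0.6215 * temp_c - 11.37 * v + 0.3965 * temp_c * v

wct = wind_chill_c(-10.0, 30.0)   # close to -20 C on the official charts
```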

  3. Wind-chill-equivalent temperatures: regarding the impact due to the variability of the environmental convective heat transfer coefficient

    NASA Astrophysics Data System (ADS)

    Shitzer, Avraham

    2006-03-01

    The wind-chill index (WCI), developed in Antarctica in the 1940s and recently updated by the weather services in the USA and Canada, expresses the enhancement of heat loss in cold climates from exposed body parts, e.g., face, due to wind. The index provides a simple and practical means for assessing the thermal effects of wind on humans outdoors. It is also used for indicating weather conditions that may pose adverse risks of freezing at subfreezing environmental temperatures. Values of the WCI depend on a number of parameters, e.g., temperatures, physical properties of the air, wind speed, etc., and on insolation and evaporation. This paper focuses on the effects of various empirical correlations used in the literature for calculating the convective heat transfer coefficients between humans and their environment. Insolation and evaporation are not included in the presentation. Large differences in calculated values among these correlations are demonstrated and quantified. Steady-state wind-chill-equivalent temperatures (WCETs) are estimated by a simple, one-dimensional heat-conducting hollow-cylindrical model using these empirical correlations. Partial comparison of these values with the published “new” WCETs is presented. The variability of the estimated WCETs, due to different correlations employed to calculate them, is clearly demonstrated. The results of this study clearly suggest the need for establishing a “gold standard” for estimating convective heat exchange between exposed body elements and the cold and windy environment. This should be done prior to the introduction and adoption of further modifications to WCETs and indices. Correlations to estimate the convective heat transfer coefficients between exposed body parts of humans in windy and cold environments influence the WCETs and need to be standardized.

  4. Stable distribution and long-range correlation of Brent crude oil market

    NASA Astrophysics Data System (ADS)

    Yuan, Ying; Zhuang, Xin-tian; Jin, Xiu; Huang, Wei-qiang

    2014-11-01

    An empirical study of the stable distribution and long-range correlation of the Brent crude oil market is presented. First, it is found that the empirical distribution of Brent crude oil returns can be fitted well by a stable distribution, which is significantly different from a normal distribution. Second, detrended fluctuation analysis of the Brent crude oil returns shows that there are long-range correlations in returns, implying that there are patterns or trends in returns that persist over time. Third, the same analysis shows that after the 2008 financial crisis, the Brent crude oil market became more persistent, implying that the crisis may have increased the frequency and strength of the interdependence and correlations between the financial time series. All of these findings may be used to improve current fractal theories.
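    Detrended fluctuation analysis of the kind used here estimates a scaling exponent from window-wise detrended fluctuations; an exponent near 0.5 indicates no long-range correlation, above 0.5 persistence. A compact first-order DFA sketch, checked on white noise rather than the Brent series:

```python
import numpy as np

def dfa_exponent(x, scales=(16, 32, 64, 128, 256)):
    """DFA-1 scaling exponent: slope of log F(s) versus log s."""
    y = np.cumsum(x - np.mean(x))            # integrated profile
    F = []
    for s in scales:
        n = len(y) // s
        segs = y[:n * s].reshape(n, s)
        t = np.arange(s)
        msq = []
        for seg in segs:
            coef = np.polyfit(t, seg, 1)     # linear detrend per window
            msq.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        F.append(np.sqrt(np.mean(msq)))
    slope, _ = np.polyfit(np.log(scales), np.log(F), 1)
    return slope

rng = np.random.default_rng(1)
alpha_wn = dfa_exponent(rng.normal(size=4096))   # white noise: alpha near 0.5
```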

  5. On the galaxy-halo connection in the EAGLE simulation

    NASA Astrophysics Data System (ADS)

    Desmond, Harry; Mao, Yao-Yuan; Wechsler, Risa H.; Crain, Robert A.; Schaye, Joop

    2017-10-01

    Empirical models of galaxy formation require assumptions about the correlations between galaxy and halo properties. These may be calibrated against observations or inferred from physical models such as hydrodynamical simulations. In this Letter, we use the EAGLE simulation to investigate the correlation of galaxy size with halo properties. We motivate this analysis by noting that the common assumption of angular momentum partition between baryons and dark matter in rotationally supported galaxies overpredicts both the spread in the stellar mass-size relation and the anticorrelation of size and velocity residuals, indicating a problem with the galaxy-halo connection it implies. We find the EAGLE galaxy population to perform significantly better on both statistics, and trace this success to the weakness of the correlations of galaxy size with halo mass, concentration and spin at fixed stellar mass. Using these correlations in empirical models will enable fine-grained aspects of galaxy scalings to be matched.

  6. Energy performance assessment with empirical methods: application of energy signature

    NASA Astrophysics Data System (ADS)

    Belussi, L.; Danza, L.; Meroni, I.; Salamone, F.

    2015-03-01

    Energy efficiency and reduction of building consumption are deeply felt issues at both the Italian and international levels. The recent regulatory framework sets stringent limits on the energy performance of buildings. Awaiting the adoption of these principles, several methods have been developed to address the problem of the energy consumption of buildings, among which the simplified energy audit is intended to identify any anomalies in the building system, to provide helpful tips for energy refurbishments and to raise end users' awareness. The Energy Signature is an operational tool of these methodologies, an evaluation method in which energy consumption is correlated with climatic variables, representing the actual energy behaviour of the building. In addition to that purpose, the Energy Signature can be used as an empirical tool to determine the real performances of the technical elements. The latter aspect is illustrated in this article.

  7. Chemical Sensor Array Response Modeling Using Quantitative Structure-Activity Relationships Technique

    NASA Astrophysics Data System (ADS)

    Shevade, Abhijit V.; Ryan, Margaret A.; Homer, Margie L.; Zhou, Hanying; Manfreda, Allison M.; Lara, Liana M.; Yen, Shiao-Pin S.; Jewell, April D.; Manatt, Kenneth S.; Kisor, Adam K.

    We have developed a Quantitative Structure-Activity Relationships (QSAR) based approach to correlate the response of chemical sensors in an array with molecular descriptors. A novel molecular descriptor set has been developed; this set combines descriptors of sensing film-analyte interactions, representing sensor response, with a basic analyte descriptor set commonly used in QSAR studies. The descriptors are obtained using a combination of molecular modeling tools and empirical and semi-empirical Quantitative Structure-Property Relationships (QSPR) methods. The sensors under investigation are polymer-carbon sensing films which have been exposed to analyte vapors at parts-per-million (ppm) concentrations; response is measured as change in film resistance. Statistically validated QSAR models have been developed using Genetic Function Approximations (GFA) for a sensor array for a given training data set. The applicability of the sensor response models has been tested by using them to predict the sensor activities for test analytes not considered in the training set for the model development. The validated QSAR sensor response models show good predictive ability. The QSAR approach is a promising computational tool for sensing materials evaluation and selection. It can also be used to predict the response of an existing sensing film to new target analytes.

  8. Closing in on chemical bonds by opening up relativity theory.

    PubMed

    Whitney, Cynthia K

    2008-03-01

    This paper develops a connection between the phenomenology of chemical bonding and the theory of relativity. Empirical correlations between electron numbers in atoms and chemical bond stabilities in molecules are first reviewed and extended. Quantitative chemical bond strengths are then related to ionization potentials in elements. Striking patterns in ionization potentials are revealed when the data are viewed in an element-independent way, where element-specific details are removed via an appropriate scaling law. The scale factor involved is not explained by quantum mechanics; it is revealed only when one goes back further, to the development of Einstein's special relativity theory.

  9. Empirical research on the correlation between economic development and environmental pollution in natural resource abundant regions: the case of China Shaanxi province

    NASA Astrophysics Data System (ADS)

    Luo, Bo; Zhang, Jinsuo

    2018-02-01

    This paper investigates the relationship between economic development and environmental pollution in natural-resource-abundant regions by testing the Environmental Kuznets Curve (EKC) hypothesis with regression analysis, based on statistical data on per capita GDP growth and environmental pollution indicators in Shaanxi Province from 1989 to 2015. The results show that per capita GDP and environmental pollution in Shaanxi Province do not always accord with the “inverted U” Environmental Kuznets Curve, mainly showing “N” shapes; only SO2 shows the “inverted U” shape.
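    EKC tests of this kind typically fit a cubic in per-capita GDP; the coefficient signs then distinguish an "inverted U" from an "N" shape (positive linear, negative quadratic, positive cubic terms). A synthetic illustration only, with invented data rather than the Shaanxi series:

```python
import numpy as np

# Cubic EKC specification: pollution = b0 + b1*g + b2*g^2 + b3*g^3,
# with g = per-capita GDP; b1 > 0, b2 < 0, b3 > 0 gives an "N" shape.
g = np.linspace(1.0, 10.0, 27)                    # 27 "years" of invented data
rng = np.random.default_rng(2)
pollution = 2 + 3.0 * g - 0.9 * g**2 + 0.06 * g**3 + rng.normal(0, 0.2, g.size)

b3, b2, b1, b0 = np.polyfit(g, pollution, 3)      # highest degree first
is_n_shaped = (b1 > 0) and (b2 < 0) and (b3 > 0)
```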

  10. The Chern-Simons Current in Systems of DNA-RNA Transcriptions

    NASA Astrophysics Data System (ADS)

    Capozziello, Salvatore; Pincak, Richard; Kanjamapornkul, Kabin; Saridakis, Emmanuel N.

    2018-04-01

    A Chern-Simons current, coming from ghost and anti-ghost fields of supersymmetry theory, can be used to define a spectrum of gene expression in new time series data where a spinor field, as an alternative representation of a gene, is adopted instead of using the standard alphabet sequence of bases $A, T, C, G, U$. After a general discussion on the use of supersymmetry in biological systems, we give examples of the use of supersymmetry for living organisms, discuss the codon and anti-codon ghost fields and develop an algebraic construction for trash DNA, the DNA regions that do not seem active in biological systems. As a general result, all hidden states of codons can be computed by Chern-Simons 3-forms. Finally, we plot a time series of genetic variations of a viral glycoprotein gene and a host T-cell receptor gene by using a gene tensor correlation network related to the Chern-Simons current. An empirical analysis of genetic shift, in host cell receptor genes with a separated cluster of genes, and genetic drift in the viral gene, is obtained by using a tensor correlation plot over time series data derived as the empirical mode decomposition of the Chern-Simons current.

  11. Analytical determination of propeller performance degradation due to ice accretion

    NASA Technical Reports Server (NTRS)

    Miller, T. L.

    1986-01-01

    A computer code has been developed which is capable of computing propeller performance for clean, glaze, or rime iced propeller configurations, thereby providing a mechanism for determining the degree of performance degradation which results from a given icing encounter. The inviscid, incompressible flow field at each specified propeller radial location is first computed using the Theodorsen transformation method of conformal mapping. A droplet trajectory computation then calculates droplet impingement points and airfoil collection efficiency for each radial location, at which point several user-selectable empirical correlations are available for determining the aerodynamic penalties which arise due to the ice accretion. Propeller performance is finally computed using strip analysis for either the clean or iced propeller. In the iced mode, the differential thrust and torque coefficient equations are modified by the drag and lift coefficient increments due to ice to obtain the appropriate iced values. Comparison with available experimental propeller icing data shows good agreement in several cases. The code's capability to properly predict iced thrust coefficient, power coefficient, and propeller efficiency is shown to be dependent on the choice of empirical correlation employed as well as proper specification of radial icing extent.
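    The strip-analysis step can be illustrated with a single blade element whose lift and drag coefficients receive icing increments. This is a schematic of standard blade-element thrust, with invented numbers; it is not the NASA code:

```python
import math

def strip_thrust(rho, W, chord, B, cl, cd, phi, dr):
    """Thrust contribution of one radial strip (blade-element form):
    dT = 0.5 * rho * W^2 * B * c * (Cl*cos(phi) - Cd*sin(phi)) * dr."""
    return 0.5 * rho * W**2 * B * chord * (cl * math.cos(phi) - cd * math.sin(phi)) * dr

# Clean vs iced station: empirical icing correlations would supply the
# lift decrement and drag increment (values here are made up)
dT_clean = strip_thrust(1.225, 120.0, 0.15, 2, 0.6, 0.02, 0.35, 0.05)
dT_iced = strip_thrust(1.225, 120.0, 0.15, 2, 0.6 - 0.1, 0.02 + 0.03, 0.35, 0.05)
```

    Summing such strips over the radius, clean and iced, yields the thrust and torque coefficients whose difference quantifies the performance degradation.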

  12. Pleiotropy of cardiometabolic syndrome with obesity-related anthropometric traits determined using empirically derived kinships from the Busselton Health Study.

    PubMed

    Cadby, Gemma; Melton, Phillip E; McCarthy, Nina S; Almeida, Marcio; Williams-Blangero, Sarah; Curran, Joanne E; VandeBerg, John L; Hui, Jennie; Beilby, John; Musk, A W; James, Alan L; Hung, Joseph; Blangero, John; Moses, Eric K

    2018-01-01

    Over two billion adults are overweight or obese and therefore at an increased risk of cardiometabolic syndrome (CMS). Obesity-related anthropometric traits genetically correlated with CMS may provide insight into CMS aetiology. The aim of this study was to utilise an empirically derived genetic relatedness matrix to calculate heritabilities and genetic correlations between CMS and anthropometric traits to determine whether they share genetic risk factors (pleiotropy). We used genome-wide single nucleotide polymorphism (SNP) data on 4671 Busselton Health Study participants. Exploiting both known and unknown relatedness, empirical kinship probabilities were estimated using these SNP data. General linear mixed models implemented in SOLAR were used to estimate narrow-sense heritabilities (h^2) and genetic correlations (r_g) between 15 anthropometric and 9 CMS traits. Anthropometric traits were adjusted by body mass index (BMI) to determine whether the observed genetic correlation was independent of obesity. After adjustment for multiple testing, all CMS and anthropometric traits were significantly heritable (h^2 range 0.18-0.57). We identified 50 significant genetic correlations (r_g range: -0.37 to 0.75) between CMS and anthropometric traits. Five genetic correlations remained significant after adjustment for BMI [high density lipoprotein cholesterol (HDL-C) and waist-hip ratio; triglycerides and waist-hip ratio; triglycerides and waist-height ratio; non-HDL-C and waist-height ratio; insulin and iliac skinfold thickness]. This study provides evidence for the presence of potentially pleiotropic genes that affect both anthropometric and CMS traits, independently of obesity.

  13. Competition in health insurance markets: limitations of current measures for policy analysis.

    PubMed

    Scanlon, Dennis P; Chernew, Michael; Swaminathan, Shailender; Lee, Woolton

    2006-12-01

    Health care reform proposals often rely on increased competition in health insurance markets to drive improved performance in health care costs, access, and quality. We examine a range of data issues related to the measures of health insurance competition used in empirical studies published from 1994-2004. The literature relies exclusively on market structure and penetration variables to measure competition. While these measures are correlated, the degree of correlation is modest, suggesting that choice of measure could influence empirical results. Moreover, certain measurement issues such as the lack of data on PPO enrollment, the treatment of small firms, and omitted market characteristics also could affect the conclusions in empirical studies. Importantly, other types of measures related to competition (e.g., the availability of information on price and outcomes, degree of entry barriers, etc.) are important from both a theoretical and policy perspective, but their impact on market outcomes has not been widely studied.
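    Market-structure measures of the kind surveyed here are usually concentration indices such as the Herfindahl-Hirschman Index (HHI), the sum of squared market shares. A minimal sketch with invented shares:

```python
def herfindahl(shares):
    """Herfindahl-Hirschman Index from market shares (fractions summing to 1).
    Often rescaled by 10,000 when shares are expressed in percent."""
    return sum(s * s for s in shares)

hhi_concentrated = herfindahl([0.7, 0.2, 0.1])   # few dominant insurers
hhi_competitive = herfindahl([0.1] * 10)         # many equal-share insurers
```

    The index rises with concentration, but, as the article stresses, it says nothing about entry barriers or information availability, which is one reason structure-based measures alone can mislead.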

  14. Experimental investigation of heat transfer coefficient of mini-channel PCHE (printed circuit heat exchanger)

    NASA Astrophysics Data System (ADS)

    Kwon, Dohoon; Jin, Lingxue; Jung, WooSeok; Jeong, Sangkwon

    2018-06-01

    Heat transfer coefficient of a mini-channel printed circuit heat exchanger (PCHE) with counter-flow configuration is investigated. The PCHE used in the experiments is two-layered (10 channels per layer) and has a hydraulic diameter of 1.83 mm. Experiments are conducted under various cryogenic heat transfer conditions: single-phase, boiling, and condensation heat transfer. Heat transfer coefficients from each experiment are presented and compared with established correlations. For the single-phase experiment, an empirical correlation of modified Dittus-Boelter form is proposed, which predicts the experimental results within 5% error over a Reynolds number range from 8500 to 17,000. In the boiling experiment, film boiling occurred dominantly due to the large temperature difference between the hot-side and cold-side fluids; an empirical correlation is proposed that predicts the experimental results within 20% error over a Reynolds number range from 2100 to 2500. For the condensation experiment, an empirical correlation of modified Akers form is proposed, which predicts the experimental results within 10% error over a Reynolds number range from 3100 to 6200.
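    The classic Dittus-Boelter form that the single-phase correlation modifies is Nu = 0.023 Re^0.8 Pr^n, with h = Nu k / D_h. The sketch below uses the textbook constants and illustrative fluid properties; the paper's modified version would adjust these constants against the PCHE data:

```python
def dittus_boelter_h(re: float, pr: float, k: float, d_h: float,
                     heating: bool = True) -> float:
    """Smooth-channel convective coefficient from Nu = 0.023 Re^0.8 Pr^n
    (n = 0.4 for heating, 0.3 for cooling); h = Nu * k / D_h."""
    n = 0.4 if heating else 0.3
    nu = 0.023 * re**0.8 * pr**n
    return nu * k / d_h

# Illustrative values (not the paper's test conditions): gas-like fluid,
# k in W/(m K), hydraulic diameter 1.83 mm as quoted in the abstract
h = dittus_boelter_h(re=10_000, pr=0.7, k=0.026, d_h=1.83e-3)
```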

  15. Prediction of friction coefficients for gases

    NASA Technical Reports Server (NTRS)

    Taylor, M. F.

    1969-01-01

    Empirical relations are used for correlating laminar and turbulent friction coefficients for gases, with large variations in the physical properties, flowing through smooth tubes. These relations have been used to correlate friction coefficients for hydrogen, helium, nitrogen, carbon dioxide and air.
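    For constant-property smooth tubes, the baseline friction relations are the laminar 64/Re law and the turbulent Blasius power law (Darcy form). A sketch of that baseline; the report's correlations add corrections for the large property variations it addresses:

```python
def darcy_friction(re: float) -> float:
    """Smooth-tube Darcy friction factor: laminar 64/Re below the
    transition, Blasius 0.3164 * Re^-0.25 for turbulent flow
    (valid roughly up to Re ~ 1e5)."""
    if re < 2300:
        return 64.0 / re
    return 0.3164 * re ** -0.25

f_lam = darcy_friction(1000.0)     # = 0.064
f_turb = darcy_friction(50_000.0)
```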

  16. SMSynth: An Imagery Synthesis System for Soil Moisture Retrieval

    NASA Astrophysics Data System (ADS)

    Cao, Y.; Xu, L.; Peng, J.

    2018-04-01

    Soil moisture (SM) is an important variable in various research areas, such as weather and climate forecasting, agriculture, drought and flood monitoring and prediction, and human health. An ongoing challenge in estimating SM via synthetic aperture radar (SAR) is the development of SM retrieval methods; in particular, the empirical models need as training samples many measurements of SM and soil roughness parameters, which are very difficult to acquire. As such, it is difficult to develop empirical models using real SAR imagery, and it is necessary to develop methods to synthesize SAR imagery. To tackle this issue, a SAR imagery synthesis system based on SM, named SMSynth, is presented, which can simulate radar signals that are as realistic as possible with respect to real SAR imagery. In SMSynth, SAR backscatter coefficients for each soil type are simulated via the Oh model under a Bayesian framework, where spatial correlation is modeled by a Markov random field (MRF) model. The backscattering coefficients, simulated from the designed soil and sensor parameters, enter the Bayesian framework through the data likelihood; the soil and sensor parameters are set as realistically as possible to conditions on the ground and within the validity range of the Oh model. In this way, a complete and coherent Bayesian probabilistic framework is established. Experimental results show that SMSynth is capable of generating realistic SAR images that meet the need of empirical models for large numbers of training samples.

  17. General correlation for prediction of critical heat flux ratio in water cooled channels

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pernica, R.; Cizek, J.

    1995-09-01

    The paper presents a general empirical Critical Heat Flux Ratio (CHFR) correlation which is valid for vertical water upflow through tubes, internally heated concentric annuli, and rod bundle geometries with both wide and very tight square and triangular rod lattices. The proposed general PG correlation directly predicts the CHFR, comprises axial and radial non-uniform heating, and is valid over a wider range of thermal-hydraulic conditions than previously published critical heat flux correlations. The PG correlation has been developed using the Czech critical heat flux data bank, which includes more than 9500 experimental data on tubes, 7600 data on rod bundles, and 713 data on internally heated concentric annuli. The accuracy of the CHFR prediction, statistically assessed by the constant dryout conditions approach, is characterized by a mean value nearing 1.00 and a standard deviation of less than 0.06. Moreover, a subchannel form of the PG correlation is statistically verified on Westinghouse and Combustion Engineering rod bundle databases, i.e., more than 7000 experimental CHF points of the Columbia University data bank were used.

  18. Asymmetric multiscale detrended fluctuation analysis of California electricity spot price

    NASA Astrophysics Data System (ADS)

    Fan, Qingju

    2016-01-01

    In this paper, we develop a new method called asymmetric multiscale detrended fluctuation analysis, an extension of asymmetric detrended fluctuation analysis (A-DFA) that can assess the asymmetric correlation properties of a series over a variable scale range. We investigate the asymmetric correlations in the California 1999-2000 power market after filtering out some periodic trends by empirical mode decomposition (EMD). Our findings show the coexistence of symmetric and asymmetric correlations in the price series of 1999 and strong asymmetric correlations in 2000. Moreover, we detect subtle correlation properties of the upward and downward price series for most larger scale intervals in 2000. Meanwhile, the fluctuations of Δα(s) (asymmetry) and |Δα(s)| (absolute asymmetry) are more significant in 2000 than in 1999 for larger scale intervals, and they have similar characteristics for smaller scale intervals. We conclude that the strong asymmetry and the different correlation properties of the upward and downward price series for larger scale intervals in 2000 have important implications for the collapse of the California power market, and our findings shed new light on the underlying mechanisms of power prices.

  19. Kolmogorov-Smirnov test for spatially correlated data

    USGS Publications Warehouse

    Olea, R.A.; Pawlowsky-Glahn, V.

    2009-01-01

    The Kolmogorov-Smirnov test is a convenient method for investigating whether two underlying univariate probability distributions can be regarded as indistinguishable from each other or whether an underlying probability distribution differs from a hypothesized distribution. Application of the test requires that the sample be unbiased and the outcomes be independent and identically distributed, conditions that are violated to varying degrees by spatially continuous attributes, such as topographical elevation. A generalized form of the bootstrap method is used here for the purpose of modeling the distribution of the statistic D of the Kolmogorov-Smirnov test. The innovation is in the resampling, which in the traditional formulation of the bootstrap is done by drawing from the empirical sample with replacement, presuming independence. The generalization consists of preparing resamplings with the same spatial correlation as the empirical sample. This is accomplished by reading the values of unconditional stochastic realizations at the sampling locations, realizations that are generated by simulated annealing. The new approach was tested on two empirical samples taken from an exhaustive sample closely following a lognormal distribution. One sample was a regular, unbiased sample while the other was a clustered, preferential sample that had to be preprocessed. Our results show that the p-value for the spatially correlated case is always larger than the p-value of the statistic in the absence of spatial correlation, in agreement with the fact that the information content of an uncorrelated sample is larger than that of a spatially correlated sample of the same size.
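    The traditional (independence-assuming) bootstrap of the KS statistic that the paper generalizes can be sketched as follows. The lognormal sample is synthetic, and the resampling is the plain with-replacement draw; reproducing the paper's spatially correlated realizations via simulated annealing is beyond this sketch:

```python
import numpy as np
from math import erf, sqrt

def ks_d(sample, cdf):
    """Two-sided Kolmogorov-Smirnov statistic D against a hypothesized CDF."""
    x = np.sort(sample)
    n = x.size
    c = np.array([cdf(v) for v in x])
    d_plus = np.max(np.arange(1, n + 1) / n - c)
    d_minus = np.max(c - np.arange(n) / n)
    return max(d_plus, d_minus)

def lognorm_cdf(x, mu=0.0, sigma=0.5):
    return 0.5 * (1.0 + erf((np.log(x) - mu) / (sigma * sqrt(2.0))))

rng = np.random.default_rng(3)
sample = rng.lognormal(0.0, 0.5, 200)
d_obs = ks_d(sample, lognorm_cdf)

# Independence-assuming bootstrap reference distribution for D; the paper's
# innovation replaces this draw with spatially correlated realizations
d_boot = [ks_d(rng.choice(sample, sample.size, replace=True), lognorm_cdf)
          for _ in range(200)]
p_value = np.mean(np.array(d_boot) >= d_obs)
```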

  20. Advancing Empirical Scholarship to Further Develop Evaluation Theory and Practice

    ERIC Educational Resources Information Center

    Christie, Christina A.

    2011-01-01

    Good theory development is grounded in empirical inquiry. In the context of educational evaluation, the development of empirically grounded theory has important benefits for the field and the practitioner. In particular, a shift to empirically derived theory will assist in advancing more systematic and contextually relevant evaluation practice, as…

  1. An Empirical Study of the Influence of the Concept of "Job-Hunting" on Graduates' Employment

    ERIC Educational Resources Information Center

    Chen, Chengwen; Hu, Guiying

    2008-01-01

    The concept of job-hunting is an important factor affecting university students' employment. This empirical study shows that, while hunting for a job, graduates' expectations of the nature of work are negatively correlated with the demand for occupational types, accessibility to a post, and monthly income; positive correlation…

  2. Solar-terrestrial predictions proceedings. Volume 4: Prediction of terrestrial effects of solar activity

    NASA Technical Reports Server (NTRS)

    Donnelly, R. E. (Editor)

    1980-01-01

    Papers on the prediction of ionospheric and radio propagation conditions based primarily on empirical or statistical relations are discussed. Predictions of sporadic E, spread F, and scintillations generally involve statistical or empirical predictions. The correlation between solar activity and terrestrial seismic activity and the possible relation between solar activity and biological effects are also discussed.

  3. Verification of the proteus two-dimensional Navier-Stokes code for flat plate and pipe flows

    NASA Technical Reports Server (NTRS)

    Conley, Julianne M.; Zeman, Patrick L.

    1991-01-01

    The Proteus Navier-Stokes Code is evaluated for 2-D/axisymmetric, viscous, incompressible, internal, and external flows. The particular cases to be discussed are laminar and turbulent flows over a flat plate, laminar and turbulent developing pipe flows, and turbulent pipe flow with swirl. Results are compared with exact solutions, empirical correlations, and experimental data. A detailed description of the code set-up, including boundary conditions, initial conditions, grid size, and grid packing is given for each case.

  4. Dilution jet mixing program, phase 3

    NASA Technical Reports Server (NTRS)

    Srinivasan, R.; Coleman, E.; Myers, G.; White, C.

    1985-01-01

    The main objectives for the NASA Jet Mixing Phase 3 program were: extension of the data base on the mixing of single sided rows of jets in a confined cross flow to discrete slots, including streamlined, bluff, and angled injections; quantification of the effects of geometrical and flow parameters on penetration and mixing of multiple rows of jets into a confined flow; investigation of in-line, staggered, and dissimilar hole configurations; and development of empirical correlations for predicting temperature distributions for discrete slots and multiple rows of dilution holes.

  5. Using temporal detrending to observe the spatial correlation of traffic.

    PubMed

    Ermagun, Alireza; Chatterjee, Snigdhansu; Levinson, David

    2017-01-01

    This empirical study sheds light on the spatial correlation of traffic links under different traffic regimes. We mimic the behavior of real traffic by pinpointing the spatial correlation between 140 freeway traffic links in a major sub-network of the Minneapolis-St. Paul freeway system with a grid-like network topology. This topology enables us to juxtapose the positive and negative correlation between links, which has been overlooked in short-term traffic forecasting models. To accurately and reliably measure the correlation between traffic links, we develop an algorithm that eliminates temporal trends in three dimensions: (1) hourly dimension, (2) weekly dimension, and (3) system dimension for each link. The spatial correlation of traffic links exhibits a stronger negative correlation in rush hours, when congestion affects route choice. Although this correlation occurs mostly in parallel links, it is also observed upstream, where travelers receive information and are able to switch to substitute paths. Irrespective of the time-of-day and day-of-week, a strong positive correlation is witnessed between upstream and downstream links. This correlation is stronger in uncongested regimes, as traffic flow passes through consecutive links more quickly and there is no congestion effect to shift or stall traffic. The extracted spatial correlation structure can augment the accuracy of short-term traffic forecasting models.
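A loud simplification of the detrending idea described above: remove the hour-of-day and day-of-week mean profiles from each link's series before correlating (the paper's algorithm also removes a system-wide trend, omitted here). The synthetic two-link data and all parameters are assumptions for illustration.

```python
import numpy as np

def detrend(series, hour, dow):
    """Subtract hour-of-day and day-of-week mean profiles (two of the three
    dimensions in the paper's algorithm; the system-wide trend is omitted)."""
    r = series.astype(float)
    for h in np.unique(hour):
        r[hour == h] -= r[hour == h].mean()
    for d in np.unique(dow):
        r[dow == d] -= r[dow == d].mean()
    return r

rng = np.random.default_rng(2)
T = 24 * 7 * 4                                   # four weeks of hourly flows
hour = np.arange(T) % 24
dow = (np.arange(T) // 24) % 7
cycle = 50 + 30 * np.sin(2 * np.pi * hour / 24)  # shared commute cycle
up = cycle + rng.normal(0, 5, T)                 # upstream link
down = cycle + rng.normal(0, 5, T)               # downstream link, no coupling
raw_r = np.corrcoef(up, down)[0, 1]
det_r = np.corrcoef(detrend(up, hour, dow), detrend(down, hour, dow))[0, 1]
```

The shared daily cycle inflates the raw correlation; once the periodic trends are removed, only genuine link-to-link coupling remains (here, essentially none).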

  7. Non-Normality and Testing that a Correlation Equals Zero

    ERIC Educational Resources Information Center

    Levy, Kenneth J.

    1977-01-01

    The importance of the assumption of normality for testing that a bivariate normal correlation equals zero is examined. Both empirical and theoretical evidence suggest that such tests are robust with respect to violation of the normality assumption. (Author/JKS)

  8. Prediction of winter precipitation over northwest India using ocean heat fluxes

    NASA Astrophysics Data System (ADS)

    Nageswararao, M. M.; Mohanty, U. C.; Osuri, Krishna K.; Ramakrishna, S. S. V. S.

    2016-10-01

    The winter precipitation (December-February) over northwest India (NWI) is highly variable in time and space. The maximum precipitation occurs over the Himalaya region and decreases towards the south of NWI. The winter precipitation is important for the water resources and agriculture sectors over the region and for the economy of the country. Providing a seasonal outlook for regional-scale precipitation is a challenging task for the scientific community. Oceanic heat fluxes are known to have a strong linkage with both the ocean and the atmosphere. Hence, in this study, we examined the relationship of NWI winter precipitation with total downward ocean heat fluxes at the global ocean surface; 15 regions with correlations significant at the 90 % confidence level are identified from August to November. These strong relations motivate the development of an empirical model for predicting winter precipitation over NWI. Multiple linear regression (MLR) and principal component regression (PCR) models are developed and evaluated using leave-one-out cross-validation. The developed regression models are able to predict the winter precipitation patterns over NWI with significant (99 % confidence level) index of agreement and correlations. Moreover, these models capture the signals of extremes, but could not reach the peaks (excess and deficit) of the observations. PCR performs better than MLR for predicting winter precipitation over NWI. Therefore, the total downward ocean heat fluxes at the surface from August to November have a significant impact on seasonal winter precipitation over NWI. We conclude that these relationships are useful for the development of empirical models and make it feasible to predict the winter precipitation over NWI with sufficient lead time for various risk-management sectors.
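A minimal numpy sketch of principal component regression (PCR), the better-performing of the two models in the abstract: standardize the predictors, keep the leading principal components, and regress on the component scores. The synthetic "15 flux regions" data, the component count, and the seed are illustrative assumptions, not the paper's setup.

```python
import numpy as np

def pcr_fit(X, y, k):
    """Principal component regression: standardize, project onto the leading
    k principal components, then ordinary least squares in component space."""
    mu, sd = X.mean(0), X.std(0)
    Xs = (X - mu) / sd
    _, _, Vt = np.linalg.svd(Xs, full_matrices=False)
    Z = Xs @ Vt[:k].T                          # scores on the first k PCs
    beta = np.linalg.lstsq(Z, y - y.mean(), rcond=None)[0]
    return mu, sd, Vt[:k], beta, y.mean()

def pcr_predict(model, X):
    mu, sd, V, beta, ym = model
    return ((X - mu) / sd) @ V.T @ beta + ym

rng = np.random.default_rng(3)
n, p = 40, 15                  # e.g. 15 candidate flux regions, 40 winters
f = rng.normal(size=(n, 1))                    # common "flux mode"
X = 1.5 * f @ np.ones((1, p)) + rng.normal(size=(n, p))
y = f[:, 0] + rng.normal(0, 0.3, n)            # synthetic seasonal precipitation
model = pcr_fit(X, y, k=3)
yhat = pcr_predict(model, X)
r = np.corrcoef(y, yhat)[0, 1]
```

Using a few components instead of all 15 predictors is what protects PCR against the collinearity among neighboring flux regions that would destabilize plain MLR.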

  9. Development and evaluation of consensus-based sediment effect concentrations for polychlorinated biphenyls

    USGS Publications Warehouse

    MacDonald, Donald D.; Dipinto, Lisa M.; Field, Jay; Ingersoll, Christopher G.; Long, Edward R.; Swartz, Richard C.

    2000-01-01

    Sediment-quality guidelines (SQGs) have been published for polychlorinated biphenyls (PCBs) using both empirical and theoretical approaches. Empirically based guidelines have been developed using the screening-level concentration, effects range, effects level, and apparent effects threshold approaches. Theoretically based guidelines have been developed using the equilibrium-partitioning approach. Empirically-based guidelines were classified into three general categories, in accordance with their original narrative intents, and used to develop three consensus-based sediment effect concentrations (SECs) for total PCBs (tPCBs), including a threshold effect concentration, a midrange effect concentration, and an extreme effect concentration. Consensus-based SECs were derived because they estimate the central tendency of the published SQGs and, thus, reconcile the guidance values that have been derived using various approaches. Initially, consensus-based SECs for tPCBs were developed separately for freshwater sediments and for marine and estuarine sediments. Because the respective SECs were statistically similar, the underlying SQGs were subsequently merged and used to formulate more generally applicable SECs. The three consensus-based SECs were then evaluated for reliability using matching sediment chemistry and toxicity data from field studies, dose-response data from spiked-sediment toxicity tests, and SQGs derived from the equilibrium-partitioning approach. The results of this evaluation demonstrated that the consensus-based SECs can accurately predict both the presence and absence of toxicity in field-collected sediments. Importantly, the incidence of toxicity increases incrementally with increasing concentrations of tPCBs. Moreover, the consensus-based SECs are comparable to the chronic toxicity thresholds that have been estimated from dose-response data and equilibrium-partitioning models. 
Therefore, consensus-based SECs provide a unifying synthesis of existing SQGs, reflect causal rather than correlative effects, and accurately predict sediment toxicity in PCB-contaminated sediments.

  10. Development of a surveillance case definition for heat-related illness using 911 medical dispatch data.

    PubMed

    Bassil, Kate L; Cole, Donald C; Moineddin, Rahim; Gournis, Effie; Schwartz, Brian; Craig, Alan M; Lou, W Y Wendy; Rea, Elizabeth

    2008-01-01

    The adverse effects of hot weather on public health are of increasing concern. A surveillance system using 911 medical dispatch data for the detection of heat-related illness (HRI) could provide new information on the impact of excessive heat on the population. This paper describes how we identified medical dispatch call codes, called "determinants", that could represent HRI events. Approximately 500 medical dispatch determinants were reviewed in focus groups composed of Emergency Medical Services (EMS) paramedics, dispatchers, physicians, and public health epidemiologists. Each group was asked to select those determinants that might adequately represent HRI. Selections were then assessed empirically using correlations with daily mean temperature over the study period (June 1 to August 31, 2005). The focus groups identified 12 determinant groupings and ranked them according to specificity for HRI. Of these, "Heat/cold exposure" was deemed the most specific. The call determinant groupings with the clearest positive associations with daily mean temperature empirically were "Heat/cold exposure" (Spearman's correlation coefficient (SCC) 0.71, p < 0.0001) and "Unknown problem (man down)" (SCC 0.21, p = 0.04). Within each grouping, the determinant "Unknown status (3rd party caller)" showed significant associations, SCC = 0.34 (p = 0.001) and SCC = 0.22 (p = 0.03), respectively. Clinically-informed expertise and empirical evidence both contributed to identification of a group of 911 medical dispatch call determinants that plausibly represent HRI events. Once evaluated prospectively, these may be used in public health surveillance to better understand environmental health impacts on human populations and inform targeted public health interventions.
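The empirical screening step, a rank correlation of daily call counts with daily mean temperature, reduces to a Spearman coefficient. The sketch below uses synthetic data and a tie-free rank computation; the exposure model and all numbers are hypothetical assumptions, not the study's data.

```python
import numpy as np

def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks
    (double argsort; assumes no ties, which holds for continuous data)."""
    rx = np.argsort(np.argsort(x)).astype(float)
    ry = np.argsort(np.argsort(y)).astype(float)
    return np.corrcoef(rx, ry)[0, 1]

rng = np.random.default_rng(4)
days = 92                                  # June 1 to August 31
temp = 22 + 6 * rng.random(days)           # daily mean temperature (deg C)
# hypothetical call rate: rises nonlinearly with temperature, plus noise
calls = np.exp(0.15 * (temp - 22)) + rng.normal(0, 0.4, days)
rho = spearman(temp, calls)
```

Spearman's coefficient is the natural choice here because call volumes need not rise linearly with temperature; any monotone increase is captured.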

  11. A comparison of likelihood ratio tests and Rao's score test for three separable covariance matrix structures.

    PubMed

    Filipiak, Katarzyna; Klein, Daniel; Roy, Anuradha

    2017-01-01

    The problem of testing the separability of a covariance matrix against an unstructured variance-covariance matrix is studied in the context of multivariate repeated measures data using Rao's score test (RST). The RST statistic is developed with the first component of the separable structure as a first-order autoregressive (AR(1)) correlation matrix or an unstructured (UN) covariance matrix under the assumption of multivariate normality. It is shown that the distribution of the RST statistic under the null hypothesis of any separability does not depend on the true values of the mean or the unstructured components of the separable structure. A significant advantage of the RST is that it can be performed for small samples, even smaller than the dimension of the data, where the likelihood ratio test (LRT) cannot be used, and it outperforms the standard LRT in a number of contexts. Monte Carlo simulations are then used to study the comparative behavior of the null distribution of the RST statistic, as well as that of the LRT statistic, in terms of sample size considerations, and for the estimation of the empirical percentiles. Our findings are compared with existing results where the first component of the separable structure is a compound symmetry (CS) correlation matrix. It is also shown by simulations that the empirical null distribution of the RST statistic converges faster than the empirical null distribution of the LRT statistic to the limiting χ² distribution. The tests are implemented on a real dataset from medical studies. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. Measuring and modeling correlations in multiplex networks.

    PubMed

    Nicosia, Vincenzo; Latora, Vito

    2015-09-01

    The interactions among the elementary components of many complex systems can be qualitatively different. Such systems are therefore naturally described in terms of multiplex or multilayer networks, i.e., networks where each layer stands for a different type of interaction between the same set of nodes. There is today a growing interest in understanding when and why a description in terms of a multiplex network is necessary and more informative than a single-layer projection. Here we contribute to this debate by presenting a comprehensive study of correlations in multiplex networks. Correlations in node properties, especially degree-degree correlations, have been thoroughly studied in single-layer networks. Here we extend this idea to investigate and characterize correlations between the different layers of a multiplex network. Such correlations are intrinsically multiplex, and we first study them empirically by constructing and analyzing several multiplex networks from the real world. In particular, we introduce various measures to characterize correlations in the activity of the nodes and in their degree at the different layers and between activities and degrees. We show that real-world networks indeed exhibit nontrivial multiplex correlations. For instance, we find cases where two layers of the same multiplex network are positively correlated in terms of node degrees, while two other layers are negatively correlated. We then focus on constructing synthetic multiplex networks, proposing a series of models to reproduce the correlations observed empirically and/or to assess their relevance.
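One of the interlayer measures described, the degree-degree correlation between two layers of a multiplex, can be sketched as follows; the synthetic two-layer network, the 70% edge overlap, and the use of a Pearson coefficient are illustrative assumptions, not the paper's datasets or estimators.

```python
import numpy as np

def degrees(edges, n):
    """Degree sequence of an undirected layer from its edge list."""
    d = np.zeros(n, dtype=int)
    for i, j in edges:
        d[i] += 1
        d[j] += 1
    return d

rng = np.random.default_rng(6)
n = 200
# Layer 1: random edges (self-loops dropped, duplicates merged)
e1 = list({tuple(sorted(p)) for p in rng.integers(0, n, (600, 2))
           if p[0] != p[1]})
# Layer 2: positively coupled - keep ~70% of layer-1 edges, rewire the rest
keep = rng.random(len(e1)) < 0.7
e2 = [e for e, k in zip(e1, keep) if k]
e2 += [tuple(sorted(p)) for p in rng.integers(0, n, (len(e1) - len(e2), 2))
       if p[0] != p[1]]
d1, d2 = degrees(e1, n), degrees(e2, n)
r = np.corrcoef(d1, d2)[0, 1]      # interlayer degree-degree correlation
```

A positive r means hubs in one layer tend to be hubs in the other; rewiring all shared edges instead of 30% would drive r toward zero.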

  13. Assessment of nonverbal learning and memory using the Design Learning Test.

    PubMed

    Foster, Paul S; Drago, Valeria; Harrison, David W

    2009-05-01

    The laterality of verbal and nonverbal learning and memory to the left and right temporal lobes, respectively, has received much empirical support. Researchers have often used the Rey Auditory Verbal Learning Test (RAVLT) as a measure of verbal learning and memory in these investigations. However, a precise analog of the RAVLT that uses stimuli difficult to encode verbally has not been reported. Further, although researchers have developed some measures that are essentially visuospatial analogs of the RAVLT, no correlational data have been reported attesting to the relation between the measures. The authors report the development of a nonverbal analog of the RAVLT, referred to as the Design Learning Test (DLT). Also, the authors present correlational data supporting a relation between the DLT and RAVLT, and they hope that the present study will stimulate research investigating whether the DLT is sensitive to right temporal lobe functioning.

  14. Investigation using data from ERTS-1 to develop and implement utilization of living marine resources. [availability and distribution of menhaden fish in Mississippi Sound and Gulf waters

    NASA Technical Reports Server (NTRS)

    Stevenson, W. H. (Principal Investigator); Pastula, E. J., Jr.

    1973-01-01

    The author has identified the following significant results. This 15-month ERTS-1 investigation produced correlations between satellite, aircraft, menhaden fisheries, and environmental sea truth data from the Mississippi Sound. Selected oceanographic, meteorological, and biological parameters were used as indirect indicators of the menhaden resource. Synoptic and near real time sea truth, fishery, satellite imagery, aircraft acquired multispectral, photo and thermal IR information were acquired as data inputs. Computer programs were developed to manipulate these data according to user requirements. Preliminary results indicate a correlation between backscattered light with chlorophyll concentration and water transparency in turbid waters. Eight empirical menhaden distribution models were constructed from combinations of four fisheries-significant oceanographic parameters: water depth, transparency, color, and surface salinity. The models demonstrated their potential for management utilization in areas of resource assessment, prediction, and monitoring.

  15. Complex Dynamics in Nonequilibrium Economics and Chemistry

    NASA Astrophysics Data System (ADS)

    Wen, Kehong

    Complex dynamics provides a new approach to dealing with economic complexity. We study interactively the empirical and theoretical aspects of business cycles. The way of exploring complexity is similar to that in the study of an oscillatory chemical system (the BZ system), a model for modeling complex behavior. We contribute by qualitatively simulating the complex periodic patterns observed in controlled BZ experiments, narrowing the gap between modeling and experiment. The gap between theory and reality is much wider in economics, which involves studies of human expectations and decisions, an essential difference from the natural sciences. Our empirical and theoretical studies make substantial progress in closing this gap. With help from new developments in nonequilibrium physics, i.e., complex spectral theory, we advance our technique for detecting characteristic time scales from empirical economic data. We obtain correlation resonances, which give oscillating modes with decays for correlation decomposition, from different time series including the S&P 500, M2, crude oil spot prices, and GNP. The time scales found are strikingly compatible with business experience and other studies of business cycles. They reveal the non-Markovian nature of coherent markets. The resonances enhance the evidence of economic chaos obtained by using other tests. The evolving multi-humped distributions produced by the moving-time-window technique reveal the nonequilibrium nature of economic behavior. They reproduce the American economic history of booms and busts. The studies seem to provide a way out of the debate on chaos versus noise and unify the cyclical and stochastic approaches in explaining business fluctuations. Based on these findings and a new expectation formulation, we construct a business cycle model which gives patterns qualitatively compatible with those found empirically. 
    The soft-bouncing oscillator model provides a better alternative than the harmonic oscillator or the random walk model as the building block of business cycle theory. The mathematical structure of the model (a delay differential equation) is studied analytically and numerically. This research paves the way toward sensible economic forecasting.

  16. Hybrid BEM/empirical approach for scattering of correlated sources in rocket noise prediction

    NASA Astrophysics Data System (ADS)

    Barbarino, Mattia; Adamo, Francesco P.; Bianco, Davide; Bartoccini, Daniele

    2017-09-01

    Empirical models such as the Eldred standard model are commonly used for rocket noise prediction. Such models directly provide a definition of the Sound Pressure Level through the quadratic pressure term, assuming uncorrelated sources. In this paper, an improvement of the Eldred standard model has been formulated. This new formulation contains an explicit expression for the acoustic pressure of each noise source, in terms of amplitude and phase, in order to investigate the source correlation effects and to propagate them through a wave equation. In particular, the correlation effects between adjacent and non-adjacent sources have been modeled and analyzed. The noise prediction obtained with the revised Eldred-based model has then been used for formulating an empirical/BEM (Boundary Element Method) hybrid approach that allows an evaluation of the scattering effects. In the framework of the European Space Agency-funded program VECEP (VEga Consolidation and Evolution Programme), these models have been applied for the prediction of the aeroacoustic loads of the VEGA (Vettore Europeo di Generazione Avanzata - Advanced Generation European Carrier Rocket) launch vehicle at lift-off, and the results have been compared with experimental data.

  17. Are relationships between pollen-ovule ratio and pollen and seed size explained by sex allocation?

    PubMed

    Burd, Martin

    2011-10-01

    Positive correlations between pollen-ovule ratio and seed size, and negative correlations between pollen-ovule ratio and pollen grain size have been noted frequently in a wide variety of angiosperm taxa. These relationships are commonly explained as a consequence of sex allocation on the basis of a simple model proposed by Charnov. Indeed, the theoretical expectation from the model has been the basis for interest in the empirical pattern. However, the predicted relationship is a necessary consequence of the mathematics of the model, which therefore has little explanatory power, even though its predictions are consistent with empirical results. The evolution of pollen-ovule ratios is likely to depend on selective factors affecting mating system, pollen presentation and dispensing, patterns of pollen receipt, pollen tube competition, female mate choice through embryo abortion, as well as genetic covariances among pollen, ovule, and seed size and other reproductive traits. To the extent the empirical correlations involving pollen-ovule ratios are interesting, they will need explanation in terms of a suite of selective factors. They are not explained simply by sex allocation trade-offs. © 2011 The Author(s). Evolution© 2011 The Society for the Study of Evolution.

  18. The role of hip and chest radiographs in osteoporotic evaluation among south Indian women population: a comparative scenario with DXA.

    PubMed

    Kumar, D Ashok; Anburajan, M

    2014-05-01

    Osteoporosis is recognized as a worldwide skeletal disorder. In India, osteoporotic fractures among older and postmenopausal women have been a common issue. Bone mineral density measurements gauged by dual-energy X-ray absorptiometry (DXA) are used in the diagnosis of osteoporosis. The aims were: (1) to evaluate osteoporosis in south Indian women by a radiogrammetric method in comparison with DXA; and (2) to assess the capability of KJH; Anburajan's Empirical formula in predicting total hip bone mineral density (T.BMD) against estimated Hologic T.BMD. In this cross-sectional design, 56 south Indian women were evaluated. These women were randomly selected from a health camp; patients with secondary bone diseases were excluded. The standard protocol was followed in acquiring BMD of the right proximal femur by DPX Prodigy (DXA Scanner, GE-Lunar Corp., USA). The measured Lunar total hip BMD was converted into estimated Hologic total hip BMD. In addition, the studied population underwent chest and hip radiographic measurements. The combined cortical thickness of the clavicle was used in KJH; Anburajan's Empirical formula to predict T.BMD, which was compared with estimated Hologic T.BMD by DXA. The correlation coefficients exhibited high significance. The combined cortical thickness of the clavicle and femur shaft of the total studied population was strongly correlated with DXA femur T.BMD measurements (r = 0.87, P < 0.01 and r = 0.45, P < 0.01), and also with the low bone mass group (r = 0.87, P < 0.01 and r = 0.67, P < 0.01). KJH; Anburajan's Empirical formula shows a significant correlation with estimated Hologic T.BMD (r = 0.88, P < 0.01) in the total studied population. 
    The empirical formula was identified as a better tool for predicting osteoporosis in the total population and the old-aged population, with sensitivities (88.8 and 95.6 %), specificities (89.6 and 90.9 %), positive predictive values (88.8 and 95.6 %), and negative predictive values (89.6 and 90.9 %), respectively. The results suggest that the combined cortical thickness of the clavicle and femur shaft measured by the radiogrammetric method is significantly correlated with DXA. Moreover, KJH; Anburajan's Empirical formula is a more useful index than other simple radiogrammetry measurements for evaluating osteoporosis from economical and widely available digital radiographs.

  19. Improved RMR Rock Mass Classification Using Artificial Intelligence Algorithms

    NASA Astrophysics Data System (ADS)

    Gholami, Raoof; Rasouli, Vamegh; Alimoradi, Andisheh

    2013-09-01

    Rock mass classification systems such as rock mass rating (RMR) are very reliable means to provide information about the quality of rocks surrounding a structure as well as to propose suitable support systems for unstable regions. Many correlations have been proposed to relate measured quantities such as wave velocity to rock mass classification systems to limit the associated time and cost of conducting the sampling and mechanical tests conventionally used to calculate RMR values. However, these empirical correlations have been found to be unreliable, as they usually overestimate or underestimate the RMR value. The aim of this paper is to compare the results of RMR classification obtained from the use of empirical correlations versus machine-learning methodologies based on artificial intelligence algorithms. The proposed methods were verified based on two case studies located in northern Iran. Relevance vector regression (RVR) and support vector regression (SVR), as two robust machine-learning methodologies, were used to predict the RMR for tunnel host rocks. RMR values already obtained by sampling and site investigation at one tunnel were taken into account as the output of the artificial networks during training and testing phases. The results reveal that use of empirical correlations overestimates the predicted RMR values. RVR and SVR, however, showed more reliable results, and are therefore suggested for use in RMR classification for design purposes of rock structures.
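The gist of the comparison, a fixed published-style correlation versus a model fitted to site data, can be illustrated on synthetic data. The "empirical correlation", the quadratic least-squares fit standing in for SVR/RVR training, and the RMR-velocity relation below are all hypothetical assumptions, not results from the paper's case studies.

```python
import numpy as np

rng = np.random.default_rng(5)
vp = rng.uniform(2.0, 6.0, 60)             # P-wave velocity (km/s), synthetic
rmr = 12 * vp + 2 * vp ** 2 - 10 + rng.normal(0, 3, 60)  # hypothetical RMR

# A fixed, published-style linear correlation (purely illustrative)
rmr_emp = 20 * vp - 15

# A model fitted to the site data (quadratic least squares, a crude
# stand-in for SVR/RVR trained on measured RMR values)
rmr_ml = np.polyval(np.polyfit(vp, rmr, 2), vp)

rmse_emp = np.sqrt(np.mean((rmr_emp - rmr) ** 2))
rmse_ml = np.sqrt(np.mean((rmr_ml - rmr) ** 2))
```

A fixed correlation transplanted from another site systematically over- or under-shoots, which is the abstract's point; a model trained on local measurements tracks the site-specific relation.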

  20. Empirical Profiling of Cold Hydrogen Plumes Formed from Venting Of LH2 Storage Vessels: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Buttner, William J; Rivkin, Carl H; Schmidt, Kara

    Liquid hydrogen (LH2) storage is a viable approach to assuring sufficient hydrogen capacity at commercial fuelling stations. Presently, LH2 is produced at remote facilities and then transported to the end-use site by road vehicles (i.e., LH2 tanker trucks). Venting of hydrogen to depressurize the transport storage tank is a routine part of the LH2 delivery process. The behaviour of cold hydrogen plumes has not been well characterized because empirical field data are essentially non-existent. The NFPA 2 Hydrogen Storage Safety Task Group, which consists of hydrogen producers, safety experts, and CFD modellers, has identified the lack of understanding of hydrogen dispersion during LH2 venting of storage vessels as a critical gap for establishing safety distances at LH2 facilities, especially commercial hydrogen fuelling stations. To address this need, the NREL sensor laboratory, in collaboration with the NFPA 2 Safety Task Group, developed the Cold Hydrogen Plume Analyzer to empirically characterize the hydrogen plume formed during LH2 storage tank venting. A prototype Analyzer was developed and field-deployed at an actual LH2 venting operation, with critical findings that included: H2 being detected as much as 2 m lower than the release point, which is not predicted by existing models; a small and inconsistent correlation between oxygen depletion and the hydrogen concentration; and a negligible to non-existent correlation between in-situ temperature and the hydrogen concentration. The Analyzer is currently being upgraded for enhanced metrological capabilities, including improved real-time spatial and temporal profiling of the plume and tracking of prevailing weather conditions. Additional deployments are planned to monitor plume behaviour under different wind, humidity, and temperature conditions. These data will be shared with the NFPA 2 Safety Task Group and ultimately will be used to support theoretical models and the code requirements prescribed in NFPA 2.

  1. Sexual harassment.

    PubMed

    Sbraga, T P; O'Donohue, W

    2000-01-01

    We review the current state of sexual harassment theory, research, treatment, and prevention. Definitional problems and implications are discussed. An examination of the epidemiology of sexual harassment is presented, highlighting correlates that include characteristics of the organizational environment, the perpetrator, and the recipient of unwanted sexual behavior. Normative responses to sexual harassment and consequences are discussed. Descriptions of the most prevalent models of sexual harassment are offered and the empirical evidence for them is briefly reviewed. From there, the effect of model development and evaluation on the prevention and treatment of sexual harassment is considered. We comment on the steps that would need to be taken to develop viable prevention and treatment programs. Suggestions for fruitful avenues of research and theory development are offered.

  2. Volatility of linear and nonlinear time series

    NASA Astrophysics Data System (ADS)

    Kalisky, Tomer; Ashkenazy, Yosef; Havlin, Shlomo

    2005-07-01

    Previous studies indicated that nonlinear properties of Gaussian distributed time series with long-range correlations, ui, can be detected and quantified by studying the correlations in the magnitude series ∣ui∣, the “volatility.” However, the origin of this empirical observation remains unclear, and the exact relation between the correlations in ui and the correlations in ∣ui∣ is still unknown. Here we develop analytical relations between the scaling exponent of a linear series ui and that of its magnitude series ∣ui∣. Moreover, we find that nonlinear time series exhibit stronger (or equal) correlations in the magnitude time series compared with linear time series having the same two-point correlations. Based on these results we propose a simple model that generates multifractal time series by explicitly inserting long-range correlations in the magnitude series; the nonlinear multifractal time series is generated by multiplying a long-range correlated time series (representing the magnitude series) with an uncorrelated time series [representing the sign series sgn(ui)]. We apply our techniques to daily deep ocean temperature records from the equatorial Pacific, the region of the El-Niño phenomenon, and find: (i) long-range correlations from several days to several years with 1/f power spectrum, (ii) significant nonlinear behavior as expressed by long-range correlations of the volatility series, and (iii) a broad multifractal spectrum.
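
    The proposed construction — a long-range correlated magnitude series multiplied by an uncorrelated sign series — can be sketched in a few lines. This is a minimal illustration only: the Fourier-filtering generator, the exponent value, and the series length are assumptions for the sketch, not taken from the paper.

```python
import numpy as np

def long_range_correlated(n, beta, rng):
    """Generate a zero-mean, unit-variance series with a 1/f^beta power
    spectrum by shaping the Fourier amplitudes of random phases."""
    freqs = np.fft.rfftfreq(n, d=1.0)
    amp = np.zeros_like(freqs)
    amp[1:] = freqs[1:] ** (-beta / 2.0)           # spectral shaping (skip DC)
    phases = rng.uniform(0.0, 2.0 * np.pi, len(freqs))
    x = np.fft.irfft(amp * np.exp(1j * phases), n=n)
    return (x - x.mean()) / x.std()

rng = np.random.default_rng(0)
n = 4096
magnitude = np.abs(long_range_correlated(n, beta=0.8, rng=rng))  # correlated magnitudes
signs = rng.choice([-1.0, 1.0], size=n)                          # uncorrelated sign series
series = magnitude * signs                                       # multifractal candidate
```

    The key design choice mirrors the abstract: the correlations live entirely in `magnitude`, while `signs` carries none, so nonlinearity is inserted through the volatility alone.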

  3. ESTIMATION OF CHEMICAL TOXICITY TO WILDLIFE SPECIES USING INTERSPECIES CORRELATION MODELS

    EPA Science Inventory

    Ecological risks to wildlife are typically assessed using toxicity data for relatively few species and with limited understanding of differences in species sensitivity to contaminants. Empirical interspecies correlation models were derived from LD50 values for 49 wildlife speci...
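
    An interspecies correlation model of this kind is, at its core, a log-log regression between paired toxicity values for a surrogate and a target species. A minimal sketch follows; the LD50 pairs and the species pairing are invented purely for illustration.

```python
import numpy as np

# Hypothetical paired acute-toxicity data (mg/kg): LD50 for a surrogate
# species vs. a target species across several chemicals.
surrogate_ld50 = np.array([10.0, 50.0, 120.0, 400.0, 1500.0])
target_ld50 = np.array([8.0, 60.0, 100.0, 350.0, 1800.0])

# Interspecies correlation models are typically fit on log10-transformed LD50s.
slope, intercept = np.polyfit(np.log10(surrogate_ld50), np.log10(target_ld50), 1)

def predict_target_ld50(surrogate_value):
    """Extrapolate a target-species LD50 from a surrogate-species LD50."""
    return 10 ** (intercept + slope * np.log10(surrogate_value))
```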

  4. Correlation of Apollo oxygen tank thermodynamic performance predictions

    NASA Technical Reports Server (NTRS)

    Patterson, H. W.

    1971-01-01

    Parameters necessary to analyze the stratified performance of the Apollo oxygen tanks include g levels, tank elasticity, flow rates and pressurized volumes. Methods for estimating g levels and flow rates from flight plans prior to flight, and from guidance and system data for use in the post-flight analysis, are described. Equilibrium thermodynamic equations are developed for the effects of tank elasticity and pressurized volumes on the tank pressure response, and their relative magnitudes are discussed. Correlations of tank pressures and heater temperatures from flight data with the results of a stratification model are shown. Heater temperatures were also estimated with empirical heat transfer correlations; agreement with flight data was obtained when fluid properties were averaged rather than evaluated at the mean film temperature.

  5. Biomarker development for external CO2 injury prediction in apples through exploration of both transcriptome and DNA methylation changes.

    PubMed

    Gapper, Nigel E; Rudell, David R; Giovannoni, James J; Watkins, Chris B

    2013-01-01

    Several apple cultivars are susceptible to CO2 injury, a physiological disorder that can be expressed either externally or internally, and which can cause major losses of fruit during controlled atmosphere (CA) storage. Disorder development can also be enhanced using SmartFresh™ technology, based on the inhibition of ethylene perception by 1-methylcyclopropene (1-MCP). Injury development is associated with less mature fruit with lower ethylene production, but the aetiology of the disorder is poorly understood. Here we report on the progress made using mRNAseq approaches to explore the transcriptome during the development of external CO2 injury. Next-generation sequencing was used to mine the apple transcriptome for gene expression changes that are associated with the development of external CO2 injury. 'Empire' apples from a single orchard were treated with either 1 µL L⁻¹ 1-MCP or 1 g L⁻¹ diphenylamine or left untreated, and then stored in a CA of 5 kPa CO2 and 2 kPa O2. In addition, susceptibility to the disorder in 'Empire' apples from five different orchards was investigated, and the methylation state of the ACS1 promoter was assessed using McrBC endonuclease digestion and real-time quantitative polymerase chain reaction. Expression of over 30 000 genes, aligned to the apple genome, was monitored, with clear divergence of expression among treatments after 1 day of CA storage. Symptom development, internal ethylene concentrations (IECs) and methylation state of the ACS1 promoter were different for each of the five orchards. With transcriptomic changes affected by treatment, this dataset will be useful in discovering biomarkers that assess disorder susceptibility. An inverse correlation between the frequency of this disorder and the IEC was detected in a multiple orchard trial. 
Differential methylation state of the ACS1 promoter correlated with both IEC and injury occurrence, indicating epigenetic regulation of ethylene biosynthesis and possibly events leading to disorder development.

  6. Equation of state for dense nucleonic matter from metamodeling. I. Foundational aspects

    NASA Astrophysics Data System (ADS)

    Margueron, Jérôme; Hoffmann Casali, Rudiney; Gulminelli, Francesca

    2018-02-01

    Metamodeling for the nucleonic equation of state (EOS), inspired by a Taylor expansion around the saturation density of symmetric nuclear matter, is proposed and parameterized in terms of the empirical parameters. The present knowledge of nuclear empirical parameters is first reviewed in order to estimate their average values and associated uncertainties, thereby defining the parameter space of the metamodeling. They are divided into isoscalar and isovector types, and ordered according to their power in the density expansion. The goodness of the metamodeling is analyzed against the predictions of the original models. In addition, since no correlation among the empirical parameters is assumed a priori, all arbitrary density dependences can be explored, which might not be accessible in existing functionals. Spurious correlations due to the assumed functional form are also removed. This meta-EOS allows direct relations between the uncertainties on the empirical parameters and the density dependence of the nuclear equation of state and its derivatives, and the mapping between the two can be done with standard Bayesian techniques. A sensitivity analysis shows that the most influential empirical parameters are the isovector parameters Lsym and Ksym, and that laboratory constraints at supersaturation densities are essential to reduce the present uncertainties. The present metamodeling for the EOS for nuclear matter is proposed for further applications in neutron stars and supernova matter.
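
    The metamodeling idea — a density expansion around saturation parameterized directly by empirical quantities — can be sketched at lowest order. The truncation at second order and the numerical values below are illustrative assumptions (commonly quoted ballpark figures in MeV), not the paper's full parameterization.

```python
# Lowest orders of the density expansion around saturation density.
N_SAT = 0.16                              # saturation density, fm^-3
E_SAT, K_SAT = -15.8, 230.0               # isoscalar: saturation energy, incompressibility
E_SYM, L_SYM, K_SYM = 32.0, 60.0, -100.0  # isovector: symmetry energy and its derivatives

def energy_per_nucleon(n, delta):
    """Meta-EOS truncated at second order in x = (n - n_sat) / (3 n_sat);
    n in fm^-3, delta = (n_n - n_p) / n is the isospin asymmetry."""
    x = (n - N_SAT) / (3.0 * N_SAT)
    isoscalar = E_SAT + 0.5 * K_SAT * x ** 2
    isovector = E_SYM + L_SYM * x + 0.5 * K_SYM * x ** 2
    return isoscalar + isovector * delta ** 2

e_sat_check = energy_per_nucleon(N_SAT, 0.0)   # symmetric matter at saturation
e_pnm_check = energy_per_nucleon(N_SAT, 1.0)   # pure neutron matter at n_sat
```

    At saturation (x = 0) only E_SAT and, for asymmetric matter, E_SYM survive, which is exactly why the empirical parameters map one-to-one onto the expansion coefficients.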

  7. Physical Interpretation of the Correlation Between Multi-Angle Spectral Data and Canopy Height

    NASA Technical Reports Server (NTRS)

    Schull, M. A.; Ganguly, S.; Samanta, A.; Huang, D.; Shabanov, N. V.; Jenkins, J. P.; Chiu, J. C.; Marshak, A.; Blair, J. B.; Myneni, R. B.

    2007-01-01

    Recent empirical studies have shown that multi-angle spectral data can be useful for predicting canopy height, but the physical reason for this correlation was not understood. We follow the concept of canopy spectral invariants, specifically escape probability, to gain insight into the observed correlation. Airborne Multi-Angle Imaging Spectrometer (AirMISR) and airborne Laser Vegetation Imaging Sensor (LVIS) data acquired during a NASA Terrestrial Ecology Program aircraft campaign underlie our analysis. Two multivariate linear regression models were developed to estimate LVIS height measures from 28 AirMISR multi-angle spectral reflectances and from the spectrally invariant escape probability at 7 AirMISR view angles. Both models achieved nearly the same accuracy, suggesting that canopy spectral invariant theory can explain the observed correlation. We hypothesize that the escape probability is sensitive to the aspect ratio (crown diameter to crown height). The multi-angle spectral data alone therefore may not provide enough information to retrieve canopy height globally.

  8. Relating Fresh Concrete Viscosity Measurements from Different Rheometers

    PubMed Central

    Ferraris, Chiara F.; Martys, Nicos S.

    2003-01-01

    Concrete rheological properties need to be properly measured and predicted in order to characterize the workability of fresh concrete, including special concretes such as self-consolidating concrete (SCC). It was shown by a round-robin test held in 2000 [1,2] that different rheometer designs gave different values of viscosity for the same concrete. While empirical correlation between different rheometers was possible, for a procedure that is supposed to “scientifically” improve on the empirical slump tests, this situation is unsatisfactory. To remedy this situation, a new interpretation of the data was developed. In this paper, it is shown that all instruments tested could be directly and quantitatively compared in terms of relative plastic viscosity instead of the plastic viscosity alone. This should eventually allow the measurements from various rheometer designs to be directly calibrated against known standards of plastic viscosity, putting concrete rheometry and concrete workability on a sounder materials science basis. PMID:27413607

  9. A model of rotationally-sampled wind turbulence for predicting fatigue loads in wind turbines

    NASA Technical Reports Server (NTRS)

    Spera, David A.

    1995-01-01

    Empirical equations are presented with which to model rotationally-sampled (R-S) turbulence for input to structural-dynamic computer codes and the calculation of wind turbine fatigue loads. These equations are derived from R-S turbulence data which were measured at the vertical-plane array in Clayton, New Mexico. For validation, the equations are applied to the calculation of cyclic flapwise blade loads for the NASA/DOE Mod-2 2.5-MW experimental HAWTs (horizontal-axis wind turbines), and the results are compared to measured cyclic loads. Good correlation is achieved, indicating that the R-S turbulence model developed in this study contains the characteristics of the wind which produce many of the fatigue loads sustained by wind turbines. Empirical factors are included which permit the prediction of load levels at specified percentiles of occurrence, which is required for the generation of fatigue load spectra and the prediction of the fatigue lifetime of structures.

  10. Predictions of avian Plasmodium expansion under climate change.

    PubMed

    Loiseau, Claire; Harrigan, Ryan J; Bichet, Coraline; Julliard, Romain; Garnier, Stéphane; Lendvai, Adám Z; Chastel, Olivier; Sorci, Gabriele

    2013-01-01

    Vector-borne diseases are particularly responsive to changing environmental conditions. Diurnal temperature variation has been identified as a particularly important factor for the development of malaria parasites within vectors. Here, we conducted a survey across France, screening populations of the house sparrow (Passer domesticus) for malaria (Plasmodium relictum). We investigated whether variation in remotely-sensed environmental variables accounted for the spatial variation observed in prevalence and parasitemia. While prevalence was highly correlated to diurnal temperature range and other measures of temperature variation, environmental conditions could not predict spatial variation in parasitemia. Based on our empirical data, we mapped malaria distribution under climate change scenarios and predicted that Plasmodium occurrence will spread to regions in northern France, and that prevalence levels are likely to increase in locations where transmission already occurs. Our findings, based on remote sensing tools coupled with empirical data, suggest that climatic change will significantly alter transmission of malaria parasites.

  11. Non-invasive Investigation of Human Hippocampal Rhythms Using Magnetoencephalography: A Review.

    PubMed

    Pu, Yi; Cheyne, Douglas O; Cornwell, Brian R; Johnson, Blake W

    2018-01-01

    Hippocampal rhythms are believed to support crucial cognitive processes including memory, navigation, and language. Due to the location of the hippocampus deep in the brain, studying hippocampal rhythms using non-invasive magnetoencephalography (MEG) recordings has generally been assumed to be methodologically challenging. However, with the advent of whole-head MEG systems in the 1990s and development of advanced source localization techniques, simulation and empirical studies have provided evidence that human hippocampal signals can be sensed by MEG and reliably reconstructed by source localization algorithms. This paper systematically reviews simulation studies and empirical evidence of the current capacities and limitations of MEG "deep source imaging" of the human hippocampus. Overall, these studies confirm that MEG provides a unique avenue to investigate human hippocampal rhythms in cognition, and can bridge the gap between animal studies and human hippocampal research, as well as elucidate the functional role and the behavioral correlates of human hippocampal oscillations.

  12. Spray scrubbing of particulate-laden SO2 using a critical flow atomizer.

    PubMed

    Bandyopadhyay, Amitava; Biswas, Manindra Nath

    2008-08-01

    The performance of a spray tower using an energy-efficient two-phase critical flow atomizer on the scrubbing of particulate-laden SO2 using water and dilute NaOH is reported in this article. Experimentation revealed that SO2 removal was enhanced by the presence of particles (fly-ash), and almost 100% removal efficiency was achieved in water scrubbing. The removal efficiency is elucidated in reference to atomizing air pressure, droplet diameter and droplet velocity, besides other pertinent variables of the system studied. The presence of fly-ash particles improved the removal efficiency by about 20% within the range of variables studied. Empirical and semi-empirical correlations were developed for predicting the removal efficiency in water and dilute NaOH, respectively. Predicted values agreed closely with the experimental data. The performance of the spray tower is compared with that of existing systems, with very encouraging results.

  14. On the Time Evolution of Gamma-Ray Burst Pulses: A Self-Consistent Description.

    PubMed

    Ryde; Svensson

    2000-01-20

    For the first time, the consequences of combining two well-established empirical relations that describe different aspects of the spectral evolution of observed gamma-ray burst (GRB) pulses are explored. These empirical relations are (1) the hardness-intensity correlation and (2) the hardness-photon fluence correlation. From these we find a self-consistent, quantitative, and compact description for the temporal evolution of pulse decay phases within a GRB light curve. In particular, we show that in the case in which the two empirical relations are both valid, the instantaneous photon flux (intensity) must behave as 1/(1 + t/τ), where τ is a time constant that can be expressed in terms of the parameters of the two empirical relations. The time evolution is fully defined by two initial constants and two parameters. We study a complete sample of 83 bright GRB pulses observed by the Compton Gamma-Ray Observatory and identify a major subgroup of GRB pulses (approximately 45%) which satisfy the spectral-temporal behavior described above. In particular, the decay phase follows a reciprocal law in time. It is unclear what physics causes such a decay phase.
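
    The stated decay law is compact enough to write down directly; a one-line sketch, with the initial flux F0 and time constant τ left as free parameters:

```python
def pulse_flux(t, f0, tau):
    """Decay-phase photon flux: F(t) = F0 / (1 + t/tau).
    At t = 0 the flux equals F0; at t = tau it has halved."""
    return f0 / (1.0 + t / tau)
```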

  15. CORRELATION PURSUIT: FORWARD STEPWISE VARIABLE SELECTION FOR INDEX MODELS

    PubMed Central

    Zhong, Wenxuan; Zhang, Tingting; Zhu, Yu; Liu, Jun S.

    2012-01-01

    In this article, a stepwise procedure, correlation pursuit (COP), is developed for variable selection under the sufficient dimension reduction framework, in which the response variable Y is influenced by the predictors X1, X2, …, Xp through an unknown function of a few linear combinations of them. Unlike linear stepwise regression, COP does not impose a special form of relationship (such as linear) between the response variable and the predictor variables. The COP procedure selects variables that attain the maximum correlation between the transformed response and the linear combination of the variables. Various asymptotic properties of the COP procedure are established, and in particular, its variable selection performance under diverging number of predictors and sample size has been investigated. The excellent empirical performance of the COP procedure in comparison with existing methods is demonstrated by both extensive simulation studies and a real example in functional genomics. PMID:23243388
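
    The forward-selection idea can be sketched as a greedy search over predictors. This simplified version substitutes ordinary least squares for the sufficient-dimension-reduction machinery the paper actually uses, so it illustrates the stepwise logic only; the synthetic data are invented for the sketch.

```python
import numpy as np

def correlation_pursuit(X, y, k):
    """Greedy forward selection: at each step add the predictor whose
    inclusion maximizes |corr| between y and the fitted linear combination
    of the selected set (OLS stand-in for the SDR criterion)."""
    selected = []
    remaining = list(range(X.shape[1]))
    for _ in range(k):
        best_j, best_r = None, -1.0
        for j in remaining:
            cols = selected + [j]
            A = np.column_stack([X[:, cols], np.ones(len(y))])
            fitted = A @ np.linalg.lstsq(A, y, rcond=None)[0]
            r = abs(np.corrcoef(fitted, y)[0, 1])
            if r > best_r:
                best_j, best_r = j, r
        selected.append(best_j)
        remaining.remove(best_j)
    return selected

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 6))
y = 2.0 * X[:, 3] - 1.0 * X[:, 0] + 0.1 * rng.normal(size=200)
selected = correlation_pursuit(X, y, 2)
```

    On this synthetic example the procedure should pick up column 3 first (largest coefficient) and then column 0.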

  16. Improved estimation of subject-level functional connectivity using full and partial correlation with empirical Bayes shrinkage.

    PubMed

    Mejia, Amanda F; Nebel, Mary Beth; Barber, Anita D; Choe, Ann S; Pekar, James J; Caffo, Brian S; Lindquist, Martin A

    2018-05-15

    Reliability of subject-level resting-state functional connectivity (FC) is determined in part by the statistical techniques employed in its estimation. Methods that pool information across subjects to inform estimation of subject-level effects (e.g., Bayesian approaches) have been shown to enhance reliability of subject-level FC. However, fully Bayesian approaches are computationally demanding, while empirical Bayesian approaches typically rely on using repeated measures to estimate the variance components in the model. Here, we avoid the need for repeated measures by proposing a novel measurement error model for FC describing the different sources of variance and error, which we use to perform empirical Bayes shrinkage of subject-level FC towards the group average. In addition, since the traditional intra-class correlation coefficient (ICC) is inappropriate for biased estimates, we propose a new reliability measure denoted the mean squared error intra-class correlation coefficient (ICC_MSE) to properly assess the reliability of the resulting (biased) estimates. We apply the proposed techniques to test-retest resting-state fMRI data on 461 subjects from the Human Connectome Project to estimate connectivity between 100 regions identified through independent components analysis (ICA). We consider both correlation and partial correlation as the measure of FC and assess the benefit of shrinkage for each measure, as well as the effects of scan duration. We find that shrinkage estimates of subject-level FC exhibit substantially greater reliability than traditional estimates across various scan durations, even for the most reliable connections and regardless of connectivity measure. Additionally, we find partial correlation reliability to be highly sensitive to the choice of penalty term, and to be generally worse than that of full correlations except for certain connections and a narrow range of penalty values. 
This suggests that the penalty needs to be chosen carefully when using partial correlations.
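
    The shrinkage step itself reduces to a variance-weighted average of the subject estimate and the group mean. A minimal sketch; the variance decomposition into a noise and a signal component, and the numbers used, are illustrative assumptions rather than the paper's fitted values.

```python
def shrink_fc(subject_fc, group_mean_fc, noise_var, signal_var):
    """Empirical Bayes shrinkage of a subject-level connectivity estimate
    toward the group average. The shrinkage weight is the fraction of the
    observed variance attributed to measurement noise."""
    lam = noise_var / (noise_var + signal_var)
    return lam * group_mean_fc + (1.0 - lam) * subject_fc

# With equal noise and signal variance the estimate moves halfway to the mean.
shrunk = shrink_fc(0.8, 0.4, noise_var=1.0, signal_var=1.0)
```

    The noisier the subject-level estimate relative to true between-subject variability, the harder it is pulled toward the group average, which is what drives the reliability gains reported above.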

  17. Examining perceptions of academic stress and its sources among university students: The Perception of Academic Stress Scale

    PubMed Central

    Bedewy, Dalia

    2015-01-01

    We describe the development of a scale to measure perceived sources of academic stress among university students. Based on empirical evidence and a recent literature review, we developed an 18-item scale to measure perceptions of academic stress and its sources. Experts (n = 12) participated in the content validation process of the instrument before it was administered to students (n = 100). The developed instrument has an internal consistency reliability of 0.7 (Cronbach’s alpha), there was evidence for content validity, and factor analysis resulted in four correlated and theoretically meaningful factors. We developed and tested a scale to measure academic stress and its sources. This scale takes 5 minutes to complete. PMID:28070363
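
    The internal consistency figure reported above is Cronbach's alpha, which can be computed directly from an items-by-respondents score matrix. The synthetic single-factor data below are invented for illustration; only the formula itself is standard.

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_respondents x n_items) score matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / variance of total score)."""
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)
    total_var = scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1.0)) * (1.0 - item_vars.sum() / total_var)

rng = np.random.default_rng(2)
common = rng.normal(size=(100, 1))                  # shared "stress" factor
items = common + 0.8 * rng.normal(size=(100, 18))   # 18 correlated items
alpha = cronbach_alpha(items)
```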

  18. Critical Values for Yen’s Q3: Identification of Local Dependence in the Rasch Model Using Residual Correlations

    PubMed Central

    Christensen, Karl Bang; Makransky, Guido; Horton, Mike

    2016-01-01

    The assumption of local independence is central to all item response theory (IRT) models. Violations can lead to inflated estimates of reliability and problems with construct validity. For the most widely used fit statistic Q3, there are currently no well-documented suggestions of the critical values which should be used to indicate local dependence (LD), and for this reason, a variety of arbitrary rules of thumb are used. In this study, an empirical data example and Monte Carlo simulation were used to investigate the different factors that can influence the null distribution of residual correlations, with the objective of proposing guidelines that researchers and practitioners can follow when making decisions about LD during scale development and validation. We propose that a parametric bootstrapping procedure be implemented in each situation to obtain the critical value of LD applicable to the data set, and we provide example critical values for a number of data structures. The results show that for the Q3 fit statistic, no single critical value is appropriate for all situations, as the percentiles in the empirical null distribution are influenced by the number of items, the sample size, and the number of response categories. Furthermore, the results show that LD should be considered relative to the average observed residual correlation, rather than to a uniform value, as this results in more stable percentiles for the null distribution of an adjusted fit statistic. PMID:29881087
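
    The bootstrapping recommendation can be sketched in simplified form: simulate under the null, collect the most extreme inter-item residual correlation, and read off a percentile. A real application would simulate responses from the fitted Rasch model and compute Q3 on model residuals; the independent normal residuals assumed here are a stand-in to show the mechanics only.

```python
import numpy as np

def q3_critical_value(n_persons, n_items, n_boot=200, pct=95, seed=3):
    """Percentile of the null distribution of the maximum absolute
    inter-item residual correlation under independence (simplified sketch)."""
    rng = np.random.default_rng(seed)
    maxima = np.empty(n_boot)
    for b in range(n_boot):
        resid = rng.normal(size=(n_persons, n_items))   # stand-in residuals
        corr = np.corrcoef(resid, rowvar=False)
        off_diag = corr[np.triu_indices(n_items, k=1)]
        maxima[b] = np.abs(off_diag).max()
    return np.percentile(maxima, pct)

cv = q3_critical_value(n_persons=200, n_items=10)
```

    Even this toy version shows why no universal rule of thumb works: the critical value shifts with both the number of items (more correlations to take the maximum over) and the sample size (tighter null correlations).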

  19. Application of empirical and mechanistic-empirical pavement design procedures to Mn/ROAD concrete pavement test sections

    DOT National Transportation Integrated Search

    1997-05-01

    Current pavement design procedures are based principally on empirical approaches. The current trend toward developing more mechanistic-empirical type pavement design methods led Minnesota to develop the Minnesota Road Research Project (Mn/ROAD), a lo...

  20. On the galaxy–halo connection in the EAGLE simulation

    DOE PAGES

    Desmond, Harry; Mao, Yao -Yuan; Wechsler, Risa H.; ...

    2017-06-13

    Empirical models of galaxy formation require assumptions about the correlations between galaxy and halo properties. These may be calibrated against observations or inferred from physical models such as hydrodynamical simulations. In this Letter, we use the EAGLE simulation to investigate the correlation of galaxy size with halo properties. We motivate this analysis by noting that the common assumption of angular momentum partition between baryons and dark matter in rotationally supported galaxies overpredicts both the spread in the stellar mass–size relation and the anticorrelation of size and velocity residuals, indicating a problem with the galaxy–halo connection it implies. We find the EAGLE galaxy population to perform significantly better on both statistics, and trace this success to the weakness of the correlations of galaxy size with halo mass, concentration and spin at fixed stellar mass. Using these correlations in empirical models will thereby enable fine-grained aspects of galaxy scalings to be matched.

  2. Implication of correlations among some common stability statistics - a Monte Carlo simulation.

    PubMed

    Piepho, H P

    1995-03-01

    Stability analysis of multilocation trials is often based on a mixed two-way model. Two stability measures in frequent use are the environmental variance (S_i^2) and the ecovalence (W_i). Under the two-way model the rank orders of the expected values of these two statistics are identical for a given set of genotypes. By contrast, empirical rank correlations among these measures are consistently low. This suggests that the two-way mixed model may not be appropriate for describing real data. To check this hypothesis, a Monte Carlo simulation was conducted. It revealed that the low empirical rank correlation between S_i^2 and W_i is most likely due to sampling errors. It is concluded that the observed low rank correlation does not invalidate the two-way model. The paper also discusses tests for homogeneity of S_i^2 as well as implications of the two-way model for the classification of stability statistics.
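
    Both statistics can be computed directly from a genotype-by-environment yield table; a minimal sketch, assuming a standard two-way layout with genotypes as rows (the toy yields are invented for illustration).

```python
import numpy as np

def stability_statistics(yields):
    """Environmental variance S_i^2 and ecovalence W_i from a
    genotype-by-environment yield matrix (rows = genotypes)."""
    env_var = yields.var(axis=1, ddof=1)                     # S_i^2
    g_mean = yields.mean(axis=1, keepdims=True)              # genotype means
    e_mean = yields.mean(axis=0, keepdims=True)              # environment means
    interaction = yields - g_mean - e_mean + yields.mean()   # GxE effects
    ecovalence = (interaction ** 2).sum(axis=1)              # W_i
    return env_var, ecovalence

# Two genotypes whose responses are parallel across three environments:
# no genotype-by-environment interaction, so both ecovalences are zero
# even though the environmental variances are not.
yields = np.array([[1.0, 2.0, 3.0],
                   [2.0, 3.0, 4.0]])
env_var, ecovalence = stability_statistics(yields)
```

    The toy example makes the conceptual difference concrete: S_i^2 responds to any variation across environments, while W_i responds only to the interaction term.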

  3. Closing in on Chemical Bonds by Opening up Relativity Theory

    PubMed Central

    Whitney, Cynthia Kolb

    2008-01-01

    This paper develops a connection between the phenomenology of chemical bonding and the theory of relativity. Empirical correlations between electron numbers in atoms and chemical bond stabilities in molecules are first reviewed and extended. Quantitative chemical bond strengths are then related to ionization potentials in elements. Striking patterns in ionization potentials are revealed when the data are viewed in an element-independent way, where element-specific details are removed via an appropriate scaling law. The scale factor involved is not explained by quantum mechanics; it is revealed only when one goes back further, to the development of Einstein’s special relativity theory. PMID:19325749

  4. Pluvials, Droughts, the Mongol Empire, and Modern Mongolia

    NASA Astrophysics Data System (ADS)

    Hessl, A. E.; Pederson, N.; Baatarbileg, N.; Anchukaitis, K. J.

    2013-12-01

    Understanding the connections between climate, ecosystems, and society during historical and modern climatic transitions requires annual resolution records with high fidelity climate signals. Many studies link the demise of complex societies with deteriorating climate conditions, but few have investigated the connection between climate, surplus energy, and the rise of empires. Inner Asia in the 13th century underwent a major political transformation requiring enormous energetic inputs that altered human history. The Mongol Empire, centered on the city of Karakorum, became the largest contiguous land empire in world history (Fig. 1 inset). Powered by domesticated grazing animals, the empire grew at the expense of sedentary agriculturalists across Asia, the Middle East, and Eastern Europe. Although some scholars and conventional wisdom agree that dry conditions spurred the Mongol conquests, little paleoenvironmental data at annual resolution are available to evaluate the role of climate in the development of the Mongol Empire. Here we present a 2600 year tree-ring reconstruction of warm-season, self-calibrating Palmer Drought Severity Index (scPDSI), a measure of water balance, derived from 107 live and dead Siberian pine (Pinus sibirica) trees growing on a Holocene lava flow in central Mongolia. Trees growing on the Khorgo lava flow today are stunted and widely spaced, occurring on microsites with little to no soil development. These trees are extremely water-stressed and their radial growth is well-correlated with both drought (scPDSI) and grassland productivity (Normalized Difference Vegetation Index (NDVI)). Our reconstruction, calibrated and validated on instrumental June-September scPDSI (1959-2009), accounts for 55.8% of the variability in regional scPDSI during the season in which 73% of the annual rainfall occurs. Our scPDSI reconstruction places historic and modern social change in Mongolia in the context of the range of climatic variability during the Common Era. 
Our record, in combination with a gridded temperature reconstruction, shows that the climate during the conquests of Chinggis Khaan's (Genghis Khan) 13th century Mongol Empire was warm and persistently wet. Tree-ring and meteorological data combined suggest that the early 21st century drought was the hottest drought in the last 1000 years, consistent with model projections of warming in Inner Asia. Future warming may overwhelm increases in precipitation leading to similar 'heat droughts', with potentially severe consequences for modern Mongolia.

  5. Predicting the particle size distribution of eroded sediment using artificial neural networks.

    PubMed

    Lagos-Avid, María Paz; Bonilla, Carlos A

    2017-03-01

    Water erosion causes soil degradation and nonpoint pollution. Pollutants are primarily transported on the surfaces of fine soil and sediment particles. Several models have been developed to estimate the size distribution of the sediment leaving the field, including physically-based models and empirical equations. Usually, physically-based models require a large amount of data, sometimes exceeding the amount of available data in the modeled area. Conversely, empirical equations do not always predict the sediment composition associated with individual events and may require data that are not always available. Therefore, the objective of this study was to develop a model to predict the particle size distribution (PSD) of eroded soil. A total of 41 erosion events from 21 soils were used. These data were compiled from previous studies. Correlation and multiple regression analyses were used to identify the main variables controlling sediment PSD. These variables were the particle size distribution in the soil matrix, the antecedent soil moisture condition, soil erodibility, and hillslope geometry. With these variables, an artificial neural network was calibrated using data from 29 events (r² = 0.98, 0.97, and 0.86 for sand, silt, and clay in the sediment, respectively) and then validated and tested on 12 events (r² = 0.74, 0.85, and 0.75 for sand, silt, and clay in the sediment, respectively). The artificial neural network was compared with three empirical models. The network presented better performance in predicting sediment PSD and differentiating rain-runoff events in the same soil. In addition to the quality of the particle distribution estimates, this model requires a small number of easily obtained variables, providing a convenient routine for predicting PSD in eroded sediment in other pollutant transport models.

  6. Oil price and exchange rate co-movements in Asian countries: Detrended cross-correlation approach

    NASA Astrophysics Data System (ADS)

    Hussain, Muntazir; Zebende, Gilney Figueira; Bashir, Usman; Donghong, Ding

    2017-01-01

    Most empirical studies investigate the relation between oil prices and exchange rates through models that measure the relationship on two time scales (long and short term) and often fail to observe the co-movement of these variables at different time scales. We apply detrended cross-correlation analysis (DCCA) to investigate the co-movements of the oil price and the exchange rate in 12 Asian countries. This approach determines the co-movement of the oil price and the exchange rate at different time scales. Both the exchange rate and oil price series exhibit unit roots, which makes their correlation and cross-correlation difficult to measure: results become spurious when a periodic trend or unit root is present. DCCA measures the cross-correlation at different time scales while controlling for the unit root problem. Our empirical results support the co-movement of oil prices and exchange rates, and indicate a weak negative cross-correlation between them for most Asian countries in our sample. The results have important monetary, fiscal, inflationary, and trade policy implications for these countries.
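
    A minimal sketch of the detrended cross-correlation coefficient for one box size (a standard DCCA recipe; the paper's exact variant, scales, and data differ), applied here to synthetic anti-correlated series:

```python
import numpy as np

def rho_dcca(x, y, n):
    """Detrended cross-correlation coefficient at box size n:
    integrate both series, linearly detrend within non-overlapping boxes,
    and normalize the detrended covariance by the two DFA fluctuations."""
    X = np.cumsum(x - np.mean(x))            # integrated profiles
    Y = np.cumsum(y - np.mean(y))
    f_xy, f_xx, f_yy = [], [], []
    t = np.arange(n)
    for start in range(0, len(X) - n + 1, n):
        xs, ys = X[start:start + n], Y[start:start + n]
        rx = xs - np.polyval(np.polyfit(t, xs, 1), t)   # box-wise detrending
        ry = ys - np.polyval(np.polyfit(t, ys, 1), t)
        f_xy.append(np.mean(rx * ry))
        f_xx.append(np.mean(rx * rx))
        f_yy.append(np.mean(ry * ry))
    return np.mean(f_xy) / np.sqrt(np.mean(f_xx) * np.mean(f_yy))

# Synthetic "oil" and "exchange rate" increments driven by a common factor
# with opposite sign, mimicking a negative cross-correlation.
rng = np.random.default_rng(1)
common = rng.normal(size=2000)
oil = common + 0.5 * rng.normal(size=2000)
fx = -common + 0.5 * rng.normal(size=2000)
```

Scanning `n` over a range of box sizes gives the scale-dependent co-movement the abstract refers to.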

  7. Estimation of surface PM10 concentration in Seoul during the DRAGON-Asia campaign based on the physical relationship between AOD and PM

    NASA Astrophysics Data System (ADS)

    Seo, S.; Kim, J.; Lee, H.; Jeong, U.; Kim, W. V.; Holben, B. N.; Kim, S.

    2013-12-01

    Atmospheric aerosols play a role in climate change and also have adverse effects on human health, such as respiratory and cardiovascular diseases. In terms of air quality, many studies have estimated surface-level particulate matter (PM) concentrations from satellite measurements to overcome the spatial limitations of ground-based aerosol measurements. In this study, we investigate the relationship between column aerosol optical depth (AOD) and surface PM10 concentration using aerosol measurements from the DRAGON (Distributed Regional Aerosol Gridded Observation Network)-Asia campaign, which took place in Seoul from March to May 2012. Based on the physical relationship between AOD and PM concentration, we develop various empirical linear models and evaluate their performance. The best correlation (r = 0.67) is obtained when the vertical and size distributions of aerosols are additionally considered, using the boundary layer height (BLH) from backscattered lidar signals and the effective radius provided in AERONET inversion products. Similarly, MODIS AOD divided by BLH shows the best correlation with hourly PM10 (r = 0.62). We also examine how the AOD-PM10 correlation varies with environmental characteristics across the complex megacity of Seoul, using aerosol optical properties measured at mesoscale resolution at 10 AERONET sites during the DRAGON campaign. Both AERONET and MODIS show higher correlations in residential areas than near source areas. Finally, we investigate seasonal effects on the performance of the empirical linear models and identify the important factors for PM estimation in each season.

  8. Evaluation of coarse and fine particles in diverse Indian environments.

    PubMed

    George, K V; Patil, Dinakar D; Anil, Mulukutla N V; Kamal, Neel; Alappat, Babu J; Kumar, Prashant

    2017-02-01

    Estimates of airborne fine particle (PM2.5) concentrations are possible through rigorous empirical correlations based on monitored PM10 data. However, such correlations change with the nature of the sources in diverse ambient environments and therefore have to be environment specific. Studies presenting such correlations are limited but needed, especially for areas where PM2.5 is not routinely monitored. Moreover, many studies focus on urban environments, but very few on coal mines and coastal areas. The aim of this study is to comprehensively analyze the concentrations of both PM10 and PM2.5 and to develop empirical correlations between them. Data from 26 sites spread over three distinct environments (a relatively clean coastal area, two coal mining areas, and a highly urbanized area in Delhi) were used. Distributions of PM in the 0.43-10-μm size range were measured using eight-stage cascade impactors. Regression analysis was used to estimate the percentage of PM2.5 in PM10 across the environments for source identification. Relatively low percentages of PM2.5 in PM10 (21, 28, and 32%) were found in the clean coastal area and the two mining areas, respectively. In the highly urbanized area of Delhi the percentage was 51%, indicating a much higher share of fine particles due to vehicular combustion. These findings are important for estimating the concentrations of the more harmful fine particles from coarse particles across distinct environments. The results are also useful for source identification, as differences in the percentage of PM2.5 in PM10 can be attributed to the characteristics of sources in the diverse ambient environments.
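
    The PM2.5-in-PM10 percentage can be sketched as the slope of a zero-intercept regression. The site data below are synthetic, with a Delhi-like 51% share assumed; the paper's actual regressions are site specific.

```python
import numpy as np

def pm25_fraction(pm10, pm25):
    """Slope of the zero-intercept fit pm25 ~ f * pm10, i.e. the average
    PM2.5 share of PM10 (a simple stand-in for site-specific regressions)."""
    return float(np.sum(pm10 * pm25) / np.sum(pm10 ** 2))

# Synthetic urban monitoring data (hypothetical, ug/m^3).
rng = np.random.default_rng(2)
pm10_urban = rng.uniform(50.0, 200.0, size=100)
pm25_urban = 0.51 * pm10_urban + rng.normal(0.0, 5.0, size=100)
frac = pm25_fraction(pm10_urban, pm25_urban)
```

Repeating the fit per environment (coastal, mining, urban) reproduces the kind of percentage contrast the abstract reports.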

  9. Determining accurate measurements of the growth rate from the galaxy correlation function in simulations

    NASA Astrophysics Data System (ADS)

    Contreras, Carlos; Blake, Chris; Poole, Gregory B.; Marin, Felipe

    2013-04-01

    We use high-resolution N-body simulations to develop a new, flexible empirical approach for measuring the growth rate from redshift-space distortions in the 2-point galaxy correlation function. We quantify the systematic error in measuring the growth rate in a 1 h⁻³ Gpc³ volume over a range of redshifts, from the dark matter particle distribution and a range of halo-mass catalogues with a number density comparable to the latest large-volume galaxy surveys such as the WiggleZ Dark Energy Survey and the Baryon Oscillation Spectroscopic Survey. Our simulations allow us to span halo masses with bias factors ranging from unity (probed by emission-line galaxies) to more massive haloes hosting luminous red galaxies. We show that the measured growth rate is sensitive to the model adopted for the small-scale real-space correlation function, and in particular that the `standard' assumption of a power-law correlation function can result in a significant systematic error in the growth-rate determination. We introduce a new, empirical fitting function that produces results with a lower (5-10 per cent) amplitude of systematic error. We also introduce a new technique which permits the galaxy pairwise velocity distribution, the quantity which drives the non-linear growth of structure, to be measured as a non-parametric stepwise function. Our (model-independent) results agree well with an exponential pairwise velocity distribution, expected from theoretical considerations, and are consistent with direct measurements of halo velocity differences from the parent catalogues. In a companion paper, we present the application of our new methodology to the WiggleZ Survey data set.

  10. Financial development and oil resource abundance-growth relations: evidence from panel data.

    PubMed

    Law, Siong Hook; Moradbeigi, Maryam

    2017-10-01

    This study investigates whether financial development dampens the negative impact of oil resource abundance on economic growth. Because of substantial cross-sectional dependence in our data, which cover a core sample of 63 oil-producing countries from 1980 through 2010, we use the common correlated effects mean group (CCEMG) estimator to account for the high degree of heterogeneity, and we drop the outlier countries. The empirical results reveal that the effect of oil resource abundance on output growth is contingent on the degree of development of financial markets. More developed financial markets can channel oil revenues into more productive activities and thus offset the negative effects of oil resource abundance on economic growth. Thus, better financial development can reverse the resource curse, or enhance the resource blessing, in oil-rich economies.

  11. Semi-empirical model for retrieval of soil moisture using RISAT-1 C-Band SAR data over a sub-tropical semi-arid area of Rewari district, Haryana (India)

    NASA Astrophysics Data System (ADS)

    Rawat, Kishan Singh; Sehgal, Vinay Kumar; Pradhan, Sanatan; Ray, Shibendu S.

    2018-03-01

    We estimated soil moisture (SM) using the circular horizontal polarization backscattering coefficient (σ°_RH), the difference between the circular vertical and horizontal backscattering coefficients (σ°_RV − σ°_RH) from FRS-1 data of the Radar Imaging Satellite (RISAT-1), and surface roughness expressed as RMS height (RMS_height). We examined the performance of FRS-1 in retrieving SM under a wheat crop at the tillering stage. Results revealed that a good semi-empirical model (SEM) for the SM of the upper soil layer can be developed from RISAT-1 SAR data, rather than relying on an existing empirical model based on the single parameter σ°. Near-surface SM measurements were related to σ°_RH and σ°_RV − σ°_RH derived from the 5.35 GHz (C-band) image of RISAT-1, together with RMS_height. The roughness component, RMS_height, showed a good positive correlation with σ°_RV − σ°_RH (R² = 0.65). Considering all the major influencing factors (σ°_RH, σ°_RV − σ°_RH, and RMS_height), an SEM was developed in which the predicted volumetric SM depends on these three variables. The SEM showed R² = 0.87 (adjusted R² = 0.85), multiple R = 0.94, and a standard error of 0.05 at the 95% confidence level. Validation of the SEM-derived SM against observed measurements (SM_Observed) gave root mean square error (RMSE) = 0.06, relative RMSE (R-RMSE) = 0.18, mean absolute error (MAE) = 0.04, normalized RMSE (NRMSE) = 0.17, Nash-Sutcliffe efficiency (NSE) = 0.91 (≈ 1), index of agreement (d) = 1, coefficient of determination (R²) = 0.87, mean bias error (MBE) = 0.04, standard error of estimate (SEE) = 0.10, volume error (VE) = 0.15, and variance of the distribution of differences (S_d²) = 0.004. The developed SEM performed better in estimating SM than the Topp empirical model, which is based only on σ°. Using the developed SEM, topsoil SM can be estimated with a low mean absolute percent error (MAPE) of 1.39 and can be used for operational applications.
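
    The validation statistics quoted above follow standard definitions; a minimal sketch of a few of them, computed on made-up soil moisture values rather than the paper's data:

```python
import numpy as np

def validation_metrics(obs, pred):
    """Standard goodness-of-fit measures (the paper's exact conventions,
    e.g. for normalization, may differ)."""
    err = pred - obs
    rmse = float(np.sqrt(np.mean(err ** 2)))
    mae = float(np.mean(np.abs(err)))
    mbe = float(np.mean(err))                                  # mean bias error
    nse = float(1 - np.sum(err ** 2) / np.sum((obs - obs.mean()) ** 2))
    # Willmott's index of agreement d
    d = float(1 - np.sum(err ** 2)
              / np.sum((np.abs(pred - obs.mean()) + np.abs(obs - obs.mean())) ** 2))
    return {"RMSE": rmse, "MAE": mae, "MBE": mbe, "NSE": nse, "d": d}

# Hypothetical volumetric soil moisture observations and model predictions.
obs = np.array([0.21, 0.25, 0.30, 0.28, 0.35, 0.18])
pred = obs + np.array([0.01, -0.02, 0.00, 0.02, -0.01, 0.01])
m = validation_metrics(obs, pred)
```

NSE near 1 and d near 1, as reported in the abstract, indicate a nearly unbiased model with small residual scatter.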

  12. Rate correlation for condensation of pure vapor on turbulent, subcooled liquid

    NASA Technical Reports Server (NTRS)

    Brown, J. Steven; Khoo, Boo Cheong; Sonin, Ain A.

    1990-01-01

    An empirical correlation is presented for the condensation of pure vapor on a subcooled, turbulent liquid with a shear-free interface. The correlation expresses the dependence of the condensation rate on fluid properties, on the liquid-side turbulence (which is imposed from below), and on the effects of buoyancy in the interfacial thermal layer. The correlation is derived from experiments with steam and water, but under conditions which simulate typical cryogenic fluids.

  13. Empirical microeconomics action functionals

    NASA Astrophysics Data System (ADS)

    Baaquie, Belal E.; Du, Xin; Tanputraman, Winson

    2015-06-01

    A statistical generalization of microeconomics has been made in Baaquie (2013), where the market price of every traded commodity, at each instant of time, is considered to be an independent random variable. The dynamics of commodity market prices is modeled by an action functional, and the focus of this paper is to empirically determine the action functionals for different commodities. The correlation functions of the model are defined using a Feynman path integral. The model is calibrated using the unequal time correlation of the market commodity prices as well as their cubic and quartic moments using a perturbation expansion. The consistency of the perturbation expansion is verified by a numerical evaluation of the path integral. Nine commodities drawn from the energy, metal and grain sectors are studied and their market behavior is described by the model to an accuracy of over 90% using only six parameters. The paper empirically establishes the existence of the action functional for commodity prices that was postulated to exist in Baaquie (2013).

  14. An Empirical Investigation of the Proposition that 'School Is Work': A Comparison of Personality-Performance Correlations in School and Work Settings

    ERIC Educational Resources Information Center

    Lounsbury, John W.; Gibson, Lucy W.; Sundstrom, Eric; Wilburn, Denise; Loveland, James M.

    2004-01-01

    An empirical test of Munson and Rubenstein's (1992) assertion that 'school is work' compared a sample of students in a high school with a sample of workers in a manufacturing plant in the same metropolitan area. Data from both samples included scores on six personality traits--Conscientiousness, Agreeableness, Openness, Emotional Stability,…

  15. Internalized Heterosexism: Measurement, Psychosocial Correlates, and Research Directions

    ERIC Educational Resources Information Center

    Szymanski, Dawn M.; Kashubeck-West, Susan; Meyer, Jill

    2008-01-01

    This article provides an integrated critical review of the literature on internalized heterosexism/internalized homophobia (IH), its measurement, and its psychosocial correlates. It describes the psychometric properties of six published measures used to operationalize the construct of IH. It also critically reviews empirical studies on correlates…

  16. Drag and stability characteristics of a variety of reefed and unreefed parachute configurations at Mach 1.80 with an empirical correlation for supersonic Mach numbers

    NASA Technical Reports Server (NTRS)

    Couch, L. M.

    1975-01-01

    An investigation was conducted at Mach 1.80 in the Langley 4-foot supersonic pressure tunnel to determine the effects of variation in reefing ratio and geometric porosity on the drag and stability characteristics of four basic canopy types deployed in the wake of a cone-cylinder forebody. The basic designs included cross, hemisflo, disk-gap-band, and extended-skirt canopies; however, modular cross and standard flat canopies and a ballute were also investigated. An empirical correlation was determined which provides a fair estimation of the drag coefficients in transonic and supersonic flow for parachutes of specified geometric porosity and reefing ratio.

  17. Analysis of transitional separation bubbles on infinite swept wings

    NASA Technical Reports Server (NTRS)

    Davis, R. L.; Carter, J. E.

    1986-01-01

    A previously developed two-dimensional local inviscid-viscous interaction technique for the analysis of airfoil transitional separation bubbles, ALESEP (Airfoil Leading Edge Separation), has been extended for the calculation of transitional separation bubbles over infinite swept wings. As part of this effort, Roberts' empirical correlation, which is interpreted as a separated flow empirical extension of Mack's stability theory for attached flows, has been incorporated into the ALESEP procedure for the prediction of the transition location within the separation bubble. In addition, the viscous procedure used in the ALESEP techniques has been modified to allow for wall suction. A series of two-dimensional calculations is presented as a verification of the prediction capability of the interaction techniques with the Roberts' transition model. Numerical tests have shown that this two-dimensional natural transition correlation may also be applied to transitional separation bubbles over infinite swept wings. Results of the interaction procedure are compared with Horton's detailed experimental data for separated flow over a swept plate which demonstrates the accuracy of the present technique. Wall suction has been applied to a similar interaction calculation to demonstrate its effect on the separation bubble. The principal conclusion of this paper is that the prediction of transitional separation bubbles over two-dimensional or infinite swept geometries is now possible using the present interacting boundary layer approach.

  18. Empirical Modeling of the Statistical Structure of Radio Signals from Satellites Moving over Mid- and High-Latitude Trajectories in the Southern Hemisphere

    NASA Astrophysics Data System (ADS)

    Fatkullin, M. N.; Solodovnikov, G. K.; Trubitsyn, V. M.

    2004-01-01

    The results of developing an empirical model of the parameters of radio signals propagating in the inhomogeneous ionosphere at middle and high latitudes are presented. The initial data were homogeneous observations carried out at the Antarctic Molodezhnaya station by continuous-transmission probing of the ionosphere with signals of the Transit satellite radionavigation system at coherent frequencies of 150 and 400 MHz. The data correspond to the summer season in the Earth's Southern hemisphere in 1988-1989, during high solar activity (F > 160). The behavior of the following statistical characteristics of the radio signal parameters was analyzed: (a) the correlation interval of amplitude fluctuations at 150 MHz (τ_kA); (b) the correlation interval of difference-phase fluctuations (τ_kϕ); and (c) the parameters characterizing the frequency spectra of amplitude (P_A) and phase (P_ϕ) fluctuations. A third-degree polynomial was used to model the propagation parameters; for each of the parameters above, the polynomial coefficients were calculated as functions of local time and magnetic activity. The results of the calculations are tabulated.

  19. Multifractality, efficiency analysis of Chinese stock market and its cross-correlation with WTI crude oil price

    NASA Astrophysics Data System (ADS)

    Zhuang, Xiaoyang; Wei, Yu; Ma, Feng

    2015-07-01

    In this paper, the multifractality and efficiency degrees of ten important Chinese sectoral indices are evaluated using MF-DFA and generalized Hurst exponents. The study also scrutinizes the dynamics of the efficiency of the Chinese sectoral stock market with a rolling window approach. The empirical findings reveal that all the sectoral indices of the Chinese stock market exhibit different degrees of multifractality. The different efficiency measures agree that the 300 Materials index is the least efficient, but differ slightly on the most efficient one; the 300 Information Technology, 300 Telecommunication Services, and 300 Health Care indices are comparatively efficient. We also investigate the cross-correlations between the ten sectoral indices and the WTI crude oil price based on Multifractal Detrended Cross-Correlation Analysis. Finally, some relevant discussions and implications of the empirical results are presented.

  20. Droplet breakup in accelerating gas flows. Part 2: Secondary atomization

    NASA Technical Reports Server (NTRS)

    Zajac, L. J.

    1973-01-01

    An experimental investigation was conducted to determine the effects of an accelerating gas flow on the atomization characteristics of liquid sprays. The sprays were produced by impinging two liquid jets; the liquid was molten wax and the gas was nitrogen. The use of molten wax allowed a quantitative measure of the resulting dropsize distribution. The results indicate that a significant amount of droplet breakup occurs as a result of the action of the gas on the liquid droplets. Empirical correlations are presented in terms of the parameters found to affect the mass median dropsize most significantly: the orifice diameter, the liquid injection velocity, and the maximum gas velocity. An empirical correlation for the normalized dropsize distribution is also presented. These correlations are in a form that may be readily incorporated into existing combustion model computer codes for calculating rocket engine combustion performance.

  1. Analysis of Vibration and Noise of Construction Machinery Based on Ensemble Empirical Mode Decomposition and Spectral Correlation Analysis Method

    NASA Astrophysics Data System (ADS)

    Chen, Yuebiao; Zhou, Yiqi; Yu, Gang; Lu, Dan

    In order to analyze the effect of engine vibration on cab noise of construction machinery in multiple frequency bands, a new method based on ensemble empirical mode decomposition (EEMD) and spectral correlation analysis is proposed. First, the intrinsic mode functions (IMFs) of the vibration and noise signals are obtained by EEMD, and the IMFs occupying the same frequency bands are selected. Second, the spectral correlation coefficients between the selected IMFs are calculated, identifying the main frequency bands in which engine vibration has a significant impact on cab noise. Third, the dominant frequencies are picked out and analyzed by spectral analysis. The results show that the main frequency bands and dominant frequencies in which engine vibration seriously affects cab noise can be identified effectively by the proposed method, providing effective guidance for noise reduction of construction machinery.
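
    A sketch of the band-matching and spectral-correlation step. EEMD itself is not reimplemented here: a crude FFT band-pass stands in for the IMF extraction, and the signals, frequencies, and band limits below are all hypothetical.

```python
import numpy as np

def bandpass(sig, fs, lo, hi):
    """Crude FFT band-pass (a stand-in for selecting EEMD modes that
    occupy the same frequency band)."""
    F = np.fft.rfft(sig)
    f = np.fft.rfftfreq(len(sig), 1.0 / fs)
    F[(f < lo) | (f > hi)] = 0.0
    return np.fft.irfft(F, n=len(sig))

def spectral_correlation(a, b):
    """Correlation coefficient between the amplitude spectra of two signals."""
    A, B = np.abs(np.fft.rfft(a)), np.abs(np.fft.rfft(b))
    return float(np.corrcoef(A, B)[0, 1])

# Hypothetical signals: a 30 Hz engine order that leaks into the cab noise.
fs = 1000.0
t = np.arange(0.0, 2.0, 1.0 / fs)
engine = np.sin(2 * np.pi * 30.0 * t)
cab = (0.8 * np.sin(2 * np.pi * 30.0 * t + 0.3)
       + 0.2 * np.random.default_rng(3).normal(size=t.size))

rho = spectral_correlation(bandpass(engine, fs, 20.0, 40.0),
                           bandpass(cab, fs, 20.0, 40.0))
```

A high coefficient in a given band flags it as one where engine vibration drives cab noise, which is the selection criterion the abstract describes.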

  2. Some empirical evidence for ecological dissonance theory.

    PubMed

    Miller, D I; Verhoek-Miller, N; Giesen, J M; Wells-Parker, E

    2000-04-01

    Using Festinger's cognitive dissonance theory as a model, the extension to Barker's ecological theory, referred to as ecological dissonance theory, was developed. Designed to examine the motivational dynamics involved when environmental systems are in conflict with each other or with cognitive systems, ecological dissonance theory yielded five propositions which were tested in 10 studies. This summary of the studies suggests operationally defined measures of ecological dissonance may correlate with workers' satisfaction with their jobs, involvement with their jobs, alienation from their work, and to a lesser extent, workers' conflict resolution behavior and communication style.

  3. An evaluation of dynamic mutuality measurements and methods in cyclic time series

    NASA Astrophysics Data System (ADS)

    Xia, Xiaohua; Huang, Guitian; Duan, Na

    2010-12-01

    Several measurements and techniques have been developed to detect dynamic mutuality and synchronicity of time series in econometrics. This study aims to compare the performances of five methods, i.e., linear regression, dynamic correlation, Markov switching models, concordance index and recurrence quantification analysis, through numerical simulations. We evaluate the abilities of these methods to capture structure changing and cyclicity in time series and the findings of this paper would offer guidance to both academic and empirical researchers. Illustration examples are also provided to demonstrate the subtle differences of these techniques.

  4. Sea level side loads in high-area-ratio rocket engines

    NASA Technical Reports Server (NTRS)

    Nave, L. H.; Coffey, G. A.

    1973-01-01

    An empirical separation and side load model to obtain applied aerodynamic loads has been developed based on data obtained from full-scale J-2S (265K-pound-thrust engine with an area ratio of 40:1) engine and model testing. Experimental data include visual observations of the separation patterns that show the dynamic nature of the separation phenomenon. Comparisons between measured and applied side loads are made. Correlations relating the separation location to the applied side loads and the methods used to determine the separation location are given.

  5. Fan Noise Prediction: Status and Needs

    NASA Technical Reports Server (NTRS)

    Huff, Dennis L.

    1997-01-01

    The prediction of fan noise is an important part to the prediction of overall turbofan engine noise. Advances in computers and better understanding of the flow physics have allowed researchers to compute sound generation from first principles and rely less on empirical correlations. While progress has been made, there are still many aspects of the problem that need to be explored. This paper presents some recent advances in fan noise prediction and suggests areas that still need further development. Fan noise predictions that support the recommendations are taken from existing publications.

  6. An Empirical Examination of the Anomie Theory of Drug Use.

    ERIC Educational Resources Information Center

    Dull, R. Thomas

    1983-01-01

    Investigated the relationship between anomie theory, as measured by Srole's Anomie Scale, and self-admitted drug use in an adult population (N=1,449). Bivariate cross-comparison correlations indicated anomie was significantly correlated with several drug variables, but these associations were extremely weak and of little explanatory value.…

  7. Confirmatory factor analysis of the Child Oral Health Impact Profile (Korean version).

    PubMed

    Cho, Young Il; Lee, Soonmook; Patton, Lauren L; Kim, Hae-Young

    2016-04-01

    Empirical support for the factor structure of the Child Oral Health Impact Profile (COHIP) has not been fully established. The purposes of this study were to evaluate the factor structure of the Korean version of the COHIP (COHIP-K) empirically using confirmatory factor analysis (CFA) based on the theoretical framework and then to assess whether any of the factors in the structure could be grouped into a simpler single second-order factor. Data were collected through self-reported COHIP-K responses from a representative community sample of 2,236 Korean children, 8-15 yr of age. Because a large inter-factor correlation of 0.92 was estimated in the original five-factor structure, the two strongly correlated factors were combined into one factor, resulting in a four-factor structure. The revised four-factor model showed a reasonable fit with appropriate inter-factor correlations. Additionally, the second-order model with four sub-factors was reasonable with sufficient fit and showed equal fit to the revised four-factor model. A cross-validation procedure confirmed the appropriateness of the findings. Our analysis empirically supported a four-factor structure of COHIP-K, a summarized second-order model, and the use of an integrated summary COHIP score. © 2016 Eur J Oral Sci.

  8. Scaling Dissolved Nutrient Removal in River Networks: A Comparative Modeling Investigation

    NASA Astrophysics Data System (ADS)

    Ye, Sheng; Reisinger, Alexander J.; Tank, Jennifer L.; Baker, Michelle A.; Hall, Robert O.; Rosi, Emma J.; Sivapalan, Murugesu

    2017-11-01

    Along the river network, water, sediment, and nutrients are transported, cycled, and altered by coupled hydrological and biogeochemical processes. Our current understanding of the rates and processes controlling the cycling and removal of dissolved inorganic nutrients in river networks is limited by a lack of empirical measurements in large (nonwadeable) rivers. The goal of this paper was to develop a coupled hydrological and biogeochemical process model to simulate nutrient uptake at the network scale during summer base flow conditions. The model was parameterized with literature values from headwater streams and with empirical measurements made in 15 rivers with varying hydrological, biological, and topographic characteristics. We applied the coupled model to 15 catchments, describing uptake patterns for three different solutes to determine the role of rivers in network-scale nutrient cycling. Model simulations, constrained by empirical data, suggested that rivers contribute proportionally more to nutrient removal than headwater streams, given the fraction of network length they represent. In addition, the variability of nutrient removal patterns among catchments differed among solutes and, as expected, was influenced by nutrient concentration and discharge. Net ammonium uptake was not significantly correlated with any environmental descriptor. In contrast, net daily nitrate removal was linked to suspended chlorophyll a (an indicator of primary producers) and land use characteristics. Finally, suspended sediment characteristics and agricultural land use were correlated with net daily removal of soluble reactive phosphorus, likely reflecting abiotic sorption dynamics. Rivers are understudied relative to streams, and our model suggests that they can contribute more to network-scale nutrient removal than would be expected from their fraction of network channel length.

  9. Imaging the Material Properties of Bone Specimens using Reflection-Based Infrared Microspectroscopy

    PubMed Central

    Acerbo, Alvin S.; Carr, G. Lawrence; Judex, Stefan; Miller, Lisa M.

    2012-01-01

    Fourier Transform InfraRed Microspectroscopy (FTIRM) is a widely used method for mapping the material properties of bone and other mineralized tissues, including mineralization, crystallinity, carbonate substitution, and collagen cross-linking. This technique is traditionally performed in a transmission-based geometry, which requires the preparation of plastic-embedded thin sections, limiting its functionality. Here, we theoretically and empirically demonstrate the development of reflection-based FTIRM as an alternative to the widely adopted transmission-based FTIRM, which reduces specimen preparation time and broadens the range of specimens that can be imaged. In this study, mature mouse femurs were plastic-embedded and longitudinal sections were cut at a thickness of 4 μm for transmission-based FTIRM measurements. The remaining bone blocks were polished for specular reflectance-based FTIRM measurements on regions immediately adjacent to the transmission sections. Kramers-Kronig analysis of the reflectance data yielded the dielectric response, from which the absorption coefficients were directly determined. The reflectance-derived absorbance was validated empirically using the transmission spectra from the thin sections. The spectral assignments for mineralization, carbonate substitution, and collagen cross-linking were indistinguishable in the transmission and reflection geometries, while the stoichiometric/non-stoichiometric apatite crystallinity parameter shifted from 1032/1021 cm⁻¹ in the transmission-based data to 1035/1025 cm⁻¹ in the reflection-based data. This theoretical demonstration and empirical validation of reflection-based FTIRM eliminates the need for thin sections of bone and more readily facilitates direct correlations with other methods, such as nanoindentation and quantitative backscatter electron imaging (qBSE), on the same specimen. It provides a unique framework for correlating bone's material and mechanical properties. PMID:22455306

  10. Estimating surface pCO2 in the northern Gulf of Mexico: Which remote sensing model to use?

    NASA Astrophysics Data System (ADS)

    Chen, Shuangling; Hu, Chuanmin; Cai, Wei-Jun; Yang, Bo

    2017-12-01

    Various approaches and models have been proposed to remotely estimate surface pCO2 in the ocean, with variable performance because they were designed for different environments. Among these, a recently developed mechanistic semi-analytical approach (MeSAA) has the advantage of explicitly including physical and biological forcing in the model, yet its general applicability is unknown. Here, with extensive in situ measurements of surface pCO2, the MeSAA, originally developed for the summertime East China Sea, was tested in the northern Gulf of Mexico (GOM), where river plumes dominate the water's biogeochemical properties during summer. Specifically, the MeSAA-predicted surface pCO2 was estimated by combining the dominant effects of thermodynamics, river-ocean mixing, and biological activities on surface pCO2. Firstly, the effects of thermodynamics and river-ocean mixing (pCO2@Hmixing) were estimated with a two-endmember mixing model, assuming conservative mixing. Secondly, pCO2 variations caused by biological activities (ΔpCO2@bio) were determined through an empirical relationship between sea surface temperature (SST)-normalized pCO2 and MODIS (Moderate Resolution Imaging Spectroradiometer) 8-day composite chlorophyll concentration (CHL). The MeSAA-modeled pCO2 (the sum of pCO2@Hmixing and ΔpCO2@bio) was compared with the field-measured pCO2: the Root Mean Square Error (RMSE) was 22.94 μatm (5.91%), with a coefficient of determination (R2) of 0.25, a mean bias (MB) of −0.23 μatm, and a mean ratio (MR) of 1.001, for pCO2 ranging between 316 and 452 μatm. To improve the model performance, a locally tuned MeSAA was developed through the use of a locally tuned ΔpCO2@bio term. A multi-variate empirical regression model was also developed using the same dataset. Both the locally tuned MeSAA and the regression model showed improved performance compared with the original MeSAA, with R2 of 0.78 and 0.84, RMSE of 12.36 μatm (3.14%) and 10.66 μatm (2.68%), MB of 0.00 μatm and −0.10 μatm, and MR of 1.001 and 1.000, respectively. A sensitivity analysis was conducted to study the uncertainties in the predicted pCO2 resulting from uncertainties in the input variables of each model. Although the MeSAA was more sensitive to variations in SST and CHL than in sea surface salinity (SSS), and the locally tuned MeSAA and the empirical regression model were more sensitive to changes in SST and SSS than in CHL, for all three models the bias induced by uncertainties in the empirically derived parameters (river endmember total alkalinity (TA) and dissolved inorganic carbon (DIC), and the biological coefficients of the MeSAA and locally tuned MeSAA models) and in the environmental variables (SST, SSS, CHL) was generally within or close to the uncertainty of each model. While all three models showed that surface pCO2 was positively correlated with SST, the MeSAA showed a negative correlation between surface pCO2 and both SSS and CHL, whereas the locally tuned MeSAA and the empirical regression showed the opposite. These results suggest that the locally tuned MeSAA worked better in the river-dominated northern GOM than the original MeSAA; it had slightly worse statistics than the empirical regression model but offers more meaningful physical and biogeochemical interpretations. Because data from abnormal upwelling events were not used to train the models, they are not applicable to waters with strong upwelling, although the empirical regression approach showed the potential to be further tuned to such cases.
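    The skill statistics quoted above (RMSE, MB, MR) can be reproduced with a short sketch; the numbers below are synthetic, not the study's data, and the definitions are the conventional ones.

```python
import math

def model_skill(pred, obs):
    """Root mean square error, mean bias, and mean ratio of modelled
    versus measured surface pCO2 (conventional definitions)."""
    n = len(obs)
    rmse = math.sqrt(sum((p - o) ** 2 for p, o in zip(pred, obs)) / n)
    mb = sum(p - o for p, o in zip(pred, obs)) / n
    mr = sum(p / o for p, o in zip(pred, obs)) / n
    return rmse, mb, mr

obs = [320.0, 360.0, 400.0, 440.0]    # measured pCO2 (uatm), synthetic
pred = [325.0, 355.0, 405.0, 438.0]   # modelled pCO2 (uatm), synthetic
rmse, mb, mr = model_skill(pred, obs)
```

    Reporting RMSE alongside MB and MR, as the study does, separates scatter from systematic offset: a model can have near-zero bias yet a large RMSE.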

  11. Role of local network oscillations in resting-state functional connectivity.

    PubMed

    Cabral, Joana; Hugues, Etienne; Sporns, Olaf; Deco, Gustavo

    2011-07-01

    Spatio-temporally organized low-frequency fluctuations (<0.1 Hz), observed in the BOLD fMRI signal during rest, suggest the existence of underlying network dynamics that emerge spontaneously from intrinsic brain processes. Furthermore, significant correlations between distinct anatomical regions, or functional connectivity (FC), have led to the identification of several widely distributed resting-state networks (RSNs). These slow dynamics appear to be highly structured by anatomical connectivity, but the mechanism behind them and their relationship with neural activity, particularly in the gamma frequency range, remain largely unknown. Indeed, direct measurements of neuronal activity have revealed similar large-scale correlations, particularly in slow power fluctuations of local field potential oscillations in the gamma frequency range. To address these questions, we investigated neural dynamics in a large-scale model of the human brain's neural activity. A key ingredient of the model was a structural brain network defined by empirically derived long-range brain connectivity together with the corresponding conduction delays. A neural population, assumed to spontaneously oscillate in the gamma frequency range, was placed at each network node. When these oscillatory units are integrated in the network, they behave as weakly coupled oscillators. The time-delayed interaction between nodes is described by the Kuramoto model of phase oscillators, a biologically based model of coupled oscillatory systems. For a realistic setting of axonal conduction speed, we show that time-delayed network interaction leads to the emergence of slow neural activity fluctuations whose patterns correlate significantly with the empirically measured FC. The best agreement of the simulated FC with the empirically measured FC is found for a set of parameters where subsets of nodes tend to synchronize although the network is not globally synchronized. 
Inside such clusters, the simulated BOLD signal between nodes is found to be correlated, instantiating the empirically observed RSNs. Between clusters, patterns of positive and negative correlations are observed, as described in experimental studies. These results are found to be robust with respect to a biologically plausible range of model parameters. In conclusion, our model suggests how resting-state neural activity can originate from the interplay between the local neural dynamics and the large-scale structure of the brain. Copyright © 2011 Elsevier Inc. All rights reserved.
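    The model's core ingredient, phase oscillators coupled with a transmission delay, can be sketched in toy form. This is a minimal global-coupling version with a single uniform delay; the paper instead couples nodes through an empirical connectome with distance-dependent delays, so all names and parameters here are illustrative.

```python
import cmath
import math
import random

def kuramoto_delayed(N, K, omega, delay_steps, dt=0.001, steps=10000, seed=0):
    """Euler integration of the time-delayed Kuramoto model
    dtheta_i/dt = omega_i + (K/N) * sum_j sin(theta_j(t - tau) - theta_i(t))."""
    rng = random.Random(seed)
    hist = [[rng.uniform(0, 2 * math.pi) for _ in range(N)]]  # phase history
    for _ in range(steps):
        cur = hist[-1]
        lag = hist[max(0, len(hist) - 1 - delay_steps)]       # delayed phases
        hist.append([cur[i] + dt * (omega[i] + K / N * sum(
            math.sin(lag[j] - cur[i]) for j in range(N)))
            for i in range(N)])
    return hist[-1]

# Order parameter r = |mean(exp(i*theta))| measures global synchrony.
theta = kuramoto_delayed(N=5, K=4.0, omega=[1.0] * 5, delay_steps=0)
r_sync = abs(sum(cmath.exp(1j * th) for th in theta) / 5)
```

    With identical frequencies and no delay the population synchronizes fully; it is the interplay of delays and heterogeneous coupling that, in the paper's regime, keeps the network only cluster-synchronized and generates the slow FC-like fluctuations.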

  12. Improving Global Models of Remotely Sensed Ocean Chlorophyll Content Using Partial Least Squares and Geographically Weighted Regression

    NASA Astrophysics Data System (ADS)

    Gholizadeh, H.; Robeson, S. M.

    2015-12-01

    Empirical models have been widely used to estimate global chlorophyll content from remotely sensed data. Here, we focus on the standard NASA empirical models that use blue-green band ratios. These band-ratio ocean color (OC) algorithms take the form of fourth-order polynomials, and the parameters of these polynomials (i.e. coefficients) are estimated from the NASA bio-Optical Marine Algorithm Data set (NOMAD). Most of the points in this data set were sampled from tropical and temperate regions; however, the polynomial coefficients obtained from it are used to estimate chlorophyll content in all ocean regions, despite differing properties such as sea-surface temperature, salinity, and downwelling/upwelling patterns. Further, the polynomial terms in these models are highly correlated. In sum, the limitations of these empirical models are as follows: 1) the independent variables within the empirical models, in their current form, are correlated (multicollinear), and 2) the current algorithms are global approaches based on a spatial stationarity assumption, so they are independent of location. The multicollinearity problem is resolved by using partial least squares (PLS). PLS, which transforms the data into a set of independent components, can be considered a combined form of principal component regression (PCR) and multiple regression. Geographically weighted regression (GWR) is also used to investigate the validity of the spatial stationarity assumption: GWR solves a regression model at each sample point using the observations within its neighbourhood. Results show that the standard empirical method underestimates chlorophyll content in high latitudes, including the Southern Ocean region, when compared to PLS (see Figure 1). Cluster analysis of the GWR coefficients also shows that the spatial stationarity assumption in the empirical models is not likely valid.
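    The multicollinearity the authors describe is easy to demonstrate: successive powers of the log band ratio entering the OC polynomial algorithms are themselves strongly correlated. The range below is an assumed, typical one, not NOMAD data.

```python
import math

def pearson(a, b):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov / math.sqrt(va * vb)

# x = log10(blue/green band ratio), sampled over an assumed typical range
x = [i / 100 for i in range(-30, 151)]      # -0.30 ... 1.50
powers = {p: [xi ** p for xi in x] for p in (1, 2, 3, 4)}
r12 = pearson(powers[1], powers[2])          # x vs x^2
r34 = pearson(powers[3], powers[4])          # x^3 vs x^4
```

    Such near-unity correlations among predictors are what PLS sidesteps by regressing on a small set of orthogonal components instead of the raw polynomial terms.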

  13. Specification-based software sizing: An empirical investigation of function metrics

    NASA Technical Reports Server (NTRS)

    Jeffery, Ross; Stathis, John

    1993-01-01

    For some time the software industry has espoused the need for improved specification-based software size metrics. This paper reports on a study of nineteen recently developed systems in a variety of application domains. The systems were developed by a single software services corporation using a variety of languages. The study investigated several metric characteristics. It shows that: earlier research into inter-item correlation within the overall function count is partially supported; a priori function counts, in themselves, do not explain the majority of the effort variation in software development in the organization studied; documentation quality is critical to accurate function identification; and rater error is substantial in manual function counting. The implications of these findings for organizations using function-based metrics are explored.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zamecnik, J. R.; Edwards, T. B.

    The conversions of nitrite to nitrate, the destruction of glycolate, and the conversion of glycolate to formate and oxalate were modeled for the Nitric-Glycolic flowsheet using data from Chemical Process Cell (CPC) simulant runs conducted by SRNL from 2011 to 2015. The goal of this work was to develop empirical correlations for these variables versus measurable variables from the chemical process, so that these quantities could be predicted a priori from the sludge composition and measurable processing variables. The need for these predictions arises from the need to predict the REDuction/OXidation (REDOX) state of the glass from the Defense Waste Processing Facility (DWPF) melter. This report summarizes the initial work on these correlations based on the aforementioned data. Further refinement of the models as additional data are collected is recommended.

  15. [Mobbing: a meta-analysis and integrative model of its antecedents and consequences].

    PubMed

    Topa Cantisano, Gabriela; Depolo, Marco; Morales Domínguez, J Francisco

    2007-02-01

    Although mobbing has been extensively studied, empirical research has not led to firm conclusions regarding its antecedents and consequences at both the personal and organizational levels. An extensive literature search yielded 86 empirical studies with 93 samples. The correlation matrix obtained through meta-analytic techniques was used to test a structural equation model. Results supported hypotheses regarding organizational environmental factors as the main predictors of mobbing.

  16. Content-Related Knowledge of Biology Teachers from Secondary Schools: Structure and learning opportunities

    NASA Astrophysics Data System (ADS)

    Großschedl, Jörg; Mahler, Daniela; Kleickmann, Thilo; Harms, Ute

    2014-09-01

    Teachers' content-related knowledge is a key factor influencing the learning progress of students. Different models of content-related knowledge have been proposed by educational researchers; most of them take into account three categories: content knowledge, pedagogical content knowledge, and curricular knowledge. As there is no consensus yet about the empirical separability (i.e. empirical structure) of content-related knowledge, a total of 134 biology teachers from secondary schools completed three tests, one designed to capture each of the three categories of content-related knowledge. The empirical structure of content-related knowledge was analyzed by Rasch analysis, which suggests that content-related knowledge is composed of (1) content knowledge, (2) pedagogical content knowledge, and (3) curricular knowledge. Pedagogical content knowledge and curricular knowledge are highly related (rlatent = .70). The latent correlations between content knowledge and pedagogical content knowledge (rlatent = .48), and curricular knowledge, respectively (rlatent = .35), are moderate to low (all ps < .001). Beyond the empirical structure of content-related knowledge, different learning opportunities for teachers were investigated with regard to their relationship to content knowledge, pedagogical content knowledge, and curricular knowledge acquisition. Our results show that in-depth training in teacher education, professional development, and teacher self-study are positively related to particular categories of content-related knowledge. Furthermore, our results indicate that teaching experience is negatively related to curricular knowledge, while showing no significant relationship with content knowledge or pedagogical content knowledge.

  17. Re-evaluating the link between brain size and behavioural ecology in primates.

    PubMed

    Powell, Lauren E; Isler, Karin; Barton, Robert A

    2017-10-25

    Comparative studies have identified a wide range of behavioural and ecological correlates of relative brain size, with results differing between taxonomic groups, and even within them. In primates for example, recent studies contradict one another over whether social or ecological factors are critical. A basic assumption of such studies is that with sufficiently large samples and appropriate analysis, robust correlations indicative of selection pressures on cognition will emerge. We carried out a comprehensive re-examination of correlates of primate brain size using two large comparative datasets and phylogenetic comparative methods. We found evidence in both datasets for associations between brain size and ecological variables (home range size, diet and activity period), but little evidence for an effect of social group size, a correlation which has previously formed the empirical basis of the Social Brain Hypothesis. However, reflecting divergent results in the literature, our results exhibited instability across datasets, even when they were matched for species composition and predictor variables. We identify several potential empirical and theoretical difficulties underlying this instability and suggest that these issues raise doubts about inferring cognitive selection pressures from behavioural correlates of brain size. © 2017 The Author(s).

  18. Local knowledge: Empirical Fact to Develop Community Based Disaster Risk Management Concept for Community Resilience at Mangkang Kulon Village, Semarang City

    NASA Astrophysics Data System (ADS)

    Kapiarsa, A. B.; Sariffuddin, S.

    2018-02-01

    Local knowledge in disaster management should not be neglected when developing community resilience. The circular relation between humans and their living habitat, together with community social relations, has developed local knowledge of three kinds: specialized knowledge, shared knowledge, and common knowledge. Its correlation with community-based disaster management has become an important topic of discussion, especially around the question: can local knowledge underlie the development of a community-based disaster risk management (CBDRM) concept? To answer this question, this research used a mixed-methods approach. Interviews and cross-tabulation of responses from 73 respondents, at a 90% confidence level, were used to determine the correlation between local knowledge and community characteristics. This research found that shared knowledge dominated community local knowledge (77%), while common knowledge and specialized knowledge accounted for 8% and 15%, respectively. The high share of shared knowledge (77%) indicated that local knowledge operated at the household level and was not yet evident at the community level. Shared knowledge was found in all 3 phases of community resilience in dealing with disaster, namely the mitigation, emergency response, and recovery phases. This research has therefore opened a new scientific discussion on the self-help and community-help concepts in CBDRM development in Indonesia.

  19. Empirical prediction of peak pressure levels in anthropogenic impulsive noise. Part I: Airgun arrays signals.

    PubMed

    Galindo-Romero, Marta; Lippert, Tristan; Gavrilov, Alexander

    2015-12-01

    This paper presents an empirical linear equation to predict peak pressure level of anthropogenic impulsive signals based on its correlation with the sound exposure level. The regression coefficients are shown to be weakly dependent on the environmental characteristics but governed by the source type and parameters. The equation can be applied to values of the sound exposure level predicted with a numerical model, which provides a significant improvement in the prediction of the peak pressure level. Part I presents the analysis for airgun arrays signals, and Part II considers the application of the empirical equation to offshore impact piling noise.
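    The form of the relation described above, a straight line from sound exposure level to peak pressure level, can be sketched with ordinary least squares. The data and coefficients below are synthetic, not the paper's fitted values.

```python
def linear_fit(x, y):
    """Ordinary least-squares slope and intercept for y ~ a*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return a, my - a * mx

sel = [150.0, 155.0, 160.0, 165.0]   # sound exposure level, dB (synthetic)
peak = [s + 12.0 for s in sel]       # peak pressure level, dB (synthetic)
a, b = linear_fit(sel, peak)         # recovers slope 1.0, intercept 12.0
```

    The paper's point is that the slope and intercept are governed mainly by the source type, so a fit of this form can be applied to numerically predicted sound exposure levels.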

  20. Developmental Associations between Short-Term Variability and Long-Term Changes: Intraindividual Correlation of Positive and Negative Affect in Daily Life and Cognitive Aging

    ERIC Educational Resources Information Center

    Hülür, Gizem; Hoppmann, Christiane A.; Ram, Nilam; Gerstorf, Denis

    2015-01-01

    Conceptual notions and empirical evidence suggest that the intraindividual correlation (iCorr) of positive affect (PA) and negative affect (NA) is a meaningful characteristic of affective functioning. PA and NA are typically negatively correlated within-person. Previous research has found that the iCorr of PA and NA is relatively stable over time…

  1. Limits of the memory coefficient in measuring correlated bursts

    NASA Astrophysics Data System (ADS)

    Jo, Hang-Hyun; Hiraoka, Takayuki

    2018-03-01

    Temporal inhomogeneities in event sequences of natural and social phenomena have been characterized in terms of interevent times and correlations between interevent times. The inhomogeneities of interevent times have been extensively studied, while the correlations between interevent times, often called correlated bursts, are far from being fully understood. For measuring the correlated bursts, two relevant approaches were suggested, i.e., memory coefficient and burst size distribution. Here a burst size denotes the number of events in a bursty train detected for a given time window. Empirical analyses have revealed that the larger memory coefficient tends to be associated with the heavier tail of the burst size distribution. In particular, empirical findings in human activities appear inconsistent, such that the memory coefficient is close to 0, while burst size distributions follow a power law. In order to comprehend these observations, by assuming the conditional independence between consecutive interevent times, we derive the analytical form of the memory coefficient as a function of parameters describing interevent time and burst size distributions. Our analytical result can explain the general tendency of the larger memory coefficient being associated with the heavier tail of burst size distribution. We also find that the apparently inconsistent observations in human activities are compatible with each other, indicating that the memory coefficient has limits to measure the correlated bursts.
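    The memory coefficient discussed above is, in its usual definition (due to Goh and Barabási), the Pearson correlation between consecutive interevent times; a minimal sketch with deterministic example sequences:

```python
import math

def memory_coefficient(tau):
    """Memory coefficient M: Pearson correlation between the sequences
    (tau_1..tau_{n-1}) and (tau_2..tau_n) of consecutive interevent times."""
    a, b = tau[:-1], tau[1:]
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    sa = math.sqrt(sum((x - ma) ** 2 for x in a) / n)
    sb = math.sqrt(sum((x - mb) ** 2 for x in b) / n)
    return sum((x - ma) * (y - mb) for x, y in zip(a, b)) / (n * sa * sb)

m_pos = memory_coefficient(list(range(1, 11)))  # steadily growing gaps -> M = 1
m_neg = memory_coefficient([1, 9] * 5)          # alternating gaps -> M = -1
```

    Because M only sees consecutive pairs, a sequence can have M near 0 yet still produce heavy-tailed burst sizes, which is the apparent inconsistency the paper resolves.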

  2. Empirical analysis of web-based user-object bipartite networks

    NASA Astrophysics Data System (ADS)

    Shang, Ming-Sheng; Lü, Linyuan; Zhang, Yi-Cheng; Zhou, Tao

    2010-05-01

    Understanding the structure and evolution of web-based user-object networks is a significant task, since they play a crucial role in e-commerce nowadays. This letter reports an empirical analysis of two large-scale web sites, audioscrobbler.com and del.icio.us, where users are connected with music groups and bookmarks, respectively. The degree distributions and degree-degree correlations for both users and objects are reported. We propose a new index, named collaborative similarity, to quantify the diversity of tastes based on collaborative selection. Accordingly, the correlation between degree and selection diversity is investigated. We report some novel phenomena that well characterize the selection mechanism of web users, and outline the relevance of these phenomena to the information recommendation problem.

  3. Sexual orientation beliefs: their relationship to anti-gay attitudes and biological determinist arguments.

    PubMed

    Hegarty, P; Pratto, F

    2001-01-01

    Previous studies which have measured beliefs about sexual orientation with either a single item, or a one-dimensional scale are discussed. In the present study beliefs were observed to vary along two dimensions: the "immutability" of sexual orientation and the "fundamentality" of a categorization of persons as heterosexuals and homosexuals. While conceptually related, these two dimensions were empirically distinct on several counts. They were negatively correlated with each other. Condemning attitudes toward lesbians and gay men were correlated positively with fundamentality but negatively with immutability. Immutability, but not fundamentality, affected the assimilation of a biological determinist argument. The relationship between sexual orientation beliefs and anti-gay prejudice is discussed and suggestions for empirical studies of sexual orientation beliefs are presented.

  4. Phi Index: A New Metric to Test the Flush Early and Avoid the Rush Hypothesis

    PubMed Central

    Samia, Diogo S. M.; Blumstein, Daniel T.

    2014-01-01

    Optimal escape theory states that animals should counterbalance the costs and benefits of flight when escaping from a potential predator. However, in apparent contradiction with this well-established optimality model, birds and mammals generally initiate escape soon after beginning to monitor an approaching threat, a phenomenon codified as the “Flush Early and Avoid the Rush” (FEAR) hypothesis. Typically, the FEAR hypothesis is tested using correlational statistics and is supported when there is a strong relationship between the distance at which an individual first responds behaviorally to an approaching predator (alert distance, AD) and its flight initiation distance (the distance at which it flees the approaching predator, FID). However, such correlational statistics are inadequate to analyze relationships constrained by an envelope (such as the AD-FID relationship) and are sensitive to outliers with high leverage, which can lead one to erroneous conclusions. To overcome these statistical concerns we develop the phi index (Φ), a distribution-free metric to evaluate the goodness of fit of a 1:1 relationship in a constraint envelope (the prediction of the FEAR hypothesis). Using both simulation and empirical data, we conclude that Φ is superior to traditional correlational analyses because it explicitly tests the FEAR prediction, is robust to outliers, and controls for the disproportionate influence of observations from large predictor values (caused by the constrained envelope in the AD-FID relationship). Importantly, by analyzing the empirical data we corroborate the strong effect that alertness has on flight, as stated by the FEAR hypothesis. PMID:25405872

  5. Phi index: a new metric to test the flush early and avoid the rush hypothesis.

    PubMed

    Samia, Diogo S M; Blumstein, Daniel T

    2014-01-01

    Optimal escape theory states that animals should counterbalance the costs and benefits of flight when escaping from a potential predator. However, in apparent contradiction with this well-established optimality model, birds and mammals generally initiate escape soon after beginning to monitor an approaching threat, a phenomenon codified as the "Flush Early and Avoid the Rush" (FEAR) hypothesis. Typically, the FEAR hypothesis is tested using correlational statistics and is supported when there is a strong relationship between the distance at which an individual first responds behaviorally to an approaching predator (alert distance, AD) and its flight initiation distance (the distance at which it flees the approaching predator, FID). However, such correlational statistics are inadequate to analyze relationships constrained by an envelope (such as the AD-FID relationship) and are sensitive to outliers with high leverage, which can lead one to erroneous conclusions. To overcome these statistical concerns we develop the phi index (Φ), a distribution-free metric to evaluate the goodness of fit of a 1:1 relationship in a constraint envelope (the prediction of the FEAR hypothesis). Using both simulation and empirical data, we conclude that Φ is superior to traditional correlational analyses because it explicitly tests the FEAR prediction, is robust to outliers, and controls for the disproportionate influence of observations from large predictor values (caused by the constrained envelope in the AD-FID relationship). Importantly, by analyzing the empirical data we corroborate the strong effect that alertness has on flight, as stated by the FEAR hypothesis.

  6. Research on the factors of return on equity: empirical analysis in Chinese port industries from 2000-2008

    NASA Astrophysics Data System (ADS)

    Li, Wei

    2012-01-01

    Port industries are basic industries in the national economy and have become among the most modernized sectors in every country. The development of the port industry is advantageous not only for promoting the optimal allocation of social resources, but also for promoting the growth of foreign trade volume by enhancing transportation functions. Return on equity (ROE) is a direct indicator of the maximization of a company's wealth and makes up for the shortcomings of earnings per share (EPS). The aim of this paper is to examine the correlation between ROE and other financial indicators by choosing listed port companies as the research objects and using their data from 2000 to 2008 as the empirical sample, with statistical analysis of charted figures and coefficients. The analysis method used in the paper combines trend analysis, comparative analysis, and ratio-based factor analysis. The paper analyzes and compares these factors and draws the following conclusions. Firstly, ROE is positively correlated with total assets turnover, main profit margin, and fixed asset ratio, and negatively correlated with the assets-liabilities ratio, total assets growth rate, and DOL. Secondly, main profit margin has the greatest positive effect on ROE among these factors; the second greatest is total assets turnover, which shows that operating capacity is an important indicator after profitability. Thirdly, the assets-liabilities ratio has the greatest negative effect on ROE among these factors.

  7. Research on the factors of return on equity: empirical analysis in Chinese port industries from 2000-2008

    NASA Astrophysics Data System (ADS)

    Li, Wei

    2011-12-01

    Port industries are basic industries in the national economy and have become among the most modernized sectors in every country. The development of the port industry is advantageous not only for promoting the optimal allocation of social resources, but also for promoting the growth of foreign trade volume by enhancing transportation functions. Return on equity (ROE) is a direct indicator of the maximization of a company's wealth and makes up for the shortcomings of earnings per share (EPS). The aim of this paper is to examine the correlation between ROE and other financial indicators by choosing listed port companies as the research objects and using their data from 2000 to 2008 as the empirical sample, with statistical analysis of charted figures and coefficients. The analysis method used in the paper combines trend analysis, comparative analysis, and ratio-based factor analysis. The paper analyzes and compares these factors and draws the following conclusions. Firstly, ROE is positively correlated with total assets turnover, main profit margin, and fixed asset ratio, and negatively correlated with the assets-liabilities ratio, total assets growth rate, and DOL. Secondly, main profit margin has the greatest positive effect on ROE among these factors; the second greatest is total assets turnover, which shows that operating capacity is an important indicator after profitability. Thirdly, the assets-liabilities ratio has the greatest negative effect on ROE among these factors.

  8. A meta-analysis of factors affecting trust in human-robot interaction.

    PubMed

    Hancock, Peter A; Billings, Deborah R; Schaefer, Kristin E; Chen, Jessie Y C; de Visser, Ewart J; Parasuraman, Raja

    2011-10-01

    We evaluate and quantify the effects of human, robot, and environmental factors on perceived trust in human-robot interaction (HRI). To date, reviews of trust in HRI have been qualitative or descriptive. Our quantitative review provides a fundamental empirical foundation to advance both theory and practice. Meta-analytic methods were applied to the available literature on trust and HRI. A total of 29 empirical studies were collected, of which 10 met the selection criteria for correlational analysis and 11 for experimental analysis. These studies provided 69 correlational and 47 experimental effect sizes. The overall correlational effect size for trust was r = +0.26, with an experimental effect size of d = +0.71. The effects of human, robot, and environmental characteristics were examined, with particular attention to the robot dimensions of performance- and attribute-based factors. Robot performance and attributes were the largest contributors to the development of trust in HRI: factors related to the robot itself, specifically its performance, had the greatest association with trust, environmental factors were only moderately associated, and there was little evidence for effects of human-related factors. The findings provide quantitative estimates of the human, robot, and environmental factors influencing HRI trust. Specifically, the current summary provides effect size estimates that are useful in establishing design and training guidelines with reference to robot-related factors of HRI trust. Furthermore, the results indicate that improper trust calibration may be mitigated by the manipulation of robot design. However, many future research needs are identified.
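    Pooling correlational effect sizes of the kind reported above is commonly done via Fisher's z transform; the sketch below is the generic textbook fixed-effect procedure with hypothetical inputs, not necessarily the authors' exact weighting scheme.

```python
import math

def pooled_correlation(rs, ns):
    """Fixed-effect pooled correlation: average the Fisher z-transforms
    of the study correlations, weighted by n - 3, then back-transform."""
    zs = [0.5 * math.log((1 + r) / (1 - r)) for r in rs]
    ws = [n - 3 for n in ns]
    zbar = sum(w * z for w, z in zip(ws, zs)) / sum(ws)
    return (math.exp(2 * zbar) - 1) / (math.exp(2 * zbar) + 1)

# Three hypothetical study correlations with their sample sizes
rbar = pooled_correlation([0.20, 0.30, 0.28], [50, 80, 120])
```

    The z transform stabilizes the variance of r, so larger studies (larger n - 3) appropriately dominate the pooled estimate.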

  9. The Nature of Procrastination: A Meta-Analytic and Theoretical Review of Quintessential Self-Regulatory Failure

    ERIC Educational Resources Information Center

    Steel, Piers

    2007-01-01

    Procrastination is a prevalent and pernicious form of self-regulatory failure that is not entirely understood. Hence, the relevant conceptual, theoretical, and empirical work is reviewed, drawing upon correlational, experimental, and qualitative findings. A meta-analysis of procrastination's possible causes and effects, based on 691 correlations,…

  10. Introducing Scale Analysis by Way of a Pendulum

    ERIC Educational Resources Information Center

    Lira, Ignacio

    2007-01-01

    Empirical correlations are a practical means of providing approximate answers to problems in physics whose exact solution is otherwise difficult to obtain. The correlations relate quantities that are deemed to be important in the physical situation to which they apply, and can be derived from experimental data by means of dimensional and/or scale…

  11. Large-Scale Studies on the Transferability of General Problem-Solving Skills and the Pedagogic Potential of Physics

    ERIC Educational Resources Information Center

    Mashood, K. K.; Singh, Vijay A.

    2013-01-01

    Research suggests that problem-solving skills are transferable across domains. This claim, however, needs further empirical substantiation. We suggest correlation studies as a methodology for making preliminary inferences about transfer. The correlation of the physics performance of students with their performance in chemistry and mathematics in…

  12. Correlates of the MMPI-2-RF in a College Setting

    ERIC Educational Resources Information Center

    Forbey, Johnathan D.; Lee, Tayla T. C.; Handel, Richard W.

    2010-01-01

    The current study examined empirical correlates of scores on Minnesota Multiphasic Personality Inventory-2-Restructured Form (MMPI-2-RF; A. Tellegen & Y. S. Ben-Porath, 2008; Y. S. Ben-Porath & A. Tellegen, 2008) scales in a college setting. The MMPI-2-RF and six criterion measures (assessing anger, assertiveness, sex roles, cognitive…

  13. Using beta coefficients to impute missing correlations in meta-analysis research: Reasons for caution.

    PubMed

    Roth, Philip L; Le, Huy; Oh, In-Sue; Van Iddekinge, Chad H; Bobko, Philip

    2018-06-01

    Meta-analysis has become a well-accepted method for synthesizing empirical research about a given phenomenon. Many meta-analyses focus on synthesizing correlations across primary studies, but some primary studies do not report correlations. Peterson and Brown (2005) suggested that researchers could use standardized regression weights (i.e., beta coefficients) to impute missing correlations. Indeed, their beta estimation procedures (BEPs) have been used in meta-analyses in a wide variety of fields. In this study, the authors evaluated the accuracy of BEPs in meta-analysis. We first examined how use of BEPs might affect results from a published meta-analysis. We then developed a series of Monte Carlo simulations that systematically compared the use of existing correlations (that were not missing) to data sets that incorporated BEPs (that impute missing correlations from corresponding beta coefficients). These simulations estimated ρ̄ (mean population correlation) and SDρ (true standard deviation) across a variety of meta-analytic conditions. Results from both the existing meta-analysis and the Monte Carlo simulations revealed that BEPs were associated with potentially large biases when estimating ρ̄ and even larger biases when estimating SDρ. Using only existing correlations often substantially outperformed use of BEPs and virtually never performed worse than BEPs. Overall, the authors urge a return to the standard practice of using only existing correlations in meta-analysis. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
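    The core problem with BEPs can be seen even without simulation: in a two-predictor regression, the standardized beta of x1 differs from the correlation r(y, x1) whenever the predictors are correlated. A minimal sketch using a standard OLS identity and hypothetical population correlations:

```python
def beta_from_correlations(r_y1, r_y2, r_12):
    """Standardized regression weight of x1 in y ~ x1 + x2, expressed in
    terms of the population correlations (standard OLS identity)."""
    return (r_y1 - r_y2 * r_12) / (1 - r_12 ** 2)

r_y1, r_y2, r_12 = 0.5, 0.5, 0.5
beta1 = beta_from_correlations(r_y1, r_y2, r_12)  # 1/3, not the true r of 0.5
bias = beta1 - r_y1                               # negative: beta understates r
```

    Imputing r(y, x1) by beta1 here would understate the correlation by a third, which is the kind of bias the Monte Carlo simulations in the paper quantify across conditions.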

  14. Statistical power as a function of Cronbach alpha of instrument questionnaire items.

    PubMed

    Heo, Moonseong; Kim, Namhee; Faith, Myles S

    2015-10-14

    In countless clinical trials, outcome measurements rely on instrument questionnaire items, which often suffer from measurement error problems that in turn affect the statistical power of study designs. The Cronbach alpha, or coefficient alpha, here denoted by C(α), can be used as a measure of internal consistency of parallel instrument items that are developed to measure a target unidimensional outcome construct. The scale score for the target construct is often represented by the sum of the item scores. However, power functions based on C(α) have been lacking for various study designs. We formulate a statistical model for parallel items to derive power functions as a function of C(α) under several study designs. To this end, we assume a fixed true score variance, as opposed to the usual fixed total variance assumption. That assumption is critical and practically relevant to show that smaller measurement errors are associated with higher inter-item correlations, and thus that greater C(α) is associated with greater statistical power. We compare the derived theoretical statistical power with empirical power obtained through Monte Carlo simulations for the following comparisons: one-sample comparison of pre- and post-treatment mean differences, two-sample comparison of pre-post mean differences between groups, and two-sample comparison of mean differences between groups. It is shown that C(α) is the same as a test-retest correlation of the scale scores of parallel items, which enables testing the significance of C(α). Closed-form power functions and sample size determination formulas are derived in terms of C(α) for all of the aforementioned comparisons. Power functions are shown to be an increasing function of C(α), regardless of the comparison of interest. The derived power functions are well validated by simulation studies, which show that the magnitudes of theoretical power are virtually identical to those of the empirical power. Regardless of research design or setting, in order to increase statistical power, the development and use of instruments with greater C(α), or equivalently with greater inter-item correlations, is crucial for trials that intend to use questionnaire items to measure research outcomes. Further development of the power functions for binary or ordinal item scores and under more general item correlation structures reflecting more real-world situations would be a valuable future study.
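    To make C(α) concrete, the sketch below (synthetic data, not the authors' code) computes coefficient alpha from an item-score matrix. For parallel items with unit true-score variance and unit error variance, the inter-item correlation is ρ = 0.5, so alpha should approach kρ/(1 + (k−1)ρ) = 0.8 for k = 4 items:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Coefficient alpha for an (n_subjects, k_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of the scale score
    return k / (k - 1) * (1 - item_vars / total_var)

# Parallel items: a common true score plus independent measurement error.
rng = np.random.default_rng(1)
n, k, err_sd = 5000, 4, 1.0
true = rng.standard_normal(n)
items = true[:, None] + err_sd * rng.standard_normal((n, k))

print(f"alpha = {cronbach_alpha(items):.3f}")  # near 0.8
```

Shrinking `err_sd` raises the inter-item correlations and hence alpha, which is exactly the mechanism the abstract links to increased statistical power.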

  15. VS30 – A site-characterization parameter for use in building Codes, simplified earthquake resistant design, GMPEs, and ShakeMaps

    USGS Publications Warehouse

    Borcherdt, Roger D.

    2012-01-01

    VS30, defined as the average seismic shear-wave velocity from the surface to a depth of 30 meters, has found widespread use as a parameter to characterize site response for simplified earthquake resistant design as implemented in building codes worldwide. VS30, as initially introduced by the author for the US 1994 NEHRP Building Code, provides unambiguous definitions of site classes and site coefficients for site-dependent response spectra based on correlations derived from extensive borehole logging and comparative ground-motion measurement programs in California. Subsequent use of VS30 for development of strong ground motion prediction equations (GMPEs) and measurement of extensive sets of VS borehole data have confirmed the previous empirical correlations and established correlations of VS30 with VSZ at other depths. These correlations provide closed-form expressions to predict VS30 at a large number of additional sites and further justify VS30 as a parameter to characterize site response for simplified building codes, GMPEs, ShakeMap, and seismic hazard mapping.
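    The VS30 definition reduces to a travel-time average over the top 30 m, VS30 = 30 / Σ(hᵢ/vᵢ). A minimal sketch (the two-layer profile is hypothetical, not from the report):

```python
def vs30(thicknesses_m, velocities_mps):
    """Travel-time-averaged shear-wave velocity over the top 30 m.

    VS30 = 30 / sum(h_i / v_i), with the profile truncated at 30 m depth.
    """
    depth, travel_time = 0.0, 0.0
    for h, v in zip(thicknesses_m, velocities_mps):
        h = min(h, 30.0 - depth)   # clip the deepest layer at 30 m
        travel_time += h / v
        depth += h
        if depth >= 30.0:
            break
    if depth < 30.0:
        raise ValueError("velocity profile shallower than 30 m")
    return 30.0 / travel_time

# Hypothetical site: 10 m of soft soil (200 m/s) over stiffer material (500 m/s).
print(round(vs30([10, 25], [200, 500]), 1))  # 333.3
```

Note the travel-time average weights slow near-surface layers heavily, which is why it discriminates site response better than a simple thickness-weighted mean.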

  16. A Semi-Empirical Model for Forecasting Relativistic Electrons at Geostationary Orbit

    NASA Technical Reports Server (NTRS)

    Lyatsky, Wladislaw; Khazanov, George V.

    2008-01-01

    We developed a new prediction model for forecasting relativistic (>2 MeV) electrons that provides a very high correlation between predicted and actually measured electron fluxes at geostationary orbit. The model implies multi-step particle acceleration and is based on numerically integrating two linked continuity equations for primarily accelerated particles and relativistic electrons. The model includes a source and losses, and uses solar wind data as its only input parameters. As the source, we used a coupling function that is a best-fit combination of the solar wind/Interplanetary Magnetic Field parameters responsible for the generation of geomagnetic activity. The loss function was derived from experimental data. We tested the model for the four-year period 2004-2007. The correlation coefficient between predicted and actual electron fluxes, for the whole four-year period as well as for each of these years, is about 0.9. This high and stable correlation between the computed and actual electron fluxes shows that reliable forecasting of these electrons at geostationary orbit is possible.

  17. Study of liquid and vapor flow into a Centaur capillary device

    NASA Technical Reports Server (NTRS)

    Blatt, M. H.; Risberg, J. A.

    1979-01-01

    The following areas of liquid and vapor flow were analyzed and experimentally evaluated: 1) the refilling of capillary devices with settled liquid, and 2) vapor flow across wetted screens. These investigations resulted in: 1) the development of a versatile computer program that was successfully correlated with test data and used to predict Centaur D-1S LO2 and LH2 start basket refilling; and 2) the development of a semi-empirical model that was only partially correlated with data due to difficulties in obtaining repeatable test results. A comparison was also made to determine the best propellant management system for the Centaur D-1S vehicle. The comparison identified the baseline Centaur D-1S system (using pressurization, boost pumps, and propellant settling) as the best candidate based on payload weight penalty. However, other comparison criteria and advanced mission conditions were identified for which pressure-fed systems, thermally subcooled boost pumps, and capillary devices would be attractive alternatives.

  18. Investigation of short cavity CRDS noise terms by optical correlation

    NASA Astrophysics Data System (ADS)

    Griffin, Steven T.; Fathi, Jason

    2013-05-01

    Cavity Ring Down Spectroscopy (CRDS) has been identified as having significant potential for Department of Defense security and sensing applications. Significant factors in the development of new sensor architectures are portability, robustness, and economy; among these, cavity length is of particular importance. Prior publications have examined the role of cavity length in the sensing modality, both from the standpoint of the system's design and the identification of potential difficulties presented by novel approaches. Two difficulties of interest here are noise terms that prior publications have designated turbulence-like and speckle-like, and for which theoretical and some empirical data were presented. This presentation addresses the automation of the experimental apparatus, new data analysis, and implications regarding the significance of the two noise terms. This is accomplished through Analog-to-Digital Conversion (ADC) of the output of a custom-designed optical correlator. Details of this unique application of the developed instrument and implications for short cavity (portable) CRDS applications are presented.

  19. Model study of greenline dayglow emission under geomagnetic storm conditions.

    NASA Astrophysics Data System (ADS)

    Singh, V.; Bag, T.; Sunil Krishna, M. V.

    2016-12-01

    A comprehensive model is developed to study the influence of geomagnetic storms on greenline (557.7 nm) dayglow emission in the thermosphere under solar active and solar quiet conditions. The study is based on a photochemical model developed using the latest reaction rate coefficients, quantum yields, and collisional cross-sections obtained from experimental observations and empirical models, and is carried out for the low-latitude station Tirunelveli (8.7°N, 77.8°E), India. The volume emission rate (VER) has been calculated using densities and temperatures from the NRLMSISE-00 and IRI-2012 models. The modeled VER shows a positive correlation with the Dst index and a negative correlation with the number densities of O, O2, and N2. The VER calculated at the peak emission altitude shows depletion during the main phase of the storm. The peak emission altitude does not show any appreciable variation during the storm period; it does, however, move upward with increasing F10.7 solar index.

  20. Causes of coal-miner absenteeism. Information Circular/1987

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peters, R.H.; Randolph, R.F.

    The Bureau of Mines report describes several significant problems associated with absenteeism among underground coal miners. The vast empirical literature on employee absenteeism is reviewed, and a conceptual model of the factors that cause absenteeism among miners is presented. Portions of the model were empirically tested by performing correlational and multiple regression analyses on data collected from a group of 64 underground coal miners. The results of these tests are presented and discussed.

  1. GPS-Derived Precipitable Water Compared with the Air Force Weather Agency’s MM5 Model Output

    DTIC Science & Technology

    2002-03-26

    and less than 100 sensors are available throughout Europe. While the receiver density is currently comparable to the upper-air sounding network...profiles from 38 upper air sites throughout Europe. Based on these empirical formulae and simplifications, Bevis (1992) has determined that the error...Alaska using Bevis' (1992) empirical correlation based on 8718 radiosonde calculations over 2 years. Other studies have been conducted in Europe and

  2. Perceived sexual harassment at work: meta-analysis and structural model of antecedents and consequences.

    PubMed

    Topa Cantisano, Gabriela; Morales Domínguez, J F; Depolo, Marco

    2008-05-01

    Although sexual harassment has been extensively studied, empirical research has not led to firm conclusions about its antecedents and consequences, both at the personal and organizational level. An extensive literature search yielded 42 empirical studies with 60 samples. The matrix correlation obtained through meta-analytic techniques was used to test a structural equation model. Results supported the hypotheses regarding organizational environmental factors as main predictors of harassment.

  3. Population priorities: the challenge of continued rapid population growth.

    PubMed

    Turner, Adair

    2009-10-27

    Rapid population growth continues in the least developed countries. The revisionist case (that rapid population growth could be overcome by technology, that population density was advantageous, that capital shallowing is not a vital concern, and that empirical investigations had not proved a correlation between high population growth and low per capita income) was both empirically and theoretically flawed. In the modern world, population density does not play the role it did in nineteenth-century Europe, rates of growth in some of today's least developed nations are four times those of nineteenth-century Europe, and without major accumulation of capital per capita no major economy has made, or is likely to make, the low- to middle-income transition. Though not sufficient, capital accumulation is absolutely essential to economic growth. While there are good reasons for objecting to the enforced nature of the Chinese one-child policy, we should not underestimate the positive impact which that policy has almost certainly had, and will have over the next several decades, on Chinese economic performance. And a valid reticence about telling developing countries that they must contain fertility should not lead us to underestimate the severely adverse impact of high fertility rates on the economic performance and prospects of many countries in Africa and the Middle East.

  4. Do foreign exchange and equity markets co-move in Latin American region? Detrended cross-correlation approach

    NASA Astrophysics Data System (ADS)

    Bashir, Usman; Yu, Yugang; Hussain, Muntazir; Zebende, Gilney F.

    2016-11-01

    This paper investigates the dynamics of the relationship between foreign exchange markets and stock markets through time-varying co-movements, analyzing monthly time series for Latin American countries over the period from 1991 to 2015. We apply Granger causality to verify the direction of causality between the foreign exchange and stock markets, and the detrended cross-correlation coefficient (ρDCCA) to detect co-movements at different time scales. Our empirical results suggest a positive cross-correlation between exchange rates and stock prices for all Latin American countries. The findings reveal two clear patterns of correlation. First, Brazil and Argentina show positive correlation over both short and long time frames. Second, the remaining countries are negatively correlated at shorter time scales, gradually moving to positive. This paper contributes to the field in three ways. First, we verify the co-movements of exchange rates and stock prices, which were rarely discussed in previous empirical studies. Second, the ρDCCA coefficient is a robust and powerful methodology for measuring cross-correlation when dealing with non-stationary time series. Third, most studies have employed one or two time scales using co-integration and vector autoregressive approaches, and little is known about co-movements between foreign exchange and stock markets at varying time scales; the ρDCCA coefficient facilitates this understanding.
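    A minimal sketch of the ρDCCA coefficient (the ratio of the detrended covariance to the geometric mean of the two detrended variances), applied to synthetic correlated series rather than the paper's market data:

```python
import numpy as np

def rho_dcca(x, y, s):
    """Detrended cross-correlation coefficient at box size s."""
    # Integrate the demeaned series (profiles).
    X, Y = np.cumsum(x - np.mean(x)), np.cumsum(y - np.mean(y))
    n_boxes = len(X) // s
    t = np.arange(s)
    f2_xy = f2_xx = f2_yy = 0.0
    for i in range(n_boxes):
        xs, ys = X[i*s:(i+1)*s], Y[i*s:(i+1)*s]
        # Residuals around a linear trend fitted within each box.
        rx = xs - np.polyval(np.polyfit(t, xs, 1), t)
        ry = ys - np.polyval(np.polyfit(t, ys, 1), t)
        f2_xy += np.mean(rx * ry)
        f2_xx += np.mean(rx * rx)
        f2_yy += np.mean(ry * ry)
    return f2_xy / np.sqrt(f2_xx * f2_yy)

# Two noisy series sharing a common driver (population correlation 0.8).
rng = np.random.default_rng(2)
common = rng.standard_normal(4000)
a = common + 0.5 * rng.standard_normal(4000)
b = common + 0.5 * rng.standard_normal(4000)

print(f"rho_DCCA at s=32: {rho_dcca(a, b, 32):.2f}")  # strongly positive
```

Computing `rho_dcca` over a range of `s` values reproduces the paper's idea of correlation varying with time scale; for i.i.d. series it stays near the underlying correlation at all scales.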

  5. Improvement of the Correlative AFM and ToF-SIMS Approach Using an Empirical Sputter Model for 3D Chemical Characterization.

    PubMed

    Terlier, T; Lee, J; Lee, K; Lee, Y

    2018-02-06

    Technological progress has spurred the development of increasingly sophisticated analytical devices. The full characterization of structures in terms of sample volume and composition is now highly complex. Here, a highly improved solution for 3D characterization of samples, based on an advanced method for 3D data correction, is proposed. Traditionally, secondary ion mass spectrometry (SIMS) provides the chemical distribution of sample surfaces. Combining successive sputtering with 2D surface projections enables a 3D volume rendering to be generated. However, surface topography can distort the volume rendering by necessitating the projection of a nonflat surface onto a planar image. Moreover, the sputtering is highly dependent on the probed material. Local variation of composition affects the sputter yield and the beam-induced roughness, which in turn alters the 3D render. To circumvent these drawbacks, the correlation of atomic force microscopy (AFM) with SIMS has been proposed in previous studies as a solution for the 3D chemical characterization. To extend the applicability of this approach, we have developed a methodology using AFM-time-of-flight (ToF)-SIMS combined with an empirical sputter model, "dynamic-model-based volume correction", to universally correct 3D structures. First, the simulation of 3D structures highlighted the great advantages of this new approach compared with classical methods. Then, we explored the applicability of this new correction to two types of samples, a patterned metallic multilayer and a diblock copolymer film presenting surface asperities. In both cases, the dynamic-model-based volume correction produced an accurate 3D reconstruction of the sample volume and composition. The combination of AFM-SIMS with the dynamic-model-based volume correction improves the understanding of the surface characteristics. 
Beyond the useful 3D chemical information provided by dynamic-model-based volume correction, the approach permits us to enhance the correlation of chemical information from spectroscopic techniques with the physical properties obtained by AFM.

  6. Development and validation of the Spanish-English Language Proficiency Scale (SELPS).

    PubMed

    Smyk, Ekaterina; Restrepo, M Adelaida; Gorin, Joanna S; Gray, Shelley

    2013-07-01

    This study examined the development and validation of a criterion-referenced Spanish-English Language Proficiency Scale (SELPS) that was designed to assess the oral language skills of sequential bilingual children ages 4-8. This article reports results for the English proficiency portion of the scale. The SELPS assesses syntactic complexity, grammatical accuracy, verbal fluency, and lexical diversity based on 2 story retell tasks. In Study 1, 40 children were given 2 story retell tasks to evaluate the reliability of parallel forms. In Study 2, 76 children participated in the validation of the scale against language sample measures and teacher ratings of language proficiency. Study 1 indicated no significant differences between the SELPS scores on the 2 stories. Study 2 indicated that the SELPS scores correlated significantly with their counterpart language sample measures. Correlations between the SELPS and teacher ratings were moderate. The 2 story retells elicited comparable SELPS scores, providing a valuable tool for test-retest conditions in the assessment of language proficiency. Correlations between the SELPS scores and external variables indicated that these measures assessed the same language skills. Results provided empirical evidence regarding the validity of inferences about language proficiency based on the SELPS score.

  7. Methodological considerations regarding response bias effect in substance use research: is correlation between the measured variables sufficient?

    PubMed Central

    2011-01-01

    Efforts for drug-free sport include developing a better understanding of the behavioural determinants that underlie doping, with an increased interest in developing anti-doping prevention and intervention programmes. Empirical testing of both is dominated by self-report questionnaires, the most widely used method in psychological assessments and sociology polls. Disturbingly, the potentially distorting effect of socially desirable responding (SD) is seldom considered in doping research, or is dismissed based on weak correlation between some SD measure and the variables of interest. The aim of this report is to draw attention to i) the potentially distorting effect of SD and ii) the limitation of using correlation analysis between an SD measure and the individual measures. Models of doping opinion, a potentially contentious issue, were tested using the structural equation modeling (SEM) technique, with and without the SD variable, on a dataset of 278 athletes, assessing the SD effect both at i) the indicator and ii) the construct levels, as well as iii) testing SD as an independent variable affecting expressed doping opinion. Participants were categorised by their SD score into high- and low-SD groups. Based on the low correlation coefficients (<|0.22|) observed in the overall sample, the SD effect on the indicator variables could be disregarded. Regression weights between predictors and the outcome variable varied between the high- and low-SD groups, but despite the practically non-existent relationship between SD and predictors (<|0.11|) in the low-SD group, both groups independently showed improved model fit with SD. The results of this study clearly demonstrate the presence of an SD effect and the inadequacy of the commonly used pairwise correlation to assess social desirability at the model level. In the absence of direct observation of the target behaviour (i.e., doping use), evaluation of the effectiveness of future anti-doping campaigns, along with empirical testing of refined doping behavioural models, will likely continue to rely on self-reported information. Over and above controlling for the effect of socially desirable responding in research that makes inferences based on self-reported information on social cognitive and behavioural measures, it is recommended that the SD effect be appropriately assessed during data analysis. PMID:21244663

  8. What can we learn about dispersion from the conformer surface of n-pentane?

    PubMed

    Martin, Jan M L

    2013-04-11

    In earlier work [Gruzman, D. ; Karton, A.; Martin, J. M. L. J. Phys. Chem. A 2009, 113, 11974], we showed that conformer energies in alkanes (and other systems) are highly dispersion-driven and that uncorrected DFT functionals fail badly at reproducing them, while simple empirical dispersion corrections tend to overcorrect. To gain greater insight into the nature of the phenomenon, we have mapped the torsional surface of n-pentane to 10-degree resolution at the CCSD(T)-F12 level near the basis set limit. The data obtained have been decomposed by order of perturbation theory, excitation level, and same-spin vs opposite-spin character. A large number of approximate electronic structure methods have been considered, as well as several empirical dispersion corrections. Our chief conclusions are as follows: (a) the effect of dispersion is dominated by same-spin correlation (or triplet-pair correlation, from a different perspective); (b) singlet-pair correlation is important for the surface, but qualitatively very dissimilar to the dispersion component; (c) single and double excitations beyond third order are essentially unimportant for this surface; (d) connected triple excitations do play a role but are statistically very similar to the MP2 singlet-pair correlation; (e) the form of the damping function is crucial for good performance of empirical dispersion corrections; (f) at least in the lower-energy regions, SCS-MP2 and especially MP2.5 perform very well; (g) novel spin-component scaled double hybrid functionals such as DSD-PBEP86-D2 acquit themselves very well for this problem.

  9. Climate change and the collapse of the Akkadian empire: Evidence from the deep sea

    NASA Astrophysics Data System (ADS)

    Cullen, H. M.; Demenocal, P. B.; Hemming, S.; Hemming, G.; Brown, F. H.; Guilderson, T.; Sirocko, F.

    2000-04-01

    The Akkadian empire ruled Mesopotamia from the headwaters of the Tigris-Euphrates Rivers to the Persian Gulf during the late third millennium B.C. Archeological evidence has shown that this highly developed civilization collapsed abruptly near 4170 ± 150 calendar yr B.P., perhaps related to a shift to more arid conditions. Detailed paleoclimate records to test this assertion from Mesopotamia are rare, but changes in regional aridity are preserved in adjacent ocean basins. We document Holocene changes in regional aridity using mineralogic and geochemical analyses of a marine sediment core from the Gulf of Oman, which is directly downwind of Mesopotamian dust source areas and archeological sites. Our results document a very abrupt increase in eolian dust and Mesopotamian aridity, accelerator mass spectrometer radiocarbon dated to 4025 ± 125 calendar yr B.P., which persisted for ˜300 yr. Radiogenic (Nd and Sr) isotope analyses confirm that the observed increase in mineral dust was derived from Mesopotamian source areas. Geochemical correlation of volcanic ash shards between the archeological site and marine sediment record establishes a direct temporal link between Mesopotamian aridification and social collapse, implicating a sudden shift to more arid conditions as a key factor contributing to the collapse of the Akkadian empire.

  10. Fluorescence Imaging Study of Transition in Underexpanded Free Jets

    NASA Technical Reports Server (NTRS)

    Wilkes, Jennifer A.; Danehy, Paul M.; Nowak, Robert J.

    2005-01-01

    Planar laser-induced fluorescence (PLIF) is demonstrated to be a valuable tool for studying the onset of transition to turbulence. For this study, we have used PLIF of nitric oxide (NO) to image underexpanded axisymmetric free jets issuing into a low-pressure chamber through a smooth converging nozzle with a sonic orifice. Flows were studied over a range of Reynolds numbers and nozzle-exit-to-ambient pressure ratios with the aim of empirically determining criteria governing the onset of turbulence. We have developed an image processing technique, involving calculation of the standard deviation of the intensity in PLIF images, in order to aid in the identification of turbulence. We have used the resulting images to identify laminar, transitional and turbulent flow regimes. Jet scaling parameters were used to define a rescaled Reynolds number that incorporates the influence of a varying pressure ratio. An empirical correlation was found between transition length and this rescaled Reynolds number for highly underexpanded jets.
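    The standard-deviation technique described above can be sketched simply: turbulence shows up as large frame-to-frame intensity fluctuations at a pixel, while laminar flow is steady. The data below are synthetic, not the NASA PLIF images:

```python
import numpy as np

def intensity_stddev_map(frames: np.ndarray) -> np.ndarray:
    """Per-pixel standard deviation across an (n_frames, h, w) image stack.

    Laminar regions fluctuate little from frame to frame, while
    transitional and turbulent regions show large intensity variance.
    """
    return frames.std(axis=0)

# Synthetic stack: steady "laminar" left half, fluctuating "turbulent" right half.
rng = np.random.default_rng(3)
n, h, w = 50, 32, 64
frames = np.full((n, h, w), 100.0)
frames[:, :, w // 2:] += 20.0 * rng.standard_normal((n, h, w // 2))

sigma = intensity_stddev_map(frames)
print(f"laminar side mean sigma:   {sigma[:, :w // 2].mean():.1f}")
print(f"turbulent side mean sigma: {sigma[:, w // 2:].mean():.1f}")
```

Thresholding `sigma` then segments the field into laminar, transitional, and turbulent regions, which is the kind of classification the study performs by eye on its processed images.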

  11. Survey and analysis of research on supersonic drag-due-to-lift minimization with recommendations for wing design

    NASA Technical Reports Server (NTRS)

    Carlson, Harry W.; Mann, Michael J.

    1992-01-01

    A survey of research on drag-due-to-lift minimization at supersonic speeds, including a study of the effectiveness of current design and analysis methods was conducted. The results show that a linearized theory analysis with estimated attainable thrust and vortex force effects can predict with reasonable accuracy the lifting efficiency of flat wings. Significantly better wing performance can be achieved through the use of twist and camber. Although linearized theory methods tend to overestimate the amount of twist and camber required for a given application and provide an overly optimistic performance prediction, these deficiencies can be overcome by implementation of recently developed empirical corrections. Numerous examples of the correlation of experiment and theory are presented to demonstrate the applicability and limitations of linearized theory methods with and without empirical corrections. The use of an Euler code for the estimation of aerodynamic characteristics of a twisted and cambered wing and its application to design by iteration are discussed.

  12. Analytical Fuselage and Wing Weight Estimation of Transport Aircraft

    NASA Technical Reports Server (NTRS)

    Chambers, Mark C.; Ardema, Mark D.; Patron, Anthony P.; Hahn, Andrew S.; Miura, Hirokazu; Moore, Mark D.

    1996-01-01

    A method of estimating the load-bearing fuselage weight and wing weight of transport aircraft based on fundamental structural principles has been developed. This method of weight estimation represents a compromise between the rapid assessment of component weight using empirical methods based on actual weights of existing aircraft, and detailed, but time-consuming, analysis using the finite element method. The method was applied to eight existing subsonic transports for validation and correlation. Integration of the resulting computer program, PDCYL, has been made into the weights-calculating module of the AirCraft SYNThesis (ACSYNT) computer program. ACSYNT has traditionally used only empirical weight estimation methods; PDCYL adds to ACSYNT a rapid, accurate means of assessing the fuselage and wing weights of unconventional aircraft. PDCYL also allows flexibility in the choice of structural concept, as well as a direct means of determining the impact of advanced materials on structural weight. Using statistical analysis techniques, relations between the load-bearing fuselage and wing weights calculated by PDCYL and corresponding actual weights were determined.

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Constantin, Lucian A.; Fabiano, Eduardo; Della Sala, Fabio

    We introduce a novel non-local ingredient for the construction of exchange density functionals: the reduced Hartree parameter, which is invariant under the uniform scaling of the density and represents the exact exchange enhancement factor for one- and two-electron systems. The reduced Hartree parameter is used together with the conventional meta-generalized gradient approximation (meta-GGA) semilocal ingredients (i.e., the electron density, its gradient, and the kinetic energy density) to construct a new generation exchange functional, termed u-meta-GGA. This u-meta-GGA functional is exact for the exchange of any one- and two-electron systems, is size-consistent and non-empirical, satisfies the uniform density scaling relation, and recovers the modified gradient expansion derived from the semiclassical atom theory. For atoms, ions, jellium spheres, and molecules, it shows good accuracy, being often better than meta-GGA exchange functionals. Our construction validates the use of the reduced Hartree ingredient in exchange-correlation functional development, opening the way to an additional rung in the Jacob's ladder classification of non-empirical density functionals.

  14. A Comparative Study of the Empirical Relationship in Student Performance between Physics and Other STEM Subjects

    NASA Astrophysics Data System (ADS)

    Guerra, Maricela

    The Next Generation Science Standards (NGSS) advocated by the National Research Council emphasize the connections among Science, Technology, Engineering, and Mathematics (STEM) disciplines. By design, NGSS is expected to replace the previous science education standards to enhance the quality of STEM education across the nation. To support this initiative, this investigation was conducted to fill a void in the research literature by developing an empirical indicator for the relationship of student performance across STEM subjects using a large-scale database from the Trends in Mathematics and Science Study (TIMSS). In particular, an innovative approach has been taken in this study to support the canonical correlation analysis of student plausible scores between physics and other STEM subjects at different grade levels and in a cross-country context. Results from this doctoral research revealed the need to strengthen the alignment between the intended, implemented, and attained curricula to support the integration of STEM disciplines in the United States.

  15. Robust Visual Tracking Revisited: From Correlation Filter to Template Matching.

    PubMed

    Liu, Fanghui; Gong, Chen; Huang, Xiaolin; Zhou, Tao; Yang, Jie; Tao, Dacheng

    2018-06-01

    In this paper, we propose a novel matching-based tracker by investigating the relationship between template matching and the recently popular correlation filter based trackers (CFTs). In place of the correlation operation in CFTs, a sophisticated similarity metric termed mutual buddies similarity is proposed to exploit the relationship of multiple reciprocal nearest neighbors for target matching. By doing so, our tracker obtains powerful discriminative ability in distinguishing target from background, as demonstrated by both empirical and theoretical analyses. Besides, instead of utilizing a single template with the improper updating scheme of CFTs, we design a novel online template updating strategy named memory, which selects a certain amount of representative and reliable tracking results from history to construct the current stable and expressive template set. This scheme helps the proposed tracker comprehensively understand target appearance variations and recall stable past results. Both qualitative and quantitative evaluations on two benchmarks suggest that the proposed tracking method performs favorably against some recently developed CFTs and other competitive trackers.

  16. Regressed relations for forced convection heat transfer in a direct injection stratified charge rotary engine

    NASA Technical Reports Server (NTRS)

    Lee, Chi M.; Schock, Harold J.

    1988-01-01

    Currently, the heat transfer equation used in the rotary combustion engine (RCE) simulation model is taken from piston engine studies. These relations were developed empirically from experimental data obtained on piston engines, whose geometry differs considerably from that of the RCE. The objective of this work was to derive equations to estimate heat transfer coefficients in the combustion chamber of an RCE. This was accomplished by making detailed temperature and pressure measurements in a direct injection stratified charge (DISC) RCE under a range of conditions. For each specific measurement point, the local gas velocity was assumed equal to the local rotor tip speed. Local physical properties of the fluids were then calculated. Two types of correlation equations were derived and are described in this paper. The first correlation expresses the Nusselt number as a function of the Prandtl number, Reynolds number, and characteristic temperature ratio; the second correlation expresses the forced convection heat transfer coefficient as a function of fluid temperature, pressure, and velocity.
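    A power-law Nusselt correlation of the first type is typically regressed by taking logarithms, which makes the fit an ordinary linear least-squares problem. The sketch below uses synthetic data generated from hypothetical exponents, not the DISC engine measurements.

```python
import numpy as np

# Synthetic "measurements" generated from hypothetical exponents.
rng = np.random.default_rng(1)
Re = rng.uniform(1e4, 1e5, 40)            # Reynolds number
Pr = rng.uniform(0.68, 0.74, 40)          # Prandtl number
Tr = rng.uniform(1.5, 3.0, 40)            # characteristic temperature ratio
Nu = 0.023 * Re**0.8 * Pr**0.4 * Tr**-0.3 * rng.lognormal(0.0, 0.02, 40)

# log(Nu) = log(a) + b*log(Re) + c*log(Pr) + d*log(Tr): linear in the logs.
A = np.column_stack([np.ones_like(Re), np.log(Re), np.log(Pr), np.log(Tr)])
coef, *_ = np.linalg.lstsq(A, np.log(Nu), rcond=None)
a, b, c, d = np.exp(coef[0]), coef[1], coef[2], coef[3]
```

    The regression recovers the assumed Reynolds and temperature-ratio exponents; the Prandtl exponent is poorly determined here because the synthetic Pr range is narrow, which is also a practical issue for engine data.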

  17. Supersonic Jet Exhaust Noise at High Subsonic Flight Speed

    NASA Technical Reports Server (NTRS)

    Norum, Thomas D.; Garber, Donald P.; Golub, Robert A.; Santa Maria, Odilyn L.; Orme, John S.

    2004-01-01

    An empirical model to predict the effects of flight on the noise from a supersonic transport is developed. This model is based on an analysis of the exhaust jet noise from high subsonic flights of the F-15 ACTIVE Aircraft. Acoustic comparisons previously attainable only in a wind tunnel were accomplished through the control of both flight operations and exhaust nozzle exit diameter. Independent parametric variations of both flight and exhaust jet Mach numbers at given supersonic nozzle pressure ratios enabled excellent correlations to be made for both jet broadband shock noise and jet mixing noise at flight speeds up to Mach 0.8. Shock noise correlated with flight speed and emission angle through a Doppler factor exponent of about 2.6. Mixing noise at all downstream angles was found to correlate well with a jet relative velocity exponent of about 7.3, with deviations from this behavior only at supersonic eddy convection speeds and at very high flight Mach numbers. The acoustic database from the flight test is also provided.
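    Extracting a velocity exponent such as the 7.3 quoted above is, at heart, a slope fit on a log axis. The sketch below shows the mechanics on hypothetical mixing-noise levels; the constant and noise level are made up, and only the exponent matches the value reported in the abstract.

```python
import numpy as np

# Hypothetical overall sound pressure levels vs. jet relative velocity,
# generated from OASPL ~ const + 10*n*log10(V_rel) with n = 7.3.
rng = np.random.default_rng(2)
v_rel = rng.uniform(200.0, 600.0, 50)                       # m/s
oaspl = 40.0 + 10.0 * 7.3 * np.log10(v_rel) + rng.normal(0.0, 0.5, 50)

# A straight-line fit of OASPL against log10(V_rel) gives slope = 10*n.
slope, intercept = np.polyfit(np.log10(v_rel), oaspl, 1)
n_exponent = slope / 10.0
```

    Deviations from a single straight line at supersonic eddy convection speeds, as noted in the abstract, would show up as curvature in this log-linear plot.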

  18. Defense Waste Processing Facility Nitric-Glycolic Flowsheet Chemical Process Cell Chemistry: Part 2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zamecnik, J.; Edwards, T.

    The conversions of nitrite to nitrate, the destruction of glycolate, and the conversion of glycolate to formate and oxalate were modeled for the Nitric-Glycolic flowsheet using data from Chemical Process Cell (CPC) simulant runs conducted by Savannah River National Laboratory (SRNL) from 2011 to 2016. The goal of this work was to develop empirical correlation models to predict these values from measurable variables of the chemical process, so that these quantities could be predicted a priori from the sludge or simulant composition and measurable processing variables. The need for these predictions arises from the need to predict the REDuction/OXidation (REDOX) state of the glass from the Defense Waste Processing Facility (DWPF) melter. This report summarizes the work on these correlations based on the aforementioned data. Previous work on these correlations was documented in a technical report covering data from 2011-2015; this current report supersedes that previous report. Further refinement of the models as additional data are collected is recommended.

  19. A comparative study of golf industry between Yangtze River Delta, China and Central Japan

    NASA Astrophysics Data System (ADS)

    Yang, Yangfan; Jin, Pingbin; Gong, Huiwen

    2018-03-01

    As a competition event at the 2016 Olympic Games, golf has attracted great attention around the world. The Yangtze River Delta (YRD) in China already has a certain basis and qualification for developing a golf industry, but remains far from meeting the market's great potential demand. This research selects the Yangtze River Delta (YRD) and Central Japan (CJ), which are at different stages of golf development, as its objects of study. Comparative studies are carried out with the aim of revealing the discrepancies in the golf industry between the selected regions. The correlations between the golf industry and regional economic development level are explored as well. Based mainly on a geographical perspective, this research presents an initial effort to combine the approaches of comparative indexes and spatial analysis, so that the golf industries of the selected regions can be compared comprehensively. The results reveal that great gaps exist between the YRD and CJ in terms of golf course construction, service, and golf consumption. Problems in developing the golf industry in the YRD are identified based on the empirical results. The long-term deviation of golf development in the YRD from realistic demand is attributed both to government policies and to the operational principles held by market participants. Based on this comparative empirical study, suggestions for the government as well as for market players are put forward, with the aim of guiding the golf industry to develop in a sustainable way.

  20. The rise and fall of social communities: Cascades of followers triggered by innovators

    NASA Astrophysics Data System (ADS)

    Hu, Yanqing; Havlin, Shlomo; Makse, Hernan

    2013-03-01

    New scientific ideas as well as key political messages, consumer products, advertisement strategies and art trends are originally adopted by a small number of pioneers who innovate and develop the ``new ideas''. When these innovators migrate to develop the novel idea, their former social network gradually weakens its grip as followers migrate too. As a result, an internal ``cascade of followers'' starts immediately thereafter, speeding up the extinction of the entire original network. A fundamental problem in network theory is to determine the minimum number of pioneers that, upon leaving, will disintegrate their social network. Here, we first employ empirical analyses of collaboration networks of scientists to show that these communities are extremely fragile with regard to the departure of a few pioneers. This process can be mapped onto a percolation model in a correlated graph crucially augmented with outgoing ``influence links''. Analytical solutions predict phase transitions, either abrupt or continuous, where networks are disintegrated through cascades of followers as in the empirical data. The theory provides a framework to predict the vulnerability of a large class of networks containing influence links, ranging from social and infrastructure networks to financial systems and markets.

  1. A conceptual model of the trophodynamical response to river discharge in a large marine ecosystem

    NASA Astrophysics Data System (ADS)

    Skreslet, Stig

    1997-08-01

    Year-class strength in North-East Arctic cod (Gadus morhua), which inhabit the Barents Sea, and commercial landings of juveniles from this population have been positively correlated with Norwegian meltwater discharge one and three years in advance, respectively. A conceptual model, developed from empirical data, is used to investigate how the freshwater signal may be transmitted through the food web in time and space. It assumes that interannual variation in the volume of meltwater discharged during summer forces planktonic primary production in neritic fronts. The strength of this impulse is transmitted from one organismic system to another along the north Norwegian shelf, advected by Calanus finmarchicus, a herbivorous copepod. The population system of this copepod interacts with the survival and growth of juvenile NE Arctic cod, causing the cod stock size to fluctuate with the strength of the signal. By migration and advection within their respective population systems, NE Arctic cod and C. finmarchicus possibly transmit the freshwater signal on extensive time and space scales, from the Norwegian shelf to distant parts of the Arctic Mediterranean Ecosystem that contains both population systems. Continued empirical research and numerical modelling are needed to develop this theory.

  2. Empirical modelling to predict the refractive index of human blood.

    PubMed

    Yahya, M; Saghir, M Z

    2016-02-21

    Optical techniques used for the measurement of the optical properties of blood are of great interest in clinical diagnostics. Blood analysis is a routine procedure used in medical diagnostics to confirm a patient's condition. Measuring the optical properties of blood is difficult due to the non-homogeneous nature of blood itself, and there is considerable variation in the refractive indices reported in the literature. These difficulties motivated the researchers to develop a mathematical model that can predict the refractive index of human blood as a function of concentration, temperature and wavelength. The experimental measurements were conducted on hemoglobin-mimicking phantom samples using the Abbemat Refractometer. Analysis of the results revealed a linear relationship between refractive index and concentration as well as temperature, and a non-linear relationship between refractive index and wavelength. These results are in agreement with those found in the literature. In addition, a new formula was developed through empirical modelling which suggests that temperature and wavelength coefficients be added to the Barer formula. Verification of this correlation confirmed its ability to determine refractive index and/or blood hematocrit values with appropriate clinical accuracy.
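    The functional form described, linear in concentration and temperature with a non-linear wavelength term, can be fitted by least squares once a dispersion term (here a Cauchy-like 1/λ² term, an assumption of this sketch) is chosen. All coefficients below are hypothetical, not those of the published model.

```python
import numpy as np

# Synthetic phantom "measurements": linear in concentration C and temperature T,
# with an assumed Cauchy-like 1/lambda^2 dispersion term.
rng = np.random.default_rng(3)
C = rng.uniform(0.0, 30.0, 200)       # hemoglobin concentration, g/dL
T = rng.uniform(20.0, 40.0, 200)      # temperature, deg C
lam = rng.uniform(450.0, 700.0, 200)  # wavelength, nm
n = 1.333 + 1.9e-3 * C - 1.0e-4 * T + 3.1e3 / lam**2 + rng.normal(0.0, 1e-5, 200)

# Fit n = b0 + b1*C + b2*T + b3/lambda^2 by linear least squares.
A = np.column_stack([np.ones_like(C), C, T, 1.0 / lam**2])
coef, *_ = np.linalg.lstsq(A, n, rcond=None)
```

    The wavelength term is the only non-linear dependence, yet the model stays linear in its coefficients, which is why ordinary least squares suffices.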

  3. Validity of empirical models of exposure in asphalt paving

    PubMed Central

    Burstyn, I; Boffetta, P; Burr, G; Cenni, A; Knecht, U; Sciarra, G; Kromhout, H

    2002-01-01

    Aims: To investigate the validity of empirical models of exposure to bitumen fume and benzo(a)pyrene, developed for a historical cohort study of asphalt paving in Western Europe. Methods: Validity was evaluated using data from the USA, Italy, and Germany not used to develop the original models. Correlation between observed and predicted exposures was examined, and bias and precision were estimated. Results: The models were imprecise. Furthermore, predicted bitumen fume exposures tended to be lower (-70%) than concentrations found during paving in the USA. This apparent bias might be attributed to differences between Western European and USA paving practices. Evaluation of the validity of the benzo(a)pyrene exposure model revealed an effect of re-paving similar to that expected and a larger than expected effect of tar use. Overall, the benzo(a)pyrene models underestimated exposures by 51%. Conclusions: Possible bias as a result of underestimation of the impact of coal tar on benzo(a)pyrene exposure levels must be explored in sensitivity analysis of the exposure–response relation. Validation of the models, albeit limited, increased our confidence in their applicability to exposure assessment in the historical cohort study of cancer risk among asphalt workers. PMID:12205236

  4. Do more intelligent brains retain heightened plasticity for longer in development? A computational investigation.

    PubMed

    Thomas, Michael S C

    2016-06-01

    Twin studies indicate that the heritability of general cognitive ability - the genetic contribution to individual differences - increases with age. Brant et al. (2013) reported that this increase in heritability occurs earlier in development for low-ability children than for high-ability children. Allied with structural brain imaging results indicating faster thickening and thinning of cortex in high-ability children (Shaw et al., 2006), Brant and colleagues argued that higher cognitive ability represents an extended sensitive period for brain development. However, they acknowledged that no coherent mechanistic account can currently reconcile the key empirical data. Here, computational methods are employed to demonstrate that the empirical data can be reconciled without recourse to variations in sensitive periods. These methods utilized population-based artificial neural network models of cognitive development. In the model, ability-related variations stemmed from the timing of increases in the non-linearity of computational processes, causing dizygotic twins to diverge in their behavior. These occurred in a population where: (a) ability was determined by the combined small contributions of many neurocomputational factors, and (b) individual differences in ability were largely genetically constrained. The model's explanation of developmental increases in heritability contrasts with proposals that these increases represent emerging gene-environment correlations (Haworth et al., 2010). The article advocates simulating inherited individual differences within an explicitly developmental framework. Copyright © 2016 The Author. Published by Elsevier Ltd. All rights reserved.

  5. Development of an empirical mathematical model for describing and optimizing the hygiene potential of a thermophilic anaerobic bioreactor treating faeces.

    PubMed

    Lübken, M; Wichern, M; Bischof, F; Prechtl, S; Horn, H

    2007-01-01

    Poor sanitation and insufficient disposal of sewage and faeces are primarily responsible for water-associated health problems in developing countries. Domestic sewage and faeces are prevalently discharged into surface waters which the inhabitants use as a source of drinking water. This paper presents a decentralized anaerobic process technique for handling such domestic organic waste. An efficient and compact system for treating faeces and food waste may be of great benefit to developing countries. Besides stable biogas production for energy generation, the reduction of bacterial pathogens is of particular importance. In our research we investigated the pathogen-removal capacity of the reactor, which was operated under thermophilic conditions. Faecal coliforms and intestinal enterococci were used as indicator organisms for bacterial pathogens. An empirical mathematical model was developed using the multiple regression analysis technique. The model shows a high correlation between removal efficiency and both hydraulic retention time (HRT) and temperature. With this model, an optimized HRT for defined bacterial pathogen effluent standards can be easily calculated, so that hygiene potential can be evaluated along with economic aspects. This paper presents not only results describing the hygiene potential of a thermophilic anaerobic bioreactor, but also an exemplary method for drawing the right conclusions from biological tests with the aid of mathematical tools.
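    A regression of this kind can be inverted to back out the HRT that meets a pathogen effluent target, which is the optimization step the abstract describes. The sketch below uses an invented linear response surface; the coefficients, ranges, and units are assumptions, not the paper's fitted model.

```python
import numpy as np

# Hypothetical response surface: log10 pathogen reduction vs. HRT and temperature.
rng = np.random.default_rng(4)
hrt = rng.uniform(5.0, 30.0, 60)       # hydraulic retention time, days
temp = rng.uniform(45.0, 60.0, 60)     # deg C
logred = 0.5 + 0.12 * hrt + 0.06 * temp + rng.normal(0.0, 0.1, 60)

# Multiple linear regression: logred = b0 + b1*HRT + b2*T.
A = np.column_stack([np.ones_like(hrt), hrt, temp])
b, *_ = np.linalg.lstsq(A, logred, rcond=None)

def required_hrt(target_logred, temp_c):
    """Invert the fitted model: HRT needed to meet a log-reduction target."""
    return (target_logred - b[0] - b[2] * temp_c) / b[1]
```

    For instance, `required_hrt(5.0, 55.0)` returns roughly 10 days under these invented coefficients, showing how an effluent standard translates into an operating HRT.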

  6. Content, Social, and Metacognitive Statements: An Empirical Study Comparing Human-Human and Human-Computer Tutorial Dialogue

    DTIC Science & Technology

    2010-01-01

    Learning gain was computed for each participant using the formula gain = (posttest − pretest)/(1 − pretest). … These differences also affect which factors are correlated with learning gain and user satisfaction. We argue that ITS designers should pay particular attention to strategies for dealing…

  7. Developing a prototype for short-term psychodynamic (supportive-expressive) therapy: An empirical study with the psychotherapy process Q-set.

    PubMed

    Leichsenring, Falk; Ablon, Stuart; Barber, Jacques P; Beutel, Manfred; Gibbons, Mary Beth Connolly; Crits-Christoph, Paul; Klein, Susanne; Leweke, Frank; Steinert, Christiane; Wiltink, Jörg; Salzer, Simone

    2016-07-01

    A Psychotherapy Process Q-set (PQS) prototype characteristic of short-term psychodynamic therapy (STPP) does not yet exist. Experts in supportive-expressive (SE) therapy used the 100-Item PQS questionnaire to rate an ideal short-term SE therapy. Agreement between raters was high (Cronbach's alpha = 0.94). The prototype for SE therapy showed a significant correlation with the psychoanalytic prototype, but with 28% of variance explained, the majority of variance of the former was not explained by the latter or vice versa. Furthermore, the SE prototype showed significant correlations with the cognitive-behavioral prototype and the prototype of interpersonal therapy by Ablon and Jones (r = 0.69, 0.43). We recommend using the PQS prototype presented here for future process research on STPP.

  8. Winter Precipitation Forecast in the European and Mediterranean Regions Using Cluster Analysis

    NASA Astrophysics Data System (ADS)

    Totz, Sonja; Tziperman, Eli; Coumou, Dim; Pfeiffer, Karl; Cohen, Judah

    2017-12-01

    The European climate is changing under global warming, and the Mediterranean region in particular has been identified as a hot spot for climate change, with climate models projecting a reduction in winter rainfall and a very pronounced increase in summertime heat waves. These trends are already detectable over the historic period. Hence, it is beneficial to forecast seasonal droughts well in advance so that water managers and stakeholders can prepare to mitigate deleterious impacts. We developed a new cluster-based empirical forecast method to predict precipitation anomalies in winter. This algorithm considers not only the strength but also the pattern of the precursors. We compare our algorithm with dynamical forecast models and a canonical correlation analysis-based prediction method, demonstrating that our prediction method performs better in terms of time and pattern correlation in the Mediterranean and European regions.

  9. Correlation between multispectral photography and near-surface turbidities

    NASA Technical Reports Server (NTRS)

    Wertz, D. L.; Mealor, W. T.; Steele, M. L.; Pinson, J. W.

    1976-01-01

    Four-band multispectral photography obtained from an aerial platform at an altitude of about 10,000 feet has been utilized to measure near-surface turbidity at numerous sampling sites in the Ross Barnett Reservoir, Mississippi. Correlation of the photographs with turbidity measurements has been accomplished via an empirical mathematical model which depends upon visual color recognition when the composited photographs are examined on either an I²S Model 600 or a Spectral Data Model 65 color-additive viewer. The mathematical model was developed utilizing least-squares, iterative, and standard statistical methods and includes a time-dependent term related to sun angle. This model is consistent with information obtained from two overflights of the target area (July 30, 1973 and October 30, 1973) and is now being evaluated with regard to information obtained from a third overflight on November 8, 1974.

  10. Childhood Traumatic Grief: A Multi-Site Empirical Examination of the Construct and Its Correlates

    ERIC Educational Resources Information Center

    Brown, Elissa J.; Amaya-Jackson, Lisa; Cohen, Judith; Handel, Stephanie; De Bocanegra, Heike Thiel; Zatta, Eileen; Goodman, Robin F.; Mannarino, Anthony

    2008-01-01

    This study evaluated the construct of childhood traumatic grief (CTG) and its correlates through a multi-site assessment of 132 bereaved children and adolescents. Youth completed a new measure of the characteristics, attributions, and reactions to exposure to death (CARED), as well as measures of CTG, posttraumatic stress disorder (PTSD),…

  11. Prevalence and Socio-Demographic Correlates of Psychological Distress among Students at an Australian University

    ERIC Educational Resources Information Center

    Larcombe, Wendy; Finch, Sue; Sore, Rachel; Murray, Christina M.; Kentish, Sandra; Mulder, Raoul A.; Lee-Stecum, Parshia; Baik, Chi; Tokatlidis, Orania; Williams, David A.

    2016-01-01

    This research contributes to the empirical literature on university student mental well-being by investigating the prevalence and socio-demographic correlates of severe levels of psychological distress. More than 5000 students at a metropolitan Australian university participated in an anonymous online survey in 2013 that included the short form of…

  12. Correlates of Conduct Problems and Depression Comorbidity in Elementary School Boys and Girls Receiving Special Educational Services

    ERIC Educational Resources Information Center

    Poirier, Martine; Déry, Michèle; Toupin, Jean; Verlaan, Pierrette; Lemelin, Jean-Pascal; Jagiellowicz, Jadzia

    2015-01-01

    There is limited empirical research on the correlates of conduct problems (CP) and depression comorbidity during childhood. This study investigated 479 elementary school children (48.2% girls). It compared children with comorbidity to children with CP only, depression only, and control children on individual, academic, social, and family…

  13. Visual Skills and Chinese Reading Acquisition: A Meta-Analysis of Correlation Evidence

    ERIC Educational Resources Information Center

    Yang, Ling-Yan; Guo, Jian-Peng; Richman, Lynn C.; Schmidt, Frank L.; Gerken, Kathryn C.; Ding, Yi

    2013-01-01

    This paper used meta-analysis to synthesize the relation between visual skills and Chinese reading acquisition based on the empirical results from 34 studies published from 1991 to 2011. We obtained 234 correlation coefficients from 64 independent samples, with a total of 5,395 participants. The meta-analysis revealed that visual skills as a…
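    The pooling step of such a correlation meta-analysis is commonly done with Fisher's r-to-z transform and inverse-variance (sample-size) weights. The sketch below shows that mechanism on made-up study results, not the 234 coefficients synthesized in the paper.

```python
import math

def pooled_correlation(results):
    """Sample-size-weighted mean correlation via Fisher's z transform."""
    num = den = 0.0
    for r, n in results:
        z = math.atanh(r)          # Fisher r-to-z transform
        w = n - 3                  # inverse variance of z is 1/(n - 3)
        num += w * z
        den += w
    return math.tanh(num / den)   # back-transform the weighted mean z

# Illustrative inputs only (correlation, sample size) per study.
studies = [(0.35, 120), (0.28, 80), (0.42, 200)]
print(round(pooled_correlation(studies), 3))   # prints 0.373
```

    Averaging in z-space rather than r-space avoids the bias introduced by the skewed sampling distribution of r, which matters when, as here, many small-sample coefficients are combined.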

  14. Exponential Correlation of IQ and the Wealth of Nations

    ERIC Educational Resources Information Center

    Dickerson, Richard E.

    2006-01-01

    Plots of mean IQ and per capita real Gross Domestic Product for groups of 81 and 185 nations, as collected by Lynn and Vanhanen, are best fitted by an exponential function of the form GDP = a · 10^(b·IQ), where a and b are empirical constants. Exponential fitting yields markedly higher correlation coefficients than either linear or…
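    Fitting an exponential of this form reduces to linear regression after taking log10 of GDP. The sketch below demonstrates this on synthetic data; the constants a and b used to generate it are illustrative, not the values fitted in the article.

```python
import numpy as np

# Synthetic data from the form GDP = a * 10**(b * IQ), with illustrative
# constants (log10(a) = 0.5, b = 0.04) plus multiplicative noise.
rng = np.random.default_rng(5)
iq = rng.uniform(60.0, 110.0, 80)
gdp = 10.0 ** (0.04 * iq + 0.5) * rng.lognormal(0.0, 0.1, 80)

# log10(GDP) = log10(a) + b*IQ, so the exponential fit is linear in log space.
b, log_a = np.polyfit(iq, np.log10(gdp), 1)
a = 10.0 ** log_a
```

    Comparing the correlation coefficient of this log-space fit with that of a straight-line fit on the raw data is exactly the model comparison the abstract reports.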

  15. Classical Item Analysis Using Latent Variable Modeling: A Note on a Direct Evaluation Procedure

    ERIC Educational Resources Information Center

    Raykov, Tenko; Marcoulides, George A.

    2011-01-01

    A directly applicable latent variable modeling procedure for classical item analysis is outlined. The method allows one to point and interval estimate item difficulty, item correlations, and item-total correlations for composites consisting of categorical items. The approach is readily employed in empirical research and as a by-product permits…

  16. Correlates of parent-youth discordance about youth-witnessed violence: a brief report.

    PubMed

    Lewis, Terri; Thompson, Richard; Kotch, Jonathan B; Proctor, Laura J; Litrownik, Alan J; English, Diana J; Runyan, Desmond K; Wiley, Tisha R; Dubowitz, Howard

    2013-01-01

    Studies have consistently demonstrated a lack of agreement between youth and parent reports regarding youth-witnessed violence (YWV). However, little empirical investigation has been conducted on the correlates of disagreement. Concordance between youth and parents about YWV was examined in 766 parent-youth dyads from the Longitudinal Studies of Child Abuse and Neglect (LONGSCAN). Results showed that significantly more youth (42%) than parents (15%) reported YWV. Among the dyads in which at least one informant reported YWV (N = 344), we assessed whether youth delinquency, parental monitoring, parent-child relationship quality, history of child maltreatment, income, and parental depression were predictive of parent-youth concordance. Findings indicated that youth engagement in delinquent activities was higher in the groups in which the youth reported violence exposure. More empirical study is needed to assess correlates of agreement in high-risk youth to better inform associations found between exposures and outcomes as well as practice and policy for violence exposed youth.
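    Informant concordance of the kind measured here is often summarized with Cohen's kappa on the 2x2 youth-by-parent table. The counts below are hypothetical, chosen only to be consistent with the marginals reported in the abstract (766 dyads, 42% of youth and 15% of parents reporting YWV, 344 dyads with at least one report); they are not the study's actual table.

```python
def cohens_kappa(table):
    """Cohen's kappa for a 2x2 informant-agreement table.
    table[y][p] = number of dyads with youth report y and parent report p
    (index 1 = witnessed violence reported, 0 = not reported)."""
    total = sum(sum(row) for row in table)
    p_obs = (table[0][0] + table[1][1]) / total            # observed agreement
    youth_yes = (table[1][0] + table[1][1]) / total
    parent_yes = (table[0][1] + table[1][1]) / total
    p_exp = youth_yes * parent_yes + (1 - youth_yes) * (1 - parent_yes)
    return (p_obs - p_exp) / (1 - p_exp)

# Hypothetical counts matching the reported marginals.
table = [[422, 22], [229, 93]]
kappa = cohens_kappa(table)
```

    Under these counts kappa is about 0.26, i.e. agreement only modestly better than chance, which is the qualitative pattern the abstract describes.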

  17. A Rigorous Test of the Fit of the Circumplex Model to Big Five Personality Data: Theoretical and Methodological Issues and Two Large Sample Empirical Tests.

    PubMed

    DeGeest, David Scott; Schmidt, Frank

    2015-01-01

    Our objective was to apply the rigorous test developed by Browne (1992) to determine whether the circumplex model fits Big Five personality data. This test has yet to be applied to personality data. Another objective was to determine whether blended items explained correlations among the Big Five traits. We used two working adult samples, the Eugene-Springfield Community Sample and the Professional Worker Career Experience Survey. Fit to the circumplex was tested via Browne's (1992) procedure. Circumplexes were graphed to identify items with loadings on multiple traits (blended items), and to determine whether removing these items changed five-factor model (FFM) trait intercorrelations. In both samples, the circumplex structure fit the FFM traits well. Each sample had items with dual-factor loadings (8 items in the first sample, 21 in the second). Removing blended items had little effect on construct-level intercorrelations among FFM traits. We conclude that rigorous tests show that the fit of personality data to the circumplex model is good. This finding means the circumplex model is competitive with the factor model in understanding the organization of personality traits. The circumplex structure also provides a theoretically and empirically sound rationale for evaluating intercorrelations among FFM traits. Even after eliminating blended items, FFM personality traits remained correlated.

  18. An experimental and theoretical study to relate uncommon rock/fluid properties to oil recovery. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Watson, R.

    Waterflooding is the most commonly used secondary oil recovery technique. One of the requirements for understanding waterflood performance is a good knowledge of the basic properties of the reservoir rocks. This study is aimed at correlating rock-pore characteristics to oil recovery from various reservoir rock types and incorporating these properties into empirical models for predicting oil recovery. For that reason, this report deals with the analysis and interpretation of experimental data collected from core floods and correlated against measurements of absolute permeability, porosity, wettability index, mercury porosimetry properties, and irreducible water saturation. The results of the radial-core and linear-core flow investigations and the other associated experimental analyses are presented and incorporated into empirical models to improve the predictions of oil recovery resulting from waterflooding, for sandstone and limestone reservoirs. For the radial-core case, the standardized regression model selected, based on a subset of the variables, predicted oil recovery by waterflooding with a standard deviation of 7%. For the linear-core case, separate models are developed using common, uncommon, and combined rock properties. It was observed that residual oil saturation and oil recovery are better predicted with the inclusion of both common and uncommon rock/fluid properties in the predictive models.

  19. Employment Condition, Economic Deprivation and Self-Evaluated Health in Europe: Evidence from EU-SILC 2009-2012.

    PubMed

    Bacci, Silvia; Pigini, Claudia; Seracini, Marco; Minelli, Liliana

    2017-02-03

    Background: The mixed empirical evidence about employment conditions (i.e., permanent vs. temporary job, full-time vs. part-time job) as well as unemployment has motivated the development of conceptual models with the aim of assessing the pathways leading to effects of employment status on health. Alongside physically and psychologically riskier working conditions, one channel stems from the possibly severe economic deprivation faced by temporary workers. We investigate whether economic deprivation is able to partly capture the effect of employment status on Self-evaluated Health Status (SHS). Methods: Our analysis is based on the European Union Statistics on Income and Living Conditions (EU-SILC) survey, for a balanced sample from 26 countries from 2009 to 2012. We estimate a correlated random-effects logit model for the SHS that accounts for the ordered nature of the dependent variable and the longitudinal structure of the data. Results and Discussion: Material deprivation and economic strain are able to partly account for the negative effects on SHS of precarious and part-time employment as well as of unemployment, which, however, exhibits a significant independent negative association with SHS. Conclusions: Some of the indicators used to proxy economic deprivation are significant predictors of SHS, and their correlation with the employment condition is such that it should not be neglected in empirical analysis, when available, in addition to monetary income.

  20. Employment Condition, Economic Deprivation and Self-Evaluated Health in Europe: Evidence from EU-SILC 2009–2012

    PubMed Central

    Bacci, Silvia; Pigini, Claudia; Seracini, Marco; Minelli, Liliana

    2017-01-01

    Background: The mixed empirical evidence about employment conditions (i.e., permanent vs. temporary job, full-time vs. part-time job) as well as unemployment has motivated the development of conceptual models with the aim of assessing the pathways leading to effects of employment status on health. Alongside physically and psychologically riskier working conditions, one channel stems from the possibly severe economic deprivation faced by temporary workers. We investigate whether economic deprivation is able to partly capture the effect of employment status on Self-evaluated Health Status (SHS). Methods: Our analysis is based on the European Union Statistics on Income and Living Conditions (EU-SILC) survey, for a balanced sample from 26 countries from 2009 to 2012. We estimate a correlated random-effects logit model for the SHS that accounts for the ordered nature of the dependent variable and the longitudinal structure of the data. Results and Discussion: Material deprivation and economic strain are able to partly account for the negative effects on SHS of precarious and part-time employment as well as of unemployment, which, however, exhibits a significant independent negative association with SHS. Conclusions: Some of the indicators used to proxy economic deprivation are significant predictors of SHS, and their correlation with the employment condition is such that it should not be neglected in empirical analysis, when available, in addition to monetary income. PMID:28165375

  1. Why do generic drugs fail to achieve an adequate market share in Greece? Empirical findings and policy suggestions.

    PubMed

    Balasopoulos, T; Charonis, A; Athanasakis, K; Kyriopoulos, J; Pavi, E

    2017-03-01

    Since 2010, the memoranda of understanding have been implemented in Greece as a measure of fiscal adjustment, with public pharmaceutical expenditure one of the main focuses of this implementation. Numerous policies targeted on pharma spending reduced the pharmaceutical budget by 60.5%. Yet generics' penetration in Greece remained among the lowest in OECD countries. This study aims to highlight the factors that affect the population's perceptions of generic drugs and to suggest effective policy measures. The empirical analysis is based on a national cross-sectional survey conducted on a sample of 2003 individuals representative of the general population. Two ordinal logistic regression models were constructed to identify the determinants of respondents' beliefs about the safety and the effectiveness of generic drugs. The empirical findings showed a positive and statistically significant correlation with income, bill-payment difficulties, perceived safety and effectiveness of drugs, prescription and dispensing preferences, and views toward pharmaceutical companies. Age and trust toward the medical community also show a positive and statistically significant correlation with the perception of the safety of generic drugs. Policy interventions are suggested on the basis of the empirical results in three major categories: (a) information campaigns, (b) incentives for doctors and pharmacists, and (c) strengthening the bioequivalence control framework and the dissemination of results. Copyright © 2017 Elsevier B.V. All rights reserved.

  2. Ground Based Ultraviolet Remote Sensing of Volcanic Gas Plumes

    PubMed Central

    Kantzas, Euripides P.; McGonigle, Andrew J. S.

    2008-01-01

    Ultraviolet spectroscopy has been implemented for over thirty years to monitor volcanic SO2 emissions. These data have provided valuable information concerning underground magmatic conditions, which have been of utility in eruption forecasting efforts. During the last decade the traditionally used correlation spectrometers have been upgraded with miniature USB coupled UV spectrometers, opening a series of exciting new empirical possibilities for understanding volcanoes and their impacts upon the atmosphere. Here we review these technological developments, in addition to the scientific insights they have precipitated, covering the strengths and current limitations of this approach. PMID:27879780

  3. Asymptotics of nonparametric L-1 regression models with dependent data

    PubMed Central

    ZHAO, ZHIBIAO; WEI, YING; LIN, DENNIS K.J.

    2013-01-01

    We investigate asymptotic properties of least-absolute-deviation or median quantile estimates of the location and scale functions in nonparametric regression models with dependent data from multiple subjects. Under a general dependence structure that allows for longitudinal data and some spatially correlated data, we establish uniform Bahadur representations for the proposed median quantile estimates. The obtained Bahadur representations provide deep insights into the asymptotic behavior of the estimates. Our main theoretical development is based on studying the modulus of continuity of a kernel-weighted empirical process through a coupling argument. Progesterone data are used as an illustration. PMID:24955016

  4. Quantifying the process and outcomes of person-centered planning.

    PubMed

    Holburn, S; Jacobson, J W; Vietze, P M; Schwartz, A A; Sersen, E

    2000-09-01

    Although person-centered planning is a popular approach in the field of developmental disabilities, there has been little systematic assessment of its process and outcomes. To measure person-centered planning, we developed three instruments designed to assess its various aspects. We then constructed variables comprising both a Process and an Outcome Index using a combined rational-empirical method. Test-retest reliability and measures of internal consistency appeared adequate. Variable correlations and factor analysis were generally consistent with our conceptualization and resulting item and variable classifications. Practical implications for intervention integrity, program evaluation, and organizational performance are discussed.

  5. Enhanced Evaporation and Condensation in Tubes

    NASA Astrophysics Data System (ADS)

    Honda, Hiroshi

    A state-of-the-art review of enhanced evaporation and condensation in horizontal microfin tubes and micro-channels that are used for air-conditioning and refrigeration applications is presented. The review covers the effects of flow pattern and geometrical parameters of the tubes on the heat transfer performance. Attention is paid to the effect of surface tension which leads to enhanced evaporation and condensation in the microfin tubes and micro-channels. A review of prior efforts to develop empirical correlations of the heat transfer coefficient and theoretical models for evaporation and condensation in the horizontal microfin tubes and micro-channels is also presented.

  6. Statistical correlations of shear wave velocity and penetration resistance for soils

    NASA Astrophysics Data System (ADS)

    Dikmen, Ünal

    2009-03-01

    In this paper, the correlation between shear wave velocity and standard penetration test blow counts (SPT-N) is investigated. The study focused primarily on the correlation of SPT-N and shear wave velocity (Vs) for several soil categories: all soils, sand, silt and clay-type soils. New empirical formulae are suggested to correlate SPT-N and Vs, based on a dataset collected in a part of the Eskişehir settlement in the western central Anatolia region of Turkey. The formulae are based on geotechnical soundings and active and passive seismic experiments. The new and previously suggested formulae relating uncorrected SPT-N and Vs have been compared and evaluated using the same dataset. The results suggest that better correlations in the estimation of Vs are obtained when the uncorrected blow counts are used. The blow count is the major parameter, and the soil type has no significant influence on the results. In cohesive soils the plasticity, and in non-cohesive soils (except gravels) the gradation, have no significant effect on the estimation of Vs. The results support most of the conclusions of earlier studies. These practical relationships developed between SPT-N and Vs should be used with caution in geotechnical engineering and should be checked against measured Vs.
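
    Correlations of this kind are conventionally fitted as a power law, Vs = a·N^b, by linear least squares in log-log space. The sketch below illustrates the procedure on hypothetical (SPT-N, Vs) pairs; both the data and the resulting coefficients are illustrative, not those of the Eskişehir dataset.

```python
import numpy as np

# Hypothetical (SPT-N, Vs) pairs standing in for field data; the
# resulting coefficients are illustrative, not those of the study.
N = np.array([5.0, 10.0, 15.0, 20.0, 30.0, 40.0, 50.0])
Vs = np.array([140.0, 185.0, 215.0, 240.0, 280.0, 310.0, 335.0])  # m/s

# Fit Vs = a * N**b by linear least squares in log-log space.
b, log_a = np.polyfit(np.log(N), np.log(Vs), 1)
a = float(np.exp(log_a))

Vs_pred = a * N**b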

  7. Record statistics of financial time series and geometric random walks

    NASA Astrophysics Data System (ADS)

    Sabir, Behlool; Santhanam, M. S.

    2014-09-01

    The study of record statistics of correlated series in physics, such as random walks, is gaining momentum, and several analytical results have been obtained in the past few years. In this work, we study the record statistics of correlated empirical data for which random walk models have relevance. We obtain results for the record statistics of select stock market data and the geometric random walk, primarily through simulations. We show that the distribution of the age of records is a power law with the exponent α lying in the range 1.5≤α≤1.8. Further, the longest record ages follow the Fréchet distribution of extreme value theory. The record statistics of geometric random walk series are in good agreement with those obtained from empirical stock data.
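
    The record-counting procedure underlying such studies is simple to sketch: an entry is an (upper) record if it exceeds every earlier entry. The minimal example below counts records in a simulated geometric random walk; the drift and volatility parameters are illustrative, not fitted to any stock.

```python
import numpy as np

rng = np.random.default_rng(0)

def count_records(series):
    """Count upper records: entries exceeding every previous entry."""
    running_max = float("-inf")
    n_records = 0
    for x in series:
        if x > running_max:
            n_records += 1
            running_max = x
    return n_records

# Geometric random walk: i.i.d. Gaussian log-returns, a common null
# model for stock prices (drift and volatility are illustrative).
log_returns = rng.normal(loc=0.0, scale=0.02, size=1000)
prices = 100.0 * np.exp(np.cumsum(log_returns))

n_rec = count_records(prices)
```

    The first entry is always a record, so `n_rec` is at least 1; for a symmetric random walk the expected number of records grows much more slowly than the series length.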

  8. Using brain stimulation to disentangle neural correlates of conscious vision

    PubMed Central

    de Graaf, Tom A.; Sack, Alexander T.

    2014-01-01

    Research into the neural correlates of consciousness (NCCs) has blossomed, due to the advent of new and increasingly sophisticated brain research tools. Neuroimaging has uncovered a variety of brain processes that relate to conscious perception, obtained in a range of experimental paradigms. But methods such as functional magnetic resonance imaging or electroencephalography do not always afford inference on the functional role these brain processes play in conscious vision. Such empirical NCCs could reflect neural prerequisites, neural consequences, or neural substrates of a conscious experience. Here, we take a closer look at the use of non-invasive brain stimulation (NIBS) techniques in this context. We discuss and review how NIBS methodology can enlighten our understanding of brain mechanisms underlying conscious vision by disentangling the empirical NCCs. PMID:25295015

  9. DGCA: A comprehensive R package for Differential Gene Correlation Analysis.

    PubMed

    McKenzie, Andrew T; Katsyv, Igor; Song, Won-Min; Wang, Minghui; Zhang, Bin

    2016-11-15

    Dissecting the regulatory relationships between genes is a critical step towards building accurate predictive models of biological systems. A powerful approach towards this end is to systematically study the differences in correlation between gene pairs in more than one distinct condition. In this study we develop an R package, DGCA (for Differential Gene Correlation Analysis), which offers a suite of tools for computing and analyzing differential correlations between gene pairs across multiple conditions. To minimize parametric assumptions, DGCA computes empirical p-values via permutation testing. To understand differential correlations at a systems level, DGCA performs higher-order analyses such as measuring the average difference in correlation and multiscale clustering analysis of differential correlation networks. Through a simulation study, we show that the straightforward z-score based method that DGCA employs significantly outperforms the existing alternative methods for calculating differential correlation. Application of DGCA to the TCGA RNA-seq data in breast cancer not only identifies key changes in the regulatory relationships between TP53 and PTEN and their target genes in the presence of inactivating mutations, but also reveals an immune-related differential correlation module that is specific to triple negative breast cancer (TNBC). DGCA is an R package for systematically assessing the difference in gene-gene regulatory relationships under different conditions. This user-friendly, effective, and comprehensive software tool will greatly facilitate the application of differential correlation analysis in many biological studies and thus will help identification of novel signaling pathways, biomarkers, and targets in complex biological systems and diseases.
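
    The z-score method for differential correlation that the package builds on is, in essence, a Fisher-transform test for the difference between two correlation coefficients. A minimal sketch follows; this is the textbook statistic, not the DGCA implementation itself, which additionally derives empirical p-values by permuting sample labels. The gene-pair numbers are hypothetical.

```python
from math import atanh, sqrt, erfc

def diff_corr_z(r1, n1, r2, n2):
    """Fisher z-statistic for the difference between two correlations
    measured in independent groups of sizes n1 and n2."""
    se = sqrt(1.0 / (n1 - 3) + 1.0 / (n2 - 3))
    return (atanh(r1) - atanh(r2)) / se

def two_sided_p(z):
    """Two-sided p-value under the standard normal."""
    return erfc(abs(z) / sqrt(2.0))

# Hypothetical gene pair: strongly correlated in condition 1,
# weakly correlated in condition 2, 100 samples per condition.
z = diff_corr_z(0.7, 100, 0.1, 100)
p = two_sided_p(z)
```

    In DGCA the analogous statistic is computed for every gene pair across conditions, with significance calibrated by permutation rather than the normal approximation alone.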

  10. GPS-Based Reduced Dynamic Orbit Determination Using Accelerometer Data

    NASA Technical Reports Server (NTRS)

    VanHelleputte, Tom; Visser, Pieter

    2007-01-01

    Currently two gravity field satellite missions, CHAMP and GRACE, are equipped with high-sensitivity electrostatic accelerometers, measuring the non-conservative forces acting on the spacecraft in three orthogonal directions. During gravity field recovery these measurements help to separate gravitational and non-gravitational contributions in the observed orbit perturbations. For precise orbit determination purposes both missions have a dual-frequency GPS receiver on board. The reduced dynamic technique combines the dense and accurate GPS observations with physical models of the forces acting on the spacecraft, complemented by empirical accelerations, which are stochastic parameters adjusted in the orbit determination process. When the spacecraft carries an accelerometer, these measured accelerations can replace the models of the non-conservative forces, such as air drag and solar radiation pressure. This approach is implemented in a batch least-squares estimator of the GPS High Precision Orbit Determination Software Tools (GHOST), developed at DLR/GSOC and DEOS, and has been extensively tested with data from the CHAMP and GRACE satellites. As accelerometer observations are typically affected by an unknown scale factor and bias in each measurement direction, they require calibration during processing. The estimated state vector is therefore augmented with six parameters: a scale and a bias factor for each of the three axes. To converge efficiently to a good solution, reasonable a priori values for the bias factors are necessary. These are calculated by combining the mean value of the accelerometer observations with the mean value of the non-conservative force models and the empirical accelerations estimated when using these models. When the non-conservative force models are replaced with accelerometer observations and empirical accelerations are still estimated, good orbit precision is achieved: processing 100 days of GRACE B data results in a mean orbit fit of a few centimeters with respect to high-quality JPL reference orbits, a slightly better consistency than when using force models. A purely dynamic orbit, without empirical accelerations and thus adjusting only the six state parameters and the bias and scale factors, gives an orbit fit for the GRACE B test case below the decimeter level. The in-orbit calibrated accelerometer observations can be used to validate the modelled accelerations and estimated empirical accelerations computed with the GHOST tools. The along-track direction shows the best agreement, with a mean correlation coefficient of 93% over the same period; in the radial and normal directions the correlation is smaller. During days of high solar activity the benefit of using accelerometer observations is clearly visible: the observations during these days show fluctuations which the modelled and empirical accelerations cannot follow.

  11. Empirical analysis on the human dynamics of blogging behavior on GitHub

    NASA Astrophysics Data System (ADS)

    Yan, Deng-Cheng; Wei, Zong-Wen; Han, Xiao-Pu; Wang, Bing-Hong

    2017-01-01

    GitHub is a social collaborative coding platform on which software developers not only collaborate on code but also share knowledge through blogs using GitHub Pages. In this article, we analyze the blogging behavior of software developers on GitHub Pages. The results show that both the commit number and the inter-event time between two consecutive blogging actions follow heavy-tailed distributions. We further observe considerable variation in activity across individual developers, and a strong positive correlation between activity and the power-law exponent of the inter-event time distribution. We also find a difference between user behavior on GitHub Pages and on other online systems, driven by the diversity of users and the length of contents. In addition, our results show a clear difference between the majority of developers and elite developers in their burstiness properties.

  12. Correlation of transonic-cone preston-tube data and skin friction

    NASA Technical Reports Server (NTRS)

    Abu-Mostafa, A. S.; Reed, T. D.

    1984-01-01

    Preston-tube measurements obtained on the Arnold Engineering Development Center (AEDC) Transition Cone have been correlated with theoretical skin friction coefficients in transitional and turbulent flow. This has been done for the NASA Ames 11-Ft Transonic Wind Tunnel (11 TWT) and flight tests. The developed semi-empirical correlations of Preston-tube data have been used to derive a calibration procedure for the 11 TWT flow quality. This procedure has been applied to the corrected laminar data, and an effective freestream unit Reynolds number is defined by requiring a matching of the average Preston-tube pressure in flight and in the tunnel. This study finds that the operating Reynolds number is below the effective value required for a match in laminar Preston-tube data. The distribution of this effective Reynolds number with Mach number correlates well with the freestream noise level in this tunnel. Analyses of transitional and turbulent data, however, did not result in effective Reynolds numbers that can be correlated with background noise. This is because vorticity fluctuations present in transitional and turbulent boundary layers dominate Preston-tube pressure fluctuations and therefore mask the tunnel noise effects. So, in order to calibrate the effects of noise on transonic wind tunnel tests, only laminar data should be used, preferably at flow conditions similar to those in flight tests. To calibrate the effects of transonic wind-tunnel noise on drag measurements, however, the Preston-tube data must be supplemented with direct measurements of skin friction.

  13. Local normalization: Uncovering correlations in non-stationary financial time series

    NASA Astrophysics Data System (ADS)

    Schäfer, Rudi; Guhr, Thomas

    2010-09-01

    The measurement of correlations between financial time series is of vital importance for risk management. In this paper we address an estimation error that stems from the non-stationarity of the time series. We put forward a method to rid the time series of local trends and variable volatility, while preserving cross-correlations. We test this method in a Monte Carlo simulation, and apply it to empirical data for the S&P 500 stocks.
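
    The core of the method is to standardize each observation by a mean and standard deviation computed in a local neighborhood, so that slow trends and volatility changes are removed while fast cross-correlations survive. A minimal sketch, with an illustrative window length (the paper discusses suitable choices):

```python
import numpy as np

def local_normalize(x, window=13):
    """Subtract a local mean and divide by a local standard deviation,
    computed in a moving window around each point (the window length
    here is illustrative)."""
    x = np.asarray(x, dtype=float)
    out = np.empty_like(x)
    half = window // 2
    for i in range(len(x)):
        lo, hi = max(0, i - half), min(len(x), i + half + 1)
        seg = x[lo:hi]
        sd = seg.std()
        out[i] = (x[i] - seg.mean()) / sd if sd > 0 else 0.0
    return out

rng = np.random.default_rng(1)
# Non-stationary toy series: a drifting trend plus growing volatility.
t = np.arange(500)
series = 0.05 * t + rng.normal(0.0, 1.0 + t / 500.0, size=500)
normed = local_normalize(series)
```

    After normalization the series is locally centered and scaled, so sample correlation matrices estimated from many such series are no longer distorted by trends or volatility clustering.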

  14. The Interpersonal Adaptiveness of Dispositional Guilt and Shame: A Meta-Analytic Investigation.

    PubMed

    Tignor, Stefanie M; Colvin, C Randall

    2017-06-01

    Despite decades of empirical research, conclusions regarding the adaptiveness of dispositional guilt and shame are mixed. We use meta-analysis to summarize the empirical literature and clarify these ambiguities. Specifically, we evaluate how guilt and shame are uniquely related to pro-social orientation and, in doing so, highlight the substantial yet under-acknowledged impact of researchers' methodological choices. A series of meta-analyses was conducted investigating the relationship between dispositional guilt (or shame) and pro-social orientation. Two main methodological moderators of interest were tested: test format (scenario vs. checklist) and statistical analysis (semi-partial vs. zero-order correlations). Among studies employing zero-order correlations, dispositional guilt was positively correlated with pro-social orientation (k = 63, Mr = .13, p < .001), whereas dispositional shame was negatively correlated, (k = 47, Mr = -.05, p = .07). Test format was a significant moderator for guilt studies only, with scenario measures producing significantly stronger effects. Semi-partial correlations resulted in significantly stronger effects among guilt and shame studies. Although dispositional guilt and shame are differentially related to pro-social orientation, such relationships depend largely on the methodological choices of the researcher, particularly in the case of guilt. Implications for the study of these traits are discussed. © 2016 Wiley Periodicals, Inc.
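
    Zero-order correlations in such a meta-analysis are typically pooled on Fisher's z scale, weighting each study by its sample size. A minimal fixed-effect sketch with hypothetical study-level values (not the paper's data, and not necessarily its exact weighting scheme):

```python
import numpy as np

def pool_correlations(rs, ns):
    """Fixed-effect pooled correlation: average the studies' Fisher-z
    transforms weighted by n - 3, then transform back."""
    rs, ns = np.asarray(rs, dtype=float), np.asarray(ns, dtype=float)
    z_bar = np.sum((ns - 3) * np.arctanh(rs)) / np.sum(ns - 3)
    return float(np.tanh(z_bar))

# Hypothetical study-level correlations and sample sizes.
r_pooled = pool_correlations([0.10, 0.15, 0.12, 0.18], [120, 80, 200, 60])
```

    Moderator analyses like those in the paper then compare pooled estimates between subsets of studies (e.g., scenario vs. checklist measures).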

  15. No complexity–stability relationship in empirical ecosystems

    PubMed Central

    Jacquet, Claire; Moritz, Charlotte; Morissette, Lyne; Legagneux, Pierre; Massol, François; Archambault, Philippe; Gravel, Dominique

    2016-01-01

    Understanding the mechanisms responsible for stability and persistence of ecosystems is one of the greatest challenges in ecology. Robert May showed that, contrary to intuition, complex randomly built ecosystems are less likely to be stable than simpler ones. Few attempts have been made to test May's prediction empirically, and the actual complexity–stability relationship in natural ecosystems remains unknown. Here we perform a stability analysis of 116 quantitative food webs sampled worldwide. We find that classic descriptors of complexity (species richness, connectance and interaction strength) are not associated with stability in empirical food webs. Further analysis reveals that a correlation between the effects of predators on prey and those of prey on predators, combined with a high frequency of weak interactions, stabilizes food web dynamics relative to the random expectation. We conclude that empirical food webs have several non-random properties contributing to the absence of a complexity–stability relationship. PMID:27553393

  16. Low Velocity Earth-Penetration Test and Analysis

    NASA Technical Reports Server (NTRS)

    Fasanella, Edwin L.; Jones, Yvonne; Knight, Norman F., Jr.; Kellas, Sotiris

    2001-01-01

    Modeling and simulation of structural impacts into soil continue to challenge analysts to develop accurate material models and detailed analytical simulations to predict the soil penetration event. This paper discusses finite element modeling of a series of penetrometer drop tests into soft clay. Parametric studies are performed with penetrometers of varying diameters, masses, and impact speeds to a maximum of 45 m/s. Parameters influencing the simulation such as the contact penalty factor and the material model representing the soil are also studied. An empirical relationship between key parameters is developed and is shown to correlate experimental and analytical results quite well. The results provide preliminary design guidelines for Earth impact that may be useful for future space exploration sample return missions.

  17. Ultrasonic nondestructive evaluation, microstructure, and mechanical property interrelations

    NASA Technical Reports Server (NTRS)

    Vary, A.

    1984-01-01

    Ultrasonic techniques for mechanical property characterizations are reviewed and conceptual models are advanced for explaining and interpreting the empirically based results. At present, the technology is generally empirically based and is emerging from the research laboratory. Advancement of the technology will require establishment of theoretical foundations for the experimentally observed interrelations among ultrasonic measurements, mechanical properties, and microstructure. Conceptual models are applied to ultrasonic assessment of fracture toughness to illustrate an approach for predicting correlations found among ultrasonic measurements, microstructure, and mechanical properties.

  18. Supercomputer modelling of an electronic structure for KCl nanocrystal with edge dislocation with the use of semiempirical and nonempirical models

    NASA Astrophysics Data System (ADS)

    Timoshenko, Yu K.; Shunina, V. A.; Shashkin, A. I.

    2018-03-01

    In the present work we used semiempirical and non-empirical models for the electronic states of a KCl nanocrystal containing an edge dislocation and compared the obtained results. Electronic levels and local densities of states were calculated. We found reasonable qualitative agreement between the semiempirical and non-empirical results. Using the results of the computer modelling, we discuss the problem of localization of electronic states near the line of the edge dislocation.

  19. Imaging subsurface hydrothermal structure using a dense geophone array in Yellowstone

    NASA Astrophysics Data System (ADS)

    Wu, S. M.; Lin, F. C.; Farrell, J.; Smith, R. B.

    2016-12-01

    The recent development of ambient noise cross-correlation and the availability of large-N seismic arrays allow for the study of detailed shallow crustal structure. In this study, we apply multi-component noise cross-correlation to explore the shallow hydrothermal structure near Old Faithful geyser in Yellowstone National Park using a temporary geophone array. The array was composed of 133 three-component 5-Hz geophones and was deployed for two weeks during November 2015. The average station spacing is 50 meters and the full aperture of the array is around 1 km, with good azimuthal and spatial coverage. The Upper Geyser Basin, where Old Faithful is located, has the largest concentration of geysers in the world. This unique active hydrothermal environment, and hence the extremely inhomogeneous noise source distribution, makes the construction of empirical Green's functions with the traditional noise cross-correlation method difficult. In this presentation, we show examples of the constructed cross-correlation functions and demonstrate their spatial and temporal relationships with known hydrothermal activity. We also demonstrate how useful seismic signals can be extracted from these cross-correlation functions and used for subsurface imaging. In particular, we will discuss the existence of a recharge cavity beneath Old Faithful revealed by the noise cross-correlations. In addition, we investigate temporal structural variation based on time-lapse noise cross-correlations; these preliminary results will also be discussed.
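
    The underlying operation, cross-correlating long noise records between station pairs to retrieve an empirical Green's function, can be sketched with two synthetic traces that share a delayed random wavefield: the peak of the cross-correlation recovers the propagation delay. All parameters below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

# Two synthetic "stations" record the same random wavefield, station B
# 25 samples later than station A; independent noise is added to each.
n, delay = 4096, 25
source = rng.normal(size=n + delay)
trace_a = source[delay:] + 0.1 * rng.normal(size=n)   # records the field early
trace_b = source[:n] + 0.1 * rng.normal(size=n)       # records it 25 samples late

# Full (zero-padded) cross-correlation via FFT; the peak lag recovers
# the inter-station propagation delay.
nfft = 2 * n
xcorr = np.fft.irfft(np.fft.rfft(trace_b, nfft) *
                     np.conj(np.fft.rfft(trace_a, nfft)), nfft)
lags = np.concatenate([np.arange(0, n), np.arange(-n, 0)])
best_lag = int(lags[np.argmax(xcorr)])
```

    In practice the correlation is stacked over days to weeks of data so that the coherent travel-time signal accumulates above the incoherent noise, which is what makes imaging possible even with the strongly directional sources of a geyser basin.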

  20. The interplay between post-critical beliefs and anxiety: an exploratory study in a Polish sample.

    PubMed

    Śliwak, Jacek; Zarzycka, Beata

    2012-06-01

    The present research investigates the relationship between anxiety and the religiosity dimensions that Wulff (Psychology of religion: classic and contemporary views, Wiley, New York, 1991; Psychology of religion. Classic and contemporary views, Wiley, New York, 1997; Psychologia religii. Klasyczna i współczesna, Wydawnictwo Szkolne i Pedagogiczne, Warszawa, 1999) described as Exclusion vs. Inclusion of Transcendence and Literal vs. Symbolic. The researchers used the Post-Critical Belief scale (Hutsebaut in J Empir Theol 9(2):48-66, 1996; J Empir Theol 10(1):39-54, 1997) to measure Wulff's religiosity dimensions and the IPAT scale (Krug et al. 1967) to measure anxiety. Results from an adult sample (N = 83) suggest that three dimensions show significant relations with anxiety. In the whole sample, Orthodoxy correlated negatively with suspiciousness (L) and positively with the guilt proneness (O) factor. Among women, Historical Relativism correlated negatively with suspiciousness (L), lack of integration (Q3), general anxiety and covert anxiety. Among men, Historical Relativism correlated positively with tension (Q4), emotional instability (C), general anxiety, covert anxiety and overt anxiety, and External Critique correlated with suspiciousness (L).

  1. Publication Bias in Psychology: A Diagnosis Based on the Correlation between Effect Size and Sample Size

    PubMed Central

    Kühberger, Anton; Fritz, Astrid; Scherndl, Thomas

    2014-01-01

    Background The p value obtained from a significance test provides no information about the magnitude or importance of the underlying phenomenon. Therefore, additional reporting of effect size is often recommended. Effect sizes are theoretically independent from sample size. Yet this may not hold true empirically: non-independence could indicate publication bias. Methods We investigate whether effect size is independent from sample size in psychological research. We randomly sampled 1,000 psychological articles from all areas of psychological research. We extracted p values, effect sizes, and sample sizes of all empirical papers, and calculated the correlation between effect size and sample size, and investigated the distribution of p values. Results We found a negative correlation of r = −.45 [95% CI: −.53; −.35] between effect size and sample size. In addition, we found an inordinately high number of p values just passing the boundary of significance. Additional data showed that neither implicit nor explicit power analysis could account for this pattern of findings. Conclusion The negative correlation between effect size and samples size, and the biased distribution of p values indicate pervasive publication bias in the entire field of psychology. PMID:25192357
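
    The diagnostic is easy to reproduce on synthetic data: if only significant results are published, small studies must report large observed effects to clear the significance threshold, which induces exactly the negative effect-size/sample-size correlation described. A toy simulation with a one-sided significance filter and an approximate standard error for Cohen's d (not the paper's data):

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy demonstration: a small true effect, studies of varying size, and
# a filter that "publishes" only significant positive results.
true_d = 0.2
effects, sizes = [], []
while len(effects) < 300:
    n = int(rng.integers(10, 200))
    se = 2.0 / np.sqrt(n)          # rough standard error of Cohen's d
    d_hat = rng.normal(true_d, se)
    if d_hat > 1.96 * se:          # one-sided significance filter
        effects.append(d_hat)
        sizes.append(n)

r = np.corrcoef(effects, sizes)[0, 1]
```

    The resulting correlation between published effect size and sample size is clearly negative, even though the simulated true effect is identical across studies.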

  3. Mechanistic quantitative structure-activity relationship model for the photoinduced toxicity of polycyclic aromatic hydrocarbons. 2: An empirical model for the toxicity of 16 polycyclic aromatic hydrocarbons to the duckweed Lemna gibba L. G-3

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, X.D.; Krylov, S.N.; Ren, L.

    1997-11-01

    Photoinduced toxicity of polycyclic aromatic hydrocarbons (PAHs) occurs via photosensitization reactions (e.g., generation of singlet-state oxygen) and by photomodification (photooxidation and/or photolysis) of the chemicals to more toxic species. The quantitative structure-activity relationship (QSAR) described in the companion paper predicted, in theory, that photosensitization and photomodification additively contribute to toxicity. To substantiate this QSAR modeling exercise it was necessary to show that toxicity can be described by empirically derived parameters. The toxicity of 16 PAHs to the duckweed Lemna gibba was measured as inhibition of leaf production in simulated solar radiation (a light source with a spectrum similar to that of sunlight). A predictive model for toxicity was generated based on the theoretical model developed in the companion paper. The photophysical descriptors required of each PAH for modeling were efficiency of photon absorbance, relative uptake, quantum yield for triplet-state formation, and the rate of photomodification. The photomodification rates of the PAHs showed a moderate correlation to toxicity, whereas a derived photosensitization factor (PSF; based on absorbance, triplet-state quantum yield, and uptake) for each PAH showed only a weak, complex correlation to toxicity. However, summing the rate of photomodification and the PSF resulted in a strong correlation to toxicity that had predictive value. When the PSF and a derived photomodification factor (PMF; based on the photomodification rate and toxicity of the photomodified PAHs) were summed, an excellent explanatory model of toxicity was produced, substantiating the additive contributions of the two factors.

  4. Excess entropy scaling for the segmental and global dynamics of polyethylene melts.

    PubMed

    Voyiatzis, Evangelos; Müller-Plathe, Florian; Böhm, Michael C

    2014-11-28

    The range of validity of the Rosenfeld and Dzugutov excess entropy scaling laws is analyzed for unentangled linear polyethylene chains. We consider two segmental dynamical quantities, i.e., the bond and the torsional relaxation times, and two global ones, i.e., the chain diffusion coefficient and the viscosity. The excess entropy is approximated either by a series expansion of the entropy in terms of the pair correlation function or by an equation of state for polymers developed in the context of the self-associating fluid theory. For the whole range of temperatures and chain lengths considered, the two estimates of the excess entropy are linearly correlated. The scaled bond and torsional relaxation times fall onto a master curve irrespective of the chain length and the employed scaling scheme. Both quantities depend non-linearly on the excess entropy. For a fixed chain length, the reduced diffusion coefficient and viscosity scale linearly with the excess entropy. An empirical reduction to a chain length-independent master curve is accessible for both dynamic quantities. The Dzugutov scheme predicts an increased value of the scaled diffusion coefficient with increasing chain length, which contradicts physical expectations. The origin of this trend can be traced back to the density dependence of the scaling factors. This finding has not been observed previously for Lennard-Jones chain systems (Macromolecules, 2013, 46, 8710-8723). Thus, it limits the applicability of the Dzugutov approach to polymers. In connection with diffusion coefficients and viscosities, the Rosenfeld scaling law appears to be of higher quality than the Dzugutov approach. An empirical excess entropy scaling is also proposed which leads to a chain length-independent correlation. It is expected to be valid for polymers in the Rouse regime.
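
    For reference, the two ingredients of the Rosenfeld scheme are the macroscopic reduction of the diffusion coefficient and the exponential scaling ansatz in the excess entropy s_ex. A sketch with illustrative constants (A and B would in practice be fitted to simulation data; none of these numbers come from the polyethylene study):

```python
import numpy as np

def reduced_diffusion(D, rho, m, kT):
    """Rosenfeld's macroscopic reduction: D* = D * rho^(1/3) * sqrt(m/kT)."""
    return D * rho**(1.0 / 3.0) * np.sqrt(m / kT)

# Exponential scaling ansatz D* = A * exp(B * s_ex), with s_ex the
# (negative) excess entropy per particle; A and B are illustrative.
A, B = 0.6, 0.8
s_ex = np.linspace(-3.0, -0.5, 6)
D_star = A * np.exp(B * s_ex)
```

    The reduction makes diffusion coefficients from different state points comparable, and the ansatz then collapses them onto a single curve in s_ex, which is the collapse the abstract tests for polymer melts.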

  5. Understanding and Forecasting Upper Atmosphere Nitric Oxide Through Data Mining Analysis of TIMED/SABER Data

    NASA Astrophysics Data System (ADS)

    Flynn, S.; Knipp, D. J.; Matsuo, T.; Mlynczak, M. G.; Hunt, L. A.

    2017-12-01

    Storm time energy input to the upper atmosphere is countered by infrared radiative emissions from nitric oxide (NO). The temporal profile of these energy sources and losses strongly control thermospheric density profiles, which in turn affect the drag experienced by low Earth orbiting satellites. Storm time processes create NO. In some extreme cases an overabundance of NO emissions unexpectedly decreases atmospheric temperature and density to lower than pre-storm values. Quantifying the spatial and temporal variability of the NO emissions using eigenmodes will increase the understanding of how upper atmospheric NO behaves, and could be used to increase the accuracy of future space weather and climate models. Thirteen years of NO flux data, observed at 100-250 km altitude by the SABER instrument onboard the TIMED satellite, is decomposed into five empirical orthogonal functions (EOFs) and their amplitudes to: 1) determine the strongest modes of variability in the data set, and 2) develop a compact model of NO flux. The first five EOFs account for 85% of the variability in the data, and their uncertainty is verified using cross-validation analysis. Although these linearly independent EOFs are not necessarily independent in a geophysical sense, the first three EOFs correlate strongly with different geophysical processes. The first EOF correlates strongly with Kp and F10.7, suggesting that geomagnetic storms and solar weather account for a large portion of NO flux variability. EOF 2 shows annual variations, and EOF 3 correlates with solar wind parameters. Using these relations, an empirical model of the EOF amplitudes can be derived, which could be used as a predictive tool for future NO emissions. We illustrate the NO model, highlight some of the hemispheric asymmetries, and discuss the geophysical associations of the EOFs.
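
    EOF decomposition of a space-time data matrix reduces to a singular value decomposition of the mean-removed data. A minimal sketch on synthetic data standing in for the gridded SABER NO fluxes (one dominant spatial mode plus noise; all parameters are illustrative):

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic space-time data: one dominant spatial pattern with a random
# time-varying amplitude, plus noise.
n_time, n_space = 200, 50
pattern = np.sin(np.linspace(0.0, np.pi, n_space))
amplitude = rng.normal(size=n_time)
data = np.outer(amplitude, pattern) + 0.1 * rng.normal(size=(n_time, n_space))

# EOF analysis: SVD of the time-mean-removed data matrix.
anomaly = data - data.mean(axis=0)
U, s, Vt = np.linalg.svd(anomaly, full_matrices=False)
eofs = Vt                        # rows are the spatial EOF patterns
pcs = U * s                      # time-dependent amplitudes of each mode
var_frac = s**2 / np.sum(s**2)   # fraction of variance per mode
```

    Truncating to the leading modes (five, in the study above) gives a compact model of the field, and the mode amplitudes are what get correlated against drivers such as Kp and F10.7.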

  6. [Association between obesity and DNA methylation among the 7-16 year-old twins].

    PubMed

    Li, C X; Gao, Y; Gao, W J; Yu, C Q; Lyu, J; Lyu, R R; Duan, J L; Sun, Y; Guo, X H; Wang, S F; Zhou, B; Wang, G; Cao, W H; Li, L M

    2018-04-10

Objective: On a whole-genome scale, we explored the correlation between obesity-related traits and DNA methylation sites, based on discordant monozygotic twin pairs. Methods: A total of 90 pairs of 6-17 year-old twins were recruited in the Chaoyang, Yanqing and Fangshan districts of Beijing in 2016. Information on the twins was gathered through a self-designed questionnaire and from physical examination, including height, weight and waist circumference of the subjects under study. DNA methylation was assayed on the Illumina Human Methylation EPIC BeadChip. R 3.3.1 was used to read the DNA methylation signals, with quality control on samples and probes. The ebayes function (empirical Bayes paired moderated t-test) was used to identify differentially methylated CpG sites (DMCs), and the varFit function (empirical Bayes paired moderated Levene test) was used to identify differentially variable CpG sites (DVCs) between the obese and normal groups. Results: According to the obesity discordance criteria, we retained 23 pairs of twins (age range 7 to 16 years), including 12 male pairs. A total of 817 471 qualified CpG loci were included in the genome-wide correlation analysis. With the significance level set at FDR <0.05, no sites met this standard. The DMC with the smallest P value (1.26E-06) was CpG site cg05684382 on chromosome 12, and the DVC with the smallest P value (6.44E-06) was CpG site cg26188191, in the CMIP gene on chromosome 16. Conclusions: In this study, we analyzed genome-wide DNA methylation and its correlation with obesity traits. After multiple-testing correction, no sites were found to be significantly associated with obesity. However, results from the correlation analysis suggested that sites cg05684382 (chr 12) and cg26188191 (chr 16) might play a role in the development of obesity. This study provides a methodological reference for studies based on discordant twin designs.

  7. Empirical links between natural mortality and recovery in marine fishes.

    PubMed

    Hutchings, Jeffrey A; Kuparinen, Anna

    2017-06-14

Probability of species recovery is thought to be correlated with specific aspects of organismal life history, such as age at maturity and longevity, and how these affect rates of natural mortality ( M ) and maximum per capita population growth ( r max ). Despite strong theoretical underpinnings, these correlates have been based on predicted rather than realized population trajectories following threat mitigation. Here, we examine the level of empirical support for postulated links between a suite of life-history traits (related to maturity, age, size and growth) and recovery in marine fishes. Following threat mitigation (median time since cessation of overfishing = 20 years), 71% of 55 temperate populations had fully recovered, the remainder exhibiting, on average, negligible change (impaired recovery). Singly, life-history traits did not influence recovery status. In combination, however, those that jointly reflect length-based mortality at maturity, M α , revealed that recovered populations have higher M α , which we hypothesize to reflect local adaptations associated with greater r max . But, within populations, the smaller sizes at maturity generated by overfishing are predicted to increase M α , slowing recovery and increasing its uncertainty. We conclude that recovery potential is greater for populations adapted to high M but that temporal increases in M concomitant with smaller size at maturity will have the opposite effect. The recovery metric documented here ( M α ) has a sound theoretical basis, is significantly correlated with direct estimates of M that directly reflect r max , is not reliant on data-intensive time series, can be readily estimated, and offers an empirically defensible correlate of recovery, given its clear links to the positive and impaired responses to threat mitigation that have been observed in fish populations over the past three decades. © 2017 The Author(s).

  8. Multiple data sources improve DNA-based mark-recapture population estimates of grizzly bears.

    PubMed

    Boulanger, John; Kendall, Katherine C; Stetz, Jeffrey B; Roon, David A; Waits, Lisette P; Paetkau, David

    2008-04-01

    A fundamental challenge to estimating population size with mark-recapture methods is heterogeneous capture probabilities and subsequent bias of population estimates. Confronting this problem usually requires substantial sampling effort that can be difficult to achieve for some species, such as carnivores. We developed a methodology that uses two data sources to deal with heterogeneity and applied this to DNA mark-recapture data from grizzly bears (Ursus arctos). We improved population estimates by incorporating additional DNA "captures" of grizzly bears obtained by collecting hair from unbaited bear rub trees concurrently with baited, grid-based, hair snag sampling. We consider a Lincoln-Petersen estimator with hair snag captures as the initial session and rub tree captures as the recapture session and develop an estimator in program MARK that treats hair snag and rub tree samples as successive sessions. Using empirical data from a large-scale project in the greater Glacier National Park, Montana, USA, area and simulation modeling we evaluate these methods and compare the results to hair-snag-only estimates. Empirical results indicate that, compared with hair-snag-only data, the joint hair-snag-rub-tree methods produce similar but more precise estimates if capture and recapture rates are reasonably high for both methods. Simulation results suggest that estimators are potentially affected by correlation of capture probabilities between sample types in the presence of heterogeneity. Overall, closed population Huggins-Pledger estimators showed the highest precision and were most robust to sparse data, heterogeneity, and capture probability correlation among sampling types. Results also indicate that these estimators can be used when a segment of the population has zero capture probability for one of the methods. We propose that this general methodology may be useful for other species in which mark-recapture data are available from multiple sources.
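The two-session structure described above reduces, in its simplest form, to a Lincoln-Petersen estimate with hair snags as the marking session and rub trees as the recapture session. The sketch below uses Chapman's bias-corrected form of that estimator; the capture counts are hypothetical, not the Glacier National Park data, and the full study used program MARK and Huggins-Pledger models rather than this closed-form estimate.

```python
def lincoln_petersen(n1, n2, m):
    """Chapman's bias-corrected Lincoln-Petersen population estimate:
    n1 individuals detected in session 1 (hair snags), n2 in session 2
    (rub trees), and m detected by both methods."""
    return (n1 + 1) * (n2 + 1) / (m + 1) - 1

# Hypothetical capture counts, for illustration only.
n_hat = lincoln_petersen(100, 80, 20)
```

Heterogeneous capture probabilities bias this estimator low when the same individuals are easy to catch by both methods, which is why the record emphasizes correlation of capture probabilities between sample types.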

  9. A computer model for liquid jet atomization in rocket thrust chambers

    NASA Astrophysics Data System (ADS)

    Giridharan, M. G.; Lee, J. G.; Krishnan, A.; Yang, H. Q.; Ibrahim, E.; Chuech, S.; Przekwas, A. J.

    1991-12-01

    The process of atomization has been used as an efficient means of burning liquid fuels in rocket engines, gas turbine engines, internal combustion engines, and industrial furnaces. Despite its widespread application, this complex hydrodynamic phenomenon has not been well understood, and predictive models for this process are still in their infancy. The difficulty in simulating the atomization process arises from the relatively large number of parameters that influence it, including the details of the injector geometry, liquid and gas turbulence, and the operating conditions. In this study, numerical models are developed from first principles, to quantify factors influencing atomization. For example, the surface wave dynamics theory is used for modeling the primary atomization and the droplet energy conservation principle is applied for modeling the secondary atomization. The use of empirical correlations has been minimized by shifting the analyses to fundamental levels. During applications of these models, parametric studies are performed to understand and correlate the influence of relevant parameters on the atomization process. The predictions of these models are compared with existing experimental data. The main tasks of this study were the following: development of a primary atomization model; development of a secondary atomization model; development of a model for impinging jets; development of a model for swirling jets; and coupling of the primary atomization model with a CFD code.

  10. Geopressure modeling from petrophysical data: An example from East Kalimantan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Herkommer, M.A.

    1994-07-01

Localized models of abnormal formation pressure (geopressure) are important economic and safety tools frequently used for well planning and drilling operations. Simplified computer-based procedures have been developed that permit these models to be developed more rapidly and with greater accuracy. These techniques are broadly applicable to basins throughout the world where abnormal formation pressures occur. An example from the Attaka field of East Kalimantan, southeast Asia, shows how geopressure models are developed. Using petrophysical and engineering data, empirical correlations between observed pressure and petrophysical logs can be created by computer-assisted data-fitting techniques. These correlations serve as the basis for models of the geopressure. By performing repeated analyses on wells at various locations, contour maps on the top of abnormal geopressure can be created. Methods that are simple in their development and application make the task of geopressure estimation less formidable to the geologist and petroleum engineer. Further, more accurate estimates can significantly improve drilling speeds while reducing the incidence of stuck pipe, kicks, and blowouts. In general, geopressure estimates are used in all phases of drilling operations: To develop mud plans and specify equipment ratings, to assist in the recognition of geopressured formations and determination of mud weights, and to improve predictions at offset locations and geologically comparable areas.

  11. Novel Multidimensional Cross-Correlation Data Comparison Techniques for Spectroscopic Discernment in a Volumetrically Sensitive, Moderating Type Neutron Spectrometer

    NASA Astrophysics Data System (ADS)

    Hoshor, Cory; Young, Stephan; Rogers, Brent; Currie, James; Oakes, Thomas; Scott, Paul; Miller, William; Caruso, Anthony

    2014-03-01

    A novel application of the Pearson Cross-Correlation to neutron spectral discernment in a moderating type neutron spectrometer is introduced. This cross-correlation analysis will be applied to spectral response data collected through both MCNP simulation and empirical measurement by the volumetrically sensitive spectrometer for comparison in 1, 2, and 3 spatial dimensions. The spectroscopic analysis methods discussed will be demonstrated to discern various common spectral and monoenergetic neutron sources.
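The Pearson cross-correlation approach described above amounts to scoring a measured detector response vector against a library of reference responses and picking the best match. The sketch below illustrates this in one spatial dimension; the response shapes and source names are hypothetical placeholders, not MCNP-simulated or measured spectrometer data.

```python
import numpy as np

def pearson(a, b):
    """Pearson correlation coefficient between two response vectors."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    a, b = a - a.mean(), b - b.mean()
    return (a @ b) / np.sqrt((a @ a) * (b @ b))

# Hypothetical library of normalized detector-response shapes per source.
library = {
    "Cf-252":  [0.10, 0.30, 0.40, 0.20],
    "AmBe":    [0.05, 0.15, 0.35, 0.45],
    "thermal": [0.60, 0.30, 0.08, 0.02],
}
measured = [0.09, 0.28, 0.42, 0.21]  # hypothetical measured response

# Discern the source as the library entry with the highest correlation.
best = max(library, key=lambda k: pearson(measured, library[k]))
```

In 2 or 3 spatial dimensions the same scoring applies after flattening the volumetric response arrays into vectors.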

  12. Property relationships of the physical infrastructure and the traffic flow networks

    NASA Astrophysics Data System (ADS)

    Zhou, Ta; Zou, Sheng-Rong; He, Da-Ren

    2010-03-01

We studied both empirically and analytically the correlation between the degrees or the clustering coefficients, respectively, of the networks in the physical infrastructure and the traffic flow layers in three Chinese transportation systems. The systems are the bus transportation systems in Beijing and Hangzhou, and the railway system in the mainland. It is found that the correlation between the degrees obeys a linear function, while the correlation between the clustering coefficients obeys a power law. A possible dynamic explanation of these rules is presented.

  13. Interest Rates and Coupon Bonds in Quantum Finance

    NASA Astrophysics Data System (ADS)

    Baaquie, Belal E.

    2009-09-01

    1. Synopsis; 2. Interest rates and coupon bonds; 3. Options and option theory; 4. Interest rate and coupon bond options; 5. Quantum field theory of bond forward interest rates; 6. Libor Market Model of interest rates; 7. Empirical analysis of forward interest rates; 8. Libor Market Model of interest rate options; 9. Numeraires for bond forward interest rates; 10. Empirical analysis of interest rate caps; 11. Coupon bond European and Asian options; 12. Empirical analysis of interest rate swaptions; 13. Correlation of coupon bond options; 14. Hedging interest rate options; 15. Interest rate Hamiltonian and option theory; 16. American options for coupon bonds and interest rates; 17. Hamiltonian derivation of coupon bond options; Appendixes; Glossaries; List of symbols; Reference; Index.

  14. Macro-evolutionary studies of cultural diversity: a review of empirical studies of cultural transmission and cultural adaptation.

    PubMed

    Mace, Ruth; Jordan, Fiona M

    2011-02-12

    A growing body of theoretical and empirical research has examined cultural transmission and adaptive cultural behaviour at the individual, within-group level. However, relatively few studies have tried to examine proximate transmission or test ultimate adaptive hypotheses about behavioural or cultural diversity at a between-societies macro-level. In both the history of anthropology and in present-day work, a common approach to examining adaptive behaviour at the macro-level has been through correlating various cultural traits with features of ecology. We discuss some difficulties with simple ecological associations, and then review cultural phylogenetic studies that have attempted to go beyond correlations to understand the underlying cultural evolutionary processes. We conclude with an example of a phylogenetically controlled approach to understanding proximate transmission pathways in Austronesian cultural diversity.

  15. Macro-evolutionary studies of cultural diversity: a review of empirical studies of cultural transmission and cultural adaptation

    PubMed Central

    Mace, Ruth; Jordan, Fiona M.

    2011-01-01

    A growing body of theoretical and empirical research has examined cultural transmission and adaptive cultural behaviour at the individual, within-group level. However, relatively few studies have tried to examine proximate transmission or test ultimate adaptive hypotheses about behavioural or cultural diversity at a between-societies macro-level. In both the history of anthropology and in present-day work, a common approach to examining adaptive behaviour at the macro-level has been through correlating various cultural traits with features of ecology. We discuss some difficulties with simple ecological associations, and then review cultural phylogenetic studies that have attempted to go beyond correlations to understand the underlying cultural evolutionary processes. We conclude with an example of a phylogenetically controlled approach to understanding proximate transmission pathways in Austronesian cultural diversity. PMID:21199844

  16. Predicting the effects of magnesium oxide nanoparticles and temperature on the thermal conductivity of water using artificial neural network and experimental data

    NASA Astrophysics Data System (ADS)

    Afrand, Masoud; Hemmat Esfe, Mohammad; Abedini, Ehsan; Teimouri, Hamid

    2017-03-01

The current paper first presents an empirical correlation based on experimental results for estimating the thermal conductivity enhancement of MgO-water nanofluid using a curve fitting method. Then, artificial neural networks (ANNs) with various numbers of neurons have been assessed, with temperature and MgO volume fraction as the input variables and thermal conductivity enhancement as the output variable, to select the most appropriate and optimized network. Results indicated that the network with 7 neurons had the minimum error. Eventually, the output of the artificial neural network was compared with the results of the proposed empirical correlation and those of the experiments. Comparisons revealed that ANN modeling was more accurate than the curve-fitting method in predicting the thermal conductivity enhancement of the nanofluid.
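The curve-fitting route that the record compares against ANN modeling can be sketched as a linear least-squares fit of enhancement against temperature and volume fraction. The data and the polynomial form below are synthetic assumptions for illustration; the paper's actual correlation and coefficients are not given in this record.

```python
import numpy as np

# Synthetic stand-in data: temperature (deg C), MgO volume fraction, and a
# noisy "measured" thermal conductivity ratio k_nf / k_water.
rng = np.random.default_rng(1)
T = rng.uniform(20, 60, 80)
phi = rng.uniform(0.0, 0.03, 80)
enhancement = 1 + 5.0 * phi + 0.002 * T * phi + 0.01 * rng.standard_normal(80)

# Assumed correlation form: enhancement ~ c0 + c1*phi + c2*T*phi.
A = np.column_stack([np.ones_like(T), phi, T * phi])
coef, *_ = np.linalg.lstsq(A, enhancement, rcond=None)

pred = A @ coef
rmse = np.sqrt(np.mean((pred - enhancement) ** 2))
```

An ANN replaces the fixed polynomial form with a learned nonlinear map of the same two inputs, which is why it can outperform the curve fit when the true response is not polynomial.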

  17. Investigation of pressure drop in capillary tube for mixed refrigerant Joule-Thomson cryocooler

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ardhapurkar, P. M.; Sridharan, Arunkumar; Atrey, M. D.

    2014-01-29

A capillary tube is commonly used in small capacity refrigeration and air-conditioning systems. It is also a preferred expansion device in mixed refrigerant Joule-Thomson (MR J-T) cryocoolers, since it is inexpensive and simple in configuration. However, the flow inside a capillary tube is complex, since the flashing process that occurs in refrigeration and air-conditioning systems is metastable. A mixture of refrigerants such as nitrogen, methane, ethane, propane and iso-butane expands below its inversion temperature in the capillary tube of the MR J-T cryocooler and reaches cryogenic temperature. The mass flow rate of refrigerant mixture circulating through the capillary tube depends on the pressure difference across it. There are many empirical correlations which predict pressure drop across the capillary tube. However, they have not been tested for refrigerant mixtures and for the operating conditions of the cryocooler. The present paper assesses the existing empirical correlations for predicting overall pressure drop across the capillary tube for the MR J-T cryocooler. The empirical correlations refer to homogeneous as well as separated flow models. Experiments are carried out to measure the overall pressure drop across the capillary tube for the cooler. Three different compositions of refrigerant mixture are used to study the pressure drop variations. The predicted overall pressure drop across the capillary tube is compared with the experimentally obtained value. The predictions obtained using homogeneous models show a better match with the experimental results than separated flow models.

  18. Generalized Bootstrap Method for Assessment of Uncertainty in Semivariogram Inference

    USGS Publications Warehouse

    Olea, R.A.; Pardo-Iguzquiza, E.

    2011-01-01

The semivariogram and its related function, the covariance, play a central role in classical geostatistics for modeling the average continuity of spatially correlated attributes. Whereas all methods are formulated in terms of the true semivariogram, in practice what can be used are estimated semivariograms and models based on samples. A generalized form of the bootstrap method to properly model spatially correlated data is used to advance knowledge about the reliability of empirical semivariograms and semivariogram models based on a single sample. Among several methods available to generate spatially correlated resamples, we selected a method based on the LU decomposition and used several examples to illustrate the approach. The first one is a synthetic, isotropic, exhaustive sample following a normal distribution, the second example is also a synthetic but following a non-Gaussian random field, and a third empirical sample consists of actual raingauge measurements. Results show wider confidence intervals than those found previously by others with inadequate application of the bootstrap. Also, even for the Gaussian example, distributions for estimated semivariogram values and model parameters are positively skewed. In this sense, bootstrap percentile confidence intervals, which are not centered around the empirical semivariogram and do not require distributional assumptions for its construction, provide an achieved coverage similar to the nominal coverage. The latter cannot be achieved by symmetrical confidence intervals based on the standard error, regardless if the standard error is estimated from a parametric equation or from bootstrap. © 2010 International Association for Mathematical Geosciences.
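The LU-decomposition resampling step mentioned above can be sketched as follows: build the covariance matrix implied by a semivariogram model, factor it, and multiply the factor by white noise to obtain spatially correlated resamples. This simplified sketch uses the Cholesky factor (the symmetric special case of LU for a covariance matrix) and an assumed exponential model on a 1-D transect, not the paper's actual examples.

```python
import numpy as np

rng = np.random.default_rng(2)
coords = np.linspace(0.0, 10.0, 40)   # 1-D sample locations (illustrative)

# Covariance implied by an assumed exponential semivariogram model.
sill, rang = 1.0, 3.0
h = np.abs(coords[:, None] - coords[None, :])   # pairwise distances
cov = sill * np.exp(-h / rang)

# Factor the covariance and generate 500 correlated Gaussian resamples.
L = np.linalg.cholesky(cov)
resamples = (L @ rng.standard_normal((40, 500))).T

# Each resample reproduces the model covariance on average.
emp_var = resamples.var(axis=0).mean()
```

Re-estimating the empirical semivariogram on each resample, then taking percentiles of the estimates, gives the bootstrap confidence intervals discussed in the record.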

  19. A summary of selected early results from the ERTS-1 menhaden experiment

    NASA Technical Reports Server (NTRS)

    Stevenson, W. H. (Principal Investigator); Kemmerer, A. J.; Benigno, J. A.; Reese, G. B.; Minkler, F. C.

    1973-01-01

The author has identified the following significant results. Imagery from the ERTS-1 satellite was used in conjunction with aerial photographically-sensed menhaden distribution information, sea truth oceanographic measurements, and commercial fishing information from an 8685 square kilometer study area in the north-central portion of the Gulf of Mexico to demonstrate relationships between selected oceanographic parameters and menhaden distribution, ERTS-1 imagery and menhaden distribution, and ERTS-1 imagery and oceanographic parameters. ERTS-1 MSS band 5 imagery density levels correlated with photographically detected menhaden distribution patterns and could be explained based on sea truth Secchi disc transparency and water depth measurements. These two parameters, together with surface salinity, Forel-Ule color, and chlorophyll-a, were also found to correlate significantly with menhaden distribution. Eight empirical models were developed which provided menhaden distribution predictions for the study area based on combinations of Secchi disc transparency, water depth, surface salinity, and Forel-Ule color measurements.

  20. Bayesian analysis of stochastic volatility-in-mean model with leverage and asymmetrically heavy-tailed error using generalized hyperbolic skew Student’s t-distribution

    PubMed Central

    Leão, William L.; Chen, Ming-Hui

    2017-01-01

    A stochastic volatility-in-mean model with correlated errors using the generalized hyperbolic skew Student-t (GHST) distribution provides a robust alternative to the parameter estimation for daily stock returns in the absence of normality. An efficient Markov chain Monte Carlo (MCMC) sampling algorithm is developed for parameter estimation. The deviance information, the Bayesian predictive information and the log-predictive score criterion are used to assess the fit of the proposed model. The proposed method is applied to an analysis of the daily stock return data from the Standard & Poor’s 500 index (S&P 500). The empirical results reveal that the stochastic volatility-in-mean model with correlated errors and GH-ST distribution leads to a significant improvement in the goodness-of-fit for the S&P 500 index returns dataset over the usual normal model. PMID:29333210

  1. Auditory hallucinations: nomenclature and classification.

    PubMed

    Blom, Jan Dirk; Sommer, Iris E C

    2010-03-01

    The literature on the possible neurobiologic correlates of auditory hallucinations is expanding rapidly. For an adequate understanding and linking of this emerging knowledge, a clear and uniform nomenclature is a prerequisite. The primary purpose of the present article is to provide an overview of the nomenclature and classification of auditory hallucinations. Relevant data were obtained from books, PubMed, Embase, and the Cochrane Library. The results are presented in the form of several classificatory arrangements of auditory hallucinations, governed by the principles of content, perceived source, perceived vivacity, relation to the sleep-wake cycle, and association with suspected neurobiologic correlates. This overview underscores the necessity to reappraise the concepts of auditory hallucinations developed during the era of classic psychiatry, to incorporate them into our current nomenclature and classification of auditory hallucinations, and to test them empirically with the aid of the structural and functional imaging techniques currently available.

  2. Absolute Measurement of the Refractive Index of Water by a Mode-Locked Laser at 518 nm.

    PubMed

    Meng, Zhaopeng; Zhai, Xiaoyu; Wei, Jianguo; Wang, Zhiyang; Wu, Hanzhong

    2018-04-09

In this paper, we demonstrate a method using a frequency comb, which can precisely measure the refractive index of water. We have developed a simple system, in which a Michelson interferometer is placed into a quartz-glass container with a low expansion coefficient, and for which compensation of the thermal expansion of the water container is not required. By scanning a mirror on a moving stage, a pair of cross-correlation patterns can be generated. We can obtain the length information via these cross-correlation patterns, with or without water in the container. The refractive index of water can be measured by the resulting lengths. Long-term experimental results show that our method can measure the refractive index of water with a high degree of accuracy; measurement uncertainty at the 10⁻⁵ level has been achieved, compared with the values calculated by the empirical formula.
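The length-ratio idea behind the measurement can be reduced to a one-line calculation: the mirror displacement needed to recover the cross-correlation pattern scales with the optical path, so the (group) refractive index follows from the two measured lengths with and without water. The numbers below are illustrative placeholders, not the paper's measurements.

```python
# Measured equivalent path lengths from the two cross-correlation scans
# (hypothetical values): without water the path is geometric; with water
# the optical path is stretched by the refractive index.
geometric_path = 0.100000          # container length without water, m
optical_path_in_water = 0.133300   # equivalent path measured with water, m

n_water = optical_path_in_water / geometric_path  # group refractive index
```

Because only the ratio of the two lengths enters, common-mode drifts of the container cancel, which is why thermal-expansion compensation is not required.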

  3. Absolute Measurement of the Refractive Index of Water by a Mode-Locked Laser at 518 nm

    PubMed Central

    Meng, Zhaopeng; Zhai, Xiaoyu; Wei, Jianguo; Wang, Zhiyang; Wu, Hanzhong

    2018-01-01

In this paper, we demonstrate a method using a frequency comb, which can precisely measure the refractive index of water. We have developed a simple system, in which a Michelson interferometer is placed into a quartz-glass container with a low expansion coefficient, and for which compensation of the thermal expansion of the water container is not required. By scanning a mirror on a moving stage, a pair of cross-correlation patterns can be generated. We can obtain the length information via these cross-correlation patterns, with or without water in the container. The refractive index of water can be measured by the resulting lengths. Long-term experimental results show that our method can measure the refractive index of water with a high degree of accuracy; measurement uncertainty at the 10⁻⁵ level has been achieved, compared with the values calculated by the empirical formula. PMID:29642518

  4. Bayesian analysis of stochastic volatility-in-mean model with leverage and asymmetrically heavy-tailed error using generalized hyperbolic skew Student's t-distribution.

    PubMed

    Leão, William L; Abanto-Valle, Carlos A; Chen, Ming-Hui

    2017-01-01

    A stochastic volatility-in-mean model with correlated errors using the generalized hyperbolic skew Student-t (GHST) distribution provides a robust alternative to the parameter estimation for daily stock returns in the absence of normality. An efficient Markov chain Monte Carlo (MCMC) sampling algorithm is developed for parameter estimation. The deviance information, the Bayesian predictive information and the log-predictive score criterion are used to assess the fit of the proposed model. The proposed method is applied to an analysis of the daily stock return data from the Standard & Poor's 500 index (S&P 500). The empirical results reveal that the stochastic volatility-in-mean model with correlated errors and GH-ST distribution leads to a significant improvement in the goodness-of-fit for the S&P 500 index returns dataset over the usual normal model.

  5. Empirical seasonal forecasts of the NAO

    NASA Astrophysics Data System (ADS)

    Sanchezgomez, E.; Ortizbevia, M.

    2003-04-01

We present here seasonal forecasts of the North Atlantic Oscillation (NAO) issued from ocean predictors with an empirical procedure. The Singular Value Decomposition (SVD) of the cross-correlation matrix between the predictor and predictand fields, at the lag used for the forecast lead, is at the core of the empirical model. The main predictor field is sea surface temperature anomalies, although sea ice cover anomalies are also used. Forecasts are issued in probabilistic form. The model is an improvement over a previous version (1), where sea level pressure anomalies were first forecast and the NAO index was built from this forecast field. Both the correlation skill between the forecast and observed fields and the number of forecasts that hit the correct NAO sign are used to assess the forecast performance, which is usually above that of forecasts issued assuming persistence. For certain seasons and/or leads, values of the skill are above the .7 usefulness threshold. References (1) SanchezGomez, E. and Ortiz Bevia M., 2002, Estimacion de la evolucion pluviometrica de la Espana Seca atendiendo a diversos pronosticos empiricos de la NAO, in 'El Agua y el Clima', Publicaciones de la AEC, Serie A, N 3, pp 63-73, Palma de Mallorca, Spain

  6. Measurement and correlation of the solubility of gossypol acetic acid and gossypol acetic acid of optical activity in different solvents

    NASA Astrophysics Data System (ADS)

    Zhang, B.; Tang, H.; Liu, X. Y.; Zhai, X.; Yao, X. C.

    2018-01-01

The equilibrium method was used to measure the solubility of gossypol acetic acid and gossypol acetic acid of optical activity in isopropyl alcohol, ethanol, acetic acid and ethyl acetate at temperatures from 288.15 to 315.15 K. An empirical equation and the Apelblat equation were adopted to correlate the experimental data. For gossypol acetic acid, the root-mean-square deviations (RMSD) were in the range of 0.023-4.979 and 0.0112-0.614 for the empirical equation and the Apelblat equation, respectively. For gossypol acetic acid of optical activity, the RMSD were in the range of 0.021-2.211 and 0.021-2.243 for the empirical equation and the Apelblat equation, respectively. The maximum relative average deviation was 7.5%. Both equations offered an accurate mathematical expression of the experimental results. The calculated solubility showed a good relationship with the experimental solubility for most of the solvents. This study provides valuable data not only for optimizing the purification of gossypol acetic acid of optical activity in industry but also for further theoretical studies.
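The Apelblat correlation used above has the form ln x = A + B/T + C ln T, which is linear in the parameters and can be fitted by least squares. The sketch below demonstrates the fit on synthetic solubility values; the mole-fraction data are placeholders, not the measured gossypol acetic acid solubilities.

```python
import numpy as np

# Hypothetical mole-fraction solubility x at temperatures T (K).
T = np.array([288.15, 293.15, 298.15, 303.15, 308.15, 313.15])
x = np.array([0.010, 0.013, 0.017, 0.022, 0.028, 0.035])

# Modified Apelblat equation: ln x = A + B/T + C*ln(T); linear least squares.
M = np.column_stack([np.ones_like(T), 1.0 / T, np.log(T)])
A_, B_, C_ = np.linalg.lstsq(M, np.log(x), rcond=None)[0]

# Back-calculated solubility and root-mean-square deviation.
x_calc = np.exp(A_ + B_ / T + C_ * np.log(T))
rmsd = np.sqrt(np.mean((x_calc - x) ** 2))
```

Comparing the RMSD of this fit against that of a simpler empirical equation is exactly the model comparison the record reports.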

  7. A generalization of random matrix theory and its application to statistical physics.

    PubMed

    Wang, Duan; Zhang, Xin; Horvatic, Davor; Podobnik, Boris; Eugene Stanley, H

    2017-02-01

To study the statistical structure of cross-correlations in empirical data, we generalize random matrix theory and propose a new method of cross-correlation analysis, known as autoregressive random matrix theory (ARRMT). ARRMT takes into account the influence of auto-correlations in the study of cross-correlations in multiple time series. We first analytically and numerically determine how auto-correlations affect the eigenvalue distribution of the correlation matrix. Then we introduce ARRMT with a detailed procedure of how to implement the method. Finally, we illustrate the method using two examples: inflation rates and air pressure data for 95 US cities.
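The effect ARRMT corrects for can be demonstrated numerically: auto-correlated time series broaden the eigenvalue spectrum of the empirical cross-correlation matrix relative to i.i.d. series, even when there is no true cross-correlation. This sketch uses synthetic AR(1) series; the dimensions and AR coefficient are illustrative assumptions, and it shows only the phenomenon, not the ARRMT procedure itself.

```python
import numpy as np

rng = np.random.default_rng(3)
n_series, n_obs, phi = 50, 400, 0.8

# Independent i.i.d. series vs. independent AR(1) series with the same length.
iid = rng.standard_normal((n_series, n_obs))
ar1 = np.empty_like(iid)
ar1[:, 0] = rng.standard_normal(n_series)
for t in range(1, n_obs):
    ar1[:, t] = phi * ar1[:, t - 1] + rng.standard_normal(n_series)

def max_eig(x):
    """Largest eigenvalue of the empirical cross-correlation matrix."""
    return np.linalg.eigvalsh(np.corrcoef(x)).max()

# Auto-correlation inflates the apparent cross-correlation structure.
lam_iid, lam_ar1 = max_eig(iid), max_eig(ar1)
```

Testing eigenvalues against the i.i.d. (Marchenko-Pastur) bound would therefore flag spurious structure in the AR(1) case, which is the motivation for the autoregressive generalization.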

  8. Specific prognostic factors for secondary pancreatic infection in severe acute pancreatitis.

    PubMed

    Armengol-Carrasco, M; Oller, B; Escudero, L E; Roca, J; Gener, J; Rodríguez, N; del Moral, P; Moreno, P

    1999-01-01

The aim of the present study was to investigate whether there are specific prognostic factors that predict the development of secondary pancreatic infection (SPI) in severe acute pancreatitis, in order to perform computed tomography-fine needle aspiration with bacteriological sampling at the right moment and confirm the diagnosis. Twenty-five clinical and laboratory parameters were determined sequentially in 150 patients with severe acute pancreatitis (SAP), and univariate and multivariate regression analyses were performed, looking for correlation with the development of SPI. Only the APACHE II score and C-reactive protein levels were related to the development of SPI in the multivariate analysis. A regression equation was designed using these two parameters, and empiric cut-off points defined the subgroup of patients at high risk of developing secondary pancreatic infection. The results showed that it is possible to predict SPI during SAP, allowing bacteriological confirmation and early treatment of this severe condition.
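The two-predictor screening rule described above can be sketched as a regression-style risk score over APACHE II and C-reactive protein with an empirical cut-off. The coefficients and thresholds below are hypothetical illustrations only; the study's actual regression equation and cut-off points are not given in this record.

```python
import math

def infection_risk(apache_ii, crp_mg_l, w0=-6.0, w_apache=0.20, w_crp=0.015):
    """Logistic-style risk of secondary pancreatic infection.
    All weights are hypothetical, chosen only to illustrate the rule."""
    z = w0 + w_apache * apache_ii + w_crp * crp_mg_l
    return 1.0 / (1.0 + math.exp(-z))

def high_risk(apache_ii, crp_mg_l, cutoff=0.5):
    """Flag patients above an empirical cut-off for CT-FNA sampling."""
    return infection_risk(apache_ii, crp_mg_l) >= cutoff

flag = high_risk(22, 250)  # hypothetical severe case: high APACHE II, high CRP
```

In practice such a score would be fitted by multivariate logistic regression on the sequential measurements and the cut-off chosen from the observed sensitivity/specificity trade-off.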

  9. Happy software developers solve problems better: psychological measurements in empirical software engineering.

    PubMed

    Graziotin, Daniel; Wang, Xiaofeng; Abrahamsson, Pekka

    2014-01-01

    For more than thirty years, it has been claimed that a way to improve software developers' productivity and software quality is to focus on people and to provide incentives to make developers satisfied and happy. This claim has rarely been verified in software engineering research, which faces an additional challenge in comparison to more traditional engineering fields: software development is an intellectual activity and is dominated by often-neglected human factors (called human aspects in software engineering research). Among the many skills required for software development, developers must possess high analytical problem-solving skills and creativity for the software construction process. According to psychology research, affective states (emotions and moods) deeply influence the cognitive processing abilities and performance of workers, including creativity and analytical problem solving. Nonetheless, little research has investigated the correlation between the affective states, creativity, and analytical problem-solving performance of programmers. This article echoes the call to employ psychological measurements in software engineering research. We report a study with 42 participants to investigate the relationship between the affective states, creativity, and analytical problem-solving skills of software developers. The results offer support for the claim that happy developers are indeed better problem solvers in terms of their analytical abilities. The following contributions are made by this study: (1) providing a better understanding of the impact of affective states on the creativity and analytical problem-solving capacities of developers, (2) introducing and validating psychological measurements, theories, and concepts of affective states, creativity, and analytical-problem-solving skills in empirical software engineering, and (3) raising the need for studying the human factors of software engineering by employing a multidisciplinary viewpoint.

  10. Artifact interactions retard technological improvement: An empirical study

    PubMed Central

    Magee, Christopher L.

    2017-01-01

    Empirical research has shown that performance improvement in many different technological domains occurs exponentially, but with widely varying improvement rates. What causes some technologies to improve faster than others? Previous quantitative modeling research has identified artifact interactions, where a design change in one component influences others, as an important determinant of improvement rates. The models predict that the improvement rate for a domain is proportional to the inverse of the domain’s interaction parameter. However, no empirical research has previously studied and tested the dependence of improvement rates on artifact interactions. A challenge to testing the dependence is that any method for measuring interactions has to be applicable to a wide variety of technologies. Here we propose a novel patent-based method that is both technology-domain-agnostic and less costly than alternative methods. We use textual content from patent sets in 27 domains to find the influence of interactions on improvement rates. Qualitative analysis identified six specific keywords that signal artifact interactions. Patent sets from each domain were then examined to determine the total count of these six keywords in each domain, giving an estimate of artifact interactions in each domain. Improvement rates are found to be positively correlated with the inverse of the total keyword count, with a Pearson correlation coefficient of +0.56 and a p-value of 0.002. The results agree with the model predictions and provide, for the first time, empirical evidence that artifact interactions have a retarding effect on the improvement rates of technological domains. PMID:28777798
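As a sketch of the headline statistic, the Pearson correlation between improvement rates and inverse keyword counts can be computed as below. The six (rate, keyword count) pairs are invented toy values, not the study's 27-domain data.

```python
# Hedged sketch of the study's headline statistic: Pearson correlation between
# domain improvement rates and the inverse of interaction-keyword counts.
# All numbers here are made-up toy data for illustration only.
import math

rates = [0.03, 0.08, 0.15, 0.22, 0.35, 0.50]       # hypothetical yearly rates
keyword_counts = [900, 400, 250, 150, 90, 60]      # hypothetical keyword totals
inv = [1.0 / k for k in keyword_counts]            # inverse counts

def pearson(xs, ys):
    """Plain Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

r = pearson(rates, inv)
print(f"r = {r:.2f}")  # positive: faster-improving domains have fewer keywords
```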

  11. Differential correlation of suicide and homicide rates according to geographical areas: A study with population-level data.

    PubMed

    Fountoulakis, Konstantinos N; Gonda, Xenia

    2017-03-01

    The current study investigated the relationship between suicide and homicide rates internationally. WHO mortality data for 82 countries concerning suicides and homicides, with cancer and traffic accidents as controls, were used. The analysis included Pearson correlation and multiple linear regression analysis. Worldwide, homicide rates explained 55.42%, 43.86% and 41.7% of male and 22.0%, 22.14% and 13.25% of female suicides for 2000, 2005 and 2010, respectively. In Europe there was a positive correlation between male suicide rates and all homicide rates, including homicide rates in both genders, in male victims, and in female victims. In the Americas there was no significant correlation. In Asia there was a significant correlation of male suicide rates only with homicide rates of female victims. We observed marked and interesting differences in the pattern of association between Europe and the Americas. Overall, the current paper suggests that, at least in some human populations, suicidality and homicidality share common etiopathogenetic substrates and could be triggered by the same internal or external events, or might develop based on a common genetic background. Empirically, it has been suggested that suicide is related to higher living standards while murder is related to poor quality of life and lower living standards. Copyright © 2017 Elsevier Ireland Ltd. All rights reserved.

  12. Empirical Evaluation Indicators in Thai Higher Education: Theory-Based Multidimensional Learners' Assessment

    ERIC Educational Resources Information Center

    Sritanyarat, Dawisa; Russ-Eft, Darlene

    2016-01-01

    This study proposed empirical indicators which can be validated and adopted in higher education institutions to evaluate the quality of teaching and learning, and to serve as evaluation criteria for human resource management and development in higher education institutions in Thailand. The main purpose of this study was to develop empirical indicators of a…

  13. Compare diagnostic tests using transformation-invariant smoothed ROC curves⋆

    PubMed Central

    Tang, Liansheng; Du, Pang; Wu, Chengqing

    2012-01-01

    The receiver operating characteristic (ROC) curve, which plots the true positive rate against the false positive rate as the decision threshold varies, is an important tool for evaluating biomarkers in diagnostic medicine studies. By definition, the ROC curve is monotone increasing from 0 to 1 and is invariant to any monotone transformation of test results, and it is often smooth when test results from the diseased and non-diseased subjects follow continuous distributions. Most existing ROC curve estimation methods do not guarantee all of these properties. One of the exceptions is Du and Tang (2009), which applies a monotone spline regression procedure to empirical ROC estimates. However, their method does not consider the inherent correlations between empirical ROC estimates, which makes the derivation of the asymptotic properties very difficult. In this paper we propose a penalized weighted least squares estimation method, which incorporates the covariance between empirical ROC estimates as a weight matrix. The resulting estimator satisfies all the aforementioned properties, and we show that it is also consistent. A resampling approach is then used to extend our method to comparisons of two or more diagnostic tests. Our simulations show a significantly improved performance over the existing method, especially for steep ROC curves. We then apply the proposed method to a cancer diagnostic study that compares several newly developed diagnostic biomarkers to a traditional one. PMID:22639484
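The two ROC properties the abstract emphasizes, monotonicity and invariance under monotone transformations of test results, can be checked directly on the raw empirical ROC estimates the paper builds on. The sketch below is a generic empirical ROC computation with made-up Gaussian scores, not the paper's penalized weighted least squares estimator.

```python
# Empirical ROC sketch: check the invariance property cited in the abstract.
# A strictly monotone transform (here exp) of test results leaves the
# empirical ROC curve unchanged. Toy Gaussian scores, not study data.
import numpy as np

def empirical_roc(neg, pos):
    """Return (fpr, tpr) evaluated at every observed threshold, ascending."""
    thresholds = np.unique(np.concatenate([neg, pos]))
    fpr = np.array([(neg >= t).mean() for t in thresholds])
    tpr = np.array([(pos >= t).mean() for t in thresholds])
    return fpr, tpr

rng = np.random.default_rng(1)
neg = rng.normal(0.0, 1.0, 200)   # non-diseased scores
pos = rng.normal(1.0, 1.0, 200)   # diseased scores

fpr1, tpr1 = empirical_roc(neg, pos)
fpr2, tpr2 = empirical_roc(np.exp(neg), np.exp(pos))  # monotone transform

# Identical curves before and after the transform
print(np.allclose(fpr1, fpr2) and np.allclose(tpr1, tpr2))
```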

  14. The rate of bubble growth in a superheated liquid in pool boiling

    NASA Astrophysics Data System (ADS)

    Abdollahi, Mohammad Reza; Jafarian, Mehdi; Jamialahmadi, Mohammad

    2017-12-01

    A semi-empirical model for the estimation of the rate of bubble growth in nucleate pool boiling is presented, considering a new equation to estimate the temperature history of the bubble in the bulk of the liquid. The conservation equations of energy, mass and momentum are first derived and solved analytically. The present analytical model predicts that the bubble radius grows as a function of √t·erf(N√t), whereas in previous studies the bubble growth rate has mainly been correlated with √t alone. In the next step, the analytical solutions were used to develop a new semi-empirical equation. To achieve this, the analytical solutions were first non-dimensionalised, and then experimental data available in the literature were applied to tune the dimensionless coefficients appearing in the dimensionless equation. Finally, the reliability of the proposed semi-empirical model was assessed by comparing the model predictions with experimental data from the literature that were not used in tuning the dimensionless parameters of the model. A comparison with other models proposed in the literature was also performed. These comparisons show that the model enables more accurate predictions than previously proposed models, with a deviation of less than 10% over a wide range of operating conditions.
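The distinction between the classical √t growth law and the model's √t·erf(N√t) form can be made concrete numerically. The value of N below is an arbitrary illustrative constant, not a fitted model parameter.

```python
# Compare the classical R ∝ sqrt(t) growth law with the abstract's
# R ∝ sqrt(t) * erf(N * sqrt(t)) form. N is an arbitrary illustrative value.
import math

N = 2.0
for t in (0.01, 0.1, 1.0, 10.0):
    classic = math.sqrt(t)
    model = math.sqrt(t) * math.erf(N * math.sqrt(t))
    print(f"t={t:>5}: sqrt(t)={classic:.3f}  model={model:.3f}")
# At early times the erf factor suppresses growth; for large t it approaches
# 1 and the two laws coincide.
```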

  15. Density-functional theory based on the electron distribution on the energy coordinate

    NASA Astrophysics Data System (ADS)

    Takahashi, Hideaki

    2018-03-01

    We developed an electronic density functional theory utilizing a novel electron distribution n(ε) as the basic variable to compute the ground state energy of a system. n(ε) is obtained by projecting the electron density n(r) defined on the space coordinate r onto the energy coordinate ε specified by the external potential v_ext(r) of interest. It was demonstrated that the Kohn-Sham equation can also be formulated with an exchange-correlation functional E_xc[n(ε)] that employs the density n(ε) as its argument. It turned out that an exchange functional proposed in our preliminary development suffices to describe properly the potential energies of several types of chemical bonds, with accuracies comparable to the corresponding functional based on the local density approximation. As a remarkable feature, the distribution n(ε) inherently involves the spatially non-local information of the exchange hole at the bond dissociation limit, in contrast to conventional approximate functionals. By taking advantage of this property we also developed a prototype of the static correlation functional E_sc including no empirical parameters, which showed marked improvements in describing the dissociation of covalent bonds in H2, C2H4 and CH4 molecules.

  16. Development of Erosive Burning Models for CFD Predictions of Solid Rocket Motor Internal Environments

    NASA Technical Reports Server (NTRS)

    Wang, Qun-Zhen

    2003-01-01

    Four erosive burning models, equations (11) to (14), are developed in this work by using a power law relationship to correlate (1) the erosive burning ratio and the local velocity gradient at the propellant surface; (2) the erosive burning ratio and the velocity gradient divided by the centerline velocity; (3) the erosive burning difference and the local velocity gradient at the propellant surface; and (4) the erosive burning difference and the velocity gradient divided by the centerline velocity. These models depend only on the local velocity gradient at the propellant surface (or the velocity gradient divided by the centerline velocity) and, unlike other empirical models, are independent of motor size. It is argued that, since erosive burning is a local phenomenon occurring near the surface of the solid propellant, the erosive burning ratio should be independent of the bore diameter if it is correlated with local flow parameters such as the velocity gradient at the propellant surface. This appears to hold, considering the good results obtained by applying these models, developed from small-scale 5-inch CP tandem motor testing, to CFD simulations of much bigger motors.
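The power-law correlation form described above can be recovered from data by a log-log least-squares fit. The sketch below uses synthetic (velocity gradient, burning ratio) pairs generated from an assumed r = 0.2·g^0.6 relationship, not actual motor data.

```python
# Sketch of fitting a power-law correlation r = a * g**b, the functional form
# the abstract describes, via least squares in log-log space.
# The (g, r) pairs are synthetic, generated from r = 0.2 * g**0.6 and rounded;
# they are not solid rocket motor measurements.
import math

data = [(10, 0.80), (50, 2.09), (100, 3.17), (500, 8.30), (1000, 12.6)]

logs = [(math.log(g), math.log(r)) for g, r in data]
n = len(logs)
mx = sum(x for x, _ in logs) / n
my = sum(y for _, y in logs) / n
# slope b and intercept exp(...) of the log-log regression line
b = sum((x - mx) * (y - my) for x, y in logs) / sum((x - mx) ** 2 for x, _ in logs)
a = math.exp(my - b * mx)
print(f"fitted correlation: r = {a:.2f} * g^{b:.2f}")
```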

  17. Empirical Studies on Correlations between Lexical Knowledge and English Proficiency of Chinese EFL Learners in Mainland China over the Past Two Decades

    ERIC Educational Resources Information Center

    Zhou, Yao; Dai, Zhongxin

    2016-01-01

    Knowledge of English vocabulary contributes to a learner's proficiency in English as a foreign language, but how the learner's lexical knowledge behaves in that contribution is less clear. Researchers in mainland China have conducted studies of various kinds in order to find out how the learner's lexical knowledge correlates with his or her proficiency. This article…

  18. ASSESSING THE UNCERTAINTY OF NUCLEAR DETERRENCE

    DTIC Science & Technology

    2017-04-22

    empirical attempts. From both qualitative and quantitative perspectives, this paper finds cause to question the certainty that nuclear deterrence will...suggests nuclear weapons do indeed possess a higher deterrence effect than conventional forces alone. Data from the “Correlates of War” data set was...certainly do not provide an absolute deterrent against aggression. While nuclear weapons appear to be correlated with a reduction in the occurrences

  19. Proverb interpretation in schizophrenia: the significance of symptomatology and cognitive processes.

    PubMed

    Sponheim, Scott R; Surerus-Johnson, Christa; Leskela, Jennie; Dieperink, Michael E

    2003-12-15

    Although clinicians have patients interpret proverbs in mental status exams for psychosis, there are few empirical studies investigating the significance of proverb interpretation. In schizophrenia patients, we found that abstraction correlated positively with overall intelligence but with no symptom measures; that concreteness correlated negatively with overall intelligence, executive functioning, attention, and memory; and that bizarre-idiosyncratic responses were associated with positive formal thought disorder but with no cognitive functions.

  20. Spatial correlations in polydisperse, frictionless, two-dimensional packings

    NASA Astrophysics Data System (ADS)

    O'Donovan, C. B.; Möbius, M. E.

    2011-08-01

    We investigate next-nearest-neighbor correlations of the contact number in simulations of polydisperse, frictionless packings in two dimensions. We find that disks with few contacting neighbors are predominantly in contact with disks that have many neighbors and vice versa at all packing fractions. This counterintuitive result can be explained by drawing a direct analogy to the Aboav-Weaire law in cellular structures. We find an empirical one parameter relation similar to the Aboav-Weaire law that satisfies an exact sum rule constraint. Surprisingly, there are no correlations in the radii between neighboring particles, despite correlations between contact number and radius.

  1. Solid motor aft closure insulation erosion. [heat flux correlation for rate analysis]

    NASA Technical Reports Server (NTRS)

    Stampfl, E.; Landsbaum, E. M.

    1973-01-01

    The erosion rate of aft closure insulation in a number of large solid propellant motors was empirically analyzed by correlating the average ablation rate with a number of variables that had previously been demonstrated to affect heat flux. The main correlating parameter was a heat flux based on the simplified Bartz heat transfer coefficient corrected for two-dimensional effects. A multiplying group contained terms related to port-to-throat ratio, local wall angle, grain geometry and nozzle cant angle. The resulting equation gave a good correlation and is a useful design tool.

  2. Experimental Study on Cooling Heat Transfer of Supercritical Carbon Dioxide Inside Horizontal Micro-Fin Tubes

    NASA Astrophysics Data System (ADS)

    Kuwahara, Ken; Higashiiu, Shinya; Ito, Daisuke; Koyama, Shigeru

    This paper deals with an experimental study on cooling heat transfer of supercritical carbon dioxide inside micro-fin tubes. The geometrical parameters of the micro-fin tubes used in the present study are 6.02 mm in outer diameter, 4.76 mm to 5.11 mm in average inner diameter, 0.15 mm to 0.24 mm in fin height, 5° to 25° in helix angle, 46 to 52 in number of fins and 1.4 to 2.3 in area expansion ratio. Heat transfer coefficients were measured at 8-10 MPa in pressure, 360-690 kg/(m²·s) in mass velocity and 20-75 °C in CO2 temperature. The measured heat transfer coefficients of the micro-fin tubes were 1.4 to 2 times higher than those of a smooth tube having 4.42 mm in inner diameter. The heat transfer coefficients predicted using the correlation equation developed for single-phase turbulent flow inside micro-fin tubes showed large deviations from the measured values. A new correlation for predicting the cooling heat transfer coefficient of supercritical carbon dioxide inside micro-fin tubes was therefore developed empirically, taking into account the shape of the fins. This correlation agrees with almost all of the experimental data to within ±20%.

  3. Statistical analysis of geomagnetic field intensity differences between ASM and VFM instruments onboard Swarm constellation

    NASA Astrophysics Data System (ADS)

    De Michelis, Paola; Tozzi, Roberta; Consolini, Giuseppe

    2017-02-01

    From the very first measurements made by the magnetometers onboard the Swarm satellites launched by the European Space Agency (ESA) in late 2013, a discrepancy emerged between scalar and vector measurements. A careful analysis of this phenomenon led to an empirical model of the disturbance, which is highly correlated with the Sun incidence angle, and to a corresponding correction of the vector data. The empirical model adopted by ESA results in a significant decrease in the amplitude of the disturbance affecting VFM measurements, greatly improving the quality of the vector magnetic data. This study focuses on the characterization of the difference between the magnetic field intensity measured by the absolute scalar magnetometer (ASM) and that reconstructed using the vector field magnetometer (VFM) installed on the Swarm constellation. Applying the empirical mode decomposition method, we find the intrinsic mode functions (IMFs) associated with the ASM-VFM total intensity differences obtained with data both uncorrected and corrected for the disturbance correlated with the Sun incidence angle. Surprisingly, no differences are found in the nature of the IMFs embedded in the analyzed signals, as these IMFs are characterized by the same dominant periodicities before and after correction. The effect of the correction manifests itself as a decrease in the energy associated with some of the IMFs contributing to the corrected data. Some IMFs identified by analyzing the ASM-VFM intensity discrepancy are characterized by the same dominant periodicities as those obtained by analyzing the temperature fluctuations of the VFM electronic unit. Thus, the disturbance correlated with the Sun incidence angle could still be present in the corrected magnetic data. Furthermore, the ASM-VFM total intensity difference and the VFM electronic unit temperature display maximal shared information with a time delay that depends on local time. Taken together, these findings may help to relate the features of the observed VFM-ASM total intensity difference to the physical characteristics of the real disturbance, thus contributing to improving the empirical model proposed for the correction of the data.

  4. A test of the hypothesis that correlational selection generates genetic correlations.

    PubMed

    Roff, Derek A; Fairbairn, Daphne J

    2012-09-01

    Theory predicts that correlational selection on two traits will cause the major axis of the bivariate G matrix to orient itself in the same direction as the correlational selection gradient. Two testable predictions follow from this: for a given pair of traits, (1) the sign of correlational selection gradient should be the same as that of the genetic correlation, and (2) the correlational selection gradient should be positively correlated with the value of the genetic correlation. We test this hypothesis with a meta-analysis utilizing empirical estimates of correlational selection gradients and measures of the correlation between the two focal traits. Our results are consistent with both predictions and hence support the underlying hypothesis that correlational selection generates a genetic correlation between the two traits and hence orients the bivariate G matrix. © 2012 The Author(s). Evolution© 2012 The Society for the Study of Evolution.

  5. Are Economic Development and Education Improvement Associated with Participation in Transnational Terrorism?

    PubMed

    Elbakidze, L; Jin, Y H

    2015-08-01

    Using transnational terrorism data from 1980 to 2000, this study empirically examines the relationships between frequency of participation in transnational terrorism acts and economic development and education improvement. We find an inverse U-shaped association between the frequency of various nationals acting as perpetrators in transnational terrorism acts and per capita income in their respective home countries. As per capita incomes increase from relatively low levels, frequencies of participation in transnational terrorism increase. However, at sufficiently higher levels of per capita income, further increase in per capita income is negatively associated with the rate of participation in transnational terrorism. Education improvement from elementary to secondary is positively correlated with frequency of participation in transnational terrorism events, whereas further improvement from secondary to tertiary level is negatively correlated with participation in transnational terrorism. We also find that citizens of countries with greater openness to international trade, lower degree of income inequality, greater economic freedom, larger proportion of population with tertiary education, and less religious prevalence participate in transnational terrorism events less frequently. © 2015 Society for Risk Analysis.

  6. Developing and testing the CHORDS: Characteristics of Responsible Drinking Survey.

    PubMed

    Barry, Adam E; Goodson, Patricia

    2011-01-01

    Report on the development and psychometric testing of a theoretically and evidence-grounded instrument, the Characteristics of Responsible Drinking Survey (CHORDS). The instrument was subjected to four phases of pretesting (cognitive validity, cognitive and motivational qualities, pilot test, and item evaluation) and a final posttest implementation. Setting: a large public university in Texas. Participants: a randomly selected convenience sample (n = 729) of currently enrolled students. This 78-item questionnaire measures individuals' responsible drinking beliefs, motivations, intentions, and behaviors. Cronbach's α, split-half reliability, principal components analysis and Spearman's ρ were used to investigate reliability, stability, and validity. Measures in the CHORDS exhibited high internal consistency reliability and strong split-half reliability correlations. Factor analyses indicated five distinct scales, as proposed in the theoretical model. Subscale composite scores also correlated with alcohol consumption behaviors, indicating concurrent validity. The CHORDS represents the first instrument specifically designed to assess responsible drinking beliefs and behaviors. It was found to elicit valid and reliable data in a college student sample. This instrument holds much promise for practitioners who wish to empirically investigate dimensions of responsible drinking.
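Cronbach's α, the internal-consistency statistic central to the CHORDS validation, has a short direct computation. The data below are synthetic item scores driven by a shared latent trait, not the survey's responses.

```python
# Illustrative computation of Cronbach's alpha, the internal-consistency
# statistic used in the CHORDS validation. Toy data, not survey responses.
import numpy as np

def cronbach_alpha(items):
    """items: (respondents, items) matrix of scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of total scores
    return k / (k - 1) * (1 - item_vars / total_var)

rng = np.random.default_rng(2)
trait = rng.normal(size=100)                               # shared latent trait
scores = trait[:, None] + 0.5 * rng.normal(size=(100, 6))  # 6 correlated items
print(f"alpha = {cronbach_alpha(scores):.2f}")  # high for consistent items
```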

  7. Fetal Cardiac Responding: A Correlate of Birth Weight and Neonatal Behavior.

    ERIC Educational Resources Information Center

    Emory, Eugene K.; Noonan, John R.

    1984-01-01

    Explores whether an empirical classification of healthy fetuses as fetal heart rate accelerators or decelerators would predict birth weight and neonatal behavior scored with the Brazelton Neonatal Behavior Assessment Scale. (Author/RH)

  8. Essays on pricing electricity and electricity derivatives in deregulated markets

    NASA Astrophysics Data System (ADS)

    Popova, Julia

    2008-10-01

    This dissertation is composed of four essays on the behavior of wholesale electricity prices and their derivatives. The first essay provides an empirical model that takes into account the spatial features of a transmission network on the electricity market. The spatial structure of the transmission grid plays a key role in determining electricity prices, but it has not been incorporated into previous empirical models. The econometric model in this essay incorporates a simple representation of the transmission system into a spatial panel data model of electricity prices, and also accounts for the effect of dynamic transmission system constraints on electricity market integration. Empirical results using PJM data confirm the existence of spatial patterns in electricity prices and show that spatial correlation diminishes as transmission lines become more congested. The second essay develops and empirically tests a model of the influence of natural gas storage inventories on the electricity forward premium. I link a model of the effect of gas storage constraints on the higher moments of the distribution of electricity prices to a model of the effect of those moments on the forward premium. Empirical results using PJM data support the model's predictions that gas storage inventories sharply reduce the electricity forward premium when demand for electricity is high and space-heating demand for gas is low. The third essay examines the efficiency of PJM electricity markets. A market is efficient if prices reflect all relevant information, so that prices follow a random walk. The hypothesis of random walk is examined using empirical tests, including the Portmanteau, Augmented Dickey-Fuller, KPSS, and multiple variance ratio tests. The results are mixed though evidence of some level of market efficiency is found. 
    The last essay investigates the possibility that previous researchers have drawn spurious conclusions based on classical unit root tests incorrectly applied to wholesale electricity prices. It is well known that electricity prices exhibit both cyclicity and high volatility that varies through time. Results indicate that heterogeneity in unconditional variance, which is not detected by classical unit root tests, may contribute to the appearance of non-stationarity.

  9. Boundary layers in centrifugal compressors. [application of boundary layer theory to compressor design

    NASA Technical Reports Server (NTRS)

    Dean, R. C., Jr.

    1974-01-01

    The utility of boundary-layer theory in the design of centrifugal compressors is demonstrated. Boundary-layer development in the diffuser entry region is shown to be important to stage efficiency. The result of an earnest attempt to analyze this boundary layer with the best tools available is displayed. Acceptable prediction accuracy was not achieved; the inaccuracy of boundary-layer analysis in this case would result in stage efficiency predictions as much as four points low. Fluid dynamic reasons for the failure of the analysis are discussed with support from flow data. Empirical correlations used today to circumvent the weakness of the theory are illustrated.

  10. Initial empirical analysis of nuclear power plant organization and its effect on safety performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Olson, J.; McLaughlin, S.D.; Osborn, R.N.

    This report contains an analysis of the relationship between selected aspects of organizational structure and the safety-related performance of nuclear power plants. The report starts by identifying and operationalizing certain key dimensions of organizational structure that may be expected to be related to plant safety performance. Next, indicators of plant safety performance are created by combining existing performance measures into more reliable indicators. Finally, the organizational measures are related to the indicators of plant safety performance using correlational and discriminant analysis. The overall results show that plants with better developed coordination mechanisms, shorter vertical hierarchies, and a greater number of departments tend to perform more safely.

  11. IT product competition Network

    NASA Astrophysics Data System (ADS)

    Xu, Xiu-Lian; Zhou, Lei; Shi, Jian-Jun; Wang, Yong-Li; Feng, Ai-Xia; He, Da-Ren

    2008-03-01

    Along with technical development, competition among IT products has become increasingly fierce in recent years. Factories that produce the same IT product must continuously improve their product quality to capture a larger share of the sales market. We suggest a complex network description of IT product competition. In the network, factories are defined as nodes, and two nodes are connected by a link if they produce a common IT product; each edge represents a sales-competition relationship. 2121 factories and 265 products were investigated. Some statistical properties, such as the degree distribution, node strength distribution, assortativity, and node degree correlation, have been obtained empirically.

  12. Impingement thermal performance of perforated circular pin-fin heat sinks

    NASA Astrophysics Data System (ADS)

    Wen, Mao-Yu; Yeh, Cheng-Hsiung

    2018-04-01

    The study presents experimental information on the heat transfer performance of jet impingement cooling on circular pin-fin heat sinks with and without a hollow perforated base plate. Smoke flow visualization is also used to investigate the complicated flow phenomena of these heat sinks under impingement cooling. The effects of flow Reynolds number (3458 ≤ Re ≤ 11,526), fin height, heat sink geometry (with/without a hollow perforated base plate), and jet-to-heat-sink placement (1 ≤ H/d ≤ 16) are examined. In addition, an empirical correlation to estimate the heat transfer coefficient was developed.

  13. Happy software developers solve problems better: psychological measurements in empirical software engineering

    PubMed Central

    Wang, Xiaofeng; Abrahamsson, Pekka

    2014-01-01

    For more than thirty years, it has been claimed that a way to improve software developers’ productivity and software quality is to focus on people and to provide incentives to make developers satisfied and happy. This claim has rarely been verified in software engineering research, which faces an additional challenge in comparison to more traditional engineering fields: software development is an intellectual activity and is dominated by often-neglected human factors (called human aspects in software engineering research). Among the many skills required for software development, developers must possess high analytical problem-solving skills and creativity for the software construction process. According to psychology research, affective states—emotions and moods—deeply influence the cognitive processing abilities and performance of workers, including creativity and analytical problem solving. Nonetheless, little research has investigated the correlation between the affective states, creativity, and analytical problem-solving performance of programmers. This article echoes the call to employ psychological measurements in software engineering research. We report a study with 42 participants to investigate the relationship between the affective states, creativity, and analytical problem-solving skills of software developers. The results offer support for the claim that happy developers are indeed better problem solvers in terms of their analytical abilities. The following contributions are made by this study: (1) providing a better understanding of the impact of affective states on the creativity and analytical problem-solving capacities of developers, (2) introducing and validating psychological measurements, theories, and concepts of affective states, creativity, and analytical-problem-solving skills in empirical software engineering, and (3) raising the need for studying the human factors of software engineering by employing a multidisciplinary viewpoint. 
PMID:24688866

  14. The dynamics of adapting, unregulated populations and a modified fundamental theorem.

    PubMed

    O'Dwyer, James P

    2013-01-06

    A population in a novel environment will accumulate adaptive mutations over time, and the dynamics of this process depend on the underlying fitness landscape: the fitness of and mutational distance between possible genotypes in the population. Despite its fundamental importance for understanding the evolution of a population, inferring this landscape from empirical data has been problematic. We develop a theoretical framework to describe the adaptation of a stochastic, asexual, unregulated, polymorphic population undergoing beneficial, neutral and deleterious mutations on a correlated fitness landscape. We generate quantitative predictions for the change in the mean fitness and within-population variance in fitness over time, and find a simple, analytical relationship between the distribution of fitness effects arising from a single mutation, and the change in mean population fitness over time: a variant of Fisher's 'fundamental theorem' which explicitly depends on the form of the landscape. Our framework can therefore be thought of in three ways: (i) as a set of theoretical predictions for adaptation in an exponentially growing phase, with applications in pathogen populations, tumours or other unregulated populations; (ii) as an analytically tractable problem to potentially guide theoretical analysis of regulated populations; and (iii) as a basis for developing empirical methods to infer general features of a fitness landscape.
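    For orientation, the classical statement for an asexual population, to which the abstract's "modified fundamental theorem" adds explicit landscape dependence, can be written schematically (background only, not the paper's exact result). In an asexual population all fitness variance is heritable, so the mean Malthusian fitness changes as the current variance in fitness; with new mutations arriving at rate $U$ and fitness effects $s$ drawn from a distribution $\rho(s)$, a mutational term is added:

    ```latex
    % Classical (mutation-free) part:
    \frac{d\bar{m}}{dt} = \operatorname{Var}(m)
    % Schematic variant with a mutational input term:
    \frac{d\bar{m}}{dt} = \operatorname{Var}(m) + U \int s\,\rho(s)\,\mathrm{d}s
    ```

    The paper's result makes the second term depend on the form of the correlated fitness landscape rather than on a fixed distribution of effects.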

  15. Backward jump continuous-time random walk: An application to market trading

    NASA Astrophysics Data System (ADS)

    Gubiec, Tomasz; Kutner, Ryszard

    2010-10-01

    The backward jump modification of the continuous-time random walk model, that is, the version of the model driven by negative feedback, was herein derived for a spatiotemporal continuum in the context of share price evolution on a stock exchange. Within the framework of the model, we described the stochastic evolution of a typical share price on a stock exchange with moderate liquidity on a high-frequency time scale. The model was validated by the satisfactory agreement of the theoretical velocity autocorrelation function with its empirical counterpart obtained for continuous quotation. This agreement is mainly a result of a sharp backward correlation found and considered in this article. This correlation is reminiscent of the bid-ask bounce phenomenon, in which a backward price jump has the same, or almost the same, length as the preceding jump. We suggest that this correlation dominates the dynamics of stock markets with moderate liquidity. Although the assumptions of the model were inspired by high-frequency empirical market data, its potential applications extend beyond the financial market, for instance, to the field covered by the Le Chatelier-Braun principle of contrariness.

  16. The limitations of simple gene set enrichment analysis assuming gene independence.

    PubMed

    Tamayo, Pablo; Steinhardt, George; Liberzon, Arthur; Mesirov, Jill P

    2016-02-01

    Since its first publication in 2003, the Gene Set Enrichment Analysis method, based on the Kolmogorov-Smirnov statistic, has been heavily used, modified, and also questioned. Recently a simplified approach using a one-sample t-test score to assess enrichment and ignoring gene-gene correlations was proposed by Irizarry et al. 2009 as a serious contender. The argument criticizes Gene Set Enrichment Analysis's nonparametric nature and its use of an empirical null distribution as unnecessary and hard to compute. We refute these claims by careful consideration of the assumptions of the simplified method and its results, including a comparison with Gene Set Enrichment Analysis on a large benchmark set of 50 datasets. Our results provide strong empirical evidence that gene-gene correlations cannot be ignored, owing to the significant variance inflation they produce in the enrichment scores, and should be taken into account when estimating gene set enrichment significance. In addition, we discuss the challenges that the complex correlation structure and multi-modality of gene sets pose more generally for gene set enrichment methods. © The Author(s) 2012.
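    The variance-inflation argument can be illustrated with a small simulation (a generic toy, not the paper's benchmark): for a set of n equicorrelated genes with pairwise correlation rho, the variance of the gene-set mean is (1 + (n-1)*rho)/n times the per-gene variance, whereas a t-test assuming independence uses only 1/n.

    ```python
    import numpy as np

    # Toy simulation (not from the paper): equicorrelated normals via a
    # shared latent factor, z_i = sqrt(rho)*f + sqrt(1-rho)*e_i, so that
    # Var(z_i) = 1 and Corr(z_i, z_j) = rho for i != j.
    rng = np.random.default_rng(1)
    n_genes, rho, n_sims = 50, 0.2, 20000
    f = rng.standard_normal(n_sims)
    e = rng.standard_normal((n_sims, n_genes))
    z = np.sqrt(rho) * f[:, None] + np.sqrt(1.0 - rho) * e
    set_means = z.mean(axis=1)

    naive_var = 1.0 / n_genes                             # independence assumption
    true_var = (1.0 + (n_genes - 1) * rho) / n_genes      # with correlation
    print(set_means.var(), naive_var, true_var)           # empirical var tracks true_var
    ```

    Even a modest rho of 0.2 inflates the null variance of the set score roughly eleven-fold here, which is the effect the abstract says cannot be ignored.
    
    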

  18. Multifractal detrended cross-correlations between the CSI 300 index futures and the spot markets based on high-frequency data

    NASA Astrophysics Data System (ADS)

    Cao, Guangxi; Han, Yan; Cui, Weijun; Guo, Yu

    2014-11-01

    The cross-correlation between the China Securities Index 300 (CSI 300) index futures and the spot markets based on high-frequency data is discussed in this paper. We empirically analyze the cross-correlation by using the multifractal detrended cross-correlation analysis (MF-DCCA), and investigate further the characteristics of asymmetry, frequency difference, and transmission direction of the cross-correlation. The results indicate that the cross-correlation between the two markets is significant and multifractal. Meanwhile, weak asymmetries exist in the cross-correlation, and higher data frequency results in a lower multifractality degree of the cross-correlation. The causal relationship between the two markets is bidirectional, but the CSI 300 index futures market has greater impact on the spot market.
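    The MF-DCCA procedure named above follows a standard recipe: build profiles, detrend in segments, and average the detrended covariance to q-th order. A minimal sketch (not the authors' code; it uses the common absolute-value convention for the detrended covariance) might look like:

    ```python
    import numpy as np

    def mfdcca_fluctuation(x, y, scales, q_values, order=1):
        """q-th order detrended cross-covariance fluctuation F_q(s).

        Power-law scaling F_q(s) ~ s**h_xy(q) indicates cross-correlation;
        a q-dependent exponent h_xy(q) signals multifractality.
        """
        # Profiles: cumulative sums of the mean-removed series.
        X = np.cumsum(x - np.mean(x))
        Y = np.cumsum(y - np.mean(y))
        n = len(X)
        Fq = np.empty((len(q_values), len(scales)))
        for j, s in enumerate(scales):
            m = n // s
            f2 = []  # detrended covariance of each segment
            for seg in range(m):
                idx = np.arange(seg * s, (seg + 1) * s)
                t = np.arange(s)
                # Remove a local polynomial trend from each profile.
                rx = X[idx] - np.polyval(np.polyfit(t, X[idx], order), t)
                ry = Y[idx] - np.polyval(np.polyfit(t, Y[idx], order), t)
                f2.append(np.mean(np.abs(rx * ry)))
            f2 = np.asarray(f2)
            for i, q in enumerate(q_values):
                if q == 0:
                    Fq[i, j] = np.exp(0.5 * np.mean(np.log(f2)))
                else:
                    Fq[i, j] = np.mean(f2 ** (q / 2.0)) ** (1.0 / q)
        return Fq
    ```

    The generalized Hurst exponent h_xy(q) is then estimated from the slope of log F_q(s) against log s; its variation with q measures the multifractality degree discussed in the abstract.
    
    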

  19. The evolution of trade-offs under directional and correlational selection.

    PubMed

    Roff, Derek A; Fairbairn, Daphne J

    2012-08-01

    Using quantitative genetic theory, we develop predictions for the evolution of trade-offs in response to directional and correlational selection. We predict that directional selection favoring an increase in one trait in a trade-off will result in change in the intercept but not the slope of the trade-off function, with the mean value of the selected trait increasing and that of the correlated trait decreasing. Natural selection will generally favor an increase in some combination of trait values, which can be represented as directional selection on an index value. Such selection induces both directional and correlational selection on the component traits. Theory predicts that selection on an index value will also change the intercept but not the slope of the trade-off function but because of correlational selection, the direction of change in component traits may be in the same or opposite directions. We test these predictions using artificial selection on the well-established trade-off between fecundity and flight capability in the cricket, Gryllus firmus and compare the empirical results with a priori predictions made using genetic parameters from a separate half-sibling experiment. Our results support the predictions and illustrate the complexity of trade-off evolution when component traits are subject to both directional and correlational selection. © 2012 The Author(s). Evolution© 2012 The Society for the Study of Evolution.

  20. Electronic Structures of Anti-Ferromagnetic Tetraradicals: Ab Initio and Semi-Empirical Studies.

    PubMed

    Zhang, Dawei; Liu, Chungen

    2016-04-12

    The energy relationships and electronic structures of the lowest-lying spin states in several anti-ferromagnetic tetraradical model systems are studied with high-level ab initio and semi-empirical methods. The Full-CI method (FCI), the complete active space second-order perturbation theory (CASPT2), and the n-electron valence state perturbation theory (NEVPT2) are employed to obtain reference results. By comparing the energy relationships predicted from the Heisenberg and Hubbard models with ab initio benchmarks, the accuracy of the widely used Heisenberg model for anti-ferromagnetic spin-coupling in low-spin polyradicals is cautiously tested in this work. It is found that the strength of electron correlation (|U/t|) concerning anti-ferromagnetically coupled radical centers could range widely from strong to moderate correlation regimes and could become another degree of freedom besides the spin multiplicity. Accordingly, the Heisenberg-type model works well in the regime of strong correlation, which reproduces well the energy relationships along with the wave functions of all the spin states. In moderately spin-correlated tetraradicals, the results of the prototype Heisenberg model deviate severely from those of multi-reference electron correlation ab initio methods, while the extended Heisenberg model, containing four-body terms, can introduce reasonable corrections and maintains its accuracy in this condition. In the weak correlation regime, both the prototype Heisenberg model and its extended forms containing higher-order correction terms will encounter difficulties. Meanwhile, the Hubbard model shows balanced accuracy from strong to weak correlation cases and can reproduce qualitatively correct electronic structures, which makes it more suitable for the study of anti-ferromagnetic coupling in polyradical systems.

  1. Crustal Imaging of the Faroe Islands and North Sea Using Ambient Seismic Noise

    NASA Astrophysics Data System (ADS)

    Sammarco, C.; Rawlinson, N.; Cornwell, D. G.

    2016-12-01

    The recent development of ambient seismic noise imaging offers the potential for obtaining detailed seismic models of the crust. Cross-correlation of long-term recordings from station pairs reveals an empirical "Green's function" which is related to the impulse response of the medium between the two stations. Here, we present new results using two different broadband datasets: one that spans the Faroe Islands and another that spans the North Sea. The smaller-scale Faroe Islands study was tackled first because, with only 12 stations, it was well suited to the development and testing of a new data processing and inversion workflow. In the Faroe Islands study, cross-correlations with high signal-to-noise ratios were obtained by applying phase-weighted stacking, which is shown to be a significant improvement over conventional linear stacking. For example, coherent noise concentrated near the zero time lag of the linearly stacked cross-correlations appears to have an influence on the dispersion characteristics beyond 10 s period, but we have managed to minimize these effects with phase-weighted stacking. We obtain group velocity maps from 0.5 s to 15 s period by inverting inter-station travel times using an iterative non-linear inversion scheme. These maps reveal significant lateral heterogeneity in the mid-to-upper crust, including evidence of a low-velocity zone in the upper crust, which may mark the base of the basalt layer. This is most clearly revealed by taking the average group velocity dispersion curve for all station pairs and inverting for 1-D shear wave velocity. The computation of a 3-D shear wave speed model both verifies and adds further detail to these results. Application to the North Sea dataset was challenging due to the highly attenuative nature of the crust in this region, which has previously been observed to dramatically reduce the signal-to-noise ratio of short-period surface waves. 
However, with the help of phase-weighted stacking good quality empirical Green's functions can be retrieved for this large dataset. Both group and phase velocity dispersion information are extracted from the cross-correlations, which are then inverted to produce period-dependent velocity maps. The next stage is to invert these maps for 3-D shear wave velocity structure beneath the North Sea region.
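    Phase-weighted stacking, credited above with suppressing incoherent near-zero-lag noise, can be sketched as follows (a minimal implementation of the Schimmel & Paulssen scheme, not the authors' workflow):

    ```python
    import numpy as np
    from scipy.signal import hilbert

    def phase_weighted_stack(traces, nu=2.0):
        """Phase-weighted stack (after Schimmel & Paulssen, 1997).

        traces: (n_traces, n_samples) array, e.g. daily noise cross-correlations.
        Each sample of the linear stack is weighted by the coherence of the
        instantaneous phases across traces, down-weighting energy that is
        not phase-consistent in every trace.
        """
        analytic = hilbert(traces, axis=1)        # analytic signal per trace
        phasors = analytic / np.abs(analytic)     # unit instantaneous-phase vectors
        coherence = np.abs(phasors.mean(axis=0))  # 1 where phases agree, ->0 otherwise
        return traces.mean(axis=0) * coherence ** nu
    ```

    The sharpness exponent nu controls how aggressively incoherent samples are suppressed; nu = 0 recovers the ordinary linear stack.
    
    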

  2. Testing homogeneity of proportion ratios for stratified correlated bilateral data in two-arm randomized clinical trials.

    PubMed

    Pei, Yanbo; Tian, Guo-Liang; Tang, Man-Lai

    2014-11-10

    Stratified data analysis is an important research topic in many biomedical studies and clinical trials. In this article, we develop five test statistics for testing the homogeneity of proportion ratios for stratified correlated bilateral binary data based on an equal correlation model assumption. Bootstrap procedures based on these test statistics are also considered. To evaluate the performance of these statistics and procedures, we conduct Monte Carlo simulations to study their empirical sizes and powers under various scenarios. Our results suggest that the procedure based on score statistic performs well generally and is highly recommended. When the sample size is large, procedures based on the commonly used weighted least square estimate and logarithmic transformation with Mantel-Haenszel estimate are recommended as they do not involve any computation of maximum likelihood estimates requiring iterative algorithms. We also derive approximate sample size formulas based on the recommended test procedures. Finally, we apply the proposed methods to analyze a multi-center randomized clinical trial for scleroderma patients. Copyright © 2014 John Wiley & Sons, Ltd.
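    Empirical size studies of the kind described can be sketched generically. The toy below uses a plain pooled z-test on independent two-arm binary data (not the paper's correlated-bilateral statistics) purely to show the mechanics of estimating a test's size by Monte Carlo: simulate under H0 and count rejections at the nominal level.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(7)
    n, p0, alpha, n_sims = 200, 0.3, 0.05, 5000
    crit = stats.norm.ppf(1 - alpha / 2)
    rejections = 0
    for _ in range(n_sims):
        # Two arms simulated under H0: equal success probabilities.
        x1 = rng.binomial(n, p0)
        x2 = rng.binomial(n, p0)
        phat = (x1 + x2) / (2 * n)                 # pooled estimate
        se = np.sqrt(phat * (1 - phat) * 2 / n)
        z = (x1 / n - x2 / n) / se
        rejections += abs(z) > crit
    size = rejections / n_sims
    print(size)  # should be close to the nominal alpha = 0.05
    ```

    A procedure whose empirical size stays near the nominal level across scenarios, as the score statistic does in the study, is the one to recommend.
    
    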

  3. Reassessment of fission fragment angular distributions from continuum states in the context of transition-state theory

    NASA Astrophysics Data System (ADS)

    Vaz, Louis C.; Alexander, John M.

    1983-07-01

    Fission angular distributions have been studied for years and have been treated as classic examples of transition-state theory. Early work involving composite nuclei of relatively low excitation energy E ∗ (⪅35 MeV) and spin I (⪅25ħ) gave support to the theory and delimited interesting properties of the transition-state nuclei. More recent research on fusion fission and sequential fission after deeply inelastic reactions involves composite nuclei of much higher energies (⪅200 MeV) and spins (⪅100ħ). Extension of the basic ideas developed for low-spin nuclei requires detailed consideration of the role of these high spins and, in particular, of the “spin window” for fission. We have made empirical correlations of cross sections for evaporation residues and fission in order to obtain a description of this spin window. A systematic reanalysis has been made for fusion fission induced by H, He and heavier ions. Empirical correlations of K₀² (K₀² = IeffT/ħ²) are presented along with comparisons of Ieff to moments of inertia for saddle-point nuclei from the rotating liquid drop model. This model gives an excellent guide for the intermediate spin zone (30 ⪅ I ⪅ 65), while strong shell and/or pairing effects are evident for excitations below ⪅35 MeV. Observations of strong anisotropies for very high-spin systems signal the breakdown of certain approximations commonly made in the theory, and suggestions are made for revising them.

  4. On the relationship between tumour growth rate and survival in non-small cell lung cancer.

    PubMed

    Mistry, Hitesh B

    2017-01-01

    A recurrent question within oncology drug development is how to predict phase III outcome for a new treatment using early clinical data. One approach to this problem has been to derive metrics from mathematical models that describe tumour size dynamics, termed re-growth rate and time to tumour re-growth. These have been shown to be strong predictors of overall survival in numerous studies, but there is debate about how the metrics are derived and whether they are more predictive than empirical end-points. This work explores the issues raised in using model-derived metrics as predictors for survival analyses. Re-growth rate and time to tumour re-growth were calculated for three large clinical studies by forward and reverse alignment. The latter involves re-aligning patients to their time of progression. Hence, it accounts for the time taken to estimate re-growth rate and time to tumour re-growth, but also assesses whether these predictors correlate with survival from the time of progression. I found that neither re-growth rate nor time to tumour re-growth correlated with survival using reverse alignment. This suggests that the dynamics of tumours up until disease progression have no relationship to survival post progression. For prediction of a phase III trial, I found that the metrics performed no better than empirical end-points. These results highlight that care must be taken when relating the dynamics of tumour imaging to survival and that benchmarking new approaches against existing ones is essential.

  5. Social-Emotional Well-Being and Resilience of Children in Early Childhood Settings--PERIK: An Empirically Based Observation Scale for Practitioners

    ERIC Educational Resources Information Center

    Mayr, Toni; Ulich, Michaela

    2009-01-01

    Compared with the traditional focus on developmental problems, research on positive development is relatively new. Empirical research in children's well-being has been scarce. The aim of this study was to develop a theoretically and empirically based instrument for practitioners to observe and assess preschool children's well-being in early…

  6. Converging Paradigms: A Reflection on Parallel Theoretical Developments in Psychoanalytic Metapsychology and Empirical Dream Research.

    PubMed

    Schmelowszky, Ágoston

    2016-08-01

    In the last decades one can perceive a striking parallelism between the shifting perspective of leading representatives of empirical dream research concerning their conceptualization of dreaming and the paradigm shift within clinically based psychoanalytic metapsychology with respect to its theory on the significance of dreaming. In metapsychology, dreaming becomes more and more a central metaphor of mental functioning in general. The theories of Klein, Bion, and Matte-Blanco can be considered as milestones of this paradigm shift. In empirical dream research, the competing theories of Hobson and of Solms respectively argued for and against the meaningfulness of the dream-work in the functioning of the mind. In the meantime, empirical data coming from various sources seemed to prove the significance of dream consciousness for the development and maintenance of adaptive waking consciousness. Metapsychological speculations and hypotheses based on empirical research data seem to point in the same direction, promising for contemporary psychoanalytic practice a more secure theoretical base. In this paper the author brings together these diverse theoretical developments and presents conclusions regarding psychoanalytic theory and technique, as well as proposing an outline of an empirical research plan for testing the specificity of psychoanalysis in developing dream formation.

  7. Developing Empirical Lightning Cessation Forecast Guidance for the Cape Canaveral Air Force Station and Kennedy Space Center

    NASA Technical Reports Server (NTRS)

    Stano, Geoffrey T.; Fuelberg, Henry E.; Roeder, William P.

    2010-01-01

    This research addresses the 45th Weather Squadron's (45WS) need for improved guidance regarding lightning cessation at Cape Canaveral Air Force Station and Kennedy Space Center (KSC). KSC's Lightning Detection and Ranging (LDAR) network was the primary observational tool to investigate both cloud-to-ground and intracloud lightning. Five statistical and empirical schemes were created from LDAR, sounding, and radar parameters derived from 116 storms. Four of the five schemes were unsuitable for operational use since lightning advisories would be canceled prematurely, leading to safety risks to personnel. These include a correlation and regression tree analysis, three variants of multiple linear regression, event time trending, and the time delay between the greatest height of the maximum dBZ value to the last flash. These schemes failed to adequately forecast the maximum interval, the greatest time between any two flashes in the storm. The majority of storms had a maximum interval less than 10 min, which biased the schemes toward small values. Success was achieved with the percentile method (PM) by separating the maximum interval into percentiles for the 100 dependent storms.
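    The percentile method, as described, amounts to reading advisory wait times off the empirical distribution of per-storm maximum flash intervals. A toy sketch with hypothetical interval data (NOT the 45WS or LDAR values) is:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    # Hypothetical per-storm maximum flash intervals in minutes; most
    # storms below 10 min, mirroring the skew described above.
    max_intervals = rng.exponential(scale=5.0, size=116)

    # The p-th percentile gives a wait time after the last observed flash
    # such that roughly p% of training storms would have produced no
    # further lightning within it.
    wait_time_95 = np.percentile(max_intervals, 95)
    print(round(float(wait_time_95), 1))
    ```

    The operational trade-off is explicit: a higher percentile means fewer premature advisory cancellations but longer holds on personnel.
    
    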

  8. An empirically derived short form of the Hypoglycaemia Fear Survey II.

    PubMed

    Grabman, J; Vajda Bailey, K; Schmidt, K; Cariou, B; Vaur, L; Madani, S; Cox, D; Gonder-Frederick, L

    2017-04-01

    To develop an empirically derived short version of the Hypoglycaemia Fear Survey II that still accurately measures fear of hypoglycaemia. Item response theory methods were used to generate an 11-item version of the Hypoglycaemia Fear Survey from a sample of 487 people with Type 1 or Type 2 diabetes mellitus. Subsequently, this scale was tested on a sample of 2718 people with Type 1 or insulin-treated Type 2 diabetes taking part in DIALOG, a large observational prospective study of hypoglycaemia in France. The short form of the Hypoglycaemia Fear Survey II matched the factor structure of the long form for respondents with both Type 1 and Type 2 diabetes, while maintaining adequate internal reliability on the total scale and all three subscales. The two forms were highly correlated on both the total scale and each subscale (Pearson's R > 0.89). The short form of the Hypoglycaemia Fear Survey II is an important first step in more efficiently measuring fear of hypoglycaemia. Future prospective studies are needed for further validity testing and exploring the survey's applicability to different populations. © 2016 Diabetes UK.

  9. Quantification of variability and uncertainty for air toxic emission inventories with censored emission factor data.

    PubMed

    Frey, H Christopher; Zhao, Yuchao

    2004-11-15

    Probabilistic emission inventories were developed for urban air toxic emissions of benzene, formaldehyde, chromium, and arsenic for the example of Houston. Variability and uncertainty in emission factors were quantified for 71-97% of total emissions, depending upon the pollutant and data availability. Parametric distributions for interunit variability were fit using maximum likelihood estimation (MLE), and uncertainty in mean emission factors was estimated using parametric bootstrap simulation. For data sets containing one or more nondetected values, empirical bootstrap simulation was used to randomly sample detection limits for nondetected values and observations for sample values, and parametric distributions for variability were fit using MLE estimators for censored data. The goodness-of-fit for censored data was evaluated by comparison of cumulative distributions of bootstrap confidence intervals and empirical data. The emission inventory 95% uncertainty ranges are as small as -25% to +42% for chromium to as large as -75% to +224% for arsenic with correlated surrogates. Uncertainty was dominated by only a few source categories. Recommendations are made for future improvements to the analysis.
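    The censored-data MLE step can be sketched as follows, assuming (as is common for emission factors) that the analysis is done on log-transformed values so the working model is normal. The function name and data below are illustrative, not from the study: detected values contribute the density, and each nondetect contributes Pr(X < DL), the cdf at its detection limit.

    ```python
    import numpy as np
    from scipy import optimize, stats

    def censored_normal_mle(detects, dls):
        """MLE of (mu, sigma) with nondetects.

        detects: observed values (e.g. log emission factors).
        dls: detection limits of the nondetected samples (same scale).
        """
        def nll(p):
            mu, log_s = p
            s = np.exp(log_s)  # reparameterize so sigma stays positive
            ll = stats.norm.logpdf(detects, mu, s).sum()      # detected values
            ll += stats.norm.logcdf(dls, mu, s).sum()         # nondetects: Pr(X < DL)
            return -ll
        x0 = [np.mean(detects), np.log(np.std(detects) + 1e-9)]
        res = optimize.minimize(nll, x0, method="Nelder-Mead")
        return res.x[0], np.exp(res.x[1])
    ```

    Uncertainty in the mean could then be estimated by refitting on parametric bootstrap resamples drawn from the fitted distribution and re-censored at the detection limits, in the spirit of the procedure the abstract describes.
    
    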

  10. Overview of physical models of liquid entrainment in annular gas-liquid flow

    NASA Astrophysics Data System (ADS)

    Cherdantsev, Andrey V.

    2018-03-01

    A number of recent papers devoted to the development of physically based models for predicting liquid entrainment in the annular regime of two-phase flow are analyzed. In these models, the shearing-off of disturbance-wave crests by the gas drag force is assumed to be the physical mechanism of entrainment. The models rest on a number of assumptions about the wavy structure, including the inception of disturbance waves due to Kelvin-Helmholtz instability, a linear velocity profile inside the liquid film, and a high degree of three-dimensionality of the disturbance waves. The validity of these assumptions is examined by comparison with modern experimental observations. It is shown that nearly every assumption is in strong qualitative and quantitative disagreement with experiments, which leads to large discrepancies between the modeled and real properties of the disturbance waves. As a result, such models over-predict the entrained fraction by several orders of magnitude. The discrepancy is usually reduced using various kinds of empirical corrections. This, combined with the empiricism already included in the models, turns them into another kind of empirical correlation rather than physically based models.

  11. A computationally efficient modelling of laminar separation bubbles

    NASA Technical Reports Server (NTRS)

    Dini, Paolo; Maughmer, Mark D.

    1990-01-01

    In predicting the aerodynamic characteristics of airfoils operating at low Reynolds numbers, it is often important to account for the effects of laminar (transitional) separation bubbles. Previous approaches to the modelling of this viscous phenomenon range from fast but sometimes unreliable empirical correlations for the length of the bubble and the associated increase in momentum thickness, to more accurate but significantly slower displacement-thickness iteration methods employing inverse boundary-layer formulations in the separated regions. Since the penalty in computational time associated with the more general methods is unacceptable for airfoil design applications, use of an accurate yet computationally efficient model is highly desirable. To this end, a semi-empirical bubble model was developed and incorporated into the Eppler and Somers airfoil design and analysis program. Generality and efficiency were achieved by successfully approximating the local viscous/inviscid interaction, the transition location, and the turbulent reattachment process within the framework of an integral boundary-layer method. Comparisons of the predicted aerodynamic characteristics with experimental measurements for several airfoils show excellent and consistent agreement for Reynolds numbers from 2,000,000 down to 100,000.

  12. Systematic Interpolation Method Predicts Antibody Monomer-Dimer Separation by Gradient Elution Chromatography at High Protein Loads.

    PubMed

    Creasy, Arch; Reck, Jason; Pabst, Timothy; Hunter, Alan; Barker, Gregory; Carta, Giorgio

    2018-05-29

    A previously developed empirical interpolation (EI) method is extended to predict highly overloaded multicomponent elution behavior on a cation exchange (CEX) column based on batch isotherm data. Instead of a fully mechanistic model, the EI method employs an empirically modified multicomponent Langmuir equation to correlate two-component adsorption isotherm data at different salt concentrations. Piecewise cubic interpolating polynomials are then used to predict competitive binding at intermediate salt concentrations. The approach is tested for the separation of monoclonal antibody monomer and dimer mixtures by gradient elution on the cation exchange resin Nuvia HR-S. Adsorption isotherms are obtained over a range of salt concentrations with varying monomer and dimer concentrations. Coupled with a lumped kinetic model, the interpolated isotherms predict the column behavior for highly overloaded conditions. Predictions based on the EI method showed good agreement with experimental elution curves for protein loads up to 40 mg/mL column or about 50% of the column binding capacity. The approach can be extended to other chromatographic modalities and to more than two components. This article is protected by copyright. All rights reserved.
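    The interpolation idea, fitting isotherm parameters at measured salt levels and using piecewise cubic interpolating polynomials in between, can be sketched for a single component (all numbers invented; the paper's EI method uses an empirically modified multicomponent Langmuir form):

    ```python
    import numpy as np
    from scipy.interpolate import PchipInterpolator

    # Hypothetical Langmuir parameters measured at four salt levels,
    # interpolated with monotone piecewise cubics in salt.
    salt = np.array([20.0, 40.0, 60.0, 80.0])    # mM, hypothetical
    qmax = np.array([120.0, 90.0, 55.0, 20.0])   # mg/mL resin, hypothetical
    keq  = np.array([8.0, 3.0, 1.0, 0.3])        # mL/mg, hypothetical

    qmax_of_s = PchipInterpolator(salt, qmax)    # monotonicity-preserving cubics
    keq_of_s = PchipInterpolator(salt, keq)

    def isotherm(c, s):
        """Adsorbed concentration at liquid concentration c and salt s via
        the Langmuir form q = qmax*K*c / (1 + K*c), with parameters
        interpolated in salt."""
        q_m, k = qmax_of_s(s), keq_of_s(s)
        return q_m * k * c / (1.0 + k * c)
    ```

    Coupled to a lumped kinetic column model, such interpolated isotherms supply the equilibrium relationship at any salt concentration encountered during the gradient.
    
    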

  13. Hydrogeomorphology explains acidification-driven variation in aquatic biological communities in the Neversink Basin, USA

    USGS Publications Warehouse

    Harpold, Adrian A.; Burns, Douglas A.; Walter, M.T.; Steenhuis, Tammo S.

    2013-01-01

    Describing the distribution of aquatic habitats and the health of biological communities can be costly and time-consuming; therefore, simple, inexpensive methods to scale observations of aquatic biota to watersheds that lack data would be useful. In this study, we explored the potential of a simple “hydrogeomorphic” model to predict the effects of acid deposition on macroinvertebrate, fish, and diatom communities in 28 sub-watersheds of the 176-km2 Neversink River basin in the Catskill Mountains of New York State. The empirical model was originally developed to predict stream-water acid neutralizing capacity (ANC) using the watershed slope and drainage density. Because ANC is known to be strongly related to aquatic biological communities in the Neversink, we speculated that the model might correlate well with biotic indicators of ANC response. The hydrogeomorphic model was strongly correlated to several measures of macroinvertebrate and fish community richness and density, but less strongly correlated to diatom acid tolerance. The model was also strongly correlated to biological communities in 18 sub-watersheds independent of the model development, with the linear correlation capturing the strongly acidic nature of small upland watersheds (2). Overall, we demonstrated the applicability of geospatial data sets and a simple hydrogeomorphic model for estimating aquatic biological communities in areas with stream-water acidification, allowing estimates where no direct field observations are available. Similar modeling approaches have the potential to complement or refine expensive and time-consuming measurements of aquatic biota populations and to aid in regional assessments of aquatic health.

  14. A root-mean-square pressure fluctuations model for internal flow applications

    NASA Technical Reports Server (NTRS)

    Chen, Y. S.

    1985-01-01

    A transport equation for the root-mean-square pressure fluctuations of turbulent flow is derived from the time-dependent momentum equation for incompressible flow. Approximate modeling of this transport equation is included to relate terms with higher order correlations to the mean quantities of turbulent flow. Three empirical constants are introduced in the model. Two of the empirical constants are estimated from homogeneous turbulence data and wall pressure fluctuations measurements. The third constant is determined by comparing the results of large eddy simulations for a plane channel flow and an annulus flow.

  15. Semi-empirical airframe noise prediction model

    NASA Technical Reports Server (NTRS)

    Hersh, A. S.; Putnam, T. W.; Lasagna, P. L.; Burcham, F. W., Jr.

    1976-01-01

    A semi-empirical maximum overall sound pressure level (OASPL) airframe noise model was derived. The noise radiated from aircraft wings and flaps was modeled by using the trailing-edge diffracted quadrupole sound theory derived by Ffowcs Williams and Hall. The noise radiated from the landing gear was modeled by using the acoustic dipole sound theory derived by Curle. The model was successfully correlated with maximum OASPL flyover noise measurements obtained at the NASA Dryden Flight Research Center for three jet aircraft - the Lockheed JetStar, the Convair 990, and the Boeing 747 aircraft.

  16. A Symmetric Time-Varying Cluster Rate of Descent Model

    NASA Technical Reports Server (NTRS)

    Ray, Eric S.

    2015-01-01

    A model of the time-varying rate of descent of the Orion vehicle was developed based on the observed correlation between canopy projected area and drag coefficient. This initial version of the model assumes cluster symmetry and only varies the vertical component of velocity. The cluster fly-out angle is modeled as a series of sine waves based on flight test data. The projected area of each canopy is synchronized with the primary fly-out angle mode. The sudden loss of projected area during canopy collisions is modeled at minimum fly-out angles, leading to brief increases in rate of descent. The cluster geometry is converted to drag coefficient using empirically derived constants. A more complete model is under development, which computes the aerodynamic response of each canopy to its local incidence angle.

  17. [The attempts at drug therapy of cancer by Anton Störck (1731-1803). History of experimental pharmacology in the old Vienna Medical School].

    PubMed

    Schweppe, K W; Probst, C

    1982-03-15

    The essay deals with the development of medical research in Vienna, especially the development of therapeutic drugs. This progress is related to the philosophical, historical, and political background of enlightened absolutism and the reformatory efforts of van Swieten during the regency of Maria Theresia in Austria. Anton Störck's research on hemlock (Conium maculatum) is used as an example, and the method of Störck's research work is described. Furthermore, it is demonstrated to what extent Störck's data, deduced from empirical examinations, are integrated into the official medical system, i.e. Boerhaave's iatromechanical system. Finally, an attempt is made to correlate these processes of medical history with the scientific-historical model of Thomas Kuhn.

  18. Computer based interpretation of infrared spectra-structure of the knowledge-base, automatic rule generation and interpretation

    NASA Astrophysics Data System (ADS)

    Ehrentreich, F.; Dietze, U.; Meyer, U.; Abbas, S.; Schulz, H.

    1995-04-01

    A main task within the SpecInfo project is to develop interpretation tools that can handle many more of the complicated, more specific spectrum-structure correlations. In the first step, the empirical knowledge about the assignment of structural groups and their characteristic IR bands was collected from the literature and represented in a computer-readable, well-structured form. Vague, verbal rules are managed by introducing linguistic variables. The next step was the development of automatic rule-generating procedures. We combined and extended the IDIOTS algorithm with the set-theory-based algorithm of Blaffert. The procedures were successfully applied to the SpecInfo database. The realization of the preceding items is a prerequisite for improving the computerized structure elucidation procedure.

  19. Forecasting Fire Season Severity in South America Using Sea Surface Temperature Anomalies

    NASA Technical Reports Server (NTRS)

    Chen, Yang; Randerson, James T.; Morton, Douglas C.; DeFries, Ruth S.; Collatz, G. James; Kasibhatla, Prasad S.; Giglio, Louis; Jin, Yufang; Marlier, Miriam E.

    2011-01-01

    Fires in South America cause forest degradation and contribute to carbon emissions associated with land use change. We investigated the relationship between year-to-year changes in fire activity in South America and sea surface temperatures. We found that the Oceanic Niño Index was correlated with interannual fire activity in the eastern Amazon, whereas the Atlantic Multidecadal Oscillation index was more closely linked with fires in the southern and southwestern Amazon. Combining these two climate indices, we developed an empirical model to forecast regional fire season severity with lead times of 3 to 5 months. Our approach may contribute to the development of an early warning system for anticipating the vulnerability of Amazon forests to fires, thus enabling more effective management with benefits for climate and air quality.
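
    As a hedged illustration of the kind of empirical forecast model described above (the predictor names are real, but the data, coefficients, and fitting choices are invented for this sketch), a fire season severity index can be regressed on the two preceding-season climate indices:

```python
import numpy as np

# Illustrative sketch (not the authors' code): regress a fire season
# severity index on two climate indices observed 3-5 months earlier,
# as in an empirical forecast built from ONI and AMO anomalies.
rng = np.random.default_rng(0)
n_years = 15
oni = rng.normal(size=n_years)      # Oceanic Nino Index, months before season
amo = rng.normal(size=n_years)      # Atlantic Multidecadal Oscillation index
severity = 0.8 * oni + 0.5 * amo + rng.normal(scale=0.3, size=n_years)

X = np.column_stack([np.ones(n_years), oni, amo])    # design matrix
coef, *_ = np.linalg.lstsq(X, severity, rcond=None)  # fitted coefficients
forecast = X @ coef                                  # in-sample forecast
r = np.corrcoef(forecast, severity)[0, 1]            # forecast skill
print(coef, round(r, 2))
```

    In practice the regression would be fit on observed burned-area or fire-count anomalies per region rather than the synthetic series used here.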

  20. Model of Fluidized Bed Containing Reacting Solids and Gases

    NASA Technical Reports Server (NTRS)

    Bellan, Josette; Lathouwers, Danny

    2003-01-01

    A mathematical model has been developed for describing the thermofluid dynamics of a dense, chemically reacting mixture of solid particles and gases. As used here, "dense" signifies having a large volume fraction of particles, as for example in a bubbling fluidized bed. The model is intended especially for application to fluidized beds that contain mixtures of carrier gases, biomass undergoing pyrolysis, and sand. So far, the design of fluidized beds and other gas/solid industrial processing equipment has been based on empirical correlations derived from laboratory- and pilot-scale units. The present mathematical model is a product of continuing efforts to develop a computational capability for optimizing the designs of fluidized beds and related equipment on the basis of first principles. Such a capability could eliminate the need for expensive, time-consuming predesign testing.

  1. Prediction of Meiyu rainfall in Taiwan by multi-lead physical-empirical models

    NASA Astrophysics Data System (ADS)

    Yim, So-Young; Wang, Bin; Xing, Wen; Lu, Mong-Ming

    2015-06-01

    Taiwan is located at the dividing point of the tropical and subtropical monsoons over East Asia. Taiwan has two rainy seasons, the Meiyu in May-June and the typhoon rains in August-September. Predicting the amount of Meiyu rainfall is of profound importance to disaster preparedness and water resource management. The seasonal forecast of May-June Meiyu rainfall has been a challenge for current dynamical models, and the factors controlling Taiwan Meiyu variability have eluded climate scientists for decades. Here we investigate the physical processes that are possibly important in driving the significant fluctuations of Taiwan Meiyu rainfall. Based on this understanding, we develop a physical-empirical model to predict Taiwan Meiyu rainfall at lead times of 0 (end of April), 1, and 2 months, respectively. Three physically consequential and complementary predictors are used: (1) a contrasting sea surface temperature (SST) tendency in the Indo-Pacific warm pool, (2) the tripolar SST tendency in the North Atlantic that is associated with the North Atlantic Oscillation, and (3) a surface warming tendency in northeast Asia. These precursors foreshadow enhanced Philippine Sea anticyclonic anomalies and an anomalous cyclone near southeastern China in the ensuing summer, which together favor increased Taiwan Meiyu rainfall. Note that the identified precursors at various lead times represent essentially the same physical processes, suggesting the robustness of the predictors. The physical-empirical model built from these predictors is capable of capturing the Taiwan rainfall variability, with a significant cross-validated temporal correlation coefficient skill of 0.75, 0.64, and 0.61 for 1979-2012 at the 0-, 1-, and 2-month lead times, respectively. The physical-empirical model concept used here can be extended to summer monsoon rainfall prediction over Southeast Asia and other regions.
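
    The cross-validated correlation skill quoted above can be estimated with a leave-one-out scheme; the sketch below is a generic illustration with synthetic data (the three predictors and their weights are assumptions, not the paper's values):

```python
import numpy as np

# Minimal sketch of cross-validated skill for a physical-empirical
# regression: each year is predicted by a model refit with that year
# withheld, and skill is the correlation between those predictions
# and the observed series.
rng = np.random.default_rng(1)
n = 34                                   # e.g. 1979-2012
predictors = rng.normal(size=(n, 3))     # three precursor indices
rainfall = predictors @ [0.6, 0.4, 0.3] + rng.normal(scale=0.5, size=n)

pred_cv = np.empty(n)
for i in range(n):
    keep = np.arange(n) != i                         # leave year i out
    X = np.column_stack([np.ones(keep.sum()), predictors[keep]])
    coef, *_ = np.linalg.lstsq(X, rainfall[keep], rcond=None)
    pred_cv[i] = coef[0] + predictors[i] @ coef[1:]  # predict held-out year

skill = np.corrcoef(pred_cv, rainfall)[0, 1]
print(round(skill, 2))
```

    Leave-one-out scoring guards against the inflated in-sample correlations that a model fit on all years would report.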

  2. The Impact of Variability of Selected Geological and Mining Parameters on the Value and Risks of Projects in the Hard Coal Mining Industry

    NASA Astrophysics Data System (ADS)

    Kopacz, Michał

    2017-09-01

    The paper attempts to assess the impact of the variability of selected geological (deposit) parameters on the value and risks of projects in the hard coal mining industry. The study was based on simulated discounted cash flow analysis, and the results were verified for three existing bituminous coal seams. The Monte Carlo simulation was based on the nonparametric bootstrap method, while correlations between individual deposit parameters were replicated with the use of an empirical copula. The calculations take into account the uncertainty about the parameters of the empirical distributions of the deposit variables. The Net Present Value (NPV) and the Internal Rate of Return (IRR) were selected as the main measures of value and risk, respectively. The impact of volatility and correlation of deposit parameters was analyzed in two aspects, by identifying the overall effect of the correlated variability of the parameters and the individual impact of the correlation on the NPV and IRR. For this purpose, a differential approach was used, allowing the possible errors in the calculation of these measures to be determined in numerical terms. Based on the study it can be concluded that the mean value of the overall effect of the variability does not exceed 11.8% of NPV and 2.4 percentage points of IRR. Neglecting the correlations results in overestimating the NPV and the IRR by up to 4.4% and 0.4 percentage points, respectively. It should be noted, however, that the differences in NPV and IRR values can vary significantly, and their interpretation depends on the likelihood of implementation. Generalizing the obtained results based on the average values, the maximum value of the risk premium in the given calculation conditions of the "X" deposit, and for correspondingly large datasets (greater than 2500), should not be higher than 2.4 percentage points.
The impact of the analyzed geological parameters on the NPV and IRR depends primarily on their co-existence, which can be measured by the strength of correlation. In the analyzed case, the correlations limit the range of variation of the geological parameters and the economic results (the empirical copula reduces the NPV and IRR in the probabilistic approach). However, this is due to the adjustment of the calculation to conditions similar to those prevailing in the deposit.
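
    A minimal sketch of the simulation machinery described above, a nonparametric bootstrap feeding a discounted cash flow NPV; all figures are invented, and the empirical-copula step that couples the deposit parameters is omitted for brevity:

```python
import numpy as np

# Hedged sketch (names and numbers are illustrative, not the paper's
# data): bootstrap-resample observed deposit margins, map them to annual
# cash flows, and collect the distribution of discounted NPV.
rng = np.random.default_rng(7)
observed_margin = np.array([12.0, 15.0, 9.0, 14.0, 11.0, 13.0])  # $/tonne
years, output, capex, rate = 10, 1.5e6, 60e6, 0.08               # assumed

npv_draws = []
for _ in range(2000):
    # nonparametric bootstrap: sample margins with replacement, one per year
    margins = rng.choice(observed_margin, size=years, replace=True)
    cash = margins * output                       # annual cash flow
    discount = (1 + rate) ** -np.arange(1, years + 1)
    npv_draws.append(cash @ discount - capex)

npv_draws = np.array(npv_draws)
print(round(npv_draws.mean() / 1e6, 1), round(np.percentile(npv_draws, 5) / 1e6, 1))
```

    The spread of the resulting NPV distribution is what the differential approach in the paper compares across correlated and uncorrelated parameter draws.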

  3. Communication: A new class of non-empirical explicit density functionals on the third rung of Jacob's ladder

    NASA Astrophysics Data System (ADS)

    de Silva, Piotr; Corminboeuf, Clémence

    2015-09-01

    We construct an orbital-free non-empirical meta-generalized gradient approximation (GGA) functional, which depends explicitly on density through the density overlap regions indicator [P. de Silva and C. Corminboeuf, J. Chem. Theory Comput. 10, 3745 (2014)]. The functional does not depend on either the kinetic energy density or the density Laplacian; therefore, it opens a new class of meta-GGA functionals. By construction, our meta-GGA yields exact exchange and correlation energy for the hydrogen atom and recovers the second order gradient expansion for exchange in the slowly varying limit. We show that for molecular systems, overall performance is better than non-empirical GGAs. For atomization energies, performance is on par with revTPSS, without any dependence on Kohn-Sham orbitals.

  4. Evolution properties of online user preference diversity

    NASA Astrophysics Data System (ADS)

    Guo, Qiang; Ji, Lei; Liu, Jian-Guo; Han, Jingti

    2017-02-01

    Detecting the evolution properties of online user preference diversity is of significance for a deep understanding of online collective behaviors. In this paper, we empirically explore the evolution patterns of online user rating preference, where the preference diversity is measured by the variation coefficient of the user rating sequence. The statistical results for four real systems show that, for movies and reviews, the user rating preference becomes diverse and then finally gets centralized. By introducing the empirical variation coefficient, we present a Markov model, which can regenerate the evolution properties of two online systems with regard to the stable variation coefficients. In addition, we investigate the evolution of the correlation between the user ratings and the object qualities, and find that the correlation keeps increasing as the user degree increases. This work could be helpful for understanding the anchoring bias and memory effects of online user collective behaviors.
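
    The diversity measure used above, the variation coefficient of a user's rating sequence, is simply the ratio of the standard deviation to the mean; a minimal sketch with made-up ratings:

```python
import numpy as np

# Variation coefficient (std / mean) of a rating sequence: higher values
# indicate more diverse preferences. Ratings below are invented.
def variation_coefficient(ratings):
    ratings = np.asarray(ratings, dtype=float)
    return ratings.std() / ratings.mean()

centralized = [4, 4, 5, 4, 4, 5]   # user who rates almost everything alike
diverse = [1, 5, 2, 5, 1, 4]       # user whose ratings vary widely
print(variation_coefficient(centralized) < variation_coefficient(diverse))  # → True
```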

  5. Bearing Fault Detection Based on Empirical Wavelet Transform and Correlated Kurtosis by Acoustic Emission.

    PubMed

    Gao, Zheyu; Lin, Jing; Wang, Xiufeng; Xu, Xiaoqiang

    2017-05-24

    Rolling bearings are widely used in rotating equipment. Detection of bearing faults is of great importance to guarantee the safe operation of mechanical systems. Acoustic emission (AE), as one of the bearing monitoring technologies, is sensitive to weak signals and performs well in detecting incipient faults; therefore, AE is widely used in monitoring the operating status of rolling bearings. This paper utilizes the Empirical Wavelet Transform (EWT) to decompose AE signals into mono-components adaptively, followed by calculation of the correlated kurtosis (CK) of these components at certain time intervals. By comparing these CK values, the resonant frequency of the rolling bearing can be determined. The fault characteristic frequencies are then found by spectrum envelope. Both a simulated signal and rolling bearing AE signals are used to verify the effectiveness of the proposed method. The results show that the new method performs well in identifying the bearing fault frequency under strong background noise.
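
    Correlated kurtosis scores how strongly impulses in a signal repeat at a given period; the sketch below shows the commonly used first-order form (the abstract does not state the paper's exact CK order or interval choices, so this is an assumption):

```python
import numpy as np

# Sketch of the correlated kurtosis (CK) measure: it rewards signals
# whose impulses repeat at a known period T (e.g. a bearing fault period
# in samples). First-order CK is shown; this is an illustration, not the
# paper's exact formulation.
def correlated_kurtosis(y, T):
    y = np.asarray(y, dtype=float)
    prod = y[T:] * y[:-T]                      # pair samples one period apart
    return np.sum(prod ** 2) / np.sum(y ** 2) ** 2

T = 50                                         # assumed fault period (samples)
n = np.arange(2000)
impulses = ((n % T) == 0) * 1.0                # impulse train at period T
noise = np.random.default_rng(3).normal(scale=0.1, size=n.size)

# the periodic signal scores a higher CK at its own period than pure noise
print(correlated_kurtosis(impulses + noise, T) > correlated_kurtosis(noise, T))
```

    In the paper's workflow, CK would be evaluated on each EWT mono-component; the component maximizing CK marks the resonant band to envelope.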

  6. The study of Thai stock market across the 2008 financial crisis

    NASA Astrophysics Data System (ADS)

    Kanjamapornkul, K.; Pinčák, Richard; Bartoš, Erik

    2016-11-01

    The cohomology theory for financial markets allows us to deform the Kolmogorov space of time series data over a time period, with an explicit definition of eight market states in a grand unified theory. The anti-de Sitter space induced from a coupling behavior field among traders in the case of a financial market crash acts like a gravitational field in financial market spacetime. Under this hybrid mathematical superstructure, we redefine a behavior matrix by using Pauli matrices and a modified Wilson loop for time series data. We use it to detect the 2008 financial market crash by using the degree of the cohomology group of the sphere over a tensor field in the correlation matrix over all possible dominated stocks underlying Thai SET50 Index Futures. The empirical analysis of the financial tensor network was performed with the help of empirical mode decomposition and intrinsic time scale decomposition of the correlation matrix, and the calculation of the closeness centrality of a planar graph.

  7. Heat-transfer processes in air-cooled engine cylinders

    NASA Technical Reports Server (NTRS)

    Pinkel, Benjamin

    1938-01-01

    From a consideration of heat-transfer theory, semi-empirical expressions are set up for the transfer of heat from the combustion gases to the cylinder of an air-cooled engine and from the cylinder to the cooling air. Simple equations for the average head and barrel temperatures as functions of the important engine and cooling variables are obtained from these expressions. The expressions involve a few empirical constants, which may be readily determined from engine tests. Numerical values for these constants were obtained from single-cylinder engine tests for cylinders of the Pratt & Whitney 1535 and 1340-h engines. The equations provide a means of calculating the effect of the various engine and cooling variables on the cylinder temperatures and also of correlating the results of engine cooling tests. An example is given of the application of the equations to the correlation of cooling-test data obtained in flight.

  8. Study of Gender Differences in Performance at the U.S. Naval Academy and U.S. Coast Guard Academy

    DTIC Science & Technology

    2005-06-01

    teacher preparation. By using both qualitative and quantitative methods for pre-service teachers, Kelly concludes that most teachers could not identify... (Tabachnik and Findell, 2001). Correlational research is often a good precursor to answering other questions by empirical methods. Correlations measure the...

  9. The virulence–transmission trade-off in vector-borne plant viruses: a review of (non-)existing studies

    PubMed Central

    Froissart, R.; Doumayrou, J.; Vuillaume, F.; Alizon, S.; Michalakis, Y.

    2010-01-01

    The adaptive hypothesis invoked to explain why parasites harm their hosts is known as the trade-off hypothesis, which states that increased parasite transmission comes at the cost of shorter infection duration. This correlation arises because both transmission and disease-induced mortality (i.e. virulence) are increasing functions of parasite within-host density. There is, however, a glaring lack of empirical data to support this hypothesis. Here, we review empirical investigations reporting to what extent within-host viral accumulation determines the transmission rate and the virulence of vector-borne plant viruses. Studies suggest that the correlation between within-plant viral accumulation and transmission rate of natural isolates is positive. Unfortunately, results on the correlation between viral accumulation and virulence are very scarce. We found only very few appropriate studies testing such a correlation, themselves limited by the fact that they use symptoms as a proxy for virulence and are based on very few viral genotypes. Overall, the available evidence does not allow us to confirm or refute the existence of a transmission–virulence trade-off for vector-borne plant viruses. We discuss the type of data that should be collected and how theoretical models can help us refine testable predictions of virulence evolution. PMID:20478886

  10. ESTIMATION OF FUNCTIONALS OF SPARSE COVARIANCE MATRICES.

    PubMed

    Fan, Jianqing; Rigollet, Philippe; Wang, Weichen

    High-dimensional statistical tests often ignore correlations to gain simplicity and stability, leading to null distributions that depend on functionals of correlation matrices such as their Frobenius norm and other ℓr norms. Motivated by the computation of critical values of such tests, we investigate the difficulty of estimating functionals of sparse correlation matrices. Specifically, we show that simple plug-in procedures based on thresholded estimators of correlation matrices are sparsity-adaptive and minimax optimal over a large class of correlation matrices. Akin to previous results on functional estimation, the minimax rates exhibit an elbow phenomenon. Our results are further illustrated on simulated data as well as in an empirical study of data arising in financial econometrics.
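
    A generic sketch of the plug-in idea described above: threshold the small entries of the sample correlation matrix, then evaluate the functional of interest on the thresholded estimate. The threshold choice and the squared off-diagonal Frobenius norm below are illustrative assumptions, not the authors' exact procedure:

```python
import numpy as np

# Plug-in estimation of a correlation-matrix functional: threshold the
# sample correlation matrix entrywise, then compute the functional
# (here, the squared off-diagonal Frobenius norm ||R - I||_F^2).
rng = np.random.default_rng(2)
X = rng.normal(size=(200, 10))          # n = 200 samples of a p = 10 vector
R = np.corrcoef(X, rowvar=False)        # sample correlation matrix

tau = np.sqrt(np.log(10) / 200)         # threshold on the order sqrt(log p / n)
R_thr = np.where(np.abs(R) >= tau, R, 0.0)
np.fill_diagonal(R_thr, 1.0)            # correlations have unit diagonal

off = R_thr - np.eye(10)
frob_sq = np.sum(off ** 2)              # plug-in estimate of the functional
print(round(frob_sq, 3))
```

    For truly independent coordinates the thresholded estimate is near the identity, so the plug-in functional stays close to zero.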

  11. ESTIMATION OF FUNCTIONALS OF SPARSE COVARIANCE MATRICES

    PubMed Central

    Fan, Jianqing; Rigollet, Philippe; Wang, Weichen

    2016-01-01

    High-dimensional statistical tests often ignore correlations to gain simplicity and stability, leading to null distributions that depend on functionals of correlation matrices such as their Frobenius norm and other ℓr norms. Motivated by the computation of critical values of such tests, we investigate the difficulty of estimating functionals of sparse correlation matrices. Specifically, we show that simple plug-in procedures based on thresholded estimators of correlation matrices are sparsity-adaptive and minimax optimal over a large class of correlation matrices. Akin to previous results on functional estimation, the minimax rates exhibit an elbow phenomenon. Our results are further illustrated on simulated data as well as in an empirical study of data arising in financial econometrics. PMID:26806986

  12. Development of the diabetes typology model for discerning Type 2 diabetes mellitus with national survey data.

    PubMed

    Bellatorre, Anna; Jackson, Sharon H; Choi, Kelvin

    2017-01-01

    To classify individuals with diabetes mellitus (DM) into DM subtypes using population-based studies. Population-based survey. Individuals who participated in the 2003-2004, 2005-2006, or 2009-2010 National Health and Nutrition Examination Survey (NHANES) or the 2010 Coronary Artery Risk Development in Young Adults (CARDIA) survey (research materials obtained from the National Heart, Lung, and Blood Institute Biologic Specimen and Data Repository Information Coordinating Center). 3084, 3040, and 3318 US adults from the 2003-2004, 2005-2006, and 2009-2010 NHANES samples, respectively, and 5,115 US adults in the CARDIA cohort. We proposed the Diabetes Typology Model (DTM) through the use of six composite measures based on the Homeostatic Model Assessment (HOMA-IR, HOMA-%β, high HOMA-%S), insulin and glucose levels, and body mass index, and conducted latent class analyses to empirically classify individuals into different classes. Three empirical latent classes consistently emerged across studies (entropy = 0.81-0.998): likely Type 1 DM, likely Type 2 DM, and atypical DM. The classification has high sensitivity (75.5%), specificity (83.3%), and positive predictive value (97.4%) when validated against C-peptide level. Correlates of Type 2 DM were significantly associated with model-identified Type 2 DM. Compared to regression analysis on known correlates of Type 2 DM using all diabetes cases as outcomes, using the DTM to remove likely Type 1 DM and atypical DM cases yields a 2.5-5.3% r-square improvement in the regression analysis, as well as improved model fit, as indicated by a significant improvement in -2 log likelihood (p<0.01). Lastly, model-defined likely Type 2 DM was significantly associated with known correlates of Type 2 DM (e.g., age, waist circumference), which provides additional validation of the DTM-defined classes. Our Diabetes Typology Model reflects a promising first step toward discerning likely DM types from population-based data.
This novel tool will improve how large population-based studies can be used to examine behavioral and environmental factors associated with different types of DM.

  13. Magnetosphere-ionosphere coupling during substorm onset

    NASA Technical Reports Server (NTRS)

    Maynard, N. C.; Burke, W. J.; Erickson, G. M.; Basinka, E. M.; Yahnin, A. G.

    1996-01-01

    Through the analysis of a combination of CRRES satellite measurements and ground-based measurements, an empirical scenario was developed for the onset of substorms. The process develops from ripples at the inner edge of the plasma sheet associated with dusk to dawn excursions of the electric field, prior to the beginning of dipolarization. The importance of Poynting flux is considered. Substorms develop when significant amounts of energy flow in both directions with the second cycle stronger than the initial cycle. Pseudobreakups occur when the energy flowing in both directions is weak or out of phase. The observations indicate that the dusk to dawn excursions of the cross-tail electric field correlate with changes in currents and particle energies observed by CRRES, and with ultra low frequency wave activity observed on the ground. Magnetic signatures of field aligned current filaments, associated with the substorm current wedge were observed to be initiated by the process.

  14. Development and validation of the multidimensional state boredom scale.

    PubMed

    Fahlman, Shelley A; Mercer-Lynn, Kimberley B; Flora, David B; Eastwood, John D

    2013-02-01

    This article describes the development and validation of the Multidimensional State Boredom Scale (MSBS)-the first and only full-scale measure of state boredom. It was developed based on a theoretically and empirically grounded definition of boredom. A five-factor structure of the scale (Disengagement, High Arousal, Low Arousal, Inattention, and Time Perception) was supported by exploratory factor analyses and confirmatory factor analyses of two independent samples. Furthermore, all subscales were significantly related to a single, second-order factor. The MSBS factor structure was shown to be invariant across gender. MSBS scores were significantly correlated with measures of trait boredom, depression, anxiety, anger, inattention, impulsivity, neuroticism, life satisfaction, and purpose in life. Finally, MSBS scores distinguished between participants who were experimentally manipulated into a state of boredom and those who were not, above and beyond measures of trait boredom, negative affect, and depression.

  15. Fuel Performance Experiments and Modeling: Fission Gas Bubble Nucleation and Growth in Alloy Nuclear Fuels

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McDeavitt, Sean; Shao, Lin; Tsvetkov, Pavel

    2014-04-07

    Advanced fast reactor systems being developed under the DOE's Advanced Fuel Cycle Initiative are designed to destroy TRU isotopes generated in existing and future nuclear energy systems. Over the past 40 years, multiple experiments and demonstrations have been completed using U-Zr, U-Pu-Zr, U-Mo, and other metal alloys. As a result, multiple empirical and semi-empirical relationships have been established to develop empirical performance modeling codes. Many mechanistic questions about fission gas mobility, bubble coalescence, and gas release have been answered through industrial experience, research, and empirical understanding. The advent of modern computational materials science, however, opens new doors of development, such that physics-based multi-scale models may be developed to enable a new generation of predictive fuel performance codes that are not limited by empiricism.

  16. Redundant correlation effect on personalized recommendation

    NASA Astrophysics Data System (ADS)

    Qiu, Tian; Han, Teng-Yue; Zhong, Li-Xin; Zhang, Zi-Ke; Chen, Guang

    2014-02-01

    The high-order redundant correlation effect is investigated for a hybrid algorithm of heat conduction and mass diffusion (HHM), through both heat conduction biased (HCB) and mass diffusion biased (MDB) correlation redundancy elimination processes. The HCB and MDB algorithms do not introduce any additional tunable parameters, but keep the simple character of the original HHM. Based on two empirical datasets, the Netflix and MovieLens, the HCB and MDB are found to show better recommendation accuracy for both the overall objects and the cold objects than the HHM algorithm. Our work suggests that properly eliminating the high-order redundant correlations can provide a simple and effective approach to accurate recommendation.

  17. A First Look at Electric Motor Noise For Future Propulsion Systems

    NASA Technical Reports Server (NTRS)

    Huff, Dennis L.; Henderson, Brenda S.; Envia, Edmane

    2016-01-01

    Motor tone predictions, based on a vibration analysis and input from design parameters for high-power-density motors, show that the noise can be significantly higher or lower than the empirical correlations suggest and can exceed their stated uncertainty.

  18. An empirical and model study on automobile market in Taiwan

    NASA Astrophysics Data System (ADS)

    Tang, Ji-Ying; Qiu, Rong; Zhou, Yueping; He, Da-Ren

    2006-03-01

    We have carried out an empirical investigation of the automobile market in Taiwan, including the development of the possession rate of the companies in the market from 1979 to 2003, the development of the largest possession rate, and so on. A dynamic model for describing the competition between the companies is suggested based on the empirical study. In the model, each company is given a long-term competition factor (such as technology, capital, and scale) and a short-term competition factor (such as management, service, and advertisement). The companies then play games in order to obtain more possession rate in the market under certain rules. Numerical simulations based on the model display a competition development process that agrees qualitatively and quantitatively with our empirical investigation results.

  19. Formulation, Implementation and Validation of a Two-Fluid model in a Fuel Cell CFD Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jain, Kunal; Cole, J. Vernon; Kumar, Sanjiv

    2008-12-01

    Water management is one of the main challenges in PEM fuel cells. While water is essential for membrane electrical conductivity, excess liquid water leads to flooding of the catalyst layers. Although accurate prediction of two-phase transport is key for optimal water management, understanding of two-phase transport in fuel cells is relatively poor. Wang et al. have studied the two-phase transport in the channel and the diffusion layer separately using a multiphase mixture model; the model fails to accurately predict saturation values for high-humidity inlet streams. Nguyen et al. developed a two-dimensional, two-phase, isothermal, isobaric, steady-state model of the catalyst and gas diffusion layers, which neglects any liquid in the channel. Djilali et al. developed a three-dimensional two-phase multicomponent model; it is an improvement over previous models but neglects drag between the liquid and gas phases in the channel. In this work, we present a comprehensive two-fluid model relevant to fuel cells. Models for two-phase transport through the channel, the gas diffusion layer (GDL), and the channel-GDL interface are discussed. In the channel, the gas and liquid pressures are assumed to be the same. Surface tension effects in the channel are incorporated using the continuum surface force (CSF) model: the force at the surface is expressed as a volumetric body force and added as a source to the momentum equation. In the GDL, the gas and liquid are assumed to be at different pressures, and the difference between them (the capillary pressure) is calculated using empirical correlations. At the channel-GDL interface, wall adhesion effects need to be taken into account. SIMPLE-type methods recast the continuity equation into a pressure-correction equation, the solution of which then provides corrections for velocities and pressures.
However, in the two-fluid model, the presence of two phasic continuity equations gives more freedom and more complications. A general approach is to form a mixture continuity equation by linearly combining the phasic continuity equations using appropriate weighting factors. Analogous to the mixture equation for the pressure correction, a difference equation is used for the volume/phase fraction by taking the difference between the phasic continuity equations. The relative advantages of the above-mentioned algorithmic variants for computing the pressure correction and volume fractions are discussed and quantitatively assessed. Preliminary model validation is done for each component of the fuel cell. The two-phase transport in the channel is validated using empirical correlations; transport in the GDL is validated against results obtained from LBM and VOF simulation techniques. The channel-GDL interface transport will be validated against experiment and an empirical correlation for droplet detachment at the interface.

  20. Body surface assessment with 3D laser-based anthropometry: reliability, validation, and improvement of empirical surface formulae.

    PubMed

    Kuehnapfel, Andreas; Ahnert, Peter; Loeffler, Markus; Scholz, Markus

    2017-02-01

    Body surface area is a physiological quantity relevant for many medical applications. In clinical practice, it is determined by empirical formulae. 3D laser-based anthropometry provides an easy and effective way to measure body surface area but is not ubiquitously available. We used laser-based anthropometry data from a population-based study to assess the validity of published and commonly used empirical formulae. We performed a large population-based study on adults collecting classical anthropometric measurements and 3D body surface assessments (N = 1435). We determined the reliability of the 3D body surface assessment and the validity of 18 different empirical formulae proposed in the literature. The performance of these formulae is studied in subsets of sex and BMI. Finally, improvements of the parameter settings of the formulae and adjustments for sex and BMI were considered. 3D body surface measurements show excellent intra- and inter-rater reliability of 0.998 (the overall concordance correlation coefficient, OCCC, was used as the measure of agreement). The empirical formulae of Fujimoto and Watanabe, of Shuter and Aslani, and of Sendroy and Cecchini performed best, with excellent concordance (OCCC > 0.949) even in subgroups of sex and BMI. Re-parametrization of the formulae and adjustment for sex and BMI slightly improved the results. In adults, 3D laser-based body surface assessment is a reliable alternative to estimation by empirical formulae; however, some empirical formulae show excellent results even in subgroups of sex and BMI, with only little room for improvement.
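
    For context, one of the classic empirical body-surface-area formulae, that of Du Bois and Du Bois (1916), takes weight and height to fixed powers; whether it ranks among the best performers in the study above is not stated in the abstract:

```python
# Du Bois & Du Bois (1916) empirical body surface area formula, shown
# for illustration; the study above compares 18 such formulae against
# 3D laser scans.
def bsa_du_bois(weight_kg, height_cm):
    """Body surface area in m^2."""
    return 0.007184 * weight_kg ** 0.425 * height_cm ** 0.725

print(round(bsa_du_bois(70, 175), 2))  # a 70 kg, 175 cm adult → 1.85
```

    Re-parametrization of such a formula, as the study considers, amounts to refitting the coefficient and the two exponents against the scan data.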

  1. Empirical Mining of Large Data Sets Already Helps to Solve Practical Ecological Problems; A Panoply of Working Examples (Invited)

    NASA Astrophysics Data System (ADS)

    Hargrove, W. W.; Hoffman, F. M.; Kumar, J.; Spruce, J.; Norman, S. P.

    2013-12-01

    Here we present diverse examples where empirical mining and statistical analysis of large data sets have already been shown to be useful for a wide variety of practical decision-making problems within the realm of large-scale ecology. Because a full understanding and appreciation of particular ecological phenomena are possible only after hypothesis-directed research regarding the existence and nature of that process, some ecologists may feel that purely empirical data harvesting may represent a less-than-satisfactory approach. Restricting ourselves exclusively to process-driven approaches, however, may actually slow progress, particularly for more complex or subtle ecological processes. We may not be able to afford the delays caused by such directed approaches. Rather than attempting to formulate and ask every relevant question correctly, empirical methods allow trends, relationships and associations to emerge freely from the data themselves, unencumbered by a priori theories, ideas and prejudices that have been imposed upon them. Although they cannot directly demonstrate causality, empirical methods can be extremely efficient at uncovering strong correlations with intermediate "linking" variables. In practice, these correlative structures and linking variables, once identified, may provide sufficient predictive power to be useful themselves. Such correlation "shadows" of causation can be harnessed by, e.g., Bayesian Belief Nets, which bias ecological management decisions, made with incomplete information, toward favorable outcomes. Empirical data-harvesting also generates a myriad of testable hypotheses regarding processes, some of which may even be correct. 
Quantitative statistical regionalizations based on quantitative multivariate similarity have lent insights into carbon eddy-flux direction and magnitude, wildfire biophysical conditions, phenological ecoregions useful for vegetation type mapping and monitoring, forest disease risk maps (e.g., sudden oak death), global aquatic ecoregion risk maps for aquatic invasives, and forest vertical structure ecoregions (e.g., using extensive LiDAR data sets). Multivariate Spatio-Temporal Clustering, which quantitatively places alternative future conditions on a common footing with present conditions, allows prediction of present and future shifts in tree species ranges, given alternative climatic change forecasts. ForWarn, a forest disturbance detection and monitoring system mining 12 years of national 8-day MODIS phenology data, has been operating since 2010, producing national maps every 8 days showing many kinds of potential forest disturbances. Forest resource managers can view disturbance maps via a web-based viewer, and alerts are issued when particular forest disturbances are seen. Regression-based decadal trend analysis showing long-term forest thrive and decline areas, and individual-based, brute-force supercomputing to map potential movement corridors and migration routes across landscapes will also be discussed. As significant ecological changes occur with increasing rapidity, such empirical data-mining approaches may be the most efficient means to help land managers find the best, most-actionable policies and decision strategies.

  2. Correlates of STI testing among vocational school students in the Netherlands

    PubMed Central

    2010-01-01

    Background Adolescents are at risk for acquiring sexually transmitted infections (STIs). However, test rates among adolescents in the Netherlands are low and effective interventions that encourage STI testing are scarce. Adolescents who attend vocational schools are particularly at risk for STI. The purpose of this study is to inform the development of motivational health promotion messages by identifying the psychosocial correlates of STI testing intention among adolescents with sexual experience attending vocational schools. Methods This study was conducted among 501 students attending vocational schools aged 16 to 25 years (mean 18.3 years ± 2.1). Data were collected via a web-based survey exploring relationships, sexual behavior and STI testing behavior. Items measuring the psychosocial correlates of testing were derived from Fishbein's Integrative Model. Data were subjected to multiple regression analyses. Results Students reported substantial sexual risk behavior and low intention to participate in STI testing. The model explained 39% of intention to engage in STI testing. The most important predictor was attitude. Perceived norms, perceived susceptibility and test site characteristics were also significant predictors. Conclusions The present study provides important and relevant empirical input for the development of health promotion interventions aimed at motivating adolescents at vocational schools in the Netherlands to participate in STI testing. Health promotion interventions developed for this group should aim to change attitudes, address social norms and increase personal risk perception for STI while also promoting the accessibility of testing facilities. PMID:21106064

  3. Does Branding Need Web Usability? A Value-Oriented Empirical Study

    NASA Astrophysics Data System (ADS)

    Bolchini, Davide; Garzotto, Franca; Sorce, Fabio

Does usability of a web-based communication artifact affect brand, i.e., the set of beliefs, emotions, attitudes, or qualities that people mentally associate with the entity behind that artifact? Intuitively, the answer is “yes”: usability is a fundamental aspect of the quality of the experience with a website, and a “good” experience with a “product” or its reifications tends to translate into “good” brand perception. To date, however, the existence of a connection between web usability and brand perception has been shown only through anecdotal arguments, and is not supported by published systematic research. This paper discusses a study that empirically investigates this correlation in a more rigorous, analytical, and replicable way. Our main contribution is twofold: on the one hand, we provide empirical evidence for the heuristic principle that web usability influences branding, and we do that through four between-subjects controlled experiments that involved 120 subjects. On the other hand, we inform the study with a systematic value-oriented approach to the user experience, and thus provide a conceptual framework that can be reused in other experimental settings, either for replicating our study, or for designing similar studies focusing on the correlation of web branding vs. design factors other than usability.

  4. Molecular basis of quantitative structure-properties relationships (QSPR): a quantum similarity approach.

    PubMed

    Ponec, R; Amat, L; Carbó-Dorca, R

    1999-05-01

Since the dawn of quantitative structure-properties relationships (QSPR), empirical parameters related to structural, electronic and hydrophobic molecular properties have been used as molecular descriptors to determine such relationships. Among all these parameters, Hammett sigma constants and the logarithm of the octanol-water partition coefficient, log P, have been massively employed in QSPR studies. In the present paper, a new molecular descriptor, based on quantum similarity measures (QSM), is proposed as a general substitute for these empirical parameters. This work continues previous analyses related to the application of QSM to QSPR, introducing molecular quantum self-similarity measures (MQS-SM) as a single working parameter in some cases. The use of MQS-SM as a molecular descriptor is first confirmed from the correlation with the aforementioned empirical parameters. The Hammett equation has been examined using MQS-SM for a series of substituted carboxylic acids. Then, for a series of aliphatic alcohols and acetic acid esters, log P values have been correlated with the self-similarity measure between density functions in water and octanol of a given molecule. And finally, some examples and applications of MQS-SM to determine QSAR are presented. In all studied cases MQS-SM appeared to be excellent molecular descriptors usable in general QSPR applications of chemical interest.

  5. Molecular basis of quantitative structure-properties relationships (QSPR): A quantum similarity approach

    NASA Astrophysics Data System (ADS)

Ponec, Robert; Amat, Lluís; Carbó-Dorca, Ramon

    1999-05-01

Since the dawn of quantitative structure-properties relationships (QSPR), empirical parameters related to structural, electronic and hydrophobic molecular properties have been used as molecular descriptors to determine such relationships. Among all these parameters, Hammett σ constants and the logarithm of the octanol-water partition coefficient, log P, have been massively employed in QSPR studies. In the present paper, a new molecular descriptor, based on quantum similarity measures (QSM), is proposed as a general substitute for these empirical parameters. This work continues previous analyses related to the application of QSM to QSPR, introducing molecular quantum self-similarity measures (MQS-SM) as a single working parameter in some cases. The use of MQS-SM as a molecular descriptor is first confirmed from the correlation with the aforementioned empirical parameters. The Hammett equation has been examined using MQS-SM for a series of substituted carboxylic acids. Then, for a series of aliphatic alcohols and acetic acid esters, log P values have been correlated with the self-similarity measure between density functions in water and octanol of a given molecule. And finally, some examples and applications of MQS-SM to determine QSAR are presented. In all studied cases MQS-SM appeared to be excellent molecular descriptors usable in general QSPR applications of chemical interest.

  6. Software Development Management: Empirical and Analytical Perspectives

    ERIC Educational Resources Information Center

    Kang, Keumseok

    2011-01-01

    Managing software development is a very complex activity because it must deal with people, organizations, technologies, and business processes. My dissertation consists of three studies that examine software development management from various perspectives. The first study empirically investigates the impacts of prior experience with similar…

  7. Globalization, Development and International Migration: A Cross-National Analysis of Less-Developed Countries, 1970-2000

    ERIC Educational Resources Information Center

    Sanderson, Matthew R.; Kentor, Jeffrey D.

    2009-01-01

    It is widely argued that globalization and economic development are associated with international migration. However, these relationships have not been tested empirically. We use a cross-national empirical analysis to assess the impact of global and national factors on international migration from less-developed countries. An interdisciplinary…

  8. Method for stationarity-segmentation of spike train data with application to the Pearson cross-correlation.

    PubMed

    Quiroga-Lombard, Claudio S; Hass, Joachim; Durstewitz, Daniel

    2013-07-01

Correlations among neurons are supposed to play an important role in computation and information coding in the nervous system. Empirically, functional interactions between neurons are most commonly assessed by cross-correlation functions. Recent studies have suggested that pairwise correlations may indeed be sufficient to capture most of the information present in neural interactions. Many applications of correlation functions, however, implicitly tend to assume that the underlying processes are stationary. This assumption will usually fail for real neurons recorded in vivo since their activity during behavioral tasks is heavily influenced by stimulus-, movement-, or cognition-related processes as well as by more general processes like slow oscillations or changes in state of alertness. To address the problem of nonstationarity, we introduce a method for assessing stationarity empirically and then "slicing" spike trains into stationary segments according to the statistical definition of weak-sense stationarity. We examine pairwise Pearson cross-correlations (PCCs) under both stationary and nonstationary conditions and identify another source of covariance that can be differentiated from the covariance of the spike times and emerges as a consequence of residual nonstationarities after the slicing process: the covariance of the firing rates defined on each segment. Based on this, a correction of the PCC is introduced that accounts for the effect of segmentation. We probe these methods both on simulated data sets and on in vivo recordings from the prefrontal cortex of behaving rats. Beyond removing nonstationarities, the present method may also be used for detecting significant events in spike trains.
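The effect described here, slow joint rate changes contaminating the spike-time covariance, can be sketched with a toy segmentation: a shared rate shift inflates the global Pearson cross-correlation, while averaging the PCC over approximately stationary segments largely removes it. This is an illustrative sketch, not the authors' exact correction:

```python
import random
from statistics import mean

def pearson(x, y):
    """Plain Pearson correlation coefficient of two equal-length sequences."""
    mx, my = mean(x), mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

def segmented_pcc(x, y, seg_len):
    """Average the PCC over consecutive segments: rate changes that span
    segments no longer contribute to the estimate."""
    rs = [pearson(x[i:i + seg_len], y[i:i + seg_len])
          for i in range(0, len(x) - seg_len + 1, seg_len)]
    return mean(rs)

# Two binned spike-count series that share a slow rate shift but are
# otherwise independent: the shift alone drives the global PCC.
random.seed(1)
drift = [0] * 50 + [8] * 50
x = [d + random.randint(0, 3) for d in drift]
y = [d + random.randint(0, 3) for d in drift]

r_global = pearson(x, y)          # inflated by the shared rate shift
r_seg = segmented_pcc(x, y, 50)   # much closer to the true (zero) coupling
```

Here the segment boundary is chosen by hand; in the paper, segments are derived from an empirical weak-sense stationarity test.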

  9. The Thurgood Marshall School of Law Empirical Findings: A Report of the Correlational Analysis of Bar Passing Rates and Final GPA of Years 2005-2009

    ERIC Educational Resources Information Center

    Kadhi, T.; Holley, D.; Palasota, A.; Garrison, P.; Green, T.

    2010-01-01

The following analysis investigates the correlational relationship (R) between bar passing rates and GPAs for the years 2005-2009. The report examines whether there are any significant relationships between the three variables (Bar Pass/Fail/Unknown, Overall GPA, and Bar GPA). The following procedures…

  10. Forecasting F10.7 with Solar Magnetic Flux Transport Modeling (Postprint)

    DTIC Science & Technology

    2012-04-03

...within 6 hours of the F10.7 measurements during the years 1993 through 2010, the Spearman correlation coefficient, rs, for an empirical model of...estimation of the Earth-side solar magnetic field distribution used to forecast F10.7. Spearman correlation values of approximately 0.97, 0.95, and 0.93 are

  11. Substituent Effects on Thermal Decolorization Rates of Bisbenzospiropyrans

    PubMed Central

    Lu, Nina T.; Nguyen, Vi N.; Kumar, Satish; McCurdy, Alison

    2009-01-01

    A novel application of photochromic molecules is to mimic physiological oscillatory calcium signals by reversibly binding and releasing calcium ions in response to light. Substituent changes on the largely unexplored photochromic bisbenzospiropyran scaffold led to significant changes in thermal fading rates in several organic solvents. Excellent correlations have been found between fading rates and empirical Hammett constants as well as calculated ground-state energies. These correlations can be used to improve scaffold design. PMID:16238356

  12. Characterizing Cyclostationary Features of Digital Modulated Signals with Empirical Measurements using Spectral Correlation Function

    DTIC Science & Technology

    2011-06-01

Thesis, Mujun Song, Captain, ROKA, AFIT/GCE/ENG/11-09, Air Force Institute of Technology. ...generator, Agilent E4438C, ESG Vector Signal Generator. Universal Software Radio Peripheral 2 (USRP2), which is a Software Defined Radio (SDR), is used

  13. Cross-correlations between crude oil and exchange markets for selected oil rich economies

    NASA Astrophysics Data System (ADS)

    Li, Jianfeng; Lu, Xinsheng; Zhou, Ying

    2016-07-01

Using multifractal detrended cross-correlation analysis (MF-DCCA), this paper studies the cross-correlation behavior between the crude oil market and five selected exchange rate markets. The dataset covers the period January 1, 1996 to December 31, 2014, and contains 4,633 observations for each of the series, including daily closing prices of crude oil, Australian Dollars, Canadian Dollars, Mexican Pesos, Russian Rubles, and South African Rand. Our empirical results obtained from the cross-correlation statistic and cross-correlation coefficient have confirmed the existence of cross-correlations, and the MF-DCCA results have demonstrated strong multifractality between the cross-correlated crude oil market and exchange rate markets in both the short term and the long term. Using rolling window analysis, we have also found persistent cross-correlations between the exchange rates and crude oil returns, and the cross-correlation scaling exponents exhibit volatility during some time periods owing to their sensitivity to sudden events.
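A building block of the MF-DCCA machinery is the detrended cross-correlation coefficient at a single box size s (in the style of Zebende's rho_DCCA): integrate both series, linearly detrend within non-overlapping boxes, and normalise the detrended covariance by the two detrended variances. The multifractal q-order generalisation and the scaling analysis over many box sizes are omitted in this sketch:

```python
import random

def _linfit_residuals(seg):
    """Residuals of an ordinary least-squares straight-line fit to seg."""
    n = len(seg)
    tm = (n - 1) / 2.0
    ym = sum(seg) / n
    stt = sum((t - tm) ** 2 for t in range(n))
    slope = sum((t - tm) * (v - ym) for t, v in enumerate(seg)) / stt
    return [v - (ym + slope * (t - tm)) for t, v in enumerate(seg)]

def dcca_coefficient(x, y, s):
    """Detrended cross-correlation coefficient at box size s: the q = 2,
    single-scale core of MF-DCCA."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    X, Y, cx, cy = [], [], 0.0, 0.0
    for a, b in zip(x, y):          # integrated (profile) series
        cx += a - mx
        cy += b - my
        X.append(cx)
        Y.append(cy)
    fxy = fxx = fyy = 0.0
    for i in range(0, len(X) - s + 1, s):
        rx = _linfit_residuals(X[i:i + s])
        ry = _linfit_residuals(Y[i:i + s])
        fxy += sum(a * b for a, b in zip(rx, ry))
        fxx += sum(a * a for a in rx)
        fyy += sum(b * b for b in ry)
    return fxy / (fxx * fyy) ** 0.5

# Identical series are perfectly cross-correlated; negated series are
# perfectly anti-correlated.
random.seed(2)
x = [random.gauss(0.0, 1.0) for _ in range(256)]
rho_same = dcca_coefficient(x, x[:], 16)
rho_anti = dcca_coefficient(x, [-v for v in x], 16)
```

In an MF-DCCA study this coefficient (or the fluctuation functions behind it) would be computed across a range of box sizes and moment orders q to extract the multifractal scaling exponents.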

  14. Brief Communication: A Simplified Approach to Transient Convective Droplet Evaporation and Burning

    NASA Technical Reports Server (NTRS)

    Madooglu, K.; Karagozian, A. R.

    1994-01-01

Empirical correlations for evaporation rates from single fuel droplets have existed since the 1930s. These correlations, which will be referred to in this article as Froessling/Ranz-Marshall types of correlations, are appropriate to the special cases of steady-state evaporation in the absence of chemical reaction. In a previous article by the authors, the quasi-steady evaporation and burning processes associated with a fuel drop in a convective environment are examined through a droplet model based on the boundary layer approach. For droplet Reynolds numbers of practical interest, this model produces very reasonable steady-state as well as quasi-time-dependent droplet simulations, requiring relatively short computational times and yielding good agreement with the above-mentioned empirical correlations. The steady-state case, however, is usually relevant to practical combustor situations only when the drop has reached a nearly uniform temperature, since the heating process of the drop cannot be considered to be quasi-steady. In the present study, the transient heating process of the droplet interior during evaporation and/or burning is taken into account, and thus calculations pertaining to the entire lifetime of the droplet are carried out. It is of particular interest here to obtain simplified correlations to describe the transient behavior of evaporating and burning droplets; these may be incorporated with greater ease into spray calculations. Accordingly, we have chosen to use stagnation conditions in the present model in a modification of the Froessling/Ranz-Marshall correlations. These modified correlations, incorporating an effective transfer number, produce a fairly accurate representation of droplet evaporation and burning, while requiring only one-tenth of the computational effort used in a full boundary layer solution.
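The Ranz-Marshall form referred to above is Nu = 2 + 0.6 Re^(1/2) Pr^(1/3), with the mass-transfer analogue Sh = 2 + 0.6 Re^(1/2) Sc^(1/3). The sketch below shows how such a correlation enters a quasi-steady evaporation-rate estimate; the ln(1 + B_M) Spalding factor stands in here for the "effective transfer number" idea, and this is an illustrative textbook form, not the paper's exact modified correlation:

```python
import math

def nusselt_rm(re: float, pr: float) -> float:
    """Ranz-Marshall: Nu = 2 + 0.6 Re^(1/2) Pr^(1/3).
    Reduces to Nu = 2 (pure conduction) for a quiescent droplet (Re = 0)."""
    return 2.0 + 0.6 * math.sqrt(re) * pr ** (1.0 / 3.0)

def sherwood_rm(re: float, sc: float) -> float:
    """Mass-transfer analogue: Sh = 2 + 0.6 Re^(1/2) Sc^(1/3)."""
    return 2.0 + 0.6 * math.sqrt(re) * sc ** (1.0 / 3.0)

def evap_rate(d: float, rho_g: float, diff: float,
              re: float, sc: float, b_m: float) -> float:
    """Quasi-steady droplet evaporation rate [kg/s]:
    m_dot = pi * d * rho_g * D * Sh * ln(1 + B_M),
    where d is droplet diameter, rho_g and D are gas-film density and
    vapor diffusivity, and B_M is the Spalding mass-transfer number."""
    return math.pi * d * rho_g * diff * sherwood_rm(re, sc) * math.log1p(b_m)
```

The convective enhancement over a still drop is simply Sh/2, which is why the correlation matters most at the droplet Reynolds numbers of practical interest mentioned in the abstract.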

  15. The costs of the soviet empire.

    PubMed

    Wolf, C

    1985-11-29

    A comprehensive framework is developed and applied to estimate the economic costs incurred by the Soviet Union in acquiring, maintaining, and expanding its empire. The terms "empire" and "costs" are explicitly defined. Between 1971 and 1980, the average ratio between empire costs and Soviet gross national product was about 3.5 percent; as a ratio to Soviet military spending, empire costs averaged about 28 percent. The burden imposed on Soviet economic growth by empire costs is also considered, as well as rates of change in these costs, and the important political, military, and strategic benefits associated by the Soviet leadership with maintenance and expansion of the empire. Prospective empire costs and changes in Soviet economic constraints resulting from the declining performance of the domestic economy are also considered.

  16. Development of Alabama traffic factors for use in mechanistic-empirical pavement design.

    DOT National Transportation Integrated Search

    2015-02-01

The pavement engineering community is moving toward design practices that use mechanistic-empirical (M-E) approaches to the design and analysis of pavement structures. This effort is embodied in the Mechanistic-Empirical Pavement Design Guide (MEPD...

  17. Does Mandatory Attendance Improve Student Performance?

    ERIC Educational Resources Information Center

    Marburger, Daniel R.

    2006-01-01

    Previous empirical literature indicates that student performance is inversely correlated with absenteeism. The author investigates the impact of enforcing an attendance policy on absenteeism and student performance. The evidence suggests that an enforced mandatory attendance policy significantly reduces absenteeism and improves exam performance.

  18. DOTD implements soil measuring device to increase life of pavements : fact sheet.

    DOT National Transportation Integrated Search

    2011-11-01

The resilient modulus (Mr) of pavement materials and subgrades is an important input parameter for the design of pavement structures. Highway agencies have sought different surrogates. Various empirical correlations have been used to p...

  19. New calibration algorithms for dielectric-based microwave moisture sensors

    USDA-ARS?s Scientific Manuscript database

New calibration algorithms for determining moisture content in granular and particulate materials from measurement of the dielectric properties at a single microwave frequency are proposed. The algorithms are based on empirically identifying correlations between the dielectric properties and the par...

  20. Burst and inter-burst duration statistics as empirical test of long-range memory in the financial markets

    NASA Astrophysics Data System (ADS)

    Gontis, V.; Kononovicius, A.

    2017-10-01

We address the problem of long-range memory in the financial markets. There are two conceptually different ways to reproduce power-law decay of the auto-correlation function: using fractional Brownian motion as well as non-linear stochastic differential equations. In this contribution we address this problem by analyzing empirical return and trading activity time series from the Forex. From the empirical time series we obtain probability density functions of burst and inter-burst duration. Our analysis reveals that the power-law exponents of the obtained probability density functions are close to 3/2, which is a characteristic feature of one-dimensional stochastic processes. This is in good agreement with the earlier proposed model of absolute return based on non-linear stochastic differential equations derived from the agent-based herding model.
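Burst and inter-burst durations are simply the lengths of consecutive runs of a series above and below a threshold. A minimal extractor, from which the empirical duration distributions can then be built:

```python
def burst_durations(series, threshold):
    """Return (bursts, gaps): durations of consecutive runs at or above
    the threshold (bursts) and of runs below it (inter-burst gaps)."""
    bursts, gaps = [], []
    run, above = 0, series[0] >= threshold
    for v in series:
        now = v >= threshold
        if now == above:
            run += 1
        else:
            (bursts if above else gaps).append(run)  # close the finished run
            run, above = 1, now
    (bursts if above else gaps).append(run)          # close the final run
    return bursts, gaps
```

Applied to absolute returns or trading activity with a suitable threshold, histograms of these durations give the probability density functions whose power-law tails (exponent near 3/2) are analysed in the abstract.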

  1. Asymmetric interaction and indeterminate fitness correlation between cooperative partners in the fig–fig wasp mutualism

    PubMed Central

    Wang, Rui-Wu; Sun, Bao-Fa; Zheng, Qi; Shi, Lei; Zhu, Lixing

    2011-01-01

Empirical observations have shown that cooperative partners can compete for common resources, but what factors determine whether partners cooperate or compete remain unclear. Using the reciprocal fig–fig wasp mutualism, we show that nonlinear amplification of interference competition between fig wasps—which limits the fig wasps' ability to use a common resource (i.e. female flowers)—keeps the common resource unsaturated, making cooperation locally stable. When interference competition was manually prevented, the fitness correlation between figs and fig wasps went from positive to negative. This indicates that genetic relatedness or reciprocal exchange between cooperative players, which could create spatial heterogeneity or self-restraint, was not sufficient to maintain stable cooperation. Moreover, our analysis of field-collected data shows that the fitness correlation between cooperative partners varies stochastically, and that the mainly positive fitness correlation observed during the warm season shifts to a negative correlation during the cold season owing to an increase in the initial oviposition efficiency of each fig wasp. This implies that discriminative sanctioning of less-cooperative wasps by the fig (i.e. decreasing the egg deposition efficiency per fig wasp), combined with rewarding of cooperative wasps, a control of the initial value, will facilitate a stable mutualism. Our finding that asymmetric interaction leads to an indeterminate fitness correlation between symbiont (i.e. cooperative actor) and host (i.e. recipient) has the potential to explain why conflict has been empirically observed in both well-documented intraspecific and interspecific cooperation systems. PMID:21490005

  2. Identification of AR(I)MA processes for modelling temporal correlations of GPS observations

    NASA Astrophysics Data System (ADS)

    Luo, X.; Mayer, M.; Heck, B.

    2009-04-01

In many geodetic applications observations of the Global Positioning System (GPS) are routinely processed by means of the least-squares method. However, this algorithm delivers reliable estimates of unknown parameters and realistic accuracy measures only if both the functional and stochastic models are appropriately defined within GPS data processing. One deficiency of the stochastic model used in many GPS software products is the neglect of temporal correlations of GPS observations. In practice the knowledge of the temporal stochastic behaviour of GPS observations can be improved by analysing time series of residuals resulting from the least-squares evaluation. This paper presents an approach based on the theory of autoregressive (integrated) moving average (AR(I)MA) processes to model temporal correlations of GPS observations using time series of observation residuals. A practicable integration of AR(I)MA models in GPS data processing requires the determination of the order parameters of AR(I)MA processes at first. In case of GPS, the identification of AR(I)MA processes could be affected by various factors impacting GPS positioning results, e.g. baseline length, multipath effects, observation weighting, or weather variations. The influences of these factors on AR(I)MA identification are empirically analysed based on a large amount of representative residual time series resulting from differential GPS post-processing using 1-Hz observation data collected within the permanent SAPOS® (Satellite Positioning Service of the German State Survey) network. Both short and long time series are modelled by means of AR(I)MA processes. The final order parameters are determined based on the whole residual database; the corresponding empirical distribution functions illustrate that multipath and weather variations seem to affect the identification of AR(I)MA processes much more significantly than baseline length and observation weighting. 
Additionally, the modelling results of temporal correlations using high-order AR(I)MA processes are compared with those obtained by means of first-order autoregressive (AR(1)) processes and empirically estimated autocorrelation functions.
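For the simplest member of the AR(I)MA family, identification reduces to estimating the lag-1 autocorrelation (the Yule-Walker AR(1) estimate) and checking that the residuals are approximately white. A sketch on simulated residual-like data (not the SAPOS® series used in the paper):

```python
import random

def acf(x, lag):
    """Sample autocorrelation of x at the given lag."""
    n = len(x)
    m = sum(x) / n
    c0 = sum((v - m) ** 2 for v in x) / n
    ck = sum((x[t] - m) * (x[t + lag] - m) for t in range(n - lag)) / n
    return ck / c0

def fit_ar1(x):
    """Yule-Walker AR(1): phi = rho(1). If AR(1) is adequate, the
    innovations e_t = x_t - phi * x_{t-1} should be close to white noise."""
    phi = acf(x, 1)
    resid = [x[t] - phi * x[t - 1] for t in range(1, len(x))]
    return phi, resid

# Simulated residual series from an AR(1) process with phi = 0.8.
random.seed(7)
x, prev = [], 0.0
for _ in range(2000):
    prev = 0.8 * prev + random.gauss(0.0, 1.0)
    x.append(prev)

phi_hat, resid = fit_ar1(x)
```

Selecting among higher-order AR(I)MA candidates, as done in the paper, would repeat this fit for each candidate order and compare the models via an information criterion such as AIC.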

  3. Work engagement, work commitment and their association with well-being in health care.

    PubMed

    Kanste, Outi

    2011-12-01

The aim was to examine whether work engagement and work commitment can be empirically discriminated and how they are associated with well-being. The terminology used in the literature and in practice is confused by the interchangeable use of these terms. Only a few studies, like Hallberg and Schaufeli's study, have examined the relationships between work engagement and work commitment systematically by using empirical data. In this study, the data were gathered via a self-reported questionnaire from the healthcare staff working in 14 health centres and four hospitals in Finland. The data consisted of 435 responses. The material was analysed by using structural equation modelling (SEM) and correlations. The items of the work engagement and work commitment dimensions (identification with organization, willingness to exert in organization's favour, occupational commitment and job involvement) loaded on their own latent variables in the SEM analysis, so the data supported this five-factor model. Work engagement and work commitment dimensions were positively related, sharing between 2 and 33% of their variances. These constructs also displayed different correlations with some indicators of well-being measured as personal accomplishment, psychological well-being, mental resources, internal work motivation and willingness to stay on at work. Work engagement had a moderate positive correlation to personal accomplishment (r = 0.68, p < 0.001). Identification with organization (r = 0.40, p < 0.001), willingness to exert in organization's favour (r = 0.44, p < 0.001) and occupational commitment (r = 0.37, p < 0.001) had low correlations to personal accomplishment. The results support the notion that work engagement can be empirically discriminated from work commitment. They are distinct, yet related constructs that complement each other, describing different aspects of positive attitudes towards work. 
The results can be utilized in interventions aimed at quality of working life in health care as well as in studies investigating discriminant and construct validity. © 2011 The Author. Scandinavian Journal of Caring Sciences © 2011 Nordic College of Caring Science.

  4. A consistent hierarchy of generalized kinetic equation approximations to the master equation applied to surface catalysis.

    PubMed

    Herschlag, Gregory J; Mitran, Sorin; Lin, Guang

    2015-06-21

We develop a hierarchy of approximations to the master equation for systems that exhibit translational invariance and finite-range spatial correlation. Each approximation within the hierarchy is a set of ordinary differential equations that considers spatial correlations of varying lattice distance; the assumption is that the full system will have finite spatial correlations and thus the behavior of the models within the hierarchy will approach that of the full system. We provide evidence of this convergence in the context of one- and two-dimensional numerical examples. Lower levels within the hierarchy that consider shorter spatial correlations are shown to be up to three orders of magnitude faster than traditional kinetic Monte Carlo methods (KMC) for one-dimensional systems, while predicting similar system dynamics and steady states as KMC methods. We then test the hierarchy on a two-dimensional model for the oxidation of CO on RuO2(110), showing that low-order truncations of the hierarchy efficiently capture the essential system dynamics. By considering sequences of models in the hierarchy that account for longer spatial correlations, successive model predictions may be used to establish empirical approximations of error estimates. The hierarchy may be thought of as a class of generalized phenomenological kinetic models since each element of the hierarchy approximates the master equation and the lowest level in the hierarchy is identical to a simple existing phenomenological kinetic model.

  5. Evolution of structure-reactivity correlations for the hydrogen abstraction reaction by chlorine atom.

    PubMed

    Poutsma, Marvin L

    2013-01-31

    Empirical structure-reactivity correlations are developed for log k(298), the gas-phase rate constants for the reaction (Cl(•) + HCR(3) → ClH + CR(3)(•)). It has long been recognized that correlation with Δ(r)H is weak. The poor performance of the linear Evans-Polanyi formulation is illustrated and was little improved by adding a quadratic term, for example, by making its slope smoothly dependent on Δ(r)H [η ≡ (Δ(r)H - Δ(r)H(min))/(Δ(r)H(max) - Δ(r)H(min))]. The "polar effect" ((δ-)Cl---H---CR(3)(δ+))(++) has also been long discussed, but there is no formalization of this dependence based on widely available independent variable(s). Using the sum of Hammett constants for the R substituents also gave at best modest correlations, either for σ(para) or for its dissection into F (field/inductive) and R (resonance) effects. Much greater success was achieved by combining these approaches with the preferred independent variable set being either [(Δ(r)H)(2), Δ(r)H, ΣF, and ΣR] or [η, Δ(r)H, ΣF, and ΣR]. For 64 rate constants that span 7 orders of magnitude, these correlation formulations give r(2) > 0.87 and a mean unsigned deviation of <0.5 log k units, with even better performance if primary, secondary, and tertiary reaction centers are treated separately.

  6. Systematic approach to developing empirical interatomic potentials for III-N semiconductors

    NASA Astrophysics Data System (ADS)

    Ito, Tomonori; Akiyama, Toru; Nakamura, Kohji

    2016-05-01

    A systematic approach to the derivation of empirical interatomic potentials is developed for III-N semiconductors with the aid of ab initio calculations. The parameter values of the empirical potential, based on the bond-order potential, are determined by reproducing the cohesive energy differences among 3-fold coordinated hexagonal, 4-fold coordinated zinc blende and wurtzite, and 6-fold coordinated rocksalt structures in BN, AlN, GaN, and InN. The bond order p is successfully introduced as a function of the coordination number Z in the form p = a·exp(-bZ^n) if Z ≤ 4 and p = (4/Z)^α if Z ≥ 4 in the empirical interatomic potential. Moreover, the energy difference between the wurtzite and zinc blende structures can be successfully evaluated by considering interactions beyond the second-nearest neighbors as a function of ionicity. This approach is feasible for developing empirical interatomic potentials applicable to systems containing poorly coordinated atoms at surfaces and interfaces, including nanostructures.
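
    The piecewise bond-order function is easy to sketch. Here b, n, and α are hypothetical fitting parameters (not the paper's values); a is fixed by requiring continuity at Z = 4, where the second branch gives p(4) = 1:

```python
import math

# Piecewise bond-order factor p(Z) of the form given in the abstract:
#   p = a*exp(-b*Z**n) for Z <= 4,   p = (4/Z)**alpha for Z >= 4.
# b, n, alpha are hypothetical; a is fixed by continuity at Z = 4,
# where the second branch gives p(4) = 1.
b, n, alpha = 0.05, 1.5, 0.8
a = math.exp(b * 4.0 ** n)          # enforces a*exp(-b*4**n) == 1

def bond_order(Z):
    if Z <= 4:
        return a * math.exp(-b * Z ** n)
    return (4.0 / Z) ** alpha

print(bond_order(3), bond_order(4), bond_order(6))
```

    With this construction p decreases monotonically with coordination number, so under-coordinated (e.g., surface) atoms get stronger individual bonds, which is the qualitative behavior the approach targets.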

  7. An empirical-statistical model for laser cladding of Ti-6Al-4V powder on Ti-6Al-4V substrate

    NASA Astrophysics Data System (ADS)

    Nabhani, Mohammad; Razavi, Reza Shoja; Barekat, Masoud

    2018-03-01

    In this article, Ti-6Al-4V powder alloy was directly deposited on a Ti-6Al-4V substrate using the laser cladding process. In this process, key parameters such as laser power (P), laser scanning rate (V) and powder feeding rate (F) play important roles. Using linear regression analysis, this paper develops empirical-statistical relations between these key parameters, expressed as a combined parameter (P^αV^βF^γ), and the geometrical characteristics of single clad tracks (i.e., clad height, clad width, penetration depth, wetting angle, and dilution). The results indicated that the clad width depended linearly on PV^(-1/3), with powder feeding rate having no effect on it. The dilution was controlled by the combined parameter VF^(-1/2), with laser power a negligible factor. However, laser power was the dominant factor for the clad height, penetration depth, and wetting angle, which were proportional to PV^(-1)F^(1/4), PVF^(-1/8), and P^(3/4)V^(-1)F^(-1/4), respectively. Based on the correlation coefficients (R > 0.9) and analysis of residuals, it was confirmed that these empirical-statistical relations were in good agreement with the measured values of the single clad tracks. Finally, these relations led to the design of a processing map that can predict the geometrical characteristics of single clad tracks from the key parameters.
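
    Because the abstract reports only the exponent combinations, not the prefactors, the relations can still be used to predict ratios between two operating conditions. A sketch with hypothetical operating points:

```python
# Scaling predictions from the reported combined parameters.  Prefactors are
# not given in the abstract, so only ratios between two conditions are used.
# Exponents (alpha, beta, gamma) for (P, V, F) per clad characteristic:
EXPONENTS = {
    "clad_width":    (1.0, -1 / 3, 0.0),
    "dilution":      (0.0, 1.0, -0.5),
    "clad_height":   (1.0, -1.0, 0.25),
    "depth":         (1.0, 1.0, -0.125),
    "wetting_angle": (0.75, -1.0, -0.25),
}

def combined(P, V, F, exps):
    a, b, g = exps
    return (P ** a) * (V ** b) * (F ** g)

# Two hypothetical operating points: (laser power W, scan rate mm/s, feed g/min)
p1 = (400.0, 5.0, 2.0)
p2 = (500.0, 8.0, 2.0)

for name, exps in EXPONENTS.items():
    ratio = combined(*p2, exps) / combined(*p1, exps)
    print(f"{name}: condition-2/condition-1 ratio = {ratio:.3f}")
```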

  8. Modeling of dielectric properties of aqueous salt solutions with an equation of state.

    PubMed

    Maribo-Mogensen, Bjørn; Kontogeorgis, Georgios M; Thomsen, Kaj

    2013-09-12

    The static permittivity is the most important physical property for thermodynamic models that account for the electrostatic interactions between ions. The measured static permittivity in mixtures containing electrolytes is reduced due to kinetic depolarization and reorientation of the dipoles in the electrical field surrounding ions. Kinetic depolarization may explain 25-75% of the observed decrease in the permittivity of solutions containing salts, but since this is a dynamic property, this effect should not be included in the thermodynamic modeling of electrolytes. Kinetic depolarization has, however, been ignored in relation to thermodynamic modeling, and authors have either neglected the effect of salts on permittivity or used empirical correlations fitted to the measured static permittivity, leading to an overestimation of the reduction in the thermodynamic static permittivity. We present a new methodology for obtaining the static permittivity over wide ranges of temperatures, pressures, and compositions for use within an equation of state for mixed solvents containing salts. The static permittivity is calculated from a new extension of the framework developed by Onsager, Kirkwood, and Fröhlich to associating mixtures. Wertheim's association model as formulated in the statistical associating fluid theory is used to account for hydrogen-bonding molecules and ion-solvent association. Finally, we compare the Debye-Hückel Helmholtz energy obtained using an empirical model with the new physical model and show that the empirical models may introduce unphysical behavior in the equation of state.
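
    For orientation, the textbook Kirkwood-Fröhlich relation (the starting point of the framework the authors extend, not their equation of state) can be solved for the static permittivity of water; the Kirkwood g factor below is an assumed value:

```python
import math

# Kirkwood-Froehlich equation for the static permittivity (textbook form):
#   (eps - eps_inf)(2*eps + eps_inf) / (eps*(eps_inf + 2)**2)
#       = N * g * mu**2 / (9 * eps0 * kB * T)
# Illustrative water parameters; the Kirkwood factor g is an assumed value.
eps0, kB = 8.854e-12, 1.381e-23
T = 298.15            # K
N = 3.34e28           # dipole number density of water, 1/m^3
mu = 6.17e-30         # dipole moment of water, C*m (1.85 D)
g = 2.7               # Kirkwood correlation factor (assumed)
eps_inf = 1.78        # high-frequency (optical) permittivity, ~n^2

R = N * g * mu ** 2 / (9.0 * eps0 * kB * T)
A = R * (eps_inf + 2.0) ** 2
# The relation rearranges to the quadratic
#   2*eps**2 - (eps_inf + A)*eps - eps_inf**2 = 0; take the positive root.
eps = ((eps_inf + A) + math.sqrt((eps_inf + A) ** 2 + 8.0 * eps_inf ** 2)) / 4.0
print(f"static permittivity of water ~ {eps:.1f}")
```

    With g = 2.7 this gives ε of roughly 76, close to the measured value of about 78 for water at 25 °C; the paper's contribution is to obtain g and the ion-solvent corrections from Wertheim's association theory instead of assuming them.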

  9. How expressions of forgiveness, purpose, and religiosity relate to emotional intelligence and self-concept in urban fifth-grade students.

    PubMed

    Van Dyke, Cydney J; Elias, Maurice J

    2008-10-01

    This study investigated how the tendency to express forgiveness, purpose, and religiosity in a free-response essay relates to emotional intelligence and self-concept in 89 5th-graders (mean age = 10.84 years) from an urban public school district in New Jersey. Readers coded essays for expressions of forgiveness, purpose, and religiosity using originally developed rubrics. These data were compared with self-reports on scales of emotional intelligence and self-concept. It was hypothesized that expressions of the predictor variables would correlate positively with emotional intelligence and self-concept. In contrast to expressions of purpose, which were common among students, expressions of forgiveness and religiosity were infrequent. Furthermore, forgiveness was not significantly related to either criterion variable; purpose was positively related to self-concept (but not to emotional intelligence); and religiosity was negatively related to emotional intelligence (but not to self-concept). Correlational analyses by gender revealed a possible trend toward more robust relationships being observed among females than males; however, the differences between the correlation coefficients observed among males and females failed to reach statistical significance. Several of the study's unanticipated findings suggest the need for further empirical work investigating the psychological correlates of these constructs in children. PsycINFO Database Record 2009 APA.

  10. Neutron-fragment and Neutron-neutron Correlations in Low-energy Fission

    NASA Astrophysics Data System (ADS)

    Lestone, J. P.

    2016-01-01

    A computational method has been developed to simulate neutron emission from thermal-neutron induced fission of 235U and from spontaneous fission of 252Cf. Measured pre-emission mass-yield curves, average total kinetic energies and their variances, both as functions of mass split, are used to obtain a representation of the distribution of fragment velocities. Measured average neutron multiplicities as a function of mass split and their dependence on total kinetic energy are used. Simulations can be made to reproduce measured factorial moments of neutron-multiplicity distributions with only minor empirical adjustments to some experimental inputs. The neutron-emission spectra in the rest-frame of the fragments are highly constrained by ENDF/B-VII.1 prompt-fission neutron-spectra evaluations. The n-f correlation measurements of Vorobyev et al. (2010) are consistent with predictions where all neutrons are assumed to be evaporated isotropically from the rest frame of fully accelerated fragments. Measured n-f and n-n correlations of others are a little weaker than the predictions presented here. These weaker correlations could be used to infer a weak scission-neutron source. However, the effect of neutron scattering on the experimental results must be studied in detail before moving away from a null hypothesis that all neutrons are evaporated from the fragments.

  11. FROM FINANCE TO COSMOLOGY: THE COPULA OF LARGE-SCALE STRUCTURE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scherrer, Robert J.; Berlind, Andreas A.; Mao, Qingqing

    2010-01-01

    Any multivariate distribution can be uniquely decomposed into marginal (one-point) distributions, and a function called the copula, which contains all of the information on correlations between the distributions. The copula provides an important new methodology for analyzing the density field in large-scale structure. We derive the empirical two-point copula for the evolved dark matter density field. We find that this empirical copula is well approximated by a Gaussian copula. We consider the possibility that the full n-point copula is also Gaussian and describe some of the consequences of this hypothesis. Future directions for investigation are discussed.
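
    The rank-transform construction of an empirical copula is easy to sketch. For a Gaussian copula with correlation ρ, the closed form C(1/2, 1/2) = 1/4 + arcsin(ρ)/(2π) holds; the sketch below checks a simulated bivariate Gaussian field (a stand-in for the evolved density field, not the paper's N-body data) against it:

```python
import math
import random

random.seed(0)

# Bivariate Gaussian samples with correlation rho (a stand-in for two
# correlated density fields).
rho, n = 0.6, 20000
xs, ys = [], []
for _ in range(n):
    z1, z2 = random.gauss(0, 1), random.gauss(0, 1)
    xs.append(z1)
    ys.append(rho * z1 + math.sqrt(1 - rho ** 2) * z2)

def to_uniform(v):
    """Empirical copula step: map each sample to its normalized rank,
    which makes the marginal distribution uniform on (0, 1)."""
    order = sorted(range(len(v)), key=v.__getitem__)
    u = [0.0] * len(v)
    for rank, i in enumerate(order, start=1):
        u[i] = rank / (len(v) + 1)
    return u

u, w = to_uniform(xs), to_uniform(ys)
emp = sum(1 for a, b in zip(u, w) if a <= 0.5 and b <= 0.5) / n

# Gaussian-copula prediction at (1/2, 1/2): C = 1/4 + arcsin(rho)/(2*pi)
gauss = 0.25 + math.asin(rho) / (2 * math.pi)
print(emp, gauss)   # agree to a few parts in a hundred
```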

  12. The urban health transition hypothesis: empirical evidence of an avian influenza Kuznets curve in Vietnam?

    PubMed

    Spencer, James Herbert

    2013-04-01

    The literature on development has focused on the concept of transition in understanding the emergent challenges facing poor but rapidly developing countries. Scholars have focused extensively on the health and urban transitions associated with this change and, in particular, its use for understanding emerging infectious diseases. However, few have developed explicit empirical measures to quantify the extent to which a transitions focus is useful for theory, policy, and practice. Using open source data on avian influenza in 2004 and 2005 and the Vietnam Census of Population and Housing, this paper introduces the Kuznets curve as a tool for empirically estimating transition and disease. Findings suggest that the Kuznets curve is a viable tool for empirically assessing the role of transitional dynamics in the emergence of new infectious diseases.

  13. Effect of Non-Uniform Heat Generation on Thermionic Reactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schock, Alfred

    The penalty resulting from non-uniform heat generation in a thermionic reactor is examined. Operation at sub-optimum cesium pressure is shown to reduce this penalty, but at the risk of a condition analogous to burnout. For high pressure diodes, a simple empirical correlation between current, voltage and heat flux is developed and used to analyze the performance penalty associated with two different heat flux profiles, for series- and parallel-connected converters. The results demonstrate that series-connected converters require much finer power flattening than parallel converters. For example, a ±10% variation in heat generation across a series array can result in a 25 to 50% power penalty.

  14. Job satisfaction and retention of social workers in public agencies, non-profit agencies, and private practice: the impact of workplace conditions and motivators.

    PubMed

    Vinokur-Kaplan, D; Jayaratne, S; Chess, W A

    1994-01-01

    The authors examine a selected array of agency-influenced work and employment conditions and assess their impact upon social workers' job satisfaction, motivation, and intention to seek new employment. The study makes correlations with past empirical studies on job satisfaction and retention, with staff development concerns as stated in social work administration textbooks, and with conditions subject to administrators' influence. Some specified motivational issues included are salary, fringe benefits, job security, physical surroundings, and safety. The analysis demonstrates the contribution of certain contextual and motivational factors to a prediction of job satisfaction or of intent to leave the organization.

  15. Interpreting neurodynamics: concepts and facts

    PubMed Central

    Rotter, Stefan

    2008-01-01

    The dynamics of neuronal systems, briefly neurodynamics, has developed into an attractive and influential research branch within neuroscience. In this paper, we discuss a number of conceptual issues in neurodynamics that are important for an appropriate interpretation and evaluation of its results. We demonstrate their relevance for selected topics of theoretical and empirical work. In particular, we refer to the notions of determinacy and stochasticity in neurodynamics across levels of microscopic, mesoscopic and macroscopic descriptions. The issue of correlations between neural, mental and behavioral states is also addressed in some detail. We propose an informed discussion of conceptual foundations with respect to neurobiological results as a viable step to a fruitful future philosophy of neuroscience. PMID:19003452

  16. Direct numerical simulation of annular flows

    NASA Astrophysics Data System (ADS)

    Batchvarov, Assen; Kahouadji, Lyes; Chergui, Jalel; Juric, Damir; Shin, Seungwon; Craster, Richard V.; Matar, Omar K.

    2017-11-01

    Vertical counter-current two-phase flows are investigated using direct numerical simulations. The computations are carried out using Blue, a front-tracking-based CFD solver. Preliminary results show good qualitative agreement with experimental observations in terms of interfacial phenomena; these include three-dimensional, large-amplitude wave formation, the development of long ligaments, and droplet entrainment. The flooding phenomena in these counter-current systems are closely investigated. The onset of flooding in our simulations is compared to existing empirical correlations of the Kutateladze and Wallis types. The effect of varying tube diameter and fluid properties on the flooding phenomena is also investigated in this work. EPSRC, UK, MEMPHIS program Grant (EP/K003976/1), RAEng Research Chair (OKM).
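
    A Wallis-type flooding correlation takes the form √(j*_g) + m·√(j*_l) = C, with dimensionless superficial velocities j*_k = j_k·√ρ_k / √(gD(ρ_l − ρ_g)). A sketch with illustrative constants and fluid properties (not the values used in the simulations):

```python
import math

# Wallis-type flooding correlation for a vertical tube:
#   sqrt(jg*) + m*sqrt(jl*) = C,
#   jk* = jk * sqrt(rho_k) / sqrt(g * D * (rho_l - rho_g))
# C and m are empirical constants (typical ranges C ~ 0.7-1.0, m ~ 1);
# the values below and the operating point are illustrative.
g = 9.81
D = 0.05                    # tube diameter, m
rho_l, rho_g = 998.0, 1.2   # water / air densities, kg/m^3
C, m = 0.9, 1.0

def jstar(j, rho):
    return j * math.sqrt(rho) / math.sqrt(g * D * (rho_l - rho_g))

def flooding_gas_velocity(j_l):
    """Superficial gas velocity at the onset of flooding for a given
    downward liquid superficial velocity j_l (m/s)."""
    root = C - m * math.sqrt(jstar(j_l, rho_l))
    if root <= 0.0:
        return 0.0          # liquid flux alone exceeds the flooding limit
    jg_star = root ** 2
    return jg_star * math.sqrt(g * D * (rho_l - rho_g)) / math.sqrt(rho_g)

print(f"{flooding_gas_velocity(0.02):.2f} m/s")
```

    With these illustrative constants the sketch gives a flooding gas velocity near 11 m/s for j_l = 0.02 m/s in a 50 mm air-water tube, and the limit drops as the liquid flux increases.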

  17. A Water Model Study on Mixing Behavior of the Two-Layered Bath in Bottom Blown Copper Smelting Furnace

    NASA Astrophysics Data System (ADS)

    Shui, Lang; Cui, Zhixiang; Ma, Xiaodong; Jiang, Xu; Chen, Mao; Xiang, Yong; Zhao, Baojun

    2018-05-01

    The bottom-blown copper smelting furnace is a novel copper smelter developed in recent years. Many advantages of this furnace have been found, related to bath mixing behavior under its specific gas injection scheme. This study uses a two-phase oil-water laboratory-scale model to investigate the impact of industry-adjustable variables on bath mixing time, including lower layer thickness, gas flow rate, upper layer thickness and upper layer viscosity. Based on the experimental results, an overall empirical correlation for mixing time in terms of these variables has been developed, which provides a methodology for industry to optimize mass transfer in the furnace.

  18. An efficient reliable method to estimate the vaporization enthalpy of pure substances according to the normal boiling temperature and critical properties

    PubMed Central

    Mehmandoust, Babak; Sanjari, Ehsan; Vatani, Mostafa

    2013-01-01

    The heat of vaporization of a pure substance at its normal boiling temperature is a very important property in many chemical processes. In this work, a new empirical method was developed to predict vaporization enthalpy of pure substances. This equation is a function of normal boiling temperature, critical temperature, and critical pressure. The presented model is simple to use and provides an improvement over the existing equations for 452 pure substances in wide boiling range. The results showed that the proposed correlation is more accurate than the literature methods for pure substances in a wide boiling range (20.3–722 K). PMID:25685493

  19. An efficient reliable method to estimate the vaporization enthalpy of pure substances according to the normal boiling temperature and critical properties.

    PubMed

    Mehmandoust, Babak; Sanjari, Ehsan; Vatani, Mostafa

    2014-03-01

    The heat of vaporization of a pure substance at its normal boiling temperature is a very important property in many chemical processes. In this work, a new empirical method was developed to predict vaporization enthalpy of pure substances. This equation is a function of normal boiling temperature, critical temperature, and critical pressure. The presented model is simple to use and provides an improvement over the existing equations for 452 pure substances in wide boiling range. The results showed that the proposed correlation is more accurate than the literature methods for pure substances in a wide boiling range (20.3-722 K).
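
    The abstract does not give the new equation's form. As an illustration of a correlation built from the same three inputs (Tb, Tc, Pc), the classic Riedel estimate can be sketched; for water it lands within a few percent of the experimental 40.7 kJ/mol:

```python
import math

# Riedel's classic correlation for the enthalpy of vaporization at the normal
# boiling point, using the same three inputs as the abstract (Tb, Tc, Pc).
# This is the textbook Riedel form, not the paper's new equation.
R = 8.314  # J/(mol*K)

def dh_vap_riedel(Tb, Tc, Pc_bar):
    Trb = Tb / Tc  # reduced normal boiling temperature
    return 1.093 * R * Tc * Trb * (math.log(Pc_bar) - 1.013) / (0.930 - Trb)

# Water: Tb = 373.15 K, Tc = 647.1 K, Pc = 220.64 bar
dh = dh_vap_riedel(373.15, 647.1, 220.64)
print(f"{dh / 1000:.1f} kJ/mol")
```

    The sketch returns about 42 kJ/mol for water; correlations of this family are typically accurate to a few percent over wide boiling ranges, which is the benchmark the proposed equation improves on.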

  20. Hypothesis testing for differentially correlated features.

    PubMed

    Sheng, Elisa; Witten, Daniela; Zhou, Xiao-Hua

    2016-10-01

    In a multivariate setting, we consider the task of identifying features whose correlations with the other features differ across conditions. Such correlation shifts may occur independently of mean shifts, or differences in the means of the individual features across conditions. Previous approaches for detecting correlation shifts consider features simultaneously, by computing a correlation-based test statistic for each feature. However, since correlations involve two features, such approaches do not lend themselves to identifying which feature is the culprit. In this article, we instead consider a serial testing approach, by comparing columns of the sample correlation matrix across two conditions, and removing one feature at a time. Our method provides a novel perspective and favorable empirical results compared with competing approaches. © The Author 2016. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  1. A Motivational Theory of Life-Span Development

    PubMed Central

    Heckhausen, Jutta; Wrosch, Carsten; Schulz, Richard

    2010-01-01

    This article had four goals. First, the authors identified a set of general challenges and questions that a life-span theory of development should address. Second, they presented a comprehensive account of their Motivational Theory of Life-Span Development. They integrated the model of optimization in primary and secondary control and the action-phase model of developmental regulation with their original life-span theory of control to present a comprehensive theory of development. Third, they reviewed the relevant empirical literature testing key propositions of the Motivational Theory of Life-Span Development. Finally, because the conceptual reach of their theory goes far beyond the current empirical base, they pointed out areas that deserve further and more focused empirical inquiry. PMID:20063963

  2. Theory of Financial Risk and Derivative Pricing

    NASA Astrophysics Data System (ADS)

    Bouchaud, Jean-Philippe; Potters, Marc

    2009-01-01

    Foreword; Preface; 1. Probability theory: basic notions; 2. Maximum and addition of random variables; 3. Continuous time limit, Ito calculus and path integrals; 4. Analysis of empirical data; 5. Financial products and financial markets; 6. Statistics of real prices: basic results; 7. Non-linear correlations and volatility fluctuations; 8. Skewness and price-volatility correlations; 9. Cross-correlations; 10. Risk measures; 11. Extreme correlations and variety; 12. Optimal portfolios; 13. Futures and options: fundamental concepts; 14. Options: hedging and residual risk; 15. Options: the role of drift and correlations; 16. Options: the Black and Scholes model; 17. Options: some more specific problems; 18. Options: minimum variance Monte-Carlo; 19. The yield curve; 20. Simple mechanisms for anomalous price statistics; Index of most important symbols; Index.

  3. Theory of Financial Risk and Derivative Pricing - 2nd Edition

    NASA Astrophysics Data System (ADS)

    Bouchaud, Jean-Philippe; Potters, Marc

    2003-12-01

    Foreword; Preface; 1. Probability theory: basic notions; 2. Maximum and addition of random variables; 3. Continuous time limit, Ito calculus and path integrals; 4. Analysis of empirical data; 5. Financial products and financial markets; 6. Statistics of real prices: basic results; 7. Non-linear correlations and volatility fluctuations; 8. Skewness and price-volatility correlations; 9. Cross-correlations; 10. Risk measures; 11. Extreme correlations and variety; 12. Optimal portfolios; 13. Futures and options: fundamental concepts; 14. Options: hedging and residual risk; 15. Options: the role of drift and correlations; 16. Options: the Black and Scholes model; 17. Options: some more specific problems; 18. Options: minimum variance Monte-Carlo; 19. The yield curve; 20. Simple mechanisms for anomalous price statistics; Index of most important symbols; Index.

  4. The Multi-Frequency Correlation Between Eua and sCER Futures Prices: Evidence from the Emd Approach

    NASA Astrophysics Data System (ADS)

    Zhang, Yue-Jun; Huang, Yi-Song

    2015-05-01

    Currently European Union Allowances (EUA) and secondary Certified Emission Reduction (sCER) have become two dominant carbon trading assets for investors and their linkage attracts much attention from academia and practitioners in recent years. Under this circumstance, we use the empirical mode decomposition (EMD) approach to decompose the two carbon futures contract prices and discuss their correlation from the multi-frequency perspective. The empirical results indicate that, first, the EUA and sCER futures price movements can be divided into those triggered by the long-term, medium-term and short-term market impacts. Second, the price movements in the EUA and sCER futures markets are primarily caused by the long-term impact, while the short-term impact can only explain a small fraction. Finally, the long-term (short-term) effect on EUA prices is statistically uncorrelated with the short-term (long-term) effect of sCER prices, and there is a medium or strong lead-and-lag correlation between the EUA and sCER price components with the same time scales. These results may provide important insights into price forecasting and arbitrage for carbon futures market investors, analysts and regulators.

  5. Correlating P-wave Velocity with the Physico-Mechanical Properties of Different Rocks

    NASA Astrophysics Data System (ADS)

    Khandelwal, Manoj

    2013-04-01

    In mining and civil engineering projects, physico-mechanical properties of the rock affect both the project design and the construction operation. Determination of various physico-mechanical properties of rocks is expensive and time consuming, and sometimes it is very difficult to get cores to perform direct tests to evaluate the rock mass. The purpose of this work is to investigate the relationships between the different physico-mechanical properties of the various rock types with the P-wave velocity. Measurement of P-wave velocity is relatively cheap, non-destructive and easy to carry out. In this study, representative rock mass samples of igneous, sedimentary, and metamorphic rocks were collected from the different locations of India to obtain an empirical relation between P-wave velocity and uniaxial compressive strength, tensile strength, punch shear, density, slake durability index, Young's modulus, Poisson's ratio, impact strength index and Schmidt hammer rebound number. A very strong correlation was found between the P-wave velocity and different physico-mechanical properties of various rock types with very high coefficients of determination. To check the sensitivity of the empirical equations, Student's t test was also performed, which confirmed the validity of the proposed correlations.

  6. Evaluation of empirical rule of linearly correlated peptide selection (ERLPS) for proteotypic peptide-based quantitative proteomics.

    PubMed

    Liu, Kehui; Zhang, Jiyang; Fu, Bin; Xie, Hongwei; Wang, Yingchun; Qian, Xiaohong

    2014-07-01

    Precise protein quantification is essential in comparative proteomics. Currently, quantification bias is inevitable when using a proteotypic peptide-based quantitative proteomics strategy because of differences in peptide measurability. To improve quantification accuracy, we proposed an "empirical rule for linearly correlated peptide selection (ERLPS)" in quantitative proteomics in our previous work. However, a systematic evaluation of the general application of ERLPS in quantitative proteomics under diverse experimental conditions needed to be conducted. In this study, the practical workflow of ERLPS is explicitly illustrated; different experimental variables, such as MS systems, sample complexities, sample preparations, elution gradients, matrix effects, loading amounts, and other factors, were comprehensively investigated to evaluate the applicability, reproducibility, and transferability of ERLPS. The results demonstrated that ERLPS was highly reproducible and transferable within appropriate loading amounts and that linearly correlated response peptides should be selected for each specific experiment. ERLPS was applied to proteome samples from yeast to mouse and human, and to quantitative methods from label-free to 18O/16O-labeled and SILAC analysis, and enabled accurate measurements for all proteotypic peptide-based quantitative proteomics over a large dynamic range. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. Conditions for Viral Influence Spreading through Multiplex Correlated Social Networks

    NASA Astrophysics Data System (ADS)

    Hu, Yanqing; Havlin, Shlomo; Makse, Hernán A.

    2014-04-01

    A fundamental problem in network science is to predict how certain individuals are able to initiate new networks to spring up "new ideas." Frequently, these changes in trends are triggered by a few innovators who rapidly impose their ideas through "viral" influence spreading, producing cascades of followers and fragmenting an old network to create a new one. Typical examples include the rise of scientific ideas or abrupt changes in social media, like the rise of Facebook to the detriment of Myspace. How this process arises in practice has not been conclusively demonstrated. Here, we show that a condition for sustaining a viral spreading process is the existence of a multiplex-correlated graph with hidden "influence links." Analytical solutions predict percolation-phase transitions, either abrupt or continuous, where networks are disintegrated through viral cascades of followers, as in empirical data. Our modeling predicts the strict conditions to sustain a large viral spreading via a scaling form of the local correlation function between multilayers, which we also confirm empirically. Ultimately, the theory predicts the conditions for viral cascading in a large class of multiplex networks ranging from social to financial systems and markets.

  8. On the Cause of Geodetic Satellite Accelerations and Other Correlated Unmodeled Phenomena

    NASA Astrophysics Data System (ADS)

    Mayer, A. F.

    2005-12-01

    An oversight in the development of the Einstein field equations requires a well-defined amendment to general relativity that very slightly modifies the weak-field Schwarzschild geometry yielding unambiguous new predictions of gravitational relativistic phenomena. The secular accelerations of LAGEOS, Etalon and other geodetic satellites are definitively explained as a previously unmodeled relativistic effect of the gravitational field. Observed dynamic variations may be correlated to the complex dynamic relationship between the satellite angular momentum vector and the solar gravitational gradient associated with the orbital motion of the Earth and the natural precession of the satellite orbit. The Pioneer Anomaly, semidiurnal saw-toothed pseudo-range residuals of GPS satellites, peculiar results of radio occultation experiments, secular accelerations of Solar System moons, the conspicuous excess redshift of white dwarf stars and other documented empirical observations are all correlated to the same newly modeled subtle relativistic energy effect. Modern challenges in the determination and maintenance of an accurate and reliable terrestrial reference frame, difficulties with global time synchronization at nanosecond resolution and the purported existence of unlikely excessive undulations of the Geoid relative to the Ellipsoid are all related to this previously unknown phenomenon inherent to the gravitational field. Doppler satellite measurements made by the TRANSIT system (the precursor to GPS) were significantly affected; WGS 84 coordinates and other geodetic data now assumed to be correct to high accuracy require correction based on the new theoretical developments.

  9. Application of a high-throughput relative chemical stability assay to screen therapeutic protein formulations by assessment of conformational stability and correlation to aggregation propensity.

    PubMed

    Rizzo, Joseph M; Shi, Shuai; Li, Yunsong; Semple, Andrew; Esposito, Jessica J; Yu, Shenjiang; Richardson, Daisy; Antochshuk, Valentyn; Shameem, Mohammed

    2015-05-01

    In this study, an automated high-throughput relative chemical stability (RCS) assay was developed in which various therapeutic proteins were assessed to determine stability based on the resistance to denaturation post introduction to a chaotrope titration. Detection mechanisms of both intrinsic fluorescence and near UV circular dichroism (near-UV CD) are demonstrated. Assay robustness was investigated by comparing multiple independent assays and achieving r(2) values >0.95 for curve overlays. The complete reversibility of the assay was demonstrated by intrinsic fluorescence, near-UV CD, and biologic potency. To highlight the method utility, we compared the RCS assay with differential scanning calorimetry and dynamic scanning fluorimetry methodologies. Utilizing C1/2 values obtained from the RCS assay, formulation rank-ordering of 12 different mAb formulations was performed. The prediction of long-term stability on protein aggregation is obtained by demonstrating a good correlation with an r(2) of 0.83 between RCS and empirical aggregation propensity data. RCS promises to be an extremely useful tool to aid in candidate formulation development efforts based on the complete reversibility of the method to allow for multiple assessments without protein loss and the strong correlation between the C1/2 data obtained and accelerated stability under stressed conditions. © 2015 Wiley Periodicals, Inc. and the American Pharmacists Association.
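
    C1/2 is the denaturant concentration at the transition midpoint. A sketch of recovering it from a two-state denaturation curve by interpolating where the normalized unfolding signal crosses one half (simulated data with illustrative values, not the assay's raw output):

```python
import math

# Two-state chemical-denaturation curve: fraction unfolded vs denaturant
# concentration C, with midpoint C_half (simulated, illustrative values).
C_half_true, w = 3.0, 0.35    # midpoint (M) and transition width (M)

def frac_unfolded(C):
    return 1.0 / (1.0 + math.exp(-(C - C_half_true) / w))

conc = [0.25 * i for i in range(0, 25)]       # 0 .. 6 M titration
signal = [frac_unfolded(C) for C in conc]

# Recover C1/2 by linear interpolation where the signal crosses 0.5.
for (c0, s0), (c1, s1) in zip(zip(conc, signal), zip(conc[1:], signal[1:])):
    if s0 <= 0.5 <= s1:
        c_half = c0 + (0.5 - s0) * (c1 - c0) / (s1 - s0)
        break

print(f"recovered C1/2 = {c_half:.2f} M")
```

    Ranking formulations by C1/2 values extracted this way is the rank-ordering step the abstract describes; the reversibility of the chaotrope-based assay is what allows the same protein sample to be titrated repeatedly.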

  10. Extending Our Vision of Developmental Growth and Engaging in Empirical Scrutiny: Proposals for the Future of Faith Development Theory

    ERIC Educational Resources Information Center

    Streib, Heinz

    2004-01-01

    This article evaluates the portrait of faith development theory and research in James Fowler's article,"Faith Development at 30." Questions are raised: Does Fowler's emphasis on the practical-theological and pastoral focus of faith development contradict its aspiration and disposition for empirical scrutiny? Does Fowler's principal concern with…

  11. Modeling of Fume Formation from Shielded Metal Arc Welding Process

    NASA Astrophysics Data System (ADS)

    Sivapirakasam, S. P.; Mohan, Sreejith; Santhosh Kumar, M. C.; Surianarayanan, M.

    2017-04-01

    In this study, a semi-empirical model of the fume formation rate (FFR) from a shielded metal arc welding (SMAW) process has been developed. The model was developed for a DC electrode positive (DCEP) operation and involves the calculation of droplet temperature, droplet surface area, and partial vapor pressures of the constituents of the droplet to predict the FFR. The model was further extended to predict the FFR from nano-coated electrodes. The model estimates the FFR for Fe and Mn, assuming a constant proportion of the other elements in the electrode. The Fe FFR was overestimated, while the Mn FFR was underestimated. The contributions of spatter and other in-arc mechanisms responsible for fume formation were neglected. A good positive correlation was obtained between the predicted and experimental FFR values, which highlights the usefulness of the model.

  12. Mediating effect of sustainable product development on relationship between quality management practices and organizational performance: Empirical study of Malaysian automotive industry

    NASA Astrophysics Data System (ADS)

    Ahmad, Mohd Akhir; Asaad, Mohd Norhasni; Saad, Rohaizah; Iteng, Rosman; Rahim, Mohd Kamarul Irwan Abdul

    2016-08-01

    Global competition in the automotive industry has encouraged companies to implement quality management practices in all managerial aspects to ensure customer satisfaction with products and to reduce costs. Guaranteeing product quality alone is insufficient without considering product sustainability, which involves economic, environmental, and social elements. Companies that meet both objectives gain advantages in the modern business environment. This study addresses the issues of product quality and sustainability in small and medium-sized enterprises (SMEs) in the Malaysian automotive industry. A survey was carried out among 91 SME automotive suppliers throughout Malaysia, and the data were analyzed in a correlational study using SPSS ver. 23. Specifically, this study investigates the relationship between quality management practices and organizational performance as well as the mediating effect of sustainable product development on this relationship.

  13. Antimicrobial Susceptibility Test with Plasmonic Imaging and Tracking of Single Bacterial Motions on Nanometer Scale.

    PubMed

    Syal, Karan; Iriya, Rafael; Yang, Yunze; Yu, Hui; Wang, Shaopeng; Haydel, Shelley E; Chen, Hong-Yuan; Tao, Nongjian

    2016-01-26

    Antimicrobial susceptibility tests (ASTs) are important for confirming susceptibility to empirical antibiotics and detecting resistance in bacterial isolates. Currently, most ASTs performed in clinical microbiology laboratories are based on bacterial culturing, which can take days to complete for slowly growing microorganisms. A faster AST will reduce morbidity and mortality rates and help healthcare providers administer narrow-spectrum antibiotics at the earliest possible treatment stage. We report the development of a nonculture-based AST using a plasmonic imaging and tracking (PIT) technology. We track the motion of individual bacterial cells tethered to a surface with nanometer (nm) precision and correlate the phenotypic motion with bacterial metabolism and antibiotic action. We show that antibiotic action significantly slows down bacterial motion, which can be quantified for development of a rapid phenotypic-based AST.

  14. Development of many-body polarizable force fields for Li-battery components: 1. Ether, alkane, and carbonate-based solvents.

    PubMed

    Borodin, Oleg; Smith, Grant D

    2006-03-30

    Classical many-body polarizable force fields were developed for n-alkanes, perfluoroalkanes, polyethers, ketones, and linear and cyclic carbonates on the basis of quantum chemistry dimer energies of model compounds and empirical thermodynamic liquid-state properties. The dependence of the electron correlation contribution to the dimer binding energy on basis-set size and level of theory was investigated as a function of molecular separation for a number of alkane, ether, and ketone dimers. Molecular dynamics (MD) simulations of the force fields accurately predicted structural, dynamic, and transport properties of liquids and unentangled polymer melts. On average, gas-phase dimer binding energies predicted with the force field were between those from MP2/aug-cc-pvDz and MP2/aug-cc-pvTz quantum chemistry calculations.

  15. Development of a scaled-down aerobic fermentation model for scale-up in recombinant protein vaccine manufacturing.

    PubMed

    Farrell, Patrick; Sun, Jacob; Gao, Meg; Sun, Hong; Pattara, Ben; Zeiser, Arno; D'Amore, Tony

    2012-08-17

    A simple approach to the development of an aerobic scaled-down fermentation model is presented to obtain more consistent process performance during the scale-up of recombinant protein manufacture. Using a constant volumetric oxygen mass transfer coefficient (k(L)a) as the scale-down criterion, the scaled-down model can be "tuned" to match the k(L)a of any larger-scale target by varying the impeller rotational speed. This approach is demonstrated for a protein vaccine candidate expressed in recombinant Escherichia coli, where process performance is shown to be consistent among 2-L, 20-L, and 200-L scales. An empirical correlation for k(L)a has also been employed to extrapolate to larger manufacturing scales. Copyright © 2012 Elsevier Ltd. All rights reserved.
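    The k(L)a-matching step can be sketched with the common empirical correlation form k_L a = a (P/V)^alpha (v_s)^beta, combined with the fixed-geometry turbulent scaling P/V ∝ N^3 to solve for the impeller speed. The constants a, alpha, and beta below are illustrative placeholders, not values from the paper; in practice they are fitted to the vessel's own oxygen-transfer measurements.

```python
def kla(power_per_volume, superficial_gas_velocity,
        a=0.02, alpha=0.6, beta=0.5):
    """Empirical correlation k_L a = a * (P/V)^alpha * v_s^beta.

    power_per_volume in W/m^3, superficial_gas_velocity in m/s.
    The constants are hypothetical placeholders for illustration.
    """
    return a * power_per_volume**alpha * superficial_gas_velocity**beta

def matching_speed(target_kla, v_s, n_ref, pv_ref,
                   a=0.02, alpha=0.6, beta=0.5):
    """Impeller speed at which the small vessel reproduces target_kla.

    Inverts the correlation for the required P/V, then uses the
    fixed-geometry turbulent scaling P/V ∝ N^3 from a reference point
    (n_ref giving pv_ref W/m^3) to solve for the new speed.
    """
    pv_needed = (target_kla / (a * v_s**beta)) ** (1.0 / alpha)
    return n_ref * (pv_needed / pv_ref) ** (1.0 / 3.0)

# Hypothetical example: tune a small vessel (reference: 300 rpm at
# 1000 W/m^3) to match a larger-scale target operating at 1500 W/m^3.
target = kla(1500.0, superficial_gas_velocity=0.01)
n_small = matching_speed(target, v_s=0.01, n_ref=300.0, pv_ref=1000.0)
```

    At matched gas velocity, the required speed scales as the cube root of the power-per-volume ratio, so modest speed adjustments cover a wide k(L)a range.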

  16. Treatment and outcomes of infections by methicillin-resistant Staphylococcus aureus at an ambulatory clinic.

    PubMed

    Szumowski, John D; Cohen, Daniel E; Kanaya, Fumihide; Mayer, Kenneth H

    2007-02-01

    Community-acquired methicillin-resistant Staphylococcus aureus (MRSA) skin and soft tissue infections (SSTI) have become increasingly common. This study's objectives were to describe the clinical spectrum of MRSA in a community health center and to determine whether the use of specific antimicrobials correlated with increased probability of clinical resolution of SSTI. A retrospective chart review of 399 sequential cases of culture-confirmed S. aureus SSTI, including 227 cases of MRSA SSTI, among outpatients at Fenway Community Health (Boston, MA) from 1998 to 2005 was done. The proportion of S. aureus SSTI due to MRSA increased significantly from 1998 to 2005 (P<0.0001). Resistance to clindamycin was common (48.2% of isolates). At the beginning of the study period, most patients with MRSA SSTI empirically treated with antibiotics received a beta-lactam, whereas by 2005, 76% received trimethoprim-sulfamethoxazole (TMP-SMX) (P<0.0001). Initially, few MRSA isolates were sensitive to the empirical antibiotic, but 77% were susceptible by 2005 (P<0.0001). A significantly higher percentage of patients with MRSA isolates had clinical resolution on the empirical antibiotic by 2005 (P=0.037). Use of an empirical antibiotic to which the clinical isolate was sensitive was associated with increased odds of clinical resolution on empirical therapy (odds ratio=5.91), controlling for incision and drainage and HIV status. MRSA now accounts for the majority of SSTI due to S. aureus at Fenway, and improved rates of clinical resolution on empirical antibiotic therapy have paralleled increasing use of empirical TMP-SMX for these infections. TMP-SMX appears to be an appropriate empirical antibiotic for suspected MRSA SSTI, especially where clindamycin resistance is common.

  17. Integrating the philosophy and psychology of aesthetic experience: development of the aesthetic experience scale.

    PubMed

    Stamatopoulou, Despina

    2004-10-01

    This study assessed the dynamic relationship between person and object in aesthetic experience. Patterns of the structure of aesthetic experience were derived from a conceptual model based on philosophical and psychological ideas. These patterns were further informed by interviewing individuals with extensive involvement in aesthetic activities and 25 secondary students. Accordingly, patterns were tested by developing a large pool of items attempting to identify measurable structural components of aesthetic experience. Refined first in a pilot study, the 36-item questionnaire was administered to 652 Greek students, aged 13 to 15 years. Correlation matrices and exploratory factor analyses on principal components were used to examine internal structural relationships. The obliquely rotated five-factor solution of the refined instrument accounted for 44.1% of the total variance and was compatible with the conceptual model of aesthetic experience, indicating the plausibility of both. The internal consistency of the items was adequate, and external correlational analysis offered preliminary support for subsequent development of a self-report measure that serves to operationalize the major constructs of aesthetic experience in the general adolescent population. The results also raise theoretical issues for those interested in empirical aesthetics, suggesting that in experiential functioning, expressive perception and affect may play a more constructive role in cognitive processes than is generally acknowledged.

  18. Household chaos, sociodemographic risk, coparenting, and parent-infant relations during infants' first year.

    PubMed

    Whitesell, Corey J; Teti, Douglas M; Crosby, Brian; Kim, Bo-Ram

    2015-04-01

    Household chaos is a construct often overlooked in studies of human development, despite its theoretical links with the integrity of individual well-being, family processes, and child development. The present longitudinal study examined relations between household chaos and well-established correlates of chaos (sociodemographic risk, major life events, and personal distress) and several constructs that, to date, are theoretically linked with chaos but never before assessed as correlates (quality of coparenting and emotional availability with infants at bedtime). In addressing this aim, we introduce a new measure of household chaos (the Descriptive In-home Survey of Chaos--Observer ReporteD, or DISCORD), wholly reliant on independent observer report, which draws from household chaos theory and prior empirical work but extends the measurement of chaos to include information about families' compliance with a home visiting protocol. Household chaos was significantly associated with socioeconomic risk, negative life events, less favorable coparenting, and less emotionally available bedtime parenting, but not with personal distress. These findings emphasize the need to examine household chaos as a direct and indirect influence on child and family outcomes, as a moderator of interventions aimed at improving parenting and child development, and as a target of intervention in its own right. (c) 2015 APA, all rights reserved.

  19. A Meta-Analysis of Factors Influencing the Development of Trust in Automation: Implications for Understanding Autonomy in Future Systems.

    PubMed

    Schaefer, Kristin E; Chen, Jessie Y C; Szalma, James L; Hancock, P A

    2016-05-01

    We used meta-analysis to assess research concerning human trust in automation to understand the foundation upon which future autonomous systems can be built. Trust is increasingly important in the growing need for synergistic human-machine teaming. Thus, we expand on our previous meta-analytic foundation in the field of human-robot interaction to include all of automation interaction. We used meta-analysis to assess trust in automation. Thirty studies provided 164 pairwise effect sizes, and 16 studies provided 63 correlational effect sizes. The overall effect size of all factors on trust development was ḡ = +0.48, and the correlational effect was [Formula: see text] = +0.34, each of which represented medium effects. Moderator effects were observed for the human-related (ḡ = +0.49; [Formula: see text] = +0.16) and automation-related (ḡ = +0.53; [Formula: see text] = +0.41) factors. Moderator effects specific to environmental factors proved insufficient in number to calculate at this time. Findings provide a quantitative representation of factors influencing the development of trust in automation as well as identify additional areas of needed empirical research. This work has important implications for the enhancement of current and future human-automation interaction, especially in high-risk or extreme performance environments. © 2016, Human Factors and Ergonomics Society.
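    The pooling step behind effect sizes like ḡ can be sketched with generic fixed-effect inverse-variance weighting. This is a minimal illustration of the pooling technique only, with hypothetical study values; the cited meta-analysis additionally handles moderators and correlational effects.

```python
def pooled_effect(effects, variances):
    """Fixed-effect inverse-variance pooling of study effect sizes.

    Each study's effect (e.g., a Hedges' g) is weighted by the inverse
    of its sampling variance; returns the weighted mean effect and its
    standard error, sqrt(1 / sum of weights).
    """
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    mean = sum(w * e for w, e in zip(weights, effects)) / total
    se = (1.0 / total) ** 0.5
    return mean, se

# Hypothetical effect sizes and variances from three studies.
g_bar, se = pooled_effect([0.40, 0.55, 0.48], [0.02, 0.04, 0.03])
```

    Because weights are inverse variances, more precise studies pull the pooled estimate toward their values, and the pooled standard error shrinks as studies accumulate.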

  20. Analyses of the structure of group correlations in Korean financial markets

    NASA Astrophysics Data System (ADS)

    Ko, Jeung Su; Lim, Gyuchang; Kim, Kyungsik

    2012-12-01

    In this paper, we construct and analyze the structure of cross-correlations in two Korean stock markets, the Korea Composite Stock Price Index (KOSPI) and the Korea Securities Dealers Automated Quotation (KOSDAQ). We find a remarkable agreement between the theoretical prediction and the empirical data concerning the density of eigenvalues in the KOSPI and the KOSDAQ. We estimate daily cross-correlations with respect to price fluctuations of 629 KOSPI and 650 KOSDAQ stock entities for the period from 2006 to 2010. The analysis of the structure of group correlations removes the market-wide effect using the Markowitz multi-factor model and a network-based approach. We identify stock entities belonging to the same business sectors and verify the structure of group correlations by applying a network-based approach. In particular, the KOSPI exhibits dense correlations in addition to the overall group correlations among stock entities, whereas both types of correlation are weaker in the KOSDAQ than in the KOSPI.
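    The comparison between the theoretical and empirical eigenvalue densities can be sketched with the Marchenko-Pastur bounds from random matrix theory: eigenvalues of a correlation matrix of pure noise fall inside a predicted band, and eigenvalues above it carry genuine market or group structure. The example below uses synthetic Gaussian returns, not the KOSPI/KOSDAQ data; the sample sizes are illustrative.

```python
import numpy as np

def marchenko_pastur_bounds(n_timesteps, n_assets):
    """Marchenko-Pastur noise band for correlation-matrix eigenvalues.

    With q = T/N, eigenvalues of pure i.i.d. noise fall (asymptotically)
    in [(1 - 1/sqrt(q))^2, (1 + 1/sqrt(q))^2].  Empirical eigenvalues
    above the upper bound indicate genuine group/market correlations.
    """
    q = n_timesteps / n_assets
    lam_min = (1.0 - 1.0 / np.sqrt(q)) ** 2
    lam_max = (1.0 + 1.0 / np.sqrt(q)) ** 2
    return lam_min, lam_max

# Synthetic example: 629 uncorrelated "stocks" over 1250 "trading days".
rng = np.random.default_rng(0)
returns = rng.standard_normal((1250, 629))
corr = np.corrcoef(returns, rowvar=False)      # 629 x 629 correlation matrix
eigvals = np.linalg.eigvalsh(corr)
lam_min, lam_max = marchenko_pastur_bounds(1250, 629)
n_signal = int((eigvals > lam_max).sum())      # near zero for pure noise
```

    For real market data, the largest eigenvalue typically sits far above lam_max (the market mode), with a handful of intermediate eigenvalues corresponding to business sectors, which is the group structure the paper resolves with its network-based approach.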
