Sample records for stochastic frontier analysis

  1. Fitting of full Cobb-Douglas and full VRTS cost frontiers by solving goal programming problem

    NASA Astrophysics Data System (ADS)

    Venkateswarlu, B.; Mahaboob, B.; Subbarami Reddy, C.; Madhusudhana Rao, B.

    2017-11-01

    The present research article first defines two popular production frontiers, viz., the Cobb-Douglas and VRTS production frontiers, together with their dual cost functions, and then derives their cost-limited maximal outputs. It is shown that the cost-limited maximal output is cost efficient. A one-sided goal programming problem is proposed by which the full Cobb-Douglas cost frontier and the full VRTS frontier can be fitted. The paper also frames the goal programming problems by which stochastic cost frontiers and stochastic VRTS frontiers are fitted. Hasan et al. [1] used a parametric approach, the Stochastic Frontier Approach (SFA), to examine the technical efficiency of the Malaysian domestic banks listed in the Kuala Lumpur Stock Exchange (KLSE) market over the period 2005-2010. Ashkan Hassani [2] described applications of Cobb-Douglas production functions in construction schedule crashing and in project risk analysis related to the duration of construction projects. Nan Jiang [3] applied stochastic frontier analysis to a panel of New Zealand dairy farms over 1998/99-2006/07.
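
    For orientation, the two model families that recur throughout these records can be written in their standard textbook forms; the notation below is generic, not the authors' own:

    ```latex
    % Cobb-Douglas production frontier and its stochastic (composed-error) form
    y_i = A \prod_{k=1}^{K} x_{ik}^{\beta_k}
    \;\;\Longrightarrow\;\;
    \ln y_i = \beta_0 + \sum_{k=1}^{K} \beta_k \ln x_{ik} + v_i - u_i,
    \qquad v_i \sim N(0,\sigma_v^2), \; u_i \ge 0.

    % Dual stochastic cost frontier: inefficiency raises cost, so u_i enters
    % with a positive sign.
    \ln C_i = \alpha_0 + \alpha_y \ln y_i + \sum_{k} \alpha_k \ln w_{ik} + v_i + u_i
    ```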

  2. Production and efficiency of large wildland fire suppression effort: A stochastic frontier analysis

    Treesearch

    Hari Katuwal; Dave Calkin; Michael S. Hand

    2016-01-01

    This study examines the production and efficiency of wildland fire suppression effort. We estimate the effectiveness of suppression resource inputs to produce controlled fire lines that contain large wildland fires using stochastic frontier analysis. Determinants of inefficiency are identified and the effects of these determinants on the daily production of...

  3. Efficiency in the Community College Sector: Stochastic Frontier Analysis

    ERIC Educational Resources Information Center

    Agasisti, Tommaso; Belfield, Clive

    2017-01-01

    This paper estimates technical efficiency scores across the community college sector in the United States. Using stochastic frontier analysis and data from the Integrated Postsecondary Education Data System for 2003-2010, we estimate efficiency scores for 950 community colleges and perform a series of sensitivity tests to check for robustness. We…

  4. The Relative Efficiencies of Research Universities of Science and Technology in China: Based on the Data Envelopment Analysis and Stochastic Frontier Analysis

    ERIC Educational Resources Information Center

    Chuanyi, Wang; Xiaohong, Lv; Shikui, Zhao

    2016-01-01

    This paper applies data envelopment analysis (DEA) and stochastic frontier analysis (SFA) to explore the relative efficiency of China's research universities of science and technology. According to the findings, when talent training is the only output, the efficiency of research universities of science and technology is far lower than that of…

  5. Stochastic Frontier Estimation of Efficient Learning in Video Games

    ERIC Educational Resources Information Center

    Hamlen, Karla R.

    2012-01-01

    Stochastic Frontier Regression Analysis was used to investigate strategies and skills that are associated with the minimization of time required to achieve proficiency in video games among students in grades four and five. Students self-reported their video game play habits, including strategies and skills used to become good at the video games…

  6. Efficiency of Finnish General Upper Secondary Schools: An Application of Stochastic Frontier Analysis with Panel Data

    ERIC Educational Resources Information Center

    Kirjavainen, Tanja

    2012-01-01

    Different stochastic frontier models for panel data are used to estimate education production functions and the efficiency of Finnish general upper secondary schools. Grades in the matriculation examination are used as an output and explained with the comprehensive school grade point average, parental socio-economic background, school resources,…

  7. Measuring hospital efficiency--comparing four European countries.

    PubMed

    Mateus, Céu; Joaquim, Inês; Nunes, Carla

    2015-02-01

    Performing international comparisons of efficiency usually has two main drawbacks: the lack of comparability of data from different countries and the appropriateness and adequacy of the data selected for efficiency measurement. With inpatient discharges for four countries, some of the problems of data comparability usually found in international comparisons were mitigated. The objectives are to assess and compare hospital efficiency levels within and between countries, using stochastic frontier analysis with both cross-sectional and panel data. Data from English (2005-2008), Portuguese (2002-2009), Spanish (2003-2009) and Slovenian (2005-2009) hospital discharges and characteristics are used. Weighted hospital discharges were considered as outputs, while the numbers of employees, physicians, nurses and beds were selected as inputs of the production function. Stochastic frontier analysis using both cross-sectional and panel data was performed, as well as ordinary least squares (OLS) analysis. The adequacy of the data was assessed with Kolmogorov-Smirnov and Breusch-Pagan/Cook-Weisberg tests. With the available data, efficiency measurement using stochastic frontier analysis on cross-sectional data proved redundant: the likelihood ratio test reveals that, in cross-sectional data, stochastic frontier analysis (SFA) is not statistically different from OLS for the Portuguese data, while SFA and OLS estimates are statistically different for the Spanish, Slovenian and English data. In the panel data, the inefficiency term is statistically different from 0 in the four countries under analysis, though for Portugal it is still close to 0. Panel data are preferred over cross-sectional analysis because the results are more robust. For all countries except Slovenia, beds and employees are relevant inputs for the production process. © The Author 2015. Published by Oxford University Press on behalf of the European Public Health Association. All rights reserved.
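
    Several records in this listing estimate frontiers by maximum likelihood and test SFA against OLS with a likelihood-ratio test. As a hedged illustration of what that involves (not the authors' code; the data, names and parameter values below are hypothetical), the sketch fits a cross-sectional production frontier with a half-normal inefficiency term:

    ```python
    # Sketch: half-normal stochastic production frontier (Aigner-Lovell-Schmidt),
    # fitted by maximum likelihood. Synthetic data; illustration only.
    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import norm, chi2

    rng = np.random.default_rng(0)
    n = 200
    X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])  # ln inputs
    beta_true = np.array([1.0, 0.6, 0.3])
    v = rng.normal(scale=0.2, size=n)                           # noise
    u = np.abs(rng.normal(scale=0.3, size=n))                   # inefficiency
    y = X @ beta_true + v - u                                   # ln output

    def negloglik(theta):
        """Negative log-likelihood of the half-normal SFA model."""
        beta, ln_sv, ln_su = theta[:3], theta[3], theta[4]
        sv, su = np.exp(ln_sv), np.exp(ln_su)
        sigma = np.sqrt(sv**2 + su**2)
        lam = su / sv
        eps = y - X @ beta                                      # composed error
        ll = (np.log(2) - np.log(sigma) + norm.logpdf(eps / sigma)
              + norm.logcdf(-eps * lam / sigma))
        return -ll.sum()

    # OLS start values, then MLE
    b_ols, *_ = np.linalg.lstsq(X, y, rcond=None)
    res = minimize(negloglik, np.r_[b_ols, np.log(0.2), np.log(0.2)],
                   method="BFGS")

    # LR test of SFA against OLS (H0: sigma_u = 0). The null distribution is
    # really a chi-square mixture, so the plain chi2(1) p-value is conservative.
    e_ols = y - X @ b_ols
    ll_ols = norm.logpdf(e_ols, scale=e_ols.std()).sum()
    lr = 2 * (-res.fun - ll_ols)
    print("LR statistic:", lr, "p <", chi2.sf(lr, df=1))
    ```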

  8. Occupational activity and cognitive reserve: implications in terms of prevention of cognitive aging and Alzheimer’s disease

    PubMed Central

    Adam, Stéphane; Bonsang, Eric; Grotz, Catherine; Perelman, Sergio

    2013-01-01

    This paper investigates the relationship between the concept of activity (both professional and nonprofessional) and cognitive functioning among older European individuals. In this research, we used data collected during the first wave of SHARE (Survey on Health, Ageing and Retirement in Europe) and a measurement approach known as stochastic frontier analysis, derived from the economic literature. SHARE includes a large population (n > 25,000) geographically distributed across Europe, and analyzes several dimensions simultaneously, including physical and mental health activity. The main advantages of stochastic frontier analysis are that it allows estimation of a parametric function relating cognitive scores to driving factors at the boundary, disentangles frontier noise from distance-to-frontier components, and tests the effect of potential factors on these distances simultaneously. The analysis reveals that all activities are positively related to cognitive functioning in elderly people. Our results are discussed in terms of prevention of cognitive aging and Alzheimer’s disease, and regarding the potential impact that some retirement programs might have on cognitive functioning in individuals across Europe. PMID:23671387

  9. Technical indicators of economic performance in dairy sheep farming.

    PubMed

    Theodoridis, A; Ragkos, A; Roustemis, D; Arsenos, G; Abas, Z; Sinapis, E

    2014-01-01

    In this study, the level of technical efficiency of 58 sheep farms rearing the Chios breed in Greece was measured through the application of the stochastic frontier analysis method. A translog stochastic frontier production function was estimated using farm accounting data from Chios sheep farms, and the impact of various socio-demographic and biophysical factors on the estimated efficiency of the farms was evaluated. The farms were classified into efficiency groups on the basis of the estimated level of efficiency, and a technical and economic descriptive analysis was applied to give an indicative picture of their structure and productivity. The results of the stochastic frontier model indicate that there are substantial production inefficiencies among the Chios sheep farms and that these farms could increase their production through improved technical efficiency, whereas the results of the inefficiency effects model reveal that farm-specific explanatory factors can partly explain the observed efficiency differentials. The measurement of technical inefficiency and the detection of its determinants can form the basis of policy recommendations that could contribute to the development of the sector.

  10. Measuring the Efficiency of a Hospital based on the Econometric Stochastic Frontier Analysis (SFA) Method.

    PubMed

    Rezaei, Satar; Zandian, Hamed; Baniasadi, Akram; Moghadam, Telma Zahirian; Delavari, Somayeh; Delavari, Sajad

    2016-02-01

    Hospitals are the most expensive health services providers in the world. Therefore, the evaluation of their performance can be used to reduce costs. The aim of this study was to determine the efficiency of the hospitals at the Kurdistan University of Medical Sciences using stochastic frontier analysis (SFA). This was a cross-sectional and retrospective study that assessed the performance of Kurdistan teaching hospitals (n = 12) between 2007 and 2013. The stochastic frontier analysis method was used to achieve this aim. The numbers of active beds, nurses, physicians, and other staff members were considered as input variables, while inpatient admissions were considered as the output. The data were analyzed using Frontier 4.1 software. The mean technical efficiency of the hospitals we studied was 0.67. The results of the Cobb-Douglas production function showed that the maximum elasticity was related to active beds and that the elasticity of nurses was negative. Also, returns to scale were increasing. The results of this study indicated that the performance of the hospitals was not appropriate in terms of technical efficiency. In addition, there was capacity to enhance the output of the hospitals, compared with the most efficient hospitals studied, by about 33%. It is suggested that the effect of various factors, such as the quality of health care and patients' satisfaction, be considered in future studies assessing hospitals' performances.
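
    The elasticity and returns-to-scale statements in this abstract follow mechanically from the Cobb-Douglas coefficients; as a generic reminder (notation is ours, not the paper's):

    ```latex
    % Cobb-Douglas in logs: elasticities are the coefficients themselves
    \ln y = \beta_0 + \sum_k \beta_k \ln x_k,
    \qquad
    \frac{\partial \ln y}{\partial \ln x_k} = \beta_k,
    \qquad
    \mathrm{RTS} = \sum_k \beta_k .
    ```

    RTS > 1 corresponds to the increasing returns to scale reported above, and a negative coefficient (here, for nurses) means output falls at the margin as that input grows.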

  11. Frontier production function estimates for steam electric generation: a comparative analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kopp, R.J.; Smith, V.K.

    1980-04-01

    This study compares the performance of three frontier estimators for steam electric generation in terms of their treatment of new production technologies and technical efficiency. The Cobb-Douglas, constant elasticity of substitution, and translog production functions are examined, using the Aigner-Chu linear programming estimator, the sophisticated Aigner-Lovell-Schmidt stochastic frontier, and the direct method of adjusted ordinary least squares. The use of the Cobb-Douglas specification is judged to have narrowed the perceived difference between competing estimators. The choice of frontier estimator is concluded to have a greater effect on measured plant efficiency than the choice of functional form.

  12. Modeling stochastic frontier based on vine copulas

    NASA Astrophysics Data System (ADS)

    Constantino, Michel; Candido, Osvaldo; Tabak, Benjamin M.; da Costa, Reginaldo Brito

    2017-11-01

    This article models a production function and analyzes the technical efficiency of listed companies in the United States, Germany and England between 2005 and 2012, based on the vine copula approach. Traditional estimates of the stochastic frontier assume that the data are multivariate normally distributed and that there is no source of asymmetry. The proposed method based on vine copulas allows us to explore different types of asymmetry and multivariate distributions. Using data on product, capital and labor, we measure the relative efficiency of the vine production function and estimate the coefficient used in the stochastic frontier literature for comparison purposes. This production vine copula predicts the value added by firms with given capital and labor in a probabilistic way. It thereby stands in sharp contrast to the production function, where the output of firms is completely deterministic. The results show that, on average, S&P500 companies are more efficient than companies listed in England and Germany, which presented similar average efficiency coefficients. For comparative purposes, the traditional stochastic frontier was estimated; the results showed discrepancies between the coefficients obtained by the two methods, traditional and frontier-vine, opening new paths of non-linear research.

  13. Scale, mergers and efficiency: the case of Dutch housing corporations.

    PubMed

    Veenstra, Jacob; Koolma, Hendrik M; Allers, Maarten A

    2017-01-01

    The efficiency of social housing providers is a contentious issue. In the Netherlands, there is a widespread belief that housing corporations have substantial potential for efficiency improvements. A related question is whether scale influences efficiency, since recent decades have shown a trend of mergers among corporations. This paper offers a framework to assess the effects of scale and mergers on the efficiency of Dutch housing corporations by using both a data envelopment analysis and a stochastic frontier analysis, using panel data for 2001-2012. The results indicate that most housing corporations operate under diseconomies of scale, implying that merging would be undesirable in most cases. However, merging may have beneficial effects on pure technical efficiency as it forces organizations to reconsider existing practices. A data envelopment analysis indeed confirms this hypothesis, but these results cannot be replicated by a stochastic frontier analysis, meaning that the evidence for this effect is not robust.

  14. A Bayesian Approach for Evaluation of Determinants of Health System Efficiency Using Stochastic Frontier Analysis and Beta Regression.

    PubMed

    Şenel, Talat; Cengiz, Mehmet Ali

    2016-01-01

    Public expenditures on health are one of the most important issues for governments, and these increased expenditures are putting pressure on public budgets. Therefore, health policy makers have focused on the performance of their health systems, and many countries have introduced reforms to improve that performance. This study investigates the most important determinants of healthcare efficiency for OECD countries using a second-stage approach for Bayesian stochastic frontier analysis (BSFA). There are two steps in this study. First, we measure the healthcare efficiency of 29 OECD countries by BSFA using data from the OECD Health Database. At the second stage, we examine the multiple relationships between healthcare efficiency and the characteristics of healthcare systems across OECD countries using Bayesian beta regression.

  15. Evaluating Kuala Lumpur stock exchange oriented bank performance with stochastic frontiers

    NASA Astrophysics Data System (ADS)

    Baten, M. A.; Maznah, M. K.; Razamin, R.; Jastini, M. J.

    2014-12-01

    Banks play an essential role in economic development and need to be efficient; otherwise, they may create blockages in the process of development in any country. The efficiency of banks in Malaysia is important and should receive greater attention. This study formulated an appropriate stochastic frontier model to investigate the efficiency of banks traded on the Kuala Lumpur Stock Exchange (KLSE) market during the period 2005-2009. The maximum likelihood method was used to estimate the parameters of the stochastic production frontier. Unlike earlier studies, which use balance sheet and income statement data, this study used market data as the input and output variables. It was observed that banks listed on the KLSE exhibited a commendable overall efficiency level of 96.2% during 2005-2009, suggesting minimal input waste of 3.8%. Among the banks, COMS (CIMB Group Holdings) is found to be highly efficient with a score of 0.9715, and BIMB (BIMB Holdings) is noted to have the lowest efficiency with a score of 0.9582. The results also show that the Cobb-Douglas stochastic frontier model with the truncated normal distributional assumption is preferable to the translog stochastic frontier model.
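
    Preferences between nested frontier specifications, such as Cobb-Douglas versus translog here, are usually settled with a generalized likelihood-ratio test. A minimal sketch (the log-likelihood values and degrees of freedom below are hypothetical placeholders; in practice they come from the two fitted models):

    ```python
    # Sketch: generalized likelihood-ratio test between nested frontier models,
    # e.g. Cobb-Douglas (restricted) vs. translog (unrestricted).
    from scipy.stats import chi2

    ll_restricted = -112.4    # hypothetical Cobb-Douglas log-likelihood
    ll_unrestricted = -108.9  # hypothetical translog log-likelihood
    df = 6                    # number of restrictions (squared and cross terms)

    lr = 2 * (ll_unrestricted - ll_restricted)
    print(f"LR = {lr:.2f}, critical chi2(0.95, {df}) = {chi2.ppf(0.95, df):.2f}")
    # Fail to reject the Cobb-Douglas restriction if LR < critical value.
    ```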

  16. Evaluating Kuala Lumpur stock exchange oriented bank performance with stochastic frontiers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baten, M. A.; Maznah, M. K.; Razamin, R.

    Banks play an essential role in economic development and need to be efficient; otherwise, they may create blockages in the process of development in any country. The efficiency of banks in Malaysia is important and should receive greater attention. This study formulated an appropriate stochastic frontier model to investigate the efficiency of banks traded on the Kuala Lumpur Stock Exchange (KLSE) market during the period 2005-2009. The maximum likelihood method was used to estimate the parameters of the stochastic production frontier. Unlike earlier studies, which use balance sheet and income statement data, this study used market data as the input and output variables. It was observed that banks listed on the KLSE exhibited a commendable overall efficiency level of 96.2% during 2005-2009, suggesting minimal input waste of 3.8%. Among the banks, COMS (CIMB Group Holdings) is found to be highly efficient with a score of 0.9715, and BIMB (BIMB Holdings) is noted to have the lowest efficiency with a score of 0.9582. The results also show that the Cobb-Douglas stochastic frontier model with the truncated normal distributional assumption is preferable to the translog stochastic frontier model.

  17. The Cost Efficiency Impact of the University Operation Fund on Public Universities in Taiwan

    ERIC Educational Resources Information Center

    Kuo, Jenn-Shyong; Ho, Yi-Cheng

    2008-01-01

    This study uses the stochastic frontier multiple-product cost function that is modeled after Battese and Coelli [Battese, G. E., & Coelli, T. J. (1995). "A model for technical inefficiency effects in a stochastic frontier production for panel data." "Empirical Economics" 20(2), 325-332.] in order to empirically measure the…
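
    The Battese and Coelli (1995) specification cited here is the inefficiency-effects model used by several other records in this listing; in generic notation it reads:

    ```latex
    % Battese & Coelli (1995) inefficiency-effects model, estimated in one step
    y_{it} = x_{it}\beta + v_{it} - u_{it},
    \qquad v_{it} \sim N(0, \sigma_v^2),
    \qquad u_{it} \sim N^{+}(z_{it}\delta, \sigma_u^2),
    ```

    where z_{it} are the inefficiency determinants and N^{+} denotes the normal distribution truncated at zero, so the frontier and the inefficiency effects are estimated jointly.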

  18. The efficiency of life insurance and family Takaful in Malaysia: Relative efficiency using the stochastic cost frontier analysis

    NASA Astrophysics Data System (ADS)

    Baharin, Roziana; Isa, Zaidi

    2013-04-01

    This paper focuses on the stochastic cost frontier analysis (SFA) approach in an attempt to measure the relationship between efficiency and organizational structure for Takaful and insurance operators in Malaysia's dual financial system. The study applied a flexible cost functional form, i.e., the Fourier flexible functional form, to a sample of 19 firms observed between 2002 and 2010, employing the Battese and Coelli time-invariant efficiency model. The findings show that, on average, there is a significant difference in cost efficiency between the Takaful industry and the insurance industry. Takaful was found to have lower cost efficiency than conventional insurance, which shows that organizational form has an influence on efficiency. Overall, the level of efficiency scores for both life insurance and family Takaful was observed not to vary across time.

  19. Production and efficiency of large wildland fire suppression effort: A stochastic frontier analysis.

    PubMed

    Katuwal, Hari; Calkin, David E; Hand, Michael S

    2016-01-15

    This study examines the production and efficiency of wildland fire suppression effort. We estimate the effectiveness of suppression resource inputs to produce controlled fire lines that contain large wildland fires using stochastic frontier analysis. Determinants of inefficiency are identified and the effects of these determinants on the daily production of controlled fire line are examined. Results indicate that the use of bulldozers and fire engines increases the production of controlled fire line, while firefighter crews do not tend to contribute to controlled fire line production. Production of controlled fire line is more efficient if it occurs along natural or built breaks, such as rivers and roads, and within areas previously burned by wildfires. However, results also indicate that the productivity and efficiency of controlled fire line are sensitive to weather, landscape and fire characteristics. Copyright © 2015 Elsevier Ltd. All rights reserved.

  20. Stochastic Frontier Approach and Data Envelopment Analysis to Total Factor Productivity and Efficiency Measurement of Bangladeshi Rice

    PubMed Central

    Hossain, Md. Kamrul; Kamil, Anton Abdulbasah; Baten, Md. Azizul; Mustafa, Adli

    2012-01-01

    The objective of this paper is to apply the translog stochastic frontier production model (SFA) and data envelopment analysis (DEA) to estimate efficiencies over time and the total factor productivity (TFP) growth rate for Bangladeshi rice crops (Aus, Aman and Boro), using the most recent data available, covering the period 1989-2008. Results indicate that technical efficiency was higher for Boro among the three types of rice, but the overall technical efficiency of rice production was found to be around 50%. Although positive changes in TFP exist for the sample analyzed, the average growth rate of TFP for rice production was estimated at almost the same level for both the translog SFA with half-normal distribution and DEA. Estimated TFP from SFA is forecast with an ARIMA(2,0,0) model, and an ARIMA(1,0,0) model is used to forecast the TFP of Aman from the DEA estimation. PMID:23077500
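
    As a hedged illustration of the forecasting step described above (the TFP series below is synthetic, not the paper's data), an ARIMA(2,0,0) model can be fitted with statsmodels:

    ```python
    # Sketch: forecasting an estimated TFP index with an ARIMA(2,0,0) model.
    import numpy as np
    from statsmodels.tsa.arima.model import ARIMA

    rng = np.random.default_rng(1)
    tfp = 1.0 + np.cumsum(rng.normal(0.005, 0.02, size=20))  # hypothetical TFP

    model = ARIMA(tfp, order=(2, 0, 0))  # AR(2) with constant, no differencing
    fit = model.fit()
    print(fit.summary())
    print(fit.forecast(steps=3))         # three-period-ahead TFP forecast
    ```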

  1. Managerial performance and cost efficiency of Japanese local public hospitals: a latent class stochastic frontier model.

    PubMed

    Besstremyannaya, Galina

    2011-09-01

    The paper explores the link between managerial performance and cost efficiency of 617 Japanese general local public hospitals in 1999-2007. Treating managerial performance as unobservable heterogeneity, the paper employs a panel data stochastic cost frontier model with latent classes. Financial parameters associated with better managerial performance are found to be positively significant in explaining the probability of belonging to the more efficient latent class. The analysis of latent class membership was consistent with the conjecture that unobservable technological heterogeneity reflected in the existence of the latent classes is related to managerial performance. The findings may support the cause for raising efficiency of Japanese local public hospitals by enhancing the quality of management. Copyright © 2011 John Wiley & Sons, Ltd.

  2. Efficacy of a numerical value of a fixed-effect estimator in stochastic frontier analysis as an indicator of hospital production structure.

    PubMed

    Kawaguchi, Hiroyuki; Hashimoto, Hideki; Matsuda, Shinya

    2012-09-22

    The casemix-based payment system has been adopted in many countries, although it often needs complementary adjustment to take account of each hospital's unique production structure, such as teaching and research duties and non-profit motives. It has been challenging to numerically evaluate the impact of such structural heterogeneity on production separately from production inefficiency. The current study adopted stochastic frontier analysis and proposed a method to assess unique components of hospital production structures using a fixed-effect variable. There were two stages of analysis in this study. In the first stage, we estimated the efficiency score from the hospital production function using a true fixed-effect model (TFEM) in stochastic frontier analysis. The use of a TFEM allowed us to differentiate the unobserved heterogeneity of individual hospitals as hospital-specific fixed effects. In the second stage, we regressed the obtained fixed-effect variable on structural components of the hospitals to test whether it was explicitly related to the characteristics and local disadvantages of the hospitals. In the first-stage analysis, the estimated efficiency score was approximately 0.6. The mean value of the fixed-effect estimator was 0.784, the standard deviation was 0.137, and the range was between 0.437 and 1.212. The second-stage regression confirmed that the value of the fixed effect was significantly correlated with advanced technology and the local conditions of the sample hospitals. The obtained fixed-effect estimator may reflect hospitals' unique production structures, allowing for production inefficiency. The values of fixed-effect estimators can be used as evaluation tools to improve fairness in the reimbursement system for various functions of hospitals based on casemix classification.
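
    A schematic of the two-stage logic described above, in generic notation (a sketch of the approach, not the paper's exact specification):

    ```latex
    % Stage 1: "true" fixed-effect stochastic frontier (TFEM)
    \ln y_{it} = \alpha_i + x_{it}\beta + v_{it} - u_{it},
    \qquad v_{it} \sim N(0,\sigma_v^2), \; u_{it} \ge 0,

    % Stage 2: regress the estimated fixed effects on structural variables z_i
    \hat{\alpha}_i = z_i \gamma + e_i .
    ```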

  3. What does free cash flow tell us about hospital efficiency? A stochastic frontier analysis of cost inefficiency in California hospitals.

    PubMed

    Pratt, William R

    2010-01-01

    Hospitals are facing substantial financial and economic pressure as a result of health plan payment restructuring, unfunded mandates, and other factors. This article analyzes the relationship between free cash flow (FCF) and hospital efficiency given these financial challenges. Data from 270 California hospitals were used to estimate a stochastic frontier model of hospital cost efficiency that explicitly takes into account outpatient heterogeneity. The findings indicate that hospital FCF is significantly linked to firm efficiency/inefficiency. The results indicate that higher positive cash flows are related to lower cost inefficiency, but higher negative cash flows are related to higher cost inefficiency. Thus, cash flows not only impact the ability of hospitals to meet current liabilities, they are also related to the ability of the hospitals to use resources effectively.

  4. Relationship between recycling rate and air pollution: Waste management in the state of Massachusetts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Giovanis, Eleftherios, E-mail: giovanis95@gmail.com

    Highlights: • This study examines the relationship between the recycling rate of solid waste and air pollution. • A fixed effects Stochastic Frontier Analysis model with panel data is employed. • The case study is a waste municipality survey in the state of Massachusetts during 2009–2012. • The findings support a negative relationship between air pollution and recycling. - Abstract: This study examines the relationship between the recycling rate of solid waste and air pollution using data from a waste municipality survey in the state of Massachusetts during the period 2009–2012. Two econometric approaches are applied. The first approach is a fixed effects model, while the second is a Stochastic Frontier Analysis (SFA) with fixed effects. The advantage of the first approach is the ability to control for stable, time-invariant characteristics of the municipalities, thereby eliminating potentially large sources of bias. The second approach is applied in order to estimate the technical efficiency of each municipality and rank the municipalities accordingly. The regressions control for various demographic, economic and recycling-service factors, such as income per capita, population density, unemployment, trash services, the Pay-as-you-throw (PAYT) program and meteorological data. The findings show a negative relationship between fine particulate matter (particles in the air 2.5 μm or less in size, PM2.5) and the recycling rate. In addition, pollution increases with income per capita up to $23,000–$26,000, after which income contributes positively to air quality. Finally, based on the efficiency derived from the Stochastic Frontier Analysis (SFA) model, the municipalities that provide both drop-off and curbside services for trash, food and yard waste, along with the PAYT program, present better performance regarding air quality.

  5. Measuring Efficiency of Health Systems of the Middle East and North Africa (MENA) Region Using Stochastic Frontier Analysis.

    PubMed

    Hamidi, Samer; Akinci, Fevzi

    2016-06-01

    The main purpose of this study is to measure the technical efficiency of twenty health systems in the Middle East and North Africa (MENA) region to inform evidence-based health policy decisions. In addition, the effects of alternative stochastic frontier model specifications on the empirical results are examined. We conducted a stochastic frontier analysis to estimate country-level technical efficiencies using secondary panel data for 20 MENA countries for the period 1995-2012 from the World Bank database. We also tested the effect of alternative frontier model specification using three random-effects approaches: a time-invariant model, where efficiency effects are assumed to be static with regard to time; a time-varying efficiency model, where efficiency effects have temporal variation; and a model that accounts for heterogeneity. The average estimated technical inefficiency of health systems in the MENA region was 6.9%, with a range of 5.7-7.9% across the three models. Among the top performers, Lebanon, Qatar, and Morocco rank consistently high according to the three different inefficiency model specifications. On the opposite side, Sudan, Yemen and Djibouti ranked among the worst performers. On average, the two most technically efficient countries were Qatar and Lebanon. We found that the estimated technical efficiency scores vary substantially across alternative parametric models. Based on the findings reported in this study, most MENA countries appear to be operating, on average, with a reasonably high degree of technical efficiency compared with other countries in the region. However, there is evidence to suggest that considerable efficiency gains are yet to be made by some MENA countries. Additional empirical research is needed to inform future health policies aimed at improving both the efficiency and sustainability of health systems in the MENA region.
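
    One common way to let efficiency vary over time in a random-effects frontier, and a plausible reading of the "time-varying efficiency model" above (an assumption on our part; the record does not name it), is the Battese-Coelli (1992) decay specification:

    ```latex
    u_{it} = u_i \exp\{-\eta\,(t - T)\}, \qquad u_i \sim N^{+}(\mu, \sigma_u^2),
    ```

    where \eta > 0 implies inefficiency shrinking toward the end of the panel and \eta = 0 recovers the time-invariant model.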

  6. Dairy farm cost efficiency.

    PubMed

    Tauer, L W; Mishra, A K

    2006-12-01

    A stochastic cost equation was estimated for US dairy farms using national data from the 2000 production year to determine how farmers might reduce their cost of production. The cost of producing a unit of milk was decomposed into separate frontier (efficient) and inefficiency components, with both components estimated as functions of management and causation variables. Variables were entered into both the frontier component and the efficiency component of the stochastic cost curve, because a priori either component could be affected. A factor that affected the cost frontier was the number of hours per day the milking facility is used. Using the milking facility for more hours per day decreased frontier costs; however, inefficiency increased with increased hours of milking facility use. Thus, farmers can decrease costs with increased utilization of the milking facility, but only if they are efficient in this strategy. Parlors, compared with stanchions used for milking, did not decrease frontier costs, but decreased costs through increased efficiency, as did the use of a nutritionist. Use of rotational grazing decreased frontier costs but also increased inefficiency. Older farmers were less efficient.

  7. A comparison of DEA and SFA using micro- and macro-level perspectives: Efficiency of Chinese local banks

    NASA Astrophysics Data System (ADS)

    Silva, Thiago Christiano; Tabak, Benjamin Miranda; Cajueiro, Daniel Oliveira; Dias, Marina Villas Boas

    2017-03-01

    This study investigates to what extent results produced by a single frontier model are reliable, based on the application of data envelopment analysis and the stochastic frontier approach to a sample of Chinese local banks. Our findings show that the two methods produce a consistent trend in global efficiency scores over the years. However, rank correlations indicate that they diverge with respect to individual performance diagnoses. Therefore, these models provide steady information on the efficiency of the banking system as a whole, but they become divergent at the individual level.
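
    For readers less familiar with the DEA side of such comparisons, here is a minimal input-oriented CCR envelopment model solved as a linear program with scipy; the two-input, one-output data are made up for illustration (in the papers above, the inputs and outputs would be balance-sheet items):

    ```python
    # Sketch: input-oriented CCR DEA efficiency scores via linear programming.
    import numpy as np
    from scipy.optimize import linprog

    X = np.array([[2.0, 3.0], [4.0, 2.0], [3.0, 5.0], [5.0, 4.0]])  # inputs (n x m)
    Y = np.array([[1.0], [1.5], [1.2], [2.0]])                      # outputs (n x s)
    n, m = X.shape
    s = Y.shape[1]

    def ccr_score(k):
        """min theta s.t. sum_j lam_j x_j <= theta x_k, sum_j lam_j y_j >= y_k."""
        c = np.r_[1.0, np.zeros(n)]               # variables: [theta, lambda_1..n]
        A_in = np.c_[-X[k], X.T]                  # X^T lam - theta * x_k <= 0
        A_out = np.c_[np.zeros(s), -Y.T]          # -Y^T lam <= -y_k
        res = linprog(c,
                      A_ub=np.vstack([A_in, A_out]),
                      b_ub=np.r_[np.zeros(m), -Y[k]],
                      bounds=[(0, None)] * (1 + n),
                      method="highs")
        return res.x[0]

    for k in range(n):
        print(f"DMU {k}: efficiency = {ccr_score(k):.3f}")
    ```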

  8. Cost efficiency of US hospitals: a stochastic frontier approach.

    PubMed

    Rosko, M D

    2001-09-01

    This study examined the impact of managed care and other environmental factors on hospital inefficiency in 1631 US hospitals during the period 1990-1996. A panel stochastic frontier regression model was used to estimate inefficiency parameters and inefficiency scores. The results suggest that mean estimated inefficiency decreased by about 28% during the study period. Inefficiency was negatively associated with health maintenance organization (HMO) penetration and industry concentration. It was positively related to Medicare share and for-profit ownership status. Copyright 2001 John Wiley & Sons, Ltd.

  9. An Analysis of Costs in Institutions of Higher Education in England

    ERIC Educational Resources Information Center

    Johnes, Geraint; Johnes, Jill; Thanassoulis, Emmanuel

    2008-01-01

    Cost functions are estimated, using random effects and stochastic frontier methods, for English higher education institutions. The article advances on existing literature by employing finer disaggregation by subject, institution type and location, and by introducing consideration of quality effects. Estimates are provided of average incremental…

  10. TU-C-17A-08: Improving IMRT Planning and Reducing Inter-Planner Variability Using the Stochastic Frontier Method: Validation Based On Clinical and Simulated Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gagne, MC; Archambault, L; CHU de Quebec, Quebec, Quebec

    2014-06-15

    Purpose: Intensity modulated radiation therapy always requires compromises between PTV coverage and organ at risk (OAR) sparing. We previously developed metrics that correlate doses to OARs with specific patients' morphology using stochastic frontier analysis (SFA). Here, we aim to examine the validity of this approach using a large set of realistically simulated dosimetric and geometric data. Methods: SFA describes a set of treatment plans as an asymmetric distribution with respect to a frontier defining optimal plans. Eighty head and neck IMRT plans were used to establish a metric predicting the mean dose to parotids as a function of simple geometric parameters. A database of 140 parotids was used as a basis distribution to simulate physically plausible geometric and dose data. Distributions comprising between 20 and 5000 organs were simulated, and the SFA was applied to obtain new frontiers, which were compared to the original frontier. Results: It was possible to simulate distributions consistent with the original dataset. Below 160 organs, the SFA could not always describe distributions as asymmetric: a few cases showed a Gaussian or half-Gaussian distribution. In order to converge to a stable solution, the number of organs in a distribution must ideally be above 100, but in many cases stable parameters could be achieved with as few as 60 samples of organ data. The mean RMS error of the new frontiers was significantly reduced when additional organs were used. Conclusion: The number of organs in a distribution was shown to have an impact on the effectiveness of the model. It is always possible to obtain a frontier, but if the number of organs in the distribution is small (< 160), it may not represent the lowest dose achievable. These results will be used to determine the number of cases necessary to adapt the model to other organs.

  11. Exploring the Impact of Inadequate Funding for English Language Learners in Colorado School Districts

    ERIC Educational Resources Information Center

    Ramirez, Al; Carpenter, Dick M., II; Breckenridge, Maureen

    2014-01-01

    This study investigates the relationship between the academic achievement of all students and inadequate funding for English language learners in Colorado school districts. Several stochastic frontier analysis models were used in lieu of traditional production functions in order to achieve clearer estimates. The analyses detected only a few…

  12. Estimating size and scope economies in the Portuguese water sector using the Bayesian stochastic frontier analysis.

    PubMed

    Carvalho, Pedro; Marques, Rui Cunha

    2016-02-15

    This study aims to search for economies of size and scope in the Portuguese water sector, applying Bayesian and classical statistics to make inference in stochastic frontier analysis (SFA). The study demonstrates the usefulness and advantages of Bayesian statistics for making inference in SFA over traditional SFA, which uses only classical statistics. The Bayesian methods allow overcoming some problems that arise in the application of traditional SFA, such as bias in small samples and skewness of residuals. In the present case study of the water sector in Portugal, these Bayesian methods provide more plausible and acceptable results. Based on the results obtained, we found that there are important economies of output density, economies of size, economies of vertical integration and economies of scope in the Portuguese water sector, pointing to the considerable advantages of undertaking mergers that join the retail and wholesale components and the drinking water and wastewater services. Copyright © 2015 Elsevier B.V. All rights reserved.
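
    For reference, the standard cost-function measures behind "economies of scope" and "economies of size" read as follows (textbook definitions; the paper's exact measures may differ):

    ```latex
    % Economies of scope between outputs y_1 (e.g., water) and y_2 (wastewater)
    S_C = \frac{C(y_1, 0) + C(0, y_2) - C(y_1, y_2)}{C(y_1, y_2)},
    \qquad S_C > 0 \ \Rightarrow\ \text{scope economies};

    % Ray economies of scale for an output vector y
    S = \frac{C(y)}{\sum_i y_i \, \partial C(y)/\partial y_i},
    \qquad S > 1 \ \Rightarrow\ \text{size economies}.
    ```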

  13. Performance of US teaching hospitals: a panel analysis of cost inefficiency.

    PubMed

    Rosko, Michael D

    2004-02-01

    This research summarizes an analysis of the impact of environmental pressures on hospital inefficiency during the period 1990-1999. The panel design included 616 hospitals. Of these, 211 were academic medical centers and 415 were hospitals with smaller teaching programs. The primary sources of data were the American Hospital Association's Annual Survey of Hospitals and Medicare Cost Reports. Hospital inefficiency was estimated by a regression technique called stochastic frontier analysis. This technique estimates a "best practice cost frontier" for each hospital based on the hospital's outputs and input prices. The cost efficiency of each hospital was defined as the ratio of the stochastic frontier total costs to observed total costs. Average inefficiency declined from 14.35% in 1990 to 11.42% in 1998, then increased to 11.78% in 1999. Decreases in inefficiency were associated with the HMO penetration rate and time. Increases in inefficiency were associated with for-profit ownership status and Medicare share of admissions. The implementation of the provisions of the Balanced Budget Act of 1997 was followed by a small decrease in average hospital inefficiency. The analysis found that the SFA results were moderately sensitive to the specification of the teaching output variable. Thus, although the SFA technique can be useful for detecting differences in inefficiency between groups of hospitals (i.e., those with high versus low Medicare shares, or for-profit versus not-for-profit hospitals), its relatively low precision indicates it should not be used for exact estimates of the magnitude of differences associated with inefficiency-effects variables.

  14. Relationships of efficiency to reproductive disorders in Danish milk production: a stochastic frontier analysis.

    PubMed

    Lawson, L G; Bruun, J; Coelli, T; Agger, J F; Lund, M

    2004-01-01

    Relationships of various reproductive disorders and milk production performance of Danish dairy farms were investigated. A stochastic frontier production function was estimated using data collected in 1998 from 514 Danish dairy farms. Measures of farm-level milk production efficiency relative to this production frontier were obtained, and relationships between milk production efficiency and the incidence risk of reproductive disorders were examined. There were moderate positive relationships between milk production efficiency and retained placenta, induction of estrus, uterine infections, ovarian cysts, and induction of birth. Inclusion of reproductive management variables showed that these moderate relationships disappeared, but directions of coefficients for almost all those variables remained the same. Dystocia showed a weak negative correlation with milk production efficiency. Farms that were mainly managed by young farmers had the highest average efficiency scores. The estimated milk losses due to inefficiency averaged 1142, 488, and 256 kg of energy-corrected milk per cow, respectively, for low-, medium-, and high-efficiency herds. It is concluded that the availability of younger cows, which enabled farmers to replace cows with reproductive disorders, contributed to high cow productivity in efficient farms. Thus, a high replacement rate more than compensates for the possible negative effect of reproductive disorders. The use of frontier production and efficiency/inefficiency functions to analyze herd data may enable dairy advisors to identify inefficient herds and to simulate the effect of alternative management procedures on the individual herd's efficiency.

  15. Returns and determinants of technical efficiency in small-scale Malabari goat production units in Kerala, India.

    PubMed

    Alex, Rani; Kunniyoor Cheemani, Raghavan; Thomas, Naicy

    2013-11-01

    A stochastic frontier production function was employed to measure technical efficiency and its determinants in smallholder Malabari goat production units in Kerala, India. Data were obtained from 100 goat farmers in northern Kerala, selected using multistage random sampling. The parameters of the stochastic frontier production function were estimated using the maximum likelihood method. Cost and return analysis showed that the major expenditure was feed and fodder, and veterinary expenses were secondary. The chief returns were the sale of live animals, milk and manure. Individual farm technical efficiency ranged from 0.34 to 0.97 with a mean of 0.88. The study found herd size (number of animal units) and centre (locality of farm) significantly affected technical efficiency, but sex of farmer, education, land size and family size did not. Technical efficiency decreased as herd size increased; half the units with five or more adult animals had technical efficiency below 60 %.

  16. A Cobb Douglas Stochastic Frontier Model on Measuring Domestic Bank Efficiency in Malaysia

    PubMed Central

    Hasan, Md. Zobaer; Kamil, Anton Abdulbasah; Mustafa, Adli; Baten, Md. Azizul

    2012-01-01

    The banking system plays an important role in the economic development of any country. Domestic banks, the main components of the banking system, have to be efficient; otherwise, they may create obstacles in the process of development of any economy. This study examines the technical efficiency of the Malaysian domestic banks listed on the Kuala Lumpur Stock Exchange (KLSE) market over the period 2005–2010. A parametric approach, the Stochastic Frontier Approach (SFA), is used in this analysis. The findings show that Malaysian domestic banks exhibited an average overall efficiency of 94 percent, implying that the sample banks wasted an average of 6 percent of their inputs. Among the banks, RHBCAP is found to be highly efficient with a score of 0.986, and PBBANK is noted to have the lowest efficiency with a score of 0.918. The results also show that the level of efficiency increased during the period of study and that the technical efficiency effect fluctuated considerably over time. PMID:22900009

  17. A Cobb Douglas stochastic frontier model on measuring domestic bank efficiency in Malaysia.

    PubMed

    Hasan, Md Zobaer; Kamil, Anton Abdulbasah; Mustafa, Adli; Baten, Md Azizul

    2012-01-01

    The banking system plays an important role in the economic development of any country. Domestic banks, the main components of the banking system, have to be efficient; otherwise, they may create obstacles in the process of development of any economy. This study examines the technical efficiency of the Malaysian domestic banks listed on the Kuala Lumpur Stock Exchange (KLSE) market over the period 2005-2010. A parametric approach, the Stochastic Frontier Approach (SFA), is used in this analysis. The findings show that Malaysian domestic banks exhibited an average overall efficiency of 94 percent, implying that the sample banks wasted an average of 6 percent of their inputs. Among the banks, RHBCAP is found to be highly efficient with a score of 0.986, and PBBANK is noted to have the lowest efficiency with a score of 0.918. The results also show that the level of efficiency increased during the period of study and that the technical efficiency effect fluctuated considerably over time.

  18. Resources and Research Production in Higher Education: A Longitudinal Analysis of Chinese Universities, 2000-2010

    ERIC Educational Resources Information Center

    Zhang, Liang; Bao, Wei; Sun, Liang

    2016-01-01

    In this study we examined the resource-research relationship at China's research universities. The stochastic frontier production function was employed in analyses of a panel data set on a group of the most research-intensive universities in China from 2000 to 2010. Results suggested overall tight relationships between various resources (including…

  19. Higher Education and Efficiency in Europe: A Comparative Analysis

    ERIC Educational Resources Information Center

    Sánchez-Pérez, Rosario

    2012-01-01

    This paper analyses the efficiency of higher education in equalizing the feasible wages obtained by men and women in the labour market. To do that, two stochastic frontiers are estimated. The first one measures the effect of higher education within the groups of men and women for six European countries. The results indicate that in Denmark,…

  20. Bachelor's Degree Productivity X-Inefficiency: The Role of State Higher Education Policy

    ERIC Educational Resources Information Center

    Titus, Marvin A.

    2010-01-01

    Using stochastic frontier analysis and dynamic fixed-effects panel modeling, this study examines how changes in the x-inefficiency of bachelor's degree production are influenced by changes in state higher education policy. The findings from this research show that increases in need-based state financial aid help to mitigate the convergence among…

  1. Evaluating the Impact of Hospital Efficiency on Wellness in the Military Health System.

    PubMed

    Bastian, Nathaniel D; Kang, Hyojung; Swenson, Eric R; Fulton, Lawrence V; Griffin, Paul M

    2016-08-01

    Like all health care delivery systems, the U.S. Department of Defense Military Health System (MHS) strives to achieve top preventative care and population health outcomes for its members while operating at an efficient level and containing costs. The objective of this study is to understand the overall efficiency performance of military hospitals and investigate the relationship between efficiency and wellness. This study uses data envelopment analysis and stochastic frontier analysis to compare the efficiency of 128 military treatment facilities from the Army, Navy, and Air Force during the period of 2011 to 2013. Fixed effects panel regression is used to determine the association between the hospital efficiency and wellness scores. The results indicate that data envelopment analysis and stochastic frontier analysis efficiency scores are congruent in direction. Both results indicate that the majority of the MHS hospitals and clinics can potentially improve their productive efficiency by managing their input resources better. When comparing the performance of the three military branches of service, Army hospitals as a group outperformed their Navy and Air Force counterparts; thus, best practices from the Army should be shared across service components. The findings also suggest no statistically significant, positive association between efficiency and wellness over time in the MHS. Reprint & Copyright © 2016 Association of Military Surgeons of the U.S.

  2. What We Can Learn from the Use of Student Data in Efficiency Analysis within the Context of Higher Education?

    ERIC Educational Resources Information Center

    Barra, Cristian; Zotti, Roberto

    2017-01-01

    The main purpose of the paper is to estimate the efficiency of a big public university in Italy using individual student-level data, modeling exogenous variables in human capital formation through a heteroscedastic stochastic frontier approach. Specifically, a production function for tertiary education has been estimated with emphasis on…

  3. Are Public Master's Institutions Cost Efficient? A Stochastic Frontier and Spatial Analysis

    ERIC Educational Resources Information Center

    Titus, Marvin A.; Vamosiu, Adriana; McClure, Kevin R.

    2017-01-01

    The current study examines costs, measured by educational and general (E&G) spending, and cost efficiency at 252 public master's institutions in the United States over a nine-year (2004-2012) period. We use a multi-product quadratic cost function and results from a random-effects model with a first-order autoregressive (AR1) disturbance term…

  4. Continuous-time mean-variance portfolio selection with value-at-risk and no-shorting constraints

    NASA Astrophysics Data System (ADS)

    Yan, Wei

    2012-01-01

    An investment problem is considered with a dynamic mean-variance (M-V) portfolio criterion under discontinuous prices that follow jump-diffusion processes, in keeping with the actual behaviour of stock prices and the normality and stability of the financial market. Short-selling of stocks is prohibited in this mathematical model. The corresponding stochastic Hamilton-Jacobi-Bellman (HJB) equation of the problem is presented, and its solution is obtained based on the theory of stochastic LQ control and viscosity solutions. The efficient frontier and optimal strategies of the original dynamic M-V portfolio selection problem are also provided. The effects of the value-at-risk constraint on the efficient frontier are then illustrated. Finally, an example of M-V portfolio selection under discontinuous prices is presented.
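
    Schematically, and in generic notation (a sketch of the problem class, not the paper's exact formulation), the constrained problem reads:

    ```latex
    \min_{\pi}\ \operatorname{Var}[X_T^{\pi}]
    \quad \text{s.t.} \quad
    \mathbb{E}[X_T^{\pi}] = z, \qquad
    \pi_t \ge 0 \ \ (\text{no shorting}), \qquad
    \operatorname{VaR}_{\alpha}(X_T^{\pi}) \le \ell,
    ```

    where X_T^{\pi} is terminal wealth under strategy \pi. With jump-diffusion price dynamics the associated HJB equation becomes a partial integro-differential equation, which is one reason viscosity-solution arguments are invoked.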

  5. Cost inefficiency in Washington hospitals: a stochastic frontier approach using panel data.

    PubMed

    Li, T; Rosenman, R

    2001-06-01

    We analyze a sample of Washington State hospitals with a stochastic frontier panel data model, specifying the cost function as a generalized Leontief function which, according to a Hausman test, performs better in this case than the translog form. A one-stage FGLS estimation procedure which directly models the inefficiency effects improves the efficiency of our estimates. We find that hospitals with higher casemix indices or more beds are less efficient while for-profit hospitals and those with higher proportion of Medicare patient days are more efficient. Relative to the most efficient hospital, the average hospital is only about 67% efficient.

  6. Technical efficiency and resources allocation in university hospitals in Tehran, 2009-2012.

    PubMed

    Rezapour, Aziz; Ebadifard Azar, Farbod; Yousef Zadeh, Negar; Roumiani, YarAllah; Bagheri Faradonbeh, Saeed

    2015-01-01

    Assessment of hospitals' performance in achieving their goals is a basic necessity, and measuring the efficiency of hospitals in order to boost resource productivity in healthcare organizations is extremely important. The aim of this study was to measure technical efficiency and determine the status of resource allocation in some university hospitals in Tehran, Iran. This study was conducted in 2012; the research population consisted of all hospitals affiliated with Iran and Tehran Universities of Medical Sciences. Required data, such as human and capital resources information and production variables (hospital outputs), were collected from the data centers of the studied hospitals. Data were analyzed using the data envelopment analysis (DEA) method with DEAP 2.1 software, and the stochastic frontier analysis (SFA) method with Frontier 4.1 software. According to the DEA method, the average technical, managerial (pure) and scale efficiencies of the studied hospitals during the study period were calculated as 0.87, 0.971 and 0.907, respectively. None of these efficiencies followed a fixed trend over the study period; all were constantly changing. In the stochastic frontier production function analysis, the technical efficiency of the studied industry during the study period was estimated to be 0.389. This study identified the hospitals with the highest and lowest efficiency, and reference hospitals (more efficient peers) were indicated for the inefficient centers. According to the findings, in the hospitals that do not operate efficiently there is capacity to improve technical efficiency by removing excess inputs without changing the level of outputs. Moreover, through optimal allocation of resources, substantial economies of scale can be achieved in most of the studied hospitals.

  7. Technical efficiency and resources allocation in university hospitals in Tehran, 2009-2012

    PubMed Central

    Rezapour, Aziz; Ebadifard Azar, Farbod; Yousef Zadeh, Negar; Roumiani, YarAllah; Bagheri Faradonbeh, Saeed

    2015-01-01

    Background: Assessment of hospitals' performance in achieving their goals is a basic necessity, and measuring the efficiency of hospitals in order to boost resource productivity in healthcare organizations is extremely important. The aim of this study was to measure technical efficiency and determine the status of resource allocation in some university hospitals in Tehran, Iran. Methods: This study was conducted in 2012; the research population consisted of all hospitals affiliated with Iran and Tehran Universities of Medical Sciences. Required data, such as human and capital resources information and production variables (hospital outputs), were collected from the data centers of the studied hospitals. Data were analyzed using the data envelopment analysis (DEA) method with DEAP 2.1 software, and the stochastic frontier analysis (SFA) method with Frontier 4.1 software. Results: According to the DEA method, the average technical, managerial (pure) and scale efficiencies of the studied hospitals during the study period were calculated as 0.87, 0.971 and 0.907, respectively. None of these efficiencies followed a fixed trend over the study period; all were constantly changing. In the stochastic frontier production function analysis, the technical efficiency of the studied industry during the study period was estimated to be 0.389. Conclusion: This study identified the hospitals with the highest and lowest efficiency, and reference hospitals (more efficient peers) were indicated for the inefficient centers. According to the findings, in the hospitals that do not operate efficiently there is capacity to improve technical efficiency by removing excess inputs without changing the level of outputs. Moreover, through optimal allocation of resources, substantial economies of scale can be achieved in most of the studied hospitals. PMID:26793657

  8. Cost and technical efficiency of physician practices: a stochastic frontier approach using panel data.

    PubMed

    Heimeshoff, Mareike; Schreyögg, Jonas; Kwietniewski, Lukas

    2014-06-01

    This is the first study to use stochastic frontier analysis to estimate both the technical and the cost efficiency of physician practices. The analysis is based on panel data from 3,126 physician practices for the years 2006 through 2008. We specified the technical and cost frontiers as translog functions, using the one-step approach of Battese and Coelli to detect factors that influence the efficiency of general practitioners and specialists. Variables not analyzed previously in this context (e.g., the degree of practice specialization) and a range of control variables, such as patients' case-mix, were included in the estimation. Our results suggest that it is important to investigate both technical and cost efficiency, as results may depend on the type of efficiency analyzed. For example, the technical efficiency of group practices was significantly higher than that of solo practices, whereas the results for cost efficiency differed. This may be due to indivisibilities in expensive technical equipment, which can lead to different types of health care services being provided by different practice types (i.e., with group practices using more expensive inputs, leading to higher costs per case despite being technically more efficient). Other practice characteristics, such as participation in disease management programs, show the same impact on both cost and technical efficiency: participation led to an increase in both and may also have had positive effects on the quality of care. Future studies should take quality-related issues into account.

  9. Stochastic Frontier Model Approach for Measuring Stock Market Efficiency with Different Distributions

    PubMed Central

    Hasan, Md. Zobaer; Kamil, Anton Abdulbasah; Mustafa, Adli; Baten, Md. Azizul

    2012-01-01

The stock market is considered essential for economic growth and is expected to contribute to improved productivity. An efficient pricing mechanism of the stock market can be a driving force for channeling savings into profitable investments and thus facilitating optimal allocation of capital. This study investigated the technical efficiency of selected groups of companies in the Bangladesh stock market, namely the Dhaka Stock Exchange (DSE), using the stochastic frontier production function approach. For this, the authors considered a Cobb-Douglas stochastic frontier in which the technical inefficiency effects are defined by a model with two distributional assumptions. Truncated-normal and half-normal distributions were used in the model, and both time-variant and time-invariant inefficiency effects were estimated. The results reveal that technical efficiency decreased gradually over the reference period and that the truncated-normal distribution is preferable to the half-normal distribution for the technical inefficiency effects. In the time-varying setting, technical efficiency was high for the investment group and low for the bank group compared with the other groups in the DSE market under both distributions, whereas in the time-invariant setting it was high for the investment group but low for the ceramic group. PMID:22629352
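
    A hedged sketch of the half-normal variant referenced above: the classic normal/half-normal composed-error likelihood (the Aigner-Lovell-Schmidt form) fitted by maximum likelihood. The data are simulated, not the DSE data; the truncated-normal alternative would add a mean parameter to the inefficiency term.

```python
# Cobb-Douglas stochastic production frontier with half-normal inefficiency,
# estimated by maximum likelihood on simulated data.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(0)
n = 500
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])  # ln inputs
beta_true = np.array([1.0, 0.4, 0.5])
v = rng.normal(0, 0.2, n)                  # symmetric noise
u = np.abs(rng.normal(0, 0.3, n))          # one-sided inefficiency
y = X @ beta_true + v - u                  # ln output

def neg_loglik(theta):
    beta, lsv, lsu = theta[:3], theta[3], theta[4]
    sv, su = np.exp(lsv), np.exp(lsu)      # log-parameterized for positivity
    sigma = np.hypot(sv, su)               # sqrt(sv^2 + su^2)
    lam = su / sv
    eps = y - X @ beta
    # ln f(eps) = ln2 - ln(sigma) + ln phi(eps/sigma) + ln Phi(-eps*lam/sigma)
    ll = (np.log(2) - np.log(sigma)
          + norm.logpdf(eps / sigma)
          + norm.logcdf(-eps * lam / sigma))
    return -ll.sum()

res = minimize(neg_loglik, x0=np.r_[np.zeros(3), -1.0, -1.0], method="BFGS")
print("beta:", res.x[:3], "sigma_v, sigma_u:", np.exp(res.x[3:]))
```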

  10. Stochastic frontier model approach for measuring stock market efficiency with different distributions.

    PubMed

    Hasan, Md Zobaer; Kamil, Anton Abdulbasah; Mustafa, Adli; Baten, Md Azizul

    2012-01-01

The stock market is considered essential for economic growth and is expected to contribute to improved productivity. An efficient pricing mechanism of the stock market can be a driving force for channeling savings into profitable investments and thus facilitating optimal allocation of capital. This study investigated the technical efficiency of selected groups of companies in the Bangladesh stock market, namely the Dhaka Stock Exchange (DSE), using the stochastic frontier production function approach. For this, the authors considered a Cobb-Douglas stochastic frontier in which the technical inefficiency effects are defined by a model with two distributional assumptions. Truncated-normal and half-normal distributions were used in the model, and both time-variant and time-invariant inefficiency effects were estimated. The results reveal that technical efficiency decreased gradually over the reference period and that the truncated-normal distribution is preferable to the half-normal distribution for the technical inefficiency effects. In the time-varying setting, technical efficiency was high for the investment group and low for the bank group compared with the other groups in the DSE market under both distributions, whereas in the time-invariant setting it was high for the investment group but low for the ceramic group.

  11. Frontiers in Applied and Computational Mathematics 05’

    DTIC Science & Technology

    2005-03-01

dynamics, forcing subsets to have the same oscillation numbers and interleaving spiking times. Our analysis follows the theory of coupled systems of...continuum is described by a continuous-time stochastic process, as are their internal dynamics. Soluble factors, such as cytokines, are represented...scale of a particle passage time through the reaction zone. Both are realistic for many systems of physical interest. A higher order theory includes

  12. An Analysis of Efficiency in Senior Secondary Schools in the Gambia 2006-2008: Educational Inputs and Production of Credits in English and Mathematics Subjects

    ERIC Educational Resources Information Center

    Sillah, B. M. S.

    2012-01-01

This paper employs a stochastic production frontier model to assess the efficiency of senior secondary schools in the Gambia. It examines their efficiency in using and mixing the educational inputs of average teacher salary, average teacher education, average teacher experience, and the student-to-teacher ratio in producing the number of students…

  13. Technical efficiency of teaching hospitals in Iran: the use of Stochastic Frontier Analysis, 1999–2011

    PubMed Central

    Goudarzi, Reza; Pourreza, Abolghasem; Shokoohi, Mostafa; Askari, Roohollah; Mahdavi, Mahdi; Moghri, Javad

    2014-01-01

Background: Hospitals are highly resource-dependent settings, which spend a large proportion of healthcare financial resources. The analysis of hospital efficiency can provide insight into how scarce resources are used to create health values. This study examines the Technical Efficiency (TE) of 12 teaching hospitals affiliated with Tehran University of Medical Sciences (TUMS) between 1999 and 2011. Methods: The Stochastic Frontier Analysis (SFA) method was applied to estimate the efficiency of TUMS hospitals. A best-fitting function relating the output and input parameters was estimated for the hospitals. The numbers of medical doctors, nurses, and other personnel, active beds, and outpatient admissions were considered as the input variables, and the number of inpatient admissions as the output variable. Results: The mean level of TE was 59% (ranging from 22 to 81%). During the study period the efficiency increased from 61 to 71%. Outpatient admissions, other personnel, and medical doctors significantly and positively affected production (P < 0.05). The hypothesis of Constant Returns to Scale (CRS) was supported, implying that the hospitals were operating at approximately optimal production scale. Conclusion: The findings of this study show a remarkable waste of resources in the TUMS hospitals during the decade considered. This warrants policy-makers and top management in TUMS to consider steps to improve the financial management of the university hospitals. PMID:25114947

  14. Data-Driven Benchmarking of Building Energy Efficiency Utilizing Statistical Frontier Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kavousian, A; Rajagopal, R

    2014-01-01

Frontier methods quantify the energy efficiency of buildings by forming an efficient frontier (best-practice technology) and by comparing all buildings against that frontier. Because energy consumption fluctuates over time, the efficiency scores are stochastic random variables. Existing applications of frontier methods in energy efficiency either treat efficiency scores as deterministic values or estimate their uncertainty by resampling from one set of measurements. The availability of smart meter data (repeated measurements of the energy consumption of buildings) enables using actual data to estimate the uncertainty in efficiency scores. Additionally, existing applications assume a linear form for the efficient frontier; i.e., they assume that the best-practice technology scales up and down proportionally with building characteristics. However, previous research shows that buildings are nonlinear systems. This paper proposes a statistical method called the stochastic energy efficiency frontier (SEEF) to estimate a bias-corrected efficiency score and its confidence intervals from measured data. The paper proposes an algorithm to specify the functional form of the frontier, identify the probability distribution of the efficiency score of each building using measured data, and rank buildings based on their energy efficiency. To illustrate the power of SEEF, this paper presents the results from applying SEEF to a smart meter data set of 307 residential buildings in the United States. SEEF efficiency scores are used to rank individual buildings based on energy efficiency, to compare subpopulations of buildings, and to identify irregular behavior of buildings across different time-of-use periods. SEEF is an improvement over the energy-intensity method (comparing kWh/sq.ft.): whereas SEEF identifies efficient buildings across the entire spectrum of building sizes, the energy-intensity method shows bias toward smaller buildings. The results of this research are expected to assist researchers and practitioners in comparing and ranking (i.e., benchmarking) buildings more robustly and over a wider range of building types and sizes. Eventually, doing so is expected to result in improved resource allocation in energy-efficiency programs.
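
    The SEEF estimator itself is more involved; the sketch below is only a corrected-OLS (COLS) stand-in on hypothetical data that conveys the frontier idea: fit a log-log benchmark, shift it to the best performer, and score buildings against it rather than against raw kWh/sq.ft.

```python
# Not the paper's SEEF estimator: a corrected-OLS (COLS) frontier sketch
# that scores buildings against a fitted log-log benchmark instead of the
# raw kWh/sq.ft. intensity ratio. Data are simulated.
import numpy as np

rng = np.random.default_rng(1)
sqft = rng.uniform(800, 4000, 300)
kwh = 2.0 * sqft**0.7 * np.exp(rng.normal(0.2, 0.25, 300))  # ineff. > 0

# OLS in logs: ln(kwh) = a + b * ln(sqft) + residual (a nonlinear frontier
# in levels, unlike a proportional kWh/sq.ft. rule)
A = np.column_stack([np.ones_like(sqft), np.log(sqft)])
coef, *_ = np.linalg.lstsq(A, np.log(kwh), rcond=None)
resid = np.log(kwh) - A @ coef

# shift the line down to the best performer; score = exp(min resid - resid)
eff = np.exp(resid.min() - resid)          # 1.0 = most efficient building
rank = np.argsort(-eff)                    # benchmark ranking
print(eff[rank[:5]])
```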

  15. Measuring Efficiency of Secondary Healthcare Providers in Slovenia

    PubMed Central

    Blatnik, Patricia; Bojnec, Štefan; Tušak, Matej

    2017-01-01

The chief aim of this study was to analyze the efficiency of secondary healthcare providers, focusing on Slovene general hospitals. We intended to present a complete picture of the technical, allocative, and cost (economic) efficiency of general hospitals. Methods: We researched these aspects of efficiency with two econometric methods. First, we calculated the necessary efficiency quotients with stochastic frontier analysis (SFA), which is realized by econometric estimation of stochastic frontier functions; then, with data envelopment analysis (DEA), we calculated the quotients based on the linear programming method. Results: The two chosen methods led to two different conclusions: the SFA method identified Celje General Hospital as the most efficient general hospital, whereas the DEA method identified Brežice General Hospital as the most efficient. Conclusion: Our results are a useful tool that can aid managers, payers, and designers of healthcare policy to better understand how general hospitals operate. The participants can accordingly decide with less difficulty on any further business operations of general hospitals, having the best practices of general hospitals at their disposal. PMID:28730180

  16. Cost inefficiency under financial strain: a stochastic frontier analysis of hospitals in Washington State through the Great Recession.

    PubMed

    Izón, Germán M; Pardini, Chelsea A

    2017-06-01

The importance of increasing cost efficiency for community hospitals in the United States has been underscored by the Great Recession and the ever-changing health care reimbursement environment. Previous studies have shown mixed evidence with regard to the relationship between linking hospitals' reimbursement to quality of care and cost efficiency. Moreover, current evidence suggests that not only inherently financially disadvantaged hospitals (e.g., safety-net providers), but also more financially stable providers, experienced declines in their financial viability throughout the recession. However, little is known about how hospital cost efficiency fared throughout the Great Recession. This study contributes to the literature by using stochastic frontier analysis to analyze the cost inefficiency of Washington State hospitals between 2005 and 2012, with controls for patient burden of illness, hospital process-of-care quality, and hospital outcome quality. The quality measures included in this study function as central measures for the determination of recently implemented pay-for-performance programs. The average estimated level of hospital cost inefficiency before the Great Recession (10.4%) was lower than it was during the Great Recession (13.5%) and in its aftermath (14.1%). Further, the estimated coefficients for the summary process-of-care quality indexes for three health conditions (acute myocardial infarction, pneumonia, and heart failure) suggest that higher quality scores are associated with increased cost inefficiency.

  17. Frontier-based techniques in measuring hospital efficiency in Iran: a systematic review and meta-regression analysis

    PubMed Central

    2013-01-01

    Background In recent years, there has been growing interest in measuring the efficiency of hospitals in Iran and several studies have been conducted on the topic. The main objective of this paper was to review studies in the field of hospital efficiency and examine the estimated technical efficiency (TE) of Iranian hospitals. Methods Persian and English databases were searched for studies related to measuring hospital efficiency in Iran. Ordinary least squares (OLS) regression models were applied for statistical analysis. The PRISMA guidelines were followed in the search process. Results A total of 43 efficiency scores from 29 studies were retrieved and used to approach the research question. Data envelopment analysis was the principal frontier efficiency method in the estimation of efficiency scores. The pooled estimate of mean TE was 0.846 (±0.134). There was a considerable variation in the efficiency scores between the different studies performed in Iran. There were no differences in efficiency scores between data envelopment analysis (DEA) and stochastic frontier analysis (SFA) techniques. The reviewed studies are generally similar and suffer from similar methodological deficiencies, such as no adjustment for case mix and quality of care differences. The results of OLS regression revealed that studies that included more variables and more heterogeneous hospitals generally reported higher TE. Larger sample size was associated with reporting lower TE. Conclusions The features of frontier-based techniques had a profound impact on the efficiency scores among Iranian hospital studies. These studies suffer from major methodological deficiencies and were of sub-optimal quality, limiting their validity and reliability. It is suggested that improving data collection and processing in Iranian hospital databases may have a substantial impact on promoting the quality of research in this field. PMID:23945011

  18. Mean-variance portfolio selection for defined-contribution pension funds with stochastic salary.

    PubMed

    Zhang, Chubing

    2014-01-01

This paper focuses on a continuous-time dynamic mean-variance portfolio selection problem for defined-contribution pension funds with stochastic salary, whose risk comes from both the financial market and the nonfinancial market. By constructing a special Riccati equation as a continuous (actually a viscosity) solution to the HJB equation, we obtain an explicit closed-form solution for the optimal investment portfolio as well as the efficient frontier.
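
    For orientation, the single-period (static Markowitz) analogue of the efficient frontier has the familiar closed form below; the notation is generic and is not the paper's dynamic, stochastic-salary solution, which yields an analogous parabola in expectation-variance space.

```latex
% Single-period analogue of the mean-variance efficient frontier
% (generic Markowitz notation, not the paper's dynamic result).
\[
\sigma_p^2 \;=\; \frac{C\,\mu_p^2 - 2A\,\mu_p + B}{BC - A^2},
\qquad
A = \mathbf{1}^{\top}\Sigma^{-1}\mu,\quad
B = \mu^{\top}\Sigma^{-1}\mu,\quad
C = \mathbf{1}^{\top}\Sigma^{-1}\mathbf{1},
\]
```

    where mu and Sigma are the asset mean vector and covariance matrix, and (mu_p, sigma_p) are the target expected return and the minimal attainable standard deviation.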

  19. Impact of HMO penetration and other environmental factors on hospital X-inefficiency.

    PubMed

    Rosko, M D

    2001-12-01

    This study examined the impact of health maintenance organization (HMO) market penetration and other internal and external environmental factors on hospital X-inefficiency in a national sample (N = 1,966) of urban U.S. hospitals in 1997. Stochastic frontier analysis, a frontier regression technique, was used to measure X-inefficiency and estimate parameters of the correlates of X-inefficiency. Log-likelihood restriction tests were used to test a variety of assumptions about the empirical model that guided its selection. Average estimated X-inefficiency in study hospitals was 12.96 percent. Increases in managed care penetration, dependence on Medicare and Medicaid, membership in a multihospital system, and location in areas where competitive pressures and the pool of uncompensated care are greater were associated with less X-inefficiency. Not-for-profit ownership was associated with increased X-inefficiency.

  20. Efficiency and Productivity Analysis of Multidivisional Firms

    NASA Astrophysics Data System (ADS)

    Gong, Binlei

Multidivisional firms are those that have footprints in multiple segments and hence use multiple technologies to convert inputs to outputs, which makes it difficult to estimate the resource allocations, aggregated production functions, and technical efficiencies of this type of company. This dissertation aims to explore and reveal such unobserved information through several parametric and semiparametric stochastic frontier analyses and other structural models. In the empirical study, the dissertation analyzes the productivity and efficiency of firms in the global oilfield market.

  1. Neurocomputing

    NASA Technical Reports Server (NTRS)

    Hecht-Nielsen, Robert

    1990-01-01

    The present work is intended to give technologists, research scientists, and mathematicians a graduate-level overview of the field of neurocomputing. After exploring the relationship of this field to general neuroscience, attention is given to neural network building blocks, the self-adaptation equations of learning laws, the data-transformation structures of associative networks, and the multilayer data-transformation structures of mapping networks. Also treated are the neurocomputing frontiers of spatiotemporal, stochastic, and hierarchical networks, 'neurosoftware', the creation of neural network-based computers, and neurocomputing applications in sensor processing, control, and data analysis.

  2. A stochastic frontier analysis of technical efficiency of fish cage culture in Peninsular Malaysia.

    PubMed

    Islam, Gazi Md Nurul; Tai, Shzee Yew; Kusairi, Mohd Noh

    2016-01-01

Cage culture plays an important role in achieving higher output and generating more export earnings in Malaysia. However, the costs of fingerlings, feed, and labour have increased substantially for cage culture in the coastal areas of Peninsular Malaysia. This paper uses farm-level data gathered from Manjung, Perak and Kota Tinggi, Johor to investigate the technical efficiency of brackish water fish cage culture using the stochastic frontier approach. Technical efficiency was estimated, and the factors affecting the technical inefficiency of the fish cage culture system in Malaysia were specifically investigated. On average, 37 percent of the sampled fish cage farms are technically efficient. The results suggest that very high degrees of technical inefficiency exist among the cage culturists, implying that great potential exists to increase fish production through improved efficiency in cage culture management in Peninsular Malaysia. The results also indicate that farmers obtained grouper fingerlings from other neighboring countries due to the scarcity of fingerlings from wild sources. Feeding grouper (Epinephelus fuscoguttatus) requires relatively higher costs than seabass (Lates calcarifer) production in the cage farms of the study areas. Initiatives to undertake extension programmes at the farm level are needed to help cage culturists utilize their resources more efficiently and thereby substantially enhance their fish production.

  3. Modeling the value for money of changing clinical practice change: a stochastic application in diabetes care.

    PubMed

    Hoomans, Ties; Abrams, Keith R; Ament, Andre J H A; Evers, Silvia M A A; Severens, Johan L

    2009-10-01

Decision making about resource allocation for guideline implementation to change clinical practice is inevitably undertaken in a context of uncertainty surrounding the cost-effectiveness of both clinical guidelines and implementation strategies. Adopting a total net benefit approach, a model was recently developed to overcome problems with the use of combined ratio statistics when analyzing decision uncertainty. We demonstrate the stochastic application of the model for informing decision making about the adoption of an audit-and-feedback strategy for implementing a guideline recommending intensive blood glucose control in type 2 diabetes in primary care in the Netherlands. An integrated Bayesian approach to decision modeling and evidence synthesis is adopted, using Markov chain Monte Carlo simulation in WinBUGS. Data on model parameters are gathered from various sources, with the effectiveness of implementation being estimated using pooled, random-effects meta-analysis. Decision uncertainty is illustrated using cost-effectiveness acceptability curves and the cost-effectiveness acceptability frontier. Decisions about whether to adopt intensified glycemic control and whether to adopt audit and feedback vary with the maximum value that decision makers are willing to pay for health gain. By simultaneously incorporating uncertain economic evidence on both the guideline and the implementation strategy, the cost-effectiveness acceptability curves and frontier show an increase in decision uncertainty concerning guideline implementation. The stochastic application in diabetes care demonstrates that the model provides a simple and useful tool for quantifying and exploring the (combined) uncertainty associated with decision making about adopting guidelines and implementation strategies and, therefore, for informing decisions about efficient resource allocation to change clinical practice.

  4. Mean-Variance Portfolio Selection for Defined-Contribution Pension Funds with Stochastic Salary

    PubMed Central

    Zhang, Chubing

    2014-01-01

This paper focuses on a continuous-time dynamic mean-variance portfolio selection problem for defined-contribution pension funds with stochastic salary, whose risk comes from both the financial market and the nonfinancial market. By constructing a special Riccati equation as a continuous (actually a viscosity) solution to the HJB equation, we obtain an explicit closed-form solution for the optimal investment portfolio as well as the efficient frontier. PMID:24782667

  5. Relationship between recycling rate and air pollution: Waste management in the state of Massachusetts.

    PubMed

    Giovanis, Eleftherios

    2015-06-01

This study examines the relationship between the recycling rate of solid waste and air pollution using data from a municipal waste survey in the state of Massachusetts during the period 2009-2012. Two econometric approaches are applied. The first approach is a fixed effects model, while the second is stochastic frontier analysis (SFA) with fixed effects. The advantage of the first approach is the ability to control for stable, time-invariant characteristics of the municipalities, thereby eliminating potentially large sources of bias. The second approach is applied in order to estimate the technical efficiency of each municipality and rank them accordingly. The regressions control for various demographic, economic, and recycling-service factors, such as income per capita, population density, unemployment, trash services, the Pay-as-you-throw (PAYT) program, and meteorological data. The findings indicate a negative relationship between fine particulate matter (particles in the air 2.5 μm or less in size; PM2.5) and the recycling rate. In addition, pollution increases with income per capita up to $23,000-$26,000, after which point income contributes positively to air quality. Finally, based on the efficiency estimates derived from the SFA model, the municipalities that provide both drop-off and curbside services for trash, food, and yard waste together with the PAYT program perform better with regard to air quality.
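
    A minimal sketch of the first (fixed effects) approach, assuming hypothetical column names for the panel: municipality and year dummies absorb stable characteristics, and a quadratic income term captures the reported turning point around $23,000-$26,000.

```python
# Municipality fixed-effects regression of PM2.5 on the recycling rate with
# a quadratic income term. File and column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("mass_waste_panel.csv")   # hypothetical panel, 2009-2012

model = smf.ols(
    "pm25 ~ recycling_rate + income + I(income**2) + pop_density"
    " + unemployment + payt + C(municipality) + C(year)",
    data=df,
).fit(cov_type="cluster", cov_kwds={"groups": df["municipality"]})
print(model.summary())

# the income turning point implied by the quadratic: -b1 / (2 * b2)
b1, b2 = model.params["income"], model.params["I(income ** 2)"]
print("turning point:", -b1 / (2 * b2))
```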

  6. The impact of trade costs on rare earth exports : a stochastic frontier estimation approach.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sanyal, Prabuddha; Brady, Patrick Vane; Vugrin, Eric D.

The study develops a novel stochastic frontier modeling approach to the gravity equation for rare earth element (REE) trade between China and its trading partners between 2001 and 2009. The novelty lies in differentiating between 'behind the border' trade costs incurred by China and the 'implicit beyond the border costs' of China's trading partners. Results indicate that the significance level of the independent variables changed dramatically over the time period. While geographical distance matters for trade flows in both periods, the effect of income on trade flows is significantly attenuated, possibly capturing the negative effects of financial crises in the developed world. Second, the total export losses due to 'behind the border' trade costs almost tripled over the time period. Finally, looking at 'implicit beyond the border' trade costs, results show China gaining in some markets, although it is likely that some countries are substituting away from Chinese REE exports.
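
    Generically, a stochastic frontier gravity specification of this kind can be written as below (our notation, not the paper's exact model); the one-sided term u absorbs the 'behind/beyond the border' trade costs, so observed exports lie below the frictionless frontier and the estimated u measures the export loss.

```latex
% Generic stochastic-frontier gravity equation (our notation):
% u_ijt >= 0 captures trade costs, pushing observed exports below frontier.
\[
\ln X_{ijt} \;=\; \beta_0 + \beta_1 \ln \mathrm{GDP}_{it}
 + \beta_2 \ln \mathrm{GDP}_{jt} + \beta_3 \ln \mathrm{dist}_{ij}
 + v_{ijt} - u_{ijt},
\qquad v_{ijt} \sim N(0,\sigma_v^2),\; u_{ijt} \ge 0,
\]
```

    so the fitted frontier approximates frictionless trade potential, and the gap between it and observed flows is the export loss attributed to trade costs.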

  7. Hospital efficiency and transaction costs: a stochastic frontier approach.

    PubMed

    Ludwig, Martijn; Groot, Wim; Van Merode, Frits

    2009-07-01

    The make-or-buy decision of organizations is an important issue in the transaction cost theory, but is usually not analyzed from an efficiency perspective. Hospitals frequently have to decide whether to outsource or not. The main question we address is: Is the make-or-buy decision affected by the efficiency of hospitals? A one-stage stochastic cost frontier equation is estimated for Dutch hospitals. The make-or-buy decisions of ten different hospital services are used as explanatory variables to explain efficiency of hospitals. It is found that for most services the make-or-buy decision is not related to efficiency. Kitchen services are an important exception to this. Large hospitals tend to outsource less, which is supported by efficiency reasons. For most hospital services, outsourcing does not significantly affect the efficiency of hospitals. The focus on the make-or-buy decision may therefore be less important than often assumed.

  8. Modelling on optimal portfolio with exchange rate based on discontinuous stochastic process

    NASA Astrophysics Data System (ADS)

    Yan, Wei; Chang, Yuwen

    2016-12-01

    Considering the stochastic exchange rate, this paper is concerned with the dynamic portfolio selection in financial market. The optimal investment problem is formulated as a continuous-time mathematical model under mean-variance criterion. These processes follow jump-diffusion processes (Weiner process and Poisson process). Then the corresponding Hamilton-Jacobi-Bellman(HJB) equation of the problem is presented and its efferent frontier is obtained. Moreover, the optimal strategy is also derived under safety-first criterion.

  9. FORMULATION AND ESTIMATION OF STOCHASTIC FRONTIER PRODUCTION FUNCTION MODELS. (R826610)

    EPA Science Inventory

    The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications convey the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...

  10. Vocational Training and Agricultural Productivity: Evidence from Rice Production in Vietnam

    ERIC Educational Resources Information Center

    Ulimwengu, John; Badiane, Ousmane

    2010-01-01

    The paper examines the impact of farmers' educational attainment on agricultural productivity. More specifically, it evaluates how farmers with vocational training perform compared to those with traditional educational training. A stochastic production frontier and inefficiency effects model is estimated using nationally representative household…

  11. Payment schemes and cost efficiency: evidence from Swiss public hospitals.

    PubMed

    Meyer, Stefan

    2015-03-01

This paper aims at analysing the impact of prospective payment schemes on the cost efficiency of acute care hospitals in Switzerland. We study a panel of 121 public hospitals subject to one of four payment schemes. While several hospitals are still reimbursed on a per diem basis for the treatment of patients, most face flat per-case rates or mixed schemes that combine both elements of reimbursement. Thus, unlike previous studies, we are able to simultaneously analyse and isolate the cost-efficiency effects of different payment schemes. By means of stochastic frontier analysis, we first estimate a hospital cost frontier. Using the two-stage approach proposed by Battese and Coelli (Empir Econ 20:325-332, 1995), we then analyse the impact of these payment schemes on the cost efficiency of hospitals. Controlling for hospital characteristics, local market conditions in the 26 Swiss states (cantons), and a time trend, we show that, compared to per diem, hospitals that are reimbursed by flat payment schemes perform better in terms of cost efficiency. Our results suggest that mixed schemes create incentives for cost containment as well, although to a lesser extent. In addition, our findings indicate that cost-efficient hospitals are primarily located in cantons with competitive markets, as measured by the Herfindahl-Hirschman index in inpatient care. Furthermore, our econometric model shows that we obtain biased estimates from frontier analysis if we do not account for heteroscedasticity in the inefficiency term.

  12. Technical efficiency of rural primary health care system for diabetes treatment in Iran: a stochastic frontier analysis.

    PubMed

    Qorbani, Mostafa; Farzadfar, Farshad; Majdzadeh, Reza; Mohammad, Kazem; Motevalian, Abbas

    2017-01-01

Our aim was to explore the technical efficiency (TE) of the Iranian rural primary healthcare (PHC) system for the diabetes treatment coverage rate using stochastic frontier analysis (SFA), as well as to examine the strength and significance of the effect of human resources density on diabetes treatment. In the SFA model, the diabetes treatment coverage rate, as the output, is a function of health system inputs (Behvarz worker density, physician density, and rural health center density) and non-health system inputs (urbanization rate, median age of population, and wealth index) as a set of covariates. Data on the rate of self-reported diabetes treatment coverage were obtained from the Non-Communicable Disease Surveillance Survey, data on health system inputs were collected from the health census database, and data on non-health system inputs were collected from the census data and household survey. In 2008, the rate of diabetes treatment coverage was 67% (95% CI: 63%-71%) nationally, and at the provincial level it varied from 44% to 81%. The TE score at the national level was 87.84%, with considerable variation across provinces (from 59.65% to 98.28%). Among health system and non-health system inputs, only the Behvarz density (per 1000 population) was significantly associated with diabetes treatment coverage (β (95% CI): 0.50 (0.29-0.70), p < 0.001). Our findings show that although the rural PHC system can be considered efficient in diabetes treatment at the national level, a wide variation exists in TE at the provincial level. Because the only significant predictor of TE is the Behvarz density, the PHC system may extend diabetes treatment coverage by using this group of health care workers.

  13. Stochastic Estimation of Cost Frontier: Evidence from Bangladesh

    ERIC Educational Resources Information Center

    Mamun, Shamsul Arifeen Khan

    2012-01-01

In the literature on higher education cost function studies, substantial knowledge has been created in the area of economies of scale in the context of developed countries, but knowledge of input demand is lacking. On the other hand, empirical knowledge in the context of developing countries is very meagre. The paper fills this knowledge gap, estimating a…

  14. Stochastic Optimally Tuned Range-Separated Hybrid Density Functional Theory.

    PubMed

    Neuhauser, Daniel; Rabani, Eran; Cytter, Yael; Baer, Roi

    2016-05-19

    We develop a stochastic formulation of the optimally tuned range-separated hybrid density functional theory that enables significant reduction of the computational effort and scaling of the nonlocal exchange operator at the price of introducing a controllable statistical error. Our method is based on stochastic representations of the Coulomb convolution integral and of the generalized Kohn-Sham density matrix. The computational cost of the approach is similar to that of usual Kohn-Sham density functional theory, yet it provides a much more accurate description of the quasiparticle energies for the frontier orbitals. This is illustrated for a series of silicon nanocrystals up to sizes exceeding 3000 electrons. Comparison with the stochastic GW many-body perturbation technique indicates excellent agreement for the fundamental band gap energies, good agreement for the band edge quasiparticle excitations, and very low statistical errors in the total energy for large systems. The present approach has a major advantage over one-shot GW by providing a self-consistent Hamiltonian that is central for additional postprocessing, for example, in the stochastic Bethe-Salpeter approach.

  15. Efficiency in the European agricultural sector: environment and resources.

    PubMed

    Moutinho, Victor; Madaleno, Mara; Macedo, Pedro; Robaina, Margarita; Marques, Carlos

    2018-04-22

This article computes agricultural technical efficiency scores for 27 European countries during the period 2005-2012, using both data envelopment analysis (DEA) and stochastic frontier analysis (SFA) with a generalized cross-entropy (GCE) approach, for comparison purposes. Afterwards, using the scores as the dependent variable, we apply quantile regressions with a set of possible influencing variables within the agricultural sector to explain the technical efficiency scores. The results allow us to conclude that although DEA and SFA are quite distinct methodologies and yield different technical efficiency scores, both identify the worst- and best-performing countries similarly. They also suggest that it is important to include resource productivity and subsidies in determining technical efficiency, owing to their positive and significant influence.

  16. Multicriteria approaches for a private equity fund

    NASA Astrophysics Data System (ADS)

    Tammer, Christiane; Tannert, Johannes

    2012-09-01

We develop a new model for a private equity fund based on stochastic differential equations. In order to find efficient strategies for the fund manager, we formulate a multicriteria optimization problem for the fund. Using the ε-constraint method, we solve this multicriteria optimization problem. Furthermore, a genetic algorithm is applied in order to obtain an approximation of the efficient frontier.
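
    The ε-constraint idea is easy to state in code: optimize one criterion while bounding the other, then sweep the bound to trace the efficient frontier point by point. The toy below uses a static mean-variance pair of objectives, not the paper's stochastic fund model.

```python
# Toy epsilon-constraint method: maximize expected return subject to a risk
# bound, sweeping the bound to trace an efficient frontier. Illustrative
# numbers only; not the paper's private-equity model.
import numpy as np
from scipy.optimize import minimize

mu = np.array([0.06, 0.10, 0.14])                    # expected returns
Sigma = np.array([[0.04, 0.01, 0.00],
                  [0.01, 0.09, 0.02],
                  [0.00, 0.02, 0.16]])               # covariance matrix

def frontier_point(eps):
    """Max mu'w  s.t.  w'Sigma w <= eps,  sum(w) = 1,  w >= 0."""
    cons = [{"type": "eq", "fun": lambda w: w.sum() - 1.0},
            {"type": "ineq", "fun": lambda w: eps - w @ Sigma @ w}]
    res = minimize(lambda w: -mu @ w, x0=np.ones(3) / 3,
                   bounds=[(0, 1)] * 3, constraints=cons, method="SLSQP")
    return res.x, mu @ res.x

for eps in np.linspace(0.03, 0.15, 5):               # sweep the constraint
    w, ret = frontier_point(eps)
    print(f"risk <= {eps:.3f}  return {ret:.4f}  weights {np.round(w, 3)}")
```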

  17. Cost drivers and resource allocation in military health care systems.

    PubMed

    Fulton, Larry; Lasdon, Leon S; McDaniel, Reuben R

    2007-03-01

This study illustrates the feasibility of incorporating technical efficiency considerations into the funding of military hospitals and identifies the primary drivers of hospital costs. Secondary data collected for 24 U.S.-based Army hospitals and medical centers for the years 2001 to 2003 are the basis for this analysis. Technical efficiency was measured by using data envelopment analysis; subsequently, efficiency estimates were included in logarithmic-linear cost models that specified cost as a function of volume, complexity, efficiency, time, and facility type. These logarithmic-linear models were compared against stochastic frontier analysis models. A parsimonious, three-variable, logarithmic-linear model composed of volume, complexity, and efficiency variables exhibited a strong linear relationship with observed costs (R² = 0.98). This model also proved reliable in forecasting (R² = 0.96). Based on our analysis, as much as $120 million might be reallocated to improve the performance of the U.S.-based Army hospitals evaluated in this study.

  18. Stochastic Formalism for Thermally Driven Distribution Frontier: A Nonempirical Approach to the Potential Escape Problem

    NASA Astrophysics Data System (ADS)

    Akashi, Ryosuke; Nagornov, Yuri S.

    2018-06-01

    We develop a non-empirical scheme to search for the minimum-energy escape paths from the minima of the potential surface to unknown saddle points nearby. A stochastic algorithm is constructed to move the walkers up the surface through the potential valleys. This method employs only the local gradient and diagonal part of the Hessian matrix of the potential. An application to a two-dimensional model potential is presented to demonstrate the successful finding of the paths to the saddle points. The present scheme could serve as a starting point toward first-principles simulation of rare events across the potential basins free from empirical collective variables.
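
    As a loose illustration of the idea (not the authors' update rule), the toy below sends walkers uphill on a two-basin model potential, climbing along the coordinate with the smallest diagonal Hessian entry while relaxing the others; walkers that escape the basin come to rest near the saddle, where the gradient vanishes and one curvature is negative.

```python
# Toy caricature of a gradient-plus-diagonal-Hessian uphill walk on the
# model potential V = (x^2 - 1)^2 + 5 y^2 (minima at x = +-1, saddle at 0).
# Illustrative only; this is not the authors' actual algorithm.
import numpy as np

def grad(p):
    x, y = p
    return np.array([4 * x * (x**2 - 1), 10 * y])

def diag_hess(p):
    x, y = p
    return np.array([12 * x**2 - 4, 10.0])

rng = np.random.default_rng(5)
walkers = np.tile([-1.0, 0.0], (16, 1)) + rng.normal(0, 0.05, (16, 2))
dt = 0.02

for _ in range(3000):
    for w in walkers:
        g, h = grad(w), diag_hess(w)
        soft = np.argmin(h)                 # softest local direction
        step = -dt * g                      # relax all coordinates...
        step[soft] = dt * np.sign(g[soft] + rng.normal(0, 1e-3))  # ...climb the soft one
        w += step

# a saddle shows a vanishing gradient with one negative curvature
cands = [w for w in walkers
         if diag_hess(w).min() < 0 and np.linalg.norm(grad(w)) < 0.1]
print(len(cands), "walker(s) near a saddle, e.g.:", cands[0] if cands else None)
```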

  19. Characteristics of High- and Low-Efficiency Hospitals.

    PubMed

    Rosko, Michael; Wong, Herbert S; Mutter, Ryan

    2017-01-01

    We compared performance, operating characteristics, and market environments of low- and high-efficiency hospitals in the 37 states that supplied inpatient data to the Healthcare Cost and Utilization Project from 2006 to 2010. Hospital cost-inefficiency estimates using stochastic frontier analysis were generated. Hospitals were then grouped into the 100 most- and 100 least-efficient hospitals for subsequent analysis. Compared with the least efficient hospitals, high-efficiency hospitals tended to have lower average costs, higher labor productivity, and higher profit margins. The most efficient hospitals tended to be nonteaching, investor-owned, and members of multihospital systems. Hospitals in the high-efficiency group were located in areas with lower health maintenance organization penetration and less competition, and they had a higher share of Medicaid and Medicare admissions. Results of the analysis suggest there are opportunities for public policies to support improved efficiency in the hospital sector.

  20. Assessing Technical Inefficiency in Private, Not-For-Profit, Bachelor's and Master's Universities in the United States Using Stochastic Frontier Estimation

    ERIC Educational Resources Information Center

    Refenes, James L.

    2017-01-01

This research explored the technical inefficiency of 813 private, not-for-profit, four-year, bachelor's and master's colleges and universities in the U.S. using data from 2006 to 2011. The goal of the study was to describe and explain the level of technical inefficiency in this group of institutions that can be identified using a stochastic frontier estimation (SFE) method and…

  1. Inefficiency, heterogeneity and spillover effects in maternal care in India: a spatial stochastic frontier analysis.

    PubMed

    Kinfu, Yohannes; Sawhney, Monika

    2015-03-25

Institutional delivery is one of the key and proven strategies to reduce maternal deaths. Since the 1990s, the government of India has made substantial investments in maternal care to reduce the huge burden of maternal deaths in the country. However, despite these efforts, access to institutional delivery in India remains below the global average. In addition, even in places where health investments have been comparable, inter- and intra-state differences in access to maternal care services remain wide and substantial. This raises a fundamental question of whether the sub-national units themselves differ in the efficiency with which they use available resources, and if so, why. Data obtained from round 3 of the country's District Level Health and Facility Survey were analyzed to measure the level and determinants of inefficiency of institutional delivery in the country. The analysis was conducted using spatial stochastic frontier models that correct for heterogeneity and spatial interactions between sub-national units. Inefficiency differences in maternal care services between and within states are substantial. The top one-third of districts in the country have a mean efficiency score of 90 per cent or more, while the bottom 10 per cent of districts exhibit mean inefficiency scores of 75 per cent or more. Overall mean inefficiency is about 30 per cent. The results also reveal the existence of both heterogeneity and spatial correlation in institutional delivery in the country. Given the high level of inefficiency in the system, further progress in improving coverage of institutional delivery should focus both on improving the efficiency of resource utilization, especially where inefficiency levels are extremely high, and on bringing new resources into the system. Additional investment should specifically focus on those parts of the country where coverage rates are still low but efficiency levels are already high. In addition, given that inefficiency was associated inversely with literacy and urbanization and positively with the proportion of poor households, investment in these areas can also improve coverage of institutional delivery in the country.

  2. Efficiency of dairy farms participating and not participating in veterinary herd health management programs.

    PubMed

    Derks, Marjolein; Hogeveen, Henk; Kooistra, Sake R; van Werven, Tine; Tauer, Loren W

    2014-12-01

This paper compares farm efficiencies between dairies that participated in a veterinary herd health management (VHHM) program and dairies that did not, to determine whether participation is associated with farm efficiency. In 2011, 572 dairy farmers received a questionnaire concerning the participation in and execution of a VHHM program on their farms. Data from the questionnaire were combined with farm accountancy data from 2008 through 2012 from farms that used calendar-year accounting periods, and were analyzed using stochastic frontier analysis (SFA). Two separate models were specified: model 1 was the basic stochastic frontier model (output: total revenue; inputs: feed costs, land costs, cattle costs, non-operational costs), without explanatory variables embedded in the efficiency component of the error term. Model 2 was an expansion of model 1 that included explanatory variables (number of FTE; total kg milk delivered; price of concentrate; milk per hectare; cows per FTE; nutritional yield per hectare) inserted into the efficiency component of the joint error term. Both models were estimated with the financial parameters expressed per 100 kg fat- and protein-corrected milk and per cow. Land costs, cattle costs, feed costs, and non-operational costs were statistically significant and positive in all models (P < 0.01). Frequency distributions of the efficiency scores for the VHHM dairies and the non-VHHM dairies were plotted in a kernel density plot, and differences were tested using the Kolmogorov-Smirnov two-sample test. VHHM dairies had higher total revenue per cow, but not per 100 kg milk. For all SFA models, the difference in distribution between VHHM dairies and non-VHHM dairies was not statistically significant (P values 0.94, 0.35, 0.95, and 0.89 for the basic and complete models per 100 kg fat- and protein-corrected milk and per cow, respectively). We therefore conclude that, in our data, farm participation in VHHM is not related to overall farm efficiency.
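
    The distribution comparison described above comes down to two standard steps, sketched here on simulated efficiency scores (the study's farm data are not reproduced): a kernel density estimate per group and a two-sample Kolmogorov-Smirnov test.

```python
# Kernel densities of SFA efficiency scores for two groups of farms,
# followed by a two-sample Kolmogorov-Smirnov test. Scores are simulated
# stand-ins, not the study's estimates.
import numpy as np
from scipy.stats import gaussian_kde, ks_2samp

rng = np.random.default_rng(2)
eff_vhhm = np.clip(rng.beta(8, 2, 200), 0, 1)      # hypothetical scores
eff_other = np.clip(rng.beta(8, 2, 150), 0, 1)

grid = np.linspace(0, 1, 101)
dens_vhhm = gaussian_kde(eff_vhhm)(grid)           # kernel density estimates
dens_other = gaussian_kde(eff_other)(grid)

stat, pval = ks_2samp(eff_vhhm, eff_other)
print(f"KS statistic {stat:.3f}, p-value {pval:.3f}")
# a large p-value means no evidence the two efficiency distributions differ
```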

  3. Detecting stochastic backgrounds of gravitational waves with pulsar timing arrays

    NASA Astrophysics Data System (ADS)

    Siemens, Xavier

    2016-03-01

    For the past decade the North American Nanohertz Observatory for Gravitational Waves (NANOGrav) has been using the Green Bank Telescope and the Arecibo Observatory to monitor millisecond pulsars. NANOGrav, along with two other international collaborations, the European Pulsar Timing Array and the Parkes Pulsar Timing Array in Australia, form a consortium of consortia: the International Pulsar Timing Array (IPTA). The goal of the IPTA is to directly detect low-frequency gravitational waves which cause small changes to the times of arrival of radio pulses from millisecond pulsars. In this talk I will discuss the work of NANOGrav and the IPTA, as well as our sensitivity to stochastic backgrounds of gravitational waves. I will show that a detection of the background produced by supermassive black hole binaries is possible by the end of the decade. Supported by the NANOGrav Physics Frontiers Center.

  4. Frontiers in Fluctuation Spectroscopy: Measuring protein dynamics and protein spatio-temporal connectivity

    NASA Astrophysics Data System (ADS)

    Digman, Michelle

Fluorescence fluctuation spectroscopy has evolved from single-point detection of molecular diffusion to a family of microscopy imaging correlation tools (i.e., ICS, RICS, STICS, and kICS) useful in deriving the spatial-temporal dynamics of proteins in living cells. The advantage of the imaging techniques is the simultaneous measurement of all points in an image, with frame rates becoming ever faster thanks to more sensitive cameras and new microscopy modalities such as sheet illumination. A new frontier in this area is now emerging: high-resolution mapping of diffusion rates and protein dynamics in two and three dimensions. In this talk, I will discuss the evolution of fluctuation analysis from single-point measurements to mapping diffusion in whole cells, and the technology behind this technique. In particular, new methods of analysis exploit correlation of molecular fluctuations measured at distant points (pair correlation analysis) and methods that exploit spatial averaging of fluctuations in small regions (iMSD). For example, the pair correlation function (pCF) analyses done between adjacent pixels in all possible radial directions provide a window into anisotropic molecular diffusion. Similar to the connectivity atlas of neuronal connections from MRI diffusion tensor imaging, these new tools will be used to map the connectome of protein diffusion in living cells. For biological reaction-diffusion systems, live single-cell spatial-temporal analysis of protein dynamics provides a means to observe stochastic biochemical signaling in the context of the intracellular environment, which may lead to a better understanding of cancer cell invasion, stem cell differentiation, and other fundamental biological processes. National Institutes of Health Grant P41-RRO3155.

  5. High Energy Physics

    Science.gov Websites

Argonne High Energy Physics, Cosmic Frontier Theory & Computing Group homepage: the group led the analysis to begin mapping dark matter.

  6. Stochastic and deterministic causes of streamer branching in liquid dielectrics

    NASA Astrophysics Data System (ADS)

    Jadidian, Jouya; Zahn, Markus; Lavesson, Nils; Widlund, Ola; Borg, Karl

    2013-08-01

Streamer branching in liquid dielectrics is driven by stochastic and deterministic factors. The presence of stochastic causes of streamer branching, such as inhomogeneities inherited from noisy initial states, impurities, or charge carrier density fluctuations, is inevitable in any dielectric. A fully three-dimensional streamer model presented in this paper indicates that deterministic origins of branching are intrinsic attributes of streamers, which in some cases make branching inevitable depending on the shape and velocity of the volume charge at the streamer frontier. Specifically, any given inhomogeneous perturbation can result in streamer branching if the volume charge layer at the original streamer head is relatively thin and slow enough. Furthermore, the discrete nature of electrons at the leading edge of an ionization front always guarantees the existence of a non-zero inhomogeneous perturbation ahead of the streamer head, even when propagating in a perfectly homogeneous dielectric. Based on the modeling results for streamers propagating in a liquid dielectric, a gauge on the streamer head geometry is introduced that determines whether branching occurs under particular inhomogeneous circumstances. The estimated number, diameter, and velocity of the resulting branches agree qualitatively with experimental images of streamer branching.

  7. Snowmass Computing Frontier: Computing for the Cosmic Frontier, Astrophysics, and Cosmology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Connolly, A.; Habib, S.; Szalay, A.

    2013-11-12

This document presents (off-line) computing requirements and challenges for Cosmic Frontier science, covering the areas of data management, analysis, and simulations. We invite contributions to extend the range of covered topics and to enhance the current descriptions.

  8. The improving efficiency frontier of inpatient rehabilitation hospitals.

    PubMed

    Harrison, Jeffrey P; Kirkpatrick, Nicole

    2011-01-01

This study uses a linear programming technique called data envelopment analysis (DEA) to identify changes in the efficiency frontier of inpatient rehabilitation hospitals after implementation of the prospective payment system (PPS). The study provides a time series analysis of the efficiency frontier for inpatient rehabilitation hospitals in 2003, immediately after implementation of PPS, and then again in 2006. Results indicate that the efficiency frontier of inpatient rehabilitation hospitals increased from 84% in 2003 to 85% in 2006. Similarly, an analysis of slack, or inefficiency, shows improvements in output efficiency over the study period. This clearly documents that efficiency in the inpatient rehabilitation hospital industry has been improving since the implementation of PPS. Hospital executives, health care policymakers, taxpayers, and other stakeholders benefit from studies that improve health care efficiency.

  9. Insider versus outsider executive succession: The relationship to hospital efficiency.

    PubMed

    Ford, Eric W; Lowe, Kevin B; Silvera, Geoffrey B; Babik, Dmytro; Huerta, Timothy R

The relationship between Chief Executive Officer (CEO) succession and hospitals' competitive performance is an area of interest for health services researchers. Of particular interest is the impact on overall strategic direction and health system performance that results from selecting a CEO from inside the firm as opposed to seeking outside leadership. Empirical work to date has yielded mixed results. Much of this variability has been attributed to design flaws; however, in the absence of a clear message from the evidence, the preference for hiring "outsiders" continues to grow. This paper investigates the extent to which insider CEO succession, as opposed to outsider succession, impacts hospitals' competitive advantage vis-à-vis a sample of organizations that compete in the same sector. A hospital matching protocol based on propensity scores is used to control for endogeneity, and comparisons of productivity across organizations are made through the use of stochastic frontier estimation. Succession negatively impacts hospitals' productivity, and firms with outsider CEO succession events closed the gap toward the competitive-advantage frontier faster than comparable firms with insider successions. More research needs to be done on succession planning and its impact on CEO turnover.
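
    A toy version of the matching step, under assumed variable names: estimate each hospital's propensity for the treatment (here, an insider-succession indicator) with a logistic regression, then pair each treated hospital with the nearest-score control before comparing productivity.

```python
# Propensity-score matching sketch: logistic regression for the score, then
# greedy 1-nearest-neighbor matching without replacement. All variables are
# hypothetical stand-ins for the paper's covariates.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
n = 400
X = rng.normal(size=(n, 4))                   # size, ownership, market, etc.
treated = rng.integers(0, 2, n).astype(bool)  # insider-succession indicator

pscore = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

pool = list(np.flatnonzero(~treated))         # available control hospitals
pairs = []
for t in np.flatnonzero(treated):
    j = min(pool, key=lambda c: abs(pscore[c] - pscore[t]))
    pairs.append((t, j))
    pool.remove(j)                            # match without replacement
    if not pool:
        break
print(f"matched {len(pairs)} treated-control pairs")
```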

  10. International comparisons of the technical efficiency of the hospital sector: panel data analysis of OECD countries using parametric and non-parametric approaches.

    PubMed

    Varabyova, Yauheniya; Schreyögg, Jonas

    2013-09-01

There is a growing interest in cross-country comparisons of the performance of national health care systems. The present work provides a comparison of the technical efficiency of the hospital sector using unbalanced panel data from OECD countries over the period 2000-2009. The estimation of the technical efficiency of the hospital sector is performed using nonparametric data envelopment analysis (DEA) and parametric stochastic frontier analysis (SFA). The internal and external validity of the findings is assessed by estimating Spearman rank correlations between the results obtained in different model specifications. The panel-data analyses using two-step DEA and one-stage SFA show that countries with higher health care expenditure per capita tend to have a more technically efficient hospital sector. Whether the expenditure is financed through private or public sources is not related to the technical efficiency of the hospital sector. On the other hand, the hospital sector in countries with higher income inequality and longer average hospital length of stay is less technically efficient.
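
    The validity check described above reduces to rank correlations between specifications; a minimal sketch with simulated stand-ins for the DEA and SFA scores:

```python
# Spearman rank correlation between efficiency scores from two
# specifications. Scores are simulated, not the study's estimates.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(4)
dea_scores = rng.uniform(0.5, 1.0, 30)            # 30 hypothetical countries
sfa_scores = np.clip(dea_scores + rng.normal(0, 0.05, 30), 0, 1)

rho, pval = spearmanr(dea_scores, sfa_scores)
print(f"Spearman rho {rho:.3f} (p = {pval:.3g})")
# high rho: the two methods rank hospital sectors similarly
```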

  11. From cold to hot: Climatic effects and productivity in Wisconsin dairy farms.

    PubMed

    Qi, L; Bravo-Ureta, B E; Cabrera, V E

    2015-12-01

This study examined the effects of climatic conditions on dairy farm productivity using panel data for the state of Wisconsin along with alternative stochastic frontier models. A noteworthy feature of this analysis is that Wisconsin is a major dairy-producing area where winters are typically very cold and snowy and summers are hot and humid. Thus, it is an ideal geographical region for examining the effects of a range of climatic factors on dairy production. We identified the effects of temperature and precipitation, both jointly and separately, on milk output. The analysis showed that increasing temperature in summer or in autumn is harmful for dairy production, whereas warmer winters and warmer springs are beneficial. In contrast, more precipitation had a consistent adverse effect on dairy productivity. Overall, the analysis showed that over the past 17 yr, changes in climatic conditions have had a negative effect on Wisconsin dairy farms. Alternative scenarios predict that climate change would lead to a 5 to 11% reduction in dairy production per year between 2020 and 2039 after controlling for other factors.

  12. Sand dredging and environmental efficiency of artisanal fishermen in Lagos state, Nigeria.

    PubMed

    Sowunmi, Fatai A; Hogarh, Jonathan N; Agbola, Peter O; Atewamba, Calvin

    2016-03-01

Environmentally detrimental input (water turbidity) and conventional production inputs were considered within the framework of stochastic frontier analysis to estimate the technical and environmental efficiencies of fishermen in sand dredging and non-dredging areas. Environmental efficiency was low among fishermen in the sand dredging areas. Educational status and experience in fishing and sand dredging were the factors influencing environmental efficiency in the sand dredging areas. The average quantity of fish caught per labour-hour was higher among fishermen in the non-dredging areas. Fishermen in the fishing communities around the dredging areas travelled long distances in order to reduce the negative effect of sand dredging on their fishing activity. The study also confirmed large household sizes among fishermen. Regulating the activities of sand dredgers by restricting sand dredging licenses to non-fishing communities, as well as intensifying family planning campaigns in fishing communities to reduce the negative effect of large household sizes on fishing, is imperative for the sustainability of artisanal fishing.

  13. Evolution of grid-wide access to database resident information in ATLAS using Frontier

    NASA Astrophysics Data System (ADS)

    Barberis, D.; Bujor, F.; de Stefano, J.; Dewhurst, A. L.; Dykstra, D.; Front, D.; Gallas, E.; Gamboa, C. F.; Luehring, F.; Walker, R.

    2012-12-01

    The ATLAS experiment deployed Frontier technology worldwide during the initial year of LHC collision data taking to enable user analysis jobs running on the Worldwide LHC Computing Grid to access database resident data. Since that time, the deployment model has evolved to optimize resources, improve performance, and streamline maintenance of Frontier and related infrastructure. In this presentation we focus on the specific changes in the deployment and improvements undertaken, such as the optimization of cache and launchpad location, the use of RPMs for more uniform deployment of underlying Frontier related components, improvements in monitoring, optimization of fail-over, and an increasing use of a centrally managed database containing site specific information (for configuration of services and monitoring). In addition, analysis of Frontier logs has allowed us a deeper understanding of problematic queries and understanding of use cases. Use of the system has grown beyond user analysis and subsystem specific tasks such as calibration and alignment, extending into production processing areas, such as initial reconstruction and trigger reprocessing. With a more robust and tuned system, we are better equipped to satisfy the still growing number of diverse clients and the demands of increasingly sophisticated processing and analysis.

  14. Efficient Provision of Employment Service Outputs: A Production Frontier Analysis.

    ERIC Educational Resources Information Center

    Cavin, Edward S.; Stafford, Frank P.

    1985-01-01

    This article develops a production frontier model for the Employment Service and assesses the relative efficiency of the 51 State Employment Security Agencies in attaining program outcomes close to that frontier. This approach stands in contrast to such established practices as comparing programs to their own previous performance. (Author/CT)

  15. Stochastic Lotka-Volterra equations: A model of lagged diffusion of technology in an interconnected world

    NASA Astrophysics Data System (ADS)

    Chakrabarti, Anindya S.

    2016-01-01

    We present a model of technological evolution due to interaction between multiple countries and the resultant effects on the corresponding macro variables. The world consists of a set of economies where some countries are leaders and some are followers in the technology ladder. All of them potentially gain from technological breakthroughs. Applying Lotka-Volterra (LV) equations to model evolution of the technology frontier, we show that the way technology diffuses creates repercussions in the partner economies. This process captures the spill-over effects on major macro variables seen in the current highly globalized world due to trickle-down effects of technology.

  16. A stochastic frontier approach to study the relationship between gastrointestinal nematode infections and technical efficiency of dairy farms.

    PubMed

    van der Voort, Mariska; Van Meensel, Jef; Lauwers, Ludwig; Vercruysse, Jozef; Van Huylenbroeck, Guido; Charlier, Johannes

    2014-01-01

    The impact of gastrointestinal (GI) nematode infections in dairy farming has traditionally been assessed using partial productivity indicators. But such approaches ignore the impact of infection on the performance of the whole farm. In this study, efficiency analysis was used to study the association of the GI nematode Ostertagia ostertagi with the technical efficiency of dairy farms. Five years of accountancy data were linked to GI nematode infection data gained from a longitudinal parasitic monitoring campaign. The level of exposure to GI nematodes was based on bulk-tank milk ELISA tests, which measure antibodies to O. ostertagi, and was expressed as an optical density ratio (ODR). Two unbalanced data panels were created for the period 2006 to 2010. The first data panel contained 198 observations from the Belgian Farm Accountancy Data Network (Brussels, Belgium) and the second contained 622 observations from the Boerenbond Flemish farmers' union (Leuven, Belgium) accountancy system (Tiber Farm Accounting System). We used the stochastic frontier analysis approach and defined inefficiency effect models specified with the Cobb-Douglas and transcendental logarithmic (Translog) functional forms. To assess the efficiency scores, milk production was considered as the main output variable. Six input variables were used: concentrates, roughage, pasture, number of dairy cows, animal health costs, and labor. The ODR of each individual farm served as an explanatory variable of inefficiency. An increase in the level of exposure to GI nematodes was associated with a decrease in technical efficiency. Exposure to GI nematodes constrains the productivity of pasture, health, and labor but does not cause inefficiency in the use of concentrates, roughage, and dairy cows. Lowering the level of infection in the interquartile range (0.271 ODR) was associated with an average milk production increase of 27, 19, and 9 L/cow per year for Farm Accountancy Data Network farms and 63, 49, and 23 L/cow per year for Tiber Farm Accounting System farms in the low- (0-90), medium- (90-95), and high- (95-99) efficiency score groups, respectively. The potential milk increase associated with reducing the level of infection was higher for highly efficient farms (6.7% of the total possible milk increase when becoming fully technically efficient) than for less efficient farms (3.8% of the total possible milk increase when becoming fully technically efficient). Copyright © 2014 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
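
    For readers who want to experiment with this kind of specification, the sketch below fits a minimal normal/half-normal stochastic production frontier by maximum likelihood and recovers Jondrow-type technical efficiency scores. It is a simplified cross-sectional illustration on simulated data, not the authors' panel model; all variable names and values are placeholders.

      import numpy as np
      from scipy.optimize import minimize
      from scipy.stats import norm

      def neg_loglik(theta, y, X):
          """Normal/half-normal stochastic production frontier: y = X b + v - u."""
          k = X.shape[1]
          b, lnsig2, lnlam = theta[:k], theta[k], theta[k + 1]
          sigma = np.sqrt(np.exp(lnsig2))   # sigma^2 = sigma_v^2 + sigma_u^2
          lam = np.exp(lnlam)               # lambda = sigma_u / sigma_v
          eps = y - X @ b
          ll = (np.log(2) - np.log(sigma)
                + norm.logpdf(eps / sigma)
                + norm.logcdf(-eps * lam / sigma))
          return -ll.sum()

      def jlms_te(theta, y, X):
          """Jondrow et al. point estimate of technical efficiency, exp(-E[u|eps])."""
          k = X.shape[1]
          b, sigma2, lam = theta[:k], np.exp(theta[k]), np.exp(theta[k + 1])
          su2 = sigma2 * lam**2 / (1 + lam**2)   # sigma_u^2
          sv2 = sigma2 / (1 + lam**2)            # sigma_v^2
          eps = y - X @ b
          mu_star = -eps * su2 / sigma2
          s_star = np.sqrt(su2 * sv2 / sigma2)
          z = mu_star / s_star
          e_u = mu_star + s_star * norm.pdf(z) / norm.cdf(z)
          return np.exp(-e_u)

      # Illustrative use with simulated data (log milk output on two log inputs):
      rng = np.random.default_rng(0)
      n = 200
      X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
      y = X @ np.array([1.0, 0.6, 0.3]) + rng.normal(0, 0.2, n) - np.abs(rng.normal(0, 0.3, n))
      res = minimize(neg_loglik, x0=np.zeros(X.shape[1] + 2), args=(y, X), method="BFGS")
      te = jlms_te(res.x, y, X)
      print(te.mean())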

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jadidian, Jouya; Zahn, Markus; Lavesson, Nils

    Streamer branching in liquid dielectrics is driven by stochastic and deterministic factors. The presence of stochastic causes of streamer branching such as inhomogeneities inherited from noisy initial states, impurities, or charge carrier density fluctuations is inevitable in any dielectric. A fully three-dimensional streamer model presented in this paper indicates that deterministic origins of branching are intrinsic attributes of streamers, which in some cases make the branching inevitable depending on the shape and velocity of the volume charge at the streamer frontier. Specifically, any given inhomogeneous perturbation can result in streamer branching if the volume charge layer at the original streamer head is relatively thin and slow enough. Furthermore, the discrete nature of electrons at the leading edge of an ionization front always guarantees the existence of a non-zero inhomogeneous perturbation ahead of the propagating streamer head, even in a perfectly homogeneous dielectric. Based on the modeling results for streamers propagating in a liquid dielectric, a gauge on the streamer head geometry is introduced that determines whether branching occurs under particular inhomogeneous circumstances. The estimated number, diameter, and velocity of the resulting branches agree qualitatively with experimental images of streamer branching.

  18. Determining the optimal model for role-substitution in NHS dental services in the United Kingdom.

    PubMed

    Brocklehurst, Paul; Birch, Stephen; McDonald, Ruth; Tickle, Martin

    2013-09-24

    Role-substitution describes a model of dental care where Dental Care Professionals (DCPs) provide some of the clinical activity previously undertaken by General Dental Practitioners. This has the potential to increase technical efficiency, the capacity to care and reduce costs. Technical efficiency is defined as the production of the maximum amount of output from a given amount of input so that the service operates at the production frontier, i.e. the optimal level of productivity. Academic research into technical efficiency is becoming increasingly utilised in health care, although no studies have investigated the efficiency of NHS dentistry or role-substitution in high-street dental practices. The aim of this study is to examine the barriers and enablers that exist for role-substitution in general dental practices in the NHS and to determine the most technically efficient model for role-substitution. A screening questionnaire will be sent to DCPs to determine the type and location of role-substitutive models employed in NHS dental practices in the United Kingdom (UK). Semi-structured interviews will then be conducted with practice owners, DCPs and patients at selected sites identified by the questionnaire. Detail will be recorded about the organisational structure of the dental team, the number of NHS hours worked and the clinical activity undertaken. The interviews will continue until saturation and will record the views and attitudes of the members of the dental team. Final numbers of interviews will be determined by saturation. The second work-stream will examine the technical efficiency of the selected practices using Data Envelopment Analysis and Stochastic Frontier Modeling. The former is a non-parametric technique and is considered to be a highly flexible approach for applied health applications. The latter is parametric and is based on frontier regression models that estimate a conventional cost function. Maximising health for a given level and mix of resources is an ethical imperative for health service planners. This study will determine the technical efficiency of role-substitution and so address one of the key recommendations of the Independent Review of NHS dentistry in England.
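
    As background to the second work-stream, the envelopment linear program behind an input-oriented, variable-returns-to-scale DEA model can be stated compactly. The sketch below is a generic illustration with invented practice data, not the study's model or dataset.

      import numpy as np
      from scipy.optimize import linprog

      def dea_vrs_input(X, Y):
          """Input-oriented VRS (BCC) efficiency score for every DMU.
          X: (n, m) array of inputs, Y: (n, s) array of outputs."""
          n, m = X.shape
          s = Y.shape[1]
          scores = np.empty(n)
          for o in range(n):
              # decision vector z = [theta, lambda_1, ..., lambda_n]
              c = np.zeros(1 + n); c[0] = 1.0                # minimise theta
              A_ub = np.zeros((m + s, 1 + n))
              b_ub = np.zeros(m + s)
              A_ub[:m, 0] = -X[o]                            # sum_j lam_j x_j <= theta * x_o
              A_ub[:m, 1:] = X.T
              A_ub[m:, 1:] = -Y.T                            # sum_j lam_j y_j >= y_o
              b_ub[m:] = -Y[o]
              A_eq = np.zeros((1, 1 + n)); A_eq[0, 1:] = 1.0 # VRS: sum lam = 1
              res = linprog(c, A_ub, b_ub, A_eq, [1.0],
                            bounds=[(0, None)] * (1 + n))
              scores[o] = res.x[0]
          return scores

      # Toy example: 4 practices, 2 inputs (hours, cost), 1 output (treatments)
      X = np.array([[8., 12.], [6., 10.], [9., 15.], [5., 9.]])
      Y = np.array([[100.], [90.], [95.], [80.]])
      print(dea_vrs_input(X, Y))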

  19. Determining the optimal model for role-substitution in NHS dental services in the United Kingdom

    PubMed Central

    2013-01-01

    Background Role-substitution describes a model of dental care where Dental Care Professionals (DCPs) provide some of the clinical activity previously undertaken by General Dental Practitioners. This has the potential to increase technical efficiency, the capacity to care and reduce costs. Technical efficiency is defined as the production of the maximum amount of output from a given amount of input so that the service operates at the production frontier i.e. optimal level of productivity. Academic research into technical efficiency is becoming increasingly utilised in health care, although no studies have investigated the efficiency of NHS dentistry or role-substitution in high-street dental practices. The aim of this study is to examine the barriers and enablers that exist for role-substitution in general dental practices in the NHS and to determine the most technically efficient model for role-substitution. Methods/design A screening questionnaire will be sent to DCPs to determine the type and location of role-substitutive models employed in NHS dental practices in the United Kingdom (UK). Semi-structured interviews will then be conducted with practice owners, DCPs and patients at selected sites identified by the questionnaire. Detail will be recorded about the organisational structure of the dental team, the number of NHS hours worked and the clinical activity undertaken. The interviews will continue until saturation and will record the views and attitudes of the members of the dental team. Final numbers of interviews will be determined by saturation. The second work-stream will examine the technical efficiency of the selected practices using Data Envelopment Analysis and Stochastic Frontier Modeling. The former is a non-parametric technique and is considered to be a highly flexible approach for applied health applications. The latter is parametric and is based on frontier regression models that estimate a conventional cost function. Discussion Maximising health for a given level and mix of resources is an ethical imperative for health service planners. This study will determine the technical efficiency of role-substitution and so address one of the key recommendations of the Independent Review of NHS dentistry in England. PMID:24063247

  20. Spatially and temporally resolved exciton dynamics and transport in single nanostructures and assemblies

    NASA Astrophysics Data System (ADS)

    Huang, Libai

    2015-03-01

    The frontier in solar energy conversion now lies in learning how to integrate functional entities across multiple length scales to create optimal devices. To address this new frontier, I will discuss our recent efforts on elucidating multi-scale energy transfer, migration, and dissipation processes with simultaneous femtosecond temporal resolution and nanometer spatial resolution. We have developed ultrafast microscopy that combines ultrafast spectroscopy with optical microscopy to map exciton dynamics and transport with simultaneous ultrafast time resolution and diffraction-limited spatial resolution. We have employed pump-probe transient absorption microscopy to elucidate morphology and structure dependent exciton dynamics and transport in single nanostructures and molecular assemblies. More specifically, (1) We have applied transient absorption microscopy (TAM) to probe environmental and structure dependent exciton relaxation pathways in single-walled carbon nanotubes (SWNTs) by mapping dynamics in individual pristine SWNTs with known structures. (2) We have systematically measured and modeled the optical properties of the Frenkel excitons in self-assembled porphyrin tubular aggregates that represent an analog to natural photosynthetic antennae. Using a combination of ultrafast optical microscopy and stochastic exciton modeling, we address exciton transport and relaxation pathways, especially those related to disorder.

  1. Statistical characterization of planar two-dimensional Rayleigh-Taylor mixing layers

    NASA Astrophysics Data System (ADS)

    Sendersky, Dmitry

    2000-10-01

    The statistical evolution of a planar, randomly perturbed fluid interface subject to Rayleigh-Taylor instability is explored through numerical simulation in two space dimensions. The data set, generated by the front-tracking code FronTier, is highly resolved and covers a large ensemble of initial perturbations, allowing a more refined analysis of closure issues pertinent to the stochastic modeling of chaotic fluid mixing. We closely approach a two-fold convergence of the mean two-phase flow: convergence of the numerical solution under computational mesh refinement, and statistical convergence under increasing ensemble size. Quantities that appear in the two-phase averaged Euler equations are computed directly and analyzed for numerical and statistical convergence. Bulk averages show a high degree of convergence, while interfacial averages are convergent only in the outer portions of the mixing zone, where there is a coherent array of bubble and spike tips. Comparison with the familiar bubble/spike penetration law h = αAgt² is complicated by the lack of scale invariance, inability to carry the simulations to late time, the increasing Mach numbers of the bubble/spike tips, and sensitivity to the method of data analysis. Finally, we use the simulation data to analyze some constitutive properties of the mixing process.
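
    As an aside, fitting the growth coefficient in h = αAgt² to simulation output reduces to a one-line least-squares computation. The sketch below uses synthetic data; the Atwood number A, acceleration g, and the measurements are invented for illustration.

      import numpy as np

      # Hypothetical bubble-penetration data (t in s, h in cm); A and g are
      # assumed known from the simulation setup.
      A, g = 0.5, 981.0
      t = np.linspace(0.5, 3.0, 11)
      h = 0.06 * A * g * t**2 * (1 + 0.05 * np.random.default_rng(1).normal(size=t.size))

      # Least-squares estimate of alpha in h = alpha * A * g * t^2
      z = A * g * t**2
      alpha = np.sum(h * z) / np.sum(z**2)
      print(f"alpha ≈ {alpha:.3f}")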

  2. The quantile regression approach to efficiency measurement: insights from Monte Carlo simulations.

    PubMed

    Liu, Chunping; Laporte, Audrey; Ferguson, Brian S

    2008-09-01

    In the health economics literature there is an ongoing debate over approaches used to estimate the efficiency of health systems at various levels, from the level of the individual hospital - or nursing home - up to that of the health system as a whole. The two most widely used approaches to evaluating the efficiency with which various units deliver care are non-parametric data envelopment analysis (DEA) and parametric stochastic frontier analysis (SFA). Productivity researchers tend to have very strong preferences over which methodology to use for efficiency estimation. In this paper, we use Monte Carlo simulation to compare the performance of DEA and SFA in terms of their ability to accurately estimate efficiency. We also evaluate quantile regression as a potential alternative approach. A Cobb-Douglas production function, random error terms and a technical inefficiency term with different distributions are used to calculate the observed output. The results, based on these experiments, suggest that neither DEA nor SFA can be regarded as clearly dominant, and that, depending on the quantile estimated, the quantile regression approach may be a useful addition to the armamentarium of methods for estimating technical efficiency.
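
    The data-generating process used in experiments of this kind is easy to reproduce. The sketch below simulates Cobb-Douglas output with symmetric noise and a choice of inefficiency distribution; this is the scaffold onto which the DEA, SFA, and quantile-regression estimators would then be applied. All coefficients are illustrative, not the paper's exact design.

      import numpy as np

      rng = np.random.default_rng(42)

      def simulate_firms(n, dist="halfnormal"):
          """One Monte Carlo replication: Cobb-Douglas log output with
          symmetric noise v and a one-sided inefficiency term u."""
          lnx1, lnx2 = rng.normal(size=n), rng.normal(size=n)
          v = rng.normal(0.0, 0.15, n)                  # symmetric noise
          if dist == "halfnormal":
              u = np.abs(rng.normal(0.0, 0.25, n))
          else:                                         # exponential inefficiency
              u = rng.exponential(0.25, n)
          lny = 0.5 + 0.4 * lnx1 + 0.4 * lnx2 + v - u
          true_te = np.exp(-u)
          return lny, lnx1, lnx2, true_te

      lny, lnx1, lnx2, te = simulate_firms(500)
      # Each replication would now be fed to DEA, SFA, and quantile-regression
      # estimators, and the estimated scores compared against true_te.
      print(te.mean())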

  3. Frontiers in Ecosystem Science: Energizing the Research Agenda

    NASA Astrophysics Data System (ADS)

    Weathers, K. C.; Groffman, P. M.; VanDolah, E.

    2014-12-01

    Ecosystem science has a long history as a core component of the discipline of Ecology, and although topics of research have fluctuated over the years, it retains a clear identity and continues to be a vital field. As science is becoming more interdisciplinary, particularly the science of global environmental change, ecosystem scientists are addressing new and important questions at the interface of multiple disciplines. Over the last two years, we organized a series of workshops and discussion groups at multiple scientific-society meetings, including AGU, to identify frontiers in ecosystem research. The workshops featured short "soapbox" presentations where speakers highlighted key questions in ecosystem science. The presentations were recorded (video and audio) and subjected to qualitative text analysis for identification of frontier themes; attendees completed surveys, and a dozen additional "key informants" were interviewed about their views on the frontiers of the discipline. Our effort produced 253 survey participants; the two largest groups of participants were full professors (24%) and graduate students (24%); no other specific group was > 10%. Formal text analysis of the soapbox presentations produced three major themes: "frontiers," "capacity building," and "barriers to implementation," with four or five sub-themes within each major theme. Key "frontiers" included: 1) better understanding of the drivers of ecosystem change, 2) better understanding of ecosystem process and function, 3) human dimensions of ecosystem science, and 4) problem-solving/applied research. Under "capacity building," key topics included: holistic approaches, cross-disciplinary collaboration, public support for research, data, training, and technology investment. Under "barriers," key topics included: limitations in theoretical thinking, insufficient funding/support, fragmentation across disciplines, data access and data synthesis. In-depth interviews with 13 experts validated findings from analysis of soapbox presentations and surveys and also resulted in a conceptual model for understanding disciplinary frontiers.

  4. Assessing the Total Factor Productivity of Cotton Production in Egypt

    PubMed Central

    Rodríguez, Xosé A.; Elasraag, Yahia H.

    2015-01-01

    The main objective of this paper is to decompose the productivity growth of Egyptian cotton production. We employ the stochastic frontier approach and decompose the changes in total factor productivity (CTFP) growth into four components: technical progress (TP), changes in scale component (CSC), changes in allocative efficiency (CAE), and changes in technical efficiency (CTE). Considering a situation of scarce statistical information, we propose four alternative empirical models, with the purpose of looking for convergence in the results. The results provide evidence that in this production system total productivity does not increase, which is mainly due to the negative average contributions of CAE and TP. Policy implications are offered in light of the results. PMID:25625318

  5. Assessing the total factor productivity of cotton production in Egypt.

    PubMed

    Rodríguez, Xosé A; Elasraag, Yahia H

    2015-01-01

    The main objective of this paper is to decompose the productivity growth of Egyptian cotton production. We employ the stochastic frontier approach and decompose the changes in total factor productivity (CTFP) growth into four components: technical progress (TP), changes in scale component (CSC), changes in allocative efficiency (CAE), and changes in technical efficiency (CTE). Considering a situation of scarce statistical information, we propose four alternative empirical models, with the purpose of looking for convergence in the results. The results provide evidence that in this production system total productivity does not increase, which is mainly due to the negative average contributions of CAE and TP. Policy implications are offered in light of the results.

  6. A Mean variance analysis of arbitrage portfolios

    NASA Astrophysics Data System (ADS)

    Fang, Shuhong

    2007-03-01

    Based on the careful analysis of the definition of arbitrage portfolio and its return, the author presents a mean-variance analysis of the return of arbitrage portfolios, which implies that Korkie and Turtle's results ( B. Korkie, H.J. Turtle, A mean-variance analysis of self-financing portfolios, Manage. Sci. 48 (2002) 427-443) are misleading. A practical example is given to show the difference between the arbitrage portfolio frontier and the usual portfolio frontier.

  7. Research in High Energy Physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilson, Robert John; Toki, Walter; Harton, John

    This report summarizes research performed within the Department of Energy Office of Science's Intensity Frontier and Cosmic Frontier High Energy Physics research subprograms during the period 2014-17. The major research thrusts in the Intensity Frontier involved two currently active neutrino experiments, T2K and NOvA; participation in development for the new Short-Baseline Neutrino program at Fermilab (SBN), which will begin full operation within the next one to two years; and physics tools, analysis and detector prototyping for the future Deep Underground Neutrino Experiment (DUNE). The major research thrusts in the Cosmic Frontier involved the Pierre Auger Observatory and the Directional Recoil Identification From Tracks (DRIFT) dark matter search experiment.

  8. General-Purpose Front End for Real-Time Data Processing

    NASA Technical Reports Server (NTRS)

    James, Mark

    2007-01-01

    FRONTIER is a computer program that functions as a front end for any of a variety of other software of both the artificial intelligence (AI) and conventional data-processing types. As used here, front end signifies interface software needed for acquiring and preprocessing data and making the data available for analysis by the other software. FRONTIER is reusable in that it can be rapidly tailored to any such other software with minimum effort. Each component of FRONTIER is programmable and is executed in an embedded virtual machine. Each component can be reconfigured during execution. The virtual-machine implementation makes FRONTIER independent of the type of computing hardware on which it is executed.

  9. Strategic assessment of the Highway Performance Monitoring System

    DOT National Transportation Integrated Search

    1995-02-01

    The appendix contains a series of border crossing profiles covering the major, and in some cases, minor crossings in the border frontier. The frontier itself is a definition created for the 6015 Study to aid in the analysis of trade and traffic flows...

  10. Measuring efficiency of cotton cultivation in Pakistan: a restricted production frontier study.

    PubMed

    Watto, Muhammad Arif; Mugera, Amin

    2014-11-01

    Massive groundwater pumping for irrigation has started lowering water tables rapidly in different regions of Pakistan. Declining water tables have thus prompted research efforts to improve agricultural productivity and efficiency to make efficient use of scarce water resources. This study employs a restricted stochastic production frontier to estimate the level of, and factors affecting, technical efficiency of groundwater-irrigated cotton farms in the Punjab province of Pakistan. The mean technical efficiency estimates indicate substantial technical inefficiencies among cotton growers. On average, tube-well owners and water buyers can potentially increase cotton production by 19% and 28%, respectively, without increasing the existing input level. The most influential factors affecting technical efficiency positively are the use of improved quality seed, consultation with extension field staff and farmers' perceptions concerning the availability of groundwater resources for irrigation in the future. This study proposes that adopting improved seed for new cotton varieties and providing better extension services regarding cotton production technology would help to achieve higher efficiency in cotton farming. Within the context of falling water tables, educating farmers about the actual crop water requirements and guiding them about groundwater resource availability may also help to achieve higher efficiencies. © 2014 Society of Chemical Industry.

  11. A class of stochastic optimization problems with one quadratic & several linear objective functions and extended portfolio selection model

    NASA Astrophysics Data System (ADS)

    Xu, Jiuping; Li, Jun

    2002-09-01

    In this paper, a class of stochastic multiple-objective programming problems with one quadratic and several linear objective functions and linear constraints is introduced. The model is transformed into a deterministic multiple-objective nonlinear programming model by taking the expectations of the random variables. The reference direction approach is used to deal with the linear objectives and results in a linear parametric optimization formula with a single linear objective function. This objective function is combined with the quadratic function using weighted sums. The quadratic problem is transformed into a linear (parametric) complementary problem, the basic formula for the proposed approach. The sufficient and necessary conditions for (properly, weakly) efficient solutions and some construction characteristics of (weakly) efficient solution sets are obtained. An interactive algorithm is proposed based on reference direction and weighted sums. By varying the parameter vector on the right-hand side of the model, the decision maker can freely search the efficient frontier with the model. An extended portfolio selection model is formed when liquidity is considered as another objective to be optimized besides expectation and risk. The interactive approach is illustrated with a practical example.

  12. [Impact of the funding reform of teaching hospitals in Brazil].

    PubMed

    Lobo, M S C; Silva, A C M; Lins, M P E; Fiszman, R

    2009-06-01

    This study assessed the impact of the funding reform on the productivity of teaching hospitals. Based on the Information System of Federal University Hospitals of Brazil, efficiency and productivity in 2003 and 2006 were measured using frontier methods with a linear programming technique, data envelopment analysis, and an input-oriented variable returns to scale model. The Malmquist index was calculated to detect changes during the study period: 'technical efficiency change,' or the relative variation of the efficiency of each unit, and 'technological change' after the frontier shift. There was a 51% mean budget increase and an improvement in the technical efficiency of teaching hospitals (17 hospitals reached the empirical efficiency frontier, up from 11), but the same was not seen for the technology frontier. Data envelopment analysis set benchmark scores for each inefficient unit (before and after the reform), and there was a positive correlation between technical efficiency and teaching intensity and dedication. The reform promoted management improvements, but further follow-up is needed to assess the effectiveness of the funding changes.
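
    The Malmquist decomposition mentioned here reduces to simple arithmetic once the four cross-period DEA efficiency scores are available. A minimal sketch, with invented scores for a single hospital:

      import numpy as np

      def malmquist(e_t_t, e_t_t1, e_t1_t, e_t1_t1):
          """Malmquist index from four DEA efficiency scores, where
          e_a_b is the efficiency of the period-b observation measured
          against the period-a frontier. Values > 1 indicate improvement
          under the usual convention."""
          eff_change = e_t1_t1 / e_t_t                                  # catching-up
          tech_change = np.sqrt((e_t_t1 / e_t1_t1) * (e_t_t / e_t1_t))  # frontier shift
          return eff_change * tech_change, eff_change, tech_change

      # Illustrative scores for one hospital before/after the reform
      m, ec, tc = malmquist(e_t_t=0.80, e_t_t1=0.95, e_t1_t=0.78, e_t1_t1=0.92)
      print(m, ec, tc)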

  13. Dynamic network data envelopment analysis for university hospitals evaluation

    PubMed Central

    Lobo, Maria Stella de Castro; Rodrigues, Henrique de Castro; André, Edgard Caires Gazzola; de Azeredo, Jônatas Almeida; Lins, Marcos Pereira Estellita

    2016-01-01

    OBJECTIVE To develop an assessment tool to evaluate the efficiency of federal university general hospitals. METHODS Data envelopment analysis, a linear programming technique, creates a best practice frontier by comparing observed production given the amount of resources used. The model is output-oriented and considers variable returns to scale. Network data envelopment analysis considers link variables belonging to more than one dimension (in the model, medical residents, adjusted admissions, and research projects). Dynamic network data envelopment analysis uses carry-over variables (in the model, financing budget) to analyze frontier shift in subsequent years. Data were gathered from the information system of the Brazilian Ministry of Education (MEC), 2010-2013. RESULTS The mean scores for health care, teaching and research over the period were 58.0%, 86.0%, and 61.0%, respectively. In 2012, the best performance year, for all units to reach the frontier it would be necessary to have a mean increase of 65.0% in outpatient visits; 34.0% in admissions; 12.0% in undergraduate students; 13.0% in multi-professional residents; 48.0% in graduate students; 7.0% in research projects; besides a decrease of 9.0% in medical residents. In the same year, an increase of 0.9% in financing budget would be necessary to improve the care output frontier. In the dynamic evaluation, there was progress in teaching efficiency, oscillation in medical care and no variation in research. CONCLUSIONS The proposed model generates public health planning and programming parameters by estimating efficiency scores and making projections to reach the best practice frontier. PMID:27191158

  14. An urban energy performance evaluation system and its computer implementation.

    PubMed

    Wang, Lei; Yuan, Guan; Long, Ruyin; Chen, Hong

    2017-12-15

    To improve the urban environment and effectively reflect and promote urban energy performance, an urban energy performance evaluation system was constructed, thereby strengthening urban environmental management capabilities. From the perspectives of internalization and externalization, a framework of evaluation indicators and key factors that determine urban energy performance and explore the reasons for differences in performance was proposed according to established theory and previous studies. Using the improved stochastic frontier analysis method, an urban energy performance evaluation and factor analysis model was built that brings performance evaluation and factor analysis into the same stage for study. According to data obtained for the Chinese provincial capitals from 2004 to 2013, the coefficients of the evaluation indicators and key factors were calculated by the urban energy performance evaluation and factor analysis model. These coefficients were then used to compile the program file. The urban energy performance evaluation system developed in this study was designed in three parts: a database, a distributed component server, and a human-machine interface. Its functions include login, record addition and editing, data input, calculation, analysis, comparison, inquiry, and export. On the basis of these contents, an urban energy performance evaluation system was developed using Microsoft Visual Studio .NET 2015. The system can effectively reflect the status of and any changes in urban energy performance. Beijing was used as an example in an empirical study, which further verified the applicability and convenience of this evaluation system. Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. Frontiers in Sociology of Education

    ERIC Educational Resources Information Center

    Hallinan, Maureen T., Ed.

    2011-01-01

    Scholarly analysis in the sociology of education has burgeoned in recent decades. "Frontiers in Sociology of Education" aims to provide a roadmap for sociologists and other social scientists as they set bold new directions for future research on schools. In Part 1 of this forward-looking volume, the authors present cutting-edge research…

  16. Stochastic Game Analysis and Latency Awareness for Self-Adaptation

    DTIC Science & Technology

    2014-01-01

    In this paper, we introduce a formal analysis technique based on model checking of stochastic multiplayer games (SMGs) that enables us to quantify the... When planning how to adapt, self-adaptive... The contribution of this paper is twofold: (1) a novel analysis technique based on model checking of stochastic multiplayer games (SMGs) that enables us to... Key words and phrases: proactive adaptation, stochastic multiplayer games, latency.

  17. Dual-mode nested search method for categorical uncertain multi-objective optimization

    NASA Astrophysics Data System (ADS)

    Tang, Long; Wang, Hu

    2016-10-01

    Categorical multi-objective optimization is an important issue involved in many matching design problems. Non-numerical variables and their uncertainty are the major challenges of such optimizations. Therefore, this article proposes a dual-mode nested search (DMNS) method. In the outer layer, kriging metamodels are established using standard regular simplex mapping (SRSM) from categorical candidates to numerical values. Assisted by the metamodels, a k-cluster-based intelligent sampling strategy is developed to search Pareto frontier points. The inner layer uses an interval number method to model the uncertainty of categorical candidates. To improve the efficiency, a multi-feature convergent optimization via most-promising-area stochastic search (MFCOMPASS) is proposed to determine the bounds of objectives. Finally, typical numerical examples are employed to demonstrate the effectiveness of the proposed DMNS method.

  18. Post Pareto optimization-A case

    NASA Astrophysics Data System (ADS)

    Popov, Stoyan; Baeva, Silvia; Marinova, Daniela

    2017-12-01

    Simulation performance may be evaluated according to multiple quality measures that are in competition, so their simultaneous consideration poses a conflict. In the current study we propose a practical framework for investigating such simulation performance criteria, exploring the inherent conflicts amongst them and identifying the best available tradeoffs, based upon multi-objective Pareto optimization. This approach necessitates the rigorous derivation of performance criteria to serve as objective functions and undergo vector optimization. We demonstrate the effectiveness of our proposed approach by applying it with multiple stochastic quality measures. We formulate performance criteria for this use case, pose an optimization problem, and solve it by means of a simulation-based Pareto approach. Upon attainment of the underlying Pareto frontier, we analyze it and prescribe preference-dependent configurations for optimal simulation training.
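
    The core step of such an analysis, extracting the non-dominated set from a cloud of evaluated configurations, is compact to implement. A minimal sketch, assuming every objective is to be minimised; the candidate values are invented:

      import numpy as np

      def pareto_frontier(points):
          """Return the non-dominated subset of a set of objective vectors,
          assuming every objective is to be minimised."""
          pts = np.asarray(points, dtype=float)
          keep = np.ones(len(pts), dtype=bool)
          for i, p in enumerate(pts):
              if keep[i]:
                  # q is dominated by p if q >= p everywhere and q > p somewhere
                  dominated = np.all(pts >= p, axis=1) & np.any(pts > p, axis=1)
                  keep &= ~dominated
                  keep[i] = True
          return pts[keep]

      # Two competing quality measures (e.g. error vs. run time)
      candidates = [(0.10, 9.0), (0.08, 12.0), (0.12, 7.5),
                    (0.11, 9.2), (0.15, 6.0)]
      print(pareto_frontier(candidates))   # (0.11, 9.2) is filtered out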

  19. Learning the Task Management Space of an Aircraft Approach Model

    NASA Technical Reports Server (NTRS)

    Krall, Joseph; Menzies, Tim; Davies, Misty

    2014-01-01

    Validating models of airspace operations is a particular challenge. These models are often aimed at finding and exploring safety violations, and aim to be accurate representations of real-world behavior. However, the rules governing the behavior are quite complex: nonlinear physics, operational modes, human behavior, and stochastic environmental concerns all determine the responses of the system. In this paper, we present a study on aircraft runway approaches as modeled in Georgia Tech's Work Models that Compute (WMC) simulation. We use a new learner, Genetic-Active Learning for Search-Based Software Engineering (GALE) to discover the Pareto frontiers defined by cognitive structures. These cognitive structures organize the prioritization and assignment of tasks of each pilot during approaches. We discuss the benefits of our approach, and also discuss future work necessary to enable uncertainty quantification.

  20. CORSAIR (COmet Rendezvous, Sample Acquisition, Investigation, and Return): A New Frontiers Mission Concept to Collect Samples from a Comet and Return Them to Earth for Study

    NASA Astrophysics Data System (ADS)

    Sandford, S. A.; Chabot, N. L.; Dello Russo, N.; Leary, J. C.; Reynolds, E. L.; Weaver, H. A.; Wooden, D. H.

    2017-07-01

    CORSAIR (COmet Rendezvous, Sample Acquisition, Investigation, and Return) is a mission concept submitted in response to NASA's New Frontiers 4 call. CORSAIR's proposed mission is to return comet nucleus samples to Earth for detailed analysis.

  1. Technical efficiency in milk production in underdeveloped production environment of India*.

    PubMed

    Bardhan, Dwaipayan; Sharma, Murari Lal

    2013-12-01

    The study was undertaken in Kumaon division of Uttarakhand state of India with the objective of estimating technical efficiency in milk production across different herd-size category households and the factors influencing it. A total of 60 farm households, representing different herd-size categories and drawn from six randomly selected villages of the plain and hilly regions of the division, constituted the ultimate sampling units of the study. Stochastic frontier production function analysis was used to estimate the technical efficiency in milk production. Multivariate regression equations were fitted taking the technical efficiency index as the regressand to identify the factors significantly influencing technical efficiency in milk production. The study revealed that variation in output across farms in the study area was due to differences in their technical efficiency levels. However, it was interesting to note that smallholder producers were more technically efficient in milk production than their larger counterparts, especially in the plains. Apart from herd size, intensity of market participation had a significant and positive impact on technical efficiency in the plains. This provides a definite indication that increasing the level of commercialization of dairy farms would have a beneficial impact on their production efficiency.

  2. Empirical Study on the Sustainability of China's Grain Quality Improvement: The Role of Transportation, Labor, and Agricultural Machinery.

    PubMed

    Zhang, Ming; Duan, Fang; Mao, Zisen

    2018-02-05

    As a major part of farming sustainability, the issues of grain production and its quality improvement have been important in many countries. This paper aims to address these issues in China. Based on the data from the main production provinces and by applying the stochastic frontier analysis methodology, we find that the improvement of transportation and the use of agricultural machinery have become the main driving forces for grain quality improvement in China. After further studying different provinces' potentials of grain quality improvement, we show that grain quality has increased steadily. Therefore, we can conclude China's grain quality improvement is indeed sustainable. Furthermore, different grains like rice, wheat, and corn share similar characteristics in terms of quality improvement, but the improvement rate for rice is relatively low, while those of corn and wheat are relatively high. Moreover, the overall change of efficiency gain of grain quality improvement is not significant for different provinces. The efficiency gains of the quality improvements for rice and wheat even decrease slightly. In addition, we find that only expanding grain quality improvement potential can simultaneously achieve the dual objectives of improving grain quality and increasing yield.

  3. Empirical Study on the Sustainability of China’s Grain Quality Improvement: The Role of Transportation, Labor, and Agricultural Machinery

    PubMed Central

    Zhang, Ming; Duan, Fang; Mao, Zisen

    2018-01-01

    As a major part of farming sustainability, the issues of grain production and its quality improvement have been important in many countries. This paper aims to address these issues in China. Based on the data from the main production provinces and by applying the stochastic frontier analysis methodology, we find that the improvement of transportation and the use of agricultural machinery have become the main driving forces for grain quality improvement in China. After further studying different provinces’ potentials of grain quality improvement, we show that grain quality has increased steadily. Therefore, we can conclude China’s grain quality improvement is indeed sustainable. Furthermore, different grains like rice, wheat, and corn share similar characteristics in terms of quality improvement, but the improvement rate for rice is relatively low, while those of corn and wheat are relatively high. Moreover, the overall change of efficiency gain of grain quality improvement is not significant for different provinces. The efficiency gains of the quality improvements for rice and wheat even decrease slightly. In addition, we find that only expanding grain quality improvement potential can simultaneously achieve the dual objectives of improving grain quality and increasing yield. PMID:29401727

  4. Social, Biological and Physical Meta-Mechanisms a tale of Tails

    NASA Astrophysics Data System (ADS)

    West, Bruce J.

    The tale concerns the uncertainty of knowledge in the natural, social and life sciences, and the tails are associated with the statistical distributions and correlation functions describing these scientific uncertainties. The tails in many phenomena are mentioned, including the long-range correlations in DNA sequences, the long-time memory in human gait and heart beats, the patterns over time in the births of babies to teenagers, as well as in the sexual pairings of homosexual men, and the volatility in financial markets, among many other exemplars. I shall argue that these phenomena are so complex that no one is able to understand them completely. However, insights and partial knowledge can still be gained through strategies that do not require a complete mechanistic understanding of the phenomena being studied. These strategies include the development of models using fractal stochastic processes, chaotic dynamical systems, and the fractional calculus, all of which are tied together using the concept of scaling, and therein hangs the tale. The perspective adopted in this lecture is not the dogmatic presentation often found in textbooks, in large part because there is no "right answer" to the questions being posed. Rather than answers, there are clues, indications, suggestions and tracks in the snow, as there always are at the frontiers of science. It is my perspective on this frontier that I will present, laid out in detail in Physiology, Promiscuity and Prophecy at the Millennium: A Tale of Tails.

  5. Academic Performance and Burnout: An Efficient Frontier Analysis of Resource Use Efficiency among Employed University Students

    ERIC Educational Resources Information Center

    Galbraith, Craig S.; Merrill, Gregory B.

    2015-01-01

    We examine the impact of university student burnout on academic achievement. With a longitudinal sample of working undergraduate university business and economics students, we use a two-step analytical process to estimate the efficient frontiers of student productivity given inputs of labour and capital and then analyse the potential determinants…

  6. Beyond Frontiers: Comparing the Efficiency of Higher Education Decision-Making Units across More than One Country

    ERIC Educational Resources Information Center

    Agasisti, Tommaso; Johnes, Geraint

    2009-01-01

    We employ Data Envelopment Analysis to compute the technical efficiency of Italian and English higher education institutions. Our results show that, in relation to the country-specific frontier, institutions in both countries are typically very efficient. However, institutions in England are more efficient than those in Italy when we compare…

  7. An Analysis of Stochastic Duels Involving Fixed Rates of Fire

    DTIC Science & Technology

    The thesis presents an analysis of stochastic duels involving two opposing weapon systems with constant rates of fire. The duel was developed as a... process stochastic duels. The analysis was then extended to the two versus one duel where the three weapon systems were assumed to have fixed rates of fire.
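
    Although only a fragment of the thesis abstract is preserved here, the basic fixed-rate stochastic duel is straightforward to simulate. The sketch below treats each side's fire as a Poisson stream with a fixed single-shot kill probability, which is one common formulation, not necessarily the thesis's exact model:

      import numpy as np

      rng = np.random.default_rng(7)

      def duel_win_prob(rate_a, p_a, rate_b, p_b, n=100_000):
          """Monte Carlo estimate of P(A wins) in a two-sided stochastic duel
          where each side fires a Poisson stream of rounds at a fixed rate and
          each round kills with a fixed single-shot probability."""
          # Thinning a Poisson stream: lethal hits arrive at rate * p_kill,
          # so each duellist's time-to-kill is exponential.
          t_kill_b = rng.exponential(1.0 / (rate_a * p_a), n)   # A kills B
          t_kill_a = rng.exponential(1.0 / (rate_b * p_b), n)   # B kills A
          return np.mean(t_kill_b < t_kill_a)

      est = duel_win_prob(rate_a=2.0, p_a=0.3, rate_b=1.5, p_b=0.4)
      exact = (2.0 * 0.3) / (2.0 * 0.3 + 1.5 * 0.4)   # closed form for this case
      print(est, exact)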

  8. Lateral movements in Rayleigh-Taylor instabilities due to frontiers. Numerical analysis

    NASA Astrophysics Data System (ADS)

    Fernandez, D.; Binda, L.; Zalts, A.; El Hasi, C.; D'Onofrio, A.

    2018-01-01

    Numerical simulations were performed for Rayleigh-Taylor (RT) hydrodynamic instabilities when a frontier is present. The frontier formed by the interface between two fluids prevents the free movement of the fingers created by the instability. As a consequence, transversal movements at the rear of the fingers are observed in this area. These movements produce collapse of the fingers (two or more fingers join in one finger) or oscillations in the case that there is no collapse. The transversal velocity of the fingers, the amplitude of the oscillations, and the wave number of the RT instabilities as a function of the Rayleigh number (Ra) were studied near the frontier. We verified numerically that in classical RT instabilities, without a frontier, these lateral movements do not occur; only with a physical frontier, the transversal displacements of the fingers appear. The transverse displacement velocity and the initial wave number increase with Ra. This leads to the collapse of the fingers, diminishing the wave number of the instabilities at the interface. Instead, no significant changes in the amplitude of the oscillations are observed modifying Ra. The numerical results are independent of the type or origin of the frontier (gas-liquid, liquid-liquid, or solid-liquid). The numerical results are in good agreement with the experimental results reported by Binda et al. [Chaos 28, 013107 (2018)]. Based on these results, it was possible to determine the cause of the transverse displacements, which had not been explained until now.

  9. A stochastic model for optimizing composite predictors based on gene expression profiles.

    PubMed

    Ramanathan, Murali

    2003-07-01

    This project was done to develop a mathematical model for optimizing composite predictors based on gene expression profiles from DNA arrays and proteomics. The problem was amenable to a formulation and solution analogous to the portfolio optimization problem in mathematical finance: it requires the optimization of a quadratic function subject to linear constraints. The performance of the approach was compared to that of neighborhood analysis using a data set containing cDNA array-derived gene expression profiles from 14 multiple sclerosis patients receiving intramuscular interferon-beta1a. The Markowitz portfolio model predicts that the covariance between genes can be exploited to construct an efficient composite. The model predicts that a composite is not needed for maximizing the mean value of a treatment effect: only a single gene is needed, but the usefulness of the effect measure may be compromised by high variability. The model optimized the composite to yield the highest mean for a given level of variability or the least variability for a given mean level. The choices that meet this optimization criterion lie on a curve in the plot of composite mean vs. composite variability referred to as the "efficient frontier." When a composite is constructed using the model, it outperforms the composite constructed using the neighborhood analysis method. The Markowitz portfolio model may find potential applications in constructing composite biomarkers and in the pharmacogenomic modeling of treatment effects derived from gene expression endpoints.
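
    The quadratic optimization underlying this approach is the textbook Markowitz program: minimise composite variance subject to a target mean and weights summing to one. A minimal closed-form sketch, with an invented three-gene covariance structure standing in for real expression data:

      import numpy as np

      def efficient_frontier_weights(mu, cov, target_mean):
          """Closed-form Markowitz weights: minimise w' cov w subject to
          w' mu = target_mean and sum(w) = 1 (negative weights allowed)."""
          inv = np.linalg.inv(cov)
          ones = np.ones_like(mu)
          a = ones @ inv @ ones
          b = ones @ inv @ mu
          c = mu @ inv @ mu
          d = a * c - b**2
          lam = (a * target_mean - b) / d
          gam = (c - b * target_mean) / d
          return inv @ (lam * mu + gam * ones)

      # Hypothetical "expression portfolio" of three genes: mean treatment
      # effects and their covariance (values invented for illustration)
      mu = np.array([0.8, 0.5, 0.3])
      cov = np.array([[0.20, 0.05, 0.02],
                      [0.05, 0.10, 0.01],
                      [0.02, 0.01, 0.05]])
      for m in (0.4, 0.5, 0.6):
          w = efficient_frontier_weights(mu, cov, m)
          print(m, w, w @ cov @ w)   # target mean, weights, composite variance

    Tracing the composite mean against the minimised variance over a range of target means sweeps out exactly the "efficient frontier" curve the abstract describes.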

  10. An Efficient, Noniterative Method of Identifying the Cost-Effectiveness Frontier.

    PubMed

    Suen, Sze-chuan; Goldhaber-Fiebert, Jeremy D

    2016-01-01

    Cost-effectiveness analysis aims to identify treatments and policies that maximize benefits subject to resource constraints. However, the conventional process of identifying the efficient frontier (i.e., the set of potentially cost-effective options) can be algorithmically inefficient, especially when considering a policy problem with many alternative options or when performing an extensive suite of sensitivity analyses for which the efficient frontier must be found for each. Here, we describe an alternative one-pass algorithm that is conceptually simple, easier to implement, and potentially faster for situations that challenge the conventional approach. Our algorithm accomplishes this by exploiting the relationship between the net monetary benefit and the cost-effectiveness plane. To facilitate further evaluation and use of this approach, we also provide scripts in R and Matlab that implement our method and can be used to identify efficient frontiers for any decision problem. © The Author(s) 2015.
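
    The relationship the authors exploit, that a strategy is potentially cost-effective exactly when it maximizes net monetary benefit (willingness-to-pay times effect, minus cost) for some willingness-to-pay, is equivalent to the strategy lying on the lower-left convex hull of the cost-effectiveness plane. The sketch below is one single-pass (after sorting) implementation of that idea, not necessarily identical to the authors' published R and Matlab scripts; the options are invented:

      def ce_frontier(strategies):
          """Cost-effectiveness frontier of (name, cost, effect) options:
          the options that maximise net monetary benefit for some
          willingness-to-pay >= 0, i.e. the lower-left convex hull of the
          cost-effectiveness plane. One pass after sorting by effect."""
          hull = []
          for s in sorted(strategies, key=lambda x: (x[2], x[1])):
              if hull and s[2] <= hull[-1][2] and s[1] >= hull[-1][1]:
                  continue                      # strictly dominated, skip
              while hull and hull[-1][1] >= s[1]:
                  hull.pop()                    # costlier but less effective
              while len(hull) >= 2:             # extended dominance check:
                  (_, c1, e1), (_, c2, e2) = hull[-2], hull[-1]
                  if (s[1] - c2) * (e2 - e1) <= (c2 - c1) * (s[2] - e2):
                      hull.pop()                # ICERs must be increasing
                  else:
                      break
              hull.append(s)
          return hull

      options = [("A", 0.0, 0.0), ("B", 100.0, 4.0), ("C", 150.0, 4.5),
                 ("D", 180.0, 6.0), ("E", 500.0, 6.5)]
      print([s[0] for s in ce_frontier(options)])   # -> ['A', 'B', 'D', 'E']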

  11. An Efficient, Non-iterative Method of Identifying the Cost-Effectiveness Frontier

    PubMed Central

    Suen, Sze-chuan; Goldhaber-Fiebert, Jeremy D.

    2015-01-01

    Cost-effectiveness analysis aims to identify treatments and policies that maximize benefits subject to resource constraints. However, the conventional process of identifying the efficient frontier (i.e., the set of potentially cost-effective options) can be algorithmically inefficient, especially when considering a policy problem with many alternative options or when performing an extensive suite of sensitivity analyses for which the efficient frontier must be found for each. Here, we describe an alternative one-pass algorithm that is conceptually simple, easier to implement, and potentially faster for situations that challenge the conventional approach. Our algorithm accomplishes this by exploiting the relationship between the net monetary benefit and the cost-effectiveness plane. To facilitate further evaluation and use of this approach, we additionally provide scripts in R and Matlab that implement our method and can be used to identify efficient frontiers for any decision problem. PMID:25926282

  12. Problems of Mathematical Finance by Stochastic Control Methods

    NASA Astrophysics Data System (ADS)

    Stettner, Łukasz

    The purpose of this paper is to present main ideas of mathematics of finance using the stochastic control methods. There is an interplay between stochastic control and mathematics of finance. On the one hand stochastic control is a powerful tool to study financial problems. On the other hand financial applications have stimulated development in several research subareas of stochastic control in the last two decades. We start with pricing of financial derivatives and modeling of asset prices, studying the conditions for the absence of arbitrage. Then we consider pricing of defaultable contingent claims. Investments in bonds lead us to the term structure modeling problems. Special attention is devoted to historical static portfolio analysis called Markowitz theory. We also briefly sketch dynamic portfolio problems using viscosity solutions to Hamilton-Jacobi-Bellman equation, martingale-convex analysis method or stochastic maximum principle together with backward stochastic differential equation. Finally, long time portfolio analysis for both risk neutral and risk sensitive functionals is introduced.

  13. Specialty hospitals emulating focused factories: a case study.

    PubMed

    Kumar, Sameer

    2010-01-01

    For 15 years, general hospital managers have faced new competition from for-profit specialty hospitals that operate on a "focused factory" model and threaten to siphon off the most profitable patients. This paper aims to discuss North American specialty hospitals and to review the impact of rising costs on general hospital operations. The focus is to discover whether specialty hospitals are more efficient than general hospitals; if so, how significant the difference is, and what general hospitals can do in light of the rise of specialty hospitals. The case study involves stochastic frontier regression analysis using Cobb-Douglas and Translog cost functions to compare Minnesota general and specialty hospital efficiency. Analysis is based on data from 117 general and 19 specialty hospitals. The results suggest that specialty hospitals are significantly more efficient than general hospitals. Overall, general hospitals were found to be more than twice as inefficient compared with specialty hospitals in the sample. Some cost-cutting factors highlighted can be implemented to trim rising costs. The case study highlights some managerial levers that general hospital operational managers might use to control rising costs. This also helps them compete with specialty hospitals by reducing overheads and other major costs. The study is based on empirical modeling for an important healthcare operational challenge and provides additional in-depth information that has health policy implications. The analysis and findings enable healthcare managers to guide their institutions in a new direction during a time of change within the industry.

  14. Site Productivity and Tree Mortality on New Frontiers of Gypsy Moth Infestation

    Treesearch

    David A. Gansner; David A. Gansner

    1987-01-01

    Recent analysis of forest stand losses to gypsy moth has provided basic information for analyzing the relationship between forest site productivity and tree mortality on new frontiers of infestation. Poor timber-growing sites had the lowest rates of mortality. Oak mortality (number of trees) amounted to 18 percent on poor sites compared with 26 percent on medium and 28...

  15. Computational study of frontier orbitals, moments, chemical reactivity and thermodynamic parameters of sildenafil

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sachdeva, Ritika, E-mail: ritika.sachdeva21@gmail.com; Kaur, Prabhjot; Singh, V. P.

    2016-05-06

    Analysis of frontier orbitals of sildenafil has been carried out using Density Functional Theory. On the basis of the HOMO-LUMO energies, values of global chemical reactivity descriptors such as electronegativity, chemical hardness, softness, chemical potential, and electrophilicity index have been calculated. Calculated values of dipole moment, polarizability, and hyperpolarizability have also been reported for sildenafil, along with its thermodynamic parameters.
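
    The descriptors listed here follow directly from the frontier orbital energies under Koopmans' approximation. A small sketch with placeholder HOMO/LUMO values, not the paper's computed results:

      # Koopmans-based global reactivity descriptors from frontier orbital
      # energies. E_HOMO and E_LUMO below are placeholders (in eV).
      E_HOMO, E_LUMO = -6.2, -1.9

      IP = -E_HOMO                   # ionisation potential
      EA = -E_LUMO                   # electron affinity
      chi = (IP + EA) / 2            # electronegativity
      mu = -chi                      # chemical potential
      eta = (IP - EA) / 2            # chemical hardness
      S = 1 / (2 * eta)              # softness
      omega = mu**2 / (2 * eta)      # electrophilicity index

      print(f"gap={E_LUMO - E_HOMO:.2f} eV, chi={chi:.2f}, eta={eta:.2f}, "
            f"S={S:.3f}, omega={omega:.2f}")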

  16. Insect cyborgs: a new frontier in flight control systems

    NASA Astrophysics Data System (ADS)

    Reissman, Timothy; Crawford, Jackie H.; Garcia, Ephrahim

    2007-04-01

    The development of a micro-UAV via a cybernetic organism, primarily the Manduca sexta moth, is presented. An observer that gathers output data on the moth's system response is implemented by means of an image-following system. The visual tracking was implemented to gather the required information about the time history of the moth's six degrees of freedom. This was performed with three cameras tracking a white line as a marker on the moth's thorax to maximize contrast between the moth and the marker. Evaluation of the implemented six degree of freedom visual tracking system finds precision finer than 0.1 mm within three standard deviations and accuracy on the order of 1 mm. Acoustic and visual response systems are presented to lay the groundwork for creating a catalog of the organism's stochastic responses to varied stimuli.

  17. Stochastic analysis of a novel nonautonomous periodic SIRI epidemic system with random disturbances

    NASA Astrophysics Data System (ADS)

    Zhang, Weiwei; Meng, Xinzhu

    2018-02-01

    In this paper, a new stochastic nonautonomous SIRI epidemic model is formulated. Given that the incidence rates of diseases may change with the environment, we propose a novel type of transmission function. The main aim of this paper is to obtain the thresholds of the stochastic SIRI epidemic model. To this end, we investigate the dynamics of the stochastic system and establish the conditions for extinction and persistence in mean of the disease by constructing some suitable Lyapunov functions and using stochastic analysis technique. Furthermore, we show that the stochastic system has at least one nontrivial positive periodic solution. Finally, numerical simulations are introduced to illustrate our results.
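
    A generic nonautonomous stochastic SIRI system of this kind can be explored numerically with an Euler-Maruyama integrator. The sketch below uses a periodic transmission rate and illustrative parameters; it is a generic construction, not the authors' exact model or parameterization:

      import numpy as np

      rng = np.random.default_rng(3)

      # Generic stochastic SIRI with seasonal transmission, integrated by
      # Euler-Maruyama. All parameter values are illustrative.
      Lam, mu, gamma, delta, sigma = 0.02, 0.02, 0.1, 0.05, 0.05
      beta = lambda t: 0.5 * (1.0 + 0.3 * np.sin(2 * np.pi * t))  # periodic forcing

      T, dt = 200.0, 0.01
      n = int(T / dt)
      S, I, R = 0.7, 0.1, 0.2
      path = np.empty(n)
      for k in range(n):
          t = k * dt
          dW = rng.normal(0.0, np.sqrt(dt))        # Brownian increment
          inc = beta(t) * S * I                    # incidence term
          S += (Lam - inc - mu * S) * dt - sigma * S * I * dW
          I += (inc - (mu + gamma) * I + delta * R) * dt + sigma * S * I * dW
          R += (gamma * I - (mu + delta) * R) * dt # relapse at rate delta
          path[k] = I
      print(path[-1], path.max())                  # persistence vs. extinction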

  18. Exploration of a High Luminosity 100 TeV Proton Antiproton Collider

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oliveros, Sandra J.; Summers, Don; Cremaldi, Lucien

    New physics is being explored with the Large Hadron Collider at CERN and with Intensity Frontier programs at Fermilab and KEK. The energy scale for new physics is known to be in the multi-TeV range, signaling the need for a future collider which well surpasses this energy scale. We explore a $10^{34}$ cm$^{-2}$ s$^{-1}$ luminosity, 100 TeV $p\bar{p}$ collider with 7 times the energy of the LHC but only 2 times as much NbTi superconductor, motivating the choice of 4.5 T single bore dipoles. The cross section for many high mass states is 10 times higher in $p\bar{p}$ than $pp$ collisions. Antiquarks for production can come directly from an antiproton rather than indirectly from gluon splitting. The higher cross sections reduce the synchrotron radiation in superconducting magnets and the number of events per beam crossing, because lower beam currents can produce the same rare event rates. Events are more centrally produced, allowing a more compact detector with less space between quadrupole triplets and a smaller $\beta^{*}$ for higher luminosity. A Fermilab-like $\bar{p}$ source would disperse the beam into 12 momentum channels to capture more antiprotons. Because stochastic cooling time scales as the number of particles, 12 cooling ring sets would be used. Each set would include phase rotation to lower momentum spreads, equalize all momentum channels, and stochastically cool. One electron cooling ring would follow the stochastic cooling rings. Finally, antiprotons would be recycled during runs without leaving the collider ring by joining them to new bunches with synchrotron damping.

  19. Fusion of Hard and Soft Information in Nonparametric Density Estimation

    DTIC Science & Technology

    2015-06-10

    Density estimation is needed for the generation of input densities to simulation and stochastic optimization models, in the analysis of simulation output, and when instantiating probability models. We adopt a constrained maximum...

  20. Computational singular perturbation analysis of stochastic chemical systems with stiffness

    NASA Astrophysics Data System (ADS)

    Wang, Lijin; Han, Xiaoying; Cao, Yanzhao; Najm, Habib N.

    2017-04-01

    Computational singular perturbation (CSP) is a useful method for analysis, reduction, and time integration of stiff ordinary differential equation systems. It has found dominant utility, in particular, in chemical reaction systems with a large range of time scales at the continuum and deterministic level. On the other hand, CSP is not directly applicable to chemical reaction systems at micro or meso-scale, where stochasticity plays a non-negligible role and thus has to be taken into account. In this work we develop a novel stochastic computational singular perturbation (SCSP) analysis and time integration framework, and an associated algorithm, that can be used not only to construct accurately and efficiently the numerical solutions to stiff stochastic chemical reaction systems, but also to analyze the dynamics of the reduced stochastic reaction systems. The algorithm is illustrated by an application to a benchmark stochastic differential equation model, and numerical experiments are carried out to demonstrate the effectiveness of the construction.

  1. DFT and TD-DFT computation of charge transfer complex between o-phenylenediamine and 3,5-dinitrosalicylic acid

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Afroz, Ziya; Zulkarnain,; Ahmad, Afaq, E-mail: afaqahmad3@gmail.com

    2016-05-23

    DFT and TD-DFT studies of o-phenylenediamine (PDA), 3,5-dinitrosalicylic acid (DNSA) and their charge transfer complex have been carried out at the B3LYP/6-311G(d,p) level of theory. Molecular geometry and various other molecular properties like natural atomic charges, ionization potential, electron affinity, band gap, natural bond orbital (NBO) and frontier molecular orbital analysis have been presented at the same level of theory. Frontier molecular orbital and natural bond orbital analyses show charge delocalization from PDA to DNSA.

  2. Modular Electron Donor Group Tuning Of Frontier Energy Levels In Diarylaminofluorenone Push-Pull Molecules

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Homnick, Paul J.; Lahti, P. M.

    2012-01-01

    Push–pull organic molecules composed of electron donor diarylamines at the 2- and 2,7-positions of fluorenone exhibit intramolecular charge-transfer behaviour in static absorption and emission spectra. Electrochemical and spectral data combined in a modular electronic analysis model show how the donor HOMO and acceptor LUMO act as major determinants of the frontier molecular orbital energy levels.

  3. Exponential stability of impulsive stochastic genetic regulatory networks with time-varying delays and reaction-diffusion

    DOE PAGES

    Cao, Boqiang; Zhang, Qimin; Ye, Ming

    2016-11-29

    We present a mean-square exponential stability analysis for impulsive stochastic genetic regulatory networks (GRNs) with time-varying delays and reaction-diffusion driven by fractional Brownian motion (fBm). By constructing a Lyapunov functional and using linear matrix inequalities for stochastic analysis, we derive sufficient conditions that guarantee the exponential stability of the stochastic model of impulsive GRNs in the mean-square sense. Meanwhile, the corresponding results are obtained for GRNs with constant time delays and standard Brownian motion. Finally, an example is presented to illustrate the mean-square exponential stability results.

  4. Boosting Stochastic Problem Solvers Through Online Self-Analysis of Performance

    DTIC Science & Technology

    2003-07-21

    Boosting Stochastic Problem Solvers Through Online Self-Analysis of Performance. Vincent A. Cicirello, CMU-RI-TR-03-27. Submitted in partial fulfillment... lead to the development of a search control framework, called QD-BEACON, that uses online-generated statistical models of search performance to

  5. Stochastic simulation and analysis of biomolecular reaction networks

    PubMed Central

    Frazier, John M; Chushak, Yaroslav; Foy, Brent

    2009-01-01

    Background In recent years, several stochastic simulation algorithms have been developed to generate Monte Carlo trajectories that describe the time evolution of the behavior of biomolecular reaction networks. However, the effects of various stochastic simulation and data analysis conditions on the observed dynamics of complex biomolecular reaction networks have not received much attention. In order to investigate these issues, we employed a software package developed in our group, called Biomolecular Network Simulator (BNS), to simulate and analyze the behavior of such systems. The behavior of a hypothetical two-gene in vitro transcription-translation reaction network is investigated using the Gillespie exact stochastic algorithm to illustrate some of the factors that influence the analysis and interpretation of these data. Results Specific issues affecting the analysis and interpretation of simulation data are investigated, including: (1) the effect of time interval on data presentation and time-weighted averaging of molecule numbers, (2) the effect of the time-averaging interval on reaction rate analysis, (3) the effect of the number of simulations on the precision of model predictions, and (4) the implications of stochastic simulations for optimization procedures. Conclusion The two main factors affecting the analysis of stochastic simulations are: (1) the selection of time intervals to compute or average state variables and (2) the number of simulations generated to evaluate the system behavior. PMID:19534796
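
    The dwell-time weighting issue raised in point (1) can be illustrated with a minimal Gillespie simulation of a birth-death process; this is a generic sketch with invented rates, not the BNS package itself.

      import numpy as np

      # Gillespie SSA for a birth-death process (illustrative rates),
      # with time-weighted averaging of the molecule number: each state
      # is weighted by how long the system dwells in it.
      rng = np.random.default_rng(0)
      k_birth, k_death = 10.0, 0.5       # production and degradation rates
      t, T_end, x = 0.0, 200.0, 0
      weighted_sum, total_time = 0.0, 0.0
      while t < T_end:
          a1, a2 = k_birth, k_death * x  # reaction propensities
          a0 = a1 + a2
          tau = rng.exponential(1.0 / a0)            # time to next reaction
          dwell = min(tau, T_end - t)
          weighted_sum += x * dwell                  # dwell-time weighting
          total_time += dwell
          t += tau
          if rng.random() < a1 / a0:
              x += 1                                 # birth
          else:
              x -= 1                                 # death
      print("time-weighted mean copy number:", weighted_sum / total_time)
      print("theoretical stationary mean:", k_birth / k_death)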

  6. Computational singular perturbation analysis of stochastic chemical systems with stiffness

    DOE PAGES

    Wang, Lijin; Han, Xiaoying; Cao, Yanzhao; ...

    2017-01-25

    Computational singular perturbation (CSP) is a useful method for analysis, reduction, and time integration of stiff ordinary differential equation systems. It has found dominant utility, in particular, in chemical reaction systems with a large range of time scales at the continuum and deterministic level. On the other hand, CSP is not directly applicable to chemical reaction systems at micro or meso-scale, where stochasticity plays a non-negligible role and thus has to be taken into account. In this work we develop a novel stochastic computational singular perturbation (SCSP) analysis and time integration framework, and an associated algorithm, that can be used not only to construct accurately and efficiently the numerical solutions to stiff stochastic chemical reaction systems, but also to analyze the dynamics of the reduced stochastic reaction systems. Furthermore, the algorithm is illustrated by an application to a benchmark stochastic differential equation model, and numerical experiments are carried out to demonstrate the effectiveness of the construction.

  7. Economic Risk Analysis of Agricultural Tillage Systems Using the SMART Stochastic Efficiency Software Package

    USDA-ARS?s Scientific Manuscript database

    Recently, a variant of stochastic dominance called stochastic efficiency with respect to a function (SERF) has been developed and applied. Unlike traditional stochastic dominance approaches, SERF uses the concept of certainty equivalents (CEs) to rank a set of risk-efficient alternatives instead of...
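
    The core SERF computation is simple to sketch: for each level of absolute risk aversion, compute the certainty equivalent (CE) of each alternative's outcome distribution and keep the alternatives with the highest CE. The sketch below assumes negative-exponential utility; the net-return samples are invented.

      import numpy as np

      # SERF-style ranking by certainty equivalents under negative-exponential
      # utility, CE = -ln(E[exp(-r*w)])/r; data are hypothetical $/ha outcomes.
      rng = np.random.default_rng(42)
      alternatives = {
          "no-till":      rng.normal(120, 40, 1000),
          "conventional": rng.normal(110, 15, 1000),
      }

      def certainty_equivalent(returns, r):
          if r == 0.0:                    # risk neutrality: CE is the mean
              return returns.mean()
          return -np.log(np.mean(np.exp(-r * returns))) / r

      for r in (0.0, 0.01, 0.05):         # sweep over absolute risk aversion
          ces = {name: round(certainty_equivalent(x, r), 1)
                 for name, x in alternatives.items()}
          best = max(ces, key=ces.get)    # risk-efficient alternative at this r
          print("r =", r, "CEs:", ces, "risk-efficient choice:", best)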

  8. [Analysis of cost and efficiency of a medical nursing unit using time-driven activity-based costing].

    PubMed

    Lim, Ji Young; Kim, Mi Ja; Park, Chang Gi

    2011-08-01

    Time-driven activity-based costing was applied to analyze the nursing activity cost and efficiency of a medical unit. Data were collected at a medical unit of a general hospital. Nursing activities were measured using a nursing activities inventory and classified into 6 domains using the Easley-Storfjell Instrument. Descriptive statistics were used to identify general characteristics of the unit, nursing activities and activity time, and a stochastic frontier model was adopted to estimate true activity time. The average efficiency of the medical unit using theoretical resource capacity was 77%, whereas the efficiency using practical resource capacity was 96%. According to these results, the portions of non-value-added time were estimated at 23% and 4%, respectively. The total nursing activity costs were estimated at 109,860,977 won under traditional activity-based costing and 84,427,126 won under time-driven activity-based costing, a difference of 25,433,851 won. These results indicate that time-driven activity-based costing provides useful and more realistic information about the efficiency of unit operation compared to traditional activity-based costing, so time-driven activity-based costing is recommended as a performance evaluation framework for nursing departments based on cost management.
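
    The time-driven costing step itself reduces to a capacity cost rate multiplied by activity times; a back-of-envelope sketch with invented figures (not the study's data):

      # Time-driven activity-based costing (TDABC) in miniature:
      # cost rate = cost of supplied capacity / practical capacity.
      total_cost = 100_000_000          # cost of supplying nursing capacity (won)
      practical_minutes = 80_000        # practical resource capacity (minutes)
      rate = total_cost / practical_minutes   # capacity cost rate per minute

      activities = {"direct care": 45_000, "documentation": 20_000,
                    "transfer": 8_000}        # measured activity times (minutes)
      used = sum(activities.values())
      for name, minutes in activities.items():
          print(name, round(minutes * rate))  # cost assigned to each activity
      print("unused-capacity share:", round(1 - used / practical_minutes, 2))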

  9. Stability analysis for stochastic BAM nonlinear neural network with delays

    NASA Astrophysics Data System (ADS)

    Lv, Z. W.; Shu, H. S.; Wei, G. L.

    2008-02-01

    In this paper, stochastic bidirectional associative memory neural networks with constant or time-varying delays are considered. Based on a Lyapunov-Krasovskii functional and stochastic stability analysis theory, we derive several sufficient conditions which guarantee global asymptotic stability in the mean square. Our investigation shows that the stochastic bidirectional associative memory neural networks are globally asymptotically stable in the mean square if there are solutions to some linear matrix inequalities (LMIs). Hence, the global asymptotic stability of the stochastic bidirectional associative memory neural networks can be easily checked with the Matlab LMI toolbox. A numerical example is given to demonstrate the usefulness of the proposed global asymptotic stability criteria.

  10. Stability analysis of multi-group deterministic and stochastic epidemic models with vaccination rate

    NASA Astrophysics Data System (ADS)

    Wang, Zhi-Gang; Gao, Rui-Mei; Fan, Xiao-Ming; Han, Qi-Xing

    2014-09-01

    We discuss in this paper a deterministic multi-group MSIR epidemic model with a vaccination rate. The basic reproduction number ℛ0, a key parameter in epidemiology, is a threshold which determines the persistence or extinction of the disease. By using Lyapunov function techniques, we show that if ℛ0 is greater than 1 and the deterministic model obeys some conditions, then the disease prevails: the infective persists and the endemic state is asymptotically stable in a feasible region. If ℛ0 is less than or equal to 1, then the infectives disappear and the disease dies out. In addition, stochastic noises around the endemic equilibrium are added to the deterministic MSIR model so that it is extended to a system of stochastic ordinary differential equations. In the stochastic version, we carry out a detailed analysis of the asymptotic behavior of the stochastic model. Regarding the value of ℛ0, when the stochastic system obeys some conditions and ℛ0 is greater than 1, we deduce that the stochastic system is stochastically asymptotically stable. Finally, the deterministic and stochastic model dynamics are illustrated through computer simulations.

  11. Specialty and full-service hospitals: a comparative cost analysis.

    PubMed

    Carey, Kathleen; Burgess, James F; Young, Gary J

    2008-10-01

    To compare the costs of physician-owned cardiac, orthopedic, and surgical single specialty hospitals with those of full-service hospital competitors. The primary data sources are the Medicare Cost Reports for 1998-2004 and hospital inpatient discharge data for three of the states where single specialty hospitals are most prevalent: Texas, California, and Arizona. The latter were obtained from the Texas Department of State Health Services, the California Office of Statewide Health Planning and Development, and the Agency for Healthcare Research and Quality Healthcare Cost and Utilization Project. Additional data come from the American Hospital Association Annual Survey Database. We identified all physician-owned cardiac, orthopedic, and surgical specialty hospitals in these three states as well as all full-service acute care hospitals serving the same market areas, defined using Dartmouth Hospital Referral Regions. We estimated a hospital cost function using stochastic frontier regression analysis and generated hospital-specific inefficiency measures. t-Tests of significance compared the inefficiency measures of specialty hospitals with those of full-service hospitals to make general comparisons between these classes of hospitals. Results do not provide evidence that specialty hospitals are more efficient than the full-service hospitals with which they compete. In particular, orthopedic and surgical specialty hospitals appear to have significantly higher levels of cost inefficiency. Cardiac hospitals, however, do not appear to be different from competitors in this respect. Policymakers should not embrace the assumption that physician-owned specialty hospitals produce patient care more efficiently than their full-service hospital competitors.
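
    The estimation step can be sketched as a normal/half-normal stochastic cost frontier fitted by maximum likelihood. The single-regressor cost function and synthetic data below are purely illustrative; the study uses a richer specification with hospital covariates.

      import numpy as np
      from scipy.optimize import minimize
      from scipy.stats import norm

      # Normal/half-normal stochastic cost frontier, ln C = b0 + b1*ln y + v + u,
      # with inefficiency u >= 0 entering with a plus sign (cost frontier).
      rng = np.random.default_rng(7)
      n = 500
      ln_y = rng.normal(8, 1, n)                 # log output
      u = np.abs(rng.normal(0, 0.3, n))          # half-normal inefficiency
      v = rng.normal(0, 0.2, n)                  # symmetric noise
      ln_c = 1.0 + 0.8 * ln_y + v + u            # observed log cost

      def neg_loglik(theta):
          b0, b1, ln_su, ln_sv = theta
          su, sv = np.exp(ln_su), np.exp(ln_sv)  # enforce positive std devs
          sigma, lam = np.hypot(su, sv), su / sv
          eps = ln_c - b0 - b1 * ln_y            # composed residual v + u
          # cost-frontier density has positive skew, hence +eps*lam/sigma
          ll = (np.log(2.0 / sigma) + norm.logpdf(eps / sigma)
                + norm.logcdf(eps * lam / sigma))
          return -ll.sum()

      res = minimize(neg_loglik, x0=[0.0, 1.0, -1.0, -1.0],
                     method="Nelder-Mead", options={"maxiter": 20000})
      b0, b1, ln_su, ln_sv = res.x
      print("beta:", b0, b1, "sigma_u:", np.exp(ln_su), "sigma_v:", np.exp(ln_sv))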

  12. A New Strategy to Evaluate Technical Efficiency in Hospitals Using Homogeneous Groups of Casemix : How to Evaluate When There is Not DRGs?

    PubMed

    Villalobos-Cid, Manuel; Chacón, Max; Zitko, Pedro; Instroza-Ponta, Mario

    2016-04-01

    The public health system has restricted economic resources. Because of that, it is necessary to know how resources are being used and whether they are properly distributed. Several works have applied classical approaches based on Data Envelopment Analysis (DEA) and Stochastic Frontier Analysis (SFA) for this purpose. However, when hospitals have different casemixes, this is not the best approach. In order to avoid biases in the comparisons, other works have recommended the use of hospital production data corrected by the weights from Diagnosis Related Groups (DRGs) to adjust the casemix of hospitals. However, not all countries have this tool fully implemented, which limits the efficiency evaluation. This paper proposes a new approach for evaluating the efficiency of hospitals. It uses a graph-based clustering algorithm to find groups of hospitals that have similar production profiles. Then, DEA is used to evaluate the technical efficiency of each group. The proposed approach is tested using the 2014 production data of 193 Chilean public hospitals. The results allowed us to identify distinct performance profiles for each group, differing from other studies that employ data from partially implemented DRGs. Our results deliver a better description of the resource management of the different groups of hospitals. We have created a website with the results ( bioinformatic.diinf.usach.cl/publichealth ). Data can be requested from the authors.
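
    The within-group efficiency step is standard input-oriented DEA, one linear program per hospital. A minimal sketch with invented data (2 inputs, 1 output, 4 hospitals) follows; the paper additionally clusters hospitals by production profile before running DEA.

      import numpy as np
      from scipy.optimize import linprog

      # Input-oriented CCR DEA: for each unit j0, minimize theta subject to
      # X @ lam <= theta * x_j0 and Y @ lam >= y_j0, lam >= 0.
      X = np.array([[5., 8., 6., 9.],      # input 1 (e.g., beds)
                    [3., 6., 2., 7.]])     # input 2 (e.g., FTE staff)
      Y = np.array([[10., 12., 9., 14.]])  # output (e.g., discharges)

      def ccr_efficiency(j0):
          n = X.shape[1]
          c = np.r_[1.0, np.zeros(n)]                 # objective: theta
          A_in = np.c_[-X[:, j0], X]                  # X@lam - theta*x0 <= 0
          A_out = np.c_[np.zeros(Y.shape[0]), -Y]     # -Y@lam <= -y0
          A = np.vstack([A_in, A_out])
          b = np.r_[np.zeros(X.shape[0]), -Y[:, j0]]
          bounds = [(None, None)] + [(0, None)] * n   # theta free, lam >= 0
          res = linprog(c, A_ub=A, b_ub=b, bounds=bounds, method="highs")
          return res.x[0]

      for j in range(X.shape[1]):
          print(f"hospital {j}: efficiency = {ccr_efficiency(j):.3f}")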

  13. Efficiency and hospital effectiveness in improving Hospital Consumer Assessment of Healthcare Providers and Systems ratings.

    PubMed

    Al-Amin, Mona; Makarem, Suzanne C; Rosko, Michael

    2016-01-01

    Efficiency has emerged as a central goal to the operations of health care organizations. There are two competing perspectives on the relationship between efficiency and organizational performance. Some argue that organizational slack is a waste and that efficiency contributes to organizational performance, whereas others maintain that slack acts as a buffer, allowing organizations to adapt to environmental demands and contributing to organizational performance. As value-based purchasing becomes more prevalent, health care organizations are incented to become more efficient and, at the same time, improve their patients' experiences and outcomes. Unused slack resources might facilitate the timely implementation of these improvements. Building on previous research on organizational slack and inertia, we test whether efficiency and other organizational factors predict organizational effectiveness in improving Hospital Consumer Assessment of Healthcare Providers and Systems (HCAHPS) ratings. We rely on data from the American Hospital Association and HCAHPS. We estimate hospital cost-efficiency by Stochastic Frontier Analysis and use regression analysis to determine whether efficiency, competition, hospital size, and other organizational factors are significant predictors of hospital effectiveness. Our findings indicate that efficiency and hospital size have a significant negative association with organizational ability to improve HCAHPS ratings. Although achieving organizational efficiency is necessary for health care organizations, given the changes that are currently occurring in the U.S. health care system, it is important for health care managers to maintain a certain level of slack to respond to environmental demands and have the resources needed to improve their performance.

  14. Methods for High-Order Multi-Scale and Stochastic Problems Analysis, Algorithms, and Applications

    DTIC Science & Technology

    2016-10-17

    finite volume schemes, discontinuous Galerkin finite element method, and related methods, for solving computational fluid dynamics (CFD) problems and... approximation for finite element methods. (3) The development of methods of simulation and analysis for the study of large scale stochastic systems of... Keywords: conservation laws, finite element method, Bernstein-Bezier finite elements, weakly interacting particle systems, accelerated Monte Carlo, stochastic networks.

  15. Analysis of the stochastic excitability in the flow chemical reactor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bashkirtseva, Irina

    2015-11-30

    A dynamic model of the thermochemical process in a flow reactor is considered. We study the influence of random disturbances on the stationary regime of this model. A phenomenon of noise-induced excitability is demonstrated. For the analysis of this phenomenon, a constructive technique based on stochastic sensitivity functions and confidence domains is applied. It is shown how the elaborated technique can be used for the probabilistic analysis of the generation of mixed-mode stochastic oscillations in the flow chemical reactor.

  16. Analysis of the stochastic excitability in the flow chemical reactor

    NASA Astrophysics Data System (ADS)

    Bashkirtseva, Irina

    2015-11-01

    A dynamic model of the thermochemical process in a flow reactor is considered. We study the influence of random disturbances on the stationary regime of this model. A phenomenon of noise-induced excitability is demonstrated. For the analysis of this phenomenon, a constructive technique based on stochastic sensitivity functions and confidence domains is applied. It is shown how the elaborated technique can be used for the probabilistic analysis of the generation of mixed-mode stochastic oscillations in the flow chemical reactor.

  17. Historic Frontier Processes active in Future Space-Based Mineral Extraction

    NASA Astrophysics Data System (ADS)

    Gray, D. M.

    2000-01-01

    The forces that shaped historic mining frontiers are in many cases not bound by geographic or temporal limits. The forces that helped define historic frontiers are active in today's physical and virtual frontiers, and will be present in future space-based frontiers. While frontiers derived from position and technology are primarily economic in nature, non-economic conditions affect the success or failure of individual frontier endeavors, local "mining camps" and even entire frontiers. Frontiers can be defined as the line of activity that divides the established markets and infrastructure of civilization from the unclaimed resources and potential wealth of a wilderness. At the frontier line, ownership of resources is established. The resource can then be developed using capital, energy and information. In a mining setting, the resource is concentrated for economic shipment to the markets of civilization. Profits from the sale of the resource are then used to fund further development of the resource and/or pay investors. Both positional and technical frontiers develop as a series of generations. The profits from each generation of development provides the capital and/or investment incentive for the next round of development. Without profit, the self-replicating process of frontiers stops.

  18. Application of an NLME-Stochastic Deconvolution Approach to Level A IVIVC Modeling.

    PubMed

    Kakhi, Maziar; Suarez-Sharp, Sandra; Shepard, Terry; Chittenden, Jason

    2017-07-01

    Stochastic deconvolution is a parameter estimation method that calculates drug absorption using a nonlinear mixed-effects model in which the random effects associated with absorption represent a Wiener process. The present work compares (1) stochastic deconvolution and (2) numerical deconvolution, using clinical pharmacokinetic (PK) data generated for an in vitro-in vivo correlation (IVIVC) study of extended release (ER) formulations of a Biopharmaceutics Classification System class III drug substance. The preliminary analysis found that numerical and stochastic deconvolution yielded superimposable fraction absorbed (F_abs) versus time profiles when supplied with exactly the same externally determined unit impulse response parameters. In a separate analysis, a full population-PK/stochastic deconvolution was applied to the clinical PK data. Scenarios were considered in which immediate release (IR) data were either retained or excluded to inform parameter estimation. The resulting F_abs profiles were then used to model level A IVIVCs. All the considered stochastic deconvolution scenarios, and numerical deconvolution, yielded on average similar results with respect to the IVIVC validation. These results could be achieved with stochastic deconvolution without recourse to IR data. Unlike numerical deconvolution, this also implies that in crossover studies where certain individuals do not receive an IR treatment, their ER data alone can still be included as part of the IVIVC analysis. Published by Elsevier Inc.
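
    Numerical deconvolution, the comparison method, can be sketched as solving a discretized convolution equation for the input rate given a known unit impulse response (UIR). The one-compartment UIR and all constants below are hypothetical.

      import numpy as np

      # Toy numerical deconvolution: plasma concentration is the convolution
      # of the absorption input rate with the UIR; invert it by least squares.
      dt = 0.5                                   # sampling interval (h)
      t = np.arange(0, 24, dt)
      uir = np.exp(-0.3 * t)                     # one-compartment UIR, kel = 0.3/h

      true_rate = 0.08 * np.exp(-0.15 * t)       # "unknown" first-order input
      conc = np.convolve(uir, true_rate)[:len(t)] * dt   # simulated profile
      conc += np.random.default_rng(3).normal(0, 1e-3, len(t))  # assay noise

      # lower-triangular convolution matrix: conc = A @ rate
      A = np.array([[uir[i - j] if i >= j else 0.0 for j in range(len(t))]
                    for i in range(len(t))]) * dt
      rate_hat, *_ = np.linalg.lstsq(A, conc, rcond=None)

      f_abs = np.cumsum(rate_hat) * dt           # cumulative amount absorbed
      print("fraction of total input absorbed by 12 h:",
            round(f_abs[int(12 / dt)] / f_abs[-1], 2))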

  19. Symbolic bones and interethnic violence in a frontier zone, northwest Mexico, ca. 500–900 C.E.

    PubMed Central

    Nelson, Ben A.; Martin, Debra L.

    2015-01-01

    Although extensive deposits of disarticulated, commingled human bones are common in the prehispanic Northern Frontier of Mesoamerica, detailed bioarchaeological analyses of them are not. To our knowledge, this article provides the first such analysis of bone from a full residential-ceremonial complex and evaluates multiple hypotheses about its significance, concluding that the bones actively represented interethnic violence as well as other relationships among persons living and dead. Description of these practices is important to the discussion of multiethnic societies because the frontier was a context where urbanism and complexity were emerging and groups with the potential to form multiethnic societies were interacting, possibly in the same ways that groups did before the formation of larger multiethnic city-states in the core of Mesoamerica. PMID:25941398

  20. Symbolic bones and interethnic violence in a frontier zone, northwest Mexico, ca. 500-900 C.E.

    PubMed

    Nelson, Ben A; Martin, Debra L

    2015-07-28

    Although extensive deposits of disarticulated, commingled human bones are common in the prehispanic Northern Frontier of Mesoamerica, detailed bioarchaeological analyses of them are not. To our knowledge, this article provides the first such analysis of bone from a full residential-ceremonial complex and evaluates multiple hypotheses about its significance, concluding that the bones actively represented interethnic violence as well as other relationships among persons living and dead. Description of these practices is important to the discussion of multiethnic societies because the frontier was a context where urbanism and complexity were emerging and groups with the potential to form multiethnic societies were interacting, possibly in the same ways that groups did before the formation of larger multiethnic city-states in the core of Mesoamerica.

  1. Are large farms more efficient? Tenure security, farm size and farm efficiency: evidence from northeast China

    NASA Astrophysics Data System (ADS)

    Zhou, Yuepeng; Ma, Xianlei; Shi, Xiaoping

    2017-04-01

    How to increase production efficiency, guarantee grain security, and increase farmers' income using the limited farmland is a great challenge that China is facing. Although theory predicts that secure property rights and moderate-scale management of farmland can increase land productivity, reduce farm-related costs, and raise farmers' income, empirical studies on the size and magnitude of these effects are scarce. A number of studies have examined the impacts of land tenure or farm size on productivity or efficiency, respectively. There are also a few studies linking farm size, land tenure and efficiency together. However, to the best of our knowledge, no studies consider tenure security and farm efficiency together across different farm scales in China, and few analyze the profit frontier. In this study, we focus on the impacts of land tenure security and farm size on farm profit efficiency, using farm-level data collected from 23 villages and 811 households in Liaoning in 2015. Seven different farm scales have been identified to represent small, median, moderate-scale, and large farms. Technical efficiency is analyzed with a stochastic frontier production function. The profit efficiency is regressed on a set of explanatory variables which includes farm size dummies, land tenure security indexes, and household characteristics. We found that: 1) The technical efficiency scores for production efficiency (average score = 0.998) indicate that farms are already very close to the production frontier, so there is little room to improve production efficiency. However, there is larger space to raise profit efficiency (average score = 0.768) by investing more in farm size expansion, seed, hired labor, pesticide, and irrigation. 2) Farms between 50-80 mu are most efficient from the viewpoint of profit efficiency. The so-called moderate-scale farms (100-150 mu) according to the governmental guideline show no advantage in efficiency. 3) Formal land certificates and farmers' participation in the land rental market are found to be important determinants of profit efficiency across different scales of farms. 4) Fertilizer use has been excessive in Liaoning and could lead to a decline in crop profit.

  2. Irksome and Unpopular Duties: Pakistan’s Frontier Corps, Local Security Forces and Counterinsurgency

    DTIC Science & Technology

    2012-05-01

    the Karakorum mountain range in North West Frontier Province (now Khyber Pakhtunkhwa) to the Makran coast in Balochistan. The Khan of Lalpura and... the corps along geographical lines, creating Frontier Corps-North West Frontier Province and Frontier Corps-Balochistan. Pakistan also created... major combat operations, including the Indo-Pakistani wars of 1948, 1965, and 1971. The Frontier Corps fought against separatists in Balochistan in the

  3. NCI Releases Video: Proteogenomics Research - On the Frontier of Precision Medicine | Office of Cancer Clinical Proteomics Research

    Cancer.gov

    The Clinical Proteomic Tumor Analysis Consortium (CPTAC) of the National Cancer Institute (NCI), part of the National Institutes of Health, announces the release of an educational video titled “Proteogenomics Research: On the Frontier of Precision Medicine.” Launched at the HUPO2017 Global Leadership Gala Dinner, the video was catalyzed in part by the Cancer Moonshot initiative and features as keynote speaker the 47th Vice President of the United States of America, Joseph R. Biden.

  4. Critical frontier of the Potts and percolation models on triangular-type and kagome-type lattices. II. Numerical analysis

    NASA Astrophysics Data System (ADS)

    Ding, Chengxiang; Fu, Zhe; Guo, Wenan; Wu, F. Y.

    2010-06-01

    In the preceding paper, one of us (F. Y. Wu) considered the Potts model and bond and site percolation on two general classes of two-dimensional lattices, the triangular-type and kagome-type lattices, and obtained closed-form expressions for the critical frontier with applications to various lattice models. For the triangular-type lattices Wu's result is exact, and for the kagome-type lattices Wu's expression is under a homogeneity assumption. The purpose of the present paper is twofold: First, an essential step in Wu's analysis is the derivation of lattice-dependent constants A, B, C for various lattice models, a process which can be tedious. We present here a derivation of these constants for subnet networks using a computer algorithm. Second, by means of a finite-size scaling analysis based on numerical transfer matrix calculations, we deduce critical properties and critical thresholds of various models and assess the accuracy of the homogeneity assumption. Specifically, we analyze the q-state Potts model and bond percolation on the 3-12 and kagome-type subnet lattices (n×n):(n×n), n≤4, for which the exact solution is not known. Our numerical determination of critical properties such as the conformal anomaly and magnetic correlation length verifies that the universality principle holds. To calibrate the accuracy of the finite-size procedure, we apply the same numerical analysis to models for which the exact critical frontiers are known. The comparison of numerical and exact results shows that our numerical values are correct within the errors of our finite-size analysis, which correspond to 7 or 8 significant digits. This in turn implies that the homogeneity assumption determines critical frontiers with an accuracy of 5 decimal places or higher. Finally, we also obtained the exact percolation thresholds for site percolation on kagome-type subnet lattices (1×1):(n×n) for 1≤n≤6.

  5. Methods of Stochastic Analysis of Complex Regimes in the 3D Hindmarsh-Rose Neuron Model

    NASA Astrophysics Data System (ADS)

    Bashkirtseva, Irina; Ryashko, Lev; Slepukhina, Evdokia

    A problem of the stochastic nonlinear analysis of neuronal activity is studied by the example of the Hindmarsh-Rose (HR) model. For the parametric region of tonic spiking oscillations, it is shown that random noise transforms the spiking dynamic regime into the bursting one. This stochastic phenomenon is specified by qualitative changes in distributions of random trajectories and interspike intervals (ISIs). For a quantitative analysis of the noise-induced bursting, we suggest a constructive semi-analytical approach based on the stochastic sensitivity function (SSF) technique and the method of confidence domains that allows us to describe geometrically a distribution of random states around the deterministic attractors. Using this approach, we develop a new algorithm for estimation of critical values for the noise intensity corresponding to the qualitative changes in stochastic dynamics. We show that the obtained estimations are in good agreement with the numerical results. An interplay between noise-induced bursting and transitions from order to chaos is discussed.
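
    An Euler-Maruyama simulation of the stochastic HR model with a crude interspike-interval (ISI) extraction gives a feel for the quantities analyzed. The parameter values are the common textbook ones, the noise level and spike threshold are illustrative, and the dynamical regime depends on the current I.

      import numpy as np

      # Stochastic Hindmarsh-Rose model with noise on the membrane variable.
      rng = np.random.default_rng(5)
      a, b, c, d, s, xr, r, I = 1.0, 3.0, 1.0, 5.0, 4.0, -1.6, 0.006, 1.4
      sig, dt, T = 0.05, 0.005, 500.0
      x, y, z = -1.0, 0.0, 1.8
      spikes, above = [], False
      for k in range(int(T / dt)):
          dW = rng.normal(0.0, np.sqrt(dt))
          x_new = x + (y - a * x**3 + b * x**2 - z + I) * dt + sig * dW
          y += (c - d * x**2 - y) * dt
          z += r * (s * (x - xr) - z) * dt
          x = x_new
          if x > 1.0 and not above:       # upward threshold crossing = spike
              spikes.append(k * dt)
              above = True
          elif x < 0.0:
              above = False
      if len(spikes) > 1:
          isi = np.diff(spikes)
          print("spikes:", len(spikes), "mean ISI:", isi.mean(),
                "ISI coefficient of variation:", isi.std() / isi.mean())
      else:
          print("no spikes detected for these parameters")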

  6. A Hybrid Stochastic-Neuro-Fuzzy Model-Based System for In-Flight Gas Turbine Engine Diagnostics

    DTIC Science & Technology

    2001-04-05

    The paper illustrates the application of a hybrid Stochastic-Fuzzy-Inference Model-Based System (StoFIS) to fault diagnostics and prognostics for both... operational history monitored on-line by the engine health management (EHM) system. To capture the complex functional relationships between different... Margin (ADM) and (ii) Fault Detection Margin (FDM). Key words: ANFIS, engine health monitoring, gas path analysis, stochastic analysis, adaptive network...

  7. Stochastic Investigation of Natural Frequency for Functionally Graded Plates

    NASA Astrophysics Data System (ADS)

    Karsh, P. K.; Mukhopadhyay, T.; Dey, S.

    2018-03-01

    This paper presents the stochastic natural frequency analysis of functionally graded plates by applying an artificial neural network (ANN) approach. Latin hypercube sampling is utilised to train the ANN model. The proposed algorithm for stochastic natural frequency analysis of FGM plates is validated and verified against the original finite element method and Monte Carlo simulation (MCS). The combined stochastic variation of input parameters such as elastic modulus, shear modulus, Poisson's ratio, and mass density is considered. A power law is applied to distribute the material properties across the thickness. The present ANN model reduces the sample size and is found to be computationally efficient compared to conventional Monte Carlo simulation.
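
    The design-of-experiments step can be sketched with SciPy's Latin hypercube sampler; the four input bounds below are invented stand-ins for the material properties named in the abstract.

      import numpy as np
      from scipy.stats import qmc

      # Latin hypercube design over the uncertain material inputs; each (X, f)
      # pair from a finite element run would then train the ANN surrogate.
      sampler = qmc.LatinHypercube(d=4, seed=11)
      unit = sampler.random(n=128)                # 128 points in [0, 1)^4

      # scale to [low, high]: E (GPa), G (GPa), Poisson's ratio, density (kg/m^3)
      low  = [60.0, 23.0, 0.26, 2600.0]
      high = [90.0, 35.0, 0.34, 2800.0]
      X = qmc.scale(unit, low, high)
      print(X.shape, X.min(axis=0), X.max(axis=0))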

  8. Learning to Be Homesteaders: Frontier Women in Oklahoma

    ERIC Educational Resources Information Center

    Smith, Joan

    2010-01-01

    In "The Female Frontier" (1988), Glenda Riley notes that the typical historical account of life on the frontier puts men at the center of the experience. In contrast to a male frontier thesis, Riley posits that women played highly significant, though largely domestic, roles in the settling and development of the frontier, and that…

  9. Stochastic computing with biomolecular automata

    PubMed Central

    Adar, Rivka; Benenson, Yaakov; Linshiz, Gregory; Rosner, Amit; Tishby, Naftali; Shapiro, Ehud

    2004-01-01

    Stochastic computing has a broad range of applications, yet electronic computers realize its basic step, stochastic choice between alternative computation paths, in a cumbersome way. Biomolecular computers use a different computational paradigm and hence afford novel designs. We constructed a stochastic molecular automaton in which stochastic choice is realized by means of competition between alternative biochemical pathways, and choice probabilities are programmed by the relative molar concentrations of the software molecules coding for the alternatives. Programmable and autonomous stochastic molecular automata have been shown to perform direct analysis of disease-related molecular indicators in vitro and may have the potential to provide in situ medical diagnosis and cure. PMID:15215499

  10. Analysis of a novel stochastic SIRS epidemic model with two different saturated incidence rates

    NASA Astrophysics Data System (ADS)

    Chang, Zhengbo; Meng, Xinzhu; Lu, Xiao

    2017-04-01

    This paper presents a stochastic SIRS epidemic model with two different nonlinear incidence rates and a double epidemic asymmetrical hypothesis, and develops a mathematical method to obtain the threshold of the stochastic epidemic model. We first investigate the boundedness and extinction of the stochastic system. Furthermore, we use Ito's formula, the comparison theorem and some new inequality techniques for stochastic differential systems to discuss persistence in mean of the two diseases in three cases. The results indicate that stochastic fluctuations can suppress the disease outbreak. Finally, numerical simulations with different noise disturbance coefficients are carried out to illustrate the obtained theoretical results.

  11. Convolutionless Nakajima-Zwanzig equations for stochastic analysis in nonlinear dynamical systems.

    PubMed

    Venturi, D; Karniadakis, G E

    2014-06-08

    Determining the statistical properties of stochastic nonlinear systems is of major interest across many disciplines. Currently, there are no general efficient methods to deal with this challenging problem that involves high dimensionality, low regularity and random frequencies. We propose a framework for stochastic analysis in nonlinear dynamical systems based on goal-oriented probability density function (PDF) methods. The key idea stems from techniques of irreversible statistical mechanics, and it relies on deriving evolution equations for the PDF of quantities of interest, e.g. functionals of the solution to systems of stochastic ordinary and partial differential equations. Such quantities could be low-dimensional objects in infinite dimensional phase spaces. We develop the goal-oriented PDF method in the context of the time-convolutionless Nakajima-Zwanzig-Mori formalism. We address the question of approximation of reduced-order density equations by multi-level coarse graining, perturbation series and operator cumulant resummation. Numerical examples are presented for stochastic resonance and stochastic advection-reaction problems.

  12. Stochastic Spiking Neural Networks Enabled by Magnetic Tunnel Junctions: From Nontelegraphic to Telegraphic Switching Regimes

    NASA Astrophysics Data System (ADS)

    Liyanagedera, Chamika M.; Sengupta, Abhronil; Jaiswal, Akhilesh; Roy, Kaushik

    2017-12-01

    Stochastic spiking neural networks based on nanoelectronic spin devices can be a possible pathway to achieving "brainlike" compact and energy-efficient cognitive intelligence. These computational models attempt to exploit the intrinsic device stochasticity of nanoelectronic synaptic or neural components to perform learning or inference. However, there has been limited analysis of the scaling effect of stochastic spin devices and its impact on the operation of such stochastic networks at the system level. This work attempts to explore the design space and analyze the performance of nanomagnet-based stochastic neuromorphic computing architectures for magnets with different barrier heights. We illustrate how the underlying network architecture must be modified to account for the random telegraphic switching behavior displayed by magnets with low barrier heights as they are scaled into the superparamagnetic regime. We perform a device-to-system-level analysis on a deep neural-network architecture for a digit-recognition problem on the MNIST data set.

  13. Convolutionless Nakajima–Zwanzig equations for stochastic analysis in nonlinear dynamical systems

    PubMed Central

    Venturi, D.; Karniadakis, G. E.

    2014-01-01

    Determining the statistical properties of stochastic nonlinear systems is of major interest across many disciplines. Currently, there are no general efficient methods to deal with this challenging problem that involves high dimensionality, low regularity and random frequencies. We propose a framework for stochastic analysis in nonlinear dynamical systems based on goal-oriented probability density function (PDF) methods. The key idea stems from techniques of irreversible statistical mechanics, and it relies on deriving evolution equations for the PDF of quantities of interest, e.g. functionals of the solution to systems of stochastic ordinary and partial differential equations. Such quantities could be low-dimensional objects in infinite dimensional phase spaces. We develop the goal-oriented PDF method in the context of the time-convolutionless Nakajima–Zwanzig–Mori formalism. We address the question of approximation of reduced-order density equations by multi-level coarse graining, perturbation series and operator cumulant resummation. Numerical examples are presented for stochastic resonance and stochastic advection–reaction problems. PMID:24910519

  14. A study on technical efficiency of a DMU (review of literature)

    NASA Astrophysics Data System (ADS)

    Venkateswarlu, B.; Mahaboob, B.; Subbarami Reddy, C.; Sankar, J. Ravi

    2017-11-01

    In this research paper the concept of technical efficiency (due to Farrell) [1] of a decision making unit (DMU) is introduced and measures of technical and cost efficiency are derived. Timmer's [2] deterministic approach to estimating the Cobb-Douglas production frontier is proposed, together with the idea of extending Timmer's [2] method to any production frontier that is linear in parameters. The estimation of the parameters of the Cobb-Douglas production frontier by a linear programming approach is discussed in this paper. Mark et al. [3] proposed a non-parametric method to assess efficiency. Nuti et al. [4] investigated the relationships among technical efficiency scores, weighted per capita cost and overall performance. Gahe Zing Samuel Yank et al. [5] used Data Envelopment Analysis to assess technical efficiency in banking sectors.
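
    Timmer's linear programming idea can be sketched directly: choose frontier coefficients that envelop all observations in logs while minimizing the total slack. The data below are synthetic and the recovered coefficients and Farrell-type efficiencies are for illustration only.

      import numpy as np
      from scipy.optimize import linprog

      # LP fit of a deterministic Cobb-Douglas frontier:
      # minimize sum_i (b0 + b' lx_i) subject to b0 + b' lx_i >= ln y_i.
      rng = np.random.default_rng(9)
      n = 60
      lx = rng.uniform(0, 2, (n, 2))                  # log inputs
      ly = 0.5 + lx @ np.array([0.6, 0.3]) - np.abs(rng.normal(0, 0.2, n))

      Z = np.c_[np.ones(n), lx]                       # design [1, ln x1, ln x2]
      c = Z.sum(axis=0)                               # slack sum, up to a constant
      res = linprog(c, A_ub=-Z, b_ub=-ly,             # frontier envelops the data
                    bounds=[(None, None)] * 3, method="highs")
      b0, b1, b2 = res.x
      print("frontier coefficients:", round(b0, 3), round(b1, 3), round(b2, 3))
      eff = np.exp(ly - Z @ res.x)                    # Farrell-type efficiency <= 1
      print("mean technical efficiency:", round(eff.mean(), 3))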

  15. Note on Professor Sizer's Paper.

    ERIC Educational Resources Information Center

    Balderston, Frederick E.

    1979-01-01

    Issues suggested by John Sizer's paper, an overview of the assessment of institutional performance, include: the efficient-frontier approach, multiple-criterion decision-making models, performance analysis approached as path analysis, and assessment of academic quality. (JMD)

  16. Design of a High Luminosity 100 TeV Proton-Antiproton Collider

    NASA Astrophysics Data System (ADS)

    Oliveros Tautiva, Sandra Jimena

    Currently new physics is being explored with the Large Hadron Collider at CERN and with Intensity Frontier programs at Fermilab and KEK. The energy scale for new physics is known to be in the multi-TeV range, signaling the need for a future collider which well surpasses this energy scale. A 10^34 cm^-2 s^-1 luminosity 100 TeV proton-antiproton collider is explored with 7× the energy of the LHC. The dipoles are 4.5 T to reduce cost. A proton-antiproton collider is selected as a future machine for several reasons. The cross section for many high mass states is 10 times higher in p̄p than pp collisions. Antiquarks for production can come directly from an antiproton rather than indirectly from gluon splitting. The higher cross sections reduce the synchrotron radiation in superconducting magnets and the number of events per bunch crossing, because lower beam currents can produce the same rare event rates. Events are also more centrally produced, allowing a more compact detector with less space between quadrupole triplets and a smaller β* for higher luminosity. To adjust to antiproton beam losses (burn rate), a Fermilab-like antiproton source would be adapted to disperse the beam into 12 different momentum channels, using electrostatic septa, to increase antiproton momentum capture 12 times. At Fermilab, antiprotons were stochastically cooled in one Debuncher and one Accumulator ring. Because the stochastic cooling time scales as the number of particles, two options of 12 independent cooling systems are presented. One electron cooling ring might follow the stochastic cooling rings for antiproton stacking. Finally, antiprotons in the collider ring would be recycled during runs without leaving the collider ring, by joining them to new bunches with snap bunch coalescence and synchrotron damping. These basic ideas are explored in this work on a future 100 TeV proton-antiproton collider and the main parameters are presented.

  17. Design of a High Luminosity 100 TeV Proton Antiproton Collider

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oliveros Tuativa, Sandra Jimena

    2017-04-01

    Currently new physics is being explored with the Large Hadron Collider at CERN and with Intensity Frontier programs at Fermilab and KEK. The energy scale for new physics is known to be in the multi-TeV range, signaling the need for a future collider which well surpasses this energy scale. A $10^{34}$ cm$^{-2}$ s$^{-1}$ luminosity 100 TeV proton-antiproton collider is explored with 7$\times$ the energy of the LHC. The dipoles are 4.5 T to reduce cost. A proton-antiproton collider is selected as a future machine for several reasons. The cross section for many high mass states is 10 times higher in $p\bar{p}$ than $pp$ collisions. Antiquarks for production can come directly from an antiproton rather than indirectly from gluon splitting. The higher cross sections reduce the synchrotron radiation in superconducting magnets and the number of events per bunch crossing, because lower beam currents can produce the same rare event rates. Events are also more centrally produced, allowing a more compact detector with less space between quadrupole triplets and a smaller $\beta^{*}$ for higher luminosity. To adjust to antiproton beam losses (burn rate), a Fermilab-like antiproton source would be adapted to disperse the beam into 12 different momentum channels, using electrostatic septa, to increase antiproton momentum capture 12 times. At Fermilab, antiprotons were stochastically cooled in one Debuncher and one Accumulator ring. Because the stochastic cooling time scales as the number of particles, two options of 12 independent cooling systems are presented. One electron cooling ring might follow the stochastic cooling rings for antiproton stacking. Finally, antiprotons in the collider ring would be recycled during runs without leaving the collider ring, by joining them to new bunches with snap bunch coalescence and synchrotron damping. These basic ideas are explored in this work on a future 100 TeV proton-antiproton collider and the main parameters are presented.

  18. Efficient Discovery of De-identification Policies Through a Risk-Utility Frontier

    PubMed Central

    Xia, Weiyi; Heatherly, Raymond; Ding, Xiaofeng; Li, Jiuyong; Malin, Bradley

    2014-01-01

    Modern information technologies enable organizations to capture large quantities of person-specific data while providing routine services. Many organizations hope, or are legally required, to share such data for secondary purposes (e.g., validation of research findings) in a de-identified manner. In previous work, it was shown that de-identification policy alternatives could be modeled on a lattice, which could be searched for policies that met a prespecified risk threshold (e.g., likelihood of re-identification). However, the search was limited in several ways. First, its definition of utility was syntactic - based on the level of the lattice - and not semantic - based on the actual changes induced in the resulting data. Second, the threshold may not be known in advance. The goal of this work is to build the optimal set of policies that trade off between privacy risk (R) and utility (U), which we refer to as an R-U frontier. To model this problem, we introduce a semantic definition of utility, based on information theory, that is compatible with the lattice representation of policies. To solve the problem, we initially build a set of policies that define a frontier. We then use a probability-guided heuristic to search the lattice for policies likely to update the frontier. To demonstrate the effectiveness of our approach, we perform an empirical analysis with the Adult dataset of the UCI Machine Learning Repository. We show that our approach can construct a frontier closer to optimal than competitive approaches by searching a smaller number of policies. In addition, we show that a frequently followed de-identification policy (i.e., the Safe Harbor standard of the HIPAA Privacy Rule) is suboptimal in comparison to the frontier discovered by our approach. PMID:25520961
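
    Stripped of the lattice search, the frontier itself is simply the set of policies not dominated in (risk, utility). A sketch with invented policy scores follows; the paper's contribution is the semantic utility measure and the probability-guided lattice search, not this brute enumeration.

      # Risk-utility (R-U) frontier: keep policies for which no other policy
      # has both lower risk and higher utility. Scores are made up.
      policies = [("P1", 0.9, 0.95), ("P2", 0.5, 0.80), ("P3", 0.5, 0.60),
                  ("P4", 0.2, 0.55), ("P5", 0.1, 0.20), ("P6", 0.7, 0.70)]

      def frontier(points):
          # sort by risk ascending, utility descending; a sweep keeping the
          # best utility seen so far yields the non-dominated set
          pts = sorted(points, key=lambda p: (p[1], -p[2]))
          front, best_u = [], float("-inf")
          for name, risk, util in pts:
              if util > best_u:
                  front.append((name, risk, util))
                  best_u = util
          return front

      print(frontier(policies))
      # -> [('P5', 0.1, 0.2), ('P4', 0.2, 0.55), ('P2', 0.5, 0.8), ('P1', 0.9, 0.95)]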

  19. Efficient Discovery of De-identification Policies Through a Risk-Utility Frontier.

    PubMed

    Xia, Weiyi; Heatherly, Raymond; Ding, Xiaofeng; Li, Jiuyong; Malin, Bradley

    2013-01-01

    Modern information technologies enable organizations to capture large quantities of person-specific data while providing routine services. Many organizations hope, or are legally required, to share such data for secondary purposes (e.g., validation of research findings) in a de-identified manner. In previous work, it was shown that de-identification policy alternatives could be modeled on a lattice, which could be searched for policies that met a prespecified risk threshold (e.g., likelihood of re-identification). However, the search was limited in several ways. First, its definition of utility was syntactic - based on the level of the lattice - and not semantic - based on the actual changes induced in the resulting data. Second, the threshold may not be known in advance. The goal of this work is to build the optimal set of policies that trade off between privacy risk (R) and utility (U), which we refer to as an R-U frontier. To model this problem, we introduce a semantic definition of utility, based on information theory, that is compatible with the lattice representation of policies. To solve the problem, we initially build a set of policies that define a frontier. We then use a probability-guided heuristic to search the lattice for policies likely to update the frontier. To demonstrate the effectiveness of our approach, we perform an empirical analysis with the Adult dataset of the UCI Machine Learning Repository. We show that our approach can construct a frontier closer to optimal than competitive approaches by searching a smaller number of policies. In addition, we show that a frequently followed de-identification policy (i.e., the Safe Harbor standard of the HIPAA Privacy Rule) is suboptimal in comparison to the frontier discovered by our approach.

  20. Stochastic resonance energy harvesting for a rotating shaft subject to random and periodic vibrations: influence of potential function asymmetry and frequency sweep

    NASA Astrophysics Data System (ADS)

    Kim, Hongjip; Che Tai, Wei; Zhou, Shengxi; Zuo, Lei

    2017-11-01

    Stochastic resonance refers to a physical phenomenon in nonlinear systems whereby a weak periodic signal can be significantly amplified with the aid of inherent noise, or vice versa. In this paper, stochastic resonance is exploited to harvest energy from two typical vibrations in rotating shafts: random whirl vibration and periodic stick-slip vibration. Stick-slip vibrations impose a constant offset in centrifugal force and distort the potential function of the harvester, leading to potential function asymmetry. A numerical analysis based on a finite element method was conducted to investigate stochastic resonance with potential function asymmetry. Simulation results revealed that a harvester with a symmetric potential function generates seven times more power than one with an asymmetric potential function. Furthermore, a frequency-sweep analysis showed that stochastic resonance exhibits hysteretic behavior, resulting in a frequency difference between up-sweep and down-sweep excitations. An electromagnetic energy harvesting system was constructed to verify the numerical analysis experimentally. In contrast to traditional stochastic resonance harvesters, the proposed harvester uses magnetic force to compensate for the offset in the centrifugal force. System identification was performed to obtain the parameters needed in the numerical analysis. With the identified parameters, the numerical simulations showed good agreement with the experimental results, with around 10% error, which verified the effect of potential function asymmetry and frequency-sweep excitation conditions on stochastic resonance. Finally, owing to the compensation of the centrifugal force offset, the proposed harvester generated nearly three times more open-circuit output voltage than its traditional counterpart.
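
    The underlying mechanism can be sketched with a noisy, weakly forced double-well oscillator; an asymmetry term stands in for the centrifugal-force offset. All parameters are illustrative, not identified from the experiment.

      import numpy as np

      # Euler-Maruyama simulation of a bistable oscillator under a weak
      # periodic drive plus noise; interwell crossings indicate resonance.
      rng = np.random.default_rng(2)
      damping, A, w, D, asym = 0.5, 0.12, 0.3, 0.35, 0.0  # set asym != 0 to tilt a well
      dt, T = 0.01, 1000.0
      x, v, hops = 1.0, 0.0, 0
      for k in range(int(T / dt)):
          force = x - x**3 + asym            # -dU/dx for U = -x^2/2 + x^4/4 - asym*x
          drive = A * np.cos(w * k * dt)     # weak periodic signal
          v += (force - damping * v + drive) * dt + np.sqrt(2 * D * dt) * rng.normal()
          x_old = x
          x += v * dt
          if x_old * x < 0:                  # crossing between the two wells
              hops += 1
      print("interwell crossings:", hops)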

  1. CERENA: ChEmical REaction Network Analyzer--A Toolbox for the Simulation and Analysis of Stochastic Chemical Kinetics.

    PubMed

    Kazeroonian, Atefeh; Fröhlich, Fabian; Raue, Andreas; Theis, Fabian J; Hasenauer, Jan

    2016-01-01

    Gene expression, signal transduction and many other cellular processes are subject to stochastic fluctuations. The analysis of these stochastic chemical kinetics is important for understanding cell-to-cell variability and its functional implications, but it is also challenging. A multitude of exact and approximate descriptions of stochastic chemical kinetics have been developed, however, tools to automatically generate the descriptions and compare their accuracy and computational efficiency are missing. In this manuscript we introduced CERENA, a toolbox for the analysis of stochastic chemical kinetics using Approximations of the Chemical Master Equation solution statistics. CERENA implements stochastic simulation algorithms and the finite state projection for microscopic descriptions of processes, the system size expansion and moment equations for meso- and macroscopic descriptions, as well as the novel conditional moment equations for a hybrid description. This unique collection of descriptions in a single toolbox facilitates the selection of appropriate modeling approaches. Unlike other software packages, the implementation of CERENA is completely general and allows, e.g., for time-dependent propensities and non-mass action kinetics. By providing SBML import, symbolic model generation and simulation using MEX-files, CERENA is user-friendly and computationally efficient. The availability of forward and adjoint sensitivity analyses allows for further studies such as parameter estimation and uncertainty analysis. The MATLAB code implementing CERENA is freely available from http://cerenadevelopers.github.io/CERENA/.

  2. CERENA: ChEmical REaction Network Analyzer—A Toolbox for the Simulation and Analysis of Stochastic Chemical Kinetics

    PubMed Central

    Kazeroonian, Atefeh; Fröhlich, Fabian; Raue, Andreas; Theis, Fabian J.; Hasenauer, Jan

    2016-01-01

    Gene expression, signal transduction and many other cellular processes are subject to stochastic fluctuations. The analysis of these stochastic chemical kinetics is important for understanding cell-to-cell variability and its functional implications, but it is also challenging. A multitude of exact and approximate descriptions of stochastic chemical kinetics have been developed, however, tools to automatically generate the descriptions and compare their accuracy and computational efficiency are missing. In this manuscript we introduced CERENA, a toolbox for the analysis of stochastic chemical kinetics using Approximations of the Chemical Master Equation solution statistics. CERENA implements stochastic simulation algorithms and the finite state projection for microscopic descriptions of processes, the system size expansion and moment equations for meso- and macroscopic descriptions, as well as the novel conditional moment equations for a hybrid description. This unique collection of descriptions in a single toolbox facilitates the selection of appropriate modeling approaches. Unlike other software packages, the implementation of CERENA is completely general and allows, e.g., for time-dependent propensities and non-mass action kinetics. By providing SBML import, symbolic model generation and simulation using MEX-files, CERENA is user-friendly and computationally efficient. The availability of forward and adjoint sensitivity analyses allows for further studies such as parameter estimation and uncertainty analysis. The MATLAB code implementing CERENA is freely available from http://cerenadevelopers.github.io/CERENA/. PMID:26807911

  3. Stochastic analysis of concentration field in a wake region.

    PubMed

    Yassin, Mohamed F; Elmi, Abdirashid A

    2011-02-01

    Identifying the geographic locations in urban areas from which air pollutants enter the atmosphere is among the most important information needed to develop effective mitigation strategies for pollution control. Stochastic analysis is a powerful tool that can be used for estimating concentration fluctuations in plume dispersion in the wake region around buildings. Only a few studies have been devoted to evaluating applications of stochastic analysis to pollutant dispersion in urban areas. This study was designed to investigate the concentration fields in the wake region using an obstacle model, namely an isolated building model. We measured concentration fluctuations at the centerline of various downwind distances from the source, and at different heights, at a frequency of 1 kHz. Concentration fields were analyzed stochastically, using probability density functions (pdf). Stochastic analysis was performed on the concentration fluctuation and the pdf of mean concentration, fluctuation intensity, and crosswind mean-plume dispersion. The pdf of the concentration fluctuation data show significant non-Gaussian behavior. The lognormal distribution appeared to be the best fit to the shape of concentration measured in the boundary layer. We observed that the plume dispersion pdf near the source was shorter than the plume dispersion far from the source. Our findings suggest that the use of stochastic techniques in complex building environments can be a powerful tool to help understand the distribution and location of air pollutants.
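
    The distribution-fitting step can be sketched with SciPy: fit a lognormal to fluctuation samples and compare the goodness of fit against a Gaussian. Synthetic samples stand in for the 1 kHz measurements.

      import numpy as np
      from scipy import stats

      # Fit a lognormal pdf to concentration samples and compare with a
      # normal fit via Kolmogorov-Smirnov tests (surrogate data).
      rng = np.random.default_rng(8)
      conc = rng.lognormal(mean=-1.0, sigma=0.8, size=5000)

      shape, loc, scale = stats.lognorm.fit(conc, floc=0)   # fix location at zero
      print("fitted sigma:", round(shape, 3), "fitted median:", round(scale, 3))

      print("lognormal KS p-value:",
            stats.kstest(conc, "lognorm", args=(shape, loc, scale)).pvalue)
      mu, sd = conc.mean(), conc.std()
      print("normal KS p-value:", stats.kstest(conc, "norm", args=(mu, sd)).pvalue)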

  4. European regional efficiency and geographical externalities: a spatial nonparametric frontier analysis

    NASA Astrophysics Data System (ADS)

    Ramajo, Julián; Cordero, José Manuel; Márquez, Miguel Ángel

    2017-10-01

    This paper analyses region-level technical efficiency in nine European countries over the 1995-2007 period. We propose the application of a nonparametric conditional frontier approach to account for the presence of heterogeneous conditions in the form of geographical externalities. Such environmental factors are beyond the control of regional authorities, but may affect the production function; therefore, they need to be considered in the frontier estimation. Specifically, a spatial autoregressive term is included as an external conditioning factor in a robust order-m model. Thus we can test the hypothesis of non-separability (the external factor impacts both the input-output space and the distribution of efficiencies), demonstrating the existence of significant global interregional spillovers into the production process. Our findings show that geographical externalities affect both the frontier level and the probability of being more or less efficient. Specifically, the results indicate that the spatial lag variable has an inverted U-shaped non-linear impact on the performance of regions. This finding can be interpreted as a differential effect of interregional spillovers depending on the size of the neighboring economies: positive externalities for small values, possibly related to agglomeration economies, and negative externalities for high values, indicating the possibility of production congestion. Additionally, evidence of a strong geographic pattern in European regional efficiency is reported, and the levels of technical efficiency are found to have converged during the period under analysis.

  5. Stochastic response surface methodology: A study in the human health area

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oliveira, Teresa A., E-mail: teresa.oliveira@uab.pt; Oliveira, Amílcar, E-mail: amilcar.oliveira@uab.pt; Centro de Estatística e Aplicações, Universidade de Lisboa

    2015-03-10

    In this paper we review Stochastic Response Surface Methodology as a tool for modeling uncertainty in the context of Risk Analysis. An application to survival analysis in the breast cancer context is implemented in the R software.

  6. Impulsive synchronization of stochastic reaction-diffusion neural networks with mixed time delays.

    PubMed

    Sheng, Yin; Zeng, Zhigang

    2018-07-01

    This paper discusses impulsive synchronization of stochastic reaction-diffusion neural networks with Dirichlet boundary conditions and hybrid time delays. By virtue of inequality techniques, theories of stochastic analysis, linear matrix inequalities, and the contradiction method, sufficient criteria are proposed to ensure exponential synchronization of the addressed stochastic reaction-diffusion neural networks with mixed time delays via a designed impulsive controller. Compared with some recent studies, the neural network models herein are more general, some restrictions are relaxed, and the obtained conditions enhance and generalize some published ones. Finally, two numerical simulations are performed to substantiate the validity and merits of the developed theoretical analysis. Copyright © 2018 Elsevier Ltd. All rights reserved.

  7. On some stochastic formulations and related statistical moments of pharmacokinetic models.

    PubMed

    Matis, J H; Wehrly, T E; Metzler, C M

    1983-02-01

    This paper presents the deterministic and stochastic models for a linear compartment system with constant coefficients, and it develops expressions for the mean residence times (MRT) and the variances of the residence times (VRT) for the stochastic model. The expressions are computationally simple, involving primarily matrix inversion, and mathematically elegant, avoiding eigenvalue analysis and the complex domain. The MRT and VRT provide a set of new, meaningful response measures for pharmacokinetic analysis and give added insight into the system kinetics. The new analysis is illustrated with an example involving cholesterol turnover in rats.
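
    A minimal sketch of the matrix-inversion route to mean residence times for a linear compartment system dx/dt = Ax: the matrix -A⁻¹ collects the mean times spent in each compartment per unit entering (the rate constants below are hypothetical, not the paper's example).

    ```python
    import numpy as np

    # Hypothetical 2-compartment model dx/dt = A x, first-order rate constants (1/h):
    k12, k21, k10 = 0.3, 0.2, 0.1   # transfer 1->2, transfer 2->1, elimination from 1
    A = np.array([[-(k12 + k10), k21],
                  [k12,          -k21]])

    T = -np.linalg.inv(A)        # T[i, j] = mean time spent in compartment i per unit entering j
    mrt_from_1 = T[:, 0].sum()   # total mean residence time for a dose into compartment 1
    print(T)                     # T[0, 0] equals 1/k10 = 10 h here, as all elimination is via 1
    print(f"MRT (dose in compartment 1): {mrt_from_1:.2f} h")
    ```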

  8. q-Gaussian distributions and multiplicative stochastic processes for analysis of multiple financial time series

    NASA Astrophysics Data System (ADS)

    Sato, Aki-Hiro

    2010-12-01

    This study considers q-Gaussian distributions and stochastic differential equations with both multiplicative and additive noises. In the M-dimensional case, a q-Gaussian distribution can be theoretically derived as a stationary probability distribution of a stochastic differential equation with mutually independent multiplicative and additive noises. Using the proposed stochastic differential equation, a method to evaluate a default probability under a given risk buffer is proposed.
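
    A minimal Euler-Maruyama sketch of a one-dimensional SDE with independent multiplicative and additive noises of the kind considered here (all coefficients are hypothetical); the heavy, q-Gaussian-like tails of the stationary law show up as positive excess kurtosis.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    b, M, A = 1.0, 0.5, 0.2        # drift, multiplicative and additive noise intensities
    dt, n_steps = 1e-3, 500_000
    x = 0.0
    trace = np.empty(n_steps)
    for i in range(n_steps):
        dW1, dW2 = rng.normal(0.0, np.sqrt(dt), size=2)  # independent Wiener increments
        x += -b * x * dt + np.sqrt(M) * x * dW1 + np.sqrt(A) * dW2
        trace[i] = x

    # Heavy tails relative to a Gaussian appear as positive excess kurtosis
    excess_kurtosis = ((trace - trace.mean())**4).mean() / trace.var()**2 - 3.0
    print(f"excess kurtosis: {excess_kurtosis:.2f}")  # roughly 6 for these coefficients
    ```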

  9. The relationship between hospital specialization and hospital efficiency: do different measures of specialization lead to different results?

    PubMed

    Lindlbauer, Ivonne; Schreyögg, Jonas

    2014-12-01

    This study investigated the relationship between hospital specialization and technical efficiency using different measures of specialization, including two novel approaches based on patient volumes rather than patient proportions. It was motivated by the observation that most studies to date have quantified hospital specialization using information about hospital patients grouped into different categories based on their diagnosis, and in doing so have used proportions, thus indirectly assuming that these categories are dependent on one another. In order to account for the diversification of organizations and the idea that hospitals can be specialized in terms of professional expertise or technical equipment within a given diagnosis category, we developed our two specialization measures based on patient volume in each category. Using a one-step stochastic frontier approach on randomly selected data from the annual reports of 1,239 acute care German hospitals for the years 2000 through 2010, we estimated the relationship of inefficiency to exogenous variables, such as specialization. The results show that specialization as quantified by our novel measures has effects on efficiency that are the opposite of those obtained using earlier measures of specialization. These results underscore the importance of always providing an exact definition of specialization when studying its effects. Additionally, a Monte Carlo simulation based on three scenarios is provided to facilitate the choice of a specialization measure for further analysis.
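
    The paper's one-step panel specification is more elaborate, but the core of any stochastic frontier estimation can be sketched as maximum likelihood on y = Xb + v - u with half-normal inefficiency (the classic Aigner-Lovell-Schmidt form); the data below are synthetic, not the hospital data.

    ```python
    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import norm

    rng = np.random.default_rng(3)

    # Synthetic frontier data: log-output = b0 + b1 * log-input + v - u
    n = 500
    x = rng.normal(size=n)
    v = rng.normal(scale=0.2, size=n)            # symmetric noise
    u = np.abs(rng.normal(scale=0.4, size=n))    # half-normal inefficiency
    y = 1.0 + 0.6 * x + v - u

    def neg_loglik(theta):
        b0, b1, log_sv, log_su = theta
        sv, su = np.exp(log_sv), np.exp(log_su)
        sigma = np.hypot(sv, su)                 # sqrt(sv^2 + su^2)
        lam = su / sv
        eps = y - b0 - b1 * x
        # Half-normal frontier log-density: log[(2/sigma) phi(eps/sigma) Phi(-eps*lam/sigma)]
        ll = (np.log(2.0 / sigma) + norm.logpdf(eps / sigma)
              + norm.logcdf(-eps * lam / sigma))
        return -ll.sum()

    res = minimize(neg_loglik, x0=np.array([0.0, 0.0, np.log(0.3), np.log(0.3)]),
                   method="Nelder-Mead", options={"maxiter": 5000})
    b0, b1, log_sv, log_su = res.x
    print(f"beta: {b0:.3f}, {b1:.3f}; sigma_v: {np.exp(log_sv):.3f}, sigma_u: {np.exp(log_su):.3f}")
    ```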

  10. Efficiency of health care system at the sub-state level in Madhya Pradesh, India.

    PubMed

    Purohit, Brijesh C

    2010-01-01

    This paper attempts a sub-state-level analysis of the health system in a low-income Indian state, namely Madhya Pradesh. The objective of our study is to establish efficiency parameters that may help health policy makers improve district-level, and thus state-level, health system performance. It provides an idealized yardstick for evaluating the performance of the health sector using the stochastic frontier technique. The study was carried out in two stages of estimation, and our results suggest that life expectancy in the state could be enhanced considerably by correcting the factors that adversely influence sub-state-level health system efficiency. Our results indicate that the main factors within the health system behind the discrepancy in interdistrict performance are the inequitable distribution of supplies, the availability of skilled attention at birth, and inadequate staffing relative to the patient load of the rural population at primary health centers. Overcoming these factors through additional resources in the deficient districts, mobilized partly from grants in aid and partly from patient welfare societies, may help the state improve life expectancy more quickly and equitably. Besides the direct inputs from the health sector, a more conducive environment for gender development, reducing inequality in opportunities for women in health, education and other rights, may provide the necessary impetus towards reducing maternal morbidity and mortality and adding to overall life expectancy in the state.

  11. A damage analysis for brittle materials using stochastic micro-structural information

    NASA Astrophysics Data System (ADS)

    Lin, Shih-Po; Chen, Jiun-Shyan; Liang, Shixue

    2016-03-01

    In this work, a micro-crack-informed stochastic damage analysis is performed to consider the failure of materials with stochastic microstructures. The derivation of the damage evolution law is based on the Helmholtz free energy equivalence between the cracked microstructure and the homogenized continuum. The damage model is constructed under the stochastic representative volume element (SRVE) framework. The characteristics of the SRVE used in the construction of the stochastic damage model have been investigated based on the principle of minimum potential energy. The mesh dependency issue has been addressed by introducing a scaling law into the damage evolution equation. The proposed methods are then validated through comparison between numerical simulations and experimental observations of a high-strength concrete. It is observed that the standard deviation of porosity in the microstructures has a stronger effect on the damage states and the peak stresses than on the Young's and shear moduli in the macro-scale responses.

  12. Estimation and Analysis of Nonlinear Stochastic Systems. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Marcus, S. I.

    1975-01-01

    The algebraic and geometric structures of certain classes of nonlinear stochastic systems were exploited in order to obtain useful stability and estimation results. The class of bilinear stochastic systems (or linear systems with multiplicative noise) was discussed. The stochastic stability of bilinear systems driven by colored noise was considered. Approximate methods for obtaining sufficient conditions for the stochastic stability of bilinear systems evolving on general Lie groups were discussed. Two classes of estimation problems involving bilinear systems were considered. It was proved that, for systems described by certain types of Volterra series expansions or by certain bilinear equations evolving on nilpotent or solvable Lie groups, the optimal conditional mean estimator consists of a finite dimensional nonlinear set of equations. The theory of harmonic analysis was used to derive suboptimal estimators for bilinear systems driven by white noise which evolve on compact Lie groups or homogeneous spaces.

  13. Stochastic dynamic analysis of marine risers considering Gaussian system uncertainties

    NASA Astrophysics Data System (ADS)

    Ni, Pinghe; Li, Jun; Hao, Hong; Xia, Yong

    2018-03-01

    This paper performs stochastic dynamic response analysis of marine risers with material uncertainties, i.e. in the mass density and elastic modulus, using the Stochastic Finite Element Method (SFEM) and a model reduction technique. These uncertainties are assumed to have Gaussian distributions. The random mass density and elastic modulus are represented using the Karhunen-Loève (KL) expansion. The Polynomial Chaos (PC) expansion is adopted to represent the vibration response because the covariance of the output is unknown. Model reduction based on the Iterated Improved Reduced System (IIRS) technique is applied to eliminate the PC coefficients of the slave degrees of freedom and thereby reduce the dimension of the stochastic system. Monte Carlo Simulation (MCS) is conducted to obtain the reference response statistics. Two numerical examples are studied in this paper. The response statistics from the proposed approach are compared with those from MCS. It is noted that the computational time is significantly reduced while the accuracy is maintained. The results demonstrate the efficiency of the proposed approach for stochastic dynamic response analysis of marine risers.
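
    A minimal sketch of the discrete Karhunen-Loève step used in such analyses: eigendecompose the covariance matrix of the random field (an assumed exponential kernel below) and truncate to the dominant modes; all numbers are hypothetical, not the paper's riser model.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    # Hypothetical random elastic modulus along a riser: E(z) = mean + Gaussian fluctuation
    n, L = 200, 100.0                    # grid points, riser length (m)
    z = np.linspace(0.0, L, n)
    mean_E, sigma, corr_len = 2.1e11, 1.0e10, 20.0   # Pa, Pa, m

    # Exponential covariance kernel; discrete KL = spectral decomposition of the matrix
    C = sigma**2 * np.exp(-np.abs(z[:, None] - z[None, :]) / corr_len)
    eigvals, eigvecs = np.linalg.eigh(C)
    idx = np.argsort(eigvals)[::-1]      # sort modes by decreasing energy
    eigvals, eigvecs = eigvals[idx], eigvecs[:, idx]

    m = 10                               # truncation order: keep the m dominant modes
    xi = rng.normal(size=m)              # independent standard normal KL coordinates
    E_sample = mean_E + eigvecs[:, :m] @ (np.sqrt(eigvals[:m]) * xi)
    print(f"captured variance fraction: {eigvals[:m].sum() / eigvals.sum():.3f}")
    ```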

  14. Gene regulatory networks: a coarse-grained, equation-free approach to multiscale computation.

    PubMed

    Erban, Radek; Kevrekidis, Ioannis G; Adalsteinsson, David; Elston, Timothy C

    2006-02-28

    We present computer-assisted methods for analyzing stochastic models of gene regulatory networks. The main idea that underlies this equation-free analysis is the design and execution of appropriately initialized short bursts of stochastic simulations; the results of these are processed to estimate coarse-grained quantities of interest, such as mesoscopic transport coefficients. In particular, using a simple model of a genetic toggle switch, we illustrate the computation of an effective free energy Phi and of a state-dependent effective diffusion coefficient D that characterize an unavailable effective Fokker-Planck equation. Additionally we illustrate the linking of equation-free techniques with continuation methods for performing a form of stochastic "bifurcation analysis"; estimation of mean switching times in the case of a bistable switch is also implemented in this equation-free context. The accuracy of our methods is tested by direct comparison with long-time stochastic simulations. This type of equation-free analysis appears to be a promising approach to computing features of the long-time, coarse-grained behavior of certain classes of complex stochastic models of gene regulatory networks, circumventing the need for long Monte Carlo simulations.

  15. A coupled stochastic rainfall-evapotranspiration model for hydrological impact analysis

    NASA Astrophysics Data System (ADS)

    Pham, Minh Tu; Vernieuwe, Hilde; De Baets, Bernard; Verhoest, Niko E. C.

    2018-02-01

    A hydrological impact analysis concerns the study of the consequences of certain scenarios on one or more variables or fluxes in the hydrological cycle. In such an exercise, discharge is often considered, as floods originating from extremely high discharges often cause damage. Investigating the impact of extreme discharges generally requires long time series of precipitation and evapotranspiration to be used to force a rainfall-runoff model. However, such kinds of data may not be available and one should resort to stochastically generated time series, even though the impact of using such data on the overall discharge, and especially on the extreme discharge events, is not well studied. In this paper, stochastically generated rainfall and corresponding evapotranspiration time series, generated by means of vine copulas, are used to force a simple conceptual hydrological model. The results obtained are comparable to the modelled discharge using observed forcing data. Yet, uncertainties in the modelled discharge increase with an increasing number of stochastically generated time series used. Notwithstanding this finding, it can be concluded that using a coupled stochastic rainfall-evapotranspiration model has great potential for hydrological impact analysis.

  16. [The Probabilistic Efficiency Frontier: A Value Assessment of Treatment Options in Hepatitis C].

    PubMed

    Mühlbacher, Axel C; Sadler, Andrew

    2017-06-19

    Background The German Institute for Quality and Efficiency in Health Care (IQWiG) recommends the concept of the efficiency frontier for assessing health care interventions. The efficiency frontier supports regulatory decisions on reimbursement prices for the appropriate allocation of health care resources. To date, this cost-benefit assessment framework has only been applied on the basis of individual patient-relevant endpoints, which contradicts the multi-dimensional reality of patient benefit. Objective The objective of this study was to illustrate the operationalization of multi-dimensional benefit, considering the uncertainty in clinical effects and preference data, in order to calculate the efficiency of different treatment options for hepatitis C (HCV). This case study shows how methodological challenges can be overcome in order to use the efficiency frontier for economic analysis and health care decision-making. Method The operationalization of patient benefit was carried out on several patient-relevant endpoints. Preference data from a discrete choice experiment (DCE) study and clinical data based on clinical trials, reflecting the patient and the clinical perspective, respectively, were used for the aggregation of an overall benefit score. A probabilistic efficiency frontier was constructed in a Monte Carlo simulation with 10,000 random draws. Patient-relevant endpoints were modeled with a beta distribution and preference data with a normal distribution. The assessment of overall benefit and costs provided information about the adequacy of the treatment prices. The parameter uncertainty was illustrated by the price-acceptability curve and the net monetary benefit. Results Based on the clinical and preference data in Germany, the interferon-free treatment options proved to be efficient at the current price level. The interferon-free therapies of the latest generation achieved a positive net cost-benefit. Within the decision model, these therapies showed a maximum overall benefit. Due to their high additional benefit and approved prices, the therapies lie above the extrapolated efficiency frontier, which suggests that these options have efficient reimbursement prices. Considering uncertainty, even a higher price would have resulted in a positive cost-benefit ratio. Conclusion IQWiG's efficiency frontier was used to assess the value of different treatment options in HCV. This study demonstrates that the probabilistic efficiency frontier, the price-acceptability curve and the net monetary benefit can contribute essential information to reimbursement decisions and price negotiations. © Georg Thieme Verlag KG Stuttgart · New York.
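
    A schematic Monte Carlo sketch of the aggregation step described here, with Beta-distributed endpoint effects and normally distributed preference weights; all inputs are invented placeholders, not the study's data.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    n_draws = 10_000

    # Hypothetical inputs: three patient-relevant endpoints for one therapy.
    # Clinical effects as Beta(successes + 1, failures + 1); DCE weights as Normal(mean, se).
    endpoint_alpha = np.array([180.0, 150.0, 120.0])   # e.g. responders + 1
    endpoint_beta = np.array([20.0, 50.0, 80.0])       # e.g. non-responders + 1
    weight_mean = np.array([0.5, 0.3, 0.2])            # relative importance weights
    weight_se = np.array([0.05, 0.04, 0.04])

    effects = rng.beta(endpoint_alpha, endpoint_beta, size=(n_draws, 3))
    weights = rng.normal(weight_mean, weight_se, size=(n_draws, 3))
    benefit = (effects * weights).sum(axis=1)          # aggregated overall benefit score

    lo, hi = np.percentile(benefit, [2.5, 97.5])
    print(f"overall benefit: {benefit.mean():.3f} (95% interval {lo:.3f}-{hi:.3f})")
    ```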

  17. Stochastic flux analysis of chemical reaction networks

    PubMed Central

    2013-01-01

    Background Chemical reaction networks provide an abstraction scheme for a broad range of models in biology and ecology. The two common means for simulating these networks are the deterministic and the stochastic approaches. The traditional deterministic approach, based on differential equations, enjoys a rich set of analysis techniques, including a treatment of reaction fluxes. However, the discrete stochastic simulations, which provide advantages in some cases, lack a quantitative treatment of network fluxes. Results We describe a method for flux analysis of chemical reaction networks, where flux is given by the flow of species between reactions in stochastic simulations of the network. Extending discrete event simulation algorithms, our method constructs several data structures, and thereby reveals a variety of statistics about resource creation and consumption during the simulation. We use these structures to quantify the causal interdependence and relative importance of the reactions at arbitrary time intervals with respect to the network fluxes. This allows us to construct reduced networks that have the same flux-behavior, and compare these networks, also with respect to their time series. We demonstrate our approach on an extended example based on a published ODE model of the same network, that is, Rho GTP-binding proteins, and on other models from biology and ecology. Conclusions We provide a fully stochastic treatment of flux analysis. As in deterministic analysis, our method delivers the network behavior in terms of species transformations. Moreover, our stochastic analysis can be applied, not only at steady state, but at arbitrary time intervals, and used to identify the flow of specific species between specific reactions. Our case study of Rho GTP-binding proteins reveals the role played by the cyclic reverse fluxes in tuning the behavior of this network. PMID:24314153

  18. Stochastic flux analysis of chemical reaction networks.

    PubMed

    Kahramanoğulları, Ozan; Lynch, James F

    2013-12-07

    Chemical reaction networks provide an abstraction scheme for a broad range of models in biology and ecology. The two common means for simulating these networks are the deterministic and the stochastic approaches. The traditional deterministic approach, based on differential equations, enjoys a rich set of analysis techniques, including a treatment of reaction fluxes. However, the discrete stochastic simulations, which provide advantages in some cases, lack a quantitative treatment of network fluxes. We describe a method for flux analysis of chemical reaction networks, where flux is given by the flow of species between reactions in stochastic simulations of the network. Extending discrete event simulation algorithms, our method constructs several data structures, and thereby reveals a variety of statistics about resource creation and consumption during the simulation. We use these structures to quantify the causal interdependence and relative importance of the reactions at arbitrary time intervals with respect to the network fluxes. This allows us to construct reduced networks that have the same flux-behavior, and compare these networks, also with respect to their time series. We demonstrate our approach on an extended example based on a published ODE model of the same network, that is, Rho GTP-binding proteins, and on other models from biology and ecology. We provide a fully stochastic treatment of flux analysis. As in deterministic analysis, our method delivers the network behavior in terms of species transformations. Moreover, our stochastic analysis can be applied, not only at steady state, but at arbitrary time intervals, and used to identify the flow of specific species between specific reactions. Our case study of Rho GTP-binding proteins reveals the role played by the cyclic reverse fluxes in tuning the behavior of this network.

  19. New Frontiers and Challenges for Single-Cell Electrochemical Analysis.

    PubMed

    Zhang, Jingjing; Zhou, Junyu; Pan, Rongrong; Jiang, Dechen; Burgess, James D; Chen, Hong-Yuan

    2018-02-23

    Previous measurements of cell populations might obscure many important cellular differences, and new strategies for single-cell analysis are urgently needed to re-examine fundamental biological principles for better diagnosis and treatment of diseases. Electrochemistry is a robust technique for the analysis of single living cells that causes only minor disruption of cellular activity and provides high spatiotemporal resolution. The achievements of the past 30 years have revealed significant information about the exocytotic events of single cells, elucidating the mechanisms of cellular activity. Currently, the rapid development of micro/nanofabrication and optoelectronic technologies is driving the development of multifunctional electrodes and novel electrochemical approaches with higher resolution for single cells. In this Perspective, three new frontiers in this field, namely electrochemical microscopy, intracellular analysis, and single-cell analysis in a biological system (i.e., neocortex and retina), are reviewed. The unique features and remaining challenges of these techniques are discussed.

  20. A methodological proposal to evaluate the cost of duration moral hazard in workplace accident insurance.

    PubMed

    Martín-Román, Ángel; Moral, Alfonso

    2017-12-01

    The cost of duration moral hazard in workplace accident insurance has been amply explored by North-American scholars. Given the current context of financial constraints in public accounts, and particularly in the Social Security system, we feel that the issue merits inquiry in the case of Spain. The present research posits a methodological proposal using the econometric technique of stochastic frontiers, which allows us to break down the duration of work-related leave into what we term "economic days" and "medical days". Our calculations indicate that during the 9-year period spanning 2005-2013, the cost of sick leave amongst full-time salaried workers amounted to 6920 million Euros (in constant 2011 Euros). Of this total, and bearing in mind that "economic days" are those attributable to duration moral hazard, over 3000 million Euros might be linked to workplace absenteeism. It is on this figure where economic policy measures might prove more effective.

  1. No Evidence of Trade-Off between Farm Efficiency and Resilience: Dependence of Resource-Use Efficiency on Land-Use Diversity

    PubMed Central

    Kahiluoto, Helena; Kaseva, Janne

    2016-01-01

    Efficiency in the use of resources streamlined for expected conditions could lead to reduced system diversity and consequently endanger resilience. We tested the hypothesis of a trade-off between farm resource-use efficiency and land-use diversity. We applied stochastic frontier production models to assess the dependence of resource-use efficiency on land-use diversity, as measured by the Shannon-Weaver index. Total revenue in relation to the use of capital, land and labour was studied for farms in Southern Finland larger than 30 ha. The data were extracted from the Finnish Profitability Bookkeeping data. Our results indicate that there is either no trade-off or a negligible trade-off of no economic importance. The small dependence of resource-use efficiency on land-use diversity can be positive as well as negative. We conclude that diversification as a strategy to enhance farm resilience does not necessarily constrain resource-use efficiency. PMID:27662475
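
    For reference, the Shannon-Weaver index used here as the land-use diversity measure is straightforward to compute from land-use shares; the hectare figures below are hypothetical.

    ```python
    import numpy as np

    def shannon_weaver(areas):
        """Shannon-Weaver diversity H = -sum(p_i * ln p_i) over land-use shares."""
        p = np.asarray(areas, dtype=float)
        p = p[p > 0] / p.sum()          # drop empty categories, normalize to shares
        return -(p * np.log(p)).sum()

    # Hypothetical farm: hectares under four land uses
    print(f"H = {shannon_weaver([12.0, 8.0, 6.0, 4.0]):.3f}")
    ```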

  2. Effects of surface functionalization on the electronic and structural properties of carbon nanotubes: A computational approach

    NASA Astrophysics Data System (ADS)

    Ribeiro, M. S.; Pascoini, A. L.; Knupp, W. G.; Camps, I.

    2017-12-01

    Carbon nanotubes (CNTs) have important electronic, mechanical and optical properties. These properties can differ between a pristine nanotube and one whose surface is functionalized. Such changes can be exploited in areas of research and application, such as the construction of nanodevices that act as sensors and filters. Following this idea, in the current work we present the results of a systematic study of CNT surfaces functionalized with hydroxyl and carboxyl groups. Using entropy as the selection criterion, we filtered a library of 10k stochastically generated complexes for each functionalization concentration (5, 10, 15, 20 and 25%). The structurally related parameters (root-mean-square deviation, entropy, and volume/area) have a monotonic relationship with functionalization concentration. In contrast, the electronic parameters (frontier molecular orbital energies, electronic gap, molecular hardness, and electrophilicity index) present an oscillatory behavior. For a set of concentrations, the nanotubes present spin-polarized properties that can be used in spintronics.

  3. Specialty and Full-Service Hospitals: A Comparative Cost Analysis

    PubMed Central

    Carey, Kathleen; Burgess, James F; Young, Gary J

    2008-01-01

    Objective To compare the costs of physician-owned cardiac, orthopedic, and surgical single specialty hospitals with those of full-service hospital competitors. Data Sources The primary data sources are the Medicare Cost Reports for 1998–2004 and hospital inpatient discharge data for three of the states where single specialty hospitals are most prevalent, Texas, California, and Arizona. The latter were obtained from the Texas Department of State Health Services, the California Office of Statewide Health Planning and Development, and the Agency for Healthcare Research and Quality Healthcare Cost and Utilization Project. Additional data comes from the American Hospital Association Annual Survey Database. Study Design We identified all physician-owned cardiac, orthopedic, and surgical specialty hospitals in these three states as well as all full-service acute care hospitals serving the same market areas, defined using Dartmouth Hospital Referral Regions. We estimated a hospital cost function using stochastic frontier regression analysis, and generated hospital specific inefficiency measures. Application of t-tests of significance compared the inefficiency measures of specialty hospitals with those of full-service hospitals to make general comparisons between these classes of hospitals. Principal Findings Results do not provide evidence that specialty hospitals are more efficient than the full-service hospitals with whom they compete. In particular, orthopedic and surgical specialty hospitals appear to have significantly higher levels of cost inefficiency. Cardiac hospitals, however, do not appear to be different from competitors in this respect. Conclusions Policymakers should not embrace the assumption that physician-owned specialty hospitals produce patient care more efficiently than their full-service hospital competitors. PMID:18662170

  4. Relativistic analysis of stochastic kinematics

    NASA Astrophysics Data System (ADS)

    Giona, Massimiliano

    2017-10-01

    The relativistic analysis of stochastic kinematics is developed in order to determine the transformation of the effective diffusivity tensor in inertial frames. Poisson-Kac stochastic processes are initially considered. For one-dimensional spatial models, the effective diffusion coefficient measured in a frame Σ moving with velocity w with respect to the rest frame of the stochastic process is inversely proportional to the third power of the Lorentz factor γ(w) = (1 - w²/c²)^(-1/2). Subsequently, higher-dimensional processes are analyzed and it is shown that the diffusivity tensor in a moving frame becomes nonisotropic: the diffusivities parallel and orthogonal to the velocity of the moving frame scale differently with respect to γ(w). The analysis of discrete space-time diffusion processes permits one to obtain a general transformation theory of the tensor diffusivity, confirmed by several different simulation experiments. Several implications of the theory are also addressed and discussed.

  5. An empirical analysis of the distribution of overshoots in a stationary Gaussian stochastic process

    NASA Technical Reports Server (NTRS)

    Carter, M. C.; Madison, M. W.

    1973-01-01

    The frequency distribution of overshoots in a stationary Gaussian stochastic process is analyzed. The primary techniques involved in this analysis are computer simulation and statistical estimation. Computer simulation is used to generate stationary Gaussian stochastic processes that have selected autocorrelation functions. An analysis of the simulation results reveals a frequency distribution for overshoots with a functional dependence on the mean and variance of the process. Statistical estimation is then used to estimate the mean and variance of a process. It is shown that, given an autocorrelation function and the mean and variance of the number of overshoots, a frequency distribution for overshoots can be estimated.
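
    A small sketch of the simulation side of such a study: generate a discrete stationary Gaussian process (an AR(1) surrogate here, not the paper's autocorrelation functions) and count level upcrossings.

    ```python
    import numpy as np

    rng = np.random.default_rng(6)

    # Discrete stationary Gaussian process: AR(1) with lag-one autocorrelation phi
    phi, n = 0.9, 200_000
    x = np.empty(n)
    x[0] = rng.normal()
    innovations = rng.normal(scale=np.sqrt(1 - phi**2), size=n)  # keeps unit variance
    for i in range(1, n):
        x[i] = phi * x[i - 1] + innovations[i]

    level = 1.5
    upcrossings = np.count_nonzero((x[:-1] < level) & (x[1:] >= level))
    print(f"upcrossings of level {level}: {upcrossings}")
    ```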

  6. Stochastic Simulation Tool for Aerospace Structural Analysis

    NASA Technical Reports Server (NTRS)

    Knight, Norman F.; Moore, David F.

    2006-01-01

    Stochastic simulation refers to incorporating the effects of design tolerances and uncertainties into the design analysis model and then determining their influence on the design. A high-level evaluation of one such stochastic simulation tool, the MSC.Robust Design tool by MSC.Software Corporation, has been conducted. This stochastic simulation tool provides structural analysts with a tool to interrogate their structural design based on their mathematical description of the design problem using finite element analysis methods. This tool leverages the analyst's prior investment in finite element model development of a particular design. The original finite element model is treated as the baseline structural analysis model for the stochastic simulations that are to be performed. A Monte Carlo approach is used by MSC.Robust Design to determine the effects of scatter in design input variables on response output parameters. The tool was not designed to provide a probabilistic assessment, but to assist engineers in understanding cause and effect. It is driven by a graphical-user interface and retains the engineer-in-the-loop strategy for design evaluation and improvement. The application problem for the evaluation is chosen to be a two-dimensional shell finite element model of a Space Shuttle wing leading-edge panel under re-entry aerodynamic loading. MSC.Robust Design adds value to the analysis effort by rapidly being able to identify design input variables whose variability causes the most influence in response output parameters.

  7. A polynomial-chaos-expansion-based building block approach for stochastic analysis of photonic circuits

    NASA Astrophysics Data System (ADS)

    Waqas, Abi; Melati, Daniele; Manfredi, Paolo; Grassi, Flavia; Melloni, Andrea

    2018-02-01

    The Building Block (BB) approach has recently emerged in photonics as a suitable strategy for the analysis and design of complex circuits. Each BB can be foundry related and contains a mathematical macro-model of its functionality. As is well known, statistical variations in fabrication processes can have a strong effect on BB functionality and ultimately affect the yield. In order to predict the statistical behavior of a circuit, proper analysis of the effects of these uncertainties is crucial. This paper presents a method to build a novel class of Stochastic Process Design Kits for the analysis of photonic circuits. The proposed design kits directly store the information on the stochastic behavior of each building block in the form of a generalized-polynomial-chaos-based augmented macro-model obtained by properly exploiting stochastic collocation and Galerkin methods. Using this approach, we demonstrate that the augmented macro-models of the BBs can be calculated once, stored in a BB (foundry-dependent) library, and then used for the analysis of any desired circuit. The main advantage of this approach, shown here for the first time in photonics, is that the stochastic moments of an arbitrary photonic circuit can be evaluated by a single simulation only, without the need for repeated simulations. The accuracy and the significant speed-up with respect to classical Monte Carlo analysis are verified by means of a classical photonic circuit example with multiple uncertain variables.
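
    A minimal sketch of the generalized-polynomial-chaos machinery behind such macro-models, for a toy scalar response of one Gaussian-uncertain parameter (the response function is invented): project onto probabilists' Hermite polynomials by Gauss quadrature and read the mean and variance off the coefficients.

    ```python
    import math
    import numpy as np
    from numpy.polynomial import hermite_e as He

    # Toy stand-in for a building-block response with one Gaussian-uncertain parameter xi
    def response(xi):
        return np.exp(0.3 * xi)

    order, n_quad = 6, 40
    nodes, weights = He.hermegauss(n_quad)       # Gauss rule for weight exp(-x^2/2)
    weights = weights / np.sqrt(2.0 * np.pi)     # normalize to the standard normal density

    # Projection: c_k = E[f(xi) He_k(xi)] / k!  (He_k are probabilists' Hermite polynomials)
    coeffs = np.array([
        np.sum(weights * response(nodes) * He.hermeval(nodes, [0] * k + [1]))
        / math.factorial(k)
        for k in range(order + 1)
    ])

    mean = coeffs[0]
    variance = sum(coeffs[k]**2 * math.factorial(k) for k in range(1, order + 1))
    print(f"PC mean {mean:.5f}  variance {variance:.5f}")
    # Exact lognormal moments for comparison: mean e^0.045, variance e^0.09 (e^0.09 - 1)
    ```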

  8. Stochastic Modelling, Analysis, and Simulations of the Solar Cycle Dynamic Process

    NASA Astrophysics Data System (ADS)

    Turner, Douglas C.; Ladde, Gangaram S.

    2018-03-01

    Analytical solutions, discretization schemes and simulation results are presented for the time-delay deterministic differential equation model of the solar dynamo presented by Wilmot-Smith et al. In addition, this model is extended to include stochastic Gaussian white noise parametric fluctuations. The introduction of stochastic fluctuations incorporates variables affecting the dynamo process in the solar interior, estimation error of parameters, and uncertainty of the α-effect mechanism. Simulation results are presented and analyzed to exhibit the effects of stochastic parametric volatility-dependent perturbations. The results generalize and extend the work of Hazra et al. In fact, some of these results exhibit the oscillatory dynamic behavior generated by the stochastic parametric additive perturbations in the absence of time delay. In addition, the simulation results of the modified stochastic models influence the change in behavior of the very recently developed stochastic model of Hazra et al.

  9. Study on Nonlinear Vibration Analysis of Gear System with Random Parameters

    NASA Astrophysics Data System (ADS)

    Tong, Cao; Liu, Xiaoyuan; Fan, Li

    2018-03-01

    In order to study the dynamic characteristics of a nonlinear gear vibration system and the influence of random parameters, a nonlinear stochastic vibration analysis model of a 3-DOF gear system is first established based on Newton's law, and the random response of the gear vibration is simulated by a stepwise integration method. Second, the influence of stochastic parameters such as meshing damping, tooth side gap and excitation frequency on the dynamic response of the gear nonlinear system is analyzed using stability analysis methods such as bifurcation diagrams and Lyapunov exponents. The analysis shows that the stochastic process cannot be neglected, as it can cause random bifurcation and chaos in the system response. This study provides an important reference for vibration engineering designers.

  10. Tensor methods for parameter estimation and bifurcation analysis of stochastic reaction networks

    PubMed Central

    Liao, Shuohao; Vejchodský, Tomáš; Erban, Radek

    2015-01-01

    Stochastic modelling of gene regulatory networks provides an indispensable tool for understanding how random events at the molecular level influence cellular functions. A common challenge of stochastic models is to calibrate a large number of model parameters against the experimental data. Another difficulty is to study how the behaviour of a stochastic model depends on its parameters, i.e. whether a change in model parameters can lead to a significant qualitative change in model behaviour (bifurcation). In this paper, tensor-structured parametric analysis (TPA) is developed to address these computational challenges. It is based on recently proposed low-parametric tensor-structured representations of classical matrices and vectors. This approach enables simultaneous computation of the model properties for all parameter values within a parameter space. The TPA is illustrated by studying the parameter estimation, robustness, sensitivity and bifurcation structure in stochastic models of biochemical networks. A Matlab implementation of the TPA is available at http://www.stobifan.org. PMID:26063822

  11. Tensor methods for parameter estimation and bifurcation analysis of stochastic reaction networks.

    PubMed

    Liao, Shuohao; Vejchodský, Tomáš; Erban, Radek

    2015-07-06

    Stochastic modelling of gene regulatory networks provides an indispensable tool for understanding how random events at the molecular level influence cellular functions. A common challenge of stochastic models is to calibrate a large number of model parameters against the experimental data. Another difficulty is to study how the behaviour of a stochastic model depends on its parameters, i.e. whether a change in model parameters can lead to a significant qualitative change in model behaviour (bifurcation). In this paper, tensor-structured parametric analysis (TPA) is developed to address these computational challenges. It is based on recently proposed low-parametric tensor-structured representations of classical matrices and vectors. This approach enables simultaneous computation of the model properties for all parameter values within a parameter space. The TPA is illustrated by studying the parameter estimation, robustness, sensitivity and bifurcation structure in stochastic models of biochemical networks. A Matlab implementation of the TPA is available at http://www.stobifan.org.

  12. Analysis of degree of nonlinearity and stochastic nature of HRV signal during meditation using delay vector variance method.

    PubMed

    Reddy, L Ram Gopal; Kuntamalla, Srinivas

    2011-01-01

    Heart rate variability analysis is fast gaining acceptance as a potential non-invasive means of autonomic nervous system assessment in research as well as clinical domains. In this study, a new nonlinear analysis method is used to detect the degree of nonlinearity and the stochastic nature of heart rate variability signals during two forms of meditation (Chi and Kundalini). The data, obtained from a widely used public online database (the MIT/BIH PhysioNet database), are used in this study. The method used is the delay vector variance (DVV) method, which is a unified method for detecting the presence of determinism and nonlinearity in a time series and is based upon the examination of the local predictability of a signal. From the results it is clear that there is a significant change in the nonlinearity and stochastic nature of the signal before and during the meditation (p value > 0.01). During Chi meditation there is an increase in the stochastic nature and a decrease in the nonlinear nature of the signal. There is a significant decrease in the degree of nonlinearity and stochastic nature during Kundalini meditation.

  13. Response analysis of a class of quasi-linear systems with fractional derivative excited by Poisson white noise

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Yongge; Xu, Wei, E-mail: weixu@nwpu.edu.cn; Yang, Guidong

    Poisson white noise, as a typical non-Gaussian excitation, has attracted much attention recently. However, little work has been devoted to the study of stochastic systems with fractional derivatives under Poisson white noise excitation. This paper investigates the stationary response of a class of quasi-linear systems with fractional derivative excited by Poisson white noise. The equivalent stochastic system of the original stochastic system is obtained. Then, approximate stationary solutions are obtained with the help of the perturbation method. Finally, two typical examples are discussed in detail to demonstrate the effectiveness of the proposed method. The analysis also shows that the fractional order and the fractional coefficient significantly affect the responses of the stochastic systems with fractional derivative.

  14. FRONTIER FIELDS: HIGH-REDSHIFT PREDICTIONS AND EARLY RESULTS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coe, Dan; Bradley, Larry; Zitrin, Adi, E-mail: DCoe@STScI.edu

    2015-02-20

    The Frontier Fields program is obtaining deep Hubble and Spitzer Space Telescope images of new "blank" fields and nearby fields gravitationally lensed by massive galaxy clusters. The Hubble images of the lensed fields are revealing nJy sources (AB mag > 31), the faintest galaxies yet observed. The full program will transform our understanding of galaxy evolution in the first 600 million years (z > 9). Previous programs have yielded a dozen or so z > 9 candidates, including perhaps fewer than expected in the Ultra Deep Field and more than expected in shallower Hubble images. In this paper, we present high-redshift (z > 6) number count predictions for the Frontier Fields and candidates in three of the first Hubble images. We show the full Frontier Fields program may yield up to ∼70 z > 9 candidates (∼6 per field). We base this estimate on an extrapolation of luminosity functions observed between 4 < z < 8 and gravitational lensing models submitted by the community. However, in the first two deep infrared Hubble images obtained to date, we find z ∼ 8 candidates but no strong candidates at z > 9. We defer quantitative analysis of the z > 9 deficit (including detection completeness estimates) to future work including additional data. At these redshifts, cosmic variance (field-to-field variation) is expected to be significant (greater than ±50%) and include clustering of early galaxies formed in overdensities. The full Frontier Fields program will significantly mitigate this uncertainty by observing six independent sightlines, each with a lensing cluster and nearby blank field.

  15. Frontier Homes. Save Our History[TM]. Teacher's Guide.

    ERIC Educational Resources Information Center

    A&E Network, New York, NY.

    This lesson plan, based on the Arts and Entertainment documentary "Frontier Homes," consists of four segments which examine a style of historic dwelling built by settlers on the frontier: the post and beam structures built by English settlers in New England; the log houses constructed by pioneers on the forested frontier; sod houses…

  16. Vibrational spectroscopic and DFT calculation studies of 2-amino-7-bromo-5-oxo-[1]benzopyrano [2,3-b]pyridine-3 carbonitrile

    NASA Astrophysics Data System (ADS)

    Premkumar, S.; Jawahar, A.; Mathavan, T.; Kumara Dhas, M.; Milton Franklin Benial, A.

    2015-03-01

    The vibrational spectra of 2-amino-7-bromo-5-oxo-[1]benzopyrano [2,3-b]pyridine-3 carbonitrile were recorded using Fourier transform infrared and Fourier transform Raman spectrometers. The optimized structural parameters, vibrational frequencies, Mulliken atomic charge distribution, frontier molecular orbitals, thermodynamic properties, temperature dependence of thermodynamic parameters, first order hyperpolarizability and natural bond orbital calculations of the molecule were obtained using the Gaussian 09 program. The vibrational frequencies were assigned on the basis of a potential energy distribution calculation using the VEDA 4.0 program. The calculated first order hyperpolarizability of the ABOBPC molecule was 6.908 × 10⁻³⁰ esu, which is 10.5 times greater than that of urea. The nonlinear optical activity of the molecule was also confirmed by the frontier molecular orbital and natural bond orbital analyses. The frontier molecular orbital analysis shows a low energy gap for the molecule, which leads to the high value of the first order hyperpolarizability. The natural bond orbital analysis indicates that the nonlinear optical activity of the molecule arises due to the π → π∗ transitions. The Mulliken atomic charge distribution confirms the presence of intramolecular charge transfer within the molecule. The reactive site of the molecule was predicted from the molecular electrostatic potential contour map. The values of the thermodynamic parameters increase with increasing temperature.

  17. The Linear Programming to evaluate the performance of Oral Health in Primary Care.

    PubMed

    Colussi, Claudia Flemming; Calvo, Maria Cristina Marino; Freitas, Sergio Fernando Torres de

    2013-01-01

    To show the use of Linear Programming to evaluate the performance of Oral Health in Primary Care. This study used data from 19 municipalities of the state of Santa Catarina that participated in the state evaluation in 2009 and have more than 50,000 inhabitants. A total of 40 indicators were evaluated, calculated using Microsoft Excel 2007 and converted to the interval [0, 1] in ascending order (one indicating the best situation and zero indicating the worst). Applying the Linear Programming technique, the municipalities were assessed and compared with one another according to a performance curve named the "estimated quality frontier". Municipalities on the frontier were classified as excellent. Indicators were aggregated into synthetic indicators. The majority of municipalities not on the quality frontier (values different from 1.0) had values lower than 0.5, indicating poor performance. The model applied to the municipalities of Santa Catarina assessed municipal management and local priorities rather than goals imposed by pre-defined parameters. In the final analysis, three municipalities were included in the "perceived quality frontier". The Linear Programming technique made it possible to identify gaps that must be addressed by city managers to enhance the actions taken. It also made it possible to observe each municipality's performance and compare results among similar municipalities.
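
    A minimal sketch of a linear-programming efficiency frontier of this general kind, here an input-oriented, constant-returns DEA formulation rather than the study's exact model; the municipal data are invented.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    # Hypothetical data: two inputs and one output for five municipalities
    X = np.array([[4.0, 140.0], [6.0, 90.0], [9.0, 120.0], [5.0, 100.0], [8.0, 160.0]])
    Y = np.array([[2.0], [1.0], [3.0], [1.5], [2.5]])

    def ccr_input_efficiency(X, Y, j0):
        """Input-oriented CCR DEA score for unit j0: minimize theta such that some
        lambda-combination of peers uses <= theta * inputs of j0 and produces >= its outputs."""
        n, m = X.shape                                   # units, inputs
        s = Y.shape[1]                                   # outputs
        c = np.r_[1.0, np.zeros(n)]                      # decision vars: [theta, lambda_1..n]
        A_in = np.c_[-X[j0], X.T]                        # sum_j lambda_j x_ij - theta x_i0 <= 0
        A_out = np.c_[np.zeros(s), -Y.T]                 # -sum_j lambda_j y_rj <= -y_r0
        res = linprog(c, A_ub=np.r_[A_in, A_out],
                      b_ub=np.r_[np.zeros(m), -Y[j0]],
                      bounds=[(None, None)] + [(0, None)] * n,
                      method="highs")
        return res.fun                                   # theta = 1.0 means on the frontier

    for j in range(len(X)):
        print(f"unit {j}: efficiency {ccr_input_efficiency(X, Y, j):.3f}")
    ```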

  18. Stochastic modeling of consumer preferences for health care institutions.

    PubMed

    Malhotra, N K

    1983-01-01

    This paper proposes a stochastic procedure for modeling consumer preferences via LOGIT analysis. First, a simple, non-technical exposition of the use of a stochastic approach in health care marketing is presented. Second, a study illustrating the application of the LOGIT model in assessing consumer preferences for hospitals is given. The paper concludes with several implications of the proposed approach.

  19. Probability density function evolution of power systems subject to stochastic variation of renewable energy

    NASA Astrophysics Data System (ADS)

    Wei, J. Q.; Cong, Y. C.; Xiao, M. Q.

    2018-05-01

    As renewable energies are increasingly integrated into power systems, there is increasing interest in the stochastic analysis of power systems. Better techniques should be developed to account for the uncertainty caused by the penetration of renewables and consequently to analyse its impacts on the stochastic stability of power systems. In this paper, Stochastic Differential Equations (SDEs) are used to represent the evolutionary behaviour of power systems. The stationary Probability Density Function (PDF) solution to SDEs modelling power systems excited by Gaussian white noise is analysed. Subject to such random excitation, the Joint Probability Density Function (JPDF) solution for the phase angle and angular velocity is governed by the generalized Fokker-Planck-Kolmogorov (FPK) equation. To solve this equation, a numerical method is adopted. Special measures are taken such that the generalized FPK equation is satisfied in the average sense of integration with the assumed PDF. Both weak and strong intensities of the stochastic excitations are considered in a single-machine infinite-bus power system. The numerical analysis yields the same result as the Monte Carlo simulation. Potential studies on the stochastic behaviour of multi-machine power systems with random excitations are discussed at the end.
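
    For reference, the generalized FPK equation for the joint density p(x, t) of a diffusion with drift a_i and diffusion b_ij takes the generic form below (notation ours, not the paper's); the stationary JPDF is the solution with the time derivative set to zero.

    ```latex
    \frac{\partial p}{\partial t}
      = -\sum_i \frac{\partial}{\partial x_i}\bigl[a_i(\mathbf{x})\,p\bigr]
        + \frac{1}{2}\sum_{i,j} \frac{\partial^2}{\partial x_i \partial x_j}\bigl[b_{ij}(\mathbf{x})\,p\bigr],
    \qquad \frac{\partial p}{\partial t} = 0 \ \text{at stationarity.}
    ```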

  20. Controllability of fractional higher order stochastic integrodifferential systems with fractional Brownian motion.

    PubMed

    Sathiyaraj, T; Balasubramaniam, P

    2017-11-30

    This paper presents a new set of sufficient conditions for the controllability of fractional higher order stochastic integrodifferential systems with fractional Brownian motion (fBm) in finite dimensional space, using fractional calculus, a fixed point technique and a stochastic analysis approach. In particular, we discuss the complete controllability of nonlinear fractional stochastic integrodifferential systems under the condition that the corresponding linear fractional system is controllable. Finally, an example is presented to illustrate the efficiency of the obtained theoretical results. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.

  1. Analysis of stability for stochastic delay integro-differential equations.

    PubMed

    Zhang, Yu; Li, Longsuo

    2018-01-01

    In this paper, we are concerned with the stability of numerical methods applied to stochastic delay integro-differential equations. For linear stochastic delay integro-differential equations, it is shown that mean-square stability is preserved by the split-step backward Euler method without any restriction on the step size, while the Euler-Maruyama method reproduces mean-square stability only under a step-size constraint. We also confirm the mean-square stability of the split-step backward Euler method for nonlinear stochastic delay integro-differential equations. Numerical experiments further verify the theoretical results.

  2. Microscopy and microanalysis 1996

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bailey, G.W.; Corbett, J.M.; Dimlich, R.V.W.

    1996-12-31

    The Proceedings of this Annual Meeting contain papers from members of the three societies. These proceedings emphasize the societies' common research interests and attempt to eliminate unwanted overlap. Topics covered are: microscopic analysis of animals with altered gene expression and in-situ gene and antibody localizations, high-resolution elemental mapping of nucleoprotein interactions, plant biology and pathology, quantitative HREM analysis of perfect and defected materials, computational methods for TEM image analysis, high-resolution FESM in materials research, frontiers in polymer microscopy and microanalysis, oxidation and corrosion, micro XRD and XRF, molecular microspectroscopy and spectral imaging, advances in confocal and multidimensional light microscopy, analytical electron microscopy in biology, correlative microscopy in biological sciences, grain-boundary microengineering, surfaces and interfaces, telepresence microscopy in education and research, MSA educational outreach, quantitative electron probe microanalysis, frontiers of analytical electron microscopy, critical issues in ceramic microstructures, dynamic organization of the cell, pathology, microbiology, high-resolution biological and cryo SEM, and scanning-probe microscopy.

  3. Synthesis, crystal structures, computational studies and antimicrobial activity of new designed bis((5-aryl-1,3,4-oxadiazol-2-yl)thio)alkanes

    NASA Astrophysics Data System (ADS)

    Ahmed, Muhammad Naeem; Sadiq, Beenish; Al-Masoudi, Najim A.; Yasin, Khawaja Ansar; Hameed, Shahid; Mahmood, Tariq; Ayub, Khurshid; Tahir, Muhammad Nawaz

    2018-03-01

    A new series of bis((5-aryl-1,3,4-oxadiazol-2-yl)thio)alkanes 4-14 has been synthesized via the nucleophilic substitution reaction of dihaloalkanes with the respective 1,3,4-oxadiazole-2-thiols 3a-f, and characterized by spectroscopic techniques. The structures of 4 and 12 were unambiguously confirmed by single-crystal X-ray diffraction analysis. Density functional theory calculations at the B3LYP/6-31+G(d) level of theory were performed for comparison with the X-ray geometric parameters, and for molecular electrostatic potential (MEP) and frontier molecular orbital analyses of the synthesized compounds. MEP analysis revealed that these compounds are nucleophilic in nature. Frontier molecular orbital (FMO) analysis of 4-14 was performed to evaluate kinetic stability. All synthesized compounds were screened in vitro for antimicrobial activity against three bacterial and three fungal strains and showed promising results.

  4. Permanence and asymptotic behaviors of stochastic predator-prey system with Markovian switching and Lévy noise

    NASA Astrophysics Data System (ADS)

    Wang, Sheng; Wang, Linshan; Wei, Tengda

    2018-04-01

    This paper concerns the dynamics of a stochastic predator-prey system with Markovian switching and Lévy noise. First, the existence and uniqueness of global positive solution to the system is proved. Then, by combining stochastic analytical techniques with M-matrix analysis, sufficient conditions of stochastic permanence and extinction are obtained. Furthermore, for the stochastic permanence case, by means of four constants related to the stationary probability distribution of the Markov chain and the parameters of the subsystems, both the superior limit and the inferior limit of the average in time of the sample path of the solution are estimated. Finally, our conclusions are illustrated through an example.

  5. Economics of Agroforestry

    Treesearch

    D. Evan Mercer; Frederick W. Cubbage; Gregory E. Frey

    2014-01-01

    This chapter provides principles, literature and a case study about the economics of agroforestry. We examine necessary conditions for achieving efficiency in agroforestry system design and economic analysis tools for assessing efficiency and adoptability of agroforestry. The tools presented here (capital budgeting, linear programming, production frontier analysis...

  6. Deterministic and Stochastic Analysis of a Prey-Dependent Predator-Prey System

    ERIC Educational Resources Information Center

    Maiti, Alakes; Samanta, G. P.

    2005-01-01

    This paper reports on studies of the deterministic and stochastic behaviours of a predator-prey system with prey-dependent response function. The first part of the paper deals with the deterministic analysis of uniform boundedness, permanence, stability and bifurcation. In the second part the reproductive and mortality factors of the prey and…

  7. MONALISA for stochastic simulations of Petri net models of biochemical systems.

    PubMed

    Balazki, Pavel; Lindauer, Klaus; Einloft, Jens; Ackermann, Jörg; Koch, Ina

    2015-07-10

    The concept of Petri nets (PN) is widely used in systems biology and allows the modeling of complex biochemical systems like metabolic systems, signal transduction pathways, and gene expression networks. In particular, PN allows topological analysis based on structural properties, which is important and useful when quantitative (kinetic) data are incomplete or unknown. When the kinetic parameters are known, the simulation of the time evolution of such models can help to study the dynamic behavior of the underlying system. If the number of involved entities (molecules) is low, a stochastic simulation should be preferred over the classical deterministic approach of solving ordinary differential equations. The Stochastic Simulation Algorithm (SSA) is a common method for such simulations. The combination of qualitative and semi-quantitative PN modeling and stochastic analysis techniques provides a valuable approach in the field of systems biology. Here, we describe the implementation of stochastic analysis in a PN environment. We extended MONALISA, an open-source software tool for the creation, visualization and analysis of PN, by several stochastic simulation methods. The simulation module offers four simulation modes, among them the stochastic mode with constant firing rates and Gillespie's algorithm in exact and approximate versions. The simulator is operated through a user-friendly graphical interface and accepts input data such as concentrations and reaction rate constants that are common parameters in the biological context. The key features of the simulation module are visualization of simulations, interactive plotting, export of results into a text file, mathematical expressions for describing simulation parameters, and up to 500 parallel simulations of the same parameter set. To illustrate the method, we discuss a model of insulin receptor recycling as a case study. We present software that combines the modeling power of Petri nets with the stochastic simulation of dynamic processes in a user-friendly environment supported by an intuitive graphical interface. The program offers a valuable alternative to modeling using ordinary differential equations, especially when simulating single-cell experiments with low molecule counts. The ability to use mathematical expressions provides additional flexibility in describing the simulation parameters. The open-source distribution allows further extensions by third-party developers. The software is cross-platform and is licensed under the Artistic License 2.0.

  8. MEANS: python package for Moment Expansion Approximation, iNference and Simulation

    PubMed Central

    Fan, Sisi; Geissmann, Quentin; Lakatos, Eszter; Lukauskas, Saulius; Ale, Angelique; Babtie, Ann C.; Kirk, Paul D. W.; Stumpf, Michael P. H.

    2016-01-01

    Motivation: Many biochemical systems require stochastic descriptions. Unfortunately, these can only be solved for the simplest cases and their direct simulation can become prohibitively expensive, precluding thorough analysis. As an alternative, moment closure approximation methods generate equations for the time-evolution of the system's moments and apply a closure ansatz to obtain a closed set of differential equations that can become the basis for the deterministic analysis of the moments of the outputs of stochastic systems. Results: We present a free, user-friendly tool implementing an efficient moment expansion approximation with parametric closures that integrates well with the IPython interactive environment. Our package enables the analysis of complex stochastic systems without any constraints on the number of species and moments studied and the type of rate laws in the system. In addition to the approximation method, our package provides numerous tools to help non-expert users in stochastic analysis. Availability and implementation: https://github.com/theosysbio/means Contacts: m.stumpf@imperial.ac.uk or e.lakatos13@imperial.ac.uk Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27153663

  9. MEANS: python package for Moment Expansion Approximation, iNference and Simulation.

    PubMed

    Fan, Sisi; Geissmann, Quentin; Lakatos, Eszter; Lukauskas, Saulius; Ale, Angelique; Babtie, Ann C; Kirk, Paul D W; Stumpf, Michael P H

    2016-09-15

    Many biochemical systems require stochastic descriptions. Unfortunately, these can only be solved for the simplest cases and their direct simulation can become prohibitively expensive, precluding thorough analysis. As an alternative, moment closure approximation methods generate equations for the time-evolution of the system's moments and apply a closure ansatz to obtain a closed set of differential equations that can become the basis for the deterministic analysis of the moments of the outputs of stochastic systems. We present a free, user-friendly tool implementing an efficient moment expansion approximation with parametric closures that integrates well with the IPython interactive environment. Our package enables the analysis of complex stochastic systems without any constraints on the number of species and moments studied and the type of rate laws in the system. In addition to the approximation method, our package provides numerous tools to help non-expert users in stochastic analysis. https://github.com/theosysbio/means m.stumpf@imperial.ac.uk or e.lakatos13@imperial.ac.uk Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.
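
    To make the moment-expansion idea concrete, the sketch below writes down the mean and variance equations for a linear birth-death process, where they close exactly; the closure ansatz that MEANS implements only becomes necessary for nonlinear rate laws. Rate constants are illustrative.

    ```python
    # Mean/variance ODEs for a birth-death process (0 -> X at k_b, X -> 0 at
    # k_d * x). For linear propensities these equations close exactly; this
    # is the structure a closure ansatz recovers in the nonlinear case.
    from scipy.integrate import solve_ivp

    k_b, k_d = 10.0, 0.1   # illustrative rate constants

    def moment_odes(t, y):
        m, v = y                             # mean and variance of copy number
        dm = k_b - k_d * m                   # d<X>/dt
        dv = k_b + k_d * m - 2.0 * k_d * v   # dVar(X)/dt
        return [dm, dv]

    sol = solve_ivp(moment_odes, (0.0, 100.0), [0.0, 0.0])
    m_end, v_end = sol.y[:, -1]
    print(f"stationary mean ~ {m_end:.1f}, variance ~ {v_end:.1f}")  # both -> k_b/k_d
    ```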

  10. Tipping point analysis of atmospheric oxygen concentration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Livina, V. N.; Forbes, A. B.; Vaz Martins, T. M.

    2015-03-15

    We apply tipping point analysis to nine observational oxygen concentration records around the globe, analyse their dynamics and perform projections under possible future scenarios leading to oxygen deficiency in the atmosphere. The analysis is based on a statistical physics framework with stochastic modelling, where we represent the observed data as a composition of deterministic and stochastic components estimated from the observed data using Bayesian and wavelet techniques.

  11. Report in the Energy and Intensity Frontiers, and Theoretical at Northwestern University

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Velasco, Mayda; Schmitt, Michael; deGouvea, Andre

    The Northwestern (NU) Particle Physics (PP) group involved in this report is active in the following priority areas: the Energy and Intensity Frontiers. The group is led by 2 full professors in experimental physics (Schmitt and Velasco), 3 full professors in theoretical physics (de Gouvea, Low and Petriello), and Heidi Schellman, who is now at Oregon State. Low and Petriello hold joint appointments with the HEP Division at Argonne National Laboratory. The theoretical PP research focuses on different aspects of PP phenomenology. de Gouvea dedicates a large fraction of his research efforts to understanding the origin of neutrino masses and neutrino properties, uncovering other new phenomena, and investigating connections between neutrino physics and other aspects of PP. Low works on Higgs physics as well as new theories beyond the Standard Model. Petriello pursues a research program in precision QCD and its associated collider phenomenology. The main goal of this effort is to improve the Standard Model predictions for important LHC observables in order to enable discoveries of new physics. In recent years, the emphasis on experimental PP at NU has been in collider physics. NU is expanding its efforts in new directions in both the Intensity and the Cosmic Frontiers (not discussed in this report). In the Intensity Frontier, Schmitt has started a new effort on Mu2e. He was accepted as a collaborator in April 2015 and is identified with important projects. In the Energy Frontier, Hahn, Schmitt and Velasco continue to have a significant impact and have expanded their CMS program to include R&D for the real-time L1 tracking trigger and the high granularity calorimeter needed for the high-luminosity LHC. Hahn is supported by an independent DOE Career Award and his work will not be discussed in this document. The NU analysis effort includes searches for rare and forbidden decays of the Higgs boson, Z boson and top quark, dark matter, and other physics topics beyond the Standard Model. Four students completed their PhDs: Kubik is now contributing to the Cosmic Frontier program, Pollack to both the Intensity and Energy Frontiers, and Pozdnyakov and Odell will continue in the Energy Frontier. All our research scientists, Anastassov, Oferzynski, Lusito, and Stoynev, have found new positions. The new post-docs are Trovato from Scuola Normale di Pisa, Odell from Northwestern and Bhattacharya from Brown. Trovato is now supported by Hahn, and so is Sung, previously at MIT.

  12. Spatial delineation, fluid-lithology characterization, and petrophysical modeling of deepwater Gulf of Mexico reservoirs through joint AVA deterministic and stochastic inversion of three-dimensional partially-stacked seismic amplitude data and well logs

    NASA Astrophysics Data System (ADS)

    Contreras, Arturo Javier

    This dissertation describes a novel Amplitude-versus-Angle (AVA) inversion methodology to quantitatively integrate pre-stack seismic data, well logs, geologic data, and geostatistical information. Deterministic and stochastic inversion algorithms are used to characterize flow units of deepwater reservoirs located in the central Gulf of Mexico. A detailed fluid/lithology sensitivity analysis was conducted to assess the nature of AVA effects in the study area. Standard AVA analysis indicates that the shale/sand interface represented by the top of the hydrocarbon-bearing turbidite deposits generate typical Class III AVA responses. Layer-dependent Biot-Gassmann analysis shows significant sensitivity of the P-wave velocity and density to fluid substitution, indicating that presence of light saturating fluids clearly affects the elastic response of sands. Accordingly, AVA deterministic and stochastic inversions, which combine the advantages of AVA analysis with those of inversion, have provided quantitative information about the lateral continuity of the turbidite reservoirs based on the interpretation of inverted acoustic properties and fluid-sensitive modulus attributes (P-Impedance, S-Impedance, density, and LambdaRho, in the case of deterministic inversion; and P-velocity, S-velocity, density, and lithotype (sand-shale) distributions, in the case of stochastic inversion). The quantitative use of rock/fluid information through AVA seismic data, coupled with the implementation of co-simulation via lithotype-dependent multidimensional joint probability distributions of acoustic/petrophysical properties, provides accurate 3D models of petrophysical properties such as porosity, permeability, and water saturation. Pre-stack stochastic inversion provides more realistic and higher-resolution results than those obtained from analogous deterministic techniques. Furthermore, 3D petrophysical models can be more accurately co-simulated from AVA stochastic inversion results. By combining AVA sensitivity analysis techniques with pre-stack stochastic inversion, geologic data, and awareness of inversion pitfalls, it is possible to substantially reduce the risk in exploration and development of conventional and non-conventional reservoirs. From the final integration of deterministic and stochastic inversion results with depositional models and analogous examples, the M-series reservoirs have been interpreted as stacked terminal turbidite lobes within an overall fan complex (the Miocene MCAVLU Submarine Fan System); this interpretation is consistent with previous core data interpretations and regional stratigraphic/depositional studies.

  13. Optimality, stochasticity, and variability in motor behavior

    PubMed Central

    Guigon, Emmanuel; Baraduc, Pierre; Desmurget, Michel

    2008-01-01

    Recent theories of motor control have proposed that the nervous system acts as a stochastically optimal controller, i.e. it plans and executes motor behaviors taking into account the nature and statistics of noise. Detrimental effects of noise are converted into a principled way of controlling movements. Attractive aspects of such theories are their ability to explain not only characteristic features of single motor acts, but also statistical properties of repeated actions. Here, we present a critical analysis of stochastic optimality in motor control which reveals several difficulties with this hypothesis. We show that stochastic control may not be necessary to explain the stochastic nature of motor behavior, and we propose an alternative framework, based on the action of a deterministic controller coupled with an optimal state estimator, which relieves drawbacks of stochastic optimality and appropriately explains movement variability. PMID:18202922

  14. Intrinsic noise analyzer: a software package for the exploration of stochastic biochemical kinetics using the system size expansion.

    PubMed

    Thomas, Philipp; Matuschek, Hannes; Grima, Ramon

    2012-01-01

    The accepted stochastic descriptions of biochemical dynamics under well-mixed conditions are given by the Chemical Master Equation and the Stochastic Simulation Algorithm, which are equivalent. The latter is a Monte-Carlo method, which, despite enjoying broad availability in a large number of existing software packages, is computationally expensive due to the huge amounts of ensemble averaging required for obtaining accurate statistical information. The former is a set of coupled differential-difference equations for the probability of the system being in any one of the possible mesoscopic states; these equations are typically computationally intractable because of the inherently large state space. Here we introduce the software package intrinsic Noise Analyzer (iNA), which allows for systematic analysis of stochastic biochemical kinetics by means of van Kampen's system size expansion of the Chemical Master Equation. iNA is platform independent and supports the popular SBML format natively. The present implementation is the first to adopt a complementary approach that combines state-of-the-art analysis tools using the computer algebra system GiNaC with traditional methods of stochastic simulation. iNA integrates two approximation methods based on the system size expansion, the Linear Noise Approximation and effective mesoscopic rate equations, which to date have not been available to non-expert users, into an easy-to-use graphical user interface. In particular, the present methods allow for quick approximate analysis of time-dependent mean concentrations, variances, covariances and correlation coefficients, which typically outperforms stochastic simulations. These analytical tools are complemented by automated multi-core stochastic simulations with direct statistical evaluation and visualization. We showcase iNA's performance by using it to explore the stochastic properties of cooperative and non-cooperative enzyme kinetics and a gene network associated with circadian rhythms. The software iNA is freely available as executable binaries for Linux, MacOSX and Microsoft Windows, as well as the full source code under an open source license.

  15. Intrinsic Noise Analyzer: A Software Package for the Exploration of Stochastic Biochemical Kinetics Using the System Size Expansion

    PubMed Central

    Grima, Ramon

    2012-01-01

    The accepted stochastic descriptions of biochemical dynamics under well-mixed conditions are given by the Chemical Master Equation and the Stochastic Simulation Algorithm, which are equivalent. The latter is a Monte-Carlo method, which, despite enjoying broad availability in a large number of existing software packages, is computationally expensive due to the huge amounts of ensemble averaging required for obtaining accurate statistical information. The former is a set of coupled differential-difference equations for the probability of the system being in any one of the possible mesoscopic states; these equations are typically computationally intractable because of the inherently large state space. Here we introduce the software package intrinsic Noise Analyzer (iNA), which allows for systematic analysis of stochastic biochemical kinetics by means of van Kampen's system size expansion of the Chemical Master Equation. iNA is platform independent and supports the popular SBML format natively. The present implementation is the first to adopt a complementary approach that combines state-of-the-art analysis tools using the computer algebra system GiNaC with traditional methods of stochastic simulation. iNA integrates two approximation methods based on the system size expansion, the Linear Noise Approximation and effective mesoscopic rate equations, which to date have not been available to non-expert users, into an easy-to-use graphical user interface. In particular, the present methods allow for quick approximate analysis of time-dependent mean concentrations, variances, covariances and correlation coefficients, which typically outperforms stochastic simulations. These analytical tools are complemented by automated multi-core stochastic simulations with direct statistical evaluation and visualization. We showcase iNA's performance by using it to explore the stochastic properties of cooperative and non-cooperative enzyme kinetics and a gene network associated with circadian rhythms. The software iNA is freely available as executable binaries for Linux, MacOSX and Microsoft Windows, as well as the full source code under an open source license. PMID:22723865
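
    At steady state, the Linear Noise Approximation mentioned above reduces to a Lyapunov equation for the covariance matrix, J C + C J^T + D = 0. The sketch below illustrates that single step on a hypothetical two-stage gene expression model; it is a didactic reconstruction, not iNA's code, and the rate constants are invented.

    ```python
    # Stationary LNA covariance for a two-stage gene expression model:
    # mRNA made at k_m, degraded at g_m * m; protein made at k_p * m,
    # degraded at g_p * p. Covariance solves J C + C J^T + D = 0.
    import numpy as np
    from scipy.linalg import solve_continuous_lyapunov

    k_m, g_m = 2.0, 0.2    # mRNA synthesis / degradation (illustrative)
    k_p, g_p = 5.0, 0.1    # protein synthesis per mRNA / degradation

    m_ss = k_m / g_m               # steady-state mRNA level
    p_ss = k_p * m_ss / g_p        # steady-state protein level

    J = np.array([[-g_m, 0.0],     # Jacobian of the macroscopic rate equations
                  [k_p, -g_p]])
    D = np.diag([k_m + g_m * m_ss,            # diffusion matrix: summed fluxes
                 k_p * m_ss + g_p * p_ss])

    C = solve_continuous_lyapunov(J, -D)      # solves J C + C J^T = -D
    print("stationary covariance:\n", C)
    print(f"protein Fano factor ~ {C[1, 1] / p_ss:.2f}")
    ```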

  16. Exploring gravitational lensing model variations in the Frontier Fields galaxy clusters

    NASA Astrophysics Data System (ADS)

    Harris James, Nicholas John; Raney, Catie; Brennan, Sean; Keeton, Charles

    2018-01-01

    Multiple groups have been working on modeling the mass distributions of the six lensing galaxy clusters in the Hubble Space Telescope Frontier Fields data set. The magnification maps produced from these mass models will be important for the future study of the lensed background galaxies, but there exists significant variation between the different groups' models and magnification maps. We explore the use of two-dimensional histograms as a tool for visualizing these magnification map variations. Using a number of simple, one- or two-halo singular isothermal sphere models, we explore the features that are produced in 2D histogram model comparisons when parameters such as halo mass, ellipticity, and location are allowed to vary. Our analysis demonstrates the potential of 2D histograms as a means of observing the full range of differences between the Frontier Fields groups' models. This work has been supported by funding from National Science Foundation grants PHY-1560077 and AST-1211385, and from the Space Telescope Science Institute.

  17. Stochastic Multiscale Analysis and Design of Engine Disks

    DTIC Science & Technology

    2010-07-28

    Presentation excerpt: ...shown recently to fail when used with data-driven non-linear stochastic input models (KPCA, IsoMap, etc.); need for scalable exascale computing algorithms. (Materials Process Design and Control Laboratory, Cornell University)

  18. Stochastic lumping analysis for linear kinetics and its application to the fluctuation relations between hierarchical kinetic networks.

    PubMed

    Deng, De-Ming; Chang, Cheng-Hung

    2015-05-14

    Conventional studies of biomolecular behaviors rely largely on the construction of kinetic schemes. Since the selection of these networks is not unique, a concern is raised whether and under which conditions hierarchical schemes can reveal the same experimentally measured fluctuating behaviors and unique fluctuation related physical properties. To clarify these questions, we introduce stochasticity into the traditional lumping analysis, generalize it from rate equations to chemical master equations and stochastic differential equations, and extract the fluctuation relations between kinetically and thermodynamically equivalent networks under intrinsic and extrinsic noises. The results provide a theoretical basis for the legitimate use of low-dimensional models in the studies of macromolecular fluctuations and, more generally, for exploring stochastic features in different levels of contracted networks in chemical and biological kinetic systems.

  19. A Stochastic-Variational Model for Soft Mumford-Shah Segmentation

    PubMed Central

    2006-01-01

    In contemporary image and vision analysis, stochastic approaches demonstrate great flexibility in representing and modeling complex phenomena, while variational-PDE methods gain enormous computational advantages over Monte Carlo or other stochastic algorithms. In combination, the two can lead to much more powerful novel models and efficient algorithms. In the current work, we propose a stochastic-variational model for soft (or fuzzy) Mumford-Shah segmentation of mixture image patterns. Unlike the classical hard Mumford-Shah segmentation, the new model allows each pixel to belong to each image pattern with some probability. Soft segmentation could lead to hard segmentation, and hence is more general. The modeling procedure, mathematical analysis on the existence of optimal solutions, and computational implementation of the new model are explored in detail, and numerical examples of both synthetic and natural images are presented. PMID:23165059

  20. The Center for Frontiers of Subsurface Energy Security (A "Life at the Frontiers of Energy Research" contest entry from the 2011 Energy Frontier Research Centers (EFRCs) Summit and Forum)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pope, Gary A.

    "The Center for Frontiers of Subsurface Energy Security (CFSES)" was submitted to the "Life at the Frontiers of Energy Research" video contest at the 2011 Science for Our Nation's Energy Future: Energy Frontier Research Centers (EFRCs) Summit and Forum. Twenty-six EFRCs created short videos to highlight their mission and their work. CFSES is directed by Gary A. Pope at the University of Texas at Austin and partners with Sandia National Laboratories. The Office of Basic Energy Sciences in the U.S. Department of Energy's Office of Science established the 46 Energy Frontier Research Centers (EFRCs) in 2009. These collaboratively-organized centers conductmore » fundamental research focused on 'grand challenges' and use-inspired 'basic research needs' recently identified in major strategic planning efforts by the scientific community. The overall purpose is to accelerate scientific progress toward meeting the nation's critical energy challenges.« less

  1. The Center for Frontiers of Subsurface Energy Security (A "Life at the Frontiers of Energy Research" contest entry from the 2011 Energy Frontier Research Centers (EFRCs) Summit and Forum)

    ScienceCinema

    Pope, Gary A. (Director, Center for Frontiers of Subsurface Energy Security); CFSES Staff

    2017-12-09

    'The Center for Frontiers of Subsurface Energy Security (CFSES)' was submitted to the 'Life at the Frontiers of Energy Research' video contest at the 2011 Science for Our Nation's Energy Future: Energy Frontier Research Centers (EFRCs) Summit and Forum. Twenty-six EFRCs created short videos to highlight their mission and their work. CFSES is directed by Gary A. Pope at the University of Texas at Austin and partners with Sandia National Laboratories. The Office of Basic Energy Sciences in the U.S. Department of Energy's Office of Science established the 46 Energy Frontier Research Centers (EFRCs) in 2009. These collaboratively-organized centers conduct fundamental research focused on 'grand challenges' and use-inspired 'basic research needs' recently identified in major strategic planning efforts by the scientific community. The overall purpose is to accelerate scientific progress toward meeting the nation's critical energy challenges.

  2. Energy Frontier Research Center Materials Science of Actinides (A "Life at the Frontiers of Energy Research" contest entry from the 2011 Energy Frontier Research Centers (EFRCs) Summit and Forum)

    ScienceCinema

    Burns, Peter (Director, Materials Science of Actinides); MSA Staff

    2017-12-09

    'Energy Frontier Research Center Materials Science of Actinides' was submitted by the EFRC for Materials Science of Actinides (MSA) to the 'Life at the Frontiers of Energy Research' video contest at the 2011 Science for Our Nation's Energy Future: Energy Frontier Research Centers (EFRCs) Summit and Forum. Twenty-six EFRCs created short videos to highlight their mission and their work. MSA is directed by Peter Burns at the University of Notre Dame, and is a partnership of scientists from ten institutions. The Office of Basic Energy Sciences in the U.S. Department of Energy's Office of Science established the 46 Energy Frontier Research Centers (EFRCs) in 2009. These collaboratively-organized centers conduct fundamental research focused on 'grand challenges' and use-inspired 'basic research needs' recently identified in major strategic planning efforts by the scientific community. The overall purpose is to accelerate scientific progress toward meeting the nation's critical energy challenges.

  3. Unifying physical concepts of reality

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gilbert, T.L.

    1983-08-01

    Physics may be characterized as the science of matter and energy. It anchors the two ends of the frontiers of science: the frontier of the very small and the frontier of the very large. All of the phenomena that we observe and study at the frontiers of science - all external experiences - are manifestations of matter and energy. One may, therefore, use physics to exemplify both the diversity and unity of science. This theme will be developed in two separate examples: first by sketching, very briefly, the historical origins of the frontiers of the very small and very large and the converging unity of these two frontiers; and then by describing certain unifying concepts that play a central role in physics and provide a framework for relating developments in different sciences.

  4. An efficient computational method for solving nonlinear stochastic Itô integral equations: Application for stochastic problems in physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heydari, M.H., E-mail: heydari@stu.yazd.ac.ir; The Laboratory of Quantum Information Processing, Yazd University, Yazd; Hooshmandasl, M.R., E-mail: hooshmandasl@yazd.ac.ir

    Because of the nonlinearity, closed-form solutions of many important stochastic functional equations are virtually impossible to obtain. Thus, numerical solutions are a viable alternative. In this paper, a new computational method based on the generalized hat basis functions together with their stochastic operational matrix of Itô-integration is proposed for solving nonlinear stochastic Itô integral equations on large intervals. In the proposed method, a new technique for computing nonlinear terms in such problems is presented. The main advantage of the proposed method is that it transforms the problems under consideration into nonlinear systems of algebraic equations which can be simply solved. Error analysis of the proposed method is investigated and the efficiency of this method is shown on some concrete examples. The obtained results reveal that the proposed method is very accurate and efficient. As two useful applications, the proposed method is applied to obtain approximate solutions of stochastic population growth models and a stochastic pendulum problem.
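
    As a point of reference for the population-growth application mentioned above, the sketch below integrates a logistic-type stochastic model with the standard Euler-Maruyama scheme. This is a baseline method, not the paper's hat-function operational-matrix technique, and the model parameters are illustrative.

    ```python
    # Euler-Maruyama for dX = r X (1 - X/K) dt + sigma X dW, a logistic
    # stochastic population growth model (baseline reference method only).
    import numpy as np

    rng = np.random.default_rng(0)
    r, K, sigma = 0.5, 100.0, 0.1     # growth rate, capacity, noise intensity
    T, n = 20.0, 2000
    dt = T / n

    x = np.empty(n + 1)
    x[0] = 10.0
    for i in range(n):
        drift = r * x[i] * (1.0 - x[i] / K)
        diffusion = sigma * x[i]
        x[i + 1] = x[i] + drift * dt + diffusion * rng.normal(0.0, np.sqrt(dt))

    print(f"population after T={T}: {x[-1]:.1f}")   # fluctuates near K = 100
    ```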

  5. Delay-distribution-dependent H∞ state estimation for delayed neural networks with (x,v)-dependent noises and fading channels.

    PubMed

    Sheng, Li; Wang, Zidong; Tian, Engang; Alsaadi, Fuad E

    2016-12-01

    This paper deals with the H∞ state estimation problem for a class of discrete-time neural networks with stochastic delays subject to state- and disturbance-dependent noises (also called (x,v)-dependent noises) and fading channels. The time-varying stochastic delay takes values on certain intervals with known probability distributions. The system measurement is transmitted through fading channels described by the Rice fading model. The aim of the addressed problem is to design a state estimator such that the estimation performance is guaranteed in the mean-square sense against admissible stochastic time-delays, stochastic noises as well as stochastic fading signals. By employing the stochastic analysis approach combined with the Kronecker product, several delay-distribution-dependent conditions are derived to ensure that the error dynamics of the neuron states is stochastically stable with prescribed H∞ performance. Finally, a numerical example is provided to illustrate the effectiveness of the obtained results. Copyright © 2016 Elsevier Ltd. All rights reserved.

  6. Stochastic modular analysis for gene circuits: interplay among retroactivity, nonlinearity, and stochasticity.

    PubMed

    Kim, Kyung Hyuk; Sauro, Herbert M

    2015-01-01

    This chapter introduces a computational analysis method for analyzing gene circuit dynamics in terms of modules while taking into account stochasticity, system nonlinearity, and retroactivity. (1) ANALOG ELECTRICAL CIRCUIT REPRESENTATION FOR GENE CIRCUITS: A connection between two gene circuit components is often mediated by a transcription factor (TF) and the connection signal is described by the TF concentration. The TF is sequestered to its specific binding site (promoter region) and regulates downstream transcription. This sequestration has been known to affect the dynamics of the TF by increasing its response time. The downstream effect, retroactivity, has been shown to be explicitly described in an electrical circuit representation as an input capacitance increase. We provide a brief review on this topic. (2) MODULAR DESCRIPTION OF NOISE PROPAGATION: Gene circuit signals are noisy due to the random nature of biological reactions. The noisy fluctuations in TF concentrations affect downstream regulation. Thus, noise can propagate throughout the connected system components. This can cause different circuit components to behave in a statistically dependent manner, hampering a modular analysis. Here, we show that the modular analysis is still possible at the linear noise approximation level. (3) NOISE EFFECT ON MODULE INPUT-OUTPUT RESPONSE: We investigate how to deal with a module input-output response and its noise dependency. Noise-induced phenotypes are described as an interplay between system nonlinearity and signal noise. Lastly, we provide a comprehensive approach incorporating the above three analysis methods, which we call "stochastic modular analysis." This method can provide an analysis framework for gene circuit dynamics when the nontrivial effects of retroactivity, stochasticity, and nonlinearity need to be taken into account.

  7. Global sensitivity analysis in stochastic simulators of uncertain reaction networks.

    PubMed

    Navarro Jimenez, M; Le Maître, O P; Knio, O M

    2016-12-28

    Stochastic models of chemical systems are often subjected to uncertainties in kinetic parameters in addition to the inherent random nature of their dynamics. Uncertainty quantification in such systems is generally achieved by means of sensitivity analyses in which one characterizes the variability of the first statistical moments of model predictions with respect to the uncertain kinetic parameters. In this work, we propose an original global sensitivity analysis method where the parametric and inherent variability sources are both treated through Sobol's decomposition of the variance into contributions from arbitrary subsets of uncertain parameters and stochastic reaction channels. The conceptual development only assumes that the inherent and parametric sources are independent, and considers the Poisson processes in the random-time-change representation of the state dynamics as the fundamental objects governing the inherent stochasticity. A sampling algorithm is proposed to perform the global sensitivity analysis, and to estimate the partial variances and sensitivity indices characterizing the importance of the various sources of variability and their interactions. The birth-death and Schlögl models are used to illustrate both the implementation of the algorithm and the richness of the proposed analysis method. The output of the proposed sensitivity analysis is also contrasted with a local derivative-based sensitivity analysis method classically used for this type of system.

  8. Global sensitivity analysis in stochastic simulators of uncertain reaction networks

    DOE PAGES

    Navarro Jimenez, M.; Le Maître, O. P.; Knio, O. M.

    2016-12-23

    Stochastic models of chemical systems are often subjected to uncertainties in kinetic parameters in addition to the inherent random nature of their dynamics. Uncertainty quantification in such systems is generally achieved by means of sensitivity analyses in which one characterizes the variability of the first statistical moments of model predictions with respect to the uncertain kinetic parameters. In this work, we propose an original global sensitivity analysis method where the parametric and inherent variability sources are both treated through Sobol's decomposition of the variance into contributions from arbitrary subsets of uncertain parameters and stochastic reaction channels. The conceptual development only assumes that the inherent and parametric sources are independent, and considers the Poisson processes in the random-time-change representation of the state dynamics as the fundamental objects governing the inherent stochasticity. Here, a sampling algorithm is proposed to perform the global sensitivity analysis, and to estimate the partial variances and sensitivity indices characterizing the importance of the various sources of variability and their interactions. The birth-death and Schlögl models are used to illustrate both the implementation of the algorithm and the richness of the proposed analysis method. The output of the proposed sensitivity analysis is also contrasted with a local derivative-based sensitivity analysis method classically used for this type of system.

  9. Global sensitivity analysis in stochastic simulators of uncertain reaction networks

    NASA Astrophysics Data System (ADS)

    Navarro Jimenez, M.; Le Maître, O. P.; Knio, O. M.

    2016-12-01

    Stochastic models of chemical systems are often subjected to uncertainties in kinetic parameters in addition to the inherent random nature of their dynamics. Uncertainty quantification in such systems is generally achieved by means of sensitivity analyses in which one characterizes the variability of the first statistical moments of model predictions with respect to the uncertain kinetic parameters. In this work, we propose an original global sensitivity analysis method where the parametric and inherent variability sources are both treated through Sobol's decomposition of the variance into contributions from arbitrary subsets of uncertain parameters and stochastic reaction channels. The conceptual development only assumes that the inherent and parametric sources are independent, and considers the Poisson processes in the random-time-change representation of the state dynamics as the fundamental objects governing the inherent stochasticity. A sampling algorithm is proposed to perform the global sensitivity analysis, and to estimate the partial variances and sensitivity indices characterizing the importance of the various sources of variability and their interactions. The birth-death and Schlögl models are used to illustrate both the implementation of the algorithm and the richness of the proposed analysis method. The output of the proposed sensitivity analysis is also contrasted with a local derivative-based sensitivity analysis method classically used for this type of system.
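
    As a minimal illustration of the variance decomposition underlying this method, the sketch below estimates first-order Sobol indices for a toy deterministic function with the standard pick-freeze (Saltelli) Monte Carlo estimator; the paper's extension, which treats stochastic reaction channels as additional inputs, is not reproduced here.

    ```python
    # Pick-freeze Monte Carlo estimate of first-order Sobol indices for a
    # toy function f(x) = x1 + x2^2 + x1*x3 with independent U(-1,1) inputs.
    import numpy as np

    rng = np.random.default_rng(0)

    def model(x):
        return x[:, 0] + x[:, 1] ** 2 + x[:, 0] * x[:, 2]

    N, d = 100_000, 3
    A = rng.uniform(-1.0, 1.0, (N, d))
    B = rng.uniform(-1.0, 1.0, (N, d))
    fA, fB = model(A), model(B)
    var = np.var(np.concatenate([fA, fB]))

    for i in range(d):
        ABi = A.copy()
        ABi[:, i] = B[:, i]        # freeze input i from B, vary the rest
        S_i = np.mean(fB * (model(ABi) - fA)) / var   # Saltelli-type estimator
        print(f"first-order Sobol index S_{i + 1} ~ {S_i:.2f}")
    # Analytic values for this toy function: S_1 = 0.625, S_2 ~ 0.167, S_3 = 0.
    ```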

  10. Using stochastic models to incorporate spatial and temporal variability [Exercise 14

    Treesearch

    Carolyn Hull Sieg; Rudy M. King; Fred Van Dyke

    2003-01-01

    To this point, our analysis of population processes and viability in the western prairie fringed orchid has used only deterministic models. In this exercise, we conduct a similar analysis, using a stochastic model instead. This distinction is of great importance to population biology in general and to conservation biology in particular. In deterministic models,...

  11. Nonlinear Image Denoising Methodologies

    DTIC Science & Technology

    2002-05-01

    [Table-of-contents fragment: 5.3 A Multiscale Approach to Scale-Space Analysis; 5.4 ...] In this thesis, our approach to denoising is first based on a controlled nonlinear stochastic random walk to achieve a scale-space analysis (as in... stochastic treatment or interpretation of the diffusion). In addition, unless a specific stopping time is known to be adequate, the resulting evolution...

  12. Fast smooth second-order sliding mode control for stochastic systems with enumerable coloured noises

    NASA Astrophysics Data System (ADS)

    Yang, Peng-fei; Fang, Yang-wang; Wu, You-li; Zhang, Dan-xu; Xu, Yang

    2018-01-01

    A fast smooth second-order sliding mode control is presented for a class of stochastic systems driven by enumerable Ornstein-Uhlenbeck coloured noises with time-varying coefficients. Instead of treating the noise as a bounded disturbance, stochastic control techniques are incorporated into the design of the control. The notions of finite-time mean-square practical stability and finite-time mean-square practical reachability are first introduced. Then the prescribed sliding variable dynamic is presented. The sufficient condition guaranteeing its finite-time convergence is given and proved using stochastic Lyapunov-like techniques. The proposed sliding mode controller is applied to a second-order nonlinear stochastic system. Simulation results comparing the proposed controller with a smooth second-order sliding mode control are given to validate the analysis.

  13. Stochastic and Statistical Analysis of Utility Revenues and Weather Data Analysis for Consumer Demand Estimation in Smart Grids

    PubMed Central

    Ali, S. M.; Mehmood, C. A; Khan, B.; Jawad, M.; Farid, U; Jadoon, J. K.; Ali, M.; Tareen, N. K.; Usman, S.; Majid, M.; Anwar, S. M.

    2016-01-01

    In the smart grid paradigm, consumer demands are random and time-dependent, following stochastic probability distributions. The stochastically varying consumer demands have put policy makers and supplying agencies in a demanding position for optimal generation management. The utility revenue functions are highly dependent on the consumers' deterministic and stochastic demand models. Sudden drifts in weather parameters affect the living standards of the consumers, which in turn influence the power demands. Considering the above, we stochastically and statistically analyzed the effect of random consumer demands on the fixed and variable revenues of the electrical utilities. Our work presented a Multi-Variate Gaussian Distribution Function (MVGDF) probabilistic model of the utility revenues with time-dependent consumer random demands. Moreover, the Gaussian probability outcomes of the utility revenues are based on the varying consumer demand data patterns. Furthermore, Standard Monte Carlo (SMC) simulations were performed to validate the accuracy of the aforesaid probabilistic demand-revenue model. We critically analyzed the effect of weather data parameters on consumer demands using correlation and multi-linear regression schemes. The statistical analysis of consumer demands provided a relationship between the dependent variable (demand) and independent variables (weather data) for utility load management, generation control, and network expansion. PMID:27314229

  14. Stochastic and Statistical Analysis of Utility Revenues and Weather Data Analysis for Consumer Demand Estimation in Smart Grids.

    PubMed

    Ali, S M; Mehmood, C A; Khan, B; Jawad, M; Farid, U; Jadoon, J K; Ali, M; Tareen, N K; Usman, S; Majid, M; Anwar, S M

    2016-01-01

    In the smart grid paradigm, consumer demands are random and time-dependent, following stochastic probability distributions. The stochastically varying consumer demands have put policy makers and supplying agencies in a demanding position for optimal generation management. The utility revenue functions are highly dependent on the consumers' deterministic and stochastic demand models. Sudden drifts in weather parameters affect the living standards of the consumers, which in turn influence the power demands. Considering the above, we stochastically and statistically analyzed the effect of random consumer demands on the fixed and variable revenues of the electrical utilities. Our work presented a Multi-Variate Gaussian Distribution Function (MVGDF) probabilistic model of the utility revenues with time-dependent consumer random demands. Moreover, the Gaussian probability outcomes of the utility revenues are based on the varying consumer demand data patterns. Furthermore, Standard Monte Carlo (SMC) simulations were performed to validate the accuracy of the aforesaid probabilistic demand-revenue model. We critically analyzed the effect of weather data parameters on consumer demands using correlation and multi-linear regression schemes. The statistical analysis of consumer demands provided a relationship between the dependent variable (demand) and independent variables (weather data) for utility load management, generation control, and network expansion.
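
    The core computation described above can be sketched as follows: sample correlated, time-dependent demands from a multivariate Gaussian and propagate them through a fixed-plus-variable tariff. The demand statistics and tariff numbers below are hypothetical.

    ```python
    # Monte Carlo propagation of multivariate-Gaussian consumer demands
    # through a simple fixed-plus-variable utility tariff (all numbers
    # hypothetical; illustrative of the MVGDF idea, not the paper's model).
    import numpy as np

    rng = np.random.default_rng(42)

    mean_demand = np.array([120.0, 180.0, 150.0])    # MW over three daily periods
    cov = np.array([[100.0, 60.0, 30.0],             # correlated demand periods
                    [60.0, 144.0, 50.0],
                    [30.0, 50.0, 121.0]])

    fixed_charge = 5_000.0     # fixed revenue component per day (hypothetical)
    price_per_mwh = 45.0       # variable tariff (hypothetical)

    demands = rng.multivariate_normal(mean_demand, cov, size=100_000)
    revenue = fixed_charge + price_per_mwh * demands.sum(axis=1)

    print(f"expected daily revenue: ${revenue.mean():,.0f}")
    print(f"5th-95th percentile: ${np.percentile(revenue, 5):,.0f} - "
          f"${np.percentile(revenue, 95):,.0f}")
    ```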

  15. Bond-based linear indices of the non-stochastic and stochastic edge-adjacency matrix. 1. Theory and modeling of ChemPhys properties of organic molecules.

    PubMed

    Marrero-Ponce, Yovani; Martínez-Albelo, Eugenio R; Casañola-Martín, Gerardo M; Castillo-Garit, Juan A; Echevería-Díaz, Yunaimy; Zaldivar, Vicente Romero; Tygat, Jan; Borges, José E Rodriguez; García-Domenech, Ramón; Torrens, Francisco; Pérez-Giménez, Facundo

    2010-11-01

    Novel bond-level molecular descriptors are proposed, based on linear maps similar to the ones defined in algebra theory. The kth edge-adjacency matrix (E(k)) denotes the matrix of bond linear indices (non-stochastic) with regard to the canonical basis set. The kth stochastic edge-adjacency matrix, ES(k), is here proposed as a new molecular representation easily calculated from E(k). Then, the kth stochastic bond linear indices are calculated using ES(k) as operators of linear transformations. In both cases, the bond-type formalism is developed. The kth non-stochastic and stochastic total linear indices are calculated by adding the kth non-stochastic and stochastic bond linear indices, respectively, of all bonds in the molecule. First, the new bond-based molecular descriptors (MDs) are tested for suitability for QSPRs by analyzing regressions of the novel indices for selected physicochemical properties of octane isomers (first round). The general performance of the new descriptors in these QSPR studies is evaluated with regard to the well-known sets of 2D/3D MDs. From the analysis, we can conclude that the non-stochastic and stochastic bond-based linear indices have an overall good modeling capability, proving their usefulness in QSPR studies. Later, the novel bond-level MDs are also used for the description and prediction of the boiling point of 28 alkyl-alcohols (second round), and for the modeling of the specific rate constant (log k), partition coefficient (log P), as well as the antibacterial activity of 34 derivatives of 2-furylethylenes (third round). The comparison with other approaches (edge- and vertices-based connectivity indices, total and local spectral moments, and quantum chemical descriptors as well as E-state/biomolecular encounter parameters) shows the good behavior of our method in these QSPR studies. Finally, the approach described in this study appears to be a very promising structural invariant, useful not only for QSPR studies but also for similarity/diversity analysis and drug discovery protocols.

  16. The cardiorespiratory interaction: a nonlinear stochastic model and its synchronization properties

    NASA Astrophysics Data System (ADS)

    Bahraminasab, A.; Kenwright, D.; Stefanovska, A.; McClintock, P. V. E.

    2007-06-01

    We address the problem of interactions between the phases of the cardiac and respiratory oscillatory components. The coupling between these two quantities is investigated experimentally using the theory of stochastic Markovian processes. The so-called Markov analysis allows us to derive nonlinear stochastic equations for the reconstruction of the cardiorespiratory signals. The properties of these equations provide interesting new insights into the strength and direction of coupling, which enable us to divide the couplings into two parts: deterministic and stochastic. It is shown that the synchronization behaviours of the reconstructed signals are statistically identical to those of the original ones.
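
    The conditional-averaging step behind such a Markov analysis can be sketched compactly: estimate drift and diffusion coefficients by binning increments of the measured series. The demonstration below uses synthetic Ornstein-Uhlenbeck data rather than cardiorespiratory recordings.

    ```python
    # Kramers-Moyal estimation of drift and diffusion from a time series,
    # on synthetic OU data dx = -x dt + sqrt(2) dW (drift -x, diffusion 1).
    import numpy as np

    rng = np.random.default_rng(0)
    dt, n = 0.01, 200_000
    x = np.empty(n)
    x[0] = 0.0
    for i in range(n - 1):
        x[i + 1] = x[i] - x[i] * dt + np.sqrt(2.0 * dt) * rng.normal()

    dx = np.diff(x)
    bins = np.linspace(-2.0, 2.0, 21)
    idx = np.digitize(x[:-1], bins)
    for b in (5, 10, 15):                          # a few interior bins
        sel = idx == b
        drift = dx[sel].mean() / dt                # 1st Kramers-Moyal coefficient
        diff = (dx[sel] ** 2).mean() / (2.0 * dt)  # 2nd coefficient
        center = 0.5 * (bins[b - 1] + bins[b])
        print(f"x~{center:+.1f}: drift~{drift:+.2f} (exact {-center:+.2f}), "
              f"diffusion~{diff:.2f} (exact 1.00)")
    ```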

  17. Efficient rehabilitation care for joint replacement patients: skilled nursing facility or inpatient rehabilitation facility?

    PubMed

    Tian, Wenqiang; DeJong, Gerben; Horn, Susan D; Putman, Koen; Hsieh, Ching-Hui; DaVanzo, Joan E

    2012-01-01

    There has been lengthy debate as to which setting, skilled nursing facility (SNF) or inpatient rehabilitation facility (IRF), is more efficient in treating joint replacement patients. This study aims to determine the efficiency of rehabilitation care provided by SNF and IRF to joint replacement patients with respect to both payment and length of stay (LOS). This study used a prospective multisite observational cohort design. Tobit models were used to examine the association between setting of care and efficiency. The study enrolled 948 knee replacement patients and 618 hip replacement patients from 11 IRFs and 7 SNFs between February 2006 and February 2007. Output was measured by motor functional independence measure (FIM) score at discharge. Efficiency was measured in 3 ways: payment efficiency, LOS efficiency, and stochastic frontier analysis efficiency. IRF patients incurred higher expenditures per case but also achieved larger motor FIM gains in shorter LOS than did SNF patients. Setting of care was not a strong predictor of overall efficiency of rehabilitation care. Great variation in characteristics existed within IRFs or SNFs and severity groups. Medium-volume facilities among both SNFs and IRFs were most efficient. Early rehabilitation was consistently predictive of efficient treatment. The advantage of either setting is not clear-cut. Definition of efficiency depends in part on preference between cost and time. SNFs are more payment efficient; IRFs are more LOS efficient. Variation within SNFs and IRFs blurred setting differences; a simple comparison between SNF and IRF may not be appropriate.
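
    Since stochastic frontier analysis is one of the three efficiency measures used here, the sketch below fits a generic cross-sectional production frontier with a normal/half-normal composed error by maximum likelihood on synthetic data. It illustrates the technique only; it is not the study's specification, and the variable names are invented.

    ```python
    # Stochastic frontier y = b0 + b1*x + v - u with v ~ N(0, s_v^2) and
    # u ~ |N(0, s_u^2)|, fit by MLE (Aigner-Lovell-Schmidt log-likelihood).
    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import norm

    rng = np.random.default_rng(1)
    n = 500
    x = rng.uniform(0.0, 2.0, n)             # log input (hypothetical)
    v = rng.normal(0.0, 0.2, n)              # symmetric noise
    u = np.abs(rng.normal(0.0, 0.4, n))      # one-sided inefficiency
    y = 1.0 + 0.8 * x + v - u                # log output (hypothetical)

    def neg_loglik(theta):
        b0, b1, log_sv, log_su = theta
        sv, su = np.exp(log_sv), np.exp(log_su)
        sigma, lam = np.hypot(sv, su), su / sv
        eps = y - b0 - b1 * x
        ll = (np.log(2.0 / sigma) + norm.logpdf(eps / sigma)
              + norm.logcdf(-eps * lam / sigma))
        return -ll.sum()

    res = minimize(neg_loglik, x0=[0.0, 0.5, np.log(0.3), np.log(0.3)],
                   method="Nelder-Mead")
    b0, b1, log_sv, log_su = res.x
    print(f"frontier: y = {b0:.2f} + {b1:.2f} x, "
          f"sigma_v = {np.exp(log_sv):.2f}, sigma_u = {np.exp(log_su):.2f}")
    ```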

  18. Individualism in plant populations: using stochastic differential equations to model individual neighbourhood-dependent plant growth.

    PubMed

    Lv, Qiming; Schneider, Manuel K; Pitchford, Jonathan W

    2008-08-01

    We study individual plant growth and size hierarchy formation in an experimental population of Arabidopsis thaliana, within an integrated analysis that explicitly accounts for size-dependent growth, size- and space-dependent competition, and environmental stochasticity. It is shown that a Gompertz-type stochastic differential equation (SDE) model, involving asymmetric competition kernels and a stochastic term which decreases with the logarithm of plant weight, efficiently describes individual plant growth, competition, and variability in the studied population. The model is evaluated within a Bayesian framework and compared to its deterministic counterpart, and to several simplified stochastic models, using distributional validation. We show that stochasticity is an important determinant of size hierarchy and that SDE models outperform the deterministic model if and only if structural components of competition (asymmetry; size- and space-dependence) are accounted for. Implications of these results are discussed in the context of plant ecology and in more general modelling situations.

  19. A New Methodology for Open Pit Slope Design in Karst-Prone Ground Conditions Based on Integrated Stochastic-Limit Equilibrium Analysis

    NASA Astrophysics Data System (ADS)

    Zhang, Ke; Cao, Ping; Ma, Guowei; Fan, Wenchen; Meng, Jingjing; Li, Kaihui

    2016-07-01

    Using the Chengmenshan Copper Mine as a case study, a new methodology for open pit slope design in karst-prone ground conditions is presented based on integrated stochastic-limit equilibrium analysis. The numerical modeling and optimization design procedure comprises drill core data collection, karst cave stochastic model generation, SLIDE simulation and bisection method optimization. Borehole investigations are performed, and the statistical result shows that the length of the karst cave fits a negative exponential distribution model, but the length of carbonatite does not exactly follow any standard distribution. The inverse transform method and acceptance-rejection method are used to reproduce the length of the karst cave and carbonatite, respectively. A code for karst cave stochastic model generation, named KCSMG, is developed. The stability of the rock slope with the karst cave stochastic model is analyzed by combining the KCSMG code and the SLIDE program. This approach is then applied to study the effect of the karst cave on the stability of the open pit slope, and a procedure to optimize the open pit slope angle is presented.
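
    The two sampling techniques named above can be sketched as follows. The exponential cave-length model follows the abstract; the carbonatite density used here is a hypothetical stand-in, since the paper reports that no standard distribution fits.

    ```python
    # Inverse-transform sampling for exponential karst cave lengths, and
    # acceptance-rejection for a non-standard (here hypothetical, bimodal,
    # unnormalized) carbonatite length density.
    import numpy as np

    rng = np.random.default_rng(7)

    def cave_lengths(mean_len, size):
        """Inverse transform: -mean*ln(1-U) is Exp(mean) for U ~ Uniform(0,1)."""
        u = rng.uniform(0.0, 1.0, size)
        return -mean_len * np.log(1.0 - u)

    def carbonatite_lengths(pdf, x_max, pdf_max, size):
        """Acceptance-rejection against a uniform envelope on [0, x_max]."""
        out = []
        while len(out) < size:
            x = rng.uniform(0.0, x_max)
            if rng.uniform(0.0, pdf_max) <= pdf(x):
                out.append(x)
        return np.array(out)

    # Hypothetical density that fits no standard family (need not be normalized).
    bimodal = lambda s: 0.6 * np.exp(-(s - 2.0) ** 2) + 0.4 * np.exp(-(s - 6.0) ** 2 / 4.0)

    caves = cave_lengths(mean_len=3.5, size=10_000)
    carb = carbonatite_lengths(bimodal, x_max=12.0, pdf_max=0.7, size=10_000)
    print(f"cave length mean ~ {caves.mean():.2f} (target 3.50)")
    print(f"carbonatite sample mean ~ {carb.mean():.2f}")
    ```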

  20. Forecasting the Relative and Cumulative Effects of Multiple Stressors on At-risk Populations

    DTIC Science & Technology

    2011-08-01

    Presentation excerpt: model components include Vitals (observed vital rates), Movement, Ranges, Barriers (barrier interactions), and Stochasticity (a time series of stochasticity indices...). Outputs of the Simulation Viewer are themselves stochastic and can change each time it is run. If multiple Census events are present in the life... 30-year period. A monthly time series was generated for the 20th century using monthly anomalies for temperature, precipitation, and percent...

  1. [Gene method for inconsistent hydrological frequency calculation. I: Inheritance, variability and evolution principles of hydrological genes].

    PubMed

    Xie, Ping; Wu, Zi Yi; Zhao, Jiang Yan; Sang, Yan Fang; Chen, Jie

    2018-04-01

    A stochastic hydrological process is influenced by both stochastic and deterministic factors. A hydrological time series contains not only pure random components reflecting its inheritance characteristics, but also deterministic components reflecting variability characteristics, such as jump, trend, period, and stochastic dependence. As a result, the stochastic hydrological process presents complicated evolution phenomena and rules. To better understand these complicated phenomena and rules, this study described the inheritance and variability characteristics of an inconsistent hydrological series from two aspects: stochastic process simulation and time series analysis. In addition, several frequency analysis approaches for inconsistent time series were compared to reveal the main problems in inconsistency studies. Then, we proposed a new concept of hydrological genes, originating from biological genes, to describe the inconsistent hydrological processes. The hydrological genes were constructed using moment methods, such as general moments, weight function moments, probability weight moments and L-moments. Meanwhile, the five components, including jump, trend, periodic, dependence and pure random components, of a stochastic hydrological process were defined as five hydrological bases. With this method, the inheritance and variability of inconsistent hydrological time series were comprehensively considered and the inheritance, variability and evolution principles were fully described. Our study contributes to revealing the inheritance, variability and evolution principles in the probability distribution of hydrological elements.

  2. A brief history of Forging New Frontiers, the annual conference of the Injury Free Coalition for Kids.

    PubMed

    Johnson, Estell Lenita; Barlow, Barbara

    2016-10-01

    The Injury Free Coalition for Kids Annual Conference has contributed to the dissemination of information pertaining to the development of the field of injury prevention. A content analysis was completed using conference agendas from 2005 through 2015, finding that more than 398 presentations covering a wide variety of injuries have taken place. Published work has appeared in the Journal of Trauma, and there has been recognition of people who have contributed to the development of the field. Forging New Frontiers is a valuable tool for attendees to exchange information about injury prevention.

  3. The improving efficiency frontier of religious not-for-profit hospitals.

    PubMed

    Harrison, Jeffrey P; Sexton, Christopher

    2006-01-01

    By using data-envelopment analysis (DEA), this study evaluates the efficiency of religious not-for-profit hospitals. Hospital executives, healthcare policy makers, taxpayers, and other stakeholders benefit from studies that improve hospital efficiency. Results indicate that overall efficiency in religious hospitals improved from 72% in 1998 to 74% in 2001. What is more important is that the number of religious hospitals operating on the efficiency frontier increased from 40 in 1998 to 47 in 2001. This clearly documents that religious hospitals are becoming more efficient in the management of resources. From a policy perspective, this study highlights the economic importance of encouraging increased efficiency throughout the healthcare industry.

  4. Measurement of Low Carbon Economy Efficiency with a Three-Stage Data Envelopment Analysis: A Comparison of the Largest Twenty CO2 Emitting Countries

    PubMed Central

    Liu, Xiang; Liu, Jia

    2016-01-01

    This paper employs a three-stage approach to estimate low carbon economy efficiency in the largest twenty CO2 emitting countries from 2000 to 2012. The approach includes the following three stages: (1) use of a data envelopment analysis (DEA) model with undesirable output to estimate the low carbon economy efficiency and calculate the input and output slacks; (2) use of a stochastic frontier approach to eliminate the impacts of external environment variables on these slacks; (3) re-estimation of the efficiency with adjusted inputs and outputs to reflect the capacity of the government to develop a low carbon economy. The results indicate that the low carbon economy efficiency performances in these countries worsened during the studied period. The efficiency scores in the third stage are larger than those in the first stage. Moreover, in general, low carbon economy efficiency in Annex I countries of the United Nations Framework Convention on Climate Change (UNFCCC) is better than that in Non-Annex I countries. However, the gap in the average efficiency score between Annex I and Non-Annex I countries in the first stage is smaller than that in the third stage. This implies that the external environment variables have a greater influence on Non-Annex I countries than on Annex I countries. These external environment variables should be taken into account in the transnational negotiation of the responsibility for promoting CO2 reductions. Most importantly, the developed countries (mostly in Annex I) should help the developing countries (mostly in Non-Annex I) to reduce carbon emissions by opening or expanding trade, for example by encouraging the import and export of energy-saving technology and the sharing of emission reduction technology. PMID:27834890

  5. Measurement of Low Carbon Economy Efficiency with a Three-Stage Data Envelopment Analysis: A Comparison of the Largest Twenty CO₂ Emitting Countries.

    PubMed

    Liu, Xiang; Liu, Jia

    2016-11-09

    This paper employs a three-stage approach to estimate low carbon economy efficiency in the largest twenty CO₂ emitting countries from 2000 to 2012. The approach includes the following three stages: (1) use of a data envelopment analysis (DEA) model with undesirable output to estimate the low carbon economy efficiency and calculate the input and output slacks; (2) use of a stochastic frontier approach to eliminate the impacts of external environment variables on these slacks; (3) re-estimation of the efficiency with adjusted inputs and outputs to reflect the capacity of the government to develop a low carbon economy. The results indicate that the low carbon economy efficiency performances in these countries worsened during the studied period. The efficiency scores in the third stage are larger than those in the first stage. Moreover, in general, low carbon economy efficiency in Annex I countries of the United Nations Framework Convention on Climate Change (UNFCCC) is better than that in Non-Annex I countries. However, the gap in the average efficiency score between Annex I and Non-Annex I countries in the first stage is smaller than that in the third stage. This implies that the external environment variables have a greater influence on Non-Annex I countries than on Annex I countries. These external environment variables should be taken into account in the transnational negotiation of the responsibility for promoting CO₂ reductions. Most importantly, the developed countries (mostly in Annex I) should help the developing countries (mostly in Non-Annex I) to reduce carbon emissions by opening or expanding trade, for example by encouraging the import and export of energy-saving technology and the sharing of emission reduction technology.
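
    Stage (1) of this approach amounts to solving one linear program per decision-making unit. The sketch below is a minimal input-oriented, constant-returns DEA on toy data; the undesirable-output treatment and the SFA adjustment of stages (2)-(3) are omitted.

    ```python
    # Input-oriented CCR DEA: for each DMU k, minimize theta subject to
    # sum_j lambda_j * x_j <= theta * x_k and sum_j lambda_j * y_j >= y_k.
    import numpy as np
    from scipy.optimize import linprog

    # rows = DMUs, columns = inputs X / outputs Y (toy numbers)
    X = np.array([[4.0, 3.0], [7.0, 3.0], [8.0, 1.0], [4.0, 2.0], [2.0, 4.0]])
    Y = np.array([[1.0], [1.0], [1.0], [1.0], [1.0]])
    n, m, s = X.shape[0], X.shape[1], Y.shape[1]

    for k in range(n):
        c = np.r_[1.0, np.zeros(n)]              # variables: [theta, lambdas]
        A_in = np.c_[-X[k], X.T]                 # inputs:  X^T l - theta x_k <= 0
        A_out = np.c_[np.zeros(s), -Y.T]         # outputs: -Y^T l <= -y_k
        res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                      b_ub=np.r_[np.zeros(m), -Y[k]],
                      bounds=[(0, None)] * (n + 1))
        print(f"DMU {k + 1}: efficiency = {res.x[0]:.3f}")
    ```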

  6. Security in the CernVM File System and the Frontier Distributed Database Caching System

    NASA Astrophysics Data System (ADS)

    Dykstra, D.; Blomer, J.

    2014-06-01

    Both the CernVM File System (CVMFS) and the Frontier Distributed Database Caching System (Frontier) distribute centrally updated data worldwide for LHC experiments using http proxy caches. Neither system provides privacy or access control on reading the data, but both control access to updates of the data and can guarantee the authenticity and integrity of the data transferred to clients over the internet. CVMFS has since its early days required digital signatures and secure hashes on all distributed data, and recently Frontier has added X.509-based authenticity and integrity checking. In this paper we detail and compare the security models of CVMFS and Frontier.

  7. Vibrational spectroscopic and DFT calculation studies of 2-amino-7-bromo-5-oxo-[1]benzopyrano [2,3-b]pyridine-3 carbonitrile.

    PubMed

    Premkumar, S; Jawahar, A; Mathavan, T; Kumara Dhas, M; Milton Franklin Benial, A

    2015-03-05

    The vibrational spectra of 2-amino-7-bromo-5-oxo-[1]benzopyrano[2,3-b]pyridine-3-carbonitrile (ABOBPC) were recorded using Fourier transform infrared and Fourier transform Raman spectrometers. The optimized structural parameters, vibrational frequencies, Mulliken atomic charge distribution, frontier molecular orbitals, thermodynamic properties, temperature dependence of thermodynamic parameters, first-order hyperpolarizability and natural bond orbital calculations of the molecule were computed using the Gaussian 09 program. The vibrational frequencies were assigned on the basis of a potential energy distribution calculation using the VEDA 4.0 program. The calculated first-order hyperpolarizability of the ABOBPC molecule was 6.908×10⁻³⁰ esu, which is 10.5 times greater than that of urea. The nonlinear optical activity of the molecule was also confirmed by frontier molecular orbital and natural bond orbital analyses. The frontier molecular orbital analysis shows that the molecule has a low energy gap, which leads to its high first-order hyperpolarizability. The natural bond orbital analysis indicates that the nonlinear optical activity of the molecule arises from π→π* transitions. The Mulliken atomic charge distribution confirms the presence of intramolecular charge transfer within the molecule. The reactive sites of the molecule were predicted from the molecular electrostatic potential contour map. The values of the thermodynamic parameters increase with increasing temperature. Copyright © 2014 Elsevier B.V. All rights reserved.

  8. Energy Frontier Research Center Materials Science of Actinides (A "Life at the Frontiers of Energy Research" contest entry from the 2011 Energy Frontier Research Centers (EFRCs) Summit and Forum)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burns, Peter; Lenzen, Meehan

    "Energy Frontier Research Center Materials Science of Actinides" was submitted by the EFRC for Materials Science of Actinides (MSA) to the "Life at the Frontiers of Energy Research" video contest at the 2011 Science for Our Nation's Energy Future: Energy Frontier Research Centers (EFRCs) Summit and Forum. Twenty-six EFRCs created short videos to highlight their mission and their work. MSA is directed by Peter Burns at the University of Notre Dame, and is a partnership of scientists from ten institutions.The Office of Basic Energy Sciences in the U.S. Department of Energy's Office of Science established the 46 Energy Frontier Researchmore » Centers (EFRCs) in 2009. These collaboratively-organized centers conduct fundamental research focused on 'grand challenges' and use-inspired 'basic research needs' recently identified in major strategic planning efforts by the scientific community. The overall purpose is to accelerate scientific progress toward meeting the nation's critical energy challenges.« less

  9. EFRC:CST at the University of Texas at Austin - A DOE Energy Frontier Research Center (A "Life at the Frontiers of Energy Research" contest entry from the 2011 Energy Frontier Research Centers (EFRCs) Summit and Forum)

    ScienceCinema

    Zhu, Xiaoyang (Director, Understanding Charge Separation and Transfer at Interfaces in Energy Materials); CST Staff

    2017-12-09

    'EFRC:CST at the University of Texas at Austin - A DOE Energy Frontier Research Center' was submitted by the EFRC for Understanding Charge Separation and Transfer at Interfaces in Energy Materials (EFRC:CST) to the 'Life at the Frontiers of Energy Research' video contest at the 2011 Science for Our Nation's Energy Future: Energy Frontier Research Centers (EFRCs) Summit and Forum. Twenty-six EFRCs created short videos to highlight their mission and their work. EFRC:CST is directed by Xiaoyang Zhu at the University of Texas at Austin in partnership with Sandia National Laboratories. The Office of Basic Energy Sciences in the U.S. Department of Energy's Office of Science established the 46 Energy Frontier Research Centers (EFRCs) in 2009. These collaboratively-organized centers conduct fundamental research focused on 'grand challenges' and use-inspired 'basic research needs' recently identified in major strategic planning efforts by the scientific community. The overall purpose is to accelerate scientific progress toward meeting the nation's critical energy challenges.

  10. Stochastic analysis of uncertain thermal parameters for random thermal regime of frozen soil around a single freezing pipe

    NASA Astrophysics Data System (ADS)

    Wang, Tao; Zhou, Guoqing; Wang, Jianzhou; Zhou, Lei

    2018-03-01

    The artificial ground freezing (AGF) method is widely used in civil and mining engineering, and the thermal regime of the frozen soil around the freezing pipe affects the safety of design and construction. The thermal parameters can be truly random due to the heterogeneity of soil properties, which leads to randomness in the thermal regime of the frozen soil around the freezing pipe. The purpose of this paper is to study the one-dimensional (1D) random thermal regime problem on the basis of a stochastic analysis model and the Monte Carlo (MC) method. Treating the uncertain thermal parameters of frozen soil as random variables, stochastic processes and random fields, the corresponding stochastic thermal regimes of the frozen soil around a single freezing pipe are obtained and analyzed. By accounting for the variability of each stochastic parameter individually, the influence of each stochastic thermal parameter on the stochastic thermal regime is investigated. The results show that the mean temperatures of the frozen soil around the single freezing pipe are the same for the three analogy methods, while the standard deviations differ. The distributions of the standard deviation differ greatly across radial coordinate locations, and the larger standard deviations occur mainly in the phase change area. The results computed with the random variable and stochastic process methods differ greatly from the measured data, while those computed with the random field method agree well with the measurements. Each uncertain thermal parameter has a different effect on the standard deviation of the frozen soil temperature around the single freezing pipe. These results can provide a theoretical basis for the design and construction of AGF.

  11. Integrated seismic stochastic inversion and multi-attributes to delineate reservoir distribution: Case study MZ fields, Central Sumatra Basin

    NASA Astrophysics Data System (ADS)

    Haris, A.; Novriyani, M.; Suparno, S.; Hidayat, R.; Riyanto, A.

    2017-07-01

    This study presents the integration of seismic stochastic inversion and multi-attribute analysis for delineating the reservoir distribution, in terms of lithology and porosity, in the formation within the depth interval between the Top Sihapas and the Top Pematang. The method is a stochastic inversion integrated with seismic multi-attributes through a probabilistic neural network (PNN). Stochastic methods are used to produce probability maps of sandstone from the impedance, varied over 50 realizations, which yields a robust probability estimate. Stochastic seismic inversion is more interpretive because it directly gives the value of the property. Our experiment shows that the acoustic impedance (AI) from stochastic inversion captures more diverse uncertainty, so the probability values are close to the actual values. The resulting AI is then used as input to a multi-attribute analysis, which is used to predict the gamma ray, density and porosity logs. A stepwise regression algorithm is applied to select the attributes to be used, and the selected attributes feed the PNN process. The PNN method is chosen because it shows the best correlation among the neural network methods tested. Finally, we interpret the products of the multi-attribute analysis, in the form of pseudo-gamma ray, density and pseudo-porosity volumes, to delineate the reservoir distribution. Our interpretation shows that a structural trap is identified in the southeastern part of the study area, along the anticline.

  12. Sampling-Based Stochastic Sensitivity Analysis Using Score Functions for RBDO Problems with Correlated Random Variables

    DTIC Science & Technology

    2010-08-01

    This study presents a methodology for computing stochastic sensitivities with respect to the design variables.

  13. Forecasting transitions in systems with high-dimensional stochastic complex dynamics: a linear stability analysis of the tangled nature model.

    PubMed

    Cairoli, Andrea; Piovani, Duccio; Jensen, Henrik Jeldtoft

    2014-12-31

    We propose a new procedure to monitor and forecast the onset of transitions in high-dimensional complex systems. We describe our procedure by an application to the tangled nature model of evolutionary ecology. The quasistable configurations of the full stochastic dynamics are taken as input for a stability analysis by means of the deterministic mean-field equations. Numerical analysis of the high-dimensional stability matrix allows us to identify unstable directions associated with eigenvalues with a positive real part. The overlap of the instantaneous configuration vector of the full stochastic system with the eigenvectors of the unstable directions of the deterministic mean-field approximation is found to be a good early warning of the transitions occurring intermittently.
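
    The numerical core of this procedure, finding the unstable eigendirections of the mean-field Jacobian and projecting the instantaneous stochastic configuration onto them, fits in a short routine. The sketch below assumes a generic mean-field vector field `f` and a quasistable point `x_star`; it illustrates the idea and is not the tangled nature model itself.

        import numpy as np

        def unstable_overlap(f, x_star, x_now, eps=1e-6):
            """Fraction of the deviation (x_now - x_star) lying in the
            unstable subspace of the Jacobian of f at x_star."""
            n = x_star.size
            J = np.empty((n, n))
            for j in range(n):                 # finite-difference Jacobian
                dx = np.zeros(n)
                dx[j] = eps
                J[:, j] = (f(x_star + dx) - f(x_star - dx)) / (2 * eps)
            eigvals, eigvecs = np.linalg.eig(J)
            unstable = eigvecs[:, eigvals.real > 0]
            if unstable.shape[1] == 0:
                return 0.0                     # locally stable: no warning signal
            dev = (x_now - x_star).astype(complex)
            coeffs = np.linalg.lstsq(unstable, dev, rcond=None)[0]
            return float(np.linalg.norm(unstable @ coeffs) / np.linalg.norm(dev))

        # Toy example: a 2D saddle with one unstable direction
        f = lambda x: np.array([x[0], -x[1]])
        print(unstable_overlap(f, np.zeros(2), np.array([0.3, 0.1])))

    A growing overlap over time is the early-warning signal the authors describe.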

  14. Stochastic Stability of Sampled Data Systems with a Jump Linear Controller

    NASA Technical Reports Server (NTRS)

    Gonzalez, Oscar R.; Herencia-Zapana, Heber; Gray, W. Steven

    2004-01-01

    In this paper an equivalence between the stochastic stability of a sampled-data system and its associated discrete-time representation is established. The sampled-data system consists of a deterministic, linear, time-invariant, continuous-time plant and a stochastic, linear, time-invariant, discrete-time, jump linear controller. The jump linear controller models computer systems and communication networks that are subject to stochastic upsets or disruptions. This sampled-data model has been used in the analysis and design of fault-tolerant systems and computer-control systems with random communication delays without taking into account the inter-sample response. This paper shows that the known equivalence between the stability of a deterministic sampled-data system and the associated discrete-time representation holds even in a stochastic framework.

  15. Stochastic sensitivity analysis of the variability of dynamics and transition to chaos in the business cycles model

    NASA Astrophysics Data System (ADS)

    Bashkirtseva, Irina; Ryashko, Lev; Ryazanova, Tatyana

    2018-01-01

    A problem of mathematical modeling of complex stochastic processes in macroeconomics is discussed. For the description of the dynamics of income and capital stock, the well-known Kaldor model of business cycles is used as a basic example. The aim of the paper is to give an overview of the variety of stochastic phenomena which occur in the Kaldor model forced by additive and parametric random noise. We study the generation of small- and large-amplitude stochastic oscillations, and their mixed-mode intermittency. To analyze these phenomena, we suggest a constructive approach combining the study of the peculiarities of the deterministic phase portrait and the stochastic sensitivity of attractors. We show how parametric noise can stabilize the unstable equilibrium and transform the dynamics of the Kaldor system from order to chaos.

  16. Stochastic analysis of future vehicle populations

    DOT National Transportation Integrated Search

    1979-05-01

    The purpose of this study was to build a stochastic model of future vehicle populations. Such a model can be used to investigate the uncertainties inherent in future vehicle populations. The model, which is called the Future Automobile Population Sto...

  17. Approaches for modeling within subject variability in pharmacometric count data analysis: dynamic inter-occasion variability and stochastic differential equations.

    PubMed

    Deng, Chenhui; Plan, Elodie L; Karlsson, Mats O

    2016-06-01

    Parameter variation in pharmacometric analysis studies can be characterized as within subject parameter variability (WSV) in pharmacometric models. WSV has previously been successfully modeled using inter-occasion variability (IOV), but also stochastic differential equations (SDEs). In this study, two approaches, dynamic inter-occasion variability (dIOV) and adapted stochastic differential equations, were proposed to investigate WSV in pharmacometric count data analysis. These approaches were applied to published count models for seizure counts and Likert pain scores. Both approaches improved the model fits significantly. In addition, stochastic simulation and estimation were used to explore further the capability of the two approaches to diagnose and improve models where existing WSV is not recognized. The results of simulations confirmed the gain in introducing WSV as dIOV and SDEs when parameters vary randomly over time. Further, the approaches were also informative as diagnostics of model misspecification, when parameters changed systematically over time but this was not recognized in the structural model. The proposed approaches in this study offer strategies to characterize WSV and are not restricted to count data.

  18. Evaluation of Uncertainty in Runoff Analysis Incorporating Theory of Stochastic Process

    NASA Astrophysics Data System (ADS)

    Yoshimi, Kazuhiro; Wang, Chao-Wen; Yamada, Tadashi

    2015-04-01

    The aim of this paper is to provide a theoretical framework for uncertainty estimation in rainfall-runoff analysis based on the theory of stochastic processes. Stochastic differential equations (SDEs) based on this theory have been widely used in mathematical finance to predict stock price movements, and some researchers in civil engineering have applied them as well (e.g. Kurino et al., 1999; Higashino and Kanda, 2001). However, there have been no studies evaluating uncertainty in runoff phenomena through a comparison of SDEs and the Fokker-Planck equation. The Fokker-Planck equation is a partial differential equation that describes the temporal evolution of a probability density function (PDF), and it is mathematically equivalent to the corresponding SDE. In this paper, therefore, the dependence of discharge uncertainty on rainfall uncertainty is explained theoretically and mathematically by introducing the theory of stochastic processes. The lumped rainfall-runoff model is represented by an SDE written in difference form, because the temporal variation of rainfall is expressed as its average plus a deviation approximated by a Gaussian distribution; this representation is based on rainfall observed by rain-gauge stations and a radar rain-gauge system. As a result, this paper shows that it is possible to evaluate the uncertainty of discharge by using the relationship between the SDE and the Fokker-Planck equation. Moreover, the results show that the uncertainty of discharge increases as rainfall intensity rises and as the nonlinearity of the resistance grows stronger. These results are clarified by the PDFs of discharge that satisfy the Fokker-Planck equation. This means a reasonable discharge can be estimated based on the theory of stochastic processes, and the approach can be applied to the probabilistic risk assessment of flood management.
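
    The SDE/Fokker-Planck correspondence invoked here is standard. For a generic scalar Itô equation (the symbols below are generic, not the paper's notation):

        \mathrm{d}X_t = f(X_t)\,\mathrm{d}t + g(X_t)\,\mathrm{d}W_t
        \quad\Longleftrightarrow\quad
        \frac{\partial p}{\partial t}
          = -\frac{\partial}{\partial x}\bigl[f(x)\,p(x,t)\bigr]
          + \frac{1}{2}\,\frac{\partial^{2}}{\partial x^{2}}\bigl[g^{2}(x)\,p(x,t)\bigr]

    so any uncertainty statement obtained by sampling paths of the SDE can equivalently be read off the PDF evolved by the Fokker-Planck equation, which is the equivalence the paper exploits for discharge.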

  19. Use of behavioural stochastic resonance by paddle fish for feeding

    NASA Astrophysics Data System (ADS)

    Russell, David F.; Wilkens, Lon A.; Moss, Frank

    1999-11-01

    Stochastic resonance is the phenomenon whereby the addition of an optimal level of noise to a weak information-carrying input to certain nonlinear systems can enhance the information content at their outputs. Computer analysis of spike trains has been needed to reveal stochastic resonance in the responses of sensory receptors except for one study on human psychophysics. But is an animal aware of, and can it make use of, the enhanced sensory information from stochastic resonance? Here, we show that stochastic resonance enhances the normal feeding behaviour of paddlefish (Polyodon spathula), which use passive electroreceptors to detect electrical signals from planktonic prey. We demonstrate significant broadening of the spatial range for the detection of plankton when a noisy electric field of optimal amplitude is applied in the water. We also show that swarms of Daphnia plankton are a natural source of electrical noise. Our demonstration of stochastic resonance at the level of a vital animal behaviour, feeding, which has probably evolved for functional success, provides evidence that stochastic resonance in sensory nervous systems is an evolutionary adaptation.
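
    The effect is easy to reproduce numerically: a subthreshold periodic signal becomes detectable in a simple threshold sensor only at intermediate noise levels. The following is a generic threshold-crossing demonstration with arbitrary parameters, not a model of paddlefish electroreceptors.

        import numpy as np

        rng = np.random.default_rng(0)
        fs, f_sig, T = 1000.0, 5.0, 20.0        # sample rate, signal freq, duration
        t = np.arange(0, T, 1 / fs)
        signal = 0.8 * np.sin(2 * np.pi * f_sig * t)   # subthreshold amplitude
        threshold = 1.0

        def snr_at_signal(noise_sigma):
            """SNR of the thresholded output at the signal frequency."""
            noise = noise_sigma * rng.standard_normal(t.size)
            spikes = (signal + noise > threshold).astype(float)
            spectrum = np.abs(np.fft.rfft(spikes - spikes.mean())) ** 2
            background = np.median(spectrum)
            if background == 0.0:
                return 0.0                      # no threshold crossings at all
            freqs = np.fft.rfftfreq(t.size, 1 / fs)
            return spectrum[np.argmin(np.abs(freqs - f_sig))] / background

        for sigma in (0.05, 0.2, 0.5, 1.0, 3.0):
            print(f"sigma={sigma:4.2f}  SNR={snr_at_signal(sigma):8.1f}")
        # SNR is near zero for tiny noise, peaks at intermediate noise,
        # and degrades again for large noise: stochastic resonance.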

  20. Frontiers in Chemistry.

    ERIC Educational Resources Information Center

    Joyce, Robert M., Ed.

    1980-01-01

    This article describes recent progress in chemical synthesis which depends on comparable advances in other areas of chemistry. Analysis and theories of chemical structure and reactions are determinants in progress in chemical synthesis and are described also. (Author/SA)

  1. Comparison of contact conditions obtained by direct simulation with statistical analysis for normally distributed isotropic surfaces

    NASA Astrophysics Data System (ADS)

    Uchidate, M.

    2018-09-01

    In this study, with the aim of establishing systematic knowledge of the impact of summit extraction methods and stochastic model selection in rough contact analysis, the contact area ratio (A_r/A_a) obtained by statistical contact models with different summit extraction methods was compared with a direct simulation using the boundary element method (BEM). Fifty areal topography datasets with different autocorrelation functions, in terms of the power index and correlation length, were used for the investigation. The non-causal 2D auto-regressive model, which can generate datasets with specified parameters, was employed in this research. Three summit extraction methods, Nayak's theory, 8-point analysis and watershed segmentation, were examined. With regard to the stochastic model, Bhushan's model and the BGT (Bush-Gibson-Thomas) model were applied. The values of A_r/A_a from the stochastic models tended to be smaller than those from BEM. The discrepancy between Bhushan's model with the 8-point analysis and BEM was slightly smaller than with Nayak's theory. The results with watershed segmentation were similar to those with the 8-point analysis. The impact of Wolf pruning on the discrepancy between the stochastic analysis and BEM was not very clear. In the case of the BGT model, which employs surface gradients, good quantitative agreement with BEM was obtained when Nayak's bandwidth parameter was large.

  2. Molecular Mapping of Flowering Time Major Genes and QTLs in Chickpea (Cicer arietinum L.)

    PubMed Central

    Mallikarjuna, Bingi P.; Samineni, Srinivasan; Thudi, Mahendar; Sajja, Sobhan B.; Khan, Aamir W.; Patil, Ayyanagowda; Viswanatha, Kannalli P.; Varshney, Rajeev K.; Gaur, Pooran M.

    2017-01-01

    Flowering time is an important trait for adaptation and productivity of chickpea in the arid and the semi-arid environments. This study was conducted for molecular mapping of genes/quantitative trait loci (QTLs) controlling flowering time in chickpea using F2 populations derived from four crosses (ICCV 96029 × CDC Frontier, ICC 5810 × CDC Frontier, BGD 132 × CDC Frontier and ICC 16641 × CDC Frontier). Genetic studies revealed monogenic control of flowering time in the crosses ICCV 96029 × CDC Frontier, BGD 132 × CDC Frontier and ICC 16641 × CDC Frontier, while digenic control with complementary gene action was observed in ICC 5810 × CDC Frontier. The intraspecific genetic maps developed from these crosses consisted of 75, 75, 68 and 67 markers spanning 248.8 cM, 331.4 cM, 311.1 cM and 385.1 cM, respectively. A consensus map spanning 363.8 cM with 109 loci was constructed by integrating the four genetic maps. Major QTLs corresponding to flowering time genes efl-1 from ICCV 96029, efl-3 from BGD 132 and efl-4 from ICC 16641 were mapped on CaLG04, CaLG08 and CaLG06, respectively. The QTLs and linked markers identified in this study can be used in marker-assisted breeding for developing early maturing chickpea. PMID:28729871

  3. Stratigraphy and structural setting of Upper Cretaceous Frontier Formation, western Centennial Mountains, southwestern Montana and southeastern Idaho

    USGS Publications Warehouse

    Dyman, T.S.; Tysdal, R.G.; Perry, W.J.; Nichols, D.J.; Obradovich, J.D.

    2008-01-01

    Stratigraphic, sedimentologic, and palynologic data were used to correlate the Frontier Formation of the western Centennial Mountains with time-equivalent rocks in the Lima Peaks area and other nearby areas in southwestern Montana. The stratigraphic interval studied is in the middle and upper parts (but not uppermost) of the formation, based on a comparison of sandstone petrography, palynologic age data, and our interpretation of the structure using a seismic line along the frontal zone of the Centennial Mountains and the adjacent Centennial Valley. The Frontier Formation comprises sandstone, siltstone, mudstone, limestone, and silty shale deposited in fluvial and coastal settings. A distinctive characteristic of these strata in the western Centennial Mountains is the absence of conglomerate and conglomeratic sandstone beds. The absence of conglomerate beds may be due to lateral facies changes associated with fluvial systems, a distal fining of grain size, and the absence of both uppermost and lower Frontier rocks in the study area. Palynostratigraphic data indicate a Coniacian age for the Frontier Formation in the western Centennial Mountains. These data are supported by a geochronologic age from the middle part of the Frontier at Lima Peaks indicating a possible late Coniacian-early Santonian age (86.25 ± 0.38 Ma) for the middle Frontier there. The Frontier Formation in the western Centennial Mountains is comparable in age and thickness to part of the Frontier at Lima Peaks. These rocks represent one of the thickest known sequences of Frontier strata in the Rocky Mountain region. Deposition occurred from about 95 to 86 Ma (middle Cenomanian to at least early Santonian), during which time shoreface sandstone of the Telegraph Creek Formation and marine shale of the Cody Shale were deposited to the east in the area now occupied by the Madison Range in southwestern Montana. Frontier strata in the western Centennial Mountains are structurally isolated from other Cretaceous rocks in the region and are part of the Lima thrust sheet that lies at the leading edge of Sevier-style overthrusting in this part of southwestern Montana and adjacent southeastern Idaho.

  4. The relationship between stochastic and deterministic quasi-steady state approximations.

    PubMed

    Kim, Jae Kyoung; Josić, Krešimir; Bennett, Matthew R

    2015-11-23

    The quasi steady-state approximation (QSSA) is frequently used to reduce deterministic models of biochemical networks. The resulting equations provide a simplified description of the network in terms of non-elementary reaction functions (e.g. Hill functions). Such deterministic reductions are frequently a basis for heuristic stochastic models in which non-elementary reaction functions are used to define reaction propensities. Despite their popularity, it remains unclear when such stochastic reductions are valid. It is frequently assumed that the stochastic reduction can be trusted whenever its deterministic counterpart is accurate. However, a number of recent examples show that this is not necessarily the case. Here we explain the origin of these discrepancies, and demonstrate a clear relationship between the accuracy of the deterministic and the stochastic QSSA for examples widely used in biological systems. With an analysis of a two-state promoter model, and numerical simulations for a variety of other models, we find that the stochastic QSSA is accurate whenever its deterministic counterpart provides an accurate approximation over a range of initial conditions which cover the likely fluctuations from the quasi steady-state (QSS). We conjecture that this relationship provides a simple and computationally inexpensive way to test the accuracy of reduced stochastic models using deterministic simulations. The stochastic QSSA is one of the most popular multi-scale stochastic simulation methods. While the use of QSSA, and the resulting non-elementary functions has been justified in the deterministic case, it is not clear when their stochastic counterparts are accurate. In this study, we show how the accuracy of the stochastic QSSA can be tested using their deterministic counterparts providing a concrete method to test when non-elementary rate functions can be used in stochastic simulations.
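
    A concrete instance of the reduction in question is the Michaelis-Menten scheme (a standard example; the paper's two-state promoter model differs). For E + S ⇌ C → E + P, the deterministic QSSA yields

        \frac{\mathrm{d}S}{\mathrm{d}t} = -\frac{k_{\mathrm{cat}}\,E_T\,S}{K_M + S},
        \qquad K_M = \frac{k_{\mathrm{off}} + k_{\mathrm{cat}}}{k_{\mathrm{on}}},

    and the heuristic stochastic reduction then uses a(S) = k_cat E_T S / (K_M + S) as a single reaction propensity. The question the paper addresses is when that substitution can be trusted.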

  5. Stochastic analysis of multiphase flow in porous media: II. Numerical simulations

    NASA Astrophysics Data System (ADS)

    Abin, A.; Kalurachchi, J. J.; Kemblowski, M. W.; Chang, C.-M.

    1996-08-01

    The first paper (Chang et al., 1995b) of this two-part series described the stochastic analysis using a spectral/perturbation approach to analyze steady-state two-phase (water and oil) flow in a liquid-unsaturated, three-fluid-phase porous medium. In this paper, the results of numerical simulations are compared with the closed-form expressions obtained using the perturbation approach. We present the solution to the one-dimensional, steady-state oil and water flow equations. The stochastic input processes are the spatially correlated log k, where k is the intrinsic permeability, and the soil retention parameter α. These solutions are subsequently used in the numerical simulations to estimate the statistical properties of the key output processes. The comparison between the results of the perturbation analysis and the numerical simulations showed good agreement between the two methods over a wide range of log k variability for three different combinations of the input stochastic processes log k and α. The results clearly demonstrated the importance of considering the spatial variability of key subsurface properties under a variety of physical scenarios. The variability of both capillary pressure and saturation is affected by the type of input stochastic process used to represent the spatial variability. The results also demonstrated the applicability of perturbation theory in predicting the system variability and in defining effective fluid properties through the ergodic assumption.

  6. Mathematical Sciences Division 1992 Programs

    DTIC Science & Technology

    1992-10-01

    statistical theory that underlies modern signal analysis. There is a strong emphasis on stochastic processes and time series, particularly those which...include optimal resource planning and real-time scheduling of stochastic shop-floor processes. Scheduling systems will be developed that can adapt to...make forecasts for the length-of-service time series. Protocol analysis of these sessions will be used to identify relevant contextual features and to

  7. Stochastic models of the Social Security trust funds.

    PubMed

    Burdick, Clark; Manchester, Joyce

    Each year in March, the Board of Trustees of the Social Security trust funds reports on the current and projected financial condition of the Social Security programs. Those programs, which pay monthly benefits to retired workers and their families, to the survivors of deceased workers, and to disabled workers and their families, are financed through the Old-Age, Survivors, and Disability Insurance (OASDI) Trust Funds. In their 2003 report, the Trustees present, for the first time, results from a stochastic model of the combined OASDI trust funds. Stochastic modeling is an important new tool for Social Security policy analysis and offers the promise of valuable new insights into the financial status of the OASDI trust funds and the effects of policy changes. The results presented in this article demonstrate that several stochastic models deliver broadly consistent results even though they use very different approaches and assumptions. However, they also show that the variation in trust fund outcomes differs as the approach and assumptions are varied. Which approach and assumptions are best suited for Social Security policy analysis remains an open question. Further research is needed before the promise of stochastic modeling is fully realized. For example, neither parameter uncertainty nor variability in ultimate assumption values is recognized explicitly in the analyses. Despite this caveat, stochastic modeling results are already shedding new light on the range and distribution of trust fund outcomes that might occur in the future.

  8. Center for Defect Physics - Energy Frontier Research Center (A "Life at the Frontiers of Energy Research" contest entry from the 2011 Energy Frontier Research Centers (EFRCs) Summit and Forum)

    ScienceCinema

    Stocks, G. Malcolm (Director, Center for Defect Physics in Structural Materials); CDP Staff

    2017-12-09

    'Center for Defect Physics - Energy Frontier Research Center' was submitted by the Center for Defect Physics (CDP) to the 'Life at the Frontiers of Energy Research' video contest at the 2011 Science for Our Nation's Energy Future: Energy Frontier Research Centers (EFRCs) Summit and Forum. Twenty-six EFRCs created short videos to highlight their mission and their work. CDP is directed by G. Malcolm Stocks at Oak Ridge National Laboratory, and is a partnership of scientists from nine institutions: Oak Ridge National Laboratory (lead); Ames Laboratory; Brown University; University of California, Berkeley; Carnegie Mellon University; University of Illinois, Urbana-Champaign; Lawrence Livermore National Laboratory; Ohio State University; and University of Tennessee. The Office of Basic Energy Sciences in the U.S. Department of Energy's Office of Science established the 46 Energy Frontier Research Centers (EFRCs) in 2009. These collaboratively-organized centers conduct fundamental research focused on 'grand challenges' and use-inspired 'basic research needs' recently identified in major strategic planning efforts by the scientific community. The overall purpose is to accelerate scientific progress toward meeting the nation's critical energy challenges.

  9. EFRC: CST at the University of Texas at Austin- A DOE Energy Frontier Research Center (A "Life at the Frontiers of Energy Research" contest entry from the 2011 Energy Frontier Research Centers (EFRCs) Summit and Forum)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhu, Xiaoyang

    "EFRC: CST at the University of Texas at Austin- A DOE Energy Frontier Research Center" was submitted by the EFRC for Understanding Charge Separation and Transfer at Interfaces in Energy Materials (EFRC:CST) to the "Life at the Frontiers of Energy Research" video contest at the 2011 Science for Our Nation's Energy Future: Energy Frontier Research Centers (EFRCs) Summit and Forum. Twenty-six EFRCs created short videos to highlight their mission and their work. EFRC: CST is directed by Xiaoyang Zhu at the University of Texas at Austin in partnership with Sandia National Laboratories. The Office of Basic Energy Sciences in themore » U.S. Department of Energy's Office of Science established the 46 Energy Frontier Research Centers (EFRCs) in 2009. These collaboratively-organized centers conduct fundamental research focused on 'grand challenges' and use-inspired 'basic research needs' recently identified in major strategic planning efforts by the scientific community. The overall purpose is to accelerate scientific progress toward meeting the nation's critical energy challenges.« less

  10. Center for Defect Physics - Energy Frontier Research Center (A "Life at the Frontiers of Energy Research" contest entry from the 2011 Energy Frontier Research Centers (EFRCs) Summit and Forum)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stocks, G. Malcolm; Ice, Gene

    "Center for Defect Physics - Energy Frontier Research Center" was submitted by the Center for Defect Physics (CDP) to the "Life at the Frontiers of Energy Research" video contest at the 2011 Science for Our Nation's Energy Future: Energy Frontier Research Centers (EFRCs) Summit and Forum. Twenty-six EFRCs created short videos to highlight their mission and their work. CDP is directed by G. Malcolm Stocks at Oak Ridge National Laboratory, and is a partnership of scientists from eight institutions: Oak Ridge National Laboratory (lead); Ames Laboratory; University of California, Berkeley; Carnegie Mellon University; University of Illinois, Urbana-Champaign; Ohio State University;more » University of Georgia and University of Tennessee. The Office of Basic Energy Sciences in the U.S. Department of Energy's Office of Science established the 46 Energy Frontier Research Centers (EFRCs) in 2009. These collaboratively-organized centers conduct fundamental research focused on 'grand challenges' and use-inspired 'basic research needs' recently identified in major strategic planning efforts by the scientific community. The overall purpose is to accelerate scientific progress toward meeting the nation's critical energy challenges.« less

  11. Expanding the frontiers of population nutrition research: new questions, new methods, and new approaches.

    PubMed

    Pelletier, David L; Porter, Christine M; Aarons, Gregory A; Wuehler, Sara E; Neufeld, Lynnette M

    2013-01-01

    Nutrition research, ranging from molecular to population levels and all points along this spectrum, is exploring new frontiers as new technologies and societal changes create new possibilities and demands. This paper defines a set of frontiers at the population level that are being created by the increased societal recognition of the importance of nutrition; its connection to urgent health, social, and environmental problems; and the need for effective and sustainable solutions at the population level. The frontiers are defined in terms of why, what, who, and how we study at the population level and the disciplinary foundations for that research. The paper provides illustrations of research along some of these frontiers, an overarching framework for population nutrition research, and access to some of the literature from outside of nutrition that can enhance the intellectual coherence, practical utility, and societal benefit of population nutrition research. The frontiers defined in this paper build on earlier forward-looking efforts by the American Society for Nutrition and extend these efforts in significant ways. The American Society for Nutrition and its members can play pivotal roles in advancing these frontiers by addressing a number of well-recognized challenges associated with transdisciplinary and engaged research.

  12. Community Project for Accelerator Science and Simulation (ComPASS) Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cary, John R.; Cowan, Benjamin M.; Veitzer, S. A.

    2016-03-04

    Tech-X participated across the full range of ComPASS activities, with efforts in the Energy Frontier primarily through modeling of laser plasma accelerators and dielectric laser acceleration, in the Intensity Frontier primarily through electron cloud modeling, and in Uncertainty Quantification being applied to dielectric laser acceleration. In the following we present the progress and status of our activities for the entire period of the ComPASS project for the different areas of Energy Frontier, Intensity Frontier and Uncertainty Quantification.

  13. Accelerated Sensitivity Analysis in High-Dimensional Stochastic Reaction Networks

    PubMed Central

    Arampatzis, Georgios; Katsoulakis, Markos A.; Pantazis, Yannis

    2015-01-01

    Existing sensitivity analysis approaches are not able to handle efficiently stochastic reaction networks with a large number of parameters and species, which are typical in the modeling and simulation of complex biochemical phenomena. In this paper, a two-step strategy for parametric sensitivity analysis for such systems is proposed, exploiting advantages and synergies between two recently proposed sensitivity analysis methodologies for stochastic dynamics. The first method performs sensitivity analysis of the stochastic dynamics by means of the Fisher Information Matrix on the underlying distribution of the trajectories; the second method is a reduced-variance, finite-difference, gradient-type sensitivity approach relying on stochastic coupling techniques for variance reduction. Here we demonstrate that these two methods can be combined and deployed together by means of a new sensitivity bound which incorporates the variance of the quantity of interest as well as the Fisher Information Matrix estimated from the first method. The first step of the proposed strategy labels sensitivities using the bound and screens out the insensitive parameters in a controlled manner. In the second step of the proposed strategy, a finite-difference method is applied only for the sensitivity estimation of the (potentially) sensitive parameters that have not been screened out in the first step. Results on an epidermal growth factor network with fifty parameters and on a protein homeostasis with eighty parameters demonstrate that the proposed strategy is able to quickly discover and discard the insensitive parameters and in the remaining potentially sensitive parameters it accurately estimates the sensitivities. The new sensitivity strategy can be several times faster than current state-of-the-art approaches that test all parameters, especially in “sloppy” systems. In particular, the computational acceleration is quantified by the ratio between the total number of parameters over the number of the sensitive parameters. PMID:26161544

  14. A stochastic method for stand-alone photovoltaic system sizing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cabral, Claudia Valeria Tavora; Filho, Delly Oliveira; Martins, Jose Helvecio

    Photovoltaic systems utilize solar energy to generate electrical energy to meet load demands. Optimal sizing of these systems includes the characterization of solar radiation. Solar radiation at the Earth's surface has random characteristics and has been the focus of various academic studies. The objective of this study was to stochastically analyze the parameters involved in the sizing of photovoltaic generators and to develop a methodology for sizing stand-alone photovoltaic systems. Energy storage for isolated systems and solar radiation were analyzed stochastically due to their random behavior. For the development of the proposed methodology, stochastic analyses were studied, including the Markov chain and the beta probability density function. The obtained results were compared with those for stand-alone sizing using the Sandia method (deterministic), relative to which the stochastic model presented more reliable values. Both models present advantages and disadvantages; however, the stochastic one is more complex and provides more reliable and realistic results. (author)
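
    To make the stochastic ingredients concrete, the sketch below draws a daily clearness index from a beta distribution and simulates battery state of charge for a candidate array/battery pair, estimating a loss-of-load probability. All parameter values are made-up placeholders, not numbers from the study.

        import numpy as np

        rng = np.random.default_rng(42)

        panel_kwp = 2.0          # PV array size [kWp] (placeholder)
        batt_kwh = 10.0          # usable battery capacity [kWh] (placeholder)
        daily_load = 6.0         # daily demand [kWh] (placeholder)
        peak_sun_hours = 5.5     # clear-sky site average [h/day] (placeholder)

        def loss_of_load_probability(n_days=10000, a=3.0, b=1.5):
            """Fraction of simulated days the system cannot cover the load.
            Daily clearness index ~ Beta(a, b), a common stochastic model
            for normalized solar radiation."""
            soc = batt_kwh                      # start fully charged
            deficit_days = 0
            for k in rng.beta(a, b, size=n_days):
                generated = panel_kwp * peak_sun_hours * k
                soc = min(batt_kwh, soc + generated) - daily_load
                if soc < 0:
                    deficit_days += 1
                    soc = 0.0                   # load shed, battery empty
            return deficit_days / n_days

        print(f"LOLP ~ {loss_of_load_probability():.3%}")

    Sizing then amounts to searching over the array/battery pair for the cheapest combination meeting a target loss-of-load probability.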

  15. Introducing "Frontiers in Zoology"

    PubMed

    Heinze, Jürgen; Tautz, Diethard

    2004-09-29

    As a biological discipline, zoology has one of the longest histories. Today it occasionally appears as though, due to the rapid expansion of the life sciences, zoology has been replaced by more or less independent sub-disciplines amongst which exchange is often sparse. However, the recent advance of molecular methodology into "classical" fields of biology, and the development of theories that can explain phenomena on different levels of organisation, has led to a re-integration of zoological disciplines promoting a broader than usual approach to zoological questions. Zoology has re-emerged as an integrative discipline encompassing the most diverse aspects of animal life, from the level of the gene to the level of the ecosystem. The new journal Frontiers in Zoology is the first Open Access journal focussing on zoology as a whole. It aims to represent and re-unite the various disciplines that look at animal life from different perspectives and to provide the basis for a comprehensive understanding of zoological phenomena on all levels of analysis. Frontiers in Zoology provides a unique opportunity to publish high quality research and reviews on zoological issues that will be internationally accessible to any reader at no cost.

  16. Finite-time H∞ filtering for non-linear stochastic systems

    NASA Astrophysics Data System (ADS)

    Hou, Mingzhe; Deng, Zongquan; Duan, Guangren

    2016-09-01

    This paper describes the robust H∞ filtering analysis and the synthesis of general non-linear stochastic systems with finite settling time. We assume that the system dynamic is modelled by Itô-type stochastic differential equations of which the state and the measurement are corrupted by state-dependent noises and exogenous disturbances. A sufficient condition for non-linear stochastic systems to have the finite-time H∞ performance with gain less than or equal to a prescribed positive number is established in terms of a certain Hamilton-Jacobi inequality. Based on this result, the existence of a finite-time H∞ filter is given for the general non-linear stochastic system by a second-order non-linear partial differential inequality, and the filter can be obtained by solving this inequality. The effectiveness of the obtained result is illustrated by a numerical example.

  17. Efficient estimators for likelihood ratio sensitivity indices of complex stochastic dynamics.

    PubMed

    Arampatzis, Georgios; Katsoulakis, Markos A; Rey-Bellet, Luc

    2016-03-14

    We demonstrate that centered likelihood ratio estimators for the sensitivity indices of complex stochastic dynamics are highly efficient with low, constant in time variance and consequently they are suitable for sensitivity analysis in long-time and steady-state regimes. These estimators rely on a new covariance formulation of the likelihood ratio that includes as a submatrix a Fisher information matrix for stochastic dynamics and can also be used for fast screening of insensitive parameters and parameter combinations. The proposed methods are applicable to broad classes of stochastic dynamics such as chemical reaction networks, Langevin-type equations and stochastic models in finance, including systems with a high dimensional parameter space and/or disparate decorrelation times between different observables. Furthermore, they are simple to implement as a standard observable in any existing simulation algorithm without additional modifications.
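
    The centering trick at the heart of these estimators is visible in a static toy problem: d/dθ E[Q] = E[Q ∂θ log p], and because the score has zero mean, subtracting a constant from Q leaves the estimator unbiased while it can shrink the variance dramatically. A minimal sketch with an exponential toy density, not the reaction-network setting of the paper:

        import numpy as np

        rng = np.random.default_rng(1)
        theta = 2.0                      # rate of an Exponential(theta) density
        x = rng.exponential(1 / theta, size=200_000)

        # Observable with a large constant offset (where the plain LR
        # estimator suffers); score: d/dtheta log p(x; theta) = 1/theta - x
        q = x + 10.0
        score = 1 / theta - x

        naive = np.mean(q * score)                   # plain LR estimator
        centered = np.mean((q - q.mean()) * score)   # centered, variance-reduced

        # Exact value: E[X] = 1/theta, so d/dtheta E[Q] = -1/theta**2 = -0.25
        print(f"naive    = {naive:+.4f}")
        print(f"centered = {centered:+.4f}")
        ratio = np.var(q * score) / np.var((q - q.mean()) * score)
        print(f"variance reduced ~ {ratio:.0f}x")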

  18. Efficient estimators for likelihood ratio sensitivity indices of complex stochastic dynamics

    NASA Astrophysics Data System (ADS)

    Arampatzis, Georgios; Katsoulakis, Markos A.; Rey-Bellet, Luc

    2016-03-01

    We demonstrate that centered likelihood ratio estimators for the sensitivity indices of complex stochastic dynamics are highly efficient with low, constant in time variance and consequently they are suitable for sensitivity analysis in long-time and steady-state regimes. These estimators rely on a new covariance formulation of the likelihood ratio that includes as a submatrix a Fisher information matrix for stochastic dynamics and can also be used for fast screening of insensitive parameters and parameter combinations. The proposed methods are applicable to broad classes of stochastic dynamics such as chemical reaction networks, Langevin-type equations and stochastic models in finance, including systems with a high dimensional parameter space and/or disparate decorrelation times between different observables. Furthermore, they are simple to implement as a standard observable in any existing simulation algorithm without additional modifications.

  19. Analysis of novel stochastic switched SILI epidemic models with continuous and impulsive control

    NASA Astrophysics Data System (ADS)

    Gao, Shujing; Zhong, Deming; Zhang, Yan

    2018-04-01

    In this paper, we establish two new stochastic switched epidemic models with continuous and impulsive control. Stochastic perturbations are considered for the natural death rate in each equation of the models. First, a stochastic switched SILI model with continuous control schemes is investigated. By using the Lyapunov-Razumikhin method, sufficient conditions for extinction in mean are established. Our result shows that the disease could theoretically die out if the threshold value R is less than one, regardless of whether the disease-free solutions of the corresponding subsystems are stable or unstable. Then, a stochastic switched SILI model with continuous control schemes and pulse vaccination is studied. The threshold value R is derived, and the global attractivity of the model is also obtained. Finally, numerical simulations are carried out to support our results.

  20. Efficient estimators for likelihood ratio sensitivity indices of complex stochastic dynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arampatzis, Georgios; Katsoulakis, Markos A.; Rey-Bellet, Luc

    2016-03-14

    We demonstrate that centered likelihood ratio estimators for the sensitivity indices of complex stochastic dynamics are highly efficient with low, constant in time variance and consequently they are suitable for sensitivity analysis in long-time and steady-state regimes. These estimators rely on a new covariance formulation of the likelihood ratio that includes as a submatrix a Fisher information matrix for stochastic dynamics and can also be used for fast screening of insensitive parameters and parameter combinations. The proposed methods are applicable to broad classes of stochastic dynamics such as chemical reaction networks, Langevin-type equations and stochastic models in finance, including systems with a high dimensional parameter space and/or disparate decorrelation times between different observables. Furthermore, they are simple to implement as a standard observable in any existing simulation algorithm without additional modifications.

  1. Tipping point analysis of ocean acoustic noise

    NASA Astrophysics Data System (ADS)

    Livina, Valerie N.; Brouwer, Albert; Harris, Peter; Wang, Lian; Sotirakopoulos, Kostas; Robinson, Stephen

    2018-02-01

    We apply tipping point analysis to a large record of ocean acoustic data to identify the main components of the acoustic dynamical system and study possible bifurcations and transitions of the system. The analysis is based on a statistical physics framework with stochastic modelling, where we represent the observed data as a composition of deterministic and stochastic components estimated from the data using time-series techniques. We analyse long-term and seasonal trends, system states and acoustic fluctuations to reconstruct a one-dimensional stochastic equation to approximate the acoustic dynamical system. We apply potential analysis to acoustic fluctuations and detect several changes in the system states in the past 14 years. These are most likely caused by climatic phenomena. We analyse trends in sound pressure level within different frequency bands and hypothesize a possible anthropogenic impact on the acoustic environment. The tipping point analysis framework provides insight into the structure of the acoustic data and helps identify its dynamic phenomena, correctly reproducing the probability distribution and scaling properties (power-law correlations) of the time series.
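
    The reconstruction step described here can be sketched with conditional-moment (Kramers-Moyal) estimates of drift and diffusion, from which an effective potential follows; this is a generic sketch of that technique, not the authors' exact pipeline.

        import numpy as np

        def drift_diffusion(x, dt, n_bins=40, min_count=50):
            """Bin-wise drift f(x) and diffusion D(x) from a scalar series:
            f(x) ~ E[dx | x] / dt,  D(x) ~ E[dx^2 | x] / (2 dt)."""
            dx = np.diff(x)
            edges = np.linspace(x.min(), x.max(), n_bins + 1)
            idx = np.digitize(x[:-1], edges)
            mids = 0.5 * (edges[:-1] + edges[1:])
            f = np.full(n_bins, np.nan)
            D = np.full(n_bins, np.nan)
            for b in range(1, n_bins + 1):
                sel = idx == b
                if sel.sum() >= min_count:     # require enough samples per bin
                    f[b - 1] = dx[sel].mean() / dt
                    D[b - 1] = (dx[sel] ** 2).mean() / (2 * dt)
            return mids, f, D

        # Synthetic check: Ornstein-Uhlenbeck dx = -x dt + sqrt(2) dW has
        # drift f(x) = -x, diffusion D(x) = 1, potential U(x) = x**2 / 2.
        rng = np.random.default_rng(7)
        dt, n = 0.01, 200_000
        x = np.empty(n)
        x[0] = 0.0
        for i in range(n - 1):
            x[i + 1] = x[i] - x[i] * dt + np.sqrt(2 * dt) * rng.standard_normal()
        mids, f, D = drift_diffusion(x, dt)

    Changes in the number or position of minima of the reconstructed potential over time are the state transitions such an analysis detects.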

  2. Medicine: The final frontier in cancer diagnosis

    NASA Astrophysics Data System (ADS)

    Leachman, Sancy A.; Merlino, Glenn

    2017-01-01

    A computer, trained to classify skin cancers using image analysis alone, can now identify certain cancers as successfully as can skin-cancer doctors. What are the implications for the future of medical diagnosis? See Letter p.115

  3. A Fast Fourier transform stochastic analysis of the contaminant transport problem

    USGS Publications Warehouse

    Deng, F.W.; Cushman, J.H.; Delleur, J.W.

    1993-01-01

    A three-dimensional stochastic analysis of the contaminant transport problem is developed in the spirit of Naff (1990). The new derivation is more general and simpler than previous analyses. The fast Fourier transform is used extensively to obtain numerical estimates of the mean concentration and various spatial moments. Data from both the Borden and Cape Cod experiments are used to test the methodology. Results are comparable to those obtained by other methods, and to the experiments themselves.

  4. Frontiers in Industrial Arts Education.

    ERIC Educational Resources Information Center

    MacDonnell, Elisabeth, Ed.; Strosnider, Floy, Ed.

    Presentation topics of the 28th annual American Industrial Arts Association Convention include: (1) "Where We Are in Federal Legislation Programs," (2) "Frontiers in Industrial Arts Education," and (3) "Industry's Cooperation with Education." Eleven symposia were conducted on the topic of "Implementing Frontier Ideas in Industrial Arts Education…

  5. Space: The Final Frontier in the Learning of Science?

    ERIC Educational Resources Information Center

    Milne, Catherine

    2014-01-01

    In "Space", relations, and the learning of science", Wolff-Michael Roth and Pei-Ling Hsu use ethnomethodology to explore high school interns learning shopwork and shoptalk in a research lab that is located in a world class facility for water quality analysis. Using interaction analysis they identify how spaces, like a research…

  6. Cost Efficiency in the University: A Departmental Evaluation Model

    ERIC Educational Resources Information Center

    Gimenez, Victor M.; Martinez, Jose Luis

    2006-01-01

    This article presents a model for the analysis of cost efficiency within the framework of data envelopment analysis models. It calculates the cost excess, separating a unit of production from its optimal or frontier levels, and, at the same time, breaks these excesses down into three explanatory factors: (a) technical inefficiency, which depends…

  7. Space Frontiers for New Pedagogies: A Tale of Constraints and Possibilities

    ERIC Educational Resources Information Center

    Jessop, Tansy; Gubby, Laura; Smith, Angela

    2012-01-01

    This article draws together two linked studies on formal teaching spaces within one university. The first consisted of a multi-method analysis, including observations of four teaching events, interviews with academics and estates staff, analysis of architectural plans, and a talking campus tour. The second study surveyed 166 students about their…

  8. Relationships among multiple aspects of agriculture's environmental impact and productivity: a meta-analysis to guide sustainable agriculture.

    PubMed

    German, Richard N; Thompson, Catherine E; Benton, Tim G

    2017-05-01

    Given the pressures on land to produce ever more food, doing it 'sustainably' is growing in importance. However, 'sustainable agriculture' is complex to define, not least because agriculture impacts in many different ways and it is not clear how different aspects of sustainability may be in synergy or trade off against each other. We conducted a meta-analysis to assess the relationships between multiple measures of sustainability using novel analytical methods, based around defining the efficiency frontier in the relationship between variables, as well as using correlation analysis. We define 20 grouped variables of agriculture's impact (e.g. on soil, greenhouse gas, water, biodiversity) and find evidence of both strong positive and negative correlations between them. Analysis based on the efficiency frontier suggests that trade-offs can be 'softened' by exploiting the natural between-study variation that arises from a combination of farming best practice and context. Nonetheless, the literature provides strong evidence of the relationship between yields and the negative externalities created by farming across a range of measures. © 2016 Cambridge Philosophical Society.

  9. Multivariate moment closure techniques for stochastic kinetic models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lakatos, Eszter, E-mail: e.lakatos13@imperial.ac.uk; Ale, Angelique; Kirk, Paul D. W.

    2015-09-07

    Stochastic effects dominate many chemical and biochemical processes. Their analysis, however, can be computationally prohibitively expensive and a range of approximation schemes have been proposed to lighten the computational burden. These, notably the increasingly popular linear noise approximation and the more general moment expansion methods, perform well for many dynamical regimes, especially linear systems. At higher levels of nonlinearity, it comes to an interplay between the nonlinearities and the stochastic dynamics, which is much harder to capture correctly by such approximations to the true stochastic processes. Moment-closure approaches promise to address this problem by capturing higher-order terms of the temporally evolving probability distribution. Here, we develop a set of multivariate moment-closures that allows us to describe the stochastic dynamics of nonlinear systems. Multivariate closure captures the way that correlations between different molecular species, induced by the reaction dynamics, interact with stochastic effects. We use multivariate Gaussian, gamma, and lognormal closure and illustrate their use in the context of two models that have proved challenging to the previous attempts at approximating stochastic dynamics: oscillations in p53 and Hes1. In addition, we consider a larger system, Erk-mediated mitogen-activated protein kinases signalling, where conventional stochastic simulation approaches incur unacceptably high computational costs.
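
    The closure step itself is compact to state: the hierarchy of moment equations is truncated by a distributional assumption. A multivariate Gaussian closure, for example, sets all third-order central moments to zero,

        \bigl\langle (x_i-\mu_i)(x_j-\mu_j)(x_k-\mu_k) \bigr\rangle = 0
        \quad \text{for all } i, j, k,

    which closes the equations at the level of means and covariances; the gamma and lognormal closures instead fix the third moments at the values those distributions imply.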

  10. Final Report for Dynamic Models for Causal Analysis of Panel Data. Models for Change in Quantitative Variables, Part II Scholastic Models. Part II, Chapter 4.

    ERIC Educational Resources Information Center

    Hannan, Michael T.

    This document is part of a series of chapters described in SO 011 759. Stochastic models for the sociological analysis of change and the change process in quantitative variables are presented. The author lays groundwork for the statistical treatment of simple stochastic differential equations (SDEs) and discusses some of the continuities of…

  11. Stochastic Calculus and Differential Equations for Physics and Finance

    NASA Astrophysics Data System (ADS)

    McCauley, Joseph L.

    2013-02-01

    1. Random variables and probability distributions; 2. Martingales, Markov, and nonstationarity; 3. Stochastic calculus; 4. Ito processes and Fokker-Planck equations; 5. Selfsimilar Ito processes; 6. Fractional Brownian motion; 7. Kolmogorov's PDEs and Chapman-Kolmogorov; 8. Non Markov Ito processes; 9. Black-Scholes, martingales, and Feynman-Kac; 10. Stochastic calculus with martingales; 11. Statistical physics and finance, a brief history of both; 12. Introduction to new financial economics; 13. Statistical ensembles and time series analysis; 14. Econometrics; 15. Semimartingales; References; Index.

  12. Simulation of quantum dynamics based on the quantum stochastic differential equation.

    PubMed

    Li, Ming

    2013-01-01

    The quantum stochastic differential equation derived from the Lindblad-form quantum master equation is investigated. The general formulation in terms of environment operators representing the quantum state diffusion is given. A numerical simulation algorithm for the stochastic process of direct photodetection of a driven two-level system is proposed to predict its dynamical behavior. The effectiveness and superiority of the algorithm are verified by analysis of its accuracy and computational cost in comparison with the classical Runge-Kutta algorithm.

  13. Existence and uniqueness of solution for a class of stochastic differential equations.

    PubMed

    Cao, Junfei; Huang, Zaitang; Zeng, Caibin

    2013-01-01

    A class of stochastic differential equations given by dx(t) = f(x(t))dt + g(x(t))dW(t), x(t₀) = x₀, t₀ ≤ t ≤ T < +∞, is investigated. Under suitable assumptions, the existence and uniqueness of a solution for the equations are obtained. Moreover, the existence and uniqueness of a solution for the stochastic Lorenz system, illustrated by an example, are in good agreement with the theoretical analysis.
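
    A minimal numerical sketch (not part of the record) of how an SDE of this form can be simulated with the Euler-Maruyama scheme; the drift f and diffusion g below are illustrative placeholders, not the paper's equations:

```python
import numpy as np

def euler_maruyama(f, g, x0, t0, T, n_steps, rng=None):
    """Integrate dx = f(x) dt + g(x) dW from t0 to T on a uniform grid."""
    rng = rng or np.random.default_rng()
    dt = (T - t0) / n_steps
    x = np.empty(n_steps + 1)
    x[0] = x0
    for k in range(n_steps):
        dW = rng.normal(0.0, np.sqrt(dt))          # Brownian increment ~ N(0, dt)
        x[k + 1] = x[k] + f(x[k]) * dt + g(x[k]) * dW
    return x

# Illustrative choice: mean-reverting drift with constant diffusion.
path = euler_maruyama(f=lambda x: -x, g=lambda x: 0.5,
                      x0=1.0, t0=0.0, T=5.0, n_steps=5000)
print(path[-1])
```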

  14. Sensitivity analysis of consumption cycles

    NASA Astrophysics Data System (ADS)

    Jungeilges, Jochen; Ryazanova, Tatyana; Mitrofanova, Anastasia; Popova, Irina

    2018-05-01

    We study the special case of a nonlinear stochastic consumption model taking the form of a 2-dimensional, non-invertible map with an additive stochastic component. Applying the concept of the stochastic sensitivity function and the related technique of confidence domains, we establish the conditions under which the system's complex consumption attractor is likely to become observable. It is shown that the level of noise intensities beyond which the complex consumption attractor is likely to be observed depends on the weight given to past consumption in an individual's preference adjustment.

  15. Further studies using matched filter theory and stochastic simulation for gust loads prediction

    NASA Technical Reports Server (NTRS)

    Scott, Robert C.; Pototzky, Anthony S.; Perry, Boyd III

    1993-01-01

    This paper describes two analysis methods -- one deterministic, the other stochastic -- for computing maximized and time-correlated gust loads for aircraft with nonlinear control systems. The first method is based on matched filter theory; the second is based on stochastic simulation. The paper summarizes the methods, discusses the selection of gust intensity for each method and presents numerical results. A strong similarity between the results from the two methods is seen to exist for both linear and nonlinear configurations.

  16. Coastal zone management with stochastic multi-criteria analysis.

    PubMed

    Félix, A; Baquerizo, A; Santiago, J M; Losada, M A

    2012-12-15

    The methodology for coastal management proposed in this study takes into account the physical processes of the coastal system and the stochastic nature of forcing agents. Simulation techniques are used to assess the uncertainty in the performance of a set of predefined management strategies based on different criteria representing the main concerns of interest groups. This statistical information as well as the distribution function that characterizes the uncertainty regarding the preferences of the decision makers is fed into a stochastic multi-criteria acceptability analysis that provides the probability of alternatives obtaining certain ranks and also calculates the preferences of a typical decision maker who supports an alternative. This methodology was applied as a management solution for Playa Granada in the Guadalfeo River Delta (Granada, Spain), where the construction of a dam in the river basin is causing severe erosion. The analysis of shoreline evolution took into account the coupled action of atmosphere, ocean, and land agents and their intrinsic stochastic character. This study considered five different management strategies. The criteria selected for the analysis were the economic benefits for three interest groups: (i) indirect beneficiaries of tourist activities; (ii) beach homeowners; and (iii) the administration. The strategies were ranked according to their effectiveness, and the relative importance given to each criterion was obtained. Copyright © 2012 Elsevier Ltd. All rights reserved.
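
    The core of a stochastic multi-criteria acceptability analysis can be sketched as a Monte-Carlo loop over uncertain criterion scores and preference weights; everything below (scores, noise level, weight prior) is a hypothetical stand-in, not the Playa Granada data:

```python
import numpy as np

rng = np.random.default_rng(0)
n_alt, n_crit, n_draws = 5, 3, 10_000

# Hypothetical mean criterion scores (rows: strategies, columns: criteria).
mean_scores = rng.uniform(0.0, 1.0, size=(n_alt, n_crit))
rank_counts = np.zeros((n_alt, n_alt), dtype=int)   # [alternative, rank]

for _ in range(n_draws):
    scores = mean_scores + rng.normal(0.0, 0.1, size=(n_alt, n_crit))  # uncertainty
    weights = rng.dirichlet(np.ones(n_crit))        # unknown decision-maker preferences
    utility = scores @ weights
    for rank, alt in enumerate(np.argsort(-utility)):   # best first
        rank_counts[alt, rank] += 1

rank_acceptability = rank_counts / n_draws   # P(alternative attains each rank)
print(rank_acceptability.round(3))
```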

  17. 40 CFR 81.24 - Niagara Frontier Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... (CONTINUED) AIR PROGRAMS (CONTINUED) DESIGNATION OF AREAS FOR AIR QUALITY PLANNING PURPOSES Designation of Air Quality Control Regions § 81.24 Niagara Frontier Intrastate Air Quality Control Region. The Niagara Frontier Intrastate Air Quality Control Region (New York) consists of the territorial area...

  18. 40 CFR 81.24 - Niagara Frontier Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... (CONTINUED) AIR PROGRAMS (CONTINUED) DESIGNATION OF AREAS FOR AIR QUALITY PLANNING PURPOSES Designation of Air Quality Control Regions § 81.24 Niagara Frontier Intrastate Air Quality Control Region. The Niagara Frontier Intrastate Air Quality Control Region (New York) consists of the territorial area...

  19. 40 CFR 81.24 - Niagara Frontier Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... (CONTINUED) AIR PROGRAMS (CONTINUED) DESIGNATION OF AREAS FOR AIR QUALITY PLANNING PURPOSES Designation of Air Quality Control Regions § 81.24 Niagara Frontier Intrastate Air Quality Control Region. The Niagara Frontier Intrastate Air Quality Control Region (New York) consists of the territorial area...

  20. 40 CFR 81.24 - Niagara Frontier Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... (CONTINUED) AIR PROGRAMS (CONTINUED) DESIGNATION OF AREAS FOR AIR QUALITY PLANNING PURPOSES Designation of Air Quality Control Regions § 81.24 Niagara Frontier Intrastate Air Quality Control Region. The Niagara Frontier Intrastate Air Quality Control Region (New York) consists of the territorial area...

  1. 40 CFR 81.24 - Niagara Frontier Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... (CONTINUED) AIR PROGRAMS (CONTINUED) DESIGNATION OF AREAS FOR AIR QUALITY PLANNING PURPOSES Designation of Air Quality Control Regions § 81.24 Niagara Frontier Intrastate Air Quality Control Region. The Niagara Frontier Intrastate Air Quality Control Region (New York) consists of the territorial area...

  2. Deterministic and stochastic CTMC models from Zika disease transmission

    NASA Astrophysics Data System (ADS)

    Zevika, Mona; Soewono, Edy

    2018-03-01

    Zika infection is one of the most important mosquito-borne diseases in the world. Zika virus (ZIKV) is transmitted by many Aedes-type mosquitoes including Aedes aegypti. Pregnant women with the Zika virus are at risk of having a fetus or infant with a congenital defect and suffering from microcephaly. Here, we formulate a Zika disease transmission model using two approaches, a deterministic model and a continuous-time Markov chain stochastic model. The basic reproduction ratio is constructed from a deterministic model. Meanwhile, the CTMC stochastic model yields an estimate of the probability of extinction and outbreaks of Zika disease. Dynamical simulations and analysis of the disease transmission are shown for the deterministic and stochastic models.
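
    A hedged sketch of how a CTMC epidemic model yields an extinction-probability estimate: simulate the embedded jump chain of a simple SIR-type chain many times and count the runs that die out. The parameters are illustrative, not the paper's Zika model:

```python
import numpy as np

def dies_out(beta, gamma, S0, I0, outbreak_size, rng):
    """One run of the embedded jump chain of a CTMC SIR model; returns True if
    the infection goes extinct before `outbreak_size` cumulative infections.
    (Jump times are not needed for this question, only the jump chain.)"""
    S, I, cum = S0, I0, I0
    N = S0 + I0
    while I > 0 and cum < outbreak_size:
        rate_inf = beta * S * I / N
        rate_rec = gamma * I
        if rng.random() < rate_inf / (rate_inf + rate_rec):
            S, I, cum = S - 1, I + 1, cum + 1   # new infection
        else:
            I -= 1                              # recovery
    return I == 0

rng = np.random.default_rng(1)
runs = 5000
extinct = sum(dies_out(beta=0.4, gamma=0.2, S0=500, I0=2,
                       outbreak_size=50, rng=rng) for _ in range(runs))
print("estimated extinction probability:", extinct / runs)
# Branching-process theory suggests roughly (gamma/beta)**I0 = 0.25 here.
```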

  3. Fast smooth second-order sliding mode control for systems with additive colored noises.

    PubMed

    Yang, Pengfei; Fang, Yangwang; Wu, Youli; Liu, Yunxia; Zhang, Danxu

    2017-01-01

    In this paper, a fast smooth second-order sliding mode control is presented for a class of stochastic systems with enumerable Ornstein-Uhlenbeck colored noises. The finite-time mean-square practical stability and finite-time mean-square practical reachability are first introduced. Instead of treating the noise as a bounded disturbance, stochastic control techniques are incorporated into the design of the controller. The finite-time convergence of the prescribed sliding variable dynamics is proved by using stochastic Lyapunov-like techniques. The proposed sliding mode controller is then applied to a second-order nonlinear stochastic system. Simulation results comparing the proposed controller with smooth second-order sliding mode control are presented to validate the analysis.

  4. Parallel replica dynamics method for bistable stochastic reaction networks: Simulation and sensitivity analysis

    NASA Astrophysics Data System (ADS)

    Wang, Ting; Plecháč, Petr

    2017-12-01

    Stochastic reaction networks that exhibit bistable behavior are common in systems biology, materials science, and catalysis. Sampling of stationary distributions is crucial for understanding and characterizing the long-time dynamics of bistable stochastic dynamical systems. However, simulations are often hindered by the insufficient sampling of rare transitions between the two metastable regions. In this paper, we apply the parallel replica method for a continuous time Markov chain in order to improve sampling of the stationary distribution in bistable stochastic reaction networks. The proposed method uses parallel computing to accelerate the sampling of rare transitions. Furthermore, it can be combined with the path-space information bounds for parametric sensitivity analysis. With the proposed methodology, we study three bistable biological networks: the Schlögl model, the genetic switch network, and the enzymatic futile cycle network. We demonstrate the algorithmic speedup achieved in these numerical benchmarks. More significant acceleration is expected when multi-core or graphics processing unit computer architectures and programming tools such as CUDA are employed.

  5. Numerical methods for stochastic differential equations

    NASA Astrophysics Data System (ADS)

    Kloeden, Peter; Platen, Eckhard

    1991-06-01

    The numerical analysis of stochastic differential equations differs significantly from that of ordinary differential equations due to the peculiarities of stochastic calculus. This book provides an introduction to stochastic calculus and stochastic differential equations, covering both theory and applications. The main emphasis is placed on the numerical methods needed to solve such equations. It assumes an undergraduate background in mathematical methods typical of engineers and physicists, though many chapters begin with a descriptive summary which may be accessible to others who only require numerical recipes. To help the reader develop an intuitive understanding of the underlying mathematics and hands-on numerical skills, exercises and over 100 PC exercises (PC: personal computer) are included. The stochastic Taylor expansion provides the key tool for the systematic derivation and investigation of discrete-time numerical methods for stochastic differential equations. The book presents many new results on higher-order methods for strong sample path approximations and for weak functional approximations, including implicit, predictor-corrector, extrapolation and variance-reduction methods. Besides serving as a basic text on such methods, the book offers the reader ready access to a large number of potential research problems in a field that is just beginning to expand rapidly and is widely applicable.
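
    The strong-convergence orders discussed in the book can be illustrated by comparing the Euler-Maruyama and Milstein schemes on geometric Brownian motion, whose exact solution is known; the step counts and coefficients below are arbitrary choices, not the book's examples:

```python
import numpy as np

rng = np.random.default_rng(2)
mu, sigma, x0, T = 0.05, 0.4, 1.0, 1.0   # geometric Brownian motion parameters

def mean_strong_errors(n_steps, n_paths=2000):
    dt = T / n_steps
    errs_em, errs_mil = [], []
    for _ in range(n_paths):
        dW = rng.normal(0.0, np.sqrt(dt), n_steps)
        x_em = x_mil = x0
        for k in range(n_steps):
            x_em += mu * x_em * dt + sigma * x_em * dW[k]
            x_mil += (mu * x_mil * dt + sigma * x_mil * dW[k]
                      + 0.5 * sigma ** 2 * x_mil * (dW[k] ** 2 - dt))  # Milstein term
        exact = x0 * np.exp((mu - 0.5 * sigma ** 2) * T + sigma * dW.sum())
        errs_em.append(abs(x_em - exact))
        errs_mil.append(abs(x_mil - exact))
    return np.mean(errs_em), np.mean(errs_mil)

for n in (16, 64, 256):
    print(n, mean_strong_errors(n))   # Euler error ~ dt**0.5, Milstein ~ dt
```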

  6. Stochastic model simulation using Kronecker product analysis and Zassenhaus formula approximation.

    PubMed

    Caglar, Mehmet Umut; Pal, Ranadip

    2013-01-01

    Probabilistic models are regularly applied in genetic regulatory network modeling to capture the stochastic behavior observed in the generation of biological entities such as mRNA or proteins. Several approaches, including stochastic master equations and probabilistic Boolean networks, have been proposed to model the stochastic behavior in genetic regulatory networks. It is generally accepted that the stochastic master equation is a fundamental model that can describe the system being investigated in fine detail, but the application of this model is computationally enormously expensive. On the other hand, the probabilistic Boolean network captures only the coarse-scale stochastic properties of the system without modeling the detailed interactions. We propose a new approximation of the stochastic master equation model that is able to capture the finer details of the modeled system, including bistabilities and oscillatory behavior, and yet has a significantly lower computational complexity. In this new method, we represent the system using tensors and derive an identity to exploit the sparse connectivity of regulatory targets for complexity reduction. The algorithm involves an approximation based on the Zassenhaus formula to represent the exponential of a sum of matrices as a product of matrices. We derive upper bounds on the expected error of the proposed model distribution as compared to the stochastic master equation model distribution. Simulation results of the application of the model to four different biological benchmark systems illustrate performance comparable to detailed stochastic master equation models but with considerably lower computational complexity. The results also demonstrate the reduced complexity of the new approach as compared to the commonly used stochastic simulation algorithm for equivalent accuracy.
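
    The key idea of splitting the exponential of a sum of matrices into a product of exponentials (the leading factor of the Zassenhaus expansion, equivalently a Lie-Trotter step) can be demonstrated on small random matrices; this sketch is purely illustrative and is not the authors' tensor algorithm:

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(3)
A = 0.5 * rng.standard_normal((4, 4))
B = 0.5 * rng.standard_normal((4, 4))

exact = expm(A + B)
for n in (1, 4, 16, 64):
    step = expm(A / n) @ expm(B / n)          # leading factor of the expansion
    approx = np.linalg.matrix_power(step, n)
    print(n, np.linalg.norm(approx - exact))  # error shrinks as n grows
```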

  7. Cape York Peninsula, Australia: A Frontier Region Undergoing a Multifunctional Transition with Indigenous Engagement

    ERIC Educational Resources Information Center

    Holmes, John

    2012-01-01

    Within Australia's tropical savanna zone, the northernmost frontier regions have experienced the swiftest transition towards multifunctional occupance, as a formerly flimsy productivist mode is readily displaced by more complex modes, with greater prominence given to consumption, protection and Indigenous values. Of these frontier regions, Cape…

  8. Frontiers in Distributed Optimization and Control of Sustainable Power

    Science.gov Websites

    In January 2016, NREL's energy systems integration team hosted a workshop on frontiers in distributed optimization and control of sustainable power systems.

  9. Organising Industrial Knowledge Dissemination on Frontier Technology

    ERIC Educational Resources Information Center

    Brintrup, A. M.; Ranasinghe, D.

    2008-01-01

    This paper describes the challenges faced by frontier technology education, typical among large integrated EU projects. These include an evolving nature, the scarcity of experts and established material, and the need for relevant material. Classical approaches to learning seem not to adequately address the needs of frontier technology alone.…

  10. On the Endogeneity of the Mean-Variance Efficient Frontier.

    ERIC Educational Resources Information Center

    Somerville, R. A.; O'Connell, Paul G. J.

    2002-01-01

    Explains that the endogeneity of the efficient frontier in the mean-variance model of portfolio selection is commonly obscured in portfolio selection literature and in widely used textbooks. Demonstrates endogeneity and discusses the impact of parameter changes on the mean-variance efficient frontier and on the beta coefficients of individual…

  11. American Pathfinders: Using Ray Bradbury's "Martian Chronicles" To Teach Frontier History.

    ERIC Educational Resources Information Center

    Schmalholz, Deborah Wielgot

    1999-01-01

    Presents an interdisciplinary thematic unit designed for eleventh graders that uses selected chapters of "The Martian Chronicles" to teach frontier history. Maintains that Bradbury's novel enriches students' understanding of the frontier because it compares the interactions between Native inhabitants of Mars and Earthlings to the…

  12. Stereoselective Epoxidation of 4-Deoxypentenosides: A Polarized-π Model

    PubMed Central

    Cheng, Gang; Boulineau, Fabien P.; Liew, Siong-Tern; Shi, Qicun; Wenthold, Paul G.; Wei, Alexander

    2008-01-01

    The high facioselectivity in the epoxidation of 4-deoxypentenosides (4-DPs) by dimethyldioxirane (DMDO) correlates with a stereoelectronic bias in the 4-DPs’ ground-state conformations, as elucidated by polarized-π frontier molecular orbital (PPFMO) analysis. PMID:16986946

  13. Expanding the Frontiers of Population Nutrition Research: New Questions, New Methods, and New Approaches

    PubMed Central

    Pelletier, David L.; Porter, Christine M.; Aarons, Gregory A.; Wuehler, Sara E.; Neufeld, Lynnette M.

    2013-01-01

    Nutrition research, ranging from molecular to population levels and all points along this spectrum, is exploring new frontiers as new technologies and societal changes create new possibilities and demands. This paper defines a set of frontiers at the population level that are being created by the increased societal recognition of the importance of nutrition; its connection to urgent health, social, and environmental problems; and the need for effective and sustainable solutions at the population level. The frontiers are defined in terms of why, what, who, and how we study at the population level and the disciplinary foundations for that research. The paper provides illustrations of research along some of these frontiers, an overarching framework for population nutrition research, and access to some of the literature from outside of nutrition that can enhance the intellectual coherence, practical utility, and societal benefit of population nutrition research. The frontiers defined in this paper build on earlier forward-looking efforts by the American Society for Nutrition and extend these efforts in significant ways. The American Society for Nutrition and its members can play pivotal roles in advancing these frontiers by addressing a number of well-recognized challenges associated with transdisciplinary and engaged research. PMID:23319128

  14. Flow injection analysis simulations and diffusion coefficient determination by stochastic and deterministic optimization methods.

    PubMed

    Kucza, Witold

    2013-07-25

    Stochastic and deterministic simulations of dispersion in cylindrical channels under Poiseuille flow are presented. The random walk (stochastic) and uniform dispersion (deterministic) models are used to compute flow injection analysis responses. These methods, coupled with the genetic algorithm and the Levenberg-Marquardt optimization method, respectively, are applied to the determination of diffusion coefficients. The diffusion coefficients of fluorescein sodium, potassium hexacyanoferrate and potassium dichromate are determined by means of the presented methods and FIA responses available in the literature. The best-fit results agree with each other and with experimental data, thus validating both presented approaches. Copyright © 2013 The Author. Published by Elsevier B.V. All rights reserved.
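
    A minimal random-walk sketch of dispersion under a parabolic velocity profile, using a planar channel as a simplified analogue of the cylindrical geometry; all physical parameters below are placeholders, not the paper's experimental values:

```python
import numpy as np

rng = np.random.default_rng(4)
n_particles, n_steps = 5000, 2000
h = 1e-3        # channel half-width (m)
u_max = 1e-2    # centerline velocity (m/s)
D = 1e-9        # molecular diffusion coefficient (m^2/s)
dt = 0.01       # time step (s)

x = np.zeros(n_particles)                    # axial positions
y = rng.uniform(-h, h, n_particles)          # transverse positions
step = np.sqrt(2 * D * dt)
for _ in range(n_steps):
    x += u_max * (1 - (y / h) ** 2) * dt     # parabolic (Poiseuille-like) advection
    y += rng.normal(0.0, step, n_particles)  # transverse random walk
    y = np.where(y > h, 2 * h - y, y)        # reflect at the walls
    y = np.where(y < -h, -2 * h - y, y)

print("mean axial position:", x.mean(), "axial variance:", x.var())
```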

  15. Stochastic Watershed Models for Risk Based Decision Making

    NASA Astrophysics Data System (ADS)

    Vogel, R. M.

    2017-12-01

    Over half a century ago, the Harvard Water Program introduced the field of operational or synthetic hydrology, providing stochastic streamflow models (SSMs) which could generate ensembles of synthetic streamflow traces useful for hydrologic risk management. The application of SSMs, based on streamflow observations alone, revolutionized water resources planning activities, yet has fallen out of favor due, in part, to their inability to account for the now nearly ubiquitous anthropogenic influences on streamflow. This commentary advances the modern equivalent of SSMs, termed 'stochastic watershed models' (SWMs), useful as input to nearly all modern risk-based water resource decision-making approaches. SWMs are deterministic watershed models implemented using stochastic meteorological series, model parameters and model errors to generate ensembles of streamflow traces that represent the variability in possible future streamflows. SWMs combine deterministic watershed models, which are ideally suited to accounting for anthropogenic influences, with recent developments in uncertainty analysis and principles of stochastic simulation.
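
    A toy sketch of the SWM recipe described above: run a deterministic watershed model (here a one-bucket linear reservoir, purely illustrative) under stochastic meteorology, stochastic parameters, and stochastic model errors to produce an ensemble of streamflow traces. All distributions and numbers are assumptions:

```python
import numpy as np

rng = np.random.default_rng(5)

def linear_reservoir(rain, k):
    """Toy deterministic watershed model: storage drains at fractional rate k."""
    s, q = 0.0, np.empty(rain.size)
    for t, r in enumerate(rain):
        s += r
        q[t] = k * s
        s -= q[t]
    return q

base_rain = rng.gamma(0.3, 10.0, 365)     # hypothetical daily rainfall (mm)
ensemble = []
for _ in range(100):
    rain = base_rain * rng.lognormal(0.0, 0.3, base_rain.size)  # stochastic meteorology
    k = np.clip(rng.normal(0.2, 0.03), 0.01, 0.9)               # stochastic parameter
    q = linear_reservoir(rain, k)
    q *= rng.lognormal(0.0, 0.1, q.size)                        # stochastic model error
    ensemble.append(q)

ensemble = np.array(ensemble)             # 100 equally plausible streamflow traces
print("median annual flow:", np.median(ensemble.sum(axis=1)))
```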

  16. Bedrock topography of Talos Dome and Frontier Mountain area

    NASA Astrophysics Data System (ADS)

    Forieri, A.; Tabacco, I.; della Vedova, A.; Zirizzotti, A.; de Michelis, P.

    2003-04-01

    Talos Dome is an ice dome in East Antarctica near the coastline. Its exact position was located first through the analysis of ERS-1 data and then from kinematic GPS data collected in 2002. In the area of Talos Dome, two traverse surveys were carried out in 1996 and 2002, and eight shallow snow-firn cores were drilled in order to understand latitudinal and longitudinal gradients and to document climatic and atmospheric conditions. The interest in the Talos Dome area is due to the possibility of extracting an ice core down to the bedrock: it would be the first deep drilling at a near-coastal site. Frontier Mountain is located about 30 km SE of Talos Dome, and its blue-ice field is an important meteorite trap. The concentration mechanism is due to the particular flow of the ice, which moves slowly against an absolute and submerged barrier. In the area of Talos Dome and Frontier Mountain, airborne radar surveys were conducted by the Italian PNRA (Programma Nazionale di Ricerche in Antartide) in 1995, 1997, 1999 and 2001. We present here the bedrock topography obtained by the analysis of all radar data. Our objective is to obtain a full description of the main characteristics of the bedrock. This could be helpful in choosing the best site for drilling and could provide more input data for flow models near Frontier Mountain. The radar data are not homogeneous because radar systems with different characteristics have been used. All data have been processed with the same criteria to obtain a homogeneous dataset. Radio-echo sounding records show quite good reflections from the ice sheet base and the internal layering. This confirms the preliminary results of the snow radar data, with a continuous and horizontal (up to 15 km from the Dome) internal layering. The data from all expeditions have been cross-controlled and are in good agreement with each other.

  17. Image analysis methods for assessing levels of image plane nonuniformity and stochastic noise in a magnetic resonance image of a homogeneous phantom.

    PubMed

    Magnusson, P; Olsson, L E

    2000-08-01

    Magnetic resonance image plane nonuniformity and stochastic noise are properties that greatly influence the outcome of quantitative magnetic resonance imaging (MRI) evaluations such as gel dosimetry measurements using MRI. To study these properties, robust and accurate image analysis methods are required. New nonuniformity level assessment methods were designed, since previous methods were found to be insufficiently robust and accurate. The new and previously reported nonuniformity level assessment methods were analyzed with respect to, for example, insensitivity to stochastic noise; and previously reported stochastic noise level assessment methods with respect to insensitivity to nonuniformity. Using the same image data, different methods were found to assess significantly different levels of nonuniformity. Nonuniformity levels obtained using methods that count pixels in an intensity interval, and obtained using methods that use only intensity values, were found not to be comparable. The latter were found preferable, since they assess the quantity intrinsically sought. A new method which calculates a deviation image, with every pixel representing the deviation from a reference intensity, was least sensitive to stochastic noise. Furthermore, unlike any other analyzed method, it includes all intensity variations across the phantom area and allows for studies of nonuniformity shapes. This new method was designed for accurate studies of nonuniformities in gel dosimetry measurements, but could also be used with benefit in quality assurance and acceptance testing of MRI, scintillation camera, and computer tomography systems. The stochastic noise level was found to be greatly method dependent. Two methods were found to be insensitive to nonuniformity and also simple to use in practice. One method assesses the stochastic noise level as the average of the levels at five different positions within the phantom area, and the other assesses the stochastic noise in a region outside the phantom area.
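
    The two assessment ideas singled out by the record, a deviation image relative to a reference intensity and a noise level averaged over five regions, can be sketched in a few lines; the synthetic image, the ramp, and the ROI placements are assumptions, not the paper's protocol:

```python
import numpy as np

rng = np.random.default_rng(6)
img = 1000.0 + 30.0 * rng.standard_normal((256, 256))  # stand-in phantom image
img += 0.1 * np.arange(256)   # hypothetical left-to-right ramp (nonuniformity)

# Deviation image: each pixel's deviation from a reference intensity
# (here the mean over a small central ROI), in percent.
ref = img[118:138, 118:138].mean()
deviation = (img - ref) / ref * 100.0
print("nonuniformity range (%):", deviation.min(), deviation.max())

# Noise level: average of standard deviations in five ROIs inside the phantom.
rois = [(40, 40), (40, 200), (128, 128), (200, 40), (200, 200)]
noise = np.mean([img[r - 10:r + 10, c - 10:c + 10].std() for r, c in rois])
print("stochastic noise estimate:", noise)
```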

  18. Heart of the Solution - Energy Frontiers (A "Life at the Frontiers of Energy Research" contest entry from the 2011 Energy Frontier Research Centers (EFRCs) Summit and Forum)

    ScienceCinema

    Green, Peter F. (Director, Center for Solar and Thermal Energy Conversion, University of Michigan); CSTEC Staff

    2017-12-09

    'Heart of the Solution - Energy Frontiers' was submitted by the Center for Solar and Thermal Energy Conversion (CSTEC) to the 'Life at the Frontiers of Energy Research' video contest at the 2011 Science for Our Nation's Energy Future: Energy Frontier Research Centers (EFRCs) Summit and Forum. Twenty-six EFRCs created short videos to highlight their mission and their work. This video was both the People's Choice Award winner and selected as one of five winners by a distinguished panel of judges for its 'exemplary explanation of the role of an Energy Frontier Research Center'. The Center for Solar and Thermal Energy Conversion is directed by Peter F. Green at the University of Michigan. The Office of Basic Energy Sciences in the U.S. Department of Energy's Office of Science established the 46 Energy Frontier Research Centers (EFRCs) in 2009. These collaboratively-organized centers conduct fundamental research focused on 'grand challenges' and use-inspired 'basic research needs' recently identified in major strategic planning efforts by the scientific community. The overall purpose is to accelerate scientific progress toward meeting the nation's critical energy challenges. The mission of the Center for Solar and Thermal Energy Conversion is 'to study complex material structures on the nanoscale to identify key features for their potential use as materials to convert solar energy and heat to electricity.' Research topics are: solar photovoltaic, photonic, optics, solar thermal, thermoelectric, phonons, thermal conductivity, solar electrodes, defects, ultrafast physics, interfacial characterization, matter by design, novel materials synthesis, charge transport, and self-assembly.

  19. Hamiltonian Analysis of Subcritical Stochastic Epidemic Dynamics

    PubMed Central

    2017-01-01

    We extend a technique of approximation of the long-term behavior of a supercritical stochastic epidemic model, using the WKB approximation and a Hamiltonian phase space, to the subcritical case. The limiting behavior of the model and approximation are qualitatively different in the subcritical case, requiring a novel analysis of the limiting behavior of the Hamiltonian system away from its deterministic subsystem. This yields a novel, general technique of approximation of the quasistationary distribution of stochastic epidemic and birth-death models and may lead to techniques for analysis of these models beyond the quasistationary distribution. For a classic SIS model, the approximation found for the quasistationary distribution is very similar to published approximations but not identical. For a birth-death process without depletion of susceptibles, the approximation is exact. Dynamics on the phase plane similar to those predicted by the Hamiltonian analysis are demonstrated in cross-sectional data from trachoma treatment trials in Ethiopia, in which declining prevalences are consistent with subcritical epidemic dynamics. PMID:28932256

  20. Frontier Schools in Montana: Challenges and Sustainability Practices. A Research Report

    ERIC Educational Resources Information Center

    Harmon, Hobart L.; Morton, Claudette

    2010-01-01

    This study reveals the challenges confronting small, rural "frontier" schools in Montana and the practices that contribute to their sustainability. A Montana frontier school is defined as a school district with 200 or fewer students and its attendant community in a county with five or fewer people per square mile. The researcher…

  1. Fermilab | Science at Fermilab | Experiments & Projects | Energy Frontier

    Science.gov Websites

  2. Center for Support of Mental Health Services in Isolated Rural Areas. Final Report.

    ERIC Educational Resources Information Center

    Ciarlo, James A.

    In 1994, the University of Denver received a grant to develop and operate the Frontier Mental Health Services Resource Network (FMHSRN). FMHSRN's principal aim was to improve delivery of mental health services in sparsely populated "frontier" areas by providing technical assistance to frontier and rural audiences. Traditional…

  3. The Production and Cost Behavior of Higher Education Institutions.

    ERIC Educational Resources Information Center

    Carlson, Daryl E.

    This report is an empirical analysis of the "frontier" production and cost relationships between the number of students enrolled and the labor and capital inputs observed over a wide cross-section of four-year higher education institutions in the United States. In the analysis, students are differentiated as to type and as to part-time versus…

  4. Analysis of longitudinal data from the Puget Sound Transportation Panel : task F : cross section and dynamic analysis of activity and travel patterns in PSTP

    DOT National Transportation Integrated Search

    1995-02-01

    The profiles contained in the appendix are all in the Portland, Maine district. They are listed below by border groups as used in the study, with the U.S. Customs port codes indicated. Maine Frontier Border Crossings: Calais - Calais, Ferry Point, ME...

  5. Phenomenological analysis of medical time series with regular and stochastic components

    NASA Astrophysics Data System (ADS)

    Timashev, Serge F.; Polyakov, Yuriy S.

    2007-06-01

    Flicker-Noise Spectroscopy (FNS), a general approach to the extraction and parameterization of resonant and stochastic components contained in medical time series, is presented. The basic idea of FNS is to treat the correlation links present in sequences of different irregularities, such as spikes, "jumps", and discontinuities in derivatives of different orders, on all levels of the spatiotemporal hierarchy of the system under study as the main information carriers. The tools to extract and analyze the information are power spectra and difference moments (structural functions), which complement each other's information. The structural function stochastic component is formed exclusively by "jumps" of the dynamic variable, while the power spectrum stochastic component is formed by both spikes and "jumps" on every level of the hierarchy. The information "passport" characteristics that are determined by fitting the derived expressions to the experimental variations for the stochastic components of power spectra and structural functions are interpreted as the correlation times and parameters that describe the rate of "memory loss" on these correlation time intervals for different irregularities. The number of extracted parameters is determined by the requirements of the problem under study. Application of this approach to the analysis of tremor velocity signals for a Parkinsonian patient is discussed.
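
    The two complementary FNS tools, power spectra and difference moments (structure functions), can be sketched for a toy signal; this is a bare-bones illustration, not the full FNS parameterization:

```python
import numpy as np

rng = np.random.default_rng(7)
v = np.cumsum(rng.standard_normal(4096))   # stand-in signal dominated by "jumps"

# Power spectrum estimate (FNS fits parameterized expressions to this).
spectrum = np.abs(np.fft.rfft(v - v.mean())) ** 2 / v.size

# Second-order difference moment (structure function):
# Phi2(tau) = <[v(t + tau) - v(t)]^2>.
taus = np.arange(1, 200)
phi2 = np.array([np.mean((v[tau:] - v[:-tau]) ** 2) for tau in taus])
print(phi2[:5])   # grows roughly linearly in tau for a Brownian-like signal
```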

  6. DNA

    ERIC Educational Resources Information Center

    Stent, Gunther S.

    1970-01-01

    This history of molecular genetics and its explanation of DNA begins with an analysis of the Golden Jubilee essay papers, 1955. The paper ends by stating that the higher nervous system is the one major frontier of biological inquiry which still offers some romance of research. (Author/VW)

  7. Stochastic model stationarization by eliminating the periodic term and its effect on time series prediction

    NASA Astrophysics Data System (ADS)

    Moeeni, Hamid; Bonakdari, Hossein; Fatemi, Seyed Ehsan

    2017-04-01

    Because time series stationarization plays a key role in stochastic modeling results, three methods are analyzed in this study: seasonal differencing, seasonal standardization and spectral analysis, used to eliminate the periodic effect on time series stationarity. First, six time series, including 4 streamflow series and 2 water temperature series, are stationarized. The stochastic term for these series obtained with ARIMA is subsequently modeled. For the analysis, 9228 models are introduced. It is observed that seasonal standardization and spectral analysis eliminate the periodic term completely, while seasonal differencing maintains seasonal correlation structures. The obtained results indicate that all three methods present acceptable performance overall. However, model accuracy in monthly streamflow prediction is higher with seasonal differencing than with the other two methods. Another advantage of seasonal differencing over the other methods is that the monthly streamflow is never estimated as negative. Standardization is the best method for predicting monthly water temperature, although it is quite similar to seasonal differencing, while spectral analysis performed the weakest in all cases. It is concluded that for each monthly seasonal series, seasonal differencing is the best stationarization method in terms of periodic effect elimination. Moreover, the monthly water temperature is predicted with more accuracy than the monthly streamflow. The ratios of the average stochastic term to the amplitude of the periodic term obtained for monthly streamflow and monthly water temperature were 0.19 and 0.30, 0.21 and 0.13, and 0.07 and 0.04, respectively.
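
    Two of the three stationarization methods compared in the record, seasonal differencing and seasonal standardization, reduce to a few lines of array manipulation; the synthetic monthly series below is an assumption:

```python
import numpy as np

rng = np.random.default_rng(8)
months = np.arange(240)                            # 20 years of monthly data
flow = (50 + 30 * np.sin(2 * np.pi * months / 12)  # periodic (seasonal) term
        + 5 * rng.standard_normal(240))            # stochastic term

# Seasonal differencing: subtract the value observed 12 months earlier.
diff12 = flow[12:] - flow[:-12]

# Seasonal standardization: remove each calendar month's mean, divide by its std.
by_month = flow.reshape(-1, 12)
standardized = ((by_month - by_month.mean(axis=0)) / by_month.std(axis=0)).ravel()

print(diff12.mean(), standardized.mean())   # both are now (near-)stationary series
```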

  8. Reduced linear noise approximation for biochemical reaction networks with time-scale separation: The stochastic tQSSA+

    NASA Astrophysics Data System (ADS)

    Herath, Narmada; Del Vecchio, Domitilla

    2018-03-01

    Biochemical reaction networks often involve reactions that take place on different time scales, giving rise to "slow" and "fast" system variables. This property is widely used in the analysis of systems to obtain dynamical models with reduced dimensions. In this paper, we consider stochastic dynamics of biochemical reaction networks modeled using the Linear Noise Approximation (LNA). Under time-scale separation conditions, we obtain a reduced-order LNA that approximates both the slow and fast variables in the system. We mathematically prove that the first and second moments of this reduced-order model converge to those of the full system as the time-scale separation becomes large. These mathematical results, in particular, provide a rigorous justification to the accuracy of LNA models derived using the stochastic total quasi-steady state approximation (tQSSA). Since, in contrast to the stochastic tQSSA, our reduced-order model also provides approximations for the fast variable stochastic properties, we term our method the "stochastic tQSSA+". Finally, we demonstrate the application of our approach on two biochemical network motifs found in gene-regulatory and signal transduction networks.

  9. Frontier Fields: Engaging Educators, the Youth, and the Public in Exploring the Cosmic Frontier

    NASA Astrophysics Data System (ADS)

    Lawton, Brandon L.; Eisenhamer, Bonnie; Smith, Denise A.; Summers, Frank; Darnell, John A.; Ryer, Holly

    2015-01-01

    The Frontier Fields is a multi-cycle program of six deep-field observations of strong-lensing galaxy clusters that will be taken in parallel with six deep 'blank fields.' The three-year long collaborative program is led by observations from NASA's Great Observatories. The observations allow astronomers to look deeper into the universe than ever before, and potentially uncover galaxies that are as much as 100 times fainter than what the telescopes can typically observe. The Frontier Fields science program is ideal for informing audiences about scientific advances and topics in STEM. The study of galaxy properties, statistics, optics, and Einstein's theory of general relativity builds naturally on the science returns of the Frontier Fields program. As a result, the Space Telescope Science Institute's Office of Public Outreach (OPO) has initiated an education and public outreach (EPO) project to follow the progress of the Frontier Fields. For over two decades, the Hubble EPO program has sought to bring the wonders of the universe to the education community, the youth, and the public, and engage audiences in the adventure of scientific discovery. Program components include standards-based curriculum-support materials, exhibits and exhibit components, professional development workshops, and direct interactions with scientists. We are also leveraging our new social media strategy to bring the science program to the public in the form of an ongoing blog. The main underpinnings of the program's infrastructure are scientist-educator development teams, partnerships, and an embedded program evaluation component. OPO is leveraging this existing infrastructure to bring the Frontier Fields science program to the education community and the public in a cost-effective way. The Frontier Fields program has just completed its first year. This talk will feature the goals and current status of the Frontier Fields EPO program. We will highlight OPO's strategies and infrastructure that allow for the quick delivery of groundbreaking science to the education community and the public.

  10. Global behavior analysis for stochastic system of 1,3-PD continuous fermentation

    NASA Astrophysics Data System (ADS)

    Zhu, Xi; Kliemann, Wolfgang; Li, Chunfa; Feng, Enmin; Xiu, Zhilong

    2017-12-01

    Global behavior of a stochastic system of continuous fermentation in glycerol bio-dissimilation to 1,3-propanediol by Klebsiella pneumoniae is analyzed in this paper. This bioprocess cannot avoid the stochastic perturbations caused by internal and external disturbances, which are reflected in the growth rate. These negative factors can limit and degrade the achievable performance of controlled systems. Based on multiplicity phenomena, the equilibria and bifurcations of the deterministic system are analyzed. Then, a stochastic model is presented by a bounded Markov diffusion process. In order to analyze the global behavior, we compute the control sets for the associated control system. The probability distributions of the relative supports are also computed. The simulation results indicate how the disturbed biosystem tends to stationary behavior globally.

  11. Didactic discussion of stochastic resonance effects and weak signals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adair, R.K.

    1996-12-01

    A simple, paradigmatic, model is used to illustrate some general properties of effects subsumed under the label stochastic resonance. In particular, analyses of the transparent model show that (1) a small amount of noise added to a much larger signal can greatly increase the response to the signal, but (2) a weak signal added to much larger noise will not generate a substantial added response. The conclusions drawn from the model illustrate the general result that stochastic resonance effects do not provide an avenue for signals that are much smaller than noise to affect biology. A further analysis demonstrates the effects of small signals in the shifting of biologically important chemical equilibria under conditions where stochastic resonance effects are significant.
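
    The qualitative behavior can be illustrated with the simplest threshold-detector model of stochastic resonance (not the record's model): a subthreshold periodic signal produces almost no output without noise, a moderate amount of noise makes it visible, and large noise drowns it. All numbers below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(13)
dt = 0.01
t = np.arange(0.0, 400.0, dt)
f0 = 0.05
signal = 0.8 * np.sin(2 * np.pi * f0 * t)        # subthreshold: never crosses 1.0 alone

def snr_at_signal_freq(noise_std):
    x = signal + noise_std * rng.standard_normal(t.size)
    y = (x > 1.0).astype(float)                  # crude threshold detector
    if not y.any():
        return 0.0
    spec = np.abs(np.fft.rfft(y - y.mean())) ** 2
    freqs = np.fft.rfftfreq(t.size, d=dt)
    k = np.argmin(np.abs(freqs - f0))
    background = np.median(spec[k + 5:k + 50])   # local noise floor
    return spec[k] / background

for s in (0.05, 0.3, 1.0, 3.0):
    print(s, round(snr_at_signal_freq(s), 1))    # response peaks at intermediate noise
```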

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dykstra, D.; Blomer, J.

    Both the CernVM File System (CVMFS) and the Frontier Distributed Database Caching System (Frontier) distribute centrally updated data worldwide for LHC experiments using http proxy caches. Neither system provides privacy or access control on reading the data, but both control access to updates of the data and can guarantee the authenticity and integrity of the data transferred to clients over the internet. CVMFS has since its early days required digital signatures and secure hashes on all distributed data, and recently Frontier has added X.509-based authenticity and integrity checking. In this paper we detail and compare the security models of CVMFS and Frontier.

  13. Quantitative analysis of random ameboid motion

    NASA Astrophysics Data System (ADS)

    Bödeker, H. U.; Beta, C.; Frank, T. D.; Bodenschatz, E.

    2010-04-01

    We quantify random migration of the social ameba Dictyostelium discoideum. We demonstrate that the statistics of cell motion can be described by an underlying Langevin-type stochastic differential equation. An analytic expression for the velocity distribution function is derived. The separation into deterministic and stochastic parts of the movement shows that the cells undergo a damped motion with multiplicative noise. Both contributions to the dynamics display a distinct response to external physiological stimuli. The deterministic component depends on the developmental state and ambient levels of signaling substances, while the stochastic part does not.
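
    The separation into deterministic and stochastic parts can be sketched by estimating conditional moments of increments (a Kramers-Moyal-type analysis) from a simulated trajectory with multiplicative noise; the model below is illustrative, not the Dictyostelium data:

```python
import numpy as np

rng = np.random.default_rng(9)
dt, n = 0.01, 200_000
v = np.empty(n)
v[0] = 0.0
for k in range(n - 1):      # dv = -v dt + 0.3*sqrt(1 + v^2) dW (multiplicative noise)
    v[k + 1] = (v[k] - v[k] * dt
                + 0.3 * np.sqrt(1 + v[k] ** 2) * np.sqrt(dt) * rng.standard_normal())

# Conditional moments of increments recover the deterministic and stochastic parts.
bins = np.linspace(-0.6, 0.6, 13)
idx = np.digitize(v[:-1], bins)
dv = np.diff(v)
for b in (3, 6, 9):                               # a few interior bins
    sel = idx == b
    center = 0.5 * (bins[b - 1] + bins[b])
    drift = dv[sel].mean() / dt                   # should track -v
    noise2 = (dv[sel] ** 2).mean() / dt           # should track 0.09 * (1 + v**2)
    print(round(center, 2), round(drift, 2), round(noise2, 3))
```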

  14. Towards Stability Analysis of Jump Linear Systems with State-Dependent and Stochastic Switching

    NASA Technical Reports Server (NTRS)

    Tejada, Arturo; Gonzalez, Oscar R.; Gray, W. Steven

    2004-01-01

    This paper analyzes the stability of hierarchical jump linear systems where the supervisor is driven by a Markovian stochastic process and by the values of the supervised jump linear system's states. The stability framework for this class of systems is developed over infinite and finite time horizons. The framework is then used to derive sufficient stability conditions for a specific class of hybrid jump linear systems with performance supervision. New sufficient stochastic stability conditions for discrete-time jump linear systems are also presented.

  15. Reflected stochastic differential equation models for constrained animal movement

    USGS Publications Warehouse

    Hanks, Ephraim M.; Johnson, Devin S.; Hooten, Mevin B.

    2017-01-01

    Movement for many animal species is constrained in space by barriers such as rivers, shorelines, or impassable cliffs. We develop an approach for modeling animal movement constrained in space by considering a class of constrained stochastic processes, reflected stochastic differential equations. Our approach generalizes existing methods for modeling unconstrained animal movement. We present methods for simulation and inference based on augmenting the constrained movement path with a latent unconstrained path and illustrate this augmentation with a simulation example and an analysis of telemetry data from a Steller sea lion (Eumetopias jubatus) in southeast Alaska.
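
    A minimal sketch of the simulation side: a reflected Euler scheme in which each proposed Brownian step is folded back into the allowed interval, a one-dimensional stand-in for a shoreline constraint. The domain and step sizes are assumptions, not the paper's model:

```python
import numpy as np

def reflect(z, lo, hi):
    """Fold a proposed position back into [lo, hi] (billiard-style reflection)."""
    width = hi - lo
    z = np.mod(z - lo, 2 * width)
    return lo + np.where(z > width, 2 * width - z, z)

rng = np.random.default_rng(10)
n_steps, dt, sigma = 10_000, 0.1, 1.0
x = np.empty(n_steps)
x[0] = 0.0
for k in range(n_steps - 1):
    proposal = x[k] + sigma * np.sqrt(dt) * rng.standard_normal()  # free step
    x[k + 1] = reflect(proposal, lo=-5.0, hi=5.0)   # barriers at +-5 (e.g. shorelines)

print(x.min(), x.max())   # the path never leaves the constrained domain
```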

  16. Three-stage stochastic pump: Another type of Onsager-Casimir symmetry and results far from equilibrium

    NASA Astrophysics Data System (ADS)

    Rosas, Alexandre; Van den Broeck, Christian; Lindenberg, Katja

    2018-06-01

    The stochastic thermodynamic analysis of a time-periodic single particle pump sequentially exposed to three thermochemical reservoirs is presented. The analysis provides explicit results for flux, thermodynamic force, entropy production, work, and heat. These results apply near equilibrium as well as far from equilibrium. In the linear response regime, a different type of Onsager-Casimir symmetry is uncovered. The Onsager matrix becomes symmetric in the limit of zero dissipation.

  17. Analysis and Application of Quality Economics Based on Input-Output

    NASA Astrophysics Data System (ADS)

    Lu, Qiang; Li, Xin

    2018-01-01

    Quality economics analysis is an important research area at the current economic frontier and plays a major role in promoting quality- and benefit-oriented development in China. Through the study of quality economics analysis and its applications, the economics of quality and quality economics management are summarized, and a theoretical framework for quality economics analysis is constructed. Finally, the quality economics analysis of aerospace equipment is taken as an example of applied research.

  18. Autoionizing states driven by stochastic electromagnetic fields

    NASA Astrophysics Data System (ADS)

    Mouloudakis, G.; Lambropoulos, P.

    2018-01-01

    We have examined the profile of an isolated autoionizing resonance driven by a pulse of short duration and moderately strong field. The analysis has been based on stochastic differential equations governing the time evolution of the density matrix under a stochastic field. Having focused our quantitative analysis on the 2s2p(¹P) resonance of helium, we have investigated the role of field fluctuations and of the duration of the pulse. We report surprisingly strong distortion of the profile, even for peak intensity below the strong field limit. Our results demonstrate the intricate connection between intensity and pulse duration, with the latter appearing to be the determining influence, even for a seemingly short pulse of 50 fs. Further effects that would arise under much shorter pulses are discussed.

  19. Heart of the Solution - Energy Frontiers (A "Life at the Frontiers of Energy Research" contest entry from the 2011 Energy Frontier Research Centers (EFRCs) Summit and Forum)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Green, Peter F.

    "Heart of the Solution- Energy Frontiers" was submitted by the Center for Solar and Thermal Energy Conversion (CSTEC) to the "Life at the Frontiers of Energy Research" video contest at the 2011 Science for Our Nation's Energy Future: Energy Frontier Research Centers (EFRCs) Summit and Forum. Twenty-six EFRCs created short videos to highlight their mission and their work. This video was both the People's Choice Award winner and selected as one of five winners by a distinguished panel of judges for its "exemplary explanation of the role of an Energy Frontier Research Center". The Center for Solar and Thermal Energymore » Conversion is directed by Peter F. Green at the University of Michigan. The Office of Basic Energy Sciences in the U.S. Department of Energy's Office of Science established the 46 Energy Frontier Research Centers (EFRCs) in 2009. These collaboratively-organized centers conduct fundamental research focused on 'grand challenges' and use-inspired 'basic research needs' recently identified in major strategic planning efforts by the scientific community. The overall purpose is to accelerate scientific progress toward meeting the nation's critical energy challenges. The mission of the Center for Solar and Thermal Energy Conversion is 'to study complex material structures on the nanoscale to identify key features for their potential use as materials to convert solar energy and heat to electricity.' Research topics are: solar photovoltaic, photonic, optics, solar thermal, thermoelectric, phonons, thermal conductivity, solar electrodes, defects, ultrafast physics, interfacial characterization, matter by design, novel materials synthesis, charge transport, and self-assembly.« less

  20. Bidirectional Classical Stochastic Processes with Measurements and Feedback

    NASA Technical Reports Server (NTRS)

    Hahne, G. E.

    2005-01-01

    A measurement on a quantum system is said to cause the "collapse" of the quantum state vector or density matrix. An analogous collapse occurs with measurements on a classical stochastic process. This paper addresses the question of describing the response of a classical stochastic process when there is feedback from the output of a measurement to the input, and is intended to give a model for quantum-mechanical processes that occur along a space-like reaction coordinate. The classical system can be thought of in physical terms as two counterflowing probability streams, which stochastically exchange probability currents in a way that the net probability current, and hence the overall probability, suitably interpreted, is conserved. The proposed formalism extends the mathematics of those stochastic processes describable with linear, single-step, unidirectional transition probabilities, known as Markov chains and stochastic matrices. It is shown that a certain rearrangement and combination of the input and output of two stochastic matrices of the same order yields another matrix of the same type. Each measurement causes the partial collapse of the probability current distribution in the midst of such a process, giving rise to calculable, but non-Markov, values for the ensuing modification of the system's output probability distribution. The paper concludes with an analysis of a classical probabilistic version of the so-called grandfather paradox.

  1. The Frontier Framework (and its eight Frontier Archetypes): A new conceptual approach to representing staff and patient well-being in health systems.

    PubMed

    Baines, Darrin L

    2018-05-04

    This paper proposes a new conceptual framework for jointly analysing the production of staff and patient welfare in health systems. Research to date has identified a direct link between staff and patient well-being. However, until now, no one has produced a unified framework for analysing them concurrently. In response, this paper introduces the "Frontier Framework". The new conceptual framework is applicable to all health systems regardless of their structure or financing. To demonstrate the benefits of its use, an empirical example of the Frontier Framework is constructed using data from the UK's National Health Service. This paper also introduces eight "Frontier Archetypes", which represent common patterns of welfare generation observable in health organisations involved in programmes of change. These archetypes may be used in planning, monitoring or creating narratives about organisational journeys. Copyright © 2018 The Author. Published by Elsevier Ltd. All rights reserved.

  2. A developmental basis for stochasticity in floral organ numbers

    PubMed Central

    Kitazawa, Miho S.; Fujimoto, Koichi

    2014-01-01

    Stochasticity ubiquitously and inevitably appears at all levels, from molecular traits to multicellular, morphological traits. Intrinsic stochasticity in biochemical reactions underlies the typical intercellular distributions of chemical concentrations, e.g., morphogen gradients, which can give rise to stochastic morphogenesis. While the universal statistics and mechanisms underlying the stochasticity at the biochemical level have been widely analyzed, those at the morphological level have not. Such morphological stochasticity is found in floral organ numbers. Although the floral organ number is a hallmark of floral species, it can distribute stochastically even within an individual plant. The probability distribution of the floral organ number within a population is usually asymmetric, i.e., it is more likely to increase rather than decrease from the modal value, or vice versa. We combined field observations, statistical analysis, and mathematical modeling to study the developmental basis of the variation in floral organ numbers among 50 species, mainly from Ranunculaceae and several other families from core eudicots. We compared six hypothetical mechanisms and found that a modified error function reproduced much of the asymmetric variation found in eudicot floral organ numbers. The error function is derived from mathematical modeling of floral organ positioning, and its parameters represent measurable distances in the floral bud morphologies. The model predicts two developmental sources of the organ-number distributions: stochastic shifts in the expression boundaries of homeotic genes and a semi-concentric (whorled-type) organ arrangement. Other models reproduced, species- or organ-specifically, different types of distributions that reflect different developmental processes. The organ-number variation could be an indicator of stochasticity in organ fate determination and organ positioning. PMID:25404932

  3. Introducing "Frontiers in Zoology"

    PubMed Central

    Heinze, Jürgen; Tautz, Diethard

    2004-01-01

    As a biological discipline, zoology has one of the longest histories. Today it occasionally appears as though, due to the rapid expansion of the life sciences, zoology has been replaced by more or less independent sub-disciplines amongst which exchange is often sparse. However, the recent advance of molecular methodology into "classical" fields of biology, and the development of theories that can explain phenomena on different levels of organisation, has led to a re-integration of zoological disciplines, promoting a broader than usual approach to zoological questions. Zoology has re-emerged as an integrative discipline encompassing the most diverse aspects of animal life, from the level of the gene to the level of the ecosystem. The new journal Frontiers in Zoology is the first Open Access journal focussing on zoology as a whole. It aims to represent and re-unite the various disciplines that look at animal life from different perspectives and to provide the basis for a comprehensive understanding of zoological phenomena on all levels of analysis. Frontiers in Zoology provides a unique opportunity to publish high quality research and reviews on zoological issues that will be internationally accessible to any reader at no cost. PMID:15679902

  4. Final Report for Research in High Energy Physics (University of Hawaii)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Browder, Thomas E.

    2013-08-31

    Here we present a final report for the DOE award for the University of Hawaii High Energy Physics Group (UHHEPG) for the period from December 1, 2009 to May 31, 2013 (including a period of no-cost extension). The high energy physics (HEP) group at the University of Hawaii (UH) has been engaged in experiments at the intensity frontier studying flavor physics (Task A: Belle, Belle-II and Task B: BES) and neutrinos (Task C: SuperK, LBNE, Double Chooz, DarkSide, and neutrino R&D). On the energy frontier, new types of pixel detectors were developed for upgrades of the ATLAS experiment at the LHC (Task D). On the cosmic frontier, there were investigations of ultra high-energy neutrino astrophysics and the highest energy cosmic rays using special radio detection techniques (Task E: AMBER, ANITA R&D) and results of the analysis of ANITA data. In addition, we have developed new types of sophisticated and cutting edge instrumentation based on novel "oscilloscope on a chip" electronics (Task F). Theoretical physics research (Task G) is phenomenologically oriented and has studied experimental consequences of existing and proposed new theories relevant to the energy, cosmic and intensity frontiers. The senior investigators for this proposal were T. E. Browder (Task A), F. A. Harris (Task B), P. Gorham (Task E), J. Kumar (Task G), J. Maricic (Task C), J. G. Learned (Task C), S. Pakvasa (Task G), S. Parker (Task D), S. Matsuno (Task C), X. Tata (Task G) and G. S. Varner (Tasks F, A, E).

  5. A new multicriteria risk mapping approach based on a multiattribute frontier concept.

    PubMed

    Yemshanov, Denys; Koch, Frank H; Ben-Haim, Yakov; Downing, Marla; Sapio, Frank; Siltanen, Marty

    2013-09-01

    Invasive species risk maps provide broad guidance on where to allocate resources for pest monitoring and regulation, but they often present individual risk components (such as climatic suitability, host abundance, or introduction potential) as independent entities. These independent risk components are integrated using various multicriteria analysis techniques that typically require prior knowledge of the risk components' importance. Such information is often nonexistent for many invasive pests. This study proposes a new approach for building integrated risk maps using the principle of a multiattribute efficient frontier and analyzing the partial order of elements of a risk map as distributed in multidimensional criteria space. The integrated risks are estimated as subsequent multiattribute frontiers in dimensions of individual risk criteria. We demonstrate the approach with the example of Agrilus biguttatus Fabricius, a high-risk pest that may threaten North American oak forests in the near future. Drawing on U.S. and Canadian data, we compare the performance of the multiattribute ranking against a multicriteria linear weighted averaging technique in the presence of uncertainties, using the concept of robustness from info-gap decision theory. The results show major geographic hotspots where the consideration of tradeoffs between multiple risk components changes integrated risk rankings. Both methods delineate similar geographical regions of high and low risks. Overall, aggregation based on a delineation of multiattribute efficient frontiers can be a useful tool to prioritize risks for anticipated invasive pests, which usually have an extremely poor prior knowledge base. Published 2013. This article is a U.S. Government work and is in the public domain in the USA.
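
    The subsequent-frontier idea can be sketched as iterative non-dominated sorting: peel off the Pareto-efficient set, assign it rank 0, and repeat on the remainder. The random risk scores below are placeholders for the actual criteria layers (climatic suitability, host abundance, introduction potential):

```python
import numpy as np

def frontier_ranks(scores):
    """Assign each point the index of the successive non-dominated (Pareto)
    frontier it belongs to; higher scores mean higher risk on every criterion."""
    n = len(scores)
    ranks = np.full(n, -1)
    remaining = np.arange(n)
    rank = 0
    while remaining.size:
        pts = scores[remaining]
        # A point is dominated if another remaining point is >= on all criteria
        # and strictly > on at least one.
        dominated = np.array([(np.all(pts >= p, axis=1)
                               & np.any(pts > p, axis=1)).any() for p in pts])
        ranks[remaining[~dominated]] = rank
        remaining = remaining[dominated]
        rank += 1
    return ranks

rng = np.random.default_rng(11)
risk = rng.random((12, 3))          # 12 map cells, 3 risk criteria (illustrative)
print(frontier_ranks(risk))         # 0 = outermost (highest-priority) frontier
```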

  6. Time Ordering in Frontal Lobe Patients: A Stochastic Model Approach

    ERIC Educational Resources Information Center

    Magherini, Anna; Saetti, Maria Cristina; Berta, Emilia; Botti, Claudio; Faglioni, Pietro

    2005-01-01

    Frontal lobe patients reproduced a sequence of capital letters or abstract shapes. Immediate and delayed reproduction trials allowed the analysis of short- and long-term memory for time order by means of suitable Markov chain stochastic models. Patients were as proficient as healthy subjects on the immediate reproduction trial, thus showing spared…

  7. Continuum of risk analysis methods to assess tillage system sustainability at the experimental plot level

    USDA-ARS?s Scientific Manuscript database

    The primary goal of this study was to evaluate the efficacy of stochastic dominance and stochastic efficiency with respect to a function (SERF) methodology for ranking conventional and conservation tillage systems using 14 years (1990-2003) of economic budget data collected from 36 plots at the Iowa...

  8. Analyzing long-term correlated stochastic processes by means of recurrence networks: Potentials and pitfalls

    NASA Astrophysics Data System (ADS)

    Zou, Yong; Donner, Reik V.; Kurths, Jürgen

    2015-02-01

    Long-range correlated processes are ubiquitous, ranging from climate variables to financial time series. One paradigmatic example for such processes is fractional Brownian motion (fBm). In this work, we highlight the potentials and conceptual as well as practical limitations when applying the recently proposed recurrence network (RN) approach to fBm and related stochastic processes. In particular, we demonstrate that the results of a previous application of RN analysis to fBm [Liu et al. Phys. Rev. E 89, 032814 (2014), 10.1103/PhysRevE.89.032814] are mainly due to an inappropriate treatment disregarding the intrinsic nonstationarity of such processes. Complementarily, we analyze some RN properties of the closely related stationary fractional Gaussian noise (fGn) processes and find that the resulting network properties are well-defined and behave as one would expect from basic conceptual considerations. Our results demonstrate that RN analysis can indeed provide meaningful results for stationary stochastic processes, given a proper selection of its intrinsic methodological parameters, whereas it is prone to fail to uniquely retrieve RN properties for nonstationary stochastic processes like fBm.
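
    The ε-recurrence-network construction evaluated above can be sketched compactly. The snippet below is a simplified stand-in, not the record's analysis: the series is plain white noise (i.e., fGn with H = 0.5) so that no fGn generator is needed, the state space is one-dimensional (no embedding), and fixing the recurrence rate at 5% is an assumed methodological parameter.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_normal(1000)          # stand-in for fGn with H = 0.5 (white noise)

# Recurrence network: nodes = time points, edge i~j iff |x_i - x_j| <= eps (i != j).
D = np.abs(x[:, None] - x[None, :])    # pairwise distances in the 1-D state space
target_rr = 0.05                       # fix the recurrence rate (common RN practice)
eps = np.quantile(D[np.triu_indices_from(D, k=1)], target_rr)
A = (D <= eps).astype(int)
np.fill_diagonal(A, 0)                 # no self-loops

deg = A.sum(axis=1)
rr = A.sum() / (len(x) * (len(x) - 1))  # realized recurrence rate (edge density)
print(f"eps={eps:.3f}  recurrence rate={rr:.3f}  mean degree={deg.mean():.1f}")
```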

  9. Fault detection and diagnosis for non-Gaussian stochastic distribution systems with time delays via RBF neural networks.

    PubMed

    Yi, Qu; Zhan-ming, Li; Er-chao, Li

    2012-11-01

    A new fault detection and diagnosis (FDD) problem via the output probability density functions (PDFs) for non-Gaussian stochastic distribution systems (SDSs) is investigated. The PDFs can be approximated by radial basis function (RBF) neural networks. Different from conventional FDD problems, the measured information for FDD is the output stochastic distributions, and the stochastic variables involved are not confined to Gaussian ones. An RBF neural network technique is proposed so that the output PDFs can be formulated in terms of the dynamic weightings of the RBF neural network. In this work, a nonlinear adaptive observer-based fault detection and diagnosis algorithm is presented by introducing a tuning parameter so that the residual is as sensitive as possible to the fault. Stability and convergence analyses are performed for the error dynamic system in both fault detection and fault diagnosis. Finally, an illustrative example is given to demonstrate the efficiency of the proposed algorithm, and satisfactory results have been obtained.
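
    The core representation, an output PDF written as a weighted sum of radial basis functions, can be sketched in a few lines. This is a static least-squares illustration under assumed centers and widths, not the adaptive observer of the record, where the weights are dynamic states of the system.

```python
import numpy as np
from scipy.stats import norm

# Approximate a non-Gaussian output PDF as gamma(y) ~ sum_i w_i * phi_i(y),
# with fixed Gaussian RBFs phi_i; weights found here by least squares.
y = np.linspace(-4, 4, 400)
target = 0.6 * norm.pdf(y, -1.0, 0.6) + 0.4 * norm.pdf(y, 1.5, 0.9)  # bimodal PDF

centers = np.linspace(-3, 3, 9)        # assumed RBF centers
width = 0.7                            # assumed common RBF width
Phi = norm.pdf(y[:, None], centers[None, :], width)   # RBF design matrix
w, *_ = np.linalg.lstsq(Phi, target, rcond=None)

approx = Phi @ w
print("max approximation error:", float(np.abs(approx - target).max()))
```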

  10. A nonlinear dynamic age-structured model of e-commerce in spain: Stability analysis of the equilibrium by delay and stochastic perturbations

    NASA Astrophysics Data System (ADS)

    Burgos, C.; Cortés, J.-C.; Shaikhet, L.; Villanueva, R.-J.

    2018-11-01

    First, we propose a deterministic age-structured epidemiological model to study the diffusion of e-commerce in Spain. Afterwards, we determine the parameters (death, birth and growth rates) of the underlying demographic model as well as the parameters (transmission of the use of e-commerce rates) of the proposed epidemiological model that best fit real data retrieved from the Spanish National Statistical Institute. We are motivated by two facts: first, that the dynamics of acquiring the use of a new technology such as e-commerce is mainly driven by feedback from interacting with our peers (family, friends, mates, mass media, etc.), and hence involves a certain delay; and second, the inherent uncertainty of sampled real data and the social complexity of the phenomenon under analysis. We therefore introduce aftereffect and stochastic perturbations in the initial deterministic model. This leads to a delayed stochastic model for e-commerce. We then investigate sufficient conditions that guarantee the stability in probability of the equilibrium point of the dynamic e-commerce delayed stochastic model. Our theoretical findings are numerically illustrated using real data.

  11. Regression analysis on the variation in efficiency frontiers for prevention stage of HIV/AIDS.

    PubMed

    Kamae, Maki S; Kamae, Isao; Cohen, Joshua T; Neumann, Peter J

    2011-01-01

    To investigate how the cost effectiveness of preventing HIV/AIDS varies across possible efficiency frontiers (EFs) by taking into account potentially relevant external factors, such as prevention stage, and how the EFs can be characterized using regression analysis given uncertainty of the QALY-cost estimates. We reviewed cost-effectiveness estimates for the prevention and treatment of HIV/AIDS published from 2002-2007 and catalogued in the Tufts Medical Center Cost-Effectiveness Analysis (CEA) Registry. We constructed efficiency frontier (EF) curves by plotting QALYs against costs, using methods used by the Institute for Quality and Efficiency in Health Care (IQWiG) in Germany. We stratified the QALY-cost ratios by prevention stage, country of study, and payer perspective, and estimated EF equations using log and square-root models. A total of 53 QALY-cost ratios were identified for HIV/AIDS in the Tufts CEA Registry. Plotted ratios stratified by prevention stage were visually grouped into a cluster consisting of primary/secondary prevention measures and a cluster consisting of tertiary measures. Correlation coefficients for each cluster were statistically significant. For each cluster, we derived two EF equations - one based on the log model, and one based on the square-root model. Our findings indicate that stratification of HIV/AIDS interventions by prevention stage can yield distinct EFs, and that the correlation and regression analyses are useful for parametrically characterizing EF equations. Our study has certain limitations, such as the small number of included articles and the potential for study populations to be non-representative of countries of interest. Nonetheless, our approach could help develop a deeper appreciation of cost effectiveness beyond the deterministic approach developed by IQWiG.
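
    To make the two parametric EF forms concrete, the sketch below fits hypothetical (cost, QALY) pairs with a log model and a square-root model; the synthetic data and fitted values are illustrative assumptions, not the Tufts Registry data or the authors' estimates.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical (cost, QALY) pairs for one prevention-stage cluster
rng = np.random.default_rng(2)
cost = np.sort(rng.uniform(1e3, 5e4, 40))
qaly = 2.0 * np.log(cost) - 10 + rng.normal(0, 0.5, cost.size)

log_model = lambda c, a, b: a * np.log(c) + b      # EF form 1: log model
sqrt_model = lambda c, a, b: a * np.sqrt(c) + b    # EF form 2: square-root model

p_log, _ = curve_fit(log_model, cost, qaly)
p_sqrt, _ = curve_fit(sqrt_model, cost, qaly)

for name, model, p in [("log", log_model, p_log), ("sqrt", sqrt_model, p_sqrt)]:
    rmse = float(np.sqrt(np.mean((qaly - model(cost, *p)) ** 2)))
    print(f"{name:4s} params: {np.round(p, 3)}  RMSE: {rmse:.3f}")
```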

  12. Joint Stochastic Inversion of Pre-Stack 3D Seismic Data and Well Logs for High Resolution Hydrocarbon Reservoir Characterization

    NASA Astrophysics Data System (ADS)

    Torres-Verdin, C.

    2007-05-01

    This paper describes the successful implementation of a new 3D AVA stochastic inversion algorithm to quantitatively integrate pre-stack seismic amplitude data and well logs. The stochastic inversion algorithm is used to characterize flow units of a deepwater reservoir located in the central Gulf of Mexico. Conventional fluid/lithology sensitivity analysis indicates that the shale/sand interface represented by the top of the hydrocarbon-bearing turbidite deposits generates typical Class III AVA responses. On the other hand, layer-dependent Biot-Gassmann analysis shows significant sensitivity of the P-wave velocity and density to fluid substitution. Accordingly, AVA stochastic inversion, which combines the advantages of AVA analysis with those of geostatistical inversion, provided quantitative information about the lateral continuity of the turbidite reservoirs based on the interpretation of inverted acoustic properties (P-velocity, S-velocity, density), and lithotype (sand-shale) distributions. The quantitative use of rock/fluid information through AVA seismic amplitude data, coupled with the implementation of co-simulation via lithotype-dependent multidimensional joint probability distributions of acoustic/petrophysical properties, yields accurate 3D models of petrophysical properties such as porosity and permeability. Finally, by fully integrating pre-stack seismic amplitude data and well logs, the vertical resolution of inverted products is higher than that of deterministic inversion methods.

  13. Random function representation of stationary stochastic vector processes for probability density evolution analysis of wind-induced structures

    NASA Astrophysics Data System (ADS)

    Liu, Zhangjun; Liu, Zenghui

    2018-06-01

    This paper develops a hybrid approach of spectral representation and random function for simulating stationary stochastic vector processes. In the proposed approach, the high-dimensional random variables included in the original spectral representation (OSR) formula can be effectively reduced to only two elementary random variables by introducing random functions that serve as random constraints. On this basis, satisfactory simulation accuracy can be guaranteed by selecting a small representative point set of the elementary random variables. The probability information of the stochastic excitations can be fully captured with just several hundred sample functions generated by the proposed approach. Therefore, combined with the probability density evolution method (PDEM), the approach can be used to perform dynamic response analysis and reliability assessment of engineering structures. For illustrative purposes, a stochastic turbulence wind velocity field acting on a frame-shear-wall structure is simulated by constructing three types of random functions to demonstrate the accuracy and efficiency of the proposed approach. Careful and in-depth studies concerning the probability density evolution analysis of the wind-induced structure have been conducted so as to better illustrate the application prospects of the proposed approach. Numerical examples also show that the proposed approach possesses good robustness.
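
    For orientation, the classical spectral representation that the record starts from can be written in a few lines. The sketch below uses the original formula with a full set of random phases, not the two-random-variable reduction proposed in the record; the spectrum shape and discretization are illustrative assumptions.

```python
import numpy as np
import scipy.integrate as si

def srm_sample(S, w_max, N, t, rng):
    """One sample path of a zero-mean stationary process from its one-sided PSD
    S(w), via x(t) = sum_k sqrt(2 S(w_k) dw) cos(w_k t + phi_k), phi_k ~ U(0,2pi)."""
    dw = w_max / N
    w = (np.arange(N) + 0.5) * dw
    phi = rng.uniform(0, 2 * np.pi, N)
    return np.sum(np.sqrt(2 * S(w) * dw) * np.cos(np.outer(t, w) + phi), axis=1)

# Illustrative turbulence-like spectrum (shape only, not calibrated)
S = lambda w: 1.0 / (1.0 + (10.0 * w) ** (5.0 / 3.0))
t = np.linspace(0, 100, 2001)
x = srm_sample(S, w_max=4 * np.pi, N=512, t=t, rng=np.random.default_rng(3))

# The sample variance should be comparable to the integral of the PSD.
print(f"sample variance {x.var():.3f} vs integral of PSD "
      f"{si.quad(S, 0, 4 * np.pi)[0]:.3f}")
```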

  14. Modelling Evolutionary Algorithms with Stochastic Differential Equations.

    PubMed

    Heredia, Jorge Pérez

    2017-11-20

    There has been renewed interest in modelling the behaviour of evolutionary algorithms (EAs) by more traditional mathematical objects, such as ordinary differential equations or Markov chains. The advantage is that the analysis becomes greatly facilitated due to the existence of well established methods. However, this typically comes at the cost of disregarding information about the process. Here, we introduce the use of stochastic differential equations (SDEs) for the study of EAs. SDEs can produce simple analytical results for the dynamics of stochastic processes, unlike Markov chains which can produce rigorous but unwieldy expressions about the dynamics. On the other hand, unlike ordinary differential equations (ODEs), they do not discard information about the stochasticity of the process. We show that these are especially suitable for the analysis of fixed budget scenarios and present analogues of the additive and multiplicative drift theorems from runtime analysis. In addition, we derive a new more general multiplicative drift theorem that also covers non-elitist EAs. This theorem simultaneously allows for positive and negative results, providing information on the algorithm's progress even when the problem cannot be optimised efficiently. Finally, we provide results for some well-known heuristics namely Random Walk (RW), Random Local Search (RLS), the (1+1) EA, the Metropolis Algorithm (MA), and the Strong Selection Weak Mutation (SSWM) algorithm.
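
    A multiplicative drift bound of the kind discussed above is easy to check empirically. The sketch below runs the (1+1) EA on OneMax and compares the mean runtime with the textbook bound (1 + ln d0)/delta, using the standard drift estimate delta = 1/(en); these choices are classical illustrations, not the paper's new theorem.

```python
import numpy as np

def one_plus_one_ea(n, rng):
    """(1+1) EA on OneMax: flip each bit w.p. 1/n, keep offspring if not worse."""
    x = rng.integers(0, 2, n)
    t = 0
    while x.sum() < n:
        flips = rng.random(n) < 1.0 / n
        y = np.where(flips, 1 - x, x)
        if y.sum() >= x.sum():      # elitist acceptance
            x = y
        t += 1
    return t

rng = np.random.default_rng(4)
n = 100
runs = [one_plus_one_ea(n, rng) for _ in range(50)]

d0 = n / 2                  # expected initial distance to the optimum
delta = 1 / (np.e * n)      # standard multiplicative drift estimate for OneMax
bound = (1 + np.log(d0)) / delta
print("empirical mean runtime:", np.mean(runs), " drift-theorem bound:", round(bound))
```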

  15. 77 FR 39695 - HollyFrontier Refining and Marketing LLC v. Osage Pipe Line Company, LLC; Notice of Complaint

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-05

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. OR12-21-000] HollyFrontier Refining and Marketing LLC v. Osage Pipe Line Company, LLC; Notice of Complaint Take notice that on June 25...; 18 CFR 343.1(a) and 343.2(c), HollyFrontier Refining and Marketing LLC (Complainant) filed a formal...

  16. Proximate Population Factors and Deforestation in Tropical Agricultural Frontiers

    PubMed Central

    Carr, David L.

    2009-01-01

    Forest conversion for agriculture expansion is the most salient signature of human occupation of the earth’s land surface. Although population growth and deforestation are significantly associated at the global and regional scales, evidence for population links to deforestation at micro-scales—where people are actually clearing forests—is scant. Much of the planet’s forest elimination is proceeding along tropical agricultural frontiers. This article examines the evolution of thought on population–environment theories relevant to deforestation in tropical agricultural frontiers. Four primary ways by which population dynamics interact with frontier forest conversion are examined: population density, fertility, household demographic composition, and in-migration. PMID:19672475

  17. Efficiently approximating the Pareto frontier: Hydropower dam placement in the Amazon basin

    USGS Publications Warehouse

    Wu, Xiaojian; Gomes-Selman, Jonathan; Shi, Qinru; Xue, Yexiang; Garcia-Villacorta, Roosevelt; Anderson, Elizabeth; Sethi, Suresh; Steinschneider, Scott; Flecker, Alexander; Gomes, Carla P.

    2018-01-01

    Real-world problems are often not fully characterized by a single optimal solution, as they frequently involve multiple competing objectives; it is therefore important to identify the so-called Pareto frontier, which captures solution trade-offs. We propose a fully polynomial-time approximation scheme based on Dynamic Programming (DP) for computing a polynomially succinct curve that approximates the Pareto frontier to within an arbitrarily small ε > 0 on tree-structured networks. Given a set of objectives, our approximation scheme runs in time polynomial in the size of the instance and 1/ε. We also propose a Mixed Integer Programming (MIP) scheme to approximate the Pareto frontier. The DP and MIP Pareto frontier approaches have complementary strengths and are surprisingly effective. We provide empirical results showing that our methods outperform other approaches in efficiency and accuracy. Our work is motivated by a problem in computational sustainability concerning the proliferation of hydropower dams throughout the Amazon basin. Our goal is to support decision-makers in evaluating impacted ecosystem services on the full scale of the Amazon basin. Our work is general and can be applied to approximate the Pareto frontier of a variety of multiobjective problems on tree-structured networks.
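
    The key step behind such a fully polynomial-time approximation scheme is ε-pruning: keeping one representative per cell of a geometric grid over the objectives, so the retained set stays polynomially small while (1+ε)-approximating the Pareto frontier of the input. The sketch below shows only this pruning step on random points, not the tree-structured DP of the record.

```python
import math
import random

def eps_prune(solutions, eps):
    """Keep one representative per cell of the geometric grid with ratio (1+eps).
    'solutions' is a list of tuples of positive objective values (maximized);
    the pruned set (1+eps)-approximates the Pareto frontier of the input."""
    cells = {}
    for sol in solutions:
        key = tuple(math.floor(math.log(v) / math.log(1 + eps)) for v in sol)
        # keep, e.g., the lexicographically largest representative of each cell
        if key not in cells or sol > cells[key]:
            cells[key] = sol
    return sorted(cells.values())

random.seed(5)
sols = [(random.uniform(1, 100), random.uniform(1, 100)) for _ in range(10_000)]
print(len(eps_prune(sols, eps=0.1)), "representatives kept out of", len(sols))
```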

  18. Future HEP Accelerators: The US Perspective

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bhat, Pushpalatha; Shiltsev, Vladimir

    2015-11-02

    Accelerator technology has advanced tremendously since the introduction of accelerators in the 1930s, and particle accelerators have become indispensable instruments in high energy physics (HEP) research to probe Nature at smaller and smaller distances. At present, accelerator facilities can be classified into Energy Frontier colliders that enable direct discoveries and studies of high mass scale particles and Intensity Frontier accelerators for exploration of extremely rare processes, usually at relatively low energies. The near term strategies of the global energy frontier particle physics community are centered on fully exploiting the physics potential of the Large Hadron Collider (LHC) at CERN through its high-luminosity upgrade (HL-LHC), while the intensity frontier HEP research is focused on studies of neutrinos at the MW-scale beam power accelerator facilities, such as Fermilab Main Injector with the planned PIP-II SRF linac project. A number of next generation accelerator facilities have been proposed and are currently under consideration for the medium- and long-term future programs of accelerator-based HEP research. In this paper, we briefly review the post-LHC energy frontier options, both for lepton and hadron colliders in various regions of the world, as well as possible future intensity frontier accelerator facilities.

  19. FEAMAC/CARES Stochastic-Strength-Based Damage Simulation Tool for Ceramic Matrix Composites

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel; Bednarcyk, Brett; Pineda, Evan; Arnold, Steven; Mital, Subodh; Murthy, Pappu; Bhatt, Ramakrishna

    2016-01-01

    Reported here is a coupling of two NASA-developed codes: CARES (Ceramics Analysis and Reliability Evaluation of Structures) with the MAC/GMC (Micromechanics Analysis Code/Generalized Method of Cells) composite material analysis code. The resulting code is called FEAMAC/CARES and is constructed as an Abaqus finite element analysis UMAT (user-defined material). Here we describe the FEAMAC/CARES code and show an example problem (taken from the open literature) of a laminated CMC under off-axis loading. FEAMAC/CARES performs stochastic-strength-based damage simulation of the response of a CMC under multiaxial loading using elastic stiffness reduction of the failed elements.

  20. Stochastic-Strength-Based Damage Simulation Tool for Ceramic Matrix Composite

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel; Bednarcyk, Brett; Pineda, Evan; Arnold, Steven; Mital, Subodh; Murthy, Pappu

    2015-01-01

    Reported here is a coupling of two NASA-developed codes: CARES (Ceramics Analysis and Reliability Evaluation of Structures) with the MAC/GMC (Micromechanics Analysis Code/Generalized Method of Cells) composite material analysis code. The resulting code is called FEAMAC/CARES and is constructed as an Abaqus finite element analysis UMAT (user-defined material). Here we describe the FEAMAC/CARES code and show an example problem (taken from the open literature) of a laminated CMC under off-axis loading. FEAMAC/CARES performs stochastic-strength-based damage simulation of the response of a CMC under multiaxial loading using elastic stiffness reduction of the failed elements.

  1. Lyapunov stability analysis for the generalized Kapitza pendulum

    NASA Astrophysics Data System (ADS)

    Druzhinina, O. V.; Sevastianov, L. A.; Vasilyev, S. A.; Vasilyeva, D. G.

    2017-12-01

    In this work, a generalization of the Kapitza pendulum whose suspension point moves in both the vertical and horizontal planes is made. A Lyapunov stability analysis of the motion of this pendulum subjected to periodic and stochastic driving forces acting in the vertical and horizontal planes has been carried out. A numerical study of the random motion of the generalized Kapitza pendulum under stochastic driving forces has also been performed. The existence of stable quasi-periodic motion for this pendulum is shown.
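
    A direct way to reproduce this kind of experiment numerically is Euler-Maruyama integration of a pendulum with an oscillating pivot plus additive noise. The sketch below is an illustrative model with assumed parameters (chosen so the fast vertical drive can stabilize the inverted position), not the paper's exact equations.

```python
import numpy as np

# Euler-Maruyama for a pendulum whose pivot oscillates vertically and
# horizontally, with an additive stochastic torque. Angle th is measured
# from the downward vertical, so th = pi is the inverted position.
g, L, gamma = 9.81, 1.0, 0.1
a_v, a_h, w = 0.05, 0.05, 100.0   # pivot amplitudes [m] and fast drive frequency
sigma = 0.2                       # noise intensity on the angular velocity
dt, T = 1e-4, 20.0
rng = np.random.default_rng(6)

th, om = np.pi - 0.3, 0.0         # start near the inverted position
for k in range(int(T / dt)):
    t = k * dt
    ypp = -a_v * w**2 * np.cos(w * t)   # pivot vertical acceleration
    xpp = -a_h * w**2 * np.sin(w * t)   # pivot horizontal acceleration
    dom = (-((g + ypp) * np.sin(th) + xpp * np.cos(th)) / L - gamma * om) * dt \
          + sigma * np.sqrt(dt) * rng.standard_normal()
    th, om = th + om * dt, om + dom

# With a_v*w > sqrt(2 g L) the averaged dynamics stabilize th near pi.
print("final angle (rad, mod 2pi):", round(th % (2 * np.pi), 3))
```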

  2. A Global Existence and Uniqueness Theorem for a Riccati Equation.

    DTIC Science & Technology

    1981-01-01

    An application of these results is made to an asymptotic stochastic analysis of a noisy duel problem. The work was partially supported by an AFOSR grant and is motivated by the approach used in [3] and [6] to analyze the equal-accuracy noisy duel problem for two players having finite, unequal units of ammunition.

  3. Stochastic Analysis and Applied Probability(3.3.1): Topics in the Theory and Applications of Stochastic Analysis

    DTIC Science & Technology

    2015-08-13

    ...is due to Reiman [36], who considered the case where the arrivals and services are mutually independent renewal processes with square-integrable summands... to a reflected diffusion process with drift and diffusion coefficients that depend on the state of the process. In models considered in works of Reiman... the infinity Laplacian. Jour. AMS, to appear. [36] M. I. Reiman. Open queueing networks in heavy traffic. Mathematics of Operations Research, 9(3): 441.

  4. Disentangling Mechanisms That Mediate the Balance Between Stochastic and Deterministic Processes in Microbial Succession

    DOE PAGES

    Dini-Andreote, Francisco; Stegen, James C.; van Elsas, Jan D.; ...

    2015-03-17

    Despite growing recognition that deterministic and stochastic factors simultaneously influence bacterial communities, little is known about mechanisms shifting their relative importance. To better understand underlying mechanisms, we developed a conceptual model linking ecosystem development during primary succession to shifts in the stochastic/deterministic balance. To evaluate the conceptual model we coupled spatiotemporal data on soil bacterial communities with environmental conditions spanning 105 years of salt marsh development. At the local scale there was a progression from stochasticity to determinism due to Na accumulation with increasing ecosystem age, supporting a main element of the conceptual model. At the regional scale, soil organic matter (SOM) governed the relative influence of stochasticity and the type of deterministic ecological selection, suggesting scale-dependency in how deterministic ecological selection is imposed. Analysis of a new ecological simulation model supported these conceptual inferences. Looking forward, we propose an extended conceptual model that integrates primary and secondary succession in microbial systems.

  5. Stochastic Averaging for Constrained Optimization With Application to Online Resource Allocation

    NASA Astrophysics Data System (ADS)

    Chen, Tianyi; Mokhtari, Aryan; Wang, Xin; Ribeiro, Alejandro; Giannakis, Georgios B.

    2017-06-01

    Existing approaches to resource allocation for today's stochastic networks are challenged to meet fast convergence and tolerable delay requirements. The present paper leverages advances in online learning to facilitate stochastic resource allocation tasks. By recognizing the central role of Lagrange multipliers, the underlying constrained optimization problem is formulated as a machine learning task involving both training and operational modes, with the goal of learning the sought multipliers in a fast and efficient manner. To this end, an order-optimal offline learning approach is developed first for batch training, and it is then generalized to the online setting with a procedure termed learn-and-adapt. The novel resource allocation protocol combines the benefits of stochastic approximation and statistical learning to obtain low-complexity online updates with learning errors close to the statistical accuracy limits, while still preserving adaptation performance, which in the stochastic network optimization context guarantees queue stability. Analysis and simulated tests demonstrate that the proposed data-driven approach improves the delay and convergence performance of existing resource allocation schemes.
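
    The Lagrange-multiplier viewpoint mentioned above reduces, in its plainest form, to a stochastic dual (sub)gradient iteration. The sketch below shows that baseline on a toy allocation problem (maximize a random-weight log utility subject to an average budget); the utility, budget, and step size are assumptions, and the record's learn-and-adapt refinement is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(7)
b, x_max, mu = 1.0, 5.0, 0.01   # average budget, per-slot cap, dual step size
lam = 1.0                       # Lagrange multiplier: "price" of the resource
alloc = []

for t in range(20_000):
    w = rng.uniform(0.5, 2.0)                          # random per-slot utility weight
    x = np.clip(w / max(lam, 1e-6) - 1.0, 0.0, x_max)  # argmax of w*log(1+x) - lam*x
    lam = max(lam + mu * (x - b), 0.0)                 # stochastic dual subgradient step
    alloc.append(x)

# The multiplier adapts until the long-run average allocation meets the budget.
print("long-run average allocation:", round(float(np.mean(alloc[5000:])), 3),
      "(budget:", b, ")")
```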

  6. The nature of combustion noise: Stochastic or chaotic?

    NASA Astrophysics Data System (ADS)

    Gupta, Vikrant; Lee, Min Chul; Li, Larry K. B.

    2016-11-01

    Combustion noise, which refers to irregular low-amplitude pressure oscillations, is conventionally thought to be stochastic. It has therefore been modeled using a stochastic term in the analysis of thermoacoustic systems. Recently, however, there has been a renewed interest in the validity of that stochastic assumption, with tests based on nonlinear dynamical theory giving seemingly contradictory results: some show combustion noise to be stochastic while others show it to be chaotic. In this study, we show that this contradiction arises because those tests cannot distinguish between noise amplification and chaos. We further show that although there are many similarities between noise amplification and chaos, there are also some subtle differences. It is these subtle differences, not the results of those tests, that should be the focus of analyses aimed at determining the true nature of combustion noise. Recognizing this is an important step towards improved understanding and modeling of combustion noise for the study of thermoacoustic instabilities. This work was supported by the Research Grants Council of Hong Kong (Project No. 16235716 and 26202815).

  7. Nonlinear stochastic interacting dynamics and complexity of financial gasket fractal-like lattice percolation

    NASA Astrophysics Data System (ADS)

    Zhang, Wei; Wang, Jun

    2018-05-01

    A novel nonlinear stochastic interacting price dynamics is proposed and investigated via bond percolation on a Sierpinski gasket fractal-like lattice, with the aim of developing a new approach to reproducing and studying the complex dynamics of real security markets. Fractal-like lattices correspond to finite graphs with vertices and edges, which are similar to fractals, and the Sierpinski gasket is a well-known example of a fractal. Fractional ordinal array entropy and fractional ordinal array complexity are introduced to analyze the complexity behaviors of financial signals. To better comprehend the fluctuation characteristics of the stochastic price evolution, a complexity analysis of random logarithmic returns and volatility is performed, including power-law distribution, fractional sample entropy, and fractional ordinal array complexity. To further verify the rationality and validity of the developed stochastic price evolution, actual security market datasets are also studied with the same statistical methods for comparison. The empirical results show that this stochastic price dynamics can reconstruct complexity behaviors of the actual security markets to some extent.

  8. The Behavior of Hydrogen Under Extreme Conditions on Ultrafast Timescales (A "Life at the Frontiers of Energy Research" contest entry from the 2011 Energy Frontier Research Centers (EFRCs) Summit and Forum)

    ScienceCinema

    Mao, Ho-kwang (Director, Center for Energy Frontier Research in Extreme Environments); EFree Staff

    2017-12-09

    'The Behavior of Hydrogen Under Extreme Conditions on Ultrafast Timescales' was submitted by the Center for Energy Frontier Research in Extreme Environments (EFree) to the 'Life at the Frontiers of Energy Research' video contest at the 2011 Science for Our Nation's Energy Future: Energy Frontier Research Centers (EFRCs) Summit and Forum. Twenty-six EFRCs created short videos to highlight their mission and their work. EFree is directed by Ho-kwang Mao at the Carnegie Institution of Washington and is a partnership of scientists from thirteen institutions. The Office of Basic Energy Sciences in the U.S. Department of Energy's Office of Science established the 46 Energy Frontier Research Centers (EFRCs) in 2009. These collaboratively-organized centers conduct fundamental research focused on 'grand challenges' and use-inspired 'basic research needs' recently identified in major strategic planning efforts by the scientific community. The overall purpose is to accelerate scientific progress toward meeting the nation's critical energy challenges. The mission of Energy Frontier Research in Extreme Environments is 'to accelerate the discovery and creation of energy-relevant materials using extreme pressures and temperatures.' Research topics are: catalysis (CO2, water), photocatalysis, solid state lighting, optics, thermoelectrics, phonons, thermal conductivity, solar electrodes, fuel cells, superconductivity, extreme environments, radiation effects, defects, spin dynamics, CO2 (capture, convert, store), greenhouse gas, hydrogen (fuel, storage), ultrafast physics, novel materials synthesis, and defect-tolerant materials.

  9. Particle Physics at the Cosmic, Intensity, and Energy Frontiers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Essig, Rouven

    Major efforts at the Intensity, Cosmic, and Energy frontiers of particle physics are rapidly furthering our understanding of the fundamental constituents of Nature and their interactions. The overall objectives of this research project are (1) to interpret and develop the theoretical implications of the data collected at these frontiers and (2) to provide the theoretical motivation, basis, and ideas for new experiments and for new analyses of experimental data. Within the Intensity Frontier, an experimental search for a new force mediated by a GeV-scale gauge boson will be carried out with the A' Experiment (APEX) and the Heavy Photon Search (HPS), both at Jefferson Laboratory. Within the Cosmic Frontier, contributions are planned to the search for dark matter particles with the Fermi Gamma-ray Space Telescope and other instruments. A detailed exploration will also be performed of new direct detection strategies for dark matter particles with sub-GeV masses to facilitate the development of new experiments. In addition, the theoretical implications of existing and future dark matter-related anomalies will be examined. Within the Energy Frontier, the implications of the data from the Large Hadron Collider will be investigated. Novel search strategies will be developed to aid the search for new phenomena not described by the Standard Model of particle physics. By combining insights from all three particle physics frontiers, this research aims to increase our understanding of fundamental particle physics.

  10. Names and Weapons.

    ERIC Educational Resources Information Center

    Kauffman, Charles

    1989-01-01

    Traces the theoretical significance of using names as titles for situations, and applies this analysis to the United States' intercontinental ballistic missile (ICBM) programs. Argues that the names given to ICBMs preserve their utility as weapons by linking them to the myths of the nineteenth-century western frontier. (MM)

  11. A new look into the quantum chemical and spectroscopic investigations of 5-chloro-1-methyl-4-nitroimidazole.

    PubMed

    Arjunan, V; Raj, Arushma; Anitha, R; Mohan, S

    2014-05-05

    Optimised geometrical structural parameters, harmonic vibrational frequencies, natural bonding orbital analysis and frontier molecular orbitals are determined by B3LYP and B3PW91 methods. The exact geometry of 5-chloro-1-methyl-4-nitroimidazole is determined through conformational analysis. The experimentally observed infrared and Raman bands have been assigned and analysed. The (13)C and (1)H NMR chemical shifts of the compound are investigated. The total electron density and molecular electrostatic potentials are determined. The electrostatic potential (electron+nuclei) distribution, molecular shape, size and dipole moments of the molecule have been displayed. The energies of the frontier molecular orbitals and LUMO-HOMO energy gap are measured. The possible electronic transitions of the molecule are studied by TD-DFT method along with the UV-Visible spectrum. The structure-activity relationship of the compound is also investigated by conceptual DFT methods.

  12. Energy: the microfluidic frontier.

    PubMed

    Sinton, David

    2014-09-07

    Global energy is largely a fluids problem. It is also large-scale, in stark contrast to microchannels. Microfluidic energy technologies must offer either massive scalability or direct relevance to energy processes already operating at scale. We have to pick our fights. Highlighted here are the exceptional opportunities I see, including some recent successes and areas where much more attention is needed. The most promising directions are those that leverage high surface-to-volume ratios, rapid diffusive transport, capacity for high temperature and high pressure experiments, and length scales characteristic of microbes and fluids (hydrocarbons, CO2) underground. The most immediate areas of application are where information is the product; either fluid sample analysis (e.g. oil analysis); or informing operations (e.g. CO2 transport in microporous media). I'll close with aspects that differentiate energy from traditional microfluidics applications, the uniquely important role of engineering in energy, and some thoughts for the research community forming at the nexus of lab-on-a-chip and energy--a microfluidic frontier.

  13. FEAMAC-CARES Software Coupling Development Effort for CMC Stochastic-Strength-Based Damage Simulation

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.; Bednarcyk, Brett A.; Pineda, Evan; Arnold, Steven; Mital, Subodh; Murthy, Pappu; Walton, Owen

    2015-01-01

    Reported here is a coupling of two NASA-developed codes: CARES (Ceramics Analysis and Reliability Evaluation of Structures) with the MAC/GMC composite material analysis code. The resulting code is called FEAMAC/CARES and is constructed as an Abaqus finite element analysis UMAT (user-defined material). Here we describe the FEAMAC/CARES code and show an example problem (taken from the open literature) of a laminated CMC under off-axis loading. FEAMAC/CARES performs stochastic-strength-based damage simulation of the response of a CMC under multiaxial loading using elastic stiffness reduction of the failed elements.

  14. Constrained Stochastic Extended Redundancy Analysis.

    PubMed

    DeSarbo, Wayne S; Hwang, Heungsun; Stadler Blank, Ashley; Kappe, Eelco

    2015-06-01

    We devise a new statistical methodology called constrained stochastic extended redundancy analysis (CSERA) to examine the comparative impact of various conceptual factors, or drivers, as well as the specific predictor variables that contribute to each driver on designated dependent variable(s). The technical details of the proposed methodology, the maximum likelihood estimation algorithm, and model selection heuristics are discussed. A sports marketing consumer psychology application is provided in a Major League Baseball (MLB) context where the effects of six conceptual drivers of game attendance and their defining predictor variables are estimated. Results compare favorably to those obtained using traditional extended redundancy analysis (ERA).

  15. An empirical analysis of the distribution of the duration of overshoots in a stationary gaussian stochastic process

    NASA Technical Reports Server (NTRS)

    Parrish, R. S.; Carter, M. C.

    1974-01-01

    This analysis utilizes computer simulation and statistical estimation. Realizations of stationary Gaussian stochastic processes with selected autocorrelation functions are computer simulated. Analysis of the simulated data revealed that the mean and the variance of a process were functionally dependent upon the autocorrelation parameter and the crossing level. Using predicted values for the mean and standard deviation, the distribution parameters were estimated by the method of moments. Thus, given the autocorrelation parameter, crossing level, mean, and standard deviation of a process, the probability of exceeding the crossing level for a particular length of time can be calculated.
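
    The simulation side of such a study is straightforward to reproduce: generate a stationary Gaussian process with exponential autocorrelation (an AR(1) surrogate) and collect the durations of excursions above a crossing level. All parameters below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(8)
rho, n, level = 0.95, 200_000, 1.0   # AR(1) autocorrelation, length, crossing level

# Stationary Gaussian AR(1) with unit variance:
# x_t = rho * x_{t-1} + sqrt(1 - rho^2) * e_t
x = np.empty(n)
x[0] = rng.standard_normal()
e = rng.standard_normal(n)
for t in range(1, n):
    x[t] = rho * x[t - 1] + np.sqrt(1 - rho**2) * e[t]

above = x > level
# Run lengths of consecutive True values = durations of overshoots
edges = np.diff(above.astype(int))
starts = np.where(edges == 1)[0] + 1
ends = np.where(edges == -1)[0] + 1
if above[0]:
    starts = np.r_[0, starts]
if above[-1]:
    ends = np.r_[ends, n]
durations = ends - starts
print("overshoots:", len(durations),
      " mean duration:", round(float(durations.mean()), 2),
      " P(duration > 20):", round(float((durations > 20).mean()), 3))
```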

  16. Efficient analysis of stochastic gene dynamics in the non-adiabatic regime using piecewise deterministic Markov processes

    PubMed Central

    2018-01-01

    Single-cell experiments show that gene expression is stochastic and bursty, a feature that can emerge from slow switching between promoter states with different activities. In addition to slow chromatin and/or DNA looping dynamics, one source of long-lived promoter states is the slow binding and unbinding kinetics of transcription factors to promoters, i.e. the non-adiabatic binding regime. Here, we introduce a simple analytical framework, known as a piecewise deterministic Markov process (PDMP), that accurately describes the stochastic dynamics of gene expression in the non-adiabatic regime. We illustrate the utility of the PDMP on a non-trivial dynamical system by analysing the properties of a titration-based oscillator in the non-adiabatic limit. We first show how to transform the underlying chemical master equation into a PDMP where the slow transitions between promoter states are stochastic, but whose rates depend upon the faster deterministic dynamics of the transcription factors regulated by these promoters. We show that the PDMP accurately describes the observed periods of stochastic cycles in activator and repressor-based titration oscillators. We then generalize our PDMP analysis to more complicated versions of titration-based oscillators to explain how multiple binding sites lengthen the period and improve coherence. Last, we show how noise-induced oscillation previously observed in a titration-based oscillator arises from non-adiabatic and discrete binding events at the promoter site. PMID:29386401
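
    A two-state promoter gives the smallest possible PDMP of the type described above: the promoter switches at slow stochastic rates while the protein concentration follows a deterministic linear ODE between switches, which can be integrated in closed form. The rates below are placeholders, not the titration-oscillator model of the record.

```python
import numpy as np

# PDMP for a single gene: promoter state s in {0, 1} switches stochastically
# (non-adiabatic regime); protein x evolves deterministically between switches:
#   dx/dt = beta*s - delta*x, solved in closed form on each interval.
k_on, k_off = 0.05, 0.05      # slow promoter switching rates
beta, delta = 1.0, 0.1        # synthesis and degradation rates
rng = np.random.default_rng(9)

t, T_end = 0.0, 2000.0
s, x = 0, 0.0
ts, xs = [t], [x]
while t < T_end:
    rate = k_on if s == 0 else k_off
    tau = rng.exponential(1.0 / rate)     # waiting time to the next switch
    x_star = beta * s / delta             # fixed point for the current state
    x = x_star + (x - x_star) * np.exp(-delta * tau)   # exact ODE solution
    t += tau
    s = 1 - s                             # flip the promoter state
    ts.append(t); xs.append(x)

print("switches:", len(ts) - 1, " final protein level:", round(xs[-1], 2))
```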

  17. Temperature variation effects on stochastic characteristics for low-cost MEMS-based inertial sensor error

    NASA Astrophysics Data System (ADS)

    El-Diasty, M.; El-Rabbany, A.; Pagiatakis, S.

    2007-11-01

    We examine the effect of varying the temperature points on MEMS inertial sensors' noise models using Allan variance and least-squares spectral analysis (LSSA). Allan variance is a method of representing root-mean-square random drift error as a function of averaging time. LSSA is an alternative to the classical Fourier methods and has been applied successfully by a number of researchers in the study of the noise characteristics of experimental series. Static data sets are collected at different temperature points using two MEMS-based IMUs, namely the MotionPakII and the Crossbow AHRS300CC. The performance of the two MEMS inertial sensors is predicted from the Allan variance estimation results at different temperature points, and the LSSA is used to study the noise characteristics and define the sensors' stochastic model parameters. It is shown that the stochastic characteristics of MEMS-based inertial sensors can be identified using Allan variance estimation and LSSA, and that the sensors' stochastic model parameters are temperature dependent. Also, a Kaiser-window FIR low-pass filter is used to investigate the effect of the de-noising stage on the stochastic model. It is shown that the stochastic model also depends on the chosen cut-off frequency.
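
    Allan variance itself is simple to compute. The generic non-overlapped implementation below (not the authors' code) recovers the expected -1/2 log-log slope of the Allan deviation for a white-noise signal; the sampling rate and cluster sizes are arbitrary choices.

```python
import numpy as np

def allan_variance(x, fs, m_list):
    """Non-overlapped Allan variance of a rate signal x sampled at fs Hz,
    for cluster sizes m (averaging times tau = m / fs)."""
    out = []
    for m in m_list:
        k = len(x) // m
        means = x[:k * m].reshape(k, m).mean(axis=1)   # cluster averages
        avar = 0.5 * np.mean(np.diff(means) ** 2)      # Allan variance at this tau
        out.append((m / fs, avar))
    return out

# White-noise "gyro" signal: Allan deviation should fall as tau^(-1/2)
rng = np.random.default_rng(10)
x = rng.standard_normal(2 ** 18)
for tau, avar in allan_variance(x, fs=100.0, m_list=[1, 10, 100, 1000]):
    print(f"tau={tau:8.2f} s   Allan deviation={np.sqrt(avar):.4f}")
```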

  18. Some Historical Background to the Country School Legacy: Frontier and Rural Schools in Colorado, 1859-1950. Country School Legacy: Humanities on the Frontier.

    ERIC Educational Resources Information Center

    Johnson, Charlie H., Jr.

    A study of historical background of the frontier and rural schools in Colorado describes education in the United States in general and the development of the educational process and school facilities during five phases of Colorado's economic and political development. "The Nation" discusses philosophies generally held during the middle…

  19. Doubly stochastic Poisson process models for precipitation at fine time-scales

    NASA Astrophysics Data System (ADS)

    Ramesh, Nadarajah I.; Onof, Christian; Xie, Dichao

    2012-09-01

    This paper considers a class of stochastic point process models, based on doubly stochastic Poisson processes, in the modelling of rainfall. We examine the application of this class of models, a neglected alternative to the widely-known Poisson cluster models, in the analysis of fine time-scale rainfall intensity. These models are mainly used to analyse tipping-bucket raingauge data from a single site but an extension to multiple sites is illustrated which reveals the potential of this class of models to study the temporal and spatial variability of precipitation at fine time-scales.
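
    A minimal doubly stochastic Poisson (Cox) process can be simulated by letting the intensity follow a two-state Markov chain (a crude dry/wet alternation) and drawing Poisson arrivals conditional on each dwell; the rates below are illustrative, not fitted raingauge parameters.

```python
import numpy as np

# Cox process: the arrival intensity lambda(t) is itself random -- here a
# two-state Markov chain ("dry" vs "wet") -- and raingauge tips arrive as a
# Poisson process with that intensity.
rng = np.random.default_rng(11)
rates = {0: 0.1, 1: 5.0}            # tip intensity per hour in each state
switch = {0: 1 / 20.0, 1: 1 / 3.0}  # state-leaving rates (mean 20 h dry, 3 h wet)

t, T_end, state = 0.0, 1000.0, 0
arrivals = []
while t < T_end:
    dwell = rng.exponential(1.0 / switch[state])   # time spent in current state
    n = rng.poisson(rates[state] * dwell)          # arrivals during that dwell
    arrivals.extend(np.sort(t + dwell * rng.random(n)))  # uniform order statistics
    t += dwell
    state = 1 - state

print(len(arrivals), "tips in", T_end, "hours; mean rate =",
      round(len(arrivals) / T_end, 3), "per hour")
```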

  20. Conference on Stochastic Processes and Their Applications (12th) held at Ithaca, New York on 11-15 Jul 83,

    DTIC Science & Technology

    1983-07-15

    AD-A136 626: Conference on Stochastic Processes and Their Applications (12th), July 11-15, 1983, Ithaca, New York (Cornell Univ., Ithaca, NY). Recoverable program fragments: a talk on "oscillator phase instability"; M. N. Gopalan, Indian Institute of Technology, Bombay, "Cost benefit analysis of systems subject to inspection..."; W. Kliemann, Univ. Bremen, Fed. Rep. Germany, "Controllability of stochastic systems"; reception, Johnson Art Museum.

  1. Fluid Stochastic Petri Nets: Theory, Applications, and Solution

    NASA Technical Reports Server (NTRS)

    Horton, Graham; Kulkarni, Vidyadhar G.; Nicol, David M.; Trivedi, Kishor S.

    1996-01-01

    In this paper we introduce a new class of stochastic Petri nets in which one or more places can hold fluid rather than discrete tokens. We define a class of fluid stochastic Petri nets in such a way that the discrete and continuous portions may affect each other. Following this definition we provide equations for their transient and steady-state behavior. We present several examples showing the utility of the construct in communication network modeling and reliability analysis, and discuss important special cases. We then discuss numerical methods for computing the transient behavior of such nets. Finally, some numerical examples are presented.

  2. Detailed thermodynamic analyses of high-speed compressible turbulence

    NASA Astrophysics Data System (ADS)

    Towery, Colin; Darragh, Ryan; Poludnenko, Alexei; Hamlington, Peter

    2016-11-01

    Interactions between high-speed turbulence and flames (or chemical reactions) are important in the dynamics and description of many different combustion phenomena, including autoignition and deflagration-to-detonation transition. The probability of these phenomena to occur depends on the magnitude and spectral content of turbulence fluctuations, which can impact a wide range of science and engineering problems, from the hypersonic scramjet engine to the onset of Type Ia supernovae. In this talk, we present results from new direct numerical simulations (DNS) of homogeneous isotropic turbulence with turbulence Mach numbers ranging from 0.05 to 1.0 and Taylor-scale Reynolds numbers as high as 700. A set of detailed analyses are described in both Eulerian and Lagrangian reference frames in order to assess coherent (structural) and incoherent (stochastic) thermodynamic flow features. These analyses provide direct insights into the thermodynamics of strongly compressible turbulence. Furthermore, presented results provide a non-reacting baseline for future studies of turbulence-chemistry interactions in DNS with complex chemistry mechanisms. This work was supported by the Air Force Office of Scientific Research (AFOSR) under Award No. FA9550-14-1-0273, and the Department of Defense (DoD) High Performance Computing Modernization Program (HPCMP) under a Frontier project award.

  3. A new look at the decomposition of agricultural productivity growth incorporating weather effects.

    PubMed

    Njuki, Eric; Bravo-Ureta, Boris E; O'Donnell, Christopher J

    2018-01-01

    Random fluctuations in temperature and precipitation have substantial impacts on agricultural output. However, the contribution of these changing configurations in weather to total factor productivity (TFP) growth has not been addressed explicitly in econometric analyses. Thus, the key objective of this study is to quantify and to investigate the role of changing weather patterns in explaining yearly fluctuations in TFP. For this purpose, we define TFP to be a measure of total output divided by a measure of total input. We estimate a stochastic production frontier model using U.S. state-level agricultural data incorporating growing season temperature and precipitation, and intra-annual standard deviations of temperature and precipitation for the period 1960-2004. We use the estimated parameters of the model to compute a TFP index that has good axiomatic properties. We then decompose TFP growth in each state into weather effects, technological progress, technical efficiency, and scale-mix efficiency changes. This approach improves our understanding of the role of different components of TFP in agricultural productivity growth. We find that annual TFP growth averaged 1.56% between 1960 and 2004. Moreover, we observe substantial heterogeneity in weather effects across states and over time.
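
    The estimation engine behind such a study, a stochastic production frontier with normal noise and half-normal inefficiency, fits in a short maximum-likelihood sketch. The single-input Cobb-Douglas specification and all parameter values below are assumptions for illustration, not the study's state-level model.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

# Stochastic production frontier (Aigner-Lovell-Schmidt form):
#   ln y = b0 + b1*ln x + v - u,  v ~ N(0, sv^2), u ~ half-normal(su^2)
rng = np.random.default_rng(12)
n = 500
lx = rng.uniform(0, 3, n)
v = rng.normal(0, 0.2, n)
u = np.abs(rng.normal(0, 0.4, n))          # one-sided inefficiency term
ly = 1.0 + 0.6 * lx + v - u

def negloglik(p):
    b0, b1, lsv, lsu = p                   # log-parametrized std devs keep them > 0
    sv, su = np.exp(lsv), np.exp(lsu)
    sig = np.hypot(sv, su)                 # sqrt(sv^2 + su^2)
    lam = su / sv
    eps = ly - b0 - b1 * lx
    # density of eps: (2/sig) * phi(eps/sig) * Phi(-eps*lam/sig)
    return -np.sum(np.log(2 / sig) + norm.logpdf(eps / sig)
                   + norm.logcdf(-eps * lam / sig))

res = minimize(negloglik, x0=[0.5, 0.5, np.log(0.3), np.log(0.3)],
               method="Nelder-Mead", options={"maxiter": 4000})
b0, b1, lsv, lsu = res.x
print("b0=%.3f b1=%.3f sigma_v=%.3f sigma_u=%.3f"
      % (b0, b1, np.exp(lsv), np.exp(lsu)))
```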

  4. A new look at the decomposition of agricultural productivity growth incorporating weather effects

    PubMed Central

    Bravo-Ureta, Boris E.; O’Donnell, Christopher J.

    2018-01-01

    Random fluctuations in temperature and precipitation have substantial impacts on agricultural output. However, the contribution of these changing configurations in weather to total factor productivity (TFP) growth has not been addressed explicitly in econometric analyses. Thus, the key objective of this study is to quantify and to investigate the role of changing weather patterns in explaining yearly fluctuations in TFP. For this purpose, we define TFP to be a measure of total output divided by a measure of total input. We estimate a stochastic production frontier model using U.S. state-level agricultural data incorporating growing season temperature and precipitation, and intra-annual standard deviations of temperature and precipitation for the period 1960–2004. We use the estimated parameters of the model to compute a TFP index that has good axiomatic properties. We then decompose TFP growth in each state into weather effects, technological progress, technical efficiency, and scale-mix efficiency changes. This approach improves our understanding of the role of different components of TFP in agricultural productivity growth. We find that annual TFP growth averaged 1.56% between 1960 and 2004. Moreover, we observe substantial heterogeneity in weather effects across states and over time. PMID:29466461

  5. Stochastic Modeling of Past Volcanic Crises

    NASA Astrophysics Data System (ADS)

    Woo, Gordon

    2018-01-01

    The statistical foundation of disaster risk analysis is past experience. From a scientific perspective, history is just one realization of what might have happened, given the randomness and chaotic dynamics of Nature. Stochastic analysis of the past is an exploratory exercise in counterfactual history, considering alternative possible scenarios. In particular, the dynamic perturbations that might have transitioned a volcano from an unrest to an eruptive state need to be considered. The stochastic modeling of past volcanic crises leads to estimates of eruption probability that can illuminate historical volcanic crisis decisions. It can also inform future economic risk management decisions in regions where there has been some volcanic unrest, but no actual eruption for at least hundreds of years. Furthermore, the availability of a library of past eruption probabilities would provide benchmark support for estimates of eruption probability in future volcanic crises.

  6. Stochastic subspace identification for operational modal analysis of an arch bridge

    NASA Astrophysics Data System (ADS)

    Loh, Chin-Hsiung; Chen, Ming-Che; Chao, Shu-Hsien

    2012-04-01

    In this paper the application of an output-only system identification technique, known as Stochastic Subspace Identification (SSI), to civil infrastructures is carried out. The ability of covariance-driven stochastic subspace identification (SSI-COV) was proved through the analysis of ambient data from an arch bridge under operational conditions. A newly developed signal processing technique, Singular Spectrum Analysis (SSA), capable of smoothing noisy signals, is adopted for pre-processing the recorded data before the SSI. The conjunction of SSA and SSI-COV provides a useful criterion for determining the system order. With the aim of estimating accurate modal parameters of the structure in off-line analysis, a stabilization diagram is constructed by plotting the identified poles of the system while increasing the size of the data Hankel matrix. An identification task on a real structure, the Guandu Bridge, is carried out to identify the system natural frequencies and mode shapes. The uncertainty of the identified modal parameters from output-only measurements of the bridge under operational conditions, such as temperature and traffic loading, is discussed.
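
    A toy single-channel version of SSI-COV conveys the mechanics: build a Hankel matrix of output autocovariances, factor it by SVD to get an observability matrix, recover the state matrix from its shift invariance, and read frequencies off its eigenvalues. The synthetic two-mode signal and model order below are assumptions; a real application would use multiple channels and a stabilization diagram as in the record.

```python
import numpy as np

def ssi_cov_frequencies(y, fs, order, n_lags=60):
    """Covariance-driven SSI on one output channel: Hankel matrix of output
    autocovariances -> SVD -> observability matrix -> state matrix A ->
    identified natural frequencies (Hz)."""
    R = np.array([np.mean(y[:-k] * y[k:]) for k in range(1, 2 * n_lags + 1)])
    H = np.array([[R[i + j] for j in range(n_lags)] for i in range(n_lags)])
    U, s, Vt = np.linalg.svd(H)
    Oi = U[:, :order] * np.sqrt(s[:order])       # observability matrix
    A = np.linalg.pinv(Oi[:-1]) @ Oi[1:]         # shift-invariance property
    lam = np.linalg.eigvals(A)                   # discrete-time poles
    f = np.abs(np.log(lam)) * fs / (2 * np.pi)   # continuous-time |poles| / 2pi
    return np.unique(np.round(f, 2))

# Synthetic ambient-like response: two modes buried in noise
fs = 100.0
t = np.arange(0, 200, 1 / fs)
rng = np.random.default_rng(13)
y = (np.sin(2 * np.pi * 1.5 * t + rng.uniform(0, 2 * np.pi)) +
     0.5 * np.sin(2 * np.pi * 4.0 * t + rng.uniform(0, 2 * np.pi)) +
     0.5 * rng.standard_normal(t.size))
print("identified frequencies (Hz):", ssi_cov_frequencies(y, fs, order=4))
```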

  7. A stochastic evolutionary model generating a mixture of exponential distributions

    NASA Astrophysics Data System (ADS)

    Fenner, Trevor; Levene, Mark; Loizou, George

    2016-02-01

    Recent interest in human dynamics has stimulated the investigation of the stochastic processes that explain human behaviour in various contexts, such as mobile phone networks and social media. In this paper, we extend the stochastic urn-based model proposed in [T. Fenner, M. Levene, G. Loizou, J. Stat. Mech. 2015, P08015 (2015)] so that it can generate mixture models, in particular, a mixture of exponential distributions. The model is designed to capture the dynamics of survival analysis, traditionally employed in clinical trials, reliability analysis in engineering, and more recently in the analysis of large data sets recording human dynamics. The mixture modelling approach, which is relatively simple and well understood, is very effective in capturing heterogeneity in data. We provide empirical evidence for the validity of the model, using a data set of popular search engine queries collected over a period of 114 months. We show that the survival function of these queries is closely matched by the exponential mixture solution for our model.

  8. Parameter estimation uncertainty: Comparing apples and apples?

    NASA Astrophysics Data System (ADS)

    Hart, D.; Yoon, H.; McKenna, S. A.

    2012-12-01

    Given a highly parameterized ground water model in which the conceptual model of the heterogeneity is stochastic, an ensemble of inverse calibrations from multiple starting points (MSP) provides an ensemble of calibrated parameters and follow-on transport predictions. However, the multiple calibrations are computationally expensive. Parameter estimation uncertainty can also be modeled by decomposing the parameterization into a solution space and a null space. From a single calibration (single starting point) a single set of parameters defining the solution space can be extracted. The solution space is held constant while Monte Carlo sampling of the parameter set covering the null space creates an ensemble of the null space parameter set. A recently developed null-space Monte Carlo (NSMC) method combines the calibration solution space parameters with the ensemble of null space parameters, creating sets of calibration-constrained parameters for input to the follow-on transport predictions. Here, we examine the consistency between probabilistic ensembles of parameter estimates and predictions using the MSP calibration and the NSMC approaches. A highly parameterized model of the Culebra dolomite previously developed for the WIPP project in New Mexico is used as the test case. A total of 100 estimated fields are retained from the MSP approach and the ensemble of results defining the model fit to the data, the reproduction of the variogram model and prediction of an advective travel time are compared to the same results obtained using NSMC. We demonstrate that the NSMC fields based on a single calibration model can be significantly constrained by the calibrated solution space and the resulting distribution of advective travel times is biased toward the travel time from the single calibrated field. To overcome this, newly proposed strategies to employ a multiple calibration-constrained NSMC approach (M-NSMC) are evaluated. Comparison of the M-NSMC and MSP methods suggests that M-NSMC can provide a computationally efficient and practical solution for predictive uncertainty analysis in highly nonlinear and complex subsurface flow and transport models. This material is based upon work supported as part of the Center for Frontiers of Subsurface Energy Security, an Energy Frontier Research Center funded by the U.S. Department of Energy, Office of Science, Office of Basic Energy Sciences under Award Number DE-SC0001114. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.

  9. Stochastic Education in Childhood: Examining the Learning of Teachers and Students

    ERIC Educational Resources Information Center

    de Souza, Antonio Carlos; Lopes, Celi Espasandin; de Oliveira, Débora

    2014-01-01

    This paper presents discussions on stochastic education in early childhood, based on two doctoral research projects carried out with groups of preschool teachers from public schools in the Brazilian cities of Suzano and São Paulo who were participating in a continuing education program. The objective is to reflect on the analysis of two didactic…

  10. Global output feedback stabilisation of stochastic high-order feedforward nonlinear systems with time-delay

    NASA Astrophysics Data System (ADS)

    Zhang, Kemei; Zhao, Cong-Ran; Xie, Xue-Jun

    2015-12-01

    This paper considers the problem of output feedback stabilisation for stochastic high-order feedforward nonlinear systems with time-varying delay. By using homogeneous domination theory and resolving several troublesome obstacles in the design and analysis, an output feedback controller is constructed that renders the closed-loop system globally asymptotically stable in probability.

  11. Deformation effects of multi-functional monatomic carbon ring device

    NASA Astrophysics Data System (ADS)

    Qiu, Ming; Liew, K. M.

    2011-06-01

    Based on first-principles calculations, the deformation effects on the negative differential resistance (NDR) and rectifying behaviors of two cumulenic monatomic carbon rings connected by polyyne and sandwiched between two Au electrodes are investigated. Interestingly, more than three pronounced NDR features appear, with peak-to-valley ratios increasing from 1.24 to 5.16, and reverse rectification ratios also climb from 1.42 to 7.89 as the deformation increases. Analysis of the transmission spectra and frontier orbitals reveals that the response of different levels and resonant peaks, and the transfer of extended states of the frontier orbital resonances to localized states under bias, are responsible for these phenomena. Our work presents a potential route to developing a multi-functional pressure device exhibiting multiple NDR peaks and rectifying behaviors.

  12. Improved ensemble-mean forecasting of ENSO events by a zero-mean stochastic error model of an intermediate coupled model

    NASA Astrophysics Data System (ADS)

    Zheng, Fei; Zhu, Jiang

    2017-04-01

    How to design a reliable ensemble prediction strategy that accounts for the major uncertainties of a forecasting system is a crucial issue for performing an ensemble forecast. In this study, a new stochastic perturbation technique is developed to improve the prediction skill for the El Niño-Southern Oscillation (ENSO) using an intermediate coupled model. We first estimate and analyze the model uncertainties from ensemble Kalman filter analysis results obtained by assimilating the observed sea surface temperatures. Then, based on the pre-analyzed properties of the model errors, we develop a zero-mean stochastic model-error model to characterize the model uncertainties mainly induced by physical processes missing from the original model (e.g., stochastic atmospheric forcing, extra-tropical effects, the Indian Ocean Dipole). Finally, we perturb each member of an ensemble forecast at each step with the developed stochastic model-error model during the 12-month forecasting process, adding the zero-mean perturbations to the physical fields to mimic the presence of missing processes and high-frequency stochastic noise. The impacts of the stochastic model-error perturbations on ENSO deterministic predictions are examined by performing two sets of 21-yr hindcast experiments, which are initialized from the same initial conditions and differ only in whether they include the stochastic perturbations. The comparison shows that the stochastic perturbations significantly improve the ensemble-mean prediction skill over the entire 12-month forecasting process. This improvement occurs mainly because the nonlinear terms in the model can form a positive ensemble mean from a series of zero-mean perturbations, which reduces the forecasting biases and thereby corrects the forecast through this nonlinear heating mechanism.
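
    The rectification mechanism invoked in the last sentence, a nonlinearity turning zero-mean perturbations into a nonzero ensemble-mean shift, can be seen in a toy demonstration; the quadratic map below is illustrative, not the coupled model's physics.

```python
import numpy as np

# Zero-mean perturbations passed through a nonlinearity need not average to
# zero: a toy analogue of the ensemble-mean correction described above.
rng = np.random.default_rng(14)
x0 = 1.0
eta = rng.normal(0.0, 0.3, 100_000)   # zero-mean stochastic perturbations
f = lambda x: x - 0.5 * x**2          # nonlinear (quadratic) model step

print("unperturbed step:", f(x0))                         # 0.5
print("ensemble mean of perturbed steps:", np.mean(f(x0 + eta)))  # < 0.5
```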

  13. Evolution of the Climate Continuum from the Mid-Miocene Climatic Optimum to the Present

    NASA Astrophysics Data System (ADS)

    Aswasereelert, W.; Meyers, S. R.; Hinnov, L. A.; Kelly, D.

    2011-12-01

    The recognition of orbital rhythms in paleoclimate data has led to a rich understanding of climate evolution during the Neogene and Quaternary. In contrast, changes in stochastic variability associated with the transition from unipolar to bipolar glaciation have received less attention, although the stochastic component likely preserves key insights about climate. In this study, we seek to evaluate the dominance and character of stochastic climate energy since the Middle Miocene Climatic Optimum (~17 Ma). These analyses extend a previous study that suggested diagnostic stochastic responses associated with Northern Hemisphere ice sheet development during the Plio-Pleistocene (Meyers and Hinnov, 2010). A critical and challenging step necessary to conduct the work is the conversion of depth data to time data. We investigate climate proxy datasets using multiple time scale hypotheses, including depth-derived time scales, sedimentologic/geochemical "tuning", minimal orbital tuning, and comprehensive orbital tuning. To extract the stochastic component of climate, and also explore potential relationships between the orbital parameters and paleoclimate response, a number of approaches rooted in Thomson's (1982) multi-taper spectral method (MTM) are applied. Importantly, the MTM technique is capable of separating the spectral "continuum" - a measure of stochastic variability - from the deterministic periodic orbital signals (spectral "lines") preserved in proxy data. Time series analysis of the proxy records using different chronologic approaches allows us to evaluate the sensitivity of our conclusions about stochastic and deterministic orbital processes from the Middle Miocene to the present. Moreover, comparison of individual records permits examination of the spatial dependence of the identified climate responses. Meyers, S.R., and Hinnov, L.A. (2010), Northern Hemisphere glaciation and the evolution of Plio-Pleistocene climate noise: Paleoceanography, 25, PA3207, doi:10.1029/2009PA001834. Thomson, D.J. (1982), Spectrum estimation and harmonic analysis: IEEE Proceedings, v. 70, p. 1055-1096.
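
    Because the abstract hinges on MTM's ability to separate a stochastic continuum from deterministic spectral lines, a compact numerical illustration may help. The sketch below, assuming SciPy's DPSS routine and a toy record of one sinusoidal "orbital" line embedded in AR(1) red noise, averages tapered eigenspectra into a multitaper estimate; the record, time-bandwidth product, and taper count are all illustrative choices.

```python
import numpy as np
from scipy.signal.windows import dpss

rng = np.random.default_rng(1)
n, dt = 2048, 1.0
t = np.arange(n) * dt

# toy proxy record: one deterministic "orbital" line plus a red-noise continuum
noise = np.zeros(n)
for i in range(1, n):
    noise[i] = 0.95 * noise[i - 1] + rng.normal()
x = np.sin(2 * np.pi * 0.05 * t) + 0.3 * noise

NW, K = 4, 7                          # time-bandwidth product, number of tapers
tapers = dpss(n, NW, K)               # (K, n) Slepian tapers
eigenspectra = np.abs(np.fft.rfft(tapers * x, axis=1)) ** 2
mtm_spectrum = eigenspectra.mean(axis=0)   # averaged multitaper estimate
freqs = np.fft.rfftfreq(n, dt)

# the line at f = 0.05 should stand out above the red-noise continuum
mask = freqs > 0.01
print("peak frequency above 0.01:", freqs[mask][mtm_spectrum[mask].argmax()])
```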

  14. Expansion or extinction: deterministic and stochastic two-patch models with Allee effects.

    PubMed

    Kang, Yun; Lanchier, Nicolas

    2011-06-01

    We investigate the impact of the Allee effect and dispersal on the long-term evolution of a population in a patchy environment. Our main focus is on whether a population already established in one patch either successfully invades an adjacent empty patch or undergoes a global extinction. Our study is based on the combination of analytical and numerical results for both a deterministic two-patch model and a stochastic counterpart. The deterministic model has either two, three or four attractors. The existence of a regime with exactly three attractors only appears when patches have distinct Allee thresholds. In the presence of weak dispersal, the analysis of the deterministic model shows that a high-density and a low-density population can coexist at equilibrium in nearby patches, whereas the analysis of the stochastic model indicates that this equilibrium is metastable, thus leading after a large random time to either a global expansion or a global extinction. Up to some critical dispersal, increasing the intensity of the interactions leads to an increase of both the basin of attraction of global extinction and the basin of attraction of global expansion. Above this threshold, for both the deterministic and the stochastic models, the patches tend to synchronize as the intensity of the dispersal increases. This results in either a global expansion or a global extinction. For the deterministic model, there are only two attractors, while the stochastic model no longer exhibits metastable behavior. In the presence of strong dispersal, the limiting behavior is entirely determined by the value of the Allee thresholds, as the global population size in the deterministic and the stochastic models evolves as dictated by their single-patch counterparts. For all values of the dispersal parameter, Allee effects promote global extinction in terms of an expansion of the basin of attraction of the extinction equilibrium for the deterministic model and an increase of the probability of extinction for the stochastic model.
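
    For readers who want to experiment with the deterministic side of such models, here is a minimal sketch of a two-patch system with distinct Allee thresholds and symmetric dispersal, integrated with SciPy; the functional form and all parameter values are illustrative assumptions, not the authors' exact model.

```python
import numpy as np
from scipy.integrate import solve_ivp

def two_patch(t, u, r=1.0, K=1.0, a1=0.2, a2=0.4, d=0.05):
    """Logistic growth with Allee thresholds a1, a2 and symmetric dispersal d."""
    n1, n2 = u
    f1 = r * n1 * (1 - n1 / K) * (n1 - a1) + d * (n2 - n1)
    f2 = r * n2 * (1 - n2 / K) * (n2 - a2) + d * (n1 - n2)
    return [f1, f2]

# patch 1 established, patch 2 empty: does the population invade or collapse?
sol = solve_ivp(two_patch, (0, 200), [0.9, 0.0])
print("final densities:", sol.y[:, -1])
```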

  15. Ongoing Voluntary Settlement and Independent Agency: Evidence from China

    PubMed Central

    Feng, Jing; Ren, Xiaopeng; Ma, Xinran

    2017-01-01

    Voluntary frontier settlement leads to independent agency. Because this hypothesis has not yet been tested in an ongoing voluntary settlement frontier, we conducted several cultural tasks to investigate Shenzhen, known as China’s ongoing “South Frontier,” which is composed mostly of people who have emigrated from other Chinese provinces within the past 30 years. We hypothesized that residents of Shenzhen are more independent than those in other regions of Mainland China. As predicted, residents of Shenzhen scored higher than inland Chinese residents on self-reported independent beliefs and lower on nepotism. The results indicate that, even in a short-term ongoing frontier, voluntary settlement is associated with independent agency. PMID:28798712

  16. Rural Cross-Sector Collaboration: A Social Frontier Analysis

    ERIC Educational Resources Information Center

    Miller, Peter M.; Scanlan, Martin K.; Phillippo, Kate

    2017-01-01

    Schools throughout the United States apply comprehensive community partnership strategies to address students' in- and out-of-school needs. Drawing from models like the Harlem Children's Zone, Promise Neighborhoods, and full-service community schools, such strategies call for diverse professionals to reach beyond their own organizations to…

  17. PARC - Scientific Exchange Program (A "Life at the Frontiers of Energy Research" contest entry from the 2011 Energy Frontier Research Centers (EFRCs) Summit and Forum)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blankenship, Robert E.

    "PARC - Scientific Exchange Program" was submitted by the Photosynthetic Antenna Research Center (PARC) to the "Life at the Frontiers of Energy Research" video contest at the 2011 Science for Our Nation's Energy Future: Energy Frontier Research Centers (EFRCs) Summit and Forum. Twenty-six EFRCs created short videos to highlight their mission and their work. PARC, an EFRC directed by Robert E. Blankenship at Washington University in St. Louis, is a partnership of scientists from ten institutions. The Office of Basic Energy Sciences in the U.S. Department of Energy's Office of Science established the 46 Energy Frontier Research Centers (EFRCs) inmore » 2009. These collaboratively-organized centers conduct fundamental research focused on 'grand challenges' and use-inspired 'basic research needs' recently identified in major strategic planning efforts by the scientific community. The overall purpose is to accelerate scientific progress toward meeting the nation's critical energy challenges.« less

  18. Electricity: The Energy of Tomorrow (A "Life at the Frontiers of Energy Research" contest entry from the 2011 Energy Frontier Research Centers (EFRCs) Summit and Forum)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    "Electricity: the Energy of Tomorrow" was submitted by the Energy Materials Center at Cornell (emc2) to the "Life at the Frontiers of Energy Research" video contest at the 2011 Science for Our Nation's Energy Future: Energy Frontier Research Centers (EFRCs) Summit and Forum. Twenty-six EFRCs created short videos to highlight their mission and their work. emc2, an EFRC directed by Hector D. Abruna at Cornell University (lead) is a partnership between Cornell and Lawrence Berkeley National Laboratory. The Office of Basic Energy Sciences in the U.S. Department of Energy's Office of Science established the 46 Energy Frontier Research Centers (EFRCs)more » in 2009. These collaboratively-organized centers conduct fundamental research focused on 'grand challenges' and use-inspired 'basic research needs' recently identified in major strategic planning efforts by the scientific community. The overall purpose is to accelerate scientific progress toward meeting the nation's critical energy challenges.« less

  19. Electricity: The Energy of Tomorrow (A "Life at the Frontiers of Energy Research" contest entry from the 2011 Energy Frontier Research Centers (EFRCs) Summit and Forum)

    ScienceCinema

    Abruna, Hector D. (Director, Energy Materials Center at Cornell); emc2 Staff

    2017-12-09

    'Electricity: the Energy of Tomorrow' was submitted by the Energy Materials Center at Cornell (emc2) to the 'Life at the Frontiers of Energy Research' video contest at the 2011 Science for Our Nation's Energy Future: Energy Frontier Research Centers (EFRCs) Summit and Forum. Twenty-six EFRCs created short videos to highlight their mission and their work. emc2, an EFRC directed by Hector D. Abruna at Cornell University (lead), is a partnership between Cornell and Lawrence Berkeley National Laboratory. The Office of Basic Energy Sciences in the U.S. Department of Energy's Office of Science established the 46 Energy Frontier Research Centers (EFRCs) in 2009. These collaboratively-organized centers conduct fundamental research focused on 'grand challenges' and use-inspired 'basic research needs' recently identified in major strategic planning efforts by the scientific community. The overall purpose is to accelerate scientific progress toward meeting the nation's critical energy challenges.

  20. PARC - Scientific Exchange Program (A "Life at the Frontiers of Energy Research" contest entry from the 2011 Energy Frontier Research Centers (EFRCs) Summit and Forum)

    ScienceCinema

    Blankenship, Robert E. (Director, Photosynthetic Antenna Research Center); PARC Staff

    2017-12-09

    'PARC - Scientific Exchange Program' was submitted by the Photosynthetic Antenna Research Center (PARC) to the 'Life at the Frontiers of Energy Research' video contest at the 2011 Science for Our Nation's Energy Future: Energy Frontier Research Centers (EFRCs) Summit and Forum. Twenty-six EFRCs created short videos to highlight their mission and their work. PARC, an EFRC directed by Robert E. Blankenship at Washington University in St. Louis, is a partnership of scientists from ten institutions. The Office of Basic Energy Sciences in the U.S. Department of Energy's Office of Science established the 46 Energy Frontier Research Centers (EFRCs) in 2009. These collaboratively-organized centers conduct fundamental research focused on 'grand challenges' and use-inspired 'basic research needs' recently identified in major strategic planning efforts by the scientific community. The overall purpose is to accelerate scientific progress toward meeting the nation's critical energy challenges.

  1. Econometrics of joint production: another approach. [Petroleum refining and petrochemicals production]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Griffin, J.M.

    1977-11-01

    The pseudo data approach to the joint production of petroleum refining and chemicals is described as an alternative that avoids the multicollinearity of time-series data and allows a complex technology to be characterized in a statistical price possibility frontier. Intended primarily for long-range analysis, the pseudo data method can be used as a source of elasticity estimates for policy analysis. 19 references.

  2. Stochastic resonance in a fractional oscillator driven by multiplicative quadratic noise

    NASA Astrophysics Data System (ADS)

    Ren, Ruibin; Luo, Maokang; Deng, Ke

    2017-02-01

    Stochastic resonance of a fractional oscillator subject to an external periodic field as well as to multiplicative and additive noise is investigated. The fluctuations of the eigenfrequency are modeled as a quadratic function of trichotomous noise. Applying the moment equation method and the Shapiro-Loginov formula, we obtain an exact expression for the complex susceptibility and the related stability criteria. Theoretical analysis and numerical simulations indicate that the spectral amplification (SPA) depends non-monotonically both on the external driving frequency and on the parameters of the quadratic noise. In addition, the investigation of fractional stochastic systems suggests that both the noise parameters and the memory effect can induce stochastic multi-resonance (SMR), a phenomenon reported here and previously believed to be absent in the case of multiplicative noise with only a linear term.
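
    Trichotomous noise is less familiar than its Gaussian counterpart, so a small generator may be useful. The sketch below is a simplified discrete-time approximation of a three-state Markov noise taking values {-a, 0, +a}, with stationary occupation probability q for each of the +/-a states and switching rate nu; all parameter names and values are illustrative, and the paper's fractional dynamics are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(2)

def trichotomous_noise(n_steps, dt, a=1.0, q=0.3, nu=1.0):
    """Three-state Markov noise on {-a, 0, +a}; zero mean, variance 2*q*a**2."""
    states = np.array([-a, 0.0, a])
    probs = np.array([q, 1 - 2 * q, q])        # stationary distribution
    x = np.empty(n_steps)
    x[0] = rng.choice(states, p=probs)
    for i in range(1, n_steps):
        if rng.random() < nu * dt:             # switching event
            x[i] = rng.choice(states, p=probs)
        else:
            x[i] = x[i - 1]
    return x

xi = trichotomous_noise(100_000, 0.01)
print("mean ~ 0:", xi.mean(), "| variance ~ 2qa^2 = 0.6:", xi.var())
```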

  3. The Behavior of Hydrogen Under Extreme Conditions on Ultrafast Timescales (A "Life at the Frontiers of Energy Research" contest entry from the 2011 Energy Frontier Research Centers (EFRCs) Summit and Forum)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    "The Behavior of Hydrogen Under Extreme Conditions on Ultrafast Timescales" was submitted by the Center for Energy Frontier Research in Extreme Environments (EFree) to the "Life at the Frontiers of Energy Research" video contest at the 2011 Science for Our Nation's Energy Future: Energy Frontier Research Centers (EFRCs) Summit and Forum. Twenty-six EFRCs created short videos to highlight their mission and their work. EFree is directed by Ho-kwang Mao at the Carnegie Institute of Science in Washington, DC and is a partnership of scientists from thirteen institutions.The Office of Basic Energy Sciences in the U.S. Department of Energy's Office ofmore » Science established the 46 Energy Frontier Research Centers (EFRCs) in 2009. These collaboratively-organized centers conduct fundamental research focused on 'grand challenges' and use-inspired 'basic research needs' recently identified in major strategic planning efforts by the scientific community. The overall purpose is to accelerate scientific progress toward meeting the nation's critical energy challenges. The mission of Energy Frontier Research in Extreme Environments is 'to accelerate the discovery and creation of energy-relevant materials using extreme pressures and temperatures.' Research topics are: catalysis (CO2, water), photocatalysis, solid state lighting, optics, thermelectric, phonons, thermal conductivity, solar electrodes, fuel cells, superconductivity, extreme environment, radiation effects, defects, spin dynamics, CO2 (capture, convert, store), greenhouse gas, hydrogen (fuel, storage), ultrafast physics, novel materials synthesis, and defect tolerant materials.« less

  4. The Hubble Frontier Fields: Engaging Multiple Audiences in Exploring the Cosmic Frontier

    NASA Astrophysics Data System (ADS)

    Lawton, Brandon L.; Smith, Denise A.; Summers, Frank; Ryer, Holly; Slivinski, Carolyn; Lotz, Jennifer M.

    2017-06-01

    The Hubble Frontier Fields is a multi-cycle program of six deep-field observations of strong-lensing galaxy clusters taken in parallel with six deep “blank fields.” The three-year collaborative program began in late 2013 and is led by observations from NASA’s Great Observatories. The observations, now complete, allow astronomers to look deeper into the universe than ever before, and potentially to uncover galaxies that are as much as 100 times fainter than what the telescopes can typically observe. The Frontier Fields science program is ideal for informing audiences about scientific advances and topics in STEM. The study of galaxy properties, statistics, optics, and Einstein’s theory of general relativity builds naturally on the science returns of the Frontier Fields program. As a result, the Space Telescope Science Institute’s Office of Public Outreach (OPO) has engaged multiple audiences over the past three years in following the progress of the Frontier Fields. For over two decades, the STScI outreach program has sought to bring the wonders of the universe to the public and engage audiences in the adventure of scientific discovery. In addition, we are leveraging the reach of NASA’s new Universe of Learning education program to bring the science of the Frontier Fields to informal education audiences. The main underpinnings of the STScI outreach program and the Universe of Learning education program are scientist-educator development teams, partnerships, and an embedded program evaluation component. OPO is leveraging the infrastructure of these education and outreach programs to bring the Frontier Fields science program to the education community and the public in a cost-effective way. This talk will feature highlights from the past three years of the program. We will highlight OPO’s strategies and the infrastructure that allows for the quick delivery of groundbreaking science to the education community and the public.

  5. Stochastic population dynamics of a montane ground-dwelling squirrel.

    PubMed

    Hostetler, Jeffrey A; Kneip, Eva; Van Vuren, Dirk H; Oli, Madan K

    2012-01-01

    Understanding the causes and consequences of population fluctuations is a central goal of ecology. We used demographic data from a long-term (1990-2008) study and matrix population models to investigate factors and processes influencing the dynamics and persistence of a golden-mantled ground squirrel (Callospermophilus lateralis) population, inhabiting a dynamic subalpine habitat in Colorado, USA. The overall deterministic population growth rate λ was 0.94±SE 0.05 but it varied widely over time, ranging from 0.45±0.09 in 2006 to 1.50±0.12 in 2003, and was below replacement (λ<1) for 9 out of 18 years. The stochastic population growth rate λ(s) was 0.92, suggesting a declining population; however, the 95% CI on λ(s) included 1.0 (0.52-1.60). Stochastic elasticity analysis showed that survival of adult females, followed by survival of juvenile females and litter size, were potentially the most influential vital rates; analysis of life table response experiments revealed that the same three life history variables made the largest contributions to year-to-year changes in λ. Population viability analysis revealed that, when the influences of density dependence and immigration were not considered, the population had a high (close to 1.0 in 50 years) probability of extinction. However, probability of extinction declined to as low as zero when density dependence and immigration were considered. Destabilizing effects of stochastic forces were counteracted by regulating effects of density dependence and rescue effects of immigration, which allowed our study population to bounce back from low densities and prevented extinction. These results suggest that dynamics and persistence of our study population are determined synergistically by density dependence, stochastic forces, and immigration.
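
    The stochastic growth rate λs reported above is typically computed by averaging log growth increments over a long sequence of randomly drawn projection matrices. Here is a minimal sketch with two hypothetical 2x2 stage matrices standing in for good and bad years; the matrices and environment probabilities are illustrative, not the study's estimates.

```python
import numpy as np

rng = np.random.default_rng(3)

# hypothetical yearly projection matrices for a good and a bad year
A_good = np.array([[0.5, 1.5],
                   [0.6, 0.7]])
A_bad = np.array([[0.2, 0.8],
                  [0.3, 0.4]])

n_years = 10_000
n = np.array([1.0, 1.0])
log_growth = 0.0
for _ in range(n_years):
    A = A_good if rng.random() < 0.5 else A_bad  # i.i.d. environments
    n = A @ n
    s = n.sum()
    log_growth += np.log(s)
    n /= s                                       # renormalize to avoid overflow

lambda_s = np.exp(log_growth / n_years)          # stochastic growth rate
print("lambda_s ~", round(lambda_s, 3))
```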

  6. Stochastic Population Dynamics of a Montane Ground-Dwelling Squirrel

    PubMed Central

    Hostetler, Jeffrey A.; Kneip, Eva; Van Vuren, Dirk H.; Oli, Madan K.

    2012-01-01

    Understanding the causes and consequences of population fluctuations is a central goal of ecology. We used demographic data from a long-term (1990–2008) study and matrix population models to investigate factors and processes influencing the dynamics and persistence of a golden-mantled ground squirrel (Callospermophilus lateralis) population, inhabiting a dynamic subalpine habitat in Colorado, USA. The overall deterministic population growth rate λ was 0.94±SE 0.05 but it varied widely over time, ranging from 0.45±0.09 in 2006 to 1.50±0.12 in 2003, and was below replacement (λ<1) for 9 out of 18 years. The stochastic population growth rate λs was 0.92, suggesting a declining population; however, the 95% CI on λs included 1.0 (0.52–1.60). Stochastic elasticity analysis showed that survival of adult females, followed by survival of juvenile females and litter size, were potentially the most influential vital rates; analysis of life table response experiments revealed that the same three life history variables made the largest contributions to year-to-year changes in λ. Population viability analysis revealed that, when the influences of density dependence and immigration were not considered, the population had a high (close to 1.0 in 50 years) probability of extinction. However, probability of extinction declined to as low as zero when density dependence and immigration were considered. Destabilizing effects of stochastic forces were counteracted by regulating effects of density dependence and rescue effects of immigration, which allowed our study population to bounce back from low densities and prevented extinction. These results suggest that dynamics and persistence of our study population are determined synergistically by density dependence, stochastic forces, and immigration. PMID:22479616

  7. Amazon Land Wars in the South of Para

    NASA Technical Reports Server (NTRS)

    Simmons, Cynthia S.; Walker, Robert T.; Arima, Eugenio Y.; Aldrich, Stephen P.; Caldas, Marcellus M.

    2007-01-01

    The South of Para, located in the heart of the Brazilian Amazon, has become notorious for violent land struggle. Although land conflict has a long history in Brazil, and today impacts many parts of the country, violence is most severe and persistent here. The purpose of this article is to examine why. Specifically, we consider how a particular Amazonian place, the so-called South of Para, has come to be known as Brazil's most dangerous badland. We begin by considering the predominant literature, which attributes land conflict to the frontier expansion process, with intensified struggle emerging in the face of rising property values and demand for private property associated with capitalist development. From this discussion, we distill a concept of the frontier, based on notions of property rights evolution and locational rents. We then empirically test the persistence of place-based violence in the region, and assess the frontier movement through an analysis of transportation costs. The findings from the analyses indicate that the prevalent theorization of frontier violence in Amazonia does little to explain its persistent and pervasive nature in the South of Para. To fill this gap in understanding, we develop an explanation based on the geographic conception of place, and we use contentious politics theory heuristically to elucidate the ways in which general processes interact with place-specific history to engender a landscape of violence. In so doing, we focus on environmental, cognitive, and relational mechanisms (and implicated structures), and attempt to deploy them in an explanatory framework that allows direct observation of the accumulating layers of the region's tragic history. We end by placing our discussion within a political ecological context, and consider the implications of the Amazon Land War for the environment.

  8. The Herschel Lensing Survey (HLS): HST Frontier Field Coverage

    NASA Astrophysics Data System (ADS)

    Egami, Eiichi

    2015-08-01

    The Herschel Lensing Survey (HLS; PI: Egami) is a large Far-IR/Submm imaging survey of massive galaxy clusters using the Herschel Space Observatory. Its main goal is to detect and study IR/Submm galaxies that are below the nominal confusion limit of Herschel by taking advantage of the strong gravitational lensing power of massive galaxy clusters. HLS has obtained deep PACS (100/160 um) and SPIRE (250/350/500 um) images for 54 cluster fields (HLS-deep) as well as shallower but nearly confusion-limited SPIRE-only images for 527 cluster fields (HLS-snapshot) with a total observing time of ~420 hours. Extensive multi-wavelength follow-up studies are currently on-going with a variety of observing facilities including ALMA.Here, I will focus on the analysis of the deep Herschel PACS/SPIRE images obtained for the 6 HST Frontier Fields (5 observed by HLS-deep; 1 observed by the Herschel GT programs). The Herschel/SPIRE maps are wide enough to cover the Frontier-Field parallel pointings, and we have detected a total of ~180 sources, some of which are strongly lensed. I will present the sample and discuss the properties of these Herschel-detected dusty star-forming galaxies (DSFGs) identified in the Frontier Fields. Although the majority of these Herschel sources are at moderate redshift (z<3), a small number of extremely high-redshift (z>6) candidates can be identified as "Herschel dropouts" when combined with longer-wavelength data. We have also identified ~40 sources as likely cluster members, which will allow us to study the properties of DSFGs in the dense cluster environment.A great legacy of our HLS project will be the extensive multi-wavelength database that incorporates most of the currently available data/information for the fields of the Frontier-Field, CLASH, and other HLS clusters (e.g., HST/Spitzer/Herschel images, spectroscopic/photometric redshifts, lensing models, best-fit SED models etc.). Provided with a user-friendly GUI and a flexible search engine, this database should serve as a powerful tool for a variety of projects including those with ALMA and JWST in the future. I will conclude by introducing this HLS database system.

  9. Stochastic Dominance and Analysis of ODI Batting Performance: the Indian Cricket Team, 1989-2005

    PubMed Central

    Damodaran, Uday

    2006-01-01

    Relative to other team games, the contribution of individual team members to the overall team performance is more easily quantifiable in cricket. Viewing players as securities and the team as a portfolio, cricket thus lends itself better to the use of analytical methods usually employed in the analysis of securities and portfolios. This paper demonstrates the use of stochastic dominance rules, normally used in investment management, to analyze the One Day International (ODI) batting performance of Indian cricketers. The data used span the years 1989 to 2005. In dealing with cricketing data, the existence of ‘not out’ scores poses a problem while processing the data. In this paper, using a Bayesian approach, the ‘not-out’ scores are first replaced with a conditional average. The conditional average that is used represents an estimate of the score that the player would have gone on to score, if the ‘not out’ innings had been completed. The data thus treated are then used in the stochastic dominance analysis. To use stochastic dominance rules we need to characterize the ‘utility’ of a batsman. The first derivative of the utility function, with respect to runs scored, of an ODI batsman can safely be assumed to be positive (more runs scored are preferred to less). However, the second derivative need not be negative (no diminishing marginal utility for runs scored). This means that we cannot clearly specify whether the value attached to an additional run scored is lower at higher levels of scores. Because of this, only first-order stochastic dominance is used to analyze the performance of the players under consideration. While this has its limitations (specifically, we cannot arrive at a complete utility value for each batsman), the approach does well in describing player performance. Moreover, the results have intuitive appeal. Key Points: The problem of dealing with ‘not out’ scores in cricket is tackled using a Bayesian approach. Stochastic dominance rules are used to characterize the utility of a batsman. Since the marginal utility of runs scored is not diminishing in nature, only first-order stochastic dominance rules are used. The results, demonstrated using data for the Indian cricket team, are intuitively appealing. The limitation of the approach is that it cannot arrive at a complete utility value for the batsman. PMID:24357944
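
    First-order stochastic dominance has a direct computational form: batsman A dominates batsman B if A's empirical CDF lies at or below B's everywhere. A minimal sketch follows, using hypothetical innings scores rather than the paper's data; the 'not out' adjustment described above is assumed to have been applied already.

```python
import numpy as np

def fsd_dominates(scores_a, scores_b):
    """True if A first-order stochastically dominates B: F_A(x) <= F_B(x) for all x."""
    a, b = np.sort(scores_a), np.sort(scores_b)
    grid = np.union1d(a, b)
    F_a = np.searchsorted(a, grid, side="right") / a.size
    F_b = np.searchsorted(b, grid, side="right") / b.size
    return bool(np.all(F_a <= F_b))

# hypothetical innings-by-innings scores (not actual player data)
player_a = np.array([12, 34, 55, 78, 91, 40, 66])
player_b = np.array([5, 20, 33, 41, 52, 18, 60])
print("A dominates B:", fsd_dominates(player_a, player_b))
```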

  10. Stochastic Dominance and Analysis of ODI Batting Performance: the Indian Cricket Team, 1989-2005.

    PubMed

    Damodaran, Uday

    2006-01-01

    Relative to other team games, the contribution of individual team members to the overall team performance is more easily quantifiable in cricket. Viewing players as securities and the team as a portfolio, cricket thus lends itself better to the use of analytical methods usually employed in the analysis of securities and portfolios. This paper demonstrates the use of stochastic dominance rules, normally used in investment management, to analyze the One Day International (ODI) batting performance of Indian cricketers. The data used span the years 1989 to 2005. In dealing with cricketing data, the existence of 'not out' scores poses a problem while processing the data. In this paper, using a Bayesian approach, the 'not-out' scores are first replaced with a conditional average. The conditional average that is used represents an estimate of the score that the player would have gone on to score, if the 'not out' innings had been completed. The data thus treated are then used in the stochastic dominance analysis. To use stochastic dominance rules we need to characterize the 'utility' of a batsman. The first derivative of the utility function, with respect to runs scored, of an ODI batsman can safely be assumed to be positive (more runs scored are preferred to less). However, the second derivative need not be negative (no diminishing marginal utility for runs scored). This means that we cannot clearly specify whether the value attached to an additional run scored is lower at higher levels of scores. Because of this, only first-order stochastic dominance is used to analyze the performance of the players under consideration. While this has its limitations (specifically, we cannot arrive at a complete utility value for each batsman), the approach does well in describing player performance. Moreover, the results have intuitive appeal. Key Points: The problem of dealing with 'not out' scores in cricket is tackled using a Bayesian approach. Stochastic dominance rules are used to characterize the utility of a batsman. Since the marginal utility of runs scored is not diminishing in nature, only first-order stochastic dominance rules are used. The results, demonstrated using data for the Indian cricket team, are intuitively appealing. The limitation of the approach is that it cannot arrive at a complete utility value for the batsman.

  11. Discrete Deterministic and Stochastic Petri Nets

    NASA Technical Reports Server (NTRS)

    Zijal, Robert; Ciardo, Gianfranco

    1996-01-01

    Petri nets augmented with timing specifications have gained wide acceptance in the area of performance and reliability evaluation of complex systems exhibiting concurrency, synchronization, and conflicts. The state space of time-extended Petri nets is mapped onto its basic underlying stochastic process, which can be shown to be Markovian under the assumption of exponentially distributed firing times. The integration of exponentially and non-exponentially distributed timing is still one of the major problems for the analysis, and was first attacked for continuous-time Petri nets at the cost of structural or analytical restrictions. We propose a discrete deterministic and stochastic Petri net (DDSPN) formalism with no imposed structural or analytical restrictions, where transitions can fire either in zero time or according to arbitrary firing times that can be represented as the time to absorption in a finite absorbing discrete-time Markov chain (DTMC). Exponentially distributed firing times are then approximated arbitrarily well by geometric distributions. Deterministic firing times are a special case of the geometric distribution. The underlying stochastic process of a DDSPN is then also a DTMC, from which the transient and stationary solutions can be obtained by standard techniques. A comprehensive algorithm and some state-space reduction techniques for the analysis of DDSPNs are presented, comprising the automatic detection of conflicts and confusions, which removes a major obstacle for the analysis of discrete time models.
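
    The key construction, representing a firing time as the time to absorption in a finite absorbing DTMC, is easy to illustrate. The sketch below builds a small absorbing chain, computes the firing-time CDF from powers of the transient block, and obtains the mean from the fundamental matrix; the chain itself is a made-up example, not taken from the paper.

```python
import numpy as np

# DTMC with transient states {0, 1} and absorbing state 2; the phase-type
# distribution of the time to absorption models one transition's firing time
P = np.array([[0.6, 0.3, 0.1],
              [0.0, 0.7, 0.3],
              [0.0, 0.0, 1.0]])

T = P[:2, :2]                  # transient-to-transient block
alpha = np.array([1.0, 0.0])   # start in state 0

# P(firing time <= k) = 1 - alpha @ T^k @ 1
for k in (1, 5, 10, 20):
    cdf = 1.0 - alpha @ np.linalg.matrix_power(T, k) @ np.ones(2)
    print(f"P(firing time <= {k}) = {cdf:.4f}")

# mean firing time = alpha @ (I - T)^{-1} @ 1 (fundamental matrix)
mean_time = alpha @ np.linalg.inv(np.eye(2) - T) @ np.ones(2)
print("mean firing time:", mean_time)
```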

  12. Proper orthogonal decomposition-based spectral higher-order stochastic estimation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baars, Woutijn J., E-mail: wbaars@unimelb.edu.au; Tinney, Charles E.

    A unique routine, capable of identifying both linear and higher-order coherence in multiple-input/output systems, is presented. The technique combines two well-established methods: Proper Orthogonal Decomposition (POD) and Higher-Order Spectra Analysis. The latter of these is based on known methods for characterizing nonlinear systems by way of Volterra series. In that, both linear and higher-order kernels are formed to quantify the spectral (nonlinear) transfer of energy between the system's input and output. This reduces essentially to spectral Linear Stochastic Estimation when only first-order terms are considered, and is therefore presented in the context of stochastic estimation as spectral Higher-Order Stochastic Estimation (HOSE). The trade-off to seeking higher-order transfer kernels is that the increased complexity restricts the analysis to single-input/output systems. Low-dimensional (POD-based) analysis techniques are inserted to alleviate this void as POD coefficients represent the dynamics of the spatial structures (modes) of a multi-degree-of-freedom system. The mathematical framework behind this POD-based HOSE method is first described. The method is then tested in the context of jet aeroacoustics by modeling acoustically efficient large-scale instabilities as combinations of wave packets. The growth, saturation, and decay of these spatially convecting wave packets are shown to couple both linearly and nonlinearly in the near-field to produce waveforms that propagate acoustically to the far-field for different frequency combinations.
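
    As a concrete, simplified illustration of the linear (first-order) end of this framework, the sketch below extracts POD coefficients from synthetic multi-probe snapshots by SVD and fits a least-squares kernel from those coefficients to a scalar output; the data are synthetic, and the higher-order (Volterra) kernels of the HOSE method are only indicated in a comment.

```python
import numpy as np

rng = np.random.default_rng(4)

# synthetic multi-point "input" snapshots and a scalar "output"
n_snap, n_probe = 2000, 16
modes = rng.normal(size=(n_probe, 3))
coeffs = rng.normal(size=(n_snap, 3)) * np.array([3.0, 2.0, 1.0])
field = coeffs @ modes.T + 0.1 * rng.normal(size=(n_snap, n_probe))
output = field @ rng.normal(size=n_probe)      # hypothetical linear relation

# POD by SVD of the mean-subtracted snapshot matrix
X = field - field.mean(axis=0)
U, s, Vt = np.linalg.svd(X, full_matrices=False)
a = U[:, :3] * s[:3]                           # first 3 POD coefficients

# first-order (linear) stochastic estimation: least-squares kernel from POD
# coefficients to the output; higher-order kernels would add quadratic
# products of the coefficients as extra regressors
kernel, *_ = np.linalg.lstsq(a, output - output.mean(), rcond=None)
est = a @ kernel + output.mean()
print("correlation of LSE estimate with output:", np.corrcoef(est, output)[0, 1])
```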

  13. A stochastic model for correlated protein motions

    NASA Astrophysics Data System (ADS)

    Karain, Wael I.; Qaraeen, Nael I.; Ajarmah, Basem

    2006-06-01

    A one-dimensional Langevin-type stochastic difference equation is used to find the deterministic and Gaussian contributions of time series representing the projections of a Bovine Pancreatic Trypsin Inhibitor (BPTI) protein molecular dynamics simulation along different eigenvector directions determined using principal component analysis. The deterministic part shows a distinct nonlinear behavior only for eigenvectors contributing significantly to the collective protein motion.
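
    A standard way to extract the deterministic part of such a Langevin-type difference equation is conditional averaging of increments. The sketch below simulates a toy series with a known cubic drift and recovers the drift as the bin-averaged mean increment; the drift function and noise level are illustrative assumptions, not the BPTI analysis itself.

```python
import numpy as np

rng = np.random.default_rng(5)

# simulate x[t+1] = x[t] + f(x[t]) + sigma * eta[t] with a known cubic drift
f = lambda x: -0.5 * x**3 + 0.2 * x
n, sigma = 200_000, 0.3
x = np.zeros(n)
for t in range(n - 1):
    x[t + 1] = x[t] + f(x[t]) + sigma * rng.normal()

# recover the drift as the conditional mean increment in bins of x
bins = np.linspace(x.min(), x.max(), 30)
idx = np.digitize(x[:-1], bins)
dx = np.diff(x)
drift_est = [dx[idx == i].mean() for i in range(1, bins.size) if np.any(idx == i)]
print("estimated drift in the first few bins:", np.round(drift_est[:5], 3))
```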

  14. Convergence analysis of stochastic hybrid bidirectional associative memory neural networks with delays

    NASA Astrophysics Data System (ADS)

    Wan, Li; Zhou, Qinghua

    2007-10-01

    The stability property of stochastic hybrid bidirectional associative memory (BAM) neural networks with discrete delays is considered. Without assuming the symmetry of synaptic connection weights or the monotonicity and differentiability of activation functions, delay-independent sufficient conditions guaranteeing the exponential stability of the equilibrium solution for such networks are given by using the nonnegative semimartingale convergence theorem.

  15. First City, Anti-City: Cain, Heterotopia, and Study Abroad

    ERIC Educational Resources Information Center

    Kenney, Lance

    2011-01-01

    Cross-/inter-cultural development, language acquisition, employment potential, and the impact of homestays, program duration, direct enrollment, even grading policies have been analyzed in the pages of "Frontiers" and other journals. This analysis more often than not utilizes methodologies particular to social science research. What has been…

  16. Gear Fatigue Crack Diagnosis by Vibration Analysis Using Embedded Modeling

    DTIC Science & Technology

    2001-04-05

    gave references on the Wigner-Ville Distribution (WVD) and some statistics-based methods including FM4, NA4, and NB4. There are limitations for vibration… Distribution: Approved for public release, distribution unlimited. This paper is part of the following report: New Frontiers in…

  17. Frontiers of Remote Sensing of the Oceans and Troposphere from Air and Space Platforms

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Several areas of remote sensing are addressed including: future satellite systems; air-sea interaction/wind; ocean waves and spectra/S.A.R.; atmospheric measurements (particulates and water vapor); synoptic and weather forecasting; topography; bathymetry; sea ice; and impact of remote sensing on synoptic analysis/forecasting.

  18. EDITORIAL: Invited papers from the international meeting on 'New Frontiers in Numerical Relativity' (Albert Einstein Institute, Potsdam, Germany, 17–21 July 2006)

    NASA Astrophysics Data System (ADS)

    Campanelli, M.; Rezzolla, L.

    2007-06-01

    Traditionally, frontiers represent a treacherous terrain to venture into, where hidden obstacles are present and uncharted territories lie ahead. At the same time, frontiers are also a place where new perspectives can be appreciated and have often been the cradle of new and thriving developments. With this in mind and inspired by this spirit, the Numerical Relativity Group at the Albert Einstein Institute (AEI) organized a `New Frontiers in Numerical Relativity' meeting on 17–21 July 2006 at the AEI campus in Potsdam, Germany. It is an interesting historical remark that the suggestion of the meeting was first made in the late summer of 2005 and thus at a time that for many reasons has been a turning point in the recent history of numerical relativity. A few months earlier (April 2005) in fact, F Pretorius had announced the first multi-orbit simulations of binary black holes and computed the waveforms from the inspiral, merger and ring-down (`Numerical Relativity', Banff International Research Station, Banff, Canada, 16–21 April 2005). At that time, the work of Pretorius served as an important boost to the research in this field and although no other group has yet adopted the techniques he employed, his results provided the numerical relativity community with clear evidence that the binary black hole problem could be solved. A few months later (November 2005), equally striking results were presented by the NASA Goddard and Texas/Brownsville groups, who also reported, independently, multi-orbit evolutions of binary black holes using numerical techniques and formulations of the Einstein equations which were markedly distinct from those suggested by Pretorius (`Numerical Relativity 2005', Goddard Space Flight Centre, Greenbelt, MD, USA, 2–4 November 2005). A few months later other groups were able to repeat the same simulations and obtain equivalent results, testifying that the community as a whole had reached comparable levels of maturity in both the numerical techniques and the mathematical methods needed for successful solution of the Einstein equations for binary black holes. Clearly, an important frontier, and actually a long-awaited one, was finally open and the `gold rush' was just about to begin by the time the `New Frontiers in Numerical Relativity' meeting started its sessions in July 2006. And so, almost 20 years since the almost homonymous meeting held at Urbana-Champaign (`Frontiers in Numerical Relativity', University of Illinois, IL, USA, 1988), the `New Frontiers in Numerical Relativity' meeting at the AEI saw the enthusiastic participation of a great part of the community, with 127 participants present (in 1988 they were 55) and with a large majority being represented by students and postdocs, a reassuring sign of good health for the community. Faithful to the title of the conference, the programme was dedicated to the many and diversified `frontiers' in numerical relativity and organized so as to have few talks with ample time dedicated to discussions. Overall, the talks presented at the meeting covered all of the most salient aspects of numerical relativity: from the formulation of the Einstein equations, over to the initial-value problem in general relativity, from the evolution of vacuum and non-vacuum spacetimes, to multiblock adaptive mesh-refinement techniques, from boundary conditions and perturbative methods, to relativistic fluids and plasmas.
The contributions in this special issue represent a selection of that research, but also include invited papers from authors who were not present at the meeting but were pursuing research at the forefronts of numerical relativity. In addition to the more traditional sessions, the `New Frontiers in Numerical Relativity' meeting also hosted a less traditional session, dedicated to an `unconstrained' discussion which covered some of the most controversial issues that emerged during the conference. During this session, chaired by E Seidel, a lively discussion took place in the non-trivial attempt to mark the new frontiers on the map of numerical relativity. The transcript of this discussion is an integral part of this issue and it is available, along with the audio recording, in the online version only. We believe they embody an important part of the development of this field and, like a good bottle of wine, it will be interesting to read them again once sufficiently aged. As a concluding remark we note that it is almost one year since the `New Frontiers in Numerical Relativity' meeting and dozens of excellent papers have been published or posted on the preprint archive. Some of the scientific results obtained over these months, especially those revolving around binary black holes, were simply unimaginable a few years ago and represent indisputable evidence that research in numerical relativity has never been as exciting as it is now. These results have already had an impact in astrophysics and on the community interested in the analysis of gravitational-wave data, thus opening new and different frontiers in numerical relativity. Interestingly, all of this is happening while ground-based gravitational wave detectors in the US and Europe are operating at a sensitivity such that gravitational radiation may soon be directly detected. While much still needs to be understood and improved, the gold rush towards the new frontiers of numerical relativity does not yet show any sign of being close to a rapid end.

  19. Bayesian analysis of stochastic volatility-in-mean model with leverage and asymmetrically heavy-tailed error using generalized hyperbolic skew Student’s t-distribution

    PubMed Central

    Leão, William L.; Abanto-Valle, Carlos A.; Chen, Ming-Hui

    2017-01-01

    A stochastic volatility-in-mean model with correlated errors using the generalized hyperbolic skew Student-t (GHST) distribution provides a robust alternative to the parameter estimation for daily stock returns in the absence of normality. An efficient Markov chain Monte Carlo (MCMC) sampling algorithm is developed for parameter estimation. The deviance information, the Bayesian predictive information and the log-predictive score criterion are used to assess the fit of the proposed model. The proposed method is applied to an analysis of the daily stock return data from the Standard & Poor’s 500 index (S&P 500). The empirical results reveal that the stochastic volatility-in-mean model with correlated errors and the GHST distribution leads to a significant improvement in the goodness-of-fit for the S&P 500 index returns dataset over the usual normal model. PMID:29333210

  20. A quantile-based scenario analysis approach to biomass supply chain optimization under uncertainty

    DOE PAGES

    Zamar, David S.; Gopaluni, Bhushan; Sokhansanj, Shahab; ...

    2016-11-21

    Supply chain optimization for biomass-based power plants is an important research area due to greater emphasis on renewable power energy sources. Biomass supply chain design and operational planning models are often formulated and studied using deterministic mathematical models. While these models are beneficial for making decisions, their applicability to real world problems may be limited because they do not capture all the complexities in the supply chain, including uncertainties in the parameters. This study develops a statistically robust quantile-based approach for stochastic optimization under uncertainty, which builds upon scenario analysis. We apply and evaluate the performance of our approach to address the problem of analyzing competing biomass supply chains subject to stochastic demand and supply. Finally, the proposed approach was found to outperform alternative methods in terms of computational efficiency and ability to meet the stochastic problem requirements.
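
    The underlying idea, ranking candidate decisions by a quantile of their cost across sampled scenarios rather than by the mean, can be shown in a few lines. The sketch below is a deliberately simplified caricature of such a quantile-based scenario analysis; the cost model, distributions, and the 90th-percentile criterion are illustrative assumptions, not the paper's formulation.

```python
import numpy as np

rng = np.random.default_rng(6)

def cost(capacity, demand, supply):
    """Toy supply-chain cost: capital cost plus a penalty for unmet demand."""
    served = np.minimum(capacity, np.minimum(demand, supply))
    shortfall = np.maximum(demand - served, 0.0)
    return 2.0 * capacity + 10.0 * shortfall

# sampled scenarios of stochastic demand and biomass supply
demand = rng.normal(100.0, 15.0, 5000)
supply = rng.normal(110.0, 20.0, 5000)

candidates = np.arange(60.0, 161.0, 10.0)   # candidate plant capacities
q90 = [np.quantile(cost(c, demand, supply), 0.9) for c in candidates]
best = candidates[int(np.argmin(q90))]
print("capacity minimizing the 90th-percentile cost:", best)
```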

  1. A quantile-based scenario analysis approach to biomass supply chain optimization under uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zamar, David S.; Gopaluni, Bhushan; Sokhansanj, Shahab

    Supply chain optimization for biomass-based power plants is an important research area due to greater emphasis on renewable power energy sources. Biomass supply chain design and operational planning models are often formulated and studied using deterministic mathematical models. While these models are beneficial for making decisions, their applicability to real world problems may be limited because they do not capture all the complexities in the supply chain, including uncertainties in the parameters. This study develops a statistically robust quantile-based approach for stochastic optimization under uncertainty, which builds upon scenario analysis. We apply and evaluate the performance of our approach to address the problem of analyzing competing biomass supply chains subject to stochastic demand and supply. Finally, the proposed approach was found to outperform alternative methods in terms of computational efficiency and ability to meet the stochastic problem requirements.

  2. Resilient filtering for time-varying stochastic coupling networks under the event-triggering scheduling

    NASA Astrophysics Data System (ADS)

    Wang, Fan; Liang, Jinling; Dobaie, Abdullah M.

    2018-07-01

    The resilient filtering problem is considered for a class of time-varying networks with stochastic coupling strengths. An event-triggered strategy is adopted to save network resources by scheduling the signal transmission from the sensors to the filters based on certain prescribed rules. Moreover, the filter parameters to be designed are subject to gain perturbations. The primary aim of the addressed problem is to determine a resilient filter that ensures an acceptable filtering performance for the considered network under event-triggered scheduling. To address this issue, an upper bound on the estimation error variance is established for each node by stochastic analysis. Subsequently, the resilient filter is designed by locally minimizing the derived upper bound at each iteration. Moreover, rigorous analysis shows the monotonicity of the minimal upper bound with respect to the triggering threshold. Finally, a simulation example is presented to show the effectiveness of the established filter scheme.
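
    The event-triggering rule itself is simple to prototype: a sensor transmits only when the current measurement deviates sufficiently from the last transmitted one. The sketch below, with an illustrative quadratic triggering condition and threshold, counts how many transmissions a toy measurement sequence generates; it is a caricature of the scheduling idea, not the paper's filter design.

```python
import numpy as np

rng = np.random.default_rng(7)

def run_event_triggered(y, delta=0.5):
    """Transmit y[k] only when it deviates enough from the last sent value."""
    last_sent = y[0]
    sent = np.zeros(y.size, dtype=bool)
    sent[0] = True
    received = np.empty_like(y)
    for k in range(y.size):
        if (y[k] - last_sent) ** 2 > delta:   # triggering condition
            last_sent = y[k]
            sent[k] = True
        received[k] = last_sent               # filter holds the last sent value
    return received, sent

y = np.cumsum(rng.normal(0.0, 0.3, 200))      # toy measurement sequence
received, sent = run_event_triggered(y)
print(f"transmissions: {sent.sum()} of {y.size} samples")
```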

  3. Stochastic sensitivity measure for mistuned high-performance turbines

    NASA Technical Reports Server (NTRS)

    Murthy, Durbha V.; Pierre, Christophe

    1992-01-01

    A stochastic measure of sensitivity is developed in order to predict the effects of small random blade mistuning on the dynamic aeroelastic response of turbomachinery blade assemblies. This sensitivity measure is based solely on the nominal system design (i.e., on tuned system information), which makes it extremely easy and inexpensive to calculate. The measure has the potential to become a valuable design tool that will enable designers to evaluate mistuning effects at a preliminary design stage and thus assess the need for a full mistuned rotor analysis. The predictive capability of the sensitivity measure is illustrated by examining the effects of mistuning on the aeroelastic modes of the first stage of the oxidizer turbopump in the Space Shuttle Main Engine. Results from a full analysis of mistuned systems confirm that the simple stochastic sensitivity measure consistently predicts the drastic changes due to mistuning and the localization of aeroelastic vibration to a few blades.

  4. Study on Stationarity of Random Load Spectrum Based on the Special Road

    NASA Astrophysics Data System (ADS)

    Yan, Huawen; Zhang, Weigong; Wang, Dong

    2017-09-01

    Among special-road quality assessment methods, one uses a wheel force sensor; the essence of this method is to collect the load spectrum of the vehicle to reflect the quality of the road. According to the definition of a stochastic process, it is easy to see that the load spectrum is a stochastic process. However, the analysis methods and ranges of applicability of different random processes are very different, especially in engineering practice, which directly affects the design and development of the experiment. Therefore, determining the type of a random process has important practical significance. Based on an analysis of the digital characteristics of the road load spectrum, this paper determines that the road load spectrum in this experiment belongs to a stationary stochastic process, paving the way for subsequent modeling and feature extraction for the special road.
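
    A crude first screen for weak stationarity, in the spirit of the digital-characteristics check described above, is to compare first and second moments across segments of the record. The sketch below applies this to a synthetic load-like signal; the segment count, signal, and pass criterion are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(8)

def segment_stationarity(x, n_segments=10):
    """Spread of per-segment means and stds; small spreads suggest stationarity."""
    segments = np.array_split(np.asarray(x), n_segments)
    means = np.array([s.mean() for s in segments])
    stds = np.array([s.std() for s in segments])
    return np.ptp(means), np.ptp(stds)

# toy load-spectrum-like record: broadband noise around a constant level
load = 5.0 + rng.normal(0.0, 0.5, 50_000)
print("spread of segment means and stds:", segment_stationarity(load))
```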

  5. Bayesian analysis of stochastic volatility-in-mean model with leverage and asymmetrically heavy-tailed error using generalized hyperbolic skew Student's t-distribution.

    PubMed

    Leão, William L; Abanto-Valle, Carlos A; Chen, Ming-Hui

    2017-01-01

    A stochastic volatility-in-mean model with correlated errors using the generalized hyperbolic skew Student-t (GHST) distribution provides a robust alternative to the parameter estimation for daily stock returns in the absence of normality. An efficient Markov chain Monte Carlo (MCMC) sampling algorithm is developed for parameter estimation. The deviance information, the Bayesian predictive information and the log-predictive score criterion are used to assess the fit of the proposed model. The proposed method is applied to an analysis of the daily stock return data from the Standard & Poor's 500 index (S&P 500). The empirical results reveal that the stochastic volatility-in-mean model with correlated errors and the GHST distribution leads to a significant improvement in the goodness-of-fit for the S&P 500 index returns dataset over the usual normal model.

  6. Simulation of stochastic wind action on transmission power lines

    NASA Astrophysics Data System (ADS)

    Wielgos, Piotr; Lipecki, Tomasz; Flaga, Andrzej

    2018-01-01

    The paper presents an FEM analysis of the wind action on overhead transmission power lines. The wind action is based on a stochastic simulation of the wind field at several points of the structure and on wind tunnel tests of the aerodynamic coefficients of a single conductor consisting of three wires. In the FEM calculations, a section of the transmission power line composed of three spans is considered. Non-linear analysis under the dead weight of the structure is performed first to obtain the deformed shape of the conductors. Next, time-dependent wind forces are applied to the respective points of the conductors and a non-linear dynamic analysis is carried out.
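
    Stochastic wind-field simulation of this kind is often carried out with the spectral-representation method: superposing cosines with random phases whose amplitudes are set by a target spectrum. Below is a minimal single-point sketch assuming an illustrative von Karman-type spectrum; the constants are not from the paper, and no spatial coherence between points is modeled.

```python
import numpy as np

rng = np.random.default_rng(9)

def target_psd(f, sigma_u=2.0, L=100.0, U=20.0):
    """Illustrative von Karman-type one-sided spectrum of along-wind turbulence."""
    x = f * L / U
    return 4.0 * sigma_u**2 * (L / U) / (1.0 + 70.8 * x**2) ** (5.0 / 6.0)

# spectral-representation method: sum of cosines with random phases
T, dt = 600.0, 0.25
n = int(T / dt)
freqs = np.arange(1, n // 2 + 1) / T            # positive frequencies
amps = np.sqrt(2.0 * target_psd(freqs) / T)     # frequency spacing df = 1/T
phases = rng.uniform(0.0, 2.0 * np.pi, freqs.size)
t = np.arange(n) * dt
u = (amps[:, None] * np.cos(2.0 * np.pi * freqs[:, None] * t + phases[:, None])).sum(axis=0)
print("target sigma_u = 2.0, simulated std =", round(u.std(), 2))
```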

  7. 1/f Noise from nonlinear stochastic differential equations.

    PubMed

    Ruseckas, J; Kaulakys, B

    2010-03-01

    We consider a class of nonlinear stochastic differential equations giving power-law behavior of the power spectral density in any desirably wide range of frequency. Such equations were obtained starting from point process models of 1/f^β noise. In this article the power-law behavior of the spectrum is derived directly from the stochastic differential equations, without using the point process models. The analysis reveals that the power spectrum may be represented as a sum of Lorentzian spectra. Such a derivation provides additional justification of the equations, expands the class of equations generating 1/f^β noise, and provides further insight into the origin of 1/f^β noise.
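
    The paper's central observation, that the spectrum decomposes into a sum of Lorentzians, can be illustrated numerically without the SDE machinery: equal-variance Ornstein-Uhlenbeck processes with relaxation rates spread uniformly in log-space sum to an approximately 1/f spectrum. A sketch follows; the rates, sample rate, and fit band are illustrative choices, not the paper's equations.

```python
import numpy as np
from scipy.signal import lfilter, welch

rng = np.random.default_rng(10)

fs, n = 1000.0, 2**18
dt = 1.0 / fs
rates = np.logspace(-2, 2, 20)       # relaxation rates, log-uniformly spaced

# each OU process contributes a Lorentzian spectrum; equal variances plus
# log-spaced rates make the summed spectrum approximately 1/f in the band
x = np.zeros(n)
for gamma in rates:
    a = np.exp(-gamma * dt)
    s = np.sqrt(1.0 - a * a)         # unit stationary variance
    x += lfilter([s], [1.0, -a], rng.normal(size=n))

f, Pxx = welch(x, fs=fs, nperseg=8192)
band = (f > 0.2) & (f < 10.0)
slope = np.polyfit(np.log(f[band]), np.log(Pxx[band]), 1)[0]
print("estimated spectral slope ~", round(slope, 2))   # close to -1 for 1/f
```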

  8. Mean Field Games for Stochastic Growth with Relative Utility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Minyi, E-mail: mhuang@math.carleton.ca; Nguyen, Son Luu, E-mail: sonluu.nguyen@upr.edu

    This paper considers continuous time stochastic growth-consumption optimization in a mean field game setting. The individual capital stock evolution is determined by a Cobb–Douglas production function, consumption and stochastic depreciation. The individual utility functional combines an own utility and a relative utility with respect to the population. The use of the relative utility reflects human psychology, leading to a natural pattern of mean field interaction. The fixed point equation of the mean field game is derived with the aid of some ordinary differential equations. Due to the relative utility interaction, our performance analysis depends on some ratio based approximation error estimate.

  9. On the Boltzmann Equation with Stochastic Kinetic Transport: Global Existence of Renormalized Martingale Solutions

    NASA Astrophysics Data System (ADS)

    Punshon-Smith, Samuel; Smith, Scott

    2018-02-01

    This article studies the Cauchy problem for the Boltzmann equation with stochastic kinetic transport. Under a cut-off assumption on the collision kernel and a coloring hypothesis for the noise coefficients, we prove the global existence of renormalized (in the sense of DiPerna/Lions) martingale solutions to the Boltzmann equation for large initial data with finite mass, energy, and entropy. Our analysis includes a detailed study of weak martingale solutions to a class of linear stochastic kinetic equations. This study includes a criterion for renormalization, the weak closedness of the solution set, and tightness of velocity averages in L^1.

  10. Graph Theory-Based Pinning Synchronization of Stochastic Complex Dynamical Networks.

    PubMed

    Li, Xiao-Jian; Yang, Guang-Hong

    2017-02-01

    This paper is concerned with the adaptive pinning synchronization problem of stochastic complex dynamical networks (CDNs). Based on algebraic graph theory and Lyapunov theory, pinning controller design conditions are derived, and a rigorous convergence analysis of the synchronization errors in the probability sense is also conducted. Compared with existing results, the topology structures of the stochastic CDN are allowed to be unknown due to the use of graph theory. In particular, it is shown that the selection of nodes for pinning depends on the unknown lower bounds of the coupling strengths. Finally, an example involving a network of Chua's circuits is given to validate the effectiveness of the theoretical results.

  11. Structured Modeling and Analysis of Stochastic Epidemics with Immigration and Demographic Effects

    PubMed Central

    Baumann, Hendrik; Sandmann, Werner

    2016-01-01

    Stochastic epidemics with open populations of variable population sizes are considered where due to immigration and demographic effects the epidemic does not eventually die out forever. The underlying stochastic processes are ergodic multi-dimensional continuous-time Markov chains that possess unique equilibrium probability distributions. Modeling these epidemics as level-dependent quasi-birth-and-death processes enables efficient computations of the equilibrium distributions by matrix-analytic methods. Numerical examples for specific parameter sets are provided, which demonstrates that this approach is particularly well-suited for studying the impact of varying rates for immigration, births, deaths, infection, recovery from infection, and loss of immunity. PMID:27010993
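
    To see concretely why immigration keeps such an epidemic from dying out forever, consider a toy birth-death CTMC for the number of infected, where immigration prevents absorption at zero. The sketch below solves pi @ Q = 0 directly by least squares; the full paper instead exploits the level-dependent QBD structure with matrix-analytic methods, and all rates here are illustrative.

```python
import numpy as np

# birth-death CTMC for the number infected: immigration (nu), infection
# (beta*i), recovery/removal (mu*i), state space capped at N
N, nu, beta, mu = 50, 0.5, 0.3, 1.0
Q = np.zeros((N + 1, N + 1))
for i in range(N + 1):
    if i < N:
        Q[i, i + 1] = nu + beta * i   # immigration + new infections
    if i > 0:
        Q[i, i - 1] = mu * i          # recoveries/removals
    Q[i, i] = -Q[i].sum()

# equilibrium distribution: solve pi @ Q = 0 with sum(pi) = 1
A = np.vstack([Q.T, np.ones(N + 1)])
b = np.zeros(N + 2)
b[-1] = 1.0
pi = np.linalg.lstsq(A, b, rcond=None)[0]
print("mean number infected at equilibrium:", pi @ np.arange(N + 1))
```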

  12. Structured Modeling and Analysis of Stochastic Epidemics with Immigration and Demographic Effects.

    PubMed

    Baumann, Hendrik; Sandmann, Werner

    2016-01-01

    Stochastic epidemics with open populations of variable population sizes are considered where due to immigration and demographic effects the epidemic does not eventually die out forever. The underlying stochastic processes are ergodic multi-dimensional continuous-time Markov chains that possess unique equilibrium probability distributions. Modeling these epidemics as level-dependent quasi-birth-and-death processes enables efficient computations of the equilibrium distributions by matrix-analytic methods. Numerical examples for specific parameter sets are provided, which demonstrates that this approach is particularly well-suited for studying the impact of varying rates for immigration, births, deaths, infection, recovery from infection, and loss of immunity.

  13. Declining spatial efficiency of global cropland nitrogen allocation

    NASA Astrophysics Data System (ADS)

    Mueller, Nathaniel D.; Lassaletta, Luis; Runck, Bryan C.; Billen, Gilles; Garnier, Josette; Gerber, James S.

    2017-02-01

    Efficiently allocating nitrogen (N) across space maximizes crop productivity for a given amount of N input and reduces N losses to the environment. Here we quantify changes in the global spatial efficiency of cropland N use by calculating historical trade-off frontiers relating N inputs to possible N yield assuming efficient allocation. Time series cropland N budgets from 1961 to 2009 characterize the evolution of N input-yield response functions across 12 regions and are the basis for constructing trade-off frontiers. Improvements in agronomic technology have substantially increased cropping system yield potentials and expanded N-driven crop production possibilities. However, we find that these gains are compromised by the declining spatial efficiency of N use across regions. Since the start of the Green Revolution, N inputs and yields have moved farther from the optimal frontier over time; in recent years (1994-2009), global N surplus has grown to a value that is 69% greater than what is possible with efficient N allocation between regions. To reflect regional pollution and agricultural development goals, we construct scenarios that restrict reallocation, finding that these changes only slightly decrease potential gains in nitrogen use efficiency. Our results are inherently conservative due to the regional unit of analysis, meaning a larger potential exists than is quantified here for cross-scale policies to promote spatially efficient N use.
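
    The frontier construction can be illustrated with a toy allocation problem, assuming stylized Mitscherlich response curves whose parameters are invented for illustration: for a fixed global N input, the efficient allocation maximizes total yield across regions, and sweeping the total input traces out the input-yield frontier.

        import numpy as np
        from scipy.optimize import minimize

        # Assumed diminishing-returns response curves for three stylized regions:
        # yield_i(n) = ymax_i * (1 - exp(-c_i * n))
        ymax = np.array([8.0, 6.0, 4.0])     # attainable N yield per region (arbitrary units)
        c = np.array([0.030, 0.050, 0.015])  # agronomic efficiency parameters
        N_total = 150.0                      # total N input to allocate

        def neg_total_yield(n):
            return -np.sum(ymax * (1.0 - np.exp(-c * n)))

        res = minimize(neg_total_yield, x0=np.full(3, N_total / 3),
                       bounds=[(0.0, N_total)] * 3,
                       constraints={"type": "eq", "fun": lambda n: n.sum() - N_total})
        print("efficient allocation:", res.x.round(1))
        print("frontier yield:", -res.fun)
        # Sweeping N_total and re-solving traces the input-yield trade-off frontier;
        # observed (input, yield) points falling below it indicate spatial inefficiency.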

  14. The Lateral Somitic Frontier in Ontogeny and Phylogeny*

    PubMed Central

    SHEARMAN, REBECCA MARIE; BURKE, ANN CAMPBELL

    2010-01-01

    The vertebrate musculoskeletal system comprises the axial and appendicular systems. The postcranial axial system consists of the vertebrae, ribs and associated muscles, and the appendicular system comprises the muscles and skeleton of the paired appendages and their respective girdles. The morphology, proportions, and arrangements of these parts have undergone tremendous variation during vertebrate history. Despite this vertebrate diversity, the cells that form all of the key parts of the musculoskeletal system during development arise from two populations of embryonic mesoderm, the somites and somatic lateral plate. Nowicki et al. (2003. Mech Dev 120:227–240) identified two dynamic domains in the developing chick embryo. The primaxial domain is populated exclusively by cells from the somites. The abaxial domain includes muscle and bone that develop within lateral plate-derived connective tissue. The boundary between the two domains is the lateral somitic frontier. We hypothesize that the primaxial and abaxial domains are patterned independently and that morphological evolution of the musculoskeletal system is facilitated by partially independent developmental changes in the abaxial and primaxial domain. Here we present our hypothesis in detail and review recent experimental and comparative studies that use the concept of the lateral somitic frontier in the analysis of the evolution of the highly derived chelonian and limbless squamate body plans. PMID:19021255

  15. Bayesian Analysis of Non-Gaussian Long-Range Dependent Processes

    NASA Astrophysics Data System (ADS)

    Graves, Timothy; Watkins, Nicholas; Franzke, Christian; Gramacy, Robert

    2013-04-01

    Recent studies [e.g. the Antarctic study of Franzke, J. Climate, 2010] have strongly suggested that surface temperatures exhibit long-range dependence (LRD). The presence of LRD would hamper the identification of deterministic trends and the quantification of their significance. It is well established that LRD processes exhibit stochastic trends over rather long periods of time. Thus, accurate methods for discriminating between physical processes that possess long memory and those that do not are an important adjunct to climate modeling. As we briefly review, the LRD idea originated at the same time as H-self-similarity, so it is often not realised that a model does not have to be H-self-similar to show LRD [e.g. Watkins, GRL Frontiers, 2013]. We have used Markov Chain Monte Carlo algorithms to perform a Bayesian analysis of Auto-Regressive Fractionally Integrated Moving Average (ARFIMA(p,d,q)) processes, which are capable of modeling LRD. Our principal aim is to obtain inference about the long memory parameter, d, with secondary interest in the scale and location parameters. We have developed a reversible-jump method enabling us to integrate over different model forms for the short memory component. We initially assume Gaussianity, and have tested the method on both synthetic and physical time series. Many physical processes, for example the Faraday Antarctic time series, are significantly non-Gaussian. We have therefore extended this work by weakening the Gaussianity assumption, assuming an alpha-stable distribution for the innovations, and performing joint inference on d and alpha. Such a modified ARFIMA(p,d,q) process is a flexible, initial model for non-Gaussian processes with long memory. We will present a study of the dependence of the posterior variance of the memory parameter d on the length of the time series considered. This will be compared with equivalent error diagnostics for other measures of d.
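
    As a simple frequentist stand-in for the Bayesian machinery described above, the sketch below simulates an ARFIMA(0,d,0) series by fractional differencing and recovers d with the classical Geweke-Porter-Hudak (GPH) log-periodogram regression; the sample size and the value of d are arbitrary.

        import numpy as np

        rng = np.random.default_rng(0)
        n, d_true = 4096, 0.3

        # ARFIMA(0,d,0): x_t = (1-B)^(-d) eps_t via the binomial MA expansion,
        # with coefficients psi_j = psi_{j-1} * (j-1+d)/j and psi_0 = 1
        k = np.arange(1, n)
        psi = np.concatenate(([1.0], np.cumprod((k - 1 + d_true) / k)))
        x = np.convolve(rng.standard_normal(n), psi)[:n]

        # GPH: regress log periodogram on log(4 sin^2(w/2)) at low Fourier frequencies
        m = int(n ** 0.5)                         # number of frequencies used
        w = 2 * np.pi * np.arange(1, m + 1) / n
        I = np.abs(np.fft.fft(x)[1:m + 1]) ** 2 / (2 * np.pi * n)
        slope = np.polyfit(np.log(4 * np.sin(w / 2) ** 2), np.log(I), 1)[0]
        print("GPH estimate of d:", -slope)       # should be close to d_true = 0.3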

  16. Mineral Potential Mapping in a Frontier Region

    NASA Astrophysics Data System (ADS)

    Ford, A.

    2009-04-01

    Mineral potential mapping using Geographic Information Systems (GIS) allows for rapid evaluation of spatial geoscience data and can delineate areas that may be prospective for hosting mineral deposits. Popular methods for evaluating digital data include weights of evidence, fuzzy logic, and probabilistic neural networks. To date, such methods have mostly been applied to terrains that are well studied, well explored, and for which high-quality data are readily available. However, despite lacking protracted exploration histories and high-quality data, many frontier regions may have high potential for hosting world-class mineral deposits and may benefit from mineral potential mapping exercises. Sovereign risk factors can limit the scope of previous work in a frontier region: earlier research is often limited and/or inaccessible, publicly available literature and data can be restricted, and any available data may be unreliable. Mineral potential mapping using GIS in a frontier region therefore presents many challenges in terms of data availability (e.g., non-existent information, lack of digital data) and data quality (e.g., inaccuracy, incomplete coverage). The quality of the final mineral potential map is limited by the quality of the input data and, as such, is affected by both availability and quality. Such issues are not unique to frontier regions, but there they are often compounded by multiple weaknesses within the same dataset, which is uncommon in better-explored, data-rich areas. We show how mineral potential mapping can be successfully applied to frontier regions to delineate targets with high potential for hosting a mineral deposit despite these data challenges. Data are evaluated using the weights of evidence and fuzzy logic methods because of their effectiveness in dealing with incomplete geoscientific datasets. Weights of evidence may be employed as a data-driven method for indirectly evaluating data quality. In a frontier region, the quality of both the training data (mineral deposits) and the evidential layers (geological features) may be questionable. Statistical measures can be used to verify whether the data exhibit logical inconsistencies, which may result from inaccurate training data or inaccurate data in an evidential layer. Expert geological knowledge may be used to exclude, refine, or modify such datasets for further analysis in an iterative weights of evidence process. After verification of the datasets using weights of evidence, fuzzy logic can be used to prepare a mineral potential map based on expert geological knowledge. Fuzzy logic is suited to new areas where data availability may be poor: it allows a geologist to select the evidential layers believed to be most critical for the particular ore deposit style under investigation, as specific deposit models for the area may not yet exist, and these critical layers can then be weighted based on expert opinion. The results of the mineral potential mapping can be verified by their ability to predict known ore deposits within the study area.
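
    A minimal weights-of-evidence calculation for a single binary evidential layer, with hypothetical cell counts (all numbers invented): the weights W+ and W- and their contrast C quantify the spatial association between the layer and the training deposits, and anomalous values flag a layer for expert review.

        import math

        # Hypothetical counts over a gridded study area (unit cells)
        N = 10_000      # total cells
        N_D = 60        # cells containing a known deposit (training data)
        N_B = 1_500     # cells where the binary evidence layer is present
        N_BD = 40       # deposit cells coinciding with the evidence layer

        # Conditional probabilities of the evidence given deposit / non-deposit
        p_B_D = N_BD / N_D
        p_B_nD = (N_B - N_BD) / (N - N_D)

        W_plus = math.log(p_B_D / p_B_nD)                    # weight where evidence present
        W_minus = math.log((1 - p_B_D) / (1 - p_B_nD))       # weight where evidence absent
        contrast = W_plus - W_minus                          # overall spatial association

        print(f"W+ = {W_plus:.2f}, W- = {W_minus:.2f}, C = {contrast:.2f}")
        # A near-zero or negative contrast for a layer expected to be prospective
        # signals a possible data-quality problem, prompting the iterative review
        # step described above.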

  17. Exact protein distributions for stochastic models of gene expression using partitioning of Poisson processes.

    PubMed

    Pendar, Hodjat; Platini, Thierry; Kulkarni, Rahul V

    2013-04-01

    Stochasticity in gene expression gives rise to fluctuations in protein levels across a population of genetically identical cells. Such fluctuations can lead to phenotypic variation in clonal populations; hence, there is considerable interest in quantifying noise in gene expression using stochastic models. However, obtaining exact analytical results for protein distributions has been an intractable task for all but the simplest models. Here, we invoke the partitioning property of Poisson processes to develop a mapping that significantly simplifies the analysis of stochastic models of gene expression. The mapping leads to exact protein distributions using results for mRNA distributions in models with promoter-based regulation. Using this approach, we derive exact analytical results for steady-state and time-dependent distributions for the basic two-stage model of gene expression. Furthermore, we show how the mapping leads to exact protein distributions for extensions of the basic model that include the effects of posttranscriptional and posttranslational regulation. The approach developed in this work is widely applicable and can contribute to a quantitative understanding of stochasticity in gene expression and its regulation.

  18. Exact protein distributions for stochastic models of gene expression using partitioning of Poisson processes

    NASA Astrophysics Data System (ADS)

    Pendar, Hodjat; Platini, Thierry; Kulkarni, Rahul V.

    2013-04-01

    Stochasticity in gene expression gives rise to fluctuations in protein levels across a population of genetically identical cells. Such fluctuations can lead to phenotypic variation in clonal populations; hence, there is considerable interest in quantifying noise in gene expression using stochastic models. However, obtaining exact analytical results for protein distributions has been an intractable task for all but the simplest models. Here, we invoke the partitioning property of Poisson processes to develop a mapping that significantly simplifies the analysis of stochastic models of gene expression. The mapping leads to exact protein distributions using results for mRNA distributions in models with promoter-based regulation. Using this approach, we derive exact analytical results for steady-state and time-dependent distributions for the basic two-stage model of gene expression. Furthermore, we show how the mapping leads to exact protein distributions for extensions of the basic model that include the effects of posttranscriptional and posttranslational regulation. The approach developed in this work is widely applicable and can contribute to a quantitative understanding of stochasticity in gene expression and its regulation.
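
    A Gillespie simulation of the basic two-stage model makes the result concrete; the rates below are illustrative. In the bursty limit (protein lifetime much longer than mRNA lifetime), the steady-state protein distribution approaches the negative binomial form that the partitioning mapping derives exactly.

        import numpy as np

        rng = np.random.default_rng(1)
        # Two-stage model (illustrative rates): mRNA and protein birth/death
        k_m, g_m, k_p, g_p = 2.0, 1.0, 10.0, 0.1

        def gillespie(T=100.0):
            t, m, p = 0.0, 0, 0
            while True:
                rates = np.array([k_m, g_m * m, k_p * m, g_p * p])
                total = rates.sum()
                t += rng.exponential(1.0 / total)
                if t > T:
                    return p
                r = rng.choice(4, p=rates / total)
                if r == 0:   m += 1   # transcription
                elif r == 1: m -= 1   # mRNA decay
                elif r == 2: p += 1   # translation
                else:        p -= 1   # protein decay

        samples = np.array([gillespie() for _ in range(300)])
        b = k_p / g_m   # mean burst size per mRNA
        print("mean protein: %.1f (theory %.1f)" % (samples.mean(), k_m * b / g_p))
        print("Fano factor: %.1f (approx. 1 + b = %.1f in the bursty limit)"
              % (samples.var() / samples.mean(), 1 + b))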

  19. Compressible cavitation with stochastic field method

    NASA Astrophysics Data System (ADS)

    Class, Andreas; Dumond, Julien

    2012-11-01

    Non-linear phenomena can often be well described using probability density functions (pdf) and pdf transport models. Traditionally, the simulation of pdf transport has required Monte Carlo codes based on Lagrange particles or prescribed pdf assumptions, including binning techniques. Recently, in the field of combustion, a novel formulation called the stochastic field method, which solves pdf transport using Euler fields, has been proposed; it eliminates the need to mix Euler and Lagrange techniques or to prescribe pdf assumptions. In the present work, carried out as part of the PhD project "Design and Analysis of a Passive Outflow Reducer Relying on Cavitation", a first application of the stochastic field method to multi-phase flow, and in particular to cavitating flow, is presented. The application considered is a nozzle subjected to high-velocity flow, such that sheet cavitation is observed near the nozzle surface in the divergent section. It is demonstrated that the stochastic field formulation captures the wide range of pdf shapes present at different locations. The method is compatible with finite-volume codes, and all existing physical models available for Lagrange techniques, presumed pdf, or binning methods can easily be extended to the stochastic field formulation.
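
    A minimal 1-D sketch of the Eulerian stochastic field idea in the spirit of Valiño-type formulations (not the authors' cavitation solver; micromixing is omitted and all parameters are invented): each of N smooth fields is advanced with the deterministic transport terms plus a field-wise Wiener increment, and the ensemble of field values at a point samples the local pdf.

        import numpy as np

        rng = np.random.default_rng(2)
        nx, L, U, Gamma = 200, 1.0, 1.0, 1e-3        # grid, domain, velocity, diffusivity
        dx = L / nx
        dt = 0.2 * min(dx / U, dx**2 / (2 * Gamma))  # conservative explicit time step
        x = np.linspace(0.0, L, nx)

        Nf = 32                                       # number of stochastic fields
        phi = np.tile(np.where(x < 0.3, 1.0, 0.0), (Nf, 1))  # step initial scalar

        def ddx(f):    # first derivative (upwind for U > 0, periodic)
            return (f - np.roll(f, 1, axis=1)) / dx

        def d2dx2(f):  # second derivative (central, periodic)
            return (np.roll(f, -1, axis=1) - 2 * f + np.roll(f, 1, axis=1)) / dx**2

        for _ in range(400):
            dW = rng.standard_normal(Nf)[:, None] * np.sqrt(dt)  # one increment per field
            phi = (phi
                   - U * ddx(phi) * dt                  # mean convection
                   + Gamma * d2dx2(phi) * dt            # diffusion
                   + np.sqrt(2 * Gamma) * ddx(phi) * dW)  # stochastic transport term

        # The Nf field values at each grid point sample the local scalar pdf
        print("pdf sample at mid-domain:", np.round(phi[:, nx // 2], 2))

    This is a sketch, not a production scheme; a cavitation application would add the multi-phase source terms and a micromixing model on top of the same field structure.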

  20. Characterizing the dynamics of rubella relative to measles: the role of stochasticity

    PubMed Central

    Rozhnova, Ganna; Metcalf, C. Jessica E.; Grenfell, Bryan T.

    2013-01-01

    Rubella is a completely immunizing and mild infection in children. Understanding its behaviour is of considerable public health importance because of congenital rubella syndrome, which results from infection with rubella during early pregnancy and may entail a variety of birth defects. The recurrent dynamics of rubella are relatively poorly resolved, and appear to show considerable diversity globally. Here, we investigate the behaviour of a stochastic seasonally forced susceptible–infected–recovered model to characterize the determinants of these dynamics and illustrate patterns by comparison with measles. We perform a systematic analysis of spectra of stochastic fluctuations around stable attractors of the corresponding deterministic model and compare them with spectra from full stochastic simulations in large populations. This approach allows us to quantify the effects of demographic stochasticity and to give a coherent picture of measles and rubella dynamics, explaining essential differences in the recurrent patterns exhibited by these diseases. We discuss the implications of our findings in the context of vaccination and changing birth rates as well as the persistence of these two childhood infections. PMID:24026472
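
    A sketch of the simulation side of such an analysis, assuming a seasonally forced SIR model with demography advanced by tau-leaping (the parameters are generic, not the fitted measles/rubella values of the paper): the power spectrum of the infected time series reveals the dominant interepidemic period.

        import numpy as np

        rng = np.random.default_rng(3)
        N, mu = 5e6, 1 / 50.0                  # population, birth/death rate (per year)
        beta0, beta1 = 500.0, 0.08             # mean and amplitude of seasonal contact rate
        gamma = 365 / 13.0                     # recovery rate (13-day infectious period)
        dt, years = 1 / 365.0, 100

        S, I = int(0.05 * N), int(1e-4 * N)
        series = []
        for step in range(int(years / dt)):
            t = step * dt
            beta = beta0 * (1 + beta1 * np.cos(2 * np.pi * t))
            # Tau-leaping: Poisson event counts over (t, t + dt]
            births = rng.poisson(mu * N * dt)
            inf = rng.poisson(beta * S * I / N * dt)
            rec = rng.poisson(gamma * I * dt)
            S = max(S + births - inf - rng.poisson(mu * S * dt), 0)
            I = max(I + inf - rec - rng.poisson(mu * I * dt), 0)
            series.append(I)

        # Power spectrum of fluctuations around the mean (frequency in 1/years)
        f = np.fft.rfftfreq(len(series), d=dt)
        P = np.abs(np.fft.rfft(np.array(series) - np.mean(series))) ** 2
        print("dominant period (years):", 1 / f[1:][np.argmax(P[1:])])

    With small populations the chain can fizzle out in an epidemic trough, which is itself part of the persistence question the abstract raises.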

  1. Analysis of stochastic model for non-linear volcanic dynamics

    NASA Astrophysics Data System (ADS)

    Alexandrov, D.; Bashkirtseva, I.; Ryashko, L.

    2014-12-01

    Motivated by important geophysical applications, we consider a dynamic model of the magma-plug system previously derived by Iverson et al. (2006) under the influence of stochastic forcing. Due to the strong nonlinearity of the friction force for the solid plug along its margins, the initial deterministic system exhibits impulsive oscillations. Two types of dynamic behavior of the system under parametric stochastic forcing have been found: random trajectories are either scattered on both sides of the deterministic cycle or grouped on its internal side only. It is shown that, in the presence of noise, dispersions are highly inhomogeneous along cycles. The effects of noise-induced shifts, pressure stabilization, and localization of random trajectories are revealed as the noise intensity increases. The plug velocity, pressure, and displacement are also highly dependent on the noise intensity. These new stochastic phenomena are related to the nonlinear peculiarities of the deterministic phase portrait. It is demonstrated that the repetitive stick-slip motions of the magma-plug system under stochastic forcing can be connected with drumbeat earthquakes.

  2. Efficient analysis of stochastic gene dynamics in the non-adiabatic regime using piecewise deterministic Markov processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, Yen Ting; Buchler, Nicolas E.

    Single-cell experiments show that gene expression is stochastic and bursty, a feature that can emerge from slow switching between promoter states with different activities. In addition to slow chromatin and/or DNA looping dynamics, one source of long-lived promoter states is the slow binding and unbinding kinetics of transcription factors to promoters, i.e. the non-adiabatic binding regime. Here, we introduce a simple analytical framework, known as a piecewise deterministic Markov process (PDMP), that accurately describes the stochastic dynamics of gene expression in the non-adiabatic regime. We illustrate the utility of the PDMP on a non-trivial dynamical system by analysing the properties of a titration-based oscillator in the non-adiabatic limit. We first show how to transform the underlying chemical master equation into a PDMP where the slow transitions between promoter states are stochastic, but whose rates depend upon the faster deterministic dynamics of the transcription factors regulated by these promoters. We show that the PDMP accurately describes the observed periods of stochastic cycles in activator and repressor-based titration oscillators. We then generalize our PDMP analysis to more complicated versions of titration-based oscillators to explain how multiple binding sites lengthen the period and improve coherence. Finally, we show how noise-induced oscillation previously observed in a titration-based oscillator arises from non-adiabatic and discrete binding events at the promoter site.

  3. Stochastic Geometric Network Models for Groups of Functional and Structural Connectomes

    PubMed Central

    Friedman, Eric J.; Landsberg, Adam S.; Owen, Julia P.; Li, Yi-Ou; Mukherjee, Pratik

    2014-01-01

    Structural and functional connectomes are emerging as important instruments in the study of normal brain function and in the development of new biomarkers for a variety of brain disorders. In contrast to single-network studies that presently dominate the (non-connectome) network literature, connectome analyses typically examine groups of empirical networks and then compare these against standard (stochastic) network models. Current practice in connectome studies is to employ stochastic network models derived from social science and engineering contexts as the basis for the comparison. However, these are not necessarily best suited for the analysis of connectomes, which often contain groups of very closely related networks, such as occurs with a set of controls or a set of patients with a specific disorder. This paper studies important extensions of standard stochastic models that make them better adapted for analysis of connectomes, and develops new statistical fitting methodologies that account for inter-subject variations. The extensions explicitly incorporate geometric information about a network based on distances and inter/intra hemispherical asymmetries (to supplement ordinary degree-distribution information), and utilize a stochastic choice of networks' density levels (for fixed threshold networks) to better capture the variance in average connectivity among subjects. The new statistical tools introduced here allow one to compare groups of networks by matching both their average characteristics and the variations among them. A notable finding is that connectomes have high “smallworldness” beyond that arising from geometric and degree considerations alone. PMID:25067815

  4. Optimization Testbed Cometboards Extended into Stochastic Domain

    NASA Technical Reports Server (NTRS)

    Patnaik, Surya N.; Pai, Shantaram S.; Coroneos, Rula M.

    2010-01-01

    COMparative Evaluation Testbed of Optimization and Analysis Routines for the Design of Structures (CometBoards) is a multidisciplinary design optimization software tool. It was originally developed for deterministic calculation and has now been extended into the stochastic domain for structural design problems. For deterministic problems, CometBoards is introduced through its subproblem solution strategy as well as the approximation concept in optimization. In the stochastic domain, a design is formulated as a function of the risk or reliability. The optimum solution, including the weight of the structure, is also obtained as a function of reliability. Weight versus reliability traces out an inverted-S-shaped graph. The center of the graph corresponds to 50 percent probability of success, or one failure in two samples. A heavy design with weight approaching infinity could be produced for a near-zero rate of failure, corresponding to a reliability of unity. Weight can be reduced to a small value for the most failure-prone design, with a compromised reliability approaching zero. The stochastic design optimization (SDO) capability for an industrial problem was obtained by combining three codes: the MSC/Nastran code was the deterministic analysis tool, the fast probabilistic integrator (the FPI module of the NESSUS software) was the probabilistic calculator, and CometBoards became the optimizer. The SDO capability requires a finite element structural model, a material model, a load model, and a design model. The stochastic optimization concept is illustrated with an academic example and a real-life airframe component made of metallic and composite materials.

  5. Efficient analysis of stochastic gene dynamics in the non-adiabatic regime using piecewise deterministic Markov processes

    DOE PAGES

    Lin, Yen Ting; Buchler, Nicolas E.

    2018-01-31

    Single-cell experiments show that gene expression is stochastic and bursty, a feature that can emerge from slow switching between promoter states with different activities. In addition to slow chromatin and/or DNA looping dynamics, one source of long-lived promoter states is the slow binding and unbinding kinetics of transcription factors to promoters, i.e. the non-adiabatic binding regime. Here, we introduce a simple analytical framework, known as a piecewise deterministic Markov process (PDMP), that accurately describes the stochastic dynamics of gene expression in the non-adiabatic regime. We illustrate the utility of the PDMP on a non-trivial dynamical system by analysing the properties of a titration-based oscillator in the non-adiabatic limit. We first show how to transform the underlying chemical master equation into a PDMP where the slow transitions between promoter states are stochastic, but whose rates depend upon the faster deterministic dynamics of the transcription factors regulated by these promoters. We show that the PDMP accurately describes the observed periods of stochastic cycles in activator and repressor-based titration oscillators. We then generalize our PDMP analysis to more complicated versions of titration-based oscillators to explain how multiple binding sites lengthen the period and improve coherence. Finally, we show how noise-induced oscillation previously observed in a titration-based oscillator arises from non-adiabatic and discrete binding events at the promoter site.
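
    A minimal PDMP simulator for a self-activating gene, assuming illustrative rates rather than the authors' titration oscillator: the promoter jumps between OFF and ON with a protein-dependent intensity, while the protein level follows a deterministic ODE between jumps; the jump time is drawn by integrating the state-dependent hazard until it reaches an exponential threshold.

        import numpy as np

        rng = np.random.default_rng(4)
        # Illustrative self-activating gene with slow (non-adiabatic) promoter kinetics
        k_on, k_off = 0.1, 0.05               # binding intensity per unit protein; unbinding
        beta0, beta1, gamma = 0.5, 5.0, 1.0   # basal/active synthesis, degradation

        def simulate(T=500.0, dt=0.01):
            t, s, x = 0.0, 0, 0.5             # s: promoter (0 OFF, 1 ON); x: protein level
            u, acc = rng.exponential(1.0), 0.0  # jump fires when hazard integral reaches u
            states, levels = [], []
            while t < T:
                x += (beta0 + beta1 * s - gamma * x) * dt   # deterministic flow
                lam = k_on * x if s == 0 else k_off          # state-dependent jump intensity
                acc += lam * dt
                if acc >= u:                                 # promoter switches
                    s = 1 - s
                    u, acc = rng.exponential(1.0), 0.0
                t += dt
                states.append(s); levels.append(x)
            return np.array(states), np.array(levels)

        s, x = simulate()
        print("fraction of time ON: %.2f | mean protein: %.2f" % (s.mean(), x.mean()))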

  6. A stochastic visco-hyperelastic model of human placenta tissue for finite element crash simulations.

    PubMed

    Hu, Jingwen; Klinich, Kathleen D; Miller, Carl S; Rupp, Jonathan D; Nazmi, Giseli; Pearlman, Mark D; Schneider, Lawrence W

    2011-03-01

    Placental abruption is the most common cause of fetal deaths in motor-vehicle crashes, but studies on the mechanical properties of human placenta are rare. This study presents a new method of developing a stochastic visco-hyperelastic material model of human placenta tissue using a combination of uniaxial tensile testing, specimen-specific finite element (FE) modeling, and stochastic optimization techniques. In our previous study, uniaxial tensile tests of 21 placenta specimens were performed using a strain rate of 12/s. In this study, additional uniaxial tensile tests were performed using strain rates of 1/s and 0.1/s on 25 placenta specimens. Response corridors for the three loading rates were developed based on the normalized data achieved by test reconstructions of each specimen using specimen-specific FE models. Material parameters of a visco-hyperelastic model and their associated standard deviations were tuned to match both the means and standard deviations of all three response corridors using a stochastic optimization method. The results show very good agreement between the tested and simulated response corridors, indicating that stochastic analysis can improve the estimation of variability in material model parameters. The proposed method can be applied to develop stochastic material models of other biological soft tissues.

  7. Weak Galilean invariance as a selection principle for coarse-grained diffusive models.

    PubMed

    Cairoli, Andrea; Klages, Rainer; Baule, Adrian

    2018-05-29

    How does the mathematical description of a system change in different reference frames? Galilei first addressed this fundamental question by formulating the famous principle of Galilean invariance. It prescribes that the equations of motion of closed systems remain the same in different inertial frames related by Galilean transformations, thus imposing strong constraints on the dynamical rules. However, real world systems are often described by coarse-grained models integrating complex internal and external interactions indistinguishably as friction and stochastic forces. Since Galilean invariance is then violated, there is seemingly no alternative principle to assess a priori the physical consistency of a given stochastic model in different inertial frames. Here, starting from the Kac-Zwanzig Hamiltonian model generating Brownian motion, we show how Galilean invariance is broken during the coarse-graining procedure when deriving stochastic equations. Our analysis leads to a set of rules characterizing systems in different inertial frames that have to be satisfied by general stochastic models, which we call "weak Galilean invariance." Several well-known stochastic processes are invariant in these terms, except the continuous-time random walk for which we derive the correct invariant description. Our results are particularly relevant for the modeling of biological systems, as they provide a theoretical principle to select physically consistent stochastic models before a validation against experimental data.

  8. Pioneers on the Astrosociological Frontier: Introduction to the First Symposium on Astrosociology

    NASA Astrophysics Data System (ADS)

    Pass, Jim

    2009-03-01

    Astrosociology is a relatively new multidisciplinary field that scientifically investigates astrosocial phenomena (i.e., social, cultural, and behavioral patterns related to space exploration and related issues). The "astrosociological frontier" represents an analogous framework to that of space as the "final frontier," as both territories are quite empty of human activity and ripe for exploration. This focus on the astrosociological frontier provides insights about the need for a social-scientific field to place the human dimension in its proper place alongside familiar space community concerns such as engineering. The astrosociological frontier refers to the lack of development of astrosociology as a scientific field—or anything like it earlier during the space age. It includes both the 1) unoccupied "landscape" in academia characterized by the lack of astrosociology in its curricula and 2) dearth of space research focused on social-scientific (i.e., astrosociological) topics both inside and outside of traditional academia in collaboration with traditional space community members and the new space entrepreneurs. Within academia, the "frontier" is characterized by a lack of courses, programs, and departments dedicated to astrosociology. In the future, proponents of this new field expect the astrosociological frontier to become characterized by a growing number of "settlements" in curricula across the country and world. As things stand, however, the early "astrosociological pioneers" include those who seek to explore these underappreciated issues within academic and professional climates that discourage them from pursuing their interests. Thus, the "1st Symposium on Astrosociology" at the 2009 SPESIF conference represents an important expedition consisting of pioneering participants willing to venture into a little-explored territory with the goal of developing astrosociology.

  9. R-U policy frontiers for health data de-identification

    PubMed Central

    Heatherly, Raymond; Ding, Xiaofeng; Li, Jiuyong; Malin, Bradley A

    2015-01-01

    Objective: The Health Insurance Portability and Accountability Act Privacy Rule enables healthcare organizations to share de-identified data via two routes. They can either 1) show that re-identification risk is small (e.g., via a formal model, such as k-anonymity) with respect to an anticipated recipient, or 2) apply a rule-based policy (i.e., Safe Harbor) that enumerates attributes to be altered (e.g., dates to years). The latter is often invoked because it is interpretable, but it fails to tailor protections to the capabilities of the recipient. This paper shows that rule-based policies can be mapped to a utility (U) and re-identification risk (R) space, which can be searched for a collection, or frontier, of policies that systematically trade off between these goals. Methods: We extend an algorithm to efficiently compose an R-U frontier using a lattice of policy options. Risk is proportional to the number of patients to which a record corresponds, while utility is proportional to the similarity of the original and de-identified distributions. We allow our method to search 20,000 rule-based policies (out of 2700) and compare the resulting frontier with k-anonymous solutions and Safe Harbor using the demographics of 10 U.S. states. Results: The results demonstrate that the rule-based frontier 1) consists, on average, of 5000 policies, 2% of which enable better utility with less risk than Safe Harbor, and 2) covers a broader spectrum of utility and risk than k-anonymity frontiers. Conclusions: R-U frontiers of de-identification policies can be discovered efficiently, allowing healthcare organizations to tailor protections to the anticipated needs and trustworthiness of recipients. PMID:25911674
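
    A toy version of the R-U mapping, with an invented two-attribute policy lattice, an average re-identification-risk measure, and a crude retained-precision utility proxy standing in for the paper's measures: every policy is scored, and the non-dominated set forms the frontier.

        import itertools
        from collections import Counter

        # Toy records: (age, ZIP) pairs standing in for patient demographics
        records = [(34, "37203"), (35, "37203"), (34, "37212"), (52, "37212"),
                   (53, "37215"), (52, "37215"), (34, "37203"), (53, "37212")]

        # Rule lattice: age kept exact / binned to decades; ZIP kept / truncated
        age_rules = {0: lambda a: a, 1: lambda a: 10 * (a // 10)}
        zip_rules = {0: lambda z: z, 1: lambda z: z[:3], 2: lambda z: z[:1]}

        def evaluate(ar, zr):
            gen = [(age_rules[ar](a), zip_rules[zr](z)) for a, z in records]
            sizes = Counter(gen)
            risk = sum(1.0 / sizes[g] for g in gen) / len(gen)  # avg group-size risk
            # Crude utility proxy: fraction of attribute precision retained
            utility = 0.5 * (1 - ar) + 0.5 * (len(zip_rules) - 1 - zr) / (len(zip_rules) - 1)
            return risk, utility

        points = {(ar, zr): evaluate(ar, zr)
                  for ar, zr in itertools.product(age_rules, zip_rules)}

        # Pareto frontier: no other policy has both lower risk and higher utility
        frontier = [p for p, (r, u) in points.items()
                    if not any(r2 <= r and u2 >= u and (r2, u2) != (r, u)
                               for r2, u2 in points.values())]
        print("R-U frontier policies:", sorted(frontier))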

  10. The response analysis of fractional-order stochastic system via generalized cell mapping method.

    PubMed

    Wang, Liang; Xue, Lili; Sun, Chunyan; Yue, Xiaole; Xu, Wei

    2018-01-01

    This paper is concerned with the response of a fractional-order stochastic system. The short memory principle is introduced to ensure that the response of the system is a Markov process. The generalized cell mapping method is applied to display the global dynamics of the noise-free system, such as attractors, basins of attraction, basin boundary, saddle, and invariant manifolds. The stochastic generalized cell mapping method is employed to obtain the evolutionary process of probability density functions of the response. The fractional-order φ⁶ oscillator and the fractional-order smooth and discontinuous oscillator are taken as examples to give the implementations of our strategies. Studies have shown that the evolutionary direction of the probability density function of the fractional-order stochastic system is consistent with the unstable manifold. The effectiveness of the method is confirmed using Monte Carlo results.

  11. Persistence and ergodicity of a stochastic single species model with Allee effect under regime switching

    NASA Astrophysics Data System (ADS)

    Yu, Xingwang; Yuan, Sanling; Zhang, Tonghua

    2018-06-01

    The Allee effect can interact with environmental stochasticity and is active when population numbers are small. The goal of this paper is to investigate such effects on population dynamics. More precisely, we develop and investigate a stochastic single-species model with an Allee effect under regime switching. We first prove the existence of a global positive solution of the model. Then, we perform a survival analysis to seek sufficient conditions for extinction, non-persistence in mean, persistence in mean, and stochastic permanence. By constructing a suitable Lyapunov function, we show that the model is positive recurrent and ergodic. Our results indicate that regime switching can suppress the extinction of the species. Finally, numerical simulations are carried out to illustrate the theoretical results, and a real-life example shows that including the Allee effect in the model provides a better match to the data.
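
    An Euler-Maruyama sketch of a generic stochastic logistic model with a strong Allee threshold and two-state Markov regime switching (the drift, rates, and noise levels are invented, not the authors' exact model):

        import numpy as np

        rng = np.random.default_rng(5)
        # Two environmental regimes with different growth rates and noise levels
        r = {0: 0.8, 1: 0.3}        # intrinsic growth rate per regime
        sigma = {0: 0.10, 1: 0.25}  # environmental noise intensity per regime
        q01, q10 = 0.5, 0.5         # Markov switching rates between regimes
        K, A = 10.0, 1.0            # carrying capacity and Allee threshold

        dt, T = 0.01, 200.0
        x, s, t, traj = 3.0, 0, 0.0, []
        while t < T:
            # Regime switching (small-dt approximation of the Markov chain)
            if rng.random() < (q01 if s == 0 else q10) * dt:
                s = 1 - s
            # Logistic growth with a strong Allee threshold at A
            drift = r[s] * x * (1 - x / K) * (x / A - 1)
            x += drift * dt + sigma[s] * x * np.sqrt(dt) * rng.standard_normal()
            x = max(x, 0.0)
            t += dt
            traj.append(x)

        traj = np.array(traj)
        print("time-average population: %.2f | extinct: %s" % (traj.mean(), traj[-1] == 0.0))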

  12. MOLNs: A Cloud Platform for Interactive, Reproducible, and Scalable Spatial Stochastic Computational Experiments in Systems Biology Using PyURDME.

    PubMed

    Drawert, Brian; Trogdon, Michael; Toor, Salman; Petzold, Linda; Hellander, Andreas

    2016-01-01

    Computational experiments using spatial stochastic simulations have led to important new biological insights, but they require specialized tools and a complex software stack, as well as large and scalable compute and data analysis resources due to the large computational cost associated with Monte Carlo computational workflows. The complexity of setting up and managing a large-scale distributed computation environment to support productive and reproducible modeling can be prohibitive for practitioners in systems biology. This results in a barrier to the adoption of spatial stochastic simulation tools, effectively limiting the type of biological questions addressed by quantitative modeling. In this paper, we present PyURDME, a new, user-friendly spatial modeling and simulation package, and MOLNs, a cloud computing appliance for distributed simulation of stochastic reaction-diffusion models. MOLNs is based on IPython and provides an interactive programming platform for development of sharable and reproducible distributed parallel computational experiments.

  13. Stochastic determination of matrix determinants

    NASA Astrophysics Data System (ADS)

    Dorn, Sebastian; Enßlin, Torsten A.

    2015-07-01

    Matrix determinants play an important role in data analysis, in particular when Gaussian processes are involved. Due to currently exploding data volumes, linear operations—matrices—acting on the data are often not accessible directly but are only represented indirectly in form of a computer routine. Such a routine implements the transformation a data vector undergoes under matrix multiplication. While efficient probing routines to estimate a matrix's diagonal or trace, based solely on such computationally affordable matrix-vector multiplications, are well known and frequently used in signal inference, there is no stochastic estimate for its determinant. We introduce a probing method for the logarithm of a determinant of a linear operator. Our method rests upon a reformulation of the log-determinant by an integral representation and the transformation of the involved terms into stochastic expressions. This stochastic determinant determination enables large-size applications in Bayesian inference, in particular evidence calculations, model comparison, and posterior determination.

  14. Stochastic determination of matrix determinants.

    PubMed

    Dorn, Sebastian; Ensslin, Torsten A

    2015-07-01

    Matrix determinants play an important role in data analysis, in particular when Gaussian processes are involved. Due to currently exploding data volumes, linear operations (matrices) acting on the data are often not accessible directly but are only represented indirectly in form of a computer routine. Such a routine implements the transformation a data vector undergoes under matrix multiplication. While efficient probing routines to estimate a matrix's diagonal or trace, based solely on such computationally affordable matrix-vector multiplications, are well known and frequently used in signal inference, there is no stochastic estimate for its determinant. We introduce a probing method for the logarithm of a determinant of a linear operator. Our method rests upon a reformulation of the log-determinant by an integral representation and the transformation of the involved terms into stochastic expressions. This stochastic determinant determination enables large-size applications in Bayesian inference, in particular evidence calculations, model comparison, and posterior determination.
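
    A minimal sketch of the trace-probing idea behind such estimators, using log det A = tr(log A) and Rademacher probes; for compactness the dense matrix logarithm is formed explicitly here, whereas the paper's integral representation is what avoids that step for implicitly defined operators.

        import numpy as np
        from scipy.linalg import logm

        rng = np.random.default_rng(6)
        # A symmetric positive definite test operator
        n = 200
        B = rng.standard_normal((n, n))
        A = B @ B.T / n + np.eye(n)

        # log det(A) = tr(log A); estimate the trace stochastically with
        # Rademacher probes z: E[z^T M z] = tr(M) since E[z z^T] = I.
        logA = logm(A)   # stands in for matrix-vector probing of log(A)
        probes = rng.choice([-1.0, 1.0], size=(100, n))
        est = np.mean([z @ logA @ z for z in probes])

        _, exact = np.linalg.slogdet(A)
        print("stochastic estimate: %.2f | exact log det: %.2f" % (est, exact))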

  15. Computational Study of Breathing-type Processes in Driven, Confined, Granular Alignments

    DTIC Science & Technology

    2012-04-17

    [Presentation list: the talk "Newton's cradle, Fermi, Pasta, Ulam chain & the nonlinear many-body frontier" was delivered in 2011 at venues including a Government of India seminar (June 29, 2011), the Indian Institute of Science, Bangalore (June 30, 2011), and a SUNY Buffalo physics department colloquium (January 20, 2011).]

  16. Discovery of a Supernova in HST imaging of the MACSJ0717 Frontier Field

    NASA Astrophysics Data System (ADS)

    Rodney, Steven A.; Lotz, Jennifer; Strolger, Louis-Gregory

    2013-10-01

    We report the discovery of a supernova (SN) in Hubble Space Telescope (HST) observations centered on the galaxy cluster MACSJ0717. It was discovered in the F814W (i) band of the Advanced Camera for Surveys (ACS), in observations that were collected as part of the ongoing HST Frontier Fields (HFF) program (PI: J. Lotz, HST PID 13498). The FrontierSN ID for this object is SN HFF13Zar (nicknamed "SN Zara").

  17. Model selection for integrated pest management with stochasticity.

    PubMed

    Akman, Olcay; Comar, Timothy D; Hrozencik, Daniel

    2018-04-07

    In Song and Xiang (2006), an integrated pest management model with periodically varying climatic conditions was introduced. In order to address a wider range of environmental effects, the authors here have embarked upon a series of studies resulting in a more flexible modeling approach. In Akman et al. (2013), the impact of randomly changing environmental conditions is examined by incorporating stochasticity into the birth pulse of the prey species. In Akman et al. (2014), the authors introduce a class of models via a mixture of two birth-pulse terms and determined conditions for the global and local asymptotic stability of the pest eradication solution. With this work, the authors unify the stochastic and mixture model components to create further flexibility in modeling the impacts of random environmental changes on an integrated pest management system. In particular, we first determine the conditions under which solutions of our deterministic mixture model are permanent. We then analyze the stochastic model to find the optimal value of the mixing parameter that minimizes the variance in the efficacy of the pesticide. Additionally, we perform a sensitivity analysis to show that the corresponding pesticide efficacy determined by this optimization technique is indeed robust. Through numerical simulations we show that permanence can be preserved in our stochastic model. Our study of the stochastic version of the model indicates that our results on the deterministic model provide informative conclusions about the behavior of the stochastic model.

  18. E-waste collection in Italy: Results from an exploratory analysis.

    PubMed

    Favot, Marinella; Grassetti, Luca

    2017-09-01

    This study examines the performance of household electrical and electronic waste (WEEE) collection in 20 Italian regions from 2008 to 2015 and evaluates the impact of several explanatory variables on e-waste collection results. The independent variables are socio-economic and demographic (age, gender, household size, education level, migration, and income) along with technical-organisational variables (population density, presence of metropolises, macro-region, characteristics of the territory, percentage of household waste collected separately, and number of e-waste collection points). The results show that the number of collection points, the percentage of household waste collected separately, and the percentage of females are positively correlated with the kilograms collected per inhabitant per year. For example, a 1% variation in the number of collection points corresponds to a 0.25% variation in the collection results, while a 1% difference in the percentage of females in the population corresponds to a 7.549% difference in the collection rate. Population density, instead, is negatively correlated. Interestingly, there is a discrepancy between the Southern and Central regions (the former collect about 0.66 times as much as the latter), while the Northern regions perform similarly to the Central ones. Moreover, the first year (2008) showed a very low performance compared to the following years, in which the scheme constantly improved, mainly due to the additional collection points made available. The stochastic frontier model allows for the identification of the optimal production function among the 20 Italian regions. The best-performing region is Tuscany (in the Centre), followed by Sardinia and Sicily (in the South).
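
    The estimation step behind a stochastic frontier model of this kind can be sketched with synthetic data and the standard normal/half-normal likelihood of Aigner, Lovell, and Schmidt (1977); all data and starting values below are invented.

        import numpy as np
        from scipy.optimize import minimize
        from scipy.stats import norm

        rng = np.random.default_rng(7)
        # Synthetic frontier data: y = b0 + b1*x + v - u, v ~ N(0, s_v), u ~ |N(0, s_u)|
        n, b0, b1, s_v, s_u = 500, 1.0, 0.6, 0.2, 0.4
        x = rng.uniform(0, 5, n)
        y = b0 + b1 * x + rng.normal(0, s_v, n) - np.abs(rng.normal(0, s_u, n))

        def negloglik(theta):
            b0_, b1_, ls_v, ls_u = theta
            sv, su = np.exp(ls_v), np.exp(ls_u)          # log-parametrized for positivity
            sigma, lam = np.hypot(sv, su), su / sv
            eps = y - b0_ - b1_ * x
            # Normal/half-normal log-likelihood: f(eps) = (2/sigma) phi(eps/sigma) Phi(-eps*lam/sigma)
            ll = (np.log(2) - np.log(sigma) + norm.logpdf(eps / sigma)
                  + norm.logcdf(-eps * lam / sigma))
            return -ll.sum()

        res = minimize(negloglik, x0=[0.5, 0.5, np.log(0.3), np.log(0.3)],
                       method="Nelder-Mead", options={"maxiter": 5000})
        b0_h, b1_h = res.x[0], res.x[1]
        sv_h, su_h = np.exp(res.x[2]), np.exp(res.x[3])
        print(f"frontier: {b0_h:.2f} + {b1_h:.2f}x | sigma_v={sv_h:.2f}, sigma_u={su_h:.2f}")

    Region-level (in)efficiency would then be recovered from the estimated composed errors, which is how rankings such as the one above are produced.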

  19. The Virtuous Circles of Clinical Information Systems: a Modern Utopia.

    PubMed

    Degoulet, P

    2016-11-10

    Clinical information systems (CIS) are developed with the aim of improving both the efficiency and the quality of care. This position paper is based on the hypothesis that such a vision is partly a utopian view of the emerging eSociety. Examples are drawn from 15 years of experience with the fully integrated Georges Pompidou University Hospital (HEGP) CIS and from temporal data series extracted from the data warehouses of the Assistance Publique - Hôpitaux de Paris (AP-HP) acute care hospitals, which share the same administrative organization as HEGP. Three main virtuous circles are considered: user satisfaction vs. system use, system use vs. cost efficiency, and system use vs. quality of care. In structural equation models (SEM), the positive bidirectional relationship between user satisfaction and use was only observed in the early HEGP CIS deployment phase (first four years) and disappeared in late post-adoption (≥8 years). From 2009 to 2013, the financial efficiency of 20 AP-HP hospitals evaluated with stochastic frontier analysis (SFA) models diminished by 0.5% per year. The smaller efficiency decline observed for the three hospitals equipped with a more mature CIS, relative to the 17 other hospitals, was of the same order of magnitude as the difference observed between pediatric and non-pediatric hospitals. Outcome quality benefits that would support the system use vs. quality loop are unlikely to be obtained in the near future, since they require integration with population-based outcome measures, including mortality, morbidity, and quality of life, that may not be easily available. Barriers to realizing the utopian part of the CIS virtuous circles should be overcome to actually benefit the emerging eSociety.

  20. Goal-oriented sensitivity analysis for lattice kinetic Monte Carlo simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arampatzis, Georgios; Katsoulakis, Markos A.

    2014-03-28

    In this paper we propose a new class of coupling methods for the sensitivity analysis of high dimensional stochastic systems and in particular for lattice Kinetic Monte Carlo (KMC). Sensitivity analysis for stochastic systems is typically based on approximating continuous derivatives with respect to model parameters by the mean value of samples from a finite difference scheme. Instead of using independent samples, the proposed algorithm reduces the variance of the estimator by developing a strongly correlated ("coupled") stochastic process for both the perturbed and unperturbed stochastic processes, defined in a common state space. The novelty of our construction is that the new coupled process depends on the targeted observables, e.g., coverage, Hamiltonian, spatial correlations, surface roughness, etc.; hence we refer to the proposed method as goal-oriented sensitivity analysis. In particular, the rates of the coupled Continuous Time Markov Chain are obtained as solutions to a goal-oriented optimization problem, depending on the observable of interest, by considering the minimization functional of the corresponding variance. We show that this functional can be used as a diagnostic tool for the design and evaluation of different classes of couplings. Furthermore, the resulting KMC sensitivity algorithm has an easy implementation that is based on the Bortz–Kalos–Lebowitz algorithm's philosophy, where events are divided into classes depending on level sets of the observable of interest. Finally, we demonstrate in several examples, including adsorption, desorption, and diffusion Kinetic Monte Carlo, that for the same confidence interval and observable, the proposed goal-oriented algorithm can be two orders of magnitude faster than existing coupling algorithms for spatial KMC such as the Common Random Number approach. We also provide a complete implementation of the proposed sensitivity analysis algorithms, including various spatial KMC examples, in a supplementary MATLAB source code.
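
    The variance-reduction principle can be illustrated on a toy birth-death chain (not lattice KMC): a finite-difference sensitivity estimated from independent runs versus runs coupled through Common Random Numbers, the baseline that goal-oriented couplings improve upon. All parameters are illustrative.

        import numpy as np

        def run(theta, rng, T=10.0):
            # Birth-death CTMC: birth rate theta, death rate x; returns X(T)
            t, x = 0.0, 0
            while True:
                rates = np.array([theta, 1.0 * x])
                total = rates.sum()
                t += rng.exponential(1.0 / total)
                if t > T:
                    return x
                x += 1 if rng.random() < rates[0] / total else -1

        theta, h, M = 5.0, 0.5, 2000
        # Independent-samples finite difference
        rng = np.random.default_rng(8)
        indep = np.array([run(theta + h, rng) - run(theta, rng) for _ in range(M)]) / h
        # Common Random Numbers: both runs consume an identically seeded stream
        crn = np.array([run(theta + h, np.random.default_rng(s)) -
                        run(theta, np.random.default_rng(s)) for s in range(M)]) / h

        # Exact sensitivity dE[X_T]/dtheta is close to 1 for this model
        print("indep: %.3f (var %.2f) | CRN: %.3f (var %.2f)"
              % (indep.mean(), indep.var(), crn.mean(), crn.var()))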

  1. Frontiers of Two-Dimensional Correlation Spectroscopy. Part 1. New concepts and noteworthy developments

    NASA Astrophysics Data System (ADS)

    Noda, Isao

    2014-07-01

    A comprehensive survey review of new and noteworthy developments that have advanced the frontiers of 2D correlation spectroscopy during the last four years is compiled. This review covers books, proceedings, and review articles published on 2D correlation spectroscopy, a number of significant conceptual developments in the field, data pretreatment methods and other pertinent topics, as well as patent and publication trends and citation activities. Developments discussed include projection 2D correlation analysis, concatenated 2D correlation, and correlation under multiple perturbation effects, as well as orthogonal sample design, prediction of 2D correlation spectra, manipulation and comparison of 2D spectra, correlation strategies based on segmented data blocks, such as moving-window analysis, features like determination of sequential order and enhanced spectral resolution, statistical 2D spectroscopy using covariance and other statistical metrics, hetero-correlation analysis, and the sample-sample correlation technique. Data pretreatment operations prior to 2D correlation analysis are discussed, including the correction of physical effects, background and baseline subtraction, selection of the reference spectrum, normalization and scaling of data, derivative spectra and deconvolution techniques, and smoothing and noise reduction. Other pertinent topics include chemometrics and statistical considerations, peak position shift phenomena, variable sampling increments, computation and software, and display schemes, such as color-coded formats, slice and power spectra, tabulation, and other schemes.
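
    The core computation behind generalized 2D correlation analysis is compact enough to sketch: synchronous and asynchronous spectra are obtained from mean-centered dynamic spectra via the Hilbert-Noda transformation. The two-band synthetic dataset below is invented for illustration.

        import numpy as np

        rng = np.random.default_rng(9)
        m, n = 21, 150                      # perturbation steps x spectral channels
        wn = np.linspace(0, 1, n)
        t = np.linspace(0, 1, m)[:, None]
        # Two bands responding at different rates to the perturbation
        spectra = (t * np.exp(-((wn - 0.3) / 0.05) ** 2)
                   + t ** 3 * np.exp(-((wn - 0.7) / 0.05) ** 2)
                   + 0.01 * rng.standard_normal((m, n)))

        dyn = spectra - spectra.mean(axis=0)   # dynamic spectra (reference = mean)

        # Synchronous spectrum
        phi = dyn.T @ dyn / (m - 1)

        # Hilbert-Noda matrix: N[j, k] = 0 on the diagonal, else 1 / (pi * (k - j))
        j, k = np.meshgrid(np.arange(m), np.arange(m), indexing="ij")
        with np.errstate(divide="ignore"):
            Nmat = np.where(j == k, 0.0, 1.0 / (np.pi * (k - j)))
        psi = dyn.T @ Nmat @ dyn / (m - 1)     # asynchronous spectrum

        # Cross peaks in psi encode the sequential order of the band responses
        print("sync  peak (0.3 vs 0.7):", round(phi[int(0.3 * n), int(0.7 * n)], 3))
        print("async peak (0.3 vs 0.7):", round(psi[int(0.3 * n), int(0.7 * n)], 3))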

  2. The Climate Adaptation Frontier

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Preston, Benjamin L

    2013-01-01

    Climate adaptation has emerged as a mainstream risk management strategy for assisting in maintaining socio-ecological systems within the boundaries of a safe operating space. Yet, there are limits to the ability of systems to adapt. Here, we introduce the concept of an adaptation frontier, which is defined as a socio-ecological system's transitional adaptive operating space between safe and unsafe domains. A number of driving forces are responsible for determining the sustainability of systems on the frontier. These include path dependence, adaptation/development deficits, values conflicts, and discounting of future loss and damage. The cumulative implications of these driving forces are highly uncertain. Nevertheless, the fact that a broad range of systems already persist at the edge of their frontiers suggests a high likelihood that some limits will eventually be exceeded. The resulting system transformation is likely to manifest as anticipatory modification of management objectives or loss and damage. These outcomes vary significantly with respect to their ethical implications. Successful navigation of the adaptation frontier will necessitate new paradigms of risk governance to elicit knowledge that encourages reflexive reevaluation of societal values that enable or constrain sustainability.

  3. Center for Materials at Irradiation and Mechanical Extremes at LANL (A "Life at the Frontiers of Energy Research" contest entry from the 2011 Energy Frontier Research Centers (EFRCs) Summit and Forum)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nastasi, Michael

    "Center for Materials at Irradiation and Mechanical Extremes (CMIME) at LANL" was submitted by CMIME to the "Life at the Frontiers of Energy Research" video contest at the 2011 Science for Our Nation's Energy Future: Energy Frontier Research Centers (EFRCs) Summit and Forum. Twenty-six EFRCs created short videos to highlight their mission and their work. CMIME, an EFRC directed by Michael Nastasi at Los Alamos National Laboratory is a partnership of scientists from four institutions: LANL (lead), Carnegie Mellon University, the University of Illinois at Urbana-Champaign, and the Massachusetts Institute of Technology. The Office of Basic Energy Sciences in themore » U.S. Department of Energy's Office of Science established the 46 Energy Frontier Research Centers (EFRCs) in 2009. These collaboratively-organized centers conduct fundamental research focused on 'grand challenges' and use-inspired 'basic research needs' recently identified in major strategic planning efforts by the scientific community. The overall purpose is to accelerate scientific progress toward meeting the nation's critical energy challenges.« less

  4. Ethos of independence across regions in the United States: the production-adoption model of cultural change.

    PubMed

    Kitayama, Shinobu; Conway, Lucian Gideon; Pietromonaco, Paula R; Park, Hyekyung; Plaut, Victoria C

    2010-09-01

    Contemporary U.S. culture has a highly individualistic ethos. Nevertheless, exactly how this ethos was historically fostered remains unanalyzed. A new model of dynamic cultural change maintains that sparsely populated, novel environments that impose major threats to survival, such as the Western frontier in the United States during the 18th and 19th centuries, breed strong values of independence, which in turn guide the production of new practices that encourage self-promotion and focused, competitive work. Faced with few significant threats to survival, residents in traditional areas are likely to seek social prestige by adopting existing practices of other, higher status groups. Because of both the massive economic success of the frontier and the official endorsement of the frontier by the federal government, eastern residents of the United States in the 18th and 19th centuries may have actively adopted the frontier practices of independence, thus incorporating the frontier ethos of independence to form the contemporary U.S. national culture. Available evidence is reviewed, and implications for further research on cultural change are suggested.

  5. Center for Materials at Irradiation and Mechanical Extremes at LANL (A "Life at the Frontiers of Energy Research" contest entry from the 2011 Energy Frontier Research Centers (EFRCs) Summit and Forum)

    ScienceCinema

    Michael Nastasi (Director, Center for Materials at Irradiation and Mechanical Extremes); CMIME Staff

    2017-12-09

    'Center for Materials at Irradiation and Mechanical Extremes (CMIME) at LANL' was submitted by CMIME to the 'Life at the Frontiers of Energy Research' video contest at the 2011 Science for Our Nation's Energy Future: Energy Frontier Research Centers (EFRCs) Summit and Forum. Twenty-six EFRCs created short videos to highlight their mission and their work. CMIME, an EFRC directed by Michael Nastasi at Los Alamos National Laboratory, is a partnership of scientists from four institutions: LANL (lead), Carnegie Mellon University, the University of Illinois at Urbana-Champaign, and the Massachusetts Institute of Technology. The Office of Basic Energy Sciences in the U.S. Department of Energy's Office of Science established the 46 Energy Frontier Research Centers (EFRCs) in 2009. These collaboratively-organized centers conduct fundamental research focused on 'grand challenges' and use-inspired 'basic research needs' recently identified in major strategic planning efforts by the scientific community. The overall purpose is to accelerate scientific progress toward meeting the nation's critical energy challenges.

  6. Observation and analysis of the Coulter effect through carbon nanotube and graphene nanopores.

    PubMed

    Agrawal, Kumar Varoon; Drahushuk, Lee W; Strano, Michael S

    2016-02-13

    Carbon nanotubes (CNTs) and graphene are the rolled and flat analogues of graphitic carbon, respectively, with hexagonal crystalline lattices, and show exceptional molecular transport properties. The empirical study of a single isolated nanopore requires, as evidence, the observation of stochastic, telegraphic noise from a blocking molecule commensurate in size with the pore. This standard is used ubiquitously in patch clamp studies of single, isolated biological ion channels and a wide range of inorganic, synthetic nanopores. In this work, we show that observation and study of stochastic fluctuations for carbon nanopores, both CNTs and graphene-based, enable precision characterization of pore properties that is otherwise unattainable. In the case of voltage clamp measurements of long (0.5-1 mm) CNTs between 0.9 and 2.2 nm in diameter, Coulter blocking of cationic species reveals the complex structuring of the fluid phase for confined water in this diameter range. In the case of graphene, we have pioneered the study and the analysis of stochastic fluctuations in gas transport from a pressurized, graphene-covered micro-well compartment that reveal switching between different values of the membrane permeance attributed to chemical rearrangements of individual graphene pores. This analysis remains the only way to study such single isolated graphene nanopores under these realistic transport conditions of pore rearrangements, in keeping with the thesis of this work. In summary, observation and analysis of Coulter blocking or stochastic fluctuations of permeating flux is an invaluable tool to understand graphene and graphitic nanopores including CNTs.

  7. Can power-law scaling and neuronal avalanches arise from stochastic dynamics?

    PubMed

    Touboul, Jonathan; Destexhe, Alain

    2010-02-11

    The presence of self-organized criticality in biology is often evidenced by a power-law scaling of event size distributions, which can be measured by linear regression on logarithmic axes. We show here that such a procedure does not necessarily mean that the system exhibits self-organized criticality. We first provide an analysis of multisite local field potential (LFP) recordings of brain activity and show that event size distributions defined as negative LFP peaks can be close to power-law distributions. However, this result is not robust to change in detection threshold, or when tested using more rigorous statistical analyses such as the Kolmogorov-Smirnov test. Similar power-law scaling is observed for surrogate signals, suggesting that power-law scaling may be a generic property of thresholded stochastic processes. We next investigate this problem analytically, and show that, indeed, stochastic processes can produce spurious power-law scaling without the presence of underlying self-organized criticality. However, this power-law is only apparent in logarithmic representations, and does not survive more rigorous analysis such as the Kolmogorov-Smirnov test. The same analysis was also performed on an artificial network known to display self-organized criticality. In this case, both the graphical representations and the rigorous statistical analysis reveal with no ambiguity that the avalanche size is distributed as a power-law. We conclude that logarithmic representations can lead to spurious power-law scaling induced by the stochastic nature of the phenomenon. This apparent power-law scaling does not constitute a proof of self-organized criticality, which should be demonstrated by more stringent statistical tests.
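
    The paper's methodological point is easy to reproduce in miniature, assuming a thresholded lognormal sample as the surrogate: a log-log histogram regression looks convincingly linear, while a Kolmogorov-Smirnov test against the maximum-likelihood power law rejects it.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(10)
        xmin = 1.0
        data = rng.lognormal(mean=0.5, sigma=1.0, size=20000)
        data = data[data >= xmin]          # threshold, as for LFP peak detection

        # Naive log-log regression on the histogram: deceptively high R^2
        hist, edges = np.histogram(data, bins=np.logspace(0, np.log10(data.max()), 30),
                                   density=True)
        centers = np.sqrt(edges[:-1] * edges[1:])
        keep = hist > 0
        res = stats.linregress(np.log(centers[keep]), np.log(hist[keep]))
        print(f"log-log slope {res.slope:.2f}, R^2 = {res.rvalue**2:.3f}")

        # Rigorous check: MLE power-law exponent plus a Kolmogorov-Smirnov test
        alpha = 1 + len(data) / np.sum(np.log(data / xmin))
        ks = stats.kstest(data, stats.pareto(b=alpha - 1, scale=xmin).cdf)
        print(f"MLE alpha = {alpha:.2f}, KS p-value = {ks.pvalue:.2e} (power law rejected)")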

  8. Climate change threatens polar bear populations: a stochastic demographic analysis.

    PubMed

    Hunter, Christine M; Caswell, Hal; Runge, Michael C; Regehr, Eric V; Amstrup, Steve C; Stirling, Ian

    2010-10-01

    The polar bear (Ursus maritimus) depends on sea ice for feeding, breeding, and movement. Significant reductions in Arctic sea ice are forecast to continue because of climate warming. We evaluated the impacts of climate change on polar bears in the southern Beaufort Sea by means of a demographic analysis, combining deterministic, stochastic, environment-dependent matrix population models with forecasts of future sea ice conditions from IPCC general circulation models (GCMs). The matrix population models classified individuals by age and breeding status; mothers and dependent cubs were treated as units. Parameter estimates were obtained from a capture-recapture study conducted from 2001 to 2006. Candidate statistical models allowed vital rates to vary with time and as functions of a sea ice covariate. Model averaging was used to produce the vital rate estimates, and a parametric bootstrap procedure was used to quantify model selection and parameter estimation uncertainty. Deterministic models projected population growth in years with more extensive ice coverage (2001-2003) and population decline in years with less ice coverage (2004-2005). LTRE (life table response experiment) analysis showed that the reduction in λ in years with low sea ice was due primarily to reduced adult female survival, and secondarily to reduced breeding. A stochastic model with two environmental states, good and poor sea ice conditions, projected a declining stochastic growth rate, log λ_s, as the frequency of poor ice years increased. The observed frequency of poor ice years since 1979 would imply log λ_s ≈ -0.01, which agrees with available (albeit crude) observations of population size. The stochastic model was linked to a set of 10 GCMs compiled by the IPCC; the models were chosen for their ability to reproduce historical observations of sea ice and were forced with "business as usual" (A1B) greenhouse gas emissions. The resulting stochastic population projections showed drastic declines in the polar bear population by the end of the 21st century. These projections were instrumental in the decision to list the polar bear as a threatened species under the U.S. Endangered Species Act.
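    The two-state stochastic projection technique itself is compact. A minimal sketch follows, with hypothetical 2x2 stage-structured matrices standing in for the study's age- and breeding-status-structured models (the vital rates below are invented, not the estimated polar bear parameters): log λ_s is estimated by averaging the one-step log growth of the population vector along a long random sequence of good and poor ice years.

    import numpy as np

    rng = np.random.default_rng(2)

    # Two hypothetical stage-structured projection matrices (juvenile, adult);
    # these numbers are illustrative, not the study's estimated vital rates.
    A_good = np.array([[0.00, 0.60],    # reproduction in good ice years
                       [0.45, 0.95]])   # survival/transition
    A_poor = np.array([[0.00, 0.30],    # reduced breeding in poor ice years
                       [0.40, 0.85]])   # reduced adult survival

    def stochastic_growth_rate(p_poor, t=50_000):
        """Estimate log lambda_s by averaging one-step log growth of the
        population vector along a long random environmental sequence."""
        n = np.array([0.5, 0.5])
        log_growth = 0.0
        for _ in range(t):
            A = A_poor if rng.random() < p_poor else A_good
            n = A @ n
            total = n.sum()
            log_growth += np.log(total)
            n /= total                  # renormalize to avoid overflow
        return log_growth / t

    for p in (0.1, 0.3, 0.5, 0.7):
        print(f"freq. of poor ice years = {p:.1f}: log lambda_s ~ "
              f"{stochastic_growth_rate(p):+.4f}")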

  9. Climate change threatens polar bear populations: A stochastic demographic analysis

    USGS Publications Warehouse

    Hunter, C.M.; Caswell, H.; Runge, M.C.; Regehr, E.V.; Amstrup, Steven C.; Stirling, I.

    2010-01-01

    The polar bear (Ursus maritimus) depends on sea ice for feeding, breeding, and movement. Significant reductions in Arctic sea ice are forecast to continue because of climate warming. We evaluated the impacts of climate change on polar bears in the southern Beaufort Sea by means of a demographic analysis, combining deterministic, stochastic, environment-dependent matrix population models with forecasts of future sea ice conditions from IPCC general circulation models (GCMs). The matrix population models classified individuals by age and breeding status; mothers and dependent cubs were treated as units. Parameter estimates were obtained from a capture-recapture study conducted from 2001 to 2006. Candidate statistical models allowed vital rates to vary with time and as functions of a sea ice covariate. Model averaging was used to produce the vital rate estimates, and a parametric bootstrap procedure was used to quantify model selection and parameter estimation uncertainty. Deterministic models projected population growth in years with more extensive ice coverage (2001-2003) and population decline in years with less ice coverage (2004-2005). LTRE (life table response experiment) analysis showed that the reduction in λ in years with low sea ice was due primarily to reduced adult female survival, and secondarily to reduced breeding. A stochastic model with two environmental states, good and poor sea ice conditions, projected a declining stochastic growth rate, log λ_s, as the frequency of poor ice years increased. The observed frequency of poor ice years since 1979 would imply log λ_s ≈ -0.01, which agrees with available (albeit crude) observations of population size. The stochastic model was linked to a set of 10 GCMs compiled by the IPCC; the models were chosen for their ability to reproduce historical observations of sea ice and were forced with "business as usual" (A1B) greenhouse gas emissions. The resulting stochastic population projections showed drastic declines in the polar bear population by the end of the 21st century. These projections were instrumental in the decision to list the polar bear as a threatened species under the U.S. Endangered Species Act. © 2010 by the Ecological Society of America.

  10. FIFE-Jobsub: a grid submission system for intensity frontier experiments at Fermilab

    NASA Astrophysics Data System (ADS)

    Box, Dennis

    2014-06-01

    The Fermilab Intensity Frontier Experiments use an integrated submission system known as FIFE-jobsub, part of the FIFE (Fabric for Frontier Experiments) initiative, to submit batch jobs to the Open Science Grid. FIFE-jobsub eases the burden on experimenters by integrating data transfer and site selection details in an easy-to-use, well-documented format. FIFE-jobsub automates the tedious details of maintaining grid proxies for the lifetime of the grid job. Data transfer is handled using the Intensity Frontier Data Handling Client (IFDHC) [1] tool suite, which facilitates selecting the appropriate data transfer method from many possibilities while protecting shared resources from overload. Chaining of job dependencies into directed acyclic graphs (Condor DAGs) is well supported and made easier through the use of input flags and parameters.

  11. The Mid-Cretaceous Frontier Formation near the Moxa Arch, southwestern Wyoming

    USGS Publications Warehouse

    Mereweather, E.A.; Blackmon, P.D.; Webb, J.C.

    1984-01-01

    The Frontier Formation in the Green River Basin of Wyoming, Utah, and Colorado consists of sandstone, siltstone, and shale, and minor conglomerate, coal, and bentonite. These strata were deposited in several marine and nonmarine environments during early Late Cretaceous time. At north-trending outcrops along the eastern edge of the overthrust belt, the Frontier is of Cenomanian, Turonian, and early Coniacian age, and commonly is about 610 m (2,000 ft) thick. The formation in that area conformably overlies the Lower Cretaceous Aspen Shale and is divided into the following members, in ascending order: Chalk Creek, Coalville, Allen Hollow, Oyster Ridge Sandstone, and Dry Hollow. In west-trending outcrops on the northern flank of the Uinta Mountains in Utah, the Frontier is of middle and late Turonian age, and is about 60 m (200 ft) thick. These strata disconformably overlie the Lower Cretaceous Mowry Shale. In boreholes on the Moxa arch, the upper part of the Frontier is of middle Turonian to early Coniacian age and unconformably overlies the lower part of the formation, which is early Cenomanian at the south end and probably Cenomanian to early Turonian at the north end. The Frontier on the arch thickens northward from less than 100 m (328 ft) to more than 300 m (984 ft) and conformably overlies the Mowry. The marine and nonmarine Frontier near the Uinta Mountains, marine and nonmarine beds in the upper part of the formation on the Moxa arch, and the largely nonmarine Dry Hollow Member at the top of the Frontier in the overthrust belt are similar in age. Older strata in the formation, which are represented by the disconformable basal contact of the Frontier near the Uinta Mountains, thicken northward along the Moxa arch and westward between the arch and the overthrust belt. The large changes in thickness of the Frontier in the Green River Basin were caused mainly by differential uplift and truncation of the lower part of the formation during the early to middle Turonian and by the shoreward addition of progressively younger sandstone units at the top of the formation during the late Turonian and early Coniacian. The sandstone in cores of the Frontier, from boreholes on the Moxa arch and the northern plunge of the Rock Springs uplift, consists of very fine-grained and fine-grained litharenites and sublitharenites that were deposited in deltaic and shallow-water marine environments. These rocks consist mainly of quartz, chert, rock fragments, mixed-layer illite-smectite, mica-illite, and chlorite. Samples of the sandstone have porosities of 4.7 to 23.0 percent and permeabilities of 0.14 to 6.80 millidarcies, and seem to represent poor to fair reservoir beds for oil and gas. The shale in cores of the Frontier Formation and the overlying basal Hilliard Shale, from the Moxa arch, Rock Springs uplift, and overthrust belt, was deposited in deltaic and offshore-marine environments. Samples of the shale are composed largely of quartz, mica-illite, mixed-layer illite-smectite, kaolin, and chlorite. They also contain from 0.27 to 4.42 percent organic carbon, in humic and sapropelic organic matter. Most of the sampled shale units are thermally mature, in terms of oil generation, and a few probably are source rocks for oil and gas.

  12. Sustainability of transport structures - some aspects of the nonlinear reliability assessment

    NASA Astrophysics Data System (ADS)

    Pukl, Radomír; Sajdlová, Tereza; Strauss, Alfred; Lehký, David; Novák, Drahomír

    2017-09-01

    Efficient techniques for both nonlinear numerical analysis of concrete structures and advanced stochastic simulation have been combined to offer an advanced tool for assessing the realistic behaviour, failure, and safety of transport structures. The approach is based on randomization of the nonlinear finite element analysis of the structural models. Degradation aspects such as carbonation of concrete can be accounted for in order to predict the durability of the investigated structure and its sustainability. Results can serve as a rational basis for performance and sustainability assessment based on advanced nonlinear computer analysis of structures of transport infrastructure such as bridges or tunnels. In the stochastic simulation, the input material parameters obtained from material tests, including their randomness and uncertainty, are represented as random variables or fields. Appropriate identification of material parameters is crucial for the virtual failure modelling of structures and structural elements. An inverse analysis approach using artificial neural networks and virtual stochastic simulations is applied to determine the fracture-mechanical parameters of the structural material and its numerical model. Structural response, reliability, and sustainability have been investigated on different types of transport structures made from various materials using the above-mentioned methodology and tools.
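    The core randomization idea (treat material and load inputs as random variables, push each sample through the structural model, and read the failure probability off the output ensemble) can be sketched in a few lines of Python. Here a closed-form bending check stands in for the nonlinear finite element solver, and every distribution and parameter below is hypothetical.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)

    # Sample uncertain inputs, run the (deliberately simple) structural model,
    # and estimate the failure probability.  All parameters are hypothetical.
    n_sim = 200_000
    f_c  = rng.lognormal(mean=np.log(38.0), sigma=0.12, size=n_sim)  # MPa
    d    = rng.normal(loc=0.60, scale=0.01, size=n_sim)              # depth, m
    load = rng.gumbel(loc=550.0, scale=40.0, size=n_sim)             # moment, kN m

    # Stand-in "model": bending resistance of a singly reinforced section;
    # a real study would call the nonlinear FE solver here instead.
    A_s, f_y, b = 2.5e-3, 500e3, 0.40        # steel area (m^2), f_y (kPa), width (m)
    lever = d - 0.5 * A_s * f_y / (0.85 * f_c * 1e3 * b)
    resistance = A_s * f_y * lever           # kN m

    failures = resistance < load
    p_f = failures.mean()
    beta = -stats.norm.ppf(p_f)              # Cornell reliability index
    print(f"P_f ~ {p_f:.2e}, beta ~ {beta:.2f}")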

  13. A Two-Step Approach to Uncertainty Quantification of Core Simulators

    DOE PAGES

    Yankov, Artem; Collins, Benjamin; Klein, Markus; ...

    2012-01-01

    For the multiple sources of error introduced into the standard computational regime for simulating reactor cores, rigorous uncertainty analysis methods are available primarily to quantify the effects of cross section uncertainties. Two methods for propagating cross section uncertainties through core simulators are the XSUSA statistical approach and the "two-step" method. The XSUSA approach, which is based on the SUSA code package, is fundamentally a stochastic sampling method. Alternatively, the two-step method utilizes generalized perturbation theory in the first step and stochastic sampling in the second step. The consistency of these two methods in quantifying uncertainties in the multiplication factor and in the core power distribution was examined in the framework of phase I-3 of the OECD Uncertainty Analysis in Modeling benchmark. With the Three Mile Island Unit 1 core as a base model for analysis, the XSUSA and two-step methods were applied with certain limitations, and the results were compared to those produced by other stochastic sampling-based codes. Based on the uncertainty analysis results, conclusions were drawn as to the method that is currently more viable for computing uncertainties in burnup and transient calculations.
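    The stochastic-sampling side of such an analysis reduces, in miniature, to perturbing cross sections according to their covariance and re-evaluating the model. A one-group sketch follows; the covariance values are illustrative, not an evaluated nuclear data library, and the textbook relation k_inf = νΣ_f/Σ_a replaces the full core simulator.

    import numpy as np

    rng = np.random.default_rng(4)

    # Stochastic-sampling UQ in one group: perturb cross sections according to
    # their covariance, re-evaluate the model, and read off the output spread.
    n_samples = 10_000
    mu = np.array([0.0060, 0.0146])          # mean [Sigma_f, Sigma_a] (1/cm), assumed
    rel_cov = np.array([[ 0.0016, -0.0004],  # 4% / 3% relative std. dev., slight
                        [-0.0004,  0.0009]]) # anti-correlation (assumed values)
    cov = rel_cov * np.outer(mu, mu)

    xs = rng.multivariate_normal(mu, cov, size=n_samples)
    sigma_f, sigma_a = xs[:, 0], xs[:, 1]

    nu = 2.43                                # neutrons per fission (U-235)
    k_inf = nu * sigma_f / sigma_a           # one-group infinite-medium k

    print(f"k_inf = {k_inf.mean():.4f} +/- {k_inf.std():.4f} "
          f"({100 * k_inf.std() / k_inf.mean():.2f}% rel. uncertainty)")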

  14. Stochastic approach for an unbiased estimation of the probability of a successful separation in conventional chromatography and sequential elution liquid chromatography.

    PubMed

    Ennis, Erin J; Foley, Joe P

    2016-07-15

    A stochastic approach was utilized to estimate the probability of a successful isocratic or gradient separation in conventional chromatography for numbers of sample components, peak capacities, and saturation factors ranging from 2 to 30, 20-300, and 0.017-1, respectively. The stochastic probabilities were obtained under conditions of (i) constant peak width ("gradient" conditions) and (ii) peak width increasing linearly with time ("isocratic/constant N" conditions). The isocratic and gradient probabilities obtained stochastically were compared with the probabilities predicted by Martin et al. [Anal. Chem., 58 (1986) 2200-2207] and Davis and Stoll [J. Chromatogr. A, (2014) 128-142]; for a given number of components and peak capacity the same trend is always observed: probability obtained with the isocratic stochastic approach
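    The stochastic approach described here is easy to illustrate under "gradient" (constant peak width) conditions: place m component peaks uniformly at random on a chromatogram of peak capacity n_c, and count a trial as successful when every adjacent pair of peaks is at least one resolution unit apart. A minimal Monte Carlo sketch, with parameters chosen from the ranges above:

    import numpy as np

    rng = np.random.default_rng(5)

    def p_success(m, n_c, trials=50_000):
        """Monte Carlo probability that all m randomly placed components are
        resolved on a chromatogram of peak capacity n_c (constant peak width,
        i.e. 'gradient' conditions: success needs adjacent gaps >= 1/n_c)."""
        t = np.sort(rng.random((trials, m)), axis=1)   # retention times in [0, 1]
        gaps = np.diff(t, axis=1)
        return np.mean(np.all(gaps >= 1.0 / n_c, axis=1))

    for m, n_c in [(5, 50), (10, 100), (20, 100), (30, 300)]:
        print(f"m = {m:2d}, n_c = {n_c:3d}: P(success) ~ {p_success(m, n_c):.3f}")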

  15. A comparison of two- and three-dimensional stochastic models of regional solute movement

    USGS Publications Warehouse

    Shapiro, A.M.; Cvetkovic, V.D.

    1990-01-01

    Recent models of solute movement in porous media that are based on a stochastic description of the porous medium properties have been dedicated primarily to a three-dimensional interpretation of solute movement. In many practical problems, however, it is more convenient and consistent with measuring techniques to consider flow and solute transport as an areal, two-dimensional phenomenon. The physics of solute movement, however, is dependent on the three-dimensional heterogeneity in the formation. A comparison of two- and three-dimensional stochastic interpretations of solute movement in a porous medium having a statistically isotropic hydraulic conductivity field is investigated. To provide an equitable comparison between the two- and three-dimensional analyses, the stochastic properties of the transmissivity are defined in terms of the stochastic properties of the hydraulic conductivity. The variance of the transmissivity is shown to be significantly reduced in comparison to that of the hydraulic conductivity, and the transmissivity is spatially correlated over larger distances. These factors influence the two-dimensional interpretations of solute movement by underestimating the longitudinal and transverse growth of the solute plume in comparison to its description as a three-dimensional phenomenon. Although this analysis is based on small perturbation approximations and the special case of a statistically isotropic hydraulic conductivity field, it casts doubt on the use of a stochastic interpretation of the transmissivity in describing regional-scale movement. However, by assuming the transmissivity to be the vertical integration of the hydraulic conductivity field at a given position, the stochastic properties of the hydraulic conductivity can be estimated from the stochastic properties of the transmissivity and applied to obtain a more accurate interpretation of solute movement. © 1990 Kluwer Academic Publishers.
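    The variance reduction described here can be demonstrated directly: generate vertically correlated ln K profiles, integrate each over the aquifer thickness to obtain a transmissivity, and compare variances. A minimal sketch with an assumed exponential vertical covariance (all parameters illustrative):

    import numpy as np

    rng = np.random.default_rng(6)

    # Compare the variability of point hydraulic conductivity K with that of
    # the vertically integrated transmissivity T.  Parameters are illustrative.
    n_real, n_z = 5000, 100          # realizations, vertical grid points
    dz, corr_len = 0.5, 2.0          # grid spacing and vertical correlation (m)

    # Covariance matrix of ln K with exponential correlation in the vertical.
    z = np.arange(n_z) * dz
    C = np.exp(-np.abs(z[:, None] - z[None, :]) / corr_len)
    L = np.linalg.cholesky(C + 1e-10 * np.eye(n_z))

    sigma_lnK, mean_lnK = 1.0, np.log(1e-4)          # ln K statistics (K in m/s)
    lnK = mean_lnK + sigma_lnK * (rng.normal(size=(n_real, n_z)) @ L.T)
    K = np.exp(lnK)

    T = K.sum(axis=1) * dz                           # vertical integration

    var_lnK = np.log(K).var()
    var_lnT = np.log(T).var()
    print(f"variance of ln K: {var_lnK:.3f}")
    print(f"variance of ln T: {var_lnT:.3f}  "
          f"(reduced by vertical averaging, ratio {var_lnT / var_lnK:.2f})")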

  16. Fort Independence: An Eighteenth-Century Frontier Homesite and Militia Post in South Carolina.

    DTIC Science & Technology

    1982-12-01

    zone was designated Zone 3. The matrix was a moist, black muck with some rocks present, as well as chunks of burned and unburned logs, a 60 x 90cm... In an area of strong Tory sentiment, Fort Independence was important in maintaining South Carolina's frontier at a critical time. The fort was burned by Tories in early 1779.

  17. FERMILAB ACCELERATOR R&D PROGRAM TOWARDS INTENSITY FRONTIER ACCELERATORS : STATUS AND PROGRESS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shiltsev, Vladimir

    2016-11-15

    The 2014 P5 report identified accelerator-based neutrino and rare-decay physics research as a centrepiece of the US domestic HEP program at Fermilab. Operation, upgrade, and development of the accelerators for the near-term and longer-term particle physics program at the Intensity Frontier face formidable challenges. Here we discuss key elements of the accelerator physics and technology R&D program toward future multi-MW proton accelerators and present its status and progress.

  18. Alaska exceptionality hypothesis: Is Alaska wilderness really different?

    Treesearch

    Gregory Brown

    2002-01-01

    The common idiom of Alaska as “The Last Frontier” suggests that the relative remoteness and unsettled character of Alaska create a unique Alaskan identity, one that is both a “frontier” and the “last” of its kind. The frontier idiom portrays the place and people of Alaska as exceptional or different from the places and people who reside in the Lower Forty-Eight States...

  19. State Summary of New Mexico. Ed Watch Online.

    ERIC Educational Resources Information Center

    Education Trust, Washington, DC.

    This report provides data on the academic achievement gap that separates low-income and minority students from other students, examining how well different groups of students perform in New Mexico and noting inequities in teacher quality, course offerings, and funding. Included are tables and data that provide: a frontier gap analysis (a…

  20. Topological horseshoe analysis and field-programmable gate array implementation of a fractional-order four-wing chaotic attractor

    NASA Astrophysics Data System (ADS)

    Dong, En-Zeng; Wang, Zhen; Yu, Xiao; Chen, Zeng-Qiang; Wang, Zeng-Hui

    2018-01-01

    Project supported by the National Natural Science Foundation of China (Grant Nos. 61502340 and 61374169), the Application Base and Frontier Technology Research Project of Tianjin, China (Grant No. 15JCYBJC51800), and the South African National Research Foundation Incentive Grants (Grant No. 81705).
