Lumpy Demand and the Diagrammatics of Aggregation.
ERIC Educational Resources Information Center
Shmanske, Stephen; Packey, Daniel
1999-01-01
Illustrates how a simple discontinuity in an individual's demand curve, or inverse-demand curve, affects the shape of market aggregate curves. Shows, for private goods, that an infinitesimal change in quantity can lead to large changes in consumption patterns; for collective goods, the analysis suggests a theory of coalition building. (DSK)
Analytical Problems and Suggestions in the Analysis of Behavioral Economic Demand Curves.
Yu, Jihnhee; Liu, Liu; Collins, R Lorraine; Vincent, Paula C; Epstein, Leonard H
2014-01-01
Behavioral economic demand curves (Hursh, Raslear, Shurtleff, Bauman, & Simmons, 1988) are innovative approaches to characterize the relationships between consumption of a substance and its price. In this article, we investigate common analytical issues in the use of behavioral economic demand curves, which can cause inconsistent interpretations of demand curves, and then we provide methodological suggestions to address those analytical issues. We first demonstrate that log transformation with different added values for handling zeros changes model parameter estimates dramatically. Second, demand curves are often analyzed using an overparameterized model that results in an inefficient use of the available data and a lack of assessment of the variability among individuals. To address these issues, we apply a nonlinear mixed effects model based on multivariate error structures that has not been used previously to analyze behavioral economic demand curves in the literature. We also propose analytical formulas for the relevant standard errors of derived values such as Pmax, Omax, and elasticity. The proposed model stabilizes the derived values regardless of using different added increments and provides substantially smaller standard errors. We illustrate the data analysis procedure using data from a relative reinforcement efficacy study of simulated marijuana purchasing.
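As a concrete illustration of the modeling issues discussed above (handling of zero consumption values and derived measures), the sketch below fits the Hursh and Silberberg (2008) exponential demand equation to a made-up consumption schedule and derives Pmax and Omax numerically. The data, the span constant k, and the added increment are illustrative assumptions, not values from the article, and the article's own approach is a nonlinear mixed effects model rather than this single-curve fit.

```python
# Minimal sketch: fit the exponential demand equation to one (invented) price/consumption
# schedule and derive Pmax/Omax numerically. The added increment before the log transform
# mirrors the zero-handling issue discussed in the abstract; its choice affects the fit.
import numpy as np
from scipy.optimize import curve_fit

prices = np.array([0.0, 0.5, 1.0, 2.0, 4.0, 8.0, 16.0])       # unit price (illustrative)
consumption = np.array([10., 9., 8., 6., 3., 1., 0.])          # reported consumption (illustrative)

k = 3.0           # span constant in log10 units, fixed by convention in many analyses
increment = 0.1   # value added before the log transform to handle zeros

def exponential_demand(price, q0, alpha):
    """log10 consumption predicted by the exponential demand model."""
    return np.log10(q0) + k * (np.exp(-alpha * q0 * price) - 1.0)

log_consumption = np.log10(consumption + increment)
(q0_hat, alpha_hat), _ = curve_fit(
    exponential_demand, prices, log_consumption,
    p0=[10.0, 0.01], bounds=([1e-3, 1e-6], [1e3, 1.0]))

# Derived values: Omax is the maximum expenditure, Pmax the price at which it occurs.
price_grid = np.linspace(0.01, 50, 5000)
expenditure = price_grid * 10 ** exponential_demand(price_grid, q0_hat, alpha_hat)
pmax, omax = price_grid[expenditure.argmax()], expenditure.max()
print(f"Q0={q0_hat:.2f}, alpha={alpha_hat:.4f}, Pmax={pmax:.2f}, Omax={omax:.2f}")
```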
Lo, Po-Han; Tsou, Mei-Yung; Chang, Kuang-Yi
2015-09-01
Patient-controlled epidural analgesia (PCEA) is commonly used for pain relief after total knee arthroplasty (TKA). This study aimed to model the trajectory of analgesic demand over time after TKA and explore its influential factors using latent curve analysis. Data were retrospectively collected from 916 patients receiving unilateral or bilateral TKA and postoperative PCEA. PCEA demands during 12-hour intervals for 48 hours were directly retrieved from infusion pumps. Potentially influential factors of PCEA demand, including age, height, weight, body mass index, sex, and infusion pump settings, were also collected. A latent curve analysis with 2 latent variables, the intercept (baseline) and slope (trend), was applied to model the changes in PCEA demand over time. The effects of influential factors on these 2 latent variables were estimated to examine how these factors interacted with time to alter the trajectory of PCEA demand over time. On average, the difference in analgesic demand between the first and second 12-hour intervals was only 15% of that between the first and third 12-hour intervals. No significant difference in PCEA demand was noted between the third and fourth 12-hour intervals. Aging tended to decrease the baseline PCEA demand but body mass index and infusion rate were positively correlated with the baseline. Only sex significantly affected the trend parameter and male individuals tended to have a smoother decreasing trend of analgesic demands over time. Patients receiving bilateral procedures did not consume more analgesics than their unilateral counterparts. Goodness of fit analysis indicated acceptable model fit to the observed data. Latent curve analysis provided valuable information about how analgesic demand after TKA changed over time and how patient characteristics affected its trajectory.
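For readers unfamiliar with latent curve analysis, a generic latent growth specification of the kind described (an intercept for baseline demand and a slope for its trend, each regressed on patient covariates) can be written as follows; the loadings and covariate structure are a standard textbook formulation, not necessarily the authors' exact parameterization.

```latex
% Generic latent curve (latent growth) model for repeated PCEA demand, t = 1..4 (12-h intervals).
% The time loadings \lambda_t and covariate vector x_i (age, BMI, sex, pump settings, ...)
% follow the standard formulation and are assumptions, not the authors' exact specification.
\begin{aligned}
y_{it} &= \eta_{0i} + \lambda_t\,\eta_{1i} + \varepsilon_{it} \\
\eta_{0i} &= \alpha_0 + \boldsymbol{\gamma}_0^{\top}\mathbf{x}_i + \zeta_{0i} \quad \text{(baseline / intercept)} \\
\eta_{1i} &= \alpha_1 + \boldsymbol{\gamma}_1^{\top}\mathbf{x}_i + \zeta_{1i} \quad \text{(trend / slope)}
\end{aligned}
```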
Latent factor structure of a behavioral economic cigarette demand curve in adolescent smokers
Bidwell, L. Cinnamon; MacKillop, James; Murphy, James G.; Tidey, Jennifer W.; Colby, Suzanne M.
2012-01-01
Behavioral economic demand curves, or quantitative representations of drug consumption across a range of prices, have been used to assess motivation for a variety of drugs. Such curves generate multiple measures of drug demand that are associated with cigarette consumption and nicotine dependence. However, little is known about the relationships among these facets of demand. The aim of the study was to quantify these relationships in adolescent smokers by using exploratory factor analysis to examine the underlying structure of the facets of nicotine incentive value generated from a demand curve measure. Participants were 138 adolescent smokers who completed a hypothetical cigarette purchase task, which assessed estimated cigarette consumption at escalating levels of price/cigarette. Demand curves and five facets of demand were generated from the measure: Elasticity (i.e., 1/α or proportionate price sensitivity); Intensity (i.e., consumption at zero price); Omax (i.e., maximum financial expenditure on cigarettes); Pmax (i.e., price at which expenditure is maximized); and Breakpoint (i.e., the price that suppresses consumption to zero). Principal components analysis was used to examine the latent structure among the variables. The results revealed a two-factor solution, which were interpreted as “Persistence,” reflecting insensitivity to escalating price, and “Amplitude,” reflecting the absolute levels of consumption and price. These findings suggest a two factor structure of nicotine incentive value as measured via a demand curve. If supported, these findings have implications for understanding the relationships among individual demand indices in future behavioral economic studies and may further contribute to understanding of the nature of cigarette reinforcement. PMID:22727784
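The two-factor extraction described above can be illustrated with a principal components analysis of the five demand indices; the sketch below uses simulated data with a built-in "Persistence"/"Amplitude" structure, since the study's data are not reproduced here.

```python
# Illustrative PCA of five demand indices (simulated data, not the study's sample).
# Elasticity, Pmax, and Breakpoint are generated to load on a "Persistence" factor,
# Intensity and Omax on an "Amplitude" factor, mimicking the reported two-factor solution.
import numpy as np
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 138
persistence = rng.normal(size=n)   # latent price-insensitivity factor
amplitude = rng.normal(size=n)     # latent consumption-volume factor
indices = pd.DataFrame({
    "Elasticity": -persistence + 0.2 * rng.normal(size=n),
    "Pmax":        persistence + 0.2 * rng.normal(size=n),
    "Breakpoint":  persistence + 0.2 * rng.normal(size=n),
    "Intensity":   amplitude   + 0.2 * rng.normal(size=n),
    "Omax":        amplitude   + 0.5 * persistence + 0.2 * rng.normal(size=n),
})

pca = PCA(n_components=2)
pca.fit(StandardScaler().fit_transform(indices))
loadings = pd.DataFrame(pca.components_.T, index=indices.columns, columns=["PC1", "PC2"])
print(loadings.round(2))                     # components roughly "Persistence" and "Amplitude"
print(pca.explained_variance_ratio_.round(2))
```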
Bentzley, Brandon S.; Fender, Kimberly M.; Aston-Jones, Gary
2012-01-01
Rationale Behavioral-economic demand curve analysis offers several useful measures of drug self-administration. Although generation of demand curves previously required multiple days, recent within-session procedures allow curve construction from a single 110-min cocaine self-administration session, making behavioral-economic analyses available to a broad range of self-administration experiments. However, a mathematical approach of curve fitting has not been reported for the within-session threshold procedure. Objectives We review demand curve analysis in drug self-administration experiments and provide a quantitative method for fitting curves to single-session data that incorporates relative stability of brain drug concentration. Methods Sprague-Dawley rats were trained to self-administer cocaine, and then tested with the threshold procedure in which the cocaine dose was sequentially decreased on a fixed ratio-1 schedule. Price points (responses/mg cocaine) outside of relatively stable brain cocaine concentrations were removed before curves were fit. Curve-fit accuracy was determined by the degree of correlation between graphical and calculated parameters for cocaine consumption at low price (Q0) and the price at which maximal responding occurred (Pmax). Results Removing price points that occurred at relatively unstable brain cocaine concentrations generated precise estimates of Q0 and resulted in Pmax values with significantly closer agreement with graphical Pmax than conventional methods. Conclusion The exponential demand equation can be fit to single-session data using the threshold procedure for cocaine self-administration. Removing data points that occur during relatively unstable brain cocaine concentrations resulted in more accurate estimates of demand curve slope than graphical methods, permitting a more comprehensive analysis of drug self-administration via a behavioral-economic framework. PMID:23086021
Latent factor structure of a behavioral economic cigarette demand curve in adolescent smokers.
Bidwell, L Cinnamon; MacKillop, James; Murphy, James G; Tidey, Jennifer W; Colby, Suzanne M
2012-11-01
Behavioral economic demand curves, or quantitative representations of drug consumption across a range of prices, have been used to assess motivation for a variety of drugs. Such curves generate multiple measures of drug demand that are associated with cigarette consumption and nicotine dependence. However, little is known about the relationships among these facets of demand. The aim of the study was to quantify these relationships in adolescent smokers by using exploratory factor analysis to examine the underlying structure of the facets of nicotine incentive value generated from a demand curve measure. Participants were 138 adolescent smokers who completed a hypothetical cigarette purchase task, which assessed estimated cigarette consumption at escalating levels of price/cigarette. Demand curves and five facets of demand were generated from the measure: Elasticity (i.e., 1/α or proportionate price sensitivity); Intensity (i.e., consumption at zero price); O(max) (i.e., maximum financial expenditure on cigarettes); P(max) (i.e., price at which expenditure is maximized); and Breakpoint (i.e., the price that suppresses consumption to zero). Principal components analysis was used to examine the latent structure among the variables. The results revealed a two-factor solution, which were interpreted as "Persistence," reflecting insensitivity to escalating price, and "Amplitude," reflecting the absolute levels of consumption and price. These findings suggest a two factor structure of nicotine incentive value as measured via a demand curve. If supported, these findings have implications for understanding the relationships among individual demand indices in future behavioral economic studies and may further contribute to understanding of the nature of cigarette reinforcement. Copyright © 2012 Elsevier Ltd. All rights reserved.
Tan, Lavinia; Hackenberg, Timothy D
2015-11-01
Pigeons' demand and preference for specific and generalized tokens was examined in a token economy. Pigeons could produce and exchange different colored tokens for food, for water, or for food or water. Token production was measured across three phases, which examined: (1) across-session price increases (typical demand curve method); (2) within-session price increases (progressive-ratio, PR, schedule); and (3) concurrent pairwise choices between the token types. Exponential demand curves were fitted to the response data and accounted for over 90% total variance. Demand curve parameter values, Pmax , Omax and α showed that demand was ordered in the following way: food tokens, generalized tokens, water tokens, both in Phase 1 and in Phase 3. This suggests that the preferences were predictable on the basis of elasticity and response output from the demand analysis. Pmax and Omax values failed to consistently predict breakpoints and peak response rates in the PR schedules in Phase 2, however, suggesting limits on a unitary conception of reinforcer efficacy. The patterns of generalized token production and exchange in Phase 3 suggest that the generalized tokens served as substitutes for the specific food and water tokens. Taken together, the present findings demonstrate the utility of behavioral economic concepts in the analysis of generalized reinforcement. © Society for the Experimental Analysis of Behavior.
Tests of Behavioral-Economic Assessments of Relative Reinforcer Efficacy II: Economic Complements
ERIC Educational Resources Information Center
Madden, Gregory J.; Smethells, John R.; Ewan, Eric E.; Hursh, Steven R.
2007-01-01
This experiment was conducted to test the predictions of two behavioral-economic approaches to quantifying relative reinforcer efficacy. The normalized demand analysis suggests that characteristics of averaged normalized demand curves may be used to predict progressive-ratio breakpoints and peak responding. By contrast, the demand analysis holds…
ERIC Educational Resources Information Center
Chiu-Irion, Vicky
Developed as part of a 37.5-hour microeconomics course, this lesson plan focuses on the concepts of supply and demand analysis used to determine market equilibrium. The objectives of the 50-minute lesson are to enable the student to: (1) explain how a demand schedule is derived from raw data; (2) graph a demand curve from the demand schedule; (3)…
The Aggregate Demand Curve: A Reply.
ERIC Educational Resources Information Center
Hansen, Richard B.; And Others
1987-01-01
Responds to claims about the instructional value of the downward-sloping aggregate demand curve in teaching principles of macroeconomics. Examines the effects of interest-rates and the role of money on demand curves. Concludes by arguing against the use of downward-sloping aggregate demand curves in textbooks. (RKM)
Teaching Keynes's Principle of Effective Demand Using the Aggregate Labor Market Diagram.
ERIC Educational Resources Information Center
Dalziel, Paul; Lavoie, Marc
2003-01-01
Suggests a method to teach John Keynes's principle of effective demand using a standard aggregate labor market diagram familiar to students taking advanced undergraduate macroeconomics courses. States the analysis incorporates Michal Kalecki's version to show Keynesian unemployment as a point on the aggregate labor demand curve inside the…
Applying Dataflow Architecture and Visualization Tools to In Vitro Pharmacology Data Automation.
Pechter, David; Xu, Serena; Kurtz, Marc; Williams, Steven; Sonatore, Lisa; Villafania, Artjohn; Agrawal, Sony
2016-12-01
The pace and complexity of modern drug discovery places ever-increasing demands on scientists for data analysis and interpretation. Data flow programming and modern visualization tools address these demands directly. Three different requirements-one for allosteric modulator analysis, one for a specialized clotting analysis, and one for enzyme global progress curve analysis-are reviewed, and their execution in a combined data flow/visualization environment is outlined. © 2016 Society for Laboratory Automation and Screening.
Demand curves for hypothetical cocaine in cocaine-dependent individuals.
Bruner, Natalie R; Johnson, Matthew W
2014-03-01
Drug purchasing tasks have been successfully used to examine demand for hypothetical consumption of abused drugs including heroin, nicotine, and alcohol. In these tasks, drug users make hypothetical choices whether to buy drugs, and if so, at what quantity, at various potential prices. These tasks allow for behavioral economic assessment of that drug's intensity of demand (preferred level of consumption at extremely low prices) and demand elasticity (sensitivity of consumption to price), among other metrics. However, a purchasing task for cocaine in cocaine-dependent individuals has not been investigated. This study examined a novel Cocaine Purchasing Task and the relation between resulting demand metrics and self-reported cocaine use data. Participants completed a questionnaire assessing hypothetical purchases of cocaine units at prices ranging from $0.01 to $1,000. Demand curves were generated from responses on the Cocaine Purchasing Task. Correlations compared metrics from the demand curve to measures of real-world cocaine use. Group and individual data were well modeled by a demand curve function. The validity of the Cocaine Purchasing Task was supported by a significant correlation between the demand curve metrics of demand intensity and Omax (determined from Cocaine Purchasing Task data) and self-reported measures of cocaine use. Partial correlations revealed that after controlling for demand intensity, demand elasticity and the related measure, Pmax, were significantly correlated with real-world cocaine use. Results indicate that the Cocaine Purchasing Task produces orderly demand curve data, and that these data relate to real-world measures of cocaine use.
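The observed (non-model-based) purchase-task metrics named here can be computed directly from a participant's price-consumption schedule; the sketch below uses an invented response vector, not the study's data.

```python
# Hedged sketch of the observed purchase-task metrics (intensity, Omax, Pmax, breakpoint)
# for one hypothetical participant; price and quantity vectors are invented.
import numpy as np

prices = np.array([0.01, 0.10, 0.50, 1, 5, 10, 50, 100, 500, 1000])   # $ per unit
units  = np.array([20,   20,   18,  15, 10,  6,  2,   1,   0,    0])  # units purchased

expenditure = prices * units
intensity   = units[0]                          # consumption at the lowest price
omax        = expenditure.max()                 # maximum expenditure
pmax        = prices[expenditure.argmax()]      # price at which expenditure peaks
zero_idx    = np.nonzero(units == 0)[0]
breakpoint_ = prices[zero_idx[0]] if zero_idx.size else np.nan  # first price with zero purchases

print(intensity, omax, pmax, breakpoint_)
```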
Tests of Behavioral-Economic Assessments of Relative Reinforcer Efficacy: Economic Substitutes
ERIC Educational Resources Information Center
Madden, Gregory J.; Smethells, John R.; Ewan, Eric E.; Hursh, Steven R.
2007-01-01
This experiment was conducted to test predictions of two behavioral-economic approaches to quantifying relative reinforcer efficacy. According to the first of these approaches, characteristics of averaged normalized demand curves may be used to predict progressive-ratio breakpoints and peak responding. The second approach, the demand analysis,…
Cosmetic surgery procedures as luxury goods: measuring price and demand in facial plastic surgery.
Alsarraf, Ramsey; Alsarraf, Nicole W; Larrabee, Wayne F; Johnson, Calvin M
2002-01-01
To evaluate the relationship between cosmetic facial plastic surgery procedure price and demand, and to test the hypothesis that these procedures function as luxury goods in the marketplace, with an upward-sloping demand curve. Data were derived from a survey that was sent to every (N = 1727) active fellow, member, or associate of the American Academy of Facial Plastic and Reconstructive Surgery, assessing the costs and frequency of 4 common cosmetic facial plastic surgery procedures (face-lift, brow-lift, blepharoplasty, and rhinoplasty) for 1999 and 1989. An economic analysis was performed to assess the relationship of price and demand for these procedures. A significant association was found between increasing surgeons' fees and total charges for cosmetic facial plastic surgery procedures and increasing demand for these procedures, as measured by their annual frequency (P=.003). After a multiple regression analysis correcting for confounding variables, this association of increased price with increased demand holds for each of the 4 procedures studied, across all US regions, and for both periods surveyed. Cosmetic facial plastic surgery procedures do appear to function as luxury goods in the marketplace, with an upward-sloping demand curve. This stands in contrast to other, traditional, goods for which demand typically declines as price increases. It appears that economic methods can be used to evaluate cosmetic procedure trends; however, these methods must be founded on the appropriate economic theory.
Belke, Terry W; Pierce, W David
2009-02-01
Twelve female Long-Evans rats were exposed to concurrent variable-ratio (VR) schedules of sucrose and wheel-running reinforcement (Sucrose VR 10 Wheel VR 10; Sucrose VR 5 Wheel VR 20; Sucrose VR 20 Wheel VR 5) with predetermined budgets (number of responses). The allocation of lever pressing to the sucrose and wheel-running alternatives was assessed at high and low body weights. Results showed that wheel-running rate and lever-pressing rates for sucrose and wheel running increased, but the choice of wheel running decreased at the low body weight. A regression analysis of relative consumption as a function of relative price showed that consumption shifted toward sucrose and interacted with price differences in a manner consistent with increased substitutability. Demand curves showed that demand for sucrose became less elastic while demand for wheel running became more elastic at the low body weight. These findings reflect an increase in the difference in relative value of sucrose and wheel running as body weight decreased. Discussion focuses on the limitations of response rates as measures of reinforcement value. In addition, we address the commonalities between matching and demand curve equations for the analysis of changes in relative reinforcement value.
Demand Curves for Hypothetical Cocaine in Cocaine-Dependent Individuals
Bruner, Natalie R.; Johnson, Matthew W.
2013-01-01
Rationale Drug purchasing tasks have been successfully used to examine demand for hypothetical consumption of abused drugs including heroin, nicotine, and alcohol. In these tasks drug users make hypothetical choices whether to buy drugs, and if so, at what quantity, at various potential prices. These tasks allow for behavioral economic assessment of that drug's intensity of demand (preferred level of consumption at extremely low prices) and demand elasticity (sensitivity of consumption to price), among other metrics. However, a purchasing task for cocaine in cocaine-dependent individuals has not been investigated. Objectives This study examined a novel Cocaine Purchasing Task and the relation between resulting demand metrics and self-reported cocaine use data. Methods Participants completed a questionnaire assessing hypothetical purchases of cocaine units at prices ranging from $0.01 to $1,000. Demand curves were generated from responses on the Cocaine Purchasing Task. Correlations compared metrics from the demand curve to measures of real-world cocaine use. Results Group and individual data were well modeled by a demand curve function. The validity of the Cocaine Purchasing Task was supported by a significant correlation between the demand curve metrics of demand intensity and Omax (determined from Cocaine Purchasing Task data) and self-reported measures of cocaine use. Partial correlations revealed that after controlling for demand intensity, demand elasticity and the related measure, Pmax, were significantly correlated with real-world cocaine use. Conclusions Results indicate that the Cocaine Purchasing Task produces orderly demand curve data, and that these data relate to real-world measures of cocaine use. PMID:24217899
Koltun, G.F.
2001-01-01
This report provides data and methods to aid in the hydrologic design or evaluation of impounding reservoirs and side-channel reservoirs used for water supply in Ohio. Data from 117 streamflow-gaging stations throughout Ohio were analyzed by means of nonsequential-mass-curve-analysis techniques to develop relations between storage requirements, water demand, duration, and frequency. Information also is provided on minimum runoff for selected durations and frequencies. Systematic record lengths for the streamflow-gaging stations ranged from about 10 to 75 years; however, in many cases, additional streamflow record was synthesized. For impounding reservoirs, families of curves are provided to facilitate the estimation of storage requirements as a function of demand and the ratio of the 7-day, 2-year low flow to the mean annual flow. Information is provided with which to evaluate separately the effects of evaporation on storage requirements. Comparisons of storage requirements for impounding reservoirs determined by nonsequential-mass-curve-analysis techniques with storage requirements determined by annual-mass-curve techniques that employ probability routing to account for carryover-storage requirements indicate that large differences in computed required storages can result from the two methods, particularly for conditions where demand cannot be met from within-year storage. For side-channel reservoirs, tables of demand-storage-frequency information are provided for a primary pump relation consisting of one variable-speed pump with a pumping capacity that ranges from 0.1 to 20 times demand. Tables of adjustment ratios are provided to facilitate determination of storage requirements for 19 other pump sets consisting of assorted combinations of fixed-speed pumps or variable-speed pumps with aggregate pumping capacities smaller than or equal to the primary pump relation. The effects of evaporation on side-channel reservoir storage requirements are incorporated into the storage-requirement estimates. The effects of an instream-flow requirement equal to the 80-percent-duration flow are also incorporated into the storage-requirement estimates.
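The storage-demand relation tabulated in the report rests on mass-curve reasoning; the sketch below illustrates the underlying storage-deficit idea with a simple sequent-peak calculation on made-up monthly flows. It is not the report's nonsequential-mass-curve technique and it omits evaporation and instream-flow adjustments.

```python
# Sequent-peak style illustration of required active storage for a constant demand,
# using invented monthly flows. The report's actual nonsequential-mass-curve method
# and its frequency analysis are more involved than this sketch.
import numpy as np

monthly_flow = np.array([120, 90, 60, 30, 20, 15, 10, 12, 25, 60, 90, 110], float)  # volume units
demand = 45.0                                            # constant draft per month

storage_deficit = 0.0
required_storage = 0.0
for q in monthly_flow:
    storage_deficit = max(0.0, storage_deficit + demand - q)   # carry any deficit forward
    required_storage = max(required_storage, storage_deficit)  # track the largest deficit

print(f"required active storage ~ {required_storage:.0f} volume units")
```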
A Note on Comparing the Elasticities of Demand Curves.
ERIC Educational Resources Information Center
Nieswiadomy, Michael
1986-01-01
Demonstrates a simple and useful way to compare the elasticity of demand at each price (or quantity) for different demand curves. The technique is particularly useful for the intermediate microeconomic course. (Author)
Pietzka, Ariane T.; Stöger, Anna; Huhulescu, Steliana; Allerberger, Franz; Ruppitsch, Werner
2011-01-01
The ability to accurately track Listeria monocytogenes strains involved in outbreaks is essential for control and prevention of listeriosis. Because current typing techniques are time-consuming, cost-intensive, technically demanding, and difficult to standardize, we developed a rapid and cost-effective method for typing of L. monocytogenes. In all, 172 clinical L. monocytogenes isolates and 20 isolates from culture collections were typed by high-resolution melting (HRM) curve analysis of a specific locus of the internalin B gene (inlB). All obtained HRM curve profiles were verified by sequence analysis. The 192 tested L. monocytogenes isolates yielded 15 specific HRM curve profiles. Sequence analysis revealed that these 15 HRM curve profiles correspond to 18 distinct inlB sequence types. The HRM curve profiles obtained correlated with the five phylogenetic groups I.1, I.2, II.1, II.2, and III. Thus, HRM curve analysis constitutes an inexpensive assay and represents an improvement in typing relative to classical serotyping or multiplex PCR typing protocols. This method provides a rapid and powerful screening tool for simultaneous preliminary typing of up to 384 samples in approximately 2 hours. PMID:21227395
Supply and demand in physician markets: a panel data analysis of GP services in Australia.
McRae, Ian; Butler, James R G
2014-09-01
To understand the trends in any physician services market it is necessary to understand the nature of both supply and demand, but few studies have jointly examined supply and demand in these markets. This study uses aggregate panel data on general practitioner (GP) services at the Statistical Local Area level in Australia spanning eight years to estimate supply and demand equations for GP services. The structural equations of the model are estimated separately using population-weighted fixed effects panel modelling with the two stage least squares formulation of the generalised method of moments approach (GMM (2SLS)). The estimated price elasticity of demand of [Formula: see text] is comparable with other studies. The direct impact of GP density on demand, while significant, proves almost immaterial in the context of near vertical supply curves. Supply changes are therefore due to shifts in the position of the curves, partly determined by a time trend. The model is validated by comparing post-panel model predictions with actual market outcomes over a period of three years and is found to provide surprisingly accurate projections over a period of significant policy change. The study confirms the need to jointly consider supply and demand in exploring the behaviour of physician services markets.
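The two-stage least squares idea behind the GMM (2SLS) estimation can be illustrated on simulated data; the sketch below is a bare-bones cross-sectional 2SLS, not the paper's population-weighted fixed-effects panel estimator, and the instrument and coefficients are invented.

```python
# Toy 2SLS: a supply-side cost shock instruments for price in a demand equation.
# Data-generating values are invented; the point is only the two-stage mechanics.
import numpy as np

rng = np.random.default_rng(1)
n = 500
cost_shock = rng.normal(size=n)                 # instrument: shifts supply (price) only
demand_shock = rng.normal(size=n)               # unobserved demand shifter
price = 1.0 + 0.8 * cost_shock + 0.3 * demand_shock + rng.normal(scale=0.1, size=n)
quantity = 5.0 - 1.2 * price + demand_shock + rng.normal(scale=0.1, size=n)  # true slope -1.2

# Stage 1: project the endogenous price on the instrument (plus a constant).
Z = np.column_stack([np.ones(n), cost_shock])
price_hat = Z @ np.linalg.lstsq(Z, price, rcond=None)[0]

# Stage 2: regress quantity on the fitted price.
X = np.column_stack([np.ones(n), price_hat])
beta = np.linalg.lstsq(X, quantity, rcond=None)[0]
print(f"2SLS demand slope ~ {beta[1]:.2f} (true -1.2); plain OLS would be biased upward")
```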
Modeling Integrated Water-User Decisions with Intermittent Supplies
NASA Astrophysics Data System (ADS)
Lund, J. R.; Rosenberg, D.
2006-12-01
We present an economic-engineering method to estimate urban water use demands with intermittent water supplies. A two-stage, probabilistic optimization formulation includes a wide variety of water supply enhancement and conservation actions that individual households can adopt to meet multiple water quality uses with uncertain water availability. We embed the optimization in Monte-Carlo simulations to show aggregate effects at a utility (citywide) scale for a population of user conditions and decisions. Parametric analysis provides derivations of supply curves to subsidize conservation, demand responses to alternative pricing, and customer willingness-to-pay to avoid shortages. Results show a good empirical fit for the average and distribution of billed residential water use in Amman, Jordan. Additional outputs give likely market penetration rates for household conservation actions, associated water savings, and subsidies required to entice further adoption. We discuss new insights to size, target, market, and finance conservation programs and interpret a demand curve with block pricing.
The Use of Artificial Neural Networks for Forecasting the Electric Demand of Stand-Alone Consumers
NASA Astrophysics Data System (ADS)
Ivanin, O. A.; Direktor, L. B.
2018-05-01
The problem of short-term forecasting of electric power demand of stand-alone consumers (small inhabited localities) situated outside centralized power supply areas is considered. The basic approaches to modeling the electric power demand depending on the forecasting time frame and the problems set, as well as the specific features of such modeling, are described. The advantages and disadvantages of the methods used for the short-term forecast of the electric demand are indicated, and difficulties involved in the solution of the problem are outlined. The basic principles of arranging artificial neural networks are set forth; it is also shown that the proposed method is preferable when the input information necessary for prediction is lacking or incomplete. The selection of the parameters that should be included into the list of the input data for modeling the electric power demand of residential areas using artificial neural networks is validated. The structure of a neural network is proposed for solving the problem of modeling the electric power demand of residential areas. The specific features of generation of the training dataset are outlined. The results of test modeling of daily electric demand curves for some settlements of Kamchatka and Yakutia based on known actual electric demand curves are provided. The reliability of the test modeling has been validated. A high value of the deviation of the modeled curve from the reference curve obtained in one of the four reference calculations is explained. The input data and the predicted power demand curves for the rural settlement of Kuokuiskii Nasleg are provided. The power demand curves were modeled for four characteristic days of the year, and they can be used in the future for designing a power supply system for the settlement. To enhance the accuracy of the method, a series of measures based on specific features of a neural network's functioning are proposed.
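A minimal version of the forecasting setup described (calendar and weather inputs, a daily demand curve as the multi-output target) can be sketched with a small feed-forward network; the features, data, and network size below are illustrative assumptions rather than the authors' configuration.

```python
# Sketch: predict a 24-point daily electric demand curve from day-of-week and temperature
# with a small feed-forward network. All data are synthetic and only illustrate the setup.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
n_days = 365
day_of_week = rng.integers(0, 7, n_days)
temperature = 10 + 15 * np.sin(2 * np.pi * np.arange(n_days) / 365) + rng.normal(0, 3, n_days)
X = np.column_stack([day_of_week, temperature])

hours = np.arange(24)
base_shape = 1.0 + 0.5 * np.sin(2 * np.pi * (hours - 6) / 24)       # typical daily profile
y = (base_shape[None, :] * (1 + 0.02 * temperature[:, None])
     + 0.1 * (day_of_week[:, None] >= 5)                             # weekend shift
     + rng.normal(0, 0.05, (n_days, 24)))                            # synthetic demand, arbitrary units

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
model = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
model.fit(X_tr, y_tr)                          # multi-output regression: one output per hour
print("R^2 on held-out days:", round(model.score(X_te, y_te), 3))
```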
Dynamic analysis and testing of a curved girder bridge.
DOT National Transportation Integrated Search
2006-01-01
As a result of increasing highway construction and expansion, a corresponding need to increase traffic capacity in heavily populated areas, and ever-increasing constraints on available land for transportation use, there has been an increasing demand ...
Pricing strategy for aesthetic surgery: economic analysis of a resident clinic's change in fees.
Krieger, L M; Shaw, W W
1999-02-01
The laws of microeconomics explain how prices affect consumer purchasing decisions and thus overall revenues and profits. These principles can easily be applied to the behavior of aesthetic plastic surgery patients. The UCLA Division of Plastic Surgery resident aesthetics clinic recently offered a radical price change for its services. The effects of this change on demand for services and revenue were tracked. Economic analysis was applied to see if this price change resulted in the maximization of total revenues, or if additional price changes could further optimize them. Economic analysis of pricing involves several steps. The first step is to assess demand. The number of procedures performed by a given practice at different price levels can be plotted to create a demand curve. From this curve, price sensitivities of consumers can be calculated (price elasticity of demand). This information can then be used to determine the pricing level that creates demand for the exact number of procedures that yield optimal revenues. In economic parlance, revenues are maximized by pricing services such that elasticity is equal to 1 (the point of unit elasticity). At the UCLA resident clinic, average total fees per procedure were reduced by 40 percent. This resulted in a 250-percent increase in procedures performed for representative 4-month periods before and after the price change. Net revenues increased by 52 percent. Economic analysis showed that the price elasticity of demand before the price change was 6.2. After the price change it was 1. We conclude that the magnitude of the price change resulted in a fee schedule that yielded the highest possible revenues from the resident clinic. These results show that changes in price do affect total revenue and that the nature of these effects can be understood, predicted, and maximized using the tools of microeconomics.
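The elasticity arithmetic reported above can be checked with a short back-of-envelope calculation; the linear demand curve used to illustrate the unit-elasticity revenue-maximization condition is a toy example, not the clinic's estimated demand.

```python
# Back-of-envelope check: a 40% fee cut followed by a 250% rise in procedure volume implies
# a simple price elasticity of demand of about 6.2, matching the reported value. The linear
# demand curve below is a toy illustration of revenue maximization at unit elasticity.
pct_price_change = -0.40
pct_quantity_change = 2.50
elasticity = abs(pct_quantity_change / pct_price_change)
print(f"implied elasticity ~ {elasticity:.2f}")            # about 6.25

# Toy linear demand Q = a - b*P: revenue P*Q peaks where elasticity = 1, i.e. P = a/(2b).
a, b = 100.0, 2.0                                           # illustrative parameters
revenue = lambda p: p * (a - b * p)
p_star = a / (2 * b)
print(f"revenue-maximizing price = {p_star}, revenue = {revenue(p_star)}")
```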
High-resolution Behavioral Economic Analysis of Cigarette Demand to Inform Tax Policy
MacKillop, James; Few, Lauren R.; Murphy, James G.; Wier, Lauren M.; Acker, John; Murphy, Cara; Stojek, Monika; Carrigan, Maureen; Chaloupka, Frank
2012-01-01
Aims Novel methods in behavioral economics permit the systematic assessment of the relationship between cigarette consumption and price. Toward informing tax policy, the goals of this study were to conduct a high-resolution analysis of cigarette demand in a large sample of adult smokers and to use the data to estimate the effects of tax increases in ten U.S. States. Design In-person descriptive survey assessment. Setting Academic departments at three universities. Participants Adult daily smokers (i.e., 5+ cigarettes/day; 18+ years old; ≥8th grade education); N = 1056. Measurements Estimated cigarette demand, demographics, expired carbon monoxide. Findings The cigarette demand curve exhibited highly variable levels of price sensitivity, especially in the form of ‘left-digit effects’ (i.e., very high price sensitivity as pack prices transitioned from one whole number to the next; e.g., $5.80-$6/pack). A $1 tax increase in the ten states was projected to reduce the economic burden of smoking by an average of $531M (range: $93.6M-$976.5M) and increase gross tax revenue by an average of 162% (range: 114%- 247%). Conclusions Tobacco price sensitivity is nonlinear across the demand curve and in particular for pack-level left-digit price transitions. Tax increases in U.S. states with similar price and tax rates to the sample are projected to result in substantial decreases in smoking-related costs and substantial increases in tax revenues. PMID:22845784
High-resolution behavioral economic analysis of cigarette demand to inform tax policy.
MacKillop, James; Few, Lauren R; Murphy, James G; Wier, Lauren M; Acker, John; Murphy, Cara; Stojek, Monika; Carrigan, Maureen; Chaloupka, Frank
2012-12-01
Novel methods in behavioral economics permit the systematic assessment of the relationship between cigarette consumption and price. Towards informing tax policy, the goals of this study were to conduct a high-resolution analysis of cigarette demand in a large sample of adult smokers and to use the data to estimate the effects of tax increases in 10 US States. In-person descriptive survey assessment. Academic departments at three universities. Adult daily smokers (i.e. more than five cigarettes/day; 18+ years old; ≥8th grade education); n = 1056. Estimated cigarette demand, demographics, expired carbon monoxide. The cigarette demand curve exhibited highly variable levels of price sensitivity, especially in the form of 'left-digit effects' (i.e. very high price sensitivity as pack prices transitioned from one whole number to the next; e.g. $5.80-6/pack). A $1 tax increase in the 10 states was projected to reduce the economic burden of smoking by an average of $530.6 million (range: $93.6-976.5 million) and increase gross tax revenue by an average of 162% (range: 114-247%). Tobacco price sensitivity is non-linear across the demand curve and in particular for pack-level left-digit price transitions. Tax increases in US states with similar price and tax rates to the sample are projected to result in substantial decreases in smoking-related costs and substantial increases in tax revenues. © 2012 The Authors, Addiction © 2012 Society for the Study of Addiction.
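The kind of projection described (consumption and gross tax revenue before and after a $1/pack tax increase) can be sketched from a demand schedule; the schedule below is invented, with an exaggerated drop at the $6 left-digit transition, and is not the study's data.

```python
# Illustrative projection from a purchase-task demand schedule (invented numbers):
# interpolate consumption at the post-tax price and compare gross tax revenue per smoker.
import numpy as np

pack_price = np.array([4.0, 5.0, 5.8, 6.0, 7.0, 8.0, 10.0])      # $/pack
packs_per_day = np.array([1.0, 0.95, 0.9, 0.7, 0.6, 0.5, 0.3])   # note the drop at the $6 left digit

def demand(p):
    return np.interp(p, pack_price, packs_per_day)

current_price, current_tax, tax_increase = 6.0, 1.0, 1.0
q_before, q_after = demand(current_price), demand(current_price + tax_increase)
rev_before = q_before * current_tax * 365
rev_after = q_after * (current_tax + tax_increase) * 365
print(f"consumption change: {100 * (q_after / q_before - 1):.0f}% per smoker")
print(f"gross tax revenue per smoker/yr: ${rev_before:.0f} -> ${rev_after:.0f}")
```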
Galuska, Chad M.; Banna, Kelly M.; Willse, Lena Vaughn; Yahyavi-Firouz-Abadi, Noushin; See, Ronald E.
2011-01-01
The present study examined whether continued access to methamphetamine or food reinforcement changed economic demand for both. The relationship between demand elasticity and cue-induced reinstatement was also determined. Male Long-Evans rats lever-pressed under increasing fixed-ratio requirements for either food pellets or methamphetamine (20 μg/50 μl infusion). For two groups, demand curves were obtained before and after continued access (12 days, 2-hr sessions) to the reinforcer under a fixed-ratio 3 schedule. A third group was given continued access to methamphetamine between determinations of food demand and a fourth group abstained from methamphetamine between determinations. All groups underwent extinction sessions, followed by a cue-induced reinstatement test. Although food demand was less elastic than methamphetamine demand, continued access to methamphetamine shifted the methamphetamine demand curve upward and the food demand curve downward. In some rats, methamphetamine demand also became less elastic. Continued access to food had no effect on food demand. Reinstatement was higher after continued access to methamphetamine relative to food. For methamphetamine, elasticity and reinstatement measures were correlated. We conclude that continued access to methamphetamine – but not food – alters demand in ways suggestive of methamphetamine accruing reinforcing strength. Demand elasticity and reinstatement measures appear to be related indices of drug-seeking. PMID:21597363
Elasticity of Demand for Tuition Fees at an Institution of Higher Education
ERIC Educational Resources Information Center
Langelett, George; Chang, Kuo-Liang; Ola' Akinfenwa, Samson; Jorgensen, Nicholas; Bhattarai, Kopila
2015-01-01
Using a conjoint survey of 161 students at South Dakota State University (SDSU), we mapped a probability-of-enrolment curve for SDSU students, consistent with demand theory. A quasi-demand curve was created from the conditional-logit model. This study shows that along with the price of tuition fees, distance from home, availability of majors, and…
A behavioral economic measure of demand for alcohol predicts brief intervention outcomes.
MacKillop, James; Murphy, James G
2007-07-10
Considerable basic and clinical research supports a behavioral economic conceptualization of alcohol and drug dependence. One behavioral economic approach to assess motivation for a drug is the use of demand curves, or quantitative representations of drug consumption and drug-reinforced responding across a range of prices. This study used a hypothetical alcohol purchase task to generate demand curves, and examined whether the resulting demand curve parameters predicted drinking outcomes following a brief intervention. Participants were 51 college student drinkers (67% female; 94% Caucasian; drinks/week: M=24.57, S.D.=8.77) who completed a brief alcohol intervention. Consistent with predictions, a number of demand curve indices significantly predicted post-intervention alcohol use and frequency of heavy drinking episodes, even after controlling for baseline drinking and other pertinent covariates. Most prominently, O(max) (i.e., maximum alcohol expenditure) and breakpoint (i.e., sensitivity of consumption to increasing price) predicted greater drinking at 6-month post-intervention follow-up. These results indicate that a behavioral economic measure of alcohol demand may have utility in characterizing the malleability of alcohol consumption. Moreover, these results support the utility of translating experimental assays of reinforcement into clinical research.
Textbook Factor Demand Curves.
ERIC Educational Resources Information Center
Davis, Joe C.
1994-01-01
Maintains that teachers and textbook graphics follow the same basic pattern in illustrating changes in demand curves when product prices increase. Asserts that the use of computer graphics will enable teachers to be more precise in their graphic presentation of price elasticity. (CFR)
NASA Astrophysics Data System (ADS)
Hu, Ming-Che
Optimization and simulation are popular operations research and systems analysis tools for energy policy modeling. This dissertation addresses three important questions concerning the use of these tools for energy market (and electricity market) modeling and planning under uncertainty. (1) What is the value of information and cost of disregarding different sources of uncertainty for the U.S. energy economy? (2) Could model-based calculations of the performance (social welfare) of competitive and oligopolistic market equilibria be optimistically biased due to uncertainties in objective function coefficients? (3) How do alternative sloped demand curves perform in the PJM capacity market under economic and weather uncertainty? How do curve adjustment and cost dynamics affect the capacity market outcomes? To address the first question, two-stage stochastic optimization is utilized in the U.S. national MARKAL energy model; then the value of information and cost of ignoring uncertainty are estimated for three uncertainties: carbon cap policy, load growth and natural gas prices. When an uncertainty is important, explicitly considering those risks when making investments will result in better performance in expectation (a positive expected cost of ignoring uncertainty). Furthermore, eliminating the uncertainty would improve strategies even further, meaning that improved forecasts of future conditions are valuable (i.e., a positive expected value of information). Also, the value of policy coordination shows the difference between a strategy developed under the incorrect assumption of no carbon cap and a strategy correctly anticipating imposition of such a cap. For the second question, game theory models are formulated and the existence of optimistic (positive) biases in market equilibria (both competitive and oligopoly markets) is proved, in that calculated social welfare and producer profits will, in expectation, exceed the values that will actually be received. Theoretical analyses prove the general existence of this bias for both competitive and oligopolistic models when production costs and demand curves are uncertain. Also demonstrated is an optimistic bias for the net benefits of introducing a new technology into a market when the cost of the new technology is uncertain. The optimistic biases are quantified for a model of the northwest European electricity market (including Belgium, France, Germany and the Netherlands). Demand uncertainty results in an optimistic bias of 150,000-220,000 [Euro]/hr of total surplus and natural gas price uncertainty yields a smaller bias of 8,000-10,000 [Euro]/hr for total surplus. Further, adding a new uncertain technology (biomass) to the set of possible generation methods almost doubles the optimistic bias (14,000-18,000 [Euro]/hr). The third question concerns ex ante evaluation of the Reliability Pricing Model (RPM)---the new PJM capacity market---launched in June 2007. A Monte Carlo simulation model is developed to simulate the PJM capacity market and predict market performance, producer revenue, and consumer payments. An important input to RPM is a demand curve for capacity; several alternative demand curves are compared, and sensitivity analyses of those conclusions are conducted. One conclusion is that the sloped demand curves are more robust because those demand curves give higher reliability with lower consumer payments.
In addition, the performance of the curves is evaluated for a more sophisticated market design in which the demand curve can be adjusted in response to previous market outcomes and where the capital costs may change unexpectedly. The simulation shows that curve adjustment increases system reliability with lower consumer payments. Also, learning-by-doing, which lowers plant capital costs, leads to a higher average reserve margin and lower consumer payments. In contrast, a sudden rise in capital costs causes a decrease in reliability and an increase in consumer payments.
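The value-of-information quantities estimated in the first part of the dissertation can be illustrated with a toy two-stage problem; the sketch below computes the expected cost of ignoring uncertainty and the expected value of perfect information for a made-up capacity-sizing example, not the MARKAL model.

```python
# Toy two-stage example of the metrics discussed above: capacity is sized before demand is
# known, then unmet demand is penalized. ECIU (cost of ignoring uncertainty) and EVPI
# (value of perfect information) are computed for invented numbers.
import numpy as np

scenarios = np.array([80.0, 100.0, 140.0])     # possible demand levels
probs = np.array([0.3, 0.4, 0.3])
capex, shortfall_penalty = 1.0, 5.0            # cost per unit of capacity and of unmet demand

def expected_cost(capacity):
    shortfall = np.maximum(scenarios - capacity, 0.0)
    return capex * capacity + shortfall_penalty * float(probs @ shortfall)

grid = np.linspace(0, 200, 2001)
rp = min(expected_cost(x) for x in grid)       # "here-and-now" stochastic solution
x_ev = float(probs @ scenarios)                # plan sized for the mean demand only
eev = expected_cost(x_ev)                      # its true expected cost
ws = float(probs @ (capex * scenarios))        # wait-and-see: build exactly what is needed

print(f"ECIU (cost of ignoring uncertainty) = {eev - rp:.1f}")
print(f"EVPI (value of perfect information) = {rp - ws:.1f}")
```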
Tuition Rate Setting for Organized Camps: An Economic Analysis. An Occasional Paper.
ERIC Educational Resources Information Center
Doucette, Robert E.; Levine, Frank M.
1979-01-01
An economic analysis of setting tuition rates for organized camps addresses four topics of general interest: (1) measuring the economic value (revenues and expenses) of a camp; (2) measuring the true costs (fixed holding costs, fixed costs, and variable operating costs) of operation; (3) establishing a demand curve for measuring camp revenue; and…
Galuska, Chad M; Banna, Kelly M; Willse, Lena Vaughn; Yahyavi-Firouz-Abadi, Noushin; See, Ronald E
2011-08-01
This study examined whether continued access to methamphetamine or food reinforcement changed economic demand for both. The relationship between demand elasticity and cue-induced reinstatement was also determined. Male Long-Evans rats lever-pressed under increasing fixed-ratio requirements for either food pellets or methamphetamine (20 μg/50 μl infusion). For two groups, demand curves were obtained before and after continued access (12 days, 2-h sessions) to the reinforcer under a fixed-ratio 3 schedule. A third group was given continued access to methamphetamine between determinations of food demand and a fourth group abstained from methamphetamine between determinations. All groups underwent extinction sessions, followed by a cue-induced reinstatement test. Although food demand was less elastic than methamphetamine demand, continued access to methamphetamine shifted the methamphetamine demand curve upward and the food demand curve downward. In some rats, methamphetamine demand also became less elastic. Continued access to food had no effect on food demand. Reinstatement was higher after continued access to methamphetamine relative to food. For methamphetamine, elasticity and reinstatement measures were correlated. Continued access to methamphetamine, but not food, alters demand in ways suggestive of methamphetamine accruing reinforcing strength. Demand elasticity thus provides a useful measure of abuse liability that may predict future relapse to renewed drug-seeking and drug use.
Behavioral Economics and Empirical Public Policy
ERIC Educational Resources Information Center
Hursh, Steven R.; Roma, Peter G.
2013-01-01
The application of economics principles to the analysis of behavior has yielded novel insights on value and choice across contexts ranging from laboratory animal research to clinical populations to national trends of global impact. Recent innovations in demand curve methods provide a credible means of quantitatively comparing qualitatively…
Validity of a demand curve measure of nicotine reinforcement with adolescent smokers.
Murphy, James G; MacKillop, James; Tidey, Jennifer W; Brazil, Linda A; Colby, Suzanne M
2011-01-15
High or inelastic demand for drugs is central to many laboratory and theoretical models of drug abuse, but it has not been widely measured with human substance abusers. The authors used a simulated cigarette purchase task to generate a demand curve measure of nicotine reinforcement in a sample of 138 adolescent smokers. Participants reported the number of cigarettes they would purchase and smoke in a hypothetical day across a range of prices, and their responses were well-described by a regression equation that has been used to construct demand curves in drug self-administration studies. Several demand curve measures were generated, including breakpoint, intensity, elasticity, P(max), and O(max). Although simulated cigarette smoking was price sensitive, smoking levels were high (8+ cigarettes/day) at prices up to 50¢ per cigarette, and the majority of the sample reported that they would purchase at least 1 cigarette at prices as high as $2.50 per cigarette. Higher scores on the demand indices O(max) (maximum cigarette purchase expenditure), intensity (reported smoking level when cigarettes were free), and breakpoint (the first price to completely suppress consumption), and lower elasticity (sensitivity of cigarette consumption to increases in cost), were associated with greater levels of naturalistic smoking and nicotine dependence. Greater demand intensity was associated with lower motivation to change smoking. These results provide initial support for the validity of a self-report cigarette purchase task as a measure of economic demand for nicotine with adolescent smokers. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
Lesson on Demand. Lesson Plan.
ERIC Educational Resources Information Center
Weaver, Sue
This lesson plan helps students understand the role consumer demand plays in the market system, i.e., how interactions in the marketplace help determine pricing. Students will participate in an activity that demonstrates the concepts of demand, demand schedule, demand curve, and the law of demand. The lesson plan provides student objectives;…
NASA Technical Reports Server (NTRS)
Donoue, George; Hoffman, Karla; Sherry, Lance; Ferguson, John; Kara, Abdul Qadar
2010-01-01
The air transportation system is a significant driver of the U.S. economy, providing safe, affordable, and rapid transportation. During the past three decades airspace and airport capacity has not grown in step with demand for air transportation; the failure to increase capacity at the same rate as the growth in demand results in unreliable service and systemic delay. This report describes the results of an analysis of airline strategic decision-making that affects geographic access, economic access, and airline finances, extending the analysis of these factors using historic data (from Part 1 of the report). The Airline Schedule Optimization Model (ASOM) was used to evaluate how exogenous factors (passenger demand, airline operating costs, and airport capacity limits) affect geographic access (markets-served, scheduled flights, aircraft size), economic access (airfares), airline finances (profit), and air transportation efficiency (aircraft size). This analysis captures the impact of the implementation of airport capacity limits, as well as the effect of increased hedged fuel prices, which serve as a proxy for increased costs per flight that might occur if auctions or congestion pricing are imposed; also incorporated are demand elasticity curves based on historical data that provide information about how passenger demand is affected by airfare changes.
An analysis of the lumber planning process: Part I
Peter Koch
1955-01-01
Report of study of the peripheral-milling process of planing lumber. Relationships were determined between cutterhead horsepower and various combinations of specimen, cutterhead, and feed factors. Power demand curves are interpreted by comparison with simultaneous one micro-second photos of chips. Secondary consideration is given to quality of surface obtained. The...
An analysis of the lumber planning process: Part II
Peter Koch
1956-01-01
This study is part II of an investigation pertaining to the peripheral-milling process of planing lumber. Some relationships were determined between cutterhead horsepower and various combinations of specimen, cutterhead, and feed factors. Power demand curves were interpreted through comparison with simultaneously taken one micro-second photos of the forming chips....
ERIC Educational Resources Information Center
Brunori, Maurizio
2012-01-01
Before the outbreak of World War II, Jeffries Wyman postulated that the "Bohr effect" in hemoglobin demanded the oxygen linked dissociation of the imidazole of two histidines of the polypeptide. This proposal emerged from a rigorous analysis of the acid-base titration curves of oxy- and deoxy-hemoglobin, at a time when the information on the…
Does the International Substitution Effect Help Explain the Slope of the Aggregate Demand Curve?
ERIC Educational Resources Information Center
Fields, T. Windsor; Elwood, S. Kirk
1998-01-01
Observes that the textbook explanation of the relationship between the international substitution effect and the downward slope of the aggregate demand curve is generally presented uncritically. Argues that the international substitution effect is sufficiently flawed and that it should be eliminated in teaching as a justification for the slope of…
Behavioral economic analysis of demand for fuel in North America.
Reed, Derek D; Partington, Scott W; Kaplan, Brent A; Roma, Peter G; Hursh, Steven R
2013-01-01
Emerging research clearly indicates that human behavior is contributing to climate change, notably, the use of fossil fuels as a form of energy for everyday behaviors. This dependence on oil in North America has led to assertions that the current level of demand is the social equivalent to an "addiction." The purpose of this study was to apply behavioral economic demand curves-a broadly applicable method of evaluating relative reinforcer efficacy in behavioral models of addiction-to North American oil consumption to examine whether such claims of oil addiction are warranted. Toward this end, we examined government data from the United States and Canada on per capita energy consumption for transportation and oil prices between 1995 and 2008. Our findings indicate that consumption either persisted or simultaneously increased despite sharp increases in oil price per barrel over the past decade. © Society for the Experimental Analysis of Behavior.
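A crude way to quantify the inelasticity the authors describe is a log-log regression of consumption on price; the series below are invented for illustration only and are not the government data analyzed in the study.

```python
# Illustrative log-log elasticity estimate for motor fuel from an invented annual series;
# a coefficient near zero corresponds to the highly inelastic (persistent) demand described.
import numpy as np

price = np.array([20, 25, 30, 45, 60, 75, 95], float)               # $/barrel (illustrative)
consumption = np.array([100, 101, 102, 103, 104, 104, 103], float)  # per-capita index (illustrative)

slope, intercept = np.polyfit(np.log(price), np.log(consumption), 1)
print(f"estimated price elasticity ~ {slope:.2f} (near zero -> highly inelastic demand)")
```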
Essays in market power mitigation and supply function equilibrium
NASA Astrophysics Data System (ADS)
Subramainam, Thiagarajah Natchie
Market power mitigation has been an integral part of wholesale electricity markets since deregulation. In wholesale electricity markets, different regions in the US take different approaches to regulating market power. While the exercise of market power has received considerable attention in the literature, the issue of market power mitigation has attracted scant attention. In the first chapter, I examine the market power mitigation rules used in New York ISO (Independent System Operator) and California ISO (CAISO) with respect to day-ahead and real-time energy markets. I test whether markups associated with New York in-city generators would be lower with an alternative approach to mitigation, the CAISO approach. Results indicate the difference in markups between these two mitigation rules is driven by the shape of residual demand curves for suppliers. Analysis of residual demand curves faced by New York in-city suppliers shows similar markups under both mitigation rules when no one supplier is necessary to meet the demand (i.e., when no supplier is pivotal). However, when some supplier is crucial for the market to clear, the mitigation rule adopted by the NYISO consistently leads to higher markups than would the CAISO rule. This result suggests that market power episodes in New York are confined to periods where some supplier is pivotal. As a result, I find that applying the CAISO's mitigation rules to the New York market could lower wholesale electricity prices by 18%. The second chapter of my dissertation focuses on supply function equilibrium. In power markets, suppliers submit offer curves in auctions, indicating their willingness to supply at different price levels. Although firms are allowed to submit different offer curves for different time periods, surprisingly many firms stick to a single offer curve for the entire day. This essentially means that firms are submitting a single offer curve for multiple demand realizations. A suitable framework to analyze such oligopolistic competition between power market suppliers is the supply function equilibrium model. Using detailed bidding data, I develop equilibrium in supply functions by restricting supplier offers to a class of supply functions. By collating equilibrium supply functions corresponding to different realizations of demand, I obtain a single optimal supply function for the entire day. Then I compare the resulting supply function with actual day-ahead offers in New York. In addition to supply function equilibrium, I also develop a conservative bidding approach in which each firm assumes that rivals bid at marginal costs. Results show that the supply functions derived from the equilibrium bidding model in this paper are not consistent with actual bidding in New York. This result is mainly driven by the class of supply functions used in this study to generate the equilibrium. Further, actual offers do not resemble offers generated by the conservative bidding algorithm.
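The link drawn above between residual demand shape and markups follows the inverse-elasticity (Lerner) logic; the sketch below is a textbook illustration with invented elasticities, not an estimate from the NYISO or CAISO data.

```python
# Textbook illustration: a profit-maximizing supplier facing residual demand sets a markup
# satisfying the Lerner rule (p - mc)/p = 1/|elasticity|, so flatter (more elastic) residual
# demand in non-pivotal hours implies smaller markups. Elasticities are invented.
def optimal_markup(elasticity):
    """Lerner index for a single-price supplier on its residual demand curve."""
    return 1.0 / abs(elasticity)

for label, eps in [("non-pivotal hour (elastic residual demand)", -8.0),
                   ("pivotal hour (inelastic residual demand)", -1.5)]:
    print(f"{label}: markup over marginal cost ~ {100 * optimal_markup(eps):.0f}% of price")
```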
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ruth, Mark
2017-07-12
'H2@Scale' is a concept based on the opportunity for hydrogen to act as an intermediate between energy sources and uses. Hydrogen has the potential to be used like the primary intermediate in use today, electricity, because it too is fungible. This presentation summarizes the H2@Scale analysis efforts performed during the first third of 2017. Results of technical potential uses and supply options are summarized and show that the technical potential demand for hydrogen is 60 million metric tons per year and that the U.S. has sufficient domestic resources to meet that demand. A high level infrastructure analysis is also presented that shows an 85% increase in energy on the grid if all hydrogen is produced from grid electricity. However, a preliminary spatial assessment shows that supply is sufficient in most counties across the U.S. The presentation also shows plans for analysis of the economic potential for the H2@Scale concept. Those plans involve developing supply and demand curves for potential hydrogen generation options as compared with other options for use of that hydrogen.
Ellicott Creek Basin, New York. Water Resources Development. Phase 2. Volume 2. Appendices.
1973-08-01
1 HI" . . .. - 1 240 I 200 a N 160 L 1.20 Comper demand curve consumer surplus $56, 400 Io 080 I " 040 10,000 20,000 30,000 40,000 50,000 60,000...Annual Comper - Doys ELLICOTT CREEK NEW YORK 2000 CAMPER DEMAND CURVE AT THE PROPOSED SANDRIDGE RESERVOIR US. ARMY ENGINEER DISTRICT, BUFFALO TO ACCOMPANY
Retiring the Short-Run Aggregate Supply Curve
ERIC Educational Resources Information Center
Elwood, S. Kirk
2010-01-01
The author argues that the aggregate demand/aggregate supply (AD/AS) model is significantly improved--although certainly not perfected--by trimming it of the short-run aggregate supply (SRAS) curve. Problems with the SRAS curve are shown first for the AD/AS model that casts the AD curve as identifying the equilibrium level of output associated…
Rasmussen, Erin B; Reilly, William; Buckley, Jessica; Boomhower, Steven R
2012-02-01
Research on free-food intake suggests that cannabinoids are implicated in the regulation of feeding. Few studies, however, have characterized how environmental factors that affect food procurement interact with cannabinoid drugs that reduce food intake. Demand analysis provides a framework to understand how cannabinoid blockers, such as rimonabant, interact with effort in reducing demand for food. The present study examined the effects of rimonabant on demand for sucrose in obese Zucker rats when the effort to obtain food varied, and characterized the data using the exponential ("essential value") model of demand. Twenty-nine male (15 lean, 14 obese) Zucker rats lever-pressed under eight fixed ratio (FR) schedules of sucrose reinforcement, in which the number of lever-presses to gain access to a single sucrose pellet varied between 1 and 300. After behavior stabilized under each FR schedule, acute doses of rimonabant (1-10 mg/kg) were administered prior to some sessions. The numbers of food reinforcers and responses in each condition were averaged, and the exponential and linear demand equations were fit to the data. These demand equations quantify the value of a reinforcer by its sensitivity to price (FR) increases. Under vehicle conditions, obese Zucker rats consumed more sucrose pellets than lean rats at smaller fixed ratios; however, they were equally sensitive to price increases under both models of demand. Rimonabant dose-dependently reduced reinforcers and responses for lean and obese rats across all FR schedules. Data from the exponential analysis suggest that rimonabant dose-dependently increased elasticity, i.e., reduced the essential value of sucrose, a finding that is consistent with graphical depictions of normalized demand curves. Copyright © 2011 Elsevier Inc. All rights reserved.
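A sketch of how the exponential ("essential value") demand equation of Hursh and Silberberg can be fit to reinforcers earned at each FR price; the consumption values, starting parameters, and fixed range constant k below are assumptions for illustration, not the study's data or settings.

```python
import numpy as np
from scipy.optimize import curve_fit

# Sketch of fitting the exponential ("essential value") demand equation to
# reinforcers earned at each fixed-ratio price. Data values are made up.

def exponential_demand(price, q0, alpha, k=2.0):
    """log10 consumption as a function of price (k = assumed log-range constant)."""
    return np.log10(q0) + k * (np.exp(-alpha * q0 * price) - 1.0)

fr_price = np.array([1, 3, 10, 30, 100, 180, 300], dtype=float)
reinforcers = np.array([120, 110, 95, 70, 35, 18, 6], dtype=float)  # hypothetical

popt, _ = curve_fit(lambda p, q0, a: exponential_demand(p, q0, a),
                    fr_price, np.log10(reinforcers),
                    p0=[120.0, 0.001], bounds=([1.0, 1e-6], [1000.0, 1.0]))
q0_hat, alpha_hat = popt
print(f"Q0 = {q0_hat:.1f}, alpha = {alpha_hat:.5f}")
# A larger alpha means consumption falls faster with price, i.e. lower essential value.
```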
Biogas and Hydrogen Systems Market Assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Milbrandt, Anelia; Bush, Brian; Melaina, Marc
2016-03-31
This analysis provides an overview of the market for biogas-derived hydrogen and its use in transportation applications. It examines the current hydrogen production technologies from biogas, capacity and production, infrastructure, potential and demand, as well as key market areas. It also estimates the production cost of hydrogen from biogas and provides supply curves at a national level and at point source.
ERIC Educational Resources Information Center
Briggs, Laura Clark
2017-01-01
Research on secondary student reading comprehension performance is scant, yet demands for improved literacy at college and career levels indicate that an understanding of trends and growth patterns is necessary to better inform teaching and learning for high school students. To improve understanding of reading performance at the secondary level,…
A Transient Dopamine Signal Represents Avoidance Value and Causally Influences the Demand to Avoid
Pultorak, Katherine J.; Schelp, Scott A.; Isaacs, Dominic P.; Krzystyniak, Gregory
2018-01-01
Abstract While an extensive literature supports the notion that mesocorticolimbic dopamine plays a role in negative reinforcement, recent evidence suggests that dopamine exclusively encodes the value of positive reinforcement. In the present study, we employed a behavioral economics approach to investigate whether dopamine plays a role in the valuation of negative reinforcement. Using rats as subjects, we first applied fast-scan cyclic voltammetry (FSCV) to determine that dopamine concentration decreases with the number of lever presses required to avoid electrical footshock (i.e., the economic price of avoidance). Analysis of the rate of decay of avoidance demand curves, which depict an inverse relationship between avoidance and increasing price, allows for inference of the worth an animal places on avoidance outcomes. Rapidly decaying demand curves indicate increased price sensitivity, or low worth placed on avoidance outcomes, while slow rates of decay indicate reduced price sensitivity, or greater worth placed on avoidance outcomes. We therefore used optogenetics to assess how inducing dopamine release causally modifies the demand to avoid electrical footshock in an economic setting. Increasing release at an avoidance predictive cue made animals more sensitive to price, consistent with a negative reward prediction error (i.e., the animal perceives they received a worse outcome than expected). Increasing release at avoidance made animals less sensitive to price, consistent with a positive reward prediction error (i.e., the animal perceives they received a better outcome than expected). These data demonstrate that transient dopamine release events represent the value of avoidance outcomes and can predictably modify the demand to avoid. PMID:29766047
Influence of air and water temperature on fill characteristics curve
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lefevre, M.R.
1985-01-01
In a previous paper, the author discussed approximations of the Merkel Theory, as well as other approximations included in the CTI recommended method of calculation of the Demand curves. The paper concluded that the familiar difference of enthalpies, used as a cooling potential, which is the Merkel Theory, could continue to be used by simply adding a corrective multiplying factor derived from a direct comparison of the exact theory and the Merkel Theory. At the end of the paper the author briefly showed that the corrections to the Demand curve were only one part of the picture and that there was also an influence of the temperatures on the Characteristic curve side. The object of this paper is to review the influence of the air and water temperature on the Characteristic curve. This completes the work presented last year.
Symptoms of Depression and PTSD are Associated with Elevated Alcohol Demand
Murphy, James G.; Yurasek, Ali M.; Dennhardt, Ashley A.; Skidmore, Jessica R.; McDevitt-Murphy, Meghan E.; MacKillop, James; Martens, Matthew P.
2013-01-01
BACKGROUND Behavioral economic demand curves measure individual differences in motivation for alcohol and have been associated with problematic patterns of alcohol use, but little is known about the variables that may contribute to elevated demand. Negative visceral states have been theorized to increase demand for alcohol and to contribute to excessive drinking patterns, but little empirical research has evaluated this possibility. The present study tested the hypothesis that symptoms of depression and PTSD would be uniquely associated with elevated alcohol demand even after taking into account differences in typical drinking levels. METHOD An Alcohol Purchase Task (APT) was used to generate a demand curve measure of alcohol reinforcement in a sample of 133 college students (50.4% male, 64.4% Caucasian, 29.5% African-American) who reported at least one heavy drinking episode (5/4 or more drinks in one occasion for a man/woman) in the past month. Participants also completed standard measures of alcohol consumption and symptoms of depression and PTSD. RESULTS Regression analyses indicated that symptoms of depression were associated with higher demand intensity (alcohol consumption when price = 0; ΔR2 = .05, p = .002) and lower elasticity (ΔR2 = .04, p = .03), and that PTSD symptoms were associated with all five demand curve metrics (ΔR2 = .04 – .07, ps < .05). CONCLUSIONS These findings provide support for behavioral economic models of addiction that highlight the role of aversive visceral states in increasing the reward value of alcohol and provide an additional theoretical model to explain the association between negative affect and problematic drinking patterns. PMID:22809894
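For readers unfamiliar with the demand indices mentioned here, a small sketch of how the observed (non-model-based) metrics are typically read off Alcohol Purchase Task responses; the price and consumption values are invented for illustration.

```python
import numpy as np

# Sketch of observed demand indices from a hypothetical Alcohol Purchase Task.

prices = np.array([0.0, 0.5, 1, 2, 3, 4, 5, 6, 8, 10, 15, 20])  # $ per drink
drinks = np.array([10, 9, 8, 7, 6, 5, 4, 3, 2, 1, 0, 0])        # drinks purchased

intensity = drinks[0]                       # consumption when price = 0
expenditure = prices * drinks               # spending at each price
omax = expenditure.max()                    # maximum expenditure
pmax = prices[np.argmax(expenditure)]       # price at maximum expenditure
breakpoint_ = prices[drinks == 0].min() if (drinks == 0).any() else np.nan
# breakpoint = first price that suppresses consumption to zero

print(f"intensity={intensity}, Omax={omax:.2f}, Pmax={pmax}, breakpoint={breakpoint_}")
```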
Fuzzy Multi-Objective Transportation Planning with Modified S-Curve Membership Function
NASA Astrophysics Data System (ADS)
Peidro, D.; Vasant, P.
2009-08-01
In this paper, the S-Curve membership function methodology is used in a transportation planning decision (TPD) problem. An interactive method for solving multi-objective TPD problems with fuzzy goals, available supply and forecast demand is developed. The proposed method attempts simultaneously to minimize the total production and transportation costs and the total delivery time with reference to budget constraints and available supply, machine capacities at each source, as well as forecast demand and warehouse space constraints at each destination. We compare in an industrial case the performance of S-curve membership functions, representing uncertainty goals and constraints in TPD problems, with linear membership functions.
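A sketch of a modified S-curve (logistic-type) membership function of the kind used in this line of work; the constants B, C, and the vagueness parameter alpha follow one commonly cited parameterization and should be treated as assumptions rather than the paper's exact values.

```python
import numpy as np

# Illustrative modified S-curve membership function for a fuzzy goal/constraint.
# Constants chosen so membership runs from ~0.999 at the lower bound to ~0.001
# at the upper bound; these are assumptions, not the paper's exact values.

def s_curve_membership(x, x_a, x_b, alpha=13.813, B=1.0, C=0.001001001):
    """Membership degree for x between lower bound x_a and upper bound x_b."""
    x = np.asarray(x, dtype=float)
    t = (x - x_a) / (x_b - x_a)              # rescale to [0, 1]
    mu = B / (1.0 + C * np.exp(alpha * t))   # decreasing S-shaped curve
    return np.clip(mu, 0.0, 1.0)

# Example: membership of a total transportation cost between 100 and 200 units
costs = np.array([100, 120, 150, 180, 200])
print(s_curve_membership(costs, 100, 200))
```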
Henley, Amy J; DiGennaro Reed, Florence D; Reed, Derek D; Kaplan, Brent A
2016-09-01
Incentives are a popular method to achieve desired employee performance; however, research on optimal incentive magnitude is lacking. Behavioral economic demand curves model persistence of responding in the face of increasing cost and may be suitable to examine the reinforcing value of incentives on work performance. The present use-inspired basic study integrated an experiential human operant task within a crowdsourcing platform to evaluate the applicability of behavioral economics for quantifying changes in workforce attrition. Participants included 88 Amazon Mechanical Turk Workers who earned either a $0.05 or $0.10 incentive for completing a progressively increasing response requirement. Analyses revealed statistically significant differences in breakpoint between the two groups. Additionally, a novel translation of the Kaplan-Meier survival-curve analyses for use within a demand curve framework allowed for examination of elasticity of workforce attrition. Results indicate greater inelastic attrition in the $0.05 group. We discuss the benefits of a behavioral economic approach to modeling employee behavior, how the metrics obtained from the elasticity of workforce attrition analyses (e.g., P max ) may be used to set goals for employee behavior while balancing organizational costs, and how economy type may have influenced observed outcomes. © 2016 Society for the Experimental Analysis of Behavior.
Monopoly Output and Welfare: The Role of Curvature of the Demand Function.
ERIC Educational Resources Information Center
Malueg, David A.
1994-01-01
Discusses linear demand functions and constant marginal costs related to a monopoly in a market economy. Illustrates the demand function by using a curve. Includes an appendix with two figures and accompanying mathematical formulae illustrating the concepts presented in the article. (CFR)
Annotated bibliography of economic literature on wetlands
Douglas, Aaron J.
1989-01-01
This bibliography is intended for the use of wetlands scientists, policy analysts, and natural resource professionals who have little acquaintance with natural resource economics, and natural resource professionals who have some background in economic analysis and wish to sharpen their appreciation of the specialized methods used to value the nonmarket uses of wetland resources. It is not intended to serve as a first primer of natural resource economics. The purpose of including this discussion is to introduce the reader to the fact that specialized language and analytic techniques are used in this field, and that summary discussions of these techniques are not available in introductory or intermediate level economics textbooks. A key difficulty in economic analysis lies in the need that economists have to express common-sense terms such as "demand" or "supply" in a precise way; this facilitates the interpretation of data and is a powerful aid in making internally consistent policy analysis. Natural resource economists would like to find a consistent, intuitively plausible measure of the social benefits conferred by some good or service. The most common fallacy noneconomists make in this field is to use expenditures as a measure of well-being or benefits. This measure is defective; expenditures may rise, while benefits fall. The following simple example should clarify the issue. Suppose that a certain population center, in the 1940's, is located 5 miles from a riverine recreation site. Suppose that a factory opens up 15 miles away from the site during the 1950's, and closes at the end of the 1960's; and that during this 20-year period, the bulk of this region's populace resides 15 miles from the site, close to the factory. In the 1970's, the populace of the region returns to the old population center, 5 miles from the recreation site. The benefits conferred by the site diminished during the 1950's and 1960's, even though travel (and even total) expenditures associated with the use of the site may have risen during this period. Both intuition and formal analysis suggest that accurate estimates of benefits conferred by a good or service provide quantitative indices of the availability of good substitutes for the good or service in question. The fewer the low-priced substitutes, the greater the benefits conferred by the good. The prices (quantities) of available substitutes may be needed to specify empirically estimated demand (supply) curves. If so, omission of these variables will produce biased estimates of net benefits conferred if the approach used to estimate social benefits is based on the shape and position of an empirically estimated demand (supply) curve. In general, a good grasp of the meaning of both demand and supply (curves) is needed to produce sound estimates of benefits conferred by some commodity. Demand and supply curves are discussed in the following section that deals with various techniques for estimating benefits conferred by outdoor recreation sites. With wetlands functions and resources, the divergence between large total values and very low (zero) marginal values lies behind much of the controversy as to the appropriate procedure for imputing values to wetlands preservation benefits. If some type of wetland habitat is not a limiting factor in the production of some target wildlife species, the marginal value product of that habitat type is zero.
The total social marginal product of the wetlands habitat type or complex may be very large, but if the removal of the last unit does not diminish total output, reallocating the land to more valuable economic activities increases social welfare.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Byers, Conleigh; Levin, Todd; Botterud, Audun
A review of capacity markets in the United States in the context of increasing levels of variable renewable energy finds substantial differences with respect to incentives for operational performance, methods to calculate qualifying capacity for variable renewable energy and energy storage, and demand curves for capacity. The review also reveals large differences in historical capacity market clearing prices. The authors conclude that electricity market design must continue to evolve to achieve cost-effective policies for resource adequacy.
Air pollution: Household soiling and consumer welfare losses
Watson, W.D.; Jaksch, J.A.
1982-01-01
This paper uses demand and supply functions for cleanliness to estimate household benefits from reduced particulate matter soiling. A demand curve for household cleanliness is estimated, based upon the assumption that households prefer more cleanliness to less. Empirical coefficients, related to particulate pollution levels, for shifting the cleanliness supply curve, are taken from available studies. Consumer welfare gains, aggregated across 123 SMSAs, from achieving the Federal primary particulate standard, are estimated to range from $0.9 to $3.2 million per year (1971 dollars). © 1982.
Caulkins, Jonathan P; Kilmer, Beau; MacCoun, Robert J; Pacula, Rosalie Liccardo; Reuter, Peter
2012-05-01
No modern jurisdiction has ever legalized commercial production, distribution and possession of cannabis for recreational purposes. This paper presents insights about the effect of legalization on production costs and consumption and highlights important design choices. Insights were uncovered through our analysis of recent legalization proposals in California. The effect on the cost of producing cannabis is largely based on existing estimates of current wholesale prices, current costs of producing cannabis and other legal agricultural goods, and the type(s) of production that will be permitted. The effect on consumption is based on production costs, regulatory regime, tax rate, price elasticity of demand, shape of the demand curve and non-price effects (e.g. change in stigma). Removing prohibitions on producing and distributing cannabis will dramatically reduce wholesale prices. The effect on consumption and tax revenues will depend on many design choices, including: the tax level, whether there is an incentive for a continued black market, whether to tax and/or regulate cannabinoid levels, whether there are allowances for home cultivation, whether advertising is restricted, and how the regulatory system is designed and adjusted. The legal production costs of cannabis will be dramatically below current wholesale prices, enough so that taxes and regulation will be insufficient to raise retail price to prohibition levels. We expect legalization will increase consumption substantially, but the size of the increase is uncertain since it depends on design choices and the unknown shape of the cannabis demand curve. © 2011 The Authors, Addiction © 2011 Society for the Study of Addiction.
Scaling Reward Value with Demand Curves versus Preference Tests
Schwartz, Lindsay P.; Silberberg, Alan; Casey, Anna H.; Paukner, Annika; Suomi, Stephen J.
2016-01-01
In Experiment 1, six capuchins lifted a weight during a 10-minute session to receive a food piece. Across conditions, the weight was increased across six different amounts for three different food types. The number of food pieces obtained as a function of the weight lifted was fitted by a demand equation that is hypothesized to quantify food value. For most subjects, this analysis showed that the three food types differed little in value. In Experiment 2, these monkeys were given pairwise choices among these food types. In 13 of 18 comparisons, preferences at least equaled a 3-to-1 ratio; in seven comparisons, preference was absolute. There was no relation between values based on degree of preference versus values based on the demand equation. When choices in the present report were compared to similar data with these subjects from another study, between-study lability in preference emerged. This outcome contrasts with the finding in demand analysis that test-retest reliability is high. We attribute the unreliability and extreme assignment of value based on preference tests to high substitutability between foods. We suggest use of demand analysis instead of preference tests for studies that compare the values of different foods. A better strategy might be to avoid manipulating value by using different foods. Where possible, value should be manipulated by varying amounts of a single food type because, over an appropriate range, more food is consistently more valuable than less. Such an approach would be immune to problems in between-food substitutability. PMID:26908005
Basu, Anirban
2011-01-01
The United States aspires to use information from comparative effectiveness research (CER) to reduce waste and contain costs without instituting a formal rationing mechanism or compromising patient or physician autonomy with regard to treatment choices. With such ambitious goals, traditional combinations of research designs and analytical methods used in CER may lead to disappointing results. In this paper, I study how alternate regimes of comparative effectiveness information help shape the marginal benefits (demand) curve in the population and how such perceived demand curves impact decision-making at the individual patient level and welfare at the societal level. I highlight the need to individualize comparative effectiveness research in order to generate the true (normative) demand curve for treatments. I discuss methodological principles that guide research designs for such studies. Using an example of the comparative effect of substance abuse treatments on crime, I use novel econometric methods to salvage individualized information from an existing dataset. PMID:21601299
A Method for Formulizing Disaster Evacuation Demand Curves Based on SI Model
Song, Yulei; Yan, Xuedong
2016-01-01
The prediction of evacuation demand curves is a crucial step in the disaster evacuation plan making, which directly affects the performance of the disaster evacuation. In this paper, we discuss the factors influencing individual evacuation decision making (whether and when to leave) and summarize them into four kinds: individual characteristics, social influence, geographic location, and warning degree. In the view of social contagion of decision making, a method based on Susceptible-Infective (SI) model is proposed to formulize the disaster evacuation demand curves to address both social influence and other factors’ effects. The disaster event of the “Tianjin Explosions” is used as a case study to illustrate the modeling results influenced by the four factors and perform the sensitivity analyses of the key parameters of the model. Some interesting phenomena are found and discussed, which is meaningful for authorities to make specific evacuation plans. For example, due to the lower social influence in isolated communities, extra actions might be taken to accelerate evacuation process in those communities. PMID:27735875
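A minimal sketch of how an SI-type contagion model can generate an evacuation demand curve, with "infection" standing for the decision to evacuate; the parameters are illustrative and not those calibrated to the Tianjin case in the paper.

```python
import numpy as np

# SI-type evacuation demand curve: the decision to evacuate spreads by social
# influence from those who have already decided (I) to those who have not (S).

def si_evacuation_curve(beta, population, initially_decided, hours, dt=0.1):
    """Forward-Euler integration of dI/dt = beta * S * I / N; returns cumulative evacuees."""
    n_steps = int(hours / dt)
    S = population - initially_decided
    I = float(initially_decided)
    times, evacuated = [0.0], [I]
    for step in range(1, n_steps + 1):
        new = beta * S * I / population * dt   # newly decided in this step
        new = min(new, S)
        S -= new
        I += new
        times.append(step * dt)
        evacuated.append(I)
    return np.array(times), np.array(evacuated)

t, demand = si_evacuation_curve(beta=0.8, population=10_000, initially_decided=50, hours=24)
print(f"evacuation demand after 12 h: {np.interp(12, t, demand):.0f} people")
```

Lowering beta (weaker social influence, as in the isolated communities mentioned above) flattens and delays the resulting S-shaped demand curve.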
Projecting Electricity Demand in 2050
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hostick, Donna J.; Belzer, David B.; Hadley, Stanton W.
2014-07-01
This paper describes the development of end-use electricity projections and load curves that were developed for the Renewable Electricity (RE) Futures Study (hereafter RE Futures), which explored the prospect of higher percentages (30%-90%) of total electricity generation that could be supplied by renewable sources in the United States. As input to RE Futures, two projections of electricity demand were produced, representing reasonable upper and lower bounds of electricity demand out to 2050. The electric sector models used in RE Futures required underlying load profiles, so RE Futures also produced load profile data in two formats: 8760 hourly data for the year 2050 for the GridView model, and in 2-year increments for 17 time slices as input to the Regional Energy Deployment System (ReEDS) model. The process for developing demand projections and load profiles involved three steps: discussion regarding the scenario approach and general assumptions, literature reviews to determine readily available data, and development of the demand curves and load profiles.
Frameworks for amending reservoir water management
Mower, Ethan; Miranda, Leandro E.
2013-01-01
Managing water storage and withdrawals in many reservoirs requires establishing seasonal targets for water levels (i.e., rule curves) that are influenced by regional precipitation and diverse water demands. Rule curves are established as an attempt to balance various water needs such as flood control, irrigation, and environmental benefits such as fish and wildlife management. The processes and challenges associated with amending rule curves to balance multiuse needs are complicated and mostly unfamiliar to non-US Army Corps of Engineers (USACE) natural resource managers and to the public. To inform natural resource managers and the public we describe the policies and process involved in amending rule curves in USACE reservoirs, including 3 frameworks: a general investigation, a continuing authority program, and the water control plan. Our review suggests that water management in reservoirs can be amended, but generally a multitude of constraints and competing demands must be addressed before such a change can be realized.
ERIC Educational Resources Information Center
Wade-Galuska, Tammy; Galuska, Chad M.; Winger, Gail
2011-01-01
Choice procedures have indicated that the relative reinforcing effectiveness of opioid drugs increases during opioid withdrawal. The demand curve, an absolute measure of reinforcer value, has not been applied to this question. The present study assessed whether mild morphine withdrawal would increase demand for or choice of remifentanil or…
Probabilistic Cross-identification of Cosmic Events
NASA Astrophysics Data System (ADS)
Budavári, Tamás
2011-08-01
I discuss a novel approach to identifying cosmic events in separate and independent observations. The focus is on the true events, such as supernova explosions, that happen once and, hence, whose measurements are not repeatable. Their classification and analysis must make the best use of all available data. Bayesian hypothesis testing is used to associate streams of events in space and time. Probabilities are assigned to the matches by studying their rates of occurrence. A case study of Type Ia supernovae illustrates how to use light curves in the cross-identification process. Constraints from realistic light curves happen to be well approximated by Gaussians in time, which makes the matching process very efficient. Model-dependent associations are computationally more demanding but can further boost one's confidence.
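A sketch of the Gaussian time-domain matching described above: two detections with times t1, t2 and timing uncertainties s1, s2 are compared under a "same event" hypothesis versus "unrelated events anywhere in a survey window T". The flat prior of width T is our assumption for illustration; the paper itself assigns probabilities via the rates of occurrence of events.

```python
import numpy as np

# Marginal likelihood ratio for "same event" vs "unrelated events", assuming
# Gaussian timing errors and a flat prior of width T for the (unknown) true time.

def match_bayes_factor(t1, s1, t2, s2, window_T):
    var = s1**2 + s2**2
    return window_T / np.sqrt(2 * np.pi * var) * np.exp(-0.5 * (t1 - t2)**2 / var)

# Example: detections 0.8 days apart, each timed to +/- 1 day, 100-day survey window
print(f"Bayes factor: {match_bayes_factor(10.0, 1.0, 10.8, 1.0, 100.0):.1f}")
```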
What Mathematical Competencies Are Needed for Success in College.
ERIC Educational Resources Information Center
Garofalo, Joe
1990-01-01
Identifies requisite math skills for a microeconomics course, offering samples of supply curves, demand curves, equilibrium prices, elasticity, and complex graph problems. Recommends developmental mathematics competencies, including problem solving, reasoning, connections, communication, number and operation sense, algebra, relationships,…
Ramirez, Jason J.; Dennhardt, Ashley A.; Baldwin, Scott A.; Murphy, James G.; Lindgren, Kristen P.
2016-01-01
Behavioral economic demand curve indices of alcohol consumption reflect decisions to consume alcohol at varying costs. Although these indices predict alcohol-related problems beyond established predictors, little is known about the determinants of elevated demand. Two cognitive constructs that may underlie alcohol demand are alcohol-approach inclinations and drinking identity. The aim of this study was to evaluate implicit and explicit measures of these constructs as predictors of alcohol demand curve indices. College student drinkers (N = 223, 59% female) completed implicit and explicit measures of drinking identity and alcohol-approach inclinations at three timepoints separated by three-month intervals, and completed the Alcohol Purchase Task to assess demand at Time 3. Given no change in our alcohol-approach inclinations and drinking identity measures over time, random intercept-only models were used to predict two demand indices: Amplitude, which represents maximum hypothetical alcohol consumption and expenditures, and Persistence, which represents sensitivity to increasing prices. When modeled separately, implicit and explicit measures of drinking identity and alcohol-approach inclinations positively predicted demand indices. When implicit and explicit measures were included in the same model, both measures of drinking identity predicted Amplitude, but only explicit drinking identity predicted Persistence. In contrast, explicit measures of alcohol-approach inclinations, but not implicit measures, predicted both demand indices. Therefore, there was more support for explicit, versus implicit, measures as unique predictors of alcohol demand. Overall, drinking identity and alcohol-approach inclinations both exhibit positive associations with alcohol demand and represent potentially modifiable cognitive constructs that may underlie elevated demand in college student drinkers. PMID:27379444
Area Under the Curve as a Novel Metric of Behavioral Economic Demand for Alcohol
Amlung, Michael; Yurasek, Ali; McCarty, Kayleigh N.; MacKillop, James; Murphy, James G.
2015-01-01
Behavioral economic purchase tasks can be readily used to assess demand for a number of addictive substances including alcohol, tobacco and illicit drugs. However, several methodological limitations associated with the techniques used to quantify demand may reduce the utility of demand measures. In the present study, we sought to introduce area under the curve (AUC), commonly used to quantify degree of delay discounting, as a novel index of demand. A sample of 207 heavy drinking college students completed a standard alcohol purchase task and provided information about typical weekly drinking patterns and alcohol-related problems. Level of alcohol demand was quantified using AUC – which reflects the entire amount of consumption across all drink prices - as well as the standard demand indices (e.g., intensity, breakpoint, Omax, Pmax, and elasticity). Results indicated that AUC was significantly correlated with each of the other demand indices (rs = .42–.92), with particularly strong associations with Omax (r = .92). In regression models, AUC and intensity were significant predictors of weekly drinking quantity and AUC uniquely predicted alcohol-related problems, even after controlling for drinking level. In a parallel set of analyses, Omax also predicted drinking quantity and alcohol problems, although Omax was not a unique predictor of the latter. These results offer initial support for using AUC as an index of alcohol demand. Additional research is necessary to further validate this approach and to examine its utility in quantifying demand for other addictive substances such as tobacco and illicit drugs. PMID:25895013
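A sketch of the AUC computation on a normalized demand curve, directly analogous to the discounting AUC the authors borrow; the price and consumption values are hypothetical.

```python
import numpy as np

# Trapezoidal area under a normalized demand curve from a purchase task.

prices = np.array([0.0, 0.5, 1, 2, 3, 4, 5, 6, 8, 10, 15, 20])
drinks = np.array([10, 9, 8, 7, 6, 5, 4, 3, 2, 1, 0, 0], dtype=float)

# Normalize both axes to [0, 1] so AUC is comparable across participants
x = prices / prices.max()
y = drinks / drinks[0]          # proportion of consumption at zero price

auc = np.trapz(y, x)
print(f"normalized demand AUC = {auc:.3f}")   # 1.0 would be perfectly inelastic demand
```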
Regional price targets appropriate for advanced coal extraction
NASA Technical Reports Server (NTRS)
Terasawa, K. L.; Whipple, D. M.
1980-01-01
A methodology is presented for predicting coal prices in regional markets for the target time frames 1985 and 2000 that could subsequently be used to guide the development of an advanced coal extraction system. The model constructed is a supply and demand model that focuses on underground mining since the advanced technology is expected to be developed for these reserves by the target years. Coal reserve data and the cost of operating a mine are used to obtain the minimum acceptable selling price that would induce the producer to bring the mine into production. Based on this information, market supply curves can be generated. Demand by region is calculated based on an EEA methodology that emphasizes demand by electric utilities and demand by industry. The demand and supply curves are then used to obtain the price targets. The results show a growth in the size of the markets for compliance and low sulphur coal regions. A significant rise in the real price of coal is not expected even by the year 2000. The model predicts heavy reliance on mines with thick seams, larger block size and deep overburden.
NASA Astrophysics Data System (ADS)
Gao, Guoyou; Jiang, Chunsheng; Chen, Tao; Hui, Chun
2018-05-01
Industrial robots are widely used in various processes of surface manufacturing, such as thermal spraying. Established robot programming methods are highly time-consuming and not accurate enough to meet current market demands, and many off-line programming methods have been developed to reduce the robot programming effort. This work introduces the principles of several robot trajectory generation strategies for planar and curved surfaces. Since off-line programming software is widely used, facilitating robot programming and improving the accuracy of the robot trajectory, the analysis in this work is based on secondary development of the off-line programming software RobotStudio™. To meet the requirements of the automotive paint industry, this kind of software extension provides special functions according to user-defined operation parameters. The presented planning strategy generates the robot trajectory by moving an orthogonal surface according to the information of the coating surface; a series of intersection curves is then used to generate the trajectory points. The simulation results show that the path curve created with this method is continuous and smooth, which meets the requirements of automotive spray industrial applications.
NASA Astrophysics Data System (ADS)
Donier, J.; Bouchaud, J.-P.
2016-12-01
In standard Walrasian auctions, the price of a good is defined as the point where the supply and demand curves intersect. Since both curves are generically regular, the response to small perturbations is linearly small. However, a crucial ingredient is absent from the theory, namely transactions themselves. What happens after they occur? To answer the question, we develop a dynamic theory for supply and demand based on agents with heterogeneous beliefs. When the inter-auction time is infinitely long, the Walrasian mechanism is recovered. When transactions are allowed to happen in continuous time, a peculiar property emerges: close to the price, supply and demand vanish quadratically, which we empirically confirm on Bitcoin. This explains why price impact in financial markets is universally observed to behave as the square root of the excess volume. The consequences are important, as they imply that the very fact of clearing the market makes prices hypersensitive to small fluctuations.
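The link between quadratic vanishing and square-root impact can be made explicit in one line; the sketch below uses our own notation (a liquidity slope L), not necessarily the paper's.

```latex
% Sketch: if the marginal supply/demand density vanishes linearly around p*,
% the cumulative curves vanish quadratically and impact is square-root in volume.
\[
  \rho(p) \simeq L\,|p - p^\ast|
  \quad\Longrightarrow\quad
  Q(\Delta p) = \int_{p^\ast}^{p^\ast+\Delta p} \rho(p)\,dp \simeq \tfrac{L}{2}\,(\Delta p)^2
  \quad\Longrightarrow\quad
  \Delta p \simeq \sqrt{2Q/L} \;\propto\; \sqrt{Q}.
\]
```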
A probabilistic seismic risk assessment procedure for nuclear power plants: (I) Methodology
Huang, Y.-N.; Whittaker, A.S.; Luco, N.
2011-01-01
A new procedure for probabilistic seismic risk assessment of nuclear power plants (NPPs) is proposed. This procedure modifies current procedures using tools developed recently for performance-based earthquake engineering of buildings. The proposed procedure uses (a) response-based fragility curves to represent the capacity of structural and nonstructural components of NPPs, (b) nonlinear response-history analysis to characterize the demands on those components, and (c) Monte Carlo simulations to determine the damage state of the components. The use of response- rather than ground-motion-based fragility curves enables the curves to be independent of seismic hazard and closely related to component capacity. The use of the Monte Carlo procedure enables the correlation in the responses of components to be directly included in the risk assessment. An example of the methodology is presented in a companion paper to demonstrate its use and provide the technical basis for aspects of the methodology. © 2011 Published by Elsevier B.V.
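A sketch of the Monte Carlo damage-state step using a lognormal response-based fragility curve; the demand distribution, median capacity, and dispersion below are assumed values for illustration, not those of the companion paper.

```python
import numpy as np
from scipy.stats import norm

# Monte Carlo damage-state sampling with a lognormal fragility curve:
# P(damage | demand d) = Phi( ln(d / median_capacity) / beta ).

rng = np.random.default_rng(1)

def fragility(demand, median_capacity, beta):
    """Probability of reaching the damage state given a component demand."""
    return norm.cdf(np.log(demand / median_capacity) / beta)

# Hypothetical component demands (e.g., peak accelerations in g) from
# nonlinear response-history analyses
demands = rng.lognormal(mean=np.log(0.6), sigma=0.4, size=10_000)

p_damage = fragility(demands, median_capacity=1.0, beta=0.35)
damaged = rng.random(10_000) < p_damage        # damage-state draw per simulation

print(f"estimated damage probability: {damaged.mean():.3f}")
```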
Export product diversification and the environmental Kuznets curve: evidence from Turkey.
Gozgor, Giray; Can, Muhlis
2016-11-01
Countries try to stabilize the demand for energy on one hand and sustain economic growth on the other, but worsening global warming and climate change problems have put pressure on them. This paper estimates the environmental Kuznets curve over the period 1971-2010 in Turkey, both in the short and the long run. For this purpose, the unit root test with structural breaks and the cointegration analysis with multiple endogenous structural breaks are used. The effects of energy consumption and export product diversification on CO2 emissions are also controlled for in the dynamic empirical models. It is observed that the environmental Kuznets curve hypothesis is valid in Turkey in both the short run and the long run. A positive effect of energy consumption on CO2 emissions is also obtained in the long run. In addition, it is found that greater product diversification of exports yields higher CO2 emissions in the long run. Inferences and policy implications are also discussed.
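A sketch of the reduced-form EKC regression implied here: log emissions on log income and its square, with energy use as a control, followed by the sign check and turning point. The data are simulated, not the Turkish series, and the estimator ignores the unit-root and structural-break machinery used in the paper.

```python
import numpy as np

# Reduced-form EKC check on simulated data: ln(CO2) on ln(GDP), ln(GDP)^2, ln(energy).

rng = np.random.default_rng(0)
ln_gdp = rng.uniform(7, 10, 40)
ln_energy = 0.5 * ln_gdp + rng.normal(0, 0.1, 40)
ln_co2 = -20 + 5.0 * ln_gdp - 0.28 * ln_gdp**2 + 0.3 * ln_energy + rng.normal(0, 0.05, 40)

X = np.column_stack([np.ones_like(ln_gdp), ln_gdp, ln_gdp**2, ln_energy])
beta, *_ = np.linalg.lstsq(X, ln_co2, rcond=None)

b1, b2 = beta[1], beta[2]
print(f"beta1={b1:.2f}, beta2={b2:.2f}  (EKC requires beta1 > 0, beta2 < 0)")
if b2 < 0:
    print(f"turning point at ln(income) = {-b1 / (2 * b2):.2f}")
```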
Integrated hydrologic modeling of a transboundary aquifer system —Lower Rio Grande
Hanson, Randall T.; Schmid, Wolfgang; Knight, Jacob E.; Maddock, Thomas
2013-01-01
For more than 30 years, the agreements developed for the aquifer systems of the lower Rio Grande and related river compacts of the Rio Grande have evolved into a complex setting of transboundary conjunctive use. The conjunctive use now includes many facets of water rights, water use, and emerging demands between the states of New Mexico and Texas, the United States and Mexico, and various water-supply agencies. The analysis of the complex relations between irrigation and streamflow supply-and-demand components and the effects of surface-water and groundwater use requires an integrated hydrologic model to track all of the use and movement of water. MODFLOW with the Farm Process (MF-FMP) provides the integrated approach needed to assess the stream-aquifer interactions that are dynamically affected by irrigation demands on streamflow allotments that are supplemented with groundwater pumpage. As a first step toward the ongoing full implementation of MF-FMP by the USGS, the existing model (LRG_2007) was modified to include some FMP features, demonstrating the ability to simulate the existing streamflow-diversion relations known as the D2 and D3 curves, the departure of downstream deliveries from these curves during low-allocation years and with increasing efficiency upstream, and the dynamic relation between surface-water conveyance and estimates of pumpage and recharge. This new MF-FMP modeling framework can now internally analyze complex relations within the Lower Rio Grande Hydrologic Model (LRGHM_2011) that previous techniques had limited ability to assess.
The Aggregate Demand Curve and Its Proper Interpretation.
ERIC Educational Resources Information Center
Hansen, Richard B.; And Others
1985-01-01
Textbook authors, presenting aggregate demand-aggregate supply (AD-AS), are admonished to set their houses in order. The traditional Keynesian cross model should continue to be used as a pedagogical device. A version superior to the AD-AS models found in many texts is presented. (Author/RM)
Kumar, Keshav
2017-11-01
Multivariate curve resolution alternating least squares (MCR-ALS) analysis is the most commonly used curve resolution technique. The MCR-ALS model is fitted using the alternating least squares (ALS) algorithm, which needs initialisation of either the contribution profiles or the spectral profiles of each factor. The contribution profiles can be initialised using evolving factor analysis; however, in principle, this approach requires that the data belong to a sequential process. The initialisation of the spectral profiles is usually carried out using a pure-variable approach such as the SIMPLISMA algorithm; this approach demands that each factor have pure variables in the data set. Despite these limitations, the existing approaches have been quite successful for initiating the MCR-ALS analysis. However, the present work proposes an alternative approach for the initialisation of the spectral variables by generating random values within the limits spanned by the maximum and minimum of each spectral variable of the data set. The proposed approach does not require that there be pure variables for each component of the multicomponent system or that the concentration direction follow a sequential process. The proposed approach is successfully validated using excitation-emission matrix fluorescence data sets acquired for certain fluorophores with significant spectral overlap. The calculated contribution and spectral profiles of these fluorophores are found to correlate well with the experimental results. In summary, the present work proposes an alternative way to initiate the MCR-ALS analysis.
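A sketch of the proposed initialisation: each spectral variable of the initial profiles is drawn uniformly between the minimum and maximum observed for that variable in the data matrix. Variable names and dimensions are ours, not the author's code.

```python
import numpy as np

# Random initial spectral profiles, channel by channel, within the column-wise
# min/max of the data matrix D (samples x spectral variables).

def random_spectral_initialisation(D, n_components, seed=0):
    rng = np.random.default_rng(seed)
    col_min = D.min(axis=0)     # per-variable minimum
    col_max = D.max(axis=0)     # per-variable maximum
    # shape (n_components, n_variables); each entry lies within its column's range
    return rng.uniform(col_min, col_max, size=(n_components, D.shape[1]))

# Example with a small simulated EEM-like data matrix
D = np.abs(np.random.default_rng(1).normal(size=(30, 200)))
S0 = random_spectral_initialisation(D, n_components=3)
print(S0.shape)   # (3, 200) -> supply as the initial spectral matrix to the ALS loop
```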
Teaching Economic Principles Interactively: A Cannibal's Dinner Party
ERIC Educational Resources Information Center
Bergstrom, Theodore C.
2009-01-01
The author describes techniques that he uses to interactively teach economics principles. He describes an experiment on market entry and gives examples of applications of classroom clickers. Clicker applications include (a) collecting data about student preferences that can be used to construct demand curves and supply curves, (b) checking…
Winning the War on Drugs - An Economic Perspective
1990-05-01
consumer demand for illegal drugs is influenced by the same market mechanisms that influence consumer behavior towards legal goods. In this regard...increased. (19:23-45; 20:2694) From the consumer behavior described, it can be generally concluded that the demand curves for cocaine use can be ...mechanisms currently being employed. A closer examination of the real character of cocaine demand might reveal which market mechanisms influence consumer
Two Propositions on the Application of Point Elasticities to Finite Price Changes.
ERIC Educational Resources Information Center
Daskin, Alan J.
1992-01-01
Considers counterintuitive propositions about using point elasticities to estimate quantity changes in response to price changes. Suggests that elasticity increases with price along a linear demand curve, but that the falling quantity demanded offsets it. Argues that point elasticity with a finite percentage change in price only approximates percentage change…
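The first proposition can be written out directly; the algebra below uses a generic linear demand curve and is our illustration, not the article's notation.

```latex
\[
  Q = a - bP
  \quad\Longrightarrow\quad
  \varepsilon(P) = \frac{dQ}{dP}\cdot\frac{P}{Q} = \frac{-bP}{\,a - bP\,},
\]
% so |epsilon| grows from 0 at P = 0 toward infinity as P -> a/b: elasticity
% rises with price along a linear demand curve even though the slope -b is
% constant, which is why a point elasticity only approximates the effect of a
% finite price change.
```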
The Curvilinear Relationship between State Neuroticism and Momentary Task Performance
Debusscher, Jonas; Hofmans, Joeri; De Fruyt, Filip
2014-01-01
A daily diary and two experience sampling studies were carried out to investigate curvilinearity of the within-person relationship between state neuroticism and task performance, as well as the moderating effects of within-person variation in momentary job demands (i.e., work pressure and task complexity). In one, results showed that under high work pressure, the state neuroticism–task performance relationship was best described by an exponentially decreasing curve, whereas an inverted U-shaped curve was found for tasks low in work pressure, while in another study, a similar trend was visible for task complexity. In the final study, the state neuroticism–momentary task performance relationship was a linear one, and this relationship was moderated by momentary task complexity. Together, results from all three studies showed that it is important to take into account the moderating effects of momentary job demands because within-person variation in job demands affects the way in which state neuroticism relates to momentary levels of task performance. Specifically, we found that experiencing low levels of state neuroticism may be most beneficial in high demanding tasks, whereas more moderate levels of state neuroticism are optimal under low momentary job demands. PMID:25238547
Preequating with Empirical Item Characteristic Curves: An Observed-Score Preequating Method
ERIC Educational Resources Information Center
Zu, Jiyun; Puhan, Gautam
2014-01-01
Preequating is in demand because it reduces score reporting time. In this article, we evaluated an observed-score preequating method: the empirical item characteristic curve (EICC) method, which makes preequating without item response theory (IRT) possible. EICC preequating results were compared with a criterion equating and with IRT true-score…
Water quality management using statistical analysis and time-series prediction model
NASA Astrophysics Data System (ADS)
Parmar, Kulwinder Singh; Bhardwaj, Rashmi
2014-12-01
This paper deals with water quality management using statistical analysis and a time-series prediction model. The monthly variation of water quality standards has been used to compare the statistical mean, median, mode, standard deviation, kurtosis, skewness, and coefficient of variation at the Yamuna River. The model was validated using R-squared, root mean square error, mean absolute percentage error, maximum absolute percentage error, mean absolute error, maximum absolute error, normalized Bayesian information criterion, Ljung-Box analysis, predicted values, and confidence limits. Using an autoregressive integrated moving average (ARIMA) model, future values of the water quality parameters have been estimated. It is observed that the predictive model is useful at 95% confidence limits, and that the distribution is platykurtic for potential of hydrogen (pH), free ammonia, total Kjeldahl nitrogen, dissolved oxygen, and water temperature (WT), and leptokurtic for chemical oxygen demand and biochemical oxygen demand. It is also observed that the predicted series is close to the original series, which indicates a very good fit. All parameters except pH and WT cross the prescribed limits of the World Health Organization/United States Environmental Protection Agency, and thus the water is not fit for drinking, agricultural, or industrial use.
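A sketch of the ARIMA forecasting step with confidence limits, using statsmodels; the simulated series and the (1, 1, 1) order are assumptions, not the series or model order selected in the paper.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# Fit an ARIMA model to a simulated monthly water-quality series and forecast
# 12 months ahead with 95 % confidence limits.

rng = np.random.default_rng(0)
months = pd.date_range("2000-01", periods=120, freq="MS")
bod = pd.Series(5 + np.cumsum(rng.normal(0, 0.2, 120)), index=months)  # mg/L, simulated

res = ARIMA(bod, order=(1, 1, 1)).fit()
forecast = res.get_forecast(steps=12)
print(forecast.predicted_mean.round(2))
print(forecast.conf_int(alpha=0.05).round(2))   # 95 % confidence limits
```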
NASA Technical Reports Server (NTRS)
Sherry, Lance; Ferguson, John; Hoffman, Karla; Donohue, George; Beradino, Frank
2012-01-01
This report describes the Airline Fleet, Route, and Schedule Optimization Model (AFRS-OM), which is designed to provide insights into airline decision-making with regard to markets served, the schedule of flights in these markets, the type of aircraft assigned to each scheduled flight, load factors, airfares, and airline profits. The main inputs to the model are hedged fuel prices, airport capacity limits, and candidate markets. Embedded in the model are aircraft performance and associated cost factors, and willingness-to-pay (i.e., demand vs. airfare) curves. Case studies demonstrate the application of the model to analysis of the effects of increased capacity and changes in operating costs (e.g., fuel prices). Although there are differences between airports (due to differences in the magnitude of travel demand and sensitivity to airfare), the system is more sensitive to changes in fuel prices than to changes in capacity. Further, the benefits of modernization in the form of increased capacity could be undermined by increases in hedged fuel prices.
Factors associated with the relationship between motorcycle deaths and economic growth.
Law, Teik Hua; Noland, Robert B; Evans, Andrew W
2009-03-01
This paper examines the Kuznets curve relationship for motorcycle deaths. The Kuznets curve describes the inverted U-shape relationship between economic development and, in this case, motorcycle deaths. In early stages of development we expect deaths to increase with increasing motorization. Eventually deaths decrease as technical, policy and political institutions respond to demands for increased safety. We examine this effect as well as some of the factors which might explain the Kuznets relationship: in particular motorcycle helmet laws, medical care and technology improvements, and variables representing the quality of political institutions. We apply a fixed effects negative binomial regression analysis on a panel of 25 countries covering the period 1970-1999. Our results broadly suggest that implementation of road safety regulation, improvement in the quality of political institutions, and medical care and technology developments have contributed to reduced motorcycle deaths.
Faculty Demand in Higher Education
ERIC Educational Resources Information Center
Rosenthal, Danielle
2007-01-01
The objective of this study is to identify the factors that shift the demand curve for faculty at not-for-profit private institutions. It is unique in that to the author's knowledge no other study has directly addressed the question of how the positive correlation between average faculty salaries and faculty-student ratios can be reconciled with…
40 CFR 1033.140 - Rated power.
Code of Federal Regulations, 2011 CFR
2011-07-01
... value to the nearest whole horsepower. Generally, this will be the brake power of the engine in notch 8... each possible operator demand setpoint or “notch”. See 40 CFR 1065.1001 for the definition of operator... discrete operator demand setpoints, or notches, the nominal power curve would be a series of eight power...
Using Empirical Point Elasticities To Teach Tax Incidence.
ERIC Educational Resources Information Center
Swinton, John R.; Thomas, Christopher R.
2001-01-01
Advocates use of point elasticities rather than arc elasticities or slopes of demand and supply curves to teach students about the economic impacts of excise taxes. Uses several available estimates of point elasticities of demand and supply of sugar to calculate the economic impacts of a penny-per-pound tax on sugar. (RLH)
Illustrating Consumer Theory with the CES Utility Function
ERIC Educational Resources Information Center
Tohamy, Soumaya M.; Mixon, J. Wilson, Jr.
2004-01-01
The authors use Microsoft Excel to derive compensated and uncompensated demand curves. They use a constant elasticity of substitution (CES) utility function to show how changes in a good's price or income affect the quantities demanded of that good and of the other composite good, using Excel's Solver. They provide three contributions. First, they…
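The same demand curves can also be traced analytically rather than with Excel's Solver; the sketch below computes the uncompensated (Marshallian) demands implied by a CES utility function, with illustrative parameter values.

```python
import numpy as np

# Marshallian demands for U = (a*x1**rho + (1-a)*x2**rho)**(1/rho), derived
# from the first-order conditions and the budget constraint.

def ces_marshallian_demand(p1, p2, income, a=0.5, rho=0.5):
    sigma = 1.0 / (1.0 - rho)                # elasticity of substitution
    denom = a**sigma * p1**(1 - sigma) + (1 - a)**sigma * p2**(1 - sigma)
    x1 = income * a**sigma * p1**(-sigma) / denom
    x2 = income * (1 - a)**sigma * p2**(-sigma) / denom
    return x1, x2

# Trace out a demand curve for good 1 as its price varies
for p1 in (0.5, 1.0, 2.0, 4.0):
    x1, x2 = ces_marshallian_demand(p1, p2=1.0, income=100.0)
    print(f"p1={p1:>4}: x1={x1:6.2f}, x2={x2:6.2f}")
```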
Latent factor structure of a behavioral economic marijuana demand curve.
Aston, Elizabeth R; Farris, Samantha G; MacKillop, James; Metrik, Jane
2017-08-01
Drug demand, or relative value, can be assessed via analysis of behavioral economic purchase task performance. Five demand indices are typically obtained from drug purchase tasks. The goal of this research was to determine whether metrics of marijuana reinforcement from a marijuana purchase task (MPT) exhibit a latent factor structure that efficiently characterizes marijuana demand. Participants were regular marijuana users (n = 99; 37.4% female, 71.5% marijuana use days [5 days/week], 15.2% cannabis dependent) who completed study assessments, including the MPT, during a baseline session. Principal component analysis was used to examine the latent structure underlying MPT indices. Concurrent validity was assessed via examination of relationships between latent factors and marijuana use, past quit attempts, and marijuana expectancies. A two-factor solution was confirmed as the best fitting structure, accounting for 88.5% of the overall variance. Factor 1 (65.8% variance) reflected "Persistence," indicating sensitivity to escalating marijuana price, which comprised four MPT indices (elasticity, O max , P max , and breakpoint). Factor 2 (22.7% variance) reflected "Amplitude," indicating the amount consumed at unrestricted price (intensity). Persistence factor scores were associated with fewer past marijuana quit attempts and lower expectancies of negative use outcomes. Amplitude factor scores were associated with more frequent use, dependence symptoms, craving severity, and positive marijuana outcome expectancies. Consistent with research on alcohol and cigarette purchase tasks, the MPT can be characterized with a latent two-factor structure. Thus, demand for marijuana appears to encompass distinct dimensions of price sensitivity and volumetric consumption, with differential relations to other aspects of marijuana motivation.
ERIC Educational Resources Information Center
Gordon, Warren B.
2006-01-01
This paper examines the elasticity of demand, and shows that geometrically, it may be interpreted as the ratio of two simple distances along the tangent line: the distance from the point on the curve to the x-intercept to the distance from the point on the curve to the y-intercept. It also shows that total revenue is maximized at the transition…
Electric power market agent design
NASA Astrophysics Data System (ADS)
Oh, Hyungseon
The electric power industry in many countries has been restructured in the hope of a more economically efficient system. In the restructured system, traditional operating and planning tools based on true marginal cost do not perform well, since the required information is strictly confidential. To develop a new tool, it is necessary to understand offer behavior. The main objective of this study is to create a new tool for power system planning. For this purpose, this dissertation develops models for a market and for market participants. A new model is developed in this work for explaining a supply-side offer curve, and several variables are introduced to characterize the curve. Demand is estimated using a neural network, and a numerical optimization process is used to determine the values of the variables that maximize the profit of the agent. The amount of data required for the optimization is chosen with the aid of nonlinear dynamics. To suggest an optimal demand-side bidding function, two optimization problems are constructed and solved for maximizing consumer satisfaction, based on the properties of two different types of demand: price-based demand and must-be-served demand. Several different simulations are performed to test how an agent reacts in various situations. The offer behavior depends on locational benefit as well as the offer strategies of competitors.
An Initial Econometric Consideration of Supply and Demand in the Guaranteed Student Loan Program.
ERIC Educational Resources Information Center
Bayus, Barry; Kendis, Kurt
1982-01-01
In this econometric model of the Guaranteed Student Loan Program (GSLP), supply is related to banks' liquidity and yield curves, all lenders' economic costs and returns, and Student Loan Marketing Association activity. GSLP demand is based on loan costs, family debt position, and net student need for financial aid. (RW)
A Regression Study of Demand, Cost and Pricing Public Library Circulation Services.
ERIC Educational Resources Information Center
Stratton, Peter J.
This paper examines three aspects of the public library's circulation service: (1) a demand function for the service is estimated; (2) a long-run unit circulation cost curve is developed; and (3) using the economist's notion of "efficiency," a general model for the pricing of the circulation service is presented. The estimated demand…
On the required complexity of vehicle dynamic models for use in simulation-based highway design.
Brown, Alexander; Brennan, Sean
2014-06-01
This paper presents the results of a comprehensive project whose goal is to identify roadway design practices that maximize the margin of safety between the friction supply and friction demand. This study is motivated by the concern for increased accident rates on curves with steep downgrades, geometries that contain features that interact in all three dimensions - planar curves, grade, and superelevation. This complexity makes the prediction of vehicle skidding quite difficult, particularly for simple simulation models that have historically been used for road geometry design guidance. To obtain estimates of friction margin, this study considers a range of vehicle models, including: a point-mass model used by the American Association of State Highway Transportation Officials (AASHTO) design policy, a steady-state "bicycle model" formulation that considers only per-axle forces, a transient formulation of the bicycle model commonly used in vehicle stability control systems, and finally, a full multi-body simulation (CarSim and TruckSim) regularly used in the automotive industry for high-fidelity vehicle behavior prediction. The presence of skidding--the friction demand exceeding supply--was calculated for each model considering a wide range of vehicles and road situations. The results indicate that the most complicated vehicle models are generally unnecessary for predicting skidding events. However, there are specific maneuvers, namely braking events within lane changes and curves, which consistently predict the worst-case friction margins across all models. This suggests that any vehicle model used for roadway safety analysis should include the effects of combined cornering and braking. The point-mass model typically used by highway design professionals may not be appropriate to predict vehicle behavior on high-speed curves during braking in low-friction situations. However, engineers can use the results of this study to help select the appropriate vehicle dynamic model complexity to use in the highway design process. Copyright © 2014 Elsevier Ltd. All rights reserved.
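For readers unfamiliar with the point-mass model referred to above, the sketch below computes side-friction demand on a horizontal curve from speed, radius, and superelevation and compares it with an assumed friction supply. The supply value and the `friction_margin` helper are illustrative assumptions; grade and braking effects are ignored.

```python
# A minimal sketch of the point-mass side-friction calculation for a horizontal curve.
# The friction-supply value and inputs are illustrative assumptions.

def side_friction_demand(speed_kmh: float, radius_m: float, superelevation: float) -> float:
    """Point-mass friction demand on a horizontal curve: f = V^2 / (127 R) - e."""
    return speed_kmh**2 / (127.0 * radius_m) - superelevation

def friction_margin(speed_kmh, radius_m, superelevation, friction_supply):
    """Positive margin means the point-mass model predicts no skidding."""
    return friction_supply - side_friction_demand(speed_kmh, radius_m, superelevation)

# 100 km/h on a 400 m curve with 6% superelevation, assumed wet-pavement supply of 0.30.
print(friction_margin(100.0, 400.0, 0.06, friction_supply=0.30))
```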
ERIC Educational Resources Information Center
Graves, Philip E.; Sexton, Robert L.; Calimeris, Lauren M.
2011-01-01
The surprise value of many economic observations makes the economics discipline quite interesting for many students. One such anomaly is that providing "free" education in an effort to reduce the number of dropouts can often result in a lower level of educational quality purchased. This result is easy to show with indifference curves, but many…
The use of operant technology to measure behavioral priorities in captive animals.
Cooper, J J; Mason, G J
2001-08-01
Addressing the behavioral priorities of captive animals and the development of practical, objective measures of the value of environmental resources is a principal objective of animal welfare science. In theory, consumer demand approaches derived from human microeconomics should provide valid measures of the value of environmental resources. In practice, however, a number of empirical and theoretical problems have rendered these measures difficult to interpret in studies with animals. A common approach has been to impose a cost on access to resources and to use time with each resource as a measure of consumption to construct demand curves. This can be recorded easily by automatic means, but in a number of studies, it has been found that animals compensate for increased cost of access with longer visit time. Furthermore, direct observation of the test animals' behavior has shown that resource interaction is more intense once the animals have overcome higher costs. As a consequence, measures based on time with the resource may underestimate resource consumption at higher access costs, and demand curves derived from these measures may not be a true reflection of the value of different resources. An alternative approach to demand curves is reservation price, which is the maximum price individual animals are prepared to pay to gain access to resources. In studies using this approach, farmed mink (Mustela vison) paid higher prices for food and swimming water than for resources such as tunnels, water bowls, pet toys, and empty compartments. This indicates that the mink placed a higher value on food and swimming water than on other resources.
Medellín-Azuara, Josué; Harou, Julien J; Howitt, Richard E
2010-11-01
Given the high proportion of water used for agriculture in certain regions, the economic value of agricultural water can be an important tool for water management and policy development. This value is quantified using economic demand curves for irrigation water. Such demand functions show the incremental contribution of water to agricultural production. Water demand curves are estimated using econometric or optimisation techniques. Calibrated agricultural optimisation models allow the derivation of demand curves using smaller datasets than econometric models. This paper introduces these subject areas then explores the effect of spatial aggregation (upscaling) on the valuation of water for irrigated agriculture. A case study from the Rio Grande-Rio Bravo Basin in North Mexico investigates differences in valuation at farm and regional aggregated levels under four scenarios: technological change, warm-dry climate change, changes in agricultural commodity prices, and water costs for agriculture. The scenarios consider changes due to external shocks or new policies. Positive mathematical programming (PMP), a calibrated optimisation method, is the deductive valuation method used. An exponential cost function is compared to the quadratic cost functions typically used in PMP. Results indicate that the economic value of water at the farm level and the regionally aggregated level are similar, but that the variability and distributional effects of each scenario are affected by aggregation. Moderately aggregated agricultural production models are effective at capturing average-farm adaptation to policy changes and external shocks. Farm-level models best reveal the distribution of scenario impacts. Copyright © 2009 Elsevier B.V. All rights reserved.
The symmetry and coupling properties of solutions in general anisotropic multilayer waveguides.
Hernando Quintanilla, F; Lowe, M J S; Craster, R V
2017-01-01
Multilayered plate and shell structures play an important role in many engineering settings where, for instance, coated pipes are commonplace, such as in the petrochemical, aerospace, and power generation industries. There are numerous demands, and indeed requirements, on nondestructive evaluation (NDE) to detect defects or to measure material properties using guided waves; to choose the most suitable inspection approach, it is essential to know the properties of the guided wave solutions for any given multilayered system, and this requires dispersion curves computed reliably, robustly, and accurately. Here, the circumstances and possible layer combinations are elucidated under which guided wave solutions in multilayered systems composed of generally anisotropic layers, in flat and cylindrical geometries, have specific properties of coupling and parity; the partial wave decomposition of the wave field is utilised to unravel the behaviour. A classification into five families is introduced, and the authors claim that this is the fundamental way to approach generally anisotropic waveguides. This coupling and parity provides information to be used in the design of more efficient and robust dispersion curve tracing algorithms. A critical benefit is that the analysis enables the separation of solutions into categories for which dispersion curves do not cross; this allows the curves to be calculated simply and without ambiguity.
An economics systems analysis of land mobile radio telephone services
NASA Technical Reports Server (NTRS)
Leroy, B. E.; Stevenson, S. M.
1980-01-01
The economic interaction of the terrestrial and satellite systems is considered. Parametric equations are formulated to allow examination of necessary user thresholds and growth rates as a function of system costs. Conversely, first order allowable systems costs are found as a function of user thresholds and growth rates. Transitions between satellite and terrestrial service systems are examined. User growth rate density (user/year/sq km) is shown to be a key parameter in the analysis of systems compatibility. The concept of system design matching the price/demand curves is introduced and examples are given. The role of satellite systems is critically examined and the economic conditions necessary for the introduction of satellite service are identified.
Orexin-1 receptor signaling increases motivation for cocaine-associated cues
Bentzley, Brandon S.; Aston-Jones, Gary
2015-01-01
The orexin/hypocretin system is involved in multiple cocaine addiction processes that involve drug-associated environmental cues, including cue-induced reinstatement of extinguished cocaine seeking and expression of conditioned place preference. However, the orexin system does not play a role in several behaviors that are less cue-dependent, such as cocaine-primed reinstatement of extinguished cocaine seeking and low-effort cocaine self-administration. We hypothesized that cocaine-associated cues, but not cocaine alone, engage signaling at orexin-1 receptors (OX1R), and this cue-engaged OX1R signaling increases motivation for cocaine. Motivation for cocaine was measured in Sprague-Dawley rats with behavioral-economic demand curve analysis after pretreatment with the OX1R antagonist SB-334867 (SB) or vehicle with and without light+tone cues. Demand for cocaine was higher when cocaine-associated cues were present, and SB only reduced cocaine demand in the presence of these cues. We then asked if cocaine demand is linked to cued-reinstatement of cocaine seeking, as both procedures are partially driven by cocaine-associated cues in an orexin-dependent manner. SB blocked cue-induced reinstatement behavior, and baseline demand predicted SB efficacy with the largest effect in high demand animals, i.e., animals with the greatest cue-dependent behavior. We conclude that OX1R signaling increases the reinforcing efficacy of cocaine-associated cues but not for cocaine alone. This supports our view that orexin plays a prominent role in the ability of conditioned cues to activate motivational responses. PMID:25754681
NASA Astrophysics Data System (ADS)
Siahpolo, Navid; Gerami, Mohsen; Vahdani, Reza
2016-09-01
Evaluating the capability of elastic load patterns (LPs), including seismic-code patterns and modified LPs such as the Method of Modal Combination (MMC) and Upper Bound Pushover Analysis (UBPA), in estimating inelastic demands of non-deteriorating steel moment frames is the main objective of this study. The nonlinear static procedure (NSP) is implemented and its results are compared with nonlinear time history analysis (NTHA). The focus is on the effects of near-fault pulse-like ground motions. The primary demands of interest are the maximum floor displacement, the maximum story drift angle over the height, the maximum global ductility, the maximum inter-story ductility, and the capacity curves. Five types of LPs are selected and the inelastic demands are calculated under four levels of inter-story target ductility (μt) using OpenSees software. The results show that increasing μt coincides with the migration of the peak demands over the height from the top to the bottom stories. Therefore, all LPs estimate the story lateral displacement accurately at the lower stories, and the results are almost independent of the number of stories. The inter-story drift angle (IDR) obtained from the MMC method has the best accuracy among the LPs, although its accuracy decreases with increasing μt, so that with an increasing number of stories the IDR is smaller or greater than the NTHA values depending on where over the height the results are captured. In addition, increasing μt decreases the accuracy of all LPs in determining the critical story position. In this case, the MMC method shows the best agreement with the distribution of inter-story ductility over the height.
Poor impulse control predicts inelastic demand for nicotine but not alcohol in rats.
Diergaarde, Leontien; van Mourik, Yvar; Pattij, Tommy; Schoffelmeer, Anton N M; De Vries, Taco J
2012-05-01
Tobacco and alcohol dependence are characterized by continued use despite deleterious health, social and occupational consequences, implying that addicted individuals pay a high price for their use. In behavioral economic terms, such persistent consumption despite increased costs can be conceptualized as inelastic demand. Recent animal studies demonstrated that high-impulsive individuals are more willing to work for nicotine or cocaine infusions than their low-impulsive counterparts, indicating that this trait might be causally related to inelastic drug demand. By employing progressive ratio schedules of reinforcement combined with a behavioral economics approach of analysis, we determined whether trait impulsivity is associated with an insensitivity of nicotine or alcohol consumption to price increments. Rats were trained on a delayed discounting task, measuring impulsive choice. Hereafter, high- and low-impulsive rats were selected and trained to nose poke for intravenous nicotine or oral alcohol. Upon stable self-administration on a continuous reinforcement schedule, the price (i.e. response requirement) was increased. Demand curves, depicting the relationship between price and consumption, were produced using Hursh's exponential demand equation. Similar to human observations, nicotine and alcohol consumption in rats fitted this equation, thereby demonstrating the validity of our model. Moreover, high-impulsive rats displayed inelastic nicotine demand, as their nicotine consumption was less sensitive to price increments as compared with that in low-impulsive rats. Impulsive choice was not related to differences in alcohol demand elasticity. Our model seems well suited for studying nicotine and alcohol demand in rats and, as such, might contribute to our understanding of tobacco and alcohol dependence. © 2011 The Authors, Addiction Biology © 2011 Society for the Study of Addiction.
Dose and elasticity of demand for self-administered cocaine in rats.
Kearns, David N; Silberberg, Alan
2016-04-01
The present experiment tested whether the elasticity of demand for self-administered cocaine in rats is dose-dependent. Subjects lever pressed for three different doses of intravenous cocaine - 0.11, 0.33, and 1.0 mg/kg/infusion - on a demand procedure where the number of lever presses required per infusion increased within a session. The main finding was that demand for the 0.11 mg/kg dose was more elastic than it was for the two larger doses. There was no difference in demand elasticity between the 0.33 and 1.0 mg/kg doses. These results parallel findings previously reported in monkeys. The present study also demonstrated that a within-session procedure can be used to generate reliable demand curves.
Optimization of a Future RLV Business Case using Multiple Strategic Market Prices
NASA Astrophysics Data System (ADS)
Charania, A.; Olds, J. R.
2002-01-01
There is a lack of depth in the current paradigm of conceptual-level economic models used to evaluate the value and viability of future capital projects such as a commercial reusable launch vehicle (RLV). Current modeling methods assume a single price is charged to all customers, public or private, in order to optimize the economic metrics of interest. This assumption may not be valid given the different utility functions for space services of public and private entities. The government's requirements are generally more inflexible than those of its commercial counterparts. A government's launch schedules are much more rigid, choices of international launch services restricted, and launch specifications generally more stringent as well as numerous. These requirements generally make the government's demand curve more inelastic. Subsequently, a launch vehicle provider will charge a higher price (launch price per kg) to the government and may obtain a higher level of financial profit compared to an equivalent commercial payload. This profit is not a sufficient condition to enable RLV development by itself but can help in making the financial situation slightly better. An RLV can potentially address multiple payload markets; each market has a different price elasticity of demand for both the commercial and government customer. Thus, a more resilient examination of the economic landscape requires optimization of multiple prices in which each price affects a different demand curve. Such an examination is performed here using the Cost and Business Analysis Module (CABAM), an MS-Excel spreadsheet-based model that attempts to couple both the demand and supply for space transportation services in the future. The demand takes the form of market assumptions (both near-term and far-term) and the supply comes from user-defined vehicles that are placed into the model. CABAM represents RLV projects as commercial endeavors with the possibility to model the effects of government contribution, tax breaks, loan guarantees, etc. The optimization performed here is for a 3rd Generation RLV program. The economic metric being optimized (maximized) is Net Present Value (NPV) based upon a given company financial structure and cost of capital assumptions. Such an optimization process demands more sophisticated optimizers and can result in non-unique solutions/local minima if using gradient-based optimization. Domain-spanning/evolutionary algorithms are used to obtain the optimized solution in the design space. These capabilities generally increase model calculation time but incorporate more realistic pricing portfolios than just assuming one unified price for all launch markets. This analysis is conducted with CABAM running in Phoenix Integration's ModelCenter 4.0 collaborative design environment using the SpaceWorks Engineering, Inc. (SEI) OptWorks suite of optimization components.
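The sketch below is not CABAM; it is a toy profit maximization over two launch prices, each facing its own constant-elasticity demand curve, solved with an evolutionary optimizer in the spirit of the approach described above. Every number (market scales, elasticities, price ceilings, marginal cost) is an assumption chosen only to show the mechanics.

```python
# Toy two-market pricing problem: each market has its own constant price elasticity,
# and profit is maximized with an evolutionary optimizer. All numbers are assumed.
import numpy as np
from scipy.optimize import differential_evolution

markets = {                      # demand scale, elasticity, price ceiling ($M per flight)
    "commercial": dict(scale=5.0e4, elasticity=1.8, p_max=120.0),
    "government": dict(scale=2.0e2, elasticity=0.6, p_max=200.0),
}
marginal_cost = 30.0             # assumed recurring cost per flight, $M

def negative_profit(prices):
    total = 0.0
    for price, m in zip(prices, markets.values()):
        flights = m["scale"] * price ** (-m["elasticity"])   # constant-elasticity demand
        total += (price - marginal_cost) * flights
    return -total

bounds = [(marginal_cost, m["p_max"]) for m in markets.values()]
result = differential_evolution(negative_profit, bounds, seed=1)
for name, price in zip(markets, result.x):
    print(f"{name:10s} optimal price = ${price:6.1f}M")
```

With these assumed numbers the inelastic market's price is pushed to its ceiling while the elastic market settles at an interior optimum, which is the qualitative point the abstract makes about government versus commercial demand.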
Validity of the alcohol purchase task: a meta-analysis.
Kiselica, Andrew M; Webber, Troy A; Bornovalova, Marina A
2016-05-01
Behavioral economists assess alcohol consumption as a function of unit price. This method allows construction of demand curves and demand indices, which are thought to provide precise numerical estimates of risk for alcohol problems. One of the more commonly used behavioral economic measures is the Alcohol Purchase Task (APT). Although the APT has shown promise as a measure of risk for alcohol problems, the construct validity and incremental utility of the APT remain unclear. This paper presents a meta-analysis of the APT literature. Sixteen studies were included in the meta-analysis. Studies were gathered via searches of the PsycInfo, PubMed, Web of Science and EconLit research databases. Random-effects meta-analyses with inverse variance weighting were used to calculate summary effect sizes for each demand index-drinking outcome relationship. Moderation of these effects by drinking status (regular versus heavy drinkers) was examined. Additionally, tests of the incremental utility of the APT indices in predicting drinking problems above and beyond measuring alcohol consumption were performed. The APT indices were correlated in the expected directions with drinking outcomes, although many effects were small in size. These effects were typically not moderated by the drinking status of the samples. Additionally, the intensity metric demonstrated incremental utility in predicting alcohol use disorder symptoms beyond measuring drinking. The Alcohol Purchase Task appears to have good construct validity, but limited incremental utility in estimating risk for alcohol problems. © 2015 Society for the Study of Addiction.
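A minimal sketch of the summary-effect computation used in such meta-analyses is given below, assuming invented study effects and variances: inverse-variance weights, Cochran's Q, a DerSimonian-Laird between-study variance, and the resulting random-effects estimate.

```python
# A minimal random-effects summary effect with inverse-variance weighting
# (DerSimonian-Laird). Effect sizes and variances are made up for illustration.
import numpy as np

def random_effects_summary(effects, variances):
    y, v = np.asarray(effects, float), np.asarray(variances, float)
    w = 1.0 / v                                   # fixed-effect (inverse-variance) weights
    y_fe = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - y_fe) ** 2)               # Cochran's Q
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)       # between-study variance
    w_re = 1.0 / (v + tau2)
    summary = np.sum(w_re * y) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    return summary, se, tau2

effects = [0.25, 0.10, 0.35, 0.18]                # hypothetical study effect sizes
variances = [0.010, 0.020, 0.015, 0.008]
print(random_effects_summary(effects, variances))
```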
Behavioral economic analysis of drug preference using multiple choice procedure data.
Greenwald, Mark K
2008-01-11
The multiple choice procedure (MCP) has been used to evaluate preference for psychoactive drugs, relative to money amounts (price), in human subjects. The present re-analysis shows that MCP data are compatible with behavioral economic analysis of drug choices. Demand curves were constructed from studies with intravenous fentanyl, intramuscular hydromorphone and oral methadone in opioid-dependent individuals; oral d-amphetamine, oral MDMA alone and during fluoxetine treatment, and smoked marijuana alone or following naltrexone pretreatment in recreational drug users. For each participant and dose, the MCP crossover point was converted into unit price (UP) by dividing the money value ($) by the drug dose (mg/70kg). At the crossover value, the dose ceases to function as a reinforcer, so "0" was entered for this and higher UPs to reflect lack of drug choice. At lower UPs, the dose functions as a reinforcer and "1" was entered to reflect drug choice. Data for UP vs. average percent choice were plotted in log-log space to generate demand functions. Rank order of opioid inelasticity (slope of non-linear regression) was: fentanyl>hydromorphone (continuing heroin users)>methadone>hydromorphone (heroin abstainers). Rank order of psychostimulant inelasticity was d-amphetamine>MDMA>MDMA+fluoxetine. Smoked marijuana was more inelastic with high-dose naltrexone. These findings show this method translates individuals' drug preferences into estimates of population demand, which has the potential to yield insights into pharmacotherapy efficacy, abuse liability assessment, and individual differences in susceptibility to drug abuse.
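The sketch below re-creates those analysis steps schematically with invented crossover values: each crossover is converted to a unit price, choices are coded 1 below and 0 at or above it, and average percent choice is regressed on unit price in log-log space. The dose, price grid, and crossover values are hypothetical.

```python
# Schematic unit-price demand analysis from invented crossover points.
import numpy as np

dose_mg_per_70kg = 10.0
crossover_dollars = np.array([4.0, 8.0, 2.0, 16.0, 8.0])            # hypothetical crossovers
unit_price_grid = np.array([0.05, 0.1, 0.2, 0.4, 0.8, 1.6, 3.2])    # $ per mg/70 kg

crossover_up = crossover_dollars / dose_mg_per_70kg                 # crossover as unit price
choice = (unit_price_grid[None, :] < crossover_up[:, None]).astype(float)  # 1 = drug chosen
pct_choice = 100.0 * choice.mean(axis=0)

keep = pct_choice > 0                                                # log-log fit needs positives
slope, intercept = np.polyfit(np.log10(unit_price_grid[keep]),
                              np.log10(pct_choice[keep]), 1)
print(f"log-log demand slope (inelasticity index) = {slope:.2f}")
```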
NASA Astrophysics Data System (ADS)
Priyadarshini, Lakshmi
Frequently transported packaged goods are prone to damage from impact, jolting, or vibration in transit, and fragile goods such as glass, ceramics, and porcelain are especially susceptible to mechanical stresses. Hence, ancillary materials like cushions play an important role when utilized within a package. In this work, an analytical model of a 3D cellular structure is established based on the Kelvin model and a lattice structure, and the research provides a comparative study between a 3D-printed Kelvin unit structure and a 3D-printed lattice structure. The comparison is based on parameters defining cushion performance, such as cushion creep, indentation, and cushion curve analysis. The application of 3D printing here is rapid prototyping, and the study provides information on which model delivers the better form of energy absorption; 3D-printed foam is shown to be a cost-effective prototyping approach. The research also investigates the selection of material for the 3D printing process: because cushion development demands flexible material, three-dimensional printing with a material having elastomeric properties is required. Further, the cushion design concept is based on the Kelvin model structure and the lattice structure. The analytical solution provides the cushion curve analysis with respect to the results observed when a load is applied over the cushion, and the results are reported on the basis of attenuation and amplification curves.
A questionnaire approach to measuring the relative reinforcing efficacy of snack foods
Epstein, Leonard H.; Dearing, Kelly K.; Roba, Lora G.
2010-01-01
Behavioral choice theory and laboratory choice paradigms can provide a framework to understand the reinforcing efficacy or reinforcing value of food. Reinforcing efficacy is measured in the laboratory by assessing how much effort one will engage in to gain access to food as the amount of work progressively increases. However, this method to establish demand curves as estimates of reinforcer efficacy is time consuming and limits the number of reinforcers that can be tested. The general aim of this study was to compare the reinforcing efficacy of snack foods using a behavioral task that requires subjects to respond to gain access to portions of food (LAB task) with a questionnaire version of a purchasing task designed to determine demand curves (QUES task) in nonobese and obese adults (n = 24). Results showed correlations between the maximal amount of money that individuals were willing to spend for food (QUES Omax) and the maximal amount of responses made on the highest reinforcement schedule completed (LAB Omax) (r = 0.45, p < 0.05), and between BMI and the LAB Omax (r = 0.43, p < 0.05) and the QUES Omax (r = 0.52, p < 0.05). The study suggests the questionnaire provides valid measures of reinforcing efficacy that can be used in place of or in conjunction with traditional laboratory paradigms to establish demand curves that describe the behavioral maintaining properties of food. PMID:20188288
ERIC Educational Resources Information Center
Yan, Ji; Foxall, Gordon R.; Doyle, John R.
2012-01-01
Essential value is defined by Hursh and Silberberg (2008) as the value of reinforcers, presented in an exponential model (Equation 1). This study extends previous research concerned with animal behavior or human responding in therapeutic situations. We applied 9 available demand curves to consumer data that included 10,000+ data points collected…
Stochastic Analysis of Wind Energy for Wind Pump Irrigation in Coastal Andhra Pradesh, India
NASA Astrophysics Data System (ADS)
Raju, M. M.; Kumar, A.; Bisht, D.; Rao, D. B.
2014-09-01
The rapid escalation in the prices of oil and gas, as well as the increasing demand for energy, has attracted the attention of scientists and researchers to the possibility of generating and utilizing alternative and renewable wind energy along the long coastal belt of India, which has considerable wind energy resources. A detailed analysis of wind potential is a prerequisite for harvesting wind energy resources efficiently. Keeping this in view, the present study was undertaken to analyze the wind energy potential and assess the feasibility of a wind-pump-operated irrigation system in the coastal region of Andhra Pradesh, India, where high groundwater table conditions are available. The wind speed data were analyzed stochastically and tested for fit to probability distributions that describe the wind energy potential in the region. The normal and Weibull probability distributions were tested, and on the basis of the chi-square test the Weibull distribution gave better results. Hence, it was concluded that the Weibull probability distribution may be used to stochastically describe the annual wind speed data of coastal Andhra Pradesh with better accuracy. The sizing of the complete irrigation system was then determined with mass curve analysis to satisfy various daily irrigation demands at different risk levels.
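A minimal sketch of the distribution-fitting step is shown below, assuming simulated wind-speed data rather than the coastal Andhra Pradesh records: a Weibull distribution is fitted and its goodness of fit is checked with a chi-square test on equal-probability bins.

```python
# Fit a Weibull distribution to (simulated) wind-speed data and run a chi-square
# goodness-of-fit test. Data and parameters are illustrative, not the study's.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
wind_speed = stats.weibull_min.rvs(c=2.0, scale=5.0, size=365, random_state=rng)  # m/s, simulated

shape, loc, scale = stats.weibull_min.fit(wind_speed, floc=0)   # fix location at zero
print(f"Weibull shape k = {shape:.2f}, scale c = {scale:.2f} m/s")

# Chi-square goodness of fit on equal-probability bins.
n_bins = 10
edges = stats.weibull_min.ppf(np.linspace(0, 1, n_bins + 1), shape, loc, scale)
observed, _ = np.histogram(wind_speed, bins=edges)
expected = np.full(n_bins, len(wind_speed) / n_bins)
chi2, p = stats.chisquare(observed, expected, ddof=2)           # 2 fitted parameters
print(f"chi-square = {chi2:.2f}, p = {p:.3f}")
```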
Grace, Randolph C; Kivell, Bronwyn M; Laugesen, Murray
2015-11-01
Cigarette purchase tasks (CPTs) are used increasingly to measure simulated demand curves for tobacco. However, there is currently limited information about the temporal stability of demand curves obtained from these tasks. We interviewed a sample (N = 210) of smokers in New Zealand both before and after a 10% increase in the tobacco excise tax that took effect on January 1, 2013. Participants were interviewed in November-December 2012 (wave 1) and February-March 2013 (wave 2). At each interview, participants completed a high-resolution CPT with 64 prices ranging from NZ $0.00 to NZ $5.00/cigarette, and questionnaires regarding their smoking habit. Roll-your-own smokers had higher levels of nicotine dependence and tobacco demand based on CPT responses than factory-made smokers. Although demand curves for waves 1 and 2 were similar, intentions to purchase cigarettes were significantly less at wave 2 for three prices (NZ $0.85, NZ $0.90, and NZ $0.95) that were just higher than the actual price after the tax increase, for both roll-your-own and factory-made smokers. Measures of elasticity (α) derived from Hursh and Silberberg's model were significantly greater at wave 2 than wave 1, and there was a significant reduction in smoking habit as measured by cigarettes/day and the Fagerström Test for Nicotine Dependence at wave 2. Purchase tasks can discriminate between smokers based on their tobacco preference, and although results are relatively stable over time, they depend on contextual factors such as the current real price for tobacco. © The Author 2015. Published by Oxford University Press on behalf of the Society for Research on Nicotine and Tobacco. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Murphy, James G; Yurasek, Ali M; Meshesha, Lidia Z; Dennhardt, Ashley A; MacKillop, James; Skidmore, Jessica R; Martens, Matthew P
2014-07-01
Behavioral economic demand curves measure alcohol consumption as a function of price and may capture clinically relevant individual differences in alcohol-reinforcing efficacy. This study used a novel, behavioral-economic, hypothetical demand-curve paradigm to examine the association between family history of alcohol misuse and individual differences in both alcohol demand and the relative sensitivity of alcohol demand to next-day responsibilities. Participants were 207 college students (47% male, 68.5% White, 27.4% African American, mean age = 19.5 years) who reported at least one heavy drinking episode (5/4 or more drinks on one occasion for a man/woman) in the past month and completed two versions of an alcohol purchase task (APT) that assessed hypothetical alcohol consumption across 17 drink prices. In one APT (standard), students imagined they had no next-day responsibilities, and in the other, they imagined having a 10:00 a.m. test the next day. A series of analyses of covariance indicated that participants with at least one biological parent or grandparent who had misused alcohol reported similar levels of alcohol demand on the standard APT but significantly less sensitivity to the next-day academic responsibility as measured by the percentage of reduction in demand intensity and breakpoint across the no-responsibility and next-day-test conditions. These findings provide initial evidence that APTs might clarify one potential mechanism of risk conferred by family history. Young adult heavy drinkers with a family history of problematic drinking may be less sensitive to next-day responsibilities that might modulate drinking in drinkers without a family history of alcohol problems.
TOOKUIL: A case study in user interface development for safety code application
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gray, D.L.; Harkins, C.K.; Hoole, J.G.
1997-07-01
Traditionally, there has been a very high learning curve associated with using nuclear power plant (NPP) analysis codes. Even for seasoned plant analysts and engineers, the process of building or modifying an input model for present day NPP analysis codes is tedious, error prone, and time consuming. Current cost constraints and performance demands place an additional burden on today's safety analysis community. Advances in graphical user interface (GUI) technology have been applied to obtain significant productivity and quality assurance improvements for the Transient Reactor Analysis Code (TRAC) input model development. KAPL Inc. has developed an X Windows-based graphical user interface named TOOKUIL which supports the design and analysis process, acting as a preprocessor, runtime editor, help system, and post processor for TRAC. This paper summarizes the objectives of the project, the GUI development process and experiences, and the resulting end product, TOOKUIL.
Behavioral economics and empirical public policy.
Hursh, Steven R; Roma, Peter G
2013-01-01
The application of economics principles to the analysis of behavior has yielded novel insights on value and choice across contexts ranging from laboratory animal research to clinical populations to national trends of global impact. Recent innovations in demand curve methods provide a credible means of quantitatively comparing qualitatively different reinforcers as well as quantifying the choice relations between concurrently available reinforcers. The potential of the behavioral economic approach to inform public policy is illustrated with examples from basic research, pre-clinical behavioral pharmacology, and clinical drug abuse research as well as emerging applications to public transportation and social behavior. Behavioral Economics can serve as a broadly applicable conceptual, methodological, and analytical framework for the development and evaluation of empirical public policy. © Society for the Experimental Analysis of Behavior.
Fuzzy Multi-Objective Vendor Selection Problem with Modified S-CURVE Membership Function
NASA Astrophysics Data System (ADS)
Díaz-Madroñero, Manuel; Peidro, David; Vasant, Pandian
2010-06-01
In this paper, the S-Curve membership function methodology is used in a vendor selection (VS) problem. An interactive method for solving multi-objective VS problems with fuzzy goals is developed. The proposed method attempts simultaneously to minimize the total order costs, the number of rejected items and the number of late delivered items with reference to several constraints such as meeting buyers' demand, vendors' capacity, vendors' quota flexibility, vendors' allocated budget, etc. We compare in an industrial case the performance of S-curve membership functions, representing uncertainty goals and constraints in VS problems, with linear membership functions.
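A sketch of a logistic-type S-curve membership function of the general kind used for fuzzy goals and constraints is given below; the parameters B, C, and alpha and the cost aspiration/ceiling values are illustrative assumptions, not the functions calibrated in the paper.

```python
# A logistic-type (S-curve) membership function for a fuzzy goal or constraint.
# Parameter values B, C and alpha are illustrative assumptions.
import numpy as np

def s_curve_membership(x, x_lower, x_upper, B=1.0, C=0.001, alpha=13.8):
    """Monotonically decreasing S-shaped membership over [x_lower, x_upper]."""
    x = np.asarray(x, dtype=float)
    t = (x - x_lower) / (x_upper - x_lower)          # normalize to [0, 1]
    mu = B / (1.0 + C * np.exp(alpha * t))           # logistic decay
    return np.where(x <= x_lower, 1.0, np.where(x >= x_upper, 0.0, mu))

# Membership of total order cost between an aspiration of 100 and a ceiling of 150.
print(s_curve_membership([100, 110, 125, 140, 150], x_lower=100, x_upper=150))
```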
Heroin and saccharin demand and preference in rats.
Schwartz, Lindsay P; Kim, Jung S; Silberberg, Alan; Kearns, David N
2017-09-01
Several recent studies have investigated the choice between heroin and a non-drug alternative reinforcer in rats. A common finding in these studies is that there are large individual differences in preference, with some rats preferring heroin and some preferring the non-drug alternative. The primary goal of the present study was to determine whether individual differences in how heroin or saccharin is valued, based on demand analysis, predicts choice. Rats lever-pressed for heroin infusions and saccharin reinforcers on fixed-ratio schedules. The essential value of each reinforcer was obtained from resulting demand curves. Rats were then trained on a mutually exclusive choice procedure where pressing one lever resulted in heroin and pressing another resulted in saccharin. After seven sessions of increased access to heroin or saccharin, rats were reexposed to the demand and choice procedures. Demand for heroin was more elastic than demand for saccharin (i.e., heroin had lower essential value than saccharin). When allowed to choose, most rats preferred saccharin. The essential value of heroin, but not saccharin, predicted preference. The essential value of both heroin and saccharin increased following a week of increased access to heroin, but similar saccharin exposure had no effect on essential value. Preference was unchanged after increased access to either reinforcer. Heroin-preferring rats differed from saccharin-preferring rats in how they valued heroin, but not saccharin. To the extent that choice models addiction-related behavior, these results suggest that overvaluation of opioids specifically, rather than undervaluation of non-drug alternatives, could identify susceptible individuals. Copyright © 2017 Elsevier B.V. All rights reserved.
Economic demand and essential value.
Hursh, Steven R; Silberberg, Alan
2008-01-01
The strength of a rat's eating reflex correlates with hunger level when strength is measured by the response frequency that precedes eating (B. F. Skinner, 1932a, 1932b). On the basis of this finding, Skinner argued response frequency could index reflex strength. Subsequent work documented difficulties with this notion because responding was affected not only by the strengthening properties of the reinforcer but also by the rate-shaping effects of the schedule. This article obviates this problem by measuring strength via methods from behavioral economics. This approach uses demand curves to map how reinforcer consumption changes with changes in the "price" different ratio schedules impose. An exponential equation is used to model these demand curves. The value of this exponential's rate constant is used to scale the strength or essential value of a reinforcer, independent of the scalar dimensions of the reinforcer. Essential value determines the consumption level to be expected at particular prices and the response level that will occur to support that consumption. This approach permits comparing reinforcers that differ in kind, contributing toward the goal of scaling reinforcer value. (c) 2008 APA, all rights reserved
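A minimal sketch of fitting an exponential demand equation of this general form, log10(Q) = log10(Q0) + k*(exp(-alpha*Q0*C) - 1), is given below using invented consumption-price data; the fixed range constant k, the starting values, and the data points are assumptions for illustration.

```python
# Fit an exponential demand model to (invented) consumption-price data.
import numpy as np
from scipy.optimize import curve_fit

price = np.array([1, 2, 4, 8, 16, 32, 64], dtype=float)        # unit price (C)
consumption = np.array([95, 90, 80, 60, 35, 12, 3], dtype=float)

k = 2.0    # assumed range constant (log10 units), often fixed in advance

def log_demand(C, q0, alpha):
    return np.log10(q0) + k * (np.exp(-alpha * q0 * C) - 1.0)

(q0_hat, alpha_hat), _ = curve_fit(log_demand, price, np.log10(consumption),
                                   p0=[100.0, 0.0005])
print(f"Q0 (intensity) = {q0_hat:.1f}, alpha (rate constant) = {alpha_hat:.5f}")
# A smaller alpha implies a slower decline in consumption with price,
# i.e. a greater essential value of the reinforcer.
```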
ERIC Educational Resources Information Center
Smith, Peter
1983-01-01
A theory of the determination of the price level is described, using aggregate demand and aggregate supply curves. The theory is then utilized to examine inflation in the United Kingdom since 1959. (Author/RM)
NASA Technical Reports Server (NTRS)
Tideman, T. N.
1972-01-01
An economic approach to design efficient transportation systems involves maximizing an objective function that reflects both goals and costs. A demand curve can be derived by finding the quantities of a good that solve the maximization problem as one varies the price of that commodity, holding income and the prices of all other goods constant. A supply curve is derived by applying the idea of profit maximization of firms. The production function determines the relationship between input and output.
Robotic partial nephrectomy - Evaluation of the impact of case mix on the procedural learning curve.
Roman, A; Ahmed, K; Challacombe, B
2016-05-01
Although Robotic partial nephrectomy (RPN) is an emerging technique for the management of small renal masses, this approach is technically demanding. To date, there is limited data on the nature and progression of the learning curve in RPN. To analyse the impact of case mix on the RPN LC and to model the learning curve. The records of the first 100 RPN performed, were analysed at our institution that were carried out by a single surgeon (B.C) (June 2010-December 2013). Cases were split based on their Preoperative Aspects and Dimensions Used for an Anatomical (PADUA) score into the following groups: 6-7, 8-9 and >10. Using a split group (20 patients in each group) and incremental analysis, the mean, the curve of best fit and R(2) values were calculated for each group. Of 100 patients (F:28, M:72), the mean age was 56.4 ± 11.9 years. The number of patients in each PADUA score groups: 6-7, 8-9 and >10 were 61, 32 and 7 respectively. An increase in incidence of more complex cases throughout the cohort was evident within the 8-9 group (2010: 1 case, 2013: 16 cases). The learning process did not significantly affect the proxies used to assess surgical proficiency in this study (operative time and warm ischaemia time). Case difficulty is an important parameter that should be considered when evaluating procedural learning curves. There is not one well fitting model that can be used to model the learning curve. With increasing experience, clinicians tend to operate on more difficult cases. Copyright © 2016 IJS Publishing Group Ltd. Published by Elsevier Ltd. All rights reserved.
A cost-benefit analysis of demand for food.
Hursh, S R; Raslear, T G; Shurtleff, D; Bauman, R; Simmons, L
1988-01-01
Laboratory studies of consumer demand theory require assumptions regarding the definition of price in the absence of a medium of exchange (money). In this study we test the proposition that the fundamental dimension of price is a cost-benefit ratio expressed as the effort expended per unit of food value consumed. Using rats as subjects, we tested the generality of this "unit price" concept by varying four dimensions of price: fixed-ratio schedule, number of food pellets per fixed-ratio completion, probability of reinforcement, and response lever weight or effort. Two levels of the last three factors were combined in a 2 x 2 x 2 design giving eight groups. Each group was studied under a series of six FR schedules. Using the nominal values of all factors to determine unit price, we found that grams of food consumed plotted as a function of unit price followed a single demand curve. Similarly, total work output (responses x effort) conformed to a single function when plotted in terms of unit price. These observations provided a template for interpreting the effects of biological factors, such as brain lesions or drugs, that might alter the cost-benefit ratio. PMID:3209958
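The unit-price bookkeeping described above can be illustrated in a few lines: cost factors (ratio requirement, effort) go in the numerator and benefit factors (pellets, reinforcement probability) in the denominator, so nominally different procedures can impose the same unit price. The numbers below are illustrative, not the study's.

```python
# Unit price = (responses required x effort) / (pellets x probability). Values are illustrative.

def unit_price(fixed_ratio, effort, pellets_per_ratio, reinforcement_prob):
    """Cost-benefit ratio used to place different procedures on one demand curve."""
    return (fixed_ratio * effort) / (pellets_per_ratio * reinforcement_prob)

# Two nominally different procedures that impose the same unit price:
print(unit_price(fixed_ratio=30, effort=1.0, pellets_per_ratio=1, reinforcement_prob=1.0))
print(unit_price(fixed_ratio=15, effort=1.0, pellets_per_ratio=1, reinforcement_prob=0.5))
```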
NASA Astrophysics Data System (ADS)
Ebersbach, K. F.; Fischer, A.; Layer, G.; Steinberger, W.; Wegner, M.; Wiesner, B.
1982-07-01
The energy demand in the sector of trade and commerce was registered and analyzed. Measures to improve the energy demand structure are presented. In several typical firms, such as hotels, office buildings, locksmith's shops, motor vehicle repair shops, butcher's shops, laundries and bakeries, detailed surveys of energy consumption were carried out and included in a statistical evaluation. Subjects analyzed were: development of the energy supply; technology of energy application; final energy demand broken down into demand for light, power, space heating and process heat as well as the demand for cooling; daily and annual load curves of energy consumption and their dependence on various parameters; and measures to improve the structure of energy demand. Detailed measurement points out areas of neglect in the surveyed firms and shows possibilities for likely energy savings. In addition, standard values for specific energy consumption are obtained.
NASA Astrophysics Data System (ADS)
Ebersbach, K. F.; Fischer, A.; Layer, G.; Steinberger, W.; Wegner, M.; Wiesner, B.
1982-06-01
The energy demand in trade and commerce was analyzed. Measures to improve the energy demand structure are presented. In several typical firms, like hotels, office buildings, locksmith's shops, motor vehicle repair shops, butcher's shops, laundries and bakeries, energy consumption was surveyed and statistically evaluated. Subjects analyzed are: the development of the energy supply; the technology of energy application; the final energy demand broken down into demand for light, power, space heating and process heat as well as the demand for cooling; the daily and annual load curve of energy consumption and its dependence on various parameters; and measures to improve the structure of energy demand. The detailed measurement points out areas of neglect in the surveyed firms and shows some possibilities for likely energy savings. In addition, standard values for specific energy consumption are obtained.
NASA Astrophysics Data System (ADS)
Abeygunawardane, Saranga Kumudu
2018-02-01
Any electrical utility prefers to implement demand side management and change the shape of the demand curve in a beneficial manner. This paper aims to assess the financial gains (or losses) to the generating sector through the implementation of demand side management programs. An optimization algorithm is developed to find the optimal generation mix that minimizes the daily total generating cost. This daily total generating cost includes the daily generating cost as well as the environmental damage cost. The proposed optimization algorithm is used to find the daily total generating cost for the base case and for several demand side management programs using the data obtained from the Sri Lankan power system. Results obtained for DSM programs are compared with the results obtained for the base case to assess the financial benefits of demand side management to the generating sector.
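The sketch below is a toy version of this kind of comparison, not the paper's algorithm: a least-cost dispatch, with an assumed environmental damage cost folded into each unit's cost per MWh, is solved for a base load curve and for a DSM-flattened curve carrying the same energy. All unit data and load shapes are invented for illustration.

```python
# Toy least-cost dispatch over a daily load curve, with environmental damage cost
# included in each unit's $/MWh. Unit data and load shapes are invented.
import numpy as np
from scipy.optimize import linprog

units = [  # (name, capacity MW, fuel cost $/MWh, damage cost $/MWh)
    ("coal", 800.0, 40.0, 25.0),
    ("combined_cycle", 600.0, 70.0, 12.0),
    ("gas_turbine", 400.0, 120.0, 18.0),
]

def daily_cost(load_mw):
    n_units, n_hours = len(units), len(load_mw)
    cost = np.array([[fuel + dmg] * n_hours for _, _, fuel, dmg in units]).ravel()
    caps = np.array([[cap] * n_hours for _, cap, _, _ in units]).ravel()
    # One equality constraint per hour: total generation meets demand.
    a_eq = np.zeros((n_hours, n_units * n_hours))
    for h in range(n_hours):
        a_eq[h, h::n_hours] = 1.0
    res = linprog(cost, A_eq=a_eq, b_eq=load_mw,
                  bounds=list(zip(np.zeros_like(caps), caps)), method="highs")
    return res.fun

hours = np.arange(24)
base_load = 1100.0 + 600.0 * np.sin((hours - 6) * np.pi / 12.0).clip(0)   # peaky daytime load
dsm_load = base_load.mean() + 0.5 * (base_load - base_load.mean())        # flattened by DSM

print(f"base-case daily cost: ${daily_cost(base_load):,.0f}")
print(f"DSM-case  daily cost: ${daily_cost(dsm_load):,.0f}")
```

Because the flattened curve keeps the same energy but relies less on the expensive peaking unit, the DSM case comes out cheaper, which is the kind of financial gain to the generating sector the study quantifies.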
Ignjatović, Aleksandra; Stojanović, Miodrag; Milošević, Zoran; Anđelković Apostolović, Marija
2017-12-02
The interest in developing risk models in medicine is not only appealing but also associated with many obstacles across different aspects of predictive model development. Initially, the association of one or more biomarkers with a specific outcome was established through statistical significance, but novel and demanding questions have required the development of new and more complex statistical techniques. The progress of statistical analysis in biomedical research is best observed through the history of the Framingham study and the development of the Framingham score. Evaluation of predictive models rests on a combination of several metrics. When logistic regression or Cox proportional hazards regression is used, calibration testing and ROC curve analysis should be mandatory and eliminatory, and a central place should be given to some newer statistical techniques. To obtain complete information on a new marker in a model, it is now recommended to use reclassification tables, calculating the net reclassification index and the integrated discrimination improvement. Decision curve analysis is a novel method for evaluating the clinical usefulness of a predictive model. It may be noted that customizing and fine-tuning of the Framingham risk score initiated much of this development in statistical analysis. A clinically applicable predictive model should strike a trade-off among all the abovementioned statistical metrics: between calibration and discrimination, accuracy and decision-making, costs and benefits, and the quality and quantity of the patient's life.
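A minimal sketch of the decision-curve idea follows, assuming simulated risks and outcomes: the net benefit of acting on a model's predictions is compared with treating everyone and treating no one across a range of threshold probabilities.

```python
# Decision curve analysis: net benefit of a risk model versus treat-all and treat-none.
# Predicted risks and outcomes are simulated, not from any real cohort.
import numpy as np

rng = np.random.default_rng(7)
n = 1000
risk = rng.beta(2, 8, size=n)                       # simulated predicted probabilities
outcome = rng.random(n) < risk                      # simulated events consistent with the risks

def net_benefit(pred_risk, y, threshold):
    treat = pred_risk >= threshold
    tp = np.sum(treat & y) / len(y)
    fp = np.sum(treat & ~y) / len(y)
    return tp - fp * threshold / (1.0 - threshold)

prevalence = outcome.mean()
for pt in (0.05, 0.10, 0.20, 0.30):
    nb_model = net_benefit(risk, outcome, pt)
    nb_all = prevalence - (1 - prevalence) * pt / (1 - pt)   # treat everyone
    print(f"threshold {pt:.2f}: model NB = {nb_model:.3f}, "
          f"treat-all NB = {nb_all:.3f}, treat-none NB = 0")
```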
Liu, Xiao-Hui; Wang, Wei-Liang; Lu, Shao-Yong; Wang, Yu-Fan; Ren, Zongming
2016-08-01
In Zaozhuang, economic development affects the discharge amount of industrial wastewater, chemical oxygen demand (COD), and ammonia nitrogen (NH3-N). To reveal the trend of water environmental quality related to the economy in Zaozhuang, this paper simulated the relationships between industrial wastewater discharge, COD and NH3-N load, and gross domestic product (GDP) per capita for Zaozhuang (2002-2012) using environmental Kuznets curve (EKC) models. The results showed that the added value of industrial GDP, per capita GDP, and wastewater emission had average annual growth rates of 16.62, 16.19, and 17.89 %, respectively, from 2002 to 2012, while COD and NH3-N emissions in 2012, compared with 2002, showed average annual decreases of 10.70 and 31.12 %, respectively. The fitted EKC models revealed that industrial wastewater discharge had a typical inverted-U-shaped relationship with per capita GDP, whereas COD and NH3-N tracked only the left-hand side of a U-shaped curve. The economy in Zaozhuang has been at the "fast-growing" stage, with low environmental pollution according to the industrial pollution level. In recent years, Zaozhuang has emphatically abated its heavy-pollution industries, so pollutants have been greatly reduced. Thus, industrial wastewater treatment in Zaozhuang has been quite effective, and water quality has improved significantly. The EKC models provide scientific evidence for estimating industrial wastewater discharge, COD, and NH3-N load, as well as their trends, for Zaozhuang from an economic perspective.
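A small sketch of the standard quadratic EKC specification is given below, fitted to simulated data rather than the Zaozhuang series: ln(E) = b0 + b1*ln(GDPpc) + b2*ln(GDPpc)^2, with an inverted U indicated by b2 < 0 and a turning point at exp(-b1/(2*b2)).

```python
# Quadratic environmental Kuznets curve fit and turning point, on simulated data.
import numpy as np

rng = np.random.default_rng(3)
ln_gdp = np.linspace(8.0, 11.0, 11)                           # log per-capita GDP, simulated
ln_emissions = -20.0 + 5.0 * ln_gdp - 0.26 * ln_gdp**2 + rng.normal(0, 0.05, ln_gdp.size)

b2, b1, b0 = np.polyfit(ln_gdp, ln_emissions, 2)              # quadratic EKC fit
print(f"b1 = {b1:.2f}, b2 = {b2:.3f}")
if b2 < 0:
    turning_point = np.exp(-b1 / (2.0 * b2))
    print(f"inverted-U shape; turning point at per-capita GDP of about {turning_point:,.0f}")
else:
    print("no inverted-U relationship detected")
```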
Skidmore, Jessica R.; Murphy, James G.; Martens, Matthew P.
2014-01-01
The aims of the current study were to examine the associations among behavioral economic measures of alcohol value derived from three distinct measurement approaches, and to evaluate their respective relations with traditional indicators of alcohol problem severity in college drinkers. Five behavioral economic metrics were derived from hypothetical demand curves that quantify reward value by plotting consumption and expenditures as a function of price, another metric measured proportional behavioral allocation and enjoyment related to alcohol versus other activities, and a final metric measured relative discretionary expenditures on alcohol. The sample included 207 heavy drinking college students (53% female) who were recruited through an on-campus health center or university courses. Factor analysis revealed that the alcohol valuation construct comprises two factors: one factor that reflects participants’ levels of alcohol price sensitivity (demand persistence), and a second factor that reflects participants’ maximum consumption and monetary and behavioral allocation towards alcohol (amplitude of demand). The demand persistence and behavioral allocation metrics demonstrated the strongest and most consistent multivariate relations with alcohol-related problems, even when controlling for other well-established predictors. The results suggest that behavioral economic indices of reward value show meaningful relations with alcohol problem severity in young adults. Despite the presence of some gender differences, these measures appear to be useful problem indicators for men and women. PMID:24749779
Thermal energy storage heat exchanger: Molten salt heat exchanger design for utility power plants
NASA Technical Reports Server (NTRS)
Ferarra, A.; Yenetchi, G.; Haslett, R.; Kosson, R.
1977-01-01
The use of thermal energy storage (TES) in the latent heat of molten salts as a means of conserving fossil fuels and lowering the cost of electric power was evaluated. Public utility systems provide electric power on demand. This demand is generally maximum during late weekday afternoons, with considerably lower overnight and weekend loads. Typically, the average demand is only 60% to 80% of peak load. As peak load increases, the present practice is to purchase power from other grid facilities or to bring older, less efficient fossil-fuel plants on line, which increases the cost of electric power. The widespread use of oil-fired boilers, gas turbine and diesel equipment to meet peaking loads depletes our oil-based energy resources. Heat exchangers utilizing molten salts can be used to level the energy consumption curve. The study begins with a demand analysis and the consideration of several existing modern fossil-fuel and nuclear power plants for use as models. Salts are evaluated for thermodynamic, economic, corrosive, and safety characteristics. Heat exchanger concepts are explored and heat exchanger designs are conceived. Finally, the economics of TES conversions in existing plants and new construction is analyzed. The study concluded that TES is feasible in electric power generation. Substantial data are presented for TES design, and reference material for further investigation of techniques is included.
Econophysics: Master curve for price-impact function
NASA Astrophysics Data System (ADS)
Lillo, Fabrizio; Farmer, J. Doyne; Mantegna, Rosario N.
2003-01-01
The price reaction to a single transaction depends on transaction volume, the identity of the stock, and possibly many other factors. Here we show that, by taking into account the differences in liquidity for stocks of different size classes of market capitalization, we can rescale both the average price shift and the transaction volume to obtain a uniform price-impact curve for all size classes of firm for four different years (1995-98). This single-curve collapse of the price-impact function suggests that fluctuations from the supply-and-demand equilibrium for many financial assets, differing in economic sectors of activity and market capitalization, are governed by the same statistical rule.
Optimization of the Upper Surface of Hypersonic Vehicle Based on CFD Analysis
NASA Astrophysics Data System (ADS)
Gao, T. Y.; Cui, K.; Hu, S. C.; Wang, X. P.; Yang, G. W.
2011-09-01
For a hypersonic vehicle, the aerodynamic performance requirements become more demanding; optimizing the vehicle's shape to meet project demands is therefore an important task, and shape optimization is a key technology for improving its performance. Based on an existing vehicle, the upper surface of a simplified hypersonic vehicle was optimized to obtain a shape that suits the project demand. At the cruising condition, the upper surface was parameterized with the B-spline curve method, and an incremental parametric method and local mesh reconstruction technology were applied. The whole flow field was calculated and the aerodynamic performance of the craft was obtained with computational fluid dynamics (CFD). The vehicle shape was then optimized to achieve the maximum lift-to-drag ratio at angles of attack of 3°, 4° and 5°. The results provide a reference for practical design.
A static predictor of seismic demand on frames based on a post-elastic deflected shape
Mori, Y.; Yamanaka, T.; Luco, N.; Cornell, C.A.
2006-01-01
Predictors of seismic structural demands (such as inter-storey drift angles) that are less time-consuming than nonlinear dynamic analysis have proven useful for structural performance assessment and for design. Luco and Cornell previously proposed a simple predictor that extends the idea of modal superposition (of the first two modes) with the square-root-of-sum-of-squares (SRSS) rule by taking a first-mode inelastic spectral displacement into account. This predictor achieved a significant improvement over simply using the response of an elastic oscillator; however, it cannot capture well large displacements caused by local yielding. A possible improvement of Luco's predictor is discussed in this paper, where it is proposed to consider three enhancements: (i) a post-elastic first-mode shape approximated by the deflected shape from a nonlinear static pushover analysis (NSPA) at the step corresponding to the maximum drift of an equivalent inelastic single-degree-of-freedom (SDOF) system, (ii) a trilinear backbone curve for the SDOF system, and (iii) the elastic third-mode response for long-period buildings. Numerical examples demonstrate that the proposed predictor is less biased and results in less dispersion than Luco's original predictor. Copyright © 2006 John Wiley & Sons, Ltd.
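The modal-combination idea underlying such predictors can be sketched in a few lines. The sketch below is a generic SRSS combination of a first-mode (here inelastic) displacement contribution with an elastic higher-mode contribution; the mode shapes, participation factors, and spectral values are placeholders, and this is not the exact predictor proposed in the paper.

```python
# Hedged sketch of an SRSS-type drift predictor: combine a first-mode inelastic
# spectral displacement with an elastic higher-mode response. Values are illustrative.
import numpy as np

# Storey-level modal quantities (participation factor x mode-shape difference),
# hypothetical for a 3-storey frame.
gamma_phi = {
    1: np.array([0.40, 0.35, 0.25]),    # first mode, per storey
    2: np.array([0.15, -0.05, -0.20]),  # second mode, per storey
}
Sd_inelastic_1 = 0.12   # first-mode inelastic spectral displacement [m]
Sd_elastic_2 = 0.03     # second-mode elastic spectral displacement [m]
storey_height = 3.0     # [m]

drift_1 = gamma_phi[1] * Sd_inelastic_1 / storey_height
drift_2 = gamma_phi[2] * Sd_elastic_2 / storey_height
# SRSS combination of the two modal contributions, storey by storey.
drift_srss = np.sqrt(drift_1**2 + drift_2**2)
print("Predicted inter-storey drift angles:", drift_srss.round(4))
```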
Wang, Bing; Liu, Lei; Huang, Guohe
2017-11-01
Using the Environmental Kuznets Curve (EKC) hypothesis, this study explored the dynamic trends of water use and point-source pollution in Urumqi (2000-2014) from an economic perspective. Retrospective analysis indicated that total GDP and GDP per capita increased roughly tenfold and fivefold, respectively, since 2000. Total, municipal, and industrial water use had average annual growth rates of 3.96%, 7.01%, and 3.69%, respectively, whereas agricultural water use and emissions of COD and NH3-N showed average annual decreases of 3.06%, 12.40%, and 4.74%. Regression models suggest that total water demand in Urumqi will maintain a monotonically increasing relationship with GDP and GDP per capita in the foreseeable future, although the relationships between specific water uses and economic growth show diverse trends. Discharges of COD and NH3-N are expected to decline further with economic growth. It can be concluded that Urumqi has largely passed the stage in which economic growth caused serious environmental deterioration, but the increasing water demand remains an urgent problem. The results should be helpful for future water resources management and pollution control.
NASA Technical Reports Server (NTRS)
Wolitz, K.; Brockmann, W.; Fischer, T.
1979-01-01
Acoustic emission analysis as a quasi-nondestructive test method makes it possible to differentiate clearly, in judging the total behavior of fiber-reinforced plastic composites, between critical failure modes (in the case of unidirectional composites fiber fractures) and non-critical failure modes (delamination processes or matrix fractures). A particular advantage is that, for varying pressure demands on the composites, the emitted acoustic pulses can be analyzed with regard to their amplitude distribution. In addition, definite indications as to how the damages occurred can be obtained from the time curves of the emitted acoustic pulses as well as from the particular frequency spectrum. Distinct analogies can be drawn between the various analytical methods with respect to whether the failure modes can be classified as critical or non-critical.
An economic systems analysis of land mobile radio telephone services
NASA Technical Reports Server (NTRS)
Leroy, B. E.; Stevenson, S. M.
1980-01-01
This paper deals with the economic interaction of the terrestrial and satellite land-mobile radio service systems. The cellular, trunked, and satellite land-mobile systems are described. Parametric equations are formulated to allow examination of necessary user thresholds and growth rates as functions of system costs. Conversely, first-order allowable system costs are found as a function of user thresholds and growth rates. Transitions between satellite and terrestrial service systems are examined. User growth rate density (users/year/km²) is shown to be a key parameter in the analysis of system compatibility. The concept of matching system design to price-demand curves is introduced and examples are given. The role of satellite systems is critically examined and the economic conditions necessary for the introduction of satellite service are identified.
Grinding Method and Error Analysis of Eccentric Shaft Parts
NASA Astrophysics Data System (ADS)
Wang, Zhiming; Han, Qiushi; Li, Qiguang; Peng, Baoying; Li, Weihua
2017-12-01
Eccentric shaft parts are widely used in RV reducers and various other mechanical transmissions, and precision grinding technology for such parts is now in demand. In this paper, the X-C linkage model for eccentric shaft grinding is studied. Using an inversion method, the contour curve of the wheel envelope is derived, and the distance from the center of the eccentric circle is shown to be constant. Simulation software for eccentric shaft grinding was developed and used to verify the correctness of the model. The influence of the X-axis feed error, the C-axis feed error, and the wheel radius error on the grinding process is analyzed, and a corresponding error calculation model is proposed. The simulation analysis provides a basis for contour error compensation.
Using behavioral statistical physics to understand supply and demand
NASA Astrophysics Data System (ADS)
Farmer, Doyne
2007-03-01
We construct a quantitative theory for a proxy for supply and demand curves using methods that look and feel a lot like physics. Neoclassical economics postulates that supply and demand curves can be explained as the result of rational agents selfishly maximizing their utility, but this approach has had very little empirical success. We take quite a different approach, building supply and demand curves out of impulsive responses to not-quite-random trading fluctuations. For reasons of empirical measurability, as a good proxy for changes in supply and demand we study the aggregate price-impact function R(V), giving the average logarithmic price change R as a function of the signed trading volume V. (If a trade v_i is initiated by a buyer it has a plus sign, and vice versa for sellers; the signed trading volume for a series of N successive trades is V_N(t) = ∑_{i=t}^{t+N} v_i.) We develop a "zero-intelligence" null hypothesis in which each trade v_i gives an impulsive kick f(v_i) to the price, so that the average return is R_N(t) = ∑_{i=t}^{t+N} f(v_i). Under the assumption that the v_i are IID, R(V_N) has a characteristic concave shape, becoming linear in the limit as N → ∞. Under some circumstances this is universal for large N, in the sense that it is independent of the functional form of f. While this null hypothesis gives useful qualitative intuition, to make it quantitatively correct one must add two additional elements: (1) the signs of v_i are a long-memory process, and (2) the return R is efficient, in the sense that it is not possible to make profits with a linear prediction of the signs of v_i. Using data from the London Stock Exchange we demonstrate that this theory works well, predicting both the magnitude and shape of R(V_N). We show that the fluctuations in R are very large and for some purposes more important than the average behavior. A computer model for the fluctuations suggests the existence of an equation of state relating the diffusion rate of prices to the flow of trading orders.
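A minimal numerical sketch of the "zero-intelligence" null hypothesis described above: draw IID signed volumes, apply a concave kick function f (a square-root form is assumed here purely for illustration), and bin the aggregated returns against the aggregate signed volume to reveal the concave shape of R(V_N).

```python
# Sketch of the zero-intelligence null: IID signed trades, impulsive kicks f(v),
# and the resulting average return R as a function of aggregate signed volume V_N.
import numpy as np

rng = np.random.default_rng(0)
n_trades, N = 200_000, 20           # total trades, aggregation window
signs = rng.choice([-1.0, 1.0], size=n_trades)
sizes = rng.lognormal(mean=0.0, sigma=1.0, size=n_trades)
v = signs * sizes                    # signed trade volumes (IID here by assumption)

f = lambda x: np.sign(x) * np.sqrt(np.abs(x))   # assumed concave kick function

# Non-overlapping windows of N trades.
V_N = v[: n_trades // N * N].reshape(-1, N).sum(axis=1)
R_N = f(v[: n_trades // N * N]).reshape(-1, N).sum(axis=1)

# Bin R_N against V_N to estimate the average price-impact curve R(V_N).
bins = np.quantile(V_N, np.linspace(0, 1, 21))
idx = np.digitize(V_N, bins[1:-1])
for b in range(20):
    m = idx == b
    if m.any():
        print(f"V~{V_N[m].mean():8.2f}  R~{R_N[m].mean():6.3f}")
```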
Reliability Analysis of a Green Roof Under Different Storm Scenarios
NASA Astrophysics Data System (ADS)
William, R. K.; Stillwell, A. S.
2015-12-01
Urban environments continue to face the challenges of localized flooding and decreased water quality brought on by the increasing amount of impervious area in the built environment. Green infrastructure provides an alternative to conventional storm sewer design by using natural processes to filter and store stormwater at its source. However, there are currently few consistent standards available in North America to ensure that installed green infrastructure is performing as expected. This analysis offers a method for characterizing green roof failure using a visual aid commonly used in earthquake engineering: fragility curves. We adapted the concept of the fragility curve based on the efficiency of runoff reduction provided by a green roof compared to a conventional roof under different storm scenarios. We used the 2D distributed, coupled surface water-groundwater model MIKE SHE to simulate the impact that a real green roof might have on runoff in different storm events, and then employed multiple regression analysis to generate an algebraic demand model that was input into the MATLAB-based reliability analysis program FERUM to calculate the probability of failure. The use of reliability analysis as a part of green infrastructure design code can provide insights into green roof weaknesses and areas for improvement. It also supports the design of code that is more resilient than current standards and is easily testable for failure. Finally, understanding the reliability of a single green roof module under different scenarios can support holistic testing of system reliability.
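As an illustration of the fragility-curve idea, the sketch below fits a lognormal fragility function (the form commonly used in earthquake engineering) to binary pass/fail outcomes for a roof under storms of increasing depth; the failure criterion, storm depths, and outcomes are invented for the example and are not the study's data or its FERUM-based procedure.

```python
# Sketch: fit a lognormal fragility curve P(failure | storm depth) to binary
# outcomes via maximum likelihood. Data below are synthetic placeholders.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

storm_depth = np.array([10, 20, 30, 40, 50, 60, 80, 100.0])  # mm per event
failed      = np.array([ 0,  0,  1,  0,  1,  1,  1,   1])    # 1 = runoff criterion exceeded

def neg_log_lik(params):
    median, beta = np.exp(params)          # enforce positive parameters
    p = norm.cdf(np.log(storm_depth / median) / beta)
    p = np.clip(p, 1e-9, 1 - 1e-9)
    return -np.sum(failed * np.log(p) + (1 - failed) * np.log(1 - p))

res = minimize(neg_log_lik, x0=np.log([40.0, 0.5]), method="Nelder-Mead")
median, beta = np.exp(res.x)
print(f"median capacity ~{median:.1f} mm, dispersion beta ~{beta:.2f}")
print("P(fail | 70 mm) ~", norm.cdf(np.log(70 / median) / beta).round(3))
```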
Economic concepts to address future water supply-demand imbalances in Iran, Morocco and Saudi Arabia
NASA Astrophysics Data System (ADS)
Hellegers, Petra; Immerzeel, Walter; Droogers, Peter
2013-10-01
In Middle East and North Africa (MENA) countries, renewable groundwater and surface water supplies are limited while demand for water is growing rapidly. Climate change is expected to increase water demand even further. The main aim of this paper is to evaluate the water supply-demand imbalances in Iran, Morocco, and Saudi Arabia in 2040-2050 under dry, average, and wet climate change projections, and to show, on the basis of the marginal cost and marginal value of water, the optimum mix of supply-side and demand-side adjustments to address the imbalance. A hydrological model was used to estimate the water supply-demand imbalance. Water supply and demand curves were used to explore at which (marginal value of) water use the marginal cost of supply enhancement becomes too expensive. The results indicate that in the future, in all cases except Iran under the wet climate projection, the quantity of water demanded has to be reduced considerably to address the imbalance, which is indeed already happening.
NASA Astrophysics Data System (ADS)
Moore, K. M.; Jaeger, W. K.; Jones, J. A.
2013-12-01
A central characteristic of large river basins in the western US is the spatial and temporal disjunction between the supply of and demand for water. Water sources are typically concentrated in forested mountain regions distant from municipal and agricultural water users, while precipitation is super-abundant in winter and deficient in summer. To cope with these disparities, systems of reservoirs have been constructed throughout the West. These reservoir systems are managed to serve two main competing purposes: to control flooding during winter and spring, and to store spring runoff and deliver it to populated, agricultural valleys during the summer. The reservoirs also provide additional benefits, including recreation, hydropower, and instream flows for stream ecology. Since the storage capacity of the reservoirs cannot be used for both flood control and storage at the same time, these uses are traded off during spring, as the most important, or dominant, use of the reservoir shifts from buffering floods to storing water for summer use. This tradeoff is expressed in the operations rule curve, which specifies the maximum level to which a reservoir can be filled throughout the year, apart from real-time flood operations. These rule curves were often established at the time a reservoir was built. However, climate change and human impacts may be altering the timing and amplitude of flood events, and water scarcity is expected to intensify with anticipated changes in climate, land cover, and population. These changes imply that reservoir management using current rule curves may not match future societal values for the diverse uses of water from reservoirs. Despite a broad literature on mathematical optimization for reservoir operation, these methods are not often used because they (1) simplify the hydrologic system, raising doubts about the real-world applicability of the solutions, (2) exhibit perfect foresight and assume stationarity, whereas reservoir operators face uncertainty and risk daily, and (3) require complex computer programming. The proposed research addresses these critiques by pursuing a novel approach: the development of an analytical method to demonstrate how reservoir management could adapt to anticipated changes in water supply and demand, which incorporates some of the complexity of the hydrologic system, includes stochasticity, and can be readily implemented. Employing a normative economic framework of social welfare maximization, the research will (1) estimate the social benefits associated with reservoir uses, (2) analytically derive conditions for maximizing the benefits of reservoir operation, and (3) estimate the resulting optimal operating rules under future trajectories of climate, land cover, and population. The findings of this analysis will be used to address the following research questions: (1) How do the derived optimal operating rules compare to the existing rule curves? (2) How does the shape of the derived rule curves change under different scenarios of global change? (3) What is the change in net social benefits resulting from the use of these derived rule curves as compared to existing rule curves? (4) To the extent possible, what are the distributional and social justice implications of the derived changes in the rule curves?
Reinventing the Solar Power Satellite
NASA Technical Reports Server (NTRS)
Landis, Geoffrey A.
2004-01-01
The selling price of electrical power varies with time. The economic viability of space solar power is greatest if the power can be sold at peak rates rather than at the baseline rate. Price and demand for electricity were examined using spot-market data from four example markets: New England, New York City, suburban New York, and California. The data were averaged to show the average price and demand for power as a function of time of day and time of year. Demand varies roughly by a factor of two between the early-morning minimum and the afternoon maximum; both the amount of peak power and the location of the peak depend significantly on the location and the weather. The demand curves were compared to the availability curves for solar energy and for tracking and non-tracking satellite solar power systems in order to compare the market value of terrestrial and satellite solar electrical power. In part 2, new designs for a space solar power (SSP) system were analyzed to provide electrical power to Earth at economically competitive rates, looking at innovative power architectures as more practical approaches to space solar power. A significant barrier is the initial investment required before the first power is returned. Three new concepts for solar power satellites were invented and analyzed: a solar power satellite at the Earth-Sun L2 point, a geosynchronous no-moving-parts solar power satellite, and a non-tracking geosynchronous solar power satellite with an integral phased array. The integral-array satellite had several advantages, including an initial investment cost approximately eight times lower than the conventional design.
NASA Astrophysics Data System (ADS)
Mousavi, Seyed Hosein; Nazemi, Ali; Hafezalkotob, Ashkan
2016-09-01
With the increasing use of different types of auctions in market design, modeling participants' behavior to evaluate market structure is one of the main topics in studies of deregulated power industries. In this article, we apply an optimal bidding behavior approach to the Iran wholesale electricity market, a restructured electric power industry, and model how market participants bid in the spot electricity market. The problem is formulated analytically using the Nash equilibrium concept with a large number of players having discrete and very large strategy spaces. We then compute and draw the supply curve of the competitive market, in which all generators' offered prices equal their marginal costs, and the supply curve of the real market, in which the pricing mechanism is pay-as-bid. We finally calculate the lost welfare, or inefficiency, of the Nash equilibrium and the real market by comparing their supply curves with the competitive curve. We examine three cases, two on November 24 and one on July 24, 2012. In the Nash equilibrium on November 24, with a demand of 23,487 MW, there are 212 allowed plants in the first case (plants may choose any generation quantity, except for one plant that must offer its maximum power), and the economic efficiency, or social welfare, of the Nash equilibrium is 2.77 times that of the real market. There are 184 allowed plants in the second case (plants must offer their maximum power at different prices), and the social welfare of the Nash equilibrium is 3.6 times that of the real market. On July 24, with a demand of 42,421 MW, all 370 plants must generate maximum energy due to the high electricity demand, and the social welfare of the Nash equilibrium is about 2 times that of the real market.
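The welfare comparison described above amounts to comparing the area under two merit-order (supply) curves up to the market demand. The sketch below does this for a toy set of generators with made-up capacities, marginal costs, and pay-as-bid offers; it is not the Iranian market data or the paper's Nash-equilibrium computation.

```python
# Sketch: compare a competitive merit-order supply curve (offers = marginal cost)
# with a pay-as-bid supply curve and compute the extra cost at a fixed, inelastic
# demand. All capacities, costs, and bids are illustrative.
import numpy as np

capacity  = np.array([300, 250, 400, 200, 350.0])   # MW per plant
marg_cost = np.array([ 20,  25,  35,  50,  65.0])   # $/MWh
bids      = np.array([ 28,  30,  42,  55,  70.0])   # $/MWh offered under pay-as-bid
demand = 1000.0                                      # MW

def dispatch_cost(prices, cap, demand):
    """Stack plants from cheapest offer upward and accumulate price x quantity."""
    order = np.argsort(prices)
    served, cost = 0.0, 0.0
    for i in order:
        q = min(cap[i], demand - served)
        cost += prices[i] * q
        served += q
        if served >= demand:
            break
    return cost

competitive = dispatch_cost(marg_cost, capacity, demand)
pay_as_bid  = dispatch_cost(bids, capacity, demand)
print(f"competitive cost: {competitive:,.0f} $, pay-as-bid cost: {pay_as_bid:,.0f} $")
print(f"extra cost (lost welfare proxy): {pay_as_bid - competitive:,.0f} $")
```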
Estimation of Bid Curves in Power Exchanges using Time-varying Simultaneous-Equations Models
NASA Astrophysics Data System (ADS)
Ofuji, Kenta; Yamaguchi, Nobuyuki
Simultaneous-equations models (SEM) are generally used in economics to estimate interdependent endogenous variables such as price and quantity in a competitive, equilibrium market. In this paper, we apply SEM to the JEPX (Japan Electric Power eXchange) spot market, a single-price auction market, using publicly available data on selling and buying bid volumes, the system price, and traded quantity. The aim of this analysis is to understand the magnitude of the influence of selling and buying bids on the auctioned price and quantity, rather than to forecast prices and quantity for risk-management purposes. In contrast to Ordinary Least Squares (OLS) estimation, whose results represent time-independent averages, we employ a time-varying simultaneous-equations model (TV-SEM) to capture structural changes in those influences, using state-space models with stepwise Kalman filter estimation. The results show that the buying bid volume has the highest magnitude of influence among the factors considered, exhibiting time-dependent changes ranging as widely as about 240% of its average. The slope of the supply curve also varies over time, implying that the supplied commodity is elastic, while the demand curve remains comparatively inelastic and stable over time.
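The time-varying estimation can be sketched with a random-walk coefficient model estimated by a Kalman filter: y_t = x_t'β_t + ε_t with β_t = β_{t-1} + η_t. The code below is a generic implementation of that state-space recursion on simulated data; it is not the TV-SEM specification or the JEPX data, and the noise variances are assumed rather than estimated.

```python
# Sketch: Kalman filter for a regression with random-walk coefficients,
# y_t = x_t @ beta_t + eps_t,  beta_t = beta_{t-1} + eta_t.  Simulated data.
import numpy as np

rng = np.random.default_rng(1)
T, k = 300, 2
X = np.column_stack([np.ones(T), rng.normal(size=T)])      # intercept + one regressor
true_beta = np.cumsum(rng.normal(scale=0.05, size=(T, k)), axis=0) + np.array([1.0, 0.5])
y = np.einsum("tk,tk->t", X, true_beta) + rng.normal(scale=0.3, size=T)

sigma_eps2, Q = 0.3**2, 0.05**2 * np.eye(k)   # assumed observation and state noise
beta = np.zeros(k)
P = np.eye(k) * 10.0                          # diffuse-ish prior covariance
filtered = np.zeros((T, k))
for t in range(T):
    # Predict: random-walk state, so the mean is unchanged and P grows by Q.
    P = P + Q
    x = X[t]
    # Update with observation y_t.
    S = x @ P @ x + sigma_eps2
    K = P @ x / S
    beta = beta + K * (y[t] - x @ beta)
    P = P - np.outer(K, x @ P)
    filtered[t] = beta

print("final filtered coefficients:", filtered[-1].round(2), "true:", true_beta[-1].round(2))
```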
NASA Astrophysics Data System (ADS)
Beam, Craig A.
2002-04-01
Each year, approximately 60% of all US women over the age of 40 utilize mammography. Through the matrix of an imaging technology, this Population of Patients (POP) interacts with a population of approximately 20,000 physicians who interpret mammograms in the US. This latter Population of Diagnosticians (POD) operationally serves as the interface between an image-centric healthcare technology system and the patient. Methods: Using data collected from a large POD- and POP-based study, I evaluate the distribution of several ROC curve-related parameters in the POD and explore the health policy implications of a population ROC curve for mammography. Results and Conclusions: Principal Components Analysis suggests that two Binormal parameters are sufficient to explain variation in the POD and implies that the Binormal model is foundational to health policy research in mammography. A population ROC curve based on percentiles of the POD can be used to set targets to achieve national health policy goals. Medical image perception science provides the framework. Alternatively, a restrictive policy can be envisioned using performance criteria based on area. However, the data suggest this sort of policy would be too costly in terms of reduced healthcare service capacity in the US in the face of burgeoning demands.
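For reference, the two-parameter binormal ROC curve mentioned above has the closed form TPF = Φ(a + b·Φ⁻¹(FPF)), with area under the curve AUC = Φ(a/√(1+b²)). The short sketch below evaluates such a curve for illustrative parameter values; the values are not taken from the study.

```python
# Sketch: the two-parameter binormal ROC curve, TPF = Phi(a + b * Phi^-1(FPF)),
# and its area under the curve, AUC = Phi(a / sqrt(1 + b^2)). Parameters are illustrative.
import numpy as np
from scipy.stats import norm

a, b = 1.5, 0.9                      # hypothetical binormal parameters
fpf = np.linspace(1e-4, 1 - 1e-4, 9)
tpf = norm.cdf(a + b * norm.ppf(fpf))
auc = norm.cdf(a / np.sqrt(1 + b**2))

for f, t in zip(fpf, tpf):
    print(f"FPF={f:5.3f}  TPF={t:5.3f}")
print(f"AUC = {auc:.3f}")
```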
Geometrically nonlinear analysis of layered composite plates and shells
NASA Technical Reports Server (NTRS)
Chao, W. C.; Reddy, J. N.
1983-01-01
A degenerated three-dimensional finite element, based on the incremental total Lagrangian formulation of a three-dimensional layered anisotropic medium, was developed. Its use in the geometrically nonlinear, static and dynamic analysis of layered composite plates and shells is demonstrated. A two-dimensional finite element based on the Sanders shell theory with the von Karman (nonlinear) strains was also developed. It is shown that the deflections obtained by the 2D shell element deviate from those obtained by the more accurate 3D element for deep shells. The 3D degenerated element can be used to model general shells that are not necessarily doubly curved. The 3D degenerated element is computationally more demanding than the 2D shell theory element for a given problem. It is found that the 3D element is an efficient element for the analysis of layered composite plates and shells undergoing large displacements and transient motion.
NASA Astrophysics Data System (ADS)
Strzepek, K. M.; Kirshen, P.; Yohe, G.
2001-05-01
The fundamental theme of this research was to investigate tradeoffs in model resolution for modeling water resources in the context of national economic development and capital investment decisions. Based on a case study of China, the research team developed water resource models at relatively fine scales and then investigated how they can be aggregated to regional or national scales for use in national-level planning decisions or in global-scale integrated assessment models of food and/or environmental change issues. The team developed regional water supply and water demand functions; simplifying and aggregating these functions allows reduced-form representations of the water sector to be included in large-scale national economic models. Water supply cost functions were developed for both surface water and groundwater supplies. Surface water: long time series of flows at the mouths of the 36 major river sub-basins in China are used in conjunction with different basin reservoir storage quantities to obtain storage-yield curves. These are then combined with reservoir and transmission cost data to obtain yield-cost (surface water supply) curves. The long time series of flows for each basin is obtained by fitting a simple abcd water balance model to each basin. The costs of reservoir storage were estimated using a methodology developed in the USA that relates marginal storage costs to existing storage, slope, and geological conditions; the US cost functions were then adjusted to Chinese costs, and the costs of some actual dams in China were used to ground-truth the methodology. Groundwater: the purpose of the groundwater work is to estimate the recharge in each basin and the depths and quality of the aquifers. A byproduct of the application of the abcd water balance model is the recharge. Depths and quality of aquifers are being taken from many separate reports on groundwater in different parts of China; we have been unable to find any global or regional groundwater datasets. The surface and groundwater supply functions are then combined. Water demand curves: water use data are reported by political region, whereas water supply is reported and modeled by river basin, so water demands must be allocated to river basins. To accomplish this, China's 9 major river basins were divided into 36 hydroeconomic regions, and each county was allocated to one of the 36 hydroeconomic zones. The county-level water use data were aggregated to 5 major water use sectors: (1) industry; (2) urban municipal and vegetable gardens; (3) major irrigation; (4) energy; and (5) other agriculture (forestry, pasture, fishery). Sectoral demand functions that include price and income elasticities were developed for the 5 sectors in each of the 9 major river basins. The supply and demand curves were aggregated at a variety of geographical scales as well as levels of economic sectoral aggregation. Implications for investment and sustainable development policies were examined for the various aggregations using partial and general equilibrium modeling of the Chinese economy. These results and policy implications for China, as well as insights and recommendations for other developing countries, will be presented.
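For readers unfamiliar with it, the abcd model referred to above is a four-parameter monthly water-balance model (Thomas, 1981). The sketch below implements the standard formulation on made-up precipitation and potential-evapotranspiration inputs; the parameter values are placeholders, not calibrated values for any Chinese basin.

```python
# Sketch of the standard abcd monthly water-balance model (Thomas, 1981).
# Inputs and parameters are illustrative placeholders.
import numpy as np

def abcd(P, PET, a=0.98, b=400.0, c=0.4, d=0.2, S0=100.0, G0=50.0):
    """Return monthly streamflow (mm) for precipitation P and potential ET PET (mm)."""
    S, G = S0, G0
    Q = np.zeros_like(P, dtype=float)
    for t in range(len(P)):
        W = P[t] + S                                   # available water
        wb = (W + b) / (2.0 * a)
        Y = wb - np.sqrt(wb**2 - W * b / a)            # evapotranspiration opportunity
        S = Y * np.exp(-PET[t] / b)                    # end-of-month soil moisture
        recharge = c * (W - Y)                         # to the groundwater store
        direct_runoff = (1.0 - c) * (W - Y)
        G = (G + recharge) / (1.0 + d)                 # groundwater storage
        Q[t] = direct_runoff + d * G                   # streamflow = runoff + baseflow
    return Q

P   = np.array([80, 60, 40, 20, 10, 5, 5, 10, 30, 60, 90, 100.0])     # mm/month
PET = np.array([20, 30, 60, 90, 120, 140, 150, 130, 90, 50, 30, 20.0])
print(abcd(P, PET).round(1))
```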
NASA Astrophysics Data System (ADS)
Wu, Qi
2010-03-01
Demand forecasts play a crucial role in supply chain management, since the future demand for a product is the basis for the corresponding replenishment system. For demand series with small samples, seasonality, nonlinearity, randomness, and fuzziness, existing support vector kernels do not approximate the random curve of the sales time series well in the quadratic continuous integral (L2) space. In this paper, we present a hybrid intelligent system combining a wavelet-kernel support vector machine and particle swarm optimization for demand forecasting. Application to a car sales series shows that the forecasting approach based on the hybrid PSOWv-SVM model is effective and feasible; a comparison with other methods is also given, which shows that, for the discussed example, this method outperforms the hybrid PSOv-SVM and other traditional methods.
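To make the wavelet-kernel idea concrete, the sketch below plugs a commonly used Morlet-type wavelet kernel, K(x,y) = Π_j cos(1.75·(x_j−y_j)/a)·exp(−(x_j−y_j)²/(2a²)), into scikit-learn's SVR as a custom kernel callable. The kernel form, dilation parameter, features, and data are assumptions for illustration, and the paper's PSO tuning step is not reproduced; this is not the authors' exact model.

```python
# Sketch: support vector regression with a Morlet-type wavelet kernel passed to
# scikit-learn's SVR as a callable. Data and hyperparameters are illustrative;
# in the paper these would be tuned (e.g., by particle swarm optimization).
import numpy as np
from sklearn.svm import SVR

def wavelet_kernel(X, Y, a=1.0):
    """Gram matrix of the product Morlet wavelet kernel."""
    diff = X[:, None, :] - Y[None, :, :]              # (n, m, d) pairwise differences
    return np.prod(np.cos(1.75 * diff / a) * np.exp(-diff**2 / (2 * a**2)), axis=2)

rng = np.random.default_rng(2)
t = np.arange(60, dtype=float)
demand = 100 + 10 * np.sin(2 * np.pi * t / 12) + rng.normal(scale=2.0, size=t.size)

X = (t[:-1] % 12).reshape(-1, 1)      # a single seasonal feature (month index)
y = demand[1:]                        # next-period demand

model = SVR(kernel=wavelet_kernel, C=10.0, epsilon=0.5)
model.fit(X, y)
print("in-sample predictions:", model.predict(X[:5]).round(1))
```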
The Mobility Assessment Course for the Diagnosis of Spatial Neglect: Taking a Step Forward?
Grech, Megan; Stuart, Tracey; Williams, Lindy; Chen, Celia; Loetscher, Tobias
2017-01-01
Spatial neglect after stroke can be a challenging syndrome to diagnose under standard neuropsychological assessment. There is now sufficient evidence that those affected might demonstrate neglect behavior in everyday settings despite showing no signs of neglect during common neglect tasks. This discrepancy is attributed to the simplified and unrealistic nature of common pen and paper based tasks that do not match the demanding, novel, and complex environment of everyday life. As such, increasing task demands under more ecologically valid scenarios has become an important method of increasing test sensitivity. The main aim of the current study was to evaluate the diagnostic utility of the Mobility Assessment Course (MAC), an ecological task, for the assessment of neglect. If neglect becomes more apparent under more challenging task demands the MAC could prove to be more diagnostically accurate at detecting neglect than conventional methods, particularly as the time from initial brain damage increases. Data collected by Guide Dogs of SA/NT were retrospectively analyzed. The Receiver Operating Characteristic (ROC) curve, a measure of sensitivity and specificity, was used to investigate the diagnostic utility of the MAC and a series of paper and pencil tests in 67 right hemisphere stroke survivors. While the MAC proved to be a more sensitive neglect test (74.2%) when compared to the Star Cancellation (43.3%) and Line Bisection (35.7%) tests, this was at the expense of relatively low specificity. As a result, the ROC curve analysis showed no statistically discernable differences between tasks (p > 0.12), or between subacute and chronic groups for individual tasks (p > 0.45). It is concluded that, while the MAC is an ecologically valid alternative for assessing neglect, regarding its diagnostic accuracy, there is currently not enough evidence to suggest that it is a big step forward in comparison to the accuracy of conventional tests. PMID:29163331
Han, Hyung Joon; Choi, Sae Byeol; Park, Man Sik; Lee, Jin Suk; Kim, Wan Bae; Song, Tae Jin; Choi, Sang Yong
2011-07-01
Single port laparoscopic surgery has come to the forefront of minimally invasive surgery. For those familiar with conventional techniques, however, this type of operation demands a different type of eye-hand coordination and involves unfamiliar working instruments. Herein, the authors describe the learning curve and the clinical outcomes of single port laparoscopic cholecystectomy for 150 consecutive patients with benign gallbladder disease. All patients underwent single port laparoscopic cholecystectomy using a homemade glove port by one of five operators with different levels of experience in laparoscopic surgery. The learning curve for each operator was fitted using the non-linear ordinary least squares method based on a non-linear regression model. Mean operating time was 77.6 ± 28.5 min. Fourteen patients (6.0%) were converted to conventional laparoscopic cholecystectomy. Complications occurred in 15 patients (10.0%), as follows: bile duct injury (n = 2), surgical site infection (n = 8), seroma (n = 2), and wound pain (n = 3). One operator reached a learning curve plateau of 61.4 min per procedure after 8.5 cases, an improvement of 95.3 min over his initial operation time. Younger surgeons showed significant decreases in mean operation time and achieved stable mean operation times; in particular, their operation times decreased significantly after 20 cases. Experienced laparoscopic surgeons can safely perform single port laparoscopic cholecystectomy using conventional or angled laparoscopic instruments. The present study shows that an operator can overcome the single port laparoscopic cholecystectomy learning curve in about eight cases.
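A hedged sketch of the kind of nonlinear least-squares learning-curve fit described above: operating time is modeled as an exponential decay toward a plateau, T(n) = T_plateau + (T_0 − T_plateau)·exp(−n/τ), and fitted with SciPy's curve_fit on synthetic case data. The functional form and all numbers are assumptions for illustration, not the study's model or data.

```python
# Sketch: fit a learning curve T(n) = plateau + (T0 - plateau) * exp(-n / tau)
# to per-case operating times by nonlinear ordinary least squares. Synthetic data.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(3)
case = np.arange(1, 31)
true = 60 + 90 * np.exp(-case / 5.0)
op_time = true + rng.normal(scale=8.0, size=case.size)     # minutes, with noise

def learning_curve(n, plateau, T0, tau):
    return plateau + (T0 - plateau) * np.exp(-n / tau)

params, _ = curve_fit(learning_curve, case, op_time, p0=[70.0, 150.0, 5.0])
plateau, T0, tau = params
print(f"plateau ~{plateau:.1f} min, initial ~{T0:.1f} min, time constant ~{tau:.1f} cases")
# Cases needed to get within 5 minutes of the plateau:
print("cases to plateau:", int(np.ceil(tau * np.log((T0 - plateau) / 5.0))))
```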
Exoplanet Atmospheres: From Light-Curve Analyses to Radiative-Transfer Modeling
NASA Astrophysics Data System (ADS)
Cubillos, Patricio; Harrington, Joseph; Blecic, Jasmina; Rojo, Patricio; Stemm, Madison; Lust, Nathaniel B.; Foster, Andrew S.; Loredo, Thomas J.
2015-01-01
Multi-wavelength transit and secondary-eclipse light-curve observations are some of the most powerful techniques for probing the thermo-chemical properties of exoplanets. Although the small planet-to-star contrast ratios demand meticulous data analysis, and the limited available spectral bands can further restrict constraints, a Bayesian approach can robustly reveal what constraints we can set, given the data. We review the main aspects considered during the analysis of Spitzer time-series data by our group, with an application to WASP-8b and TrES-1. We discuss the applicability and limitations of the most commonly used correlated-noise estimators. We describe our open-source Bayesian Atmospheric Radiative Transfer (BART) code. BART calculates the planetary emission or transmission spectrum by solving a 1D line-by-line radiative-transfer equation. The generated spectra are integrated over the relevant bandpasses for comparison to the data. Coupled to our Multi-core Markov-chain Monte Carlo (MC3) statistical package, BART constrains the temperature profile and chemical abundances in the planet's atmosphere. We apply the BART retrieval code to the HD 209458b data set to estimate the planet's temperature profile and molecular abundances. This work was supported by NASA Planetary Atmospheres grant NNX12AI69G and NASA Astrophysics Data Analysis Program grant NNX13AF38G. JB holds a NASA Earth and Space Science Fellowship.
Sensor Management for Applied Research Technologies (SMART)-On Demand Modeling (ODM) Project
NASA Technical Reports Server (NTRS)
Goodman, M.; Blakeslee, R.; Hood, R.; Jedlovec, G.; Botts, M.; Li, X.
2006-01-01
NASA requires timely on-demand data and analysis capabilities to enable practical benefits of Earth science observations. However, a significant challenge exists in accessing and integrating data from multiple sensors or platforms to address Earth science problems because of the large data volumes, varying sensor scan characteristics, unique orbital coverage, and the steep learning curve associated with each sensor and data type. The development of sensor web capabilities to autonomously process these data streams (whether real-time or archived) provides an opportunity to overcome these obstacles and facilitate the integration and synthesis of Earth science data and weather model output. A three year project, entitled Sensor Management for Applied Research Technologies (SMART) - On Demand Modeling (ODM), will develop and demonstrate the readiness of Open Geospatial Consortium (OGC) Sensor Web Enablement (SWE) capabilities that integrate both Earth observations and forecast model output into new data acquisition and assimilation strategies. The advancement of SWE-enabled systems (i.e., use of SensorML, sensor planning services - SPS, sensor observation services - SOS, sensor alert services - SAS and common observation model protocols) will have practical and efficient uses in the Earth science community for enhanced data set generation, real-time data assimilation with operational applications, and for autonomous sensor tasking for unique data collection.
Visual demand of curves and fog-limited sight distance and its relationship to brake response time.
DOT National Transportation Integrated Search
2006-05-01
Driver distraction is a major contributing factor to automobile crashes. National Highway Traffic Safety Administration (NHTSA) has estimated that approximately 25% of crashes are attributed to driver distraction and inattention (Wang, Knipling, & Go...
Percentage entrainment of constituent loads in urban runoff, south Florida
Miller, R.A.
1985-01-01
Runoff quantity and quality data from four urban basins in south Florida were analyzed to determine the entrainment of total nitrogen, total phosphorus, total carbon, chemical oxygen demand, suspended solids, and total lead within the stormwater runoff. The land uses of the homogeneously developed basins are residential (single family), highway, commercial, and apartment (multifamily). A computational procedure was used to calculate, for all storms that had water-quality data, the percentage of constituent load entrained in specified depths of runoff. The plot of percentage of constituent load entrained as a function of runoff is termed the percentage-entrainment curve. Percentage-entrainment curves were developed for three different source areas of basin runoff: (1) the hydraulically effective impervious area, (2) the contributing area, and (3) the drainage area. With basin runoff expressed in inches over the contributing area, the depth of runoff required to remove 90 percent of the constituent load ranged from about 0.4 inch to about 1.4 inches; and to remove 80 percent, from about 0.3 to 0.9 inch. Analysis of variance, using depth of runoff from the contributing area as the response variable, showed that the factor 'basin' is statistically significant, but that the factor 'constituent' is not statistically significant in the forming of the percentage-entrainment curve. Evidently the sewerage design, whether elongated or concise in plan, dictates the shape of the percentage-entrainment curve. The percentage-entrainment curves for all constituents were averaged for each basin and plotted against basin runoff for the three source areas of runoff: the hydraulically effective impervious area, the contributing area, and the drainage area. The relative positions of the three curves are directly related to the relative sizes of the three source areas considered. One general percentage-entrainment curve based on runoff from the contributing area was formed by averaging across both constituents and basins. Its coordinates are: 0.25 inch of runoff for 50-percent entrainment, 0.65 inch of runoff for 80-percent entrainment, and 0.95 inch of runoff for 90-percent entrainment. The general percentage-entrainment curve based on runoff from the hydraulically effective impervious area has runoff values of 0.35, 0.95, and 1.6 inches, respectively.
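The percentage-entrainment computation itself is straightforward: accumulate constituent load against accumulated runoff through a storm, then read off the runoff depth at which a given fraction of the load has been carried. The sketch below does this for invented storm samples, not the south Florida data.

```python
# Sketch: build a percentage-entrainment curve (cumulative fraction of constituent
# load vs. cumulative runoff depth) from within-storm samples. Data are invented.
import numpy as np

runoff_in = np.array([0.05, 0.10, 0.15, 0.20, 0.20, 0.15, 0.10, 0.05])  # inches per interval
conc_mgL  = np.array([ 120,  100,   60,   40,   25,   15,   10,    8])  # constituent concentration

load = runoff_in * conc_mgL                  # proportional to the interval load
cum_runoff = np.cumsum(runoff_in)
cum_load_pct = 100.0 * np.cumsum(load) / load.sum()

# Runoff depth needed to entrain 80% and 90% of the load (linear interpolation).
for target in (80.0, 90.0):
    depth = np.interp(target, cum_load_pct, cum_runoff)
    print(f"{target:.0f}% of load entrained by {depth:.2f} in of runoff")
```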
The curving calculation of a mechanical device attached to a multi-storey car park
NASA Astrophysics Data System (ADS)
Muscalagiu, C. G.; Muscalagiu, I.; Muscalagiu, D. M.
2017-01-01
Stacked (bunk) storage systems for motor vehicles have developed considerably in recent years due to the high demand for parking in congested city centers. In this paper we study the drive mechanism of stacked platforms under dynamic loading. The aim is to improve the response of the mechanism to the platform's behavior during operation of the system and to identify hot spots. We analyze the deformations of the superposed platform at the points of application of the exterior forces produced by the weight of the vehicle under dynamic conditions. The paper also aims to automate the computation necessary for the analysis of the deformations of the superposed platform using the NetLogo language.
Water use trends and demand projections in the Northwest Florida Water Management District
Marella, R.L.; Mokray, M.F.; Hallock-Solomon, Michael
1998-01-01
The Northwest Florida Water Management District is located in the western panhandle of Florida and encompasses about 11,200 square miles. In 1995, the District had an estimated population of 1.13 million, an increase of about 47 percent from the 1975 population of 0.77 million. Over 50 percent of the resident population lives within 10 miles of the coast. In addition, hundreds of thousands of visitors come to the coastal areas of the panhandle during the summer months for recreation or vacation purposes. Water withdrawn to meet demands for public supply, domestic self-supplied, commercial-industrial, agricultural irrigation, and recreational irrigation purposes in the District increased 18 percent (52 million gallons per day) between 1970 and 1995. The greatest increases were for public supply and domestic self-supplied (99 percent increase) and for agricultural irrigation (60 percent increase) between 1970 and 1995. In 1995, approximately 70 percent of the water withdrawn was from ground-water sources, with the majority of this from the Floridan aquifer system. The increasing water demands have affected water levels in the Floridan aquifer system, especially along the coastal areas. The Northwest Florida Water Management District is mandated under the Florida Statutes (Chapter 373) to protect and manage the water resources in this area of the State. The mandate requires that current and future water demands be met, while water resources and water-dependent natural systems are sustained. For this project, curve fitting and extrapolation were used to project most of the variables (population, population served by public supply, and water use) to the years 2000, 2005, 2010, 2015, and 2020. This mathematical method involves fitting a curve to historical population or water-use data and then extending this curve to arrive at future values. The population within the region is projected to reach 1,596,888 by the year 2020, an increase of 41 percent between 1995 and 2020. Most of the population in this region will continue to reside in the urban areas of Pensacola and Tallahassee, and along the coastal areas. The population served by public water supply is projected to reach 1,353,836 by the year 2020, an increase of nearly 46 percent between 1995 and 2020. Total water demand for the Northwest Florida Water Management District is projected to reach 940.2 million gallons per day in 2000, 1,003.1 million gallons per day in 2010, and 1,059.1 million gallons per day in 2020. Excluding water withdrawn for power generation from these totals, water demands will increase 34 percent between 1995 and 2020, and 58 percent between 1970 and 2020. Specifically, public supply demands are projected to increase 74.1 million gallons per day (53 percent) and domestic self-supplied and small public supply systems demands are projected to increase 9.1 million gallons per day (28 percent) between 1995 and 2020. Commercial-industrial self-supplied demands are projected to increase about 16.9 million gallons per day (13 percent) between 1995 and 2020. Agricultural and recreational irrigation demands combined are projected to increase 16.8 million gallons per day (48 percent) between 1995 and 2020. Water demands for power generation are projected to increase about 53.9 million gallons per day (10 percent) between 1995 and 2020. Although power generation water use shows a projected increase during this time, plant capacities are not expected to change dramatically.
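The curve-fitting-and-extrapolation step described above can be sketched with SciPy: fit a simple growth function to historical values and evaluate it at future years. The logistic form, historical numbers, and carrying-capacity guess below are illustrative assumptions, not the District's actual data or method.

```python
# Sketch: fit a growth curve to historical population and extrapolate to future
# years. The logistic form and all numbers are illustrative placeholders.
import numpy as np
from scipy.optimize import curve_fit

years = np.array([1970, 1975, 1980, 1985, 1990, 1995], dtype=float)
pop_m = np.array([0.69, 0.77, 0.86, 0.95, 1.04, 1.13])        # millions (illustrative)

def logistic(t, K, r, t0):
    return K / (1.0 + np.exp(-r * (t - t0)))

params, _ = curve_fit(logistic, years, pop_m, p0=[2.5, 0.05, 1990.0], maxfev=10000)
future = np.array([2000, 2005, 2010, 2015, 2020], dtype=float)
print(dict(zip(future.astype(int), logistic(future, *params).round(2))))
```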
Weather and age-gender effects on the projection of future emergency ambulance demand in Hong Kong.
Lai, Poh-Chin; Wong, Ho-Ting
2015-03-01
An accurate projection of ambulance demand is essential to enable better resource planning for the future, whether to maintain current levels of service or to reconsider future standards and expectations. More than 2 million cases of emergency room attendance in 2008 were obtained from the Hong Kong Hospital Authority to project the demand for its ambulance services in 2036. The projection of ambulance demand in 2036 was computed in consideration of changes in the age-gender structure between 2008 and 2036. The quadratic relationship between average daily temperature and daily ambulance demand in 2036 was further explored both including and excluding age-gender demographic changes. Without accounting for changes in the age-gender structure, the 2036 ambulance demand for age groups of 65 and above was consistently underestimated (by 38%-65%), whereas that of younger age groups was overestimated (by 6%-37%). Moreover, changes in the age-gender structure from 2008 to 2036 shift upward and accentuate the relationship between average daily temperature and daily ambulance demand at both ends of the quadratic U-shaped curve. Our study reveals a potential societal implication of an ageing population for the demand for ambulance services. © 2012 APJPH.
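The quadratic (U-shaped) temperature-demand relationship can be sketched with an ordinary polynomial fit; the vertex of the fitted parabola gives the temperature at which demand is lowest. The daily figures below are simulated stand-ins, not the Hong Kong data.

```python
# Sketch: fit daily ambulance demand as a quadratic function of average daily
# temperature and locate the demand-minimizing temperature. Simulated data.
import numpy as np

rng = np.random.default_rng(4)
temp = rng.uniform(10, 34, size=365)                         # deg C
demand = 1500 + 2.0 * (temp - 24) ** 2 + rng.normal(scale=30, size=temp.size)

c2, c1, c0 = np.polyfit(temp, demand, deg=2)                 # demand ~ c2*T^2 + c1*T + c0
t_min = -c1 / (2 * c2)                                       # vertex of the parabola
print(f"fitted minimum-demand temperature ~{t_min:.1f} deg C")
print(f"predicted demand at 12 C: {np.polyval([c2, c1, c0], 12):.0f}, "
      f"at 33 C: {np.polyval([c2, c1, c0], 33):.0f}")
```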
Lindley, J T
1972-01-01
Rumania provides the opportunity to determine the effects of change in abortion laws by comparing it to Bulgaria, Czechoslovakia, and Hungary, with whom it has a similar background, government, and growth pattern. Rumania had legalized abortion in 1957 but reversed its decision in 1966. Three years later, when compared with the other countries where legalized abortion continued, there was a significant increase in the crude birthrate of Rumania, a notable increase resulting mainly from the change in its abortion law. This same conclusion can also be reached by applying microeconomic theory using the concept that children are, on the margin, the result of a maximizing process. The decision to have an abortion in the countries in question is voluntary. No one is coerced, and even when abortion is illegal it can be seen as an increase in price. By doing this, the decision of whether to have an abortion can be analyzed as a microeconomic decision. The birth decision is made on the margin, where the expected cost of a child is compared with the expected return. Traditional analysis implies that there is no cost involved in not having children, but there are both monetary and nonmonetary costs, the latter being physical and psychological. All forms of birth control involve costs, and the following analysis could be used on any of them. By combining the cost of preventing birth with the concept of traditional theory, there is now a threefold margin of decision rather than a twofold one: the cost of prevention must be included. If the amount that would have to be expended for prevention exceeds the net cost of having the child, the ultimate decision will be to have the child. The demand curve for abortion shows that as abortion is legalized the supply curve will shift out and the price will fall, with the opposite occurring if abortion is again made illegal. The demand curve might also shift as abortion is legalized or made illegal, since the desire for abortion could change; it could be altered by such factors as obeying the law and social acceptance. With abortion legal and the cost of prevention lower, fewer people will decide to have children. This microanalysis explains well why the crude birthrate rose so abruptly in Rumania.
Energy technologies evaluated against climate targets using a cost and carbon trade-off curve.
Trancik, Jessika E; Cross-Call, Daniel
2013-06-18
Over the next few decades, severe cuts in emissions from energy will be required to meet global climate-change mitigation goals. These emission reductions imply a major shift toward low-carbon energy technologies, and the economic cost and technical feasibility of mitigation are therefore highly dependent upon the future performance of energy technologies. However, existing models do not readily translate into quantitative targets against which we can judge the dynamic performance of technologies. Here, we present a simple, new model for evaluating energy-supply technologies and their improvement trajectories against climate-change mitigation goals. We define a target for technology performance in terms of the carbon intensity of energy, consistent with emission reduction goals, and show how the target depends upon energy demand levels. Because the cost of energy determines the level of adoption, we then compare supply technologies to one another and to this target based on their position on a cost and carbon trade-off curve and how the position changes over time. Applying the model to U.S. electricity, we show that the target for carbon intensity will approach zero by midcentury for commonly cited emission reduction goals, even under a high demand-side efficiency scenario. For Chinese electricity, the carbon intensity target is relaxed and less certain because of lesser emission reductions and greater variability in energy demand projections. Examining a century-long database on changes in the cost-carbon space, we find that the magnitude of changes in cost and carbon intensity that are required to meet future performance targets is not unprecedented, providing some evidence that these targets are within engineering reach. The cost and carbon trade-off curve can be used to evaluate the dynamic performance of existing and new technologies against climate-change mitigation goals.
Defining operating rules for mitigation of drought effects on water supply systems
NASA Astrophysics Data System (ADS)
Rossi, G.; Caporali, E.; Garrote, L.; Federici, G. V.
2012-04-01
Reservoirs play a pivotal role in the regulation and management of water supply systems, especially during drought periods. Optimization of reservoir releases related to drought mitigation rules is particularly required. The hydrologic state of the system is evaluated by defining threshold values, expressed in probabilistic terms. Risk-deficit curves are used to reduce the ensemble of possible rules for simulation. Threshold values can be linked to specific actions in an operational context at different levels of severity, i.e., normal, pre-alert, alert, and emergency scenarios. A simplified model of the water resources system is built to evaluate the threshold values and the management rules. The threshold values are defined considering the probability of satisfying a given fraction of the demand in a certain time horizon, and are validated with a long-term simulation that takes into account the characteristics of the evaluated system. The threshold levels determine curves that define reservoir releases as a function of existing storage volume, and a demand reduction is related to each threshold level. The rules to manage the system in drought conditions, the threshold levels, and the reductions are optimized using long-term simulations with different hypothesized states of the system. Synthetic sequences of flows with the same statistical properties as the historical ones are produced to evaluate the system behaviour. The performance of different reduction values and different threshold curves is evaluated using different objective functions and performance indices. The methodology is applied to the Firenze-Prato-Pistoia urban area in central Tuscany, Italy. The considered demand centres are Firenze and Bagno a Ripoli, which, according to the 2001 ISTAT census, have a total of 395,000 inhabitants.
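A minimal sketch of the kind of threshold-based rule described above: monthly storage is simulated against synthetic inflows, and when storage falls below the pre-alert, alert, or emergency thresholds the delivered demand is cut by a corresponding fraction. The thresholds, reductions, and inflow statistics are invented for illustration, not the Tuscan system's values.

```python
# Sketch: simulate a reservoir operated with storage thresholds that trigger
# staged demand reductions (pre-alert, alert, emergency). All numbers invented.
import numpy as np

rng = np.random.default_rng(5)
capacity, storage = 120.0, 90.0            # hm^3
demand = 8.0                               # hm^3/month
thresholds = [(40.0, 0.10),                # pre-alert: 10% cut below 40 hm^3
              (25.0, 0.25),                # alert: 25% cut below 25 hm^3
              (10.0, 0.50)]                # emergency: 50% cut below 10 hm^3

deficits = []
for month in range(120):
    inflow = max(rng.normal(7.0, 4.0), 0.0)
    # Apply the deepest reduction whose threshold the current storage is below.
    cut = max((c for level, c in thresholds if storage < level), default=0.0)
    release = demand * (1.0 - cut)
    storage = min(max(storage + inflow - release, 0.0), capacity)
    deficits.append(demand - release)

print(f"mean monthly deficit: {np.mean(deficits):.2f} hm^3, "
      f"months with any cut: {sum(d > 0 for d in deficits)}")
```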
Fedota, John R; Matous, Allison L; Salmeron, Betty Jo; Gu, Hong; Ross, Thomas J; Stein, Elliot A
2016-09-01
Deficits in cognitive control processes are a primary characteristic of nicotine addiction. However, while network-based connectivity measures of dysfunction have frequently been observed, empirical evidence of task-based dysfunction in these processes has been inconsistent. Here, in a sample of smokers (n=35) and non-smokers (n=21), a previously validated parametric flanker task is employed to characterize addiction-related alterations in responses to varying (ie, high, intermediate, and low) demands for cognitive control. This approach yields a demand-response curve that aims to characterize potential non-linear responses to increased demand for control, including insensitivities or lags in fully activating the cognitive control network. We further used task-based differences in activation between groups as seeds for resting-state analysis of network dysfunction in an effort to more closely link prior inconsistencies in task-related activation with evidence of impaired network connectivity in smokers. For both smokers and non-smokers, neuroimaging results showed similar increases in activation in brain areas associated with cognitive control. However, reduced activation in right insula was seen only in smokers and only when processing intermediate demand for cognitive control. Further, in smokers, this task-modulated right insula showed weaker functional connectivity with the superior frontal gyrus, a component of the task-positive executive control network. These results demonstrate that the neural instantiation of salience attribution in smokers is both more effortful to fully activate and has more difficulty communicating with the exogenous, task-positive, executive control network. Together, these findings further articulate the cognitive control dysfunction associated with smoking and illustrate a specific brain circuit potentially responsible.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Karali, Nihan; Park, Won Young; McNeil, Michael A.
Increasing concerns about non-sustainable energy use and climate change spur a growing research interest in energy efficiency potentials in critical areas such as industrial production. This paper focuses on learning curve aspects of energy efficiency measures in the U.S. iron and steel sector. A number of early-stage efficient technologies (i.e., emerging or demonstration technologies) are technically feasible and have the potential to make a significant contribution to energy saving and CO2 emissions reduction, but currently fall short economically. However, they may have the potential for significant cost reduction and/or performance improvement in the future under learning effects such as 'learning-by-doing'. The investigation is carried out using ISEEM, a technology-oriented, linear optimization model. We investigated how steel demand is balanced with and without the learning curve, compared to a Reference scenario. The retrofit (or, in some cases, investment) costs of energy-efficient technologies decline in the scenario where the learning curve is applied. The analysis also addresses market penetration of energy-efficient technologies, energy saving, and CO2 emissions in the U.S. iron and steel sector with and without the learning impact. Accordingly, the study helps those who use energy models better manage the price barriers preventing the diffusion of energy-efficiency technologies, better understand the market and learning system involved, predict future achievable learning rates more accurately, and project future savings from energy-efficiency technologies in the presence of learning. We conclude from our analysis that most of the existing energy efficiency technologies currently used in the U.S. iron and steel sector are cost effective, and their penetration levels increase over the years even without price reduction. Demonstration technologies, in contrast, are not economically feasible in the U.S. iron and steel sector under the current cost structure; however, some of them are adopted in the mid-term and their penetration levels increase as prices fall along the learning curve. We also observe large penetration of 225 kg pulverized coal injection in the presence of learning.
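The learning-by-doing effect itself is usually summarized by an experience curve, C(x) = C_0·(x/x_0)^(-b), where doubling cumulative deployment x cuts unit cost by the learning rate LR = 1 − 2^(-b). The sketch below applies that generic relation with placeholder numbers; it is not the ISEEM model's cost data.

```python
# Sketch: generic experience (learning) curve, C(x) = C0 * (x / x0) ** (-b),
# with learning rate LR = 1 - 2 ** (-b). Numbers are placeholders.
import numpy as np

C0, x0 = 400.0, 1.0        # initial unit cost at cumulative deployment x0
learning_rate = 0.15       # 15% cost reduction per doubling (assumed)
b = -np.log2(1.0 - learning_rate)

cumulative = np.array([1, 2, 4, 8, 16, 32.0])     # cumulative deployment (relative units)
unit_cost = C0 * (cumulative / x0) ** (-b)
for x, c in zip(cumulative, unit_cost):
    print(f"cumulative {x:4.0f}x -> unit cost {c:6.1f}")
```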
Behavioral economic analysis of stress effects on acute motivation for alcohol.
Owens, Max M; Ray, Lara A; MacKillop, James
2015-01-01
Due to issues of definition and measurement, the heavy emphasis on subjective craving in the measurement of acute motivation for alcohol and other drugs remains controversial. Behavioral economic approaches have increasingly been applied to better understand acute drug motivation, particularly using demand curve modeling via purchase tasks to characterize the perceived reinforcing value of the drug. This approach has focused on using putatively more objective indices of motivation, such as units of consumption, monetary expenditure, and price sensitivity. To extend this line of research, the current study used an alcohol purchase task to determine if, compared to a neutral induction, a personalized stress induction would increase alcohol demand in a sample of heavy drinkers. The stress induction significantly increased multiple measures of the reinforcing value of alcohol to the individual, including consumption at zero price (intensity), the maximum total amount of money spent on alcohol (Omax), the first price where consumption was reduced to zero (breakpoint), and the general responsiveness of consumption to increases in price (elasticity). These measures correlated only modestly with craving and mood. Self-reported income was largely unrelated to demand but moderated the influence of stress on Omax. Moderation based on CRH-BP genotype (rs10055255) was present for Omax, with T allele homozygotes exhibiting more pronounced increases in response to stress. These results provide further support for a behavioral economic approach to measuring acute drug motivation. The findings also highlight the potential relevance of income and genetic factors in understanding state effects on the perceived reinforcing value of alcohol. © Society for the Experimental Analysis of Behavior.
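For concreteness, the demand-curve indices mentioned above (intensity, Omax, breakpoint, elasticity) can be derived by fitting a demand function to purchase-task responses. The sketch below uses the exponentiated exponential-demand form Q = Q0·10^(k(e^(−αQ0C) − 1)) on invented responses; the functional form, the fixed k, and all numbers are assumptions for illustration, not this study's analysis.

```python
# Sketch: fit an exponential demand curve Q = Q0 * 10**(k * (exp(-alpha*Q0*C) - 1))
# to hypothetical alcohol purchase task data and derive intensity, Omax, breakpoint.
import numpy as np
from scipy.optimize import curve_fit

price  = np.array([0.0, 0.25, 0.5, 1, 2, 3, 4, 6, 8, 10, 15.0])   # $ per drink
drinks = np.array([10, 10, 9, 8, 7, 6, 5, 3, 2, 1, 0.0])          # reported consumption

def demand(C, Q0, alpha, k=2.0):          # k fixed for the sketch (consumption span)
    return Q0 * 10 ** (k * (np.exp(-alpha * Q0 * C) - 1.0))

(Q0, alpha), _ = curve_fit(lambda C, Q0, a: demand(C, Q0, a),
                           price, drinks, p0=[10.0, 0.01], maxfev=10000)
grid = np.linspace(0.01, 20, 2000)
expenditure = grid * demand(grid, Q0, alpha)
print(f"intensity (Q at zero price) ~{Q0:.1f} drinks")
print(f"Omax ~${expenditure.max():.2f} at Pmax ~${grid[expenditure.argmax()]:.2f}")
below = grid[demand(grid, Q0, alpha) < 1.0]
print("breakpoint (first price with predicted consumption < 1):",
      f"${below[0]:.2f}" if below.size else "beyond the price grid")
print(f"alpha (rate constant governing elasticity) ~{alpha:.4f}")
```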
Fragale, Jennifer E. C.; Beck, Kevin D.; Pang, Kevin C. H.
2017-01-01
Abnormal motivation and hedonic assessment of aversive stimuli are symptoms of anxiety and depression. Symptoms influenced by motivation and anhedonia predict treatment success or resistance. Therefore, a translational approach to the study of negatively motivated behaviors is needed. We describe a novel use of behavioral economics demand curve analysis to investigate negative reinforcement in animals that separates hedonic assessment of footshock termination (i.e., relief) from motivation to escape footshock. In outbred Sprague Dawley (SD) rats, relief increased as shock intensity increased. Likewise, motivation to escape footshock increased as shock intensity increased. To demonstrate the applicability to anxiety disorders, hedonic and motivational components of negative reinforcement were investigated in anxiety vulnerable Wistar Kyoto (WKY) rats. WKY rats demonstrated increased motivation for shock cessation with no difference in relief as compared to control SD rats, consistent with a negative bias for motivation in anxiety vulnerability. Moreover, motivation was positively correlated with relief in SD, but not in WKY. This study is the first to assess the hedonic and motivational components of negative reinforcement using behavioral economic analysis. This procedure can be used to investigate positive and negative reinforcement in humans and animals to gain a better understanding of the importance of motivated behavior in stress-related disorders. PMID:28270744
H2@Scale: Technical and Economic Potential of Hydrogen as an Energy Intermediate
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ruth, Mark F; Jadun, Paige; Pivovar, Bryan S
The H2@Scale concept is focused on developing hydrogen as an energy carrier and using hydrogen's properties to improve the national energy system. Specifically, hydrogen has the ability to (1) supply a clean energy source for industry and transportation and (2) increase the profitability of variable renewable electricity generators, such as wind turbines and solar photovoltaic (PV) farms, by providing value for otherwise potentially curtailed electricity. The concept therefore also has the potential to reduce oil dependency by providing a low-carbon fuel for fuel cell electric vehicles (FCEVs), reduce emissions of carbon dioxide and pollutants such as NOx, and support domestic energy production, manufacturing, and U.S. economic competitiveness. The analysis reported here focuses on the potential market size and value proposition for the H2@Scale concept. It involves three analysis phases: 1. an initial phase estimating the technical potential for hydrogen markets and the resources required to meet them; 2. a national-scale analysis of the economic potential for hydrogen and the interactions between hydrogen users' willingness to pay and the cost to produce hydrogen from various sources; and 3. an in-depth analysis of spatial and economic issues impacting hydrogen production, utilization, and markets. Preliminary analysis indicates that the technical potential for hydrogen use is approximately 60 million metric tons (MMT) annually for light-duty FCEVs, heavy-duty vehicles, ammonia production, oil refining, biofuel hydrotreating, metals refining, and injection into the natural gas system. The technical potentials of utility-scale PV and wind generation are each much greater than that necessary to produce 60 MMT/year of hydrogen. Uranium, natural gas, and coal reserves are each sufficient to produce 60 MMT/year of hydrogen, in addition to their current uses, for decades to centuries. National estimates of the economic potential of hydrogen production using steam methane reforming of natural gas, high-temperature electrolysis coupled with nuclear power plants, and low-temperature electrolysis are reported. To generate the estimates, supply curves for those technologies are used; they are compared to demand curves that describe the market size for hydrogen uses and the willingness to pay for that hydrogen. Scenarios are developed at prices where supply meets demand and are used to estimate energy use, emissions, and economic impacts.
Potentially conflicting metabolic demands of diving and exercise in seals.
Castellini, M A; Murphy, B J; Fedak, M; Ronald, K; Gofton, N; Hochachka, P W
1985-02-01
Metabolic replacement rates (Ra) for glucose and free fatty acids (FFA) were determined during rest, exercise, and diving conditions in the gray seal using bolus injections of radiotracers. In the exercise experiments the seal swam at a metabolic rate elevated twofold over resting. Ra for glucose and FFA while resting were similar to values found in terrestrial mammals and other marine mammal species. During exercise periods glucose turnover increased slightly while FFA turnover changes were variable. However, the energetic demands of exercise could not be met by the increase in the replacement rates of glucose or FFA even if both were completely oxidized. Under diving conditions the tracer pool displayed radically different specific activity curves, indicative of the changes in perfusion and metabolic rate associated with a strong dive response. Since the radiotracer curves during exercise and diving differed qualitatively and quantitatively, it is possible that similar studies on freely diving animals can be used to assess the role of the diving response during underwater swimming in nature.
ESTIMATING WELFARE IN INSURANCE MARKETS USING VARIATION IN PRICES*
Einav, Liran; Finkelstein, Amy; Cullen, Mark R.
2009-01-01
We provide a graphical illustration of how standard consumer and producer theory can be used to quantify the welfare loss associated with inefficient pricing in insurance markets with selection. We then show how this welfare loss can be estimated empirically using identifying variation in the price of insurance. Such variation, together with quantity data, allows us to estimate the demand for insurance. The same variation, together with cost data, allows us to estimate how insurers' costs vary as market participants endogenously respond to price. The slope of this estimated cost curve provides a direct test for both the existence and nature of selection, and the combination of demand and cost curves can be used to estimate welfare. We illustrate our approach by applying it to data on employer-provided health insurance from one specific company. We detect adverse selection but estimate that the quantitative welfare implications associated with inefficient pricing in our particular application are small, in both absolute and relative terms. PMID:21218182
BEHAVIORAL HAZARD IN HEALTH INSURANCE*
Baicker, Katherine; Mullainathan, Sendhil; Schwartzstein, Joshua
2015-01-01
A fundamental implication of standard moral hazard models is overuse of low-value medical care because copays are lower than costs. In these models, the demand curve alone can be used to make welfare statements, a fact relied on by much empirical work. There is ample evidence, though, that people misuse care for a different reason: mistakes, or “behavioral hazard.” Much high-value care is underused even when patient costs are low, and some useless care is bought even when patients face the full cost. In the presence of behavioral hazard, welfare calculations using only the demand curve can be off by orders of magnitude or even be the wrong sign. We derive optimal copay formulas that incorporate both moral and behavioral hazard, providing a theoretical foundation for value-based insurance design and a way to interpret behavioral “nudges.” Once behavioral hazard is taken into account, health insurance can do more than just provide financial protection—it can also improve health care efficiency. PMID:23930294
The maximum contraceptive prevalence ‘demand curve’: guiding discussions on programmatic investments
Weinberger, Michelle; Sonneveldt, Emily; Stover, John
2017-01-01
Most frameworks for family planning include both access and demand interventions. Understanding how these two are linked and when each should be prioritized is difficult. The maximum contraceptive prevalence ‘demand curve’ was created based on a relationship between the modern contraceptive prevalence rate (mCPR) and mean ideal number of children to allow for a quantitative assessment of the balance between access and demand interventions. The curve represents the maximum mCPR that is likely to be seen given fertility intentions and related norms and constructs that influence contraceptive use. The gap between a country’s mCPR and this maximum is referred to as the ‘potential use gap.’ This concept can be used by countries to prioritize access investments where the gap is large, and discuss implications for future contraceptive use where the gap is small. It is also used within the FP Goals model to ensure mCPR growth from access interventions does not exceed available demand. PMID:29355228
Driving forces behind the Chinese public's demand for improved environmental safety.
Wen, Ting; Wang, Jigan; Ma, Zongwei; Bi, Jun
2017-12-15
Over the past decades, public demand for improved environmental safety has kept increasing in China. This study aims to assess the driving forces behind this increasing demand using provincial, multi-year (1995, 2000, 2005, 2010, and 2014) panel data and the Stochastic Impacts by Regression on Population, Affluence, and Technology (STIRPAT) model. The potential driving forces investigated included population size, income level, degree of urbanization, and educational level. Results show that population size and educational level are positively (P<0.01) associated with public demand for improved environmental safety. No significant impact on demand was found for the degree of urbanization. For the impact of income level, an inverted U-shaped curve effect with a turning point of ~140,000 CNY GDP per capita is indicated. Since China's per capita GDP in 2015 was approximately 50,000 CNY, far below the turning point, public demand for improved environmental safety will continue rising in the near future. To meet this increasing demand, proactive, risk-prevention-based environmental management systems coupled with effective environmental risk communication should be established. Copyright © 2017 Elsevier B.V. All rights reserved.
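An inverted-U relationship of this kind typically arises when the STIRPAT regression includes both a linear and a quadratic (log) income term, with the turning point at the peak of the fitted curve. The sketch below is a generic illustration of that calculation on simulated data; it is not the authors' model or dataset, and all variable names, ranges, and coefficients are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Simulated drivers (all strictly positive): population, income (GDP per capita), education.
pop = rng.uniform(5e5, 5e7, n)
income = rng.uniform(5e3, 2e5, n)      # CNY per capita, hypothetical range
edu = rng.uniform(5, 15, n)            # mean years of schooling, hypothetical

# Simulated "demand" index with an inverted U in log income peaking near 140,000 CNY.
ln_peak = np.log(1.4e5)
ln_demand = (0.6 * np.log(pop) + 0.3 * np.log(edu)
             - 0.8 * (np.log(income) - ln_peak) ** 2 + rng.normal(0, 0.2, n))

# STIRPAT-style regression: ln D = a + b*ln P + c*ln A + d*(ln A)^2 + e*ln Edu
X = np.column_stack([np.ones(n), np.log(pop), np.log(income),
                     np.log(income) ** 2, np.log(edu)])
coef, *_ = np.linalg.lstsq(X, ln_demand, rcond=None)
c, d = coef[2], coef[3]                # coefficients on ln(income) and ln(income)^2

# Turning point of the inverted U: d(ln D)/d(ln A) = c + 2*d*ln A = 0
turning_point = np.exp(-c / (2 * d))
print(f"Estimated turning point: {turning_point:,.0f} CNY per capita")
```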
Aston, Elizabeth R.; Metrik, Jane; Amlung, Michael; Kahler, Christopher W.; MacKillop, James
2016-01-01
Background Distinct behavioral economic domains, including high perceived drug value (demand) and delay discounting (DD), have been implicated in the initiation of drug use and the progression to dependence. However, it is unclear whether frequent marijuana users conform to a “reinforcer pathology” addiction model wherein marijuana demand and DD jointly increase risk for problematic marijuana use and cannabis dependence (CD). Methods Participants (n=88, 34% female, 14% cannabis dependent) completed a marijuana purchase task at baseline. A delay discounting task was completed following placebo marijuana cigarette (0% THC) administration during a separate experimental session. Results Marijuana demand and DD were quantified using area under the curve (AUC). In multiple regression models, demand uniquely predicted frequency of marijuana use while DD did not. In contrast, DD uniquely predicted CD symptom count while demand did not. There were no significant interactions between demand and DD in either model. Conclusions These findings suggest that frequent marijuana users exhibit key constituents of the reinforcer pathology model: high marijuana demand and steep discounting of delayed rewards. However, demand and DD appear to be independent rather than synergistic risk factors for elevated marijuana use and risk for progression to CD. Findings also provide support for using AUC as a singular marijuana demand metric, particularly when also examining other behavioral economic constructs that apply similar statistical approaches, such as DD, to support analytic methodological convergence. PMID:27810657
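Area under the curve is computed in essentially the same way for both constructs reported here: the x-axis (price or delay) and y-axis (consumption or subjective value) are normalized to their maxima and the trapezoids are summed, following the general AUC logic used in discounting research. The sketch below is illustrative only; the price and delay grids and the responses are hypothetical, not the study's data.

```python
import numpy as np

def normalized_auc(x, y):
    """Trapezoidal area under the curve with x scaled to [0, 1] and y scaled by y at x = 0."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    return np.trapz(y / y[0], x / x.max())

# Hypothetical marijuana purchase task: price per hit ($) vs. hits purchased.
prices = [0, 1, 2, 4, 8, 16]
hits = [12, 10, 8, 5, 2, 0]

# Hypothetical delay discounting task: delay (days) vs. subjective value of a $100 delayed reward.
delays = [0, 1, 7, 30, 180, 365]
values = [100, 95, 80, 60, 35, 20]

print(f"Demand AUC: {normalized_auc(prices, hits):.3f}")
print(f"Discounting AUC: {normalized_auc(delays, values):.3f}")
```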
Interactive Web Graphs for Economic Principles.
ERIC Educational Resources Information Center
Kaufman, Dennis A.; Kaufman, Rebecca S.
2002-01-01
Describes a Web site with animation and interactive activities containing graphs and basic economics concepts. Features changes in supply and market equilibrium, the construction of the long-run average cost curve, short-run profit maximization, long-run market equilibrium, and changes in aggregate demand and aggregate supply. States the…
Slope versus Elasticity and the Burden of Taxation.
ERIC Educational Resources Information Center
Graves, Philip E.; And Others
1996-01-01
Criticizes the standard presentation, in introductory economics, of the burden of a tax as an application of elasticity. Argues that using the slopes of a supply and demand curve is the simplest and easiest way to clarify tax incidence. Includes three graphs illustrating this approach. (MJP)
Computerised curve deconvolution of TL/OSL curves using a popular spreadsheet program.
Afouxenidis, D; Polymeris, G S; Tsirliganis, N C; Kitis, G
2012-05-01
This paper explores the possibility of using commercial software for thermoluminescence and optically stimulated luminescence curve deconvolution analysis. The widely used software package Microsoft Excel, with its Solver utility, was used to perform deconvolution analysis of both experimental and reference glow curves resulting from the GLOw Curve ANalysis INtercomparison project. The simple interface of this programme, combined with the powerful Solver utility, allows complex stimulated luminescence curves to be analysed into their components and the associated luminescence parameters to be evaluated.
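Deconvolution here means expressing the measured glow curve as a sum of component peaks and fitting the peak parameters by least squares, which is the role the Excel Solver plays in the paper. The sketch below performs the analogous fit in Python with a simplified Gaussian peak shape rather than a full kinetic peak expression; the peak positions, widths, and synthetic data are all hypothetical.

```python
import numpy as np
from scipy.optimize import curve_fit

def peak(T, I, Tm, w):
    """Simplified Gaussian-shaped glow peak (stand-in for a kinetic peak expression)."""
    return I * np.exp(-0.5 * ((T - Tm) / w) ** 2)

def glow_curve(T, I1, Tm1, w1, I2, Tm2, w2):
    """Two-component glow curve."""
    return peak(T, I1, Tm1, w1) + peak(T, I2, Tm2, w2)

# Synthetic "measured" curve: two overlapping peaks plus noise.
T = np.linspace(300, 550, 300)
rng = np.random.default_rng(1)
measured = glow_curve(T, 1.0, 380, 18, 0.6, 450, 25) + rng.normal(0, 0.01, T.size)

# Least-squares deconvolution (the role played by Excel's Solver in the paper).
p0 = [0.8, 370, 20, 0.5, 460, 20]                      # initial guesses
popt, _ = curve_fit(glow_curve, T, measured, p0=p0)
print("Fitted peak parameters:", np.round(popt, 2))
```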
Jirapinyo, Pichamol; Abidi, Wasif M; Aihara, Hiroyuki; Zaki, Theodore; Tsay, Cynthia; Imaeda, Avlin B; Thompson, Christopher C
2017-10-01
Preclinical simulator training has the potential to decrease endoscopic procedure time and patient discomfort. This study aims to characterize the learning curve of endoscopic novices in a part-task simulator and propose a threshold score for advancement to initial clinical cases. Twenty novices with no prior endoscopic experience underwent repeated endoscopic simulator sessions using the part-task simulator. Simulator scores were collected; their inverse was averaged and fit to an exponential curve. The incremental improvement after each session was calculated. Plateau was defined as the session after which incremental improvement in simulator score model was less than 5%. Additionally, all participants filled out questionnaires regarding simulator experience after sessions 1, 5, 10, 15, and 20. A visual analog scale and NASA task load index were used to assess levels of comfort and demand. Twenty novices underwent 400 simulator sessions. Mean simulator scores at sessions 1, 5, 10, 15, and 20 were 78.5 ± 5.95, 176.5 ± 17.7, 275.55 ± 23.56, 347 ± 26.49, and 441.11 ± 38.14. The best fit exponential model was [time/score] = 26.1 × [session #]^(-0.615); r² = 0.99. This corresponded to an incremental improvement in score of 35% after the first session, 22% after the second, 16% after the third and so on. Incremental improvement dropped below 5% after the 12th session corresponding to the predicted score of 265. Simulator training was related to higher comfort maneuvering an endoscope and increased readiness for supervised clinical endoscopy, both plateauing between sessions 10 and 15. Mental demand, physical demand, and frustration levels decreased with increased simulator training. Preclinical training using an endoscopic part-task simulator appears to increase comfort level and decrease mental and physical demand associated with endoscopy. Based on a rigorous model, we recommend that novices complete a minimum of 12 training sessions and obtain a simulator score of at least 265 to be best prepared for clinical endoscopy.
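The reported plateau criterion follows directly from the fitted exponent: if performance improves as a power law in session number, the relative improvement from session n to n+1 can be read as 1 − (n/(n+1))^0.615, and the plateau is the first n for which this falls below 5%. The sketch below simply evaluates that expression for the reported exponent; it reproduces the approximate 35%/22%/16% sequence and the 12-session plateau, but it is one reading of the reported model, not the authors' code.

```python
# Incremental improvement implied by the reported power-law learning model,
# assuming the relative improvement from session n to n+1 is 1 - (n/(n+1))**0.615.
B = 0.615          # magnitude of the reported exponent
THRESHOLD = 0.05   # 5% plateau criterion

for n in range(1, 21):
    improvement = 1 - (n / (n + 1)) ** B
    flag = "  <- plateau (first drop below 5%)" if improvement < THRESHOLD else ""
    print(f"after session {n:2d}: {improvement:5.1%}{flag}")
    if improvement < THRESHOLD:
        break
```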
MAGIC-f Gel in Nuclear Medicine Dosimetry: study in an external beam of Iodine-131
NASA Astrophysics Data System (ADS)
Schwarcke, M.; Marques, T.; Garrido, C.; Nicolucci, P.; Baffa, O.
2010-11-01
MAGIC-f gel applicability in nuclear medicine dosimetry was investigated by exposure to a 131I source. Calibration was performed to provide known absorbed doses at different positions around the source. The absorbed dose in the gel was compared with Monte Carlo simulation using the PENELOPE code and with thermoluminescent dosimetry (TLD). From MRI analysis of the gel, an R2-dose sensitivity of 0.23 s-1 Gy-1 was obtained. The agreement between dose-distance curves obtained with Monte Carlo simulation and TLD was better than 97%, and between MAGIC-f and TLD it was better than 98%. The results show the potential of the polymer gel for application in nuclear medicine, where three-dimensional dose distributions are demanded.
The Sensor Management for Applied Research Technologies (SMART) Project
NASA Technical Reports Server (NTRS)
Goodman, Michael; Jedlovec, Gary; Conover, Helen; Botts, Mike; Robin, Alex; Blakeslee, Richard; Hood, Robbie; Ingenthron, Susan; Li, Xiang; Maskey, Manil;
2007-01-01
NASA seeks on-demand data processing and analysis of Earth science observations to facilitate timely decision-making that can lead to the realization of the practical benefits of satellite instruments, airborne and surface remote sensing systems. However, a significant challenge exists in accessing and integrating data from multiple sensors or platforms to address Earth science problems because of the large data volumes, varying sensor scan characteristics, unique orbital coverage, and the steep "learning curve" associated with each sensor, data type, and associated products. The development of sensor web capabilities to autonomously process these data streams (whether real-time or archived) provides an opportunity to overcome these obstacles and facilitate the integration and synthesis of Earth science data and weather model output.
Efficient computation of photonic crystal waveguide modes with dispersive material.
Schmidt, Kersten; Kappeler, Roman
2010-03-29
The optimization of PhC waveguides is a key issue for successfully designing PhC devices. Since this design task is computationally expensive, efficient methods are demanded. The available codes for computing photonic bands are also applied to PhC waveguides. They are reliable but not very efficient, which is even more pronounced for dispersive material. We present a method based on higher order finite elements with curved cells, which allows the band structure to be solved for while taking the dispersiveness of the materials directly into account. This is accomplished by reformulating the wave equations as a linear eigenproblem in the complex wave-vectors k. For this method, we demonstrate the high efficiency for the computation of guided PhC waveguide modes by a convergence analysis.
[Osteosynthesis of distal radius fractures by dorsal plate: advantages and disadvantages].
Obert, L; Vichard, P; Garbuio, P; Tropet, Y
2001-12-01
Distal radius fractures remain a challenge. No single osteosynthesis procedure can solve all the problems. A method of analysis is necessary in order to choose the best tools. Open treatment of the fracture is logical but rarely performed. A review of the literature and the experience of the authors are reported in order to analyse the correct place of dorsal plating in distal radius fractures with dorsal displacement. The learning curve of the operative procedure and the design of the implants can explain the occurrence of several complications. The dorsal plate is effective against secondary dorsal displacement. This demanding procedure must be compared with other reported procedures (pinning and external fixation) to define its advantages and disadvantages.
NASA Astrophysics Data System (ADS)
Uysal, G.; Sensoy, A.; Yavuz, O.; Sorman, A. A.; Gezgin, T.
2012-04-01
Effective management of a controlled reservoir system involving multiple and sometimes conflicting objectives is a complex problem, especially in real-time operations. Yuvacık Dam Reservoir, located in the Marmara region of Turkey and built to supply an annual demand of 142 hm3 of water for the city of Kocaeli, requires such a management strategy because its effective capacity is relatively small (51 hm3). The drainage basin is fed by both rainfall and snowmelt, since its elevation ranges between 80 and 1548 m. Excess water must be stored behind the radial gates between February and May to sustain supply through the summer and autumn periods. Moreover, the physical conditions of the downstream channel constrain spillway releases to 100 m3/s, although the spillway itself is large enough to handle major floods. This makes short-term release decisions a challenging task. For this reservoir, long-term water supply curves, based on historical inflows and annual water demand, conflict with flood regulation (control) levels based on flood attenuation and routing curves. A guide curve generated from both the water supply and downstream flood control requirements generally corresponds to the upper elevation of the conservation pool in a reservoir simulation; however, current operations sometimes necessitate exceeding this target elevation. Since guide curves can be developed as functions of external variables, the water potential of the basin can serve as an indicator of current conditions and inform further strategies. Releases relative to the guide curve are managed and restricted by user-defined rules. Although managers operate the reservoir under several variable conditions and forecasts, a simulation model using a variable guide curve is still urgently needed to test alternatives quickly. To that end, several variable guide curves are defined in HEC-ResSim, taking inflow, elevation, precipitation, and snow water equivalent into consideration, to provide alternative simulations as a decision support system; the releases are then subjected to the user-defined rules. Previous-year reservoir simulations are compared with observed reservoir levels and releases, and hypothetical flood scenarios are tested for different storm event timing and sizes. Numerical weather prediction data from the Mesoscale Model 5 (MM5) can be used for temperature and precipitation forecasts that form the inputs to a hydrological model, and the estimated flows can support real-time, short-term decisions for reservoir simulation based on the variable guide curve and user-defined rules.
Impacts of Extreme Hot Weather Events on Electricity Consumption in Baden-Wuerttemberg
NASA Astrophysics Data System (ADS)
Mimler, S.
2009-04-01
Changes in electricity consumption due to hot weather events were examined for the German federal state of Baden-Württemberg. The analysis consists of three major steps. Firstly, an analysis of the media coverage of the hot summer of 2003 gives direct and indirect information about changes in electricity demand due to changes in consumption patterns. On the one hand there was an overall increase in electricity demand due to the more frequent use of air conditioners, fans, cooling devices and water pumps. On the other hand shifts in electricity consumption took place due to modifications in daily routines: where possible, core working times were scheduled earlier, visitor streams in gastronomy and at events shifted from noon to evening hours, purchases shifted to early morning or evening hours, and increased night-time activity was documented by a higher number of police operations due to noise disturbances. In a second step, some of the findings of the media analysis were quantified for households in the city region of Karlsruhe. For the chosen electric device groups refrigerators, mini-coolers, air conditioners, fans and electric stoves, the difference between consumption on a hot summer day and a normal summer day was computed. For this purpose, assumptions had to be made on the share of affected households, affected devices and usage patterns. These assumptions were summarized into three scenarios of low, medium and high heat-induced changes in electricity consumption. In total, the quantification resulted in a range of about 7.5 to 9.2% of heat-induced over-consumption relative to the average electrical load that is normally provided to Karlsruhe households on a summer day. A third analysis of summer load curves aimed at testing the following hypotheses, derived from the media analysis, regarding changes in everyday routines and their effects on load profiles. To test the hypotheses, correlation tests were applied. (1) The higher the temperature, the higher the daily electricity consumption. This hypothesis was confirmed for workdays and weekends at a significance level of 99%. (2) The higher the temperature, the lower the electricity consumption at noon. This hypothesis was confirmed at 99% for workdays only, while it was rejected for weekends. (3) The higher the temperature, the higher the electricity consumption during evening hours. This hypothesis was rejected for both workdays and weekends. (4) The higher the temperature, the higher the electricity consumption during the night. This hypothesis was confirmed at 95% for workdays and at 99% for weekends. (5) The higher the temperature, the later the decrease of the consumption curve in the evening. This hypothesis was confirmed at 90% for workdays only. (6) The higher the temperature, the earlier the increase of the consumption curve in the morning. This hypothesis was rejected for both workdays and weekends.
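Hypotheses of the form "the higher the temperature, the higher the daily consumption" are tested here with correlation tests at fixed significance levels. A minimal sketch of such a test on synthetic daily data is shown below, using scipy's Pearson correlation as a stand-in test; the data, and the way the 99%/95%/90% thresholds are applied, are illustrative assumptions rather than the study's procedure.

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(42)

# Synthetic summer workdays: daily mean temperature (deg C) and daily electricity use (GWh).
temperature = rng.uniform(18, 36, 60)
consumption = 5.0 + 0.04 * temperature + rng.normal(0, 0.15, 60)

r, p = pearsonr(temperature, consumption)
for level in (0.99, 0.95, 0.90):
    verdict = "confirmed" if (r > 0 and p < 1 - level) else "not confirmed"
    print(f"H: higher temperature -> higher daily consumption "
          f"(r={r:.2f}, p={p:.3f}) at {level:.0%} level: {verdict}")
```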
A Bargaining Experiment To Motivate a Discussion on Fairness.
ERIC Educational Resources Information Center
Dickinson, David L.
2002-01-01
Employs a classroom version of the research game, the Ultimatum Game, to teach undergraduate students how fairness affects behavior. Focuses on three concepts related to fairness. Finds that classroom results motivate discussion about a downward sloping demand curve for fairness. Provides an appendix that includes instructional materials. (JEH)
Daniel M. Bishop; Floyd A. Johnson
1958-01-01
The increasing commercial importance of red alder (Alnus rubra) in the Pacific Northwest has created a demand for research on this species. Noting the lack of information on growth of alder, the Puget Sound Research Center Advisory Committee established a subcommittee in January 1956 to undertake construction of alder yield tables. Through the...
Replacing Relative Reinforcing Efficacy with Behavioral Economic Demand Curves
ERIC Educational Resources Information Center
Johnson, Matthew W.; Bickel, Warren K.
2006-01-01
Relative reinforcing efficacy refers to the behavior-strengthening or maintaining property of a reinforcer when compared to that of another reinforcer. Traditional measures of relative reinforcing efficacy sometimes have led to discordant results across and within studies. By contrast, previous investigations have found traditional measures to be…
Implications for Veterinary Medical Education: Preprofessional and Professional Education.
ERIC Educational Resources Information Center
Vaughan, J. T.
1980-01-01
The need to boost the productivity curve for production of animal protein and the demand for specialists in all areas of animal agriculture are discussed. Legislative action and academic priorities must be initiated to assume the responsibility for the control of disease, promotion of health, and efficiency of production. (MLW)
On integrable boundaries in the 2 dimensional O(N) σ-models
NASA Astrophysics Data System (ADS)
Aniceto, Inês; Bajnok, Zoltán; Gombor, Tamás; Kim, Minkyoo; Palla, László
2017-09-01
We make an attempt to map the integrable boundary conditions for 2 dimensional non-linear O(N) σ-models. We do it at various levels: classically, by demanding the existence of infinitely many conserved local charges and also by constructing the double row transfer matrix from the Lax connection, which leads to the spectral curve formulation of the problem; at the quantum level, we describe the solutions of the boundary Yang-Baxter equation and derive the Bethe-Yang equations. We then show how to connect the thermodynamic limit of the boundary Bethe-Yang equations to the spectral curve.
Saraswat, Mayank; Joenväärä, Sakari; Seppänen, Hanna; Mustonen, Harri; Haglund, Caj; Renkonen, Risto
2017-07-01
Finland ranks sixth among the countries with the highest incidence rate of pancreatic cancer, with mortality roughly equaling incidence. The average age of diagnosis for pancreatic cancer is 69 years in Nordic males, whereas the average age of diagnosis of chronic pancreatitis is 40-50 years; however, many cases overlap in age. By radiology, the evaluation of a pancreatic mass, that is, the differential diagnosis between chronic pancreatitis and pancreatic cancer, is often difficult. Preoperative needle biopsies are difficult to obtain and are demanding to interpret. New blood-based biomarkers are needed. The accuracy of the only established biomarker for pancreatic cancer, CA 19-9, is rather poor in differentiating between benign and malignant masses of the pancreas. In this study, we have performed mass spectrometry analysis (High Definition MSE) of serum samples from patients with chronic pancreatitis (n = 13) and pancreatic cancer (n = 22). We quantified 291 proteins and performed detailed statistical analysis including principal component analysis, orthogonal partial least squares discriminant analysis and receiver operating characteristic (ROC) curve analysis. The proteomic signature of chronic pancreatitis versus pancreatic cancer samples was able to separate the two groups by multiple statistical techniques. Some of the enriched pathways in the proteomic dataset were LXR/RXR activation, complement and coagulation systems and inflammatory response. We propose multiple high-confidence biomarker candidates from our pilot study, including inter-alpha-trypsin inhibitor heavy chain H2 (area under the curve, AUC: 0.947), protein AMBP (AUC: 0.951) and prothrombin (AUC: 0.917), which should be further evaluated in larger patient series as potential new biomarkers for differential diagnosis. © 2017 The Authors. Cancer Medicine published by John Wiley & Sons Ltd.
Simulating water markets with transaction costs
NASA Astrophysics Data System (ADS)
Erfani, Tohid; Binions, Olga; Harou, Julien J.
2014-06-01
This paper presents an optimization model to simulate short-term pair-wise spot-market trading of surface water abstraction licenses (water rights). The approach uses a node-arc multicommodity formulation that tracks individual supplier-receiver transactions in a water resource network. This enables accounting for transaction costs between individual buyer-seller pairs and abstractor-specific rules and behaviors using constraints. Trades are driven by economic demand curves that represent each abstractor's time-varying water demand. The purpose of the proposed model is to assess potential hydrologic and economic outcomes of water markets and aid policy makers in designing water market regulations. The model is applied to the Great Ouse River basin in Eastern England. The model assesses the potential weekly water trades and abstractions that could occur in a normal and a dry year. Four sectors (public water supply, energy, agriculture, and industrial) are included in the 94 active licensed water diversions. Each license's unique environmental restrictions are represented and weekly economic water demand curves are estimated. Rules encoded as constraints represent current water management realities and plausible stakeholder-informed water market behaviors. Results show buyers favor sellers who can supply large volumes to minimize transactions. The energy plant cooling and agricultural licenses, often restricted from obtaining water at times when it generates benefits, benefit most from trades. Assumptions and model limitations are discussed. This article was corrected on 13 JUN 2014. See the end of the full text for details.
Nitzan, Meir; Nitzan, Itamar
2013-08-01
The oxygen saturation of the systemic arterial blood is associated with the adequacy of respiration and can be measured non-invasively by pulse oximetry in systemic tissue. The oxygen saturation of the blood in the pulmonary artery, the mixed venous blood, reflects the balance between oxygen supply to the systemic tissues and their oxygen demand. The mixed venous oxygen saturation also has clinical significance because it is used in the Fick equation for the quantitative measurement of cardiac output. At present the measurement of mixed venous oxygen saturation is invasive and requires insertion of a Swan-Ganz catheter into the pulmonary artery. We suggest a noninvasive method for the measurement of mixed venous oxygen saturation in infants: pulmonary pulse oximetry. The method is similar to systemic pulse oximetry, which is based on the different light absorption curves of oxygenated and deoxygenated hemoglobin and on the analysis of photoplethysmographic curves at two wavelengths. The proposed pulmonary pulse oximeter includes light sources of two wavelengths in the infrared, which illuminate the pulmonary tissue through the thoracic wall. Part of the light which is scattered back from the pulmonary tissue and passes through the thoracic wall is detected, and for each wavelength a pulmonary photoplethysmographic curve is obtained. The pulmonary photoplethysmographic curves reflect the blood volume increase during systole in the pulmonary arteries in the lung tissue, which contain mixed venous blood. The ratio R of the amplitude-to-baseline ratios for the two wavelengths is related to the mixed venous oxygen saturation through equations derived for systemic pulse oximetry. The method requires the use of extinction coefficient values for oxygenated and deoxygenated hemoglobin, which can be found in the literature. Copyright © 2013 Elsevier Ltd. All rights reserved.
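The quantity described, the ratio R of the amplitude-to-baseline (AC/DC) ratios at the two wavelengths, is the standard "ratio of ratios" used in pulse oximetry. The sketch below extracts AC and DC components from two synthetic photoplethysmographic traces and forms R; the mapping from R to a saturation value is device-specific and empirical, so the linear calibration used here is deliberately labeled hypothetical.

```python
import numpy as np

def ratio_of_ratios(ppg1, ppg2):
    """R = (AC1/DC1) / (AC2/DC2) from two photoplethysmographic traces."""
    def ac_dc(sig):
        dc = np.mean(sig)        # baseline (DC) component
        ac = np.ptp(sig)         # pulsatile (AC) amplitude, peak to trough
        return ac, dc
    ac1, dc1 = ac_dc(ppg1)
    ac2, dc2 = ac_dc(ppg2)
    return (ac1 / dc1) / (ac2 / dc2)

# Synthetic PPG curves at two infrared wavelengths over a few cardiac cycles.
t = np.linspace(0, 3, 600)
ppg_wl1 = 1.00 + 0.020 * np.sin(2 * np.pi * 1.2 * t)
ppg_wl2 = 0.95 + 0.015 * np.sin(2 * np.pi * 1.2 * t)

R = ratio_of_ratios(ppg_wl1, ppg_wl2)
# Hypothetical linear calibration (illustrative only; real calibrations are empirical).
saturation_estimate = 110 - 25 * R
print(f"R = {R:.2f}, calibrated saturation estimate = {saturation_estimate:.1f}%")
```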
Effects of bupropion on simulated demand for cigarettes and the subjective effects of smoking.
Madden, Gregory J; Kalman, David
2010-04-01
The biobehavioral mechanism(s) mediating bupropion's efficacy are not well understood. Behavioral economic measures such as demand curves have proven useful in investigations of the reinforcing effects of drugs of abuse. Behavioral economic measures may also be used to measure the effect of pharmacotherapies on the reinforcing effects of drugs of abuse. The effects of bupropion on simulated demand for cigarettes were investigated in a placebo-controlled double-blind clinical trial. Participants reported the number of cigarettes they would purchase and consume in a single day at a range of prices. The effects of medication on the subjective effects of smoking were also explored. Demand for cigarettes was well described by an exponential demand equation. Bupropion did not significantly decrease the maximum number of cigarettes that participants said they would smoke in a single day nor did it significantly alter the relation between price per cigarette and demand. Baseline demand elasticity did not predict smoking cessation, but changes in elasticity following 1 week of treatment did. Medication group had no effect on any subjective effects of smoking. Bupropion had no significant effects on demand for cigarettes. The exponential demand equation, recently introduced in behavioral economics, proved amenable to human simulated demand and might be usefully employed in other pharmacotherapy studies as it provides a potentially useful measure of changes in the essential value of the drug as a reinforcer. Such changes may be useful in predicting the efficacy of medications designed to reduce drug consumption.
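The "exponential demand equation" referred to here is commonly written (after Hursh and Silberberg, 2008) as log10 Q = log10 Q0 + k(e^(-alpha*Q0*C) - 1), where Q is consumption at price C, Q0 is intensity (consumption at zero price), k is the log-range of consumption, and alpha indexes elasticity. A minimal fitting sketch on hypothetical cigarette purchase data is shown below; it illustrates the equation only and is not the trial's analysis code, and the fixed value of k and the price/consumption values are assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit

K = 3.0  # span constant (log10 units); often fixed across participants (assumption here)

def exponential_demand(price, q0, alpha):
    """Hursh & Silberberg (2008): log10 Q = log10 Q0 + k * (exp(-alpha * Q0 * price) - 1)."""
    return np.log10(q0) + K * (np.exp(-alpha * q0 * price) - 1)

# Hypothetical simulated-demand data: price per cigarette ($) and cigarettes per day.
price = np.array([0.05, 0.10, 0.25, 0.50, 1.00, 2.00, 4.00])
cigs = np.array([22, 20, 18, 14, 8, 3, 1], dtype=float)

popt, _ = curve_fit(exponential_demand, price, np.log10(cigs),
                    p0=[22, 0.01], bounds=([1, 1e-5], [100, 1]))
q0_hat, alpha_hat = popt
print(f"Q0 (intensity) ~ {q0_hat:.1f} cigarettes/day, alpha (elasticity) ~ {alpha_hat:.4f}")
```

A change in the fitted alpha before versus after treatment is one way to express the "change in elasticity" that the study found predictive of cessation.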
Aston, Elizabeth R; Metrik, Jane; Amlung, Michael; Kahler, Christopher W; MacKillop, James
2016-12-01
Distinct behavioral economic domains, including high perceived drug value (demand) and delay discounting (DD), have been implicated in the initiation of drug use and the progression to dependence. However, it is unclear whether frequent marijuana users conform to a "reinforcer pathology" addiction model wherein marijuana demand and DD jointly increase risk for problematic marijuana use and cannabis dependence (CD). Participants (n=88, 34% female, 14% cannabis dependent) completed a marijuana purchase task at baseline. A delay discounting task was completed following placebo marijuana cigarette (0% THC) administration during a separate experimental session. Marijuana demand and DD were quantified using area under the curve (AUC). In multiple regression models, demand uniquely predicted frequency of marijuana use while DD did not. In contrast, DD uniquely predicted CD symptom count while demand did not. There were no significant interactions between demand and DD in either model. These findings suggest that frequent marijuana users exhibit key constituents of the reinforcer pathology model: high marijuana demand and steep discounting of delayed rewards. However, demand and DD appear to be independent rather than synergistic risk factors for elevated marijuana use and risk for progression to CD. Findings also provide support for using AUC as a singular marijuana demand metric, particularly when also examining other behavioral economic constructs that apply similar statistical approaches, such as DD, to support analytic methodological convergence. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Manufacturing complexity analysis
NASA Technical Reports Server (NTRS)
Delionback, L. M.
1977-01-01
The analysis of the complexity of a typical system is presented. Starting with the subsystems of an example system, the step-by-step procedure for analysis of the complexity of an overall system is given. The learning curves for the various subsystems are determined as well as the concurrent numbers of relevant design parameters. Then trend curves are plotted for the learning curve slopes versus the various design-oriented parameters, e.g. number of parts versus slope of learning curve, or number of fasteners versus slope of learning curve, etc. Representative cuts are taken from each trend curve, and a figure-of-merit analysis is made for each of the subsystems. Based on these values, a characteristic curve is plotted which is indicative of the complexity of the particular subsystem. Each such characteristic curve is based on a universe of trend curve data taken from data points observed for the subsystem in question. Thus, a characteristic curve is developed for each of the subsystems in the overall system.
ERIC Educational Resources Information Center
Nahl, Diane
2010-01-01
New users of virtual environments face a steep learning curve, requiring persistence and determination to overcome challenges experienced while acclimatizing to the demands of avatar-mediated behavior. Concurrent structured self-reports can be used to monitor the personal affective and cognitive struggles involved in virtual world adaptation to…
The Elasticity of Substitution of White for Nonwhite Labor.
ERIC Educational Resources Information Center
Galchus, Kenneth Edward
This study calculates the degree of substitutability between white and nonwhite labor within various occupational categories, in order to determine the extent of racial discrimination and to derive demand curves for nonwhite labor. The model developed in the study treats employer discrimination as a differential between total and money costs to…
NASA Technical Reports Server (NTRS)
Generazio, Edward R.
2015-01-01
Directed Design of Experiments for Validating Probability of Detection Capability of NDE Systems (DOEPOD), Manual v.1.2. The capability of an inspection system is established by applying various methodologies to determine the probability of detection (POD). One accepted metric of an adequate inspection system is that there is 95% confidence that the POD is greater than 90% (90/95 POD). Design of experiments for validating probability of detection capability of nondestructive evaluation (NDE) systems (DOEPOD) is a methodology that is implemented via software to serve as a diagnostic tool providing detailed analysis of POD test data, guidance on establishing data distribution requirements, and resolution of test issues. DOEPOD relies on the observance of occurrences. The DOEPOD capability has been developed to provide an efficient and accurate methodology that yields observed POD and confidence bounds for both hit-miss and signal amplitude testing. DOEPOD does not assume prescribed POD logarithmic or similar functions with assumed adequacy over a wide range of flaw sizes and inspection system technologies, so multi-parameter curve fitting or model optimization approaches to generate a POD curve are not required. DOEPOD applications for supporting inspector qualifications are included.
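The 90/95 criterion mentioned above (95% confidence that POD exceeds 90%) can be checked for hit/miss data with a binomial lower confidence bound; with 29 hits in 29 trials at a given flaw size, the one-sided 95% lower bound on POD just exceeds 90%, which is the familiar "29 of 29" rule. The sketch below computes a Clopper-Pearson lower bound; it illustrates the statistical criterion only and is not the DOEPOD software.

```python
from scipy.stats import beta

def pod_lower_bound(hits, trials, confidence=0.95):
    """One-sided Clopper-Pearson lower confidence bound on the probability of detection."""
    if hits == 0:
        return 0.0
    return beta.ppf(1 - confidence, hits, trials - hits + 1)

# The classic hit/miss check: how many consecutive hits demonstrate 90% POD at 95% confidence?
for n in (25, 28, 29, 45):
    lb = pod_lower_bound(n, n)
    print(f"{n}/{n} hits -> 95% lower bound on POD = {lb:.3f} "
          f"({'meets' if lb >= 0.90 else 'does not meet'} 90/95)")
```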
Professional development for nurses: mentoring along the u-shaped curve.
Johnson, Joyce E; Billingsley, Molly; Crichlow, Tori; Ferrell, Eileen
2011-01-01
Shortages of nurses are expected to continue throughout the coming decade. To meet the demand, nursing leaders must develop creative approaches for nurturing and sustaining nursing talent. Traditionally, nursing has embraced a variety of development strategies to enhance the leadership abilities of nurses and to fill the leadership ranks with top talent. We describe the rationale, design, and impact of a 3-pronged organizational approach to mentoring nursing talent at Georgetown University Hospital, the first Magnet hospital in Washington, District of Columbia. The design of these programs was driven by the demographics of our nursing staff. Analysis of length of tenure revealed a modified "U-shaped curve" with the majority of new nurses with tenure less than 5 years, few in the middle between 5 and 15 years, and a moderate number with 15 or more years. Investment in all our nurses' leadership development required integrating a diverse developmental process into our organizational culture, which values personal growth and mastery. A strong mentoring program makes good business sense in terms of employee job satisfaction, improved cost control, and better patient outcomes. Our experience suggests that voluntary mentoring programs work synergistically to further the development of a mentoring culture in today's hospitals.
Vickers, Andrew J; Cronin, Angel M; Elkin, Elena B; Gonen, Mithat
2008-01-01
Background Decision curve analysis is a novel method for evaluating diagnostic tests, prediction models and molecular markers. It combines the mathematical simplicity of accuracy measures, such as sensitivity and specificity, with the clinical applicability of decision analytic approaches. Most critically, decision curve analysis can be applied directly to a data set, and does not require the sort of external data on costs, benefits and preferences typically required by traditional decision analytic techniques. Methods In this paper we present several extensions to decision curve analysis including correction for overfit, confidence intervals, application to censored data (including competing risk) and calculation of decision curves directly from predicted probabilities. All of these extensions are based on straightforward methods that have previously been described in the literature for application to analogous statistical techniques. Results Simulation studies showed that repeated 10-fold crossvalidation provided the best method for correcting a decision curve for overfit. The method for applying decision curves to censored data had little bias and coverage was excellent; for competing risk, decision curves were appropriately affected by the incidence of the competing risk and the association between the competing risk and the predictor of interest. Calculation of decision curves directly from predicted probabilities led to a smoothing of the decision curve. Conclusion Decision curve analysis can be easily extended to many of the applications common to performance measures for prediction models. Software to implement decision curve analysis is provided. PMID:19036144
Vickers, Andrew J; Cronin, Angel M; Elkin, Elena B; Gonen, Mithat
2008-11-26
Decision curve analysis is a novel method for evaluating diagnostic tests, prediction models and molecular markers. It combines the mathematical simplicity of accuracy measures, such as sensitivity and specificity, with the clinical applicability of decision analytic approaches. Most critically, decision curve analysis can be applied directly to a data set, and does not require the sort of external data on costs, benefits and preferences typically required by traditional decision analytic techniques. In this paper we present several extensions to decision curve analysis including correction for overfit, confidence intervals, application to censored data (including competing risk) and calculation of decision curves directly from predicted probabilities. All of these extensions are based on straightforward methods that have previously been described in the literature for application to analogous statistical techniques. Simulation studies showed that repeated 10-fold crossvalidation provided the best method for correcting a decision curve for overfit. The method for applying decision curves to censored data had little bias and coverage was excellent; for competing risk, decision curves were appropriately affected by the incidence of the competing risk and the association between the competing risk and the predictor of interest. Calculation of decision curves directly from predicted probabilities led to a smoothing of the decision curve. Decision curve analysis can be easily extended to many of the applications common to performance measures for prediction models. Software to implement decision curve analysis is provided.
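Decision curve analysis plots net benefit, NB(pt) = TP/n - (FP/n)*pt/(1 - pt), across threshold probabilities pt, comparing a prediction model against "treat all" and "treat none" strategies. The sketch below computes a decision curve directly from predicted probabilities and observed outcomes; the simulated data are illustrative, and the code is independent of the software the authors provide.

```python
import numpy as np

def net_benefit(y_true, p_hat, thresholds):
    """Net benefit of a prediction model across threshold probabilities."""
    y_true = np.asarray(y_true, dtype=float)
    p_hat = np.asarray(p_hat, dtype=float)
    n = len(y_true)
    nb = []
    for pt in thresholds:
        treat = p_hat >= pt
        tp = np.sum(treat & (y_true == 1))
        fp = np.sum(treat & (y_true == 0))
        nb.append(tp / n - fp / n * pt / (1 - pt))
    return np.array(nb)

rng = np.random.default_rng(0)
n = 2000
y = rng.binomial(1, 0.2, n)                                     # simulated outcomes
p_model = np.clip(0.2 + 0.25 * (y - 0.2) + rng.normal(0, 0.1, n), 0.01, 0.99)

thresholds = np.arange(0.05, 0.60, 0.05)
nb_model = net_benefit(y, p_model, thresholds)
nb_all = net_benefit(y, np.ones(n), thresholds)                 # "treat all" strategy
for pt, a, b in zip(thresholds, nb_model, nb_all):
    print(f"pt={pt:.2f}  model NB={a:+.3f}  treat-all NB={b:+.3f}  treat-none NB=+0.000")
```

Corrections for overfit, as discussed in the abstract, would be applied by recomputing these net benefits under cross-validation rather than on the fitting data.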
Seismic fragility assessment of low-rise stone masonry buildings
NASA Astrophysics Data System (ADS)
Abo-El-Ezz, Ahmad; Nollet, Marie-José; Nastev, Miroslav
2013-03-01
Many historic buildings in old urban centers in Eastern Canada are made of stone masonry reputed to be highly vulnerable to seismic loads. Seismic risk assessment of stone masonry buildings is therefore the first step in the risk mitigation process to provide adequate planning for retrofit and preservation of historical urban centers. This paper focuses on development of analytical displacement-based fragility curves reflecting the characteristics of existing stone masonry buildings in Eastern Canada. The old historic center of Quebec City has been selected as a typical study area. The standard fragility analysis combines the inelastic spectral displacement, a structure-dependent earthquake intensity measure, and the building damage state correlated to the induced building displacement. The proposed procedure consists of a three-step development process: (1) mechanics-based capacity model, (2) displacement-based damage model and (3) seismic demand model. The damage estimation for a uniform hazard scenario with a 2% probability of exceedance in 50 years indicates that slight to moderate damage is the most probable damage experienced by these stone masonry buildings. Comparison is also made with fragility curves implicit in the seismic risk assessment tools Hazus and ELER. Hazus shows the highest probability of the occurrence of no to slight damage, whereas the highest probability of extensive and complete damage is predicted with ELER. This comparison shows the importance of the development of fragility curves specific to the generic construction characteristics in the study area and emphasizes the need for critical use of regional risk assessment tools and generated results.
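Displacement-based fragility curves of this kind are commonly expressed as lognormal functions of the intensity measure, P(DS >= ds | Sd) = Phi(ln(Sd/Sd_ds)/beta), where Sd_ds is the median spectral displacement for damage state ds and beta the lognormal dispersion. The sketch below evaluates such curves purely to illustrate the functional form; the median values and dispersion are hypothetical, not the study's parameters.

```python
import numpy as np
from scipy.stats import norm

def fragility(sd, median_sd, beta):
    """Lognormal fragility: P(damage state >= ds | spectral displacement sd)."""
    return norm.cdf(np.log(sd / median_sd) / beta)

# Hypothetical median spectral displacements (cm) for slight/moderate/extensive/complete
# damage of a low-rise stone masonry class, with a common dispersion.
medians = {"slight": 0.4, "moderate": 0.9, "extensive": 2.0, "complete": 4.5}
beta = 0.7

sd_demand = 1.2   # hypothetical inelastic spectral displacement demand (cm)
for ds, m in medians.items():
    print(f"P(>= {ds:9s} | Sd = {sd_demand} cm) = {fragility(sd_demand, m, beta):.2f}")
```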
Minimally invasive video-assisted thyroidectomy: Ascending the learning curve
Capponi, Michela Giulii; Bellotti, Carlo; Lotti, Marco; Ansaloni, Luca
2015-01-01
BACKGROUND: Minimally invasive video-assisted thyroidectomy (MIVAT) is a technically demanding procedure and requires a surgical team skilled in both endocrine and endoscopic surgery. The aim of this report is to point out some aspects of the learning curve of video-assisted thyroid surgery through the analysis of our preliminary series of procedures. PATIENTS AND METHODS: Over a period of 8 months, we selected 36 patients for minimally invasive video-assisted surgery of the thyroid. The patients were considered eligible if they presented with a nodule not exceeding 35 mm and a total thyroid volume <20 ml; the presence of biochemical and ultrasound signs of thyroiditis and a pre-operative diagnosis of cancer were exclusion criteria. We analysed the surgical results, conversion rate, operating time, post-operative complications, hospital stay and cosmetic outcomes of the series. RESULTS: We performed 36 total thyroidectomies, and in one case we performed a concomitant parathyroidectomy. The procedure was successfully carried out in 33 out of 36 cases (conversion rate 8.3%). The mean operating time was 109 min (range: 80-241 min) and reached a plateau after 29 MIVATs. Post-operative complications included three transient recurrent nerve palsies and two transient hypocalcemias; no definitive hypoparathyroidism was registered. The cosmetic result was considered excellent by most patients. CONCLUSIONS: Advances in skills and technology allow surgeons to easily reproduce the standard open total thyroidectomy with video-assistance. Although the learning curve represents a time-consuming step, training remains a crucial point in gaining reasonable confidence with the video-assisted surgical technique. PMID:25883451
NASA Astrophysics Data System (ADS)
Miranda Guedes, Rui
2018-02-01
Long-term creep of viscoelastic materials is experimentally inferred through accelerating techniques based on the time-temperature superposition principle (TTSP) or on the time-stress superposition principle (TSSP). According to these principles, a given property measured for short times at a higher temperature or higher stress level remains the same as that obtained for longer times at a lower temperature or lower stress level, except that the curves are shifted parallel to the horizontal axis, matching a master curve. These procedures enable the construction of creep master curves with short-term experimental tests. The Stepped Isostress Method (SSM) is an evolution of the classical TSSP method. Higher reduction of the required number of test specimens to obtain the master curve is achieved by the SSM technique, since only one specimen is necessary. The classical approach, using creep tests, demands at least one specimen per each stress level to produce a set of creep curves upon which TSSP is applied to obtain the master curve. This work proposes an analytical method to process the SSM raw data. The method is validated using numerical simulations to reproduce the SSM tests based on two different viscoelastic models. One model represents the viscoelastic behavior of a graphite/epoxy laminate and the other represents an adhesive based on epoxy resin.
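Both TTSP/TSSP and the SSM rest on the same basic operation: segments of creep response measured at different temperatures or stress levels are shifted horizontally on a log-time axis until they overlap, forming a single master curve. The sketch below shifts a synthetic accelerated creep-compliance segment onto a reference segment by minimizing the mismatch in the overlap region; it is a generic illustration of the shifting step under assumed material parameters, not the analytical SSM data-reduction method proposed in the paper.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def creep(t, tau):
    """Synthetic creep compliance with characteristic time tau (arbitrary units)."""
    return 1.0 + 1.5 * (1 - np.exp(-(t / tau) ** 0.4))

log_t = np.linspace(0, 3, 60)                 # short-term test window (log10 time)
reference = creep(10 ** log_t, tau=1e4)       # reference stress/temperature level
accelerated = creep(10 ** log_t, tau=1e2)     # higher level: same curve, shorter tau

def mismatch(log_shift):
    """Squared mismatch between the shifted accelerated segment and the reference."""
    shifted_log_t = log_t + log_shift
    mask = (shifted_log_t >= log_t.min()) & (shifted_log_t <= log_t.max())
    ref_interp = np.interp(shifted_log_t[mask], log_t, reference)
    return np.sum((accelerated[mask] - ref_interp) ** 2)

# Bounds chosen so the overlap region is never empty for this synthetic example.
res = minimize_scalar(mismatch, bounds=(0.0, 2.5), method="bounded")
print(f"Estimated horizontal shift: {res.x:.2f} decades (true value 2.00)")
```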
NASA Astrophysics Data System (ADS)
Ji, Xingpei; Wang, Bo; Liu, Dichen; Dong, Zhaoyang; Chen, Guo; Zhu, Zhenshan; Zhu, Xuedong; Wang, Xunting
2016-10-01
Whether realistic electrical cyber-physical interdependent networks will undergo a first-order transition under random failures still remains a question. To reflect the reality of the Chinese electrical cyber-physical system, a "partial one-to-one correspondence" interdependent networks model is proposed and the connectivity vulnerabilities of three realistic electrical cyber-physical interdependent networks are analyzed. The simulation results show that, due to the service demands of the power system, the topologies of the power grid and its cyber network are highly inter-similar, which can effectively avoid the first-order transition. By comparing the vulnerability curves between the electrical cyber-physical interdependent networks and their single-layer networks, we find that complex network theory is still useful in the vulnerability analysis of electrical cyber-physical interdependent networks.
A behavioral economic analysis of the nonmedical use of prescription drugs among young adults.
Pickover, Alison M; Messina, Bryan G; Correia, Christopher J; Garza, Kimberly B; Murphy, James G
2016-02-01
The nonmedical use of prescription drugs is a widely recognized public health issue, and young adults are particularly vulnerable to their use. Behavioral economic drug purchase tasks capture an individual's strength of desire and motivation for a particular drug. We examined young adult prescription drug purchase and consumption patterns using hypothetical behavioral economic purchase tasks for prescription sedatives/tranquilizers, stimulants, and opiate pain relievers. We also examined relations between demand, use frequency, and Diagnostic and Statistical Manual of Mental Disorders (5th ed.; DSM-5) substance use disorder (SUD) symptoms, and sex differences in these relations. Undergraduate students who endorsed past-year prescription drug use (N = 393) completed an online questionnaire for course credit. Measures assessed substance use frequency and DSM-5 SUD symptoms. Hypothetical purchase tasks for sedatives, stimulants, and pain relievers assessed participants' consumption and expenditure patterns for these substances across 25 prices. Past-year prescription sedative, stimulant, and pain reliever use was endorsed by 138, 258, and 189 participants, respectively. Among these users, consumption for their respective substance decreased as a function of ascending price, as expected. Demand indices for a prescription drug were associated with each other and with use frequency and SUD symptoms, with variability across substances but largely not by sex. In addition, demand for prescription pain relievers differentially predicted symptoms independent of use, with differences for females and males. In conclusion, hypothetical consumption and expenditure patterns for prescription drugs were generally well described by behavioral economic demand curves, and the observed associations with use and SUD symptoms provide support for the utility of prescription drug purchase tasks. PsycINFO Database Record (c) 2016 APA, all rights reserved.
Recession curve analysis for groundwater levels: case study in Latvia
NASA Astrophysics Data System (ADS)
Gailuma, A.; Vītola, I.; Abramenko, K.; Lauva, D.; Vircavs, V.; Veinbergs, A.; Dimanta, Z.
2012-04-01
Recession curve analysis is a powerful and effective analysis technique in many research areas related to hydrogeology where observations have to be made, such as water filtration and absorption of moisture, irrigation and drainage, planning of hydroelectric power production and chemical leaching (elution of chemical substances), as well as in other areas. The analysis of recession curves of surface runoff hydrographs, which is performed to understand the after-effects of the interaction between precipitation and surface runoff, has proven itself in practice. The same method of recession curve analysis can be applied to observations of groundwater levels. Hydrographs were manually prepared for recession curve analysis for observation wells (MG2, BG2 and AG1) at agricultural monitoring sites in Latvia. Within this study, data from declining periods were extracted from the available groundwater level monitoring data and split by month. The drop-down curves were manually moved together (by shifting the date) until the best match was found, thereby obtaining monthly drop-down curves representing each month separately. The monthly curves were then combined and manually joined to obtain characteristic drop-down curves of the year for each well. In the process of recession curve analysis, upward segments were cut out of the initial curve, leaving only the declining parts; consequently, the curve is transformed to follow the groundwater flow more closely, with the impact of rain or drought periods removed. In other words, the drop-down curve is the part of the hydrograph data dominated by discharge, without the impact of precipitation. Using recession curve analysis theory, the ready-made tool "A Visual Basic Spreadsheet Macro for Recession Curve Analysis" was used for data selection and matching of logarithmic functions (K. Posavec et al., GROUND WATER 44, no. 5: 764-767, 2006), and functions were also developed by manual processing of the data. For displaying the data, a mathematical data-equalization model was used, finding the corresponding or closest logarithmic recession function for the graph. The obtained recession curves were similar but not identical. With full knowledge of the fluctuations of the groundwater level, it is possible to indirectly (without taking soil samples) estimate the filtration coefficient: a more rapid decline in the recession curve corresponds to better filtration conditions. This research could be very useful in construction planning, road construction, agriculture, etc. Acknowledgments: The authors gratefully acknowledge the funding from ESF Project "Establishment of interdisciplinary scientist group and modeling system for groundwater research" (Agreement No. 2009/0212/1DP/1.1.1.2.0/09/APIA/VIAA/060EF7).
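Recession analysis of this kind typically amounts to fitting an exponential decline, h(t) = h0*exp(-t/k) for groundwater level above a reference (or Q(t) = Q0*exp(-t/k) for discharge), to the extracted declining periods and reading the recession constant k from the fit, which is equivalent to the logarithmic-function matching described above. The sketch below fits one synthetic declining segment; the numbers are illustrative, and the cited spreadsheet macro is not reproduced here.

```python
import numpy as np

# Synthetic declining period for a hypothetical observation well:
# groundwater level (m above a reference datum) sampled daily.
days = np.arange(0, 30)
rng = np.random.default_rng(3)
level = 2.5 * np.exp(-days / 12.0) + rng.normal(0, 0.02, days.size)

# Exponential recession h(t) = h0 * exp(-t/k) is linear in log space:
# ln h = ln h0 - t/k, so a straight-line fit gives the recession constant k.
slope, ln_h0 = np.polyfit(days, np.log(level), 1)
k = -1.0 / slope
print(f"Recession constant k ~ {k:.1f} days, h0 ~ {np.exp(ln_h0):.2f} m")
```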
Bayesian Inference and Application of Robust Growth Curve Models Using Student's "t" Distribution
ERIC Educational Resources Information Center
Zhang, Zhiyong; Lai, Keke; Lu, Zhenqiu; Tong, Xin
2013-01-01
Despite the widespread popularity of growth curve analysis, few studies have investigated robust growth curve models. In this article, the "t" distribution is applied to model heavy-tailed data and contaminated normal data with outliers for growth curve analysis. The derived robust growth curve models are estimated through Bayesian…
Grebenstein, Patricia; Burroughs, Danielle; Zhang, Yan; LeSage, Mark G
2013-12-01
Reducing the nicotine content in tobacco products is being considered by the FDA as a policy to reduce the addictiveness of tobacco products. Understanding individual differences in response to nicotine reduction will be critical to developing safe and effective policy. Animal and human research demonstrating sex differences in the reinforcing effects of nicotine suggests that males and females may respond differently to nicotine-reduction policies. However, no studies have directly examined sex differences in the effects of nicotine unit-dose reduction on nicotine self-administration (NSA) in animals. The purpose of the present study was to examine this issue in a rodent self-administration model. Male and female rats were trained to self-administer nicotine (0.06 mg/kg) under an FR 3 schedule during daily 23 h sessions. Rats were then exposed to saline extinction and reacquisition of NSA, followed by weekly reductions in the unit dose (0.03 to 0.00025 mg/kg) until extinction levels of responding were achieved. Males and females were compared with respect to baseline levels of intake, resistance to extinction, degree of compensatory increases in responding during dose reduction, and the threshold reinforcing unit dose of nicotine. Exponential demand-curve analysis was also conducted to compare the sensitivity of males and females to increases in the unit price (FR/unit dose) of nicotine (i.e., elasticity of demand or reinforcing efficacy). Females exhibited significantly higher baseline intake and less compensation than males. However, there were no sex differences in the reinforcement threshold or elasticity of demand. Dose-response relationships were very well described by the exponential demand function (r2 values > 0.96 for individual subjects). These findings suggest that females may exhibit less compensatory smoking in response to nicotine reduction policies, even though their nicotine reinforcement threshold and elasticity of demand may not differ from males. Copyright © 2013 Elsevier Inc. All rights reserved.
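As a concrete illustration of the exponential demand-curve analysis mentioned above, the sketch below fits the Hursh-Silberberg exponential demand function, log10 Q = log10 Q0 + k(e^(-alpha*Q0*C) - 1), to hypothetical consumption data; the price points, consumption values and the fixed range parameter K are assumptions for the example, not the study's data.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical unit prices (FR/unit dose) and consumption values -- assumed example data.
price = np.array([1, 3, 10, 30, 100, 300, 1000], dtype=float)
consumption = np.array([60, 55, 45, 30, 15, 5, 1], dtype=float)

K = 3.0  # assumed fixed range parameter (log10 units), a common convention in exponential demand fits

def exponential_demand(c, q0, alpha):
    """Hursh-Silberberg exponential demand: log10 Q = log10 Q0 + K*(exp(-alpha*Q0*c) - 1)."""
    return np.log10(q0) + K * (np.exp(-alpha * q0 * c) - 1.0)

(q0_hat, alpha_hat), _ = curve_fit(exponential_demand, price, np.log10(consumption),
                                   p0=(60.0, 1e-4), bounds=([1.0, 1e-7], [1000.0, 1.0]))
print(f"Q0 (demand intensity) = {q0_hat:.1f}, alpha = {alpha_hat:.2e}")
# A larger alpha means consumption falls off faster with price (greater elasticity,
# lower reinforcing efficacy); Q0 estimates consumption at near-zero price.
```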
Product diffusion through on-demand information-seeking behaviour.
Riedl, Christoph; Bjelland, Johannes; Canright, Geoffrey; Iqbal, Asif; Engø-Monsen, Kenth; Qureshi, Taimur; Sundsøy, Pål Roe; Lazer, David
2018-02-01
Most models of product adoption predict S-shaped adoption curves. Here we report results from two country-scale experiments in which we find linear adoption curves. We show evidence that the observed linear pattern is the result of active information-seeking behaviour: individuals actively pulling information from several central sources facilitated by modern Internet searches. Thus, a constant baseline rate of interest sustains product diffusion, resulting in a linear diffusion process instead of the S-shaped curve of adoption predicted by many diffusion models. The main experiment seeded 70 000 (48 000 in Experiment 2) unique voucher codes for the same product with randomly sampled nodes in a social network of approximately 43 million individuals with about 567 million ties. We find that the experiment reached over 800 000 individuals with 80% of adopters adopting the same product, a winner-take-all dynamic consistent with search-engine-driven rankings that would not have emerged had the products spread only through a network of social contacts. We provide evidence for (and characterization of) this diffusion process driven by active information-seeking behaviour through analyses investigating (a) patterns of geographical spreading; (b) the branching process; and (c) diffusion heterogeneity. Using data on adopters' geolocation we show that social spreading is highly localized, while on-demand diffusion is geographically independent. We also show that cascades started by individuals who actively pull information from central sources are more effective at spreading the product among their peers. © 2018 The Authors.
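To make the contrast concrete, the sketch below compares the cumulative adoption predicted by a classic S-shaped diffusion model (the Bass model, used here as one representative) with the linear, constant-rate adoption the authors report; the coefficients, adoption rate and time horizon are illustrative assumptions, not estimates from the experiments.

```python
import numpy as np

# Illustrative parameters (assumptions, not fitted to the paper's data).
p, q = 0.01, 0.4           # Bass innovation and imitation coefficients
m = 1.0                    # market potential (normalized)
t = np.linspace(0, 20, 200)

# Bass model cumulative adoption: S-shaped when q >> p.
bass = m * (1 - np.exp(-(p + q) * t)) / (1 + (q / p) * np.exp(-(p + q) * t))

# Constant-rate ("on-demand information seeking") adoption: linear until saturation.
rate = 0.05
linear = np.minimum(rate * t, m)

for ti in (5, 10, 15):
    i = np.argmin(np.abs(t - ti))
    print(f"t = {ti:>2}: Bass cumulative = {bass[i]:.2f}, linear cumulative = {linear[i]:.2f}")
```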
A Graphical Exposition of the Link between Two Representations of the Excess Burden of Taxation
ERIC Educational Resources Information Center
Liu, Liqun; Rettenmaier, Andrew J.
2005-01-01
The excess burden of taxation typically has two graphical representations in undergraduate microeconomics and public finance textbooks: the IC/BC (indifference curve/budget constraint) representation and the demand/supply representation. The IC/BC representation has the advantage of showing the behavioral response to a distortionary tax and how a…
ERIC Educational Resources Information Center
Anggrianto, Desi; Churiyah, Madziatul; Arief, Mohammad
2016-01-01
This research was conducted to determine the effect of the Logan Avenue Problem Solving (LAPS)-Heuristic learning model on the critical thinking skills of class X Office Administration (APK) students at SMK Negeri 1 Ngawi, East Java, Indonesia, covering the material on demand and supply curves and equilibrium in the subject Introduction to Economics and…
Simulating water markets with transaction costs
Erfani, Tohid; Binions, Olga; Harou, Julien J
2014-01-01
This paper presents an optimization model to simulate short-term pair-wise spot-market trading of surface water abstraction licenses (water rights). The approach uses a node-arc multicommodity formulation that tracks individual supplier-receiver transactions in a water resource network. This enables accounting for transaction costs between individual buyer-seller pairs and abstractor-specific rules and behaviors using constraints. Trades are driven by economic demand curves that represent each abstractor's time-varying water demand. The purpose of the proposed model is to assess potential hydrologic and economic outcomes of water markets and aid policy makers in designing water market regulations. The model is applied to the Great Ouse River basin in Eastern England. The model assesses the potential weekly water trades and abstractions that could occur in a normal and a dry year. Four sectors (public water supply, energy, agriculture, and industrial) are included in the 94 active licensed water diversions. Each license's unique environmental restrictions are represented and weekly economic water demand curves are estimated. Rules encoded as constraints represent current water management realities and plausible stakeholder-informed water market behaviors. Results show buyers favor sellers who can supply large volumes to minimize transactions. The energy plant cooling and agricultural licenses, often restricted from obtaining water at times when it generates benefits, benefit most from trades. Assumptions and model limitations are discussed. Key Points: transaction-tracking hydro-economic optimization models simulate water markets; the proposed model formulation incorporates transaction costs and trading behavior; water markets benefit users with the most restricted water access. PMID:25598558
Using the weighted area under the net benefit curve for decision curve analysis.
Talluri, Rajesh; Shete, Sanjay
2016-07-18
Risk prediction models have been proposed for various diseases and are being improved as new predictors are identified. A major challenge is to determine whether the newly discovered predictors improve risk prediction. Decision curve analysis has been proposed as an alternative to the area under the curve and net reclassification index to evaluate the performance of prediction models in clinical scenarios. The decision curve computed using the net benefit can evaluate the predictive performance of risk models at a given or range of threshold probabilities. However, when the decision curves for 2 competing models cross in the range of interest, it is difficult to identify the best model as there is no readily available summary measure for evaluating the predictive performance. The key deterrent for using simple measures such as the area under the net benefit curve is the assumption that the threshold probabilities are uniformly distributed among patients. We propose a novel measure for performing decision curve analysis. The approach estimates the distribution of threshold probabilities without the need of additional data. Using the estimated distribution of threshold probabilities, the weighted area under the net benefit curve serves as the summary measure to compare risk prediction models in a range of interest. We compared 3 different approaches, the standard method, the area under the net benefit curve, and the weighted area under the net benefit curve. Type 1 error and power comparisons demonstrate that the weighted area under the net benefit curve has higher power compared to the other methods. Several simulation studies are presented to demonstrate the improvement in model comparison using the weighted area under the net benefit curve compared to the standard method. The proposed measure improves decision curve analysis by using the weighted area under the curve and thereby improves the power of the decision curve analysis to compare risk prediction models in a clinical scenario.
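A minimal numerical sketch of the quantities discussed above: the net benefit at a threshold probability pt is NB(pt) = TP/N - FP/N * pt/(1 - pt), and a weighted area can be formed by integrating NB against a distribution of threshold probabilities. The risk scores, outcome labels and the Beta-shaped threshold distribution below are invented for illustration; they are not the authors' estimator of the threshold distribution.

```python
import numpy as np
from scipy.stats import beta

def trap(y, x):
    """Trapezoidal integral of y over x (kept explicit for portability)."""
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

rng = np.random.default_rng(1)
n = 2000
y = rng.binomial(1, 0.2, n)                                   # outcomes, assumed 20% prevalence
risk = np.clip(0.2 * y + 0.15 + 0.2 * rng.random(n), 0, 1)    # toy, mildly informative risk scores

def net_benefit(risk, y, pt):
    treat = risk >= pt
    tp = np.sum(treat & (y == 1)) / len(y)
    fp = np.sum(treat & (y == 0)) / len(y)
    return tp - fp * pt / (1.0 - pt)

pts = np.linspace(0.05, 0.5, 46)                              # threshold range of interest
nb = np.array([net_benefit(risk, y, pt) for pt in pts])

# Unweighted area (implicitly uniform thresholds) vs. area weighted by an assumed
# Beta(2, 5) threshold distribution, renormalized over the range of interest.
w = beta.pdf(pts, 2, 5)
w /= trap(w, pts)
print("area under net benefit curve :", round(trap(nb, pts), 4))
print("weighted area under the curve:", round(trap(nb * w, pts), 4))
```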
Analysis and Recognition of Curve Type as The Basis of Object Recognition in Image
NASA Astrophysics Data System (ADS)
Nugraha, Nurma; Madenda, Sarifuddin; Indarti, Dina; Dewi Agushinta, R.; Ernastuti
2016-06-01
An object in an image, when analyzed further, shows characteristics that distinguish it from other objects in the image. The characteristics used for object recognition in an image can be color, shape, pattern, texture and spatial information that represent objects in the digital image. A method recently developed for image feature extraction analyzes object characteristics through curve analysis (simple curves) and a chain code search of the object. This study develops an algorithm for the analysis and recognition of curve types as the basis for object recognition in images, proposing the addition of complex curve characteristics with a maximum of four branches to be used in the object recognition process. A complex curve is defined as a curve that has a point of intersection. Using several edge-detected images, the algorithm was able to analyze and recognize complex curve shapes well.
Saeidabadi, Mohammad Sadegh; Nili, Hassan; Dadras, Habibollah; Sharifiyazdi, Hassan; Connolly, Joanne; Valcanis, Mary; Raidal, Shane; Ghorashi, Seyed Ali
2017-06-01
Consumption of poultry products contaminated with Salmonella is one of the major causes of foodborne diseases worldwide and therefore detection and differentiation of Salmonella spp. in poultry is important. In this study, oligonucleotide primers were designed from hemD gene and a PCR followed by high-resolution melt (HRM) curve analysis was developed for rapid differentiation of Salmonella isolates. Amplicons of 228 bp were generated from 16 different Salmonella reference strains and from 65 clinical field isolates mainly from poultry farms. HRM curve analysis of the amplicons differentiated Salmonella isolates and analysis of the nucleotide sequence of the amplicons from selected isolates revealed that each melting curve profile was related to a unique DNA sequence. The relationship between reference strains and tested specimens was also evaluated using a mathematical model without visual interpretation of HRM curves. In addition, the potential of the PCR-HRM curve analysis was evaluated for genotyping of additional Salmonella isolates from different avian species. The findings indicate that PCR followed by HRM curve analysis provides a rapid and robust technique for genotyping of Salmonella isolates to determine the serovar/serotype.
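For orientation, high-resolution melt analysis identifies amplicons by the temperature at which fluorescence drops as double-stranded DNA denatures; melting peaks are usually read from the negative first derivative of the fluorescence-temperature curve, -dF/dT. The sketch below computes such a peak from a synthetic melt curve; the melting temperature, curve shape and noise level are assumptions for illustration, not data from this assay.

```python
import numpy as np

# Synthetic melt curve: fluorescence falls sigmoidally around an assumed Tm of 84.5 C.
temps = np.arange(70.0, 95.0, 0.1)
tm_true, width = 84.5, 0.8
rng = np.random.default_rng(2)
fluor = 1.0 / (1.0 + np.exp((temps - tm_true) / width)) + 0.002 * rng.standard_normal(temps.size)

# Melting peak = maximum of the negative first derivative, -dF/dT.
neg_dfdt = -np.gradient(fluor, temps)
tm_est = temps[np.argmax(neg_dfdt)]
print(f"estimated melting peak: {tm_est:.1f} C")
# Isolates whose peak temperatures (or overall curve shapes) differ can be assigned to
# different genotypes; here the peak recovers the assumed Tm to within a few tenths of a degree.
```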
SLIVISU, an Interactive Visualisation Framework for Analysis of Geological Sea-Level Indicators
NASA Astrophysics Data System (ADS)
Klemann, V.; Schulte, S.; Unger, A.; Dransch, D.
2011-12-01
Supporting data analysis in the earth system sciences with advanced visualisation tools has become essential given the rising complexity, amount and variety of available data. With respect to sea-level indicators (SLIs), their analysis in earth-system applications, such as modelling and simulation on regional or global scales, demands the consideration of large amounts of data - we talk about thousands of SLIs - and therefore goes beyond the analysis of single sea-level curves. On the other hand, a gross analysis by means of statistical methods is hindered by the often heterogeneous and individual character of the single SLIs; their spatio-temporal context and heterogeneous information content are difficult to handle or to represent objectively. A concept integrating automated analysis and visualisation is therefore mandatory, and this is provided by visual analytics. As an implementation of this concept, we present the visualisation framework SLIVISU, developed at GFZ, which is based on multiple linked views and provides a synoptic analysis of observational data, model configurations, model outputs and results of automated analysis in glacial isostatic adjustment. Having started as a visualisation tool for an existing database of SLIs, it now serves as an analysis tool for the evaluation of model simulations in studies of glacial-isostatic adjustment.
NASA Astrophysics Data System (ADS)
Lall, U.
2013-12-01
The availability of long lead climate forecasts that can in turn inform streamflow, agricultural, ecological and municipal/industrial and energy demands provides an opportunity for innovations in water resources management that go beyond the current practices and paradigms. In a practical setting, managers seek to meet registered demands as well as they can. Pricing mechanisms to manage demand are rarely invoked. Drought restrictions and operations are implemented as needed, and pressures from special interest groups are sometimes accommodated through a variety of processes. In the academic literature, there is a notion that demand curves for different sectors could be established and used for "optimal management". However, the few attempts to implement such ideas have invariably failed as elicitation of demand elasticity and socio-political factors is imperfect at best. In this talk, I will focus on what is worth predicting and for whom and how operational risks for the water system can be securitized while providing a platform for priced and negotiated allocation of the resources in the presence of imperfect forecasts. The possibility of a national or regional market for water contracts as part of the framework is explored, and its potential benefits and pitfalls identified.
A novel real time PCR assay using melt curve analysis for ivory identification.
Kitpipit, Thitika; Penchart, Kitichaya; Ouithavon, Kanita; Satasook, Chutamas; Linacre, Adrian; Thanakiatkrai, Phuvadol
2016-10-01
Demand for ivory and expansion of human settlements have resulted in a rapid decline in the number of elephants. Enforcement of local and international laws and regulations requires identification of the species from which any ivory, or ivory products, originated. Further geographical assignment of the dead elephant from which the ivory was taken can assist in forensic investigations. In this study, a real-time PCR assay using melt curve analysis was developed and fully validated for forensic use. The presence or absence of three Elephantidae-specific and elephant species-specific melting peaks was used to identify the elephant species. Using 141 blood and ivory samples from the three extant elephant species, the assay demonstrated very high reproducibility and accuracy. The limit of detection was as low as 0.031 ng of input DNA for conventional amplification and 0.002 ng for nested amplification. Both DNA concentrations are typically encountered in forensic casework, especially for degraded samples. No cross-reactivity was observed for non-target species. Evaluation of direct amplification and nested amplification demonstrated the assay's flexibility and capability of analyzing low-template DNA samples and aged samples. Additionally, blind trial testing showed the assay's suitability for application in real casework. In conclusion, wildlife forensic laboratories could use this novel, quick, and low-cost assay to help combat the continuing poaching crisis leading to the collapse of elephant numbers in the wild. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Matsumoto, Masatoshi; Inoue, Kazuo; Noguchi, Satomi; Toyokawa, Satoshi; Kajii, Eiji
2009-02-18
In many countries, there is a surplus of physicians in some communities and a shortage in others. Population size is known to be correlated with the number of physicians in a community, and is conventionally considered to represent the power of communities to attract physicians. However, associations between other demographic/economic variables and the number of physicians in a community have not been fully evaluated. This study seeks other parameters that correlate with the physician population and shows which characteristics of a community determine its "attractiveness" to physicians. Associations between the number of physicians and selected demographic/economic/life-related variables of all of Japan's 3132 municipalities were examined. In order to exclude the confounding effect of community size, correlations between the physician-to-population ratio and other variable-to-population ratios or variable-to-area ratios were evaluated with simple correlation and multiple regression analyses. The equity of physician distribution against each variable was evaluated by the Lorenz curve and Gini index. Among the 21 variables selected, the service industry workers-to-population ratio (0.543), commercial land price (0.527), sales of goods per person (0.472), and daytime population density (0.451) were better correlated with the physician-to-population ratio than was population density (0.409). Multiple regression analysis showed that the service industry worker-to-population ratio, the daytime population density, and the elderly rate were each independently correlated with the physician-to-population ratio (standardized regression coefficients 0.393, 0.355, and 0.089, respectively; each p<0.001). Equity of physician distribution was higher against service industry population (Gini index=0.26) and daytime population (0.28) than against population (0.33). Daytime population and service industry population in a municipality are better parameters of community attractiveness to physicians than population. Because attractiveness is supposed to consist of medical demand and the amenities of urban life, the two parameters may represent the amount of medical demand and/or the extent of urban amenities of the community more precisely than population does. The conventional demand-supply analysis based solely on population as the demand parameter may overestimate the inequity of the physician distribution among communities.
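For reference, the Lorenz curve plots the cumulative share of physicians against the cumulative share of a demand proxy (population, daytime population, or service-industry population) across municipalities, and the Gini index is twice the area between that curve and the diagonal. The sketch below computes a Gini index from toy municipal data; the numbers are invented for illustration and are unrelated to the Japanese data in the study.

```python
import numpy as np

def gini(physicians, demand_proxy):
    """Gini index of physician distribution against a demand proxy (e.g., population)."""
    order = np.argsort(physicians / demand_proxy)              # municipalities by physician density
    p = np.cumsum(demand_proxy[order]) / demand_proxy.sum()    # cumulative share of the proxy
    d = np.cumsum(physicians[order]) / physicians.sum()        # cumulative share of physicians
    p, d = np.insert(p, 0, 0.0), np.insert(d, 0, 0.0)
    lorenz_area = np.sum((d[1:] + d[:-1]) * np.diff(p)) / 2.0  # area under the Lorenz curve
    return 1.0 - 2.0 * lorenz_area

# Toy data for five municipalities (assumed values, unrelated to the study's data).
population  = np.array([10_000, 50_000, 120_000, 300_000, 900_000], dtype=float)
daytime_pop = np.array([8_000, 45_000, 150_000, 380_000, 1_200_000], dtype=float)
physicians  = np.array([5, 40, 180, 600, 2_500], dtype=float)

print("Gini vs population        :", round(gini(physicians, population), 2))
print("Gini vs daytime population:", round(gini(physicians, daytime_pop), 2))
# A lower Gini against a given proxy indicates a more equitable physician distribution
# relative to that measure of demand.
```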
The Tax Compliance Demand Curve: A Diagrammatical Approach to Income Tax Evasion
ERIC Educational Resources Information Center
Yaniv, Gideon
2009-01-01
One of the most interesting results in the tax evasion literature is that an increase in the income tax rate would increase tax compliance. Despite its peculiarity, this result has gained acceptance as a cornerstone for further developments of the rational tax evasion model. However, because of the mathematical format by which it is conveyed, this…
Ivandini, Tribidasari A; Saepudin, Endang; Wardah, Habibah; Harmesa; Dewangga, Netra; Einaga, Yasuaki
2012-11-20
Gold-modified boron doped diamond (BDD) electrodes were examined for the amperometric detection of oxygen as well as a detector for measuring biochemical oxygen demand (BOD) using Rhodotorula mucilaginosa UICC Y-181. An optimum potential of -0.5 V (vs. Ag/AgCl) was applied, and the optimum waiting time was observed to be 20 min. A linear calibration curve for oxygen reduction was achieved with a sensitivity of 1.4 μA per mg L⁻¹ of oxygen. Furthermore, a linear calibration curve in the glucose concentration range of 0.1-0.5 mM (equivalent to 10-50 mg L⁻¹ BOD) was obtained with an estimated detection limit of 4 mg L⁻¹ BOD. Excellent reproducibility of the BOD sensor was shown with an RSD of 0.9%. Moreover, the BOD sensor showed good tolerance against the presence of copper ions up to a maximum concentration of 0.80 μM (equivalent to 50 ppb). The sensor was applied to BOD measurements of the water from a lake at the University of Indonesia in Jakarta, Indonesia, with results comparable to those made using a standard method for BOD measurement.
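As a generic illustration of how such a linear calibration curve and detection limit are handled numerically, the sketch below fits current versus BOD concentration by least squares and estimates a detection limit from the residual standard deviation of the fit (one common convention, 3·s/slope). The concentrations, currents and noise are assumed example values, not the reported measurements.

```python
import numpy as np

# Assumed calibration data: BOD-equivalent glucose standards (mg/L) and sensor current (uA).
bod = np.array([10, 20, 30, 40, 50], dtype=float)
current = np.array([14.2, 28.1, 41.8, 56.3, 69.9])   # toy readings, roughly 1.4 uA per mg/L

slope, intercept = np.polyfit(bod, current, 1)
pred = slope * bod + intercept
s_res = np.sqrt(np.sum((current - pred) ** 2) / (len(bod) - 2))  # residual standard deviation

lod = 3.0 * s_res / slope   # one common detection-limit convention (3*s / slope)
print(f"sensitivity = {slope:.2f} uA per mg/L, LOD ~ {lod:.1f} mg/L BOD")

# Converting an unknown sample's current back to a BOD estimate:
sample_current = 33.0
print(f"estimated BOD of sample: {(sample_current - intercept) / slope:.1f} mg/L")
```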
Yurasek, Ali M; Murphy, James G; Dennhardt, Ashley A; Skidmore, Jessica R; Buscemi, Joanna; McCausland, Claudia; Martens, Matthew P
2011-11-01
Several studies have shown that demand curve indices of the reinforcing efficacy of alcohol (i.e., reports of hypothetical alcohol consumption and expenditures across a range of drink prices) are associated with alcohol-related outcomes. A next logical step in this area of research is to examine potential mediators of this direct relationship. It is possible that enhancement and coping drinking motives serve as an intermediary of the reinforcing efficacy-alcohol use relationship, such that higher reinforcing efficacy is associated with increased motivation to drink, which is then associated with greater alcohol use and problems. Data were collected from 215 college undergraduates who reported drinking in the past 30 days. The demand curve reinforcing efficacy indices O(max) (maximum alcohol expenditure) and intensity (consumption level when drinks were free) demonstrated the strongest and most consistent associations with alcohol use, problems, and motives. Results from two structural equation models indicated that enhancement and coping motives mediated the relationship between reinforcing efficacy and alcohol use and alcohol-related problems. These results suggest that the motivational effects of the behavioral economic variable reinforcing efficacy on problematic alcohol use are in part mediated by increases in enhancement and coping motives for drinking.
Surface inspection system for carriage parts
NASA Astrophysics Data System (ADS)
Denkena, Berend; Acker, Wolfram
2006-04-01
Quality standards are very high in carriage manufacturing because the visual impression of quality is highly relevant to the customer's purchase decision. On carriage parts, even very small dents become visible on the varnished and polished surface through their reflections. The industrial demand is to detect these form errors on the unvarnished part. To meet this requirement, a stripe projection system for the automatic recognition of waviness and form errors is introduced. It is based on a modified stripe projection method using a high-resolution line scan camera. Particular emphasis is put on achieving a short measuring time and a high depth resolution, aiming at reliable automatic recognition of dents and waviness of 10 μm on large curved surfaces of approximately 1 m width. The resulting point cloud needs to be filtered in order to detect dents, and a spatial filtering technique is used for this purpose. This works well on smoothly curved surfaces if the frequency parameters are well defined. On more complex parts such as mudguards, the method is restricted by the fact that frequencies close to the defined dent frequencies also occur within the surface itself. To allow analysis of complex parts, the system is currently being extended to include 3D CAD models in the inspection process. For smoothly curved surfaces, the measuring speed of the prototype is mainly limited by the amount of light produced by the stripe projector; for complex surfaces, it is limited by the time-consuming matching process. Current development therefore focuses on improving the measuring speed.
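The filtering idea described above, separating short-wavelength dents from the long-wavelength global curvature of the part, can be sketched as removing the global form from a height map and thresholding the residual. The surface shape, dent size and detrending approach below (a low-order polynomial fit standing in for the spatial filter of the measured point cloud) are assumptions for illustration, not the parameters of the inspection system.

```python
import numpy as np

# Synthetic height map (mm): a gently curved panel with one ~10 um deep dent (assumed geometry).
x = np.linspace(-0.5, 0.5, 400)                 # metres across the part
y = np.linspace(-0.2, 0.2, 160)
X, Y = np.meshgrid(x, y)
panel = 50.0 * X**2 + 20.0 * Y**2               # smooth global form, in mm
dent = -0.010 * np.exp(-((X - 0.1)**2 + (Y + 0.05)**2) / (2 * 0.01**2))
z = panel + dent + 0.001 * np.random.default_rng(3).standard_normal(X.shape)

# Remove the global form with a low-order polynomial fit; the residual keeps local deviations.
A = np.column_stack([np.ones(X.size), X.ravel(), Y.ravel(),
                     (X**2).ravel(), (X * Y).ravel(), (Y**2).ravel()])
coef, *_ = np.linalg.lstsq(A, z.ravel(), rcond=None)
residual = z - (A @ coef).reshape(z.shape)

# Flag candidate dents deeper than 5 um below the fitted surface.
candidates = residual < -0.005
print("dent pixels flagged:", int(candidates.sum()))
print("deepest residual: %.4f mm" % residual.min())
```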
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spears, Robert Edward; Coleman, Justin Leigh
Currently the Department of Energy (DOE) and the nuclear industry perform seismic soil-structure interaction (SSI) analysis using equivalent linear numerical analysis tools. For lower levels of ground motion, these tools should produce reasonable in-structure response values for evaluation of existing and new facilities. For larger levels of ground motion these tools likely overestimate the in-structure response (and therefore structural demand) since they do not consider geometric nonlinearities (such as gapping and sliding between the soil and structure) and are limited in their ability to model nonlinear soil behavior. The current equivalent linear SSI (SASSI) analysis approach either joins the soil and structure together in both tension and compression or releases the soil from the structure for both tension and compression. It also makes linear approximations for material nonlinearities and generalizes energy absorption with viscous damping. This creates the potential for inaccurately establishing where the structural concerns exist and/or inaccurately establishing the amplitude of the in-structure responses. Seismic hazard curves at nuclear facilities have continued to increase over the years as more information has been developed on seismic sources (i.e., faults), additional information has been gathered on seismic events, and additional research has been performed to determine local site effects. Seismic hazard curves are used to develop design basis earthquakes (DBEs) that are used to evaluate nuclear facility response. As the seismic hazard curves increase, the input ground motions (DBEs) used to numerically evaluate nuclear facility response increase, causing larger in-structure response. As ground motions increase, so does the importance of including nonlinear effects in numerical SSI models. To include material nonlinearity in the soil and geometric nonlinearity using contact (gapping and sliding), it is necessary to develop a nonlinear time domain methodology, known as NonLinear Soil-Structure Interaction (NLSSI). In general, NLSSI analysis should provide a more accurate representation of the seismic demands on nuclear facilities, their systems and components. INL, in collaboration with a Nuclear Power Plant Vendor (NPP-V), will develop a generic Nuclear Power Plant (NPP) structural design to be used in development of the methodology and for comparison with SASSI. This generic NPP design has been evaluated for the INL soil site because of the ease of access and the quality of the site-specific data. It is now being evaluated for a second site at Vogtle, which is located approximately 15 miles east-northeast of Waynesboro, Georgia, adjacent to the Savannah River. The Vogtle site consists of many soil layers spanning down to a depth of 1058 feet. Two soil sites are chosen in order to demonstrate the methodology across multiple soil sites. The project will drive the models (soil and structure) using acceleration time histories with successively increasing amplitudes. The models will be run in time domain codes such as ABAQUS, LS-DYNA, and/or ESSI and compared with the same models run in SASSI. The project is focused on developing and documenting a method for performing time domain, non-linear seismic soil-structure interaction (SSI) analysis. Development of this method will provide the Department of Energy (DOE) and industry with another tool to perform seismic SSI analysis.
Aligning PEV Charging Times with Electricity Supply and Demand
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hodge, Cabell
Plug-in electric vehicles (PEVs) are a growing source of electricity consumption that could either exacerbate supply shortages or smooth electricity demand curves. Extensive research has explored how vehicle-grid integration (VGI) can be optimized by controlling PEV charging timing or providing vehicle-to-grid (V2G) services, such as storing energy in vehicle batteries and returning it to the grid at peak times. While much of this research has modeled charging, implementation in the real world requires a cost-effective solution that accounts for consumer behavior. To function across different contexts, several types of charging administrators and methods of control are necessary to minimize costs in the VGI context.
A new method named as Segment-Compound method of baffle design
NASA Astrophysics Data System (ADS)
Qin, Xing; Yang, Xiaoxu; Gao, Xin; Liu, Xishuang
2017-02-01
As observation demands increase, so do the demands on lens imaging quality. A Segment-Compound baffle design method is proposed in this paper. Three traditional methods of baffle design are characterized as Inside to Outside, Outside to Inside, and Mirror Symmetry. For a transmission-type optical system, the four methods were each used to design a stray-light suppression structure. The structures were then modeled and simulated with Solidworks, CAXA and Tracepro, and point source transmittance (PST) curves were obtained to describe their performance. The results show that the Segment-Compound method suppresses stray light more effectively. Moreover, it is easy to implement and requires no special materials.
Kirkpatrick, Naomi C; Blacker, Hayley P; Woods, Wayne G; Gasser, Robin B; Noormohammadi, Amir H
2009-02-01
Coccidiosis is a significant disease of poultry caused by different species of Eimeria. Differentiation of Eimeria species is important for the quality control of the live attenuated Eimeria vaccines derived from monospecific lines of Eimeria spp. In this study, high-resolution melting (HRM) curve analysis of the amplicons generated from the second internal transcribed spacer of nuclear ribosomal DNA (ITS-2) was used to distinguish between seven pathogenic Eimeria species of chickens, and the results were compared with those obtained from the previously described technique, capillary electrophoresis. Using a series of known monospecific lines of Eimeria species, HRM curve analysis was shown to distinguish between Eimeria acervulina, Eimeria brunetti, Eimeria maxima, Eimeria mitis, Eimeria necatrix, Eimeria praecox and Eimeria tenella. Computerized analysis of the HRM curves and capillary electrophoresis profiles could detect the dominant species in several specimens containing different ratios of E. necatrix and E. maxima and of E. tenella and E. acervulina. The HRM curve analysis identified all of the mixtures as "variation" to the reference species, and also identified the minor species in some mixtures. Computerized HRM curve analysis also detected impurities in 21 possible different combinations of the seven Eimeria species. The PCR-based HRM curve analysis of the ITS-2 provides a powerful tool for the detection and identification of pure Eimeria species. The HRM curve analysis could also be used as a rapid tool in the quality assurance of Eimeria vaccine production to confirm the purity of the monospecific cell lines. The HRM curve analysis is rapid and reliable and can be performed in a single test tube in less than 3 h.
Seismic Performance Evaluation of Reinforced Concrete Frames Subjected to Seismic Loads
NASA Astrophysics Data System (ADS)
Zameeruddin, Mohd.; Sangle, Keshav K.
2017-06-01
A ten-storied, three-bay reinforced concrete bare frame, designed for gravity loads following the guidelines of IS 456 and detailed for ductility according to IS 13920, is subjected to seismic loads. The seismic demands on this building were calculated following IS 1893 for a response spectrum with 5% damping (hard soil type). Plastic hinges were assigned to both ends of the beams and columns to represent the failure mode when a member yields. Non-linear static (pushover) analysis was performed to evaluate the performance of the building with reference to the first- (ATC 40), second- (FEMA 356) and next-generation (FEMA 440) performance-based seismic design procedures. The base shear versus top displacement curve of the structure, known as the pushover curve, was obtained for two plastic hinge behaviors: force-controlled (brittle) and deformation-controlled (ductile) actions. The lateral deformation corresponding to the performance point demonstrates the building's capability to sustain a certain level of seismic loads. Failure is represented by the sequence of formation of plastic hinges. The deformation-controlled action of hinges showed that the building behaves with a strong-column-weak-beam mechanism, whereas the force-controlled action showed the formation of hinges in the columns. The study aims to understand how the first-, second- and next-generation performance-based design procedures predict actual building responses and how conservative their acceptance criteria are.
Pharmacokinetic Correlates of the Effects of a Heroin Vaccine on Heroin Self-Administration in Rats
Raleigh, Michael D.; Pentel, Paul R.; LeSage, Mark G.
2014-01-01
The purpose of this study was to evaluate the effects of a morphine-conjugate vaccine (M-KLH) on the acquisition, maintenance, and reinstatement of heroin self-administration (HSA) in rats, and on heroin and metabolite distribution during heroin administration that approximated the self-administered dosing rate. Vaccination with M-KLH blocked heroin-primed reinstatement of heroin responding. Vaccination also decreased HSA at low heroin unit doses but produced a compensatory increase in heroin self-administration at high unit doses. Vaccination shifted the heroin dose-response curve to the right, indicating reduced heroin potency, and behavioral economic demand curve analysis further confirmed this effect. In a separate experiment heroin was administered at rates simulating heroin exposure during HSA. Heroin and its active metabolites, 6-acetylmorphine (6-AM) and morphine, were retained in plasma and metabolite concentrations were reduced in brain in vaccinated rats compared to controls. Reductions in 6-AM concentrations in brain after vaccination were consistent with the changes in HSA rates accompanying vaccination. These data provide evidence that 6-AM is the principal mediator of heroin reinforcement, and the principal target of the M-KLH vaccine, in this model. While heroin vaccines may have potential as therapies for heroin addiction, high antibody to drug ratios appear to be important for obtaining maximal efficacy. PMID:25536404
[The optimizing design and experiment for a MOEMS micro-mirror spectrometer].
Mo, Xiang-xia; Wen, Zhi-yu; Zhang, Zhi-hai; Guo, Yuan-jun
2011-12-01
A MOEMS micro-mirror spectrometer, which uses a micro-mirror as a light switch so that the spectrum can be detected by a single detector, has the advantages of transforming DC into AC signals, applying Hadamard transform optics without an additional template, high pixel resolution and low cost. In this spectrometer, the key problem is the conflict between the slit dimensions and the light intensity. Hence, in order to improve the resolution of this spectrometer, the present paper analyzes the new effects caused by the micro structure and derives optimal values of the key factors. Firstly, the effects of the diffraction limit, the spatial sampling rate and the curved slit image on the spectral resolution were analyzed. The results were then simulated, and the key values were tested on the micro-mirror spectrometer. Finally, taking all three effects into account, the micro system was optimized. With a scale of 70 mm x 130 mm, decreasing the height of the image at the plane of the micro-mirror cannot diminish the influence of the curved slit image on the spectrum; to satisfy the spatial sampling rate, the resolution must be at least twice the pixel resolution; and only if the slit width is 1.818 μm and the pixel resolution is 2.2786 μm does the spectrometer achieve its best performance.
Markov switching of the electricity supply curve and power prices dynamics
NASA Astrophysics Data System (ADS)
Mari, Carlo; Cananà, Lucianna
2012-02-01
Regime-switching models seem to well capture the main features of power prices behavior in deregulated markets. In a recent paper, we have proposed an equilibrium methodology to derive electricity prices dynamics from the interplay between supply and demand in a stochastic environment. In particular, assuming that the supply function is described by a power law where the exponent is a two-state strictly positive Markov process, we derived a regime switching dynamics of power prices in which regime switches are induced by transitions between Markov states. In this paper, we provide a dynamical model to describe the random behavior of power prices where the only non-Brownian component of the motion is endogenously introduced by Markov transitions in the exponent of the electricity supply curve. In this context, the stochastic process driving the switching mechanism becomes observable, and we will show that the non-Brownian component of the dynamics induced by transitions from Markov states is responsible for jumps and spikes of very high magnitude. The empirical analysis performed on three Australian markets confirms that the proposed approach seems quite flexible and capable of incorporating the main features of power prices time-series, thus reproducing the first four moments of log-returns empirical distributions in a satisfactory way.
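To illustrate the mechanism described, the sketch below simulates log power prices from a supply curve of the assumed form P_t = a·D_t^γ(s_t), where demand D_t follows a mean-reverting process and the exponent γ switches between two values according to a two-state Markov chain; all parameter values are invented for the example and are not calibrated to the Australian markets.

```python
import numpy as np

rng = np.random.default_rng(4)
T = 1000
a = 1.0
gamma = np.array([1.5, 6.0])          # assumed supply-curve exponents in the two Markov states
P_trans = np.array([[0.98, 0.02],     # assumed transition matrix: state 0 = normal, 1 = spiky
                    [0.60, 0.40]])

# Mean-reverting (log) demand, driven only by Brownian-type noise.
log_d = np.zeros(T)
for t in range(1, T):
    log_d[t] = log_d[t - 1] + 0.2 * (0.0 - log_d[t - 1]) + 0.05 * rng.standard_normal()

# Two-state Markov chain for the supply-curve exponent.
s = np.zeros(T, dtype=int)
for t in range(1, T):
    s[t] = rng.choice(2, p=P_trans[s[t - 1]])

# Equilibrium price from the supply curve P = a * D^gamma(s)  =>  log P = log a + gamma(s) * log D.
log_p = np.log(a) + gamma[s] * log_d
returns = np.diff(log_p)
print("std of log-returns          :", round(returns.std(), 3))
print("max |log-return| (spikes)   :", round(np.abs(returns).max(), 3))
print("share of time in spiky state:", round(s.mean(), 3))
```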
Quantifying Energy and Water Savings in the U.S. Residential Sector.
Chini, Christopher M; Schreiber, Kelsey L; Barker, Zachary A; Stillwell, Ashlynn S
2016-09-06
Stress on water and energy utilities, including natural resource depletion, infrastructure deterioration, and growing populations, threatens the ability to provide reliable and sustainable service. This study presents a demand-side management decision-making tool to evaluate energy and water efficiency opportunities at the residential level, including both direct and indirect consumption. The energy-water nexus accounts for indirect resource consumption, including water-for-energy and energy-for-water. We examine the relationship between water and energy in common household appliances and fixtures, comparing baseline appliances to ENERGY STAR or WaterSense appliances, using a cost abatement analysis for the average U.S. household, yielding a potential annual per household savings of 7600 kWh and 39 600 gallons, with most upgrades having negative abatement cost. We refine the national average cost abatement curves to understand regional relationships, specifically for the urban environments of Los Angeles, Chicago, and New York. Cost abatement curves display per unit cost savings related to overall direct and indirect energy and water efficiency, allowing utilities, policy makers, and homeowners to consider the relationship between energy and water when making decisions. Our research fills an important gap of the energy-water nexus in a residential unit and provides a decision making tool for policy initiatives.
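As a schematic of how such cost abatement curves are assembled, the sketch below takes a handful of hypothetical appliance upgrades, annualizes their incremental cost, nets out the utility-bill savings, and sorts the measures by cost per kWh saved so that negative-cost measures appear first. All upgrade names, costs, savings, prices and the discount rate are assumptions for illustration, not the study's estimates.

```python
import numpy as np

RATE, YEARS = 0.05, 10                   # assumed discount rate and measure lifetime
ELEC_PRICE, WATER_PRICE = 0.12, 0.005    # assumed $/kWh and $/gallon

def annualize(capital_cost):
    """Equivalent annual cost of an upfront expense (capital recovery factor)."""
    return capital_cost * RATE / (1 - (1 + RATE) ** -YEARS)

# (name, incremental cost $, kWh saved/yr, gallons saved/yr) -- hypothetical upgrades.
measures = [
    ("LED lighting",          60, 500,    0),
    ("Efficient showerhead",  20, 300, 2500),
    ("ENERGY STAR washer",   250, 400, 5000),
    ("ENERGY STAR fridge",   150, 200,    0),
]

rows = []
for name, cost, kwh, gal in measures:
    net_annual_cost = annualize(cost) - kwh * ELEC_PRICE - gal * WATER_PRICE
    rows.append((net_annual_cost / kwh, kwh, name))   # $ per kWh saved (direct energy only)

rows.sort()                                           # cheapest abatement first
cumulative = 0.0
for cost_per_kwh, kwh, name in rows:
    cumulative += kwh
    print(f"{name:22s} {cost_per_kwh:+.3f} $/kWh  cumulative {cumulative:5.0f} kWh/yr")
# Measures with negative $/kWh pay for themselves; the sorted list traces out the abatement curve.
```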
Rousson, Valentin; Zumbrunn, Thomas
2011-06-22
Decision curve analysis has been introduced as a method to evaluate prediction models in terms of their clinical consequences if used for a binary classification of subjects into a group who should and into a group who should not be treated. The key concept for this type of evaluation is the "net benefit", a concept borrowed from utility theory. We recall the foundations of decision curve analysis and discuss some new aspects. First, we stress the formal distinction between the net benefit for the treated and for the untreated and define the concept of the "overall net benefit". Next, we revisit the important distinction between the concept of accuracy, as typically assessed using the Youden index and a receiver operating characteristic (ROC) analysis, and the concept of utility of a prediction model, as assessed using decision curve analysis. Finally, we provide an explicit implementation of decision curve analysis to be applied in the context of case-control studies. We show that the overall net benefit, which combines the net benefit for the treated and the untreated, is a natural alternative to the benefit achieved by a model, being invariant with respect to the coding of the outcome, and conveying a more comprehensive picture of the situation. Further, within the framework of decision curve analysis, we illustrate the important difference between the accuracy and the utility of a model, demonstrating how poor an accurate model may be in terms of its net benefit. Eventually, we expose that the application of decision curve analysis to case-control studies, where an accurate estimate of the true prevalence of a disease cannot be obtained from the data, is achieved with a few modifications to the original calculation procedure. We present several interrelated extensions to decision curve analysis that will both facilitate its interpretation and broaden its potential area of application.
Quantitative Assessment of Free Flap Viability with CEUS Using an Integrated Perfusion Software.
Geis, S; Klein, S; Prantl, L; Dolderer, J; Lamby, P; Jung, E-M
2015-12-01
New treatment strategies in oncology and trauma surgery lead to an increasing demand for soft tissue reconstruction with free tissue transfer. In previous studies, CEUS was shown to detect early flap failure. The aim of this study was to detect and quantify vascular disturbances after free flap transplantation using a fast integrated perfusion software tool. From 2011 to 2013, 33 patients were examined by one experienced radiologist using CEUS after a bolus injection of 1-2.4 ml of SonoVue®. Flap perfusion was analysed qualitatively regarding contrast defects or delayed wash-in. Additionally, an integrated semi-quantitative analysis using time-intensity curve (TIC) analysis was performed. TIC analysis of the transplant was conducted on a centimetre-by-centimetre basis up to a penetration depth of 4 cm. The two perfusion parameters "Time to PEAK" (TtoPk, given in seconds) and "Area under the Curve" (Area, given in relative units, rU) were compared between patients without complications and patients with minor complications or complete flap loss to identify significant differences. A regular postoperative course was observed in 26 (79%) patients. In contrast, 5 (15%) patients with partial superficial flap necrosis, 1 patient (3%) with complete flap loss and 1 patient (3%) with haematoma were observed. TtoPk revealed no significant differences, whereas Area revealed significantly lower perfusion values in the corresponding areas in patients with complications. The critical threshold for sufficient flap perfusion was set below 150 rU. In conclusion, CEUS is a mobile and cost-effective option for quantifying tissue perfusion and can be used almost without restriction even in multi-morbid patients with renal and hepatic failure. © Georg Thieme Verlag KG Stuttgart · New York.
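For orientation, the two TIC parameters named above are straightforward to compute from a sampled time-intensity curve: Time to Peak is the time at which contrast intensity is maximal, and Area is the integral of intensity over the analysis window. The sketch below does this for a synthetic bolus curve; the gamma-variate shape, timings and units are assumptions for illustration, not values from the study.

```python
import numpy as np

# Synthetic time-intensity curve for a contrast bolus (assumed gamma-variate shape).
t = np.linspace(0, 60, 601)                    # seconds
t0, alpha, beta = 5.0, 2.5, 4.0                # assumed arrival time and shape parameters
intensity = np.where(t > t0, ((t - t0) ** alpha) * np.exp(-(t - t0) / beta), 0.0)
intensity = 100.0 * intensity / intensity.max()   # scale to relative units (rU)

ttopk = t[np.argmax(intensity)]                                  # Time to Peak (s)
area = np.sum((intensity[1:] + intensity[:-1]) * np.diff(t)) / 2.0   # Area under the Curve (rU*s)
print(f"Time to Peak = {ttopk:.1f} s, Area under the Curve = {area:.0f} rU*s")
# In the study's terms, flap regions whose Area fell below a critical threshold were
# flagged as having compromised perfusion.
```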
Vincent, Paula C; Collins, R Lorraine; Liu, Liu; Yu, Jihnhee; De Leo, Joseph A; Earleywine, Mitch
2017-01-01
Given the growing legalization of recreational marijuana use and related increase in its prevalence in the United States, it is important to understand marijuana's appeal. We used a behavioral economic (BE) approach to examine whether the reinforcing properties of marijuana, including "demand" for marijuana, varied as a function of its perceived quality. Using an innovative, Web-based marijuana purchase task (MPT), a sample of 683 young-adult recreational marijuana users made hypothetical purchases of marijuana across three qualities (low, mid and high grade) at nine escalating prices per joint, ranging from $0/free to $20. We used nonlinear mixed effects modeling to conduct demand curve analyses, which produced separate demand indices (e.g., Pmax, elasticity) for each grade of marijuana. Consistent with previous research, as the price of marijuana increased, marijuana users reduced their purchasing. Demand also was sensitive to quality, with users willing to pay more for higher quality/grade marijuana. In regression analyses, demand indices accounted for significant variance in typical marijuana use. This study illustrates the value of applying BE theory to young adult marijuana use. It extends past research by examining how perceived quality affects demand for marijuana and provides support for the validity of a Web-based MPT to examine the appeal of marijuana. Our results have implications for policies to regulate marijuana use, including taxation based on the quality of different marijuana products. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Methods of Technological Forecasting,
1977-05-01
Contents include: Trend Extrapolation; Progress Curve; Analogy; Trend Correlation; Substitution Analysis or Substitution Growth Curves; Envelope Curve; Advances in the State of the Art; Technological Mapping; Contextual Mapping; Matrix Input-Output Analysis; Mathematical Models; Simulation Models; Dynamic Modelling. CHAPTER IV ... Generation; Interaction between Needs and Possibilities; Map of the Technological Future; Cross-Impact Matrix; Discovery Matrix; Morphological Analysis
Arsac, Laurent M
2002-05-01
The present study was designed to investigate the role of reduced air density on the energetics of 100 m running at altitude. A mathematical supply-demand model was used where supply had two components, aerobic and anaerobic and demand had three components: the cost of overcoming non-aerodynamic forces (C(na)), the cost of overcoming air resistance (C(aero)), and the cost due to changes in the runner's kinetic energy (C(kin)). Actual instantaneous-speed curves recorded in 100 m world champions were modelled at sea level. Then I calculated improvements in 100 m running times and changes in the components of the energy cost with changes in altitude from 0 m to 4,000 m. For the 100 m world championship for men, the model predicted times of 9.88 s at sea level, 9.80 s at 1,000 m, 9.73 s at 2,000 m, 9.64 s at 4,000 m and 9.15 s in the hypothetical situation where the air resistance was nil. In the counterpart for women the corresponding times were 10.85 s, 10.76 s, 10.70 s, 10.60 s and 10.04 s. The C(aero) was 12%-13% of demand at sea level, 10%-11% at 2,000 m and 8%-9% at 4,000 m. When C(aero) decreased this led to better performance by making more energy available for acceleration. Accordingly, C(kin) increased from 20%-24% at sea level to 23%-27% at 4,000 m. There was no effect of altitude specific to body size.
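To make the altitude effect concrete, the aerodynamic component of the energy demand scales with air density, which falls roughly exponentially with altitude. The sketch below combines a standard barometric density approximation with assumed values for the runner's speed and drag area to show how the aerodynamic cost per metre shrinks from sea level to 4,000 m; these parameter values are illustrative, not those of the study's model.

```python
import numpy as np

RHO0, H_SCALE = 1.225, 8500.0        # sea-level air density (kg/m^3) and scale height (m), approximate
CD_A = 0.24                          # assumed drag area Cd*A for a sprinter (m^2)
V = 10.2                             # assumed mean 100 m speed (m/s)

def air_density(altitude_m):
    """Simple exponential (isothermal) approximation of air density vs altitude."""
    return RHO0 * np.exp(-altitude_m / H_SCALE)

for h in (0, 1000, 2000, 4000):
    rho = air_density(h)
    c_aero = 0.5 * rho * CD_A * V**2          # work against air resistance per metre of running (J/m)
    print(f"{h:>4} m: rho = {rho:.3f} kg/m^3, C_aero ~ {c_aero:5.1f} J/m "
          f"({100 * rho / RHO0:.0f}% of sea-level value)")
# Because the aerodynamic cost is only ~12-13% of total demand at sea level, even a ~35-40%
# density drop at 4,000 m translates into an improvement of only a couple of tenths of a second.
```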
Development of probabilistic operating rules for Hluhluwe Dam, South Africa
NASA Astrophysics Data System (ADS)
Ndiritu, J.; Odiyo, J.; Makungo, R.; Mwaka, B.; Mthethwa, N.; Ntuli, C.; Andanje, A.
2017-08-01
Hluhluwe Dam, with a 30 million m3 reservoir that supplies water for irrigation and Hluhluwe municipality in Kwa-Zulu Natal Province, South Africa, was consistently experiencing low storage levels over several non-drought years since 2001. The dam was operated by rules of thumb and there were no records of water releases for irrigation - the main user of the dam. This paper describes an assessment of the historic behaviour of the reservoir since its completion in 1964 and the development of operating rules that accounted for: i) the multiple and different levels of reliability at which municipal and irrigation demands need to be supplied, and ii) inter-annual and inter-decadal variability of climate and inflows into the dam. The assessment of the behaviour of the reservoir was done by simulation assuming trigonometric rule curves that were optimized to maximize both yield and storage state using the SCE-UA method. The resulting reservoir behaviour matched the observed historic trajectory reasonably well and indicated that the dam has mainly been operated at a demand of 10 million m3/year until 2000 when the demand suddenly rose to 25 million m3/year. Operating rules were developed from a statistical analysis of the base yields from 500 simulations of the reservoir each using 5 year-long stochastically generated sequences of inflows, rainfall and evaporation. After the implementation of the operating rules in 2009, the storage state of the dam improved and matched those of other reservoirs in the region that had established operating rules.
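As a highly simplified illustration of the rule-based reservoir operation described above, the sketch below runs a monthly mass balance for a 30 million m3 reservoir and curtails releases whenever storage falls below a trigonometric (seasonal) rule curve; the inflow series, curtailment factor and rule-curve amplitude are assumptions for the example and do not reproduce the study's optimized rules.

```python
import numpy as np

CAPACITY = 30.0                      # million m^3
ANNUAL_DEMAND = 25.0                 # million m^3/year (post-2000 demand level reported for the dam)
monthly_demand = ANNUAL_DEMAND / 12.0

rng = np.random.default_rng(5)
months = np.arange(120)                                  # 10 years of months
inflow = np.maximum(0.0, 2.0 + 1.5 * np.sin(2 * np.pi * months / 12) +
                    rng.normal(0, 0.8, months.size))     # assumed seasonal inflows (million m^3)

def rule_curve(month):
    """Assumed trigonometric rule curve: seasonal target storage in million m^3."""
    return CAPACITY * (0.6 + 0.25 * np.sin(2 * np.pi * month / 12))

storage = np.empty(months.size + 1)
storage[0] = 0.8 * CAPACITY
supplied = np.zeros(months.size)
for m in months:
    # Curtail the release to 60% of demand whenever storage is below the seasonal target.
    release = monthly_demand if storage[m] >= rule_curve(m) else 0.6 * monthly_demand
    release = min(release, storage[m] + inflow[m])       # cannot release more than is available
    supplied[m] = release
    storage[m + 1] = min(CAPACITY, storage[m] + inflow[m] - release)  # spill above capacity

print(f"mean storage: {storage.mean():.1f} million m^3")
print(f"reliability (months with full supply): {np.mean(supplied >= monthly_demand):.0%}")
```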
1980-12-01
Keywords: bulk cargo; market demand analysis; commodity resource inventory. The study included a Commodity Resource Inventory, a Modal Split Analysis and a Market Demand Analysis. The work included investigation and analyses of the production, transportation, and
Zhang, Hao; Hu, Huimei; Wu, Christina; Yu, Hai; Dong, Hengjin
2015-01-01
Background High drug costs due to supplier-induced demand (SID) obstruct healthcare accessibility in China. Drug prescriptions can generate markup-related profits, and the low prices of other medical services can lead to labor-force underestimations; therefore, physicians are keen to prescribe drugs rather than services. Thus, in China, a public hospital reform has been instituted to cancel markups and increase service prices. Methods A retrospective pre/post-reform study was conducted in ZJ province to assess the impact of the reform on healthcare expenditures and utilization, ultimately to inform policy development and decision-making. The main indicators are healthcare expenditures and utilization. Results Post-reform, drug expenditures per visit decreased by 8.2% and 15.36% in outpatient and inpatient care, respectively; service expenditures per visit increased by 23.03% and 27.69% in outpatient and inpatient care, respectively. Drug utilization per visit increased by 5.58% in outpatient care and underwent no significant change in inpatient care. Both were lower than the theoretical drug-utilization level, which may move along the demand curve because of patient-initiated demand (PID); this indicates that SID-promoted drug utilization may decrease. Finally, service utilization per visit increased by 6% in outpatient care and by 13.10% in inpatient care; both were higher than the theoretical level moving along the demand curve, and this indicates that SID-promoted service utilization may increase. Conclusion The reform reduces drug-prescription profits by eliminating drug markups; additionally, it compensates for service costs by increasing service prices. Post-reform, the SID of drug prescriptions decreased, which may reduce drug-resource waste. The SID of services increased, with potentially positive and negative effects: accessibility to services may be promoted when physicians provide more services, but the risk of resource waste may also increase. This warrants further research. It is recommended that comprehensive measures that control SID and promote physician enthusiasm be carried out concurrently. PMID:26588244
Small woodland ownership management
Albert J. Childs
1977-01-01
Small woodlot ownerships are a commodity on the real estate market that has cycled through a supply and demand curve on a somewhat irregular basis. In order to have some understanding of what we are talking about, it is necessary to define what a small woodlot is and where it may be found. In size, these parcels can range from ten acres to fifteen acres on up...
Acquisition Review quarterly. Vol. 1, No. 3, Summer 1994
1994-01-01
implications, and suggests the use of explicit demand curves. ... This attention is long overdue. Defense Department installations cover tens of millions of square miles of the American landscape. If you look ... Carnegie Commission Report, Facing Towards Governments - Nongovernmental Organizations and Scientific and Technical Advice (New York: Carnegie Commission, January ...)
1980-12-01
Keywords: bulk cargo; market demand analysis; iron; commodity resource inventory. ... The study included a Commodity Resource Inventory, a Modal Split Analysis and a Market Demand Analysis. The work included investigation and analyses of the production ...
Hardware Architecture and Cutting-Edge Assembly Process of a Tiny Curved Compound Eye
Viollet, Stéphane; Godiot, Stéphanie; Leitel, Robert; Buss, Wolfgang; Breugnon, Patrick; Menouni, Mohsine; Juston, Raphaël; Expert, Fabien; Colonnier, Fabien; L'Eplattenier, Géraud; Brückner, Andreas; Kraze, Felix; Mallot, Hanspeter; Franceschini, Nicolas; Pericet-Camara, Ramon; Ruffier, Franck; Floreano, Dario
2014-01-01
The demand for bendable sensors is constantly increasing in the challenging field of soft and micro-scale robotics. We present here, in more detail, the flexible, functional, insect-inspired curved artificial compound eye (CurvACE) that was previously introduced in the Proceedings of the National Academy of Sciences (PNAS, 2013). This cylindrically bent sensor, composed of 630 artificial ommatidia with a large panoramic field of view of 180° × 60°, weighs only 1.75 g, is extremely compact and power-lean (0.9 W), and achieves unique visual motion sensing performance (1950 frames per second) over a five-decade range of illuminance. In particular, this paper details the innovative Very Large Scale Integration (VLSI) sensing layout, the accurate assembly fabrication process, the new fast read-out interface, and the auto-adaptive dynamic response of the CurvACE sensor. Starting from photodetectors and microoptics on wafer substrates and a flexible printed circuit board, the complete assembly of CurvACE was performed in a planar configuration, ensuring high alignment accuracy and compatibility with state-of-the-art assembly processes. The characteristics of the photodetector of one artificial ommatidium have been assessed in terms of its dynamic response to light steps. We also characterized the local auto-adaptability of CurvACE photodetectors in response to large illuminance changes: this feature will certainly be of great interest for future applications in real indoor and outdoor environments. PMID:25407908
Near field wireless power transfer using curved relay resonators for extended transfer distance
NASA Astrophysics Data System (ADS)
Zhu, D.; Clare, L.; Stark, B. H.; Beeby, S. P.
2015-12-01
This paper investigates the performance of a near-field wireless power transfer system that uses curved relay resonators to extend the transfer distance. Near-field wireless power transfer is based on the near-field electromagnetic coupling of coils. Such a system can transfer energy over a relatively short distance, of the same order as the dimensions of the coupled coils. The energy transfer distance can be increased using flat relay resonators. Recent developments in printed electronics and e-textiles have created increasing demand for embedding electronics into fabrics, and near-field wireless power transfer is one of the most promising methods of powering electronics on fabrics. The concept can be applied to body-worn textiles by, for example, integrating a transmitter coil into upholstery and a flexible receiver coil into garments. Flexible textile coils take on the shape of the supporting materials, such as garments, and therefore curved resonator and receiver coils are investigated in this work. Experimental results showed that using a curved relay resonator can effectively extend the wireless power transfer distance. However, as the curvature of the coil increases, the performance of the wireless power transfer, especially the maximum received power, deteriorates.
Optimizing Reservoir Operation to Adapt to the Climate Change
NASA Astrophysics Data System (ADS)
Madadgar, S.; Jung, I.; Moradkhani, H.
2010-12-01
Climate change and upcoming variations in flood timing necessitate adaptation of the current rule curves developed for the operation of water reservoirs so as to reduce the potential damage from either flood or drought events. This study attempts to optimize the current rule curves of Cougar Dam on the McKenzie River in Oregon under several possible climate conditions in the 21st century. The objective is to minimize the failure of operation to meet either designated demands or the flood limit at a downstream checkpoint. A simulation/optimization model, including the standard operating policy and a global optimization method, tunes the current rule curve for 8 GCMs and 2 greenhouse gas emission scenarios. The Precipitation Runoff Modeling System (PRMS) is used as the hydrology model to project streamflow for the period 2000-2100 using downscaled precipitation and temperature forcing from the 8 GCMs and two emission scenarios. An ensemble of rule curves, each associated with an individual scenario, is obtained by optimizing the reservoir operation. The reservoir operation is then simulated for all scenarios and for the expected value of the ensemble, and performance is assessed using statistical indices including reliability, resilience, vulnerability and sustainability.
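The standard operating policy invoked in this simulation/optimization setup has a simple closed form: meet the target demand when water is available, spill when storage would exceed capacity, and otherwise release whatever remains. The sketch below is a generic illustration with made-up numbers, not the study's calibrated model.

```python
# A minimal sketch of the standard operating policy (SOP) referenced above.
# Units and values are arbitrary and illustrative.
def sop_release(storage: float, inflow: float, demand: float, capacity: float):
    """Return (release, new_storage) under the standard operating policy."""
    available = storage + inflow
    if available <= demand:                 # shortage: release all available water
        return available, 0.0
    if available - demand <= capacity:      # normal: meet the demand exactly
        return demand, available - demand
    return available - capacity, capacity   # flood control: spill the excess

if __name__ == "__main__":
    s = 50.0
    for q in (20.0, 5.0, 120.0):            # three inflow scenarios
        r, s = sop_release(s, q, demand=30.0, capacity=100.0)
        print(f"inflow={q:6.1f}  release={r:6.1f}  storage={s:6.1f}")
```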
Steer, Penelope A.; Kirkpatrick, Naomi C.; O'Rourke, Denise; Noormohammadi, Amir H.
2009-01-01
Identification of fowl adenovirus (FAdV) serotypes is of importance in epidemiological studies of disease outbreaks and the adoption of vaccination strategies. In this study, real-time PCR and subsequent high-resolution melting (HRM)-curve analysis of three regions of the hexon gene were developed and assessed for their potential in differentiating 12 FAdV reference serotypes. The results were compared to previously described PCR and restriction enzyme analyses of the hexon gene. Both HRM-curve analysis of a 191-bp region of the hexon gene and restriction enzyme analysis failed to distinguish a number of serotypes used in this study. In addition, PCR of the region spanning nucleotides (nt) 144 to 1040 failed to amplify FAdV-5 in sufficient quantities for further analysis. However, HRM-curve analysis of the region spanning nt 301 to 890 proved a sensitive and specific method of differentiating all 12 serotypes. All melt curves were highly reproducible, and replicates of each serotype were correctly genotyped with a mean confidence value of more than 99% using normalized HRM curves. Sequencing analysis revealed that each profile was related to a unique sequence, with some sequences sharing greater than 94% identity. Melting-curve profiles were found to be related mainly to GC composition and distribution throughout the amplicons, regardless of sequence identity. The results presented in this study show that the closed-tube method of PCR and HRM-curve analysis provides an accurate, rapid, and robust genotyping technique for the identification of FAdV serotypes and can be used as a model for developing genotyping techniques for other pathogens. PMID:19036935
The impact of monsoon intraseasonal variability on renewable power generation in India
NASA Astrophysics Data System (ADS)
Dunning, C. M.; Turner, A. G.; Brayshaw, D. J.
2015-06-01
India is increasingly investing in renewable technology to meet rising energy demands, with hydropower and other renewables comprising one-third of current installed capacity. Installed wind-power is projected to increase 5-fold by 2035 (to nearly 100GW) under the International Energy Agency's New Policies scenario. However, renewable electricity generation is dependent upon the prevailing meteorology, which is strongly influenced by monsoon variability. Prosperity and widespread electrification are increasing the demand for air conditioning, especially during the warm summer. This study uses multi-decadal observations and meteorological reanalysis data to assess the impact of intraseasonal monsoon variability on the balance of electricity supply from wind-power and temperature-related demand in India. Active monsoon phases are characterized by vigorous convection and heavy rainfall over central India. This results in lower temperatures giving lower cooling energy demand, while strong westerly winds yield high wind-power output. In contrast, monsoon breaks are characterized by suppressed precipitation, with higher temperatures and hence greater demand for cooling, and lower wind-power output across much of India. The opposing relationship between wind-power supply and cooling demand during active phases (low demand, high supply) and breaks (high demand, low supply) suggests that monsoon variability will tend to exacerbate fluctuations in the so-called demand-net-wind (i.e., electrical demand that must be supplied from non-wind sources). This study may have important implications for the design of power systems and for investment decisions in conventional schedulable generation facilities (such as coal and gas) that are used to maintain the supply/demand balance. In particular, if it is assumed (as is common) that the generated wind-power operates as a price-taker (i.e., wind farm operators always wish to sell their power, irrespective of price) then investors in conventional facilities will face additional weather-volatility through the monsoonal impact on the length and frequency of production periods (i.e. their load-duration curves).
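The demand-net-wind quantity defined in parentheses above is straightforward to compute from paired demand and wind time series, and a load-duration curve summarizes how often each residual level is exceeded. The sketch below uses simulated hourly data purely for illustration; it is not the study's reanalysis-based method.

```python
# Illustrative sketch: residual (demand-net-wind) series and its load-duration
# curve, on made-up hourly data.
import numpy as np

def demand_net_wind(demand_mw: np.ndarray, wind_mw: np.ndarray) -> np.ndarray:
    """Electrical demand that must be met by non-wind sources (floored at zero)."""
    return np.maximum(demand_mw - wind_mw, 0.0)

def load_duration_curve(series_mw: np.ndarray):
    """Return (exceedance fraction, load sorted from highest to lowest)."""
    sorted_load = np.sort(series_mw)[::-1]
    exceedance = np.arange(1, series_mw.size + 1) / series_mw.size
    return exceedance, sorted_load

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    hours = 24 * 30
    demand = 100 + 20 * rng.random(hours)   # made-up hourly demand (MW)
    wind = 40 * rng.random(hours)           # made-up hourly wind output (MW)
    frac, load = load_duration_curve(demand_net_wind(demand, wind))
    print(f"peak residual: {load[0]:.1f} MW; "
          f"exceeded 10% of hours: {load[int(0.1 * hours)]:.1f} MW")
```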
Abar, Beau; Sheinkopf, Stephen; Lester, Barry; Lagasse, Linda; Seifer, Ronald; Shankaran, Seetha; Bada-Ellzey, Henrietta; Bauer, Charles; Whitaker, Toni; Hinckley, Matt; Hammond, Jane; Higgins, Rosemary
2014-01-01
We employed latent growth curve analysis to examine trajectories of respiratory sinus arrhythmia (RSA) from 3 to 6 years among children with varying levels of prenatal substance exposure and early adversity. Data were drawn from a prospective longitudinal study of prenatal substance exposure that included 1,121 participants. Baseline RSA and RSA reactivity to an attention-demanding task were assessed at 3, 4, 5, and 6 years. Overall, there were significant individual differences in the trajectories of RSA reactivity, but not baseline RSA, across development. Greater levels of prenatal substance exposure, and less exposure to early adversity, were associated with increased RSA reactivity at 3 years, but by 6 years, both were associated with greater RSA reactivity. Prenatal substance exposure had an indirect influence through early adversity on growth in RSA reactivity. Results are in support of and contribute to the framework of allostatic load. PMID:24002807
NASA Astrophysics Data System (ADS)
Stollenwerk, D.
2013-06-01
European countries are highly dependent on energy imports. To lower this import dependency effectively, renewable energies will take a major role in future energy supply systems. To assist the national and inter-European efforts, extensive changes towards a renewable energy supply, especially at the company level, will be unavoidable. The methodology developed in this paper can support the planning procedure needed to conduct this conversion in the most effective way. It is applied to the energy-intensive anodizing production process, where the electrical demand is the governing factor for the energy system layout. The differences between the classical system layout, based on current energy procurement, and an approach with a detailed load-time-curve analysis, using process decomposition alongside thermodynamic optimization, are discussed. The technical effects on the resulting energy systems are shown together with the resulting energy supply costs, which are determined by hourly discrete simulation.
An Airline-Based Multilevel Analysis of Airfare Elasticity for Passenger Demand
NASA Technical Reports Server (NTRS)
Castelli, Lorenzo; Ukovich, Walter; Pesenti, Raffaele
2003-01-01
Price elasticity of passenger demand for a specific airline is estimated, and the main drivers affecting passenger demand for air transportation are identified. First, an Ordinary Least Squares regression analysis is performed. Then, a multilevel-analysis-based methodology is proposed to investigate how the price elasticity of demand varies among the various routes of the airline under study. The experienced daily passenger demands in each fare class are grouped for each considered route. Nine routes were studied for the months of February and May in the years 1999 to 2002, and two fare classes were defined (business and economy). The analysis revealed that the airfare elasticity of passenger demand varies significantly among the different routes of the airline.
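As a toy illustration of the first (OLS) step, the slope of a log-log regression of demand on fare estimates the price elasticity. The numbers below are fabricated; this is not the airline data used in the study.

```python
# Illustrative log-log OLS elasticity estimate on invented fare/demand pairs.
import numpy as np

def price_elasticity(fares, passengers) -> float:
    """Slope of log(passengers) on log(fare): the OLS elasticity estimate."""
    x = np.log(np.asarray(fares, dtype=float))
    y = np.log(np.asarray(passengers, dtype=float))
    slope, _intercept = np.polyfit(x, y, deg=1)
    return slope

if __name__ == "__main__":
    fares = [90, 110, 130, 150, 170, 200]   # illustrative fares
    pax = [180, 150, 128, 112, 98, 82]      # illustrative daily demand
    print(f"estimated elasticity: {price_elasticity(fares, pax):.2f}")
```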
Thermoluminescence glow curve analysis and CGCD method for erbium doped CaZrO{sub 3} phosphor
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tiwari, Ratnesh, E-mail: 31rati@gmail.com; Chopra, Seema
2016-05-06
The manuscript reports the synthesis and thermoluminescence study of CaZrO{sub 3} phosphor doped with Er{sup 3+} at a fixed concentration (1 mol%). The phosphors were prepared by a modified solid state reaction method. The powder sample was characterized by thermoluminescence (TL) glow curve analysis; the optimized concentration was 1 mol% for the UV-irradiated sample. The kinetic parameters were calculated by the computerized glow curve deconvolution (CGCD) technique. The trapping parameters give information on dosimetric losses in the prepared phosphor and on its usability in environmental and personal monitoring. CGCD is an advanced tool for the analysis of complicated TL glow curves.
A subjective supply-demand model: the maximum Boltzmann/Shannon entropy solution
NASA Astrophysics Data System (ADS)
Piotrowski, Edward W.; Sładkowski, Jan
2009-03-01
The present authors have put forward a projective geometry model of rational trading. The expected (mean) value of the time that is necessary to strike a deal and the profit strongly depend on the strategies adopted. A frequent trader often prefers maximal profit intensity to the maximization of profit resulting from a separate transaction because the gross profit/income is the adopted/recommended benchmark. To investigate activities that have different periods of duration, we define, following queuing theory, the profit intensity as a measure of this economic category. The profit intensity in repeated trading has a unique property of attaining its maximum at a fixed point regardless of the shape of demand curves for a wide class of probability distributions of random reverse transactions (i.e. closing of the position). These conclusions remain valid for an analogous model based on supply analysis. This type of market game is often considered in research aiming at finding an algorithm that maximizes the profit of a trader who negotiates prices with the Rest of the World (a collective opponent), possessing a definite and objective supply profile. Such idealization neglects the sometimes important influence of an individual trader on the demand/supply profile of the Rest of the World and in extreme cases questions the very idea of a demand/supply profile. Therefore we put forward a trading model in which the demand/supply profile of the Rest of the World induces the (rational) trader to (subjectively) presume that he/she lacks (almost) all knowledge concerning the market except his/her average frequency of trade. This point of view introduces maximum entropy principles into the model and broadens the range of economic phenomena that can be perceived as a sort of thermodynamical system. As a consequence, the profit intensity has a fixed point with an astonishing connection to Fibonacci's classical work and to the search for the quickest algorithm for obtaining the extremum of a convex function: the profit intensity reaches its maximum when the probability of transaction is given by the golden ratio rule (√5 − 1)/2. This condition sets a sharp criterion of validity of the model and can be tested with real market data.
Bojórquez, Edén; Reyes-Salazar, Alfredo; Ruiz, Sonia E; Terán-Gilmore, Amador
2014-01-01
Several studies have been devoted to calibrate damage indices for steel and reinforced concrete members with the purpose of overcoming some of the shortcomings of the parameters currently used during seismic design. Nevertheless, there is a challenge to study and calibrate the use of such indices for the practical structural evaluation of complex structures. In this paper, an energy-based damage model for multidegree-of-freedom (MDOF) steel framed structures that accounts explicitly for the effects of cumulative plastic deformation demands is used to estimate the cyclic drift capacity of steel structures. To achieve this, seismic hazard curves are used to discuss the limitations of the maximum interstory drift demand as a performance parameter to achieve adequate damage control. Then the concept of cyclic drift capacity, which incorporates information of the influence of cumulative plastic deformation demands, is introduced as an alternative for future applications of seismic design of structures subjected to long duration ground motions.
Uchida, Emi; Swallow, Stephen K; Gold, Arthur; Opaluch, James; Kafle, Achyut; Merrill, Nathaniel; Michaud, Clayton; Gill, Carrie Anne
2018-04-01
Innovative market mechanisms are being increasingly recognized as effective decision-making institutions to incorporate the value of ecosystem services into the economy. We present a field experiment that integrates an economic auction and a biophysical water flux model to develop a local market process consisting of both the supply and demand sides. On the supply side, we operate an auction with small-scale livestock owners who bid for contracts to implement site-specific manure management practices that reduce phosphorus loadings to a major reservoir. On the demand side, we implement a real money, multi-unit public good auction for these contracts with residents who potentially benefit from reduced water quality risks. The experiments allow us to construct supply and demand curves to find an equilibrium price for water quality improvement. The field experiments provide a proof-of-concept for practical implementation of a local market for environmental improvements, even for the challenging context of nonpoint pollution.
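The market-clearing step described here, crossing a stepwise supply curve built from suppliers' asks with a stepwise demand curve built from buyers' bids, can be sketched as follows. The bid values are invented and the clearing rule is an assumed simplification, not the experiment's exact auction protocol.

```python
# Assumed simplification: sort asks ascending and bids descending, accept units
# while the bid covers the ask; the crossing gives an equilibrium price range.
def market_clearing(asks, bids):
    """Return (units traded, low clearing price, high clearing price)."""
    asks, bids = sorted(asks), sorted(bids, reverse=True)
    traded = 0
    for ask, bid in zip(asks, bids):
        if bid >= ask:
            traded += 1
        else:
            break
    if traded == 0:
        return 0, None, None
    # Any price between the last accepted ask and bid clears the traded units.
    return traded, asks[traded - 1], bids[traded - 1]

if __name__ == "__main__":
    asks = [40, 55, 70, 90, 120]   # hypothetical contract costs offered by suppliers ($)
    bids = [100, 85, 60, 45, 30]   # hypothetical willingness to pay per contract ($)
    q, lo, hi = market_clearing(asks, bids)
    print(f"{q} contracts clear at a price between ${lo} and ${hi}")
```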
Acquisition Review Quarterly (ARQ), Volume 1 Number 3, Summer 1994
1994-01-01
sole- and dual-source production and their cost implications, and suggests the use of explicit demand curves. ... Defense Department installations cover tens of millions of square miles of the American landscape. If you look at a map, it is astonishing how much of the country we cover. Naturally, nearly everything ... See Carnegie Commission Report, Facing Towards Governments - Nongovernmental Organizations and Scientific and Technical Advice (New York: Carnegie ...)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ferguson, D.W.; Tompkins, T.A.; Pratapas, J.M.
The Coal Quality Impact Model (CQIM{trademark}) was used to evaluate the economic and performance impacts of gas co-firing at Mississippi Power Company's Plant Watson. One of the most important benefits of gas co-firing considered was the ability to burn lower quality, less expensive fuels. Four coals and petroleum coke were evaluated at 0, 5, 10, 20, and 30 percent gas co-firing. These fuels vary widely in their geographic source, heating value, moisture, volatile matter, and sulfur contents. Performance and economic evaluations were conducted at individual load points of 100, 75, 50, 40, 30, and 20 percent of full load. Additional analyses were made for seasonal load-demand curves and for an average annual load-demand curve. Operating cost in $/MWh, net plant heat rate in Btu/kWh, and break-even gas price in $/MBtu are presented as a function of load and percent gas co-firing. Results illustrate that with the Illinois Basin coal currently burned at Plant Watson, gas co-firing can be economically justified over a range of gas market prices on either an annual or seasonal basis. Other findings indicate that petroleum coke and South American coal co-fired with natural gas offer significant fuel cost savings and are attractive candidate fuels for combustion verification testing.
Securing resource constraints embedded devices using elliptic curve cryptography
NASA Astrophysics Data System (ADS)
Tam, Tony; Alfasi, Mohamed; Mozumdar, Mohammad
2014-06-01
The use of smart embedded devices has been growing rapidly in recent times because of the miniaturization of sensors and platforms. Securing data from these embedded devices has now become one of the core challenges in both industry and the research community. Being embedded, these devices have tight constraints on resources such as power, computation, and memory. Hence it is very difficult to implement traditional Public Key Cryptography (PKC) in these resource-constrained embedded devices. Moreover, most public key security protocols require both the public and private keys to be generated together. In contrast, Identity Based Encryption (IBE), a public key cryptography protocol, allows a public key to be generated from an arbitrary string and the corresponding private key to be generated later on demand. While IBE has been actively studied and widely applied in cryptography research, conventional IBE primitives are also computationally demanding and cannot be efficiently implemented on embedded systems. A simplified version of identity-based encryption has proven to be robust while satisfying the tight resource budget of embedded platforms. In this paper, we describe the choice of several parameters for implementing lightweight IBE in resource-constrained embedded sensor nodes. Our implementation of IBE is built using elliptic curve cryptography (ECC).
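For orientation, the snippet below shows a standard elliptic-curve Diffie-Hellman key agreement with Python's widely used `cryptography` package. It is a hedged illustration of the kind of ECC primitive such schemes build on, not the paper's IBE construction; the curve choice and key-derivation parameters are assumptions.

```python
# Generic ECDH key agreement (illustration only, not the paper's IBE scheme).
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Each party generates an EC key pair on a NIST P-256 curve.
device_key = ec.generate_private_key(ec.SECP256R1())
gateway_key = ec.generate_private_key(ec.SECP256R1())

# Each side combines its private key with the other's public key.
device_secret = device_key.exchange(ec.ECDH(), gateway_key.public_key())
gateway_secret = gateway_key.exchange(ec.ECDH(), device_key.public_key())
assert device_secret == gateway_secret   # both sides derive the same secret

# Derive a fixed-length symmetric session key from the shared secret.
session_key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                   info=b"sensor-session").derive(device_secret)
print(f"shared session key: {session_key.hex()}")
```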
NASA Technical Reports Server (NTRS)
Hu, Shoufeng; Bark, Jong S.; Nairn, John A.
1993-01-01
A variational analysis of the stress state in microcracked cross-ply laminates has been used to investigate the phenomenon of curved microcracking in [(S)/90n]s laminates. Previous investigators proposed that the initiation and orientation of curved microcracks are controlled by local maxima and stress trajectories of the principal stresses. We have implemented a principal stress model using a variational mechanics stress analysis and we were able to make predictions about curved microcracks. The predictions agree well with experimental observations and therefore support the assertion that the variational analysis gives an accurate stress state that is useful for modeling the microcracking properties of cross-ply laminates. An important prediction about curved microcracks is that they are a late stage of microcracking damage. They occur only when the crack density of straight microcracks exceeds the critical crack density for curved microcracking. The predicted critical crack density for curved microcracking agrees well with experimental observations.
Initial Development of a Brief Behavioral Economic Assessment of Alcohol Demand
Owens, Max M.; Murphy, Cara M.; MacKillop, James
2015-01-01
Due to difficulties with definition and measurement, the role of conscious craving in substance use disorders remains contentious. To address this, behavioral economics is increasingly being used to quantify aspects of an individual’s acute motivation to use a substance. Doing so typically involves the use of a purchase task, in which participants make choices about consuming alcohol or other substances at various prices and multiple indices of alcohol demand are generated. However, purchase tasks can be limited by the time required to administer and score them. In the current study, a brief 3-item measure, designed to capture three important indices of demand that are derived from demand curve modeling (intensity, Omax, and breakpoint), was investigated in a group of 84 heavy drinkers. Participants underwent a cue-reactivity paradigm that is established to increase both conscious craving and alcohol demand on traditional purchase tasks. All three indices of demand for alcohol measured using the abbreviated measure increased significantly in response to alcohol cues, analogous to what has been observed using a traditional purchase task. Additionally, the correlations between these indices and subjective craving were modest-to-moderate, as has been found in studies comparing craving to the indices derived from purchase tasks. These findings suggest that this abbreviated measure may be a useful and efficient way to capture important and distinct aspects of motivation for alcohol. If these results are confirmed, this measure may be able to help increase the portability of behavioral economic indices of demand into novel research and clinical contexts. PMID:27135038
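The three observed indices named here (intensity, Omax, breakpoint) are simple summaries of a purchase task's price-consumption pairs. The sketch below uses made-up responses; it is not the study's 3-item instrument or its scoring code.

```python
# Illustrative computation of observed demand indices from purchase-task data.
def demand_indices(prices, drinks):
    """Return (intensity, Omax, breakpoint) from paired price/consumption lists."""
    expenditures = [p * q for p, q in zip(prices, drinks)]
    intensity = drinks[0]                           # consumption when drinks are free
    o_max = max(expenditures)                       # maximum expenditure across prices
    break_point = next((p for p, q in zip(prices, drinks) if q == 0), None)
    return intensity, o_max, break_point

if __name__ == "__main__":
    prices = [0.0, 0.5, 1.0, 2.0, 4.0, 8.0, 16.0]   # $ per drink (illustrative)
    drinks = [10, 9, 8, 6, 3, 1, 0]                 # reported consumption (illustrative)
    i, o, b = demand_indices(prices, drinks)
    print(f"intensity={i} drinks, Omax=${o:.2f}, breakpoint=${b:.2f}")
```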
A demonstration of the instream flow incremental methodology, Shenandoah River
Zappia, Humbert; Hayes, Donald C.
1998-01-01
Current and projected demands on the water resources of the Shenandoah River have increased concerns for the potential effect of these demands on the natural integrity of the Shenandoah River system. The Instream Flow Incremental Method (IFIM) process attempts to integrate concepts of water-supply planning, analytical hydraulic engineering models, and empirically derived habitat versus flow functions to address water-use and instream-flow issues and questions concerning life-stage specific effects on selected species and the general well being of aquatic biological populations. The demonstration project also sets the stage for the identification and compilation of the major instream-flow issues in the Shenandoah River Basin, development of the required multidisciplinary technical team to conduct more detailed studies, and development of basin specific habitat and flow requirements for fish species, species assemblages, and various water uses in the Shenandoah River Basin. This report presents the results of an IFIM demonstration project, conducted on the main stem Shenandoah River in Virginia, during 1996 and 1997, using the Physical Habitat Simulation System (PHABSIM) model. Output from PHABSIM is used to address the general flow requirements for water supply and recreation and habitat for selected life stages of several fish species. The model output is only a small part of the information necessary for effective decision making and management of river resources. The information by itself is usually insufficient for formulation of recommendations regarding instream-flow requirements. Additional information, for example, can be obtained by analysis of habitat time-series data, habitat duration data, and habitat bottlenecks. Alternative-flow analysis and habitat-duration curves are presented.
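Habitat-duration curves like those mentioned at the end of this abstract are simple exceedance summaries. The sketch below is illustrative only; the habitat series is simulated, not PHABSIM output.

```python
# Illustrative habitat-duration (exceedance) curve on a simulated habitat series.
import numpy as np

def duration_curve(values):
    """Return (percent of time equaled or exceeded, values sorted descending)."""
    v = np.sort(np.asarray(values, dtype=float))[::-1]
    pct = 100.0 * np.arange(1, v.size + 1) / v.size
    return pct, v

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    habitat = rng.gamma(shape=3.0, scale=500.0, size=365)  # made-up daily habitat values
    pct, v = duration_curve(habitat)
    for target in (10, 50, 90):
        idx = int(target / 100 * v.size) - 1
        print(f"habitat equaled or exceeded {target}% of the time: {v[idx]:.0f}")
```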
Tranquart, F; Mercier, L; Frinking, P; Gaud, E; Arditi, M
2012-07-01
With contrast-enhanced ultrasound (CEUS) now established as a valuable imaging modality for many applications, a more specific demand has recently emerged for quantifying perfusion and using measured parameters as objective indicators for various disease states. However, CEUS perfusion quantification remains challenging and is not well integrated in daily clinical practice. The development of VueBox™ alleviates existing limitations and enables quantification in a standardized way. VueBox™ operates as an off-line software application, after dynamic contrast-enhanced ultrasound (DCE-US) is performed. It enables linearization of DICOM clips, assessment of perfusion using patented curve-fitting models, and generation of parametric images by synthesizing perfusion information at the pixel level using color coding. VueBox™ is compatible with most of the available ultrasound platforms (nonlinear contrast-enabled), has the ability to process both bolus and disruption-replenishment kinetics loops, allows analysis results and their context to be saved, and generates analysis reports automatically. Specific features have been added to VueBox™, such as fully automatic in-plane motion compensation and an easy-to-use clip editor. Processing time has been reduced as a result of parallel programming optimized for multi-core processors. A long list of perfusion parameters is available for each of the two administration modes to address all possible demands currently reported in the literature for diagnosis or treatment monitoring. In conclusion, VueBox™ is a valid and robust quantification tool to be used for standardizing perfusion quantification and to improve the reproducibility of results across centers. © Georg Thieme Verlag KG Stuttgart · New York.
Robust Informatics Infrastructure Required For ICME: Combining Virtual and Experimental Data
NASA Technical Reports Server (NTRS)
Arnold, Steven M.; Holland, Frederic A. Jr.; Bednarcyk, Brett A.
2014-01-01
With the increased emphasis on reducing the cost and time to market of new materials, the need for robust automated materials information management systems enabling sophisticated data mining tools is increasing, as evidenced by the emphasis on Integrated Computational Materials Engineering (ICME) and the recent establishment of the Materials Genome Initiative (MGI). This need is also fueled by the demands for higher efficiency in material testing; consistency, quality and traceability of data; product design; engineering analysis; as well as control of access to proprietary or sensitive information. Further, the use of increasingly sophisticated nonlinear, anisotropic and/or multi-scale models requires the processing of both large volumes of test data and the complex materials data necessary to establish processing-microstructure-property-performance relationships. Fortunately, material information management systems have kept pace with the growing user demands and evolved to enable: (i) the capture of both point-wise data and full spectra of raw data curves; (ii) data management functions such as access, version, and quality controls; (iii) a wide range of data import, export and analysis capabilities; (iv) data pedigree traceability mechanisms; (v) data searching, reporting and viewing tools; and (vi) access to the information via a wide range of interfaces. This paper discusses key principles for the development of a robust materials information management system that enables the connections at various length scales to be made between experimental data and corresponding multiscale modeling toolsets to enable ICME. In particular, NASA Glenn's efforts towards establishing such a database for capturing constitutive modeling behavior for both monolithic and composite materials are discussed.
Application of Demand Analysis in Marketing Continuing Education.
ERIC Educational Resources Information Center
Waters, Elzberry, Jr.
This study investigated the feasibility of applying economic demand analysis (especially elasticity of demand) in marketing George Washington University off-campus degree programs. In the case under study, a supplemental budget request had to be submitted to meet expenses incurred by an unforeseen increase in demand for graduate and undergraduate…
The Effects of Perceived Quality on Behavioral Economic Demand for Marijuana: A Web-Based Experiment
Vincent, Paula C.; Collins, R. Lorraine; Liu, Liu; Yu, Jihnhee; De Leo, Joseph A.; Earleywine, Mitch
2016-01-01
Background Given the growing legalization of recreational marijuana use and related increase in its prevalence in the United States, it is important to understand marijuana's appeal. We used a behavioral economic (BE) approach to examine whether the reinforcing properties of marijuana, including “demand” for marijuana, varied as a function of its perceived quality. Methods Using an innovative, Web-based marijuana purchase task (MPT), a sample of 683 young-adult recreational marijuana users made hypothetical purchases of marijuana across three qualities (low, mid and high grade) at nine escalating prices per joint, ranging from $0/free to $20. Results We used nonlinear mixed effects modeling to conduct demand curve analyses, which produced separate demand indices (e.g., Pmax, elasticity) for each grade of marijuana. Consistent with previous research, as the price of marijuana increased, marijuana users reduced their purchasing. Demand also was sensitive to quality, with users willing to pay more for higher quality/grade marijuana. In regression analyses, demand indices accounted for significant variance in typical marijuana use. Conclusions This study illustrates the value of applying BE theory to young adult marijuana use. It extends past research by examining how perceived quality affects demand for marijuana and provides support for the validity of a Web-based MPT to examine the appeal of marijuana. Our results have implications for policies to regulate marijuana use, including taxation based on the quality of different marijuana products. PMID:27951424
pROC: an open-source package for R and S+ to analyze and compare ROC curves.
Robin, Xavier; Turck, Natacha; Hainard, Alexandre; Tiberti, Natalia; Lisacek, Frédérique; Sanchez, Jean-Charles; Müller, Markus
2011-03-17
Receiver operating characteristic (ROC) curves are useful tools to evaluate classifiers in biomedical and bioinformatics applications. However, conclusions are often reached through inconsistent use or insufficient statistical analysis. To support researchers in their ROC curves analysis we developed pROC, a package for R and S+ that contains a set of tools displaying, analyzing, smoothing and comparing ROC curves in a user-friendly, object-oriented and flexible interface. With data previously imported into the R or S+ environment, the pROC package builds ROC curves and includes functions for computing confidence intervals, statistical tests for comparing total or partial area under the curve or the operating points of different classifiers, and methods for smoothing ROC curves. Intermediary and final results are visualised in user-friendly interfaces. A case study based on published clinical and biomarker data shows how to perform a typical ROC analysis with pROC. pROC is a package for R and S+ specifically dedicated to ROC analysis. It proposes multiple statistical tests to compare ROC curves, and in particular partial areas under the curve, allowing proper ROC interpretation. pROC is available in two versions: in the R programming language or with a graphical user interface in the S+ statistical software. It is accessible at http://expasy.org/tools/pROC/ under the GNU General Public License. It is also distributed through the CRAN and CSAN public repositories, facilitating its installation.
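pROC itself is an R/S+ package; as a rough Python analogue (not pROC's API), scikit-learn provides the same basic ROC building blocks, and a percentile bootstrap gives a confidence interval for the AUC. The data below are simulated for illustration.

```python
# Rough Python analogue of a basic ROC analysis (not pROC), on simulated data.
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(0)
labels = rng.integers(0, 2, size=200)               # 0 = control, 1 = case
scores = labels + rng.normal(scale=1.2, size=200)   # a noisy simulated biomarker

fpr, tpr, _ = roc_curve(labels, scores)
auc = roc_auc_score(labels, scores)

# Percentile bootstrap for the AUC (an alternative to DeLong-type tests).
boot = []
for _ in range(1000):
    idx = rng.integers(0, labels.size, labels.size)
    if labels[idx].min() == labels[idx].max():       # resample must contain both classes
        continue
    boot.append(roc_auc_score(labels[idx], scores[idx]))
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"AUC = {auc:.3f} (95% bootstrap CI {lo:.3f}-{hi:.3f})")
```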
Assessing the Utility of a Demand Assessment for Functional Analysis
ERIC Educational Resources Information Center
Roscoe, Eileen M.; Rooker, Griffin W.; Pence, Sacha T.; Longworth, Lynlea J.
2009-01-01
We evaluated the utility of an assessment for identifying tasks for the functional analysis demand condition with 4 individuals who had been diagnosed with autism. During the demand assessment, a therapist presented a variety of tasks, and observers measured problem behavior and compliance to identify demands associated with low levels of…
Ali, S. M.; Mehmood, C. A; Khan, B.; Jawad, M.; Farid, U; Jadoon, J. K.; Ali, M.; Tareen, N. K.; Usman, S.; Majid, M.; Anwar, S. M.
2016-01-01
In the smart grid paradigm, consumer demands are random and time-dependent, owing to their stochastic nature. The stochastically varying consumer demands have put policy makers and supplying agencies in a demanding position for optimal generation management. The utility revenue functions are highly dependent on deterministic and stochastic models of consumer demand. Sudden drifts in weather parameters affect the living standards of consumers, which in turn influence the power demands. Considering the above, we analyzed stochastically and statistically the effect of random consumer demands on the fixed and variable revenues of electrical utilities. Our work presents a Multi-Variate Gaussian Distribution Function (MVGDF) probabilistic model of the utility revenues with time-dependent random consumer demands. The Gaussian probabilities of the utility revenues are based on the varying consumer demand data pattern. Furthermore, Standard Monte Carlo (SMC) simulations are performed that validate the accuracy of the aforesaid probabilistic demand-revenue model. We critically analyzed the effect of weather data parameters on consumer demands using correlation and multi-linear regression schemes. The statistical analysis of consumer demands provides a relationship between the dependent variable (demand) and independent variables (weather data) for utility load management, generation control, and network expansion. PMID:27314229
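A minimal numerical sketch of the idea: correlated demands are drawn from a multivariate Gaussian and pushed through a simple tariff to obtain a revenue distribution via standard Monte Carlo. All means, covariances and prices below are made-up illustrative values, not the paper's calibrated parameters.

```python
# Illustrative Monte Carlo over multivariate-Gaussian demands and a fixed tariff.
import numpy as np

rng = np.random.default_rng(42)

mean_demand = np.array([320.0, 410.0, 530.0])        # MW for three demand blocks (assumed)
cov = np.array([[400.0, 150.0, 100.0],
                [150.0, 625.0, 200.0],
                [100.0, 200.0, 900.0]])              # MW^2, illustrative covariances
price = np.array([45.0, 60.0, 80.0])                 # $/MWh per block (assumed tariff)
fixed_revenue = 5_000.0                              # $ per period, e.g. capacity charges

draws = rng.multivariate_normal(mean_demand, cov, size=100_000)
draws = np.clip(draws, 0.0, None)                    # demand cannot be negative
revenue = fixed_revenue + draws @ price              # total revenue per draw

print(f"expected revenue: ${revenue.mean():,.0f}")
print(f"5th-95th percentile: ${np.percentile(revenue, 5):,.0f} - "
      f"${np.percentile(revenue, 95):,.0f}")
```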
Xu, Yan; Wang, Yaqin; Coda, Rossana; Säde, Elina; Tuomainen, Päivi; Tenkanen, Maija; Katina, Kati
2017-05-02
Fava bean flour is regarded as a potential plant-based protein source, but the addition of it at high concentration is restricted by its poor texture-improving ability and by anti-nutritional factors (ANF). Exopolysaccharides (EPS) produced by lactic acid bacteria (LAB) are regarded as good texture modifiers. In this study, fava bean flour was fermented with Leuconostoc spp. and Weissella spp. with or without sucrose addition, in order to evaluate their potential in EPS production. The contents of free sugars, organic acids, mannitol and EPS in all fermented fava bean doughs were measured. Rheological properties of sucrose-enriched doughs, including viscosity flow curves, hysteresis loop and dynamic oscillatory sweep curves, were measured after fermentation. As one of the ANF, the degradation of raffinose family oligosaccharides (RFO) was also studied by analyzing RFO profiles of different doughs. Quantification of EPS revealed the potential of Leuconostoc pseudomesenteroides DSM 20193 in EPS production, and the rheological analysis showed that the polymers produced by this strain has the highest thickening and gelling capability. Furthermore, the viscous fava bean doughs containing plant proteins and synthesized in situ EPS may have a potential application in the food industry and fulfill consumers' increasing demands for "clean labels" and plant-originated food materials. Copyright © 2017 Elsevier B.V. All rights reserved.
Minimally invasive video-assisted thyroid surgery: how can we improve the learning curve?
Castagnola, G; Giulii Cappone, M; Tierno, S M; Mezzetti, G; Centanini, F; Vetrone, I; Bellotti, C
2012-10-01
Minimally invasive video-assisted thyroidectomy (MIVAT) is a technically demanding procedure and requires a surgical team skilled in both endocrine and endoscopic surgery. A time-consuming learning and training period is mandatory at the beginning of the experience. The aim of our report is to focus on some aspects of the learning curve of the surgeon who performs video-assisted thyroid procedures for the first time, through the analysis of our preliminary series of 36 cases. From September 2004 to April 2005 we selected 36 patients for minimally invasive video-assisted surgery of the thyroid. Patients were considered eligible if they presented with a nodule not exceeding 35 mm in maximum diameter, a total thyroid volume within the normal range, and an absence of biochemical and echographic signs of thyroiditis. We analyzed the surgical results, conversion rate, operating time, post-operative complications, hospital stay, and cosmetic outcome of the series. We performed 36 total thyroidectomies. The procedure was successfully carried out in 33/36 cases. Post-operative complications included 3 transient recurrent nerve palsies and 2 transient hypocalcemias; no definitive hypoparathyroidism was registered. All patients were discharged 2 days after the operation. The cosmetic result was considered excellent by most patients. Advances in skills and technology have enabled surgeons to reproduce most open surgical techniques with video assistance or laparoscopically. Training is essential to acquire any new surgical technique, and it should be organized in detail to exploit it completely.
Consumer Surplus, Demand Functions, and Policy Analysis,
1983-06-01
Consumer Surplus, Demand Functions, and Policy Analysis. Frank Camm. RAND Corporation, Santa Monica, CA, R-3048-RC, June 1983.
Cornelissen, Frans; Cik, Miroslav; Gustin, Emmanuel
2012-04-01
High-content screening has brought new dimensions to cellular assays by generating rich data sets that characterize cell populations in great detail and detect subtle phenotypes. To derive relevant, reliable conclusions from these complex data, it is crucial to have informatics tools supporting quality control, data reduction, and data mining. These tools must reconcile the complexity of advanced analysis methods with the user-friendliness demanded by the user community. After review of existing applications, we realized the possibility of adding innovative new analysis options. Phaedra was developed to support workflows for drug screening and target discovery, interact with several laboratory information management systems, and process data generated by a range of techniques including high-content imaging, multicolor flow cytometry, and traditional high-throughput screening assays. The application is modular and flexible, with an interface that can be tuned to specific user roles. It offers user-friendly data visualization and reduction tools for HCS but also integrates Matlab for custom image analysis and the Konstanz Information Miner (KNIME) framework for data mining. Phaedra features efficient JPEG2000 compression and full drill-down functionality from dose-response curves down to individual cells, with exclusion and annotation options, cell classification, statistical quality controls, and reporting.
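Dose-response curve fitting of the kind such screening informatics tools perform is commonly a four-parameter logistic fit. The sketch below is a generic SciPy illustration on simulated data, not Phaedra's implementation.

```python
# Generic four-parameter logistic (4PL) dose-response fit on simulated data.
import numpy as np
from scipy.optimize import curve_fit

def four_pl(conc, bottom, top, ec50, hill):
    """4PL response: decreases from `top` toward `bottom` as concentration rises."""
    return bottom + (top - bottom) / (1.0 + (conc / ec50) ** hill)

rng = np.random.default_rng(3)
conc = np.logspace(-3, 2, 10)                                   # illustrative dilution series
resp = four_pl(conc, 5.0, 95.0, 0.8, 1.2) + rng.normal(0, 3, conc.size)

params, _cov = curve_fit(four_pl, conc, resp, p0=[1.0, 90.0, 1.0, 1.0],
                         bounds=(0.0, [200.0, 200.0, 100.0, 5.0]))
bottom, top, ec50, hill = params
print(f"EC50 ~ {ec50:.2f}, Hill slope ~ {hill:.2f}")
```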
Baltzer, Pascal Andreas Thomas; Renz, Diane M; Kullnig, Petra E; Gajda, Mieczyslaw; Camara, Oumar; Kaiser, Werner A
2009-04-01
The identification of the most suspect enhancing part of a lesion is regarded as a major diagnostic criterion in dynamic magnetic resonance mammography. Computer-aided diagnosis (CAD) software allows the semi-automatic analysis of the kinetic characteristics of complete enhancing lesions, providing additional information about lesion vasculature. The diagnostic value of this information has not yet been quantified. Consecutive patients from routine diagnostic studies (1.5 T, 0.1 mmol gadopentetate dimeglumine, dynamic gradient-echo sequences at 1-minute intervals) were analyzed prospectively using CAD. Dynamic sequences were processed and reduced to a parametric map. Curve types were classified by initial signal increase (not significant, intermediate, and strong) and the delayed time course of signal intensity (continuous, plateau, and washout). Lesion enhancement was measured using CAD. The most suspect curve, the curve-type distribution percentage, and combined dynamic data were compared. Statistical analysis included logistic regression analysis and receiver-operating characteristic analysis. Fifty-one patients with 46 malignant and 44 benign lesions were enrolled. On receiver-operating characteristic analysis, the most suspect curve showed diagnostic accuracy of 76.7 +/- 5%. In comparison, the curve-type distribution percentage demonstrated accuracy of 80.2 +/- 4.9%. Combined dynamic data had the highest diagnostic accuracy (84.3 +/- 4.2%). These differences did not achieve statistical significance. With appropriate cutoff values, sensitivity and specificity, respectively, were found to be 80.4% and 72.7% for the most suspect curve, 76.1% and 83.6% for the curve-type distribution percentage, and 78.3% and 84.5% for both parameters. The integration of whole-lesion dynamic data tends to improve specificity. However, no statistical significance backs up this finding.
Further examination of the temporal stability of alcohol demand.
Acuff, Samuel F; Murphy, James G
2017-08-01
Demand, or the amount of a substance consumed as a function of price, is a central dependent measure in behavioral economic research and represents the relative valuation of a substance. Although demand is often utilized as an index of substance use severity and is assumed to be relatively stable, recent experimental and clinical research has identified conditions in which demand can be manipulated, such as through craving and stress inductions, and treatment. Our study examines the 1-month reliability of the alcohol purchase task in a sample of heavy drinking college students. We also analyzed reliability in subgroup of individuals whose consumption decreased, increased, or stayed the same over the 1-month period, and in individuals with moderate/severe Alcohol Use Disorder (AUD) vs. those with no/mild AUD. Reliability was moderate in the full sample, high in the group with stable consumption, and did not differ appreciably between AUD groups. Observed indices and indices derived from an exponentiated equation (Koffarnus et al., 2015) were generally comparable, although P max observed had very low reliability. Area under the curve, O max derived, and essential value showed the greatest reliability in the full sample (rs=0.75-0.77). These results provide evidence for the relative stability over time of demand and across AUD groups, particularly in those whose consumption remains stable. Copyright © 2017 Elsevier B.V. All rights reserved.
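The exponentiated demand equation attributed above to Koffarnus et al. (2015) is commonly written Q = Q0 · 10^{k(e^{-αQ0C} − 1)}, where Q0 is intensity and α indexes elasticity. The sketch below fits it to invented purchase-task data; the value of k, the starting values and the data are assumptions, not the study's.

```python
# Hedged sketch: fit the exponentiated demand equation to invented data.
import numpy as np
from scipy.optimize import curve_fit

K = 2.0  # span constant (log10 units), fixed in advance (assumed value)

def exponentiated_demand(price, q0, alpha):
    """Consumption at each price under the exponentiated demand equation."""
    return q0 * 10.0 ** (K * (np.exp(-alpha * q0 * price) - 1.0))

prices = np.array([0.0, 0.25, 0.5, 1.0, 1.5, 2.0, 3.0, 4.0, 6.0, 8.0])
drinks = np.array([9.0, 9.0, 8.0, 7.0, 6.0, 5.0, 3.0, 2.0, 1.0, 0.0])  # invented responses

(q0, alpha), _ = curve_fit(exponentiated_demand, prices, drinks,
                           p0=[9.0, 0.01], bounds=([0.1, 1e-5], [50.0, 1.0]))
print(f"Q0 (intensity) ~ {q0:.1f} drinks, alpha (elasticity index) ~ {alpha:.4f}")
```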
Richardson, Richard B
2014-07-01
Knudson's carcinogenic model, which simulates incidence rates for retinoblastoma, provides compelling evidence for a two-stage mutational process. However, for more complex cancers, existing multistage models are less convincing. To fill this gap, I hypothesize that neoplasms preferentially arise when stem cell exhaustion creates a short supply of progenitor cells at ages of high proliferative demand. To test this hypothesis, published datasets were employed to model the age distribution of osteochondroma, a benign lesion, and osteosarcoma, a malignant one. The supply of chondrogenic stem-like cells in femur growth plates of children and adolescents was evaluated and compared with the progenitor cell demand of longitudinal bone growth. Similarly, the supply of osteoprogenitor cells from birth to old age was compared with the demands of bone formation. Results show that progenitor cell demand-to-supply ratios are a good risk indicator, exhibiting similar trends to the unimodal and bimodal age distributions of osteochondroma and osteosarcoma, respectively. The hypothesis also helps explain Peto's paradox and the finding that taller individuals are more prone to cancers and have shorter lifespans. The hypothesis was tested, in the manner of Knudson, by its ability to convincingly explain and demonstrate, for the first time, a bone tumour's bimodal age-incidence curve. Crown Copyright © 2014. Published by Elsevier Ireland Ltd. All rights reserved.
Quality Quandaries: Predicting a Population of Curves
Fugate, Michael Lynn; Hamada, Michael Scott; Weaver, Brian Phillip
2017-12-19
We present a random effects regression model based on splines that provides an integrated approach for analyzing functional data, i.e., curves, when the shape of the curves is not parametrically specified. An analysis using this model is presented that makes inferences about a population of curves as well as features of the curves.
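As a rough illustration of the idea (not the authors' model), one can fit every curve with a shared spline basis and treat the per-curve coefficients as draws from a population, summarizing the mean curve and its pointwise spread. The basis, knots and simulated data below are assumptions.

```python
# Illustrative population-of-curves summary using a shared cubic spline basis.
import numpy as np

def spline_basis(x, knots):
    """Cubic truncated-power basis: 1, x, x^2, x^3, and (x - k)_+^3 per knot."""
    cols = [np.ones_like(x), x, x**2, x**3]
    cols += [np.clip(x - k, 0.0, None) ** 3 for k in knots]
    return np.column_stack(cols)

rng = np.random.default_rng(7)
x = np.linspace(0.0, 1.0, 40)
B = spline_basis(x, knots=[0.25, 0.5, 0.75])

# Simulate a population of noisy curves around a common smooth shape.
curves = [np.sin(2 * np.pi * x) * rng.normal(1.0, 0.2) + rng.normal(0, 0.1, x.size)
          for _ in range(30)]

coefs = np.array([np.linalg.lstsq(B, y, rcond=None)[0] for y in curves])
mean_curve = B @ coefs.mean(axis=0)        # estimated population mean curve
sd_curve = (B @ coefs.T).std(axis=1)       # pointwise spread across fitted curves

print(f"mean curve range: {mean_curve.min():.2f} to {mean_curve.max():.2f}; "
      f"median pointwise SD: {np.median(sd_curve):.2f}")
```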
Pulpwood supply and demand : development in the South, little growth elsewhere.
Peter J. Ince; Irene Durbak
2002-01-01
This long-range outlook derives from analysis of pulp and paper markets and pulpwood demands for wood panels. The analysis projects modest increases in pulpwood demand beyond 2010, with decelerating growth in paper and paperboard consumption; increased demand for pulpwood in wood panels; increased imports of pulp, paper, and paperboard; and little additional growth in...
Ross, Sharona B; Choung, Edward; Teta, Anthony F; Colibao, Lotiffa; Luberice, Kenneth; Paul, Harold; Rosemurgy, Alexander S
2013-01-01
This study of laparoendoscopic single-site (LESS) fundoplication for gastroesophageal reflux disease was undertaken to determine the "learning curve" for implementing LESS fundoplication. One hundred patients, 38% men, with a median age of 61 years and median body mass index of 26 kg/m(2) , underwent LESS fundoplications. The operative times, placement of additional trocars, conversions to "open" operations, and complications were compared among patient quartiles to establish a learning curve. Median data are reported. The median operative times and complications did not differ among 25-patient cohorts. Additional trocars were placed in 27% of patients, 67% of whom were in the first 25-patient cohort. Patients undergoing LESS fundoplication had a dramatic relief in the frequency and severity of all symptoms of reflux across all cohorts equally (P < .05), particularly for heartburn and regurgitation, without causing dysphagia. LESS fundoplication ameliorates symptoms of gastroesophageal reflux disease without apparent scarring. Notably, few operations required additional trocars after the first 25-patient cohort. Patient selection became more inclusive (eg, more "redo" fundoplications) with increasing experience, whereas operative times and complications remained relatively unchanged. The learning curve of LESS fundoplication is definable, short, and safe. We believe that patients will seek LESS fundoplication because of the efficacy and superior cosmetic outcomes; surgeons will need to meet this demand.
Dong, Jianghu J; Wang, Liangliang; Gill, Jagbir; Cao, Jiguo
2017-01-01
This article is motivated by some longitudinal clinical data of kidney transplant recipients, where kidney function progression is recorded as the estimated glomerular filtration rates at multiple time points post kidney transplantation. We propose to use the functional principal component analysis method to explore the major source of variations of glomerular filtration rate curves. We find that the estimated functional principal component scores can be used to cluster glomerular filtration rate curves. Ordering functional principal component scores can detect abnormal glomerular filtration rate curves. Finally, functional principal component analysis can effectively estimate missing glomerular filtration rate values and predict future glomerular filtration rate values.
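As a rough illustration of the approach described in the abstract above, the following Python sketch approximates functional principal component analysis by a singular value decomposition of densely sampled, centred curves; the trajectories, time grid, and all numeric values are synthetic stand-ins, not the kidney transplant data.

    import numpy as np

    rng = np.random.default_rng(0)
    t = np.linspace(0, 36, 37)                    # months post-transplant (hypothetical grid)
    n = 50
    # synthetic eGFR-like trajectories: a mean curve plus two random modes of variation
    mean_curve = 60 - 0.3 * t
    true_scores = rng.normal(size=(n, 2)) * [8.0, 3.0]
    modes = np.vstack([np.ones_like(t), t / t.max()])
    curves = mean_curve + true_scores @ modes + rng.normal(scale=2.0, size=(n, t.size))

    # centre the curves and decompose; rows of Vt are discretized component functions
    centred = curves - curves.mean(axis=0)
    U, s, Vt = np.linalg.svd(centred, full_matrices=False)
    scores = U * s                                # n x k matrix of component scores
    explained = s**2 / np.sum(s**2)
    print("variance explained by first two components:", explained[:2].round(3))
    # unusually large scores flag potentially abnormal trajectories
    print("most extreme curves on component 1:", np.argsort(-np.abs(scores[:, 0]))[:3])

Ordering or clustering the resulting score matrix then corresponds to the curve ranking and clustering steps mentioned in the abstract.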
Finite element analysis of the Wolf Creek multispan curved girder bridge.
DOT National Transportation Integrated Search
2008-01-01
The use of curved girder bridges in highway construction has grown steadily during the last 40 years. Today, roughly 25% of newly constructed bridges have a curved alignment. Curved girder bridges have numerous complicating geometric features that di...
Regulating services as measures of ecological resilience on DoD lands
Angermeier, Paul; Villamagna, Amy M.
2015-01-01
Knowledge of the capacity and flow of ecosystem services can help DoD land managers make decisions that enhance cost-effectiveness, minimize environmental damage, and maximize resources available for military missions. We demonstrated a methodology to quantify and map selected regulating services (RS), which helps land managers envision tradeoffs. Our objectives were to 1) estimate current capacity of and demand for selected RS within DoD lands, 2) examine the effects of future DoD land management and climate changes on the capacity and flow of these RS, and 3) project how land-use and climate changes in nearby lands affect future demand for RS. Our approach incorporates widely accepted models and equations, remote sensing, GIS analysis, and stakeholder involvement. Required data include land cover/use, soil type, precipitation, and air temperature. We integrated these data into (a) the Surface Curve Number Method and (b) the Revised Universal Soil Loss Equation to estimate the capacity for sediment, nitrogen (N), and surface-water regulation. Capacities and flows of RS vary greatly across landscapes and are likely to vary as climate changes or development occurs. Analyses of RS capacity and flow can help managers and planners prioritize actions in the context of best management practices and compatible use buffers. Staff surveys indicated that our approach was informative and easy to use. Implementation may be most limited by on-installation personnel time.
Harrysson, Iliana J; Cook, Jonathan; Sirimanna, Pramudith; Feldman, Liane S; Darzi, Ara; Aggarwal, Rajesh
2014-07-01
To determine how minimally invasive surgical learning curves are assessed and define an ideal framework for this assessment. Learning curves have implications for training and adoption of new procedures and devices. In 2000, a review of the learning curve literature was done by Ramsay et al and it called for improved reporting and statistical evaluation of learning curves. Since then, a body of literature on learning curves has emerged, but the presentation and analysis vary. A systematic search was performed of MEDLINE, EMBASE, ISI Web of Science, ERIC, and the Cochrane Library from 1985 to August 2012. The inclusion criteria were studies of minimally invasive abdominal surgery that formally analyzed the learning curve and were published in English. 592 (11.1%) of the identified studies met the selection criteria. Time is the most commonly used proxy for the learning curve (508, 86%). Intraoperative outcomes were used in 316 (53%) of the articles, postoperative outcomes in 306 (52%), technical skills in 102 (17%), and patient-oriented outcomes in 38 (6%) articles. Over time, there was evidence of an increase in the relative amount of laparoscopic and robotic studies (P < 0.001) without statistical evidence of a change in the complexity of analysis (P = 0.121). Assessment of learning curves is needed to inform surgical training and evaluate new clinical procedures. An ideal analysis would account for the degree of complexity of individual cases and the inherent differences between surgeons. There is no single proxy that best represents the success of surgery, and hence multiple outcomes should be collected.
NEXT Performance Curve Analysis and Validation
NASA Technical Reports Server (NTRS)
Saripalli, Pratik; Cardiff, Eric; Englander, Jacob
2016-01-01
Performance curves of the NEXT thruster are highly important in determining the thruster's ability to meet mission-specific goals. New performance curves are proposed and examined here. The Evolutionary Mission Trajectory Generator (EMTG) is used to verify variations in mission solutions based on both available thruster curves and the new curves generated. Furthermore, variations in beginning-of-life (BOL) and end-of-life (EOL) curves are also examined. Mission design results shown here validate the use of EMTG and the new performance curves.
Gregori, Dario; Rosato, Rosalba; Zecchin, Massimo; Di Lenarda, Andrea
2005-01-01
This paper discusses the use of bivariate survival curve estimators within the competing risks framework. Competing risks models are used for the analysis of medical data with more than one cause of death. The case of dilated cardiomyopathy is explored. Bivariate survival curves plot the conjoint mortality processes. The distinct graphical representation of bivariate survival analysis is the major contribution of this methodology to competing risks analysis.
[Application of melting curve to analyze genotype of Duffy blood group antigen Fy-a/b].
Chen, Xue; Zhou, Chang-Hua; Hong, Ying; Gong, Tian-Xiang
2012-12-01
This study aimed to establish a real-time multiplex PCR with melting curve analysis for Duffy blood group Fy-a/b genotyping. According to the sequences of the mRNA coding for β-actin and Fy-a/b, primers for β-actin and Fy-a/b were synthesized, and the real-time multiplex PCR with melting curve analysis for Fy-a/b genotyping was established. The Fy-a/b genotypes of 198 blood donors from the Chengdu area of China were investigated by melting curve analysis and PCR-SSP. The results of Fy-a/b genotyping by melting curve analysis were consistent with PCR-SSP. Of the 198 donors, 178 were Fy(a+) (89.9%), 19 were Fy(a+)Fy(b+) (9.6%), and 1 was Fy(b+) (0.5%). The gene frequency of Fy(a) was 0.947, while that of Fy(b) was 0.053. It is concluded that a genotyping method for the Duffy blood group based on melting curve analysis has been established, which can be used as a high-throughput screening tool for Duffy blood group genotyping, and that Fy(a) is the predominant Duffy allele among donors in the Chengdu area of China.
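The reported gene frequencies can be reproduced directly from the genotype counts in the abstract; the short Python check below assumes the 178 Fy(a+) donors are Fya/Fya homozygotes and the single Fy(b+) donor is Fyb/Fyb.

    # genotype counts from the abstract: 178 Fy(a+b-), 19 Fy(a+b+), 1 Fy(a-b+)
    n_aa, n_ab, n_bb = 178, 19, 1
    n = n_aa + n_ab + n_bb                          # 198 donors
    freq_a = (2 * n_aa + n_ab) / (2 * n)
    freq_b = (2 * n_bb + n_ab) / (2 * n)
    print(round(freq_a, 3), round(freq_b, 3))       # 0.947 and 0.053, matching the abstract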
REVISITING EVIDENCE OF CHAOS IN X-RAY LIGHT CURVES: THE CASE OF GRS 1915+105
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mannattil, Manu; Gupta, Himanshu; Chakraborty, Sagar, E-mail: mmanu@iitk.ac.in, E-mail: hiugupta@iitk.ac.in, E-mail: sagarc@iitk.ac.in
2016-12-20
Nonlinear time series analysis has been widely used to search for signatures of low-dimensional chaos in light curves emanating from astrophysical bodies. A particularly popular example is the microquasar GRS 1915+105, whose irregular but systematic X-ray variability has been well studied using data acquired by the Rossi X-ray Timing Explorer. With a view to building simpler models of X-ray variability, attempts have been made to classify the light curves of GRS 1915+105 as chaotic or stochastic. Contrary to some of the earlier suggestions, after careful analysis, we find no evidence for chaos or determinism in any of the GRS 1915+105 classes. The dearth of long and stationary data sets representing all the different variability classes of GRS 1915+105 makes it a poor candidate for analysis using nonlinear time series techniques. We conclude that either very exhaustive data analysis with sufficiently long and stationary light curves should be performed, keeping all the pitfalls of nonlinear time series analysis in mind, or alternative schemes of classifying the light curves should be adopted. The generic limitations of the techniques that we point out in the context of GRS 1915+105 affect all similar investigations of light curves from other astrophysical sources.
A Tale Of 160 Scientists, Three Applications, a Workshop and a Cloud
NASA Astrophysics Data System (ADS)
Berriman, G. B.; Brinkworth, C.; Gelino, D.; Wittman, D. K.; Deelman, E.; Juve, G.; Rynge, M.; Kinney, J.
2013-10-01
The NASA Exoplanet Science Institute (NExScI) hosts the annual Sagan Workshops, thematic meetings aimed at introducing researchers to the latest tools and methodologies in exoplanet research. The theme of the Summer 2012 workshop, held from July 23 to July 27 at Caltech, was to explore the use of exoplanet light curves to study planetary system architectures and atmospheres. A major part of the workshop was to use hands-on sessions to instruct attendees in the use of three open source tools for the analysis of light curves, especially from the Kepler mission. Each hands-on session involved the 160 attendees using their laptops to follow step-by-step tutorials given by experts. One of the applications, PyKE, is a suite of Python tools designed to reduce and analyze Kepler light curves; these tools can be invoked from the Unix command line or a GUI in PyRAF. The Transit Analysis Package (TAP) uses Markov Chain Monte Carlo (MCMC) techniques to fit light curves under the Interactive Data Language (IDL) environment, and Transit Timing Variations (TTV) uses IDL tools and Java-based GUIs to confirm and detect exoplanets from timing variations in light curve fitting. Rather than attempt to run these diverse applications on the inevitable wide range of environments on attendees' laptops, they were run instead on the Amazon Elastic Compute Cloud (EC2). The cloud offers features ideal for this type of short-term need: computing and storage services are made available on demand for as long as needed, and a processing environment can be customized and replicated as needed. The cloud environment included an NFS file server virtual machine (VM), 20 client VMs for use by attendees, and a VM to enable ftp downloads of the attendees' results. The file server was configured with a 1 TB Elastic Block Storage (EBS) volume (network-attached storage mounted as a device) containing the application software and attendees' home directories. The clients were configured to mount the applications and home directories from the server via NFS. All VMs were built with CentOS version 5.8. Attendees connected their laptops to one of the client VMs using the Virtual Network Computing (VNC) protocol, which enabled them to interact with a remote desktop GUI during the hands-on sessions. We will describe the mechanisms for handling security, failovers, and licensing of commercial software. In particular, IDL licenses were managed through a server at Caltech, connected to the IDL instances running on Amazon EC2 via a Secure Shell (ssh) tunnel. The system operated flawlessly during the workshop.
NASA Astrophysics Data System (ADS)
Cullis, James D. S.; Walker, Nicholas J.; Ahjum, Fadiel; Juan Rodriguez, Diego
2018-02-01
Many countries, like South Africa, Australia, India, China and the United States, are highly dependent on coal fired power stations for energy generation. These power stations require significant amounts of water, particularly when fitted with technology to reduce pollution and climate change impacts. As water resources come under stress it is important that spatial variability in water availability is taken into consideration for future energy planning, particularly with regards to motivating for a switch from coal fired power stations to renewable technologies. This is particularly true in developing countries where there is a need for increased power production and associated increasing water demands for energy. Typically, future energy supply options are modelled using a least-cost optimization model such as TIMES, which treats water supply as an input cost that is generally assumed constant for all technologies. Different energy technologies are located in different regions of the country with different levels of water availability and associated infrastructure development and supply costs. In this study we develop marginal cost curves for future water supply options in different regions of a country where different energy technologies are planned for development. These water supply cost curves are then used in an expanded version of the South Africa TIMES model called SATIM-W that explicitly models the water-energy nexus by taking into account the regional nature of water supply availability associated with different energy supply technologies. The results show a significant difference in the optimal future energy mix and in particular an increase in renewables and a demand for dry-cooling technologies that would not have been the case if the regional variability of water availability had not been taken into account. Choices in energy policy, such as the introduction of a carbon tax, will also significantly impact future water resources, placing additional water demands in some regions and making water available for other users in other regions with a declining future energy demand. This study presents a methodology for modelling the water-energy nexus that could be used to inform the sustainable development planning process in the water and energy sectors for both developed and developing countries.
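As a toy sketch of how a regional marginal water-supply cost curve of the kind described above can be assembled, the snippet below sorts candidate supply options by unit cost and accumulates their yields; the option names, yields, and costs are hypothetical and are not taken from the SATIM-W study.

    # Toy regional marginal water-supply cost curve: sort candidate supply
    # options by unit cost and accumulate their yields (all values hypothetical).
    options = [
        ("existing surface water", 120.0, 0.5),   # (name, yield in Mm3/yr, cost in $/m3)
        ("groundwater development", 40.0, 0.9),
        ("water reuse", 30.0, 1.4),
        ("inter-basin transfer", 60.0, 2.1),
        ("desalination", 80.0, 3.0),
    ]
    options.sort(key=lambda o: o[2])               # cheapest options first
    cumulative = 0.0
    for name, yield_mm3, cost in options:
        cumulative += yield_mm3
        print(f"up to {cumulative:6.1f} Mm3/yr available at marginal cost {cost:.2f} $/m3  ({name})")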
Roadway network productivity assessment : system-wide analysis under variant travel demand
DOT National Transportation Integrated Search
2008-11-01
The analysis documented in this report examines the hypothesis that the system-wide productivity of a metropolitan freeway system in peak periods is higher in moderate travel demand conditions than in excessive travel demand conditions. The approach ...
23 CFR 450.320 - Congestion management process in transportation management areas.
Code of Federal Regulations, 2010 CFR
2010-04-01
... travel demand reduction and operational management strategies. (b) The development of a congestion... appropriate analysis of reasonable (including multimodal) travel demand reduction and operational management... the analysis demonstrates that travel demand reduction and operational management strategies cannot...
Estimation of error on the cross-correlation, phase and time lag between evenly sampled light curves
NASA Astrophysics Data System (ADS)
Misra, R.; Bora, A.; Dewangan, G.
2018-04-01
Temporal analysis of radiation from astrophysical sources like Active Galactic Nuclei, X-ray binaries and gamma-ray bursts provides information on the geometry and sizes of the emitting regions. Establishing that two light curves in different energy bands are correlated, and measuring the phase and time lag between them, is an important and frequently used temporal diagnostic. Generally the estimates are done by dividing the light curves into a large number of adjacent intervals to find the variance, or by using numerically expensive simulations. In this work we present alternative expressions for estimating the errors on the cross-correlation, phase and time lag between two shorter light curves when they cannot be divided into segments. Thus the estimates presented here allow for analysis of light curves with a relatively small number of points, as well as to obtain information on the longest time-scales available. The expressions have been tested using 200 light curves simulated from both white and 1/f stochastic processes with measurement errors. We also present an application to the XMM-Newton light curves of the Active Galactic Nucleus Akn 564. The example shows that the estimates presented here allow for analysis of light curves with a relatively small (∼1000) number of points.
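A minimal Python sketch of the underlying diagnostic, estimating the cross-correlation function and the lag at its peak for two evenly sampled light curves, is given below; the light curves are synthetic and the paper's error expressions are not reproduced here.

    import numpy as np

    rng = np.random.default_rng(1)
    dt, n, true_lag = 1.0, 1000, 5                  # sampling step, number of points, lag in samples
    driver = np.cumsum(rng.normal(size=n + true_lag))            # red-noise-like driving signal
    a = driver[true_lag:] + rng.normal(scale=0.5, size=n)        # band 1
    b = driver[:n] + rng.normal(scale=0.5, size=n)               # band 2, lags band 1 by true_lag

    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    max_lag = 50
    lags = np.arange(-max_lag, max_lag + 1)
    # ccf(k) correlates a[t] with b[t + k] over the overlapping samples
    ccf = np.array([np.mean(a[max(0, -k):n - max(0, k)] * b[max(0, k):n - max(0, -k)])
                    for k in lags])
    best = lags[np.argmax(ccf)]
    print("estimated lag:", best * dt, "peak correlation:", round(float(ccf.max()), 3))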
ERIC Educational Resources Information Center
Laracy, Seth D.; Hojnoski, Robin L.; Dever, Bridget V.
2016-01-01
Receiver operating characteristic curve (ROC) analysis was used to investigate the ability of early numeracy curriculum-based measures (EN-CBM) administered in preschool to predict performance below the 25th and 40th percentiles on a quantity discrimination measure in kindergarten. Areas under the curve derived from a sample of 279 students ranged…
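As a generic illustration of this kind of ROC analysis (not the EN-CBM data), the following sketch computes an area under the curve for a synthetic screening score against a below-25th-percentile outcome, using scikit-learn.

    import numpy as np
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(2)
    n = 279
    ability = rng.normal(size=n)
    screener = ability + rng.normal(scale=0.8, size=n)      # preschool screening score (synthetic)
    criterion = ability + rng.normal(scale=0.8, size=n)     # later criterion measure (synthetic)
    at_risk = (criterion < np.percentile(criterion, 25)).astype(int)

    # lower screening scores indicate risk, so negate the score before scoring
    auc = roc_auc_score(at_risk, -screener)
    print("area under the curve:", round(float(auc), 3))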
ERIC Educational Resources Information Center
Pavlik, Philip I. Jr.; Cen, Hao; Koedinger, Kenneth R.
2009-01-01
This paper describes a novel method to create a quantitative model of an educational content domain of related practice item-types using learning curves. By using a pairwise test to search for the relationships between learning curves for these item-types, we show how the test results in a set of pairwise transfer relationships that can be…
The top view for analysis of scoliosis progression
DOE Office of Scientific and Technical Information (OSTI.GOV)
De Smet, A.A.; Tarlton, M.A.; Cook, L.T.
1983-05-01
Using computerized spinal analysis, a new top view was developed that displays the spine as if the observer were above and looking down on the patient. Serial top views were obtained of 12 patients with idiopathic scoliosis. In five patients with clinically stable curves, the top views showed no change. One patient with an enlarging rib hump was seen on the top view to have progressive kyphosis but stable scoliosis. Six patients with progressive scoliosis all demonstrated collapse of the thoracic curve in the anteroposterior direction. Five of these six patients had associated lumbar curves. Three lumbar curves demonstrated collapse in the anteroposterior direction similar to the collapse of the thoracic curves, and the other two curves appeared elongated in the anteroposterior direction.
Study of optimizing water utilization in Benanga reservoir for irrigation and fresh water purposes
NASA Astrophysics Data System (ADS)
Tamrin; Retati, E.
2018-04-01
The Benanga dam was built in 1978 as an irrigation weir, but it has since been developed into a multipurpose dam. However, the 2015 capacity curve measurement shows that the reservoir capacity curve has shifted downward. The runoff rate is calculated using the NRECA method, and the reservoir water volume is calculated using the modified Penman method. The cropping pattern implemented by the farmers of Lempake, starting in February, is Paddy-Paddy-Fallow, while the proposed cropping pattern for the Benanga reservoir starts in December. The proposal is based on the ability to serve both raw water demands, namely irrigation and fresh water; if early planting is started outside these two months, the Benanga reservoir elevation will not reach the normal effective storage elevation required by the reservoir operation pattern.
Family Annualized Cost of Leaving: The Household as the Decision Unit in Military Retention
1990-05-01
the equation shown in the text. Note that the advantage of this approach is that the family is "on" its demand curve for leisure or non... the nonmember spouse while the family remains in the Army, HA. It is the spouse’s labor supply equation and is a function of the spouse’s market wage...typically, overstate losses because it ignores the spouse’s value
Novel Synthesis of 3D Graphene-CNF Electrode Architectures for Supercapacitor Applications
2013-06-01
curves. Specifically, I would like to mention LCDR Chris Daskam, LT Russell Canty, LT Jamie Cook , LT Ashley Maxson, LT Samuel Fromille, and Edwin...systems to store energy for the amplifier to use on demand. Additionally, capacitors are used to supply large pulses of current for pulsed power...stark contrast to values quoted in recent studies: Zhu et al. [56] cite a specific capacitance of microwave exfoliated graphene electrodes of 200 F/g
NASA Astrophysics Data System (ADS)
Tilmant, A.; Beevers, L.; Muyunda, B.
2010-07-01
Large storage facilities in hydropower-dominated river basins have traditionally been designed and managed to maximize revenues from energy generation. In an attempt to mitigate the externalities downstream due to a reduction in flow fluctuation, minimum flow requirements have been imposed to reservoir operators. However, it is now recognized that a varying flow regime including flow pulses provides the best conditions for many aquatic ecosystems. This paper presents a methodology to derive a trade-off relationship between hydropower generation and ecological preservation in a system with multiple reservoirs and stochastic inflows. Instead of imposing minimum flow requirements, the method brings more flexibility to the allocation process by building upon environmental valuation studies to derive simple demand curves for environmental goods and services, which are then used in a reservoir optimization model together with the demand for energy. The objective here is not to put precise monetary values on environmental flows but to see the marginal changes in release policies should those values be considered. After selecting appropriate risk indicators for hydropower generation and ecological preservation, the trade-off curve provides a concise way of exploring the extent to which one of the objectives must be sacrificed in order to achieve more of the other. The methodology is illustrated with the Zambezi River basin where large man-made reservoirs have disrupted the hydrological regime.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Woodhouse, M.; Goodrich, A.; Redlinger, M.
2013-09-01
For those PV technologies that rely upon Te, In, and Ga, first-order observations and calculations hint that there may be resource constraints that could inhibit their successful deployment at a SunShot level. These are only first-order approximations, however, and the possibility for an expansion in global Te, In, and Ga supplies needs to be considered in the event that there are upward revisions in their demand and prices. In this study, we examine the current, mid-term, and long-term prospects of Tellurium (Te) for use in PV. We find that the current global supply base of Te would support <10 GW of annual traditional CdTe PV manufacturing production. But as for the possibility that the supply base for Te might be expanded, after compiling several preliminary cumulative availability curves we find that there may be significant upside potential in the supply base for this element - principally vis a vis increasing demand and higher prices. Primarily by reducing the Tellurium intensity in manufacturing and by increasing the recovery efficiency of Te in Cu refining processes, we calculate that it may prove affordable to PV manufacturers to expand the supply base for Te such that 100 GW, or greater, of annual CdTe PV production is possible in the 2030 - 2050 timeframe.
Nakamura, Hideaki; Kobayashi, Shun; Hirata, Yu; Suzuki, Kyota; Mogi, Yotaro; Karube, Isao
2007-10-15
A method to determine the spectrophotometric biochemical oxygen demand (BOD(sp)) was studied with high sensitivity and reproducibility by employing 2,6-dichlorophenolindophenol (DCIP) as a redox color indicator, the yeast Saccharomyces cerevisiae, and a temperature-controlling system providing a three-consecutive-stir unit. The absorbance of DCIP decreased due to the metabolism of organic substances in aqueous samples by S. cerevisiae. Under optimum conditions, a calibration curve for glucose glutamic acid concentration between 1.1 and 22mg O(2) L(-1) (r=0.988, six points, n=3) was obtained when the incubation mixture was incubated for 10min at 30 degrees C. The reproducibility of the optical responses in the calibration curve was 1.77% (average of relative standard deviations; RSD(av)). Subsequently, the characterization of this method was studied. The optical responses to pure organic substances and the influence of chloride ions, artificial seawater, and heavy metal ions on the sensor response were investigated before use with real samples. Measurements of real samples using river water were performed and compared with those obtained using the BOD(5) method. Finally, stable responses were obtained for 36 days when the yeast cell suspension was stored at 4 degrees C (response reduction, 89%; RSD(av) value for 9 testing days, 8.4%).
Dung, Van Than; Tjahjowidodo, Tegoeh
2017-01-01
B-spline functions are widely used in many industrial applications such as computer graphic representations, computer aided design, computer aided manufacturing, computer numerical control, etc. Recently, there has been demand, e.g. in the reverse engineering (RE) area, to employ B-spline curves for non-trivial cases that include curves with discontinuous points, cusps or turning points from the sampled data. The most challenging task in these cases is the identification of the number of knots and their respective locations in non-uniform space at the lowest computational cost. This paper presents a new strategy for fitting any form of curve by B-spline functions via a local algorithm. A new two-step method for fast knot calculation is proposed. In the first step, the data is split using a bisecting method with a predetermined allowable error to obtain coarse knots. Secondly, the knots are optimized, for both locations and continuity levels, by employing a non-linear least squares technique. The B-spline function is, therefore, obtained by solving the ordinary least squares problem. The performance of the proposed method is validated using various numerical experimental data, with and without simulated noise, generated by a B-spline function and deterministic parametric functions. This paper also discusses the benchmarking of the proposed method against existing methods in the literature. The proposed method is shown to be able to reconstruct B-spline functions from sampled data within acceptable tolerance. It is also shown that the proposed method can be applied to fitting any type of curve, ranging from smooth curves to discontinuous ones. In addition, the method does not require excessive computational cost, which allows it to be used in automatic reverse engineering applications.
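A much-simplified Python sketch of least-squares B-spline fitting is shown below; it uses a fixed, uniform set of interior knots rather than the paper's two-step bisection and knot-optimization scheme, and the sampled data are synthetic.

    import numpy as np
    from scipy.interpolate import LSQUnivariateSpline

    rng = np.random.default_rng(3)
    x = np.linspace(0.0, 2.0 * np.pi, 400)
    # synthetic sampled curve with sharp transitions plus noise
    y = np.sin(x) + 0.3 * np.sign(np.sin(3.0 * x)) + rng.normal(scale=0.02, size=x.size)

    interior_knots = np.linspace(0.0, 2.0 * np.pi, 20)[1:-1]   # coarse uniform guess
    spline = LSQUnivariateSpline(x, y, interior_knots, k=3)    # cubic least-squares B-spline
    rms = float(np.sqrt(np.mean((spline(x) - y) ** 2)))
    print("interior knots:", interior_knots.size, "RMS residual:", round(rms, 4))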
New Insight into Combined Model and Revised Model for RTD Curves in a Multi-strand Tundish
NASA Astrophysics Data System (ADS)
Lei, Hong
2015-12-01
The analysis of the residence time distribution (RTD) curve is one of the important experimental techniques used to optimize tundish design. However, there are some issues with the RTD analysis models. Firstly, the combined (or mixed) model and the revised model give different analysis results for the same RTD curve. Secondly, different upper limits of the integral in the numerator of the mean residence time give different results for the same RTD curve. Thirdly, a negative dead volume fraction sometimes appears at the outer strand of a multi-strand tundish. In order to solve the above problems, it is necessary to have a deep insight into the RTD curve and to propose a reasonable method to analyze it. The results show that (1) the revised model is not appropriate for treating the RTD curve; (2) the conception of the visual single-strand tundish and the combined model with the dimensionless time at the cut-off point are applied to estimate the flow characteristics in the multi-strand tundish; and (3) the mean residence time at each exit is the key parameter for estimating the similarity of fluid flow among strands.
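For reference, the basic quantities discussed above can be computed from a measured tracer curve as in the sketch below; the tracer concentration curve is synthetic, the theoretical residence time tau is an assumed value, and only the standard combined-model formulas are used.

    import numpy as np

    t = np.linspace(0.0, 2000.0, 2001)                    # time, s (synthetic grid)
    tau = 600.0                                           # theoretical residence time V/Q (assumed)
    c = np.exp(-(np.log(np.maximum(t, 1e-9) / 300.0)) ** 2 / 0.5)   # synthetic tracer curve

    e = c / np.trapz(c, t)                                # normalized RTD, E(t)
    t_mean = np.trapz(t * e, t)                           # mean residence time from the full curve
    dead_fraction = 1.0 - t_mean / tau                    # combined-model dead-volume estimate

    # the abstract's second issue: truncating the integrals at a cut-off (here 2*tau)
    # gives an appreciably different mean residence time
    m = t <= 2.0 * tau
    t_mean_cut = np.trapz(t[m] * e[m], t[m]) / np.trapz(e[m], t[m])
    print(round(float(t_mean), 1), round(float(t_mean_cut), 1), round(float(dead_fraction), 3))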
Use of a latency-based demand assessment to identify potential demands for functional analyses.
Call, Nathan A; Miller, Sarah J; Mintz, Joslyn Cynkus; Mevers, Joanna Lomas; Scheithauer, Mindy C; Eshelman, Julie E; Beavers, Gracie A
2016-12-01
Unlike potential tangible positive reinforcers, which are typically identified for inclusion in functional analyses empirically using preference assessments, demands are most often selected arbitrarily or based on caregiver report. The present study evaluated the use of a demand assessment with 12 participants who exhibited escape-maintained problem behavior. Participants were exposed to 10 demands, with aversiveness measured by average latency to the first instance of problem behavior. In subsequent functional analyses, results of a demand condition that included the demand with the shortest latency to problem behavior resulted in identification of an escape function for 11 of the participants. In contrast, a demand condition that included the demand with the longest latency resulted in identification of an escape function for only 5 participants. The implication of these findings is that for the remaining 7 participants, selection of the demand for the functional analysis without using the results of the demand assessment could have produced a false-negative finding. © 2016 Society for the Experimental Analysis of Behavior.
Interaction Analysis of Longevity Interventions Using Survival Curves.
Nowak, Stefan; Neidhart, Johannes; Szendro, Ivan G; Rzezonka, Jonas; Marathe, Rahul; Krug, Joachim
2018-01-06
A long-standing problem in ageing research is to understand how different factors contributing to longevity should be expected to act in combination under the assumption that they are independent. Standard interaction analysis compares the extension of mean lifespan achieved by a combination of interventions to the prediction under an additive or multiplicative null model, but neither model is fundamentally justified. Moreover, the target of longevity interventions is not mean life span but the entire survival curve. Here we formulate a mathematical approach for predicting the survival curve resulting from a combination of two independent interventions based on the survival curves of the individual treatments, and quantify interaction between interventions as the deviation from this prediction. We test the method on a published data set comprising survival curves for all combinations of four different longevity interventions in Caenorhabditis elegans . We find that interactions are generally weak even when the standard analysis indicates otherwise.
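The "standard interaction analysis" that the abstract contrasts with its curve-based approach can be written down in a few lines; the sketch below compares an observed combined mean lifespan with additive and multiplicative null predictions, using hypothetical mean lifespans rather than the C. elegans data.

    # hypothetical mean lifespans, in days
    mean_control = 20.0
    mean_a = 26.0               # intervention A alone
    mean_b = 24.0               # intervention B alone
    mean_ab_observed = 30.0     # both interventions combined

    additive_null = mean_control + (mean_a - mean_control) + (mean_b - mean_control)
    multiplicative_null = mean_control * (mean_a / mean_control) * (mean_b / mean_control)
    print("additive prediction:", additive_null)                        # 30.0
    print("multiplicative prediction:", round(multiplicative_null, 1))  # 31.2
    print("observed combination:", mean_ab_observed)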
Product differentiation by analysis of DNA melting curves during the polymerase chain reaction.
Ririe, K M; Rasmussen, R P; Wittwer, C T
1997-02-15
A microvolume fluorometer integrated with a thermal cycler was used to acquire DNA melting curves during polymerase chain reaction by fluorescence monitoring of the double-stranded DNA specific dye SYBR Green I. Plotting fluorescence as a function of temperature as the thermal cycler heats through the dissociation temperature of the product gives a DNA melting curve. The shape and position of this DNA melting curve are functions of the GC/AT ratio, length, and sequence and can be used to differentiate amplification products separated by less than 2 degrees C in melting temperature. Desired products can be distinguished from undesirable products, in many cases eliminating the need for gel electrophoresis. Analysis of melting curves can extend the dynamic range of initial template quantification when amplification is monitored with double-stranded DNA specific dyes. Complete amplification and analysis of products can be performed in less than 15 min.
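A minimal sketch of the derivative analysis usually applied to such data is given below: the melting temperature Tm is taken as the peak of -dF/dT computed from a (synthetic) fluorescence-versus-temperature curve.

    import numpy as np

    temp = np.linspace(70.0, 95.0, 251)                        # temperature, degrees C
    tm_true, width = 86.0, 0.8
    fluor = 1.0 / (1.0 + np.exp((temp - tm_true) / width))     # synthetic sigmoidal melt curve
    fluor += np.random.default_rng(4).normal(scale=0.005, size=temp.size)

    neg_dfdt = -np.gradient(fluor, temp)                       # melting-peak curve, -dF/dT
    tm_est = temp[np.argmax(neg_dfdt)]
    print("estimated Tm:", round(float(tm_est), 1), "degrees C")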
Time Alignment as a Necessary Step in the Analysis of Sleep Probabilistic Curves
NASA Astrophysics Data System (ADS)
Rošt'áková, Zuzana; Rosipal, Roman
2018-02-01
Sleep can be characterised as a dynamic process that has a finite set of sleep stages during the night. The standard Rechtschaffen and Kales sleep model produces a discrete representation of sleep and does not take into account its dynamic structure. In contrast, the continuous sleep representation provided by the probabilistic sleep model accounts for the dynamics of the sleep process. However, analysis of the sleep probabilistic curves is problematic when time misalignment is present. In this study, we highlight the necessity of curve synchronisation before further analysis. Original and time-aligned sleep probabilistic curves were transformed into a finite-dimensional vector space, and their ability to predict subjects' age or daily measures was evaluated. We conclude that curve alignment significantly improves the prediction of the daily measures, especially in the case of the S2-related sleep states or slow wave sleep.
NASA Astrophysics Data System (ADS)
Milano, M.; Ruelland, D.; Dezetter, A.; Ardoin-Bardin, S.; Thivet, G.; Servat, E.
2012-04-01
Worldwide studies modelling the hydrological response to global changes have shown the Mediterranean area to be one of the regions most vulnerable to water crisis. It is characterised by limited and unequally distributed water resources, as well as by important development of its human activities. Since the late 1950s, water demand in the Mediterranean basin has doubled due to a significant expansion of irrigated land and urban areas, and it has continued to rise steadily. The Ebro catchment, the third largest Mediterranean basin, is very representative of this context. Since the late 1970s, a negative trend in mean rainfall has been observed as well as an increase in mean temperature. Meanwhile, the Ebro River discharge has decreased by about 40%. However, climate alone cannot explain this downward trend. Another factor is the increase in water consumption for agricultural and domestic uses. Indeed, the Ebro catchment is a key element of Spanish agricultural production, with respectively 30% and 60% of the country's meat and fruit production. Moreover, the population of the catchment has increased by 20% since 1970 and the number of inhabitants doubles each summer due to tourism. Finally, more than 250 storage dams have been built on the Ebro River for hydropower production and irrigation water supply purposes, hence regulating river discharge. In order to better understand the respective influence of climatic and anthropogenic pressures on the Ebro hydrological regime, an integrated water resources modelling framework was developed. This model is driven by water supplies, generated by a conceptual rainfall-runoff model, and by a storage dam module that accounts for water demands and environmental flow requirements. Water demands were evaluated for the most water-demanding sector, i.e. irrigated agriculture (5 670 Hm3/year), and for the domestic sector (252 Hm3/year), often given priority for water supply. A water allocation module has also been implemented in the model. The ability of water resources to satisfy the water demands is assessed by computing a water allocation index which depends on site priorities and supply preferences. This modelling framework was applied to eight sub-catchments, each one representative of typical climatic or water use conditions within the basin, over the 1971-1990 period. The results show the interest of integrated modelling to address water resources vulnerability. The hydrological response to climatic and anthropogenic variations reflects the influence of both these pressures on water resource availability. Moreover, the water allocation index makes it possible to highlight the growing competition among users, especially during the summer season. The developed methodology hence provides a more complete analysis to support decision-making than uncoupled analyses. This study is a first step towards evaluating future water resources availability and the ability to satisfy water demands under climatic and anthropogenic pressure scenarios.
A Graphical Approach to Item Analysis. Research Report. ETS RR-04-10
ERIC Educational Resources Information Center
Livingston, Samuel A.; Dorans, Neil J.
2004-01-01
This paper describes an approach to item analysis that is based on the estimation of a set of response curves for each item. The response curves show, at a glance, the difficulty and the discriminating power of the item and the popularity of each distractor, at any level of the criterion variable (e.g., total score). The curves are estimated by…
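The report's estimation procedure is not reproduced here, but the simplest empirical version of such response curves, the proportion of examinees choosing each option within total-score bins, can be sketched as follows with synthetic responses to a single four-option item.

    import numpy as np

    rng = np.random.default_rng(5)
    n = 2000
    total_score = rng.integers(0, 41, size=n)                  # criterion variable (synthetic)
    # synthetic item: the keyed option (0) is chosen more often at higher scores
    p_key = 1.0 / (1.0 + np.exp(-(total_score - 20) / 5.0))
    choice = np.where(rng.random(n) < p_key, 0, rng.integers(1, 4, size=n))

    bins = np.arange(0, 45, 5)
    for lo, hi in zip(bins[:-1], bins[1:]):
        in_bin = (total_score >= lo) & (total_score < hi)
        if in_bin.sum() == 0:
            continue
        props = [float(np.mean(choice[in_bin] == opt)) for opt in range(4)]
        print(f"scores {lo:2d}-{hi - 1:2d}:", "  ".join(f"opt{o}={p:.2f}" for o, p in enumerate(props)))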
Estimation of Curve Tracing Time in Supercapacitor based PV Characterization
NASA Astrophysics Data System (ADS)
Basu Pal, Sudipta; Das Bhattacharya, Konika; Mukherjee, Dipankar; Paul, Debkalyan
2017-08-01
Smooth and noise-free characterisation of photovoltaic (PV) generators has been revisited with renewed interest in view of large-size PV arrays making inroads into the urban sector of major developing countries. Such practice has recently been confronted by the choice of a suitable data acquisition system and also by the lack of a supporting theoretical analysis to justify the accuracy of curve tracing. However, the use of a selected bank of supercapacitors can mitigate the said problems to a large extent. Assuming a piecewise-linear analysis of the V-I characteristics of a PV generator, an accurate analysis of curve plotting time has been possible. The analysis has been extended to consider the effect of the equivalent series resistance of the supercapacitor, leading to increased accuracy (90-95%) of curve plotting times.
On analyzing free-response data on location level
NASA Astrophysics Data System (ADS)
Bandos, Andriy I.; Obuchowski, Nancy A.
2017-03-01
Free-response ROC (FROC) data are typically collected when primary question of interest is focused on the proportions of the correct detection-localization of known targets and frequencies of false positive responses, which can be multiple per subject (image). These studies are particularly relevant for CAD and related applications. The fundamental tool of the location-level FROC analysis is the FROC curve. Although there are many methods of FROC analysis, as we describe in this work, some of the standard and popular approaches, while important, are not suitable for analyzing specifically the location-level FROC performance as summarized by the FROC curve. Analysis of the FROC curve, on the other hand, might not be straightforward. Recently we developed an approach for the location-level analysis of the FROC data using the well-known tools for clustered ROC analysis. In the current work, based on previously developed concepts, and using specific examples, we demonstrate the key reasons why specifically location-level FROC performance cannot be fully addressed by the common approaches as well as illustrate the proposed solution. Specifically, we consider the two most salient FROC approaches, namely JAFROC and the area under the exponentially transformed FROC curve (AFE) and show that clearly superior FROC curves can have lower values for these indices. We describe the specific features that make these approaches inconsistent with FROC curves. This work illustrates some caveats for using the common approaches for location-level FROC analysis and provides guidelines for the appropriate assessment or comparison of FROC systems.
Banowary, Banya; Dang, Van Tuan; Sarker, Subir; Connolly, Joanne H.; Chenu, Jeremy; Groves, Peter; Ayton, Michelle; Raidal, Shane; Devi, Aruna; Vanniasinkam, Thiru; Ghorashi, Seyed A.
2015-01-01
Campylobacter spp. are important causes of bacterial gastroenteritis in humans in developed countries. Among Campylobacter spp., Campylobacter jejuni (C. jejuni) and C. coli are the most common causes of human infection. In this study, a multiplex PCR (mPCR) and high resolution melt (HRM) curve analysis were optimized for simultaneous detection and differentiation of C. jejuni and C. coli isolates. A segment of the hippuricase gene (hipO) of C. jejuni and the putative aspartokinase (asp) gene of C. coli were amplified from 26 Campylobacter isolates and the amplicons were subjected to HRM curve analysis. The mPCR-HRM was able to differentiate between C. jejuni and C. coli species. All DNA amplicons generated by mPCR were sequenced. Analysis of the nucleotide sequences from each isolate revealed that the HRM curves were correlated with the nucleotide sequences of the amplicons. Minor variation in the melting point temperatures of C. coli or C. jejuni isolates was also observed and enabled some intraspecies differentiation among C. coli and/or C. jejuni isolates. The potential of PCR-HRM curve analysis for the detection and speciation of Campylobacter in additional human clinical specimens and chicken swab samples was also confirmed. The sensitivity and specificity of the test were found to be 100% and 92%, respectively. The results indicated that mPCR followed by HRM curve analysis provides a rapid (8-hour) technique for differentiation between C. jejuni and C. coli isolates. PMID:26394042
On the Analysis and Construction of the Butterfly Curve Using "Mathematica"[R
ERIC Educational Resources Information Center
Geum, Y. H.; Kim, Y. I.
2008-01-01
The butterfly curve was introduced by Temple H. Fay in 1989 and is defined by the polar curve r = e^(sin θ) − 2 cos(4θ) + sin^5(θ/12). In this article, we develop the mathematical model of the butterfly curve and analyse its geometric properties. In addition, we draw the butterfly curve and…
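A short Python sketch that draws the curve directly from the polar equation above follows; the number of sample points and the plotting details are arbitrary choices.

    import numpy as np
    import matplotlib.pyplot as plt

    t = np.linspace(0.0, 24.0 * np.pi, 20000)       # the curve closes after 24*pi
    r = np.exp(np.sin(t)) - 2.0 * np.cos(4.0 * t) + np.sin(t / 12.0) ** 5
    x, y = r * np.cos(t), r * np.sin(t)

    plt.plot(x, y, linewidth=0.5)
    plt.axis("equal")
    plt.title("Butterfly curve")
    plt.show()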
Nonlinear bulging factor based on R-curve data
NASA Technical Reports Server (NTRS)
Jeong, David Y.; Tong, Pin
1994-01-01
In this paper, a nonlinear bulging factor is derived using a strain energy approach combined with dimensional analysis. The functional form of the bulging factor contains an empirical constant that is determined using R-curve data from unstiffened flat and curved panel tests. The determination of this empirical constant is based on the assumption that the R-curve is the same for both flat and curved panels.
Dried blood spot analysis of creatinine with LC-MS/MS in addition to immunosuppressants analysis.
Koster, Remco A; Greijdanus, Ben; Alffenaar, Jan-Willem C; Touw, Daan J
2015-02-01
In order to monitor creatinine levels or to adjust the dosage of renally excreted or nephrotoxic drugs, the analysis of creatinine in dried blood spots (DBS) could be a useful addition to DBS analysis. We developed a LC-MS/MS method for the analysis of creatinine in the same DBS extract that was used for the analysis of tacrolimus, sirolimus, everolimus, and cyclosporine A in transplant patients with the use of Whatman FTA DMPK-C cards. The method was validated using three different strategies: a seven-point calibration curve using the intercept of the calibration to correct for the natural presence of creatinine in reference samples, a one-point calibration curve at an extremely high concentration in order to diminish the contribution of the natural presence of creatinine, and the use of creatinine-[(2)H3] with an eight-point calibration curve. The validated range for creatinine was 120 to 480 μmol/L (seven-point calibration curve), 116 to 7000 μmol/L (1-point calibration curve), and 1.00 to 400.0 μmol/L for creatinine-[(2)H3] (eight-point calibration curve). The precision and accuracy results for all three validations showed a maximum CV of 14.0% and a maximum bias of -5.9%. Creatinine in DBS was found stable at ambient temperature and 32 °C for 1 week and at -20 °C for 29 weeks. Good correlations were observed between patient DBS samples and routine enzymatic plasma analysis and showed the capability of the DBS method to be used as an alternative for creatinine plasma measurement.
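The intercept-correction strategy mentioned for the seven-point calibration can be illustrated with a few lines of Python; the added concentrations and instrument responses below are hypothetical, and the sketch assumes the response is proportional to the total (added plus endogenous) creatinine concentration.

    import numpy as np

    added = np.array([0.0, 60.0, 120.0, 180.0, 240.0, 360.0, 480.0])   # spiked creatinine, umol/L (hypothetical)
    response = np.array([0.52, 0.81, 1.09, 1.40, 1.66, 2.25, 2.83])    # instrument response (hypothetical)

    slope, intercept = np.polyfit(added, response, 1)
    endogenous = intercept / slope                     # apparent endogenous concentration in the blank matrix
    print("endogenous contribution ~", round(float(endogenous), 1), "umol/L")

    unknown_response = 1.95
    total_conc = unknown_response / slope              # valid if response ~ slope * (added + endogenous)
    print("estimated total creatinine ~", round(float(total_conc), 1), "umol/L")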
Miyanji, Firoz; Pawelek, Jeff B; Van Valin, Scott E; Upasani, Vidyadhar V; Newton, Peter O
2008-11-01
Retrospective review of adolescent idiopathic scoliosis (AIS) patients. To investigate the clinical deformity and radiographic features of Lenke 1A and 1B curves to determine if the "A" and "B" lumbar modifiers actually describe 2 distinct curve patterns. The Lenke classification system attempts to address some of the shortcomings of the King-Moe classification system by providing a more comprehensive, reliable, and treatment-based categorization of all AIS deformities. Although this classification is useful in determining which regions of the spine should be fused, it does not necessarily divide AIS curves into distinct patterns. A critical analysis of the clinical deformity, radiographic features, and surgical treatment of AIS patients with Lenke 1A and 1B right thoracic curves was performed. Lenke 1A curves were differentiated according to the L4 coronal plane tilt. Analysis of variance and Pearson chi analysis were used to perform statistical comparisons between the individual curve patterns (P < or = 0.05). Ninety-three patients with preoperative and 2-year postoperative data were included in this analysis (65 Lenke 1A, and 28 Lenke 1B). Thirty-three patients were subdivided as 1A-L (L4 tilted to the left) and 32 patients were subdivided as 1A-R (L4 tilted to the right). The interobserver reliability for determining the direction of L4 tilt was excellent (kappa = 0.94, P < or = 0.001). Patients with 1A-L curves were similar to patients with 1B curves with respect to the L4 tilt and the location of the stable vertebra (most often in the thoracolumbar junction). In contrast, patients with 1A-R curves had a more distal stable vertebra (most often L3 or L4). The surgical treatment also differed between these 2 groups with regards to the lowest instrumented vertebra (LIV). 1A-L and 1B curves were similar with a median LIV of T12, whereas the 1A-R curves had a more distal median LIV of L2 (P = 0.01). Two Lenke 1A curve patterns can be described based on the direction of the L4 tilt. This distinction has ramifications regarding selection of fusion levels and assessing surgical outcomes. The A and B lumbar modifiers do not describe 2 distinct curve types within the Lenke 1 group; however, the tilt direction of L4 does allow subdivision of the Lenke 1A curves into 2 distinguishable patterns (1A-R and 1A-L). The 1A-L curves are similar to 1B curves and different in form and treatment from the 1A-R pattern.
Ryu, Hyeuk; Luco, Nicolas; Baker, Jack W.; Karaca, Erdem
2008-01-01
A methodology was recently proposed for the development of hazard-compatible building fragility models using parameters of capacity curves and damage state thresholds from HAZUS (Karaca and Luco, 2008). In the methodology, HAZUS curvilinear capacity curves were used to define nonlinear dynamic SDOF models that were subjected to the nonlinear time history analysis instead of the capacity spectrum method. In this study, we construct a multilinear capacity curve with negative stiffness after an ultimate (capping) point for the nonlinear time history analysis, as an alternative to the curvilinear model provided in HAZUS. As an illustration, here we propose parameter values of the multilinear capacity curve for a moderate-code low-rise steel moment resisting frame building (labeled S1L in HAZUS). To determine the final parameter values, we perform nonlinear time history analyses of SDOF systems with various parameter values and investigate their effects on resulting fragility functions through sensitivity analysis. The findings improve capacity curves and thereby fragility and/or vulnerability models for generic types of structures.
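A sketch of the kind of multilinear backbone described above is given below; the corner-point values are hypothetical placeholders rather than the S1L parameters proposed in the study.

    import numpy as np

    # corner points (spectral displacement in inches, spectral acceleration in g) - hypothetical
    d_yield, a_yield = 0.6, 0.10
    d_cap,   a_cap   = 3.0, 0.25          # ultimate (capping) point
    d_res,   a_res   = 8.0, 0.05          # start of residual-strength plateau

    def capacity(d):
        """Piecewise-linear backbone: elastic, hardening, negative-stiffness, residual branches."""
        d = np.asarray(d, dtype=float)
        return np.piecewise(
            d,
            [d <= d_yield,
             (d > d_yield) & (d <= d_cap),
             (d > d_cap) & (d <= d_res),
             d > d_res],
            [lambda x: a_yield * x / d_yield,
             lambda x: a_yield + (a_cap - a_yield) * (x - d_yield) / (d_cap - d_yield),
             lambda x: a_cap + (a_res - a_cap) * (x - d_cap) / (d_res - d_cap),
             a_res],
        )

    print(capacity([0.3, 1.8, 5.0, 10.0]))   # one sample point on each branch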
Rischewski, J; Schneppenheim, R
2001-01-30
Patients with Fanconi anemia (Fanc) are at risk of developing leukemia. Mutations of the group A gene (FancA) are most common. A multitude of polymorphisms and mutations within the 43 exons of the gene are described. To examine the role of heterozygosity as a risk factor for malignancies, a partially automatized screening method to identify aberrations was needed. We report on our experience with DHPLC (WAVE (Transgenomic)). PCR amplification of all 43 exons from one individual was performed on one microtiter plate on a gradient thermocycler. DHPLC analysis conditions were established via melting curves, prediction software, and test runs with aberrant samples. PCR products were analyzed twice: native, and after adding a WT-PCR product. Retention patterns were compared with previously identified polymorphic PCR products or mutants. We have defined the mutation screening conditions for all 43 exons of FancA using DHPLC. So far, 40 different sequence variations have been detected in more than 100 individuals. The native analysis identifies heterozygous individuals, and the second run detects homozygous aberrations. Retention patterns are specific for the underlying sequence aberration, thus reducing sequencing demand and costs. DHPLC is a valuable tool for reproducible recognition of known sequence aberrations and screening for unknown mutations in the highly polymorphic FancA gene.
NASA Astrophysics Data System (ADS)
Hosseini, Seyed Farhad; Hashemian, Ali; Moetakef-Imani, Behnam; Hadidimoud, Saied
2018-03-01
In the present paper, the isogeometric analysis (IGA) of free-form planar curved beams is formulated based on the nonlinear Timoshenko beam theory to investigate the large deformation of beams with variable curvature. Based on the isoparametric concept, the shape functions of the field variables (displacement and rotation) in a finite element analysis are considered to be the same as the non-uniform rational basis spline (NURBS) basis functions defining the geometry. The validity of the presented formulation is tested in five case studies covering a wide range of engineering curved structures including from straight and constant curvature to variable curvature beams. The nonlinear deformation results obtained by the presented method are compared to well-established benchmark examples and also compared to the results of linear and nonlinear finite element analyses. As the nonlinear load-deflection behavior of Timoshenko beams is the main topic of this article, the results strongly show the applicability of the IGA method to the large deformation analysis of free-form curved beams. Finally, it is interesting to notice that, until very recently, the large deformations analysis of free-form Timoshenko curved beams has not been considered in IGA by researchers.
Local Orthogonal Cutting Method for Computing Medial Curves and Its Biomedical Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jiao, Xiangmin; Einstein, Daniel R.; Dyedov, Volodymyr
2010-03-24
Medial curves have a wide range of applications in geometric modeling and analysis (such as shape matching) and biomedical engineering (such as morphometry and computer assisted surgery). The computation of medial curves poses significant challenges, both in terms of theoretical analysis and practical efficiency and reliability. In this paper, we propose a definition and analysis of medial curves and also describe an efficient and robust method for computing medial curves. Our approach is based on three key concepts: a local orthogonal decomposition of objects into substructures, a differential geometry concept called the interior center of curvature (ICC), and integrated stability and consistency tests. These concepts lend themselves to robust numerical techniques including eigenvalue analysis, weighted least squares approximations, and numerical minimization, resulting in an algorithm that is efficient and noise resistant. We illustrate the effectiveness and robustness of our approach with some highly complex, large-scale, noisy biomedical geometries derived from medical images, including lung airways and blood vessels. We also present comparisons of our method with some existing methods.
High-frequency optical oscillation during the flare phase of the red dwarf EV Lac
NASA Astrophysics Data System (ADS)
Contadakis, M.; Avgoloupis, S.; Seiradakis, J.
2006-01-01
Observational confirmation of the high-frequency, low-amplitude oscillations reported by Zhillyaev et al. (2000) and Contadakis et al. (2004) is highly demanding and will rely on future observations and on careful reanalysis of the data in our files. In this paper we present the results of the analysis of the B-band light curve of a flare of magnitude 1.01, which was observed in September 1993. Despite the low time resolution (sampling interval 12 s) we were able to detect transient low-amplitude oscillations with periods ranging between 30 s and 125 s, with a confidence level higher than 70%. This result is in favour of (or does not contradict) the suggested explanation, i.e., the evolution of a fast-mode magneto-acoustic wave generated at the impulsive phase of the flare and travelling through the magnetic loop.
Laser confocal measurement system for curvature radius of lenses based on grating ruler
NASA Astrophysics Data System (ADS)
Tian, Jiwei; Wang, Yun; Zhou, Nan; Zhao, Weirui; Zhao, Weiqian
2015-02-01
In the modern optical measurement field, the radius of curvature (ROC) is one of the fundamental parameters of an optical lens. Its measurement accuracy directly affects other optical parameters, such as focal length and aberration, which significantly affect the overall performance of the optical system. To meet the market demand for high-accuracy ROC measurement instruments, we developed a laser confocal radius measurement system with a grating ruler. The system uses the peak point of the confocal intensity curve to precisely identify the cat-eye and confocal positions and then measures the distance between these two positions using the grating ruler, thereby achieving high-precision measurement of the ROC. The system has the advantages of high focusing sensitivity and robustness against environmental disturbance. Preliminary theoretical analysis and experiments show that the measurement repeatability can reach 0.8 μm, which provides an effective way for the accurate measurement of ROC.
McGlone, Sarah M
2010-01-01
New vaccine pricing is a complicated process that could have substantial long-standing scientific, medical and public health ramifications. Pricing can have a considerable impact on new vaccine adoption and, thereby, either culminate or thwart years of research and development and public health efforts. Typically, pricing strategy consists of the following eleven components: (1) Conduct a target population analysis; (2) Map potential competitors and alternatives; (3) Construct a vaccine target product profile (TPP) and compare it to projected or actual TPPs of competing vaccines; (4) Quantify the incremental value of the new vaccine's characteristics; (5) Determine vaccine positioning in the marketplace; (6) Estimate the vaccine price-demand curve; (7) Calculate vaccine costs (including those of manufacturing, distribution, and research and development); (8) Account for various legal, regulatory, third party payer and competitor factors; (9) Consider the overall product portfolio; (10) Set pricing objectives; (11) Select pricing and pricing structure. While the biomedical literature contains some studies that have addressed these components, there is still considerable room for more extensive evaluation of this important area. PMID:20861678
Lee, Bruce Y; McGlone, Sarah M
2010-08-01
New vaccine pricing is a complicated process that could have substantial long-standing scientific, medical, and public health ramifications. Pricing can have a considerable impact on new vaccine adoption and, thereby, either culminate or thwart years of research and development and public health efforts. Typically, pricing strategy consists of the following eleven components: 1. Conduct a target population analysis; 2. Map potential competitors and alternatives; 3. Construct a vaccine target product profile (TPP) and compare it to projected or actual TPPs of competing vaccines; 4. Quantify the incremental value of the new vaccine's characteristics; 5. Determine vaccine positioning in the marketplace; 6. Estimate the vaccine price-demand curve; 7. Calculate vaccine costs (including those of manufacturing, distribution, and research and development); 8. Account for various legal, regulatory, third party payer, and competitor factors; 9. Consider the overall product portfolio; 10. Set pricing objectives; 11. Select pricing and pricing structure. While the biomedical literature contains some studies that have addressed these components, there is still considerable room for more extensive evaluation of this important area.
Kinetic efficiency of polar monolithic capillary columns in high-pressure gas chromatography.
Kurganov, A A; Korolev, A A; Shiryaeva, V E; Popova, T P; Kanateva, A Yu
2013-11-08
Poppe plots were used to analyze the kinetic efficiency of monolithic sorbents synthesized in quartz capillaries for use in high-pressure gas chromatography. The theoretical plate time and the maximum number of theoretical plates were found to depend significantly on synthesis parameters such as the relative amount of monomer in the initial polymerization mixture, temperature, and polymerization time. Poppe plots allow one to find synthesis conditions suitable either for high-speed separations or for maximal efficiency. It is shown that constructing kinetic Poppe curves from potential Van Deemter data requires the compressibility of the mobile phase to be taken into consideration in the case of gas chromatography. A model mixture of light hydrocarbons (C1 to C4) was then used to investigate the influence of the carrier gas on the kinetic efficiency of polymeric monolithic columns. The smallest theoretical plate times were found for CO2 and N2O carrier gases. Copyright © 2013 Elsevier B.V. All rights reserved.
Imitative and best response behaviors in a nonlinear Cournotian setting
NASA Astrophysics Data System (ADS)
Cerboni Baiardi, Lorenzo; Naimzada, Ahmad K.
2018-05-01
We consider competition among quantity-setting players in a deterministic nonlinear oligopoly framework characterized by an isoelastic demand curve. Players have heterogeneous decision mechanisms for setting their outputs: some players are imitators, while the remaining players adopt a rational-like rule according to which their past decisions are adjusted towards their static-expectation best response. The Cournot-Nash production level is a stationary state of our model, together with a further production level that can be interpreted as the competitive outcome in the case where only imitators are present. We find that both the number of players and the relative fraction of imitators influence the stability of the Cournot-Nash equilibrium in an ambiguous way, and double instability thresholds may be observed. Global analysis shows that a wide variety of complex dynamic scenarios emerge. Chaotic trajectories as well as multi-stabilities, where different attractors coexist, are robust phenomena that can be observed for a wide spectrum of parameter sets.
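As a rough illustration of this kind of heterogeneous-agent Cournot dynamics (not the authors' exact map), the sketch below iterates a quantity game with isoelastic inverse demand P = 1/Q and constant marginal cost c, where imitators copy last period's average output and best responders adjust partially toward the static best reply q_i = sqrt(Q_-i/c) − Q_-i. All parameter values and the imitation rule are assumptions.

```python
# Heterogeneous Cournot sketch: isoelastic demand P = 1/Q, constant marginal
# cost c, imitators copy last period's average output, best responders adjust
# partially toward the static best reply. Parameters are illustrative.
import numpy as np

def simulate(n_players=6, n_imitators=3, c=0.5, adjust=0.6, steps=200, seed=1):
    rng = np.random.default_rng(seed)
    q = rng.uniform(0.05, 0.3, n_players)           # initial outputs
    history = [q.copy()]
    for _ in range(steps):
        q_new = q.copy()
        avg = q.mean()
        for i in range(n_players):
            if i < n_imitators:                     # imitators mimic the average
                q_new[i] = avg
            else:                                   # partial adjustment to best reply
                q_other = q.sum() - q[i]
                br = max(np.sqrt(q_other / c) - q_other, 0.0)
                q_new[i] = (1 - adjust) * q[i] + adjust * br
        q = q_new
        history.append(q.copy())
    return np.array(history)

traj = simulate()
print("final outputs:", np.round(traj[-1], 4))
```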
Presque Isle Peninsula, Erie, Pennsylvania. Volume II. Appendices. Revised.
1980-11-01
[Table-of-contents and figure-list fragments from the appendices, including entries for Employment, Labor Force, and Public Facilities and Services (Transportation, Health Facilities, Communications) and figures such as "Distribution of Shoreline Use and Ownership, Erie County, PA," "Population Pyramid of Erie County" (Figure B2), and "Travel Demand Curve, Peak Day, Good Weather." The surviving text notes that Erie County, PA, is also experiencing a decline in total population.]
Unsupervised classification of variable stars
NASA Astrophysics Data System (ADS)
Valenzuela, Lucas; Pichara, Karim
2018-03-01
During the past 10 years, a considerable amount of effort has been made to develop algorithms for automatic classification of variable stars. That has been achieved primarily by applying machine learning methods to photometric data sets where objects are represented as light curves. Classifiers require training sets to learn the underlying patterns that allow the separation among classes. Unfortunately, building training sets is an expensive process that demands a great deal of human effort. Every time data come from new surveys, the only available training instances are the ones that have a cross-match with previously labelled objects, consequently generating training sets that are insufficient compared with the large amounts of unlabelled sources. In this work, we present an algorithm that performs unsupervised classification of variable stars, relying only on the similarity among light curves. We tackle the unsupervised classification problem by proposing an untraditional approach. Instead of trying to match classes of stars with clusters found by a clustering algorithm, we propose a query-based method where astronomers can find groups of variable stars ranked by similarity. We also develop a fast similarity function specific to light curves, based on a novel data structure that allows scaling the search over the entire data set of unlabelled objects. Experiments show that our unsupervised model achieves high accuracy in the classification of different types of variable stars and that the proposed algorithm scales up to massive amounts of light curves.
Zhang, Yanbin; Lin, Guanfeng; Wang, Shengru; Zhang, Jianguo; Shen, Jianxiong; Wang, Yipeng; Guo, Jianwei; Yang, Xinyu; Zhao, Lijuan
2016-01-01
Study Design. Retrospective study. Objective. To study the behavior of the unfused thoracic curve in Lenke type 5C during follow-up and to identify risk factors for its correction loss. Summary of Background Data. Few studies have focused on the spontaneous behavior of the unfused thoracic curve after selective thoracolumbar or lumbar fusion during follow-up and the risk factors for spontaneous correction loss. Methods. We retrospectively reviewed 45 patients (41 females and 4 males) with AIS who underwent selective TL/L fusion from 2006 to 2012 in a single institution. The follow-up averaged 36 months (range, 24–105 months). Patients were divided into two groups. Thoracic curves in group A improved or maintained their curve magnitude after spontaneous correction, with a negative or no correction loss during the follow-up. Thoracic curves in group B deteriorated after spontaneous correction with a positive correction loss. Univariate and multivariate analyses were performed to identify the risk factors for correction loss of the unfused thoracic curves. Results. The minor thoracic curve was 26° preoperatively. It was corrected to 13° immediately, with a spontaneous correction of 48.5%. At final follow-up it was 14°, with a correction loss of 1°. Thoracic curves did not deteriorate after spontaneous correction in 23 cases in group A, whereas 22 cases were identified with thoracic curve progression in group B. In multivariate analysis, two risk factors were independently associated with thoracic correction loss: higher flexibility and better immediate spontaneous correction rate of the thoracic curve. Conclusion. Posterior selective TL/L fusion with pedicle screw constructs is an effective treatment for Lenke 5C AIS patients. Nonstructural thoracic curves with higher flexibility or better immediate correction are more likely to progress during follow-up, and close attention must be paid to these patients in case of decompensation. Level of Evidence: 4 PMID:27831989
Meta-analysis of Diagnostic Accuracy and ROC Curves with Covariate Adjusted Semiparametric Mixtures.
Doebler, Philipp; Holling, Heinz
2015-12-01
Many screening tests dichotomize a measurement to classify subjects. Typically a cut-off value is chosen in a way that allows identification of an acceptable number of cases relative to a reference procedure, but does not produce too many false positives at the same time. Thus for the same sample many pairs of sensitivities and false positive rates result as the cut-off is varied. The curve of these points is called the receiver operating characteristic (ROC) curve. One goal of diagnostic meta-analysis is to integrate ROC curves and arrive at a summary ROC (SROC) curve. Holling, Böhning, and Böhning (Psychometrika 77:106-126, 2012a) demonstrated that finite semiparametric mixtures can describe the heterogeneity in a sample of Lehmann ROC curves well; this approach leads to clusters of SROC curves of a particular shape. We extend this work with the help of the [Formula: see text] transformation, a flexible family of transformations for proportions. A collection of SROC curves is constructed that approximately contains the Lehmann family but in addition allows the modeling of shapes beyond the Lehmann ROC curves. We introduce two rationales for determining the shape from the data. Using the fact that each curve corresponds to a natural univariate measure of diagnostic accuracy, we show how covariate adjusted mixtures lead to a meta-regression on SROC curves. Three worked examples illustrate the method.
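For orientation, a Lehmann ROC curve has the form sens = FPR^θ with 0 < θ < 1, and its area under the curve has the closed form 1/(1 + θ); the sketch below merely evaluates that family and is not the covariate-adjusted semiparametric mixture model of the paper.

```python
# Evaluate the Lehmann ROC family sens = fpr**theta for a few theta values.
# This only illustrates the curve family; it is not the paper's mixture model.
import numpy as np

fpr = np.linspace(0.001, 1.0, 200)
for theta in (0.1, 0.3, 0.6):
    sens = fpr ** theta
    auc = 1.0 / (1.0 + theta)        # closed-form AUC of a Lehmann ROC curve
    print(f"theta={theta:.1f}: AUC={auc:.3f}, sens at FPR=0.1 is {0.1**theta:.3f}")
```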
NASA Technical Reports Server (NTRS)
Alston, D. W.
1981-01-01
The considered research had the objective to design a statistical model that could perform an error analysis of curve fits of wind tunnel test data using analysis of variance and regression analysis techniques. Four related subproblems were defined, and by solving each of these a solution to the general research problem was obtained. The capabilities of the evolved true statistical model are considered. The least squares fit is used to determine the nature of the force, moment, and pressure data. The order of the curve fit is increased in order to delete the quadratic effect in the residuals. The analysis of variance is used to determine the magnitude and effect of the error factor associated with the experimental data.
Analysis of real-time numerical integration methods applied to dynamic clamp experiments.
Butera, Robert J; McCarthy, Maeve L
2004-12-01
Real-time systems are frequently used as an experimental tool, whereby simulated models interact in real time with neurophysiological experiments. The most demanding of these techniques is known as the dynamic clamp, where simulated ion channel conductances are artificially injected into a neuron via intracellular electrodes for measurement and stimulation. Methodologies for implementing the numerical integration of the gating variables in real time typically employ first-order numerical methods, either Euler or exponential Euler (EE). EE is often used for rapidly integrating ion channel gating variables. We find via simulation studies that for small time steps, both methods are comparable, but at larger time steps, EE performs worse than Euler. We derive error bounds for both methods, and find that the error can be characterized in terms of two ratios: time step over time constant, and voltage measurement error over the slope factor of the steady-state activation curve of the voltage-dependent gating variable. These ratios reliably bound the simulation error and yield results consistent with the simulation analysis. Our bounds quantitatively illustrate how measurement error restricts the accuracy that can be obtained by using smaller step sizes. Finally, we demonstrate that Euler can be computed with identical computational efficiency as EE.
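A minimal comparison of the two integrators on a single gating variable dx/dt = (x_inf − x)/τ, with assumed constant parameters. Note that with the voltage frozen the exponential Euler step is exact, so its error here is essentially zero; the paper's error bounds concern time-varying voltage and measurement error.

```python
# Compare forward Euler and exponential Euler on a single gating variable
#   dx/dt = (x_inf - x) / tau
# Exponential Euler uses the exact one-step solution for frozen parameters.
import numpy as np

x_inf, tau = 0.8, 5.0          # steady-state value and time constant (ms), assumed
x0, T = 0.1, 50.0              # initial condition and total time (ms)

def euler(dt):
    x = x0
    for _ in range(round(T / dt)):
        x += dt * (x_inf - x) / tau
    return x

def exp_euler(dt):
    x = x0
    decay = np.exp(-dt / tau)
    for _ in range(round(T / dt)):
        x = x_inf + (x - x_inf) * decay
    return x

exact = x_inf + (x0 - x_inf) * np.exp(-T / tau)
for dt in (0.01, 0.5, 2.0):
    print(f"dt={dt}: Euler err={abs(euler(dt) - exact):.2e}, "
          f"ExpEuler err={abs(exp_euler(dt) - exact):.2e}")
```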
Zaugg, Serge; van der Schaar, Mike; Houégnigan, Ludwig; André, Michel
2013-02-01
The analysis of acoustic data from the ocean is a valuable tool to study free ranging cetaceans and anthropogenic noise. Due to the typically large volume of acquired data, there is a demand for automated analysis techniques. Many cetaceans produce acoustic pulses (echolocation clicks) with a pulse repetition interval (PRI) remaining nearly constant over several pulses. Analyzing these pulse trains is challenging because they are often interleaved. This article presents an algorithm that estimates a pulse's PRI with respect to neighboring pulses. It includes a deinterleaving step that operates via a spectral dissimilarity metric. The sperm whale (SW) produces trains with PRIs between 0.5 and 2 s. As a validation, the algorithm was used for the PRI-based identification of SW click trains with data from the NEMO-ONDE observatory that contained other pulsed sounds, mainly from ship propellers. Separation of files containing SW clicks with a medium and high signal to noise ratio from files containing other pulsed sounds gave an area under the receiver operating characteristic curve value of 0.96. This study demonstrates that PRI can be used for the automated identification of SW clicks and that deinterleaving via spectral dissimilarity contributes to algorithm performance.
Li, Jingjian; Xiong, Chao; He, Xia; Lu, Zhaocen; Zhang, Xin; Chen, Xiaoyang; Sun, Wei
2018-01-01
Traditional herbal medicines have played important roles in the ways of life of people around the world since ancient times. Despite the advanced medical technology of the modern world, herbal medicines are still used as popular alternatives to synthetic drugs. Due to the increasing demand for herbal medicines, plant species identification has become an important tool to prevent substitution and adulteration. Here we propose a method for biological assessment of the quality of prescribed species in the Chinese Pharmacopoeia by use of high resolution melting (HRM) analysis of microsatellite loci. We tested this method on licorice, a traditional herbal medicine with a long history. Results showed that nine simple sequence repeat (SSR) markers produced distinct melting curve profiles for the five licorice species investigated using HRM analysis. These results were validated by capillary electrophoresis. We applied this protocol to commercially available licorice products, thus enabling the consistent identification of 11 labels with non-declared Glycyrrhiza species. This novel strategy may thus facilitate DNA barcoding as a method of identification of closely related species in herbal medicine products. Based on this study, a brief operating procedure for using the SSR-HRM protocol for herbal authentication is provided.
Li, Jingjian; Xiong, Chao; He, Xia; Lu, Zhaocen; Zhang, Xin; Chen, Xiaoyang; Sun, Wei
2018-01-01
Traditional herbal medicines have played important roles in the ways of life of people around the world since ancient times. Despite the advanced medical technology of the modern world, herbal medicines are still used as popular alternatives to synthetic drugs. Due to the increasing demand for herbal medicines, plant species identification has become an important tool to prevent substitution and adulteration. Here we propose a method for biological assessment of the quality of prescribed species in the Chinese Pharmacopoeia by use of high resolution melting (HRM) analysis of microsatellite loci. We tested this method on licorice, a traditional herbal medicine with a long history. Results showed that nine simple sequence repeat (SSR) markers produced distinct melting curve profiles for the five licorice species investigated using HRM analysis. These results were validated by capillary electrophoresis. We applied this protocol to commercially available licorice products, thus enabling the consistent identification of 11 labels with non-declared Glycyrrhiza species. This novel strategy may thus facilitate DNA barcoding as a method of identification of closely related species in herbal medicine products. Based on this study, a brief operating procedure for using the SSR-HRM protocol for herbal authentication is provided. PMID:29740326
Wang, Ailin; Yao, Zhichao; Zheng, Weiwei; Zhang, Hongyu
2014-01-01
The citrus fruit fly Bactrocera minax is associated with diverse bacterial communities. We used 454 pyrosequencing to study in depth the microbial communities associated with the gut and reproductive organs of Bactrocera minax. Our dataset consisted of 100,749 reads with an average length of 400 bp. The saturated rarefaction curves and species richness indices indicate that the sampling was comprehensive. We found highly diverse bacterial communities, with each sample containing approximately 361 microbial operational taxonomic units (OTUs). A total of 17 bacterial phyla were obtained from the flies. A phylogenetic analysis of 16S rDNA revealed that Proteobacteria was dominant in all samples (75%-95%). Actinobacteria and Firmicutes were also commonly found in the total clones. Klebsiella, Citrobacter, Enterobacter, and Serratia were the major genera. However, bacterial diversity (Chao1, Shannon and Simpson indices) and community structure (PCA analysis) varied across samples. The female ovary harbored the most diverse bacteria, followed by the male testis, and the bacterial diversity of the reproductive organs was richer than that of the gut. The observed variation may be caused by sex and tissue, possibly to meet the host's physiological demands.
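The diversity indices mentioned (Chao1, Shannon, Simpson) are simple functions of an OTU count vector; a minimal sketch with hypothetical counts, using the classic (uncorrected) Chao1 estimator:

```python
# Compute Chao1, Shannon, and Gini-Simpson diversity from an OTU count vector.
# Counts are hypothetical; the classic (uncorrected) Chao1 estimator is shown.
import numpy as np

counts = np.array([120, 85, 40, 40, 12, 7, 3, 2, 1, 1, 1])   # reads per OTU

p = counts / counts.sum()
shannon = -np.sum(p * np.log(p))                # Shannon H'
simpson = 1.0 - np.sum(p ** 2)                  # Gini-Simpson index

s_obs = (counts > 0).sum()
f1 = (counts == 1).sum()                        # singletons
f2 = (counts == 2).sum()                        # doubletons
chao1 = s_obs + f1 ** 2 / (2 * f2) if f2 > 0 else s_obs

print(f"OTUs={s_obs}, Chao1={chao1:.1f}, Shannon={shannon:.2f}, Simpson={simpson:.2f}")
```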
A new methodology for free wake analysis using curved vortex elements
NASA Technical Reports Server (NTRS)
Bliss, Donald B.; Teske, Milton E.; Quackenbush, Todd R.
1987-01-01
A method using curved vortex elements was developed for helicopter rotor free wake calculations. The Basic Curve Vortex Element (BCVE) is derived from the approximate Biot-Savart integration for a parabolic arc filament. When used in conjunction with a scheme to fit the elements along a vortex filament contour, this method has a significant advantage in overall accuracy and efficiency when compared to the traditional straight-line element approach. A theoretical and numerical analysis shows that free wake flows involving close interactions between filaments should utilize curved vortex elements in order to guarantee a consistent level of accuracy. The curved element method was implemented into a forward flight free wake analysis, featuring an adaptive far wake model that utilizes free wake information to extend the vortex filaments beyond the free wake regions. The curved vortex element free wake, coupled with this far wake model, exhibited rapid convergence, even in regions where the free wake and far wake turns are interlaced. Sample calculations are presented for tip vortex motion at various advance ratios for single and multiple blade rotors. Cross-flow plots reveal that the overall downstream wake flow resembles a trailing vortex pair. A preliminary assessment shows that the rotor downwash field is insensitive to element size, even for relatively large curved elements.
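The velocity induced by any filament element follows the Biot-Savart law, dV = Γ/(4π) dl × r/|r|³; the curved-element idea amounts to evaluating this integral along a parabolic arc rather than a straight segment. The sketch below does the integration by plain numerical quadrature and is not the paper's closed-form Basic Curved Vortex Element.

```python
# Numerically integrate the Biot-Savart law along a parabolic arc filament:
#   V(x) = (Gamma / 4 pi) * integral of dl x r / |r|^3
# Plain quadrature for illustration, not the closed-form BCVE of the paper.
import numpy as np

def induced_velocity(point, gamma=1.0, n=2000):
    s = np.linspace(-1.0, 1.0, n)                          # arc parameter
    arc = np.stack([s, 0.2 * s**2, np.zeros_like(s)], 1)   # parabolic filament
    dl = np.diff(arc, axis=0)                              # element vectors
    mid = 0.5 * (arc[1:] + arc[:-1])                       # element midpoints
    r = point - mid
    r_norm = np.linalg.norm(r, axis=1, keepdims=True)
    integrand = np.cross(dl, r) / r_norm**3
    return gamma / (4 * np.pi) * integrand.sum(axis=0)

print(induced_velocity(np.array([0.0, 1.0, 0.5])))
```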
Accuracy of AFM force distance curves via direct solution of the Euler-Bernoulli equation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eppell, Steven J., E-mail: steven.eppell@case.edu; Liu, Yehe; Zypman, Fredy R.
2016-03-15
In an effort to improve the accuracy of force-separation curves obtained from atomic force microscope data, we compare force-separation curves computed using two methods to solve the Euler-Bernoulli equation. A recently introduced method using a direct sequential forward solution, Causal Time-Domain Analysis, is compared against a previously introduced Tikhonov Regularization method. Using the direct solution as a benchmark, it is found that the regularization technique is unable to reproduce accurate curve shapes. Using L-curve analysis and adjusting the regularization parameter, λ, to match either the depth or the full width at half maximum of the force curves, the two techniques are contrasted. Matched depths result in full width at half maxima that are off by an average of 27% and matched full width at half maxima produce depths that are off by an average of 109%.
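For reference, the Tikhonov-regularized solution controlled by λ is x_λ = (AᵀA + λ²I)⁻¹Aᵀb, and the L-curve plots ‖Ax_λ − b‖ against ‖x_λ‖ as λ varies. A minimal sketch on a hypothetical ill-conditioned system (not the AFM deconvolution itself):

```python
# Tikhonov regularization sketch: solve min ||A x - b||^2 + lam^2 ||x||^2
# and tabulate the residual/solution norms used to draw an L-curve.
# A and b are hypothetical; this is not the paper's AFM analysis code.
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(60, 40))
A[:, 1] = A[:, 0] + 1e-6 * rng.normal(size=60)     # make A nearly rank-deficient
x_true = rng.normal(size=40)
b = A @ x_true + 0.01 * rng.normal(size=60)

def tikhonov(A, b, lam):
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam**2 * np.eye(n), A.T @ b)

for lam in (1e-4, 1e-2, 1e0):
    x = tikhonov(A, b, lam)
    print(f"lam={lam:.0e}: ||Ax-b||={np.linalg.norm(A @ x - b):.3f}, "
          f"||x||={np.linalg.norm(x):.3f}")
```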
Shi, Benlong; Mao, Saihu; Xu, Leilei; Sun, Xu; Liu, Zhen; Zhu, Zezhang; Lam, Tsz Ping; Cheng, Jack Cy; Ng, Bobby; Qiu, Yong
2016-07-04
Height gain is a common beneficial consequence of correction surgery in adolescent idiopathic scoliosis (AIS), yet little is known about the factors favoring regain of the lost vertical spinal height (SH) through posterior spinal fusion. A consecutive series of AIS patients from February 2013 to August 2015 was reviewed. Surgical changes in SH (ΔSH), as well as multiple coronal and sagittal deformity parameters, were measured and correlated. Factors associated with ΔSH were identified through Pearson correlation analysis and multivariate regression analysis. A total of 172 single-curve and 104 double-curve patients were reviewed. The ΔSH averaged 2.5 ± 0.9 cm in the single-curve group and 2.9 ± 1.0 cm in the double-curve group. The multivariate regression analysis revealed that the following pre-operative variables contributed significantly to ΔSH: pre-op Cobb angle, pre-op TK (single curve group only), pre-op GK (double curve group only) and pre-op LL (double curve group only) (p < 0.05). Thus change in height (in cm) = 0.044 × (pre-op Cobb angle) + 0.012 × (pre-op TK) (single curve, adjusted R(2) = 0.549) or 0.923 + 0.021 × (pre-op Cobb angle1) + 0.028 × (pre-op Cobb angle2) + 0.015 × (pre-op GK) - 0.012 × (pre-op LL) (double curve, adjusted R(2) = 0.563). A more severe pre-operative coronal Cobb angle and greater sagittal curves favored a larger surgical gain in vertical spinal height in AIS.
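Using the regression equations quoted above, a hypothetical single-curve patient with a pre-op Cobb angle of 50° and TK of 25° would be predicted to gain about 0.044 × 50 + 0.012 × 25 = 2.5 cm. A small sketch of both formulas (the example inputs are invented):

```python
# Predicted surgical gain in spinal height (cm) from the regression equations
# reported in the abstract; the inputs below are hypothetical example patients.
def delta_sh_single(cobb, tk):
    return 0.044 * cobb + 0.012 * tk

def delta_sh_double(cobb1, cobb2, gk, ll):
    return 0.923 + 0.021 * cobb1 + 0.028 * cobb2 + 0.015 * gk - 0.012 * ll

print(f"single curve (Cobb 50°, TK 25°): {delta_sh_single(50, 25):.2f} cm")
print(f"double curve (Cobb 45°/40°, GK 30°, LL 50°): "
      f"{delta_sh_double(45, 40, 30, 50):.2f} cm")
```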
MetaboAnalyst 3.0--making metabolomics more meaningful.
Xia, Jianguo; Sinelnikov, Igor V; Han, Beomsoo; Wishart, David S
2015-07-01
MetaboAnalyst (www.metaboanalyst.ca) is a web server designed to permit comprehensive metabolomic data analysis, visualization and interpretation. It supports a wide range of complex statistical calculations and high quality graphical rendering functions that require significant computational resources. First introduced in 2009, MetaboAnalyst has experienced more than a 50X growth in user traffic (>50 000 jobs processed each month). In order to keep up with the rapidly increasing computational demands and a growing number of requests to support translational and systems biology applications, we performed a substantial rewrite and major feature upgrade of the server. The result is MetaboAnalyst 3.0. By completely re-implementing the MetaboAnalyst suite using the latest web framework technologies, we have been able to substantially improve its performance, capacity and user interactivity. Three new modules have also been added including: (i) a module for biomarker analysis based on the calculation of receiver operating characteristic curves; (ii) a module for sample size estimation and power analysis for improved planning of metabolomics studies and (iii) a module to support integrative pathway analysis for both genes and metabolites. In addition, popular features found in existing modules have been significantly enhanced by upgrading the graphical output, expanding the compound libraries and by adding support for more diverse organisms. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
Robust Electrical Transfer System (RETS) for Solar Array Drive Mechanism SlipRing Assembly
NASA Astrophysics Data System (ADS)
Bommottet, Daniel; Bossoney, Luc; Schnyder, Ralph; Howling, Alan; Hollenstein, Christoph
2013-09-01
Demands for robust and reliable power transmission systems for sliprings in SADMs (Solar Array Drive Mechanisms) are increasing steadily. As a consequence, their performance with respect to the voltage breakdown limit must be known. An understanding of the overall shape of the breakdown voltage versus pressure curve is established, based on experimental measurements of DC (Direct Current) gas breakdown in complex geometries compared with a numerical simulation model. In addition, a detailed study was made of the functional behaviour of an entire satellite wing in a like-operational mode, comprising the solar cells, the power transmission lines, the SRA (SlipRing Assembly), the power S3R (Sequential Serial/shunt Switching Regulators) and the satellite load simulating the electrical power consumption. A test bench able to automatically measure (a) the breakdown voltage versus pressure curve and (b) the functional switching performance was developed and validated.
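The characteristic shape of a DC breakdown-voltage-versus-pressure curve is commonly summarized by Paschen's law, V_b = Bpd/[ln(Apd) − ln(ln(1 + 1/γ))]; the sketch below evaluates it with rough textbook-style constants for air purely for orientation. The slipring geometries in the study are more complex and were characterized by measurement and simulation, not by this formula.

```python
# Paschen-law sketch of breakdown voltage vs pressure-distance product:
#   V_b = B p d / ( ln(A p d) - ln(ln(1 + 1/gamma)) )
# Constants below are rough textbook-style values for air and are assumptions.
import numpy as np

A, B, gamma = 15.0, 365.0, 0.01          # 1/(cm*Torr), V/(cm*Torr), secondary emission
pd = np.logspace(-0.2, 2, 200)           # pressure * gap (Torr*cm)
denom = np.log(A * pd) - np.log(np.log(1 + 1 / gamma))
vb = np.where(denom > 0, B * pd / denom, np.nan)   # left branch diverges

i = np.nanargmin(vb)
print(f"minimum breakdown ~{vb[i]:.0f} V near pd = {pd[i]:.2f} Torr*cm")
```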
Research on Rigid Body Motion Tracing in Space based on NX MCD
NASA Astrophysics Data System (ADS)
Wang, Junjie; Dai, Chunxiang; Shi, Karen; Qin, Rongkang
2018-03-01
MCD (Mechatronics Concept Designer) is a module of the SIEMENS industrial design software UG (Unigraphics NX) in which users can define rigid bodies and kinematic joints so that objects move in simulation according to an existing plan. At this stage, users may wish to see, intuitively, the path traced by selected points on a moving object. In response to this requirement, this paper computes the pose from the transformation matrix available from the solver engine and then fits the sampled points with a B-spline curve. In addition, combined with the actual constraints on the rigid bodies, the traditional equal-interval sampling strategy is optimized. The results show that this method satisfies the requirement and makes up for the deficiencies of the traditional sampling method; users can still edit and model on the resulting 3D curve. The expected result has been achieved.
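A minimal sketch of the fitting step using SciPy's B-spline routines on hypothetical sampled positions of a tracked point (the samples stand in for the poses returned by the MCD solver engine; this is not the NX/MCD API):

```python
# Fit a smooth B-spline through 3D sample points of a tracked point's path
# and evaluate it densely. Sample data are hypothetical stand-ins for the
# poses returned by the solver engine; this is not the NX/MCD API.
import numpy as np
from scipy.interpolate import splprep, splev

t = np.linspace(0, 2 * np.pi, 25)                     # sparse parameter samples
pts = np.stack([np.cos(t), np.sin(t), 0.1 * t])       # 3 x N array of x, y, z

tck, u = splprep(pts, s=1e-4)                         # cubic B-spline fit
u_fine = np.linspace(0, 1, 300)
x, y, z = splev(u_fine, tck)                          # dense points on the curve

print(f"fitted curve evaluated at {len(x)} points; "
      f"start=({x[0]:.2f},{y[0]:.2f},{z[0]:.2f})")
```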
Reinforcing value and hypothetical behavioral economic demand for food and their relation to BMI.
Epstein, Leonard H; Paluch, Rocco A; Carr, Katelyn A; Temple, Jennifer L; Bickel, Warren K; MacKillop, James
2018-04-01
Food is a primary reinforcer, and food reinforcement is related to obesity. The reinforcing value of food can be measured by establishing how hard someone will work to get food on progressive-ratio schedules. An alternative way to measure food reinforcement is a hypothetical purchase task which creates behavioral economic demand curves. This paper studies whether reinforcing value and hypothetical behavioral demand approaches are assessing the same or unique aspects of food reinforcement for low (LED) and high (HED) energy density foods using a combination of analytic approaches in females of varying BMI. Results showed absolute reinforcing value for LED and HED foods and relative reinforcing value were related to demand intensity (r's = 0.20-0.30, p's < 0.01), and demand elasticity (r's = 0.17-0.22, p's < 0.05). Correlations between demographic, BMI and restraint, disinhibition and hunger variables with the two measures of food reinforcement were different. Finally, the two measures provided unique contributions to predicting BMI. Potential reasons for differences between the reinforcing value and hypothetical purchase tasks were actual responding versus hypothetical purchasing, choice of reinforcers versus purchasing of individual foods in the demand task, and the differential role of effort in the two tasks. Examples of how a better understanding of food reinforcement may be useful to prevent or treat obesity are discussed, including engaging in alternative non-food reinforcers as substitutes for food, such as crafts or socializing in a non-food environment, and reducing the value of immediate food reinforcers by episodic future thinking. Copyright © 2018. Published by Elsevier Ltd.
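For readers unfamiliar with the purchase-task approach, demand intensity and elasticity are typically derived by fitting an exponential demand model, e.g. log10 Q = log10 Q0 + k(e^(−αQ0C) − 1) (Hursh and Silberberg), to reported purchases across prices. The sketch below only evaluates that curve for assumed parameters; it is not the analysis used in this study.

```python
# Evaluate the exponential behavioral-economic demand model
#   log10(Q) = log10(Q0) + k * (exp(-alpha * Q0 * C) - 1)
# Parameter values are assumptions for illustration only.
import numpy as np

Q0, k, alpha = 10.0, 2.0, 0.01        # intensity, range constant, elasticity parameter
prices = np.array([0.5, 1, 2, 5, 10, 20])

logQ = np.log10(Q0) + k * (np.exp(-alpha * Q0 * prices) - 1)
Q = 10 ** logQ

for c, q in zip(prices, Q):
    print(f"price {c:>5.1f}: predicted consumption {q:6.2f}")
```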
Restructuring the rotor analysis program C-60
NASA Technical Reports Server (NTRS)
1985-01-01
The continuing evolution of the rotary wing industry demands increasing analytical capabilities. To keep up with this demand, software must be structured to accommodate change. The approach discussed for meeting this demand is to restructure an existing analysis. The motivational factors, basic principles, application techniques, and practical lessons from experience with this restructuring effort are reviewed.
Evaluation of the MEDLARS Demand Search Service.
ERIC Educational Resources Information Center
Lancaster, F.W.
A detailed analysis was made by the National Library of Medicine of the performance of the Medical Literature and Analysis System (MEDLARS) in relation to 300 actual "demand search" requests made to the systems in 1966 and 1967. The objectives of the study were : (1) to study the demand search requirements of MEDLARS users, (2) to…
[Effect of occupational stress on mental health].
Yu, Shan-fa; Zhang, Rui; Ma, Liang-qing; Gu, Gui-zhen; Yang, Yan; Li, Kui-rong
2003-02-01
To study the effects of job psychological demands and job control on mental health, and their interaction, 93 male freight train dispatchers were evaluated using a revised Job Demand-Control Scale and 7 strain scales. Stepwise regression analysis, univariate ANOVA, and the Kruskal-Wallis H and median tests were used in the statistical analysis. The Kruskal-Wallis H and median tests revealed differences in mental health scores among groups of decision latitude (mean rank 55.57, 47.95, 48.42, 33.50, P < 0.05); the differences in scores of mental health (37.45, 40.01, 58.35), job satisfaction (53.18, 46.91, 32.43), daily life strains (33.00, 44.96, 56.12) and depression (36.45, 42.25, 53.61) among groups of job time demands (P < 0.05) were all statistically significant. ANOVA showed that job time demands and decision latitude had interaction effects on physical complaints (R(2) = 0.24), state anxiety (R(2) = 0.26), and daytime fatigue (R(2) = 0.28) (P < 0.05). Regression analysis revealed a significant job time demands by job decision latitude interaction effect as well as significant main effects of some independent variables on different job strains (R(2) > 0.05). Job time demands and job decision latitude have direct and interactive effects on psychosomatic health: the greater the time demands, the greater the psychological strain, and the effect of job time demands is greater than that of job decision latitude.
Genetic analysis of growth curves for a woody perennial species, Pinus taeda L.
D.P. Gwaze; F.E. Bridgwater; C.G. Williams
2002-01-01
Inheritance of growth curves is critical for understanding evolutionary change and formulating efficient breeding plans, yet has received limited attention. Growth curves, like other characters that change in concert with development, often have higher heritability than age-specific traits. This study compared genetic parameters of height-growth curves with those of...
LOCAL ORTHOGONAL CUTTING METHOD FOR COMPUTING MEDIAL CURVES AND ITS BIOMEDICAL APPLICATIONS
Einstein, Daniel R.; Dyedov, Vladimir
2010-01-01
Medial curves have a wide range of applications in geometric modeling and analysis (such as shape matching) and biomedical engineering (such as morphometry and computer assisted surgery). The computation of medial curves poses significant challenges, both in terms of theoretical analysis and practical efficiency and reliability. In this paper, we propose a definition and analysis of medial curves and also describe an efficient and robust method called local orthogonal cutting (LOC) for computing medial curves. Our approach is based on three key concepts: a local orthogonal decomposition of objects into substructures, a differential geometry concept called the interior center of curvature (ICC), and integrated stability and consistency tests. These concepts lend themselves to robust numerical techniques and result in an algorithm that is efficient and noise resistant. We illustrate the effectiveness and robustness of our approach with some highly complex, large-scale, noisy biomedical geometries derived from medical images, including lung airways and blood vessels. We also present comparisons of our method with some existing methods. PMID:20628546
On the distribution of saliency.
Berengolts, Alexander; Lindenbaum, Michael
2006-12-01
Detecting salient structures is a basic task in perceptual organization. Saliency algorithms typically mark edge-points with some saliency measure, which grows with the length and smoothness of the curve on which these edge-points lie. Here, we propose a modified saliency estimation mechanism that is based on probabilistically specified grouping cues and on curve length distributions. In this framework, the Shashua and Ullman saliency mechanism may be interpreted as a process for detecting the curve with maximal expected length. Generalized types of saliency naturally follow. We propose several specific generalizations (e.g., gray-level-based saliency) and rigorously derive the limitations on generalized saliency types. We then carry out a probabilistic analysis of expected length saliencies. Using ergodicity and asymptotic analysis, we derive the saliency distributions associated with the main curves and with the rest of the image. We then extend this analysis to finite-length curves. Using the derived distributions, we derive the optimal threshold on the saliency for discriminating between figure and background and bound the saliency-based figure-from-ground performance.
Exploring Demand Charge Savings from Commercial Solar
DOE Office of Scientific and Technical Information (OSTI.GOV)
Darghouth, Naim; Barbose, Galen; Mills, Andrew
Commercial retail electricity rates commonly include a demand charge component, based on some measure of the customer's peak demand. Customer-sited solar PV can potentially reduce demand charges, but the magnitude of these savings can be difficult to predict, given variations in demand charge designs, customer loads, and PV generation profiles. Moreover, depending on the circumstances, demand charge savings from solar may or may not align well with associated utility cost savings. Lawrence Berkeley National Laboratory (Berkeley Lab) and the National Renewable Energy Laboratory (NREL) are collaborating in a series of studies to understand how solar PV can reduce demand charge levels for a variety of customer types and demand charge designs. Previous work focused on residential customers with solar. This study, instead, focuses on commercial customers and seeks to understand the extent and conditions under which rooftop solar can reduce commercial demand charges. To answer these questions, we simulate demand charge savings for a broad range of commercial customer types, demand charge designs, locations, and PV system characteristics. This particular analysis does not include storage, but a subsequent analysis in this series will evaluate demand charge savings for commercial customers with solar and storage.
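The core of such a simulation is simple: for each billing period, the demand charge is the tariff's $/kW rate applied to the peak of the customer's net load, with and without PV. A minimal sketch with hypothetical hourly profiles and an assumed rate:

```python
# Estimate monthly demand-charge savings from PV as
#   rate * (peak gross load - peak net load), using hypothetical hourly data.
import numpy as np

hours = np.arange(24)
load = 200 + 120 * np.exp(-((hours - 15) ** 2) / 18.0)          # kW, afternoon peak
pv = 80 * np.clip(np.sin(np.pi * (hours - 6) / 12.0), 0, None)  # kW, solar output

rate = 15.0                                # $/kW-month demand charge (assumed)
peak_gross = load.max()
peak_net = (load - pv).max()

savings = rate * (peak_gross - peak_net)
print(f"peak {peak_gross:.0f} kW -> {peak_net:.0f} kW with PV; "
      f"demand-charge savings ≈ ${savings:.0f}/month")
```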
NASA Astrophysics Data System (ADS)
Ahmadianfar, Iman; Adib, Arash; Taghian, Mehrdad
2017-10-01
Reservoir hedging rule curves are used to avoid severe water shortage during drought periods. In this method, reservoir storage is divided into several zones, and the rationing factors change immediately when the water storage level moves from one zone to another. In the present study, a hedging rule with fuzzy rationing factors was applied to create a transition zone above and below each rule curve, within which the rationing factor changes gradually. For this purpose, a monthly simulation model was developed and linked to the non-dominated sorting genetic algorithm to calculate the modified shortage index of two objective functions involving the water supply of minimum flow and agricultural demands over a long-term simulation period. The Zohre multi-reservoir system in southern Iran was considered as a case study. The proposed hedging rule improved long-term system performance by 10 to 27 percent compared with the simple hedging rule, demonstrating that fuzzification of the hedging factors increases the applicability and efficiency of the new hedging rule relative to the conventional rule curve for mitigating water shortage.
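A minimal sketch of the idea: a conventional hedging rule switches the rationing factor abruptly at zone boundaries, whereas the fuzzified version interpolates the factor across a transition band around each rule curve. The zone boundaries, factors, and band width below are assumptions, not the calibrated values for the Zohre system.

```python
# Hedging-rule sketch: release = rationing_factor(storage) * demand.
# A crisp rule jumps at zone boundaries; the "fuzzy" variant interpolates the
# factor linearly across a transition band around each boundary.
# Zone boundaries, factors, and band width are illustrative assumptions.
import numpy as np

boundaries = [0.3, 0.6]          # storage fractions separating zones
factors = [0.5, 0.75, 1.0]       # rationing factor in each zone (low -> full supply)
band = 0.05                      # half-width of the transition zone

def crisp_factor(s):
    return factors[np.searchsorted(boundaries, s)]

def fuzzy_factor(s):
    for b, lo, hi in zip(boundaries, factors[:-1], factors[1:]):
        if abs(s - b) <= band:                     # inside the transition band
            w = (s - (b - band)) / (2 * band)
            return (1 - w) * lo + w * hi
    return crisp_factor(s)

demand = 100.0                   # water demand (volume units)
for s in (0.25, 0.58, 0.61, 0.9):
    print(f"storage {s:.2f}: crisp release {crisp_factor(s)*demand:.0f}, "
          f"fuzzy release {fuzzy_factor(s)*demand:.0f}")
```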
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bruce McCarl and Dhazngilly
2004-01-07
The results of this project include: (1) Development of econometrically estimated marginal abatement and associated production curves describing the response of agricultural and forestry emissions/sink/offset enhancements for use in integrated assessments. Curves were developed that reflected agricultural and forestry production of traditional commodities, carbon and other greenhouse gas offsets, and biofuels given signals of general commodity demand and carbon and energy prices. (2) Integration of the non-dynamic curves from (1) into a version of the PNNL SGM integrated assessment model was done in cooperation with Dr. Ronald Sands at PNNL. The results were reported at the second DOE conference on sequestration in the paper listed, and the abstract is in Annex B of this report. (3) Alternative agricultural sequestration estimates were developed in conjunction with personnel at Colorado State University using CENTURY, and analyses can operate using agricultural soil carbon data from either the EPIC or CENTURY models. (4) A major effort was devoted to understanding the possible role of, and applicable actions from, agriculture. (5) Work was done with EPA and EIA to update the biofuel data and assumptions, resulting in some now emerging results showing the criticality of biofuel assumptions.
Overlap of movement planning and movement execution reduces reaction time.
Orban de Xivry, Jean-Jacques; Legrain, Valéry; Lefèvre, Philippe
2017-01-01
Motor planning is the process of preparing the appropriate motor commands in order to achieve a goal. This process has largely been thought to occur before movement onset and traditionally has been associated with reaction time. However, in a virtual line bisection task we observed an overlap between movement planning and execution. In this task performed with a robotic manipulandum, we observed that participants (n = 30) made straight movements when the line was in front of them (near target) but often made curved movements when the same target was moved sideways (far target, which had the same orientation) in such a way that they crossed the line perpendicular to its orientation. Unexpectedly, movements to the far targets had shorter reaction times than movements to the near targets (mean difference: 32 ms, SE: 5 ms, max: 104 ms). In addition, the curvature of the movement modulated reaction time. A larger increase in movement curvature from the near to the far target was associated with a larger reduction in reaction time. These highly curved movements started with a transport phase during which accuracy demands were not taken into account. We conclude that an accuracy demand imposes a reaction time penalty if processed before movement onset. This penalty is reduced if the start of the movement consists of a transport phase and if the movement plan can be refined with respect to accuracy demands later in the movement, hence demonstrating an overlap between movement planning and execution. In the planning of a movement, the brain has the opportunity to delay the incorporation of accuracy requirements of the motor plan in order to reduce the reaction time by up to 100 ms (average: 32 ms). Such shortening of reaction time is observed here when the first phase of the movement consists of a transport phase. This forces us to reconsider the hypothesis that motor plans are fully defined before movement onset. Copyright © 2017 the American Physiological Society.
Overlap of movement planning and movement execution reduces reaction time
Legrain, Valéry; Lefèvre, Philippe
2016-01-01
Motor planning is the process of preparing the appropriate motor commands in order to achieve a goal. This process has largely been thought to occur before movement onset and traditionally has been associated with reaction time. However, in a virtual line bisection task we observed an overlap between movement planning and execution. In this task performed with a robotic manipulandum, we observed that participants (n = 30) made straight movements when the line was in front of them (near target) but often made curved movements when the same target was moved sideways (far target, which had the same orientation) in such a way that they crossed the line perpendicular to its orientation. Unexpectedly, movements to the far targets had shorter reaction times than movements to the near targets (mean difference: 32 ms, SE: 5 ms, max: 104 ms). In addition, the curvature of the movement modulated reaction time. A larger increase in movement curvature from the near to the far target was associated with a larger reduction in reaction time. These highly curved movements started with a transport phase during which accuracy demands were not taken into account. We conclude that an accuracy demand imposes a reaction time penalty if processed before movement onset. This penalty is reduced if the start of the movement consists of a transport phase and if the movement plan can be refined with respect to accuracy demands later in the movement, hence demonstrating an overlap between movement planning and execution. NEW & NOTEWORTHY In the planning of a movement, the brain has the opportunity to delay the incorporation of accuracy requirements of the motor plan in order to reduce the reaction time by up to 100 ms (average: 32 ms). Such shortening of reaction time is observed here when the first phase of the movement consists of a transport phase. This forces us to reconsider the hypothesis that motor plans are fully defined before movement onset. PMID:27733598
NAS Demand Predictions, Transportation Systems Analysis Model (TSAM) Compared with Other Forecasts
NASA Technical Reports Server (NTRS)
Viken, Jeff; Dollyhigh, Samuel; Smith, Jeremy; Trani, Antonio; Baik, Hojong; Hinze, Nicholas; Ashiabor, Senanu
2006-01-01
The current work incorporates the Transportation Systems Analysis Model (TSAM) to predict the future demand for airline travel. TSAM is a multi-mode, national model that predicts the demand for all long distance travel at a county level based upon population and demographics. The model conducts a mode choice analysis to compute the demand for commercial airline travel based upon the traveler's purpose of the trip, value of time, and the cost and time of the trip. The county demand for airline travel is then aggregated (or distributed) to the airport level, and the enplanement demand at commercial airports is modeled. With the growth in flight demand, and utilizing current airline flight schedules, the Fratar algorithm is used to develop future flight schedules in the NAS. The projected flights can then be flown through air transportation simulators to quantify the ability of the NAS to meet future demand. A major strength of the TSAM analysis is that scenario planning can be conducted to quantify capacity requirements at individual airports, based upon different future scenarios. Different demographic scenarios can be analyzed to model the demand sensitivity to them. Also, it is fairly well known, but not well modeled at the airport level, that the demand for travel is highly dependent on the cost of travel, or the fare yield of the airline industry. The FAA projects the fare yield (in constant year dollars) to keep decreasing into the future. The magnitude and/or direction of these projections can be suspect in light of the general lack of airline profits and the large rises in airline fuel cost. Also, changes in travel time and convenience have an influence on the demand for air travel, especially for business travel. Future planners cannot easily conduct sensitivity studies of future demand with the FAA TAF data, nor with the Boeing or Airbus projections. In TSAM many factors can be parameterized and various demand sensitivities can be predicted for future travel. These resulting demand scenarios can be incorporated into future flight schedules, therefore providing a quantifiable demand for flights in the NAS for a range of futures. In addition, new future airline business scenarios are investigated that illustrate when direct flights can replace connecting flights and larger aircraft can be substituted, only when justified by demand.
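The Fratar step mentioned above updates an origin-destination trip table so that its row and column totals match grown enplanement targets; in practice this is a growth-factor balancing (Furness-type) iteration, sketched below on a hypothetical three-airport table. This is a generic implementation, not the TSAM code.

```python
# Growth-factor (Furness/Fratar-type) balancing of an origin-destination trip
# table to new row/column totals. Generic sketch, not the TSAM implementation.
import numpy as np

trips = np.array([[0., 120., 80.],
                  [110., 0., 60.],
                  [90., 70., 0.]])            # base-year O-D trips (hypothetical)
row_targets = np.array([260., 210., 200.])    # grown origin totals
col_targets = np.array([250., 230., 190.])    # grown destination totals

for _ in range(100):
    trips *= (row_targets / trips.sum(axis=1))[:, None]   # scale rows
    trips *= (col_targets / trips.sum(axis=0))[None, :]   # scale columns
    if np.allclose(trips.sum(axis=1), row_targets, rtol=1e-6):
        break

print(np.round(trips, 1))
print("row totals:", np.round(trips.sum(axis=1), 1))
```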
Kao, Peng-Kai; Hsu, Cheng-Che
2014-09-02
A portable microplasma generation device (MGD) operated in ambient air is introduced for making a microfluidic paper-based analytical device (μPAD) that serves as a primary healthcare platform. By utilizing a printed circuit board fabrication process, a flexible and lightweight MGD can be fabricated within 30 min at ultra-low cost. This MGD can be driven by a portable power supply (less than two pounds), which can be powered using 12 V batteries or AC-DC converters. The MGD is used to perform maskless patterning of hydrophilic patterns with sub-millimeter spatial resolution on hydrophobic paper substrates with good pattern transfer fidelity. Using this MGD to fabricate μPADs is demonstrated. With a proper design of the MGD electrode geometry, μPADs with 500-μm-wide flow channels can be fabricated within 1 min at a cost of less than USD 0.05 per device. We then test the μPADs by performing quantitative colorimetric assay tests and establish calibration curves for the detection of glucose and nitrite. The results show a linear response to a glucose assay for 1-50 mM and a nitrite assay for 0.1-5 mM. The low-cost, miniaturized, and portable MGD can be used to fabricate μPADs on demand, which is suitable for in-field diagnostic tests. We believe this concept will have impact in the fields of biomedical analysis, environmental monitoring, and food safety surveillance.
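The reported linear responses correspond to an ordinary linear calibration curve: fit color intensity against known standard concentrations, then invert the fit for unknowns. A minimal sketch with hypothetical intensity readings:

```python
# Linear colorimetric calibration sketch: fit intensity vs. concentration for
# standards, then estimate an unknown's concentration from its intensity.
# The intensity values below are hypothetical.
import numpy as np

conc = np.array([1, 5, 10, 20, 35, 50])                      # glucose standards (mM)
intensity = np.array([0.06, 0.21, 0.39, 0.78, 1.33, 1.92])   # assumed readings

slope, intercept = np.polyfit(conc, intensity, 1)
r = np.corrcoef(conc, intensity)[0, 1]

unknown_intensity = 0.95
estimated_conc = (unknown_intensity - intercept) / slope
print(f"slope={slope:.4f}, intercept={intercept:.4f}, r^2={r**2:.4f}")
print(f"unknown sample ≈ {estimated_conc:.1f} mM glucose")
```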
The ASAC Air Carrier Investment Model (Second Generation)
NASA Technical Reports Server (NTRS)
Wingrove, Earl R., III; Johnson, Jesse P.; Sickles, Robin C.; Good, David H.
1997-01-01
To meet its objective of assisting the U.S. aviation industry with the technological challenges of the future, NASA must identify research areas that have the greatest potential for improving the operation of the air transportation system. To accomplish this, NASA is building an Aviation System Analysis Capability (ASAC). The ASAC differs from previous NASA modeling efforts in that the economic behavior of buyers and sellers in the air transportation and aviation industries is central to its conception. To link the economics of flight with the technology of flight, ASAC requires a parametrically based model with extensions that link airline operations and investments in aircraft with aircraft characteristics. This model also must provide a mechanism for incorporating air travel demand and profitability factors into the airlines' investment decisions. Finally, the model must be flexible and capable of being incorporated into a wide-ranging suite of economic and technical models that are envisioned for ASAC. We describe a second-generation Air Carrier Investment Model that meets these requirements. The enhanced model incorporates econometric results from the supply and demand curves faced by U.S.-scheduled passenger air carriers. It uses detailed information about their fleets in 1995 to make predictions about future aircraft purchases. It provides analysts with the ability to project revenue passenger-miles flown, airline industry employment, airline operating profit margins, numbers and types of aircraft in the fleet, and changes in aircraft manufacturing employment under various user-defined scenarios.
Modelling supply and demand of bioenergy from short rotation coppice and Miscanthus in the UK.
Bauen, A W; Dunnett, A J; Richter, G M; Dailey, A G; Aylott, M; Casella, E; Taylor, G
2010-11-01
Biomass from lignocellulosic energy crops can contribute to primary energy supply in the short term in heat and electricity applications and in the longer term in transport fuel applications. This paper estimates the optimal feedstock allocation of herbaceous and woody lignocellulosic energy crops for England and Wales based on empirical productivity models. Yield maps for Miscanthus, willow and poplar, constrained by climatic, soil and land use factors, are used to estimate the potential resource. An energy crop supply-cost curve is estimated based on the resource distribution and associated production costs. The spatial resource model is then used to inform the supply of biomass to geographically distributed demand centres, with co-firing plants used as an illustration. Finally, the potential contribution of energy crops to UK primary energy and renewable energy targets is discussed. Copyright 2010 Elsevier Ltd. All rights reserved.
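Constructing a supply-cost curve from a spatial resource assessment is conceptually straightforward: sort the resource parcels by unit production cost and accumulate the energy they can supply. A minimal sketch with hypothetical parcels:

```python
# Build a supply-cost curve: sort resource parcels by unit cost and accumulate
# the energy they can supply. Parcel data are hypothetical.
import numpy as np

# each parcel: (annual energy available in PJ, production cost in GBP/GJ)
parcels = np.array([[12.0, 3.1], [8.0, 2.4], [20.0, 4.0],
                    [15.0, 2.9], [5.0, 5.2], [10.0, 3.6]])

order = np.argsort(parcels[:, 1])              # cheapest resource first
energy = parcels[order, 0]
cost = parcels[order, 1]
cumulative = np.cumsum(energy)                 # supply available at or below cost

for e, c in zip(cumulative, cost):
    print(f"up to {e:5.1f} PJ/yr available at <= {c:.1f} GBP/GJ")
```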
geometrical shape of the finite element in both of the models is a doubly-curved quadrilateral element whose edge curves are the lines-of-curvature coordinates employed to define the shell midsurface. (Author)
Smith, Sarah Josephine; Wei, Max; Sohn, Michael D.
2016-09-17
Experience curves are useful for understanding technology development and can aid in the design and analysis of market transformation programs. Here, we employ a novel approach to create experience curves, to examine both global and North American compact fluorescent lamp (CFL) data for the years 1990–2007. We move away from the prevailing method of fitting a single, constant, exponential curve to data and instead search for break points where changes in the learning rate may have occurred. Our analysis suggests a learning rate of approximately 21% for the period of 1990–1997, and 51% and 79% in global and North American datasets, respectively, after 1998. We use price data for this analysis; therefore our learning rates encompass developments beyond typical “learning by doing”, including supply chain impacts such as market competition. We examine correlations between North American learning rates and the initiation of new programs, abrupt technological advances, and economic and political events, and find an increased learning rate associated with design advancements and federal standards programs. Our findings support the use of segmented experience curves for retrospective and prospective technology analysis, and may imply that investments in technology programs have contributed to an increase of the CFL learning rate.
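The single-segment experience curve is price = a·Q^(−b) with learning rate LR = 1 − 2^(−b); the segmented version fits that relation separately on either side of a candidate break point. A minimal sketch on synthetic data (not the CFL price series analyzed in the paper):

```python
# Segmented experience-curve sketch: fit price = a * Q**(-b) in log-log space
# separately on each side of a break point; learning rate = 1 - 2**(-b).
# The data below are synthetic, not the CFL dataset.
import numpy as np

rng = np.random.default_rng(0)
cum_q = np.logspace(0, 3, 40)                            # cumulative production (a.u.)
price = np.where(cum_q < 100,
                 10.0 * cum_q ** (-0.34),
                 10.0 * 100 ** (-0.34) * (cum_q / 100.0) ** (-1.0))
price = price * np.exp(rng.normal(0, 0.05, cum_q.size))  # multiplicative noise

def fit_segment(q, p):
    slope, _ = np.polyfit(np.log(q), np.log(p), 1)       # slope of log-log fit is -b
    b = -slope
    return b, 1 - 2 ** (-b)

break_q = 100.0
for name, mask in (("before break", cum_q < break_q), ("after break", cum_q >= break_q)):
    b, lr = fit_segment(cum_q[mask], price[mask])
    print(f"{name}: b = {b:.2f}, learning rate ≈ {lr:.0%}")
```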
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Sarah Josephine; Wei, Max; Sohn, Michael D.
Experience curves are useful for understanding technology development and can aid in the design and analysis of market transformation programs. Here, we employ a novel approach to create experience curves, to examine both global and North American compact fluorescent lamp (CFL) data for the years 1990–2007. We move away from the prevailing method of fitting a single, constant, exponential curve to data and instead search for break points where changes in the learning rate may have occurred. Our analysis suggests a learning rate of approximately 21% for the period of 1990–1997, and 51% and 79% in global and North American datasets, respectively, after 1998. We use price data for this analysis; therefore our learning rates encompass developments beyond typical “learning by doing”, including supply chain impacts such as market competition. We examine correlations between North American learning rates and the initiation of new programs, abrupt technological advances, and economic and political events, and find an increased learning rate associated with design advancements and federal standards programs. Our findings support the use of segmented experience curves for retrospective and prospective technology analysis, and may imply that investments in technology programs have contributed to an increase of the CFL learning rate.
Current clinical use of reteplase for thrombolysis. A pharmacokinetic-pharmacodynamic perspective.
Martin, U; Kaufmann, B; Neugebauer, G
1999-04-01
Clinical evaluation of a new thrombolytic agent should start with a dose that provides adequate efficacy and has an acceptably low bleeding risk; this results in a narrow therapeutic window at the upper end of the dose-response curve. Angiographic patency of the infarct-related artery is still the clinical surrogate end-point for mortality in phase II dose-ranging studies. There is experimental and clinical evidence that the area under the concentration-time curve (AUC) for plasminogenolytic activity of a thrombolytic agent is positively correlated with patency of the infarct-related artery. Dose-ranging studies of the novel recombinant plasminogen activator reteplase in healthy volunteers enabled computation of a linear regression curve by which a clinical starting dose could be calculated for an adapted target AUC that would be clinically effective. Pharmacokinetic analysis also revealed that the half-life of reteplase is 4 times longer than that of the reference thrombolytic alteplase, thus allowing bolus injection. The suggested single bolus starting dose of 10U was supported by results from studies in a canine model of coronary thrombolysis. The feedback of insufficiently high patency rates compared with the increased efficacy of front-loaded and accelerated alteplase demanded optimisation strategies for reteplase. Animal experiments suggested that a double bolus regimen of reteplase would be preferable to doubling the single bolus dose. Pharmacokinetic modelling suggested a time interval of 30 min between the 2 bolus injections. Selection of the tested double bolus regimens was conservative and empirical. First, the previously tested single bolus of 15U was divided to 10 + 5U; secondly, the second bolus dose was increased to 10U. This strategy proved to be successful. The current dosage recommendation for reteplase is a double bolus intravenous injection of 10 + 10U, each over 2 min, 30 min apart. This produces a reduction in mortality in patients with acute myocardial infarction that is equivalent to that produced by front-loaded and accelerated infusion of alteplase.
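The AUC referred to here is the standard area under the activity-time curve, usually computed by the trapezoidal rule from sampled time points; a minimal sketch with hypothetical activity data:

```python
# Trapezoidal-rule AUC of a plasminogenolytic-activity vs. time profile.
# The time points and activity values below are hypothetical illustration data.
import numpy as np

t = np.array([0, 5, 10, 20, 30, 60, 90, 120, 180], dtype=float)   # min after bolus
activity = np.array([0, 9.5, 8.0, 5.8, 4.2, 1.9, 0.9, 0.4, 0.1])  # arbitrary units

auc = np.sum(0.5 * (activity[1:] + activity[:-1]) * np.diff(t))   # trapezoidal rule
print(f"AUC(0-180 min) ≈ {auc:.0f} (activity·min)")
```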
Tongass National Forest timber demand: projections for 2015 to 2030
Jean M. Daniels; Michael D. Paruszkiewicz; Susan J. Alexander
2016-01-01
Projections of Alaska timber products output; the derived demand for logs, lumber, residues, and niche products; and timber harvest by owner were developed using a trend-based analysis. This is the fifth such analysis performed since 1990 to assist planners in meeting statutory requirements for estimating planning-cycle demand for timber from the Tongass National...
Langevin Equation on Fractal Curves
NASA Astrophysics Data System (ADS)
Satin, Seema; Gangal, A. D.
2016-07-01
We analyze the random motion of a particle on a fractal curve using the Langevin approach. This involves defining a new velocity in terms of the mass of the fractal curve, as defined in recent work. The geometry of the fractal curve plays an important role in this analysis. A Langevin equation with a particular model of noise is proposed and solved using techniques of the Fα-calculus.
1998 UBV Light Curves of Eclipsing Binary AI Draconis and Absolute Parameters
NASA Astrophysics Data System (ADS)
Jassur, D. M. Z.; Khaledian, M. S.; Kermani, M. H.
New UBV photometry of the Algol-type eclipsing binary star AI Dra and the absolute physical parameters of this system are presented. The light curve analysis, carried out by the method of differential corrections, indicates that both components are inside their Roche lobes. By combining the photometric solution with spectroscopic data obtained from the velocity curve analysis, it has been found that the system consists of a main sequence primary and an evolved (subgiant) secondary.
Work Demands and Work-to-Family and Family-to-Work Conflict: Direct and Indirect Relationships
ERIC Educational Resources Information Center
Voydanoff, Patricia
2005-01-01
This article uses a demands-and-resources approach to examine relationships between three types of work demands and work-to-family and family-to-work conflict: time-based demands, strain-based demands, and boundary-spanning demands. The analysis is based on data from 2,155 employed adults living with a family member who were interviewed for the…
NASA Astrophysics Data System (ADS)
Chen, Fei-Yan; Yi, Jing-Wei; Gu, Zhe-Jia; Tang, Bin-Bing; Li, Jian-Qi; Li, Li; Kulkarni, Padmakar; Liu, Li; Mason, Ralph P.; Tang, Qun
2016-03-01
On-demand drug delivery is becoming feasible via the design of either exogenous or endogenous stimulus-responsive drug delivery systems. Herein we report the development of gadolinium arsenite nanoparticles as a self-delivery platform to store, deliver and release arsenic trioxide (ATO, Trisenox), a clinical anti-cancer drug. Specifically, unloading of the small molecule drug is triggered by an endogenous stimulus: inorganic phosphate (Pi) in the blood, fluid, and soft or hard tissue. Kinetics in vitro demonstrated that ATO is released with high ON/OFF specificity and no leakage was observed in the silent state. The nanoparticles induced tumor cell apoptosis, and reduced cancer cell migration and invasion. Plasma pharmacokinetics verified extended retention time, but no obvious disturbance of phosphate balance. Therapeutic efficacy on a liver cancer xenograft mouse model was dramatically potentiated with reduced toxicity compared to the free drug. These results suggest a new drug delivery strategy which might be applied for ATO therapy on solid tumors. Electronic supplementary information (ESI) available: HRTEM image and electron diffraction pattern of individual GdAsOx NPs, cell viability measurements after 48 and 72 hours of incubation, body weight change curves, hematology curves, liver function curves, and renal function curves. See DOI: 10.1039/c6nr00536e
NASA Astrophysics Data System (ADS)
Brandt, Adam Robert
This dissertation explores the environmental and economic impacts of the transition to hydrocarbon substitutes for conventional petroleum (SCPs). First, mathematical models of oil depletion are reviewed, including the Hubbert model, curve-fitting methods, simulation models, and economic models. The benefits and drawbacks of each method are outlined. I discuss the predictive value of the models and our ability to determine if one model type works best. I argue that forecasting oil depletion without also including substitution with SCPs results in unrealistic projections of future energy supply. I next use information theoretic techniques to test the Hubbert model of oil depletion against five other asymmetric and symmetric curve-fitting models using data from 139 oil producing regions. I also test the assumptions that production curves are symmetric and that production is more bell-shaped in larger regions. Results show that if symmetry is enforced, Gaussian production curves perform best, while if asymmetry is allowed, asymmetric exponential models prove most useful. I also find strong evidence for asymmetry: production declines are consistently less steep than inclines. In order to understand the impacts of oil depletion on GHG emissions, I developed the Regional Optimization Model for Emissions from Oil Substitutes (ROMEO). ROMEO is an economic optimization model of investment and production of fuels. Results indicate that incremental emissions (with demand held constant) from SCPs could be 5-20 GtC over the next 50 years. These results are sensitive to the endowment of conventional oil and not sensitive to a carbon tax. If demand can vary, total emissions could decline under a transition because the higher cost of SCPs lessens overall fuel consumption. Lastly, I study the energetic and environmental characteristics of the in situ conversion process (ICP), which utilizes electricity to generate liquid hydrocarbons from oil shale. I model the energy inputs and outputs from the ICP and use them to calculate the GHG emissions from the ICP. Energy outputs (as refined liquid fuel) range from 1.2 to 1.6 times the total primary energy inputs. Well-to-tank greenhouse gas emissions range from 30.6 to 37.1 gCeq./MJ of final fuel delivered, 21 to 47% larger than those from conventionally produced petroleum-based fuels.
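The curve-fitting comparison described above can be illustrated with a small sketch. The code below fits a Hubbert (logistic-derivative) curve and a Gaussian curve to a synthetic regional production series and scores them with an AIC-style criterion; the data, parameter values, and three-parameter functional forms are illustrative assumptions, not the dissertation's data set or model suite.

```python
# Illustrative sketch: comparing a Hubbert curve and a Gaussian curve
# fitted to a noisy synthetic production series.
import numpy as np
from scipy.optimize import curve_fit

def hubbert(t, q_max, t_peak, width):
    # logistic-derivative (Hubbert) production curve, peak value q_max
    z = np.exp(-(t - t_peak) / width)
    return q_max * 4.0 * z / (1.0 + z) ** 2

def gaussian(t, q_max, t_peak, sigma):
    return q_max * np.exp(-((t - t_peak) ** 2) / (2.0 * sigma ** 2))

t = np.arange(1950, 2010, dtype=float)
obs = hubbert(t, 1.0, 1985.0, 8.0) + np.random.default_rng(0).normal(0, 0.02, t.size)

for name, f in [("Hubbert", hubbert), ("Gaussian", gaussian)]:
    p, _ = curve_fit(f, t, obs, p0=[1.0, 1980.0, 10.0])
    rss = np.sum((obs - f(t, *p)) ** 2)
    aic = t.size * np.log(rss / t.size) + 2 * 3   # both models have 3 parameters
    print(f"{name}: fitted peak year {p[1]:.1f}, AIC {aic:.1f}")
```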
A sustainable development of a city electrical grid via a non-contractual Demand-Side Management
NASA Astrophysics Data System (ADS)
Samoylenko, Vladislav O.; Pazderin, Andrew V.
2017-06-01
The increasing energy consumption of large cities, together with the extremely high density of urban electrical loads, makes it necessary to search for alternative approaches to city grid development. The ongoing implementation of energy accounting tariffs with differentiated rates, which depend on market conditions and change over a short-term horizon, makes it possible to use them as the financial incentive basis of Demand-Side Management (DSM). Modern high-technology energy metering and accounting systems, with a large number of functions and consumer feedback, are well suited as a means of DSM. Existing Smart Metering (SM) billing systems usually provide general information about the consumption curve, bills, and comparative data, but not advanced statistics on the correspondence between financial and electrical parameters. Consumer feedback is also usually not fully used, so efforts to combine the market principle, Smart Metering, and consumer feedback for active non-contractual load control are essential. The paper presents a rating-based multi-purpose system of mathematical statistics and algorithms for DSM efficiency estimation, useful for both consumers and energy companies. The estimation is performed by SM data processing systems. The system is aimed at load peak shaving and load curve smoothing and is focused primarily on retail market support. It contributes to energy efficiency and to distribution process improvement through manual management or automated Smart Appliance interaction.
Automatic Target Recognition Classification System Evaluation Methodology
2002-09-01
Excerpts: the report's figures include a testing set of two-class XOR data (250 samples) and a decision analysis process flow chart. It discusses ROC curve meta-analysis, the estimation of the true ROC curve of a given diagnostic system through ROC analysis across many studies, and notes that the technique can be very effective in sensitivity analysis, i.e., determining which data points have the most effect on the solution.
Solar + Storage Synergies for Managing Commercial-Customer Demand Charges
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gagnon, P.; Govindarajan, A.; Bird, L.
Demand charges, which are based on a customer’s maximum demand in kilowatts (kW), are a common element of electricity rate structures for commercial customers. Customer-sited solar photovoltaic (PV) systems can potentially reduce demand charges, but the level of savings is difficult to predict, given variations in demand charge designs, customer loads, and PV generation profiles. Lawrence Berkeley National Laboratory (Berkeley Lab) and the National Renewable Energy Laboratory (NREL) are collaborating on a series of studies to understand how solar PV can impact demand charges. Prior studies in the series examined demand charge reductions from solar on a stand-alone basis for residential and commercial customers. Those earlier analyses found that solar, alone, has limited ability to reduce demand charges depending on the specific design of the demand charge and on the shape of the customer’s load profile. This latest analysis estimates demand charge savings from solar in commercial buildings when co-deployed with behind-the-meter storage, highlighting the complementary roles of the two technologies. The analysis is based on simulated loads, solar generation, and storage dispatch across a wide variety of building types, locations, system configurations, and demand charge designs.
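A minimal sketch of the billing arithmetic behind such an analysis, assuming a demand charge billed on the single highest 15-minute interval in the month. The load and PV profiles, the rate, and the helper names demand_charge and peak_with_storage are hypothetical, and the storage model applies its energy check to the whole billing period, a deliberately conservative simplification compared with a full dispatch simulation of the kind used in the study.

```python
import numpy as np

def demand_charge(net_kw, rate_per_kw):
    # demand charge billed on the highest interval demand in the period
    return rate_per_kw * max(net_kw.max(), 0.0)

def peak_with_storage(net_kw, p_kw, e_kwh, dt_h=0.25):
    # lowest peak cap reachable by a battery of power p_kw and energy e_kwh
    # that only discharges when net load exceeds the cap; energy check is
    # applied to the whole series (conservative: ignores recharging cycles)
    lo, hi = 0.0, net_kw.max()
    for _ in range(60):                       # bisection on the cap
        cap = 0.5 * (lo + hi)
        excess = np.clip(net_kw - cap, 0.0, None)
        feasible = excess.max() <= p_kw and excess.sum() * dt_h <= e_kwh
        hi, lo = (cap, lo) if feasible else (hi, cap)
    return hi

# toy month of 15-minute data: evening-peaking load, daytime PV
rng = np.random.default_rng(0)
t = np.arange(96 * 30) * 0.25                 # hours
load = 80 + 25 * np.sin(2 * np.pi * (t / 24 - 0.6)) + rng.normal(0, 4, t.size)
pv = np.clip(60 * np.sin(2 * np.pi * (t / 24 - 0.25)), 0, None)
net = load - pv

rate = 15.0                                   # $/kW-month (illustrative)
print(f"load only:  ${demand_charge(load, rate):.0f}")
print(f"with PV:    ${demand_charge(net, rate):.0f}")
print(f"PV+storage: ${rate * peak_with_storage(net, p_kw=20.0, e_kwh=40.0):.0f}")
```

With an evening-peaking load the PV-only case barely reduces the charge, which mirrors the finding above that solar alone has limited ability to reduce demand charges.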
MULTIVARIATE CURVE RESOLUTION OF NMR SPECTROSCOPY METABONOMIC DATA
Sandia National Laboratories is working with the EPA to evaluate and develop mathematical tools for analysis of the collected NMR spectroscopy data. Initially, we have focused on the use of Multivariate Curve Resolution (MCR) also known as molecular factor analysis (MFA), a tech...
A Framework for Flood Risk Analysis and Benefit Assessment of Flood Control Measures in Urban Areas
Li, Chaochao; Cheng, Xiaotao; Li, Na; Du, Xiaohe; Yu, Qian; Kan, Guangyuan
2016-01-01
Flood risk analysis is more complex in urban areas than in rural areas because of their closely packed buildings, different kinds of land uses, and large number of flood control works and drainage systems. The purpose of this paper is to propose a practical framework for flood risk analysis and benefit assessment of flood control measures in urban areas. Based on the concept of the disaster risk triangle (hazard, vulnerability and exposure), a comprehensive analysis method and a general procedure were proposed for urban flood risk analysis. The Urban Flood Simulation Model (UFSM) and the Urban Flood Damage Assessment Model (UFDAM) were integrated to estimate the flood risk in the Pudong flood protection area (Shanghai, China). S-shaped functions were adopted to represent flood return period and damage (R-D) curves. The study results show that flood control works could significantly reduce the flood risk within the 66-year flood return period, for which the flood risk was reduced by 15.59%. However, the flood risk was only reduced by 7.06% when the flood return period exceeded 66 years. Hence, it is difficult to meet the increasing demands for flood control solely by relying on structural measures. The R-D function is suitable for describing the changes in flood control capacity. This framework can assess the flood risk reduction due to flood control measures, and provide crucial information for strategy development and planning adaptation. PMID:27527202
A Behavioral Economic Approach to Assessing Demand for Marijuana
Collins, R. Lorraine; Vincent, Paula C.; Yu, Jihnhee; Liu, Liu; Epstein, Leonard H.
2014-01-01
In the U.S., marijuana is the most commonly used illicit drug. Its prevalence is growing, particularly among young adults. Behavioral economic indices of the relative reinforcing efficacy (RRE) of substances have been used to examine the appeal of licit (e.g., alcohol) and illicit (e.g., heroin) drugs. The present study is the first to use an experimental, simulated purchasing task to examine the RRE of marijuana. Young-adult (M age = 21.64 years) recreational marijuana users (N = 59) completed a computerized marijuana purchasing task designed to generate demand curves and the related RRE indices (e.g., intensity of demand - purchases at lowest price; Omax - max. spent on marijuana; Pmax - price at which marijuana expenditure is max). Participants “purchased” high-grade marijuana across 16 escalating prices that ranged from $0/free to $160/joint. They also provided 2-weeks of real-time, ecological momentary assessment reports on their marijuana use. The purchasing task generated multiple RRE indices. Consistent with research on other substances, the demand for marijuana was inelastic at lower prices but became elastic at higher prices, suggesting that increases in the price of marijuana could lessen its use. In regression analyses, the intensity of demand, Omax and Pmax, and elasticity each accounted for significant variance in real-time marijuana use. These results provide support for the validity of a simulated marijuana purchasing task to examine its reinforcing efficacy. This study highlights the value of applying a behavioral economic framework to young-adult marijuana use and has implications for prevention, treatment, and policies to regulate marijuana use. PMID:24467370
A behavioral economic approach to assessing demand for marijuana.
Collins, R Lorraine; Vincent, Paula C; Yu, Jihnhee; Liu, Liu; Epstein, Leonard H
2014-06-01
In the United States, marijuana is the most commonly used illicit drug. Its prevalence is growing, particularly among young adults. Behavioral economic indices of the relative reinforcing efficacy (RRE) of substances have been used to examine the appeal of licit (e.g., alcohol) and illicit (e.g., heroin) drugs. The present study is the first to use an experimental, simulated purchasing task to examine the RRE of marijuana. Young-adult (M age = 21.64 years) recreational marijuana users (N = 59) completed a computerized marijuana purchasing task designed to generate demand curves and the related RRE indices (e.g., intensity of demand-purchases at lowest price; Omax-max. spent on marijuana; Pmax-price at which marijuana expenditure is max). Participants "purchased" high-grade marijuana across 16 escalating prices that ranged from $0/free to $160/joint. They also provided 2 weeks of real-time, ecological momentary assessment reports on their marijuana use. The purchasing task generated multiple RRE indices. Consistent with research on other substances, the demand for marijuana was inelastic at lower prices but became elastic at higher prices, suggesting that increases in the price of marijuana could lessen its use. In regression analyses, the intensity of demand, Omax, and Pmax, and elasticity each accounted for significant variance in real-time marijuana use. These results provide support for the validity of a simulated marijuana purchasing task to examine marijuana's reinforcing efficacy. This study highlights the value of applying a behavioral economic framework to young-adult marijuana use and has implications for prevention, treatment, and policies to regulate marijuana use. PsycINFO Database Record (c) 2014 APA, all rights reserved.
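As a brief illustration of how such purchase-task data yield demand-curve indices, the sketch below fits a parametric demand curve and derives Omax and Pmax numerically. The exponentiated exponential-demand form Q = Q0 * 10**(k*(exp(-alpha*Q0*P) - 1)) is one common parameterization; the study above does not state which form it used, and the prices, quantities, and span constant K here are toy values.

```python
import numpy as np
from scipy.optimize import curve_fit

prices = np.array([0.0, 0.5, 1.0, 2.0, 4.0, 8.0, 16.0, 32.0, 64.0, 160.0])
joints = np.array([6.0, 6.0, 5.5, 5.0, 4.0, 2.5, 1.0, 0.5, 0.0, 0.0])  # toy data

K = 3.0  # span constant, often fixed across participants

def demand(p, q0, alpha):
    # exponentiated exponential demand: q0 = intensity, alpha governs elasticity
    return q0 * 10.0 ** (K * (np.exp(-alpha * q0 * p) - 1.0))

(q0, alpha), _ = curve_fit(demand, prices, joints, p0=[6.0, 0.01], maxfev=10000)

grid = np.linspace(0.01, prices.max(), 20000)
expenditure = grid * demand(grid, q0, alpha)
omax = expenditure.max()            # peak expenditure
pmax = grid[expenditure.argmax()]   # price at peak expenditure
print(f"Q0={q0:.2f}, alpha={alpha:.4f}, Omax=${omax:.2f}, Pmax=${pmax:.2f}")
```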
Surendran, Sowmya Velekkatt; Hussain, Sharmila; Bhoominthan, S; Nayar, Sanjna; Jayesh, Ragavendra
2016-01-01
When reconstructing occlusal curvatures, dentists often use an arc of 4-inch radius as a rough standard based on Monson's spherical theory. Using an identical radius for the curve of Spee in all patients may not be appropriate, because each patient is individually different. The present study was undertaken to evaluate the validity of applying this theory to the Indian population. It is an attempt to evaluate the curve of Spee and the curve of Wilson in a young Indian population using three-dimensional analysis, comparing the radius and depth of the right and left, maxillary and mandibular curves of Spee and the radius of the maxillary and mandibular curves of Wilson in males and females. The cusp tips of the canines, the buccal cusp tips of the premolars and molars, and the palatal/lingual cusp tips of the second molars of 60 maxillary and 60 mandibular casts were obtained. Three-dimensional (x, y, z) coordinates of the cusp tips of the molars, premolars, and canines on the right and left sides of the maxilla and mandible were obtained with a three-dimensional coordinate measuring machine. The radius and depth of the right and left, maxillary and mandibular curves of Spee and the radius of the maxillary and mandibular curves of Wilson were measured with the computer software Metrologic-XG. Pearson's correlation test and the independent t-test were used to test statistical significance (α=.05). The values of the curve of Spee and curve of Wilson in the Indian population obtained in this study were higher than the 4 inch (100 mm) radius proposed by Monson. These findings suggest ethnic differences in the radius of the curve of Spee and curve of Wilson.
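The geometric step in such a study can be sketched as a least-squares sphere fit to the 3-D cusp-tip coordinates; the fitted radius is the occlusal-curve radius. This is a minimal sketch, not the Metrologic-XG workflow used by the authors, and the coordinates below are toy values.

```python
import numpy as np

def sphere_fit(points):
    # linear least-squares sphere fit: |x - c|^2 = r^2 rewritten as
    # 2*c.x + (r^2 - |c|^2) = |x|^2, solved for center c and offset d
    x, y, z = points.T
    A = np.column_stack([2 * x, 2 * y, 2 * z, np.ones(len(points))])
    b = x ** 2 + y ** 2 + z ** 2
    (cx, cy, cz, d), *_ = np.linalg.lstsq(A, b, rcond=None)
    radius = np.sqrt(d + cx ** 2 + cy ** 2 + cz ** 2)
    return np.array([cx, cy, cz]), radius

# toy cusp-tip coordinates (mm), illustrative only
pts = np.array([[-22.0, 5.0, 2.0], [-12.0, 2.0, 0.5], [0.0, 0.0, 0.0],
                [12.0, 2.0, 0.5], [22.0, 5.0, 2.0], [28.0, 9.0, 4.0]])
center, r = sphere_fit(pts)
print(f"fitted occlusal-curve radius: {r:.1f} mm")
```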
Ruijter, Jan M; Pfaffl, Michael W; Zhao, Sheng; Spiess, Andrej N; Boggy, Gregory; Blom, Jochen; Rutledge, Robert G; Sisti, Davide; Lievens, Antoon; De Preter, Katleen; Derveaux, Stefaan; Hellemans, Jan; Vandesompele, Jo
2013-01-01
RNA transcripts such as mRNA or microRNA are frequently used as biomarkers to determine disease state or response to therapy. Reverse transcription (RT) in combination with quantitative PCR (qPCR) has become the method of choice to quantify small amounts of such RNA molecules. In parallel with the democratization of RT-qPCR and its increasing use in biomedical research or biomarker discovery, we witnessed a growth in the number of gene expression data analysis methods. Most of these methods are based on the principle that the position of the amplification curve with respect to the cycle-axis is a measure for the initial target quantity: the later the curve, the lower the target quantity. However, most methods differ in the mathematical algorithms used to determine this position, as well as in the way the efficiency of the PCR reaction (the fold increase of product per cycle) is determined and applied in the calculations. Moreover, there is dispute about whether the PCR efficiency is constant or continuously decreasing. Together this has led to the development of different methods to analyze amplification curves. In published comparisons of these methods, available algorithms were typically applied in a restricted or outdated way, which does not do them justice. Therefore, we aimed at developing a framework for robust and unbiased assessment of curve analysis performance whereby various publicly available curve analysis methods were thoroughly compared using a previously published large clinical data set (Vermeulen et al., 2009) [11]. The original developers of these methods applied their algorithms and are co-authors on this study. We assessed the curve analysis methods' impact on transcriptional biomarker identification in terms of expression level, statistical significance, and patient-classification accuracy. The concentration series per gene, together with data sets from unpublished technical performance experiments, were analyzed in order to assess the algorithms' precision, bias, and resolution. While large differences exist between methods when considering the technical performance experiments, most methods perform relatively well on the biomarker data. The data and the analysis results per method are made available to serve as benchmark for further development and evaluation of qPCR curve analysis methods (http://qPCRDataMethods.hfrc.nl). Copyright © 2012 Elsevier Inc. All rights reserved.
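A hedged illustration of the shared principle behind these methods: the curve position (a quantification cycle, Cq) and the per-cycle efficiency E together give the starting quantity N0 ≈ Nq / E**Cq. The sketch below takes the efficiency from a log-linear fit over a user-chosen window of exponential-phase cycles; the published algorithms differ precisely in how they choose that window and threshold, and the curve, window, and threshold here are toy values.

```python
import numpy as np

def efficiency_and_n0(fluorescence, window, threshold):
    cycles = np.arange(1, len(fluorescence) + 1, dtype=float)
    lo, hi = window
    slope, intercept = np.polyfit(cycles[lo:hi], np.log10(fluorescence[lo:hi]), 1)
    eff = 10.0 ** slope                        # fold increase per cycle (1..2)
    cq = (np.log10(threshold) - intercept) / slope   # fractional threshold cycle
    n0 = threshold / eff ** cq                 # starting quantity (fluorescence units)
    return eff, cq, n0

# toy amplification curve: small baseline plus a logistic-like rise
c = np.arange(1, 41, dtype=float)
curve = 0.01 + 10.0 / (1.0 + np.exp(-(c - 24) * 0.7))
eff, cq, n0 = efficiency_and_n0(curve, window=(18, 23), threshold=1.0)
print(f"efficiency {eff:.2f}, Cq {cq:.1f}, N0 {n0:.2e}")
```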
Light curve of the optical counterpart of 2A0311-227
NASA Technical Reports Server (NTRS)
Williams, G.; Hiltner, W. A.
1980-01-01
Visual and blue light curves are presented for the optical counterpart of the X-ray source 2A0311-227. This system, which is the newest member of the AM Herculis class of binaries, has an orbital period of 81 minutes which also modulates the visual light curve. A Fourier analysis of the data has revealed the presence of a 6-minute oscillation, at least in the visual light curve. Whether or not it is also present in the blue light curve is unclear.
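The period-search step can be illustrated with a discrete Fourier transform of an evenly sampled light curve; the published analysis used Fourier techniques on the actual photometry, whereas the series, amplitudes, and sampling below are synthetic stand-ins.

```python
import numpy as np

dt_min = 0.5                                   # sampling interval, minutes
t = np.arange(0, 240, dt_min)                  # 4 hours of synthetic photometry
orbital = 0.3 * np.sin(2 * np.pi * t / 81.0)   # 81-min orbital modulation
qpo = 0.05 * np.sin(2 * np.pi * t / 6.0)       # 6-min oscillation
mag = orbital + qpo + np.random.default_rng(1).normal(0, 0.02, t.size)

freq = np.fft.rfftfreq(t.size, d=dt_min)       # cycles per minute
power = np.abs(np.fft.rfft(mag - mag.mean())) ** 2
peak = freq[1:][power[1:].argmax()]
print(f"strongest period: {1.0 / peak:.1f} min")
# the 6-min signal shows up as a secondary peak near 1/6 cycles per minute
```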
Simonsohn, Uri; Simmons, Joseph P; Nelson, Leif D
2015-12-01
When studies examine true effects, they generate right-skewed p-curves, distributions of statistically significant results with more low (.01s) than high (.04s) p values. What else can cause a right-skewed p-curve? First, we consider the possibility that researchers report only the smallest significant p value (as conjectured by Ulrich & Miller, 2015), concluding that it is a very uncommon problem. We then consider more common problems, including (a) p-curvers selecting the wrong p values, (b) fake data, (c) honest errors, and (d) ambitiously p-hacked (beyond p < .05) results. We evaluate the impact of these common problems on the validity of p-curve analysis, and provide practical solutions that substantially increase its robustness. (c) 2015 APA, all rights reserved.
Using Count Data and Ordered Models in National Forest Recreation Demand Analysis
NASA Astrophysics Data System (ADS)
Simões, Paula; Barata, Eduardo; Cruz, Luis
2013-11-01
This research addresses the need to improve our knowledge on the demand for national forests for recreation and offers an in-depth data analysis supported by the complementary use of count data and ordered models. From a policy-making perspective, while count data models enable the estimation of monetary welfare measures, ordered models allow for the wider use of the database and provide a more flexible analysis of data. The main purpose of this article is to analyse the individual forest recreation demand and to derive a measure of its current use value. To allow a more complete analysis of the forest recreation demand structure the econometric approach supplements the use of count data models with ordered category models using data obtained by means of an on-site survey in the Bussaco National Forest (Portugal). Overall, both models reveal that travel cost and substitute prices are important explanatory variables, visits are a normal good and demographic variables seem to have no influence on demand. In particular, estimated price and income elasticities of demand are quite low. Accordingly, it is possible to argue that travel cost (price) in isolation may be expected to have a low impact on visitation levels.
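The count-data side of such an analysis can be sketched with a Poisson travel-cost model: for a semi-log trip demand, the per-trip consumer surplus is -1/beta on the travel-cost coefficient. The data below are simulated and the coefficients are illustrative, not the Bussaco estimates; the IRLS loop is a plain numpy stand-in for a GLM routine.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 400
travel_cost = rng.uniform(2, 40, n)          # cost per visit (monetary units)
income = rng.uniform(10, 60, n)              # thousands of monetary units
trips = rng.poisson(np.exp(1.2 - 0.06 * travel_cost + 0.005 * income))

X = np.column_stack([np.ones(n), travel_cost, income])
beta = np.zeros(3)
for _ in range(25):                          # IRLS / Newton-Raphson for a Poisson GLM
    mu = np.exp(X @ beta)
    z = X @ beta + (trips - mu) / mu         # working response (log link)
    beta = np.linalg.solve(X.T @ (mu[:, None] * X), X.T @ (mu * z))

cs_per_trip = -1.0 / beta[1]                 # standard travel-cost welfare measure
print(f"travel-cost coefficient {beta[1]:.3f}, consumer surplus per trip {cs_per_trip:.2f}")
```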
Byon, Ha Do; Harrington, Donna; Storr, Carla L; Lipscomb, Jane
2017-08-01
Workplace violence research in health care settings using the Job Demands-Resources (JD-R) framework is hindered by the lack of comprehensive examination of the factor structure of the JD-R measure when it includes patient violence. Is patient violence a component of job demands or its own factor as an occupational outcome? Exploratory factor analysis and confirmatory factor analysis were conducted using a sample of direct care workers in the home setting (n = 961). The overall 2-construct JD-R structure persisted. Patient violence was not identified as a separate factor from job demands; rather, two demand factors emerged: violence/emotional and workload/physical demands. Although the three-factor model fits the data, the two-factor model with patient violence being a component of job demands is a parsimonious and effective measurement framework.
Robert F. Powers
1972-01-01
Four sets of standard site index curves based on statewide or regionwide averages were compared with data on natural growth from nine young stands of ponderosa pine in northern California. The curves tested were by Meyer; Dunning; Dunning and Reineke; and Arvanitis, Lindquist, and Palley. The effects of soils on height growth were also studied. Among the curves tested...
NASA Astrophysics Data System (ADS)
Wang, Meng; Zhang, Huaiqiang; Zhang, Kan
2017-10-01
This work addresses weapons portfolio planning in which short-term equipment usage demand and long-term development demand must be planned for jointly, together with the practical problem of fuzziness in the definition of equipment capacity demand. The expression of demand is assumed to be an interval number or a discrete number. Using epoch-era analysis, a long planning cycle is broken into several short planning cycles with different demand values. A multi-stage stochastic programming model is built that aims to maximize long-term planning-cycle demand under the constraints of budget, equipment development time, and short planning-cycle demand. A scenario tree is used to discretize the interval values of demand, and a genetic algorithm is designed to solve the problem. Finally, a case study demonstrates the feasibility and effectiveness of the proposed model.
Giraud, Nicolas; Blackledge, Martin; Goldman, Maurice; Böckmann, Anja; Lesage, Anne; Penin, François; Emsley, Lyndon
2005-12-28
A detailed analysis of nitrogen-15 longitudinal relaxation times in microcrystalline proteins is presented. A theoretical model to quantitatively interpret relaxation times is developed in terms of motional amplitude and characteristic time scale. Different averaging schemes are examined in order to propose an analysis of relaxation curves that takes into account the specificity of MAS experiments. In particular, it is shown that magic angle spinning averages the relaxation rate experienced by a single spin over one rotor period, resulting in individual relaxation curves that are dependent on the orientation of their corresponding carousel with respect to the rotor axis. Powder averaging thus leads to a nonexponential behavior in the observed decay curves. We extract dynamic information from experimental decay curves, using a diffusion in a cone model. We apply this study to the analysis of spin-lattice relaxation rates of the microcrystalline protein Crh at two different fields and determine differential dynamic parameters for several residues in the protein.
Demand and capacity planning in the emergency department: how to do it.
Higginson, I; Whyatt, J; Silvester, K
2011-02-01
Unless emergency departments have adequate capacity to meet demand, they will fail to meet clinical and performance standards and will be operating in the 'coping zone'. This carries risks both for staff and patients. As part of a quality improvement programme, the authors undertook an in-depth analysis of demand and capacity for an emergency department in the UK. The paper describes this rigorous approach to capacity planning, which draws on techniques from other industries. Proper capacity planning is vital, but is often poorly done. Planning using aggregated data will lead to inadequate capacity. Understanding demand, and particularly the variation in that demand, is critical to success. Analysis of emergency department demand and capacity is the first step towards effective workforce planning and process redesign.
Development of a Probabilistic Tsunami Hazard Analysis in Japan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Toshiaki Sakai; Tomoyoshi Takeda; Hiroshi Soraoka
2006-07-01
As in seismic design, it is meaningful for tsunami assessment to evaluate phenomena beyond the design basis: even once the design basis tsunami height is set, the actual tsunami height may still exceed it because of uncertainties regarding tsunami phenomena. Probabilistic tsunami risk assessment consists of estimating the tsunami hazard and the fragility of structures and executing a system analysis. In this report, we apply a method for probabilistic tsunami hazard analysis (PTHA). We introduce a logic tree approach to estimate tsunami hazard curves (relationships between tsunami height and probability of exceedance) and present an example for Japan. Examples of tsunami hazard curves are illustrated, and uncertainty in the tsunami hazard is displayed by 5-, 16-, 50-, 84- and 95-percentile and mean hazard curves. The results of PTHA will be used for quantitative assessment of the tsunami risk to important facilities located in coastal areas. Tsunami hazard curves are reasonable input data for structural and system analysis. However, the evaluation method for estimating the fragility of structures and the procedure for system analysis are still being developed. (authors)
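The fractile presentation described above can be sketched as follows: given exceedance-probability curves from individual logic-tree branches and their weights, compute the weighted mean and percentile hazard curves. The branch curves, weights, and exponential form below are illustrative, not the report's actual logic tree.

```python
import numpy as np

heights = np.linspace(0.5, 15.0, 30)                 # tsunami height, m
rng = np.random.default_rng(7)
n_branches = 50
scales = rng.lognormal(mean=1.0, sigma=0.4, size=n_branches)
branch_curves = np.exp(-heights[None, :] / scales[:, None])   # P(exceedance) per branch
weights = np.full(n_branches, 1.0 / n_branches)

def weighted_percentile(values, w, q):
    order = np.argsort(values)
    cw = np.cumsum(w[order])
    return np.interp(q / 100.0, cw / cw[-1], values[order])

mean_curve = weights @ branch_curves
fractiles = {q: np.array([weighted_percentile(branch_curves[:, j], weights, q)
                          for j in range(heights.size)])
             for q in (5, 16, 50, 84, 95)}
print("mean curve (first 3 heights):", np.round(mean_curve[:3], 3))
print("84th-percentile curve (first 3 heights):", np.round(fractiles[84][:3], 3))
```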
Advanced Distribution Network Modelling with Distributed Energy Resources
NASA Astrophysics Data System (ADS)
O'Connell, Alison
The addition of new distributed energy resources, such as electric vehicles, photovoltaics, and storage, to low voltage distribution networks means that these networks will undergo major changes in the future. Traditionally, distribution systems would have been a passive part of the wider power system, delivering electricity to the customer and not needing much control or management. However, the introduction of these new technologies may cause unforeseen issues for distribution networks, due to the fact that they were not considered when the networks were originally designed. This thesis examines different types of technologies that may begin to emerge on distribution systems, as well as the resulting challenges that they may impose. Three-phase models of distribution networks are developed and subsequently utilised as test cases. Various management strategies are devised for the purposes of controlling distributed resources from a distribution network perspective. The aim of the management strategies is to mitigate those issues that distributed resources may cause, while also keeping customers' preferences in mind. A rolling optimisation formulation is proposed as an operational tool which can manage distributed resources, while also accounting for the uncertainties that these resources may present. Network sensitivities for a particular feeder are extracted from a three-phase load flow methodology and incorporated into an optimisation. Electric vehicles are the focus of the work, although the method could be applied to other types of resources. The aim is to minimise the cost of electric vehicle charging over a 24-hour time horizon by controlling the charge rates and timings of the vehicles. The results demonstrate the advantage that controlled EV charging can have over an uncontrolled case, as well as the benefits provided by the rolling formulation and updated inputs in terms of cost and energy delivered to customers. Building upon the rolling optimisation, a three-phase optimal power flow method is developed. The formulation has the capability to provide optimal solutions for distribution system control variables, for a chosen objective function, subject to required constraints. It can, therefore, be utilised for numerous technologies and applications. The three-phase optimal power flow is employed to manage various distributed resources, such as photovoltaics and storage, as well as distribution equipment, including tap changers and switches. The flexibility of the methodology allows it to be applied in both an operational and a planning capacity. The three-phase optimal power flow is employed in an operational planning capacity to determine volt-var curves for distributed photovoltaic inverters. The formulation finds optimal reactive power settings for a number of load and solar scenarios and uses these reactive power points to create volt-var curves. Volt-var curves are determined for 10 PV systems on a test feeder. A universal curve is also determined which is applicable to all inverters. The curves are validated by testing them in a power flow setting over a 24-hour test period. The curves are shown to provide advantages to the feeder in terms of reduction of voltage deviations and unbalance, with the individual curves proving to be more effective. It is also shown that adding a new PV system to the feeder only requires analysis for that system. 
In order to represent the uncertainties that inherently occur on distribution systems, an information gap decision theory method is also proposed and integrated into the three-phase optimal power flow formulation. This allows for robust network decisions to be made using only an initial prediction for what the uncertain parameter will be. The work determines tap and switch settings for a test network with demand being treated as uncertain. The aim is to keep losses below a predefined acceptable value. The results provide the decision maker with the maximum possible variation in demand for a given acceptable variation in the losses. A validation is performed with the resulting tap and switch settings being implemented, and shows that the control decisions provided by the formulation keep losses below the acceptable value while adhering to the limits imposed by the network.
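A hedged sketch of the info-gap robustness idea described above: find the largest fractional deviation of demand from its forecast for which worst-case losses stay below an acceptable limit. The quadratic loss function, feeder demands, and 10% loss margin are toy stand-ins for the three-phase power flow and acceptable value used in the thesis.

```python
import numpy as np

demand_forecast = np.array([0.8, 1.1, 0.9, 1.3])   # MW per feeder section (toy)

def losses(demand):
    return 0.02 * np.sum(demand ** 2)              # toy loss model, MW

acceptable = 1.10 * losses(demand_forecast)        # losses allowed to rise 10%

def worst_case_losses(alpha):
    # for a convex, increasing loss model the worst case inside a fractional
    # uncertainty set of size alpha is demand at its upper envelope
    return losses(demand_forecast * (1.0 + alpha))

lo, hi = 0.0, 2.0
for _ in range(60):                                # bisection on the robustness horizon
    mid = 0.5 * (lo + hi)
    lo, hi = (mid, hi) if worst_case_losses(mid) <= acceptable else (lo, mid)
print(f"demand may deviate upward by up to {100*lo:.1f}% before losses exceed the limit")
```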
NASA Astrophysics Data System (ADS)
Latorre, Borja; Peña-Sancho, Carolina; Angulo-Jaramillo, Rafaël; Moret-Fernández, David
2015-04-01
Measurement of soil hydraulic properties is of paramount importance in fields such as agronomy, hydrology or soil science. Based on an analysis of the Haverkamp et al. (1994) model, the aim of this paper is to explain a technique to estimate the soil hydraulic properties (sorptivity, S, and hydraulic conductivity, K) from full-time cumulative infiltration curves. The method (NSH) was validated by means of 12 synthetic infiltration curves generated with HYDRUS-3D from known soil hydraulic properties. The K values used to simulate the synthetic curves were compared to those estimated with the proposed method. A procedure to identify and remove the effect of the contact sand layer on the cumulative infiltration curve was also developed. A sensitivity analysis was performed using the water level measurement as the uncertainty source. Finally, the procedure was evaluated using different infiltration times and data noise. Since a good correlation between the K values used in HYDRUS-3D to model the infiltration curves and those estimated by the NSH method was obtained (R² = 0.98), it can be concluded that this technique is robust enough to estimate the soil hydraulic conductivity from complete infiltration curves. The numerical procedure to detect and remove the influence of the contact sand layer on the K and S estimates seemed to be robust and efficient. An effect of infiltration curve noise on the K estimate was observed, with the uncertainty increasing as noise increased. Finally, the results showed that infiltration time was an important factor in estimating K: lower values of K, or smaller uncertainties, required longer infiltration times.
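The curve-fitting step can be sketched with a simpler stand-in for the Haverkamp formulation: the two-term Philip approximation I(t) = S*sqrt(t) + A*t, fitted by least squares. The NSH method above fits the full Haverkamp et al. (1994) model instead, and the relation between the linear-term coefficient A and K depends on the infiltration model adopted; times, parameters, and noise below are synthetic.

```python
import numpy as np
from scipy.optimize import curve_fit

def philip(t, S, A):
    return S * np.sqrt(t) + A * t

t = np.linspace(1, 3600, 200)                        # s
S_true, A_true = 1.2e-3, 2.0e-6                      # m s^-0.5, m s^-1
I_obs = philip(t, S_true, A_true) + np.random.default_rng(2).normal(0, 2e-5, t.size)

(S_hat, A_hat), cov = curve_fit(philip, t, I_obs, p0=[1e-3, 1e-6])
S_err, A_err = np.sqrt(np.diag(cov))
print(f"S = {S_hat:.2e} +/- {S_err:.1e} m/s^0.5, A = {A_hat:.2e} +/- {A_err:.1e} m/s")
# A is commonly interpreted as a fraction of K; the exact relation is model-dependent
```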
AtomicJ: An open source software for analysis of force curves
NASA Astrophysics Data System (ADS)
Hermanowicz, Paweł; Sarna, Michał; Burda, Kvetoslava; Gabryś, Halina
2014-06-01
We present an open source Java application for analysis of force curves and images recorded with the Atomic Force Microscope. AtomicJ supports a wide range of contact mechanics models and implements procedures that reduce the influence of deviations from the contact model. It generates maps of mechanical properties, including maps of Young's modulus, adhesion force, and sample height. It can also calculate stacks, which reveal how the sample's response to deformation changes with indentation depth. AtomicJ analyzes force curves concurrently on multiple threads, which allows for high speed of analysis. It runs on all popular operating systems, including Windows, Linux, and Macintosh.
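The kind of fit such software performs can be illustrated with the simplest case, a Hertz model for a paraboloidal tip, F = (4/3) * E/(1 - nu^2) * sqrt(R) * delta**1.5, on synthetic data. AtomicJ itself supports many contact models and corrections; the tip radius, Poisson ratio, and sample stiffness below are assumptions of this sketch only.

```python
import numpy as np
from scipy.optimize import curve_fit

R = 20e-9        # assumed tip radius, m
nu = 0.5         # Poisson ratio typical for soft samples (assumed)

def hertz(delta, E):
    return (4.0 / 3.0) * (E / (1.0 - nu ** 2)) * np.sqrt(R) * delta ** 1.5

delta = np.linspace(0, 200e-9, 100)                   # indentation, m
F = hertz(delta, 5e3) + np.random.default_rng(4).normal(0, 2e-11, delta.size)

(E_fit,), _ = curve_fit(hertz, delta, F, p0=[1e3])
print(f"apparent Young's modulus: {E_fit / 1e3:.2f} kPa")
```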
NASA Astrophysics Data System (ADS)
Schmidt, R. D.; Taylor, R. G.; Stodick, L. D.; Contor, B. A.
2009-12-01
A recent federal interagency report on climate change and water management (Brekke et. al., 2009) describes several possible management responses to the impacts of climate change on water supply and demand. Management alternatives include changes to water supply infrastructure, reservoir system operations, and water demand policies. Water users in the Bureau of Reclamation’s Boise Project (located in the Lower Boise River basin in southwestern Idaho) would be among those impacted both hydrologically and economically by climate change. Climate change and management responses to climate change are expected to cause shifts in water supply and demand. Supply shifts would result from changes in basin precipitation patterns, and demand shifts would result from higher evapotranspiration rates and a longer growing season. The impacts would also extend to non-Project water users in the basin, since most non-Project groundwater pumpers and drain water diverters rely on hydrologic externalities created by seepage losses from Boise Project water deliveries. An integrated hydrologic-economic model was developed for the Boise basin to aid Reclamation in evaluating the hydrologic and economic impacts of various management responses to climate change. A spatial, partial-equilibrium, economic optimization model calculates spatially-distinct equilibrium water prices and quantities, and maximizes a social welfare function (the sum of consumer and producers surpluses) for all agricultural and municipal water suppliers and demanders (both Project and non-Project) in the basin. Supply-price functions and demand-price functions are exogenous inputs to the economic optimization model. On the supply side, groundwater and river/reservoir models are used to generate hydrologic responses to various management alternatives. The response data is then used to develop water supply-price functions for Project and non-Project water users. On the demand side, crop production functions incorporating crop distribution, evapotranspiration rates, irrigation efficiencies, and crop prices are used to develop water demand-price functions for agricultural water users. Demand functions for municipal and industrial water users are also developed. Recent applications of the integrated model have focused on the hydrologic and economic impacts of demand management alternatives, including large-scale canal lining conservation measures, and market-based water trading between canal diverters and groundwater pumpers. A supply management alternative being investigated involves revising reservoir rule curves to compensate for climate change impacts on timing of reservoir filling.
A generic hydroeconomic model to assess future water scarcity
NASA Astrophysics Data System (ADS)
Neverre, Noémie; Dumas, Patrice
2015-04-01
We developed a generic hydroeconomic model able to confront future water supply and demand on a large scale, taking into account man-made reservoirs. The assessment is done at the scale of river basins, using only globally available data; the methodology can thus be generalized. On the supply side, we evaluate the impacts of climate change on water resources. The available quantity of water at each site is computed using the following information: runoff is taken from the outputs of CNRM climate model (Dubois et al., 2010), reservoirs are located using Aquastat, and the sub-basin flow-accumulation area of each reservoir is determined based on a Digital Elevation Model (HYDRO1k). On the demand side, agricultural and domestic demands are projected in terms of both quantity and economic value. For the agricultural sector, globally available data on irrigated areas and crops are combined in order to determine irrigated crops localization. Then, crops irrigation requirements are computed for the different stages of the growing season using Allen (1998) method with Hargreaves potential evapotranspiration. Irrigation water economic value is based on a yield comparison approach between rainfed and irrigated crops. Potential irrigated and rainfed yields are taken from LPJmL (Blondeau et al., 2007), or from FAOSTAT by making simple assumptions on yield ratios. For the domestic sector, we project the combined effects of demographic growth, economic development and water cost evolution on future demands. The method consists in building three-blocks inverse demand functions where volume limits of the blocks evolve with the level of GDP per capita. The value of water along the demand curve is determined from price-elasticity, price and demand data from the literature, using the point-expansion method, and from water costs data. Then projected demands are confronted to future water availability. Operating rules of the reservoirs and water allocation between demands are based on the maximization of water benefits, over time and space. A parameterisation-simulation-optimisation approach is used. This gives a projection of future water scarcity in the different locations and an estimation of the associated direct economic losses from unsatisfied demands. This generic hydroeconomic model can be easily applied to large-scale regions, in particular developing regions where little reliable data is available. We will present an application to Algeria, up to the 2050 horizon.
PMAnalyzer: a new web interface for bacterial growth curve analysis.
Cuevas, Daniel A; Edwards, Robert A
2017-06-15
Bacterial growth curves are essential representations for characterizing bacterial metabolism within a variety of media compositions. Using high-throughput spectrophotometers capable of processing tens of 96-well plates, quantitative phenotypic information can be easily integrated into the current data structures that describe a bacterial organism. The PMAnalyzer pipeline performs a growth curve analysis to parameterize the unique features occurring within microtiter wells containing specific growth media sources. We have expanded the pipeline capabilities and provide a user-friendly, online implementation of this automated pipeline. PMAnalyzer version 2.0 provides fast automatic growth curve parameter analysis, growth identification and high resolution figures of sample-replicate growth curves and several statistical analyses. PMAnalyzer v2.0 can be found at https://edwards.sdsu.edu/pmanalyzer/ . Source code for the pipeline can be found on GitHub at https://github.com/dacuevas/PMAnalyzer . Source code for the online implementation can be found on GitHub at https://github.com/dacuevas/PMAnalyzerWeb . dcuevas08@gmail.com. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press.
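A hedged sketch of the core operation such pipelines perform: fitting a parametric growth model to the OD readings from one well to obtain a growth rate, lag estimate, and maximum yield. The Zwietering-style logistic form and the synthetic readings below are illustrative; PMAnalyzer's own parameterization may differ.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, A, mu, lam):
    # A: maximum OD, mu: maximum specific growth rate (1/h), lam: lag time (h)
    return A / (1.0 + np.exp(4.0 * mu * (lam - t) / A + 2.0))

t = np.linspace(0, 24, 49)                                     # hours
od = logistic(t, 1.2, 0.4, 3.0) + np.random.default_rng(5).normal(0, 0.01, t.size)

(A, mu, lam), _ = curve_fit(logistic, t, od, p0=[1.0, 0.2, 1.0])
print(f"max OD {A:.2f}, growth rate {mu:.2f} 1/h, lag {lam:.1f} h")
```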
Longitudinal Models of Reliability and Validity: A Latent Curve Approach.
ERIC Educational Resources Information Center
Tisak, John; Tisak, Marie S.
1996-01-01
Dynamic generalizations of reliability and validity that will incorporate longitudinal or developmental models, using latent curve analysis, are discussed. A latent curve model formulated to depict change is incorporated into the classical definitions of reliability and validity. The approach is illustrated with sociological and psychological…
Simulator evaluation of manually flown curved instrument approaches. M.S. Thesis
NASA Technical Reports Server (NTRS)
Sager, D.
1973-01-01
Pilot performance in flying horizontally curved instrument approaches was analyzed by having nine test subjects fly curved approaches in a fixed-base simulator. Approaches were flown without an autopilot and without a flight director. Evaluations were based on deviation measurements made at a number of points along the curved approach path and on subject questionnaires. Results indicate that pilots can fly curved approaches, though less accurately than straight-in approaches; that a moderate wind does not effect curve flying performance; and that there is no performance difference between 60 deg. and 90 deg. turns. A tradeoff of curve path parameters and a paper analysis of wind compensation were also made.
Time dependence of solid-particle impingement erosion of an aluminum alloy
NASA Technical Reports Server (NTRS)
Veerabhadrarao, P.; Buckley, D. H.
1983-01-01
Erosion studies were conducted on 6061-T6511 aluminum alloy by using jet impingement of glass beads and crushed glass particles to investigate the influence of exposure time on volume loss rate at different pressures. The results indicate a direct relationship between erosion-versus-time curves and pit morphology (width, depth, and width-depth ratio)-versus-time curves for both glass forms. Extensive erosion data from the literature were analyzed to find the variations of erosion-rate-versus-time curves with respect to the type of device, the size and shape of erodent particles, the abrasive charge, the impact velocity, etc. Analysis of the experimental data, obtained with two forms of glass, resulted in three types of erosion-rate-versus-time curves: (1) curves with incubation, acceleration, and steady-state periods (type 1); (2) curves with incubation, acceleration, deceleration, and steady-state periods (type 3); and (3) curves with incubation, acceleration, peak rate, and deceleration periods (type 4). The type 4 curve is a less frequently seen curve and was not reported in the literature. Analysis of extensive literature data generally indicated three types of erosion-rate-versus-time curves. Two types (types 1 and 3) were observed in the present study; the third type involves incubation (and deposition), acceleration, and steady-state periods (type 2). Examination of the extensive literature data indicated that it is absolutely necessary to consider the corresponding stages or periods of erosion in correlating and characterizing erosion resistance of a wide spectrum of ductile materials.
Structural analysis of cylindrical thrust chambers, volume 1
NASA Technical Reports Server (NTRS)
Armstrong, W. H.
1979-01-01
Life predictions of regeneratively cooled rocket thrust chambers are normally derived from classical material fatigue principles. The failures observed in experimental thrust chambers do not appear to be due entirely to material fatigue. The chamber coolant walls in the failed areas exhibit progressive bulging and thinning during cyclic firings until the wall stress finally exceeds the material rupture stress and failure occurs. A preliminary analysis of an oxygen free high conductivity (OFHC) copper cylindrical thrust chamber demonstrated that the inclusion of cumulative cyclic plastic effects enables the observed coolant wall thinout to be predicted. The thinout curve constructed from the referent analysis of 10 firing cycles was extrapolated from the tenth cycle to the 200th cycle. The preliminary OFHC copper chamber 10-cycle analysis was extended so that the extrapolated thinout curve could be established by performing cyclic analysis of deformed configurations at 100 and 200 cycles. Thus the original range of extrapolation was reduced and the thinout curve was adjusted by using calculated thinout rates at 100 and 200 cycles. An analysis of the same undeformed chamber model constructed of half-hard Amzirc to study the effect of material properties on the thinout curve is included.
Montgomery, Anthony; Spânu, Florina; Băban, Adriana; Panagopoulou, Efharis
2015-01-01
According to the Job Demands-Resources (JD-R) model, burnout and engagement are psychological reactions that develop when individual characteristics interact with work characteristics. This study tests the JD-R model using multilevel analysis to test the main and moderating effects of teamwork effectiveness among 1156 nurses in 93 departments from seven European countries. Workload, emotional and organizational demands were positively associated with emotional exhaustion, depersonalization, and negatively with vigor. Emotional and organizational demands were negatively associated with dedication. Teamwork effectiveness was positively associated with engagement. We found no evidence for the moderating effect of teamwork effectiveness in reducing individual perceptions of demands. PMID:26877971
A computer program (MACPUMP) for interactive aquifer-test analysis
Day-Lewis, F. D.; Person, M.A.; Konikow, Leonard F.
1995-01-01
This report introduces MACPUMP (Version 1.0), an aquifer-test-analysis package for use with Macintosh computers. The report outlines the input-data format, describes the solutions encoded in the program, explains the menu items, and offers a tutorial illustrating the use of the program. The package reads list-directed aquifer-test data from a file, plots the data to the screen, generates and plots type curves for several different test conditions, and allows mouse-controlled curve matching. MACPUMP features pull-down menus, a simple text viewer for displaying data files, and optional on-line help windows. This version includes the analytical solutions for nonleaky and leaky confined aquifers, using both type curves and straight-line methods, and for the analysis of single-well slug tests using type curves. An executable version of the code and sample input data sets are included on an accompanying floppy disk.
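For reference, the nonleaky confined-aquifer type curve that underlies such curve matching is the Theis solution, s = (Q / 4*pi*T) * W(u) with u = r^2*S / (4*T*t). The sketch below evaluates it directly; the pumping rate, transmissivity, and storativity are illustrative values, not from the report.

```python
import numpy as np
from scipy.special import exp1   # exp1(u) is the well function W(u)

def theis_drawdown(t, r, Q, T, S):
    # t: time (s), r: radial distance (m), Q: pumping rate (m^3/s),
    # T: transmissivity (m^2/s), S: storativity (dimensionless)
    u = r ** 2 * S / (4.0 * T * t)
    return Q / (4.0 * np.pi * T) * exp1(u)

t = np.logspace(1, 5, 50)                     # 10 s to about 1 day
s = theis_drawdown(t, r=30.0, Q=0.01, T=5e-4, S=2e-4)
print(f"drawdown after {t[-1]:.0f} s: {s[-1]:.2f} m")
```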
Bianchi, Lorenzo; Schiavina, Riccardo; Borghesi, Marco; Bianchi, Federico Mineo; Briganti, Alberto; Carini, Marco; Terrone, Carlo; Mottrie, Alex; Gacci, Mauro; Gontero, Paolo; Imbimbo, Ciro; Marchioro, Giansilvio; Milanese, Giulio; Mirone, Vincenzo; Montorsi, Francesco; Morgia, Giuseppe; Novara, Giacomo; Porreca, Angelo; Volpe, Alessandro; Brunocilla, Eugenio
2018-04-06
To assess the predictive accuracy and the clinical value of a recent nomogram predicting cancer-specific mortality-free survival after surgery in pN1 prostate cancer patients through an external validation. We evaluated 518 prostate cancer patients treated with radical prostatectomy and pelvic lymph node dissection with evidence of nodal metastases at final pathology, at 10 tertiary centers. External validation was carried out using regression coefficients of the previously published nomogram. The performance characteristics of the model were assessed by quantifying predictive accuracy, as the area under the receiver operating characteristic curve, and model calibration. Furthermore, we systematically analyzed the specificity, sensitivity, positive predictive value and negative predictive value for each nomogram-derived probability cut-off. Finally, we implemented decision curve analysis in order to quantify the nomogram's clinical value in routine practice. External validation showed inferior predictive accuracy compared with the internal validation (65.8% vs 83.3%, respectively). The discrimination (area under the curve) of the multivariable model was 66.7% (95% CI 60.1-73.0%) on receiver operating characteristic curve analysis. The calibration plot showed an overestimation throughout the range of predicted cancer-specific mortality-free survival probabilities. However, in decision curve analysis, the nomogram's use showed a net benefit when compared with the scenarios of treating all patients or none. In an external setting, the nomogram showed inferior predictive accuracy and suboptimal calibration characteristics compared with those reported in the original population. However, decision curve analysis showed a clinical net benefit, suggesting that the nomogram can still help to correctly manage pN1 prostate cancer patients after surgery. © 2018 The Japanese Urological Association.
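A minimal sketch of the decision curve analysis referred to above: net benefit of acting on predicted risks, compared with treat-all and treat-none strategies, across threshold probabilities. The predictions and outcomes are simulated, not the study's cohort.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 518
risk = rng.beta(2, 5, n)                      # simulated predicted probabilities
died = rng.random(n) < risk                   # simulated outcomes

def net_benefit(pred, outcome, pt):
    treat = pred >= pt
    tp = np.sum(treat & outcome)
    fp = np.sum(treat & ~outcome)
    return tp / len(pred) - fp / len(pred) * pt / (1.0 - pt)

prevalence = died.mean()
for pt in (0.1, 0.2, 0.3):
    nb_model = net_benefit(risk, died, pt)
    nb_all = prevalence - (1 - prevalence) * pt / (1 - pt)   # treat everyone
    print(f"pt={pt:.1f}: model {nb_model:.3f}, treat-all {nb_all:.3f}, treat-none 0.000")
```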
Electrical Assessment, Capacity, and Demand Study for Fort Wainwright, Alaska
2007-09-01
[Tabulated load and steam-rate data omitted: the legibility of the original is extremely marginal, and the steam rates may be incorrect.] Several important lessons can be learned from the curves. Loads with inductive motors and little or no compensating capacitance would have a power factor (PF) of less than one. PFs assumed for design are commonly taken as 0.80.
AceCloud: Molecular Dynamics Simulations in the Cloud.
Harvey, M J; De Fabritiis, G
2015-05-26
We present AceCloud, an on-demand service for molecular dynamics simulations. AceCloud is designed to facilitate the secure execution of large ensembles of simulations on an external cloud computing service (currently Amazon Web Services). The AceCloud client, integrated into the ACEMD molecular dynamics package, provides an easy-to-use interface that abstracts all aspects of interaction with the cloud services. This gives the user the experience that all simulations are running on their local machine, minimizing the learning curve typically associated with the transition to using high performance computing services.
Corominas, Albert; Fossas, Enric
2015-01-01
We assume a monopolistic market for a non-durable non-renewable resource such as crude oil, phosphates or fossil water. Stating the problem of obtaining optimal policies on extraction and pricing of the resource as a non-linear program allows general conclusions to be drawn under diverse assumptions about the demand curve, discount rates and length of the planning horizon. We compare the results with some common beliefs about the pace of exhaustion of this kind of resources.
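A hedged sketch of the kind of non-linear program the authors describe: a monopolist chooses per-period extraction to maximize discounted revenue from a fixed stock. The linear inverse-demand curve, zero extraction cost, discount rate, horizon, and stock below are assumptions of this example, not the paper's specification.

```python
import numpy as np
from scipy.optimize import minimize

a, b = 100.0, 2.0          # inverse demand p = a - b*q
r = 0.05                   # discount rate
T = 20                     # planning horizon, periods
stock = 300.0              # total resource available

disc = (1.0 + r) ** -np.arange(T)

def neg_npv(q):
    # negative discounted revenue (to be minimized)
    return -np.sum(disc * (a - b * q) * q)

res = minimize(neg_npv, x0=np.full(T, stock / T),
               bounds=[(0.0, None)] * T,
               constraints=[{"type": "ineq", "fun": lambda q: stock - q.sum()}])
print("extraction path (first 5 periods):", np.round(res.x[:5], 2))
print("NPV:", round(-res.fun, 1))
```

With the stock constraint binding, the optimal extraction path declines over time, the familiar Hotelling-type pattern.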
Using TPO data to estimate timber demand in support of planning on the Tongass National Forest
Jean M. Daniels; Michael D. Paruszkiewicz; Susan J. Alexander
2015-01-01
Projections of Alaska timber products output, the derived demand for logs, lumber, residues, and niche products, and timber harvest by owner are developed by using a trend-based analysis. This is the fifth such analysis performed since 1990 to assist planners in meeting statutory requirements for estimating planning cycle demand for timber from the Tongass National...
The validity of survivorship analysis in total joint arthroplasty.
Dorey, F; Amstutz, H C
1989-04-01
The use of survivorship analysis requires an assumption that patients who are lost to follow-up are no more or less likely to be at risk of failure of an operation or a procedure than are patients who are still being followed. This is a major assumption in long-term orthopaedic studies, in which a high percentage of patients are usually lost to follow-up. We compared the survivorship curve for the first 100 Tharies replacements done at our institution (which were completed by September 1977), using data that were collected in the standard way up to 1985, through a letter requesting a follow-up visit, with the curve for the same patients that was based on almost complete follow-up data that were gathered by telephone from 1985 on. The similarity of the two curves suggested that the assumptions that are necessary for the validity of survivorship analysis are reasonable, even in the orthopaedic setting, in which many patients are lost to follow-up. The usefulness of the survivorship curve for prediction was also evaluated by comparing the curve based on the first forty-six of the 100 Tharies replacements (before 1977) with the curve based on the last fifty-four such operations (from January 1977 to September 1977). The results of these two comparisons suggest that survivorship analysis is a valid technique to use in the long-term evaluation of patients who have had a joint replacement.
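A brief sketch of the survivorship (Kaplan-Meier) calculation underlying such curves, with censoring for patients lost to follow-up; the follow-up times and events below are toy values, not the Tharies series.

```python
import numpy as np

# follow-up time (years) and event indicator (1 = revision/failure, 0 = censored)
time  = np.array([1.0, 2.5, 3.0, 3.0, 4.2, 5.0, 6.1, 7.0, 7.5, 8.0])
event = np.array([  0,   1,   0,   1,   1,   0,   1,   0,   0,   1])

surv = 1.0
for t_i in np.unique(time[event == 1]):      # distinct failure times
    n_risk = np.sum(time >= t_i)             # patients still being followed at t_i
    d_i = np.sum((time == t_i) & (event == 1))
    surv *= 1.0 - d_i / n_risk               # Kaplan-Meier product-limit step
    print(f"t = {t_i:.1f} y: estimated survivorship {surv:.3f}")
```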
Upper arm circumference development in Chinese children and adolescents: a pooled analysis.
Tong, Fang; Fu, Tong
2015-05-30
Upper arm development in children differs among ethnic groups. There have been few reports on upper arm circumference (UAC) at different stages of development in children and adolescents in China. The purpose of this study was to provide a growth reference with a weighted assessment of the overall level of development. Using a pooled analysis of an authoritative journal database search and reports of UAC, we created a new database of developmental measures in children. In the weighted analysis, we compared reference values for 0-60 months of development against World Health Organization (WHO) statistics, considering gender and nationality, and used Z values as interval values for a second sampling to obtain an exponentially smoothed curve for analysis of the mean, standard deviation, and sites of attachment. Ten articles were included in the pooled analysis, with participants from different areas of China. The point of intersection with the WHO curve was 3.5 years, with higher values at earlier ages and lower values at older ages. The boys' curve was steeper after puberty. The curves from the individual studies merged into a compatible line. The Z values from exponential smoothing showed that the curves were similar for body weight and had a right normal distribution. The integrated index of UAC in Chinese children and adolescents showed slight regional variation. Exponential curve smoothing was suitable for assessment at different developmental stages.
Retrospective North American CFL Experience Curve Analysis and Correlation to Deployment Programs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Sarah J.; Wei, Max; Sohn, Michael D.
Retrospective experience curves are a useful tool for understanding historic technology development, and can contribute to investment program analysis and future cost estimation efforts. This work documents our development of an analysis approach for deriving retrospective experience curves with a variable learning rate, and its application to develop an experience curve for compact fluorescent lamps for the global and North American markets over the years 1990-2007. Uncertainties and assumptions involved in interpreting data for our experience curve development are discussed, including the processing and transformation of empirical data, the selection of system boundaries, and the identification of historical changes in the learning rate over the course of 15 years. In the results that follow, we find that the learning rate has changed at least once from 1990-2007. We also explore if, and to what degree, public deployment programs may have contributed to an increased technology learning rate in North America. We observe correlations between the changes in the learning rate and the initiation of new policies, abrupt technological advances, including improvements to ballast technology, and economic and political events such as trade tariffs and electricity prices. Finally, we discuss how the findings of this work (1) support the use of segmented experience curves for retrospective and prospective analysis and (2) may imply that investments in technological research and development have contributed to a change in market adoption and penetration.
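Experience curves of this kind are commonly fit as a power law of cumulative production, price = a*Q^(-b), with the learning rate defined as 1 - 2^(-b), the fractional price drop per doubling of cumulative output. A minimal sketch with made-up price and shipment figures (not the report's data):

import numpy as np

# hypothetical (cumulative units shipped, average unit price) pairs
cum_units = np.array([1e6, 3e6, 1e7, 3e7, 1e8, 3e8, 1e9])
price     = np.array([12.0, 9.5, 7.2, 5.8, 4.4, 3.6, 2.9])

# log-log least squares: log(price) = log(a) - b*log(Q)
slope, log_a = np.polyfit(np.log(cum_units), np.log(price), 1)
b = -slope
learning_rate = 1.0 - 2.0 ** (-b)      # fractional price drop per doubling of output
print(f"experience exponent b = {b:.3f}, learning rate = {learning_rate:.1%}")

# a segmented (variable-learning-rate) fit could repeat the regression on subranges,
# e.g. before and after a candidate breakpoint, and compare residuals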
Cognitive task demands, self-control demands and the mental well-being of office workers.
Bridger, Robert S; Brasher, Kate
2011-09-01
The cognitive task demands of office workers and the self-control demands of their work roles were measured in a sample of 196 employees in two different office layouts using a self-report questionnaire, which was circulated electronically. Multiple linear regression analysis revealed that both factors were associated with mental well-being, but not with physical well-being, while controlling for exposure to psychosocial stressors. The interaction between cognitive task demands and self-control demands had the strongest association with mental well-being, suggesting that the deleterious effect of one was greater when the other was present. An exploratory analysis revealed that the association was stronger for employees working in a large open-plan office than for those working in smaller offices with more privacy. Frustration of work goals was the cognitive task demand having the strongest negative impact on mental well-being. Methodological limitations and scale psychometrics (particularly the use of the NASA Task Load Index) are discussed. STATEMENT OF RELEVANCE: Modern office work has high mental demands and low physical demands and there is a need to design offices to prevent adverse psychological reactions. It is shown that cognitive task demands interact with self-control demands to degrade mental well-being. The association was stronger in an open-plan office.
Parallel Curves: Getting There and Getting Back
ERIC Educational Resources Information Center
Agnew, A. F.; Mathews, J. H.
2006-01-01
This note takes up the issue of parallel curves while illustrating the utility of "Mathematica" in computations. This work complements results presented earlier. The presented treatment, considering the more general case of parametric curves, provides an analysis of the appearance of cusp singularities, and emphasizes the utility of symbolic…
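For a regular parametric curve (x(t), y(t)), the parallel curve at offset d is obtained by stepping along the unit normal, and cusps appear for inward offsets once d reaches the local radius of curvature. A minimal numerical sketch (an ellipse is used purely for illustration; the note itself works symbolically in Mathematica):

import numpy as np

t = np.linspace(0.0, 2.0 * np.pi, 400)
x, y = 2.0 * np.cos(t), np.sin(t)          # example curve: an ellipse

dx, dy = np.gradient(x, t), np.gradient(y, t)
speed = np.hypot(dx, dy)
nx, ny = dy / speed, -dx / speed           # unit normal (outward for this parametrization)

d = 0.6                                    # offset distance; an inward offset larger than the
x_off, y_off = x + d * nx, y + d * ny      # minimum radius of curvature would produce cusps

print(x_off[:3], y_off[:3])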
The Simulation of an Oxidation-Reduction Titration Curve with Computer Algebra
ERIC Educational Resources Information Center
Whiteley, Richard V., Jr.
2015-01-01
Although the simulation of an oxidation/reduction titration curve is an important exercise in an undergraduate course in quantitative analysis, that exercise is frequently simplified to accommodate computational limitations. With the use of readily available computer algebra systems, however, such curves for complicated systems can be generated…
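A minimal sketch of such a simulation (not the article's code) for a classic case, Fe2+ titrated with Ce4+, using the Nernst equation before and after the equivalence point; the formal potentials are approximate values assumed for illustration:

import numpy as np

E_fe, E_ce, s = 0.771, 1.70, 0.05916      # V; assumed formal potentials and Nernst slope at 25 C
C_fe, V_fe = 0.10, 50.0                   # mol/L and mL of Fe2+ analyte
C_ce = 0.10                               # mol/L titrant
V_eq = C_fe * V_fe / C_ce                 # equivalence volume, mL

for V in np.arange(5.0, 100.0, 5.0):      # titrant volume added, mL
    if V < V_eq:                          # before equivalence: the Fe3+/Fe2+ couple fixes E
        frac = V / V_eq
        E = E_fe + s * np.log10(frac / (1.0 - frac))
    elif V > V_eq:                        # after equivalence: the Ce4+/Ce3+ couple fixes E
        E = E_ce + s * np.log10((V - V_eq) / V_eq)
    else:                                 # at equivalence (1:1 electron stoichiometry)
        E = (E_fe + E_ce) / 2.0
    print(f"V = {V:5.1f} mL   E = {E:.3f} V")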
NASA Astrophysics Data System (ADS)
Zuo, Ye; Sun, Guangjun; Li, Hongjing
2018-01-01
Under the action of near-fault ground motions, curved bridges are prone to pounding, local damage of bridge components, and even unseating. A multi-scale fine finite element model of a typical three-span curved bridge is established by considering the elastic-plastic behavior of the piers and the pounding effect of adjacent girders. The nonlinear time-history method is used to study the seismic response of the curved bridge equipped with an unseating failure control system under near-fault ground motion. An in-depth analysis is carried out to evaluate the control effect of the proposed unseating failure control system. The research results indicate that under near-fault ground motion, the seismic response of the curved bridge is strong, and the unseating failure control system performs effectively in reducing the pounding force between adjacent girders and the probability of deck unseating.
Mathematical and Statistical Software Index.
1986-08-01
Index fragment (statistical routine listings): ...(geometric) mean; HMEAN - harmonic mean; MEDIAN - median; MODE - mode; QUANT - quantiles; OGIVE - distribution curve; IQRNG - interpercentile range; RANGE - range; further entries cover multiphase pivoting algorithms, cross-classification, multiple discriminant analysis, cross-tabulation, multiple-objective models, and curve fitting; RANGEX (Correct Correlations for Curtailment of Range); RUMMAGE II (Analysis...).
A Software Tool for the Rapid Analysis of the Sintering Behavior of Particulate Bodies
2017-11-01
Abstract fragment: ...bounded by a region that the user selects via cross hairs. Future plot analysis features, such as more complicated curve fitting and modeling functions... Cited reference: German RM. Grain growth behavior of tungsten heavy alloys based on the master sintering curve concept. Metallurgical and Materials Transactions A.
Modeling of cumulative ash curve in hard red spring wheat
USDA-ARS?s Scientific Manuscript database
Analysis of cumulative ash curves (CAC) is very important for evaluation of milling quality of wheat and blending different millstreams for specific applications. The aim of this research was to improve analysis of CAC. Five hard red spring wheat genotype composites from two regions were milled on...
Probability Density Functions of Observed Rainfall in Montana
NASA Technical Reports Server (NTRS)
Larsen, Scott D.; Johnson, L. Ronald; Smith, Paul L.
1995-01-01
The question of whether a rain rate probability density function (PDF) can vary uniformly between precipitation events is examined. Image analysis of large samples of radar echoes is possible because of advances in technology. The data provided by such an analysis readily allow the development of distributions of radar reflectivity factor (and, by extension, rain rate). Finding a PDF becomes a matter of finding a function that describes the curve approximating the resulting distributions. Ideally, one PDF would exist for all cases, or many PDFs would share the same functional form with only systematic variations in parameters (such as size or shape). Satisfying either of these cases would validate the theoretical basis of the Area Time Integral (ATI). Using the method of moments and Elderton's curve selection criteria, the Pearson Type 1 equation was identified as a potential fit for 89% of the observed distributions. Further analysis indicates that the Type 1 curve approximates the shape of the distributions but does not produce a quantitatively close fit.
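Pearson Type 1 is a shifted, scaled beta distribution, so a bare-bones method-of-moments fit can be sketched as below; the rain-rate sample is a synthetic stand-in, and Elderton's full curve-selection criteria are not reproduced.

import numpy as np

rng = np.random.default_rng(0)
rain = rng.gamma(shape=2.0, scale=3.0, size=500)   # stand-in for observed rain rates (mm/h)

# rescale the sample to [0, 1] and match the first two moments of a beta law
lo, hi = rain.min(), rain.max()
z = (rain - lo) / (hi - lo)
m, v = z.mean(), z.var()
common = m * (1.0 - m) / v - 1.0
alpha, beta = m * common, (1.0 - m) * common
print(f"support = [{lo:.2f}, {hi:.2f}] mm/h, alpha = {alpha:.2f}, beta = {beta:.2f}")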
Life cycle based analysis of demands and emissions for residential water-using appliances.
Lee, Mengshan; Tansel, Berrin
2012-06-30
Environmental impacts of energy and water demand and greenhouse gas emissions from three residential water-using appliances were analyzed using a life cycle assessment (LCA) based approach combined with an economic input-output model. This study focused especially on indirect consumption and on the environmental impacts of the end-use/demand phase of each appliance. Water-related activities such as water supply, water heating, and wastewater treatment were included in the analysis. The results showed that the environmental impacts of the end-use/demand phase are the most significant for the water system, particularly the energy demand for water heating (73% for clothes washers and 93% for showerheads). Reducing water and hot water consumption during the end-use/demand phase is expected to improve the overall water-related energy burden and the sustainability of water use. In the analysis of optimal appliance lifespans, the values estimated using an energy consumption balance approach (8-21 years) were found to be lower than those obtained using other methods (10-25 years). This implies that earlier replacement with higher-efficiency models is encouraged to minimize the environmental impacts of the product. Copyright © 2012 Elsevier Ltd. All rights reserved.
Wilson, Kristina; Senay, Ibrahim; Durantini, Marta; Sánchez, Flor; Hennessy, Michael; Spring, Bonnie; Albarracín, Dolores
2015-03-01
A meta-analysis of 150 research reports summarizing the results of multiple behavior domain interventions examined theoretical predictions about the effects of the included number of recommendations on behavioral and clinical change in the domains of smoking, diet, and physical activity. The meta-analysis yielded 3 main conclusions. First, there is a curvilinear relation between the number of behavioral recommendations and improvements in behavioral and clinical measures, with a moderate number of recommendations producing the highest level of change. A moderate number of recommendations is likely to be associated with stronger effects because the intervention ensures the necessary level of motivation to implement the recommended changes, thereby increasing compliance with the goals set by the intervention, without making the intervention excessively demanding. Second, this curve was more pronounced when samples were likely to have low motivation to change, such as when interventions were delivered to nonpatient (vs. patient) populations, were implemented in nonclinic (vs. clinic) settings, used lay community (vs. expert) facilitators, and involved group (vs. individual) delivery formats. Finally, change in behavioral outcomes mediated the effects of number of recommended behaviors on clinical change. These findings provide important insights that can help guide the design of effective multiple behavior domain interventions. PsycINFO Database Record (c) 2015 APA, all rights reserved.
Active transportation and demand management (ATDM) foundational research : analysis plan.
DOT National Transportation Integrated Search
2013-06-01
As part of the Federal Highway Administration's (FHWA's) Active Transportation and Demand Management (ATDM) Foundational Research, this publication presents a high-level analysis approach to evaluate four illustrative described in the AMS CONOPS ...
NASA Astrophysics Data System (ADS)
Seah, Lay Hoon; Clarke, David John; Hart, Christina Eugene
2014-04-01
This case study of a science lesson, on the topic of thermal expansion, examines the language demands on students from an integrated science and language perspective. The data were generated during a sequence of 9 lessons on the topic of 'States of Matter' in a Grade 7 classroom (12- to 13-year-old students). We identify the language demands by comparing students' writings with the scientific account of expansion that the teacher intended the students to learn. The comparison involved both content analysis and lexicogrammatical (LG) analysis. The framework of Systemic Functional Linguistics was adopted for the LG analysis. Our analysis reveals differences in meaning and in the way LG resources were employed between the students' writings and the scientific account. From these differences, we found the notion of condition-of-use for LG resources to be a significant aspect of the language that students need to appropriate in order to employ the language of school science appropriately. This notion potentially provides a means by which teachers could concurrently address the conceptual and representational demands of science learning. Finally, we reflect on how the complementary use of content analysis and LG analysis provides a way of integrating the science and language perspectives in order to understand the demands of learning science through language.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ubelaker, D H; Buchholz, B A; Stewart, J
Radiocarbon dating, with special reference to the modern bomb-curve, can provide useful information to elucidate the date of death of skeletonized human remains. Interpretation can be enhanced with analysis of different types of tissues within a single skeleton because of the known variability of formation times and remodeling rates. Analysis of the radiocarbon content of teeth, especially the enamel of tooth crowns, provides information about the date of formation during the childhood years and, in consideration of the known timing of tooth formation, can be used to estimate a birth date after 1950 A.D. Radiocarbon analysis of modern cortical and trabecular bone samples from the same skeleton may allow proper placement on the pre-1963 or post-1963 side of the bomb-curve, since most trabecular bone generally undergoes more rapid remodeling than does most cortical bone. Pre-1963 bone formation would produce higher radiocarbon values for most trabecular bone than for most cortical bone; this relationship is reversed for formation after 1963. Radiocarbon analysis was conducted in this study on dental, cortical, and trabecular bone samples from two adult individuals of known birth (1925 and 1926) and death dates (1995 and 1959). As expected, the dental results correspond to pre-bomb values, reflecting conditions during the childhoods of the individuals. The radiocarbon content of most bone samples reflected the higher modern bomb-curve values. Within the bone sample analyses, the values for trabecular bone were higher than those for cortical bone and supported the known placement on the pre-1963 side of the bomb-curve.
Pizones, Javier; Martín-Buitrago, Mar Pérez; Sánchez Márquez, José Miguel; Fernández-Baíllo, Nicomedes; Baldan-Martin, Montserrat; Sánchez Pérez-Grueso, Francisco Javier
Retrospective comparative analysis. Study early-onset scoliosis (EOS) graduated patients to establish founded criteria for graduation decision making and determine the risks and benefits of definitive fusion. EOS is treated by growth-friendly techniques until skeletal maturity. Afterwards, patients can be "graduated," either by definitive fusion (posterior spinal fusion [PSF]) or by retaining the previous implants (Observation) with no additional surgery. Criteria for this decision making and the outcomes of definitive fusion are still underexplored. We analyzed a consecutive cohort of "graduated" patients after a distraction-based lengthening program. We gathered demographic, radiographic, and surgical data. The results of the two final treatment options were compared after 2 years' follow-up. A total of 32 patients were included. Four patients had incomplete records. Thirteen underwent PSF, and 15 were observed. The mean age at initial treatment was 8 ± 3 years, with a mean follow-up of 8.3 ± 2.9 years. Both groups had similar preoperative and final radiographic parameters (p > .05). The criteria for undergoing PSF were as follows: implant-related complications, main curve magnitude (PSF = 63.2° ± 9° vs. OBS = 47.9° ± 15°; p = .008), curve progression >10°, and sagittal misalignment (SVA). During PSF 12/13 patients underwent multiple osteotomies, one vertebrectomy, and 3 costoplasties. Surgical time was 291.5 ± 58 minutes; blood loss was 946 ± 375 mL; and the number of levels fused was 13.7. Coronal deformity was corrected 31%, T1-S1 length gained was 31 ± 19.6 mm and T1-T12 length gained was 9.3 ± 39 mm; kyphosis was reduced by 22%. However, coronal balance worsened by 2.3 ± 30.8 mm. No major complications were encountered in these patients. Graduation by PSF depended on unacceptable or progressive major curve deformity, sagittal misalignment, or complications with previous implants. Observation depended on curve stabilization, Cobb <50°, and coronal misalignment <20 mm. Definitive fusion effectively corrected coronal and sagittal deformity and increased trunk height. However, it exposed patients to a very demanding surgery without improvement in coronal balance. Level III, therapeutic. Copyright © 2017 Scoliosis Research Society. Published by Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Coulter, Kyle Jane, Ed.; Stanton, Marge, Ed.
Information on the current and projected supply of and demand for graduates of higher education in the food and agricultural sciences is presented, based on federal data bases. The supply data are aggregated by 11 educational clusters, and employment demand data are aggregated by eight occupational clusters. Analysis reveals imbalances in the…
NASA Astrophysics Data System (ADS)
Mazza, Fabio
2017-08-01
The curved surface sliding (CSS) system is one of the most in-demand techniques for the seismic isolation of buildings; yet there are still important aspects of its behaviour that need further attention. The CSS system presents variation of friction coefficient, depending on the sliding velocity of the CSS bearings, while friction force and lateral stiffness during the sliding phase are proportional to the axial load. Lateral-torsional response needs to be better understood for base-isolated structures located in near-fault areas, where fling-step and forward-directivity effects can produce long-period (horizontal) velocity pulses. To analyse these aspects, a six-storey reinforced concrete (r.c.) office framed building, with an L-shaped plan and setbacks in elevation, is designed assuming three values of the radius of curvature for the CSS system. Seven in-plan distributions of dynamic-fast friction coefficient for the CSS bearings, ranging from a constant value for all isolators to a different value for each, are considered in the case of low- and medium-type friction properties. The seismic analysis of the test structures is carried out considering an elastic-linear behaviour of the superstructure, while a nonlinear force-displacement law of the CSS bearings is considered in the horizontal direction, depending on sliding velocity and axial load. Given the lack of knowledge of the horizontal direction at which near-fault ground motions occur, the maximum torsional effects and residual displacements are evaluated with reference to different incidence angles, while the orientation of the strongest observed pulses is considered to obtain average values.
Are driving and overtaking on right curves more dangerous than on left curves?
Othman, Sarbaz; Thomson, Robert; Lannér, Gunnar
2010-01-01
It is well known that crashes on horizontal curves are a cause for concern in all countries due to the frequency and severity of crashes at curves compared to road tangents. A recent study of crashes in western Sweden reported a higher rate of crashes in right curves than in left curves. To further understand this result, this paper reports the results of novel analyses of the responses of vehicles and drivers during negotiating and overtaking maneuvers on curves for right-hand traffic. The overall objectives of the study were to find road parameters for curves that affect vehicle dynamic responses, to analyze these responses during overtaking maneuvers on curves, and to link the results with driver behavior for different curve directions. The studied road features were speed, super-elevation, radius, and friction, including their interactions, while the analyzed vehicle dynamic factors were lateral acceleration and yaw angular velocity. A simulation program, PC-Crash, has been used to simulate road parameter and vehicle response interaction in curves. Overtaking maneuvers have been simulated for all road feature combinations in a total of 108 runs. Analysis of variance (ANOVA) was performed, using a two-sided randomized block design, to find differences in vehicle responses for the curve parameters. To study driver response, a field test using an instrumented vehicle and 32 participants was reviewed, as it contained longitudinal speed and acceleration data for analysis. The simulation results showed that road features affect overtaking performance in right and left curves differently. Overtaking on right curves was sensitive to radius and to the interaction of radius with road condition, while overtaking on left curves was more sensitive to super-elevation. Comparisons of lateral acceleration and yaw angular velocity during these maneuvers showed different vehicle response configurations depending on curve direction and maneuver path. The field test experiments also showed that drivers behave differently depending on the curve direction, with both speed and acceleration higher on right than on left curves. The implication of this study is that curve direction should be taken into consideration to a greater extent when designing and redesigning curves. It appears that the driver and the vehicle are influenced by different infrastructure factors depending on the curve direction. In addition, the results suggest that the vehicle dynamics response alone cannot explain the higher crash risk in right curves. Further studies of the links between driver, vehicle, and highway characteristics are needed, such as naturalistic driving studies, to identify the key safety indicators for highway safety.
Optical Variability Analysis of UU Aqr - an Eclipsing Nova-like System
NASA Astrophysics Data System (ADS)
Khruzina, T.; Katysheva, N.; Golysheva, P.; Shugarov, S.
2015-12-01
By using our photometric observations of the nova-like system UU Aqr, whose light curve was unstable over a few nights, we plotted phase-folded light curves and calculated a model of the system. We show that the complicated character of the light curves can be explained by spiral arms in the disk. We decomposed the synthesized photometric curve into separate components: the accretion disk, the white and red dwarfs, and the hot line.
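Phase-folding of the kind used above simply reduces each observation time modulo the orbital period. A minimal sketch with simulated times and magnitudes and an assumed period; the numbers are illustrative, not the UU Aqr ephemeris.

import numpy as np

period, t0 = 0.1636, 2457000.0       # assumed orbital period (days) and epoch, illustrative only
rng = np.random.default_rng(1)
jd  = 2457000.0 + np.sort(rng.uniform(0, 3, 200))                 # observation times (JD)
mag = 13.5 + 0.4 * np.sin(2 * np.pi * (jd - t0) / period)         # fake magnitudes

phase = ((jd - t0) / period) % 1.0   # fold onto [0, 1)
order = np.argsort(phase)
folded = np.column_stack((phase[order], mag[order]))
print(folded[:5])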
NASA Technical Reports Server (NTRS)
Strangways, R.
1981-01-01
The international demand for and supply of oil between the years 1980 and 2000 are assessed, and future world oil prices and their implications for the price of jet fuel are estimated. Three critical questions are investigated: (1) how long will the world supply of oil continue to keep pace with demand under likely trends in its use and discovery; (2) at what price will demand and supply clear the world oil market; and (3) what does the analysis imply about the price of jet fuel. The projection of oil prices is based upon supply and demand, consistent with microeconomic analysis.
Analysis of recent projections of electric power demand
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hudson, Jr, D V
1993-08-01
This report reviews the changes and potential changes in the outlook for electric power demand since the publication of Review and Analysis of Electricity Supply Market Projections (B. Swezey, SERI/MR-360-3322, National Renewable Energy Laboratory). Forecasts of the following organizations were reviewed: DOE/Energy Information Administration, DOE/Policy Office, DRI/McGraw-Hill, North American Electric Reliability Council, and Gas Research Institute. Supply uncertainty was briefly reviewed to place the uncertainties of the demand outlook in perspective. Also discussed were opportunities for modular technologies, such as renewable energy technologies, to fill a potential gap in energy demand and supply.
Component Analysis of Remanent Magnetization Curves: A Revisit with a New Model Distribution
NASA Astrophysics Data System (ADS)
Zhao, X.; Suganuma, Y.; Fujii, M.
2017-12-01
Geological samples often consist of several magnetic components that have distinct origins. As the magnetic components are often indicative of the underlying geological and environmental processes, it is desirable to identify the individual components to extract the associated information. This component analysis can be achieved using the so-called unmixing method, which fits a mixture model of certain end-member model distributions to the measured remanent magnetization curve. In earlier studies, the lognormal, skew generalized Gaussian, and skewed Gaussian distributions have been used as end-member model distributions, with the analysis performed on the gradient of the remanent magnetization curve. However, gradient curves are sensitive to measurement noise, as differentiation of the measured curve amplifies noise, which can deteriorate the component analysis. Although either smoothing or filtering can be applied to reduce the noise before differentiation, their potential to bias the component analysis has not been clearly addressed. In this study, we investigated a new model function that can be applied directly to remanent magnetization curves, thereby avoiding differentiation. The new model function provides a more flexible shape than the lognormal distribution, which is a merit when modeling the coercivity distribution of complex magnetic components. We applied the unmixing method to both model and measured data, and compared the results with those obtained using other model distributions to better understand their interchangeability, applicability, and limitations. The analyses of model data suggest that unmixing methods are inherently sensitive to noise, especially when the number of components exceeds two. It is therefore recommended to verify the reliability of a component analysis by running multiple analyses with synthetic noise. Marine sediments and seafloor rocks are analyzed with the new model distribution. Given the same number of components, the new model distribution provides closer fits than the lognormal distribution, as evidenced by reduced residuals. Moreover, the new unmixing protocol is automated so that users are freed from the labor of providing initial guesses for the parameters, which also helps reduce the subjectivity of the component analysis.
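Coercivity unmixing of this general kind can be sketched by fitting a sum of end-member components directly to the acquisition curve, as the abstract advocates; the example below uses two cumulative lognormal components and scipy's curve_fit on synthetic data, and is not the authors' new model distribution.

import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

def two_component_irm(log_b, m1, mu1, s1, m2, mu2, s2):
    # sum of two cumulative lognormal components, evaluated on log10(field)
    return m1 * norm.cdf(log_b, mu1, s1) + m2 * norm.cdf(log_b, mu2, s2)

log_b = np.linspace(0.5, 3.0, 60)                     # log10 of applied field (mT)
true = two_component_irm(log_b, 1.0, 1.4, 0.25, 0.4, 2.3, 0.20)
data = true + np.random.default_rng(2).normal(0, 0.01, log_b.size)   # add measurement noise

p0 = (0.8, 1.5, 0.3, 0.3, 2.2, 0.3)                   # initial guesses
popt, pcov = curve_fit(two_component_irm, log_b, data, p0=p0)
print(np.round(popt, 3))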
Conceptualizing and measuring demand for recreation on national forests: a review and synthesis.
Brian E. Garber-Yonts
2005-01-01
This analysis examines the problem of measuring demand for recreation on national forests and other public lands. Current measures of recreation demand in Forest Service resource assessments and planning emphasize population-level participation rates and activity-based economic values for visitor days. Alternative measures and definitions of recreation demand are...
NASA Astrophysics Data System (ADS)
Zuhdi, Ubaidillah
2014-03-01
The purpose of this study is to analyze the impacts of final demand changes on the total output of Japanese Information and Communication Technologies (ICT) sectors in the future. To achieve this purpose, the study employs one of the analysis tools of Input-Output (IO) analysis, the demand-pull IO quantity model. Three final demand changes are used in this study, namely changes in (1) exports, (2) imports, and (3) outside-household consumption. The study focuses on the "pure change" condition, in which final demand changes appear only in the analyzed sectors. The results show that the export and outside-household consumption changes have positive impacts on total output, while the import change has the opposite effect.
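The demand-pull IO quantity model propagates a final-demand change through the Leontief inverse, delta_x = (I - A)^(-1) delta_f. A minimal sketch with a made-up three-sector coefficient matrix, not the Japanese IO table:

import numpy as np

# hypothetical technical-coefficient matrix: A[i, j] = input from sector i per unit output of sector j
A = np.array([[0.10, 0.05, 0.02],
              [0.20, 0.15, 0.10],
              [0.05, 0.10, 0.25]])
leontief_inv = np.linalg.inv(np.eye(3) - A)

d_f = np.array([100.0, 0.0, 0.0])        # "pure change": final demand rises only in sector 1
d_x = leontief_inv @ d_f                 # induced change in total output, all sectors
print(np.round(d_x, 1))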
Demand projections for the northeast corridor : financial analysis
DOT National Transportation Integrated Search
1976-06-30
This report describes the development and results of intercity travel demand projections by city-pair prepared for the Northeast Corridor financial analysis. In addition associated analyses of projected passenger volumes by station and of selected al...
NASA Astrophysics Data System (ADS)
Papilaya, Renoldy L.
2018-02-01
The development of tourism in this era must balance supply and demand aspects. When policy makers pay more attention to supply than to demand, the development of tourist products and services does not achieve good results. This research examined the relationships among marine tourism demand, tourist characteristics, and the number of visits, and their effects on the perception of value and willingness to pay (WTP) of tourists at marine tourism destinations in Ambon city. The 140 respondents comprised overseas, domestic, and local tourists. The analysis was performed descriptively and then with structural equation modelling (SEM) in Amos 19.00 using path analysis. The analysis shows a close relationship among marine tourism demand, tourist characteristics, and the number of tourist visits with respect to the perception of value and WTP of tourists. When marine tourism demand is linked directly to tourist perception and WTP, it tends to correlate negatively, whereas the relationship improves when the number of tourist visits is used as an intervening variable. In contrast, tourist characteristic variables are positively correlated, both directly and indirectly, with the perception of value and WTP of tourists. It is hoped that this study will motivate tourism policy makers and local communities to pay attention to and study perception, WTP, marine tourism demand, the number of tourist visits, and tourist characteristics, which turn out to be interrelated.
USDA-ARS?s Scientific Manuscript database
Dose-response curves with semiochemicals are reported in many articles in insect chemical ecology regarding neurophysiology and behavioral bioassays. Most such curves are shown in figures where the x-axis has order of magnitude increases in dosages versus responses on the y-axis represented by point...
DOT National Transportation Integrated Search
2009-06-01
This report describes how new injury risk curves for the knee/distal femur and the hip were developed through reanalyses of existing peak knee impact force data. New hip injury risk curves were developed using survival analysis with a lognormal d...
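A rough sketch of a lognormal survival-analysis fit of this type is shown below, with invented impact-force data and censoring indicators; the maximum-likelihood treatment here is simplified and is not the report's procedure.

import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

# hypothetical peak impact forces (kN); injured == 1 means injury observed at that force,
# injured == 0 means the test ended without injury (right-censored observation)
force   = np.array([3.2, 4.1, 4.8, 5.5, 6.0, 6.7, 7.4, 8.1, 9.0, 10.2])
injured = np.array([0,   0,   1,   0,   1,   1,   0,   1,   1,   1])

def neg_loglik(params):
    mu, log_sigma = params
    sigma = np.exp(log_sigma)
    z = (np.log(force) - mu) / sigma
    ll_event    = norm.logpdf(z)[injured == 1] - np.log(sigma * force[injured == 1])
    ll_censored = norm.logsf(z)[injured == 0]          # still uninjured at this force
    return -(ll_event.sum() + ll_censored.sum())

res = minimize(neg_loglik, x0=[np.log(6.0), 0.0])
mu, sigma = res.x[0], np.exp(res.x[1])
grid = np.linspace(2.0, 12.0, 6)
risk = norm.cdf((np.log(grid) - mu) / sigma)            # injury risk curve P(injury at or below F)
print(np.round(np.column_stack((grid, risk)), 3))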
Davis, Lynne C; Rane, Shruti; Hiscock, Merrill
2013-01-01
A longstanding question in working memory (WM) research concerns the fractionation of verbal and nonverbal processing. Although some contemporary models include both domain-specific and general-purpose mechanisms, the necessity to postulate differential processing of verbal and nonverbal material remains unclear. In the present two-experiment series we revisit the order reconstruction paradigm that Jones, Farrand, Stuart, and Morris (1995) used to support a unitary model of WM. Goals were to assess (1) whether serial position curves for dot positions differ from curves for letter names; and (2) whether selective interference can be demonstrated. Although we replicated Jones et al.'s finding of similar serial position curves for the two tasks, this similarity could reflect the demands of the order reconstruction paradigm rather than undifferentiated processing of verbal and nonverbal stimuli. Both generalised and material-specific interference was found, which can be attributed to competition between primary and secondary tasks for attentional resources. As performance levels for the combined primary and secondary tasks exceed active WM capacity limits, primary task items apparently are removed from active memory during processing of the secondary list and held temporarily in maintenance storage. We conclude that active WM is multimodal but maintenance stores may be domain specific.
NASA Astrophysics Data System (ADS)
He, Shiyuan; Wang, Lifan; Huang, Jianhua Z.
2018-04-01
With growing data from ongoing and future supernova surveys, it is possible to empirically quantify the shapes of SNIa light curves in more detail, and to quantitatively relate the shape parameters with the intrinsic properties of SNIa. Building such relationships is critical in controlling systematic errors associated with supernova cosmology. Based on a collection of well-observed SNIa samples accumulated in the past years, we construct an empirical SNIa light curve model using a statistical method called the functional principal component analysis (FPCA) for sparse and irregularly sampled functional data. Using this method, the entire light curve of an SNIa is represented by a linear combination of principal component functions, and the SNIa is represented by a few numbers called “principal component scores.” These scores are used to establish relations between light curve shapes and physical quantities such as intrinsic color, interstellar dust reddening, spectral line strength, and spectral classes. These relations allow for descriptions of some critical physical quantities based purely on light curve shape parameters. Our study shows that some important spectral feature information is being encoded in the broad band light curves; for instance, we find that the light curve shapes are correlated with the velocity and velocity gradient of the Si II λ6355 line. This is important for supernova surveys (e.g., LSST and WFIRST). Moreover, the FPCA light curve model is used to construct the entire light curve shape, which in turn is used in a functional linear form to adjust intrinsic luminosity when fitting distance models.
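The idea can be roughed out with ordinary PCA once the light curves are resampled to a common phase grid; the paper's FPCA handles sparse, irregular sampling directly, and the toy data below are invented.

import numpy as np

rng = np.random.default_rng(3)
phase = np.linspace(-10, 40, 60)                       # days relative to maximum light
template = np.exp(-0.5 * ((phase - 0) / 12.0) ** 2)    # toy mean light-curve shape

# simulate 50 toy light curves: mean shape plus an amplitude perturbation plus noise
curves = np.array([template * (1 + 0.2 * rng.normal()) +
                   0.05 * rng.normal(size=phase.size) for _ in range(50)])

mean_curve = curves.mean(axis=0)
U, S, Vt = np.linalg.svd(curves - mean_curve, full_matrices=False)
components = Vt[:2]                                    # first two "principal component functions"
scores = (curves - mean_curve) @ components.T          # per-supernova PC scores
print(scores[:5].round(3))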
Sánchez-Jiménez, Pedro E; Pérez-Maqueda, Luis A; Perejón, Antonio; Criado, José M
2013-02-05
This paper provides some clarifications regarding the use of model-fitting methods of kinetic analysis for estimating the activation energy of a process, in response to some results recently published in Chemistry Central Journal. The model-fitting methods of Arrhenius and Savata are used to determine the activation energy of a single simulated curve. It is shown that most kinetic models correctly fit the data, each providing a different value for the activation energy; therefore, it is not really possible to determine the correct activation energy from a single non-isothermal curve. On the other hand, when a set of curves recorded under different heating schedules is used, the correct kinetic parameters can be clearly discerned. Here, it is shown that the activation energy and the kinetic model cannot be unambiguously determined from a single experimental curve recorded under non-isothermal conditions. Thus, the use of a set of curves recorded under different heating schedules is mandatory if model-fitting methods are employed.
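As one concrete example of a model-fitting method, the Coats-Redfern linearization for an assumed first-order model extracts an apparent activation energy from a single non-isothermal curve. The sketch below uses synthetic data and assumed kinetic parameters, and it recovers the input Ea only because the assumed model happens to be the true one, which is exactly the ambiguity the paper warns about.

import numpy as np

R = 8.314                                   # J/(mol K)
beta = 10.0 / 60.0                          # heating rate, K/s (10 K/min)

# synthetic single non-isothermal curve: first-order process, Ea = 150 kJ/mol, A = 1e12 1/s
T = np.linspace(500.0, 700.0, 200)
Ea_true, A = 150e3, 1e12
g = (A * R * T**2 / (beta * Ea_true)) * np.exp(-Ea_true / (R * T))   # Coats-Redfern approximation
alpha = 1.0 - np.exp(-g)                    # conversion for the F1 model, g(alpha) = -ln(1 - alpha)

mask = (alpha > 0.05) & (alpha < 0.95)      # usual working range
g_first_order = -np.log(1.0 - alpha[mask])
slope, intercept = np.polyfit(1.0 / T[mask], np.log(g_first_order / T[mask]**2), 1)
print(f"apparent Ea = {-slope * R / 1000:.1f} kJ/mol")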
NASA Technical Reports Server (NTRS)
Mukkamala, R.; Cohen, R. J.; Mark, R. G.
2002-01-01
Guyton developed a popular approach for understanding the factors responsible for cardiac output (CO) regulation in which 1) the heart-lung unit and systemic circulation are independently characterized via CO and venous return (VR) curves, and 2) average CO and right atrial pressure (RAP) of the intact circulation are predicted by graphically intersecting the curves. However, this approach is virtually impossible to verify experimentally. We theoretically evaluated the approach with respect to a nonlinear, computational model of the pulsatile heart and circulation. We developed two sets of open circulation models to generate CO and VR curves, differing by the manner in which average RAP was varied. One set applied constant RAPs, while the other set applied pulsatile RAPs. Accurate prediction of intact, average CO and RAP was achieved only by intersecting the CO and VR curves generated with pulsatile RAPs because of the pulsatility and nonlinearity (e.g., systemic venous collapse) of the intact model. The CO and VR curves generated with pulsatile RAPs were also practically independent. This theoretical study therefore supports the validity of Guyton's graphical analysis.
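Numerically, the graphical analysis reduces to solving CO(RAP) = VR(RAP). A minimal sketch with textbook-shaped curves and invented parameter values, not the authors' pulsatile computational model:

import numpy as np
from scipy.optimize import brentq

def cardiac_output(rap):
    # Frank-Starling-like CO curve (L/min), saturating with right atrial pressure (mmHg)
    return 13.0 / (1.0 + np.exp(-(rap - 2.0) / 1.5))

def venous_return(rap, msfp=7.0, rvr=1.4):
    # linear VR curve: (mean systemic filling pressure - RAP) / resistance to venous return
    return max((msfp - rap) / rvr, 0.0)

# equilibrium point: where the two curves intersect
rap_eq = brentq(lambda p: cardiac_output(p) - venous_return(p), -2.0, 7.0)
print(f"RAP = {rap_eq:.2f} mmHg, CO = {cardiac_output(rap_eq):.2f} L/min")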
Perceived demands during modern military operations.
Boermans, Sylvie M; Kamphuis, Wim; Delahaij, Roos; Korteling, J E Hans; Euwema, Martin C
2013-07-01
Using a cross-sectional design, this study explored operational demands during the International Security Assistance Force for Afghanistan (2009-2010) across distinct military units. A total of 1,413 Dutch soldiers, nested within four types of units (i.e., combat, combat support, service support, and command support units) filled out a 23-item self-survey in which they were asked to evaluate the extent to which they experienced operational characteristics as demanding. Exploratory factor analysis identified six underlying dimensions of demands. Multivariate analysis of variance revealed that distinct units are characterized by their own unique constellation of perceived demands, even after controlling for previous deployment experience. Most notable findings were found when comparing combat units to other types of units. These insights can be used to better prepare different types of military units for deployment, and support them in the specific demands they face during deployment. Reprint & Copyright © 2013 Association of Military Surgeons of the U.S.
Mazur, Lukasz M; Mosaly, Prithima R; Moore, Carlton; Comitz, Elizabeth; Yu, Fei; Falchook, Aaron D; Eblan, Michael J; Hoyle, Lesley M; Tracton, Gregg; Chera, Bhishamjit S; Marks, Lawrence B
2016-11-01
To assess the relationship between (1) task demands and workload, (2) task demands and performance, and (3) workload and performance, all during physician-computer interactions in a simulated environment. Two experiments were performed in 2 different electronic medical record (EMR) environments: WebCIS (n = 12) and Epic (n = 17). Each participant was instructed to complete a set of prespecified tasks on 3 routine clinical EMR-based scenarios: urinary tract infection (UTI), pneumonia (PN), and heart failure (HF). Task demands were quantified using behavioral responses (click and time analysis). At the end of each scenario, subjective workload was measured using the NASA-Task-Load Index (NASA-TLX). Physiological workload was measured using pupillary dilation and electroencephalography (EEG) data collected throughout the scenarios. Performance was quantified based on the maximum severity of omission errors. Data analysis indicated that the PN and HF scenarios were significantly more demanding than the UTI scenario for participants using WebCIS (P < .01), and that the PN scenario was significantly more demanding than the UTI and HF scenarios for participants using Epic (P < .01). In both experiments, the regression analysis indicated a significant relationship only between task demands and performance (P < .01). Results suggest that task demands as experienced by participants are related to participants' performance. Future work may support the notion that task demands could be used as a quality metric that is likely representative of performance, and perhaps patient outcomes. The present study is a reasonable next step in a systematic assessment of how task demands and workload are related to performance in EMR-evolving environments. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Kondo, M; Nagao, Y; Mahbub, M H; Tanabe, T; Tanizawa, Y
2018-04-29
To identify factors predicting early postpartum glucose intolerance in Japanese women with gestational diabetes mellitus, using decision-curve analysis. A retrospective cohort study was performed. The participants were 123 Japanese women with gestational diabetes who underwent 75-g oral glucose tolerance tests at 8-12 weeks after delivery. They were divided into a glucose intolerance and a normal glucose tolerance group based on postpartum oral glucose tolerance test results. Analysis of the pregnancy oral glucose tolerance test results showed predictive factors for postpartum glucose intolerance. We also evaluated the clinical usefulness of the prediction model based on decision-curve analysis. Of 123 women, 78 (63.4%) had normoglycaemia and 45 (36.6%) had glucose intolerance. Multivariable logistic regression analysis showed insulinogenic index/fasting immunoreactive insulin and summation of glucose levels, assessed during pregnancy oral glucose tolerance tests (total glucose), to be independent risk factors for postpartum glucose intolerance. Evaluating the regression models, the best discrimination (area under the curve 0.725) was obtained using the basic model (i.e. age, family history of diabetes, BMI ≥25 kg/m 2 and use of insulin during pregnancy) plus insulinogenic index/fasting immunoreactive insulin <1.1. Decision-curve analysis showed that combining insulinogenic index/fasting immunoreactive insulin <1.1 with basic clinical information resulted in superior net benefits for prediction of postpartum glucose intolerance. Insulinogenic index/fasting immunoreactive insulin calculated using oral glucose tolerance test results during pregnancy is potentially useful for predicting early postpartum glucose intolerance in Japanese women with gestational diabetes. © 2018 Diabetes UK.
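Decision-curve analysis evaluates a prediction model by its net benefit, NB(pt) = TP/N - (FP/N) * pt/(1 - pt), across threshold probabilities pt. A minimal sketch with simulated risks and outcomes, not the study's cohort:

import numpy as np

rng = np.random.default_rng(4)
y = rng.binomial(1, 0.37, 123)                      # 1 = postpartum glucose intolerance (hypothetical)
p = np.clip(0.37 + 0.3 * (y - 0.37) + rng.normal(0, 0.15, y.size), 0.01, 0.99)  # model risk estimates

n = y.size
for pt in np.arange(0.10, 0.65, 0.05):
    treat = p >= pt                                  # "treat" = flag for follow-up at threshold pt
    tp, fp = np.sum(treat & (y == 1)), np.sum(treat & (y == 0))
    nb_model = tp / n - fp / n * pt / (1 - pt)
    nb_all = y.mean() - (1 - y.mean()) * pt / (1 - pt)   # comparison strategy: follow up everyone
    print(f"pt = {pt:.2f}  NB(model) = {nb_model:+.3f}  NB(treat-all) = {nb_all:+.3f}")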
Hewson, Kylie; Noormohammadi, Amir H; Devlin, Joanne M; Mardani, Karim; Ignjatovic, Jagoda
2009-01-01
Infectious bronchitis virus (IBV) is a coronavirus that causes upper respiratory, renal and/or reproductive diseases with high morbidity in poultry. Classification of IBV is important for implementation of vaccination strategies to control the disease in commercial poultry. Currently, the lengthy process of sequence analysis of the IBV S1 gene is considered the gold standard for IBV strain identification, with a high nucleotide identity (e.g. > or =95%) indicating related strains. However, this gene has a high propensity to mutate and/or undergo recombination, and alone it may not be reliable for strain identification. A real-time polymerase chain reaction (RT-PCR) combined with high-resolution melt (HRM) curve analysis was developed based on the 3'UTR of IBV for rapid detection and classification of IBV from commercial poultry. HRM curves generated from 230 to 435-bp PCR products of several IBV strains were subjected to further analysis using a mathematical model also developed during this study. It was shown that a combination of HRM curve analysis and the mathematical model could reliably group 189 out of 190 comparisons of pairs of IBV strains in accordance with their 3'UTR and S1 gene identities. The newly developed RT-PCR/HRM curve analysis model could detect and rapidly identify novel and vaccine-related IBV strains, as confirmed by S1 gene and 3'UTR nucleotide sequences. This model is a rapid, reliable, accurate and non-subjective system for detection of IBVs in poultry flocks.
Pedicle screw versus hybrid posterior instrumentation for dystrophic neurofibromatosis scoliosis.
Wang, Jr-Yi; Lai, Po-Liang; Chen, Wen-Jer; Niu, Chi-Chien; Tsai, Tsung-Ting; Chen, Lih-Huei
2017-06-01
Surgical management of severe rigid dystrophic neurofibromatosis (NF) scoliosis is technically demanding and produces varying results. In the current study, we reviewed 9 patients who were treated with combined anterior and posterior fusion using different types of instrumentation (i.e., pedicle screw, hybrid, and all-hook constructs) at our institute.Between September 2001 and July 2010 at our institute, 9 patients received anterior release/fusion and posterior fusion with different types of instrumentation, including a pedicle screw construct (n = 5), a hybrid construct (n = 3), and an all-hook construct (n = 1). We compared the pedicle screw group with the hybrid group to analyze differences in preoperative curve angle, immediate postoperative curve reduction, and latest follow-up curve angle.The mean follow-up period was 9.5 ± 2.9 years. The average age at surgery was 10.3 ± 3.9 years. The average preoperative scoliosis curve was 61.3 ± 13.8°, and the average preoperative kyphosis curve was 39.8 ± 19.7°. The average postoperative scoliosis and kyphosis curves were 29.7 ± 10.7° and 21.0 ± 13.5°, respectively. The most recent follow-up scoliosis and kyphosis curves were 43.4 ± 17.3° and 29.4 ± 18.9°, respectively. There was no significant difference in the correction angle (either coronal or sagittal), and there was no significant difference in the loss of sagittal correction between the pedicle screw construct group and the hybrid construct group. However, the patients who received pedicle screw constructs had significantly less loss of coronal correction (P < .05). Two patients with posterior instrumentation, one with an all-hook construct and the other with a hybrid construct, required surgical revision because of progression of deformity.It is difficult to intraoperatively correct dystrophic deformity and to maintain this correction after surgery. Combined anterior release/fusion and posterior fusion using either a pedicle screw construct or a hybrid construct provide similar curve corrections both sagittally and coronally. After long-term follow-up, sagittal correction was maintained with both constructs. However, patients treated with posterior instrumentation using pedicle screw constructs had significantly less loss of coronal correction.
NASA Technical Reports Server (NTRS)
Sepehry-Fard, F.; Coulthard, Maurice H.
1995-01-01
The process of predicting the values of maintenance time-dependent variable parameters, such as mean time between failures (MTBF), over time must be one that will not in turn introduce uncontrolled deviation into the results of the ILS analysis, such as life cycle costs and spares calculations. A minor deviation in the values of maintenance time-dependent variable parameters such as MTBF over time will have a significant impact on logistics resource demands, International Space Station availability, and maintenance support costs. There are two types of parameters in the logistics and maintenance world: (a) fixed and (b) variable. Fixed parameters, such as cost per man-hour, are relatively easy to predict and forecast; they normally follow a linear path and do not change randomly. However, the variable parameters that are the subject of this report, such as MTBF, do not follow a linear path; they normally fall within the distribution curves discussed in this publication. The very challenging task then becomes the utilization of statistical techniques to accurately forecast future non-linear time-dependent variable arisings and events with a high confidence level. This, in turn, translates into tremendous cost savings and improved availability all around.
Development of damage probability matrices based on Greek earthquake damage data
NASA Astrophysics Data System (ADS)
Eleftheriadou, Anastasia K.; Karabinis, Athanasios I.
2011-03-01
A comprehensive study is presented for empirical seismic vulnerability assessment of typical structural types, representative of the building stock of Southern Europe, based on a large set of damage statistics. The observational database was obtained from post-earthquake surveys carried out in the area struck by the September 7, 1999 Athens earthquake. After analysis of the collected observational data, a unified damage database was created comprising 180,945 damaged buildings from the near-field area of the earthquake. The damaged buildings are classified into specific structural types, according to the materials, seismic codes, and construction techniques of Southern Europe. The seismic demand is described in terms of both the regional macroseismic intensity and the ratio αg/ao, where αg is the maximum peak ground acceleration (PGA) of the earthquake event and ao is the unique PGA value that characterizes each municipality on the Greek hazard map. The relative and cumulative frequencies of the different damage states for each structural type and each intensity level are computed in terms of damage ratio. Damage probability matrices (DPMs) and vulnerability curves are obtained for specific structural types. A comparative analysis is carried out between the produced and the existing vulnerability models.
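A damage probability matrix is essentially the relative frequency of each damage state within each intensity bin, with vulnerability curves following from the cumulative frequencies. A minimal sketch with invented building counts:

import numpy as np

# hypothetical building counts: rows = intensity bins, columns = damage states D0..D3
counts = np.array([[800, 150,  40,  10],
                   [500, 300, 150,  50],
                   [200, 300, 300, 200]], dtype=float)

dpm = counts / counts.sum(axis=1, keepdims=True)           # P(damage state | intensity)
cumulative = np.cumsum(dpm[:, ::-1], axis=1)[:, ::-1]      # P(damage >= state | intensity)
print(np.round(dpm, 3))
print(np.round(cumulative, 3))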
SENS-5D trajectory and wind-sensitivity calculations for unguided rockets
NASA Technical Reports Server (NTRS)
Singh, R. P.; Huang, L. C. P.; Cook, R. A.
1975-01-01
A computational procedure is described which numerically integrates the equations of motion of an unguided rocket. Three translational and two angular (roll discarded) degrees of freedom are integrated through final burnout; from then through impact, only the three translational motions are considered. Input to the routine consists of the initial time, altitude, and velocity, the vehicle characteristics, and other defined options. The input format has a wide range of flexibility for special calculations. Output is geared mainly to the wind-weighting procedure, and includes a summary of the trajectory at burnout, apogee, and impact, a summary of spent-stage trajectories, detailed position and vehicle data, unit-wind effects for head, tail, and cross winds, Coriolis deflections, the range derivative, and the sensitivity curves (the so-called F(Z) and DF(Z) curves). The numerical integration procedure is a fourth-order, modified Adams-Bashforth predictor-corrector method, supplemented by a fourth-order Runge-Kutta method to start the integration at t=0 and whenever error criteria demand a change in step size.
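A minimal sketch of a fourth-order Adams-Bashforth-Moulton predictor-corrector started with Runge-Kutta steps, applied to a toy planar point mass with drag; this is illustrative only and is not the SENS-5D five-degree-of-freedom formulation.

import numpy as np

def deriv(t, y):
    # toy 2-D point mass with quadratic drag: y = [x, z, vx, vz]
    g, k = 9.81, 1e-3
    v = np.hypot(y[2], y[3])
    return np.array([y[2], y[3], -k * v * y[2], -g - k * v * y[3]])

def rk4_step(t, y, h):
    k1 = deriv(t, y)
    k2 = deriv(t + h / 2, y + h / 2 * k1)
    k3 = deriv(t + h / 2, y + h / 2 * k2)
    k4 = deriv(t + h, y + h * k3)
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

h, t = 0.05, 0.0
y = np.array([0.0, 0.0, 80.0, 140.0])        # initial position (m) and velocity (m/s)
fs = [deriv(t, y)]
for _ in range(3):                           # RK4 supplies the starting values
    y = rk4_step(t, y, h); t += h
    fs.append(deriv(t, y))

while y[1] >= 0.0:                           # integrate until "impact" (z < 0)
    # Adams-Bashforth 4-step predictor
    yp = y + h / 24 * (55 * fs[-1] - 59 * fs[-2] + 37 * fs[-3] - 9 * fs[-4])
    # Adams-Moulton corrector, evaluating f at the predicted point
    y = y + h / 24 * (9 * deriv(t + h, yp) + 19 * fs[-1] - 5 * fs[-2] + fs[-3])
    t += h
    fs.append(deriv(t, y))

print(f"impact near t = {t:.2f} s, range = {y[0]:.1f} m")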
Inventory Management for Irregular Shipment of Goods in Distribution Centre
NASA Astrophysics Data System (ADS)
Takeda, Hitoshi; Kitaoka, Masatoshi; Usuki, Jun
2016-01-01
The shipping amounts of commodity goods (foods, confectionery, dairy products, and general cosmetic and pharmaceutical products) change irregularly at distribution centers dealing with general consumer goods. Because the shipment times and shipment amounts are irregular, demand forecasting becomes very difficult, and inventory control becomes difficult as well. Conventional inventory control methods cannot be applied to the shipment of such commodities. This paper proposes an inventory control method based on the cumulative flow curve, in which the order quantity is decided from the cumulative flow curve. Three methods are proposed: 1) a power method, 2) a polynomial method, and 3) the revised Holt's linear method, a kind of exponential smoothing that forecasts data with trends. The paper compares the economics of the conventional method, which is managed by experienced staff, with the three newly proposed methods, and the effectiveness of the proposed methods is verified through numerical calculations.
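Of the three proposed methods, the revised Holt's linear method is a trend-corrected exponential smoother; a basic (unrevised) version can be sketched as follows, with illustrative smoothing constants and shipment data.

def holt_linear(x, alpha=0.3, beta=0.1, horizon=4):
    # Holt's linear trend exponential smoothing; returns forecasts 1..horizon steps ahead
    level, trend = x[0], x[1] - x[0]
    for value in x[1:]:
        prev_level = level
        level = alpha * value + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
    return [level + (k + 1) * trend for k in range(horizon)]

# hypothetical weekly shipment quantities from the distribution centre
shipments = [120, 135, 150, 180, 170, 210, 230, 260]
print([round(f, 1) for f in holt_linear(shipments)])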
A study on suppressing transmittance fluctuations for air-gapped Glan-type polarizing prisms
NASA Astrophysics Data System (ADS)
Zhang, Chuanfa; Li, Dailin; Zhu, Huafeng; Li, Chuanzhi; Jiao, Zhiyong; Wang, Ning; Xu, Zhaopeng; Wang, Xiumin; Song, Lianke
2018-05-01
Light intensity transmittance is a key parameter in the design of polarizing prisms, yet its experimental curves as a function of spatial incident angle sometimes present periodic fluctuations. Here, we propose a novel method for completely suppressing these fluctuations by setting a glued error angle in the air gap of Glan-Taylor prisms. The proposal consists of an accurate formula for the intensity transmittance of Glan-Taylor prisms, a numerical simulation and a contrast experiment on Glan-Taylor prisms for analyzing the causes of the fluctuations, and a simple method for accurately measuring the glued error angle. The results indicate that when the set glued error angle is larger than the critical angle for a given polarizing prism, the fluctuations can be completely suppressed and a smooth intensity transmittance curve can be obtained. In addition, the critical angle in the air gap for suppressing the fluctuations decreases as the beam spot size increases. This method has the advantage of placing less demand on the prism position in optical systems.
NASA Astrophysics Data System (ADS)
Knapik, Maciej
2018-02-01
The article presents an economic analysis and comparison of selected methods (district heating, natural gas, and a heat pump with renewable energy sources) for the preparation of domestic hot water in a building with low energy demand. In buildings of this type, an increased share of energy demand for domestic hot water preparation relative to the total energy demand can be observed. The proposed solutions therefore allow energy demand to be lowered further by using renewable energy sources. The article presents the results of numerical analyses and calculations performed mainly in MATLAB software, based on typical meteorological years. The results showed that the system with a heat pump and renewable energy sources is comparable with the district heating system.
Wu, Yazhou; Zhou, Liang; Li, Gaoming; Yi, Dali; Wu, Xiaojiao; Liu, Xiaoyu; Zhang, Yanqi; Liu, Ling; Yi, Dong
2015-01-01
Although a substantial number of studies focus on the teaching and application of medical statistics in China, few studies comprehensively evaluate the recognition of and demand for medical statistics. In addition, the results of these various studies differ and are insufficiently comprehensive and systematic. This investigation aimed to evaluate the general cognition of and demand for medical statistics by undergraduates, graduates, and medical staff in China. We performed a comprehensive database search related to the cognition of and demand for medical statistics from January 2007 to July 2014 and conducted a meta-analysis of non-controlled studies with sub-group analysis for undergraduates, graduates, and medical staff. There are substantial differences with respect to the cognition of theory in medical statistics among undergraduates (73.5%), graduates (60.7%), and medical staff (39.6%). The demand for theory in medical statistics is high among graduates (94.6%), undergraduates (86.1%), and medical staff (88.3%). Regarding specific statistical methods, the cognition of basic statistical methods is higher than that of advanced statistical methods. The demand for certain advanced statistical methods, including (but not limited to) multiple analysis of variance (ANOVA), multiple linear regression, and logistic regression, is higher than that for basic statistical methods. The use rates of the Statistical Package for the Social Sciences (SPSS) software and statistical analysis software (SAS) are only 55% and 15%, respectively. The overall statistical competence of undergraduates, graduates, and medical staff is insufficient, and their ability to practically apply their statistical knowledge is limited, which constitutes an unsatisfactory state of affairs for medical statistics education. Because the demand for skills in this area is increasing, the need to reform medical statistics education in China has become urgent.
Exploring Algorithms for Stellar Light Curves With TESS
NASA Astrophysics Data System (ADS)
Buzasi, Derek
2018-01-01
The Kepler and K2 missions have produced tens of thousands of stellar light curves, which have been used to measure rotation periods, characterize photometric activity levels, and explore phenomena such as differential rotation. The quasi-periodic nature of rotational light curves, combined with the potential presence of additional periodicities not due to rotation, complicates the analysis of these time series and makes characterization of uncertainties difficult. A variety of algorithms have been used for the extraction of rotational signals, including autocorrelation functions, discrete Fourier transforms, Lomb-Scargle periodograms, wavelet transforms, and the Hilbert-Huang transform. In addition, in the case of K2, a number of different pipelines have been used to produce initial detrended light curves from the raw image frames. In the near future, TESS photometry, particularly that derived from the full-frame images, will further expand the number of such light curves dramatically, but details of the pipeline to be used to produce photometry from the FFIs remain under development. K2 data offer an opportunity to explore the utility of different combinations of reduction and analysis tools applied to these astrophysically important tasks. In this work, we apply a wide range of algorithms to light curves produced by a number of popular K2 pipeline products to better understand the advantages and limitations of each approach and to provide guidance for the most reliable and most efficient analysis of TESS stellar data.
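A minimal sketch of one of the algorithms listed above, a Lomb-Scargle periodogram applied to an unevenly sampled light curve; it assumes the astropy package and uses a synthetic rotation signal rather than actual K2 or TESS data.

```python
import numpy as np
from astropy.timeseries import LombScargle

# Synthetic quasi-periodic light curve with uneven sampling (stand-in for K2/TESS data).
rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0, 80, 2000))                 # time in days
flux = 1 + 0.01 * np.sin(2 * np.pi * t / 12.3) + 0.002 * rng.standard_normal(t.size)

# Periodogram over frequencies up to 1 cycle/day; the peak gives a rotation-period candidate.
frequency, power = LombScargle(t, flux).autopower(maximum_frequency=1.0)
best_period = 1 / frequency[np.argmax(power)]
print(f"Recovered rotation period: {best_period:.2f} d")
```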
NASA Astrophysics Data System (ADS)
Yang, Zhichun; Zhou, Jian; Gu, Yingsong
2014-10-01
A flow field modified local piston theory, applied to the integrated analysis of the static/dynamic aeroelastic behavior of curved panels, is proposed in this paper. The local flow field parameters used in the modification are obtained by CFD techniques, which have the advantage of simulating the steady flow field accurately. This flow field modified local piston theory for aerodynamic loading is applied to the analysis of static aeroelastic deformation and flutter stability of curved panels in hypersonic flow. In addition, comparisons are made between results obtained with the present method and with the curvature modified method. They show that when the curvature of a panel is relatively small, the static aeroelastic deformations and flutter stability boundaries obtained by the two methods differ little, while for panels with larger curvatures the static aeroelastic deformation obtained by the present method is larger and the flutter stability boundary is lower than those obtained by the curvature modified method, with the discrepancy increasing with panel curvature. The existing curvature modified method is therefore non-conservative compared to the proposed flow field modified method from the standpoint of hypersonic flight vehicle safety, and the proposed flow field modified local piston theory for curved panels enlarges the application range of piston theory.
Utilizing Traveler Demand Modeling to Predict Future Commercial Flight Schedules in the NAS
NASA Technical Reports Server (NTRS)
Viken, Jeff; Dollyhigh, Samuel; Smith, Jeremy; Trani, Antonio; Baik, Hojong; Hinze, Nicholas; Ashiabor, Senanu
2006-01-01
The current work incorporates the Transportation Systems Analysis Model (TSAM) to predict the future demand for airline travel. TSAM is a multi-mode, national model that predicts the demand for all long-distance travel at a county level based upon population and demographics. The model conducts a mode choice analysis to compute the demand for commercial airline travel based upon the traveler's trip purpose, value of time, and the cost and time of the trip. The county demand for airline travel is then aggregated (or distributed) to the airport level, and the enplanement demand at commercial airports is modeled. With the growth in flight demand, and utilizing current airline flight schedules, the Fratar algorithm is used to develop future flight schedules in the NAS. The projected flights can then be flown through air transportation simulators to quantify the ability of the NAS to meet future demand. A major strength of the TSAM analysis is that scenario planning can be conducted to quantify capacity requirements at individual airports based upon different future scenarios. Different demographic scenarios can be analyzed to model the sensitivity of demand to them. Also, it is fairly well known, but not well modeled at the airport level, that the demand for travel is highly dependent on the cost of travel, or the fare yield of the airline industry. The FAA projects the fare yield (in constant-year dollars) to keep decreasing into the future. The magnitude and/or direction of these projections can be suspect in light of the general lack of airline profits and the large rises in airline fuel cost. Also, changes in travel time and convenience have an influence on the demand for air travel, especially for business travel. Future planners cannot easily conduct sensitivity studies of future demand with the FAA TAF data, nor with the Boeing or Airbus projections. In TSAM, many factors can be parameterized and various demand sensitivities can be predicted for future travel. The resulting demand scenarios can be incorporated into future flight schedules, thereby providing a quantifiable demand for flights in the NAS for a range of futures. In addition, new future airline business scenarios are investigated that illustrate when direct flights can replace connecting flights and when larger aircraft can be substituted, where justified by demand.
VStar: Variable star data visualization and analysis tool
NASA Astrophysics Data System (ADS)
VStar Team
2014-07-01
VStar is a multi-platform, easy-to-use variable star data visualization and analysis tool. Data for a star can be read from the AAVSO (American Association of Variable Star Observers) database or from CSV and TSV files. VStar displays light curves and phase plots, can produce a mean curve, and analyzes time-frequency with Weighted Wavelet Z-Transform. It offers tools for period analysis, filtering, and other functions.
Defining the learning curve in laparoscopic paraesophageal hernia repair: a CUSUM analysis.
Okrainec, Allan; Ferri, Lorenzo E; Feldman, Liane S; Fried, Gerald M
2011-04-01
There are numerous reports in the literature documenting high recurrence rates after laparoscopic paraesophageal hernia repair. The purpose of this study was to determine the learning curve for this procedure using the Cumulative Summation (CUSUM) technique. Forty-six consecutive patients with paraesophageal hernia were evaluated prospectively after laparoscopic paraesophageal hernia repair. Upper GI series was performed 3 months postoperatively to look for recurrence. Patients were stratified based on the surgeon's early (first 20 cases) and late experience (>20 cases). The CUSUM method was then used to further analyze the learning curve. Nine patients (21%) had anatomic recurrence. There was a trend toward a higher recurrence rate during the first 20 cases, although this did not achieve statistical significance (33% vs. 13%, p = 0.10). However, using a CUSUM analysis to plot the learning curve, we found that the recurrence rate diminishes after 18 cases and reaches an acceptable rate after 26 cases. Surgeon experience is an important predictor of recurrence after laparoscopic paraesophageal hernia repair. CUSUM analysis revealed there is a significant learning curve to become proficient at this procedure, with approximately 20 cases required before a consistent decrease in hernia recurrence rate is observed.
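A minimal sketch of the CUSUM construction used to visualize such a learning curve, plotting the cumulative difference between observed recurrences and an acceptable recurrence rate; the sequence of case outcomes and the 10% target rate are hypothetical, not the study's data.

```python
import numpy as np

def cusum_curve(outcomes, acceptable_rate=0.10):
    """Cumulative sum of (observed failure - acceptable failure rate) per consecutive case.
    A rising curve signals performance worse than the target; a falling curve, better."""
    outcomes = np.asarray(outcomes, dtype=float)   # 1 = recurrence, 0 = no recurrence
    return np.cumsum(outcomes - acceptable_rate)

# Hypothetical sequence of 46 consecutive repairs (1 = anatomic recurrence).
cases = [1, 0, 1, 0, 0, 1, 0, 0, 0, 1, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0,
         0, 0, 1, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
         0, 0, 0, 0, 0, 0]
print(cusum_curve(cases))
```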
Runoff potentiality of a watershed through SCS and functional data analysis technique.
Adham, M I; Shirazi, S M; Othman, F; Rahman, S; Yusop, Z; Ismail, Z
2014-01-01
Runoff potentiality of a watershed was assessed based on curve number (CN) identification, the soil conservation service (SCS) method, and functional data analysis (FDA) techniques. Daily discrete rainfall data were collected from weather stations in the study area and analyzed with the lowess method for curve smoothing. As runoff data show a periodic pattern in each watershed, a Fourier series was introduced to fit the smoothed curves of eight watersheds. Seven Fourier terms were used for watersheds 5 and 8, while eight terms were used for the remaining watersheds to obtain the best fit to the data. Bootstrapped smooth curve analysis reveals that watersheds 1, 2, 3, 6, 7, and 8 have monthly mean runoffs of 29, 24, 22, 23, 26, and 27 mm, respectively, and these watersheds would likely contribute to surface runoff in the study area. The purpose of this study was to transform runoff data into a smooth curve representing the surface runoff pattern and mean runoff of each watershed through statistical methods. This study provides information on the runoff potentiality of each watershed and also provides input data for hydrological modeling.
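A minimal sketch of fitting a truncated Fourier series to a periodic monthly runoff record by least squares; the number of harmonics and the example data are hypothetical and do not reproduce the study's watershed values.

```python
import numpy as np

def fit_fourier(t, y, period, n_terms):
    """Least-squares fit of y ≈ a0 + sum_k [a_k cos(2πkt/T) + b_k sin(2πkt/T)]."""
    cols = [np.ones_like(t)]
    for k in range(1, n_terms + 1):
        cols.append(np.cos(2 * np.pi * k * t / period))
        cols.append(np.sin(2 * np.pi * k * t / period))
    X = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef, X @ coef                      # coefficients and fitted smooth curve

# Hypothetical monthly mean runoff (mm) for one watershed over two years.
t = np.arange(24, dtype=float)
y = 25 + 10 * np.sin(2 * np.pi * t / 12) + np.random.default_rng(1).normal(0, 2, 24)
coef, smooth = fit_fourier(t, y, period=12, n_terms=7)
```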
Azoulay, Bracha; Orkibi, Hod
2018-01-01
Although the literature indicates that students in mental health professions start to form their professional identity and competence in graduate school, there are few studies on the in-training experience of creative arts therapies students. This mixed methods study examined how five first-year students in a psychodrama master's degree program in Israel experienced their field training, with the aim of identifying the factors likely to promote or hinder the development of their professional identity and sense of professional ability. Longitudinal data were collected weekly throughout the 20-week field training experience. The students reported qualitatively on helpful and hindering factors and were assessed quantitatively on questionnaires measuring professional identity, perceived demands-abilities fit, client involvement, and therapy session evaluations. A thematic analysis of the students' reports indicated that a clear and defined setting and structure, observing the instructor as a role model, actively leading parts of the session, and observing fellow students were all helpful factors. The hindering factors included role confusion, issues related to coping with client resistance and disciplinary problems, as well as school end-of-year activities that disrupted the continuity of therapy. The quantitative results indicated that students' professional identity did not significantly change over the year, whereas a U-shaped curve trajectory characterized the changes in demands-abilities fit and other measures. Students began their field training with an overstated sense of ability that soon declined and later increased. These findings provide indications of which helping and hindering factors should be maximized and minimized, to enhance students' field training.
Image-derived input function with factor analysis and a-priori information.
Simončič, Urban; Zanotti-Fregonara, Paolo
2015-02-01
Quantitative PET studies often require the cumbersome and invasive procedure of arterial cannulation to measure the input function. This study sought to minimize the number of necessary blood samples by developing a factor-analysis-based image-derived input function (IDIF) methodology for dynamic PET brain studies. IDIF estimation was performed as follows: (a) carotid and background regions were segmented manually on an early PET time frame; (b) blood-weighted and tissue-weighted time-activity curves (TACs) were extracted with factor analysis; (c) factor analysis results were denoised and scaled using the voxels with the highest blood signal; (d) using population data and one blood sample at 40 min, the whole-blood TAC was estimated from the postprocessed factor analysis results; and (e) the parent concentration was finally estimated by correcting the whole-blood curve with measured radiometabolite concentrations. The methodology was tested using data from 10 healthy individuals imaged with [(11)C](R)-rolipram. The accuracy of the IDIFs was assessed against full arterial sampling by comparing the areas under the curve of the input functions and by calculating the total distribution volume (VT). The shape of the image-derived whole-blood TAC matched the reference arterial curves well, and the whole-blood areas under the curve were accurately estimated (mean error 1.0±4.3%). The relative Logan-V(T) error was -4.1±6.4%. Compartmental modeling and spectral analysis gave less accurate V(T) results compared with Logan. A factor-analysis-based IDIF for [(11)C](R)-rolipram brain PET studies that relies on a single blood sample and population data can be used for accurate quantification of Logan-V(T) values.
Gender-Role Attitudes and Behavior Across the Transition to Parenthood
Katz-Wise, Sabra L.; Priess, Heather A.; Hyde, Janet S.
2013-01-01
Based on social structural theory and identity theory, the current study examined changes in gender-role attitudes and behavior across the first-time transition to parenthood, and following the birth of a second child for experienced mothers and fathers. Data were analyzed from the ongoing longitudinal Wisconsin Study of Families and Work (WSFW). Gender-role attitudes, work and family identity salience, and division of household labor were measured for 205 first-time and 198 experienced mothers and fathers across four time points from five months pregnant to 12 months postpartum. Multi-level latent growth curve analysis was used to analyze the data. In general, parents became more traditional in their gender-role attitudes and behavior following the birth of a child, women changed more than men, and first-time parents changed more than experienced parents. Findings suggest that changes in gender-role attitudes and behavior following the birth of a child may be attributed both to transitioning to parenthood for the first time, and to negotiating the demands of having a new baby in the family. PMID:20053003
Hu, Ning; Fang, Jiaru; Zou, Ling; Wan, Hao; Pan, Yuxiang; Su, Kaiqi; Zhang, Xi; Wang, Ping
2016-10-01
Cell-based bioassays are an effective method for assessing compound toxicity through cell viability, but traditional label-based methods miss much information about cell growth because they rely on endpoint detection, and higher throughput is demanded to obtain dynamic information. Cell-based biosensor methods can monitor cell viability dynamically and continuously; however, this dynamic information is often ignored or seldom utilized in toxin and drug assessment. Here, we report a high-efficiency, high-content cytotoxicity recording method based on dynamic and continuous cell-based impedance biosensor technology. The dynamic cell viability, inhibition ratio and growth rate were derived from the dynamic response curves of the cell-based impedance biosensor. The results showed that the biosensors respond in a dose-dependent manner to the diarrhetic shellfish toxin okadaic acid, based on analysis of the dynamic cell viability and cell growth status. Moreover, the throughput of dynamic cytotoxicity assessment was compared between cell-based biosensor methods and label-based endpoint methods. This cell-based impedance biosensor can provide a flexible, cost-efficient and label-free platform for cell viability assessment in shellfish toxin screening.
The Future Potential of Wave Power in the US
NASA Astrophysics Data System (ADS)
Previsic, M.; Epler, J.; Hand, M.; Heimiller, D.; Short, W.; Eurek, K.
2012-12-01
The theoretical ocean wave energy resource potential exceeds 50% of the annual domestic energy demand of the US, is located in close proximity to coastal population centers, and, although variable in nature, may be more consistent and predictable than some other renewable generation technologies. As a renewable electricity generation technology, ocean wave energy offers a low-air-pollutant option for diversifying the US electricity generation portfolio. Furthermore, the output characteristics of these technologies may complement other renewable technologies. This study addresses: (1) the energy extraction potential from the US wave energy resource, (2) the present cost of wave technology in $/kW, (3) the estimated cost of energy in $/kWh, and (4) the cost levels at which the technology should see significant deployment. RE Vision Consulting, in collaboration with NREL, engaged in various analyses to establish present-day and future cost profiles for MHK technologies, compiled existing resource assessments and wave energy supply curves, and developed cost and deployment scenarios using the ReEDS analysis model to estimate the present-day technology cost reductions necessary to facilitate significant technology deployment in the US.
Teacher Supply & Demand in Michigan and the United States 1994-95.
ERIC Educational Resources Information Center
Scheetz, L. Patrick; Gratz, Becky
This publication provides analysis of current data on the supply and demand for teachers nationally and in Michigan in 1994-95 along with tips for new teachers who are still seeking jobs. The text covers areas of education where demand is highest including special education and science education, notes the persistent demand for minority teachers…
NASA Astrophysics Data System (ADS)
Shiki, Akira; Yokoyama, Akihiko; Baba, Jyunpei; Takano, Tomihiro; Gouda, Takahiro; Izui, Yoshio
Recently, distributed generations (DGs) have attracted strong attention because of environmental burden mitigation, energy conservation, energy security, and cost reduction. DGs have already been installed in distribution systems, and many more are expected to be connected in the future. Meanwhile, a new concept called the "microgrid", a small power supply network consisting only of DGs, has been proposed, and several prototype projects are ongoing in Japan. The purpose of this paper is to develop a three-phase instantaneous-value digital simulator of a microgrid consisting of many inverter-based DGs and to develop a supply-and-demand control method for an isolated microgrid. First, the microgrid is modeled in MATLAB/SIMULINK. We develop three-phase instantaneous-value models of an inverter-type CVCF generator, a PQ-specified generator, a PV-specified generator, and a PQ-specified load, representing a storage battery, photovoltaic generation, a fuel cell and an inverter load, respectively. We then propose an autonomous decentralized supply-and-demand control method for an isolated microgrid in which storage batteries, fuel cells, photovoltaic generation and loads are connected. The system frequency is used as the means to control DG output: the storage battery changes its frequency in response to the supply-demand imbalance, and all inverter-based DGs detect the frequency fluctuation and adjust their own outputs. Finally, a new frequency control method within this autonomous decentralized supply-and-demand control is proposed. Although the frequency is used to transmit information about the supply-demand imbalance to the DGs, it must eventually return to the standard value; to achieve this, the characteristic curve of the fuel cell is shifted in parallel, and this control is carried out in response to load fluctuations. Simulations show that the frequency can be controlled well and clarify the effectiveness of the frequency control system.
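A toy sketch of the frequency-based supply-and-demand signalling described above: the storage battery shifts the system frequency in proportion to the power imbalance, and each dispatchable DG adjusts its output from the observed frequency deviation. All gains, ratings, and step sizes are hypothetical, and the electrical dynamics are reduced to a single balance loop rather than the paper's three-phase instantaneous-value models.

```python
# Toy supply/demand balancing via frequency signalling in an isolated microgrid.
# All numbers are hypothetical; real inverter and network dynamics are omitted.
f_nom = 50.0               # Hz, standard frequency
k_batt = 0.02              # Hz shift per kW of imbalance imposed by the storage battery
droop = [5.0, 3.0]         # kW-per-Hz response of two dispatchable DGs (e.g. fuel cells)
dg_output = [10.0, 8.0]    # kW, current set points
load = 30.0                # kW

frequency = f_nom
for step in range(50):
    supply = sum(dg_output)
    imbalance = supply - load                  # positive = surplus absorbed by the battery
    frequency = f_nom + k_batt * imbalance     # battery encodes the imbalance as a frequency shift
    for i, k in enumerate(droop):              # each DG reacts to the frequency deviation
        dg_output[i] += k * (f_nom - frequency) * 0.5
print(round(frequency, 4), [round(p, 2) for p in dg_output])
```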
Technology implications of a cognitive task analysis for locomotive engineers
DOT National Transportation Integrated Search
2009-01-01
This report documents the results of a cognitive task analysis (CTA) that examined the cognitive demands and activities of locomotive engineers in today's environment and the changes in cognitive demands and activities that are likely to arise with...
Analysis of an inventory model for both linearly decreasing demand and holding cost
NASA Astrophysics Data System (ADS)
Malik, A. K.; Singh, Parth Raj; Tomar, Ajay; Kumar, Satish; Yadav, S. K.
2016-03-01
This study analyses an inventory model with linearly decreasing demand and holding cost for non-instantaneous deteriorating items. The model focuses on commodities with linearly decreasing demand and without shortages. The holding cost does not remain uniform over time because of variations in the time value of money; here, we consider a holding cost that decreases with time. The optimal time interval for the total profit and the optimal order quantity are determined. The developed inventory model is illustrated through a numerical example, and a sensitivity analysis is also included.
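A minimal numerical sketch in the spirit of the model described above: demand and holding cost both decrease linearly with time, shortages are not allowed, and the cycle length that maximizes profit per unit time is found by a grid search. All parameter values are hypothetical and the formulation is illustrative, not the paper's exact model.

```python
import numpy as np
from scipy.integrate import cumulative_trapezoid, trapezoid

# Hypothetical parameters: demand rate d(t) = a - b*t, holding cost rate h(t) = h0 - h1*t.
a, b = 100.0, 2.0
h0, h1 = 0.5, 0.01
price, cost, K = 12.0, 7.0, 150.0    # unit selling price, unit purchase cost, ordering cost per cycle

def profit_per_unit_time(T, n=500):
    """Average profit per unit time for a replenishment cycle of length T (no shortages)."""
    t = np.linspace(0.0, T, n)
    demand = np.maximum(a - b * t, 0.0)
    sold = cumulative_trapezoid(demand, t, initial=0.0)
    inventory = sold[-1] - sold                        # stock on hand at time t
    holding = trapezoid(np.maximum(h0 - h1 * t, 0.0) * inventory, t)
    order_qty = sold[-1]                               # lot ordered at the start of the cycle
    return ((price - cost) * order_qty - K - holding) / T

T_grid = np.linspace(0.5, 20.0, 200)
best_T = T_grid[int(np.argmax([profit_per_unit_time(T) for T in T_grid]))]
print(f"Profit-maximizing cycle length ≈ {best_T:.2f} time units")
```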
Sensitivity analysis of water consumption in an office building
NASA Astrophysics Data System (ADS)
Suchacek, Tomas; Tuhovcak, Ladislav; Rucka, Jan
2018-02-01
This article deals with a sensitivity analysis of real water consumption in an office building. During a long-term study under real operating conditions, a reduction of the pressure in its water connection was simulated. A sensitivity analysis of uneven water demand was conducted during working time at various provided pressures and at various time step durations. Correlations between the maximal coefficients of water demand variation during working time and the provided pressure are suggested. The influence of the provided pressure in the water connection on the mean coefficients of water demand variation is pointed out, both for the working hours of all days together and separately for days with identical working hours.
Durtschi, Jacob D; Stevenson, Jeffery; Hymas, Weston; Voelkerding, Karl V
2007-02-01
Real-time PCR data analysis for quantification has been the subject of many studies aimed at the identification of new and improved quantification methods. Several analysis methods have been proposed as superior alternatives to the common variations of the threshold crossing method. Notably, sigmoidal and exponential curve fit methods have been proposed. However, these studies have primarily analyzed real-time PCR with intercalating dyes such as SYBR Green. Clinical real-time PCR assays, in contrast, often employ fluorescent probes whose real-time amplification fluorescence curves differ from those of intercalating dyes. In the current study, we compared four analysis methods related to recent literature: two versions of the threshold crossing method, a second derivative maximum method, and a sigmoidal curve fit method. These methods were applied to a clinically relevant real-time human herpes virus type 6 (HHV6) PCR assay that used a minor groove binding (MGB) Eclipse hybridization probe as well as an Epstein-Barr virus (EBV) PCR assay that used an MGB Pleiades hybridization probe. We found that the crossing threshold method yielded more precise results when analyzing the HHV6 assay, which was characterized by lower signal/noise and less developed amplification curve plateaus. In contrast, the EBV assay, characterized by greater signal/noise and amplification curves with plateau regions similar to those observed with intercalating dyes, gave results with statistically similar precision by all four analysis methods.
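A minimal sketch of two of the compared approaches, a four-parameter sigmoidal curve fit and a second-derivative-maximum estimate, applied to a synthetic amplification curve; the curve parameters are hypothetical and no probe chemistry is modelled.

```python
import numpy as np
from scipy.optimize import curve_fit

def sigmoid(c, f0, fmax, c_half, k):
    """Four-parameter logistic model of real-time PCR fluorescence vs cycle number."""
    return f0 + fmax / (1.0 + np.exp(-(c - c_half) / k))

cycles = np.arange(1, 41, dtype=float)
rng = np.random.default_rng(3)
fluor = sigmoid(cycles, 0.05, 1.0, 24.0, 1.6) + 0.01 * rng.standard_normal(cycles.size)

# Sigmoidal curve-fit quantification: c_half (or a point derived from it) tracks template amount.
popt, _ = curve_fit(sigmoid, cycles, fluor, p0=[0.0, 1.0, 20.0, 2.0])

# Second-derivative-maximum method, evaluated here on the fitted curve over a finer cycle grid.
fine = np.linspace(1, 40, 4000)
d2 = np.gradient(np.gradient(sigmoid(fine, *popt), fine), fine)
print(f"c_half = {popt[2]:.2f}, second-derivative maximum at cycle {fine[np.argmax(d2)]:.2f}")
```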
Decision curve analysis: a novel method for evaluating prediction models.
Vickers, Andrew J; Elkin, Elena B
2006-01-01
Diagnostic and prognostic models are typically evaluated with measures of accuracy that do not address clinical consequences. Decision-analytic techniques allow assessment of clinical outcomes but often require collection of additional information and may be cumbersome to apply to models that yield a continuous result. The authors sought a method for evaluating and comparing prediction models that incorporates clinical consequences, requires only the data set on which the models are tested, and can be applied to models that have either continuous or dichotomous results. The authors describe decision curve analysis, a simple, novel method of evaluating predictive models. They start by assuming that the threshold probability of a disease or event at which a patient would opt for treatment is informative of how the patient weighs the relative harms of a false-positive and a false-negative prediction. This theoretical relationship is then used to derive the net benefit of the model across different threshold probabilities. Plotting net benefit against threshold probability yields the "decision curve." The authors apply the method to models for the prediction of seminal vesicle invasion in prostate cancer patients. Decision curve analysis identified the range of threshold probabilities in which a model was of value, the magnitude of benefit, and which of several models was optimal. Decision curve analysis is a suitable method for evaluating alternative diagnostic and prognostic strategies that has advantages over other commonly used measures and techniques.
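A minimal sketch of the net-benefit computation that underlies a decision curve, using the standard formula net benefit = TP/n − (FP/n)·pt/(1 − pt) evaluated over a range of threshold probabilities; the predicted risks and outcomes below are hypothetical.

```python
import numpy as np

def net_benefit(y_true, risk, thresholds):
    """Net benefit of treating patients whose predicted risk exceeds each threshold pt."""
    y_true = np.asarray(y_true, dtype=bool)
    risk = np.asarray(risk, dtype=float)
    n = y_true.size
    nb = []
    for pt in thresholds:
        treat = risk >= pt
        tp = np.sum(treat & y_true)
        fp = np.sum(treat & ~y_true)
        nb.append(tp / n - fp / n * pt / (1.0 - pt))
    return np.array(nb)

# Hypothetical model risks and observed outcomes; compare against the treat-all strategy.
thresholds = np.linspace(0.05, 0.60, 12)
rng = np.random.default_rng(7)
y = rng.random(500) < 0.25
risk = np.clip(0.25 + 0.4 * (y - 0.25) + 0.2 * rng.standard_normal(500), 0.01, 0.99)
model_nb = net_benefit(y, risk, thresholds)
treat_all_nb = net_benefit(y, np.ones_like(risk), thresholds)
```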
Pulleyblank, Ryan; Chuma, Jefter; Gilbody, Simon M; Thompson, Carl
2013-09-01
For a test to be considered useful for making treatment decisions, it is necessary that making treatment decisions based on the results of the test be a preferable strategy to making treatment decisions without the test. Decision curve analysis is a framework for assessing when a test would be expected to be useful, which integrates evidence of a test's performance characteristics (sensitivity and specificity), condition prevalence among at-risk patients, and patient preferences for treatment. We describe decision curve analysis generally and illustrate its potential through an application to tests for prodromal psychosis. Clinical psychosis is often preceded by a prodromal phase, but not all those with prodromal symptoms proceed to develop full psychosis. Patients identified as at risk for developing psychosis may be considered for proactive treatment to mitigate development of clinically defined psychosis. Tests exist to help identify those at-risk patients most likely to develop psychosis, but it is uncertain when these tests would be considered useful for making proactive treatment decisions. We apply decision curve analysis to results from a systematic review of studies investigating clinical tests for predicting the development of psychosis in at-risk populations, and present resulting decision curves that illustrate when the tests may be expected to be useful for making proactive treatment decisions.
A high throughput MATLAB program for automated force-curve processing using the AdG polymer model.
O'Connor, Samantha; Gaddis, Rebecca; Anderson, Evan; Camesano, Terri A; Burnham, Nancy A
2015-02-01
Research in understanding biofilm formation depends on accurate and representative measurements of the steric forces related to the polymer brush on bacterial surfaces. A MATLAB program has been developed to analyze force curves from an AFM efficiently, accurately, and with minimal user bias. The analysis is based on a modified version of the Alexander and de Gennes (AdG) polymer model, which is a function of equilibrium polymer brush length, probe radius, temperature, separation distance, and a density variable. Automating the analysis reduces the time required to process 100 force curves from several days to less than 2 min. The use of this program to crop and fit force curves to the AdG model will allow researchers to ensure proper processing of large amounts of experimental data and reduce the time required for analysis and comparison of data, thereby enabling higher quality results in a shorter period of time.
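A minimal sketch of fitting a steric-force expression of the Alexander-de Gennes type to an approach force-separation curve with nonlinear least squares; the exponential-decay approximation used here (F ≈ 50 kB T R L0 Γ^{3/2} exp(−2πD/L0)), the probe radius, and the data values are assumptions for illustration, not the program's exact implementation.

```python
import numpy as np
from scipy.optimize import curve_fit

KB, T = 1.380649e-23, 298.0        # Boltzmann constant (J/K), temperature (K)
R = 20e-9                          # assumed AFM probe radius (m)

def adg_force(D, L0, gamma):
    """Assumed AdG-type steric force between a spherical probe and a polymer brush."""
    return 50.0 * KB * T * R * L0 * gamma**1.5 * np.exp(-2.0 * np.pi * D / L0)

# Hypothetical cropped approach curve: separation (m) vs force (N), with 5% noise.
D = np.linspace(5e-9, 60e-9, 40)
F = adg_force(D, 30e-9, 5e16) * (1 + 0.05 * np.random.default_rng(5).standard_normal(D.size))

popt, _ = curve_fit(adg_force, D, F, p0=[20e-9, 1e16])
print(f"Brush length ≈ {popt[0]*1e9:.1f} nm, grafting density ≈ {popt[1]:.2e} m^-2")
```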
Testing and analysis of flat and curved panels with multiple cracks
NASA Technical Reports Server (NTRS)
Broek, David; Jeong, David Y.; Thomson, Douglas
1994-01-01
An experimental and analytical investigation of multiple cracking in various types of test specimens is described in this paper. The testing phase is comprised of a flat unstiffened panel series and curved stiffened and unstiffened panel series. The test specimens contained various configurations for initial damage. Static loading was applied to these specimens until ultimate failure, while loads and crack propagation were recorded. This data provides the basis for developing and validating methodologies for predicting linkup of multiple cracks, progression to failure, and overall residual strength. The results from twelve flat coupon and ten full scale curved panel tests are presented. In addition, an engineering analysis procedure was developed to predict multiple crack linkup. Reasonable agreement was found between predictions and actual test results for linkup and residual strength for both flat and curved panels. The results indicate that an engineering analysis approach has the potential to quantitatively assess the effect of multiple cracks in the arrest capability of an aircraft fuselage structure.
Frøslie, Kathrine Frey; Røislien, Jo; Qvigstad, Elisabeth; Godang, Kristin; Bollerslev, Jens; Voldner, Nanna; Henriksen, Tore; Veierød, Marit B
2013-01-17
Plasma glucose levels are important measures in medical care and research, and are often obtained from oral glucose tolerance tests (OGTT) with repeated measurements over 2-3 hours. It is common practice to use simple summary measures of OGTT curves. However, different OGTT curves can yield similar summary measures, and information of physiological or clinical interest may be lost. Our main aim was to extract information inherent in the shape of OGTT glucose curves, compare it with the information from simple summary measures, and explore the clinical usefulness of such information. OGTTs with five glucose measurements over two hours were recorded for 974 healthy pregnant women in their first trimester. For each woman, the five measurements were transformed into smooth OGTT glucose curves by functional data analysis (FDA), a collection of statistical methods developed specifically to analyse curve data. The essential modes of temporal variation between OGTT glucose curves were extracted by functional principal component analysis. The resultant functional principal component (FPC) scores were compared with commonly used simple summary measures: fasting and two-hour (2-h) values, area under the curve (AUC) and simple shape index (2-h minus 90-min values, or 90-min minus 60-min values). Clinical usefulness of FDA was explored by regression analyses of glucose tolerance later in pregnancy. Over 99% of the variation between individually fitted curves was expressed in the first three FPCs, interpreted physiologically as "general level" (FPC1), "time to peak" (FPC2) and "oscillations" (FPC3). FPC1 scores correlated strongly with AUC (r=0.999), but less with the other simple summary measures (-0.42≤r≤0.79). FPC2 scores gave shape information not captured by simple summary measures (-0.12≤r≤0.40). FPC2 scores, but not FPC1 nor the simple summary measures, discriminated between women who did and did not develop gestational diabetes later in pregnancy. FDA of OGTT glucose curves in early pregnancy extracted shape information that was not identified by commonly used simple summary measures. This information discriminated between women with and without gestational diabetes later in pregnancy.
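A minimal sketch of the core idea, principal component analysis applied to OGTT glucose curves sampled at a few time points so that the leading components capture overall level and timing of the peak; the simulated curves and time grid are hypothetical, and the full FDA basis-function machinery is not reproduced.

```python
import numpy as np

# Hypothetical OGTT glucose values (mmol/L) at 0, 30, 60, 90, 120 min for each woman.
rng = np.random.default_rng(11)
n = 200
times = np.array([0, 30, 60, 90, 120])
level = rng.normal(5.0, 0.5, n)[:, None]            # baseline glucose level per woman
peak_shift = rng.normal(0.0, 10.0, n)[:, None]      # shift of the post-load peak (min)
curves = level + 2.2 * np.exp(-((times - (55 + peak_shift)) / 40.0) ** 2) \
         + 0.15 * rng.standard_normal((n, times.size))

# With curves on a common grid, functional PCA reduces to ordinary PCA on the sampled curves.
centered = curves - curves.mean(axis=0)
U, s, Vt = np.linalg.svd(centered, full_matrices=False)
scores = U * s                        # FPC scores per woman (columns = FPC1, FPC2, ...)
explained = s**2 / np.sum(s**2)
print(explained[:3])                  # share of variation captured by the first three FPCs
```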
Relationship between the curve of Spee and craniofacial variables: A regression analysis.
Halimi, Abdelali; Benyahia, Hicham; Azeroual, Mohamed-Faouzi; Bahije, Loubna; Zaoui, Fatima
2018-06-01
The aim of this regression analysis was to identify the determining factors that impact the curve of Spee during its genesis, its therapeutic reconstruction, and its stability, within a craniofacial morphology that evolves continuously throughout life. We selected a total of 107 patients according to the inclusion criteria. A morphological and functional clinical examination was performed for each patient: plaster models, tracing of the curve of Spee, crowding, Angle's classification, overjet and overbite were recorded. We then performed a cephalometric analysis based on standardized lateral cephalograms. In the sagittal dimension, we measured the values of the angles ANB, SNA, SNB, SND and I/i, and the distances AoBo, I/NA, i/NB, SE and SL. In the vertical dimension, we measured the values of the angles FMA, GoGn/SN and the occlusal plane, and the distances SAr, ArD, Ar/Con, Con/Gn, GoPo, HFP, HFA and IF. The statistical analysis was performed using the SPSS software with a significance level of 0.05. Our sample of 107 subjects was composed of 77 female patients (71.3%) and 30 male patients (27.8%); 7 patients were hypodivergent (6.5%), 56 hyperdivergent (52.3%) and 44 normodivergent (41.1%). Patients' mean age was 19.35±5.95 years. The hypodivergent patients presented more pronounced curves of Spee than the normodivergent and hyperdivergent groups, and patients in skeletal Class I presented less pronounced curves of Spee than patients in skeletal Class II and Class III; these differences were not significant (P>0.05). The curve of Spee was positively and moderately correlated with Angle's classification, overjet, overbite, the sellion-articulare distance, and breathing type (P<0.05). We found no correlation between age, gender or the other parameters included in the study and the curve of Spee (P>0.05). Seventy-five percent (75%) of the hyperdivergent patients with oral breathing presented an overbite of 3 mm, which is quite excessive given the characteristics usually reported for this typology; this parameter could explain the overbite observed in the hyperdivergent population included in this study. In the multivariate analysis, the overbite and the sellion-articulare distance remained independently related to the curve of Spee, adjusted for breathing type, Angle's classification, and overjet. This regression model explains 21.4% of the variation in the curve of Spee.
NASA Astrophysics Data System (ADS)
Moret-Fernández, D.; Latorre, B.
2017-01-01
The water retention curve (θ(h)), which defines the relationship between the volumetric water content (θ) and the matric potential (h), is of paramount importance for characterizing the hydraulic behaviour of soils. Because current methods to estimate θ(h) are, in general, tedious and time consuming, alternative procedures to determine θ(h) are needed. Using an upward infiltration curve, the main objective of this work is to present a method to determine the parameters of the van Genuchten (1980) water retention curve (α and n) from the sorptivity (S) and the β parameter defined in the 1D infiltration equation proposed by Haverkamp et al. (1994). The first specific objective is to present an equation, based on the Haverkamp et al. (1994) analysis, that describes an upward infiltration process. Second, assuming a known saturated hydraulic conductivity, Ks, calculated on a finite soil column by Darcy's law, a numerical procedure is presented to calculate S and β by inverse analysis of an exfiltration curve. Finally, the α and n values are numerically calculated from Ks, S and β. To accomplish the first specific objective, cumulative upward infiltration curves simulated with HYDRUS-1D for sand, loam, silt and clay soils were compared to those calculated with the proposed equation, after applying the corresponding β and S calculated from the theoretical Ks, α and n. The same curves were used to: (i) study the influence of the exfiltration time on the S and β estimates, (ii) evaluate the limits of the inverse analysis, and (iii) validate the feasibility of the method for estimating α and n. Next, the θ(h) parameters estimated with the numerical method on experimental soils were compared to those obtained with pressure cells. The results showed that the upward infiltration curve could be correctly described by the modified Haverkamp et al. (1994) equation. While S was only affected by early-time exfiltration data, the β parameter had a significant influence on the long-time exfiltration curve, and its estimation accuracy increased with time. The 1D infiltration model was only suitable for β < 1.7 (sand, loam and silt). After omitting the clay soil, an excellent relationship (R2 = 0.99, p < 0.005) was observed between the theoretical α and n values of the synthetic soils and those estimated from the inverse analysis. Consistent results, with a significant relationship (p < 0.001) between the n values estimated with the pressure cell and those from the upward infiltration analysis, were also obtained for the experimental soils.
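For reference, a minimal sketch of the van Genuchten (1980) retention function used above, θ(h) = θr + (θs − θr)[1 + (α|h|)^n]^(−(1−1/n)), evaluated for an assumed, illustrative parameter set; the inverse estimation of α and n from the exfiltration curve is not reproduced here.

```python
import numpy as np

def van_genuchten_theta(h, theta_r, theta_s, alpha, n):
    """Volumetric water content vs matric potential h (h in the same length units as 1/alpha)."""
    m = 1.0 - 1.0 / n
    return theta_r + (theta_s - theta_r) / (1.0 + (alpha * np.abs(h)) ** n) ** m

# Assumed loam-like parameters (illustrative only): theta_r, theta_s, alpha (1/cm), n.
h = np.logspace(0, 4, 50)            # matric potential from 1 to 10^4 cm
theta = van_genuchten_theta(h, 0.078, 0.43, 0.036, 1.56)
```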
Gokulan, Kuppan; Williams, Katherine; Khare, Sangeeta
2017-04-01
Limited antibacterial activity of silver ions leached from silver-impregnated food contact materials could be due to: 1) the presence of silver resistance genes in the tested bacteria; or 2) a lack of susceptibility to silver ion-mediated killing in the bacterial strain (K. Williams, L. Valencia, K. Gokulan, R. Trbojevich, S. Khare, 2016 [1]). This study contains data addressing the specificity of silver resistance genes in Salmonella Typhimurium during real-time PCR with melting curve analysis, and an assessment of the minimum inhibitory concentration of silver ions for Salmonella.
Criado-Fornelio, A; Buling, A; Barba-Carretero, J C
2009-02-01
We developed and validated a real-time polymerase chain reaction (PCR) assay using fluorescent hybridization probes and melting curve analysis to identify the PKD1 exon 29 (C→A) mutation, which is implicated in polycystic kidney disease of cats. DNA was isolated from the peripheral blood of 20 Persian cats. Applying the new real-time PCR and melting curve analysis to these samples indicated that 13 cats (65%) were wild-type homozygotes and seven cats (35%) were heterozygotes. Both the PCR-RFLP and sequencing procedures were in full agreement with the real-time PCR test results. Sequence analysis showed that the mutant gene had the expected base change compared to the wild-type gene. The new procedure is not only very reliable but also faster than the techniques currently applied for diagnosis of the mutation.
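A minimal sketch of how a melting curve is typically turned into a melt peak for genotype discrimination, by taking the negative derivative of fluorescence with respect to temperature and locating its maximum; the fluorescence trace is synthetic and the melting temperature is hypothetical, not the assay's value.

```python
import numpy as np

# Synthetic melting curve: probe fluorescence decays around an assumed melting temperature Tm.
temps = np.linspace(45.0, 75.0, 300)
tm_assumed = 58.0                                   # hypothetical Tm for one allele
fluor = 1.0 / (1.0 + np.exp((temps - tm_assumed) / 0.8))
fluor += 0.005 * np.random.default_rng(2).standard_normal(temps.size)

melt_peak = -np.gradient(fluor, temps)              # -dF/dT, plotted as the "melt peak"
print(f"Detected melt peak at {temps[np.argmax(melt_peak)]:.1f} °C")
```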
An approach to market analysis for lighter than air transportation of freight
NASA Technical Reports Server (NTRS)
Roberts, P. O.; Marcus, H. S.; Pollock, J. H.
1975-01-01
An approach is presented to marketing analysis for lighter than air vehicles in a commercial freight market. After a discussion of key characteristics of supply and demand factors, a three-phase approach to marketing analysis is described. The existing transportation systems are quantitatively defined and possible roles for lighter than air vehicles within this framework are postulated. The marketing analysis views the situation from the perspective of both the shipper and the carrier. A demand for freight service is assumed and the resulting supply characteristics are determined. Then, these supply characteristics are used to establish the demand for competing modes. The process is then iterated to arrive at the market solution.
A Multi-Sector Assessment of the Effects of Climate Change at the Energy-Water-Land Nexus in the US
NASA Astrophysics Data System (ADS)
McFarland, J.; Sarofim, M. C.; Martinich, J.
2017-12-01
Rising temperatures and changing precipitation patterns due to climate change are projected to alter many sectors of the US economy. A growing body of research has examined these effects in the energy, water, and agricultural sectors. Rising summer temperatures increase the demand for electricity. Changing precipitation patterns affect the availability of water for hydropower generation, thermo-electric cooling, irrigation, and municipal and industrial consumption. A combination of changes to temperature and precipitation alters crop yields and cost-effective farming practices. Although a significant body of research exists on impacts to individual sectors, fewer studies examine the effects using a common set of assumptions (e.g., climatic and socio-economic) within a coupled modeling framework. The present analysis uses a multi-sector, multi-model framework with common input assumptions to assess the projected effects of climate change on energy, water, and land use in the United States. The analysis assesses climate impacts across 5 global circulation models (GCMs) for representative concentration pathways (RCPs) of 8.5 and 4.5 W/m2. The energy sector models - Pacific Northwest National Lab's Global Change Assessment Model (GCAM) and the National Renewable Energy Laboratory's Regional Energy Deployment System (ReEDS) - show the effects of rising temperature on energy and electricity demand. Electricity supply in ReEDS is also affected by the availability of water for hydropower and thermo-electric cooling. Water availability is calculated from the GCMs' precipitation using the US Basins model. The effects on agriculture are estimated using both a process-based crop model (EPIC) and an agricultural economic model (FASOM-GHG), which adjusts water supply curves based on information from US Basins. The sectoral models show higher economic costs of climate change under RCP 8.5 than under RCP 4.5, averaged across the country and across GCMs.
A Microlensing Analysis of the Central Engine in the Lensed Quasar WFI J2033-4723
Chile. We combined these new data with published measurements from Vuissoz et al. (2008) to create a 13-season set of optical light curves. Employing the ... Bayesian Monte Carlo microlensing analysis technique of Kochanek (2004), we analyzed these light curves to yield the first-ever measurement of the
ERIC Educational Resources Information Center
Preacher, Kristopher J.; Curran, Patrick J.; Bauer, Daniel J.
2006-01-01
Simple slopes, regions of significance, and confidence bands are commonly used to evaluate interactions in multiple linear regression (MLR) models, and the use of these techniques has recently been extended to multilevel or hierarchical linear modeling (HLM) and latent curve analysis (LCA). However, conducting these tests and plotting the…
[Identification of Animal Whole Blood Based on Near Infrared Transmission Spectroscopy].
Wan, Xiong; Wang, Jian; Liu, Peng-xi; Zhang, Ting-ting
2016-01-01
The inspection and classification of blood products are important but complicated tasks at import-export ports and in inspection and quarantine departments. For the inspection of whole blood products, open sampling can cause pollution, and virulence factors in blood samples may even endanger inspectors; thus, non-contact classification and identification methods for animal whole blood are needed. The spectroscopic techniques adopted in flow cytometry require sampling of blood cells during detection and therefore cannot meet the demand for non-contact identification and classification of animal whole blood. Infrared absorption spectroscopy is a technique that can analyze the molecular structure and chemical bonds of samples without contact. To find a feasible spectroscopic approach for the non-contact detection of species differences in whole blood samples, a near-infrared transmission spectra (NITS, 4497.669-7506.4 cm(-1)) experiment was conducted on whole blood samples from three common animals: chickens, dogs and cats. In the experiment the spectral resolution was 5 cm(-1), and each spectrogram is an average of 5 measured spectra. The results show that all samples have a sharp absorption peak between 5184 and 5215 cm(-1) and a gentle absorption peak near 7000 cm(-1). The NITS curves of different samples from the same animal species are similar, differing only slightly in overall transmittance. A correlation coefficient (CC) is introduced to quantify the differences between the whole-blood NITS curves of the three animals. The computed CCs between NITS curves of different samples from the same species are greater than 0.99, whereas CCs between the whole-blood NITS curves of different species range from 0.509 48 to 0.916 13: between chickens and cats from 0.857 23 to 0.912 44, between chickens and dogs from 0.509 48 to 0.664 82, and between cats and dogs from 0.872 75 to 0.916 13. Cats and dogs both belong to the class of mammals, and the CCs between their whole-blood NITS curves are greater than those between chickens and cats or chickens and dogs, which belong to different classes; that is, the whole-blood NITS curves of cats and dogs are more similar. These NITS results provide a feasible method for the non-contact identification of animal whole blood.
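A minimal sketch of the correlation-coefficient comparison described above, computed between transmittance curves sampled on a common wavenumber grid; the spectra here are synthetic stand-ins, not measured blood spectra.

```python
import numpy as np

rng = np.random.default_rng(4)
wavenumbers = np.linspace(4500.0, 7500.0, 600)     # cm^-1 grid (illustrative)

def synthetic_spectrum(peak, width, scale):
    """Toy transmittance curve with a single absorption feature plus noise."""
    base = 0.6 + 0.2 * np.exp(-((wavenumbers - peak) / width) ** 2)
    return scale * base + 0.002 * rng.standard_normal(wavenumbers.size)

spec_a = synthetic_spectrum(5200.0, 60.0, 1.00)    # e.g. two samples of the same species
spec_b = synthetic_spectrum(5200.0, 60.0, 0.98)
spec_c = synthetic_spectrum(5600.0, 120.0, 1.00)   # a differently shaped curve

cc_same = np.corrcoef(spec_a, spec_b)[0, 1]
cc_diff = np.corrcoef(spec_a, spec_c)[0, 1]
print(round(cc_same, 4), round(cc_diff, 4))
```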
Enhancements of Bayesian Blocks; Application to Large Light Curve Databases
NASA Technical Reports Server (NTRS)
Scargle, Jeff
2015-01-01
Bayesian Blocks are optimal piecewise constant representations (step function fits) of light curves. The simple algorithm implementing this idea, using dynamic programming, has been extended to include more data modes and fitness metrics, multivariate analysis, and data on the circle (Studies in Astronomical Time Series Analysis. VI. Bayesian Block Representations, Scargle, Norris, Jackson and Chiang 2013, ApJ, 764, 167), as well as new results on background subtraction and refinement of the procedure for precise timing of transient events in sparse data. Example demonstrations will include exploratory analysis of the Kepler light curve archive in a search for "star-tickling" signals from extraterrestrial civilizations (The Cepheid Galactic Internet, Learned, Kudritzki, Pakvasa, and Zee, 2008, arXiv:0809.0339; Walkowicz et al., in progress).
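A minimal sketch of computing a Bayesian Blocks segmentation with the implementation distributed in astropy (astropy.stats.bayesian_blocks); the event times are simulated, and the extended data modes and background-subtraction refinements mentioned above are not shown.

```python
import numpy as np
from astropy.stats import bayesian_blocks

# Simulated photon arrival times: a constant background with a brief flare.
rng = np.random.default_rng(8)
background = rng.uniform(0.0, 100.0, 800)
flare = rng.uniform(40.0, 42.0, 120)
t = np.sort(np.concatenate([background, flare]))

# Optimal piecewise-constant (step function) representation of the event rate.
edges = bayesian_blocks(t, fitness='events', p0=0.01)
rates = np.histogram(t, bins=edges)[0] / np.diff(edges)
print(edges)
```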